From: Silver
This is a laugh riot.
Hitokage wrote:
Comments about Wolfram's unoriginality and hyperbole aside, even if you did find such a construct for a desired pattern, you'd still have to define what output was useful and what is to be discarded.
Border wrote:
Here is a very technical description of what Jan Sloot was working towards:
http://www.endlesscompression.com/
Could this ever work, in any form? Maybe not movies as a few kilobytes and source files as a few hundred megabytes... but maybe 200MB movies and 1TB source files? The debunking article is good, though I'm not sure how solid some parts of the rhetoric are ("A source file can't account for every movie because there's an infinite number of movies possible" Huh?).
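For what it's worth, the cleaner version of that rhetoric is a counting (pigeonhole) argument: there are 2^n possible blocks of n bits, but only 2^n - 1 bit-strings strictly shorter than n bits, so no fixed scheme can map every block to a shorter description. A quick sketch (my own illustration, not from the linked article):

```python
# Pigeonhole counting argument: for any block length n, the number of
# possible shorter descriptions is always one less than the number of
# blocks, so at least one block of every length is incompressible.

def num_blocks(n):
    # all bit-strings of length exactly n
    return 2 ** n

def num_shorter_descriptions(n):
    # all bit-strings of length 0 through n-1; sums to 2**n - 1
    return sum(2 ** k for k in range(n))

for n in (8, 16, 32):
    blocks = num_blocks(n)
    descs = num_shorter_descriptions(n)
    assert descs < blocks
    print(n, blocks - descs)  # always exactly 1 short
```

And that's the best case: it only shows one incompressible block per length, but the same count means most blocks can't be shortened by more than a few bits.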
Dsal wrote:
Yep. But I suppose it's possible, although unlikely, that for any given block of data there is a procedural algorithm that could precisely generate it. Maybe if someone were able to construct a huge database mapping all possible block values (heh...) to a generating procedural algorithm, it could work. Then they'd just look up the block in the database and write only the procedural algorithm's parameters to the file. You could then take the output and repeat the process until no further compression was realized.
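The "repeat until no further compression" loop is easy to try with an ordinary compressor instead of the hypothetical database, and it shows why the idea stalls: the first pass produces output that is statistically close to random, and random-looking data doesn't compress again. A sketch using zlib as a stand-in:

```python
import os
import zlib

def compress_repeatedly(data):
    """Recompress a block until the output stops shrinking.

    Stand-in for the post's scheme, with zlib in place of the
    hypothetical block-to-algorithm database.
    """
    rounds = 0
    while True:
        out = zlib.compress(data, 9)
        if len(out) >= len(data):  # no further gain: stop
            return data, rounds
        data = out
        rounds += 1

# Random data never compresses, so the loop exits immediately.
random_block, r1 = compress_repeatedly(os.urandom(4096))
print("random data rounds:", r1)

# Highly redundant data compresses once, then the output is
# effectively random and the second pass gains nothing.
redundant, r2 = compress_repeatedly(b"a" * 4096)
print("redundant data rounds:", r2, "final size:", len(redundant))
```

So the recursion buys at most one or two passes in practice; it's the same pigeonhole problem, just moved around.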
Monk wrote:
In theory you could even make a Doom clone with bump mapping in 64k.
Here is one at 96k
http://www.theprodukkt.com/kkrieger.html