New computer source code true?

  • #1
Monique
Staff Emeritus
Science Advisor
Gold Member
Did anyone hear the story about the Dutch inventor Jan Sloot from Nieuwegein, who supposedly invented a new computer source code but died of a heart attack the day before he was going to file the patent at a lawyer's office?

The story aired on television a few days ago and is covered in a cloud of mystery :bugeye: many top people in the industry are involved and were ready to invest in the ground-shaking technology, but no one was ever actually told the secret. Supposedly the inventor was able to play 16 movies from a 64 kb chip at high speed without reading from the hard drive; supposedly he was able to store 64 full-size movies on a single chip.

His invention involved a new way of digitizing data that was extremely efficient.

Here's the story: http://www.gids.nl/techno/jan-sloot.html (try Babelfish to translate it :wink:). Dutch speakers can watch the show online: http://www.netwerk.tv/index.jsp?p=items&r=deze_week&a=131206
 
  • #2
Perhaps it's the language barrier, but none of this makes any sense. Are you suggesting this person developed some kind of compression algorithm that can fit 16 full length movies in 64 kilobytes?

- Warren
 
  • #3
My friend works in information technology and he believes this stuff is true. I don't think it's real, but he's convinced by the major names involved and by how eager investors were to join the business.

It is supposed to be an algorithm that manages data in a novel but very simple manner. The guy who developed it was very wary about patenting it, since the idea was so simple. His technology was backed by the former director of World Online and a former director of Philips (a major electronics company), although no one but the inventor ever had the technology in hand.

I don't know much about compression, nothing really :P I remember from the documentary that they said that instead of compressing things together, he was taking them apart.

I imagine he created a code that acts like a key, where he could define long strings of data with a single command. You only store the simple command and expand all the data later. But I'm a skeptic; the cloud of mystery is a little too thick in my opinion.
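That guess can be made concrete with a toy sketch (purely illustrative, not the actual Sloot system): store only a short "recipe" of commands and expand it on playback. The catch is that this only works for data that really is this repetitive, which arbitrary movie frames are not.

```python
# Toy version of "a single command expands into lots of data":
# each command is a (pattern, repeat_count) pair.

def expand(commands):
    """Rebuild data from a list of (pattern, count) commands."""
    return b"".join(pattern * count for pattern, count in commands)

# A few bytes of "recipe"...
recipe = [(b"\x00", 1_000_000), (b"\xff\x00", 500)]

data = expand(recipe)
print(len(data))  # 1001000 bytes reconstructed from a tiny recipe
```

The recipe fits in well under 64 kB, but only because the output was chosen to be almost entirely redundant.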
 
  • #4
Well, the information theorists can tell you exactly how much non-redundant information is in a full-length movie, and they'll say you cannot compress the movie to any size smaller than that, in general. It's entirely possible to coincidentally be able to compress one movie to a very small size, but not ANY movie.
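The impossibility of compressing *every* input is just the pigeonhole principle, easy to check by counting (a generic counting sketch, not specific to any codec):

```python
# Count all 16-bit files versus all strictly shorter files.
# Any lossless compressor must map distinct inputs to distinct outputs,
# so it cannot shrink every input: there aren't enough short outputs.

inputs = 2 ** 16                           # all possible 16-bit files
outputs = sum(2 ** k for k in range(16))   # all files shorter than 16 bits

print(outputs, inputs)  # 65535 65536 -- one output short, so some input must grow
```

The same count with "movie-sized" numbers says that only a vanishing fraction of possible inputs can ever fit in 64 kB.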

People have been perpetrating hoaxes about enormous compression for years. They're all snake oil. I honestly wouldn't be surprised if this guy fleeced the corporate funding people, staged his own death, and left the country with the money.

- Warren
 
  • #5
Yeah, how could you expand the data or whatever without even reading the hard-drive?

64k is freakin small :)
 
  • #6
So how does compression work now - zip, rar, mp3 files? You can take out redundant information: the frequencies you cannot hear, or freeze the background of the film when it does not move.

But if you can break the code up into blocks and only store those blocks, you can dramatically reduce the size, if you know the code.

It wouldn't be good for the economy if this were true: no one would need DSL if the data could easily be transmitted through phone lines, and no one would buy iPods if thousands of songs fit on a CD.
 
  • #7
at high speed

Anyone heard of spintronics? Making electrons spin instead of moving back and forth? That way, you can store A LOT more data and access it infinitely faster. It would be very nifty if that was his invention... but I doubt it.

I can dream, can't I?

Andy
AMW Bonfire
 
  • #8
It wouldn't be good for the economy if this were true.

But it would be very good for consumers! And anyone selling the technology... cha-ching!

Andy
AMW Bonfire
 
  • #9
I found the following article which mentions this device

https://doc.telin.nl/dscgi/ds.py/Get/File-21154/Babet_Final_Report.pdf

They seem to think that in fact most of the information was stored separately, and the 64kb was a recipe for putting it together.

Possibly it was intended as more of a license management device - make the library of image prototypes freely available, but require the chip to be present to actually play the movie.
 
  • #10
From Chronon's link:
At least two different sensational claims for extreme compression technologies were extensively presented during last few months. First of them, the so-called "Sloot Digital Coding System" was specially designed for compressing digital movies, and is supposed to have been capable of recording one complete, full resolution movie, on an eight kilo byte memory. For a 100-minute movie, at TV quality, this means a compression ratio of approximately 14.25 million to one. Note that current DVDs use the MPEG2 specifications to achieve a compression ratio of about 15 to 1, and the latest MPEG4 encoders can compress a movie about 100 to 175 times, but then with a considerable loss in quality. So, there is a legitimate interest in finding out whether these claims are credible, possible, and reproducible. Unfortunately, there is no scientific or technical information to support these claims, so we can only conclude for now that if a movie has been "compressed" and stored on an 8kB memory chip, then a large amount of the original information should have been available within the "de-compressor" (player). Indeed, further investigations confirmed that the 8kB memory was not used for storing the movie data, but rather a "recipe" to reconstruct an approximate version of the video frames. A huge library for image prototypes had to be used in combination with the 8 kilobytes of externally stored information. Even so, the quality of the movies played by this system was rather low. Another remark refers to the encoding process, which apparently had to be adapted to each particular movie. This suggests that the supposed prototype libraries were not generic (i.e. suited for encoding any movie).
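The report's headline ratio is easy to sanity-check. A rough recomputation (assuming PAL 720x576, 24-bit colour, 25 fps, audio ignored; the report does not state its own assumptions) lands in the same ballpark:

```python
# Uncompressed size of a 100-minute TV-quality movie versus an 8 kB chip.
# Resolution, colour depth, and frame rate are assumptions, not the
# report's stated parameters.

width, height, bytes_per_pixel, fps = 720, 576, 3, 25
seconds = 100 * 60

raw_bytes = width * height * bytes_per_pixel * fps * seconds
ratio = raw_bytes / (8 * 1024)

print(f"{ratio:,.0f} : 1")  # 22,781,250 : 1 -- same order as the report's 14.25 million
```

The exact figure depends on the assumed resolution and colour depth, but any reasonable choice gives a ratio in the tens of millions to one, versus roughly 15:1 for MPEG-2.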
 
  • #11
More detail: for every movie there are a limited number of sounds and colors. The basic data would be stored in five algorithms, each with a maximum size of 74 MB, 370 MB in total: the motor of the invention. The only thing needed to get it started would be a fitting key. Sloot would create a unique code for every movie frame, which in total would result in a unique code for the movie. That last code, the key, only takes 1 kb of memory, independent of the length of the movie. On one simple chip he could thus store dozens of keys, 64 keys on a smartcard. So, for a payment, you could be sent the key to a number of movies through your cell phone, which could be fed into the 370 MB of algorithms present in the player. :confused:
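As a toy illustration of that scheme (scaled down enormously, with all names, sizes, and indices invented): the player ships with a large fixed library, and a "movie" is nothing but a short list of indices into it.

```python
# Miniature "key + fixed library" player. The library stands in for the
# 370 MB "motor"; the key stands in for the 1 kb per-movie code.

import random

random.seed(0)  # deterministic toy library
LIBRARY = [bytes(random.randrange(256) for _ in range(1024))
           for _ in range(64)]  # 64 prototype "frames" of 1 kB each

def play(key):
    """Reconstruct frames by looking each key entry up in the library."""
    return [LIBRARY[i] for i in key]

key = [3, 17, 60, 3]        # tiny per-movie "key"
frames = play(key)
print(len(frames), len(frames[0]))  # 4 1024
```

The catch is visible immediately: the key carries no image content of its own, so only sequences of prototypes already in the library can ever be "played". That matches the report's finding that the encoding had to be adapted per movie.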
 
  • #12
I can only imagine how poor the quality of the movie would be if it were compressed to such a small size.

I read a similar article in 2600 magazine on how to compress movies so you can watch them on your PDA or iPAQ.
 
  • #13
So basically he more or less encrypted movies that were compressed to pretty junky quality?

Wow :)

I have some source code to compress files and it only increases the file size by 200%. Anyone interested?
 
  • #14
Monique said:
More detail: for every movie there are a limited number of sounds and colors. The basic data would be stored in five algorithms, each with a maximum size of 74 MB, 370 MB in total: the motor of the invention. The only thing needed to get it started would be a fitting key. Sloot would create a unique code for every movie frame, which in total would result in a unique code for the movie. That last code, the key, only takes 1 kb of memory, independent of the length of the movie. On one simple chip he could thus store dozens of keys, 64 keys on a smartcard. So, for a payment, you could be sent the key to a number of movies through your cell phone, which could be fed into the 370 MB of algorithms present in the player. :confused:


Ah, I see what you're saying. Sort of like those text-to-speech programs: the different pieces of sound, the syllables, are already stored on the computer and you simply input the text. So a 100 KB text file could be turned into a 10 MB (or whatever) .wav file. It's an incomplete analogy, but that's how I'm thinking about it.

However, there are probably a million reasons why this would be either impossible or extremely inefficient to do with video.
 
  • #15
No one answered this:
Monique said:
So how does compression work now? zip, rar, mp3 files? You can take out redundant information, the wavelengths you do not hear or freeze the background of the film when it does not move.

But if you can break up the code in blocks, and only store those blocks, you can dramatically reduce the size if you know the code.
Media and text require completely different types of compression.

[simplifications here]
Compression schemes like zip and rar find patterns and replace the contents with markers. You could take the word "compression" out of this post and replace it with "$1", then put a table at the end saying that $1 = "compression". Decompressing the file re-assembles the words.
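This marker-and-table idea is essentially what DEFLATE (the algorithm behind zip) does, and its limits are easy to see with Python's standard zlib module: highly redundant data shrinks dramatically, while pattern-free data doesn't shrink at all.

```python
# Compare DEFLATE on redundant text versus random bytes.

import os
import zlib

text = b"compression " * 1000   # 12,000 bytes, highly redundant
noise = os.urandom(12000)       # 12,000 bytes with no patterns to find

print(len(zlib.compress(text)))   # well under 200 bytes
print(len(zlib.compress(noise)))  # roughly 12,000 -- slightly larger, even
```

Random data is the worst case: the compressor finds no repeats to replace, and the container overhead makes the output marginally bigger than the input.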

Media can be encoded the same way, but it's much more difficult (try zipping a photo and see how much smaller it gets - 2% if you're lucky). Most media compression schemes actually reduce the quality of the media, but in ways that you won't notice. If you zoom in on a .jpg picture, you'll see little blocks of color - say, a blue sky has 4 pixels in a square that are almost, but not quite, the same shade of blue. .jpg compression replaces all 4 with the same shade of blue. Most photos can be compressed by upwards of 90% in this way without noticing the loss. DVD uses .jpg compression.

Newer forms of media compression compare frames: if that block of 4 shades of blue is still there in the next frame, you can replace both blocks at the same time. DivX does this. As you can imagine, this is extremely processor-intensive.
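A miniature version of the lossy block idea described above (a deliberately crude stand-in; real JPEG uses DCT quantization, not plain averaging): replace each 2x2 block of nearly equal pixels with their average, a 4:1 reduction that trades away fine detail.

```python
# Average each 2x2 block of a flat grayscale image down to one value.

def compress_blocks(pixels, w):
    """pixels: flat list of grayscale values; w: even image width.
    Returns one averaged value per 2x2 block (4:1 lossy reduction)."""
    h = len(pixels) // w
    out = []
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            block = [pixels[(y + dy) * w + (x + dx)]
                     for dy in (0, 1) for dx in (0, 1)]
            out.append(sum(block) // 4)
    return out

sky = [200, 201, 199, 200] * 4      # a 4x4 patch of almost-equal blues
print(compress_blocks(sky, 4))      # [200, 199, 200, 199] -- 4 values instead of 16
```

On a smooth region like sky the loss is invisible; on a sharp edge the averaging would smear detail, which is exactly the blockiness you see when zooming into an over-compressed .jpg.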
 
  • #16
aychamo said:
I have some source code to compress files and it only increases the file size by 200%. Anyone interested?
:rofl: no thanks, I think I'll pass..

thanks russ for that explanation :smile:
 
  • #17
You can't use zip to compress a jpg file much, because jpg is already a "compressed" format, as russ explained. You can use zip to compress a bitmap file, though, to an enormous degree. The bitmap format is completely uncompressed.

Some newer formats, like png (portable network graphic), are lossless as well, and rapidly gaining ground on the internet. The LaTeX images on this forum are png files, in fact. :smile:

- Warren
 
  • #18
So when do you predict IE will have full support for png?
 
  • #19
IE does not already support png? What do you mean by "full support?"

- Warren
 
  • #20
DVD compression

russ_watters said:
If you zoom in on a .jpg picture, you'll see little blocks of color - say, a blue sky has 4 pixels in a square that are almost, but not quite, the same shade of blue. .jpg compression replaces all 4 with the same shade of blue. Most photos can be compressed by upwards of 90% in this way without noticing the loss. DVD uses .jpg compression.
DVD uses MPEG-2 compression.
 
  • #21
chroot said:
IE does not already support png? What do you mean by "full support?"

I think he means support for the transparency channel. I don't know about Exploder 6 (never touched the stuff), but in lower versions the transparency was not handled at all...

As for the compression in the original thread, I don't believe it's true. Yes, you can put a lot of info in 8k of memory (there is a contest somewhere on the internet to do animation in assembler in only 4k of RAM, and there is some impressive stuff there), but a whole movie, really? Let's say you discovered some extremely new way to eliminate the redundancy in the movie - you would still have to store the different backgrounds, the actors' faces, all the big differences that are inherent in a normal movie.
 
  • #23
hmm this is very neat stuff... i understand that all of you say there would be a loss of quality in any picture/movie application, but what about data transfer between a server and client in an online game setup?

as bandwidth for online games costs money, would it be feasible (including security issues) to reduce bandwidth use by just sending the 'keys' to the client computer, which holds the library files, instead of the functions and all the rest? i ask about security as well because, whenever you contemplate putting more information in the game client, you're putting more tools in the hands of the enemy :) and that opens you up to some pretty nasty hacking.
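One way to picture that idea (message format, asset names, and IDs entirely invented for illustration): the client ships with the asset library, the server sends only small IDs, and the client refuses anything outside its own library, which addresses part of the security worry.

```python
# Server sends asset IDs; client expands them from its local library
# and drops anything it doesn't recognise.

import json

CLIENT_LIBRARY = {101: "explosion.anim", 102: "footstep.wav"}  # invented IDs

def server_message(event_ids):
    # Only the IDs cross the wire, not the assets themselves.
    return json.dumps({"play": event_ids})

def client_handle(message):
    ids = json.loads(message)["play"]
    # Validate: ignore IDs outside the shipped library
    # (the "tools in the hands of the enemy" concern).
    return [CLIENT_LIBRARY[i] for i in ids if i in CLIENT_LIBRARY]

msg = server_message([101, 999])          # 999 is not a known asset
print(len(msg), client_handle(msg))       # tiny payload; unknown ID dropped
```

This is exactly how most online games already work: the bandwidth carries events and references, and the bulky assets live on the install disc or download. Note that, unlike the Sloot claim, nobody pretends the library itself has been compressed away.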
 
  • #24
Monique said:
Supposedly the inventor was able to play 16 movies from a 64 kb chip at high speed without reading from the hard drive; supposedly he was able to store 64 full-size movies on a single chip.
Having actually done some work in this field...

A claim like this is right up there with over-unity devices or proofs that Einstein was wrong. :uhh:
 

