Image Compression you can implement in a day

AI Thread Summary
The discussion focuses on implementing a simple image compression algorithm without using libraries, emphasizing the need for a solution that can be executed quickly. Huffman encoding is suggested as a viable option, with the possibility of modifying existing examples found online. Additionally, the idea of calculating pixel differences from adjacent pixels and applying Huffman compression to these differences is proposed. Other methods mentioned include LZW compression and Run Length Encoding (RLE), which exploits correlations along scan lines. The conversation also touches on advanced techniques involving successive scan line correlations and references the FAX standard for further insights into compression methods. Overall, the emphasis is on finding straightforward algorithms that can be effectively implemented within a short timeframe.
Superposed_Cat
I need to implement a decent image compression algorithm in a day or two, without using libraries.
So it can't be anything with multiple stages, like JPEG.
I was thinking of Huffman encoding.

Which algorithm could I learn and implement in a day or two that is not terrible at its job?
 
Superposed_Cat said:
I need to implement a decent image compression algorithm in a day or two, without using libraries.
So it can't be anything with multiple stages, like JPEG.
I was thinking of Huffman encoding.

Which algorithm could I learn and implement in a day or two that is not terrible at its job?
You can find good examples of Huffman encoding/decoding on the internet. I remember, years ago, lifting one and modifying it to my needs in a couple of hours.
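To make the suggestion concrete, here is a minimal Huffman coder sketch in Python. This is my own illustration, not any particular example from the internet; the function names and the string-of-bits representation are simplifications (a real implementation would pack bits into bytes).

```python
import heapq
from collections import Counter

def build_codes(data: bytes) -> dict:
    """Build a Huffman code table (symbol -> bit string) from symbol frequencies."""
    freq = Counter(data)
    # Heap entries are (frequency, tiebreak, tree); a tree is either a
    # symbol or a (left, right) pair. The tiebreak keeps comparisons
    # away from the unorderable tree nodes.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate: one distinct symbol
        return {heap[0][2]: "0"}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, a = heapq.heappop(heap)
        f2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tie, (a, b)))
        tie += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

def encode(data: bytes, codes: dict) -> str:
    return "".join(codes[b] for b in data)

def decode(bits: str, codes: dict) -> bytes:
    inv = {v: k for k, v in codes.items()}   # prefix-free, so greedy match works
    out, cur = [], ""
    for bit in bits:
        cur += bit
        if cur in inv:
            out.append(inv[cur])
            cur = ""
    return bytes(out)
```

For example, `encode(b"abracadabra", build_codes(b"abracadabra"))` yields well under the 88 bits of the raw input, and `decode` inverts it exactly.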
 
phinds said:
You can find good examples of Huffman encoding/decoding on the internet. I remember, years ago, lifting one and modifying it to my needs in a couple of hours.
Another idea is to compute the difference between each pixel value and the pixel to its left (or the one above it), and use Huffman compression on those differences. Taking the differences modulo 256 lets you keep everything in unsigned values.
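A minimal sketch of that delta filter, assuming a row of 0-255 pixel values (function names are mine, not from the thread):

```python
def delta_encode(row):
    """Replace each pixel with its difference from the pixel to the left."""
    prev, out = 0, []
    for p in row:
        out.append((p - prev) % 256)   # wrap-around keeps values unsigned
        prev = p
    return out

def delta_decode(deltas):
    """Invert delta_encode by accumulating the differences."""
    prev, out = 0, []
    for d in deltas:
        prev = (prev + d) % 256
        out.append(prev)
    return out
```

On smooth image data most deltas cluster near 0 (or near 255 for small negative steps), so the symbol distribution becomes highly skewed and a Huffman coder can exploit it.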
 
1) The simplest, of course, is just Run Length Encoding (RLE). That takes advantage of correlations along scan lines.

2) The next step up is exploiting correlation between successive scan lines (copy the pixel above), combined with Run Length Encoding.

3) One more step up is method 2) above, but allowing the copied segment from the line above to be offset, or to differ in length, by X pixels.

Number 3 above is used in FAX machines. The specific codes for the above are described in the FAX standard. I worked on the early prototypes several decades ago and don't recall the details. This Google search returns many results and probably has the actual standard listed.
https://www.google.com/search?&q=fax+code+standards
The hardware implementation required a one-scanline buffer for the look-behind, some logic, and 3 or 4 EEPROMs for the table-lookup encode/decode.

Cheers,
Tom
 
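Methods 1 and 2 above can be combined in a short sketch: XOR each scan line against the previous one so that pixels matching the line above become zero, then run-length encode the result. This is purely illustrative; the actual FAX standard defines specific code tables rather than this framing, and the names here are invented.

```python
def rle_encode(line):
    """Run-length encode a sequence as (value, count) pairs."""
    out = []
    for v in line:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return [(v, c) for v, c in out]

def rle_decode(pairs):
    out = []
    for v, c in pairs:
        out.extend([v] * c)
    return out

def encode_image(rows):
    """XOR each row with the row above, then RLE the result."""
    prev = [0] * len(rows[0])          # imaginary all-zero line above row 0
    encoded = []
    for row in rows:
        diff = [a ^ b for a, b in zip(row, prev)]
        encoded.append(rle_encode(diff))
        prev = row
    return encoded

def decode_image(encoded, width):
    prev = [0] * width
    rows = []
    for pairs in encoded:
        diff = rle_decode(pairs)
        row = [a ^ b for a, b in zip(diff, prev)]
        rows.append(row)
        prev = row
    return rows
```

Whenever a scan line repeats the one above it, the XOR collapses it to a single (0, width) pair, which is where most of the compression comes from on scanned documents.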