Where do I fit in the history of data compression?

In summary, the ZX Spectrum was a limited computer that could not color each pixel individually. The artist had to draw the artwork, transfer it to a large piece of graph paper, and enter every single pixel by hand. In the process he devised a method of digital image data compression which reduced the file size tremendously.
  • #1
Whipley Snidelash
TL;DR Summary
Where do I fit in the history of digital image data compression (if at all)? Anybody do it before I did?
In 1982 I was given a ZX Spectrum by a Timex employee. It was one of only three English units that had been converted from PAL to NTSC. They worked for Timex but had their own company on the side writing software for the Timex computer and for the Spectrum. They hired me to do the title-screen artwork for three pieces of their software.

This computer was limited. It couldn’t color each pixel separately: it had 8 × 8 pixel character cells, each restricted to two colors, paper and ink. I had to draw the artwork and then transfer it to a large piece of graph paper, coloring every single pixel individually within those constraints. I had to address every pixel position, and the file would have been huge, because the computer drew the image on the screen left to right, top to bottom, one pixel at a time.

If I recall correctly, the file was a series of one or two numbers, separated by commas, for each pixel position. To say this was unwieldy is an understatement. It didn’t take me long to notice that, going left to right, there were consecutive runs of pixels where the color, and thus the numbers, stayed the same because of the artwork. It was then I realized that I didn’t have to address each pixel position in such a run; I just needed to record the color and the number of consecutive pixels that had it.

This resulted in compressing the graphic file tremendously. I basically invented a method of digital image data compression. Did anyone do it before I did?
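The scheme described here (replacing a run of identical pixels with a single color value plus a repeat count) is what is now called run-length encoding. A minimal sketch in Python; the function names and the list-of-pairs representation are illustrative, not the original Spectrum file format:

```python
def rle_encode(pixels):
    """Collapse runs of identical values into (value, count) pairs."""
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([p, 1])       # start a new run
    return [(value, count) for value, count in runs]

def rle_decode(runs):
    """Expand (value, count) pairs back into the original sequence."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out

# One row of a two-color image: four "ink" pixels, two "paper", three "ink".
row = [1, 1, 1, 1, 0, 0, 1, 1, 1]
encoded = rle_encode(row)             # [(1, 4), (0, 2), (1, 3)]
assert rle_decode(encoded) == row     # lossless: the round trip is exact
```

The savings grow with the length of the runs, which is why artwork with large flat areas of one color compresses so well under this scheme.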
 
  • #2
It’s likely that you rediscovered a scheme that was already in use on mainframes in a different context, such as inter-mainframe data communication. When resources are limited, programmers find a way around it.

In the 1960s and 1970s programmers did a lot of tricks that never saw the light of day because code and algorithms were proprietary and weren’t shared, copyrighted or patented.

It wasn’t until the hobbyist microcomputer days in the mid-1970s that code sharing became popular, in computer interest groups, magazines, and books.

Wikipedia has a brief summary of data compression schemes shown here:

https://en.wikipedia.org/wiki/Data_compression
 
  • #3
I didn’t think they were doing digital images before personal computers, just software code. I didn’t think mainframe software had graphic cover or title screens back then either. I compressed the file of a digital image; that’s not really the same as compressing software.
 
  • #4
Good article, Jedi, though it doesn’t have any dates or history. But this was there, and it’s almost exactly what I did:

“Lossless data compression algorithms usually exploit statistical redundancy to represent data without losing any information, so that the process is reversible. Lossless compression is possible because most real-world data exhibits statistical redundancy. For example, an image may have areas of color that do not change over several pixels; instead of coding "red pixel, red pixel, ..." the data may be encoded as "279 red pixels". This is a basic example of run-length encoding; there are many schemes to reduce file size by eliminating redundancy.”
Wikipedia
 
  • #5
The Run Length Encoding article linked in your quote dates the technique to 1967.
 
  • #6
Nice article. The 1967 date is for analog television signals, not digital. It also states that Hitachi patented it in 1983 for digital images, which is after I did it.
 
  • #7
I found another article, and digital images were in use in the ’60s. Somebody probably compressed the data back then.
 
  • #8
Sadly, a patent application for the technique would have run into the RLE example as prior art, i.e., someone invented it in a different context, but they did it first. Long strings of repeated characters were found in data communications and in screen-based data-entry applications, as used on IBM 3270 terminals and other vendors’ equivalents.

I’ve had ideas that were pretty good, like yours, but were found to be copies of prior art after a patent search.

Great minds think alike.

I had a boss say that to me once, in an attempt to vicariously claim credit, when I was a junior programmer; I added the quip “and few minds think at all.” He caught my drift, and I got a “Hey!”
 
  • #9
Patenting history is a funny thing. Tesla was screwed out of a patent by Marconi, only to have it reversed years later: when the US government was paying huge royalties to Marconi’s company, Tesla’s ideas were resurrected as prior art.

https://en.wikipedia.org/wiki/Nikola_Tesla

Wireless lawsuits
When World War I broke out, the British cut the transatlantic telegraph cable linking the US to Germany in order to control the flow of information between the two countries. They also tried to shut off German wireless communication to and from the US by having the US Marconi Company sue the German radio company Telefunken for patent infringement.[174] Telefunken brought in the physicists Jonathan Zenneck and Karl Ferdinand Braun for their defense, and hired Tesla as a witness for two years for $1,000 a month. The case stalled and then went moot when the US entered the war against Germany in 1917.[174][175]

In 1915, Tesla attempted to sue the Marconi Company for infringement of his wireless tuning patents. Marconi's initial radio patent had been awarded in the US in 1897, but his 1900 patent submission covering improvements to radio transmission had been rejected several times, before it was finally approved in 1904, on the grounds that it infringed on other existing patents including two 1897 Tesla wireless power tuning patents.[137][176][177] Tesla's 1915 case went nowhere,[178] but in a related case, where the Marconi Company tried to sue the US government over WWI patent infringements, a Supreme Court of the United States 1943 decision restored the prior patents of Oliver Lodge, John Stone, and Tesla.[179] The court declared that their decision had no bearing on Marconi's claim as the first to achieve radio transmission, just that since Marconi's claim to certain patented improvements were questionable, the company could not claim infringement on those same patents.[137][180]
 
  • #10
Whipley Snidelash said:
I didn’t think they were doing digital images before personal computers, just software code. I didn’t think mainframe software had graphic cover or title screens back then either. I compressed the file of a digital image it’s not really the same as compressing software.
I don't mean to minimize your achievement, @Whipley Snidelash − I think that your redundancy reduction reflected a good insight; however, graphics have been run on mainframes since at least the mid-'60s. There were of course no startup splash-screen graphics; however, in the late '70s Tektronix and IBM graphics terminals and Calcomp plotters were not uncommon at universities and other research-oriented installations.

From http://www.cadhistory.net/13%20IBM,%20Lockheed%20and%20Dassault.pdf:
CADAM (Computer-graphics Augmented Design and Manufacturing) began as an internal mainframe application referred to as “Project Design” within Lockheed’s Burbank, California operation in 1965. It was initially implemented on IBM 360 computers using IBM’s 2250 graphics display terminals. From the start, a primary objective was to minimize response time. Working with IBM, the two companies determined that optimum productivity would be achieved if the response time for individual operations could be kept under 0.5 seconds.​

In the late '70s, GDDM was not uncommon − https://en.wikipedia.org/wiki/Graphical_Data_Display_Manager − it was based in part on GKS − https://en.wikipedia.org/wiki/Graphical_Kernel_System − and PHIGS (Programmer's Hierarchical Interactive Graphics Standard) − https://en.wikipedia.org/wiki/PHIGS.

Also, in the case of a run of blanks at the end of a line, a keypunch operator would leave trailing blanks unpunched, rather than punching blanks or nulls (in EBCDIC, X'40' and X'00', respectively), and when the 80-column logical records were stored on disk or tape, or were printed, the first non-coded column would be interpreted as a CRLF.
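That trailing-blank convention amounts to truncating each fixed-width record at its last punched column, which is itself a crude form of run suppression. A rough sketch in Python, using the EBCDIC X'40' blank mentioned above; the payload bytes and function name are illustrative:

```python
EBCDIC_BLANK = 0x40  # EBCDIC space, X'40'

def strip_trailing_blanks(record: bytes) -> bytes:
    """Drop unpunched trailing blanks from a fixed-width (e.g. 80-column) record."""
    end = len(record)
    while end > 0 and record[end - 1] == EBCDIC_BLANK:
        end -= 1
    return record[:end]

# An 80-column record whose last 76 columns were left unpunched.
# (The payload here is ASCII purely for readability; only the blank
# byte value matters for the truncation logic.)
card = b"DATA" + b"\x40" * 76
assert len(card) == 80
assert strip_trailing_blanks(card) == b"DATA"
```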

I think also that your insight is related to what was specified beginning in 1999 by the W3C in establishing standards for SVG, which e.g. involves specifying text that defines a path between point A and point B instead of specifying a raster image.

From https://en.wikipedia.org/wiki/Scalable_Vector_Graphics:

[Image: bitmap vs. vector comparison.] This image illustrates the difference between bitmap and vector images. The bitmap image is composed of a fixed set of pixels, while the vector image is composed of a fixed set of shapes. In the picture, scaling the bitmap reveals the pixels while scaling the vector image preserves the shapes.

A rudimentary use of run-length encoding that has been around since the late 1800s as FEN − https://en.wikipedia.org/wiki/Forsyth–Edwards_Notation − and since '93 as PGN − https://en.wikipedia.org/wiki/Portable_Game_Notation − is chess position encoding, which notes left-to-right consecutive unoccupied squares by the numeric counts thereof.
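That FEN rank encoding can be sketched in a few lines of Python, using '.' for an empty square; the function name is illustrative:

```python
def encode_rank(squares: str) -> str:
    """Encode one chessboard rank FEN-style: runs of empty squares
    ('.') become digit counts; piece letters pass through unchanged."""
    out = []
    empty = 0
    for sq in squares:
        if sq == ".":
            empty += 1                # extend the run of empty squares
        else:
            if empty:
                out.append(str(empty))  # flush the pending run count
                empty = 0
            out.append(sq)
    if empty:
        out.append(str(empty))          # flush a run at the end of the rank
    return "".join(out)

print(encode_rank("R...K..R"))  # "R3K2R" - rook, 3 empty, king, 2 empty, rook
print(encode_rank("........"))  # "8" - a fully empty rank collapses to one digit
```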
 
  • #11
Whipley Snidelash said:
This resulted in compressing the graphic file tremendously. I basically invented a method of digital image data compression. Did anyone do it before I did?
Perhaps the most famous work in that field was Claude E. Shannon, "A Mathematical Theory of Communication", Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656, 1948. The field is data compression, and compressing image files is just one application of it; Shannon's great contribution was to show the limits on how far communications could be compressed.

Thomas Edison (1847–1931) also did work on data compression for telegraphy.

I wager that historians might find even earlier efforts.

The more smart people we have on this planet, the more difficult it becomes to have an idea that is truly original.
 
  • #12
Whipley Snidelash said:
Summary:: Where do I fit in the history of digital image data compression (if at all)? Anybody do it before I did?

This resulted in compressing the graphic file tremendously. I basically invented a method of digital image data compression. Did anyone do it before I did?
I have re-invented quite a number of things, and actually invented a few. The actual inventions were more lucrative, but for me it is no less satisfying to have the flash of insight. So who cares whether it was new to the planet (probably someone on planet ZOG already did it anyway)? Incidentally, it is good to meet a devotee of both the right Rev. Spooner and Rocket J. Squirrel.
 

1. How did data compression first begin?

Data compression can be traced back to the earliest forms of written communication, where symbols were used to represent words and phrases. However, the first documented use of data compression techniques was in the telegraph industry in the 19th century, where Morse code was used to transmit messages in a more efficient manner.

2. Who is credited with the invention of modern data compression techniques?

The modern era of data compression is largely attributed to the work of mathematician Claude Shannon in the 1940s. His groundbreaking paper titled "A Mathematical Theory of Communication" laid the foundation for many of the compression techniques still used today.

3. How has data compression evolved over time?

Data compression has evolved significantly since its beginnings in telegraph communication. With the rise of computers and digital technology, more sophisticated algorithms and techniques have been developed, leading to higher levels of compression and faster processing speeds.

4. What are the main benefits of data compression?

Data compression offers several benefits, including reduced file sizes, faster transmission speeds, and lower storage costs. It also allows for more efficient use of bandwidth and can improve overall system performance.

5. What are the current challenges in the field of data compression?

One of the main challenges in data compression is finding a balance between compression efficiency and loss of data. While higher compression rates can yield significant file-size reduction, they can also lead to loss of data or a decrease in quality. Additionally, as technology continues to advance and data becomes more complex, there is a constant need for new and improved compression techniques to keep up with the demand for faster and more efficient data processing.
