# Doubt in Digital Technology

challarao
Hi!
I am so fascinated by computers and digital technology...
If we consider a number or a letter, each has a specific digital code, and software converts between those codes and the numbers or letters humans read...
In computers every signal is digitized, so in that case videos are also digitized..
I've been puzzling over how they are coded. Suppose in a video file a person wears a red shirt and has white skin.. then, in order to output the signal to the monitor, are there any colour codes in computers for sending the output...?
Sorry for any mistakes I made..
Please explain if you know about these aspects...
Any help, suggestion, or weblink is highly appreciated...
Thanks.. waiting for replies

I can't quite understand your question. However, colors are coded by describing them as the sum of three primaries (red, green, and blue). For each primary the intensity is digitized, and these three numbers describe the color at each pixel of the image.

The above is the basic description of a raw bitmap encoding. In practice there are methods to compress the data, exploiting the correlation between neighboring pixels.

Gold Member
Colors are generally described with a set of 32 bits segmented into 8 bits per color component Red, Green, Blue and Alpha.

So for every dot on your screen there is a code like:
11111111000000001111111100000000
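That 32-bit code can be split back into its four 8-bit components with bit shifts. A minimal Python sketch, assuming the components are packed in ARGB order (the actual byte order depends on the pixel format in use):

```python
# The 32-bit pixel value from above, assumed packed as ARGB.
pixel = 0b11111111000000001111111100000000

a = (pixel >> 24) & 0xFF  # alpha = 255 (fully opaque)
r = (pixel >> 16) & 0xFF  # red   = 0
g = (pixel >> 8) & 0xFF   # green = 255
b = pixel & 0xFF          # blue  = 0
```

So this particular code would be an opaque pure green under that packing assumption.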

A Blu-ray sized movie with a 1920x1080 resolution will have 2,073,600 pixels per frame. At 32 bits each that is 66,355,200 bits per frame. At 24 frames per second that is about 1,593 Megabits per second.
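The arithmetic can be checked with a few lines of Python:

```python
# Raw (uncompressed) video bitrate for 1080p at 32 bits per pixel, 24 fps.
pixels_per_frame = 1920 * 1080          # 2,073,600 pixels
bits_per_frame = pixels_per_frame * 32  # 66,355,200 bits
bits_per_second = bits_per_frame * 24   # 1,592,524,800 bits/s
megabits_per_second = bits_per_second / 1_000_000  # about 1,593 Mbit/s
```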

However, Blu-ray players only operate at up to about 40 Megabits per second for video.

Luckily, a good deal of the data we see on our screen is redundant. The way they can cram all of that information into such a small stream (Nearly 40x smaller than the raw format) is through data compression algorithms that take advantage of certain kinds of redundancy in the images and video they compress.
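As a toy illustration of exploiting redundancy (real video codecs are far more sophisticated than this), here is a run-length encoding sketch in Python: a row of mostly identical pixels collapses into a few (value, count) pairs.

```python
def rle_encode(pixels):
    """Run-length encode a row of pixel values into (value, count) pairs."""
    encoded = []
    for p in pixels:
        if encoded and encoded[-1][0] == p:
            # Same value as the previous pixel: extend the current run.
            encoded[-1] = (p, encoded[-1][1] + 1)
        else:
            # New value: start a new run.
            encoded.append((p, 1))
    return encoded

row = [255] * 100 + [0] * 28   # 128 pixels...
print(rle_encode(row))         # ...stored as just [(255, 100), (0, 28)]
```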

Pattonias
Cool, I have never had this explained to me before. Interesting to know.

challarao
what is "54 68 65 20 61 74 68 65 69 73 74 73 20 72 69 6f 74 65 64 20 61 66 74 65 72 20 74 68 65 20 44 75 74 63 68 20 70 75 62 6c 69 73 68 65 64 20 61 20 62 6c 61 6e 6b 20 63 61 72 74 6f 6f 6e 2e
ΘΤ "
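Those pairs look like hexadecimal ASCII character codes, which ties back to the earlier point that letters are just numbers to a computer. In Python they can be decoded like this (`bytes.fromhex` ignores the spaces):

```python
hex_codes = (
    "54 68 65 20 61 74 68 65 69 73 74 73 20 72 69 6f 74 65 64 20 61 66 74 65 72"
    " 20 74 68 65 20 44 75 74 63 68 20 70 75 62 6c 69 73 68 65 64 20 61 20 62 6c"
    " 61 6e 6b 20 63 61 72 74 6f 6f 6e 2e"
)
message = bytes.fromhex(hex_codes).decode("ascii")
print(message)  # The atheists rioted after the Dutch published a blank cartoon.
```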

Homework Helper
Colors are generally described with a set of 32 bits segmented into 8 bits per color component Red, Green, Blue and Alpha.
Alpha is a transparency value used when combining multiple images (it controls how much of one image shows through another).
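A minimal sketch of how an alpha value might be used when compositing one pixel over another, using integer math on 8-bit channels (the exact formula and rounding vary between graphics systems):

```python
def blend(fg, bg, alpha):
    """Alpha-blend one 8-bit color channel over another.

    alpha is 0..255: 255 means the foreground fully covers the background,
    0 means the background shows through completely.
    """
    return (fg * alpha + bg * (255 - alpha)) // 255

# A half-transparent white pixel (alpha = 128) over a black background
# comes out mid-gray:
print(blend(255, 0, 128))  # 128
```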

24 bit RGB color uses 8 bits per color as mentioned, and is the most common method used on PCs.
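For example, the familiar #RRGGBB web color notation is just those three 8-bit values written in hexadecimal. The red shirt from the original question might be stored like this (a sketch; actual shades of red in a photo will vary per pixel):

```python
def to_hex_color(r, g, b):
    # Pack three 8-bit components into the familiar #RRGGBB notation.
    return f"#{r:02X}{g:02X}{b:02X}"

print(to_hex_color(255, 0, 0))  # #FF0000, pure red
```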

Many PC video cards also support 30 bit RGB color at 10 bits per color, but with a limited selection of resolutions, and most digital monitors don't support 10 bits per color (mainly CRTs and some digital projectors do). Few PC games support 30 bit color; one that I remember is Tomb Raider - Angel of Darkness, made back in 2003.

36 bit color is used in digital cinema, but I'm not sure whether that's RGB.

48 bit RGB color, at 16 bits per color, is supported on some graphics workstations.

http://en.wikipedia.org/wiki/RGB_color_model

http://en.wikipedia.org/wiki/Digital_cinema

challarao
Thanks for the replies
A small follow-up question
If the digital signals of a video are transferred from one computer to another, the second computer can decode and process the signals. But when the transfer is between an analog device and a digital one, how can the computer convert them? Is there some decoding of the voltage differences from the analog circuits?
In particular, I am asking about analog-to-digital converters.. do they exist?

Homework Helper
analog-to-digital converters.. do they exist?
Yes. Most modern cameras, camcorders, and video capture devices in general convert light into some type of digital format, using various types of components that convert the analog intensity of light for each color into some number of bits of data per color.
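The core step is quantization: mapping a continuous voltage to one of a fixed number of integer codes. A minimal sketch of what a hypothetical 8-bit ADC with a 1.0 V full scale would do (real converters are analog hardware, not software, but the mapping is the same):

```python
def quantize(voltage, v_max=1.0, bits=8):
    """Map an analog voltage in [0, v_max] to an integer code (8-bit ADC sketch)."""
    levels = (1 << bits) - 1              # 255 codes above zero for 8 bits
    code = round(voltage / v_max * levels)
    return max(0, min(levels, code))      # clamp out-of-range inputs

print(quantize(0.0))   # 0   (bottom of scale)
print(quantize(0.5))   # 128 (about half of full scale)
print(quantize(1.0))   # 255 (full scale)
```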

Older camcorders (Betacam, Betamax, VHS, Hi-8) converted the analog inputs into analog signals on a magnetic tape, without a conversion into a digital format. Fully analog TVs also converted received analog signals into analog levels of brightness for each color for display. The only digital aspects of these devices were the timings for sweep rates and refresh rates.

There were also analog computers, around 40 to 50 years ago, that could solve simple feedback-type differential equations using components that could integrate as well as add, subtract, multiply, and divide voltage levels. These were typically limited to a range of +/- 100 volts and accurate to only about 0.1 volt.