Convert Bitmap to B&W - C Programming Tips

Summary
The discussion centers around converting a colored bitmap image to a black and white version using C programming techniques. The simplest method mentioned is to take the average of the RGB values for each pixel. However, a more accurate approach involves using weighted coefficients: R*0.299, G*0.587, and B*0.114, which reflect the human eye's sensitivity to different colors. These coefficients are derived from the ITU-R recommendation 601 and are rooted in color television history. The process involves calculating a weighted average of the RGB values to create a grayscale representation, where each new RGB value is set to this average. This method effectively treats the color image as three separate grayscale layers corresponding to red, green, and blue.
noblerare
Hi,

I was recently introduced to image manipulations and bit processing in my C programming class. I am pretty interested in the many creative ways an image can be encrypted and manipulated.

I was wondering, does anyone know of a way to change a given bitmap into a black and white rendition of the original colored image?

What do we do to the bytes?

Thanks!

Samuel
 
The easy way is to take an average.

The more advanced way is to weight them as R*0.299 + G*0.587 + B*0.114, which gives a better approximation of apparent brightness.
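
As a rough single-pixel sketch in C (the function names here are just illustrative, not from any particular library), the two approaches could look like this:

Code:
#include <stdint.h>

/* Simple average: treats all three channels equally. */
static uint8_t gray_average(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint8_t)((r + g + b) / 3);   /* operands promote to int, so no overflow */
}

/* Weighted average: BT.601 luma coefficients, closer to perceived brightness. */
static uint8_t gray_weighted(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint8_t)(0.299 * r + 0.587 * g + 0.114 * b + 0.5);   /* +0.5 rounds to nearest */
}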
 
hamster143 said:
The easy way is to take an average.

The more advanced way is to weight them as R*0.299 + G*0.587 + B*0.114, which gives a better approximation of apparent brightness.

That's interesting. Is there some explanation behind the magic numbers?
 
DavidSnider said:
That's interesting. Is there some explanation behind the magic numbers?

I don't know all the details, but these are the standard coefficients used to convert RGB to YUV, and presumably they are based on the sensitivity of the human eye to different wavelengths. These numbers appear in ITU-R Recommendation 601 (dating to 1982), and they are most likely older than that, possibly as old as color television.
 
So are you saying that if I take the average of the RGB values in each pixel and assign the average to each RGB value, that I would get a B&W rendition of the original image?

How exactly does that work?
 
noblerare said:
So are you saying that if I take the average of the RGB values in each pixel and assign the average to each RGB value, that I would get a B&W rendition of the original image?

How exactly does that work?

Try thinking of it this way: a color picture is three layers of greyscale images that specify the red, green, and blue components. All you are doing is taking a weighted average of the layers.

Average = Original.R*0.299 + Original.G*0.587 + Original.B*0.114;
New.R = Average; New.G = Average; New.B = Average;
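
To make that concrete, here is a rough C sketch of looping over a 24-bit pixel buffer and replacing every pixel with its weighted average. It assumes the pixel data has already been read into memory as B,G,R byte triples (the usual 24-bit BMP layout) and that row_size includes any padding at the end of each row; the function name and parameters are just for illustration.

Code:
#include <stddef.h>
#include <stdint.h>

/*
 * Convert a 24-bit pixel buffer to greyscale in place.
 * pixels   : start of the pixel data, 3 bytes per pixel in B, G, R order
 * width    : pixels per row
 * height   : number of rows
 * row_size : bytes per row, including any padding after the last pixel
 */
void to_grayscale(uint8_t *pixels, int width, int height, size_t row_size)
{
    for (int y = 0; y < height; y++) {
        uint8_t *row = pixels + (size_t)y * row_size;
        for (int x = 0; x < width; x++) {
            uint8_t b = row[3 * x + 0];
            uint8_t g = row[3 * x + 1];
            uint8_t r = row[3 * x + 2];

            /* Weighted average of the three "layers" (BT.601 luma). */
            uint8_t gray = (uint8_t)(0.299 * r + 0.587 * g + 0.114 * b + 0.5);

            /* Setting R = G = B to the same value gives a shade of grey. */
            row[3 * x + 0] = gray;
            row[3 * x + 1] = gray;
            row[3 * x + 2] = gray;
        }
    }
}

Writing the same value back into all three channels is what turns the colour image into a greyscale one, exactly as in the two lines above.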
 
