Why do modern X-ray images have better resolution compared to older ones?

artis
It just occurred to me while looking at some old X-ray images: why do modern X-ray images have so much more detail than older ones, especially really old ones from the beginning of the technology at the start of the 20th century?

I ask this because (unlike in a CT scanner, which is more sophisticated) the traditional X-ray imaging technique doesn't seem to have changed that much. You still stand in front of the X-ray source (a tube inside a movable head positioned in front of you), and behind you (say, for a chest X-ray) is a board that holds the film that is exposed (maybe some newer variants don't have the film anymore and instead capture the image and transfer it to a digital format?).

So given that the basic technique for producing X-rays by electronic means hasn't changed, in that we still accelerate electrons to high kinetic energy and make them hit a target, producing photons within the X-ray energy range, could it be that the much better resolution is due to advances in X-ray capture, like the films we use to record the rays and convert them to a picture? Or have there also been notable advances in X-ray tubes and in how tightly we can focus their output?
 
A smaller and brighter x-ray source would be an obvious way to improve the resolution.
I don't have a comparison with historic sources.
 
@mfb by brighter do you mean the intensity of the beam, or just a higher accelerating voltage / higher-frequency photons?

I would imagine the intensity, because with low intensity the probability of absorption increases and the image doesn't have a sharp edge between parts of higher density and those of lower density, like bone vs. tissue.
 
A higher intensity. It lowers the exposure time required for the picture, which reduces blur from patient motion.
 
Some advances in x-ray technology since the early 20th century that would lead to improved image quality...
  • Optimized (or at least much improved) x-ray spectra. For a given imaging site there are standard protocols in place that provide ideal kilovoltage settings and filtration so as to optimize the contrast in the primary signal for the thickness of the patient and the tissues etc. that will be imaged.
  • Quality control standards and regular testing.
  • Modern x-ray units don't use film anymore. They're flat panel imagers that use amorphous silicon photodiodes. After passing through the patient, the x-rays will first interact with a phosphor or scintillator of some sort. The thickness of this will have been optimized for the particular application. The photodiode array typically requires less light to form an image than film, which leads to (i) reduced dose to the patient, (ii) as above, reduced motion blur.
  • Also having an electronic image allows one to perform digital processing of the signal. Once you characterize a point-spread function for your panel system, on a basic level you can apply a deconvolution filter to improve the image (see the sketch after this list). Other filter techniques can get more sophisticated, reducing the scatter signal, for example.
  • Also consider the target focal spot size. A smaller x-ray source means a better image. I'm not sure what they would have been using in the early 20th century, but surely heating would have been an issue and since there would have been a lot of "one size fits all" machines, this would have limited the minimum focal spot size. Typical mammography focal spots are sub mm, I believe.
  • Reduced ripple voltage. Modern voltage pulses are pretty much rectangular, but back in the day the AC rectification came with "ripple" which would have presented some challenges to controlling the emitted x-ray spectra, I suspect.
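To make the deconvolution point above concrete, here is a minimal sketch of a Wiener-style deconvolution in Python/NumPy. It is my own illustration, not a vendor's processing chain; the synthetic image, PSF, and regularization constant are made-up assumptions.

```python
import numpy as np

def pad_psf(psf, shape):
    """Zero-pad the PSF to the image shape and roll it so its centre sits at (0, 0)."""
    p = np.zeros(shape)
    p[:psf.shape[0], :psf.shape[1]] = psf
    return np.roll(p, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))

def wiener_deconvolve(image, psf, reg=1e-2):
    """Divide out the PSF in the frequency domain; `reg` damps noise amplification."""
    H = np.fft.fft2(pad_psf(psf, image.shape))
    G = np.fft.fft2(image)
    return np.real(np.fft.ifft2(G * np.conj(H) / (np.abs(H) ** 2 + reg)))

# Demo: blur a synthetic "radiograph" with a small Gaussian PSF, then sharpen it back.
truth = np.zeros((128, 128))
truth[40:90, 50:80] = 1.0                       # a high-contrast block standing in for bone
x = np.arange(-3, 4)
psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / 4.0)
psf /= psf.sum()
blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(pad_psf(psf, truth.shape))))
restored = wiener_deconvolve(blurred, psf)
print(f"blurred MAE: {abs(blurred - truth).mean():.4f}, "
      f"restored MAE: {abs(restored - truth).mean():.4f}")  # restored should be closer to truth
```

Real panel software layers far more on top of this (scatter correction, noise-adaptive filtering), but the basic idea of characterizing and dividing out the blur is the same.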
 
artis said:
(maybe some newer variants don't have the film anymore instead capture the image and transfer it to a digital format?)
Yes, and then I think computer software enhances the images. It is amazing what image-processing software can do.
 
An advantage of digital radiography over conventional screen/film radiography is not in the detail that can be appreciated but in the extended range of penetrating radiation that can be observed. Let me elaborate. Screen/film radiography depends on converting a radiation exposure into a change in optical density (opaqueness; OD = -log(fraction of light transmitted)). The human eye can only distinguish about 30 shades of gray between black and white, and the sensitivity of film is limited to a relatively narrow range of exposures where the change in OD vs. the change in exposure (the contrast) is high. So the x-ray exposure (determined by the tube kVp, current, and time) must be selected carefully for the particular patient, as well as for the reason for the exam, to present the ODs in an optimal range.

The human eye has trouble discerning differences at low and high OD under normal viewing conditions, leaving areas that are too light or too dark unavailable for analysis. (Too-dark areas can be studied with a brighter light, but contrast is lost.) Digital radiography, by detecting a larger range of radiation exposures, makes available information that would otherwise have been lost in the under- and overexposed regions of the film.
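As a quick numerical illustration of the OD definition above (the transmission fractions are made-up examples, not values from the post):

```python
import math

def optical_density(transmitted_fraction):
    """OD = -log10(I / I0): how much of the viewing-box light gets through the film."""
    return -math.log10(transmitted_fraction)

# Illustrative transmission fractions from lightly to heavily exposed film regions.
for frac in (0.5, 0.1, 0.01, 0.001):
    print(f"transmits {frac:>6.1%}  ->  OD = {optical_density(frac):.1f}")
# transmits  50.0%  ->  OD = 0.3
# transmits  10.0%  ->  OD = 1.0
# transmits   1.0%  ->  OD = 2.0
# transmits   0.1%  ->  OD = 3.0
```

A three-decade range of light transmission collapses into OD 0.3 to 3.0, and only part of that range is comfortably visible on a light box, which is why film exposures had to be chosen so carefully.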

So we find that digital radiography all but eliminates repeat x-rays caused by incorrectly setting the x-ray machine, requires lower exposures, and allows one to "see" radiation exposures beyond what film can represent, by letting one electronically shift the displayed OD range on the computer monitor to different ranges of exposure. Additionally, the software can modify the contrast of the image as well as smooth noise and sharpen edges. The images can be made to look a lot crisper.

The resolution, however, is generally not that much improved. The reason is that the focal spot of the tube remains about 0.6-1.0 mm for normal radiography, and the distance between the observed structures and the detector causes blurring (increased penumbra). Additionally, the pixel size of the image ultimately limits resolution for the large-format films (14 x 17 inches) used for abdominal studies. Mammograms and dental x-rays can look spectacular because they require less exposure, which permits a smaller focal spot size (down to 0.3 mm), and because the detector is placed basically against the area of interest, further reducing the effect of the finite focal spot size.
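The focal-spot and distance effect described above is the standard geometric-unsharpness relation: the penumbra width is the focal spot size scaled by the ratio of object-to-image distance to source-to-object distance. A short sketch with purely illustrative geometry (the distances below are my assumptions, not numbers from the post):

```python
def geometric_unsharpness(focal_spot_mm, source_to_image_cm, object_to_image_cm):
    """Penumbra = focal spot size * (object-to-image distance / source-to-object distance)."""
    source_to_object_cm = source_to_image_cm - object_to_image_cm
    return focal_spot_mm * object_to_image_cm / source_to_object_cm

# Illustrative cases: an abdominal exposure vs. a dental exposure.
print(geometric_unsharpness(1.0, 100, 20))   # ~0.25 mm blur: larger focal spot, structure far from detector
print(geometric_unsharpness(0.3, 30, 1))     # ~0.01 mm blur: small spot, detector right against the tooth
```

This is why a small focal spot and a detector placed against the anatomy (as in dental and mammography work) give such sharp-looking images even without any digital trickery.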
 