Analog TV signal, CRT workings

  • #1
artis
I had a talk with a friend of mine, and I just wanted to make sure I'm not wrong.
I actually have an older CRT set with an open back in another room, so I went to check my "question" out a bit before asking.

1) Is it right that the video signal (after some input demodulation etc. boards) then goes on directly to control the cathode-ray electron gun?

2) Am I right in assuming that the deflection coils only move (raster) the electron beam across the screen to form the horizontal lines and their retraces, as well as performing the vertical retrace of the beam (diagonally) to start a new frame, while the exact pattern in which the electrons from each of the three guns (colour TV) or the single gun (black & white) hit specific spots/pixels on the screen in each horizontal line is controlled by the video signal applied to the electron gun?

3) If I had a single white vertical stripe in the middle of my screen, does that mean that, in a single frame, in each of the horizontal lines the electron gun would only fire at the moment which coincides with the beam being at the mid position?

4) Would it be fair to say that the way in which a video signal controls the control grid of a CRT electron gun is very similar, if not identical, to the way an audio signal controls the grid of an audio amplifier vacuum tube? In an audio tube there is considerable idle bias current to achieve linear operation and minimize crossover distortion; what happens within a CRT electron gun in this regard? I would tend to think that the CRT's electron gun is much more nonlinear in this respect.

5) What synchronizes the firing of the electron gun and the raster/deflection of the beam in order for the picture to be accurate? My own guess would be that there is a local oscillator in the TV set as well as a common oscillator at the transmitting facility, so they synchronize the video signal to match the horizontal scan frequency (for PAL, 15,625 Hz), so that whenever the video signal makes the gun fire, the beam as a whole is in the correct position?

PS. What about the vertical frame rate: is that synchronized to the grid (mains) frequency, and at which end, the transmitting one or both? I see that for PAL it's 50 Hz interlaced, so a "real" frame is formed only at 25 Hz, while for NTSC it is 60 Hz, so a real frame is 30 Hz. Was this done deliberately to simplify broadcasting equipment?

thanks.
 
  • #2
(1) Yes, the three cathode rays are controlled by an RGB video signal (of a kind), at around 100-200 V DC as I recall.
(2) The video signal is continuous across the screen: for color TV there is a metal mask in front of the screen which prevents the beams from hitting the wrong pixels.
(3) If you have a single, intense vertical stripe, then most likely the output capacitor of the horizontal deflection circuit is dead: no or reduced horizontal deflection (while the HV part, which is powered by the horizontal deflection, is still working).
(4) It IS an oversized, specialized electron tube, with all the nonlinearity that an electron tube can offer.
(5) The H and V deflection sync is usually based on some kind of PLL-like circuitry, but there is no actual pixel sync at all: you can adjust some delays and bias currents to set the picture to the middle.
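To make the PLL-like idea concrete, here is a minimal software sketch (my own illustrative analogy, not a real TV circuit): a local oscillator free-runs near the nominal PAL line rate and nudges its period toward the timing of the incoming sync pulses. The loop gain and the input timing are arbitrary assumptions.

```python
# Minimal sketch (not a real TV circuit): a software analogue of the
# PLL-like line-sync idea described above. A local oscillator runs near
# the nominal PAL line rate (15,625 Hz) and nudges its period so that
# its flyback stays aligned with incoming sync-pulse times.

NOMINAL_LINE_HZ = 15_625.0          # PAL horizontal scan frequency
KP = 0.05                           # proportional loop gain (assumed)

def lock_to_sync(sync_times):
    """Given a list of received sync-pulse times (seconds),
    return the local oscillator's flyback times after locking."""
    period = 1.0 / NOMINAL_LINE_HZ  # free-running period
    local_time = sync_times[0]      # start roughly aligned
    flybacks = []
    for t_sync in sync_times:
        phase_error = t_sync - local_time   # how late/early we are
        period += KP * phase_error          # loop filter: nudge the period
        local_time += period                # schedule the next flyback
        flybacks.append(local_time)
    return flybacks

# Example: sync pulses arriving slightly faster than nominal
pulses = [n / 15_640.0 for n in range(200)]
print(lock_to_sync(pulses)[-1], pulses[-1])  # converges toward the input timing
```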
 
  • #3
I hope this attempt at a reply is more readable.
1) Is it right that the video signal (after some input demodulation etc. boards) then goes on directly to control the cathode-ray electron gun?
Yes. Usually by cathode modulation rather than "grid" modulation.
2) Am I right in assuming that the deflection coils only move (raster) the electron beam across the screen to form the horizontal lines and their retraces, as well as performing the vertical retrace of the beam (diagonally) to start a new frame, while the exact pattern in which the electrons from each of the three guns (colour TV) or the single gun (black & white) hit specific spots/pixels on the screen in each horizontal line is controlled by the video signal applied to the electron gun?
Yes. The vertical retrace might take the time of a few scan lines, but the beam is blanked during this time so you do not see it.
3) If I had a single white vertical stripe in the middle of my screen, does that mean that, in a single frame, in each of the horizontal lines the electron gun would only fire at the moment which coincides with the beam being at the mid position?
Every frame, the gun fires at that moment.
4) Would it be fair to say that the way in which a video signal controls the control grid of a CRT electron gun is very similar, if not identical, to the way an audio signal controls the grid of an audio amplifier vacuum tube? In an audio tube there is considerable idle bias current to achieve linear operation and minimize crossover distortion; what happens within a CRT electron gun in this regard? I would tend to think that the CRT's electron gun is much more nonlinear in this respect.
It is the same principle. Remember that the video signal is uni-directional, going from black (zero volts, say) to white (a positive voltage, say). So we normally bias the tube to cut-off, and the video drives it "on". Any triode device has a nonlinear characteristic in which the plate current is proportional to the grid voltage raised to the power 3/2. This creates a visual effect where whites are too bright, and is referred to as gamma. It is normal for the TV station to apply the opposite characteristic to compensate.
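To illustrate the gamma idea numerically, here is a small sketch. The 2.2 exponent is the conventional broadcast approximation I am assuming for illustration; the post above quotes the triode's 3/2-power law, and real transfer curves differ in detail.

```python
# Rough numerical sketch of gamma pre-correction: the CRT's power-law
# response distorts the grey scale unless the station pre-distorts the
# signal with the inverse power. The exponent below is an assumption.

CRT_GAMMA = 2.2            # display: light_out ~ signal ** 2.2 (approximation)

def station_precorrect(linear_level):
    """Camera/transmitter side: apply the inverse gamma (signal in 0..1)."""
    return linear_level ** (1.0 / CRT_GAMMA)

def crt_display(signal_level):
    """Receiver side: the tube's nonlinear response (light out in 0..1)."""
    return signal_level ** CRT_GAMMA

for scene in (0.0, 0.25, 0.5, 0.75, 1.0):
    reproduced = crt_display(station_precorrect(scene))
    print(f"scene {scene:.2f} -> reproduced {reproduced:.2f}")  # matches the scene
```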
5) What synchronizes the firing of the electron gun and the raster/deflection of the beam in order for the picture to be accurate? My own guess would be that there is a local oscillator in the TV set as well as a common oscillator at the transmitting facility, so they synchronize the video signal to match the horizontal scan frequency (for PAL, 15,625 Hz), so that whenever the video signal makes the gun fire, the beam as a whole is in the correct position?
The video signal is usually referenced to 1 volt for measurement purposes. Black is represented by 0.3 V and white by 1 V. At the end of every line, the video is disconnected, or blanked, and during this period a pulse from 0.3 V down to 0 V is inserted. This is the line synchronising signal. At the receiver, these pulses are separated out and applied to a PLL generating the line scan.
The frame synchronising signal is also a group of pulses, applied at the end of every frame and occupying a few lines.
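A toy sketch of one line built from the levels just described. The durations are rough PAL-like assumptions on my part, not exact standard values.

```python
# Toy composite line: active video between 0.3 V (black) and 1 V (white),
# with a sync pulse dropping to 0 V during line blanking.

LINE_US = 64.0        # one PAL line lasts about 64 microseconds
SYNC_US = 4.7         # sync pulse width (approximate)
BLANK_US = 12.0       # total line blanking (approximate)

def line_level(t_us, picture):
    """Return the composite voltage at time t_us into the line.
    `picture(x)` gives brightness 0..1 for x across the visible width."""
    if t_us < SYNC_US:
        return 0.0                                   # sync tip
    if t_us < BLANK_US:
        return 0.3                                   # rest of blanking: black level
    x = (t_us - BLANK_US) / (LINE_US - BLANK_US)     # 0..1 across the screen
    return 0.3 + 0.7 * picture(x)                    # black..white maps to 0.3..1.0 V

# Example: the single white vertical stripe from question 3
stripe = lambda x: 1.0 if 0.48 < x < 0.52 else 0.0
print([round(line_level(t, stripe), 2) for t in (2, 8, 20, 38, 60)])
# -> [0.0, 0.3, 0.3, 1.0, 0.3] : the gun only goes to white near mid-line
```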


PS. What about the vertical frame rate: is that synchronized to the grid (mains) frequency, and at which end, the transmitting one or both? I see that for PAL it's 50 Hz interlaced, so a "real" frame is formed only at 25 Hz, while for NTSC it is 60 Hz, so a real frame is 30 Hz. Was this done deliberately to simplify broadcasting equipment?
The scan frequencies are exact and, since about 1955, not locked to the mains. The 50 Hz vertical scan is called the field scan, and two of these make a frame, of which we have 25 per second. It was originally thought by EMI, when they developed the system in the 1930s, that interlaced scanning would provide 405-line resolution using two 202½-line fields, and so halve the bandwidth. They also thought that the 50 Hz field frequency would give less brightness flicker than a 25 Hz frame frequency. However, it has been found that interlacing produces annoying flicker on moving objects, which negates the advantages, so the modern idea is to avoid interlacing.
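As a quick arithmetic check of the figures quoted in the question (the 625-line PAL system):

$$
\frac{15\,625\ \text{lines/s}}{50\ \text{fields/s}} = 312.5\ \text{lines per field},\qquad
2 \times 312.5 = 625\ \text{lines per frame},\qquad
\frac{50\ \text{fields/s}}{2} = 25\ \text{frames/s}.
$$

The half line per field is what makes the odd and even fields interleave automatically.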
 
  • #4
tech99 said:
PS. What about the vertical frame rate: is that synchronized to the grid (mains) frequency, and at which end, the transmitting one or both? I see that for PAL it's 50 Hz interlaced, so a "real" frame is formed only at 25 Hz, while for NTSC it is 60 Hz, so a real frame is 30 Hz. Was this done deliberately to simplify broadcasting equipment?
The scan frequencies are exact and, since about 1955, not locked to the mains. The 50 Hz vertical scan is called the field scan, and two of these make a frame, of which we have 25 per second. It was originally thought by EMI, when they developed the system in the 1930s, that interlaced scanning would provide 405-line resolution using two 202½-line fields, and so halve the bandwidth. They also thought that the 50 Hz field frequency would give less brightness flicker than a 25 Hz frame frequency. However, it has been found that interlacing produces annoying flicker on moving objects, which negates the advantages, so the modern idea is to avoid interlacing.
It was always my understanding that an important feature of interlaced scanning was to prevent an unsightly "bloom" in brightness should there be a slight side-by-side overscan from any imperfection in the raster scanning geometry. If a segment of the raster were to be partially re-scanned by the very next sweep, that phosphor would show doubly bright and spoil the image. If the offending second sweep came on an alternate field, the phosphor would have already decayed and no inordinate brightness would result. So the raster scan lines always had a null line interspersed, and thereby the aiming requirements were much mitigated.

Incidentally I am in awe of the folks who folded a backward compatible color signal into existing black and white TV signal structure using tube driven analog circuitry.
 
  • #5
@Rive, no, my TV is not broken; the question was just to understand it better.
@hutchphd I think I have also read that this was the main reason: for the phosphor to display adequately, the scan had to be interlaced, and each of the two "fields" making up a single frame had to contain different lines, the first field the odd-numbered lines and the second the even ones. In this way the whole screen was "updated" evenly. I read that if a whole frame were drawn in one pass, then, due to the rather "slow" raster of the gun, the top of the frame would start to fade while the bottom was still being drawn, so the whole picture would flicker more.
@tech99 so from what you are saying, I understand that the screen basically does what the input video signal tells it to do; both the colour information and the blanking and frame separation are driven by the video signal?

In this case: we have been transmitting in digital form for the last decade or more, and I now receive my TV signal over an ordinary internet fibre-optic connection. I did an upgrade on my older CRT set by adding a decoder; in other words, I soldered a SCART-type cable to my older TV's input "radio block" while removing the older input meant for the 75-ohm cable. So I can now connect the SCART cable to my digital decoder and watch a nice picture.
If what you say is true, then I suppose that with this digital-to-analog (RGB, I guess) input the TV now displays the picture differently than it did back when it worked from a typical analog cable signal?
I guess I have to look up what type of scanning is present within such a signal instead of the old one.
All in all the picture seems much better than the one I was used to as a kid back in the day, yet the TV is the same.
 
  • #6
artis said:
@Rive, no, my TV is not broken; the question was just to understand it better.
Just sounded like that...

artis said:
If what you say is true, then I suppose that with this digital-to-analog (RGB, I guess) input the TV now displays the picture differently than it did back when it worked from a typical analog cable signal?
The original RF => RGB & sync chain has some serious bandwidth limitations: by switching to composite or, in your case, possibly SCART with RGBI, the data content of the picture is naturally improved.
 
  • #7
hutchphd said:
Incidentally I am in awe of the folks who folded a backward compatible color signal into existing black and white TV signal structure using tube driven analog circuitry.
The NTSC work was incredible, even by today's standards. In particular:-
The insertion of the colour sub carrier between the monochrome side frequencies.
The orange-cyan transform. This gives more bandwidth to colours along this axis of the colour triangle, where the acuity of the eye is greatest.
The locking of the demodulating carrier to a short reference burst in the line blanking period, at a time when PLLs were unknown.
The shadow mask tube.
I am, however, also in awe of the original monochrome 405-line development in the UK by EMI, which introduced the concepts of black level, blanking intervals, sync pulses, front and back porches, and interlacing. And all this without an oscilloscope! Some years later, when they saw their waveform on a CRO, they were surprised to see that the picture waveform was exactly as intended.
 
  • #8
hutchphd said:
It was always my understanding that an important feature of interlaced scanning was to prevent an unsightly "bloom" in brightness should there be a slight side-by-side overscan from any imperfection in the raster scanning geometry.
The OP question is too broad for a satisfactory answer. There are dozens of important factors to consider in a CRT.

I am not sure how relevant your reason for the choice of an interlaced picture was on any but the very early TV CRTs. Doubling the number of lines is not a problem for an appropriately designed CRT (e.g. HDTV). According to what I learned, it's to do with movement portrayal. Jerky movement is a serious problem at 12.5 frames a second (or 15, for the 'other' standard). By using interlace, there are 25 alternating fields per second but (almost) the full vertical resolution is available (405, 525 or 625 line pictures). It's a bit like the Maltese Cross shutter used in film projectors, which increases the frequency of the flicker and is easier to watch, even at the old 18 (?) frames per second.

artis said:
I guess I have to look up what type of scanning is present within such a signal instead of the old one.
The scanning would have to be essentially the same. It would be pointless to try to make it different, because all the TV scan controls are designed specifically for decoded standard PAL or NTSC signals. The scanning circuits have a seriously hard job to scan and focus three separate modulated electron beams, at all times, onto the right places on the screen. A digital-to-analogue standards converter (in the 'set-top box') has to assemble a PAL or NTSC signal to provide an RF signal for a conventional set and also, for SCART connection, R, G, B signals, a blanking signal and a composite signal, plus a lot of other stuff including sound etc. SCART is a real dog's dinner, as are all 'compatible' interfaces used when standards change.
On the whole I'd rather buy the very cheapest flat-screen TV and consign PAL and NTSC to history (however interesting and exciting those systems were in their day).
 
  • #9
I might be wrong, but I think @sophiecentaur you got your numbers off: the interlaced analog TV signal was 50 Hz, so 50 fields a second, where 25 carry the even lines and 25 the odd ones; put together you get 25 full frames, or 25 Hz. The same for NTSC, only 60 Hz interlaced and 30 full frames per second.

Well, SCART is now also history, I guess, although I still have a SCART output on my digital TV decoder alongside an HDMI output. It's nice because I have a few old TV sets that I converted (well, I just added three wires from the SCART cable to my TV's input demodulation circuitry, although I can't remember which exact ones) and it seems to work fine. Although SCART is said to support even 1080p, the signal from my decoder through SCART doesn't seem to do so. The HDMI output, on the other hand, does so perfectly.

I guess the whole "greatness" of flat screens is that they can be big and not flicker, since the whole screen changes together instead of being drawn by an electron beam. I wonder, just for curiosity (even though this would have been a cost nightmare), why they didn't make a CRT with a cube shape: in other words, instead of having a single gun cover the whole screen, have multiple guns, each covering just a tiny portion of the screen (sort of like LED flat screens control their backlight). Each gun would then hit only a tiny area, which it could do much faster, and the blurring effects associated with the beam geometry etc. would not be a problem, although the CRT drive circuitry would probably have to be much more advanced, as the signal would need to be split up and the right part given to each gun to reproduce.
 
  • #10
A problem with reducing flicker with analogue is that we do not have a storage medium; we have to rely on the eye and persistence of vision. This makes it impossible to do tricks as you describe.
 
  • #11
artis said:
I might be wrong, but I think @sophiecentaur you got your numbers off: the interlaced analog TV signal was 50 Hz, so 50 fields a second, where 25 carry the even lines and 25 the odd ones; put together you get 25 full frames, or 25 Hz. The same for NTSC, only 60 Hz interlaced and 30 full frames per second.
Sorry - you are absolutely right about that. My memory divided the field rate by two twice!
The basic idea is still there, though. For still images the definition is the same as for non-interlaced scanning, but movement portrayal is much better.
artis said:
why they didn't make a CRT with a cube shape: in other words, instead of having a single gun cover the whole screen, have multiple guns, each covering just a tiny portion of the screen (sort of like LED flat screens control their backlight)
You can bet your life that someone, somewhere actually considered virtually every possible way of getting big sharp displays with good motion portrayal. So your basic idea probably did go through someone's mind.
The fact is that they had to use three (RGB) guns, and those guns needed to produce spots in the same place on the screen. That was a nightmare to arrange with the old shadow-mask screen. Trinitron and beyond made life (and convergence) easier, but the edges of multiple sub-screens would need to register very well in order to avoid a grid of blurred lines all over the screen.
Of course, the plasma screen has, effectively, a matrix of many tiny CRTs all over the screen. A fearsome thing in many ways, and the (old-fashioned) electronics that drove it would have been difficult. I had my son's old one for a few years: a great sweaty beast that whistled and fizzed until, finally, the high-power driving circuits gave up and it produced lines and patches of green, IIRC.
As @tech99 says, without a cheap storage medium, none of the clever systems were possible. I remember the first field-store standards converters were the size of a small house! (Analogue quartz field delay lines for storage.)
 
  • #12
Regarding tricks to improve analogue pictures: at one stage "spot wobble", using high-frequency vertical (Y) modulation, was introduced to broaden the lines until they touched; this is the optimum width. Also notice that in the vertical direction we have spatial sampling, so a 525-line picture could only portray 262 distinct pixels (or cycles) in that direction, whereas along the lines there is no spatial sampling, so it could in principle portray more pixels.
Also, when Baird was investigating his TV system, he found that left to right scanning, as when reading, gave better flicker results due to the brain being programmed to read text.
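As a rough illustration of the point about horizontal detail being limited by bandwidth rather than by discrete samples, here is a back-of-envelope sketch. The figures (roughly 52 µs of active line and about 5 MHz of luminance bandwidth) are typical round numbers I'm assuming, not exact standard values.

```python
# Back-of-envelope: how many horizontal "pixels" a bandwidth-limited
# analogue line can portray. Values below are assumed round numbers.

ACTIVE_LINE_US = 52.0      # visible portion of a 64 us PAL line (approx.)
VIDEO_BW_MHZ = 5.0         # luminance bandwidth (approx.)

cycles_per_line = VIDEO_BW_MHZ * ACTIVE_LINE_US   # MHz * us = cycles
pixels_equivalent = 2 * cycles_per_line           # one light + one dark per cycle

print(f"{cycles_per_line:.0f} cycles -> about {pixels_equivalent:.0f} alternating 'pixels' per line")
# ~260 cycles, i.e. roughly 520 black/white transitions across the width
```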
 
  • #13
Now just a side idea, more for fun than anything: to my mind a laser beam is much more focused over a small distance (the distance between gun and screen being very small) than an electron beam, because an electron beam spreads out (mostly due to space-charge effects, the same thing that limits electron beam currents etc.), so it cannot stay as focused and needs additional things to keep the beam in check, like a shadow mask, which wouldn't be needed if the beam were "razor thin" and sharp. I'm only not sure how a laser beam would excite a pixel to shine, or how one would "deflect" the beam to raster over every pixel.

Well, what do you know: I sort of feel like a would-be inventor who comes up with ideas on his own, only to find someone has done them already. It seems such a device already exists; I just did some googling. Wow, interesting, this probably needs a thread of its own.

https://en.wikipedia.org/wiki/Laser-powered_phosphor_display
 
  • #14
artis said:
to my mind a laser beam is much more focused over a small distance
For precise imaging a laser is very good but TV display needs a bright, efficient set of three primaries and I wonder if you could get the appropriate colours with a laser. Then there would be the problem of actually achieving a scan. Wouldn't it need to be a mechanical mirror? Moving parts are not good for length of life but perhaps you could use piezo electric. Electrons are probably easier to scan in many ways, I would think.
But, in any case, no one would use scanning for a TV display these days although there are many other applications for cutting and forming.
However, it's random thoughts like yours that are sometimes the precursors to great inventions so I can't knock it entirely as an idea.
 
  • #15
Well, it was just an idea for me, but it seems some folks have done more with this than just have the idea, and that happened quite a while ago.
If you ask me, I too doubt this can seriously rival flat screens, especially given that even LCD is now becoming obsolete and ever higher-resolution screens with even smaller pixels, like OLED with its light-emitting diode cells, are coming onto the market.

Anyway, googling "laser cathode ray tube" and the like, I found many research papers from around the early 2000s, which makes sense: back then LCD screens were still rather bulky and their contrast and resolution were still poor, so I assume a "better" CRT that could be lighter and have higher resolution seemed a real prospect.
 
  • #17
sophiecentaur said:
For precise imaging a laser is very good but TV display needs a bright, efficient set of three primaries and I wonder if you could get the appropriate colours with a laser. Then there would be the problem of actually achieving a scan. Wouldn't it need to be a mechanical mirror? Moving parts are not good for length of life but perhaps you could use piezo electric. Electrons are probably easier to scan in many ways, I would think.
But, in any case, no one would use scanning for a TV display these days although there are many other applications for cutting and forming.
However, it's random thoughts like yours that are sometimes the precursors to great inventions so I can't knock it entirely as an idea.
At Radiolympia in about 1939, the largest screen TV was made by Scophony Ltd. It used a discharge tube as its light source and a light modulator using an ultrasonic light valve, where ultrasound passes through a liquid-filled cell, which creates diffraction patterns. Line scan was obtained by a very small and fast rotating mirror drum with an arc of fixed mirrors to multiply the number of lines. But it was not the precursor of domestic receivers.
Before rejecting mechanical systems, just think of the video recorder, or the hard drive, both miracles of engineering.
 
  • #18
@tech99 do you happen to have any links or drawings of the device you just described?
 
  • #20
tech99 said:
a very small and fast rotating mirror drum
Nice one. There was an old method for measuring c which used a rotating polygonal mirror. That was an alternative to the rotating gear wheel of Fizeau's method. Also, there was a bizarre telecine system (film to TV) which also used a rotating polygonal mirror, allowing each film frame to be scanned by two sequential fields. All great makeshift devices, waiting for the same jobs to be done electronically. Personally, I wouldn't have liked the job of fettling such equipment every morning so the 'real scientist / engineer' could study it.

tech99 said:
Before rejecting mechanical systems, just think of the video recorder, or the hard drive, both miracles of engineering.
Agreed. They were both 'good ideas' (much better than that comment sounds, now I've written it!). The video recorder could only be made to work once the appropriate electronics existed to correct for the dreadful timing of rotating and wobbling shafts and drums. The Quadruplex head machine from Ampex used two-inch-wide video tape, and the heads scanned the tape laterally at high speed. Helical scan was a great advance and needed less fettling, and the home VHS machine was horrid but brilliant at the same time.
Hard drives are still with us, but their days are certainly numbered. Still, we couldn't have found that black hole without them, so they should go down in history.

PS. Did anyone ever hear of VERA? It was the Video Electronic Recording Apparatus, brought out by the BBC Engineering Research Department at around the same time that Ampex did the job much better. VERA used massive reels of tape, and the signal was recorded along the length of the tape! It did record and play back successfully, but a fault in the transport could produce a room full of spilled tape before the off button could be pressed.
 
  • #21
Thank you for reminding me about VERA.
 
  • #22
PS. I am still writing from a PC with an HDD; now that you mention it, @sophiecentaur, it makes me feel old :D

So my decoder uses a digital input over the typical 8-wire internet cable, which itself comes from a modem that receives its signal from an optical cable, so that's already one stage of conversion, from optical to plain digital; and then my TV decoder has its SCART output for older-style TVs, so that's another conversion of the signal into yet another form. Herein also is my question: so basically the TV decoder, with its circuitry, makes a full analog video/audio waveform from the digital signal, a waveform which resembles the one that would have been transmitted over the air back in the analog days, only without the carrier frequency, as there is now no need for that because the signal has already arrived in my house by different means?
 
  • #23
artis said:
it makes me feel old :D
I am jumping the gun a bit here. Solid-state memory can't handle TB of storage (not at a price you and I could afford), but it's only a matter of time.
I remember buying 1 MB of SSD for my old Psion 3a and it cost me 100 GBP. I still have it in a desk drawer. Useless!

artis said:
Herein also is my question: so basically the TV decoder, with its circuitry, makes a full analog video/audio waveform from the digital signal, a waveform which resembles the one that would have been transmitted over the air back in the analog days, only without the carrier frequency, as there is now no need for that because the signal has already arrived in my house by different means?
SCART uses baseband signals, and there are several options in the SCART spec / pinout. You can use RGB plus sync (four wires); composite, which is like a baseband PAL signal on a single wire; or S-Video, which has one wire for luminance (higher resolution) and one for the chrominance signal (lower resolution). Which do you want to use, and why?

I must say, this thread has certainly covered a lot of ground about TV matters in general.
 
  • #24
I already attached the SCART pinouts (soldered them to the demodulator board), but now I don't remember which were which. I used a diagram that a friend of mine, an electrician, gave me, since years ago there were still many older sets that had no SCART or any other input, only the old 75-ohm coax one.
 
  • #25
artis said:
I already attached the SCART pinouts
If that did the job then what do you want now? You are being too vague about it for me to understand where you want to go from here.

I remember that set-top boxes used to have a UHF output (as did VCRs), which could be tuned to a vacant UHF channel and which mimicked a regular broadcast signal. If the one you have doesn't have such a UHF output then there's nothing you can do about it.
 
  • #26
Well, you misunderstood me: everything works. I was just wondering about the difference in signals between the one that was originally transmitted through the air and came in via a local antenna over 75-ohm coax into my TV's UHF input, and the one that comes from the SCART RGB and is now soldered to my TV's inner demodulator board. Apparently there is a difference between these signals, because the original UHF input had an additional, so-called (at least here) "radio board", which I assume was the first stage that separated the RF carrier from the actual TV waveform, right?
Because, as far as I understand, the typical signals meant for short distances, namely S-Video, composite or RGB, are just video signals; they don't have any high-frequency carrier associated with them, since the carrier was only needed to broadcast the TV signal over the air across large distances.
PS. As for TV signals, we now have digital TV, so they are still transmitting over the air, but the waveform is digital, and I am not sure what the carrier looks like; I suppose there is a difference between the carrier for a digital broadcast and an analog one, or is it the same?
 
  • #27
artis said:
I was just wondering about the difference in signals between the one that was originally transmitted through the air and came in via a local antenna over 75-ohm coax into my TV's UHF input, and the one that comes from the SCART RGB and is now soldered to my TV's inner demodulator board
There are a number of options for a SCART pinout, as I mentioned previously; I have no idea which one you used, but they involve different numbers of wires according to the spec. If you wanted to provide the TVs with an RF signal then you would need to amplitude-modulate a UHF carrier (via an intermediate-frequency stage, to produce the right RF spectrum). I mentioned this too.
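As a purely illustrative sketch of that amplitude-modulation step (my own toy example: a scaled-down carrier frequency, and none of the vestigial-sideband filtering or sound carrier that a real TV modulator would need):

```python
# Toy AM: a baseband video voltage varying the envelope of a carrier.
# The carrier here is a few MHz for readability; real UHF is hundreds of MHz.

import math

CARRIER_HZ = 4e6            # stand-in carrier frequency (assumed)
SAMPLE_HZ = 40e6            # sample rate for the sketch

def am_modulate(video_levels):
    """video_levels: baseband samples in 0..1 (black..white).
    Returns AM samples: a carrier whose envelope follows the video."""
    out = []
    for n, v in enumerate(video_levels):
        t = n / SAMPLE_HZ
        out.append(v * math.cos(2 * math.pi * CARRIER_HZ * t))
    return out

# Example: a ramp from black to white across a few microseconds
ramp = [n / 200 for n in range(200)]
rf = am_modulate(ramp)
print(max(rf[:20]), max(rf[-20:]))   # envelope grows as the video level rises
```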
 
  • #28
OK, so I did some quick googling and found that there are miniature boxes available now, aka modulators, where one can feed in a desired video signal (whether HDMI or RGB or other) and the modulator modulates that video signal onto a carrier (using QAM, as I see it) and then provides the modulated RF at a coax output.
An example of such a device is in the link below:
https://www.google.com/search?q=vec...Uux4sKHT5vCGYQ_AUIDCgA&biw=1280&bih=888&dpr=1

So does this mean that, in theory, if I had an RF power amplifier and an antenna, I could make my own analog TV broadcast over an area (the size of which is determined by the power of my transmitter)?
 
  • #29
artis said:
So does this mean that, in theory, if I had an RF power amplifier and an antenna, I could make my own analog TV broadcast over an area (the size of which is determined by the power of my transmitter)?
That link describes an HDMI distribution system. That's not regular analogue TV. What receivers do you actually have? If you are talking digital, then why are we involving SCART and the like? Do any TV sets actually have both SCART and HDMI? Two different generations, surely. You must state your problem more precisely or we could be wasting a lot of time in idle discussion.
"In theory" you could set up a whole broadcasting system, but what really would be the point? You could end up spending a lot of money on kit to feed some ancient (?) TVs, which would be very hard to repair because spares are less and less available. IMO you risk wasting a lot of money if you launch out on any serious engineering with the level of knowledge you appear to have (judging from many of your posts).
 
  • #30
No, @sophiecentaur, I am not engineering anything with this; I am doing other things at the moment, and this is just reading and understanding.
It's just that the idea of having the capability to transmit one's own "TV" over a given area is a very intriguing one (probably a similar feeling to that felt by amateur radio enthusiasts), because owning miles of cable and distribution plant is much less likely for an individual, yet transmitting over the air is easier from a technical perspective.

As for the box in the link, I was just thinking from its description that it can take in a video signal via an HDMI cable from, say, a computer, modulate that signal onto a carrier, and output the result via a coax output, which seemingly could then be used to feed an amplifier and an antenna; otherwise, why would that little box use QAM?
I haven't looked at the specifics of this particular modulator, but I assume one could similarly have a digital modulator and output a digital transmission signal modulated onto a carrier and then transmit that. I think the basic idea would be the same, the only difference being that in the analog case one could use a simple antenna and feed the signal into an older-style TV, while the digital signal would require a decoder for use with an older TV, right?
 
  • #31
artis said:
No, @sophiecentaur, I am not engineering anything with this; I am doing other things at the moment, and this is just reading and understanding.
I'm not sure how to treat this. On one hand, everyone should be free to fantasize. On the other hand, PF is not fond of giving advice on how to do something that would be illegal. Even if you don't plan anything illegal, potentially thousands of other people in the coming years may find this thread via a Google search, and some of them may use the information for their own purposes.

You have adequate answers to your original question. Thread closed.
 

1. What is an analog TV signal?

An analog TV signal is a type of television signal that uses continuously varying electrical signals to transmit video and audio information. It is the older technology used in traditional televisions before the switch to digital signals.

2. How does an analog TV signal work?

Analog TV signals work by encoding video and audio information into an electrical signal that is transmitted over the airwaves. The signal is then received by a television antenna and decoded by the TV's tuner, which converts it into the images and sound that we see on the screen.

3. What is a CRT and how does it work?

A CRT (Cathode Ray Tube) is the display technology used in older television sets and computer monitors. An electron gun inside the evacuated tube generates an electron beam that is swept across a phosphor-coated screen, creating the images and colors that we see.

4. Why is analog TV being phased out?

Analog TV is being phased out in favor of digital TV because digital signals offer higher quality images and sound, as well as more efficient use of the broadcast spectrum. Digital signals also allow for additional features such as high-definition TV and interactive services.

5. Can I still watch analog TV with a digital TV?

Yes, you can still watch analog TV with a digital TV by using a digital converter box. However, most broadcasters have switched to digital signals, so the availability of analog channels may be limited. It is recommended to upgrade to a digital TV for better quality and more channel options.
