Join Physics Forums Today!

Wouldn't using AC to power a lamp result in flickering?

  1. Dec 1, 2017 #1
    Hello,

    I was thinking that using AC to supply power to a lamp would cause it to flicker, considering that the potential difference keeps changing signs.

    This would mean that the supply delivers voltage to the lamp, only to take it back after a very small amount of time. Voltage says something about the number of electrons per unit of time that a component 'receives'. So it's really just a constant process of giving electrons to and taking them from the lamp.

    Upping the frequency would mean you simply repeat this process more quickly. Would that not mean that lamps are actually flickering at all times, unless you use DC? Obviously the light would flicker so fast that you wouldn't be able to notice it.

    My English is OK, but not amazing, so I hope the jargon used is actually correct English as well.

    -Y
     
  3. Dec 1, 2017 #2

    anorlunda

    Staff: Mentor

    You're right, it does. However, the blink rate (2× the power frequency) is too fast for most people to see under most circumstances.

    When I was younger and my eyes were much better, I could sometimes see flicker in fluorescent lights. I recall noting that I could see flicker in Europe (with 50Hz power) but not in America (with 60Hz power). That was the limit of my visual perception. By the time I reached middle age, my eyes were no longer good enough to see flicker.

    It also affects the lifetime of incandescent bulbs, because the filaments heat and cool, expanding and shrinking, with each cycle. In the Thomas Edison museum in Menlo Park, NJ, some of Edison's DC light bulbs have been burning continuously for 120 years. That's much easier to do with DC.
     
  4. Dec 1, 2017 #3

    CWatters

    User Avatar
    Science Advisor
    Homework Helper

    The current heats the filament in both directions. The AC is normally 50 or 60 cycles per second so the lamp is heated 100 or 120 times per second. The time between heating pulses is very short so the filament doesn't cool down enough to see the flicker.
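    The "heated in both directions" point can be sketched in a few lines of plain Python (illustrative values, not real lamp parameters): the instantaneous power in a resistive filament goes as v(t)², which is never negative, so the heating pulses at twice the line frequency.

```python
import math

def instantaneous_power(t, f_line=60.0, v_peak=1.0, r=1.0):
    """Instantaneous power in a purely resistive load on AC mains.

    p(t) = v(t)^2 / R is non-negative, so both half-cycles heat the
    filament and the power pulses at twice the line frequency.
    """
    v = v_peak * math.sin(2 * math.pi * f_line * t)
    return v * v / r

# On 60 Hz mains the heating peaks arrive every 1/120 s (~8.3 ms),
# too short a gap for the filament to cool visibly.
peak_interval = 1.0 / (2 * 60.0)
```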
     
  5. Dec 1, 2017 #4

    FactChecker

    User Avatar
    Science Advisor
    Gold Member

    You need to be specific about the type of bulb you are talking about.
    An ordinary incandescent bulb filament heats up the same regardless of the direction of current flow. A smoothly alternating current goes through zero current as it changes from one current direction to the other, so the filament would briefly be heated less at that time. But it probably cools off such a small amount that it would be better to say that the brightness is "wavering" rather than "flickering".

    Fluorescent bulbs usually flicker. I don't know if LEDs do or not. It probably depends on the electronics that convert standard wall-socket power to the power supplied to the LED.
     
  6. Dec 1, 2017 #5

    CWatters

    User Avatar
    Science Advisor
    Homework Helper

    Yes, LEDs can also flicker. The ones that look like filaments seem to be the worst. Some LEDs respond so fast that it's possible to make them flicker deliberately at high frequencies and use that to make a network. I think it's called Li-Fi.
     
  7. Dec 1, 2017 #6

    Nugatory

    User Avatar

    Staff: Mentor

  8. Dec 1, 2017 #7

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    Those two frequencies fall around a very critical value for the sensitivity of our eyes to flicker. The US chose 60 Hz, which produces 120 brightness peaks per second; the UK chose 50 Hz, which produces just 100 peaks per second. The 60 Hz flicker is a lot less visible. There is another factor that used to make flicker in the US less visible: the US uses about half the mains voltage that the UK does. That means the lamp filaments tend to be more massive, so they heat up and cool down more slowly, which reduces the temperature swing (and hence the brightness swing). Double whammy: filament lighting in the US was much more satisfactory. Likewise, the (analogue) TV frame repetition rate was higher in the US and the pictures flickered less.
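    The thermal-mass argument can be sketched with a first-order low-pass model (an illustrative lumped approximation, not real filament physics, and the time constants below are made-up numbers): the sin² heating power is a DC term plus a component at twice the line frequency, and a longer thermal time constant attenuates that component more.

```python
import math

def relative_ripple(tau_s, f_line_hz):
    """Brightness ripple of a filament modelled as a first-order low-pass.

    The sin^2 heating power is a DC term (1/2) plus a cosine at twice the
    line frequency with amplitude 1/2; a first-order system with thermal
    time constant tau attenuates that cosine by 1/sqrt(1 + (w*tau)^2).
    """
    w = 2 * math.pi * (2 * f_line_hz)   # ripple sits at 2x line frequency
    return 0.5 / math.sqrt(1 + (w * tau_s) ** 2)

# A more massive 120 V filament (longer tau, say 40 ms) ripples less than
# a thinner 230 V one (say 20 ms), and 60 Hz helps further over 50 Hz.
```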

    LEDs have a much sharper on/off characteristic (also fluorescent tubes) and the flicker is more noticeable. I notice it particularly with drips falling from a water tap. They appear as a set of bright beads rather than a single stream.

    I don't think that's an accurate enough description of how electrons are involved in 'electricity'. It is more likely to mislead than help anyone. At any time, there are the same number of electrons in a lamp.
     
  9. Dec 1, 2017 #8

    berkeman

    User Avatar

    Staff: Mentor

  10. Dec 1, 2017 #9

    tech99

    User Avatar
    Gold Member

    As a matter of interest, a flashlamp bulb can respond to audio frequencies, and can be used for optical communication.
     
  11. Dec 1, 2017 #10

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    Yep. Long before LEDs and 'optical communications'. Not a very linear system, but it definitely works.
     
  12. Dec 1, 2017 #11

    rcgldr

    User Avatar
    Homework Helper

    As posted above, in the case of incandescent bulbs, the filaments have a relatively slow response time, sort of the visual equivalent of reverb, also described as persistence: the filament continues to glow for a while even after the current is shut off. The end result is that the intensity doesn't vary much and isn't that noticeable. The worst-case flicker I've seen is in half-wave LED Christmas lights, which just use a diode to allow only half of each AC cycle to power the LEDs. Full-wave LED lights use a rectifier circuit to power the LEDs on both halves of an AC cycle. Some LED circuits may include capacitors to level out the voltage/current and reduce flicker.
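    A rough way to see why half-wave strings flicker so much worse is to count the fraction of each mains cycle the LED spends below its forward-voltage threshold. This sketch assumes a bare diode/rectifier with no smoothing capacitor and illustrative voltages:

```python
import math

def dark_fraction(v_peak, v_forward, full_wave):
    """Fraction of one mains cycle an AC-driven LED string is dark.

    The LED conducts only while the drive voltage exceeds its forward
    threshold; a half-wave string gets no drive at all during the
    negative half-cycle. No smoothing capacitor is modelled.
    """
    n = 100_000
    dark = 0
    for i in range(n):
        v = v_peak * math.sin(2 * math.pi * i / n)
        drive = abs(v) if full_wave else max(v, 0.0)
        if drive < v_forward:
            dark += 1
    return dark / n

# Half-wave strings are dark for over half of every cycle; full-wave
# strings only go dark briefly around each zero crossing.
```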

    CRT monitor flicker is affected by the refresh rate and the persistence of the phosphors. There's a trade-off between flicker and smearing of images, set by the persistence of the phosphors used in a CRT monitor. CRT TVs have a slower effective refresh rate than CRT computer monitors, so CRT TVs use somewhat longer-persistence phosphors. Most CRT computer monitors will show some flicker at 60 Hz, and 75 Hz or 85 Hz is needed to virtually eliminate it, since that is what the phosphor persistence is set for. The main exception is the old IBM 3270 series of monochrome (green) CRT monitors. The 3270 monitors were "block-oriented" terminals, typically displaying a fixed text screen with fields to be filled in by the operator. The persistence of the phosphors on these monitors was about 1.5 seconds: moving the cursor at 10 characters per second left a trail of about 15 cursor images of diminishing brightness, and if the screen was changed to a new set of fields, it took about 1.5 seconds for the prior image to fade away. In addition to eliminating flicker, the 3270 monochrome monitors used a very thin and sharp font.
     
  13. Dec 1, 2017 #12
    There is a measurable 120 Hz flicker in US incandescent lighting, a stronger flicker at the same rate in fluorescent illumination, and a maximal on-off flicker in AC-driven LED light. However, the cyclic variation of brightness is too fast for most people to perceive without a device to count the on/off cycles.

    Back in the days of music pressed into vinyl disks, stroboscopic calibration disks (with a central hole like a record) were often used to check a turntable's rate of rotation. Under AC illumination, a series of radial lines on the turning disk would appear to stand still when the phonograph turntable was accurately rotating 33 1/3 times per minute. The standing-still effect was visible under incandescent lighting, but could be seen more clearly under fluorescent light.
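    The line count on such a strobe disc follows directly from the flash rate: the pattern appears frozen when exactly one line passes per flash. A quick sketch (the function name is mine):

```python
def strobe_lines(rpm, line_freq_hz):
    """Radial lines needed for a strobe disc to appear stationary.

    AC lighting flashes at twice the line frequency; the pattern
    freezes when exactly one line advances per flash, so
    lines = flashes_per_minute / rpm.
    """
    flashes_per_minute = 2 * line_freq_hz * 60
    return flashes_per_minute / rpm

# 33 1/3 rpm under 60 Hz mains (120 flashes/s) calls for 216 lines;
# under 50 Hz mains it would be 180 lines.
```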
     
  14. Dec 2, 2017 #13

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    My (pretty high-class) Garrard (401?) record-playing deck had a strobe pattern round the outside of the turntable and a magnetic brake to vary the speed. No crystal-controlled drives in the home in those days.
     
  15. Dec 2, 2017 #14

    anorlunda

    Staff: Mentor

    Thanks! That is an excellent illustration of the topic.
     
  16. Dec 2, 2017 #15
  17. Dec 2, 2017 #16
    Indeed! Our physics class (1969/70 ?) did a 'scrap-heap challenge' demo of the system. The receiver's sensor was an OC71 transistor (with the black paint scraped off). We never got it to work over any significant distance in daylight but after dark it worked fine provided you had good optical alignment and tweaked the OC71's bias to cope with what residual background light remained. The best we managed was about 150 metres.
     
  18. Dec 2, 2017 #17

    OmCheeto

    User Avatar
    Gold Member
    2016 Award

    Currently doing a test, as I think I have one of these sets of lights:

    [attached image: 2017.12.02.pf.science.png]

    @dlgoff , did I do this right?
     
  19. Dec 2, 2017 #18

    dlgoff

    User Avatar
    Science Advisor
    Gold Member

    Looks good to me.
     
  20. Dec 2, 2017 #19

    rcgldr

    User Avatar
    Homework Helper

    Is the display showing zero volts at the top (and, I assume, negative voltage "spikes")? I would expect a bit over half of the time to be spent at zero volts with half-wave LED lights. Each LED has a small circuit, and there may be some type of capacitor to help a bit. I have both half-wave and full-wave LED lights, and the difference is very visible.
     
  21. Dec 2, 2017 #20
    I believe 60 Hz was chosen by Nikola Tesla to reduce the cost of transformers for high-tension transmission lines: 50 Hz transformers would be larger and more expensive, while going higher than 60 Hz reduces transformer efficiency. I think Europe chose 50 Hz to avoid Tesla's patents.

    As far as flicker is concerned, 60 Hz is better than 50. Flicker is smoothed by our optical rods and cones, which have a response-time curve covering about 50 milliseconds; however, since this is a ramp-and-decay curve, you can still detect flicker at 20 times a second or greater. Some people are quite sensitive to fluorescent lights that flicker 120 times a second. They don't actually see the flicker, but their eyes tire quickly because the iris is attempting to respond and getting mixed signals. When you are young, your iris responds quite quickly to protect you from bright lights, but with age this response slows.
     