jgeverin172
Hey Guys (my first post!),
I've been trying to dig up this information for quite some time now, and I figure physicsforums may be the place to ask.
I keep hearing about some research concluding that people perceive an 80-85% dim level (or some other fraction) as the same as a 100% level; that is, most people couldn't see the difference in dim level on some load, even though the presented dim levels actually differed. Please ask me to rephrase if that's not clear enough.
Obviously many factors would be involved in such a study. What kind of loads was the experiment performed on? (LEDs, incandescents, CFLs, etc.)
Was the dimming done with a potentiometer dimmer or PWM? (If PWM, then 80-85% dimming would imply an 80-85% duty cycle!)
There may be other factors too, but anyway... (a rough sketch of the perception side is below).
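For what it's worth, here's a minimal Python sketch of one plausible explanation for why a small output reduction might go unnoticed. It assumes the common lighting rule of thumb (often cited from IES handbooks) that perceived brightness scales roughly with the square root of measured light output; whether the study I'm asking about used this rule is purely my assumption.

```python
# Sketch: why 80-85% measured output can look close to 100%.
# Assumption: the square-root rule of thumb, perceived ~ sqrt(measured).
# At a high enough PWM frequency, average light output scales roughly
# with duty cycle, so "output" below can stand in for duty cycle too.

import math

def perceived_brightness(output_fraction: float) -> float:
    """Perceived brightness (0-1) for a measured output fraction (0-1),
    per the square-root rule of thumb."""
    return math.sqrt(output_fraction)

for output in (1.00, 0.85, 0.80, 0.50):
    print(f"measured {output:.0%} -> perceived ~{perceived_brightness(output):.0%}")

# measured 100% -> perceived ~100%
# measured 85%  -> perceived ~92%
# measured 80%  -> perceived ~89%
# measured 50%  -> perceived ~71%
```

Under that rule, cutting output to 80% only drops perceived brightness to about 89%, which would be consistent with most people not noticing.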
What I'm trying to figure out is WHO came up with this study. Also, if anyone knows of the experiment, I'd love to see a link to it! Has anyone heard of this idea before? The idea, again: people can't visually tell the difference in dimming between 100% and 80-85% (or some other dimming level). I've heard this idea floating around for a while, but I could never figure out WHO concluded it, or WHERE it came from.
My initial suspects were Lutron and EnergyStar. Lutron touted a product that dims normally fully-on loads to 70-75%, claiming that people don't notice. EnergyStar had a big article somewhere on the internet about potential savings in lighting.
If anyone knows what I'm talking about, and knows where this idea stemmed from, please help.
Thank you for your time!