I’ve always been mildly bothered by the LED lighting in my home, as if it’s simultaneously bright but not illuminating. In simple consumer terms, if I wanted to shop for a variant that more closely replicated incandescent lighting, what exactly am I looking for on the packaging? Or does this not exist?
It's called SSI, the spectral similarity index. SSI is specified for a color temperature, e.g. 3200 K or 5600 K; 100 is identical to tungsten or sunlight, and values above 85 are good.
In the UK I've not been able to find high wattage (10-20W) LED lightbulbs with high CRI, some don't even mention it in listings, let alone SSI, which I have never seen.
Where are you seeing these? Is this industrial/commercial suppliers?
The push toward LED seems to be primarily for emissions-target reasons. It is very hard to buy incandescent bulbs in the UK, even for those of us who accept the cost implications. Also, many less expensive LEDs flicker at twice the mains frequency (i.e. 100 or 120 Hz). This is very annoying, and it comes from the near-instantaneous response of an LED versus the thermal averaging of an actual glowing hot filament driven by alternating current. It is interesting to read about the development of blue and white LED technology.
In the EU this was indeed done for energy efficiency/emissions. Incandescent bulbs were gradually banned from normal sale, starting with the most energy-hungry (diffused 100 W) and gradually expanding until only low-wattage and special-purpose bulbs were left. Special-purpose bulbs cover a large variety, for everything where switching didn't make sense, like machine shops or historic buildings. LEDs aren't mandated per se, but they are the most attractive alternative. And because this all happened before Brexit, the UK has the same rules, unless it revised any of them post-Brexit.
For the most part this was a very positive step. Prices for LED bulbs plunged when they went from the "premium" energy-efficient alternative to the default option. But you also get a lot of crap on the market, and stuffing LEDs into form factors designed for incandescent bulbs makes good electrical and thermal design challenging, even for those brands that actually try.
> LEDs aren't mandated per se, but they are the most attractive alternative.
Yeah, basically what the EU did was to say: for every X watts of electricity, at least Y lumens of light must be produced. And this minimum was gradually raised. Since old-school light bulbs are quite inefficient at producing light, they were slowly phased out.
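As a rough illustration of that kind of rule (the threshold and bulb figures below are illustrative ballpark numbers, not the actual EU values), it is just a minimum lumens-per-watt check:

```python
def passes_efficacy_rule(power_w, output_lm, min_lm_per_w):
    """Check a bulb against a hypothetical minimum-efficacy rule."""
    return output_lm / power_w >= min_lm_per_w

# Ballpark figures: a 60 W incandescent gives ~800 lm (~13 lm/W),
# while a 9 W LED gives ~800 lm (~90 lm/W).
print(passes_efficacy_rule(60, 800, 20))   # incandescent fails a 20 lm/W bar
print(passes_efficacy_rule(9, 800, 20))    # LED clears it easily
```

Raising `min_lm_per_w` over time is all it takes to squeeze incandescents off the market without ever naming a technology.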
> The push toward LED seems to be primarily for emission target related reasons
Is this true? I’ve got LEDs in my house because they cost vastly less to run, and because I rarely have to replace the bulbs.
Some cheap LEDs do flicker (at 50 or 60 Hz). But that’s fairly easily solved. I don’t think I’ve noticed the flicker since some cheap bulbs I bought in 2014 or so.
Well… (Sorry, let me put my tinfoil hat on.) Yeah, that "noticed" part is what is worrisome to me. I do worry that there is some effect on our brains even though we might not consciously perceive the flicker.
As an analogy, I got into those supposedly audiophile "Class D" (or "Class T") amplifiers over a decade ago. Every day I turned on the music in my office and coded with the T-amp playing. I would have told you at the time that, indeed, it sounded amazing.
Some time later I built a tube amplifier (The Darling [2], in case anyone cares—I've since built perhaps a dozen more).
When I brought it into the office and swapped it out for the T-amp, the change was subtle but immediately noticeable. I hate to fall back on audiophile terminology, but it's the best I have for the experience: I was suddenly aware of "listening fatigue" that had been a component of the T-amp. I hadn't even known it had been fatiguing until I heard the tube amp in its place for days on end.
With the loss of color fidelity and the flickering issue, I'm embarrassed to say that incandescent is starting to look good to me again.
I might, as an experiment, replace only those lights that we turn on in the evening when we are relaxing, reading.

[1] https://en.wikipedia.org/wiki/Class-T_amplifier

[2] https://www.diyaudio.com/community/threads/darling-1626-amp.... and https://imgur.com/gallery/oh-darling-tube-amplifier-Lq2Sx
It costs less to run because less energy is used; I'm pretty sure incandescent bulbs aren't emitting anything by themselves! "The push" is from the government; perhaps consumer demand is "the pull".
It does seem an easy win for governments.
I buy the ones that are rated for dimmer switches (even though I don't have dimmers), because there is discernible flicker with most other LED bulbs if you, e.g., wave your arm through the air or make a saccade. There is a certification (I think) for LED bulbs that are closer to sunlight in their emission spectrum.
>Is this true? I’ve got LEDs in my house because they cost vastly less to run, and because I rarely have to replace the bulbs.
At least in the EU it's true. Citing Wikipedia (https://en.wikipedia.org/wiki/Ecodesign_Directive): "The 2005 Ecodesign directive covered energy-using products (EuP), which use, generate, transfer or measure energy, including consumer goods such as boilers, water heaters, computers, televisions, and industrial products such as transformers. The implementing measures focus on those products which have a high potential for reducing greenhouse gas emissions at low cost, through reduced energy demand."
If I were able to see the flicker of mains supplied LED lighting (which I cannot), then I would be very tempted to install low-voltage DC LED lighting, which presumably does not flicker.
It only doesn't flicker if there's no power-driving circuitry, e.g. just LEDs and a resistor.
Otherwise, if there is a power IC present, there is flicker, though usually fast enough that most humans don't perceive it (you can still check by waving your hand in front of the light and looking for the strobed afterimage).
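The driver's smoothing is what sets the depth of that flicker. Here's a toy peak-hold model (the RC time constants are illustrative, not taken from any real bulb): a bare rectified LED sits near 100% flicker depth at twice the mains frequency, and even modest smoothing cuts it dramatically.

```python
import math

def flicker_percent(mains_hz=50.0, rc_s=0.0, steps=20000):
    """Percent flicker, (max-min)/(max+min), of a full-wave-rectified
    sine smoothed by a peak-hold capacitor with time constant rc_s."""
    sim_t = 2.0 / mains_hz            # simulate two mains cycles
    dt = sim_t / steps
    decay = math.exp(-dt / rc_s) if rc_s > 0 else 0.0
    v_cap, samples = 0.0, []
    for i in range(steps):
        t = i * dt
        v_rect = abs(math.sin(2 * math.pi * mains_hz * t))
        v_cap = max(v_rect, v_cap * decay)   # cap charges to peaks, then decays
        if t > 1.0 / mains_hz:               # skip the first cycle (settling)
            samples.append(v_cap)
    vmax, vmin = max(samples), min(samples)
    return 100.0 * (vmax - vmin) / (vmax + vmin)

print(f"no smoothing: {flicker_percent(50, 0.0):.0f}% flicker at 100 Hz")
print(f"RC = 2 ms:    {flicker_percent(50, 0.002):.0f}%")
print(f"RC = 50 ms:   {flicker_percent(50, 0.05):.0f}%")
```

This is only a sketch; real drivers range from simple capacitive droppers (deep ripple) to constant-current converters switching far above anything the eye or a waving hand can resolve.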
There is a 15-30% difference between the groups at baseline (fig 8c-9c, 8d-9d), about the same magnitude as the claimed effect of the experimental condition.
I think the result would be much stronger if these baselines were comparable, or if they showed they had accounted for other variables like time of day and light history. I am also skeptical of any effect in the retina lasting 6 weeks with no fading.
Consider that people are often exposed to much more infrared light outdoors, so "worked under a relatively dim incandescent lamp" is not a particularly novel stimulus. Imagine that any of these people spent time outdoors during the six weeks - thousands of times more infrared light there.
Very interesting. I've always thought that there was something a bit "off" about LED torches and car headlamps; the brightness is there, but something about the light just doesn't seem to illuminate as well as an old dim incandescent or even fluorescent tube.
It's usually the Color Rendering Index, which reflects the spectrum of frequencies the light puts out. Incandescent bulbs more or less mimic the Sun; they are "black-body radiators". Cheap LEDs tend to be missing a lot of the red end of the spectrum.
However, you can get LEDs that do this well. Look for one with a CRI of 95 or higher.
There's a massive difference between the 2600 K of regular incandescent bulbs and the 6000 K of sunlight. That's why Hollywood used HMIs until they migrated to LED.
They're saying that the visual performance is indirectly affected by invisible wavelengths somehow. Not that you can see the difference between two types.
They are saying that, and most real-world LED lighting uses very cheap diodes, like 99.9999% of it, which produce very poor colour compared with incandescent bulbs, which give essentially perfect colour rendition.
It's a big thing, and you can buy LEDs which produce a better colour range, but they're much more expensive and not as energy-efficient, because producing bold reds costs real energy, and no diode trick will ever get around that.
I get that they're more efficient in some sense, but man, the LED streetlights and other big lamps are so irritating and make things look like such ass compared to mercury-vapor or even sodium lights.
True. Yet somehow more and more cities install them blindly because of efficiency. I remember when I moved to Odense, Denmark in 2013: they had LED street lights all over the place. I thought, this is the future compared to my underdeveloped post-Soviet Latvia. And yet I remember, when I moved back, streets at night looked so yellow because the city still relied on sodium lights, and my eyes felt much more comfortable. At the time I wrote it off to nostalgia or something, and here we are.
Just to point out to anybody who comes here directly: the article has no relation at all to perceived illumination, color fidelity, or anything else people complain about with LEDs.
It's an interesting niche topic that you may want your workplace to take notice of if you work indoors.
Was just discussing last week with a colleague how, for the same 'lumen' rating, there was such a dramatic difference between LED and incandescent bulbs for ease of reading paper books.
You can't buy heat lamps? They are even more infrared and last longer.
Also, LED lighting can have infrared, have a significantly smoother spectrum curve, and still last 20k+ hours without burnout. The cheap bulb spectra that they show are a blue LED + phosphor coating, but there are infrared LEDs, UV LEDs, and more. You can make quite a convincing sun simulation, even better than any incandescent bulb, but there is almost no demand for UV + infrared super-full-spectrum lighting, unfortunately. Only movie & theater lights come close.
>LED lighting can have infrared, a significantly smoother spectrum curve, and still last 20k+ hours without burnout
Do you have a link to a bulb you can purchase that meets all these criteria? The only one I'm aware of was the obscure "StarLike", which was never actually sold in bulk (https://budgetlightforum.com/t/sunlike-vs-starlike/64155/7). LEDs can be made good in theory, sure, but in practice they are all terrible in light quality compared to a standard incandescent.
It should be noted that even if we assume that the conclusion of this study is correct, i.e. that artificial lighting should have a wide spectrum including near-infrared light, that does not mean that returning to classic incandescent lamps is the right solution for this problem.
Incandescent lamps with tungsten filaments run at a much lower temperature than the Sun, so a much larger share of their energy is radiated in the infrared than is needed.
There was about a year or two ago a discussion about a very interesting research paper that reported results from testing an improved kind of incandescent lamp, with energy efficiency and lifetime comparable to the LED lamps.
The high energy efficiency was achieved by enclosing the lamp in a reflecting surface, which prevented energy loss by radiation, except for a window that let light out, which was frequency-selective, so only visible light got out, while infrared stayed inside. The lamp used a carbon filament in an environment that prevented the evaporation of the filament.
With such a lamp, one can make a tradeoff between energy efficiency and the content of healthy near infrared light, by a judicious choice of the frequency cutoff for the window through which light exits the lamp.
Even with enough near-infrared light, the efficiency should be a few times higher than for classic incandescent lamps, though not as good as for LED lamps. Presumably, one could reach an efficiency similar to that of the compact fluorescent lamps (which was about half of that of LED lamps), for such an incandescent lamp that also provides near-infrared light.
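The point that a filament radiates mostly infrared is easy to sanity-check with Planck's law: integrate the black-body spectrum and compare the fraction landing in the visible band for a ~2700 K filament versus a ~5800 K Sun-like source. This is a sketch; the 380-780 nm band edges are my assumption for "visible":

```python
import math

# Physical constants (SI)
H  = 6.62607015e-34   # Planck constant (J s)
C  = 2.99792458e8     # speed of light (m/s)
KB = 1.380649e-23     # Boltzmann constant (J/K)

def planck(lam_m, t_k):
    """Black-body spectral radiance at wavelength lam_m (metres)."""
    return (2 * H * C**2 / lam_m**5) / math.expm1(H * C / (lam_m * KB * t_k))

def band_fraction(t_k, lo_nm=380.0, hi_nm=780.0, steps=20000):
    """Fraction of total radiated power falling between lo_nm and hi_nm."""
    def integral(a_nm, b_nm):
        width_m = (b_nm - a_nm) * 1e-9 / steps  # midpoint Riemann sum
        return sum(planck((a_nm + (i + 0.5) * (b_nm - a_nm) / steps) * 1e-9, t_k)
                   for i in range(steps)) * width_m
    return integral(lo_nm, hi_nm) / integral(10.0, 100000.0)  # 10 nm .. 100 um

print(f"visible fraction at 2700 K (filament): {band_fraction(2700):.1%}")
print(f"visible fraction at 5800 K (Sun-like): {band_fraction(5800):.1%}")
```

The filament lands in the high single digits of percent visible, with nearly everything else emitted as infrared, which is exactly the loss the reflective-envelope design tries to recycle.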
How does enclosing the lamp in reflective material help with the energy efficiency? Isn't the infrared radiation emitted anyway? Doesn't that make the lamp overheat?
If the reflective material is ideal, by definition no infrared or other radiation is emitted.
Perhaps I was not clear, but the reflective surface was the interior surface, so it reflected any light, visible or infrared, back towards the emitting filament, while the front window reflected only the infrared, while transmitting the visible light.
The lamp does not overheat, because the filament is kept at a constant temperature, the same as in a classic incandescent lamp. The difference is that you need a much lower electrical current through it for maintaining the temperature, because most of the heat is not lost away, like in a classic lamp. The fact that you need a much smaller electrical current for the same temperature is the source of the greater energy efficiency.
Only if you had used the same electrical current as in a classic lamp, the lamp would have overheated and the filament destroyed, but you have no reason to do that, like you also do not want to use in a classic lamp a current higher than nominal, which would overheat and destroy it.
This should also be true for TL (fluorescent tube) lights, which kind of contradicts common sense, seeing that those are used all over the place in offices, kitchens, and hospitals; it makes me think this paper is bogus.
I've been using incandescent more often. All my vanity lights are 40w appliance bulbs now. The difference at night is remarkable. The LED is just too much even at 2700k. I still prefer LED for high power situations like br30/40 can lights.
Yes, there is something obviously wrong with most LED lights, but it isn't too much of short wavelength light, but on the contrary. It's the near absence of cyan light in most LEDs. Our eyes are by far the most sensitive to it, the majority of receptors in the eye are sensitive to it, and we may focus primarily on it (focus differs for different wavelengths). This is how you get the feeling of something being wrong with your vision as you for example walk into a mall, and so on.
If anything, higher temperature lights seem to make it better, not worse, but the problem will persist as long as the cyan hole stays there.
Sensitivity peak for humans is in cyan (~510nm) only for low-light conditions (night vision / rod cells). In daylight (cone cells) it's green-yellow (555nm).
https://www.giangrandi.ch/optics/eye/eye.shtml
There are some full spectrum led lights, they just cost over $100 a piece. And they might get banned in the future for not being energy efficient enough.
No mention of CRI which seems kind of odd. LEDs for lighting are increasingly graded by how natural their emission spectrum is. Older lights are quite bad, newer ones sacrifice a tiny bit of performance for more uniform spectrum.
They use Rf numbers, which come from a newer standard (TM-30), so that's probably good.
However, the experimental group (extra light sources) got Rf 91 bulbs, and the control ("LED lighting") got Rf 85 bulbs.
The two scales are not exactly comparable, but they both max out at 100. The only source I could find that discusses both says that CRI > 90 is "excellent" and just below that is "very good". It says Rf > 85 is "very good", which tells me it's comparable to a mid-80s CRI bulb.
If I accidentally buy a mid-80 CRI bulb, I either return it to the store, or just throw it away.
So, I'd say this study's experimental setup doesn't support any useful conclusions. They showed that so-painfully-bad-California-won't-subsidize-them LEDs are worse than passable LEDs with supplementation from another light source.
The passable LEDs in the study are probably comparable to the cheap ones at our local hardware store, but worse than the ones that cost $10-20 on amazon ten years ago.
This would have been much more interesting if they'd compared high-end LEDs with and without supplementation, and found a difference. (And by "high-end", I mean "still much cheaper than the electricity they save".)
CRI is a pretty bad rating system. They do show the full spectrum graphs, which is what you'd want anyway. The Spectral Similarity Index (SSI) is the better number.
Sure, but I don't see them mention what they're actually using for LEDs at all. They mention a "colour fidelity index" but I'd expect a manufacturer part number or something so I can pull the datasheet.
Funny enough, the best evidence for this study is that they should probably move somewhere with more sunlight if they can't spell "color" right... /s
I think CRI is not important here, as that's a measure in the visible spectrum. The paper talks about all the missing wavelengths outside of the visible spectrum.
I hate the LED street lamps so much. I can tell they've got a really spiky and unnatural spectrum, unlike the HPS lights, not to mention that they're white or bright yellow...
I found an interesting tidbit about this bigger issue, and I want to share an easier way to check it.
People often report that they can clearly see lower-quality LED lights flicker, that it's really distracting to them, and that it even causes them headaches.
Now, I never saw this until recently (except in failing lights), and only in the right conditions: the light has to be very, very dim. For instance, with only one light on at night, standing in a room far away from the light so that it's extremely dim, I could finally really see it flicker.
I've replaced that light with a better one and the effect went away.
I have incandescent light bulbs at home I have to pretty much smuggle from China. It's amazing how we're replaying the asbestos playbook a century later. Only this time it's government mandated.
Asbestos was pushed as a magical solution to problems of fire in homes without paying attention to the health effects. It took 80 years for the obvious to become law.
Leds are pushed as a solution to energy consumption by humans without paying any attention to the health effects. Hopefully it will be less than 80 years of cancers and metabolic disruption before the obvious is done.
But this time the regulation was captured pre-emptively, to the point that following the best scientific advice for your health is illegal in most of the developed world.
There's a mostly-unsubstantiated-by-data belief that LED lighting can cause health problems by some combination of flickering and narrow color spectrum.
I’m guessing the Russian theory that asbestos is totally fine and isn’t harmful? The Russians still use asbestos and say it’s a plot by the west that we got rid of asbestos in our buildings. (Don’t shoot the messenger here, I have no dog in this fight and am not expressing an opinion)
Why is it that right now there is still, on the frontpage, an article about a paper being found flawed after 6k citations (https://statmodeling.stat.columbia.edu/2026/01/22/aking/), yet this random article coming out of nowhere makes the front page on the same day?
People really should know this and stop sharing newly published papers with the general public. The value of a single academic paper is exactly 0. Even a handful of such articles still has 0 value to the general public. This is only of interest to other academics (or labs, countries, etc.) who may have the power to reproduce it in a controlled environment.
Be very skeptical of correlations like this that have dubious or poorly understood causation. Be even more skeptical if they are about day-to-day stuff that would likely have large swaths of people able to reproduce something like it on huge scales yet they haven't. Extraordinary claims require extraordinary evidence.
This article is not making an extraordinary claim, and your outrage is hyperbolic. Analysis of research should not be restricted to academia, but should be careful not to cherry-pick research.
Considering the percentage of live mitochondria that are exposed to external light in a human, this seems like an enormous effect. The effect we'd expect from publication bias, though, is already pretty big. I'm going to go with the latter until we've got some replication and a plausible mechanism (like: why wouldn't whales be badly sick if this were a thing?).
Scientific Reports is a junk journal fyi. Not conclusive, but indicative.
Despite saying the visible flux component is "small" and that the tungsten lamps "were not expected to [be used] as task lamps," Figure 6 (a) and (c) show... desk lamps right at the work stations, used like task lamps! Not only is this experimentally unblinded, but the visible light immediately in front of the test subjects is noticeably brighter and warmer. The effect could simply be due to reduced eye strain.
What would James Randi do? "Extraordinary claims require extraordinary proof," and unfortunately this isn't it.
This would be more interesting if they added a visible-light filter on the lamps, so they only emit infrared radiation, and had an identical double-blind control with a 60-watt heater bulb that emits no SWIR but the same radiant heat (which could otherwise confound and/or unblind).
> In humans a single 3 min 670 nm exposure improves colour vision within 3 h, which is sustained for almost a week
That seems remarkable and almost too good to be true?
> what exactly am I looking for on the packaging?
What's available depends on the form factor, but some manufacturers now offer a choice in the 2700 K, 90+ CRI space.
The Ra value is often printed on the package; go for 95+. It's a bit hard to find, but the difference is real. I don't buy under 90.
> I think the result would be much stronger if these baselines were comparable
Indeed: these study results are only substantial if they can be independently reproduced by more studies at bigger scales.
> Look for one with a "CRI" of 95 or higher.
CRI is an imperfect metric; I watch for both CRI and R9, and both should be high.
> the LED streetlights and other big lamps are so irritating
Even a colour filter would help with the harshness.
> such a dramatic difference between LED and incandescent bulbs for ease of reading paper books
Which one of the two was better for it?
Incandescent, by miles. Not even in the same ballpark. Even candlelight beats LED.
Someone please tell the Australian government, now that we've essentially banned other forms of lighting (except fluorescent).
You can buy full-spectrum LED lights (99 CRI, or grow lamps).
The article uses LED as a synonym for typical LED lighting.
How is "full spectrum" defined in this case? The visible spectrum is not the subject of the paper, as they care about infrared.
Call me when there are lights with a CRI R9 of 99.
You can't buy heat lamps? They are even more infrared and last longer.
Also LED lighting can have infrared, have a significantly more smoother spectrum curve and still last +20k hours without burnout. The cheaper bulb spectra that they show is a blue led + phosphor coating, but there are infrared LEDs, UV leds, and more. You can make quite the convincing sun simulation, even better than any incandescent bulb, but there is almost no demand for UV + Infrared super full spectrum lighting unfortunately. Only movie & theater lights come close.
>LED lighting can have infrared, have a significantly more smoother spectrum curve and still last +20k hours without burnout
Do you have a link to a bulb that you can purchase meeting all these criteria? The only one I'm aware of was this obscure "StarLike" that was never actually sold in bulk. LEDs can be made good in theory sure, but in practice they are all terrible in light quality compared to a standard incandescent.
https://budgetlightforum.com/t/sunlike-vs-starlike/64155/7
Typical electricity rates in Australia are up to 40c/kWh or so.
Do you really think $5 AUD per month per bulb that you’re running 8 hours a day is worth it for better spectrum quality?
Are we also going to ban powerful computers since they use lots of power?
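For a rough sense of the numbers, here is a sketch of the running-cost comparison, assuming a 60 W incandescent vs. an 8 W LED of similar brightness, run 8 hours a day at the quoted 40c/kWh (all figures are illustrative assumptions, not from the thread):

```python
# Rough monthly running cost per bulb at assumed Australian rates.
RATE_AUD_PER_KWH = 0.40
HOURS_PER_DAY = 8
DAYS_PER_MONTH = 30

def monthly_cost(watts: float) -> float:
    """Monthly energy cost in AUD for a bulb of the given wattage."""
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
    return kwh * RATE_AUD_PER_KWH

incandescent = monthly_cost(60)  # assumed 60 W incandescent
led = monthly_cost(8)            # assumed 8 W LED equivalent
print(f"incandescent: {incandescent:.2f} AUD/month")
print(f"LED:          {led:.2f} AUD/month")
print(f"difference:   {incandescent - led:.2f} AUD/month")
```

Under these assumptions the gap is roughly $5 AUD per month per bulb, matching the figure above.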
It should be noted that even if we assume that the conclusion of this study is correct, i.e. that artificial lighting should have a wide spectrum including near-infrared light, that does not mean that returning to classic incandescent lamps is the right solution for this problem.
The incandescent lamps with tungsten filaments have a much lower temperature than the Sun, thus much more energy is radiated in infrared than needed.
About a year or two ago there was a discussion of a very interesting research paper that reported results from testing an improved kind of incandescent lamp, with energy efficiency and lifetime comparable to LED lamps.
The high energy efficiency was achieved by enclosing the lamp in a reflecting surface, which prevented energy loss by radiation, except for a window that let light out, which was frequency-selective, so only visible light got out, while infrared stayed inside. The lamp used a carbon filament in an environment that prevented the evaporation of the filament.
With such a lamp, one can make a tradeoff between energy efficiency and the content of healthy near infrared light, by a judicious choice of the frequency cutoff for the window through which light exits the lamp.
Even with enough near-infrared light, the efficiency should be a few times higher than for classic incandescent lamps, though not as good as for LED lamps. Presumably, one could reach an efficiency similar to that of the compact fluorescent lamps (which was about half of that of LED lamps), for such an incandescent lamp that also provides near-infrared light.
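The filament-temperature point above can be illustrated with a quick numerical integration of Planck's law. This is a sketch: the temperatures, band edges, and integration bounds are assumptions chosen for illustration.

```python
import math

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(wl_m: float, T: float) -> float:
    """Blackbody spectral radiance at wavelength wl_m (metres), temperature T (K)."""
    return (2 * H * C**2 / wl_m**5) / (math.exp(H * C / (wl_m * KB * T)) - 1)

def band_fraction(T: float, lo_nm: float, hi_nm: float, steps: int = 2000) -> float:
    """Fraction of total radiated power falling between lo_nm and hi_nm."""
    def integrate(lo, hi):
        dw = (hi - lo) / steps
        return sum(planck(lo + (i + 0.5) * dw, T) for i in range(steps)) * dw
    total = integrate(10e-9, 100e-6)  # covers essentially all radiated power
    return integrate(lo_nm * 1e-9, hi_nm * 1e-9) / total

# Assumed: ~2700 K tungsten filament vs ~5800 K solar surface, 380-780 nm visible band.
print(f"tungsten 2700 K: {band_fraction(2700, 380, 780):.1%} of power is visible")
print(f"sun      5800 K: {band_fraction(5800, 380, 780):.1%} of power is visible")
```

The tungsten filament radiates only a small single-digit-percent fraction in the visible band, with the rest mostly infrared, which is the efficiency problem the reflective enclosure attacks.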
How does enclosing the lamp in reflective material help with the energy efficiency? Isn't the infrared radiation emitted anyway? Doesn't that make the lamp overheat?
If the reflective material is ideal, by definition no infrared or other radiation is emitted.
Perhaps I was not clear: the reflective surface was the interior surface, so it reflected any light, visible or infrared, back toward the emitting filament, while the front window reflected only the infrared and transmitted the visible light.
The lamp does not overheat, because the filament is kept at a constant temperature, the same as in a classic incandescent lamp. The difference is that you need a much lower electrical current to maintain that temperature, because most of the heat is not lost, as it is in a classic lamp. Needing a much smaller current for the same temperature is the source of the greater energy efficiency.
Only if you used the same electrical current as in a classic lamp would the lamp overheat and the filament be destroyed, but you have no reason to do that, just as you would not run a classic lamp above its nominal current, which would likewise overheat and destroy it.
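A toy power balance makes the current reduction concrete. All numbers here are assumptions for illustration, not from the paper: in steady state the electrical input only has to replace the power that actually escapes the enclosure.

```python
# Toy steady-state power balance for the reflective-enclosure lamp.
# Assumption: an ideal enclosure returns all infrared to the filament,
# so input power only replaces what leaves through the visible-pass window.
visible_fraction = 0.08   # assumed share of radiated power that is visible
radiated_total = 60.0     # W radiated by the filament at operating temperature

classic_input = radiated_total                      # must replace everything
enclosed_input = radiated_total * visible_fraction  # only visible light escapes

print(f"classic lamp input:  {classic_input:.1f} W")
print(f"enclosed lamp input: {enclosed_input:.1f} W")
print(f"efficiency gain:     {classic_input / enclosed_input:.1f}x")
```

Real losses (imperfect mirrors, filament conduction, reabsorption inefficiencies) would eat into this, which is why the comment above expects efficiency between classic incandescent and LED rather than this ideal figure.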
This should also be true for TL (fluorescent tube) lights, which kind of contradicts common sense, seeing that those are used all over the place in offices, kitchens, and hospitals. It makes me think this paper is bogus.
I've been using incandescents more often. All my vanity lights are 40W appliance bulbs now. The difference at night is remarkable; the LED is just too much, even at 2700K. I still prefer LED for high-power situations like BR30/40 can lights.
I don't think that this is the reason.
Yes, there is something obviously wrong with most LED lights, but it isn't too much short-wavelength light; on the contrary, it's the near absence of cyan light in most LEDs. Our eyes are by far the most sensitive to it, the majority of receptors in the eye respond to it, and we may focus primarily on it (focus differs for different wavelengths). This is how you get the feeling of something being wrong with your vision as you, for example, walk into a mall.
If anything, higher temperature lights seem to make it better, not worse, but the problem will persist as long as the cyan hole stays there.
Sensitivity peak for humans is in cyan (~510nm) only for low-light conditions (night vision / rod cells). In daylight (cone cells) it's green-yellow (555nm). https://www.giangrandi.ch/optics/eye/eye.shtml
Also, LEDs strobe to dim, which is unpleasant.
There are some full-spectrum LED lights; they just cost over $100 apiece. And they might get banned in the future for not being energy efficient enough.
No mention of CRI, which seems kind of odd. LEDs for lighting are increasingly graded by how natural their emission spectrum is. Older lights are quite bad; newer ones sacrifice a tiny bit of performance for a more uniform spectrum.
They use Rf numbers, which is a newer standard, so that's probably good.
However, the experimental group (extra light sources) got Rf 91 bulbs, and the control ("LED lighting") got Rf 85 bulbs.
The two scales are not exactly comparable, but they both max out at 100. The only source I could find that discusses both says that > 90 CRI is "excellent" and just below that is "very good". It says > 85 Rf is "very good", which tells me it's comparable to a mid-80s CRI bulb.
If I accidentally buy a mid-80 CRI bulb, I either return it to the store, or just throw it away.
So, I'd say this study's experimental setup doesn't support any useful conclusions. They showed that so-painfully-bad-California-won't-subsidize-them LEDs are worse than passable LEDs with supplementation from another light source.
The passable LEDs in the study are probably comparable to the cheap ones at our local hardware store, but worse than the ones that cost $10-20 on Amazon ten years ago.
This would have been much more interesting if they'd compared high-end LEDs with and without supplementation, and found a difference. (And by "high-end", I mean "still much cheaper than the electricity they save".)
CRI is a pretty bad rating system. They show the full spectrum graphs, which is what you'd want anyway. The Spectral Similarity Index (SSI) is the better number.
Sure, but I don't see them mention what they're actually using for LEDs at all. They mention a "colour fidelity index", but I'd expect a manufacturer part number or something so I can pull the datasheet.
Funny enough, the best evidence for this study is that they should probably move somewhere with more sunlight if they can't spell "color" right... /s
I think CRI is not important here, as that's a measure within the visible spectrum. The paper talks about the missing wavelengths outside the visible spectrum.
Out of curiosity:
a) How do Philips Hue bulbs stack up?
b) Did Philips update them generationally and assuming they are decent now, how recently?
I hate the LED street lamps so much. I can tell they've got a really spiky, unnatural spectrum, unlike the HPS lights, not to mention that they're white or bright yellow...
In the EU, the RoHS directive forbids even more types of light bulbs besides incandescent:
Ban on all fluorescent tubes (T5 and T8 lamps) from August 24, 2023
Ban on all CFL lamps from February 24, 2023
Extension of the exemption granted to HPD lamps from 3 to 5 years
Extension of the exemption for special purpose lamps from 3 to 5 years
Ever since they replaced streetlamps with LEDs (like a decade ago?) I can't see anything anymore before dawn.
There may be more light (photons), but their spectrum is too limited for my eyes to see by, unlike halogen, etc.
I still only use compact fluorescents in my home; LED is useless to me.
I found an interesting tidbit about this bigger issue, and I want to share how to check it more easily.
We often see people reporting that they clearly see lower-quality LED lights flicker, that it's really distracting to them, and that it even causes them headaches.
Now, I didn't see this until recently (except in failing lights), and only under the right conditions: the light has to be very, very dim. For instance, with only one light on at night, standing in a room far away from the light so that it's extremely dim, I could finally really see it flicker.
I replaced that light with a better one and the effect went away.
I have incandescent light bulbs at home I have to pretty much smuggle from China. It's amazing how we're replaying the asbestos playbook a century later. Only this time it's government mandated.
Where do you purchase yours, out of curiosity? My incandescent light bulb dealer on eBay stopped selling them...
> It's amazing how we're replaying the asbestos playbook a century later
Can you elaborate?
Asbestos was pushed as a magical solution to the problem of fire in homes, without paying attention to the health effects. It took 80 years for the obvious to become law.
LEDs are pushed as a solution to human energy consumption without paying any attention to the health effects. Hopefully it will take less than 80 years of cancers and metabolic disruption before the obvious is done.
But this time the regulation was captured pre-emptively, to the point that following the best scientific advice for your health is illegal in most of the developed world.
There's a mostly-unsubstantiated-by-data belief that LED lighting can cause health problems by some combination of flickering and narrow color spectrum.
I’m guessing the Russian theory that asbestos is totally fine and isn’t harmful? The Russians still use asbestos and say it’s a plot by the west that we got rid of asbestos in our buildings. (Don’t shoot the messenger here, I have no dog in this fight and am not expressing an opinion)
Why is it that there is currently an article on the front page about a paper being found flawed after 6k citations ( https://statmodeling.stat.columbia.edu/2026/01/22/aking/ ), yet this random article coming out of nowhere makes the front page on the same day?
People really should get it and stop sharing newly published papers with the general public. The value of a single academic paper is exactly 0. Even a handful of such articles still has 0 value to the general public. They are only of interest to other academics (or labs, countries, etc.) who may have the power to reproduce the results in a controlled environment.
Be very skeptical of correlations like this that have dubious or poorly understood causation. Be even more skeptical when they are about day-to-day stuff where large swaths of people would likely be able to reproduce the effect at huge scale, yet they haven't. Extraordinary claims require extraordinary evidence.
You can also look at all the papers it's citing too...
This article is not making an extraordinary claim, and your objection is hyperbolic. Analysis of research should not be restricted to academia, but one should be careful not to cherry-pick research.
It seems like a pretty extraordinary claim to me.
Considering the percentage of live mitochondria exposed to external light in a human, this seems like an enormous effect. The effect we'd expect from publication bias, though, is already pretty big. I'm going to go with the latter until we've got some replication and a plausible mechanism (like... why wouldn't whales be badly sick if this was a thing?).
Scientific Reports is a junk journal, FYI. Not conclusive, but indicative.
Despite saying the visible flux component is "small" and that the tungsten lamps "were not expected to [be used] as task lamps," Figure 6 (a) and (c) show... desk lamps right at the workstations, like task lamps! Not only is this experiment unblinded, but the visible light immediately in front of the test subjects is noticeably brighter and warmer. The effect could simply be due to reduced eye strain.
What would James Randi do? "Extraordinary claims require extraordinary proof," and unfortunately this isn't it.
This would be more interesting if they added a visible-light filter to the lamps so they only emit infrared radiation, and had an identical double-blind control with a 60-watt heater bulb that emits no SWIR but the same radiant heat (which could otherwise confound and/or unblind).