Comment by HNisCIS
1 day ago
No mention of CRI, which seems kind of odd. LEDs for lighting are increasingly graded by how natural their emission spectrum is. Older lights are quite bad; newer ones sacrifice a tiny bit of performance for a more uniform spectrum.
They use Rf numbers, which come from a newer standard (TM-30 / CIE 2017), so that's probably good.
However, the experimental group (extra light sources) got Rf 91 bulbs, and the control ("LED lighting") got Rf 85 bulbs.
The two scales are not exactly comparable, but they both max out at 100. The only source I could find that discusses both says that CRI > 90 is "excellent" and just below that is "very good". It says Rf > 85 is "very good", which tells me it's comparable to a mid-80s CRI bulb.
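If you want to sanity-check the two scales yourself, the colour-science Python package can compute both from the same spectral power distribution, at least if I'm reading its docs right. A minimal sketch (FL2 is just a built-in fluorescent SPD standing in for a real bulb measurement, not one of the study's bulbs):

    # pip install colour-science
    import colour

    # FL2 is a built-in CIE fluorescent SPD; swap in a measured
    # spectral power distribution for an actual bulb if you have one.
    sd = colour.SDS_ILLUMINANTS["FL2"]

    cri = colour.colour_rendering_index(sd)  # classic CRI (Ra)
    rf = colour.colour_fidelity_index(sd)    # CIE 2017 colour fidelity index (Rf)

    print(f"CRI Ra: {cri:.1f}  Rf: {rf:.1f}")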
If I accidentally buy a mid-80 CRI bulb, I either return it to the store, or just throw it away.
So, I'd say this study's experimental setup doesn't support any useful conclusions. They showed that so-painfully-bad-California-won't-subsidize-them LEDs are worse than passable LEDs with supplementation from another light source.
The passable LEDs in the study are probably comparable to the cheap ones at our local hardware store, but worse than the ones that cost $10-20 on amazon ten years ago.
This would have been much more interesting if they'd compared high-end LEDs with and without supplementation, and found a difference. (And by "high-end", I mean "still much cheaper than the electricity they save".)
CRI is a pretty bad rating system. They are showing the full spectral graphs, which is what you'd want anyway. The Spectral Similarity Index (SSI) is the better number.
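If I'm remembering the colour-science API right, it ships an SSI implementation too, so you can score an SPD against whatever reference you like. A rough sketch with built-in stand-in spectra:

    import colour

    # Stand-ins: a built-in fluorescent SPD as the test source,
    # D65 daylight as the reference to score it against.
    sd_test = colour.SDS_ILLUMINANTS["FL2"]
    sd_reference = colour.SDS_ILLUMINANTS["D65"]

    ssi = colour.spectral_similarity_index(sd_test, sd_reference)
    print("SSI vs D65:", ssi)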
Sure, but I don't see them mention what they're actually using for LEDs at all. They mention a "colour fidelity index" but I'd expect a manufacturer part number or something so I can pull the datasheet.
Funny enough, the best evidence for this study is that they should probably move somewhere with more sunlight if they can't spell "color" right... /s
They are not really using an index; they are showing you actual spectral graphs. Ask an AI if you want to understand why that's the actual info you'd want.
I think CRI is not important here, as that's a measure in the visible spectrum. The paper talks about all the missing wavelengths outside of the visible spectrum.
What is the relationship between CRI and how broad (or narrow) the spectrum output by the LED is? Is CRI automatically better for broader-spectrum LEDs? Or is that too simplistic?
Slightly too simplistic, because broader-spectrum LEDs could be broad but spiky in their output, resulting in light that is broad spectrum but has a bad CRI (because it's, e.g., really blue).
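To make that concrete, here's a toy sketch (colour-science + numpy; the SPDs are synthetic, so the exact numbers are only illustrative): two sources covering the same 380-780 nm range, one smooth, one with the energy bunched into a few narrow peaks.

    import numpy as np
    import colour

    wl = np.arange(380, 781, 5)

    # Smooth, roughly flat output across the whole visible range.
    smooth = colour.SpectralDistribution(dict(zip(wl, np.ones(len(wl)))))

    # Same range, but most of the energy sits in three narrow peaks.
    peaks = (np.exp(-0.5 * ((wl - 450) / 10) ** 2)
             + np.exp(-0.5 * ((wl - 550) / 10) ** 2)
             + np.exp(-0.5 * ((wl - 610) / 10) ** 2))
    spiky = colour.SpectralDistribution(dict(zip(wl, 0.05 + peaks)))

    print("smooth CRI:", colour.colour_rendering_index(smooth))
    print("spiky CRI:", colour.colour_rendering_index(spiky))

Both cover the whole visible range, but the spiky one should score noticeably worse.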
Out of curiosity:
a) How do Philips Hue bulbs stack up?
b) Did Philips update them generationally, and assuming they're decent now, how recently?