Comment by nicoburns
10 hours ago
It's usually the Color Rendering Index (a measure of how faithfully the light's spectrum renders colors). Incandescent bulbs more or less mimic that of the Sun; they are "black body radiators". Cheap LEDs tend to be missing a lot of the red spectrum.
However, you can get LEDs that do this well. Look for one with a CRI of 95 or higher (it's an index out of 100, not a percentage).
The CRI is an imperfect metric; I watch for both CRI and R9 (the red-rendering sub-score), and both should be high.
There's a massive difference between the 2700K of regular incandescent bulbs and the 6000K of sunlight. That's why Hollywood used HMIs until they migrated to LED.
There is, but most humans are used to the spectral pattern of black body radiators at all color temperatures, whether that's sunlight at higher temperatures or fire/candlelight at lower ones.
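The point about black body radiators at different color temperatures can be illustrated with Planck's law. A minimal sketch (the wavelengths and temperatures chosen here are just illustrative, not from the thread): compare the relative power at a red wavelength versus a blue one for a ~2700K incandescent and ~6000K daylight. The red-heavy spectrum of the lower temperature falls out directly.

```python
import math

# Physical constants (SI units)
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m: float, temp_k: float) -> float:
    """Spectral radiance of a black body (Planck's law), W*sr^-1*m^-3."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * K * temp_k)) - 1.0
    return a / b

def red_blue_ratio(temp_k: float) -> float:
    """Ratio of radiance at a red (650 nm) vs blue (450 nm) wavelength."""
    return planck(650e-9, temp_k) / planck(450e-9, temp_k)

# A ~2700K black body (incandescent-like) emits far more red relative
# to blue than a ~6000K one (daylight-like).
print(f"2700K red/blue ratio: {red_blue_ratio(2700):.2f}")
print(f"6000K red/blue ratio: {red_blue_ratio(6000):.2f}")
```

At 2700K the red/blue ratio is well above 1, while at 6000K it drops below 1, which is why a low color temperature reads as "warm" and why an LED that simply omits red looks wrong next to either.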
Regular exposure to fire/candlelight has only existed for a hundred thousand years or so, while mitochondria have existed almost unchanged for far longer.
So even that assumption would require further study.
For professional applications there are sulfur plasma lamps which have a continuous spectrum at high efficiency. Unfortunately they aren't economical below about 1000 watts which is impractical for many applications.
The technology basically works by continuously microwaving (think oven) a small amount of sulfur gas. The development of solid-state microwave emitters — most microwave generation is still done with vacuum tubes — might help miniaturize the devices. However, it's hard to beat the simplicity of an LED.
Wow, I had no idea these things existed. Fascinating.