What is efficacy?
There has been a lot of confusion in the lighting industry about efficacy and what it means. A straightforward way to express the efficacy of a product is the amount of useful light, in lumens, that it produces compared with the amount of electrical power, in watts, that it consumes in its circuit (including the light source and anything else that draws power from the circuit such as a ballast or driver).
This is called useful lumens per circuit watt – a widely used term that sounds simple, but turns out to be rather complicated, and can be used by different suppliers to make wildly differing claims.
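The arithmetic itself is trivial; the hard part is agreeing on which lumens and which watts to plug in. As a minimal sketch, using made-up figures for a hypothetical luminaire:

```python
# Hypothetical figures for illustration only: a luminaire emitting
# 1,800 useful lumens while drawing 20 W from the mains circuit
# (LED packages plus driver and any other circuit losses).
useful_lumens = 1800.0
circuit_watts = 20.0

efficacy = useful_lumens / circuit_watts  # useful lumens per circuit watt
print(f"Efficacy: {efficacy:.0f} lm/W")   # prints "Efficacy: 90 lm/W"
```

As we'll see, two suppliers can quote very different efficacies for similar products simply by choosing different numerators and denominators for this ratio.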
What is a ‘useful lumen’?
Let’s start breaking it down. Lumens, as mentioned in last month’s column, are determined by what the human eye perceives, and that depends on how the eye responds to light. Efficacy is literally in the eye of the beholder. We determine it either by measuring the spectral power distribution of the light with a spectroradiometer and converting the result into lumens, or by measuring it directly with a photometer whose detector mimics the eye’s response.
But we’re not just talking about lumens, we’re talking about useful lumens, and whether or not a lumen is useful depends on the context. Lamps and luminaires can be direct or indirect and have characteristics that are directional or non-directional. By classifying the type of directionality we can work towards a meaningful measure of the number of lumens that are ‘useful’ for a directly illuminated task.
The European Union has defined a ‘directional lamp’ as one with at least 80 per cent of its light output within a solid angle of π steradians – that is, a cone with an apex angle of 120 degrees. Any other lamp is ‘non-directional’.
For a non-directional light source, the useful lumens figure is simply the total luminous flux emitted by the source. For a directional light source it’s a bit more complex. The EU’s latest approach is to promote lamps that have good efficacy for lighting a task, and that therefore focus the maximum amount of light into a useful 90-degree cone. The system measures only the light captured within this cone and reports it as useful lumens.
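To see how much the choice of cone matters, consider an idealised Lambertian emitter – an assumption of this sketch, though it is a reasonable first approximation for a bare LED chip. For intensity I(θ) = I₀·cos θ, the fraction of total flux inside a cone of half-angle α works out to sin²α:

```python
import math

def lambertian_flux_fraction(cone_full_angle_deg: float) -> float:
    """Fraction of an ideal Lambertian emitter's total flux that falls
    within a cone of the given full (apex) angle. For intensity
    I(theta) = I0*cos(theta), this fraction is sin^2(half-angle)."""
    half_angle = math.radians(cone_full_angle_deg / 2.0)
    return math.sin(half_angle) ** 2

# EU 'directional' threshold: 80 per cent within a 120-degree cone
print(f"120-degree cone: {lambertian_flux_fraction(120):.2f}")  # ≈ 0.75
# The 90-degree 'useful lumens' measurement cone
print(f" 90-degree cone: {lambertian_flux_fraction(90):.2f}")   # ≈ 0.50
```

So a bare Lambertian source just misses the EU’s 80 per cent directional threshold, and only about half of its flux would count as useful lumens in a 90-degree cone; directional lamps add optics precisely to push these fractions up.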
What is a watt?
Now we’ve pinned down useful lumens, let’s talk about watts. Light sources can be battery operated or driven by AC or DC mains power. DC power is particularly efficient when coupled with LED lamps and luminaires, and is becoming more important.
An important consideration for the luminous efficacy calculation is that the correct watts values are used for the product being tested. The efficacy of a self-ballasted lamp will be determined by the total power drawn from the mains supply. That means that the power in watts will include the conversion from high-voltage AC to lower-voltage DC and then to a constant-current driver circuit. Each step has inefficiencies built in which reduce the overall efficacy.
Another lamp may be designed to be purely DC driven, without conversion. An assessment might give the impression that such a DC-powered product is more efficient, but remember that AC power will have been converted to DC somewhere upstream (this will become more interesting when homes and buildings start to be powered directly from DC mains supplies). This causes confusion, so it is important to check whether product performance figures are derived from a DC or an AC supply.
There are many elements that make up solid-state lighting and most of them affect efficacy (see above). As we’ve seen, useful lumens per circuit watt is what we’re really interested in, but some manufacturers may use datasheet values for the LED package, which may not represent the lumens emitted under the conditions in which the source will be used.
The LED package can be tested in a number of ways. The first is a quick screening test, in which a current pulse is applied to the LED and the light output is measured. This is the ‘cold’ lumen test and it yields the highest lumen value the package can achieve. The second is to apply a constant DC current to the LED with its case held at a temperature representative of the conditions inside a lamp or luminaire. This is the ‘hot’ lumen value, and it is much closer to what the LED package will emit in real-world conditions.
Specifiers should always make sure that ‘hot’ lumens are used as the starting point for any discussion of luminous efficacy.
Of course most LED lamp or luminaire products have lenses or reflectors on top of the LED. These optics introduce losses and so the number of lumens emitted from the LED will not be the same as the number emitted from the lamp or luminaire.
The driver circuitry will also introduce losses and further lower the efficacy.
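Chaining these losses together shows why the datasheet figure for the LED package and the real figure for the finished product can differ so much. A sketch with assumed, purely illustrative efficiencies:

```python
# All figures below are assumptions for illustration, not real products.
hot_lumens = 1000.0        # 'hot' lumens from the LED package
led_power_w = 10.0         # DC power delivered to the LED package

optical_efficiency = 0.85  # lens/reflector losses
driver_efficiency = 0.88   # AC-to-DC conversion and driver losses

package_efficacy = hot_lumens / led_power_w             # datasheet-style figure
luminaire_lumens = hot_lumens * optical_efficiency      # lumens out of the optics
circuit_watts = led_power_w / driver_efficiency         # watts drawn from the mains
luminaire_efficacy = luminaire_lumens / circuit_watts   # the figure that matters

print(f"Package:   {package_efficacy:.0f} lm/W")
print(f"Luminaire: {luminaire_efficacy:.1f} lm/W")
```

With these assumed numbers a ‘100 lm/W’ LED package delivers only about 75 useful lumens per circuit watt once optical and driver losses are counted – and that is before any ‘cold’ versus ‘hot’ lumen discrepancy.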
So, manufacturers can make a number of different claims, but if specifiers ask the right questions, they should get the right answers. A group of UK lighting bodies has developed guidelines for specification of LED products – bit.ly/ledspecs – and I’ll discuss the questions specifiers should ask about product efficacy in more detail in a future column.
When do you consider power factor?
When using AC power, you may also need to consider a product’s power factor. For a DC-powered device it’s easy: the power in watts is the voltage in volts multiplied by the current in amps. But for an AC-powered device, volts and amps are not always in phase, so the instantaneous power fluctuates. This means you have to calculate the true power in watts by multiplying the volts by the amps at each moment in time and taking the average.
The fact that current and voltage may be out of phase means you need to work with the RMS (root-mean-square) values of the volts and amps rather than their peaks. Multiplying the RMS voltage by the RMS current gives the apparent power, which is measured in volt-amperes (VA) rather than watts.
If the volts and amps are in phase, true power and apparent power will be the same. The power factor is the ratio between the true power and the apparent power – a value between 0 and 1. The higher the number, the closer the true power is to the apparent power.
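These definitions can be checked numerically. The sketch below samples one cycle of sinusoidal voltage and current with an assumed 45-degree phase lag, computes true power as the average of the instantaneous volt-amp products, and apparent power from the RMS values:

```python
import math

# Assumed figures for illustration: 230 V RMS mains, 0.5 A peak current,
# with the current lagging the voltage by 45 degrees.
v_peak = 230.0 * math.sqrt(2)
i_peak = 0.5
phase = math.radians(45)

n = 100_000  # samples across one full mains cycle
ts = [2 * math.pi * k / n for k in range(n)]
v = [v_peak * math.sin(t) for t in ts]
i = [i_peak * math.sin(t - phase) for t in ts]

true_power = sum(vk * ik for vk, ik in zip(v, i)) / n  # watts
v_rms = math.sqrt(sum(x * x for x in v) / n)
i_rms = math.sqrt(sum(x * x for x in i) / n)
apparent_power = v_rms * i_rms                         # volt-amperes

power_factor = true_power / apparent_power
print(f"Power factor: {power_factor:.3f}")  # ≈ cos(45°) ≈ 0.707
```

For pure sinusoids the power factor equals the cosine of the phase angle; real electronic drivers also distort the current waveform, which reduces it further.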
For LED products, electronic drivers can provide power factor values as low as 0.5 and as high as 0.95. Power factor is becoming more important for specifiers because it affects the total electrical loading of the installation.
So when you want to know how much the electricity is costing you, you use watts. When you are specifying equipment loads, fuses and wiring sizes, you use the VA values, which are based on the RMS current and voltage. And when you want to know efficacy, you need the true power in watts.