For those of you just starting with electronics, here is a quick explanation of how to power your LEDs, and why selecting the right resistor is so important.
First off, LEDs are diodes. They happen to emit light, but you must think of them as diodes, not lamps.
What is a diode? As a quick refresher, a diode is a semiconductor device that passes current in only one direction, and it has a useful quirk: once it is conducting, the voltage across its terminals stays roughly constant no matter how much current is passing through it. An LED doesn't limit its own current the way a resistor does; it passes whatever current the rest of the circuit allows, like a closed switch.
What does this mean practically? Say you have a 9v battery and connect an LED directly across it. The voltage across the LED will stay around 2v, which is typical for LEDs. That leaves 7v across the wires connecting the LED to the battery. How much current flows? With 7v across almost no resistance, as much as the battery can supply – which is way too much for the LED. POP! (Most LEDs can only handle about 20mA before damage occurs.)
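To see why the direct connection pops the LED, here is a quick back-of-the-envelope calculation. The 0.5 ohm figure for wire plus battery internal resistance is an illustrative assumption, not a measured value:

```python
# Why connecting an LED straight to a battery destroys it.
# With no resistor, the only thing limiting current is the tiny
# resistance of the wires and the battery itself.
supply_v = 9.0      # battery voltage
led_drop_v = 2.0    # typical LED forward voltage
stray_ohms = 0.5    # assumed wire + battery internal resistance (illustrative)

# Ohm's law: the leftover 7v appears across that tiny stray resistance.
current_a = (supply_v - led_drop_v) / stray_ohms
print(f"{current_a:.1f} A")  # 14.0 A -- vs. an LED limit of about 0.02 A
```

Even with generous assumptions, the current is hundreds of times the LED's 20mA limit.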
Now let’s put a resistor in series with the LED. Again, the LED will have 2v across it, and now the resistor will have 7v across it. Ohm’s law tells us how much current passes through the resistor, and thus through the entire circuit. Since the limiting factor for our LED is current, we can choose a resistor that limits the current to exactly what our LED needs.
An example with real numbers:
1. Assume our LED has a “forward voltage”, or “voltage drop”, of 2v, and a maximum current of 20mA. (Let’s design for 15mA to leave a margin of safety and avoid shortening its lifespan.)
2. Our power supply is 9v.
3. 9v - 2v = 7v across the resistor.
4. Resistance = 7 Volts / 0.015 Amps ≈ 467 Ohms (Ohm's Law)
A common value near this is 470 Ohms (always err on the side of higher resistance), so we choose a 470 Ohm resistor to complete a circuit that passes about 15mA through our LED.
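The whole calculation above can be sketched in a few lines. The short list of standard resistor values is just a slice of the common E12 series around our answer, and the LED numbers are the ones assumed in the text:

```python
# Resistor selection for an LED, following the worked example above.
supply_v = 9.0     # power supply voltage
led_drop_v = 2.0   # assumed LED forward voltage
target_a = 0.015   # 15 mA target, below the 20 mA maximum

# Ohm's law on the resistor: R = V / I
ideal_ohms = (supply_v - led_drop_v) / target_a  # about 466.7 ohms

# Common standard (E12 series) values near our answer; err on the higher side.
standard = [330, 390, 470, 560, 680]
chosen = min(r for r in standard if r >= ideal_ohms)

actual_a = (supply_v - led_drop_v) / chosen
print(chosen, f"{actual_a * 1000:.1f} mA")  # 470  14.9 mA
```

Picking the next value *up* guarantees the actual current lands at or just under the target, which is the safe direction to miss in.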