somewhereinusa wrote: ". . . If you apply AC it would be turning off and on 60 times a second."

Exactly, but that works OK. Even with DC, designers often strobe LEDs off and on very fast to save power and cut heat dissipation, and the eye doesn't notice the difference (unless you wave your fingers back and forth to catch the strobe effect, the same as with a fluorescent light tube). If you put two LEDs back to back in opposite polarity on an AC circuit, each one would be on 50% of the time, and both would look lit.
Executive summary:
When I am making an indicator for switch position or other status in "12 volt" circuits, I use a 1200 ohm resistor on the hot side of a standard plastic (Radio Shack or budget-bag) LED to limit the current to 50% of rating for long life. If I wanted it brighter, I would use 750 ohms (a size I don't stock, so I would use two 1500 ohm resistors in parallel) for 70% current at 12.6 volts and 100% at 17.1 volts.
Details:
Think of an LED as a diode that emits light, not a light bulb. With the marked (cathode) side toward negative, it conducts and lights up. With the marked side toward positive, it acts as a one-way valve and blocks the flow of electricity (provided the breakdown voltage is not exceeded). The brightness depends on the amount of current passing through the LED.
A standard cheap plastic-cased LED is usually rated for around 20 mA (0.02 amps) and 2.1 volts; specially colored LEDs might have different ratings. LEDs in series with resistors are not like bulbs. Bulbs would divide up the voltage with a series resistor in proportion to their resistances; an LED says "I'll take 2.1 volts, you can have the rest." Selecting a resistor means deciding how much current you want through the LED and what "the rest" of the voltage is. For a 20 mA LED, use 50 ohms per excess volt to get 100% current, or 100 ohms per excess volt to get 50% current.
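The rule of thumb above can be sketched in a few lines of Python (my own helper, assuming the 20 mA / 2.1 V LED from the text):

```python
# Sketch of the resistor-sizing rule of thumb, assuming a 20 mA, 2.1 V LED.
def resistor_for(supply_v, fraction, led_v=2.1, full_ma=20.0):
    """Series resistor (ohms) to run the LED at `fraction` of full current."""
    excess_v = supply_v - led_v                    # "the rest" of the voltage
    return excess_v / (fraction * full_ma / 1000.0)

# 50 ohms per excess volt -> 100% current; 100 ohms per excess volt -> 50%.
print(resistor_for(12.1, 1.0))  # 10 excess volts * 50 ohms/volt -> 500 ohms
print(resistor_for(12.1, 0.5))  # 10 excess volts * 100 ohms/volt -> 1000 ohms
```

The two print lines are just the rule of thumb restated: divide the excess voltage by the target current.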
In my example, 1200 ohms passes 10 mA with 12.00 excess volts, plus 2.1 volts for the LED = 50% current at 14.1 volts, and 100% current at 24 + 2.1 = 26.1 volts. This gives long life on a fluctuating voltage source like batteries on a charge/discharge cycle. For a brighter indicator, 750 ohms in series allows the full 20 mA at 15.00 excess volts, and chances are good that the charger won't go over 15 + 2.1 = 17.1 volts and overheat the LED.
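Those design points can be checked going the other direction, from resistor back to current (a sketch, assuming the same 20 mA / 2.1 V LED; `led_ma` is my own name):

```python
# Check the worked examples: current through a 2.1 V LED behind a resistor.
def led_ma(supply_v, ohms, led_v=2.1):
    """LED current in mA for a given supply voltage and series resistor."""
    return max(supply_v - led_v, 0.0) / ohms * 1000.0

print(led_ma(14.1, 1200))  # ~10 mA, 50% of the 20 mA rating
print(led_ma(26.1, 1200))  # ~20 mA, 100% of rating
print(led_ma(17.1, 750))   # ~20 mA, 100% of rating
```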
Without current limiting, the LED attempts to drop 2.1 volts in whatever circuit it is in. Put it across a 12-volt battery and it will attempt to pull the battery down to 2.1 volts. The only limit is the internal resistance of the battery and wiring, so it will draw something like 6,000 amps for as long as it can, which is about a quarter of the blink of an eye. As somewhereinusa said, you let the light out all at once.
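A back-of-envelope sketch of that fault current. The milliohm figure for the battery and wiring is my own assumption (not from the post); the point is only that a few milliohms is all that stands in the way:

```python
# Rough fault-current estimate for an unprotected LED across a battery.
# The internal resistance value is an assumed figure for illustration.
battery_v = 12.0
led_v = 2.1
r_internal = 0.00165  # ohms, assumed battery + wiring resistance
fault_amps = (battery_v - led_v) / r_internal
print(fault_amps)  # thousands of amps, for as long as the LED survives
```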
You could use the same LED circuit design on AC power (with a second LED or diode back to back to handle the reverse half-cycle, as above): 12,000 ohms on the hot lead would give 50% current at 122.1 volts and 100% current at 242.1 volts; 7,500 ohms would give 100% current at 152.1 volts.
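The same sizing arithmetic at line voltage, as a sketch (same assumed 20 mA / 2.1 V LED; `series_ohms` is my own name):

```python
# Size the series resistor at line voltage for a 20 mA, 2.1 V LED.
def series_ohms(supply_v, fraction, led_v=2.1, full_ma=20.0):
    """Resistor (ohms) for `fraction` of full LED current at `supply_v`."""
    return (supply_v - led_v) / (fraction * full_ma / 1000.0)

print(series_ohms(122.1, 0.5))  # 120 excess volts at 10 mA -> 12,000 ohms
print(series_ohms(152.1, 1.0))  # 150 excess volts at 20 mA -> 7,500 ohms
```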
Using resistors is simple, but it shares a problem with incandescents: at 14.1 volts, about 85% of the power is burned off as heat in the resistor. With a single LED as the indicator, that loss cannot be avoided no matter how you control the current.
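Where the 85% figure comes from: resistor and LED carry the same current, so the power splits in proportion to the voltage each one drops. A quick sketch:

```python
# Heat split at 14.1 V: resistor drops 12.0 V, LED drops 2.1 V, same current.
supply_v, led_v = 14.1, 2.1
ma = 10.0  # 50% current through the 1200 ohm resistor
p_resistor = (supply_v - led_v) * ma / 1000.0  # watts burned in the resistor
p_led = led_v * ma / 1000.0                    # watts delivered to the LED
print(p_resistor / (p_resistor + p_led))  # ~0.85: ~85% of the power is heat
```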
Some LED modules have internal switching power supplies that adapt to changing voltages to give steady light. Some are good, and some not so good. A buddy loaned me one model of LED bulb replacement to try in my camper dome light. The FM radio in the screen tent next door noised up as soon as the bulb turned on and started switching. No thanks.