Well now I've hit another wall, lol. They worked fine with a 9v test run, and they worked fine installed to the 12v for about twenty mins then nothin. Two of my LEDs no longer work even when connected to a battery alone. I had a 360 ohm resistor in a circuit with three 1.7 watt 3000 mcd 20a LEDs. Any ideas?
Yeah, hope you enjoyed that light while it lasted, and I hope you like your LEDs cooked to well done.
Sorry everyone, it's time for a little math, if that offends you move along to another post.
Something seems a little off in your specs for the LEDs. I'm thinking that perhaps they are 1.7 Volt, 20 ma (milliamps, thousandths of an Amp). Those are probably pretty accurate values for a 3000 mcd LED, since 1.7 Watts would be a really bright LED and 20 Amps is an awful lot of current. We will continue this discussion based on my assumptions.
To calculate the correct value for a resistor, you need to know several things. The first is how much voltage you expect the resistor to drop, since the LED needs to run at a much lower voltage than the supply. The second is how much current the LED is rated for, which the resistor will limit. The third, which we will use later, is how much power the resistor has to handle (dissipate).
To find a resistance value, we use Ohm's Law, which tells us that:
Resistance = Voltage / Current
When performing the Voltage part of the calculation, always start with the maximum voltage that will be supplied (we will ignore short voltage spikes in this instance). So, in the G6 the maximum voltage is 14.6 Volts with the engine running and the voltage regulator in the alternator supplying maximum voltage (there was a note about this in the owner's manual). So, on to the math.
The voltage that needs to be dropped will be the maximum voltage (14.6) minus the operational voltage that the LED needs. Using the assumption made earlier, that works out to 14.6 - 1.7 = 12.9 Volts. You didn't mention if you are using one resistor per LED, but since you are putting them in each door I will assume that is true. If you were running multiple LEDs on one resistor the math changes, and the wiring could get messy; see the quick aside below.
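As that aside: if you did chain all three LEDs in series off a single resistor, the resistor only has to drop whatever the LEDs don't. Here is a minimal sketch in Python using the same assumed specs (series wiring is just my example, not necessarily how yours are hooked up):

# Hypothetical: three LEDs in series sharing one current-limiting resistor.
supply_v = 14.6                  # max alternator voltage, per the owner's manual note
led_vf = 1.7                     # assumed forward voltage of each LED
led_i = 0.020                    # assumed 20 ma rated current

drop_v = supply_v - 3 * led_vf   # 14.6 - 5.1 = 9.5 Volts across the resistor
print(drop_v / led_i)            # 475 Ohms; round UP to the next standard value

OK, back to one resistor per LED.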
OK, we have a voltage drop of 12.9 Volts, and a current of 20 ma or 0.020 Amps. Using Ohm's law we get:
Resistance = Voltage / Current
Resistance = 12.9 / 0.020
Resistance = 645 ohms
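If you would rather let Python grind through the arithmetic, here is a minimal sketch of the same calculation (the 14.6 Volt figure and the assumed 1.7 Volt / 20 ma LED specs are the ones from above):

# One LED with its own resistor, fed from the car's electrical system.
supply_v = 14.6              # max voltage with the engine running
led_vf = 1.7                 # assumed LED forward voltage
led_i = 0.020                # assumed LED rated current (20 ma)

drop_v = supply_v - led_vf   # voltage the resistor must drop: 12.9 Volts
print(drop_v / led_i)        # Ohm's Law, R = V / I: 645 Ohms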
Reversing the math, we learn that you fed as much as 36 ma, about 80% more than the rated current, to your LEDs through the 360 Ohm resistor, which would have given you a nice, bright light for a matter of minutes before the LEDs faded out, just like you described. With the 9 Volt battery you were running them at about 20 ma, right at the rated current, which is why everything looked fine during your test. Of course a 9 Volt battery would only run them for a day or so before going dead anyway... but I digress.
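You can sanity-check any resistor by running Ohm's Law the other way; here the 360 Ohm resistor is plugged back in against both supplies (the helper function is just mine, for illustration):

# Reversing the math: current actually pushed through a given resistor.
def led_current_ma(supply_v, r_ohms, led_vf=1.7):
    # Assumes one LED (1.7 Volt drop) in series with the resistor.
    return (supply_v - led_vf) / r_ohms * 1000

print(led_current_ma(14.6, 360))   # ~35.8 ma: roughly 80% over the 20 ma rating
print(led_current_ma(9.0, 360))    # ~20.3 ma: right at the rating, so the test looked fine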
Ok, just a little tidying up to do. 645 Ohms is not a common value for resistors, but 680 Ohms is. That will give you about 19 ma of current maximum, which is nice and safe and still gives you bright light. When choosing a resistor value for something like this you always want to go to a higher resistance value to keep the current in the safe zone. Next is the power rating for the resistor.
Power = Voltage * Current
Power = 12.9 (drop across the resistor) * 0.020
Power = 0.258 Watts
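Scripted the same way, with the actual current through the 680 Ohm resistor thrown in for comparison:

# Power the resistor has to dissipate: P = V * I.
drop_v = 12.9
print(drop_v * 0.020)            # 0.258 Watts at the full rated 20 ma
print(drop_v * (drop_v / 680))   # ~0.245 Watts with the 680 Ohm resistor
# Either way you are right at the edge of a 1/4 Watt (0.250 Watt) rating.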
You need to use a 1/2 Watt-rated resistor (commonly available), as the 1/4 Watt (0.250 Watt) resistors you might be supplied can't be relied on not to overheat and fail over time. Also, when installing the resistor, keep it away from anything that can melt, like plastic sheeting or door components. A quarter Watt or so of heat is not a lot, but over time it can cause some damage.
Class dismissed.
~ MattInSoCal