Originally Posted By: bmwtechguy
Q,
One question I have is how does putting in an 80 watt bulb INCREASE the circuit's resistance? The resistance in a circuit like this is more or less a constant, and putting in a load that increases the current flow (80 watts vs 35-55 watts) would actually DECREASE the overall resistance (ohms) in the circuit, causing more current to flow up to the point that a fuse blows or the wire melts. The load is usually the current limiting device in a properly designed and healthy (no corrosion or poor connections) circuit.
yo!
You need to think Ohm's law (V = I*R), where voltage equals current times resistance. Copper wire is not a perfect conductor; it comes with resistance of its own (if you look deeper into the subject, you'll realise that copper wire is actually rated by resistance per unit length).
Granted, we use the same wire that was destined for the 35-watt lighting system, and assume the length hasn't changed (same wiring harness, no change in wire gauge, etc.). Let's pick an artificial value for ease of calculation, say 0.5 ohms of total wiring-related resistance. A 35-watt bulb on a 12 VDC supply draws approx. 2.92 amps (I = P/V).
Using Ohm's law to calculate the voltage drop across the wiring (V = I*R): 2.92 amps * 0.5 ohms = approx. 1.46 VDC.
So, in other words: if you feed 12 V into the system, what your 35-watt bulb actually "sees" at its filament is 12 V - 1.46 V = 10.54 VDC.
Now let's redo the calculation with an 80-watt bulb: you get 6.67 amps, and the voltage drop works out to roughly 3.335 VDC. In other words, your 80-watt bulb's filament only sees about 8.665 VDC after the drop!
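For anyone who wants to play with the numbers, here's a quick Python sketch of the voltage-drop arithmetic above. The 0.5 ohm wiring resistance is the same made-up value as before, and the bulb is simplified to drawing its rated current (P/V):

```python
# Simplified voltage-drop estimate: treat the bulb as drawing its
# rated current (P / V) and the wiring as a fixed 0.5-ohm resistance.
SUPPLY_V = 12.0      # system voltage
WIRE_OHMS = 0.5      # made-up total wiring resistance from the example

def voltage_at_bulb(rated_watts):
    current = rated_watts / SUPPLY_V    # I = P / V, e.g. 35 W -> ~2.92 A
    drop = current * WIRE_OHMS          # V = I * R across the wiring
    return SUPPLY_V - drop              # what the filament actually sees

print(round(voltage_at_bulb(35), 2))   # ~10.54 V
print(round(voltage_at_bulb(80), 2))   # ~8.67 V (8.665 above comes from rounding the current to 6.67 A first)
```

Note this is still optimistic: it ignores the wire heating up, which is exactly the effect discussed next.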
Finally, to answer your question about why putting in an 80-watt bulb would increase the circuit's resistance, first let me feed you a physics link:
http://www.allaboutcircuits.com/vol_1/chpt_12/6.html
Other than certain materials such as carbon, silicon, germanium, etc., which come with a negative temperature coefficient of resistance (meaning the higher the temperature, the lower the resistance), metals, incl. copper, have a positive coefficient: the higher the temperature, the higher the resistance.
In other words: compared with feeding only 2.92 amps through that same wire (0.5 ohms at 20 C), feeding 6.67 amps through the very same wire heats it up more, which raises its resistance, until the wire reaches a temperature "equilibrium" point at which the resistance will surely be more than the 0.5 ohms @ 20 C you started with (could be 0.6? 0.7 ohms? All has to be calculated).
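To put a rough number on that "equilibrium" hand-waving, here's a sketch using the standard linear approximation R = R_ref * (1 + alpha * (T - T_ref)) with copper's temperature coefficient of about 0.00393 per degree C near 20 C. The 80 C equilibrium temperature is a made-up figure purely for illustration:

```python
# Copper resistance vs. temperature, linear approximation:
#   R = R_ref * (1 + alpha * (T - T_ref))
ALPHA_CU = 0.00393   # per deg C, copper's temperature coefficient near 20 C
R_REF = 0.5          # ohms at 20 C, the example's total wiring resistance
T_REF = 20.0         # reference temperature in deg C

def wire_resistance(temp_c):
    return R_REF * (1 + ALPHA_CU * (temp_c - T_REF))

# If the heavier 80 W current heats the wire to, say, 80 C (an assumed
# equilibrium temperature, not a measured one), the wiring resistance
# climbs noticeably above the cold 0.5 ohms:
print(round(wire_resistance(80), 3))   # ~0.618 ohms
```

So the "could be 0.6? 0.7 ohms?" guess above is entirely plausible for a wire running well above ambient.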
Bottom line: when in doubt, don't do silly things like retrofitting an 80-watt bulb into a harness only rated for a 35-watt bulb.
Hope that answers your questions.
Q.