
electricity

hobo | 18:38 Wed 08th Feb 2006 | Science
13 Answers

Why does resistance increase at high voltages?



Answers


Best Answer

No best answer has yet been selected by hobo.
A number of factors are involved, but one simple explanation is that the increased voltage causes an increase in current. The increased current causes a heating effect, and increased temperature causes an increase in resistance for most metals.
Ohm's Law will give you the answer...
Not sure that Ohm's law helps here. Ohm's law is really to do with the voltage/current ratio. I assume the question is about the resistance of a given material increasing as the voltage is increased. As above, this is likely to be a temperature effect and would not obey Ohm's law (which assumes a constant temperature).
Sorry if this is really dumb, but I thought it was to do with the increase in heat caused by the increased current, causing greater vibrations of the metal atoms in the wire, in turn making it harder for the electrons to pass through the wire, thus increasing resistance.
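
To put rough numbers on the heating effect described in the answers above, here is a small Python sketch. The temperature coefficient of about 0.0039 per °C is the standard textbook figure for copper; the resistance, voltage and temperature values are invented purely for illustration.

```python
# A minimal sketch of the heating effect described above, assuming a
# copper-like conductor. 0.0039 /degC is the usual textbook temperature
# coefficient for copper; the other numbers are made up for illustration.

R_COLD = 10.0   # resistance in ohms at 20 degC (illustrative)
ALPHA = 0.0039  # temperature coefficient of resistance for copper, per degC
T_REF = 20.0    # reference temperature, degC

def resistance_at(temp_c):
    """Linear approximation: R = R0 * (1 + alpha * (T - T0))."""
    return R_COLD * (1 + ALPHA * (temp_c - T_REF))

def current(voltage, temp_c):
    """Ohm's law, I = V / R, using the resistance at the given temperature."""
    return voltage / resistance_at(temp_c)

# At low voltage the wire stays near room temperature...
print(current(5.0, 20.0))    # ~0.50 A through 10 ohms

# ...but at a higher voltage the extra current heats the wire, say to 120 degC,
# and the hotter (higher) resistance means the current is less than the
# cold-resistance figure of V/R = 2.0 A would suggest.
print(resistance_at(120.0))  # ~13.9 ohms
print(current(20.0, 120.0))  # ~1.44 A, not 2.0 A
```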

dynamicduo & steve are about right.


Have you heard about supercooling cables to allow a greater current to flow through them? You can get some amazing voltages through when this is used.

For a given amount of power transfer, increasing the voltage requires proportionately less current, thereby reducing heating and power losses.
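
Here is a quick sketch of that trade-off, with invented numbers (real transmission lines run at far higher voltages and lower resistances): delivering the same power at a higher voltage needs proportionately less current, and the heat wasted in the line falls with the square of that current.

```python
# Same delivered power at three different voltages. The delivered power and
# line resistance are invented for illustration only.

P_DELIVERED = 10_000.0   # power to deliver, watts
R_LINE = 2.0             # resistance of the transmission line, ohms (illustrative)

for volts in (230.0, 2_300.0, 23_000.0):
    amps = P_DELIVERED / volts   # I = P / V
    loss = amps ** 2 * R_LINE    # heat dissipated in the line, I^2 * R
    print(f"{volts:>8.0f} V  ->  {amps:7.2f} A, line loss {loss:10.2f} W")

# 230 V needs ~43.5 A and wastes ~3,780 W in the line; 23,000 V needs ~0.43 A
# and wastes only ~0.38 W for the same delivered power.
```

The same arithmetic is why power is stepped up to a high voltage for the long haul and back down near the user, as the later posts in this thread note.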

mibn2cweus


I think this is making it too complicated. To get a certain power, an increase in voltage would require a reduced current. However, power output isn't something you can keep constant whilst increasing the voltage unless you increase the resistance in order to reduce the current. This goes against the original question, which, we assume, refers to the same resistance component throughout.

dynamicduo, yes, assuming a "resistance increase at high voltages" as this thread does implies an increase of current as well, so you are completely right. But I would like to make it clear that the increased resistance results from the heating caused by the increase of current. The increase in resistance due to the heating effect of current can (in appropriate circumstances) actually be reduced by using a higher voltage. This benefit is derived when delivering power at substantially higher voltages than those required by most users, over long distances in power transmission lines, as I am certain you are fully aware. Again, you are probably correct in stating that I have overcomplicated the issue at hand. I guess I just saw the stars * * * given for your original post and went a little crazy. I will try a little more restraint from now on; no promises! >
You have all assumed that the resistance does increase at high voltages; why should it?

stanleyman
You have raised an important question; answering it properly should hopefully also satisfy what the question that started this thread was really asking. Does everyone have their thinking cap in place?

When electrical current flows through a resistance, some heat is generated. Under normal circumstances, even a conductor will exhibit a relatively small value of resistance.
In a well designed circuit the resistance of a resistor or conductor will not be significantly affected by the normal operating current flowing through it. However, if the current through it increases to a level that approaches or exceeds its designed current-carrying capacity, it will no longer be able to dissipate heat as quickly as it is generated. As temperature increases, the resistance will increase in some materials. The combined increase of heating and resistance may lead to the failure of a conductor or resistor. This is what causes fuses to blow when excessive current flows through them; they melt or vaporize depending on how strong the current is.
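
As a rough illustration of why a fuse gives up once its rating is exceeded, here is a small sketch; the fuse element resistance and the 13 A rating are invented for the example, but the point stands: the heat generated goes as I²R, so doubling the current quadruples the heating.

```python
# Heat generated in a fuse element at various currents. The element
# resistance and rating are hypothetical values chosen for illustration.

R_FUSE = 0.05          # resistance of the fuse element, ohms (illustrative)
RATED_CURRENT = 13.0   # rated current, amps (illustrative)

for amps in (5.0, 13.0, 26.0, 50.0):
    heat = amps ** 2 * R_FUSE   # I^2 * R, watts
    note = "over rating - heats faster than it can cool" if amps > RATED_CURRENT else "within rating"
    print(f"{amps:5.1f} A -> {heat:7.2f} W of heat in the element ({note})")
```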


continued below...

Now that this basis is established, all we need to do is understand how voltage relates to all of this. Voltage is a measurement of the potential force which causes electrical current to flow through a conductor or electronic circuit. (For those wondering, voltage might be thought of as analogous to water pressure in a plumbing system, and current would be the actual flow of water. Electrical current is the flow of electrons, as through a wire.) The more current that flows through a given value of resistance, the greater the voltage that will be measured across the ends of that resistance. As the current or resistance increases, so does the voltage.
If the voltage (electrical pressure) applied across the ends of a resistance is increased to a level that causes an excessive amount of current to flow, the heating due to the current will cause an increase in the resistance of materials with a positive temperature coefficient.
This phenomenon is evident in a light bulb. The light bulb gives the best light when operating under these extreme conditions, and slightly exceeding its designed operating voltage will greatly diminish how long it lasts before burning out. The resistance of a cold light bulb filament is relatively low compared to its resistance after it has begun to give off light. When a light bulb is first energized, a rush of current flows through it until it heats up and gains more resistance. This "inrush" of current is stressful and is the reason why light bulbs often fail ("burn out") when they are first turned on.

I hope that I have not complicated this issue beyond necessity, that this explanation, along with what has previously been posted, is sufficiently explanatory, and that it has not caused excessive pressure under your thinking cap. If you smell something burning, it might be wise to remove your thinking cap to allow things to cool down a bit.
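
The light bulb example can be put into numbers too. The values below are the kind commonly quoted for a 120 V, 60 W incandescent bulb (a hot filament of roughly 240 ohms and a cold filament of roughly a tenth of that); treat them as illustrative rather than measured.

```python
# Cold versus hot filament resistance and the resulting inrush current.
# Figures are typical/illustrative, not measurements of any particular bulb.

VOLTS = 120.0
R_HOT = 240.0   # filament resistance once glowing, ohms
R_COLD = 24.0   # filament resistance at room temperature, ohms

steady_current = VOLTS / R_HOT    # ~0.5 A once the filament is up to temperature
inrush_current = VOLTS / R_COLD   # ~5.0 A for the instant after switch-on

print(f"steady: {steady_current:.2f} A, inrush: {inrush_current:.2f} A "
      f"({inrush_current / steady_current:.0f}x)")
# That ~10x surge at switch-on is the stress that tends to finish off an old filament.
```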

Sorry Mib, but I dropped off before getting to the end of that :) The question was, why does resistance increase at high voltages? It doesn't: voltage is irrelevant; the resistance remains constant and is only relevant to the current flowing through it.


If the questioner means why does the resistance increase when the voltage increases, then, as I think everyone understands, the current also increases, causing the load to heat up and its resistance to rise, as per the first answer by dynamicduo.

stanleyman
If you wanted the short answer to "You have all assumed that the resistance does increase at high voltages, why should it?", all you had to do was say so.

Because.

Anyway, I hope you enjoyed your nap; if so, then you're welcome.

