> Will a higher voltage lead to more power?

Posted at: 2015-01-07 
For a given power, the higher the voltage, the lower the current: Power (VA) = Volts x Amps.

As previously mentioned, high voltage is used to transmit electrical power over long distances.

Because of the lower current, cable with a smaller cross-sectional area (CSA) can be used, reducing power loss in transmission and also the weight of the cables between pylons. Economy.

However, the voltage is reduced at the business end for industrial and domestic use.

Again, however, if you have a fixed load (resistive or reactive) and a fixed-frequency supply, increasing the voltage beyond its design value will result in higher current, more power, and more than likely failure of the unit due to overload.

You say Ohm's Law is "very basic", and so it is; it has served for many electrical calculations for nearly 200 years. Yet you say you "can't fully grasp the relationship between Volts, Amps and Power"?

Amps = Volts / Resistance. I = V / R.

V = I x R

P = I² x R.

In AC, R = Z, the impedance.
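The three formulas above can be sketched as a few Python helpers (illustrative only; the 110 V / 10 ohm example values are my own, not from the thread):

```python
# Ohm's law and the power relation quoted above.
def current(volts, ohms):
    """I = V / R (for AC, substitute the impedance Z for R)."""
    return volts / ohms

def voltage(amps, ohms):
    """V = I x R."""
    return amps * ohms

def power(amps, ohms):
    """P = I^2 x R."""
    return amps ** 2 * ohms

# Example: a 10 ohm load on a 110 V supply.
i = current(110, 10)   # 11 A
p = power(i, 10)       # 11^2 x 10 = 1210 W
```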

The relationships between each side of a transformer are mathematically simple, but if you think about it a bit it can get conceptually confusing. The basic rule is that the power into the transformer equals the power out, so if voltage increases on one side, the current must fall.

The piece of information they never tell you at first, and that ties it all together (for me anyway), is that a load on one side of a transformer "looks" like a different value of load from the other side. This is how the voltage can go up while the current falls: a 10 Ohm load on one side looks like 40 Ohms from the other side if the voltage doubles (and the current halves).
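That "looks like a different load" rule is just the impedance scaled by the square of the turns ratio. A minimal sketch of the example above:

```python
# Reflected impedance through an ideal transformer.
# A load Z on the low-voltage side "looks like" n^2 * Z from the
# high-voltage side, where n is the voltage (turns) ratio.
def reflected_impedance(z_load, turns_ratio):
    return z_load * turns_ratio ** 2

# The example in the text: a 10 ohm load, voltage doubled (n = 2).
print(reflected_impedance(10, 2))  # -> 40 (ohms)
```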

A Tesla coil is basically a high-frequency transformer. Transformers cannot produce more power at the output than goes in at the input; in fact there are losses involved. If the losses are ignored, the power out equals the power in, so if the input was 12 volts at 10 amps the power would be 12 x 10 = 120 watts. If the output voltage was 12,000 volts, the current would have to be 120 / 12,000 = 0.01 amps to have the same power output.
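Working that arithmetic through in code, under the same lossless-transformer assumption:

```python
# Ideal (lossless) transformer: power out equals power in.
v_in, i_in = 12.0, 10.0     # 12 V at 10 A on the input side
p_in = v_in * i_in          # 120 W
v_out = 12_000.0            # stepped up to 12 kV
i_out = p_in / v_out        # 0.01 A: the current must fall so that
                            # v_out * i_out still equals 120 W
```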

Power = voltage x current.

When high voltage is used on power lines, the current is reduced, allowing smaller-diameter conductors and reduced losses due to heating (known as I²R losses, because the power dissipated in a resistance is proportional to the square of the current).
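A quick sketch of why the I²R saving is so large. The numbers here (1 MW delivered, a line resistance of 10 ohms) are my own illustrative assumptions, not from the thread:

```python
# I^2 * R loss in a transmission line for the same delivered power
# at two different line voltages.
def line_loss(power_delivered, volts, line_resistance):
    i = power_delivered / volts       # current the line must carry
    return i ** 2 * line_resistance   # power wasted heating the line

loss_low  = line_loss(1e6, 10_000, 10)   # at 10 kV:  I = 100 A -> 100 kW lost
loss_high = line_loss(1e6, 250_000, 10)  # at 250 kV: I = 4 A   -> 160 W lost
```

Raising the voltage 25x cuts the line current 25x, and the heating loss by 25² = 625x.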

Power is the product of voltage and current. 100 V at 1 amp has as much power as 1,000 V at 100 mA, 10 kV at 10 mA, and 100 kV at 1 mA. The load decides what the current will be: to get 1 amp from 100 V, the load resistance has to be 100 ohms. For 100 watts at 1 kV the resistance has to be 10 k, at 10 kV it has to be 1 Meg, and at 100 kV it is 100 Meg. Just divide the voltage by the current to get this.
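Those four cases can be checked with R = V / I, which for a fixed power P is the same as V² / P:

```python
# Load resistance needed to draw a constant 100 W at various voltages.
P = 100  # watts
for volts in (100, 1_000, 10_000, 100_000):
    r = volts ** 2 / P       # R = V^2 / P = V / I
    i = volts / r            # current that resistance actually draws
    print(f"{volts} V -> R = {r:,.0f} ohm, I = {i} A")
```

This reproduces the 100 ohm / 10 k / 1 Meg / 100 Meg sequence from the answer above.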

Power is generated at about 10 kV but boosted to around 220 kV, so that the current is less than if 10 kV were used for transmission. It is brought back down to 4.4 kV and again to 110 V before being supplied to consumers. The high transmission voltage reduces the current and thus the losses due to the resistance of the wire carrying the load current. All this is done at 50 or 60 Hz. If you switch on a 100 watt bulb at 110 V, the current in the 220 kV line increases by 100/220,000, or about 0.45 mA, ideally; the current in the bulb itself is 100/110 amps.
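The bulb example in numbers, assuming ideal (lossless) transformers so the same 100 W flows at every voltage level:

```python
# Extra current at each voltage level when a 100 W bulb is switched on.
p_bulb = 100.0                 # watts
i_bulb = p_bulb / 110          # ~0.91 A in the bulb at 110 V
i_line = p_bulb / 220_000      # ~0.45 mA extra on the 220 kV line
```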

Note that even if voltage is present, no power is drawn from the source until a load is connected. If you switch on a 100 watt bulb connected to 110 V, you get a current and hence 100 W of power. When the switch is off, the 110 V is still there, but since no load is connected, no power is used. You can do an experiment with a 1.5 V battery. Connect a load of, say, 10 ohms, and you get a current of about 150 mA. The power in the 10 ohms will be 1.5 x 0.15 = 0.225 watt, or 225 milliwatt. If the resistor is rated at only 100 mW, it becomes too hot. If a 10 ohm resistor rated at 1 watt is used, it is just fine, though it too gets a bit warm.
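The battery experiment, worked through with the same numbers:

```python
# 1.5 V battery driving a 10 ohm load, as in the experiment above.
v, r = 1.5, 10.0
i = v / r                 # 0.15 A (150 mA)
p = v * i                 # 0.225 W = 225 mW dissipated in the resistor
ok_1w    = p <= 1.0       # a 1 W rated resistor handles it (True)
ok_100mw = p <= 0.1       # a 100 mW rated resistor is overloaded (False)
```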

Okay, so I really don't completely understand electronics. Ohm's law is pretty basic and stuff, but I don't fully grasp the relationship between current, voltage, and power.

My primary confusion lies with the Tesla coil. From my understanding, it increases the voltage immensely while keeping the current relatively low. So I figured it would also increase the power available to do work (like powering a motor). However, when I share this thought, people say it doesn't do that. I just want to understand why not, and also: what are the benefits of higher voltage?