Will we have HVDC at the home socket one day?

It’s been many years since locomotives moved from DC to AC traction motors … VFDs/IGBTs, etc. …
That is a very different thing. A diesel-electric locomotive is generator powered, and that AC is not the constant-frequency AC our grid runs at (60 Hz in the US). If anything, it is inverter-produced AC, driven from a DC bus fed by the diesel generator.
 
The advantage of alternating-current power systems, and why they are so widely adopted, is the transformation factor and the conductor costs of the transmission system: transformers make it cheap to step up to high voltage, where the I^2*R losses are low.

The DC system only has a place in power distribution when the power sources and the loads are very local (close to each other). Otherwise, at practical low voltages, the I^2*R power losses in the conductors will be very high.
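To make the I^2*R point concrete, here is a minimal sketch in Python. The wire resistance and the 10 kW load are made-up illustrative numbers, not design values; the point is only the scaling:

```python
# Feeder loss at fixed delivered power: I = P / V, so P_loss = I^2 * R = P^2 * R / V^2.
# R and the load are assumed, illustrative values.

def feeder_loss(power_w: float, volts: float, wire_ohms: float) -> float:
    """I^2 * R loss in the feeder for a given delivered power and system voltage."""
    current = power_w / volts
    return current ** 2 * wire_ohms

R = 0.1  # ohms, assumed round-trip feeder resistance
for v in (120, 240, 2400, 24000):
    print(f"{v:>6} V: current {10_000 / v:7.1f} A, loss {feeder_loss(10_000, v, R):9.1f} W")

# Ten times the voltage means one hundredth the loss, which is why transmission
# runs at high voltage regardless of AC vs. DC.
```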

Something else to consider: many appliances need 120 V, which is split from the 240 V supply using the neutral lead.

How would you do this with DC, assuming all motors could operate on either AC or DC?
I was thinking that if a home were powered by DC, it would be 240 V DC; there's no reason to stay at 110 V anyway, as the rest of the world runs fine on 220/240 V AC.

The power lines would still be high voltage, if not locally at 240 V; I would think the grid would stay where it is now, or use the HVDC transmission we already use anyway. It would just be a matter of whether to switch the last mile to 240 V DC or 110 V AC per phase.

For backward compatibility, I would imagine the three wires on the pole would become two 110 V AC phases, with the last one turned into 240 V DC. Homes would be wired with an additional 240 V DC feed from the pole, and "some" appliances and sockets would get a new four-prong plug that is backward compatible, so it could use the existing 110 V, the existing 220 V split phase, or the 240 V DC (or some reasonable conversion kit that rewires an appliance: the 110 V motor and electronics stay on 110 V, but the heating element goes onto DC instead of 220 V split phase). New appliances and electronics would just use the 240 V DC directly: car chargers, HVAC, induction ranges, computers, home electronics with switching power supplies, and so on. To me this makes sense.

If this movement gained momentum, it would be another 200 years before the 110 V / 240 V AC finally got converted to 240 V DC. Or maybe run three-phase into the house and use three-phase motors, three-phase chargers, three-phase induction ranges, and call it a day. We can't even move our nation to metric; what am I thinking?
 
I think you're still neglecting the I^2*R losses that would be inherent in such a DC system for a typical 20 kW home requirement. The gauge of copper needed for the feed lines would become cost-prohibitive.

And turning AC into DC would require massive rectifiers, which in turn have their own power losses.

I still fail to see that you have provided any system advantage for DC over AC transmission.
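As a rough sanity check on the copper argument, here is a sketch in Python. The ohms-per-km figures are approximate textbook values for copper at 20 °C, and the 50 m service-drop length is an assumption; note the arithmetic is identical whether the 240 V is AC (RMS) or DC:

```python
# Copper sizing for a 20 kW service at 240 V: I = P / V, about 83 A.
# Resistances are approximate ohms per km for copper at 20 C (assumed values).
AWG_OHMS_PER_KM = {"4 AWG": 0.81, "2 AWG": 0.51, "1/0 AWG": 0.32, "4/0 AWG": 0.16}

P, V = 20_000, 240
I = P / V
run_m = 50  # assumed one-way service-drop length; the round trip doubles it

for gauge, ohms_per_km in AWG_OHMS_PER_KM.items():
    r = ohms_per_km * (2 * run_m) / 1000  # round-trip resistance, ohms
    loss = I ** 2 * r
    print(f"{gauge:>7}: {loss:5.0f} W lost ({100 * loss / P:4.1f}%), {I * r:5.1f} V drop")
```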
 
You would still have to have power supplies in everything. Desktop computer power supplies must provide +12 V, +5 V, +3.3 V, -12 V, and +5 Vsb. Induction ranges need AC current in the coil to work. A pretty big reason HVDC transmission is used at all is that it doesn't require the two grids it connects to synchronize with each other.
 
Hmm, I did some reading, and it seems like DC can tolerate a higher voltage: 240 V AC is an RMS figure, and the waveform actually peaks at about 339 V (679 V peak-to-peak). If the DC is rectified from the same high-voltage AC at the last loop, it could come into the house at roughly 339 V DC (safety be darned, let's say). Then the I^2*R loss would be less than at 240 V AC (240 V AC still has I^2*R loss, after all), and 339 V DC certainly means less current, and therefore less loss, than 240 V DC (I should have thought of that first).

Turning AC into DC requiring massive rectifiers is true, but if we run at a higher voltage we also get less I^2*R loss from less current (or we can power much higher-power stuff, like car charging). Maybe picking which voltage to run the DC at is the key: getting into the house at ~339 V DC may be a better alternative than 240 V DC, with an advantage over 240 V AC (RMS). Then at the panel we could step down to a lower voltage for low-power stuff, using DC-DC converters and PWM, for safety.

I guess three-phase AC is the better choice for a home, unless you have a solar roof that outputs high-voltage DC to begin with, so you can skip the DC-to-AC inverter and feed the DC directly into inverter-driven loads (like inverter-driven HVAC or car charging).
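For what it's worth, the ratios in that argument work out like this (the wire resistance cancels out of the comparison, so no assumptions are needed beyond fixed power and the same conductor):

```python
# Relative feeder loss at fixed power scales as (1/V)^2, AC (RMS) or DC alike.
base = 240  # V, the AC RMS baseline
for v in (240, 339, 679):
    print(f"{v:>3} V: {base / v:.2f}x the current, {(base / v) ** 2:.3f}x the loss of {base} V")
```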
 
Yes, if the home is DC powered it would still need to convert from hundreds of volts DC down to 12 V and below. It won't be a free conversion, just maybe slightly less loss with a PWM step-down. If converting from high-voltage AC to low-voltage DC is not too expensive, but converting from high-voltage AC to high-voltage DC is, then it is not worth it.
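One way to frame that trade-off is to multiply the stage efficiencies of each conversion chain. Every percentage below is an assumed, illustrative figure, not a measured one:

```python
# Cascaded conversion efficiency is the product of the stages.
# All stage efficiencies are assumed, illustrative values.
from math import prod

chains = {
    "AC grid -> AC/DC PSU to 12 V":               [0.92],
    "AC grid -> 380 VDC rectifier -> 12 V buck":  [0.97, 0.95],
    "Solar DC -> inverter -> AC -> PSU to 12 V":  [0.96, 0.92],
    "Solar DC -> 12 V DC-DC (skip the inverter)": [0.95],
}
for name, stages in chains.items():
    print(f"{name}: {100 * prod(stages):.1f}% end to end")
```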
 
I have one of those ECM fan motors on my "high efficiency" home AC unit. It drives the condenser fan. Expensive little guy, and it lasts five years in that harsh environment. I've gotten good at repairing it, using components from the last removals.

Really, it's silly. Any quality induction fan motor is about 90% efficient. This motor is 93% efficient and can vary its speed depending on whether the home AC is running the 2.5- or 5-ton compressor. Any well-designed induction motor can efficiently run at multiple speeds too, at 1/6th the initial cost!

Replacing that motor multiple times has cost me far more than the few dollars of power it's saved.

In any design, always optimize the important stuff first. A motor that uses 50 cents' worth of power per month won't save much if it's made twice as efficient.
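That last point is easy to put in numbers. Here is the payback arithmetic with assumed figures (fan load, run hours, electricity price, and motor prices are all illustrative):

```python
# Payback math for a 90% -> 93% efficient condenser fan motor.
# Every number here is an assumption for the sake of the example.
shaft_w   = 400    # W, mechanical load the fan presents
hours_yr  = 2000   # h, cooling-season run time per year
price_kwh = 0.15   # $/kWh

saved_kwh = (shaft_w / 0.90 - shaft_w / 0.93) * hours_yr / 1000
saved_usd = saved_kwh * price_kwh
print(f"Energy saved: {saved_kwh:.0f} kWh/yr -> ${saved_usd:.2f}/yr")

# Against an assumed $300 ECM vs. $50 induction motor price gap:
print(f"Simple payback: {(300 - 50) / saved_usd:.0f} years")
```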
Agreed. It's like asking why we need a delicate CVT when we have a reliable 9- or 10-speed automatic.
 
To be sure, I'm not saying it's the greatest thing since sliced bread. Just that I've wondered about it, what with the possibility of tying home solar and home batteries to the good number of home devices that already do, or could, run on DC directly.

In the end, everything runs as a system, and it's the overall system efficiency that matters, including the cost to the consumer (or business).
 
I got my elbow on 480 VAC three-phase once and survived. That was enough to get my attention. I was working alone on an X-ray heart cath lab and the X-ray tube rotor chassis swung in and got me. I had Navy electronics training, and they had all sorts of cautions about having someone nearby to pull the plug if you got attached to DC.
 
That's one big thing with AC vs. DC: getting shocked by AC at least allows the muscles to quiver, which makes it easier to get free of the shock. Straight DC just causes a clamp-down that doesn't let go.

AC is also easier to convert between voltages: just throw in an appropriately sized transformer and take whatever clean output voltage you need. AC transformers are 98%+ efficient and relatively cheap. DC requires either inverters and transformers or some other form of DC-DC conversion, and pretty much all of them introduce noise into the system that needs additional filtering for clean power output. DC-DC converters are also less efficient (80-95%) and more expensive. Less efficient also means they generate more heat and may require active cooling, which further reduces their efficiency.
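The heat difference is easy to quantify from those efficiency figures; the 2 kW load is an assumption, and the converter efficiency is taken from the middle of the range above:

```python
# Heat dissipated by the conversion stage itself at a given load.
load_w = 2000  # W, assumed load
for name, eff in (("AC transformer", 0.98), ("DC-DC converter (mid-range)", 0.88)):
    print(f"{name}: {load_w / eff - load_w:6.1f} W of heat at {load_w} W out")
```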
 
While superconductors that can operate at hot summer outdoor temperatures are still a lot of wishful thinking, if such superconductors ever become available at a low enough cost per unit of distance, then low-voltage DC distribution will take on a whole new light. But such conductors may be as long in coming as cold fusion; in fact, it would be interesting to see which comes into common existence first. By the time either exists, it is likely none of us will still be around.

If home solar power ever becomes cheap enough that it becomes common, then that too may shine a new light on low-voltage home DC power becoming somewhat common.
 
How do you step down a DC voltage unless you use a large, inefficient, power-robbing rheostat to do so? What constitutes a variable load, and how do you control it?
SCR or MOSFET current controllers.
 
When a voltage is applied across a thyristor, no current flows, because neither transistor is conducting and there is no complete path across the device. If a small current is passed through the gate electrode, it turns on transistor TR2. This pulls the collector of TR2 down toward the voltage on its emitter, i.e. the cathode of the whole device, which causes current to flow through the base of TR1 and turns TR1 on. TR1 in turn pulls its collector toward its emitter voltage, causing current to flow in the emitter of TR2 and maintaining its on state. In this way it only takes a small trigger pulse on the gate to turn the thyristor on; once switched on, the thyristor can only be turned off by removing the supply voltage.

The operation of the thyristor considered in this way is relatively straightforward to understand.

So how do you turn off an SCR in a DC line?
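The quoted two-transistor model boils down to a latch, and that framing answers the question: the gate can only turn the device on; the anode current has to fall below the holding level to turn it off. A toy Python state model of that behavior (not SPICE; the holding current is an assumed value):

```python
# Toy latch model of a thyristor: gate current can set the latch,
# and only anode current below the holding level clears it.
def thyristor_step(latched: bool, gate_pulse: bool, anode_a: float,
                   holding_a: float = 0.05) -> bool:
    if anode_a < holding_a:
        return False               # regenerative loop starved -> turns off
    return latched or gate_pulse   # gate pulse latches; latch then self-holds

state = False
for t, (gate, i_anode) in enumerate(
        [(False, 1.0), (True, 1.0), (False, 1.0), (False, 0.0), (False, 1.0)]):
    state = thyristor_step(state, gate, i_anode)
    print(f"t={t}: gate={gate!s:>5}, anode={i_anode} A -> {'ON' if state else 'OFF'}")
```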
 
I don't think you'd use a thyristor/SCR in the first place. A MOSFET, or if necessary, an IGBT.

One can also use FETs in place of diodes, in a synchronous converter, so as to get rid of the diode loss. The MOSFET gate of course requires control, so the controller is more complicated, and it's not zero loss, as the gate current is real. [Well, it's capacitive, so it's not "real," but you get the idea!]
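To see why the synchronous FET beats the diode, compare conduction losses. The forward drop and on-resistance below are typical-looking, assumed datasheet values:

```python
# Conduction loss: a diode dissipates Vf * I; a synchronous MOSFET, I^2 * Rds(on).
# Vf and Rds(on) are assumed, datasheet-style values.
I = 20          # A, load current
Vf = 0.5        # V, Schottky forward drop
Rds_on = 0.005  # ohms, MOSFET on-resistance
print(f"Diode:  {Vf * I:.1f} W")            # 10.0 W
print(f"MOSFET: {I ** 2 * Rds_on:.1f} W")   # 2.0 W, plus gate-drive overhead
```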
 
If you ground the gate well enough (use a transistor with a low enough C-to-E voltage drop in saturation), you can pull enough of the internal, self-feeding gate current out of the device to cause it to turn off. That is essentially what gate turn-off (GTO) thyristors are designed for; a plain SCR generally stays latched until its anode current falls below the holding current.

But, as Supton has posted, there are other devices that will work for that application.
 
I only have limited knowledge of power electronics, but most of what I know tells me that switching from high to low voltage with minimal loss requires some sort of FET (maybe not MOS, but another design), some sort of PWM to reduce loss, and a capacitor/inductor to filter out noise and buffer the output.

Also, many electronics these days have their own low-power, always-on circuit that controls the rest of the high-power circuitry, which has adjustable voltage and frequency and may be turned off during idle/sleep. The same could be done with a high-to-low-voltage conversion controller: a small always-on device converts a little bit of the high voltage to power itself, and its control switches the main, big power rail.
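What's being described is essentially a buck converter. Here is the textbook duty-cycle and ripple arithmetic; the input/output voltages, switching frequency, and inductor value are assumed for illustration:

```python
# Ideal buck converter: D = Vout / Vin; inductor ripple dI = (Vin - Vout) * D / (L * f).
# Operating point and component values are assumed.
Vin, Vout = 380.0, 12.0  # V
f = 100_000              # Hz, switching frequency
L = 100e-6               # H, inductor

D = Vout / Vin
ripple = (Vin - Vout) * D / (L * f)
print(f"Duty cycle: {100 * D:.1f}%")                    # about 3.2%
print(f"Inductor ripple: {ripple:.2f} A peak-to-peak")  # about 1.2 A
```

In practice a single-stage 380-to-12 V buck is awkward at a ~3% duty cycle, which is why real supplies tend to use isolated or multi-stage topologies, but the PWM-plus-LC principle is the same.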
 
Also, there is a problem with running a switching power supply from another switching power supply. The first one has to have an extra-large filter capacitor, or a battery, on its output to handle the pulsed current draw of the second switcher it is supplying.

There was a thread here on BITOG more than a year ago where someone was trying to run a switcher from a switcher and it would not work, even though the output of the first one was rated for more than the RMS draw of the second. The pulse draw of the second exceeded the amount of current the first could supply, so the first could not maintain voltage; the second therefore did not get a high enough input voltage and would not work.

In other words, if somewhere up the feed to the DC plug in the house there were a switcher reducing the voltage, and it was feeding a switcher in something like a computer, then the first one would need extra-large output filter capacitors.
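The usual fix is sized with the hold-up equation C = I * dt / dV; the pulse current, pulse width, and allowable sag below are assumptions for illustration:

```python
# Bulk capacitance needed to ride through a downstream switcher's pulse draw.
# All operating numbers are assumed for illustration.
I_pulse = 8.0  # A, peak draw of the downstream switcher
dt = 20e-6     # s, pulse width (about one switching period)
dV = 0.5       # V, sag the downstream input can tolerate
C = I_pulse * dt / dV
print(f"Need at least {C * 1e6:.0f} uF of low-ESR bulk capacitance")  # 320 uF
```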
 