110 or 220

If you show me a 60V washing machine I’ll show you a fuse the size of France.

The safety issue of making everything 220V concerns me. Aside from the fact that you need better-quality wiring for 220V appliances, it’s also far easier to electrocute yourself. It’s more than twice as dangerous - at 220V, if you grab a live appliance with your hand, the current can lock your grip so that you won’t be able to open your hand to let go.

I also have some doubts about how 220V can save energy. The savings would mainly be in the wires running from the transformer on the power pole to the house, but for most homes this is a very short distance, so the savings will be negligible.

As for the appliances themselves, 220V only makes sense for high-power needs like in air-conditioners and water heaters - low-power appliances like fluorescent lights and computers work well at 110V. In fact, computers require a built-in power supply which steps down the voltage to 12V and 5V (you’d instantly fry your silicon chips at 110V). Laptop computers use even lower voltages internally (I believe it’s 3.3V).

Most solar photovoltaic and small-sized wind power generators operate at 12V, though 110V is available. With a 12V system you can use automotive and RV appliances - I’ve actually thought of doing this to run my computer and some lights. Wouldn’t be practical to run a refrigerator, let alone an air-conditioner or water heater, but otherwise I’m a fan of low-voltage and low power appliances. And in the very near future, energy is going to become THE issue.

peakoil.com

cheers,
DB

In addition to the 110

I still remember when they changed the voltage in Belgium - mid-to-late 60s or early 70s, I think.

I know my grandmother still had one device running off a transformer a couple of years ago. That’s what people did back then: use a transformer until they bought a new appliance.

The same happened with the change from bottled gas to piped gas. Every gas appliance needed to be converted.

In Belgium at this moment, they are already thinking of upgrading to 480V. We already have 220-240V two- or three-phase plus earth and 360V two- or three-phase plus earth, and now 480V three-phase plus earth. If I remember correctly, the earth connection has to be less than 10 ohms.

I know that many new houses in Belgium are wired with 5-24V circuits for home automation (domotics) purposes or for lighting on 12-24V. The mains would be 220 or 380V.

It’s going to be a pain at first but within 5-10 years most things are going to be replaced anyway.

Oh yeah, I forgot: the main gas supply pipe of a new house has to be connected to earth too.

…the only problem regarding gas and electricity in Belgium is that these companies/institutions are not talking to each other, or making complete MAPS when they open up the ground to lay a new pipe or cable.

It resulted in more than 20 deaths a few months ago when a major gas pipe exploded… :blush:

SOLAR energy in 2025. H2O in our cars. Ban the gas & fuel !

Hope someone from Greenpeace is reading this ! :slight_smile:

and … in the 60s people didn’t have that many electric appliances … TV was just going mainstream in Belgium. Refrigerators weren’t that widely available or used, cooking or heating with electricity was rare, and air-con? Never heard of it.

and … for the past 15-20 years in Belgium you have needed a special automatic switch installed that shuts off as soon as there is a leak to ground.

Especially useful when you come into contact with the plugs or wires in the ‘wet rooms’.

I think they trip at 30 mA. They must be installed on all power outlets in the kitchen, bathroom, laundry room, basement, outside, garden, etc…

110V is fine if you want to wire a house with huge cables and use enormous fuses. Very few people get electrocuted in Taiwan, but plenty die in fires caused by the ridiculous currents that crappy Taiwanese wiring is asked to handle, together with the cavalier use of fuses and total disregard for any kind of grounding.

The set-up in Taiwan, using 220 for air-con mixed with 110, is probably rather unique, isn’t it?
No wonder there are so many fatalities due to wiring issues.

But one good thing about it: you need to use different plugs for each voltage.

True, but you shouldn’t be able to get into this kind of situation in the first place. Appliances should be designed so that they cannot become live, and ELCBs/RCDs also provide protection against this kind of scenario.

See my calculation on the first page; there is a small difference but probably not enough to make a huge impact in practice.
The main purpose of using a higher voltage is that you can use smaller wires but deliver the same amount of power.
[220V circuits in Germany are wired with 1.5mm² while 3-phase circuits (380V) connect via 2.5mm² to the main distribution.]

Considering that houses will have more and more appliances, the total power consumption increases, in which case 220V or even a 3-phase supply will be an advantage. And since the power supplies of computers (and laptops) can usually handle both 110 and 220V, I think this is not an important issue; more important is that you cater for the worst case, i.e. the equipment that requires lots of power (like the mentioned water heaters, air-conditioners, etc.).
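To make the wire-sizing point concrete, here is a small Python sketch. The copper resistivity, the 1.5 mm² cross-section and the 10 m run length are illustrative assumptions, not figures from this thread:

```python
# Sketch: why a higher voltage lets you use thinner wires.
# Assumptions: copper wire, 1.5 mm2 cross-section, a 10 m run.
RHO_CU = 1.68e-8  # ohm*m, resistivity of copper at ~20 C

def line_loss(power_w, voltage_v, area_mm2, length_m):
    """Return (current, round-trip wire resistance, power lost in the wire)."""
    r = RHO_CU * (2 * length_m) / (area_mm2 * 1e-6)  # out and back
    i = power_w / voltage_v
    return i, r, i ** 2 * r

# The same 2200 W load on the same wire, at the two mains voltages:
for v in (110, 220):
    i, r, loss = line_loss(2200, v, 1.5, 10)
    print(f"{v} V: {i:5.1f} A over {r:.3f} ohm -> {loss:5.1f} W lost in the wire")
```

Doubling the voltage halves the current and cuts the wire loss to a quarter, which is why the thinner wiring suffices on 220V circuits.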

No more worries about 110 or 220 V !!

One solution for everybody! Took some time to study the case. :notworthy:

datv.de/Verschiedenes/Schuko-Gardena.jpg

[quote=“ceevee369”]No more worries about 110 or 220 V !!

One solution for everybody! Took some time to study the case. :notworthy:

datv.de/Verschiedenes/Schuko-Gardena.jpg[/quote]
Water-power, eh? :smiley:

I’m from the US, where we have 110. I live in Taiwan, which is 110.

Ergo, 110 is just right for me!

But if I had my druthers, the whole world would be a standard 220.

I voted 220V for the reason Rascal already explained in his math session. You may think it is not much for devices consuming “less” power, but then again think of how many of those devices are out there. Please remember that even industry is bound to these voltages: they run lots of their equipment on 200V at maximum, where it would be 380V in, let’s say, Germany.

Dog’s breakfast, please don’t confuse the voltages used inside devices with those used for power distribution. Mainboard designers have a hell of a time getting the core voltages distributed to a “modern” CPU (P4E, anyone?) without much loss (due to the exorbitant currents). If they made a mistake in their design, your beautiful new P4/whatever would bid farewell on first boot-up with an even more beautiful meltdown.

And for the “safety” part: are there any statistics on how many fires were caused by overheated electrical installations? That would be interesting. When I was working in Taichung, a company across the street once had a fire caused by their electrical installation. The first thing our boss did was order everyone to unplug electric devices that weren’t needed and only plug them in when we actually used them. Great! That is exactly what creates the small contact resistance between plug and socket that is needed to “warm things up” a bit…

So: Installations aren’t safe now, it’s just an “unsafety” different from what we would have with 220V, but at least we would save some power and would perhaps not need to talk about a fourth nuclear power plant.

I’m not an electrical engineer, but I’ve made the observation that transformers throw off a lot of waste heat (which means wasted power). And the greater the change in voltages the more heat is produced. For example, I’ve noticed that when I plug my laptop into a 220V socket, the transformer brick gets much hotter than when it’s plugged into 110V. So I believe that these devices are wasting less power when running on 110V than on 220V. They’d waste even less if plugged into a 12V power source. Indeed, you can buy 12V transformer bricks for laptops - these barely get warm in use.

Of course, you’re not getting something for nothing. The power that enters your house comes off the power pole via a big transformer that supplies your whole building (in an apartment) or whole neighborhood (in a more rural area). The longer the wires between your house and the big transformer, the more power is wasted in transit. Higher voltages are less wasteful for transmitting over long distances. So the savings I get in my laptop transformer by using low voltage could be wasted in the transmission wires.
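A rough sketch of that trade-off, assuming a made-up wire resistance per metre and a 60 W laptop load. This is first-order only: it ignores the fact that a large voltage drop would also reduce the power the load actually draws.

```python
# First-order sketch of feed-wire loss vs. distance and voltage.
# The 0.01 ohm/m wire resistance and 60 W load are made-up numbers.
def loss_fraction(power_w, voltage_v, wire_ohms):
    """Fraction of the load power dissipated in the feed wires."""
    i = power_w / voltage_v
    return i ** 2 * wire_ohms / power_w

for distance_m in (5, 20, 50):
    r = 0.01 * distance_m             # assumed wire resistance for the run
    f12 = loss_fraction(60, 12, r)    # 60 W laptop fed with 12 V DC
    f110 = loss_fraction(60, 110, r)  # same load fed with 110 V
    print(f"{distance_m:3} m run: 12 V loses {f12:.1%}, 110 V loses {f110:.2%}")
```

Low-voltage DC only stays efficient over very short distances, which matches the long-distance argument above.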

[quote=“dl7und”]
And for the “safety” part: Are there any statistics telling how many fires were caused by overheated electric installations? That would be interesting.[/quote]

I wasn’t thinking in terms of fire, I was thinking in terms of getting electrocuted. Have you ever touched a live 220V wire? It more than hurts - it can really kill you. This is much less likely to happen with 110V. Years ago I had a job repairing pinball machines - those things ran 50V and I’ve been shocked many times, but it’s not too painful. And 12V is so little that you can’t even feel it.

In Taiwan, lots of electrical fires are caused by crap wiring and overloaded sockets. The electrical code here is a joke. Aside from the lack of a third ground wire, there’s the fact that most rooms have no more than one power outlet on the wall, which means running lots of plug adaptors and extension cords along the floor. The exposed wires often become frayed, occasionally chewed by rats or pets, and wet from mopping the floor - a recipe for disaster.

[quote=“dl7und”]
So: Installations aren’t safe now, it’s just an “unsafety” different from what we would have with 220V, but at least we would save some power and would perhaps not need to talk about a fourth nuclear power plant.[/quote]

I’m not convinced that switching to 220V would make much of a dent in power consumption. I wish I had some figures to back me up, but I don’t. However, I can easily believe that Chen Shuibian’s administration is touting this as the solution to the fourth nuke plant, just like the dozen windmills Taipower erected (with a combined daily power output equal to one taxi ride in Taipei). I’d love to see some hard figures - if switching to 220V would save 10% across the board, then yes, it would be worthwhile and I’d be all for it. If it would only save 1%, then no, it’s pretty much a public relations exercise.

Now if you think I’m really opposed to 220V power, you’re misjudging me. For sure, 220V has its place, like for water heaters, air-conditioners, industrial uses (like arc welding), etc. And that’s really not a problem here in Taiwan - 220V is readily available. As far as I know, every air-conditioner and electric water heater sold in Taiwan today already runs on 220V (I’ve looked in the stores). My own home has both 220V and 110V outlets. However, since I don’t have an air-conditioner or electric water heater, the only thing I’m using 220V for is my water pump.

Better yet - if you really want to save power, get rid of the air-conditioner, wear shorts and a t-shirt, and plug in an electric fan. That’s what I do, and my power bills are consistently low. Even better would be to stick some 12-volt photovoltaic panels on the roof, and attach them to a bank of 12-volt batteries and power-sipping 12-volt appliances. I intend to do that, but it will wait until after I move (I’m not throwing any money into my current dumpy house). I also want to replace my power-hungry desktop computer with a VIA EPIA mini-itx (mini-itx.com). And while I’m on this soapbox, people should get rid of their SUVs and overpowered cars, and get a small-engined subcompact or motorbike (or bicycle?). Then we’ll be talking about some real energy savings, not symbolic minuscule amounts.

peace,
DB

That was probably a linear power supply with a badly designed transformer. Toroidal transformers are more efficient, but also more expensive. With switching power supplies, however, you shouldn’t have much loss anymore - if they are designed and used properly (they have something like an optimum power output, where they are most efficient).

Well, this mostly applies to linear power supplies. If you power something with 12V but only need 5V in the circuitry, at 3A, then your heatsink has to burn off (12-5)*3 = 21W.
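That linear-regulator arithmetic, as a tiny sketch:

```python
# A linear regulator burns off the whole voltage difference as heat.
def linear_reg_heat(v_in, v_out, amps):
    """Watts dissipated in the regulator/heatsink."""
    return (v_in - v_out) * amps

print(linear_reg_heat(12, 5, 3), "W")  # the 21 W example from the text
```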

While a switching power supply allows wider voltage ranges, it is most efficient when it works at fixed input and output voltages with a defined static load.

Sorry, now it’s getting technical, but I hope you stay with me. Here is some introduction to SPS. Sorry, I couldn’t find anything better as a short intro; if you have more time to spend, National Semiconductor has excellent datasheets with lots of explanations. Let’s only look at step-down designs (input voltage higher than output) for now:

Simply put, an SPS takes a higher voltage and charges a capacitor with it until the voltage on the capacitor reaches the desired level, then switches the input voltage off. Since some load is draining power out of that capacitor, its voltage will go down and the input voltage will be switched on again. SPSs always have some ripple on their output voltage due to this, which needs to be “designed away”. This switching happens pretty fast: simple designs work at 50kHz, more expensive ones even exceed 1MHz.

So, it doesn’t really matter how high the input voltage is, as long as the SPS can process it. A lower input voltage is actually not very good for an SPS, you can try to design one here, just enter your desired voltages.
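For the idealized step-down case described above, the average output voltage is (to a first approximation) the input voltage times the switching duty cycle; real designs add ripple and switching losses, and the input voltages below are just illustrative DC values:

```python
# Idealized buck (step-down) converter: in continuous conduction the
# average output voltage is roughly input voltage * duty cycle.
def buck_duty_cycle(v_in, v_out):
    """Fraction of each switching period the input stays connected."""
    return v_out / v_in

for v_in in (48, 19, 12):
    print(f"{v_in:2} V in -> 5 V out: duty cycle {buck_duty_cycle(v_in, 5):.3f}")
```

This is why the input level matters so little: the controller simply adjusts the duty cycle to hold the output steady.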

But again, I was talking about distribution of electricity, not operation voltages of devices.

I completely agree with everything except the last sentence; here you have again mixed up distribution and operating voltages. Every piece of wire has its resistance. Since P=U*I, and therefore I=P/U, a higher voltage means less current over the same line for the same power - and since the line loss is I²*R, halving the current cuts the loss to a quarter. This is the magic behind high-voltage overland lines. Your laptop power supply was probably just optimized for 110V, although it can also operate on 220V. If you get a power supply made only for 220V (but with the correct output voltages), you may find it produces less heat than your current one. That’s just a design problem.

[quote][quote=“dl7und”]
And for the “safety” part: Are there any statistics telling how many fires were caused by overheated electric installations? That would be interesting.[/quote]
I wasn’t thinking in terms of fire, I was thinking in terms of getting electrocuted. Have you ever touched a live 220V wire? It more than hurts - it can really kill you. This is much less likely to happen with 110V. Years ago I had a job repairing pinball machines - those things ran 50V and I’ve been shocked many times, but it’s not too painful. And 12V is so little that you can’t even feel it.[/quote]

I know, I’ve touched more than 220V. I once had the “pleasure” of touching the wrong point in a Yaesu FT-220’s linear amplifier stage - while it was powered up, of course… There was a time when I was taught that 48V and below are considered “safe”. However, a very important part of getting electrocuted is the current that flows, which is determined by your body’s resistance. Just try touching some metal parts on an ordinary PC while you’re sweating…

So, we can get electrocuted even now, with 110V. That’s why I wouldn’t want to only double the voltage, but also replace the current plugs and sockets - and introduce the idea of “grounding” devices. They can do it, they do it all the time when they produce for the European market.

But back to the fire. In Rascal’s example there were 25W and 100W of “wasted” energy. Since we were taught at school that energy can’t be “produced” or “disappear”, but only change its form, that energy must have gone somewhere: it has become heat. And the big question here is how much power that is and how widely it is “distributed”.

If a whole extension cord is warmed up by 25W, it may get warm, but the insulation will hopefully not melt. (Hopefully…) If those 25W are concentrated in a socket in the wall, however, then that socket may very well start a fire. I have a soldering iron running at 30W. Would somebody like to hold the tip while it’s powered on?

And this is the thing that really frightens me often. When I see how much power is distributed through the cheapest extension cords and how people unplug high power devices each time they don’t need them because they think it’s “safer” - oh dear…

Yes, and due to what I wrote above, the situation now is not much “safer” than it would be with 220V on the same equipment.

OK, let’s do some math. For simpler calculations I’ll use these devices: a lamp at 55W, an oven at 1100W and an A/C at 2200W. On 110V these devices need the following currents to deliver those powers:

  • lamp: 0.5A
  • oven: 10A
  • A/C: 20A

Now, since our wiring is not yet superconducting, we will have some resistance from the wiring and connectors between the “big” transformer (in the building or nearby) and our device. Let’s assume there is only one ohm of resistance - what happens then? We’re “losing” energy on the line.

Two things actually happen here: the resistor (the wiring) causes a voltage drop depending on the current drawn (U=I*R - more current, more loss), and it heats up according to the voltage difference (the “lost” voltage) times the current drawn.

Voltage drop

  • lamp: 0.5V
  • oven: 10V
  • A/C: 20V

Power to dissipate

  • lamp: 0.25W
  • oven: 100W
  • A/C: 400W

This is also the power your device lacks. The oven gets only 1000W instead of 1100W and the A/C only 1800W instead of 2200W, thanks to the distribution system’s resistance. Now imagine you had 2 ohms of resistance…

The same calculations with 220V.

Currents

  • lamp: 0.25A
  • oven: 5A
  • A/C: 10A

Voltage drop

  • lamp: 0.25V
  • oven: 5V
  • A/C: 10V

Heat to dissipate (or “lost energy”)

  • lamp: 0.0625W
  • oven: 25W
  • A/C: 100W

Got it? :loco:
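For anyone who wants to check the numbers, the whole worked example can be reproduced in a few lines of Python (same assumption of 1 ohm line resistance):

```python
# The worked 110 V vs. 220 V example, reproduced as a script.
R_LINE = 1.0  # ohm of wiring/connector resistance, as assumed in the text

devices = {"lamp": 55, "oven": 1100, "A/C": 2200}

for name, watts in devices.items():
    for mains in (110, 220):
        i = watts / mains        # current drawn at rated power
        drop = i * R_LINE        # voltage lost along the wiring
        heat = i ** 2 * R_LINE   # power dissipated in the wiring
        print(f"{name:4} @ {mains:3} V: {i:5.2f} A, drop {drop:5.2f} V, "
              f"line loss {heat:8.4f} W")
```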

:smiley: I have a few installed at my place, but I never use them.

Sorry, but that way you would waste a lot of the energy you gained from the sun. Just calculate a few devices as above, but with a 12V supply: you need very short, thick wires if you don’t want to waste too much. Otherwise use an inverter to 110/220V.
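To put a number on “short and thick”, here is a sketch that computes the minimum copper cross-section needed to keep wire losses under an arbitrary 5% (copper resistivity assumed; the 100 W load and 5 m run are made-up figures):

```python
# How thick do 12 V wires have to be? Minimum copper cross-section
# keeping wire losses under a 5 % target (all numbers illustrative).
RHO_CU = 1.68e-8  # ohm*m, resistivity of copper

def min_area_mm2(power_w, voltage_v, length_m, max_loss_frac=0.05):
    i = power_w / voltage_v
    r_max = max_loss_frac * power_w / i ** 2  # largest tolerable resistance
    return RHO_CU * (2 * length_m) / r_max * 1e6

for v in (12, 110):
    print(f"{v:3} V, 100 W load, 5 m run: >= {min_area_mm2(100, v, 5):.3f} mm2")
```

At 12V the same load needs roughly (110/12)² times the copper of a 110V feed, hence the short, thick wires.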

:smiley: I’m running a few of those, a C3@800MHz from their first series is working here and here. I’m still working on a plan to get my internet access completely powered by the sun…

That is correct. Transformers have an efficiency of around 90%, i.e. 10% gets wasted as heat.
Actually all electrical equipment wastes some energy - some more, some less.

Laptop power supplies are typically switching-mode, not transformer-based. Mine also “burns” at 110V, btw; it outputs 19V / 4.74A (=90W). The maximum input current is stated as 1.5A, which makes 165W at 110V - so in the worst case it could produce up to 75W of heat.
At 220V we would only need half the input current; after all, the output requires only 90W, and so the power consumption does not depend on the voltage alone but rather on the product of voltage and current (=power).
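The arithmetic from the paragraph above, spelled out. Note that the 1.5 A figure is the supply’s maximum rated input current, so the heat figure is a worst-case bound, not the steady-state dissipation:

```python
# Worst-case heat budget of the laptop supply described above.
v_out, i_out = 19, 4.74        # rated output: 19 V at 4.74 A
p_out = v_out * i_out          # ~90 W delivered to the laptop
p_in_max = 110 * 1.5           # 165 W at the rated maximum input current
print(f"output {p_out:.2f} W, worst-case heat {p_in_max - p_out:.2f} W")
```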

Using a 12V supply may reduce the losses somewhat, but how do you get the 12V power source in the first place? If the 12V is derived from your mains supply, you are merely shifting the conversion from your laptop (its own power supply adapter) to another place (e.g. a transformer in the mains distribution).
Of course you could derive the 12V from batteries or even a solar panel, but don’t forget that you will still have cable losses, possibly causing a voltage drop such that the resulting voltage at the outlet is no longer sufficient. And not all equipment runs on exactly 12V.
Thus the only viable solution is to use AC power distribution and transform it to the specific requirements of each piece of equipment.

Yeah, a few times. And I live to tell the tale. :wink:

I had my first experience when I was two or three years old and found a screw on the carpet which fitted nicely into the wall outlet - and “lucky” as I am, I hit the live contact on the first attempt. Ouch.

No real disagreement here, but most solar panel systems do run on 12V or 24V DC. You can’t directly connect the solar panels to the house’s 110/220V electrical wiring because solar is DC, and (conventional) houses run AC. You’d have to use a power-wasting inverter to get 110V or 220V. You can buy quite a few 12V appliances (thanks to the car and RV industries), but forget about running your air-conditioner on 12V.

A little bit of info gleaned from solar vendors…

Solar (photovoltaic) panels range in size from 5 watts to nearly 200 watts. Most solar panels up to 120 watts are 12 volt. Many of the 120-watt and higher PV panels are 24 volt - and a few are special “odd” voltages, such as 30-40 volts, mainly designed for grid-tie applications, such as the Kyocera KC167. Those panels CAN be used in battery-charging systems when using the Outback Power MX60 charge controller, but not other charge controllers.

store.solar-electric.com/solarpanels.html

More good stuff here…

absak.com/alternative-energy … power.html

cheers,
DB

Yes. It was in the Apple Daily a while back. They had a little graph (of course) but the statistics came from the government, or some central collection agency. Wish I could find it. One of the biggest causes of accidents in Taiwan.

Why is everyone so frightened of getting electrocuted? Do people in America get electrocuted a lot? How? I really don’t understand. I’ve never heard of anyone getting electrocuted, except from my dad who was an electrician and used to get zapped all the time.

Seriously though. Electrocuted? Really? Where? At home? :astonished: