Prathamesh Posted January 28, 2012
I want to know the significance of the number "230" for the general household electric supply. Why "230" and not anything else?
TonyMcC Posted January 28, 2012 (edited)
No particular reason as far as I know. Here is a list of different countries and the mains voltages they use: http://en.wikipedia.org/wiki/Mains_electricity_by_country By and large, the higher the voltage, the lower the power loss in the connecting wiring, but the lower the voltage, the lower the risk of electrocution. Edited January 28, 2012 by TonyMcC
John Cuthber Posted January 28, 2012
There were two systems. One was based on 240 volts, because people like things in dozens and, more practically, it is a multiple of 1, 2, 3, 4, 5, 6, 8, 10, 12 and so on. The other was based on 220 V; this dates back to the days when supplies were not well built and could easily lose up to 10%, so for a nominal 100 volt supply at the user's end of the cable they added 10% to give 110 volts at the generator's end. Two of those in anti-phase gave a nominal 220 V system. The first was adopted in the UK and the second in much of Europe. When they came to harmonise the two they split the difference and came up with 230.
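A quick numerical sketch of the anti-phase arrangement described in that post (Python, purely illustrative; the 100 V nominal and 10% allowance are taken from the post itself):

```python
# Split-phase ("anti-phase") arithmetic from the post above.
user_end = 100.0                  # nominal voltage at the user's end of the cable
generator_end = user_end * 1.10   # add 10% for line losses -> 110 V at the generator

# Two 110 V legs driven 180 degrees apart: at every instant the legs are equal
# and opposite, so the voltage measured between them is twice one leg's voltage.
between_legs = 2 * generator_end  # -> a nominal 220 V system

print(round(generator_end), round(between_legs))  # 110 220
```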
TonyMcC Posted January 28, 2012
Quoting John Cuthber: "There were two systems, one based on 240 volts ... they split the difference and came up with 230."
I'm sorry, but the following link gives a completely different "history": http://en.wikipedia.org/wiki/Mains_electricity I have only heard the reasons you state for choosing 240 in respect of the number of pennies in the old British pound.
timo Posted January 28, 2012
And of course the anonymous guy who wrote the Wikipedia article is much more credible than the anonymous guy who posts as "John Cuthber" on SFN.
TonyMcC Posted January 28, 2012
Quoting timo: "And of course the anonymous guy who wrote the Wikipedia article is much more credible ..."
Of course. It is possible that the anonymous guy who posts as TonyMcC, and who is now retired after a working life (including teaching) in electricity and electronics, may not have come across the given reasons for 240 being used for mains supplies.
John Cuthber Posted January 28, 2012
OK, so wiki says "Edison selected 100 volts for the lamp as a compromise between distribution costs and lamp costs. Generation was maintained at 110 volts to allow for a voltage drop between generator and lamp." which is essentially the same reason I gave. And it gives no explanation for 240 V, so it certainly doesn't state or imply that I was wrong. So it's completely different in the sense of being the same.
TonyMcC Posted January 28, 2012 (edited)
Quoting John Cuthber: "And it gives no explanation for 240 V, so it certainly doesn't state or imply that I was wrong."
All I can definitely say is that I have never heard this explanation, so I cannot say whether you are right or wrong. I would have expected to have come across it if you are correct. A reference showing that you are correct would earn an apology from me. Edited January 28, 2012 by TonyMcC
John Cuthber Posted January 28, 2012 (edited)
OK, so what you said was "I'm sorry, but the following link gives a completely different "History":- http://en.wikipedia....ins_electricity" but the link doesn't give a different story at all: it gives the same story for 110 and no story for 240. As for "I would have expected to have come across it if you are correct": perhaps you just did. You came across it here.
Some (I think most) high-voltage transmission and industrial power in the UK is distributed at multiples of 11 volts (such as 11 kV and 132 kV). One of the voltages available industrially is 440 V three-phase. I think that would give about 254 V from phase to neutral, so that's not it. The fact remains that if I measure the mains voltage it will be pretty close to 240 V, and someone must have decided on that, even though it made for slightly odd ratios in the transformers when they stepped it down from 11 kV. Edited January 28, 2012 by John Cuthber
TonyMcC Posted January 28, 2012
Quoting John Cuthber: "The fact remains that if I measure the mains voltage it will be pretty close to 240 V, and someone must have decided on that ..."
OK, let's get back to square one: what is the significance of 230 V? Depending on the country, the question might have been asked using any of the values 110 V, 115 V, 120 V, 127 V, 220 V, 230 V or 240 V. Because of this variation I deduce the value chosen is somewhat arbitrary. It seems to have been chosen at various times to suit the technology of the time. What I doubt is that some authoritative body said "We choose 240 V because it divides by so many factors". As I have said, I may be wrong, but if so I am surprised that I have never come across this statement before.
John Cuthber Posted January 28, 2012
"OK, let's get back to square 1 - What is the significance of 230V?"
It's halfway between 220 and 240. The UK used 240 and most of the rest of the EU used 220, so they decided to compromise. They changed the nominal figure (http://www.legislation.gov.uk/uksi/1994/3021/regulation/4/made) but the tolerance is such that they didn't actually have to change the voltage.
The true origins of the UK's preference for 240 are probably lost in history but, fundamentally, it was an arbitrary choice. They chose 240. Possibly because they were born on the 20th of December and they multiplied the numbers together. Perhaps they consulted their astrologer, but I think it's most likely to be because it's an easy number to do arithmetic with. It makes calculation of currents and resistances easier.
Incidentally, the numbers you show (110 V, 115 V, 120 V, 127 V, 220 V, 230 V, 240 V) can be explained as nice round numbers, 110% multiples of them (with or without a factor of root 3 for three-phase systems), a political fudge, and 240 V or half of it. 100 or 200 V would be nice round numbers, and they are arbitrary.
110 is what you get by adding 10% to 100.
115 is what you get between a phase and neutral on a 200 V three-phase system.
120 is another number with lots of factors; it's half of 240.
127 is the voltage between a phase and neutral on a three-phase 220 V system.
220 is 110% of 200.
230 is a political fudge.
240 has lots of factors. It is the number of old pennies in a pound, or the number of hours in as many days as people have fingers. It's a score of dozens. Those are not coincidental. In the days when you had to do arithmetic the hard way, there was a lot to be said for numbers with many factors. What other reason can you give for 240?
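The relationships listed in that post are easy to check numerically. A minimal Python sketch (the groupings are the ones given in the post, not an official derivation):

```python
import math

def divisors(n):
    """All whole-number divisors of n."""
    return [d for d in range(1, n + 1) if n % d == 0]

# The relationships claimed in the post above.
print(100 * 1.10)          # ~110   -> 100 plus a 10% allowance for line drop
print(200 / math.sqrt(3))  # ~115.5 -> phase to neutral on a 200 V three-phase system
print(240 / 2)             # 120.0  -> half of 240
print(220 / math.sqrt(3))  # ~127   -> phase to neutral on a 220 V three-phase system
print(200 * 1.10)          # ~220   -> 110% of 200
print((220 + 240) / 2)     # 230.0  -> the compromise between the UK and continental values
print(divisors(240))       # 240 has 20 divisors: 1, 2, 3, 4, 5, 6, 8, 10, 12, 15, 16, ...
```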
TonyMcC Posted January 28, 2012 (edited)
Quoting John Cuthber: "It is the number of old pennies in a pound or the number of hours in as many days as people have fingers. ... What other reason can you give for 240?"
The best I can come up with is that the national grid was standardised at 132 kV by the 1926 Electricity (Supply) Act, and this does have 240 as one of its factors. But I can't easily find why that value was chosen, and I don't intend to spend money trying to get copies of minutes of meetings from archives (if they are available). As I say, I have never heard or read of the factor argument being a reason for its choice, and in my line of work and experience I would have expected to. I have to wonder whether you have heard or read of this, or whether it just seems a reasonable idea to you. If you have heard or read of this I would like details, please. By the way, I believe present-day systems step down to domestic supplies along the lines of 400 kV, 275 kV, 132 kV, 33 kV, 11 kV, 230 V. Some pretty strange ratios there. Incidentally, I am pleased that when I had a gliding accident about 35 years ago I "only" collided with 11 kV lines! Edited January 28, 2012 by TonyMcC
ewmon Posted January 29, 2012
Keep in mind that 230 V AC is an rms value derived from the peak voltage: Vrms = Vpeak/√2, so Vpeak would be 230·√2, or about 325 V (roughly 650 V peak to peak). Also, 230 V AC being the single-phase (phase-to-neutral) value, the phase-to-phase value on a three-phase supply would be √3 times more, which is about 398 V AC. Perhaps the phase-to-phase value was/is the standard: if it were exactly 400 V, then the single-phase value would be 400/√3, or about 230.94 V.
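A quick check of that arithmetic (Python; the exactly-400 V case is the post's own hypothetical, not a statement about any particular standard):

```python
import math

v_rms = 230.0

v_peak = v_rms * math.sqrt(2)            # ~325.3 V peak
v_peak_to_peak = 2 * v_peak              # ~650.5 V peak to peak
v_phase_to_phase = v_rms * math.sqrt(3)  # ~398.4 V between phases on a three-phase supply

# If the phase-to-phase value were defined as exactly 400 V instead:
v_phase_to_neutral = 400 / math.sqrt(3)  # ~230.94 V phase to neutral

print(round(v_peak, 2), round(v_peak_to_peak, 2),
      round(v_phase_to_phase, 2), round(v_phase_to_neutral, 2))
# 325.27 650.54 398.37 230.94
```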
Suxamethonium Posted January 29, 2012
This reply is fairly specific to Australian systems, but I'm sure the principle is the same. I have clearly defined what I can source, what I have speculated and what I have heard; take it as such.
OK, I have sourced (Electrical Wiring Practice Vol. 1, ISBN 007471052-4) that Australian generators produce electricity in three phases (and a neutral is taken) and that these phases are 2π/3 radians out of phase. Originally Australia was on a 240 V neutral-to-phase / 415 V phase-to-phase system but has since moved to 230/400 V. Australian generators typically produce 11 kV, 17 kV, 22 kV or 23 kV outputs, which are typically stepped up to 66 kV, 132 kV and 330 kV lines (particularly in the primary grid, although some states have 500 kV systems) to cover distance efficiently. (A sketch of the ratio arithmetic follows below.)
Now a bit of speculation: from these high voltages, I'm assuming it is likely that the transformers have close to whole-number ratios (like 1/600 or 1/500), which in Australia results primarily in about 220 and 240 V, leading the average to be about 230 V (+10%/-6%). This in itself is obviously inconclusive, but when it comes to 110 V or 220 V systems, it was probably just an arbitrary decision made by each country. The values themselves are probably just a result of using whole ratios in transformers; seeing as the generators produce voltage in multiples of 11, it is not surprising the rest of the system follows suit.
Finally, call it a rumour: I also heard that many countries that went with the 240/415 or 230/400 systems did so because a greater voltage was available from the phase-to-phase outlet (as opposed to 110/230 systems) without needing to step up for high-demand industrial appliances. It is also more efficient transporting this along the low-voltage distribution grid. I don't think it was at all influenced by safety; they are both as likely to be lethal as each other.
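For what it's worth, the whole-ratio speculation in that post is easy to play with numerically. A sketch under that speculative assumption (Python; the specific ratios below are hypothetical, chosen only to show the arithmetic, not taken from any wiring standard):

```python
# Illustrative only: secondary voltages from a 132 kV feeder with a few simple
# integer step-down ratios, per the speculation in the post above.
primary = 132_000
for ratio in (500, 550, 575, 600):
    print(f"132 kV / {ratio} = {primary / ratio:.0f} V")
# 132 kV / 500 = 264 V
# 132 kV / 550 = 240 V
# 132 kV / 575 = 230 V
# 132 kV / 600 = 220 V
```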
insane_alien Posted January 29, 2012
In the early days there were all sorts of voltages about, depending on your local power station or company (there was no standardised grid). There was probably a lot of lobbying and bribing when the time came to standardise, so the figure was likely chosen by the company with the most money to throw around when the government stepped in. Why they chose 240 V? Who knows. Maybe it was their lucky number.
TonyMcC Posted January 29, 2012
Whilst we are speculating, I imagine an important factor examined in the 1926 Electricity (Supply) Act was efficiency, if the UK was to have a standard countrywide connected system. The higher the voltage used on the transmission lines, the lower the loss due to heating of the lines. One of the considerations was likely to be just how high they could go; I would think that insulation between the lines and the metal pylons may have been a limiting factor. This of course brings in the need to consider Vpeak for the values given in Vrms, as mentioned by ewmon: e.g. the insulators for the 132 kV system would have to withstand peak voltages of approximately 187 kV. Perhaps someone decided on a 200 kV limit as a maximum. Perhaps they started with the maximum they considered safe and worked their way down. Perhaps, starting from this basis, they then decided on a number with many factors, as mentioned by John Cuthber. Who knows?
Suxamethonium Posted January 29, 2012
Quoting TonyMcC: "Whilst we are speculating, I imagine an important factor examined in the 1926 Electricity (Supply) Act was efficiency ..."
They still use such high voltages to transport the electricity over distance, but they were never used for a GPO (our general-purpose outlets would glow from the corona effect and leak electricity). I don't think there is any reason for the maximum high voltage to have influenced the GPO voltage, other than maybe similar multiples, which I suggested may be based on whole, rounded transformer ratios. For example, nationwide Australia uses the 230/400 V AC system at GPOs and three-phase outlets, but in different states 500 kV or 330 kV is used on the high-voltage transmission lines.
Prathamesh (author) Posted January 29, 2012
Thank you, people, for all your replies. So finally we conclude that the reason for choosing 240 V for the domestic supply is quite arbitrary (or, if there is some specific reason, it has been lost in history).
AKHIL NAIR Posted May 8, 2012
I think I got your query; if not, please let me know. The following is what I understood: the 240 V is the per-phase voltage. The transformer at the substation steps 33/11 kV down to 415 V, which is the three-phase (phase-to-phase) value, and the per-phase value is then calculated as the three-phase value divided by the square root of 3, which is approximately 240 V. Moreover, our household supply uses this per-phase value.
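A short check of that division (Python), for both the 415 V and 400 V phase-to-phase figures mentioned in the thread:

```python
import math

# Per-phase value from the three-phase (phase-to-phase) value, as described above.
print(415 / math.sqrt(3))  # ~239.6 V per phase on a 415 V three-phase supply
print(400 / math.sqrt(3))  # ~230.9 V per phase on a 400 V three-phase supply
```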
tomgwyther Posted May 8, 2012
Speculation: maybe it makes (or maybe used to make) billing easier for the electricity company. That is, if the electricity meter, for some reason, is only capable of measuring amps and you need to know how many watt-hours are being used, a constant-ish 240 V reconciles a decimal currency with a 24-hour day quite nicely. I reckon the accountants had something to do with it.
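Taken literally, the idea is that a constant assumed voltage lets the meter reduce energy to a simple current total. A minimal sketch of that arithmetic (Python; the current readings are invented for illustration):

```python
# Hypothetical meter that can only log average current per hour, with the voltage
# assumed to be a constant 240 V (the post's premise). The readings are made up.
assumed_voltage = 240.0
hourly_amps = [2.0, 1.5, 0.5, 4.0]   # average current drawn in each hour

watt_hours = sum(assumed_voltage * amps * 1 for amps in hourly_amps)  # V * I * t
print(f"{watt_hours:.0f} Wh = {watt_hours / 1000:.2f} kWh")  # 1920 Wh = 1.92 kWh
```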