T'internet & Energy consumption?


pzu
20th Feb 2013, 16:08
Last night (Tues 19 Feb 2013) I was watching 'Press Review' on Sky News - the point of discussion was the upcoming (2018-20) power shortage in the UK, due to various reasons (gas prices, wind farms, late delivery of the nuclear option and the reduction in coal-powered stations).

One of the 'pundits', Jonathon Maitland, raised internet usage as a major factor in future energy consumption - any comments?

As an aside he threw in the 'factoid' that a Google search was equivalent to boiling a kettle!!!

PZU _ Out of Africa (Retired)

axefurabz
20th Feb 2013, 16:43
As an aside he threw in the 'factoid' that a Google search was equivalent to boiling a kettle!!!

Hmmm, bloody small kettle!

G-CPTN
20th Feb 2013, 16:49
BBC NEWS | Technology | 'Carbon cost' of Google revealed (http://news.bbc.co.uk/1/hi/7823387.stm)

west lakes
20th Feb 2013, 17:04
The load of the individual computers and screens is an issue. Not in the home, but we had a few cases where schools were installing large numbers of them "to keep up".

The load of over 100 devices in a school is significant on top of what they were using already. Two local secondary schools had to increase their supply capacity as a result!

mike-wsm
20th Feb 2013, 18:18
Modern computing devices use less power - when did you last burn your hand on your tablet? And the small amount of power that is used is helping to heat the building, so you have to reckon that saving into the equation.

I once visited IBM at Winchester and they told me the waste heat from the dp center was collected and used to heat the buildings. An interesting example of 'heating by thought'.

G-CPTN
20th Feb 2013, 18:25
When the IBM System 7 (http://en.wikipedia.org/wiki/IBM_System/7) mainframes were installed (1970s), they required air-conditioned rooms to counteract the heat generated.

AlpineSkier
20th Feb 2013, 19:07
Wasn't that normal for all mainframes back then?

G-CPTN
20th Feb 2013, 19:16
I never saw the earlier mainframe (CALL 360), but when they moved to System 7 a new computer 'suite' was built.

Dushan
20th Feb 2013, 21:09
I once visited IBM at Winchester and they told me the waste heat from the DP center was collected and used to heat the buildings. An interesting example of 'heating by thought'.

Mike :ok: I haven't heard this term in 20 years.

Dushan
20th Feb 2013, 21:12
Wasn't that normal for all mainframes back then?

The IBM 360 up to the Model 50 was air-cooled; after that it was water-cooled.

BigEndBob
20th Feb 2013, 22:33
Biggest problem is the number of flat-screen tellies.
Years ago a household had one CRT of about 80 to 100 watts.
Now there are 2 or 3 plasmas/LCDs pulling 120-180-odd watts,
of which one might be connected to a computer (150-200 W).
Don't forget all these new-build flats, probably electrically heated.
Still, not to worry, I won't be able to read my electricity bill because of ruined eyesight from low-wattage energy-saving light bulbs!

Windy Militant
20th Feb 2013, 23:03
Tain't the lappies or tablets that need the leccy, it's the server farms!
Takes a whole lot of leccy to grow servers, it does.

Also it takes a fair bit of oomph to fire the signals down the line!
Your particular packet of info might only need milliamps, but remember there are a few billion others whizzing about!

Used to look after the cooling plant for a Cray X-MP; off the top of my head it had a couple of tons of cooling. I know one thing: if the compressors tripped out for any reason you had twenty minutes to restart them. After that the cold-well temperature rose above the limit set to trigger the auto-controlled crash shutdown sequence that stopped the core from suffering a meltdown. :eek:

The IBM 3090 was water-cooled; we had one of them as well.

mixture
20th Feb 2013, 23:24
One of the 'pundits', Jonathon Maitland, raised internet usage as a major factor in future energy consumption - any comments?

I have not seen his comments, but his Wikipedia page suggests he is a mere journalist with zip experience in the IT or telecoms industry.

I would put his comments down as utter nonsense. There isn't going to be a vast increase in energy consumption caused by internet usage, because systems are only going to become more efficient, not less. For example, in the last 10 years there has been a great increase in the adoption of virtualisation technology, where you deploy virtual server instances rather than throwing new physical servers at the problem... so you end up with fewer, larger, but more efficient servers rather than hundreds of smaller, less efficient ones. Wholesale-level communications are pretty much 100% fibre these days, and you can play clever tricks with fibre to transmit quantities of data many times the capacity of even the best copper cable... more capacity per cable means fewer cards, fewer routers/servers, etc.

On a per-user basis, more energy is probably used by the users themselves, who are using older equipment, or, in the case of businesses, less energy-efficient, long-established network setups that are harder and more complicated to replace than desktops and servers.

As an aside he threw in the 'factoid' that a Google search was equivalent to boiling a kettle!!!

Over-egging the pudding, I would suggest. Google are quite careful about their PUE figures, and given the sheer number of searches I don't think it boils down to a kettle per search!

Erwin Schroedinger
21st Feb 2013, 06:05
I did a Google search, enquiring as to the energy used during a Google search.

September 2011 article:

6 Things You’d Never Guess About Google’s Energy Use | TIME.com (http://techland.time.com/2011/09/09/6-things-youd-never-guess-about-googles-energy-use/)

One Google search is equal to turning on a 60W light bulb for 17 seconds

Google says it spends about 0.0003 kWh of energy on an average search query, translating to roughly 0.2g of carbon dioxide. Related fact: searching the web 100 times is equivalent to drinking 1.5 tablespoons of orange juice, Google says. That’s hard work!
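
A quick back-of-the-envelope comparison makes the point - the kettle assumptions (a litre of water heated from about 20 degC to boiling) are mine, not the article's:

# Rough sketch: energy for one Google search versus boiling a kettle.
# The kettle figures (1 litre, 20 degC start) are illustrative assumptions.
search_kwh = 0.0003                              # Google's figure per average search query
search_joules = search_kwh * 3.6e6               # 1 kWh = 3.6 MJ -> about 1,080 J

bulb_joules = 60 * 17                            # 60 W bulb for 17 seconds -> 1,020 J, matching the article

mass_kg, c_water, delta_t = 1.0, 4186, 80        # 1 litre of water, J/(kg.K), heated 20 -> 100 degC
kettle_joules = mass_kg * c_water * delta_t      # about 335,000 J (~0.093 kWh)

print(f"One search: ~{search_joules:.0f} J, 60 W bulb for 17 s: ~{bulb_joules} J")
print(f"Boiling a 1-litre kettle: ~{kettle_joules:.0f} J "
      f"(~{kettle_joules / search_joules:.0f} searches per kettle boil)")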

yotty
21st Feb 2013, 06:44
BigEndBob - we got a new Panasonic TV lately, and reading from the "EU energy consumption label" the 47" LED/LCD has a consumption of 69 watts and an average yearly consumption of 95 kWh. I don't expect the same reliability as our old CRT though! :)
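
As a rough check on how those two label figures relate (the assumed daily viewing time is my inference, not something printed on the label):

# Sketch: viewing hours per day implied by a 95 kWh/year figure for a set drawing 69 W in use.
# Standby consumption is ignored here.
on_power_w = 69
annual_kwh = 95

hours_per_year = annual_kwh * 1000 / on_power_w   # ~1,377 h
hours_per_day = hours_per_year / 365              # ~3.8 h, close to the ~4 h/day the EU label calculation assumes
print(f"{hours_per_year:.0f} h/year, i.e. about {hours_per_day:.1f} h of viewing per day")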

probes
21st Feb 2013, 07:01
Well, one has to pay for something anyway - PC, paper or TV (and it seems to me young people prefer to watch things on their puters, and when you're working on one you won't watch TV). Wouldn't producing and disposing of the batteries needed for all the different gadgets (phones, pads, etc.) be a bigger issue in the long run?

mike-wsm
21st Feb 2013, 07:35
Electricity supplies are already doomed; there isn't time to build enough power stations to meet future demand. So keep those old-fashioned paper books to read when the power's off and all those batteries have run down. :p

MG23
21st Feb 2013, 09:42
Tain't the lappies or tablets that need the leccy, it's the server farms!
Takes a whole lot of leccy to grow servers, it does.

Yeah, my new gaming PC takes about 200W at the wall when playing games, and about 50W when web browsing. Our servers use about 200W when idle.

On the other hand, if you can replace a hundred desktop PCs with one big server and a hundred tablets, you've probably saved a few kilowatts.
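
A rough sketch of that last claim, taking the desktop and server figures above and assuming about 10 W per tablet:

# Sketch: replacing 100 desktop PCs with one big server plus 100 tablets.
# Desktop (~50 W browsing) and server (~200 W) figures are from the post above;
# the ~10 W per tablet is an assumption for illustration.
desktops_w = 100 * 50                       # 5,000 W
server_plus_tablets_w = 200 + 100 * 10      # 1,200 W

saving_w = desktops_w - server_plus_tablets_w
print(f"Before: {desktops_w} W, after: {server_plus_tablets_w} W, saving: ~{saving_w / 1000:.1f} kW")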

Sunnyjohn
21st Feb 2013, 09:58
Electricity supplies are already doomed; there isn't time to build enough power stations to meet future demand.
Not quite true, Mike. In the UK, at least, the government can't find anybody willing to fund them. Now if anybody would like to buy an Arran sweater, I do a nice line in these at . . .

MG23
21st Feb 2013, 10:22
Not quite true, Mike. In the UK, at least, the government can't find anybody willing to fund them.

Hasn't the British government committed itself to reducing carbon emissions by 80% before 2030, or something silly like that?

If so, it's not hard to see why no-one would want to invest billions in new fossil-fuel power stations that will take decades to pay back their construction costs.

vulcanised
21st Feb 2013, 11:57
Fear not!

HMG have matters in hand and are saving us all from the darkness by commissioning ever more windmills.

wings folded
21st Feb 2013, 14:04
The real drain on the leccy is not the computer; it comes from those, visited by the insomnia fairies, who get up at 3 in the morning, put on the lights and brew a cup of tea while they PPRuNe.

arcniz
21st Feb 2013, 14:17
On a per-user basis, more energy is probably used by the users themselves, who are using older equipment, or, in the case of businesses, less energy-efficient, long-established network setups that are harder and more complicated to replace than desktops and servers.

Don't believe it!

One has been designing computers in the industry - CPUs in particular, but whole systems oftentimes as the practical requirement - for donkey's years - forty and some, with a decade of warm-up practice before that.

Yer highly advanced computer mainframe of 1970 would solve serious real-world problems in half a dozen computer languages for four or five hundred people all working at the same time (each able to talk over dial-in phone lines at the furious rate of 10 or maybe 30 bytes per second). The physical thing was, with supporting peripherals, a clump of 6 or 12 or 18 refrigerator-size boxes, plus a half-dozen or more washing machines for various peripherals. Input power, for wiring purposes, was around 125-150 kilowatts for the computing equipment, and the same again, or a bit more, for the lights, air conditioning, etc. that were necessary adjuncts. The standby power system, to keep it going when mains power dropped out, was a pair of full-size commercial truck trailers, one with a massive motor-generator in it, the other with a good many gallons of diesel fuel. That was kept running 24x7, since the thing depended on a few tons of rotating mass to be able to pick up the loss of line power within a fraction of a cycle. A non-operating backup for the two semi-trailers was usually necessary as well, just in case.

Performance of the computer system was scraping, using every trick known, to hit one million 32-bit instructions executed per second (one MIPS). That was the practical state of the art, pretty much the best of the best, in the early 1970s. Purchase cost for the computing-center equipment described - just the computer, not counting the room, the AC, the building, the wiring, or the backup power system, nor any of the substantial staffing and operational costs - was on the order of $2.2 million US (at a time when $32 would purchase an ounce of gold internationally, giving a net equivalent cost of 68,750 ounces of gold, or about $110 million at current rates). Those systems were greatly in demand, with delivery waiting times of 3 to 6 months after an order, IF a prospective buyer could get a delivery commitment.
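
That gold arithmetic checks out, assuming a price of roughly $1,600 per ounce as the 'current rate' in early 2013 (my assumption):

# Sketch: converting the 1970s purchase price via gold, as in the paragraph above.
# The ~$1,600/oz 'current' gold price is an assumption for early 2013.
price_1970_usd = 2_200_000
gold_1970_usd_per_oz = 32
gold_2013_usd_per_oz = 1_600                 # assumed

ounces = price_1970_usd / gold_1970_usd_per_oz        # 68,750 oz
equivalent_2013 = ounces * gold_2013_usd_per_oz       # ~$110 million
print(f"{ounces:,.0f} oz of gold, roughly ${equivalent_2013 / 1e6:.0f} million at 2013 prices")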

Needless to say, the systems weren't easy to build, being state-of-the-arty and all. Many components, many connections, very difficult to test, etc. They also weren't very reliable in service. Despite careful, conscientious designs, there were just so many components operating at top performance in a large system that failures were common. If one had ten million parts, connections, etc. per system, and the mean time between failures averaged a million hours in service per component, then there was sure to be a lot of fixing going on after a couple of years of use.
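
Taking those round numbers at face value, a quick sketch shows why the fixing never stopped:

# Sketch: aggregate failure rate for a machine of ~10 million parts and connections,
# each with a mean time between failures of ~1 million hours.
# These are the illustrative round numbers from the paragraph above, not measured figures.
parts = 10_000_000
mtbf_per_part_hours = 1_000_000

failures_per_hour = parts / mtbf_per_part_hours        # ~10 component failures per hour of operation
mean_hours_between_failures = 1 / failures_per_hour    # ~0.1 h between failures somewhere in the machine
print(f"~{failures_per_hour:.0f} component failures per hour "
      f"(one every ~{mean_hours_between_failures * 60:.0f} minutes, on average)")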

The software operating system that ran this better-than-typical large mainframe computer system, the kind of engine that an important corporation or university might have as its showcase main computing tool, was about 2.5 megabytes of compiled code in total, including program compilers for half a dozen languages. Memory was bulky and very expensive - magnetic cores knitted by hand into large arrays by patient ladies in Puerto Rico - with 128K bytes being sufficient to support the entire process as described, or 256K per system for the really big spenders.

The US Space Program, into the mid-80s, was designed and executed with computers comparable to, or less powerful than, the one described above.




fin

Always on the lookout for a bargain at the trailing edge of the year's tech-product marketing cycle, I recently acquired a Dell laptop of undistinguished sort that was on sale and met my need of the moment. In some ways it is cheaply put together (yet entirely fit for purpose), and so is unlikely to withstand long heavy-travel use, but it still has much more capability, in a much more favorable and usable configuration, than did the 70's mainframe just described:

It is a completely self-contained portable computer, with an internal battery that will, in principle, run it for four or five hours when the power goes out. The total idle operating mains power requirement, as measured, is only about 15 watts for the computer, a 17-inch high-res display, and all the desirable interfaces and peripheral features of the current time. Performance, at a guess, is three MIPS (several times faster than the ancient mainframe described above); it weighs about as much as a grown cat (clunky for a contemporary laptop, but lighter than air compared to our 10+ ton 70's mainframe). Memory capacity and disk storage are thousands of times larger than on the old mainframe, and much, much faster.

Cost for this marvel, quantity 1, was about $500 US, the price of roughly three tenths of an ounce of gold at today's rate. Odds are that it will work flawlessly for a couple of years, living an easy life, then develop some minor flaw, and hence will join the stacks and stacks of old and older computers in the hall just outside my retreat-office door, held in patient, optimistic reserve for some sort of computer-shortage emergency unlikely to occur over the waning duration of my shift.

--- So, the bottom line now is that one can easily get a new, cheap but capable laptop and a 12-watt LED light bulb, replace one old 100 W bulb (sending it on to Dr. Draper, of course), and then run them together for as long as patience allows, thereby having the equivalent of $100 million of 1970's computer plus some bright light, altogether at a net power saving of more than 50 percent over operating the old bulb by itself.
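
The closing arithmetic, as a quick sketch using the figures above:

# Sketch: laptop (~15 W idle) plus a 12 W LED bulb, versus the old 100 W bulb alone.
old_bulb_w = 100
laptop_w = 15
led_w = 12

new_total_w = laptop_w + led_w                                  # 27 W
saving_pct = 100 * (old_bulb_w - new_total_w) / old_bulb_w      # ~73%, i.e. 'more than 50 percent'
print(f"New total: {new_total_w} W vs {old_bulb_w} W, a saving of ~{saving_pct:.0f}%")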

probes
21st Feb 2013, 15:20
Well, that was thorough! :)
Makes one feel a lot better, for sure.

Otherwise, as for ...those, visited by the insomnia fairies, who get up at 3 in the morning, put on the lights and brew a cup of tea while they PPRuNe - I wanted to point out that it's not safe to wander in a pitch-dark house, as we all know now, so the lights have to be on. LED it has to be, then. :)

Sunnyjohn
21st Feb 2013, 19:09
LED it has to be, then. Indeed. I am told that they interfere with wireless transmission, but I have two wireless speaker systems running from my computer and the several LED lamps I have seem to affect them not one whit.

wings folded
21st Feb 2013, 19:18
so the lights have to be on. LED it has to be, then. :)

Takes a bloody long time to boil a mug of water for tea using LEDs.

Far better to take a huge slug of a decent single malt; it:

1 Fettles the insomnia
2 Uses far fewer kilowatts

but

3 Can lead to posts which are slightly difficult to decipher, even by the scribe himself, the next day

G-SCUD
23rd Feb 2013, 14:21
I'm going all misty-eyed now, thinking about how I lost my computing virginity to a DEC PDP-8 - the then almost incredibly tiny (about the size of a filing cabinet) 'minicomputer'.

How I loved our Switch Register foreplay, teaching her how to read before we got down and dirty with the Teletype. 'Deposit and Clear Accumulator', 'Skip if Non-zero Link' and the like were our sweet words of tenderness...

Anyone else remember PDP-8 or -11?

UniFoxOs
23rd Feb 2013, 14:43
thereby having the equivalent of $100 million of 1970's computer

Come on, Arcniz, you know that ain't so. Yer 1970s mainframe had fewer MIPS, maybe, but many of the instructions were a hell of a lot more powerful than a PC's (move a big block of memory from one place to another with one instruction?), and they (well, the ones I worked on) had autonomous "channels" that would take care of many peripherals at the same time while the CPU was crunching. And your PC won't keep a few dozen time-sharing users going at the same time, either.

Still good value comparatively, though, but you're comparing a Mini with a Roller.

UFO

WhatsaLizad?
23rd Feb 2013, 15:03
Recently purchased a 60" widescreen LED-lit TV.

Max bright daylight viewing uses around 140 watts.

Minimum dim setting at night, 40 watts.

Average setting, decently lit, is around 70-80 watts.


Less than that single 100-watt bulb that I've seen burning by itself in the middle of the Amazon, without another light within a 200-mile radius :}