I'm talking about electricity costs here, assuming it's a fairly recent PC and that it's on 24 hours a day.
Thanks Djohn. So that's about £180 per year - more than I thought.
I wonder if any of the chip manufacturers have thought of applying to full-size PCs the power-saving schemes used for laptops, e.g. slowing down the clock when nothing particularly demanding is happening, but automatically returning to full speed when required. Or perhaps they already do this.
I guess that flat-panel screens use less power than CRT-based ones too.
It looks like improved technology, with faster processors, made computing more expensive and less environmentally friendly (more power consumed = more CO2 produced by power stations), but perhaps new power-saving technologies might reverse this.
You can get a rough guess from your PSU rating (and yes, I know they don't run at the limit). Say you have a 250 watt PSU in your PC: in 4 hours that would use 1 kWh (one unit of electricity), so about 6 units in 24 hours. Then add the monitor (say 50 watts, or 20 to 30 watts for a TFT), plus say 10 watts for the modem and another 10 watts if you have a router, and extra again if you leave the printer switched on. So it can get quite expensive over a year or so, which is why my PC system, including cable modem and router, is ALL switched off when not being used. Some of the more expensive PSUs will regulate power according to the system requirements, and these are more economical to use. The foregoing does not take into account standby conditions.
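The arithmetic above can be sketched in a few lines of Python. The unit price used here (7p per kWh) is purely an illustrative assumption, not a quoted tariff, so substitute your own rate:

```python
# Rough annual cost of running a PC system around the clock, following the
# arithmetic in the post above. The unit price is an assumption for illustration.

PRICE_PER_KWH_PENCE = 7.0  # assumed unit price; substitute your own tariff

def annual_cost_pounds(watts, hours_per_day, price_per_kwh_pence=PRICE_PER_KWH_PENCE):
    """Cost in pounds of running a load of `watts` for `hours_per_day`, all year."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh_pence / 100

# The example system from the post: 250 W PSU + 50 W CRT + 10 W modem + 10 W router
system_watts = 250 + 50 + 10 + 10
print(round(annual_cost_pounds(system_watts, 24), 2))  # roughly £196 a year at 7p/kWh
```

That lands in the same ballpark as the £180-a-year figure mentioned earlier, which is why switching everything off when not in use adds up.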
Not a lot.
My PC is on for 18 hours a day running at 100% full load on the processor, and I reckon it adds £6 a month to the electricity bill.
BTW, a CRT monitor draws a lot more than 50W; that's more like what it draws on standby. Maybe 200 or 300W for a CRT.
Hi there The Sack, yep you have surprised me. My two CRT monitors are well tucked away with their manuals, but I have just been on the Belinea website and they give the power for a 17" CRT monitor as "ON" 120 watts, "SLEEP" 3 watts. They give exactly the same figures for a 19" CRT monitor. My 28" widescreen JVC TV lists power usage as maximum 168 watts, average 115 watts, standby 0.8 watts. With regard to that last figure, I wonder why the government Energywatch site tells you to turn off TVs when not in use rather than leaving them on standby!
£6 a month for full-on 18 hours daily seems a bit on the low side.
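As a rough sanity check on that figure, here is the monthly cost for a few plausible system wattages. Both the wattages and the 7p/kWh unit price are assumptions for illustration:

```python
# Sanity check: monthly cost of a PC run 18 hours a day at various wattages.
# The wattages and the unit price are illustrative assumptions, not measurements.

def monthly_cost_pounds(watts, hours_per_day=18, days=30, pence_per_kwh=7.0):
    """Approximate monthly electricity cost in pounds for a constant load."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * pence_per_kwh / 100

for watts in (100, 150, 250):
    print(watts, "W ->", round(monthly_cost_pounds(watts), 2), "per month")
```

At this assumed tariff, £6 a month works out to a continuous draw of roughly 160 W, which is believable for a processor of that era at full load, so the figure may not be as low as it first appears.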
I remember hearing about 10 years ago that a TV set left on standby could still consume up to something like 20W (presumably a particularly inefficient one), which doesn't sound very high, but if half the population were to do this, it amounts to quite a bit of power. Looks like the manufacturers have got their act together in the last few years and greatly reduced standby power.
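The "quite a bit of power" point is easy to make concrete. The household count below is an assumed round figure standing in for "half the population", not a real statistic:

```python
# Back-of-envelope: aggregate draw if millions of TVs each sit at 20 W standby.
# HOUSEHOLDS is an assumed round figure, not real data.

HOUSEHOLDS = 12_000_000   # assumed stand-in for "half the population"
STANDBY_WATTS = 20        # the inefficient standby figure quoted above

total_megawatts = HOUSEHOLDS * STANDBY_WATTS / 1_000_000
print(total_megawatts)  # 240 MW: trivial per set, a power station's worth in total
```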
I've got an Avometer connected where the fuse would go in a plug which serves a distribution board.
15" Fujitsu monitor: 94 mA on standby, 250 mA switched on
Computer (2 hard drives, Win98, 550 MHz Athlon): 500 mA on startup, including monitor
Epson printer switched off: 50 mA
Total while writing this: 550 mA
Dunno what you pay for electricity so you'll have to work that out for yourself. Just going to unplug the printer :)
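Those current readings can be turned into approximate power and running cost. Two caveats: the 240 V mains figure and the 7p/kWh price are assumptions, and multiplying current by voltage gives apparent power (VA) rather than true watts, so treat these as rough upper bounds:

```python
# Convert the ammeter's milliamp readings into approximate power and annual cost.
# Assumes 240 V UK mains and an illustrative 7p/kWh; current x voltage is apparent
# power (VA), not true watts, so these are rough upper-bound estimates.

MAINS_VOLTS = 240
PENCE_PER_KWH = 7.0

readings_ma = {
    "monitor standby": 94,
    "monitor on": 250,
    "whole system": 550,
    "printer switched off": 50,
}

for name, ma in readings_ma.items():
    watts = ma / 1000 * MAINS_VOLTS
    annual_pounds = watts / 1000 * 24 * 365 * PENCE_PER_KWH / 100
    print(f"{name}: ~{watts:.0f} W, ~£{annual_pounds:.2f}/yr if left on all year")
```

The 550 mA total works out to about 132 VA, so even a modest system left on permanently costs a noticeable amount over a year, and the printer's 50 mA while "switched off" is exactly the standby waste discussed above.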
Just to mention that leaving a TV on standby has been classed as the second largest cause of house fires, after the good old chip pan.
When TVs are left on standby they are still "live" in a sense, and so become far more dangerous (more likely to spark internally and cause a fire) than when they are actually on.
For this very reason, a fire officer recently advised (not that I do it anyway) that you should NEVER leave a TV on standby for any length of time.
A little out of character for the thread but seeing as it was brought up...