# Laptop/Electricity Usage

Survey Surfer 21:45 28 Oct 05
Locked

Does anyone know the amount of electricity a laptop uses when the battery is not being used?
I am wondering approximately how much it costs to run a computer several hours a day.

Regards,
Jezzy

i.tech 22:03 28 Oct 05

Try contacting the manufacturer - they should be able to give you a rough idea.

Fruit Bat /\0/\ 22:12 28 Oct 05

Approx power = 150 watts

7 hrs = approx 1 kWh

1 kWh costs about 8 pence

A typical family house uses about 72p per day

Your laptop on for 7 hours = about 8p
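Those figures can be checked in a few lines (a rough sketch, using the 150 W and 8p/kWh numbers from the post above):

```python
# Rough check of the figures above: 150 W for 7 hours at 8p per kWh.
power_watts = 150            # assumed worst-case laptop draw from the post
hours = 7
price_per_kwh_pence = 8

energy_kwh = power_watts * hours / 1000   # energy used, in kWh
cost_pence = energy_kwh * price_per_kwh_pence
print(round(energy_kwh, 2), "kWh,", round(cost_pence, 1), "pence")
```

So 150 W for 7 hours is actually 1.05 kWh, or about 8.4p - close to the 8p quoted.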

jz 22:12 28 Oct 05

It depends on the model and what programs you're running, but probably around 25 watts. Cost per day is:

Num hours used per day * Power in Watts * Price per kilowatt hour / 1000

Price per kilowatt hour is something like 7.5p.

So, if you use your laptop for 5 hours per day, it costs:

5 * 25 * 7.5 / 1000 = just under 1p per day (less than 4 pounds per year).
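That formula drops straight into code (a minimal sketch; the 7.5p/kWh price and 25 W figure are the assumptions from the post):

```python
def daily_cost_pence(hours_per_day, power_watts, pence_per_kwh=7.5):
    """Cost in pence of running a device each day."""
    return hours_per_day * power_watts * pence_per_kwh / 1000

daily = daily_cost_pence(5, 25)     # 5 hours a day at 25 W
yearly_pounds = daily * 365 / 100   # pence per day -> pounds per year
print(round(daily, 2), "p/day,", round(yearly_pounds, 2), "pounds/year")
```

Which confirms the figure: just under 1p per day, under 4 pounds per year.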

jz 22:14 28 Oct 05

Fruit Bat: I think that amount of power would be for a standard PC, not a laptop. The power for a standard PC would be higher than 150 watts if you have a very fast processor and a fast graphics card.

jz 22:28 28 Oct 05

I should explain that laptops are designed to use as little power as possible for good battery life between charges.

The circuitry is designed to use low power, but also to slow down if you're not doing much (eg typing into a word processor) so it uses less power. When the software is working hard (eg playing a computer game) the circuitry speeds up and uses more power. Even so, playing a game on a laptop uses less power than on a standard PC, since the circuitry is designed to minimise power use.

Fruit Bat /\0/\ 22:36 28 Oct 05

The laptop itself says something like 25 watts.

However, look at the power adaptor and calculate the maximum input power.

Watts = volts x amps

I was of course rounding everything up to get the maximum for when fully charging etc., and I agree I'm well over the top at 8p per day.
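As a quick sketch of the Watts = volts x amps calculation from an adaptor label (the 19 V / 3.42 A figures are made up for illustration; read yours off the adaptor):

```python
# Hypothetical adaptor label figures - substitute your own adaptor's rating.
volts = 19.0
amps = 3.42
max_watts = volts * amps   # the adaptor's rated maximum, not typical draw
print(round(max_watts, 1), "W maximum")
```

Note this is the rated maximum; actual draw is usually well below it.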

jz 22:42 28 Oct 05

Actually, Watts is only Volts x Amps for DC. For AC, something called 'Power factor' comes into play, so Watts is often quite a bit less than Volts x Amps.

Also, the input current rating of the power supply is often far higher than the average current it draws, since the supply sometimes has to provide higher current than average for short periods (eg when the hard disk is rattling away while the processor and other circuitry are working at their hardest).
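The AC relationship above can be sketched like this (the 230 V mains, 0.5 A and 0.6 power factor values are assumed examples, not measurements):

```python
def real_power_watts(volts, amps, power_factor):
    """Real (billed) power for an AC load; power factor is between 0 and 1."""
    return volts * amps * power_factor

apparent_va = 230 * 0.5                    # 115 VA from the label rating
real_w = real_power_watts(230, 0.5, 0.6)   # what the meter actually sees
print(apparent_va, "VA apparent,", round(real_w, 1), "W real")
```

With a power factor below 1, the real watts come out noticeably below volts x amps.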

Fruit Bat /\0/\ 10:39 29 Oct 05

"For AC, something called 'Power factor' comes into play, so Watts is often quite a bit less than Volts x Amps."

Most loads are resistive and watts = more than amps volts. For heavy loads, power factor correction is applied by inserting capacitors in the circuit to reduce the cost of power. Companies that use very heavy resistive loads may have massive banks of capacitors fitted across their supply, eg induction arc furnaces.

A good example of this in the home is the capacitor fitted across the supply to counteract the resistive load of the choke in a fluorescent light fitting.

spargo 12:47 29 Oct 05

Fruit Bat /\0/\
Slight correction, I think you mean 'Inductive' when you say 'resistive'.
A resistive load has a power factor of 1.0, so for a resistive load Volts X Amps = Watts.

jz 19:40 29 Oct 05

"watts = more than amps volts"

Watts is never more than Amps x Volts. It is exactly Amps x Volts for a resistive load (for which power factor is 1), but is less than Amps x Volts when power factor is less than 1. Power factor is always between 0 and 1.
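A quick numeric illustration of that bound (again using assumed 230 V / 0.5 A figures):

```python
# Assumed figures: 230 V mains and a 0.5 A current rating.
volts, amps = 230, 0.5
apparent_va = volts * amps            # 115 VA (volts x amps)
for pf in (0.0, 0.5, 0.8, 1.0):       # power factor is always 0..1
    real_w = apparent_va * pf
    assert real_w <= apparent_va      # real watts never exceed volts x amps
```

Only at power factor 1 (a purely resistive load) do the watts reach volts x amps.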

This thread is now locked and can not be replied to.
