Does anyone know the amount of electricity a laptop uses when the battery is not being used?
I am wondering approximately how much it costs to run a computer several hours a day.
Approx power = 150 watts
150 watts for 7 hrs ≈ 1 kWh
1 kWh costs about 8 pence
A typical family house uses about 72p per day
Your laptop on for 7 hours = 8p
It depends on the model and what programs you're running, but probably around 25 watts. Cost per day is:
Num hours used per day * Power in Watts * Price per kilowatt hour / 1000
Price per kilowatt hour is something like 7.5p.
So, if you use your laptop for 5 hours per day, it costs:
5 * 25 * 7.5 / 1000 = just under 1p per day (less than 4 pounds per year).
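As a sanity check, the per-day formula above can be sketched in Python, using the illustrative figures quoted in the thread (25 W, 7.5p per kWh, 5 hours a day):

```python
def daily_cost_pence(hours_per_day, power_watts, price_pence_per_kwh):
    """Cost per day in pence: hours * watts * price-per-kWh / 1000."""
    return hours_per_day * power_watts * price_pence_per_kwh / 1000

daily = daily_cost_pence(5, 25, 7.5)               # the laptop example above
print(f"{daily:.3f}p per day")                     # 0.938p per day
print(f"{daily * 365 / 100:.2f} pounds per year")  # 3.42 pounds per year
```

which agrees with the "just under 1p per day, less than 4 pounds per year" figure.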
Fruit Bat: I think that amount of power would be for a standard PC, not a laptop. The power for a standard PC would be higher than 150 watts if you have a very fast processor and a fast graphics card.
I should explain that laptops are designed to use as little power as possible for good battery life between charges.
The circuitry is designed to use low power, but it is also designed to slow down when you're not doing much (eg typing into a word processor) so as to use less power. When the software is working hard (eg playing a computer game) the circuitry speeds up, so it uses more power. Even so, playing a game on a laptop uses less power than on a standard PC, since the circuitry is designed to minimise power use.
The laptop itself says something like 25 watts.
However, look at the power adaptor and calculate the input power.
Watts = volts x amps
I was of course rounding everything up to get the maximum, for when fully charging etc., and I agree I'm well over the top at 8p per day.
Actually, Watts is only Volts x Amps for DC. For AC, something called 'Power factor' comes into play, so Watts is often quite a bit less than Volts x Amps.
Also, the input current rating of the power supply is often far higher than the average current used by the power supply, since the supply sometimes has to provide higher current than average for small amounts of time (eg when, simultaneously, the hard disk is rattling away and the processor and other circuitry are working at their hardest).
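The Watts-versus-Volts×Amps distinction can be sketched like this; the adapter figures (19 V, 3.42 A on the DC side, an assumed 0.6 power factor on the AC side) are made up for illustration, not taken from the thread:

```python
def real_power_watts(volts, amps, power_factor=1.0):
    """Real power consumed, in watts. For DC the power factor is 1, so
    watts = volts x amps; for AC, real power is apparent power
    (volt-amps) times the power factor, so it can be quite a bit less."""
    return volts * amps * power_factor

# DC output of a hypothetical laptop adapter: watts really is volts x amps
print(real_power_watts(19.0, 3.42))       # about 65 W
# AC mains input at an assumed power factor of 0.6
print(real_power_watts(230.0, 0.5, 0.6))  # about 69 W real, despite 115 VA apparent
```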
Power factor: lagging for resistive loads, leading for capacitive loads.
Most loads are resistive and watts = more than amps volts. For heavy loads, power factor correction is applied by inserting capacitors in the circuit to reduce the cost of power. Companies that use very heavy resistive loads may have massive banks of capacitors fitted across their supply, eg induction arc furnaces.
A good example of this in the home is a capacitor fitted across the supply to counteract the resistive load of a choke in a fluorescent light fitting.
Fruit Bat /\0/
Slight correction, I think you mean 'Inductive' when you say 'resistive'.
A resistive load has a power factor of 1.0, so for a resistive load Volts X Amps = Watts.
"watts = more than amps volts"
Watts is never more than Amps x Volts. It is exactly Amps x Volts for a resistive load (for which power factor is 1), but is less than Amps x Volts for power factor less than 1. Power factor is always between 0 and 1.
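Rearranged, power factor is just real power divided by apparent power, which is why it can never exceed 1. A minimal sketch (the 60 W / 230 V / 0.5 A figures are made up for illustration):

```python
def power_factor(real_watts, volts, amps):
    """PF = real power (W) / apparent power (VA); between 0 and 1 for any real load."""
    apparent_va = volts * amps
    return real_watts / apparent_va

pf = power_factor(60, 230, 0.5)  # a 60 W load drawing 0.5 A at 230 V
print(round(pf, 2))              # -> 0.52
```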