Laptop battery‑life numbers look great on the box because they’re measured in conditions you will never use in real life. Manufacturers run their tests in ultra‑controlled, ultra‑optimised scenarios designed to squeeze out the biggest number possible — not to reflect how people actually work.
Here’s what they typically do:
Manufacturers test at 30–50% brightness, sometimes even lower. In real life, students run at 70–100% because lecture halls, libraries, and classrooms are bright. Brightness is one of the biggest battery drains, so this alone inflates the numbers massively.
Battery tests often use:
A looping video
A lightweight browser script
Zero background apps
No antivirus scans
No Teams/Zoom
No OneDrive sync
No TTS or audio/recording software
No extensions
No multitasking
In other words: nothing like actual student usage.
Some tests minimise network activity or use a local file instead of real web browsing. But real students are constantly:
Streaming
Syncing
Downloading
Using cloud storage
Running online tools
Network activity hits the battery hard.
Manufacturers often enable:
Battery saver
Low‑power CPU states
Reduced refresh rates
Aggressive sleep timers
These settings make the laptop feel sluggish, so most people turn them off within the first week.
Batteries degrade from day one. The advertised number is from a fresh, factory‑new cell in a cool, controlled lab. Real batteries often lose 10–20% capacity in the first year.
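The arithmetic here is simple but worth making explicit. A rough sketch (the 15% loss figure is an assumed mid-point of the 10–20% range above, not a measurement):

```python
# Rough arithmetic sketch: first-year battery degradation shortens the
# advertised runtime proportionally, before workload is even considered.
# All numbers are illustrative assumptions, not from any datasheet.

advertised_hours = 12.0   # the "up to" figure printed on the box
capacity_loss = 0.15      # assumed mid-range first-year degradation (10-20%)

effective_hours = advertised_hours * (1 - capacity_loss)
print(f"Best-case runtime after one year: {effective_hours:.1f} h")  # → 10.2 h
```

So even a laptop that genuinely hit its lab number on day one would be down to roughly ten hours of best-case runtime after a year of normal charging cycles.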
When a laptop says “Up to 12 hours”, it usually means:
6–8 hours for normal use
4–6 hours for academic use
2–4 hours for Teams/Zoom calls
1–2 hours for TTS or recording/AI apps
Manufacturers know this — but “up to” lets them legally publish the best‑case scenario.
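One way to make the translation above concrete is to scale the marketing number by a workload factor. This is a hedged sketch: the factors below are back-derived from the ranges listed above (6–8 h, 4–6 h, 2–4 h, 1–2 h against a 12-hour claim), not measured values.

```python
# Hedged sketch: translate an "up to" battery claim into realistic ranges.
# The scaling factors are assumptions derived from the workload ranges
# above, applied to a 12-hour claim; they are not benchmark results.

WORKLOAD_FACTORS = {
    "normal use":       (0.50, 0.67),
    "academic use":     (0.33, 0.50),
    "Teams/Zoom calls": (0.17, 0.33),
    "TTS/recording/AI": (0.08, 0.17),
}

def realistic_range(advertised_hours, workload):
    """Scale the advertised figure down to a rough real-world range."""
    low, high = WORKLOAD_FACTORS[workload]
    return (advertised_hours * low, advertised_hours * high)

for workload in WORKLOAD_FACTORS:
    lo, hi = realistic_range(12.0, workload)
    print(f"{workload}: {lo:.0f}-{hi:.0f} h")
```

Feeding in a different advertised figure (say, an 18-hour claim) gives the same proportional shrinkage, which is the practical takeaway: discount the box number by half or more before comparing models.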
Students run the exact tasks that drain batteries fastest:
Teams/Zoom
Chrome with 10+ tabs
OneDrive sync
PDF readers
Accessibility tools
Background apps
High brightness
So a laptop advertised as “12 hours” often delivers 4–6 hours in real academic conditions.