
Video card vs Battery life - laptop

Options
My MSI laptop has two different video choices - a lower-spec Intel Iris Xe and a higher-spec Nvidia GeForce RTX 4060. I had been running on the high-spec one until recently, when I noticed the battery was not lasting as long as I expected.
Running on the Nvidia - battery lasts 1 1/4 hours
Running on the Intel - battery lasts 3 1/2 to 4 hours !!
Can anybody offer any advice as to why the massive difference?

If I was half as smart as I think I am - I'd be twice as smart as I REALLY am.

Comments

  • TadleyBaggie
    TadleyBaggie Posts: 6,633 Forumite
    Part of the Furniture 1,000 Posts Photogenic Name Dropper
    It's because the Nvidia graphics has much higher power consumption, because it has faster processors.
  • Rodders53
    Rodders53 Posts: 2,663 Forumite
    Part of the Furniture 1,000 Posts Name Dropper Photogenic
    The gaming graphics card is there to provide higher-performance graphics in game-playing use (and some applications can use the graphics card's processors to crunch things, I believe), thus taking the load off the main processors... but using energy (battery or mains power) to do so.

    The Intel card is on the same chip as the main processors and may show its limitations with lower detail or jerkier graphics in image-intensive use.
  • MouldyOldDough
    MouldyOldDough Posts: 2,687 Forumite
    1,000 Posts Third Anniversary Photogenic Name Dropper
    TadleyBaggie said: "It's because the Nvidia graphics has much higher power consumption, because it has faster processors."

    Yes, but I did not expect such a huge difference - surely the Nvidia should only use that power when it's being pushed hard, rather than just web browsing?

    If I was half as smart as I think I am - I'd be twice as smart as I REALLY am.
  • DullGreyGuy
    DullGreyGuy Posts: 18,613 Forumite
    10,000 Posts Second Anniversary Name Dropper
    MouldyOldDough said: "Yes, but I did not expect such a huge difference - surely the Nvidia should only use that power when it's being pushed hard, rather than just web browsing?"

    Nvidia's site says the card takes between 35 W and 115 W depending on how hard it's being pushed, whereas the Intel Iris varies a bit between processors but is somewhere between 7 W and 28 W - so even maxed out, the Intel draws less than the minimum required just to run the discrete card.

    This is why I can happily run my laptop on a 20 W charger for day-to-day use.
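    A rough sanity check on those numbers (a minimal sketch - the ~55 Wh battery capacity and the ~15 W draw for the rest of the system are assumptions, not figures from this thread):

```python
# Rough battery-life estimate: capacity (Wh) / total draw (W) = hours.
BATTERY_WH = 55.0      # assumed typical laptop battery capacity
BASE_SYSTEM_W = 15.0   # assumed draw of screen, CPU, RAM, etc.

def runtime_hours(gpu_watts):
    """Estimated runtime with a given GPU power draw."""
    return BATTERY_WH / (BASE_SYSTEM_W + gpu_watts)

print(f"RTX 4060 at its 35 W floor: {runtime_hours(35):.1f} h")  # ~1.1 h
print(f"Iris Xe at a light 7 W:     {runtime_hours(7):.1f} h")   # ~2.5 h
```

    With those assumptions the estimates land close to the 1 1/4 h vs 3 1/2-4 h the OP measured, so the discrete card's idle floor alone can explain most of the gap.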
  • facade
    facade Posts: 7,590 Forumite
    Part of the Furniture 1,000 Posts Name Dropper
    edited 17 June 2024 at 2:22PM
    The Nvidia cards use a lot of power.
    There is a utility called HWiNFO64 that you can download and run, which tells you everything about your computer.
    The sensor view shows GPU (and CPU) power if you scroll down far enough.
    I only have a 2060, but it won't be far off the 4060 - lower if anything.

    [screenshot of HWiNFO64 sensor readings]

    The maximum was when it was rendering a video; the current reading is just surfing t'interweb, no videos or anything, so the video card alone is 20 W!

    No wonder my electric bill is that of a small city...
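    If you'd rather poll that figure from a script than scroll through HWiNFO64, Nvidia's driver tool can report it too. A minimal sketch (assumes the Nvidia driver and its bundled nvidia-smi utility are installed and on the PATH):

```python
import subprocess

def parse_power(text):
    # With "noheader,nounits", nvidia-smi prints one bare number per GPU,
    # e.g. "20.15"; take the first GPU's reading.
    return float(text.strip().splitlines()[0])

def read_gpu_power_w():
    """Current Nvidia GPU power draw in watts, via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_power(out)

if __name__ == "__main__":
    print(f"GPU power draw: {read_gpu_power_w():.1f} W")
```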




    I want to go back to The Olden Days, when every single thing that I can think of was better.....

    (except air quality and Medical Science ;))
  • MouldyOldDough
    MouldyOldDough Posts: 2,687 Forumite
    1,000 Posts Third Anniversary Photogenic Name Dropper
    DullGreyGuy said: "Nvidia's site says the card takes between 35 W and 115 W depending on how hard it's being pushed, whereas the Intel Iris is somewhere between 7 W and 28 W - so even maxed out it draws less than the minimum required to run the discrete card."

    Thanks - that explains a lot.
    In future I will only use the Nvidia when running off the mains.

    If I was half as smart as I think I am - I'd be twice as smart as I REALLY am.
  • bob2302
    bob2302 Posts: 555 Forumite
    500 Posts Second Anniversary Name Dropper
    Personally I'd leave it off unless I actually needed it. Outside of games it probably isn't useful - just increasing your electricity bill.
  • Vitor
    Vitor Posts: 645 Forumite
    500 Posts First Anniversary Photogenic Name Dropper
    Launch MSI Center, go to the general settings and find the GPU switch, where you can switch between Microsoft Hybrid Graphics mode and discrete graphics mode. Select the mode you want and the system will ask you to reboot the laptop.
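    After the reboot you can sanity-check which adapters Windows actually exposes. A minimal sketch (assumes Windows with the deprecated-but-still-shipped wmic tool; in hybrid mode both the Iris Xe and the RTX 4060 should be listed, in discrete mode typically only the Nvidia card):

```python
import subprocess

def parse_adapter_names(wmic_output):
    # wmic prints a "Name" header line followed by one adapter per line,
    # padded with trailing spaces and blank lines.
    lines = [ln.strip() for ln in wmic_output.splitlines()]
    return [ln for ln in lines if ln and ln != "Name"]

def list_video_adapters():
    out = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "name"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_adapter_names(out)

if __name__ == "__main__":
    for name in list_video_adapters():
        print(name)
```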