VMware Communities
topplay
Contributor

CPU usage or battery life in Fusion 4 compared to Fusion 3

I have not upgraded to Fusion 4 yet. Are those who have upgraded noticing any significant changes in CPU usage by Fusion 4 when compared to Fusion 3 (either during idle time or during load on the guest OS)? More generally, I am wondering if Fusion 4 has any improvements that benefit MacBook battery life compared to Fusion 3.

0 Kudos
3 Replies
francis_carden
Contributor

NO, I am on Fusion 4 - on Lion... It's worse!

I'd love to speak to someone at VMware. I have a brand new MacBook Pro, 8 GB of RAM, and a 512 GB SSD, with Lion installed.

However, I get less time on battery with VMware on Lion on this machine than on my three-year-old Mac running Snow Leopard.


The reason is that the new Mac's discrete graphics chip is used, but here's the bug, I think:

1. Even just opening the Fusion Library (no VM running) uses the battery-killing graphics processor. I can see it kick in as soon as Fusion is started. Interestingly, Twitter has the same bug (iChat does not).

2. Could Fusion detect when a VM is minimized or on another screen and not use the new GPU (I want my battery life back)? I.e. auto-switch while live.

3. Could it be a configurable option? I am perfectly happy with the standard Intel graphics chip; my machine is fast enough and I prefer battery life to take precedence.

Here's the article from Apple that showed me that VMware defaults to ONLY use the new GPU: http://support.apple.com/kb/HT4152
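
In case anyone wants to verify this on their own machine, here's a rough little Python sketch I'd use (nothing official from VMware or Apple) that asks system_profiler which graphics chip the built-in display is currently attached to. It assumes the built-in panel shows up as "Color LCD", which may differ on other models.

# Rough check of which GPU is currently driving the built-in display on OS X.
# Parses "system_profiler SPDisplaysDataType" output, whose exact wording can
# vary between OS X versions, so treat this as a sketch rather than a tool.
import subprocess

def active_gpu():
    out = subprocess.check_output(
        ["system_profiler", "SPDisplaysDataType"]).decode("utf-8", "replace")
    chipset = None
    for line in out.splitlines():
        line = line.strip()
        if line.startswith("Chipset Model:"):
            chipset = line.split(":", 1)[1].strip()
        # The built-in panel is listed under whichever GPU is active.
        # Assumption: it is named "Color LCD", as on recent MacBook Pros.
        if line.startswith("Color LCD"):
            return chipset
    return None

print("Built-in display is driven by: " + str(active_gpu()))

Run it before and after opening the Fusion Library and you can see the switch happen.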

Thanks for listening.

0 Kudos
ryanamos
Contributor

So here's my solution to this: I use gfxCardStatus to force the use of the Intel GPU rather than the Nvidia one. The trick here is that you cannot change this setting while VMware Fusion is running; you must completely quit VMware (not just shut down your VMs), change your GPU setting, then restart VMware. This isn't due to anything VMware is doing wrong; it's a perfectly reasonable assumption for them to make that the GPU won't change during normal operation (because they force use of the discrete GPU if it's available). gfxCardStatus is a "power user" tool that breaks a lot of programs if you don't understand how it works.
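
If you're not sure whether Fusion is really gone before you flip the GPU in gfxCardStatus, a quick Python sketch like this can list any leftover VMware processes. Matching on the substring "vmware" is just my guess at the process names (the GUI app plus helpers like vmware-vmx), so treat it as a rough check, not an official list.

# Quick check for leftover VMware processes before switching GPUs with
# gfxCardStatus. The "vmware" substring match is a guess at the process
# names, not an official list.
import subprocess

def lingering_vmware_processes():
    out = subprocess.check_output(["ps", "-axo", "command"]).decode("utf-8", "replace")
    return [cmd for cmd in out.splitlines() if "vmware" in cmd.lower()]

leftovers = lingering_vmware_processes()
if leftovers:
    print("Still running -- quit these before switching the GPU:")
    for cmd in leftovers:
        print("  " + cmd)
else:
    print("No VMware processes found; safe to switch GPUs.")

If it prints anything, quit those first, switch the GPU, then relaunch Fusion.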

This works well, but it doesn't do crazy magic: it extends my battery life on a mid-2010 MBP from about 2 hours to 3 hours. You will never get the battery life numbers Apple states in their product specs; by its very nature, virtualization is pretty hungry for computing power. You're running two operating systems instead of one, which means twice as much disk access, twice as many background processes, etc. Furthermore, the CPU has to do context switching to keep your Windows processes and your Mac processes separate in memory, which can also be computationally expensive.
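
If you want to put numbers on it yourself rather than eyeballing the battery menu, a rough logger like the sketch below samples pmset -g batt every few minutes; run it once with Fusion closed and once with your VM up, then compare the drain. It assumes the usual "NN%; discharging" wording from pmset, so adjust the regex if your output looks different, and stop it with Ctrl-C.

# Rough battery-drain logger: samples "pmset -g batt" every few minutes so
# you can compare discharge rates with and without Fusion running. The
# regex assumes the usual "NN%; discharging" wording, which may differ
# between OS X versions.
import re
import subprocess
import time

INTERVAL_SECONDS = 300  # sample every 5 minutes

while True:
    out = subprocess.check_output(["pmset", "-g", "batt"]).decode("utf-8", "replace")
    match = re.search(r"(\d+)%", out)
    stamp = time.strftime("%H:%M:%S")
    if match:
        print("%s  battery at %s%%" % (stamp, match.group(1)))
    else:
        print("%s  could not parse pmset output" % stamp)
    time.sleep(INTERVAL_SECONDS)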

If you have a new MBP, the Core i7s scale their clock speed (and likewise, power consumption) based on the load. VMware increases the load on the CPU, thus increasing the power used. This is generally a good thing, but virtualization software tends to spin up the extra cores because of the number of threads it creates. The new i7s have 4 cores (2 of which are probably idle most of the time when you're not running VMware), so they have the capability to draw more power. There's probably a firmware / microcode hack you could use to disable those 2 cores, but that would reduce performance.

The best thing you can do to increase battery life is use gfxCardStatus, but understand that this comes with a performance penalty and that you can't switch back and forth between GPUs while using VMware.

0 Kudos
treee
Enthusiast

The battery life issue with Lion is something that may very well have nothing to do with Fusion at all, as many people are reporting a decrease in battery life since upgrading from Snow Leopard to Lion. Most of them are not using Fusion at all.

The GPU switching is not really a Fusion or OS X problem as it is the intended behavior (although it can be problematic in some cases). There are certain frameworks that will trigger the switch from the Intel GPU to the AMD/Nvidia GPU. Twitter, iPhoto, Aperture, Firefox (the current Fx9 beta fixes this, btw!), Fusion, etc. all use one or more of these frameworks. Not all of them can do without, because that would mean the apps could no longer do certain things, or it would even render them useless.

The problem lies in when you use the framework(s) that cause a GPU switch. I've seen Fusion 4 run without problems on the Intel GPU; the AMD/Nvidia one only gives you some more performance in the 3D area. It would be rather nice if Fusion were able to trigger a GPU switch only when it actually needs that kind of power, so the GPU would only switch when starting a VM that has 3D turned on. All of this is highly dependent on the technical possibilities: it seems logical to users, but it can be impossible on a technical level.
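
As far as I understand it, apps can avoid forcing the AMD/Nvidia GPU by declaring the NSSupportsAutomaticGraphicsSwitching key in their Info.plist; whether a given app does is easy to check. Here's a small Python sketch (my own, and the app paths are just examples) that reads that key with the defaults tool:

# Checks whether an app bundle opts into automatic graphics switching via
# the NSSupportsAutomaticGraphicsSwitching key in its Info.plist. As I
# understand it, apps without this key that link certain frameworks force
# the discrete GPU; the app paths below are just examples.
import subprocess

def supports_auto_switching(app_path):
    try:
        out = subprocess.check_output(
            ["defaults", "read", app_path + "/Contents/Info",
             "NSSupportsAutomaticGraphicsSwitching"],
            stderr=subprocess.STDOUT).decode("utf-8", "replace").strip()
        return out in ("1", "YES", "true")
    except subprocess.CalledProcessError:
        return False  # key not present in the app's Info.plist

for app in ["/Applications/VMware Fusion.app", "/Applications/Twitter.app"]:
    status = "opts in" if supports_auto_switching(app) else "no opt-in (may force discrete GPU)"
    print("%s: %s" % (app, status))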

Adding an option to manually stop the GPU from switching would also be nice. You can already do this by using the gfxCardStatus tool and forcing the Intel GPU (which is what I do; it works fine).

0 Kudos