What is the best single-PC setup for a Linux enthusiast who still likes to play Windows-only games and run other Windows-only software?
Not WINE, because despite working surprisingly well in many cases and being in general better than you might expect, it's often flaky and unstable, and is absolutely not guaranteed to run every program. Not dual-booting, because then you have to constantly reboot and context-switch. Not a Windows virtual machine, because those don't let you run anything graphically intensive. Not a Linux virtual machine on a Windows host, because then you're not really using Linux (especially evident when Windows blue-screens and takes your poor VM along with it).

What you want is a way to run Linux and Windows simultaneously, allowing each operating system access to the hardware it needs: Windows should get the graphics card and everything else it needs for playing games, and Linux should get the rest for doing everything else.

Solution: set up a Linux host with a Windows virtual machine that uses modern virtualisation technology (VT-d & IOMMU) to directly access the graphics card.

I set this up over the last week or so, and it was actually a lot easier than I thought it would be. A year or two ago you needed to compile custom kernel images to get it right, but with the latest version of Ubuntu (16.04) running Linux kernel ~4.4, all I had to do was:

1. Make some minor config changes to assign my graphics card to the pci-stub driver on boot instead of the default radeon driver.
2. Set up a Windows virtual machine with Qemu-KVM and assign it my card (there was even a neat GUI for this, and only one point where you need to abandon it and dip into config files).
3. Install graphics card drivers on the Windows VM.

(Rough sketches of steps 1 and 2 are at the end of this post.)

I had to do a bit more futzing around to get keyboard and mouse sharing working (Synergy), some futzing with that to get mouselook in FPS games to work (by default Synergy sends absolute mouse coordinates, but games want relative ones, so you end up with a madly spinning screen or a crosshair that simply refuses to move; there's a sketch of that fix at the end of this post too), and a little more mucking around to get my onboard graphics card to play nice so I could use it for the Linux host.

All in all, easier than getting graphics card drivers to work on Linux itself. /s

Under my setup, I have both monitors wired to both graphics cards. So I can start up my Windows VM, move my mouse up, switch inputs on the monitors, and then feel exactly like I'm using Windows on a normal Windows PC. And if I don't want to actually use Windows, I can use Steam's in-home streaming to literally play games on Linux.

Graphics settings in games are exactly the same as when I was running Windows, and performance seems the same too (though this may take a few weeks to fully assess).

Startup is obviously faster, and I can now spend most of my time using my favourite minimal tiling window manager.

Overall, this worked a lot better than I expected, and was almost entirely painless to set up. My thoughts could change in the coming weeks (during which I plan to do a full writeup of how I set things up on my other blog), but so far, so good. It's really worth a shot if you think you'd like it and have compatible hardware (you need to be able to enable VT-d in your BIOS, and you'll also need a reasonably recent graphics card). For reference, the three main resources I used were this Linux Mint forum thread, this Arch Wiki page and the five-part guide on the VFIO Tips and Tricks blog.

EDIT 2016/09/09: And here at long last is my comprehensive guide to setting this up, as promised months ago.
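For the curious, here's a minimal sketch of what step 1 amounts to on an Ubuntu system booting with GRUB. The PCI IDs below are placeholders rather than my card's actual IDs, and whether you want intel_iommu or amd_iommu depends on your CPU; substitute whatever lspci reports for your own hardware.

    # Find the vendor:device IDs of the card to pass through (and its HDMI audio
    # function); the IDs used below are placeholders, not real ones from my box.
    lspci -nn | grep -iE 'vga|audio'

    # In /etc/default/grub, turn on the IOMMU and have pci-stub claim the card, e.g.:
    #   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash intel_iommu=on pci-stub.ids=1002:6810,1002:aab0"
    # (use amd_iommu=on instead on an AMD CPU)

    sudo update-grub
    sudo reboot

    # After rebooting, check that pci-stub claimed the card instead of radeon
    dmesg | grep -i pci-stub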
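And a rough sketch of the passthrough part of step 2, assuming the GUI in question is virt-manager on top of libvirt (the domain name "win10" and the PCI address here are made up for illustration):

    # Check which IOMMU group the card landed in; ideally it's on its own,
    # or only grouped with devices you're happy to pass through with it.
    find /sys/kernel/iommu_groups/ -type l

    # In virt-manager: Add Hardware -> PCI Host Device -> pick the GPU and its
    # HDMI audio function. Or add a <hostdev> entry to the domain XML by hand:
    virsh edit win10
    #   <hostdev mode='subsystem' type='pci' managed='yes'>
    #     <source>
    #       <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
    #     </source>
    #   </hostdev>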
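As for the mouselook problem, the relevant knob (as far as I can tell; your synergy.conf layout will differ) is a single option in the Synergy server config:

    # Excerpt from synergy.conf on the Linux host (the Synergy server side).
    # With this set, Synergy sends relative mouse moves while the cursor is
    # locked to a screen (Scroll Lock), which is what FPS mouselook wants.
    section: options
        relativeMouseMoves = true
    end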
I mean to do this in the future; my mobo does have an integrated GPU. But how would one do this with just one monitor? Just alt-tab into the Windows VM as if it were another Linux program?
I'm so glad setting this up is getting easier every day; I really want to make the switch to Linux eventually. (Now that I'm thinking about it, my laptop also has integrated and dedicated GPUs… hmm…)

If you've only got one HDMI input on your monitor, there are a variety of HDMI switches you can get for reasonably cheap (like this one) to swap signals instead.
This is awesome, something I really want to do when I rebuild my desktop. Message me when you do your full write-up :)
Also, Qubes OS looks really cool, thanks for sharing that flashback.

If you had a capture card and a desire not to switch inputs on your monitors, I bet you could display the GPU-delegated VM's output as a live video stream within your host OS. It would look a bit daft on the back of your computer, though.
Was your streaming going over your physical network, or was it purely within your virtual switch on the host? If you can get it all over the virtual switch, it might reduce latency enough to make things more playable for you.
I'm streaming over gigabit from my Windows desktop to a Linux NAS/HTPC, and since I put a low-end NVIDIA GPU with hardware decoding (a GT 710) into the NAS (the desktop runs a GTX 770, using its onboard H.264 encoder), I haven't noticed any huge latency in controls.

Can you monitor traffic on your virtual adapter? That will at least let you see where the traffic is going. The VM's interface out to the LAN likely has some overhead on it, so if that's where the traffic is routing, your latency is certainly explainable (though it still shouldn't be huge if your network is wired).
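One way to check, assuming libvirt's default interface names (yours may differ):

    # Watch traffic on the VM's virtual bridge vs. the physical NIC while
    # streaming; whichever one lights up is where the stream is going.
    sudo iftop -i virbr0   # libvirt's default virtual bridge
    sudo iftop -i eth0     # the physical LAN interface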
The way Hyper-V in Windows resolves this is by making all network adapters virtual: when you set up Hyper-V networking, your host also gets a new virtual adapter and loses direct access to the LAN connection, so any comms between host and VM have to go over a virtual switch, period. I don't know if you can configure something like this in Linux, but I'd suspect it's possible.
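If it is possible, I'd guess it looks roughly like a software bridge with both the physical NIC and the VM's virtual NIC attached, so host-to-VM traffic never leaves the bridge. A very rough sketch with iproute2 (interface names are placeholders, and libvirt can set up an equivalent bridged network for you):

    # Sketch only: eth0 is the physical NIC, vnet0 the VM's tap device.
    ip link add name br0 type bridge
    ip link set eth0 master br0
    ip link set vnet0 master br0
    ip link set br0 up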