I’ve always found the documentation around virtio-gpu and virgl very lacking, and I’ve never gotten them working. Would love pointers if anyone has a good source.
Actually, I don’t see any performance difference with the vGPU. My bottlenecks are more on the CPU side, and my RAM isn’t the fastest, so I think I’m CPU limited. GPU-focused benchmarks I’ve run show little to no difference from what the physical card would do.
Yeah, unfortunately. The 20xx series is the last generation supported by the patch so far; not sure if support for later cards is coming or not.
No, but I think you’d have some problems. Only the host has access to the actual DisplayPort outputs; all the vGPUs get virtual displays, and I don’t think there’s a way to make them use the physical outputs.
Sure, but you’ll most likely hit diminishing returns, as consumer hardware doesn’t really have the resources to scale that way if all the VMs are running demanding apps simultaneously.
Even for something like 4 VMs that just do NVENC, there are limits on how many streams the GPU can encode. I think there’s another patch that lets you raise that limit, but at some point you’ll run out of resources quickly. Even powerful consumer gear isn’t really designed to be used by more than one user/app at a time, and it starts to show the more you virtualize and split those resources.
I’ve been doing exactly that at home for a couple years now. First with Parsec, now Sunshine/Moonlight.
Host is Proxmox on a Ryzen 5800X with 64GB RAM. GPU is a 2070 Super, with vGPU-patched drivers from https://gitlab.com/polloloco/vgpu-proxmox
When I’m gaming I’ll dedicate the full 8GB to my Windows VM; otherwise I split it into 2GB or 4GB chunks for Jellyfin or my home camera monitoring. 8GB can’t be split very many ways, and most things need at least 2GB to run.
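If anyone wants to replicate it: beyond the patched host driver from that repo, the per-VM part is basically one line in the VM config choosing an mdev profile. A minimal sketch, assuming an example PCI address and profile ID (yours will differ, check what your host actually exposes):

```
# /etc/pve/qemu-server/<vmid>.conf -- the PCI address and mdev profile here are examples
# the mdev type is what determines the VRAM split (2GB, 4GB, whole card, etc.)
hostpci0: 0000:01:00.0,mdev=nvidia-259
```

The profiles the patched driver actually offers are listed on the host under /sys/bus/pci/devices/<pci-address>/mdev_supported_types, each with a description of how much framebuffer it gets.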
Locally at home I can run 1440p 60fps rock solid over WiFi on any device: my phone, an old laptop, an Apple TV, a Raspberry Pi. Remotely I can do 1080p60, but it’s a bit more hit or miss depending on my network connection.
Experimenting with LLMs I’ve done through the same Windows VM, or through an Ubuntu dev VM; it works the same way. I’m thinking of transitioning my gaming VM to Linux too.
The amount of VRAM is the hard limitation to get past, the virtualization tech itself has been there for a while.
But to be perfectly honest... it really was just a “let’s see if I can do this” type task. Direct GPU passthrough is more straightforward, and it’s not really worth splitting 8GB these days. Unless you get a card with significantly more VRAM, passthrough is much less work.
Yes. That’s what allows Unix legends like this: https://www.ee.torontomu.ca/~elf/hack/recovery.html
Reminds me of OS/2
Seriously, we did this in 1998. Why are we doing it again??
Go for a vintage-correct OS for a challenge: try Haiku!
It’s astounding. The same reason the Steam Deck is better than the Asus and Lenovo imitation handhelds is why people will want the Apple Vision Pro over building their own headset and PC. Yet just because it’s Apple, all the edgelords are out in force, refusing to see why a product that combines existing technologies for you is better for the masses than one you cobble together yourself.
Polish.
It’s useless to be first if the product isn’t reliable, sustainable, and practical. Apple adds polish to other people’s concepts to make them usable by the vast majority of people.
Laptops existed... with weird keyboard layouts and mice that were afterthoughts. The PowerBook pioneered the keyboard-forward design that every laptop now has.
Smartphones existed... incredibly limited, with weird UIs and awkward input, targeted at businesses instead of regular people. The iPhone changed everything so thoroughly that every other design died.
Collecting different innovations and figuring out how to combine them in a way that’s practical and sellable is their continuous innovation.
Wow. Thank you for that incredibly detailed explanation!!
It does sound, though, like it’s POTENTIALLY cheaper than something like B2, but also much easier to misconfigure and end up in a more expensive tier.
Seems to me that unless you have a reason to use Amazon storage, or already have something on it, it isn’t the best choice for backup.
How much is their cheapest Glacier tier? It seems complicated to calculate; there seems to be some relation to S3 storage pricing, or maybe I’m just missing something. I haven’t looked that closely.
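From a quick skim, the model seems to be storage per GB-month plus retrieval and egress per GB when you restore (ignoring per-request fees and the early-deletion minimum). A back-of-envelope sketch; the rates below are placeholders, plug in the current numbers from the AWS pricing page for your region:

```python
# back-of-envelope Glacier cost model; all rates below are PLACEHOLDERS,
# look up the current numbers for your region on the AWS pricing page
STORAGE_PER_GB_MONTH = 0.00099   # e.g. the Deep Archive storage class
RETRIEVAL_PER_GB     = 0.02      # standard retrieval fee
EGRESS_PER_GB        = 0.09      # transfer out to the internet

def monthly_cost(stored_gb: float, restored_gb: float) -> float:
    """Storage plus an occasional restore; ignores per-request fees."""
    return (stored_gb * STORAGE_PER_GB_MONTH
            + restored_gb * (RETRIEVAL_PER_GB + EGRESS_PER_GB))

# 2 TB of backups, restoring 50 GB in a bad month:
print(f"${monthly_cost(2000, 50):.2f}/month")
```

At rates like these the restore side dwarfs the storage side, which is presumably how people end up in the more expensive tiers by accident.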
You could also pull it all out through Cloudflare, and then it should be completely free.
Didn’t we do this already back in the 90s with IE bundling??
OK, but... who the hell runs Blender and FFP in 8GB?
The vast majority of users are NOT running pro apps like that.
It’s just a name. If you’re actually running pro stuff, you’d be an idiot to run it on 8GB no matter what machine.
Apple’s argument that it’s the same as 16GB is dumb, but anyone actually using pro apps on 8GB is dumber. The majority of browser (with a sane number of tabs)/iPhoto/office users probably really won’t notice.
My 701 with 2GB of RAM and the extended battery still works. I used to go wardriving with that thing!
I had a Pi 1B running my HVAC/humidifier/HRV unit at home for years. I only removed it when we moved out.
Where has that been all my life!