Nvidia’s a safer choice for Blender as far as I know.
Maybe that explains the problems you’re having with AMD.
For me, Bodhi Linux plays animation at a higher FPS than Windows 10.
(but my laptop’s slow, so it’s still a low FPS).
So dual boot sounds like a good idea (just don’t break your Ubuntu like I did :|)
You should have this much VRAM depending on what you’re doing in Blender:
- ~1 GB for every 8 million triangles
- 500 MB for the Cycles kernel
- 64 MB for each 4K texture (8-bit × RGBA)
- 200 MB for each 4K HDR texture (32-bit per channel)
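If you want a quick total, those rules of thumb can be turned into a little Python calculator. This is just my own sketch, not anything official: the 8-bit RGBA math works out to exactly 64 MB, and I’m *assuming* the ~200 MB HDR figure comes from 3 channels at 32-bit (which gives 192 MB).

```python
def texture_mb(side_px, channels, bytes_per_channel):
    """Uncompressed size of a square texture, in MB."""
    return side_px * side_px * channels * bytes_per_channel / (1024 ** 2)

def estimate_vram_mb(triangles=0, textures_4k=0, hdr_4k=0, cycles=True):
    """Rough VRAM estimate from the rules of thumb above."""
    total = triangles / 8_000_000 * 1024           # ~1 GB per 8M triangles
    total += textures_4k * texture_mb(4096, 4, 1)  # 8-bit RGBA 4K = 64 MB each
    total += hdr_4k * texture_mb(4096, 3, 4)       # 32-bit RGB 4K = 192 MB (assumed basis of the ~200 MB figure)
    if cycles:
        total += 500                               # Cycles kernel
    return total

# e.g. a 16M-triangle scene with 10 textures and 2 HDRs:
print(estimate_vram_mb(triangles=16_000_000, textures_4k=10, hdr_4k=2))  # → 3572.0
```

So that example scene would already want ~3.5 GB, which is cutting it close on a 4 GB card.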
The GPU in the Dell laptop you want has 4GB VRAM [https://www.videocardbenchmark.net/gpu.php?gpu=GeForce+GTX+1650+(Mobile)&id=4090]
(Note that this is the mobile version of the card, since it’s in a laptop).
If you knew the CPU’s exact model, you could look up its benchmark on [cpubenchmark.net/]. I don’t know exactly what kind of CPU is needed for what, but there’s this:
The better a CPU’s single-threaded performance, the better it’ll be for physics simulation & most modeling tasks/tools.
The better a CPU’s multi-threaded performance, the better it’ll be for rendering/baking/multitasking.
^This has other info on PC specs for Blender, even tho you’re not building a PC.
Oh, also, building a PC will probably give you better value for your money.
You can always upgrade it later.
But it’s not portable, and you need to get a monitor and either headphones or a speaker, and learn how to build it, etc.
You can search how long it takes to render benchmark blend files here: [https://opendata.blender.org/benchmarks/query/] Yours might take a bit longer tho, since yours is a mobile GPU.
I’m guessing it all depends on what you wanna do in Blender, tho. Do you wanna:
- use super duper 4K textures?
- super-high-poly sculpting? (P.S. I think I heard other 3D apps are more optimized for sculpting. But I don’t know if they’re free like Blender).
- photo-realistic architecture pics?
- 3D animation or just pictures?
- low-poly npr?
…Also… Getting a TB3 laptop now, and buying an eGPU and GPU later, IS an option. BUT from what I remember, it costs a lot, you may have to troubleshoot it, and the GPU will be bottlenecked. Maybe the technology will improve when Thunderbolt 4 comes out? I dunno.