I can barely wrap my head around the costs, but it sounds like you might want to go all-in with a Gen5 build and 128 GB. The AI engine I'm running on a Linux VM via Docker is a memory and CPU hog. I'm debating going to 64 GB of RAM to run the larger VM, and it makes me strongly consider a 16-core for my next CPU as opposed to a 12-core.
Honestly, if I'm going to go in that deep, I'll petition work to buy me an HEDT setup with 256 GB or better and a business-class GPU that does AI acceleration. We finally released a policy and are looking at blocking things like ChatGPT internally to stop idiot devs from feeding it protected code in the form of questions.