I know x) That one got a good laugh out of me.
As much as I'd personally love for AMD to beat NVidia's graphics cards, it seems every year we hear "This is going to be the year AMD bests NVidia!" and every year it turns out not to be true. I'm sure they'll get there eventually, but I wouldn't risk being an early adopter.
Well, "beating" is not as clear-cut in productivity as it is in, let's say, gaming. All they need to do is offer a well-rounded, competitive product. I pulled the trigger on a 6800 XT this summer simply because it has 16 GB of VRAM. Most of the projects I work on routinely exceed the 12 GB mark, and so, apart from one model, the entire lineup from Nvidia never even entered purchasing consideration.
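For anyone going the same route: since Blender 3.0, Cycles can target AMD cards through the HIP backend, and the whole setup is scriptable. A minimal sketch using Blender's Python API (the preference layout below matches recent 3.x builds; double-check the property names in your version):

```python
import bpy

# The Cycles add-on preferences hold the compute backend and the device list.
prefs = bpy.context.preferences.addons["cycles"].preferences

# "HIP" targets AMD cards like the 6800 XT; "CUDA" / "OPTIX" target Nvidia.
prefs.compute_device_type = "HIP"
prefs.get_devices()  # refresh the detected-device list

# Enable every GPU device and leave the CPU out of it.
for dev in prefs.devices:
    dev.use = (dev.type != "CPU")
    print(dev.name, dev.type, "->", "on" if dev.use else "off")

# Point the active scene's Cycles renders at the GPU.
bpy.context.scene.cycles.device = "GPU"
```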
I've got a 3070 with 8 GB… what a pain in the ass… and the Blender team isn't speeding up the integration of "on-demand textures" like Arnold has to reduce memory usage. Maybe we'll get it in 2026, when all GPUs have 32 GB, LOL!
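Until something like that lands, the closest stopgap I know of is the Simplify texture limit, which downscales textures before they are uploaded to the GPU. A rough sketch (these are the Cycles property names in recent Blender versions; the 2048/1024 px caps are just example values, pick what your scenes tolerate):

```python
import bpy

scene = bpy.context.scene

# Turn on Simplify and cap texture sizes so 4K/8K maps stop eating VRAM.
scene.render.use_simplify = True
scene.cycles.texture_limit_render = "2048"  # cap for final renders (example value)
scene.cycles.texture_limit = "1024"         # cap for viewport preview (example value)
```

It downsamples rather than streams, so it is lossy, but it can be the difference between fitting in 8 GB and falling back to the CPU.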
That is the problem with a lack of competition: Nvidia can keep card memory stalled for two or three generations. Now, with competition from Intel, Apple, and hopefully a resurgent AMD, things might improve.
This is pretty devastating. First they had us choosing between low-VRAM cards and overpriced cards with enough VRAM, and they knew they had us between a rock and a hard place because CUDA + OptiX left us no other choice for rendering work. And now this: they expect to just carry on raising prices while lazily upping the power envelope of the same architecture.
The real market is gamers and crypto miners, and AMD and Intel have a good chance to gain GPU market share next year and put pressure on Nvidia. But unless AMD and Intel somehow get excellent support and drivers for Blender and Cycles, so they can actually compete with Nvidia cards in rendering, we are going to have to wait for the generation after Lovelace for any hope of a decent, fairly priced, energy-efficient Nvidia GPU with a fair amount of VRAM.
Between now and then I may just have to give in and pay through the nose for a Quadro A5000. I've bought some expensive, overpriced things in my life… but this would take the biscuit.
But what choice do we have?
This is playing right into what Nvidia was hoping would eventually happen: selling the GeForce line as being just for playing games and little else. If you want to work with computer graphics (even in Blender), then you buy a Quadro. There is a reason why Nvidia never published render times in iRay, or in any other content-creation app, using a GeForce card.
The ball is in AMD's court now; hopefully their drivers will be in good shape by the time they unleash the first working MCM (multi-chip module) GPU.
Sorry, but that doesn't make sense given their promotion of GeForce Studio drivers and GeForce Studio laptops, and their promotion of Blender with both: https://blogs.nvidia.com/blog/2021/12/13/blender-omniverse/
Do you see the promotional banners here on Blenderartists?
I get 23 seconds on the BMW scene with an RTX 3060 at 40 watts, and 12 seconds at 115-120 W. I think RTX cards are the most efficient for rendering in Blender. We should not judge until we have the numbers.
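To put those figures in perspective, energy per render is just power × time. A quick back-of-the-envelope check (these are my two runs above; the 117 W is simply the midpoint of my 115-120 W reading):

```python
# Energy per render = power draw (W) x render time (s), in joules.
runs = {
    "power-limited (40 W)": (40, 23),
    "stock (~117 W)": (117, 12),
}

for label, (watts, seconds) in runs.items():
    joules = watts * seconds
    print(f"{label}: {watts} W x {seconds} s = {joules} J per BMW render")

# 40 W run:  40 x 23 = 920 J per frame
# stock run: 117 x 12 = 1404 J per frame
# -> the power-limited card uses ~35% less energy, at roughly half the speed.
```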
I also think it is not fair to criticize Nvidia but not the others for failing to reach their level.
Having been able to afford high-end cards in the past, and now being forced to pay high-end prices for the entry-level card, which is then out of stock for most of the year, is a kick in the teeth.
I'm hoping AMD or Intel will show Nvidia that it needs to deliver what it used to, or else it may get superseded.
So if it is out of stock, does that mean they want to lose money on purpose?
And why don't you criticize AMD, Intel, etc.?
When Intel or AMD becomes king of performance and market share, and then becomes greedy and lazy or fails to deliver competitive, innovative cards, then I will criticize them. It's only a matter of time; it will happen.
Nvidia, AMD, Intel… same s#**, different package design. And always keep in mind that CG people are probably the smallest part of the market. If they wanted to, they could probably make GPUs that are suitable for CG but not so much for gaming, and absolutely NOT suitable for crypto. But why would they spend money to please such a small portion of consumers?
Nvidia makes the occasional… "gift" to CG people here and there, a small expense for good marketing. AMD tried with ProRender, but that was a half-baked mess that no one uses seriously (change my mind). These days they've released a free MaterialX library… again, more or less a gimmick.
GPUs will remain overpriced. In the past, Nvidia and AMD made under-the-table deals; they were even sued for it… and… nothing. They carry on with it. The competition is more or less just a publicity stunt.
Things will not change with Intel GPUs. These are pragmatic companies.
Competition wars fought on price-to-performance ratio are too expensive. There may be some skirmishes in the beginning, but once they establish a "hierarchy" and divide up the market, they will carry on with the old practice. This time as three evil companies instead of two.
There are rumors that PowerVR (the maker of impressive mobile graphics chips) will also soon enter the PC GPU market. If they do, they may be the ones to watch if you are looking for a good power/performance ratio for cutting-edge graphics.
And if we start seeing the ARM-based chipmakers in particular make PC-grade products (CPUs and GPUs), we might see a return to the 1990s, when there were many players to choose from.
Read what you wrote… don't you notice several contradictions in your thinking?
That's not even all that pricey these days. The Quadros and Radeon Pros have always been overpriced, so the supply/demand situation hasn't affected them as much as the gaming segment.
There will always be "evil" companies in the world. But they have built some very cool technology in the process. We just have to make sure that they do not become monopolies.
An interesting story was told to me by a co-worker who worked with computers in the '60s. The computer they had was locked by the company they "rented" it from. When they wanted to upgrade to the faster version of the system, the company sent a representative to do the work. All he did was unlock the computer, pull out long cables, and put in short cables. That was the extent of the upgrade. I don't remember the company or how much it cost. Back then it was probably many thousands of dollars.