First details on Nvidia Lovelace; Dead on arrival?

I know x) I cracked a good laugh with that one

As much as I’d personally love for AMD to beat Nvidia’s graphics cards, it seems every year we hear “This is going to be the year AMD bests Nvidia!” and every year it turns out not to be true. I’m sure they’ll get there eventually, but I wouldn’t risk being an early adopter.

Well, “beating” is not as clear-cut in productivity as it is in, let’s say, gaming. All they need to do is offer a well-rounded, competitive product. I pulled the trigger on a 6800 XT this summer simply because it had 16 GB of VRAM. Most of the projects I work on routinely exceed the 12 GB mark, and so, apart from one model, the entire lineup from Nvidia failed to enter purchasing consideration.


I’ve got a 3070 with 8 GB… what a pain in the ass. And the Blender team isn’t speeding up the integration of on-demand texture loading, like Arnold has, to reduce memory usage. Maybe we’ll get it in 2026, when all GPUs will have 32 GB, LOL!


That is the problem with a lack of competition: Nvidia can stall card memory for two or three generations. Now, with competition from Intel, Mac, and hopefully a resurgent AMD, things might improve.


This is pretty devastating. First they had us choosing between low-VRAM cards and overpriced cards with enough VRAM, and they knew they had us between a rock and a hard place because CUDA + OptiX left us no other choice for rendering work. And now this: they expect to just carry on raising their prices and being lazy by upping the power envelope of the same architecture.

The real market is gamers and crypto miners, and AMD + Intel have a good chance to gain GPU market share next year and put pressure on Nvidia. This means that unless AMD and Intel somehow get amazing support and drivers for Blender and Cycles so they can actually compete with Nvidia cards in rendering, we are going to have to wait for the generation after Lovelace for any hope of a decent, fairly priced, energy-efficient Nvidia GPU with a fair amount of VRAM.

Between now and then I may just have to give in and pay through the nose for a Quadro A5000. I’ve bought some expensive, overpriced things in my life… but this would take the biscuit.

But what choice do we have?

This is playing right into what Nvidia was hoping would eventually happen, which is to sell the GeForce line as being just for playing games and little else. If you want to work in computer graphics (even in Blender), then you buy a Quadro. There is a reason Nvidia never showed render times in iRay or any other content-creation app using a GeForce card.

The ball is in AMD’s court now; hopefully their drivers are getting pretty good as they unleash the first working MCM design for GPUs.

Sorry, but that doesn’t make sense given their promotion of GeForce Studio drivers and GeForce Studio laptops, and their promotion of Blender with both.

Do you see the promotional banners here on BlenderArtists?

I get 23 seconds on the BMW scene with an RTX 3060 at 40 watts, and 12 seconds at 115-120 W. I think RTX cards are the most efficient for rendering in Blender. We should not judge until we have the numbers.
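Taking those numbers at face value, the energy spent per render at each power limit can be compared directly. This is just a back-of-the-envelope sketch based on the figures quoted above; the 115-120 W case is approximated by its 117.5 W midpoint.

```python
# Energy per BMW render = render time (s) * power limit (W), in joules.
def energy_joules(seconds, watts):
    return seconds * watts

low_power = energy_joules(23, 40)      # 40 W limit, 23 s render
high_power = energy_joules(12, 117.5)  # ~117.5 W, 12 s render

print(f"40 W limit: {low_power:.0f} J per render")    # 920 J
print(f"~118 W:     {high_power:.0f} J per render")   # 1410 J
print(f"energy saved at 40 W: {1 - low_power / high_power:.0%}")
```

By this rough measure the 40 W limit finishes each frame with about a third less energy, at the cost of roughly double the wall-clock time.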

I also think it is not fair to criticize Nvidia while not criticizing the others for failing to reach its level.

Having been able to afford high-end cards in the past, and now being forced to pay high-end prices for the entry-level card, which has been out of stock for most of the year, is a kick in the teeth.
I’m hoping AMD or Intel will show Nvidia that it needs to deliver what it used to deliver, or else it may get superseded.

So if it is out of stock, does that mean they want to lose money on purpose?
And why don’t you criticize AMD, Intel, etc.?

When Intel or AMD becomes king of performance and market share and then becomes greedy and lazy, or fails to deliver competitive or innovative cards, then I will criticize them. It’s only a matter of time; it will happen.

Nvidia, AMD, Intel… same s#**, different package design. And always keep in mind that CG people are probably the smallest part of the market. If they wanted to, they could probably make a GPU that is suitable for CG but not so much for gaming, and absolutely NOT suitable for crypto. But why would they spend money to please such a small portion of consumers?
Nvidia makes some… ‘gifts’ to CG people here and there, a small expense for good marketing. AMD tried with ProRender, but it was a half-baked mess that no one uses seriously (change my mind). These days they released a free MaterialX library… again, more or less a gimmick.
GPUs will remain overpriced. In the past Nvidia and AMD made under-the-table deals; they were even sued for it… and nothing changed. They continue with this. Competition is more or less just a publicity stunt.
Things will not change with Intel GPUs. They are pragmatic companies.
Competition wars with a good price-performance ratio are too expensive. There will be some skirmishes at the beginning… maybe, but after they establish a ‘hierarchy’ and divide the market, they will continue with the old praxis. This time as three evil companies instead of two.

There are rumors that PowerVR (the maker of impressive mobile graphics chips) will also soon enter the PC GPU market. If they do enter, then they may be the ones to watch if you are looking for a good power/performance ratio for the display of cutting-edge graphics.

Especially if we start seeing the ARM-based chipmakers make PC-grade products (CPUs and GPUs), we might see a return of the 1990s, when there were many players to choose from.

Read what you wrote… don’t you notice several contradictions in your thinking?

That’s not even all that pricey these days. :wink: The Quadros and Radeon Pros have always been overpriced, so the supply/demand situation hasn’t affected them as much as the gaming segment.

There will always be “evil” companies in the world. But they have built some very cool technology in the process. We just have to make sure that they do not become monopolies.

An interesting story was told to me by a co-worker who worked with computers in the ’60s. The computer they had was locked by the company they “rented” it from. When they wanted to upgrade to the faster version of the system, the company sent a representative to do the work. All he did was unlock the computer, pull out long cables, and put in short cables. That was the extent of the upgrade. I don’t remember the company or how much it cost; back then it was probably many thousands of dollars.

IMHO the 6900 XT doesn’t make much sense for 3D, since you can have a W6800 with 32 GB of VRAM and roughly the same performance for the same or a lower price.

Top tier flagship PRO cards are ‘value’ option now.