RTX 4000 leak

SpyderTracks

We love you Ukraine
Take all this with a pinch of salt as with any leak.

For those who haven't heard, Nvidia's servers were recently hacked and the attackers made off with over 1TB of data, including the hash rate limiter source code and driver source code.

They demanded that Nvidia make their drivers open source on all platforms and remove the hash rate limiter from RTX cards.

Nvidia have been given until tomorrow to respond, and to prove the threat is real (Nvidia have already admitted their servers were compromised), the hackers have already started to leak some of the data.



These are their leaks about the upcoming RTX 4000 series. Again, please do take them with a pinch of salt.

 

Scott

Behold The Ford Mondeo
Moderator
I think we might actually see 8k gaming next gen. I think they're gunning for it, properly.

4k is absolutely in the bag. There is genuinely no doubt, 4k is the new 1440p now. 1080p is completely toast. There's just no value in 1080p gaming now, it's SOOO expensive when you can get 1440p for around 15% more cost.

I'm going to be trying my absolute hardest to get a 4080 0-day. I don't like to purchase this way but I think needs must with the way GPU purchasing is now. If Nvidia stick with releasing decent cooling solutions on their own cards I think it's the way to go for me.

The 4080 will absolutely smash my requirements for 2D gaming (the 2080 already does) but it'll bring new life to my Index :D
 

Steveyg

MOST VALUED CONTRIBUTOR
Scott said:
I think we might actually see 8k gaming next gen. I think they're gunning for it, properly.

4k is absolutely in the bag. There is genuinely no doubt, 4k is the new 1440p now. 1080p is completely toast. There's just no value in 1080p gaming now, it's SOOO expensive when you can get 1440p for around 15% more cost.

I'm going to be trying my absolute hardest to get a 4080 0-day. I don't like to purchase this way but I think needs must with the way GPU purchasing is now. If Nvidia stick with releasing decent cooling solutions on their own cards I think it's the way to go for me.

The 4080 will absolutely smash my requirements for 2D gaming (the 2080 already does) but it'll bring new life to my Index :D
I keep thinking, did I drop the ball going to 1440p? Then I remember what games I actually play; pretty sure I don't need 4K for Dead Cells or The Binding of Isaac.

I'll skip the 4000 series this time around.
 

TonyCarter

VALUED CONTRIBUTOR
I'm saving my cash for that Alienware ultra wide OLED (AW3423DW), or one of the new 4k OLEDs - that's assuming I can get either for around £1k.
 

Martinr36

MOST VALUED CONTRIBUTOR
When the 4000 series cards come out I might see if I can upgrade my 2070 Super to a 3070 of some sort......... :ROFLMAO: :ROFLMAO:
 

DarTon

Well-known member
Those specs for Ada Lovelace (the 4000 series) have actually been around for a number of months. It was reported as early as November on sites such as Videocardz that the AD102 would have a maximum of 18,432 CUDA cores vs. the GA102 with 10,752, so a 71% increase. Going from an 8nm Samsung node to a 5nm TSMC node really lets them pack a lot more in.

The problem is the power requirements. Moving to a smaller geometry node should be more power efficient but the specs getting leaked seem to imply the exact opposite. I bought a 1000W PSU for my home machine but I'm really thinking that the high end 4000 series will need 1200W. Pair a 4080/4090 at say 500W with a 13700K/13900K using 200-300W, add another 100W for other components and it feels like 1000W is an absolute minimum, 1200W sensible and I need 1600W for some room to upgrade. It's all going in totally the wrong direction.
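For anyone who wants to sanity-check the numbers above, here's a quick back-of-the-envelope sketch. The wattages are the rough figures assumed in this post, not measurements, and the 20% headroom factor is just a common rule of thumb:

```python
# Rough arithmetic for the rumoured core-count jump and PSU sizing.
# All inputs are assumptions quoted above, not confirmed specs.

def pct_increase(new: int, old: int) -> float:
    """Percentage increase from old to new."""
    return (new - old) / old * 100

cuda_gain = pct_increase(18_432, 10_752)   # rumoured AD102 vs GA102 CUDA core counts
print(f"CUDA core increase: {cuda_gain:.0f}%")             # ~71%

gpu_w, cpu_w, other_w = 500, 300, 100      # assumed worst-case draws from the post
total = gpu_w + cpu_w + other_w
headroom = 1.2                             # ~20% PSU headroom, rule of thumb
print(f"Estimated system draw: {total} W")                 # 900 W
print(f"Suggested PSU rating:  {total * headroom:.0f} W")  # ~1080 W, so a 1200 W unit
```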
 

Steveyg

MOST VALUED CONTRIBUTOR
DarTon said:
Those specs for Ada Lovelace (the 4000 series) have actually been around for a number of months. It was reported as early as November on sites such as Videocardz that the AD102 would have a maximum of 18,432 CUDA cores vs. the GA102 with 10,752, so a 71% increase. Going from an 8nm Samsung node to a 5nm TSMC node really lets them pack a lot more in.

The problem is the power requirements. Moving to a smaller geometry node should be more power efficient but the specs getting leaked seem to imply the exact opposite. I bought a 1000W PSU for my home machine but I'm really thinking that the high end 4000 series will need 1200W. Pair a 4080/4090 at say 500W with a 13700K/13900K using 200-300W, add another 100W for other components and it feels like 1000W is an absolute minimum, 1200W sensible and I need 1600W for some room to upgrade. It's all going in totally the wrong direction.
I also imagine the costs will be eye-bleeding.
 

Scott

Behold The Ford Mondeo
Moderator
Steveyg said:
I keep thinking, did I drop the ball going to 1440p? Then I remember what games I actually play; pretty sure I don't need 4K for Dead Cells or The Binding of Isaac.

I'll skip the 4000 series this time around.

I still think 1440p is a good sweet spot for standard-size monitors; my comment wasn't about which resolution is best for gaming, but more about the graphical power point we're at now.

4k on the big screen and 8k on the bigger screen will be a must (you're likely looking at 40"+ for 4k to be warranted, 50"+ for 8k).
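As a rough sanity check on those size thresholds, here's a quick pixel-density sketch. It's my own ballpark, assuming standard 3840x2160 and 7680x4320 panels; the idea is that below roughly those diagonals the extra pixels are largely wasted:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Compare pixel density across common monitor and TV sizes.
for name, (w, h) in {"4k": (3840, 2160), "8k": (7680, 4320)}.items():
    for size in (27, 40, 50, 65):
        print(f'{name} at {size}": {ppi(w, h, size):.0f} PPI')
```

At 27" an 8k panel comes out over 300 PPI, well past what you can resolve at desk distance, whereas at 50"+ it lands back in a sensible range.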

My requirements are only brought about by VR. Otherwise I game on a 1080p plasma :ROFLMAO:

Although I do have a 4k projector now 😇 :unsure:
 

SpyderTracks

We love you Ukraine
Scott said:
I still think 1440p is a good sweet spot for standard-size monitors; my comment wasn't about which resolution is best for gaming, but more about the graphical power point we're at now.

4k on the big screen and 8k on the bigger screen will be a must (you're likely looking at 40"+ for 4k to be warranted, 50"+ for 8k).

My requirements are only brought about by VR. Otherwise I game on a 1080p plasma :ROFLMAO:

Although I do have a 4k projector now 😇 :unsure:
It's really moving from dedicated monitors to high-end 120Hz gaming TVs.

At 8k you need a huge panel, so a TV just makes more sense. I think monitors will become more of a niche for the seriously high-end stuff.
 

Scott

Behold The Ford Mondeo
Moderator
SpyderTracks said:
It's really moving from dedicated monitors to high-end 120Hz gaming TVs.

At 8k you need a huge panel, so a TV just makes more sense. I think monitors will become more of a niche for the seriously high-end stuff.

To be honest I think it's the consoles that have really kicked it all off. They've brought some real power to the casual console gaming house. 120Hz TVs are all being released, I think, with the consoles in mind rather than PC gamers. 4k 120Hz is going to be fairly commonplace on 50"+ TVs and I couldn't be happier.

Spending £1k on a monitor would bring tears to my eyes, but I wouldn't think twice about £2k on a well-specced TV. It's ironic, in a way, but I think justified given the versatility of a TV over a monitor.

As you say though, that push from the TV sector is going to put some really strong competition on the monitor makers. It's likely a win/win though, as the panels and technology will become more shared for the most part.

Bespoke manufacturers, the likes of Dell, Gigabyte, Asus, etc., will probably feel it more from general consumers, but they will likely still remain a firm favourite with professionals.

Samsung will be quids in though :ROFLMAO:
 

SpyderTracks

We love you Ukraine
Scott said:
To be honest I think it's the consoles that have really kicked it all off. They've brought some real power to the casual console gaming house. 120Hz TVs are all being released, I think, with the consoles in mind rather than PC gamers. 4k 120Hz is going to be fairly commonplace on 50"+ TVs and I couldn't be happier.

Spending £1k on a monitor would bring tears to my eyes, but I wouldn't think twice about £2k on a well-specced TV. It's ironic, in a way, but I think justified given the versatility of a TV over a monitor.

As you say though, that push from the TV sector is going to put some really strong competition on the monitor makers. It's likely a win/win though, as the panels and technology will become more shared for the most part.

Bespoke manufacturers, the likes of Dell, Gigabyte, Asus, etc., will probably feel it more from general consumers, but they will likely still remain a firm favourite with professionals.

Samsung will be quids in though :ROFLMAO:
Samsung are just nailing it at the moment with some seriously price-competitive products that no one else is really matching.

Very, very clever. Despite the initial firmware issues, and early models quite literally being beta products, for the nerds out there OK with troubleshooting the benefits far outweigh the bugs. They are superb performers in the main for very little cash compared to the nearest competition (if there is any).
 

Martinr36

MOST VALUED CONTRIBUTOR
Well, I'm one of the ones who'll be sticking with monitors. I've got a 36" TV and that's plenty big enough for me as my monitor.
 

SpyderTracks

We love you Ukraine
Martinr36 said:
Well, I'm one of the ones who'll be sticking with monitors. I've got a 36" TV and that's plenty big enough for me as my monitor.
I'm the same, not particularly fussed by size (he says with a 48" monitor). I haven't had a TV since about 2003; I just stream everything on my computer.
 

TonyCarter

VALUED CONTRIBUTOR
65" LG OLED in the living room, 43" LG OLED in the bedroom, just need the 32" one for the computer room and I'll be sorted!

Strangely, I'm expecting the smallest one to be the most expensive...due to the way the panels are cut out of a large sheet, and the wastage for some sizes. My 43" CX was about the same price as the 65" CX (but I got a good discount on the 65" one).
 

Scott

Behold The Ford Mondeo
Moderator
*Snicker*

 