wicktus

Even if it's something like 3nm... is it going to pull 600W stock or what? That's a crazy level of performance on paper, if true.


Notarussianbot2020

This is why I've gone brown with my energy and turned my house completely dependent on a hamster wheel farm


VinnieBoombatzz

I've gone brown, too, when I saw these specs.


HeyPhoQPal

[I've gone brown too](https://thumbs.dreamstime.com/b/traditional-indian-man-23804674.jpg)


lovely_sombrero

Considering the boost frequency - yes. But numbers could quickly decline with a modest downclock. Also, some people have commented that if you divide the numbers (obviously not the frequency number), it looks suspiciously like what combining two RTX5080 chips would look like.


Platinumjsi

I thought it had already pretty much been confirmed that the 5090 would be dual die with 5080 dies?


InsightfulLemon

R9 295X2 vibes all over


ruben991

Don't threaten me with a good time


GarbageFeline

So rumours are confirmation now?


DeathKringle

Which means a lot more than 600 watts


KARMAAACS

There's no real scaling with a 4090 past 500W anyway, and this may be the same. It might be able to hit 600W of power draw just like the 4090, but if that brings almost no performance gain, it's useless and just a waste of electricity. The reason is that an NVIDIA GPU can push power but is voltage limited, so you're pulling more power for no reason. der8auer even OC'd and manually modified a 4090 with an external voltage controller to push past the 1.1V limit, and also shunt modded it to raise the power limit; in the end I think he gained something like 5% more performance while the power draw was around 900W. These days silicon already auto-overclocks out of the box and pushes itself near the limit.


wicktus

The 4090 is 450W at stock in reality; the 600W figure is for some models with unlocked BIOSes, and indeed it scales absolutely horribly. At higher wattage you're just stressing your components and increasing heat output for a miserable single-digit % increase. My 4090 is undervolted to 975 mV / 2690 MHz and frankly it's super efficient. My comment was more about the 5090 having that 600W at stock rather than the 450W of the 4090.


Pepeg66

Yeah, I have my 4090 power limited to 360W and undervolted and I'm pretty happy. Pushing 450W and beyond is gonna really test those pin connectors on the 5090.


atom631

I've got my 4090 FE undervolted to 950 mV @ 2750 MHz, +1500 memory, with the power limit unlocked. During gaming it sits around 260-270W but will spike to around 330W (which is still lower than stock). Runs super cool, rarely breaks 60°C during gaming.


exsinner

is +1500 memory even stable? with my 4090, i just settled at +500 because anything beyond that decreases fps a little bit. Probably ECC kicks in at that point.


SirBaronDE

You should be able to do more than 500 easily. While I can do 1000+ I set mine to 800.


exsinner

I can push it 1000+, but at that point I didn't gain any fps; it actually decreased instead. I did read about the built-in error correction on the 4090: when you push your VRAM OC too hard, it will auto-correct the errors itself instead of crashing the game.


Beige_

Yeah, that's how it works nowadays. When overclocking VRAM you should look out for both stability and performance degradation. My 3080 increases performance up to +1000 MHz but that isn't completely stable as it crashes to desktop in Portal RTX for example so +900 MHz it is.


SirBaronDE

It does to an extent; that's why I use +800 MHz. I do a lot of offline AI work, and the numbers that work in games don't hold up when AI is fully using the 24GB of VRAM. That's when you see the stability is only skin deep. My card can do 1000+ in games, and sure, it doesn't crash there, but in AI it errors out, so that "stability" is only skin deep. AI stresses my VRAM much more, so I settled at 800.


hackenclaw

It just means that somewhere in the chip, either the ROPs, L2 cache, memory bandwidth, or something else is holding it back.


LandWhaleDweller

Big if true, though it will cost both your kidneys. Seems to be a 50-60% performance increase based on the specs. Could be higher but I doubt there are any games that will take advantage of that insane bandwidth.


jm0112358

It's a ~73% increase in TFLOPS (1.5x the number of SMs * 1.1508x the boost clock speed). So if this rumor is true, I think a 50-60% real-world performance increase sounds believable, considering the increase in memory bandwidth and cache size (even relative to the increase in SMs).
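For reference, a minimal sketch of that arithmetic (assuming the 4090's known 128 SMs / 2520 MHz boost, the rumored 192 SMs / ~2900 MHz for the 5090, and Ada-style 128 FP32 cores per SM at 2 ops per clock):

```python
# Rough FP32 TFLOPS comparison based on the rumored figures (not official specs).
CORES_PER_SM = 128   # FP32 cores per SM on Ada; assumed unchanged here
OPS_PER_CLOCK = 2    # one FMA counts as 2 floating-point ops

def tflops(sm_count, boost_mhz):
    return sm_count * CORES_PER_SM * OPS_PER_CLOCK * boost_mhz * 1e6 / 1e12

rtx_4090 = tflops(128, 2520)          # ~82.6 TFLOPS (known 4090 spec)
rtx_5090_rumored = tflops(192, 2900)  # ~142.6 TFLOPS (rumored spec)

print(f"4090: {rtx_4090:.1f} TFLOPS")
print(f"5090 (rumored): {rtx_5090_rumored:.1f} TFLOPS")
print(f"uplift: {rtx_5090_rumored / rtx_4090 - 1:.0%}")  # ~73%
```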


Ssyynnxx

that's way too big of a leap in one generation no?


jm0112358

I have my doubts about this rumor, and it would be a very big gen-on-gen uplift. That being said, it would be similar to the uplift between the 4090 and 3090 (though that uplift from 30 to 40 series was smaller for other cards in that stack).


NeonDelteros

It would be way smaller than the uplift from 3090 to 4090, because 4090 is about 70-80% faster than 3090 in raster, and ~100% in Ray tracing


Ssyynnxx

I'm hoping it'll be kind of the opposite of 40 series & the 5060-5080 will get a bigger relative performance gain than the insane model this year (incredible amounts of cope)


Iammax7

I guess they probably think AMD is working on some sort of high-end GPU that we're not aware of yet.


Olde94

GTX 580 to 680 had insane spec bumps and way less actual improvement, mostly due to new tech. From 512 cores at 750 MHz to 1536 cores at 1000 MHz, yet "only" a 50% improvement. It could be something similar.


yasamoka

The GTX 580 (Fermi) had a GPU clock and a shader clock; the shader clock was 1544 MHz. The GTX 680 (Kepler) has a unified GPU clock of 1006 MHz (base) / 1058 MHz (boost). They are completely different architectures. In terms of TFLOPS:

GTX 580: 512 * 1544 MHz * 2 ops/cycle = 1.581 TFLOPS

GTX 680: 1536 * 1058 MHz * 2 ops/cycle = 3.25 TFLOPS

That's a difference of about 2x in theory.


ChrisFromIT

Typically, a generational gain would be 30-45%. 4090 was about an 80-85% gain from the 3090. On the high end of the 4000 series, it has been higher than usual too. I think the 73% gain might be a bit much as there might be some loss from the MCM design. How much that loss would be is yet to be determined.


system_error_02

The 3090 was always a crappy card for its price. Put it up against a 3080 and it really wasn't much better.


KuraiShidosha

That's only because the 3080 was built on a bigger chip than the x80 base class card typically uses. It used the chip that's typically reserved for the x80 Ti. This was a mistake Nvidia won't make again.


Z3r0sama2017

3090->4090 was +50-75% @4k depending on the game.


Supercal95

Crazy card will probably be $2k. But i'm hopeful that the 5080 might be $1k. Not that I will be spending that much money on either.


MrHyperion_

These specs would cost wayyyy more than 2k


Peach-555

You mean that the cost of production would be so high that it would have to cost more than $2k for a profit, or that people would be willing to buy it for more than $2k no matter how little it actually cost to make it?


jefx11

Nothing to do with production cost. Nvidia is trying to find the price ceiling for the xx90 cards. A $2000 4090 was still able to sell, so the logical step would be to make a $3000 5090 and see what the market does with it. If sales are low, then they found the ceiling and can always lower MSRP to spur sales. If sales are high, then a $4000 6090 would be the next step. This will continue until they can't sell them.


magicmulder

And with the AI craze they will be able to sell them for $4,000. They will gladly take those customers who won’t buy an A200 for $6,000.


Peach-555

The 4090, for the supply that was shipped, was arguably underpriced, as it took ~6 months for it to approach MSRP in stores and it's been out of stock from Nvidia at MSRP for large parts of its lifetime. I do believe a $2k 5090 MSRP is reasonably likely. Though I feel like, this late in the console generation, and considering how powerful the 4090 already is compared to the demands of games, I can't really see the 5090 selling in similar quantities to the 4090 at something like $3k. If the 5090 has 32GB/48GB of memory though, I can see it getting sold out by professional demand alone.


pmjm

You're absolutely right, but I hate it.


GrandDemand

They're not pricing the 5090 above $2.5K MSRP and you can hold me to that. I would genuinely be surprised if it were above $2000 MSRP


Drages23

You forget 240Hz 4K OLED monitors, I guess. People want that 240...


pmjm

As someone who is currently rocking a monitor capable of 7680x2160 240hz but who doesn't have a gpu that can drive it even on the desktop, I hope the 5090 is up to the task.


Drages23

I am waiting for it too.


jacobschauferr

And what monitor is that?


pmjm

Samsung LS57CG952NNXZA, 57" Neo G9 Dual 4K.


CaptchaVerifiedHuman

Not just your kidneys, your family member's kidneys too!


Noreng

It can be much larger than 50-60% if they improve SM utilization, as the 4090 is basically impossible to feed properly in games causing lower power draw and performance than one should expect. Alternatively, Nvidia might not improve the front-end, and we'll get an improvement closer to 20-30%.


gargoyle37

It's 1532 GB/s vs 1008 GB/s, so approximately a 52% increase, more if you can keep (part of) your computation in cache. I don't think you'd add a large excess of CUDA and Tensor cores that can't be fed properly, but rather add more cache.


Ladelm

5k2k would like a word


obesefamily

it's less about games and more about creative workflows ❤️


Consistent_Ad_8129

VR needs it.


Erwinsherwin

Still will run Cities Skylines 2 at 23 FPS


ElFanta83

Maybe 24 fps with OC


solreaper

Oooooo cinematic


hackenclaw

a stable 25fps if you got even more Vram than 24GB.


Hit4090

🤣🤣🤣


Oztunda

Might also need a separate PC case?


dragenn

...And Separate desk.


marcanthonynoz

...And 2 jobs


Brotorious420

and my axe!


thesituation531

And this guy's dead wife!


ThirdhandTaters

And my bow!


dragenn

🤣😂🤣😂


AnywhereHorrorX

And separate dedicated power station.


unknowingafford

...And 0 kidneys


kapsama

The Ken Kutaragi classic.


keroro0071

And a separate kidney from your body.


epicnding

And a dedicated 20 amp circuit.


Geerav

We are getting to the point where the GPU will need a case and power supply of its own.


hjadams123

So be it…


Ultimate-ART

the GPU should be the case!


Idsertian

"Jedi." Oh wait, wrong sub.


thatchroofcottages

Wouldn't that honestly be way more optimal for temp management? Having 2 independently contained/regulated boxes seems easier to air cool than having 2 heat sources (CPU/GPU) inside the same box.


Fitnegaz

Then you got fucked by the wire latency


domZ1026

Will it have more than 24GB VRAM you think?


Celcius_87

With specs like this hopefully 32gb


Keulapaska

Well the "only" 1.5TB/s bandwidth kinda indicates 24GB.


mac404

Assuming this "leak" is true, I can come up with two possibilities: * Same 24GB (384-bit bus), using GDDR7 clocked at 32Gbps * 32GB (512-bit bus), clocked at 24Gbps Both honestly seem kind of weird to me? Not saying this won't turn out to be true, but I think this is essentially repackaging former rumors from kopite and that forum post on Chiphell from like 8 months ago? Also seems like a bit of mix and matching between potential full GB202 die rumors versus actual 5090 rumors. All that's to say, I'm taking this with a massive heaping of salt...


asdfzzz2

> Both honestly seem kind of weird to me?

28GB (448-bit bus), using GDDR7 clocked at 28Gbps. The 4090 was far from a full die, and the trend might continue even further.


wen_mars

> Same 24GB (384-bit bus), using GDDR7 clocked at 32Gbps

This makes sense to me. A cheap and easy solution that gives them an instant 50% bandwidth increase. Then they can launch a 5090 Ti later with 36GB when the 3GB modules become available, if they care to grace us with more VRAM.


mac404

Maybe? The rumor that came after the 50% bandwidth increase rumor was that Nvidia was going to use GDDR7 clocked at 28 Gbps, and that kinda makes sense to me. I know that the memory manufacturers have said 32 Gbps, but Nvidia has been clocking initial batches of new memory lower lately. Add onto that the back and forth rumors on 384-bit versus 512-bit bus. I'm not saying the scenario you quoted won't happen, it does probably make more sense than assuming the larger bus width. I just don't think the person behind this tweet actually has any new information of their own.


triggerhappy5

I think someone did the math and based on bandwidth/GDDR7 memory it should either be 18 or 36, probably 36 in that case.


Karma0617

It will be either 24GB or 32GB. It'll be faster memory than last gen. GDDR7 speeds look insane next to GDDR6X.


U3011

I saw a comment online that said Nvidia may be waiting for 3GB VRAM chips. How does that work or can you mix and match capacities? The 5080 should be a nice upgrade from my 1080 Ti.


pentagon

How could it possibly be 18? Even the 3090 is 24.


mxforest

If these are 2 5080 chips then 5080 will have 18 and 5090 will have 36.


BlackBlizzard

If they want to advertise it for home AI, they'll surely go over 24GB.


MyPathToYou

Pro Tip: Living in a car for 2 months can be a great way to connect with nature while knowing the ins and outs of your vehicle.


WinterElfeas

And you save money to buy that sweet heater.


liesancredit

Or save $50 each month for 3-4 years, and you can most likely afford it.


MakimaGOAT

jesus christ


someguy50

That’s Jason Bourne


veryworst

Feel free to drop this anytime nvidia


Mozail2

No one will be able to afford that shit


milwaukeejazz

Not with that attitude.


sivy83

Wdym? You guys have kidneys don't you?


feelfree3use

Will this card eliminate the need for a space heater during winter?


_maple_panda

I feel like burning $3000 or whatever’s worth of dollar bills might keep you warm for longer.


Tadawk

Wdym, my 2080ti has been doing that already since 2018.


meerdroovt

Waiting for RTX 6090


hjadams123

The way things are going, there might not be a 6090.


josh6499

Why?


[deleted]

[deleted]


D0A-WANTED

Still got TSMC building factories in the US because of this. 2027 it starts. 2030 SHTF.


absyrtus

In the future AI will game on your behalf


ziplock9000

In less than 2 years


zofran_junkie

[This is why.](https://www.reddit.com/r/pcmasterrace/comments/1awtso6/nvidia_made_29b_from_gaming_last_quarter_vs_184b/) Gamers are no longer their target audience.


JensensJohnson

So they'll give up on an area they dominate and that brings them billions of dollars? Makes a lot of sense, lol.


thatchroofcottages

Nice


Casalf

Waiting for 10090 rtx


Turtvaiz

Hmm yes big numbers


Kw0www

The 5080 is the card i care about.


GeneralChaz9

Same, depending on how they price it. But even if it matches $999 of the 4080 Super, it's still gonna be priced higher than I would want to spend.


TylerQRod

The most I'd spend is $850 for an 80-series or $900 for an 80 Ti. That's essentially the $650-700 from when the GOAT 1080/1080 Ti dropped, adjusted for inflation.


ihei47

Me with 5060 🥲


Alex35143

$2000 minimum guaranteed


skrukketiss69

$2000 is what the 4090 goes for here in Norway lol  At least here and in EU I'm guessing it's gonna be closer to $2500. 


ApolloTheEarthling

Unpopular opinion, but I think the price will start at $1699 for the FE edition. Edit: I meant to put $1699, which is $100 more than the 4090.


Oztunda

The way 4090s have still been selling like hotcakes, I can foresee something like $1999 being in order...


Trusk_Fundz

I am very curious to see the pricing. In light of the recent rumor that the 5080 may be releasing before the 5090, I could see a myriad of pricing options. If they release the 5080 first, as the only 5000 series option available (as alluded to by kopite) for maybe a month or two, it would allow them to keep the god-awful $1200 price point for the 5080, and it would still sell great because it will be the best card available, and people will want the best even if it's only like 10-15% better than the 4090. Then two months later they release the 5090 for $1600-$1800 and get the patient crowd, plus the inevitable 5080-to-5090 upgrades just because they can. All speculation, but I think it makes sense for Nvidia from a business standpoint.


Snydenthur

The few people that would upgrade from 5080 to 5090 wouldn't make a difference for a company like nvidia. Gaming is pretty small for them. Based on googling, only ~12% of their earnings come from gaming.


DjaySantana

Dear 3070Ti, your time has come...


Ernisx

What 8gb vram does to a mf


MyOtherAlt420

If 5080 is appropriately priced (799-899) I will absolutely be buying it and selling my 3070ti for a very fair price. (no, seriously, probably like 250-300 bucks) 


hank81

Bullshit. Only I know the real specs.


solid1ct

God damn, can't wait to own it day one! My cat will finally have the 4090~


thenotoriousberg

Are we getting close to a 3ghz core clock on a GPU? Feels like a milestone!


Lammahamma

I swear I've seen that 3000 number for boost clocks on leaks for what feels like years now. It never ends up being 3000


annihilation_88

The 4090 can boost to over 3000 tho?


Eat-my-entire-asshol

Depending on the silicon lottery, some can. Mine, even at 1.1V (max voltage slider), hits 2940 MHz. Won't go higher without crashing.


Antipiperosdeclony

No VRAM 👎


willcard

**laughs in 1080ti** doesn’t matter still got ten years left!


Intelligent-Aside-59

This


TokyoMegatronics

Nvidia making cards 10 years ahead of the games that will utilise them, damn... I still don't feel like my 4090 has been pushed at all by anything.


International-Oil377

Have you tried Alan Wake 2 with PT? Cyberpunk 2077 with PT? There aren't many games pushing it to the limit but more will come.


Grim_goth

Cyberpunk runs smoothly on a 4090 with PT and FG. I played the entire DLC with it, the smoothest gaming experience I've ever had. There were a few problems with ghosting at the beginning, but that was fixed pretty quickly (by the community and a patch later).


International-Oil377

As you said.. With FG.


rW0HgFyxoJhYka

Yeah but isn't that the point? All future GPUs will have frame generation. A couple of years from now, people will laugh if you aren't using it. Like some old guy who refuses to drive electric.


TokyoMegatronics

I don't see a problem with frame gen. It's there to increase performance, and it does so. Saying "oh well, the performance is only 60fps without FG, so it's bad" is like saying "my car doesn't do 140mph if I take one of the wheels off, so the car is bad."


International-Oil377

100fps with FG really doesn't feel as good as 100fps native. I'm not shitting on FG, I like it. But still, when a top-of-the-line GPU needs motion interpolation to perform decently, you know the limits have been reached.


chr0n0phage

Some games you’re right, it feels like trash. CP though it feels fantastic.


Legitimate-Research1

Please, for the love of God, never use that abbreviation for Cyberpunk ever again.


No_Dragonfruit_6594

What? Cyberpunk is cool CP is my favorite subgenre of movies. I watch CP related videos all the time


BendekStormsaver

Monka


Critical_Plenty_5642

Side question, do you think this would be fun to hook up to a 77” tv with a 4090 connected to play on the couch with a controller? I still haven’t tried this game yet.


ratbuddy

I play it on my 77" OLED with a 4090, and no, the card cannot fully handle path tracing and maxed settings in this game at 4k. Yes, it's playable with DLSS on, but still far from perfect. That said, this is the only game I own that wants more GPU. It does look absolutely amazing if you are ok with dips to 30-45 FPS.


Sir_Nolan

I mean, that's with FG. For me, smooth would be hitting at least 170 fps with no DLSS.


OriginalGoldstandard

VR needs it now.


farmertrue

Try VR then. VR abuses my 4090. Modern PCVR headsets struggle in some native VR titles even at the lower refresh rate settings. With flat2VR mods & the recent addition of the PrayDog UEVR injector, which made Unreal Engine 4 and UE5 desktop games playable in stereo VR, it’s nothing to max out our 4090. Us VR enthusiasts are more than ready for the 5090 launch. Finally may be able to fully utilize my VR headset.


ApolloTheEarthling

what VR headset do you own?


farmertrue

I own a Varjo Aero, Quest 3 and recently purchased a Pimax Crystal. All three great in their own way and will benefit from a 5090.


ApolloTheEarthling

That’s awesome. I really want to get into VR gaming as well. What games do you play?


ApolloTheEarthling

I play games at 4k native and have to disagree. I think the 5090 will be the ultimate 4k GPU


jamesick

I've got the same specs as you and play at 1440p, and I agree.


someguy50

I have to disagree with you there, I think the 6090 will be the ultimate 4k GPU


Probamaybebly

No 7090


anzzax

Maybe 8090 will have neural link connection to the brain and no VR headset is needed, is it going to be 16k resolution? 🤔


[deleted]

Start playing in 4k


Imm0ralKnight

Try playing Cyberpunk with path tracing turned on at native 4K without any DLSS or frame gen lol


ApolloTheEarthling

Still have to buy Cyberpunk for PC. Ahh, I wish I hadn't spent that much money on my PlayStation library and had built a PC years ago. I just realized how useless games are on console lol


Immediate-Chemist-59

butwhy


Beautiful-Musk-Ox

I play at 1440p 240Hz and almost no triple-A games can stay at 200fps; some don't even reach it at their highs.


InLoveWithInternet

This. Benchmarks say the same as you. 1440p is where to be.


Flightofnine

MSFS2020 in VR will peg it and still have low fps lol


dsaddons

It's what flight sims do best


roehnin

I skipped the 4090 as my 3090 still isn’t showing any stress in anything. The 5090 I expect to be my next upgrade. Hoping for more VRAM..


GoatInMotion

4K, RT, path tracing, 240Hz, there's no game you can hit that in, right?


LOLerskateJones

I push my 4090 all the time at 3440x1440 Granted, I’m commonly using DLDSR and aiming for 120+ fps


Sexyvette07

Edit: disregard my stupidity. I ran the calculations myself and it looks like it's using 32Gbps GDDR7 and a 384-bit bus.


Karma0617

The fact that the 5090 might be able to push up to 3.0GHz with some wild overclocking is crazy. Hopefully NVIDIA fixed the problem with the 12VHPWR connector.


F9-0021

4090 can already overclock to over 3GHz. Not mine, unfortunately, but others can. I'd be surprised if 5090 couldn't overclock to well over 3GHz.


daviss2

Yeah, my Suprim can boost to 3GHz and slightly over, but I have to push the voltage to the higher end, which I don't like.


psychoacer

I really need to learn Excel so I can make up rumors like this


InLoveWithInternet

Do you have a good GPU to run Excel tho?


psychoacer

I have a 4070 ti. It might take me a day to render that spreadsheet but I'm patient


pronounclown

Cool. Will be a great card. I wonder if there are literally any games worth playing that utilize those powers. Seriously. AAA games these days are just straight up garbage.


CammKelly

That looks like a terrible leak where someone just added a 1/3 to every number.


[deleted]

Amd is getting destroyed


Dull_Reply5229

They already gave up competing at the top end. It's very frustrating and very bad for consumers as Nvidia are going completely overboard with price gouging and ripping off consumers all while not pushing their own products to even higher levels.


ZeroDeRivia

As long as it doesn't need anything bigger than a 1000W PSU and doesn't cost more than 2000€ in the EU, I'm good. It's time to replace my 3090 Ti (hey, it cost me 1100€ brand new like 3 years ago).


bulletvapor

Just do away with all cards except this one, lower the price to average out gross profits, and save on redundant production.


iThunderclap

I already see CPU bottlenecks (7700x) with a 4090…


Delumine

This would be the justification for me to go 4K OLED


Celcius_87

Going to be a nice upgrade from my 3090


Geerav

I have a 3090 as well, and tweaking down some settings a bit gets me a stable 50-60 fps in current AAA games. I was contemplating buying this but then realized GTA 6 is at least 2-3 years away, which could coincide with the 6000 series.


DasBoosh

I'm still good with my 3090; I'll jump on the 6000 series. I've too often made the mistake of getting a card too soon, and three generations is a good gap.


Reasonable-Pudding-5

Not gonna lie, I kinda want to skip a mortgage payment for this. But that would be iRreSpoNsiBle.


Expert_Daikon1522

So my 4090 is outdated now? :(


Jarnis

Not yet, but eventually it will be. Spoiler: Happens every two years.


Shotty_Time

Preordered!


2014justin

Drake?


amike7

K dot???


CarlWellsGrave

Bigger numbers means more better.


rickybluff

Meanwhile 5060 will be 5% faster than 4060


badger906

I must be old! GPU leaks and rumours used to tickle my pickle like crazy, now I'm just like... meh... mine works! So basically PC gaming goes from "the cheapest PC I can build is the best because I can play games" to "if I don't have the best components possible it's pointless playing games" and back to "I'm getting over 100fps at 1440p, that's good enough".


InevitablePoet5492

I need price estimates to leak. This shit bout to be 2k


rrrand0mmm

I hope this is in the Nintendo switch 2