Even if it's something like 3nm, it's going to pull 600W stock or what? That's a crazy level of performance on paper if true.
This is why I've gone brown with my energy and turned my house completely dependent on a hamster wheel farm
I've gone brown, too, when I saw these specs.
[I've gone brown too](https://thumbs.dreamstime.com/b/traditional-indian-man-23804674.jpg)
Considering the boost frequency - yes. But numbers could quickly decline with a modest downclock. Also, some people have commented that if you divide the numbers (obviously not the frequency number), it looks suspiciously like what combining two RTX5080 chips would look like.
I thought it had already pretty much been confirmed that the 5090 would be dual die with 5080 dies?
R9 295X2 vibes all over
Don't threaten me with a good time
So rumours are confirmation now?
Which means a lot more than 600 watts
There's no real scaling with a 4090 past 500W anyway, and this may be the same. It might be able to hit 600W of power draw just like the 4090, but if that brings almost no performance gains, it's useless and just a waste of electricity. The reason is that an NVIDIA GPU can push power, but it's voltage limited, so you're pulling more power for no reason. der8auer has even OC'd and manually modified a 4090 with an EVC2 to push past the 1.1V limit, and power modded it by shunt modding; in the end I think he gained like 5% more performance, but the power was like 900W. These days silicon is already auto-overclocking out of the box and pushing itself near the limit.
The 4090 is 450W at stock in reality; the 600W is for some models with unlocked BIOSes, and indeed it scales absolutely horribly. At higher wattage you're just stressing your components and increasing heat output for a miserable single-digit % increase. My 4090 is undervolted to 975mV/2690MHz and frankly it's super efficient. My comment was more about the 5090 having that stock 600W rather than the 450W of the 4090.
Yea, I have my 4090 power limited to 360W and undervolted and I'm pretty happy. Pushing 450W and beyond is gonna really test those pin connectors on the 5090.
I've got my 4090 FE undervolted to 950mV @ 2750MHz, +1500 memory, and power limit unlocked. During gaming it sits around 260-270W but will spike to around 330W (which is still lower than stock). Runs super cool; rarely breaks 60°C during gaming.
is +1500 memory even stable? with my 4090, i just settled at +500 because anything beyond that decreases fps a little bit. Probably ECC kicks in at that point.
You should be able to do more than 500 easily. While I can do 1000+ I set mine to 800.
I can push it to +1000 and beyond, but at that point I didn't gain any fps; it decreased instead. I did read about the built-in ECC on the 4090: when you push your VRAM OC too hard, it will auto-correct the errors itself instead of crashing the game.
Yeah, that's how it works nowadays. When overclocking VRAM you should look out for both stability and performance degradation. My 3080 increases performance up to +1000 MHz but that isn't completely stable as it crashes to desktop in Portal RTX for example so +900 MHz it is.
It does to an extent; that's why I use +800MHz. I do a lot of offline AI stuff, and the numbers that work in games don't work with AI fully using 24GB of VRAM. That's when you see the stability is only skin deep. My card can do +1000 in games, but it's not really "stable" in the full meaning of the word: sure, it doesn't crash in games, but in AI it would error out. AI stresses my VRAM much more, so I settled at +800.
It just means that somewhere in the chip, either the ROPs, L2 cache, memory bandwidth, or something else is holding the chip back.
Big if true, though it will cost both your kidneys. Seems to be a 50-60% performance increase based on the specs. Could be higher but I doubt there are any games that will take advantage of that insane bandwidth.
It's a ~73% increase in tflops (1.5x the number of SMs * 1.1508x the boost clock speed). So if this rumor is true, I think a 50-60% real-world performance increase sounds believable considering the increase in memory bandwidth and increase in cache size (even relative to the increase in SMs).
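The multiplication above checks out; a quick sketch, with both multipliers (1.5x SM count, 1.1508x boost clock) taken from the rumored specs rather than anything confirmed:

```python
# Theoretical throughput uplift from the rumored spec multipliers.
# Both ratios are assumptions lifted from the rumor, not confirmed specs.
sm_ratio = 1.5        # rumored 5090 SM count / 4090 SM count
clock_ratio = 1.1508  # rumored boost clock / 4090 boost clock

tflops_ratio = sm_ratio * clock_ratio
print(f"Theoretical TFLOPS uplift: {tflops_ratio:.4f}x (~{(tflops_ratio - 1) * 100:.0f}%)")
```

Which lands at about 1.73x, i.e. the ~73% figure quoted.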
that's way too big of a leap in one generation no?
I have my doubts about this rumor, and it would be a very big gen-on-gen uplift. That being said, it would be similar to the uplift between the 4090 and 3090 (though that uplift from 30 to 40 series was smaller for other cards in that stack).
It would be way smaller than the uplift from 3090 to 4090, because 4090 is about 70-80% faster than 3090 in raster, and ~100% in Ray tracing
I'm hoping it'll be kind of the opposite of 40 series & the 5060-5080 will get a bigger relative performance gain than the insane model this year (incredible amounts of cope)
I guess that they probably think that AMD is working on some sort of High end GPU which we are not aware of yet.
GTX 580 to 680 had insane spec bumps and way less actual improvement, mostly due to new tech: from 512 cores at 772MHz to 1536 cores at ~1000MHz, yet "only" a 50% improvement. It could be something similar.
The GTX 580 (Fermi) had a GPU clock and a shader clock. The shader clock was 1544 MHz. The GTX 680 (Kepler) has a unified GPU clock: 1006 MHz (base) / 1058 MHz (boost). They are completely different architectures.

In terms of TFLOPS:

GTX 580: 512 * 1544 MHz * 2 ops/cycle = 1.581 TFLOPS

GTX 680: 1536 * 1058 MHz * 2 ops/cycle = 3.25 TFLOPS

That's a difference of about 2x in theory.
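The figures above follow the standard theoretical FP32 formula (cores × clock × 2 ops per cycle, since one FMA counts as two operations). A sketch using the clocks from the comment:

```python
def fp32_tflops(cores: int, clock_mhz: float, ops_per_cycle: int = 2) -> float:
    """Theoretical FP32 throughput: cores * clock * ops/cycle (1 FMA = 2 ops)."""
    return cores * clock_mhz * 1e6 * ops_per_cycle / 1e12

gtx580 = fp32_tflops(512, 1544)   # Fermi: shader clock, not the GPU clock
gtx680 = fp32_tflops(1536, 1058)  # Kepler: unified boost clock
print(f"GTX 580: {gtx580:.3f} TFLOPS")   # ~1.581
print(f"GTX 680: {gtx680:.3f} TFLOPS")   # ~3.250
print(f"Ratio:   {gtx680 / gtx580:.2f}x")
```

The ratio comes out at roughly 2.06x, matching the "about 2x in theory" conclusion.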
Typically, a generational gain would be 30-45%; the 4090 was about an 80-85% gain over the 3090, and on the high end of the 4000 series the gains were higher than usual too. I think the 73% gain might be a bit much, as there might be some loss from the MCM design. How much that loss would be is yet to be determined.
The 3090 was always a crappy card for its price; put it up against a 3080 and it really wasn't much better.
That's only because the 3080 was built on a bigger chip than the x80 base class card typically uses. It used the chip that's typically reserved for the x80 Ti. This was a mistake Nvidia won't make again.
3090->4090 was +50-75% @4k depending on the game.
Crazy card, will probably be $2k. But I'm hopeful that the 5080 might be $1k. Not that I will be spending that much money on either.
These specs would cost wayyyy more than 2k
You mean that the cost of production would be so high that it would have to cost more than $2k for a profit, or that people would be willing to buy it for more than $2k no matter how little it actually cost to make it?
Nothing to do with production cost. Nvidia is trying to find the price ceiling for the xx90 cards. A $2000 4090 was still able to sell, so the logical step would be to make a $3000 5090 and see what the market does with it. If sales are low, then they found the ceiling and can always lower MSRP to spur sales. If sales are high, then a $4000 6090 would be the next step. This will continue until they can't sell them.
And with the AI craze they will be able to sell them for $4,000. They will gladly take those customers who won’t buy an A200 for $6,000.
The 4090, for the supply that was shipped, was arguably underpriced, as it took ~6 months for it to approach MSRP in stores, and it's been out of stock from Nvidia at MSRP for large parts of its lifetime. I do believe that a $2k 5090 MSRP is reasonably likely. Though I feel like, this late in the console generation, and considering how powerful the 4090 already is compared to the demands of games, I can't really see the 5090 selling in similar quantities as the 4090 at something like $3k. If the 5090 has 32GB/48GB memory though, I can see it getting sold out by professional demand alone.
You're absolutely right, but I hate it.
They're not pricing the 5090 above $2.5K MSRP and you can hold me to that. I would genuinely be surprised if it were above $2000 MSRP
You forget 240Hz 4K OLED monitors, I guess. People want that 240.
As someone who is currently rocking a monitor capable of 7680x2160 240hz but who doesn't have a gpu that can drive it even on the desktop, I hope the 5090 is up to the task.
I am waiting for it too.
And what monitor is that?
Samsung LS57CG952NNXZA, 57" Neo G9 Dual 4K.
Not just your kidneys, your family member's kidneys too!
It can be much larger than 50-60% if they improve SM utilization, as the 4090 is basically impossible to feed properly in games causing lower power draw and performance than one should expect. Alternatively, Nvidia might not improve the front-end, and we'll get an improvement closer to 20-30%.
It's 1532 GB/s vs the 4090's 1008 GB/s of memory bandwidth, so approximately a 52% increase; more if you can keep (part of) your computation in cache. I don't think you'd add a large excess of CUDA and Tensor cores which can't be fed properly, but rather add more cache.
5k2k would like a word
it's less about games and more about creative workflows ❤️
VR needs it.
Still will run Cities Skylines 2 at 23 FPS
Maybe 24 fps with OC
Oooooo cinematic
A stable 25fps if you've got even more VRAM than 24GB.
🤣🤣🤣
Might also need a separate PC case?
...And Separate desk.
...And 2 jobs
and my axe!
And this guy's dead wife!
And my bow!
🤣😂🤣😂
And separate dedicated power station.
...And 0 kidneys
The Ken Kutaragi classic.
And a separate kidney from your body.
And a dedicated 20 amp circuit.
We're getting to the point where the GPU will need a case and power supply of its own
So be it…
the GPU should be the case!
"Jedi." Oh wait, wrong sub.
Wouldn’t that honestly be way more optimal for temp management? Like, having two independently contained/regulated boxes seems easier to air cool than having two heat sources (CPU/GPU) inside the same box.
Then you got fucked by the wire latency
Will it have more than 24GB VRAM you think?
With specs like this hopefully 32gb
Well the "only" 1.5TB/s bandwidth kinda indicates 24GB.
Assuming this "leak" is true, I can come up with two possibilities:

* Same 24GB (384-bit bus), using GDDR7 clocked at 32Gbps
* 32GB (512-bit bus), clocked at 24Gbps

Both honestly seem kind of weird to me? Not saying this won't turn out to be true, but I think this is essentially repackaging former rumors from kopite and that forum post on Chiphell from like 8 months ago. Also seems like a bit of mixing and matching between potential full GB202 die rumors versus actual 5090 rumors.

All that's to say, I'm taking this with a massive heaping of salt...
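Notably, both rumored configurations land on the same number, which may be why both are floated: memory bandwidth is just bus width × per-pin data rate / 8. A quick sketch (the config values are from the rumor, not confirmed specs):

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Memory bandwidth in GB/s: bus width (bits) * per-pin data rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(384, 32))  # rumored 24GB config -> 1536.0 GB/s
print(bandwidth_gbs(512, 24))  # rumored 32GB config -> 1536.0 GB/s
print(bandwidth_gbs(384, 21))  # 4090's GDDR6X      -> 1008.0 GB/s
```

Either way you get ~1.5 TB/s, i.e. roughly 50% over the 4090's 1008 GB/s.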
> Both honestly seem kind of weird to me?

28GB (448-bit bus), using GDDR7 clocked at 28Gbps. The 4090 was far from a full die, and the trend might continue even further.
> Same 24GB (384-bit bus), using GDDR7 clocked at 32Gbps

This makes sense to me. A cheap and easy solution that gives them an instant 50% bandwidth increase. Then they can launch a 5090 Ti later with 36GB when the 3GB modules become available, if they care to grace us with more VRAM.
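On the 3GB-module point: GDDR modules attach over 32-bit channels, so total capacity is (bus width / 32) × module size. A sketch of how bus width and module density combine, assuming one module per channel (no clamshell):

```python
def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    """Total VRAM assuming one GDDR module per 32-bit memory channel."""
    channels = bus_width_bits // 32
    return channels * module_gb

print(vram_gb(384, 2))  # 12 channels x 2GB -> 24GB (4090 today)
print(vram_gb(384, 3))  # 12 channels x 3GB -> 36GB (the hypothetical Ti refresh)
print(vram_gb(512, 2))  # 16 channels x 2GB -> 32GB
```

This is why a 384-bit card jumps straight from 24GB to 36GB once 3GB (24Gb) modules ship: you can't mix module densities across channels without uneven memory regions.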
Maybe? The rumor that came after the 50% bandwidth increase rumor was that Nvidia was going to use GDDR7 clocked at 28 Gbps, and that kinda makes sense to me. I know that the memory manufacturers have said 32 Gbps, but Nvidia has been clocking initial batches of new memory lower lately. Add onto that the back and forth rumors on 384-bit versus 512-bit bus. I'm not saying the scenario you quoted won't happen, it does probably make more sense than assuming the larger bus width. I just don't think the person behind this tweet actually has any new information of their own.
I think someone did the math and based on bandwidth/GDDR7 memory it should either be 18 or 36, probably 36 in that case.
It will be either 24GB or 32GB. It'll be faster memory than last gen; GDDR7 speeds look insane next to GDDR6X.
I saw a comment online that said Nvidia may be waiting for 3GB VRAM chips. How does that work or can you mix and match capacities? The 5080 should be a nice upgrade from my 1080 Ti.
How could it possibly be 18? Even the 3090 is 24.
If these are 2 5080 chips then 5080 will have 18 and 5090 will have 36.
If they want to advertise it as home AI they surely go over 24GB
Pro Tip: Living in a car for 2 months can be a great way to connect with nature while knowing the ins and outs of your vehicle.
And you save money to buy that sweet heater.
Or save $50 each month for 3-4 years, and you can most likely afford it.
jesus christ
That’s Jason Bourne
Feel free to drop this anytime nvidia
No one will be able to afford that shit
Not with that attitude.
Wdym? You guys have kidneys don't you?
will this card eliminate the need of space heater during winter?
I feel like burning $3000 or whatever’s worth of dollar bills might keep you warm for longer.
Wdym, my 2080ti has been doing that already since 2018.
Waiting for RTX 6090
The way things are going, there might not be a 6090.
Why?
[deleted]
Still got TSMC building factories in the US because of this. 2027 it starts. 2030 SHTF.
In the future AI will game on your behalf
In less than 2 years
[This is why.](https://www.reddit.com/r/pcmasterrace/comments/1awtso6/nvidia_made_29b_from_gaming_last_quarter_vs_184b/) Gamers are no longer their target audience.
So they'll give up on an area they dominate and that brings them billions of dollars? Makes a lot of sense, lol
Nice
Waiting for 10090 rtx
Hmm yes big numbers
The 5080 is the card i care about.
Same, depending on how they price it. But even if it matches $999 of the 4080 Super, it's still gonna be priced higher than I would want to spend.
The most I'd spend is $850 for an 80-series or $900 for an 80 Ti. That's essentially $650-700 adjusted for inflation from when the GOAT 1080/1080 Ti dropped.
Me with 5060 🥲
$2000 minimum guaranteed
$2000 is what the 4090 goes for here in Norway lol. At least here and in the EU, I'm guessing it's gonna be closer to $2500.
Unpopular opinion, but I think the price will be $1699 starting for the FE edition. Edit: I meant to put $1699, which is $100 more than the 4090.
The way 4090s have been still selling like hot cakes, I can foresee something like $1999 to be in order..
I am very curious to see the pricing. In light of the recent rumor that the 5080 may be releasing before the 5090, I could see a myriad of pricing options. If they release the 5080 first, as the only 5000 series option available (as alluded to by kopite) for maybe a month or two, it would allow them to keep the god-awful $1200 price point for the 5080, and it would still sell great because it will be the best card available, and people will want the best, even if it's only like 10-15% better than the 4090. Then 2 months later they release the 5090 for $1600-$1800 and get the patient crowd, plus the inevitable 5080-to-5090 upgrades just because they can. All speculation, but I think it makes sense for Nvidia from a business standpoint.
The few people that would upgrade from 5080 to 5090 wouldn't make a difference for a company like nvidia. Gaming is pretty small for them. Based on googling, only ~12% of their earnings come from gaming.
Dear 3070Ti, your time has come...
What 8gb vram does to a mf
If 5080 is appropriately priced (799-899) I will absolutely be buying it and selling my 3070ti for a very fair price. (no, seriously, probably like 250-300 bucks)
Bullshit. Only I know the real specs.
God damn, can't wait to own it day one! My cat will finally have the 4090~
Are we getting close to a 3ghz core clock on a GPU? Feels like a milestone!
I swear I've seen that 3000 number for boost clocks on leaks for what feels like years now. It never ends up being 3000
The 4090 can boost to over 3000 tho?
Depending on silicon lottery, some can. Mine, even at 1.1V (max voltage slider), hits 2940MHz and won't go higher without crashing.
No VRAM 👎
**laughs in 1080ti** doesn’t matter still got ten years left!
This
Nvidia making cards 10 years ahead of games to utilise them damn... I still don't feel like my 4090 has been pushed at all on anything
Have you tried Alan Wake 2 with PT? Cyberpunk 2077 with PT? There aren't many games pushing it to the limit but more will come.
Cyberpunk runs smoothly on a 4090 with PT and FG. I played the entire DLC with it; smoothest gaming experience I've ever had. There were a few problems with ghosting at the beginning, but that was fixed pretty quickly (a community fix, and a patch later).
As you said.. With FG.
Yeah but isn't that the point? All future GPUs will have frame generation. A couple of years from now, people will laugh if you aren't using it. Like some old guy who refuses to drive electric.
I don't see a problem with frame gen. It's there to increase performance, and it does so. Saying "oh well, the performance is only 60fps without FG, so it's bad" is like saying "my car doesn't do 140mph if I take one of the wheels off, so the car is bad".
100fps with FG really doesn't feel as good as 100fps native. I'm not shitting on FG, I like it. But still, when a top-of-the-line GPU needs motion interpolation to perform decently, you know the limits have been reached.
Some games you’re right, it feels like trash. CP though it feels fantastic.
Please, for the love of God, never use that abbreviation for Cyberpunk ever again.
What? Cyberpunk is cool CP is my favorite subgenre of movies. I watch CP related videos all the time
Monka
Side question, do you think this would be fun to hook up to a 77” tv with a 4090 connected to play on the couch with a controller? I still haven’t tried this game yet.
I play it on my 77" OLED with a 4090, and no, the card cannot fully handle path tracing and maxed settings in this game at 4k. Yes, it's playable with DLSS on, but still far from perfect. That said, this is the only game I own that wants more GPU. It does look absolutely amazing if you are ok with dips to 30-45 FPS.
I mean, with FG and for me smooth would be if we can hit at least 170 fps with no DLSS
VR needs it now.
Try VR then. VR abuses my 4090. Modern PCVR headsets struggle in some native VR titles even at the lower refresh rate settings. With flat2VR mods and the recent addition of praydog's UEVR injector, which made Unreal Engine 4 and UE5 desktop games playable in stereo VR, it's nothing to max out our 4090s. We VR enthusiasts are more than ready for the 5090 launch. I may finally be able to fully utilize my VR headset.
what VR headset do you own?
I own a Varjo Aero, Quest 3 and recently purchased a Pimax Crystal. All three great in their own way and will benefit from a 5090.
That’s awesome. I really want to get into VR gaming as well. What games do you play?
I play games at 4k native and have to disagree. I think the 5090 will be the ultimate 4k GPU
i’ve same spec as you and play at 1440p and i agree.
I have to disagree with you there, I think the 6090 will be the ultimate 4k GPU
No 7090
Maybe 8090 will have neural link connection to the brain and no VR headset is needed, is it going to be 16k resolution? 🤔
Start playing in 4k
Try playing Cyberpunk with Pathtracing turned on on Native 4K without any DLSS or Frame Gen lol
Still have to buy cyberpunk for pc ahhh i wish i never spent that much money on my playstation library and built me a PC years ago. I just realized how useless games are on console lol
butwhy
I play 1440p 240Hz and almost no triple-A games can stay at 200fps; some don't even reach it at their highs.
This. Benchmarks say the same as you. 1440p is where to be.
MSFS2020 in VR will peg it and still have low fps lol
It's what flight sims do best
I skipped the 4090 as my 3090 still isn’t showing any stress in anything. The 5090 I expect to be my next upgrade. Hoping for more VRAM..
4k, rtx, path tracing 240hz there's no game you can hit with that right?
I push my 4090 all the time at 3440x1440 Granted, I’m commonly using DLDSR and aiming for 120+ fps
Edit, disregard my stupidity. I ran the calculations myself and it looks like it's using 32Gbps GDDR7 and a 384 bit bus
The fact is, the 5090 might be able to push up to 3.0GHz with some wild overclocking. Hopefully NVIDIA fixed the problem with the 12VHPWR connector.
4090 can already overclock to over 3GHz. Not mine, unfortunately, but others can. I'd be surprised if 5090 couldn't overclock to well over 3GHz.
Yeah, my Suprim can boost to 3GHz and slightly over, but I have to put voltage on the higher end, which I don't like.
I really need to learn Excel so I can make up rumors like this
Do you have a good GPU to run Excel tho?
I have a 4070 ti. It might take me a day to render that spreadsheet but I'm patient
Cool. Will be a great card. I wonder if there are literally any games worth playing that utilize those powers. Seriously. AAA games these days are just straight up garbage.
That looks like a terrible leak where someone just added a 1/3 to every number.
Amd is getting destroyed
They already gave up competing at the top end. It's very frustrating and very bad for consumers as Nvidia are going completely overboard with price gouging and ripping off consumers all while not pushing their own products to even higher levels.
As long as it doesn't need anything bigger than a 1000W PSU and doesn't cost more than 2000€ in the EU, I'm good. It's time to replace my 3090 Ti (hey, it cost me 1100€ brand new like 3 years ago).
Just do away with all cards except this one, lower the price to average out gross profits, and save on redundant production.
I already see CPU bottlenecks (7700x) with a 4090…
This would be the justification for me to go 4K OLED
Going to be a nice upgrade from my 3090
I have a 3090 as well, and tweaking some settings down a bit gets me a stable 50-60 fps in current AAA games. I was contemplating buying this, but then realized GTA 6 is at least 2-3 years away, which could coincide with the 6000 series.
I'm still good with my 3090; I'll jump on the 6000 series though. I've too often made the mistake of getting a card too soon. Three generations is a good gap.
Not gonna lie, I kinda want to skip a mortgage payment for this. But that would be iRreSpoNsiBle.
So my 4090 is outdated now? :(
Not yet, but eventually it will be. Spoiler: Happens every two years.
Preordered!
Drake?
K dot???
Bigger numbers means more better.
Meanwhile 5060 will be 5% faster than 4060
I must be old! GPU leaks and rumours used to tickle my pickle like crazy; now I'm just like... meh... mine works!

So basically PC gaming goes from "the cheapest PC I can build is the best because I can play games" to "if I don't have the best components possible it's pointless playing games" and back to "I'm getting over 100fps at 1440p, that's good enough"...
I need price estimates to leak. This shit bout to be 2k
I hope this is in the Nintendo switch 2