NVIDIA RTX 20 Series Reviews

Note that this is under the assumption that AMD actually delivers a competent new GPU lineup in that timeframe.
I doubt AMD will be competing at the high end before 2020.

The interesting thing about this new era in GPU tech is that AMD could double down on rasterization performance and maybe pull ahead of Nvidia. I guess everything will depend on how prolific and effective DLSS will be.

Anyway, I wish someone with an RTX 2080 Ti could give performance numbers on FFXV with the GameWorks effects on (the game, not the benchmark).
 
AMD / Intel still don't have an answer for the 1080 Ti, but Nvidia still released the 2080 Ti. All AMD / Intel do is help keep Nvidia prices down with a comparable product. It's more like whenever a faster product is ready, Nvidia will release it to get enthusiasts to upgrade. So 2019 is not out of the question for a 2180 Ti.
I don't think anything is out of the question, I just don't think it's probable based on the evidence we have at hand right now.
 
To me the timeframe for the next Nvidia series feels close because 7nm seems to be coming along. There's also the fact that they launched with a Ti this time.

Man, I really hope that AMD can deliver; it's up to them to put up a fight again, since Intel is still so far out.
 
These are the current rumors I've seen online.

1) AMD might have Vega refresh GPUs coming in the next month. Don't expect anything major, though, just about a 10 percent bump in performance. So more of a refresh.

2) AMD's 7nm consumer GPUs are coming for the mid range in 2019 and for the high end in 2020.

So basically Nvidia doesn't have any competition at the high end for a long time. However, Turing chips are huge, and that is one major reason they cost so much (look at these massive 3rd party cooling solutions). I could see a refresh in 2019 with 7nm just so Nvidia can get the costs and prices down. As far as I can tell, Turing was a node shrink from 16nm to 12nm. The next step is 12nm to 7nm. This isn't the same jump Pascal made going from 28nm to 16nm.

Ultimately though, it probably depends on how well the 20 series sells. If the 20 series is a non-starter and GPUs are sitting on the shelf, that will motivate Nvidia to get something else out more quickly at a lower price, like a 7nm refresh. If the 20 series sells within expectations, then Nvidia doesn't have a need to bring anything else out.

The Ti at launch is curious for sure, but I feel like this will be the norm now. Whenever the 30 series is out, the only card I see truly besting the 2080 Ti is a 3080 Ti. Nvidia will need a flagship card at launch.
 
Hardware Unboxed did a good deep dive into DLSS based on the limited content available to test with it. I thought it was interesting:


Ultimately they compared DLSS to running the game at 1800p resolution, based on the Infiltrator demo. It's early, though, and things could change.
 
Luckily I was able to sell my 1080 the same day the 1080 Ti came in. Hopefully AMD is able to do something when 7nm comes around. I'll be waiting, as the new offerings from Nvidia are too overpriced.
 
Hardware Unboxed did a good deep dive into DLSS based on the limited content available to test with it. I thought it was interesting:


Ultimately they compared DLSS to running the game at 1800p resolution, based on the Infiltrator demo. It's early, though, and things could change.
It's only a single example, but what's interesting is that 4K DLSS offers the same image quality as 1800p TAA, at... 1800p TAA performance.
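For context on the resolution gap there, here's a quick back-of-the-envelope pixel-count comparison (just arithmetic, nothing measured from DLSS itself):

```python
# Rough pixel-count comparison between native 4K and 1800p (plain arithmetic).
native_4k = 3840 * 2160   # 8,294,400 pixels
res_1800p = 3200 * 1800   # 5,760,000 pixels

print(f"1800p shades about {res_1800p / native_4k:.0%} of the pixels of native 4K")
# -> roughly 69%, which is why 1800p TAA runs noticeably faster than native 4K
```

So if 4K DLSS only matches 1800p TAA in both looks and speed, it isn't buying much over plain upscaling in this particular demo.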
 
These are the current rumors I've seen online.

1) AMD might have Vega refresh GPUs coming in the next month. Don't expect anything major, though, just about a 10 percent bump in performance. So more of a refresh.

2) AMD's 7nm consumer GPUs are coming for the mid range in 2019 and for the high end in 2020.

So basically Nvidia doesn't have any competition at the high end for a long time. However, Turing chips are huge, and that is one major reason they cost so much (look at these massive 3rd party cooling solutions). I could see a refresh in 2019 with 7nm just so Nvidia can get the costs and prices down. As far as I can tell, Turing was a node shrink from 16nm to 12nm. The next step is 12nm to 7nm. This isn't the same jump Pascal made going from 28nm to 16nm.

Ultimately though, it probably depends on how well the 20 series sells. If the 20 series is a non-starter and GPUs are sitting on the shelf, that will motivate Nvidia to get something else out more quickly at a lower price, like a 7nm refresh. If the 20 series sells within expectations, then Nvidia doesn't have a need to bring anything else out.

The Ti at launch is curious for sure, but I feel like this will be the norm now. Whenever the 30 series is out, the only card I see truly besting the 2080 Ti is a 3080 Ti. Nvidia will need a flagship card at launch.
I agree; Nvidia will go to 7nm if only to get the long-term cost benefits. At their volume, it will more than make up the R&D costs.

It's important to note that TSMC claims 12nm is a performance boost from 16nm, but it's not a real shrink in terms of area, so dropping to 7nm should help a ton with getting the costs down even if there isn't much performance to be gained vs 12nm.

That said, memory is also a huge part of the cost, so I'd keep expectations in check for pricing. I could see a 2180 being $100 cheaper to get back to their normal Maxwell / Pascal era pricing, but probably not much more than that. Not worth waiting for IMO, but if you're happy with your current cards, or think waiting a year is worth saving $100, then it might be worth it.
 
Hardware Unboxed did a good deep dive into DLSS based on the limited content available to test with it. I thought it was interesting:


Ultimately they compared DLSS to running the game at 1800p resolution, based on the Infiltrator demo. It's early, though, and things could change.
I thought the 1800p image looked better. If you look at the part where it shows the zoomed out shot of the city, the small city lights looked brighter and cleaner on the 1800p side.
 
These are the current rumors I've seen online.

1) AMD might have Vega refresh GPUs coming in the next month. Don't expect anything major, though, just about a 10 percent bump in performance. So more of a refresh.

2) AMD's 7nm consumer GPUs are coming for the mid range in 2019 and for the high end in 2020.

So basically Nvidia doesn't have any competition at the high end for a long time. However, Turing chips are huge, and that is one major reason they cost so much (look at these massive 3rd party cooling solutions). I could see a refresh in 2019 with 7nm just so Nvidia can get the costs and prices down. As far as I can tell, Turing was a node shrink from 16nm to 12nm. The next step is 12nm to 7nm. This isn't the same jump Pascal made going from 28nm to 16nm.

Ultimately though, it probably depends on how well the 20 series sells. If the 20 series is a non-starter and GPUs are sitting on the shelf, that will motivate Nvidia to get something else out more quickly at a lower price, like a 7nm refresh. If the 20 series sells within expectations, then Nvidia doesn't have a need to bring anything else out.

The Ti at launch is curious for sure, but I feel like this will be the norm now. Whenever the 30 series is out, the only card I see truly besting the 2080 Ti is a 3080 Ti. Nvidia will need a flagship card at launch.
Nah, the Ti launched because of the lack of a node shrink and the weak improvement in performance. They couldn't just launch with only a 2080.
The 3000 series will be on 7nm. I would expect a 3070 to be on par with the 2080 Ti, a 3080 pushing 25% over it, and a 3080 Ti / Titan 45-50% over it.

The 1000 series is 16nm (14nm for some lower-end parts), the 2000 series is 12nm.
 
That sounds highly optimistic to me, but maybe you will be right!

Not speaking about anyone in particular, but I always find it funny how what doesn't exist yet is always going to be so much better than what exists. For example, in the TV thread you have had people chiming in for the last 3 years to wait on HDMI 2.1, and they are still waiting. It might finally happen in 2019, but it still might not.
 
That sounds highly optimistic to me, but maybe you will be right!

Not speaking about anyone in particular, but I always find it funny how what doesn't exist yet is always going to be so much better than what exists. For example, in the TV thread you have had people chiming in for the last 3 years to wait on HDMI 2.1, and they are still waiting. It might finally happen in 2019, but it still might not.
For years that has been happening with Intel CPUs in the Apple world. Everyone was waiting for codename after codename to pull the trigger on a MacBook Pro or whatever, and every cycle it'd be these little gains. As soon as it launched, the "next release" forum threads started and a bunch of the people waiting migrated over to that thread. But whatever was next was always going to be huge.

Now, I don't doubt that 7nm will be amazing, but with all the node shrink struggles going on these days... I'm not optimistic it's going to be soon.
 
Raytracing (via DX12) and the Nvidia Flex demo (via DX11) running simultaneously. Rasterization performance isn't even touched by doing the RT (zero speed difference between running the RT simulation and not). RT performance decreases under some conditions, but I think that has more to do with drawing to the screen than with the CUDA-based fluid simulation.

Seemingly, a well-made game won't be hampered at all by turning on RT effects, but the overhead of doing rasterization then RT serially (as opposed to in parallel, as you see above) can have an impact.

Someone needs to figure out a good pipeline for rasterization & RT parallelization.
 
Hardware Unboxed did a good deep dive into DLSS based on the limited content available to test with it. I thought it was interesting:


Ultimately they compared DLSS to running the game at 1800p resolution, based on the Infiltrator demo. It's early, though, and things could change.
Man, if his findings hold up, that will be a fairly large hit to the marketability of DLSS. I guess there's always the potential of DLSS getting better in the future due to its AI component, but the jury's still out. Maybe DLSS 2x will have a larger impact?
 
I want to supplement my previous post by saying there's still a question about bandwidth usage. I'm concerned about all things BVH: CPU -> GPU transfers and even on-GPU tasks involving the BVH. It's accelerated, but updating / processing / outputting results is a big question mark for me personally.
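To put some very rough numbers on that concern, here's a back-of-the-envelope sketch. Every figure in it (triangle count, node size, rebuild frequency) is an assumption for illustration, not anything Nvidia has published:

```python
# Back-of-the-envelope BVH traffic estimate. All sizes and counts are assumed.
triangles = 1_000_000
bvh_nodes = 2 * triangles - 1      # a binary BVH over N primitives has ~2N-1 nodes
node_bytes = 32                    # assumed compact node layout

bvh_bytes = bvh_nodes * node_bytes # ~64 MB for the whole tree
fps = 60

# Worst case: the CPU rebuilds the whole BVH and re-uploads it every frame.
upload_gb_per_s = bvh_bytes * fps / 1e9
print(f"{upload_gb_per_s:.1f} GB/s")   # ~3.8 GB/s, vs ~15.8 GB/s for PCIe 3.0 x16
```

Not fatal on paper, but it eats a real chunk of PCIe bandwidth, which is presumably why engines will prefer refitting or updating the BVH on the GPU rather than re-uploading it.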
 
Hardware Unboxed did a good deep dive into DLSS based on the limited content available to test with it. I thought it was interesting:


Ultimately they compared DLSS to running the game at 1800p resolution, based on the Infiltrator demo. It's early, though, and things could change.
Oof... I know it's still early, but this has me second-guessing my 2080 purchase now... If DLSS falls through, then I essentially sacrificed the extra 3GB of VRAM on the 1080 Ti for nothing... AND spent hundreds extra.
 
Raytracing (via DX12) and the Nvidia Flex demo (via DX11) running simultaneously. Rasterization performance isn't even touched by doing the RT (zero speed difference between running the RT simulation and not). RT performance decreases under some conditions, but I think that has more to do with drawing to the screen than with the CUDA-based fluid simulation.

Seemingly, a well-made game won't be hampered at all by turning on RT effects, but the overhead of doing rasterization then RT serially (as opposed to in parallel, as you see above) can have an impact.

Someone needs to figure out a good pipeline for rasterization & RT parallelization.
This is great to have confirmed in testing. I wonder what kind of latency it would add to pipeline it so both run simultaneously, because I believe the RT has to go after rasterization so it knows what to trace. I'm sure it's still better than what BF5 did in its initial demo, though, where it was entirely serial; hopefully with a few more weeks/months to optimize we can get a better showing for the November launch.
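A toy way to reason about that latency trade-off (made-up per-pass timings, purely to illustrate serial vs. pipelined scheduling):

```python
# Toy frame-timing model: serial vs. pipelined rasterization + RT.
t_raster = 10.0   # hypothetical raster pass, in ms
t_rt = 6.0        # hypothetical ray-traced effects pass, in ms

# Fully serial (raster frame N, then RT frame N, like the early BF5 demo reportedly was):
serial_frame_time = t_raster + t_rt        # 16 ms per frame -> ~62 fps

# Pipelined (RT for frame N overlaps raster for frame N+1):
pipelined_frame_time = max(t_raster, t_rt) # a new frame every 10 ms -> 100 fps throughput
pipelined_latency = t_raster + t_rt        # each frame still needs both passes before display

print(serial_frame_time, pipelined_frame_time, pipelined_latency)
```

Throughput improves a lot, but each frame still takes raster + RT time to finish, so relative to the higher frame rate you're carrying roughly a frame and a half of latency in this example.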
 
I am extremely annoyed at the 2080's performance. I held back on buying a 1080 Ti back in early 2017 since I didn't play games very much and decided to hold on for another generation, because there's always a decent bump in performance from gen to gen. Now it turns out that the 2080 has literally the same performance as a 1080 Ti for the same-ish price two years later. It just feels like a stopgap release or something, but I can't wait any longer; my 670 is on its last legs performance-wise. It just feels so damn wrong to buy a 2080.

I have no words for the 2080 Ti though. It's almost 50% more expensive for what, 30% extra performance? The thing is more expensive than the rest of the entire computer.
 
As an aside, we can tell a bit about the 2080's raytracing pipeline by looking at rays / sec at various resolutions.

I don't know exactly what it says, but my rays / sec go up with increased resolution, meaning the actual ray calculations aren't the bottleneck for performance. (It's also not CPU-limited in this tech demo, because it's barely using my CPU at all.)
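To make that reasoning concrete, here's roughly the sanity check I'm doing in my head (the resolutions and frame rates below are hypothetical; plug in whatever the demo actually reports):

```python
# If rays/sec rises as resolution goes up, the ray units aren't the limit:
# the frame rate didn't fall in proportion to the extra rays being cast.
def rays_per_second(width, height, rays_per_pixel, fps):
    return width * height * rays_per_pixel * fps

# Hypothetical numbers at 1 ray per pixel:
low_res  = rays_per_second(1920, 1080, 1, 90)   # ~187 Mrays/s
high_res = rays_per_second(3840, 2160, 1, 30)   # ~249 Mrays/s

print(low_res, high_res)
# 4x the pixels at only 1/3 the fps -> measured rays/sec actually goes UP,
# so something other than ray throughput (e.g. output/raster work) is the bottleneck.
```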
 
Does anyone know how to turn off the LEDs on the EVGA 2080? Nvidia LED Visualization doesn't support the card yet and messing with the LED settings in Precision XOC doesn't seem to have any effect.
 
Does anyone know how to turn off the LEDs on the EVGA 2080? Nvidia LED Visualization doesn't support the card yet and messing with the LED settings in Precision XOC doesn't seem to have any effect.
You need to download Precision X1. There is an LED toggle on the right side that opens a separate LED application you can configure.
 
I wonder if anyone can find some gaming use for the tensor cores other than DLSS.

On a separate note, I read about OC Scanner working with Pascal cards and wanted to try it, but I can't seem to figure out where in Afterburner I access it. I don't see any options in the voltage curve editor window.
 
I am extremely annoyed at the 2080's performance. I held back on buying a 1080 Ti back in early 2017 since I didn't play games very much and decided to hold on for another generation, because there's always a decent bump in performance from gen to gen. Now it turns out that the 2080 has literally the same performance as a 1080 Ti for the same-ish price two years later. It just feels like a stopgap release or something, but I can't wait any longer; my 670 is on its last legs performance-wise. It just feels so damn wrong to buy a 2080.

I have no words for the 2080 Ti though. It's almost 50% more expensive for what, 30% extra performance? The thing is more expensive than the rest of the entire computer.
Also 3GB less VRAM.
 
I am extremely annoyed at the 2080's performance. I held back on buying a 1080 Ti back in early 2017 since I didn't play games very much and decided to hold on for another generation, because there's always a decent bump in performance from gen to gen. Now it turns out that the 2080 has literally the same performance as a 1080 Ti for the same-ish price two years later. It just feels like a stopgap release or something, but I can't wait any longer; my 670 is on its last legs performance-wise. It just feels so damn wrong to buy a 2080.

I have no words for the 2080 Ti though. It's almost 50% more expensive for what, 30% extra performance? The thing is more expensive than the rest of the entire computer.
If you're just after pure rasterization performance, this probably isn't the generation for you. I think it's pretty clear, looking at the die sizes, core counts, and Nvidia's presentation, that their primary focus this generation was on ray tracing and DLSS. They could have spent the same die area on pure rasterization and spent way less on R&D, but they believed this was the way to go, and the rasterization price/performance is the cost of that decision.

I can't wait for some of the RT / DLSS patches and games to start coming out; it really hasn't been since the G80 that there's been such a drastic shift in GPU tech.
 
Smokey

Any reason not to get the PG27UQ?

I have an Acer X34 that I grew tired of because of the low refresh rate (100Hz).

I recently got a PG279Q (1440p, 165Hz, G-Sync).

I mainly play competitive FPS (Siege/OW) mixed with all the latest triple-A games.

I'm getting a 2080 Ti in the mail soon.

I put the 27-inch in the center, with the 34-inch to the side. Micro Center has a sale that makes the monitor $2k AFTER taxes.

I feel like I'm over ultrawide for general gaming, so maybe I'm not interested in the new 200Hz ultrawides, and I think they will cost over 2 grand anyway.

Any glaring downsides? I think there is an issue with the mouse on a white background? Is the HDR worth it? Is the 4K worth it?

Side note: how easy is it to return a monitor to Micro Center?
 
Smokey

Any reason not to get the PG27UQ?

I have an Acer X34 that I grew tired of because of the low refresh rate (100Hz).

I recently got a PG279Q (1440p, 165Hz, G-Sync).

I mainly play competitive FPS (Siege/OW) mixed with all the latest triple-A games.

I'm getting a 2080 Ti in the mail soon.

I put the 27-inch in the center, with the 34-inch to the side. Micro Center has a sale that makes the monitor $2k AFTER taxes.

I feel like I'm over ultrawide for general gaming, so maybe I'm not interested in the new 200Hz ultrawides, and I think they will cost over 2 grand anyway.

Any glaring downsides? I think there is an issue with the mouse on a white background? Is the HDR worth it? Is the 4K worth it?

Side note: how easy is it to return a monitor to Micro Center?
Besides the price, there's the weird colour banding when going over 98Hz. Smokey should be able to tell you more.
 
Question for you guys:

Do we know if DLSS is usable outside of 4K panels? Would I be able to use it on my 3440x1440 ultrawide monitor and have it perform as though it were running at half res? (1720x720, I guess?)
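No idea on the ultrawide support question, but on the arithmetic side: halving each axis (1720x720) is actually a quarter of the pixels, not half. A quick sketch of the difference:

```python
import math

# 3440x1440 ultrawide: "half res" per axis vs. half the pixel count.
native_w, native_h = 3440, 1440
native_pixels = native_w * native_h              # 4,953,600 pixels

quarter = (native_w // 2) * (native_h // 2)      # 1720 x 720 = 1,238,400 pixels
print(quarter / native_pixels)                   # 0.25 -> a quarter of the shading work

# Half the pixel count at the same aspect ratio scales each axis by 1/sqrt(2):
scale = 1 / math.sqrt(2)
print(round(native_w * scale), round(native_h * scale))   # ~2432 x 1018
```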
 
Smokey

Any reason not to get the PG27UQ?

I have an Acer X34 that I grew tired of because of the low refresh rate (100Hz).

I recently got a PG279Q (1440p, 165Hz, G-Sync).

I mainly play competitive FPS (Siege/OW) mixed with all the latest triple-A games.

I'm getting a 2080 Ti in the mail soon.

I put the 27-inch in the center, with the 34-inch to the side. Micro Center has a sale that makes the monitor $2k AFTER taxes.

I feel like I'm over ultrawide for general gaming, so maybe I'm not interested in the new 200Hz ultrawides, and I think they will cost over 2 grand anyway.

Any glaring downsides? I think there is an issue with the mouse on a white background? Is the HDR worth it? Is the 4K worth it?

Side note: how easy is it to return a monitor to Micro Center?
It's extremely easy to return anything to Micro Center.
 
Smokey

Any reason not to get the PG27UQ?

I have an Acer X34 that I grew tired of because of the low refresh rate (100Hz).

I recently got a PG279Q (1440p, 165Hz, G-Sync).

I mainly play competitive FPS (Siege/OW) mixed with all the latest triple-A games.

I'm getting a 2080 Ti in the mail soon.

I put the 27-inch in the center, with the 34-inch to the side. Micro Center has a sale that makes the monitor $2k AFTER taxes.

I feel like I'm over ultrawide for general gaming, so maybe I'm not interested in the new 200Hz ultrawides, and I think they will cost over 2 grand anyway.

Any glaring downsides? I think there is an issue with the mouse on a white background? Is the HDR worth it? Is the 4K worth it?

Side note: how easy is it to return a monitor to Micro Center?
30-day return policy at Micro Center, which is where I got mine. Not sure about the mouse issue? Maybe you're referring to the blooming around the mouse that can happen on an all-black background? If so, that's because of the FALD. It's there, but it's not a deal breaker.

Due to DisplayPort bandwidth limits, the following applies.

Supported modes and settings for SDR content:

98Hz, RGB (4:4:4), 10-bit color depth
120Hz, RGB (4:4:4), 8-bit color depth
144Hz, YCbCr 4:2:2, 8-bit color depth

And here are the supported modes and settings for HDR content:

98Hz, RGB (4:4:4), 10-bit color depth
120Hz, YCbCr 4:2:2, 10-bit color depth
144Hz, YCbCr 4:2:2, 10-bit color depth

For SDR content I stick to 120Hz, for HDR 98Hz.

Besides that, it's an amazing performer. HDR gets really bright, and for me 27" is the perfect size. It's got all kinds of RGB bells and whistles, for better or worse, but for 4K gaming with my 2080 Ti it's been a perfect match. With G-Sync on, gameplay is extremely smooth, combined with the performance the 2080 Ti gives you.
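For anyone curious why those mode limits exist, here's a rough back-of-the-envelope check against DisplayPort 1.4 (it ignores blanking/overhead, so real requirements are a bit higher, but the pattern holds):

```python
# Approximate uncompressed video bandwidth vs. DisplayPort 1.4 capacity.
def video_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

DP14_EFFECTIVE_GBPS = 25.92   # DP 1.4 HBR3 x4 lanes, after 8b/10b encoding

print(video_gbps(3840, 2160, 144, 10))  # ~35.8 Gbps -> doesn't fit, hence 4:2:2 at 144Hz
print(video_gbps(3840, 2160, 120, 8))   # ~23.9 Gbps -> fits (120Hz 8-bit RGB)
print(video_gbps(3840, 2160, 98, 10))   # ~24.4 Gbps -> fits (98Hz 10-bit RGB)
```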
 
So how long does the founder's tax usually last?
It may sound simplistic, but until there's more supply than demand. It's going to fall completely on the consumer. If they gobble them up, it'll be a long time. If demand drops off a cliff, Nvidia will have to counter by getting aggressive with pricing.
 
Really appreciate the info.

Does the screen have any kind of banding?

What's it like using it on a day-to-day basis? Is switching from HDR games back to a regular desktop a pain (do you have to switch settings?), or is that largely no longer a problem?

Have most games been HDR compliant lately?

Is there a reason why you don't like HDR at 144Hz 4:2:2? Is it that bad on the desktop?

What is regular 4K desktop usage like? The highest I have is 1440p; is it that much better?

I'm 70/30 competitive FPS vs. regular gaming, so I'd want the highest FPS possible. Do you think this monitor is overkill for that?
 
30-day return policy at Micro Center, which is where I got mine. Not sure about the mouse issue? Maybe you're referring to the blooming around the mouse that can happen on an all-black background? If so, that's because of the FALD. It's there, but it's not a deal breaker.

Due to DisplayPort bandwidth limits, the following applies.

Supported modes and settings for SDR content:

98Hz, RGB (4:4:4), 10-bit color depth
120Hz, RGB (4:4:4), 8-bit color depth
144Hz, YCbCr 4:2:2, 8-bit color depth

And here are the supported modes and settings for HDR content:

98Hz, RGB (4:4:4), 10-bit color depth
120Hz, YCbCr 4:2:2, 10-bit color depth
144Hz, YCbCr 4:2:2, 10-bit color depth

For SDR content I stick to 120Hz, for HDR 98Hz.

Besides that, it's an amazing performer. HDR gets really bright, and for me 27" is the perfect size. It's got all kinds of RGB bells and whistles, for better or worse, but for 4K gaming with my 2080 Ti it's been a perfect match. With G-Sync on, gameplay is extremely smooth, combined with the performance the 2080 Ti gives you.
I just bought the same combo but don't have the 2080 Ti yet. Love using this thing with my consoles and the few games that actually support HDR on PC.
 