NotebookTalk

*Official Benchmark Thread* - Post it here or it didn't happen :D


Mr. Fox


More info... the last section of the video is particularly interesting and probably the most relevant in terms of the global magnitude of the tragedy of EVGA leaving the GPU industry.

20:19 - How NVIDIA Blocked Progress

 

@johnksss

  • Thumb Up 5

Wraith // Z790 Apex | 14900KF | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower GF3 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO
Banshee // Z790 Apex Encore | 13900KS | 4090 Gaming OC+Alphacool Block | 48GB DDR5-8600 | RM1200x SHIFT | XT45 1080 Nova || Dark Base Pro 901
Munchkin // Z790i Edge | 14900K | Arc A770 Phantom Gaming OC | 48GB DDR5-8000 | GameMax 850W | EK Nucleus CR360 Dark || Prime AP201 
Half-Breed // Dell Precision 7720 | BGA CPU Filth+MXM Quadro P5000 | Sub-$500 Grade A Refurb || Nothing to Write Home About  

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second.


The guy in this video makes a great point about the 12GB 4080. Maybe it wasn't the best decision to cram 12GB over a 192-bit bus on a card with that many cores after all...
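For context, a rough back-of-the-envelope bandwidth comparison. This is only a sketch: the 21 Gbps GDDR6X per-pin rate is an assumption for illustration, not a figure from the video.

```python
# Back-of-the-envelope GDDR6X bandwidth for different bus widths.
# The 21 Gbps per-pin data rate is an assumed figure, not something stated in the post above.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbit/s) / 8 bits-per-byte
    return bus_width_bits * data_rate_gbps / 8

for name, bus in [("192-bit (4080 12GB)", 192), ("384-bit (4090)", 384)]:
    print(f"{name}: {peak_bandwidth_gbs(bus, 21.0):.0f} GB/s")
# 192-bit: 504 GB/s vs 384-bit: 1008 GB/s, assuming the same memory chips on both.
```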
 

 

  • Thumb Up 2

Clevo P870TM-G: Core i7 8700k @ 4.8ghz | Clevo GTX 1080 | 32gb HyperX DDR4 @ 3200mhz | 17" 1440p 120hz B173QTN01.0 Screen | 256gb Samsung 850 EVO | 500gb WD Blue SSD | 1tb Samsung 870 QVO | 2tb Seagate 5400rpm HDD | Prema BIOS
 

Alienware 17 R1: Core i7 4710mq @ 3.619ghz 741 CBR15 (834 CBR15 @ 4.213ghz) | Dell GTX 860m | 16gb HyperX DDR3L @ 2133mhz | 17" 3D 120hz LTN173HT02-T01 Screen | 256gb mSATA SSD

Asus Zephyrus G14: Ryzen 7 4800hs @ 4.2ghz | GTX 1650 | 16gb DDR4 @ 3200mhz | 14" 120hz LM140LF1F01 Screen | 512gb NVME SSD

 

 


  • Thumb Up 1
  • Like 1
  • Bump 2

Wraith // Z790 Apex | 14900KF | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower GF3 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO
Banshee // Z790 Apex Encore | 13900KS | 4090 Gaming OC+Alphacool Block | 48GB DDR5-8600 | RM1200x SHIFT | XT45 1080 Nova || Dark Base Pro 901
Munchkin // Z790i Edge | 14900K | Arc A770 Phantom Gaming OC | 48GB DDR5-8000 | GameMax 850W | EK Nucleus CR360 Dark || Prime AP201 
Half-Breed // Dell Precision 7720 | BGA CPU Filth+MXM Quadro P5000 | Sub-$500 Grade A Refurb || Nothing to Write Home About  

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second.


A benchmark on wccftech.com came up.

So the 4090 is 50-60% faster with ray tracing than the 3090 Ti. What's strange is the temperature... is the cooling on the 4090 really that good?

  • Thumb Up 3
  • Thanks 1

7950X3D| Zotac 4090 AMP Extreme Airo| MSI MPG B650 Edge Wifi| Lian Li Galahad 360 V2| 32GB Kingston Renegade RGBZ 6000|Kingston KC3000 2TB| Fury Renegade 2TB|Samsung 970 Evo 1TB| Lian Li O11 Dynamic Evo| Corsair HX1500i| Samsung Odyssey G9 Neo

Asus Zephyrus G15 (Ryzen 9 6900HS + RTX3080)

 


7 hours ago, cylix said:

A benchmark on wccftech.com came up.

So the 4090 is 50-60% faster with ray tracing than the 3090 Ti. What's strange is the temperature... is the cooling on the 4090 really that good?

Assuming the thermals are a direct comparison on the same test system in a controlled environment, the variance is possibly a direct reflection of the effectiveness of the fans and heatsink on each GPU. It could vary wildly among brands, and among different models within a brand. The only way you could draw any good conclusions about the thermals of the actual GPU is with direct-die water cooling, or by using the same fans and heatsink on both GPUs.

 

That is a gigantic difference in raw FPS and temperatures.

  • Bump 1

Wraith // Z790 Apex | 14900KF | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower GF3 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO
Banshee // Z790 Apex Encore | 13900KS | 4090 Gaming OC+Alphacool Block | 48GB DDR5-8600 | RM1200x SHIFT | XT45 1080 Nova || Dark Base Pro 901
Munchkin // Z790i Edge | 14900K | Arc A770 Phantom Gaming OC | 48GB DDR5-8000 | GameMax 850W | EK Nucleus CR360 Dark || Prime AP201 
Half-Breed // Dell Precision 7720 | BGA CPU Filth+MXM Quadro P5000 | Sub-$500 Grade A Refurb || Nothing to Write Home About  

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second.


7 hours ago, cylix said:

A benchmark on wccftech.com came up.

So the 4090 is 50-60% faster with ray tracing than the 3090 Ti. What's strange is the temperature... is the cooling on the 4090 really that good?

 

With the uptick in RT performance, I expected more than 62% since 3rd gen tensor cores are supposed to be much better than 2nd gen.

 

50-60% is what I expect in raw rasterization.

 

DLSS as always is a hard pass for me.

 

Maybe the FE stock cooler is a monster if it can handle up to 600W, or it's a reporting anomaly and the software needs updating. The Suprim is supposed to have a pretty decent cooler, so I'm surprised to see 75C.

 

 

 

 

  • Thumb Up 1

Electrosoft Prime: 7950X3D | MSI X670E Carbon  | MSI Suprim X Liquid 4090 | AC LF II 420 | G.Skill 6000 A-Die 2x32GB | Samsung 990 Pro 2TB | EVGA 1600w P2 | Phanteks Ethroo Pro | Alienware AW3225QF 32" OLED

Eurocom Raptor X15 | 12900k | Nvidia RTX 3070ti | 15.6" 1080p 240hz | Kingston 3200 32GB (2x16GB) | Samsung 980 Pro 1TB Heatsink Edition
Heath: i9-12900k | EVGA CLC 280 | Asus Strix Z690 D4 | Asus Strix 3080 | 32GB DDR4 2x16GB B-Die 4000  | WD Black SN850 512GB |  EVGA DG-77 | Samsung G7 32" 144hz 32"

MelMel:  (Retrofit currently in progress)

 

 

 


 


LOL 3090 FE no longer viable....broken irrevocably!  (thick boii!)

 

 

 

 

 

  • Haha 3

Electrosoft Prime: 7950X3D | MSI X670E Carbon  | MSI Suprim X Liquid 4090 | AC LF II 420 | G.Skill 6000 A-Die 2x32GB | Samsung 990 Pro 2TB | EVGA 1600w P2 | Phanteks Ethroo Pro | Alienware AW3225QF 32" OLED

Eurocom Raptor X15 | 12900k | Nvidia RTX 3070ti | 15.6" 1080p 240hz | Kingston 3200 32GB (2x16GB) | Samsung 980 Pro 1TB Heatsink Edition
Heath: i9-12900k | EVGA CLC 280 | Asus Strix Z690 D4 | Asus Strix 3080 | 32GB DDR4 2x16GB B-Die 4000  | WD Black SN850 512GB |  EVGA DG-77 | Samsung G7 32" 144hz 32"

MelMel:  (Retrofit currently in progress)

 

 

 


 


On 9/23/2022 at 1:48 PM, Mr. Fox said:

Thanks. Yeah, it is impossible to have enough space for everything, at least on my budget. I'd like to have an extra room the size of my 2-car garage, or larger, for nothing but my computer lab and enough money where I could go nuts building everything to my ideal specifications. Hell will more likely freeze over before that happens.

 

problem is always the wife approval factor and the flexibility of your wallet 😄 i feel ya!

 

On 9/23/2022 at 5:23 PM, Papusan said:

Yep, as long as they show only white 🙂

 

Microsoft just can't stop screwing up their latest and greatest.

 

Windows 11 22H2 apparently causing problems on Nvidia graphics cards
https://notebooktalk.net/topic/168-all-about-windows-11-news-and-announcements/?do=findComment&comment=14046

 

haha dunno yet what settings im gonna use for the RGB, but its not a high priority tbh. gonna feel my way through the build first, its been a hot minute! im thinking of opening up a component list / order / build / presentation thread to showcase the new baby 🙂 should be fun!

 

23 hours ago, Papusan said:

Nvidia defends their latest scam with crap talk.

 

There's also the question of why the RTX 4080 12GB isn't just called the RTX 4070. Talking with Nvidia during a briefing, this exact question came up: What was the thought process behind calling the 12GB chip a 4080 instead of a 4070, especially since it's a different chip?

Nvidia's Justin Walker, Senior Director of Product Management, said, "The 4080 12GB is a really high performance GPU. It delivers performance considerably faster than a 3080 12GB... it's faster than a 3090 Ti, and we really think it's deserving of an 80-class product."

Frankly, that's a crap answer. Of course it's faster! It's a new chip and a new architecture; it's supposed to be faster. Remember when the GTX 1070 came out and it was faster than a 980 Ti? I guess that wasn't "deserving" of an 80-class product name. Neither was the RTX 2070 when it matched the 1080 Ti, or the 3070 when it matched the 2080 Ti.

 

Why Nvidia's RTX 4080, 4090 Cost so Damn Much tomshardware.com
https://www.tomshardware.com/news/why-nvidias-4080-4090-cost-so-damn-much

 

Nvidia hasn't released die shots or renderings of AD103 and AD104 just yet, but we do have the full specs. They're quite a bit smaller, and much of that comes from reducing core counts, memory interfaces, and L2 cache size. The 4080 models will naturally be higher volume products than the 4090, though it's worth pointing out that the 4090 potentially has 70% more compute, 50% more memory bandwidth and capacity, and uses 41% more power, all while "only" costing 33% more. In other words, RTX 4080 16GB pricing is proportionately worse than the RTX 4090.
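A quick sanity check of that proportionality claim, using only the rough percentages quoted above (a sketch, not exact launch specs):

```python
# Rough value comparison using only the percentages quoted above:
# the 4090 has ~70% more compute than the 4080 16GB while costing ~33% more.
compute_ratio = 1.70   # 4090 compute relative to 4080 16GB (quoted figure)
price_ratio   = 1.33   # 4090 price relative to 4080 16GB (quoted figure)

value_ratio = compute_ratio / price_ratio
print(f"4090 compute-per-dollar vs 4080 16GB: {value_ratio:.2f}x")
# ~1.28x -> on these numbers the 4080 16GB is proportionately the worse deal,
# which is the article's point.
```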

 

Yep, not only is the castrated 4080 12GB bad value because you pay a premium for xx70 performance, but the more expensive real 4080 is bad value as well. In short, the whole xx80 line-up is messed up. Jensen needs to clean it up, but I doubt he will.

 

What an ugly box. I wonder how big the 5000-series xx90 will be.


 

 

AORUS GeForce RTX 4090 MASTER is the biggest RTX 4090 so far, by far
Anthony Garreffa | Sep 22, 2022 8:58 PM CDT
NVIDIA unleashed its new GeForce RTX 4090 but now AIB partners are announcing their new custom RTX 4090 designs... and man... GIGABYTE's new custom AORUS GeForce RTX 4090 MASTER makes the word behemoth feel small.

Read more: https://www.tweaktown.com/news/video_cards/index.html

 

its hilarious to see how much "good value" the 4090 is compared with the crap 4080 cards below it 😄 makes u wonder why so many tech enthusiasts are becoming more and more cynical, eh? helps u deal with the fax...

 

22 hours ago, Ashtrix said:

Nvidia CEO goes even further defending the garbage move. He says Moore's Law is dead.

 

 

Yeah, sure, lol. Have a look at the American ETFs and the NASDAQ/NYSE markets; people will come out and rush to buy when they can barely feed themselves. He will say any sort of BS and lies to sell this, lol.

 

Best part ? Intel says Moore's Law is Alive (Aug 2022 article)

 

 

How did Mr. Ngreedia Green Goblin get those Ampere GAxxx chips? For peanuts, on Samsung's 8N node (a custom 10nm-class node) that nobody except Nvidia used: unlimited supply at roughly half the cost of TSMC 7N wafers. That guy knew mining would be booming, which is why he fitted Ampere with GDDR6X and its custom PAM4 signalling for extreme memory performance, and even that is cheaper than GDDR6. All of that was possible because of Moore's Law.

 

He got everything cheap. Maybe that is one of the reasons GA102 was shared across everything from the 3080 to the 3090 variants. But he seems to consider the $699 GA102-based 3080 his greatest mistake, and he has fixed it now: as in the Pascal era, where GP104 powered the xx80 card (the 1080 was GP104), AD104 now gets to be an xx80 GPU, but note that the 1080 Ti was GP102.

 

Also, if you look at the pricing of the existing Ampere stock, it is not being slashed anymore; it is fixed now. Probably to save the AIBs, because if Nvidia cut more off the MSRP of these Ampere cards, MSI, GB, EVGA, ASUS, Gainward, Palit, Zotac, Galax, etc. would actually take a loss on their thin margins. So he kept the pricing intact, took the pricing baseline from the 3090 Ti, and applied it to the RTX 40 series.

 

3090 Ti ~$1150

4080 12GB $900-1000: it either loses by 15-25% in raster, or matches/gains using the fake-frame tech, hence the lower price.

4080 16GB: 25% faster than the 3090 Ti in raster, and carries a 25% higher price.

4090: 50-60% faster in raster from what they have shown, 2-4x with fake-frame tech, and 60% higher cost.

 

*Raster performance figures are from Nvidia-picked titles: RE8, AC Valhalla, The Division 2.

**All that after he unceremoniously charged 2x the cost for the RTX 3090 over the RTX 3080 for a 15% max performance difference, which has now been normalized. Expect the 4090 Ti MSRP to be normalized the same way for the RTX 5000 cards.

 

 

 

People need to vote with their wallets.

 

AMD adjusted pricing a bit. The 6900 XT is literally the same as the 6800 XT, but they are better value: the 6800 XT fights with a 3090, beats it in some titles, and it's $600 right now. It's a shame people don't buy them because of the "garbage DLSS and RT", which is ironic since DLSS 2.x is dead now from the FSR 2.0 and DLSS 3.x double whammy.

 

https://www.techpowerup.com/img/WzJaU3fQc5LcBrZp.jpg

 

Going by the current Ampere and RDNA2 situation, we are in for a GPU overstock slump. I wonder about RDNA3 prices now and how AMD will respond. They clearly have two choices: use the golden chance and knock Nvidia off by severely undercutting them, or reduce prices a bit and, once again, not gain much market share.

 

all power to AMD, for sure. ill be cheering the red team on to crush Nvidia's 40 series, even with a 4090 tucked in my system. i mean, heck! Moore's law is dead actually QUOTED Nvidia's own engineers being surprised by the steep pricing of the 40 series and ACTUALLY ROOTING FOR AMD AT THIS POINT! imagine that for a second, how messed up is that? 

 

22 hours ago, electrosoft said:

 

Well, I guess the "glass half full" take on this (barf) is that he could have priced the 4090 at $1999, making performance:cost proportionate with the 4080 16GB and the 4080 12GB...

 

The 4090, compared to the 3090, is offering substantially more performance for only $100 more USD but it is downright pricey considering the original price of flagship GPUs just 3 generations ago (Pascal).

 

Looking at the great Ampere glut, the pricing makes sense and either forces a buyer into a monster purchase of a 4000 card or a more realistic purchase of a 3070 for ~$450-500.

 

4080 12GB is going to have a hard time beating a 3090ti in pure rasterization. RT performance is going to be good though.

 

I wouldn't even contemplate a 4080 16GB, even if I had a 3080+ class card and it were priced at $1200. I'd just pony up the extra $300 and get a 4090 if I were in that situation.

 

With 2000 more cores, I expect that if Jensen has his way the 4090 Ti is going to be $1999.99.

 

Once AMD became a non-factor, all bets were off. 6000 series made some good inroads but maybe Jensen knows something we don't about AMD specs/pricing to feel he can justify these prices. I'm still hoping AMD comes out with a winner and similar performance leaps as RDNA to RDNA2.

 

 

 

i for one think one of the major reasons the 40 series is clocked and powered SO damn high is for team green to make sure they stay on top of AMD. this likely means AMD has something SERIOUS to counter offer in November...lets watch the showdown and hope for some good competition, especially with Nvidia being such a dick lately.

 

22 hours ago, cylix said:

 

Bad news is piling up 😃. What a bad launch for Nvidia. Hope AMD can profit from this. Better for consumers in the end if Nvidia's legs are shaking. Maybe they will wake up and treat us the way we are supposed to be treated.

 

funny actually, i saw this max. 30 cycles note on Zotac's website while researching 4090 cards from AIBs and bam 1-2 days later Jay makes a video about that 😄 in any case, this whole situation makes one very cautious when choosing the power supply. im currently set on the latest Seasonic TX-1600, although its not fully ATX 3.0 specced ("only" up to 1000W for ATX 3.0), but the only alternative right now is a Thermaltake 1650W 80Plus Gold psu with ATX 3.0. ugh....i hope more options will come out SOON, otherwise how am i gonna keep this monster gpu juiced?

 

14 hours ago, Mr. Fox said:

 

hey bro, I heard you're a bit tight on storage... 🤣🤘 (but look whos talking, amirite? haha)

 

10 hours ago, cylix said:

A benchmark on wccftech.com came up.

So the 4090 is 50-60% faster with ray tracing than the 3090 Ti. What's strange is the temperature... is the cooling on the 4090 really that good?

 

thats a big fat nothing burger out of the box without any context, i.e. ambient temps, cooling setup, etc. 

 

as for those DLSS 3 "fake frames" im kinda curious of this tech. if this turns out to be somewhat well implemented it might be a nice way to postpone next-gen gpu purchases for a while in the future. i know i know, for now only 40 series supported, but backport to 30 series or even open source alternative from AMD wouldnt be out of the question.

  • Haha 2

Mine: Hyperion "Titan God of Heat, Heavenly Light, Power" (2022-24)
AMD Ryzen 9 7950X (custom TG IHS) / Asus ROG Crosshair X670E Extreme / MSI Geforce RTX 4090 Suprim X / Teamgroup T-Force Delta RGB DDR5-8200 2x24 GB / Seagate Firecuda 530 4 TB / 5x Samsung 860 Evo 4 TB / Arctic Liquid Freezer II 420 (Push/Pull 6x Noctua NF-A14 IndustrialPPC-3000 intake) / Seasonic TX-1600 W Titanium / Phanteks Enthoo Pro 2 TG (3x Arctic P12 A-RGB intake / 4x Arctic P14 A-RGB exhaust / 1x Arctic P14 A-RGB RAM cooling) / Samsung Odyssey Neo G8 32" 4K 240 Hz / Ducky One 3 Daybreak Fullsize Cherry MX Brown / Corsair M65 Ultra RGB / PDP Afterglow Wave Black

 

My Lady's: Clevo NH55JNNQ "Alfred" (2022-24)
Sharp LQ156M1JW03 FHD matte 15.6" IGZO 8 bit @248 Hz / Intel Core i5 12600 / Nvidia Geforce RTX 3070 Ti / Mushkin Redline DDR4-3200 2x32 GB / Samsung 970 Pro 1 TB / Samsung 870 QVO 8 TB / Intel AX201 WIFI 6+BT 5.2 / Win 11 Pro Phoenix Lite OS / 230 W PSU powered by Prema Mod!


I do not use DLSS and don't really care about it or the AMD version of it. I don't game often enough for it to matter... maybe 2 or 3 hours YTD.

 

Why all the sudden hate and "fake frame" chatter? Seems like not long ago many were super excited about it and wouldn't buy a GPU without it. Did the world suddenly figure out it was just another silly gimmick for rabid gamers to wet themselves over?

  • Haha 2

Wraith // Z790 Apex | 14900KF | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower GF3 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO
Banshee // Z790 Apex Encore | 13900KS | 4090 Gaming OC+Alphacool Block | 48GB DDR5-8600 | RM1200x SHIFT | XT45 1080 Nova || Dark Base Pro 901
Munchkin // Z790i Edge | 14900K | Arc A770 Phantom Gaming OC | 48GB DDR5-8000 | GameMax 850W | EK Nucleus CR360 Dark || Prime AP201 
Half-Breed // Dell Precision 7720 | BGA CPU Filth+MXM Quadro P5000 | Sub-$500 Grade A Refurb || Nothing to Write Home About  

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second.


9 minutes ago, Mr. Fox said:

I do not use DLSS and don't really care about it or the AMD version of it. I don't game often enough for it to matter... maybe 2 or 3 hours YTD.

 

Why all the sudden hate and "fake frame" chatter? Seems like not long ago many were super excited about it and wouldn't buy a GPU without it. Did the world suddenly figure out it was just another silly gimmick for rabid gamers to wet themselves over?

 

Seems like it. Since I have a 2080 Super, I decided to try it out in Shadow Of The Tomb Raider. I'd say it does help boost framerates a bit, but at the cost of a little image blurriness. I'd rather the raytracing and tensor cores be removed and replaced with more CUDA cores. In addition, I'd like DLSS to not be proprietary tech.

 

I think the raytracing hype has faded a lot too. People are realizing that it only yields a small improvement in graphical fidelity while also costing a ton of extra performance over traditional rasterization techniques. At this point, the techniques we've developed around rasterized graphics have become so efficient that we can get near the same graphical fidelity as a fully raytraced scene at a fraction of the performance cost.

 

It's just better to fake effects rather than fully compute them if the object in question isn't interactive or dynamic. Visually, it's the same result to the user. When seeing showcases on new shiny tech like this in regards to games, the developer part of my brain is extremely disappointed with how much performance is wasted when comparable visuals could be achieved at a fraction of the performance cost. Just my mindset as a developer though. I always like extracting every bit of performance I can out of the applications I develop.

  • Like 2

AlienyHackbook: Alienware M17X R5 | i7-4930MX | GTX 1060 | 32GB DDR3L Kingston HyperX @ 2133 MHz CL 12 | MacOS Sierra 10.12.5 | Windows 10 LTSC | Hackintoshes Rule!

 

Desktop Killer: Clevo X170SM-G | i9-10900K | RTX 2080 Super | 32GB DDR4 Crucial Ballistix @ 3200 MHz CL 16 | Windows 10 LTSC | Slayer Of Desktops

 

Sagattarius A: Custom Built Desktop | i9-10900K | RX 6950 XT | 32GB DDR4 G.Skill Ripjaws @ 4000 MHz CL 15 | Windows 10 LTSC | Ultimate Performance Desktop With Cryo Cooling!


9 hours ago, cylix said:

A benchmark on wccftech.com came up.

So the 4090 is 50-60% faster with ray tracing than the 3090 Ti. What's strange is the temperature... is the cooling on the 4090 really that good?


 

 

I’m looking at the 3090Ti in the 2nd chart showing 61fps avg in Cyberpunk 2077. This is 1440P, Psycho RT, DLSS2. That performance is not right. 

 

 

Below is a video I made. This should help anyone with a 3090 that's fully overclocked. Yes, the 4090 is still a beast. But if I turn off DLSS it's still in the 43-50 fps range. I'm gonna say my GPU at full OC vs a stock 4090 is a 35-40% rasterization improvement, which is very good.

 

It is the DLSS performance from the 4090 that is so incredible. Although, I don’t typically use DLSS much since I have the raw GPU power to go without. Cyberpunk 2077 was the only game that really needed DLSS. 
 

Anyways, this is my video. Take a look at the average frame rate. It’s really impressive. 

 

https://youtu.be/Tod5tN7myM8

13900KF


21 hours ago, Clamibot said:

Seems like it. Since I have a 2080 Super, I decided to try it out in Shadow Of The Tomb Raider. I'd say it does help boost framerates a bit, but at the cost of a little image blurriness. I'd rather the raytracing and tensor cores be removed and replaced with more CUDA cores. In addition, I'd like DLSS to not be proprietary tech.

 

I think the raytracing hype has faded a lot too. People are realizing that it only yields a small improvement in graphical fidelity while also costing a ton of extra performance over traditional rasterization techniques. At this point, the techniques we've developed around rasterized graphics have become so efficient that we can get near the same graphical fidelity as a fully raytraced scene at a fraction of the performance cost.

 

It's just better to fake effects rather than fully compute them if the object in question isn't interactive or dynamic. Visually, it's the same result to the user. When seeing showcases on new shiny tech like this in regards to games, the developer part of my brain is extremely disappointed with how much performance is wasted when comparable visuals could be achieved at a fraction of the performance cost. Just my mindset as a developer though. I always like extracting every bit of performance I can out of the applications I develop.

Yeah, I've never really understood the mentality behind any of it. When I am enjoying a game and it captures my interest it has little to do with photorealism or lifelike AI. It's because the game is fun and captivating and I am more in tune with the action than I  am being a weirdo about things that don't make the game more or less fun to play.

 

I might care more watching a movie or looking at photos if the image quality is poor resolution with washed out colors. I'd be more likely to notice in a scenario that has zero user interaction. I don't give a rat's butt about live gameplay streaming or watching other people play games, but if I were a spectator the lack of engagement and boredom might give me enough opportunity to notice minor anomalies in image quality. But, I think it wouldn't matter because there is a certain level of expectation that streamed video quality is naturally going to leave something to be desired.

21 hours ago, tps3443 said:

I’m looking at the 3090Ti in the 2nd chart showing 61fps avg in Cyberpunk 2077. This is 1440P, Psycho RT, DLSS2. That performance is not right. 

 

Below is a video I made. This should help anyone with a 3090 that's fully overclocked. Yes, the 4090 is still a beast. But if I turn off DLSS it's still in the 43-52 fps range. I'm gonna say my GPU at full OC vs a stock 4090 is a 35% rasterization improvement, which is very good.

 

It is the DLSS performance from the 4090 that is so incredible. Although, I don’t typically use DLSS much since I have the raw GPU power to go without. Cyberpunk 2077 was the only game that really needed DLSS. 
 

Anyways, this is my video. Take a look at the average frame rate. It’s really impressive. 

 

https://youtu.be/Tod5tN7myM8

Well, bear in mind we have 3090 KPE video cards. The people that own peasant versions of a 3090 are going to be dealing with all sorts of deliberately fabricated stupidity that cripples performance. Our GPUs are not castrated the way the rest of them are.

 

Without EVGA driving competition with regard to overclocking and raising voltage and power limits, I am expecting the castration to worsen. They were almost single-handedly responsible for what little reason enthusiasts had left to be enthusiastic. I won't have a surprised look on my face if I can't find a reason going forward. Factor in the irrational environmentalist agenda that is ruining our lives in other areas, and we will probably see the emasculation expand to removing all of the man parts, leaving us all with cookie-cutter pansy-boy gamer trash.

  • Like 1

Wraith // Z790 Apex | 14900KF | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower GF3 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO
Banshee // Z790 Apex Encore | 13900KS | 4090 Gaming OC+Alphacool Block | 48GB DDR5-8600 | RM1200x SHIFT | XT45 1080 Nova || Dark Base Pro 901
Munchkin // Z790i Edge | 14900K | Arc A770 Phantom Gaming OC | 48GB DDR5-8000 | GameMax 850W | EK Nucleus CR360 Dark || Prime AP201 
Half-Breed // Dell Precision 7720 | BGA CPU Filth+MXM Quadro P5000 | Sub-$500 Grade A Refurb || Nothing to Write Home About  

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second.


1 hour ago, Clamibot said:

 

Seems like it. Since I have a 2080 Super, I decided to try it out in Shadow Of The Tomb Raider. I'd say it does help boost framerates a bit, but at the cost of a little image blurriness. I'd rather the raytracing and tensor cores be removed and replaced with more CUDA cores. In addition, I'd like DLSS to not be proprietary tech.

 

I think the raytracing hype has faded a lot too. People are realizing that it only yields a small improvement in graphical fidelity while also costing a ton of extra performance over traditional rasterization techniques. At this point, the techniques we've developed around rasterized graphics have become so efficient that we can get near the same graphical fidelity as a fully raytraced scene at a fraction of the performance cost.

 

It's just better to fake effects rather than fully compute them if the object in question isn't interactive or dynamic. Visually, it's the same result to the user. When seeing showcases on new shiny tech like this in regards to games, the developer part of my brain is extremely disappointed with how much performance is wasted when comparable visuals could be achieved at a fraction of the performance cost. Just my mindset as a developer though. I always like extracting every bit of performance I can out of the applications I develop.

 

and now just imagine how much closer we could get to RT image quality with all those added cuda cores instead of RT and tensor cores on the die! welp, from nvidias point of view this makes more financial sense of course, since 1) they can use the same or very similar dies across their whole line up, including for gamers, creators, AI, servers, HPC, etc and 2) they can BS-market proprietary stuff that sets them apart from the competition and makes them "oh so special". funny though how AMD is able to offer similar if not the same functionality with just regular old gpu cores, eh? 😛 

 

2 hours ago, Mr. Fox said:

I do not use DLSS and don't really care about it or the AMD version of it. I don't game often enough for it to matter... maybe 2 or 3 hours YTD.

 

 

yeah big difference here, i dont do much benching anymore except as part of hardware optimizations and performance validation / to get to know new hardware. once i got my everyday settings dialed in i leave it as is and use the machine for work / gaming. 

 

1 hour ago, tps3443 said:

I’m looking at the 3090Ti in the 2nd chart showing 61fps avg in Cyberpunk 2077. This is 1440P, Psycho RT, DLSS2. That performance is not right. 

 

 

Below is a video I made. This should help anyone with a 3090 that's fully overclocked. Yes, the 4090 is still a beast. But if I turn off DLSS it's still in the 43-50 fps range. I'm gonna say my GPU at full OC vs a stock 4090 is a 35-40% rasterization improvement, which is very good.

 

It is the DLSS performance from the 4090 that is so incredible. Although, I don’t typically use DLSS much since I have the raw GPU power to go without. Cyberpunk 2077 was the only game that really needed DLSS. 
 

Anyways, this is my video. Take a look at the average frame rate. It’s really impressive. 

 

https://youtu.be/Tod5tN7myM8

 

always best to look at third party info, so ur input here is very helpful already! 

 

1 hour ago, Mr. Fox said:

Yeah, I've never really understood the mentality behind any of it. When I am enjoying a game and it captures my interest it has little to do with photorealism or lifelike AI. It's because the game is fun and captivating and I am more in tune with the action than I  am being a weirdo about things that don't make the game more or less fun to play.

 

I might care more watching a movie or looking at photos if the image quality is poor resolution with washed out colors. I'd be more likely to notice in a scenario that has zero user interaction. I don't give a rat's butt about live gameplay streaming or watching other people play games, but if I were a spectator the lack of engagement and boredom might give me enough opportunity to notice minor anomalies in image quality. But, I think it wouldn't matter because there is a certain level of expectation that streamed video quality is naturally going to leave something to be desired.

Well, bear in mind we have 3090 KPE video cards. The people that own peasant versions of a 3090 are going to be dealing with all sorts of deliberately fabricated stupidity that cripples performance. Our GPUs are not castrated the way the rest of them are.

 

Without EVGA driving competition with regard to overclocking and raising voltage and power limits, I am expecting the castration to worsen. They were almost single-handedly responsible for what little reason enthusiasts had left to be enthusiastic. I won't have a surprised look on my face if I can't find a reason going forward. Factor in the irrational environmentalist agenda that is ruining our lives in other areas, and we will probably see the emasculation expand to removing all of the man parts, leaving us all with cookie-cutter pansy-boy gamer trash.

 

soooooo tell me: if you ever give up on PC hardware n benching as a hobby, what would be the next best thing? somehow i cant imagine @Mr. Fox doing pottery or gardening instead (especially in the arizona desert lulz). how about joining a motorcycle gang or maybe switch to desert buggies including tinkering with the gear yourself 😁 definitely something along the lines of a grease head hahaha

  • Like 1
  • Haha 1

Mine: Hyperion "Titan God of Heat, Heavenly Light, Power" (2022-24)
AMD Ryzen 9 7950X (custom TG IHS) / Asus ROG Crosshair X670E Extreme / MSI Geforce RTX 4090 Suprim X / Teamgroup T-Force Delta RGB DDR5-8200 2x24 GB / Seagate Firecuda 530 4 TB / 5x Samsung 860 Evo 4 TB / Arctic Liquid Freezer II 420 (Push/Pull 6x Noctua NF-A14 IndustrialPPC-3000 intake) / Seasonic TX-1600 W Titanium / Phanteks Enthoo Pro 2 TG (3x Arctic P12 A-RGB intake / 4x Arctic P14 A-RGB exhaust / 1x Arctic P14 A-RGB RAM cooling) / Samsung Odyssey Neo G8 32" 4K 240 Hz / Ducky One 3 Daybreak Fullsize Cherry MX Brown / Corsair M65 Ultra RGB / PDP Afterglow Wave Black

 

My Lady's: Clevo NH55JNNQ "Alfred" (2022-24)
Sharp LQ156M1JW03 FHD matte 15.6" IGZO 8 bit @248 Hz / Intel Core i5 12600 / Nvidia Geforce RTX 3070 Ti / Mushkin Redline DDR4-3200 2x32 GB / Samsung 970 Pro 1 TB / Samsung 870 QVO 8 TB / Intel AX201 WIFI 6+BT 5.2 / Win 11 Pro Phoenix Lite OS / 230 W PSU powered by Prema Mod!


1 hour ago, tps3443 said:


 

 

I’m looking at the 3090Ti in the 2nd chart showing 61fps avg in Cyberpunk 2077. This is 1440P, Psycho RT, DLSS2. That performance is not right. 

 

 

Below is a video I made. This should help anyone with a 3090 that's fully overclocked. Yes, the 4090 is still a beast. But if I turn off DLSS it's still in the 43-50 fps range. I'm gonna say my GPU at full OC vs a stock 4090 is a 35-40% rasterization improvement, which is very good.

 

It is the DLSS performance from the 4090 that is so incredible. Although, I don’t typically use DLSS much since I have the raw GPU power to go without. Cyberpunk 2077 was the only game that really needed DLSS. 
 

Anyways, this is my video. Take a look at the average frame rate. It’s really impressive. 

 

https://youtu.be/Tod5tN7myM8

 

Of course we're comparing a massive outlier setup (KPE 3090, custom water, MO-RA, chiller) vs a 4090 FE with zero overclocks, but I do agree some of the numbers are sus, especially the 4090 numbers, unless... *gasp* Nvidia was fluffing us hardcore with the 4090 presentation.

 

1 hour ago, Mr. Fox said:

Yeah, I've never really understood the mentality behind any of it. When I am enjoying a game and it captures my interest it has little to do with photorealism or lifelike AI. It's because the game is fun and captivating and I am more in tune with the action than I  am being a weirdo about things that don't make the game more or less fun to play.

 

I might care more watching a movie or looking at photos if the image quality is poor resolution with washed out colors. I'd be more likely to notice in a scenario that has zero user interaction. I don't give a rat's butt about live gameplay streaming or watching other people play games, but if I were a spectator the lack of engagement and boredom might give me enough opportunity to notice minor anomalies in image quality. But, I think it wouldn't matter because there is a certain level of expectation that streamed video quality is naturally going to leave something to be desired.

Well, bear in mind we have 3090 KPE video cards. The people that own peasant versions of a 3090 are going to be dealing with all sorts of deliberately fabricated stupidity that cripples performance. Our GPUs are not castrated the way the rest of them are.

 

Without EVGA driving competition with regard to overclocking and raising voltage and power limits, I am expecting the castration to worsen. They were almost single-handedly responsible for what little reason enthusiasts had left to be enthusiastic. I won't have a surprised look on my face if I can't find a reason going forward. Factor in the irrational environmentalist agenda that is ruining our lives in other areas, and we will probably see the emasculation expand to removing all of the man parts, leaving us all with cookie-cutter pansy-boy gamer trash.

 

 

RT is definitely easy to differentiate from normal rendering even in basic forms. When I turned off RT in WoW, I noticed immediately and they only used it for shadows and certain lighting situations. I think it is the future but it is a long ways away till it is the dominant rendering preference vs rasterization.

 

The industry feels a bit lost now with EVGA bowing out in regards to enthusiasts and pushing hardware and making superior designs. There's still Galax so we can see what they offer as they were always the direct competitor to EVGA for XOC GPUs but the industry continues to push towards PC computing as a commodity. I don't like it but I understand that the target market has always been 99% everyday users that just want to buy something, plug it in and use it as is. 😞

 

 

  • Thumb Up 1

Electrosoft Prime: 7950X3D | MSI X670E Carbon  | MSI Suprim X Liquid 4090 | AC LF II 420 | G.Skill 6000 A-Die 2x32GB | Samsung 990 Pro 2TB | EVGA 1600w P2 | Phanteks Ethroo Pro | Alienware AW3225QF 32" OLED

Eurocom Raptor X15 | 12900k | Nvidia RTX 3070ti | 15.6" 1080p 240hz | Kingston 3200 32GB (2x16GB) | Samsung 980 Pro 1TB Heatsink Edition
Heath: i9-12900k | EVGA CLC 280 | Asus Strix Z690 D4 | Asus Strix 3080 | 32GB DDR4 2x16GB B-Die 4000  | WD Black SN850 512GB |  EVGA DG-77 | Samsung G7 32" 144hz 32"

MelMel:  (Retrofit currently in progress)

 

 

 


 


4 minutes ago, electrosoft said:

 

Of course we're comparing a massive outlier setup (KPE 3090, custom water, MO-RA, chiller) vs a 4090 FE with zero overclocks, but I do agree some of the numbers are sus, especially the 4090 numbers, unless... *gasp* Nvidia was fluffing us hardcore with the 4090 presentation.

 

 

 

RT is definitely easy to differentiate from normal rendering even in basic forms. When I turned off RT in WoW, I noticed immediately and they only used it for shadows and certain lighting situations. I think it is the future but it is a long ways away till it is the dominant rendering preference vs rasterization.

 

The industry feels a bit lost now with EVGA bowing out in regards to enthusiasts and pushing hardware and making superior designs. There's still Galax so we can see what they offer as they were always the direct competitor to EVGA for XOC GPUs but the industry continues to push towards PC computing as a commodity. I don't like it but I understand that the target market has always been 99% everyday users that just want to buy something, plug it in and use it as is. 😞

 

 

Adapt or die, and it's looking like it's about time to pick up the soldering iron and rework stations :) 

  • Thumb Up 1
  • Like 1
  • Bump 1

2 hours ago, electrosoft said:

 

Of course we're comparing a massive outlier setup (KPE 3090, custom water, MO-RA, chiller) vs a 4090 FE with zero overclocks, but I do agree some of the numbers are sus, especially the 4090 numbers, unless... *gasp* Nvidia was fluffing us hardcore with the 4090 presentation.

 

 

 

RT is definitely easy to differentiate from normal rendering even in basic forms. When I turned off RT in WoW, I noticed immediately and they only used it for shadows and certain lighting situations. I think it is the future but it is a long ways away till it is the dominant rendering preference vs rasterization.

 

The industry feels a bit lost now with EVGA bowing out in regards to enthusiasts and pushing hardware and making superior designs. There's still Galax so we can see what they offer as they were always the direct competitor to EVGA for XOC GPUs but the industry continues to push towards PC computing as a commodity. I don't like it but I understand that the target market has always been 99% everyday users that just want to buy something, plug it in and use it as is. 😞

 

 

Nowadays just about any air-cooled 3090 Ti can match me, including your own. I only have a 3090, non-Ti.

That sweet performance in the video above is exactly what you've already got under the hood with that 3090 Ti KP. You may even be a little faster.

13900KF


23 minutes ago, tps3443 said:

Nowadays just about any air-cooled 3090 Ti can match me, including your own. I only have a 3090, non-Ti.

That sweet performance in the video above is exactly what you've already got under the hood with that 3090 Ti KP. You may even be a little faster.

 

The only fair and valid comparison is another 3090Ti under the exact same conditions. Any other comparison is invalid but still fun to see. 😀

 

 

Electrosoft Prime: 7950X3D | MSI X670E Carbon  | MSI Suprim X Liquid 4090 | AC LF II 420 | G.Skill 6000 A-Die 2x32GB | Samsung 990 Pro 2TB | EVGA 1600w P2 | Phanteks Ethroo Pro | Alienware AW3225QF 32" OLED

Eurocom Raptor X15 | 12900k | Nvidia RTX 3070ti | 15.6" 1080p 240hz | Kingston 3200 32GB (2x16GB) | Samsung 980 Pro 1TB Heatsink Edition
Heath: i9-12900k | EVGA CLC 280 | Asus Strix Z690 D4 | Asus Strix 3080 | 32GB DDR4 2x16GB B-Die 4000  | WD Black SN850 512GB |  EVGA DG-77 | Samsung G7 32" 144hz 32"

MelMel:  (Retrofit currently in progress)

 

 

 


 


Jufes continuing to dump all over DDR5 pushing A-die to 7600 vs tuned B-die 4100 for gaming....

 

"If you're a DDR5 fanboy or you bought the wrong product, I'm sorry."

"If you took advice from an extreme overclocker that don't play games or mainstream tech tubers that don't know how to tune machines, I'm sorry again."

 

Savage lol....

 

 

 

Electrosoft Prime: 7950X3D | MSI X670E Carbon  | MSI Suprim X Liquid 4090 | AC LF II 420 | G.Skill 6000 A-Die 2x32GB | Samsung 990 Pro 2TB | EVGA 1600w P2 | Phanteks Ethroo Pro | Alienware AW3225QF 32" OLED

Eurocom Raptor X15 | 12900k | Nvidia RTX 3070ti | 15.6" 1080p 240hz | Kingston 3200 32GB (2x16GB) | Samsung 980 Pro 1TB Heatsink Edition
Heath: i9-12900k | EVGA CLC 280 | Asus Strix Z690 D4 | Asus Strix 3080 | 32GB DDR4 2x16GB B-Die 4000  | WD Black SN850 512GB |  EVGA DG-77 | Samsung G7 32" 144hz 32"

MelMel:  (Retrofit currently in progress)

 

 

 


 


6 hours ago, Mr. Fox said:

I do not use DLSS and don't really care about it or the AMD version of it. I don't game often enough for it to matter... maybe 2 or 3 hours YTD.

 

Why all the sudden hate and "fake frame" chatter? Seems like not long ago many were super excited about it and wouldn't buy a GPU without it. Did the world suddenly figure out it was just another silly gimmick for rabid gamers to wet themselves over?

 

From what I have known...Apologies for a long post.

 

First about Game AA technologies...

 

We used to have MSAA (Multi-Sample Anti-Aliasing). It takes multiple samples per pixel, mostly along geometry edges, and blends them down to the output resolution, a lighter cousin of full super sampling. That is why it is ultra hardcore taxing on GPUs, but it does not lower image quality at all; it only adds quality.
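As a rough illustration of why these AA approaches differ so much in cost, here is a simplified sample-count model. It is only a sketch: real costs depend heavily on engine and hardware, and the numbers just count samples.

```python
# Simplified per-frame sample-count model for different AA approaches at 2560x1440.
# Real costs depend on the engine and hardware; this only counts samples, as an illustration.
W, H = 2560, 1440
pixels = W * H

ssaa_4x_shaded  = pixels * 4   # 4x super sampling: every pixel shaded 4 times
msaa_4x_shaded  = pixels * 1   # 4x MSAA: roughly 1 shading sample per interior pixel...
msaa_4x_samples = pixels * 4   # ...but 4 coverage/depth samples stored and resolved
taa_shaded      = pixels * 1   # TAA: 1 shading sample per frame, reusing previous frames

print(f"SSAA 4x shaded samples: {ssaa_4x_shaded:,}")
print(f"MSAA 4x shaded samples: {msaa_4x_shaded:,} (plus {msaa_4x_samples:,} coverage/depth samples)")
print(f"TAA shaded samples:     {taa_shaded:,} per frame, amortized over time")
```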

 

Nowadays almost all games use TAA (Temporal Anti-Aliasing). It reuses data from previous frames, and effects like ambient occlusion, shadows and lighting are rendered at reduced quality and resolved over time. TAA causes a blur / vaseline effect that reduces picture quality, especially in motion.

 

Modern Gaming and AA

 

So modern games can't really implement MSAA, for two reasons. First, the way lights are handled limits it: with classic "forward rendering", every light source is shaded together with the textures and effects in one pass over the geometry, so with a ton of lights the GPU hit is massive, and adding super sampling on top multiplies that cost. Developers therefore moved to deferred rendering, which has a separate rendering path and applies lighting in additional passes over the final scene, and MSAA does not combine well with that. Second, TAA lets developers target weak consoles easily, since it doesn't need the heavy GPU and CPU power that MSAA's ultra-high requirements demand. And thus MSAA is effectively dead now.
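A toy cost model of the forward vs deferred trade-off described above. All numbers are invented purely for illustration; real renderers are far more nuanced.

```python
# Toy lighting-cost model: forward shading pays per (fragment x light) including overdraw,
# while deferred shading decouples geometry from lighting and pays per (screen pixel x light).
# All numbers below are invented for illustration only.
screen_pixels = 2560 * 1440
overdraw      = 3          # assumed average fragments rendered per screen pixel
num_lights    = 50         # assumed number of dynamic lights in the scene

forward_cost  = screen_pixels * overdraw * num_lights    # every fragment shaded against every light
deferred_cost = screen_pixels * overdraw                  # G-buffer fill: geometry pass, no lighting
deferred_cost += screen_pixels * num_lights                # lighting pass over the final visible pixels

print(f"forward  ~{forward_cost / 1e9:.2f}B light-shading operations")
print(f"deferred ~{deferred_cost / 1e9:.2f}B operations")
```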

 

A notable game that still destroys many of today's AAA titles without TAA, and it is 10 years old: Crysis 3. Crytek shipped MSAA alongside TXAA (a blend of FXAA and MSAA), and the plain MSAA is still the best of them. Digital Foundry was drooling over the new Crysis Demaster trilogy, and guess what? All of the trash Demasters have TAA baked in, while the originals never had it. In motion MSAA destroys TAA without question, and with its heavy post-processing and extreme tessellation the original Crysis 3 looks way, way better, not to mention the other massive downgrades in the new console ports. The old Crysis trilogy was built for PC first, which is why Crysis 2 and 3 melted the PS3 and X360 and the PS4/XB1 never got them; the modern Demasturds? They run on Switch, LOL.

 

Many modern games have TAA baked in and you cannot remove it; if you do, the game breaks its reflections, lighting, LOD effects and so on. Metro Exodus is one of them, so we have to accept the low-res TAA look no matter what. Resident Evil 2 and Resident Evil 3 do not have TAA, which is why those games always look super crisp. However, they received the RE Engine upgrades from RE8 Village, which means TAA is now mandatory, because RE8 has it on top of RT, and not only did TAA hurt them, the games also took a massive downgrade in ambient occlusion and lighting effects.

 

Red Dead Redemption 2 suffers from a horrible TAA implementation that blurs out all the beautiful vistas, a massive shame since it is the best-looking game with a real game inside it, unlike many that merely try (CP2077, not a game). You cannot remove it at all. They did recently add FSR 2.0 officially, after a community mod added it first.

 

DOOM 2016 uses SMAA while DOOM Eternal uses TAA, which is why DOOM 2016 looks much better in fidelity: id Software ditched idTech 6's SMAA for TAA in idTech 7. Since the DOOM topic came up, there is a mod called Carmack's Reshade for DOOM 2016 that uses ReShade, and that thing looks balls-to-walls solid; give it a shot. The sharpening ReShade applies is so well done that it cleans up the image and adds an insane oomph factor.

 

Control was used in a lot of RT demos, and especially DLSS demos, to claw back the performance hit RT causes. But guess what? You can remove TAA from the game's .exe itself and enjoy a sharper, crisper image at native resolution.

 

Enter DLSS and FSR

 

DLSS (Deep Learning Super Sampling), closed source and proprietary. It does temporal upscaling: it uses motion vectors plus a sharpening pass on an image that already uses TAA, and DLSS improves that image because it technically removes the blur TAA introduces. BUT it is really only worth it at 4K, where roughly a 1080p image is rendered and upscaled to 4K; at 1080p output Nvidia starts from a very low-res image before applying the temporal upscaling. Early DLSS also had ghosting / shimmering artifacts, most of which have been improved by 2.x. Since it is closed source, it needs Nvidia to run the game on their servers and generate the data that RTX GPUs then use when the game is loaded through the driver via the nvngx_dlss.dll. There are DLL mods too: older games that never got improved DLSS versions can use the newer DLSS DLL found in new games, since it is distributed with the game files and TPU hosts those files.

 

There are a ton of presets here: DLSS Quality, Balanced, Performance, etc.

 

Then in 2021-2022, AMD launched FSR (FidelityFX Super Resolution), 100% open source. v1.0 did not need any special tensor cores or anything on the GPU silicon, but it was not temporal either; it used spatial upscaling only. Temporal upscaling factors in motion vector data, so the spatial-only FSR 1.0 was not great and mostly helped very weak cards. AMD then upgraded it with FSR 2.0, which uses temporal upscaling and does not even sit on top of TAA, it flat-out replaces it. It still has some issues with very thin lines (in RDR2 with FSR 2.0, power lines become thick), and they fixed more ghosting and shimmering issues in the latest FSR 2.1, so it is constantly evolving and open to all GPUs across all ecosystems. Like DLSS, this is really only worth it at 4K rather than 1440p/1080p, since ultimately it is still upscaling.

 

Like DLSS, FSR also has presets: Quality, Balanced, Performance, etc.
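To make the "it's ultimately upscaling" point concrete, here is a small sketch of the internal render resolutions behind those presets. The per-axis scale factors below are the commonly cited DLSS 2 / FSR 2 values and should be treated as assumptions, not figures from this post.

```python
# Internal render resolution per upscaler preset (per-axis scale factors are
# the commonly cited DLSS 2 / FSR 2 values and are assumed here for illustration).
PRESETS = {
    "Quality":           1 / 1.5,   # ~67% per axis
    "Balanced":          1 / 1.7,   # ~59% per axis
    "Performance":       1 / 2.0,   # 50% per axis
    "Ultra Performance": 1 / 3.0,   # ~33% per axis
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    s = PRESETS[preset]
    return round(out_w * s), round(out_h * s)

for preset in PRESETS:
    w, h = internal_resolution(3840, 2160, preset)
    share = (w * h) / (3840 * 2160)
    print(f"4K {preset:18s} -> renders {w}x{h} ({share:.0%} of the output pixels)")
# e.g. Performance mode at 4K renders ~1920x1080, i.e. only 25% of the output pixels.
```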

 

So what did AMD do? No gimmick cores, none of Nvidia's chest-thumping tensor core mumbo jumbo, and it flat-out punched Nvidia's proprietary DLSS 2.x to the ground. Especially now, there is no particular visual fidelity difference anymore between FSR 2.x and DLSS 2.x, even in motion, since FSR now uses temporal data and therefore motion vector data as well. So Nvidia comes up with frame insertion to counter, lol.

 

There are also a lot of game comparisons at TPU that we can look at. Again, they did it without any proprietary technology; games keep getting FSR 2.0 mods, the RPCS3 emulator has included it in its code, and the Xenia X360 emulator has added it to its code base as well.

 

DLSS 3.x, as mentioned in many of the videos posted here: Nvidia is now adding totally new, fake frame data. Unlike the old upscaling techniques, this adds new frames rather than just reworking rendered ones. Out of a sequence of frames, the DLSS 3.x generated ones are not real rendered images; it takes the motion vector data and the rendered images and interpolates an extra frame in between, which causes lag and an input latency spike. So they added Nvidia Reflex to DLSS 3.x; Reflex is available for almost all GPUs, adds some CPU overhead, but reduces frame-time latency.
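A simplified sketch of why interpolated frames add latency even as the FPS counter goes up. This is a toy model: real pipelines have more stages, the one-frame-delay assumption is mine, and Reflex changes the picture.

```python
# Toy model: frame generation interpolates between two rendered frames, so the
# in-between frame can only be displayed after the *next* real frame exists.
# Numbers and the one-frame-delay assumption are illustrative only.
rendered_fps = 60.0
frame_time_ms = 1000.0 / rendered_fps      # ~16.7 ms per real frame

displayed_fps = rendered_fps * 2           # one generated frame per real frame
added_latency_ms = frame_time_ms           # must wait for the next real frame to interpolate

print(f"Rendered: {rendered_fps:.0f} fps, displayed: {displayed_fps:.0f} fps")
print(f"Extra latency from interpolation: ~{added_latency_ms:.1f} ms (before Reflex and other mitigation)")
```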

 

And how is Nvidia claiming reduced CPU load when you have more FPS? That is not how it normally works: more FPS means more CPU work, which is exactly why we benchmark at 1080p to measure CPU performance and not at 4K, where it is GPU-dependent as we all know. Nvidia can claim no CPU hit because there is no frame rendering for those frames, lol; they are "fake frames" created by their so-called tensor cores. This causes artifacts, latency spikes and other unwanted effects we do not fully know about yet, including how the picture quality fares.
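The CPU-bound vs GPU-bound point can be sketched with a one-line bottleneck model. The timings are invented purely for illustration.

```python
# Toy bottleneck model: the game's real frame rate is capped by the slower of
# CPU frame time and GPU frame time; generated frames bypass the CPU entirely.
# All timings below are invented for illustration.
def real_fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 8.0   # assumed CPU cost per simulated frame
print(f"1080p (GPU 5 ms):  {real_fps(cpu_ms, 5.0):.0f} fps -> CPU-bound, good for CPU benchmarking")
print(f"4K    (GPU 20 ms): {real_fps(cpu_ms, 20.0):.0f} fps -> GPU-bound")
# Frame generation then doubles the *displayed* rate without touching cpu_ms,
# which is why the CPU load per displayed frame appears to drop.
```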

 

Nvidia is not only upscaling TAA data, it is now adding fake data too, all exclusive to the RTX 40 series, with the excuse that Ada has an OFA (Optical Flow Accelerator), even though one is present on Ampere. If they added DLSS 3.x to Ampere, it would nuke the 4080 cards without question; this is on-purpose gimping, and it essentially kills DLSS 2.x. DLSS 3.x will run on Ampere and Turing, but the frame interpolation does not work on the old cards.

 

Conclusion

As you have already mentioned, a lot of people are simply into consuming such gimmicks. Not many will think this deeply about the tech; they simply accept the low-quality BS called upscaling. That is how Nvidia was able to convince everyone about their "DLSS magic" and keep them buying Nvidia only: buy a $1600 card, then use upscaling, lol. Fantastic. This is just like how BGA cancer is overrunning the real PC, and how Microsoft's strategy is fine by many. But the 3.x frame interpolation probably kicks Ampere users hard enough that more people are speaking up, and adding fake frame data is also a bit too much on the nose.

 

Just look at that Cyberpunk 2077 RT Overdrive mode, lmao: 21 FPS at 4K, then install the magical upscaler with fake frames, you get 120 FPS, and everyone claps. At this point, why even buy mirrorless cameras, high-quality 1" sensor compacts, or IMAX Arri Alexa rigs? Just use a smartphone sensor and apply AI, the so-called "computational photography", or buy a 2K OLED with an upscaler and get away with it, right? That is how all these upscaling technologies that have polluted the PC space sound to me (both FSR and DLSS; hey, at least one of them is free).

 

As @Clamibot said, instead of those gimmick tricks more CUDA cores would have done a much better job. Taking a 50%+ performance hit for so-called RT and then papering over it with upscaling seems counterintuitive, especially on PC, which is famous for fidelity; Nvidia talks up 8K gaming when in reality it is not 8K but an upscaled image. RT is nowhere near as good as rasterization, especially when old games destroy new ones on pure art style; look at Batman: Arkham Knight from 2015, it looks absolutely stunning next to so many of the garbage games we have now.

 

I would rather use DSR (Dynamic Super Resolution), an awesome feature Nvidia did: render the game at a higher resolution and display it on your native display resolution. They even upgraded it: DLDSR (Deep Learning Dynamic Super Resolution) uses the tensor cores to downscale that higher-resolution render to the native display resolution even better. Neither of these is an upscaler, which is why they are superior: they improve on the native-resolution image.
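For scale, a quick sketch of what DSR/DLDSR factors mean in raw pixels. The factor list is the set commonly exposed in the Nvidia driver and is an assumption here, not something from the post.

```python
# DSR/DLDSR factors are total-pixel multipliers applied to the native resolution.
# The factors listed are the ones commonly exposed in the Nvidia driver (assumed here).
import math

native_w, native_h = 2560, 1440
for factor in (1.78, 2.25, 4.00):      # DLDSR commonly offers 1.78x/2.25x; classic DSR goes up to 4.00x
    scale = math.sqrt(factor)          # per-axis scale = sqrt(total pixel multiplier)
    w, h = round(native_w * scale), round(native_h * scale)
    print(f"{factor:.2f}x DSR of 1440p -> render at ~{w}x{h}, then downscale to {native_w}x{native_h}")
# e.g. 2.25x of 1440p renders at ~3840x2160 (4K) and downsamples to the 1440p panel.
```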

  • Like 3

Helios (WIP)

i9 10900K // Trident Z Royal C16 4000MHz B-Die 32GB // ASUS Maximus XIII APEX // Noctua DH15 Chromax // RTX3090Ti FE // Alienware 360Hz G-Sync Ultimate IPS FHD // Seasonic Prime TX 1000 Titanium // Fractal Meshify 2XL

 

Ethereal Ranger

Alienware 17 R1 // i7 4710MQ // 16GB DDR3L 2133MHz // 980M 860M loaner // Windows 8.1


1 hour ago, electrosoft said:

Jufes continuing to dump all over DDR5 pushing A-die to 7600 vs tuned B-die 4100 for gaming....

 

"If you're a DDR5 fanboy or you bought the wrong product, I'm sorry."

"If you took advice from an extreme overclocker that don't play games or mainstream tech tubers that don't know how to tune machines, I'm sorry again."

 

Savage lol....

 

 

 

This is exactly why I've held on to my 11900K so long. At 4,066 MHz CL14 Gear 1, it does 65 GB/s bandwidth and 36 ns latency. And the IPC is better than most would ever think.
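Those numbers line up with simple dual-channel DDR4 math. A rough sketch: the 36 ns figure is a measured full-path latency, which the CAS-only term below does not capture.

```python
# Back-of-the-envelope DDR4 math for DDR4-4066 CL14 in dual channel (Gear 1).
# The measured 36 ns is full load-to-use latency; the CAS term below is only one part of it.
mt_s = 4066                  # mega-transfers per second
cl = 14                      # CAS latency in memory clock cycles

bandwidth_gb_s = mt_s * 8 * 2 / 1000   # 8 bytes per transfer per channel, 2 channels
cas_ns = cl * 2000 / mt_s              # memory clock = MT/s / 2, so t(ns) = CL * 2000 / MT/s

print(f"Theoretical dual-channel bandwidth: ~{bandwidth_gb_s:.0f} GB/s")   # ~65 GB/s
print(f"CAS-only latency: ~{cas_ns:.1f} ns (the measured ~36 ns includes the whole path)")
```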
 

I can also run 5.4Ghz all cores at 250 watts Cinebench. 🫣
 

But, I’ve got a nice Z690 board and DDR5 now. Waiting on 13th Gen. 
 

 

 

 


  • Thumb Up 1
  • Bump 1

13900KF


1 hour ago, Ashtrix said:

 

From what I have known...Apologies for a long post.

 

First about Game AA technologies...

 

We used to have MSAA, Multi Sampling AA, this uses higher resolution of all frames and blends them into the resolution we set, this is why it is ultra hardcore taxing on GPUs as it is using Super Samping technique. This does not have any sort of lower image quality at all. Only super high quality.

 

Nowadays almost all games are using TAA, Temporal Anti Aliasing, it uses reduced graphics effects of previous frames - Ambient Occlusion, Shadows, Lighting for instance. And using TAA in motion causes a blur / vaseline effect reducing the picture quality esp when in Motion.

 

Modern Gaming and AA

 

So modern games cannot implement MSAA due to 2 particular reasons, First Reason the lights used in the games rendering aspect called as "Forward Rendering" limits it, apparently if we have a ton of lights on the Polygons in the game and applying the rendering on all of them in one pass, the GPU hit will be massive because each and every light source will have it's data associated with the textures, GFX effects and etc done at same time so adding Super Sampling means a lot. So developers moved away from Forward Rendering to Deferred Rendering which has separate rendering path and applies with some passes on the final scene. Second reason TAA gets developers make games for trash consoles easily as it doesn't require the heavy GPU power and CPU power to get it done unlike MSAA with ultra high GPU requirement. And thus MSAA is dead now.

 

Notable game which destroys so many AAA today without TAA and is 10 years old - Crysis 3,. CryTek used MSAA and their own TXAA which combines both FXAA and MSAA. Still MSAA in the game is the best. Digital Foundry was drooling over new Crysis Demaster Trilogy, and guess what ? All of the trash Demasters have TAA baked in vs originals which never had that. In motion MSAA destroys TAA without a single question and with super post processing extreme tessellation Crysis 3 the originals look way way better not to talk about other massive downgrades the new Console Ports are yeah old Crysis Trilogy were PC Exclusive, which is why Crysis 2 and 3 melted PS3 X360 and PS4 XB1 never got them, and the modern Demasturds ? They run on Switch LOL.

 

Many modern games have TAA baked in and you cannot remove it - if you force it off, reflections, lighting and LOD effects break. Metro Exodus is one of those, so you have to accept the low-res TAA look no matter what. The Resident Evil 2 and 3 remakes didn't force TAA, which is why they always look super crisp. However, they were later updated to the newer RE Engine feature set from RE8 Village, which makes TAA effectively mandatory - and on top of the RT additions, the games also took a noticeable downgrade in ambient occlusion and lighting effects.

 

Red Dead Redemption 2 suffers from a horrible TAA implementation that blurs out all the beautiful vistas - a massive shame, since it's the best-looking game that also has a real game inside it, unlike many that merely try (CP2077, not a game). You cannot remove it at all. They did recently add FSR 2.0 officially, after a community mod had already done it.

 

DOOM 2016 uses SMAA while DOOM Eternal uses TAA, which is why DOOM 2016 looks much better in fidelity - id Software ditched idTech 6's SMAA for TAA in idTech 7. Since DOOM came up: there's a mod called Carmack's Reshade for DOOM 2016, and man, that thing looks balls-to-the-wall solid. Give it a shot - the sharpening Reshade applies is so well done that it cleans up the image and adds an insane oomph factor.

 

Control was used a lot in RT demos, especially to show DLSS papering over the performance hit RT causes. But guess what? You can disable TAA via the game's .exe itself and enjoy a sharper, crisper image at native resolution.

 

Enter DLSS and FSR

 

DLSS (Deep Learning Super Sampling), closed source and proprietary. It does temporal upscaling using motion vectors plus a sharpening pass on a TAA'd image, and it improves that image because it effectively removes some of the blur TAA introduces. BUT it's really only worthwhile at 4K, where (in Performance mode) a 1080p frame gets upscaled to 4K - at a 1080p output target the internal resolution gets very low before the temporal upscale is applied. Early DLSS also had a lot of ghosting and shimmering, most of which got cleaned up by 2.x. DLSS 1.0 had to be trained per game on Nvidia's servers; 2.x uses a generalized model that ships with the game and runs on RTX GPUs through the driver via the nvngx_dlss.dll. There are DLL mods too - older games that never got newer DLSS versions can borrow the updated DLL from newer games, since it's distributed in the game files, and TPU hosts those DLLs.
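
For intuition, here's a stripped-down 1-D "temporal upscale" in Python. This is purely a conceptual sketch, nothing like the actual DLSS network, which replaces the hand-written blend below with a learned model fed by motion vectors and jittered samples:

# 1-D toy: reconstruct an 8-sample "high res" signal from 4-sample frames
# rendered with an alternating half-pixel jitter, accumulated over time.
def upscale_step(history, low_res, jitter, alpha=0.5):
    out = history[:]
    for i, v in enumerate(low_res):
        hi_index = i * 2 + jitter                # each low-res sample lands on one hi-res slot
        out[hi_index] = alpha * v + (1 - alpha) * history[hi_index]
    return out

target = [0, 1, 2, 3, 4, 5, 6, 7]                # the "ground truth" hi-res signal
history = [0.0] * 8
for frame in range(8):
    jitter = frame % 2                           # alternate which hi-res slots get fresh data
    low_res = [target[i * 2 + jitter] for i in range(4)]  # what a jittered low-res render would see
    history = upscale_step(history, low_res, jitter)
print([round(v, 2) for v in history])            # close to the hi-res target after a few frames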

 

There are a ton of presets too - DLSS Quality, Balanced, Performance, etc.

 

Then in 2021-2022 AMD launched FSR (FidelityFX Super Resolution), 100% open source. v1.0 didn't need tensor cores or any special silicon, but it wasn't temporal either - it was a pure spatial upscaler. Temporal upscaling factors in motion vector data, so spatial-only FSR 1.0 wasn't great and mainly helped very weak cards. AMD then upgraded it to FSR 2.0, which does temporal upscaling and doesn't sit on top of TAA - it flat-out replaces it. It still has some issues with very thin geometry (RDR2's power lines get thick with FSR 2.0), and the latest FSR 2.1 fixed more of the ghosting and shimmering, so it keeps evolving and stays open to all GPUs across all ecosystems. Like DLSS, it's really only worthwhile at 4K rather than 1440p/1080p, since ultimately it's still upscaling.
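
Here's roughly what "spatial only" means, as a toy 1-D sketch in Python - a stand-in, not AMD's actual EASU/RCAS passes (FSR 1.0 uses an edge-adaptive upscale plus a contrast-adaptive sharpen), but the single-frame nature is the point:

# Spatial upscaler: uses only the current low-res frame, no history, no motion vectors.
def upscale_2x(low_res):
    out = []
    for i, v in enumerate(low_res):
        out.append(v)                                     # keep the original sample
        nxt = low_res[i + 1] if i + 1 < len(low_res) else v
        out.append((v + nxt) / 2.0)                       # invent the in-between sample by interpolation
    return out

def sharpen(img, amount=0.3):
    out = img[:]
    for i in range(1, len(img) - 1):
        local_avg = (img[i - 1] + img[i + 1]) / 2.0
        out[i] = img[i] + amount * (img[i] - local_avg)   # push pixels away from their neighborhood average
    return out

low = [0.0, 0.0, 1.0, 1.0]
print(sharpen(upscale_2x(low)))  # upscaled and sharpened, but detail that was never rendered can't come back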

 

Like DLSS, FSR also has presets - Quality, Balanced, Performance, etc.

 

So what did AMD do? No gimmick cores - none of Nvidia's chest-thumping tensor-core mumbo jumbo - and it still punched Nvidia's proprietary DLSS 2.x to the ground, especially now that there's no meaningful visual fidelity difference between FSR 2.x and DLSS 2.x even in motion, since FSR now uses temporal data and motion vectors as well. So Nvidia comes up with frame insertion to counter, lol.

 

There are also plenty of side-by-side game comparisons at TPU worth a look. Again, AMD did it without any proprietary technology: games keep getting FSR 2.0 mods, the RPCS3 emulator has integrated it into its code, and the Xenia X360 emulator added it to its code base as well.

 

DLSS 3.x: as many of the videos posted here mention, Nvidia is now adding entirely fake frame data. Unlike the old upscaling techniques, this inserts new frames - if you see frames 1, 2, 3, 4 on screen, with DLSS 3.x something like frames 2 and 3 are not real rendered images. The GPU takes two rendered frames plus their motion vector data and generates an in-between frame, which adds lag and an input latency spike, since the real frame has to be held back while the generated one is produced. That's why Nvidia bundles Reflex with DLSS 3.x - Reflex is available on almost all their GPUs, adds a bit of CPU overhead, but reduces the queued frame-time latency.
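
A bare-bones sketch of frame interpolation in Python - a conceptual toy that assumes simple linear motion; the real thing uses the optical flow accelerator plus game motion vectors and a network to fill in occlusions:

# Generate a fake in-between frame from two rendered frames.
# Frames are dicts of object -> x position; motion is assumed linear.
def interpolate_frame(frame_a, frame_b, t=0.5):
    return {obj: (1 - t) * frame_a[obj] + t * frame_b[obj] for obj in frame_a}

rendered_1 = {"player": 10.0, "enemy": 40.0}
rendered_2 = {"player": 14.0, "enemy": 38.0}

fake_1_5 = interpolate_frame(rendered_1, rendered_2)   # shown *between* the two real frames
print(fake_1_5)   # {'player': 12.0, 'enemy': 39.0}

# Note: rendered_2 must already exist before fake_1_5 can be built,
# so the real frame is delayed on its way to the screen - that's the latency cost.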

 

So how can Nvidia claim more FPS with no extra CPU load? That's not how it normally works - more FPS means more CPU work, which is exactly why we bench at 1080p to measure CPU performance and not at 4K, where it's GPU-bound as we all know. The answer is that there's no real frame being rendered, lol: it's a "fake frame" the GPU generates and slots in, so the CPU never simulates or submits it. What artifacts, latency spikes and other side effects that brings, and how the picture quality holds up, we don't really know yet.
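
That's the whole trick in loop form - a hypothetical sketch of a frame loop with frame generation turned on, with made-up function names just for illustration:

import time

def simulate_and_render():          # placeholder for the CPU-side game tick plus draw submission
    time.sleep(0.01)                # pretend a real frame costs 10 ms of CPU/GPU work
    return "real frame"

def generate_frame():               # placeholder for the GPU-only interpolation work
    return "generated frame"

presented = []
for i in range(6):
    presented.append(simulate_and_render())   # the CPU is involved here...
    presented.append(generate_frame())        # ...but not here: no game logic, no draw calls
print(len(presented), "frames presented,", 6, "frames of CPU work")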

 

So Nvidia is not only upscaling TAA-era data, they're now adding fake frames on top - and making it exclusive to the RTX 40 series by pointing at Ada's OFA (Optical Flow Accelerator), even though Ampere has one too. If they enabled DLSS 3 frame generation on Ampere, it would nuke the 4080 cards' value without question, so this is gimping on purpose, and it essentially sidelines DLSS 2.x. DLSS 3.x titles will still run on Ampere and Turing, but the frame generation part doesn't work on the older cards.

 

Conclusion

As you already mentioned, plenty of people are happy to consume these gimmicks. Not many think this deeply about the tech; they simply accept the low-quality BS called upscaling, which is how Nvidia convinced everyone about their "DLSS magic" and kept them buying Nvidia only - buy a $1600 card, then run it with upscaling, lol. Fantastic. It's the same way BGA cancer has crowded out the real PC, and the same way Microsoft's strategy is apparently fine by many. But since DLSS 3 frame generation leaves Ampere owners out, more people are finally speaking up - and inventing frame data is a bit too much on the nose anyway.

 

Just look at that Cyberpunk 2077 RT Overdrive mode, lmao: 21 FPS at 4K, then install the magical upscaler with its fake frames, you get 120 FPS and everyone claps. At this point, why even buy mirrorless or DSLR cameras, high-quality 1" sensor compacts or an ARRI Alexa - just use a smartphone sensor and apply AI "computational photography", or buy a 2K OLED and let the upscaler sort it out, right? That's how all these upscaling technologies (both FSR and DLSS - hey, at least one of them is free) that have polluted the PC space sound to me.

 

As @Clamibot said, instead of those gimmick tricks, more CUDA cores would have done a much better job. Taking a 50%+ performance hit for so-called RT and then papering over it with upscaling seems counter-intuitive, especially on PC, which is famous for fidelity - Nvidia spews "8K gaming" when in reality it isn't 8K, just an upscaled image. RT is nowhere near as impressive as good rasterization, especially when old games destroy new ones on pure art style; look at Batman: Arkham Knight from 2015, which still looks absolutely stunning compared to so many of the garbage games we get now.

 

I would rather use DSR (Dynamic Super Resolution), an awesome feature Nvidia did get right: render the game at a higher resolution and scale it down to your native display resolution. They've since upgraded it as DLDSR (Deep Learning Dynamic Super Resolution), which uses the tensor cores to do a smarter downscale, so a lower DSR factor looks about as good as a much higher one. Neither of these is an upscaler, which is why they're superior - they improve on the native-resolution image instead of faking it.
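
Downsampling is the easy direction, which is why it looks so clean. A minimal sketch with a plain box filter - DSR actually uses a configurable smoothness filter and DLDSR swaps that for a learned one, so treat this as an assumption-laden toy:

# Super resolution the honest way: render more pixels, then filter them down.
def downsample_2x(hi_res_row):
    # Average each pair of rendered samples into one native-resolution pixel.
    return [(hi_res_row[i] + hi_res_row[i + 1]) / 2.0
            for i in range(0, len(hi_res_row), 2)]

rendered_at_2x = [0.0, 0.2, 0.9, 1.0, 1.0, 1.0, 0.3, 0.1]   # 8 rendered samples
native_output = downsample_2x(rendered_at_2x)                # 4 displayed pixels
print(native_output)   # [0.1, 0.95, 1.0, 0.2] - every output pixel is built from real rendered data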


Nvidia needs to bring DLSS 3.0 to Ampere graphics cards. Give both GPU generations a fair shake against each other. I mean, they are selling them both! They have tons of Ampere GPUs on their hands to sell alongside Lovelace, so why not give people a reason to buy them?

13900KF

Link to comment
Share on other sites

Zotac 4090 just appeared at caseking at a whopping 2250 euros 😑@jaybee83

IMG_20220925_164253.jpg

  • Thumb Up 1
  • Sad 4

7950X3D| Zotac 4090 AMP Extreme Airo| MSI MPG B650 Edge Wifi| Lian Li Galahad 360 V2| 32GB Kingston Renegade RGBZ 6000|Kingston KC3000 2TB| Fury Renegade 2TB|Samsung 970 Evo 1TB| Lian Li O11 Dynamic Evo| Corsair HX1500i| Samsung Odyssey G9 Neo

Asus Zephyrus G15 (Ryzen 9 6900HS + RTX3080)

 

Link to comment
Share on other sites

 

  • Thumb Up 4
  • Like 1
  • Bump 2

Maximus Z790 Apex ~ 13900K ~ MSI 4090 Suprim X ~ G.Skill Trident Z5 7800 Mhz ~ 1TB Samsung 980 Pro ~ Thermaltake 1650W GF3 ~ AACH100HP Water Chiller ~ Praxis Wet Bench Flat

MSI MEG X570S Unify-X MAX ~ 5950X ~ GTX 1070 ~ G.Skill 3200 Mhz ~ 500GB Samsung 980 Pro ~ Lian Li O11 Dynamic EVO Tempered Glass ~ Thermaltake 650W ~ 

Hwbot Profile

Link to comment
Share on other sites

23 hours ago, electrosoft said:

Jufes continuing to dump all over DDR5 pushing A-die to 7600 vs tuned B-die 4100 for gaming....

 

"If you're a DDR5 fanboy or you bought the wrong product, I'm sorry."

"If you took advice from an extreme overclocker that don't play games or mainstream tech tubers that don't know how to tune machines, I'm sorry again."

 

Savage lol....

 

 

 

After I read your post I expected to find him in one of his childish, irrational, bipolar emo-Nazi potty-mouth meltdowns that have made it so difficult to respect him as a subject matter expert; but this was a really good video, and he communicated in a mostly respectable way that made him seem like an authority, with only the occasional unprofessional teenager-grade vulgar expression. The video is spot on. DDR5 is just another gimmick/scam designed to make money and stop consumers from reusing their old parts across so many generations that the hardware manufacturers can't get richer. DDR5 is to memory benchmarks what RAM caching was to storage benchmarks... a blazing-fast speed that is real, but doesn't get used for anything important other than the benchmarks themselves. There will be some isolated examples where it matters for something, somewhere, but gaming certainly is not one of them.

7 hours ago, johnksss said:

 

Hopefully this won't be another sucky enthusiast wannabe example of a cookie-cutter sheeple metachurch product that performs well at stock but sucks at overclocking. And hopefully it won't have the same reliability/stability issues and USB malfunction problems that Ryzen 9 had. What excites me even more is the anticipation of what kind of answer Intel is going to have for it. It's wonderful having the force of competition in play here again. AMD's progress is a win for everyone, including people who won't ever buy their brand.

LMAO at opening remarks here...

 

"...they clock very, very, very high which is kind of exciting like the funny thing about clock speed is it doesn't directly translate into performance but who doesn't love looking at big numbers even if they don't necessarily mean anything like on Bulldozer CPUs..."

- Buildzoid

...or, like DDR5 memory, LOL.

21 hours ago, tps3443 said:


Nvidia needs to bring DLSS 3.0 to Ampere graphics cards. Give both GPU generations a fair shake against each other. I mean, they are selling them both! They have tons of Ampere GPUs on their hands to sell alongside Lovelace, so why not give people a reason to buy them?

Short answer to why... because they're the Green Goblin and doing the right thing is not how they roll. It never has been... ever. But as long as they have the most desirable products, they are licensed to be dishonest, shady and unfair with the people whose money keeps their wheels turning. They're going to price themselves out of a job, though, unless stupidity continues to prevail in how people waste their money. There is always the remote possibility that common sense will prevail and people will say no with their wallets to the retarded pricing.

  • Thumb Up 1
  • Like 1

Wraith // Z790 Apex | 14900KF | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower GF3 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO
Banshee // Z790 Apex Encore | 13900KS | 4090 Gaming OC+Alphacool Block | 48GB DDR5-8600 | RM1200x SHIFT | XT45 1080 Nova || Dark Base Pro 901
Munchkin // Z790i Edge | 14900K | Arc A770 Phantom Gaming OC | 48GB DDR5-8000 | GameMax 850W | EK Nucleus CR360 Dark || Prime AP201 
Half-Breed // Dell Precision 7720 | BGA CPU Filth+MXM Quadro P5000 | Sub-$500 Grade A Refurb || Nothing to Write Home About  

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second.

Link to comment
Share on other sites

3 hours ago, cylix said:

Zotac 4090 just appeared at caseking at a whopping 2250 euros 😑@jaybee83

IMG_20220925_164253.jpg

And we will see the 4090 Ti at 2874 euros (tax included). That's even without the added mining premium. The beauty of living in a tax hell.

2 hours ago, johnksss said:

 

 

Yep, new-gen chips will provide better performance. Remember, AMD scrapped the yearly upgrade cadence - the uplift looks a bit bigger when you only release new products every second year 🙂

 

LEAK

  • Thumb Up 3

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOTTeam PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 

Link to comment
Share on other sites
