NotebookTalk

*Official Benchmark Thread* - Post it here or it didn't happen :D


Mr. Fox


35 minutes ago, Etern4l said:

I guess one could argue that by supporting the Green Goblin, their pricing strategy, as well as anything having to do with AI (the new crypto as regards the demand for GPUs) with our wallets, we are helping them kill PCs, only to later lament collapsing computer sales and further price hikes as a consequence of falling volumes, a death spiral of sorts. Food for thought.

Yes, in no small part, that is one of the reasons I am looking forward to seeing how the Intel Battlemage GPU turns out. I don't like supporting the Green god of Greed and its goblin minions, but I also can't see myself being interested in buying an AMD GPU at this point. It wouldn't surprise me if there are a lot of people thinking this way besides me. One of the things I like most about Intel is that its attention is not pulled away from the PC by consoles, as AMD's is, nor by the things that have distracted NVIDIA (crypto, AI, medical science technologies, etc.).

  • Thumb Up 2

WRAITH // Z790 Apex | 14900KS | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO (T-Rex)

BANSHEE // X870E Carbon | 9950X | 4090 Gaming OC+Alphacool Block | 32GB DDR5-8200 | RM1200x SHIFT | XT45 1080 Nova || Antec C8 (Rhinoceros)

SPECTRE // Z790i Edge | 13900KS | 3090 Ti FTW3 | 48GB DDR5-8200 | RM1000e | EK Nucleus CR360 Direct Die || Prime A21 (Rattlesnake)

HALF-BREED // Precision 7720 | BGA CPU Filth | 32GB DDR4 | Quadro P5000 | 4K Display | Nothing to Write Home About (Turdbook)

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


41 minutes ago, Mr. Fox said:

Yes, in no small part, that is one of the reasons I am looking forward to seeing how the Intel Battlemage GPU turns out. I don't like supporting the Green god of Greed and its goblin minions, but I also can't see myself being interested in buying an AMD GPU at this point. It wouldn't surprise me if there are a lot of people thinking this way besides me.


Makes sense, though the starting point for considering Battlemage would be accepting that 3090 Ti / 4080 performance is really OK for most reasonable purposes. If the mentality continues to be “4090 or bust”, then Jensen will also continue having his way at our expense.

 

BTW, this medical technologies thing is often used as an excuse for investment in AI. I would confidently estimate that less than 1% (OK, 5% to be safe) of GPU power goes into medical research. It's a publicity ploy.

  • Thumb Up 2

"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


14 minutes ago, Etern4l said:


Makes sense, though the starting point for considering Battlemage would be accepting that 3090 Ti / 4080 performance is really OK for most reasonable purposes. If the mentality continues to be “4090 or bust”, then Jensen will also continue having his way.

I can't speak for anyone else, but I'm willing to go there if it means giving support to Intel. And who knows if 3090/4080 level is where its performance ends. Look how far ARC has come just with drivers... massive performance increases. Plus, if they are successful, who knows how Battlemage Gen2 is going to look. Might get pretty interesting.

 

And, if we are honest, there is nothing wrong with 3090 or 4080 performance. It's not 4090-level, but it's also not nearly as severely overpriced. But some of us get hung up on feeling like having the top-performing GPU is imperative. I am starting to rethink that. I hated how much I paid for the 3090 KPE and hate even more how much I paid for my 4090. Big performance, little value... not necessary.

13 minutes ago, Etern4l said:

BTW, this medical technologies thing is often used as an excuse for investment in AI. I would confidently estimate that less than 1% of GPU power goes into medical research. It's a publicity ploy.

Yeah, that would not surprise me in the least. NVIDIA is unmatched in their ability to misrepresent things for secondary gain. They are masters in the art of deception. 

  • Thumb Up 1
  • Thanks 1
  • Like 2
  • Bump 2

WRAITH // Z790 Apex | 14900KS | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO (T-Rex)

BANSHEE // X870E Carbon | 9950X | 4090 Gaming OC+Alphacool Block | 32GB DDR5-8200 | RM1200x SHIFT | XT45 1080 Nova || Antec C8 (Rhinoceros)

SPECTRE // Z790i Edge | 13900KS | 3090 Ti FTW3 | 48GB DDR5-8200 | RM1000e | EK Nucleus CR360 Direct Die || Prime A21 (Rattlesnake)

HALF-BREED // Precision 7720 | BGA CPU Filth | 32GB DDR4 | Quadro P5000 | 4K Display | Nothing to Write Home About (Turdbook)

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


56 minutes ago, Mr. Fox said:

I can't speak for anyone else, but I'm willing to go there if it means giving support to Intel. And who knows if 3090/4080 level is where its performance ends. Look how far ARC has come just with drivers... massive performance increases. Plus, if they are successful, who knows how Battlemage Gen2 is going to look. Might get pretty interesting.

 

And, if we are honest, there is nothing wrong with 3090 or 4080 performance. It's not 4090-level, but it's also not nearly as severely overpriced. But some of us get hung up on feeling like having the top-performing GPU is imperative. I am starting to rethink that. I hated how much I paid for the 3090 KPE and hate even more how much I paid for my 4090. Big performance, little value... not necessary.

Yeah, that would not surprise me in the least. NVIDIA is unmatched in their ability to misrepresent things for secondary gain. They are masters in the art of deception. 

This, 100 percent. I was debating building a new system for myself, but my current one with the 2070 Super and 10th-gen i7 flies through anything I toss at it without hiccups. I do wish my notebook had some sort of video processing besides Xe, but it still does what I need it to do, albeit somewhat slower. I was watching some comparison videos of the Super and the 30xx and 40xx series cards, and the little bit of performance gained for the money just isn't worth it for me right now.

 

 

  • Thumb Up 6

Workstation - Dell XPS 8940 - desktop creative powerhouse

Mobile Workstation - Dell inspiron 5406 2 in 1 - mobile creative beast

Wifey's Notebook - Dell inspiron 3169 - Little gem for our businesses


14 hours ago, Raiderman said:

Thanks bud!

 

[image: 8000 Ram.jpg]

 

noice! welcome to the 8k club! how stable is it? what mobo are you running? and what sticks?

 

12 hours ago, chew said:

 

 

Low vddp helps the most. 1.0 range.

 

I'll give you a breakdown.

 

CPU_VCORE (CPU Core and integrated GPU voltage supply)
CPU_SOC (CPU system uncore voltage, memory controller (UMC), north bridge IO, SMU, FCH, DF...)
CPU_VDDIO_MEM (CPU memory I/O voltage, and the voltage supplier for VDDP)
DDR_VDD (DDR PMIC VDD Voltage)
DDR_VDDQ (DDR PMIC VDDQ Voltage)

CPU VDD_MISC (CPU misc voltage for PCIe, internal clockgen, and most important, VDDG voltage is driven by this voltage)
VDDP (CPU DDR Phy voltage)
VDDG (Global memory interface(GMI) voltage)

for Zen CCD/CCX overclocking, CPU Vcore and CPU Vcore loadline calibration
for Infinity Fabric, CPU_SOC, VDDG and MEMCLK
for memory overclocking, VDDIO_MEM, VDDP and CPU_SOC(UMC)
for VDDIOMEM CLDO, VDDP < VDDIOMEM
for VDDMISC CLDO, VDDG =, <, > VDDMISC
CPU_VCORE's internal regulator for igpu is not exposed
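
To make the one hard constraint in that list concrete, here is a minimal Python sketch; the function name and the values fed into it are illustrative only, not from any vendor tool:

```python
# Minimal sanity check for the CLDO relationship above: VDDP is a CLDO
# regulator fed from VDDIO_MEM, so it must sit below its source rail.
# (VDDG, fed from VDD_MISC, may be equal, lower, or higher -- no check.)

def check_vddp(vddio_mem: float, vddp: float) -> bool:
    """Return True if VDDP is valid relative to VDDIO_MEM."""
    ok = vddp < vddio_mem
    status = "OK " if ok else "BAD"
    print(f"{status} VDDP {vddp:.3f} V vs VDDIO_MEM {vddio_mem:.3f} V "
          f"(headroom {vddio_mem - vddp:+.3f} V)")
    return ok

# Example using the DDR5-8000 1/2 UCLK values quoted later in this post.
check_vddp(vddio_mem=1.300, vddp=1.025)
```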

 

Knowing this, and knowing that 1/2 UCLK is less demanding, your SOC requirements will also be less.

 

Board choice should not really be a big surprise..... let's look at Intel for that matter..... at the end of the day it's DDR5 signalling that is the enemy. With 2DPC on Intel you're lucky to get 7600; with 1DPC the IMC is the limit.

No different with AMD. I managed to get the Giga Extreme working well, but I have been working on boards with Giga for a while..... my Tachyon B650E (1DPC) can still take the same CPU further.

 

My voltages are as follows. 1/2 tuning is not the same as 1/1.

 

Vcore auto

vsoc 1.25

vddio auto ( 1.3 )

vmisc auto ( 1.1 )

vddp 1.025

vddg 850/850

vdd 1.625

vddq 1.525

 

Start with HCI or Karhu (cache enabled)..... or you're wasting your time. TM5 is good for catching mem issues, not IMC. y-cruncher FFT will also catch IMC.

 

I also have an Intel Tachyon setup... that said, I'm already used to what it takes to get 8000 stable.

 

 

 

 

[image attachment]

 

Intel.

 

[image attachment]

 

damn that's a very nice write-up, thanks for that 🙂 i'll check and see if lowering VDDP will help, fingers crossed!

 

what's the max bootable frequency u were able to do on AM5? irrespective of stability, just bootable.

  • Thumb Up 1

Mine: Hyperion "Titan God of Heat, Heavenly Light, Power" (2022-24)
AMD Ryzen 9 7950X (TG High Perf. IHS) / Asus ROG Crosshair X670E Extreme / MSI Geforce RTX 4090 Suprim X / Teamgroup T-Force Delta RGB DDR5-8200 2x24 GB / Seagate Firecuda 530 4 TB / 5x Samsung 860 Evo 4 TB / Arctic Liquid Freezer II 420 (Push/Pull 6x Noctua NF-A14 IndustrialPPC-3000 intake) / Seasonic TX-1600 W Titanium / Phanteks Enthoo Pro 2 TG (3x Arctic P12 A-RGB intake / 4x Arctic P14 A-RGB exhaust / 1x Arctic P14 A-RGB RAM cooling) / Samsung Odyssey Neo G8 32" 4K 240 Hz / Ducky One 3 Daybreak Fullsize Cherry MX Brown / Corsair M65 Ultra RGB / PDP Afterglow Wave Black / Beyerdynamic DT 770 Pro X Limited Edition

 

My Lady's: Clevo NH55JNNQ "Alfred" (2022-24)
Sharp LQ156M1JW03 FHD matte 15.6" IGZO 8 bit @248 Hz / Intel Core i5 12600 / Nvidia Geforce RTX 3070 Ti / Mushkin Redline DDR4-3200 2x32 GB / Samsung 970 Pro 1 TB / Samsung 870 QVO 8 TB / Intel AX201 WIFI 6+BT 5.2 / Win 11 Pro Phoenix Lite OS / 230 W PSU powered by Prema Mod!


13 hours ago, KING19 said:

It's struggling on an RTX 3070 at 1080p on high....... I know the game will receive patches to improve performance, but this is becoming a bad trend in modern PC gaming. It makes you wanna buy a console to avoid this crap.

Spot on. Hardware is already dead on arrival. Look at you, nvidia, and your sucky 4060 (the xx60-series cards), which is usually the Steam gamer kids' card. Now not good enough anymore. Jensen wants $1199 to fulfill your gaming dreams. If you have two or three teens at home this will cost you nearly $2400-3600 if you don't buy them a console. Happy you are a middle-aged man with grown-up kids, bro @Mr. Fox? I am.
 

If I see the numbers correctly on my small phone, the VRAM problem with 8GB cards isn't the main problem here. It's just a crappily optimized game that needs the best of the best to run with good-enough/stable fps. I could understand a game running awfully if you hit the VRAM buffer all the time, but if the game still runs like trash on barely 3-year-old mid-class hardware without this VRAM limitation, then it's awful.
 

So in short... hardware will now have a lifespan of at most a 2-year cadence. That's the awful part here. And it's intended!! New hardware is already old when you buy it. You can't even have a good experience with $500-600 graphics cards anymore. Perverse!

 

And I'm not sure you can have a nice experience with previous-gen cards like the 3080 (10GB) that nvidia said was their brand-new 4K card. What a joke. People should stop buying new modern graphics cards and only spend a few dollars on consoles. This is the only fix for the situation we are in (for gamers). $500 cards useless…. $1199 cards are 70% overpriced, so very, very bad value. The 4090 isn't that massive a price increase over the previous gen, but still bad value. What's left?

  • Thumb Up 3
  • Like 1
  • Bump 2

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

Papusan @ HWBOT | Team PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


5 hours ago, jaybee83 said:

 

noice! welcome to the 8k club! how stable is it? what mobo are you running? and what sticks?

 

 

damn that's a very nice write-up, thanks for that 🙂 i'll check and see if lowering VDDP will help, fingers crossed!

 

what's the max bootable frequency u were able to do on AM5? irrespective of stability, just bootable.

Mobo is in my sig, and I haven't had a chance to play with it yet to see how stable it is. I did, however, receive another set of the G.Skill Trident Z's in the mail 😁. I had to file for a lost package on my initial purchase, as it never showed up, but it magically appeared yesterday via UPS Mail Innovations. Not sure if the Egg will ask for it back or not.

 

Was looking through my Newegg account and found an old review I posted. All I have to say is..."ahh, the good old days"!

[screenshot of the old Newegg review]

  • Thumb Up 6
  • Bump 2

Lian Li Lancool III | Ryzen 9 9950X | 48gb G-skill Trident Z5 DDR5 8000mhz | MSI Mpg X670E Carbon |

AsRock Taichi Radeon 7900xtx Bykski Block |Raijintek Scylla Pro 360 custom loop| Crucial T700 1tb

WD Black's SN770 500gb/1tb NVME | Toshiba 8Tb 7200rpm Data |

EVGA 1000w SuperNova |32" Agon 1440p 165hz Curved Screen |  Windows 10 LoT 21h2


1 hour ago, Raiderman said:

Mobo is in my sig, and I haven't had a chance to play with it yet to see how stable it is. I did, however, receive another set of the G.Skill Trident Z's in the mail 😁. I had to file for a lost package on my initial purchase, as it never showed up, but it magically appeared yesterday via UPS Mail Innovations. Not sure if the Egg will ask for it back or not.

 

Was looking through my Newegg account and found an old review I posted. All I have to say is..."ahh, the good old days"!

[screenshot of the old Newegg review]

Reminds me of my last Radeon, a 9600 XT AIW. Loved that card when I had it.

  • Thumb Up 3

Workstation - Dell XPS 8940 - desktop creative powerhouse

Mobile Workstation - Dell inspiron 5406 2 in 1 - mobile creative beast

Wifey's Notebook - Dell inspiron 3169 - Little gem for our businesses


2 hours ago, kojack said:

Reminds me of my last Radeon, a 9600 XT AIW. Loved that card when I had it.


My first (AMD) Radeon graphics cards were the X800 XT and X850 XT. Dead. My 7970 is still alive and my 6870 is dead. That's my history with AMD. For Nvidia… Gigabyte 1070 and GTX 570, dead. The GTX 570 was a very well binned Gainward card, and that brand does best in reliability tests. It still couldn't keep up with my bench hobby. I have loads of cards on my shelves, but I avoid AMD, and not only because of what it is. Hear me, Scott Herkelman (HMSCOTT): you can't win me back. You and Azor destroyed your brand. Nvidia or no graphics cards; can't have it both ways. I can live with the latter, but no new Radeon cards for me. Only cheapo old sub-$50 AMD cards.

  • Thumb Up 2
  • Like 1

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOTTeam PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


It does kind of suck having to stay on top of the upgrade wagon if you always want that peak performance. Sometimes it's nice to just use something for what it is, to get the job done. I started getting into shooting rifles as a hobby, and I built a rifle with a nice $2K scope on top; it's nice knowing it won't be obsolete in 4-5 years or even 10+ years haha.
 

We aren't going out and buying new cars every single year. We typically use what we have. And even now, having the latest and greatest still doesn't always pan out, especially seeing what the 7900XT/XTX does against 4080s/4090s in AMD-optimized games like Starfield. Even if we do have the best hardware, we aren't always gonna get the best driver and optimization service from our brand of choice. I know the AMD guys are enjoying their tech right now, considering how a 6800XT competes at stock RTX 3080-level performance in this game lol.

 

PS: My first ATI GPU was an ATI 9600XT, and then an ATI 9700 Pro soon after; I was too young to afford the Nvidia 6800 Ultra or ATI 9800XT. Back then we could do a lot with BIOS flashing though. I remember flashing an X850 Pro to an X850XT with a floppy disc.😂

  • Thumb Up 6
  • Like 2
  • Haha 1
  • Bump 2

13900KF


My oldest memory of computers, and the first time I touched one, was in kindergarten when my big buddy showed me into a room full of computers. I can't remember if they were laptops (probably not), but it was an LCD with two colors, orange and blue, and we drew lines, a triangle, etc. Holy smokes, have times changed in 33 years.

 

As for more modern stuff, a kid in high school wanted me to check out Counter-Strike. Also, I was always arguing (discussing) how Shenmue would be better than Half-Life 2 graphically. Haha, looking back, aw man, that's funny; Half-Life 2 ended up being the greatest game of all time.

 

As for jokebooks, my first was ironically an HP laptop with a 7800 GTX; it powered through Half-Life 2 and left me impressed. That was back in 2005-2006, and sadly not much has changed since 2005. I still think Half-Life 2 has passable graphics in 2023. The textures are not 4K, but from a little bit back they look crisp; the polygon count is low, but the overall look isn't bad.

 

 

And back on topic: most people don't have a 4090 or 3090; they are running a 1650 and gaming at 1080p.

 


 

  • Thumb Up 4
  • Bump 1

ZEUS-COMING SOON

            Omen 16 2021

            Zenbook 14 oled

            Vivobook 15x oled

 


45 minutes ago, tps3443 said:

It does kind of suck having to stay on top of the upgrade wagon if you always want that peak performance. Sometimes it's nice to just use something for what it is, to get the job done. I started getting into shooting rifles as a hobby, and I built a rifle with a nice $2K scope on top; it's nice knowing it won't be obsolete in 4-5 years or even 10+ years haha.
 

We aren't going out and buying new cars every single year. We typically use what we have. And even now, having the latest and greatest still doesn't always pan out, especially seeing what the 7900XT/XTX does against 4080s/4090s in AMD-optimized games like Starfield. Even if we do have the best hardware, we aren't always gonna get the best driver and optimization service from our brand of choice. I know the AMD guys are enjoying their tech right now, considering how a 6800XT competes at stock RTX 3080-level performance in this game lol.

 

PS: My first ATI GPU was an ATI 9600XT, and then an ATI 9700 Pro soon after; I was too young to afford the Nvidia 6800 Ultra or ATI 9800XT. Back then we could do a lot with BIOS flashing though. I remember flashing an X850 Pro to an X850XT with a floppy disc.😂

 

Well, there is a reason the bulk of GPU sales are mid- to lower-tier cards, as most buyers tend to shop there. We lament the 4090 and its pricing, but this cycle (and in most previous cycles) top-end cards were reserved for and purchased by a much smaller group of enthusiasts, not Joe Consumer.

 

Even then, Joe Consumer usually buys a mid-tier card with mid-tier hardware at best, and will then use that system for at least 3-5 years before doing anything to it, either through finally wanting an upgrade or through games/software running so poorly they decide to upgrade to....yep....whatever is now that cycle's mid-tier hardware.

 

If you perpetually want to have the best performance at all times, that comes at a cost. It's an expensive hobby, but it is what it is.

 

If cars experienced the performance/efficiency gains CPUs and GPUs have, we would probably be getting 500 mi/gal at this point and going 0-65 in less than a second.

 

I've been saying for quite some time that the 7900xtx is the sleeper bang:buck card, especially if you go for a $999-or-less model. Is it as good as the 4090? No. Not even really close when all things are equal. If you see any game where the 7900xtx equals or surpasses it in performance, that is a reflection of optimized AMD code and/or poorly written Nvidia code, usually from sloppy porting of AMD console code: due to the controlled, limited nature of consoles, developers HAVE to write leaner, more efficient code that targets AMD optimizations to extract maximum performance, and that can carry over to AMD hardware.

 

When devs then port their code to PCs and don't take the time to take advantage of Nvidia's optimizations, coupled with focusing on AMD optimizations (as they should for the console end), you end up with charts showing the 7900xtx outpacing the 4080 by 15-20% in Starfield and matching or even beating the 4090, especially at lower resolutions.... that tells you everything you need to know, since we know that hardware-wise the 4090 is flat out a better card in every aspect (except price).

 

---

 

As for first GPUs......

 

You young whipper snappers. 🙂

 

My first stand-alone GPU was a Tandy EGA ISA card for my Tandy 4000 386-16MHz back in 1990 (this was also my first PC, which I bought while working at Radio Shack for 3.5 years while in college). I upgraded to a VGA card and a Sound Blaster a little while later to fully soak in Wing Commander at max settings.

My first 3D'esque video card was a Rendition Verite 1000 card so I could run VQuake.

My first all-purpose real GPU was a 3DFX Voodoo2 so I could run GLQuake (I then gave my Rendition card to my brother)

My first ATI GPU was a Radeon 7000 series circa 2000, which I modified and augmented the heatsink on, externally cooled, and OC'd to the max to eke out every fps I could from Deus Ex (I again gave my Voodoo2 to my brother)

My first Nvidia was a BFG 6800 Ultra circa late 2004 for my Power Mac G5 Duo to play WoW on my brand new 30" Apple Display

 

From 1996->1999 my brother and I were hardcore into Quake playing online and going to cons for matches/competitions so I took our hardware pretty seriously to have it run as quick as possible while keeping details as low/sparse as possible to see everything clearly. We both ran Sony CRTs.

 

 

 

 

 

 

  • Thumb Up 2
  • Like 2

Electrosoft Alpha: SP109 14900KS 59/46/50 | Asrock Z790i Lightning | MSI Ventus 3x 4070 Super | AC LF II 420 | TG 2x24GB 8200 | Samsung 990 Pro 2TB | EVGA 1600w P2 | Phanteks Enthoo Pro | Alienware AW3225QF 32" OLED

Electrosoft Beta: Eurocom X15 Raptor | i9-12900k | Nvidia RTX 3070ti | HyperX 3200 CL20 32GB | Samsung 990 2TB | 15.6" 144hz | Wifi 6E
Heath: i9-12900k | EVGA CLC 280 | Asus Strix Z690 D4 | Asus Strix 3080 | 32GB DDR4 2x16GB B-Die 4000 | WD Black SN850 512GB | EVGA DG-77 | Samsung G7 32" 144hz

My for sale items on eBay.

 

 

 


 


1 hour ago, electrosoft said:

Well, there is a reason the bulk of GPU sales are mid- to lower-tier cards, as most buyers tend to shop there. We lament the 4090 and its pricing, but this cycle (and in most previous cycles) top-end cards were reserved for and purchased by a much smaller group of enthusiasts, not Joe Consumer.

 

With Ada, we lament NVidia's pricing and product offering across the board.

 

1 hour ago, electrosoft said:

If you perpetually want to have the best performance at all times, that comes at a cost. It's an expensive hobby, but it is what it is.

 

It is an expensive hobby because NVidia are ripping us off. If AMD died and Intel's GPU effort failed, then Jensen would probably jack up the prices even more. We can pretend there is nothing that can be done about it; however, that is patently false.

 

1 hour ago, electrosoft said:

I've been saying for quite some time that the 7900xtx is the sleeper bang:buck card, especially if you go for a $999-or-less model. Is it as good as the 4090? No. Not even really close when all things are equal. If you see any game where the 7900xtx equals or surpasses it in performance, that is a reflection of optimized AMD code and/or poorly written Nvidia code, usually from sloppy porting of AMD console code: due to the controlled, limited nature of consoles, developers HAVE to write leaner, more efficient code that targets AMD optimizations to extract maximum performance, and that can carry over to AMD hardware.

 

Err, not so fast, let's look at the raw specs:

 

https://www.techpowerup.com/gpu-specs/geforce-rtx-4090.c3889

https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xtx.c3941

 

We see that the cards trade blows when it comes to texture and pixel rates, and NVidia is indeed faster in lower precision compute, while AMD does better in high precision (less important in gaming).

 

The manufacturing process is the same, memory bandwidth numbers are very similar, so are caches.

We don't have any theoretical performance specs on the RT cores (I am not sure comparing core counts across architectures is meaningful), so perhaps NVidia holds the theoretical upper hand there. Maybe.

 

That aside, it follows that the difference in realised performance basically comes down to software. If things are optimised for AMD, it evidently has the upper hand, much to NVidia fans' self-defeating dismay. If a game leverages NVidia software features (DLSS etc) the tables turn. You typically see games leaning one way or the other - my guess is this is based on which manufacturer has a deal with the given dev house. I would assume that most PC titles and benchmarks are optimised for NVidia simply because it's the more prevalent platform, hence you see the dominance in benchmarks and PC titles.
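
To put rough numbers on that, here is a quick Python sketch; the figures are the headline numbers from the two TechPowerUp pages linked above as best I recall them, so treat them as approximate and check the live pages:

```python
# Ratio of headline spec numbers, RTX 4090 vs RX 7900 XTX.
# Figures approximate TechPowerUp's listings; verify before quoting.

specs = {
    # (FP32 TFLOPS, FP64 TFLOPS, GTexel/s, GPixel/s, bandwidth GB/s)
    "RTX 4090":    (82.6, 1.3, 1290.0, 443.5, 1008.0),
    "RX 7900 XTX": (61.4, 1.9,  959.4, 479.8,  960.0),
}
labels = ("FP32 TFLOPS", "FP64 TFLOPS", "texture GTexel/s",
          "pixel GPixel/s", "bandwidth GB/s")

for label, nv, amd in zip(labels, specs["RTX 4090"], specs["RX 7900 XTX"]):
    print(f"{label:17s} 4090 = {nv:7.1f}   XTX = {amd:7.1f}   ratio = {nv / amd:.2f}")
```

The output shows NVidia ahead in FP32 compute and texture rate (ratio around 1.35), AMD slightly ahead in pixel rate and FP64, and bandwidth essentially tied, which is the trading-blows picture described above.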

 

However, since NVidia gave up/lost the console game because it opted to go with the juicy business of ripping PC users off, it has to pay the price: some/many games are optimised primarily for AMD.

 

In summary, the 7900 XTX specs are great, basically on par with the 4090 (apart from the RT area, potentially).

In fact, everything else being equal, at $999 it would have been a no-brainer vs the $1600+ 4090. Sadly, everything else is not equal: PC software probably tends to be more optimised for NVidia, meaning that AMD is fighting an uphill battle.

Without getting software studios onboard, which might be difficult given how deep NVidia's pockets are at the moment, they would have to come out with a product sporting 20%+ faster hardware just to match NVidia's performance in optimised titles.... basically it looks like NVidia first rips off PC users, then bribes devs to produce NVidia-optimised titles to keep the racket going lol. Wake up guys.

 

  

1 hour ago, electrosoft said:

My first stand-alone GPU was a Tandy EGA ISA card for my Tandy 4000 386-16MHz back in 1990 (this was also my first PC, which I bought while working at Radio Shack for 3.5 years while in college). I upgraded to a VGA card and a Sound Blaster a little while later to fully soak in Wing Commander at max settings.

My first 3D'esque video card was a Rendition Verite 1000 card so I could run VQuake.

My first all-purpose real GPU was a 3DFX Voodoo2 so I could run GLQuake (I then gave my Rendition card to my brother)

My first ATI GPU was a Radeon 7000 series circa 2000, which I modified and augmented the heatsink on, externally cooled, and OC'd to the max to eke out every fps I could from Deus Ex (I again gave my Voodoo2 to my brother)

My first Nvidia was a BFG 6800 Ultra circa late 2004 for my Power Mac G5 Duo to play WoW on my brand new 30" Apple Display

 

From 1996->1999 my brother and I were hardcore into Quake playing online and going to cons for matches/competitions so I took our hardware pretty seriously to have it run as quick as possible while keeping details as low/sparse as possible to see everything clearly. We both ran Sony CRTs.

 

That brought back a lot of memories. I grabbed the first Voodoo though, if I remember correctly (for the sake of our Canadian colleagues, let's not forget about Gravis Ultrasound lol).

 

  • Thumb Up 2
  • Like 1

"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


4 minutes ago, Etern4l said:

 

With Ada, we lament NVidia's pricing and product offering across the board.

 

 

It is an expensive hobby because NVidia are ripping us off. If AMD died and Intel's GPU effort failed, then Jensen would probably jack up the prices even more. We can pretend there is nothing that can be done about it; however, that is patently false.

 

 

Err, not so fast, let's look at the raw specs:

 

https://www.techpowerup.com/gpu-specs/geforce-rtx-4090.c3889

https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xtx.c3941

 

We see that the cards trade blows when it comes to texture and pixel rates, and NVidia is indeed faster in lower precision compute, while AMD does better in high precision (less important in gaming).

 

The manufacturing process is the same, memory bandwidth numbers are very similar, so are caches.

We don't have any theoretical performance specs on the RT cores (I am not sure comparing core counts across architectures is meaningful), so perhaps NVidia holds the theoretical upper hand there. Maybe.

 

That aside, it follows that the difference in realised performance basically comes down to software. If things are optimised for AMD, it evidently has the upper hand, much to NVidia fans' self-defeating dismay. If a game leverages NVidia software features (DLSS etc) the tables turn, you typically see games leaning one way or the other - my guess is this is based on which manufacturer has a deal with the given dev house. My guess is that most PC titles and benchmarks are optimised for NVidia simply because it's the more prevalent platform, hence you see the dominance in benchmarks and PC titles.

 

However, since NVidia gave up/lost the console game because it opted to go with the juicy business of ripping PC users off, it has to pay the price: some/many games are optimised primarily for AMD.

 

In summary, the 7900 XTX specs are great, basically on par with the 4090 (apart from the RT area, potentially).

In fact, everything else being equal, at $999 it would have been a no-brainer vs the $1600+ 4090. Sadly, everything else is not equal: PC software probably tends to be more optimised for NVidia, meaning that AMD is fighting an uphill battle.

Without getting software studios onboard, which might be difficult given how deep NVidia's pockets are at the moment, they would have to come out with a product sporting 20%+ faster hardware just to match NVidia's performance in optimised titles.... basically it looks like NVidia first rips off PC users, then bribes devs to produce NVidia-optimised titles to keep the racket going lol. Wake up guys.

To be clear, AMD is their own worst enemy. A lot, maybe all, of their shortcomings are self-inflicted damage.

 

AMD continually hurts itself with lousy drivers. They have never been good at it... for decades now. Crappy drivers keep me from enjoying the 6900 XT to the extent I would be able to with good drivers. Just simply using it for work, where performance really doesn't matter, it delivers an inferior experience due to driver crap. Deja vu for me. Their drivers sucked when I gave them their last chance (2012) and they still do. They need to get a clue.

 

Had they been smart, they would have used GDDR6X. Opting for the slower/cheaper GDDR6 hurt them in the GPU war. Whether it was a mistake that reflects poor judgment or they actually wanted to help keep the price down by using cheaper memory matters not. They slit their own throat opting for cheaper.

  • Thumb Up 2
  • Like 1

WRAITH // Z790 Apex | 14900KS | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO (T-Rex)

BANSHEE // X870E Carbon | 9950X | 4090 Gaming OC+Alphacool Block | 32GB DDR5-8200 | RM1200x SHIFT | XT45 1080 Nova || Antec C8 (Rhinoceros)

SPECTRE // Z790i Edge | 13900KS | 3090 Ti FTW3 | 48GB DDR5-8200 | RM1000e | EK Nucleus CR360 Direct Die || Prime A21 (Rattlesnake)

HALF-BREED // Precision 7720 | BGA CPU Filth | 32GB DDR4 | Quadro P5000 | 4K Display | Nothing to Write Home About (Turdbook)

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


3 minutes ago, Mr. Fox said:

To be clear, AMD is their own worst enemy. A lot, maybe all, of their shortcomings are self-inflicted damage.

 

AMD continually hurts itself with lousy drivers. They have never been good at it... for decades now. Crappy drivers keep me from enjoying the 6900 XT to the extent I would be able to with good drivers. Just simply using it for work, where performance really doesn't matter, it delivers an inferior experience due to driver crap. Deja vu for me. Their drivers sucked when I gave them their last chance (2012) and they still do. They need to get a clue.

 

Had they been smart, they would have used GDDR6X. Opting for the slower/cheaper GDDR6 hurt them in the GPU war. Whether it was a mistake that reflects poor judgment or they actually wanted to help keep the price down by using cheaper memory matters not. They slit their own throat opting for cheaper.

 

I'm seeing very similar memory performance specs on the 7900 XTX, yes - a bit slower but we don't know the whole story so we shouldn't rush to potentially unduly harsh judgment. For example, GDDR6X was jointly developed by Micron and NVidia, do you imagine they would just let AMD use it?

 

Also, clearly AMD can develop rock-solid drivers, since this stuff runs on consoles and Macs, so I wonder if what you perceive as "driver issues" is just incompatibility between NVidia-optimised titles and AMD drivers. 

  • Thumb Up 1

"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


13 minutes ago, Etern4l said:

 

Also, clearly AMD can develop rock-solid drivers, since this stuff runs on consoles and Macs, so I wonder if what you perceive as "driver issues" is just incompatibility between NVidia-optimised titles and AMD drivers. 

You're forgetting I rarely game, and maybe you didn't notice what I said about the 6900 XT (and I have previously posted about it). Their drivers suck. The way they work (or don't work) sucks, and the GUI to manage them sucks. Blurry text, disappearing text while typing, and DWM desktop rendering graphical glitches just trying to do my job. Nice, huh?

 

Maybe if they burned as many calories on a $900 GPU as they do on a $500 console their PC drivers wouldn't suck.

 

We might be giving them too much credit though. We don't know that they actually produce the drivers for consoles. It might be something Micro$lop and Sony have taken ownership of to make sure it gets done right. I have not ever seen an example of good AMD drivers or software before.

  • Thumb Up 3
  • Like 1

WRAITH // Z790 Apex | 14900KS | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO (T-Rex)

BANSHEE // X870E Carbon | 9950X | 4090 Gaming OC+Alphacool Block | 32GB DDR5-8200 | RM1200x SHIFT | XT45 1080 Nova || Antec C8 (Rhinoceros)

SPECTRE // Z790i Edge | 13900KS | 3090 Ti FTW3 | 48GB DDR5-8200 | RM1000e | EK Nucleus CR360 Direct Die || Prime A21 (Rattlesnake)

HALF-BREED // Precision 7720 | BGA CPU Filth | 32GB DDR4 | Quadro P5000 | 4K Display | Nothing to Write Home About (Turdbook)

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


On 8/28/2023 at 8:51 PM, Raiderman said:

It takes a degree in trigonometry to know whether the human eye can distinguish between 1080p and 4K. Viewing angle, pixel pitch, distance from the screen... blah, blah. I myself believe it's marketing BS. I cannot tell the difference between 1080, 1440, and 4K. My eyes can, however, see the difference in low-refresh-rate screens. 60Hz and lower flickers like a fluorescent bulb.

 

For me it really depends on screen size and viewing distance. But at a normal computer-desk viewing distance on a 27" screen, I can easily spot the difference between a 1080p, 1440p, and 4K screen. 1440p is a tad more difficult, but individual pixels are still visible; 1080p is disgusting on a screen that size; and 4K is finally at a high enough DPI that individual pixels are no longer distinguishable.
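
The geometry behind that is easy to check. A minimal Python sketch, assuming the common ~60 pixels-per-degree (1 arcminute per pixel) figure for 20/20 acuity and a 27" diagonal (both assumptions, not measurements):

```python
# Pixel density for a 27" panel at each resolution, and the rough viewing
# distance beyond which a 20/20 eye (~1 arcminute per pixel) stops
# resolving individual pixels.
import math

def ppi(w_px: int, h_px: int, diag_in: float) -> float:
    return math.hypot(w_px, h_px) / diag_in

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    density = ppi(w, h, 27.0)
    # A pixel vanishes once it subtends < 1/60 deg: distance = pitch / tan(1/60 deg).
    vanish_in = (1.0 / density) / math.tan(math.radians(1.0 / 60.0))
    print(f'{name}: {density:5.1f} PPI -> pixels blend beyond ~{vanish_in:.0f}" viewing distance')
```

That works out to roughly 82 PPI for 1080p (pixels visible out to ~42"), 109 PPI for 1440p (~32"), and 163 PPI for 4K (~21"), so at a typical 24-30" desk distance, 4K is the first of the three where individual pixels disappear.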

 

In game, it's no contest: a 4K screen's clarity and detail really stand out. Textures look incredible and games have never looked so crisp and beautiful.

 

I've been gaming at 4K 144Hz since 2019, and the road only points forward. Although I do wonder if we're going to reach a point in the near future where it won't matter anymore. Like, the next horizon is 8K screens, but will that matter if we can't see individual pixels anymore? My gut tells me we're close to the end on resolution, or the need for a new monitor technology to make a real difference in visual fidelity and the drive towards lifelike graphics.

  • Thumb Up 2
  • Like 1

The Beast Asus Z790 APEX | Intel i9 13900K | ASUS RTX 4090 Strix OC | 64gb DDR5 7466 CL34 Dual Rank A-Dies | Samsung 990 Pro 2TB | Innocn 4K 160Hz Mini LED HDR1000 | LG 27GN950-B 4K 160Hz | Corsair 170i Elite LCD 420mm AIO | Corsair 7000D | EVGA 1600w T2

Little Beast EVGA Z690 DARK | Intel i9 13900K | Nvidia RTX 4090 FE | 32gb DDR5 SK Hynix DDR5 8000 CL36 A-Dies | Samsung 980 Pro 2TB | LG OLED C1 4K 120Hz G-Sync/FreeSync | Alienware AW2721D 1440p 240Hz G-Sync Ultimate | Corsair 115i Elite 280mm AIO | Lian Li 011 Dynamic | EVGA 1000w P6

 

 


18 minutes ago, Talon said:

 

For me it really depends on screen size and viewing distance. But at a normal computer-desk viewing distance on a 27" screen, I can easily spot the difference between a 1080p, 1440p, and 4K screen. 1440p is a tad more difficult, but individual pixels are still visible; 1080p is disgusting on a screen that size; and 4K is finally at a high enough DPI that individual pixels are no longer distinguishable.

 

In game, it's no contest: a 4K screen's clarity and detail really stand out. Textures look incredible and games have never looked so crisp and beautiful.

 

I've been gaming at 4K 144Hz since 2019, and the road only points forward. Although I do wonder if we're going to reach a point in the near future where it won't matter anymore. Like, the next horizon is 8K screens, but will that matter if we can't see individual pixels anymore? My gut tells me we're close to the end on resolution, or the need for a new monitor technology to make a real difference in visual fidelity and the drive towards lifelike graphics.

I haven't had my 4K monitor long enough to get used to it, and I haven't started liking it on the desktop. I hate using more than 100% scaling and text is smaller than I would like it to be on even a 27-inch screen. I think with more time the smaller text won't annoy me as much. But, yeah... gaming is like... wow... major improvement. Way better and more noticeable than I expected it would be. (If I am honest, my expectations were very low and that might be jading my impressions.) It is like the video rendered on my display has a more chromatic and less cartoonish quality than before. It was kind of trippy at first, but it didn't take long for me to get used to how much better games look. 

  • Thumb Up 3

WRAITH // Z790 Apex | 14900KS | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO (T-Rex)

BANSHEE // X870E Carbon | 9950X | 4090 Gaming OC+Alphacool Block | 32GB DDR5-8200 | RM1200x SHIFT | XT45 1080 Nova || Antec C8 (Rhinoceros)

SPECTRE // Z790i Edge | 13900KS | 3090 Ti FTW3 | 48GB DDR5-8200 | RM1000e | EK Nucleus CR360 Direct Die || Prime A21 (Rattlesnake)

HALF-BREED // Precision 7720 | BGA CPU Filth | 32GB DDR4 | Quadro P5000 | 4K Display | Nothing to Write Home About (Turdbook)

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


31 minutes ago, Mr. Fox said:

You're forgetting I rarely game, and maybe you didn't notice what I said about the 6900 XT (and I have previously posted about it). Their drivers suck. The way they work (or don't work) sucks, and the GUI to manage them sucks. Blurry text, disappearing text while typing, and DWM desktop rendering graphical glitches just trying to do my job. Nice, huh?

 

Maybe if they burned as many calories on a $900 GPU as they do on a $500 console their PC drivers wouldn't suck.

 

We might be giving them too much credit though. We don't know that they actually produce the drivers for consoles. It might be something Micro$lop and Sony have taken ownership of to make sure it gets done right. I have not ever seen an example of good AMD drivers or software before.

 

Unfortunately I have little ammo at hand to try and counter that, even though AMD clearly needs some reinforcements here, lol. Bro @Raiderman to the rescue, perhaps? :)

 

Good news though is that we know that the underlying HW and its specs are solid, and software issues are fixable. To be fair, I recall quite a few NVidia driver complaints - and actually I think I have a live one in Windows 11 concerning multi-GPU setup, which is not to say that AMD would have done better in this scenario.

  • Thumb Up 2

"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


9 hours ago, Etern4l said:

 

With Ada, we lament NVidia's pricing and product offering across the board.

 

 

I'm focusing on top-tier pricing for those who want the best, referencing @tps3443's post, but this is absolutely true.

 

I still stand by my assessment that the 4070 and 4090 are the only two viable cards price-wise in the lineup this time around, but if the 4060 Ti 16GB somehow falls to $399.99 it isn't a bad pick.

 

9 hours ago, Etern4l said:

It is an expensive hobby because NVidia are ripping us off. If AMD died and Intel's GPU effort failed, then Jensen would probably jack up the prices even more. We can pretend there is nothing that can be done about it; however, that is patently false.

 

It is an expensive hobby because the law of diminishing returns is grossly in play at the top end, same as if you want a Ferrari over a Camry or first class over economy. To be the best you pay the best, sometimes crazily so.

 

Nvidia isn't ripping anybody off. Nobody is forcing you to purchase their GPUs. This is a capitalistic world. You look at the product and determine if you want it or not at the price offered based upon your own criteria. All Nvidia can do is offer their goods at their selected price points and let the consumer decide.

 

I don't subscribe to the nonsense that consumers don't know what they're doing. They're making educated purchasing decisions based upon their own purchasing power, preferences and buying criteria. Their preferences and criteria are allowed to be different from yours. It doesn't make yours right or theirs wrong.

 

I never said or intimated there is nothing that can be done. As always, you vote with your purchasing power. Enough have spoken to let Nvidia know their pricing is still acceptable. When it is not, and/or they feel enough competition/threat, they will adjust accordingly, as we just saw the 4060 Ti drop to $430 with AMD's announcement of the 7800 XT, along with lackluster sales. Hopefully we see more of this.

 

9 hours ago, Etern4l said:

Err, not so fast, let's look at the raw specs:

 

https://www.techpowerup.com/gpu-specs/geforce-rtx-4090.c3889

https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xtx.c3941

 

We see that the cards trade blows when it comes to texture and pixel rates, and NVidia is indeed faster in lower precision compute, while AMD does better in high precision (less important in gaming).

 

The manufacturing process is the same, memory bandwidth numbers are very similar, so are caches.

We don't have any theoretical performance specs on the RT cores (I am not sure comparing core counts across architectures is meaningful), so perhaps NVidia holds the theoretical upper hand there. Maybe.

 

That aside, it follows that the difference in realised performance basically comes down to software. If things are optimised for AMD, it evidently has the upper hand, much to NVidia fans' self-defeating dismay. If a game leverages NVidia software features (DLSS etc) the tables turn. You typically see games leaning one way or the other - my guess is this is based on which manufacturer has a deal with the given dev house. I would assume that most PC titles and benchmarks are optimised for NVidia simply because it's the more prevalent platform, hence you see the dominance in benchmarks and PC titles.

 

Just for clarity before proceeding: architecturally speaking, you are claiming the 7900xtx ~= 4090 in raw rasterization, and the 4090 overall wins because of software and optimizations?

 

 

9 hours ago, Etern4l said:

That brought back a lot of memories. I grabbed the first Voodoo though, if I remember correctly (for the sake of our Canadian colleagues, let's not forget about Gravis Ultrasound lol).

 

 

I remember skipping the Voodoo because the Verite was still a relatively new purchase for me, so I needed a cooling-off period before pulling the trigger on another upgrade! 😄 Games like Fallout 1 and 2 and Arcanum, which I played back then along with Quake, didn't need 3D acceleration.

 

WAY back in the day, I did have a Gravis joystick! 😄

 

 

 

 

 

 

 

 

 

  • Thumb Up 4

Electrosoft Alpha: SP109 14900KS 59/46/50 | Asrock Z790i Lightning | MSI Ventus 3x 4070 Super | AC LF II 420 | TG 2x24GB 8200 | Samsung 990 Pro 2TB | EVGA 1600w P2 | Phanteks Enthoo Pro | Alienware AW3225QF 32" OLED

Electrosoft Beta: Eurocom X15 Raptor | i9-12900k | Nvidia RTX 3070ti | HyperX 3200 CL20 32GB | Samsung 990 2TB | 15.6" 144hz | Wifi 6E
Heath: i9-12900k | EVGA CLC 280 | Asus Strix Z690 D4 | Asus Strix 3080 | 32GB DDR4 2x16GB B-Die 4000 | WD Black SN850 512GB | EVGA DG-77 | Samsung G7 32" 144hz

My for sale items on eBay.

 

 

 


 


2 minutes ago, Etern4l said:

 

Unfortunately I have little ammo at hand to try and counter that, even though AMD clearly needs some reinforcements here, lol. Bro @Raiderman to the rescue, perhaps? 🙂

 

Good news though is that we know that the underlying HW and its specs are solid, and software issues are fixable. To be fair, I recall quite a few NVidia driver complaints - and actually I think I have a live one in Windows 11 concerning multi-GPU setup, which is not to say that AMD would have done better in this scenario.

I don't really have any complaints about the 6900 XT at a hardware level. It is a lot more powerful than the 3060 Ti it replaced (which I was very content with) and does an excellent job at some things. Software and driver issues are, indeed, fixable. But sometimes they never get fixed. Firmware, drivers and software made me hate my X570 setup. Driver bugs aside, in terms of software I felt Ryzen Master was a trashy and bloated-feeling GUI, and I don't care for Adrenalin's GUI for the same reasons. The layout is chaotic, somewhat illogical and overall unintuitive, and it feels bloated.

  • Thumb Up 1
  • Like 1
  • Sad 1

WRAITH // Z790 Apex | 14900KS | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO (T-Rex)

BANSHEE // X870E Carbon | 9950X | 4090 Gaming OC+Alphacool Block | 32GB DDR5-8200 | RM1200x SHIFT | XT45 1080 Nova || Antec C8 (Rhinoceros)

SPECTRE // Z790i Edge | 13900KS | 3090 Ti FTW3 | 48GB DDR5-8200 | RM1000e | EK Nucleus CR360 Direct Die || Prime A21 (Rattlesnake)

HALF-BREED // Precision 7720 | BGA CPU Filth | 32GB DDR4 | Quadro P5000 | 4K Display | Nothing to Write Home About (Turdbook)

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


55 minutes ago, Mr. Fox said:

I don't really have any complaints about the 6900 XT at a hardware level. It is a lot more powerful than the 3060 Ti it replaced (which I was very content with) and does an excellent job at some things. Software and driver issues are, indeed, fixable. But sometimes they never get fixed. Firmware, drivers and software made me hate my X570 setup. Driver bugs aside, in terms of software I felt Ryzen Master was a trashy and bloated-feeling GUI, and I don't care for Adrenalin's GUI for the same reasons. The layout is chaotic, somewhat illogical and overall unintuitive, and it feels bloated.

I was much more in favor of AMD drivers before they updated to the GUI we all contend with now. It loaded quickly, and changes made for a specific title took effect in the game in real time, whereas the Nvidia Control Panel used to take almost 4 minutes to open at worst, and changes I made to a game didn't take effect until a restart of the title.

 

Those two points were a huge pain, as at the time I was heavily engaged in implementing mods for Mass Effect 2 and 3 (and, to a lesser degree, Elder Scrolls modding, like many others).

 

I think the GPU settings do take effect in real time these days IIRC, but NCP is still sporadic. There would be times, even on the 1080K platform, where it would take 30-90 seconds just to open, and other times instantly. With the 5800X3D it's about 3-5 seconds on average, which is acceptable.

 

 

  • Thumb Up 3
  • Like 1
  • Bump 1

52 minutes ago, Reciever said:

I was much more in favor of AMD drivers before they updated to the GUI we all contend with now. It loaded quickly, and changes made for a specific title took effect in the game in real time, whereas the Nvidia Control Panel used to take almost 4 minutes to open at worst, and changes I made to a game didn't take effect until a restart of the title.

 

Those two points were a huge pain, as at the time I was heavily engaged in implementing mods for Mass Effect 2 and 3 (and, to a lesser degree, Elder Scrolls modding, like many others).

 

I think the GPU settings do take effect in real time these days IIRC, but NCP is still sporadic. There would be times, even on the 1080K platform, where it would take 30-90 seconds just to open, and other times instantly. With the 5800X3D it's about 3-5 seconds on average, which is acceptable.

To be fair, I haven't used a "stock" NVIDIA driver in years. Before NVCleanstall I manually did my own mods, or used j95 mods, because I didn't want the garbage "features" mucking up my system, like Ansel, ShadowPlay, GeFarts Experience, automatic driver updates and whatnot. Part of what I loathe about the Adrenalin GUI is that it has a similar payload of filth and I don't know how to eradicate it. With the GeFarts software it is not all combined into a single interface, so eliminating the trash might be easier.

  • Thumb Up 2
  • Like 1
  • Bump 2

WRAITH // Z790 Apex | 14900KS | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO (T-Rex)

BANSHEE // X870E Carbon | 9950X | 4090 Gaming OC+Alphacool Block | 32GB DDR5-8200 | RM1200x SHIFT | XT45 1080 Nova || Antec C8 (Rhinoceros)

SPECTRE // Z790i Edge | 13900KS | 3090 Ti FTW3 | 48GB DDR5-8200 | RM1000e | EK Nucleus CR360 Direct Die || Prime A21 (Rattlesnake)

HALF-BREED // Precision 7720 | BGA CPU Filth | 32GB DDR4 | Quadro P5000 | 4K Display | Nothing to Write Home About (Turdbook)

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


Level 12 in Starfield so far! While there are many aspects of the game that I didn't like at first, the game has my attention and it has really grown on me. I play it and enjoy it for many hours at a time; I haven't had a game to dump endless time into like this. The game is solid! No complaints with performance either. I've bumped my DLSS slider to 80% of my 4K resolution (looks far better than 67% DLSS), and I still maintain over 60FPS at all times, with shadow quality on high instead of ultra (a worthy trade). I feel like I'm playing a new version of Fallout 4 with slightly better graphics and engine. Nothing groundbreaking in this game. I honestly would have preferred if Bethesda had kept it in dev for like 2-3 more years and given us the whole experience. But I guess we gotta wait for ES6 for all that goodness. Either way it's a good game, and much better than Cyberpunk 2077. I did beat Cyberpunk 2077… but after the story I just couldn't get into it. And they had that bubble you were always in, and no action happened outside of this invisible bubble. You could not snipe people at all, and no bombing cars from further away. It was so silly.

  • Thumb Up 3
  • Bump 2

13900KF


5 minutes ago, tps3443 said:

The game has my attention and it has really grown on me.

I figured this; it was the most hyped game of 2023 and people need to lower their expectations. In all honesty it is probably a great game from what I've seen in the videos. I thought the graphics were terrible until I loaded a 4K video in full screen; not bad for the size of the game, and as far as the optimisations go I'm glad 4GB is enough to play this game.

  • Thumb Up 1

ZEUS-COMING SOON

            Omen 16 2021

            Zenbook 14 oled

            Vivobook 15x oled

 

