NotebookTalk

*Official Benchmark Thread* - Post it here or it didn't happen :D


Mr. Fox


3 minutes ago, Mr. Fox said:

Doesn't that show in HWiNFO64? Down in the RTSS section? I think that was added as a new feature.


No, I mean for best results with V-Sync/G-Sync/frame rate caps. It's just super easy to manipulate the number, or make it look better. I'm lost on setup.

So far I'm running G-Sync with a capped frame rate of 95 FPS, and it seems to make the lows look really good. Or do we go with 1% lows at an uncapped frame rate? I guess this is what I'm confused about.


13900KF


1 minute ago, tps3443 said:


No, I mean for best results with V-Sync/G-Sync/frame rate caps. It's just super easy to manipulate the number, or make it look better. I'm lost on setup.

So far I'm running G-Sync with a capped frame rate of 95 FPS, and it seems to make the lows look really good. Or do we go with 1% lows at an uncapped frame rate? I guess this is what I'm confused about.

Oh, OK. Sorry. I didn't understand that. I don't think I can be much help in that regard then. I don't use V- or G-Sync or framerate limiters.


Wraith // Z790 Apex | 14900KF | 4090 Suprim X+Bykski Block | 48GB DDR5-8600 | Toughpower GF3 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO
Banshee // Z790 Apex Encore | 13900KS | 4090 Gaming OC+Alphacool Block | 48GB DDR5-8600 | RM1200x SHIFT | XT45 1080 Nova || Dark Base Pro 901
Munchkin // Z790i Edge | 14900KF | Arc A770 Phantom Gaming OC | 48GB DDR5-8200 | GameMax 850W | EK Nucleus CR360 Dark || Prime AP201 
Half-Breed // Dell Precision 7720 | BGA CPU Filth+MXM Quadro P5000 | Sub-$500 Grade A Refurb || Nothing to Write Home About  

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


Screenshot-13.png

Screenshot-14.png

 

A mix of Skyrim and Mass Effect; graphically it's like Fallout (terrain) with Cyberpunk-level everything else.

Playing at 4K, high preset, 80% resolution scale, getting low 30s but consistent. Surprisingly, it's more fun than I expected, like playing a movie; it held my attention for like 10 minutes. I'll go back to it, just taking a break. Impressive start.


ZEUS-COMING SOON

            Omen 16 2021

            Zenbook 14 oled

            Vivobook 15x oled

 


42 minutes ago, Mr. Fox said:

Mr. Azor is a PR hack that says whatever he thinks will be sufficient to trick people into buying stuff from the company he works for. Accuracy is not important.

 

Indeed, our industry is overrun by shysters, and I think being one is a prerequisite for participation. Mr. Green Jeans (dang Huang wang) just happens to be a king among liars, LOL. But it doesn't matter what company you're talking about; they're all dishonest and misrepresent their overpriced products. Some are both dishonest and produce less desirable products.

 

Don't believe for a minute that if the 7900 XTX were a 4090 competitor, AMD would be selling it for substantially less. They'd be raping their customers to the same degree, without batting an eye. They are not an honorable company; none of them are. They are charging as much as they believe they can and still move the product. Consumers are tasked with picking their own poison.


Well, I'm not sure Starfield is the only example (that would indeed be a bit suspect), but it seems the 7900 XTX is actually a solid competitor. I didn't see Azor's name anywhere near the specs, BTW; I'm not sure he would even understand them.
 

Innocent until proven guilty. Any concrete examples of AMD behaving unfairly towards the consumer?

 

Look, NVidia creates proprietary DLSS versions which don't even work on previous generations of their own GPUs; AMD creates open alternatives which work on any GPU.

 

As for the competitive pricing, we are actually seeing the same on the server CPU side, where Sapphire Rapids unfortunately represents terrible value vs EPYC purely from the hardware performance standpoint.

"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


51 minutes ago, tps3443 said:


Not gonna lie, just my RTX3090 KP at 4K is extremely comparable to a 7900XT. Of course, not in Starfield though 🫣

 

This and more.

 

When I had my 7900 XTX it was decent, but with RT on it merely equaled my 3080. In pure raster it was about 26% faster than my stock 3080 10GB, which wasn't cutting it for WoW 4K Ultra.

 

47 minutes ago, tps3443 said:

@electrosoft How do I even test 1% lows and 99% lows? Do I set a frame cap? Adaptive sync on? It seems this number can be so easily influenced. I can set adaptive sync with a frame cap and lower graphics and it's gonna be really good, versus no frame cap, which will hurt the lows a little.

I don't even know where to start. I get that I'll be testing against myself, but I appreciate any tips for best results.

 

I figured you would use AB/RTSS with benchmark capture. Since your hardware is fixed, determine a repeatable save-spot run of your choosing and scale frequency without changing timings, from your 8800 settings down to 5200 in 400 MHz increments. Initially it would be a pure frequency test, to see how much it changes.

 

If you really want to get in-depth with timings vs. frequency, maybe pick a sweet spot for your memory (6000-7200) that lets you tighten up even more, and test 6000, 6400, 6800, and 7200 with loose vs. tight timings.

 

You can then bar-graph it or take screenshots (or both) to see how much frequency matters with everything else fixed and OC'd.

 

1080p, 1440p, 4k.
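If it helps organize the frequency ladder described above, the runs can be dumped into a tiny script for a quick text bar chart. A minimal sketch; every FPS value below is a made-up placeholder, not a measurement, and only the layout is the point:

```python
# Hypothetical layout for the memory frequency-scaling runs: one
# average-FPS number per DDR5 speed, same scene, same timings.
# All FPS values are placeholders, not real measurements.

runs = {  # MT/s -> average FPS (placeholder data)
    5200: 101.0, 5600: 104.5, 6000: 107.0, 6400: 109.0, 6800: 110.5,
    7200: 111.5, 7600: 112.0, 8000: 112.4, 8400: 112.7, 8800: 113.0,
}

top = max(runs.values())
for speed, fps in sorted(runs.items()):
    bar = "#" * round(40 * fps / top)  # scale every bar to a 40-column width
    print(f"{speed:>4} MT/s | {bar} {fps:.1f}")
```

Same idea as a spreadsheet bar chart, just pasteable straight into a forum post as text.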

 

But anything you want to present would be awesome. Since your system is so OC'd, I'm sure the bulk of the upper-end resolution results will be GPU-bound.

 

The most important thing is to keep everything as absolutely consistent as possible (turn on the chiller!) while yanking your memory to and fro.
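On the 1% low question itself: the number is commonly just the average FPS over the slowest 1% of captured frametimes (some tools report the 99th-percentile frametime instead), which is also why a frame cap makes the lows sit closer to the average. A minimal sketch with synthetic frametime data:

```python
# Sketch of how an RTSS/CapFrameX-style "1% low" falls out of a
# frametime capture: average FPS over the slowest 1% of frames.
# The frametimes below are synthetic (milliseconds), not a real log.

def fps_stats(frametimes_ms):
    """Return (average FPS, 1% low FPS) from frametimes in milliseconds."""
    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # slowest 1% of the capture
    one_pct_low = 1000.0 / (sum(worst[:n]) / n)
    return avg_fps, one_pct_low

# A mostly steady 10 ms (100 FPS) run with a few 25 ms stutters:
times = [10.0] * 97 + [25.0] * 3
avg, low = fps_stats(times)
print(f"avg {avg:.1f} FPS, 1% low {low:.1f} FPS")
```

Run the same capture length at each memory speed and the 1% low becomes comparable across runs; change the cap or sync settings between runs and it isn't.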

 

 

 

 

Electrosoft Prime: 7950X3D | MSI X670E Carbon  | MSI Suprim X Liquid 4090 | AC LF II 420 | G.Skill 6000 A-Die 2x32GB | Samsung 990 Pro 2TB | EVGA 1600w P2 | Phanteks Enthoo Pro | Alienware AW3225QF 32" OLED

MelMel:  AMD Ryzen 5 7600x | Asus B650 Prime | Powercolor Spectra White 7900XTX | Asus Ryujin III 240mm AIO | M-die 2x16GB Custom | Samsung 980 Pro 1TB | EVGA P2 850w | Hyte Y40 | BenQ 32" 4k
Heath: i9-12900k | EVGA CLC 280 | Asus Strix Z690 D4 | Asus Strix 3080 | 32GB DDR4 2x16GB B-Die 4000  | WD Black SN850 512GB |  EVGA DG-77 | Samsung G7 32" 144hz 32"

Eurocom Raptor X15 | 12900k | Nvidia RTX 3070ti | 15.6" 1080p 240hz | Kingston 3200 32GB (2x16GB) | Samsung 980 Pro 1TB Heatsink Edition

 

 

 


 



45 minutes ago, ryan said:

Screenshot-13.png

Screenshot-14.png

 

A mix of Skyrim and Mass Effect; graphically it's like Fallout (terrain) with Cyberpunk-level everything else.

Playing at 4K, high preset, 80% resolution scale, getting low 30s but consistent. Surprisingly, it's more fun than I expected, like playing a movie; it held my attention for like 10 minutes. I'll go back to it, just taking a break. Impressive start.


So, if you're in the low 30s indoors, you're gonna want to optimize a little more just for a good, playable experience throughout.

Install the free DLSS mod (it looks great compared to FSR2 on distant objects with the scaling turned down); you've got to download like 3 small files, I think, and drop them in the install folder(s). Then drop the resolution slider to probably 50-67% at least if you are running 4K. Turn motion blur off altogether; it's a very resource-hungry setting in this game. And probably drop shadow quality to medium.

This game loves huge memory bandwidth, and even then, coming from a cave or indoors, the frame rate will nearly be chopped in half when you walk into the big city outside.

I've dumbed all the graphics down and played like that for a while just to see how a higher frame rate felt, and once you get into it, the visuals don't even matter all that much anyway.

Keep playing; maybe stick with the main quest for some levels and then sidetrack! It's a lot of fun once you really get going. I never really got into Cyberpunk like this, or previous games, but I gave this game a solid chance and now I'm hooked. So much to do.




Whoa, I was not expecting this. I'm curious what the minerals are and why he tripped out. Lmao, the shooting is kinda glitchy, but it's kinda fun seeing them crawl. Sick, I know. But yeah, this game so far is abnormally good; maybe people really did just have high expectations. I had none, and so far it's fun exploring. Side note: it's annoying me that my guy is moving really slow. I don't ever play RPGs, but is it because of my inventory? I remember in an RPG, not sure which one, when you pick up too much stuff it slows your guy down (thinking Skyrim).

So far it's a masterpiece, not a 7/10; for sure not just good, if the intro is anything to go by.



15 minutes ago, tps3443 said:


So, if you're in the low 30s indoors, you're gonna want to optimize a little more just for a good, playable experience throughout.

Install the free DLSS mod (it looks great compared to FSR2 on distant objects with the scaling turned down); you've got to download like 3 small files, I think, and drop them in the install folder(s). Then drop the resolution slider to probably 50-67% at least if you are running 4K. Turn motion blur off altogether; it's a very resource-hungry setting in this game. And probably drop shadow quality to medium.

This game loves huge memory bandwidth, and even then, coming from a cave or indoors, the frame rate will nearly be chopped in half when you walk into the big city outside.

I've dumbed all the graphics down and played like that for a while just to see how a higher frame rate felt, and once you get into it, the visuals don't even matter all that much anyway.

Keep playing; maybe stick with the main quest for some levels and then sidetrack! It's a lot of fun once you really get going. I never really got into Cyberpunk like this, or previous games, but I gave this game a solid chance and now I'm hooked. So much to do.


I’m loving just landing on random planets and exploring a bit. Taking over random ships that land nearby and killing everyone onboard. Space pirate stuff. 
 

The AutoHDR mod/hack looks incredible on OLED. Def have to try that out. 
 

DLSS and Frame Gen are a must as well. 
 

I don’t usually love this type of game but I’ve def gotten into it. I can see this game lasting a long time and being heavily modded over the years. Worth a look on Gamepass for $5 tomorrow when it drops for those that haven’t tried it out yet. 


The Beast Asus Z790 APEX | Intel i9 13900K | ASUS RTX 4090 Strix OC | 64gb DDR5 7466 CL34 Dual Rank A-Dies | Samsung 990 Pro 2TB | Innocn 4K 160Hz Mini LED HDR1000 | LG 27GN950-B 4K 160Hz | Corsair 170i Elite LCD 420mm AIO | Corsair 7000D | EVGA 1600w T2

Little Beast EVGA Z690 DARK | Intel i9 13900K | Nvidia RTX 4090 FE | 32gb DDR5 SK Hynix DDR5 8000 CL36 A-Dies | Samsung 980 Pro 2TB | LG OLED C1 4K 120Hz G-Sync/FreeSync | Alienware AW2721D 1440p 240Hz G-Sync Ultimate | Corsair 115i Elite 280mm AIO | Lian Li 011 Dynamic | EVGA 1000w P6

 

 


4 minutes ago, ryan said:

Whoa, I was not expecting this. I'm curious what the minerals are and why he tripped out. Lmao, the shooting is kinda glitchy, but it's kinda fun seeing them crawl. Sick, I know. But yeah, this game so far is abnormally good; maybe people really did just have high expectations. I had none, and so far it's fun exploring. Side note: it's annoying me that my guy is moving really slow. I don't ever play RPGs, but is it because of my inventory? I remember in an RPG, not sure which one, when you pick up too much stuff it slows your guy down (thinking Skyrim).

So far it's a masterpiece, not a 7/10; for sure not just good, if the intro is anything to go by.


I'm someone who never could get into games like Assassin's Creed Valhalla or The Witcher III without feeling like I was forcing myself to play them. However, Starfield doesn't feel that way at all. Just an FYI: the gunplay gets really, really good at level 10-12. By then you'll also be pretty well educated on all of the concepts and controls of the game. Keep going; you've got a lot of fun ahead.

 

3 minutes ago, Talon said:


I’m loving just landing on random planets and exploring a bit. Taking over random ships that land nearby and killing everyone onboard. Space pirate stuff. 
 

The AutoHDR mod/hack looks incredible on OLED. Def have to try that out. 
 

DLSS and Frame Gen are a must as well. 
 

I don’t usually love this type of game but I’ve def gotten into it. I can see this game lasting a long time and being heavily modded over the years. Worth a look on Gamepass for $5 tomorrow when it drops for those that haven’t tried it out yet. 


I love it. I can one-shot most bad guys in the head while sneaking now. Lots of fun. I've got 29 hours of play time in so far.



I'm at the part where I have to mine for that guy from the outpost on Jemison. This game I will give a chance; I don't mind it. I found The Witcher boring, but I'm not even half a percent in and am already blown away by the size of this game. I don't mind missions to collect stuff. It's like WoW meets Mass Effect.

 

I have it locked at 30 with ultra-high settings, shadows on low, motion blur off, and resolution at 80% with FSR2.

 

I was not expecting to play this at 4K 30 FPS; I was expecting 1080p to run at 30 based off what I saw on YouTube. Tweaking the OS FTW.



Lol, total newb question, but how do I see a diary or info on missions etc.? I literally am just wandering around checking out the world. As for main quests, I've done zero and have no idea where to go or who to see. Newb with RPGs.



17 hours ago, Talon said:

https://www.maxon.net/en/downloads/cinebench-2024-downloads

 

Get benching, y'all.
 

edit: Seems very demanding compared to R23. 

 

damn, even MORE demanding than R23. if this trend keeps up, there won't be a need to use a power virus like P95 anymore 😄 i'll download and test it out once i get home. curious to see which command-line instructions still work; sometimes they update the syntax, other times they don't.

 

any new features / options integrated vs. CB R23?

 

12 hours ago, electrosoft said:

HUB confirming what GN found about CPU performance in Starfield... it is a very heavily CPU-influenced title, and Intel rules the roost.

 

Especially the 5800X3D just gets wrecked....

 

Seriously, Starfield loves Intel CPUs... that has to be a weird win-some/lose-some for AMD, who sponsored the PC version.

 

(Eyes the SP115 13900KS sitting on the shelf in the MSI Z790i Edge motherboard atm doing nothing)

 

300368467_Screenshot2023-09-05151144.thumb.jpg.8a01e794037b588b67d53b857db165f9.jpg

 

1458950029_Screenshot2023-09-05151436.thumb.jpg.01c1255e6f5b2b0fae0e91e1ddf027ac.jpg

 

Memory findings are in line with GN's, meaning much smaller gains going from 4800 -> 7200 (but again, I'd like to see 7600 or 8000+. Maybe take a looksy at memory benefits @tps3443 and let us know).

 

211343254_Screenshot2023-09-05152008.thumb.jpg.88258e05dc86355d5ace1350dc49d2c2.jpg

 

 

 

 

 

HUB just confirms and extends some of GN's results to really dial in these hard truths:

--

Intel CPUs dominate for Starfield.

AMD GPUs are competitive with Nvidia (7900xtx vs 4090 especially).

DDR5 memory bandwidth is not nearly as important for performance gains, though there are some.

--

 

We've reached a point where, if you're serious about Starfield, a 7600 > 5800X3D for gaming.

 

I'd say best bang:buck system is a 13700k + cheap B660 + cheap 6400-7000 DDR5 + 7900XTX

 

As always one of my favorite things to see unfold is how driver and game updates change the performance landscape.

 

 

 

 

 

i find it interesting to see how vastly different games react to the same hardware. in some games X3D leaves everything else completely in the dust; in this case more cache only helps a tiny bit, and specific arch plus jiggahertz galore is the winner. i hope this is not a new trend that will intensify in the future. i don't want to have to specifically look up all the games and programs i POTENTIALLY want to use in the future and adapt my hardware accordingly. as we all know, there's always new software around the corner, so things might change rapidly.... ugh

 

10 hours ago, Izy said:

JLnXOdv.png

 

this i find funny

 

i actually find this super interesting (me being the big data / statistics nerd that i am 😄 ), especially long-term: this would be a cool indicator for silicon wear / degradation. aside from accumulated temp, there's also the same for voltage and amperage.

the only time this would SUCK, though, is if manufacturers use these values to judge whether or not a cpu / mobo / etc. was overclocked or utilized in a "suboptimal ambient thermal environment", to try and get out of warranty claims 🤪


Mine: Hyperion "Titan God of Heat, Heavenly Light, Power" (2022-24)
AMD Ryzen 9 7950X (custom TG IHS) / Asus ROG Crosshair X670E Extreme / MSI Geforce RTX 4090 Suprim X / Teamgroup T-Force Delta RGB DDR5-8200 2x24 GB / Seagate Firecuda 530 4 TB / 5x Samsung 860 Evo 4 TB / Arctic Liquid Freezer II 420 (Push/Pull 6x Noctua NF-A14 IndustrialPPC-3000 intake) / Seasonic TX-1600 W Titanium / Phanteks Enthoo Pro 2 TG (3x Arctic P12 A-RGB intake / 4x Arctic P14 A-RGB exhaust / 1x Arctic P14 A-RGB RAM cooling) / Samsung Odyssey Neo G8 32" 4K 240 Hz / Ducky One 3 Daybreak Fullsize Cherry MX Brown / Corsair M65 Ultra RGB / PDP Afterglow Wave Black

 

My Lady's: Clevo NH55JNNQ "Alfred" (2022-24)
Sharp LQ156M1JW03 FHD matte 15.6" IGZO 8 bit @248 Hz / Intel Core i5 12600 / Nvidia Geforce RTX 3070 Ti / Mushkin Redline DDR4-3200 2x32 GB / Samsung 970 Pro 1 TB / Samsung 870 QVO 8 TB / Intel AX201 WIFI 6+BT 5.2 / Win 11 Pro Phoenix Lite OS / 230 W PSU powered by Prema Mod!


18 hours ago, Etern4l said:

 

Sure, we know there is a premium to pay, the problem is that NVidia raised prices of products across the board to create an illusion of necessity to purchase the top end card, as everything else seems like bad value. This and the other unhealthy practices have hurt the entire PC market, which is already in a worrying state of decline. So Level 2 thinking here would be to avoid supporting a company which acts to the detriment of the entire community (if not humanity). Sure, people can be selfish, myopic and just pay up the extra $400-500 over par, if that makes them feel as if they were driving a Ferrari among plebs in packed Camrys lol, but will this help move the world in the right direction? 

 

 

How they choose to conduct business is their prerogative. Nvidia isn't the first nor will they be the last company to purposely make one of their products look like a poor buy to push consumers to another, more expensive one. It is a well known marketing strategy. They are not the first to have overall ridiculously high profit margins. They are in the business of making money while advancing technology. They are the GPU market crown jewel at the moment and they know it and will capitalize on it as much as possible. The golden rule is in full effect.

 

The consumer then reviews their product line and decides what fits in their budget and if they want one of their GPUs or perhaps go elsewhere to AMD which has a pretty nice overall product stack of their own. Like I said, no one is forcing anybody to purchase one of their products. If they choose Nvidia in the end, I would not be so arrogant as to think they are being selfish or myopic just because their criteria is different than yours.

 

As noted before, we can see Nvidia reacting to market conditions and competition with the 4060ti 16GB and I am sure down the line the 4060ti 8GB may be re-positioned price wise in light of the 7700xt and even the 4070 re-positioned to compete directly with the 7800xt.

 

 

18 hours ago, Etern4l said:

 

 

You probably meant to say "this is a free market". Well, it's not exactly, because NVidia is a pseudo-monopolist. They are in a position to rip people off by charging excessively for their products. They are doing so without regard for us, the enthusiasts, or the PC market, because Jensen has his head stuck high up in the clouds, hoping for AI-driven world dominance. It's only us, the consumers, who can bring him down a peg or two.

 

 

 

No, I selected capitalism specifically to zero in on corporations (Nvidia in this instance), not the idea of a free market. While they share many of the same values, they are not the same: https://www.investopedia.com/ask/answers/042215/what-difference-between-capitalist-system-and-free-market-system.asp

 

Jensen is technology and profit driven above all else including us enthusiasts. Shocker. 🙂

News flash: So is AMD and the vast majority of corporations out there.

 

18 hours ago, Etern4l said:

People make their decisions based on a plethora of fuzzy factors: their knowledge, their CPU characteristics, the information in their possession, their habits, and crucially - their emotions. Arguably, what's often missing is long-term thinking. I mean we know for example that people make decisions they later regret, buyer's remorse is a thing. But sometimes, there is no remorse even if the consequences are bad - that's flawed human nature, we tend to love ourselves the most, sometimes in situations where actually helping others would be more beneficial. 

 

 

Agreed. Welcome to humanity.

 

18 hours ago, Etern4l said:

Well, words are cheap, I don't own any AMD GPUs either (harder constraints unfortunately), but the least we can do is avoid helping NVidia by hyping up their products.

 

See, I actually try to push AMD products. I even initially skipped the 4090 and tried to go with the 7900 XTX, but it was a poor performer (scroll back to January 2023), so I returned it and picked up a 4090. I main-rigged a 5800X and now a 7800X3D. I picked up a 6700 XT over a 4060 Ti for my ITX rig. It is well known that I have a soft spot in my heart for them and always have, and I've used their GPUs and CPUs quite frequently over the last 20 years. My main GPU during Turing was even a 5700 XT Anniversary Edition, for over a year.

 

18 hours ago, Etern4l said:

Yes, based on the theoretical performance numbers sourced from the links I posted earlier. I have used the techpowerup GPU specs page quite a bit when comparing GPUs; if there is a major flaw in the methodology, I would like to know.

 

You can see for example that the 7900XTX has a higher pixel rate (the GPU can render more pixels than NVidia's top dog), but lower texture rate (NVidia can support more or higher res textures, by about 25%). Compute specs are mostly in favour of the 7900XTX, except for FP32, where the 4090 is in the lead. The differences either way can be fairly large, which suggests there exist architectural differences between the GPUs which might also explain differences in game performance depending on the engine or content type.

 

Based on the specs alone, the 4090 should be priced maybe within 10%, certainly not 60%, of the 7900XTX. The 50% excess constitutes software and "inventory management" premiums, neither of which should really be applicable in a healthy market (to be fair, the software part of it could - to some extent - be AMD's own goal as per @Mr. Fox, I have yet to see bro @Raiderman jump in to the drivers' defence).

 

 

If your theory is that paper calculations show they are relatively equal in rasterization performance, and that the discrepancy in the vast majority of benchmarks and reviews, which show the 4090 is clearly superior overall in rasterization (and in basically everything else), comes down to software and/or Nvidia dev optimizations, then I believe you are in error.

 

Feel free to support your theory with hard evidence outside of tech specs on paper. I'm always open to change my mind when presented with hard evidence.
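For what it's worth, the "paper" pixel and texture rates being argued over here are just unit counts multiplied by clocks. A sketch of that arithmetic; the ROP/TMU counts and boost clocks below are approximate reference-card figures, not authoritative specs:

```python
# Back-of-envelope GPU throughput from unit counts and boost clocks,
# the same arithmetic spec-sheet pages use. Spec values are approximate
# reference-card numbers (assumptions, not measured data).

def pixel_rate_gps(rops: int, boost_ghz: float) -> float:
    """Peak pixel fill rate in gigapixels/s: ROPs x boost clock (GHz)."""
    return rops * boost_ghz

def texture_rate_gts(tmus: int, boost_ghz: float) -> float:
    """Peak texture fill rate in gigatexels/s: TMUs x boost clock (GHz)."""
    return tmus * boost_ghz

cards = {
    "RTX 4090": {"rops": 176, "tmus": 512, "boost_ghz": 2.52},
    "7900 XTX": {"rops": 192, "tmus": 384, "boost_ghz": 2.50},
}

for name, c in cards.items():
    print(f'{name}: {pixel_rate_gps(c["rops"], c["boost_ghz"]):.0f} GP/s, '
          f'{texture_rate_gts(c["tmus"], c["boost_ghz"]):.0f} GT/s')
```

With these inputs the XTX comes out ahead on pixel rate and the 4090 ahead on texture rate, which is the shape of the quoted comparison; peak rates say nothing about how often a real game workload sustains them.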

 

 



 

 

 


 


9 hours ago, Etern4l said:

Azor’s name anywhere near the specs BTW, I’m not sure he would even understand them.
 

Innocent until proven guilty. Any concrete examples of AMD behaving unfairly towards the consumer?

 
Nope, Azor isn't innocent. It seems people have forgotten what a manipulator he is. Remember the bragging about the world's fastest gaming laptop? He was caught with his pants down by @Prema, and afterwards promised a new vBIOS for their jokebook to try to come on par with the other brands. He talks with everyone in his department to make everything go his way if needed. Don't trust that devil. And yup, Azor was the man who said AMD won't compete with the 4090, that all AMD has is a 4080 competitor at a slightly lower price. Straight out of the ass's mouth. All those at the top talk with each other before they throw out finished products, and Azor is one of them: the real salesman who talks to the press afterwards.

 

AMD looks more at power efficiency than at taking the gaming crown. Tuning games for your graphics card brand isn't the same as having the most powerful graphics card; it's just adjusting the game to suit your graphics architecture.

 


Btw, every morning a nice new beef for breakfast here, just outside Africa 🙂

Spain and Africa have good beef, and at 1/3 of the price vs. home in Norway. So beef two times a day now, LOL.

Then we have all the restaurants with the same good beef quality. I think it will be some time before I jump on a beef dinner when we come home, haha.
IMG-6226.jpg


"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

Papusan @ HWBOT | Team PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


Nice! How are you now? Is the heatwave gone? Are the temps a little better?


7950X3D| Zotac 4090 AMP Extreme Airo| MSI MPG B650 Edge Wifi| Lian Li Galahad 360 V2| 32GB Kingston Renegade RGBZ 6000|Kingston KC3000 2TB| Fury Renegade 2TB|Samsung 970 Evo 1TB| Lian Li O11 Dynamic Evo| Corsair HX1500i| Samsung Odyssey G9 Neo

Asus Zephyrus G15 (Ryzen 9 6900HS + RTX3080)

 


3 hours ago, cylix said:

Nice! How are you now? Is the heatwave gone? Are the temps a little better?

Hi. Better, thanks for asking. We have had two days with a chilly 27C and no sun. Grey weather, but slightly windy, so it feels colder. Today the sun is back and I expect temperatures around 30C. Not bad. Hopefully the temp doesn't climb above 32C. We have AC now, so we can keep it to at most 25C inside the bungalow. I usually never sit directly in the sun; it has to be under a roof or a huge umbrella. 25-28 is what I hope for 🙂 I only hope we can avoid the wind and high humidity. Right now the humidity is around 63, I think. I hope it stays there or sinks slightly toward 50, but it has been 62-66 for several days, so I doubt that will change.
 

The weather is chaotic in Europe now. The Canary Islands belong to Spain, but they sit right off the African coast, so they're not as affected. These islands have one of the best climates on earth, usually 25-28C. Right now it's 27C and full sun in the middle of the day, 13:30.


 

Here is the car we rented for this summer vacation: a T-Roc. Works OK, and an OK price.

IMG-0105.jpg

 

The chaos.

https://www.dagbladet.no/nyheter/europa-snudd-pa-hodet/80153217

 

As usual, middle Norway can't have good weather, even though the chaotic weather in Europe usually pushes better/warmer weather up to Scandinavia and southern Norway. My hometown is flooded with rain and cold as usual. Not fun going home when that time comes 😞 10C, rain, wind and 96% humidity at home isn't tempting. Being stuck with awful weather 8 out of 12 months isn't nice, and the 4 so-called better months aren't fantastic either. Can't have it both ways. Same with today's tech.

  • Thumb Up 3
  • Bump 2

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOT | Team PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


4 hours ago, jaybee83 said:

actually find this super interesting (me being the big data / statistics nerd that i am 😄 ) especially long term wise, this would be a cool indicator for silicon wear / degradation. aside from accumulated temp theres also the same for voltage and amperage. 

the only time this would SUCK though, is if manufacturers use these values to judge, whether or not a cpu / mobo / etc was overclocked or utilized in a "suboptimal ambient thermal environment" to try and get out of warranty claims 🤪

AMD and Intel both now have written policies that expressly deny warranty coverage on processors that have been overclocked, so they don't need any more ammunition to screw us. It is premeditated on both their parts.

 

As far as I know, ASUS is the only one logging those stats in the UEFI. It would be naive to believe it won't be utilized for a nefarious purpose. There's no legitimate reason to log the information if they're not going to use it for something, and you can bet your booty it won't be something good. And that would fit their modus operandi as a company.

 

One more example of uninvited surveillance designed to benefit the entity surreptitiously gathering the data rather than the party producing it.

  • Thumb Up 1

Wraith // Z790 Apex | 14900KF | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower GF3 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO
Banshee // Z790 Apex Encore | 13900KS | 4090 Gaming OC+Alphacool Block | 48GB DDR5-8600 | RM1200x SHIFT | XT45 1080 Nova || Dark Base Pro 901
Munchkin // Z790i Edge | 14900KF | Arc A770 Phantom Gaming OC | 48GB DDR5-8200 | GameMax 850W | EK Nucleus CR360 Dark || Prime AP201 
Half-Breed // Dell Precision 7720 | BGA CPU Filth+MXM Quadro P5000 | Sub-$500 Grade A Refurb || Nothing to Write Home About  

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


8 hours ago, ryan said:

Lol, total newb question, but how do I see a diary or info on missions etc.? I literally am just wandering around checking out the world. But as for main quests I've done 0 and have no idea where to go or who to see... newb with RPGs


I press M and then L to open missions, but I think you can just press L and it opens. You can also hold P to level up and open the skills menu (you need to acquire the boost pack training skill so you can double jump and use your boost pack). And of course hit Tab or I to open your inventory. Make sure you take a nap in game every once in a while, or else your skills' improvement speed takes a hit. Maybe just sleep for a whole 24 hours every once in a while (every few days lol). As for healing yourself, you can use med packs only, or eat a bunch of food. I typically steal and take anything edible in game, even oranges and potatoes out of refrigerators and stuff lol. Use cover when you're shooting at enemies. This is helpful. 
 

Once you get to the missions you can click them and set a route, or travel there, by hitting R or X.

 

Sometimes the destination or galaxy is too far to travel in one go, so you must side jump or baby-jump planet to planet. You'll figure it out; it took me a bit to really understand it. 
 

Last but not least, your character may have afflictions. You can see this by just pressing Tab. You may see a colored icon beside your character, and it will show the description. It could be broken bones, burns, or head wounds. This is what the heal paste, bandages, or burn paste are used for. These items only fix afflictions and can't heal you much on their own. Use med packs, resting in a bed, or food for that. 
 

One more thing: when you land your ship, you can approach the ship services technician and upgrade your ship. Not just that, but you can tear the whole thing apart and build it however you like. You can make it massive or tiny. There is also a nifty cargo hold in your ship where you can store things. It is accessed via an LCD screen on the left-hand wall panel inside your ship, if you're facing your cockpit seat and looking out of the cockpit window. 

  • Thumb Up 1

13900KF


14 hours ago, Etern4l said:

 

Sure, we know there is a premium to pay; the problem is that NVidia raised prices across the board to create the illusion that buying the top-end card is a necessity, as everything else seems like bad value. These and other unhealthy practices have hurt the entire PC market, which is already in a worrying state of decline. So Level 2 thinking here would be to avoid supporting a company that acts to the detriment of the entire community (if not humanity). Sure, people can be selfish, myopic and just pay the extra $400-500 over par, if that makes them feel as if they were driving a Ferrari among plebs in packed Camrys lol, but will this help move the world in the right direction? 

 

 

You probably meant to say "this is a free market". Well, it's not exactly, because NVidia is a pseudo-monopolist. They are in a position to rip people off by charging excessively for their products. They are doing so without regard for us, the enthusiasts, or the PC market, because Jensen has his head stuck high up in the clouds, hoping for AI-driven world dominance. It's only us, the consumers, who can bring him down a peg or two.

 

 

People make their decisions based on a plethora of fuzzy factors: their knowledge, their CPU characteristics, the information in their possession, their habits, and crucially - their emotions. Arguably, what's often missing is long-term thinking. I mean we know for example that people make decisions they later regret, buyer's remorse is a thing. But sometimes, there is no remorse even if the consequences are bad - that's flawed human nature, we tend to love ourselves the most, sometimes in situations where actually helping others would be more beneficial. 

 

All of this can be far from optimal, it's impossible to even argue that the decisions people take are always right and optimal for them. The best we can do is think about our decisions and constantly try to improve our decision making, knowing full well it's necessarily flawed to some extent... 

 

If we look a few steps ahead certain sets of criteria might be globally better than others. I don't want to go as far as invoking any of the striking examples of application of people's flawed criteria leading to various historical disasters, however, there are some future paths where supporting NVidia in any way could lead to very much unintended and undesirable consequences. Less drastically, and more immediately, NVidia clearly isn't helping our beloved PC world flourish.

 

 

 

Well, words are cheap, I don't own any AMD GPUs either (harder constraints unfortunately), but the least we can do is avoid helping NVidia by hyping up their products.

 

 

Yes, based on the theoretical performance numbers sourced from the links I posted earlier. I have used the TechPowerUp GPU specs pages quite a bit when comparing GPUs - if there is a major flaw in the methodology, I would like to know.

 

You can see, for example, that the 7900 XTX has a higher pixel rate (the GPU can render more pixels than NVidia's top dog) but a lower texture rate (NVidia can support more or higher-res textures, by about 25%). Compute specs are mostly in favour of the 7900 XTX, except for FP32, where the 4090 is in the lead. The differences either way can be fairly large, which suggests there are architectural differences between the GPUs that might also explain differences in game performance depending on the engine or content type.
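Those headline fill-rate figures fall straight out of unit counts and boost clock. A minimal sketch in Python of how such a comparison is derived - the unit counts and clocks below are approximate published spec numbers used purely for illustration, not authoritative values:

```python
# Theoretical fill rates, as derived on GPU spec sheets:
#   pixel rate   (GPixel/s) = ROPs x boost clock (GHz)
#   texture rate (GTexel/s) = TMUs x boost clock (GHz)

def pixel_rate(rops: int, boost_ghz: float) -> float:
    """Peak pixel fill rate in GPixel/s."""
    return rops * boost_ghz

def texture_rate(tmus: int, boost_ghz: float) -> float:
    """Peak texture fill rate in GTexel/s."""
    return tmus * boost_ghz

# Approximate published specs (assumptions for illustration only).
cards = {
    "RX 7900 XTX": dict(rops=192, tmus=384, boost_ghz=2.50),
    "RTX 4090":    dict(rops=176, tmus=512, boost_ghz=2.52),
}

for name, c in cards.items():
    print(f"{name}: {pixel_rate(c['rops'], c['boost_ghz']):.1f} GPixel/s, "
          f"{texture_rate(c['tmus'], c['boost_ghz']):.1f} GTexel/s")
```

With these inputs the 7900 XTX comes out ahead on pixel rate (more ROPs) while the 4090 leads on texture rate (more TMUs) - the same directional split described above.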

 

Based on the specs alone, the 4090 should be priced maybe within 10%, certainly not 60%, of the 7900XTX. The 50% excess constitutes software and "inventory management" premiums, neither of which should really be applicable in a healthy market (to be fair, the software part of it could - to some extent - be AMD's own goal as per @Mr. Fox, I have yet to see bro @Raiderman jump in to the drivers' defence).

 

Not jumping in to defend AMD's drivers, because they do somewhat suck. I miss the old CCC days for sure, as I didn't have to reset my PC if the driver crashed while benching (most of the time). The modern app model is broken and pointless IMO, and should have been abandoned years ago. I do, however, like the frequency with which AMD releases driver updates. On a side note, I am very pleased with the purchase of the 7900 XTX and think it's a fun card to bench, especially with a water block installed (in what little time I have had with it).

 

Update: Newegg has not asked for the extra DDR5-8000 RAM back, so I have some 2x16GB Trident Z5 available on the cheap.

https://www.newegg.com/g-skill-32gb/p/N82E16820374449?Item=N82E16820374449

 

  • Thumb Up 2
  • Like 1
  • Bump 1

Lian Li Lancool III | Ryzen 9 7950X | 48gb G-skill Trident Z5 DDR5 8000mhz | MSI Mpg X670E Carbon |

AsRock Taichi Radeon 7900xtx Bykski Block |Raijintek Scylla Pro 360 custom loop| Crucial T700 1tb

WD Black's SN770 500gb/1tb NVME | Toshiba 8Tb 7200rpm Data |

EVGA 1000w SuperNova |32" Agon 1440p 165hz Curved Screen |  Windows 10 LoT 21h2


20 hours ago, Mr. Fox said:

OK, I guess you have to download and install special AMD drivers that have a module to integrate CUDA for it to work. Trying it now to find out.

 

https://www.amd.com/en/developer/rocm-hub/hip-sdk.html

 

@Raiderman you may need this for it to work with your GPU.

4 hours ago, Raiderman said:

Not jumping in to defend AMD's drivers, because they do somewhat suck. I miss the old CCC days for sure, as I didn't have to reset my PC if the driver crashed while benching (most of the time). The modern app model is broken and pointless IMO, and should have been abandoned years ago. I do, however, like the frequency with which AMD releases driver updates. On a side note, I am very pleased with the purchase of the 7900 XTX and think it's a fun card to bench, especially with a water block installed (in what little time I have had with it).

 

Update: Newegg has not asked for the extra DDR5-8000 RAM back, so I have some 2x16GB Trident Z5 available on the cheap.

https://www.newegg.com/g-skill-32gb/p/N82E16820374449?Item=N82E16820374449

 

OK, well that driver was not only needed to make my AMD GPU compatible with the GPU part of Cinebench; the ray tracing components included, which I opted to install, also made a measurable improvement in ray tracing. Seems like about 5-10 FPS better, and the visual quality is much improved too, with a much smoother appearance than the standard Adrenalin driver. Definitely worth checking out on your 7900 XTX. At the start of the driver installation you have to choose which components to install. Choose everything, including the Pro driver all the way down at the bottom.

  • Thumb Up 3

Wraith // Z790 Apex | 14900KF | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower GF3 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO
Banshee // Z790 Apex Encore | 13900KS | 4090 Gaming OC+Alphacool Block | 48GB DDR5-8600 | RM1200x SHIFT | XT45 1080 Nova || Dark Base Pro 901
Munchkin // Z790i Edge | 14900KF | Arc A770 Phantom Gaming OC | 48GB DDR5-8200 | GameMax 850W | EK Nucleus CR360 Dark || Prime AP201 
Half-Breed // Dell Precision 7720 | BGA CPU Filth+MXM Quadro P5000 | Sub-$500 Grade A Refurb || Nothing to Write Home About  

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


1 hour ago, Raiderman said:

Not jumping in to defend AMD's drivers, because they do somewhat suck. I miss the old CCC days for sure, as I didn't have to reset my PC if the driver crashed while benching (most of the time). The modern app model is broken and pointless IMO, and should have been abandoned years ago. I do, however, like the frequency with which AMD releases driver updates. On a side note, I am very pleased with the purchase of the 7900 XTX and think it's a fun card to bench, especially with a water block installed (in what little time I have had with it).

 

Update: Newegg has not asked for the extra DDR5-8000 RAM back, so I have some 2x16GB Trident Z5 available on the cheap.

https://www.newegg.com/g-skill-32gb/p/N82E16820374449?Item=N82E16820374449

 


The Nvidia control panel has been almost the same since like 2004 😂 lol. I guess if it ain't broke, don't fix it. I too remember the Catalyst Control Center, though; at least it had built-in overclocking, which was handy for a quick overclock on a fresh OS install. My last AMD GPU was an RX 480 from launch day. Amazing GPU. I really miss those days. I paid $250 for an 8GB GPU on launch day that could game at 1080p/1440p really well. But that was also when 30-60 FPS was accepted lol. So maybe we're all being duped after all. 
 

Anyways, I think any of these GPUs will get the job done. I wouldn't mind giving the 7900 XTX Liquid Devil a try. 

  • Thumb Up 1

13900KF


Very nice. Now it's up to AMD to deliver proper drivers and their new frame-gen features. And a lot of cards for sale. AMD can own the low/mid range. Every single bit. 
 

Nvidia only has RT, and that feature is almost useless on mid-range cards. How long will Nvidia keep the 4070 at $600, bro @electrosoft? And what about the price point for their 128-bit-bus 4060 Ti and 4060 scam? I hope Nvidia now takes a pause and rethinks their greed.

 

https://www.techpowerup.com/review/powercolor-radeon-rx-7800-xt-hellhound/31.html

 

The 7700 XT has lower value but is still a much better choice than Nvidia.

 

https://www.techpowerup.com/review/sapphire-radeon-rx-7700-xt-pulse/32.html

 

There is simply no reason to buy Nvidia in the low/mid range. 
 

And that was my conclusion before the reviews of the 7700 XT and 7800 XT. Gamers don't need RT with such weak cards. Neither should they buy such overpriced, castrated graphics cards. 128-bit is almost iGPU-class graphics, which share system memory. Aka the worst of the worst for real gaming.

 

Now it's up to Nvidia to show how greedy they are. Better to have finished graphics cards sitting on the shelves, not selling? @electrosoft 🙂

  • Bump 1

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOT | Team PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


2 hours ago, Mr. Fox said:

OK, well that driver was not only needed to make my AMD GPU compatible with the GPU part of Cinebench; the ray tracing components included, which I opted to install, also made a measurable improvement in ray tracing. Seems like about 5-10 FPS better, and the visual quality is much improved too, with a much smoother appearance than the standard Adrenalin driver. Definitely worth checking out on your 7900 XTX. At the start of the driver installation you have to choose which components to install. Choose everything, including the Pro driver all the way down at the bottom.


ROCm (AMD's counter to CUDA) was in a sorry state when I looked at it a year ago; it didn't even support the then-current consumer cards (the 6900 at the time). Good to see they are making progress while maintaining competitive pricing across the entire product line - shocking as that may seem to us NVidia slaves, I know. More seriously though, both they and Intel face the same challenge: they must offer a software-gap discount while they are catching up, yet they have to make progress on software with less cash - a bit of a chicken-and-egg situation.
 

Luckily there is now also more software that leverages ROCm, as people are fed up with NVidia's grip on the industry and are being outspoken about it. I love the 2-slot form factor BTW.
 

If bro @Raiderman would be so kind as to run a couple of benchmarks with those drivers, such as AIDA64 GPGPU, and maybe Indigo and Blender, we could see whether they help the card deliver compute performance closer to its theoretical numbers, and how far things have improved on the driver/software side alone versus the launch review numbers, which frankly didn't correspond to the specs.

 

Edit: now that I think about it, the odds are not great unless said benchmarks specifically utilise ROCm rather than OpenCL, but it's worth a quick shot I guess.

  • Thumb Up 1

"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


1 hour ago, Papusan said:

And that was my conclusion before the reviews of the 7700 XT and 7800 XT. Gamers don't need RT with such weak cards. Neither should they buy such overpriced, castrated graphics cards. 


AMD graphics is the equivalent of old Skylake-gen processors. Nvidia usually offers more than a 5-10% performance uplift gen over gen. The cozy cooperation between Nvidia and AMD is awful: they know exactly what the competitor offers every gen. Hence we won't see real competition and correct pricing for their products. So it's not only Nvidia to blame here, bro @Etern4l

 

But if you already have a decent gaming PC with a GPU made in the past four years, you can almost certainly sit this generation out and wait to see what happens with RDNA 4 and "Nvidia-Next" — maybe Blackwell, though that could just as well be the next data center part.

 

https://www.tomshardware.com/reviews/amd-radeon-rx-7800-xt-review/10

 

The 7800 XT serves a purpose, and it's not a bad card, but it's also not particularly exciting. 5% more performance and 17% less power use, for basically the same going rate, represents a small step forward
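Worth noting that those two figures compound into a bigger efficiency gain than either looks on its own; a quick back-of-the-envelope check in Python, simply taking the review's 5% / 17% numbers as given:

```python
# Perf-per-watt uplift implied by the review's figures:
# +5% performance at -17% power draw vs. the previous gen.
perf_scale = 1.05    # 5% more performance
power_scale = 0.83   # 17% less power

efficiency_uplift = perf_scale / power_scale  # relative perf/W
print(f"~{(efficiency_uplift - 1) * 100:.0f}% better perf-per-watt")  # → ~27%
```

So roughly a quarter better performance per watt, even though raw performance barely moves.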

  • Bump 1

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOT | Team PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 

