Everything posted by electrosoft
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
How they choose to conduct business is their prerogative. Nvidia isn't the first, nor will they be the last, company to purposely make one of their products look like a poor buy to push consumers toward another, more expensive one. It is a well-known marketing strategy. They are not the first to have ridiculously high overall profit margins. They are in the business of making money while advancing technology. They are the crown jewel of the GPU market at the moment, they know it, and they will capitalize on it as much as possible. The golden rule is in full effect. The consumer then reviews their product line and decides what fits their budget and whether they want one of their GPUs, or perhaps goes elsewhere to AMD, which has a pretty nice overall product stack of its own. Like I said, no one is forcing anybody to purchase their products. If they choose Nvidia in the end, I would not be so arrogant as to think they are being selfish or myopic just because their criteria are different than yours. As noted before, we can see Nvidia reacting to market conditions and competition with the 4060ti 16GB, and I am sure down the line the 4060ti 8GB may be repositioned price-wise in light of the 7700xt, and even the 4070 repositioned to compete directly with the 7800xt.

No, I selected capitalism specifically to zero in on corporations (Nvidia in this instance), not the idea of a free market. While they share many of the same values, they are not the same: https://www.investopedia.com/ask/answers/042215/what-difference-between-capitalist-system-and-free-market-system.asp

Jensen is technology- and profit-driven above all else, including us enthusiasts. Shocker. 🙂 News flash: so is AMD and the vast majority of corporations out there. Agreed. Welcome to humanity.

See, I actually try to push AMD products, even initially skipping the 4090 and trying to go with the 7900xtx, but it was a poor performer (scroll back to January 2023), so I returned it and picked up a 4090. I main-rig'd a 5800X and now a 7800X3D. I picked up a 6700xt over a 4060ti for my ITX rig. It is well known that I have a soft spot in my heart for them and always have, and I've used their GPUs and CPUs quite frequently over the last 20 years. My main GPU during Turing was even a 5700xt Anniversary Edition for over a year.

If your theory is that paper calculations show them relatively equal in rasterization performance, and that the discrepancy in the vast majority of benchmarks and reviews showing the 4090 clearly superior in rasterization (and then basically everything else) comes down to software and/or Nvidia dev optimizations, then I believe you are in error. Feel free to support your theory with hard evidence beyond tech specs on paper. I'm always open to changing my mind when presented with hard evidence.
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
This and more. When I had my 7900xtx it was decent, but with RT on it equaled my 3080. Pure raster it was about 26% faster than my stock 3080 10GB, which wasn't cutting it for WoW 4K Ultra.

I figured you would use AB/RTSS with benchmark capture. Since your hardware is fixed, determine a repeatable save-spot run of your choosing and scale frequency without changing timings from your 8800 settings, stepping down in 400MHz increments to 5200; initially it would be a pure frequency test, so you can see how much it changes. If you really want to get in-depth with timings vs. frequency, maybe pick a sweet spot for your memory (6000-7200) that really lets you tighten up even more, and test 6000, 6400, 6800, and 7200 with loose vs. tight timings. You can then bar-graph it or take screenshots (or both) to see how much frequency matters with everything else fixed and OC'd. 1080p, 1440p, 4K.

But anything you want to present would be awesome. Since your system is so OC'd, I'm sure a large bulk of the upper-end resolution results will be GPU bound. The most important thing is to keep everything as absolutely consistent as possible (turn on the chiller!) while yanking your memory to and fro.
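If it helps at the graphing stage, here's a minimal sketch of how I'd chart those runs (my own illustration, not anything from this thread; the FPS values are placeholders you'd fill in from your AB/RTSS captures):

```python
import matplotlib.pyplot as plt

# Placeholder results: average FPS from each AB/RTSS capture of the same
# save-spot run, one entry per memory frequency, timings held constant.
results = {
    5200: 0.0, 5600: 0.0, 6000: 0.0, 6400: 0.0, 6800: 0.0,
    7200: 0.0, 7600: 0.0, 8000: 0.0, 8400: 0.0, 8800: 0.0,
}

freqs = sorted(results)
fps = [results[f] for f in freqs]

plt.bar([str(f) for f in freqs], fps)
plt.xlabel("Memory frequency (MT/s)")
plt.ylabel("Average FPS")
plt.title("Save-spot run: FPS vs. memory frequency (fixed timings)")
plt.tight_layout()
plt.show()
```

Repeat per resolution (1080p/1440p/4K) and you'll see exactly where it flips from memory bound to GPU bound.
-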
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
I would definitely try the 4400 sticks to remove that as a bottleneck and see how far you can push the 5800X3D, keeping the IF:DRAM 1:2 ratio (i.e., FCLK:UCLK 1:1) intact. The good thing is that while your CPU will give up the ghost at 3800-4000, you can then tighten the timings on those sticks to really dial them in with the 5800X3D. 3800/1900 was routinely the cap for the bulk of AM4 chips, with some dips into slightly higher frequencies. Some could hit 4000, but they basically had golden IMCs. IIRC, my 5800X capped out around 3866 on known-good B-die sticks, even in an SR 2x8GB config to lessen the load on the IMC.
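For anyone following along, a quick sketch of the clock arithmetic behind that 1:2 ratio (my own illustration; the FCLK-2200 comment is the usual AM4 wisdom, not a measured result):

```python
# DDR4 is double data rate, so the memory clock (MCLK/UCLK) is half the
# transfer rate, and Infinity Fabric (FCLK) wants to match it 1:1 to
# avoid the latency penalty of desynchronized mode.
def am4_clocks(ddr_rate_mts: int) -> dict:
    mclk = ddr_rate_mts // 2  # e.g. DDR4-3800 -> 1900MHz MCLK/UCLK
    return {"MCLK/UCLK (MHz)": mclk, "FCLK 1:1 target (MHz)": mclk}

for rate in (3600, 3800, 4000, 4400):
    print(f"DDR4-{rate}: {am4_clocks(rate)}")
# DDR4-4400 would need FCLK 2200 for 1:1, which almost no AM4 part can
# hold, hence 3800/1900 being the practical cap mentioned above.
```
-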
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
Bethesda and their Creation Engine are a great example of the blessing and curse of the 3D cache model. Creation Engine 1, with ES5 (Skyrim) along with Fallout 4/76, was a blessing: it was extremely single-threaded, but more importantly, instructions and assets were so much smaller (it was basically a ~12-year-old engine) that everything could sit in the cache, and performance got a big bump. Even now, I'd recommend a 5800X3D or 7800X3D system for Fallout 3/4/76.

Creation Engine 2 has been massively overhauled and upgraded with regard to multithreaded support and, more importantly, much larger per-thread asset and instruction handling, and it clearly can't sit in the cache in a large enough or meaningful way to impact performance to a major degree. We can see this in the 7800X3D being only ~10% faster than the 7700X. I'd like to look more in depth at the 7950X3D vs. 7950X and their near-identical scores, as this may be a case of massive thread utilization with some threads landing on the cache-enabled CCD and others not, or Game Bar issues. I'd like to see Process Lasso at work there, just as a checksum.

Based on how Starfield feels and plays, they clearly massively overhauled CE1, probably the same way Blizzard overhauled their WoW engine to modernize it while it still has its roots in its original 2004 design. I'd like to see how some tight 3800 memory on a 5800X3D, along with an optimized PBO and/or CO, deals with it, but there obviously won't be any miracles to close that massive gap (108fps vs. 65fps).

I'm sure we'll revisit this months down the road as patches, and upgrades, and optimizations (oh my!) are implemented, and we may find the 4090 properly moving ahead, along with the gap between AMD and Intel CPUs closing.
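As a poor man's Process Lasso checksum, here's a minimal sketch of pinning a process to one CCD with psutil (my own illustration; the process name and the assumption that the V-cache CCD enumerates as logical CPUs 0-15 are both hypotheticals you'd verify with HWiNFO or Ryzen Master first):

```python
import psutil

# Hypothetical process name; check Task Manager for the real one.
GAME_EXE = "Starfield.exe"
# Assumption: on a 7950X3D the V-cache CCD is CCD0, exposed as logical
# CPUs 0-15 with SMT enabled. Verify this mapping on your own system.
VCACHE_CPUS = list(range(16))

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(VCACHE_CPUS)  # restrict scheduling to the cache CCD
        print(f"Pinned PID {proc.pid} to logical CPUs 0-15")
```

If the 7950X3D suddenly pulls ahead of the 7950X with the game pinned like this, the scheduler/Game Bar theory gets a lot more credible.
-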
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
Whoa, but yeah, the smallest (or not so small) deviations when running such tight tolerances can derail your setup. See: the MSI AIO clogging debacle. The same thing happened to my Hyte R3 13900KS setup: out-of-the-blue crashes and runaway temps, until I finally had to abandon that chassis. I'll put up a post later about what happened.
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
HUB confirming what GN found about CPU performance in Starfield... it is a very heavily CPU-influenced title, and Intel rules the roost. The 5800X3D especially just gets wrecked...

Seriously, Starfield loves Intel CPUs... that has to be a weird win-some/lose-some for AMD, who sponsored the PC version. (Eyes the SP115 13900KS sitting on the shelf in the MSI Z790i Edge motherboard, atm doing nothing.)

Memory findings in line with GN, meaning much smaller gains going from 4800 -> 7200 (but again, I'd like to see 7600 or 8000+. Maybe take a looksy at memory benefits @tps3443 and let us know).
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
Yep, it looks, feels, and plays like FO4/76. That's not a bad thing. I was able to jump right in. Graphics are clearly updated, and I expect many hours of fun when I can fully dive in.

As for performance, Steve and GN did a CPU benchmark/bottleneck video with a 4090, plus a checksum of the top end with the 7900xtx, to show where CPUs give up the ghost. He also did some memory scaling (after the Buildzoid video) showing not much is gained from 6000 up to 7000 (~2fps), but I would have liked to have seen some 7600 and 8000+ testing just because...

This game seems to really like Intel 13th gen CPUs, especially the 13700/13900, which clearly outpace the X3D models, including the 7800X3D, 7950X3D, and especially the 5800X3D. Dropping resolution down to stupid levels (720p) shows not much of a difference between the 7900xtx and 4090 CPU-wise, ~4fps, but forcing the 13900K into a bottlenecked situation still shows the 7900xtx winning vs. the 4090 at all useful resolutions (i.e., resolutions and settings that can force a CPU bottleneck).
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
I'm focusing on top-tier pricing for those who want the best, referencing @tps3443's post, but this is absolutely true. I still stand by my assessment that the 4070 and 4090 are the only two viable cards price-wise in the lineup this time around, but if the 4060ti 16GB somehow falls to $399.99 it isn't a bad pick. It is an expensive hobby because the law of diminishing returns is grossly in play at the top end, same as if you want a Ferrari over a Camry or first class over economy. To have the best you pay for the best, sometimes crazily so.

Nvidia isn't ripping anybody off. Nobody is forcing you to purchase their GPUs. This is a capitalistic world. You look at the product and determine whether you want it at the price offered, based upon your own criteria. All Nvidia can do is offer their goods at their selected price points and let the consumer decide. I don't subscribe to the nonsense that consumers don't know what they're doing. They're making educated purchasing decisions based upon their own purchasing power, preferences, and buying criteria. Their preferences and criteria are allowed to be different than yours. It doesn't make yours right or theirs wrong.

I never said or intimated there is nothing that can be done. As always, you vote with your purchasing power. Enough have spoken to let Nvidia know their pricing is still acceptable. When it is not, and/or they feel enough competition/threat, they will adjust accordingly, as we just saw the 4060ti drop to $430 with AMD's announcement of the 7800xt along with lackluster sales. Hopefully we see more of this.

Just for clarity before proceeding: architecturally speaking, you are claiming the 7900xtx ~= 4090 in raw rasterization, and the 4090 only wins overall because of software and optimizations?

I remember skipping the Voodoo because the Verite was still a relatively new purchase for me, so I needed a cooling-off period before pulling the trigger on another upgrade! 😄 Games like Fallout 1 and 2 and Arcanum, which I played back then along with Quake, didn't need 3D acceleration. WAY back in the day, I did have a Gravis joystick! 😄
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
Well, there is a reason the bulk of GPU sales are mid- to lower-tier cards, as most buyers tend to shop there. We lament the 4090 and its pricing, but this cycle (and most previous cycles) top-end cards were reserved for and purchased by a much smaller group of enthusiasts, not Joe Consumer. Even then, Joe Consumer usually buys a mid-tier card with mid-tier hardware at best and will then use that system for at least 3-5 years before doing anything to it, either through finally wanting an upgrade or through games/software running so poorly they decide to upgrade to... yep... whatever is now that cycle's mid-tier hardware. If you perpetually want the best performance at all times, that comes at a cost. It's an expensive hobby, but it is what it is. If cars experienced the performance/efficiency gains CPUs and GPUs have, we would probably be getting 500mi/gal at this point and going 0-65 in less than a second.

I've been saying for quite some time that the 7900xtx is the sleeper bang:buck card, especially if you go for a $999-or-less model. Is it as good as the 4090? No. Not even really close when all things are equal. If you see any game where the 7900xtx equals or surpasses it in performance, that is a reflection of optimized AMD code and/or poorly written Nvidia code, usually from sloppy porting of AMD console code: due to the controlled, limited nature of consoles, developers HAVE to write leaner, more efficient code targeting AMD optimizations to extract maximum performance, and those optimizations carry over to AMD PC hardware. When devs then port their code to PCs and don't take the time to take advantage of Nvidia's optimizations, having focused on AMD's (as they should for the console end), you end up with charts showing the 7900xtx outpacing the 4080 by 15-20% in Starfield and matching or even beating the 4090, especially at lower resolutions... That tells you everything you need to know, since we know hardware-wise the 4090 is flat out a better card in every aspect (except price).

---

As for first GPUs... you young whippersnappers. 🙂

My first standalone GPU was a Tandy EGA ISA card for my Tandy 4000 (386 @ 16MHz) back in 1990 (this was also my first PC, bought while working at Radio Shack for 3.5 years during college). I upgraded to a VGA card and a Sound Blaster a little while later to fully soak in Wing Commander at max settings.

My first 3D-esque video card was a Rendition Verite 1000 so I could run vQuake.

My first all-purpose real GPU was a 3dfx Voodoo2 so I could run GLQuake (I then gave my Rendition card to my brother).

My first ATI GPU was a Radeon 7000 series circa 2000, which I modified, augmenting the heatsink and cooling it externally, and OC'd to the max to eke out every fps I could from Deus Ex (I again gave my Voodoo2 to my brother).

My first Nvidia was a BFG 6800 Ultra circa late 2004 for my Power Mac G5 Duo, to play WoW on my brand new 30" Apple Display.

From 1996-1999 my brother and I were hardcore into Quake, playing online and going to cons for matches/competitions, so I took our hardware pretty seriously: have it run as quick as possible while keeping details as low/sparse as possible so we could see everything clearly. We both ran Sony CRTs.
-
While I feel it looks, feels, and plays a lot like Fallout 3/4/76, I am enjoying it for what it is, and it's growing on me, but I love Fallout 4/76, so I'm jaded. I do think it could have been better, though. The Metacritic score seems to show it is generally liked, based upon 50 critic reviews.
-
2700X to 5800X3D upgrade! With a 10850K Cameo!
electrosoft replied to Reciever's topic in Desktop Hardware
"In today's episode on Gamer's Nexus, we look indepthly into the shady tactics and benchmark inaccuracies of @Reciever along with his attempt to correct said 'errors' " 🤣 -
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
I think it would cost you $5? -
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
I believe the term you are looking for is "Performance Enthusiast." 🤣

Having played some more of it, I am now referring to it as Starfield 76, aka Fallout in Space, aka Starfall, aka StarOut, aka FallField... as it looks, feels, and plays just like Fallout 4 and 76. I'm waiting to see if any of the frame-render or loot/NPC bugs rear their ugly heads.

Tempering your expectations, the vast majority of GPUs down to even a 3060 are enough to actually play. We're just a group of jaded hardware enthusiasts...

As for Nvidia, they are perpetually in cash-grab mode 24/7/365 😞

I'm so happy you rectified the situation Papu! How many pairs of Crocs did you have to trade in for that wonderful upgrade in quality of life? 😁
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
That looks really fantastic....oh...and.... "P-Dizzle" and "E-Dizzle" 🤣 Anybody ever tell you that you sound a lot like Bill Hader? -
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
Well, just wait till 2025 and get the 5090; then you will experience an even more pronounced upgrade than going simply to a 4090.

For example, a buddy of mine finally upgraded from his KP 2080ti to a 4090 specifically for Starfield, Hogwarts, and TLOU at 4K, and it was a monster uplift, but then here he is chasing an upper-tier specimen instead of just enjoying it out of the box: he picked up an Asus Strix and it topped out at ~3030 and 2770ish, +1100 on mem. He returned it and picked up another: ~3015 and 2785ish, +1000 on mem. Returned it, picked up a Suprim X AC: ~2985 and 2750, +900 on mem. All three blew his 2080ti out of the water, even blocked and OC'd. On the Suprim he even used the new vBIOS capability and reverted it back to the 1.10 top, and it still blew chunks (and made it sing like a canary). I told him they're all good out of the box (even the worst 4090 is still a beast at stock) and to just block it up like he planned, but he wants mine. I told him, "sure, $2.2k and it's yours" jokingly, and he said he'll think it over but is leaning towards wanting it. 🙄 What are we seriously talking about? 5fps? The law of diminishing returns is seriously in effect here.

Go pick up the cheapest 4090, block it, and then sell your KP 3090 to offset the cost and join us with the top dog till 2025. Or pick up a 7900xtx for a change of pace and get 4090-level performance in Starfield, according to Steve collecting benchmarks and the 7900xtx holding its own against (basically tying, or close enough to) the 4090 across all settings from 1080p to 4K ultra.
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
I know, right? The 3090 is still a powerhouse, but the 4090 was Nvidia's answer to such a close contest last gen. It left no doubt who was king this cycle, and they didn't even have to bring their full power to do it. Easy coast to the 5000 series now in 2025...
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
I forced it with Inspector but haven't implemented the DLSS mod yet. Nice. I am sure continued driver refinements will help everybody (or, in Intel's case, initial drivers, AHEM).
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
I didn't run AB, but late last night I set mine to 4K, Ultra, no FSR (full render scale) and played all the way to the planet landing with no problems (FYI, I am NOT feeling the space battles atm). I'll have to enable AB to see how it performs objectively versus subjectively.

I will say, as a veteran of Fallout 4 and Fallout 76, Creation Engine 2 feels an awful lot like Creation Engine 1, down to even using the same key binds and everything. The kill/looting system feels identical. The little bit I played just left me wishing they had updated Fallout 76 to CE2.
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
I cut the cord completely, and after my cooling-off period with Xfinity passed, I signed back up for their 400Mbps package at $55.99/mo vs. their triple-play BS w/ 1.2Gbps for ~$270/mo. I never feel the difference except in fleeting moments like this. Felt like it took forever to download. 🤣
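Back-of-the-napkin math on that (my own numbers; the ~125GB install size is an assumption about the launch-day download):

```python
# Rough download-time comparison for the two tiers mentioned above.
# Assumes a ~125GB install and ignores real-world throughput overhead.
INSTALL_GB = 125

def download_minutes(mbps: float, size_gb: float = INSTALL_GB) -> float:
    return size_gb * 8_000 / mbps / 60  # GB -> megabits, then minutes

print(f"400Mbps: {download_minutes(400):.0f} min")    # ~42 min
print(f"1.2Gbps: {download_minutes(1200):.0f} min")   # ~14 min
```
-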
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
True, different tools for different jobs. -
I have the Premium Edition waiting for launch, but I am going to wait a few weeks like I always do before playing, for that initial server slam and bug fest. As a long-time fan of the FO universe, I'll note that Creation Engine 2 looks an awful lot like Creation Engine 1 in the gameplay footage, so I'll be curious to see what it brings to the table. But yes, I definitely will be playing.
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
electrosoft replied to Mr. Fox's topic in Desktop Hardware
Tom's Hardware being so optimistic about 13th gen price drops:

"As the new chips differ little from current 13th Gen chips — apart from that i7 e-core count increase and higher clock speeds, it'll be well worth keeping an eye out for 13th Gen bargains."

But if Intel goes all Intel, as always, they will just keep 13th gen priced as-is (or drop it slightly) and slot the pricier 14th gen in right above it. If something like that happens, expect 13th gen to start experiencing meteorically bad bins as all the decent chips get pushed to 14th gen.
-
Steve basically issued his own version of LTT's "Here's the Plan" video, a mix of very, VERY in-depth policy and testing structures (along with future plans), but it was a tad tone-deaf and, more so, terribly timed. I said it before and I'll say it again: GN should have just dropped that first analysis of LTT's testing methodology along with the prototype shenanigans and left it at that.
-
FTFY