NotebookTalk

electrosoft

Everything posted by electrosoft

  1. Funny, I came away with the same conclusion. 🙂 "Oh, here's something for @tps3443 to aspire to" and "Yeah, I'm good with the benchmark" Don't do it! You know you'll regret it! But if you DO do it, make sure to post all the results. 🤣

True on the 4090. I am sure you could hit that on mine no problem, seeing as it does 3120+ on the stock vBIOS and stock AIO cooling. I have no idea where it would go blocked, let alone chilled, lol.

I know you have to be excited about this, bro! Once we downsized, we sold our monster-sized house and moved into a double wide, and costs dropped by insane levels across the board. Fifteen years later, my lady now keeps sending me listings for "smaller houses... not as giant sized." She wants the privacy of a house and land around it again. Looking forward to the finished place. You DID make sure it has a dedicated computer/office room for you?
  2. Here you go @tps3443. System gaming stats to shoot for! 4090: 3150MHz. 14900KS: 64x/49x (SP111, P126/E81). DDR5-8800 CL38.
  3. This is what? 10 years now since the NBR forums? I realized some years ago that you buy Intel/Nvidia hardware day one (or close to it) and give it a really competent round of early testing that tends to stick and align with my approach to hardware, so I always trust it, and it has yet to be wrong.... .....why fix what isn't broke? 🤣 So yeah.... I'll be on the lookout for your results. 😜
  4. Ditto..... If I weren't into gaming and were new to computers, the 9950X would make sense. I would pick it over 14th gen. But right now, the smart move is to wait for 15th gen and the X3D variants.
  5. I'll wait for day 1 @Talon results to see where I'm going.... 🤣 BTW, saw the thread for the SP103 over yonder....yikes that's going to be a poo poo storm and then some. JayZ on how to check for degraded chips.....
  6. Yeah, I tested it on my 14900KS setup and it compiled just fine with my original profiles. The Wukong benchmark just hit differently, as noted by many here and elsewhere. I also think there is a wide crossover between software issues and hardware issues, and the easy out is to blame Intel atm, as we see below: The fact it reaches 100% shows it most likely isn't a shader-compiling issue; the problem lies elsewhere. Sometimes people confuse buggy software with hardware issues. This is true for both Intel and AMD platforms.
  7. I have hopes for Arrow Lake. AMD 9000 really isn't exciting. There's slim hope for the 9950X3D/9800X3D to provide some type of meaningful uplift. AMD GPUs are apparently regressing next gen overall, but are supposed to have stronger RT? Help me, 5090... you're our only hope. 🤣 Right now, I'm financing my daughter's college education and she's in year 2, so any excuse to pump the brakes is OK. The bill from the Bursar's office just hit and it was spicy! 🤑
  8. Pity, the 5600MHz V/F is actually decent. If it won't run anything even on your killer setup (DD, and only stable at those settings and that load voltage), that chip screams defective. I don't think I've ever read of a chip degraded to that level. Either way, it needs to be relidded professionally and sent in for an exchange from Intel.
  9. There have been plenty of out-of-the-box new 14th gen CPUs failing on auto settings because they were garbage, but this one seems like uber trash. It could be a dud, abused/degraded, or a bit of both. That is stupidly high load voltage for 48c at 5.6. How does it work on pure auto? Try incrementing the + offset to see if you can find a level of stability, since it is DD/water chilled, which takes heat out of the equation.
  10. Hmmmm, I think I remember someone basically saying the same thing. Lemme see if I can find it...ah yes....here it is.... 🤣
  11. Considering you said your SP99 vs SP108 was fairly close with chiller love (temps rule everything), I am sure you will get amazing results from it too. I'm always curious about the IMC.
  12. Apparently the new Final Fantasy XVI demo released on Steam on the 19th is also giving some users fits with its shader compiles. I just ran it on my main desktop (7950X3D/4090) with no problems at all. I'll try it on my 14900KS/7900XTX setup later to see how it stacks up against the Wukong fun times. https://store.steampowered.com/app/2738000/FINAL_FANTASY_XVI_DEMO/
  13. Yep, I called the original 4070 the sweet spot, and the 4070 Super is the ultra sweet spot. It is a no-brainer for someone with a ~$600 GPU budget (or what we previously called flagship pricing years ago...). I keep coming back to it for my SFF build on my charts with the 14900KS......

It depends on the title and consumption requirements. Look at Starfield.... you can see that as it gets hungrier they start to separate, but at mid to lower tier power requirements the bigger chips all kinda end up in the same area. A 7800X3D would wipe the floor with the 14900K for consumption and beat it overall in gaming (it gets trashed everywhere else). Those results really don't show Intel in the best light (look at the memory) and it is still winning. I am expecting the Ultra 200 series to hopefully bring some heat. The laptop variants are...... ok. The top-of-the-line Ultra 9 185 is basically a slightly better 13600K.

The 9800X3D will just step right back in and take the top spot over the 7800X3D, but not by as massive a lead as YT channels keep saying because..... ......of this. A properly tuned 13th/14th gen can hold its own with the X3D chips, but it takes a level of refinement that normal users aren't going to do, nor should they have to. Intel is the tinkerer's chip, as it yields the most from getting under the hood. AMD has a lot of "set and forget" properties, which have their merits. Most users are sitting on air coolers or rando AIOs, using the factory thermal paste on the AIO, booting up and going from there with factory defaults. I respect YT channels like Testing Games that present a more realistic "end user" experience, while acknowledging we will never run anything even close to that. In an out-of-the-box comparison, Intel suffers the worst, as it has much more tune-ability. Look at the memory settings: the 9950X is running at its 6000 sweet spot while Intel is chugging along at 6000 too.

If you want a more realistic AMD vs Intel vs Nvidia channel, I watch Bang4BuckPC Gamer. He tunes his hardware on a custom loop and routinely does comparisons across all his hardware (X3D, 14900K, 7900XTX, 4090, etc...). Intel can push the new microcode, but it still comes down to MB makers responsibly enforcing it while also allowing advanced users to disable it if they see fit. I am sure we will see a few more BIOS updates fleshing it out. As for the SA bug, when all is said and done, I am still setting my SA back to 1.18 for 8200 because it works, and there's no need to feed more voltage than required. Of course, for benching and testing, I let it run free. 🙂
  14. With the way I tier my profiles based on games' power requirements to boost my clocks and scale, it quickly shows which games are sucking down the most power and require higher-tier/lower-clock settings on my modest AIO setup. This was the first game where all my previous auto/lower-power tiers crashed, and that was even before it could get to any type of shader loading. Shaders just kicked it up a notch. I had no problems running it on my 4090/7900XTX and 7950X3D/14900KS (post miser power acknowledgement lol).

Oh, and happy birthday @Papusan! Hope it was a good one, and here's to many more, brother!

True, but unfortunately Joe User isn't going to know what it is or how to handle it, and will either refund the game or blame hardware that might be unstable on many fronts. Me personally? I like tinkering and figuring out wth is going on. 🙂 297W on your setup is serious for a game, yikes. This might be a good wake-up call for Intel, and to an extent AMD, going forward: maybe don't push their chips so hard trying to one-up each other, and get back to static, realistic settings.
  15. I do not have that one. Is it free, or is there a demo? With adjustments (using my Y-cruncher/OCCT profile), Wukong works perfectly now. I just didn't know I was going to have to roll out the bigger-guns profile to do it, but it's understandable. Even though it pulled only 226W max at 5.6 compiling shaders and running the demo, it still needs 1.31V set (down from 1.32V) and 1.184V under load to run. Hogwarts and Fortnite UE5 worked just fine, but Wukong must have some special sauce mixed in. 🤣
  16. Ugh, MSI is supposed to be right up there with Asus in regards to BIOS controls. One thing I can say about Asus is they really do give you the kitchen sink and then some. EVGA was up there too. I always considered MSI next, but watching BZ flip around in Gigabyte's BIOS, it seems pretty competent these days too. ASRock is missing a lot of fine/granular controls, and I find their FIVR controls a bit lacking. So far, it is Asus and Gigabyte with the ultimate cut-off valve for VROUT max settings.

When it comes to games, I have always run a mix of configs allowing me to scale to a game's demands and reap the rewards in clock settings. It lets me really wring out every last bit of per-game performance from my chips without trying to implode them. 🙂 What I usually do when I find a game or stressor I want/need to run is adjust around it and create another profile tier. For example, in WoW, Starfield and FO76 I can even get away with tuned/dialed-in auto and run up to 6GHz all-core if I want, but 5.9 is the perfect sweet spot. I could probably even run FO76 at 6.1GHz all-core. CP2077 I can do at 5.8 all-core. This Wukong benchmark? I had to go back and use my "big boy" profile at 5.6, which means I'll have to adjust up to 5.7; 5.8 most likely won't run, but it might. 5.9 is definitely a pipe dream on my lidded AIO setup.

With that being said, I absolutely agree that real-world stability is best, as it assures everything will run, but the cost is performance left on the table when a game can run more with less. Did you ever find out what made your monitoring text turn color, and what it meant, when running this benchmark? I'm curious.
  17. Yeah, the power draw is one thing but having to actually use my Y-cruncher/OCCT profile to get it to run for a game benchmark is crazy. Using my other settings, it would just exit out before it even got to the shaders compiler screen. I'll have to go back in and figure out where 5.7 and 5.8 can get it to run too unless the pull/heat gets a little too much for my modest setup. Glad to know I wasn't seeing things as I tried three times to get my adjusted 66 to then show up on the final results screen. Seems like over on the OCN forums, they have all decided on max everything at 4k, Super Resolution setting of 75 and trying to see how many can break 80fps and by how much. I saw a couple 83fps scores. LOL.... I chuckled nicely at that one. 🙂
  18. It's 66. I don't know why it kept showing 65, but when I went back in and manually set it to 66, it would show 65. I actually did the run *3* times trying to get it to show the 66 I selected! 🙂 Thinking about blocking it? Max everything out and set Super Resolution to 66 to see how it stacks up to @tps3443 and my scores at the same settings.
  19. And here are some AMD 7900XTX runs on the 14900KS. I actually had to bump my Vcore up to 1.32 for 5.6/4.5/5.0 (1.2 under load) to get it to run properly, not just with shader compiles but in normal use too. This thing is hungry. I think there are going to be a lot of systems that will crash or won't run this at launch. This is the same profile I have to run even for small OCCT and Y-cruncher testing. I can pass CB23/CB15 at 1.29 on the new BIOS and 1.28 (1.152V load) on the old BIOS, but this thing required the same profile as the ultra-hungry stuff. Temps were good though! 🙂 This is why I like having multiple profiles: I can load up 5.9 all-core auto tuned for WoW and FO76, and 5.6 as above for any heavy hitting. As for AMD, er, well.... let's just say RT really isn't in the future for them with this game in any meaningful way. Using the same settings as my previous run on the 4090 (and same as @tps3443): 7900XTX stock + 14900KS fixed 56x/45x/50x RT ON: (Note: this run was the first run and includes shader-compile data in the HWiNFO log) 7900XTX stock + 14900KS fixed 56x/45x/50x RT OFF:
  20. Pure 100% stock CPU and stock GPU, mem at 6000 only but tuned. IF at stock 2000. So 6fps difference? That's it? Hmmm.... And yeah, the shader compiles are brutal. I'm fishing through various setups on my 14900KS on the AIO to find what works best....7950X3D just shrugged it off and said, "and what?" 🤣
  21. Agreed, but sometimes synergistic pairings between software and hardware do pay dividends. Apple comes to mind, love 'em or hate 'em. 🙂 The real problem was when Intel and AMD started to deviate from a monolithic-type design and began offering these hybrid designs, both in core differentials (P vs E) and core functionality (X3D vs non-X3D), so OSes could no longer just work with a generic scheduler. We're expecting M$ and AMD/Intel to work hand in hand so their CPUs know what to prioritize and how. It has gotten somewhat better since the Alder Lake and 7950X3D/7900X3D introductions, but it will always be a work in progress.....
  22. The main issue is upgrading a system from one CPU to another (i.e., a 7600X to a 7900X3D/7950X3D). I've been using and monitoring core usage for several months now with this 7950X3D, and I can say disabling CCD1 for an X3D-only box versus running full tilt is the same on a clean install with the newest BIOS and drivers on the X670E Carbon. Running Fallout 76, WoW, Starfield, Deus Ex, CP2077 and more, as soon as you're in game, CCD1 is put to sleep and all the load shifts to CCD0. I used Ryzen Master to monitor core usage. I also used Process Lasso for a while to see if there was a difference, and there was not; the effective outcome was the same. I'm not sure what magic was worked from December to now, but there is no more drifting like before. One thing I did do is stop using MSI's chipset drivers and went straight to the source and got them from AMD, so that may have helped too. It could have been one of the BIOS updates as well, since AGESA is constantly updated. There is also the prospect of Game Bar updates/refinements on M$'s end. I just did a clean install along with the newest BIOS update a few days ago, and everything is working right as rain, with proper core parking during gaming with the 7950X3D on the MSI X670E Carbon.

In summary:
Clean install with new CPU (newest drivers and BIOS update) = no problems.
Upgrading from a single-CCD CPU to a dual-CCD X3D model but want to keep your current Windows install? Uninstall the chipset drivers, clean out driver debris with Revo, then reinstall the chipset drivers.
Still got a problem? Toggle CPPC to Driver in the BIOS. You can usually toggle it back to auto afterward, and it keeps the proper settings.
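If you want to eyeball the parking yourself outside of Ryzen Master, here's a minimal sketch (my own, not from this thread) that logs per-CCD load while a game runs. It assumes Python with the third-party psutil package, and the stock 7950X3D layout where CCD0 covers logical CPUs 0-15 and CCD1 covers 16-31; thread numbering can vary by BIOS/SMT settings, so treat that split as an assumption:

```python
def ccd_load(percpu, cores_per_ccd=16):
    """Split per-logical-CPU busy percentages into per-CCD averages.

    Assumes CCD0 = logical CPUs 0..15 and CCD1 = 16..31
    (stock 7950X3D layout; adjust cores_per_ccd for other chips).
    """
    ccd0 = percpu[:cores_per_ccd]
    ccd1 = percpu[cores_per_ccd:2 * cores_per_ccd]
    return (sum(ccd0) / len(ccd0), sum(ccd1) / len(ccd1))

if __name__ == "__main__":
    import psutil  # third-party: pip install psutil

    # Launch the game first, then run this. If parking is working,
    # CCD1 should sit near 0% while CCD0 carries the game load.
    for _ in range(10):
        percpu = psutil.cpu_percent(interval=1.0, percpu=True)
        c0, c1 = ccd_load(percpu)
        print(f"CCD0: {c0:5.1f}%   CCD1: {c1:5.1f}%")
```

If CCD1 keeps showing sustained load in-game, that's the cue to try the chipset-driver reinstall or the CPPC toggle above.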
  23. Nice OC on the IF. Yeah, the sheer amount of power it takes for minimal returns isn't worth it outside of benchmarks. We'll have to see some pics of it in action when it's up to your standards. :)
  24. Welcome to the Desktop brigade Meaker! Time to update the sig. 🙂