NotebookTalk

electrosoft


Everything posted by electrosoft

  1. The beauty of ITX builds is they give you somewhat of the challenge of tuning and/or overclocking a laptop, but with real desktop components across the board. Looking at some of the ITX builds Optimum puts out really shows what you can cram into these small form factors: full-tilt 4090s and 13900k/7800X3D/7950X3D systems. A lot starts with binned parts to get the best performance, with CPU and GPU (in that order) leading the pack. The 14900k you're using is the R batch from OCN, I think. I was tempted to pick it up for Intel APO testing with WoW, but that price for its condition (delidded, lapped, dinged on the PCB, SA bug) made it not worth it to me. I would have topped out at $450 shipped for it, but luckily you snagged it so we can see it in action.
  2. That's not a bad chip at all. Outside of the SA bug (booo!) the actual performance and V/F curve looks rather nice! On your real hardware, this will break 45k easily.
  3. I don't think so for the consumer model, but I can certainly hope so. A 77% uplift is massive, but an uplift of 50% more CUDA cores on the same node is going to be a 600-700w+ monstrosity at stock. I could reasonably see ~25% more CUDA cores, with GDDR7 carrying a lot of the heavy lifting, for a 500-600w card and a ~30-50% uplift. $2k has been the rumored price for a while now, and you would be getting some major bang for the buck if that is true, but then again I don't want a space heater in my room like the 3090ti was either. If we get that monster uplift (again, I doubt it), that is an instant buy and sell for me. We'll revisit this in October/November if it actually launches this year.
  4. What I've garnered so far: the 5090 is the same node/process as the 4090, pumped up more, with the majority of gains coming from GDDR7, and it's gonna run hot and heavy. I am curious about the performance gains since the 4090 was such a bump over the 3090. Part of me expects smaller than expected gains as they cram in more voltage, and while the AI sector is looking at an MCM design, the consumer market looks to get a big ole juicy monolithic die again.

I plan on waiting till after it drops (or close to it). I do too much Fallout 76 and WoW to suffer for months without top-notch 4k performance. If the 7900xtx wasn't such a crapshow with WoW, I could see using it as a stop gap, but for WoW it is garbage with all the driver issues.

I've actually had a few people (3) reach out wanting to buy my 4090 because it auto boosts to 2820+ right out of the box, and with a basic OC on the stock AIO cooler does 3120+. Mem >= +1500. I imagine on a chiller and a real block with a real XOC bios it might turn into a monster if it does this on stock gear. The major win for me is basically zero coil whine, to the point that I not only run my case fully open but no longer have to use headphones and can play like I used to.

With all this said, if a model I want pops up at retail on launch, I will most likely pull the trigger, but if I miss it or have to wait till Jan/Feb (which is what happened with my 4090), I'm fine with that too. I'll probably lock into MSI, FE or Asus again (in that order).
  5. I ordered a few motherboards from Amazon Warehouse: an Asus B650 Prime and an Asus X670E Strix for $95 and $228 for my daughter's retrofit to a full Team Red build. I figured if I can't get the Asus B650 Prime working properly (they have a high rate of returns/problems), I can always go with the Strix. I also picked up a 7600x on eBay for $165.

So far, after a few headaches (I can see why they have a high return rate), the Asus B650 is holding it down. I can't see how Asus charges $200 for this board though: no debug lights at all and minimal heatsinks. I must admit I do like the overall look for some reason. The early BIOS revisions were to fix a crapton of memory incompatibilities with EXPO (still an issue), but since I manually tune all my memory, that doesn't affect me. The main reported issue was overheating VRMs. I swapped in my 7950X3D as a test and let it run CB23 for 20 minutes, and they do get toasty, but the 7600x doesn't even break 50 running the same test. In summary: if you're buying a 7950X/X3D, get a board worthy of it.

As for this 7600x, it is the first AM5 chip I've encountered that can do 2200 fclk no problem. With the newest AGESA updates, running 6000 tuned older M-die sticks, fclk 2000 vs fclk 2200 = a bandwidth increase from 63k to 69k, which makes sense considering the fclk increases by 10%. uclk holds 1:1 to 6400 but gives up the ghost at 6600, so my pipe dream of perfect ratios at fclk 2200 was quickly dashed. I had to swap in my 2x32GB G.Skill A-die sticks for that test since these first-gen M-dies (I picked them up in 2022) give up the ghost at 6600 (uclk = 1/2).

Using the already dialed-in timings that had been tested on my 13900KS (MSI Z790i Edge), 12900k (Z690 Unify) and 7800X3D/7950X3D (X670E Carbon, Asrock B650 HDV, Asus B650 Prime), I kept getting flickering that intensified during memory tests. Through trial and error, I found tRAS was a bit too aggressive for the iGPU, so I backed off from 30 to 35 and all was well.

I'll run some more tests this week in the test case, including swapping in the 7900XTX for final testing before I break down her system for the retrofit.
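For anyone wondering whether that 63k → 69k jump actually lines up with the fclk bump, a quick sanity check (numbers taken from the post above; a simple linear-scaling assumption, nothing more):

```python
# Sanity check: does the measured bandwidth jump track the fclk increase
# linearly? Figures are the ones quoted in the post.
base_fclk, new_fclk = 2000, 2200   # fclk in MHz
bw_at_2000 = 63_000                # bandwidth reported at fclk 2000
bw_at_2200 = 69_000                # bandwidth reported at fclk 2200

# Linear estimate: bandwidth should scale with the fclk ratio (here, +10%)
expected = bw_at_2000 * new_fclk / base_fclk
print(expected)                              # 69300.0
print(abs(bw_at_2200 - expected) / expected) # under 1% off the estimate
```

So the measured 69k sits within about half a percent of the naive 10% scaling estimate, which backs up the "makes sense" remark.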
  6. Some of the mainstream YTers try to say faster memory is really a waste with no real gains, and then you get B4B, who actually does some tuning, comparing 6000 vs 8600 for gaming. This is right along the lines of the gains I saw going from 6000 to 8000 tuned on my 13900KS + MSI Z790i Edge. The gains are real and substantial, especially in WoW.
  7. Hopefully one of those 14900ks's is at least decent. Hope you feel better soon @Papusan Well, buying an expensive binned chip definitely takes out the lottery lows, that's for sure. Paying more but knowing *exactly* what you're getting has its pluses. I just flat out opted to skip 14th gen after selling my 13900KS SP115 and stick with AMD this go around. Let me know down the road when you want to jettison it out the escape hatch. 🙂
  8. This and more of this and then a sprinkle of this on top..... I've never understood that "grind it to the edge for hours or days" mentality. I usually have a benching profile and a D2D profile. In no way, shape or form am I going to run my benching profile(s) under those types of conditions for hours on end. That is just asinine. The worst thing you can do is listen to others trying to tell you how to qualify your chip based on their criteria instead of your own. Don't think my chip meets your OC standards? Ah well, go cry in a corner. 🙂
  9. In the end, frames rule the game. Remember, I had a 3060 12GB before, and all the VRAM in the world will not save you from a GPU that just isn't up to the task. Pumping up details to take advantage of the 12GB = steadily decreasing frame rates, and it tanked at 1080p high in Hogwarts, WoW and Fallout. The 3070 8GB ran circles around it. Yes, you have a 50% larger VRAM buffer, but now you need more GPU power to handle the higher level of detail/settings. Chicken or the egg? 🙂

On the other hand, you can dial down the details to the point you can even limp by at 4k on a 3060, which I actually did while waiting for my "next card" after selling my 3090 FE and before I got my KPE 3090. That one was actually an 8GB 3060 and I ran it at 4k low details for over a month. Such low settings = no VRAM issues and less stress on the GPU.

Ideally, you would want a 4060ti 16GB and get the best of both worlds: an even larger VRAM buffer AND FG that will absolutely shine at 1080p. But Nvidia is stupid with their pricing and you're looking at a $150 difference, even though it makes the most sense for 1080p gaming. That said, I do see the value proposition of the 3060 12GB. It's a good starter card and overall pretty solid at 1080p. Then again, my nephew is loving his A380 6GB GPU, so perspective I guess. 🙂
  10. If I was using GFN, I would use ethernet for optimal latency to the router/CM. The 3060 is still solid for 1080p gaming at realistic settings, and the 3060 laptop and desktop are much closer than in previous generations. The only downfall I can think of for the 3000 vs 4000 series at the xx60 tier is the lack of FG, which can really help in pockets.
  11. I keep telling myself I'll stick with my MSI 4090, but I know I'll most likely snag the first MSRP 5090 that is one of my favorite/chosen models. I'm gonna fight REAL hard though! I'll probably hold off on the 8950x and wait for the X3D, unless its gaming performance is that much better than the 7950X3D's or its value skyrockets. Either way.... all I see in our future is this..... 🤣
  12. I still have my Clevo NH55, the last of their quasi-desktop-CPU models for Socket 1700. I also have a Dell G5 15.6" i5-13650HX and RTX 4050 laptop, and I just cracked open an Asus Vivobook Pro 15 OLED with an Ultra 7 155H and 3050 graphics. I sold or gave away all my previous laptops, so I'm down to these three in my arsenal.

I tried an Asus G18 twice last year and returned both because of Armoury Crate, silicon quality, the inability to really do anything with it, and screen quality. The refreshed models now have mini-LED options, but I'm still not seeing anything on the horizon calling me. I have an ultra-thin 18" portable high-refresh display that runs right off of USB-C/TB, both signal and power, that I can take with me to game if I want a big-screen experience.

The change for me was definitely the death of fully configurable laptops with interchangeable GPUs and CPUs along with memory, storage and wifi. The death of beefy, real heatsinks was another "writing on the wall" situation as everything became about thin and light. My enthusiasm diminished quickly. The last real fun projects I did were the binning and tuning of the X170SM-G w/ Prema. I did pick up a random X170KM-G, but it was one of those rare (and I do mean rare) instances where it came with a crazy golden 11900k and ran full tilt right out of the box. I didn't have to do anything. The 3070 in it also ran ice cold. There was literally nothing left to tune; the BIOS the professional company used ran rock solid and everything "just worked" right out of the box. I was both elated and saddened at the same time. 🤣

Now, I have no problem with thin and light laptops, as I've used them alongside the big boys, but I want options too.
  13. That's killer! Much better than the rig I built for my nephew.
  14. Yeah, I'm over Intel atm, but I am looking forward to 15th gen. I just have to tamp down this urge to try APO with WoW; I'm waiting for some results somewhere (anywhere?) from a WoW player. Looking at my inventory, I have 2x EVGA PSUs (850w P2, 1600w P2), 2x AIOs (360 CLC, 280 CLC) and 2x of their keyboards, which the wife and I still love. I've tested numerous keyboards since and I keep going back to them. A few of their mice too, but they are in storage as they are ok but not the best for gaming. I switched to a Razer Pro and it is by far the best I've used so far. But all the GPUs and motherboards of yesteryear? Gone but certainly not forgotten. 😞 MSI is my tier 1 now, followed by Asus and maybe Asrock again, considering the last three boards I used from them (1x Intel, 2x AMD) have all been rock solid and stellar.
  15. lol, he's not wrong. 13900k, 13900ks, 14900k, 14900ks has just been a protracted two-generation run (with four releases) of basically the same CPU, just binned differently. Plenty of hyping his own products too, including his "custom" memory modules, like they're magical. With such variances, Silicon Lottery could have made a major killing with these.

I always liked the look of Ampere Strix cards. Whenever I'm working on the wife's system, I always admire the 3080 Strix in there and remember fondly paying $1100 for it during the height of the GPU shortages... good times. 🙂 It is really painful to see what was once the standard bearer for quality and professionalism slowly circle the drain. They've become really a PSU and peripherals company, nerfed all their warranties, and abandoned everything that made them great, including GPUs, MBs and even cases and outlier peripherals.
  16. Since I permanently have a 7900xtx in the house now "for my daughter" (wink wink) I've been having fun with it. I think I'm going to swap out her B660M/12400/DDR4 for a B650M/7600X/DDR5 setup to really make her setup House AMD.
  17. If you don't care about RT or DLSS (as indicated above), you will absolutely get a nice performance bump, from decent to near-4090 levels depending on the program. Such a solid first build! He's going to love it.
  18. Yeah, I've got my eye on Asrock. They left a sour taste in my mouth with the Z390 Taichi, but they have slowly redeemed themselves through a couple of MBs and GPUs over the years since. The white Asrock Taichi 7900xtx is down to $999.99. If I hadn't snagged this Hellhound 7900xtx for $720 total to my door, I prolly would have picked up an open box for $860+tax = ~$914 shipped. Yep. When I was first testing 8000 on my 13900KS, it bombed spectacularly, only scaling at JEDEC 4800 loose timings. XMP eventually bombed out too, but once I tightened the timings up they had no problem, and that's how I ran them.
  19. Asus now has SP ratings for AMD AM5 chips on their upper-tier boards and will give an SP rating per CCD plus a combined score, which is kind of cool, along with V/F curves. For AMD: on a 7800X3D, or the X3D CCD of a 7950X3D, anything over SP100 is above average and SP103+ is golden. For the non-X3D CCD, anything above ~117 is above average and over 120 is golden. Combined, anything greater than 112 is golden. Example:
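If you want to bucket your own chip against those cutoffs, here's a tiny helper. Purely illustrative: the function name and tier labels are mine, and the thresholds are just the rough ones quoted above, not anything official from Asus.

```python
def rate_ccd(sp: int, x3d: bool) -> str:
    """Bucket an AM5 CCD SP score using the rough cutoffs quoted above."""
    if x3d:  # 7800X3D, or the X3D CCD of a 7950X3D
        if sp >= 103:
            return "golden"
        if sp > 100:
            return "above average"
    else:    # non-X3D CCD on a dual-CCD part
        if sp > 120:
            return "golden"
        if sp > 117:
            return "above average"
    return "average or below"

print(rate_ccd(104, x3d=True))    # golden
print(rate_ccd(118, x3d=False))   # above average
```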
  20. It could have been worse 😞 Make sure to test it, especially the IMC, if you're looking to push 8400+. I can definitely see this scenario playing out. The MSI Titan 18 HX is completely unlocked too and has a BIOS basically identical to their desktop boards, allowing access to CPU and memory adjustments. The Asus G18 allows an undervolt..... and that's it.
  21. ARC continues to improve in leaps and bounds (of course, the argument being it needed massive amounts of improvement) on the software/driver side, and this will lay the groundwork for Battlemage. I'm hoping they can at least compete with the upper-middle tier (4070/4080 level), if not more. *WHEW* glad you were able to get an RMA on that stinker. 14900k/ks would be a buy-pre-binned-only affair for me, but I am skipping it and sticking with AMD this go around to see how 15th gen and 8000 X3D shake out. I don't see any 14th gen chip being much better than the SP115 13900KS I had, considering I use AIOs and it was on par overall with my 7950X3D (but just got destroyed in Fallout 76). I'm also skipping all laptops (outside of what companies send me to evaluate and review) and just sticking with my NH55. Maybe down the road (along with @jaybee83) I could see swapping in a killer sample 14900 chip like @tps3443 has and going from there because of the sweet lows.
  22. Not at the moment unfortunately. I sold my last 3070 and that 6700xt a while back. A380 went to my nephew, 7900xtx to my daughter and 4060 to my friend's daughter.
  23. I topped out at 6800 on the Unify with my 12900k but that might have been the m-die sticks I was using at the time.
  24. 100% agree. It isn't always what ya got but how ya use it. Delid it, throw it on a chiller DD, and suddenly the variance between high and low SP narrows greatly under real-world conditions while pushing them quite nicely. I do think many want to see good samples on AIOs because that is what they run, so they can personally quantify and qualify the chip against their own cooling and sample (personal system variances notwithstanding).

I LOVE seeing chiller and LN2 runs and seeing these chips pushed, while also accepting that I run AIOs, so I can't directly compare the results with my own chosen cooling. It would be nice from time to time to see some of them tested on AIOs first, then moved to DD/chillers, to document the difference between the AIO and DD/chiller on that particular chip and allow a somewhat direct comparison. Then again, you can't demand someone take the time to set up and test a chip on an AIO before moving to their own preferred cooling method.

I do not see the argument for disparaging @tps3443's setup (or yours or @Mr. Fox's) just because one is not running it, though. That doesn't make any sense. Oh, and I definitely remember you taking average chips and pushing them over on the old forums. Those were some of my favorite posts.
  25. Ditto. We're early into the 14900KS Xmas-day unboxing joy, but as a month or so passes and samples continue to trickle in, we're going to see some prime lovelies. @tps3443's R batch is about as perfectly balanced and good as they get, especially on his chiller, even capped. It's a great chip both high (desktop) and low (laptop).