NotebookTalk

electrosoft


Everything posted by electrosoft

  1. Isn't it awesome? If anyone is looking for 7950Xs, they're in stock at both Best Buy and Amazon. I'm going to sit back, let AMD, Intel and Nvidia go to war, and casually walk through the darkened battlefield after the smoke clears to collect my spoils in Q1/Q2.
  2. EVGA also redesigned the Classified Z790 too. Looks like they consolidated form factors and the build approach between the two to cut costs. https://videocardz.com/press-release/evga-announces-z790-dark-kngpn-and-z790-classified-motherboards
  3. Seeing as the 7950x is basically just matching the 12900k in ST, you know the 13900k is going to smash it in ST, but MT is where things are going to get real murky, I agree. And I am definitely waiting till next year for any platform upgrades, when the 13900KS / 7X00X3D variants drop. Before, I didn't put much stock into motherboard/platform longevity because MB prices weren't outlandish overall, but with the way modern motherboard pricing has gone, AMD has a point when you can keep the same platform/socket/motherboard and literally drop in CPU upgrades as needed. I know those former 1700x owners who dropped in a 5800X3D are giggling with glee. That Z790 KP MB warms the heart....
  4. This is exactly how it is going to turn out. AMD? Intel? We all win since they are competing HARD against each other. Same with AMD vs Nvidia. I'm loving it. 🙂
  5. Very nice! There was no way Intel was going to let AMD have all that singular purchasing glory! Smart move on Intel's part if someone was on the fence. 🙂
  6. That's awesome bro, congrats! Looking forward to the build out, pics and results!
  7. That's a good idea to have a sub forum for build threads. They were (and are) one of my favorite parts of ole NBR and other forums where people document their whole process building out or doing outlier projects (like @srs2236 and his 3080 mod for his laptop).
  8. I'm sure EVGA has had the bean counters pore over their sales with a fine-tooth comb, and a smaller, leaner EVGA with higher profit margins, no debt and cash on hand can decide what they want to do with their name brand. Clearly the Nvidia partnership is over and wasn't worth the hassle anymore. I do hope they jump back into the fray and hold onto Kingpin for future work. It now comes down to whether GPU buyers are loyal enough to EVGA to abandon Nvidia and jump ship to AMD, or whether they'll just switch to another AIB. If I stayed with Nvidia, I'd go with Asus or MSI for their transferable warranties, but I'm pretty irked at Nvidia atm. I am sure other AIBs are quietly sulking and watching how this plays out.
  9. Those 5800X3D numbers are confirmed. It is just such a strong gamer CPU. With all that cache you can literally grab a dirt cheap X570 or B550 motherboard, 2 sticks of 3200 whatever and have a blazing system performing pretty much as good as it is going to get. Looking at the gains from a 5800X -> 5800X3D, it just makes you salivate thinking what a 7600X3D (or higher tier) AMD chip will bring. So many refinements and improvements (as noted by @Ashtrix) overall to boot even before we get to that point next year.
  10. lol, friends zing each other. 🙂 I am really liking what I see with the 7950X, but for gamers? The 5800X3D continues to show value. If AMD drops the price, a cheap 5800X3D buildout would still have no problem holding its own for gamers. The 7950x will just continue to show the hybrid Intel design for the nonsense it is overall. I seriously would be content with a 10-12 core real P-core CPU from Intel and they can keep the e-core nonsense. If they're able to move 3D stacking into mobile architecture, that's game over since laptop GPUs are capped anyhow...even more so for the 4000 series thermal-wise, though the architectural improvements will still be there. I prefer gaming on laptops at 1080p anyhow. I don't think anyone on this forum really believed Nvidia was going to bring valid 2-4x performance gains. Same for AMD being able to deliver all that CPU performance uplift in a 170w TDP. Marketing fluff, as always, on both sides.
  11. Starting to watch the review videos. GN and der8auer are the best of the bunch. I love that Roman is always all about the delids and outlier testing.... Monstrous list of reviews: https://videocardz.com/138525/amd-ryzen-7000-zen4-raphael-desktop-cpu-review-roundup

      My take:

      The 5800X3D is a gaming monster. As prices start to fall on older chips, if I were primarily a gamer with no need for massive amounts of productivity threads, I would build out a much cheaper 5800X3D platform even in the here and now. You can build out a 5800X3D with memory and motherboard for less than the 7950x alone. Gaming at 1440p and above, it really doesn't matter which CPU you pick from the top half of the charts since the 3090ti is the bottleneck. Let's see whether the 4000 and 7000 series GPUs can soak up some of this unbridled CPU power, but for those of us gaming at 4k I'm sure the GPU will still be the bottleneck. 1080p gaming, where the CPU is the bottleneck, was a mixed bag, but seeing that 5800X3D still near the top of the list or outright winning shows the power of that design. Did I mention the 5800X3D is a gaming beast? The 7000 X3D variants should be absolute juggernauts.

      95c is the new normal: AMD is taking a GPU-style approach and letting the chip boost as much as it wants depending on temps. Cooling REALLY matters now, and the design further minimizes traditional overclocking since the CPU already boosts as hard as thermal conditions allow. This basically gives every buyer the bulk of the performance the chip has to offer right out of the box, versus leaving so much on the table like CPUs of yore where only overclockers could extract monstrous gains. (Power to the people?)

      Welcome to the power hungry af club, AMD...it pulls more than a 12900k at stock now (251w). AMD hit that efficiency wall and is finally letting its CPUs suck down as much power as needed until they hit the thermal wall. AMD 2021: "Efficiency is king." AMD 2022: "Efficiency? What is that?"

      der8auer's delid of the 7900x was pretty righteous. The temp drops were insane. This thing is going to sing with a proper delid, but I'm curious to see how it fares with a better/thinner IHS or a good lapping too. I see Asus has SP ratings on their 7000 series motherboards; looking forward to seeing how that pans out.

      Intel retains the single thread lead....kinda. Based on the conflicting CB23 vs GB5 results, we will need a suite of locked-clock runs for IPC evaluation to see what's really going on. Multicore is an absolute slaughter: the 7950x is just a beast....an absolute beast. Hopefully Intel's refinements, cache, clocks and more sad_cores will level the multi-core playing field, or at least stanch the bloodletting.

      I remember when Athlon dethroned Intel and I switched to the FX-60. I remember when the 5000 series dethroned 10th gen and I switched to the 5800x. I have no qualms switching back to AMD again if 13th gen is a sad response, but we will see. If 13th gen isn't a worthy answer, I'll skip it and build out an AMD system. I would LOVE for AMD to come out of the gate with their GPUs too and wreck Nvidia. An all-AMD build (last time I had that was back with the FX-60 and an ATI card) would be sweet.
  12. When someone says they're not a fanboi, that means they're a fanboi..... (zing!) 🤣 Who said I didn't like it? I'm loving it, but I can always want and expect more of a response overall. I was hoping to see them do more than basically match Alder Lake at best. The 12900ks still rules the roost for single core at stock vs the 7950x. If you're talking gains over the 5000 series, then definitely, it is a worthy successor, but I had already said Intel would retain the single core lead and that's exactly what they are going to do.

      I wanted to see more from the full fat 16 core vs the 8 + 16 hybrid mess too. Architecturally I prefer a proper full fat 16 core design over the 8 + 16 "we can't control the thermonuclear" approach, but the fact that Intel can tack on 8 more cores, add more cache, refine the process and still compete against AMD's brand new architectural answer to Alder Lake leaves me a bit meh. On the other hand, knowing X3D is in the pipeline stacked on top of these nice improvements, those should be monsters for cache sensitive uses (games, basically) after seeing what the 5800X3D brought to the table. Looking forward to the real reviews, but at least it is competitive. If AMD brings the high heat with their GPUs, a full AMD buildout would be a viable alternative IMHO.

      I won't be buying any new "new" hardware till everything is released, the dust settles and we get an idea of what seems the most fun / makes the most sense. I still have my master plan to get an NH55 and plop this 12900k in it. What I end up with after that? We will see. Since you're going AMD, it will be nice to see some other numbers popping up in this thread. I liked it on the old forums when there were at least 2-3 of us with modern AMD hardware (myself, @Mr. Fox and @Rage Set) to toss in benchmarks and data for comparison.
  13. This, across all spectrums.... AMD's greatest contribution is forcing Intel and Nvidia to innovate instead of holding back products and punishing the market with trickled releases at cyclically inflated costs... Here's hoping their GPUs bring the high heat this time. Their leaked CPU scores are looking decidedly meh...single core is lacking and multi-core is less than I expected:
  14. Yeah, I very much enjoyed Jufes' last video and it has been my conclusion all along too. If you can get a decent IMC, decent memory and a decent motherboard that can handle 4000+, you're pretty much good to go. Get your latency to ~46ns or lower and bandwidth to ~65k or higher (all in Gear 1) and you'll have to go to some serious extremes to extract better gaming performance out of either DDR4 or DDR5. That has been par for the course for every new memory generation compared to refined previous iterations (IE DDR vs DDR2 vs DDR3 vs DDR4). Eventually DDR5 will reign supreme across the board in all aspects; it will just take some time.

      I do have an EVGA Z690 Classified sitting on the shelf I ordered a few weeks back, as they now toss in another free keyboard and their wireless mouse, and the associates code still works, so $285 for the whole bundle. I was planning to wait for 13th gen to pick up a 13900k and mix and match between my Strix Z690, which does even 4200 G1 DDR4, the Classified Z690 and some Hynix DDR5 sticks I picked up over on the overclock forums, but I'm also keeping an eye on the Ryzen launch to see what AMD brings to the table too. I still have such a sour taste in my mouth over those USB issues and AMD neutering AGESA for overclocking with subsequent releases, to the point I abandoned their neutered PBO and clock offsets and just went traditional for consistent clocks across the board, so we will see.
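The rough tuning targets above can be sketched as a quick pass/fail check. This is a minimal sketch assuming AIDA64-style latency/bandwidth numbers; the function name and the thresholds-as-code are just my framing of the post's rules of thumb, not anything official:

```python
# Quick check against the rough "good enough" gaming targets from the post:
# latency ~46 ns or lower and read bandwidth ~65,000 MB/s or higher,
# both measured in Gear 1. Hypothetical helper for illustration only.

def meets_gaming_targets(latency_ns: float, read_mbps: float,
                         gear: int = 1) -> bool:
    """True if a memory tune hits the post's rough gaming targets."""
    if gear != 1:
        return False  # the thresholds above assume Gear 1
    return latency_ns <= 46.0 and read_mbps >= 65_000

# Example: a tuned 4000+ MT/s B-die kit vs. a loose XMP profile
print(meets_gaming_targets(44.8, 66_500))  # True
print(meets_gaming_targets(52.0, 60_000))  # False
```

Past those two numbers, the post's point is that further tuning is diminishing returns for games on either DDR4 or DDR5.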
  15. I really enjoyed following this project and the final write up was a good read! Awesome work @srs2236!
  16. So what are the top 5 persistent issues/problems with the X170KM-G?
  17. Jufes continuing to dump all over DDR5 pushing A-die to 7600 vs tuned B-die 4100 for gaming.... "If you're a DDR5 fanboy or you bought the wrong product, I'm sorry." "If you took advice from an extreme overclocker that don't play games or mainstream tech tubers that don't know how to tune machines, I'm sorry again." Savage lol....
  18. The only fair and valid comparison is another 3090Ti under the exact same conditions. Any other comparison is invalid but still fun to see. 😀
  19. Of course we're comparing a massive outlier setup (KPE 3090, custom water, Mo-Ra, chiller) vs a 4090 FE with zero overclocks, but I do agree some of the numbers are sus, especially the 4090 numbers, unless....*gasp* Nvidia was fluffing us hardcore with the 4090 presentation. RT is definitely easy to distinguish from normal rendering even in basic forms. When I turned off RT in WoW, I noticed immediately, and they only use it for shadows and certain lighting situations. I think it is the future, but it is a long way from being the dominant rendering preference vs rasterization. The industry feels a bit lost now with EVGA bowing out in regards to enthusiasts, pushing hardware and making superior designs. There's still Galax, so we can see what they offer as they were always the direct competitor to EVGA for XOC GPUs, but the industry continues to push PC computing as a commodity. I don't like it, but I understand that the target market has always been 99% everyday users that just want to buy something, plug it in and use it as is. 😞
  20. LOL 3090 FE no longer viable....broken irrevocably! (thick boii!)
  21. With the uptick in RT performance, I expected more than 62%, since 3rd gen tensor cores are supposed to be much better than 2nd gen. 50-60% is what I expect in raw rasterization. DLSS, as always, is a hard pass for me. Maybe the FE stock cooler is a monster if it can handle up to 600w, or it's a reporting anomaly and the software needs updating. The Suprim is supposed to have a pretty decent cooler, so I'm surprised to see 75c.
  22. Well, I guess the "glass half full" look at this (barf) is he could have priced the 4090 at 1999, making performance:cost proportionate with the 4080 16GB and the 4080 12GB...... The 4090, compared to the 3090, offers substantially more performance for only $100 more USD, but it is downright pricey considering the original price of flagship GPUs just 3 generations ago (Pascal). Looking at the great Ampere glut, the pricing makes sense and either forces a buyer into a monster purchase of a 4000 card or a more realistic purchase of a 3070 for ~$450-500.

      The 4080 12GB is going to have a hard time beating a 3090ti in pure rasterization. RT performance is going to be good though. I wouldn't even contemplate a 4080 16GB even if I had a 3080+ class card, not at $1200. In that situation I'd just pony up the extra $400 for a 4090 and its ~2000 more cores. If Jensen has his way, I expect the 4090ti is going to be $1999.99.

      Once AMD became a non-factor, all bets were off. The 6000 series made some good inroads, but maybe Jensen knows something we don't about AMD specs/pricing to feel he can justify these prices. I'm still hoping AMD comes out with a winner and a similar performance leap as RDNA to RDNA2.
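To put the price:performance math in concrete terms, here's a quick back-of-the-envelope sketch. The MSRPs are the announced launch prices; the relative-performance figures (3090 = 1.00 baseline) are placeholder assumptions for illustration, not measured results:

```python
# Dollars per unit of relative raster performance at launch MSRPs.
# The performance column is assumed/illustrative, not benchmarked.

cards = {
    # name: (launch MSRP in USD, assumed relative performance)
    "RTX 3090":      (1499, 1.00),
    "RTX 4080 12GB": (899,  1.05),   # assumption: roughly 3090-class
    "RTX 4080 16GB": (1199, 1.25),   # assumption
    "RTX 4090":      (1599, 1.60),   # assumption: ~60% over a 3090
}

for name, (price, perf) in cards.items():
    dollars_per_unit = price / perf
    print(f"{name:13s} ${price:4d}  x{perf:.2f}  ${dollars_per_unit:6.0f}/perf")
```

Under those assumed numbers, the 4090 is actually the cheapest card per unit of performance in the lineup, which is the "glass half full" framing above.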
  23. As always, wait for the real numbers, but if you're going 4000 definitely 4090... Even with Nvidia's fluffery you know it is going to be at least a raw uptick of 50-60% over the 3090ti right out of the box for rasterization never mind the bump to RT.
  24. As always, it comes down to the AIB. I've personally tested the EVGA FTW 3090ti (air) against the KPE 3090 and KPE 3090ti Hybrids: a TS run on the FTW 3090ti = ~65c with fans at full blast; the same run on the 3090ti KPE AIO = ~51c, much quieter and clocking ~40mhz higher stock for stock (2075 vs 2115)...could just be the silicon there though. The FTW 3090ti runs were with my computer room at ~67f versus 70f for the KPE 3090ti, and the KPE's 360mm rad is mounted up top so it isn't getting fresh air, as I could only mount the AC LF II 420mm in the front. Gaming (WoW, but of course) in Ardenweald (roughest zone) after an hour = ~45.5c KPE 3090ti / 65c FTW 3090ti, though I did remove the side panel for the FTW 3090ti to stop it hotboxing my chassis.

      If done properly, a Hybrid AIO beats air cooling on average every time, plus a Hybrid has much more flexibility in fan/rad adjustments. I quickly tossed the new, snazzy EVGA aRGB fans and slapped on some Arctic P12s: much quieter and better performance. And as always, a real block trumps all if that is your cup of tea.

      LOL, as for the Neptune, it would definitely have to be a white/silver buildout. Just think, if you went with a white build, you could get white crocs to match! 🙂
  25. The Neptune has a better AIO cooler than standard Hybrids which tend to just cover the GPU primarily but obviously can't touch a real block. It falls right in between a classic Hybrid and a full block.