NotebookTalk

electrosoft

Member · Posts: 2,800 · Days Won: 129

Everything posted by electrosoft

  1. Those 5800X3D numbers are confirmed. It is just such a strong gamer CPU. With all that cache you can literally grab a dirt cheap X570 or B550 motherboard, 2 sticks of 3200 whatever and have a blazing system performing pretty much as good as it is going to get. Looking at the gains from a 5800X -> 5800X3D, it just makes you salivate thinking what a 7600X3D (or higher tier) AMD chip will bring. So many refinements and improvements (as noted by @Ashtrix) overall to boot even before we get to that point next year.
  2. lol, friends zing each other. 🙂 I am really liking what I see with the 7950X, but for gamers? That 5800X3D continues to show value. If AMD drops the price, a cheap 5800X3D buildout would still have no problems holding its own easily for gamers. The 7950X will just continue to show the hybrid Intel design for the nonsense it is overall. I seriously would be content with a 10-12 core real P-core CPU from Intel and they can keep the e-core nonsense. If they're able to move 3D stacking into mobile architecture, that's game over, as laptop GPUs are capped anyhow...it will be even more so for the 4000 series thermals-wise, but the architectural improvements will still be there. I prefer gaming on laptops at 1080p anyhow. I don't think anyone on this forum at least really believed Nvidia was going to bring valid 2-4x performance gains. Same for AMD being able to bring all that CPU performance uplift in a 170W TDP. Marketing fluff as always on both sides.
  3. Starting to watch the review videos. GN and der8auer are the best of the bunch. I love that Roman is always all about the delids and outlier testing.... Monstrous list of reviews: https://videocardz.com/138525/amd-ryzen-7000-zen4-raphael-desktop-cpu-review-roundup

My take:

The 5800X3D is a gaming monster. As prices start to fall on older chips, if I were primarily a gamer with no need for massive amounts of productivity threads, I would build out a much cheaper 5800X3D platform even in the here and now. You can build out a 5800X3D with memory and motherboard for less than the 7950X alone costs.

Gaming at 1440p and greater, it really doesn't matter which CPU you pick from the top half of everything, as the 3090ti is the bottleneck. Let's see what the 4000 and 7000 series bring to maybe cap some of this unbridled CPU power; for those of us gaming at 4k I'm sure the GPU will still be the bottleneck, but we will see. 1080p gaming, where the CPU is the bottleneck, was a mixed bag, but man, seeing that 5800X3D still in the top part of the list or outright winning shows the power of that design. Did I mention the 5800X3D is a gaming beast? The 7000 X3D variants should be absolute juggernauts.

95c is the new normal: a GPU-type approach that lets the chip boost as much as it wants depending on temps. Cooling REALLY matters now, and the AMD CPU design further minimizes traditional overclocking, since the CPU is designed to boost as much as possible based on thermal conditions. This basically gives every buyer the bulk of the performance the chip has to offer right out of the box, versus leaving so much performance on the table like CPUs of yore, where only overclockers could extract monstrous gains. (Power to the people?)

Welcome to the power-hungry-af club, AMD...it pulls more than a 12900k at stock now (251w). AMD hit that efficiency wall and is finally letting its CPUs suck down as much power as needed until they hit that thermal wall.

AMD 2021: "Efficiency is king." AMD 2022: "Efficiency? What is that?"

der8auer's delid was pretty righteous on that 7900x. The temp drops were insane. This thing is going to sing with a proper delid, but I'm curious to see how it fares with a better/thinner IHS or a good lapping too. I see Asus has SP ratings on their 7000 series motherboards. Looking forward to seeing how that pans out.

Intel retains the single-thread lead....kinda. Based on the CB23 vs GB5 results with locked clocks for IPC evaluation, we will need a suite of locked-clock results to see what's going on. Multicore is an absolute slaughter. The 7950x is just a beast....an absolute beast. Hopefully Intel's refinements, cache, clocks and more sad_cores will level the multi-core playing field or at least stanch the bloodletting.

I remember when Athlon dethroned Intel and I switched to the FX-60. I remember when the 5000 series dethroned 10th gen and I switched to the 5800x. I have no qualms switching back to AMD again if 13th gen is a sad response, but we will see. If 13th gen isn't a worthy response, I'll skip it and build out an AMD system. I would LOVE for AMD to come out of the gate with their GPUs too and wreck Nvidia. An all-AMD build (last time I had that was back with the FX-60 and an ATI card) would be sweet.
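The "boost until the thermal wall" behavior described above can be sketched as a simple control loop. This is a toy model with made-up numbers (the function, clock steps, and degrees-per-100MHz figure are all hypothetical), not AMD's actual Precision Boost algorithm — it just illustrates why better cooling now directly buys sustained clocks:

```python
# Toy sketch of temperature-driven boosting (hypothetical model, NOT
# AMD's real boost logic). The chip raises clocks in 100 MHz steps
# until the next step would push it past the 95c ceiling.

TJ_MAX = 95.0  # thermal ceiling in degrees C

def sustained_clock(base_mhz, max_boost_mhz, idle_temp_c, c_per_100mhz):
    """Highest clock (MHz) that stays at or below TJ_MAX.

    c_per_100mhz models how many degrees each extra 100 MHz costs;
    a better cooler means a smaller value, hence higher clocks.
    """
    clock = base_mhz
    while clock + 100 <= max_boost_mhz:
        temp = idle_temp_c + (clock + 100 - base_mhz) / 100 * c_per_100mhz
        if temp > TJ_MAX:
            break  # next step would exceed the thermal wall
        clock += 100
    return clock

# A stronger cooler (fewer degrees per 100 MHz) sustains higher clocks:
air = sustained_clock(4500, 5700, 35.0, 6.0)    # -> 5500
water = sustained_clock(4500, 5700, 35.0, 4.0)  # -> 5700 (hits max boost)
print(air, water)
```

Same silicon, same limits — only the cooling term changes, and the sustained clock moves with it, which is exactly why "cooling REALLY matters now."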
  4. When someone says "not a fanboi," that means they're a fanboi..... (zing!) 🤣

Who said I didn't like it? I'm loving it, but I can always want and expect more of a response overall. I was hoping to see them do more than basically match Alder Lake at best. The 12900KS still rules the roost for single core at stock vs the 7950X. If you're talking gains over the 5000 series, then definitely, it is a worthy successor, but I had already said Intel would retain the single core lead and that's exactly what they are going to do.

Full-fat 16 core vs the 8 + 16 hybrid mess: I wanted to see more. Architecturally I prefer a proper full-fat 16 core design over the 8 + 16 "we can't control the thermonuclear" approach, but the fact Intel can tack on 8 more cores, add more cache, refine the process and still continue to compete against AMD's brand-new architectural answer to Alder Lake leaves me a bit meh. On the other hand, knowing X3D is in the pipeline stacked on top of these nice improvements, those should be monsters for cache-sensitive uses (games, basically) after seeing what the 5800X3D brought to the table. Looking forward to the real reviews, but at least it is competitive. If AMD brings the high heat with their GPUs, a full AMD buildout would be a viable alternative IMHO.

I won't be buying any new "new" hardware till everything is released, the dust settles, and we get an idea of what seems the most fun / makes the most sense. I still have my master plan to get an NH55 and plop this 12900k in it. What I end up with after that? We will see.

Since you're going AMD, it will be nice to see some other numbers popping up in this thread. I liked it on the old forums when there were at least 2-3 of us with modern AMD hardware (myself, @Mr. Fox and @Rage Set) to toss in benchmarks and data for comparison.
  5. This, across all spectrums.... AMD's greatest contribution is forcing Intel and Nvidia to innovate and not hold back products and punish the market with trickled releases at cyclically inflated costs... Here's hoping their GPUs bring the high heat this time. Their leaked CPU scores are looking decidedly meh...single core is lacking and multi-core is less than I expected:
  6. Yeah, I very much enjoyed Jufes' last video and it has been my conclusion all along too. If you can get a decent IMC, decent memory and a decent motherboard that can handle 4000+, you're pretty much good to go. Get your latency to ~46ns or lower and bandwidth to ~65k or higher (all in G1), and DDR5 will have to go that extra mile to extract better gaming performance than your DDR4, and as seen it takes some serious extremes. But this has been par for the course for every new memory generation compared to refined previous iterations (i.e., DDR vs DDR2 vs DDR3 vs DDR4). Eventually DDR5 will reign supreme across the board in all aspects. It will just take some time.

I do have an EVGA Z690 Classified sitting on the shelf that I ordered a few weeks back, as they now toss in another free keyboard and their wireless mouse, and the associates code still works, so $285 for the whole bundle. I was planning to wait for 13th gen to pick up a 13900k to mix and match between my Strix Z690, which does even 4200 G1 DDR4, and the Classified Z690 with some Hynix DDR5 sticks I picked up over on the overclock forums to play with, but I'm also keeping an eye on the Ryzen launch to see what AMD brings to the table too. I have such a sour taste in my mouth over those USB issues and AMD neutering AGESA with subsequent releases for overclocking that I ended up abandoning their neutered PBO and clock offsets and just went traditional for consistent clocks across the board, so we will see.
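Those two tuning targets can be wrapped into a quick sanity check. A minimal sketch (the helper and its thresholds are just the ballpark figures from the post — AIDA64-style latency in ns and read bandwidth in MB/s — not official cutoffs):

```python
# Rough "good to go for gaming" check for a tuned memory setup,
# using the ballpark targets from the post (hypothetical helper).

LATENCY_TARGET_NS = 46.0        # lower is better
BANDWIDTH_TARGET_MBS = 65_000   # read bandwidth, higher is better

def tuned_well(latency_ns, read_bw_mbs):
    """True if both the latency and bandwidth targets are met (in G1)."""
    return latency_ns <= LATENCY_TARGET_NS and read_bw_mbs >= BANDWIDTH_TARGET_MBS

# Example: a tuned DDR4-4000 G1 kit vs a loose XMP profile
print(tuned_well(44.5, 68_200))  # True  - in the sweet spot
print(tuned_well(55.0, 60_500))  # False - needs tuning
```

The point of the post stands either way: once both checks pass, the remaining gaming gains from chasing exotic DDR5 are marginal.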
  7. I really enjoyed following this project and the final write up was a good read! Awesome work @srs2236!
  8. So what are the top 5 persistent issues/problems with the X170KM-G?
  9. Jufes continuing to dump all over DDR5 pushing A-die to 7600 vs tuned B-die 4100 for gaming.... "If you're a DDR5 fanboy or you bought the wrong product, I'm sorry." "If you took advice from an extreme overclocker that don't play games or mainstream tech tubers that don't know how to tune machines, I'm sorry again." Savage lol....
  10. The only fair and valid comparison is another 3090Ti under the exact same conditions. Any other comparison is invalid but still fun to see. 😀
  11. Of course we're comparing a massive outlier setup (KPE 3090, custom water, Mo-Ra, chiller) vs a 4090 FE with zero overclocks, but I do agree some of the numbers are sus, especially the 4090 numbers, unless....*gasp* Nvidia was fluffing us hardcore with the 4090 presentation. RT is definitely easy to differentiate from normal rendering even in basic forms. When I turned off RT in WoW, I noticed immediately, and they only used it for shadows and certain lighting situations. I think it is the future, but it is a long way from being the dominant rendering preference vs rasterization. The industry feels a bit lost now with EVGA bowing out in regards to enthusiasts, pushing hardware and making superior designs. There's still Galax, so we can see what they offer, as they were always the direct competitor to EVGA for XOC GPUs, but the industry continues to push PC computing as a commodity. I don't like it, but I understand that the target market has always been 99% everyday users that just want to buy something, plug it in and use it as is. 😞
  12. LOL 3090 FE no longer viable....broken irrevocably! (thick boii!)
  13. With the uptick in RT performance, I expected more than 62%, since the 3rd gen cores are supposed to be much better than 2nd gen. 50-60% is what I expect in raw rasterization. DLSS, as always, is a hard pass for me. Maybe the FE stock cooler is a monster if it can handle up to 600w, or it's a reporting anomaly and the software needs updating. The Suprim is supposed to have a pretty decent cooler, so I'm surprised to see 75c.
  14. Well, I guess the "glass half full" look at this (barf) is he could have priced the 4090 at $1999, making performance:cost proportionate with the 4080 16GB and the 4080 12GB...... The 4090, compared to the 3090, is offering substantially more performance for only $100 more USD, but it is downright pricey considering the original price of flagship GPUs just 3 generations ago (Pascal). Looking at the great Ampere glut, the pricing makes sense and either forces a buyer into a monster purchase of a 4000 card or a more realistic purchase of a 3070 for ~$450-500. The 4080 12GB is going to have a hard time beating a 3090ti in pure rasterization. RT performance is going to be good though. I wouldn't even contemplate a 4080 16GB even if I had a 3080+ class card and it is priced at $1200; I'd just pony up the extra $300 and get a 4090 if I was in that situation. With 2000 more cores in reserve, I expect that if Jensen has his way, the 4090ti is going to be $1999.99. Once AMD became a non-factor, all bets were off. The 6000 series made some good inroads, but maybe Jensen knows something we don't about AMD specs/pricing to feel he can justify these prices. I'm still hoping AMD comes out with a winner and similar performance leaps as RDNA to RDNA2.
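The performance:cost argument above is just dollars-per-unit-of-performance arithmetic. A small sketch of it (the MSRPs are the announced launch prices; the relative performance indices are made-up placeholders, not benchmark results):

```python
# Illustrative price-per-performance arithmetic for the 4000-series
# launch stack. MSRPs are the announced launch prices; the relative
# performance index column is a HYPOTHETICAL placeholder.

cards = {
    # name: (msrp_usd, hypothetical relative performance index)
    "4080 12GB": (899, 100),
    "4080 16GB": (1199, 125),
    "4090": (1599, 170),
}

def dollars_per_point(price, perf):
    """Dollars paid per unit of (hypothetical) performance."""
    return round(price / perf, 2)

value = {name: dollars_per_point(p, x) for name, (p, x) in cards.items()}
for name, v in sorted(value.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${v:.2f} per perf point")
```

With placeholder indices shaped like the launch claims, the 4090 undercuts the 4080 16GB on price-per-performance, which is the "just pony up the extra $300" reasoning in the post.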
  15. As always, wait for the real numbers, but if you're going 4000, definitely go 4090... Even with Nvidia's fluffery, you know it is going to be at least a raw uptick of 50-60% over the 3090ti right out of the box for rasterization, never mind the bump to RT.
  16. As always, it comes down to the AIB. Having personally tested the EVGA FTW 3090ti vs the KPE 3090 and KPE 3090ti Hybrids:

FTW 3090ti TS run = ~65c with fans at full blast. Same run with the KPE 3090ti AIO = ~51c, much quieter and clocking ~40mhz higher stock for stock (2075 vs 2115)...could just be the silicon there though. The FTW 3090ti runs were when it was ~67f in my computer room, versus 70f for the KPE 3090ti. The KPE 3090ti's 360mm is mounted up top, so it isn't getting the fresh air, as I could only mount the AC LF II 420mm in the front. Gaming (WoW, but of course) in Ardenweald (roughest zone) after an hour = ~45.5c KPE 3090ti / 65c FTW 3090ti. I did remove the side panel for the FTW 3090ti, though, to stop the hotboxing of my chassis.

If done properly, a Hybrid AIO beats air cooling on average every time, and a Hybrid has much more flexibility in fan/rad adjustments. I quickly tossed the new, snazzy EVGA aRGB fans and slapped on some Arctic P12s. Much quieter and better performance. And as always, a real block trumps all if that is your cup of tea.

LOL, as for the Neptune, it would definitely have to be a white/silver buildout. Just think, if you went with a white build, you could get white crocs to match! 🙂
  17. The Neptune has a better AIO cooler than standard Hybrids which tend to just cover the GPU primarily but obviously can't touch a real block. It falls right in between a classic Hybrid and a full block.
  18. Run down of all the 4090 cards announced so far (AKA Size Matters):
  19. True, the 3090ti design fixes all the flaws and potential problems of the 3090 with a small uplift on top. Benefits of end-of-cycle refinements. Still doesn't wash the taste out of my mouth, as they should have released it from the get-go if able. Looking at their stack, even dismissing all the fluff numbers, the 4090 is a monster of a card, but knowing a true Ti-level card is lurking with 2000 more CUDAs gives me pause, plus early adopter costs. The market will be much clearer by the end of Q1 2023. AMD has just been bringing it the last 2-3 years, both CPU and GPU, in terms of growth and performance. Hopefully RDNA3 is as much of a beast compared to RDNA2 as RDNA2 was to RDNA. 5700XT -> 6900XT was serious growth. Whatever gets Jensen shaking in his leather jacket: price adjust and release the 4090ti sooner than later. At the rate power consumption is growing, we will eventually end up with a two-handed GPU that slots in and uses 4-5 retention screws.....4-slot cards just spilling gobsmacking amounts of heat into your case. Don't mind me, I'm jaded. The way that FTW 3090ti ran and turned my case and room into a sauna (and was loud too under load) just left a sour taste in my mouth. Look at the size of that Gigabyte! (We're gonna need a bigger boat... 🙂 )
  20. The real challenge for Nvidia will be: can they still expect those prices with no miners, a recession, inflation, a used-market glut and plenty of 3000 owners (and even 2000 owners) not looking to upgrade? After the initial wave of early adopters, it will be interesting to see how the market pans out post-holidays. I am also very interested to see what AMD has to offer (on both fronts).
  21. I've been looking through all the models and my initial assessment still stands. The only one (so far) that catches my eye is the Colorful Neptune 4090. Immediate ideas of a Gundam/Robotech white-and-silver vertical build danced through my head:
  22. This. Nvidia has so many other markets captured outside of consumer cards that we're slowly seeing them take control of the consumer market too, with their fixed AIB pricing while also offering their own cards. As AIB pricing and FE pricing slowly inch toward each other (i.e., raw GPU/mem fabs vs full on-the-shelf FE products), AIBs will have to jack up their prices to remain profitable, to the point that it makes no sense to spend 10-20% more for their model when you can grab an Nvidia model for much less. The next step is for Nvidia to prioritize allotments to their own products and dole out what's left to AIBs. EVGA had the right idea, and as numerous charts have shown, AIB profits plummet year after year while Nvidia's rise. If Nvidia wants full vertical control, this is the right way to achieve it: not outright banning AIBs, but focusing on their own FEs and making AIB pricing unappealing till AIBs start to jump ship due to pricing and other practices that result in at best razor-thin margins, at worst losses. If you aren't an AMD fan, now is the time to start putting them on your radar along with backing Intel. For family and friends buildouts, I think I am going to start using them instead when able. DISCLAIMER: If EVGA comes back to Nvidia, all bravado and chest pumping is potentially null and void... 😁
  23. I'm definitely waiting to see what AMD has to offer, both CPU and GPU, before I make any kind of moves.

I've never been a fan of DLSS, nor will I ever use it. I do like RT, but the WoW DF beta has kept it to just shadows like before and cleaned up the code base. FO76 doesn't even push my 3090ti above 150w at full 4k with everything on ultra.

We knew the price hike was coming, but the major scum move is introducing some oddly tiered 4080 w/ cut-down cores and less memory and charging $900.......yeah, we can all spot what should have been the $599 4070.... I like their charts where they are pressing HARD to show RT/DLSS gains, as the rasterization gains (which we know are being shown in their best light) are......ok. You will most likely need the full-fat 4080 16GB and above to beat a 3090ti outright.

Like you said, this is AMD's time to strike. I expect that if AMD comes out with performance at or slightly greater than Nvidia's in rasterization and other areas, we will get the 4090ti sooner than later.

With EVGA out of the picture, my enthusiasm for the 4000 series has dropped massively. As for the other AIBs, the only card so far that looks compelling is the Colorful Neptune 360mm AIO. All of the others are massive case thermonuclear reactors. MSI has an AIO model, but it has a 240mm.....ok.....
  24. I like it. It's a KPE 3090 on steroids and better binned than my KPE 3090. Sold my KPE 3090 on eBay for $2800; bought it originally direct from EVGA for $1999. Bought the KPE 3090ti for $2499.99, but after contacting EVGA CS a few times after the price drop to $1999.99, they refunded me the $500 difference PLUS the 3% associates code on top, making it $1924.99, so I was ok with that, since they didn't have to. That included the 1600w PSU, which I actually needed, as my best PSUs were a 1000w Seasonic and an 850w EVGA, which both lacked the PCIe ports I needed. Basically an upgrade from my KPE 3090 to a 3090ti, plus a bit back AND a 1600w PSU. I'm content, and the WoW beta and FO76 aren't even stressing it at 4k. Actually, FO76 doesn't even push it above 150w, and Blizzard seems to have finally optimized their code, which is nice, but we will see with the final launch. With all the EVGA/Nvidia problems, along with my two main games having no problems at 4k with it, I'm definitely skipping the 5000 series. Downside of course is if you want to block it, that's a no-go. 🙂 I wouldn't block it, but after having a FTW 3090ti in my case for a week, I will never put a furnace of that level inside a cased system ever again. In the future it will always be hybrids or a potential block.
