NotebookTalk

electrosoft

Member
  • Posts

    3,097
  • Joined

  • Last visited

  • Days Won

    160

Everything posted by electrosoft

  1. When someone says not a fanboi, that means they're a fanboi..... (zing!) 🤣 Who said I didn't like it? I'm loving it, but I can always want and expect more of a response overall. I was hoping to see them do more than basically match Alder Lake at best; the 12900KS still rules the roost for single core at stock vs the 7950X. If you're talking gains over the 5000 series, then definitely, it is a worthy successor, but I had already said Intel would retain the single core lead, and that's exactly what they are going to do. Full fat 16 cores vs an 8 + 16 hybrid mess: I wanted to see more too. Architecturally, I prefer a proper full fat 16 core design over the 8 + 16 "we can't control the thermonuclear" approach, but the fact that Intel can tack on 8 more cores, add more cache, refine the process and still continue to compete against AMD's brand new architectural answer to Alder Lake leaves me a bit meh. On the other hand, knowing X3D is in the pipeline stacked on top of these nice improvements, those should be monsters for cache sensitive uses (games, basically) after seeing what the 5800X3D brought to the table. Looking forward to the real reviews, but at least it is competitive. If AMD brings the high heat with their GPUs, a full AMD buildout would be a viable alternative IMHO. I won't be buying any "new" new hardware till everything is released, the dust settles and we get an idea of what seems the most fun / makes the most sense. I still have my master plan to get an NH55 and plop this 12900K in it. What I end up with after that? We will see. Since you're going AMD, it will be nice to see some other numbers popping up in this thread. I liked it on the old forums when there were at least 2-3 of us with modern AMD hardware (myself, @Mr. Fox and @Rage Set) to toss in benchmarks and data for comparison.
  2. This across all spectrums.... AMD's greatest contribution is forcing Intel and Nvidia to innovate instead of holding back products and punishing the market with trickled releases at cyclically inflated costs... Here's hoping their GPUs bring the high heat this time. Their leaked CPU scores are looking decidedly meh... single core is lacking and multi-core is less than I expected:
  3. Yeah, I very much enjoyed Jufes' last video, and it has been my conclusion all along too. If you can get a decent IMC, decent memory and a decent motherboard that can handle 4000+, you're pretty much good to go. Get your latency to ~46ns or lower and bandwidth to ~65k or higher (all in Gear 1), and DDR5 will have to go that extra mile to extract better gaming performance than DDR4, and as seen, that takes some serious extremes. But this has been par for the course for every new memory generation compared to the refined previous iteration (i.e. DDR vs DDR2 vs DDR3 vs DDR4). Eventually DDR5 will reign supreme across the board in all aspects; it will just take some time. I do have an EVGA Z690 Classified sitting on the shelf that I ordered a few weeks back, as they now toss in another free keyboard and their wireless mouse, and the associates code still works, so $285 for the whole bundle. I was planning to wait for 13th gen to pick up a 13900K to mix and match between my Strix Z690, which does even 4200 G1 DDR4, and the Classified Z690 with some Hynix DDR5 sticks I picked up over on the overclock forums, but I'm also keeping an eye on the Ryzen launch to see what AMD brings to the table too. I have such a sour taste in my mouth over those USB issues, and over AMD neutering AGESA for overclocking with subsequent releases, that I ended up abandoning their neutered PBO and clock offsets and just went traditional for consistent clocks across the board, so we will see.
  4. I really enjoyed following this project and the final write up was a good read! Awesome work @srs2236!
  5. So what are the top 5 persistent issues/problems with the X170KM-G?
  6. Jufes continuing to dump all over DDR5 pushing A-die to 7600 vs tuned B-die 4100 for gaming.... "If you're a DDR5 fanboy or you bought the wrong product, I'm sorry." "If you took advice from an extreme overclocker that don't play games or mainstream tech tubers that don't know how to tune machines, I'm sorry again." Savage lol....
  7. The only fair and valid comparison is another 3090Ti under the exact same conditions. Any other comparison is invalid but still fun to see. 😀
  8. Of course we're comparing a massively outlier setup (KPE 3090, custom water, Mo-Ra, chiller) vs a 4090 FE with zero overclocks, but I do agree some of the numbers are sus, especially the 4090 numbers, unless... *gasp* Nvidia was fluffing us hardcore with the 4090 presentation. RT is definitely easy to differentiate from normal rendering even in basic forms. When I turned off RT in WoW, I noticed immediately, and they only used it for shadows and certain lighting situations. I think it is the future, but it is a long way away from being the dominant rendering preference vs rasterization. The industry feels a bit lost now with EVGA bowing out in regards to enthusiasts, pushing hardware and making superior designs. There's still Galax, so we can see what they offer, as they were always the direct competitor to EVGA for XOC GPUs, but the industry continues to push towards PC computing as a commodity. I don't like it, but I understand that the target market has always been 99% everyday users that just want to buy something, plug it in and use it as is. 😞
  9. LOL 3090 FE no longer viable....broken irrevocably! (thick boii!)
  10. With the uptick in RT performance, I expected more than 62%, since 3rd gen RT cores are supposed to be much better than 2nd gen. 50-60% is what I expect in raw rasterization. DLSS, as always, is a hard pass for me. Maybe the FE stock cooler is a monster if it can handle up to 600w, or it's a reporting anomaly and the software needs updating. The Suprim is supposed to have a pretty decent cooler, so I'm surprised to see 75c.
  11. Well, I guess the "glass half full" look at this (barf) is that he could have priced the 4090 at $1999, making performance:cost proportionate with the 4080 16GB and the 4080 12GB...... The 4090, compared to the 3090, is offering substantially more performance for only $100 more USD, but it is downright pricey considering the original price of flagship GPUs just 3 generations ago (Pascal). Looking at the great Ampere glut, the pricing makes sense and either forces a buyer into a monster purchase of a 4000 card or a more realistic purchase of a 3070 for ~$450-500. The 4080 12GB is going to have a hard time beating a 3090ti in pure rasterization; RT performance is going to be good though. I wouldn't even contemplate a 4080 16GB even if I had a 3080+ class card, with it priced at $1200. I would just pony up the extra $300 and get a 4090 if I was in that situation. With 2000 more cores in reserve, I expect that if Jensen has his way the 4090ti is going to be $1999.99. Once AMD became a non-factor, all bets were off. The 6000 series made some good inroads, but maybe Jensen knows something we don't about AMD specs/pricing to feel he can justify these prices. I'm still hoping AMD comes out with a winner and a similar performance leap as RDNA to RDNA2.
  12. As always, wait for the real numbers, but if you're going 4000, definitely the 4090... Even with Nvidia's fluffery, you know it is going to be at least a raw uptick of 50-60% over the 3090ti right out of the box for rasterization, never mind the bump to RT.
  13. As always, it comes down to the AIB. Having personally tested the EVGA FTW 3090ti vs the KPE 3090 and KPE 3090ti Hybrids: FTW 3090ti TS run = ~65c with fans at full blast. Same run with the 3090ti KPE AIO = ~51c, much quieter, and clocking ~40mhz higher stock for stock (2075 vs 2115)... could just be the silicon there though. The FTW 3090ti runs were when it was ~67f in my computer room, versus 70f for the KPE 3090ti. The KPE 3090ti 360mm is mounted up top, so it isn't getting the fresh air, as I could only mount the AC LF II 420mm in the front. Gaming (WoW, but of course) in Ardenweald (roughest zone) after an hour = ~45.5c KPE 3090ti / 65c FTW 3090ti. I did remove the side panel for the FTW 3090ti though, to stop the hotboxing of my chassis. If done properly, a Hybrid AIO beats air cooling on average every time, plus a Hybrid has much more flexibility in fan/rad adjustments. I quickly tossed the new, snazzy EVGA aRGB fans and slapped on some Arctic P12s. Much quieter and better performance. And as always, a real block trumps all if that is your cup of tea. LOL, as for the Neptune, it would definitely have to be a white/silver build out. Just think, if you went with a white build, you could get white crocs to match! 🙂
  14. The Neptune has a better AIO cooler than standard Hybrids which tend to just cover the GPU primarily but obviously can't touch a real block. It falls right in between a classic Hybrid and a full block.
  15. Run down of all the 4090 cards announced so far (AKA Size Matters):
  16. True, the 3090ti design fixes all the flaws and potential problems of the 3090 with a small uplift on top. Benefits of end-of-cycle refinements. Still doesn't wash the taste out of my mouth, as they should have released it from the get go if able. Looking at their stack, even dismissing all the fluff numbers, the 4090 is a monster of a card, but knowing a true Ti level card is lurking with 2000 more CUDA cores gives me pause, plus early adopter costs. The market will be much clearer by the end of Q1 2023. AMD has just been bringing it the last 2-3 years, both CPU and GPU, in terms of growth and performance. Hopefully RDNA3 is as much of a beast compared to RDNA2 as RDNA2 was to RDNA. 5700xt -> 6900xt was serious growth. Whatever gets Jensen shaking in his leather jacket, price adjust and release the 4090ti sooner rather than later. At the rate power consumption is growing, we will eventually end up with a two handed GPU that slots in and uses 4-5 retention screws..... 4 slot cards just spilling gobsmacking amounts of heat into your case. Don't mind me, I'm jaded. The way that FTW 3090ti ran and turned my case and room into a sauna (and was loud too under load) just left a sour taste in my mouth. Look at the size of that Gigabyte! (We're gonna need a bigger boat... 🙂 )
  17. The real challenge for Nvidia will be: can they still expect those prices with no miners, a recession, inflation, a used market glut and plenty of 3000 owners (and even 2000 owners still) not looking to upgrade? After the initial wave of early adopters, it will be interesting to see how the market pans out post holidays. I am also very interested to see what AMD has to offer (on both fronts).
  18. I've been looking through all the models and my initial assessment still stands. The only one (so far) that catches my eye is the Colorful Neptune 4090. Immediate ideas of a Gundam/Robotech White and silver vertical build danced through my head:
  19. This. Nvidia has so many other markets captured outside of consumer cards that we now see them slowly taking control of the consumer market too, with their fixed AIB pricing while also offering their own cards. As AIB pricing and FE pricing slowly inch towards each other (i.e., raw GPU/mem fab costs vs full on-the-shelf FE products), AIBs will have to jack up their prices to remain profitable, to the point that it makes no sense to spend 10-20% more for their model when you can grab an Nvidia model for much less. The next step is for Nvidia to prioritize allotments to their own products and dole out what's left to AIBs. EVGA had the right idea, and as numerous charts have shown, AIB profits plummet year after year while Nvidia's rise. If Nvidia wants full vertical control, this is the right way to achieve it: not outright banning AIBs, but focusing on their own FEs and making AIB pricing unappealing till they start to jump ship due to pricing and other practices that result in, at best, razor thin margins, at worst, losses. If you aren't an AMD fan, now is the time to start putting them on your radar, along with backing Intel. For family and friends build outs, I think I am going to start using them instead when able. DISCLAIMER: If EVGA comes back to Nvidia, all bravado and chest pumping is potentially null and void... 😁
  20. I'm definitely waiting to see what AMD has to offer, both CPU and GPU, before I make any kind of moves. I've never been a fan of DLSS, nor will I ever use it. I do like RT, but the WoW DF beta has kept it to just shadows like before and cleaned up their code base. FO76 doesn't even push my 3090ti above 150w at full 4k with everything on ultra. We knew the price hike was coming, but the major scum move is introducing some oddly tiered 4080 w/ cut down cores and less memory and charging $900....... yeah, we can all spot what should have been the $599 4070.... I like their charts, where they are pressing HARD to show RT/DLSS gains, as rasterization gains (which we know are being shown in their best light) are...... ok. You will most likely need the full fat 4080 16GB and above to beat a 3090ti outright. Like you said, this is AMD's time to strike. I expect that if AMD comes out with performance at or slightly greater than Nvidia in rasterization and other areas, we will get the 4090ti sooner rather than later. With EVGA out of the picture, my enthusiasm for the 4000 series has dropped massively. As for the other AIBs, the only card so far that looks compelling is the Colorful Neptune 360mm AIO. All of the others are massive case thermonuclear reactors. MSI has an AIO model, but it has a 240mm..... ok.....
  21. I like it. It's a KPE 3090 on steroids and better binned than my KPE 3090 was. I sold my KPE 3090 on ebay for $2800; bought it originally direct from EVGA for $1999. I bought the KPE 3090ti for $2499.99, but after contacting EVGA CS a few times after the price drop to $1999.99, they refunded me the $500 difference PLUS the associates code 3% on top, making it $1924.99, so I was ok with that since they didn't have to. That included the 1600w PSU, which I actually needed, as my best PSUs were a 1000w Seasonic and an 850w EVGA, both lacking the PCIe ports I needed. Basically an upgrade from my KPE 3090 to a 3090ti, plus a bit back AND a 1600w PSU. I'm content, and the WoW beta and FO76 aren't even stressing it at 4k. Actually, FO76 doesn't even push it above 150w, and Blizzard seems to have finally optimized their code, which is nice, but we will see with the final launch. With all the EVGA/Nvidia problems, along with my two main games having no problems at 4k with it, I'm definitely skipping the 5000 series. The downside of course is that if you want to block it, that's a no go. 🙂 I wouldn't block it, but after having a FTW 3090ti in my case for a week, I will never put a furnace of that level inside a cased system ever again. In the future it will always be hybrids or a potential block.
  22. Forgot the "of GPU" (which we know is low to none) lol... The 2% of sales for motherboards and misc was the one I found jaw dropping. I assumed their motherboard business (even with them being habitually late to market) was greater than that. With the data as presented, I can see why they cut ties with Nvidia and are focusing on a proper downsize while deciding what they want to do next. They can continue as a small, tightly run PSU, MB and misc company while they decide on another GPU venture that yields better control and profitability. I do hope they go with AMD; a K|NGP|N 7900xt or 8900xt would be righteous, but I just feel like they will somehow work out their differences enough that, after optimizing, they will come back around to Nvidia for the 5000 series. It definitely is a combination of objective and subjective reasons, but this has clearly been brewing for some time, as Jensen and Nvidia have become a tiresome problem to work with. It isn't as if AMD doesn't make their own cards too, though, so there would be a bit of competition with AIBs there as well.
  23. According to the GN video (as reported by EVGA, plus personal extrapolation): ~78% of sales are GPU (but low to no profitability); ~20% of sales are PSU (300% profitability); ~2% are motherboards and misc (this is surprising). The 20% layoff makes sense, as they informed Nvidia in April 2022 of their intention to sever the partnership, and Nvidia had almost 5 months to make it right and didn't. Wondering where Vince will go? Facts as told to Steve: the primary reason for terminating the partnership = they can't compete with Nvidia's FE editions and pricing, along with Nvidia not listening, finalized by EVGA losing hundreds per card sold from the 3080 12gb and up, even with the older, less severe price cuts:
