NotebookTalk

Everything posted by electrosoft

  1. 13900KS repair: Nope, I sold that after moving on from the SM/KM/TM era, sorry. 😞
  2. Another HOF on the way and beautiful family time! Ugh, it's confirmed all the new 4090's rolling off the line are 1.07v variants: It also appears the newest Nvidia drivers are letting 4090 VRAM OC's go a bit higher? I'll have to check later:
  3. Congrats on the Z790 Apex pickup! *Looks at all the new hardware strewn about my abode* I completely understand the logic vs "I want it" conflict. *Stares at the 13900ks SP115 I probably didn't need but couldn't pass up* 🤣
  4. Good video showing the differences in not only frequency but also timings and Samsung vs Hynix in gaming on AM5: The Hynix kit comes out on top each and every time. Like your Samsung kit @Raiderman, his primaries stalled at CL32 while the Hynix hit CL30, and the Hynix had better subtimings at 6000 vs 6000 (CL30 results = Hynix, CL32 results = Samsung).
  5. So far, testing this SP115 13900ks on the Strix Z690 D4 with just an EVGA 280mm (since that is what would be used in my sff), this thing has no problems with CB23, scoring ~41k with temps in the 70s. It isn't even delidded. I've heard all the horror stories of 13900k's thermal throttling in CB23 even with 360mm AIOs and some with custom loops, but this one comes nowhere close to throttling.... I'll be moving it to the MSI Z790i Edge in the next day or so to start testing these DDR5 kits to see if either can hit 7000+, or I'll end up ordering some of those bare A-dies @Mr. Fox recommended and slapping some heatsinks on them instead.
  6. It just sounds like garbage sticks may be in play, beyond the reboot hangs. Both of the sets of M-die I have will do 30-36-36-48 with tRFC 500, tREFI 65535 and even tRFCsb 300 or lower. The one set with aftermarket heatsinks on them can even do tRTP 12, tRRD_S/tRRD_L 4 and a perfect x4 tFAW of 16. The G.Skill sticks can't go as tight on terts at 1.4v. You could try pushing Vmem up like @Mr. Fox suggested, if only to see whether that will let them boot and/or at least pass TM5, and see how hot they might get under load monitoring in HWI. I saw you had them at 1.38, so push past my suggestion of at least 1.4 and try 1.45, but Samsung is almost always going to be more problematic than Hynix on AM5. Again, I don't want to give the impression that all the sets of DDR5 I've used and tested worked great and were just magic unicorns prancing across my Carbon X670E. I sent back / replaced two sets from G.Skill.
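(Not from the post itself, just a quick illustrative sketch: the dictionary below is my own shorthand for the tighter M-die values quoted above, and the only thing it actually checks is the "perfect x4" relationship of tFAW = 4 x tRRD_S. The field names are mine, not anything a board exposes.)

# Illustrative only: shorthand for the M-die BIOS settings quoted above,
# plus a check of the "perfect x4" relationship (tFAW = 4 x tRRD_S).
mdie_tight = {
    "vdd": 1.40,                      # DRAM voltage used for these timings
    "primaries": (30, 36, 36, 48),    # the 30-36-36-48 set from the post
    "tRFC": 500,
    "tRFCsb": 300,
    "tREFI": 65535,
    "tRTP": 12,
    "tRRD_S": 4,
    "tRRD_L": 4,
    "tFAW": 16,
}

def faw_is_perfect_x4(timings: dict) -> bool:
    """True when tFAW is exactly 4x tRRD_S, the 'perfect x4' case above."""
    return timings["tFAW"] == 4 * timings["tRRD_S"]

print(faw_is_perfect_x4(mdie_tight))  # True for the set quoted above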
  7. So far with this 13900ks: I was sent a few messages offering to buy it off of me for $800 and $1000. Tested it in the NH55 laptop; fully expected it not to post and it didn't disappoint. 🙂 No post. It will need a BIOS update that will never come unless we get a Prema/Dsanke/Hidden Gem application. In the Z690 D4, looking at the VF curve, I sure wish it had posted in the NH55 because it has the best 4.3 point I've seen. I was finally able to confirm it was my 12900k that tapped out on DDR4 at 4133, not my Z690 D4, as this 13900k booted right up at 4300 1:1 no problem and zoomed through TM5. I'll end up running some other baseline benchmarks and tests to add to my comparative collection versus the other 12900k's I've tested before I install the MSI Z790i Edge in my other test mATX case to start testing these M-dies and A-dies to see where they give up the ghost, seeing as the MSI is rated for 8000+.
  8. Yeah, those timings aren't great. Even my few sets of M-die sitting around (generic sticks in generic heatsinks and a pair of G.Skills) are much better. I did have to exchange one pair of the G.Skills though to get a properly working set. See if you can push it up to 1.4v, do CL30, and push tWR to 48, tRFC to 500 and tREFI to 65535 for starters, then pass a basic TM5 run. Well.... That's rather disheartening to see from EVGA. First their GPUs and now a reduction in PSU warranty from 10yrs to 3yrs? *sigh* ----------------------- SP115 has arrived @tps3443
  9. Do you have a link to the video? I probably skimmed right over it in classic me style. 🙂 Let me know what settings you're able to dial in that can make it past TM5/WMD. These are my current settings with G.Skill 2x32GB A-die sticks: I'm wary of pushing them harder with the stock heatsinks as they can get toasty (~60c) under load and really need some proper heatsinks. These are WMD, Memtest 5hr and TM5 extended certified timings. I've spent hours (way too many, heh) gaming with these settings with no problems.
  10. I was running .172 and upgraded to .174. No performance loss, temp issues or hangs. All 5 sets of DDR5 RAM that have crossed my way (I currently have three sets) have been Hynix, no Samsung: 3x M-die, 2x A-die. One set of Ripjaws I received was Samsung and it acted oddly, but in G.Skill's defense they were only certified for Intel/XMP, not AMD/EXPO. I sent those back. All the sets I have now work perfectly (2x 2x16GB M-die, 1x 2x32GB A-die). For whatever reason, some people (not all, myself included) are having issues with Samsung memory even when manually dialed in. Do you have a hang issue even at default, or just with EXPO/XMP / manually tuned? G.Skill sources what works and it can be Samsung or Hynix, M-die or A-die, and it is a crap shoot, but the sticks will usually tell you whether they are M or A die with an "M" or "A" at the end of the code above the barcode: M-die: A-die: Obviously once you have them in your possession you can physically slap the sticks in your system and use CPU-Z to see who makes them. If they are Hynix, give them the ole tRFC challenge, as A-die tend to be able to run sub 450 and M-dies give up the ghost quicker. If they can run sub 400 especially, they are most likely A-die. But to reiterate, I've had zero boot issues with any of the sticks I've used, and I've rebooted dozens of times per day when dialing each set in with no issues. Have you tested to see where your IMC gives up its ghost? My 7800X3D tops out at 6200, which is no problem because 6000 truly is the sweet spot. At 6000, tight timings trump 6200-6400 IMHO.
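(A minimal sketch of that tRFC rule of thumb, not a definitive identifier: the sub-450 / sub-400 thresholds are just the heuristics from the post, and "lowest stable tRFC" means whatever still passes a memory test like TM5.)

# Rough rule-of-thumb version of the "tRFC challenge": guess which Hynix die
# a kit uses from the lowest tRFC that still passes a memory test like TM5.
# Thresholds are the forum heuristics above, not an exact science.
def guess_hynix_die(min_stable_trfc: int) -> str:
    if min_stable_trfc < 400:
        return "almost certainly A-die"
    if min_stable_trfc < 450:
        return "probably A-die"
    return "probably M-die (gives up the ghost sooner)"

print(guess_hynix_die(380))  # almost certainly A-die
print(guess_hynix_die(500))  # probably M-die (gives up the ghost sooner)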
  11. Nice video bro! I did see some dips into the 60's though 🙂 3090 is hitting a major wall. Turn on benchmarking next time to see the lows too. 69 dip (~72 compensating for capture deviation): Slap a 4090 in that rig to see what that 6ghz setup and memory OC can really do. 🙂
  12. Did you grab the one over on the OCN FS forums? I saw one for sale at a good price. I was tempted to pick it up to replace my daughter's 3070 for Hogwarts but she let me know she's done with Hogwarts as she's beaten it and re-play value is low. 🙂 Good thing I didn't either. I don't think I could have swung that and the 13900ks at the same time. Well....not without some backlash. 😁 We'll have to see some complete pics with it in the N200R! Congrats!
  13. LOL, well here's the long version.... Actually, I have this MSI Z790i Edge, my NH55 (Socket 1700 laptop), and a Z690 D4; my wife has an Asrock B660M and my daughter has an Asus B660. Haus Intel atm:
      -----------------------------------
      MSI Z790i Edge = No CPU (+????)
      Asus Z690 D4 = 12900k (Nothing ATM)
      NH55 = 12900k (+3070ti Mobile)
      Wife = Asrock B660M + 12100f (+Asus Strix 3080)
      Daughter = Asus B660 Prime + 12400 (+Asus 3070 KO Edition)
      ------------------------------------
      My original plan was to put this 7800X3D in an MSI Edge B650 and swap it back and forth for an on-the-go sff, but I don't feel like doing that now that I have my desktop dialed in and running in pure beast mode for gaming, so I'm going to send back the MSI Edge B650 and keep the Z790i Edge. The other plan was to swap out my Suprim 4090 as needed if possible, but that is even worse than swapping out the CPU and rebuilding. I found a display that a lot of travelers seem to like because it is a 1080p 22" 144hz panel. That means I don't need a 4090 in my sff, so I'll be returning this sealed FE 4090 to Best Buy sometime next week unless someone wants it at cost. My original plan, if I built out a 4090 SFF and had to use the FE 4090, was to sell my Liquid X 4090 if I had to, but I really didn't want to as it is definitely above-average silicon. So then I was trying to sort out where to conjure another Intel CPU out of thin air, since the Z690 D4 is slated for the wife's system when all is said and done along with the 12900k; her RS B660M is woefully underpowered for the 12900k and she is getting performance issues with Teams and other software running in the background while she plays WoW. The best silicon of my 12900k's is definitely going in the NH55 (right now it is looking like my original 12900k from Dec 2021 is still the winner). I was contemplating picking up another 12900k or 13900k/ks if I had to for the Z790i Edge, and then that 13900ks popped up. He was able to run 5.7 P-cores on air, which means it will most likely end up in my sff + Z790i Edge with an EVGA 280mm CLC AIO after it system-hops for testing/fun times in the NH55 and Z690 D4. The next objective is to see what will get me a solid 1080p 144hz everywhere, basically. I'll test both the 3070 and 3080, but I can see a 4070 doing the trick, or just yanking out the 3070 or 3080 for travel; switching it up, I can potentially go with a smaller sff. I can also see an AMD 6950xt easily taking care of business too, but a 4070 runs much, much cooler and pulls a fraction of the power....it also has half the VRAM. 😞
  14. I did. 🙂 I couldn't pass up an SP115 for that price. I'll end up playing with it on my Z690 D4 to see where these B-dies truly cap out before seeing where the DDR5 MC (or MB or mem) gives up the ghost in my MSI Z790i Edge which is rated to 8000. See where these three sets of DDR5 give up the ghost as I'm sure none of them will hit 8000. I'm also curious to see if a 13th gen will boot in my NH55 laptop. I'll also be curious to see how it stacks up against my 7800X3D when all is said and done. @Custom90gt New Fractal Design Terra seems to hit all the sweet spots properly (if air cooling).
  15. @win32asmguy I finally got a chance to test yours against my original cool hand luke. On the desktop it was about as expected: with conditions locked and a proper AIO, mine was better under normal conditions. Both running totally stock, yours was pulling ~20w more than mine (223w vs 243w). With LLC dropped to 1, mine was pulling ~160w and yours 193w. More importantly, yours was consistently running 5-10c+ hotter under all conditions, even with a beefy EVGA AIO cooling them down (direct die frame for both). Locked down to ~140w with a -0.100 UV, on paper your chip pulled slightly less but ran hotter. Much of that is masked by the AIO easily cooling down both, but it still ran 2-5c hotter on the desktop. In the NH55, yours pulled ~0.971v max at totally stock while mine pulled 1.038v, but package power was only a 1w difference (~124w vs 125w), and mine scored ~21500 and yours around 20100 because it just runs hot...so hot. I reapplied nanogrease 3x. Neither CPU is delidded at this point. Either it is just a hot and leaky chip or it has a bad STIM application under the hood. I'm going to delid it and see what that does compared against the desktop and laptop data already collected. If it sees a 10-15c+ drop in temps, you know it was a bad STIM application, which happens.
  16. Price wise, that is par for the course with FNW systems. I long abandoned pre-builts on desktop. I think my last pre-built was a DECpc XL 466d2 which I used till I built out a Celeron 300a (promptly OC'd to 450mhz) and I actually then turned the DECpc into my first foray into Linux in the late 90's. I could never stomach picking up another pre-built anything on the PC side (Apple not much choice). lol, still sitting in its bag, in a box of parts not being used. I'm good to go for now. 🙂
  17. Looking good! How's the Asrock Taichi 7900xtx treating you versus your former 6800xt? Good uplift? coil whine? Gaming?
  18. Looking good! I agree on the thickness of the Arctic. It is a very competent "set and forget" AIO with no bells and whistles and great cooling, but everything about it is pretty bulky. Plenty of room for numerous types of GPUs. How would you rate it for portability? I've been comparing the N200R to the Lian Li A4 and Dan C4, and while the N200R weighs the most by ~3lbs, dimension-wise they are all very comparable. The N200R is less than 1" bigger than the Dan C4 across all dimensions LxWxH. Can't beat $80 shipped for an open-box-new N200R. The other cases are 2-3x more expensive. Plus the N200R looks easier to work in, with components quicker to remove than in the other two. I'm also thinking of going with the Z790i Edge and sending back the B650i Edge as I have a few extra 12900k's sitting around. The only thing I'd lose out on is needing better memory to really let it shine with DDR5, unless this other set of newer M-die can hit 7400, where one of my 12900k's gives up the ghost.
  19. Oh Cablemod.... If Nvidia wanted to shoehorn another 4080 derivative into such a wide chasm between the 4080 and 4090, it would make sense to go with ~12000 shaders for a Super and 14000 for a Ti, but with current market conditions and lackluster sales due to crazy 4080 pricing? The smart play to me is still a 4080 (Ti or Super....whatever name they want to give it this time around, albeit TPU is leaning towards the Ti naming convention) with ~13000 shaders and 20GB for $1199.99. I personally think a 4080ti at 14000/20GB for $1399.99 is nearly DOA, as most buyers will just pony up $200 more for the full-on 4090 experience, but again, a 4080ti at ~13000/20GB for $1199.99 and drop the 4080 16GB down to $999.99? That would pretty much lock up Nvidia at those price points, force AMD to drop the price on the 7900XTX (or accept defeat in most purchases), and the 4090 continues to reign supreme at $1600 untouched. Nvidia could drop a 4090ti late cycle to clean out excess top-end dies if needed, but with AI gobbling up everything in sight, is it needed? How many dies don't qualify for a full fat Titan/A die but have enough shaders to qualify for a Ti-level card? This is Nvidia's race to lose at this point. I'm already thinking of the 5090 / 8900XTX battle 😅 Ideal game-over pricing that would put the screws to AMD even more and counter Intel, versus Ngreedia's typical pricing:
      4090...........$1599 vs same
      4080ti.........$1199 vs $1399
      4080...........$999 vs $1199
      4070ti.........$699 vs $799
      4070...........$549 vs $599
      4060ti 16GB....$399 vs $499
      4060ti 8GB.....$349 vs $399 (just EOL this garbage card)
      4060...........$269 vs $299
      4050...........$219 vs $249
      Those small adjustments could change everything and swing those precious few points in their favor. But as we know....Nvidia wants it all and gives no quarter when extracting maximum value for their products.
  20. When I tell you this Suprim X was miles better than the other identical one I tested from Best Buy, that is no exaggeration! 🙂 Runs cooler, pulls less, clocks higher on both boost and OC, 2x+ higher on the mem, and no coil screech, just a low vibration compared to that Best Buy garbage sample. I do wonder from time to time what this one would do on chilled water. The 3090 is on that fatty hot Samsung 8nm node, and there are areas of D4 that will push me to 450w+ stock. I have frames capped in NV to 142, so you can see I'm basically getting cap or near cap everywhere. Those lows are loads/cut scenes. As for Diablo IV, I'm having a blast. Lucky timing: I JUST finished doing everything there is to do in FO76 and was entering the "dailies repeat cycle" just to maybe progress select buildouts, so this came along at the right time (or it was back to WoW). I've played all the Diablos and this one is pretty righteous so far.
  21. Blizzard: "Diablo IV will present a dynamic landscape with proper asset management and comparable visuals and play across a wide array of hardware" Diablo IV: 4k Ultra settings? Why yes I would like 20GB+ of VRAM and RAM....why thank you!
  22. Nah, it's all about the refresh now.... last hurrah for Socket 1700. 🙂
  23. Yep, the newer models try their best to use tactics to reduce the chances of burn-in, but the core technology is still susceptible to it; with a bit of due diligence the risk can be lessened. My display burned in within months. I got a replacement. That one burned in again within months. I got another replacement, sold it, and switched to a Sony FALD model that I'm still using 3-4 years later. I will be looking to upgrade in the next few years though, from a 65" to a 75", so we will see. Love seeing the 7970 get a workout! That was my main GPU for over a year and a half when it was all shiny and new back in the day. 🙂 As for Nvidia.... (disregarding VRAM issues on the 3080 10GB vs 3090 24GB) the difference from the 3080ti/3090 scenario is that there is a much bigger leap between the 4080 and 4090, giving Nvidia plenty of room to work. The 3080 10GB -> 3080 12GB -> 3080ti -> 3090 -> 3090ti = 8704 -> 8960 -> 10240 -> 10496 -> 10752 shaders. So 3080 vs 3090 = 8704 vs 10496 = ~20% difference (even vs the 3090ti = ~23%), while 4080 -> 4090 = 9728 vs 16384 = ~68% difference (more than 3x the previous gen's gap). A much bigger chasm means close but no cigar: a 4080 Super/Ti with ~13000 shaders and 20GB of memory still will not be "on par" with a 4090. The real question is, does Nvidia continue to thumb its nose up in the air and price it at $1399.99? Or do they accept reality and either replace the 4080 16GB with it at the $1199.99 price point, or drop the 4080 16GB down to $999.99 (forcing AMD's hand with the 7900XTX) while slotting in the 4080 20GB at $1199? If Nvidia comes to their senses and drops the 4080 16GB down to $999 and slots in a 4080 Super/Ti at $1199? That would really put pressure on AMD while also allowing them to just sit atop the roost with the 4090 and potentially a 4090ti/Super (I personally prefer Ti). I wouldn't put it past Nvidia to release a 4090Ti with ~17,408 shaders for $1799-$1999 and keep the full fat 18432 for top-of-the-stack professional cards, if at all. There simply is no competition at the top. The real question is their yields: how many dies just miss the full fat cut but still have more shaders enabled than the 4090? Looking back at the 3000 series, Nvidia slotting in so many models was insane, and then the last-gasp money grab of the 3090ti wrapped it up (and all those AIBs who ordered thinking the good times were going to last, lol). It also gives us insight into Nvidia letting no partial die go unused... 🙂 They're not called NGreedia the Green Goblin for nothing....
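(Quick worked check of those shader-count percentages, using the publicly quoted CUDA core counts from the post; the "more than 3x" remark is just the Ada gap divided by the Ampere gap.)

# Percentage increase going from the 80-class card to the 90-class card,
# using the shader counts quoted above.
def pct_gap(lower: int, upper: int) -> float:
    return (upper - lower) / lower * 100

gap_3080_3090   = pct_gap(8704, 10496)   # ~20.6%
gap_3080_3090ti = pct_gap(8704, 10752)   # ~23.5%
gap_4080_4090   = pct_gap(9728, 16384)   # ~68.4%

# The Ada gap works out to roughly 3.3x the Ampere gap.
print(gap_3080_3090, gap_3080_3090ti, gap_4080_4090)
print(gap_4080_4090 / gap_3080_3090)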
  24. "GALAX is likely just following suit by adjusting prices to the same level as other vendors. " They also adjusted the 4070, 3060 and 3050. along with a glut as sales continue to be flat or worse. 4080 is the worse offender by far price wise of all of Nvidia's 4000 series compared to their 3000 series offerings. Are the TI/Supers coming? Of course. For the 4090, it comes down to is it this year or next year (aka 3090ti time table).
  25. LG C7's. The original, then the replacement. I am not cut out for OLEDs as I routinely leave static images on screen and/or pass out with them on the screen. The most visible, if faint, was the older Comcast grey-box menu select screen, predominantly at the bottom. I ended up switching to a Sony 65" 3-4 years ago. I will literally fall asleep on menu/desktop screens. It is my secret power. Another problem was they started to dim and become slightly uneven too. I know OLEDs have come a long way, but I'm good right now not using them. Maybe down the road when they have perfected a 100% no-burn-in model, which they still have not. You could always slap a static image on your display every night for 4-8 hours and leave it as is for weeks on end to see how it holds up while you sleep, as a true acid test. "However, according to leaks revealed by YouTuber Graphically Challenged" Yeah, that's where you lost me a bit.....He makes MLID seem rock solid, and that presenter's voice he uses drives me nuts. I used to follow him before.