NotebookTalk

tps3443


Everything posted by tps3443

  1. I think the 14900KS is a nice improvement, and most of them are really good, so you don’t have to do all of this buying. It makes binning easy. They are all tested for a good 5.9GHz+ Vmin and solid stability. The biggest thing I have learned is that all a delid will do is give me +100MHz with a lower temperature, and that’s not worth the work. Running direct die also makes your previously stable RAM overclocks a tiny bit tougher most of the time. If I could go back I would not have delidded my R-batch P117/E88; it had a very low 5.8 VF of 1.344V. It was a nice chip direct die! But even if I could go back, I would still take my current 14900KS (SP108): the IMC on this chip is the best I’ve tested, the P/E cores are also fantastic, and it can manage a lower load voltage at 5.9GHz than that R-batch could. I can reset the BIOS, load 8600C36-49, boot into Windows, and I’m done. So I’m not worried one bit about the R-batch. I had my fun with it. It was for sure an early “KS” before the KS even released, but since the KS launched, even my SP99 (KS) could compete with it P-core wise.
I am finished buying binned chips. From now on I am buying only retail chips, and if one doesn’t test well, I’m going to exchange it. If the 2nd one is the same, then okay, that’s fine, I’ll just run it. I think the SP rating is just seriously inaccurate and taken way too seriously. I had a low-SP 13900KS and a low-SP 14900KS, and both were great samples; their main weaknesses were not the P-cores but the IMC or the cache. The difference in voltage requirements between my SP99 and SP108 14900KS at 5.9GHz is pretty small, maybe 0.010-0.018V under a full load. With the direction that CPU overclocking and GPU demands are going, I think system memory OC is probably more important now that we edge into even more CPU-bottleneck possibilities haha. So with future chip purchases, I’m going to test for the IMC first and the P/E cores 2nd.
Low SP or high SP, I’m going to give them all a fair shake; I think with good cooling they’ll all be pretty close in the end. It’s nearly impossible to find something absolutely perfect anyway, and all of these chips will have a weakness. Maybe it’s the P-cores, maybe the E-cores, maybe the IMC, maybe it’s got a terrible cache lol. I may even skip next gen and buy when the KS version launches. If they do one.
  2. The single Gen5 PSU is the way to go! Those dongles are hideous to me. They work in a pinch, but I don’t like the look with all those 8-pins running to it. You should be good with that MSI MEG 1300 PSU though; mine is the lower tier and less wattage than yours, and I really like it.
  3. Is it safe to say the whole 4090s-catching-fire thing was all from improper usage? I have been running the Galax 666W BIOS, and I can consistently pull 600+ watts in games for hours with no issues whatsoever. My wires right at the plug reach about 57C tops, and the plastic connector may hit 40C, if that. I have a slight curve in my cable for wire management, no major creases like some do, and it’s been great. The 4090 really needs the 666W BIOS at a minimum to operate freely, I think.
  4. I have been getting this “GPU out of video memory” error since 13th gen first launched; all of those errors in the video are probably from me lol. I would use it as my quick and easy stability test. I knew that if BF2042 or The Last of Us would not spit out that GPU-out-of-VRAM error during their shader decompression, then my system was 100% stable lol. The largest cause of this error is simply voltage/clock speed/power/heat. There’s some sort of silicon sensitivity to that initial launch load going from 10 to 180 watts in 0.3 seconds, and if the voltage isn’t enough, or if the temps aren’t cold enough, it’s going to fail. I actually had one last night lol. This one popped up after closing the game though, so it did not bother me 🤣
I always undervolt using LLC and AC/DC etc., so I stretch the CPUs to their lowest possible required voltage to save on power and reduce heat, and this is where I have always seen the errors (not from running the chips stock). Now, I use a chiller so these chips stay cold, which would probably explain why I never see the errors when running stock/auto voltage. Asus updated the BIOS and moved away from their LLC3 standard to LLC5, which made many of them go away. If your load voltage is high enough you won’t get the error at all. Pretty wild seeing this though. It fixed the problem for most, but it always came down to a weird silicon sensitivity problem. You can always force the error to happen on any chip out there; it doesn’t matter how amazing it is. They all spit out the same error if you starve them too much on voltage with these very quick-ramping shader compilation loads. I’m not sure who to blame here, but it’s something wrong with the CPUs I guess. I can always make them work either way though. Hopefully next gen won’t have the issue. I recommend anyone with an Intel 13th/14th gen chip run Cinebench R15 a few times. If they can get through that, they should be okay.
Something about R15 and the instruction set it uses is much harder on a 13th/14th gen Intel than R23/R24. If your chip can’t get through R15, add more voltage lol. Then, once it can run reliably, add another +0.010V and you should be good for never seeing the “GPU out of video memory” error.
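The add-voltage-until-stable routine described above can be sketched roughly like this (everything here is a made-up stand-in: the `run_stress` helper, step size, and starting voltage are hypothetical, and on real hardware the “stress test” is a few Cinebench R15 runs and a manual BIOS voltage bump, not a function call):

```python
# Sketch of the voltage-margin routine: bump the load voltage in small
# steps until the stress test passes a few times in a row, then add a
# final +0.010 V safety margin. The stress test is simulated below.

STEP = 0.005          # volts added after each failed run (assumed step)
MARGIN = 0.010        # final safety margin once stable
PASSES_NEEDED = 3     # "run Cinebench R15 a few times"

def find_stable_voltage(start_v, run_stress):
    """Step the voltage up until run_stress(v) passes PASSES_NEEDED
    times in a row, then return that voltage plus the safety margin."""
    v = start_v
    while not all(run_stress(v) for _ in range(PASSES_NEEDED)):
        v = round(v + STEP, 3)
    return round(v + MARGIN, 3)

# Simulated chip: only stable at or above 1.250 V under this load.
fake_chip = lambda v: v >= 1.250

print(find_stable_voltage(1.200, fake_chip))  # prints 1.26
```

The margin at the end is what keeps the quick-ramping shader-compile loads from tripping the error later, since those spikes demand more than a steady-state stress test does.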
  5. I think so too. I’m debating ordering a waterblock for it.
  6. Yep! That would be nice. GPUs leave a lot to be desired, and it’s tough to land something perfect every time: good core, good memory, and no coil whine. Are you getting a 5090 HOF?
  7. Maybe they’re mad, or maybe not lol. There are many 4090s that can run a higher core clock limited to 1.070V than mine can unlocked at 1.100V 😂
  8. Sounds like your cards might be okay core-wise and memory-wise. I have tried three different BIOSes so far, and I did try locking the core yesterday to 1.100V. I physically can’t do 2,970MHz unfortunately. The Galax BIOS was working pretty well for me; I also tried the Gigabyte Gaming OC BIOS with the unlocked 1.100V, and also a newer Gaming OC BIOS from April 2024. I might just leave this one on air cooling and be happy with it, like my 4080 Super. I really enjoyed that card on air.
  9. It’s a nice GPU. Not quite CPU limited; even at 1080p and hammering 300+ fps, it’s all GPU limits in the games I play. @Mr. Fox how did your Gaming OC clock on air? And on water?
  10. So, this card doesn’t have a very amazing core unfortunately 🥲. I use Cyberpunk 2077 at 4K for testing, and I can run 2,955MHz solid! Any more is a no-go; even just 2,970MHz will crash lol. Temps are warm, about 69-72C, but that’s its limit using auto fan or even 100% fan. Now, the memory is absolutely solid at +1700 so far, and maybe it can do a wee bit more. My 4080 Super could handle about 3,060-3,070MHz pretty dang well. Anyway, it is what it is. I’m still happy with it! I imagine chilled water will help quite a bit. I know Cyberpunk at 4K sets a “real” limit for the most part; those RT cores either can do it or they can’t, it seems. Testing was done with the Galax HOF 666W BIOS.
  11. It’s sold 😃 (going to Canada). I was very tempted to just keep it as a backup or spare high-end GPU, but then that didn’t make much sense. Plus, I’m not that rich. Haha. The 4090 is nice! I like it a lot so far. It’s strong. I saw a 42% higher Superposition score over my 4080 Super. 🤩 Also, what sort of memory overclocks are people getting out of 4090s?
  12. The RTX 4090 landed today, and I love it. Massive step down in build quality compared to my Suprim X, but I don’t mind. No fires or melted connectors so far, even getting close to 600 watts of GPU power with an overclock in the Nomad bench and some games. I run this same little MSI Gen5 PCIe dinky PSU with a single power cable for the GPU, and it’s performing like a champ! Even powering (5) D5s, a Z790 Apex, a 14900KS, and a 4090! That tiny little 1KW PSU gets it done without making a sound. (A quick peek behind the desk reveals the tank and chiller) 😄
  13. That’s a nice CPU so far. I would also be after that new ASRock ITX Z790 motherboard for sure, and not even consider anything else. I think with the right CPU, DDR5 8400C38 can be done on one of those; it’s really just about the CPU’s ability. Honestly, I have heard of people hitting 8600+ with those little dinky ASRock ITX boards with water-cooled RAM and a golden IMC.
  14. You know, when I was playing Cyberpunk, I turned my E-cores off and on out of curiosity in the video below, so essentially a straight 5.9GHz 8/16 CPU. I noticed some hitching periodically, and some GPU usage drops periodically too. Turning the E-cores back on fixed this right away: I had consistent GPU usage when the E-cores were on, and it was just crazy smooth, but not with the E-cores off. It really makes me wonder if there actually is an advantage with Intel and these E-cores handling Windows background stuff in certain titles, one that no one has mentioned as a possible problem for these 8/16 chips. I mean, if my 14900KS with E-cores disabled, only 8/16 at 5.9GHz with DDR5 8600 and a 4080S, can’t handle the heat, then a 7800X3D certainly cannot with a 4090+ GPU, right? Enabling these E-cores alleviates the dropped GPU usage and restores the smoothness. I caught this in my video below and just found it interesting. It always made me think a 5800X3D or 7800X3D would never be enough core-wise for some of these well-optimized newer games, where 8/16 is just not enough running a 4090. This leads me back to my memories/experience with the 11900K chips (those were not enough either with a 3090 at times). This is 1080p, so it’s the ultimate CPU test lol. But still, I have to give props to the 14900KS’s performance as a whole. In the video I start out with E-cores on, then I shut them all off, then I re-enable them.
  15. I tried the Patriot 8200C38s when they first came out. I loved the easy disassembly with screws, and they have a PMIC thermal pad too. Very nice kit. But the big misconception about RAM: I think all the 3Gbit stuff is equally good and capable of 8600+, and any kit is only as good as your IMC, even at lower speeds. My G.Skill 8200s required 1.500V to run 8400C40 on my R-batch and another CPU, but my current chip can run them at 8400C40 at 1.350V. I think all of these RAM kits are equally good up top, and all of them can probably reach similar speeds; we’re just IMC limited.
  16. Yes, I actually owned that one already (I was thinking it was the 2nd one or something). I’ll give it a test with my 4080 Super though and see how it compares. Can you share the graphics settings?
  17. It does look good. The Eisblock is still available though, and I may get it just because it’s less expensive. Alphacool is stepping it up for sure. All of their GPU blocks are less expensive than the EKWB, Heatkiller, and Optimus blocks, and based on the comparisons they perform almost as well as the Optimus in GPU temps, and actually better than the Optimus in VRAM temps.
  18. Have you seen the Alphacool Core V2? I’m gonna get one of these. It looks pretty nice. It wraps around the whole GPU, and the actual block is longer and covers the PCB. As for the terminal, it’s integrated into the block, so the terminal and block are one piece.
  19. Hey @Mr. Fox, I noticed you bought an Alphacool Eisblock for your 4090 Gaming OC. But did you buy the V2 version, since you have a 4090 Gaming OC V2? I’m looking at the three (3) versions Alphacool makes for the 4090 Gaming OC: #1 Alphacool Eisblock Gaming OC, #2 Alphacool Eisblock V2 Gaming OC, #3 Alphacool Core Gaming OC V2. I really like the Alphacool Core V2; it wraps around the whole PCB, but it’s a little more expensive. I think since you had a solid experience, I’ll grab one. I don’t think I’ll leave the 4090 on air 🤣
  20. Not sure why the P5810X is so expensive either; that is an interesting one though. But on price to performance, the Intel P5801X wins. A lot of guys in China are running these E1.S gum-stick-style P5801X Optanes. There is a website called Bilibili; it’s like Chinese YouTube lol. There are many videos on there about the P5801X, and some even show its performance in benchmarks. See the one video below: it has an unboxing of the P5801X, and it shows off that very hefty heatsink they have and the really well-engineered design. And they still get very warm. But they are a nice, quality-looking piece of storage. They are very tall and thick; I am assuming this is to keep them from throttling, since they consume so much power for something so small. https://www.bilibili.com/video/BV1Qp421y7N5/?spm_id_from=333.337.search-card.all.click The drive below is Gen4, so you’d have those super fast speeds. But the latency from these things is so low, and such high IOPS would make this drive absolutely superior in speed and feel for daily Windows use compared to those traditional Gen4/Gen5 M.2s.
  21. I thought the new Gigabyte Z790 ITX motherboard was really good at ram OC? I’d go for that.
  22. Intel Optane is the best. I’m still going to get a P5800X. They are expensive like all get-out though. I really want an 800GB one as an OS drive, and those are just way too much right now. I almost pulled the trigger on a 400GB P5800X a few times. If you want to see a good deal though, go check out the Intel P5801X on eBay. It’s an M.2-style drive but uses the E1.S connector; these are actually faster than a P5800X and can easily consume 20+ watts of juice. You can run an E1.S-to-PCIe or E1.S-to-M.2 adapter. They also have higher read/write speeds and more IOPS than a traditional U.2 P5800X, and they have a massive heatsink attached to keep them cool. There are two SKUs on eBay, “BEF” and “BF1”: the BEF is an engineering sample, and the BF1 is a retail sample. But they are blistering fast and usually only $460 for a BF1. I’m gonna snag one next week or the week after probably, plus an adapter; I got sidetracked with another hardware purchase, so I held off. PS: if you didn’t know, the E1.S connector will essentially be the replacement for the M.2 connectors we use now. They are hot-swappable and capable of much higher power draw, and newer enterprise gear seems to be using them. They are really cool though. I did a lot of research on these below.
  23. So if a 4080 Super can do the above, just imagine what a mid-range GPU is capable of. Something still very impressive, I’m sure. I think if one is gonna run a mid-range GPU at 1080p, CPU+RAM tuning helps a lot.
  24. Nice! I’ll go get it. Thanks. Take a look at the monstrous performance of a stock 14th gen Intel chip, tuned DDR5, and a clogged-up “non-modified” Win11 install lol. It kind of blew my mind seeing 1080p render at such consistent GPU usage even with FPS over 300. I think these chips can easily handle whatever next-gen Nvidia can throw at them comfortably. It also shows how much the E-cores really help: they keep the game smooth and the GPU usage high.
  25. Alright, awesome! And good luck! Maybe it will have an IMC like this one.