NotebookTalk

Everything posted by Mr. Fox

  1. I delidded the first 12900K, the second 12900K and now the 12900KS. I heated the first one but not the next two. Heating did not seem to help anything other than making the CPU more difficult to handle. Just a heads up: in all three cases the delid caused minor damage to the 12th gen IHS and will probably do the same on 13th gen. The first one (that I heated) had its top contact surface slightly deformed and I had to lap it to make it flat. It works flawlessly. On the second 12900K and the 12900KS the little "wing" that the ILM presses on received minor scarring where the delid tool presses against it. In both cases I easily fixed it by sanding the bottom surface where the rubber seal makes contact, which also cleaned off all of the old rubber gasket at the same time as fixing the burr on the wing. I mention this so it does not come as a surprise. In all cases the delid tool screws were very tight, so it seems like the manufacturing tolerances are off a little bit. All previous gen Rockit delid tools popped the IHS off without a blemish. I had definitely grown complacent after doing about 20 CPU delids and finding the process to always be quick, easy and painless. The 11900K disaster made me approach the process with much greater caution and care. The delay in delidding the 12900KS was more pure laziness than worry about how it would turn out. Of course, having a chip with high silicon quality is a good reason to be extraordinarily cautious. I used to delid all CPUs before installing them for the first time, because installing first and delidding later struck me as a waste of time and thermal paste. I have never seen a CPU that doesn't benefit from it, but now I am more inclined to see what the temps are with the solder first and evaluate the potential benefit of a delid against the risk.
     Hi bro. We have missed you and I was planning to call you to make sure you were still happy and healthy.
I suspect you have been busy with the new job, etc. I have not done anything yet on the 2080 Ti. It is not registered in my name, but I am assuming it might have a little bit of time left on the warranty. I was going to try to fix it, but it might make more sense to put the hybrid thing back together and RMA. Can you check the status for me and let me know how much (if any) time is remaining on the warranty and if we can arrange a cross-ship at my expense? I did not want to bother you with this since I suspect you have been very busy. Plus, I didn't want the warranty question to appear to be the only reason I would call to check on you.
  2. I updated to 2.03 and it's working fine, ready for 13900K. Weird thing though, now Windows 11 has non-functional USB ports. Older versions of Windows work fine. I tried flashing in Windows 11 on older BIOS and it gives an error about "Intel ME Device not found" and reboots. Lovely POS from the Micro$lop Mafia.
  3. I figured I should stop being lazy and delid the 12900KS before it goes in Banshee. As expected, a nice 10-12°C drop in load temps, and it allowed me to lower vcore from 1.360V @ 54x all P cores to 1.340V. As many times as I have done this, it still makes me nervous since the first 11900K that got de-die'd rather than delidded. I never batted an eye before and treated it as nonchalantly as a repaste. 86°C core max on ambient water... not too shabby. Just under 400W. No special tweaks or tuning here... ran it on my crash test dummy cancer OS. How is it? I heard there were reports of issues in the EVGA forum, but haven't gone to investigate yet. Just checked. There is a 2.03 beta out now to fix issues with 2.02. BETA BIOS Updates for Z690 DARK K|NGP|N (2.03) / CLASSIFIED (2.03) [Fixes 53,54,7F loop]
  4. Newer is better, right? https://imgur.com/gfRxA0E NV_Burn.mp4
  5. Another wonderful example of change for the sake of change. It's new, so it must be better, right? I wonder if NVIDIA gets a "license fee" on new PSUs sold with their crappy new proprietary connector?
  6. That's awesome. The focus should always be on the results. Being obsessed with power draw is silly and a distraction. The way I look at it is, "it is what it is" and what I should be paying more attention to is everything else. The right questions to ask are things like "how did I do?" and "what is my rank?" and "can I beat the person in front of me?" and "what do I need to do in order to do that?" Power draw is merely a natural byproduct of that pursuit. If I let it become a focal point, success will elude me. Once you have a baseline on what is normal, increasing power draw while simultaneously increasing performance confirms you are on the right track with tuning. The only point at which it matters is whether or not you have the necessary hardware to tame it thermally. Determining success based on a preconceived bias about what that number should be, including a conclusion that it is "too high," is arbitrary. It tells me that person is potentially misguided and lacks understanding. Yes, that includes lots of famous people on YouTube, many of whom are legends in their own minds. They may be worth putting up with if they are a source of useful information that I can leverage. A massive subscriber base is a measurement of popularity, not authority. The abortion we see with modern turdbooks is a great example of what the end result looks like when the wrong thing becomes most important. That is a mess, not a success. No thank you... It draws less power, but it still overheats. Somebody with an engineering degree sucks at their job.
  7. It would be interesting to know how to translate that against an ASUS SP rating, or if doing so is even feasible due to lack of data to validate the conversion. @johnksss what is your MSI "CPU Force 2" rating?
  8. Everything remained the same with my SP rating after the firmware update. I was even able to import my saved BIOS profile with all of the old settings without error, so that was a real time-saver. I flashed from 1504 to 2103. I totally skipped 2004 and had no issues.
  9. That is what I am praying happens. Seems like the right thing to happen. Yes, I have. I have spent hours trying different things, including that. It is definitely something with the mobo. Even my custom 4500 XMP profile that ran as smooth as silk on the Z490 Prime, Z490 Strix, Z590 Dark, Z490 Dark is unbootable. It never trains and dumps me back on the boot screen with the F1 option to try something else.
  10. He would not do that to me, or anyone else, without full disclosure and letting them decide if that is OK. Looking forward to getting it. Crossing my fingers. I already flashed the 13th Gen update to the Strix D4 to check the SP rating before moving it to the Z690 Dark. Now I am wishing I had not purchased a DDR4 mobo, but can't cry over spilled milk. It is what it is. It's actually a good mobo, just limited on RAM clocking. It can't overclock DDR4 as well as the Z490 mobos I used these same modules in. They were stable at 4500 on the Z490 Strix and Dark. Now they are only bootable at 4200, stable at 4000 on the D4 for some weird reason.
  11. Same place that @johnksss and @Talon purchased theirs, with the same 2-year Micro Center warranty. @tps3443 - this is what I was referring to before about correcting the high PCH temperatures. I showed this to you once a long time ago, in case you did not remember: see this post and change from "disabled" to "auto" and they should drop dramatically. If MSI changed the default to "auto" in the new BIOS you may not need to do that. https://forum-en.msi.com/index.php?threads/crazy-pch-temperature-on-z690-carbon.370039/post-2096947
  12. Yes, I use those as well. They are very handy, and I always keep spares in case I ever need them. I also use these. https://smile.amazon.com/gp/product/B0975XH3JS (new and improved design)
  13. It is brand new from Chicago Micro Center. If the binning is no good it will get exchanged for a better one or I will get my money back. Once that is settled, I will move the 12900KS to my work computer and sell the golden 12900K in the work computer to offset the cost of the 13900K. That is very much a no-no. Good way to kill a mobo, or burn up a fan controller. A single Delta fan is enough to kill it. @tps3443, don't risk it. It's not worth it when $10 is enough insurance. It will get power from SATA and only use the speed control signal from the mobo. https://www.amazon.com/ARCTIC-Case-Fan-Hub-Distributor/dp/B0887VG14J/ref=sr_1_3?crid=IBF6J65W1NOV (This can handle a maximum of 1.0A per fan port. Most fans are less than 0.5A each.)
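The "$10 insurance" math above is easy to sanity-check. A rough sketch of the current budget, where the 1.0A-per-port figure comes from the hub listing above and the Delta fan draw and motherboard header rating are illustrative assumptions:

```python
# Quick current-budget check: fan draw vs. a header or hub port rating.
# The fan draws below are illustrative assumptions, not measured values.

FAN_HUB_PORT_LIMIT_A = 1.0   # per-port limit quoted in the hub listing above
MOBO_HEADER_LIMIT_A = 1.0    # typical motherboard fan header rating (assumption)

def within_budget(fan_currents_a, limit_a):
    """True only if every fan stays at or under the per-port current limit."""
    return all(current <= limit_a for current in fan_currents_a)

delta_fan_a = 1.8  # hypothetical high-power Delta server fan draw (assumption)
print(within_budget([delta_fan_a], MOBO_HEADER_LIMIT_A))   # prints False
print(within_budget([0.35, 0.45], FAN_HUB_PORT_LIMIT_A))   # prints True
```

In short: a single high-amperage server fan can exceed a header's rating on its own, while the SATA-powered hub gives each fan its own 1.0A port and takes no current from the motherboard.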
  14. It's going to vary some based on a variety of conditions. The higher you clock the memory and the tighter you make the primary timings, the more potential tREFI has to cause instability. Higher tREFI also takes more voltage. Luumi has a couple of good DDR5 videos that touch on this. Even if you can't get it to pass memory tests without errors, I have generally found it stable enough for benching when it is maxed out. I've seen some difference from one memory set to the next.
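For a sense of scale on what "maxed out" means here: tREFI counts memory clock cycles between refresh commands, so converting it to time shows how far the stock refresh interval gets stretched. A minimal sketch, assuming DDR5-6400 and a ~7.8 µs baseline interval (both illustrative numbers, not taken from the post above):

```python
# Convert a tREFI setting (in memory clock cycles) to a refresh interval in µs.

def trefi_to_us(trefi_cycles, mem_clock_mhz):
    """Refresh interval in microseconds: cycles divided by clocks per µs."""
    return trefi_cycles / mem_clock_mhz

mem_clock_mhz = 3200   # DDR5-6400 data rate / 2 (assumed kit)
default_trefi = 24960  # ~7.8 µs baseline at this clock (assumption)
maxed_trefi = 65535    # common upper limit of the BIOS tREFI field

print(round(trefi_to_us(default_trefi, mem_clock_mhz), 2))  # prints 7.8
print(round(trefi_to_us(maxed_trefi, mem_clock_mhz), 2))    # prints 20.48
```

Stretching refreshes from roughly 8 µs to 20 µs leaves more cycles for real work, but hot DRAM cells leak charge faster between refreshes, which is one reason a maxed tREFI that benches fine can start throwing errors as the sticks warm up.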
  15. I do, but naked DDR5 sticks run cool enough with a fan blowing on them. I think the errors occur at lower temperatures as clock speed increases. It seems that way based on my own experience.
  16. If it does truly lock it then it absolutely will help. Samsung B-die, SK Hynix and Micron all overclock higher with added voltage. You know this already because you're used to spending lots of time overclocking Samsung B-die on DDR4 sticks and you have seen how things don't work if you don't give the RAM enough voltage. Be sure to monitor the temperatures because those high clock speeds will actually fry your memory. DDR5 gets much hotter than DDR4 did. You'll also start to see errors running memory tests before the RAM gets hot enough to damage it. Errors begin to surface around 45-50°C and an unstable overclock can often become stable below 40°C. A fan blowing on them helps tremendously. Good question. If I had to guess I would say probably so, but that's just speculation.
  17. It's actually very simple. It's only a matter of turning it on. Enabling it removes the industry-standard 1.435V cap on DDR5 voltage. For some idiotic reason MSI removed that menu option, which meant that certain brands of PMIC chips could not exceed 1.435V.
  18. That was missing in the previous BIOS. It was there in an earlier one and MSI removed it. Awesome to see they brought it back and now you can really get everything that memory is good for. You might even be able to return the Hynix RAM if this performs as well now that it is unlocked.
  19. Yes. With the 2-year warranty from MC it was cheaper than from Newegg without a warranty and an icky "no refund - exchange only" policy. That is great news about the memory. Amazing how different CPU generations can be using the same mobo and memory. That is wonderful. Samsung B-die goes another round with the Titans! And, as a friendly reminder that the leaders of our industry are crooked, lying, worthless bastards, here is a screenshot to bring us back to a correct understanding of who these losers are. (Some of you have already seen this before.) Windows 7 is installed in UEFI mode with no CSM enabled (which is supposedly "impossible" - even EVGA Precision X1 assumes it's not because "it can't be" LOL) and the "Windows 10 only" Resizable BAR feature is enabled in the BIOS and in Windows. So much for the nonsense and fake compatibility baloney. Join me in offering up another nasty finger salute to the Redmond Retards and the Green Goblin.
  20. I agree with that. What is useful while maintaining value is often NOT what we want. But, it is exactly what most gamerboys want. They don't buy flagship CPUs and GPUs because they don't need to. Speaking of that: 3080, 3080 Ti, 3090, 3090 Ti, 4090... 6900 XT, 6950 XT, AMD 7XXX... all totally unwarranted for 1080p (and 1440p minus extreme settings) gaming with very high/Ultra settings. This 3060 Ti FTW3 provides a truly respectable gaming experience without spending a lot of money. Now, I certainly don't like that it is a downgrade from the 2080 Ti FTW3, BUT it would be silly to deliberately spend more on the system I use for work, and spending more is a waste of money for playing games. It is good enough that I might just sell the 2080 Ti FTW3 after I fix it. I can use the money for something else. Here are some samples... Quake 2 RTX Shadow of the Tomb Raider (with and without DLSS) Metro Exodus Enhanced (Extreme and Ultra - RTX On) Gears 4 @ Ultra Gears 5 @ Ultra Bright Memory Infinite (RTX High) And, here's why a waterblock is appropriate "just for gaming" LOL. Look at these LUDICROUS thermals. Totally asinine! And, that's all stock clocks. This little critter is clearly putting its best foot forward. Not bad for a sub-$400 purchase.
  21. WOW, the proof really is in the pudding, as @johnksss said, about the jump in memory. What is the BIOS showing for memory voltage? With the 12900K it was limited to 1.435V, and 6400 was the max with that voltage. Is it running 6800 with that voltage, or did the 13900K BIOS unlock the PMIC on that kit?
  22. This will help with annoying distractions and screenshot artifacts. Unless something has recently changed, he will need a ROG board. Prime motherboards do not show the SP rating, unless they added that with Z590 or Z690. It would be nice if the nitwits at ASUS actually listed this in their product specifications and features. It's definitely nice to know for the sake of conversation or resale value, but it's not a need-to-know and it doesn't change anything whatsoever. You either got lucky or you got screwed. All you need to know is shown to you in the way your system behaves, and it is obvious @tps3443 has an excellent CPU sample. As @Clamibot knows, the SP rating can be misleading if you place too much stock in it, for the same reason ASIC rating was often misleading on GeFarts GPUs when it was popular for gamer kids to obsess over it.
  23. Imgur is your friend... it also helps maintain image quality.