NotebookTalk
Everything posted by Mr. Fox

  1. Yeah, I have to agree that MSI caters to gamers more than overclocking enthusiasts. Unfortunately, the same probably applies to ASUS, ASRock and Gigabyte. I loved the Unify-X, but it is the only MSI motherboard I have owned, so I can't legitimately comment on their track record for reliability. I know Brother @johnksss had a bumpy ride with his Unify-X. I had to RMA my first, brand new Unify-X because the BIOS update applied on day one bricked it. NewEgg swapped it out, and I had no issues at all after that.

The sad reality is an EVGA Dark mobo is no longer a future option, and almost anything else is repulsive to me. I was also puzzled why MSI did not release a Z790 Unify-X, but the bottom line is that Z790 was a pointless product, created just to have something to sell. It is architecturally the same product as Z690 with some features removed; in some respects Z790 was a downgrade. The only reason many Z790 boards overclock memory better is that motherboard manufacturers shift all of their attention to the new product. My Z690 Dark overclocks memory as well as or better than the Z790 Apex, and we know the Z690 Apex was a failure because of ASUS's incompetence, not because of the Z690 chipset. Because Z790 was not an actual upgrade, perhaps MSI decided it was a waste of time and resources to release a Z790 Unify-X. I'd probably agree with them.

I have never tried playing COD titles with a lower-spec'd system. I have always found Crysis and COD titles to play extremely well, while many people complain. But I do not buy low- and mid-range PC components, and I always try to have far more CPU, GPU and memory resources than necessary. Perhaps that is why I have never seen any reason to complain about them.

Awesome that EVGA can send that stuff to you. Are they sending thermal putty as well? I bought some from them for the 3090 KPE once, but it was no longer in stock the next time I tried to buy a tube of it.
  2. It is interesting how different experiences can be. My first, second and third EVGA motherboards were replacements for failed ASUS motherboards that took more than a month to get replaced under warranty. I have owned only three EVGA power supplies and none had any issues: one 850W G (Gold) PSU and two 1600W SuperNOVA P2 (Platinum). I did have the Q-code LED fail on the Z590 Dark, but the board still functioned correctly. The replacement took 3 business days, zero downtime, and cost me nothing (cross-ship RMA). It would have taken 3 to 6 weeks to get it replaced by ASUS, and I would have incurred the shipping cost to send the product to them. Chances are also good they would come up with a lame excuse to deny warranty, like bent CPU socket pins that were not bent when sent, or a speck of thermal paste on the PCB or edge of the CPU socket (not in the pins), as we have seen more than once from people we know.

My experience with EVGA has been stellar, and of such high satisfaction that it would be difficult to say I am not a fanboy. Perfect? No. But excellent, because the products and the service are both superior. I am very sad they are going under. My passion for high-performance computers may actually be ending because of it. I am not sure it is worth the hassle of having to deal with nonsense from other brands.

Speaking of 4K gaming, I think I met my annual quota for gaming. In typical Mr. Fox fashion, I hardly ever play games, but when I do I binge on something I like. Most of the time it is play for an hour or less, lose interest, and uninstall the title to make drive space for something I like. Yesterday I binged on Call of Duty Black Ops Cold War. The price finally fell to a point I was willing to pay ($20). I installed it and played the entire campaign from start to finish in one day. It was buttery smooth on my 4K 144Hz ASUS display. (Yes, it is a good ASUS product. There are a few.)

Set to max quality settings with my CPU at 59x on P-cores, 47x on E-cores, 50x cache ratio and memory at 8200, with the 4090 at stock clocks, +100mV and max power limits, it averaged ~225 FPS, often over 300 FPS. Cold War used 13GB of system memory and 16GB of VRAM to deliver a top-notch experience. This is also another example of mixed experiences: I have almost every COD release and have thoroughly enjoyed every single title in the franchise, but many people don't like COD.
  3. True, but one bad experience is quite different than multiple bad experiences, and the repeated lousy warranty fulfillment experiences are like soft-serve feces on top of the dung-flavored cupcake. Whether it's because I am foolishly optimistic, or just a fool, ASUS has been given way too many second chances. Nobody can say I haven't given them a fair shake; the count of products registered in my ASUS account (only one of which I still own, in terms of mobos and GPUs) proves as much. But, as Brother @Papusan suggested, "What other choice do you have?" None, if you want a 2-DIMM current-generation motherboard. So the option is to roll the dice, clench your butt and hope for the best. When it comes to graphics cards and other products, no, they are not even open for discussion at this point. EVGA is no longer in the business, otherwise they would be my first pick for everything they sell. If and when the day comes that I decide to go with Z890, my first pick would be a Unify-X. Ultimately I will have to settle for whatever options exist in a 2-DIMM board. I won't buy a 4-DIMM model no matter what the brand is.
  4. I need to be careful not to jinx myself by saying something, but the Z790 Apex is the only ASUS enthusiast product I have owned that hasn't been a piece of garbage or failed. It is almost as good as the Z690 Dark. Hopefully it will last. Every other flagship part I have purchased from ASUS has been trash or died. That is the reason for the hate. It was earned. The sad part is their budget and mid-range gamer stuff has been reliable (Prime and Strix mobos) but kind of mediocre in terms of features, performance and build quality. If that's not enough, a warranty claim with them will push you over the edge. Slow and painful, and they treat you like it is an inconvenience to make things good on your expensive product that failed. You'll be down for a month or more unless you buy something else to replace it.
  5. I haven't on the Apex or the Dark. I did on the Unify-X and Strix Z690 motherboards. It didn't really make much difference when I did. Enabling the ASPM and DMI power management that are disabled by default was the only thing that made a meaningful improvement. Hmmm... another half-a$$ed GPU with the Strix branding. Am I surprised? Nope. The name no longer means anything. I think ASUS only does things well by accident. It is difficult to respect them as a company anymore. Poor QC, severely overpriced and horrible warranty service. Should be a deadly 3-strike failure, but the fanboys keep the brand alive in spite of the shortcomings.
  6. I am not able to enlarge the images enough that the text becomes legible. They remain as thumbnails even if I right-click and open in a new browser tab. Edit: it looks as if you uploaded them to the forum rather than an image hosting service. That is why. The forum resizes them to save space and basically ruins the images. CapFrameX is wonderful software, but it is not a native in-game benchmark. I mention this for the benefit of anyone reading that is not familiar with it. I do prefer a native in-game benchmark when one exists. It is unfortunate that it is not mandatory for all published games to include a benchmark tool that runs a very specific scene, like those that do. The ability to do comparisons between systems is better that way. It levels the playing field much better.
  7. They should focus on calling out the most disgusting, unacceptable garbage and stop pretending there is importance to stupid things like whether it is cute or charming. I mean, like, who gives a flying you know what about what they think. It either looks like a normal GPU or it looks like crap. There's no middle ground. Only varying degrees of ugly trash. Maybe the novelty of exalting performance above all else and calling balls and strikes on ugly trash is too hateful or Machiavellian. Changing the subject... two things. First... What is in the box under the crocs, brother? Did you show us? Second... On a positive note, Brother Martin is doing another much-needed annoyance fix for HWiNFO64. I'm glad I asked. This will make me very happy because the feature that needs to be muted sucks. https://www.hwinfo.com/forum/threads/feature-request.9177/
  8. People that buy GPUs like these don't and shouldn't care about whether it is the fastest of the crappy class. At the $500-$600 price range it's going to be a sucky compromise compared against anything because the class is built on that as a foundation. They should just be happy they got something cheap because it wasn't good enough for enthusiasts. If it puts an image on the screen and plays games smoothly using medium and low quality settings that's what they paid for. There is no reason to show any concern one way or the other. It's a cheap product. Que sera sera.
  9. We are living in the darkest days our nation has ever seen. Wonder how much longer until it becomes a felony to publicly say what I just said? They're coming for the children as well. The quest of evil to destroy good has no bottom end. Raise them to be a pervert that loves corruption and they'll believe all of the other B.S. they're told. Parents are just getting in the way so they need to go to jail, too.
  10. I'm not sure. I literally never run the CPU at stock, so I haven't checked. I suspect it would. If it passes Cinebench R23, Y-Cruncher and what-not, I doubt there is an instability in the overclock.
  11. I can't get it to work correctly except when using static voltage. With adaptive, I think it gets too much vdroop even using L8 LLC. If I could disable Guardband on the Apex I think it would do better; I run with it disabled at all times on the Z690 Dark. I will have to run it on the 13900K in the Dark and see how it does. I'm tempted to put the Dark back on the open bench and make the Apex my "work" PC. I can't figure out how to disable Guardband on the Apex. I can adjust it, but I don't see a way to completely disable it. CPU: https://hwbot.org/submission/5351245_ | GPU: https://hwbot.org/submission/5351246_
  12. Yes, that is true. You can lower the PCH temps by enabling power management in the BIOS for ACPI/ASPM/DMI. The same was true on the Unify-X and Crosshair VIII X570: both of them had a crazy hot PCH, and it does hinder memory overclocking stability. So did the Strix Z690-E. Not sure why they don't do a better job with the heat sinks. The Z690 Dark has a very cool PCH, but the heat sink for it weighs almost as much as the entire Apex mobo, LOL. I actually used a big heatsink with fans and a thermal pad on top of the stock heatsink on the Unify-X before I found out that the power management being disabled in the BIOS was causing the PCH temps to soar.

Yes it is. Not sure why. I think there is something wrong with the benchmark, to be honest. I think it is doing something weird with cache or memory. It's not as stressful as Cinebench R23 and my CPU doesn't get as hot, but it still takes lots of extra voltage for some reason. It's strange. I haven't tried lowering the cache ratio or decreasing the memory overclock to see if the behavior changes, but something unusual is going on. It seems to waste a bit of time "preparing" for the benchmark before it starts. Not sure what that's about. Sometimes it won't start rendering. Sometimes it stops responding without crashing or freezing. Other times it runs normally. I think it's just buggy. On top of that, I find it really annoying to have to remember to manually change it from a timed stress test to a benchmark using the "off" option. I just tried to open it and got a BSOD a minute ago. Yet I can run any other version of Cinebench, OCCT or the AIDA64 stress test without issue, so yeah, it's buggy.
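For anyone who wants the OS side to match the BIOS side: Windows also gates PCIe link power management through the active power plan. As a sketch (SCHEME_CURRENT, SUB_PCIEXPRESS and ASPM are standard powercfg aliases; the 0/1/2 value meanings are taken from the setting's own options, so verify with `powercfg /query` on your machine), the equivalent toggle from an elevated command prompt looks like this:

```shell
:: Allow PCI Express Active State Power Management (ASPM) on the active plan.
:: Values: 0 = Off, 1 = Moderate power savings, 2 = Maximum power savings.
powercfg /setacvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 2
powercfg /setdcvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 2

:: Re-apply the active scheme so the change takes effect immediately.
powercfg /setactive SCHEME_CURRENT
```

Note this only permits ASPM from the OS; if the BIOS option is disabled, the links never train into the low-power states regardless of the plan setting.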
  13. I will not be surprised to learn that it is accurate. It's not very hard to cool memory chips, and I could see having no delta being relatively easy to achieve with water flowing through the jackets. My liquid-cooled memory on the Z690 Dark is rarely ever more than a 5°C delta from the water temperature, with only the block on top to passively cool the memory jackets. The block on top alone drops the memory load temps about 30°C. The issue is that memory starts having errors at relatively low temperatures. But the chips don't get terribly hot, which is why the half-assed heating blankets most of the retail kits ship with are so inexcusable. My naked green sticks run cooler with nothing but a fan blowing on them than expensive retail kits do with so-called "heatsinks" on them that actually tend to increase the temperature rather than reduce it.
  14. The real benefit will be seen in what, if anything, it does to extend my memory overclocking. At 8200 I seldom see more than 32-34°C under stress testing with the liquid-cooled memory on the Z690 Dark. It is definitely not a bad price for such a complex design, with water flowing through the RAM jackets. Active cooling directly on the memory ICs is pretty amazing when you think about it. The price is really decent, as long as it yields a benefit in overclocking capacity. If it doesn't, then it will just be an extra-fancy setup that I spent more on than I needed to.
  15. Ordered. Thanks for the heads up. It's about twice as much ($112 USD including shipping) as my other liquid cooled memory components cost me, but it will be interesting to see if keeping the memory below 30°C improves overclocking. If not, then I will know not to spend extra next time. Also nice that now I can get rid of the nasty G.SKILL RGB heating blankets and rainbow puke for good. I really hate RGB memory. Such a stupid thing, LOL. Finding good memory options without RGB garbage should be much easier.
  16. It looks like all they have is the jackets for the modules, but not the block that goes on top. Were you able to snag the block that mounts on top of the RAM jackets as well?
  17. If both of your motherboards are 4-DIMM and the Tachyon is 2-DIMM, that more than likely accounts for most, if not all, of your limited memory overclock capacity, if the scenario is the same as Intel. I'm not aware of any amount of skill that allows a 4-DIMM Intel mobo to match 2-DIMM models. I think it's totally illogical that they even manufacture 4-DIMM enthusiast mobos. I would avoid 4-DIMM boards like the plague. This is why the Tachyon, Unify-X, Dark and Apex boards rule, at least on the Team Blue side of the house. They should stop manufacturing 4-DIMM boards for gaming and enthusiast applications. They are a waste of money. My 12th Gen Celeron 2C/2T netbook CPU runs 8000 stable on the Dark and Apex. IMC and mobo quality certainly matter, but not as much as topology.
  18. I am not really sure what is best on your AMD system. If I recall correctly (and I may be a bit foggy now since it has been a while), X570 starts getting crazy latency and diminished returns if your memory clock is not at or near synchronized with FCLK. I found it difficult and frustrating with the 5950X. Running DDR4 at 4400 is better than most people could accomplish.
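The coupling described above comes down to simple arithmetic: DDR4 is double data rate, so the real memory clock (MCLK) is half the advertised transfer rate, and on Zen 3/X570 latency stays low only while the Infinity Fabric clock (FCLK) runs 1:1 with MCLK. A minimal sketch of that sanity check (the function name and the 1:1 rule of thumb are mine, not from the post):

```python
def ddr_sync_check(ddr_rate: int, fclk_mhz: int):
    """Check whether an AM4 memory overclock can run FCLK 1:1 with MCLK.

    ddr_rate: effective DDR4 transfer rate (e.g. 3800 for DDR4-3800).
    fclk_mhz: Infinity Fabric clock in MHz.
    Returns (required MCLK, True if FCLK matches it 1:1).
    """
    mclk = ddr_rate // 2  # DDR transfers twice per clock cycle
    return mclk, mclk == fclk_mhz

# DDR4-3800 with FCLK 1900 is the classic 1:1 sweet spot on Zen 3.
print(ddr_sync_check(3800, 1900))  # (1900, True)

# DDR4-4400 would need FCLK 2200 for 1:1 -- beyond what most Zen 3
# samples can run, so the fabric falls to a 2:1 divider and latency suffers.
print(ddr_sync_check(4400, 1900))  # (2200, False)
```

This is why 4400 on AM4 is impressive on paper but can bench worse than a synchronized 3800: the desync penalty outweighs the raw bandwidth gain.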
  19. Unfortunately, that's the problem with way too many things in our lives right now. If you put a blank spot where the word "game" is, you will never run out of words to insert there. That's what I've been saying all along... I need a 1.200V+ vBIOS or a hardware mod that works in order to get what I paid for. Increasing the power limit is a waste of time when you don't have the ability to consume it, and getting past the 1.100V baby-girl voltage cap is the only way to get there. As awesome as it might be, the 4090 is castrated by insufficient voltage. But you know there are going to be some numbskulls out there trying to undervolt, LOL.
  20. LOL... 100K in attendance and not even one person showed up. 🤣 Justice served.
  21. I figured something out that applies to 4K. The DPI override never made any difference on 1080p or 1440p to my eyes. But 4K scaling still leaves something to be desired, with text rendering not being nice even with ClearType optimized; fonts still look fuzzy. Right-clicking an .exe file (any .exe file you want), opening Properties > Compatibility > Change high DPI settings, and checking the override option (scaling performed by Application) greatly enhances the quality of text rendered by an application. In the image below, the CPU-Z on the left is with the tweak applied. The one on the right (which is much uglier) is default Windows DPI. No brainer. I have been applying this tweak to essentially every executable. The before/after is remarkable. I don't know why this is not a default configuration option for Windows 10 and Windows 11. It should be.
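If you want to apply this override to many executables at once instead of clicking through each one, the checkbox is stored per-exe as a compatibility layer flag under HKCU. A hedged sketch follows: the key path and the "~ HIGHDPIAWARE" flag string match how AppCompatFlags layers are commonly documented, but the helper names are mine, so spot-check against an exe you set through the UI before scripting it broadly:

```python
import sys

# Per-user compatibility layers; each value name is a full exe path,
# each value data is a "~"-prefixed list of layer flags.
LAYERS_KEY = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

def dpi_override_entry(exe_path: str, mode: str = "HIGHDPIAWARE"):
    """Build the (value_name, value_data) pair for a per-application DPI
    override. HIGHDPIAWARE = "scaling performed by: Application"."""
    return exe_path, "~ " + mode

def apply_dpi_override(exe_path: str) -> None:
    """Write the override for one executable to HKCU (Windows only)."""
    if sys.platform != "win32":
        raise OSError("per-exe DPI overrides only exist on Windows")
    import winreg  # deferred so the builder above stays cross-platform
    name, data = dpi_override_entry(exe_path)
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, LAYERS_KEY) as key:
        winreg.SetValueEx(key, name, 0, winreg.REG_SZ, data)

# Example: the registry entry the UI tweak would create for CPU-Z
# (hypothetical install path).
print(dpi_override_entry(r"C:\Tools\cpuz\cpuz.exe"))
```

Because the flag lives under HKCU, it follows your user profile and needs no admin rights, which is also why the Properties dialog can set it without elevation.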
  22. OK, so after testing multiple (five in total) stock and modded versions of Windows 10 and Windows 11, the winners (combined CPU and GPU performance) are:

Windows 10 22H2 - AtlasOS https://atlasos.net/downloads
Windows 11 21H2.2003 - Oprekin.com https://oprekin.com/threads/win11-21h2.411/post-6791

The worst CPU performance was the latest version of Windows 11 (22H2), fully and indiscriminately updated with all available Redmond Reprobate cancer filth installed. GPU performance is still good, but the CPU performance is really lousy. The worst GPU performance was Windows 10 21H1, which also had the best CPU performance by a tiny margin, but GPU performance was bad enough that overall 3DMark Time Spy scores were consistently the lowest of all five OS versions. When I have time to create them I will post some Excel graphs.

In chronological release date order, the tested OSes were:
Windows 10 20H2 (Defender removed, otherwise stock with no updates, just tuned)
Windows 10 21H1 (Defender removed, otherwise stock with no updates, just tuned)
Windows 10 22H2 - AtlasOS (fully updated, tuned further - Defender removed)
Windows 11 21H2.2003 LitePlus - Oprekin.com (no updates, tuned further - Defender removed)
Windows 11 22H2 (no mods, fully updated and tuned - Defender still functional)

Unfortunately, most of the garbage from Micro$lop Studios does that now. Even if it works on the current version of Windows when you install it, some titles will stop working until you update to the latest Windows version filth, telling you that you need to upgrade/update to the latest version of the OS to continue playing the game. No, I'm not kidding. I refuse to purchase trash from these losers any more because of their control-freak ways. Gears of War 4 was the last title I purchased for that very reason. It also required that you use the XBOX for Windows App. What is super-stupid about it is that Gears of War 5 was released on Steam and runs fine on Windows 7. @Raiderman @Clamibot
  23. Yes, a couple of different ones, in fact. I am testing something that may be relevant in terms of the better option to share. I will send you a PM and link after I confirm my recommendation, and I will share the results of that testing here in the thread if it is worth sharing. I think it will be. I purchased it on eBay. One thing to keep in mind when deciding between LTSC 2019 and 2021: while the GUI is better on 2019, the overall performance of 2021 has been better for me for both CPU and GPU, especially graphics.