NotebookTalk

Posts posted by Clamibot

  1. 3 hours ago, Mr. Fox said:

They should be, but it should be 25% more than an acceptable FE base price, not screwed 25% harder than already screwed for 200% more than what an FE should sell for. None, not one, of the GPUs are worth the price they are being advertised at. Everyone that buys one is getting sodomized. And if the only thing available turns out to be shitty Windows 11 driver support, it will be getting shanked in the groin on top of the sodomization. There are already components on new motherboards that have no Windows 10 drivers. I'd love to round up all the idiots responsible for that, put them into a cage and drop the cage into the ocean. They are unworthy of oxygen.

     

Well that would explain why I can't get wifi working with my Asus Maximus Z790 Apex Encore no matter what I do. That's been making me feel like an idiot, but that would make total sense as the reason why it isn't working for me. Fortunately that's the only component that doesn't work, and I have a few USB wifi adapters lying around, so I'm just using my wireless AC one. It works well enough.

    • Sad 2
  2. 10 hours ago, 1610ftw said:

    Some of the old modular laptop crew may like this:

     

     

    It is an interesting concept to say the least and probably what he had hinted at in a previous video.

     

    Of course it is a bit of a bad joke that this unique watercooled more modular laptop that also sports a good keyboard layout is limited to 16" and rather basic memory and drive options.

     

     

    Clevo is going back to making modular laptops again? Awesome! Or is that a Tongfang model?

    Laptops just keep pulling me back in. I can't get away from them because the allure of portability is impossible to resist. Having said that, I also don't have a tolerance for laptops that are not modular, so I wouldn't buy such machines anyway. It takes something special like this to pique my interest, and it piques my interest HARD!

     

    Let's hope we get a proper 18-19 inch DTR with that modularity + external water cooling. That'll be fun for benchmarks and awesome for max performance in an ultraportable form factor (yes I consider 18 inch laptops ultraportable). Speaking of which, I'll need that special water cooling heatsink for my X170 to perform an upgrade to the RTX 3080. That should be interesting.

    • Thumb Up 1
    • Bump 1
    • Sad 1
  3. 7 hours ago, StripeySnake said:

    I forgot to post it last time, here is a sneak peek at the shorty config. 

    image.thumb.png.fbc82bf4fb248272d4ade7e0536cd0fb.png

    P.S. going forward I will be modifying the main design to allow compatibility with Pico style PSUs, as well as FLEX of course. I am currently working on parity between the expensive sheet cut sections so that every configuration will only require 3D printing to reconfigure, which is a lot cheaper than sheet cutting. I will also be mocking up an optional mod to the liquid cooled chassis to add two additional 40mm dual radiators (and fans) stacked with the 60mm dual rads for 1.6x radiator capacity. Hopefully I won't need it, but if the 60mm can handle 5950x/5700 xt, then adding these additional radiators might be able to further improve cooling for even more powerful graphics. 

     

I would say if you're going to replace the Flex ATX PSU with a Pico PSU, then being able to house 2 of those HDPlex 500W Pico PSUs would be awesome, as that would allow for very high power builds using a dual PSU config.

    • Bump 1
  4. 14 minutes ago, johnksss said:

    The 5070 is the new 4090 from what Jensen states....

    jptMFpuKTHgAhZdNL8sjF6-1200-80.png.webp

     

If the 5070 truly shifted performance up by 2 performance classes vs the previous generation, then the pricing seems a bit easier to swallow, but I'm still not going to allow them to condition me to higher prices. A 70 class card should still not be that expensive, but I digress. I'm conditioned to Pascal era prices.

    • Thumb Up 2
    • Bump 2
  5. Looks like Clevo is starting to pull their heads out of their butts. Still soldered crap, but it's a massive upgrade over the X370 abomination that is completely undeserving of the Xx70 moniker: https://videocardz.com/newz/clevos-x580-next-gen-laptop-specs-leaked-arrow-lake-hx-cpu-and-geforce-rtx-50-gpu

     

    This is a nice 18 inch Clevo DTR, so it's good to see them starting to go back to what they once made. It doesn't have an upgradable CPU or GPU, but they increased the drive slots from 3 back to 4 and also increased the RAM slots back to 4.

     

    Gives me hope that Clevo may return to full socketed models one day.

    • Thumb Up 2
    • Like 1
I just swapped the 144 Hz screen on my X170SM-G for a 300 Hz one. It was a pain in the butt to find the display cable for it, but it was worth it. I ended up finding the part on AliExpress and then waited around for the cable to arrive. I just did the panel and cable swap today.

    I also modded the screen to make it glossy. It looks amazing! I have a nice glossy 300 Hz screen for my X170SM-G now! The colors are better than on the 144 Hz screen and everything looks so much smoother on the 300 Hz screen (as it should).

     

    I think I'm going to keep going higher and higher with my screen refresh rates as time goes by. My ultimate goal regarding screens is to acquire one where motion on it looks like real life (super duper ultra smooth). The 220 Hz screen I have for my desktop pushed me to get this 300 Hz one for my laptop. I'm very happy I got it!

     

    • Like 3
  7. Also if anyone needs extra cotton swabs for applying liquid metal, they're these Japanese ones: https://www.amazon.com/Tifanso-Cruelty-Free-Biodegradable-Chlorine-Free-Hypoallergenic/dp/B07R8B93GL/ref=asc_df_B07R8B93GL?mcid=e042f9f5c5c2392481644c9da3b526ea&tag=hyprod-20&linkCode=df0&hvadid=693127596188&hvpos=&hvnetw=g&hvrand=14693873930929850647&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9026808&hvtargid=pla-757827286496&th=1

     

    Don't waste your money on ordering extra swabs from Thermal Grizzly. They're just doing an insane upcharge on the swabs I mentioned above. The swabs are "special" in the sense that they're more densely packed than q-tips, which helps a lot with spreading the liquid metal, but they're just regular cotton swabs otherwise.

    • Like 4
  8. 1 hour ago, Mr. Fox said:

    I have not seen this liquid metal before. Looks very interesting--especially the higher viscosity part--so I ordered some to try the next time I need to take something apart. It seems like over the years that liquid metal formulas have changed and have a lower viscosity than they used to, and I think that makes it more likely to drip or run. (The original Liquid Ultra was much thicker than Conductonaut, then they changed it and made it extra watery.) It looks like they use more indium and silver in their formula.

     

    https://www.amazon.com/gp/product/B0CJ9QGDFV   They have some other products on their web site.

    71pd-uDZnFL._SL1500_.jpg

    51VXwEEX2oL._SL1024_.jpg

     

I'm very interested in your end results! I just bought some Conductonaut Extreme since I used the rest of my Conductonaut on the build I did for my buddy over the weekend. This liquid metal here looks like it has even better thermal conductivity.

    • Thumb Up 4
    • Bump 1
  9. 22 hours ago, Kharstraetor said:

    Hey everyone! 

    I've commenced replacing my board and I've encountered a cable which I'm unsure how to remove.

     

    Circled in red in the picture below is the cable which I'm referring to. It runs into the screen. Any ideas how to remove it safely?

     

     

    IMG_20241207_193033.jpg

     

    Pull the cable straight up using the black tab that's attached to it. This is the eDP cable. You'll need to pinch that black tab with your thumb and index finger. It should lift off pretty easily once you pull straight up.

    • Thumb Up 1
  10. I love seeing people being passionate about the things they love and enjoy. This is quite the collection you have!

     

I used to be super into Alienware laptops as a kid and it was my dream to own one. I finally got an Alienware 17 R1/M17X R5 when I was in high school and really enjoyed it. I upgraded the GPU to a GTX 1060 and I still use it to this day alongside my Clevo X170. The old Alienwares were built to last!

     

My favorite backpack ever is the Alienware Vindicator backpack I got in 2015. I've used the crap out of it: the rubberized face is peeling off, and my mom accidentally snapped one of the zippers since the rubberized zipper tips have become quite brittle, but the rest of the backpack is still in really good shape.

    • Like 1
  11. 26 minutes ago, tps3443 said:


    It’s working extremely well now! I downloaded just the regular version and set the “Cryo” mode. I saw the idle temps go from 43c to 20c in like 20-30 seconds lol. After running R15 the max temps were much much lower. I love this cooler. This is actually really impressive seeing this. 
     

    This 10900K seems to be a gem so far. I ran R15@5.3Ghz and it only hit 180 watts max. This is madness. I remember hitting 298 watts on a direct die 10850K on custom loop cooling. And it was a really good sp85 chip. So this chip is not delidded at all, and only on an AIO, it’s absolutely killing it. I’ve got to put this thing on the chiller/big boy extreme loop. 

     

    Yep, high binned 10900Ks can do some pretty amazing all core speeds if they're kept cold enough. The cold also significantly reduces their voltage requirements and power draw, further increasing your overclocking headroom. You saw the results that just a TEC is capable of getting out of this chip.

     

    I was able to do 5.5 Ghz on mine in games initially but could never keep the chip cold enough to keep it 100% stable. Any temperature spike too high and the machine crashed.

     

One thing I really like about my current 14900KF system I just built last weekend is that the system doesn't crash if I overclock just a little too high. Programs crash instead, which makes the trial and error process a lot quicker since I don't have to wait for system shutoffs and reboots every time. I do want to get the V2 of the MasterLiquid ML 360 Sub Zero, called the MasterLiquid ML 360 Sub Zero Evo, as that will enable me to continue my sub ambient overclocking adventures with the 14900KF.

     

Funnily enough, Skatterbencher did some overclocking using a TEC on almost exactly my setup (he used a 14900KS instead of the KF) and got really good results out of it. He was able to easily do 6.2 GHz all core in Shadow of the Tomb Raider.

     

    Video if anyone is interested: 

     

    • Thumb Up 2
    • Like 2
  12. 8 minutes ago, tps3443 said:


Are you using Windows 10? That's probably why your Cinebench scores are low and it's not using the E-Cores. I had that issue before. You need a newer Windows 10, or preferably Windows 11. I remember this happening to me when I first upgraded to the 13900K at launch.

     

    I'm using Windows 10 22H2. I'm specifically using the WindowsXLite version for maximum performance.

     

Cinebench R15 is able to use the E cores, just not at the same time as the P cores. I can make it use either core type, just not both at the same time.

    • Thumb Up 1
  13. 18 minutes ago, tps3443 said:

    I’m testing the 10900K LTX SP106 currently. And it does really well so far. Or at least I think so. 
     

    I’m running all-cores at 5.0Ghz@138 watts max with an AIO during R15 Cinebench.  This cooler doesn’t seem all that great, or maybe it’s not working properly. 🤷‍♂️ Temps are peaking at 92c package still. I’m not all that impressed with the subzero so far, unless I’m using it wrong. I may have to check the mount as well. 
     

    This seems kinda toasty for sub 140 watts. 
     

@Clamibot anything I need to do to enable this CM ML360 Sub Zero? Is there a command center for it? It's all hooked up correctly: PCIe 8 pin, USB connection, and all of that.

     

     

     

    Yes, the cooler sucks until you enable the TEC. There are 2 different control center applications you can use for it. You can either use the Intel official Cryo Cooling software: https://www.intel.com/content/www/us/en/download/715177/intel-cryo-cooling-technology-gen-2.html

     

    Or you can download the open source modded version that doesn't contain the stupid artificial CPU model check, which allows the TEC to work with any CPU: https://github.com/juvgrfunex/cryo-cooler-controller

     

    I prefer the open source version.

     

    Once you get either one of these applications set up along with the required drivers to make them work, just enable the TEC and watch your CPU go brrrrr (literally, the CPU can get pretty cold).

     

    If you use the Intel official version, I'd recommend setting it to Cryo mode for daily driving, and enable unlimited mode only when benchmarking in short bursts.

     

    If you use the open source version, setting the Offset parameter to positive 2 is equivalent to running Cryo mode on the Intel official software, and setting that Offset parameter to a negative number is equivalent to running Unlimited mode on the Intel official software.
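
    If it helps to see what that Offset number is doing, here's my mental model as a tiny Python sketch. This is illustrative only, not code from the juvgrfunex controller; I'm assuming the coldplate target is regulated relative to the dew point, which is consistent with how the +2 and negative settings behave.

    ```python
    # Illustrative only: my mental model of the Offset parameter, NOT code from
    # juvgrfunex/cryo-cooler-controller. Assumes the controller regulates the
    # coldplate temperature relative to the dew point.
    import math

    def dew_point_c(ambient_c: float, rel_humidity: float) -> float:
        """Magnus approximation of the dew point from ambient temp and humidity."""
        a, b = 17.62, 243.12
        gamma = (a * ambient_c) / (b + ambient_c) + math.log(rel_humidity)
        return (b * gamma) / (a - gamma)

    def coldplate_target_c(ambient_c: float, rel_humidity: float, offset: float) -> float:
        """Target coldplate temp = dew point + offset.
        offset = +2 -> sits just above the dew point, like Cryo mode
        offset < 0  -> dives below the dew point, like Unlimited mode
                       (condensation risk, so benchmark in short bursts)"""
        return dew_point_c(ambient_c, rel_humidity) + offset

    # Example: 25 C room at 50% relative humidity (dew point ~13.9 C)
    print(coldplate_target_c(25.0, 0.50, 2.0))   # ~15.9 C, Cryo-like
    print(coldplate_target_c(25.0, 0.50, -5.0))  # ~8.9 C, Unlimited territory
    ```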

     

    Once you enable the TEC, the cooler is good for a CPU power draw up to about 200 watts before you start thermal throttling. I also used liquid metal between the CPU die and IHS, and also between the IHS and the cooler coldplate nozzle.

     

Note: DO NOT use liquid metal between the IHS and the coldplate on the MasterLiquid ML 360 Sub Zero if your IHS is pure copper; use nickel plated IHSes only! If you use a pure copper one like I did, the liquid metal will weld the two copper surfaces together over time with the TEC active, since the cold from the TEC coldplate and the heat from the CPU accelerate the absorption of gallium from the liquid metal into both copper surfaces, eventually bonding them together.

     

Alternatively, you can pre-treat both copper surfaces with liquid metal beforehand to prevent this from happening. Stupid me, now I can't get the cooler off my golden 10900K. It really needs a repaste.😭

     

    I was able to do 5.4 GHz all core in games on this thing using the TEC, 5.6 GHz single core boost on my particular chip. It can hold all core speeds over 5.4 GHz indefinitely as long as the temperature is kept below 140°F (60°C).

    • Thumb Up 3
  14. 10 hours ago, Mr. Fox said:

    I have never had that issue before. Are you using one of the saved BIOS profiles or testing your own? You should be getting like 43-44K easily in CBR23 with one of saved profiles unless it is just getting too hot. Unless there is some kind of error elsewhere causing Cinebench to malfunction.

    https://hwbot.org/submission/5546221

    3142371

    https://hwbot.org/submission/5546469

    3142564

      

    That iTX motherboard and 4070 look SO TINY in that big Enthoo case, LOL.

     

     

    I'm using the BIOS profile that was on the motherboard when I received the parts. Also I'm benchmarking using Cinebench R15.

     

    Also funnily enough, my system feels significantly more responsive and snappy with the E cores disabled. I haven't done any official benchmarks for this, but it just feels faster.

     

    1 hour ago, tps3443 said:


    Cinebench R15 was always troublesome with Intel 13/14 gen chips. It was always the hardest to run of R20/R23/R24. If you can run R15 then the dang thing can run Ycruncher with the same settings. That’s just how it is with the P/E core chips using R15. Sometimes TVB can cause issues, sometimes you may need it on/off. 
     

    Just know R15 is tough on these chips. 

     

    Looks like it'll still serve as a good stability test for me then. I've always used Cinebench R15 to test my overclocks. If they pass a benchmark, they're pretty much always stable in games.

    • Thumb Up 2
  15. 1 hour ago, electrosoft said:

    You hear me talk about X3D and stuttering/chunk when playing that I do not experience on Intel.

     

    This is an example here in Ardenweald and this is on the 9800X3D from Jansn Benchmarks. Watch closely for it. This is a mild version out in open areas with minimal player data and higher GPU utilization. It happens more with RT on and especially at 4k.

     

    Side note: on a 7900XTX with RT on, the stuttering is just brutal yet it plays so much smoother with it off.

     

     

    PvP/Raids with X3D is just brutal...BRUTAL when that cache gives up the ghost but I have to say it doesn't look as bad as the 7800X3D/7950X3D but it is still there. It comes down to when does the massive amount of player data micro stutters end and the X3D chunk begins....

     

     

    Remember, this is at 1440p not even 4k on the 9800X3D.

     

    Still want to try it myself as I don't know about his memory settings. 9800X3D is on its way back to newegg for an exchange (hopefully).

     

     

     

Solid proof of why high speed, low latency RAM does in fact matter a lot in gaming. Even with the amount of 3D V-Cache these X3D CPUs have, it's still possible to exhaust that cache many times over in some games. This is exactly why Intel CPUs scale so well with fast memory; it helps mitigate the shortcomings of the cache on Intel's CPUs vs AMD's X3D CPUs (in regards to cache size, that is).

    • Thumb Up 3
    • Bump 1
  16. Uhh, well I'm really glad my house and everything within a few miles didn't vaporize. Apparently my new super 14900KF consumed more than a quadrillion watts of power for a fraction of a second. @Mr. Fox This is some seriously powerful hardware you sold me.

     

    image.thumb.png.840b97c69426b781b1b17f18bc4d32b4.png

     

    I don't know if this is better or worse than someone's laptop CPU running at 90,000,000°C. I'm surprised nuclear fusion didn't occur inside their laptop.

    • Haha 4
  17. Dual GPU is back in style, my dudes!

     

I did some testing with heterogeneous multi GPU in games with an RX 6950 XT and an RTX 2080 Ti, and the results are really good. Using Lossless Scaling, I used the 6950 XT as my render GPU and the 2080 Ti as my framegen GPU. This is the resulting usage graph for both GPUs at ultrawide 1080p (2560x1080), at a mix of high and medium settings tuned to my liking: optimized, but still very good looking.

     

    image.thumb.png.7b1eba6efb13eb1232aeac95e90a3751.png

     

I'm currently at the Mission of San Juan location and my usage on the 6950 XT is 70%, while it is 41% on the 2080 Ti. My raw framerate is 110 fps, interpolated to 220 fps.

     

One thing that I really like about this approach is the lack of microstuttering. I did not notice any microstuttering using this multi GPU setup to render a game in this manner. Unlike SLI or Crossfire, this rendering pipeline does not split the frame and have each GPU work on a portion of it. Instead, we render the game on one GPU, then do frame interpolation, upscaling, or a combination of both on the second GPU. This bypasses any timing issues from having multiple GPUs work on the same frame, which eliminates microstuttering. We also get perfect scaling relative to the amount of processing power needed for the interpolation or upscaling, as there is no previous frame dependency; only the current frame is needed for either process (plus motion vectors for interpolation, which are already provided). No more wasted processing power!

     

    Basically this is SLI/Crossfire without any of the downsides, the only caveat being that you need your raw framerate to be sufficiently high (preferably 100 fps or higher) to get optimal results. Otherwise, your input latency is going to suck and will ruin the experience. I recommend this kind of setup only on ultra high refresh rate monitors where you'll still get good input latency at half the max refresh rate (mine being a 200 Hz monitor, so I have my raw framerate capped at 110 fps).
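
    To put rough numbers on that latency argument, here's the simple frame time arithmetic behind capping at roughly half the refresh rate (just my reasoning, nothing specific to Lossless Scaling):

    ```python
    # Frame time arithmetic behind capping raw fps near half the monitor refresh.
    def frame_time_ms(fps: float) -> float:
        return 1000.0 / fps

    raw_fps = 110      # raw framerate cap on my setup
    refresh_hz = 200   # monitor refresh rate

    print(f"raw frame time:           {frame_time_ms(raw_fps):.2f} ms")      # ~9.09 ms
    print(f"interpolated frame time:  {frame_time_ms(2 * raw_fps):.2f} ms")  # ~4.55 ms
    print(f"monitor refresh interval: {frame_time_ms(refresh_hz):.2f} ms")   # 5.00 ms
    # Input is still sampled at the raw 110 fps, so responsiveness stays at the
    # ~9 ms level of a native 110 fps game, while the doubled 220 fps output
    # saturates the 200 Hz panel for motion smoothness.
    ```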

     

To get this working, install any 2 GPUs of your choice in your system. Make sure you have Windows 10 22H1 or higher installed or this process may not work. Microsoft has allowed MsHybrid mode to work on desktops since 22H1, but you'll need to perform some registry edits to make it work:

     

    1. Open up Regedit to Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}.
    2. Identify the four digit subfolders that contain your desired GPUs (e.g. by the DriverDesc key inside, since it contains the model name, making identification really easy). In my case, these happened to be 0001 for the 6950 XT and 0003 for the 2080 Ti (not sure why, as I only have 2 GPUs, not 4).
    3. Create a new DWORD value named EnableMsHybrid inside both four digit folders. Set it to 1 to assign that GPU as the high performance GPU, or to 2 to assign it as the power saving GPU. (A scripted version of steps 2 and 3 is sketched after this list.)
    4. Once you finish step 3, open up Graphics Settings in the Windows Settings app.
    5. From that panel, you can manually configure the performance GPU (your rendering GPU) and the power saving GPU (your frame interpolation and upscaling GPU) per program. I think the performance GPU is always used by default, so this configuration isn't strictly required, but it helps force the system to behave how you want. It's more of a reassurance than anything else.
    6. Make sure your monitor is plugged into the power saving GPU and launch Lossless Scaling.
    7. Make sure the preferred frame gen GPU in Lossless Scaling is set to the desired GPU.
    8. Profit! You now have super duper awesome performance from your dual GPU rig with no microstuttering! How does that feel?
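
    For anyone who'd rather script steps 2 and 3 than click through Regedit, here's a minimal Python sketch of the same edits. It's a sketch, not a tested tool: the GPU name substrings in ASSIGNMENTS are just my two cards (swap in yours), and it needs to run from an elevated prompt.

    ```python
    # Sketch of steps 2-3: find each GPU's four digit subkey by its DriverDesc
    # and set EnableMsHybrid (1 = high performance, 2 = power saving).
    # Run elevated. The ASSIGNMENTS map below is just an example.
    import winreg

    CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")
    ASSIGNMENTS = {"6950 XT": 1, "2080 Ti": 2}  # DriverDesc substring -> value

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as class_key:
        index = 0
        while True:
            try:
                sub_name = winreg.EnumKey(class_key, index)  # "0000", "0001", ...
            except OSError:
                break  # ran out of subkeys
            index += 1
            if not sub_name.isdigit():
                continue  # skip non-adapter entries like "Properties"
            try:
                with winreg.OpenKey(class_key, sub_name, 0,
                                    winreg.KEY_READ | winreg.KEY_SET_VALUE) as sub:
                    desc, _ = winreg.QueryValueEx(sub, "DriverDesc")
                    for needle, value in ASSIGNMENTS.items():
                        if needle in desc:
                            winreg.SetValueEx(sub, "EnableMsHybrid", 0,
                                              winreg.REG_DWORD, value)
                            print(f"{sub_name}: {desc} -> EnableMsHybrid={value}")
            except OSError:
                continue  # no DriverDesc or access denied on this subkey
    ```

    You may need a reboot for Graphics Settings to pick up the change, same as when making the edits by hand.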

This approach to dual GPU rendering in games works regardless of whether you have a homogeneous (as required by SLI or Crossfire) or heterogeneous multi GPU setup. Do note that this approach only scales to 2 GPUs under normal circumstances, maybe 3 if you have an SLI/Crossfire setup being used to render your game. SLI/Crossfire will not help with Lossless Scaling as far as I'm aware, but if it does, say hello to quad GPU rendering again (with the downside that microstuttering returns). I prefer the heterogeneous dual GPU approach, as it allows me to reuse my old hardware to increase performance and has no microstuttering.

    • Like 3
    • Bump 1