Everything posted by Clamibot
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Oddly, Cinebench R15 refuses to use both the P cores and E cores on my 14900KF at the same time, so I ended up doing a P core only run and an E core only run by manually setting processor affinity to each respective core group. The P core only run is the one in orange and the E core run is the one in brown. Looks like 8 P cores are more powerful than 16 E cores. If you'd rather script the affinity pinning than click through Task Manager every run, see the sketch below.
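A minimal sketch of the affinity trick in Python using psutil, assuming the usual 14900K/KF logical CPU layout (with Hyper-Threading, logical CPUs 0-15 are the 8 P cores and 16-31 are the 16 E cores) and a guessed process name — verify both on your own machine before trusting the numbers:

```python
# Pin a running benchmark to P cores or E cores only.
# ASSUMPTION: logical CPUs 0-15 = 8 P cores (2 threads each),
#             16-31 = 16 E cores. Check your layout in Task Manager first.
import psutil

P_CORES = list(range(0, 16))
E_CORES = list(range(16, 32))

def pin_by_name(name: str, cpus: list[int]) -> None:
    """Set CPU affinity for every running process matching `name`."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == name.lower():
            proc.cpu_affinity(cpus)
            print(f"Pinned PID {proc.pid} to CPUs {cpus}")

# Process name is an assumption -- check Task Manager for the exact name.
pin_by_name("CINEBENCH.exe", P_CORES)   # P core only run
# pin_by_name("CINEBENCH.exe", E_CORES) # E core only run
```
-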
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Solid proof of why high speed, low latency RAM does in fact matter a lot in gaming. Even with the amount of 3D V-Cache these X3D CPUs have, it's still possible to exhaust that cache many times over in some games. This is exactly why Intel CPUs scale so well with fast memory; it helps mitigate the cache size disadvantage of Intel's CPUs versus AMD's X3D CPUs. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Uhh, well I'm really glad my house and everything within a few miles didn't vaporize. Apparently my new super 14900KF consumed more than a quadrillion watts of power for a fraction of a second. @Mr. Fox This is some seriously powerful hardware you sold me. I don't know if this is better or worse than someone's laptop CPU running at 90,000,000°C. I'm surprised nuclear fusion didn't occur inside their laptop. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Dual GPU is back in style, my dudes! I did some testing with heterogeneous multi GPU in games with an RX 6950 XT and an RTX 2080 TI, and the results are really good. Using Lossless Scaling, I used the 6950 XT as my render GPU and the 2080 TI as my framegen GPU. This is the resulting usage graph for both GPUs at ultrawide 1080p (2560x1080) with a mix of high and medium settings tuned to my liking for an optimized but still very good looking list of settings. I'm currently at the Mission of San Juan location and my usage on the 6950 XT is 70%, while it is 41% on the 2080 TI. My raw framerate is 110 fps, interpolated to 220 fps.

One thing I really like about this approach is the lack of microstuttering. I did not notice any microstuttering using this multi GPU setup to render a game in this manner. Unlike SLI or Crossfire, this rendering pipeline does not split the frame and have each GPU work on a portion of it. Instead, we render the game on one GPU, then do frame interpolation, upscaling, or a combination of both on the second GPU. This bypasses any timing issues from having multiple GPUs work on the same frame, which eliminates microstuttering. We also get perfect scaling relative to the amount of processing power needed for the interpolation or upscaling, as there is no previous frame dependency; only the current frame is needed for either process (plus motion vectors for interpolation, which are already provided). No more wasted processing power!

Basically this is SLI/Crossfire without any of the downsides, the only caveat being that you need your raw framerate to be sufficiently high (preferably 100 fps or higher) to get optimal results. Otherwise, your input latency is going to suck and will ruin the experience. I recommend this kind of setup only on ultra high refresh rate monitors where you'll still get good input latency at half the max refresh rate (mine being a 200 Hz monitor, so I have my raw framerate capped at 110 fps).

To get this working, install any 2 GPUs of your choice in your system. Make sure you have Windows 10 22H1 or higher installed or this process may not work. Microsoft decided to allow MsHybrid mode to work on desktops since 22H1, but you'll need to perform some registry edits to make it work:

1. Open up Regedit to Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}.
2. Identify the four digit subfolders that contain your desired GPUs (e.g. by the key DriverDesc inside, since this contains the model name, making it really easy to identify). In my case, these happened to be 0001 for the 6950 XT and 0003 for the 2080 TI (not sure why, as I only have 2 GPUs, not 4).
3. Create a new DWORD key inside both four digit folders. Name this key EnableMsHybrid. Set its value to 1 to assign that GPU as the high performance GPU, or to 2 to assign it as the power saving GPU. (If you'd rather script this step, see the sketch below.)
4. Once you finish step 3, open up Graphics Settings in the Windows Settings app. From this panel, you can manually configure the performance GPU (your rendering GPU) and the power saving GPU (your frame interpolation and upscaling GPU) per program. I think the performance GPU is always used by default, so configuration is not required, but it helps force the system to behave how you want. It's more of a reassurance than anything else.
5. Make sure your monitor is plugged into the power saving GPU and launch Lossless Scaling.
6. Make sure your preferred frame gen GPU is set to the desired GPU.
7. Profit!
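A minimal Python sketch of step 3 using the stdlib winreg module, assuming the 0001/0003 subkey indices from my system (yours will likely differ — read DriverDesc first to confirm which subkey is which GPU). Run as Administrator and back up the registry before writing anything:

```python
# Write EnableMsHybrid under the GPU driver subkeys (step 3 above).
# Subkey indices "0001"/"0003" are from my machine -- check yours first.
import winreg

CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
             r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def gpu_name(subkey: str) -> str:
    """Read DriverDesc so you can confirm which GPU a subkey belongs to."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                        f"{CLASS_KEY}\\{subkey}") as key:
        return winreg.QueryValueEx(key, "DriverDesc")[0]

def set_ms_hybrid(subkey: str, value: int) -> None:
    """value: 1 = high performance GPU, 2 = power saving GPU."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                        f"{CLASS_KEY}\\{subkey}", 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "EnableMsHybrid", 0, winreg.REG_DWORD, value)

for idx in ("0001", "0003"):              # subkeys on my system; yours may differ
    print(idx, gpu_name(idx))
set_ms_hybrid("0001", 1)                  # render GPU (6950 XT here)
set_ms_hybrid("0003", 2)                  # frame-gen GPU (2080 TI here)
```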
You now have super duper awesome performance from your dual GPU rig with no microstuttering! How does that feel? This approach to dual GPU rendering in games works regardless of whether you have a homogeneous (as required by SLI or Crossfire) or heterogeneous multi GPU setup. Do note that this approach only scales to 2 GPUs under normal circumstances, maybe 3 if you have an SLI/Crossfire setup being used to render your game. SLI/Crossfire will not help with Lossless Scaling as far as I'm aware, but if it does, say hello to quad GPU rendering again! The downside is that you get microstuttering again, however. I prefer the heterogeneous dual GPU approach as it allows me to reuse my old hardware to increase performance and has no microstuttering. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Yeah I don't get it either. Matte screens are unreadable under intense light sources because the diffusion of light across the screen from the matte anti glare just makes the screen blurry. At least on a glossy screen, even with glare, it's still sharp, so you can read whatever parts you can still see. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
300 Hz glossy perfection: Bleh matte layer I peeled off: This stupid piece of plastic ruins every screen I lay my eyes on that has it. Glossy forever, baby! I got a 300 Hz screen for my X170. Since there are no glossy screen options, I had to take matters into my own hands and glossify it, since I absolutely detest matte screens. This came out well, just like when I did it on my current 144 Hz screen. It's good to know this mod still works on laptop panels, as I know on some of the newer desktop ultrawide panels, the matte anti glare layer is stupidly infused into the polarizer layer instead of being a separate layer. -
Yasss! It's alive! I'm glad to hear this project is still in progress and has not been lost to the sands of time. I've been looking for and following similar projects when I find them in hopes I can gather enough knowledge to build my own version of such projects. My only critique would be to allow for an 18 inch screen, but I'm also a go big or go home kind of guy, plus I like big screens. Weight is not a concern for me. I'd really like a return of 19 inch class laptops, but I'd like a rise of mini ITX laptops even more.
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Shadow Of The Tomb Raider benchmark on my Legion Go (also using a new WindowsXLite 22H2 install): I set the power limit of the APU to 54w and overclocked the iGPU. All 8 CPU cores were active. I also set the temperature limit of the CPU and the skin temperature sensor to 105°C to stop the APU from throttling because it thinks my hands will get burned. The results are a definite improvement over stock settings, but still not up to par with my standards for raw framerate. However, the input lag is not very noticeable when using a controller and setting my raw framerate to 72 fps, then interpolating to 144 fps, giving me my high refresh rate experience on this device. Despite raising the power limit to 54w, the system was maxing out at 40w sustained. I'm not sure if this is because the APU was using 40w, with the remaining 14w of the power budget being used for everything else. I thought the power limit I set using the Smokeless tool was for the APU only. I'm definitely pushing the power circuitry pretty hard here, as the APU is rated for only 28w. I was able to perform a static overclock on the iGPU to 2400 MHz, and performance did improve. Interestingly, performance dropped if I tried pushing further, as it seems there is either a power limit, a voltage limit, or both. UXTU allows me to overclock the iGPU up to 4000 MHz, but I didn't try going that far. I was, however, able to overclock my Legion Go's iGPU to 3200 MHz without the display drivers crashing, so it looks like this iGPU has a lot of overclocking headroom left but is held back by power/voltage limits. I can't wait to see a handheld with a Strix Halo APU and a 240 Hz screen. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Well guys, I benchmarked my 2019 LTSC install vs my new WindowsXLite 22H2 install in Shadow Of The Tomb Raider again (this time using the built in benchmark), and I can confirm the 22H2 install does indeed perform better, even with all my programs installed (which seemed to have no effect on performance at all). Looks like WindowsXLite 22H2 is the way to go when upgrading from LTSC 2019! I'm currently installing it on my Legion Go and will be installing it on my X170 next. I had my 10900K running at 3.7 GHz to induce a CPU side bottleneck. My GPU is a Radeon RX 6950 XT. 2019 LTSC: WindowsXLite 22H2 (minimalist gamer only installation): WindowsXLite 22H2 (all my programs installed + some extra services running): WindowsXLite 22H2 wins by about 2%. I did not expect this at all. I was expecting a performance downgrade, but I am very happy I got a slight performance upgrade instead. You don't see that very often when installing a newer version of Windows. I like that Shadow Of The Tomb Raider is useful for both CPU and GPU benchmarks. This makes it an easy all in one benchmark that saves me time, as it gives me a general idea of performance differences between different machines and Windows installs. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
So far I've gotten my 4000 CL15 32 GB (4x8GB) kit to 4200 MHz with 1.5v on the IMC and 1.54v on the memory itself. I haven't tried pushing further, but I do think there's headroom left if I push the memory voltage higher. Samsung B-Die is awesome! This is of course dependent on the motherboard. I have an MSI Unify Z590 motherboard, and I know these were made for overclocking. If you get a similar quality board, you should obtain similar results. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
I've had a lot of fun with the ML360 Sub-Zero. That's the cooler I've been running the entire time I've had my current desktop. It's incredibly good for gaming and allows me to do 5.6 GHz on my 10900K in games, which really helps with games that have an artificially induced single core performance bottleneck. I also know liquid metal wasn't recommended with this cooler, but I did it anyway, and of course that made the temperature results even better. -
Clevo X170SM-G user manual: https://www.manualslib.com/manual/1890112/Clevo-X170sm-G.html#manual I think G-Sync won't work once you upgrade the GPU, but I personally wouldn't worry about it. G-Sync is a gimmick anyway, since we'd always want our games running at the monitor's max refresh rate for the best experience. I lost G-Sync after replacing my laptop's monitor, and I found it really didn't matter. I haven't missed it at all.
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Oh, you're using Windows 11 builds? My WindowsXLite install is based on a Windows 10 Pro 22H2 build. To add to that list you have:
(6) Disable unnecessary security mitigations like Spectre, Meltdown, Core Isolation/Memory Integrity (Virtualization Based Security), and Control Flow Guard (see the sketch below for the Spectre/Meltdown part)
(7) Install DXVK Async into games that see an uplift from it
(8) Use Lossless Scaling (you get the best results if your raw framerate is already 100 fps or higher with mouse and keyboard, or 72 fps or higher on a controller). This software is absolutely amazing!
(9) Perform settings tuning in games. I find that some settings barely make a difference between low and ultra, especially in newer games.
(10) Disable anticheat and/or DRM if possible
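A sketch of the Spectre/Meltdown part of item (6) in Python, using the Microsoft-documented FeatureSettingsOverride registry values (the 3/3 combination disables the Spectre v2 and Meltdown mitigations). Run as Administrator, understand the security tradeoff first, and reboot for it to take effect:

```python
# Disable Spectre v2 / Meltdown mitigations via the documented
# FeatureSettingsOverride values. Security tradeoff: do this knowingly.
import winreg

MM_KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, MM_KEY, 0,
                    winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "FeatureSettingsOverride", 0, winreg.REG_DWORD, 3)
    winreg.SetValueEx(key, "FeatureSettingsOverrideMask", 0, winreg.REG_DWORD, 3)

print("Mitigation override written; reboot to apply.")
```
-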
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
After some preliminary testing with Shadow Of The Tomb Raider in Windows 10 LTSC 2019 and WindowsXLite 22H2, it looks like the CPU side performance is actually slightly higher on the 22H2 installation. I'm using the maximally tuned OOB install without Windows Defender. I can probably do a bit more tuning with it too. My LTSC install is also tuned, with the stupid security mitigations disabled, but it's probably not tuned as much as some of you guys have managed with your installs. GPU side performance seems to be higher too, but I don't know by what amount, as I was only testing CPU side performance. I need to get some concrete numbers, but this is looking really good so far. To be fair, the 22H2 installation is currently set up as a minimalist installation purely for game testing, so I need to install all my programs onto that Windows install and compare again, as that could affect the results. It probably won't, but I need to cover all my bases. If WindowsXLite 22H2 truly outperforms Windows 10 LTSC 2019, I will be happy. This will be the first time I've seen a version upgrade in Windows actually deliver a performance upgrade rather than a downgrade (other than upgrading from regular consumer editions to Windows 10 LTSC, since I've seen that firsthand, and that jump in CPU side performance was significant). I'll then need to upgrade all my systems🤪 -
As the title says, Humble Bundle currently has a deal for a big collection of Sid Meier games. This bundle contains the past 5 Civilization games + DLCs for Civilization 6, and some other Sid Meier games I'm less familiar with. You only have to pay $18 to obtain the entire collection, so go get it while it's hot! The deal lasts for another week as of the date of this post. Enjoy! https://www.humblebundle.com/games/2k-presents-sid-meier-collection?hmb_source=&hmb_medium=product_tile&hmb_campaign=mosaic_section_1_layout_index_1_layout_type_threes_tile_index_2_c_2kpresentssidmeiercollection_bundle&_gl=1*1xmxnzv*_up*MQ..*_ga*NzcxNjUwNTUzLjE3MzE0ODE2NDU.*_ga_BBZCZLHBF6*MTczMTQ4MTY0NS4xLjAuMTczMTQ4MTY1Mi4wLjAuMA..&gclid=Cj0KCQiAlsy5BhDeARIsABRc6ZucfeiADFbBRCh22Int8tkA_caIeK7Kb3vtXXRRJo29B__usIKX2b8aAgMLEALw_wcB
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Christmas came early for me this year! Lol, yep, you get more done and you get your money's worth when buying from bro @Mr. Fox. A super duper mega ultra deluxe hardware bundle being placed right into my hands. You betcha awesome is here! Lots of goodies! I haven't gotten anything set up due to work being very busy this week, but I'm definitely looking forward to assembling and tuning 2 systems! One will be for me and the other will be for a buddy building his first desktop after graduating college and years of using laptops and being dissatisfied with the diminishing options for upgrade paths. Full hardware bundle contents:
- Super bin i9 14900KF
- i9 14900K
- Asus Maximus Z790 Apex motherboard
- Asrock Z690 PG Velocita motherboard
- Super bin G.SKILL 32 GB 8000 MHz DDR5 memory kit (pre tuned to 8400 MHz CL36 just for me!)
- Crucial 16 GB 6000 MHz DDR5 memory kit
- Gigabyte Aorus Xtreme Waterforce RTX 2080 TI (bro Fox even included an air cooler for it!)
- Iceman direct die waterblock
- Iceman direct touch RAM waterblock
- Bykski RAM heatsinks for the super bin G.SKILL memory kit
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
As an owner of an Optane 905P, I feel qualified to answer this question. Depending on what you're doing and your setup, you may or may not notice a difference. If you use Optane with something like an i7 7700K, you probably won't notice a difference vs a standard m.2 SSD. If you are however using Optane with a recent platform (like 14th gen or current gen), you will definitely notice a significant difference. I currently have an Optane 905P installed in my system with a 10900K in it, and the difference is significant. The first thing you'll notice is that the system feels more responsive. I can't assign this particular metric to a number, but it feels snappier vs using a standard SSD. For everyday usage, programs load significantly faster (provided your CPU isn't a bottleneck on load speeds). You can also open a ton of programs at once and the Optane drive will just blast through your file read requests. Optane also excels at small file copy speeds due to the much faster random write speeds vs a standard SSD. Optane also doesn't slow down as it fills up vs a standard SSD, so you can load these babies to the brim and not see a decrease in drive performance. The most major difference I've noticed is in file search speed. When doing development for my job, I sometimes have to look for particular files. With Optane, I can search the root directory of a Unity project using Windows Explorer file search (mind you, our projects are pretty big for VR games, at least 30 GB or larger for the repository), and my 905P will have already returned a bunch of results after I snap my fingers (it's still searching, but at least it found a few files right off the bat). File searching is so much faster on an Optane drive. If I were to perform this same task on a standard SSD, it would take a bit before the file search returned any results, even the initial few. For my development workloads, code compiles faster, asset imports complete significantly faster, and builds complete a bit faster. For gaming, games load faster, especially open world ones. Any game that does heavy asset streaming also has its loading microstutters gone. Both development workloads and game load speeds will continue to scale with ever faster CPUs on Optane, whereas they've already kinda hit a wall with standard SSDs. Oh yeah, hibernate and wake from hibernate are also far faster on Optane vs a standard SSD. So is file zipping/unzipping. So overall, Optane is a must as a boot drive if you want the snappiest experience, or if you're a developer like me, or just want the best game load times, or if you do tons of file operations (especially with small files), or if you want some combination of the four. Optane benefits newer platforms far more than older ones, as newer CPUs can really take advantage of the throughput of Optane's random read and write speeds. If you want to see the random read gap for yourself, there's a quick benchmark sketch below.
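A quick-and-dirty 4K random read benchmark sketch in Python (stdlib only) for comparing drives, since small random reads are the access pattern where Optane pulls furthest ahead of NAND. TEST_FILE is a placeholder path of my choosing; note this reads through the OS page cache unless the file is much bigger than your RAM, so treat the numbers as relative, not absolute:

```python
# Time N random 4 KiB reads from a large file on the drive under test.
# TEST_FILE is a placeholder -- point it at a multi-GB file on that drive.
import os, random, time

TEST_FILE = r"D:\testfile.bin"
BLOCK = 4096
READS = 20000

size = os.path.getsize(TEST_FILE)
offsets = [random.randrange(0, size - BLOCK) // BLOCK * BLOCK
           for _ in range(READS)]

# O_BINARY only exists on Windows; getattr keeps this portable.
fd = os.open(TEST_FILE, os.O_RDONLY | getattr(os, "O_BINARY", 0))
start = time.perf_counter()
for off in offsets:
    os.lseek(fd, off, os.SEEK_SET)
    os.read(fd, BLOCK)
elapsed = time.perf_counter() - start
os.close(fd)

print(f"{READS} x 4 KiB random reads in {elapsed:.2f}s "
      f"({READS / elapsed:.0f} IOPS, {READS * BLOCK / elapsed / 2**20:.1f} MiB/s)")
```
-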
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
The 9800X3D seems like a really good candidate for Intel's Cryo Cooling watercoolers. Fortunately it's now possible to run those waterblocks and AIOs using modified Intel Cryo Cooling software that doesn't have the stupid artificial CPU restriction check from here: https://github.com/juvgrfunex/cryo-cooler-controller -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Looks like Christmas is coming early for someone. What a lucky individual! -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Update on my shenanigans: 1.52v on the RAM was not enough to stabilize my memory overclock. It appeared to be stable but crashed. I tried 1.53v and it still crashed, but took far longer. 1.54v has not crashed after an entire work day of heavy multiplayer testing on our newest VR basketball title. I'll consider this stable, as I was also able to load Mass Effect Andromeda without my system crashing, which usually means my CPU or memory overclock is stable since loading into that game is a very CPU and memory intensive process. 4200 MHz CL15 DDR4 on a 4 DIMM board is pretty impressive. I don't know if I want to take the IMC voltage any higher than 1.5v long term, but I know these current voltages on the memory and IMC are for sure safe long term. Heh, screw it, I'll allow up to 1.55v on the IMC and 1.6v on the memory. Nothing should go wrong, right?🤪 I'm begrudgingly going to be moving to a Windows 10 22H2 install due to software incompatibilities starting to creep up on me. The WindowsXLite downloads brother @Mr. Fox linked me to seem like they'll perform as well as my 2019 LTSC install, so I'll be satisfied if that's the case. I'm happy Windows 10 support will be ending soon-ish, because I don't want any more dang updates! They're incredibly annoying, and my computers always force install these updates while I'm using the machine, usually in the middle of me working or playing a game. I know that's not supposed to happen, it's supposed to update when I'm away from my machines, but it updates during active use for me, so I'll be really happy when the updates stop for good. The updates don't ever contain anything I care about anyway. Having tested multiple versions of Windows myself in games, I can confirm that all the marketing surrounding Windows 11 is complete BS. I've tested on multiple laptops, a desktop, and my Legion Go. My framerates are around 20% higher in Windows 10 LTSC vs Windows 11 across all those devices. It does depend on the game, but that's the performance increase I found on average, with most of my newer games showing a slightly greater than 20% increase in framerates. I'm hoping WindowsXLite Optimum 10 Classic gives me that LTSC grade performance. You all know I will be doing my game benchmarks to compare. This is gonna be a fun weekend. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
Woo hoo! I tried my hand at memory overclocking for the first time today and was able to successfully get a 5% overclock on the memory speed on my 4 DIMM motherboard, from 4000 MHz to 4200 MHz. IMC voltage is at 1.5v and the DRAM voltage is at 1.52v. This seems stable. I did at first try my hand at tightening timings, but ultimately gave up for now as I couldn't get it stable after messing with them for a few hours, so I instead opted for the brute force approach, which I was successful with. I probably should've gone with the brute force approach first, me being new to memory overclocking. I'll try my hand at tightening timings again another time. After having used a system with an AMD dGPU for a while and getting used to its idiosyncrasies, I much prefer AMD graphics cards now. Turns out, the black screen driver crashes that I've spent months trying to figure out weren't because of AMD's drivers sucking. That was merely a symptom of the root cause, which was memory instability. My XMP profile was unstable at stock IMC voltages. Raising the voltage by 10mv made all the stupid crashes go away. So PSA to those with AMD GPUs: if you experience random black screen crashes, consider raising your IMC voltage just a tad. This made all my headaches go away. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
We are on page 666 after all. -
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
So I got the Asus XG309CM, and I'm absolutely loving the 220 Hz refresh rate. It's so smooth! It's a significant improvement over 144 Hz in smoothness, and it's starting to get close to looking like real life in terms of motion smoothness. My original reason for getting one is so I can have a flat ultrawide screen, since I hate curved screens, and having used a curved ultrawide screen for a while, this flat ultrawide is far better. It's much more comfortable to look at since there's no image distortion (since there's no curve, yay!). Unfortunately, because I have now adjusted to the 220 Hz refresh rate and really like its smoothness, my performance requirements have gone up yet again. However, this means I may have a use for dual GPU rigs again, which will be fun to play around with. Lossless Scaling works with a multi GPU setup, so you can render your game with one GPU and then run Lossless Scaling's frame gen on a different GPU. It's essentially like SLI/Crossfire, but better, since you get superlinear scaling in most cases (since the frame gen generally takes significantly less processing power than actually rendering a frame), and it works in pretty much any game. The only caveat is input lag, but you probably won't be bothered by it much if your raw framerate is already sufficiently high (in excess of 120 fps), and you will notice the increased smoothness from the higher interpolated framerate much more at that level. Since there's no single GPU powerful enough to render every game in existence at hundreds of frames a second @ ultrawide 1080p, this is my ticket to lifelike motion in all the games currently in my library, and games I'll be playing in the future. This kind of setup will be especially useful when I inevitably move onto even higher refresh rate monitors (I saw a 480 Hz one, like dang!). Motion clarity at 220 fps is pretty dang good. It's super smooth, but still not as smooth as real life. I don't know what my perception limits are, but I know I'm still not there. Ahh, the sweet dream of planning yet another new build. I guess we're never done here, are we? I am currently satisfied with this 220 Hz monitor, but you guys know me and my extremely high requirements. You all KNOW I will eventually get an even higher refresh rate monitor because I want video games to have the exact motion clarity real life does. I demand it because motion smoothness increases immersion for me much more than better colors or higher resolution. Lifelike motion smoothness, or close to it, is incredibly immersive to me. -
Personally, I always do my overclocking through Intel XTU so I don't brick my machine if I apply bad settings, plus you can adjust your settings dynamically. This will depend on your chip's silicon quality, but I can apply a 20mv undervolt at 5.3 GHz with the 10900K currently installed in my laptop (5.4 GHz for the better binned chip in my desktop) and it remains stable for me. Up to 1.5v is a safe 24/7 voltage on this CPU within this laptop. Up to 1.6v is safe if you have really good cooling (like custom water cooling, which we can do on this laptop), so I wouldn't worry about the voltage being 1.3v on your CPU, as that's not a super high voltage for this specific generation of CPUs. To answer your question on the sign of the offset (whether it is positive or negative), I'll have to jump into the BIOS and take a look to see where that can be identified. I'm pretty sure there's an option to set the offset sign somewhere. I also have not done the memory overclocking I was going to do over the weekend just yet. Unfortunately, enabling the realtime memory tuning option in the BIOS causes the system to not boot, so I can't do that on the fly within Intel XTU.
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
Clamibot replied to Mr. Fox's topic in Desktop Hardware
I'm definitely gonna try that out, as I'm starting to run into software compatibility issues (both for games and work) with my V1809 LTSC install, so I now have a genuine need for a newer version of Windows. Thanks for posting this! I also got an Asus XG309CM monitor and absolutely love the 220 Hz refresh rate. I don't like that the max refresh rate isn't perfectly divisible by 24 or 30 though, so I've been looking into some monitor overclocking (which I've done before), but I'm running into a bit of a snag this time. Apparently, there's a refresh rate limiter on this monitor according to CRU. Does anyone know how to bypass such a thing? I've never seen something like this before on a monitor. I just want to overclock it to 240 Hz, which I think will be doable on this panel.