NotebookTalk

Official Clevo X170KM-G Thread


electrosoft


Clevo X170KM-G // BIOS Clevo 1.07.08 // EC 1.07.04
Intel Core i9-11900K with Thermal Grizzly Kryonaut Extreme
64GB (2x32GB) Patriot 3200MHz DDR4
Nvidia RTX 3080 Laptop 16GB with Thermal Grizzly Kryonaut Extreme
17.3" UHD 4K 60Hz display (calibrated full AdobeRGB)
System Win11: 2TB Samsung 980 Pro NVMe SSD
System Win10: 2TB Kingston KC3000 NVMe SSD
Data: 2TB Samsung 970 Evo NVMe SSD
VMware: 1TB Samsung 980 NVMe SSD


5 hours ago, srs2236 said:

 

As if the X170KM-G by default doesn't throttle.

Also 2.4-2.7GHz??? Source? Because today I saw it easily sit at 4.4GHz on the P-cores and 3.3GHz on the E-cores or something.

Yes, the 4070 is complete ass. Yet the 4080 and 4090 are a night-and-day difference.

 

btw the one I screenshotted was for the 4080 version, not the 4070.

 

And most importantly, look at this: the 4090 just beating a 3080/3080 Ti BY 10000 freaking Time Spy points.

 

It is not overrated at all. Clevo DTRs are overrated at this point.

 

I am just so sad I actually spent my money on such a disappointing 3080 mobile upgrade. What a shame.

 

[screenshot: Time Spy score comparison]

 

The GT77 is a rather bad example, as it is hideously expensive, but you will probably be able to find other models that give you a stable 30K and 21K respectively at a lower price. I would guess that you can sell your 3080 for not much less than you paid for it; didn't you get a great deal?

 

As for the Clevo DTRs being overrated: they have certain characteristics found in few or no laptops today, so for someone who seeks those they are still valid choices. If you only need something fast with little modularity/serviceability and normal memory/storage, then you can obviously do better by now. That is just progress; it so happens that the last two years first saw a big jump in CPU performance, then another one, and now also a big jump in GPU performance.

 

 


@srs2236 source:  Razer Blade 16 review + Raytheon ZERO 2023 review.

I agree that the old X170 on 10th/11th gen is obviously weaker than the new products on 13th gen, but because no laptop today offers the same level of cooling and chassis dimensions, the new laptops see a twofold (or greater) drawdown in performance after warming up. It's always been that way.

 


TongFang GM6PX8X | 13900HX | 32GB @ 6600MHz C40 | RTX 4080 | QHD @ 240Hz | PM9A1 1TB | PREMA MOD | Custom 360mm AIO Liquid System


On 2/23/2023 at 3:57 PM, Tersio said:

Didn't I see here in the forum one fella who ordered one new for 4000 euros a few weeks ago?

 

 

If you mean me, I'm a bad example, as I always prefer thick notebooks with LGA CPUs. I bought my GT76 a few months before the X170SM-G was announced (the only other DTR options were the Area-51M, which had GPUs blowing up, and the P775, which was old and due to be replaced). If I had my time over I would have waited.

 

When I decided to upgrade my GT76 I was spooked by the stability issues of the KM-G, so I bought the GT77, which died within a few months. Because there was zero stock, I risked it and used the full refund to buy a KM-G, as it is going to be the last of its kind. No stability issues, so it was a great choice. But I didn't like how thin the GT77 was, and I really hate mobile CPUs. Most people don't care though.

 

Anyway, I guess things have changed because Intel is innovating far quicker than it used to, but thick DTRs haven't been replaced. 12th and 13th gen have been huge steps up in performance, and 14th gen will probably obsolete Raptor Lake too.


Metabox Prime-X (X170KM-G) | 17.3" 165Hz G-sync | 11900KF | 32GB DDR4 3200 | RTX 3080 16GB | 1TB 980 Pro


1 hour ago, ymsv said:

It is not just Intel or AMD; NVIDIA does not want to support MXM anymore.

In addition to that, in reality the upgradability of the video card module did not work out as far as I can see (although the idea is good).

To be honest the mobile CPUs are pretty good - it is the GPUs that stink compared to what we can get in a desktop.

 


2 hours ago, 1610ftw said:

To be honest the mobile CPUs are pretty good - it is the GPUs that stink compared to what we can get in a desktop.

 

 

As far as I understand, they are basically PC CPUs. But yes, the GPUs are far from the PC ones and also a bit misleadingly advertised. You still need NVIDIA or AMD approval for the product they will be used in, and they have no wish to support MXM. All of that makes the DTR, in its present concept, a dead end, although I like them.

I also read that SO-DIMM type RAM has some speed limitations, which is one more obstacle. Maybe with the Dell type it will change.

 


4 hours ago, 1610ftw said:

To be honest the mobile CPUs are pretty good - it is the GPUs that stink compared to what we can get in a desktop.

 

Yeah, but how do you pack a 450W card into a laptop whose power brick is rated below that card's TDP for the whole machine? It is impossible. CPUs don't use that much power on desktops, so on the CPU side you can actually get closer to desktop performance in a laptop.


Clevo P775TM1-G:

Spoiler

GPU: NVIDIA GeForce RTX 3080 Laptop (165W, +110MHz Core, +350MHz Mem)

CPU: Intel(R) Core(TM) i9-9900KS (5GHz, 4GHz Cache, -130mV, 255A, 200W PL1/2, C0 only)

RAM: 32 GB (3333MHz, 14-17-17-32, 2x16, Micron rev.E, 1.45v)

Storage 1: Kingston KC3000 2TB RAID0 (2x1TB, NVME, PCI-E 3.0)

Storage 2: Seagate LM015 2TB (2.5, HDD, SATA3)

Storage 3: Integral UltimaPro 512GB (SDXC, 100r/50w, PCI-E)

Display 1: AU Optronics B173ZAN0.10 (4K, 60Hz)

Display 2: ROG STRIX XG17AHP (1080, 240Hz, G-Sync, DP)

Wi-Fi/BT: Killer(R) Wireless-AC 1550 (9260NGW, PCI-E)

Operating system: Windows 11 Pro x64 (22H2)

Lenovo ThinkPad T540p:

Spoiler

GPU1: Intel(R) HD Graphics 4600

GPU2: NVIDIA GeForce GT 730M (+135MHz Core, +339MHz Mem)

CPU: Intel(R) Core(TM) i5-4210M

RAM: 16 GB (1600MHz, 2x8)

Storage: Samsung 860 Pro 256GB (2.5, SSD, SATA3)

Operating system: Windows 11 Pro x64 (22H2)

Lenovo IdeaPad E31-70:

Spoiler

GPU: Intel(R) HD Graphics 5500

CPU: Intel(R) Core(TM) i3-5005U

RAM: 4 GB (1600MHz, 1x4)

Storage: Kingston SA400 128GB (2.5, SSD, SATA3)

Operating system: Windows 11 Pro x64 (22H2)


9 hours ago, srs2236 said:

Yeah, but how do you pack a 450W card into a laptop whose power brick is rated below that card's TDP for the whole machine? It is impossible. CPUs don't use that much power on desktops, so on the CPU side you can actually get closer to desktop performance in a laptop.

There are limits to what can be done, but we have gone down from up to 2x200W for GPUs to 1x175W; manufacturers could do better than that. Also, even at 250W, a 4090 that actually uses the desktop 4090 chip could achieve at least 80% of the performance of its desktop counterpart, and often more.


6 hours ago, 1610ftw said:

There are limits to what can be done, but we have gone down from up to 2x200W for GPUs to 1x175W; manufacturers could do better than that. Also, even at 250W, a 4090 that actually uses the desktop 4090 chip could achieve at least 80% of the performance of its desktop counterpart, and often more.

When did we have an official 2x200W configuration, and in which laptop? Seems a bit crazy. What kind of cooling solution was it using?

 

And again, 250W plus let's say a 125W CPU: where will you get a heatsink that can effectively dissipate 375W of heat?

 

You know what could be a cool experiment? Running a desktop 4090 at a 175W TDP limit against the mobile 4090 and comparing results. But you would have to artificially raise the temperature to match the laptop's, otherwise it would not be fair. Then we could see what the true benefit of the full 4090 chip in a laptop could potentially be.
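The desktop half of that experiment is easy to set up; assuming a machine with the NVIDIA driver installed so that `nvidia-smi` is available, the board power limit can be capped at 175W (the mobile 4090's usual budget; the minimum allowed limit depends on the card's vBIOS):

```shell
# Check the supported power-limit range first (min/max depend on the vBIOS)
nvidia-smi -q -d POWER

# Cap GPU 0 at 175 W to mimic the mobile 4090's power budget
# (requires admin/root; resets to the default limit on reboot)
sudo nvidia-smi -i 0 -pl 175
```

Matching the laptop's GPU temperature would still have to be arranged separately, e.g. by slowing the case fans.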



On 2/23/2023 at 11:21 PM, crossshot said:

They are working very hard on it.

 

6.014 is available




On 2/25/2023 at 9:51 AM, srs2236 said:

When did we have an official 2x200W configuration, and in which laptop? Seems a bit crazy. What kind of cooling solution was it using?

 

And again, 250W plus let's say a 125W CPU: where will you get a heatsink that can effectively dissipate 375W of heat?

 

You know what could be a cool experiment? Running a desktop 4090 at a 175W TDP limit against the mobile 4090 and comparing results. But you would have to artificially raise the temperature to match the laptop's, otherwise it would not be fair. Then we could see what the true benefit of the full 4090 chip in a laptop could potentially be.

 

OK, it was only 2 x 190W:

 

https://www.notebookcheck.net/Eurocom-Sky-X9C-i7-8700K-GTX-1080-SLI-Clevo-P870TM1-G-Laptop-Review.274230.0.html#toc-7

 

The Witcher stress test pushed total power consumption to over 450W:

 

[screenshot: Witcher stress test power consumption, from the Notebookcheck review]

 

Here are some power consumption vs performance numbers:

 

[chart: power consumption vs. performance]

 

Excellent performance down to 270W and still good performance at 220W, so at around 225 to 275W we should be looking at something like 80 to 90% of stock performance.
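As a sanity check on that estimate, a simple interpolation over a measured power/score curve gives the retention at any limit. The sample points below are hypothetical placeholders, chosen only to land in the same 80-90% ballpark as the chart; substitute real measurements:

```python
# Rough sketch: estimate performance retention at a reduced power limit
# by linearly interpolating between measured (power, score) points.

def retention(points, watts):
    """Relative performance (1.0 = stock) at a given power limit."""
    points = sorted(points)          # ascending by power limit
    stock = points[-1][1]            # score at the full power limit
    for (p0, s0), (p1, s1) in zip(points, points[1:]):
        if p0 <= watts <= p1:
            score = s0 + (s1 - s0) * (watts - p0) / (p1 - p0)
            return score / stock
    raise ValueError("power limit outside measured range")

# (power limit W, benchmark score) - hypothetical sample curve
curve = [(450, 100), (350, 97), (270, 90), (220, 82)]

print(f"{retention(curve, 250):.0%}")
```

With these placeholder points, a 250W cap comes out around 87% of stock, in line with the 80-90% ballpark above.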

 

 


On 2/24/2023 at 8:01 AM, FTW_260 said:

@srs2236 source:  Razer Blade 16 review + Raytheon ZERO 2023 review.

I agree that the old X170 on 10th/11th gen is obviously weaker than the new products on 13th gen, but because no laptop today offers the same level of cooling and chassis dimensions, the new laptops see a twofold (or greater) drawdown in performance after warming up. It's always been that way.

 

 

For comparison, I would like to see how much CPU power can really be sustained at a certain temperature limit with something like a 30-minute Cinebench run.

 

For example, the MSI GT77 chassis can sustain up to about 150W in such a test for 30 minutes without additional cooling, with stock TIM and at a 95°C PROCHOT. That is quite good imo, and not too far from what I have seen from the X170KM-G in stock configuration. I would be interested to hear how much higher the X170 can go with:

- everything stock and regular tim

- delidding

- different tims including liquid metal

- aftermarket heatsink with air cooling

- aftermarket heatsink with air and water cooling
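For anyone scripting such a sustained-power comparison on Linux, the package power the CPU actually holds can be logged from the kernel's standard intel-rapl powercap counter (the sysfs path below is the stock location; reading it may require root on recent kernels):

```python
# Measure average CPU package power by sampling the RAPL energy counter.
import time
from pathlib import Path

PKG = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

def average_watts(seconds, counter=PKG):
    """Average package power over `seconds`, handling counter wraparound."""
    max_uj = int((counter.parent / "max_energy_range_uj").read_text())
    start = int(counter.read_text())
    time.sleep(seconds)
    end = int(counter.read_text())
    delta_uj = (end - start) % max_uj  # counter wraps at max_energy_range_uj
    return delta_uj / 1e6 / seconds

if __name__ == "__main__":
    # Sample once a minute during a 30-minute Cinebench run
    for _ in range(30):
        print(f"{average_watts(60):.1f} W")
```

Logging once a minute over the whole run makes it easy to see where the sustained plateau settles rather than just the initial boost.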

 

Both are also unbearably noisy with fans at max, so it is a shame that we cannot also standardize noise levels. But then there are other variables we cannot account for either, so looking at the numbers we can control would be a start.

 

I am not that interested in GPU performance, but I have already noticed that Nvidia seems to regulate GPUs to a much higher degree than Intel does CPUs, so comparing CPUs would probably be easier.


I remember reading on the old forum someone had found replacement fans with a better blade design that aided in cooling. Anyone have a lead on these? I'm swapping in the watercooled heatsink and figured I'd do the fans too if I can find them.


Has anybody checked battery life with the unlocked BIOS and found a good combination of BIOS/other settings that would significantly increase battery life?

 

I am currently running my KM-G with supposedly only the iGPU, and still the battery does not last more than two hours with a bit of YouTube playback, which frankly is pathetic when the CPU itself only consumes between 3 and 6W; there must be a lot of power consumption going on elsewhere.

 

 

 

 


2 hours ago, 1610ftw said:

Has anybody checked battery life with the unlocked BIOS and found a good combination of BIOS/other settings that would significantly increase battery life?

 

I am currently running my KM-G with supposedly only the iGPU, and still the battery does not last more than two hours with a bit of YouTube playback, which frankly is pathetic when the CPU itself only consumes between 3 and 6W; there must be a lot of power consumption going on elsewhere.

 

 

I've noticed my 11900KF actually uses quite a lot of power even while idling (8-10W minimum). It is downclocking normally, and I've seen data from reviews that shows this is normal for Rocket Lake. With an iGPU I imagine this would only be higher. Are you sure yours is using only 3-6W?

 

By comparison, my old 9900K could idle as low as 1-2W.



1 hour ago, Sniffy said:

 

I've noticed my 11900KF actually uses quite a lot of power even while idling (8-10W minimum). It is downclocking normally, and I've seen data from reviews that shows this is normal for Rocket Lake. With an iGPU I imagine this would only be higher. Are you sure yours is using only 3-6W?

 

By comparison, my old 9900K could idle as low as 1-2W.

 

I am using a 10850K and it is now using 3W in iGPU mode while playing a YouTube video.

Looks like iGPU mode and a 60Hz refresh for the 1080p panel are helping stretch use to about three hours. Wifi is switched on, Bluetooth is switched off and brightness is at 20%. Still not great, but good enough to get some use out of battery mode.

 

I did not try to kill a bunch of processes or other exotic stuff, just a battery-optimized energy mode and a battery-optimized ThrottleStop profile with turbo boost deactivated, which is still plenty.

 

ThrottleStop tells me that the CPU is at slightly less than 40°C and at 3W according to the info icons in the tray; this has been confirmed by HWMonitor.


31 minutes ago, 1610ftw said:

 

I am using a 10850K and it is now using 3W in iGPU mode while playing a YouTube video.

Looks like iGPU mode and a 60Hz refresh for the 1080p panel are helping stretch use to about three hours. Wifi is switched on, Bluetooth is switched off and brightness is at 20%. Still not great, but good enough to get some use out of battery mode.

 

I did not try to kill a bunch of processes or other exotic stuff, just a battery-optimized energy mode and a battery-optimized ThrottleStop profile with turbo boost deactivated, which is still plenty.

 

ThrottleStop tells me that the CPU is at slightly less than 40°C and at 3W according to the info icons in the tray; this has been confirmed by HWMonitor.

My PC shuts itself off if I unplug the power cord xD The battery is only good for sleep mode lol. I am not sure why this is happening but I assumed it could have something to do with the fact that I am running a 9900KS and increased voltage for my RAM to 1.45v. But yeah, my 775 just can't provide enough power from the battery to even keep the damn thing alive.




On 3/4/2023 at 3:33 PM, srs2236 said:

My PC shuts itself off if I unplug the power cord xD The battery is only good for sleep mode lol. I am not sure why this is happening but I assumed it could have something to do with the fact that I am running a 9900KS and increased voltage for my RAM to 1.45v. But yeah, my 775 just can't provide enough power from the battery to even keep the damn thing alive.

 

Have you checked how much life is left in your battery? My son has a battery that is essentially dead in his P870. It probably was constantly charged to 100% and that killed it. Now it does not even last long enough to carry the machine from one room to the next.

 

Apart from not having a charging technology that properly preserves the battery, it looks to me as if Clevo did not make much of an effort to save energy in these desktop replacement designs. It is clear that the Intel desktop processor is not the issue, with consumption below 5W in my case and less in yours, yet total system power consumption is around 30W even without a dGPU.

 

That is way too much, and just not a very good design when it comes to energy efficiency, given the battery's 96Wh capacity.

 

To comfortably use the X170 for about 4 hours, and taking into account that we do not want to deep-discharge or overcharge the battery, 20W seems like a desirable number, but it is far removed from what the X170 really draws.
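The arithmetic behind that target is straightforward; assuming the 96Wh pack and an (assumed, not Clevo-specified) ~85% usable window to avoid deep discharge and sitting at full charge:

```python
# Battery runtime estimate for the X170's 96Wh pack.

PACK_WH = 96
USABLE = 0.85  # assumed margin: avoid deep discharge and constant 100% charge

def runtime_hours(draw_watts, pack_wh=PACK_WH, usable=USABLE):
    """Hours of runtime at a constant average system draw."""
    return pack_wh * usable / draw_watts

print(f"{runtime_hours(30):.1f} h at the measured ~30 W idle draw")
print(f"{runtime_hours(20):.1f} h at the desired 20 W draw")
```

Which is why a ~30W system draw caps the machine at roughly three hours of light use, while 20W would reach the four-hour mark.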

 


On 3/5/2023 at 1:47 AM, 1610ftw said:

 

Have you checked how much life is left in your battery? My son has a battery that is essentially dead in his P870. It probably was constantly charged to 100% and that killed it. Now it does not even last long enough to carry the machine from one room to the next.

 

Apart from not having a charging technology that properly preserves the battery, it looks to me as if Clevo did not make much of an effort to save energy in these desktop replacement designs. It is clear that the Intel desktop processor is not the issue, with consumption below 5W in my case and less in yours, yet total system power consumption is around 30W even without a dGPU.

 

That is way too much, and just not a very good design when it comes to energy efficiency, given the battery's 96Wh capacity.

 

To comfortably use the X170 for about 4 hours, and taking into account that we do not want to deep-discharge or overcharge the battery, 20W seems like a desirable number, but it is far removed from what the X170 really draws.

 

 

I've been using FlexiCharger in the BIOS to keep my battery between 50-60%.

 

My total system power is also 30W+ when idle, and it jumps much higher even when just watching videos in a web browser. By those numbers the battery fills more of a UPS role than anything else.



I tried giving the BIOS unlocking feature provided by @ViktorV a run, and while everything unlocked, nothing I do changes the memory timings. I tried XMP and custom XMP changes and nothing adjusts.

 

I'm wondering if this BOXX BIOS (1.07.02) is suffering from the XMG plague and is incompatible with acknowledging the unlocking changes.

 

What is the best known BIOS to use with the KM to run the unlocker, one that is known to work? I'll give that one a try.

 

 


Electrosoft Prime: 7950X3D | MSI X670E Carbon  | MSI Suprim X Liquid 4090 | AC LF II 420 | G.Skill 6000 A-Die 2x32GB | Samsung 990 Pro 2TB | EVGA 1600w P2 | Phanteks Ethroo Pro | Alienware AW3225QF 32" OLED

Eurocom Raptor X15 | 12900k | Nvidia RTX 3070ti | 15.6" 1080p 240Hz | Kingston 3200 32GB (2x16GB) | Samsung 980 Pro 1TB Heatsink Edition
Heath: i9-12900k | EVGA CLC 280 | Asus Strix Z690 D4 | Asus Strix 3080 | 32GB DDR4 2x16GB B-Die 4000 | WD Black SN850 512GB | EVGA DG-77 | Samsung G7 32" 144Hz

MelMel:  (Retrofit currently in progress)

 

 

 


 


1.07.08 is the latest known stock BIOS and it's possible to unlock it:

https://my.hidrive.com/share/yze8mg-wf8#$/BIOS and EC Firmware/CLEVO/X_Series/X170KM-G




Wonder how the modified BIOS is coming along, with the EC firmware corrections etc. It's been a tad quiet on that front, it would seem.


{Main System:} The Beast

Spoiler

{Cooling:} Corsair H170i Elite

{Mainboard:} Asrock X670E Pro

{CPU/GPU:} AMD Ryzen R9 7900x3D / AMD RX 7900 XTX (Asrock Phantom)

{RAM/Storage:} 2x 16GB DDR5 Corsair Vengeance 6400MT/s, 13TB WD SN850X 2x 4TB, 2x 2TB, 1x 1TB

{PSU/Case:} Corsair RM 1000x V2, Corsair 7000D Airflow (Black)

{OS:} Windows 11 Pro

 

Realtek Nahimic 3 Modded Driver for MSI Systems: Latest
 


Anyone able to confirm fan control works in Linux using Tuxedo Control Center for the X170?

 

edit: The answer is that it works great. Using all the Tuxedo packages, Linux is flawless. The Tuxedo Control Center functionality is all there even without Tuxedo firmware. There is one thing that slows the boot process down massively (I think it is Thunderbolt related), but there is a workaround I have yet to try.
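For reference, the fans those Tuxedo packages drive end up exposed through the kernel's standard hwmon sysfs interface, so a quick check that the driver is loaded and the fans are visible can be scripted (the /sys path is the stock kernel convention; the sensor name the Tuxedo driver registers may vary by kernel/driver version):

```python
# List every fan-speed sensor exposed under the standard hwmon sysfs tree.
from pathlib import Path

def read_fans(hwmon_root="/sys/class/hwmon"):
    """Return {sensor_name: {fan_file: rpm}} for all hwmon fan inputs."""
    fans = {}
    for hwmon in sorted(Path(hwmon_root).glob("hwmon*")):
        name = (hwmon / "name").read_text().strip()
        for fan in sorted(hwmon.glob("fan*_input")):
            fans.setdefault(name, {})[fan.name] = int(fan.read_text())
    return fans

if __name__ == "__main__":
    for name, readings in read_fans().items():
        for fan, rpm in readings.items():
            print(f"{name}: {fan} = {rpm} RPM")
```

If the relevant sensor shows up here, Tuxedo Control Center (or any generic fan tool) has something to work with.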


