NotebookTalk

Bought RTX 3080 Laptop GPU for my P775TM1-G.


srs2236


How am I expected to cool the CPU?!?!

 

Also, as soon as the package power falls below 180W, the CPU starts downclocking itself from 5GHz. Yikes...

 

This is with a -0.130V undervolt in one pass of Cinebench R23:

[screenshot]

Clevo P775TM1-G:


GPU: NVIDIA GeForce RTX 3080 Laptop 16GB (150W, +110MHz Core, +350MHz Mem)

CPU: Intel(R) Core(TM) i9-9900KS (5GHz, 4GHz Cache, -130mV, 255A, 200W PL1/2)

RAM: DDR4 32 GB (3333MHz, 14-17-17-32, 2x16, Micron rev.E, 1.45v)

Storage 1: Kingston KC3000 2TB RAID0 (2x1TB, NVME, PCI-E 3.0)

Storage 2: Seagate LM015 2TB (2.5, HDD, SATA3)

Storage 3: Integral UltimaPro 512GB (SDXC, 100r/50w, PCI-E)

Display: AU Optronics B173ZAN0.10 (4K, 60Hz)

Wi-Fi/BT: Killer(R) Wireless-AC 1550 (9260NGW, PCI-E)

Operating system: Windows 11 Pro x64 (23H2)


Yeah, it is a bit crazy for this CPU.

 

I don't know how yet, but I think sooner or later I will have to go with liquid metal or something.

 

[screenshot]

 

The first pass looks very promising, but it draws an unrealistic 180W for my cooler, and it eventually thermal throttles down to 150W, which equals only 4.67GHz.
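
As a rough sanity check (back-of-the-envelope only: it assumes package power scales roughly as f·V² with voltage falling roughly in step with frequency, which is a simplification, not a measured model), a 150W cap landing in the high-4.6GHz range is about what that simple model predicts:

```python
# Back-of-the-envelope: sustained all-core clock under a power cap,
# assuming package power P ~ f * V^2 and V roughly proportional to f
# (so P ~ f^3). Both are simplifications, not a measured model.
def clock_at_power(p_target_w, p_ref_w=180.0, f_ref_ghz=5.0):
    return f_ref_ghz * (p_target_w / p_ref_w) ** (1.0 / 3.0)

print(f"{clock_at_power(150):.2f} GHz at 150 W")  # ~4.7 GHz, close to the observed 4.67 GHz
```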

 

I was able to get an extra -0.015V of undervolt by disabling the Intel TVB voltage optimization. It had a bug where the undervolt would be stable under load, but the voltage would drop too much at idle and crash the machine during long idle periods.

 

However, -0.135V would crash on the 5th pass of Cinebench. This test was done with -0.130V.

 

I am wondering what other people's Cinebench scores are with Clevo laptops. Is there a thread or a set of benchmarks from other people somewhere that I am missing? I have a good idea now of how well my GPU performs against other similar builds, but I'm not sure about the CPU performance in my scenario. From a quick Google search, people seem to say that the i9-9900KS should average around 14k in Cinebench; mine is just 11.5k... yikes 😕 And this is without a load on the GPU. With a load on the GPU, the CPU package power drops even lower, more like 100W in games.
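
For what it's worth, if you assume the multi-core score scales roughly linearly with the sustained all-core clock (a rough assumption that ignores memory and scheduling effects), 11.5k against a ~14k reference implies the chip is effectively averaging closer to 4.1GHz than 5GHz over the run:

```python
# Rough implied sustained all-core clock from a Cinebench R23 multi-core score,
# assuming the score scales ~linearly with frequency (a simplification that
# ignores memory bandwidth, scheduling, etc.).
def implied_clock_ghz(score, ref_score=14000.0, ref_clock_ghz=5.0):
    return ref_clock_ghz * score / ref_score

print(f"~{implied_clock_ghz(11500):.1f} GHz effective")  # ~4.1 GHz, well below the 5 GHz target
```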

 

[screenshot]

[screenshot]



On 10/27/2022 at 4:40 AM, Reciever said:

I have seen a number of things; among them: liquid metal, de-lid, bare die, BartX IHS, second fans, water blocks, Silicon Lottery

 

Going to have to get a little creative 🙂

Actually, I have been thinking about this for a little while now.

 

I always run my fans at max speed. When I ran the Superposition benchmark, the GPU drew a constant 150W, yet it stayed comfortably cool, reaching only 60C in the very last moments of the benchmark.

 

Yet if I run a CPU benchmark like Cinebench at 150W, one of the CPU cores reaches 99C almost immediately.

 

Something doesn't feel right... I know for a fact the GPU side has more copper mass, but still - it's starting to sound like poor thermal transfer to me.
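
To put a rough number on that (a back-of-the-envelope estimate assuming roughly 25C intake air, and ignoring that the CPU figure is a single-core hotspot while the GPU figure is the reported GPU temperature):

```python
# Rough effective thermal resistance (C per watt), assuming ~25 C intake air.
# Note: the CPU number is a per-core hotspot while the GPU number is the
# reported GPU temperature, so this is only indicative.
ambient_c = 25.0

r_gpu = (60.0 - ambient_c) / 150.0   # Superposition: ~60 C at a constant 150 W
r_cpu = (99.0 - ambient_c) / 150.0   # Cinebench: hottest core at 99 C around 150 W

print(f"GPU ~{r_gpu:.2f} C/W, CPU ~{r_cpu:.2f} C/W")  # ~0.23 vs ~0.49 C/W
```

Roughly double the effective thermal resistance on the CPU side would fit a contact or mounting issue rather than just less copper.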

 

Maybe it's the fact that the GPU is direct die? Or maybe it just has better contact... I use Thermal Grizzly Kryonaut for both.

I am very wary of liquid metal because it can destroy the whole PC if not applied correctly, plus there is the issue of re-application due to LM reacting with copper.

 

I have been thinking about the IHS and a change of thermal paste, though.

 

The idea was to swap the IHS for a bigger one and put LM under the IHS, then maybe change the thermal paste from Kryonaut to the Honeywell phase-change one. I've seen some people praise it a lot.



Truly happy about this one:

[screenshot]



3 hours ago, srs2236 said:


Don't expect to get 9900KS desktop stock performance in your machine. Even with a delid, liquid metal, high-end thermal paste, higher contact pressure, and high-end pads + higher pressure on the CPU VRMs, I wasn't able to get stock desktop performance out of my Silicon Lottery binned 9900K. As bro @Reciever already said, you'll need to get creative 😄


Mine: Hyperion "Titan God of Heat, Heavenly Light, Power" (2022-24)
AMD Ryzen 9 7950X (TG High Perf. IHS) / Asus ROG Crosshair X670E Extreme / MSI Geforce RTX 4090 Suprim X / Teamgroup T-Force Delta RGB DDR5-8200 2x24 GB / Seagate Firecuda 530 4 TB / 5x Samsung 860 Evo 4 TB / Arctic Liquid Freezer II 420 (Push/Pull 6x Noctua NF-A14 IndustrialPPC-3000 intake) / Seasonic TX-1600 W Titanium / Phanteks Enthoo Pro 2 TG (3x Arctic P12 A-RGB intake / 4x Arctic P14 A-RGB exhaust / 1x Arctic P14 A-RGB RAM cooling) / Samsung Odyssey Neo G8 32" 4K 240 Hz / Ducky One 3 Daybreak Fullsize Cherry MX Brown / Corsair M65 Ultra RGB / PDP Afterglow Wave Black / Beyerdynamic DT 770 Pro X Limited Edition

 

My Lady's: Clevo NH55JNNQ "Alfred" (2022-24)
Sharp LQ156M1JW03 FHD matte 15.6" IGZO 8 bit @248 Hz / Intel Core i5 12600 / Nvidia Geforce RTX 3070 Ti / Mushkin Redline DDR4-3200 2x32 GB / Samsung 970 Pro 1 TB / Samsung 870 QVO 8 TB / Intel AX201 WIFI 6+BT 5.2 / Win 11 Pro Phoenix Lite OS / 230 W PSU powered by Prema Mod!


3 hours ago, Scruffy said:

Nope, I'm not into it and don't know where to look 😄

 

About your high temperature, I would just downclock the CPU to around 4.6GHz.

Sounds about right to me for heavy CPU loads without AVX, with all the cooling stops available to the user pulled out.



  • 1 month later...
On 10/28/2022 at 11:32 AM, srs2236 said:


 

The 9900K is a hot CPU. My son was playing a game today with his new 9900K that we got from brother @cylix, and with max cores set to 5GHz the CPU would go up to 99 degrees even in some games. We then backed off to 4.8GHz and temps stayed at 90 degrees max, usually in the mid 70s to mid 80s, with the fans not being excessively loud. That is a drop of at least 10 degrees for a 4% lower clock speed, and I am sure that backing off to 4.7 all-core would yield a further improvement; it may also allow a bit more undervolt to reduce power consumption further. That was with a P870TM, but with the less beefy CPU heatsink and a modified cooling pad with Noctua fans that we also got from brother @cylix.

 

So if you are really into CPU-only tasks or chasing extremely good benchmark scores, then I would say go for it; but if you aren't, then you may want to consider what good all of this will do when the total power available for your GPU and CPU is only about 250W. You can probably accommodate that with your current setup already, and it is more worthwhile to shoot for lower CPU temps to extend the life of your P775. In any case, if you type on an external keyboard at home, then I strongly recommend a cooling pad, and consider running your P775 with the bottom cover off, or at least with some strategically placed holes - that should drop your average CPU temps by 10 to 15 degrees in total before taking any other measures.

 

 


  • 4 weeks later...

Played around with Afterburner and found this!

 

While playing Rocket League (didn't have time to test anything else), core 6 on average ran about 17C hotter than all the other cores! (The rest of the cores were within about 4C of each other.)

 

How can it be so bad? I had noticed this as well in XTU, where my CPU was thermal throttling even though just one core was at 99C and the others were lower, around 90C or so.

 

I would understand Rocket League loading just one core, but then why isn't it the 1st core instead of the 6th one?

 

And it's a roughly 17C difference all the time, and that's just too much... Has anyone experienced something like this? I think it definitely shouldn't be like this.
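
If anyone wants to put a number on this on their own machine, here is a minimal sketch that computes the hottest core's average offset from a logged CSV (it assumes a HWiNFO-style export with one temperature column per core; the column names are placeholders you would need to adjust):

```python
# Quantify the hottest core's average offset vs. the other cores from a CSV log.
# Assumes a HWiNFO-style export with one temperature column per core; the
# column names below are placeholders - adjust them to match your log.
import csv
from statistics import mean

CORE_COLS = [f"Core {i} [C]" for i in range(8)]  # placeholder column names

def hottest_core_delta(path):
    per_core = {c: [] for c in CORE_COLS}
    with open(path, newline="", encoding="utf-8", errors="ignore") as f:
        for row in csv.DictReader(f):
            for c in CORE_COLS:
                try:
                    per_core[c].append(float(row[c]))
                except (KeyError, TypeError, ValueError):
                    pass  # skip footer/malformed rows
    avgs = {c: mean(v) for c, v in per_core.items() if v}
    hottest = max(avgs, key=avgs.get)
    others = mean(t for c, t in avgs.items() if c != hottest)
    return hottest, avgs[hottest] - others

core, delta = hottest_core_delta("hwinfo_log.csv")
print(f"{core} runs {delta:.1f} C above the average of the other cores")
```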

 

 

[screenshot]



Yeah… the lack of G-Sync is a big no-go for me when it comes to upgrading to the 3080…

XMG Neo 17 (E24)  | 14900HX | 32GB (2X16) Corsair Vengeance DDR5@6600 CL30 | RTX4090 (+285/900) | QHD @240Hz G-Sync | 2x2TB Samsung 990 PRO | Killer WiFi 1675i | Oasis MK2 WC | Win 11 Home - TimeSpy

 

Clevo X170KM-G | 11700KF | 64GB (4x16) Crucial Ballistix DDR4@3066 CL14 | RTX3080 (+220/1650) | QHD @165Hz G-Sync | 1TB WD SN850X + 1TB WD SN750 | Intel AX200 WiFi | Win 11 Pro - TimeSpy


On 1/13/2023 at 11:27 AM, srs2236 said:


Had the same problem with my 9600KF. Solved after a delid and die lap.

Desktop - MSI X670E Tomahawk Wifi (cheap Ebay mobo that I fixed) | AMD 7800X3D | 32 GB Trident Z5 Neo RGB 6000Mhz | MSI RTX 4070TI Suprim X  | Alienware 27 AW2724DM 2K 165 Hz Gsync | Samsung 990 Pro Nvme - Boot | Other various storage | Windows 10 Pro x64

SOLD - Clevo P870DM-G | i9-9700K 4.5 Ghz on all cores (-50 mv undervolted) | 32GB Hyper X Black 2666MHz | Clevo RTX 2080 3.1b undervolted for better temp 1905Mhz @881 mv | AUO B173HAN03.1 144hz Gsync | Samsung 980 NVME | Dsanke TM BIOS - Chujoi13 adapted based on needs | Network Card: Intel AX210-AX | Windows 10 Pro x64

 


15 minutes ago, MiRaGe said:

Yeah… the lack of G-Sync is a big no-go for me when it comes to upgrading to the 3080…

As things stand right now, G-Sync will never work due to the inability to decrypt the G-Sync cookies. Also, the Green Goblin would be pissed...


 


This means I get smoother FPS now with the RTX 2080 and working G-Sync than I would if I upgraded to the RTX 3080, doesn't it?

I am speaking about competitive games like Warzone.



4 hours ago, MiRaGe said:

This means I get smoother FPS now with the RTX 2080 and working G-Sync than I would if I upgraded to the RTX 3080, doesn't it?

I am speaking about competitive games like Warzone.

Are we talking Full HD here, or?


 



2 hours ago, runix18 said:

Are we talking Full HD here, or?

Yes, Full HD 144Hz



On 10/19/2022 at 4:09 PM, srs2236 said:

G-Sync at the moment is still not working. I personally think it's probably possible to get it back if the mods are done correctly; maybe some parts of the code need to be borrowed from the X170KM-G BIOS, I am not sure. But at the moment I am leaving it as it is, as I don't really care that much about G-Sync for the internal panel.

Hello,
Regarding G-Sync: it is definitely not a hardware problem! It needs a BIOS update! With your modded BIOS, does everything work as before?


On 1/17/2023 at 4:29 PM, MiRaGe said:

This means I get smoother FPS now with the RTX 2080 and working G-Sync than I would if I upgraded to the RTX 3080, doesn't it?

I am speaking about competitive games like Warzone.

I don't think G-Sync alone can make for a smoother experience than a 3080 without G-Sync, but that might be a subjective topic. I wouldn't recommend the 3080 over the 2080 you have, simply because it doesn't make sense performance-wise. I only really understood how small the performance difference was once I had the card.

 

And to respond about the G-Sync mods - yeah, hardware is not the problem, but unless you have some secret BIOS update that can fix it, it will require someone with a really good understanding of how G-Sync verification on laptops works. From my limited understanding, not only are there some cookies or whatever encrypted in the BIOS for each specific display panel, there are also hardware checks in the NVIDIA drivers themselves. Put simply, bringing it back to life would be a pain in the ass, and I don't think it would be worth it anyway. For example, I am thinking of getting a new panel for my Clevo, and I can guarantee I will never see G-Sync support on a panel that was not originally shipped with the P775TM1-G. There might be a way to restore G-Sync for the original, officially supported panels, but definitely not for new ones that are not on the list.

 

If somebody could make a proper BIOS mod so that the 3080 gets spoofed as a 2080 G-Sync model, maybe it would work.
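
Purely as an illustration of what that kind of edit looks like (the module filename and both subsystem IDs below are made-up placeholders, not the real values - this is a sketch of the technique, not a working mod):

```python
# Purely illustrative: swap one PCI subsystem ID byte pattern for another in a
# dumped BIOS module. The filename and both ID values are made-up placeholders,
# NOT real offsets/IDs - flashing a blindly patched image can brick the machine.
OLD_SUBSYS = bytes.fromhex("AA551558")  # placeholder: ID the module currently reports
NEW_SUBSYS = bytes.fromhex("BB661558")  # placeholder: ID you want it to report

with open("gpu_module.bin", "rb") as f:
    data = f.read()

print(f"found {data.count(OLD_SUBSYS)} occurrence(s) of the old ID")

with open("gpu_module_patched.bin", "wb") as f:
    f.write(data.replace(OLD_SUBSYS, NEW_SUBSYS))
```

The script part is trivial; the real work is knowing which module to dump and which IDs are actually involved, which is what the hex-edit posts in this thread are about.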



 

On 10/19/2022 at 6:09 PM, srs2236 said:

Successfully modified the BIOS to get rid of the need for an INF mod and to have the card correctly recognized in the BIOS settings:

 

[photo]

[photo]

 

GeForce Experience now shows notifications for new drivers, and drivers from the NVIDIA website install without problems.

 

This is really nice, because even when Windows 11 updated from 21H2 to 22H2 I had to reinstall the drivers again with the INF mod, driver signature enforcement disabled, etc., and that is just a pain. So now this is no longer an issue, plus I get automatic updates.

 

This was achieved by altering the device ID: the subsystem part now reports the same ID as the X170KM-G.

 

G-Sync at the moment is still not working. I personally think it's probably possible to get it back if the mods are done correctly; maybe some parts of the code need to be borrowed from the X170KM-G BIOS, I am not sure. But at the moment I am leaving it as it is, as I don't really care that much about G-Sync for the internal panel.

 

Also, Dynamic Boost is still not working. I don't have the slightest clue whether it is even possible to get it working with some mods (it would be nice to run the card at 165W).

 

When I have time, I will make a quick YouTube tutorial video on how to modify the P775 BIOS to make it compatible with 30-series cards, for anyone who might need it in the future. It is not hard and only requires some hex edits in two of the BIOS modules.

 

 

Good afternoon. Please share the BIOS. I also want a BIOS that supports 30-series cards. Do you have a modified BIOS installed?


Eurocom Sky X7C / Clevo P775TM1-G / i9-9900T  / RTX 2060 / 32GB DDR4 / 780W AC Adapter


Hi!

 

Sorry for not replying earlier. Here are all of my pictures from my installation - the same ones that have broken links now, plus more:

https://files.fm/u/mq83nsjzp

 

And here are the ones you asked for now:

[photos]


