NotebookTalk

Anyone use NVIDIA Inspector? Is 1.9.8.1 beta the latest?


raptorddd

Recommended Posts

I have NVIDIA Inspector 1.9.8.1 beta.

I found his GitHub, but it only has Profile Inspector.

Found another GitHub with version 19.69, but I could not find the file to download; it only has a .md readme file.

 

Latest here:

https://orbmu2k.de/tools/nvidia-inspector-tool
http://download.orbmu2k.de/files/nvidiaInspector.zip

 

Dell Precision M4600

i7-2760QM

8GB RAM

Crucial MX500 500GB SSD

Win 10 21H2


1.9.8.1 is what I have.  Haven't been able to find a newer one (but haven't looked recently).  It works fine for what I need, even with later NVIDIA driver versions.

  • Thumb Up 1

Dell Precision 7770 (personal) • Dell Precision 7560 (work) • Full specs in spoiler block below
Info posts (Dell) — Dell Precision key posts • Dell driver RSS feeds • Dell Fan Management — override fan behavior
Info posts (Windows) — Turbo boost toggle • The problem with Windows 11 • About Windows 10 LTSC

Spoiler

Dell Precision 7770 (personal)

  • Intel Core i9-12950HX ("Alder Lake"), 8P+8E
    • 8× P cores ("Golden Cove"): 2.3 GHz base, 5.0 GHz turbo, hyperthreading
    • 8× E cores ("Gracemont"): 1.7 GHz base, 3.6 GHz turbo
  • 128GB DDR5-3600 (CAMM)
  • NVIDIA GeForce RTX 3080 Ti 16GB (DGFF)
  • Storage:
    • 2TB system drive: Samsung 980 Pro, PCIe4
    • 24TB additional storage: 3× Sabrent Rocket 4 Plus 8TB, PCIe4 (Storage Spaces)
  • Windows 10 (Enterprise LTSC 2021)
  • 17.3" 3840×2160 display
  • Intel Wi-Fi AX211 (Wi-Fi 6E + Bluetooth)
  • 93Wh battery
  • IR webcam
  • Fingerprint reader

 

Dell Precision 7560 (work)

  • Intel Xeon W-11955M ("Tiger Lake")
    • 8×2.6 GHz base, 5.0 GHz turbo, hyperthreading ("Willow Cove")
  • 64GB DDR4-3200 ECC
  • NVIDIA RTX A2000 4GB
  • Storage:
    • 512GB system drive (Micron 2300)
    • 4TB additional storage (Sabrent Rocket Q4)
  • Windows 10 (Enterprise LTSC 2021)
  • 15.6" 3840×2160 display
  • Intel Wi-Fi AX210 (Wi-Fi 6E + Bluetooth)
  • 95Wh battery
  • IR webcam
  • Fingerprint reader

 

Previous

  • Dell Precision 7530, 7510, M4800, M6700
  • Dell Latitude E6520
  • Dell Inspiron 1720, 5150
  • Dell Latitude CPi

Well, I was using 1.9.7.3, clicked on the update link, and it updated to 1.9.8.1. Thanks for the notice; I just never updated it because it worked, and I'm using it on an old GTX 980M. It downloaded and installed itself, so it seems there's no need to download anything manually.

The NVIDIA INSPECTOR - CHANGELOG states that Version 1.9.8.1 Beta is the latest.

  • Thumb Up 1

Alienware M18x R2 i7-3920XM 16GB DDR3 Quadro P4000 • Alienware M17x R4 i7-3940XM 16GB DDR3-1866 Quadro P4000
Precision M6700 i7-3840QM 16GB DDR3 GTX 970M • Alienware M17x R4 i7-3940XM 32GB DDR3-1600 GTX 680M 120Hz 3D
Precision M4700 i7-3610QM 8GB DDR3 @ 1600MHz K2000M

GOBOXX SLM G2721 i7-10875H RTX 3000 32GB DDR4 (gave to my wife)

 


I've never used NVIDIA Inspector, but since I have no way of utilising the integrated graphics on my laptop with a 980M, I was wondering if there are significant power-saving benefits that could be extracted from it by undervolting or limiting clocks. Or does the GPU firmware already do a decent job with power efficiency under lighter loads? I can't really play games with my current health, so I'm not making much use of it. And my electricity prices have recently gone up by a factor of 3, so I need to save every watt I can 👾


On 7/30/2022 at 7:34 PM, Ishatix said:

I've never used NVIDIA Inspector, but since I have no way of utilising the integrated graphics on my laptop with a 980M, I was wondering if there are significant power-saving benefits that could be extracted from it by undervolting or limiting clocks. Or does the GPU firmware already do a decent job with power efficiency under lighter loads? I can't really play games with my current health, so I'm not making much use of it. And my electricity prices have recently gone up by a factor of 3, so I need to save every watt I can 👾

I'm actually using NVIDIA Inspector to generate a batch file that I run automatically at Windows startup. The batch file forces the 980M to the lowest possible voltages and clocks no matter the load (completely fine for desktop usage) and thus severely limits its power usage.

I manually tested what kind of clocks I can get away with; if you go too low, you start seeing graphical glitches and/or stutters.

  • Thumb Up 1
  • Like 1

Mine: Hyperion "Titan God of Heat, Heavenly Light, Power" (2022)
AMD Ryzen 9 7950X / Asus ROG Crosshair X670E Extreme / MSI GeForce RTX 4090 Suprim X / G.Skill Trident Z5 RGB DDR5-6600 2x16 GB / Seagate FireCuda 530 4 TB / 2x Samsung 860 Evo 4 TB / Arctic Liquid Freezer II 420 / Seasonic TX-1600 Titanium / Phanteks Enthoo Pro 2 TG / Samsung Odyssey Neo G8 32" UHD 240 Hz / Ducky One 3 Daybreak Fullsize Cherry MX Brown / Corsair M65 Ultra RGB

My Lady's: Clevo NH55JNNQ "Alfred" (2022)
Sharp LQ156M1JW03 FHD matte 15.6" IGZO 8-bit @ 248 Hz / Intel 12600 @ 4.4-4.8 GHz / Nvidia 3070 Ti 8 GB GDDR6 / G.Skill 16 GB DDR4-3800 / Samsung 970 Pro 1 TB / Intel AX201 ax+BT / Win 11 Pro Phoenix Lite OS / 230 W PSU powered by Prema Mod!


On 7/30/2022 at 10:10 PM, jaybee83 said:

I'm actually using NVIDIA Inspector to generate a batch file that I run automatically at Windows startup. The batch file forces the 980M to the lowest possible voltages and clocks no matter the load (completely fine for desktop usage) and thus severely limits its power usage.

I manually tested what kind of clocks I can get away with; if you go too low, you start seeing graphical glitches and/or stutters.

 

Oh wow, that sounds like just what I need! Did you have to do much testing to determine the lowest safe level, like with undervolting the CPU, or is there a level low enough to be considered safe without much testing? I'm very limited in "doing things" capability these days, unfortunately... I still haven't even managed to update my GPU driver, which I've been meaning to do for years now (so any recommendations on the best driver for a 980M are also appreciated).


32 minutes ago, Ishatix said:

 

Oh wow, that sounds like just what I need! Did you have to do much testing to determine the lowest safe level, like with undervolting the CPU, or is there a level low enough to be considered safe without much testing? I'm very limited in "doing things" capability these days, unfortunately... I still haven't even managed to update my GPU driver, which I've been meaning to do for years now (so any recommendations on the best driver for a 980M are also appreciated).

If you want quick and dirty, you can just force the lowest idle 2D clocks that your GPU sets at stock. Then you know for sure that there are no instabilities 🙂

Otherwise, it's actually much quicker than undervolting a CPU, since you start seeing glitches or stuttering right away once you go too low.

As for drivers, I'm also still on 451.48 *lol*. I'm using a mod provided by J95 from back in the NBR days.

For more current ones, I can warmly recommend the Xtreme G (XG for short) drivers; they're stripped of telemetry and have some tweaks implemented: https://www.reddit.com/r/XtremeG/

  • Thumb Up 1
  • Thanks 1



On 7/27/2022 at 4:58 AM, raptorddd said:

I have NVIDIA Inspector 1.9.8.1 beta.

I found his GitHub, but it only has Profile Inspector.

Found another GitHub with version 19.69, but I could not find the file to download; it only has a .md readme file.

After a bit of searching around, I finally found the developer's home page and a direct download link, as follows:
https://orbmu2k.de/tools/nvidia-inspector-tool
http://download.orbmu2k.de/files/nvidiaInspector.zip

 

The sensor monitoring doesn't display so well on my monitor, with the lower portion of the text cut off by the borders between the different segments. Also, none of the three power monitoring options show anything, and the first "Power and Temperature Target" slider (supposed to be power?) in the overclocking section is greyed out.

 

On 8/4/2022 at 10:52 PM, jaybee83 said:

If you want quick and dirty, you can just force the lowest idle 2D clocks that your GPU sets at stock. Then you know for sure that there are no instabilities 🙂

Otherwise, it's actually much quicker than undervolting a CPU, since you start seeing glitches or stuttering right away once you go too low.

 

Thanks! How do you set a particular clock? I can only see options for offsetting them. The other setting I wondered about is what happens if I just set the temperature target to the lowest it will go (64°C), though I'm not sure that will achieve much, considering the highest GPU temperature I can generate in casual usage is only 53°C (by keeping the clock at max, jumping back and forth constantly on the timeline of a 4K video).

 

Anyway, looking at all the numbers here, I'm not sure there is much to be gained really. Do you have any idea of the power savings your settings deliver?

 

On 8/4/2022 at 10:52 PM, jaybee83 said:

As for drivers, I'm also still on 451.48 *lol*. I'm using a mod provided by J95 from back in the NBR days.

For more current ones, I can warmly recommend the Xtreme G (XG for short) drivers; they're stripped of telemetry and have some tweaks implemented: https://www.reddit.com/r/XtremeG/

 

And thanks again! That sounds fancy and modern compared to my 368.22, lol. Do the XG drivers have any particular advantage over installing with NVCleanstall?

  • Thumb Up 1

9 hours ago, Ishatix said:

After a bit of searching around, I finally found the developer's home page and a direct download link, as follows:
https://orbmu2k.de/tools/nvidia-inspector-tool
http://download.orbmu2k.de/files/nvidiaInspector.zip

The sensor monitoring doesn't display so well on my monitor, with the lower portion of the text cut off by the borders between the different segments. Also, none of the three power monitoring options show anything, and the first "Power and Temperature Target" slider (supposed to be power?) in the overclocking section is greyed out.

Thanks! How do you set a particular clock? I can only see options for offsetting them. The other setting I wondered about is what happens if I just set the temperature target to the lowest it will go (64°C), though I'm not sure that will achieve much, considering the highest GPU temperature I can generate in casual usage is only 53°C (by keeping the clock at max, jumping back and forth constantly on the timeline of a 4K video).

Anyway, looking at all the numbers here, I'm not sure there is much to be gained really. Do you have any idea of the power savings your settings deliver?

And thanks again! That sounds fancy and modern compared to my 368.22, lol. Do the XG drivers have any particular advantage over installing with NVCleanstall?

 

Yep, that's the original developer. Good find.

 

I think you have to change your "high DPI settings" under file properties → compatibility to correct that flawed display. Some apps need to be adjusted like that; otherwise you'll see the weird behaviour you're describing.

 

Yep, the power and temp targets are greyed out in my case as well, so that's normal. What do you mean by power monitoring options? Are you referring to the various performance levels shown at the top of the advanced overclocking menu?

 

As for setting clocks: you can set those individually for the four (in my case, with the 980M) listed performance profiles, P8, P5, P1, and P0. Basic rule of thumb: the higher the number, the lower the power state of the GPU. So P0 is 3D active with full clocks and voltages, P1 is 3D idle (i.e., when you're in game menus), P5 is 2D active, and P8 is 2D idle with the lowest clocks and voltages. Don't quote me on these descriptions; this is just how I remember it 🙂

For P8, P5, and P1, you can set core and VRAM clocks as absolute values, with no option to change the core voltage.

For P0, the core and VRAM clocks are set as offsets, and you also have the option to change the core voltage as an offset in steps of 12.5 mV.

 

Don't worry about the temperature slider; that's only useful for overclocking, to prioritize temperature and give the GPU a bit more thermal headroom.

 

For my particular case, I've set the following power-saving clocks: P8, P5, and P1 are all set to 135 MHz core and 180 MHz VRAM. For P0, I set a core offset of -810 MHz and a VRAM offset of -905 MHz. I left the core voltage at 0 mV, but I'm also sporting a Prema Mod vBIOS that lowers the 3D active core voltage to 1.000 V (a bit lower than stock; I don't remember the exact stock value, though).

Once you set everything and apply, you can click "Create Clocks Shortcut" in the lower left corner of the advanced OC menu. That will create a shortcut on your desktop that applies these settings via NVIDIA Inspector.

For a custom batch file, open those shortcuts with Notepad, copy the respective clock and voltage commands into a batch file, and set it to be executed at Windows startup. Voilà, there you go 🙂
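As a concrete illustration, a startup batch file built this way might look like the sketch below. The nvidiaInspector.exe switch names and numbers here are placeholders (they vary by card and Inspector version), so copy the exact command lines out of the shortcut that "Create Clocks Shortcut" generates rather than typing these by hand:

```bat
@echo off
rem Power-saver clocks for a GTX 980M, applied at every Windows login.
rem NOTE: the switches below are illustrative only -- use the exact
rem arguments from the shortcut NVIDIA Inspector generated for your card.

cd /d "C:\Tools\NvidiaInspector"

rem Repeat the whole set a few times so the settings reliably stick.
for /l %%i in (1,1,3) do (
    rem Absolute clocks for the low-power states: 135 MHz core / 180 MHz VRAM
    nvidiaInspector.exe -setGpuClock:0,8,135 -setMemoryClock:0,8,180
    nvidiaInspector.exe -setGpuClock:0,5,135 -setMemoryClock:0,5,180
    nvidiaInspector.exe -setGpuClock:0,1,135 -setMemoryClock:0,1,180
    rem Offsets for the full-power P0 state: -810 MHz core, -905 MHz VRAM
    nvidiaInspector.exe -setBaseClockOffset:0,0,-810 -setMemoryClockOffset:0,0,-905
)
```

Saving this as, say, powersave.bat and pointing a Startup-folder shortcut at it makes it run at each login; the repetition loop is there because Inspector sometimes needs more than one application before the clocks actually take.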

 

To return to stock, you can just click "Apply Defaults" and you're back where you started.

 

As for power savings: stock board power is around 100-125 W; with the above power-saver settings I'm maxing out at 25-26 W, so we're talking about a 75-80% reduction in maximum power draw for the GPU.

 

As for the XG drivers, I don't remember all the special-sauce tweaks they implement, but you can read up on them at the link I provided before. I just remember they do some performance tweaks in addition to stripping out the telemetry that we all love to hate haha 😛

  • Thumb Up 1
  • Thanks 1



On 8/6/2022 at 5:40 AM, Ishatix said:

After a bit of searching around, I finally found the developer's home page and a direct download link, as follows:
https://orbmu2k.de/tools/nvidia-inspector-tool
http://download.orbmu2k.de/files/nvidiaInspector.zip

The sensor monitoring doesn't display so well on my monitor, with the lower portion of the text cut off by the borders between the different segments. Also, none of the three power monitoring options show anything, and the first "Power and Temperature Target" slider (supposed to be power?) in the overclocking section is greyed out.

Thanks! How do you set a particular clock? I can only see options for offsetting them. The other setting I wondered about is what happens if I just set the temperature target to the lowest it will go (64°C), though I'm not sure that will achieve much, considering the highest GPU temperature I can generate in casual usage is only 53°C (by keeping the clock at max, jumping back and forth constantly on the timeline of a 4K video).

Anyway, looking at all the numbers here, I'm not sure there is much to be gained really. Do you have any idea of the power savings your settings deliver?

And thanks again! That sounds fancy and modern compared to my 368.22, lol. Do the XG drivers have any particular advantage over installing with NVCleanstall?

That's the same place I got it from; it says beta.

Thanks.

 

For me the GPU clock is greyed out, but moving the shader slider also moves the GPU clock.

Capturddddde.PNG

  • Thumb Up 2



Guys, I don't use it, I just test it.

I am afraid I'll shorten the life of my Quadro 2000M. I have never overclocked any device before.

Does it really shorten the life of the device? Is it safe as long as it doesn't overheat? Or what should be avoided so as not to shorten the life of the device?

My Quadro 2000M runs at 63°C.

I've tested +50 MHz and temps were the same; the most I used was +100 MHz and temps are still the same.



Should be fine, especially if temps are below the mid-80s. Without a custom vBIOS it will not let you overclock that far. I ran a Quadro K5000M in an M6700 overclocked for years with no issues.

  • Thumb Up 2


18 hours ago, raptorddd said:

That's the same place I got it from; it says beta.

Thanks.

 

For me the GPU clock is greyed out, but moving the shader slider also moves the GPU clock.

Capturddddde.PNG

That's good, actually! Quadros are normally completely locked in terms of overclocking or underclocking...

 

18 hours ago, raptorddd said:

Guys, I don't use it, I just test it.

I am afraid I'll shorten the life of my Quadro 2000M. I have never overclocked any device before.

Does it really shorten the life of the device? Is it safe as long as it doesn't overheat? Or what should be avoided so as not to shorten the life of the device?

My Quadro 2000M runs at 63°C.

I've tested +50 MHz and temps were the same; the most I used was +100 MHz and temps are still the same.

No worries on that front. Only once you up the voltage will you significantly increase the heat. If you only raise the clocks, there should be no (or only a very minor) heat increase.

As for temps, anything below 90°C is fine, so in the 60s you're faaaaar away from any risk of degrading anything in your GPU 🙂

  • Thumb Up 1



1 hour ago, jaybee83 said:

That's good, actually! Quadros are normally completely locked in terms of overclocking or underclocking...

 

No worries on that front. Only once you up the voltage will you significantly increase the heat. If you only raise the clocks, there should be no (or only a very minor) heat increase.

As for temps, anything below 90°C is fine, so in the 60s you're faaaaar away from any risk of degrading anything in your GPU 🙂

Actually, on this Quadro 2000M the voltage parameter doesn't work. I mean, you can change it, but it reverts back; it always stays on 9v.

What about the memory? Should that also be overclocked?

 



3 hours ago, raptorddd said:

Actually, on this Quadro 2000M the voltage parameter doesn't work. I mean, you can change it, but it reverts back; it always stays on 9v.

What about the memory? Should that also be overclocked?

 

So it seems you don't have access to voltage adjustments; that's alright.

 

Sure, it's best to OC both the GPU core and VRAM to get the most out of it 🙂

 

If you go too high on the core, you usually end up with black screens, system freezes, or BSODs. As for VRAM, if you go too high you'll start seeing glitches/artifacts (i.e., colorful lines, weird shapes, etc.). But no worries; in both cases a hard reboot resets the clocks to stock again.

  • Thumb Up 1



6 hours ago, jaybee83 said:

So it seems you don't have access to voltage adjustments; that's alright.

 

Sure, it's best to OC both the GPU core and VRAM to get the most out of it 🙂

 

If you go too high on the core, you usually end up with black screens, system freezes, or BSODs. As for VRAM, if you go too high you'll start seeing glitches/artifacts (i.e., colorful lines, weird shapes, etc.). But no worries; in both cases a hard reboot resets the clocks to stock again.

Cool, I learned something new: that each acts differently when set too high.

  • Thumb Up 1



It turns out mucking around with Nvidia Inspector in a casual manner while writing a reply to a forum post is not a good idea(!)... so I now have to write this all out a second time... XD

 

On 8/6/2022 at 10:11 PM, jaybee83 said:

I think you have to change your "high DPI settings" under file properties → compatibility to correct that flawed display. Some apps need to be adjusted like that; otherwise you'll see the weird behaviour you're describing.

 

I tried disabling display scaling and it made no difference at all. But no big deal; it's only a very minor visual glitch in the sensor read-outs.

 

On 8/6/2022 at 10:11 PM, jaybee83 said:

Yep, the power and temp targets are greyed out in my case as well, so that's normal. What do you mean by power monitoring options? Are you referring to the various performance levels shown at the top of the advanced overclocking menu?

 

I meant in the sensor monitoring display panel. If you right-click on the graphs and select "Monitors >", you can choose which data to display. None of the power-related options display anything at all for me, so I guess those read-outs are not available for this system, as with the temperature and fan control options.

 

On 8/6/2022 at 10:11 PM, jaybee83 said:

As for setting clocks: you can set those individually for the four (in my case, with the 980M) listed performance profiles, P8, P5, P1, and P0. Basic rule of thumb: the higher the number, the lower the power state of the GPU. So P0 is 3D active with full clocks and voltages, P1 is 3D idle (i.e., when you're in game menus), P5 is 2D active, and P8 is 2D idle with the lowest clocks and voltages. Don't quote me on these descriptions; this is just how I remember it 🙂

For P8, P5, and P1, you can set core and VRAM clocks as absolute values, with no option to change the core voltage.

For P0, the core and VRAM clocks are set as offsets, and you also have the option to change the core voltage as an offset in steps of 12.5 mV.

 

Don't worry about the temperature slider; that's only useful for overclocking, to prioritize temperature and give the GPU a bit more thermal headroom.

 

For my particular case, I've set the following power-saving clocks: P8, P5, and P1 are all set to 135 MHz core and 180 MHz VRAM. For P0, I set a core offset of -810 MHz and a VRAM offset of -905 MHz. I left the core voltage at 0 mV, but I'm also sporting a Prema Mod vBIOS that lowers the 3D active core voltage to 1.000 V (a bit lower than stock; I don't remember the exact stock value, though).

Once you set everything and apply, you can click "Create Clocks Shortcut" in the lower left corner of the advanced OC menu. That will create a shortcut on your desktop that applies these settings via NVIDIA Inspector.

For a custom batch file, open those shortcuts with Notepad, copy the respective clock and voltage commands into a batch file, and set it to be executed at Windows startup. Voilà, there you go 🙂

 

This is great info, thank you! Is there any way to see or set/limit which power state you are in?

 

In P0, for the base clock offset I can only go down to a minimum of -135 MHz... so nowhere close to -810 MHz! Maybe the Prema Mod gives you more leeway with that? And yeah, it looks like the stock voltage is 1.2 V.

 

Anyway, and most importantly, do you need to test the stability of these settings for each of the power states? And if so, how do you go about forcing a particular power state to be active to test it?

 

On 8/6/2022 at 10:11 PM, jaybee83 said:

To return to stock, you can just click "Apply Defaults" and you're back where you started.

 

OK, so this is where it went wrong for me. From previous playing around, I was assuming that "Apply Defaults" sets everything back to default... I was playing around with the different clock sliders to double-check ranges and increments, and then went to click "Apply Defaults" as I had done before with no issue, but this time, instant reboot! So I'm wondering now whether having a different power state selected can change anything here, like it tries to apply the selected power state? If not, then I guess I must have just hit the "Apply Clocks and Voltage" button beside it by mistake. I'm getting old!! 😕

 

On 8/6/2022 at 10:11 PM, jaybee83 said:

As for power savings: stock board power is around 100-125 W; with the above power-saver settings I'm maxing out at 25-26 W, so we're talking about a 75-80% reduction in maximum power draw for the GPU.

 

As for the XG drivers, I don't remember all the special-sauce tweaks they implement, but you can read up on them at the link I provided before. I just remember they do some performance tweaks in addition to stripping out the telemetry that we all love to hate haha 😛

 

Wow, that is a big saving. How are you measuring/monitoring that?

 


1 hour ago, Ishatix said:

It turns out mucking around with Nvidia Inspector in a casual manner while writing a reply to a forum post is not a good idea(!)... so I now have to write this all out a second time... XD

Lulz, better focus on what you're doing 😛 😄

 

I tried disabling display scaling and it made no difference at all. But no big deal; it's only a very minor visual glitch in the sensor read-outs.

OK, no worries then.

 

I meant in the sensor monitoring display panel. If you right-click on the graphs and select "Monitors >", you can choose which data to display. None of the power-related options display anything at all for me, so I guess those read-outs are not available for this system, as with the temperature and fan control options.

Ooh, OK, gotcha. I'm not using Inspector for monitoring; I'd rather go with HWiNFO64 for that purpose.

 

This is great info, thank you! Is there any way to see or set/limit which power state you are in?

Unfortunately not; the GPU sets that by itself depending on the type of load it experiences. The only power states you can somewhat control are the lowest one at 2D idle and the highest one at 3D active. The "in-betweeners" are difficult to coax out on command.

 

In P0, for the base clock offset I can only go down to a minimum of -135 MHz... so nowhere close to -810 MHz! Maybe the Prema Mod gives you more leeway with that? And yeah, it looks like the stock voltage is 1.2 V.

Yes, that might be the case. IIRC, the standard clock adjustment range for Maxwell was ±135 MHz, so that's in line with it. Do you have more control over the VRAM?

 

Anyway, and most importantly, do you need to test the stability of these settings for each of the power states? And if so, how do you go about forcing a particular power state to be active to test it?

For overclocking, you only need to focus on P0, since that's always active when you're in a 3D application like benchmarks or games. For underclocking, I would test with the 2D idle profile on the desktop to see if everything is OK; then you can apply it to the other profiles. Stability testing for OC is whichever benchmarks and games you use most often. For UC/UV, it's basically just regular 2D desktop usage and maybe an overnight idle test (optional, though).

 

OK, so this is where it went wrong for me. From previous playing around, I was assuming that "Apply Defaults" sets everything back to default... I was playing around with the different clock sliders to double-check ranges and increments, and then went to click "Apply Defaults" as I had done before with no issue, but this time, instant reboot! So I'm wondering now whether having a different power state selected can change anything here, like it tries to apply the selected power state? If not, then I guess I must have just hit the "Apply Clocks and Voltage" button beside it by mistake. I'm getting old!! 😕

Haha, no worries. Inspector actually sometimes needs several clicks to properly apply default clocks; it sometimes takes me 2-3 clicks on the default button for all profiles to be properly reset. That's also the reason my batch file contains 3 repetitions of every clock and voltage adjustment, to be 100% sure the settings are really applied properly.

 

Wow, that is a big saving. How are you measuring/monitoring that?

That's based on benchmarks and gameplay. I monitor using HWiNFO64, and the wattages are based on total board power consumption. Naturally, those values are not exact since we're measuring in software, but the rough range should be good enough for reference purposes 🙂

 

 

  • Thumb Up 1
  • Thanks 1

Mine: Hyperion "Titan God of Heat, Heavenly Light, Power" (2022)
AMD Ryzen 9 7950X / Asus ROG Crosshair X670E Extreme / MSI Geforce RTX 4090 Suprim X / G.Skill Trident Z5 RGB DDR5-6600 2x16 GB / Seagate Firecuda 530 4 TB / 2x Samsung 860 Evo 4 TB / Arctic Liquid Freezer II 420 / Seasonic TX-1600 W Titanium / Phanteks Enthoo Pro 2 TG / Samsung Odyssey Neo G8 32" UHD 240 Hz / Ducky One 3 Daybreak Fullsize Cherry MX Brown / Corsair M65 Ultra RGB

 

My Lady's: Clevo NH55JNNQ "Alfred" (2022)
Sharp LQ156M1JW03 FHD matte 15.6" IGZO 8 bit @248 Hz / Intel 12600 @ 4.4 - 4.8 GHz / Nvidia 3070 Ti 8 GB GDDR6 / G.Skill 16 GB DDR4-3800 / Samsung 970 Pro 1 TB / Intel AX201 ax+BT / Win 11 Pro Phoenix Lite OS / 230 W PSU powered by Prema Mod!


  • 4 weeks later...
On 8/9/2022 at 9:32 PM, jaybee83 said:

 

 

Thanks for all the great feedback! I have HWiNFO set up for monitoring and logs, so I will look into that sometime for figuring out power draw. Might have to wait a couple of months now, as I will be busy with a trip visiting family etc. for the next while.

The only voltage adjustment I can see is an Over Voltage slider under P0, with the minimum and default at 0 mV, going up to +25 mV. So no undervolt potential, it seems.


  • 2 months later...

Navigate to C:\Users\(your username)\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Startup

Note that "AppData" is a hidden folder.

 

Shortcuts dropped in the "Startup" folder will fire every time you log in to Windows.
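As a minimal sketch of the path logic above: the Startup folder lives under `%APPDATA%`, so it can be resolved programmatically. This only builds the path (the username "alice" below is a placeholder); actually creating the `.lnk` shortcut would need something like the WScript.Shell COM object on Windows.

```python
# Minimal sketch: resolve the per-user Startup folder from %APPDATA%.
# Pure path logic; no filesystem access needed.
from pathlib import PureWindowsPath

def startup_dir(appdata: str) -> PureWindowsPath:
    """Per-user Startup folder, given the value of %APPDATA%."""
    return PureWindowsPath(appdata, "Microsoft", "Windows",
                           "Start Menu", "Programs", "Startup")

# 'alice' is a hypothetical username; on a real Windows box you would pass
# os.environ["APPDATA"] instead.
p = startup_dir(r"C:\Users\alice\AppData\Roaming")
```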

  • Thumb Up 1

Dell Precision 7770 (personal) • Dell Precision 7560 (work) • Full specs in spoiler block below
Info posts (Dell) — Dell Precision key postsDell driver RSS feeds • Dell Fan Management — override fan behavior
Info posts (Windows) — Turbo boost toggle • The problem with Windows 11 • About Windows 10 LTSC

Spoiler

Dell Precision 7770 (personal)

  • Intel Core i9-12950HX ("Alder Lake"), 8P+8E
    • 8× P cores ("Golden Cove"): 2.3 GHz base, 5.0 GHz turbo, hyperthreading
    • 8× E cores ("Gracemont"): 1.7 GHz base, 3.6 GHz turbo
  • 128GB DDR5-3600 (CAMM)
  • NVIDIA GeForce RTX 3080 Ti 16GB (DGFF)
  • Storage:
    • 2TB system drive: Samsung 980 Pro, PCIe4
    • 24TB additional storage: 3× Sabrent Rocket 4 Plus 8TB, PCIe4 (Storage Spaces)
  • Windows 10 (Enterprise LTSC 2021)
  • 17.3" 3940×2160 display
  • Intel Wi-Fi AX211 (Wi-Fi 6E + Bluetooth)
  • 93Wh battery
  • IR webcam
  • Fingerprint reader

 

Dell Precision 7560 (work)

  • Intel Xeon W-11955M ("Tiger Lake")
    • 8×2.6 GHz base, 5.0 GHz turbo, hyperthreading ("Willow Cove")
  • 64GB DDR4-3200 ECC
  • NVIDIA RTX A2000 4GB
  • Storage:
    • 512GB system drive (Micron 2300)
    • 4TB additional storage (Sabrent Rocket Q4)
  • Windows 10 (Enterprise LTSC 2021)
  • 15.6" 3940×2160 display
  • Intel Wi-Fi AX210 (Wi-Fi 6E + Bluetooth)
  • 95Wh battery
  • IR webcam
  • Fingerprint reader

 

Previous

  • Dell Precision 7530, 7510, M4800, M6700
  • Dell Latitude E6520
  • Dell Inspiron 1720, 5150
  • Dell Latitude CPi

On 11/3/2022 at 7:54 PM, Aaron44126 said:

Navigate to C:\Users\(your username)\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Startup

Note that "AppData" is a hidden folder.

 

Shortcuts dropped in the "Startup" folder will fire every time you log in to Windows.

Thank you, it was so easy.

dell precision m4600

i7 2760QM

8GB ram

MX500 crucial SSD 500GB.

win 10 21H2


I was a little worried about this. I think that shortcuts in the "Startup" folder will not fire if the program needs a UAC prompt.

 

You'll have to use Task Scheduler instead. Set up a job to run at login, and make sure to select "Run with highest privileges" — that will bypass the UAC prompt.
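The same task can also be created from the command line with `schtasks.exe` instead of the Task Scheduler GUI. A sketch, building the documented `/SC ONLOGON` + `/RL HIGHEST` invocation — the task name and program path below are placeholders:

```python
# Sketch: build a schtasks command that runs a program at logon with highest
# privileges (no UAC prompt). /Create, /SC ONLOGON, /RL HIGHEST, /F match
# the documented schtasks.exe syntax; name/path are placeholders.
import subprocess

def logon_task_cmd(task_name: str, program: str) -> list[str]:
    return [
        "schtasks", "/Create",
        "/TN", task_name,   # task name
        "/TR", program,     # program to run
        "/SC", "ONLOGON",   # trigger: at user logon
        "/RL", "HIGHEST",   # run elevated, bypassing the UAC prompt
        "/F",               # overwrite the task if it already exists
    ]

cmd = logon_task_cmd("NvidiaInspectorOC", r"C:\Tools\nvidiaInspector.exe")
# subprocess.run(cmd, check=True)  # must itself be run from an elevated prompt
```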

  • Thumb Up 1

