Everything posted by Aaron44126

  1. Pretty sure Pascal is the first generation where NVIDIA enforces digital signature verification on the vBIOS. So, you can't just dump the vBIOS, "adjust values", and flash it back on. (Well, you can, but the GPU will reject it and won't operate.) You can, however, cross-flash a different official vBIOS from another P4000 card if it has more desirable behavior than yours. Maybe someone has found a way around this by now, but that was the situation when I was messing with a P5000 back around 2019.
  2. Yeah, I saw this exact page, and it's "helpful", but I'm not sure if it might cause other issues. For instance, the TLP configuration seems to turn the Wi-Fi on or off when the Ethernet cable is connected or disconnected. Great. What happens if I have Ethernet connected, power the system off, unplug the cable, and then power it back on? Or vice versa? Will I end up with both connected, or neither connected? I'll have to do some testing to make sure the funny state cases are covered with appropriate behavior.
  3. I just noticed that Ethernet priority is -100 and Wi-Fi priority is 0. Maybe flipping those around would fix it? I thought maybe it was a Linux "lower priority is better" thing, but it looks like that is not the case. (That's just the autoconnect priority, so it might not even matter for traffic. See the sketch below for how to adjust both.)
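     In case it helps anyone, the priorities can be adjusted with nmcli. A minimal sketch; "Wired connection 1" and "MyWifi" are placeholder names, so check `nmcli connection show` for the real ones on your system:

     ```bash
     # Show current autoconnect priorities
     nmcli -f NAME,TYPE,AUTOCONNECT-PRIORITY connection show

     # Higher number wins for autoconnect; put Ethernet above Wi-Fi
     nmcli connection modify "Wired connection 1" connection.autoconnect-priority 100
     nmcli connection modify "MyWifi" connection.autoconnect-priority 0

     # For actual traffic, the route metric is what matters (lower wins)
     nmcli connection modify "Wired connection 1" ipv4.route-metric 50
     ```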
  4. It's become increasingly clear that going back to Windows would just be an exercise in never-ending frustration, so... Linux may have its own frustrations, but I'm working through things bit by bit, and the only way to go is forward. I'll ask this; maybe it's something you guys have experience with. Is there a good way to get it to disconnect from Wi-Fi if there is an Ethernet connection? (Windows does this automatically.) Network performance between my Windows VM and my network scanner (talking about a document/photo scanner) is poop if Wi-Fi is connected. The VM only sees one network adapter, and traffic doesn't seem to always prioritize the Ethernet connection. I can script a solution to this as well (see the sketch below); just wondering if there is an easier way to do it.
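     One way to script it would be a NetworkManager dispatcher script that toggles the Wi-Fi radio when the wired link goes up or down. A minimal sketch, not battle-tested; "eth0" is a placeholder, so check `ip link` for your actual interface name. Save it as /etc/NetworkManager/dispatcher.d/70-wifi-wired-exclusive.sh and mark it executable:

     ```bash
     #!/bin/bash
     # NetworkManager passes the interface as $1 and the event as $2.
     IFACE="$1"
     ACTION="$2"

     if [ "$IFACE" = "eth0" ]; then
         case "$ACTION" in
             up)   nmcli radio wifi off ;;  # cable plugged in: drop Wi-Fi
             down) nmcli radio wifi on  ;;  # cable pulled: bring Wi-Fi back
         esac
     fi
     ```

     Note that this doesn't fully cover the "unplug while powered off" case: if the cable is absent at boot, no "up" event fires, so Wi-Fi just stays in whatever state it was left in.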
  5. I'm not sure if that will really help you. The system might not draw more than 180W of power even if you hook up a 240W adapter. I just thought that it might be something to try. Right now I'm more interested in what happens if you try to disable Dynamic Boost... It might stop the system from trying to steal power from the GPU when there is CPU load pressure.
  6. You can disable Dynamic Boost in the NVIDIA control panel... but not for Ampere and later GPUs? Though, maybe a procedure like this would be helpful. [Edit] Now I am also very interested in disabling or bypassing Dynamic Boost on Linux, because I wonder if it might explain a few odd things I have been seeing with more intense GPU loads. There is documentation on it. nvidia-settings -q DynamicBoostSupport indicates that Dynamic Boost is supported, but I do not see an nvidia-powerd service/process running. (Using NVIDIA 525 proprietary drivers from the Ubuntu repos; commands below for anyone who wants to check their own system.)
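     A sketch of the checks, assuming the driver package actually ships the nvidia-powerd unit (the Ubuntu repo build may not):

     ```bash
     # Ask the driver whether Dynamic Boost is supported
     nvidia-settings -q DynamicBoostSupport

     # See whether the Dynamic Boost daemon exists and is running
     systemctl status nvidia-powerd.service

     # If the unit is present but disabled, this would start it
     sudo systemctl enable --now nvidia-powerd.service
     ```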
  7. You're on a desktop, right? I think that the power management behavior is going to be different. I always see the GPU spending most of its time in P0. But the laptop GPUs have a way lower max power level. I had another thought. If we're on the right track, this seems to be a case of a "not ideal" (putting it nicely) implementation of power sharing between the CPU and GPU. Dell's older systems would just power-starve the CPU if the GPU was under load, rather than the other way around, and I think that would be the preferred behavior for gaming (and most other GPU-heavy loads). Anyway. Do you see the "NVIDIA Platform Controllers and Frameworks" device in Device Manager? I think it is under "Software devices". That driver is at least partly responsible for handling NVIDIA Dynamic Boost (shifting power allocation between the CPU and GPU). I wonder if the system behaves any differently if you just disable it.
  8. Yeah, on desktops or pro GPUs the scale might be wider than the ones I enumerated above. Those are what I have observed on laptops, and there is a big drop-off between P2 and P3 (at least on Ampere GPUs in the Precision). You can force a certain power state with NVIDIA Inspector, but I don't think that will help here. I tried messing with it before; if Dell wants you in a low power state, it's going to put you in a low power state regardless of what you try to force. And now that I think about it, I have experienced "oscillations" before, where a combined CPU/GPU load would force the GPU to throttle on and off periodically. I think the best solution is along the lines of what @Etern4l suggested up above: do what you need to do to lower the CPU power draw to the point where the system stops throttling the dGPU. You might have to be more aggressive with it. As another test, you could run a high-GPU, low-CPU load for a while and confirm that this issue doesn't occur. I'd suggest 3DMark Fire Strike; just run a single GPU test in a loop, windowed. It should put the GPU at 100% utilization and not even stop for loading between loops. (Then maybe add a separate CPU load and see if the GPU starts power throttling; a logging one-liner is below.) [Edit] One more thing. Do you have a 180W or a 240W adapter attached to your system? If you are using 180W, do you have access to a 240W to test with? I don't know if it will matter, but at least on the Precision 7670, Dell is shipping 240W adapters with systems that have a high-power dGPU.
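     While that test runs, something like this (standard nvidia-smi query flags) will log the power state next to utilization and power draw, so the throttle events are easy to spot:

     ```bash
     # Sample P-state, utilization, power draw, and SM clock once per second
     nvidia-smi --query-gpu=timestamp,pstate,utilization.gpu,power.draw,clocks.sm \
                --format=csv -l 1 | tee gpu-throttle.log
     ```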
  9. P3 is not a good state to be in when gaming; that's what Dell limits you to if you are on the "cool" or "quiet" profiles, and it has a huge effect on gaming performance. I bet if you select one of those profiles, you will see the same "slow performance" that you are seeing now, but, like, all the time. You want to be in P2 or better. I think the Dell BIOS/EC is the culprit here, but I'm not sure what else to suggest. Actually, P0 is the best performance state and P8 is the worst, as reported here. Not all numbers in the scale are used; it goes P0, (P1?), P2, P3, P5, P8. P8 is the low-power "idle" state. I have seen the 0-4 scale as well, where 4 is the highest, but this is not what nvidia-smi reports. My system spends most of its time in 4 with occasional drops to 3 when the NVIDIA GPU is busy. (I've been doing a lot of measuring on Linux.)
  10. No, Precision 7550/7560 cards will not fit in the Precision 7530/7540. In addition, RTX 4000 and RTX 5000 cards require a heatsink replacement in order to work in the Precision 7530/7540, because they have a different component layout and screw positions compared to the other GPU cards. RTX 3000 should be a drop-in replacement if you already have an NVIDIA card in there. (You will have to do an INF mod to get the NVIDIA driver to load, though.)
  11. Here are some links for old Windows ISOs. https://isofiles.bd581e55.workers.dev/ https://tb.rg-adguard.net/public.php Windows 11 21H2 (build 22000) is only supported until October 2023, unless you have the Enterprise/Education edition, in which case you get an extra year.
  12. nvidia-smi will show you the current power state, and I know you can also see it on screen in NVIDIA Inspector. Yes. You can also set it in the BIOS setup, and I wrote a command-line tool that can switch it quickly without having to wait for Dell Power Manager to fire up; I can dig it up if it turns out that it would be handy. I think @MyPC8MyBrain figured out how to change it from PowerShell as well.
  13. Now I'm sort of curious whether, if you did your own clean Windows 11 install, it would behave more like Dell's Windows 11 install or like your clean Windows 10 install. I agree, the performance difference is crazy.
  14. I also updated my post above with an idea. But if you want to downclock the CPU as @Etern4l suggests: if your game is more GPU-heavy than CPU-heavy, you can just toggle Turbo Boost off. I have a post with a number of "easy" ways to achieve that linked in my sig (and a Linux one-liner is below).
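      On Linux, the quickest toggle I know of is the intel_pstate sysfs knob. A sketch; it assumes the intel_pstate driver is active (if the path doesn't exist, your system is using a different cpufreq driver):

      ```bash
      # Turn Turbo Boost off (intel_pstate driver)
      echo 1 | sudo tee /sys/devices/system/cpu/intel_pstate/no_turbo

      # Turn it back on
      echo 0 | sudo tee /sys/devices/system/cpu/intel_pstate/no_turbo
      ```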
  15. GPU-Z is reporting "power" as the performance cap reason; it would seem that the BIOS or something else in the system is not allowing the GPU to draw more power. Has it been doing this "forever", or is it a new issue...? [Edit] What do you have the "thermal mode" set to? I have been noticing on my 7770 that the GPU sometimes gets temporarily throttled if there is a heavy or even moderate CPU load going on and the system is set to the "high performance" mode. (This issue seems to be more pronounced under Linux for some reason, but looking back, I think it must have been happening sometimes on Windows too.) The default "balanced"/"optimized" mode seems better about this. Basically, I guess I would suggest: if you are using "high performance" mode, then try the "balanced" mode, or vice versa, and see if the behavior is better. If you are using any other mode, then I would expect GPU throttling for sure. [Edit 2] Oh, if you are experiencing this issue, maybe just switching thermal modes would be enough to kick it back to normal behavior, so maybe try that too. If that does help, it would probably be possible to script a fix (sketched below) to just watch the dGPU power state and give it a kick if there is high GPU utilization but a low power state. I'm assuming it is dropping down to P5 or maybe even P8 when this happens...?
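      A rough sketch of what that watchdog could look like on Linux. It assumes the libsmbios tools are installed (for smbios-thermal-ctl); the utilization threshold, sleep intervals, and mode names are placeholders to tune:

      ```bash
      #!/bin/bash
      # Watch for "busy GPU stuck in a low power state" and kick the thermal mode.
      while true; do
          # Query output looks like "P8, 97" -> strip the comma and split
          read -r PSTATE UTIL < <(nvidia-smi --query-gpu=pstate,utilization.gpu \
                                             --format=csv,noheader,nounits | tr -d ',')
          if [ "$UTIL" -ge 80 ] && { [ "$PSTATE" = "P5" ] || [ "$PSTATE" = "P8" ]; }; then
              # Bounce the Dell thermal mode to snap the GPU out of the low state
              sudo smbios-thermal-ctl --set-thermal-mode=quiet
              sleep 2
              sudo smbios-thermal-ctl --set-thermal-mode=balanced
          fi
          sleep 5
      done
      ```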
  16. VBS is an interesting one to look at; they are probably turning it on by default on new Windows 11 systems coming out of Dell, but it would not be turned on by default on a new Windows 10 install. It's in Windows Security: "Device security" (on the left), then "Core isolation details". Turn off "Memory integrity" and then reboot. Microsoft even recommends disabling it for top performance.
  17. For those into gaming, here's a cool thing that I just found out about and plan to mess with soon... Gamescope. I always wanted something like this for Windows. It basically creates a whole separate Wayland session to run a game in and then routes it to a window on your desktop (which can be running either X or Wayland). Looking at the GitHub readme, it appears that they've done a lot of work and thinking on making the compositing process as smooth and efficient as possible. (I've seen a few benchmarks where running a game in Gamescope even makes it slightly faster than running it natively.) Aside from making it "impossible" for apps running in your main desktop to mess with your game and vice versa, it allows you to do things like blow up games running at a lower resolution to full screen without actually changing your desktop resolution, run games that really want 16:9 on a non-16:9 screen without changing your desktop resolution, force a framerate limit without changing your monitor refresh rate, and force full-screen games to run in a window if you so choose. There are even options for how scaling should work (i.e., you can use AMD FSR as the upscaler for old games). There are also keyboard shortcuts to change between windowed/full-screen and turn integer scaling on and off on the fly. Does it actually work well on a laptop with Optimus in play? It does say it supports Intel GPUs, which should be all that matters, I would think. A quick test looks promising (example invocation below); I didn't have to do anything specific to get a game running inside of Gamescope rendering on the NVIDIA GPU.
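      For reference, I used an invocation along these lines. A sketch based on the README; flags have shifted between Gamescope versions, so check `gamescope --help`, and the game binary here is a placeholder:

      ```bash
      # Render the game at 1080p, scale up to a 2560x1440 window, cap at 60 fps
      gamescope -w 1920 -h 1080 -W 2560 -H 1440 -r 60 -- ./mygame
      ```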
  18. Indeed, third-party KMS servers are not a legit way to obtain a license. You do not need to worry about it if you have the Windows 10 Home, Pro, Enterprise, or Education edition. There is no practical difference in licensing from Windows 10 to Windows 11, and the same product key or OEM license will work for both. (I believe this is true in all territories.) If you upgrade in place, you will not be asked for a product key and your system will remain activated. This may not be true for less common Windows editions.
  19. Oh yes, I tested that one as well on the 7770, and it was also measurably bad for dGPU performance. (It "sounds" like a good idea; I don't know if they didn't implement it properly or what.)
  20. Check your BIOS settings and see if you have Intel "Turbo Boost 3.0" turned on. They added it in a BIOS update at some point, and some people here are saying that it defaulted to on, which I would argue is inappropriate. That "feature" wreaks havoc with max power on the dGPU if I turn it on in my Precision 7770. (I haven't done testing with it on the 7560, but I don't have a beefy dGPU in that system.)
  21. Figured out the whole priority/pin system in the apt package manager, so I can add the Ubuntu 23.04 repos but only have apt pull down select package updates from them (Linux kernel and Mesa packages, because I need newer versions of those than are currently offered on 22.04). ...This opens up some interesting mix-and-match possibilities for pulling different package sets from different distros/distro versions without having to worry about keeping track of updates. I don't have any other specific use cases right now, but if I wanted a certain application to stay more up-to-date than it does in the regular Ubuntu repos, I could set it up to be updated through the Debian testing repo or something. (A sketch of the pin setup is at the end of this post.)

      I know they have Snap and Flatpak to help with keeping applications up-to-date, but my trials of those have ended in frustration with how sandboxed everything is. I do understand the purpose of the sandboxing, and that's great, but here's an example of the frustration. Yesterday, I tried to install Pinta through Flatpak and ended up with an "ow, my eyes" experience when it opened up a big bright screen while everything else on my system is in a "dark mode" theme. Apparently, at least when running KDE, Flatpak apps can't "see" the system theme, and you have to jump through hoops to make that sort-of work. I downloaded it from GitHub and compiled it myself, and that build just used the system-wide dark GTK theme that I have set and looks great. The whole Snap/Flatpak thing is all new to me because it wasn't a thing yet the last time I tried Linux as a desktop OS, and it isn't something I had any need for during my normal use of Linux in a VM, mostly from the terminal. And I am looking at Pinta because it is obviously inspired by paint.net, which is my preferred image editor on Windows. It uses a lot of the same GUI ideas and keyboard shortcuts, so less re-learning may be needed on my part.

      For system backup, I ran across Timeshift, which seems to do what I want and was very easy to set up. (I did have to use the terminal to get it to set the snapshot location to my RAID/LUKS drive, which didn't pop up as an option in the GUI.) It backs up all of the system files to another drive, and you can just browse the snapshots in any file manager. Multiple "snapshots" will use hard links for unchanged files to save space. It excludes user files (everything in the home directory) so that you can roll back to a snapshot without having to worry about those being modified. (I already have a separate backup solution in place for that.) It can do fancier stuff if you use btrfs, but my volumes are all ext4 right now.

      Today, I finally set up Thunderbird for email. It's working fine for mail, and I've used it before, so it's familiar. I still haven't tried to figure out calendar & tasks, but it looks like there are options to make that work.

      Sabrent says they have inspected my broken drive and are sending me a new one. No tracking details yet, though...
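      Here is roughly what the pin setup looks like. A sketch, assuming the 23.04 ("lunar") sources have already been added; the package globs are just examples, and you would want `apt-cache policy` afterwards to confirm the priorities took effect:

      ```bash
      # Hold everything from 23.04 back by default (priority < 500),
      # but let kernel and Mesa packages upgrade from it.
      sudo tee /etc/apt/preferences.d/lunar-pin <<'EOF'
      Package: *
      Pin: release n=lunar
      Pin-Priority: 100

      Package: linux-image-* linux-headers-* mesa-*
      Pin: release n=lunar
      Pin-Priority: 500
      EOF
      ```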
  22. OK, I didn't realize that the default is 1440p instead of 1080p, but I believe what @win32asmguy did was a default run (so, 1440p). When you ran this test, were you running power through the dock? The WD19TB3 does have a 240W brick, but it can't deliver a full 240W to the system. You might have slightly better results with a 240W brick connected to the system directly. If you did have a 240W brick attached directly, then it is interesting that your 1440p result is slightly lower than @win32asmguy's. You might also get a slightly faster result if you switch the system to "Ultra performance" mode (the thermal mode set in the Dell Power Manager or Dell Optimizer app).
  23. Twitter policy states that logging in once every 30 days should be good enough to keep your account (presumably, regardless of whether or not you post). ...I've had my account for 12+ years and it has 0 tweets. So. "Reverse special treatment"?
  24. Today, there are some new details on the AMD version of the Framework Laptop 13, featuring RDNA 3 integrated graphics. https://frame.work/blog/announcing-the-framework-laptop-13-powered-by-amd-ryzen
  25. Can you do a default settings/1080p run just so we can have a sort of apples-to-apples comparison against @win32asmguy's RTX 4000 Ada GPU result? [Edit] You mentioned an external monitor run above. Were you using a Dell "dual USB-C" dock for power when you ran this? I know there can be a performance difference because of the difference in power (210W vs 240W, docked vs undocked).