NotebookTalk

Everything posted by SuperMG

  1. Hello. Did you test your theory for real? What are the bench results?
  2. Hello. The heatsink won't fit because the screw holes are offset on the new card; you'd have to heavily modify your heatsink. The temps are good with a 3080 if you use good thermal paste.
  3. Hello. Is it possible to flash with NVFlash (DOS or Windows) without a CH341A? I managed to get 127W with the generic VGA drivers on the 485M. You're telling me the same Nvidia drivers are causing the limit issue... But what if I have one AMD + one NVIDIA driver, will that fix it or not? I still don't understand one thing, though: how is your M2000M impacting your slave card so much when the 880M isn't? The M2000M is a low-consumption card compared to the 880M, right?
  4. Even with a GTX 680M I get the same average wattage, by the way. Isn't that the same situation as Intel iGPUs + Nvidia? Is it because of the VRAM speed? 96-116 GB/s (485M/680M) vs 230 GB/s for the P5200; the HD 7970M has 156 GB/s. I think I could put in an RTX 3000/4000, but it's not worth it. The thing is, if the master has the generic VGA drivers, the slave gets a lot more power. So we probably need to limit the first GPU, but how? The best master cards are the GTX 680M, HD 7970M and K5000M, all 100W. I'll also test with an HD 7970M in the master slot soon.
  5. Hello. Indeed, my GPU does go to P0, with or without the regedits. When the GTX 485M has a driver, the P5200 averages at most 70W; without a 485M driver, the P5200 averages 95W... I don't understand. The 485M is also in P0 at the same time as the P5200, even though the 485M is doing nothing other than driving the display...
  6. Hello, thanks for reaching out. I'll try that later. I did something earlier and the GPU was cooking: 90W average, 112W max in Heaven Benchmark instead of 70W. GTX 485M: error code 31, drivers broken/missing. Quadro P5200: drivers installed; DirectX 9 doesn't work but the rest does. FireStrike only runs at 10 fps, which is weird, maybe because it has an influence on the first GPU? I didn't try TimeSpy. In Heaven Benchmark I did see 30-40 fps more than before, but the GPU hot spot cooked at 99C and I stopped the benchmark. It was indeed pulling 90W average, with 95W, 98W and 101W readings. So keeping the default Windows VGA driver (i.e. none) for the 485M and the Nvidia driver for the P5200 unlocked the wattage for me, but I can't even run DirectX 9 titles anymore... Do you have any idea how the power got unlocked? I'm waiting for the HD 7970M to be delivered.
  7. Hello. The Quadro K3000M arrived, but... it's not a K3000M. It's a K3100M instead, which isn't compatible since it's a newer Kepler card... The listing was for a 2GB K3000M. I'm so unlucky; that's the third GPU in a row that doesn't work!
  8. Bruh. So I bought dual PSUs and a 75W Quadro K3000M for nothing then... I haven't tested them yet.
  9. Hello, thanks for commenting. What's the max TDP of your RTX 5000? Did you try AMD in the first slot? Any PSU upgrades or vBIOS mods to the target TDP? I still haven't tried the AMD HD 6XXX and Quadro K3000M as master cards yet...
  10. @sliderfra @Clamibot Hello, sorry to disturb you. Do you know how I could get a 100W average in my case?
  11. Hello. I have a Clevo X7200 laptop with an X58 desktop chipset and a Xeon W3680. This laptop has two MXM slots; both can deliver 100W and have 100W heatsinks as well. Officially, the GTX 680M is the best MXM GPU this laptop supports; it works with Fermi GTX cards as stock, and with AMD 5000/6000 cards. Many people in the past tried a GTX 780M or 980M, but got nothing on display. None of them, however, tested the following combo: a Kepler-based GTX 670/680M/K3000M in the master slot and a GTX 780M/980M in the slave slot. I tried putting a Quadro P5200 in the first slot: it doesn't work. But with a GTX 680M for display + a Quadro P5200 for 3D rendering, the laptop boots and detects both GPUs as "Nvidia Graphics Card" in the BIOS. So what I did was put a GTX 680M 2GB Dell card in the first slot (master) and a Quadro P5200 in the second slot (slave). So yes, you can add a much more recent card to the Clevo X7200, but only if you have a compatible GPU for display in the master slot. I do have one issue: the GPU is performing worse than it should. It averages 70W in 3DMark and Heaven Benchmark; in FurMark I got an 84W spike at best. The GPU has a 110W vBIOS. My temps seem good (below 80C while 3D benchmarking); I added thermal paste and thermal pads on the VRAM and VRMs. @triturbo, who sold me the card, was able to achieve a 110W average in an HP 8740w with an Intel Core i7 740QM and PCIe 2.0 x16. My laptop runs both MXM slots at PCIe 2.0 x16 as well. I get 3400 points in 3DMark TimeSpy instead of 5700. There is no thermal throttling and the GPU is at 100% usage according to GPU-Z. My AC adapter is a 300W 20V unit and I'm on a 2024 Nvidia Quadro driver (modded .INF for both the GTX 680M and P5200). How can I achieve a 100W average while benchmarking/stress testing? Am I power limited? Does the vBIOS have a bad reading for the target power limit? (The temp sensors work; I hear the fans speeding up during some benchmarks.) See the power-readout sketch after this list.
  12. You know what, I made a new thread in the Graphics Card section. I don't know if that's the right place for it.
  13. The thread can wait, because I haven't tested the Quadro K3000M + dual PSUs yet.
  14. And I'll upgrade my CPU to a Xeon W3680 too. Imagine if the GPU gets dropped.
  15. From the outlet. The PSU is a 300W unit. I did see some people running GTX 680M SLI with only one 300W brick... Maybe SLI and non-SLI behave differently? (I don't have enough resources to check; maybe they used two PSUs?) The 680M isn't drawing its full power, it's idle.
  16. During 3D benchmarking? 180-190W. I edited the existing vBIOS file dumped from the card itself, so no worries, right?
  17. Same for 3DMark. TimeSpy: 3400 points, 77W max. FireStrike: 10500 points, 84W max. Should I just edit my vBIOS and tinker with the TDP, say 95W or 125W instead of 110W? Should I get a Quadro K3000M (75W) for the master MXM slot instead of the GTX 680M?
  18. Temps are all good; otherwise I'd get a thermal-throttle warning in GPU-Z. Below 75C while benchmarking. The GPU is at 100% usage in 3DMark, Heaven Benchmark and PassMark, yes. The card is a 100W version with a 110W vBIOS. My max wattage was 84W in FireStrike, while for triturbo it was 127W! triturbo was using a much slower CPU than mine and got better results with the same card, so no, there's no CPU bottleneck.
  19. Also, the laptop always starts with the CPU and GPU fans at 100%. Even with a single GTX 485M it's normal for it to be loud at startup. How can I fix the low wattage? I want to get at least a 90W average in benchmarks; the slave MXM slot can do 100W max.
  20. The HD 5870 was returned because it wasn't for my laptop; it was for a WU860, not the X7200. The laptop probably didn't recognize the AMD card. My Dell GTX 680M variant works in the X7200 no problem. ThrottleStop is for the CPU, you mean GPU-Z? I get the same graph as triturbo, with "Pwr" limit flags, but my wattage is way lower than his. The GPU2 fan works great; it follows the P5200's reported temperature.
  21. @GuitarG @jaybee83 Hello. I bought the GPU for my Clevo X7200 (the X58-based predecessor of the Clevo P570WM). My card averages 70W in 3DMark and Heaven Benchmark, but why? I use dual GPUs: a GTX 680M as master and a P5200 as slave (set as the high-performance GPU in Windows). I use the 2024 Quadro driver with the two-hardware-ID .INF mod; a driver from 2020 or 2024 gives the same performance either way. My PSU is a 300W unit and my CPU is a Xeon E5620. The PCIe 2.0 link is x16 on both MXM slots. @triturbo also had the same bandwidth as me but was able to pull more watts. I also set the power-saving mode in Windows to Off. (See the power-logging sketch after this list.)
  22. I set it to high performance in the unlocked BIOS; how would that make it overheat instantly? I think it had no power and cut out. It never cuts out in benchmarks at 70W (it should be running at 100W instead). The guy who sold it to me was running it at 110W over PCIe 2.0 x16. Do I need dual PSUs? Because I'm going to upgrade to the W3680... I found a power-box converter for dual PSUs for 60 USD (Clevo AC 200).
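A quick way to check whether the P5200 is being capped by the driver rather than by its 110W vBIOS (posts 5, 11 and 18 above) is to read back each GPU's performance state, power draw and enforced power limit through NVML. The Python sketch below is only an illustration, not something from the thread: it assumes the nvidia-ml-py (pynvml) package is installed and that the cards expose power telemetry through NVML, which older Fermi/Kepler mobile parts may not.

```python
# Hedged sketch: print P-state, power draw and enforced power limit per GPU.
# Assumes nvidia-ml-py (pynvml) is installed; older mobile GPUs may not
# report power at all, in which case the query raises NVMLError.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older pynvml returns bytes
            name = name.decode()
        pstate = pynvml.nvmlDeviceGetPerformanceState(handle)  # 0 means P0
        try:
            draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0        # mW -> W
            limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0
            print(f"GPU {i} {name}: P{pstate}, drawing {draw_w:.1f} W, "
                  f"enforced limit {limit_w:.1f} W")
        except pynvml.NVMLError as err:
            print(f"GPU {i} {name}: P{pstate}, power query unsupported ({err})")
finally:
    pynvml.nvmlShutdown()
```

If the enforced limit prints well below 110W while a benchmark is running, the cap is coming from the driver or system side rather than from temperatures or the vBIOS power table.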
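For the 70W-average question in posts 17 and 21, the sketch below polls nvidia-smi once per second during a benchmark run and then prints the average and peak draw per GPU. Again, this is only a hedged example (it assumes nvidia-smi is on the PATH and that the cards report power.draw); it is not from the original posts.

```python
# Hedged sketch: log power draw via nvidia-smi during a benchmark and
# summarize average/peak per GPU when stopped with Ctrl+C.
# Assumes nvidia-smi is on PATH and the cards report power.draw.
import subprocess
import time
from collections import defaultdict

samples = defaultdict(list)  # GPU index -> list of wattage readings

try:
    while True:
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=index,power.draw",
             "--format=csv,noheader,nounits"],
            text=True)
        for line in out.strip().splitlines():
            idx, draw = [field.strip() for field in line.split(",")]
            try:
                samples[idx].append(float(draw))
            except ValueError:
                pass  # "[N/A]" on cards without power telemetry
        time.sleep(1)
except KeyboardInterrupt:
    for idx, watts in sorted(samples.items()):
        print(f"GPU {idx}: avg {sum(watts) / len(watts):.1f} W, "
              f"peak {max(watts):.1f} W over {len(watts)} samples")
```

Comparing the averages from a run like this against the 110W triturbo reported with the same card would make it clearer whether the shortfall is a steady power cap or just brief dips.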