NotebookTalk

SuperMG3

Member
  • Posts: 339
Everything posted by SuperMG3

  1. And I'll upgrade my CPU to a Xeon W3680 too. Imagine the GPU gets dropped
  2. From the outlet. The PSU is a 300 W unit. I did see some people running GTX 680M SLI with only one 300 W brick... Maybe SLI and non-SLI behave differently? (I don't have enough sources; maybe they used two PSUs?) The 680M isn't drawing its full power, it's at idle.
  3. During 3D benchmarking? 180-190 W. I edited the existing vBIOS file dumped from the card itself, so no worries, right?
  4. Same for 3DMark. Time Spy: 3400 (77 W max); Fire Strike: 10500 (84 W max). Should I just edit my vBIOS and tinker with the TDP, say 95 W or 125 W instead of 110 W? Or should I get a Quadro K3000M (75 W) for the master MXM slot instead of the GTX 680M?
  5. Temps are all good; otherwise I'd get a thermal-throttle warning in GPU-Z. Below 75 °C while benchmarking. The GPU is being used at 100% in 3DMark, Heaven Benchmark and PassMark, yes. The card is a 100 W version with a 110 W vBIOS. My max wattage was 84 W in Fire Strike, while for triturbo it was 127 W! triturbo was using a CPU way slower than mine and got better results with the same card, so no, there is no CPU bottleneck.
  6. Also, the laptop always starts with the CPU and GPU fans at 100%. Even with a single GTX 485M it's normal for it to be loud at startup. How can I fix the low wattage? (See the power-logging note after this post list.) I want at least 90 W average in benchmarks. The slave MXM slot can do 100 W max.
  7. The HD 5870 was returned because it wasn't for my laptop; it was for a WU860, not the X7200, which probably didn't recognize the AMD card. My Dell-variant GTX 680M works on the X7200 no problem. ThrottleStop is for the CPU, do you mean GPU-Z? I have the same graph as triturbo and I get Pwr limit flags, but my wattage is way lower than his. The GPU2 fan works great; it's on par with the Nvidia P5200's temperature.
  8. @GuitarG @jaybee83 Hello. I bought the GPU for my Clevo X7200 (the X58-based predecessor of the Clevo P570WM). My card averages around 70 W in 3DMark and Heaven Benchmark, but why? I use dual GPUs: one GTX 680M as master and one P5200 as slave (set as the high-performance GPU in Windows). I use the 2024 Quadro driver with the two-hardware-ID INF mod (a sketch of that mod follows after this post list). Whether the driver is from 2020 or 2024, the performance is the same. My PSU is a 300 W unit and my CPU is a Xeon E5620. The PCIe 2.0 link is x16 on both MXM slots. @triturbo also had the same bandwidth as me, but he was able to pull more watts. I also set the power-saving mode in Windows to Off.
  9. I set it to high performance in the unlocked BIOS; how would that overheat instantly? I think it ran out of power and cut off. It never cuts out in benchmarks at 70 W (it should be running at 100 W instead). The guy who sold it to me was running it at 110 W on PCIe 2.0 x16. Do I need dual PSUs? Because I'm going to upgrade to the W3680... I found a dual-PSU converter box for 60 USD (Clevo AC 200).
  10. Do I need more power from my power supply? I have a 300 W PSU, with a 100 W GTX 680M + a "100 W" P5200 (max measured at 79 W) and an 80-130 W Xeon E5620, which together already approach the 300 W rating. I then set the power-saving mode to high performance in the BIOS and the laptop SHUT DOWN INSTANTLY when booting Windows... The K3000M is a 75 W card and the HD 6990M is a 100 W card. So I need to INF-mod TWO hardware IDs to get them both (K3000M and P5200) working on the recent drivers? We can't add more than 2 hardware IDs in NVCleanstall...
  11. Because in 3DMark I'm getting 3400, while the other person was getting 5700. We both have the same PCIe speed (x16 2.0). He's only using one GPU because he has a single MXM slot, and he also uses a newer driver.
  12. Okay. I did INF-mod a GTX 680M for Quadro drivers, but meh. If I want to use recent Quadro drivers that won't work well, so that's where the K3000M comes in handy. Also, is it better to have a K3000M or an HD 6990M in the master slot, with the P5200 in the slave?
  13. Hello. I see that the 576-CUDA-core GK104 doesn't have a desktop part. But there are many GK104 parts with 900-1300 CUDA cores; would those work for the K3000M?
  14. Hi. I saw your video about installing unsupported notebook GPUs on desktop drivers; you had the GK104-based GTX 880M. Can I install the newest Quadro driver that supports the GK104 chip (K3000M, K4000M, K5000M, K4200)? Drivers for the K3000M are from 2019, but the desktop ones are from 2023.
  15. No registry mods? Can I do K3000M + P5200? 427.68?
  16. How do you make them work without losing any performance, and get DirectX to use the slave card? (There's an adapter-listing sketch after this post list.) I use a 485M or a 680M as master + the P5200 and I get crappy performance. I'm at full PCIe speed too.
  17. @ssj92 Hello. How did you manage to make the GTX 980M + RTX 5000 combo work together driver-wise? Can the RTX 5000 render in DirectX 11/12?
  18. @panda_zzz Hello. I got the P5200 working on the Clevo X7200 as the slave GPU. Temperature monitoring all works, but the performance isn't good, because the master GPU is a Fermi-based GTX 485M... and there is no driver shared between Fermi and Pascal. I just installed INF-modded drivers for the P5200, then installed the 485M one, because I need to install the master's drivers or nothing works. Now the Nvidia Control Panel says the version I'm running is 391, lol. I tried to run the HD 5870 as the master but it keeps dying within the first minute after boot (Windows and DOS)... and that HD 5870 is from the Clevo WU860. I tested the HD 5870 with and without the slave GPU, still the same shutdown issue.
  19. I got the card working. It does 49 FPS average in FurMark; normally it should do 79 FPS. I guess it's because of the GTX 485M drivers; Fermi and Pascal don't share a single driver. I need to get an AMD HD 6xxx graphics card so the Nvidia GPU can run the latest drivers without any conflicts or limits. I did buy an HD 5870M, but it was faulty: the laptop would shut down in Windows/BIOS/DOS about one minute after powering on.
  20. Yeah, I need an AMD card, because mixing an old Fermi GPU with a more recent one causes this, plus error 38/10 in Device Manager.
  21. Oh yeah, my bad. On my previous M17x R4 I was sometimes limited to 60 FPS in DX12 using an RTX 3000 MXM card.
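
On the wattage questions in posts 3-6: besides GPU-Z's sensor log, nvidia-smi (installed with the driver) can log board power from the command line, assuming the driver actually exposes power telemetry for these mobile parts, which is not guaranteed on older Fermi/Kepler cards. A sketch:

```
nvidia-smi --query-gpu=timestamp,name,power.draw,power.limit --format=csv -l 1 > power_log.csv
```

If power.draw shows [Not Supported], GPU-Z's sensor tab (Board Power Draw and PerfCap Reason) is the fallback for checking whether the card is actually hitting its power limit.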
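On the two-hardware-ID INF mod referenced in posts 8, 10 and 12: below is a minimal sketch of the kind of entries such a mod adds to the notebook driver's INF (often nvdmi.inf in the extracted driver package). Every section name, device ID and subsystem ID here is a placeholder, not the real value for these cards; the actual hardware IDs come from Device Manager or GPU-Z. Editing the INF also invalidates the driver's signature, so the modified package only installs with driver-signature enforcement disabled (or test signing enabled).

```
; Illustrative placeholders only -- replace XXXX/YYYY/ZZZZ (and AAAA/BBBB/CCCC)
; with the device ID and subsystem ID shown under each card's hardware IDs,
; and reuse install section names that already exist in the INF for the same GPU family.
[NVIDIA_Devices.NTamd64.10.0]
%NVIDIA_DEV.XXXX.YYYY.ZZZZ% = Section001, PCI\VEN_10DE&DEV_XXXX&SUBSYS_YYYYZZZZ   ; master card
%NVIDIA_DEV.AAAA.BBBB.CCCC% = Section002, PCI\VEN_10DE&DEV_AAAA&SUBSYS_BBBBCCCC   ; slave card (P5200)

[Strings]
NVIDIA_DEV.XXXX.YYYY.ZZZZ = "NVIDIA Quadro K3000M"
NVIDIA_DEV.AAAA.BBBB.CCCC = "NVIDIA Quadro P5200"
```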
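On getting DirectX to use the slave card (posts 16-18): a quick diagnostic is to list the adapters DXGI exposes, since a D3D11/D3D12 title normally renders on adapter 0 unless it is written, or forced, to pick another one. Below is a minimal sketch assuming a Windows build environment with the Windows SDK; it only lists adapters, it does not change which one a game uses.

```cpp
// list_adapters.cpp -- print the adapters DXGI exposes, in enumeration order.
// Build with MSVC:  cl /EHsc list_adapters.cpp dxgi.lib
#include <windows.h>
#include <dxgi.h>
#include <cstdio>

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory))) {
        std::puts("CreateDXGIFactory1 failed");
        return 1;
    }
    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) == S_OK; ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        // Adapter 0 is the default render adapter for most D3D titles.
        std::wprintf(L"Adapter %u: %s (dedicated VRAM: %llu MB)\n",
                     i, desc.Description,
                     (unsigned long long)(desc.DedicatedVideoMemory >> 20));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

On recent Windows 10/11 builds, the per-application GPU preference under Settings > System > Display > Graphics settings is the supported way to steer a specific title onto the high-performance adapter.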