
M18x - How to disable secondary GPU slot


GMP


Since I'm now running a GTX 1070 in my M18xR2, I had to decide what to do with the empty secondary GPU slot.

I decided not to use it for an MXM-to-NVMe adapter, since I think a normal SSD gives all the speed I need.

What I decided instead was to use the secondary GPU slot for extra CPU cooling.

I'm gonna use the secondary GPU heatsink and fan, and solder some heatpipes from the triple-pipe CPU heatsink to the secondary GPU heatsink. That will allow me to keep the 3940XM even cooler and push it even more.

I checked, and it can certainly be done; I can easily add three extra heatpipes. When it's done, I will post some pictures.

I ordered some low-temperature solder (melts at 138°C); that's low enough to keep the heatpipes from bursting while soldering. The second option is a thermally conductive epoxy, but I'm having trouble finding one that combines good thermal conductivity, room-temperature curing, and a high enough glass transition temperature (the temperature at which the epoxy starts to soften again).

That being said, that part is not my problem.

The thing is, I'd prefer not to make a custom plate that fits in the secondary GPU slot to mount the GPU heatsink on. I have an old AMD GPU that fits and is perfect for mounting the 100W AMD heatsink on.

My question is: will this interfere with the GTX 1070? Will I have boot problems with the 2 different GPUs? Or is there a way to completely disable the secondary GPU slot from the unlocked BIOS, so that the AMD card doesn't get detected and powered? That way it can act like a dummy plate on which I can mount the heatsink. I checked, but I can't really see a BIOS option for disabling it, unless I'm missing it. Or can I just boot into Windows and disable the AMD GPU from there in Device Manager? Does that mean the GPU won't be powered the next time Windows loads? Not really sure what the best option is here. Of course I don't want any heat originating from that dummy GPU. If someone has any experience with this...

Thanks!


I've thought about doing this a few times over the years but never did anything about it. 

 

I'd imagine that you'd definitely have to find a way of disabling the MXM slot, as putting another functional card in there could cause some issues.

 

If you don't care about the AMD card, I would cut the pin edge off so it doesn't sit in the slot. This should work perfectly. If you do care about the card, then find a GTX 260M and cut that up; it should be dirt cheap.

 


3 hours ago, Maxware79 said:

I've thought about doing this a few times over the years but never did anything about it. 

 

I'd imagine that you'd definitely have to find a way of disabling the MXM slot, as putting another functional card in there could cause some issues.

 

If you don't care about the AMD card, I would cut the pin edge off so it doesn't sit in the slot. This should work perfectly. If you do care about the card, then find a GTX 260M and cut that up; it should be dirt cheap.

 

That was actually the first thing I thought about, cutting the pin edge off. But the card sitting snug in the MXM slot gives it the best stability. I want as little movement on the card as possible, because of the heatpipes running from it to the CPU heatsink. It's still quite a distance, and the heatpipes moving could tilt the CPU heatsink and cause uneven temperatures from less-than-optimal contact. At least I consider that a possible risk I'd prefer not to have to deal with.


It should boot up fine. Do a test and see what happens. Disable it in Windows Device Manager and see if that works. Of course, it won't be the same as it being completely disabled at the PCIe port.
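
If you'd rather script it than click through Device Manager, something along these lines should do the same thing. This is only a minimal sketch: pnputil ships with Windows and its /enum-devices and /disable-device switches exist on recent Windows 10/11 builds, but the instance ID below is a made-up placeholder, so substitute the one actually reported for the AMD card.

```python
import subprocess

# List display adapters so you can find the AMD card's device instance ID.
subprocess.run(["pnputil", "/enum-devices", "/class", "Display"], check=True)

# Disable the card by its instance ID. The ID below is a hypothetical
# example -- use the one /enum-devices actually printed for the AMD GPU.
# This needs an elevated (administrator) prompt.
subprocess.run(
    ["pnputil", "/disable-device",
     r"PCI\VEN_1002&DEV_6720&SUBSYS_04AA1028&REV_00\4&12345678&0&0008"],
    check=True,
)
```

Like the Device Manager route, this only disables the device at the OS level; the slot still gets powered during POST.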


I've looked a bit into the MXM standard and the pins.

Always fun to learn new things :)

 

Here are the pins we all know, front and back:

[Image: MXM connector pinout, front and back]

 

Everything that is power-related seems to be on the "big" pins E1-E4 and the couple of pins right after them. But the main power to the MXM GPU seems to be PWR_SRC on E1-E4.

 

[Image: MXM pin table showing the PWR_SRC pins E1-E4]

 

There are 3 types of power going to the MXM card: the main power, 5V, and 3V, all coming through the MXM connector from the motherboard, I assume:

 

[Image: MXM power pin table (main power, 5V and 3V supplies)]

 

Pin 6 PWR_GOOD is an output signal on the card. When all 3 power sources (main, 5V and 3V) are detected, a signal is sent from this pin to the motherboard. The motherboard in turn sends a signal back as an input on pin 8 PWR_EN, which finally enables power on the MXM card.

 

[Image: pin descriptions for PWR_GOOD (pin 6) and PWR_EN (pin 8)]

 

The power sequencing diagram below also implies that the main power, 5V and 3V must first be detected on the pins.

When they are detected, the output signal PWR_GOOD on pin 6 is sent to the motherboard, which in turn sends an input signal back to the PWR_EN pin 8, after which the module's internal power is applied.

The "NOTE" under the graph also specifically states that no voltage is applied to any of the other 200-plus pins before power is enabled to the MXM card.

So if I'm able to mess with the PWR_GOOD signal, the card should be completely "dead" and powerless forever.

 

 

[Image: MXM power sequencing diagram]

 

So from that I derive a few options (sanity-checked in the little sketch below):

1) I cut the PWR_GOOD pin 6, so the signal is never given to the motherboard, which otherwise would send a signal back to actually apply power to the module.

2) I cut any of the power pins; the main power pins are the big ones, so those would be the easiest. That way not all 3 power sources (main, 5V and 3V) are present, and the MXM card never sends the PWR_GOOD signal from pin 6.

3) I cut the PWR_EN pin 8, so that even if the power is present and ready to be applied, the card can never get the input signal to actually apply it.
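
To sanity-check that logic, here is a toy model of the handshake as I read it from the tables above. This is just my own sketch, not anything taken from the MXM spec documents; it only encodes the dependency chain described in this post.

```python
# Toy model of the MXM power-up handshake described above.
# True = rail present / pin intact, False = rail missing / pin cut.
def module_powers_up(main_pwr, v5, v3, pin6_pwr_good, pin8_pwr_en):
    # The card asserts PWR_GOOD only when all three supplies are present,
    # and the signal only reaches the motherboard if pin 6 is intact.
    pwr_good_seen_by_mobo = main_pwr and v5 and v3 and pin6_pwr_good
    # The motherboard answers with PWR_EN; the card only sees that if
    # pin 8 is intact. Internal module power is applied only after that.
    return pwr_good_seen_by_mobo and pin8_pwr_en

assert module_powers_up(True, True, True, True, True)        # intact card powers up
assert not module_powers_up(True, True, True, False, True)   # option 1: pin 6 cut
assert not module_powers_up(False, True, True, True, True)   # option 2: E1-E4 cut
assert not module_powers_up(True, True, True, True, False)   # option 3: pin 8 cut
print("All three cuts keep the module unpowered.")
```

All three cuts break the chain before the module's internal power is ever applied.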

 

Bottom line, I think it's easiest to cut the main power pins E1-E4 with a Dremel or something, as the other pins are way too small to do any work on. That should do the trick and never allow the card to be powered on, and thus for sure never produce any heat:

 

[Image: the E1-E4 main power pins on the MXM edge connector]

Or I just cut off the whole section containing E1-E4 and pins 1 to 8 after it. That way I take away everything that is power-related.

The rest of the pins can stay, which will still let me plug the card into the MXM connector for a snug fit, so I can mount the heatsink on it and solder on the heatpipes coming from the CPU.

I am curious though if the BIOS will still detect the card.

I will test it out when I get the heatpipes and low-temperature solder, which are on the way.

 

If I'm saying things that don't make sense, please tell me lol :)

 

 


Looks like a good plan. If the card isn't getting power then I would assume that the BIOS won't detect the card. Will be interesting to see though. 


Before you cut any pins, I definitely recommend testing either the Nvidia or AMD GPU in slot 2 in SG (switchable graphics) mode and seeing how the laptop performs. With Windows 10 and newer, having multiple GPUs isn't as bad anymore, and it would be interesting to see if the cards at least work.


7 hours ago, ssj92 said:

Before you cut any pins, I definitely recommend testing either the Nvidia or AMD GPU in slot 2 in SG (switchable graphics) mode and seeing how the laptop performs. With Windows 10 and newer, having multiple GPUs isn't as bad anymore, and it would be interesting to see if the cards at least work.

Work how exactly, if I may ask?

I thought multiple GPUs didn't work with SG mode?

Or does that only apply to SLI and Crossfire?

You're thinking that somehow not only the GTX 1070 will be used, but also the AMD card, for extra performance?

Kinda lost on what you mean here or what I would get out of it...


What about the plastic plate that fits in place of the GPU card when running 1 card? Maybe it would be easier to drill it to fit the X bracket and mount the heatsink to it?

Johndill@ on this forum, thread "M18x R2 GPU2 custom blanking plate."


26 minutes ago, aldarxt said:

What about the plastic plate that fits in place of the GPU card when running 1 card? Maybe it would be easier to drill it to fit the X bracket and mount the heatsink to it?

Johndill@ on this forum, thread "M18x R2 GPU2 custom blanking plate."

Yes, I saw that thread. And I do have one of those original Alienware blanking plates.

But I figured I'd use that one in my other M18x, which is an R1.

I just didn't want to make a custom one myself, since I'm afraid of putting anything in the MXM slot that could short pins if it's even the slightest bit conductive, even with static.

Even that Alienware blanking plate scared me a little to put in; maybe it's some special material that's not so static-sensitive, I don't know. Or it might be just HDPE, I have no idea. But like I said, I prefer keeping it for my R1.

Maybe I'll change my mind, who knows :)

I'll decide when I have all the materials I ordered. Maybe I'll end up using the blanking plate after all, we'll see.

I'll post pictures when it's all done. It could take a month or so, though, before I can do the mod.


On 9/20/2022 at 5:59 AM, GMP said:

Work how exactly, if I may ask?

I thought multiple GPUs didn't work with SG mode?

Or does that only apply to SLI and Crossfire?

You're thinking that somehow not only the GTX 1070 will be used, but also the AMD card, for extra performance?

Kinda lost on what you mean here or what I would get out of it...

Work as in: Windows boots up and everything still runs on the primary GPU, but the secondary GPU shows up. You don't need the second GPU to provide any performance boost. The point is to see if you can use the computer normally with two different GPUs in SG.

 

People have said SG doesn't work with 2 GPUs, but from my Titan V SLI mod I realized it should work even in SG mode. So I am curious to see. I also plan to test RTX 3000 SLI in the AW18 once my A4500 comes for the M18xR2.
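
For a quick look at what Windows actually enumerated after booting with both cards in, something like this works. Just a sketch: it queries the standard Win32_VideoController WMI class through PowerShell, which lists every video controller Windows knows about.

```python
import subprocess

# Print every video controller Windows has enumerated, with its status.
out = subprocess.run(
    ["powershell", "-NoProfile",
     "Get-CimInstance Win32_VideoController | Select-Object Name, Status"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)
```

Both GPUs showing up with Status "OK" would be the first sign the combination at least enumerates cleanly.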


6 hours ago, ssj92 said:

Work as in: Windows boots up and everything still runs on the primary GPU, but the secondary GPU shows up. You don't need the second GPU to provide any performance boost. The point is to see if you can use the computer normally with two different GPUs in SG.

 

People have said SG doesn't work with 2 GPUs, but from my Titan V SLI mod I realized it should work even in SG mode. So I am curious to see. I also plan to test RTX 3000 SLI in the AW18 once my A4500 comes for the M18xR2.

But the RTX 3000 doesn't have an SLI plug for the SLI cable, and without the SLI cable the secondary card would only show up as a PhysX processor. I hope I am wrong and it might work in SG. This is very interesting. Would it be 2 of the same cards or 2 different cards? An experiment is needed!!!


6 hours ago, ssj92 said:

Work as in: Windows boots up and everything still runs on the primary GPU, but the secondary GPU shows up. You don't need the second GPU to provide any performance boost. The point is to see if you can use the computer normally with two different GPUs in SG.

 

People have said SG doesn't work with 2 GPUs, but from my Titan V SLI mod I realized it should work even in SG mode. So I am curious to see. I also plan to test RTX 3000 SLI in the AW18 once my A4500 comes for the M18xR2.

 

Titan V SLI mod? deetz! 😄 


SLI in SG mode doesn't work, of course, but there is some kind of tool that changes something in the Nvidia driver and allows enabling SLI on different cards without a bridge or anything else; it should also work in SG. Btw, using a dedicated Pascal GPU in the slave slot does help avoid power throttling and gives better performance, but you have to tweak the fans manually.


On 9/23/2022 at 12:43 PM, jaybee83 said:

 

Titan V SLI mod? deetz! 😄 

 

On 9/23/2022 at 6:10 AM, ssj92 said:

Work as in: Windows boots up and everything still runs on the primary GPU, but the secondary GPU shows up. You don't need the second GPU to provide any performance boost. The point is to see if you can use the computer normally with two different GPUs in SG.

 

People have said SG doesn't work with 2 GPUs, but from my Titan V SLI mod I realized it should work even in SG mode. So I am curious to see. I also plan to test RTX 3000 SLI in the AW18 once my A4500 comes for the M18xR2.

It doesn't. They show up, but you can't use either of them for 3D applications.


On 9/23/2022 at 3:33 AM, aldarxt said:

But the RTX 3000 doesn't have an SLI plug for the SLI cable, and without the SLI cable the secondary card would only show up as a PhysX processor. I hope I am wrong and it might work in SG. This is very interesting. Would it be 2 of the same cards or 2 different cards? An experiment is needed!!!

 

My Titan Vs also don't have official SLI support. Yet I am running two of them in SLI on a 12-year-old PCIe 2.0 x16 system WITHOUT an SLI bridge, and they almost beat my 3090 Ti:

 

https://www.3dmark.com/compare/spy/30859401/spy/30263905/spy/27316826 

 

Once I get the A4500 and remove my 2nd RTX 3000, I will test this same method in the AW18.

 

I am actually working on a YouTube video on how to do this. You can play around with any set of GPUs without an SLI bridge later. 😄

 

7 hours ago, Expresstyle said:

 

It doesn't. They show up, but you can't use either of them for 3D applications.

Have you tested this personally, and on which OS? If they both show up and Windows boots up, then there is hope.

 

Windows 11 and newer Windows 10 builds do a good job with multi-GPU setups and assigning which one runs. If I have two identical GPUs, then I may be able to do the same mod I did on the Titan V for the RTX 3000 to enable SLI.

 

There is a member in our AW Club running Aetina 1080 SLI in AW18. 


Windows 10. I don't remember the details exactly, but a while ago I tried launching with dual GTX 880M in SG and everything was working, but SLI wasn't available, and when I launched games they crashed at startup.

That's nice, but SLI can perhaps still be fine with old DX11 games; sadly, DX12 can't be forced to run in SLI, and of course developers don't support it anymore.

 


  • 1 year later...

So, has no one tried whether it works well with 2 different GPUs? Because I am also very interested in knowing if it works well on an Alienware 18... I saw in another discussion that a person mounted an RTX 3000 and a P3000 in his M18x R2; he did not say that it didn't work, and apparently for him it did, but he didn't explain how, and he didn't respond to messages regarding this subject!! 😭


  • 1 month later...

The AW18 is probably a better platform for mixing and matching GPUs, since either slot can be used for the primary card.

 

Maybe if I get some free time in the future I can try.


  • 1 month later...
On 9/19/2022 at 12:19 AM, Maxware79 said:

I've thought about doing this a few times over the years but never did anything about it. 

 

I'd imagine that you'd definitely have to find a way of disabling the MXM slot, as putting another functional card in there could cause some issues.

 

If you don't care about the AMD card, I would cut the pin edge off so it doesn't sit in the slot. This should work perfectly. If you do care about the card, then find a GTX 260M and cut that up; it should be dirt cheap.

 

Or he could do what I did with my RTX 2080.


On 11/14/2023 at 7:13 PM, Geg said:

So, has no one tried whether it works well with 2 different GPUs? Because I am also very interested in knowing if it works well on an Alienware 18... I saw in another discussion that a person mounted an RTX 3000 and a P3000 in his M18x R2; he did not say that it didn't work, and apparently for him it did, but he didn't explain how, and he didn't respond to messages regarding this subject!! 😭

I had the RTX 3000 and the Quadro P3000 in at the same time, not in SLI, just both installed at once. I had problems with DLSS recognition on the RTX 3000, and difficulties installing the drivers; Nvidia is not a very good friend when you have 2 different cards in the same PC. I had them in my Alienware 18.

