NotebookTalk

Alienware M18x (R2) SG dGPU video out


DoenerBoy123


Hello, 

I recently installed the Zotac GTX 1070 (one of those MXM GPUs pulled out of their ZBoxes) in my M18x R2 and so far everything works fine (besides the strange fan behaviour, but that's another topic). Due to the lack of LVDS support I have to use the GPU in SG mode, so it's running over the iGPU. When connecting an external monitor I noticed that it is also wired to the iGPU only, which is bad for me as I want to use a high refresh rate screen. I dug through the laptop's schematics (easily available) and noticed that the video outputs are switched by MUX chips that are controlled by the EC. My question now is: is there any way to switch these chips in software, like a BIOS setting or something else? Otherwise I would just set them manually, as these chips only need a high or low signal to change the input.

Alienware Area 51 ALX R5 3600 32GB 3200MHz RTX 2080 Ti FE

Alienware M18x R2 i7 3840QM 4.2GHz 16GB 1600MHz GTX 1070 (Zotac MXM, 10 MOSFETs)

Alienware M17x R4 i7 3630QM 16GB 1600MHz GTX 980M 8GB (six MOSFETs)

MacBook Pro Retina i7 3820QM 16GB 1600MHz GT650M (for school)

And lots of other stuff in my hobby corner (old Clevos 'n' stuff like that) 😄

 


On 11/12/2022 at 9:02 AM, DoenerBoy123 said:

Hello, 

I recently installed the Zotac GTX 1070 (one of those MXM GPUs pulled out of their ZBoxes) in my M18x R2 and so far everything works fine (besides the strange fan behaviour, but that's another topic). Due to the lack of LVDS support I have to use the GPU in SG mode, so it's running over the iGPU. When connecting an external monitor I noticed that it is also wired to the iGPU only, which is bad for me as I want to use a high refresh rate screen. I dug through the laptop's schematics (easily available) and noticed that the video outputs are switched by MUX chips that are controlled by the EC. My question now is: is there any way to switch these chips in software, like a BIOS setting or something else? Otherwise I would just set them manually, as these chips only need a high or low signal to change the input.

As far as we know, nothing in software. 

 

If you get it working with a hardware mod, please do a write-up, as that would be very useful for many of us.

Alienware m18 : Intel Core i9 13900HX @ 5.0GHz | nVidia GeForce RTX 4090 | K1675 | 2x 1TB SSDs

Alienware Area-51M : Intel Core i9-9900K @ 5.3GHz | nVidia GeForce RTX 2080 | AX210 | Samsung 970 Evo+
Alienware M18x R2 : Intel Core i7 3920XM @ 4.7GHz | nVidia Quadro RTX 5000 | AX210 | Samsung 980 PRO
Alienware 18 : Intel Core i7 4930MX @ 4.5GHz | nVidia Quadro RTX 3000 | AX210 | Samsung 980 NVMe

More laptops: M14x (555m) | M14xR2 (650m) | M15x (980m) | M17xR3 (880m) | M18xR1 (RTX 5000)

BEAST Server: Intel Xeon W7-3465X 28 P-Cores | nVidia Titan V | 128GB RDIMM | Intel Optane P5800X


CS Studios YouTube: https://www.youtube.com/c/CSStudiosYT 


We tried flashing the BIOS; when I installed a Zotac 1080, only one BIOS worked with it normally.

Alienware M18 R1 : Intel Core i9 13900HX | nVidia RTX 4090 16 GB | 32 GB DDR5

Alienware Area-51M R2 : Intel Core i9 10900K | nVidia GeForce RTX 2080 Super 8 GB | 64 GB DDR4 2993 MHz | 2x 1 TB NVMe

Alienware 17 : Intel Core i7 4900QM @ 4.1 GHz | nVidia Quadro RTX 3000 6 GB | 32 GB DDR3 1600 MHz | 2x Samsung 860 Evo 500 GB (RAID)

 


On 11/16/2022 at 4:20 AM, ssj92 said:

As far as we know, nothing in software. 

 

If you get it working with a hardware mod, please do a write-up, as that would be very useful for many of us.

Good to know, though somewhat sad.

I'm going to try this mod in the next few days when I have some time. I think it's actually not that hard. For example, the MUX chip used for the mini DP is a Maxim MAX14998. This chip has two inputs, one coming from the PCH and the other coming from the MXM master slot. The active input can be changed by setting the switch pin either high or low. So, if my theory is correct, I have to disconnect this pin from the EC and pull it low in order to activate the output coming from the MXM slot (according to the schematics). This goes for all outputs, but they use different switch chips (same principle as described).

On 11/16/2022 at 8:18 PM, Komputers-Best said:

We tried flashing the BIOS; when I installed a Zotac 1080, only one BIOS worked with it normally.

I don't know about the 1080. I've heard these cards are hard to handle in dGPU-only laptops, as the eDP output causes some problems (it was never intended to be used). My card seems to have the original Zotac vBIOS and works fine since I'm using Optimus. Another problem with these Zotac cards is the non-working fan control; I guess that has something to do with the SMBus.

Alienware Area 51 ALX R5 3600 32GB 3200MHz RTX 2080 Ti FE

Alienware M18x R2 i7 3840QM 4.2GHz 16GB 1600MHz GTX 1070 (Zotac MXM, 10 MOSFETs)

Alienware M17x R4 i7 3630QM 16GB 1600MHz GTX 980M 8GB (six MOSFETs)

MacBook Pro Retina i7 3820QM 16GB 1600MHz GT650M (for school)

And lots of other stuff in my hobby corner (old Clevos 'n' stuff like that) 😄

 


Because there is no BIOS properly tailored for the Zotac 1070/1080 on Alienware, I used HWiNFO and simply adjusted the fan speeds there.

Alienware M18 R1 : Intel Core i9 13900HX | nVidia RTX 4090 16 GB | 32 GB DDR5

Alienware Area-51M R2 : Intel Core i9 10900K | nVidia GeForce RTX 2080 Super 8 GB | 64 GB DDR4 2993 MHz | 2x 1 TB NVMe

Alienware 17 : Intel Core i7 4900QM @ 4.1 GHz | nVidia Quadro RTX 3000 6 GB | 32 GB DDR3 1600 MHz | 2x Samsung 860 Evo 500 GB (RAID)

 


On 11/18/2022 at 11:15 AM, Komputers-Best said:

Because there is no BIOS properly tailored for the Zotac 1070/1080 on Alienware, I used HWiNFO and simply adjusted the fan speeds there.

That's what I thought too. In my case the fan works after some warming up and then runs without issues (the GPU heats up to around 65-70°C at idle, the fan turns on after some time, and if you then put some load on the GPU the fan keeps spinning just like it does with a compatible card). I found out that there's actually another thermal sensor on the PCB which monitors the MXM slot ambient temperature, which might be responsible for this behaviour.

If you look at HWMonitor's entries for the EC, you'll see that there's no GPU temperature listed with the Zotac card. If you use another card like the 980M, there's an entry for the GPU, so the EC "sees" the card. I went through the MXM 3.1 spec sheet and found that MXM specifies predefined address values for the SMBus, which is how the EC reads the temperature. I think Zotac used different address values, which might be why the EC is unable to read the temperature. The address can't be changed by using a different vBIOS; rather, there should be some resistors on the card that set the address. Maybe there are people here with more knowledge about the SMBus who want to help solve the problem? As the card seems to be based on the GeCube one, people with those cards could help too.
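For anyone who wants to poke at the SMBus side from a running Linux system first (for example to see whether the Zotac card answers at a non-standard address), here is a minimal sketch of a byte-wide register read over /dev/i2c-*. It only illustrates the technique; the bus number, slave address (0x4C) and register (0x00) in it are placeholders and are not taken from the MXM spec or the schematics:

// Minimal SMBus read sketch for Linux, link with -li2c (g++ smbus_temp.cpp -li2c).
// The bus number, slave address and register below are PLACEHOLDERS, not values
// from the MXM 3.1 spec or the M18x R2 schematics; substitute your own.
#include <cstdio>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>
#include <i2c/smbus.h>

int main() {
    const char *bus = "/dev/i2c-1"; // placeholder: the SMBus adapter exposed by the chipset
    const int addr  = 0x4C;         // placeholder: 7-bit slave address of the thermal sensor
    const int reg   = 0x00;         // placeholder: temperature register index

    int fd = open(bus, O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    // Point the following SMBus transactions at the sensor's slave address
    if (ioctl(fd, I2C_SLAVE, addr) < 0) { perror("ioctl I2C_SLAVE"); close(fd); return 1; }

    // SMBus "read byte data": write the register index, read one byte back
    int val = i2c_smbus_read_byte_data(fd, reg);
    if (val < 0) { perror("i2c_smbus_read_byte_data"); close(fd); return 1; }

    // Many discrete thermal sensors report whole degrees C directly in this byte
    printf("raw temperature byte: %d\n", val);
    close(fd);
    return 0;
}

If you just want to see which addresses respond on a given bus, i2cdetect from the i2c-tools package does the same scanning job interactively.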

 

Edit: Here's the spec sheet for MXM 3.1 and the section about the SMBus addresses I meant:

 

https://www.module-store.de/media/pdf/d9/a4/43/MXM_Specification_v31_r10.pdf

 

 

[Attachment: Capture.PNG]

Alienware Area 51 ALX R5 3600 32GB 3200MHz RTX 2080 Ti FE

Alienware M18x R2 i7 3840QM 4.2GHz 16GB 1600MHz GTX 1070 (Zotac MXM, 10 MOSFETs)

Alienware M17x R4 i7 3630QM 16GB 1600MHz GTX 980M 8GB (six MOSFETs)

MacBook Pro Retina i7 3820QM 16GB 1600MHz GT650M (for school)

And lots of other stuff in my hobby corner (old Clevos 'n' stuff like that) 😄

 


I installed the Zotac 1080 card in an Alienware 18 in both the first and the second slot. In the first slot it got very hot in games, and even at idle the temperature was 55°C. I moved it to the second slot, reinforced it with a copper plate on the chip, and used the best thermal pads. The laptop worked through Optimus; the idle temperature was 45-48°C and under load it reached 86°C. Since the fan was not spinning, I used aggressive settings in HWiNFO to turn it on.

Alienware M18 R1 : Intel Core i9 13900HX | nVidia RTX 4090 16 GB | 32 GB DDR5

Alienware Area-51M R2 : Intel Core i9 10900K | nVidia GeForce RTX 2080 Super 8 GB | 64 GB DDR4 2993 MHz | 2x 1 TB NVMe

Alienware 17 : Intel Core i7 4900QM @ 4.1 GHz | nVidia Quadro RTX 3000 6 GB | 32 GB DDR3 1600 MHz | 2x Samsung 860 Evo 500 GB (RAID)

 


So in the Alienware 18, the fan on the 1080 card was switched on based on the temperature sensor through HWiNFO and everything worked, though I don't know how it is on the R2.

Alienware M18 R1 : Intel Core i9 13900HX | nVidia RTX 4090 16 GB | 32 GB DDR5

Alienware Area-51M R2 : Intel Core i9 10900K | nVidia GeForce RTX 2080 Super 8 GB | 64 GB DDR4 2993 MHz | 2x 1 TB NVMe

Alienware 17 : Intel Core i7 4900QM @ 4.1 GHz | nVidia Quadro RTX 3000 6 GB | 32 GB DDR3 1600 MHz | 2x Samsung 860 Evo 500 GB (RAID)

 


On 11/19/2022 at 5:46 PM, Komputers-Best said:

So in the Alienware 18, the fan on the 1080 card was switched on based on the temperature sensor through HWiNFO and everything worked, though I don't know how it is on the R2.

Okay, that might be the difference! Are you able to control all the fans independently, or do you control all three at once, and does the card work at its desired speed? I was also thinking about using an Arduino and a temperature probe to build something like a PWM fan controller, but that was before I found the fan to be partially working.
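In case the Arduino idea becomes relevant again, this is roughly what I had in mind; just a sketch under assumed wiring (a 10k NTC thermistor in a divider on A0 and a 3-wire fan switched through a logic-level MOSFET on pin 9), nothing I have actually wired into the M18x:

// Hypothetical wiring: 10k NTC thermistor to +5V with a 10k resistor to GND forming
// a divider on A0, and a 3-wire fan switched through a logic-level MOSFET on PWM pin 9.
const int THERM_PIN = A0;
const int FAN_PIN   = 9;

const float SERIES_R  = 10000.0;  // divider resistor to GND [ohm]
const float NOMINAL_R = 10000.0;  // thermistor resistance at 25 °C [ohm]
const float NOMINAL_T = 25.0;     // [°C]
const float BETA      = 3950.0;   // Beta coefficient, check the thermistor datasheet

float readTempC() {
  int adc = analogRead(THERM_PIN);
  // Thermistor on the high side, so R_therm = R_series * (1023/adc - 1)
  float r = SERIES_R * (1023.0 / adc - 1.0);
  // Beta equation (simplified Steinhart-Hart)
  float invT = log(r / NOMINAL_R) / BETA + 1.0 / (NOMINAL_T + 273.15);
  return 1.0 / invT - 273.15;
}

void setup() {
  pinMode(FAN_PIN, OUTPUT);
}

void loop() {
  float t = readTempC();
  // Map 45-85 °C to roughly 30-100 % duty cycle, clamp outside that range
  int duty = constrain(map((int)t, 45, 85, 77, 255), 0, 255);
  if (t < 40.0) duty = 0;  // fan off below 40 °C
  analogWrite(FAN_PIN, duty);
  delay(1000);
}

analogWrite's default PWM frequency is fine when you switch the fan's supply through a MOSFET; a true 4-pin PWM fan would want its control line driven at roughly 25 kHz instead.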

Alienware Area 51 ALX R5 3600 32GB 3200MHz RTX 2080 Ti FE

Alienware M18x R2 i7 3840QM 4.2GHz 16GB 1600MHz GTX 1070 (Zotac MXM, 10 MOSFETs)

Alienware M17x R4 i7 3630QM 16GB 1600MHz GTX 980M 8GB (six MOSFETs)

MacBook Pro Retina i7 3820QM 16GB 1600MHz GT650M (for school)

And lots of other stuff in my hobby corner (old Clevos 'n' stuff like that) 😄

 


10 hours ago, DoenerBoy123 said:

Okay, that might be the difference! Are you able to control all the fans independently, or do you control all three at once, and does the card work at its desired speed? I was also thinking about using an Arduino and a temperature probe to build something like a PWM fan controller, but that was before I found the fan to be partially working.

 

Yes, everything worked. I had two sensors involved, one for the processor and one for the video card. I will say this, though: the heatsink from the Alienware 18 is weak, so I had to additionally install copper plates both on the chip itself and around the perimeter.

Alienware M18 R1 : Intel Core i9 13900HX | nVidia RTX 4090 16 GB | 32 GB DDR5

Alienware Area-51M R2 : Intel Core i9 10900K | nVidia GeForce RTX 2080 Super 8 GB | 64 GB DDR4 2993 MHz | 2x 1 TB NVMe

Alienware 17 : Intel Core i7 4900QM @ 4.1 GHz | nVidia Quadro RTX 3000 6 GB | 32 GB DDR3 1600 MHz | 2x Samsung 860 Evo 500 GB (RAID)

 


1 hour ago, Komputers-Best said:

Yes, everything worked. I had two sensors involved, one for the processor and one for the video card. I will say this, though: the heatsink from the Alienware 18 is weak, so I had to additionally install copper plates both on the chip itself and around the perimeter.

 

As far as I know, a copper plate (shim) between the heatsink and the die is actually essential, as the Pascal core is lower in height.

 

To get back to the main topic: I have plans to try the modification, and if I'm lucky I'll have some time at the weekend. I'll definitely post some updates, even unsuccessful ones!

Alienware Area 51 ALX R5 3600 32GB 3200MHz RTX 2080 Ti FE

Alienware M18x R2 i7 3840QM 4.2GHz 16GB 1600MHz GTX 1070 (Zotac MXM, 10 MOSFETs)

Alienware M17x R4 i7 3630QM 16GB 1600MHz GTX 980M 8GB (six MOSFETs)

MacBook Pro Retina i7 3820QM 16GB 1600MHz GT650M (for school)

And lots of other stuff in my hobby corner (old Clevos 'n' stuff like that) 😄

 

