NotebookTalk

DoenerBoy123

Member
  • Posts: 26
  • Joined
  • Last visited

  • Member Title: Kebab Hunter

Recent Profile Visitors

424 profile views

DoenerBoy123's Achievements

Apprentice (3/14)

Recent Badges

  • One Year In
  • Collaborator
  • One Month Later
  • Conversation Starter
  • Dedicated

Reputation

11

  1. I’m actually planning on doing this, along with connecting the dGPU straight to the mDP connector. In theory it should be possible by adding the eDP and SMBus lanes to the MXM slot. The thing is, I don’t know whether software (BIOS) modifications are necessary for this to work. There’s also the problem that the DP signal is extremely fragile: both wires of each lane need to have almost exactly the same length and impedance (see the skew calculation sketched below this list).
  2. Interesting, I thought this wasn't a thing anymore, since cards often have different vBIOS versions for different amounts of memory. Are there any sources on how to place the resistors for specific memory sizes, e.g. for a 980M or 1070M (say, the MSI version)?
  3. It would be interesting to know which part of the vBIOS needs to be modded in order to use that amount of memory. I'd really like to try something like that on an old GTX 980.
  4. I had the same issue and also tried that, but it does not seem to fix it (at least in my case). As I mentioned above, you first have to refresh the BIOS chip with a dump of any version. This makes the USB backflash work again, and that is what actually brings your machine back to life. I had to desolder the BIOS chip for it to be recognised by the CH341A programmer, probably because the 3.3V rail draws too much current (the basic flashrom commands are sketched below this list). My theory is that this Intel HD memory setting bricks the Intel ME portion of the ROM, which is absolutely necessary for the machine to run. This would explain why the USB backflash method does not work in this case, as the ME is responsible for the USB flashing process (as far as I know). @simon h Depending on where you live, and if you are willing to, you can send me your board and I can try to revive it the way it worked for me. Just PM me.
  5. For those who might be interested, I've done something similar to a Zotac GTX 1070, also to increase the available power. There are three unused pads left, for a total of 10 MOSFETs. The power stage for the Vcore circuitry consists of 5 phases, so in the end you get 2 MOSFETs per phase. The stock config has two phases with 2 MOSFETs and three phases with 1 MOSFET each. I've already seen a few cards on which these single-MOSFET phases had burned up, which is another reason why I did this (a rough per-MOSFET current estimate is sketched below this list). All of these mods were done on a PCB preheater and with leaded solder to reduce stress on the involved components. Afterwards I flashed the vBIOS from the MSI GT72 to raise the power limit from 90W to 115-120W. I hope my text is not too confusing, I'm quite tired...
  6. Try NVCleanstall. In my opinion it's a little easier, and you get more control over the driver installation, e.g. telemetry and things like that.
  7. This thread is a little old now, but I want to give an update on this mod. It's been a few months since I modded this card, and so far the results are pretty good. I took the MOSFETs off an old, half-working Quadro and a 680M I had lying around. The soldering itself was not a big deal, but it requires some experience. The maximum power draw with the Prema vBIOS seems to be around 170 watts; the test was done with FurMark for a very short time to see how much this card can pull (that was with a 200 MHz OC on the core, voltage untouched). In most games the power consumption goes up to around 140-150 watts max, with the temperature settling at 65-70 degrees (M17x R4).
  8. I can recommend NVCleanstall. In my opinion it is easier than modding the driver manually, and it gives you more control over telemetry and other things.
  9. In the M18x and many other notebooks, the MXM slot is wired directly to the main power rail without any switch in between. Even when the notebook is turned off, there is voltage present on the slot. This is also why lots of notebooks with shorted GPUs trigger the PSU's short-circuit protection as soon as they are plugged in. My guess would be that there is no switch on the adapter board that disconnects the SSD and its voltage regulator from the main power rail/MXM slot when the machine is off. That means the SSD is always on and drawing power; it's the same as an external HDD with its own PSU turned on but no USB connected.
  10. I don't think it really matters which template you choose; at least that's the case with my GTX 1070 mobile. I just took the standard template for the 1070 mobile and it worked fine, without even having the same hardware ID.
  11. Some people replaced the stock fan with one from the M14x R1, as that fan can push much more air. Unfortunately these fans are hard to get, as lots of sellers offer "R1 compatible" fans, which is not what you want. The real R1 fan has blades shaped like the GPU fan's, which is why they work so well.
  12. As far as I know, a copper plate between heatsink and die is actually essential, as the Pascal core has less height. To get back to the main topic: I have some plans to try the modification, and with some luck I'll have time at the weekend. I'll definitely post updates, even unsuccessful ones!
  13. Okay, that might be the difference! Are you able to control all the fans independently, or do you control all three at once? And does the fan run at its desired speed? I was also thinking about using an Arduino and a temperature probe to build something like a PWM fan controller (a minimal sketch of that idea is below this list), but that was before I found the fan to be partially working.
  14. That's what I thought too. In my case the fan works after some warming up and then runs without issues (the GPU heats up to around 65-70 °C at idle, the fan turns on after some time; put some load on the GPU and the fan keeps spinning, just like with a compatible card). I found out that there's actually another thermal sensor on the PCB which monitors the MXM slot's ambient temperature, and it might be responsible for this behaviour. If you look at HWMonitor's entry for the EC, you'll see that there's no GPU temperature listed with the Zotac card. If you use another card like the 980M, there is an entry for the GPU, so the EC "sees" that card. I went through the MXM 3.1 spec sheet and found that MXM specifies predefined address values for the SMBus, which is how the EC reads the temperature. I think Zotac used different address values, which might be why the EC is unable to read the temp. The address can't be changed by using a different vBIOS; rather, there should be some resistors on the card that set the address (a simple address scanner sketch is below this list). Maybe there are people here with more knowledge about SMBus who want to help solve the problem? And as the card seems to be based on the GeCube one, maybe people with those cards can help too? Edit: Here's the spec sheet for MXM 3.1 and the part about the SMBus addresses I meant: https://www.module-store.de/media/pdf/d9/a4/43/MXM_Specification_v31_r10.pdf
  15. Good to know, and somewhat sad. I'm going to try this mod in the next few days when I have some time; I think it's actually not that hard. As an example, the MUX chip used for the mini DP is a Maxim MAX14998. This chip has two inputs, one coming from the PCH and the other coming from the MXM master slot. The selected input can be changed by setting the switch pin either high or low. So, if my theory is correct, I have to disconnect this pin from the EC and pull it low in order to activate the output coming from the MXM slot (according to the schematics). The same goes for all the other outputs, they just have different switches (same principle as described). I don't know about the 1080. I've heard those cards are hard to handle in dGPU-only laptops, as the eDP output causes some problems (it was never intended to be used). My card seems to have the original Zotac vBIOS and works fine, as I'm using Optimus. Another problem with these Zotac cards is the non-working fan control; I guess that has something to do with the SMBus.
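
Sketch for post 1: a quick C++ back-of-the-envelope for the DP pair-matching problem. The 5.4 Gbit/s HBR2 bit rate is the DisplayPort spec value; the skew allotment and the propagation delay are assumptions that depend on the actual PCB stackup, not numbers from the thread.

    #include <cstdio>

    int main() {
        // DisplayPort HBR2 runs at 5.4 Gbit/s per lane (spec value).
        const double ui_ps = 1000.0 / 5.4;         // one unit interval, ~185 ps

        // Assumptions, not thread data: how much intra-pair skew we allow
        // the PCB routing to add, and the propagation delay of the traces.
        const double skew_allotment_ps    = 10.0;  // assumed routing budget
        const double prop_delay_ps_per_mm = 6.7;   // assumed stripline delay

        printf("1 UI at HBR2: %.1f ps\n", ui_ps);
        printf("Assumed PCB skew allotment: %.1f ps\n", skew_allotment_ps);
        printf("=> max P/N length mismatch: ~%.1f mm\n",
               skew_allotment_ps / prop_delay_ps_per_mm);
        return 0;
    }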
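
Sketch for post 4: roughly what the external reflash looks like with flashrom and a CH341A. These are standard flashrom invocations, not a log from my session, and the .bin file names are placeholders.

    # read the bricked chip twice; identical dumps mean the wiring is good
    flashrom -p ch341a_spi -r dump1.bin
    flashrom -p ch341a_spi -r dump2.bin
    # write a known-good dump (flashrom verifies after writing)
    flashrom -p ch341a_spi -w knowngood.bin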
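
Sketch for post 5: a rough estimate of why the single-MOSFET phases are the weak spot. Only the 90 W and 115 W limits come from the post; core voltage, efficiency, and perfect phase balancing are illustrative assumptions.

    #include <cstdio>

    int main() {
        // Power limits from the post; everything else is an assumption.
        const double limits_w[] = {90.0, 115.0};  // stock vs. MSI GT72 vBIOS
        const double vcore  = 1.0;                // assumed core voltage (V)
        const double eff    = 0.85;               // assumed VRM efficiency
        const int    phases = 5;                  // Vcore phase count (post)

        for (double p : limits_w) {
            double i_core  = p * eff / vcore;     // rough core current (A)
            double i_phase = i_core / phases;     // ideal per-phase share
            printf("%3.0f W limit: ~%3.0f A core, ~%4.1f A per phase\n",
                   p, i_core, i_phase);
            printf("  1 FET/phase: %4.1f A per FET; 2 FETs/phase: %4.1f A\n",
                   i_phase, i_phase / 2.0);
        }
        return 0;
    }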
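
Sketch for post 13: the Arduino fan-controller idea in its simplest form. The pin choices and the raw-reading-to-duty mapping are placeholders that would need calibration, and note that analogWrite's default PWM frequency is far below the ~25 kHz a 4-pin fan expects, so the timer would need reconfiguring.

    // Thermistor on an analog pin drives a fan's PWM input (assumed wiring).
    const int SENSOR_PIN  = A0;  // thermistor voltage divider
    const int FAN_PWM_PIN = 9;   // fan PWM input

    void setup() {
      pinMode(FAN_PWM_PIN, OUTPUT);
    }

    void loop() {
      int raw = analogRead(SENSOR_PIN);        // 0..1023
      // Map the raw reading to a duty cycle; the endpoints are assumed
      // "cool".."hot" values and must be calibrated against real temps.
      int duty = map(raw, 300, 800, 60, 255);
      duty = constrain(duty, 60, 255);         // keep a minimum airflow
      analogWrite(FAN_PWM_PIN, duty);
      delay(500);
    }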
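
Sketch for post 14: the standard Arduino I2C scanner, as one way to check which SMBus addresses the Zotac card actually answers on. This assumes the card's sensor is reachable through the slot's SMBus pins; wiring and level shifting are left out.

    #include <Wire.h>

    void setup() {
      Wire.begin();
      Serial.begin(9600);
      // Probe every 7-bit address and print the ones that ACK.
      for (uint8_t addr = 1; addr < 127; addr++) {
        Wire.beginTransmission(addr);
        if (Wire.endTransmission() == 0) {   // 0 = device ACKed
          Serial.print("Device found at 0x");
          Serial.println(addr, HEX);
        }
      }
    }

    void loop() {}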