NotebookTalk

JamieTheAnything

Member
  • Posts: 36
  • Joined
  • Last visited

1 Follower

Reputation: 2

  1. I would recommend a thermal repaste along with replacing the thermal pads with thermal putty. It sounds like your machine might not have the entire die in contact with the heatsink. It shouldn't be at 100C at idle; even when stressed it should get up to 100C but still hold around 4.2-4.3 GHz all-core under load (on the optimized fan preset). I have yet to repaste my 7550 since getting it, so it should still be on the original paste, and that has been my experience. Fresh, better paste should improve things dramatically. Most CPUs with completely cracked and dried-out paste will show exactly what you're experiencing; repasting it and also replacing the CPU pads with putty would be your best bet for getting things under control thermally. (There's a quick clock/temperature monitoring sketch after this list if you want to verify the numbers.)
  2. lmfao, based on the board design I'm just going to flatten out the heatsink and then use the back of the motherboard as the MOSFET heatsink.
  3. Here are the two cards along with an overlay of both of them for reference (the sketch after this list shows roughly how the overlay was blended).
  4. In short: a little bit of the bottom of the palm rest (the part that faces up when you're looking at the bare GPU) would have to be trimmed to make room for the MOSFETs/power delivery of the non-Max-Q version. However, the non-Max-Q version is the same length as the P5000 that was supposed to fit the 7540/50; as far as I can tell, between the two RTX cards the non-Max-Q one is just longer. The width is the same, and only one screw mount (the one for screwing it to the palm rest) is different. Other than that, the palm rest and heatsink modification (flattening the heatsink out, since the "dip" for the MOSFETs will be in the wrong place, then adding a shim or a small piece of aluminum to connect the MOSFETs to the 7550 heatsink) is all that is needed, and since both machines share the same BIOS there shouldn't be a whitelist conflict.
  5. Not if you're afraid to make a little modification to the palm rest and the existing heatsink :3 I will make this fit and prove it :3
  6. Has anyone swapped a 7750 GPU into a 7550? Specifically the 110W variants of the RTX 4000/5000 series?
  7. Here's what the listing photo looked like, and the dude was asking $280 plus $40 for shipping. I lowballed him with an offer of $100 and was actually surprised that he accepted it.
  8. Hi y'all, long time no see. I managed to snag a 7550 recently for $100: no SSD, no RAM, keyboard ripped out. I already had the RAM and an SSD, so the only thing I had to replace was thankfully just the busted keyboard. It's an i9 configuration with the T2000 and the DCI-P3 calibrated 1080p display, which looks amazing. Since before I even got it, though, I've been rolling around the idea of upgrading it to the max-spec GPU, an RTX 5000 16GB. Between the 7750 and the 7550 there are two different SKUs of each card: an 80(90)W "Max-Q" version meant for the 7550, and a 110W version meant for the 7750. I happened to snag a cheap RTX 5000 on Friday, and it's the 7750 110W version. Based on overlaying images at various opacities, it looks like I may need to trim a little bit of the palm rest in order to add extra cooling to the MOSFETs; the card is so much longer than a typical one that they wouldn't be in contact with the heatsink otherwise. If anyone has done this kind of swap before (Precision 17" GPU -> 15/16"), please let me know if there are any other potential incompatibilities I should be aware of, like BIOS locks. As far as I know the 7550 (Max-Q) version also works in the 7750, and since they both share the same BIOS there's no reason why it wouldn't recognize and boot with the 17" card. Also, I'm switching from the "P" heatsink over to the "E" style heatsink, since that one has a much bigger cutout for the larger die of the RTX 4000+ GPUs. Attached are some photos of the wonderful thing.
  9. As it turns out, if you don't plan on shaving them down, you can also hard-mount the TM cooler to a DM2/3. I used screws from a Cisco switch I took apart: took off the spring screws and replaced them with those, and I finally got good enough contact to even use PTM7950.
  10. Because of the abundance of gatekeeping (and out of spite) regarding modding heatsinks to accept the 30-series cards, I'm starting a thread for an open-source, freely available heatsink mod for the 3080m on the DM(2/3), KM, and TM series. I will be releasing exact measurements and a guide once I've completed the mod and ironed out the kinks. I'm also limiting myself to using only thermal putty and copper shims so that ANYONE can do it. I've seen far too many gatekeepers here and I want to change that. There will be two versions, one for the T-shape heatsink and one for the dual-GPU vapor chamber; the dual-GPU vapor chamber version is being developed first. If no one else is going to do this, I will, and I will update the thread as it goes on.
  11. What I'm more than likely going to do is just limit it to 75 watts and raise the single-core performance to improve gaming. For all-core, I don't know anymore; pinning it to 85-97 watts or so to get an all-core 4 GHz while undervolted, with the cache at 40x, is what I'll settle for. (A rough power-limit sketch is after this list for anyone who wants to script the cap.)
  12. Yeah, I ended up hard-mounting it since one of the screws decided to stop holding, so I used 4 screws from a Cisco switch I took apart and hard-mounted it, because otherwise it wasn't making ANY contact.
  13. Odd thing about mine: it actually had all 12V fans, and it's a DM2_DM3. *I think the DM3 is the dual-GPU version, which is what I have.
  14. Unfortunately, I have installed the new cooler and it made zero net improvement. However, I did figure something else out that helps with staying under 100 watts under load: I had the damn cache multiplier set to 44x. Setting it back to the default (21x) lets it get up to 4.4 GHz all-core without thermal or power throttling. It does come with a considerable dip in performance compared to 44x cache at 4.2 GHz all-core, though: 1800 vs 1650 in CB R15, roughly an 8% drop despite the higher core clock. I'm tired.
  15. It won't, because the only difference between the dual-GPU TM and DM is the part that sticks out into the channel for the CPU.
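
A quick way to sanity-check the idle temps and all-core clocks mentioned in item 1, as a minimal sketch only: it assumes a Linux install with the psutil package, and the "coretemp" sensor name and "Package" label are Linux/Intel-specific assumptions. On Windows, a tool like HWiNFO or ThrottleStop shows the same numbers.

    import time
    import psutil  # assumption: psutil is installed (pip install psutil)

    def snapshot():
        # Average current core clock in MHz, as reported by the OS.
        mhz = psutil.cpu_freq().current
        # Package temperature; sensors_temperatures() only exists on Linux/FreeBSD,
        # so fall back to an empty dict elsewhere.
        temps = getattr(psutil, "sensors_temperatures", lambda: {})().get("coretemp", [])
        pkg = next((t.current for t in temps if "Package" in (t.label or "")), None)
        return mhz, pkg

    for _ in range(10):
        mhz, pkg = snapshot()
        print(f"~{mhz:.0f} MHz, package temp {pkg} C")
        time.sleep(1)

Run it once at idle and again with a stress load going; if the package sits near 100C at idle, or the clock falls well below the 4.2-4.3 GHz range under load, the repaste/putty advice in item 1 applies.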
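For the overlay in item 3: this is roughly how two card photos can be blended at an adjustable opacity with Pillow. The file names are placeholders, and the photos are assumed to be cropped to about the same framing before blending.

    from PIL import Image  # assumption: Pillow is installed (pip install Pillow)

    def overlay(path_a, path_b, alpha=0.5):
        # Image.blend needs matching mode and size, so convert and resize first.
        a = Image.open(path_a).convert("RGBA")
        b = Image.open(path_b).convert("RGBA").resize(a.size)
        return Image.blend(a, b, alpha)  # alpha=0.0 shows only A, 1.0 shows only B

    # Placeholder file names for the Max-Q and 110W card photos.
    overlay("rtx5000_maxq.jpg", "rtx5000_110w.jpg", 0.5).save("overlay.png")

Stepping alpha from 0 to 1 gives the "various opacities" comparison mentioned in item 8.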
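And for the 75-watt cap in item 11: a rough sketch of scripting the long-term power limit (PL1) through the Linux intel-rapl powercap interface, run as root. The sysfs path and constraint index are assumptions about a typical single-package Intel laptop, and the firmware/EC may clamp or restore its own limit regardless; on Windows the equivalent is setting the turbo power limits in ThrottleStop.

    from pathlib import Path

    # Assumption: package 0 shows up as intel-rapl:0; check `ls /sys/class/powercap`.
    RAPL = Path("/sys/class/powercap/intel-rapl:0")
    PL1 = RAPL / "constraint_0_power_limit_uw"  # constraint_0 is the long-term limit (PL1)

    def set_pl1_watts(watts):
        PL1.write_text(str(int(watts * 1_000_000)))  # powercap values are in microwatts

    print("current PL1:", int(PL1.read_text()) / 1_000_000, "W")
    set_pl1_watts(75)  # the 75W cap from item 11
    print("new PL1:", int(PL1.read_text()) / 1_000_000, "W")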