NotebookTalk

MaxxD

Member
  • Posts: 347
  • Joined
  • Last visited
  • Member Title: ◄Clevo® X170SM-G (zTecpc Prema BIOS) *i9-10900KF*128GB RAM*RTX 3080 16GB vRAM*8TB M.2 SSD►

MaxxD's Achievements

Rising Star (9/14)

Recent Badges

  • Very Popular (Rare)
  • One Year In
  • One Month Later
  • Collaborator
  • Week One Done

Reputation: 131

  1. Full (XMG) Control Center "Gallery" (<---LINK) My experience so far has been good and I am satisfied. I would recommend this machine to anyone. It is properly optimized in both software and hardware. The machine is a Medion Erazer, but it matches the XMG NEO machine perfectly, so I installed the XMG BIOS on it and used the XMG Control Center, and I don't regret it.
  2. There is a Custom Mode. What settings do you recommend there? GPU-Z: during the bench test, this max GPU value is the default.
  3. Hi, I bought a Medion Erazer Beast 16 X1 Ultimate laptop yesterday. I'm still getting to know it, but my first impression is good. The extra cooler is not here yet, but it is on its way; hopefully I will have it within 1-2 weeks. Power adapter: 420W. BIOS 🙂
  4. I prefer to use paste; it's safe and definitely good! 😁✌️
  5. Of course, it's been a good and interesting method so far... I've used Kryosheet a lot on chips, cut to size exactly as needed. According to my friend, it also degraded... that might have been the problem (?). I used it for both the CPU and GPU.
  6. Finally I applied paste and that solved the problem... in the end, paste is the best, always! Default: "Tested". During the Time Spy test, the GPU temperature reached a maximum of 79 degrees and the HotSpot temperature 87 degrees.
  7. With a GPU temperature of 71 degrees, a HotSpot temperature above 100 degrees is not great. 🙄😥
  8. Unfortunately, I can't solve it. In principle, the thickness of the pads is appropriate, but the fit is still not good. The assembled picture shows something... how is this possible? (I have to find out whether the size matches the real size; in principle it does!) Please confirm or deny, thank you! It seems that the heel sits high... but why!? It was good before and it wasn't this bad. So where or what is the mistake? 🤔😬
  9. If everything checks out, there is a reason for the error: the maximum difference between GPU and HotSpot temperature should be about 12 degrees, and since it is already much higher here (71 degrees GPU / 102-108 degrees HotSpot), that is what causes the behavior described earlier (see the sketch after this list).
  10. I experience this everywhere, yes. 🙄 3DMark Time Spy LINK
  11. Hi, I noticed something interesting: my 2080 Super card is not working at 100% and I don't know what's wrong with it. I tested it under Win10 and 11 with different drivers, and the result is the same. Here is a FurMark test.
  12. The machine is working again... it was a big fight, but a victory! (You could say it like that...) 🤭☺️
  13. Polishing the CPU and GPU heatsinks can also help. I have a couple of factory SM-G and KM-G heatsinks, all polished. The water coolers were not polished. ✌️
  14. Well, it's done! Finally! The RAM voltage was missing, so I had to trace where the voltage should come from and reconnect it. There was a separate voltage on the board, provided separately (3A) for the memory section, but using it would have made the 4th SSD slot unusable. The RAM voltage itself was OK anyway. Then the idea came to repurpose the Intel HD voltage for the RAM and pass it through. Only 2 modules can be used, because it would not withstand 4 modules and would be destroyed. The two opposite RAM slots work, and the 3200MHz CL20 Kingston RAM kit can be used! The CPU and GPU work perfectly. The whole machine works perfectly. i7-11700K @ 4.6GHz (Prime95)
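
As a rough illustration of the GPU vs. HotSpot delta check mentioned in item 9, here is a minimal Python sketch. The 12-degree rule of thumb and the temperature readings come from the posts above; the function and constant names are illustrative assumptions, not part of any monitoring tool's API.

MAX_HEALTHY_DELTA_C = 12  # rule-of-thumb limit quoted in item 9

def check_hotspot_delta(gpu_edge_c: float, hotspot_c: float) -> str:
    """Flag a suspect cold-plate mount when HotSpot runs far above the GPU edge sensor."""
    delta = hotspot_c - gpu_edge_c
    if delta <= MAX_HEALTHY_DELTA_C:
        return f"OK: delta of {delta:.0f} degrees is within the expected range."
    return (f"Suspect contact: delta of {delta:.0f} degrees exceeds "
            f"{MAX_HEALTHY_DELTA_C} degrees (repaste or re-pad).")

# Readings quoted in the posts above (degrees Celsius):
print(check_hotspot_delta(79, 87))   # after repasting: delta of 8, healthy
print(check_hotspot_delta(71, 105))  # before the fix: delta of 34, poor contact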