NotebookTalk

Developer79

Everything posted by Developer79

  1. Here are two pictures from my tests with the AMD and Nvidia cards. The AMD card simply wasn't seated correctly, which is why it negotiated only 4 lanes. That sometimes happens with the PCIe slot! :-)
  2. Correct, on that board there are only these connections via the chipset, no direct connections to the CPU! No, in my setup there are no PCIe connections through the chipset, only direct connections to the CPU!
  3. Yes, I think so too, since there are no problems with the speed :-) I'll test a current UEFI card soon.
  4. Here is my latest test with a desktop HD 6790 in the P870TM: you can see x16 lanes!
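On Linux, the negotiated lane count mentioned above can be checked by reading the `LnkSta` line from `lspci -vv`. Here is a minimal sketch that parses that line; the sample string is illustrative, not output from my machine:

```python
import re

def parse_link_status(lspci_output: str):
    """Extract negotiated PCIe speed (GT/s) and lane width from `lspci -vv` text."""
    match = re.search(r"LnkSta:\s*Speed\s+([\d.]+)GT/s.*?Width\s+x(\d+)", lspci_output)
    if not match:
        return None
    return float(match.group(1)), int(match.group(2))

# Illustrative sample of what `lspci -vv` prints for a GPU. A badly seated
# card often trains to a reduced width such as x4 instead of x16.
sample = "LnkSta: Speed 8GT/s (ok), Width x16 (ok)"
speed, width = parse_link_status(sample)
print(f"negotiated: {speed} GT/s, x{width} lanes")
```

If the width shown is lower than the slot supports, reseating the card (as with the HD 6790 above) is the first thing to try.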
  5. In this respect, you cannot rely on the technologies mentioned; you have to develop solutions yourself. That requires in-depth knowledge of physics, electrical engineering, electronics, and CAD, and familiarity with the relevant hardware specifications and standards.
  6. My concept is not like the one you proposed. For me it's about full integration! :-) One GPU for everything, no splitting of performance resources...
  7. Exactly: if you have a PCIe 4.0 card, the link will show as PCIe 3.0 on older systems. Of course, the CPU and chipset determine the generation actually used. Backward compatibility is always there!
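The backward-compatibility rule above is simple: the link trains to the highest generation both ends support. A small sketch of the resulting bandwidth, using the usual approximate per-lane figures from the PCI-SIG specs:

```python
# Approximate per-lane throughput in GB/s after encoding overhead
# (8b/10b for gens 1-2, 128b/130b from gen 3 onward).
PCIE_GBPS_PER_LANE = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969}

def negotiated_bandwidth(card_gen: int, slot_gen: int, lanes: int) -> float:
    """A PCIe link trains to the highest generation BOTH ends support."""
    gen = min(card_gen, slot_gen)
    return PCIE_GBPS_PER_LANE[gen] * lanes

# A PCIe 4.0 card in a PCIe 3.0 system runs at gen-3 speeds:
print(negotiated_bandwidth(4, 3, 16))  # x16 at gen 3, roughly 15.8 GB/s
```

The same rule works in the other direction: an older card in a newer slot also runs at the older generation.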
  8. You would have to go deeper into the various hardware specifications. In any case, I can say that no controller is necessary. Once I've finished my tests, I'll show a few things.
  9. Exactly, you need a direct connection to the MXM ports. That requires a custom development, but no controller, because a direct connection is the connection itself! All other options are not worthwhile from a physics point of view. A Thunderbolt port still needs time to allocate the PCIe lanes; every controller needs time to switch, and that would always be a bottleneck from a PCI Express point of view.
  10. Yes, that is a big disadvantage of the usual adapters! I will soon test an adapter with a real x16 connection in the P870TM. Let's see how that works.
  11. I know! I think this heatsink is excellent :-) Only a really good water-cooling setup is better. Even today's standard water-cooling loops are worse than the vapour chamber...
  12. Here is my latest experiment with the P870TM / 2x RTX 3080 / 9900KS and the big power supply (780 W): all of this with the original vapour chamber plus adapters. The temperatures are very good with the vapour chamber. This is the best heatsink ever built for the P870!!!
  13. The power supply does a good job. My last big test was a P870TM with 2x RTX 3080 (SLI) and the original vapour chamber... Unfortunately, it is hardly possible to modify the new drivers for SLI anymore!!!
  14. I mean, of course, the vapour chamber for the P870TM as cooling! This is the best air cooling ever with the RTX 3080!!!
  15. You don't have to have the copper heatsink. There are also adapters that let you keep using the original heatsink. For me, the temperatures under full load are between 55-60 degrees with air cooling and the RTX 3080. See the pictures in the thread. I think that cooling performance speaks for itself :-)
  16. You need a RAM module that supports the XMP profile; that's why there are no timing values in the BIOS! When I get a chance, I'll send you a picture of a compatible RAM module!
  17. I don't know why most people are so surprised when comparing the RTX 3080 MXM with the RTX 4080/4090 laptop! It has always been the case that the laptop versions are somewhat weaker than the desktop versions of the same generation. The performance of the RTX 4080/4090 laptop is also electrically limited compared to the desktop version. Then there is the fact that these are no longer MXM laptops. With an MXM card, the electrical influence is isolated on a separate module at some distance from the rest of the board, so the electromagnetic interference is lower.
  18. I did another test with Guardians of the Galaxy. I found easily reproducible settings at ComputerBase. Comparing the P870TM / RTX 3080 with the RTX 4080/4090 laptop now shows that the RTX 3080 performs at the level of an RTX 4080. Both with and without ray tracing, the picture is as just described: the RTX 3080 is about as fast as the RTX 4080, and the RTX 4090 is always a bit faster in this game.
  19. I have to agree with him there! With the programmer, you do have to watch out for certain things. But just try it out!
  20. I have a comparison here between ComputerBase and my own tests in Cyberpunk. The exact settings are documented here. Take a look at the amazing results! The P870TM / RTX 3080 / 9900KS performs better with ray tracing than without! With ray tracing, the P870TM / RTX 3080 is almost 20 FPS faster than an RTX 4090, and without it, it is about the same, maybe about 5 FPS slower!
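An absolute FPS gap like "20 FPS faster" means different things at different baselines, so it helps to convert it to a relative figure. A small sketch with hypothetical numbers (not the measured values from the screenshots):

```python
def relative_speedup(fps_a: float, fps_b: float) -> float:
    """Percentage by which fps_a outpaces fps_b."""
    return (fps_a - fps_b) / fps_b * 100.0

# Hypothetical illustration: a 20 FPS lead on a 60 FPS baseline
# is roughly a one-third advantage.
print(round(relative_speedup(80.0, 60.0), 1))  # 33.3
```

The same 20 FPS lead on a 120 FPS baseline would only be about a 17 % advantage, which is worth keeping in mind when reading such comparisons.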
  21. Yes, I agree that the benchmarks are somewhat tuned so that the new RTX 40xx laptops look good! Anyway, that's my experience from my own measurements :-) You might also think that an MXM card would generate more interference, but because everything is soldered directly on board there, there is actually more interference and influence!
  22. Here is another test from an independent source on YouTube. You can see all the settings and apply them exactly the same way. Gaming performance is identical in this comparison: the P870TM / RTX 3080 / 9900KS performs just as fast as the RTX 4080 at 175 W paired with an Intel Core i9-13900HX!!! The comparisons rather paint the picture that the RTX 4080 performs at the old RTX 3080 MXM level! The RTX 4090 laptop is slightly faster than an RTX 3080 MXM! DLSS 3.0 adds even more distortion and only brings extra performance in supported games. Older games stay at roughly the same speed compared to the RTX 3080 MXM!
  23. Yes, that's a problem with availability! Corona has also driven up inflation, and with it all the prices for old hardware! The days of cheap parts are over, and you really have to spend more than before! With appropriate adapters, it is possible to make the vapour chamber compatible with the RTX 30xx series...