NotebookTalk

1610ftw

Everything posted by 1610ftw

  1. Looks like you got a steal and it would be interesting to see how your 10900K performs in it!
  2. Looking forward to hearing from you brother - which CPU will be in yours? I wanted to do the upgrade to the Optimus-capable BIOS but lost interest, as I got pissed off that I can only use 3 out of 4 NVMe slots with the 10850K and 10900K at my disposal. I do not want to go back from 10 to 8 cores for that stupid 11xxx CPU generation, and I would also like to keep my 4 SSDs that are currently in a Dell 7760. With only 3 SSDs and the 10850K I had very few troubles, by the way, even with an early firmware - so maybe one of the keys to stability on the KM-G is to not go for that CPU generation and leave the 4th SSD slot alone.
  3. A not even 25% performance increase of the 3080 Ti over the 2080 across THREE generations is now defined as much more capable? While at the same time desktop performance increased by more than 75% - more than three times the gains. It looks to me as if Nvidia did as little as possible to avoid the impression of a complete standstill, while still charging the same high prices as back in the days when their laptop GPUs were actually quite competitive. Obviously the number of people who see it as natural that Nvidia routinely uses very low power limits and inferior dies that belong to smaller desktop cards is quite high. So the strategy of doing as little as possible seems to work out very well, and I will give them that, but with today's technology we could easily see laptop GPUs with at least 50% higher performance on a much lower total power budget than the SLI laptops of the past.
  4. Your English is very good and those fans are very strong - I like how they make things move on your table 😄 Merry Christmas to all and all the Best for 2023!
  5. That is indeed a very good clock speed in that case - and I thought you got a real stinker with your 8700K 😄
  6. Does not sound like you have the best sample there. You should not have to be out in the cold to reach more than 4.8 GHz, and I would recommend looking for a different processor - it could be a better-binned 8700K, or you could also look at the 8086K, 9700K or 9900K.
  7. Probably some truth to that, certainly the money-making part 😄 However, with desktop card manufacturers Nvidia is mostly telling them what to do. So it is probably a bit of both, and manufacturers and Nvidia are happy with relatively safe mediocrity that does not cause a lot of issues.
  8. Looks like at least the max power may be a bit higher this time: https://www.notebookcheck.net/Nvidia-RTX-4090-and-RTX-4080-Laptop-GPUs-rated-for-2-GHz-boost-and-up-to-200-W-TGP-including-Dynamic-Boost-RTX-4060-and-RTX-4050-up-to-95-W-or-165-W-each.676258.0.html It still puzzles me why Nvidia does not see the necessity to increase max power going up to the 4090, but then by all accounts they are only using a slower version of the 4080 desktop chip with less bus width, lower clock speeds and probably also fewer CUDA cores, so more power would not have the same effect as giving more power to a desktop 4080.
  9. SLI laptop designs like the P870 could cool up to 400W of peak GPU power with a total die size very similar to an RTX 4090's (more than 600 square mm). Even a 270W version of the 4090 chip in a laptop (60% of the desktop power target) should yield excellent results that would easily be at least 65% better than what Nvidia plans to achieve with the 4090 mobile, and it would probably still be at least 50% better with a 225W version of the 4090 desktop die.
  10. As the GPU discussion has taken over the HX processor thread for the most part, I would like to open this thread and start with a post that shows the constantly increasing performance differential between laptop and desktop GPUs, which depending on one's point of view may reach new sad and/or comical heights with the upcoming 40xx laptop GPUs. For the first time since the 980M there will be a top-end laptop GPU that in my opinion does not really have any right to carry the name of its desktop counterpart, as it will probably not even reach half of its performance, nor does it even use the same basic chip. In order to track the opening gap between laptop and desktop cards I have checked the leaderboard Time Spy GPU scores at 3DMark.com for the top laptop GPUs and their counterparts across 5 generations, starting with the best laptop card of them all, the GTX 1080, and the trend has been pretty obvious: after three generations where the increases of desktop cards over their laptop counterparts were relatively moderate, the differences have grown a lot with the 30xx cards. But all of this is nothing compared to the upcoming laptop 4090, which will probably not even reach half of the performance of the desktop version. To make it a fair comparison with attainable performance, and to exclude more elaborate or extreme setups, I have used the number-100-ranked scores as of today, and as we can see there is no real similarity in performance any more, just in name, for the 3080 cards and the upcoming 4090, if the rumored performance is not completely off. Extrapolating from previous generations and leaked scores I am going with ca. 19000 for now, as even 20000 looks less than likely due to the severely limited TGP and the use of a chip that is weaker than the one in the 4080 desktop card. Compare that to the 1080, 2080 and 2080 Super with a much smaller differential - one I was even surprised about, as I had thought it would be higher. So what has happened lately?
After leaving the path of very close performance capability between desktop and laptop GPUs, it looks like Nvidia stopped even trying to keep up with its desktop designs when it came to laptops. With the 3080 Ti the differential first went over 50%, and for the first time they used the name of a top-tier desktop card, which they had never done before, without really making sure they also had the hardware to back up that big name. Fast forward to today, and obviously Nvidia could not withstand the temptation to introduce the name of its biggest, most power-hungry top-end desktop card into the mobile lineup without taking any further steps to beef up the hardware accordingly. What we will get is apparently a severely TDP-limited design that is based on the 4080 desktop chip but pared down in almost all aspects of performance - so much so that even the desktop 4080 will be 50% if not 60% faster than this laptop 4090. It will be interesting to revisit this once the new cards have launched, and I will also add comparisons for the second-tier cards used in laptops, as they are imo the sweet spot of elevated performance at still decent pricing among the 30x0 cards; it will be interesting to see whether that changes with the upcoming 40x0 launch.
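For anyone who wants to redo this comparison with their own numbers, the differential is just a ratio of Time Spy graphics scores. Here is a minimal sketch; all score figures are placeholder assumptions for illustration (the laptop 4090 value is the ca. 19000 estimate from this post, not a leaderboard result):

```python
# Rough sketch of the desktop-over-laptop differential from 3DMark
# Time Spy graphics scores. All figures below are placeholder
# assumptions, not actual rank-100 leaderboard values.

def desktop_gap_percent(desktop_score: float, laptop_score: float) -> float:
    """Percentage by which the desktop card outscores the laptop card."""
    return (desktop_score / laptop_score - 1) * 100

# (desktop, laptop) score pairs -- illustrative only
pairs = {
    "RTX 3080 Ti": (19500, 12500),
    "RTX 4090 (est.)": (36000, 19000),
}

for name, (desktop, laptop) in pairs.items():
    print(f"{name}: desktop ahead by {desktop_gap_percent(desktop, laptop):.0f}%")
```

Swap in the real rank-100 scores from 3DMark.com to reproduce the comparison with current data.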
  11. DTR means desktop replacement, just not a replacement for the very best desktops out there - that is an important distinction. At least with the same GPU and CPU and both systems air-cooled it used to be relatively close for a while, for example with the 9900K and GTX 1080 or the 10900K and RTX 2080 Super. Not so much any more: CPUs are not even the main issue these days, as GPUs are where the differences grow to ridiculous proportions even without overclocking. Nvidia has crippled laptop GPUs so much, while steadily increasing the TDP of its higher-end desktop cards, that with the 40xx cards we may be talking about a ca. 2:1 performance ratio between high-end desktop and laptop GPUs, which is ridiculous.
  12. The problem is that in current laptops it is very hard to sustain more than 160W for even a 10-minute Cinebench run. With a golden sample and undervolting, the 12900HX could maybe achieve 24500 or 25000 in CR23 in such a setting, but I doubt that much more would be possible with that amount of power. And of course the CPU will get VERY hot while doing this. Compared to a stock 12900K this is actually pretty impressive, but we all know that with proper undervolting / overclocking and higher-end air cooling more performance can be had pretty easily with the 12900K. I can only see that gap widen with the 13900K / 13900HX, but if 160W allows 32K+ in CR23 that will still be a lot of performance in a small form factor - just don't compare it to desktops. For the upcoming 18" 13900HX laptop generation: if one combined a bigger unified vapor chamber heatsink with water cooling, like TongFang / Uniwill do, it would probably be possible to push sustained power to 200W and beyond - probably around 250W at the top end, and 200W without too much noise. Would be cool to have such a design that can use the added water cooling when stationary and a big unified vapor chamber when travelling.
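To put those Cinebench figures in perspective, here is a quick points-per-watt sketch at the 160W sustained limit discussed above. The scores are the post's estimates (a tuned 12900HX and a hoped-for 13900HX), not measured results:

```python
# CR23 points per watt at a ~160 W sustained package power limit.
# Score values are estimates from the discussion, not measurements.

SUSTAINED_W = 160

estimates = {
    "12900HX (tuned)": 24500,
    "13900HX (hoped for)": 32000,
}

for cpu, score in estimates.items():
    print(f"{cpu}: {score / SUSTAINED_W:.0f} CR23 points per watt")
```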
  13. The 9900K is a hot CPU. My son was playing a game today with his new 9900K that we got from brother @cylix, and with max cores set to 5 GHz the CPU would go up to 99 degrees even in some games. We then backed off to 4.8 GHz and temps stayed at 90 degrees max, usually in the mid 70s to mid 80s, with the fans not being excessively loud. That is a drop of at least 10 degrees for a 4% lower clock speed, and I am sure that backing off to 4.7 all-core would yield even more of an improvement; it would possibly also allow a bit more of an undervolt to further reduce power consumption. That was with a P870TM, but with the less beefy CPU heatsink and a modified cooling pad with Noctua fans that we also got from brother @cylix. So if you are really into CPU-only tasks or some extremely good benchmarks then I would say go for it, but if you aren't, then you may want to consider what good all of this will do when the total power available for your GPU and CPU is only about 250W. You can probably accommodate that with your current setup already, and it is more worthwhile to shoot for lower CPU temps to extend the life of your P775. In any case, if you type on an external keyboard at home then I strongly recommend a cooling pad, and consider running your P775 with the bottom cover off or at least with some strategically placed holes - that should drop your average CPU temps by 10 to 15 degrees in total before taking any other measures.
  14. Looks to me like a 3060 and a 3070 desktop chip and a 3080 mobile chip. In that case I would prefer the 3070 desktop chip. Something really interesting would be the 4070 Ti desktop chip on one of those - it still has the right bus width for MXM (192 bit), and even with a 150W TDP its performance might not be too far off a desktop 3080, especially if they manage to give it 12 or even 16GB of memory.
  15. The P870 was available with 2 x 200W GTX 1080s, and the die size of the desktop 4090 is about twice the die size of the GTX 1080. It would certainly be possible to cool a 250W or 300W version of that 4090 die - it is just a matter of doing it. Die area RTX 4090: 608 mm². Die area GTX 1080 MXM: 314 mm². It won't be the lightest and thinnest laptops that can do that, but not every laptop has to be 30mm tops.
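The die-area argument can be sanity-checked with some quick arithmetic: the P870's 2 x 200W across two GTX 1080 dies versus hypothetical 250W and 300W laptop power targets for the single 4090 die. A rough sketch (W/mm² is a crude proxy that ignores hotspots and heatsink design):

```python
# Power-density comparison: two GTX 1080 MXM dies (2 x 200 W, 2 x 314 mm^2)
# vs one RTX 4090 die (608 mm^2) at hypothetical laptop power targets.

GTX_1080_AREA_MM2 = 314
RTX_4090_AREA_MM2 = 608

sli_power = 2 * 200                  # W the P870 cooled across two dies
sli_area = 2 * GTX_1080_AREA_MM2     # total SLI die area in mm^2
print(f"P870 SLI: {sli_power} W / {sli_area} mm^2 = "
      f"{sli_power / sli_area:.2f} W/mm^2")

for target_w in (250, 300):          # hypothetical 4090 laptop power targets
    print(f"4090 at {target_w} W: {target_w / RTX_4090_AREA_MM2:.2f} W/mm^2")
```

Both hypothetical 4090 targets land below the power density the old SLI machines already cooled, which is the point of the comparison.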
  16. Well, maybe they could carry on with the form factor, but I do not really see any backwards compatibility if they need to up both power consumption and bus width, which is now at 384 bit for the 4090 desktop and 256 bit for the 4080 desktop. Also, the physical size of an MXM card is rather small compared to what we see with desktop cards, which have seen substantial size increases over the years. With everything increasing in size in desktops, a bigger socketed solution for laptops would make sense, everything else being equal - it is not as if those big desktop boards were empty with no parts on them.
  17. Way to go - looks like with liquid metal you can get some added performance out of it! The CPX17 can do almost 14K with a bit of Afterburner tweaking, so it is a shame Dell is leaving so much power on the table by setting the power limit so low. Still an excellent improvement over stock, and it would be cool to know how much your 7770 pulls during Time Spy with the slim and the fat power supply, and how much of a difference that makes in scores. Have you checked how much power you can sustain with your CPU now that you have applied liquid metal? I am pretty sure it will be at least 120W, as otherwise I doubt you would make it to a 25.5K CR23 score.
  18. What I would like to see:
• 18" screen
• room for at least 4 drives
• continued support for 128GB memory
• 1200p, 1600p and 2400p screen options; if only one is possible, 1600p (2560 x 1600) will be the best option
• If they go for an LGA CPU then please use the AM5 socket, as this is the last generation for Intel's 1700 socket and we would be stuck with this generation. Only really exciting if Clevo finally manages to support the socket for most or all of its lifetime - could be a great showcase for AMD. Alternatively, if it has to be Intel for some reason, it is hard to get excited if the socket will again only be usable for this one generation of CPUs. For BGA I do not even care that much any more, as long as we get the latest TB4/5 or equivalent with both CPU manufacturers.
• MXM GPU or, better, some new socketed format that can support higher memory bandwidth and higher power uptake - MXM is a dead end with today's much more powerful GPUs
• If BGA is used, please offer some selection, preferably down to the 4060 but at least the 4070
• If we need to live with crappy BGA it would be nice to have a proper plug-in eGPU solution - some NVMe-compatible plug on the side of the laptop would be cool, something like what Alienware had or like an ExpressCard slot
• watercooling or something else to improve performance when stationary
• at least 400W TDP between GPU and CPU with a huge unified vapor chamber, preferably with add-on watercooling like TongFang uses
• the ability to use it with only one or two power bricks, preferably with specifications that make it possible to swap power supplies with MSI units that use the same connector - that would help with smaller power supplies for travel
• mechanical cherry keyboard and normal silent keyboard options with a proper keyboard layout each, for workstation and gaming use
• properly distributed and chosen connectivity
• switchable graphics modes: Optimus, dGPU only and iGPU only
• 100Wh battery that can be swapped and accessed without tools - easier to get out, please, than on the X170
• ability to mix a small GPU with a big CPU and vice versa
• at least a 1080p webcam
• properly working fan control
  19. The 4090 is a new performance class - if they charge accordingly, the performance jump will have to be higher both compared to the previous generation and to what they call the 4080. Charging 50% more for a card with a less than 10% increase in performance could result in disappointing sales. But then they are mostly about making money, not about making sense, so my hope is that at least when AMD comes too close they will have a plan. If AMD really is looking to rival a desktop 3090 with its next mobile offering, it would overtake the 4090 at 175W, and Nvidia would look rather foolish with that "4090". So I am hoping that both the new name and AMD will make this 4090 better than it currently looks, and that in the long run AMD will keep Nvidia on its toes, unlike in the past, where AMD's offerings in high-end laptops have been lacking.
  20. You will have to check what is important to you:
• 16, 17 or even 18"
• 16:9 or 16:10
• how much memory
• how much storage
• screen resolution
• warranty and support
• fan noise
• cooling
Not every manufacturer covers all the bases.
  21. MSI GT77 with next-gen Nvidia graphics and Intel i9: https://www.notebookcheck.net/Upcoming-MSI-Titan-GT77-to-be-the-first-to-offer-4K-144-Hz-mini-LED-display-with-1-008-dimming-zones-and-1-000-nits-peak-brightness.673883.0.html Most probably the 13900HX and the mobile GeForce 4090. We'll see if there will also be an 18" Titan. It is probably hard to justify a 17.3" 16:9 and an 18" 16:10 Titan at the same time.
  22. The TGP of that "4090" is too low compared to the "4080" so I would assume that this is not the last word on TGP. They will want to at least go up to something like 225 or 250W including boost. Bigger DTRs and the water cooled TongFang units should not have too many problems with this. They certainly do not have any issues right now with 175W where temps barely go beyond 65°C in the better units.
  23. That would suck, I thought it was a given that all their X170SM-G would get the Prema bios. In any case it surely doesn't hurt to ask.
  24. Something doesn't seem right. If you got this new from zTecPC you may want to check with them about your findings. You would expect Prema Bios and working audio out of the box so better to sort that out first before diving in.
  25. Agreed on the 10/20 series, which did not have the huge performance differential, although it already opened up with the 20xx cards. AND rumors are just that - from what it looks like, those cards are still some months out, while the first laptops with the Nvidia solution should be available starting next month. So how much real pressure Nvidia feels from AMD in the laptop sector is hard to say. As the 4090 mobile is derived from the desktop 4080, Nvidia could easily allow it to reach 3090 levels, but for that they would have to allow more power and possibly more CUDA cores than have been rumored so far. Upping max power uptake to 200W + 50W boost, or even leaving it to manufacturers how much power they want to handle, is easily done in theory but may be hard to pull off if Nvidia has already told all the laptop companies that max power uptake would be 175W. In any case I am not holding my breath in the short term, but I agree that Nvidia will not want AMD to take the laptop crown, and they will probably have some kind of answer if AMD releases a mobile card that is as good as some rumors say (RTX 3090 levels). Would be great to finally have some competition in the laptop world 🙂 @mods: Maybe we could place the mobile GPU discussion in the tech news section? Current-gen GPUs sadly do not seem to have any relevance for the X170KM-G, so it may be better to have it there.