NotebookTalk

Nvidia Lovelace AD102 RTX 4090 to have 800 W power limit, laptop variants tamed at 175 W


KING19

Recommended Posts

As expected in the laptop sector

 

https://www.notebookcheck.net/Nvidia-Lovelace-AD102-RTX-4090-to-have-800-W-power-limit-laptop-variants-tamed-at-175-W-could-result-in-wide-performance-deltas-compared-to-desktop.629527.0.html

 

We have been speculating that Nvidia's upcoming Lovelace generation of GPUs would have higher power consumption than Ampere. Rumors so far have suggested that the upcoming flagship RTX 4090, based on the AD102 GPU, may have power consumption pushing past 700 W or even 850 W.

 

Nvidia will offer both desktop and mobile variants of Lovelace, and now we are getting an idea of their likely power targets.

This information comes via @kopite7kimi on Twitter, who has been quite accurate for the most part when it comes to GPU leaks. The Lovelace lineup will have GPU offerings starting from AD102, the flagship chip, all the way down to AD106.

 

AD102 is expected to have a mind-boggling 800 W power limit, which obviously means it will not make it to laptop form factors. This is not unlike what we've seen with Ampere wherein the top-end RTX 3080 Ti Laptop GPU was based on GA103 and not GA102.

 

GPUs from AD103 onwards will come in both desktop and mobile variants. While the desktop AD103 will have a maximum power limit of 450 W, the mobile version will be capped at 175 W similar to the RTX 3080 Ti Laptop GPU.

 

AD104 is touted to have a power limit of 400 W on the desktop but will be restricted to 175 W on mobile. This is a good uplift from what we've seen with the RTX 3070 Ti Laptop GPU, which has a TGP of 150 W (125 W TDP and 25 W Dynamic Boost).

Finally, AD106 will have a power limit of 260 W on the desktop but will work at 140 W in laptops. The RTX 3060 Laptop GPU offers a range of TDPs from 60 W to 115 W, but it is interesting to know that its successor can be pushed all the way to 140 W. We do not have power limits for AD107 yet.

 

A point worth noting here is that these figures are maximum power limits for a particular GPU, not actual TDP or TGP figures. These numbers indicate how far a GPU can be pushed under the right cooling and overclocking conditions, and they also give an idea of how large momentary power spikes under load could be.

The RTX 4090 cards are thought to hover around 600 W TGP. However, a more recent spec leak puts the RTX 4090's TGP at around 450 W, which is still a good 100 W higher than the RTX 3090's 350 W rating. 

 

The mobile Lovelace cards look to be using similar power limits as the current Ampere generation. The wide power limit gap between corresponding desktop and laptop Lovelace SKUs indicates that the performance delta between them could be huge. 
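
To put that gap in rough numbers, here is a minimal Python sketch using only the rumored maximum power limits quoted above. These are upper bounds rather than shipping TGP/TDP figures, so treat the ratios as illustrative:

```python
# Rumored maximum power limits (W) from the leak discussed above.
# These are upper bounds, not shipping TGP/TDP figures.
power_limits = {
    "AD102": {"desktop": 800, "mobile": None},  # no laptop variant expected
    "AD103": {"desktop": 450, "mobile": 175},
    "AD104": {"desktop": 400, "mobile": 175},
    "AD106": {"desktop": 260, "mobile": 140},
}

for chip, limits in power_limits.items():
    desktop, mobile = limits["desktop"], limits["mobile"]
    if mobile is None:
        print(f"{chip}: {desktop} W desktop, no mobile variant")
    else:
        print(f"{chip}: {desktop} W desktop vs {mobile} W mobile "
              f"(desktop limit is {desktop / mobile:.1f}x the mobile limit)")
```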

Having said that, pushing Lovelace any higher than 175 W would probably require large investments in chassis and cooling redesign R&D, which is unlikely to be a near-term goal for most OEMs. Some Alder Lake HX laptops are already pushing a combined 250 W from the CPU and the GPU. That being said, the increased CUDA core counts and other specification improvements should still deliver a good performance uplift with Lovelace mobile, but it is too early to speculate.

  • Thumb Up 3
  • Sad 1

Current Laptop:

Lenovo Legion 5: AMD Ryzen 7 4800H 2.8Ghz (Boost: 4.2Ghz), 6GB Nvidia Geforce GTX 1660Ti GDDR6 Memory, 15.6" FHD (1920 x 1080) 144Hz IPS display, 32GB 3200MHz DDR4 memory, 512GB M.2 NVMe PCIe SSD, 1 TB Teamgroup MP34 M.2 NVMe PCIe SSD, Windows 10 Home 22H2


Frankly when I read this I have zero respect left for Nvidia.

 

I know the feeling that high-performance laptops are a lost cause, but then we go and buy desktop cards from the very same company that seems more and more intent on neutering its laptop offerings. Nvidia has been going out of its way recently to widen the gap in serviceability and performance between desktops and laptops, and this has to stop.

 

I get that 800 W graphics in a laptop will not be possible, but we used to have 2 x 200 W GPUs in previous generations. Even moving in that direction, from 175 W toward 300 W, would go a long way compared to the anemic 175 W we are left with now. Frankly, I do not see why we could not have top-of-the-line DTRs with a 400 W GPU, a unified vapor chamber, and added water cooling, Eluktronics-style.

 

So instead of cementing the complete degradation of GPU performance in laptops, I sincerely hope that Nvidia will come to its senses and work on some kind of new standard for socketed laptop GPUs - it is about time.

 

  • Thumb Up 3

14 hours ago, 1610ftw said:

Frankly when I read this I have zero respect left for Nvidia.

 

I know the feeling that high-performance laptops are a lost cause, but then we go and buy desktop cards from the very same company that seems more and more intent on neutering its laptop offerings. Nvidia has been going out of its way recently to widen the gap in serviceability and performance between desktops and laptops, and this has to stop.

 

I get that 800 W graphics in a laptop will not be possible, but we used to have 2 x 200 W GPUs in previous generations. Even moving in that direction, from 175 W toward 300 W, would go a long way compared to the anemic 175 W we are left with now. Frankly, I do not see why we could not have top-of-the-line DTRs with a 400 W GPU, a unified vapor chamber, and added water cooling, Eluktronics-style.

 

So instead of cementing the complete degradation of GPU performance in laptops, I sincerely hope that Nvidia will come to its senses and work on some kind of new standard for socketed laptop GPUs - it is about time.

 

I expected at least 200 W for the laptop version, but it seems that is too much to ask. The laptop versions will keep getting neutered while the desktop versions keep getting more powerful, widening the gap between them.

 

It's very doubtful that OEMs will make power bricks over 330 W to handle a higher-wattage laptop GPU, because the laptop would lose its portability and it would make more sense to buy a desktop instead. Back then you had to use two power bricks to handle it, or it would power throttle.

 

Sadly the days of socketed GPUs are over unless you get a workstation.

 

 

  • Thumb Up 1

Current Laptop:

Lenovo Legion 5: AMD Ryzen 7 4800H 2.8Ghz (Boost: 4.2Ghz), 6GB Nvidia Geforce GTX 1660Ti GDDR6 Memory, 15.6" FHD (1920 x 1080) 144Hz IPS display, 32GB 3200MHz DDR4 memory, 512GB M.2 NVMe PCIe SSD, 1 TB Teamgroup MP34 M.2 NVMe PCIe SSD, Windows 10 Home 22H2


20 hours ago, KING19 said:

I expected at least 200 W for the laptop version, but it seems that is too much to ask. The laptop versions will keep getting neutered while the desktop versions keep getting more powerful, widening the gap between them.

 

It's very doubtful that OEMs will make power bricks over 330 W to handle a higher-wattage laptop GPU, because the laptop would lose its portability and it would make more sense to buy a desktop instead. Back then you had to use two power bricks to handle it, or it would power throttle.

 

Sadly the days of socketed GPUs are over unless you get a workstation.

 

 

 

Interesting to see the TDP development of the top desktop GPU model that was also in laptops, at least by name:

 

GTX 980:  165W

GTX 1080: 180W

RTX 2080: 215/225W

RTX 2080 Super: 250W

RTX 3080: 320/350W

RTX 3080 Ti: 350W

 

It is clear that starting with the 3080 the gap for laptops opened wide, due to Nvidia limiting TDP to at most 175 W, and mobile also fell behind massively in memory bandwidth.
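
A quick back-of-the-envelope sketch of that gap, comparing the desktop TDPs listed above (higher figure where two are given) against the 175 W ceiling Nvidia now applies to the mobile flagship; earlier generations had different mobile TDPs, so this is only a reference point:

```python
# Desktop TDPs from the list above (higher figure where two were given),
# compared against the 175 W ceiling applied to the current mobile flagship.
desktop_tdp_w = {
    "GTX 980": 165,
    "GTX 1080": 180,
    "RTX 2080": 225,
    "RTX 2080 Super": 250,
    "RTX 3080": 350,
    "RTX 3080 Ti": 350,
}
MOBILE_CEILING_W = 175

for gpu, tdp in desktop_tdp_w.items():
    print(f"{gpu}: {MOBILE_CEILING_W} W is {MOBILE_CEILING_W / tdp:.0%} of the desktop TDP")
# Falls from ~106% (GTX 980) to 50% (RTX 3080 / 3080 Ti), which is where
# the laptop-vs-desktop gap blew open.
```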

 

We used to have laptops that could deliver at least 350 W to their SLI GPUs alone. They shipped with two power supplies, and the customer could decide whether to use one or both - no problem using only one power supply on the go, and those who wanted full power could travel with two.

 

So we already had all of this power, and we lost it because the industry and Nvidia decided not to even make an effort to keep mobile solutions competitive.

 

Looking at desktop cards, it should be possible to get at least 90% of the performance out of a GPU at 25% or more lower power consumption, so something like a 250 W 3080 Ti with very competitive performance would have been possible if only the other parameters had been comparable to the desktop.

 

I can only speculate that the big laptop manufacturers together with Nvidia decided that it would make their life a lot easier - and the laptops thinner and lighter - if they limited the TDP of the GPU to 175W while also limiting its memory bandwidth.   

 

What this does to laptop performance vs a desktop can be seen here - the way the desktop dominates especially in 4K is rather brutal:

 

 

Going forward, even 250 W will make it difficult for laptops to keep pace with desktops if those go to 400+ W for the 4080. Only going up to 200 W would simply be too little, too late.

 

 

  • Thumb Up 2

It's just not really feasible to have a laptop with a full desktop-variant GPU.  Stock GPU heatsinks are thicker than a stack of notebooks in order to keep 300-350 W at reasonable temps.  No one is going to produce a 2+" thick notebook these days so that the 0.5% of laptop gamers (already a small percentage of the market) can buy it.

  • Thumb Up 2

Desktop | Intel i9-12900k | ASUS ROG Strix Z690-F | 2x16GB Oloy DDR5 @ 6400mhz CL32 | EVGA 3080 FTW3 Ultra | AW3821DW| 980 Pro 1TB PCIe 4.0 | All under water |

Server | SM846 | Unraid  6.12.0-rc4.1 | AMD Epyc 7F52 | Supermicro H12SSL-I | Tesla P40 24GB | 256GB 3200MHz ECC 8-channel | 100+TB ZFS |

Backup Server | SM826 | Unraid  6.12.0-rc4.1 | AMD Epyc 7302 | Supermicro H11SSL-I | Tesla P4 8GB | 256GB 2133MHz ECC 8-channel | 100+TB ZFS |

Dell XPS 9510 | Intel  i7-11800H | RTX 3050 Ti | 16GB 3200mhz | 1TB SX8200 | 1080P |

 

 


20 hours ago, 1610ftw said:

I can only speculate that the big laptop manufacturers together with Nvidia decided that it would make their life a lot easier - and the laptops thinner and lighter - if they limited the TDP of the GPU to 175W while also limiting its memory bandwidth.

 

6 hours ago, Custom90gt said:

It's just not really feasible to have a laptop with a full desktop-variant GPU.  Stock GPU heatsinks are thicker than a stack of notebooks in order to keep 300-350 W at reasonable temps.  No one is going to produce a 2+" thick notebook these days so that the 0.5% of laptop gamers (already a small percentage of the market) can buy it.

 

+1 to @Custom90gt's comment.  It's not evil/lazy laptop makers & NVIDIA that are driving the current state of affairs.  It's the fact that the market for big/bulky laptops isn't there.  You know that businesses in general will follow the money; if there was money to be made selling bulky but more powerful systems, someone would sell them.

 

(The audience among this forum is of course the exact types of people who would prefer bulkier but more powerful laptops.  We are, however, not at all representative of the general market.  The cost to get a system designed and manufactured, put it on sale for a "reasonable price", and only end up selling a few hundred / few thousand units is too high.)

 

It's not all bad.  From the video posted above, I can gather:

  • A high-end laptop can play today's big graphics games at "ultra settings" and >60FPS (pretty much all of them at 1440p, and many of them at 4K).  That's pretty awesome.
  • Honestly, the fact that the desktop is "only" ≈60% faster than the laptop is also pretty impressive, given that the desktop has a 3090 Ti GPU (around 3× power usage compared to a laptop 3080 Ti?) and takes up as much physical space as 10+ laptops.
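
As a rough perf-per-watt sanity check on those two bullet points (a sketch only; the ~60% and ~3× figures above are approximate):

```python
# Rough perf-per-watt check using the approximate figures above:
# desktop ~60% faster while drawing roughly 3x the GPU power.
perf_ratio = 1.6    # desktop performance relative to the laptop
power_ratio = 3.0   # desktop GPU power draw relative to the laptop (rough)

efficiency_ratio = perf_ratio / power_ratio
print(f"Desktop perf/W is ~{efficiency_ratio:.2f}x the laptop's; "
      f"the laptop delivers roughly {1 / efficiency_ratio:.1f}x better perf/W.")
```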

A laptop always has been and always will be a tradeoff — performance for portability.  You'll always be able to build a desktop that performs better than the best laptop.  Laptop performance is a generation or two behind desktop performance, but laptops are still getting better over time and today's best laptops are better than the best desktops from a few years back.  Regarding the original topic of this thread, NVIDIA Lovelace's node shrink from 8nm to 5nm will allow for notably higher performance at the same power level, so it will be a boost even if the power limit isn't cranked up.


I've seen comments recently that laptop makers should "invest more into R&D" for the cooling system, as if that will allow the systems to cool off chips drawing drastically more power.  To me, it seems that the best laptops these days are operating close to the edge of what physics allows with regard to cooling (without increasing the system's physical size and/or making them absurdly loud).  There will continue to be improvements, sure, but they will be incremental, just little bits at a time.  You're not going to see a sudden revolution in cooling design that allows for double the power consumption.

 

 ...After "defending" the current state of laptops, I will say that the lack of standardized swappable parts in laptops (other than SSD/RAM) is a real issue for anyone who would like to keep their investment for more than a few years with some upgrades along the way, and that has not been getting better over time.  😕  But at the same time, things have gotten to the point in the computing world where the real-world value of upgrades is getting less year-by-year so you can keep a high-end system as-is for longer without it seeming like a slug.

 

Lastly, there's always the eGPU option for anyone who would like a (sort-of) portable setup with even closer to desktop-class GPU performance.

 

My two cents.

  • Thumb Up 3

Apple MacBook Pro 16-inch, 2023 (personal) • Dell Precision 7560 (work) • Full specs in spoiler block below
Info posts (Windows) — Turbo boost toggle • The problem with Windows 11 • About Windows 10/11 LTSC

Spoiler

Apple MacBook Pro 16-inch, 2023 (personal)

  • M2 Max
    • 4 efficiency cores
    • 8 performance cores
    • 38-core Apple GPU
  • 96GB LPDDR5-6400
  • 8TB SSD
  • macOS 15 "Sequoia"
  • 16.2" 3456×2234 120 Hz mini-LED ProMotion display
  • Wi-Fi 6E + Bluetooth 5.3
  • 99.6Wh battery
  • 1080p webcam
  • Fingerprint reader

Also — iPhone 12 Pro 512GB, Apple Watch Series 8

 

Dell Precision 7560 (work)

  • Intel Xeon W-11955M ("Tiger Lake")
    • 8×2.6 GHz base, 5.0 GHz turbo, hyperthreading ("Willow Cove")
  • 64GB DDR4-3200 ECC
  • NVIDIA RTX A2000 4GB
  • Storage:
    • 512GB system drive (Micron 2300)
    • 4TB additional storage (Sabrent Rocket Q4)
  • Windows 10 Enterprise LTSC 2021
  • 15.6" 3940×2160 IPS display
  • Intel Wi-Fi AX210 (Wi-Fi 6E + Bluetooth 5.3)
  • 95Wh battery
  • 720p IR webcam
  • Fingerprint reader

 

Previous

  • Dell Precision 7770, 7530, 7510, M4800, M6700
  • Dell Latitude E6520
  • Dell Inspiron 1720, 5150
  • Dell Latitude CPi

2 hours ago, Aaron44126 said:

 

 

+1 to @Custom90gt's comment.  It's not evil/lazy laptop makers & NVIDIA that are driving the current state of affairs.  It's the fact that the market for big/bulky laptops isn't there.  You know that businesses in general will follow the money; if there was money to be made selling bulky but more powerful systems, someone would sell them.

 

(The audience among this forum is of course the exact types of people who would prefer bulkier but more powerful laptops.  We are, however, not at all representative of the general market.  The cost to get a system designed and manufactured, put it on sale for a "reasonable price", and only end up selling a few hundred / few thousand units is too high.)

 

It's not all bad.  From the video posted above, I can gather:

  • A high-end laptop can play today's big graphics games at "ultra settings" and >60FPS (pretty much all of them at 1440p, and many of them at 4K).  That's pretty awesome.
  • Honestly, the fact that the desktop is "only" ≈60% faster than the laptop is also pretty impressive, given that the desktop has a 3090 Ti GPU (around 3× power usage compared to a laptop 3080 Ti?) and takes up as much physical space as 10+ laptops.

A laptop always has been and always will be a tradeoff — performance for portability.  You'll always be able to build a desktop that performs better than the best laptop.  Laptop performance is a generation or two behind desktop performance, but laptops are still getting better over time and today's best laptops are better than the best desktops from a few years back.  Regarding the original topic of this thread, NVIDIA Lovelace's node shrink from 8nm to 5nm will allow for notably higher performance at the same power level, so it will be a boost even if the power limit isn't cranked up.


I've seen comments recently that laptop makers should "invest more into R&D" for the cooling system, as if that will allow the systems to cool off chips drawing drastically more power.  To me, it seems that the best laptops these days are operating close to the edge of what physics allows with regard to cooling (without increasing the system's physical size and/or making them absurdly loud).  There will continue to be improvements, sure, but they will be incremental, just little bits at a time.  You're not going to see a sudden revolution in cooling design that allows for double the power consumption.

 

 ...After "defending" the current state of laptops, I will say that the lack of standardized swappable parts in laptops (other than SSD/RAM) is a real issue for anyone who would like to keep their investment for more than a few years with some upgrades along the way, and that has not been getting better over time.  😕  But at the same time, things have gotten to the point in the computing world where the real-world value of upgrades is getting less year-by-year so you can keep a high-end system as-is for longer without it seeming like a slug.

 

Lastly, there's always the eGPU option for anyone who would like a (sort-of) portable setup with even closer to desktop-class GPU performance.

 

My two cents.

I agree with your comments on parts being swappable. A laptop being upgradable used to mean an upgradable GPU and/or CPU, as everything else being upgradable was a given. Now we get a hurrah for memory, SSDs, and network adapters not being soldered.

 

As for the first part of your post and what @Custom90gt said: there would be zero harm to anybody if Nvidia allowed a TDP max of, let's say, 275 W for the 3080 Ti. In the end it is up to the manufacturers to implement the TDP that they see fit. If they weren't interested, there would not be a single laptop built that can handle such a TDP, but I am pretty sure that instead we would see laptops that support at least 225 W if not more. And then customers could decide what they want to buy - it is called having a choice, as opposed to telling customers that companies know better what they want. This is how it SHOULD work. Instead we have an artificial cap, and both Clevo and Uniwill, who according to several accounts have asked for a higher TDP implementation (and, in Clevo's case, chips/support for an MXM design), get stonewalled by Nvidia. It is hard to believe that others who do not want to be upstaged or forced to release better products have nothing to do with that.

 

 

 

 

 

 

  • Thumb Up 1

On 6/29/2022 at 6:06 PM, 1610ftw said:

 

Interesting to see the TDP development of the top desktop GPU model that was also in laptops, at least by name:

 

GTX 980:  165W

GTX 1080: 180W

RTX 2080: 215/225W

RTX 2080 Super: 250W

RTX 3080: 320/350W

RTX 3080 Ti: 350W

 

It is clear that starting with the 3080 the gap for laptops opened wide, due to Nvidia limiting TDP to at most 175 W, and mobile also fell behind massively in memory bandwidth.

 

We used to have laptops that could deliver at least 350 W to their SLI GPUs alone. They shipped with two power supplies, and the customer could decide whether to use one or both - no problem using only one power supply on the go, and those who wanted full power could travel with two.

 

So we already had all of this power, and we lost it because the industry and Nvidia decided not to even make an effort to keep mobile solutions competitive.

 

Looking at desktop cards, it should be possible to get at least 90% of the performance out of a GPU at 25% or more lower power consumption, so something like a 250 W 3080 Ti with very competitive performance would have been possible if only the other parameters had been comparable to the desktop.

 

I can only speculate that the big laptop manufacturers together with Nvidia decided that it would make their life a lot easier - and the laptops thinner and lighter - if they limited the TDP of the GPU to 175W while also limiting its memory bandwidth.   

 

What this does to laptop performance vs a desktop can be seen here - the way the desktop dominates especially in 4K is rather brutal:

 

 

Going forward, even 250 W will make it difficult for laptops to keep pace with desktops if those go to 400+ W for the 4080. Only going up to 200 W would simply be too little, too late.

 

 

 

Those are very brutal results, especially in 4K, which doesn't surprise me - then again, 4K is a waste on a laptop anyway. Also, the desktop high-end GPUs use GDDR6X, unlike the laptop high-end GPUs. I'm still surprised that the 3070 Ti and 3080 Ti don't use GDDR6X at least. What a missed opportunity.

 

Personally, I don't expect laptop GPUs will ever keep pace with desktop GPUs. The gap will keep expanding every generation with laptop GPUs being stuck at 175 W.

 

10 hours ago, Custom90gt said:

It's just not really feasible to have a laptop with a full desktop-variant GPU.  Stock GPU heatsinks are thicker than a stack of notebooks in order to keep 300-350 W at reasonable temps.  No one is going to produce a 2+" thick notebook these days so that the 0.5% of laptop gamers (already a small percentage of the market) can buy it.

 

I agree. I said in another thread that there's barely a market for it anymore and that OEMs will cater to casuals instead of enthusiasts because it makes them more money. Same goes for T&L gaming laptops, which have terrible heat management and soldered components.

 

  • Thumb Up 1

Current Laptop:

Lenovo Legion 5: AMD Ryzen 7 4800H 2.8Ghz (Boost: 4.2Ghz), 6GB Nvidia Geforce GTX 1660Ti GDDR6 Memory, 15.6" FHD (1920 x 1080) 144Hz IPS display, 32GB 3200MHz DDR4 memory, 512GB M.2 NVMe PCIe SSD, 1 TB Teamgroup MP34 M.2 NVMe PCIe SSD, Windows 10 Home 22H2


4 hours ago, 1610ftw said:

I agree with your comments on parts being swappable. A laptop being upgradable used to mean an upgradable GPU and/or CPU, as everything else being upgradable was a given. Now we get a hurrah for memory, SSDs, and network adapters not being soldered.

 

As for the first part of your post and what @Custom90gt said: there would be zero harm to anybody if Nvidia allowed a TDP max of, let's say, 275 W for the 3080 Ti. In the end it is up to the manufacturers to implement the TDP that they see fit. If they weren't interested, there would not be a single laptop built that can handle such a TDP, but I am pretty sure that instead we would see laptops that support at least 225 W if not more. And then customers could decide what they want to buy - it is called having a choice, as opposed to telling customers that companies know better what they want. This is how it SHOULD work. Instead we have an artificial cap, and both Clevo and Uniwill, who according to several accounts have asked for a higher TDP implementation (and, in Clevo's case, chips/support for an MXM design), get stonewalled by Nvidia. It is hard to believe that others who do not want to be upstaged or forced to release better products have nothing to do with that.

 

 

 

 

 

 

That's kind of what Ampere mobile was: the way it functioned was base + cTDP (optional) + Dynamic Boost, which means OEMs could modify the TDP to fit a chassis or use it for power modes (Lenovo, for example, does the latter).

The cTDP is about 35 W, so up to 35 W can be added to the GPU, with Dynamic Boost adding another 15 W when the CPU is sitting under roughly 35 W - so basically 50+ W in total can be added. That's where your "at least 225 W for the 3080 Ti" would come from. We have had laptops that cooled 200 W 3080 Tis before, but the issue is thermal density: it's the reason Ryzen gets hotter even though it is more efficient. The more thermally dense a chip is, the hotter it gets.
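
A minimal sketch of that power-budget arithmetic (the 175 W base, ~35 W cTDP headroom, and 15 W Dynamic Boost figures are the ones quoted above; actual values vary by SKU and chassis):

```python
# Illustrative Ampere-mobile GPU power budget, using the figures quoted above.
# Actual base TGP, cTDP headroom, and Dynamic Boost vary by SKU and chassis.
BASE_TGP_W = 175        # RTX 3080 Ti Laptop base power limit
CTDP_HEADROOM_W = 35    # optional configurable-TDP uplift an OEM may enable
DYNAMIC_BOOST_W = 15    # power shifted from the CPU when it is lightly loaded

ceiling = BASE_TGP_W + CTDP_HEADROOM_W + DYNAMIC_BOOST_W
print(f"Fully configured GPU ceiling: ~{ceiling} W")  # ~225 W
```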

When it comes to AMD GPUs, they do it differently, though it's seen mostly in AMD Advantage laptops. AMD SmartShift gives wattage to either the CPU or the GPU depending on which one needs it. The 6600M can run from 50 to 100 W and the 6700M up to 135 W, but what AMD does at the top end is interesting: the 6800M is rated at 145 W and above, which means it is allowed to go above 145 W if the thermal design and power brick allow for it. We don't know yet how it performs in dGPU-only mode when it comes to wattage; might have to check again.

 

Update: OK, I got back to that. The Legion 5 with the 6600M had a MUX; the issue is that because SmartShift didn't support direct mode, it lost the SmartShift ability in dGPU mode, so in some games it suffered because of that. SmartShift 2.0 looks to have fixed those issues and seems to have added an option called SmartShift Max, although nobody knows what it's for yet.
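
For illustration only, here is a toy model of the shared-budget idea behind SmartShift. This is not AMD's actual algorithm, just a sketch of how a fixed platform budget might be split between CPU and GPU based on load, with hypothetical floors and ceilings:

```python
# Toy model of a shared CPU+GPU power budget (NOT AMD's real SmartShift logic):
# shift the spare watts toward whichever processor is more heavily loaded,
# while respecting hypothetical per-component floors and ceilings.
def split_budget(total_w, cpu_load, gpu_load,
                 cpu_min=15, cpu_max=65, gpu_min=80, gpu_max=145):
    """Return (cpu_w, gpu_w) for loads in the range [0, 1]."""
    spare = total_w - cpu_min - gpu_min
    demand = (cpu_load + gpu_load) or 1e-9   # avoid division by zero
    cpu_w = cpu_min + spare * (cpu_load / demand)
    gpu_w = gpu_min + spare * (gpu_load / demand)
    return min(cpu_w, cpu_max), min(gpu_w, gpu_max)

# GPU-heavy game: most of the shared budget flows to the GPU.
print(split_budget(180, cpu_load=0.2, gpu_load=1.0))  # approx. (29 W, 145 W)
```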

  • Thumb Up 1

On 7/1/2022 at 12:22 AM, KING19 said:

 

Those are very brutal results, especially in 4K, which doesn't surprise me - then again, 4K is a waste on a laptop anyway. Also, the desktop high-end GPUs use GDDR6X, unlike the laptop high-end GPUs. I'm still surprised that the 3070 Ti and 3080 Ti don't use GDDR6X at least. What a missed opportunity.

 

Personally, I don't expect laptop GPUs will ever keep pace with desktop GPUs. The gap will keep expanding every generation with laptop GPUs being stuck at 175 W.

 

 

I agree. I said in another thread that there's barely a market for it anymore and that OEMs will cater to casuals instead of enthusiasts because it makes them more money. Same goes for T&L gaming laptops, which have terrible heat management and soldered components.

 

 

When the laptop is used as a desktop replacement it will often be connected to an external 4K display, so being able to run games at 4K 60 Hz or more should be possible - no reason that shouldn't be a goal. Alternatively, people may want to game at QHD resolution but with really high refresh rates. Either way, one needs more power for that than current mobile designs allow.

Yes, of course there are other issues with mobile graphics besides the TDP being too low, but power delivery is the parameter I used for illustrative purposes, and to show that, in contrast to other performance-related parameters that have improved, TDP actually went down rather than up.

 

While I was not around at the height of the GTX 1080 DTR monsters with SLI, I am sure that @Papusan @Mr. Fox @jaybee83 or @electrosoft could fill in the gaps and tell us what was possible back then with regard to power delivery and performance - a P870 with a single GTX 1080 card surely was quite a bit more competitive against a desktop 1080 Ti than what we see today, where a desktop brutalizes the "Titan" in the most humiliating way and without having to wear ear plugs...

 

As for the market for more powerful machines, there are companies that wanted to serve it, and by several accounts they were/are denied by Nvidia. Nvidia probably checked what kind of cooling could be accomplished in small-bezel 17.3" laptops weighing about 7 lbs tops and decided that a 175 W TDP was more than enough. It didn't matter at that point that both Clevo and Uniwill/Tongfang would have liked to build higher-TDP models, or even just an MXM 3080 Ti with beefier conventional cooling or liquid cooling; they could not get anything with a higher TDP from Nvidia. The result is that the liquid cooling solution from Uniwill runs at ridiculously low GPU temperatures and the X170 project seems to be dead for now.

 

So this is the issue: Nvidia is effectively holding back companies who want to do more. It is not that no company wants to serve what surely is a niche - other areas of the economy also have companies serving niches, but they do not have to beg a big chip manufacturer to let them do it. Nvidia has in effect prevented any company from serving such a market by setting a very low maximum TDP and memory bandwidth, with the result that superior designs have almost no performance advantage over the thin-and-light brigade, which thanks to Nvidia does not have to fear that any other company will strive for superior performance with a beefier design.

 

In any case, Nvidia is still defended for the ridiculously low TDP and memory bandwidth when it is clear that more could be done to allow companies to differentiate themselves and to truly give customers a choice. While I obviously have no knowledge of any backdoor dealings or informal conversations taking place, I would be VERY surprised if Nvidia chose a 175 W maximum TDP and much lower bandwidth against the wishes of its bigger partners - it is not really a realistic scenario.

 

 

 

 

 

 

On 7/1/2022 at 3:28 AM, VEGGIM said:

That's kind of what Ampere mobile was: the way it functioned was base + cTDP (optional) + Dynamic Boost, which means OEMs could modify the TDP to fit a chassis or use it for power modes (Lenovo, for example, does the latter).

The cTDP is about 35 W, so up to 35 W can be added to the GPU, with Dynamic Boost adding another 15 W when the CPU is sitting under roughly 35 W - so basically 50+ W in total can be added. That's where your "at least 225 W for the 3080 Ti" would come from. We have had laptops that cooled 200 W 3080 Tis before, but the issue is thermal density: it's the reason Ryzen gets hotter even though it is more efficient. The more thermally dense a chip is, the hotter it gets.

When it comes to AMD GPUs, they do it differently, though it's seen mostly in AMD Advantage laptops. AMD SmartShift gives wattage to either the CPU or the GPU depending on which one needs it. The 6600M can run from 50 to 100 W and the 6700M up to 135 W, but what AMD does at the top end is interesting: the 6800M is rated at 145 W and above, which means it is allowed to go above 145 W if the thermal design and power brick allow for it. We don't know yet how it performs in dGPU-only mode when it comes to wattage; might have to check again.

 

Update: OK, I got back to that. The Legion 5 with the 6600M had a MUX; the issue is that because SmartShift didn't support direct mode, it lost the SmartShift ability in dGPU mode, so in some games it suffered because of that. SmartShift 2.0 looks to have fixed those issues and seems to have added an option called SmartShift Max, although nobody knows what it's for yet.

 

Very good points, and indeed I would think that being able to go up to 225 W, combined with higher memory bandwidth and GDDR6X, would have been a step in the right direction for this generation of Nvidia mobile chips.

 

But maybe it is really more interesting to go with AMD next generation, if they put out a chip with an open TDP that effectively makes them the king of the laptop world when working with laptop manufacturers that want to go for it. Of course, that would have to be a basic design that also profits from a higher TDP, with enough performance potential to make good use of the added power. With TB4 support now possible with AMD and 16-core chips on the horizon in both desktop and mobile form factors, AMD could really end up in a lot of higher-end gaming and DTR solutions, and then Nvidia and Intel will hopefully want to strike back - hurrah for competition.

  • Thumb Up 1
  • Like 1
  • Bump 1

5 hours ago, 1610ftw said:

 

Very good points, and indeed I would think that being able to go up to 225 W, combined with higher memory bandwidth and GDDR6X, would have been a step in the right direction for this generation of Nvidia mobile chips.

 

Well, GDDR6X wasn't used because right now it's kind of inefficient, hot, and power hungry.


47 minutes ago, VEGGIM said:

Well, GDDR6X wasn't used because right now it's kind of inefficient, hot, and power hungry.

Not so sure about that when we are talking about coming down from the 350W of a desktop 3080 Ti.

With 225W instead of 350W everything would be running a lot cooler including GDDR6X.

 

Not that it really matters, as a decision seems to have been made that laptops get truly high-end cards in name only. It would be better to just call the top end in a laptop an RTX 3070 desktop equivalent, to give people an estimate of what level of performance they can expect from a 3080 Ti mobile.

 

 

  • Thumb Up 2

41 minutes ago, 1610ftw said:

Not so sure about that when we are talking about coming down from the 350W of a desktop 3080 Ti.

With 225W instead of 350W everything would be running a lot cooler including GDDR6X.

 

Not that it really matters, as a decision seems to have been made that laptops get truly high-end cards in name only. It would be better to just call the top end in a laptop an RTX 3070 desktop equivalent, to give people an estimate of what level of performance they can expect from a 3080 Ti mobile.

 

 

I guess it wouldn't work well for Nvidia, since higher numbers = better for advertising. Unless we have something like a 3070 Super and also a 3070 Ti.


32 minutes ago, VEGGIM said:

I guess it wouldn't work well for Nvidia, since higher numbers = better for advertising. Unless we have something like a 3070 Super and also a 3070 Ti.

I find it deceptive; better to just come up with a different nomenclature altogether.

Another option would be to just use the desktop chips and let manufacturers add power as they see fit but clearly this was ruled out as an option long ago.

 

Glad that this is still possible for Intel CPUs with at least one model in every generation sporting a proper LGA socket - now if only it would not be a seriously gimped design this time around.... 

  • Sad 1

22 hours ago, Aaron44126 said:

It's not evil/lazy laptop makers & NVIDIA that are driving the current state of affairs.  It's the fact that the market for big/bulky laptops isn't there.  You know that businesses in general will follow the money; if there was money to be made selling bulky but more powerful systems, someone would sell them.

I think the gaming DTR niche effectively collapsed itself when we had the Alienware 51m, MSI GT76, and X170SM all competing against each other for the same customers that year. I would not be surprised if Clevo lost many sales to those competitors, and their R&D budget was reduced as a result. I know monopolies are generally frowned upon but in this case it is probably best to have less competition for this type of laptop.

  • Thumb Up 5

Desktop - Xeon W7-2495X, 64GB DDR5-6400 C32 ECC, 800GB Optane P5800X, MSI RTX 4080 Super Ventus 3X OC, Corsair HX1500i, Fractal Define 7 XL, Asus W790E-SAGE SE, Windows 10 Pro 22H2

Clevo PE60SNE - 14900HX, 32GB DDR5-5600 CL40, 4TB WD SN850X, RTX 4070 mobile, 16.0 inch FHD+ 165hz, System76 open source firmware, Windows 10 Pro 22H2


I cannot defend or justify anything the manufacturers of laptops, or Intel, NVIDIA and AMD do in terms of crippling and performance capping. However, it is understandable to the extent that form factor and the irrational public fetish relating to laptops needing to be absurdly thin and light necessitates emasculating these products to avoid thermal malfunction, premature failure, end-user drama associated with the chassis getting too hot to handle (literally) and even fire hazards. The underlying problem that gives rise to it and exacerbates matters is the fact that they CANNOT give us monsterbooks when the form factor only supports it being a turdbook. Throw in irrational expectations on having a silent machine and you're basically looking at laptops being a smartphone on steroids.

 

As I have always said, "we will have whatever the sheeple are willing to put up with" and the OEM/ODM is going to give those stupid sheeple masses whatever they are willing to pay for... minus a little bit because they never go above and beyond and operate on a "just barely" approach.

  • Thumb Up 3

WRAITH // Z790 Apex | 14900KS | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO (T-Rex)

BANSHEE // X870E Carbon | 9950X | 4090 Gaming OC+Alphacool Block | 32GB DDR5-8200 | RM1200x SHIFT | XT45 1080 Nova || Antec C8 (Rhinoceros)

SPECTRE // Z790i Edge | 13900KS | 3090 Ti FTW3 | 48GB DDR5-8200 | RM1000e | EK Nucleus CR360 Direct Die || Prime A21 (Rattlesnake)

HALF-BREED // Precision 7720 | BGA CPU Filth | 32GB DDR4 | Quadro P5000 | 4K Display | Nothing to Write Home About (Turdbook)

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


On 7/1/2022 at 8:24 AM, 1610ftw said:

So this is the issue: Nvidia is effectively holding back companies who want to do more,

With the castrated laptop graphics TDP paired with the Dynamic Boost disaster (your CPU will be crippled if your GPU gets max TDP), they hope you upgrade early. If you get a desktop 4080/4080 Ti you'll ride on it for a long time thanks to a proper TDP. The max TDP including the Dynamic Boost disaster on laptops is a joke! They know how to milk laptop jockeys. Yep, they do this to push you onto next year's awful products.
 

Edit: Regarding power adapters… The biggest you'll get going forward is a slim 280 W adapter. Why offer a thicker Delta 330 W when the new chassis design with Nvidia's big-boy graphics is so damn cute and pretty?

 

On 7/1/2022 at 7:12 PM, win32asmguy said:

I think the gaming DTR niche effectively collapsed itself when we had the Alienware 51m, MSI GT76, and X170SM all competing against each other for the same customers that year. I would not be surprised if Clevo lost many sales to those competitors, and their R&D budget was reduced as a result. I know monopolies are generally frowned upon but in this case it is probably best to have less competition for this type of laptop.

No point making it if all you'll get is exactly the same as in the thin and slim machines. The only upgradable part would be the CPU, if they were so kind as to offer it. DTR is history and most likely will never come back. I would never accept the DTR branding on new laptops.

  • Haha 2

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOTTeam PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


1 hour ago, Papusan said:

No point making it if all you get is exactly the same as in the thin and slim machines. The only upgradable part would be the CPU, if they were so kind as to offer it. DTR is history and most likely will never come back.

As we have seen in most countries throughout the world since COVID-19, the average "citizen" (which can also define consumers) is content for someone else to decide what is best for them and they are glad to be compliant with those decisions because they assume that decision-makers have their best interests at heart. They pretend to, but they do not. They want command and control. Pretty messed up. But, even if they did/do, I won't let someone else decide what is best for me or my family, and if their opinion about things doesn't match mine then too bad for them. I'm going my way, not theirs, rejecting their lead, disregarding and deliberately resisting their influence, and withdrawing any show of support for them or their agenda. That is exactly what I have done with the turdbook manufacturers. They make trash. I don't want trash. I'm not going to accept that from them, and they aren't getting any money from me. I won't be nice to them and won't try to find anything good to say about them. They can rot in hell for all I care, along with the pathetic garbage they are peddling.

  • Thumb Up 1
  • Like 1

WRAITH // Z790 Apex | 14900KS | 4090 Suprim X+Byksi Block | 48GB DDR5-8600 | Toughpower 1650W | MO-RA3 360 | Hailea HC-500A || O11D XL EVO (T-Rex)

BANSHEE // X870E Carbon | 9950X | 4090 Gaming OC+Alphacool Block | 32GB DDR5-8200 | RM1200x SHIFT | XT45 1080 Nova || Antec C8 (Rhinoceros)

SPECTRE // Z790i Edge | 13900KS | 3090 Ti FTW3 | 48GB DDR5-8200 | RM1000e | EK Nucleus CR360 Direct Die || Prime A21 (Rattlesnake)

HALF-BREED // Precision 7720 | BGA CPU Filth | 32GB DDR4 | Quadro P5000 | 4K Display | Nothing to Write Home About (Turdbook)

 Mr. Fox YouTube Channel | Mr. Fox @ HWBOT

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second. 


10 minutes ago, Mr. Fox said:

As we have seen in most countries throughout the world since COVID-19, the average "citizen" is content for someone else to decide what is best for them and they are glad to be compliant with those decisions because they assume that decision-makers have their best interests at heart. Pretty messed up. Even if they did/do, I won't let someone else decide what is best for me or my family, and if their opinion about it doesn't match mine, I'm going my way, not theirs, rejecting their lead, disregarding and deliberately resisting their influence, and withdrawing any show of support for them or their agenda. That is exactly what I have done with the turdbook manufacturers. They make trash. I don't want trash. I'm not going to accept that from them, and they aren't getting any money from me. I won't be nice to them and won't try to find anything good to say about them. They can rot in hell for all I care, along with the pathetic garbage they are peddling.

I will ride long on my Clevo P870, my Clevo W860, and my wife's cheap laptop she uses for work. After that I will buy a sub-$300 laptop for on the go. No need for anything other than this. I won't support the greed and lies coming from the laptop OEMs and Nvidia's graphics solutions for laptops. I will put the money into my monster desktop build instead. Nothing comes close to doing it this way.

  • Like 1

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

                                               Papusan @ HWBOTTeam PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


5 hours ago, 1610ftw said:

I find it deceptive; better to just come up with a different nomenclature altogether.

Another option would be to just use the desktop chips and let manufacturers add power as they see fit but clearly this was ruled out as an option long ago.

 

Glad that this is still possible for Intel CPUs with at least one model in every generation sporting a proper LGA socket - now if only it would not be a seriously gimped design this time around.... 

That was Turing. Just reduce the wattage and clocks and that's what you kinda get.

 

And could you give me an example of a different nomenclature for Nvidia? Going from 1000 to 3000 doesn't work since those are used for generations.


On 7/1/2022 at 7:12 PM, win32asmguy said:

I think the gaming DTR niche effectively collapsed itself when we had the Alienware 51m, MSI GT76, and X170SM all competing against each other for the same customers that year. I would not be surprised if Clevo lost many sales to those competitors, and their R&D budget was reduced as a result. I know monopolies are generally frowned upon but in this case it is probably best to have less competition for this type of laptop.

According to @jaybee83, Clevo was already planning a new X170 heatsink for the larger 3080 Ti heatspreader to go with a 3080 Ti MXM card, but Nvidia didn't want any of that.

You can have a healthy R&D budget but it is not worth much when you cannot get a new GPU.

 

On 7/1/2022 at 10:16 PM, VEGGIM said:

That was Turing. Just reduce the wattage and clocks and that's what you kinda get.

 

And could you give me an example of a different nomenclature for Nvidia? Going from 1000 to 3000 doesn't work since those are used for generations.

I liked Turing - desktop and mobile were a lot closer, as were their TDPs.

 

A different nomenclature could be as simple as dropping a zero and calling it a 308 Ti instead of a 3080 Ti, for example, or a 3080m Ti. Even the added 'm' would show it is NOT a 3080 Ti like in a desktop.

But deception is the name of the game here, so this is unlikely to change, and the gullible t&l crowd will be ecstatic when they can get a 3080 Ti in a 0.8"-thin laptop - never mind that at worst its performance may have trouble surpassing a 3050 desktop card.


8 hours ago, 1610ftw said:

According to @jaybee83, Clevo was planning a new X170 heatsink for the larger 3080 Ti heatspreader, but Nvidia didn't want any of that.

You can have a healthy R&D budget but it is not worth much when you cannot get a new GPU.

 

Why would NVIDIA have any say in the heatsink design by Clevo?  That doesn't make sense to me.

Desktop | Intel i9-12900k | ASUS ROG Strix Z690-F | 2x16GB Oloy DDR5 @ 6400mhz CL32 | EVGA 3080 FTW3 Ultra | AW3821DW| 980 Pro 1TB PCIe 4.0 | All under water |

Server | SM846 | Unraid  6.12.0-rc4.1 | AMD Epyc 7F52 | Supermicro H12SSL-I | Tesla P40 24GB | 256GB 3200MHz ECC 8-channel | 100+TB ZFS |

Backup Server | SM826 | Unraid  6.12.0-rc4.1 | AMD Epyc 7302 | Supermicro H11SSL-I | Tesla P4 8GB | 256GB 2133MHz ECC 8-channel | 100+TB ZFS |

Dell XPS 9510 | Intel  i7-11800H | RTX 3050 Ti | 16GB 3200mhz | 1TB SX8200 | 1080P |

 

 


41 minutes ago, Custom90gt said:

 

Why would NVIDIA have any say in the heatsink design by Clevo?  That doesn't make sense to me.

Yeah, that made little sense, edited for clarity.

 

I'm still not quite sure how the whole MXM production process is/was done, but I assume that as the manufacturer of this kind of card you have to ask Nvidia for chips, and if you do not get them you are essentially out of luck.


3 hours ago, Custom90gt said:

 

Why would NVIDIA have any say in the heatsink design by Clevo?  That doesn't make sense to me.

Haha, let me clarify on this: Clevo had already implemented the capability into the KM heatsink to properly support a 3080 Ti MXM card, but Nvidia blocked them from developing and releasing the MXM card. Unfortunately, without the green goblin's say-so there's nothing you can do as an ODM/OEM.

  • Bump 1

Mine: Hyperion "Titan God of Heat, Heavenly Light, Power" (2022-24)
AMD Ryzen 9 7950X (TG High Perf. IHS) / Asus ROG Crosshair X670E Extreme / MSI Geforce RTX 4090 Suprim X / Teamgroup T-Force Delta RGB DDR5-8200 2x24 GB / Seagate Firecuda 530 4 TB / 5x Samsung 860 Evo 4 TB / Arctic Liquid Freezer II 420 (Push/Pull 6x Noctua NF-A14 IndustrialPPC-3000 intake) / Seasonic TX-1600 W Titanium / Phanteks Enthoo Pro 2 TG (3x Arctic P12 A-RGB intake / 4x Arctic P14 A-RGB exhaust / 1x Arctic P14 A-RGB RAM cooling) / Samsung Odyssey Neo G8 32" 4K 240 Hz / Ducky One 3 Daybreak Fullsize Cherry MX Brown / Corsair M65 Ultra RGB / PDP Afterglow Wave Black / Beyerdynamic DT 770 Pro X Limited Edition

 

My Lady's: Clevo NH55JNNQ "Alfred" (2022-24)
Sharp LQ156M1JW03 FHD matte 15.6" IGZO 8 bit @248 Hz / Intel Core i5 12600 / Nvidia Geforce RTX 3070 Ti / Mushkin Redline DDR4-3200 2x32 GB / Samsung 970 Pro 1 TB / Samsung 870 QVO 8 TB / Intel AX201 WIFI 6+BT 5.2 / Win 11 Pro Phoenix Lite OS / 230 W PSU powered by Prema Mod!


3 hours ago, jaybee83 said:

Haha, let me clarify on this: Clevo had already implemented the capability into the KM heatsink to properly support a 3080 Ti MXM card, but Nvidia blocked them from developing and releasing the MXM card. Unfortunately, without the green goblin's say-so there's nothing you can do as an ODM/OEM.

I'm actually wondering why Clevo never goes to AMD for GPUs. It seems like they avoid AMD like the plague.

