NotebookTalk

the decline of the once mighty Nvidia laptop GPUs


1610ftw

As the GPU discussion has mostly taken over the HX processor thread, I would like to open this thread and start with a post that shows the constantly increasing performance differential between laptop and desktop GPUs, one that, depending on one's point of view, may reach new sad and/or comical heights with the upcoming 40xx laptop GPUs.

 

For the first time since the 980M, there will be a top-end laptop GPU that in my opinion has no real right to carry the name of its desktop counterpart: it will probably not even reach half of its namesake's performance, nor does it even use the same basic chip.

 

To trace the widening gap between laptop and desktop cards, I checked the Time Spy leaderboard scores at 3DMark.com for the top laptop GPUs and their desktop counterparts over five generations, starting with the best laptop card of them all, the GTX 1080. The trend has been pretty obvious: after three generations in which the desktop cards' advantage over their laptop counterparts remained relatively moderate, the differences increased a lot with the 30xx cards. But all of this is nothing compared to the upcoming laptop 4090, which will probably not even reach half the performance of the desktop version.

 

To keep the comparison fair and limited to attainable performance, and to exclude more elaborate or extreme setups, I have used the number 100 ranked scores as of today. As we can see, there is no real similarity in performance any more, just in name, for the 3080 cards and the upcoming 4090, if the rumored performance is not completely off. Extrapolating from previous generations and leaked scores, I am going with ca. 19,000 for now, as even 20,000 looks unlikely due to a severely limited TGP and the use of a chip that is weaker than the one in the desktop 4080. Compare that to the 1080, 2080 and 2080 Super, where the differential was much smaller; it even surprised me, as I had thought it would be higher:

 

[image: Time Spy score comparison table, top laptop GPUs vs. their desktop counterparts]
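The differentials discussed in this thread are just the percentage difference between two Time Spy scores. As a quick sketch of how the table's numbers are derived (the score values below are illustrative placeholders, not the actual leaderboard numbers):

```python
def differential(desktop_score: float, laptop_score: float) -> float:
    """Return how much faster the desktop card is, in percent."""
    return (desktop_score / laptop_score - 1) * 100

# Placeholder example: ~19000 is the estimate for the laptop 4090 used
# above; 36000 is a made-up desktop 4090 score, purely for illustration.
print(f"{differential(36000, 19000):.0f}%")  # prints "89%"
```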

 

So what has happened lately? After leaving the path of very close performance parity between desktop and laptop GPUs, it looks like Nvidia stopped even trying to keep up with its desktop designs when it came to laptops. With the 3080 Ti the differential first went over 50%, and for the first time they used the name of a top-tier desktop card, which they had never done before, without really making sure they also had the hardware to back up that big new name.

 

Fast forward to today, and obviously Nvidia could not resist the temptation to introduce the name of its biggest, most power-hungry top-end desktop card into the mobile lineup without taking any further steps to beef up the hardware accordingly. What we will get is apparently a severely TGP-limited design that is based on the desktop 4080 chip but pared down in almost all aspects of performance, so much so that even the desktop 4080 will be more than 50% if not 60% faster than this laptop 4090.

 

It will be interesting to revisit this once the new cards have launched. I will also add comparisons for the second-tier cards used in laptops, as they are IMO the sweet spot of elevated performance and still-decent pricing in the 30x0 generation, and it will be interesting to see if that changes with the upcoming 40x0 launch.

 

 

 


I kind of see this in the opposite direction.  It's not the decline of laptop GPUs so much as it is desktop GPUs finally growing up to take advantage of the desktop form factor.

 

If desktops are an order of magnitude larger than laptops (talking about physical space / volume) then they should be able to dissipate an order of magnitude more heat.  A decade ago, desktop CPUs and GPUs were not using more power than you could reasonably dissipate from a laptop chassis.  Now, they are.  NVIDIA is now building desktop GPUs that consume more than 400W and there's not really a way that you could dissipate that amount of heat from a laptop chassis (plus heat from the CPU as well) using current designs and materials.

 

So yes, you're right, the difference between desktop and laptop GPU performance will only continue to widen as NVIDIA continues to crank up GPU power limits.  It's more a matter of physics than it is NVIDIA failing in the laptop space.

 

Not to give NVIDIA a pass...

One could make the argument that putting a GA103 or AD103 GPU chip into a laptop is stupid.  Here, I am assuming that recent rumors about an upcoming "GeForce 4090" laptop GPU with an AD103 core and 175W TGP are true, but NVIDIA is already selling "GeForce 3080 Ti" laptop GPUs with the GA103 core (...I have one right here).  The power limit is going to be so low that the performance benefit to using one of those chips over GA104/AD104 at the same power level is going to be in the 2-5% range (as you can see by looking at the 3080 vs 3080 Ti performance numbers above), yet NVIDIA will charge hundreds of dollars more for the higher-end GPU.

 

And of course, NVIDIA's propensity to name desktop and laptop GPUs the same is definitely misleading.  Less aware consumers will think they're getting desktop 4090 performance out of their laptop 4090 GPU and ... obviously it won't even be close.  I preferred it back when they just stuck an "M" on the end of all of their laptop GPUs to make it clear that they were different.  But NVIDIA likes it this way because it makes their GPUs appear to be more competitive against the desktop variants and thus easier to sell, I presume.

 

A more high-bandwidth eGPU connection option could help laptop users who want access to desktop GPU levels of performance, I guess...?


Dell Precision 7770 (personal) • Dell Precision 7560 (work) • Full specs in spoiler block below
Info posts (Dell) — Dell Precision key posts • Dell driver RSS feeds • Dell Fan Management — override fan behavior
Info posts (Windows) — Turbo boost toggle • The problem with Windows 11 • About Windows 10 LTSC

Spoiler

Dell Precision 7770 (personal)

  • Intel Core i9-12950HX ("Alder Lake"), 8P+8E
    • 8× P cores ("Golden Cove"): 2.3 GHz base, 5.0 GHz turbo, hyperthreading
    • 8× E cores ("Gracemont"): 1.7 GHz base, 3.6 GHz turbo
  • 128GB DDR5-3600 (CAMM)
  • NVIDIA GeForce RTX 3080 Ti 16GB (DGFF)
  • Storage:
    • 2TB system drive: Samsung 980 Pro, PCIe4
    • 24TB additional storage: 3× Sabrent Rocket 4 Plus 8TB, PCIe4 (Storage Spaces)
  • Windows 10 (Enterprise LTSC 2021)
  • 17.3" 3840×2160 display
  • Intel Wi-Fi AX211 (Wi-Fi 6E + Bluetooth)
  • 93Wh battery
  • IR webcam
  • Fingerprint reader

 

Dell Precision 7560 (work)

  • Intel Xeon W-11955M ("Tiger Lake")
    • 8×2.6 GHz base, 5.0 GHz turbo, hyperthreading ("Willow Cove")
  • 64GB DDR4-3200 ECC
  • NVIDIA RTX A2000 4GB
  • Storage:
    • 512GB system drive (Micron 2300)
    • 4TB additional storage (Sabrent Rocket Q4)
  • Windows 10 (Enterprise LTSC 2021)
  • 15.6" 3840×2160 display
  • Intel Wi-Fi AX210 (Wi-Fi 6E + Bluetooth)
  • 95Wh battery
  • IR webcam
  • Fingerprint reader

 

Previous

  • Dell Precision 7530, 7510, M4800, M6700
  • Dell Latitude E6520
  • Dell Inspiron 1720, 5150
  • Dell Latitude CPi

What happened was ever-higher board TDPs on desktop GPUs and ever more castrated mobile GPU TDPs 😄 

Mine: Hyperion "Titan God of Heat, Heavenly Light, Power" (2022)
AMD Ryzen 9 7950X / Asus ROG Crosshair X670E Extreme / MSI Geforce RTX 4090 Suprim X / G.Skill Trident Z5 RGB DDR5-6600 2x16 GB / Seagate Firecuda 530 4 TB / 2x Samsung 860 Evo 4 TB / Arctic Liquid Freezer II 420 / Seasonic TX-1600 W Titanium / Phanteks Enthoo Pro 2 TG / Samsung Odyssey Neo G8 32" UHD 240 Hz / Ducky One 3 Daybreak Fullsize Cherry MX Brown / Corsair M65 Ultra RGB

 

My Lady's: Clevo NH55JNNQ "Alfred" (2022)
Sharp LQ156M1JW03 FHD matte 15.6" IGZO 8 bit @248 Hz / Intel 12600 @ 4.4 - 4.8 Ghz / Nvidia 3070 Ti 8 GB GDDR6 / G.Skill 16 GB DDR4-3800 / Samsung 970 Pro 1 TB / Intel AX201 ax+BT / Win 11 Pro Phoenix Lite OS / 230 W PSU powered by Prema Mod!


^This 

 

We went from a 250W TDP desktop flagship to 450W now, and possibly 600W in the future.

 

On the laptop side we went from 100W TDP to 200W, then 190W, and now 175W (with the BS turbo TDP accounted for).

 

Basically laptop went down and desktop went way up.


Alienware Area-51M : Intel Core i9-9900K @ 5.3Ghz    | nVidia GeForce RTX 2080    | AX210 | Samsung 970 Evo+ 
Alienware X14 R1 :       Intel Core i7 12700H 12th-Gen   | nVidia GeForce RTX 3060    | AX411 | SK Hynix 2TB PCIe4 
Alienware M18x R2 :    Intel Core i7 3920XM @ 4.7Ghz | nVidia Quadro RTX 3000     | AX210 | Samsung 980 PRO   
Alienware 18 R1 :          Intel Core i7 4930MX @ 4.0Ghz | nVidia Quadro RTX 3000     | AX210 | SK Hynix 1TB NVMe 
Clevo X170SM-G:         Intel Core i7 10700K @ Stock     | nVidia GeForce RTX 2070S | AX210 | 256GB+2x512GB 


CS Studios YouTube: https://www.youtube.com/c/CSStudiosYT 


SLI laptop designs like the P870 could cool up to 400W of peak GPU power with a total die size very similar to an RTX 4080 (more than 600 mm²). Even a 270W version of the 4090 chip in a laptop (60% of the desktop power target) should yield excellent results, easily at least 65% better than what Nvidia plans to achieve with the mobile 4090, and probably still at least 50% better with a 225W version of the desktop 4090 die.


Looks like at least that max power may be a bit higher this time:

 

https://www.notebookcheck.net/Nvidia-RTX-4090-and-RTX-4080-Laptop-GPUs-rated-for-2-GHz-boost-and-up-to-200-W-TGP-including-Dynamic-Boost-RTX-4060-and-RTX-4050-up-to-95-W-or-165-W-each.676258.0.html

 

[image: Notebookcheck table of RTX 40-series laptop GPU boost clocks and TGPs]

 

It still puzzles me why Nvidia does not see the need to increase the max power going to the 4090. But then, by all accounts they are only using a slower version of the desktop 4080 chip with a narrower bus, lower clock speeds and probably fewer CUDA cores, so more power would not have the same effect as giving more power to a desktop 4080.


50 minutes ago, 1610ftw said:

Looks like at least that max power may be a bit higher this time: [...] It still puzzles me why Nvidia does not see the necessity to increase max power going to the 4090 [...]

 

The only thing Nvidia sees as a necessity is making more money. So they ask the manufacturers: what chips can we provide to make more money? The answer is:

Please provide chips that are as efficient as possible, and obviously keep the power low. No more than 170W-ish, otherwise laptops tend to catch fire (see Alienware Area-51M), and we do have the slim-and-light agenda to pursue here. Thanks! 


Probably some truth to that, certainly the money-making part 😄

However, between Nvidia and the card manufacturers, it is mostly Nvidia telling them what to do.

So it is probably a bit of both: manufacturers and Nvidia are happy with relatively safe mediocrity that does not cause a lot of issues.


I have to agree: the desktop is bigger, so it has more die space etc. to be more powerful. The 1060, 2060 and 3060 are the odd ones out; my 3060 is equal to a desktop 3060, and it's the 115W variant. I think as long as the performance goes up by a reasonable amount for both desktop and laptop, who cares; it's not a secret.


On 12/24/2022 at 4:32 AM, Bullit said:

Well the "declined" Nvidia laptop GPUs are much more capable than the once "mighty" Nvidia laptop GPUs

 

A not even 25% performance increase of the 3080 Ti over the 2080, across THREE generations, is now defined as much more capable?

While at the same time desktop performance has increased by more than 75%, more than three times those gains?

 

Looks to me as if Nvidia did as little as possible to avoid the impression of a complete standstill, while still charging the same high prices as back in the days when their laptop GPUs were actually quite competitive.

 

Obviously, the number of people who see it as somehow natural that Nvidia routinely uses very low power limits and inferior dies belonging to smaller desktop cards is quite high. The strategy of doing as little as possible therefore seems to work out very well, and I will give them that, but with today's technology we could easily see laptop GPUs with at least 50% higher performance on a much lower total power budget than the SLI laptops of the past.


I did get a noticeable improvement out of my mobile 3070 going from 125W to 180W, but with that modding done it's more of a desktop 3060 Ti.


Sager NP9877

Hybrid watercooled setup, D5 pump with dual 140mm fans integrated into a custom stand, all controlled via temperature outputs from GPU and CPU.

17.3" 1440p 120Hz display

9900KS @ 4.7Ghz All core, delid with rockit IHS, liquid metal

RTX 3070 @ 180W shunt mod, liquid metal

3000Mhz DDR4 CL14

1TB SM961 SSD

Dual 330W PSU 


On 12/25/2022 at 10:09 PM, Bullit said:

 

Blender has seen an above-average intergenerational improvement, as it seems to profit from Ampere disproportionately:

 

 

By the way, performance of desktop vs. mobile:

 

3080 Ti mobile: 3840

3080 Ti desktop: 5958

 

A 55% increase of desktop over laptop, basically the same as in Time Spy.
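For anyone who wants to double-check, the 55% figure falls straight out of the two scores above:

```python
# Blender Open Data scores quoted above for the 3080 Ti.
mobile_score = 3840
desktop_score = 5958

gap_pct = (desktop_score / mobile_score - 1) * 100
print(f"desktop is {gap_pct:.1f}% faster")  # prints "desktop is 55.2% faster"
```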

 


Ray tracing, and tensor cores for denoising with OptiX. I have the 130W 3060 laptop, and in some scenes it already starts to be a pleasure to work. It is at the point where your work starts to feel fluid, while in the past a render meant at best a coffee break.

 


1 hour ago, Bullit said:

Raytracing, tensor for denoising with Optix. I have the 130w 3060 Laptop [...]

 

 

The 3060 is very power efficient; 130W is plenty of power for it. Yet even on this entry-level card, Nvidia nerfed the laptop variant: it only has 6GB of VRAM, whereas the desktop cards have 12GB. Of course, once we move up the food chain, the laptop/desktop gap increases further.


I have the 115W 3060 variant, and yes, it only has 6GB of VRAM, but most games don't need more. Maybe some games need 8GB or more at 4K ultra, but if you're gaming at 4K ultra I'm betting the game is old and won't require much VRAM. I know Watch Dogs: Legion has a hard time with the laptop 3060, but I don't play that game. Games should be built around common hardware, and the most common GPU is the 4GB 1650, so with that said the 6GB 3060 is still pretty good.

 

Also, Bullit, I was wondering what you get in Time Spy? I wonder how big the gap is from 115W to 130W. I have no idea if my 9200 score is the highest for the 115W variant or not... no way to tell.


6GB is a big issue for me in Unreal Engine, and it might also be in Blender; currently I am just making small projects there.

If Time Spy is something I need to install, no chance.

You can do the Blender Benchmark instead, no need to install anything: https://opendata.blender.org (choose GPU, OptiX and v3.3). My value is 2615; with HWiNFO I see it between 110 and 125W while running the benchmark.


47 minutes ago, Reciever said:

I don't think I have had anything less than 8GB since 2018/2019.

 

Yep, since 2015 in my case. I wouldn't go below 12GB with a new GPU nowadays; rather, I'd aim for 16GB for some future-proofing.



Nvidia GeForce RTX 4060 laptop graphics will take a big leap in performance over its predecessor: a whopping 20% performance boost over RTX 3060 laptop graphics. What's the point of this upgrade if it's true? Bro @Mr. Fox, don't you have the 3060 in your Jokebook? If so, will you jump on an Ada gaming book with the 4060 refresh? 

 

 

Nvidia GeForce RTX 4060 laptop graphics card's 3DMark Time Spy benchmark score leaks online
https://www.notebookcheck.net/Nvidia-GeForce-RTX-4060-laptop-graphics-card-s-3DMark-Time-Spy-benchmark-score-leaks-online.677472.0.html

 

 


"The Killer"  ASUS Maximus Z690 Apex | 13900K | Zotac graphics | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt  |  Custom Loop

 Papusan @ HWBOTTeam PremaMod @ HWBOT


27 minutes ago, Papusan said:

Nvidia GeForce RTX 4060 laptop graphics will take a big leap in performance over its predecessor. [...] Bro @Mr. Fox don't you have the 3060 in your Jokebook? If so, will you jump on a ADA gaming book with the 4060 refresh? 

 

 

No. Half-Breed is a 16GB Quadro. Wheezer was a 2060.

 

 

 


Wraith // EVGA Z690 Dark K|NGP|N | 13900K | EVGA 3090 K|NGP|N | 32GB DDR5 | EVGA 1600 P2 | HC-500A Chiller | MO-RA3 360

Banshee // ASUS Strix Z690-E | 13900KF | EVGA 3060 Ti FTW3 | 32GB DDR5 | EVGA 850 B5 | XT45 1080 Nova Custom Loop

Half-Breed // Precision 17 7720 | 7920HQ (BGA filth) | Quadro P5000 16GB (MXM) | 32GB DDR4 || Grade A Off-Lease Refurb

Mr. Fox YouTube Channel | Mr. Fox @ HWBOT | Team PremaMod @ HWBOT 

The average response time for a 911 call is 10 minutes. The response time of a .357 is 1400 feet per second.


1 hour ago, Papusan said:

Nvidia GeForce RTX 4060 laptop graphics will take a big leap in performance over its predecessor. A wopping 20% performance boost over RTX 3060 laptop graphics. [...]

 

 

 

Wow...

 

The only positive about it is that it uses 8GB of VRAM instead of 6GB like the RTX 3060/2060.

 

According to Notebookcheck, the RTX 4060 has only a 96-bit memory bus; I hope that's a typo, because it clearly got castrated...

 

https://www.notebookcheck.net/NVIDIA-GeForce-RTX-4060-Laptop-GPU-Benchmarks-and-Specs.675692.0.html

 

 


Current Laptop:

Lenovo Legion 5: AMD Ryzen 7 4800H 2.8Ghz (Boost: 4.2Ghz), 6GB Nvidia Geforce GTX 1660Ti GDDR6 Memory, 15.6" FHD (1920 x 1080) 144Hz IPS display, 16GB 3200MHz DDR4 memory, 512GB M.2 NVMe PCIe SSD, 1 TB Teamgroup MP34 M.2 NVMe PCIe SSD, Windows 10 Home 21H2

