NotebookTalk

RTX 4000 mobile series officially released.


VEGGIM

Recommended Posts

5 minutes ago, VEGGIM said:

https://twitter.com/JarrodsTech/status/1623822414814777345?s=20

<snip>


Well, this looks interesting. It looks like some models have a configurable cTGP.

I believe some brands offered control over GPU TDP for 30 series laptops too; I'm mainly thinking of the XMG Neo 15 with the Oasis cooling system. Anyway, it is currently possible to change the power limit on 20 and 30 series mobile GPUs in MSI Afterburner/EVGA Precision with recent NVIDIA drivers (including the latest 528.49 drivers as of now); hopefully this remains the case for 40 series GPUs as well.
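For anyone who would rather script this than click through Afterburner, here is a minimal sketch using NVML via the nvidia-ml-py / pynvml package (the same interface those tools sit on top of). Whether a given mobile GPU actually accepts a new limit is entirely up to the vBIOS and driver, so treat it as illustrative:

```python
# Minimal sketch: query (and, where the vBIOS allows, change) the GPU power
# limit through NVML. Setting a limit needs admin rights, and locked-down
# mobile GPUs may simply refuse.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Current limit and the vBIOS-allowed range, all reported in milliwatts
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"current limit: {current_mw / 1000:.0f} W "
      f"(allowed: {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

try:
    # Ask for the vBIOS maximum; many mobile parts will reject this.
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, max_mw)
    print(f"limit raised to {max_mw / 1000:.0f} W")
except pynvml.NVMLError as err:
    print(f"not permitted on this GPU/driver: {err}")

pynvml.nvmlShutdown()
```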

 

 


4 hours ago, Papusan said:

Try this one below🙂 And Nvidia/Jensen said their 3080@10GB (desktop card) was their new 4K flagship 2 years ago. Damn futureproofing for 4K (RT enabled - 15.0 FPS), LOOL

 

FPS = frames per second😁

 

 

Very enjoyable. And only god Nvidia knows how the FPS will fade away in newer games 2 years down the road. 

[Chart: game performance at 3840x2160, from the linked review]

 

And where should we put 4090 Mobile@85-110W versions?


Hogwarts Legacy Benchmark Test & Performance Analysis Review - VRAM Usage Record

Hogwarts Legacy lets you relive the steps of Harry Potter and is a fantastic adaptation of the books. In our performance review, we're taking a closer look at image quality, differences between the graphical setting presets, VRAM usage, and performance on a selection of modern graphics cards.

 

An atmospheric VRAM eater with upsampling aplenty

ComputerBase takes a look at performance, image quality, ray tracing and upsampling in Hogwarts Legacy for the PC including benchmarks.

 

10 GB is no longer enough for a sufficient frame rate in Ultra HD with DLSS, even without ray tracing. If the graphics card only has 8 GB, there are already clear problems in WQHD with DLSS, even without ray tracing, and even in Full HD the occasional texture fails to load. Ultimately, 8 GB and maximum textures do not work together.

 

Some of these new games must be directly financed by the hardware manufacturers.

 

"I need to get myself a 5090Ti so that I can finally get 60 fps in Hogwarts Legacy with raytracing"

 

Pathetic.

 


8 hours ago, IamTechknow said:

I believe some brands offered control over GPU TDP for 30 series laptops too; I'm mainly thinking of the XMG Neo 15 with the Oasis cooling system. Anyway, it is currently possible to change the power limit on 20 and 30 series mobile GPUs in MSI Afterburner/EVGA Precision with recent NVIDIA drivers (including the latest 528.49 drivers as of now); hopefully this remains the case for 40 series GPUs as well.

 

 

Is this for all NVIDIA mobile GPUs, or only some?


11 hours ago, ssj92 said:

What's interesting is, the guy said his CPU peaked at 88°C but ran in the high 70s for most of the run. I really am wondering if something like AWCC, or some other software he has on the computer, is messing things up. I've seen these CPUs get close to desktop chips on PassMark, so they have the potential.

Hopefully someone here gets one. I'm waiting for the 4090, so I won't have one anytime soon.

BTW @win32asmguy, on the m18 BIOS you can apparently turn on dGPU-only mode, so in theory the TB4 ports should output from the dGPU in that mode; I'll give it a test whenever I end up with mine. Also, as far as I can tell the 4080/4090 will get the full vapor-chamber cooler; not sure about the lower-tier GPUs.

 

That makes sense for CPU thermals; it should only peak in the CPU test.

 

I happened to see an "open box excellent" m16 on Best Buy's site, so I have one on the way to test. A Dell sales rep mentioned the fully customizable m18 should be available soon.

Desktop - 12900KS, 32GB DDR5-6400 C32, 2TB WD SN850, Windows 10 Pro 22H2

Clevo X170SM - 10900K LTX SP106, 32GB DDR4-2933 CL17, 4TB WD SN850X, RTX 3080 mobile, 17.3 inch FHD 144hz, System76 open source firmware, Windows 10 Pro 22H2


50 minutes ago, win32asmguy said:

 

That makes sense for CPU thermals; it should only peak in the CPU test.

 

I happened to see an "open box excellent" m16 on Best Buy's site, so I have one on the way to test. A Dell sales rep mentioned the fully customizable m18 should be available soon.

Wow, I wish I had even a new one available at my local Best Buy, let alone an open-box one.

 

I heard it will be a few weeks until the 4090 config comes out, but we'll see.


Alienware m18             : Intel Core i9 13900HX @ 5.0Ghz | nVidia GeForce RTX 4090    | K1675 | 2x1TB SSDs 

Alienware Area-51M : Intel Core i9-9900K @ 5.3Ghz    | nVidia GeForce RTX 2080    | AX210 | Samsung 970 Evo+ 
Alienware M18x R2 :    Intel Core i7 3920XM @ 4.7Ghz | nVidia Quadro RTX 5000     | AX210 | Samsung 980 PRO   
Alienware 18 :              Intel Core i7 4930MX @ 4.5Ghz  | nVidia Quadro RTX 3000  | AX210 | Samsung 980 NVMe  

More Laps: M14x (555m) | M14xR2 (650m) | M15x (980m) | M17xR3 (880m) | M18xR1 (RTX 5000) 

BEAST Server:          Intel Xeon W7-3465X 28 P-Cores | nVidia Titan V | 128GB RDIMM | Intel Optane P5800X


CS Studios YouTube: https://www.youtube.com/c/CSStudiosYT 


1 hour ago, ssj92 said:

Wow, I wish I had even a new one available at my local Best Buy, let alone an open-box one.

 

I heard it will be a few weeks until the 4090 config comes out, but we'll see.

 

Yeah, that one was shipping-only from their website. Dell.com also lists the same m16 config for quick shipment.

 

It's a shame Dell does not offer a 16-inch FHD+ 165Hz panel option. You would think that, because they offer it on the G16, it would be as simple as making the custom eDP cable for the m16 as well. At least they do have the FHD+ panel coming for the m18, with wide-gamut support as well.




5 minutes ago, Papusan said:

<snip>

This isn't new. The 3080 laptop was not as fast as the desktop card, and neither was the 2080, so why all the concern?

 

source

 

https://www.notebookcheck.net/GeForce-RTX-2080-Laptop-vs-GeForce-RTX-2080-Desktop_9541_9286.247598.0.html

ZEUS-COMING SOON

            Omen 16 2021

            Zenbook 14 oled

            Vivobook 15x oled

 


21 minutes ago, ryan said:

This isn't new. The 3080 laptop was not as fast as the desktop card, and neither was the 2080, so why all the concern?

 

source

 

https://www.notebookcheck.net/GeForce-RTX-2080-Laptop-vs-GeForce-RTX-2080-Desktop_9541_9286.247598.0.html

Actually, if you had the Alienware Area-51m R1/R2, you had a 200W 2080 that matches/beats the desktop 2080:

 

https://www.3dmark.com/fs/19316903

 

The Clevo X170SM-G/TM-G also had the 2080/2080S with a 150W vBIOS that was neck and neck with the desktop 2080. You could also flash the 200W AW vBIOS to get the full performance.

 

The 3000 series was when no laptop was even close.


10 hours ago, ryan said:

This isn't new. The 3080 laptop was not as fast as the desktop card, and neither was the 2080, so why all the concern?

<snip>

The 980 through 2080 Super were all pretty close to the desktop cards compared to the last two generations.

 

Looking at Time Spy for comparison, the 1080, 2080, and 2080 Super desktop cards were only about 9 to 12% better; only with the 980 was there a roughly 28% difference.

 

Compare that to about 60% today for the 4080, and I predict a whopping 80%+ for the 4090, for which we do not yet have many notebook results. At the moment the 100th-ranked desktop 4090 is more than 90% better than the 4090 notebook ranked in 25th place.

 

It would have made a lot more sense to just call the 4090 notebook a 4080. Even then the desktop 4080 would still be more than 35% better, which already creates enough confusion and/or disappointment.
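For what it's worth, the percentage math here is just ratios of graphics scores. A quick sketch with made-up, round Time Spy numbers (not real submissions) shows how figures like those above fall out:

```python
# Made-up, round Time Spy graphics scores purely to illustrate the
# percentage math in the post above -- these are NOT real submissions.
def desktop_lead(desktop: int, laptop: int) -> float:
    """How much faster the desktop card is, in percent."""
    return (desktop / laptop - 1) * 100

examples = {
    "980 era":      (11_000, 8_600),   # ~28% lead
    "2080 era":     (11_500, 10_400),  # ~11% lead
    "4080 (today)": (28_000, 17_500),  # ~60% lead
    "4090 (today)": (36_000, 19_000),  # ~89% lead
}
for name, (desk, lap) in examples.items():
    print(f"{name}: desktop ahead by {desktop_lead(desk, lap):.0f}%")
```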


22 hours ago, Papusan said:

Try this one below🙂 And Nvidia/Jensen said their 3080@10GB (desktop card) was their new 4K flagship 2 years ago.

<snip>

 

18 hours ago, 1610ftw said:

 

Some of these new games must be directly financed by the hardware manufacturers.

 

"I need to get myself a 5090Ti so that I can finally get 60 fps in Hogwarts Legacy with raytracing"

 

Pathetic.

 

Yep, isn't it nice? Nvidia's gaming flagship (the 3090 was for creators and 8K, LOOL) obsolete after barely 2 years of use. And I'm not even talking about the more crippled 3080 Mobile. This is the real xx80 desktop card singing its last song. Maybe we can call it the death march 🙂

Hogwarts Legacy, GPU Benchmark: Obsoleting The RTX 3080 10GB

 

"The Killer"  ASUS ROG Z790 Apex Encore | 14900KS | 4090 HOF + 20 other graphics cards | 32GB DDR5 | Be Quiet! Dark Power Pro 12 - 1500 Watt | Second PSU - Cooler Master V750 SFX Gold 750W (For total of 2250W Power) | Corsair Obsidian 1000D | Custom Cooling | Asus ROG Strix XG27AQ 27" Monitors |

 

Papusan @ HWBOT | Team PremaMod @ HWBOT | Papusan @ YouTube Channel

                             

 


That's always going to be true, though; they're not going to release anything that doesn't obsolete the previous stuff, which makes sense. We've all had this conversation a few times: just call the mobiles xxxxM, or give them a totally different part number that is still tied to generation and capability, like 300M, 320M, etc. The VRAM is particularly egregious, since it's needed for higher-resolution gaming; not having 16GB+ on desktops and 8GB+ on mobiles is just greed. x50Ms should have at least 6GB.

 

Ngreedia probably has the next two generations at the top level mapped out and mostly done, at least theoretically and in manufacturing capability, I'd think; then they manage it downwards to be able to play at 4K/60 with DLSS 2/3. DLSS 3 is just there to pad the FPS artificially, and as I've said elsewhere, it runs so fast (100 FPS+) that the artifacting and bad frames can't be noticed with normal human perception. Something is wrong with that to me. Tricks are fine, but this is different: it's right in front of you, yet you can't notice it easily. I guess dial it up to 1000 FPS, put in subliminals to buy this or that; heck, put in ads for all sorts of stuff, at a price of course. Stick some racy stuff in there, lol.
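To illustrate the frame-generation point with rough, hypothetical numbers (interpolation roughly doubles displayed frames while responsiveness still tracks the rendered rate):

```python
# Hypothetical numbers showing why generated frames pad FPS without
# improving responsiveness: latency follows the *rendered* rate.
def displayed_fps(rendered_fps: float, frame_gen: bool) -> float:
    # DLSS 3 inserts roughly one generated frame per rendered frame
    return rendered_fps * 2 if frame_gen else rendered_fps

def frame_latency_ms(rendered_fps: float) -> float:
    # Responsiveness is governed by real frames (frame generation even
    # adds a little queueing delay on top, ignored here)
    return 1000.0 / rendered_fps

for fps in (40, 60):
    print(f"rendered {fps} fps -> displayed ~{displayed_fps(fps, True):.0f} fps, "
          f"still ~{frame_latency_ms(fps):.0f} ms per real frame")
```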


This is why I always bought the Titan series GPUs. 

 

Maxwell Titan X: 12GB VRAM

Pascal Titan X: 12GB

Titan V: 12GB (I have this one right now)

Titan RTX: 24GB

 

I bet if an RTX Titan Ada comes out, it'll be 48GB.

 

I knew the 3080 was a bad choice for 4K when I saw 10GB of VRAM. That's why I went with the 3090 instead.

 

It's normally better to buy a GPU with more VRAM (especially for 4K).

AMD did it right last gen with 16GB on its top three cards.
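A back-of-envelope sketch of why 8-10 GB gets tight at 4K with maximum textures; every number here is a rough assumption rather than a measurement:

```python
# Every figure here is a rough assumption, not a measurement -- just to show
# why 8-10 GB gets tight at 4K with maximum textures.
GB = 1024 ** 3

def render_targets_gb(width: int, height: int,
                      buffers: int = 8, bytes_per_pixel: int = 8) -> float:
    # G-buffer + HDR + post-process targets: assume 8 targets at 8 bytes/pixel
    return width * height * buffers * bytes_per_pixel / GB

assumed_texture_pool_gb = 7.0   # streamed "ultra" texture pool (assumption)
assumed_geometry_misc_gb = 1.5  # meshes, RT BVH, driver overhead (assumption)

total = (render_targets_gb(3840, 2160)
         + assumed_texture_pool_gb + assumed_geometry_misc_gb)
print(f"rough 4K ultra budget: ~{total:.1f} GB")  # ~9.0 GB before the OS takes its share
```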


The TechSpot editor was spot on... @Mr. Fox @Ashtrix

 

We think Nvidia could simply admit that it's not possible to put their flagship desktop GPU into a laptop, and give the laptop GPU a different name. This should be an RTX 4090M – or even better, the RTX 4080M – because then we would be matching the desktop model in GPU die and have the M suffix to make it clear it's clocked lower and power limited relative to the desktop card.

 

You can see the scam in the details...


 

The reason why we believe Nvidia doesn't do that and is giving the desktop and laptop GPUs the same name, is that it allows laptop vendors to overprice their gaming laptops and put them on a similar pricing tier to a much more powerful desktop.

 

Nvidia GeForce RTX 4090 Desktop vs. Laptop GPU (techspot.com)


3 hours ago, Papusan said:

The TechSpot editor was spot on... @Mr. Fox @Ashtrix

<snip>

 

Well, there are viable technical solutions if they wanted to step out of the box a little and pursue them. A true 8-lane eGPU in a large enough enclosure would be one approach. If they integrated the GPU into the eGPU device, they could maintain the lovely "no upgrades mo********s!" paradigm (I think Asus has tried this).
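Rough numbers behind the "true 8-lane eGPU" idea: nominal link rates for today's TB3/TB4-style PCIe 3.0 x4 tunnel versus a native PCIe 4.0 x8 link (real-world throughput is lower after overhead):

```python
# Nominal PCIe link rates; real throughput is lower after protocol overhead.
PCIE_GT_PER_LANE = {"3.0": 8, "4.0": 16}  # GT/s per lane, 128b/130b encoded

def link_gbps(gen: str, lanes: int) -> float:
    # usable bits ~= raw transfer rate * 128/130 (encoding efficiency)
    return PCIE_GT_PER_LANE[gen] * lanes * 128 / 130

print(f"TB3/TB4-style eGPU (PCIe 3.0 x4): ~{link_gbps('3.0', 4):.0f} Gbit/s")
print(f"native PCIe 4.0 x8:               ~{link_gbps('4.0', 8):.0f} Gbit/s")
# Thunderbolt controllers also cap PCIe tunnelling (TB3 reportedly nearer
# 22 Gbit/s in practice), so the measured gap tends to be even larger.
```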


"We're rushing towards a cliff, but the closer we get, the more scenic the views are."

-- Max Tegmark

 

AI: Major Emerging Existential Threat To Humanity


6 hours ago, Etern4l said:

 

Well, there are viable technical solutions if they wanted to step out of the box a little and pursue them. A true 8-lane eGPU in a large enough enclosure would be one approach. If they integrated the GPU into the eGPU device, they could maintain the lovely "no upgrades mo********s!" paradigm (I think Asus has tried this).

 

That sounds like an excellent external solution. TB5 may have the potential to give much better results, but I have a gut feeling that somebody somewhere will mess it up again.

 

As for the cards that we find in laptops, I would say: give manufacturers the freedom to use desktop chips and see what they come up with; we might be surprised how good these can be.

 

I am completely OK with Nvidia doing some testing of such a machine to make sure it does not go up in flames with one of the larger chips, which would hopefully come with a TGP well in excess of 200W. The important point here is that Nvidia stops setting arbitrary and rather low limits on what laptop cards, or rather chips, can do or be. It is in the interest of everyone who spends big money on laptops to get top-of-the-line models that aren't artificially gimped as they are right now.

 

 


3 hours ago, 1610ftw said:

That sounds like an excellent external solution. TB5 may have the potential to give much better results, but I have a gut feeling that somebody somewhere will mess it up again.

<snip>

 

 

That's basically the concept of the ASUS Flow series. As for the desktop chip idea, the chips would have to be modified anyway to support certain laptop technologies, including things like Optimus; desktop GPUs don't really support that in their drivers.


We've all seen Clevo's newest X170 replacement, the X370.

 

It's obviously quite a departure from Clevo's standard designs of old: lost modularity and power delivery in favour of a slimmer design no one was asking for, etc.

 

But still, is it gearing up to be the most powerful 4090 laptop?


1 hour ago, Shark00n said:

We've all seen Clevo's newest X170 replacement, the X370.

It's obviously quite a departure from Clevo's standard designs of old: lost modularity and power delivery in favour of a slimmer design no one was asking for, etc.

But still, is it gearing up to be the most powerful 4090 laptop?

 

No chance that the X370 will be the most powerful.

Maybe the most reviled, though, given its shameless posing as something it is not.

To be clear, every manufacturer needs a breadwinner in the form of thin-and-light BGA books, but nobody forced Clevo to treat its X lineup with such contempt.



On 2/10/2023 at 4:28 PM, Papusan said:

<snip>

It's not a surprise. We all already knew this would happen, as long as desktop GPUs keep getting more power hungry every generation while laptop GPUs are stuck at 175W with fewer CUDA cores.

It was fun while it lasted, when the 10, 16, and 20 series laptop GPUs were very close in performance to the desktop versions. Correct me if I'm wrong, but didn't those GPUs use the exact same chip for desktops and laptops? That would explain why both versions were so close in performance.

 

 


Current Laptop:

Lenovo Legion 5: AMD Ryzen 7 4800H 2.8Ghz (Boost: 4.2Ghz), 6GB Nvidia Geforce GTX 1660Ti GDDR6 Memory, 15.6" FHD (1920 x 1080) 144Hz IPS display, 32GB 3200MHz DDR4 memory, 512GB M.2 NVMe PCIe SSD, 1 TB Teamgroup MP34 M.2 NVMe PCIe SSD, Windows 10 Home 22H2


9 hours ago, Shark00n said:

<snip>

But still, is it gearing up to be the most powerful 4090 laptop?

 

I don't think it has been reviewed yet. It has a similar height (25mm) to other "DTR" notebooks this generation, and at 3.3kg it is slightly heavier or lighter depending on what you're comparing it to (the Scar 18 is 3.1kg and the m18 is 4.04kg). I doubt it will be any better than the 18-inch notebooks offered by others.

Metabox Prime-X (X170KM-G) | 17.3" 165Hz G-sync | 11900KF | 32GB DDR4 3200 | RTX 3080 16GB | 1TB 980 Pro

