Everything posted by Aaron44126
-
the decline of the once mighty Nvidia laptop GPUs
Aaron44126 replied to 1610ftw's topic in General Discussion
I kind of see this in the opposite direction. It's not the decline of laptop GPUs so much as it is desktop GPUs finally growing up to take advantage of the desktop form factor. If desktops are an order of magnitude larger than laptops (talking about physical space/volume), then they should be able to dissipate an order of magnitude more heat. A decade ago, desktop CPUs and GPUs were not using more power than you could reasonably dissipate from a laptop chassis. Now, they are. NVIDIA is now building desktop GPUs that consume more than 400W, and there's not really a way to dissipate that amount of heat from a laptop chassis (plus heat from the CPU as well) using current designs and materials. So yes, you're right, the gap between desktop and laptop GPU performance will only continue to widen as NVIDIA continues to crank up GPU power limits. It's more a matter of physics than it is NVIDIA failing in the laptop space.
Not to give NVIDIA a pass... One could make the argument that putting a GA103 or AD103 GPU chip into a laptop is stupid. Here, I am assuming that recent rumors about an upcoming "GeForce 4090" laptop GPU with an AD103 core and 175W TGP are true, but NVIDIA is already selling "GeForce 3080 Ti" laptop GPUs with the GA103 core (...I have one right here). The power limit is going to be so low that the performance benefit of using one of those chips over GA104/AD104 at the same power level is going to be in the 2-5% range (as you can see by looking at the 3080 vs. 3080 Ti performance numbers above), yet NVIDIA will charge hundreds of dollars more for the higher-end GPU.
And of course, NVIDIA's propensity to give desktop and laptop GPUs the same name is definitely misleading. Less-aware consumers will think they're getting desktop 4090 performance out of their laptop 4090 GPU, and... obviously it won't even be close. I preferred it back when they just stuck an "M" on the end of all of their laptop GPUs to make it clear that they were different. But NVIDIA likes it this way because it makes their laptop GPUs appear more competitive against the desktop variants and thus easier to sell, I presume.
A higher-bandwidth eGPU connection option could help laptop users who want access to desktop levels of GPU performance, I guess...?
-
Yes, that's what I'm referring to. Normally, messing with the INF file is required for "unsupported" GPU upgrades. My guess is that NVIDIA didn't actually turn on Optimus support for whatever system this card is being recognized as (since it is not "supposed" to be working), and it would still require an INF mod to "fix" it properly. It's probably easiest to just disable Optimus altogether if you don't actually use the system on battery!
-
Another one.
-
Yes, Optimus is primarily for power savings. It is fine to run with it disabled; doing so generally removes some hassle. Inability to get the NVIDIA GPU to engage in Optimus after a GPU upgrade is generally a symptom of the INF mod being done incorrectly. I ran into this myself when I did my first GPU upgrade (Quadro M5000M in a Precision M6700). When you do the device ID replacement, you have to make sure that you are replacing a configuration that supports Optimus. These Dell systems actually change the hardware ID depending on whether Optimus is enabled, so you have to pay attention to which hardware ID you are replacing when you do the INF mod. (A rough sketch of what the swap amounts to is below.)
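Roughly, the ID swap amounts to the sketch below. Every path and device ID here is a hypothetical placeholder; the real values come from Device Manager (checked with Optimus enabled) and from a donor entry that actually exists in your driver's INF as an Optimus configuration:
```python
# Minimal sketch of the INF device-ID swap described above.
# All IDs and the file path are hypothetical placeholders -- check Device
# Manager (with Optimus ENABLED) for the hardware ID your upgraded GPU
# reports, and pick a donor entry in the INF that you know is an Optimus
# configuration for your system.

from pathlib import Path

INF_PATH = Path(r"C:\ExtractedDriver\Display.Driver\nvdmi.inf")  # placeholder

DONOR_ID = "DEV_13B2&SUBSYS_05AA1028"   # hypothetical Optimus entry in the INF
TARGET_ID = "DEV_13D7&SUBSYS_05AA1028"  # hypothetical ID your new GPU reports

text = INF_PATH.read_text(encoding="utf-8", errors="ignore")
if DONOR_ID not in text:
    raise SystemExit("Donor ID not found -- wrong INF or driver version?")

# Point the donor configuration at the new card's hardware ID.
INF_PATH.write_text(text.replace(DONOR_ID, TARGET_ID), encoding="utf-8")
```
(A modified INF no longer matches the driver package's signature, so installing it requires disabling driver signature enforcement.)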
-
The dGPU stays on when it shouldn't; I've posted about this plenty of times before. With hybrid graphics on, try simply disabling and then re-enabling the dGPU in Device Manager and see if that fixes the power draw and temperatures. I have this scripted to happen two minutes after I log in to fix Optimus (sketch below). The dGPU might be on even if the NVIDIA status icon is showing gray/off. An easy way to check is with Dell Fan Management: if it shows a temperature for the dGPU, the dGPU is on; if the dGPU is off, it will show "—".
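For reference, here is a minimal sketch of that scripted fix, assuming a recent Windows 10/11 build where pnputil supports /disable-device and /enable-device. It has to run elevated, and the device instance ID is a hypothetical placeholder (find the real one with pnputil /enum-devices /class Display):
```python
# Sketch of the "disable then re-enable the dGPU after login" workaround.
# Assumes pnputil with /disable-device and /enable-device support
# (recent Windows 10/11) and an elevated prompt.

import subprocess
import time

DGPU_ID = r"PCI\VEN_10DE&DEV_XXXX\0&00000000"  # hypothetical placeholder

time.sleep(120)  # mirror the "2 minutes after login" delay
subprocess.run(["pnputil", "/disable-device", DGPU_ID], check=True)
time.sleep(5)    # give the driver a moment to settle
subprocess.run(["pnputil", "/enable-device", DGPU_ID], check=True)
```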
-
Sam Bankman-Fried’s former friends pleaded guilty and are cooperating in the FTX fraud case https://www.theverge.com/2022/12/21/23521967/sam-bankman-fried-ftx-crypto-fraud-caroline-ellison-gary-wang
-
I think twice the E cores will be fine. Will it run at higher power limits (overall) or clock speeds than Alder Lake / 12th gen? No. Will it be faster, as in, get more work done in the same amount of time? For fully multi-threaded loads, likely yes. More cores running at lower power levels (per core) is generally more efficient in terms of performance per watt.
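To put toy numbers on that last point: per-core dynamic power scales roughly with V²·f and voltage has to rise with frequency, while throughput scales roughly linearly with frequency, so a fixed power budget spread across more, slower cores gets more total work done. A sketch with made-up numbers (nothing here reflects actual Intel silicon):
```python
# Toy model of the "more cores at lower clocks" efficiency argument.
# The V/f curve and core counts are invented for illustration only.

def core_power(freq_ghz: float) -> float:
    voltage = 0.6 + 0.1 * freq_ghz   # crude, made-up voltage/frequency curve
    return voltage ** 2 * freq_ghz   # dynamic power ~ V^2 * f (arbitrary units)

def throughput_and_power(cores: int, freq_ghz: float):
    return cores * freq_ghz, cores * core_power(freq_ghz)

for cores, freq in [(8, 5.0), (16, 3.3)]:
    perf, power = throughput_and_power(cores, freq)
    print(f"{cores} cores @ {freq} GHz: perf={perf:.0f}, "
          f"power={power:.1f}, perf/W={perf / power:.2f}")
```
At roughly the same total power, the wider/slower configuration comes out well ahead on work done per watt.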
-
Rumored specs show AD103 as the top laptop GPU for the next generation, and NVIDIA could indeed name it "GeForce 4090" despite the same 175W limit that we are seeing this generation. (The top pro 5000-level GPU will likely match specs with whatever the top consumer mobile GPU is.) https://videocardz.com/newz/alleged-nvidia-geforce-rtx-40-laptop-gpu-clock-and-tgp-specs-emerge I'd say this means there is not much room for a mid-generation GPU upgrade in the 2024 systems. (I seriously doubt that NVIDIA will try to shove an AD102 GPU into laptops.)
-
SBF has waived his right to formal extradition hearings and may be moved to the U.S. as soon as today. https://www.theblock.co/post/197107/bankman-fried-extradition-to-u-s-approved-wsj
-
Alder Lake HX also had a 157W upper limit. I think the power efficiency at lower power levels is not bad, especially with the E cores in play... but obviously more power = higher performance, and competition is forcing Intel to raise power limits in order to stay on top of the performance charts. I suppose they figure you'll buy an Alder Lake "P" or "U" CPU if you want a system that gobbles less power...? Heck, even the Alder Lake U CPUs have up to 10 cores / 12 threads and can turbo boost to 4.8 GHz. (Dell offers "U" CPUs in the Precision 3000 line.)
-
AFAIK, no manufacturers are putting MXM cards in current-generation laptops. You can find "standard" Turing and Ampere MXM cards (T1000/T2000/A1000/A2000), but they are built more for small embedded systems and "happen to work" in some older laptops with MXM slots. (They are also hard to find and expensive when they do pop up.) Higher-end cards like the A4500 exist but are even more difficult to come by. I don't recall seeing a GeForce MXM card since the RTX 2080, and those were wildly out of spec in terms of both size/shape and power requirements. High-end GPUs have moved beyond the MXM spec (it was designed for around 100W TDP max, and while it can be pushed a bit higher than that, Ampere laptop GPUs can pull up to around 175W). A modern replacement standard has not emerged; laptop manufacturers are happy to either solder GPUs directly onto the motherboard or use proprietary cards.
-
It says "Raptor Lake" laptops will be available before the end of 2022 (Intel has stated this in the past), but it does not say that about the HX line specifically. I'm sticking with my earlier projection: we'll hear about lots of systems with Raptor Lake H at CES (coming up soon), probably launching throughout the first quarter of 2023, but Raptor Lake HX will be a few months further out still.
-
You are right. I missed the desktop-vs-laptop comparison. (Though honestly, we don't know what type of system the CPU was in when it was benchmarked. It could even be a development board with a desktop-style cooler. We'll still have to wait and see how it performs in specific laptop models.)
-
The 13900HX shows up in Geekbench: 8P+16E (as expected) and about 20% faster than the 12900K in the multi-core benchmark. (It looks like the additional E cores are where the boost comes from, because the single-threaded benchmark shows an almost negligible gain.) https://www.tomshardware.com/news/13900hx-outperforms-12900k-geekbench-5
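As a back-of-the-envelope sanity check (the 0.5 ratio is purely an assumption, not a measured figure): if one E core delivers about half a P core's throughput, the 12900K's 8P+8E works out to ~12 P-core equivalents versus ~16 for the 13900HX's 8P+16E, or ~33% on paper, so the observed ~20% is plausible once the laptop part's lower sustained clocks are factored in:
```python
# Back-of-envelope: treat an E core as ~0.5x a P core (assumed ratio).
# Ignores clocks, power limits, and memory -- it only shows that the
# extra E cores alone could account for the multi-core gap.

E_RATIO = 0.5  # assumed E-core throughput relative to a P core

def p_equivalents(p: int, e: int) -> float:
    return p + e * E_RATIO

uplift = p_equivalents(8, 16) / p_equivalents(8, 8) - 1  # 13900HX vs 12900K
print(f"Naive core-count uplift: {uplift:.0%}")  # ~33% before clock deficits
```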
-
Interesting that the system boots with nothing in the DGFF area. Precision 7X30-7X60 systems would fail to boot without either a dGPU or the iGPU pass-through card. (In those systems, the mDP and HDMI ports were physically on the dGPU card, so the iGPU pass-through card was needed to route those ports to the integrated GPU.) I wonder what the iGPU spacer card actually does in this system…?