1610ftw Posted December 22, 2022

As the GPU discussion has largely taken over the HX processor thread, I would like to open this thread and start with a post that shows the constantly increasing performance differential between laptop and desktop GPUs, a differential that, depending on one's point of view, may reach new sad and/or comical heights with the upcoming 40xx laptop GPUs. For the first time since the 980M there will be a top-end laptop GPU that in my opinion has no real right to carry the name of its desktop counterpart, as it will probably not even reach half of its performance, nor does it even use the same basic chip.

To track the widening gap between laptop and desktop cards I have checked the GPU Time Spy leaderboard scores at 3DMark.com for the top laptop GPUs and their counterparts across five generations, starting with the best laptop card of them all, the GTX 1080, and the trend has been pretty obvious: after three generations in which desktop cards held relatively moderate leads over their laptop counterparts, the differences have increased a lot with the 30xx cards. But all of this is nothing compared to the upcoming laptop 4090, which will probably not even reach half the performance of the desktop version.

To make it a fair comparison with attainable performance, and to exclude more elaborate or extreme setups, I have used the #100-ranked scores as of today. As we can see, for the 3080 cards and the upcoming 4090 there is no real similarity in performance any more, just in name, if the rumored performance is not completely off. Extrapolating from previous generations and leaked scores I am going with ca. 19,000 for now, as even 20,000 looks less than likely due to the severely limited TGP and the use of a chip that is weaker than the one in the 4080 desktop card. Compare that to the 1080, 2080 and 2080 Super, where the differential was much smaller; it even surprised me, as I had thought it would be higher.

So what has happened lately? After leaving the path of very close performance capability between desktop and laptop GPUs, it looks like Nvidia stopped even trying to keep up with its desktop designs when it came to laptops. With the 3080 Ti the differential first went over 50%, and for the first time Nvidia used the name of a top-tier desktop card, which it had never done before, without really making sure it also had the hardware to back up that big new name. Fast forward to today, and Nvidia obviously could not withstand the temptation to introduce the name of its biggest, most power-hungry, top-end desktop card into the mobile lineup without taking any further steps to beef up the hardware accordingly. What we will get is apparently a severely TDP-limited design that is based on the 4080 desktop chip but pared down in almost all aspects of performance, so much so that even the desktop 4080 will be more than 50% if not 60% faster than this laptop 4090 card.

It will be interesting to revisit this once the new cards have launched. I will also add comparisons for the second-tier cards used in laptops, as they are IMO the sweet spot of elevated performance at still-decent pricing in the 30x0 generation, and it will be interesting to see whether that changes with the upcoming 40x0 launch.
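For anyone who wants to reproduce the math: the differential I am talking about throughout is simply (desktop / laptop - 1) × 100. A minimal Python sketch, using my ca. 19,000 estimate for the laptop 4090 from above; the desktop score here is only an illustrative placeholder, not a leaderboard value:

```python
def differential(desktop_score: float, laptop_score: float) -> float:
    """Percentage by which the desktop score exceeds the laptop score."""
    return (desktop_score / laptop_score - 1) * 100

laptop_4090_estimate = 19_000      # extrapolated Time Spy GPU score from above
desktop_4090_placeholder = 35_000  # illustrative only; substitute the #100 leaderboard score

gap = differential(desktop_4090_placeholder, laptop_4090_estimate)
print(f"Desktop is {gap:.0f}% faster")  # ~84% with these placeholder inputs
```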
Aaron44126 Posted December 22, 2022

I kind of see this in the opposite direction. It's not the decline of laptop GPUs so much as it is desktop GPUs finally growing up to take advantage of the desktop form factor. If desktops are an order of magnitude larger than laptops (in physical space / volume), then they should be able to dissipate an order of magnitude more heat. A decade ago, desktop CPUs and GPUs were not using more power than you could reasonably dissipate from a laptop chassis. Now, they are. NVIDIA is now building desktop GPUs that consume more than 400W, and there's not really a way to dissipate that amount of heat from a laptop chassis (plus heat from the CPU as well) using current designs and materials. So yes, you're right, the difference between desktop and laptop GPU performance will only continue to widen as NVIDIA continues to crank up GPU power limits. It's more a matter of physics than it is NVIDIA failing in the laptop space.

Not to give NVIDIA a pass... One could make the argument that putting a GA103 or AD103 GPU chip into a laptop is stupid. Here, I am assuming that recent rumors about an upcoming "GeForce 4090" laptop GPU with an AD103 core and 175W TGP are true, but NVIDIA is already selling "GeForce 3080 Ti" laptop GPUs with the GA103 core (...I have one right here). The power limit is going to be so low that the performance benefit of using one of those chips over GA104/AD104 at the same power level will be in the 2-5% range (as you can see by looking at the 3080 vs. 3080 Ti performance numbers above), yet NVIDIA will charge hundreds of dollars more for the higher-end GPU.

And of course, NVIDIA's propensity to name desktop and laptop GPUs the same is definitely misleading. Less aware consumers will think they're getting desktop 4090 performance out of their laptop 4090 GPU, and obviously it won't even be close. I preferred it back when they just stuck an "M" on the end of all of their laptop GPUs to make it clear that they were different. But NVIDIA likes it this way because it makes their GPUs appear more competitive against the desktop variants and thus easier to sell, I presume.

A higher-bandwidth eGPU connection option could help laptop users who want access to desktop levels of GPU performance, I guess...?
jaybee83 Posted December 22, 2022

What happened was ever-higher board TDPs on desktop GPUs and ever more castrated mobile GPU TDPs 😄
ssj92 Posted December 22, 2022

^This. We went from a 250W TDP desktop flagship to 450W now, and possibly 600W in the future. On laptop we went from 100W TDP to 200W, then 190W, now 175W (with the BS turbo TDP accounted for).

Basically, laptop went down and desktop went way up.
1610ftw Posted December 23, 2022

SLI laptop designs like the P870 could cool up to 400W of peak GPU power, with a total die size very similar to an RTX 4080's (more than 600 mm²). Even a 270W version of the 4090 chip in a laptop (60% of the desktop power target) should yield excellent results, easily at least 65% better than what Nvidia plans to achieve with the 4090 mobile, and probably still at least 50% better with a 225W version of the 4090 desktop die.
1610ftw Posted December 23, 2022

Looks like at least the max power may be a bit higher this time:

https://www.notebookcheck.net/Nvidia-RTX-4090-and-RTX-4080-Laptop-GPUs-rated-for-2-GHz-boost-and-up-to-200-W-TGP-including-Dynamic-Boost-RTX-4060-and-RTX-4050-up-to-95-W-or-165-W-each.676258.0.html

It still puzzles me why Nvidia does not see the necessity of increasing the max power going to the 4090. But then, by all accounts they are only using a slower version of the 4080 desktop chip with a narrower bus, lower clock speeds and probably also fewer CUDA cores, so more power would not have the same effect as giving more power to a desktop 4080.
Etern4l Posted December 23, 2022

50 minutes ago, 1610ftw said:
It still puzzles me why Nvidia does not see the necessity of increasing the max power going to the 4090...

The only thing Nvidia sees as a necessity is making more money. So they ask the manufacturers: what chips can we provide to make more money? The answer is: please provide chips that are as efficient as possible, and obviously keep the power low. No more than 170W-ish, otherwise laptops tend to catch fire (see Alienware Area-51M), and we do have the slim-and-light agenda to pursue here. Thanks!
1610ftw Posted December 23, 2022

Probably some truth to that, certainly the money-making part 😄

However, with desktop card manufacturers it is mostly Nvidia telling them what to do. So it is probably a bit of both, and manufacturers and Nvidia are happy with relatively safe mediocrity that does not cause a lot of issues.
Bullit Posted December 24, 2022

Well, the "declined" Nvidia laptop GPUs are much more capable than the once "mighty" Nvidia laptop GPUs.
ryan Posted December 24, 2022

I have to agree, a desktop is bigger so it has more die space etc. to be more powerful. The 1060, 2060 and 3060 are the odd ones out: my 3060 is equal to a desktop 3060, and it's the 115W variant. I think as long as the performance goes up by a reasonable amount for both desktop and laptop, who cares. It's not a secret.
1610ftw Posted December 25, 2022

On 12/24/2022 at 4:32 AM, Bullit said:
Well, the "declined" Nvidia laptop GPUs are much more capable than the once "mighty" Nvidia laptop GPUs.

A not even 25% performance increase of the 3080 Ti over the 2080 across THREE generations is now defined as much more capable? While at the same time desktop performance has increased by more than 75%, for more than three times the gains?

Looks to me as if Nvidia did as little as possible to avoid the impression of a complete standstill while still charging the same high prices as back in the days when their laptop GPUs were actually quite competitive. Obviously the number of people who see it as somehow natural that Nvidia routinely uses very low power limits and inferior dies that belong to smaller desktop cards is quite high. The strategy of doing as little as possible therefore seems to work out very well, and I will give them that, but with today's technology we could easily see laptop GPUs with at least 50% higher performance on a much lower total power budget than the SLI laptops of the past.
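To make the comparison concrete, here is what those totals imply per generation, if one assumes, purely for illustration, equal gains each generation (a quick Python sketch; the 25% and 75% totals are the figures from above):

```python
def per_gen_gain(total_uplift: float, generations: int) -> float:
    """Average per-generation gain implied by a total uplift over n generations,
    assuming (purely for illustration) equal gains each generation."""
    return ((1 + total_uplift) ** (1 / generations) - 1) * 100

print(f"laptop:  {per_gen_gain(0.25, 3):.1f}% per generation")   # ~7.7%
print(f"desktop: {per_gen_gain(0.75, 3):.1f}% per generation")   # ~20.5%
```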
Bullit Posted December 25, 2022

In the Blender render benchmark (https://opendata.blender.org/benchmarks/query/?compute_type=OPTIX&compute_type=CUDA&compute_type=HIP&compute_type=METAL&compute_type=ONEAPI&group_by=device_name&blender_version=3.3.0):

RTX 3080 laptop = 3482 points
RTX 2080 = 2400 points
Reciever Posted December 26, 2022

Not news, moved to General for now. Thanks
Meaker Posted December 26, 2022

I did get a noticeable improvement out of my 3070 mobile going from 125W to 180W, but it's more of a desktop 3060 Ti with that modding done.
1610ftw Posted December 28, 2022

On 12/25/2022 at 10:09 PM, Bullit said:
In the Blender render benchmark: RTX 3080 laptop = 3482 points, RTX 2080 = 2400 points

Blender has seen an above-average intergenerational improvement, as it seems to benefit disproportionately from Ampere.

By the way, desktop vs. mobile performance in the same benchmark:

3080 Ti mobile: 3840
3080 Ti desktop: 5958

That is a 55% advantage for the desktop over the laptop, basically the same as in Time Spy.
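The 55% figure is the same differential arithmetic as in the opening post, applied to the Blender scores quoted above:

```python
# Same differential as used for the Time Spy comparison earlier in the thread.
desktop, laptop = 5958, 3840  # Blender 3.3.0 scores quoted above
print(f"{(desktop / laptop - 1) * 100:.0f}%")  # -> 55%
```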
Bullit Posted December 28, 2022

Ray tracing, plus tensor cores for denoising with OptiX. I have the 130W 3060 laptop, and already in some scenes it starts to be a pleasure to work. It is at the point where your work starts to feel fluid, while in the past render time meant a coffee break at best.
Etern4l Posted December 28, 2022

1 hour ago, Bullit said:
Ray tracing, plus tensor cores for denoising with OptiX. I have the 130W 3060 laptop, and already in some scenes it starts to be a pleasure to work. It is at the point where your work starts to feel fluid, while in the past render time meant a coffee break at best.

The 3060 is very power efficient; 130W is plenty of power for it. Yet even on this entry-level card Nvidia nerfed the laptop variant: it only has 6GB of VRAM, whereas the desktop cards have 12GB. And of course, once we move up the food chain, the laptop/desktop gap increases further.
ryan Posted December 29, 2022

I have the 3060 115W variant, and yes, it only has 6GB of VRAM, but most games don't need more. Maybe some games need 8GB or more at 4K ultra, but if you're gaming at 4K ultra I'm betting the game is old and won't require much VRAM. I know Watch Dogs: Legion has a hard time with the 3060 laptop, but I don't play that game. Games should be built around common hardware, and the most common GPU is the 1650 4GB, so with that said, the 6GB 3060 is still pretty good.

Also, Bullit, I was wondering what you get in Time Spy? I wonder how big the gap is from 115W to 130W. I have no idea if my 9200 score is the highest for the 115W variant or not... no way to tell.
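For a rough sense of scale while waiting on a 130W score: the raw power headroom between the two variants is small, and performance usually scales sublinearly with power, so the score gap should be smaller still. A back-of-envelope sketch, not a measurement:

```python
# Raw power headroom between the two 3060 laptop variants discussed above.
low_tgp, high_tgp = 115, 130
headroom = (high_tgp / low_tgp - 1) * 100
print(f"{headroom:.0f}% more power")  # -> 13% more power
# Performance usually scales sublinearly with power, so the Time Spy
# gap should come in well under 13%.
```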
Bullit Posted December 29, 2022

6GB is a big issue for me in Unreal Engine, and it might also be in Blender; currently I am just making small projects there. If Time Spy is something I need to install, no chance. You can run the Blender Benchmark with no need to install anything: https://opendata.blender.org, choose GPU, OptiX and v3.3. My value is 2615, and with HWiNFO I see it between 110-125W while doing the benchmark.
Etern4l Posted December 30, 2022

6GB was on the low side four years ago. If you look at benchmarks, current games are pushing 11-12GB (OK, probably not the titles that would comfortably run on a 3060), and of course the extra VRAM can be very useful outside of the gaming arena.
Reciever Posted December 31, 2022

I don't think I have had anything less than 8GB since 2018/2019.
jaybee83 Posted December 31, 2022

47 minutes ago, Reciever said:
I don't think I have had anything less than 8GB since 2018/2019.

Yep, since 2015 in my case. I wouldn't go below 12GB with a new GPU nowadays, and would rather aim for 16GB for some future-proofing.
Papusan Posted December 31, 2022

Nvidia GeForce RTX 4060 laptop graphics will take a big leap in performance over its predecessor: a whopping 20% performance boost over the RTX 3060 laptop graphics. What's the point of this upgrade if it's true? Bro @Mr. Fox, don't you have the 3060 in your Jokebook? If so, will you jump on an Ada gaming book with the 4060 refresh?

Nvidia GeForce RTX 4060 laptop graphics card's 3DMark Time Spy benchmark score leaks online
https://www.notebookcheck.net/Nvidia-GeForce-RTX-4060-laptop-graphics-card-s-3DMark-Time-Spy-benchmark-score-leaks-online.677472.0.html
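For scale: taking the 115W 3060 Time Spy score posted earlier in this thread as a baseline, a 20% uplift is just this (the actual leaked score is in the linked article; this is only the arithmetic):

```python
# What a 20% generational uplift means, using the 115W RTX 3060
# Time Spy score posted earlier in this thread as the baseline.
rtx_3060_laptop = 9_200
rtx_4060_estimate = rtx_3060_laptop * 1.20
print(f"~{rtx_4060_estimate:.0f} expected")  # -> ~11040 expected
```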
Mr. Fox Posted December 31, 2022

27 minutes ago, Papusan said:
Bro @Mr. Fox, don't you have the 3060 in your Jokebook? If so, will you jump on an Ada gaming book with the 4060 refresh?

No. Half-breed is a 16GB Quadro. Wheezer was a 2060.
KING19 Posted December 31, 2022

1 hour ago, Papusan said:
Nvidia GeForce RTX 4060 laptop graphics will take a big leap in performance over its predecessor: a whopping 20% performance boost over the RTX 3060 laptop graphics. What's the point of this upgrade if it's true?

Wow... The only positive about it is that it uses 8GB of VRAM instead of 6GB like the RTX 3060/2060.

According to Notebookcheck the RTX 4060 has only a 96-bit memory bus, and I hope that's a typo because it clearly got castrated...

https://www.notebookcheck.net/NVIDIA-GeForce-RTX-4060-Laptop-GPU-Benchmarks-and-Specs.675692.0.html