
Everything posted by 1610ftw
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
1610ftw replied to Mr. Fox's topic in Desktop Hardware
Hey bro, stuff keeps breaking around these parts, so sadly no 😞 Currently we are waiting to get a defective 3080 back from service, and before that two systems and some non-computer stuff went belly up, so we have been relegated to working on that - it is a damn shame.
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
1610ftw replied to Mr. Fox's topic in Desktop Hardware
I know that design very well, but you know it is from last year. It is a good design with regard to the three M.2 2280 slots in front next to the battery, same for the 7760 - long may Dell keep it. However, outside of three last-generation workstations from Dell, HP and MSI, even four of these relatively slim M.2 slots seem to be too much to ask these days. We will see what happens later this year when and if new workstations are announced, and possibly also some more gaming-focused laptops.
-
No chance that the X370 will be the most powerful. Maybe the most reviled, though, given its shameless posing as something it is not. To be clear, every manufacturer needs some breadwinner in the form of thin and light BGA notebooks, but nobody forced Clevo to treat its X lineup with such contempt.
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
1610ftw replied to Mr. Fox's topic in Desktop Hardware
I hope you are aware that nobody is supposed to have 4 full-size NVMe SSDs in a laptop - there just isn't enough space for that any more. Not even in the new 18" laptops, just ask Acer, Asus and Dell... MSI also lost some circuit board real estate and cannot find it any more, so now the GT77 only has 3 slots instead of the 4 it had in the last generation 😄
-
That sounds like an excellent external solution. TB5 may have the potential to give much better results already, but I have a gut feeling that somebody somewhere will mess it up again. As for the cards that we find in laptops, I would say offer manufacturers the freedom to use desktop chips and see what they come up with - we might be surprised how good these can be. I am completely OK with Nvidia doing some testing of such a machine to make sure they do not go up in flames with one of the larger chips, which would hopefully come with a TGP well in excess of 200W. The important point here is that Nvidia stops setting arbitrary and rather low limits on what laptop cards, or rather chips, can do or be. It is in the interest of all people who spend big money on laptops to get top-of-the-line models that aren't artificially gimped as they are right now.
-
The 980 through the 2080 Super were all pretty close to their desktop counterparts compared to the last two generations. Looking at Time Spy for comparison, the 1080, 2080 and 2080 Super desktop cards were only about 9 to 12% better, and only with the 980 was there a roughly 28% difference. Compare that to about 60% today for the 4080, and I predict a whopping 80%+ for the 4090, where so far we do not have that many results for the notebook card. At the moment the desktop 4090 in 100th place is more than 90% better than the 4090 notebook ranked in 25th place. It would have made a lot more sense to just call the 4090 notebook a 4080. Even then the desktop 4080 would still be more than 35% better, which already creates enough confusion and/or disappointment.
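For anyone who wants to sanity-check the math, here is a quick Python sketch. The score pairs below are illustrative round numbers chosen to reproduce the percentages above, not actual leaderboard entries:

```python
# Rough Time Spy graphics-score pairs (desktop, notebook) per generation.
# These are illustrative approximations picked to match the percentages
# discussed above, not official leaderboard values.
score_pairs = {
    "GTX 980":        (11200, 8750),
    "GTX 1080":       (7600, 6900),
    "RTX 2080":       (11000, 10000),
    "RTX 2080 Super": (11600, 10500),
    "RTX 4080":       (28000, 17500),
    "RTX 4090":       (36000, 20000),
}

for gpu, (desktop, notebook) in score_pairs.items():
    gap = (desktop - notebook) / notebook * 100
    print(f"{gpu:<16} desktop ~{gap:.0f}% ahead of the notebook part")
```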
-
Notebookcheck commenting on the performance gap and mentioning practical issues with identical names for laptop and desktop GPUs: https://www.notebookcheck.net/The-mobile-GeForce-RTX-4080-is-35-percent-slower-than-the-desktop-RTX-4080-and-that-can-be-problematic-for-consumers.691712.0.html Edit: @cylix beat me to it 😄
-
Some of these new games must be directly financed by the hardware manufacturers. "I need to get myself a 5090Ti so that I can finally get 60 fps in Hogwarts Legacy with raytracing." Pathetic.
-
The Time Spy CPU score has been all over the place for me, probably because of the combined TDP limit for CPU and GPU. You can also see that in the 4090 results, which show surprisingly low CPU numbers.
-
Again, these are two different benchmark runs from two different days - look at the dates on the right.
-
Two different scores. One is better when sorted by overall and the other when sorted by CPU - check the date.
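To make that concrete, here is a tiny Python sketch with two made-up runs, showing how the "best" run flips depending on the sort key:

```python
# Two hypothetical Time Spy runs from different days: neither one wins
# on both metrics, so the "top" result depends on how you sort.
runs = [
    {"date": "2023-02-08", "overall": 21900, "cpu": 15200},
    {"date": "2023-02-09", "overall": 21500, "cpu": 17400},
]

best_overall = max(runs, key=lambda r: r["overall"])
best_cpu = max(runs, key=lambda r: r["cpu"])

print("Best by overall score:", best_overall["date"])  # 2023-02-08
print("Best by CPU score:    ", best_cpu["date"])       # 2023-02-09
```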
-
I honestly detest the gamery Asus design, even in the non-Strix version, and I would never even consider it unless maybe it was socketed. It is also disappointing to see that Asus would have had enough space for at least one if not two more M.2 slots, but they chose not to include them. Still, we have to give them these advantages in CPU and screen, and some people will surely also prefer that the G18 weighs about 1 1/2 to 2 lbs less. Hopefully that will not hinder m18 sales. For me 165Hz and 300 nits will be perfectly fine in almost all situations, but I would also expect Dell to do something about their screen weakness relative to their competitors - it just is an obvious point that they have to concede to the competition. I am much more disappointed that despite its heft the Alienware cannot dissipate more heat, nor does it have 4 memory slots or 4 full-sized M.2 slots. I guess we will know more pretty soon, but it strikes me as off that Dell did not manage to do a bit more with such a big chassis. But then maybe everything is so over-engineered that the m18 will be very quiet - I would certainly settle for that!
-
Yep, loaded it myself, ultra long link incoming: https://www.3dmark.com/search#advanced?test=spy%20P&cpuId=&gpuId=1546&gpuCount=0&gpuType=ALL&deviceType=ALL&storageModel=ALL&memoryChannels=0&country=&scoreType=graphicsScore&hofMode=true&showInvalidResults=false&freeParams=&minGpuCoreClock=&maxGpuCoreClock=&minGpuMemClock=&maxGpuMemClock=&minCpuClock=&maxCpuClock= 😄
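For what it is worth, that monster link is just a bundle of query parameters. A small Python sketch that rebuilds it (the parameter names are taken from the URL above, and "spy P" has to be URL-encoded as "spy%20P"):

```python
from urllib.parse import urlencode, quote

# Rebuild the 3DMark advanced-search link from its query parameters.
# Parameter names come from the URL above; "spy P" (Time Spy,
# Performance preset) must be encoded as "spy%20P".
params = {
    "test": "spy P",
    "gpuId": 1546,                 # the GPU id used in the link above
    "gpuCount": 0,
    "gpuType": "ALL",
    "deviceType": "ALL",
    "storageModel": "ALL",
    "memoryChannels": 0,
    "scoreType": "graphicsScore",
    "hofMode": "true",
    "showInvalidResults": "false",
}

url = "https://www.3dmark.com/search#advanced?" + urlencode(params, quote_via=quote)
print(url)
```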
-
I think it has only been up since about yesterday, so that is a bit of a strange question 🙂 Are you sure that you are not talking about the Strix Scar? The regular Strix is usually priced lower. You also have to add VAT in Europe, and some other costs seem to be added too, but it should be priced between 2999 and 3199 Euro.
-
For 4K gaming you had better have a look at Time Spy Extreme, where I expect the 4090 to have more of a lead. Edit: Don't bother, there are not enough results right now. Here are the 4090 Time Spy scores for now, including a score from brother @Prema, who currently has the top score with CPU and GPU combined:

Rank | Overall score | Graphics score | CPU score | CPU | GPU | CPU clock | GPU core clock | GPU mem clock | User | Date
1 | 21716 | 23589 | 14978 | Intel Core i9-13980HX | NVIDIA RTX 4090 (notebook) | 5586 | 2640 | 2450 | 神近かおり | February 9, 2023
2 | 21734 | 23352 | 15607 | Intel Core i9-13980HX | NVIDIA RTX 4090 (notebook) | 5586 | 2595 | 2300 | 363813766 | February 9, 2023
3 | 21675 | 23207 | 15776 | Intel Core i9-13980HX | NVIDIA RTX 4090 (notebook) | 5586 | 2520 | 2300 | XSSER | February 9, 2023
4 | 22013 | 23027 | 17620 | Intel Core i9-13900HX | NVIDIA RTX 4090 (notebook) | 4988 | 2325 | 2250 | Prema | January 30, 2023
5 | 21238 | 22394 | 16433 | Intel Core i9-13980HX | NVIDIA RTX 4090 (notebook) | 5586 | 2355 | 2250 | brainhds | February 9, 2023
6 | 19459 | 20972 | 13813 | Intel Core i9-13980HX | NVIDIA RTX 4090 (notebook) | 5586 | 2340 | 2275 | chungdenny | February 9, 2023
7 | 18661 | 20764 | 11858 | Intel Core i9-13950HX | NVIDIA RTX 4090 (notebook) | 5487 | 2385 | 2250 | panda131506 | February 9, 2023
8 | 18379 | 19027 | 15406 | Intel Core i9-13980HX | NVIDIA RTX 4090 (notebook) | 5586 | 2535 | 2250 | nullghost2011 | February 9, 2023
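Since the list above is ranked by graphics score, here is a small Python sketch re-using a few of those rows to show that sorting by overall score instead puts @Prema's combined run on top:

```python
# A few (user, overall, graphics, cpu) rows from the table above.
rows = [
    ("神近かおり", 21716, 23589, 14978),
    ("363813766", 21734, 23352, 15607),
    ("XSSER", 21675, 23207, 15776),
    ("Prema", 22013, 23027, 17620),
    ("brainhds", 21238, 22394, 16433),
]

top_by_graphics = max(rows, key=lambda r: r[2])
top_by_overall = max(rows, key=lambda r: r[1])

print("Top by graphics score:", top_by_graphics[0])  # 神近かおり
print("Top by overall score: ", top_by_overall[0])   # Prema
```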
-
Indeed, I will have to check the options. The Asus has the advantage of the best laptop GPU and a significantly brighter and faster panel - I think it is something like 500 nits / 240 Hz vs 300 nits / 165 Hz. Personally I am not a fan of the looks of the Asus, and I also do not like how they have pared down the connectivity, but people get a lot of raw performance and capability for their money. Is it already known how far down the GPU ladder the m18 will still have a vapor chamber? That would be good to know for people who do not really need a 4080.
-
I like what you have achieved with your 7770, but you took a wrong turn somewhere. Why would we not compare Time Spy scores between generations when this is what we have always done? It has a nice separate GPU score precisely so that the CPU will not have much of an influence, which is easy to see if, for example, you disable your turbo altogether. In any case those 4080 scores look like they are from desktop cards that are somehow attached to laptops. Here is the 4080 mobile leaderboard as of now - note the (notebook) suffix that designates the mobile GPU: Looks like at QHD that 4080 is working extremely well.
-
Not all laptops are that expensive this generation. The Asus ROG Strix 18 with the 4080, a 13980HX and an 18" display is only $2,499: https://www.bestbuy.com/site/asus-rog-strix-18-intel-core-i9-13980hx-16gb-ddr5-memory-nvidia-geforce-rtx-4080-v12g-graphics-1tb-ssd-eclipse-gray/6531333.p?skuId=6531333&intl=nosplash Looks like a pretty good deal for most people if two memory slots and two storage slots are enough.
-
Top of the line laptops indeed have not gone up in price much. But in the days of the GTX 1080 and RTX 2080 they were also quite close in performance when compared to desktop solutions. Now they are still priced similarly but the GPU performance just isn't competitive any more.
-
Supposedly this is the GPU score, not the combined one. Up until the last generation, improvements in Time Spy were a pretty good predictor of improvements in QHD gaming, at least.
-
I have no clue, should have just written male or men. Point is that average men should have no trouble carrying and handling a laptop that is a bit heavier and bigger than what we get these days. That is if they are mainly looking for a DTR. I get that there are exceptions and people who cannot lift that much but there are plenty of lighter and thinner laptops for them already.
-
They cannot charge that much if enough people do not buy. Do yourself a favor and check what resolution you really want to game in. If it is QHD you might be happy with an older laptop with a 3070 Ti or better, or a new one with the 4070. Personally I would probably not game in 4K on a laptop - it is not good enough for what you have to pay, nor do I think that my eyes appreciate the added resolution on a laptop screen. So if you still want to game in 4K on a laptop, then you are entering a world of (financial + performance) pain - you have been warned 😄
-
You only have to look at the Clevo X170 for a traditional laptop design that was rated at 325W of cooling; the P870 went well beyond 400W. There really is no reason for not dissipating more heat except for the manufacturers and Nvidia not wanting to go there. When MSI can do 250W with traditional heat pipes in a super slim chassis (GT77), they could certainly make that design a bit bigger and thicker with an 18" screen and go to 300W+ with the added thickness and real estate. Weight would probably be somewhere between 8 1/4 and 9 1/2 lbs, but I bet that enough people of the male variety who just carry their laptop from one place to another would not mind, especially when they also get a socketed Intel CPU and stacked NVMe SSDs like HP does. You only need about 5mm more height on average for all of that to happen.
-
Not all games work that well with reduced power, but it is possible that at a 250W TGP the average performance would be around 90% across a wider variety of games. Personally I am not affected, as I would be happy with a 4060 or 4070 for what I do, and I happen to think that, save for too little memory, the desktop cards aren't that bad either. It is just that the naming and the low power limits on laptops, together with the skyrocketing prices, rub me the wrong way - it is not the proper way to do things.
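As a rough illustration of that 90% idea, here is a Python sketch with an assumed efficiency curve. The scaling factors are guesses for illustration only, and they assume the 90% refers to performance relative to the same chip at its full desktop power limit:

```python
# Assumed relative performance of a desktop-class GPU as its power limit
# is lowered. Every number here is an illustrative guess, except that the
# 250 W point mirrors the ~90% figure suggested above.
assumed_scaling = {
    450: 1.00,   # stock desktop power limit of a big chip
    300: 0.95,
    250: 0.90,   # the ~90% average suggested above
    175: 0.75,   # roughly today's notebook limit (performance figure is a guess)
}

for tgp, rel_perf in sorted(assumed_scaling.items(), reverse=True):
    print(f"{tgp:>3} W -> ~{rel_perf:.0%} of full desktop performance")
```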
-
This is indeed what Nvidia is trying to pull off - paying more for more performance is probably supposed to be the new normal. I think Nvidia noticed how big of a leap they were making and tried to capitalize on it. It did not work out completely, as now they are being called out for it - A LOT! And that was just in the desktop space, where the new cards are really very powerful. Now, in the laptop space, their decision to use smaller chips than in the desktop cards with the same names, coupled with anemic power limits, is not perceived favorably out there - time to call a spade a spade. What does not help is that they are extremely stingy with memory in the mobile lineup. Instead of 8, 12 and 16 GB they should have gone for 12, 16 and 20 to 24 GB for the 4070, 4080 and 4090.