Posts posted by ssj92
-
54 minutes ago, Csupati said:
Thanks, what do I need to plug in? The AGA cable or the AGA power cable?
Thanks, that's good news!
Which cooling-solution cards were tested in the AGA? How easy is it to fit them inside?
3070 MSI Gaming Trio (I removed the whole top cover as I like to look at my cards.)
3090 Kingpin & XC3.
2080Ti PNY Blower
RX 6800 Reference
These are the most recent ones I tried.
-
I can confirm I have tested the RTX 3070, RTX 3090, and RX 6800 to work in the AGA.
The 4000 series also works.
-
2 hours ago, Papusan said:
Still no need to cripple the system down to 250W. They can use hybrid mode (stealing needed power from the battery), and the 330W power bricks are not fully utilized; they are good for above 400W peak load and 350W+ sustained load. Why castrate the laptops at 250W? A nice way to cut cooling costs and push you toward the next HW refresh.
The fact that every single manufacturer has this limitation, or even less, tells me Nvidia/Intel didn't want manufacturers pushing their products. Nvidia didn't let Clevo make a 3080 Ti MXM card, and now even Clevo is 100% BGA.
At this point Nvidia/Intel control what and how their products get used.
It will be very interesting to see the AMD versions of the same laptops.
-
8 hours ago, serpro69 said:
Oh really, is it going to be that soon? I'm regularly checking for new announcements, but have not seen these dates anywhere. It will be very nice if this is true.
Some manufacturers, yes, but not sure about AW.
-
5 hours ago, Kataphract said:
I have had an engineering sample myself for some time, and with good enough noise-cancelling headphones, it can sustain 150W+ for 30 minutes in my unit, never dropping below 33K in Cinebench R23 :-).
It's a thing of magic.
I'll have to ask when one can talk about everything. Might be the first time in a decade I find a laptop a viable alternative to a desktop for power users (well, noise-cancelling-headphone-wearing power users; both my dog and son were scared when the test suite ran...)
As far as I know, the embargo has already been lifted. It's nVidia's GPUs holding people back from discussing these laptops, but CPU-wise it's been lifted.
I will be curious if OC works and whether it can sustain the OC.
Hoping XTU or TS helps keep clocks up under load and thermals are the only limiting factor, but we will see.
-
Jarrod'sTech on Twitter: "Not bad, for a laptop https://t.co/uFM1tZKww4" / Twitter
I'm a bit disappointed he hit us with the "it's an engineering sample so I didn't test power consumption, temps, boost time, etc."
The true difference between an enthusiast and one who isn't.
I would have tested all of that. We know by now that the "engineering sample" in that system is likely a QS (qualification sample), which is 99% representative of real-world performance. If it were some ES from the early stages, I'd understand.
Can't wait to put systems of this level through their paces myself 😄
-
6 hours ago, Reciever said:
Total 250w? 😞
The 250W total is already a thing on current 12th-gen/30-series laptops.
The CPU by itself can hit full speed, and the GPU can always hit full speed, but the CPU gets limited to a 75W TDP when the GPU is at full power (see the sketch below).
All the RTX 4090 + i9 HX CPU systems will have this 250W limit or less.
It's unfortunate that dual-PSU systems are gone. We will have to wait until they're released to see how they really perform.
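To make the shared budget concrete, here is an illustrative sketch. The numbers are the ones quoted in this thread, not from any official spec sheet, and the function name is mine:

# Numbers from this thread, not an official spec; cpu_power_cap is a made-up name.
TOTAL_BUDGET_W = 250  # combined CPU+GPU crossload limit discussed here
GPU_MAX_W = 175       # max mobile TGP per the thread
CPU_MAX_W = 157       # nominal CPU max TDP mentioned elsewhere in the thread

def cpu_power_cap(gpu_draw_w: float) -> float:
    """Illustrative only: the CPU gets whatever is left of the shared budget."""
    return min(CPU_MAX_W, TOTAL_BUDGET_W - gpu_draw_w)

assert cpu_power_cap(GPU_MAX_W) == 75.0  # GPU maxed: the 175 + 75 split everyone quotes
assert cpu_power_cap(0.0) == 157.0       # GPU idle: the CPU keeps its full TDP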
-
On 1/15/2023 at 4:35 PM, srs2236 said:
What I don't understand is why hasn't anyone made an eGPU setup through one of the P870 MXM slots... No major PCIe bottleneck there.
For M.2 and Thunderbolt, nobody should ever bother with it unless it's some extreme edge case where it actually makes sense, but 99% of the time it won't.
(Coming from someone who put his heart and soul in trying to make EGPU setups be actually worthwhile)
There's an MXM 3.0b to PCIe 3.0 x16 adapter that exists in China. One of our members in the AW Club has one. He ran a 6900 XT in an M18xR2 if I remember correctly.
BTW, does anyone know if the i9 10980HK LGA version from China works in the X170SM-G? I think @ViktorV is using one in a P870TM or something.
EDIT: Just realized it's 8 cores lol, 10900K or KF it is.
-
6 hours ago, 1610ftw said:
Intel and Nvidia want to sell. Obviously Nvidia can set a max TGP, and Intel will not change their chips for manufacturers, but I doubt that Intel will dictate to manufacturers that their CPUs can only get 75W in mixed usage. It makes Intel look bad when the GPU gets max power and their CPUs are operated at much less power than they need, so why would they want that? In any case, I cannot see Intel as the bad guy here, insofar as they obviously allow up to 200W for their CPU. Not like Nvidia, who don't care how good your cooling is: even if you are below 60 degrees at max power, you cannot supply their mobile cards with more than 175W, which is ridiculous, especially when desktop GPUs can be supplied with power way beyond their very generous TGP.
Yes, things used to be better. I have a GT83 here right now that pulls about 400W, I believe, in a Time Spy run with GTX 1070 SLI; it has two 240W power supplies, and the bigger version with the 8950HK and 2x GTX 1080 pulled up to around 500W IIRC. I would guess that all the later SLI designs, and then the first single CPU designs with the 9900K or 10900K, could all sustain at least 300W and often more; one 330W power adapter was not enough back then.
About the current 250W: which manufacturer is the third one? Do you have a link? I have only seen MSI openly mentioning it; Alienware definitely is not very vocal about it.
Do you have a 12800HX/12900HX system with an RTX 3070 Ti or better GPU? As far as I know, this whole 'total TDP' thing has already been a thing since 12th gen/30 series. If you have one, run Prime95 on the CPU and MSI Kombustor on the GPU and watch the TDP of both chips. They probably won't run at full power.
My 12700H/RTX 3060 has this limitation, where one chip (the Intel CPU in this case) gets less power to allow max GPU performance.
Go to the stress test section: MSI Titan GT77 12UHS Laptop Review: Alder Lake-HX poster child with unhindered desktop-class performance - NotebookCheck.net Reviews
"The Titan GT77 uses MSI OverBoost for a combined 250 W load (75 W CPU + 175 W GPU) from both the CPU and the GPU depending on the scenario"
Nothing new is happening with the new systems; this has been a thing this whole time.
This is the same thing that will happen with the new systems. A 250W total system power means that if the CPU is at 100% load and the GPU is at 0% load, the CPU will still be able to use all of its power. It wouldn't make sense for Intel to limit the 13980HX's power in CPU-bound situations to less than the 12900HX's, especially with 8 additional cores.
The first set of laptops can be ordered February 1st and releases February 8th, so just a few more weeks before tests can be performed.
You can bet I will be doing my own tests when I get my system.
Alienware openly states the 250W total power in their press materials (link in first post).
I can't find the info on the third manufacturer right now, but if I remember correctly it was Razer's system.
-
8 minutes ago, 1610ftw said:
The predecessors 12800HX and 12900HX, with the same exact nominal max TDP, could easily go up to 200W and more; I think I tested up to 220W.
You are right about 175 + 75: they all max out the GPU while limiting the CPU to less than half of its nominal max TDP and less than 40% of its usable max TDP of 200W.
As for Intel deciding this, it seems to me more like the gentleman's agreement between the German car manufacturers Audi, BMW, Mercedes, and VW, who limit their high-performance cars to a top speed of 250 km/h; but then we will never know exactly if that was the case here. In any case, 250W just isn't enough to do justice to both CPU and GPU when they are rated at 175 + 157 for a total of 332W, which by the way is very close to the combined TDP that Clevo gave for the X170 (325W).
The predecessors also have 8 fewer cores. These CPUs can be OC'd, so I assume you can exceed the TDP in CPU-only loads. The 75W limit comes in when GPU load goes up, which is how it is right now too.
There are already 3 manufacturers (I think 5 total, but 3 confirmed for sure) that have this same 250W total TDP for CPU/GPU. This is something Intel/nVidia most likely came up with and enforce on manufacturers.
nVidia didn't want Clevo to make a 3080 Ti MXM for the X170TM-G; I wouldn't be surprised if they don't allow higher TDPs on these systems either, with agreement from Intel.
Yes, it is stupid to set a 250W total TDP. My Area-51m can easily exceed 400W combined on the 9900K/2080 and supports 660W thanks to 2x 330W.
Even my old M18xR2 can easily exceed 330W with an XM CPU OC and SLI GPUs.
From what I saw, not a single laptop at CES has dual power adapter support. They will all probably be limited to 330W total system power.
-
^Exactly this. The Intel CPU in mobile has a 157W max TDP; however, it looks like they are targeting less, or only reach that under CPU-only load.
The GPU will be 175W max, and again, with combined power it looks like the GPU will probably do 175W and the CPU will do 75W.
-
2 hours ago, 1610ftw said:
Yes I know, the 51m R1 was the last Alienware with 4 SODIMM slots; that seems to be a thing of the past now.
I need more than 64GB now, and I would not want to wait around for 64GB DIMMs to MAYBE pop up and then never arrive.
I am also not at all impressed with the shared TDP of 250W between GPU and CPU as per Ultrabookreview; it looks like such a heavy system with only two DIMM slots and a vapor chamber should be able to do better than an MSI GT77 or GE78 that weigh about 2 lbs less:
https://www.ultrabookreview.com/60886-alienware-m18-m16/
On the Intel side, expect up to Intel Core i9-13980HX + RTX 4090 16GB configurations.
Alienware didn’t go into specific details on the power settings applied to the components here, and only mentioned up to 250W of crossload CPU + GPU power. That should translate into up to 175W TGP on the 4090/4080 models, with the rest going to the CPU, and I’d be surprised to see a higher-power GPU given what we already know from the other brands. Similar settings should be possible on both the m16 and the m18, but with arguably better thermals on the m18, due to its larger chassis.
Just so you know, every single laptop at CES 2023 with an RTX 4090 + i9 13980HX has a combined 250W TDP for the system. This is most likely a limitation set by Intel/nVidia.
-
4 hours ago, Maxware79 said:
They are aware of the complaints about the 300nits compared to other systems. I'm not sure if they will be changing anything in the near future though.
Are they aware of the sound system concerns vs. the x16 as well? The m18 better have a sub if they want to call it an 18" =D
Personally, I can accept a 300-nit screen if it's a high-quality screen with 100% AdobeRGB color space, deep blacks, and good contrast (refresh rate too).
I personally would be more critical of the sound system. Most high-end displays on laptops are 500 nits, some 600 nits; HDR is a different story.
-
"Intel EVO Laptops"?
WHY are they doing this to themselves? The EVO laptops with the 1260P do not have any special hardware that my 12700H doesn't have, as far as I know.
Unless one of y'all has a link to the software that should work on any PC running Windows 11?
AW Connect I tried once, and it was cool, but nowhere near the speed/integration of macOS + iPhone.
Hoping this has improved things quite a bit.
-
Personally I went with a Vita. The OLED display is very nice. Not to mention I can now play any PSP game as well.
It's more expensive, yes, but it has a lot of software support. I can play my PC games remotely using the Moonlight app.
The PSP was a really good portable back in the day. I used to have so many mods and homebrew for it. I still have one somewhere, but it doesn't turn on.
-
I see it two ways:
1) It really is 300 nits typical brightness and the peak isn't accounted for (I honestly prefer measuring this way), so we are going to have a display that isn't that bright (which is weird, because there are other 18" panels that are 240Hz and higher brightness).
2) It is a typo and we will be shocked (doubtful on this).
I'm fairly confident this is the 2560x1600 panel:
-
6 hours ago, 1610ftw said:
It would be nice to get those 64GB memory sticks but I will not bet the house on it 🙂
Alienware did away with 4 slots after the first 51m, so I would be very surprised if they returned.
For my work I prefer 4 SSD slots, 4 memory slots, and RGB lighting on the keyboard in order to color-code certain shortcuts; currently only MSI produces such a machine, and sadly, with the new GT77 starting in February, they will drop the 4th SSD slot.
Wow, I didn't realize the Area-51m R2 only has 2 SODIMM slots. I can do 128GB on my R1 (I have 64GB right now).
My X58 board was supposed to do 12GB originally; then it did 24GB thanks to 4GB DIMMs, and now I'm at 48GB thanks to 8GB DIMMs. It can happen with DDR5 as well. I remember it being mentioned somewhere when DDR5 was announced that we would be getting higher-density RAM.
Clevo did have an 18.4" laptop, the X8100. Their first all-BGA laptop, the X270, never came out as far as I know. Most of us went to Clevo for their upgradable laptops; now that no one offers that, it's down to which laptop offers everything one wants.
-
Just to copy/paste in the event that post ever gets deleted or something:
Further reading of experiences around the AGA in this and other forums made me try the plug-and-unplug technique proven to work by some users with the 3000 series (https://www.dell.com/community/Alienware/PCI-e-3-0-x4-for-ampere-3080-3090-question/m-p/7894720#M43691); however, I had no luck until I did the following:
- Uninstalled AWCC using Revo Uninstaller (an advanced scan and deleting the found entries is mandatory after uninstalling)
- Restarted the system
- Uninstalled the Nvidia drivers
- Restarted the system
- Installed the Chipset INF utility
- Restarted the system
- Located the following subkey in regedit: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\PnP\Pci
- Generated the HackFlags entry with a value data of 200 as depicted here (https://learn.microsoft.com/en-us/troubleshoot/windows-client/deployment/error-attach-pci-express-expansion-chassis)
- Reinstalled the Nvidia drivers using the latest package
- Restarted the system
- The card was still being detected as "Microsoft Basic Display Adapter", so I uninstalled the Nvidia drivers
- Restarted the system
- Generated the HackFlags entry with a value data of 400 as previously described (this value worked for me)
- Reinstalled the Nvidia drivers
- Restarted the system
- After restarting, the card was outputting video and recognized as an NVIDIA GeForce RTX 4070 Ti
- Reinstalled the latest AWCC
- Restarted the system
- After restarting, the card and the AGA were also recognized in AWCC
Important notes:
- The HackFlags entry never worked for me until I uninstalled the NVIDIA drivers each time I modified the subkey; for this reason I suggest systematically testing the 200, 400, and 600 values, always uninstalling and reinstalling (see the sketch below).
- Apparently AWCC can interfere with the process, since I tried everything depicted above and it did not work until AWCC was uninstalled.
- The AGA and card were connected to my laptop the whole time.
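For reference, a minimal sketch of that registry step in Python (doing it in code is my choice; regedit, as described in the Microsoft article, works just as well). It needs an elevated (administrator) prompt, and remember to uninstall the Nvidia drivers before changing the value and reinstall them after, exactly as the steps above describe:

import winreg  # Python standard library; Windows only

# Create/update the HackFlags DWORD from the Microsoft article linked above.
# The post's 200/400/600 values are as typed into regedit, which defaults to hexadecimal.
KEY_PATH = r"SYSTEM\CurrentControlSet\Control\PnP\Pci"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "HackFlags", 0, winreg.REG_DWORD, 0x400)  # 0x400 worked in the post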
-
The fact that Clevo decided not to enter the 18" market like everyone else shows how far they have fallen.
The m18 will be one of the best options for sure. They used the thicker m chassis instead of the lightweight x chassis.
I am still looking forward to more info on the m18, as I want to know about the speakers too.
-
Try a trial version of PowerDVD (if they still offer one) and see if that helps. It might be the drive going out, but you can also try one of those cleaning discs.
-
41 minutes ago, entangledinchaos said:
Question for Azther: what BIOS version are you running?
I just purchased the adapter from a member on here and tried 2 different BIOSes with it; it's still not being seen by Windows 10, and there's no way to check in the BIOS. I'm going to check the 2nd MXM slot with a graphics card and make sure the MXM slot is functioning properly. Besides that, it's kind of screwed, I think. I've tried both a Toshiba RD400 1TB drive and a Samsung 128GB SATA M.2 drive, and no luck with either. Did your adapter come with a screw? I've had issues with the screw not threading properly. I've tried 2 separate M.2 screws and it's loose as hell, and it didn't come with a screw since I bought it used. Wonder if it's stripped out.
All of us are running this BIOS here:
It is the last update for M18xR2.
Make sure that in the BIOS you are set to SG mode.
Make sure UEFI is enabled. (You may want to try booting with a USB install of Windows and turning off the legacy option ROM to see if you can install onto an NVMe drive.)
Drives I have personally tested on that same adapter:
Samsung 980 PRO 1TB
Samsung 970 Evo Plus 1TB
SK Hynix Gold P31 1TB
I definitely recommend trying a GPU in the 2nd slot to make sure it works.
BTW, SATA M.2 drives do not work. It has to be a true NVMe M.2 drive, not SATA.
-
1 hour ago, jaybee83 said:
An all-BGA Clevo X370? More info on that please, so I can properly vomit and mourn.... 😅
https://www.clevo.com.tw/product/product_content/1/30
@razor0601 my brand new Area-51m with i9-9900K & RTX 2080 was $3,840, but I got a decent discount with my 4-year Complete Care warranty.
I expect the high end to be around $4k.
-
8 hours ago, 1610ftw said:
It should be great - except if you need 128GB memory. Where else are we supposed to get that if not in the biggest DTR of this generation?
Looks like the MSI GT77 and its Creator variant will remain the only high performance options with 4 memory slots but they don't have an 18" screen...
We will probably get 64GB SODIMMs, like how we got higher-capacity DDR3 and DDR4 modules later on.
Who knows, maybe the spec sheet is wrong about the DIMM slots too, like it was for the webcam lol.
-
13 hours ago, electrosoft said:
Fn+D on power-on should technically reset the CMOS/BIOS, but sometimes you will need to remove the CMOS battery and the main battery.
When doing memory tuning I tend to remove both, as I know I'm going to be doing a lot of note-taking and experiencing a lot of blank screens requiring a reset. Then all you have to do is unplug the power cable(s) to auto-reset the unit.
Did you loosen up the timings too? It can definitely be that the kit just can't go any higher than 2667, but make sure to relax your timings and set your Vmem to at least ~1.35V.
I tried auto timings at 1.35V for 2800, 2933, and 3200, and also tried 22-22-22 timings at 2933 and 3200; none boot.
So far I am at 2666.
Screenshots attached: stock 8GB 3200MHz RAM; 32GB (2x16GB) DDR4-2400 stock; 2666MHz auto timings; 2666MHz my timings.
-
RTX 4000 mobile series officially released.
in Tech News
https://www.notebookcheck.net/NVIDIA-GeForce-RTX-4090-laptop-GPU-first-impressions-pit-flagship-graphics-card-against-GeForce-RTX-3080-Ti.683346.0.html
This is straight-up following nVidia's guidelines on how to show performance.
We don't care about DLSS 2 vs. DLSS 3 and the performance improvement over the 3080 Ti.
We want to see the direct raw performance difference between the 3080 Ti and the 4090.