NotebookTalk

ssj92 — Member · 908 posts · Days Won: 6

Everything posted by ssj92

  1. If a 1-5 fps gain is worth it to you, then sure. But going from 180 W to 200 W on an RTX 2080 barely makes it any faster.
  2. I added some new pics from the reviewer's guide. If the i9-13980HX is really missing, then that is a disappointment.
  3. Flashing the Alienware RTX 2080 vBIOS from the Area-51m R1 will boost the TDP to 200 W. However, unless you are benchmarking, it probably won't be worth it.
  4. Yes it is. As a matter of fact, it will match a 3090 (at least the 150 W max-power variants with 25 W Dynamic Boost). The 4080 is faster than a 3090 Ti, and the 4090 mobile is an underclocked desktop 4080, so it's not crazy to see the 4090 mobile beat a desktop 3090.
  5. You would be bottlenecked by the x4 bus speed. Not to mention you would need to hack the drivers to enable SLI support (not sure it would work due to Optimus).
  6. My Area-51m R1 seems to be holding up really well. It has a rubberized surface. I know my M4800 & all previous Alienwares (M14x/M17x/M18x etc.) all have palm rests that become sticky, but newer laptops (AW 14, 17, 18) seem to be much better, and the newest gen (Area-51m) seems to be really good (mine is 4 years old and still feels perfect). We will have to wait and see how the newer m series holds up. My X series also seems to have a rubber surface, but it's too soon to judge it.
  7. It would make lots of sense if it were 2x 2280 & 2x 2230, but that would be disappointing. Right now you can do 2x 8 TB in 2280 and 2x 1 TB in 2230, for a total of 18 TB.
  8. I'll believe it when I see it. nVidia has been quite strict on TDPs. Clevo didn't even have a 200 W vBIOS for the RTX 2080 (you had to flash the AW vBIOS). If it's true, it will be good news for everyone, since you should still be able to flash a vBIOS with a programmer on other laptops.
  9. I have not, but a fellow AW member tested a P5200 in an M18x R2. It works once you solder on the BIOS chip and flash it.
  10. I tried messaging Crucial to see when I can order the 48 GB SO-DIMMs. About the 9 TB thing... it sounds like typical AW selling their "certified" SSDs from the Dell online store. Using 4x 8 TB NVMe SSDs should be no problem, but we won't know until someone tests it. All four ports should be 2280 size, I think.
  11. My RTX 3000 is from ADLINK; you will probably never find one online unless someone from a company sells it. The HP RTX series cards are available on eBay and quite cheap. Not sure if they will physically fit, as they are slightly larger than standard MXM 3.0b. There is a P4000 on eBay for $280 and a P5000 for $350. Prices have come down, and those are also great options.
  12. If upgrading from a 20-series to a 30-series GPU, the power cable is the same, correct?
  13. @1610ftw Looks like we are getting 48 GB x2 (96 GB) SO-DIMMs now? Crucial Now Has 24 GB & 48 GB DDR5 Memory Options For Desktop & Laptop PCs (wccftech.com)
  14. https://www.notebookcheck.net/NVIDIA-GeForce-RTX-4090-laptop-GPU-first-impressions-pit-flagship-graphics-card-against-GeForce-RTX-3080-Ti.683346.0.html This is straight-up following nVidia's guidelines on how to present performance. We don't care about DLSS 2 vs. 3 and the performance improvement over the 3080 Ti. We want to see direct raw performance between the 3080 Ti & the 4090.
  15. 3070 MSI Gaming Trio (I removed the whole top cover as I like to look at my cards), 3090 Kingpin & XC3, 2080 Ti PNY Blower, RX 6800 Reference. These are the most recent ones I tried.
  16. I can confirm I have tested the RTX 3070, 3090, and RX 6800 to work in the AGA. The 4000 series also works.
  17. The fact that every single manufacturer has this limitation, or even less, tells me nVidia/Intel didn't want manufacturers pushing their products. nVidia didn't let Clevo make a 3080 Ti MXM card, and now even Clevo is 100% BGA. At this point nVidia/Intel control what their products get used in and how. It will be very interesting to see the AMD versions of the same laptops.
  18. As far as I know, the embargo has already been lifted. It's nVidia's GPUs holding people back from discussing the laptops, but CPU-wise it's been lifted. I will be curious whether OC works and whether it can be sustained. Hoping XTU or TS helps keep clocks up under load and thermals are the only limiting factor, but we will see.
  19. Jarrod'sTech on Twitter: "Not bad, for a laptop https://t.co/uFM1tZKww4" / Twitter I'm a bit disappointed he hit us with the "it's an engineering sample so I didn't test power consumption, temps, boost time, etc." True difference between an enthusiast and someone who isn't. I would have tested all of that. We know by now that the "engineering sample" in that system is likely a QS (qualification sample), which is 99% representative of real-world performance. If it were some ES from the early stages, I'd understand. Can't wait to put systems of this level through their paces myself 😄
  20. The 250 W total is already a thing on current 12th-gen/30-series laptops. The CPU by itself can hit full speed, and the GPU can always hit full speed, but the CPU gets limited to a 75 W TDP when the GPU is at full power. All the RTX 4090 + i9 HX systems will have this 250 W limit or less. It's unfortunate that dual-PSU systems are gone. We will have to wait until they're released to see how they really perform.
  21. There's an MXM 3.0b to PCIe 3.0 x16 adapter that exists in China. One of our members in the AW Club has one; he ran a 6900 XT in an M18x R2 if I remember correctly. BTW, does anyone know if the i9-10980HK LGA version from China works in the X170SM-G? I think @ViktorV is using one in a P870TM or something. EDIT: Just realized it's 8 cores lol, 10900K or KF it is.
  22. Do you have a 12800HX/12900HX system with an RTX 3070 Ti or better GPU? As far as I know, this whole "total TDP" thing has already been a thing since 12th gen/30 series. If you have one, run Prime95 on the CPU and MSI Kombustor on the GPU and check the TDP of both chips; they probably won't run at full power. My 12700H/RTX 3060 has this limitation, where one chip (the Intel CPU in this case) gets less power to allow max GPU performance. Go to the stress test section: MSI Titan GT77 12UHS Laptop Review: Alder Lake-HX poster child with unhindered desktop-class performance - NotebookCheck.net Reviews — "The Titan GT77 uses MSI OverBoost for a combined 250 W load (75 W CPU + 175 W GPU) from both the CPU and the GPU depending on the scenario." Nothing new is happening with the new systems; this has been a thing this whole time, and the same thing will happen with them. 250 W total system power means that if the CPU is at 100% load and the GPU at 0%, the CPU can still use all of its power. It wouldn't make sense for Intel to limit 13980HX power in CPU-bound situations to less than the 12900HX, especially with 8 additional cores. The first set of laptops can be ordered Feb 1st and releases Feb 8th, so just a few more weeks before tests can be performed. You can bet I will be doing my own tests when I get my system. Alienware openly states the 250 W total power in their press materials (link in first post). I can't find the info on the third manufacturer right now, but if I remember correctly it was Razer's system.
  23. The predecessors also have 8 fewer cores. These CPUs can be OC'd, so I assume you can exceed the TDP in CPU-only loads. The 75 W limit kicks in when GPU load goes up, which is how it works right now too. There are already 3 manufacturers (I think 5 total, but 3 confirmed for sure) with this same 250 W total TDP for CPU/GPU. This is something Intel/nVidia probably came up with and enforce onto manufacturers. nVidia didn't want Clevo to make a 3080 Ti MXM for the X170TM-G, so I wouldn't doubt that they're also blocking higher TDPs on these systems, with Intel's agreement. Yes, it is stupid to impose a 250 W total TDP. My Area-51m can easily exceed 400 W combined on a 9900K/2080 and supports 660 W thanks to 2x 330 W PSUs. Even my old M18x R2 can easily exceed 330 W with an XM CPU OC and SLI GPUs. From what I saw, not a single laptop at CES has dual power adapter support. They will all probably be limited to 330 W total system power.
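The power-sharing behavior described in the last few posts (a shared 250 W budget where the GPU keeps priority and the CPU drops to a 75 W floor under combined load) can be sketched roughly as below. All wattages are illustrative numbers taken from the posts, not vendor specs, and the `split_power` helper is a hypothetical model, not how any vendor firmware actually works.

```python
# Rough model of a shared CPU+GPU power budget with GPU priority.
# Figures are assumptions from the discussion above (250 W total,
# 175 W GPU max, 75 W CPU floor); CPU_MAX_W is purely illustrative.

TOTAL_BUDGET_W = 250   # combined CPU+GPU limit
CPU_MAX_W = 157        # illustrative unrestricted CPU power limit
GPU_MAX_W = 175        # e.g. RTX 4090 mobile max, per the posts
CPU_FLOOR_W = 75       # CPU limit when the GPU takes its full share

def split_power(cpu_load: float, gpu_load: float) -> tuple[int, int]:
    """Return (cpu_watts, gpu_watts) for loads in [0.0, 1.0].

    The GPU draws up to its max; the CPU gets whatever budget
    remains, clamped between its 75 W floor and its own max.
    """
    gpu_w = min(GPU_MAX_W, round(GPU_MAX_W * gpu_load))
    remaining = TOTAL_BUDGET_W - gpu_w
    cpu_w = min(CPU_MAX_W, max(CPU_FLOOR_W, remaining))
    if cpu_load < 1.0:
        # only draw what the CPU load actually demands
        cpu_w = min(cpu_w, round(CPU_MAX_W * cpu_load))
    return cpu_w, gpu_w

# CPU-only load: the CPU keeps its full power limit
print(split_power(1.0, 0.0))   # (157, 0)
# Combined full load: CPU capped at 75 W, GPU at 175 W
print(split_power(1.0, 1.0))   # (75, 175)
```

This matches the point made in posts 20 and 22: the 250 W cap only bites under combined load; a CPU-only workload is unaffected.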
