NotebookTalk

Talon

Member
  • Posts: 580
  • Joined
  • Last visited
  • Days Won: 10

Everything posted by Talon

  1. I'll sell you one. Barely used, in my second rig, has an MC warranty.
  2. Yes, I’m aware of the combined/crossload power limits on these laptops. I have one. Got a link to that whooping you’re talking about? I’d be curious to see it. At 1080p low settings to drive FPS in BF2042, I was seeing over 300 FPS on my 13900HX and 4080 laptop. My 4090 likely does the same, given it’s almost entirely CPU limited in that situation with the GPU pulling nowhere near max TGP. The crossload is under 240W CPU+GPU. Swap to 4K and you’ll be GPU TGP limited and nowhere near CPU limited.
  3. If you have an MC nearby, your best bet is to go into the store, grab a kit or ask the sales rep to see the kit you're interested in, and look at the model number on the box. G.Skill identifies A-die, M-die, or Samsung dies right on the box.
  4. No, that GPU score is so high because it's shunt modded (see the shunt sketch below this list). It has absolutely nothing to do with stock power at all. All 4090 laptops will happily sit at 175W TGP in games/benchmarks, which is the max allowed from any vendor (except Tongfang chassis that use a built-in shunt). This is the BEST RTX 4090 laptop (shunted of course) and i9-13900HX. It curb stomps that score. Legion 7i Pro Gen 8 beast mode. https://www.3dmark.com/spy/38610285
  5. Nice! Any way to get that CPU score up? It seems pretty low and is holding back your overall score hard.
  6. Unfortunately no, FSR and DLSS do not look the same. FSR might look similar in still images, but when compared in motion, where it counts, FSR 2 quickly falls apart next to DLSS. In fact, HWU did a comparison recently where DLSS won out in every single matchup. There are instances where FSR doesn't look terrible, but it never looks better than DLSS. As someone who games at 4K, DLSS 2/3 has been amazing. DLAA in particular renders at native resolution and then uses the AI model to apply vastly superior AA, so the image looks far better than native plus any conventional AA ever could. 4K DLAA and Frame Gen in Diablo 4 is amazing.

     You mention the GPP, but that was quickly slammed by tech media and just as quickly cancelled, so I'm not quite sure why we're bringing that up. More importantly, just because one company has pulled shady dealings in the past, should we then give another company permission to also proceed with shady dealings? Or is that only OK for AMD? Bottom line: allegedly paying a developer to prevent use of a competitor's technology is 100% wrong and goes against the "being open" summer of love AMD preaches. AMD should learn to compete through their own superior technology and innovation. Attempting to stifle innovation and progress because you don't have an answer is disgusting and not good for any consumer.

     Also, Intel XeSS is hardware agnostic; it works on any GPU, but it does work slightly better on Intel GPUs, which have hardware accelerators it can take advantage of. That is something AMD should have done from the beginning but never included on their consumer GPU line. That is AMD's fault, their downfall in this case, and why they are "open": they can't exactly beat DLSS, so they went for an "open" approach. Problem is, that only works if your product is as good as or better than the competition's, which it isn't. And AMD has another problem: market share. Not only does Nvidia have the market share, they have the better product.

     Then there is the fanboy argument that DLSS only works on "certain GPUs". DLSS works on any RTX GPU, from a lowly laptop RTX 2050 all the way up to an RTX 4090. That is five years of GPUs, as Nvidia has largely only released RTX cards in that time frame. That is a very large upgrade window, and if you're gaming on a GPU older than five years, well, I'm sorry, you're likely in a small pool. So this argument is stale and outdated in 2023 when looking at Nvidia's market share. With AMD holding less than 20% of the market, we can safely assume a huge percentage of gamers would benefit from DLSS being included over FSR. Not that I am arguing for that, though. Instead, all three should be present, as they are quickly becoming little more than a plugin to add once one is in. A modder recently added DLSS and Frame Generation to both The Last of Us and Star Wars Jedi in just a few days to a week of work with limited tools and resources, and that is a single person. I've used both and they work flawlessly. Arguing that devs with huge teams and resources can't do this in a day or two is just silly and quickly proven wrong. Sorry, there is no defending anti-consumer, anti-competitive behavior, no matter what team you root for.
  7. Next-gen GPUs will be Gen 5. x8 Gen 5 has the same bandwidth as x16 Gen 4 (see the PCIe sketch below this list). Not a chance it gets saturated by next gen.
  8. Nice pick! The Z690 Apex is complete garbage, unfortunately. The Z790 Apex is a whole other thing, a total monster board. Luckily we're getting another gen of LGA1700 toys to play with, hopefully with godlike IMCs, and the Apex should be able to push those ridiculous memory clocks.
  9. I loved that card. I had a golden sample for sure and should never have sold it. It should have lived out its days on the shelf next to me. Honestly a truly awesome card. It guzzled power, but ran very cool thanks to its huge heatsink and vapor chamber. And EVGA knew where to put the power connector.
  10. Behold the power of this 1.3lb handheld Asus Steamdeck lol.
  11. I haven't tested those drivers out on the laptop yet. I had major issues with the new Nvidia driver on my desktop when using dual 4K 160Hz screens. One screen would flash black on and off, and I heard the hotfix driver still leaves the issue unresolved, so I rolled back and haven't bothered trying again. Actually, this OS is installed on an external NVMe via Thunderbolt as a TEST, and even running from an external drive the performance is incredible compared to regular Windows 11. I'm going to clean install to my regular internal NVMe after I verify everything is working as intended.
  12. https://www.3dmark.com/spy/39116906 https://www.3dmark.com/3dm/95305503 Finally using a non-cancer edition of Windows 11. CPU score is where it should be now. @Mr. Fox I used your suggested version. Thanks!
  13. Absolute garbage filth that belongs in a laptop. It means you can't use these new cards in 'old' motherboards when you want to upgrade. Better throw out your old motherboard from last gen and get a shiny new board with this useless adapter. Oh, your board burned and killed both the board and GPU? Better buy new again. Sorry, that old revision doesn't work with this new GPU; gotta buy a new motherboard to buy a new GPU. I swear, if this shit happens, I'm done with PC gaming/hardware as a hobby. I'm not buying a desktop-tower-sized BGA laptop.
  14. Asus BIOS engineer showing Steve how to actually measure voltage.
  15. Not my highest TS CPU score, but Windows 11 is terrible for consistency with this laptop. I can get around 20K CPU, but it's almost random; something is eating up cycles/performance in the background. Might need to swap to Atlas10 for real benching. Either way, for a less-than-6lb laptop this thing absolutely rips. Fans on auto, on a cooling pad; it can cool a lot better, but I'm testing with auto fans. My ears are sick of max-blast fans these days lol.
  16. Is the SlimQ 330W adapter with the Lenovo tip significantly smaller/lighter than the 330W adapter the laptop ships with? I'm looking to purchase one.
  17. Yes, and definitely disconnect the battery before swapping RAM because of that. I actually just tested 6400 MT/s CL44 but got a no-POST. The laptop auto-recovered to 5600 CL40, which are the Kingston DIMM defaults. Nice to see some protections built in. I'll stick with 6000 CL36 for a while before I decide to push 6200 CL40 or similar (see the latency sketch below this list).
  18. Trust me, I feel it too. I thought I got an amazing deal at $2500 after tax and a 3-year accidental warranty extension with coupons and cashback. I do wish I had the 4090 version now that I love the laptop. The 13900HX is such a beast of a mobile chip, and that MC deal on the 4090 would have sealed the deal. Oh well, I'll wait for next gen lol, and a 4090 Ti with 24GB VRAM. Something else: the memory controller on my 13900HX is cherry. My unit runs 6000 CL36 at 1.1V stable with default SA and, obviously, default MC voltage. The Kingstons are A-dies and absolute units. In something like Warzone II, I am beating 4090 laptops lol with that setup.
  19. They look amazing. The good thing with G.Skill is you can verify A-die without opening the box. 96GB in 2 sticks is just wild. I assume they also have to be a dual-rank setup. Could definitely be fun to play with. Just a year or so ago they were scalping us so hard for $400+ 32GB kits; now we're getting 96GB. Won't need memory anytime soon with that kit.
  20. I am not at home right now to check, but I think it was lower than my 8000 CL36 SR setup, though not by much. I think I was around 117K across the board and latency was around 52ns (see the memory bandwidth sketch below this list). Oddly enough, gaming performance was the same or better in some cases, which I expect is due to the dual ranks. But I wanted a 64GB setup, so for me the tradeoff was worth it. I grabbed the G.Skill 6000 CL30s and they are rockstars. They do that 7466 CL34 at just 1.45V VDD/VDDQ on the Apex board. Obviously A-dies. I'm sure they can go further with proper voltage and cooling, but cooling dual ranks is definitely more difficult. Need water IMO. https://www.microcenter.com/product/651260/gskill-trident-z5-rgb-series-64gb-(2-x-32gb)-ddr5-6000-pc5-48000-cl30-dual-channel-desktop-memory-kit-f5-6000j3040g32gx2-tz5rk-black For the price, they provide a similar experience with double the memory capacity, so it was a no-brainer over the $300 8000 kits.
  21. Curious to see how far you can push them. Are they dual rank or single? I've got my dual-rank setup at 7466 CL34, 64GB of DDR5. I love it.
  22. Gigabyte boards now as well. Oh boy, this looks awesome. Will be curious to see the performance change after they fix/reduce the voltages. Gigabyte is scrubbing the internet of their older BIOSes as well now. Love that everyone was quick to blame Asus, though.
  23. Apparently NOT just Asus. MSI is also now restricting voltage to X3D CPUs. https://videocardz.com/newz/msis-new-bios-restricts-voltage-for-amd-ryzen-7000x3d-cpus Sounds like AMD rushed these out the door without proper validation.
  24. @Papusan Looks like MSI Afterburner author fixed the Maxwell voltage lock in latest release! https://www.guru3d.com/files-details/msi-afterburner-beta-download.html
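On the shunt mod mentioned in post 4: the firmware caps what the current-sense circuit reports, not what the card actually pulls, which is why a shunted 4090 laptop can post scores far above stock. Here is a minimal sketch of the arithmetic, assuming an equal-value resistor soldered in parallel and a hypothetical 5 mOhm stock sense resistor (both values are illustrative, not pulled from any board schematic):

```python
# Why a shunt-modded GPU can exceed its 175W TGP while still reporting 175W.
# The controller infers current from the voltage drop across a sense resistor;
# a second resistor in parallel halves the effective resistance, so the sensed
# (reported) power is only a fraction of the real draw.
# The 5 mOhm stock value is an assumption for illustration.

def parallel(r1: float, r2: float) -> float:
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

R_STOCK = 0.005                     # assumed stock sense resistor (ohms)
R_MOD = parallel(R_STOCK, R_STOCK)  # equal-value shunt stacked on top

TGP_CAP = 175.0                     # W, the limit enforced on *reported* power
scale = R_MOD / R_STOCK             # 0.5 with an equal-value shunt
real_draw_at_cap = TGP_CAP / scale

print(f"reported: {TGP_CAP:.0f} W, real draw: {real_draw_at_cap:.0f} W")
# -> reported: 175 W, real draw: 350 W
```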
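The bandwidth claim in post 7 also checks out: per-lane throughput doubles each PCIe generation, so halving the lanes at Gen 5 lands exactly on Gen 4 x16. A quick sketch using the standard 128b/130b encoding:

```python
# x8 Gen 5 vs x16 Gen 4: the per-lane rate doubles each generation,
# so half the lanes at the next gen gives identical bandwidth.

GT_PER_LANE = {3: 8.0, 4: 16.0, 5: 32.0}   # giga-transfers/s per lane

def pcie_gb_s(gen: int, lanes: int) -> float:
    """Usable one-direction bandwidth in GB/s (128b/130b encoding)."""
    return GT_PER_LANE[gen] * lanes * (128 / 130) / 8

print(f"Gen 4 x16: {pcie_gb_s(4, 16):.1f} GB/s")   # ~31.5 GB/s
print(f"Gen 5 x8:  {pcie_gb_s(5, 8):.1f} GB/s")    # ~31.5 GB/s
```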
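For the kits in post 17, first-word latency in nanoseconds shows why 6000 CL36 is the sweet spot: it is tighter than both the 5600 CL40 fallback and the 6400 CL44 attempt that failed to POST. The formula is just CL divided by the real memory clock, which is half the transfer rate:

```python
# First-word CAS latency for the configurations mentioned in post 17.
# latency_ns = CL * 2000 / (MT/s), since the clock is half the transfer rate.

def cas_ns(mt_s: int, cl: int) -> float:
    return cl * 2000 / mt_s

for mt_s, cl in [(5600, 40), (6000, 36), (6200, 40), (6400, 44)]:
    print(f"{mt_s} MT/s CL{cl}: {cas_ns(mt_s, cl):.2f} ns")
# 5600 CL40: 14.29 ns | 6000 CL36: 12.00 ns | 6200 CL40: 12.90 ns | 6400 CL44: 13.75 ns
```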
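Finally, the ~117K figures quoted in post 20 are right in line with theory. A DDR5 DIMM transfers 8 bytes per beat (two 32-bit subchannels), so dual channel at 7466 MT/s works out to roughly 119.5 GB/s peak:

```python
# Theoretical peak bandwidth for the dual-channel 7466 MT/s setup in post 20.
# Each DDR5 DIMM moves 8 bytes per transfer (2 x 32-bit subchannels).

def ddr5_peak_mb_s(mt_s: int, channels: int = 2) -> int:
    return mt_s * 8 * channels   # MT/s * bytes/transfer = MB/s

print(f"{ddr5_peak_mb_s(7466):,} MB/s theoretical peak")
# -> 119,456 MB/s; the ~117K quoted above is ~98% of peak
```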