NotebookTalk

electrosoft
Everything posted by electrosoft

  1. Switch to LM or PTM7950 for the CPU along with a dialed-in UV and you will be blowing past 30k on CB23. Your TS score is definitely better than a 10GB 3080 and at 3080 Ti level. For perspective, your laptop is performing slightly better than my wife's desktop with a Strix 3080 10GB and 12900K, and once you dial everything in and optimize the cooling, you're going to blow past it even more, especially on the CPU.
  2. However you look at it, it is a monstrous upgrade over your old laptop! Down the road, PTM7950 + tuning the voltages will net even more performance. Congrats!
  3. Congrats! I had a feeling in the end you were going to go with a laptop, as they really are your true love computing-wise. I love laptops just as much as desktops, so I get it. 🙂 Post up full specs on the system, a pic of it and a Timespy.
  4. It really is. I tried to make the math work to upgrade the wife's Strix 3080 10GB but as much as I "Beautiful Mind"'ed it, it just didn't make sense. 🙂 Yup, you called this out weeks ago. Even when working with SFF 13900KS builds last year, the logic was to enforce the limits before fine-tuning with a UV to work with a 280mm, and it was still hitting 41.8k no problem in CB23 with killer temps. I ended up enforcing ~400A with 300W/300W power limits, and that 13900KS ran so smooth on that 280mm in the small case and never throttled.
  5. I'll second this, @ryan . I also like that NZXT components are much easier to access (they're all basically off the shelf and usually their own brand) and they have good airflow. Along with checking "Facepoot" (lol), don't be afraid to check eBay; plenty of good deals to be had there too. I still recommend building your own from scratch to really dial in what you want and hand-pick your components, but if you want to go pre-built, NZXT is not a bad choice.
  6. It's scary how many have lost the ability to agree to disagree and remain amicable. Identity politics and extremism on both sides, hyperinflated by social media, make it rough waters out there, and I've seen lifelong friendships disintegrate over politics and/or religion. I have numerous friends and loved ones on both sides of the aisle, and watching some of their lifelong relationships with each other obliterated is disheartening. As for hardware, everybody has their own personal criteria and use cases. I do see your fervent argument, @Mr. Fox, about neutering the BIOS: Intel messed up big time and was overzealous with their CPUs in the need to compete, and clearly, as the data has gushed in, Intel sees a need to cap PL/amps because they were redlining their chips. I'm all for issuing BIOS updates that change the default profile to Intel's new recommended spec, which is a massive nerf (125W/188W? Wow). What I do not want to see is an AMD-style hardline voltage cap; let users OC as they see fit at their own risk. Imagine someone who dropped serious coin on binned chips and top-end motherboards like Encores and Tachyons specifically to overclock suddenly having to choose between being capped at the BIOS level or never updating their BIOS to retain full control over their chip. Lastly, as we enter the end phase of Ada, the 4090 reigns supreme, but for me this cycle's clear price:performance winner is the 4070 Super:
  7. Utilizing variable-length stalls/cycles versus virtual pipelined instructions is my guess. AKA reimagining hyperthreading for more efficiency, but we'll see.
  8. I would go with something like this in the here and now: No GPU: https://pcpartpicker.com/list/h7szrv w/ PNY 4080 Super: https://pcpartpicker.com/list/LcB8cH A complete 7800X3D build-out; all you need to do is add a GPU of your choice. Either a placeholder or actually pick up a 4080, and the total cost is the same as a 4080 laptop but with much more zip, power and expandability. Either way, the base system is as good as gaming gets with the 7800X3D, and it will be ready for Zen 5.
  9. I'll double down on this. AM5 or wait for Intel's Socket 1851. Socket 1700 is dead and AM4 is double dead (even if AMD keeps releasing silicon for it).
  10. Agreed. I think I recommended this to @ryan several months back: slowly build the system starting with the bones and work your way up depending on prices and what's on sale. This is exactly what I do when I start to map out systems to build. I slowly accumulate parts over several months depending on prices, as I love used/clearance items, and go from there. You can literally slap together a very competent AM5 system with 32-64GB of RAM and a 1TB SSD for killer prices right now and build off of that. CPU and GPU will almost always be the most expensive components.
  11. Here's a run on my returned Asus G18 w/ full power 4080 to give you a good idea where it stands graphically. You're spot on with the ~3080 desktop performance metric. This is stock, no OC on anything: One of our local Best Buys has the G18 18" 13980HX/4080 on closeout for $1,584.00, which is a killer price (especially tossing in 5% back), and it has basically the same performance (if not greater CPU performance, depending on silicon) as the 14th gen version. The 4090 variant of the same laptop gets you ~21.4k, so 3090+ power in a laptop: a ~16% uplift along with a jump in VRAM (quick math below). Crazy to think about a laptop packing what was the king desktop GPU's power just 2-3 years ago.
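For anyone checking that uplift claim, a quick back-of-envelope sketch in Python; the 4080 baseline here is implied by the quoted ~21.4k and ~16% figures, not read off the screenshot:

```python
# Back-of-envelope check of the quoted Timespy numbers; the 4080 baseline
# below is derived from the ~21.4k 4090 score and ~16% uplift, not measured.
score_4090 = 21_400
uplift = 0.16

implied_4080 = score_4090 / (1 + uplift)
print(f"implied 4080 Timespy graphics score: ~{implied_4080:,.0f}")  # ~18,448
```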
  12. Ya know, there's nothing wrong with preferring laptops. I know plenty of us go on and on about the "ultimate power" of a desktop, but if laptops are your forte and you can budget it, go wild and get a monster laptop. Sell the bulk of the laptops you have now for funding, keep the best one for portability, and go get a monster laptop. I love desktops, but I equally love laptops too. I always end up having 2-5 on hand at all times; even when I eventually downsize, they build themselves right back up again lol. If you can find a clearance/closeout 13th gen / 4090 laptop for a good price, that will net you ~3090 Ti level performance. Even a 7950HX / 4090 is no slouch. Don't ever waste your $$$ on a 4070 laptop; for Ada, it is 4080 and 4090, or seek out a last-gen model at a substantial discount.
  13. So I take it you don't want to build your own? Did you do some price comparisons on PCPartPicker to see what you could build for a comparable price? I'm not really feeling the 5800X on that platform, as even a 5800X3D will hamstring a 4090, let alone a 5090. I would price out and build an AM5 system in preparation for a 4090/5090-class card and slap a 4070 in there for now.
  14. As always, it comes down to desired resolution, details and framerate. If your objective is to play at any resolution or settings and you don't care about sub-60fps, you can get away with a much, much lower tier of hardware. Take Hogwarts for example. Everyone bemoans how taxing it can be, but if you are willing to compromise across all three criteria, it's manageable: I had it running on my daughter's P377SM-G at 1080p with a 4810MQ, 970M and 32GB of 1866 DDR3, and it was playable. Details were at low, but if you don't care about resolution, details or framerates, have at it. 🙂 My objective is always at least 120fps+ with the best detail level possible at 4K with no compromises, and that tends to dictate my hardware level. When I sit in front of my wife's desktop with a 12900K, 3080 10GB and a 32" 1440p 165Hz display, I can see and feel the difference from my setup immediately (frame-time math below). When she had my old 32" 60Hz HP 10-bit display, it was always nauseating to play on it after coming off my 144Hz G7, and I used that HP for over 5 years for gaming! 🤣 As for Linux, I'm an Ubuntu and now Mint kinda guy, and I keep Kali on thumb drives for other targeted work on the side. Nothing yet on the 5090 front, but I'm fully expecting $1,900-2,000 for launch MSRP.
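The refresh-rate part of that is easy to quantify; a minimal sketch of the frame-time arithmetic behind the 60Hz vs 144/165Hz difference:

```python
# Time budget per frame at each refresh rate mentioned above.
for hz in (60, 144, 165):
    print(f"{hz:>3} Hz -> {1000 / hz:4.1f} ms per frame")
# 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms, 165 Hz -> 6.1 ms
```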
  15. Stock Dell G15 13650HX / 4050 laptop.....fresh from Dell with the newest BIOS, off my storage shelf.... "I think something is wrong with the laptop you loaned me to play Hogwarts" "Why?" "The fans go crazy and it locks up or shuts down when I'm playing sometimes" I give it a basic Timespy run....yeah, there's a touch of overheating going on.....🤣 Pulling up to ~115W. Incoming PTM7950 and UV on this hotbox...
  16. Still crazy that MSI has seemingly jettisoned AMD 7000 GPU sales (sans the 7600 Mech), but I understand lol.... At least they're staying strong with AMD motherboards, though.
  17. Nice little boost there, Meaker, and it still shows why fully upgrade-enabled laptops have meaning and purpose.
  18. Agreed, mLED is such a step up from normal panels, but OLED is just that much more next-level, and the days of "OLED being too dim" are long gone. They are painfully bright, even to me. One reason I decided to give it a try: I gave my daughter an OLED Acer laptop last year to finally replace the Clevo P377SM-A she had for years for college (she was basically tethered and weighed down by it no matter what, and she needed to be able to get up and go). She has used it every day for hours on end, with no screen mitigation in place, for over 6 months, and when I was improving her cooling the other week I couldn't help but notice she had zero burn-in, even on the taskbar. She uses it easily 2-3hrs daily, and on some days 5-10hrs, and there was no burn-in anywhere. I'm sure burn-in is still an issue with OLED (even the newest stuff), but I don't know what magic is being worked on newer panels; she had none. I also have an Asus Vivobook 15.6" 1080p OLED laptop I'm using now, so we'll see how that one fares down the road. It has an Ultra 7 155H in it, and it has become my new field/projects/raspberry/bookstore/Starbucks laptop. The Samsung Odyssey G7 43" will replace our bedroom's 15-year-old 32" LED display. That one is so old it doesn't even support HDCP properly, so many Firestick offerings wouldn't run. I'm still going to be proactive with some very basic measures to thwart screen burn-in, just in case. Many games have static elements for HUDs and such, and gamers play for many hours, so we will see. I remember @tps3443 saying he spent plenty of hours gaming on his OLED and has had zero burn-in.
  19. I do actually use a Razer Deathadder V3 wired...... 🤣 I will never buy an Alienware PC or KB. The Alienware M18 isn't all that bad if I went with a modern large-body laptop and accepted that true DTRs are most likely never coming back. 😞 I completely understand. For my real work I still rely on my BenQ on my other desk, which doubles as my diagnostic/workstation display. I just roll my chair over there and use a Cherry MX keyboard and, actually, an EVGA mouse for real work. Even with all the modern "anti-burn" technology, I still set my taskbar to auto-hide and the screen saver to pop on after 60 seconds of non-use, to be safe (a scriptable version below). 🙂
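For anyone wanting to script that screen saver tweak instead of clicking through Settings, a minimal sketch in Python; this is just one way to do it, using the standard Control Panel\Desktop registry values on Windows (a re-logon may be needed before it takes effect):

```python
# Enable the screen saver and fire it after 60 seconds idle, the same
# burn-in mitigation described above. Windows-only; run as the current user.
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                    0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "ScreenSaveActive", 0, winreg.REG_SZ, "1")
    winreg.SetValueEx(key, "ScreenSaveTimeOut", 0, winreg.REG_SZ, "60")
```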
  20. Oh, so you know how it is then! I think they all use basically the same panel. 🙂 After sitting up playing Fallout 76 last night, the difference is pretty amazing. The color reproduction and brightness (especially explosions) are just sick. Plenty of times I just sat back going, "whoa...whoa..." This is the first display to finally check ALL my boxes for a gaming display:
      • 32"
      • 4K
      • 10-bit, color accurate
      • 144Hz+
      • True HDR
      • Excellent response time
      • OLED-level per-pixel dimming
      I had quickly adapted to the 43" Samsung, but it was mLED and 8-bit, which was a letdown coming off my 10-bit BenQ. The BenQ was IPS with no local dimming. I'm loving this setup with the AW and Razer Nommo Pro V2 speakers very much!
  21. Set up my Alienware AW3225QF OLED 32" today and I've been playing around with it for a while. I was previously using a 43" Samsung G7 144Hz 4K IPS 8-bit display for over a year (~13 months), and as expected, this thing crushes its picture quality. I would say this is like taking my favorite previous display, the 32" BenQ 10-bit 60Hz G-Sync, and giving it 240Hz with OLED-level blacks. Still not a fan of curved displays, but the curve is slight enough to not send me into fits. After going back and forth with HDR and testing, I actually prefer it off most of the time, including in some games. For example, Fallout 76 looks better with it off, but Starfield looks better with it on. For movies I prefer it on.
  22. Ack! Back to the previous BIOS you go..... Most normal users don't care about R23 or benching; all they care about is gaming and not crashing. If new voluntary limits get them stable, so be it. Other users (like us) will get under the hood, optimize our cooling as much as possible and then tune our chips to perform as well as possible in a day-to-day (D2D) config along with a benching config. This of course makes the 14900KS a bad buy for most consumers, and it has already dropped in price to ~$664 from all sellers. If I was in the market for just a 24-core chip, I would go snag the cheapest 13900K I could find, a super cheap board with good VRMs and dirt cheap 7200MHz memory, and save a ton of $$$$.
  23. As long as some stock settings are changed so Joe Gamer doesn't crash, but the change doesn't neuter or hinder those who want to get under the hood and have full access, I am all for it. Like @Talon has been saying, the stock, out-of-the-box settings are too aggressive, and this time redlining silicon that has already been pushed too far has had consequences. I ended up having to drive almost two hours to tune my cousin's system last week because it was overheating like crazy with a 14900KS a local shop (mistake #1) swapped in for his 12700K, paired with, I want to say, an ID-Cooling-level 240mm AIO; when he said it was crashing, they told him, "This is just how these new Intel chips run." When all was said and done, we ran to MC and picked up an EK Nucleus 360mm AIO and some case fans. I took my industry-sized vial of KPx over, and between a complete cleaning, proper airflow, replacing that POS AIO, the newest BIOS update and tuning his Asus Z690 Hero, it is now purring along properly. I also tuned his memory to 6400MHz while I was there (he was running it at 4800MHz JEDEC lol; rough bandwidth math below), but not too tight, as that would take days to certify to my standards. I also undervolted his 3080 Ti so it wouldn't sound like a jet engine taking off and promised to take it apart and clean it up next time. He skipped the 4000 series but plans on picking up a 5080 or 5090. I also cleaned up his W11 install. Everything is running right as rain while we game online, and he now owes me dinner and drinks. In case you're wondering, his MC-purchased 14900KS was reported as SP108, but I'm not sure about Z690 reporting, as even my 13900KS on the Z690 Strix kept coming back as an SP98 but, once tested on a Z790i Strix, came back as a proper SP115. I know his V/F curve had 6.2GHz at 1.468V, and once dialed in within 320W PL limits and with a small UV, it wasn't throttling at all. It looked to be a pretty decent sample. So why mention this? My cousin is typical of Joe User. He plays BF, FO, FN, WoW and MW. He also started playing Alan Wake, which sent his system into crashingville. A simple BIOS update and 14900KS swap-in had his system (which was already poorly optimized overall; the two case fans were atrocious) freaking out and overheating/crashing.
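For a rough sense of what the 4800 → 6400 jump alone is worth before touching timings, a back-of-envelope sketch of theoretical peak bandwidth, assuming a standard dual-channel DDR5 setup:

```python
# Theoretical peak bandwidth: MT/s x 8 bytes per channel x 2 channels.
def ddr5_peak_gbs(mt_per_s: int, channels: int = 2) -> float:
    return mt_per_s * 1e6 * channels * 8 / 1e9

jedec = ddr5_peak_gbs(4800)   # 76.8 GB/s
tuned = ddr5_peak_gbs(6400)   # 102.4 GB/s
print(f"{jedec:.1f} -> {tuned:.1f} GB/s (+{tuned / jedec - 1:.0%})")
```

Real-world gains will be smaller and depend heavily on timings, which is exactly why the tighter certification pass takes days.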
  24. It really comes down to the games played. With some newer titles being so graphically demanding, a decent CPU means the 4090 can hit a wall, but that hasn't been the case with WoW and FO76: stepping up from the 12900K to the 13900KS and 7800X3D saw tangible gains, and GPU load finally moved into the 90s most of the time, but I still can't peg it except in spots here and there, even at 4K. Starfield and Hogwarts definitely pegged it around 98%+ at 4K, though. I'll have to get in there and toggle my CCDs with the 4090 to see the gains in my known benchmark runs in FO76 and WoW. Just like with the 5800X3D, I suspect that with tuned RAM the non-X3D CCD will pull closer to the X3D CCD, but we will see. I just flip CCD1 off and on as needed for gaming; it only takes a few minutes to load the profile and reboot. I'd rather do that than have Process Lasso (sketch below) still not prevent some drifts or Game Bar nonsense.
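For reference, pinning a game to the X3D CCD's cores is essentially what Process Lasso does under the hood; a hypothetical sketch with psutil, assuming a dual-CCD X3D part where the V-cache CCD maps to logical CPUs 0-15 (check your own topology, and the process name is illustrative only):

```python
# Hypothetical: pin a running game to the V-cache CCD instead of toggling
# CCD1 in BIOS. The CPU range and process name are assumptions, not fact.
import psutil

X3D_CCD_CPUS = list(range(16))  # logical CPUs 0-15, SMT included (assumption)

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "Fallout76.exe":  # example target
        proc.cpu_affinity(X3D_CCD_CPUS)
        print(f"pinned PID {proc.pid} to CPUs 0-15")
```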