NotebookTalk

Leaderboard

Popular Content

Showing content with the highest reputation since 11/09/2025 in all areas

  1. Hey everyone! Still rocking the same system. My 5090 has been strong! Hard to think it's already been 8 months since I got it. 6 months of that has been shunted for 1200w. It has been getting wrung out hard though. I essentially lost interest in talking about this hobby and have just been using it for work/gaming daily. 😆Something has pulled me back in to talking about the hobby though. I suppose I needed a break from the daily PC discussion lol. On another note, tonight I flashed my Acer Predator 4090 18" laptop to some random unverified Asus 4090 Laptop bios from techpowerup, and this thing picked up tons more performance. I am able to pull 175w steady in games; I could never do that before, usually 150-160w. So, I checked my bios, and apparently mine was the 1215MHz clock / 1590 boost. And the bios I flashed to now is a 1590 clock / 2040 boost. My GPU-Z said mine was a 175W bios, so that's why I never tried this before. So, I flashed out of curiosity. And it's just blowing the doors off of the prior bios that was on this Acer Predator. I am not really sure why. But I can run another +250MHz on the core. It is really odd. Anyways, hello everyone!
    7 points
  2. Regarding the 5090, I still have some Telegram bots reporting new drops and prices; yesterday there was a Founders 5090 drop from Nvidia directly on their website for 2099 Euro. Got the notification and went on the site. Got one in the shopping cart, but after like 5 minutes of thinking about whether I need it, I decided to close the website. Every game I play, like Battlefield 6 or Stalker 2, works perfectly on the 4090. So it's just a waste of money for me. Also went back to the website almost 40 min later and they were still available; tried to put one in the cart again and it went through. I was one click away from ordering it but closed it again. So it looks like the market is saturated a little. Also got this baby to play with. Nice looking card😃. An oldie but still packing a punch with those 32GB of HBM2 memory for larger AI models. Will try some bigger Qwen 2.5 and Llama models on it
    6 points
  3. Sold my Z890 Apex and 285K to @win32asmguy. Paid for and shipped to them this afternoon.
    6 points
  4. A little preview of what I'm building next: Sage 621e
    6 points
  5. So I was going to pick up the open box 5080, but it was gone when I refreshed the page. So I decided on the 9070XT Red Devil from Amazon at $650...but its delivery window is Dec 21 to Jan 23rd (lol). I would have preferred a 5090, but I don't want to spend that kind of money right now. My mindset in regards to the 5070 Ti and 5080 is simple: I know a refresh is coming (one way or another) and I don't want to buy a card that is going to be supplanted for the same price or a bit higher within a year of ownership. I know with the 9070XT, replacements aren't coming out until 2027, and when they do, they are going to be a new generation. I also picked up a 9950X3D from Newegg, but I don't know if I am going to keep the order. I have a 7600X and it should be adequate for video editing. Although, for the first time ever, I am going with a power usage approach versus max performance...because where I live, I have to subsidize others through our power bill. The 9950X3D ECO mode does interest me. Anyway, random rant post over. Hahaha.
    5 points
  6. @jaybee83 It's a GV100. It's a loaner from a friend, so I will just use it for some experiments with LLMs for a while 😀. Don't want to break it as it's still going for good money on eBay.🫣
    5 points
  7. I haven't uploaded a video in a long time. Now that I finally found stable settings I wanted to save a permanent record for myself and help anyone else that feels like they are spinning their wheels. Moving from Intel to AMD overclocking is more difficult than moving from Windoze to Linux. Maybe (just maybe) there is something in here that would be useful to Brothers @jaybee83 or @Raiderman
    5 points
  8. Very nice for your Astral LC card! Hilarious that Asus can magically release the XOC vBIOS after the Matrix release made their entire 5090 line (except the white Astral) complete garbage overnight lol. Good to see them realize most will never flash this vBIOS and that it's only the very enthusiast community, so the risk is tiny.
    5 points
  9. What a coincidence with the leak of the Astral 5090 XOC. Within a week after the Matrix bios got released for everyone, a bios that almost none of the owners of Asus cards could use. Hmm, I don't believe in Santa bro @Mr. Fox. I'm too old for that. What else could it be? A$us trying to avoid further bad publicity for the brand's overpriced GPUs? Hmmm. I'm sure I know the answer. Same with you😎 https://www.overclock.net/posts/29533017/ Edit. Yup, Asus had to do something.... The Matrix BIOS can even be flashed to "HP OEM" lmao. Binning hundreds of Astrals to find one that clocks the highest in Valley, then ASUS ruins it. https://www.overclock.net/posts/29533316/
    5 points
  10. That Matrix GPU is only good so others can use the vBIOS. The price is extremely stupid and I think it looks stupid, too. 🤣 I have gone pee in my own backyard before. When you have a family of 7 and 3 bathrooms, there are times when you just can't hold it any longer. Now that we are down to 2 people at home (empty nesters) and 2 bathrooms I have not found myself needing to do that for a long time.
    5 points
  11. Hopefully this was not a waste of sand. But, I will find out as soon as I button things up. If it's not better than what I already have it's going back for a refund.
    5 points
  12. Nice and clean even with the extra pipes; the distribution plate puts in some good work. I got my network into its final form. Full 10Gb backbone to my study and living room with 2.5Gb connections. DAC cable to the 8 port switch and an SFP+ to Ethernet adapter for 10Gb for my uplink to upstairs. I'm getting a revised version of this holder so I can cable tidy this properly.
    5 points
  13. I finally took the time to get the X870E-E Strix memory on water like the Apex. First time in quite a long time that I ran the memory with air cooling and I am so glad to not have to use a fan for it any more. I really did not like that. Water is so much better, not to mention looking a whole lot better as well. That 120mm fan hanging on a bracket from the top radiator was pretty ugly. It will take a day or two to work all of the air out of the distro block. Now it's time to hit the sack.
    5 points
  14. The new Threadripper CPUs are so awesome; I've been doing some reading on those lately. They have improved IPC so much that I do not think it'd even be worth picking up a prior gen Threadripper 7960/7970/7980. This is how the new chips stack up in Cinebench. TR 9960X CB R23: 59,000 stock. TR 9970X CB R23: 77,000 stock. TR 9980X CB R23: 113,000 stock. (People with custom loops are overclocking 9980X 64/128 chips to 5.5GHz on all cores, which is just pure insanity to imagine how crazy and over the top beastly that is.) Apparently the 9970X is the sweet spot. Now here's where it gets interesting. These chips can run fast DDR5. People were already pushing DDR5 6200-7000 with the older 7960X/7970X/7980X. The new Threadripper 9000s run DDR5 6400 natively and apparently the IMC is worlds better. And they can easily handle DDR5 7200 right out of the box. Tuning will probably net you closer to DDR5 8000 or more. What a shame DDR5 prices have gone haywire. Anyways, AMD Threadripper is sick.
    4 points
  15. So I ended up snagging a pair of V-Color 9600 2x24GB DDR5 sticks off of eBay, new, from a seller I've dealt with before. In today's market conditions, I couldn't beat the price of $489.99, plus I had two coupons for eBay (basically, "you haven't shopped us in a long time!" coupons) for $15 off and $10 off respectively. I was able to use both of them, knocking it down to $464.99, for a total of $495.80 shipped, and with eBay rewards, that knocks another $15 off of it, making it $480.92 to my door. Am I doing backflips? No, but much better than those G.Skill sticks that would have been $650.40 to my door.....still..... This is *exactly* why I try to avoid new platforms, because once I start it gets ugly real fast. 🤣 Remember 11th gen? I was just going to pick up an 11700K and the cheapest Z590 board and use my dirt cheap DDR4 sticks to "give it a whirl." Within 4 months, it was several binned 11900Ks, an Asus Z590 board and an EVGA Z590, along with multiple kits of DDR4.... I'd like to say I won't end up binning some 285K/290K chips, but that would most likely be a big ole fib....🤣
    4 points
  16. Don't get too excited, I sorta am just dipping my toes in half-assed on it lol. I went sort of a hybrid AIO setup based on someone's recommendation over at overclock.net. It allows for a real full block, and all parts are real DIY watercooling parts, but the pump/reservoir are contained in a combo deal on the radiator. Again, all reusable parts, at least the block, fittings, radiator, etc are reusable. Also you can add additional rads with pumps to expand the setup. I really just wanted something a bit easier to deal with, and will decide to go further if needed. https://www.ebay.com/itm/196057615070?var=495915734111 I went with this 360mm setup. I might add in an additional 120mm setup on the rear of the case to add more pressure through the system and cooling capacity eventually. https://www.amazon.com/dp/B002QWBGQ2?ref=ppx_yo2ov_dt_b_fed_asin_title -- Went with this fluid, again recommended. I am still trying to figure out how to setup a drain, and which quick disconnects I need/want to make it even easier to move parts around in the system. I assume I need a drain planned into the setup.
    4 points
  17. Nice to see you are watercooling. That's an expensive hobby on its own; at least the initial investment is. What cooling components are you gonna be using? What are you putting this in? Yeah, that sucks about DDR5. I have been ignoring this hobby completely, other than using my components, so I am not all that up to date on the latest. I'm telling you, a 5090 Ti X2 could have dropped and I would have had no idea. Looks like Intel released some of their new server chips though. They have these Xeon 6 chips on a new fabrication process, and they have an X3D-like cache as well. You can also pick from two product stacks this time around: a P-core-only version or an E-core-only version, in a server/workstation socket (LGA 4710). It looks to me like they are not giving up yet, and are taking their time before releasing the new mainstream products. https://www.provantage.com/intel-pk8072006347700~7ITEP9KR.htm
    4 points
  18. Dang, DDR4 actually is overpriced nowadays. I was able to get my 16x16GB of G.Skill 3200C14 B-die for around $500 back when I built the Dominus Extreme rig last year. Running 192GB, or 12x16GB sticks, at 3800 14-14-14-34 with 1.560v, I am running 2T for daily, which is stable. But oddly enough it can bench/game with all of this RAM at 1T. Blistering fast though. And the latency is killer good at around 55ns. I love my DDR4. 💯 I think the main downside is people do not realize how much power high capacity DDR4 uses at this voltage. Running something like TM5 pulls like 140+ watts for the memory alone. I found the Clevo 4090 mobile bios in the techpowerup VGA database. And I just flashed it. My brightness controls work in SDR and HDR again now!
    4 points
  19. Well....you live by the open box gamble sword and die by the open box gamble sword and I finally crapped out for the first time in quite a while.... Amazon Asus Z890 Strix-A MB which was listed as "used, acceptable" was basically brand new and still had the original 503 launch BIOS on it but I did see a few thermal smudges, so I know somewhere, somehow someone tried to use it and probably got frustrated and sent it back where it sat on a shelf most likely for the last year+ Microcenter open box 265k is an SP77 and runs crazy cool even benching on an air cooler on top of the Asus box for now using an ID-Cooling Frozn A410 before it ends up in final boss form. Bonus? It qualified for the Intel game giveaway which Microcenter attached to the receipt. I'll spend the next week tuning it up and then give it a whirl with what will initially most likely be that Nitro+ at this point because..... Microcenter Asrock Challenger 9070xt was DOA right out of the box. Wouldn't boot on my Aorus B650i Ultra but it has no diagnostic lights so I used that as a reason to quickly assemble the 265k and popped the sticks out and got it up and running and tried it in there and the VGA bad light lit right up. As a final check, I popped it into the wife's system which has been running a 9070xt already no problems a few minutes ago and yep, VGA led light lit right up too. You would think they would check all their open box items before letting them go out the door. Lord knows they check 5090s but I guess 9070xt's don't get the same treatment. MC is over an hour away gas + $6 bridge toll and now I get to make that wonderful trip back to return it. They sent me a survey and you know I ripped them apart for several reasons on the survey they failed to pass. We were in there for well over an hour. Not one person asked us if we needed help but plenty were standing around. 18 min checkout turned into almost an hour and for some reason they took my license in the back and I don't know why..... 
That didn't happen before. Before the testing with TG contact frame installed: Wife's system giving me the final third strike "You suck Bro! Back to MC you go!" VGA diagnostic fail on the Challenger 9070xt. 😞 ----------------------------------------------------------------------------------- 12900k was a beast. 14900ks in WoW end game form (59/45/50) on the wife's system is beastly too, but even it can't handle the load in Dornogal when heavily populated, and fps dips down and utilization drops quickly even on her 9070xt at 1440p ultra settings, RT high. I'll have to watch her raid next time to see how it's performing comparatively with all the new driver and WoW updates that have drastically increased 9070xt performance in WoW, from launch where it didn't work at all meaningfully, to working very well, to now routinely capping out her fps even in Tazavesh at 1440p Ultra/RT on a 165hz display. Ouch, I'm kicking myself for not picking up a few more of those..... Memory-wise, even your 12900ks probably has more in the tank, as my launch 12900k towards the end, when tested on some 8000 sticks on an Asrock Z790i Lightning, gave up the ghost at 7200. To really get Ultra to shine, you're probably going to need much faster memory, at least 8000, but with current pricing? Ugh... How are the temps on your laptops when gaming at those settings? What specs? Some of it might be that Acer is stupidly conservative with their hardware and BIOSes. This has been true for years now. They lock things down well under spec and really limit you in BIOS settings too. It's annoying. Dell and Asus give you a little bit more control, MSI is next, followed by Prema-enabled laptops. Glad to see you back in the forums bro! Hopefully the break was nice, but hardware-wise we're in that lull period atm.... I can understand the logic of wanting the top dog of a current generation. 
There's always that irrational fear of Nvidia introducing a 5090ti if the market had a sudden downturn but they have no competition at the top so that most likely will not happen. I think you will be pleased with the 9070xt. Red Devil is one of the top rated models for 9070xt's along with the Nitro+ and Taichi but uses traditional pcie connectors so that's a plus. I know on my wife's system performance at 1440p is excellent for WoW running full tilt and she'll be using that card for years to come (or at least towards the end of its 3yr warranty then maybe move on). I was going to either pick up a 9950X3D or an Epyc like @Mr. Fox, but decided the experimental Ultra adventures first. Luckily, CPU prices are stagnant and within the holiday window you always have the option of returning the 9950X3D if it isn't a good fit. Yeah, no way they're squeezing in that 600w+ fire breather in a laptop! Things are much closer on the lower end but by the time you hit 5070ti laptop vs 5070ti desktop it starts to get brutal and from there it's lights out especially at the 5090 level: 5070ti laptop vs desktop:
    4 points
  20. Here is my 4090 laptop with new bios. Hard to believe my desktop 5090 is so much faster than this. But this gaming experience out of this laptop is amazing. http://www.3dmark.com/sn/10199710
    4 points
  21. Nice! I'll have to take a looksy and see what I can extract and apply to my humble 9800X3D and Asus X870e Hero. 🤣 This is Asus. I highly doubt they binned anything ala 4090 Matrix...... Even as robust as the PCB and cooler design is, there is still the fragility of the 600w connector, even in the face of the BTF connector; while I'm sure on paper it can handle a lot of juice, why chance anything over 800w? They need their "limited edition" (same card I could have pre-ordered on launch day for well over 8hrs) Matrix cards to be problem free and run perfectly. Personally, if the BTF connector is so good, I'm not sure why they didn't split the load evenly at a minimum 400/400, or at the very least supply a 3-prong lead and cap the 12v2x6 to an official 450w and the BTF to 350w for an extra added layer of protection..... Exact same plan of attack, but the 800w is turning out to be the new Gigachad on steroids. I can see a low voltage card wanting to run the XOC though and getting access to all that 1.15v. If it can be dialed in / reeled back, it has its place, but for normal use, I still think the Matrix 800w is the way to go because it seems to be the special sauce on all fronts sans memory OC, unless severely voltage capped cards are purposely limited even when running the 800w MatrixMagic. Running XOC on 5090 vs 4090: I rarely approached 500w, let alone 600w+, with my 4090 even OC'd in most gaming scenarios, whereas OC'd my Vanguard routinely hits 630w and it clearly wants more. At 630w my voltage will hit ~1.090, leaving me a nice voltage chunk on the table that I'm sure the Matrix 800w will gladly slurp up with my 1.125v native cap. I'm not willing to take that leap till I have a nice layer of monitoring, buffer and protection in place. You must be ready and willing to know that 600w connector is going to be severely abused more often than not running either bios, but especially that XOC, whew..... I can see @Mr. 
Fox now when 1.15v popped up on his screen and his UPS started to howl going: I'll have to take a look at the video later. I usually like his outlook on market forecasting. This is why I'm waiting for my Wireview Pro 2 before I do anything.... As for Asus, they don't officially condone everyone using their vBIOS files even on their own cards. They are not going to legitimize it by compounding the issue and creating a version of the Matrix to run on their other cards. The objective is clear in their response. That is to NOT run ANY of the XOC/Matrix vBIOS files on ANY of their non-sanctioned cards..... that's pretty clear. Wow, they are retroactively adding it? I completely get it and blame the sophistication and sheer number of cheaters out there that somehow extract a sense of victory winning matches by cheating at all costs, but this truly is the only way to completely oversee every aspect of what is going on, because some of the cheats were near that level of low-level hardware access to circumvent previous measures. This is what forced EA's hand, because in the world of competitive real-player games, the sanctity of fair play overrides everything. I remember back in the Quake competitive days so many "good" or "great" players would never attend gaming LANs or cons, and those who did show their faces suddenly weren't anywhere near as good as they were online....imagine that. I remember one dude even trying to play isolated in a closed-door room with his clan and the judges were like "no, anybody can observe anybody" and they opted not to play and made up some BS excuse. I don't blame EA. I blame the cheaters basically forcing EA to implement their version of The Final Solution, and there is no circumventing this measure....yet. I took great joy in watching players being banned at an unprecedented rate in the history of EA competitive games and running to reddit to cry about it. As long as they don't start adding it to non-competitive games, I'm ok with it. 
If I truly wanted to play, I would just create a Windoze gaming only install stripped to the bone on its own drive and tuned to high heaven and keep it all on there and dual or triple boot to it as needed.
    4 points
  22. Another way to see it... But yeah, 5090s live their own lives. Same as with 24GB 4090s. NVIDIA restocks GeForce RTX 5090 Founders Edition in Europe https://videocardz.com/newz/nvidia-restocks-geforce-rtx-5090-founders-edition-in-europe Can't compare them. The 5090 ain't a 600W graphics card. And not many run 1000W on 4090s. Neither did many go above 660-700W for daily use with their 4090s. 800W ain't healthy for gaming all day long with a single tiny trash power connector. Edit. Why not make an 800W Matrix bios for Asus cards if they are so worried that people will use the 2001W XOC version? The cat is out of the bag. Just release it. Unofficial ASUS RTX 5090 BIOS Emerges With Extreme 2000W Power Target ASUS responded quickly after the BIOS began circulating. The company emphasized that installing unofficial firmware carries significant risks, including potential hardware damage and warranty loss. Because the BIOS bypasses the standard checks used to keep the GPU and its VRM within safe operating conditions, any attempt to use it in a normal system could cause immediate failures.
    4 points
  23. well dayum, looks like ill have two vbioses to play around with once my wireview pro II comes in 😄 so if i understand it correctly, the asus XOC vbios forces 1.15V at all times on the core? is there an option to lower it and have some control? otherwise id have to try and reign in the temps via power limiting but id prefer voltage tbh. in any case, nice to FINALLY have more options 🙂 is it just me or does it somehow feel like running the 5090 at 1kW+ is more dangerous than the 4090 at 1000W? with my previous 4090 suprim i just ran the 1000W XOC vbios as a daily without issues. and that was without any monitoring tool like the wireview i will have available now. sweet, ill save it up for future reference 🙂 want to get my DDR5 issues sorted first before i start properly tuning the CPU. but now with the Matrix and XOC vBioses out ill have to go back to GPU tuning too 😄 so much to do, so little time, argh! enjoy the bit of freetime u have now between jobs, im sure ull be in over ur head once the new position starts 🙂 keeping my fingers crossed for you on that btw!
    4 points
  24. If Steve is right you might be able to get a 5090 for a decent price. Let's hope it crashes real hard. I wouldn't mind my 5090 losing value if that means that I could buy a second 5090 extra cheap. If it is cheap enough I might even consider a 5080. Anything but an FE turd.
    4 points
  25. I am going to have to fire up the chiller tomorrow and play with the power slider to see if I can rein in some of that, because pulling 1700W from the wall is not sustainable. It will trip circuit breakers (and I have this on a 20A circuit) and exhausts the capacity of my 1500W UPS. If that does not work I will just go back to the Matrix vBIOS and use the EVC2. I'm not gonna lie though... I just LOVED seeing that 1.150V default. For now I just flipped my GPU over to using the stock Zotac 600W vBIOS. I think letting it run wild and free on the XOC vBIOS probably isn't a smart thing to do. It might be safe, but I don't want to just assume that and hope it turns out OK. Tempting fate to such an extreme degree (1700W from the wall, probably 1300W from the GPU) with the petite arson PCIe power cable should put the fear of God in all of us that possess an ordinary amount of common sense. Here is something interesting. I use a GPU overclocking program in Linux called LACT. It shows both core and hotspot temperature readings on the 5090. So, maybe the register is actually there and the NVIDIOTS gave Windoze software developers marching orders to not expose it. -?- I think they rank right up there with bad actors like crApple when it comes to being among the kings of command and control dictatorships. I received my Thermal Grizzly GPU die gaskets for liquid metal, so I really need to make time for that. The package comes with 3 for the 4090 and 3 for the 5090, so I can do both GPUs. I am expecting to get an offer on the job I want most on Monday or Tuesday, and I am sure that I am initially not going to have much discretionary time to do things until I get past my orientation, which usually takes a bit more time in c-suite positions.
    4 points
  26. Process Lasso is definitely the way to go if you have a dual CCD processor with 3D V-Cache. No need for Micro$lop XBOX feces bloatware and the extra chipset driver crap. Just select the EXE and choose the affinity and BOOM, every time you launch that game it uses the CCD with the cache. When you close the game it is back to using all cores and threads. Easy, peasy... no hokey pokey nonsense with flakey rubbish from the Redmond Reprobates.
    4 points
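The per-EXE affinity rule described above can be sketched in code. On Linux, the same pinning Process Lasso does on Windows is a single stdlib call. A minimal sketch, assuming (purely for illustration) that CCD0, the V-Cache die, maps to logical CPUs 0-15 on a 16-core dual-CCD part with SMT; the real mapping varies by chip, so check your topology with `lscpu` first:

```python
import os

# Assumption: CCD0 (the 3D V-Cache die) is logical CPUs 0-15.
# Verify on your own system before relying on this mapping.
CCD0_CPUS = set(range(16))

def pin_to_cache_ccd(pid: int = 0) -> set:
    """Restrict a process to the cache CCD (pid 0 = calling process)."""
    available = os.sched_getaffinity(pid)
    # Only use CPUs that actually exist here; fall back to all if none match.
    target = (CCD0_CPUS & available) or available
    os.sched_setaffinity(pid, target)
    return os.sched_getaffinity(pid)

def unpin(pid: int = 0) -> None:
    """Back to all cores and threads, like closing the game in Process Lasso."""
    os.sched_setaffinity(pid, range(os.cpu_count()))
```

Pinning on launch and unpinning on exit mirrors what Process Lasso's per-EXE rule automates: the game sees only the cache CCD while it runs, and everything else keeps the full core count.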
  27. OK. New problem. My CyberPower 1650W UPS does NOT like this XOC vBIOS, LOL. SCREAMING BLOODY MURDER. I will be using the Matrix vBIOS on my "quiet" vBIOS position for normal daily driver usage and gaming. I launched Cyberpunk 2077 and all the sirens began blaring on my shunted 5090. I do not even need the EVC2 now because the XOC vBIOS gives me the voltage I was setting manually using the EVC2, LOL. I will test with benchmarks to confirm, but I may move the EVC2 over to the 4090 Suprim if I no longer need it on the 5090. Check the voltage and power now. Nice.
    4 points
  28. A few of you might know I was put on unpaid furlough on Friday, along with about 8 or 10 other people where I worked, including my daughter and the company owner's nephew, who has worked there for 15 years. I had my second job interview for an opportunity that looks really promising, and I expect it to be an even better job than what I had (which was great in every respect until Friday). I have never been unemployed before, but I am hoping to get an offer on this next week. So, please keep me in your thoughts and prayers. I hope all of you in the US celebrate a great Thanksgiving with your loved ones. We all have so very much to be thankful for, even when there are days that it doesn't seem like it.
    4 points
  29. Hmmm. A$us prefer the cheap way. The more you pay the more they save. Nvidia offer their 5090 FE cards at MSRP ($2000) and they still come with a rubber die guard. I expect Asus preferred to use paste as a die guard to save sub-$1 in build cost. Isn't that cheapo? Or is it the classic greed?
    4 points
  30. Biggest problem to me is the lack of a standard for the BTF connectors. Hyte goes where the market goes, as do all case makers. If BTF is properly standardized and gains traction, they will go there, but it has been swirling around for years now going nowhere so..... And that makes the Matrix and Astral BTF editions problematic, because it suddenly limits your choice of motherboards if you want to switch/upgrade. I don't care for that. ----------------------------- Ahhh, much better. Here is a good video from BBT where he pits max-tuned (memory and CPU) 14900KS vs 285k vs 9800X3D vs 9950X3D in a proper shootout. The overall consensus is that the 9800X3D is still king. Shines at 1080p. Loses a little steam at 1440p and really loses steam at 4k, to the point that all tuned chips are pretty much the same and the 285k actually comes out on top as the GPU becomes the cap. Like I said, the 285k gets a bad rap, but properly tuned and paired with what is gobsmackingly expensive RAM now, it can hold its own and then some. Also the newest edition of Intel's APO is actually valid now and works as intended when it can take advantage of its optimizations. For X3D, it comes down to the games being played/tested and the resolution too, and how that affects the cache and saturation. Even Jufes, who beats on AMD regularly, said if all he played was racing sims or flight sims, hell yeah, his main rig would be X3D as it dominates in those types of games. I still feel to this day the 14900KS, when tuned properly, provides a smoother, more steady gameplay experience in WoW for handling the lows, but I also play at 4k maxed out, so that subjects the X3D cache to more saturation than 1440p and especially 1080p. I actually logged into my WoW account on the wife's system the other day and played for a few hours at 1440p, then set internal render to 4k for the next few hours, and the gameplay, even on her 9070xt, is just smoother, especially in raids. 
The 4k evidence from BBT's video just makes me want to build out a 285k even more at this point for 4k testing. I said it before, but if a decent 285k sample pops up on the forums, I'll probably pick it up and build out a cheap rig and just use these 8400 Kingspec sticks and see where they can go on it. Overall, when tuned ALL the top end chips provide a good gaming experience.
    4 points
  31. Damn, scammers now running scams through BB. That is just unreal, I hate people lol.
    4 points
  32. Awesome. I am looking forward to seeing how it runs, brother. I got up early to delid the CPU. It always amazes me how much better that makes things. I increased the core clock 100MHz across all 16 cores and my max temperature is still over 20°C cooler. I hate owning CPUs that are not delidded. The experience sucks with the IHS. So, I gave up on trying to use Windoze worthless trash software bloat to determine whether or not 3DvCache actually worked. Micro$lop butchers or botches up everything they get their hands on. I used Process Lasso to set parameters on the EXE file to force 3DvCache CCD0 use or all-core use with both CCDs. As you can see below, it DOES make a difference. All Core (no 3DvCache) CCD0 only (3DvCache only) - I used Process Lasso for this instead of Windoze trash
    4 points
  33. But we also know when rallied they can make a difference. Look at the de-prioritization of 8GB 5060/9060 cards due to lack of sales and more focus on the 16GB models. Look at prices falling (for now) for sub-5090 cards. Look how the market shifted away from Intel to AMD. Many of those decisions (and more) were consciously made by consumers, even normies. If only they could stop shrugging their shoulders and locking onto RGB and aesthetically pleasing things and focus on the hardware and capability/functionality, but I don't think we're ever going to be there because most end users don't care about any of that. They just want to plug in and get going, and if problems arise they track down peeps that are like us in this thread. The A$$zeus license comment made me lol btw..... 🤣 ----------------------- @jaybee83 confirmed working on the Suprim Air by two users who have flashed it and it's working as intended. So far, it appears users with higher-voltage cards whose ranges were being held down by the 600w limit are seeing increased voltage to go along with their newfound 200w of power. On the other hand, if your card is mid or lower and already bouncing off that vmax as is at 600w, you won't suddenly open up more voltage, but that 200w will still give you a nice, healthy kick. List so far of cards not working with it: Asus Astral 5090 Black, Asus Astral 5090 LC Black, Asus TUF, Suprim Liquid, 5090 FE (duh). Ironically, the revamped design of the white Astral 5090 does work with it.....so clearly not done out of maliciousness but a byproduct of the fan design. Haven't seen the Zotac 5090 AIO tested along with a few other cards yet, but the vast majority are able to run it. Someone at Asus won't walk right after they are separated from their keister for letting this slip out as is and not differentiating power detection properly.
    4 points
  34. Definitely a fan controller / lack-of-headers issue, same as the Aorus Waterforce. The Vanguard and Suprim are basically the same PCB. I would be highly surprised (and disappointed) if it didn't work on the Suprim. This is my plan as my order is in too, and I plan on opening up a can of 800w whup arse at that time. I'll probably also migrate to a modern PSU, but we'll see. I suspect some at Asus have steam coming out of their ears over this..... Nice! So far it seems that if you have a proper 3-fan designed 5090 PCB AND it isn't named Asus, you can run the Matrix 800w vBIOS..... Yup, many disgruntled Astral owners over the XoC debacle, sky-high pricing and now this? So glad I returned my Astral 5090 for many reasons, and now just add this one on top of that list. You bought an Asus 5090 LC this time around. Will you be purchasing Asus next gen @Papusan? I don't think Asus would patch the BIOS to their own customers' detriment, but then again they did proactively stop their motherboards from retaining RGB settings so you needed to run their software with a BIOS update soooo........yeah.
    4 points
  35. I've been watching the insane increase in DDR5 prices. I don't need any, but decided to see if I could find some spares cheap to hang onto. Just scored a Trident Z5 DDR5-8000 48GB CL40 1.400V kit with Supercool direct touch cooler on eBay for $250. That same RAM kit is now about $300 sold and shipped by NewEgg. So, basically like getting the Supercool cooler kit for free, give or take a couple of bucks.
    4 points
  36. Intel Core Ultra 290K, 270K and 250K Plus spec leak: “Arrow Lake Refresh” with higher clocks, more cores and faster memory support
So, a real "nothing burger" when all is said and done. The most intelligent thing AMD has going for it is long-term socket viability. Intel needs to learn from this now and back-track on some of their recent very stupid ideas (bring back hyperthreading and go back to monolithic design without the stupid Atom cores). It is hurting them more than they know. Their ending of hyperthreading was the final nail in the coffin for me. But their constant socket revisions requiring new motherboards after a few CPU generations is a compelling reason for people to choose AMD even if they would rather stay with Intel. (I am conflicted because I can't say that I like either one now. They both have pros and cons, but both have more cons than pros.) The short socket lifespan is a very damning attribute for Intel now that they no longer operate from a place of absolute supremacy and domination. They used to expect us to put up with it as going with the territory, which was fine when they still ruled the world. I'd really like to make my next build an Intel platform again. The overclocking experience (CPU and memory) is much better on Intel, but the lack of hyperthreading and abbreviated upgrade path due to frequent socket changes is making me question the logic of it. Once upon a time, not long ago, that was the only somewhat logical basis for choosing to go with AMD instead of Intel, and that alone was never enough because there were too many other compromises attached to the idea. Intel should revive X299 and make modern versions of CPUs for that platform's LGA 2066 socket... 32 or 36 threads at 6.0GHz+ all-core overclock, quad-channel DDR5 8000+ and 40+ PCIe lanes... I'd be all over that, like white on rice... hell yeah, in a heartbeat. And, I'd be willing to pay twice as much as a Core Ultra (or Ryzen) flagship CPU to have it. 
The most fun I have ever had with overclocking was on X299. 100% (double stock clock) overclocking wasn't that difficult.
    4 points
  37. The fact that they used basically the best memory option, 6000 CL30, for the AMD part, but then ran far below Intel's optimal spec is enough for me to call it biased. Even on a mainstream 4-DIMM board, most can achieve around 7600 CL32-34. Hell, even a 7000 CL30 2x32GB dual-rank kit would be a better option, as it offers the gaming performance of an 8000 SR kit on Intel. Unfortunately, this type of testing is mainstream and how most outlets test. It is what it is; I've just learned to ignore most of it.
    4 points
  38. Same, a function to import old BIOS settings would be nice. I've just gotten used to screenshotting my settings at this point and then manually re-entering them. Doesn't take too long (~10 min tops, if that) but still a nuisance. Hmmmm, I'll have to give this a try next time on the Z690 D4 Strix w/ SP109 14900KS and X870E Hero.... Didn't even know they basically offered the 9950X3D in Epyc form. Looks like the few reviews and lookups say it is better binned than the 9950X3D, and the price is now down to basically 9950X3D levels. Tempted to try one myself! Unfortunately, cheaters and cheat suppliers have gotten so good that EA has implemented their version of "The Final Solution" and it works. It wipes out any and all chance of cheats sitting in any memory space without being detected, but it requires you to give up just about every aspect of protection on your system to do it. It is insane watching cheaters lose their minds in threads because they are being banned left and right trying to run updated, paid-for (!) cheats that have always worked, getting them permanently banned on a hardware level. Just like with the Switch 2, buyers have to be wary buying second-hand hardware that could be banned from the gaming networks or games they want to play. I am not a fan of the level of access that is required to play some of the newest competitive games, but I also understand why in such a hyper-competitive environment. When I used to play competitive Quake, the sheer number of cheaters was ridiculous, and it was always refreshing at gaming cons to see many of those same players suddenly get "less good" real quick while under observation during matches. The only way I would entertain this level of intrusion to play would be a separate, games-only install of Windows with zero Windows login or any other type of login anywhere, used exclusively for gaming on its own drive, and I'm not willing to go to that level yet. 
Next WoW xpac, Blizzard is basically taking things to the next level and locking out mods and assists in what appears to be an attempt to tighten security. Remember, Blizzard via Activision was purchased for almost $70 billion by Micro$oft. The Midnight xpac is supposed to be another major overhaul, so we will see. ----------------- I like Testing Games to a degree, but the fact they still test the 14900K unoptimized at 6000 memory vs. the X3D running the classic sweet-spot 6000 just doesn't work for me. The way they run their DDR5, my tuned DDR4 B-die setup with the 14900KS will decimate their results each and every time, let alone when I was running it tuned w/ 8400 DDR5..... -------------------- In the same vein of Socket 1700 and BF, if the 12-core Bartlett-S drops and is tunable, I'm still definitely interested in it too, depending on where we can go with it. I'd pick up an extra board, slap these 8400 kingspecs in there and see where it goes....
    4 points
  39. Nifty, but my main gripe is still that in many instances I need to manually come up with new profiles whenever I update the BIOS version. C'mon now, how hard can it be to make BIOS profiles compatible with all new BIOS versions?!
    4 points
  40. Nice new feature in the ASUS BIOS: F3 "Save as CMOS file" saves all BIOS profiles to USB as a single file, instead of a separate CMO file for the current profile only. Explanation from safedisk: https://www.overclock.net/posts/29527571/ @jaybee83 @Raiderman https://www.overclock.net/posts/29527497/ Explains why the CMOS file is much larger than a CMO.
    4 points
  41. A coworker of mine was chatting with me while we were frantically working on onboarding a new client. The result of said conversation is that I now have a GTX 1080 Ti from EVGA. As such, my second Morpheus heatsink will now have a home. I might do an as-is test, then a cleaned-up test, then Morpheus. Like a complete idiot, I also damaged the Asus Sage 621E motherboard when trying to do a mock fitting. I didn't realize that the motherboard standoffs were preinstalled by the manufacturer. Shows how long I've gone without a case lol. I can't seem to find anyone willing to repair it, so I might go another route. Oh, and does anyone have any spare 13th Gen CPUs collecting dust? My coworker bought a 14th Gen used and it turns out it's defective. Sorry I haven't been online for a spell; onboarding clients requires 12-14 hour days, evidently.
    3 points
  42. I suspect the PSU is the problem. I would get a Lian Li Edge, ASRock, Seasonic, NZXT, be quiet! or Corsair PSU. Just don't get one that boasts about "silent" or zero-RPM fan modes, because they run very hot. I think that alone could compromise the 12V-2x6 cable integrity at the PSU side. I would steer clear of ASUS, MSI, Gigabyte and Cooler Master PSUs. Definitely go with a native ATX 3.1 PSU, not a 3.0 or merely "compatible" one; this rating relates to the 12V-2x6 cable specifically. This list is very useful (from the video): SPL's PSU Tier List | Google Sheet These are very good fans. They are the only ones I buy now. Best bang for the buck by far. I recommend them based on price and specs. They are very heavy. They make them in white also, for those that care. The three-pack includes a fan/ARGB hub. I have 12 in one build and 9 in the other one. MONTECH Metal PRO 12 ARGB 120mm PWM - High Static Pressure, Quiet RGB Fan with Fluid Dynamic Bearing, 3 in 1 Pack with 6X6 Fan Hub - Black
    3 points
  43. LMAO, you're toast! I also have the 9600 GSkillz 2x24GB set that was very briefly offered. They're Gear 4 XMP, which is annoying. But I just use the very tuned Apex profile for 8600 CL38. It's pretty epic, as it's tuned extremely well and no real FAFO needed. I set 1.4V IMC and 1.3V VCCSA, as auto is wildly high. Also, I'll be binning some 290Ks, at least a few. How's your 265K? Can it do 40x D2D? The chip I sent @win32asmguy is a 40x D2D chip.
    3 points
  44. Very close to the same performance as Windoze in spite of the emulation overhead. So close, in fact, that my scores are significantly higher than most people see in Windoze. Unfortunately, HWBOT is behind the times. They support stupid crap like smartphone and tablet benchmarks, but not Linux. Dummies. Cinebench R23 on Kubuntu Cinebench R20 on Kubuntu
    3 points
  45. It's all crazy. Have you checked DDR5 prices? They have tripled to quadrupled in the past 30-45 days. Insane. Used kits are selling for 50 to 100% more than what people paid for them. What would we do without these "precious" AI innovators effing things up for the rest of the world? We have them to thank for many of our man-made financial crises. Families are footing the bill for their retarded nonsense. Even the cost of electricity skyrocketing in some areas is directly or indirectly a result of their insane pursuit of this idiotically obsessive business endeavor. They should not get any special incentives or tax breaks from federal, state or local governments, and giving them any should be illegal. I am not against AI. I use it and see value in it. What I do not see value in is it becoming the most important thing, and the cost of living going up to subsidize its development. That needs to stop. Companies looking to develop it need to absorb 100% of the cost, even if it means they do no better than break even. If they can't do it that way and can't afford it without outside help, then they need to step aside and excuse themselves from participation. Screw the shareholders looking for instant financial gratification. It should suck to be them rather than becoming everyone else's problem.
    3 points
  46. https://hwbot.org/benchmarks/cinebench_-_r23_multi_core_with_benchmate/submissions/5931059
    3 points
  47. I suspect that it is something to do with the firmware/AGESA version. Try going back to an older BIOS version that was allowing you to run stable with your 8000 memory settings. If you already flashed one of those firmwares that says you cannot go back once flashed, you probably still can if you do the BIOS flashback from USB. That almost always works for me. Just remember to rename the file correctly before putting it on a USB thumb drive all by itself. You could even try it with the same BIOS version first, because it could be something that got corrupted in the firmware, and the best way of fixing that is doing the flashback feature manually instead of trying to reflash the normal way.
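The rename-and-copy step is what trips most people up, so here's a rough sketch of it in script form. The target filename `SX870EE.CAP` is purely a placeholder: every board expects its own name, which the board manual or ASUS's bundled BIOSRenamer tool will tell you.

```python
# Hedged sketch: stage a downloaded BIOS image at the root of a USB stick
# for USB BIOS FlashBack. The default expected_name below is a PLACEHOLDER;
# check your board's manual or ASUS's bundled BIOSRenamer for the real one.
import shutil
from pathlib import Path

def stage_flashback(bios_image: str, usb_root: str,
                    expected_name: str = "SX870EE.CAP") -> Path:
    """Copy the BIOS image to the root of the USB drive under the exact
    filename the board's FlashBack routine looks for."""
    src = Path(bios_image)
    if not src.is_file():
        raise FileNotFoundError(src)
    dest = Path(usb_root) / expected_name   # root of the drive, nothing nested
    shutil.copyfile(src, dest)
    return dest
```

The stick should be FAT32-formatted and ideally contain nothing else; then use the dedicated FlashBack USB port and hold the FlashBack button with the system powered off.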
    3 points
  48. One thing that I find difficult with overclocking Ryzen versus Intel is how fussy and sensitive they are in comparison. There are lots of settings that do not play nice together, and if one thing is just slightly off it is like a house of cards. Performance and stability can go from great to garbage if all of the stars are not aligned. The CPU overclocking on Ryzen is about as fickle and finicky as RAM tuning is on Intel, and RAM tuning is a tedious process on either platform. In a way it is fun, but can be the opposite of fun (a really huge pain in the butt) when finding the right lever to pull is elusive. When you factor in the wildcards of the silicon lottery (trashy CPU and memory quality samples are common) it becomes even harder to be pleased with results. The crazy memory overclocking results posted in places like overclock.net are fringe examples of the small winner count in the silicon lottery and are not representative of the norm. Over the past week or so I have moved around CPUs and memory between the Strix and Apex. The Strix CONSISTENTLY has higher Cinebench scores compared to the Apex regardless of the CPU sample (9950X SP118, 9950X SP119 and 4585PX SP120). The memory overclock limits are exactly the same on all three CPUs no matter which motherboard they are installed in. Nothing I have tested in terms of settings allows the Apex to beat the Strix X870E-E on Cinebench. EVER. This was equally true comparing Cinebench scores against the X870E AORUS Master. The Apex loses 100% of the time in Cinebench. Why? I wish I knew. Is it the idiotic PCIe lane allocation on the Apex? The 2-DIMM memory topology ASUS used on the Apex? Is it the DIMM.2 somehow dragging down the CPU? Don't have a clue. Whatever it is, Cinebench doesn't like it. As far as memory overclocking is concerned, either the Apex is hardly any better or both of my memory samples have reached their functionality threshold. Same threshold with four different CPUs. 
The Apex maxes out at 8200 stable with my G.SKILL 6000 C26 1.450V 32GB kit, and the Strix maxes out at 8000 stable with that kit. The Apex maxes out at 8400 with sloppy timings that give the same performance as 8000 with tight timings using the Kingbank 8400 C40 1.450V 48GB kit (which has both XMP and EXPO 8400 profiles). With that kit the Strix maxes out at 8000 with the same tight timings that work on the Apex. The Strix cannot boot 8400 stable, but the end result is the same memory performance on both systems. The latency and speeds follow the CPU and RAM kit to achieve identical results in each motherboard. This essentially makes the Strix the winner overall. The fact that it is roughly $300 cheaper, depending on where and when you buy it, makes it an absolute no-brainer. The Apex is a great motherboard, but it doesn't deliver based on price, or the fact that Cinebench scores are CONSISTENTLY 500-1000 points lower for any given CPU installed in the Apex. I captured this comparison of the small latency penalty of 3D V-Cache. I did not bother capturing screenshots of different motherboard, RAM and CPU combos because it always ends the same. The screenshots would not show anything different overall. The latency results follow the X3D and non-X3D CPU and memory combo regardless of what motherboard they are installed in. The G.SKILL 8000 C40 kit that I picked up with the Supercool Direct Touch RAM block is an XMP kit. It is virtually impossible to tune it stable at 8000 with all-manual tuning in either of the AM5 motherboards. If I use timings that are stable with either of my other two memory kits mentioned above, it refuses to boot and I have to clear CMOS. I cannot figure out why that is the case and it does not make any sense. Maybe the kit is rubbish, but I don't have an Intel motherboard to test it in to find out. 
However, I have experienced the exact same problem with XMP memory kits that were stable on Z790 motherboards but unusable on X670E and X870E motherboards, so I think it is something peculiar to XMP memory, even though it makes no sense that manual tuning doesn't work. I have been trying to get the memory latency on the 4585PX below 60ns and it is tough to do. This CPU is nothing more than a better-binned 9950X3D, and I think this one might be slightly better than average. While the memory and L1, L2 and L3 read/write/copy and latency values are a direct result of clock speed, I wondered if the 3D V-Cache was making the latency higher. It seems it is, but not by enough to be statistically relevant. I asked Google Gemini about it, and this is what the AI regurgitated for me. The "chef" analogy (below) is pretty good. I thought @Raiderman and @jaybee83 might find this particularly interesting.

Per Google Gemini - for whatever that is worth...

Question: Do AMD X3D CPUs have higher memory and L3 cache latency than the non-X3D processors?

Yes, technically they do. It is a paradox of the architecture: AMD X3D CPUs generally demonstrate higher latency for both L3 cache (in cycles) and system memory (in nanoseconds) compared to their non-X3D counterparts. However, they are faster in gaming because their massive cache capacity prevents the CPU from needing to access system memory in the first place. Here is the technical breakdown of why this happens and why it doesn't hurt gaming performance.

1. L3 Cache Latency: The "Stacking Tax"
The primary difference between an X3D chip (like the 7800X3D) and a non-X3D chip (like the 7700X) is the 3D V-Cache technology. AMD physically stacks a slice of SRAM on top of the processor die. This introduces a slight delay.
The Penalty: Accessing the 3D-stacked cache typically adds a ~4-cycle penalty.
The Clock Speed Factor: Latency is a product of cycles and clock speed (Time = Cycles / Frequency). 
Since X3D chips run at lower clock frequencies to manage heat, the absolute latency in nanoseconds is slightly higher than on non-X3D chips that boost 400-500MHz higher.

Feature          | Non-X3D (e.g., 7700X)  | X3D (e.g., 7800X3D)
L3 Cache Cycles  | Lower (faster)         | Higher (~4-cycle penalty)
Core Frequency   | Higher (~5.4+ GHz)     | Lower (~5.0 GHz)
Result           | Lowest raw L3 latency  | Slightly higher L3 latency

2. Memory Latency: The Benchmark Illusion
If you run a synthetic benchmark like AIDA64, an X3D CPU will almost always show higher memory latency (worse performance) than a non-X3D chip.
Reason 1: Clock Speed: Memory latency is sensitive to core frequency. Because X3D chips have lower peak clocks, the time it takes to request data from RAM and process the return is technically longer.
Reason 2: Testing Methodology: Synthetic benchmarks deliberately force the CPU to fetch data from system RAM (DDR5) to test the worst-case scenario. They bypass the cache to measure the "wire delay" to the memory sticks.
In this specific "worst-case" scenario, the non-X3D chip wins because it has faster clock speeds and no cache-stacking overhead.

3. The "Effective Latency" Paradox
If the latency is higher, why are X3D chips faster for gaming? The answer lies in hit rate.
Non-X3D: Fast access to L3, but the cache is small (32MB). The CPU frequently "misses" (can't find data) and must go to system RAM. Penalty: a trip to RAM takes ~60-70ns (an eternity in CPU time).
X3D: Slightly slower access to L3, but the cache is huge (96MB). The CPU almost never misses. Benefit: it stays within the L3 cache, which takes ~10-12ns.
The Math of "Effective" Latency: Even if the X3D cache is 10% slower to access, avoiding one trip to system RAM (which is 600% slower) saves massive amounts of time.
Analogy: Imagine you are a chef.
Non-X3D: You have a small fridge under your counter. You can grab ingredients instantly, but you frequently have to run to the grocery store (RAM) for missing items.
X3D: You have a giant walk-in pantry. 
It takes you 2 seconds longer to walk into the pantry than to the small fridge, but you almost never have to leave the building to go to the grocery store. You cook faster overall.

Summary
Raw Latency: Higher on X3D (due to lower clocks + stacking penalty).
Effective Latency: Lower on X3D (because you rarely hit the slow system RAM).
If your workload fits entirely inside the cache (like most games), X3D wins. If your workload is random and too large for even the X3D cache (like some scientific rendering), the non-X3D chip might win due to higher clock speeds.

The video below is relevant because it directly compares the older 7800X3D and 7700X, illustrating how the theoretical latency differences translate into actual gaming performance. Ryzen 7 7700X vs Ryzen 7 7800X3D - Any Difference Or Not?

TL;DR - See below:
Lesson #1: If you want to go with AM5 and care about Cinebench scores, buy an X870E-E Strix or X870E AORUS Master and save yourself $200-$300. If you are a lottery winner with an extremely rare golden-silicon Ryzen CPU sample and an extraordinarily superior memory kit, and you only care about synthetic memory benchmarks and nothing else matters, cough up the money for an Apex, then have fun watching paint dry overnight with Karhu speed tests that have no measurable bearing on normal, real-world performance results in anything other than Karhu.
Lesson #2: Unless you want to waste a lot of time and drive yourself insane, don't try to get an Intel XMP memory kit to work on AM5. Unless you get lucky somehow, you are going to hate yourself for not just buying an EXPO memory kit.
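Gemini's "effective latency" point is easy to sanity-check with the classic average-memory-access-time formula. The hit rates and latencies below are illustrative numbers loosely matching the figures quoted above (~10-12ns L3, ~60-70ns RAM, ~4 extra cycles at ~5 GHz ≈ 0.8ns), not measurements from my boards:

```python
# Back-of-envelope "effective latency" (average memory access time):
#   AMAT = hit_rate * L3_latency + (1 - hit_rate) * RAM_latency
# All numbers below are illustrative assumptions, not measured values.

def amat(hit_rate: float, l3_ns: float, ram_ns: float) -> float:
    return hit_rate * l3_ns + (1.0 - hit_rate) * ram_ns

# Non-X3D: faster L3 (10 ns) but a small 32MB cache -> more misses.
non_x3d = amat(hit_rate=0.90, l3_ns=10.0, ram_ns=65.0)
# X3D: ~4 extra cycles at ~5 GHz is ~0.8 ns slower L3, but 96MB -> fewer misses.
x3d = amat(hit_rate=0.97, l3_ns=10.8, ram_ns=67.0)

print(f"non-X3D effective latency: {non_x3d:.1f} ns")  # 15.5 ns
print(f"X3D effective latency:     {x3d:.1f} ns")      # 12.5 ns
```

Even with the L3 access itself slightly slower, the higher hit rate wins, which is exactly why the raw AIDA64 latency number and the gaming result point in opposite directions.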
    3 points