NotebookTalk

Popular Content

Showing content with the highest reputation since 10/25/2025 in all areas

  1. 5 points
  2. Off-Topic: The Phoenix Micro Center grand opening for VIP members is 11/5. My wife better hide my wallet. 🤣 It's an hour each direction from where I live, but at least it's not the 6-hour drive each direction to Tustin, CA.
    5 points
  3. For the GPU, I would say select whichever one costs less, especially if you are putting a waterblock on it. My Zotac 5090 Solid OC was a good buy compared to the other more expensive options that deliver nothing for the extra money. It is an excellent GPU. The air cooler on it was fantastic (unlike some of the other affordable brands/models). It ran freakishly cool for an air-cooled GPU. The only 50-series GPU I would recommend avoiding like the plague is an FE model. For overclocking potential, the most consistently good GPU silicon quality is probably an AORUS Master, but the cost vs. benefit isn't justified. I love overclocking more than anything else I do with a computer, and it is really the only reason computers matter to me at this point, but paying a WHOLE LOT more for a very small gain in GPU benchmark scores is just not a very intelligent decision.

I have owned the following X870E motherboards, listed in my order of preference:
X870E AORUS Master (best overall - only flaw is no way to disable WiFi/BT in BIOS)
X870E-E Strix (replacement for a second AORUS Master that arrived with shipping damage)
X870E Apex (returned the first for a refund, the second was junk, I am using #3)
X870E Carbon (returned for refund - good mobo, but no async BCLK and weird glitches)
X870E Taichi (my least favorite of all the AMD motherboards I have owned - hated it; the PCIe bifurcation was garbage and I did not care for the firmware)

I have only owned two ASRock motherboards and did not like either one. I had a B850 AORUS Elite that I used in a build for my granddaughters and it was excellent. The only criticism I had was that the PCIe slots below the GPU slot were X1, though using them did not drop the GPU to X8. This is unavoidable on anything below the X870E dual chipset due to a lack of PCIe lanes on a non-"E" AMD motherboard. PCIe X1 dramatically reduces NVMe speed... it makes an NVMe drive perform like a SATA SSD.
If you plan to insert anything in other PCIe slots in addition to your GPU on an AMD motherboard, the "E" version is an absolute must-have.

The only complaint I have with the Gigabyte boards is that there is no way to disable WiFi/BT in the BIOS. A super stupid flaw they could fix effortlessly if they cared. If you use WiFi/BT and use Windoze 11 as your main OS (I do not do either one) this truly is a non-issue, but it really pissed me off that Gigabyte did not provide that option in the BIOS. I asked twice and both times they said no... "you're the only person complaining about it" (essentially, we don't care what you want and you are not worth the minimal effort needed to make a BIOS as good as our competitors'). Gigabyte is the only brand I know of that omits this essential basic BIOS option.

The Strix was an accidental blessing. I purchased a second AORUS Master from Central Computers on sale for less than what I paid for the first. The big, heavy NVMe heatsink under the GPU was not latched; apparently it shipped from the factory that way. It flopped around inside the box, broke several things and scratched up the things that did not get broken. I asked them to open the box and inspect before shipping a replacement. They had quite a few in stock and ended up opening all of the boxes, and all were damaged in the same way. They offered the Strix for no difference in price and I accepted.

The Strix is better than the AORUS Master in terms of firmware. It is a close second overall only because I could not install both of my Sabrent quad-NVMe X4 cards like I could in the Master; it only has one extra PCIe slot. The AORUS had two, both usable at X4 without dropping the GPU from X16 to X8. The AORUS Master allowed me to install 10 NVMe SSDs and 4 SATA drives while keeping the GPU at X16.

The Apex is a great motherboard with a glaring engineering defect entirely due to an idiotic PCIe slot arrangement: I can only use the X4 PCIe slot above the GPU.
The Sabrent quad-NVMe card's heatsink touches the GPU backplate. The Strix performs as well as the Apex in terms of CPU overclocking. It has async BCLK, and I can use the Sabrent card in the bottom slots without the GPU dropping to X8 like it does in the Apex. If I had known everything I know now before buying my first board, I probably would have purchased two X870E-E Strix Gaming WiFi. If I were going to recommend one, it would be the X870E-E Strix as the best all-around X870E motherboard with the fewest flaws and compromises. Hope this helps. https://www.newegg.com/asus-rog-strix-x870e-e-gaming-wifi-atx-motherboard-amd-x870e-am5/p/N82E16813119682
    5 points
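A rough sanity check of the x1-vs-SATA point in the post above, as a minimal sketch. These are theoretical link rates from the PCIe 4.0 and SATA III specs, not measured NVMe throughput, and the encoding overheads are the only factors modeled:

```python
# Theoretical one-direction bandwidth (sketch; ignores protocol overhead).
# PCIe 4.0: 16 GT/s per lane with 128b/130b encoding.
# SATA III: 6 Gb/s line rate with 8b/10b encoding.

def pcie4_gbps(lanes: int) -> float:
    """Theoretical PCIe 4.0 throughput in GB/s for a given lane count."""
    return lanes * 16e9 * (128 / 130) / 8 / 1e9

sata3_gbps = 6e9 * (8 / 10) / 8 / 1e9  # ~0.6 GB/s

print(f"PCIe 4.0 x4: {pcie4_gbps(4):.2f} GB/s")  # a full NVMe slot
print(f"PCIe 4.0 x1: {pcie4_gbps(1):.2f} GB/s")  # a chipset x1 slot
print(f"SATA III:    {sata3_gbps:.2f} GB/s")
```

An x1 slot still has roughly 3x the headroom of SATA in theory, but compared to the ~8 GB/s an x4 NVMe drive expects, it lands much closer to the SATA-class experience the post describes.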
  4. But we also know that when rallied they can make a difference. Look at the de-prioritization of 8GB 5060/9060 cards due to lack of sales and the shift in focus to the 16GB models. Look at prices falling (for now) on sub-5090 cards. Look how the market shifted away from Intel to AMD. Many of those decisions (and more) were consciously made by consumers, even normies. I wish they could stop just shrugging their shoulders and locking onto RGB and aesthetically pleasing things and focus on the hardware and capability/functionality, but I don't think we're ever going to get there because most end users don't care about any of that. They just want to plug in and get going, and if problems arise they track down peeps like us in this thread. The A$$zeus license comment made me lol btw..... 🤣
-----------------------
@jaybee83 confirmed working on the Suprim Air by two users who have flashed it and it's working as intended. So far, it appears higher-voltage cards that had higher ranges but were held down by the 600W limit are seeing increases in voltage to go along with their newfound 200W of power. On the other hand, if your card is mid or lower and already bouncing off that vmax as-is at 600W, you won't suddenly open up more voltage, but that 200W will still give you a nice, healthy kick.

List so far of cards not working with it:
Asus Astral 5090 Black
Asus Astral 5090 LC Black
Asus TUF
Suprim Liquid
5090 FE (duh)

Ironically, the revamped design of the white Astral 5090 does work with it..... so clearly this was not done out of maliciousness but is a byproduct of the fan design. Haven't seen the Zotac 5090 AIO tested along with a few other cards yet, but the vast majority are able to run it. Someone at Asus won't walk right after they are separated from their keister for letting this slip out as-is and not differentiating power detection properly.
    4 points
  5. Definitely a fan controller / lack-of-headers issue, same as the Aorus Waterforce. The Vanguard and Suprim are basically the same PCB. I would be highly surprised (and disappointed) if it didn't work on the Suprim. This is my plan, as my order is in too, and I plan on opening up a can of 800W whup arse at that time. I'll also probably migrate to a modern PSU, but we'll see. I suspect some at Asus have steam coming out of their ears over this..... Nice! So far it seems that if you have a proper three-fan-designed 5090 PCB AND it isn't named Asus, you can run the Matrix 800W vBIOS..... Yup, many disgruntled Astral owners over the XoC debacle, sky-high pricing, and now this? So glad I returned my Astral 5090 for many reasons; now just add this one on top of that list. You bought an Asus 5090 LC this time around. Will you be purchasing Asus next gen @Papusan? I don't think Asus would patch the BIOS to their own customers' detriment, but then again they did proactively stop their motherboards from retaining RGB settings so you needed to run their software, via a BIOS update, soooo........ yeah.
    4 points
  6. That Matrix GPU is only good so that others can use its vBIOS. The price is extremely stupid and I think it looks stupid, too. 🤣 I have gone pee in my own backyard before. When you have a family of 7 and 3 bathrooms, there are times when you just can't hold it any longer. Now that we are down to 2 people at home (empty nesters) and 2 bathrooms, I have not found myself needing to do that for a long time.
    4 points
  7. Hopefully this was not a waste of sand. But, I will find out as soon as I button things up. If it's not better than what I already have it's going back for a refund.
    4 points
  8. The fact that they used basically the best memory option (6000 CL30) for the AMD part, but then ran far below Intel's optimal spec, is enough for me to call it biased. Even with a mainstream 4-DIMM board, most can achieve around 7600 CL32-34. Hell, even a 7000 CL30 2x32GB dual-rank kit would be a better option, as it offers the gaming performance of an 8000 SR kit on Intel. Unfortunately, this type of testing is mainstream and is how most outlets test. It is what it is; I've just learned to ignore most of it.
    4 points
  9. nifty, but my main gripe is still that in many instances i need to manually come up with new profiles whenever i update the bios version. cmon now, how hard can it be to make bios profiles compatible with all new bios versions?!
    4 points
  10. Nice new feature in the ASUS BIOS: F3 "Save as CMOS file" saves all BIOS profiles to USB as a single file, instead of a separate CMO file for the current profile only. Explanation from safedisk: https://www.overclock.net/posts/29527571/ @jaybee83 @Raiderman https://www.overclock.net/posts/29527497/ explains why the CMOS file is much larger than a CMO.
    4 points
  11. Nice and clean even with the extra pipes; the distribution plate puts in some good work. I got my network into its final form: a full 10Gb backbone to my study and living room, with 2.5Gb connections. A DAC cable to the 8-port switch and an SFP+ to Ethernet adaptor gives me 10Gb for my uplink to upstairs. I'm getting a revised version of this holder so I can cable-tidy this properly.
    4 points
  12. I finally took the time to get the X870E-E Strix memory on water like the Apex. That was the first time in quite a long while that I had run the memory with air cooling, and I am so glad to not have to use a fan for it anymore. I really did not like that. Water is so much better, not to mention it looks a whole lot better as well. That 120mm fan hanging on a bracket from the top radiator was pretty ugly. It will take a day or two to work all of the air out of the distro block. Now it's time to hit the sack.
    4 points
  13. There is definitely some user error in the equation along with inherent balancing issues. Luckily, with something like the WireView Pro II we now have a proper tool to monitor balancing just in case, AND it can handle higher loads and comes with a serious warranty. Once it ships, it definitely dims the light on the Astral a bit. AMD made a major mistake and damage control is in overdrive. The sad part is that if the tech media and users hadn't responded so negatively, they would have continued with business as usual. They may change course, but the destination is still set, and that doesn't bode well. Like Steve said in the video, "I said earlier this year, don't %*%*% up, AMD!" That's all they had to do. They had the momentum, actually sold way more 9070xt/9070 cards than expected and had their best sales in quite a while, then proceeded to make dumb mistakes like this.
    4 points
  14. Your explanation piques my curiosity, so I may have to investigate it. Even if it turns out I do not have a need for it, I am still curious. I would say at least 1300W. The Lian Li Edge seems like a solid and affordable option; I have one in my 4090 build. I have a Thermaltake GF3 1650W PSU in the 5090 build. I like both of them. The GF3 is hard to find. It has dual 12VHPWR sockets and something like 6 PCIe 8-pin sockets in addition. I may have to drop down to 0.001 Ohm shunt resistors to bump my power limit. I backed off my core clock a bit and the scores went up. I ran the benchmark again with GPU-Z and HWiNFO64 running on my second monitor so I could watch, and it is still showing "power limit" as the perf cap reason, so... hmmm. I'm pulling 1350W from the wall already. HWiNFO64 shows around 959W on the GPU power rails. Apparently that's not enough. My core temp is still hitting 41°C with 9°C water, so that's not helping. I guess I am going to have to think about using liquid metal on the GPU. I don't want to, but 41°C is definitely not helping. https://www.3dmark.com/3dm/144631382 | https://hwbot.org/benchmarks/3dmark_-_steel_nomad_dx12/submissions/5918232
    4 points
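For context on the 0.001 Ohm shunt idea above, here is the usual shunt-mod arithmetic as a minimal sketch. The resistor values are hypothetical examples, not the card's actual shunts, and the assumption is the standard one: the power controller infers current from the voltage drop across a shunt of known resistance.

```python
# Hypothetical shunt-mod arithmetic (example values, not a real card's shunts).
# The controller computes I = V_drop / R_expected; swapping in a smaller
# shunt shrinks V_drop, so the controller under-reports the real draw.

def reported_power(actual_watts: float, r_actual: float, r_expected: float) -> float:
    """Power the controller *thinks* is drawn after a shunt swap."""
    return actual_watts * (r_actual / r_expected)

# Example: a stock 0.005 Ohm shunt replaced with 0.001 Ohm.
# At a real 1000 W draw, the controller sees only one fifth of it,
# so the firmware power limit is effectively raised 5x.
print(reported_power(1000.0, r_actual=0.001, r_expected=0.005))
```

The same ratio logic applies whether you replace the shunt or stack a second resistor in parallel; the effective limit scales by R_expected / R_actual.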
  15. It's an OS you don't actually interact with directly like you would most Linux distros or Windows. Once it's installed, you interact with it via the web browser of your choice. Its main selling point is acting as your file server, and you can also add 2 drives for parity. The other large value-add is containers; the possibilities are, in my view, endless, though it wouldn't surprise me if people more adept in that line of expertise would say otherwise. Most importantly, it's just rock solid and installs entirely on a flash drive. Simply note the serials of the drives and how you arranged them in the GUI, and you can easily transport it to another system and have it boot up without missing a beat. They've also recently added official Tailscale support so friends and family can have access relatively easily. It's not something I would expect people here to have too much interest in, but it's been a thrilling experience for me while I gather up GPUs for benching this winter. I have a particular fondness for anime; make no mistake, it's an ocean of urine like any genre, but it has long posed a challenge in media management. Unraid has many containers that simplify this process greatly. More importantly, just rock-solid stability. Wish Windows was as consistent.

To answer both of you... they weren't the same. Some may recall I had a 1300W PSU that was recommended by most here at the time; that one gave up the ghost, and EVGA replaced it with a different 1300W model, which I had been using with Unraid for a while. It's now powering my ITX system. An open-style chassis allows much more freedom than a case would in this regard. I've also killed an 850W EVGA Gold PSU, but like I said, I killed it, so I did not seek an RMA for it. Not sure what the issue was with the 550W; I'm guessing it didn't like being power-cycled as quickly as I did it. In some ways it does clean things up a bit since I am no longer using the 1200W (900W for 110V) server PSU to power the 7900 XTX.
Which now opens things up for a new PSU on the bench system (10850K). Should I start looking around for another 1300W?
    4 points
  16. @Mr. Fox Screenshot Something about needing Nvidia closed source firmware
    4 points
  17. some recent highlights, ambient https://valid.x86.fr/q4pt03
    4 points
  18. It seems like increasing the load line to 150% and setting Vout Command to the value you want your voltage offset to match helps with stability. Just setting the offset and doing nothing else is less consistent. This run flat-lines at 3367MHz core and 1.150V. I am keeping a finger on the 12VHPWR connector, and while I can feel it getting somewhat warmer, it is not feeling "hot" even though my meter is showing 1350W from the wall. https://hwbot.org/benchmarks/3dmark_-_port_royal/submissions/5914106 | https://www.3dmark.com/3dm/144195067
    4 points
  19. This definitely makes a difference in maximum clocks. Setting 1.100V I have no problem running 3450MHz on the core. But the scores go up very little and power draw goes up about 250-300W. I also need to change thermal paste and go back to KPX or Kryosnot, because the Alphacool Apex I slapped on there after the EVC2 mod is no bueno. The core temp delta is much higher, even with the chiller. The Apex thermal paste is very similar in performance to a PTM pad: very durable and consistent, but too much thermal resistance. https://hwbot.org/benchmarks/3dmark_-_steel_nomad_dx12/submissions/5914059 | https://www.3dmark.com/sn/9406069 I glued magnets to the back of the plastic EVC2 case and stuck it next to the motherboard below the SATA ports. When I switch to a more effective thermal paste I will route the EVC2 wiring out of the end near the SATA ports so the wiring is less visible. I also redid part of the loop. I added a ball valve at the first radiator that I can turn off and a QDC fitting to connect the return line to the chiller, so this totally bypasses the radiators but still uses all five of my D5 pumps. Even though the temps are impeding the benchmark score, look at how much higher the clocks are before the NVIDIAtard room-temperature thermal degradation algorithm effs things up. https://www.3dmark.com/compare/sn/8515121/sn/9406069#
    4 points
  20. Hmmm. A$us prefers the cheap way. The more you pay, the more they save. Nvidia offers their 5090 FE cards at MSRP ($2000) and they still come with a rubber die guard. I expect Asus preferred to use paste as a die guard to save under $1 in build cost. Isn't that cheapo? Or is it the classic greed?
    3 points
  21. So the itch to try Arrowlake is slowly winning, but frugally, so I snagged an open-box Asus Z890 Strix-A from da 'zon. When all is said and done, $176 delivered to my door. It is the cheapest route I could sort out that still gives me access to SP ratings on CPUs as I test. Arrowlake is all new to me, so that will be fun, as I've twisted AM5 and Intel Skylake++ every which way possible via several motherboards and chips over the last 2-3 years. The plan of attack is to install this into the empty white Hyte Y40 that used to house my secondary/daughter's system and rebuild it, since she is now ready to come over and game again. The other system gathered dust for well over a year before I broke it down and sold the components, so I'll build out again...... but she's a good daughter who is working and in school, so I don't mind at all. I have a few spare PSUs floating around, including an EVGA P2 850W and an MSI Ai1300p, so I'm good there, along with a few Samsung 980 and 990 Pro 1TB and 2TB SSDs. At last count I have 4-5 AIOs of various flavors on my shelf, from 240mm up to 360mm, so I'm good there too. As previously noted, I have those 8400 Kingspec 2x24GB sticks floating about, so those will be used. I wouldn't splurge for 9000 CUDIMMs even before the crazy memory pricing, so I'm sure as heck not dropping $420 for another set of 8400 or a mind-boggling $675+ for a set of 9000. I'll just see where these or the TG 8200 sticks go. Just fishing for a CPU to use. Might test a few 265k/kf's or 285k's depending on Black Friday specials. The 265k is obviously the cheapest way to go.
    3 points
  22. There should be no yellow visible even when it is still in the PSU box.
    3 points
  23. The biggest problem to me is the lack of a standard for the BTF connectors. Hyte goes where the market goes, as do all case makers. If BTF is properly standardized and gains traction, they will go there, but it has been swirling around for years now going nowhere so..... And that makes the Matrix and Astral BTF editions problematic, because it suddenly limits your choice of motherboards if you want to switch/upgrade. I don't care for that.
-----------------------------
Ahhh, much better. Here is a good video from BBT where he pits max-tuned (memory and CPU) 14900KS vs 285k vs 9800X3D vs 9950X3D in a proper shootout. The overall consensus is that the 9800X3D is still king. It shines at 1080p, loses a little steam at 1440p, and really loses steam at 4k, to the point that all the tuned chips are pretty much the same and the 285k actually comes out on top as the GPU becomes the cap. Like I said, the 285k gets a bad rap, but properly tuned and paired with what is now gobsmackingly expensive RAM, it can hold its own and then some. Also, the newest edition of Intel's APO is actually valid now and works as intended when it can take advantage of its optimizations. For X3D, it comes down to the games being played/tested, the resolution, and how that affects the cache and its saturation. Even Jufes, who beats on AMD regularly, said that if all he played was racing sims or flight sims, hell yeah, his main rig would be X3D, as it dominates in those types of games. I still feel to this day that the 14900KS, when tuned properly, provides a smoother, steadier gameplay experience in WoW for handling the lows, but I also play at 4k maxed out, so that subjects the X3D cache to more saturation than 1440p and especially 1080p. I actually logged into my WoW account on the wife's system the other day and played for a few hours at 1440p, then set the internal render to 4k for the next few hours, and the gameplay, even on her 9070xt, is just smoother, especially in raids.
The 4k evidence from BBT's video just makes me want to build out a 285k even more at this point for 4k testing. I said it before, but if a decent 285k sample pops up on the forums, I'll probably pick it up, build out a cheap rig, and just use these 8400 Kingspec sticks and see where they can go on it. Overall, when tuned, ALL the top-end chips provide a good gaming experience.
    3 points
  24. Damn, scammers now running scams through BB. That is just unreal, I hate people lol.
    3 points
  25. AMD chips are cheaper, but the motherboard options are slim pickings. Finally got fed up with the OEM motherboards in the Dell Precisions, so I'm selling off a few of them and moving to a DIY solution. The board came with a couple of Xeon Silvers, which is fine for testing, but long term I'm probably going with a couple of Xeon Gold 6248s (20c/40t). Ordered a be quiet! 1000W for it as well. I'll begin testing it over the weekend.
    3 points
  26. Awesome. I am looking forward to seeing how it runs, brother. I got up early to delid the CPU. It always amazes me how much better that makes things. I increased the core clock 100MHz across all 16 cores and my max temperature is still over 20°C cooler. I hate owning CPUs that are not delidded. The experience sucks with the IHS. So, I gave up on trying to use Windoze's worthless trash software bloat to determine whether or not 3DvCache actually worked. Micro$lop butchers or botches everything they get their hands on. I used Process Lasso to set parameters on the EXE file to force 3DvCache CCD0 use or all-core use with both CCDs. As you can see below, it DOES make a difference. All Core (no 3DvCache) | CCD0 only (3DvCache only) - I used Process Lasso for this instead of Windoze trash
    3 points
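What Process Lasso is doing in the post above boils down to setting a CPU-affinity mask so the game only schedules on CCD0 (the V-Cache die). A minimal sketch of the mask arithmetic, assuming a typical dual-CCD 16-core layout where CCD0 maps to logical CPUs 0-15 with SMT (check your own topology, core numbering varies):

```python
# Sketch: building a CPU-affinity bitmask to pin a process to CCD0,
# which is what tools like Process Lasso do under the hood.
# The CPU numbering below is an assumption, not a guaranteed layout.

def affinity_mask(logical_cpus) -> int:
    """Bitmask with one bit set per logical CPU in the iterable."""
    mask = 0
    for cpu in logical_cpus:
        mask |= 1 << cpu
    return mask

# CCD0 with SMT: logical CPUs 0..15 on the assumed 9950X3D layout.
ccd0_mask = affinity_mask(range(16))
print(hex(ccd0_mask))  # 0xffff

# CCD1 only would be logical CPUs 16..31 on the same assumed layout.
ccd1_mask = affinity_mask(range(16, 32))
```

To actually apply it, you could pass the same core list to `psutil.Process.cpu_affinity()` cross-platform, or the mask itself to the Win32 `SetProcessAffinityMask` call; Process Lasso just persists this per-EXE so it sticks across launches.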
  27. Have you identified anyone that owns a Zotac 5090 AIO version? I haven't gone looking, but I don't recall seeing anyone in the forum at oc.net that has one. In general, I do not want an AIO GPU (or a CPU AIO), but I definitely think it is better than air. I do not care for the appearance of most of the AIO-cooled GPUs. I think most of them are pretty garish-looking and I do not like the short and stubby look. At least the water-blocked GPUs are not nearly as short and stubby as the OEM AIO models.
    3 points
  28. daaaamn nice! thx for the headsup and tag bud, much appreciated 🙂 now i know what ill do once i get my hands on the wireview pro II 😄
    3 points
  29. Intel Core Ultra 290K, 270K and 250K Plus spec leak: "Arrow Lake Refresh" with higher clocks, more cores and faster memory support. So, a real "nothing burger" when all is said and done. The most intelligent thing AMD has going for it is long-term socket viability. Intel needs to learn from this now and back-track on some of their recent very stupid ideas (bring back hyperthreading and go back to a monolithic design without the stupid Atom cores). It is hurting them more than they know. Their ending of hyperthreading was the final nail in the coffin for me. And their constant socket revisions requiring new motherboards after a few CPU generations are a compelling reason for people to choose AMD even if they would rather stay with Intel. (I am conflicted because I can't say that I like either one now. They both have pros and cons, but both have more cons than pros.) The short socket lifespan is a very damning attribute for Intel now that they no longer operate from a place of absolute supremacy and domination. They used to expect us to put up with it as going with the territory, which was fine when they still ruled the world. I'd really like to make my next build an Intel platform again. The overclocking experience (CPU and memory) is much better on Intel, but the lack of hyperthreading and the abbreviated upgrade path due to frequent socket changes make me question the logic of it. Once upon a time, not long ago, that was the only somewhat logical basis for choosing AMD instead of Intel, and that alone was never enough because there were too many other compromises attached to the idea. Intel should revive X299 and make modern versions of CPUs for that LGA 2066 platform... 32 or 36 threads at a 6.0GHz+ all-core overclock, quad-channel DDR5-8000+ and 40+ PCIe lanes... I'd be all over that, like white on rice... hell yeah, in a heartbeat. And I'd be willing to pay twice as much as a Core Ultra (or Ryzen) flagship CPU to have it.
The most fun I have ever had with overclocking was on X299. 100% (double stock clock) overclocking wasn't that difficult.
    3 points
  30. Did you mean to link to a post by @electrosoft or an article about the shiny new product that offers so little? The link leads here: https://notebooktalk.net/topic/109-official-benchmark-thread-post-it-here-or-it-didnt-happen-d/page/824/
    3 points
  31. Same, a function to import old BIOS settings would be nice. I've just gotten used to screenshotting my settings at this point and then manually re-entering them. Doesn't take too long (~10 min tops, if that) but still a nuisance. Hmmmm, I'll have to give this a try next time on the Z690 D4 Strix w/ the SP109 14900KS and the X870E Hero....
Didn't even know they basically offered the 9950X3D in EPYC form. Looks like the few reviews and look-ups say it is better binned than the 9950X3D, and the price is now down to basically 9950X3D levels. Tempted to try one myself!
Unfortunately, cheaters and cheat suppliers have gotten so good that EA has implemented their version of "The Final Solution" and it works. It wipes out any and all chance of cheats sitting in any memory space without being detected, but it requires you to basically give up just about every aspect of protection on your system to do it. It is insane watching cheaters lose their minds in threads because they are being banned left and right, running updated, paid-for (!) cheats that had always worked and now getting permanently banned at the hardware level. Just like with the Switch 2, buyers have to be wary of buying second-hand hardware that could be banned from the gaming networks or games they want to play. I am not a fan of the level of access that is required to play some of the newest competitive games, but I also understand why in such a hyper-competitive environment. When I used to play competitive Quake, the sheer number of cheaters was ridiculous, and it was always refreshing at gaming cons to see many of those same players suddenly get "less good" real quick once under observation during matches. The only way I would entertain this level of intrusion to play would be a separate, games-only install of Windows with zero Windows login or any other type of login anywhere, used exclusively for gaming on its own drive, and I'm not willing to go to that level yet.
Next WoW xpac, Blizzard is basically taking security to the next level and locking out mods and assists in what appears to be an attempt to heighten security. Remember, Blizzard via Activision was purchased by Micro$oft for almost $70 billion. The Midnight xpac is supposed to be another major overhaul, so we will see.
-----------------
I like Testing Games to a degree, but the fact that they still test the 14900k unoptimized at 6000 memory vs the X3D running the classic sweet-spot 6000 just doesn't work for me. The way they run their DDR5, my tuned DDR4 B-die setup with the 14900KS will decimate their results each and every time, let alone when I was running it tuned w/ 8400 DDR5.....
--------------------
In the same vein of Socket 1700 and BF: if the 12-core Bartlett-S drops and is tunable, I'm still definitely interested in it too, depending on where we can go with it. I'd pick up an extra board, slap these 8400 Kingspecs in there and see where it goes....
    3 points
  32. I have been able to use my CMO file from the prior BIOS version, but the trick is to save a profile with the new BIOS first, then apply the old profile over it. (Safedisk shared this.) So after flashing, I disable TPM, Secure Poot, iGPU, WiFi and BT, and save that as a new BIOS profile. F10 to save and exit, then go back in and apply my OC CMO profile from the previous BIOS. That has not given me any issues. After applying the old profile and confirming all was well, I then save again as a new OC profile and a new CMO file for the updated BIOS version. I could see where an old profile might be incompatible if they removed, changed or added features or rearranged the menu order, but if the only changes are underlying code and BIOS default values, there is no excuse for them not being compatible. I ordered an EPYC 4585PX. I hope it is better at core and memory overclocking than either of my average 9950X samples so I don't have to RMA/refund yet another one. The only reason I did not return either of the two I have now is that they were average based on what I could tell from other samples' HWBOT scores, and better than the below-average trash samples I RMA'd before them. If it is not better, I am probably just going to send it back for a refund and be done with trying to enjoy lackluster AM5 overclocking. The 9950X3D that I returned was the worst Ryzen 9 sample I have ever seen. It was an absolute POS. Even gaming on it turned to crap. Having to enable TPM and Secure Poot filth to play the new BF and CoD releases is totally unacceptable. I about popped a vein when I discovered they had retroactively applied Javelin to BF 2042 and rendered a game I had thoroughly enjoyed totally worthless to me. Bastards.
    3 points
  33. That's good, and it's because of the backlash they've received from the people. The main question is how long it's going to last. Let's be real, there's plenty of misleading and dangerous content promoted on YouTube on a daily basis 🙄
    3 points
  34. Oh guys, and FYI, a friend of mine has received the ThinkPad P16 G3 in the same basic config (285HX / RTX 4000) that I have; we are currently running benchmarks to compare. Will post the results later so you know how our machine fares against the Lenovo. So far it looks like we are about 26% better in gaming / combined CPU and GPU performance (measured with 3DMark Time Spy). Measured RAM latency on my machine is at 108ns; on the Lenovo it is at 151ns. Quite a difference.
    3 points
  35. The big Norwegian party musician... Åge Aleksandersen paired with one of Sweden's biggest song stars ever (Björn Afzelius). https://lyricstranslate.com/en/rosalita-rosalita.html Intro in Norwegian (Trøndish, my Norwegian dialect 🙂). 0:29 Swedish. 1:16 Sami, the language of Norway's indigenous people. Which of those 3 languages is the hardest to understand? 😀 Björn Afzelius https://lyricstranslate.com/en/tusen-bitar-thousand-pieces.html
    3 points
  36. A$$zeus selling GPUs that can only work with one of their motherboards was likely a very deliberate act and solely the result of ulterior motives. Not a smart buy for anyone that hasn't sold their soul to the ROG clown posse. The idea itself isn't a bad one. Much better than a fragile arson 12VHPWR (aka 12V-2x6) cable. The proprietary part makes it suck. With the baloney we are seeing with respect to AMD/Radeon GPUs now, it just gives NVIDIA a stronger chokehold on the GPU market that ultimately benefits only NVIDIA. There is no reason for them to feel compelled to release a Super or 6000 GPU line because they are effectively only competing with themselves. You either take what they offer and pay more than it is worth, or you settle for something substantially less. If AMD doesn't keep their prices in check they'll have nothing to sell to budget-conscious shoppers that are willing to sacrifice performance and good thermals to save money.
    3 points
  37. I'm in the same position here, and will likely need to replace my 90° Seasonic cable, probably going back to the straight Seasonic one that came stock with my 1600W PSU, thus switching back from my current top routing to bottom. Should be fine 🙂
    3 points
  38. Only $4,733.03 USD incl. tax. Not bad, Asus. Why not let the retail break $5,000? Or is it only for the Gold Edition cards? Don't forget to punch the "Notify Me" button. You may have a chance to support Asus with more of your hard-earned money. https://videocardz.com/newz/asus-rog-matrix-platinum-rtx-5090-slips-to-late-november-at-e4099
    3 points
  39. Yeah, the best bet might be to hang onto the 7900xtx unless you have a specific need you're targeting for upgrading, outside of a new fun toy. The 9070xt, depending on use case, is a side grade at best, as is a 5070ti. Anything worth the squeeze is going to be a 5080, 4090, or 5090. The 4090 and 5090 are very expensive, so as you've found, it is all about the 5080. I'd still wait and watch sales heading into Black Friday. I've been tempted to pull the trigger on a few pieces, including another 9070xt for my SFF build, but I know there will be some decent sales coming up.
    3 points
  40. Honestly, I would just hang on to the 7900 XTX. Neither vendor has anything appealing for us without spending way too much money. I am waiting to see what the next generation offerings look like; that being said, I can play everything I want to play in terms of gaming at 3440x1440. Perhaps Monster Hunter: Wilds is the only exception, but that game runs like trash on anything right now. I believe you can run a trial for 30 days; there are ways around that "limitation," but I never looked further into it. I used the trial for about that duration and bought a license. The license gives you access to its platform for life, but updates for 1 year to the OS itself. You can pay for a lifetime update license of course; I'll probably be upgrading my license to lifetime updates. The only part that is a bit quirky at times is finding a flash drive to use, which must have a UUID that it can tie the license to, then of course install the OS to it. Can't remember how I did mine, but probably used Rufus.
    3 points
  41. I was looking at the Zotac Solid Core OC at $999, or the Zotac AMP Extreme at $1,199. Neither has me terribly excited. I've been contemplating waiting a bit longer to see what AMD has in the pipeline. Hoping that it would be comparable to the 4090.
    3 points
  42. I won't pay $2k for a GPU, let alone those prices: GPU + waterblock + thermal paste/pads. I have a 5080 in my cart, but I cannot seem to justify it. The 5080 is only marginally better than the 7900xtx.
    3 points
  43. This really makes you think: with everything done right, and oodles of power slammed through the connection, it is just fine. I would say it is human error coupled with potential QC issues, but I also believe that after the 4090, OEMs/AIBs definitely test their connectors under high stress conditions, including MSI. I also still stand by the fact that lighter colored connectors will show scorch marks much more easily than black connectors, and to see if a black connector is degrading you will need to either get it to a point of melting and/or have the actual pins scorched too. I would posit that there are probably about as many scorched black connectors out there as lighter colored ones. You just can't detect it as easily with the naked eye. I do not subscribe to the "yellow tip disaster" theory quite yet. We are talking anecdotal evidence meeting the potentially lowest level of confidence. It just doesn't fly. I do subscribe to the "a very small segment of connectors in general burn but we just don't quite know why yet" theory, which applies to both Nvidia and AMD cards that have adopted this connector and includes MSI's cables along with others. In that line of questioning, and wondering about cable/tentacle QC, how many Founders Edition 5090s / RTX 6000 Pros have burnt up using the supplied Nvidia adapter cable? You would think their adapters would be the gold standard in all of this....
    3 points
  44. Interesting video. Brother @electrosoft asked a rhetorical question not long ago about why nobody we know has melted connectors. Nobody could give a definitive answer. This makes me even more curious. 1600W for 15 minutes did not melt the connector. I think the answer might be "defective Chinese trash" after watching this. More accurately, "expensive defective Chinese crap." People are getting worried about exceeding 60-70°C and twice that much heat didn't melt the plastic connector. Thank you. I think I can still do better. Just need to get my core temps to stay down in the 20°C range or lower.
    3 points
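For rough context on the 1600W test above, here is my own back-of-envelope arithmetic (not from the video): assuming a 12V-2x6 connector with six 12V current-carrying pins rated around 9.5A each and perfectly even load sharing, 1600W works out to well over double the per-pin rating, which is what makes the connector surviving 15 minutes so striking.

```python
# Back-of-envelope per-pin current through a 12V-2x6 connector.
# Assumptions (mine, not from the video): 12 V rail, 6 current-carrying
# pins, ~9.5 A per-pin rating, perfectly even load sharing across pins.
VOLTAGE = 12.0       # volts on the 12V rail
PINS = 6             # current-carrying pin pairs in 12V-2x6
PIN_RATING_A = 9.5   # approximate per-pin current rating, amps

def per_pin_current(watts: float) -> float:
    """Current through each pin if the load shares perfectly evenly."""
    return watts / VOLTAGE / PINS

for watts in (600, 1600):
    amps = per_pin_current(watts)
    print(f"{watts} W -> {amps:.1f} A per pin "
          f"({amps / PIN_RATING_A:.1f}x the {PIN_RATING_A} A rating)")
```

At the rated 600W the pins sit just under their rating; at 1600W each pin carries roughly 22A, over twice the rating, assuming the load actually shares evenly (uneven sharing is exactly what the melting theories blame).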
  45. At least AMD suffering the problem lets us know it is a spec issue and not an Nvidia GPU side issue but still.... Score another for a lighter colored connector showing clear and visible scorch marks..... The blessing and the curse is my closest MC is ~60min away in St Davids. Close enough to where if I really want to go it isn't that far away yet far enough away to keep me honest and not habitually dropping by "just to look" (translation: Coming home with new goodies more often than not). An hour is a medium sized commute. Very doable. Your poor wallet AND you get access to a newer store vs ours which opened in 1991.....
    3 points
  46. If you absolutely want a 5080, why not wait for the Super cards? I expect the 5080 will drop in price once the refresh is out. And a 5090 bought at MSRP (if you can find one at the lowest price) will probably keep its price a long time, same as the 4090s. Some even got more for their two-year-old used 4090s than they paid. Won't happen with xx80 cards.
    3 points
  47. The Taichi is now listed on fleabay along with the 7900xtx.
    3 points
  48. That is very odd. Does it have dual vBIOS? Maybe it got switched? If not, then maybe a driver update did something. I know NVIDIA has pushed out firmware with drivers in the past. I haven't seen them do that in a long time, though. If I remember correctly, it was a real hassle that you went through to get that flashed.
    3 points
  49. I ordered a thermal sensor from Amazon to connect to the T-sensor on my motherboard and it works well. I have the probe inserted between wires in the 12VHPWR cable near the GPU socket. I confirmed the temperature in HWiNFO64 from the T-Sensor is within about 1-2°C of what it shows with my IR thermometer. Looks like I did not need to worry about my 12VHPWR connector melting unless something changes in terms of load balance. Wonder if the Ampinel will release on schedule with enough stock to not immediately sell out? Supposedly will be available for pre-order, but I'm not seeing that it has been yet. https://hwbot.org/benchmarks/3dmark_-_steel_nomad_dx12/submissions/5914714
    3 points