Leaderboard
Popular Content
Showing content with the highest reputation since 10/18/2025 in all areas
-
The stock voltage is 1.15 V and the stock power limit is 600 W. The XOC vBIOS is publicly available on TechPowerUp and raises the limit to 2001 W; it also further unlocks voltage and allows roughly +1000 more on the memory clock than other 5090s. Some nice individual over at OC.net PM'd me the extra-special HOF tool for even further voltage, LLC, and switching-period control. Basically the same as the old KingPin tools. Ordered the HOF 5090D IceMan waterblock on AliExpress. Figured the card is pretty rare and any future owner will definitely want the ability to put it under water to take advantage of the voltage and power limits.
7 points
-
Off-Topic: The Phoenix Micro Center grand opening for VIP members is 11/5. My wife better hide my wallet. 🤣 It's an hour each direction from where I live, but at least it's not the 6-hour drive each direction to Tustin, CA.5 points
-
For the GPU, I would say select whichever one costs less, especially if you are putting a waterblock on it. My Zotac 5090 Solid OC was a good buy compared to the other, more expensive options that deliver nothing for the extra money. It is an excellent GPU. The air cooler on it was fantastic (unlike some of the other affordable brands/models) and it ran freakishly cool for an air-cooled GPU. The only 50-series GPU I would recommend avoiding like the plague is an FE model. For overclocking potential, the most consistently good GPU silicon is probably an AORUS Master, but the cost vs. benefit isn't justified. I love overclocking more than anything else I do with a computer, and it is really the only reason computers matter to me at this point, but paying a WHOLE LOT more for a very small gain in GPU benchmark scores is just not a very intelligent decision.

I have owned the following X870E motherboards and I list them in my order of preference:
X870E AORUS Master (best overall - only flaw is no way to disable WiFi/BT in BIOS)
X870E-E Strix (replacement for a second AORUS Master that arrived with shipping damage)
X870E Apex (returned the first for a refund, the second was junk, I am using #3)
X870E Carbon (returned for refund - good mobo, but no async BCLK and weird glitches)
X870E Taichi (my least favorite out of all the AMD motherboards I have owned - hated it)

I had an X870E Taichi and hated it. The PCIe bifurcation was garbage and I did not care for the firmware. I have only owned two ASRock motherboards and did not like either one.

I had a B850 AORUS Elite that I used in a build for my granddaughters and it was excellent. The only criticism I had was that the PCIe slots below the GPU slot were x1, although at least using them did not drop the GPU to x8. This is unavoidable on anything below the dual-chipset X870E due to a lack of PCIe lanes on a non-"E" AMD motherboard. PCIe x1 dramatically reduces NVMe speed... it makes NVMe speed like a SATA SSD (rough numbers sketched after this post). If you plan to insert anything in other PCIe slots in addition to your GPU in an AMD motherboard, the "E" version is an absolute must-have.

The only complaint I have with the Gigabyte boards is that there is no way to disable WiFi/BT in the BIOS. Super stupid flaw they could fix effortlessly if they cared. If you use WiFi/BT and use Windoze 11 as your main OS (I do not do either one) this truly is a non-issue. It really pissed me off that Gigabyte did not provide that option in the BIOS. I asked twice and both times they said no... "you're the only person complaining about it" (essentially: we don't care what you want and you are not worth the minimal effort needed to make a BIOS as good as our competitors'). Gigabyte is the only brand I know of that omits this essential, basic BIOS option.

The Strix was an accidental blessing. I purchased a second AORUS Master from Central Computers on sale for less than what I paid for the first. The big, heavy NVMe heatsink under the GPU was not latched; apparently it shipped from the factory that way. It flopped around inside the box, broke several things, and scratched up things that did not get broken. I asked them to open the box and inspect before shipping a replacement. They had quite a few in stock and ended up opening all of the boxes, and all were damaged in the same way. They offered the Strix for no difference in price. I accepted. The Strix is better than the AORUS Master in terms of firmware. A close second only because I could not install both of my Sabrent quad NVMe x4 cards like I could in the Master. It only has one extra PCIe slot. The AORUS had two, both usable at x4 without dropping the GPU from x16 to x8. The AORUS Master allowed me to install 10 NVMe SSDs and 4 SATA drives while maintaining the GPU at x16.

The Apex is a great motherboard with a glaring engineering defect entirely due to an idiotic PCIe slot arrangement. I can only use the x4 PCIe slot above the GPU; the Sabrent quad NVMe card's heatsink touches the GPU backplate. The Strix performs as well as the Apex in terms of CPU overclocking. It has async BCLK and I can use the Sabrent card in the bottom slots without the GPU dropping to x8 like it does in the Apex. If I knew everything I know now before buying my first, I probably would have purchased two X870E-E Strix Gaming WiFi. If I were going to recommend one, it would be the X870E-E Strix as the best all-around X870E motherboard with the fewest flaws and compromises. Hope this helps. https://www.newegg.com/asus-rog-strix-x870e-e-gaming-wifi-atx-motherboard-amd-x870e-am5/p/N82E16813119682
5 points
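For context on the x1 vs. x4 point above, a rough back-of-envelope sketch of theoretical link bandwidth (line rate minus encoding overhead only; real NVMe throughput is lower, and the exact gap depends on the drive and PCIe generation):

```python
# Approximate usable bandwidth per link, ignoring protocol overhead beyond
# line coding. Values are theoretical ceilings, not benchmark results.
def pcie_lane_gb_s(gt_per_s, enc_num=128, enc_den=130):
    """Usable GB/s for one PCIe lane at a given transfer rate (GT/s)."""
    return gt_per_s * enc_num / enc_den / 8

sata3 = 6 * (8 / 10) / 8          # SATA III: 6 Gb/s with 8b/10b -> ~0.60 GB/s
gen3  = pcie_lane_gb_s(8)         # PCIe 3.0: ~0.98 GB/s per lane
gen4  = pcie_lane_gb_s(16)        # PCIe 4.0: ~1.97 GB/s per lane

print(f"SATA III     ~{sata3:.2f} GB/s")
print(f"PCIe 3.0  x1 ~{gen3:.2f} GB/s   x4 ~{gen3 * 4:.2f} GB/s")
print(f"PCIe 4.0  x1 ~{gen4:.2f} GB/s   x4 ~{gen4 * 4:.2f} GB/s")
```

In other words, a Gen4 NVMe drive forced down to a x1 link loses roughly three quarters of its interface bandwidth, which is why the x1 slots on non-"E" boards are a poor home for fast SSDs.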
-
The fact that they used basically the best memory option, 6000 CL30, for the AMD part, but then used something far below Intel's optimal spec, is enough for me to call it biased. Even on a mainstream 4-DIMM board, most can achieve around 7600 CL32-34. Hell, even using a 7000 CL30 2x32GB dual-rank kit would be a better option, as it will offer the gaming performance of an 8000 SR kit on Intel. Unfortunately, this type of testing is mainstream and how most outlets test. It is what it is; I've just learned to ignore most of it.
4 points
-
nifty, but my main gripe is still that in many instances i need to manually come up with new profiles whenever i update the bios version. cmon now, how hard can it be to make bios profiles compatible with all new bios versions?!4 points
-
Nice new feature in the ASUS BIOS: F3 "Save as CMOS file" saves all BIOS profiles to USB as a single file, instead of a separate CMO file for the current profile only. Explanation from safedisk: https://www.overclock.net/posts/29527571/ @jaybee83 @Raiderman https://www.overclock.net/posts/29527497/ Explains why the CMOS file is much larger than a CMO.
4 points
-
Nice and clean even with the extra pipes; the distribution plate puts in some good work. I got my network into its final form: full 10Gb backbone to my study and living room with 2.5Gb connections, a DAC cable to the 8-port switch, and an SFP+ to Ethernet adapter for the 10Gb uplink to upstairs. I'm getting a revised version of this holder so I can cable-tidy this properly.
4 points
-
I finally took the time to get the X870E-E Strix memory on water like the Apex. First time in quite a long time that I ran the memory with air cooling and I am so glad to not have to use a fan for it any more. I really did not like that. Water is so much better, not to mention looking a whole lot better as well. That 120mm fan hanging on a bracket from the top radiator was pretty ugly. It will take a day or two to work all of the air out of the distro block. Now it's time to hit the sack.4 points
-
Your explanation piques my curiosity, so I may have to investigate it. Even if it turns out I do not have a need for it, I am still curious. I would say at least 1300W. The Lian Li Edge seems like a solid and affordable option; I have one in my 4090 build. I have a Thermaltake GF3 1650W PSU in the 5090 build. I like both of them. The GF3 is hard to find. It has dual 12VHPWR sockets and something like 6 PCIe 8-pin sockets in addition. I may have to drop down to 0.001 Ohm shunt resistors to bump my power limit. I backed off my core clock a bit and the scores went up. I ran the benchmark again with GPU-Z and HWiNFO64 running on my second monitor so I could watch, and it is still showing power limit as the perf cap reason, so... hmmm. I'm pulling 1350W from the wall already. HWiNFO64 shows something like 959W on the GPU power rails. Apparently that's not enough. My core temp is still hitting 41°C with 9°C water, so that's not helping. I guess I am going to have to think about using liquid metal on the GPU. I don't want to, but 41°C is definitely not helping. https://www.3dmark.com/3dm/144631382 | https://hwbot.org/benchmarks/3dmark_-_steel_nomad_dx12/submissions/5918232
4 points
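On the 0.001 Ohm shunt idea above, here is a minimal sketch of the usual shunt-mod arithmetic, with the stock shunt value as a placeholder assumption rather than the card's actual spec: the controller infers current from the voltage drop across a shunt it believes has the stock value, so a lower-value replacement makes it under-report power by the resistance ratio and effectively raises the real power ceiling.

```python
# Shunt-mod arithmetic sketch. R_STOCK is an assumed placeholder value,
# not the measured shunt on any particular 5090 PCB.
R_STOCK = 0.002        # ohms, what the controller "believes" is installed (assumption)
R_NEW   = 0.001        # ohms, replacement shunt
FIRMWARE_LIMIT_W = 600 # cap enforced on *reported* power (illustrative)

# Sensed voltage drop (and therefore reported power) scales with R_NEW / R_STOCK.
report_ratio = R_NEW / R_STOCK
effective_limit_w = FIRMWARE_LIMIT_W / report_ratio

print(f"Power is under-reported by a factor of {report_ratio:.2f}")
print(f"A {FIRMWARE_LIMIT_W} W cap then allows roughly {effective_limit_w:.0f} W of actual draw")
```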
-
It's an OS you don't actually interact with directly like you would most Linux distros or Windows. Once it's installed, you interact with it via the web browser of your choice. Its main selling point is acting as your file server, and you can add up to 2 drives for parity as well. The other large value-add is containers; the possibilities are, in my view, endless, though it wouldn't surprise me if people more adept in that line of expertise would say otherwise. Most importantly, it's just rock solid and is all installed on a flash drive. Simply note the serials of the drives and how you arranged them in the GUI, and you can easily transport it to another system and have it boot up without missing a beat. They've also recently added official Tailscale support, so you can give friends and family access relatively easily. It's not something I would expect people here to have too much interest in, but it's been a thrilling experience for me while I gather up GPUs for benching this winter. I have a particular fondness for anime; make no mistake, it's an ocean of urine like any genre, but it has long posed a challenge in media management. Unraid has many containers to make this process much simpler. More importantly, just rock-solid stability; I wish Windows was as consistent.

To answer both of you... They weren't the same. Some may recall I had a 1300w PSU that was recommended by most here at the time; that one gave up the ghost, and EVGA replaced it with a different 1300w model, which I had been using with Unraid for a while. It's now powering my ITX system. An open-style chassis allows much more freedom than a case would in this regard. I've also killed an 850w EVGA Gold PSU, but like I said, I killed it, so I did not seek an RMA for it. Not sure what the issue was for the 550w; I'm guessing it didn't like being power cycled as quickly as I had done it. In some ways it does clean things up a bit since I am no longer using the 1200w (900w for 110v) server PSU to power the 7900 XTX. Which now opens things up for a new PSU on the bench system (10850K); should I start looking around for another 1300w?
4 points
-
4 points
-
some recent highlights, ambient https://valid.x86.fr/q4pt03
4 points
-
Seems like increasing the load line to 150% and setting Vout Command to the value you want your voltage offset to match helps with stability. Just setting the offset and doing nothing else is less consistent. This run flat-lines at 3367 MHz core and 1.150V. I am keeping a finger on the 12VHPWR connector, and while I can feel it getting somewhat warmer, it is not feeling "hot" even though my meter is showing 1350W from the wall. https://hwbot.org/benchmarks/3dmark_-_port_royal/submissions/5914106 | https://www.3dmark.com/3dm/144195067
4 points
-
This definitely makes a difference in maximum clocks. At 1.100V I have no problem running 3450 MHz on the core. But the scores go up very little and power draw goes up about 250-300W. I also need to change thermal paste and go back to KPX or Kryosnot, because the Alphacool Apex I slapped on there after the EVC2 mod is no bueno. The core temp delta is much higher, even with the chiller. The Apex thermal paste is very similar in performance to a PTM pad: very durable and consistent, but too much thermal resistance. https://hwbot.org/benchmarks/3dmark_-_steel_nomad_dx12/submissions/5914059 | https://www.3dmark.com/sn/9406069 I glued magnets to the back of the plastic EVC2 case and stuck it next to the motherboard below the SATA ports. When I switch to a more effective thermal paste I will route the EVC2 wiring out of the end near the SATA ports so the wiring is less visible. I also redid part of the loop. I added a ball valve at the first radiator that I can turn off, and a QDC fitting to connect the return line to the chiller, so this totally bypasses the radiators but still uses all five of my D5 pumps. Even though the temps are impeding the benchmark score, look at how much higher the clocks are before the NVIDIAtard room-temperature thermal degradation algorithm effs things up. https://www.3dmark.com/compare/sn/8515121/sn/9406069#
4 points
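As a rough illustration of why a voltage bump buys so little score for so much power: switching power scales roughly with voltage squared times frequency. A sketch with purely illustrative operating points (not measured values from this card):

```python
# Dynamic power scales roughly as P ~ C * V^2 * f (leakage ignored).
# Operating points below are illustrative only.
def power_ratio(v1, f1, v2, f2):
    return (v2 / v1) ** 2 * (f2 / f1)

base = (1.000, 3367)   # volts, MHz (illustrative baseline)
bump = (1.100, 3450)   # volts, MHz (illustrative bumped point)

ratio = power_ratio(*base, *bump)
clock_gain = bump[1] / base[1] - 1
print(f"~{(ratio - 1) * 100:.0f}% more power for ~{clock_gain * 100:.1f}% more clock")
```

Under those assumptions that is roughly a quarter more power for about 2.5% more clock, which is at least in the same ballpark as the 250-300 W jump described above.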
-
Now that I have confirmed it works I will devise a cleaner way of mounting this inside of the chassis.4 points
-
Yeah! EVC2 mod works. As a quick test, I goosed the core voltage a little and ran an MSI Kombustor and my GPU is pulling another 250W from the wall now (about 1350W) with everything else stock. I purchased a PCB heater and man... so much easier to solder when the PCB is warmed up.4 points
-
One thing I find very puzzling about the Apex is that no matter what CPU I have installed, my Cinebench scores are always notably lower than with the Strix. Every CPU that I test consistently has stronger Cinebench results when I install it in the Strix. This is my third Apex. I don't really understand why this is the case, but it is consistent. The same was true of the AORUS Master... consistently higher Cinebench scores than the Apex using any given CPU. This really bugs the crap out of me and makes me wish I had saved $300 and just purchased another Strix X870E-E or just kept the AORUS Master. I think I am spinning my wheels with the EPYC CPU. It seems like a much better sample overall, but I am not going to be able to really confirm that until it is delidded and running bare die. Absolutely ludicrous thermals with the factory solder and IHS. It's unfortunate that most CPUs suck with the factory solder and IHS. It's true of both Intel and AMD, but seems much more so with AMD for some reason. They seem a lot more sensitive to high temperatures. The jury is still out on whether I like 3D V-Cache. The 9950X3D that I RMA'd was such a pathetic silicon sample that it wasn't fair to draw any conclusions based on what a piece of garbage it was. I'm leaning toward either not liking it or viewing it as an irrelevant selling point/gimmick. (I'm a casual gamer, not a gaming enthusiast, and it seems to provide no tangible benefit for an overclocking enthusiast.) I do see higher FPS in the few games I tested. 3DMark doesn't seem to benefit.
3 points
-
I still have a lot to figure out about the best tuning settings for this new EPYC CPU, but it does appear to be better than either of my good-but-average 9950X samples. Initial impressions are that it is a pretty decent sample. I think my memory kit could be a little better, but I was surprised it could run 6400 1:1 and FCLK 2200. I haven't spent much time with core tuning yet, just set a simple PBO, but I will definitely need a good manual overclock to see any respectable Cinebench results. I will play with it more tomorrow to confirm it is worth keeping before I delid it. I always forget how crazy hot these CPUs are when they have not been decapitated and run bare die. There are still many hours that need to be invested to find what works best. It requires a lot more time and effort with AMD to get things figured out.
3 points
-
Hopefully this was not a waste of sand. But, I will find out as soon as I button things up. If it's not better than what I already have it's going back for a refund.3 points
-
Same, a function to import old BIOS settings would be nice. I've just gotten used to screenshotting my settings at this point and then manually re-entering them. Doesn't take too long (~10 min tops, if that) but it's still a nuisance. Hmmmm, I'll have to give this a try next time on the Z690 D4 Strix w/ SP109 14900KS and X870E Hero.... Didn't even know they basically offered the 9950X3D in EPYC form. Looks like the few reviews and lookups say it is better binned than the 9950X3D, and the price is now down to basically 9950X3D levels. Tempted to try one myself!

Unfortunately, cheaters and cheat suppliers have gotten so good that EA has implemented their version of "The Final Solution" and it works. It wipes out any and all chances of cheats sitting in any memory space without being detected, but it requires you to basically give up just about every aspect of protection on your system to do it. It is insane watching cheaters lose their minds in threads because they are being banned left and right, with updated, paid-for (!) cheats that have always worked now getting them permanently banned at a hardware level. Just like with the Switch 2, buyers have to be wary of buying second-hand hardware that could be banned from the gaming networks or games they want to play. I am not a fan of the level of access that is required to play some of the newest competitive games, but I also understand why in such a hyper-competitive environment. When I used to play competitive Quake the sheer number of cheaters was ridiculous, and it was always refreshing at gaming cons to see many of those same players, suddenly under observation during matches, get "less good" real quick. The only way I would entertain this level of intrusion to play would be a separate, games-only install of Windows with zero Windows login or any other type of login anywhere, used exclusively for gaming on its own drive, and I'm not willing to go to that level yet. Next WoW xpac, Blizzard is basically taking security to the next level and locking out mods and assists in what appears to be an attempt to heighten security. Remember, Blizzard via Activision was purchased by Micro$oft for almost $70 billion. The Midnight xpac is supposed to be another major overhaul, so we will see.
-----------------
I like Testing Games to a degree, but the fact that they still test the 14900K unoptimized at 6000 memory vs the X3D running the classic sweet-spot 6000 just doesn't work for me. The way they run their DDR5, my tuned DDR4 B-die setup with the 14900KS will decimate their results each and every time, let alone when I was running it tuned w/ 8400 DDR5.....
--------------------
In the same vein of Socket 1700 and BF, if the 12-core Bartlett-S drops and is tunable, I'm still definitely interested in it too, depending on where we can go with it. I'd pick up an extra board and slap these 8400 KingSpecs in there and see where it goes....
3 points
-
I have been able to use my CMO file from the prior BIOS version, but the trick is to save a profile with the new BIOS first, then apply the old profile over it. (Safedisk shared this.) So after flashing, I disable TPM, Secure Poot, iGPU, WiFi, and BT, and save it as a new BIOS profile. F10 to save and exit, then go back in and apply my OC CMO profile from the previous BIOS. That has not given me any issues. After applying the old profile and confirming all was well, I then save again as a new OC profile and a new CMO file for the updated BIOS version. I could see where an old profile might be incompatible if they removed, changed, or added features or rearranged the menu order, but if the only changes are underlying code and BIOS default values there is no excuse for them not being compatible. I ordered an EPYC 4585PX. I hope it is better at core and memory overclocking than either of my average 9950X samples so I don't have to RMA/refund yet another one. The only reason I did not return either of the two I have now is that they were average based on what I could tell looking at other samples, judging from HWBOT scores, and better than the below-average trash samples I RMA'd before them. If it is not better, I am probably just going to send it back for a refund and be done with trying to enjoy lackluster AM5 overclocking. The 9950X3D that I returned was the worst Ryzen 9 sample I have ever seen. It was an absolute POS. Even for gaming it turned out to be crap. Having to enable TPM and Secure Poot filth to play the new BF and CoD releases is totally unacceptable. I about popped a vein when I discovered they had retroactively applied Javelin to BF 2042 and rendered a game I had thoroughly enjoyed totally worthless to me. Bastards.
3 points
-
That's good, and it's because of the backlash they've received from the people. The main question is how long it's going to last. Let's be real, there is plenty of misleading and dangerous content promoted on YouTube on a daily basis 🙄
3 points
-
The big thing is that when my partner is doing anything with the NAS, it has no impact on my own experience now; any one person can max out their connection and the rest of the network is just unaffected.
3 points
-
The big Norwegian party musician... Åge Aleksandersen paired with one of Sweden's biggest song stars ever (Björn Afzelius). https://lyricstranslate.com/en/rosalita-rosalita.html Intro in Norwegian (Trøndish, my Norwegian dialect 🙂). 0:29 Swedish. 1:16 Sami, the indigenous language of Norway. Which of those 3 languages is the hardest to understand? 😀 Björn Afzelius https://lyricstranslate.com/en/tusen-bitar-thousand-pieces.html
3 points
-
A$$zeus selling GPUs that can only work with one of their motherboards was likely a very deliberate act and solely the result of ulterior motives. Not a smart buy for anyone that hasn't sold their soul to the ROG clown posse. The idea itself isn't a bad one. Much better than a fragile arson 12VHPWR (aka 12V-2x6) cable. The proprietary part makes it suck. With the baloney we are seeing with respect to AMD/Radeon GPUs now, it just gives NVIDIA a stronger chokehold on the GPU market that ultimately benefits only NVIDIA. There is no reason for them to feel compelled to release a Super or 6000 GPU line because they are effectively only competing with themselves. You either take what they offer and pay more than it is worth, or you settle for something substantially less. If AMD doesn't keep their prices in check they'll have nothing to sell to budget-conscious shoppers that are willing to sacrifice performance and good thermals to save money.3 points
-
im in the same position here, will likely need to replace my 90° Seasonic cable, probably going back to the straight Seasonic one that came stock with my 1600W PSU, thus switching back from current top routing to bottom. should be fine 🙂3 points
-
Only $4,733.03 USD incl. tax. Not bad, Asus. Why not let the retail price break $5,000? Or is it only for the Gold Edition cards? Don't forget to punch the "Notify Me" button. You may have a chance to support Asus with more of your hard-earned money. https://videocardz.com/newz/asus-rog-matrix-platinum-rtx-5090-slips-to-late-november-at-e4099
3 points
-
By some miracle USPS managed to "redeliver" my already-delivered 4TB NVMe. Team Group did not repair the old one; they sent a new one. Knowing how inexpensive the parts are that go into them, and factoring in the costs of shipping and handling, sending it to Taiwan to "attempt" a repair was truly idiotic and reflects a lack of regard for the people that buy their products. I will still be a brand detractor after this. I've got a bunch of Team Group drives and flash storage, but I am not planning to buy from them again based on this. I changed my Amazon review from 5-star to 1-star based on the 6-week RMA experience.
3 points
-
Yeah, the best bet might be to hang onto the 7900xtx unless you have a specific need you're targeting for upgrading outside a new fun toy. 9070xt, depending on use case, is a side grade at best as is a 5070ti. Anything worth the squeeze is going to be a 5080, 4090, 5090. 4090 and 5090 are very expensive. So as you've found it is all about the 5080. I'd still wait and watch sales heading into black friday. I've been tempted to pull the trigger on a few pieces including another 9070xt for my SFF build but I know there will be some decent sales coming up.3 points
-
Honestly, I would just hang on to the 7900 XTX. Neither vendor has anything appealing for us without spending way too much money. I am waiting to see what the next-generation offerings look like; that being said, I can play everything I want to play at 3440x1440. Perhaps Monster Hunter: Wilds is the only exception, but that game runs like trash on anything right now. I believe you can run a trial for 30 days; there are ways around that "limitation" but I never looked further into it. I used the trial for about that duration and bought a license. The license gives you access to the platform for life, but updates to the OS itself for only 1 year. You can pay for a lifetime update license of course; I'll probably be upgrading my license to lifetime updates. The only part that is a bit quirky at times is finding a flash drive to use, which must have a UUID that it can tie the license to, then of course installing the OS to it. Can't remember how I did mine, but I probably used Rufus.
3 points
-
I was looking at the zotac solid core oc. $999, or the zotac amp extreme $1199. Neither has me terribly excited. I've been contemplating waiting a bit longer to see what AMD has in the pipeline. Hoping that it would be comparable to the 4090.3 points
-
I won't pay $2k for a GPU, let alone those prices. GPU + waterblock + thermal paste/pads. I have a 5080 in my cart, but I cannot seem to justify it. The 5080 is only marginally better than the 7900 XTX.
3 points
-
There is definitely some user error in the equation, along with inherent balancing issues. Luckily, with something like the WireView Pro II we now have a proper tool to monitor balancing just in case, AND it can handle higher loads, along with a serious warranty. Once it ships, it definitely dims the light on the Astral a bit. AMD made a major mistake and damage control is in overdrive. The sad part is that if the tech media and users hadn't responded so negatively, they would have continued with this business as usual. They may change course, but the destination is still set, and that doesn't bode well. Like Steve said in the video, "I said earlier this year don't %*%*% up AMD!" That's all they had to do. They had the momentum, actually sold way more 9070 XT/9070 cards than expected, and had their best sales in quite a while, then proceeded to make dumb mistakes like this.
3 points
-
This really makes you think: with everything done right, and oodles of power slammed through the connection, it is just fine. I would say it is human error coupled with potential QC issues, but I also believe that after the 4090, OEMs/AIBs definitely test their connectors under high-stress conditions, including MSI. I also still stand by the fact that lighter-colored connectors will show scorch marks much more easily than black connectors; to see that a black connector is degrading you will need to either get it to the point of melting and/or have the actual pins scorched too. I would posit that there are probably as many black connectors scorched out there as there are lighter-colored connectors. You just can't detect it as easily with the naked eye. I do not subscribe to the "yellow tip disaster" theory quite yet. We are talking anecdotal evidence meeting potentially the lowest level of confidence. It just doesn't fly. I do subscribe to the "a very small segment of connectors in general burn but we just don't quite know why yet" theory, which applies to both Nvidia and AMD cards that have adopted this connector and includes MSI's cables along with other ones. In that line of questioning, and wondering about cable/tentacle QC, how many Founders Edition 5090s / RTX 6000 Pros have burnt up using the supplied Nvidia adapter cable? You would think their adapters would be the gold standard in all of this....
3 points
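To put rough numbers on the balancing point: heat at each pin goes with the square of the current through it, so a connector sharing current evenly runs far cooler than one where a couple of pins carry most of the load. A sketch using an assumed per-pin contact resistance (illustrative, not a measured spec):

```python
# Per-pin I^2*R heating sketch for a 12V-2x6 style connector.
# CONTACT_RES_OHM is an assumed illustrative figure, not a measured spec.
CONTACT_RES_OHM = 0.005   # ~5 milliohm per mated pin (assumption)
RAIL_VOLTS = 12.0
PINS = 6                  # current-carrying 12 V pins

def per_pin_heat_w(total_watts, shares):
    """Watts dissipated at each pin contact for a given share of total current."""
    total_amps = total_watts / RAIL_VOLTS
    return [(total_amps * s) ** 2 * CONTACT_RES_OHM for s in shares]

balanced = per_pin_heat_w(600, [1 / PINS] * PINS)                     # even split
lopsided = per_pin_heat_w(600, [0.30, 0.30, 0.10, 0.10, 0.10, 0.10])  # two hot pins

print("balanced:", [round(w, 2) for w in balanced])
print("lopsided:", [round(w, 2) for w in lopsided])
```

Even at the same total wattage, the two overloaded pins in the lopsided case dissipate several times the heat of an evenly loaded pin, which is why per-wire balance matters more than the headline power figure.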
-
Interesting video. Brother @electrosoft asked a rhetorical question not long ago about why nobody we know has melted connectors. Nobody could give a definitive answer. This makes me even more curious. 1600W for 15 minutes did not melt the connector. I think the answer might be "defective Chinese trash" after watching this. More accurately, "expensive defective Chinese crap." People are getting worried about exceeding 60-70°C and twice that much heat didn't melt the plastic connector. Thank you. I think I can still do better. Just need to get my core temps to stay down in the 20°C range or lower.3 points
-
At least AMD suffering the problem lets us know it is a spec issue and not an Nvidia GPU side issue but still.... Score another for a lighter colored connector showing clear and visible scorch marks..... The blessing and the curse is my closest MC is ~60min away in St Davids. Close enough to where if I really want to go it isn't that far away yet far enough away to keep me honest and not habitually dropping by "just to look" (translation: Coming home with new goodies more often than not). An hour is a medium sized commute. Very doable. Your poor wallet AND you get access to a newer store vs ours which opened in 1991.....3 points
-
The Taichi is now listed on fleabay along with the 7900xtx.3 points
-
Happy to help. I have not been able to beat any of my Strix high scores with the Apex using my best 9950X. https://hwbot.org/benchmarks/cinebench_-_r23_multi_core_with_benchmate/submissions/5898291 https://hwbot.org/benchmarks/y-cruncher_-_pi-1b/submissions/5904464 I hate cutting aluminum and you are right about using a Dremel or grinder. It loads up the grinding disk with metal and resists the process. Using a hacksaw or a jigsaw is the easiest way to cut aluminum, but you can't use a hacksaw for some things. My MSI X870E Carbon and the Z790i Edge both had the rear I/O heatsink made about 1/16" too long and had contact interference with the GPU backplate. I was able to install the GPU in both motherboards, but it was jammed against the backplate hard enough to damage the anodized finish on the GPU backplate. The NVMe heatsink was also touching the backplate on the Z790i Edge, but not jammed against it super hard. I had to install the GPU first, then the NVMe heatsink.
3 points
-
Amazing, and very much appreciated, reply. Thank you! You've convinced me to sell the Taichi MB. My Gigabyte board had a similar problem with the top PCIe heatsink/CPU waterblock and the GPU backplate. I had to Dremel 1/4" off the PCIe heatsink in order for it all to fit/mesh correctly. Of course the top PCIe slot is needed for the SSD to run at 5.0 and the GPU to run at x16. Cutting aluminum with a Dremel cut-off wheel was excruciating. Aluminum does not cut well, so it was frictioned off! 🤣
3 points
-
Opinions!? Zotac RTX 5080 Solid OC w/ Alphacool water block... or MSI Suprim RTX 5080? Or none of the above? Or hang on to my 7900 XTX till next gen? @Mr. Fox Did you have/test the ASRock X870E Taichi MB? I have a brand new one and was considering swapping it in for the Gigabyte AORUS Elite X870E that's currently in my rig. What were your impressions of it?
3 points
-
My little trooper EVGA 550w Gold gave up the ghost tonight. Somehow my AORUS ITX AM4 motherboard has now survived 2 PSUs (both EVGA). I yanked the 1300w EVGA Gold out of the Unraid system for now; I am planning some hardware swapping though, so I suppose this is a bit of motivation. I recently acquired a Precision 7910, mostly so I can consolidate 3 systems into one. Unraid has been such a consistent OS that I think it will be a good fit for everything I could probably throw at it. I recently purchased a Morpheus 8057 heatsink, and also got a second 280X (Windforce this time) and a PNY 780 Ti w/ Accelero Xtreme III. About 100 USD altogether.
3 points
-
The best thing to take from this is him lending credence to the idea of using the best CPU for the game at hand, and that he is CPU agnostic and owns and uses them all, from CPU to GPU. For some games, Intel is better. For others, X3D is god mode. This has always been the truth. Fallout 76 runs substantially faster on X3D than Intel. In certain places, gobsmackingly so.... I still think Intel handles the critical lows of WoW better than AMD, but since the 9000 series, the dip/chunk is gone with X3D.
I like owning a 9070 XT and a 5090.
I like owning that SP109 14900KS and a 9800X3D.
I *want* to own a 285K platform, but the 275HX Alienware 18 will have to do for now.
I know if a nicely priced, well-binned 285K popped up I'd probably do something stupid and buy it like an idiot. 🤣 Timestamped:
--------------------
Overall, though, Jufes needs to understand there are "hardware enthusiast crowds" and "normal crowds," and most people (like 99% if I had to guess) are just going to buy their laptops and desktops, plug them in, install Steam, Epic, Blizzard, MS, etc.... and play their games. They are never, ever going to get under the hood in any way, shape, or form. They literally freeze up at the idea of updating their BIOS and will come to someone like us, or take it to a Best Buy/MC of sorts, for any type of problem or upgrade. This is why his attacks on HUB and other mainstream sources routinely do not make sense to me. This is also why his channel has been stuck at sub-40k subs forever, growing at a snail's pace. Not that it is a bad channel; I enjoy it. It is because his target audience is a tiny fraction of the audience that watches channels like HUB, JZ2C, LTT and more....
3 points
-
Ouch... sometimes truth is painful. If Jufes is speaking it, almost guaranteed to be painful. Tact isn't his thing, LOL.3 points
-
That is very odd. Does it have dual vBIOS? Maybe it got switched? If not, then maybe a driver update did something. I know NVIDIA has pushed out firmware with drivers in the past. I haven't seen them do that in a long time though. If I remember correctly, it was a real hassle that you went through to get that flashed.
3 points
-
I ordered a thermal sensor from Amazon to connect to the T-Sensor header on my motherboard and it works well. I have the probe inserted between wires in the 12VHPWR cable near the GPU socket. I confirmed the temperature from the T-Sensor in HWiNFO64 is within about 1-2°C of what my IR thermometer shows. Looks like I did not need to worry about my 12VHPWR connector melting unless something changes in terms of load balance. Wonder if the Ampinel will release on schedule with enough stock to not immediately sell out? It is supposedly going to be available for pre-order, but I'm not seeing that it has been yet. https://hwbot.org/benchmarks/3dmark_-_steel_nomad_dx12/submissions/5914714
3 points
-
Exactly what @electrosoft looks for from modern software 🙂 "OCCT version 15 adds coil whine detection that doesn't require a microphone - popular stress tester gets genius new feature to silence your PC" Me: I just bought this one. Around $167 USD incl. the dreaded Norwegian tax and with shipping. Not the black but the white version, due to the price cut when we ordered. The black one was $50 more. I hope I don't regret going cheapo, LOL
3 points
-
We need more options like this stateside. We desperately need another EVGA, stat..... Basically letting you push voltage to 1.15 V and 2000 W via software and vBIOS is beautiful. We shouldn't need to rip our cards open to shunt them and attach an EVC (especially on cards under ~1.095 V) to enable functionality like this. Crazy that the XOC vBIOS still hasn't leaked almost 10 months later; we're creeping up on a year since launch. I agree on the waterblock purchase too. It will not only let you push this beauty, but will help with resale value later.
3 points
-
24 hours with this machine. My feelings and findings so far:
The body is really sturdy with amazing torsional strength; it does not twist, it just acts like an ingot. One screw (front right corner) was loose from the factory. I have not opened the device yet.
As for the problem with the bottom door: in the 3D visualisation on Dell's website, the laptop is pictured with some sort of halved, sliding bottom door (the one that I posted prior to this post). I do not have this; I have a normal bottom cover that is one piece. Does anybody here have a PMP18 with this sliding door?
The keyboard is... well, it's not a ThinkPad, but it is not that HP abomination either. It's quite good. No complaints except for the arrow keys.
The 2560x1600 120Hz panel looks amazing color-wise, almost like IPS Black from the new UltraSharps, but it's slow AF. Ghosting is just there. Lenovo had a way better panel in the P16 G2 in this regard. I miss that screen, it was really good.
The touchpad is very pleasant to work with; both its surface and the click feedback are good.
Fan behavior is very good; at least during office work and basic multitasking it is essentially passive or just very quiet. I have not had a chance to try out proper loads other than benchmarks. Surface temperatures are pleasant.
Sound coming from the speakers is nice. Not MacBook Pro-level nice, but usable for both movies and some background music.
Weirdly bugged Optimus. I installed the iGPU drivers from Dell. They install as a driver only, without the control panel. If I install the control panel for the Intel iGPU manually from the MS Store, it results in bugged Optimus, with the dGPU (RTX 4000) running at 8W idle basically non-stop. Have you encountered this as well?
The battery life is all over the place, ranging from 2 to 6 hours in idle with no apparent cause. The total system consumption is very wild, going from basically zero to 50W, again without doing anything other than a web browser with a few tabs (brightness at 7 out of 10).
The fingerprint reader dropped out once in 24 hours; it was missing in Device Manager with no apparent cause and was fixed by a reboot. I hope that I won't have to RMA this a week after delivery.
The internal, factory 1TB Gen5 (Samsung PM9E1) drive tops out at +- 8400/4000 R/W and I do not know why.
The event log keeps getting spammed with "Smart Card Reader 'Microsoft UICC ISO Reader rejected IOCTL TRANSMIT: Parameter is incorrect" - seems like a SIM/eSIM issue; I do not use WWAN currently. Anyone getting these errors on their machine? Same thing with PCI Express Endpoint / PCI Express Legacy Endpoint WHEA Corrected errors. I am unable to find the source of these. The laptop seems to be working fine.
The score in PassMark seems okay, getting about 62k / 4.8k MT/ST ratings, which is quite a bit higher than the median for the 285HX.
This is just some basic rambling about the machine; I will do a proper review once I feel "at home" with it. That feeling ain't coming yet.
3 points
-
3 points
-
OK... got the Apex replacement. Seems good so far.3 points
This leaderboard is set to Edmonton/GMT-07:00