NotebookTalk

Leaderboard

Popular Content

Showing content with the highest reputation since 10/07/2025 in all areas

  1. The stock voltage is 1.15V and the stock power is 600W. The XOC vBIOS is publicly available on TechPowerUp and is 2001W, which further unlocks voltage and gives the memory another +1000 over other 5090s. Some nice individual over at OC.net PM'd me the extra-special HOF tool for even further voltage, LLC, and switching period control. Basically the same as the old KingPin tools. Ordered the HOF 5090D IceMan waterblock on AliExpress. Figured the card is pretty rare and any future owner will definitely want the ability to put it under water to take advantage of the voltage and power limits.
    7 points
  2. Sometimes I feel like I make the worst decisions. Found this card brand new from a small shop in Florida.
    6 points
  3. Your explanation piques my curiosity, so I may have to investigate it. Even if it turns out I do not have a need for it, I am still curious. I would say at least 1300W. The Lian Li Edge seems like a solid and affordable option; I have one in my 4090 build. I have a Thermaltake GF3 1650W PSU in the 5090 build. I like both of them. The GF3 is hard to find. It has dual 12VHPWR sockets and something like six PCIe 8-pin sockets in addition. I may have to drop down to 0.001 Ohm shunt resistors to bump my power limit. I backed off my core clock a bit and the scores went up. I ran the benchmark again with GPU-Z and HWiNFO64 running on my second monitor so I could watch, and it is still showing power limit as the perf cap reason, so... hmmm. I'm pulling 1350W from the wall already. HWiNFO64 shows about 959W on the GPU power rails. Apparently that's not enough. My core temp is still hitting 41°C with 9°C water, so that's not helping. I guess I am going to have to think about using liquid metal on the GPU. I don't want to, but 41°C is definitely not helping. https://www.3dmark.com/3dm/144631382 | https://hwbot.org/benchmarks/3dmark_-_steel_nomad_dx12/submissions/5918232
    4 points
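The shunt mod mentioned above (dropping to 0.001 Ohm shunt resistors to bump the power limit) comes down to a single ratio: the VRM controller senses current as the voltage drop across the shunt, so a smaller shunt under-reports current and the real power ceiling scales up accordingly. A minimal sketch of that arithmetic; the stock shunt value and the 1000W firmware cap used in the example are hypothetical, not the actual values for this card:

```python
def effective_power_limit(firmware_limit_w: float,
                          stock_shunt_ohm: float,
                          new_shunt_ohm: float) -> float:
    """Shunt-mod math: the controller reads I = V_drop / R_shunt, so swapping
    in a smaller shunt under-reports current by stock/new, and the effective
    power limit rises by that same factor."""
    return firmware_limit_w * (stock_shunt_ohm / new_shunt_ohm)

# Hypothetical example: 2 mOhm stock shunts swapped for 1 mOhm under a 1000W cap
# doubles the real ceiling to 2000W at the same firmware setting.
print(effective_power_limit(1000, 0.002, 0.001))
```

The same ratio also explains why monitoring software under-reports actual board power after the mod: it still believes the stock shunt value.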
  4. It's an OS you don't actually interact with directly like you would most Linux distros or Windows. Once it's installed, you interact with it via the web browser of your choice. Its main selling point is acting as your file server, and you can add two drives for parity as well. The other large value-add is containers; the possibilities are, in my view, endless, though it wouldn't surprise me if people more adept in that line of expertise would say otherwise. Most importantly, it's just rock solid and is installed entirely on a flash drive: simply note the serials of the drives and how you arranged them in the GUI, and you can easily transport it to another system and have it boot up without missing a beat. They've also recently added official Tailscale support, so friends and family can have access relatively easily. It's not something I would expect people here to have too much interest in, but it's been a thrilling experience for me while I gather up GPUs for benching this winter. I have a particular fondness for anime; make no mistake, it's an ocean of urine like any genre, but it has long posed a challenge in media management. Unraid has many containers that make this process much simpler. More importantly, just rock-solid stability; I wish Windows was as consistent. To answer both of you... they weren't the same. Some may recall I had a 1300W PSU that was recommended by most here at the time. That one gave up the ghost, and EVGA replaced it with a different 1300W model, which I had been using with Unraid for a while; it's now powering my ITX system. An open-style chassis allows much more freedom than a case would in this regard. I've also killed an 850W EVGA Gold PSU, but like I said, I killed it, so I did not seek RMA for it. Not sure what the issue was with the 550W; I'm guessing it didn't like being power cycled as quickly as I had done it. In some ways it does clean things up a bit since I am no longer using the 1200W (900W for 110V) server PSU to power the 7900 XTX.
Which now opens things up for a new PSU on the bench system (10850K). Should I start looking around for another 1300W?
    4 points
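The drive-serial bookkeeping that makes Unraid moves like the one described above painless can be scripted instead of written down by hand. A minimal sketch, assuming a Linux host with `lsblk` available; the device names and serials in the sample JSON are made up for illustration:

```python
import json

def drive_serials(lsblk_json: str) -> dict:
    """Map each disk name to its serial, given `lsblk -J -d -o NAME,SERIAL` output."""
    data = json.loads(lsblk_json)
    return {d["name"]: d.get("serial") for d in data["blockdevices"]}

# On a live system you would capture real output, e.g.:
#   out = subprocess.run(["lsblk", "-J", "-d", "-o", "NAME,SERIAL"],
#                        capture_output=True, text=True, check=True).stdout
# Here a hypothetical sample stands in for it:
sample = '{"blockdevices": [{"name": "sda", "serial": "WD-ABC123"}, {"name": "sdb", "serial": "ST-XYZ789"}]}'
print(drive_serials(sample))
```

Dumping that mapping to a text file on the Unraid flash drive itself keeps the slot assignments with the OS when it moves to new hardware.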
  5. Off-Topic: The Phoenix Micro Center grand opening for VIP members is 11/5. My wife better hide my wallet. 🤣 It's an hour each direction from where I live, but at least it's not the 6-hour drive each direction to Tustin, CA.
    4 points
  6. For the GPU, I would say select whichever one costs less, especially if you are putting a waterblock on it. My Zotac 5090 Solid OC was a good buy compared to the other, more expensive options that deliver nothing for the extra money. It is an excellent GPU. The air cooler on it was fantastic (unlike some of the other affordable brands/models); it ran freakishly cool for an air-cooled GPU. The only 50-series GPU I would recommend avoiding like the plague is an FE model. For overclocking potential, the most consistently good GPU silicon quality is probably an AORUS Master, but the cost vs. benefit isn't justified. I love overclocking more than anything else I do with a computer, and it is really the only reason computers matter to me at this point, but paying a WHOLE LOT more for a very small gain in GPU benchmark scores is just not a very intelligent decision. I have owned the following X870E motherboards, and I list them in my order of preference: X870E AORUS Master (best overall - only flaw is no way to disable WiFi/BT in BIOS), X870E-E Strix (replacement for a second AORUS Master that arrived with shipping damage), X870E Apex (returned first for refund, second was junk, I am using #3), X870E Carbon (returned for refund - good mobo, but no async BCLK and weird glitches), X870E Taichi (my least favorite of all the AMD motherboards I have owned - hated it). The Taichi's PCIe bifurcation was garbage and I did not care for the firmware. I have only owned two ASRock motherboards and did not like either one. I had a B850 AORUS Elite that I used in a build for my granddaughters and it was excellent. The only criticism I had was that the PCIe slots below the GPU slot were X1, but using them did not drop the GPU to X8. This is unavoidable on anything below the X870E dual chipset due to a lack of PCIe lanes on a non-"E" AMD motherboard. PCIe X1 dramatically reduces NVMe speed... it makes NVMe speed like a SATA SSD.
If you plan to insert anything in other PCIe slots in addition to your GPU in an AMD motherboard, the "E" version is an absolute must-have. The only complaint I have with the Gigabyte boards is that there is no way to disable WiFi/BT in the BIOS. A super stupid flaw they could fix effortlessly if they cared. If you use WiFi/BT and use Windoze 11 as your main OS (I do not do either one), this truly is a non-issue. It really pissed me off that Gigabyte did not provide that option in the BIOS. I asked twice, and both times they said no... "you're the only person complaining about it" (essentially: we don't care what you want, and you are not worth the minimal effort needed to make a BIOS as good as our competitors'). Gigabyte is the only brand I know of that omits this essential, basic BIOS option. The Strix was an accidental blessing. I purchased a second AORUS Master from Central Computers on sale for less than what I paid for the first. The big and heavy NVMe heatsink under the GPU was not latched; apparently it shipped from the factory that way. It flopped around inside the box, broke several things, and scratched up the things that did not get broken. I asked them to open the box and inspect before shipping a replacement. They had quite a few in stock and ended up opening all of the boxes, and all were damaged in the same way. They offered the Strix for no difference in price. I accepted. The Strix is better than the AORUS Master in terms of firmware. A close second only because I could not install both of my Sabrent quad NVMe X4 cards like I could in the Master; it only has one extra PCIe slot. The AORUS had two, both usable at X4 without dropping the GPU from X16 to X8. The AORUS Master allowed me to install 10 NVMe SSDs and 4 SATA drives while maintaining the GPU at X16. The Apex is a great motherboard with a glaring engineering defect entirely due to an idiotic PCIe slot arrangement: I can only use the X4 PCIe slot above the GPU.
The Sabrent quad NVMe card's heatsink touches the GPU backplate. The Strix performs as well as the Apex in terms of CPU overclocking. It has async BCLK, and I can use the Sabrent card in the bottom slots without the GPU dropping to X8 like it does in the Apex. If I had known everything I know now before buying my first, I probably would have purchased two X870E-E Strix Gaming WiFi boards. If I were going to recommend one, it would be the X870E-E Strix as the best all-around X870E motherboard with the fewest flaws and compromises. Hope this helps. https://www.newegg.com/asus-rog-strix-x870e-e-gaming-wifi-atx-motherboard-amd-x870e-am5/p/N82E16813119682
    4 points
  7. some recent highlights, ambient https://valid.x86.fr/q4pt03
    4 points
  8. Seems like increasing load line to 150% and setting Vout Command for the value you want your voltage offset to match helps with stability. Just setting the offset and doing nothing else is less consistent. This run flat-lines at 3367 MHz core and 1.150V. I am keeping a finger on the 12VHPWR connector and while I can feel it getting somewhat warmer it is not feeling "hot" even though my meter is showing 1350W from the wall. https://hwbot.org/benchmarks/3dmark_-_port_royal/submissions/5914106 | https://www.3dmark.com/3dm/144195067
    4 points
  9. This definitely makes a difference in maximum clocks. Setting 1.100V, I have no problem running 3450MHz on core. But the scores go up very little and power draw goes up about 250-300W. I also need to change thermal paste and go back to KPX or Kryosnot, because the Alphacool Apex I slapped on there after the EVC2 mod is no bueno. The core temp delta is much higher, even with the chiller. The Apex thermal paste is very similar in performance to a PTM pad: very durable and consistent, but too much thermal resistance. https://hwbot.org/benchmarks/3dmark_-_steel_nomad_dx12/submissions/5914059 | https://www.3dmark.com/sn/9406069 I glued magnets to the back of the plastic EVC2 case and stuck it next to the motherboard below the SATA ports. When I switch to a more effective thermal paste, I will route the EVC2 wiring out of the end near the SATA ports so the wiring is less visible. I also redid part of the loop: I added a ball valve at the first radiator that I can turn off, and a QDC fitting to connect the return line to the chiller, so this totally bypasses the radiators but still uses all five of my D5 pumps. Even though the temps are impeding the benchmark score, look at how much higher the clocks are before the NVIDIAtard room-temperature thermal degradation algorithm effs things up. https://www.3dmark.com/compare/sn/8515121/sn/9406069#
    4 points
  10. Now that I have confirmed it works I will devise a cleaner way of mounting this inside of the chassis.
    4 points
  11. Yeah! EVC2 mod works. As a quick test, I goosed the core voltage a little and ran an MSI Kombustor and my GPU is pulling another 250W from the wall now (about 1350W) with everything else stock. I purchased a PCB heater and man... so much easier to solder when the PCB is warmed up.
    4 points
  12. The 5090D is the only Galax HOF available. Only 100 of these cards were made. I got it for about $800 over an Astral, which in hindsight is still a terrible deal lol. But at least it's semi-unlocked and can go to 36Gbps memory. Can you DM me the Galax HOF software? I want to see if it works on a 5090.
    4 points
  13. Hello everyone, long time no speak. I've been super busy with stuff. Today I was pondering Windows 7 and was wondering if this was possible. I'm not too familiar with the ins and outs of hardware documentation, so I asked Grok AI this question.
    4 points
  14. All I'm saying is the math isn't mathing... and luck is not at play in engineering fault-tolerance testing. Plus, with the user reports, all we have is their subjective takes ("I swear I checked everything right before going to church and feeding the homeless!"), usually rife with oversights and human error we do not know or see. All we see is the end product, which is a burned cable and/or fused connection. Then people run with it... If the connector were that flawed, those who routinely push their cards should suffer a noticeably higher rate of failure, yet amongst the enthusiasts on the forums the rate of failure is minuscule, if even that. I've thought this since the 4000 series and it continues to prove itself out over and over. I frequent the various enthusiast forums, enthusiast FB tech groups, and tech-focused Discords, along with clans and guilds chock full of 4090 and 5090 owners gaming 8-12 hrs/day and more, waiting for this influx of burned connectors amongst the various cable types used, considering how hard the hardware is being pushed even on non-shunted cards running max OCs constantly hitting 600-650W... yet... much ado about nothing. I ran my 4090 OC'd from start to finish. I am doing the same with my 5090 too. One thing I have done with my cards since Ampere is run them max underclocked when not gaming/benching, then max OC when gaming. Look, the cable is a cheap part to replace. I can swap in numerous types or models if I want. That isn't the issue. Zero loyalty to a damn connector cable 🤣 The one and only time I had issues was with the CM adapter cable I was going to use: it kept black-screening and crashing, and it turned out to be severely problematic, a legitimate issue even outside of human error or potential inherent problems. The replacement one they sent me is still sitting sealed in its pouch on my shelf. I am really curious about the Ampinel and am almost 100% sure I'll be picking one up just out of curiosity.
And you're correct, many on OC.net DO spend more money on quality products than normies. That just adds even more weight to my argument. With all that being said, I still wish they had stuck with 8-pin PCIe connectors, but I guess with the way power consumption is going, we would end up with cards with 4-5 of them, if not more, on certain models. Imagine a HOF with eight 8-pin connectors... 🤣
    4 points
  15. From the beta thread in their Discord. https://discord.com/channels/750797327874129930/752301932558942302 Alternatively https://www.overclock.net/posts/29518003/
    4 points
  16. Dang. One set of dark sunglasses ain't enough for all that RGB 😁
    4 points
  17. aight boiiiiz, new setup is done, pretty happy with how it turned out 🙂
    4 points
  18. I did! Just finished cleaning it up. It's about 5 slots now; these fans are loud, but they push a lot of air at 100%. Either way, it's ready for benching this winter. I also cleaned up two 650 Ti Boost 2GB cards.
    4 points
  19. holy crap one of the most iconic songs of all time, blasting it through my headphones right now yessssss 😄 ladies are asleep, finally some alone time for daddy to continue tinkering 😄
    4 points
  20. im in the same position here, will likely need to replace my 90° Seasonic cable, probably going back to the straight Seasonic one that came stock with my 1600W PSU, thus switching back from current top routing to bottom. should be fine 🙂
    3 points
  21. By some miracle, USPS managed to "redeliver" my already-delivered 4TB NVMe. Team Group did not repair the old one; they sent a new one. Knowing how inexpensive the parts are that go into them, and factoring in the costs of shipping and handling, sending it to Taiwan to "attempt" a repair was truly idiotic and reflects a lack of regard for the people that buy their products. I will still be a brand detractor after this. I've got a bunch of Team Group drives and flash storage, but I am not planning to buy from them again based on this. I changed my Amazon review from 5-star to 1-star based on the 6-week RMA experience.
    3 points
  22. Yeah, the best bet might be to hang onto the 7900 XTX unless you have a specific need you're targeting for upgrading, outside of a new fun toy. The 9070 XT, depending on use case, is a side grade at best, as is a 5070 Ti. Anything worth the squeeze is going to be a 5080, 4090, or 5090, and the 4090 and 5090 are very expensive. So, as you've found, it is all about the 5080. I'd still wait and watch sales heading into Black Friday. I've been tempted to pull the trigger on a few pieces, including another 9070 XT for my SFF build, but I know there will be some decent sales coming up.
    3 points
  23. I was looking at the Zotac Solid Core OC, $999, or the Zotac AMP Extreme, $1199. Neither has me terribly excited. I've been contemplating waiting a bit longer to see what AMD has in the pipeline, hoping it would be comparable to the 4090.
    3 points
  24. LOL, that would be the sane move. I think we should count our blessings we got in on the 5090s when prices were low. Your Zotac Solid OC is out of stock everywhere, and the cheapest price atm is $3279.99+ on Amazon. Even open-box used is going for $2612... The new base price on Newegg, when in stock, is back up to $2899.99. It is better, but it won't give you that massive upgrade uplift you like to experience, true. With Black Friday specials starting to activate earlier and earlier now, keep an eye out for a good 5080 sale, which is bound to happen, or do as @Papusan suggests and hunt down a used 5080 in the $750-850 range depending on model. Which model(s) did you have in mind for the 5080?
    3 points
  25. I won't pay $2k for a GPU, let alone those prices. GPU + waterblock + thermal paste/pads... I have a 5080 in my cart, but I cannot seem to justify it. The 5080 is only marginally better than the 7900 XTX.
    3 points
  26. There is definitely some user error in the equation, along with inherent balancing issues. Luckily, with something like the WireView Pro II we now have a proper tool to monitor balancing just in case, AND it can handle higher loads, along with a serious warranty. Once shipped, it definitely dims the light on the Astral a bit. AMD made a major mistake and damage control is in overdrive. The sad part is that if the tech media and users hadn't responded so negatively, they would have continued with this business as usual. They may change course, but the destination is still set, and that doesn't bode well. Like Steve said in the video, "I said earlier this year, don't %*%*% up, AMD!" That's all they had to do. They had the momentum, actually sold way more 9070 XT/9070 cards than expected, and had their best sales in quite a while, then proceeded to make dumb mistakes like this.
    3 points
  27. Interesting video. Brother @electrosoft asked a rhetorical question not long ago about why nobody we know has melted connectors. Nobody could give a definitive answer. This makes me even more curious. 1600W for 15 minutes did not melt the connector. I think the answer might be "defective Chinese trash" after watching this. More accurately, "expensive defective Chinese crap." People are getting worried about exceeding 60-70°C and twice that much heat didn't melt the plastic connector. Thank you. I think I can still do better. Just need to get my core temps to stay down in the 20°C range or lower.
    3 points
  28. At least AMD suffering the problem lets us know it is a spec issue and not an Nvidia GPU side issue but still.... Score another for a lighter colored connector showing clear and visible scorch marks..... The blessing and the curse is my closest MC is ~60min away in St Davids. Close enough to where if I really want to go it isn't that far away yet far enough away to keep me honest and not habitually dropping by "just to look" (translation: Coming home with new goodies more often than not). An hour is a medium sized commute. Very doable. Your poor wallet AND you get access to a newer store vs ours which opened in 1991.....
    3 points
  29. @Mr. Fox Screenshot: something about needing Nvidia closed-source firmware
    3 points
  30. Happy to help. I have not been able to beat any of my Strix high scores with the Apex using my best 9950X. https://hwbot.org/benchmarks/cinebench_-_r23_multi_core_with_benchmate/submissions/5898291 https://hwbot.org/benchmarks/y-cruncher_-_pi-1b/submissions/5904464 I hate cutting aluminum, and you are right about using a Dremel or grinder: it loads up the grinding disc with metal and resists the process. Using a hacksaw or a jigsaw is the easiest way to cut aluminum, but you can't use a hacksaw for some things. My MSI X870E Carbon and the Z790i Edge both had the rear I/O heatsink made about 1/16" too long, causing contact interference with the GPU backplate. I was able to install the GPU in both motherboards, but it was jammed against the backplate hard enough to damage the anodized finish on the GPU backplate. The NVMe heatsink was also touching the backplate on the Z790i Edge, but not jammed against it super hard. I had to install the GPU first, then the NVMe heatsink.
    3 points
  31. Opinions!? Zotac RTX 5080 Solid OC with an Alphacool waterblock... or MSI Suprim RTX 5080? Or none of the above? Or hang on to my 7900 XTX till next gen? @Mr. Fox Did you have/test the ASRock X870E Taichi motherboard? I have a brand new one and was considering swapping it in for the Gigabyte AORUS Elite X870E that's currently in my rig. What were your impressions of it?
    3 points
  32. My little trooper EVGA 550W Gold gave up the ghost tonight. Somehow my AORUS ITX AM4 motherboard has now survived two PSUs (both EVGA). I yanked the 1300W EVGA Gold out of the Unraid system for now; I am planning some hardware swapping anyway, so I suppose this is a bit of motivation. I recently acquired a Precision 7910, mostly so I can consolidate three systems into one. Unraid has been such a consistent OS that I think it will be a good fit for everything I could throw at it. I also recently purchased a Morpheus 8057 heatsink, a second 280X (Windforce this time), and a PNY 780 Ti with an Accelero Xtreme III. About $100 altogether.
    3 points
  33. Ouch... sometimes truth is painful. If Jufes is speaking it, almost guaranteed to be painful. Tact isn't his thing, LOL.
    3 points
  34. Exactly what @electrosoft looks for from modern software 🙂 OCCT version 15 adds coil whine detection that doesn't require a microphone — popular stress tester gets genius new feature to silence your PC. Me, I just bought this one. Around $167 USD, including the dreaded Norwegian tax, and with shipping. Not the black but the white version, due to the price cut when we ordered; the black one was $50 more. I hope I don't regret going cheapo, LOL
    3 points
  35. 24 hours with this machine. My feelings and findings so far: The body is really sturdy with amazing torsional strength; it does not twist, it just acts like an ingot. One screw (front right corner) was loose from the factory. I have not opened the device yet. As for the problem with the bottom door: in the 3D visualisation on Dell's website, the laptop is pictured with some sort of halved, sliding bottom door (the one that I posted prior to this post). I do not have this; I have a normal bottom cover that is one piece. Does anybody here have a PMP18 with this sliding door? The keyboard is... well, it's not a ThinkPad, but it is not that HP abomination either. It's quite good. No complaints except for the arrow keys. The 2560x1600 120Hz panel looks amazing color-wise, almost like IPS Black on the new UltraSharps, but it's slow AF. Ghosting is just there. Lenovo had a way better panel in the P16G2 in this regard. I miss that screen; it was really good. The touchpad is very pleasant to work with; both its surface and the click feedback are good. Fan behavior is very good, at least during office work and basic multitasking, where it is effectively passive or just very quiet. I have not had a chance to try proper loads other than benchmarks. Surface temperatures are pleasant. Sound coming from the speakers is nice. Not MacBook Pro-level nice, but usable for both movies and some background music. Weirdly bugged Optimus: I installed the iGPU drivers from Dell. They install as a driver only, without the control panel. If I install the control panel for the Intel iGPU manually from the MS Store, it results in bugged Optimus, with the dGPU (RTX 4000) running at 8W idle basically non-stop. Have you encountered this as well? The battery life is all over the place, ranging from 2 to 6 hours in idle with no apparent cause.
The total system consumption is very wild, going from basically zero to 50W, again without doing anything other than a web browser with a few tabs (brightness at 7 out of 10). The fingerprint reader dropped out once in 24 hours; it was missing in Device Manager with no apparent cause and was fixed by a reboot. I hope that I won't have to RMA this the week after delivery. The internal, factory 1TB Gen5 drive (Samsung PM9E1) tops out at roughly 8400/4000 R/W and I do not know why. The event log keeps getting spammed with "Smart Card Reader 'Microsoft UICC ISO Reader' rejected IOCTL TRANSMIT: Parameter is incorrect"; it seems like a SIM/eSIM issue, though I do not use WWAN currently. Is anyone getting these errors on their machine? Same thing with PCI Express Endpoint / PCI Express Legacy Endpoint WHEA corrected errors; I am unable to find the source of these, but the laptop seems to be working fine. The score in PassMark seems okay, getting about 62k / 4.8k MT/ST ratings, which are quite a bit higher than the median for the 285HX. This is just some basic rambling about the machine; I will do a proper review once I feel "at home" with it. That feeling ain't coming yet.
    3 points
  36. OK... got the Apex replacement. Seems good so far.
    3 points
  37. Congrats bro. Both 4000 and 5000 series HOF have a nice clean design. Did you get the Galax XOC software? It just works. Why change to the new and shiny if the old still works bro @Mr. Fox? LOL
    3 points
  38. I do often tell people that they can sleep when they're dead! But that doesn't mean much coming from me, since I proudly proclaim sleep as my favorite activity. Looks like the seller moved on to a different buyer for the 460 SLI set; oh well.
    3 points
  39. I probably would not even try to refrain. Sleep is optional when there are new parts to be installed. 🤣 UPS tried to deliver and he was not home so he will be picking it up from a UPS location probably later today.
    3 points
  40. Microsoft chose to fight their customers and destroy themselves; no empathy for them. I knew it wouldn't be long until there was another way to make a local account. I bet they're going to patch these methods too eventually, but then there will be another way to counter their fix! In other news, Windows 7 is slowly coming back! https://gs.statcounter.com/os-version-market-share/windows/desktop/worldwide @Mr. Fox will be happy 😁
    3 points
  41. I do not want to jinx any of us by saying too much. Is it because we do not use janky cables? Is it because we are extra careful about making sure cables are fully seated? Is it because the number that burn are a very tiny percentage that the media blows grossly out of proportion because they live or die based on click bait? Or, is it because we are lucky? Let me see what the Magic 8 Ball says...
    3 points
  42. As soon as I saw this, I thought, "Oh, he's purposely trying to trigger @Papusan and @Mr. Fox!" 🤣 Looks good! I, too, loop my connector(s) over the top of my GPUs when able so you can see the whole card and it takes a bit of the downward pull off the connector. I've done this since Ampere as an additional reason on top of aesthetics. Hmmmm, I wish you had the other board too to see if you could reproduce this on two of them. Might need to talk to other X870e owners to try to recreate a similar test scenario to see if they can trigger a memory fail.
    3 points
  43. Me... I love all black. Every day the same. Even my house is painted black 🙂 And me supporting Microsoft with my money? Nope, won't happen. Hell will freeze over first!
    3 points
  44. Love by the sword, die by the sword @Papusan!! Time to order your pair (or two!) ASAP! 🤣
    3 points
  45. I am trying to figure out how to run Cinebench in real time under Linux. The renice and chrt commands only seem to work on one thread and skip all of the child threads. Maybe it is because Wine is being used to run it. Scores are about what I would expect for no tweaks.
    3 points
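On the real-time Cinebench question in the last post: `chrt -p` against the main PID changes only that one thread, but every child thread has its own TID listed under `/proc/<pid>/task`, and the scheduler call accepts TIDs directly. A minimal Python sketch of walking them all; this assumes Linux, setting SCHED_FIFO requires root (or CAP_SYS_NICE), and the priority of 50 is an arbitrary example value:

```python
import os

def thread_ids(pid: int) -> list[int]:
    """All thread IDs of a process, read from /proc/<pid>/task (Linux only)."""
    return sorted(int(t) for t in os.listdir(f"/proc/{pid}/task"))

def set_realtime_all_threads(pid: int, priority: int = 50) -> None:
    """Apply SCHED_FIFO to every thread of a process, not just the main one.

    renice/chrt on the main PID leave child threads untouched, which is why
    only one thread gets boosted; this loop covers them all. Run as root.
    """
    param = os.sched_param(priority)
    for tid in thread_ids(pid):
        # On Linux, sched_setscheduler() accepts a TID and affects that thread only.
        os.sched_setscheduler(tid, os.SCHED_FIFO, param)
```

The same /proc/<pid>/task walk works for Wine-hosted processes, since Wine threads are ordinary Linux threads underneath; new threads spawned after the loop runs would still need a second pass.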