Everything posted by Etern4l
-
Strong option, many thanks. TBW is actually 1400 for the 2TB model vs 2500 for the FireCuda, but that's decent. The reason I'm looking for high TBW is that the usage will be fairly heavy - it might easily run into tens of TB per month - so I'm looking for something with a bit of headroom. I actually have a 2TB Corsair drive, which has been good to me, but it's mostly used as storage. Will def read some reviews on this.
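To put rough numbers on that headroom - a quick sketch (the 30 TB/month write rate is an assumption for illustration, not a measured figure):

```python
# Rough endurance estimate: how long a drive's TBW rating lasts
# at a constant monthly write volume. Figures are illustrative.

def endurance_years(tbw: float, tb_written_per_month: float) -> float:
    """Years until the rated TBW is exhausted at a constant write rate."""
    return tbw / tb_written_per_month / 12

# 2500 TBW (FireCuda 530 2TB) vs a 1400 TBW drive, at a heavy 30 TB/month:
for name, tbw in [("FireCuda 530 2TB", 2500), ("1400 TBW drive", 1400)]:
    print(f"{name}: ~{endurance_years(tbw, 30):.1f} years at 30 TB/month")
```

At that rate the 2500 TBW drive gives roughly 7 years of rated endurance vs roughly 4 for the 1400 TBW one - hence the appeal of the extra headroom.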
-
*Official Benchmark Thread* - Post it here or it didn't happen :D
Etern4l replied to Mr. Fox's topic in Desktop Hardware
Nice. Not sure what der8auer was talking about with the 320W power draw. Edit: Ah, so this suggests the CPU is power-limited out of the box, and removing the limits plus some OC pushes the power draw to 388W and 399W respectively 🤣 (probably for very little benefit though) -
Hello, after some brief research the answer seems to be clearly the FireCuda 530 - nothing else comes close to 2500 TBW for 2TB. It performs reasonably well too. Any experiences with this SSD, or perhaps some alternatives I missed? How is the real-life reliability?
-
Right, but of course the smoothness of CG motion is a function of FPS. With input lag there is a bit more neural complexity involved; however, we can also perceive the "smoothness" of a response very well. In the real world, a 100ms "input lag" would probably correspond to really heavy intoxication 🙂 Yeah, I don't know where that "60 fps or so upper limit on FPS perception" myth came from. Possibly born back in the CRT era, when they were figuring out at what point we stop perceiving the scanning effect. I would argue that's very different, because there we are not talking about perceiving FPS, but about perceiving individual lines (or pixels) being drawn - a process 2-6 orders of magnitude quicker. -
This is a common misconception. Reaction times are different from perception and can indeed be much longer. Pretty much everyone in good health can perceive the difference in smoothness when dragging a window around a reasonably sized monitor at 60Hz, 120Hz and 240Hz - with this simple exercise we are already in the territory of perceiving sub-10ms phenomena. -
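To put numbers on that window-dragging exercise - the frame times involved (just arithmetic, nothing assumed beyond the refresh rates):

```python
# Frame times behind the 60/120/240 Hz comparison: the deltas being
# perceived are well under 10 ms.

def frame_time_ms(hz: float) -> float:
    """Time per refresh in milliseconds."""
    return 1000.0 / hz

for hz in (60, 120, 240):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per frame")

# Step sizes between the modes:
print(f"{frame_time_ms(60) - frame_time_ms(120):.2f} ms")   # 60 -> 120 Hz
print(f"{frame_time_ms(120) - frame_time_ms(240):.2f} ms")  # 120 -> 240 Hz
```

Telling 120Hz from 240Hz apart means perceiving a roughly 4ms difference in frame time.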
A reputable retailer here says available tomorrow for... $840. -
You keep referring to some DF video, although you haven't provided a link recently. Monitor latency is a component of the total system latency. 10ms is deemed significant for it - and therefore for any component of the total latency - by a major and fairly sophisticated review site, and this is in reference to the impact on the overall latency as perceived by the user. Again, it's highly subjective and dependent on the relevant user's reaction time/sensitivity to some extent. All good as long as neither the extra latency nor the artifacts detract from the user experience. Well done scoring a 4090! Edit: BTW DLSS artifacts don't just show up in between frames - since the two "master" frames are also DLSSed, some of the artifacts and the loss of quality are permanent and very visible in many cases (rather than just imperceptible flickers between two perfect frames). -
A link to scrutinize would help, as I'm not even sure how the input lag was measured. For example, 10ms of extra monitor input lag alone would be very noticeable, see https://www.rtings.com/monitor/tests/inputs/input-lag I wish I knew what you are referring to re the 100ms input lag. Atari 2600 Pong perhaps? There is an absolute technological chasm between DLSS and prior technologies, and the artifacts are on a different level as well. For example here: DLSS 3 introduced a gigantic quantity of artifacts which, applied together, kind of look plausible. In the example above the algorithm invented some cloud reflections it assumed should be there, the colours are off, the details are blurred etc., but it all looks convincing if you don't know what the raw image looks like. Think of this as deepfakes gone wrong. There are lots of other examples involving conjured-up/overdone reflections for some reason. Here is an interactive comparison. Shocking! https://imgsli.com/MTI2NzI5 -
The latency would basically double, so in your case 25ms to 50ms. The impact would be highly title-dependent. Perfect for a relaxed flight sim for sure; anything that requires frequent and precise control might start feeling slightly slippery, depending on one's perception threshold of course. Naturally, the higher the raw framerate, the lower the risk of input lag issues, so this is probably a better tech for going from 120 to 240 fps than from 60 to 120, or from 30 to 60. Then there are the model errors / artefacting - plenty of examples on YT to enjoy. The beauty of it is that most people wouldn't notice, since they wouldn't routinely scrutinise correctness vs the raw material. Clever Nvidia, good for them. -
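A deliberately simplistic model of why the latency roughly doubles - frame interpolation has to hold back one rendered frame before it can generate the in-between one, adding about one raw-frame time of delay (real pipelines are more complicated, so treat this as a sketch):

```python
# Naive frame-generation latency model: displaying interpolated frames
# requires buffering one raw frame, adding ~one raw-frame time of delay.

def interpolated_latency_ms(raw_fps: float, base_latency_ms: float) -> float:
    """Approximate total latency once one raw frame of delay is added."""
    return base_latency_ms + 1000.0 / raw_fps

# 40 raw fps (25 ms frame time) with a 25 ms baseline: latency doubles.
print(interpolated_latency_ms(40, 25))    # -> 50.0

# At a higher raw framerate the penalty shrinks:
print(interpolated_latency_ms(120, 25))   # -> ~33.3
```

Which is exactly why the tech is friendlier when boosting an already-high raw framerate than when rescuing a low one.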
Let's hope so. My point is that things like core counts don't tell much of a story, and kind of look underwhelming in comparison. Looking deeper, the 6800 XT is on par with the 3080 on Linux (and outperformed it in gaming), so go Team Red! -
There is obviously a one-frame input lag with Frame Generation, but this would likely only bother esports twitch-shooter players, who would already be enjoying a raw 400-500 fps on 4090 rigs and so wouldn't need any extra frame generation. A casual player's dream for sure, similar to G-Sync. -
The specs suggest "Navi 31" will compete robustly with the 4080, unless the much larger cache provides a substantial enough boost. -
Hope they manage to stay in business after the falling-out with Nvidia, and maybe come out with a PSU with native 12VHPWR, if not full ATX 3.0 support. Of course, according to Jensen, his pal the EVGA owner just wanted to close up shop - nothing to do with Nvidia specifically 🙄 -
Great, I'm keeping this PSU on my shortlist then 🙂
-
I mean, of course the PSU-side 8-pin PCIe ports could be capable of delivering 300W of power each - it's just that we don't really know, although they can evidently do ~230W. Was the 465W sustained? Could that have been within the tolerance margin on the 450W figure? Unless you hit 500W+ sustained, as people seem to do, something's a bit off (especially given Seasonic's 450W disclaimer). And again - it's trivial to check whether the Seasonic adapter is the issue; that's the first thing I would do.
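The rough arithmetic behind those numbers, for reference (all figures taken from this thread, purely illustrative):

```python
# Sanity-check the adapter-cable math: 465 W observed through a
# 2x 8-pin PCIe -> 12VHPWR adapter.

observed_total_w = 465
per_cable_w = observed_total_w / 2          # two PSU-side 8-pin feeds
print(per_cable_w)                          # 232.5 W per feed

# 465 W is only ~3.3% above the 450 W in Seasonic's disclaimer, while the
# nominal spec rating of an 8-pin PCIe connector is 150 W - so quality
# PSU-side sockets clearly deliver well beyond the nominal figure.
overshoot_pct = (observed_total_w - 450) / 450 * 100
print(round(overshoot_pct, 1))              # 3.3
```

So ~232W per 8-pin feed is well above the 150W nominal rating, but the 465W reading sits close enough to 450W that it could plausibly be within tolerance rather than proof of 500W+ capability.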
-
I don't understand this. With the 4x Octopus, you'd need 4 8-pin PCIe cables coming from the PSU. Those 4 cables would require 4 PSU-side sockets. There are no 1->2 splitters on the PSU side, just straight 1-to-1 cables: https://seasonic.com/pub/media/wysiwyg/feature-pics/PRIME-TX-1000-accessories-shadows.png The Seasonic 12VHPWR cable can be rated for 600W, but that does not mean the PSU is actually going to deliver 600W (as per their own disclaimer). It could simply be that the PSU is not delivering the power...
-
Worth a shot. I'm pretty sure I saw mainstream YT videos of people reaching 500W+, so if you can't do that, maybe the 2x 8-pin adapter cable is at fault. How about switching to the 4x Nvidia Octopus for comparison? With the Seasonic cable, can you specify a power target > 100%?
-
Have you tried any GPU rendering benchmarks yet? Maybe give IndigoBench a go.
-
Wow, looking at it from this angle, the 12VHPWR cable is quite thick (which has its advantages obviously... safety, hello) but not as easy to route as, say, the Corsair cable: https://www.corsair.com/uk/en/Categories/Products/Accessories-|-Parts/PC-Components/Power-Supplies/600W-PCIe-5-0-12VHPWR-Type-4-PSU-Power-Cable/p/CP-8920284 the BeQuiet one: https://www.bequiet.com/de/accessories/3959 or the one in the new Toughpower GF3 (the second cable from the bottom right, I believe):
-
I hate this remote Windows home directory business - it kind of works, but there are issues with profile synchronisation etc. What's wrong with cloud solutions? You can add your own encryption layer on top. Alternatively, you can host your own - e.g. Synology Cloudstation.
-
Well, exciting, although I would love to see some concrete numbers. To be fair, there could be a valid technical reason - it's somewhat unlikely anyone here is enough of an IC engineer to reasonably opine - however, what'd be really wicked is if the behaviour were poorly documented, leaving customers guessing as to which cooling system to opt for (or, more likely, completely oblivious). This could conceivably be done for marketing reasons - you wouldn't want to advertise a loss of performance at 50 or 60C if you are selling cards that always suffer that loss in practice. Yeah, I'm starting to wonder if a 3090 Ti wouldn't be a better idea (for my purposes). I'm more VRAM-bound and there is that NVLink. I can click and buy an FE right now, and if the thermals suck - WC it (albeit at an almost certain loss of warranty, presumably). -
Safe bet. Some youtubers have bragged about being busy testing. -
Well, I looked at the link - it clearly says the drop starts at 70C, which is more intuitive. Still, keeping GPU temps even under 70C on air can be a challenge and/or very noisy, so even if the threshold is at 70C, this doesn't detract from the appeal of water cooling in my view. Yes, but the fact that the cores get minimal use can be misleading and doesn't mean the GPU can't get moderately hot - this is due to the heavy memory-controller load during mining. Going back to the miners' advice: for them, keeping the GPU cool was an extra challenge if the space was overcrowded with devices and insufficiently ventilated. This is the equivalent of mining on a laptop 🤣 where keeping GPU temps <70C can indeed also be a challenge. I am not 100% sure we are talking about the same thing, but I do see low-temp throttling in practice as well. So annoying. When the GPU is lightly loaded, say <20%, the clocks dial down to sometimes as low as 50-60% of max. This is despite the max performance power mode being set in both Windows and the Nvidia control panel. If anyone is aware of a workaround, I would be grateful. As soon as the load goes up, so do the clocks (and consequently the temps). Nvidia, just because the load is 20% doesn't mean I'm OK with it being processed much slower to save power! Goofy indeed. BTW I tried the latest Nvidia driver 522.30. Definitely a keeper, up to +2% in *some* benchmarks! -
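Not a fix I can vouch for, but one workaround sometimes suggested for low-load downclocking is pinning the core clock range via nvidia-smi (requires admin rights; the clock values below are examples only - check your card's supported range first with `nvidia-smi -q -d SUPPORTED_CLOCKS`):

```shell
# Pin the GPU core clocks to a fixed range (example values, adjust per card)
nvidia-smi --lock-gpu-clocks=1800,1980

# Revert to default clock management
nvidia-smi --reset-gpu-clocks
```

Worth noting that locked clocks raise idle power and temperatures, so this trades efficiency for responsiveness.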
What a mess. Wonder if they fixed the 21H1 issue with the Capability Access Manager Service randomly waking up to consume massive amounts of CPU, or whether they added more bloatware on top...
-
That's quite unexpected - obviously not something one would be able to observe with air cooling. How much performance loss (%) are we talking about between, say, 40C and 65C? @tps3443 could you run a quick experiment to roughly quantify this (assuming that @Mr. Fox's temporary 3070 is not WCed)? You could just run a GPU benchmark of choice twice: with a 40C temp target and a 65C temp target (assuming you can control this via cooling only), everything else being equal. Good point about the memory temps; I guess the assumption is that memory has a higher temperature tolerance. The 3090 suffers from a specific issue - I don't remember exactly, but it is a combination of chip types and insufficient memory cooling due to the 2-sided card design. I recall that on the 4090 all mem chips are on the primary side of the PCB, and mem temps are much lower. That said, while Ethereum mining was memory-bandwidth bound, it would still load the cores to decent temps, so the recommendation was probably based on some empirical results in terms of the ratio of failed cards operating in certain conditions.