NotebookTalk

Etern4l

Everything posted by Etern4l

  1. Right. An alternative to going by the average is to follow the reviewers that seem to have had similar taste and preferences before. When I see 10/10 from PCW, arguably the most crooked reviewer, it doesn’t mean much to me personally.
  2. I wouldn't get worked up about it to the point of calling reviewers trolls. Major reviews (including the 7/10 ones) are written by people who have played through the entire game and review games routinely, if not for a living. They are critics who should possess a broad perspective, deep knowledge of the genre etc. Taking all this into account, there will inevitably be differences between individual opinions (especially if it's really just first impressions) and particular reviews. BTW if we don't trust critic reviews, we can peruse gamer opinions on Steam itself: "Mostly positive", which means a fair number of negative reviews, many pointing out the familiar issue with most contemporary Bethesda games: a weak main story. For you it may not matter, or you may actually like the story once you get to the end, so you will end up rating it higher for those reasons. Have fun and let us know the final verdict once you beat the game.
  3. Fair commentary on the absence of support for Intel GPUs, but, unsurprisingly, the PCW piece looks like sponsored anti-AMD garbage. Statements made without any reasonable source or data to suggest validity. How many distinct reports related to Intel CPUs, how many to Ryzens? How many Starfield players own Intel and Ryzen CPUs respectively? We don't know, and neither does PCW, who just took the FUD piece money. There is one laughable link to a tiny Reddit thread where one user at the bottom mentions they have a Ryzen CPU and some crashes, but nothing major. We don't know if his machine is OCed, crashes outside of Starfield etc. The game likely got rushed out (most games are) and some bugs may remain.
  4. Kind of a blatant strawman attempt by electro there, I’m disappointed you fell for it. Nobody is talking about government intervention there (yet).
  5. No, and I stated that explicitly earlier at least once before. In this particular case, the only way to achieve that balance without state intervention is for people to make strategic purchases in support of the underdogs, if possible. If you don't, you are adopting a myopic/greedy strategy which will reinforce the status quo. That's it. If not for people like @Raiderman, AMD's GPU business would have been dead long ago. As intended by whom exactly? What kind of idiots would intend for one company to gain basically 95% of the overall GPU market (compute + gaming)? It is an effective monopoly, and a market without meaningful competition is broken. I guess that's why you do have antitrust regulations, although they probably don't have enough teeth to grab NVidia yet, and what would they do anyway? Split the company? Controversial and brute force, but it could come to that. They don't have a strict monopoly in gaming, but they do have the lion's share (80% or so), and they do have an effective monopoly on the compute side. Yes, they have technically superior products, but they also have the most cash to spend on further R&D, they have patents, they have talent, and there are potentially deals with developers requiring them to favour their GPUs. I'm worried you have an insufficient grasp of how difficult it is to compete in this business, and how easy it is for the dominant player to thwart opponents; the semiconductor sector is particularly crazy in this respect. It's not as easy as saying "make better products, then we will buy them". It's not a market some 18th-century theorist could have remotely imagined. Well yes, so that's a very interesting topic; unfortunately those things rarely come to light. How many games support DLSS, how many support the AMD equivalent, and why is that exactly? I wonder what percentages of games support upscaling technologies from: 1. NVidia, 2. AMD, 3. both. What about deals with OEMs preventing or limiting their use of competing GPUs?
  6. That was never quite in question. Those presumably NVidia-sponsored influencers are just trying to misdirect us here. The underlying issue is the market dominance - we don't want NVidia or any other company to be in that position, or they will obviously abuse it. There is no point worrying about what AMD, never mind Intel, would do if they achieved a dominant position in the GPU market; they are nowhere near that point. Right now NVidia is way ahead in gaming (taking into account hardware and software), and enjoying an almost strict monopoly on the compute side. The optimal state does not involve another company taking their place, but more than one company offering comparable products in every segment - this would automatically optimise prices as well, unless they tried to form a cartel, which is a criminal offence / felony for any individual involved (unless they are in OPEC lol).
  7. Very interesting, so this CoWoS is not needed for gaming chips? Yep, that would explain why they are being offered to the market at all.
  8. I wouldn’t overestimate the impact of this. It’s just one game, and yes - an aggressive play by AMD, but I wonder if it won’t backfire, especially on Bethesda, who look like sucky devs here. It’s possible that they simply needed the money to finish the admittedly ambitious game. It’s an extremely tough industry, with people getting worked into the ground in year-long crunches. As for Team Red, I would argue that instead of spending money on Azor and his dodgy deals, they should have just put the money into their software division. Also, why is it the case that the game doesn’t work on Arc? Probably some driver issue Intel can address on their end.
  9. That’s just hush money for his stripper gfs/bfs. His shares are worth literally 1000x more. It’s obscene. Google, MS, Meta, Amazon and NVidia themselves are the primary players and culprits behind the push for AI. AMD and Intel are just playing catchup. Strategic thinking and pushback from the people is the only way to clamp down on this, although as we know this is in short supply, and the grand plan is to replace people's thinking with AI anyway.
  10. This Starfield GPU support has Azor's AMD deal written all over it. The support response from Bethesda will look familiar to any Alienware customer. It’s a 16 GB card, it should work. The correct reply would have been “Dear customer, sorry, Intel GPUs are not supported at this time, we are working on a patch”. Or maybe it’s just Bethesda being Bethesda through and through.
  11. It’s a bit different than Crysis with and without DLSS, because Starfield already uses FSR2. So the question is what the effect of using proper DLSS instead of FSR2 on NVidia would be. Power draw would likely go up, but would FPS? My guess is: probably, but not as much as going from pure rendering to DLSS. @cylix Yeah, Todd doesn’t impress me either. His presence at the top explains a lot.
  12. We can safely assume the game is as optimised as can be for AMD (whatever that involves, perhaps not much); however, we don't know whether it's optimised for NVidia, other than that it doesn't use DLSS. I suspect this could be key. FSR2 is designed to run on any GPU, therefore it probably does not utilise NVidia's tensor cores - hence the lower power draw, since that part of the GPU is inactive, and hence the lower framerate on NVidia, since now some standard CUDA cores have to be used for the upscaling etc. Just a theory, but a decent one if I may say so myself 🙂 If I'm right then we will see some speedup on NVidia cards if and when Bethesda implements DLSS. Now, I've heard that there are some unofficial DLSS patches/hacks, but we don't know how well those work vs a proper implementation. BTW this would also point to how power-hungry DLSS is, for no significant improvement in image quality over FSR2. We can see that with the very similar video upscaling technology - upscaling a 30 fps video takes up half of my 3090 Ti lol. Sells beefy GPUs though, another brilliant AI move from Jensen & Co. To be fair, it's worth noting that the 4090 is still the fastest card in Starfield at 4K, despite the possible handicap. The 5% boost from GDDR6X helps. I stress that it's just a theory. Maybe FSR2 does utilise tensor cores and it's just much more efficient. That would be quite embarrassing for NVidia.
  13. I guess the plan was to release the 4090, then follow up with a $2k+ Titan. Could be why they castrated the card out of its pro features. Then they realised it's not the crypto boom anymore, the market isn't there, and why release a Titan when you can rip prosumers right off with the A-series cards. Neither; both are a step down in terms of RAM vs your 3090. Maybe try to scope out a deal for a 7900 XTX Nitro+? Beats the 4080, 24GB of RAM, and of course you get that ultimate Starfield performance ;) (and who knows what else using the new/experimental HIP driver) https://www.techpowerup.com/review/sapphire-radeon-rx-7900-xtx-nitro/32.html "The Sapphire Radeon RX 7900 XTX Nitro+ comes with the largest factory overclock of all Navi 31 cards that we've tested so far. Sapphire was also very generous with their power limits. During overclocking, with increased limits we were able to max out the 525 W power delivery capability of the 3x 8-pin inputs." I really don't trust those Tom people, so I can only hope the review has been sponsored by NVidia to spread FUD. What they are saying is that the 7900 XTX drew some 350W, while the 4090 only needed 230W to achieve slightly less performance. If that's true then I fold lol Edit: other people report this effect as well, so it's likely not a conspiracy. The difference could be due to FSR2 being used instead of the much more compute-intensive DLSS. Sounds plausible. Still doesn't look that good for AMD in terms of W/frame. Doesn't look good for NVidia either, unless DLSS produces higher image quality than FSR2.
  14. The 24GB 7900 XTX should have cost $700 two years ago? I see wine has already been served at dinner, sir 😉 GeForce RTX 3080 Ti 12GB, Jun 3, 2021, $1199, GA102-225. Anyhoo, I get the message: Azor = no deal, regardless of specs/potential, driver improvements etc. It's like getting Mr Fox to give JB a second chance lol Edit: Personally I don’t care about Azor (he is just a gaming guy there), I would only worry about drivers/stability/features. The specs look awesome at that price point. HW reliability should be the same: everything comes from TSMC and Micron or Samsung + MSI.
  15. Slightly cheaper? Pulled some prices right now, cheapest card per GPU: MSI 7900 XTX £909, Zotac 4080 £1150 (+27%), Zotac 4090 £1550 (+70%). How much more do you want them to undercut NVidia? Of course, the Green Ripoff has a knock-on effect on pricing elsewhere - if NVidia cut their prices, AMD might as well. The only good thing about all this is that the elevated prices give AMD and Intel some breathing room.
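To sanity-check those percentages, here is a throwaway sketch that re-derives each card's premium over the cheapest one from the UK street prices quoted above (card names and prices taken from the post; rounding to whole percent):

```python
# Recompute each card's price premium over the cheapest listed card,
# using the UK street prices quoted in the post above.
prices = {"MSI 7900 XTX": 909, "Zotac 4080": 1150, "Zotac 4090": 1550}
base = prices["MSI 7900 XTX"]

for card, price in prices.items():
    premium = (price - base) / base * 100  # percent over the 7900 XTX
    print(f"{card}: £{price} ({premium:+.0f}%)")
```

Run as-is to double-check the maths whenever the street prices move.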
  16. I agree NVidia competitors should prioritise stability over performance, but let's just look at how ridiculous the complaint brought up by Papu is: I overclocked memory and the screen went black!!! Shock and horror!! Let's be reasonable, guys, and not act like blinkered fanboys/haters. We should differentiate between minor flaws (e.g. driver GUI not ideal, Azor hired into marketing - who cares, a new game crashes on Arc) and showstoppers. Let's wish Intel and AMD success in their struggle against NVidia and support them financially if possible. Given the magnitude of the challenge at the moment, they should be saluted for even trying. If they succeed, we will have a greater variety of excellent hardware available in a few years' time. If they don't, we will be stuck with NVidia, who will be free to charge arbitrarily for whatever scraps they have left over from the AI business. The trade-off is short-term inconvenience + some money left in the pocket in exchange for a long-term gain in the diversity, quality and pricing of GPUs available in the market.
  17. They may or may not. The odds are stacked against Intel and AMD in the GPU space, and they obviously need all the help they can get (conversely, NVidia neither needs nor deserves any support if a balanced market is the goal). Put another way, it's true that things may re-adjust on their own, but I wouldn't want to put a probability number on that, and it's also strictly true that by not helping NVidia, and ideally also helping their competition, we are increasing that probability. Wasn't it you who brought up that power-of-the-people Bud example? Obviously that's p*****, and where there is p***** there is money to help things along, e.g. via social media.
  18. Really cool, but, other than the aesthetics (which is reason enough), why? Mobo heatsink not enough? What SSD temps are you targeting?
  19. That's one type of capitalist behaviour, the most common for sure, but strictly myopic and thus often suboptimal. I hope it's not controversial that, from the customer's perspective, monopolies or huge market dominance such as that enjoyed by NVidia are quite undesirable. What we would want is for the industry to resemble, say, autos, with several competing companies vying for our business. That's immediately unrealistic for a number of reasons; however, at the very least we have 2-3 players which are strong candidates to fulfill this purpose. It immediately follows that any financial support for the current dominant company moves us in the direction opposite to the reasonable goal. We can rationalise this away however we like, but the above is an incontrovertible fact, whatever the short-term motivation. If people incorporated more of that sort of longer-term planning into their behaviour and prioritised their purchases accordingly at scale, this would result in very quick changes in the way NVidia operates (in this case). For a motivating example of a poor outcome resulting from myopic capitalist behaviour, we need look no further than the gigantic example of the PRC and how it came to be a global superpower challenging the West, as a direct result of capitalist greed and lack of foresight. Mind you, I'm not saying that there is a superior systemic alternative to capitalism, I'm just suggesting we can do way better within the system. Note there is no state intervention involved in the proposal, no new regulation, no institutions, just people adjusting the way they think. Food for thought. Starfield. Q.E.D. As you may know, I only need one example to disprove your theory of the 4090's underlying massive hardware superiority. The more I look at it, the clearer it is that a lot of the performance differences are explained by PC software issues (not just drivers: libraries used, optimisation etc.).
For an example of this, take a look at the following review excerpt: https://www.tomshardware.com/reviews/amd-radeon-rx-7900-xtx-and-xt-review-shooting-for-the-top/7 Good old Tom came up with a review which tested the new Blender engine and found the 4090 to perform far better. Job done, the consumer is ready to make his informed decision, right? Not necessarily. According to the blog posts below, you need the new experimental driver to enable massive performance improvements in Blender 3.1+ on AMD GPUs by bypassing OpenCL: https://code.blender.org/2021/11/cycles-x-project-update/ https://code.blender.org/2021/11/next-level-support-for-amd-gpus/ Did Tom use that driver? I can't be bothered to look for this detail, but almost surely not - they wouldn't want to test using experimental drivers by default anyway, and if they did, they would have clearly indicated this. We could easily verify those results if someone, ahem, bro @Raiderman, were willing to burn some time on the legwork: just run the same version of the Blender benchmark as Tom did, with the HIP driver.
  20. The drivers must be new; remember that this stuff didn’t support the latest consumer products a year ago. I guess they could have released them to the public as a beta. Clearly they are struggling in the PC software department to some extent, although the direction of travel is encouraging.
  21. ROCm (AMD's counter to CUDA) was in a sorry state when I looked at it a year ago; it didn't even support the then-current consumer cards (6900 at the time). Good to see they are making progress while maintaining competitive pricing across the entire product line - shocking as that may seem to us NVidia slaves, I know. More seriously though, both they and Intel face the same challenge: they must offer a software-gap discount while they are catching up, and they have to make progress on software with less cash - a bit of a chicken-and-egg situation. Luckily there is now also more software that leverages ROCm, as people are fed up with NVidia's grip on the industry and are being outspoken about it. I love the 2-slot FF BTW. If bro @Raiderman would be so kind as to run a couple of benchmarks with those drivers, such as AIDA64 GPGPU, and maybe Indigo and Blender, we could see if they help the card deliver compute performance that's closer to “theoretical performance”, and we could see if and how far things have improved thanks to the driver/software alone vs the launch review numbers, which frankly didn't correspond to the specs. Edit: now that I think about it, the odds are not great unless said benchmarks specifically utilise ROCm rather than OpenCL, but worth a quick shot I guess.
  22. Well, I’m not sure if Starfield is the only example (that would indeed be a bit suspect), but it seems that the 7900XTX is actually a solid competitor. I didn’t see Azor’s name anywhere near the specs BTW, I’m not sure he would even understand them. Innocent until proven guilty. Any concrete examples of AMD behaving unfairly towards the consumer? Look, NVidia creates proprietary DLSSes which don’t even work on previous gens of their own GPUs, AMD creates open alternatives which work on any GPU. As for the competitive pricing, we are actually seeing the same on the server CPU side, where Sapphire Rapids unfortunately represents terrible value vs EPYC purely from the hardware performance standpoint.
  23. Are you sure Azor is in any way involved in this? He is not a GPU designer. Anyway, ideally we would have more of an in-depth explanation as to why the theoretical performance figures are not reliable. We see that the 7900 XTX can perform extremely well in some cases. The question is whether there exists any bias in the body of results that point to the 4090's advantage. Were the tests equally well optimised for AMD and NVidia, to make any comparison meaningful? Clearly, any results based on any variant of DLSS would be invalid, unless compared with an alternative technology from AMD. We would need a heavyweight game engine dev insider to weigh in here, any of whom would be under many NDAs anyway, so I guess it's a bit of an open question. Having seen many NVidia logos in game intros over the years, I am worried those titles wouldn't perform particularly well on AMD HW, and vice versa. Again, it could be the case of people subsidising NVidia so they can pay devs to optimise stuff for their HW alone (and vice versa, although one might worry the $50B Goblin Chieftain would be hard to beat on bribes). If the above is not far from the truth, then we should be even more appreciative of Intel's efforts to enter the crooked market.
  24. Sure, we know there is a premium to pay; the problem is that NVidia raised prices across the board to create an illusion that purchasing the top-end card is necessary, as everything else seems like bad value. These and other unhealthy practices have hurt the entire PC market, which is already in a worrying state of decline. So Level 2 thinking here would be to avoid supporting a company which acts to the detriment of the entire community (if not humanity). Sure, people can be selfish and myopic and just pay the extra $400-500 over par, if that makes them feel as if they were driving a Ferrari among plebs in packed Camrys lol, but will this help move the world in the right direction? You probably meant to say "this is a free market". Well, it's not exactly, because NVidia is a pseudo-monopolist. They are in a position to rip people off by charging excessively for their products. They are doing so without regard for us, the enthusiasts, or the PC market, because Jensen has his head stuck high up in the clouds, hoping for AI-driven world dominance. It's only us, the consumers, who can bring him down a peg or two. People make their decisions based on a plethora of fuzzy factors: their knowledge, their CPU characteristics, the information in their possession, their habits, and crucially - their emotions. Arguably, what's often missing is long-term thinking. I mean, we know for example that people make decisions they later regret; buyer's remorse is a thing. But sometimes there is no remorse even if the consequences are bad - that's flawed human nature, we tend to love ourselves the most, sometimes in situations where actually helping others would be more beneficial. All of this can be far from optimal; it's impossible to argue that the decisions people take are always right and optimal for them.
The best we can do is think about our decisions and constantly try to improve our decision making, knowing full well it's necessarily flawed to some extent... If we look a few steps ahead, certain sets of criteria might be globally better than others. I don't want to go as far as invoking any of the striking examples of people's flawed criteria leading to various historical disasters; however, there are some future paths where supporting NVidia in any way could lead to very much unintended and undesirable consequences. Less drastically, and more immediately, NVidia clearly isn't helping our beloved PC world flourish. Well, words are cheap - I don't own any AMD GPUs either (harder constraints, unfortunately) - but the least we can do is avoid helping NVidia by hyping up their products. Yes, based on the theoretical performance numbers sourced from the links I posted earlier. I have used the techpowerup GPU specs page quite a bit when comparing GPUs - if there is a major flaw in the methodology, I would like to know. You can see for example that the 7900 XTX has a higher pixel rate (the GPU can render more pixels than NVidia's top dog), but a lower texture rate (NVidia can support more or higher-res textures, by about 25%). Compute specs are mostly in favour of the 7900 XTX, except for FP32, where the 4090 is in the lead. The differences either way can be fairly large, which suggests there are architectural differences between the GPUs which might also explain differences in game performance depending on the engine or content type. Based on the specs alone, the 4090 should be priced maybe within 10%, certainly not 60%, of the 7900 XTX. The 50% excess constitutes software and "inventory management" premiums, neither of which should really be applicable in a healthy market (to be fair, the software part of it could - to some extent - be AMD's own goal as per @Mr. Fox; I have yet to see bro @Raiderman jump in to the drivers' defence).
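As a toy illustration of that specs-vs-price argument, the ballpark ratios above (pixel rate slightly in the XTX's favour, texture rate roughly 25% in the 4090's, FP32 in the 4090's) can be folded into a single crude "hardware advantage" number and compared with the street-price gap. The ratio values below are hypothetical round figures standing in for the real TechPowerUp spec-sheet numbers, not exact data:

```python
from math import prod

# Rough spec ratios (RTX 4090 ÷ RX 7900 XTX), hypothetical round figures
# loosely matching the claims in the post - not exact TechPowerUp values.
ratios = {
    "pixel_rate": 0.93,    # 7900 XTX slightly ahead
    "texture_rate": 1.25,  # 4090 ahead by roughly 25%
    "fp32": 1.30,          # 4090 ahead in FP32 compute
}

# Geometric mean as a crude single-number "hardware advantage".
hw_advantage = prod(ratios.values()) ** (1 / len(ratios))

# Street-price ratio using the UK prices quoted earlier (£1550 vs £909).
price_ratio = 1550 / 909

print(f"spec-implied premium: {hw_advantage - 1:+.0%}")
print(f"actual price premium: {price_ratio - 1:+.0%}")
```

A geometric mean is of course a blunt instrument - weighting by your actual workload mix would change the answer - but the gap between a spec-implied premium in the mid-teens and a street premium around 70% is the point being made.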
  25. A story? It’s Bethesda lol. “You are a miner, and so you mine. You've mined so hard you find a mystic artefact. Congratulations, you are automatically admitted to a secret club of artefact seekers.” Do they have 12 y.o. interns writing this stuff? One review summed up this and all the other Bethesda games nicely: a journey a mile long and an inch deep.