NotebookTalk

Mr. Fox (Member) · Posts: 5,849 · Days Won: 635

Everything posted by Mr. Fox

  1. There's no way in hell that I would pay $70 for it. Maybe I will see if I can find something to like about it when the price drops to $15-$25. I think I can count on one hand how many of the hundreds of games I own that I think were worth what they sold for at launch. That approach generally works out really well for me. If it turns out that I think it sucks, I didn't waste a lot of money on it. If it turns out to be something I like, then I paid what it was actually worth, not what suckers paid for it when it was a new release. A lot of the games I own that I paid $20 or less for aren't even worth what I paid for them, and I regret wasting any money on them. (And most are very popular titles.) A good number of them were purchased only for the in-game benchmark, and I knew I had no interest in the game before I bought it for the benchmark.
  2. To be fair, I haven't used a "stock" NVIDIA driver in years. Before NVCleanstall I manually did my own mods, or used j95 mods, because I didn't want the garbage "features" mucking up my system, like Ansel, ShadowPlay, GeFarts Experience, automatic driver updates and whatnot. Part of what I loathe about the Adrenalin GUI is that it has a similar payload of filth and I don't know how to eradicate it. With the GeFarts software it is not all combined into a single interface, so eliminating the trash might be easier.
  3. I don't really have any complaints about the 6900 XT at a hardware level. It is a lot more powerful than the 3060 Ti it replaced (which I was very content with) and does an excellent job at some things. Software and driver issues are, indeed, fixable. But sometimes they never get fixed. Firmware, drivers and software made me hate my X570 setup. Driver bugs aside, in terms of software I felt Ryzen Master was a trashy and bloated-feeling GUI, and I don't care for Adrenalin's GUI for the same reasons. The layout is chaotic, somewhat illogical, and overall unintuitive, and it feels bloated.
  4. I haven't had my 4K monitor long enough to get used to it, and I haven't started liking it on the desktop. I hate using more than 100% scaling and text is smaller than I would like it to be on even a 27-inch screen. I think with more time the smaller text won't annoy me as much. But, yeah... gaming is like... wow... major improvement. Way better and more noticeable than I expected it would be. (If I am honest, my expectations were very low and that might be jading my impressions.) It is like the video rendered on my display has a more chromatic and less cartoonish quality than before. It was kind of trippy at first, but it didn't take long for me to get used to how much better games look.
  5. You're forgetting I rarely game and maybe didn't notice what I said about the 6900 XT (and have previously posted about it). Their drivers suck. The way they work (or don't work) sucks, and the GUI to manage them sucks. Blurry text, disappearing text while typing and DWM desktop rendering graphical glitches just trying to do my job. Nice, huh? Maybe if they burned as many calories on a $900 GPU as they do on a $500 console their PC drivers wouldn't suck. We might be giving them too much credit, though. We don't know that they actually produce the drivers for consoles. It might be something Micro$lop and Sony have taken ownership of to make sure it gets done right. I have never seen an example of good AMD drivers or software.
  6. To be clear, AMD is their own worst enemy. A lot, maybe all, of their shortcomings are self-inflicted damage. AMD continually hurts itself with lousy drivers. They have never been good at it... for decades now. Crappy drivers keep me from enjoying the 6900 XT to the extent I would be able to with good drivers. Just simply using it for work, where performance really doesn't matter, it delivers an inferior experience due to driver crap. Deja vu for me. Their drivers sucked when I gave them their last chance (2012) and they still do. They need to get a clue. Had they been smart, they would have used GDDR6X. Opting for the slower/cheaper GDDR6 hurt them in the GPU war. Whether it was a mistake that reflects poor judgment or they actually wanted to help keep the price down by using cheaper memory matters not. They slit their own throat opting for cheaper.
  7. They don't know any better, and because they don't, they simply believe what they are told. Or, they haven't paid close enough attention to notice any difference. When people use something until it gets too old and slow, they get excited that their new system seems better. It might not be; a simple part upgrade or two and a fresh OS install might have done as well. But they don't measure the difference. They rely on what they were told and how it feels compared to the old system.
  8. "Newer is always better," they say. In reality, newer is often only better in the minds of those who stand to benefit from the use or sale of the newer product. That is the case with Windows.
  9. That simply isn't true. If you optimize your system and disable or delete all of the unnecessary services and background filth that is running and doesn't need to be, you will immediately notice a difference. You'll be able to see it and feel it. Even when you disable everything in Windows 10 and 11 that is unnecessary garbage, it's still slower than Windows 7 due to wasteful overhead that steals CPU clock cycles. It's actually very simple and boils down to a matter of operating system overhead and wasteful use of resources. The latest versions of Windows 10/11 behave in a way similar to a computer that has been infected by malware or is running buggy software with a memory leak. When you're used to it, you likely won't notice; the degradation has taken place gradually over time due to the incremental addition of garbage. The difference is the equivalent of opening 10, 15 or 20 programs that are not being used and expecting system performance to be the same as if they were not running.
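The point above — background processes stealing CPU cycles from the work you actually care about — is easy to demonstrate for yourself. Here's a toy Python sketch (not Windows-specific; the thread count and workload size are arbitrary illustrations) that times a fixed chunk of work with and without background "bloat" threads spinning:

```python
import threading
import time

def busy_work(n):
    # Fixed amount of CPU-bound work to time.
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(n):
    start = time.perf_counter()
    busy_work(n)
    return time.perf_counter() - start

# Baseline: time the work with no background load.
baseline = timed(2_000_000)

# Start background threads that just burn CPU, like idle bloatware.
stop = threading.Event()

def background_bloat():
    x = 0
    while not stop.is_set():
        x += 1

threads = [threading.Thread(target=background_bloat) for _ in range(8)]
for t in threads:
    t.start()

# Same work again, now competing with the background threads.
loaded = timed(2_000_000)

stop.set()
for t in threads:
    t.join()

print(f"baseline: {baseline:.3f}s  with background load: {loaded:.3f}s")
```

The loaded run comes out measurably slower than the baseline even though the foreground work is identical — which is exactly the "10, 15 or 20 programs you aren't using" effect, just compressed into a few seconds.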
  10. All you have to do is measure it. The biggest hit is on something that requires a swift reaction from the CPU, such as wPrime or a 3DMark Physics test. Huge hit on performance due to lag that you can actually see and feel. I posted side-by-side comparisons more than once and it's true. Numbers don't lie, and neither does the seat of your pants. That's probably why the most recent few releases in the 3DMark suite don't even have a CPU test anymore. It would raise too many questions about why CPU performance has gone down and continues to decline as more and more performance-hindering updates occur. If all you're doing is web browsing and poking around with an office productivity application, or anything like that, you probably won't notice, because those tasks don't require a swift, low-latency reaction and you're not trying to figure out why performance has taken a nose dive. And people that limit the use of their computer in that way are generally not performance enthusiasts.
  11. I can't speak for anyone else, but I'm willing to go there if it means giving the support to Intel. And, who knows if 3090/4080 level is where its performance ends. Look how far ARC has come just with drivers... massive performance increases... Plus, if they are successful, who knows how Battlemage Gen2 is going to look. Might get pretty interesting. And, if we are honest, there is nothing wrong with 3090 or 4080 performance. It's not 4090-level, but it's also not nearly as severely overpriced. But, some of us get hung up on feeling like having the top performance GPU is imperative. I am starting to rethink that. I hated how much I paid for the 3090 KPE and hate even more how much I paid for my 4090. Big performance, little value... not necessary. Yeah, that would not surprise me in the least. NVIDIA is unmatched in their ability to misrepresent things for secondary gain. They are masters in the art of deception.
  12. Yes, in no small part, that is one of the reasons I am looking forward to seeing how the Intel Battlemage GPU turns out. I don't like supporting the Green god of Greed and its goblin minions, but I also can't see myself being interested in buying an AMD GPU at this point. It wouldn't surprise me if there are a lot of people thinking this way besides me. One of the things I like most about Intel is their attention is not distracted from PC by consoles like AMD, nor are they distracted by the things NVIDIA is/has been distracted by (crypto, AI, medical science technologies, etc.).
  13. Check out this free GPU benchmark on Steam. It has pretty cool visuals and audio. It pulled up to about 480W from my 4090. There are two different scenes, and you can choose no DLSS or the range of DLSS settings. https://store.steampowered.com/app/770170/EzBench_Benchmark/ https://www.ezbench.gg/
  14. I think if we began making a list of all the things that suck because of consoles it would be a long one. When you start lowering standards and accepting compromise it has a way of spreading and dumbing down other things. We start allowing caveats and making exceptions. Accepting mediocrity and tolerating less becomes too easy. Today's turdbooks are one of the most popular symptoms of the bar being set too low.
  15. Maybe for some folks. Not for me. Not my cup of tea. I think computers would suck less if there were no consoles lowering the bar. But, I don't hate them as much as a handheld POS like the Steam Deck. That is about as low as the bar can go. It's in a hole on the ground, LOL. A joke like that makes a sucky turdbook seem like something special.
  16. I can't see myself ever gaming on a console. I barely make any time for gaming on a PC and if I had to use a console and controller I would probably never play any games. The thought of it disgusts me. But, your point isn't lost. If you don't count the sloppy crap, the minimum system requirements of most games allow the use of weak and antiquated components.
  17. ^^^ this ^^^ maybe the idea follows the logic that it would be difficult to find any purpose for those living in a post-apocalyptic wasteland 😜
  18. Unfortunately, I think that represents a sizable portion of the earth's population... gluttonous lust of the eyes, lust of the flesh and pride in possessions. Truth is stranger than fiction, but in the end it's all gonna burn.
  19. I found the TV series equally slow and boring. I forced myself to watch the first 3 or 4 episodes and just couldn't get into it. My wife likes it. I find it difficult to not fall asleep watching them. One of the things I love about the Doom games is the nonstop action. There is no opportunity to slow down or stop to think about doing anything other than slaughtering demons to avoid being slaughtered. 10/10 for the adrenaline overdose. I had to edit my previous post as I somehow forgot to mention two extremely important franchises.
  20. I did. Thank you. If it's what I think you're referring to I replied in that thread. While my preference is first person shooters, I have enjoyed a few other genres as long as they don't get too bogged down with collecting garbage and crafting trash, or interacting with AI to reveal clues and similar nonsense that bores me to death. Bingo. The Green God of Greed has made participation prohibitively costly and unnecessarily complex. Amen. I'm not sure why but I found your comment hilarious and it perfectly captured the essence of what I strongly dislike about Witcher games and titles with a similar style of gameplay (i.e. Skyrim). I have literally fallen asleep at the keyboard trying to force myself to play them. A really good action game will leave me with a bad case of jitters from being amped up on adrenaline for an extended amount of time. I love when that happens.
  21. Well said. It doesn't even need to have a lot of tactical thinking to be fun if the gameplay is not as slow as molasses. If the game moves at a frenetic pace it is harder to get bored and lose interest due to slowness and monotony.
  22. I really do think that part of it is just crappy work on the part of the game developers. Sloppy coding doesn't run smoothly and requires stronger hardware to minimize the impact of the half-assed products.
  23. I have never, ever been a console gamer, always only PC. I hate gamepads and joysticks and only like keyboard and mouse. I think it's a matter of personal preference and I'm much the same. I have several of the Fallout games and find them too boring. My sons love the Fallout franchise. I have tried super hard to discover what they like about it and can't. Same applies to the Skyrim, Witcher and GTA franchises (which my sons also love). I've tried hard and forced myself to play more than an hour of each release, and it's like watching paint dry to me. My gaming interest is concentrated on AAA FPS titles along the lines of the Crysis, CoD, Unreal Tournament, Gears of War, Wolfenstein, Quake and Doom franchises, and anything similar. Oddly enough, I do like Elder Scrolls. Vermintide is acceptable, too. I also like God of War.
  24. Did you try bumping up the memory voltage? XMP profiles are often not stable because their default voltage is too low and if you tighten some of the timings things will come unraveled unless you increase the voltage. I would start at 1.500V and see if you can dial in settings you want. Once you do, slowly start decreasing the voltage until it begins acting up, then go back up about 0.020-0.025V.
