Everything posted by Etern4l

  1. Turns out they have enough customers who are not as knowledgeable as you, or who don't mind just dropping $2k on a graphics card. Remember that the desktop PC market itself is in decline. If you are one of the very few gamers buying or building a PC today, are you really that worried about the weak offering in the <$500 segment when you have already spent $2k on other components? Probably not, or you would have just gone with an Xbox or PS5 instead. They've got the pricing in hand bro, don't worry about it. The only thing that would help is some massive boycott, which is not going to happen (yet). Every customer who just says "* it, nothing under $500 or $1k" and drops $2k on a 4090 is a massive win for NVidia, as surely that's the highest-margin card. In practice, they might have been OK with a lesser model, but the prices are constructed in such a way that the 4090 trap is just too easy to fall into, and there are games to boot, or just DLCs etc., to help people rationalize the spend. If NVidia came up with a gaming GPU 4x faster than the 4090 tomorrow, there would be a CP2077 DLC within a month or two with boosted effects, textures, whatever, to gobble it all up, or people would simply start talking about 8K gaming. It's brilliant and endless.
  2. Perhaps, but then they would make less profit per unit. Rest assured they use sophisticated AI pricing models fed by quite accurate information about the market. They will squeeze the last cent out optimally. People are completely powerless in the face of this.
  3. Why would they do that? They don't care. Microsoft and others need more GPUs for the AI death race than NVidia/TSMC can manufacture. Forget about crypto, that's peanuts in comparison - the only difference is that no retail scalpers are directly involved on that side, because the AI war is not for the little man to partake in on the supply side. Yes, the gaming revenue is collapsing, and that's despite their auto-scalping revenue-boosting efforts - if they cut pricing, it would collapse further. If you are a typical blinkered gamer, you want DLSS3 and therefore you must have an Nvidia GPU. Their way or the AMD/Arc poor man's GPU highway, simple as that. The market (if not the world itself) is broken.
  4. Look, Nvidia cornered the market. They are a monopolist, their pricing reflects that position, and AI runs on GPUs. The range starts with the 4060, which is not very generous, and there isn't really anything remotely compelling until the $500 4060 Ti 16GB; then again there is nothing really gripping until the 4090, except perhaps the 24GB 3090 Ti for professional purposes (not really available anymore). Just a linear range with intermediate compromise options:

         3060           12GB GDDR6,  12 TFLOPS, $350
         4060            8GB GDDR6,  15 TFLOPS, $350
         4060 Ti         8GB GDDR6,  22 TFLOPS, $400
         4060 Ti        16GB GDDR6,  22 TFLOPS, $500
         4070 Ti        12GB GDDR6X, 40 TFLOPS, $800
         3090 Ti        24GB GDDR6X, 40 TFLOPS, $1100
         4080           16GB GDDR6X, 48 TFLOPS, $1200
         4090           24GB GDDR6X, 82 TFLOPS, $1600

     There is some consistency in that, e.g. comparing the 4060 and the 4090: for 4.6x the price you get 3x the VRAM capacity, GDDR6X with almost 4x the VRAM bandwidth, and 5.5x the compute performance (the ratios are sanity-checked in the quick sketch at the end of this post). They just want people to pay proportionally for more oomph. They are ruthlessly precise, as expected given their leadership in AI. Sign of the times, folks. Hooray, what a time to be alive etc lol. The question is whether the price of the 4090 is justified. At this point the regulators would need to take a look (given we are talking about a de facto monopolist), but given the pricing of GPUs in the DC market, my guess is that it actually is, unfortunately. It will get worse once demand for GPU power increases, with Microsoft sticking AI into the next version of Windows 11, Office 365 etc.

     Not talking about trial and error guides, obviously, as that's nonsensical: there is no immediate feedback. The only trial and error you can do is fit the frame, wait a few months, and check whether you destroyed the super-delicate socket, or possibly damaged the CPU solder, due to lateral force/stress applied over an extended time. If you change CPUs/mobos regularly like some people here, the risk is obviously lower. Neither derRoman's nor the cheapo Thermaltake LGA1700 contact frame comes with an exact Nm recommendation, so your torque screwdriver would be of limited use. My guess is that this is mobo- or ILM-variant dependent, so they cannot provide it; only the manufacturer knows the exact spec, and they are not going to publish it because they don't want people removing the ILMs. I guess you could try to reverse engineer the rough torque while undoing the screws on the ILM, but it would be an approximate effort. Seeing how marvelously the ILM performs on my replacement mobo, I firmly believe that a contact frame is an utter gimmick and a waste of time (*U to GN for peddling this, although to be fair their fitting videos were full of subtle warnings I enthusiastically chose to ignore). The only thing I did with the new mobo was take a moment to make sure the ILM lever (which does have a little play) was centered before locking. Again, if someone does use a contact frame, an error is unavoidable, as they don't know the exact torque spec.
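     For what it's worth, here is the quick sanity check of those ratios mentioned above. It's just a rough sketch in Python over the prices, VRAM sizes and TFLOPS figures listed in this post (bandwidth is left out because the list doesn't include it):

         # Sanity check of the ratios quoted above, using only the figures
         # listed in this post: (VRAM GB, TFLOPS, price USD)
         cards = {
             "3060":         (12, 12, 350),
             "4060":         (8,  15, 350),
             "4060 Ti 8GB":  (8,  22, 400),
             "4060 Ti 16GB": (16, 22, 500),
             "4070 Ti":      (12, 40, 800),
             "3090 Ti":      (24, 40, 1100),
             "4080":         (16, 48, 1200),
             "4090":         (24, 82, 1600),
         }

         # Compute per dollar across the range
         for name, (_, tflops, price) in cards.items():
             print(f"{name:>13}: {1000 * tflops / price:5.1f} TFLOPS per $1000")

         # 4090 vs 4060, the comparison made above
         c4090, c4060 = cards["4090"], cards["4060"]
         print(f"price ratio:   {c4090[2] / c4060[2]:.1f}x")   # ~4.6x
         print(f"VRAM ratio:    {c4090[0] / c4060[0]:.1f}x")   # 3.0x
         print(f"compute ratio: {c4090[1] / c4060[1]:.1f}x")   # ~5.5x

     The last three lines reproduce the 4.6x / 3x / 5.5x figures quoted above; the per-dollar line shows that going cheaper buys you little or no extra compute per dollar, which is the "pay proportionally for more oomph" point.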
  5. I would stress the distinction between an avoidable human error and an inevitable one. If I give you a precision torque screwdriver and advise you to screw in the contact frame uniformly at 4x0.1 Nm, and you end up with 0.3, 0.2, 0.05, 0.15 - that's probably an avoidable human error, depending on the precision of the tool. If I ask you to just use your hand armed with a wrench and apply 4x X torque, you are dead in the water. You have virtually no chance of getting it right, because you don't really know the actual value of X in the first place. You might think you know, or have a good idea, but you can't be sure. Engaging in that task at all inevitably leads to an error, but of course one can actually prevent it by taking a step back and not playing the silly game in the first place. That requires some insight into the possible consequences though, and these include a complete waste of time and money.
  6. Well, right, a CPU ILM is not really meant to be fitted by humans in the first place (we don't have 4 hands with precision torque-controlled screwdrivers), so the first human error here is falling for silly Internet marketing which advertises purported benefits of the device (in the case of GN, in a ludicrous manner in retrospect), and consequently the very removal of a factory-calibrated ILM, followed by its replacement with a gimmick that comes with no torque specification, so there is no way to fit it properly. You can get lucky, or you can ruin your mobo. Edit: BTW Jay offered a more honest review ("no difference, whatevs"). I have more and more respect for that guy now, after he pulled the shilly 4060 Ti review. A rare breed, hope he recovers soon.
  7. I totally called this. Very suspicious of any gimmicks like this, including contact frames etc., after the bending fiasco. Shouldn't need any of those things; products ought to work properly as delivered by the OEMs.
  8. Did you watch the Tesla event video? He said he ultimately expects one or two of those robots per human inhabitant of the planet. Billions. Obviously part of it is marketing BS aimed at getting dumb money to part with their cash, but he wasn't obviously joking.
  9. He already called them Optimus or something. He has been on a cunning campaign of AI safety virtue signalling for dummies, so surely will continue to avoid any obvious PR issues as he continues on his Iron Man-esque quest for world domination.
  10. Plus this thin connector might incur some unnecessary energy loss, as thinner wiring could mean materially larger resistance. Does the connector get warm under a 400W load?
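      A rough back-of-the-envelope check of that, assuming the connector's 12 V rail and six current-carrying pin pairs; the per-path resistance below is purely an illustrative assumption, not a measured value:

          # Back-of-the-envelope resistive loss in the cable/connector at 400 W.
          # 12 V and 6 current-carrying pin pairs as per 12VHPWR; the per-path
          # round-trip resistance is an assumed, illustrative figure.
          P_load = 400.0      # W drawn by the GPU
          V = 12.0            # V on the power pins
          n_paths = 6         # current-carrying pin pairs
          R_path = 0.010      # ohm per path, round trip (assumption)

          I_total = P_load / V                        # ~33 A in total
          I_per_path = I_total / n_paths              # ~5.6 A per pin pair
          P_loss = n_paths * I_per_path**2 * R_path   # sum of I^2 * R losses

          print(f"Total current: {I_total:.1f} A ({I_per_path:.1f} A per pin pair)")
          print(f"Dissipated in cable/connector: {P_loss:.1f} W")

      With those assumed numbers you get roughly 2 W of heat in a small connector and its wiring, and the loss scales with resistance, so thinner conductors would indeed run warmer.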
  11. From 11:00 - Elon is working on AGI and ultimately building 16 billion Optimus humanoid robots. What could go wrong... https://youtu.be/D_8WShgZLQM
  12. It won't be long before Nvidia announces that merely plugging the 12VHPWR connector in voids warranty ;)
  13. All I'm seeing so far is bizarre news like this: https://videocardz.com/newz/intel-confirms-adamantine-l4-cache-for-meteor-lake Basically they state that L4 cache will be faster than L3 cache (which would be a nonsensical design), but then they also say it won't really be a general-purpose cache: "As explained by Intel, the main purpose of L4 cache is to improve boot optimization and increase security around the host CPU. Furthermore, the L4 cache would preserve the cache at reset, leading to improved loading times that would otherwise have to go through all boot/reset cycles." Not clear what the fuss is about; definitely can't wait to have my load and boot times optimised that way lol. Probably just noise written by AI.
  14. Not clear L4 cache is worth the die space, in light of current DDR5 speeds. Is there any specific communication from Intel that this is their plan for Meteor Lake, or just online speculation? Anyway, I hope you are wrong about Intel's future, since you make it sound like they are going down the toilet soon, which nobody - not even AMD and Apple fanboys - should want, given the benefits of competition in the CPU space.
  15. Those Meteor Lake leaks make little sense and look like clickbait. Why would Intel cut the number of cores when transitioning from a 10nm to a 7nm process?
  16. We have a "limited free will". We can attempt to choose among the ideas/potential actions that "come to mind", i.e. what the subconscious part of the brain suggests to us. The key question is: what does come to one's mind and why? This is most likely based on the more recent input, now controlled by big tech to a dangerous degree, but then what we actually do is heavily dependent on our fairly primitive reward-motivation pathway, which also gets hacked and hijacked by technology. All of that worked great back on the savanna 100,000 years ago, not so great today, in the era of AI-controlled, and soon AI-generated, (anti-)"social" media.
  17. @ryan "How" is a fair question. The problem is that there is a vast multitude of ways by which AI could damage or destroy humanity. I think the best we can do right now is enumerate as many as we can, and work as hard as we can on preventing them from happening. The bottom line is that once you have a super-intelligent entities making decisions for you, all bets are off. It need not even be sentient to be extremely harmful and dangerous, and yes - the ability to predict the future better than we can will be at the core of AI's power. Here is another clip from an interview with Yuval Noah Harari, a historian, futurist and renowned author (recommend watching the whole thing):
  18. An interview with a renowned historian, futurist and author (rather than a Joe Schmoe from YouTube). Worth watching the whole thing with an open mind IMHO; however, I picked one starting point for those pressed for time:
  19. Poll: 61% of Americans say AI threatens humanity’s future Only 61%? Well, it's a start, and enough to get AI in the US under proper control.
  20. OK, got you. Well, the UK setup looks more sensible then. It's a non-governmental organisation (Ofcom), it really is independent from the government, and their scope is fairly limited. I don't think they regulate user-generated content. They would stop TV news channels from spreading misinformation though.
  21. Not necessarily a problem - some oversight can be helpful, especially in the era of AI generated fake news. Depends on what the bill actually says. What caught your eye specifically?
  22. BT to cut 55,000 jobs with up to a fifth replaced by AI Here we go.. BT is a British telecoms giant, basically a monopolist.
  23. Reminded me of the brilliant HBO miniseries "Chernobyl" (the trailer kind of sucks in comparison lol)
  24. I've used split keyboards since the Microsoft Natural Keyboard came out, mostly for personal use. I have absolutely no issues using standard keyboards BTW. I also have the original Kinesis with mechanical switches, although they are too soft for me. The nice thing about the Kinesis Freestyle is the adjustability in terms of width and angle. The switches are membrane but very pleasant.