NotebookTalk

MyPC8MyBrain


Everything posted by MyPC8MyBrain

  1. OpenAI will go down in the history pages in the very near future, despite the arrogance displayed and passed off as false confidence. Similar historical events teach us important lessons if we choose to use the data when evaluating the current landscape. The good news is they will not disappear completely; case in point (https://myspace.com). NVIDIA is also about to be put in its place imho (https://notebooktalk.net/topic/2901-the-madness-has-begun/page/2/#comment-64434)
  2. Thinking out loud, the AI buildup toward 2025 feels like it triggered a delayed realization at the top of the major CPU vendors like Intel, AMD, and ARM. For decades, Intel, AMD, ARM, and others built their market share through platform control. OEM relationships, ecosystem lock-in, and incremental architectural leverage made the CPU the anchor of the system. Then, almost overnight, they found themselves paying billions to accelerate someone else’s technology, reshaping entire systems around it, and effectively handing over slices of the very market they spent decades carving out. That is a reversal of logic. Instead of defending the platform, they subsidized the erosion of their own leverage. Value migrated away from the CPU and toward an external accelerator that began defining how systems are built, priced, and scaled. At some point, executives notice when they are funding their own displacement. What seems to be happening now is a quiet correction. Rather than continuing to pour capital into accelerating a smaller player’s dominance, money appears to be flowing back inward. Into their own roadmaps. Their own architectures. Their own standards and system-level control. This is not something you see announced in press releases. You see it in behavior, hesitation, and shifting priorities. You also see it in the market. Intel and AMD are no longer acting like companies resigned to becoming feeder components. At the same time, NVIDIA’s stock is not following the straight-line hype trajectory you would expect if total platform capture were inevitable. That divergence suggests reassessment behind the scenes, not universal buy-in. If the AI story were as simple as the headlines claim, capital flow would be one-directional. It is not. This does not mean NVIDIA goes away. It means the assumption that the entire industry will continue financing its expansion indefinitely is likely wrong. Platform owners eventually defend platforms. The AI boom did not just create demand. It exposed a power imbalance that incumbents tolerated for too long. What we are likely seeing now is the early phase of that imbalance being corrected quietly, deliberately, and before it becomes irreversible. The solution is simple, but overlooked: eliminate the silo. Unified, coherent memory allows CPUs and accelerators to operate on the same data without costly copies. Once that bottleneck is gone, the perceived GPU advantage evaporates. CPUs retain their performance, and accelerators return to their intended role: augmenting the platform, not defining it.
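     To put the memory point in concrete terms, here is a minimal sketch, assuming a CUDA-capable machine, a placeholder kernel, and illustrative sizes, of the difference between the siloed copy model and a unified allocation:

     #include <cuda_runtime.h>
     #include <cstdio>
     #include <vector>

     // Trivial placeholder kernel: scales a buffer in place.
     __global__ void scale(float* data, int n, float factor) {
         int i = blockIdx.x * blockDim.x + threadIdx.x;
         if (i < n) data[i] *= factor;
     }

     int main() {
         const int n = 1 << 20;
         const size_t bytes = n * sizeof(float);

         // Siloed model: separate host and device copies, data shuttled across the bus.
         std::vector<float> host(n, 1.0f);
         float* dev = nullptr;
         cudaMalloc(&dev, bytes);
         cudaMemcpy(dev, host.data(), bytes, cudaMemcpyHostToDevice);  // copy in
         scale<<<(n + 255) / 256, 256>>>(dev, n, 2.0f);
         cudaMemcpy(host.data(), dev, bytes, cudaMemcpyDeviceToHost);  // copy out
         cudaFree(dev);

         // Unified model: one allocation visible to both CPU and GPU, no explicit copies.
         float* shared = nullptr;
         cudaMallocManaged(&shared, bytes);
         for (int i = 0; i < n; ++i) shared[i] = 1.0f;      // CPU writes directly
         scale<<<(n + 255) / 256, 256>>>(shared, n, 2.0f);  // GPU works on the same buffer
         cudaDeviceSynchronize();
         printf("shared[0] = %f\n", shared[0]);             // CPU reads the result directly
         cudaFree(shared);
         return 0;
     }

     In the first half every round trip pays for two transfers over the bus; in the second the CPU and GPU touch the same allocation. On today's PCIe-attached GPUs the runtime still migrates pages behind the scenes to make that work, which is exactly the silo that hardware-level coherent memory would remove.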
  3. I think the issue is deeper than AI specifically. When you enter a building, do you take the stairs or the elevator by default? When you travel somewhere unfamiliar, do you use a paper map or Google Maps? When you need to remember a phone number or an address, do you commit it to memory or store it on your phone? No matter how we frame it, our cognitive and physical habits have been eroding for decades. It happens gradually and mostly without us noticing. I grew up in a time when elevators were rare. You took the stairs because there was no alternative. When you needed a phone number or an address, you used the yellow pages or asked around. Your brain was the address book. I carried hundreds of numbers and locations in my head and could recall them instantly. Navigation worked the same way. You learned your surroundings by observing them. You placed mental markers along the way. Physical maps mattered. An atlas was a normal household item. Many kids today have never even seen one. AI is not the beginning of this shift. It is just the latest step in a long process where convenience replaces effort and external tools quietly take over functions we used to exercise ourselves. The problem is not using technology. The problem is forgetting what we give up when we stop exercising those abilities altogether.
  4. The erosion of our brains started a long time ago, when PDAs and smartphones first arrived on the scene over two decades ago.
  5. As they should be. Substrate Lithography, next-generation semiconductor foundry - https://substrate.com/
  6. It's much simpler to answer this question from the other direction: when was the last time Microsoft did something right? The answer to that is simple: not since XP. Everything since has been one ongoing lame attempt after another at converting PCs, as in personal computers, into public money-grabbing kiosks.
  7. No, AI is not what is causing the damage (Jensen). What is causing the damage is deliberate market manipulation that was planned years in advance (Jensen). Decisions like removing NVLink from workstation-class hardware (Jensen) were not technical necessities. They were strategic moves designed to ensure the hardware could only be used properly in the specific ways that extract the most value from customers (Jensen). That is not AI hurting society. That is vendors engineering artificial constraints to maximize rent extraction. Blaming AI is a convenient deflection (Jensen). The damage comes from choices made by people who were trusted with the keys to the ecosystem (Jensen). Over two decades, that trust was built carefully. The moment leverage became absolute, it was abused (Jensen). This is no longer about innovation. It is about control, segmentation, and rent-seeking behavior disguised as progress. History is unforgiving to this pattern. Companies at the peak of perceived indispensability (Jensen) convince themselves the fall cannot happen. It always does. The bigger the pedestal, the harder the landing. Today’s industry heroes (NVIDIA) have a habit of becoming tomorrow’s cautionary tales. AI is just the excuse. The real problem is greed dressed up as inevitability, Mr. Huang.
  8. CES is over, and look at what actually showed up. One low-tier new Intel CPU. One mildly boosted AMD SKU. No new GPUs. No real platform shifts. No meaningful architectural progress. For an industry supposedly in the middle of the biggest computing revolution in decades, this was a staggeringly thin showing. If this were a classroom, you would not call it cutting edge. You would call it remedial at best. Lots of repetition, very little advancement, and an uncomfortable sense that everyone is stalling for time. That is not because innovation suddenly got harder. It is because the market is locked into monetizing what already exists. When margins come from scarcity, price control, and segmentation, there is no urgency to move fast. So instead of bold platforms, we get SKU shuffling. Instead of new architectures, we get minor frequency bumps. Instead of GPUs, we get silence. This is what an industry looks like when it is more focused on managing demand than pushing capability. What makes this year stand out from previous years is not that the products were underwhelming. It is that there was no clear sense of direction. Even going back to the 1980s and early 1990s, CES consistently showed forward motion. The hardware was primitive, but the trajectory was visible. New buses, new form factors, new categories, and clear signals about where platforms were heading next. This year did not feel like that. It felt like a strategic pause, an industry more focused on managing inventory, pricing, and segmentation than on moving the platform forward. That is not how CES has historically behaved, even in slower decades. There may have been weaker products in the past, but there has rarely been a show with so little visible intent to move computing forward. And that ties directly back to everything above. When markets are optimized for monetization over progress, innovation slows. Not because it is impossible, but because standing still has become more profitable than moving ahead.
  9. I’m still skeptical. It remains to be seen how close they’ll actually get to “back to their roots.” So far, the messaging only references XPS and, implicitly, Precision via the new naming convention. Both lines have earned multi-generation followings, but after the recent roller coaster, there’s real ground to make up to regain trust. And that trust isn’t tied to a name. It’s tied to a philosophy, one they’ve been slowly eroding over the past few years.
  10. They could have just used Pro for the Latitude equivalent and left Precision as Precision for the tier above, as it already was. So essentially the logical rename would have been Latitude to Pro, if a name change was so necessary. What is the entry-level Inspiron equivalent named now?
  11. Precision = Pro; calling it "Pro Precision" is redundant, especially since there's no non-Pro Precision model. Cup half full: it's better late than never. At least they came to their senses before they completely blew it.
  12. There is another consequence coming out of this landscape that few vendors seem to be accounting for. This pricing behavior fundamentally changes upgrade psychology. For decades, the PC and workstation market relied on a simple pattern. CPUs advanced, prices were tolerable, and users chased the next generation because the platform cost made sense. Memory was an afterthought, not a gating factor. That is no longer true. When customers are forced to buy systems loaded with 64GB, 96GB, or 128GB of memory just to remain functional, that memory stops being expendable. It becomes the anchor. The sunk cost people protect. The result is predictable. Instead of chasing a new CPU with marginal gains, users will sit on older systems that are already fully populated with expensive memory. Upgrading to a new platform now means rebuying RAM at inflated prices for single-digit performance improvements. That math does not work. So unit sales slow. Refresh cycles stretch. The traditional excitement around new CPUs fades. The market shifts from progression to entrenchment. Ironically, the attempt to extract more value per system accelerates the opposite outcome. Fewer systems sold. Longer lifespans. Less incentive to move forward. In practical terms, memory pricing has become a brake on innovation. People will not discard fully loaded, capable machines just to step into a new socket with unaffordable RAM and negligible gains. They will optimize around what they already own. That is how a market stops moving forward, not because technology stalled, but because economics made progress irrational. There is a reason this kind of behavior was pushed out of civilized markets in the first place. Pillage-and-plunder economics do not create growth. They extract value until the system stops moving. That model was understood, corrected, and regulated out of functional societies because it destroys trust faster than it creates profit. What is happening now feels like its quiet return. Not through force, but through leverage. Not through scarcity, but through manufactured constraints. While attention was elsewhere, the same behavior re-emerged the moment the market gave cover. When customers are told that doubling prices on inventory already sitting in warehouses is “unavoidable,” when modularity is removed to lock in replacement cycles, when architectural inefficiencies are preserved because they are profitable, that is not innovation. It is extraction. Markets tolerate this only briefly. Once buyers adjust behavior, upgrades slow, alternatives are explored, and loyalty evaporates. The damage does not show up in quarterly reports at first. It shows up later, when momentum is gone and trust cannot be rebuilt on demand. History is consistent on this point. Systems do not collapse when people complain. They collapse when people quietly stop participating. And that is where this trajectory leads if it continues unchecked.
  13. It is clear how much thought and high-level engineering went into the new platform; placing the dGPU right under the CPU is clutch.
  14. Traditional industries like tires, rubber, steel, power, and aviation learned their lessons the hard way through catastrophic failure. You cannot ship a “minimum viable tire,” push a subscription brake pedal, or break interchange standards every 18 months and expect society to function. Physics, liability, and real-world consequences forced discipline. Standards became sacred because lives depended on them. The modern IT industry escaped that crucible. Software abstracts consequences until they’re diffuse, delayed, or offloaded onto users. That created moral hazard. When failure doesn’t immediately kill someone or bankrupt the manufacturer, executives optimize for lock-in, rent extraction, and quarterly optics instead of durability, interoperability, and stewardship. Subscription everything, forced obsolescence, cloud dependency, vendor captivity: none of this would survive five minutes in a regulated physical industry. Calling it “innovation” is a fig leaf. Much of it is controlled degradation: removing user agency, breaking working systems, centralizing control, and monetizing dependency. Historically, societies only tolerate this until the hidden costs surface: systemic outages, security collapses, economic drag, loss of skills, and institutional fragility. At that point, the response is not polite market correction; it’s regulation, breakup, or replacement. They are not criminals in a legal sense yet. But they are reckless custodians of critical infrastructure, behaving as if software is a toy rather than the nervous system of modern civilization. History is unkind to people who confuse temporary leverage with permanence.
  15. This is not an analog grounding case; analog isolation in this context refers to the circuit design, with no relation to "ground" in this specific context. Dell relies heavily on Intel Dynamic Platform & Thermal Framework (DPTF). Power mode changes trigger firmware-level policy changes, not just Windows settings, and these transitions are abrupt by design to meet thermal and battery targets. This is common on Dell (not so much with other brands) due to tighter power envelopes, less analog isolation on modern, compact boards, and an emphasis on efficiency over electrical quietness. Older systems masked this better with larger inductors, heavier grounding, and more conservative power transitions.
  16. I agree! My point wasn't to challenge the phenomenon, just to explain the root cause of it so you don't end up chasing windmills. Analog isolation means physically separating two circuits so they don't share electricity or ground. The Dell redesign cutbacks and consolidation we've seen the past few years are losing modularity in design, not adding complexity, hence the circuit designs are also getting chopped and consolidated, all in the name of selling more laptops faster. Soon we will pick up laptops like a hotdog from a 7-Eleven, with the same emotions and customization reserved for socks.
  17. This is not a defect, not a failing speaker, and not Windows “doing something wrong.” It’s a byproduct of modern mobile power design, aggressive firmware control, and minimal analog isolation. Dell’s recent designs prioritize efficiency and thinness over the kind of electrical overengineering older workstations had. If you want it gone, don’t change power modes during playback, lock the system to a single power mode, or use external audio (USB DAC or Bluetooth), which bypasses the internal analog path entirely.
  18. Dell used to be a systems company. It built machines around customer needs, real workloads, and long-term relationships. You could spec a system, argue about it, push back on pricing, and eventually land on something that made sense for both sides. It was not perfect, but it was rational. That Dell no longer exists. What replaced it is a pricing engine that reacts to market hysteria rather than fundamentals. Overnight RAM hikes. Claims of “losing money” while sitting on years of inventory. Pressure tactics that look less like negotiation and more like ransom notes. That same mentality now shows up in the hardware itself. Recent Dell platforms, including flagship models, have steadily lost modularity. Fewer replaceable components. More proprietary layouts. Tighter coupling between parts that were once serviceable and upgradeable. None of this improves performance. It improves control. For decades, Dell’s strength at the edge was adaptability. Systems could evolve with workloads. Their useful life could be extended. That is how trust was built. Now that trust is being monetized. Momentum earned over decades is being exploited as the rug is slowly pulled out from under customers who are not at the very top of the enterprise stack. Buy what is offered, at the price dictated, and replace it sooner. That is the new model. This is not leadership in edge technology. It is brand liquidation. When modularity disappears, long-term value disappears with it. Hardware stops being an investment and becomes a disposable purchase. At that point, Dell is no longer chosen for engineering or reliability, but for convenience. That is how a systems vendor turns into shelf hardware. Pick it up if it is there. Discard it when it is not. So this is not anger. It is an obituary. Dell had decades of goodwill, engineering credibility, and customer trust. It traded that in for short-term extraction during a bubble. It was good while it lasted. But the Dell that built systems is gone.
  19. This year’s balance sheet and sales projections coming out of Dell have to be some of the worst I’ve seen in decades. I honestly don’t know who over there drank the Kool-Aid, but the disconnect from reality is getting absurd. Dell pushed another round of price increases at the start of this week. This time it wasn’t just servers. It hit the consumer side as well. Because our order includes consumer-grade components, we got nailed with roughly a 30% jump on desktops and laptops, right after they already hiked server pricing a few weeks ago. To put numbers on it: a very basic configuration, i9-285, 16GB DDR5, 512GB NVMe, which was already overpriced at $1218, is now sitting around $1500. And they’re acting like this is some kind of favor due to “market conditions.” Meanwhile, HP is selling the same class machine on their public website, no bulk discounts, no backroom pricing, no end-of-year promos, with higher specs, i9-285, 32GB DDR5, 512GB NVMe, for $1299. That’s what any random person sees when they go online and compare systems. So Dell’s “heavily discounted” bulk pricing is now more expensive than HP’s public retail pricing with better hardware. Let that sink in. This isn’t about RAM shortages anymore. This is pricing discipline gone off the rails, and it’s happening at the worst possible time. If this is what Dell thinks the market will tolerate going into next year, they’re in for a rude awakening.
  20. One thing that is not being talked about enough is how much actual technology progress may end up stalled because of this. The AI boom is locking in massive, multi-year capital commitments based on today’s hardware and today’s software assumptions. Billions are being committed now, but supply takes time. By the time much of this equipment is delivered, it will already represent a frozen generation. That creates an incentive problem. If NVIDIA advances the architecture too aggressively while these contracts are still being fulfilled, it risks legal exposure from customers who just spent enormous money on hardware that was positioned as long-term viable. If it does not advance fast enough, it risks falling behind competitors and alternative architectures. The safest path, from a legal and financial standpoint, is to slow real architectural change while extracting as much value as possible from the current generation. That is not how technology normally progresses. Historically, hardware moves forward because the next thing makes the previous one clearly obsolete. In this cycle, progress is constrained by the need to protect sunk costs and contractual commitments tied to an artificial scarcity model. What makes this worse is that the hardware being purchased now is tightly coupled to the current non-coherent memory and CUDA-centric software stack. If a materially better memory model arrives in the next few years (like CXL), large portions of today’s AI infrastructure could become inefficient overnight. That puts vendors in a bind. Advance too fast and you anger your biggest customers. Advance too slowly and you turn the boom into a dead end. Either way, there is a real risk that we are not just inflating prices, but also delaying the next meaningful architectural step, because the money is already committed to preserving the current one. That may end up being the most expensive part of this cycle.