NotebookTalk

MyPC8MyBrain



  1. CES is over, and look at what actually showed up. One low-tier new Intel CPU. One mildly boosted AMD SKU. No new GPUs. No real platform shifts. No meaningful architectural progress. For an industry supposedly in the middle of the biggest computing revolution in decades, this was a staggeringly thin showing. If this were a classroom, you would not call it cutting edge. You would call it remedial at best. Lots of repetition, very little advancement, and an uncomfortable sense that everyone is stalling for time.
That is not because innovation suddenly got harder. It is because the market is locked into monetizing what already exists. When margins come from scarcity, price control, and segmentation, there is no urgency to move fast. So instead of bold platforms, we get SKU shuffling. Instead of new architectures, we get minor frequency bumps that preserve the status quo. Instead of GPUs, we get silence. This is what an industry looks like when it is more focused on managing demand than pushing capability.
What makes this year stand out from previous ones is not that the products were underwhelming. It is that there was no clear sense of direction. Even going back to the 1980s and early 1990s, CES consistently showed forward motion. The hardware was primitive, but the trajectory was visible: new buses, new form factors, new categories, and clear signals about where platforms were heading next. This year did not feel like that. This felt like a strategic pause, an industry more focused on managing inventory, pricing, and segmentation than on moving the platform forward. That is not how CES has historically behaved, even in slower decades.
There may have been weaker products in the past, but there has rarely been a show with so little visible intent to move computing forward. And that ties directly back to everything above. When markets are optimized for monetization over progress, innovation slows. Not because it is impossible, but because standing still has become more profitable than moving ahead.
  2. I’m still skeptical. It remains to be seen how close they’ll actually get to “back to their roots.” So far, the messaging only references XPS and, implicitly, Precision via the new naming convention. Both lines have earned multi-generation followings, but after the recent roller coaster, there’s real ground to make up to regain trust. And that trust isn’t tied to a name. It’s tied to a philosophy, one they’ve been slowly eroding over the past few years.
  3. They could have just used Pro for the Latitude equivalent and left Precision as Precision for the higher tier, as it already was. So essentially the logical rename, if a name change was so necessary, would be Latitude to Pro. What are the equivalent entry-point Inspirons named now?
  4. Precision = Pro. Calling it "Pro Precision" is redundant, especially since there's no non-Pro Precision model. Cup half full: it's better late than never. At least they came to their senses before they completely blew it.
  5. There is another consequence coming out of this landscape that few vendors seem to be accounting for. This pricing behavior fundamentally changes upgrade psychology. For decades, the PC and workstation market relied on a simple pattern: CPUs advanced, prices were tolerable, and users chased the next generation because the platform cost made sense. Memory was an afterthought, not a gating factor.
That is no longer true. When customers are forced to buy systems loaded with 64GB, 96GB, or 128GB of memory just to remain functional, that memory stops being expendable. It becomes the anchor, the sunk cost people protect. The result is predictable. Instead of chasing a new CPU with marginal gains, users will sit on older systems that are already fully populated with expensive memory. Upgrading to a new platform now means rebuying RAM at inflated prices for single-digit performance improvements. That math does not work. So unit sales slow. Refresh cycles stretch. The traditional excitement around new CPUs fades. The market shifts from progression to entrenchment.
Ironically, the attempt to extract more value per system accelerates the opposite outcome: fewer systems sold, longer lifespans, less incentive to move forward. In practical terms, memory pricing has become a brake on innovation. People will not discard fully loaded, capable machines just to step into a new socket with unaffordable RAM and negligible gains. They will optimize around what they already own. That is how a market stops moving forward: not because technology stalled, but because economics made progress irrational.
There is a reason this kind of behavior was pushed out of civilized markets in the first place. Pillage-and-plunder economics do not create growth. They extract value until the system stops moving. That model was understood, corrected, and regulated out of functional societies because it destroys trust faster than it creates profit. What is happening now feels like its quiet return. Not through force, but through leverage. Not through scarcity, but through manufactured constraints. While attention was elsewhere, the same behavior re-emerged the moment the market gave cover.
When customers are told that doubling prices on inventory already sitting in warehouses is “unavoidable,” when modularity is removed to lock in replacement cycles, when architectural inefficiencies are preserved because they are profitable, that is not innovation. It is extraction. Markets tolerate this only briefly. Once buyers adjust behavior, upgrades slow, alternatives are explored, and loyalty evaporates. The damage does not show up in quarterly reports at first. It shows up later, when momentum is gone and trust cannot be rebuilt on demand. History is consistent on this point. Systems do not collapse when people complain. They collapse when people quietly stop participating. And that is where this trajectory leads if it continues unchecked.
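The "that math does not work" claim is easy to sketch. The figures below are illustrative placeholders, not vendor prices: an assumed platform cost, an assumed RAM rebuy cost, and an assumed single-digit generational uplift.

```python
# Illustrative upgrade economics; every number here is an assumption,
# not a quote from any vendor.

def upgrade_cost_per_percent_gain(platform_cost, ram_cost, perf_gain_pct):
    """Dollars spent per percentage point of performance gained
    when the new platform forces a full RAM repurchase."""
    return (platform_cost + ram_cost) / perf_gain_pct

# Assumed: ~$900 for a new CPU + board, ~$1200 to rebuy 128GB of RAM
# at inflated prices, for an ~8% generational uplift.
cost = upgrade_cost_per_percent_gain(900, 1200, 8)
print(f"${cost:.0f} per 1% of performance gained")
```

At numbers anywhere in that neighborhood, each percentage point of performance costs hundreds of dollars, which is exactly why fully populated older systems become the rational choice.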
  6. It is clear how much thought and high-level engineering was placed into the new platform; placing the dGPU right under the CPU is clutch.
  7. Traditional industries like tires, rubber, steel, power, and aviation learned their lessons the hard way, through catastrophic failure. You cannot ship a “minimum viable tire,” push a subscription brake pedal, or break interchange standards every 18 months and expect society to function. Physics, liability, and real-world consequences forced discipline. Standards became sacred because lives depended on them.
The modern IT industry escaped that crucible. Software abstracts consequences until they’re diffuse, delayed, or offloaded onto users. That created moral hazard. When failure doesn’t immediately kill someone or bankrupt the manufacturer, executives optimize for lock-in, rent extraction, and quarterly optics instead of durability, interoperability, and stewardship. Subscription everything, forced obsolescence, cloud dependency, vendor captivity: none of this would survive five minutes in a regulated physical industry.
Calling it “innovation” is a fig leaf. Much of it is controlled degradation: removing user agency, breaking working systems, centralizing control, and monetizing dependency. Historically, societies only tolerate this until the hidden costs surface: systemic outages, security collapses, economic drag, loss of skills, and institutional fragility. At that point, the response is not polite market correction; it’s regulation, breakup, or replacement.
They are not criminals in a legal sense, yet. But they are reckless custodians of critical infrastructure, behaving as if software is a toy rather than the nervous system of modern civilization. History is unkind to people who confuse temporary leverage with permanence.
  8. This is not an analog grounding case; analog isolation in this context refers to the circuit design, with no relation to "ground." Dell relies heavily on Intel Dynamic Platform & Thermal Framework (DPTF). Power mode changes trigger firmware-level policy changes, not just Windows settings, and these transitions are abrupt by design to meet thermal and battery targets. This is common on Dell (not so much with other brands) due to tighter power envelopes, less analog isolation on modern compact boards, and an emphasis on efficiency over electrical quietness. Older systems masked this better with larger inductors, heavier grounding, and more conservative power transitions.
  9. I agree! My point wasn't to challenge the phenomenon, just to explain its root cause so you don't end up tilting at windmills. Analog isolation means physically separating two circuits so they don't share power or ground. The Dell redesign cutbacks and consolidation we've seen the past few years are losing modularity in design, not adding complexity, so the circuit designs are also getting chopped and consolidated, all in the name of selling more laptops faster. Soon we will pick up laptops like a hot dog from 7-Eleven, with the same emotions and customization reserved for socks.
  10. This is not a defect, not a failing speaker, and not Windows “doing something wrong.” It’s a byproduct of modern mobile power design, aggressive firmware control, and minimal analog isolation. Dell’s recent designs prioritize efficiency and thinness over the kind of electrical overengineering older workstations had. If you want it gone: don’t change power modes during playback, lock the system to a single power mode, or use external audio (a USB DAC or Bluetooth), which bypasses the internal analog path entirely.
  11. Dell used to be a systems company. It built machines around customer needs, real workloads, and long-term relationships. You could spec a system, argue about it, push back on pricing, and eventually land on something that made sense for both sides. It was not perfect, but it was rational.
That Dell no longer exists. What replaced it is a pricing engine that reacts to market hysteria rather than fundamentals. Overnight RAM hikes. Claims of “losing money” while sitting on years of inventory. Pressure tactics that look less like negotiation and more like ransom notes.
That same mentality now shows up in the hardware itself. Recent Dell platforms, including flagship models, have steadily lost modularity. Fewer replaceable components. More proprietary layouts. Tighter coupling between parts that were once serviceable and upgradeable. None of this improves performance. It improves control.
For decades, Dell’s strength at the edge was adaptability. Systems could evolve with workloads. Their useful life could be extended. That is how trust was built. Now that trust is being monetized. Momentum earned over decades is being exploited as the rug is slowly pulled out from under customers who are not at the very top of the enterprise stack. Buy what is offered, at the price dictated, and replace it sooner. That is the new model.
This is not leadership in edge technology. It is brand liquidation. When modularity disappears, long-term value disappears with it. Hardware stops being an investment and becomes a disposable purchase. At that point, Dell is no longer chosen for engineering or reliability, but for convenience. That is how a systems vendor turns into shelf hardware. Pick it up if it is there. Discard it when it is not.
So this is not anger. It is an obituary. Dell had decades of goodwill, engineering credibility, and customer trust. It traded that in for short-term extraction during a bubble. It was good while it lasted. But the Dell that built systems is gone.
  12. This year’s balance sheet and sales projections coming out of Dell have to be some of the worst I’ve seen in decades. I honestly don’t know who over there drank the Kool-Aid, but the disconnect from reality is getting absurd.
Dell pushed another round of price increases at the start of this week. This time it wasn’t just servers. It hit the consumer side as well. Because our order includes consumer-grade components, we got nailed with roughly a 30% jump on desktops and laptops, right after they already hiked server pricing a few weeks ago.
To put numbers on it: a very basic configuration, i9-285, 16GB DDR5, 512GB NVMe, which was already overpriced at $1218, is now sitting around $1500. And they’re acting like this is some kind of favor due to “market conditions.”
Meanwhile, HP is selling the same class machine on their public website, no bulk discounts, no backroom pricing, no end-of-year promos, with higher specs, i9-285, 32GB DDR5, 512GB NVMe, for $1299. That’s what any random person sees when they go online and compare systems. So Dell’s “heavily discounted” bulk pricing is now more expensive than HP’s public retail pricing with better hardware. Let that sink in.
This isn’t about RAM shortages anymore. This is pricing discipline gone off the rails, and it’s happening at the worst possible time. If this is what Dell thinks the market will tolerate going into next year, they’re in for a rude awakening.
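A quick arithmetic check on the figures quoted in that post (prices as stated there, specs otherwise assumed equal, nothing independently verified):

```python
# Sanity check on the prices quoted above; all three figures come from
# the post itself and are not independently verified.

dell_old, dell_new = 1218, 1500   # quoted Dell bulk config, before/after the hike
hp_retail = 1299                  # quoted HP public retail price, higher-spec box

hike_pct = (dell_new - dell_old) / dell_old * 100
premium_pct = (dell_new - hp_retail) / hp_retail * 100

print(f"Increase on this Dell config: {hike_pct:.0f}%")
print(f"Dell bulk vs HP retail premium: {premium_pct:.0f}%")
```

On these quoted numbers, this particular configuration rose about 23%, and the "discounted" Dell bulk price now sits roughly 15% above HP's public retail price for a better-specced machine.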