When Software Loses Its Value, What Becomes the Scarcest Resource?
When software becomes a commodity, what becomes the scarcest resource? This article explores how AI is devaluing code, why data becomes the new competitive moat, and why objective cognition is the ultimate scarce resource in the AI era.
Anthropic published a seemingly low-key technical blog post: How AI helps break the cost barrier to COBOL modernization. On the day the post went out, IBM's stock fell by roughly 13%, one of its largest single-day declines in over two decades, erasing about $30 billion in market capitalization.
This isn't another routine news piece about 'yet another model getting stronger.' It's more like a wake-up call: software itself is undergoing structural devaluation.
What's particularly dramatic is that IBM proactively cut those 'unsexy' hard-tech and heavy-asset businesses years ago, betting its future on lighter, more 'financial-asset-like' software services and consulting, a choice that was correct for its era. But as AI turns 'software production and renovation' into nearly scalable, automated engineering, that path suddenly looks perilous: light assets and so-called know-how turn out to have no moat at all.
Actually, this blog post is really about only one thing: Claude Code has turned 'COBOL modernization'—something that was previously prohibitively expensive and time-consuming—into scalable, automated engineering.
COBOL is a language dating back to 1959, yet it still underpins the foundations of financial, aviation, and government systems today, especially within legacy mainframe-centric systems, where COBOL isn't just code but more like a 'business machine' that has been running for half a century. And the people who maintain this machine are retiring at a visible pace.
For decades, 'legacy system modernization' has been an extremely profitable business: built on stacking people, time, and consulting fees. Now, AI is directly reaching for this 'cost lever.' Suddenly you realize—we've always assumed, when talking about digital transformation, a single premise: using software well is a core competitive moat for enterprises.
Large enterprises invested heavily in custom proprietary systems while small and medium businesses could only look on with envy. Back then, the ability to 'land software projects and iterate continuously' almost directly determined a company's competitiveness in the digital realm.
But all of this is being rewritten by AI.
Today, code generation has moved from 'efficiency improvement' to 'supply surplus': one person, aided by AI, can complete in a short time what previously took a small team days or even weeks. The production cost of basic functionality, general systems, pages, and interfaces is rapidly declining, increasingly approaching 'infinitely replicable consumables.'
When 'having software' is no longer the differentiator, a more fundamental question surfaces—
When software loses its value, what becomes the scarcest resource?
I. The Truth About AI Programming: The Bottleneck Isn't Code, But 'Decidable Definitions'
Many people still have reservations about AI programming: code has bugs, logic can drift, and 'hallucinations' occur. These problems certainly exist, but the real root cause often lies not in model capabilities but in the input we give it—unclear definitions, ambiguous boundaries, and undecidable acceptance criteria.
Simply providing detailed test examples, such as clearly defined attendance-system fields, logic rules, and interface requirements, lets AI quickly reproduce the same software in multiple languages. Many so-called bugs are really symptoms of fuzzy or missing requirement descriptions. As AI training data becomes richer, precision will keep improving. In the future, as long as requirements are clearly defined, AI can handle the vast majority of repetitive programming work and drive development costs toward a floor.
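To make 'decidable acceptance criteria' concrete, here is a minimal sketch in Python. The attendance rule, field names, and thresholds are all hypothetical; the point is that every case has exactly one verifiable answer, which is precisely the kind of input AI code generation handles best.

```python
from datetime import time

# Hypothetical acceptance criteria for an attendance rule, written as
# executable checks. The start time and grace period are illustrative.
WORK_START = time(9, 0)
GRACE_MINUTES = 10  # arrivals within 10 minutes are not counted as late

def minutes_after_start(clock_in: time) -> int:
    """Minutes elapsed between the official start and the clock-in time."""
    return (clock_in.hour - WORK_START.hour) * 60 + (clock_in.minute - WORK_START.minute)

def attendance_status(clock_in: time) -> str:
    delta = minutes_after_start(clock_in)
    if delta <= 0:
        return "on_time"
    if delta <= GRACE_MINUTES:
        return "grace"
    return "late"

# Decidable acceptance criteria: each case has one correct answer,
# so 'done' is a yes/no question, not a matter of opinion.
assert attendance_status(time(8, 55)) == "on_time"
assert attendance_status(time(9, 10)) == "grace"   # boundary is inclusive
assert attendance_status(time(9, 11)) == "late"
```

A specification written this way can be handed to an AI coding tool and re-checked mechanically after every regeneration, in any target language.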
In the AI era, one increasingly common sharp truth about software engineering is:
Requirements without acceptance criteria aren't requirements—they're wishes.
AI has largely solved 'how to write software'; the ability to define requirements, deciding 'what software to write' and 'why,' has become the new core variable. The decline in software costs marks the moment when human definitional capability begins to determine software's value.
II. After Software Becomes Universal, Small Businesses Can Afford Good Software Too
In the past, software was the exclusive domain of large enterprises because their businesses were stable and processes fixed, allowing software to be reused long-term and costs amortized. Meanwhile, numerous small and medium businesses with flexible operations and changing needs couldn't afford traditional software development's high costs and slow iteration. They went without software not because they didn't need it, but because the model didn't fit them.
AI has broken this dilemma. Its advantages in rapid iteration and low-cost trial and error let small and medium businesses quickly develop and iterate software tailored to their operations without massive budgets or professional teams. For example, a small e-commerce business only needs to clarify its requirements; within hours it can stand up a product-and-order-management system with AI, and adapt new features quickly afterward.
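As an illustration of how little specification such a system actually needs, here is a minimal, hypothetical sketch of a product-and-order core; every name and field below is invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    sku: str
    name: str
    price: float
    stock: int

@dataclass
class Order:
    order_id: str
    lines: dict = field(default_factory=dict)  # sku -> quantity

class Store:
    """In-memory product catalog plus order intake with a stock check."""

    def __init__(self):
        self.products: dict = {}
        self.orders: list = []

    def add_product(self, p: Product) -> None:
        self.products[p.sku] = p

    def place_order(self, order: Order) -> None:
        # Reject the whole order if any line exceeds available stock.
        for sku, qty in order.lines.items():
            if self.products[sku].stock < qty:
                raise ValueError(f"insufficient stock for {sku}")
        for sku, qty in order.lines.items():
            self.products[sku].stock -= qty
        self.orders.append(order)

    def order_total(self, order: Order) -> float:
        return sum(self.products[sku].price * qty
                   for sku, qty in order.lines.items())
```

A shop owner who can state rules at this level of clarity ("reject an order if any line exceeds stock") has already done the part AI cannot do for them; the rest is cheap to generate.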
Software has fallen from 'luxury good' to 'infrastructure.' When all enterprises can easily have software, the competitive focus shifts from 'having it or not' to 'using it well or poorly.' And what determines software's value was never the software itself, but the data it carries and processes—this is the first core direction of scarce resources after software loses its value.
III. The Core Answer: When Software Loses Value, What's Scarcest Is 'Data'
Many people still view software as an 'efficiency tool,' ignoring its core value: recording real-world business and behavior, transforming it into analyzable data, combining it with historical and external data to support decision-making. Whether for individuals or enterprises, software's value depends entirely on the quality and quantity of data.
At the individual level, shopping software's value isn't in placing orders, but in recommending suitable products by combining personal preferences, product sales, user reviews, and other data. Travel software's value isn't in booking tickets, but in integrating weather, passenger flow, and other real-time data with personal needs to customize exclusive itineraries. All of this depends on the support of one's own objective data, historical reference data, and real-time data.
At the enterprise level, production management software's value lies in recording full-process data and optimizing production plans by combining market and supply chain data. Transportation management software's value lies in integrating road conditions and vehicle data to optimize routes and reduce costs. Without data, software is just a worthless shell.
Truly valuable data must possess three characteristics: objectively true, comprehensive and complete, and real-time and flowable. But currently, while data seems abundant, trustworthy, high-quality data is extremely scarce—this contradiction is the core reason for its scarcity.
First, internet data is filled with noise and false information. Merchants fake sales volumes, platforms push advertisements, institutions manipulate data—this interest-driven data is not only worthless but misleading. For instance, enterprises developing production plans based on fake sales data will only end up with excess capacity.
Second, enterprises' own data often suffers from incompleteness and lack of objectivity. Some only record core data while ignoring key auxiliary data. Some filter favorable data and avoid unfavorable data due to subjective bias, leading to distorted data that can't support correct decisions. For example, analyzing marketing effectiveness by only looking at successful data won't reveal optimization directions.
More critically, real-time data acquisition and integration are extremely difficult. Enterprise data is scattered across different systems and departments with high interoperability costs. External market and policy real-time data is not only hard to obtain but requires rapid filtering and analysis to transform into usable information.
We must be clear: code can be free, software can be open-source, AI can be free to use, but clean, true, complete, high-quality data can never be obtained for free. When software barriers are erased, data becomes the core differentiator for enterprises: those with high-quality data can make precise decisions with AI's help. Without data, more software is just a kitchen with no ingredients.
But the true scarcity of data has never been about 'quantity,' but about 'structural completeness' and 'high quality.' Below are three types of data that are core prerequisites for decision-making closed loops and AI output value:
The first layer is individual personalized data, core to answering 'who am I and what do I want.' Individual traits such as budget, preferences, and risk tolerance directly determine whether AI output is 'personally usable' or 'generic and unfocused.' Without it, AI can only give 'averagely correct' conclusions that are hard to implement.
The second layer is historical and environmental data, core to answering 'how did we do it before and what are the boundaries.' Enterprises' past experiences, industry patterns, and process inertia all fall into this category. Their value lies in stability, providing referable paths for decision-making and avoiding being led astray by short-term fluctuations. Without it, even perfect suggestions can't withstand reality's scrutiny.
The third layer is real-time data, core to answering 'what's happening now and where's the opportunity.' Real-world dynamic changes require decisions to be not just 'correct' but 'timely.' Real-time data can push decision-making from 'post-hoc review' to 'in-process response and pre-emptive prevention,' avoiding losses from delayed reactions. Without it, correct answers miss their timing.
The three are tightly related: individual data sets the 'direction,' clarifying who decisions are tailored for; historical and environmental data sets the 'foundation,' defining decision boundaries; real-time data sets the 'timing,' ensuring decisions arrive on time. None can be missing: individual data alone risks 'understanding the need but losing touch with reality'; historical data alone risks 'being steady but unable to keep up with change'; real-time data alone risks 'being agile but directionless.'
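The three layers can be sketched as inputs to a single decision function. The types, fields, thresholds, and purchase rule below are purely illustrative assumptions, not anything the article prescribes; the structure is the point.

```python
from dataclasses import dataclass

@dataclass
class IndividualData:      # "who am I and what do I want" -> direction
    budget: float
    risk_tolerance: str    # "low" or "high"

@dataclass
class HistoricalData:      # "how did we do it before" -> foundation
    avg_monthly_spend: float

@dataclass
class RealtimeData:        # "what's happening now" -> timing
    current_price: float
    price_trend: str       # "rising" or "falling"

def purchase_decision(ind: IndividualData,
                      hist: HistoricalData,
                      rt: RealtimeData) -> str:
    """Each layer can veto or redirect the decision; remove any one
    and the remaining logic degrades exactly as the text describes."""
    if rt.current_price > ind.budget:
        return "skip"            # direction: outside personal budget
    if rt.current_price > 1.5 * hist.avg_monthly_spend and ind.risk_tolerance == "low":
        return "wait"            # foundation: outside historical norms
    if rt.price_trend == "falling":
        return "wait_for_lower"  # timing: the real-time signal decides
    return "buy"
```

Dropping the `IndividualData` argument would leave only 'averagely correct' answers; dropping `RealtimeData` would leave correct answers that arrive too late, mirroring the failure modes above.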
When all three types of data are in place, AI's scarcest value in this era can be realized: transforming software's 'recording function' into 'judgment capability,' then landing it as 'effective action.' Going further, software will eventually become cheap capacity, while data becomes scarce 'decision evidence.' The core value lies not in 'having data' but in establishing a complete 'evidence structure,' allowing every AI suggestion to clearly answer: who is this useful for? Why is it useful? Is it still in time?
Data acquisition, filtering, and analysis ultimately cannot be separated from humans. And more scarce and harder to possess than data is humans' 'objective, calm' cognitive attitude—this is the ultimate direction of scarce resources.
IV. What's Scarcer Than Data Is 'Objective, Calm, Undistorted Cognition'
In the AI era, what we lack more is objectively calm people. Data itself is neutral and unbiased, but its value realization depends entirely on humans' subjective judgment—once humans lose objectivity, data becomes distorted, and software and AI become tools that 'amplify errors.'
Objectivity is rare because humans naturally carry positions, emotions, and biases, always unconsciously filtering data that matches expectations and benefits themselves, using locally accidental data to replace globally inevitable patterns, leading to decisions misguided by subjective cognition.
For example, enterprise managers who subjectively favor a product will deliberately amplify a small amount of positive feedback and short-term sales growth while ignoring large amounts of negative feedback and long-term market contraction, ultimately over-investing and taking losses. Some practitioners, skilled in traditional programming, dismiss AI's efficiency advantages and cling to old ways of working, only to be left behind by the era. These are all costs of subjectively distorted cognition.
An objective attitude allows people to break free from the constraints of emotion and bias, seeing the essence of data and the whole truth. Only with objectivity can we eliminate data noise and filter valuable information. Only with objectivity can we use AI correctly—AI is like a mirror: input real data and get precise suggestions; input distorted data and get absurd conclusions.
If AI is only fed premium customer data, it will ignore ordinary and potential customer needs, leading to decision bias. If production loss data is hidden, AI's plans will only increase costs.
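The 'mirror' effect is easy to demonstrate in a few lines of Python. The ratings below are invented, but the arithmetic shows how subjective filtering changes what any downstream model or dashboard sees.

```python
# Hypothetical product ratings, including the unflattering ones.
reviews = [5, 5, 4, 1, 2, 1, 5, 2, 1, 4]

full_avg = sum(reviews) / len(reviews)

# Subjective filtering: keep only favorable feedback (4 stars and up),
# the same move as "analyzing marketing effectiveness by only looking
# at successful data."
filtered = [r for r in reviews if r >= 4]
filtered_avg = sum(filtered) / len(filtered)

print(f"average over all reviews:      {full_avg:.1f}")      # 3.0
print(f"average after filtering:       {filtered_avg:.1f}")  # 4.6
```

The model downstream of the filter is not wrong; it is faithfully mirroring an input that a biased human already distorted.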
In the AI era, human core competitiveness has evolved from 'can use tools, can write code' to 'can maintain objectivity, can see the big picture, can define standards.' Objectivity is no longer a personality trait but a core capability of the digital age—a scarce cognitive ability that can make data valuable and AI create value.
This capability requires us to break free of subjective bias and learn to speak with data and respect patterns; to keep a calm mind, untempted by emotion and short-term interest; and to think globally, stepping out of the local view to see the essence and trends behind the data. This is precisely the scarcest and least replaceable capability right now.
Summary: The Value Hierarchy of the New Era Has Already Been Restructured
The wave of AI has not only stripped software of its 'luxury good' halo and turned code into consumables but has also thoroughly reshaped the underlying logic of value—true scarcity is leaping from 'tools' to 'essence.'
Code and software can be obtained for free, representing the most basic tools. Trustworthy, objective, comprehensive data is the fuel driving value—scarce and precious. And people who can maintain objectivity without distorting cognition are the core that activates all of this, and even more so, the ultimate irreplaceable competitiveness of the AI era.
For enterprises, there's no need to chase the surface appearance of software and technology; accumulating high-quality data and cultivating objective cognition is the real path to breakthrough. For practitioners, building the ability to speak with data and decide objectively is how to establish oneself in this era.
Software will depreciate, technology will iterate, but real data and clear minds will always be the confidence to ride through transformation and continuously create value.