The enemy of knowledge is not ignorance, it’s the illusion of knowledge (Stephen Hawking)

It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so (Mark Twain)

Invest with smart knowledge and objective odds

YOUR DAILY EDGE: 11 February 2026

U.S. Retail Sales Unexpectedly Flat in December The softer-than-expected data feed into economists’ concerns about a fragile consumer economy

U.S. retail sales were flat in December, a disappointing datapoint for economists who had expected growth despite concerns about a fragile consumer economy.

Sales at U.S. stores were roughly unchanged in December versus November at $735 billion, after rising by 0.6% in November, the Commerce Department said Tuesday. Analysts polled by The Wall Street Journal were anticipating a 0.4% increase.

Overall in 2025, total sales by retailers grew by 3.7%. In December, sales declines at car dealers, furniture and electronics stores were offset by sales growth at sellers of building materials, food and beverages and gasoline. The figures are seasonally adjusted but not adjusted for inflation. (…)

How significant is such a big miss in the biggest retail month of the year?

  • Total sales were flat vs +0.4% expected.
  • Ex-autos & gas sales were flat vs +0.4% expected.
  • Control group sales (ex autos, gas, building materials and restaurants) were –0.1% vs +0.4% expected.
  • The level of core retail sales was revised down by 0.4% in November, reflecting 0.2pp downward revisions to core retail sales growth in both October and November.

Goldman Sachs:

Soft retail sales growth in December and downward revisions to prior months suggest softer consumption growth in Q4 than our previous GDP tracking assumptions. We lowered our Q4 GDP tracking estimate by 0.4pp to +1.6% (quarter-over-quarter annualized). Our Q4 domestic final sales estimate stands at +1.1% and our private domestic final sales estimate stands at +2.3%.

Last 4 months of 2025:

  • Total sales up 2.0% annualized thanks to one strong November.
  • Control group sales: +1.2% annualized.

Wells Fargo:

(…) a weak end isn’t necessarily indicative of the start of a new consumer trend. (…)

The latest data ultimately suggest households are still broadly spending in the face of sustained and compounding inflation and an uncomfortable moderation in the jobs market. We expect consumer spending to be sustained this year as higher after-tax incomes and average tax refunds as part of the One Big Beautiful Bill Act help offset some of these household constraints.

One can also point to seasonal adjustments given that unadjusted sales rose by 3.8% YoY, the biggest YoY increase in three months, slightly higher than the +3.7% YoY gain in retail sales for all of 2025.

The fact remains that, starting in September, growth in retail sales (mainly goods) has been progressively weaker than payroll growth, likely in reaction to rising goods prices, now up 1.6% vs flat last spring.

image

The Census Bureau cannot give us a date for January 2026 data. The only official data so far is light vehicle sales, down 4.1% in units YoY. January volume was the lowest reading since December 2022 (amid the industry chip shortage) and represents the fourth consecutive month with a YoY decline. But January sales were likely negatively impacted by the large winter storm late in the month.

image

Bank of America today informed us that seasonally adjusted spending growth per household was flat MoM in January, following the 0.5% MoM increase in December, with the winter storms likely weighing on January spending.

Interestingly:

In our view, a “K-shape” (or divergence) in spending growth may be beginning to emerge between higher-income households and middle-income households, as opposed to just with lower-income households. In fact, in January, the gap in spending growth between higher-income households and all others was at its largest since mid-2022, according to Bank of America internal data. Lower- and middle-income households’ spending growth ticked down to 0.3% and 1.0% YoY, respectively, while higher-income households’ spending was more stable at 2.5% YoY.

A similar pattern is emerging in after-tax wage growth, with the gap between higher- and middle-income households at its largest in nearly five years, according to Bank of America internal data. While higher-income households’ wage growth was 3.7% YoY in January, a solid improvement from the 3.3% YoY in December, middle-income families’ wage growth saw only a marginal improvement, increasing to just under 1.6% YoY in January from over 1.5% in December.

(…) lower-income households saw an average wage increase of 0.9% YoY in January.

image

Yesterday we also got the Q4 Employment Cost Index:

Employment cost growth continued to moderate at the end of the year, rising a softer-than-expected 0.7% in the fourth quarter. Over the past year, labor costs have risen 3.4%, the slowest pace since the spring of 2021. The ongoing slowdown in compensation growth according to the ECI marks a departure from the more timely average hourly earnings data, which has shown wages rising at just under 4% year-over-year since early 2025.

But we believe the ECI’s signal of further softening in the jobs market carries more weight in Fed officials’ assessment of the labor market. The ECI has long been considered the Fed’s preferred measure of labor costs because it controls for compositional changes in employment and is broader in scope since it includes benefit costs and compensation costs for public sector workers.

The easing in total compensation growth has been underpinned by the gradual slowdown in wages & salaries, which rose 3.3% year-over-year in the fourth quarter. (…) (Wells Fargo)

image

If the ECI is right, wages are actually rising 3.0-3.5% YoY vs the 3.5-4.0% suggested by average hourly earnings. Not insignificant when inflation is around 3.0%.

image

The squeeze is still on, and it seems to be percolating to the middle class per BoA. But

In the short term, higher tax refunds are likely to help. BofA Global Research estimates that tax refunds in 2026 will increase by around $100 billion or 25% higher than in 2025. Exhibit 12 shows that in 2025, younger (Gen Z and Millennials) and lower-income households appeared to see the largest increases in their card spending as a result of refunds. Given these cohorts are likely to be the most challenged by affordability issues, larger refunds in 2026 could also help support their spending this year.

Finally, Bank of America data shows that 401(k) plan participants’ average account balance has been steady over the past two quarters and is up significantly compared to the past two years.

image

Mattel shares plunged as much as 28% in late trading after the toy maker said an anticipated surge in holiday sales came up short, and it issued a lower-than-expected profit forecast for 2026.

The shortfall in sales in the critical weeks before Christmas prompted the maker of Barbie dolls and Hot Wheels cars to step up discounts, it said, putting a squeeze on profit margins. Both sales and profit came in below Wall Street expectations for the fourth quarter. (…)

Mattel executives said price-sensitive consumers shopped for deals while retailers were cautious in managing their inventory.  

“December is historically the biggest month of the year,” Chief Executive Ynon Kreiz said in an interview Tuesday. “But because of the shift in retailer ordering patterns, orders were even more back-end loaded.”

Fourth-quarter sales rose 7% to $1.77 billion, below the $1.84 billion that Wall Street had modeled. Profit and gross margins fell sharply because of increased discounts, as well as tariff costs and other factors.

Mattel also issued guidance for the current year that was below expectations, as the company plans additional investments to stoke sales.

The results stood in contrast to rival Hasbro, which reported earnings earlier in the day. Hasbro said shoppers had been willing to pay higher prices for toys during the holiday season, allowing the company to pass along tariff costs without significantly hurting demand.

Mattel’s Kreiz noted that December sales only missed expectations in the U.S., and that the company performed as expected and gained market share internationally.

The promotional environment in December was steeper than expected, requiring Mattel to lean more heavily into discounts than originally planned. The company raised prices last summer in response to tariffs, and Kreiz said it continues to take a “very strategic” approach to pricing to drive demand and “offer the right quality and value for consumers.” (…)

For the year, Mattel projected adjusted earnings of $1.18 to $1.30 a share, below analyst views for $1.77 a share. The company forecast sales to climb 3% to 6% on a constant currency basis.

For its three months ended Dec. 31, Mattel posted a profit of $106.2 million, or 34 cents a share, down from $140.9 million, or 42 cents a share, a year earlier.

Stripping out certain one-time items, earnings were 39 cents a share. Analysts polled by FactSet expected adjusted earnings of 54 cents a share. (…)

Delinquency rates on loans ranging from mortgages to credit cards rose to 4.8% of all outstanding US household debt in the fourth quarter, the highest level since 2017, driven by higher defaults among low-income and young borrowers.

While the overall share of loans in some stage of default is near pre-pandemic averages, the rise in delinquencies among the lowest earners adds to evidence of an increasingly bifurcated economy, data from the Federal Reserve Bank of New York’s Quarterly Report on Household Debt and Credit released Tuesday showed.

image

The rise in defaults was driven by delinquencies in mortgage payments, and New York Fed researchers found that they were particularly high in lower income zip codes. Student-loan delinquencies, which have surged following a pause in payment requirements during the pandemic, also contributed to the rise in defaults, the researchers said. (…)

“Delinquency rates for mortgages are near historically normal levels, but the deterioration is concentrated in lower-income areas and in areas with declining home prices.”

The share of credit-card loans that were at least 90 days delinquent rose to 12.7% — the most since the first quarter of 2011 — and the share of auto loans in serious delinquency climbed to 5.2%, just shy of the record reached in 2010.

The increased difficulty low-income and young borrowers face in paying their loans is consistent with elevated unemployment rates among some parts of the population, they added. The jobless rate for workers 16 to 24 years old stood at 10.4% in December, near the highest levels since the depths of the pandemic in 2021.

Some 16.3% of student-loan debt became delinquent in the fourth quarter, the biggest increase on record in data going back to 2004.

U.S. Farmer Sentiment Weakens To Start The Year

The Purdue University-CME Group Ag Economy Barometer dropped 23 points in January to 113, down from 136 in December, marking the sharpest monthly decline in recent quarters and the lowest reading since September 2024. The survey of 400 U.S. agricultural producers, conducted January 12-16 and released February 3, showed deteriorating sentiment across both current conditions (down 19 points to 109) and future expectations (down 25 points to 115).

The most pronounced weakness emerged in producers’ five-year outlook for U.S. agriculture, with the index measuring expectations for widespread good or bad times falling from 122 to 88. Half of surveyed farmers reported their operations were worse off than a year ago, while 59% now expect bad financial times in the next twelve months, up from 47% in December. Only 62% of producers believe the U.S. is headed in the right direction, down from 75% the prior month.

Mounting debt pressures are compounding the downturn; 21% of respondents anticipate larger operating loans in 2026 versus 2025, up from 18% a year earlier. Among those expecting increased borrowing, 31% cited carryover of unpaid operating debt from the prior year as the primary driver, up sharply from just 5% in 2023, 17% in 2024, and 23% in 2025. “What stands out this month is the growing number of producers who report that higher operating-loan needs stem from carrying over unpaid debt from the previous year,” said Michael Langemeier, the barometer’s principal investigator and director of Purdue’s Center for Commercial Agriculture. “That points to increasing financial pressure heading into the year.”

Export concerns intensified in January; 16% of respondents expect U.S. agricultural exports to decline over the next five years, tripling from 5% in December. For corn and soybean producers specifically, 21% foresee soybean export declines, up from 13% previously, with 80% expressing concern about Brazilian competition (44% “very concerned”). The Farm Capital Investment Index dropped 11 points to 47, its lowest level since October 2024, with just 4% of producers planning to increase farm machinery purchases, a direct headwind for agricultural equipment demand.

The January survey coincides with USDA’s World Agricultural Supply and Demand Estimates report, which showed higher-than-expected U.S. corn yields, adding to bearish sentiment. For AFN, the barometer’s deterioration underscores a challenging demand environment as farmers face tightening margins, rising debt services, and heightened capital expenditure caution entering 2026, reaffirming the belief that 2025 was not the trough for the North American agricultural industry.

The backlash captures a deeper problem for Trump: Niche constituencies he courted in 2024 are growing disillusioned.

  • Podcast populists: The Trump administration’s handling of the Epstein files and aggressive immigration enforcement has alienated some anti-establishment podcasters who helped shape his appeal to young men, including Joe Rogan and comedian Andrew Schulz.
  • Nonwhite voters: Trump’s support among Black and Latino Americans has slipped after he made major gains in 2024, as cost-of-living pressures squeeze these voters.
  • Farmers: Even after Trump rolled out a $12 billion “bridge payment” to offset tariff-related losses, agricultural leaders warned last week of potential “widespread collapse” if Congress fails to act.

China’s Years-Long Retreat From US Treasuries Flags Bigger Risks

The slump in Treasuries after China’s latest call to curb its holdings was fleeting, but it put a spotlight on Beijing’s decade-long shift from US debt and rekindled fears about a broader, global retreat.

A look at the data on China’s Treasury holdings suggests why traders were so quick to move on from the report that Beijing had urged Chinese banks to limit their Treasury purchases. Once the largest foreign lender to the US government, China has quietly halved its holdings of Treasuries since 2013 — and investors appear to have decided the latest headlines fit that trend.

The danger now is that President Donald Trump’s unpredictable policies could further alienate US allies and encourage traditional lenders like Europe and Japan to follow in China’s footsteps.

image

So far, a surge in foreign demand suggests the long-term shift by America’s biggest geopolitical rival is more the exception than the rule. (…)

For China, regulators have grown worried that large holdings of US government debt may expose banks to sharp swings. Officials advised financial institutions to limit purchases and instructed those with high exposure to pare down their positions, Bloomberg reported, citing people familiar with the matter. (…)

“The whole idea of lending to the government of your primary adversary should no longer be welcomed in Beijing,” he said. (…)

Outside Europe, India’s holdings have dropped to a five-year low as the nation pushed to support its currency and diversify reserves. Brazil’s long-term Treasuries holdings have also declined.

“The broader trend is clear,” said Damien Loh, chief investment officer at Ericsenz Capital in Singapore. “Non‑US entities, both sovereign and corporate, are moving to reduce their overweight positions in US assets, particularly Treasuries.”

While foreign holdings of Treasuries hit a record $9.4 trillion in November, the share of the total debt is smaller — reflecting that their purchases haven’t kept up with the growth in US government borrowing. Overseas investors now hold about 31% of the total, compared to roughly 50% at the beginning of 2015.

Still, the moves in the world’s largest debt market are a long way from adding up to a buyers’ strike.

As long as the US runs a trade deficit and sends dollars overseas, foreign countries must find a home for those dollar revenues, with Treasuries remaining one of the main destinations, said Jim O’Neill, the former chairman of Goldman Sachs Asset Management.

“It is a red herring,” said O’Neill about foreign investors dumping US debt en masse. “The US bond market is very large. If China or Japan reduces their holdings, someone else will buy them.”

image

Brad Setser, a senior fellow at the Council on Foreign Relations, estimates that China’s “true” holdings of US Treasuries exceed $1 trillion, far above the $683 billion reported by the US Treasury.

That’s because Beijing may have obscured its footprint by shifting assets to custodians in Europe. Belgium — whose holdings are considered to include some of those Chinese accounts — has seen its Treasury ownership quadruple since the end of 2017 to $481 billion. (…)

“The PBOC is largely stuck with the dollar because of a dearth of safe and liquid assets denominated in other currencies,” he said. “It is highly unlikely that China has diversified away from US Treasury securities to the extent suggested by official data.”

Meanwhile:

Dozens of cities in China have reported progress in clearing hidden debt in the past year, including a significant reduction in the number of local government financing vehicles, the Securities Times reports, citing regional governments’ work reports for 2025.

With the 2028 target for eliminating all existing hidden debt and the June 2027 deadline for local government financing platform companies (hereinafter “urban investment companies”) to complete their exit from the platforms drawing ever closer, local debt-reduction efforts have entered a critical and more difficult stage.

A Securities Times review of the 2026 government work reports and fiscal budget reports recently released by various regions shows that, over the past year, many localities announced the elimination of local hidden debt and a significant reduction in the number of urban investment companies, with some regions even exceeding their debt-reduction targets.

In 2025, local government debt risk in China subsided further, and after debt swaps in various regions, the average interest cost of the debt fell by more than 2.5 percentage points. According to the reporters’ incomplete tally, since 2026 at least 34 cities across the country have announced fresh progress in eliminating hidden debt, including Siping and Songyuan in Jilin Province and Shuangyashan in Heilongjiang Province, which announced that they reached zero hidden debt last year. (…)

“Hidden debt in more and more regions will be cleared ahead of schedule, and the work of resolving hidden debt is nearing completion,” Zhou Lijun, executive director of the Public Utilities Department of Orient Securities, told the Securities Times. (…)

A Bridge Too Far?

In June 2023, Michael Leppert wrote For Trump supporters, ‘a bridge too far’ must be nearby

I often wonder where idioms like “a bridge too far” originate. This phrase comes from the title of the 1974 book of the same name, by Cornelius Ryan. The book tells the story of Operation Market Garden, the Allies’ plan of attack to conquer several bridges in the Netherlands in September of 1944. The goal was to establish a route over the Rhine River for an invasion into Germany.

The Allies never captured that last bridge at Arnhem, and the mission ultimately failed.

For Americans who possess apparently unrelenting fealty to former President Donald Trump, the bridges in front of them are also numerous. Trump’s influence, his top position in Republican party politics, relies on a constituency that will seemingly follow him anywhere.  (…)

What bridge will be too far? (…)

Yesterday’s NYT: Bridge Owner Lobbied Administration Before Trump Blasted Competing Span to Canada

The billionaire owner of a bridge connecting Michigan with Canada met Howard Lutnick, the U.S. Secretary of Commerce, on Monday hours before President Trump lambasted a competing span, in the latest flashpoint in the deteriorating relationship between the United States and Canada.

Matthew Moroun is a Detroit-based trucking magnate whose family has operated the Ambassador Bridge between Detroit and Windsor, Ontario, for decades. He met on Monday with Mr. Lutnick in Washington, according to two officials briefed on the meeting who requested anonymity to discuss a private conversation.

After that meeting Mr. Lutnick spoke with Mr. Trump by phone about the matter, the officials said.

Shortly afterward, Mr. Trump threatened to block the planned opening of a new bridge between Detroit and Windsor, which would take away toll revenue from Mr. Moroun’s crossing, if Canadian officials did not address a long list of grievances. (…)

The Moroun family had previously called on Mr. Trump to halt the construction of the bridge — which, once opened, would compete with the Ambassador Bridge for the more than $300 million in daily cross-border trade. (…)

In his first term, Mr. Trump had promoted the Gordie Howe Bridge in a joint statement with Canadian officials as a symbol of the countries’ deep ties and as “a vital economic link between our two countries.” (…)

Totally unrelated:

I am a bit of an etymologist, interested in the origin of words or expressions.

Henry Goddard served as the Director of Research at the Vineland Training School for Feeble-Minded Girls and Boys in Vineland, New Jersey from 1906-1918. Between 1908, when Goddard first translated a version of the Binet scale [IQ tests] and had it published in America, and 1930, over nine million adults and children had been tested using this scale. Standardized mental measurement cemented the authority of psychology as a serious science.

Binet’s original scale of mental measurement had included two gradations of deficiency: the “idiot,” who had a mental age of 2 or younger, and the “imbecile,” who had a mental age of 3 to 7 years. However, Goddard was not satisfied that this scale adequately addressed the problem of mental deficiency.

He believed the greatest threat to civilization’s advance lay with those who demonstrated a mental age of 8 to 12 years. This group, consisting of those closest to a “normal” mental age (13 or older), posed the greatest danger, in his opinion.

Goddard hoped to draw attention to their presence in the public school systems that were struggling to make “normal” people out of them by keeping them in regular classes. Government agencies were making a grave error in treating them as “normal,” in Goddard’s mind. Even the highest grade of the feebleminded could never become normal, he argued, though they could pass for normal, making them the most likely culprits for spreading the defect to future generations. Rather than trying to disguise or ignore their disabilities, physicians and superintendents needed to underscore them.

Goddard needed a word that would carry scientific legitimacy and arouse public concern, for as Goddard stressed, physicians needed public assistance in hunting out individuals with high-grade deficiencies. Yet there was no word in the English language which adequately expressed the distinctiveness and urgency of their condition. Goddard, therefore, constructed his own term from the Greek word for foolish, moronia, and the result was the diagnostic label of the “moron” for those who exhibited a mental age of 8 to 12 years. (…)

The “moron” represented those who could not develop beyond the primitive savagery of adolescence. He (or she), because of faulty genes resulting in low intelligence, remained trapped in this primitive phase of development. (…) (Encyc)

The word “moron” is derived from the Ancient Greek μωρός (mōrós), which translates to “foolish,” “dull,” or “stupid”. While the Greek word existed for millennia, it was not used as a specific English noun until the early 1900s.

Under Goddard’s system, a “moron” was an adult with a mental age between 8 and 12 years (or an IQ roughly between 51 and 70).

It was the highest functioning of three “scientific” categories of intellectual disability:

  • Moron: Highest (Mental age 8–12)
  • Imbecile: Middle (Mental age 3–7)
  • Idiot: Lowest (Mental age < 3).

Goddard’s creation of the term was heavily tied to the eugenics movement. He believed “morons” were a threat to society because they could “pass” as normal but were supposedly prone to criminal behaviour and “bad breeding”. His work was used to justify compulsory sterilization and restrictive immigration policies, particularly at Ellis Island.

By the 1920s, the word leaked into the general public and began to be used as a common insult. Because it became so widely used as a slur, the medical community eventually abandoned it in favour of terms like “mild mental retardation” (which has also since been deprecated).

Also totally unrelated:

FOX News: Trump promises Schumer funding for NY tunnel project — if Penn Station, Dulles Airport are renamed after him

President Donald Trump told Senate Minority Leader Chuck Schumer last month that he was finally prepared to drop his freeze on billions of dollars in funding for a major New York infrastructure project.

But there was a condition: In exchange for the money, Schumer had to agree to rename New York’s Penn Station and Washington’s Dulles International Airport after Trump. (…)

In the weeks since, Trump has continued to withhold the more than $16 billion earmarked for the long-planned Gateway project connecting New York and New Jersey through a new rail tunnel beneath the Hudson River.

The two states are now suing the Trump administration over the freeze, alleging in a complaint filed earlier this week that the funding suspension is unlawful. (…)

Since returning to the White House, the president has introduced a slew of initiatives bearing the Trump name, including the Trump Gold Card offering a high-priced path to citizenship, the TrumpRx website offering lower-priced prescription drugs, and a new Trump-class battleship meant to solidify his era of “peace through strength” foreign policy for years to come.

Trump in recent months has set his sights on even bigger targets: Adding his name first to the US Institute of Peace and then, even more controversially, to Washington’s iconic Kennedy Center.

Still, Trump’s offer to Schumer would have represented perhaps his most audacious move yet, an apparent attempt to leverage the future of a massive infrastructure project to fulfill his own personal wishes.

The commission in charge of the Gateway tunnel has warned that it will soon have to shut down work on the project and lay off roughly 1,000 workers if the Trump administration does not release the funding it needs.

The tunnel’s construction predates Trump’s return to office, with the federal government on the hook for a significant portion of the funding needed to complete it. But Trump moved to halt the project late last year, a decision that Democratic officials in New Jersey and New York have argued was politically motivated. (…)

Feb. 6: Judge orders Trump administration to restore funding for rail tunnel between New York and New Jersey

(…) U.S. District Judge Jeannette A. Vargas in Manhattan approved a request by New York and New Jersey for a temporary restraining order barring the administration from withholding the funds while the states seek a preliminary injunction that would keep the money flowing while their lawsuit plays out in court.

“The Court is also persuaded that Plaintiffs would suffer irreparable harm in the absence of an injunction,” the judge wrote. “Plaintiffs have adequately shown that the public interest would be harmed by a delay in a critical infrastructure project.” (…)

Speaking to the media on Air Force One, Trump was asked about reports that he would unfreeze funding for the tunnel project if Schumer would agree to a plan to rename Penn Station in New York and Dulles International Airport in Virginia after Trump.

“Chuck Schumer suggested that to me, about changing the name of Penn Station to Trump Station. Dulles Airport is really separate,” Trump responded.

Schumer responded on social media: “Absolute lie. He knows it. Everyone knows it. Only one man can restart the project and he can restart it with the snap of his fingers.”

Feb. 9: Judge temporarily halts order requiring Trump to unfreeze tunnel funding

(…) Vargas said she would put her order on hold until Thursday at 5 p.m. to give the Second Circuit U.S. Court of Appeals time to consider the government’s emergency request. But she denied the government’s request for an extended halt, saying New York and New Jersey have demonstrated the shutdown of operations prompted by the funding freeze “will have an immediate and severe impact on the region’s economic interests.” (…)

Tunnel vision: “the tendency to focus exclusively on a single or limited goal or point of view.”

Tunnel vision as slang refers to a state of extreme, singular focus on one goal or task while ignoring all other, often important, aspects of a situation. It implies a narrow perspective or “one-track mind,” often causing a person to disregard context, advice, or surrounding details.

Key Aspects of Tunnel Vision:

  • Intense Goal Focus: Being hyper-motivated or driven to achieve a specific outcome, such as working solely on a project or chasing a goal to the exclusion of other responsibilities.
  • Neglect of Surroundings: Ignoring potential risks, alternative perspectives, or other important details outside the immediate focus.
  • Negative/Neutral Context: Often used to describe narrow-mindedness, refusal to consider other viewpoints, or an unhealthy obsession that ignores the bigger picture.

YOUR DAILY EDGE: 9 February 2026: Railroaded?

Big Tech’s AI Push Is Costing a Lot More Than the Moon Landing As a percentage of GDP, the projected 2026 spending of four tech giants rivals the most momentous capital efforts in U.S. history

image

image

image

The 1850-59 period was only the beginning of the rail era in the US: 21,000 additional miles of railroad were laid in that decade, after the first 9,000 miles built between 1823 and 1850. During the Western expansion after 1860 (the “golden age”), the amount of railroad track in the U.S. reached 100,000 miles in 1880 and more than 200,000 miles in 1902. It peaked in 1916 at 254,000 miles.

Last October, Michael Magoon, a PhD from Brown University, wrote an article on the largest investment booms:

Investment booms are episodes when vast financial and material resources are concentrated into one economic sector for a sustained period. Some examples include:

  • railroads in the 19th century,

  • electrification and automobiles around the turn of the 20th,

  • IT and internet in the late 20th, and

  • AI today.

These booms are not merely curiosities; they are catalytic turning points that reshape economies, societies, institutions, and expectations. (…)

When a sector—railroads, electricity, cars, computing, AI—comes to dominate that private investment, it signals not just innovation but systemic transformation. (…)

Imagine the 19th-century world: could a technology be both so glamorous and so costly that whole fortunes and banking systems revolve around it? Railways did exactly that. (…)

Railway capital was immense. In the United States, in 1860, the total of railroad stocks and bonds was $1.8 billion—by 1897, it had reached $10.6 billion—compared to a national debt of just $1.2 billion. This underscores how railroads formed the backbone of the private-sector financial system. (…)

Although precise conversion of these figures into share of total private investment is tricky—GDP accounting was rudimentary and public/private investment distinctions blur—scholars agree that in many cases, railway investment consumed a very large share of private capital formation, perhaps 10–20 percent of GDP in some peak years. The speculative frenzy, sheer capital involved, and societal centrality of rail development make the railroad boom arguably the greatest private-investment-share boom in history.

Railways didn’t just dominate private investment; they transformed economies. They slashed freight and passenger costs—by 1860, long-distance bulk rates in the U.S. had plunged nearly 95 percent, fostering national markets and reshaping agriculture, manufacturing, urbanization, and finance.

Thus, the railway boom looms large: a sustained, capital-intensive, globally-visible moment when one sector dominated private deployment of capital and redefined economic power.

The US rail system was built with private capital, mainly debt. While the Eastern railroads were built to connect existing, bustling cities, the Western expansion was primarily fueled by financial engineering, government subsidies, and the creation of the (speculative) bond market. Many economic historians have argued that much of the transcontinental expansion was “built ahead of demand,” with the railroads expected to create the demand by bringing settlers and industry with them.

Because there was no immediate revenue, these projects were incredibly risky. To attract investors, the government provided Federal Land Grants (giving railroads millions of acres to sell to settlers) and Government Bonds.

The Western railroad expansion was essentially a massive financial scheme in which fortunes were first built selling bonds. Promoters sold bonds based on the future value of the land the tracks would pass through, pocketing large commissions for raising money that was then spent on rail tracks connecting ghost towns.

By the late 1880s, there were often multiple parallel lines serving the same empty territory, leading to “rate wars” and eventual bankruptcies because there wasn’t enough actual cargo to support the liabilities.

Because the expansion was driven by the availability of capital (bonds) rather than actual commerce (demand), the industry was prone to booms and busts. Hence the Panics of 1873 and 1893, both triggered by the collapse of over-leveraged railroad companies.

In Q4 1873, 25 railroads with $150 million in outstanding bonds defaulted. “In 1874, 71 followed with a bonded debt of $262M, and another 25 having $140M in outstanding bonds defaulted in 1875.” (Richard White, Railroaded: The Transcontinentals and the Making of Modern America)

Spending on railroads was actually much more than the 2.0% assigned to the 1850-59 period. Magoon’s research pegs it as high as 20% in the peak mania years: “Since most investment then was private, railways may have consumed a similarly large fraction of private investment.” This chart, using data from an ARK Investment analysis, suggests around 3% of GDP during the second half of the 19th century.

Image

Michael Cembalest, Chairman of Market and Investment Strategy for J.P. Morgan Asset & Wealth Management, last month published Smothering Heights (a must read), asking “is the largest moat in market history indestructible?” Note that his numbers were compiled after Q3 results and thus do not reflect the capex boosts announced in last week’s Q4 releases.

[One] way to visualize the magnitude of tech capital spending: compare it to some of the largest capital outlays of the 20th century. Tech capital spending in 2025 was roughly equal to the Manhattan Project, farm electrification, the Moon Landing, the Interstate Highway system and several FDR-era public works projects combined, measured as a share of GDP. (…)

image

Our universe of 28 direct AI stocks represents 50% of S&P 500 market cap and just 5% of S&P 500 net debt. Despite the recent rise in capex, the amount financed by debt was still very low as of Q3 2025 in contrast to the late 1990’s. [Oracle and Meta are the 2 debt outliers]. (…)

Despite the enormous amount of capital deployed so far, tracking hyperscaler AI profitability is difficult. (…) Given gradually declining hyperscaler free cash flow margins and falling cash balances, a clearer path to profitability on AI investments may be needed in 2026 for current valuations to be sustained. That path is made murkier by questions on GPU and networking equipment depreciation (…).

Since 2020, hyperscalers have extended the depreciation lives of GPU and networking assets. Rationale: older chips remain in use for several years; even NVIDIA A100 chips still run at high utilization and generate positive margins beyond 2-3 years.

GPUs don’t become useless when new versions arrive; processors can be repurposed for inference or resold to emerging countries. Also: it’s expensive to replace GPUs since power/HVAC equipment may need to change, so some users will not swap out for every upgrade. That said, most observations for GPU hourly rental rates paid by neocloud and hyperscaler customers have declined by 20% – 25% over the last year (…).

The table estimates the impacts on depreciation, margins and EPS using 3-year depreciation for GPUs and networking equipment added since GPT’s launch in 2022. Given incomplete disclosures, we had to make some assumptions which are listed below the table.

The results: EPS and operating margin revisions would range from -6% to -8%, other than for Oracle whose declines would be larger. The next stage might entail more disclosures from hyperscalers clarifying the basis for GPU and networking depreciation assumptions which could partly defuse this issue. Trump’s decision to allow NVIDIA H200 chips to be sold to China could support long term values and support existing depreciation assumptions. (…)
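
To make the mechanics concrete, here is a minimal straight-line depreciation sketch; the cumulative GPU/networking capex, the current 5-6 year life, and the operating-income base are placeholder assumptions chosen only to land in the ballpark of the -6% to -8% revisions cited above, not the inputs behind Cembalest's table:

```python
# Hypothetical illustration of the depreciation sensitivity discussed above.
# All dollar figures are placeholder assumptions, not Cembalest's actual inputs.

def annual_depreciation(capex: float, useful_life_years: float) -> float:
    """Straight-line depreciation expense per year."""
    return capex / useful_life_years

gpu_capex = 90e9           # assumed cumulative GPU/networking capex since 2022 (USD)
operating_income = 200e9   # assumed operating-income base (USD)

current = annual_depreciation(gpu_capex, 5.5)  # ~5-6 year life used today
shorter = annual_depreciation(gpu_capex, 3.0)  # 3-year life, as in the scenario above

extra_expense = shorter - current
hit_to_operating_income = extra_expense / operating_income

print(f"Extra annual depreciation: ${extra_expense/1e9:.1f}B")
print(f"Implied operating income hit: {hit_to_operating_income:.1%}")  # ~7% on these assumptions
```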

image

Michael then asks the crucial question: “how do future computing needs for consumer inference stack up against current AI workload capacity?”. His complex calculations lead him to conclude:

Using a broader range of GPU utilization rates from 33% to 75%, 2026 AI compute would only meet 14% – 66% of future demand just for consumer inference, before adding any demand from enterprises, sovereigns/defense or low-latency inference (robotics, autonomous cars, warehouse automation, delivery robots, farming and commercial drones).

This is not meant as a projection; I just wanted to illustrate what some optimistic arguments I see and hear are based on. AI professionals at JP Morgan helped me outline below the factors that may affect future demand for AI workload compute; the hyperscalers are obviously betting the ranch on the top part.

Increased demand for cloud computing resources

  • Reasoning models use 20x – 40x more tokens than non reasoning ones. More and more agents will use reasoning models, and most coding tasks will use reasoning models
  • We are still in very early stages and many of the domain specific models (e.g., legal, finance..) are still in their infancy
  • Most large companies have still not scaled up AI use cases (especially the high-volume ones)
  • Creative work is very token intensive, and most studios, ad agencies and citizen content generators are still in early stages of using this technology
  • Agents will be used extensively for constant monitoring of software services in production for dev-ops, security ops, anomaly detection …. and these tend to be very input token intensive
  • AI for healthcare/drug discovery is in very early stages and could be a big wildcard in terms of compute required
  • More and more data will become AI ready and AI will generate more and more data (flywheel)
  • Fraud detection, recommender systems and personalization are already moving to transformer-based architectures that require far more compute than traditional models

Decreased demand for cloud computing resources

  • Ensemble of models, mixture of experts, GPU advances, inference optimization and specialized accelerators will continue to bring the cost down by reducing the compute required for a given workload [though costs must come down for usage to expand]
  • Big companies will likely move from generalized models to post trained smaller purpose-built models [which could use less powerful chips]
  • AI compute will move to edge devices, PCs and mobile devices as cost of adding accelerators is cheap

By some accounts OpenAI is 6-12 months ahead of competitors based on user growth, monetization and model/product capabilities. The table shows how the forecasts for 2027 that OpenAI made in mid-2025 were already eclipsed by the forecasts it made just three months later.
If OpenAI stays on its current trajectory, the company projects it will still need more compute for training than for inference through to 2030.

image image

The bet is thus that, by the time training peaks (2028?), usage (inference) will be ramping up quickly enough to fill the capacity vacated by the training phase.

If so, whatever chip hoarding is currently being done for competitive reasons will be quickly absorbed by rising inference. The real risk isn’t that the GPUs will sit empty; it’s that the cost to run them will exceed the revenue they generate:

  • When railroads overbuilt, they didn’t leave the tracks empty; they fought for the same few tons of wheat by dropping prices to near zero.
  • We may see “Compute Wars.” If there is a massive surplus of GPUs, the price of tokens (API calls) might crash to the point where some companies can’t service the massive loans they took out to build the datacenters.

There is so much speculative money floating around AI that the risk of over-buying GPUs is real.

The “$600 Billion Question” is a framework popularized by David Cahn at Sequoia Capital. It highlights the massive “expectation gap” between the money being spent on AI infrastructure and the actual revenue being generated by AI products.

In the software world, you usually want a healthy ratio between your infrastructure costs and your revenue. Historically, for every $1 spent on hardware (like a GPU), the ecosystem needs to generate roughly $2 in revenue to break even after accounting for energy, maintenance, and the “brains” (staff) to run it.

Here is how the gap is calculated for 2024/2025:

  • Nvidia Revenue: Nvidia is on track to sell roughly $150 billion worth of chips for AI data centers annually.
  • The Multiplier: To account for the cost of the actual data center buildings, the massive amount of electricity, and the profit margins for the cloud providers (AWS, Azure, Google), you multiply that chip spend by 2x. We are now at $300 billion in total infrastructure costs.
  • The Software Requirement: For the end-users (the companies buying the cloud space) to be profitable, they generally need to make a 50% margin. This means the world needs to produce $600 billion in AI-driven revenue to justify the current spend.

There is no precise data on this yet but the industry is currently generating anywhere between $50 and $100 billion in direct AI revenue.
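
A minimal sketch of that arithmetic, using the round numbers cited above (actual AI revenue estimates vary widely):

```python
# Sketch of the "$600 Billion Question" arithmetic as described above.
# Inputs mirror the figures in the text; they are rough, not precise data.

nvidia_ai_chip_revenue = 150e9   # annual AI data-center chip sales (from the text)
infra_multiplier = 2.0           # buildings, power, cloud margins on top of chips
end_user_margin = 0.5            # margin end-users need for the spend to pay off

total_infra_cost = nvidia_ai_chip_revenue * infra_multiplier   # $300B
required_ai_revenue = total_infra_cost / end_user_margin       # $600B

current_ai_revenue = (50e9, 100e9)   # rough range cited in the text
gap = tuple(required_ai_revenue - r for r in current_ai_revenue)

print(f"Required AI revenue: ${required_ai_revenue/1e9:.0f}B")
print(f"Expectation gap: ${gap[1]/1e9:.0f}B to ${gap[0]/1e9:.0f}B")
```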

I then got into a lengthy “exchange” with Gemini 3.0 exploring various topics of the current AI narratives. (Gemini 3.0 emphasis)

Q. Cost increases must account for some of the recent capex boost. Is there a measure of real growth, like gigawatts or GPU demand?

You are hitting on the most critical metric in the industry right now: The decoupling of Dollars from Compute.

If you look only at the dollar amounts (CAPEX), you are seeing a mix of inflation, scarcity pricing, and labor costs. To see the “real” growth, analysts look at Gigawatts (GW) of contracted power and FLOPs (Floating Point Operations) per dollar.

Here is the data on the “Real” physical expansion.

1. The Real Metric: Data Center Power Capacity (GW)

In the railroad era, the real metric was “track-miles.” Today, it is Gigawatts.
While CAPEX is rising by ~30–40% YoY, the actual power capacity being built is growing even faster due to economies of scale in “mega-campus” construction.

  • 2023 Baseline: The global data center inventory was roughly 25 GW.
  • 2026 Projection: Capacity is expected to nearly double to 48–50 GW.
  • The Concentration: Approximately 80% of all new power capacity currently under construction in the U.S. is being claimed by just four companies (Amazon, Microsoft, Google, Meta).

2. Physical Inventory: The “H100 Equivalent” Unit

Because chip prices fluctuate, analysts use the “H100-Equivalent” to measure real hardware growth.

  • 2024: Total high-end GPU shipments were roughly 3.5 million units.
  • 2026: Projections suggest shipments will exceed 8 million units annually.
  • Efficiency Gains: The move from the H100 to the Blackwell (B200) and then the Rubin architecture provides a 3x to 5x increase in compute power for the same amount of electricity.

The “Real” Growth Fact: Even if CAPEX stayed flat, the “Real Compute” would still be growing because of chip architecture improvements. The fact that CAPEX is also rising means the total compute capacity is growing exponentially, not linearly.
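
A toy decomposition of that point, using the rough shipment and efficiency figures quoted above and assuming, for simplicity, that all 2026 shipments carry the full architectural uplift:

```python
# Toy decomposition of "real" compute growth into unit shipments and per-chip
# efficiency, per the H100-equivalent framing above. Figures are the rough
# numbers quoted in the text, not measured data.

shipments_2024 = 3.5e6      # high-end GPU units shipped (from the text)
shipments_2026 = 8.0e6      # projected units (from the text)
efficiency_gain = 4.0       # assumed midpoint of the 3x-5x Blackwell/Rubin uplift

compute_2024 = shipments_2024 * 1.0                 # in "H100-equivalents"
compute_2026 = shipments_2026 * efficiency_gain     # simplification: all units uplifted

print(f"Unit growth:    {shipments_2026 / shipments_2024:.1f}x")
print(f"Compute growth: {compute_2026 / compute_2024:.1f}x")   # ~9x on these assumptions
```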

3. “Real” vs. “Nominal” Growth: The Cost of a Data Center

To your point about cost increases, the “price per Megawatt” to build a data center has skyrocketed.

image

Conclusion: Roughly 40% of the CAPEX increase you see in earnings reports is likely “dead weight” cost increases (inflation, labor, land). The remaining 60% represents the actual “Real” growth in the number of chips and buildings.
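
One hedged way to read that split is to deflate nominal capex growth by an assumed rise in build cost per megawatt. The 12% cost-inflation figure below is simply chosen to be consistent with the rough 40/60 split above, not an observed number:

```python
# Deflating nominal capex growth by rising "price per megawatt" to get a rough
# read on real capacity growth. All inputs are placeholder assumptions.

capex_growth = 0.35            # ~30-40% YoY nominal capex growth (from the text)
cost_per_mw_inflation = 0.12   # assumed YoY rise in build cost per megawatt

real_capacity_growth = (1 + capex_growth) / (1 + cost_per_mw_inflation) - 1
share_dead_weight = 1 - real_capacity_growth / capex_growth

print(f"Real capacity growth: {real_capacity_growth:.1%}")             # ~21%
print(f"Share of capex growth absorbed by costs: {share_dead_weight:.0%}")  # ~40%
```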

4. The “Railroad” Risk: The Yield on the Gigawatt

This brings us back to your original insight. In the 1880s, railroads measured “Revenue per Ton-Mile.” If they built more track but the revenue per mile fell, they were in trouble.

Today’s equivalent is “Revenue per Gigawatt.”

  • If Big Tech builds 50 GW of capacity but the “AI Revenue” doesn’t grow at the same rate, their Return on Invested Capital (ROIC) will crater.
  • Currently, the “Yield” on a Gigawatt is falling because the cost to build the capacity is rising faster than the immediate revenue from AI software.

Would you like to see a projection of how “Compute Pricing” (the cost per 1M tokens) is trending? It shows exactly how the “Rate Wars” of the 1890s are playing out in the AI era.

The AI industry is currently in a state of “Token Deflation” that mirrors the railroad “Rate Wars” of the late 19th century. Just as over-competition in the 1890s forced railroads to cut shipping prices to near-zero, AI providers are slashing token prices to capture market share [and presumably to grow total demand/usage].

1. The Crash in Compute Pricing (Cost per 1M Tokens)

The cost to process 1 million tokens has collapsed by over 99% in less than three years. In late 2022, GPT-3.5-level performance cost $20.00 per million tokens; by early 2026, efficient models can process that same volume for as little as $0.07 to $0.15.

2. The “Rate War” Parallel

In the 1890s, railroads had massive fixed costs (bonds) but low marginal costs (one extra coal car). This led to desperate price-cutting just to keep some cash flowing. We see the same in AI:

  • The Marginal Cost Trap: Once a GPU cluster is built and powered, the marginal cost of running one more query is negligible. Providers like Google and Meta are using this to offer “Flash” models at near-zero profit margins to starve competitors of users.
  • The “Land and Expand” Strategy: Anthropic and OpenAI are shifting to “all-you-can-eat” enterprise seats or “agentic” credits to move away from volatile per-token pricing.

3. Jevons Paradox: Why Costs Fall but Spend Rises

Paradoxically, as tokens get cheaper, total corporate spending on AI is increasing.

  • Elasticity of Demand: When tokens were $20, companies used them only for high-value summaries. At $0.07, companies use them for “agentic workflows”—autonomous bots that might use 100,000 tokens just to answer one email.
  • Inference Expansion: While per-token costs fell 280-fold, total inference spending grew by 320% (the sketch after this list backs out the implied volume growth). This confirms that the “railroad tracks” are indeed being used, but they require massive volume to be profitable.
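
Backing out the implied token-volume growth from those two figures (over whatever window the quoted statistics cover):

```python
# Implied token-volume growth from the bullet above: a 280-fold fall in
# per-token cost alongside a 320% rise in total inference spend.

price_decline_factor = 280      # per-token cost fell 280x
spend_growth = 3.20             # +320% => spend multiplied by 4.2x

spend_multiplier = 1 + spend_growth
volume_multiplier = spend_multiplier * price_decline_factor   # volume = spend / price

print(f"Spend multiplier: {spend_multiplier:.1f}x")
print(f"Implied token-volume growth: ~{volume_multiplier:,.0f}x")   # ~1,176x
```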

4. The “H100” Rental Crash

The most striking evidence of oversupply is in the GPU rental market.

  • In late 2024, renting an H100 GPU cost $8.00–$10.00 per hour.
  • By early 2026, those same GPUs are renting for as low as $2.99 per hour on platforms like Lambda Labs and RunPod—a 70%+ collapse in 15 months.

The Takeaway: We are seeing “Real Growth” in the volume of intelligence being produced, but the monetary yield per unit of that intelligence is crashing. This is the classic precursor to a “shakeout” where only the companies with the lowest power costs and the biggest cash reserves (the “Vanderbilts” of AI) survive.

Q. Isn’t your H100 leasing cost example misleading given that demand has quickly shifted to the H200, Blackwell and now Rubin?

That is an excellent point and gets to the heart of why this “boom” is even more volatile than the railroad era. You are highlighting the “Depreciation Trap.”

In the 19th century, a steel rail was a steel rail—it didn’t become 10x slower because a new rail came out three years later. In AI, the physical asset degrades in value almost as soon as the next architecture is taped out.

1. The “Shadow” Oversupply of H100s

The H100 leasing crash (from ~$8/hr to ~$2/hr) isn’t just because people stopped wanting compute; it’s because the Utility-per-Dollar of the H100 collapsed relative to the H200 and Blackwell.

  • The Yield Gap: If a Blackwell (B200) chip is 4x faster at inference than an H100 but costs only 2x as much to lease, any rational company will pay the premium for the B200 (see the sketch after this list).
  • Stranded Assets: This leaves the H100 as a “stranded asset.” Small cloud providers (the “Tier 2” clouds) who took out massive loans to buy H100s in 2023/24 are now stuck with hardware that is still physically perfect but economically obsolete.
  • The “Dark GPU” Reality: While there are “zero dark GPUs” at the frontier (the latest chips), there is a growing surplus of “last-gen” chips that are being dumped onto the market at fire-sale prices.
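
A quick worked example of that yield gap; the H100 lease rate is illustrative (roughly the floor cited elsewhere in the text), and the B200 price and speedup simply reuse the bullet’s hypothetical 2x/4x figures:

```python
# Effective cost per unit of inference throughput, per the "Yield Gap" bullet.
# All rates are illustrative assumptions, not quoted market prices.

h100_rate = 2.50            # $/hour, roughly the rental floor cited in the text
b200_rate = h100_rate * 2   # assumed: twice the lease price
b200_speedup = 4.0          # assumed: 4x the inference throughput

h100_cost_per_unit = h100_rate / 1.0
b200_cost_per_unit = b200_rate / b200_speedup

print(f"H100 cost per unit of throughput: ${h100_cost_per_unit:.2f}")
print(f"B200 cost per unit of throughput: ${b200_cost_per_unit:.2f}")  # half the H100's
```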

2. The “Rubin” Acceleration (2026)

With NVIDIA moving to a one-year release cycle (Blackwell in 2025, Rubin in 2026), the “economic life” of a GPU has shrunk from 5 years to 18–24 months.

3. The “Sovereign” Advantage

This is precisely why the “Big Five” (Microsoft, Amazon, etc.) are the only ones who can survive this.

  • They don’t just “lease” chips; they build custom silicon (TPUs, Trainium, Maieuta) to augment the NVIDIA cycles.
  • They have the cash flow to throw away $10 billion worth of H100s and replace them with Blackwells without blinking.
  • The “Leasing Cost” example is a warning for the middle class of the AI boom—the startups and Tier 2 clouds—who are being crushed by the rapid “versioning” of the infrastructure.

The Real Measurement: “Effective FLOPs per Dollar”

To avoid being misled by chip-specific pricing, we look at the Cost per Petaflop.

  • In 2023, 1 Petaflop of compute cost roughly X.
  • By early 2026, 1 Petaflop of compute (via Rubin) costs roughly 0.1X.

The Conclusion: The “demand” is shifting so fast that the infrastructure is essentially “rotting” while it’s still being installed. We aren’t just building a railroad; we are building a railroad where the tracks have to be ripped up and replaced every 24 months.

Q. But these “obsolete chips” must have value for other applications?

In the context of 2026, the market for “legacy” AI silicon (H100s and early A100s) has bifurcated. While these chips are obsolete for frontier model training, they are finding a massive “second life” in lower-tier applications. However, this transition is fraught with a “Value vs. Utility” paradox.

The summary of the most recent findings (Q1 2026) suggests that while these chips have physical utility, their economic value is being crushed by the high cost of the electricity required to run them.

1. The Rise of “Sovereign AI” and National Clouds

The most recent data from late 2025 and early 2026 shows a massive surge in H100 exports to secondary markets.

  • The Findings: Developing nations (Middle East, SE Asia, and Eastern Europe) are buying “second-hand” H100 clusters to build National AI Clouds.
  • The Logic: These nations prioritize data sovereignty over cutting-edge speed. For a localized LLM (e.g., a Thai or Arabic-specific model), an H100 cluster is perfectly sufficient and significantly cheaper than the wait-listed Blackwell/Rubin systems.

2. Edge Computing and Industrial IoT

New 2026 deployments are seeing “repurposed” chips moving out of massive datacenters and into localized “Edge” nodes.

  • The Findings: H100s are being redeployed into Smart City and Autonomous Manufacturing hubs.
  • The Logic: These applications require high-speed “Inference at the Edge” (processing video feeds from thousands of cameras or sensors) but don’t need the massive memory bandwidth of the Blackwell architecture. The chips are being moved from $5 billion “Super-sites” to $50 million “Local-sites.”

3. The “Crypto-Mining” Echo

There is a resurgent trend of using legacy GPUs for Decentralized Compute Networks (e.g., Bittensor, Render).

  • The Findings: Smaller researchers and “Open Source” developers are utilizing “scrap” H100 time on decentralized platforms.
  • The Logic: Since the hardware has been fully depreciated by the original owners (like Meta or Microsoft), the rental cost on these platforms is near the floor (electricity cost + small margin). This is the “Budget Rail” of the AI world.

The “Opex” Wall: Why Value is Still Crashing

Despite these applications, the most recent economic data (Feb 2026) highlights a major headwind: The Electricity Bottleneck.

In regions where power costs more than $0.08 per kWh, it is actually cheaper to buy a new Rubin chip than to run a “free” H100.

image

In February 2026, the secondary market for NVIDIA H100 GPUs is experiencing a “dual-tier” price discovery phase. While newer architectures like Blackwell (B200) and Rubin have stripped the H100 of its “frontier training” crown, the chip remains the global “mid-tier workhorse” for fine-tuning and inference.

The price floor is currently being set by operating expenses (OpEx)—specifically electricity and cooling—rather than just the initial purchase price.

Rental rates on specialist clouds (like Lambda Labs or Jarvislabs) have hit what many analysts consider a hard floor of $2.35 to $2.99 per hour.

  • The Operational Ceiling: At $2.50/hour, the cost of the chip is only part of the equation. Power consumption (~700W per GPU) and cooling add approximately $0.15 to $0.40 per hour to the provider’s cost (a back-of-envelope check follows this list).
  • Market Conviction: Prediction markets show a 70%+ probability that H100 rentals will not drop below $2.30/hour in February 2026, as lowering prices further would make it unprofitable to keep the servers powered on.
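
A back-of-envelope check on the electricity component of that floor; the PUE (cooling and facility overhead) multiplier is an assumption, and the $0.15-$0.40/hr range above presumably also includes other operating costs:

```python
# Electricity cost per GPU-hour for a ~700W H100 at the power prices cited
# in the text. The PUE multiplier is an assumed facility overhead.

gpu_draw_kw = 0.70   # ~700W per GPU (from the text)
pue = 1.4            # assumed overhead for cooling and power delivery

for price_per_kwh in (0.08, 0.15):
    cost_per_hour = gpu_draw_kw * pue * price_per_kwh
    print(f"At ${price_per_kwh:.2f}/kWh: ~${cost_per_hour:.2f} per GPU-hour")
```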

Summary: Has the Floor Been Hit?

For rentals, yes. Prices have stabilized around the cost of “keeping the lights on” ($2.50–$3.00 range).
For outright purchases, no. A second wave of depreciation is expected in late 2026 as Rubin production ramps up, likely pushing used PCIe H100s toward the $10,000 mark—the historical “legacy” price point previously held by the A100.

Q. Is there hard data to support the hope that compute demand is at least keeping pace with capacity builds?

To answer this, we have to look at the utilization metrics and backlog data from the “Big Three” cloud providers (Azure, AWS, GCP) alongside the token consumption of the largest consumer platforms.

The “hard data” suggests that while we are building a massive amount of “track” (capacity), the “trains” (demand) are currently barely keeping pace, and the nature of that demand is shifting from human-driven queries to machine-driven “agentic” loops.

1. The Utilization Data: The “90% Rule”

The most robust data point supporting the “keeping pace” argument is the reported utilization rates of H100/B200 clusters at major cloud providers.

  • The Hard Number: In their most recent 2026 earnings calls, Microsoft and Google reported that their AI-ready GPU capacity remains “supply-constrained.”
  • The Interpretation: If utilization were dropping, we would see “spot pricing” for GPUs collapse across the board. Instead, while legacy (H100) prices are soft, Frontier Compute (B200/Rubin) still has a 4-to-6 month waitlist for large-scale clusters.
  • The “Inference” Surge: For the first time, in early 2026, inference (running models) has overtaken training as the primary driver of GPU consumption, accounting for roughly 62% of total compute demand.

2. The Proxy Metric: Token Volume Growth

If demand were slowing, the “number of words” generated by AI would plateau. It hasn’t.

  • The Data: Leading API providers (OpenAI, Anthropic, Meta) reported a 10x year-over-year increase in total tokens processed in 2025.
  • The “Agentic” Multiplier: The shift from “Chat” (where a human types a prompt) to “Agents” (where a bot researches, writes code, and tests it) has increased the compute demand per user task by an estimated 50x to 100x. One human request now triggers thousands of internal “thought” tokens.
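
A toy calculation makes the multiplier concrete. Every token count below is a made-up round number for illustration, not a measurement from any provider.

```python
# Toy illustration of the "agentic multiplier". All token counts are
# invented round numbers, not measurements from any provider.

# A single chat interaction: one human prompt, one model reply.
chat_tokens = 200 + 600

# An agentic task: the agent plans, calls tools, re-reads context and
# "thinks" at every step before producing a final report.
planning_passes = 5
tool_calls = 30
tokens_per_step = 2_000       # context re-read + internal "thought" tokens per step
final_report_tokens = 3_000

agent_tokens = (planning_passes + tool_calls) * tokens_per_step + final_report_tokens

print(f"Chat request: {chat_tokens:,} tokens")
print(f"Agent task:   {agent_tokens:,} tokens")
print(f"Multiplier:   ~{agent_tokens / chat_tokens:.0f}x")   # ~91x with these numbers
```

With these invented inputs the multiplier lands around 90x, inside the 50x–100x range cited above; the point is the structure of the arithmetic, not the specific values.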

3. The “Backlog” as a Leading Indicator

We can look at the unearned revenue and Remaining Performance Obligations (RPO) of the infrastructure providers:

  • Azure AI: Reported an RPO increase of 25% in the last quarter, specifically tied to multi-year AI capacity commitments.
  • The Signal: Companies are signing 5-year contracts for compute. This is “hard data” that enterprises believe their demand will not only persist but grow, justifying long-term fixed costs.

The Skeptical Counter-Data: Falling “Revenue per FLOP”

While physical demand (using the chips) is keeping pace, the financial demand (paying for the chips) shows a concerning trend:

image

The Findings Summary:
The “hard data” shows that the chips are not sitting idle. However, it also shows that the yield per chip is falling. We are in a higher-volume, lower-margin environment.

Q. Do the “Return on Invested Capital” (ROIC) comparisons for the top AI spenders show that their massive CAPEX is actually translating into higher shareholder value yet?

In early 2026, the AI boom has entered a “ROIC Reckoning” phase. While NVIDIA continues to report extraordinary returns, the hyperscalers (Microsoft, Google, Meta, and Amazon) are seeing their Return on Invested Capital (ROIC) come under pressure as massive capital expenditures (CAPEX) dilute near-term profitability.

The following data compares the fiscal performance of top AI spenders, highlighting the tension between record investments and capital efficiency.

image

Key Insights from 2026 Financial Reports

  • The NVIDIA Outlier: NVIDIA remains the primary beneficiary, with a staggering 70.9% ROIC reported for fiscal year 2025. Its quarterly operating income jumped 65% to $36 billion as it successfully shifted from selling chips to selling integrated AI systems.
  • The Hyperscaler Dilution: For the companies building the infrastructure, the story is different. Alphabet (Google) announced a massive 2026 CAPEX plan of $175B–$185B, nearly double its 2025 spend. This “all-in” bet has caused its ROIC to dip from a 2024 peak of 21.2% toward a projected 16–17% for 2026 (the dilution arithmetic is sketched after this list).
  • Microsoft’s Supply Constraints: Microsoft reported record quarterly cloud revenue of $51.5 billion in early 2026, but its stock suffered a historic 10.5% plunge on January 29. Investors are concerned by the $37.5 billion quarterly CAPEX and CFO Amy Hood’s warning that growth is “supply-constrained” despite the spending.
  • Amazon’s Scale Play: Amazon’s $200 billion planned investment for 2026 is the largest in tech history, causing an 11% stock drop in February 2026 as markets reacted to the near-term cash-flow pressure.
  • Meta’s “Efficiency” Edge: Meta has been a notable outlier, with its stock surging 10% post-earnings because its AI spending is already demonstrably improving ad-targeting efficiency and core engagement, rather than just chasing future “agentic” revenue. [Note that Google (+14% on the largest base) and Amazon (+23%) also recorded strong ad revenue growth last quarter, vs Meta’s +24%.]
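
The dilution mechanics behind those numbers are straightforward: ROIC is roughly after-tax operating profit (NOPAT) divided by invested capital, so a capex ramp swells the denominator well before the new datacenters add anything to the numerator. A stylized sketch, using hypothetical round numbers rather than any company’s reported figures:

```python
# Stylized ROIC dilution. All dollar figures are hypothetical round numbers,
# not the reported financials of any company discussed above.

def roic(nopat: float, invested_capital: float) -> float:
    """Return on invested capital = after-tax operating profit / invested capital."""
    return nopat / invested_capital

# Year 0: $60B of NOPAT on $300B of invested capital -> 20% ROIC
nopat, capital = 60.0, 300.0
print(f"Year 0 ROIC: {roic(nopat, capital):.1%}")

# Year 1: $150B of AI capex adds to invested capital (net of depreciation),
# while NOPAT grows only 10% because the new capacity is not yet monetized.
capital += 150.0
nopat *= 1.10
print(f"Year 1 ROIC: {roic(nopat, capital):.1%}")   # dilution despite profit growth

# To restore a 20% ROIC, the enlarged capital base must eventually earn more.
print(f"NOPAT needed for 20% ROIC: ${0.20 * capital:.0f}B (vs ${nopat:.0f}B now)")
```

The gap between current and required profit is the “ROIC Reckoning”: investors are betting the numerator catches up before their patience runs out.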

Q. From all the above discussion and analysis, are investors about to be railroaded by the AI boom?

1. The New “Vanderbilts”

The “Big Five” (Microsoft, Amazon, etc.) are unlikely to be destroyed. Unlike the 19th-century railroad moguls who relied on fragile bank loans, today’s giants are [mostly] self-funding this expansion.

  • The Outcome: Even if the “AI Bubble” bursts, these companies will own the world’s most advanced power grids, fiber networks, and datacenter shells.
  • The Verdict: They aren’t being railroaded; they are the railroad. They are capturing the “Right of Way” for the next 50 years of the digital economy.

2. The “1893” Risk

This is where the “railroading” happens. If you are invested in “pure-play” AI startups or Tier-2 cloud providers, the risk is extreme.

  • The Capital Grind: As we saw with the H100 rental price collapse, returns on capital are being compressed as the hyperscalers commoditize compute.
  • The Shakeout: In the 1890s, 25% of U.S. railroads went bankrupt because they couldn’t pay their debt. Today, we are likely to see a massive “GPU-Debt Crisis” where companies that borrowed billions to buy chips that are now “legacy” (H100s) cannot generate enough cash to pay back the loans.

3. The “Productivity” Paradox

The most likely scenario is not a total collapse, but a “Profitless Prosperity.”

  • High Utility, Low Margin: Just as the railroads lowered the cost of shipping wheat and benefited everyone except the railroad investors, AI will likely make the global economy significantly more productive.
  • The Transfer of Wealth: The value created by AI will likely flow to end-users (consumers getting free intelligence) and traditional industries (healthcare, law, and manufacturing) rather than the companies building the models.

Q. Are there companies more immune to the railroad risk?

To determine who is “immune,” we have to distinguish between the Railroad Builders (who carry the debt and the risk of empty trains) and the Steel & Locomotive Providers (who get paid upfront regardless of whether the railroad eventually goes bankrupt).

In the 1800s, the investors who got “railroaded” were the bondholders of the tracks. The companies that got rich and stayed rich were the steel mills (Carnegie) and the land owners.

1. TSMC and ASML: The “Bessemer Steel” Tier (Highest Immunity)

These are the most protected companies in the ecosystem because they sit at the very base of the “S-Curve.”

  • The Moat: You cannot build a “competing” TSMC or ASML in three years. They own the physics of the boom.
  • Immunity Factor: Even if the AI software market crashes, the world still needs chips for cars, defense, and general computing.
  • The 2026 Reality: ASML’s High-NA EUV machines and TSMC’s 2nm process are booked through 2027. They have “Take-or-Pay” contracts, meaning the tech giants pay them even if they don’t use the full capacity. They are the ones selling the shovels to the people digging for gold.

2. NVIDIA: The “Locomotive King” (High Immunity, High Volatility)

NVIDIA is often compared to the Baldwin Locomotive Works. They provide the engines.

  • The Moat: Software compatibility (CUDA) is a deeper moat than the hardware itself.
  • The Risk: Unlike ASML, NVIDIA is a “direct” recipient of AI CAPEX. If hyperscalers cut spending, NVIDIA’s revenue also drops unless its market share grows.
  • Immunity Factor: They are shifting from being a “chip company” to a “datacenter-in-a-box” company. By selling the networking (InfiniBand), the software, and the chips, they capture more of the CAPEX “pie” than anyone else. They are “immune” to the competition, but vulnerable to a macro-slowdown in spending.

3. Alphabet (Google): The “Land Grant” Giant (Moderate Immunity)

Google is the Union Pacific of this era. They own the “land” (data) and the “tracks” (YouTube, Search, Android).

  • The Moat: They don’t just rely on selling AI; they use AI to make their existing $300B+ ad business more profitable.
  • The Risk: As we saw in the 2026 ROIC data, their massive $180B CAPEX is diluting their margins. They are spending so much on “tracks” that their stock can be punished even if the business is healthy.
  • Immunity Factor: Because they design their own chips (TPUs), they are more “immune” to NVIDIA’s pricing power than Microsoft or Meta. They are vertically integrated.

image

The companies most likely to be “railroaded” are the Pure-Play Software SaaS companies.

  • Why: They are the “passengers” on the train. If the railroad (OpenAI/Google) raises the ticket price (API costs) or if AI makes their specific software (e.g., basic coding tools, simple CRM) obsolete, they have no physical assets to fall back on.

Summary: The “Infrastructure Advantage”

The closer a company is to the physical world (lithography, silicon wafers, power transformers), the more immune they are. The closer they are to the application layer (chatbots, wrappers, basic SaaS), the more they are at risk of being crushed by the very infrastructure they rely on.

Our own research says that high capex is justified by exploding compute demand generated by the proliferation of agents and by the accelerating usage of AI in growing sectors like finance, biotech, robotics and autonomous vehicles. Much like the internet at its onset, we are likely underappreciating how compute demand (usage) will grow as its utility becomes obvious and compelling and as compute costs keep declining.

Consider that in their latest quarter, the hyperscalers that have reported showed combined data center revenues up 25% (Google up 48%!) and combined ad revenues up 20%. These are real numbers, posted after the AI investments of 2024-25. Seeing such strong demand even before usage really blossoms, they are boosting their 2026 capex by 66%. Executives from Microsoft, Google, and Amazon have all indicated that customer demand for AI services currently exceeds their available supply, necessitating these “aggressive” investments.

Last month, TSMC CEO C.C. Wei justified a record capital expenditure hike to $52–$56 billion by declaring that the “AI megatrend” is a multi-year, fundamental shift rather than a temporary bubble.

Key reasons cited by Wei to justify this aggressive spending include:

  • Verified Customer Demand: Wei noted that he spent several months directly communicating with “customers’ customers”—specifically major cloud service providers (CSPs) like Google, Amazon, and Microsoft—to confirm that their AI-driven demand is real and delivering material financial returns.
  • “AI is Real”: Moving away from previous conservative stances, Wei emphatically stated, “AI is real. Not only real; it’s starting to grow into our daily life.”
  • Capacity Bottlenecks: He identified TSMC’s own wafer supply and advanced packaging (CoWoS) as the primary bottlenecks in the global AI industry, necessitating immediate investment to support customers like Nvidia (moving from Blackwell to Rubin).
  • Expansion of Advanced Nodes: Approximately 70–80% of the 2026 budget is allocated to leading-edge technologies, specifically the ramp-up of 2nm (N2) and 3nm (N3) processes in Taiwan and Arizona to maintain technological leadership.
  • AI Sovereignty: Wei highlighted the growing need for “AI sovereignty” among various governments as a new driver for localized, high-end semiconductor capacity.

While Wei admitted to being “very nervous” about the scale of the investment—noting that an overbuild could be a “disaster”—he concluded that the evidence of AI-driven productivity gains at both client sites and within TSMC’s own fabs provided the necessary conviction to proceed.

Based on announcements made in January and early February 2026, the following companies have explicitly confirmed new or expanded AI integration into their daily operations.

A major trend in these Q1 2026 announcements is the shift from “passive” AI assistants to “Agentic AI”—autonomous agents capable of executing complex workflows with minimal human intervention.

1. Enterprise & “Agentic” AI Adopters

These companies were announced on February 5, 2026, as early adopters of OpenAI Frontier, a new platform specifically for deploying autonomous AI agents in the workforce.

  • State Farm: Automating routine insurance workflows.
  • Uber: Optimizing complex logistical and operational decisions.
  • Intuit: Enhancing financial data processing and personalized customer insights.
  • HP & Thermo Fisher: Integrating agents into R&D and operational management.
  • BBVA, Cisco, & T-Mobile: Piloting agents for “complex and valuable AI work” in customer service and network management.

2. Industrial, Energy & Supply Chain

  • PowerBank Corporation: Announced signed contracts (Feb 5, 2026) to deploy “IntelliScope”, a multi-agent AI system. It will autonomously analyze geospatial data to identify optimal solar sites and manage regulatory compliance.
  • Graybar: The Fortune 500 electrical distributor confirmed it is expanding its use of predictive analytics and AI to automate inventory optimization, forecasting, and acquisition analysis.
  • Siemens: Rearchitecting its portfolio to support industrial AI agents that can autonomously monitor and maintain factory assets.
  • Hitachi: Deploying AI agents to autonomously monitor and maintain over 30,000 industrial assets.

3. Telecom & Infrastructure

  • BCE (Bell Canada): In its Q4 2025 results (Feb 5, 2026), BCE reported increased revenue specifically from “AI-powered solutions” and highlighted operational efficiencies in customer service centers driven by automation and AI call placement.
  • Qualcomm: Deepening its move into Edge AI (processing AI on-device rather than in the cloud) through recent acquisitions, enabling real-time decision-making for IoT devices without internet latency.

4. Automotive & Retail

  • Solera: Announced a “landmark AI investment” (Feb 4, 2026) ahead of the NADA Show. This includes a new AI-powered CRM for car dealerships that autonomously scores leads and engages customers 24/7.
  • Trellis Corp: Identified as a fast-growing company for its AI-driven profit-optimization platform, which automates advertising and pricing for Amazon and Walmart sellers.

5. Professional Services (The Integrators)

While these firms sell AI services, they are also radically overhauling their own operations to serve as case studies:

  • Accenture, Deloitte, & Infosys: Have all signaled 2026 as the year of “AI integration,” moving beyond pilot programs to enterprise-wide deployment for clients in healthcare and finance.

In all, through our lens:

  • AI is truly transformational and will be adopted quickly and widely, driven by productivity and competitiveness imperatives.
  • Unlike the railroad and IT-infrastructure booms, there is little front-loaded “build it and hope demand comes” investment, nor much leverage at this time.
  • Prices/costs are coming down fast, which is necessary to boost demand/usage along the way.
  • Agentic AI will be huge.
  • In the AI era, moats are crucial and bigger is better.
  • Diversified revenue/cashflow streams are desirable.
  • No or low indebtedness is preferable.

From a macro perspective,

  • AI is clearly and significantly boosting GDP growth.
  • AI is clearly inflationary in some sectors (construction, commodities, power), disrupting other sectors by hoarding resources.
  • The massive hyperscaler expenditures (almost the size of the US defense budget) are redirecting these companies’ excess cash from the fixed-income markets into the real economy, with high multiplier effects. Will productivity gains offset demand-pull inflation?
  • Will these financial flows (reduced corporate demand for fixed income) impact US interest rates?

As usual, nothing above is meant to be investment advice. Just my train of thought, so to speak…