era · present · FUTURIST

Andrew Yang

The futurist who brought UBI into the political mainstream.

By Esoteric.Love

Updated 5th May 2026

Futurist · The Present · Thinkers · ~7 min · 2,427 words
EPISTEMOLOGY SCORE
72/100

1 = fake news · 20 = fringe · 50 = debated · 80 = suppressed · 100 = grounded

SUPPRESSED

The political establishment was not having this conversation. Andrew Yang forced it open. He was wrong about almost nothing.

The Claim

Yang's 2018 presidential campaign was the first major American political platform to place automation — not immigration, not trade — at the centre of its account of American economic decline. He built his argument before AI anxiety was mainstream, and the years since have proven him right with uncomfortable precision. The policy he championed, Universal Basic Income, is now one of the most seriously debated responses to the age of AI.

01

What does it mean to be right too early?

Yang published The War on Normal People in 2018. The political class was arguing about trade wars and border walls. He was arguing about the forty-four percent of American workers whose jobs consist of repetitive cognitive and physical tasks — the ones a well-trained algorithm could absorb by Tuesday.

He wasn't running a think-tank paper. He was running for president. The proposal was a Freedom Dividend — $1,000 per month, every American adult, funded by a value-added tax on the companies capturing the productivity gains. No means-testing. No bureaucracy. Cash, directly, as infrastructure.

The response from the establishment was polite dismissal. The response from the internet was something else. His base — called the Yang Gang before that phrase embarrassed anyone — was young, technically literate, and genuinely alarmed. They understood what he was saying because they worked in the industries he was describing.

He withdrew from the Democratic primary in February 2020. The automation crisis did not withdraw with him.

The difference between a futurist and a politician is that futurists are measured by accuracy, not electability.

By 2025, white-collar job cuts were accelerating across legal, financial, and professional services at the pace Yang had projected. His figure of thirty to forty million jobs lost over the decade moved from prediction to working assumption. Economists who once called him alarmist were using his numbers. The question had shifted. Not whether. What do we do about it.

Yang's answer to that question has not changed. The mechanism has remained consistent since 2018. The audience catching up to it is simply larger now.


02

Why this wave is different

Every generation of displaced workers has been told the same thing. New jobs will emerge. Retrain. Adapt. The market corrects.

It has been approximately true, historically. The factory workers replaced by machines became service workers. The service workers replaced by software became knowledge workers. There was always a next rung.

Yang's cognitive labour threshold argument is that this rung does not exist. The industries that absorbed previous waves of displaced labour are themselves being automated. Lawyers. Accountants. Radiologists. Call-centre workers. Financial analysts. These were the refuge sectors — the destinations for people who requalified, retrained, moved up.

The timeline on displacement has compressed. A legal AI system that replaces junior associate work does not wait for the workforce to retrain around it. A radiology algorithm that outperforms specialists on image classification does not phase in gradually. Corporate incentives reward whoever cuts headcount fastest. One company automates. Every competitor is forced to follow. The pressure cascades.

Yang's position is that retraining programmes, educational reform, and market adaptation cannot move at the pace of AI deployment. This is not pessimism. It is arithmetic. A displaced truck driver at fifty has approximately zero retraining pathways that match their former income, stability, or identity. A displaced paralegal at thirty-five is competing for fewer positions in a field where AI can do the entry-level work better than they can.

There is no next rung on the ladder.

The forty-four percent figure that anchors The War on Normal People was Yang's estimate of the share of American jobs vulnerable to automation. He specifically flagged truck drivers — three and a half million of them — as the first major wave. He predicted that displacement at that scale, hitting communities that had already been hollowed out by deindustrialisation, would generate civil unrest. In 2018, this was treated as hyperbole.

Seven years later, Yang describes that prediction as having aged "very, very well, unfortunately."


03

The machinery of the Freedom Dividend

What does $1,000 per month actually do?

Yang's framing is deliberate. He did not call it welfare. He called it a Freedom Dividend — a return on the productivity gains generated by automation, distributed to the citizens whose labour and consumption made that productivity possible. The reframe matters because it changes the political register entirely.

Welfare implies dependency. Infrastructure implies function. Yang argued UBI was the same category of policy as broadband access or road networks — a base layer that enables productive activity rather than replacing it. The money flows to people who then spend it locally, supporting small businesses, community economies, and the social fabric of places that corporate tax structures have otherwise drained.

The funding mechanism is a value-added tax on AI companies. The companies capturing the productivity gains pay into the system that supports the workers displaced by those gains. Yang's argument is that this is not redistribution in the punitive sense. It is closer to a dividend. You own a share of the economy that built these systems. Here is your return.
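The scale of that dividend is easy to work out. A minimal sketch of the gross-cost arithmetic, assuming roughly 258 million US adults (an illustrative figure, not one from Yang's plan) and showing the outlay only, before the VAT revenue the plan nets against it:

```python
# Back-of-envelope arithmetic for the Freedom Dividend's gross annual cost.
# The adult-population figure is an assumption for illustration;
# $1,000/month is Yang's proposed amount.

US_ADULTS = 258_000_000   # rough US 18+ population (assumed)
MONTHLY_DIVIDEND = 1_000  # dollars per adult per month

annual_gross_cost = US_ADULTS * MONTHLY_DIVIDEND * 12
print(f"Gross annual cost: ${annual_gross_cost / 1e12:.1f} trillion")
# → Gross annual cost: $3.1 trillion
```

The sketch is the gross figure only; Yang's argument is that the VAT on companies capturing the productivity gains, plus reduced spending on programmes the dividend replaces, closes much of that gap.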

Existing Social Safety Net

Means-tested. Bureaucratic. Conditional. Requires proof of poverty to access. Often clawed back as income rises, creating poverty traps.

Freedom Dividend Model

Universal. Unconditional. No application. No case worker. No stigma. Paid to every adult regardless of employment status.

Retraining Programmes

Assumes workers can identify, access, and complete relevant requalification. Moves at the speed of curriculum design and institutional approval.

Direct Cash Transfer

Assumes people know what they need better than administrators do. Moves at the speed of a bank transfer. Evidence from pilot programmes in Stockton and Finland points to better wellbeing outcomes than equivalent spending on conditional programmes.
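The poverty trap described above can be made concrete with a small sketch. The benefit amount and the 50% clawback rate here are illustrative assumptions, not figures from any specific US programme:

```python
# Poverty-trap sketch: a means-tested benefit clawed back as earnings rise,
# versus an unconditional dividend. Amounts and the 50% clawback rate are
# illustrative assumptions.

def net_income_means_tested(earnings, benefit=12_000, clawback_rate=0.5):
    """Benefit shrinks 50 cents for every extra dollar earned (assumed rate)."""
    benefit_kept = max(0.0, benefit - clawback_rate * earnings)
    return earnings + benefit_kept

def net_income_universal(earnings, dividend=12_000):
    """Unconditional dividend: every extra dollar earned is kept in full."""
    return earnings + dividend

# Going from $10,000 to $20,000 of earnings:
gain_means_tested = net_income_means_tested(20_000) - net_income_means_tested(10_000)
gain_universal = net_income_universal(20_000) - net_income_universal(10_000)
print(gain_means_tested)  # → 5000.0 (half the raise is clawed back)
print(gain_universal)     # → 10000 (the whole raise is kept)
```

Under the assumed clawback, a worker who doubles their earnings keeps only half the increase; under the unconditional model, the effective marginal rate on extra work is zero.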

The pilot data Yang cited has continued to accumulate. Stockton, California ran a guaranteed income experiment beginning in 2019. Recipients spent the money on food, utilities, and transportation. Employment rates among recipients rose. Mental health outcomes improved. The apocalyptic predictions — that people would stop working, spend the money on drugs, become dependent — did not materialise.

Finland ran a two-year experiment ending in 2020. Recipients reported higher wellbeing, more confidence, greater willingness to seek work. Canada has experimented with variants since the 1970s. None of these pilots are UBI at full scale. Yang has never claimed otherwise. He has claimed they disprove the strongest objections, which is a more modest and defensible position.

The companies capturing the productivity gains pay into the system that supports the workers displaced by those gains.


04

From Silicon Valley to humanist

Yang is an unusual figure in American political thought. He came out of the technocratic culture of Silicon Valley — Venture for America, his nonprofit founded in 2011, sent young graduates into economically struggling cities to work with startups. His first extended contact with deindustrialisation was watching it up close, in places like Detroit and Baltimore, through the optimistic frame of entrepreneurship.

What changed was the honest accounting of what he saw. Entrepreneurship was not reversing deindustrialisation. Small startup ecosystems were not replacing the wage base that had disappeared when manufacturing left. The places were still struggling. The people were still struggling. The frame was inadequate to the problem.

The War on Normal People is the product of that recalibration. Yang does not trust the market to self-correct. He does not trust governments to move fast enough. He does not trust retraining mythologies. He arrived at fundamentally humanist conclusions from inside a culture that tends to believe technology solves the problems technology creates.

His community collapse thesis is where this becomes more than economics. Job loss does not only remove income. It removes structure, purpose, identity, and the social infrastructure built around stable employment. Yang's observation that "people also need purpose and community" sounds obvious until you trace what happens when both disappear simultaneously in the same geographic area. The opioid crisis is one data point. Mortality rates for middle-aged white men without college degrees are another. Suicide rates in deindustrialised communities are another.

Economic stability is the precondition for everything else that makes human life bearable. Yang's argument is that cash transfers address the economic precondition directly. They do not solve purpose or community. But they prevent the freefall that makes every other intervention impossible.

The frame was inadequate to the problem.


05

The political economy of speed

Why hasn't any of this happened?

Yang's diagnosis of the political system is as precise as his diagnosis of the labour market. The people most threatened by automation are the least positioned to influence the policy response to it. Displaced workers in deindustrialised communities have low political capital, low media access, and low representation in the institutions designing the regulatory response to AI.

The people with high political capital are the technology companies deploying the systems. They have regulatory access, lobbying infrastructure, and the cultural prestige of being associated with innovation. The framing they prefer — AI as job creator, AI as productivity enhancer, AI as American competitive advantage — is the framing that dominates most policy discussion.

Yang's Department of Technology proposal was an attempt to create institutional capacity that could respond at the pace of deployment. The United States regulates aviation, food, pharmaceuticals, and financial instruments through dedicated agencies with technical staff and enforcement authority. It has no equivalent for AI systems transforming labour markets at scale. The gap between the speed of deployment and the speed of regulatory response is not an accident. It is the shape of the problem.

The Forward Party, which Yang co-founded in 2021, was an attempt to operate outside the binary of Democratic and Republican framing. His argument was that the AI governance question cuts across existing party lines — that there are conservatives and progressives who understand the threat, and that the existing party structures are too captured by their donor bases to respond coherently.

The Forward Party's electoral results were modest. The policy argument did not become modest with them.

The gap between the speed of deployment and the speed of regulatory response is not an accident. It is the shape of the problem.


06

The Stanford moment

In 2021, Yang presented at Stanford's Institute for Human-Centered AI (HAI). The audience was researchers, technologists, and policy people who work on AI governance professionally. He framed UBI not as a political proposal but as a rational policy response to a documented, measurable disruption.

This matters because of where it happened. Stanford HAI is not a fringe venue. It is one of the central institutions shaping how the technology industry and government think about AI risk. Yang presenting there was a marker of how far his diagnostic framework had moved from the campaign trail into the technical policy conversation.

The questions being asked at institutions like Stanford HAI in 2021 were the questions Yang had been asking in 2018. The language was more technical. The conclusions were largely the same. Automation is displacing cognitive labour. The rate of displacement is accelerating. The social safety net was designed for a different economy. The gap between productivity gains and wage growth has been widening for decades. Something structural needs to change.

Yang's contribution was to say this plainly, in public, to an audience that was not an AI conference — and to attach a specific policy mechanism to it before the question was fashionable. By 2021, the question was fashionable. His mechanism was still the most clearly articulated one on the table.

Yang had been asking in 2018 the questions that Stanford HAI was formalising in 2021.

The pilot programmes he had cited on the campaign trail were accumulating results by then. Stockton's guaranteed income experiment was publishing data. Oakland had launched its own programme. Finland's results were being analysed. None of it proved UBI at national scale. All of it undermined the behavioural objections that had been used to dismiss the idea.


07

What the numbers are saying now

Yang's projection was thirty to forty million jobs lost over the decade following 2018. He named the sectors: transportation, manufacturing, retail, call centres, legal support, financial analysis, radiology, accounting. He named the timeline: the disruption would become visible by the mid-2020s and cascade through the decade.

The mid-2020s arrived. Legal AI systems are replacing junior associate work at large firms. Financial analysis is being automated at investment banks that once hired hundreds of analysts per class. Radiology AI is outperforming specialists on specific imaging tasks. Call centres are being replaced by large language models. Accounting software is eliminating bookkeeping roles that sustained middle-class incomes for decades.

The thirty to forty million figure is now a working assumption in labour economics, not a political claim to be disputed. The International Monetary Fund has published estimates in this range. McKinsey Global Institute has published estimates in this range. Economists who would not have cited Yang in 2018 are citing data that validates his projections in 2025.

The prediction became the working assumption. The question shifted from whether to what.

This is not vindication in any comfortable sense. Yang's predictions being accurate means the disruption he described is real and accelerating. The argument he made for why cash-based intervention was the most direct available response has not been implemented at scale anywhere. The pilots continue. The deployments continue faster than the pilots.

The gap between what Yang described and what has been done about it is the central political fact of the current moment in AI governance. He named the mechanism. The mechanism is visible. The policy response remains, by any honest accounting, inadequate to the speed.


The Questions That Remain

If thirty to forty million jobs disappear and cash transfers stabilise income, what happens to the purpose and community that the jobs carried with them?

The people most threatened by automation have the least political power to shape the response — how does any democratic system correct for that asymmetry?

Yang's Freedom Dividend would be funded by a VAT on AI companies capturing productivity gains — but what happens when those companies have more political influence than the governments trying to tax them?

Is UBI a solution to the speed of displacement, or a way of making the aftermath survivable while leaving the underlying dynamic unchanged?

If Yang's diagnostic framework is now mainstream, why has no major government implemented the policy response he proposed in 2018?
