The Digital Caste: Surveillance Capitalism and the Architecture of Permanent Inequality
How algorithmic systems are building a new structure of social stratification — and why your next cup of milk might be an act of resistance
I grew up in small-town America. The kind of place where you knew the person behind the counter at the hardware store and bought your milk from someone whose name you could actually remember. It wasn’t romantic — it was just how commerce worked. You exchanged money for goods, and nobody was quietly adjusting your price based on what phone you carried or how long you lingered in the dairy aisle.
That world is vanishing. Not because people stopped wanting it, but because a different architecture of exchange has been built on top of it — one that watches, sorts, predicts, and prices with a precision that would make any historical caste enforcer envious.
This essay is about that architecture. It draws on the work of scholars who have been mapping it for years — Shoshana Zuboff, Ruha Benjamin, Cathy O’Neil, Virginia Eubanks, and others — and it tries to connect their insights to something I think we’re collectively failing to name: the emergence of a digital caste system. Not caste in the sense of explicit hereditary ranking, but caste as a structural logic — a way of sorting people into categories that determine access, opportunity, and dignity, enforced not by tradition but by code.
The mechanism is new. The shape is old.
I. Caste as Structural Logic
What makes traditional caste a technology of social control isn’t just hierarchy — it’s the way that hierarchy becomes invisible through repetition. The Brahminical system didn’t need to justify itself daily because it was embedded in every interaction, every spatial arrangement, every economic exchange. It was the water the fish couldn’t see.
Digital systems are building something structurally similar. When an algorithm decides your creditworthiness, your insurance premium, your likelihood of recidivism, or the price you see for a flight, it is performing a sorting function. That sorting determines access. Access determines life outcomes. And the whole thing operates at a speed and scale that makes interrogation nearly impossible.
Virginia Eubanks documented this with devastating clarity in Automating Inequality. She showed how automated eligibility systems, predictive models, and coordinated databases create what she calls a “digital poorhouse” — a system that profiles, polices, and punishes the poor while remaining largely invisible to everyone else.
“The digital poorhouse is not a single building or institution. It is a network of technologies, practices, and ideologies that manages the lives of poor and working-class people in ways that are often invisible to those who benefit from them.”
— Virginia Eubanks, Automating Inequality
The point isn’t that these systems were designed to create caste. The point is that they function like one. When the sorting is automated, the hierarchy feels natural. And when it feels natural, it becomes nearly impossible to challenge.
II. From the New Jim Code to Surveillance Pricing
Ruha Benjamin coined the term “the New Jim Code” to describe how seemingly neutral technologies encode and reproduce racial and social hierarchies. Her argument in Race After Technology is precise: the problem isn’t that algorithms are occasionally biased. The problem is that they are built on top of data that already reflects decades — centuries — of structural inequality. The algorithm doesn’t create the bias. It launders it.
Cathy O’Neil made a complementary case in Weapons of Math Destruction. She showed how algorithmic models used in hiring, lending, policing, and education don’t just reflect inequality — they create feedback loops that amplify it. A model that uses zip code as a proxy for risk is, in practice, using race. A model that penalizes you for attending a low-ranked school is penalizing you for being poor. And because these models are proprietary, opaque, and unregulated, there’s no mechanism for appeal.
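To see how a proxy launders bias, consider a deliberately simple sketch in Python. Everything here is invented: the zip codes, the rates, the scoring rule. No real lender works this crudely. But the structure is the point: the function never sees race, and the disparity comes out anyway.

```python
# A toy illustration of proxy discrimination. The "model" below never
# sees a protected attribute, only a zip code. But where historical
# disadvantage is concentrated by geography, scoring on zip code
# reproduces the disparity anyway. All data here is invented.

# Hypothetical historical default rates by zip code. In a segregated
# housing market, zip code and race are strongly correlated.
historical_default_rate = {
    "10001": 0.04,  # affluent area
    "10456": 0.12,  # historically redlined area
}

def risk_score(applicant: dict) -> float:
    """Score an applicant using only 'neutral' inputs."""
    base = historical_default_rate[applicant["zip"]]
    # Penalize thin credit files, which also track past exclusion
    # from mainstream banking.
    if applicant["credit_history_years"] < 3:
        base += 0.05
    return round(base, 2)

alice = {"zip": "10001", "credit_history_years": 10}
bob = {"zip": "10456", "credit_history_years": 2}

print(risk_score(alice))  # 0.04 -> approved at a good rate
print(risk_score(bob))    # 0.17 -> denied, or priced much higher
```

Nothing in that function mentions race, which is exactly O'Neil's point: the disparity arrives pre-encoded in the inputs.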
But both Benjamin and O’Neil were primarily writing about systems that sort people into categories — creditworthy or not, hirable or not, suspicious or not. What’s emerged since their books is something more granular and arguably more insidious: the ability to sort people into individualized price tiers based on surveillance data. This is surveillance pricing, and it represents a new frontier in algorithmic stratification.
III. How the New Caste Is Being Priced Into Existence
Shoshana Zuboff’s The Age of Surveillance Capitalism provides the foundational framework here. Her core insight is that surveillance capitalism doesn’t just collect data about users — it uses that data to predict and modify behavior, and it sells those predictions as products. The raw material isn’t your data. It’s your future behavior.
Surveillance pricing is the commercial application of this logic. If a company knows your browsing history, your location, your device, your purchase history, your estimated income bracket, and your behavioral patterns, it can calculate not just what to offer you, but the maximum price you will bear.
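No company publishes its pricing logic, so any concrete example has to be a guess. Here is a stylized sketch of what an individualized pricing function might look like, with every signal, weight, and threshold invented for illustration. Real systems are proprietary and model predicted willingness to pay statistically rather than through hand-written rules.

```python
# A stylized sketch of individualized pricing. Every signal and weight
# here is invented; real systems are proprietary, far more complex,
# and learn these margins from data rather than rules.

BASE_PRICE = 100.00

def personalized_price(profile: dict) -> float:
    multiplier = 1.0
    if profile["device"] == "high_end":        # pricier phone, pricier quote
        multiplier += 0.08
    if profile["estimated_income"] > 100_000:  # inferred from purchase data
        multiplier += 0.05
    if profile["searches_for_item"] >= 3:      # repeated searches signal urgency
        multiplier += 0.10
    if profile["price_sensitive"]:             # history of abandoning carts
        multiplier -= 0.12
    return round(BASE_PRICE * multiplier, 2)

shopper_a = {"device": "high_end", "estimated_income": 140_000,
             "searches_for_item": 4, "price_sensitive": False}
shopper_b = {"device": "budget", "estimated_income": 35_000,
             "searches_for_item": 1, "price_sensitive": True}

print(personalized_price(shopper_a))  # 123.0: same product
print(personalized_price(shopper_b))  # 88.0:  same moment, different price
```

The rules are fake; the asymmetry is not. Each shopper sees one number and has no way to know the other exists.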
This isn’t theoretical. Orbitz was caught showing Mac users higher-priced hotels than PC users as far back as 2012. But the practice has become dramatically more sophisticated.
In January 2025, the Federal Trade Commission released initial findings from its study of eight companies that offer surveillance pricing products and services, including Mastercard, JPMorgan Chase, Accenture, and McKinsey. The study found that these intermediaries use vast quantities of personal data, including income, location, demographics, browsing behavior, and individual consumer profiles, to enable retailers to set individualized prices. The FTC noted that this practice raises “serious concerns about privacy, competition, and consumer protection.”
What the FTC study revealed is the infrastructure: a network of data brokers, analytics firms, and consulting companies that have built the plumbing for real-time personalized pricing. This isn’t one company doing something sketchy. It’s an entire industry optimized for extracting maximum value from each individual consumer, calibrated by how much they can afford to pay — or how much they can be made to believe they should.
In response, New York introduced the Algorithmic Pricing Disclosure Act, which would require companies to disclose when they use automated systems to set individualized prices. The bill represents an acknowledgment that surveillance pricing isn’t just a market inefficiency — it’s a mechanism of stratification.
Think about what this means structurally. Two people standing in the same virtual store, looking at the same product, seeing different prices — not because of a sale or a coupon, but because one of them has been algorithmically determined to pay more. This is price discrimination refined to the individual level. It is, in effect, a caste tax: an invisible surcharge applied based on your position in a hierarchy you didn’t choose and can’t see.
I want to give this a name: price-locked poverty. The condition of being algorithmically identified as someone who can be charged more for essentials — or, conversely, as someone who will only be shown the cheapest, lowest-quality options. Both ends of the spectrum are forms of economic control. And both are enforced by systems that are invisible, unaccountable, and nearly impossible to opt out of.
IV. Algorithmic Class Formation
Recent sociological work has begun theorizing what this means for class structure itself. Jenna Burrell and Marion Fourcade, in their analysis of digital classification systems, argue that algorithmic sorting is creating new forms of social stratification that don’t map neatly onto traditional class categories. They describe an emerging split between what they call the “coding elite” — those who build and control algorithmic systems — and the “cybertariat” — those who are subject to them.
This isn’t just a digital divide in the old sense of who has internet access. It’s a divide in who gets to be a subject versus an object in algorithmic systems. The coding elite set the parameters. Everyone else is a data point.
Vassilis Charitsis and Mikko Laamanen have pushed this further, arguing that algorithmic sorting doesn’t merely reflect existing class positions — it actively constitutes new ones. When a platform decides what content you see, what prices you’re shown, what jobs appear in your feed, and what credit you qualify for, it isn’t just responding to your class position. It is producing it.
This is the caste mechanism at work. Not a single decree from above, but a thousand small algorithmic decisions that cumulatively construct your social position. And because each decision is automated, distributed, and opaque, the resulting hierarchy appears natural — as if it simply reflects “the data.”
V. The Emotional Timing of Extraction
There’s a temporal dimension to this that most analyses miss. Surveillance capitalism doesn’t just sort and price. It times.
Zuboff documented how the industry moved from “monitoring” behavior to “actuating” it — from watching what you do to actively shaping when and how you do it. The push notification that arrives when you’re most likely to buy. The price that spikes when demand data shows you’re most desperate. The loan offer that appears at the moment your checking account dips below a threshold.
Social media platforms have become laboratories for this kind of temporal extraction. They don’t just serve you content — they serve it at moments calculated to maximize engagement, which is to say, at moments of emotional vulnerability. The algorithmic feed is a precision instrument for exploiting affect.
When you combine temporal targeting with individualized pricing, you get something genuinely new: a system that can identify when you are most vulnerable and then offer you a worse deal at precisely that moment. This isn’t market efficiency. It’s predation operating at machine speed.
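Here is that combination in miniature, again as a stylized sketch with invented signals and thresholds: a system that watches behavioral telemetry for a moment of inferred urgency, then widens its margin at exactly that moment.

```python
# A stylized sketch of temporal extraction combined with individualized
# pricing. The signals and thresholds are invented; the structure is
# the point: vulnerability raises the price, and timing delivers it.

def urgency_score(signals: dict) -> float:
    """Infer how desperate a user is from behavioral telemetry."""
    score = 0.0
    score += 0.3 if signals["late_night_session"] else 0.0
    score += 0.1 * signals["searches_last_hour"]
    score += 0.4 if signals["checking_account_low"] else 0.0  # via linked data
    return min(score, 1.0)

def timed_offer(base_price: float, signals: dict) -> tuple[float, bool]:
    u = urgency_score(signals)
    price = base_price * (1.0 + 0.25 * u)  # up to a 25% surcharge at peak urgency
    send_now = u > 0.6                     # push the offer only when it will land
    return round(price, 2), send_now

signals = {"late_night_session": True, "searches_last_hour": 3,
           "checking_account_low": True}
print(timed_offer(200.0, signals))  # (250.0, True): worst deal, best timing
```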
VI. The Illusion of Choice
The standard response to critiques of surveillance capitalism is that consumers have a choice. You can opt out. You can use different platforms. You can read the terms of service. You can “vote with your feet.”
This argument is, to put it precisely, the argument of the maze designer explaining to the rat that it could always turn left instead of right.
The architecture of digital commerce is designed to make opting out either impossible or punishingly expensive. Try to participate in modern economic life without a smartphone, a bank account linked to automated systems, a credit score generated by opaque algorithms, or an email address connected to a platform that tracks your behavior. The “choice” to opt out is the choice to accept systematic economic disadvantage.
O’Neil called this the feedback loop problem: the models that sort you into categories use your responses to those categories as input for further sorting. If you’re shown only expensive options and you buy them, the model confirms you’ll pay more. If you’re shown only cheap options and you buy them, the model confirms you’re a discount shopper. Your “choices” within the system become the data that reinforces the system’s classification of you.
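The loop is easy to simulate. In the toy model below, with all numbers invented, the system's belief about a shopper determines what they are shown, their purchase within that constraint is treated as fresh evidence, and the belief hardens in whichever direction it started.

```python
# A toy simulation of O'Neil's feedback loop. The model's belief about
# a shopper constrains what it shows them; their purchase within that
# constraint then "confirms" the belief. All numbers are invented.

def run_feedback_loop(initial_belief: float, rounds: int = 5) -> None:
    """belief = model's estimated probability the user is a premium buyer."""
    belief = initial_belief
    for i in range(rounds):
        # The model only shows premium options if it already believes
        # the user is a premium buyer.
        shown_premium = belief > 0.5
        # The user buys what they are shown; they never see the rest.
        bought_premium = shown_premium
        # Naive update: the constrained choice is treated as evidence.
        observation = 1.0 if bought_premium else 0.0
        belief = 0.7 * belief + 0.3 * observation
        print(f"round {i}: shown_premium={shown_premium}, belief={belief:.2f}")

run_feedback_loop(0.55)  # belief climbs: 0.69, 0.78, 0.85, ...
run_feedback_loop(0.45)  # belief decays: 0.32, 0.22, 0.15, ... same merchandise
```

Two shoppers who start a whisker apart end up in different worlds, and the model counts each round as confirmation.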
Eubanks made a related point about what she calls “ethical distance” — the way that automated systems allow decision-makers to avoid confronting the human consequences of their choices. When an algorithm denies you benefits, nobody has to look you in the eye. When a surveillance pricing system charges you more, nobody has to justify it to you personally. The system does the sorting, and the humans who profit from it maintain plausible deniability.
This is, again, how caste works. Not through explicit cruelty — though there is that — but through structural arrangements that make inequality feel like the natural order of things.
VII. The Yoga, the Milk, and the Bookstore: A Defense of the Local
I run a yoga shala in Washington, DC. This might seem like a non sequitur, but I think it’s actually the point.
A local yoga studio is one of the few remaining spaces of unmediated economic exchange. You walk in. You practice. You pay a known price — the same price as everyone else. Nobody is algorithmically adjusting your drop-in rate based on your browsing history or the estimated value of your home.
The same is true of the farmers market, the independent bookstore, the local craftsperson. These aren’t just quaint alternatives to Amazon. They are structural resistances to surveillance pricing. Every transaction that occurs outside the algorithmic pricing infrastructure is a transaction where your data isn’t being harvested, your vulnerability isn’t being timed, and your price isn’t being individualized.
I’m not naive about this. Local economies have their own inequities. Small businesses can be exclusionary. The farmers market isn’t accessible to everyone. But the structure of the exchange is fundamentally different from one mediated by surveillance. In a local exchange, you are a person. In a surveilled exchange, you are a data point being optimized against.
Zuboff herself noted this tension:
“It is very difficult to participate effectively in society without interfacing with the same channels that are surveillance capitalism’s supply chains.”
— Shoshana Zuboff, The Age of Surveillance Capitalism
She’s right. And that’s exactly why the local matters. Not as a complete alternative — that’s unrealistic — but as a practice. A discipline. A way of maintaining some portion of your economic life outside the algorithmic sorting machine.
Call it algorithmic Luddism if you want. The original Luddites weren’t anti-technology. They were against the specific use of technology to destroy their livelihoods and concentrate power. They were right then. The instinct is right now.
VIII. What We Are Choosing Not to See
The most effective feature of any caste system is its invisibility to those it benefits. If you’re not being charged more, you don’t notice surveillance pricing. If your credit score is fine, you don’t think about algorithmic lending discrimination. If your neighborhood isn’t being predictively policed, the system seems to work just fine.
This is the deepest analogy to traditional caste. The Brahmin doesn’t need to understand the system that benefits them. They just need to live within it. The system does the rest.
What’s different about digital caste is the speed and granularity of the sorting. Traditional caste systems were crude instruments — they sorted by birth, by family, by visible markers. Digital systems sort by thousands of variables, updated in real time, invisible to the people being sorted. The precision makes the system more effective and harder to resist. You can’t organize against a classification you can’t see.
I think about this from the perspective of someone who grew up in small-town America and now teaches yoga in a major city. The distance between those two worlds is enormous, but the thread connecting them is the same: the belief that economic exchange should involve some minimum of human dignity. That you shouldn’t have to be surveilled to buy milk. That the price of participation in society shouldn’t be the surrender of your behavioral data to companies whose interests are structurally opposed to your own.
A note of humility: I’m not a sociologist or a computer scientist. I’m a yoga teacher who reads too much. But the scholars I’ve cited here — Zuboff, Benjamin, O’Neil, Eubanks, Burrell, Fourcade, Charitsis, Laamanen — they’ve done the rigorous work. What I’m trying to do is connect their findings to a pattern that I think we need a word for. That word is caste. Not because it’s a perfect analogy, but because it captures the structural reality: a system of hierarchical sorting, enforced by mechanisms that appear neutral, producing outcomes that feel natural, while being anything but.
Conclusion: Seeing the Maze
If surveillance capitalism is building a new caste system — and I believe the evidence strongly suggests it is — then the question is what to do about it.
The scholars I’ve drawn on here offer different but complementary answers. Zuboff calls for new legal frameworks. Benjamin calls for abolition of racist systems disguised as innovation. O’Neil calls for algorithmic auditing and transparency. Eubanks calls for centering the experiences of those most harmed by automated systems.
I’d add something simpler: see the maze.
The first step in resisting a caste system is recognizing that you’re in one. The second is acting locally — supporting economic structures that aren’t built on surveillance. The third is organizing collectively — because individual consumer choices, while meaningful, are insufficient against systems of this scale. The fourth is demanding algorithmic literacy as a basic civic competence — understanding not just how to use technology, but how technology is using you.
I teach yoga. The practice is, at its core, about seeing clearly — seeing through the stories we tell ourselves about who we are and what’s real. The digital caste system survives precisely because we don’t see it. Because it operates in the gap between convenience and extraction, in the milliseconds between your search query and the price you’re shown.
Seeing it won’t fix it. But it’s where every meaningful response begins.
And in the meantime, buy your milk from someone whose name you know.
Ways to Push Back
- Buy local when you can. Farmers markets, co-ops, and independent shops operate outside the surveillance pricing infrastructure. Every untracked transaction is a small act of structural resistance.
- Support local studios, gyms, and gathering spaces. These are among the last spaces of unmediated human exchange. They matter economically and socially.
- Choose independent craftspeople and service providers. The algorithm can’t individualize your price when the transaction is person-to-person.
- Use cash when possible. Cash transactions generate no behavioral data. They are invisible to the surveillance pricing infrastructure.
- Contact your legislators. Support bills like the Algorithmic Pricing Disclosure Act. Demand transparency in how prices are set and how your data is used.
- Read the scholars. Zuboff, Benjamin, O’Neil, and Eubanks have done the rigorous work. Their books are accessible and urgent. Start with any of them.
- Develop algorithmic literacy. Understand how recommendation systems, dynamic pricing, and behavioral targeting work. Teach your kids. Talk to your friends.
- Talk about this openly. The digital caste system thrives on invisibility. Naming it is the first step toward resisting it.
Sources
Primary Scholarly Works
Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs, 2019.
Benjamin, Ruha. Race After Technology: Abolitionist Tools for the New Jim Code. Polity, 2019.
O’Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown, 2016.
Eubanks, Virginia. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St. Martin’s Press, 2018.
Algorithmic Class Formation
Burrell, Jenna, and Marion Fourcade. “The Society of Algorithms.” Annual Review of Sociology, vol. 47, 2021, pp. 213–237.
Charitsis, Vassilis, and Mikko Laamanen. “Engagements with Platform Capitalism: Reconciling Consumers and Workers.” Consumption Markets & Culture, 2023.
Surveillance Pricing
Federal Trade Commission. “FTC Staff Report Examines the Data Broker and Surveillance Pricing Industry.” January 2025.
Mattioli, Dana. “On Orbitz, Mac Users Steered to Pricier Hotels.” The Wall Street Journal, June 26, 2012.
New York State Legislature. Algorithmic Pricing Disclosure Act. Introduced 2024.
Caste and Structural Inequality
Wilkerson, Isabel. Caste: The Origins of Our Discontents. Random House, 2020.
Ambedkar, B.R. Annihilation of Caste. 1936. Annotated critical edition, Verso, 2014.
Historical Context
Sale, Kirkpatrick. Rebels Against the Future: The Luddites and Their War on the Industrial Revolution. Addison-Wesley, 1995.
