
Karl Marx did not see AI coming

What if Karl Marx was right but for the wrong reasons? In this powerful reflection, Oriane Cohen revisits "Capital" through the lens of AI, strategic thinking, and the rise of mental inequality.


Recently, I was thinking about the good old Karl Marx and his 'Capital'.

How would he feel, what would he think today?

"Oriane... what the ***? Marx now?".

Yes, I can hear you. But hey, stay with me and follow my argument; it will make sense by the end of this piece.

"Labor" is no longer what it was.
Neither is value. Neither is power.

We are witnessing an economic transformation, but also a deep shift in how we understand reality, identity, and human worth.


Marx and the meaning of "work"

Marx believed that work was the essence of man. By working, humans transformed nature, and in doing so, they fulfilled themselves.

For him, work was a creative, social, and meaningful act. But under capitalism, this changed. Work became alienated:

→  The worker no longer owned what they produced.
→  The act of working lost its creativity.
→  Colleagues became competitors.

And most tragically, the worker became estranged from their own human essence.

In this frame, wage labor is exploitation.

Workers sell their labor power to survive while the capitalist captures the surplus value they create.

For Marx, true liberation would come when humans reappropriated their labor:

when work was done freely, collectively, and creatively, no longer under the pressure of survival.


But Marx did not see AI coming!

In tomorrow's world, value no longer depends on human labor. The formula of exploitation changes, because work is being outsourced to the non-human.

Paradoxically, this brings us closer to Marx’s idea!

What a strange twist of history.

AI may fulfill Marx’s dream... but under the control of capital, not its abolition.

There are still two problems:

Problem 1: we need to redefine "work"

The challenge isn't "no longer working," but redefining what it means to work. With AI, humans will have to:

→ Manage the cognitive divide between those who master AI and those who are subjected to it.
→ Focus on deep skills (emotional intelligence, judgment, creativity, ethics, strategy).
→ Reinvent a society where identity and meaning no longer come from "work" and being productive, but from a more subtle and human contribution.

This paradigm shift can become extraordinary if we are ready to face it lucidly.

We're entering the era of what I call "human refinement".

💡
Human refinement is the discipline of upgrading our perception, strategy, and emotional clarity to stay sovereign in the age of AI.

Problem 2: we need to decentralize power when it comes to AI

When the tools of intelligence - algorithms, platforms, data - are concentrated in the hands of a few, and the majority are left untrained, we create a system where confusion becomes a business model.

AI is not the enemy.
AI is a tool.

The biggest and most dangerous problem of all is the absence of public AI literacy.

Add to this the quiet monopolization of attention, behavior, and thought.

It's not that someone or some group here is necessarily "evil". It's a superposition of complex dynamics from the white, the dark, and the grey zones, creating an unbalanced reality.

BUT.

One can't keep on blaming "Big Tech" while doing NOTHING to educate oneself. That's victimization. That's resignation.

We all have a responsibility here.

This divide is not about class. Not about race. It’s about mental clarity.

From my observation, something deep is happening: a silent fracture growing every day.

Not between rich and poor.
Not between black and white.
Not even between humans and machines!

But between those who can navigate this new world, and those who can't.

I see it in my own environment, among friends. There are those who understand, use, and play with this new world, embracing it with caution and fascination.

And the others: afraid, confused.
They are losing their bearings. They close themselves off completely to these changes. They refuse them. They deny them. They resist them. They complain about them.

We've entered the age of mental structure.

Those who think in systems.
Who connect dots before they become data.
Who resist manipulation.
Who stay calm in complexity.

They rise.

The others? They drown. Or worse: they are used.

The new capital is not the $$$. It's not even data! It’s the ability to think clearly in a world designed to confuse you.

You're unbreakable, incorruptible, if you have this ability.

We love to blame AI but... AI didn't create the divide.

It just made it visible, and accelerated it.

We now live in a society where strategic thinking is an "elite" skill
but no one teaches it.

Not in school.
Not in universities.

This is why so many feel lost.
They are not stupid.
They are simply unarmed.

At the end of August, I'll be revealing a protocol I've spent the last months working on. It's the longest, deepest system I've ever created.

It's composed of 5 modules, layered in a logical manner to bring you from:

→ seeing the world for what it is,
→ seeing yourself and others for real,
→ thinking and analyzing the world with grids and methodology,
→ acting in the world with impact, and transmitting for a better tomorrow.

In this system, I don't teach what to think, but how. How to work with mental models, how to maintain your axis, how to use the intelligence-operative mindset to navigate complex environments and the world of tomorrow.

Creating this has been... exhausting. And liberating at the same time. I don't really care anymore what ex-colleagues will think, what people will say. This program is somehow very intimate but never truly personal.

Anyway, you'll see, if you decide to step inside.

It will be accessible to all Premium members.

Stay lucid,

Oriane