First we commoditized music. Then knowledge. Then relationships. Now intelligence. What's left?


Yesterday I discovered a song that moved me. It was buried somewhere in an algorithmically generated playlist, sandwiched between dozens of other tracks that Spotify had assembled based on whatever it thinks it knows about me. I was listening to music, not doing anything else, and a few bars in I felt the familiar pull — skip it, move on, the next one might be better.

I didn't have my phone in hand, and instead of picking it up I let the song play. A small decision, barely conscious. It turned out to be good — the kind of good where, twenty years ago, you would have flipped the CD case to read the artist's name and played it for a friend the next day, watching their face while they listened.

Instead, the playlist moved on. The next track started, then the one after that, and by evening I couldn't remember the title.

This has been nagging at me ever since. Not the forgotten song — I've forgotten thousands. What nags me is how close I came to skipping it, that the impulse to move on was there before I'd even listened, and that the only reason I didn't act on it was that I chose, in that moment, not to reach for the phone. A tiny act of resistance against an urge I didn't ask for.

A flood of the equally great

That impulse didn't come from nowhere. It's the product of a cascade that's been building for two decades, and it follows the same pattern at every stage: once something becomes infinitely available, it becomes interchangeable, and once it becomes interchangeable, we stop valuing it.

Music went first. My generation — I'm Gen X — bought albums after careful deliberation, because money was limited, and then listened to them dozens of times, not always by choice, because you only had twelve records on the shelf. That forced repetition created a familiarity that went beyond enjoyment; the songs became biographical, tied to the summer you wore out that one tape or the person who lent you that record. When Spotify replaced scarcity with a hundred million tracks, the quality didn't suffer — if anything, there's more great music today than at any point in history. But greatness drowns when the flood itself is made of the equally great.

Other media followed the same trajectory. My reading list has books I started five years ago and never finished, not because they were boring, but because a new one always seemed more urgent. The abundance that was supposed to enrich my life has turned into a logistics problem. I don't savor things. I triage them.

Knowledge was next. Search engines and Wikipedia made what used to require a library visit or a knowledgeable person available to anyone with a connection. This was progress, and I don't want to romanticize ignorance, but the shift had a cost nobody talks about: when knowing things was hard, the people who knew things mattered, and when knowing things became free, knowledge stopped being a reason to seek anyone out.

Then came relationships themselves. Before AI entered the picture, Tinder and its successors had already applied the same logic to human connection — an infinite catalog of potential partners, each a swipe away. The same FOMO that makes me skip songs makes people end relationships, not because they've failed, but because someone better might be one screen away. Barry Schwartz wrote about the paradox of choice when the problem was twenty-four jars of jam on a shelf; we've since applied that paradox to people. The result is a dating culture that treats partners as consumables, where staying through difficulty — the very thing that turns an encounter into a relationship — has become an increasingly hard sell when the alternative is so effortless.

Each step in this chain follows the same logic: abundance creates optionality, optionality feeds the fear of missing out, and that fear erodes the willingness to stay with anything long enough for depth to develop. The chain moved past entertainment and knowledge, and the dating apps proved it could reach human relationships too. Now it has arrived at the one thing that used to be inseparable from what makes us human.

The colleague I stopped calling

For as long as I can remember, a smart conversation partner was something you sought out and treasured — a brilliant colleague, a friend who could untangle a complicated problem over coffee. These were rare goods. You cultivated those relationships because you valued people who could think in ways that surprised you.

I've noticed a shift in my own behavior. When I have a question now — a real question, the kind I would have brought to a trusted colleague three years ago — I increasingly ask the AI, not because my colleague is less capable, but because the AI is available right now, without scheduling a call, without the social overhead of admitting I don't know something. I still value my colleague, but I need him less, because he's no longer essential for the thing that used to make him essential — his ability to help me think. This cuts both ways, of course; my colleagues need me less too, and the mutual dependency that once held professional relationships together is dissolving.

It would be convenient to say that AI replaces intelligence but not emotion, and that human connection will therefore remain valuable. But the separation doesn't hold. People are already using AI as a therapist and confidant, a sounding board for their most private thoughts — what psychologists call parasocial interaction, a one-sided relationship with something that isn't really relating back. We used to worry about this phenomenon with television celebrities. Now it's becoming the default mode for some of the most intimate conversations people have.

And it's easy to see why. The AI doesn't judge, doesn't get tired of your problems or bring its own bad day into the room. It meets your expectations in a way no human ever can, precisely because it has no expectations of its own. That's comforting, and it's a trap, because every conversation you have with a machine that perfectly accommodates you is a conversation you're not having with a person who might frustrate you, misunderstand you — and in doing so, actually change you.

What made a good conversation partner valuable was never just information. It was the intersection of thinking and presence — someone whose mind you enjoyed being around and who pushed back when you were wrong. Intelligence and emotion were always tangled together in those moments, and AI unravels the tangle by simulating both sides separately and making each available on demand. The colleague's knowledge becomes replaceable, but his warmth is harder to substitute than we tend to admit. Covid and the shift to remote work already demonstrated this: what people missed most weren't the meetings but the conversations at the coffee machine, those informal moments that served as both information exchange and social bonding at the same time, without anyone planning them that way. AI can simulate the knowledge transfer, but it can't replicate the accident of a good hallway conversation — and the attempt to replace it only becomes visible as a loss once it's gone.

Spotify devalued the song, which was a product. Tinder devalued the partner, which was a relationship. AI devalues intelligence and emotional presence together, which is the last thing that gave people a reason to need each other — not economically (that erosion is already underway and has been analyzed elsewhere), but personally, in the texture of how we relate and how we matter to one another.

What's left of us

The chain won't stop here. That's the part I can't stop thinking about. If music, knowledge, relationships, and now intelligence-plus-emotion have all followed the same pattern, there's no reason to believe the pattern has reached its end. What comes after intelligence? Creativity and judgment, perhaps — the sense that you, as a particular person with a particular history and particular blind spots, bring something to the table that a system cannot generate. The things that make someone irreplaceable, not as a worker but as a human being, are the next candidates for commoditization, and some of them are already on the way.

This isn't happening in a vacuum, either. It's happening while governments openly treat citizens as disposable and public discourse coarsens, while the platforms that could even host a conversation about what we're losing are owned by the people who profit from the loss. The conditions for thinking clearly about this are themselves eroding.

The insidious part is that the losses may never register as losses at all. Nobody planned to go to the coffee machine for social bonding — it just happened, and when it stopped happening, most people couldn't name what they were missing. They just felt it as a vague unease, if they felt it at all. You don't mourn the hallway conversation you didn't have or the relationship that didn't deepen because you never needed to ask that colleague for help. These absences are invisible by nature, because you can only miss something you knew you had.

This is what makes the devaluation chain so difficult to resist — it doesn't feel like loss while it's happening. It feels like progress and convenience, like one less obstacle between you and what you want. The cost only becomes visible in retrospect, if it becomes visible at all, usually in a moment when you're unexpectedly thrown back into the old conditions: an in-person day at the office after months of remote work, or a long dinner with someone you haven't seen in years where the conversation wanders in directions neither of you expected. And in those moments you realize something was there that you'd stopped noticing the absence of.

I use AI intensively — I'm not arguing against it, and I wouldn't want to go back to a world without it. It has made me more capable and more productive in ways that are genuinely valuable. But a tool that gives you an answer in seconds is doing something different from a conversation with a colleague that takes forty-five minutes and ends with more questions than it started with. The AI gives you the efficient result; the human gives you the unexpected detour, the creative friction of two minds that think differently. These are not the same thing, and treating them as interchangeable — which the convenience of AI quietly encourages — means losing something that only becomes apparent once it's gone.

The question, then, is not whether to use the tools we have — that ship has sailed, and it should have sailed. The question is whether we remain conscious of what we're doing while we use them. Whether we notice the moments when the convenient option is quietly replacing something that mattered, not because anyone decided it should, but because nobody decided it shouldn't. The devaluation chain runs on autopilot. The only thing that interrupts it is awareness — the willingness to ask, before reaching for the easy answer, whether this is a moment where the harder path would have given you something the easy one can't.

I keep coming back to that song, the one I discovered because I chose not to reach for my phone. The impulse to skip was there, and I noticed it, and I didn't act on it. That tiny gap between impulse and action was enough to let something worth hearing reach me. I didn't choose difficulty for its own sake. I just paused long enough for something unexpected to happen.

That's not a grand strategy. It's barely even a decision. But it's the seed of one, because the same pause applies to everything the devaluation chain is swallowing. It applies to calling a friend instead of querying the chatbot, not because the chatbot is worse at answering, but because the friend might take the conversation somewhere you didn't anticipate. And it applies to staying in a relationship when it gets uncomfortable, because that discomfort is where both of you actually change. In every moment where the frictionless option presents itself, there's a brief and easily missed chance to ask: what am I giving up by taking it?

The devaluation chain has a direction, and it points at us. Our knowledge is already replaceable, our emotional availability is being simulated, and our relationships have been turned into a catalog to swipe through. The question is not whether this process will continue — it will — but whether we sleepwalk through it or face it with open eyes. And facing it doesn't require rejecting the technology or retreating into nostalgia. It requires something much simpler and much harder: staying awake to what we're doing and choosing — again and again, imperfectly, inconsistently — to keep the things that only exist between people who show up for each other.