On Taste
Notes on friction, the unoptimised self, and the comfortable lie
The most comforting lie one can tell oneself, in the boardroom or in the bedroom, is that the rules governing taste, those vague and inconvenient mechanisms reflecting a point of view, do not apply to you. It is a necessary fiction, for to admit that taste matters is to admit that the model and the metric can fail.
A few months ago, I was in one of the Studio conference rooms only the global leadership of one of the Big Five can inhabit. The room itself was an expensive artefact of insulated certainty. The audience looked the part—which is to say, they looked like men and women who believed the rules of the market were different for them. I had just finished forty-five minutes on the future of theatrical cinema: why franchise logic had plateaued, and why a clear point of view was now the market antidote to the centre.
When I finished, a single line brushed aside the last forty-five minutes: “I get what you’re saying about taste and all that. But those rules don’t really apply to us. Our flywheel is different.”
The statement was not about their current slate. It was about immunity. They were asserting that the sheer tonnage of their market position made them structurally exempt from the laws of taste and attention that governed everyone else.
This logic of exemption is not reserved for the executive suite. It is the argument we make, out of fear, when we settle for the path of least risk. The essence of the lie is not just convenience, but legibility to power—being safe because you are easy to model and therefore easy to approve. The personal model is simple: my anxiety, my financial compromise, my fear of not getting what I want—these make the rules of taste, effort, and integrity irrelevant to me. You believe the model will save you from the pain of personal failure, just as the executive believes the metrics will save them from the failure of a fourth sequel.
The problem is that metrics can only really see the middle. So anything built to please them will, over time, drift towards the middle too, whether that’s greenlighting based on “comps” or the quiet internal agreement to model one’s own appearance and desire on what the feed rewards.
If taste has no impact, you do not have to ask harder questions about what you think, or why. You save yourself the effort of choice, and the discomfort of saying no. But without the ability to say no to the thing that is authentically not yours, the ability to say yes when it matters slowly atrophies, too.
Metrics and models promise near-term survival and a cheap self-defence modelled against your assigned group, and they ensure that no one can point directly at you for your specific, personal failing.
If the system has already done the choosing, you are spared the pain of judgment. You save yourself the effort of knowing who you are and the inconvenience of acting like it.
The Algorithmic Self
When the middle must be served, a specific version of you emerges: the Algorithmic Self. Its only purpose is to be consistent, legible, and optimisable. It is built for the utility of others, largely platforms, not for your own.
We built the modern internet on measurement, modelling, and monetisation. Now, at every level—from the dating grid to the greenlight deck—the move is the same: Let the metric stand in for taste. Let the model decide what is ‘good’. Optimise towards that goal; the loop, not us, makes the choice.
The Algorithmic Self is the you that can be plotted, scored, and recombined: the cluster of interests inferred from your watch history, the ‘type’ inferred from your swipes, the ‘tone’ inferred from the posts that perform best, and the risk profile inferred from where you live, what you buy, and how you move.
It is not imaginary. It is the silhouette ad buyers pay for, the pattern-matching composite of your past behaviour. The more tightly the system can model you, the more of your environment it can pre-arrange—which stories rise, which potential partners appear in the queue. You feel as if you are choosing, but the set you are choosing from has already been curated according to what the model believes you will do.
What erodes is not “free will” in some grand metaphysical sense, but the room you are granted to be genuinely off-pattern—to want something that has not been optimised for, to act in a way that does not make immediate sense to the model. This self is the inevitable consequence of a system that can only offer the past as a promise of the future. The human condition, under this endless optimisation, degrades into its cheapest form: distraction running on boredom and loneliness.
The result is that our environment becomes little more than painting between lines someone else has laid out. Life, then, becomes the illusion of a blank canvas. Under these conditions, the point of individual preference is lost. Our deliberate, unoptimised thoughts cannot matter when the primary form of value is being legible to a machine. You begin to suspect, finally, that when nothing is unmeasured, the self becomes the consequence of the metric, and not its source.
The Acquired Self
There is another version of the self that persists alongside this. It is less efficient, less legible, and harder to monetise. Call it the Acquired Self.
This self is the inconvenience—the part that falls in love with someone who is not your ‘type’ on paper, that writes something that defies a topical slot, that changes its mind without warning. It does not hate metrics; it simply refuses them as the final authority.
Taste is moral friction and the necessity of inconvenience. It is the cost of living outside the middle.
We have built a world for frictionless survival, where the system defaults to the easiest path, removing all resistance from consumption, communication, and desire. This system operates on digital logic, where efficiency is the highest virtue and the outcome is optimised for the least possible exertion.
But taste belongs to the physical world—the world of texture and effort and the irritation of waiting. It is the realm where we must exert will: to sit with an impulse and resist it, to develop a life around virtues rather than habits.
There is also a failure of maintenance. Taste is work. Left unattended, it gets colonised. If you never look away from the feed, your preferences will slowly converge on whatever the feed rewards. If you only ever chase what “does well,” your work will eventually sound like everything else. If you do not question why the same kind of man appears on your phone every night, you will internalise the grid as reality.
The Acquired Self is not mysterious. It is simply the part of you that preserves moral nerve: the deliberate choice to keep corners of your life unoptimised.
Living With the Loop
I am not writing this from outside the loop. I, too, have stared at Substack metrics and shifted my sentences towards whatever spiked. I have used the dating apps and felt my sense of what is possible narrow. I have stood before executives who control billions in capital and listened to them explain why taste is someone else’s problem.
It would be comforting to frame this as a moral choice—to delete the apps, ignore the metrics, be pure. That is not how any of this works. The loop is the water now.
The more honest question is smaller and harder: How much of yourself are you willing to turn over to the model? How many of your fundamental decisions—about what you make, what you watch, who you love—are you comfortable delegating to whatever happens to be legible as a metric?
Some drift is inevitable. It is not a personal failure to have internalised the grid; it was designed to be internalised. A certain tenderness is required in admitting that.
But once you see the move—once you hear the “our flywheel is different” voice in your own head—you do have a sliver of choice.
You can decide, in small, concrete ways, to keep one corner of your life slightly less knowable: training a body for how you want to feel, not merely what the rings measure; committing to work that occasionally ignores performance; or allowing desire to be inconvenient to the grid.
None of that will break the loop. It is not revolution. It is not even especially visible from the outside. What it does do is preserve a pocket of unpredictability—a place where the algorithmic self does not get the last word. A place where taste, in the deepest sense, is not merely what the system has trained you to want, but what you have insisted on wanting anyway.