The Algorithm Already Decided Who You Are
On algorithmic identity, the shrinking internet, and why I built a tool to break the mirror
You know that feeling. You mention something to a friend, a car brand, a kitchen gadget, a passing interest in fishing, and suddenly your entire feed reshapes itself around it. YouTube, Instagram, Google, all of them, overnight, have decided you are now a Person Who Wants This Thing. Every ad, every recommended video, every sidebar, all working together to suck your attention like rare blood from a horseshoe crab.
Or sometimes you didn’t even search for anything. You just exist in a certain demographic, at a certain age, in a certain zip code, and the algorithm has already filed you. It decided you’re probably anxious about the economy, probably interested in crypto, probably in the market for a stroller. Your whole feed is shaped around a version of you that was assembled without your input from data points you don’t remember generating.
This is not your phone reading your mind. It’s not telepathy. But it’s not as far from it as we like to think.
I Miss the Old Internet
I keep thinking about what the internet used to feel like. Remember StumbleUpon? You’d hit a button and land on some random page about, I don’t know, a guy documenting every payphone left in New York, or a blog about the history of colour in medieval manuscripts, or someone’s personal site about building boats in their garage. You’d go online and genuinely not know what you’d find. You’d end up reading about something you never would have searched for, and it would stick with you.
That internet is mostly gone. Not because the content disappeared, but because the layer between you and the content got optimized. The feed doesn’t want to surprise you. Surprise is inefficient. Surprise doesn’t convert. The feed wants to show you exactly what you’re most likely to click on, which is almost always more of what you already clicked on yesterday.
And so the internet got smaller. Not literally. But experientially. Your slice of it shrinks every day into a tighter loop of the same topics, the same angles, the same products, the same outrage. You open YouTube in the evening, you’re tired, you’re just... available. And what does the algorithm do with that window? It gives you five videos about the car part you searched for this morning. It’s like talking to someone who only knows three things about you and won’t shut up about them.
The Mirror and the Mask
Google has a page, My Ad Center, where you can see the interests and demographics they’ve assigned to you. Go look at it sometime. It’s strange, seeing yourself flattened into a list of ad categories. “Male, 25-34, interested in technology, cooking, outdoor recreation.” Like a dating profile written by a surveillance system.
That profile determines what you see. And what you see, day after day, evening after evening, shapes what you think is normal, what you think is interesting, what you think is worth caring about. You watched one fishing video, so the algorithm shows you more, so you watch more, and three weeks later you’re pricing rods on Amazon and you’re not entirely sure if you actually like fishing or if you were slowly walked into it. The mirror doesn’t just reflect. It shapes. And the thing about mirrors is that if you stare into one long enough, you start adjusting yourself to match the reflection.
Some people are fine with this. That’s their call. But some of us, especially now in this era of AI-everything, want to build a more real, non-artificial kind of intelligence and personality. To be exposed to things outside the categories that three companies in California decided we belong to. You might be the kind of person who’d get into traditional woodworking, or Japanese ceramics, or the engineering behind public transport systems. You’ll just never find out, because no one in Mountain View put you in that box.
Your attention is going to be harvested regardless. The question is whether it gets fed back to you as a narrower, more profitable version of yourself, or as something that actually makes you more interesting.
That’s where the mask comes in. Not hiding, but choosing to introduce noise into the signal. When the data that feeds your profile gets mixed up, contradicted, polluted with things that don’t fit the box, two things happen.
First, the statistical “you” gets blurry, and in that blur there’s something that feels like freedom. The algorithm can’t compress you into a predictable consumer if the box keeps changing shape.
But second, and this is the part that matters at a deeper level, your data becomes less valuable. The whole ad economy runs on the assumption that your profile is accurate enough to sell to the highest bidder. When the signals are mixed, when your profile says you’re simultaneously a fisherman and a fitness pro and a home chef, the targeting breaks down. The data is still there, but it’s worth less. You’re not opting out of the system. You’re degrading the quality of what they stole from you.
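One way to picture that degradation: treat the ad profile as a distribution over interest categories. A clean profile is sharply peaked, so the top category can be sold with confidence; a polluted one is flat. Here’s a toy sketch of the idea (the categories, counts, and function names are all made up for illustration, not anything Google or the app actually computes):

```python
from collections import Counter

def profile(signals):
    """Turn a list of observed interest signals into a probability distribution."""
    counts = Counter(signals)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def targeting_confidence(dist):
    """The best bet an advertiser can make: probability of the top category."""
    return max(dist.values())

# A focused profile: one fishing search begets more fishing content.
clean = profile(["fishing"] * 8 + ["cars"] * 2)

# The same person after a persona has mixed in contradictory signals.
noisy = profile(["fishing"] * 8 + ["cars"] * 2 +
                ["fitness"] * 7 + ["cooking"] * 7 + ["ceramics"] * 6)

print(targeting_confidence(clean))  # 0.8 -> easy to target
print(targeting_confidence(noisy))  # ~0.27 -> the top guess is barely a guess
```

The data points don’t go away; they just stop adding up to a confident bet.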
This isn’t really about privacy in the way we usually talk about privacy. It’s not about blocking trackers or hiding behind a VPN. I’d call it algorithmic self-determination and curation. You should have a say in what the machine thinks it knows about you, because what it thinks it knows shapes what you see, and what you see shapes who you become. That chain is worth interrupting.
So I Built the Thing
I chewed on this for a while and eventually just built it.
MirrorMask is a native Mac app that scrapes your real Google ad profile, shows you every interest and demographic category they’ve pinned on you, and then browses the web as a completely different person overnight. You pick a persona. Fisherman, Home Chef, Fitness Pro, whatever. It runs searches, watches YouTube, visits sites, all through your actual Chrome browser, not a headless instance, not a hidden process. You can watch it happen in real time. Real browser fingerprints, real cookies, real session behavior. Google doesn’t see a bot. It sees what looks like a different person sitting at your computer. Over time, your ad profile shifts. The mirror starts reflecting someone else. The data becomes noise.
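To make the mechanics concrete, here’s a toy sketch of the core loop. The actual app is Swift and drives your real Chrome; everything below (the Persona shape, the example queries, the function names) is invented purely to illustrate the shape of the idea:

```python
import random
from dataclasses import dataclass
from urllib.parse import quote_plus

@dataclass
class Persona:
    name: str
    searches: list  # queries this persona would type
    sites: list     # sites this persona would visit

# An invented example persona; the real app ships its own.
FISHERMAN = Persona(
    name="Fisherman",
    searches=["best baitcasting reel", "fly tying for beginners",
              "lake trout season"],
    sites=["https://www.takemefishing.org"],
)

def overnight_plan(persona, actions=6, seed=None):
    """Build a shuffled list of URLs for one session. The real app opens
    each one in the user's actual Chrome, spaced out with human-ish pauses,
    so the traffic carries real cookies and a real fingerprint."""
    rng = random.Random(seed)
    urls = [f"https://www.google.com/search?q={quote_plus(q)}"
            for q in persona.searches] + list(persona.sites)
    rng.shuffle(urls)
    return urls[:actions]

for url in overnight_plan(FISHERMAN, seed=42):
    print(url)
```

The interesting part isn’t the loop, it’s where it runs: inside your real browser session, which is what makes the traffic indistinguishable from you.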
Everything runs locally. Built with Swift, nothing leaves your Mac. No accounts, no cloud, no telemetry.
If there’s enough interest, I want to expand beyond the browser. Phantom Alexa commands, smart TV viewing patterns. The profile that follows you around is built from every connected device in your house, not just Chrome. But the browser is where the bulk of it gets collected, so that’s where I started.
Free 5-day trial, then $39 one time. Not a subscription, because I’m tired of everything being a subscription.
I’m not pretending this fixes the internet or solves surveillance capitalism. But I think there’s value in being a harder person to categorise. In making the mirror a little less clear. In choosing your own mask instead of wearing the one the algorithm stitched together from a thousand micro-decisions you barely remember making.
Maybe what I really want is for the internet to feel a little more like it used to. A place where you could end up somewhere unexpected and come away slightly different. That probably takes more than one Mac app. But it’s a start.
If you’ve thought about this stuff or tried other approaches, I’d love to hear about it.

