In the early days of social media, our interests were the driving force behind the content we consumed — we actively sought out pages, groups, and individuals that aligned with our passions and hobbies. But now, the implementation of algorithms to curate our feeds has shifted the dynamic. In the labyrinthine ecosystem of digital interaction, algorithms have emerged as silent architects of personal identity, wielding far more transformative power than simple content recommendation.
By analyzing our browsing history, likes, and comments, the algorithm anticipates our preferences, and now even pushes us to explore new ones — sometimes at the behest of a paying corporation that weaponizes the technology as a marketing strategy. After years online, how can we differentiate organic preferences from ones planted in us by the algorithm?
An ongoing lawsuit between two influencers raises important questions about our digital existence, capitalism, and the creator industry. Sydney Nicole Gifford is an Amazon influencer, promoting its products on social media and paid a commission whenever a viewer buys a product through her affiliate link. The aesthetic Gifford promotes is what the chronically online might call the “clean girl aesthetic.” This aesthetic is consistent across her profiles, her content, and even her home. And now she’s suing Alyssa Sheil, another Amazon influencer, for copyright infringement and misappropriation of her aesthetic.
Sheil’s defence? The algorithm made me do it. There are thousands of influencers who have the same look, talk the same way, and promote the same products. Personal style choices — like the rose tattoo Gifford and Sheil both have on their arm — are lifted from popular posts on mood board sites like Pinterest. Despite the fact that people, even influencers, are pushed to tailor their lifestyle to fit the trend du jour, Gifford insists that Sheil infringed on her vibe. The kicker? Sheil has had Gifford blocked for years, and hasn’t had access to her posts.
Could these two women have arrived at the same type of content completely organically through their algorithms? Social media is driven by the pursuit of user engagement — popular content and rising trends are promoted to more people for longer, which can lead to a convergence of styles. The creation of a filter bubble of an aesthetic isn’t far from the realm of possibility.
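That convergence mechanism — popular content gets promoted, promotion makes it more popular — can be sketched as a toy “rich get richer” simulation. Everything here is an illustrative assumption (the number of styles, the number of users, and the `boost` parameter do not describe any real platform); the point is only that even a small engagement bias collapses diverse tastes toward a dominant aesthetic.

```python
import random

def simulate_feed(styles=5, users=10_000, boost=0.9, seed=42):
    """Toy model of engagement-driven convergence.

    Each new user adopts a style either by following the feed
    (picking in proportion to current popularity, with probability
    `boost`) or organically (picking uniformly at random).
    """
    rng = random.Random(seed)
    counts = [1] * styles  # every style starts with one adopter
    for _ in range(users):
        if rng.random() < boost:
            # engagement-driven pick: promoted proportionally to popularity
            pick = rng.choices(range(styles), weights=counts)[0]
        else:
            # organic pick: independent personal preference
            pick = rng.randrange(styles)
        counts[pick] += 1
    return counts

counts = simulate_feed()
leader_share = max(counts) / sum(counts)
print(counts, round(leader_share, 2))
```

With a high `boost`, one style typically ends up far above the uniform one-in-five share, even though every style began on equal footing — a crude analogue of an aesthetic filter bubble forming without anyone copying anyone.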
How else is the algorithm changing us? This technology does not simply reflect our preferences anymore — it actively sculpts them, creating a recursive loop of technological mediation that fundamentally reshapes the human experience. The result is a hyperspecialized information environment that operates with surgical precision, reinforcing existing (or promoted) cognitive frameworks. This isn’t just curation — it’s a form of intellectual containment.
In July, The Guardian conducted an experiment to see what kind of content would be pushed to a newly set-up, blank iPhone with no usage history for an algorithm to draw on. The result? A surge of sexist and misogynistic content. The only user information added to the fake profile was that it belonged to a 24-year-old male. This isn’t algorithmic randomness — it’s a sophisticated mechanism of cultural reproduction, where latent societal prejudices are not just mirrored, but strategically amplified. The algorithm becomes an active agent of ideological ecosystems, not a passive conduit.
All of this illuminates a profound technological paradox: our most advanced recommendation systems are simultaneously sophisticated and deeply primitive. This contradiction isn’t just a technical curiosity, but a window into the complex interplay between advanced machine learning and human preferences and biases. The algorithm doesn’t transcend human proclivities (whether ideological or beauty-oriented); it amplifies them with mathematical precision. This isn’t a malfunction; it’s a feature.
Welcome to the ecosystem of perceived desire. Gone are the simple days of targeted ads and paid promoted posts. Corporations are now using influencers to manufacture what seems like spontaneous, authentic interest. This isn’t advertising in the traditional sense — up until January of this year, influencers weren’t required to disclose paid partnerships with brands (and even now the law is difficult to enforce), and companies weaponized this to fabricate “organic” cultural moments.
Brands no longer pay for direct advertisements; they invest in intricate algorithmic strategies that amplify specific content, creating the illusion of grassroots enthusiasm. An influencer’s seemingly casual product mention becomes a precisely calibrated signal, algorithmically boosted for maximum perceived authenticity. A single strategically placed product in an influencer’s content can trigger recommendation systems that transform casual exposure into a viral phenomenon. The algorithm doesn’t just observe trends, it actively manufactures them, creating what sociologists might call a “manufactured consensus.”
What makes this approach particularly insidious is its exploitation of human psychological vulnerabilities. A product isn’t just being sold — it’s being socially legitimized through a complex network of perceived authenticity. As these strategies become increasingly sophisticated, they challenge existing regulatory frameworks. How do we classify marketing that isn’t technically an advertisement? When algorithmic influence becomes this nuanced, traditional definitions of transparency and disclosure become woefully inadequate.
This isn’t merely a technological phenomenon, but a profound cultural shift. We’re witnessing the commodification of authenticity itself — where attention is the currency, algorithms are the brokers, and genuine human experience becomes a marketable resource to be algorithmically parsed and strategically deployed.