I Asked AI to Make a Selfie of Me. What I Got Back Changed How I See Myself

What happens when a machine tries to "see" you?
If you’re a woman, an immigrant, or someone who’s ever been misrepresented in the media, the answer might feel familiar and unsettling.

From Essay to Blog: Why This Isn’t Just Academic

When I first explored this topic, it was for an academic paper. I used theory, structure, and citations to analyze the intersection of AI, media, and identity. But turning that paper into this blog post? That was a different kind of work: transliteracy in action.

I had to shift from theorizing identity to living it. From quoting McLuhan to asking what it feels like to be remixed by a machine.

And what I discovered along the way blurred the lines between media literacy, personal narrative, and the algorithms that shape our digital selves.



The Prompt: Feed Your Face to AI

Like many people online, I got curious. I asked Leonardo.ai to generate an AI version of me, giving it a few keywords: “Black,” “woman,” “immigrant,” “Canada.” Then I hit generate.

The image that came back? Striking. I looked like a regal figure, dressed in bright Ankara prints, holding a small Canadian flag with perfect posture. A queen, essentially.

But that’s not how I look. It’s not how I pose. And it’s not how I define myself.

It felt like AI had created not me but a palatable version of “Blackness” — a visual stereotype wrapped in good intentions and coded bias.
So I tried again, this time with additional prompts: “Northern Nigeria,” “Canadian citizen,” “African art,” “Canadian flag.” The result looked closer to how I see myself, though it still wasn’t perfect.
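For the technically curious, here is roughly what that experiment looks like as code instead of clicks. This is a minimal sketch assuming Leonardo.ai’s public REST API; the endpoint, field names, and parameters below are illustrative assumptions rather than verified documentation, so check the official API docs before running anything.

    # Minimal sketch of a prompt-driven image generation request.
    # The endpoint and JSON fields are assumptions about Leonardo.ai's
    # REST API; verify against the official documentation before use.
    import os
    import requests

    API_KEY = os.environ["LEONARDO_API_KEY"]  # your personal API key

    payload = {
        # The same keywords I typed into the web interface,
        # joined into a single prompt string.
        "prompt": ("Black woman, immigrant, Canada, Northern Nigeria, "
                   "African art, Canadian flag"),
        "num_images": 1,
        "width": 768,
        "height": 768,
    }

    response = requests.post(
        "https://cloud.leonardo.ai/api/rest/v1/generations",  # assumed endpoint
        headers={"Authorization": f"Bearer {API_KEY}"},
        json=payload,
        timeout=30,
    )
    response.raise_for_status()
    print(response.json())  # typically returns a generation ID to poll

The point isn’t the code itself. It’s that every keyword is a decision, and everything the keywords leave unsaid gets filled in by the model’s statistical assumptions.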




Misinformation, Disinformation, and the Filter of AI

Having researched misinformation and disinformation in the media, I recognized what was happening.

AI-generated images are not neutral. They are built on datasets shaped by human bias, media repetition, and cultural stereotypes. The result? A curated fiction that often masquerades as truth.

It’s the same issue we see in viral media: simplified narratives that reinforce existing power dynamics. The queen. The warrior. The mystic. Rarely the scholar, the filmmaker, or the working young woman on her third coffee trying to finish grad school.

This isn’t just about falsehoods. It’s about partial truths, and how dangerous they can be when repeated across platforms.


Transliteracy: From Code to Culture

As graduate students in media and communication, we toss around terms like transliteracy — the ability to read, write, and think across media forms. But this experience was transliteracy in the real world.

I wasn’t just reading media; I was inside it. Leonardo.ai didn’t ask who I was. It simply predicted who someone like me was supposed to be.

That prediction was gorgeous. But it was also reductive.

Beyond the Selfie: Who Gets to Be Seen?

This isn’t just my story. It’s a digital dilemma we all face:

  • Who gets to be represented fully online?

  • Who gets flattened into a trope?

  • Who’s missing from the training data altogether?

When machines “learn” from biased media, they reflect those same inaccuracies back at us, dressed up as art, fun, or personalization.

We need to stop thinking of AI-generated content as apolitical. Every image carries a message. Every absence reveals a bias. Every “prediction” is a remix of the past, not a vision of the future.


Engage With It: Let’s Talk

Have you tried AI portraits? What did they get right, and what did they erase?
Drop your experience in the comments or share your generated image with the hashtag #MyAISelfieStory.

🎯 Bonus challenge:
Imagine AI scanning every photo, caption, and tag you’ve ever posted.
What version of you would it construct, and would you recognize it?

Final Thought: From Spectacle to Self

AI tools can do amazing things. But they also replicate the same old problems in shiny new ways. As a woman navigating race, identity, and storytelling, I’ve learned that we must approach these tools with curiosity and caution.

Because when technology gets to decide who we are, it often gets it wrong.

So yes, the images were beautiful. But the real story? That’s still mine to tell.


 
