Reconstructed, Not Created - How AI Shapes the Aesthetic of Identity




 Who I Am (According to AI)

I never expected to feel so conflicted over a selfie, especially one I didn't take. I used MirageAI to generate an image of myself. The instructions were simple: describe your features and let the machine do the rest. I included every detail about myself that I could. But what I got back didn't look like me. It looked like a polished idea of someone AI thought I could be.

Generative AI doesn't construct new identities. Instead, it reconstructs them. It stitches together what it already knows: Western beauty ideals, social media trends, and biased training data. Then it reshapes you to fit that aesthetic.

Prompting Myself Into Existence

The first prompt I tried was "East African girl." The result? A young girl in vague traditional attire in a remote-looking setting. I tried "East African woman" next and ran into the same issue. AI reduced an entire region's identity to a flat cliché.

I kept adding specifics: square glasses, gold hoops, big cheeks, blonde knotless braids, even the setting (a cozy cafe). Eventually, the image looked closer to what I had imagined. But it still wasn't me.






Reconstruction Through Erasure

I was surprised by the things MirageAI chose to change. It gave me unnatural braids and styled my baby hairs into the swooping TikTok edges I don't use. It smoothed out my skin and sharpened my jawline. AI doesn't reflect identity. It performs identity on its own terms, not yours. The identity it constructs is shaped by years of biased training data and algorithmic ideals of beauty.

It does not merely recreate you. It curates you.






Data as Aesthetic: The Invisible Design Language of AI

Looking at my AI selfie, I realized that AI has an aesthetic. And it's not neutral.

Its training data is full of patterns and historical exclusions. What it has seen most shapes what it reproduces. That is why my first prompt returned a traditionally dressed African girl. It was not a glitch. It was the algorithm's interpretation of "African," based on what it had seen before.


AI doesn't create. It remixes old ideas, already warped by colonial history, Western media, and digital trends, before presenting the result as you. Accuracy is not just about skin tone or hairstyle. To me, it's about whether AI sees your identity as real and complex or as something to be simplified.

What AI does to faces, TikTok does to sounds. The way MirageAI reconstructed my identity reminded me of what Kaye et al. call a "sociotechnical process," where attribution is a cultural logic built into the platform. Like TikTok's flawed automatic crediting system, AI's aesthetic ignores nuance. It assumes, replaces, and then beautifies. And just like TikTok creators reclaiming credit, I found myself pushing back against an AI image that was never mine to begin with, even though it had my name.

This links to Mariana Acuña’s thoughts in The Future of Storytelling. She explains that the tools we use to represent people do not just show reality. They shape it. Technologies are already curating immersive experiences built on certain assumptions and aesthetics. 

Watch Mariana Acuña's discussion here.


The Ethical Cost of Being “Fixed” by AI 

AI doesn't just miss the mark; it misleads. By reshaping me into something smoother, trendier, and more "marketable," it reinforced an idea I have spent years unlearning: that my natural features need adjusting to be worthy of attention. And if I, someone who knows myself, started second-guessing my jawline because of a bot, what happens to the people who only see themselves through AI lenses?

This is not only a tech problem but a cultural one. AI is not a neutral machine; its images reflect long-standing power dynamics. Who gets to be seen clearly, and who gets filtered?

When AI edits people like me to fit its aesthetic, it's not just beautifying; it's erasing. We cannot treat digital identity as an experiment when, for many, it is the only version of them the world ever sees.


I Am Not My Prompt

This project made me reflect on what it means to be visible online. MirageAI did not create an image of me from scratch. It reconstructed one using what it preferred. It reflected algorithmic beauty, not identity, and while some features were accurate, what it left out or over-styled says more about the dataset than about me.


References 

Acuña, M. (2020). The Future of Storytelling with Mariana Acuña – Opaque Studios and FROST. YouTube. https://www.youtube.com/watch?v=dHvWi2hH1Fs

Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of Machine Learning Research, 81, 1–7.

Canva. (n.d.). Design tool for visual communication. Retrieved April 10, 2025, from https://www.canva.com

Eke, D., & Ogoh, G. (2022). Forgotten African AI narratives and the future of AI in Africa. International Review of Information Ethics, 31, 1–8.

Hunter, A. (2022, December 9). AI selfies—and their critics—are taking the internet by storm. The Washington Post.

Kaye, D. B. V., Rodriguez, A., Langton, K., & Wikström, P. (2021). You made this? I made this: Practices of authorship and misattribution on TikTok. International Journal of Communication, 15, 3195–3215. http://ijoc.org/index.php/ijoc/article/view/16592

Lohr, S. (2018, February 9). Facial recognition is accurate, if you're a white guy. The New York Times.

Mokwala, P. (2020). Selfies as self-representation tools during the construction of narrative identities. University of the Free State.

