Me, But Not Really: ASSIGNMENT 4 - Theory of the Selfie, Part 3 (Transliteration of Critical Analysis for Blog)
1. My Creation Journey
I wanted to make a selfie of myself using AI. Not just any photo—something that felt like me, but created by technology. I started with ChatGPT, which helped me figure out which tools to try. I used Leonardo AI and Starry AI to create images.
At first, I was excited. I thought, “Wow, I’ll finally get to see a version of myself made by a machine.” I typed in descriptions, played around with styles, and tried many prompts. But the results were very mixed. Some images were completely off—they looked like a different person. Others had the right “vibe” but still didn’t feel like me.
One image came closest. It’s still not really me, but it’s the one I’ll share. I realized something important: no matter how advanced AI is, it still can’t fully understand or capture a person’s face, emotions, or identity. And maybe that’s okay. Maybe that’s actually beautiful.
2. Self-Representation and Identity
Looking at the final image made me reflect. It was “me,” but it wasn’t. It felt like a version of me created by someone who had never met me in real life.
The AI made my features smoother, my skin clearer, and changed small things—like my eye shape or hairstyle. It added filters and colors that I would never choose for myself. So while it was based on me, it didn’t feel true to who I really am.
That made me think about how we present ourselves online. On platforms like Instagram or TikTok, we also pick and choose how we look. We use filters, good lighting, or take 20 pictures before posting one. In a way, we already act like our own version of AI—trying to show our “best selves.”
But our real identity is more than just appearance. It’s in our voice, the way we laugh, how we move, our thoughts and feelings. An AI selfie can’t show that.
3. Ethical and Cultural Questions
While using the AI tools, I noticed some strange things. In a few images, my skin tone was lighter than it is in real life. Sometimes my face shape was changed. I didn’t ask the AI to do that—it just did.
That made me think about the data these AI tools are trained on. They learn from millions, maybe billions, of pictures scraped from the internet. But those pictures can carry hidden biases. For example, maybe they show one type of beauty far more than others, or underrepresent people from different cultures and backgrounds.
That’s a problem. If AI keeps showing one idea of what a “beautiful person” looks like, then it can hurt people’s self-image. It can also make some people feel invisible or misunderstood.
There’s also the question of privacy. When I used these apps, I had to give them some information—like prompts, style preferences, and sometimes even photos. But where does that data go? Who owns the image that the AI creates? These are important things to think about.
One powerful idea came to mind during this project. The oldest human art we’ve found is in caves—paintings of human hands. Tens of thousands of years ago, someone pressed their hand against a cave wall and said, “I was here.” And now, in 2025, AI still struggles to draw hands properly. That’s kind of amazing. Even with all our progress, there are still things only humans can do well.
4. Turning This Into a Blog
Writing this post was also part of the project. Usually, when I write for school, I use formal language and academic structure. But here, I wanted it to sound more real and natural—like a blog post you’d actually read online.
This wasn’t easy. I had to figure out how to explain my ideas simply while still keeping them thoughtful and meaningful. I tried to be honest and personal. I also thought about who might read this—friends, classmates, maybe strangers on the internet. I wanted it to make sense to everyone.
In class, we learned about McLuhan’s idea that “the medium is the message.” That means the way we share something matters just as much as what we’re saying. A blog post has a different feeling than a school essay. It’s more emotional, more visual, and more open. That changed how I told the story.
We also talked about transliteracy—being able to move between different forms of communication. I really felt that here. I had to translate my critical thinking into a blog format. It taught me a new way of expressing myself, using both words and visuals.
Final Thoughts
This project started as a tech experiment. I just wanted to see if AI could make a selfie of me. But it became something deeper.
It made me think about who I am, how I show myself to others, and what technology can—and can’t—do. AI might be smart, but it doesn’t have a soul. It doesn’t know my memories, my story, my inner world.
And maybe that’s what makes us human.
AI can draw a face, but it can’t feel what it means to be alive. It doesn’t know what it’s like to grow, to change, to make mistakes, or to find meaning in them. A digital image might get close, but it can never tell the whole story. Maybe that’s the difference. And maybe, that’s the beauty. AI might be able to mimic, but it can’t remember. It can’t hope.
Just like those ancient handprints in caves, we’re still trying to say, “I’m here. This is me.” Even if the tools change, the message stays the same.
And in that, there’s something AI still can’t replicate. Not yet. Maybe never.