Module 9 - Assignment 4: Theory of the Selfie Part 3 - Ginger Han

Taking Selfies with HereAfter AI: Exploring the Accessibility and Ethical Landscape of AI-Generated Avatars


Have you ever expressed your grief online? Mourning is a never-ending topic for human beings, and the longing to talk to deceased loved ones is an experience many of us share.

Nowadays, people often express mourning and grief through online platforms. It's common to see people post their condolences online after the passing of a loved one, and some even continue to interact with the accounts of deceased friends or family members. These actions represent one of the ways we express grief in the era of technology.

In the realm of artificial intelligence, the concept of selfies has transcended beyond mere human faces. Now, they encompass narratives, abstract compositions, and digital avatars formed from memories. The rapid evolution of AI continually challenges our imagination and understanding. In recent years, a poignant trend has emerged with the rise of griefbots, digital tools designed to commemorate and interact with deceased loved ones. Inspired by this, I embarked on creating a personalized digital selfie utilizing griefbot technology.

The proliferation of digital mourning tools like griefbots marks a profound socio-cultural shift, in which technology increasingly mediates our deepest emotional experiences. This phenomenon prompts philosophical inquiries into the interplay between technology and our emotional lives. As we navigate a digital age where even the deceased maintain an online presence, questions arise about how individuals retain control over their own narratives in death.

Griefbots operate by extrapolating digital footprints to construct digital avatars of the departed, facilitating interactive exchanges between mourners and virtual representations of their loved ones. This interaction, often through conversational interfaces, offers a surreal yet tangible connection to the departed.

In essence, griefbots bridge the gap between the tangible and the digital, offering a novel means of commemoration for those grappling with loss. HereAfter AI exemplifies this, providing a platform for preserving and sharing personal legacies in a structured and technologically mediated manner. By restricting interactions to pre-recorded stories, HereAfter AI ensures a controlled exchange between the user's digital representation and their loved ones.

Current research on griefbots primarily focuses on the ethical implications of these technologies, particularly regarding privacy and moral rights. Using HereAfter AI as a case study, my research aims to delve into the accessibility of AI-generated selfies through griefbots, ethical design considerations, and the security of creating digital avatars online. As we navigate this evolving digital landscape, understanding the intricacies of griefbot technology is paramount in shaping ethical and meaningful interactions in the realm of digital mourning.

HereAfter AI demonstration image


The online infringement bug has bitten hard, and it seems like everyone's feeling the sting. Just a simple emoticon or a selfie shared online can open the floodgates to a deluge of unwanted messages and even personal info leaks. Beefing up cyber-security sounds great, but sometimes it feels like it's at the expense of our privacy. You've got companies tracking our every click with sneaky little cookies, and governments peeking into our digital lives with wiretaps and whatnot. It's enough to make you want to unplug and hide under a rock!

But hold on to your hats, because here comes a tale straight out of a sci-fi novel! Picture this: a guy named Gabriel creates a platform called Memoriam that lets you access memories of the departed for free. And get this, he's got a gadget called the Persoc that projects hologram ghosts who tell their life stories (Gamba, 2022)! Spooky, right? But also kind of cool when you think about how it makes you ponder the power of memory.

Now, before you start envisioning hologram meetings at your office job, let's remember that these fancy tech toys aren't exactly mainstream yet. They're like the VIP lounge of the digital world – not everyone gets in. And that's partly because they're still a bit pricey and require some serious tech know-how. So hey, at least our privacy gets a bit of a breather from all the mainstream app shenanigans.

But here's where it gets really wild – they're using all online chatter to create chatbots, like griefbots, that mimic the way our loved ones used to talk. Some folks see it as a nifty new way to remember the departed, while others are raising their eyebrows at the ethical questions it raises. These griefbots aren't just digital shrines; they're like having a conversation with a ghost in the machine! And let's not forget, they're giving the deceased a voice, letting them steer the conversation from beyond the digital grave. So when it comes to digital avatars like HereAfter AI, it's not just about what we want to say – it's about giving the deceased the reins to their own virtual selves. Talk about taking control from beyond the veil!


Description of Creation Process

To create my own virtual avatar, I can follow a variety of provided prompts and video guidance in the app to share my memories (see image 1). The prompts cover a relatively wide range of aspects of life, such as “Childhood”, “Work”, “Relationships”, and “Personality”. Under each section, the sub-choices are more detailed. For example, when I choose “Work”, there are sub-choices like “General work story”, “Fulfilling Aspects”, “First Jobs”, and “Career Choice”, making the storytelling more comprehensive. This allows the deceased to recount their life experiences from a first-person perspective, while the structure itself also prevents listeners from over-interpreting or excessively imagining (see image 2).

image 1

image 2


Analysis of Representation

While these predefined choices might feel restrictive, they also set deliberate limits on the extent to which the bereaved can reconnect with the departed.

(1) Restricting Choices and Reconnection:

By setting limits on the interactions between the bereaved and the digital representation of the deceased, the app inherently restricts the depth of reconnection possible. While this might seem limiting, it also ensures that users engage with the app in a balanced manner. Without these restrictions, users might become overly reliant on the app as a sole means of connecting with their departed loved ones, potentially hindering their ability to grieve and move forward.

(2) Anti-Addiction Measures:

Introducing an anti-addiction mode akin to those found in gaming applications acknowledges the potential for users to become excessively dependent on the app. By implementing features that limit usage or encourage breaks, the app promotes healthy engagement with the digital memorialization process. This is crucial in preventing users from becoming too emotionally invested or reliant on the digital representation of the deceased, which could lead to negative impacts on their mental well-being.

(3) Balancing Usage:

The inclusion of anti-addiction measures underscores the app developers' commitment to fostering a balanced approach to grief and remembrance. While users need to have access to tools that facilitate mourning and connection with the deceased, it's equally vital to ensure that these tools do not become a crutch or substitute for healthy coping mechanisms. By promoting moderation and mindful engagement, the app aims to support users in their grieving process without enabling unhealthy dependencies or behaviors.


Incorporation of Feedback

In the process of both giving and receiving peer reviews, I learned that when providing feedback to peers, it's vital to aim for specificity, constructiveness, and a focus on content rather than the individual. Feedback should highlight both strengths and areas for improvement, offering actionable suggestions to enhance clarity, coherence, and persuasiveness. Peers should be encouraged to use credible references to bolster their arguments, integrating them seamlessly into their writing to demonstrate engagement with academic discourse, and to explore diverse sources that enrich analysis and promote deeper understanding. Ultimately, this kind of guidance helps peers navigate academic complexities, fostering their intellectual growth and contributing valuable insights to their field.

To take the feedback I received seriously and blend it into the final presentation of my post, I adjusted the overall structure of the article according to the assignment requirements and the reviewers' suggestions. I divided and reorganized content that originally belonged to a single module, and adopted a more relaxed and pleasant tone for the narration.


Ethical and Cultural Considerations

Ever since the inception of the first chatbot, ELIZA, designed by Joseph Weizenbaum in 1966 to simulate a psychotherapist, rapid development in artificial intelligence has transformed our interactions with machines. In 2015, after the sudden and unexpected death of Roman Mazurenko in a car accident, Eugenia Kuyda felt a compelling desire to have one last conversation with her dear friend. Rereading Roman's past text messages again and again, she began to imagine using them as the foundation for a chatbot capable of simulating his conversational style (Newton, 2016). Drawing on over 8,000 lines of text messages from Roman's various conversations and employing a neural network developed at her artificial intelligence startup, Kuyda successfully constructed such a chatbot (Elder, 2020), which later evolved into Replika. The core of Replika is a messaging app where users spend tens of hours answering questions to build a digital library of information about themselves, and it is easy for someone to use the app to replicate a deceased loved one and talk to them.

The ethical problem with these griefbots is obvious, too. As Elder has argued, users risk becoming “mired in grief but drawn back into the pseudo-relationship, unable to move on but unfulfilled by the facsimile of a loved one” (Elder, 2020).

However, HereAfter AI serves users in a different way, allowing the “deceased” to make the initial move for the bereaved to memorialize them. One notable difference in HereAfter AI's approach is its use of recorded voice, rather than text-based exchanges or synthesized text-to-speech. Therefore, as a user, the first time I log in to the app, I am the “deceased” one, ready to record my life stories and memories by speaking to the built-in AI assistant.

In new media studies, identity construction theory focuses on how individuals shape and express their identities through social interactions and cultural environments, a process automatically assumed to be a privilege of the living. Preliminary findings reveal that griefbots predominantly target the mourners, simulating the deceased through their digital footprints.

HereAfter AI, however, introduces a different dynamic by having the deceased share their life stories themselves, offering a new approach to problems such as mourners becoming addicted to or excessively dependent on griefbots. This study not only contributes to our understanding of how technology shapes our emotional experiences of grieving but also highlights the unique features and ethical considerations of griefbots. HereAfter AI, with its focus on two-way communication, recorded voice, and user-initiated memorialization, demonstrates a novel approach that subtly balances technological innovation with ethical considerations in the delicate process of grief and memorialization. To some extent, this application truly gives the deceased the agency to tell their own story, to take the selfies they like, rather than becoming faded old photos pieced together in someone else's memory.

On a bigger scale, there are socio-cultural impacts as well. The emergence of digital tools for mourning, such as AI-generated griefbots and related applications, forms a unique socio-cultural phenomenon that reflects a significant shift in the human experience. Technology now influences our interactions with the world even at the level of deeply personal feelings. Digital mourning is not just an individual experience; for me, it raises philosophical reflections on the role of technology in shaping our emotional lives. I think it's valuable to investigate how apps, games, VR, and AI facilitate grief expression and digital memorialization, because doing so provides insights into the complex relationship between media, technology, and human emotions, as well as into societal attitudes towards death, remembrance, and the evolving nature of human connections.


Transliteracy

The translation of content from a traditional critical analysis to a blog format demonstrates the adaptability of communication to different mediums and audiences. While the traditional analysis focuses on formal language and objective evaluation suited for scholarly audiences, the blog adopts a more informal and engaging tone to appeal to a broader readership. In the context of digital communication theories, this shift emphasizes the importance of considering the characteristics and preferences of the target audience. Different mediums offer unique affordances and constraints that influence message delivery, highlighting the need to tailor communication strategies accordingly for effective engagement in the digital age.


References:

1. Allahrakha, N. (2023). Balancing Cyber-security and Privacy: Legal and Ethical Considerations in the Digital Age. Legal Issues in the Digital Age, 4(2), 78-121. Retrieved from https://lida.hse.ru/article/view/17666

2. Elder, A. (2020). Conversation from Beyond the Grave? A Neo-Confucian Ethics of Chatbots of the Dead. Journal of Applied Philosophy, 37(1), 73–88.

3. Gamba, F. (2022). AI, mourning and digital immortality. Some ethical questions on digital remains and post-mortem privacy. Études sur la mort, 157, 13-25. https://doi.org/10.3917/eslm.157.0013

4. Morse, T. (2023). Digital necromancy: users’ perceptions of digital afterlife and posthumous communication technologies. Information, Communication & Society, 1–17. https://doi.org/10.1080/1369118x.2023.2205467

5. Jiménez-Alonso, B., & Brescó de Luna, I. (2023). Griefbots. A New Way of Communicating With The Dead? Integrative Psychological and Behavioral Science, 57, 466–481. https://doi.org/10.1007/s12124-022-09679-3

6. Newton, C. (2016). Speak, Memory. When Her Best Friend Died, She Used Artificial Intelligence to Keep Talking to Him. The Verge. https://www.theverge.com/a/luka-artificial-intelligence-memorial-roman-mazurenko-bot
