04 a pocket of a future
Life After Death
As a way of memorializing dead loved ones, bots built from the immense digital trail individuals leave behind become common. Preparing the bot is just another step of estate planning. They serve a variety of purposes: helping individuals through the grieving process, acting as archivists and historians for future generations, or extending one’s legacy. They can take different forms: text, voice, or visual.
The process of getting an AI is full of friction: Is it created before or after the individual passes away? Who controls its creation? Who has access to the AI afterwards? Is getting one even the right decision?
I explored the emotional space I wanted to capture with “Orpheus,” an imaginary AI service/subscription provider. Certain companies, like Facebook, are already well positioned to push a product like this, with their extensive AI expertise and access to huge amounts of personal data. A company like Second Life has an interesting platform it could leverage to approach the same product (an AI of the dead) from a different angle. With these two in mind, I position Orpheus as secure, respectful, and a touch stoic. To maintain a clear sense of separation between the living and the dead, the AI can only be interacted with through a chat interface, a layer of abstraction that maintains distance.
I also considered potential sideshows; specifically, is there a response to such a highly formalized process? It would be hard to imagine a DIY/hacker community not forming in response to privacy concerns. Of course, some will choose to opt out of creating an artificial version of themselves entirely. Others may choose to build it themselves and maintain full control over their data and digital manifestation, congregating in subreddits and Stack Exchange communities.
As for the actual experience of this future, I decided to build a service touchpoint for Orpheus: the flow a new customer goes through to set up their own AI.
Beyond the obvious, flowers and a neat headshot are the two visuals I associate most closely with funerals, so I chose white flowers as the main visual element for the interface. The lightness of the palette is intended to relieve the anxiety or awkwardness of having to confront one’s mortality.
The final screens explore potential side effects of using AIs as an archive of self. Beyond the obvious data-privacy implications, should individuals be able to edit their AIs to create a more perfect self? And for those interacting with the AI: would they want an idealized version of the individual, the way they want to remember them, or should the AI be as honest and true to life as possible?
Much like Facebook, individuals can also set privacy settings so their AI will know to avoid certain topics. For example, if an individual hasn’t come out to their parents, it’d be important for sexuality to not come up in conversations with their parents.
Who owns the AI and any related data? If an individual’s family stops paying for the service, does the AI disappear?
I intend to flesh out this future further by building an informal counterpart to Orpheus (e.g., a Stack Exchange community) and by continuing to develop the narrative of the service.