Last year, while on stage giving my TEDx Talk on the immortalising power of VR, I found myself saying this:
“Imagine if, ten years after her death, I could relive the memory of my grandma teaching me how to waltz in her living room? And imagine if, even decades after she passed away, she could teach her grandson, who she never met, how to waltz in that same living room. That’s the thing with VR: we’re not just immortalising memories, we’re immortalising people too!”
Fast-forward six months and I find myself twisted up with a weird sense of guilt and horror after seeing my imaginings brought to life. My prophetic wonderings were realised in the form of Jang Ji-sung, reliving the memory of a day at her neighbourhood park with her deceased six-year-old daughter, through a VR headset.
The VR piece, titled ‘I Met You,’ was created for a TV show in South Korea and became a lightning rod for criticism and discussion. Was this a psychologically traumatising experiment with damaging and negligent consequences? Or a glimpse into the potential of VR to immortalise our memories and our loved ones, even the possibility of reanimating them?
It seems we have arrived, earlier than I ever expected, at a crossroads where innovation and ethics intersect.
And while Silicon Valley is full of stories of optimistic good intentions quickly descending into a morally corrupt dystopia, VR has a chance to do things differently.
As a pioneer and champion of VR, I took a step back to take a holistic look not just at the South Korea experiment, but at what it implies for the VR industry as a whole, and at how we can overcome the human challenges of creating content for this powerful medium in a way that protects and promotes our audiences’ wellbeing.
Firstly, we need to consider consent.
We’ve seen how big tech has clandestinely acquired our data, images, search history and geolocation, then misused, sold and commoditised it all without the public’s awareness or consent. Offering your likeness in VR has to be different. It has to be a conscious act of consent made with clarity and full understanding.
So, what might that look like?
Rather than signing their lives away, we might see people signing over their likeness by giving loved ones permission to reanimate them in the event of their death. Finer details, like whether their effigy can be used for general purposes as an avatar or exclusively to engineer VR experiences from lived memories, as in the South Korea experiment, might be included as caveats. It’s not far-fetched to think that this might even become an extension of the opt-in schemes organ donors sign up to.
But who has the right to whose memories? Would you want your partner, mother or friends to be able to relive memories of your embarrassing wedding dad-dancing or school plays? Could it be possible to veto memories you’d rather forget to prevent them becoming your VR legacy?
The issue of consent is further complicated in the South Korea experiment because the deceased person in question is a child. Consent for children legally falls to parents and guardians in most cases, and it’s likely VR would be no different. By agreeing to take part in the TV show, Ji-Sung had, explicitly or not, consented to having her daughter’s image used in this way.
Once we’ve accounted for consent, there’s still a lot to consider. Whether or not it is legal to create a VR experience isn’t a satisfactory criterion when it comes to safeguarding our audience.
The next thing we need to examine is acceptance: whether it is safe and ethical to recreate someone in VR depends on whether the viewer has truly accepted their loss.
Experts have suggested that the South Korea experiment could be considered ethical if the mother had fully accepted that her daughter had passed away. I’m no grief counsellor, but Princeton neuroscientist Michael Graziano told Dell Technologies in December 2019: “Since you know the person is gone, you accept the virtual equivalent for what it is — a comforting vestige.”
With acceptance comes healing, so a VR experience of a deceased loved one is more likely to spark positive feelings of nostalgia than despair and grief while the viewer is immersed in the reanimated memory. Full acceptance, then, is fundamental for viewers to safely engage in this type of VR experience. Graziano gave it his ethical seal of approval, stating that if acceptance is a prerequisite then “there is nothing wrong or unethical about it.”
So, what happens once the headset comes off?
We have an opportunity, as creators, to foster new, transgressive experiences that offer the impossible and bring to life worlds beyond our own. But we also have a responsibility to ensure that we consider our audience’s best interests beyond the virtual world.
That’s where aftercare comes in: it needs to be appropriately signposted or, where necessary, facilitated.
All appropriate measures should be taken to put aftercare in place for potentially vulnerable audiences. Neurological studies show that our brains respond to VR experiences much as they do to things we encounter in reality, so participating in a sensitive VR creation can have traumatising effects on our psyche. Overlooking this represents a huge threat to our industry.
To avoid corporate negligence we need to make sure that a network of support is in place for anyone affected by our creations. This could include helplines, therapy sessions or time with grief counsellors.
As a final backstop for the VR industry, before diving head-first into storyboarding the next ‘I Met You,’ we should consider Jeff Goldblum.
That’s right, Jeff Goldblum. Decrying the skewed moral compasses of genetic scientists in all his unbuttoned-shirt glory in Jurassic Park, Jeff delivers this timeless maxim:
“Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.”
Keeping a continual temperature check on the state of the VR industry means asking ourselves: why are we doing this? And are we doing more harm than good? That pause for thought and self-regulation is the difference between harming our audience in the name of innovation and building a better, more inclusive, interactive world.