This Employee Does Not Exist
People have been hiding behind fake online personas for decades – writing on message boards, making friends, and even building their identities on social media. Anonymity is one of the underlying characteristics of the Internet.
However, AI-generated faces and digital human personalities are increasingly providing real value across the Internet landscape. Companies are even hiring them for roles.
Digital Humans Online
Digital humans have been an Internet oddity for quite some time. I first covered physically non-existent virtual influencers like Lil Miquela and Hatsune Miku back in 2018. Lil Miquela modeled for magazines like Vogue, while Hatsune Miku sold out live concerts. More recently, FN Meka, a virtual rapper, gained widespread attention when it was signed to Capitol Records. While their images may be entirely fake, their influence is real.
Beyond the influencer role, companies like UneeQ and Soul Machines began creating digital humans for business, specifically ones that act as social media mascots and customer service agents. Their clients have included Samsung, KFC, and UBS.
But now we’re reaching a new era of these digital humans in business. Companies are showcasing these fake people as employees, without disclosing that they’re not real.
Digital Human Employees
This Business Insider article talks about an Internet freelancer/entrepreneur known as Albertina Geller, who teaches people how to be healthy, balance their gut, and improve their immune systems. But Albertina isn’t real. Or at least her front-facing image isn’t. It was created by a GAN (Generative Adversarial Network), a type of AI that is often used to create realistic imagery.
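To make the GAN idea a little more concrete, here is a minimal, illustrative sketch in PyTorch of a DCGAN-style generator: a network that maps a vector of random noise to an image. This is not the model behind Albertina's portrait, and it is untrained, so it produces colored static rather than faces. Real face generators rely on far larger, trained models, but the core mechanic is the same: sample some random numbers, get a picture of a person who never existed.

```python
# Illustrative sketch only: a small DCGAN-style generator (untrained).
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, latent_dim=100, channels=3, feat=64):
        super().__init__()
        self.net = nn.Sequential(
            # Project the 100-dim noise vector up to a 4x4 feature map...
            nn.ConvTranspose2d(latent_dim, feat * 8, 4, 1, 0, bias=False),
            nn.BatchNorm2d(feat * 8),
            nn.ReLU(True),
            # ...then repeatedly upsample: 8x8 -> 16x16 -> 32x32 -> 64x64.
            nn.ConvTranspose2d(feat * 8, feat * 4, 4, 2, 1, bias=False),
            nn.BatchNorm2d(feat * 4),
            nn.ReLU(True),
            nn.ConvTranspose2d(feat * 4, feat * 2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(feat * 2),
            nn.ReLU(True),
            nn.ConvTranspose2d(feat * 2, feat, 4, 2, 1, bias=False),
            nn.BatchNorm2d(feat),
            nn.ReLU(True),
            nn.ConvTranspose2d(feat, channels, 4, 2, 1, bias=False),
            nn.Tanh(),  # pixel values in [-1, 1]
        )

    def forward(self, z):
        return self.net(z)

# "Every face is just a point in latent space": sample noise, get an image.
g = Generator()
z = torch.randn(1, 100, 1, 1)   # one random latent vector
fake_image = g(z)               # shape: (1, 3, 64, 64)
print(fake_image.shape)
```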
She was, first and foremost, a prolific answerer of questions on health-related message boards. It wasn’t just gut health, either. On an obsessive-compulsive-disorder forum, Albertina offered feedback as to whether constant “sexual thoughts” constituted OCD. On a site called Anxiety Central she offered heartfelt responses to questions about everything from coronavirus tests to antibiotics. On the Q&A forum Quora she answered hundreds of queries, on topics as varied as constipation, rowing machines, and “some creative ideas for decorating cookies with sprinkles.” She edited Wikipedia pages, commented on cake recipes, and made precisely one post on a site called singaporemotherhood.com. – Business Insider
Geller originated on Generated Photos, a site that creates catalogs of human faces that don’t actually exist. They’re great for stock photography, design assets, and other media use cases. Interestingly, though, there are a lot of companies using these GAN images to represent their employees on their about pages.
Take Platinum Systems, "a team of passionate designers, developers, and strategists who create mobile experiences that improve lives," which showed a GAN-generated team of 12. Or biggerstars.com, a Hollywood news site whose entire editorial team was represented by fake images.
Whenever I dug into one of these companies, there was usually a kernel of reality behind them. Take Informa Systems, a company in Texas that sells law-enforcement training materials. The company’s site listed the Austin Police Department, the state of Nebraska, and the Los Angeles Police Department among its clients. According to Texas public records, the company really exists. And the police in Austin, Texas, really did contract for its services. But photos on its “Meet Our Team” page were almost all GANs, from CEO “Greg Scully” to the chief marketing officer “Roger Tendul.” (Tendul, a swarthy man with a beard and thick eyebrows, I’d seen before. His photo turned up on 30 other sites.) – Business Insider
One of the companies that used GAN images for its employee profiles said that, with so many part-time employees coming and going, it was simply easier to use generated images because they created uniformity.
In reality, it’s not that much different from the insurance company Lemonade modeling its AI chatbot, Maya, after its Chief Business Officer, Maya Prosor (who is a real person).
What's crazier to you: a virtual influencer that gains real attention and gets hired for campaigns, or a real person working at a company but using a generated photo as their identity?
Both of these scenarios are becoming more of a reality. With remote work dominating company culture, it's not uncommon to work with people you've never actually met.
I think the future of digital life is one where we must accept coexistence with AI-generated people, fabricated identities, and influencers who don't exist. What we must decide is whether we value the information, entertainment, or services they provide, knowing that they're not entirely real.