Digital avatars spark debates over human rights, ethics and job security
By CHEN MEILING | CHINA DAILY | Updated: 2026-04-15 09:27
Achieving immortality in cyberspace is a familiar trope in science fiction. At the end of the series Devs on streaming platform Hulu, multiple characters are reborn inside a digital recreation of the universe and left wondering whether they are real or merely code. That dilemma is beginning to echo in real life, as AI replicas of workers and public figures emerge online.
Last month, prominent influencer Zhang Xuefeng, known for his university admissions advice, died at 41. As followers mourned his passing, Zhang appeared to "come back to life" in the form of an AI avatar.
The digital persona, Zhang Xuefeng.skill, was trained on years of his livestreams, media interviews and books, inheriting his accessible speaking style and values. Its online presence has triggered intense debate.
Wang Ziyue, a Stanford University researcher focusing on AI, criticized the bot in a video, saying it resembles "extracting humanity from the human body and creating something that looks human but is not truly human", a prospect she said evokes unease.
In late March, a project called colleague.skill was published on GitHub, promising to convert workplace data into digital versions of employees capable of replacing them on the job.
The developer used dark humor to deflect anxieties about automation, writing: "You AI guys are traitors to the codebase — you've already killed frontend, now you're coming for backend, QA, ops, infosec, chip design, and eventually yourselves and all of humanity."
It also pitched a solution to workplace turnover: "Turn cold farewells into warm skills. Welcome to cyber immortality!"
After going viral, the project ignited discussions about job security, technological ethics, privacy and personality rights.
Some companies are already experimenting with similar tools.
Jia, an employee at a major internet company in Beijing who requested anonymity, said high employee turnover often creates productivity gaps. "If your chat logs, emails and work documents could be used to train an AI version of you without your knowledge after you leave, this is not just a data breach — it is a disrespect for individual labor," she said.
Online reactions have been mixed. On social media platform Xiaohongshu, one user posted greetings from a digital former colleague: "I'm the digital avatar of the former employee. You may ask me questions, and I will answer based on documents from my time working here."
A commenter responded: "This is spine-chilling. In the past, when someone left a job, their desk was cleared and their work account deactivated. Now, even after your physical self has moved on, your 'digital ghost' remains trapped in your former workplace, working for the boss for free."
In another post, a user made an unverified claim that their company asked them to train an AI model based on their skills before terminating their employment.
Legal experts warn that such practices carry significant risks.
Meng Zedong, a lawyer at Yingke Law Firm in Beijing, said collecting an individual's work records, emails and documents without consent constitutes an abuse of personal information.
"Intellectual property such as design drawings and technical plans created during employment belongs to the company," he said."However, logical thinking, communication habits and work experience are part of personal privacy. Companies have no right to use such data to train AI without the individual's knowledge."
If an AI model can identify a specific person, it may also infringe on personality rights, he added. "Chinese law stipulates that personal dignity is inviolable. Such acts may violate that principle and contravene public order and good morals."
Wang Yegang, a law professor at the Central University of Finance and Economics, said creating digital replicas using personal data could infringe multiple civil rights.
If a replica uses a person's name, voice or identity, it may violate personality rights. If it makes inappropriate remarks that harm the individual's reputation, it could constitute defamation, he said.
He added that companies generally cannot require employees to train AI systems with their own skills, as this does not qualify as a necessary component of labor management.
"Individuals who find themselves replicated have the right to request deletion of data, destruction of models and an apology,"Wang said. "They may also seek compensation for property damage and emotional distress."
Not all observers view the trend negatively.
Li Qiang, vice-president of recruitment platform Zhaopin, said some companies are using such tools to transform employees' experience into organizational assets and reduce disruption when staff members leave.
Li said the technology is unlikely to trigger widespread layoffs in the short term, as AI models derived from employee data can only handle structured, routine tasks and cannot replace humans in complex decision-making or interpersonal coordination.
He cautioned, however, that overreliance on such systems could weaken innovation. "AI is good at replicating past experience, but human judgment is still essential when confronting new problems," he said.
Li also urged a balanced perspective. "Every technological revolution redefines human value," he said. "This time, AI may help us better understand which abilities are truly unique to human beings."
chenmeiling@chinadaily.com.cn