A new chatbot, similar to ChatGPT, turns text into celebrity voices. Users can even train NoiseGPT to imitate their own voice, or the voice of their friends, family, or coworkers.
Imagine getting a voice message from your favourite US president, or from John Lennon or Elvis, sharing personal information that only close relatives would know.
ChatGPT, the artificial intelligence content generator from Microsoft-backed (MSFT) OpenAI, launched in November 2022.
In an interview with Yahoo Finance UK, NoiseGPT’s chief operational officer Frankie Peartree said: “We are training the AI to mimic around 25 celebrity voices at the moment, and we will soon have 100 plus celebrity voices available.”
On Monday, NoiseGPT was released on Telegram, allowing users to send social media messages to friends in the voice of celebrities.
The company’s website will soon provide instructions on how to train the app to use your voice.
Any smartphone that can download Telegram’s messenger application can use the app, giving it a wide potential user base.
In the future, AI applications may be able to imitate your own voice, that of your friends, or whomever you can get a voice sample from, raising concerns such as children hearing their parent’s voice in messages.
While deepfakes are not technically illegal in most jurisdictions, they can lead to mistrust, suspicion, and manipulation.
Deepfake technology enables violations of personal and intellectual property rights, NoiseGPT acknowledges. To avoid infringements, users will be able to select celebrity soundalike voices labelled, for example, “not Donald Trump” or “not Jennifer Lawrence”.
Is the world on the verge of deepfake chaos?
“I think it’s a positive thing. It will cause some chaos at first, but we will find a balance in the end. This was also a concern with Photoshop, for example,” Peartree told Yahoo Finance UK.
Because of these legal implications, censorship-risk mitigation will be factored into the application’s design: rather than being stored on a centralised server, it will use blockchain-based decentralised storage.
“The legal issues are one of the reasons we will decentralise fast, for training and API connections, so we cannot be censored,” he explained.
The decentralised nature of the new application means the computational burden of running it will be shared amongst computers across the globe, which will “run the models, training and API feed from people’s homes”. Users will be rewarded with NoiseGPT cryptocurrency tokens for running the program on their computers.
“People who create new popular voices for the app will also be rewarded in cryptocurrency,” Peartree said.
The cryptocurrency currently carries a 5% transaction tax, which will be removed in the future; all proceeds go to development and operations, and no tokens were allocated to the team.
Deepfake technology’s legal and societal implications
Our ability to manipulate the human voice could cast doubt on the veracity of information we receive online and through our phones.
Additionally, it has implications for nation-state interplay and how it can be used to influence rivals and gain public support.
While policymakers are working to mitigate the risks of deepfakes, current UK laws need to catch up.
These laws cover only real images, as in cases of so-called revenge porn, where ex-partners publicly share private and explicit material.
Deepfake material that uses a “target’s” likeness can be prosecuted only if the offender directly harasses the target or infringes copyright.
Deepfake technology could have legal and societal implications in the following areas:
Infringement of intellectual property rights – deepfake technology can impersonate a person who owns intellectual property.
Deepfakes can be used to create exploitative content, which violates an individual’s privacy.
Deepfakes can spread false information and damage a person’s reputation, potentially causing problems in their personal and professional lives.
An individual’s privacy and data protection can be compromised by deepfakes, making them vulnerable to identity theft and other forms of cybercrime.
Deepfakes can be used to manipulate public opinion during times of political tension, such as elections.
Deepfakes can spread false information and lead to a general distrust of news sources, individuals, and institutions.
If consumers are misled or misinformed by deepfakes, liability concerns can arise.
Deepfakes can cause geopolitical tensions and pose a threat to national security if they are used to spread false information or manipulate public opinion.
As technology advances, online video and audio communication could become increasingly dubious due to deepfakes.
NoiseGPT Products and Models
NoiseGPT describes itself as a decentralised AI platform free from bias and censorship, and a community-driven platform that rewards contributors for their work. Its products include:
1. Hyper realistic text-to-speech
2. Dialogue bots
3. Single-shot voice cloning