Posted


On Thursday, Microsoft researchers announced a new text-to-speech AI model called VALL-E that can closely simulate a person's voice when given a three-second audio sample. Once it learns a specific voice, VALL-E can synthesize audio of that person saying anything—and do it in a way that attempts to preserve the speaker's emotional tone.

Its creators speculate that VALL-E could be used for high-quality text-to-speech applications; for speech editing, where a recording of a person could be altered via its text transcript (making them say something they originally didn't); and for audio content creation when combined with other generative AI models like GPT-3.

 

https://arstechnica.com/information-technology/2023/01/microsofts-new-ai-can-simulate-anyones-voice-with-3-seconds-of-audio/?utm_source=join1440&utm_medium=email&utm_placement=newsletter
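
For the curious, here's roughly what that zero-shot workflow looks like. Microsoft hasn't released VALL-E, so this is a hypothetical sketch only: the CodecTTS class, its enroll/synthesize methods, the file name, and the ~75-tokens-per-second rate are all assumptions for illustration, loosely following the paper's description (encode a short clip into discrete codec tokens, then generate new tokens conditioned on those plus the target text).

```python
# Hypothetical sketch of a VALL-E-style zero-shot TTS workflow. Nothing here
# is Microsoft's actual API; all names and numbers are illustrative.

from dataclasses import dataclass


@dataclass
class SpeakerPrompt:
    """Discrete acoustic tokens from a short (~3 s) enrollment recording."""
    tokens: list


class CodecTTS:
    """Stand-in for a VALL-E-style model: text + speaker prompt -> waveform."""

    def enroll(self, wav_path: str) -> SpeakerPrompt:
        # Real system: an EnCodec-style encoder turns the clip into discrete
        # token IDs. Placeholder tokens keep this sketch runnable end to end.
        return SpeakerPrompt(tokens=[0] * 225)  # assume ~75 tokens/s * 3 s

    def synthesize(self, text: str, prompt: SpeakerPrompt) -> bytes:
        # Real system: a language model over audio tokens, conditioned on the
        # phonemized text plus the speaker prompt, emits new tokens that a
        # codec decoder converts back to audio. Here: empty placeholder bytes.
        return b""


model = CodecTTS()
prompt = model.enroll("three_second_sample.wav")          # the 3-second clip
audio = model.synthesize("Any sentence you like.", prompt)
```

The interesting design choice is that the three-second clip acts as a prompt to a language model over audio tokens rather than as fine-tuning data, which is why cloning a new voice needs no per-speaker training step.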

Posted (edited)

Heck, look at some of the deepfakes already out there with an impressionist doing the voice. It could be a double-edged sword: on one hand, as several have said, this could be used to make it appear certain people have done or said things they really haven’t. On the other hand, it could be used as another “out,” similar to claiming your NIL’s Twitter account was hacked: “I didn’t do that, they synthesized my voice” or “they deepfaked me.”

 

Edited by Cr1028
