Critics Say Sweeping Artificial Intelligence Regulations Could Target Parody, Satire Such as South Park, Family Guy

"It's just not workable," a fellow at the R Street Institute, Shoshana Weissmann, tells the Sun. Although AI impersonation is a problem and fraud laws should protect against it, "that's not what this law would do," she says.

The bill defines "likeness" as "the actual or simulated image or likeness of an individual, regardless of the means of creation, that is readily identifiable by virtue of face, likeness, or other distinguishing characteristic." It defines "voice" as "any medium containing the actual voice or a simulation of the voice of an individual, whether recorded or generated by computer, artificial intelligence, algorithm, or other digital technology, service, or device to the extent that an individual is readily identifiable from the sound of it."

"There's no exception for parody, and basically, the way they define digital creations is just so broad, it would cover cartoons," Ms. Weissmann says, adding that the bill would extend to shows such as South Park and Family Guy, both of which do impersonations of real people.

"It's understood that this isn't the real celebrity. When South Park made fun of Ben Affleck, it wasn't really Ben Affleck. And they even used his picture at one point, but it was clear they were making fun of him. But under the pure text of this law, that would be unlawful," she says.

If the bill were enacted, "someone would sue immediately," she says, adding that it would not pass First Amendment scrutiny.

Lawmakers should be more careful to ensure these regulations don't run afoul of the Constitution, she says, "but instead, they have haphazard legislation like this that just doesn't make any functional sense."

While the bill does include a section relating to a First Amendment defense, Ms. Weissmann says, "it's essentially saying that after you're sued under our bill, you can use the First Amendment as a defense. But you can do that anyway. The bill doesn't change that."

Because of the threat of being dragged into court and spending thousands of dollars on lawyers, the bill would effectively chill speech, she notes.

One of the harms defined in the bill is "severe emotional distress" of any person whose voice or likeness is used without consent.

"Let's say Ben Affleck said he had severe emotional distress because South Park parodied him," Ms. Weissmann says. "He could sue under this law. That's insane, absolutely insane."

The bill would be more workable if it were made more specific and narrowed to actual harms, and if it ensured that people couldn't sue over very obvious parodies, she says. As it's drafted now, however, the bill is going to apply to a lot more than lawmakers intended, she adds.
