Meta announced that it will now require advertisers to disclose when they digitally create or alter a political or social issue ad in certain cases, including with artificial intelligence (AI).

“We’re announcing a new policy to help people understand when a social issue, election, or political advertisement on Facebook or Instagram has been digitally created or altered, including through the use of AI,” Meta said in a blog post.

The new policy will take effect next year and will apply globally.

Advertisers will have to disclose whenever an ad contains a photorealistic image or video, or realistic-sounding audio, that was digitally created or altered.


This includes depicting a real person as saying or doing something they did not say or do; depicting a realistic-looking person who does not exist or a realistic-looking event that did not happen; altering footage of a real event; or depicting a realistic event that allegedly occurred but is not a true image, video, or audio recording of that event, the company said.

Advertisers running these ads do not need to disclose when content is digitally created or altered in ways that are inconsequential or immaterial to the claim, assertion, or issue raised in the ad.

Meta said that it will add information to the ad when an advertiser discloses in the advertising flow that the content is digitally created or altered.
