Video games must clarify AI use, according to new law

Even partial use of artificial intelligence must be disclosed, though precise application up for interpretation

A screen grab from Uncover the Smoking Gun, a video game that uses conversational artificial intelligence for interaction between players and nonplayer characters (ReLU Games)

A new law on the regulation of artificial intelligence mandates clear disclosure of AI use in a wide variety of products, including video games, a field where AI has played an integral role since its very inception.

Rep. Kang Yu-jung of the Democratic Party of Korea said Tuesday that the National Assembly Research Service’s reply to her inquiry suggested that video games are subject to the AI Framework Act, which is set to take effect in January next year. As such, video game developers that used AI in development must notify users that their product is based on artificial intelligence, and they must implement a risk assessment and management system.

Video games have used AI to control numerous interactive elements and improve the player experience since the medium's earliest days, from gradual adjustment of difficulty levels to the movement patterns of nonplayer characters. More recently, some video games have adopted conversational AI for dynamic storytelling, such as Uncover the Smoking Gun, published in 2024.

Public disclosure of AI use is not confined to games that actively use AI in the gameplay experience; it also covers those that use AI-generated images, sounds or 3D models. The new act stipulates that even partial AI use in a product must be disclosed, including for creative content.

“Video games that used an AI model partially to create text, images, sound or video can be considered an AI product, and their publisher can be regarded as part of the AI industry,” the parliamentary think tank was quoted as saying.

The National Assembly Research Service added that application of the new law may differ depending on how much generative AI has been used and how much human contributions factor into the product.

Creative industry struggling to find balance between AI, human input

“With the rapidly increasing application of AI in games, the boundary (between AI and the) creative realm, such as sound and images, is crumbling. … As the concerns of existing creators and the hopes of the industry clash, there need to be policies that protect the rights (of creators) while promoting the industry,” Rep. Kang said.

The Basic Act on the Development of Artificial Intelligence and the Creation of a Foundation for Trust was passed by the National Assembly in December and enacted in January. It is intended to provide legal guidelines for AI use in the creative industry, which has been quick to adopt the technology in music, movies, cartoons and other products.

The local creative industry has been mulling adaptive measures amid the increasingly wide use of AI to make what had previously been considered products of pure human imagination.

The Korea Music Copyright Association recently implemented a procedure that requires songwriters to verify that they did not use AI at all in writing their songs, in response to possible legal issues related to AI-generated songs. Its stance is that AI-created songs cannot be copyrighted.

A plenary session of the National Assembly is held on May 1. (Yonhap)
