
Google Play Store is introducing a way for users to conveniently report offensive AI-generated content

Google is updating its Play developer policies to improve the quality, safety, and privacy of applications on Android. The changes are designed to create a more secure and trustworthy environment for Android users.

One notable update concerns the use of generative AI models within applications. In line with its responsible AI practices, Google will now require developers to let users report or flag offensive AI-generated content from inside their apps. Users can thus give feedback on potentially harmful content without leaving the application, consistent with Google’s commitment to safe AI experiences and with developer policies that prohibit the generation of restricted content.
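As an illustration, the reporting flow could be as simple as a button in the app’s UI that submits the flagged content to the developer’s own moderation endpoint. A minimal Kotlin sketch follows; the endpoint URL and payload shape are hypothetical, since the policy mandates the in-app flow rather than any particular API:

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Hypothetical sketch of an in-app "report this content" action. The endpoint
// and JSON payload are illustrative; call this from a background thread or
// coroutine, since Android forbids network I/O on the main thread.
fun reportGeneratedContent(contentId: String, reason: String) {
    val connection = URL("https://example.com/api/ai-content-reports")
        .openConnection() as HttpURLConnection
    try {
        connection.requestMethod = "POST"
        connection.doOutput = true
        connection.setRequestProperty("Content-Type", "application/json")
        connection.outputStream.use { out ->
            out.write("""{"contentId":"$contentId","reason":"$reason"}""".toByteArray())
        }
        check(connection.responseCode in 200..299) {
            "Report failed: ${connection.responseCode}"
        }
    } finally {
        connection.disconnect()
    }
}
```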

Privacy protection is another focus of the updates. Google is introducing stricter requirements for app permissions related to photos and videos: apps may request broad access to these files only when it is directly tied to their core functionality. Apps with one-time or infrequent needs are instead directed to system pickers, such as the Android photo picker, which improves user data security by exposing only the items the user actually selects.
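In practice, the photo picker is available through the Activity Result API and requires no READ_MEDIA_* permission at all. A minimal Kotlin sketch, where the activity name and what is done with the result are illustrative:

```kotlin
import android.net.Uri
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.result.PickVisualMediaRequest
import androidx.activity.result.contract.ActivityResultContracts

class ProfilePhotoActivity : ComponentActivity() {

    // The system photo picker grants access only to the item the user picks,
    // so no READ_MEDIA_IMAGES / READ_MEDIA_VIDEO permission is required.
    private val pickMedia =
        registerForActivityResult(ActivityResultContracts.PickVisualMedia()) { uri: Uri? ->
            if (uri != null) {
                // Use the granted URI, e.g. display the image or upload it.
            }
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Normally triggered by a button tap; launched here for brevity.
        pickMedia.launch(
            PickVisualMediaRequest(ActivityResultContracts.PickVisualMedia.ImageOnly)
        )
    }
}
```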

The updates also address disruptive notifications. Google is introducing limits and a special access permission for full-screen intent notifications so that they are reserved for genuinely high-priority scenarios, such as alarms and incoming calls. Apps targeting Android 14 and above will need the user’s consent to post full-screen intent notifications unless their core functionality requires them.
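On Android 14 (API 34) and above, an app can check whether it holds this special access with NotificationManager.canUseFullScreenIntent() and degrade gracefully when it does not. A Kotlin sketch, assuming the required manifest entries (POST_NOTIFICATIONS, USE_FULL_SCREEN_INTENT) and a placeholder AlarmActivity:

```kotlin
import android.app.Activity
import android.app.NotificationChannel
import android.app.NotificationManager
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import android.os.Build
import androidx.core.app.NotificationCompat

class AlarmActivity : Activity() // placeholder activity shown full screen

// Posts an alarm-style notification, attaching the full-screen intent only
// when the user has granted the special access on Android 14+.
// Channel and notification ids are illustrative; assumes minSdk 26+.
fun postAlarmNotification(context: Context) {
    val manager = context.getSystemService(NotificationManager::class.java)
    manager.createNotificationChannel(
        NotificationChannel("alarms", "Alarms", NotificationManager.IMPORTANCE_HIGH)
    )

    val fullScreen = PendingIntent.getActivity(
        context, 0, Intent(context, AlarmActivity::class.java),
        PendingIntent.FLAG_IMMUTABLE
    )

    val builder = NotificationCompat.Builder(context, "alarms")
        .setSmallIcon(android.R.drawable.ic_lock_idle_alarm)
        .setContentTitle("Alarm")
        .setPriority(NotificationCompat.PRIORITY_HIGH)

    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.UPSIDE_DOWN_CAKE ||
        manager.canUseFullScreenIntent()
    ) {
        builder.setFullScreenIntent(fullScreen, true)
    } else {
        // Access not granted: fall back to a tappable heads-up notification.
        // (Settings.ACTION_MANAGE_APP_USE_FULL_SCREEN_INTENT opens the grant screen.)
        builder.setContentIntent(fullScreen)
    }

    manager.notify(1, builder.build())
}
```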

These changes underscore Google’s commitment to maintaining a secure, high-quality Android ecosystem, and developers are encouraged to bring their apps in line with the updated guidelines. For more details, developers can refer to Google’s help center article on the subject.
