During the past year, policymakers have intensified scrutiny of digital features and products, particularly those that affect children. Last year, for example, the Instagram app was accused of exacerbating eating disorders and body image issues in teenage girls. Policymakers have also been examining whether such products pose a safety risk to children, and now require digital companies to audit their features and products regularly and to record any potential harms those products cause, including the number of children affected. The requirements could be less stringent for smaller companies.
Although these requirements have been implemented, questions remain about whether they will be effective. Some lawmakers doubt that digital companies have the resources to audit all of their products and features, particularly those available to every customer, children included.