When the two largest mobile app marketplaces, the App Store and Google Play, made drastic changes to their children’s app development guidelines last year, they left developers unsure how to maintain their ad-supported business models. Yet the need to protect children’s data is pressing.
While building a successful children’s app involves some additional obstacles, it’s far from impossible. The purpose of this post is to provide more insight into how to create an app for kids that is fun for them but also appropriate and legal. There are many guidelines you need to follow when developing apps that meet COPPA and PII compliance requirements.
In-app advertising: How did we get here?
Most of the new Apple and Google Play Store guidelines relate to targeted, in-app advertising. Specifically, they limit the types of data that can be transmitted to ad-serving platforms and restrict which ad-serving platforms developers can work with. Clearly this is a problem for the many children’s app developers who rely on this model to monetize their business. When it comes to protecting our children online, there are no hoops too big for us to jump through. The challenge is creating a kid-friendly app that still generates enough revenue to let us keep doing so.
Why has in-app advertising become so ubiquitous in the kids’ app space? Let’s look at all of the monetization options (as defined by the App Store):
- Paid - A one-time purchase that gives users full functionality.
- Freemium - An app that is free to download that offers optional in-app purchases to access additional content and features.
- Subscription - In-app purchases need to be renewed after a certain duration for users to continue to access the content.
- Paymium - Users pay for the app and have the option to purchase additional features in the app.
- Free - Free to download, but ads are displayed within the app.
While all of these models have found success in the general app marketplace, kids’ apps have to be thought of differently. First, like any children’s product, the customer and the consumer are not the same (parent/guardian vs. child). It’s difficult to generate significant revenue from paid or subscription apps because parents are wary of putting more money toward games or programs that their child may only play once or twice.
In addition, the freemium/in-app purchase model has come under fire because of deceptive practices by some developers. Regulators have accused some children’s app developers of duping kids into racking up hundreds or thousands of dollars in charges on a parent’s credit card.
So this presumably leaves free apps with advertising as the happy medium. Kids can access lots of apps, and parents don’t have to pay lots of money for this access. This not only lets kids enjoy the apps but also promotes an environment that doesn’t require additional purchases to progress in the games.
The regulatory risks of this business model
Apple and Google’s more stringent guidelines are meant to keep developers in compliance with the Children’s Online Privacy Protection Act (COPPA). The law contains provisions related to parental consent and protecting the “confidentiality, security, disclosure, and integrity” of personally identifiable information (PII) for children under the age of 13.
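In practice, COPPA’s under-13 threshold is often implemented as a neutral age screen: ask for a birth date without hinting at a “correct” answer, then route under-13 users into a flow that requires verifiable parental consent before any PII is collected. Here is a minimal sketch of that age check in Python (the function and constant names are ours, not from any real SDK):

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA's provisions apply to children under 13


def age_on(birth_date: date, today: date) -> int:
    """Whole years elapsed between birth_date and today."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)


def requires_parental_consent(birth_date: date, today: date) -> bool:
    """True if the user is under 13, meaning PII collection must wait
    for verifiable parental consent."""
    return age_on(birth_date, today) < COPPA_AGE_THRESHOLD


# Example: a user born in mid-2012 is under 13 in early 2020.
print(requires_parental_consent(date(2012, 6, 1), date(2020, 1, 15)))  # True
```

Note that the age screen itself must be neutral; an app that nudges kids toward entering an older birth date would not satisfy the rule.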
Both companies were under pressure from federal regulators and children's advocacy groups to create and enforce more stringent guidelines around secure PII data integration for developers selling through their stores.
This heightened enforcement was not without reason. A 2018 analysis of more than 5,800 kids’ apps in the Google Play Store found that the vast majority were in violation of COPPA in one form or another:
- 5% shared location or contact information without explicit parental consent.
- 40% shared PII with third parties without applying reasonable security measures.
- 19% collected PII via SDKs whose terms of service explicitly prohibit their use with children's products.
- Of the apps that shared resettable IDs with advertisers (3454), 66% transmitted other, non-resettable persistent identifiers “negating any intended privacy-preserving properties of the advertising ID.”
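That last finding is worth unpacking. The advertising ID is designed so a user can reset it and sever the link to their past activity; but if an app also transmits a non-resettable identifier alongside it, a third party can trivially re-link the old and new profiles. A hypothetical sketch (the field names here are illustrative, not from any real ad SDK):

```python
# Illustrative only: how a non-resettable ID defeats an ad-ID reset.

# Records a third party received before and after a user reset their ad ID.
before_reset = {"ad_id": "aaaa-1111", "hardware_id": "HW-42", "events": 310}
after_reset = {"ad_id": "bbbb-2222", "hardware_id": "HW-42", "events": 12}

# Keyed only on the resettable ad ID, these look like two different users.
assert before_reset["ad_id"] != after_reset["ad_id"]

# But the stable hardware ID lets the third party merge the profiles again,
# negating the privacy-preserving purpose of the reset.
relinked = before_reset["hardware_id"] == after_reset["hardware_id"]
print("Profiles re-linked:", relinked)  # Profiles re-linked: True
```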
In 2019 Google paid the largest COPPA violation fine in history ($170 million) for practices related to its YouTube platform. It should come as no surprise then that the app marketplaces are being more proactive in enforcing COPPA compliance.
What is the concern around collecting and sharing data?
Let’s start by looking at how prevalent data sharing is in the mobile app industry. A 2018 University of Oxford study of more than 950,000 apps in the Google Play store found:
- The median app transfers data to 10 third parties.
- 20% of apps transfer data to more than 20 third parties.
- The majority of data is sent to ad serving platforms and various analytics and advertising-related subsidiaries of Google, Facebook, Twitter, Verizon, Microsoft, and Amazon.
The type of data collected includes location, device ID, connection via Wi-Fi vs. cellular, time spent in the application, events, and contact information. While collecting this information is not necessarily a COPPA violation, sharing with the wrong parties could be.
It’s important to look at data sharing from the point of view of a parent. A game sending their child’s personal information to upwards of 20 mysterious third parties is scary. And sending that data with no parental consent seems predatory. While developers may be collecting and sharing data to ultimately improve the app experience, parents may not understand that, especially when it is not explained explicitly.
One clear area where we see room for improvement is simply awareness and understanding. Children’s app design is, in most cases, not meant to be predatory. Most companies creating apps for children do so for profit, but also to create entertaining games and apps that kids enjoy. Parents should understand what their kids are doing and monitor app downloads to ensure nothing problematic is installed. On the other hand, it’s also up to app developers to follow PII compliance requirements and to ensure that apps that are not child-friendly are clearly identified as such.
How you can make PII compliant apps
When Apple first announced its development guideline changes in June 2019, it took a hard stance stating that no third-party advertising or analytics tools could be used with children’s apps. Fortunately, after hearing from many developers, the company backed down and is allowing limited third-party analytics and advertising. The ad platforms, however, must “have publicly documented practices and policies and also offer human review of ad creatives.”
While that is some relief for developers, it is also limiting. But this doesn’t mean that the in-app ad business model has to be thrown out the window. It just may be time to rethink how data collection fits into your business.
You use data collection to improve the product and customer experience, but is every type of data you collect relevant to that goal? Your data collection practices should be for the benefit of the child and their adult guardian. This builds trust.
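One way to put that principle into practice is a field allowlist: decide up front which fields actually serve product improvement, and drop everything else before an event ever leaves the device. A minimal sketch, with illustrative field names of our own choosing:

```python
# Fields we have decided genuinely serve product improvement.
ALLOWED_EVENT_FIELDS = {"event_name", "screen", "session_length_s", "app_version"}


def minimize(event: dict) -> dict:
    """Keep only allowlisted fields; location, contacts, device IDs,
    and anything else are silently dropped before transmission."""
    return {k: v for k, v in event.items() if k in ALLOWED_EVENT_FIELDS}


raw_event = {
    "event_name": "level_complete",
    "screen": "level_3",
    "session_length_s": 412,
    "app_version": "2.1.0",
    "precise_location": (51.75, -1.25),    # never needed for this goal
    "contact_email": "parent@example.com",  # PII: do not transmit
}

print(minimize(raw_event))
```

The design choice here is that minimization is the default: a new field added to an event is dropped unless someone deliberately allowlists it, rather than transmitted unless someone remembers to block it.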
Another way to build trust is to limit what you share with third parties by keeping personal information under control in a private cloud. Doing so will allow you to conduct business as usual while transmitting PII only when you need to.
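A common pattern for this is tokenization: the real PII stays in your private environment, and third parties receive only an opaque token that is meaningless outside it. A simplified sketch, assuming an in-memory store standing in for what would in production be a secured private service:

```python
import secrets


class PiiVault:
    """Toy stand-in for a private PII store: real values stay inside,
    outsiders see only opaque random tokens."""

    def __init__(self):
        self._store = {}  # token -> real value

    def tokenize(self, value: str) -> str:
        """Store the real value privately and return an opaque token."""
        token = secrets.token_urlsafe(16)
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        """Resolve a token back to the real value, inside your environment only."""
        return self._store[token]


vault = PiiVault()
token = vault.tokenize("child_player@example.com")

# Only the token leaves your environment; on its own it reveals nothing.
analytics_payload = {"user": token, "event": "level_complete"}

# When you genuinely need the real value (e.g., a parental-consent email),
# resolve it inside your own environment.
assert vault.detokenize(token) == "child_player@example.com"
```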
Ultimately, app developers bear a significant responsibility to maintain compliance and follow the rules. Apps designed for kids face intense scrutiny from regulators, marketplaces, and parents alike. Following children’s app development guidelines and creating apps that are both enjoyable and safe is of the utmost importance.