4 Moderation Myths and What Apple Really Wants from Your App

According to a 2019 report from BusinessofApps, the number of apps in the iOS App Store was 2.2 million and climbing. With current events driving businesses to bring even more interactions online through apps, that number will likely surge over the next few months.

There may be a sense of urgency to get your business’ app on Apple’s App Store, but don’t be in such a rush that you fail to meet Apple’s App content moderation guidelines. The first step in successfully launching an app is to be aware of Apple’s strict requirements for the moderation of any user-generated content (UGC) produced by your app.

Without the right checks and balances in place, your app could be rejected or banned by Apple. In this post, we’ll discuss what Apple is and isn’t cracking down on, and how to moderate your content and make adjustments before Apple steps in.

What Does Apple Really Want From Your App?

Ready to find out what Apple really wants from your app? Let’s get started.

Moderation Myths: #1. My app is new, so it doesn’t need a mechanism for reporting offensive content and responding to concerns

Actually, it does. To prevent abuse, your app needs the following if it contains UGC or pulls content from social media networks:

  • The proven ability to block users from the service if they have been deemed abusive.
  • A filtering method to catch objectionable material before it gets posted to the app.
  • Complete information and accurate screenshots to prevent a metadata rejection.
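The first two requirements above — blocking abusive users and filtering objectionable material before it posts — can be sketched in a few lines. This is a hypothetical, minimal illustration (the function names, the blocklist, and the keyword approach are all assumptions, not Apple's requirements for how you implement the checks; production apps typically combine algorithmic filtering with human review):

```python
# Minimal sketch of the two moderation checks described above.
# BLOCKED_USERS and OBJECTIONABLE_TERMS are illustrative placeholders.

BLOCKED_USERS: set[str] = set()
OBJECTIONABLE_TERMS: set[str] = {"badword1", "badword2"}  # placeholder terms

def block_user(user_id: str) -> None:
    """Bar a user who has been deemed abusive from posting."""
    BLOCKED_USERS.add(user_id)

def may_post(user_id: str, text: str) -> bool:
    """Reject posts from blocked users or posts containing flagged terms."""
    if user_id in BLOCKED_USERS:
        return False
    lowered = text.lower()
    return not any(term in lowered for term in OBJECTIONABLE_TERMS)
```

In practice the filtering step would run server-side before content is published to the app, and the term list would be replaced by a proper moderation service.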

Although a metadata error from Apple means that your app’s review has been interrupted (which is preferable to outright rejection), it does delay your app’s launch on the App Store. A metadata rejection occurs when Apple’s App Review team decides that your app has metadata items that fail to comply with its guidelines, such as the app name, category, keywords, or rating. Apple will change your app’s status from “In Review” to “Metadata Rejected” if this is the case.

From the start, provide accurate keywords and information that do not violate the App Store Review Guidelines. Double-check that you’ve provided a demo login account, sufficient configuration instructions, and published contact information so users can reach you easily, and that your submission form is not missing any required information.

Even if your app carries an age rating higher than 4+, keep it free of misleading or inappropriate screenshots. Additionally, all screenshots should accurately and thoroughly communicate the function of your app, as an irrelevant screenshot can be grounds for rejection by Apple’s review team.

Moderation Myths: #2. My app features content that is appropriate and innocent, so I don’t need moderation

The truth is that where there is content being shared, moderation is needed. In the case of an app that allows UGC, you cannot always anticipate what your users will generate.

Consider the ban of TikTok that occurred in April of 2019. The popular app was removed from the Google Play Store and iOS App Store on the grounds that its user-generated content “allegedly degraded culture and encouraged pornography.” Although the ban was lifted after a week, it wasn’t until TikTok agreed to use AI and humans for content moderation that the app was restored to both app stores.

To prevent users from posting unacceptable content, moderation must be a priority. If you don’t moderate content, the App Store is prepared to take precautionary action, removing apps that allow unsuitable content.

Moderation Myths: #3. My app doesn’t technically generate user content on Instagram. It just uses InstagramKit to pull in the content, so I’m in the clear

The App Store may not see it that way. Apple requires a way to flag objectionable content and remove offensive users, but Instagram’s API does not support flagging content. So even if your app does not generate this user content on Instagram, Apple may reject it anyway.

Although one could argue that InstagramKit cannot accomplish this as long as the Instagram API has no endpoints for flagging content, Apple regularly rejects apps despite this argument. To comply, many developers add a "flag inappropriate content" feature or similar functionality that lets users flag content from within the app and hide it.
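Because Instagram's API offers no flagging endpoint, the flag-and-hide behavior has to live entirely inside your own app. A minimal sketch of that logic, with assumed names and a deliberately aggressive one-flag threshold (neither is prescribed by Apple or Instagram):

```python
# Hypothetical in-app flag-and-hide logic for content pulled from an
# external feed. Class, field, and threshold values are illustrative.
from dataclasses import dataclass, field

@dataclass
class FlaggablePost:
    """A post pulled from an external feed, with in-app flagging state."""
    post_id: str
    text: str
    flags: set = field(default_factory=set)  # IDs of users who flagged it
    hidden: bool = False

FLAG_THRESHOLD = 1  # hide as soon as one user flags; tune to taste

def flag_content(post: FlaggablePost, reporter_id: str) -> None:
    """Record a flag and hide the post once the threshold is reached."""
    post.flags.add(reporter_id)
    if len(post.flags) >= FLAG_THRESHOLD:
        post.hidden = True

def visible_posts(posts: list[FlaggablePost]) -> list[FlaggablePost]:
    """Return only posts that have not been hidden by moderation."""
    return [p for p in posts if not p.hidden]
```

The key point is that the hiding happens on your side, regardless of whether the upstream platform ever learns about the flag.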

To increase the chances of getting your app approved and save yourself the hassle of creating content flagging procedures, consider partnering with an algorithmic and live moderation team trained to identify images and text comments that Apple may regard as unsuitable.

Moderation Myths: #4. My app doesn’t need to have an End User License Agreement. I can skip that, right?

Wrong! Any app that allows UGC must have an End User License Agreement (EULA) in place for Apple to approve it. Don’t assume that because you have a Terms and Conditions agreement in place, a EULA is unnecessary. While both are agreements, they are not interchangeable.

When a user downloads your mobile app, they acquire a limited license to use it via the EULA. This is an agreement users accept when they sign up for your app, granting them an app license while covering the rights and restrictions attached to that license. A Terms and Conditions agreement covers everything else, including subscription plans.

A EULA limits your liability and establishes restrictions on use of the app. It states that there is a zero-tolerance policy for offensive content and details what that includes. It also states that your app will moderate all content and determine whether it is appropriate to remain on the app.

Your EULA should include a Termination of License clause, which gives you the right to discontinue service or revoke an individual license. Present your EULA at the start of installation or download, or in your app’s Legal menu, so users can easily agree to it.

Moderation Myths: Conclusion

Although it’s tempting to rush through the process of building your app in an effort to launch it immediately, investing a little extra time in making sure that your app is compliant with Apple’s requirements can spare you the headache of a rejection or a ban further down the road.

With Apple’s guidelines in mind, you can navigate its stringent app requirements around moderation and reap the benefits of not only having an app in Apple’s App Store but knowing you have created a safe environment for your new users.