Nude photos and other potentially objectionable or illegal materials have been showing up in the iPhone application store in recent weeks, raising questions about Apple’s ability to control iPhone content.
In the most recent example, a nude photo of a young woman, reported to be 15 years old, showed up on an iPhone application called “BeautyMeter,” according to Wired.com and Krapps.com, an app review site. The photo, which was submitted by one of the photo-sharing app’s users, prompted Apple to remove the entire mobile application from its online store.

Funnymals, maker of the BeautyMeter app, which lets users post photos of people and then rank them based on “hotness,” says in a statement on its Web site that it agrees with Apple’s decision to yank the application from its online store. Funnymals also says its policies prohibit people from posting nude photos to the application. Neither Funnymals nor Apple responded to requests for comment.

About a week earlier, another mobile phone application, “Hottest Girl,” showcased a photo of a topless woman and also was pulled from the app store. “Apple will not distribute applications that contain inappropriate content, such as pornography,” a company spokesman said at the time.

The explicit material is drawing attention to Apple’s attempts to filter out potentially objectionable apps before they’re posted on its app store.
The iPhone app store, with more than 50,000 applications, is the most popular entertainment and information venue of its kind for mobile phones. Observers say the successful app store buoys the iPhone’s popularity and adds to Apple’s sterling image as a hip and family-friendly company. The explicit content has the potential to tarnish that image.

But Apple, like any company or Web site that hosts user-submitted content, may be engaged in an impossible task in trying to keep all offensive material out of the app store and off its phones. Some iPhone apps are developed by Apple, but many are submitted for approval by third-party developers.

Phil Malone, a clinical professor of law at Harvard Law School, said it’s unlikely Apple or app developers would be held liable for potentially illegal content that shows up in phone apps, as long as they didn’t know about the questionable content in advance. It would be impossible for Apple or developers to keep all potentially objectionable material out of the app store, since much of the content is submitted by users, he said.

As the quantity of new apps and app updates increases, it becomes all the more difficult for the company to keep up, said Dan Moren, associate editor of MacWorld, a blog about all things Apple.

Joshua Topolsky, editor in chief of Engadget, a technology blog, said the impossibility of policing all app store content should free Apple from some blame. “It’s completely out of Apple’s control that someone uploaded a nude photo, and to some extent, it’s out of the [app] developer’s hands as well,” he said.

More pressing, Topolsky said, are Apple’s nebulous policies about which apps get the company’s stamp of approval. No one outside the company knows who at Apple makes those decisions or exactly what criteria they use to accept and reject the mobile programs, he said.
That frustrates app developers and could lead some to turn away from Apple and move on to other phones, said Jared Brown, who says updates to his Quick Shot photo app have been rejected by Apple for seemingly random reasons.

In several cases, applications have been banned from the iPhone app store for showcasing material that also would be easily accessible through iTunes or by using Apple’s mobile Web browser. A Nine Inch Nails application, for example, reportedly was pulled by Apple because it streamed a song with offensive lyrics. Band leader Trent Reznor lashed out against Apple on his Web site, calling the company hypocritical and pointing out that the song in question also was available on iTunes.

In a similar incident, a Twitter app called “Tweetie” was pulled because it gave access to offensive words on Twitter.com. It was later put back on the app store. And an iPhone app that allowed users to shake a digital crying baby to death was yanked from the app store in April. Apple issued an apology, calling the app “deeply offensive” and a “mistake.”

Unlike Apple, which acts as a gatekeeper, Google lets developers post games and other programs to its Android app store without going through a screening process. Brown said he favors this approach, which lets the Android store’s community flag objectionable content. Apple’s approval system is “draconian,” he said, adding that “It seems like there’s no clear guidelines for these people, whoever they all are.”

Apple has said it approves 96 percent of submitted iPhone apps. In a recent update to the iPhone software, Apple included controls that let users choose which types of content they would like to block from their phones. iPhone apps now come with age-appropriateness labels, submitted by developers. The app that featured a photo of a reportedly underage girl was rated for people 17 years and older and warned that it might show explicit content, said Moren, of MacWorld.
Moren said this new parental-control system offers “fine-grained controls” and helps iPhone owners decide what content they want to buy. The company is doing everything it can to keep pornography and offensive material off the site for public relations reasons, he said. “They’d much rather hang onto their image as a family-friendly company” than let offensive material in, he said. “I think they’ve really cultivated that.”