In a bid to officially reach under-13 audiences and build early mindshare, Facebook has rolled out Messenger Kids, positioning the app around privacy and safety.
Marketed as a tool for parents to shield their children from child predators, Facebook's Messenger Kids app arrives a week after Google moved to delete more than 150,000 videos featuring children from YouTube.
Working closely with the US Federal Trade Commission (FTC), Facebook rolled out Messenger Kids on iOS this week. Parents are meant to install and administer the app on their child's device, creating the child's profile and approving the friends and family members the child can text and video chat with.
The app includes proactive detection filters intended to prevent child users from sharing nudity, sexual content, or violent images, and Facebook says a dedicated team will review reported or flagged content.