Facebook’s Still Debating Whether or Not to Let in Your 12-Year-Old; Are You Still Concerned?
I’m 27 years old.
Actually, I’m not 27 years old. I’m 26 years old. I just lied on the internet! And trust me, it really is so easy that a preteen can do it.
And they do – all the time. Although Facebook’s current policy plainly prohibits anyone under the age of 13 from operating an account on the site, most reports put the number of preteen Facebook users in the millions. In fact, among Facebook users under 18, there’s about a 2 in 5 chance that any given kid is actually under the age of 13. Other reports, acknowledged by Facebook itself, show that over half of parents with a 12-year-old child say that their kid has a Facebook account.
Do you think that Facebook is safe for kids under 13? If not, do you think it can be made safe with certain limitations and parental controls? Let us know in the comments.
Facebook says that they do all they can to identify and remove underage accounts – but it’s obviously one of the largest games of whack-a-mole ever. Close one down, five 11-year-olds log on in their place. Facebook removes somewhere around 20,000 underage accounts a day, and there are still over 7.5 million under-13 accounts active on the site.
It doesn’t take the world’s most adept mathematician to see that this is a fight that’s going to be really hard to “win” with a simple ban. It’s clear that simply requiring age verification will never keep young kids off of Facebook. The site’s simply too popular. Everyone, including 10, 11, and 12-year-olds, now requires the social connectivity that it can provide.
Facing the evergreen problem that is trying to police all of these underage accounts, it shouldn’t shock anyone that Facebook is considering a new approach. Just over a month ago, we heard that the company was mulling over the idea of opening up the site to kids under the age of 13. We heard that Facebook’s “if you can’t beat them…” strategy would have preteen accounts closely monitored by parental controls – which would work in conjunction with the safeguards that Facebook already has in place regarding minors’ privacy.
If they’re going to do it anyway, we may as well have them do it under proper supervision, right?
If we believe the stats, the majority of parents with 12-year-olds know that their kids are on Facebook. Of course, that doesn’t necessarily mean that they are cool with it. Once the news of Facebook’s possible shift in age restrictions broke, people seemed to come out of every conceivable corner of the outrage circuit to voice their opinions.
And who can blame concerned parents? Facebook (and social communication in general) can be a minefield for older teens and even adults – and most people’s natural inclination is to protect young children at all costs. All a parent has to hear is one nightmarish story about child predation and subtle manipulation on Facebook, and the negatives of social networking immediately outweigh any positives. And although Facebook horror stories aren’t so commonplace as to become routine, there are still enough floating around to engender some concern.
To some parents, no amount of control or guidance would make them feel truly comfortable with opening up their preteens to the risks – either external or internal.
In one of the first major responses to Facebook’s possible plans, nearly a dozen consumer groups sent a joint letter to Mark Zuckerberg. That letter contained some “demands” for any scenario where Facebook opens up the service to kids under 13.
Although the groups mentioned parental controls and increased privacy for preteen members, their main point involved ads on the site – namely, that there shouldn’t be any when it comes to kids:
[T]he company’s business model relies, at its very core, on data collection, ad targeting, and viral marketing, and many of its practices have generated public and government privacy concerns. If Facebook opens itself up to a younger audiences, we want assurances that any space created for children under the age of 13 on the site is safe, parent-guided and controlled, and, most importantly, free of ads (including the range of practices that are routinely employed through social media marketing).
Congressional pressures and the evolution of Facebook responses
When the rumors began to fly concerning Facebook’s possible age-shift, the company was unsurprisingly unspecific about their actual thought process. Upon the initial reports, Facebook told me that they were “in continuous dialogue with stakeholders, regulators and other policymakers about how best to help parents keep their kids safe in an evolving online environment.”
After the letter from the privacy groups, Facebook started to say that they were open to suggestions:
Enforcing age restrictions on the Internet is a difficult issue, especially when many reports have shown parents want their children to access online content and services. We welcome today’s recommendations by consumer, privacy, health and child groups as we continue our dialogue with stakeholders, regulators and other policymakers about how best to help parents keep their kids safe in an evolving online environment.
Now, thanks to a newly-released letter from Facebook to two Congressmen, we know that Facebook is still thinking about it, but is a ways away from a final determination.
“At this point, we have made no final decision whether to change our current approach of prohibiting children under 13 from joining Facebook,” said the company.
That’s just a blip in a much-longer letter that the company sent Republican representative Joe Barton and Democratic representative Ed Markey. The two House members sent Facebook a letter back in early June outlining their concerns regarding the protection of kids on the site.
Facebook’s response mirrors many responses the company has had to questions raised by concerned parties. For one, they reiterate that any decision they make will be mindful of the Children’s Online Privacy Protection Act (COPPA), a law that Facebook has formally addressed in a highly detailed fashion. In the absence of any actual decision on the sub-13 crowd, the bulk of the response involves Facebook talking about their current features that promote safety for minors.
Facebook actually does do a lot, already
In addition to the everyday privacy controls that all Facebook users have access to, the company already has an admittedly impressive set of features in place that are designed to protect the network’s younger crowd.
- For instance, Facebook users aged 13-17 have different sharing settings, by default, than those users who are 18+. A minor’s sharing capabilities are limited to friends and friends of friends – no more. Also, if you’re not a friend of a friend of a minor, it’s simply impossible for you to send them a direct message.
- In a more controversial move, we recently learned that Facebook is actively monitoring the millions and millions of chats and messages between their 900+ million users. Why monitor chats, you may ask? Because Facebook is always on the lookout for suspicious behaviors. And their software scans communications all across the network, looking for possibly criminal activity. That, of course, includes inappropriate conversations between minors and possible predators. Facebook’s monitoring software works in two distinct ways. First, it focuses more heavily on conversations between members who aren’t especially connected in any significant way – few mutual friends, new friendships, etc. Second, the message scan can identify certain words and phrases that could signal illicit communications. As a Facebook user, you can raise concerns about privacy and the legality of this sort of monitoring – and your concerns would be legitimate. But from a protecting-kids standpoint, it’s hard to argue that this sort of technology isn’t of great benefit.
- Facebook also bans sex offenders from participating in the network – outright. So do many states, although those laws are currently being challenged by free speech activists.
- As a company that’s put a lot of effort into anti-bullying campaigns (having even been recognized for their work), Facebook says it’s a priority to keep users, especially minors, safe from harassment. Just recently, Facebook made a few tweaks to their reporting mechanisms to help in that effort. Now, when minors feel threatened by a particular post, the “report” procedure has been softened and made a bit more conversational and inviting. For instance, “report” will be replaced with “This post is a problem.” After clicking, teens will be walked through the process with scenario-specific questions.
But is it enough?
“Children are not commodities, and their personal information should not be harvested to yield ad revenue for Facebook and its hungry shareholders,” said Rep. Markey, co-author of the letter to Facebook. “The privacy of personal information for pre-teens should not become a post-script in Facebook’s drive for profits.”
Even if Facebook opened up the network to preteens and did so by giving parents granular control over all aspects of their experience, would it be enough to assuage the concerns of many? Even after knowing that Facebook is actively trying to protect young kids from seedy situations online, does it do anything to make you feel comfortable with the thought of your 10-year-old browsing their News Feed?
The advertising issue may be just as potent as privacy and safety. Both the consumer groups and the Congressmen made ads their front-and-center complaint.
You could argue that children are inundated with ads all the time – television, movies, etc. Hell, walk into a toy store and you’ve basically just put your kid in the middle of a giant advertisement for various products. How can parents think that targeted ads on Facebook are any more dangerous than a targeted ad on the Sunday morning cartoons?
Or on a site like YouTube?
But it’s clear that something about Facebook ads rubs people the wrong way – especially when it comes to children.