G and I made some brownies today. (She did most of the work.) I took lots of pictures of her stirring and eating the batter and posted some to the private Facebook group we have for our kid pics.

WELL.

Apparently, our almost-four-year-old is old enough that pictures of her cooking in naught but her ladybug underpants trigger the FB community standards algorithms, which flagged SOME of them for “nudity and sexual content.” (I don’t have a ton of experience with other young kids, but I will say that this kid takes her clothes off at home all the time and, while I’d prefer she wear more, you just have to pick your battles.)

I know a lot of people will have an immediate reaction of outrage: not only is it sickening to think that perfectly innocent pictures of a kid would be flagged this way, but the algorithm is wildly inconsistent, since it let me post something like 12 other pictures from the same exact event.

The thing is: content moderation is hard. And there truly is some sick stuff posted to FB. There’s no way humans could check every single post and you wouldn’t want to subject actual humans to all the trash out there. So, it’s natural for FB to lean heavily on algorithms. And algorithms aren’t perfect. They make lots of mistakes. They’re easy to trick if you try. Also, it’s annoying to not be able to accuse Mark Zuckerberg of profound moral failings when I don’t get my way.

But still. It’s incredibly sad that our little girl is so clearly growing up that the computers (the inconsistent, easily fooled, overly cautious computers working to save us from the worst humans) recognize her as a potential victim.

Facebook gives me the option of appealing to their oversight board, but what would I really protest here? I think they’re well within their rights to set whatever standards they want and enforce them as (in)consistently as they please. I certainly don’t want them to get more lax about Child Sexual Abuse Material (CSAM). I’m willing to let some of my pics get taken down to make sure none of the legit CSAM gets through. And even though I think it’s dumb for them to take that picture down, it’s a very mild inconvenience… especially since they let me post all the other pictures, including one very similar to the offending image.

I think mostly I see this and just sigh.

