With testing of Facebook’s new graph search feature in full swing, the social network moved to reassure parents that their teenage children are not at risk, outlining the protections in place for users aged 13 to 17.
Facebook released a detailed infographic Tuesday that takes users through the entire process of what happens when content or users are reported to the social network.
Facebook is testing a new product that allows users who report objectionable content to follow the progress of their reports and receive alerts when the social network has decided the fate of the reported content.
Facebook introduced new options for users to report issues with applications on the social network.
We’re curious to know how many of our readers have formally reported bugs to Facebook, warned their friends about security problems via status updates or ignored these issues when they’ve arisen.
Don’t suffer another minute of annoyance: If you feel psychologically tortured by insensitive jerks while on Facebook, there are ways to nip their behavior in the bud.
Yesterday afternoon, Facebook posted about more detailed abuse-reporting features now being provided for users. Facebook wants to protect users who experience or witness “bullying, harassment, unwanted contact or offensive behavior,” so the company is rolling out much more granular reporting options. When reporting violating images, users can select from categories such as “nudity or pornography,” “drug use,” “excessive gore or violence,” “attacks individual or group,” “advertisement or spam” or “infringes on your intellectual property.”