This week, Facebook introduced a new feature that will hide certain types of harmful content and restrict specific search terms for users under 18.
The move comes after the company was hit with dozens of state lawsuits and faced mounting pressure from child safety groups to make Facebook and Instagram safer for young users.
According to Facebook, all teens will now automatically be restricted from seeing harmful content, including posts about self-harm, graphic violence, and eating disorders.
If someone a teen follows shares this kind of content, the teen won't see it, and teens who actively search for it will instead be directed to expert resources for help.
Some experts say this is a step in the right direction and that, if nothing else, it will help parents feel more comfortable with their children using these platforms.
Of course, Facebook still has a lot of work to do to protect young users, but this is one of the most serious investments it's made yet to address the many complaints and legal issues it's facing.