Instagram Is Making Big Changes To Protect Children On Its Platform

Instagram is changing its recommendation algorithms and expanding child safety features for adult-managed accounts that primarily feature minors. The update responds to concerns raised in a 2023 lawsuit and a Wall Street Journal investigation into the platform's recommendation systems.

Adult-run Instagram accounts that primarily display images of children will no longer be recommended to “potentially suspicious adults,” a group that includes individuals previously blocked by teenagers. The platform will also hide comments from these suspicious adults on such posts and make the accounts harder to find via search. These measures build on earlier updates, including barring accounts that heavily feature children from offering subscriptions or receiving gifts.

Meta introduced these changes following a 2023 lawsuit alleging that Facebook and Instagram facilitated a “marketplace for predators in search of children” by allowing extensive sharing of child sexual abuse material. A Wall Street Journal investigation the same year reportedly found that Instagram’s recommendation algorithms promoted networks of pedophiles.

Instagram is also extending several teen-account safety features to accounts that frequently feature children. Message settings for these accounts will automatically default to the platform’s strictest options, and offensive and inappropriate comments will be filtered. Instagram DMs for teen accounts will gain a combined report-and-block option, and teen users will see the month and year an account they are messaging joined Instagram.
