By Jeff Horwitz

Facebook Inc.'s independent content-oversight board issued its first five rulings Thursday, overturning the company in four cases where it found Facebook had unfairly infringed on users' speech on the platform or misapplied "vague" rules on content that could cause imminent harm.

Among the board's decisions were a determination that Facebook's algorithms were wrong to remove a post about breast cancer identification that featured a woman's nipple, and a finding that Facebook had been too strict in removing a French user's post praising hydroxychloroquine, a once widely discussed treatment for Covid-19 that medical authorities have generally found not to be effective.

The board, which Facebook created and funded through an endowment, is meant to tackle the company's thorniest content-moderation issues, and Facebook has pledged to abide by the panel's decisions. The group, made up of 20 journalists, lawyers and former politicians from around the world, is also set to determine later this year whether Facebook erred in suspending former President Donald Trump from its platform.

So far, the panel has been given only the ability to determine whether content that has been taken down should be restored -- not whether Facebook should remove posts or videos that remain live.

On Thursday, board member Helle Thorning-Schmidt, the former prime minister of Denmark, said the removal of the post featuring the nipple -- a decision Facebook had reversed on its own after being notified that the board would review it -- indicated a possible overreliance by Facebook on algorithms to police content.

"It became very clear to us that that was part of the problem -- had they had human moderators, I don't think this would have been taken down from Instagram," she said. Facebook has recently been moving toward more expansive use of algorithmic content moderation.

Write to Jeff Horwitz at Jeff.Horwitz@wsj.com
