Last Friday, Facebook dismissed the team of human curators behind its Trending feature, following reports that curators had manually inserted topics. The company's statement said the move is intended to reduce bias and "make the product more automated."
Facebook said stories on Trending should not necessarily be treated as straight news because the feature surfaces "algorithmically detected topics that reflect real-world events." The change quickly backfired, however, when users reacted to a false story claiming that Fox News anchor Megyn Kelly had been fired over her political inclinations.
Byron Galbraith, chief data scientist at Talla, said Facebook's Trending algorithm cannot resolve the problems introduced by human bias. "It's an interesting challenge because it speaks to the problem of our algorithms," he said. "The data that drives the algorithm is inherently biased."
Trending engineers supervise how the algorithm works, not the actual content of the stories it surfaces. Facebook said the remaining team members provide guidance only and have no editorial privileges.
According to guidelines cited by Quartz, "Topics related to sex, pornography, nudity, graphic violence, etc. Stories that can be perceived as R-rated or worse should be tagged risqué." Facebook populates trending topics based on mentions, meaning key terms being talked about by a large number of people online. Demographic attributes such as gender, race, and location influence whether a topic is classified as trending or viral.
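The mention-based detection described above can be illustrated with a simple count-and-spike heuristic. This is a hypothetical sketch, not Facebook's actual system (which is not public); the threshold names and values are assumptions for illustration only.

```python
from collections import Counter

def trending_topics(mentions, baseline, min_mentions=100, spike_ratio=3.0):
    """Rank terms whose current mention volume spikes above a baseline.

    mentions: list of key terms observed in the current time window.
    baseline: dict mapping term -> average mentions in past windows.
    A term "trends" when it is mentioned often enough AND its volume
    is a multiple of its historical average (hypothetical thresholds).
    """
    counts = Counter(mentions)
    trending = []
    for term, count in counts.items():
        avg = baseline.get(term, 1)
        if count >= min_mentions and count / avg >= spike_ratio:
            trending.append((term, count / avg))
    # Largest spike first
    return [term for term, _ in sorted(trending, key=lambda t: -t[1])]
```

Note that such a heuristic only measures volume; as the article's sources point out, it cannot tell whether the underlying story is true.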
CNBC reported that Facebook still needs humans to judge whether a story is fictitious. Matt Lang, a senior digital strategist at Rain, said there is still a long way to go before the algorithm is cleaned up and bias is significantly reduced.
However, Facebook can take a precautionary measure to distance itself from accusations of bias. Galbraith said suggesting articles with both conservative and liberal leanings could strike a balance.
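Galbraith's balancing idea could be sketched as interleaving candidate articles from the two leanings so neither dominates the suggestions. This is a hypothetical illustration of the general approach, not an implementation anyone has described; a real ranking system would also weigh relevance and recency.

```python
from itertools import chain, zip_longest

def balanced_feed(conservative, liberal):
    """Alternate articles from two leanings so neither dominates.

    conservative, liberal: lists of article titles (or IDs), each
    already ranked by relevance. Hypothetical sketch of the balance
    Galbraith suggests.
    """
    pairs = zip_longest(conservative, liberal)  # pads the shorter list with None
    return [article for article in chain.from_iterable(pairs) if article is not None]
```

For example, two conservative and three liberal candidates would alternate until one side runs out, then the remainder follows.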