This week, the Web exploded over charges that the content of Facebook’s Trending topics feature is decided by humans – including a bunch of young, college-educated contractors in Knoxville, Kentucky, who, despite their location, according to the Gizmodo report, deliberately suppressed conservative political news stories from the feature.

From the outraged reaction, one might think that Facebook were staffed by none other than sworn commies (remember them?) plotting the downfall of America, God-given capitalism, and private property. A Senate investigation was even demanded. The company issued its standard denials, Mark Zuckerberg himself blandly added a few of his own and said an internal investigation was underway, and, like most Facebook scandals, the whole affair somehow ended much where it began.

Why was it such a big deal? Partly it’s a knee-jerk reaction. There is a deep, long-held paranoid suspicion among conservatives that the mainstream media is run by Machiavellian liberals with a sinister agenda to denigrate, mock, and destroy their values. (Search for “mainstream media liberal bias” if you doubt it.) Despite talk radio being overwhelmingly conservative, if not flamingly reactionary, they may not be entirely wrong. But the attitude may be more a reflection of the growing class divide in this country than anything else.

But the bigger concern seems to stem from the feature’s popularity. Wisely or not, Facebook’s Trending news, which appears prominently in the upper right corner of the page, has become the sole source of news for millions. Many implicitly trust the feeds, believing they are composed entirely of stories recommended by their friends or chosen to match their own preferences. The idea that a computer algorithm might be picking and choosing their news reassures rather than bothers them, for it seems more objective. What disturbs them is the idea that mere humans, with whom they might disagree, are sneakily interfering with the process.

Facebook does have the right, if it chooses, to edit the news to fit any political perspective it wants. Still, offering news items comes with an unspoken editorial obligation to users: to select the best, most interesting, relevant, and well-written stories. In practice, though, Facebook is mainly concerned with presenting articles that help the company’s bottom line by keeping users engaged longer – consuming media and stories on the site and talking about them with their friends right there on the platform. That gives Facebook more information to sell to advertisers.

However, it is deceptive, or at least disingenuous, to present these items as chosen by some wise, fair, and objectively disinterested method. It is a fascinating, albeit unanswerable, question whether Zuckerberg ever sells data about his users’ political views to politicians or parties…

The anti-conservative bias critics detect may in some respects simply reflect the site’s culture. Zuckerberg, a product of elite Harvard himself, was once overheard talking to Germany’s Angela Merkel about censoring anti-refugee stories. But regardless of whether policies to prevent such bias actually existed before now, there are certainly good reasons for concern.

Facebook itself has conducted unethical experiments demonstrating that manipulating the tone of news feeds really can subtly change people’s moods. Experiments manipulating search engine results, moreover, have shown just how easy it is to change people’s opinions, even in real elections. Worse, such manipulation is very hard for users to detect or counter, even when they know it is there.

That Facebook has to rely on human news curators mainly shows how poorly its much-praised algorithms actually function. Yet humans can be called out, questioned, and challenged. The math is more insidious: its results spring forth from a black box like pronouncements from an oracle. Ordinary users cannot interrogate an algorithm, so they often trust it implicitly. There is no way to tell exactly how its choices were made, or by what criteria, which leaves no firm basis for a challenge. Perhaps it’s better if humans are the ones choosing the spin.

In a year like this, when political positions have become so extreme and the stakes seem so high, it’s probably a good thing that attention is being paid to how relentlessly and continuously our opinions are manipulated by all sides. The pressure and mind games are only going to get worse until November, so stay alert, but don’t take anything too seriously. And try, at least, to think for yourself.