Poor Mark Zuckerberg. You’d think that being co-founder and CEO of Facebook, the largest and most successful social media empire ever, upon which billions depend, would be a pretty sweet gig. But the baby-faced billionaire gets no respect. His own page on Facebook was hacked back in 2013, supposedly by someone irate about the site’s poor security. His other social media accounts have been hacked too: a password exposed in the 2012 LinkedIn breach later led to his Twitter and Pinterest accounts being compromised as well. Only Instagram has apparently escaped.

Things haven’t been going so well over at Facebook either. The site has gotten into a cycle whereby it implements some new feature, always enabled by default. When this somehow outrages people – like parents who didn’t enjoy being reminded of their recently-deceased children by a “Year in Review” feature last New Year’s – Zuckerberg has to come out, apologize, and explain it was all an innocent, well-intentioned misunderstanding on the part of the users. And then it happens again, and again.

His latest embarrassment involves Facebook’s Trending News. As we reported back in May, there was a huge outcry when it was revealed that the news was actually being edited by humans. Oh the horror! This fed into many conservatives’ deep-seated suspicion of the media, and into the fact that many users have come to depend on Facebook alone for their news. So Zuckerberg decreed that henceforth the choice of trending stories would be made by algorithms, and promptly fired his doubtless bewildered and beleaguered editorial staff. The algorithms soon promoted a fake news story, and Facebook is now actively seeking technical means to prevent that from happening again.

Nonetheless, “we are a tech company, not a media company,” Zuckerberg declared, denying that his intention was to provide and/or meddle with content. But that has managed to backfire also. The site automatically censored a post by Norway’s Aftenposten newspaper that carried the famous historical photo of a naked Vietnamese girl fleeing an American napalm attack, flagging it as child pornography. The paper’s incensed editor wrote a scathing public letter accusing Zuckerberg of abusing his power as “the world’s most powerful editor.” If that wasn’t enough, Facebook’s algorithms then chose a story about a 9/11 conspiracy theory as “trending,” eliciting even more howls of outrage. Like it or not, right or wrong, if Facebook is the source of so many people’s knowledge of current events, then a serious social responsibility comes with it. Sometimes, you just can’t win.

Facebook’s algorithms

Thus, once again, the power of algorithms is revealed, along with their limitations. Facebook uses a screening technique developed by Microsoft known as “PhotoDNA.” This checks uploaded images by generating a hash, a mathematical fingerprint derived from the pixel values of a photo. That hash is then compared against a database of hashes of known sexual or violent offensive images to determine whether the upload is similar to any of them. What no such technique can detect, however, is intention. Is the picture of a terrorist beheading children posted as a condemnation of such horrors or as an incitement to those thus hideously inclined? Artificial intelligence is not yet at the point where it can even consider such moral conundrums, and may not be for some time, if ever.
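The general idea behind hash-based screening can be sketched in a few lines. The toy below is illustrative only – it is not Microsoft’s proprietary PhotoDNA algorithm – but it shows the principle: reduce an image to a compact fingerprint, then compare fingerprints by how many bits differ, so a slightly re-encoded copy still matches while an unrelated image does not.

```python
# Toy perceptual hash -- illustrative only, NOT Microsoft's actual PhotoDNA.
# An image (here, a 2D grid of grayscale pixel values) is reduced to a bit
# string; hashes are compared by Hamming distance, so near-duplicate images
# produce near-identical hashes even after small edits or re-compression.

def average_hash(pixels):
    """Hash a 2D grid of grayscale values: 1 if a pixel is above the mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the images are similar."""
    return sum(a != b for a, b in zip(h1, h2))

original  = [[10, 200], [30, 220]]
tweaked   = [[12, 198], [28, 225]]   # slightly re-encoded copy of the original
unrelated = [[200, 10], [220, 30]]   # a different image entirely

h = average_hash(original)
print(hamming_distance(h, average_hash(tweaked)))    # 0 bits differ: flagged as a match
print(hamming_distance(h, average_hash(unrelated)))  # 4 bits differ: no match
```

A real system works on much larger hashes and normalized images, but the matching step is the same: below some distance threshold, the upload is treated as a copy of a known image. Note that nothing in this comparison sees context or intent, which is exactly the limitation described above.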

To be fair, Facebook is not the only site with algorithms that need to be tweaked. The Warner Bros. studio recently caused a bit of a stir when it automatically flagged its own website on Google, Amazon, Sky, and IMDb for copyright violation. With unattended automatic systems, such things will inevitably continue to happen.

Yet algorithms are extremely powerful. When they work right, they can extract a huge amount of information from the data presented. Facebook, for example, offers advertisers nearly 100 categories of data about their users. The list starts with age, sex, location, income and education levels, and quickly advances to home ownership – including the home’s square footage and age. Facebook tracks users’ on-site activity and literally follows them around the web, seeing virtually every website visited. It can also tell advertisers who has an anniversary coming, is away from home, is expecting a baby, is in a long-distance relationship, or is starting a new job, not to mention their relatives, political status, affiliations, and interests. And all that is just the first third of the list… and every bit of it is data freely generated and surrendered without thinking by the users themselves.

One would be forgiven for wondering why, with backdoor access to all that, the NSA would even bother doing its own spying. But the point is that these algorithms are not only describing users’ lives; they are also subtly guiding them. That gives Facebook a tremendous power that the spooks and alphabet agencies – not to mention any would-be dictator – can only dream about.

“With great power comes great responsibility,” but the responsibility ultimately rests with users as well as with the platform. Facebook users can choose which categories of ads they will accept, but their control is strictly limited. At best they can train the platform to deliver more-agreeable ad content; they can’t eliminate it, since Facebook is constantly fighting ad-blockers. And of course, with Facebook, one has to keep up with the changes the site keeps making without any advance notice.

So far, the deal between sites like Facebook and their users has been largely beneficial to both, but how long will it remain so? There is no doubt that malicious hacking is constantly getting worse, and the use of large amounts of data by shadowy corporations with ties to the secret world is nearly as troubling as the hackers themselves.

And if Mark Zuckerberg‘s accounts can be compromised, what about yours and everyone else’s? Something to think about the next time a website wants your personal information.

UPDATE

Facebook Comes to New Mexico

Here’s some news that may cheer New Mexicans. Facebook has announced it will be building a $250 million data center in the town of Los Lunas, just south of Albuquerque. New Mexico beat out Utah for the project with an exceedingly generous bid. Los Lunas agreed not to collect property taxes for 30 years and promised the company a monthly reimbursement of the village’s share of gross tax revenues, in exchange for annual payments from Facebook that start at $50,000 and top out at less than $500,000. Los Lunas passed both an industrial revenue bond measure of up to $30 billion and a $10 million Local Economic Development Act measure. The state also offered Facebook access to up to $3 million in Job Training Incentive Program funding. The complex agreement also involves tax breaks on billions of dollars in computer equipment over time.

Aside from construction jobs, including those for the solar power facility PNM hopes to build for the huge server farm, only 30–50 permanent jobs will be generated by the new facility. Nor is it at all clear how much actual tax revenue the project will ever generate.

Like all other things to do with Facebook, though, it’s not without a downside. The entire procedure was wrapped in utter secrecy, with state officials even using code names for the company throughout negotiations. But what could be of critical importance in our desert state is the amount of water it would consume. Though Facebook claims the center would require only 25,000 gallons per day for cooling (or 9,125,000 gallons per year), Utah officials estimated it would need 5.3 million gallons per day.

But Los Lunas wound up guaranteeing 4.5 million gallons a day, though it is claimed that much water would be needed only in a worst-case scenario. For comparison, Intel used 26 million gallons in 2015 (its lowest in years); drawing the full promised 4.5 million a day would have Facebook consuming an astonishing 1,642,500,000 gallons per year of our most precious resource just to keep its servers cool. Time will tell whether this is such a good deal for our drought-wracked state or not.
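The annual figures above follow directly from the quoted daily rates. A quick check of the arithmetic:

```python
# Annual water use implied by the daily figures quoted above.
claimed_daily = 25_000        # gallons/day, Facebook's cooling estimate
guaranteed_daily = 4_500_000  # gallons/day, the Los Lunas guarantee

print(claimed_daily * 365)     # 9,125,000 gallons/year
print(guaranteed_daily * 365)  # 1,642,500,000 gallons/year
```

The guaranteed allotment is thus 180 times Facebook’s own estimate, which is why the worst-case number dominates any discussion of the deal’s water cost.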