The Facebook/Cambridge Analytica saga has prompted no end of analysis and insight, much of it sanctimonious, about the responsibilities of social media platforms to protect users' information, about fears of data trolling undermining Western democracy and the basic right of its citizens to privacy, about whether the way Facebook makes most of its money (selling digital ads to advertisers who want to reach the extremely well-defined audiences Facebook delivers) is ethical and should be regulated, and so on.
As the scandal has intensified — not helped, in my opinion, by CEO Mark Zuckerberg's odd decision to wait days before making a public response — the market is responding as it is apt to do: with something approaching panic.
Since March 16, when Facebook suspended Cambridge Analytica (a data analysis firm that worked for Donald Trump's presidential campaign in 2016) for allegedly improperly harvesting data from some 50 million Facebook users, the social network's stock has plunged more than 17 per cent, representing about US$80 billion in market cap. On Tuesday alone, when news broke of Zuckerberg's willingness to testify, shares got a five-per-cent haircut.
Fears of regulation, fears of government sanction, fears of a good chunk of Facebook's 2.2 billion monthly users logging off, fears of its business model collapsing under its own success — lots of fear out there. But is it reasonable?
I'm not so sure. Some steam was due to come out of the FANGs (Facebook, Amazon, Netflix, and Google, which is now Alphabet), which were outperforming the broader market by leaps and bounds. Facebook had seen a one-year gain of 35 per cent, until the Cambridge Analytica scandal hit; Google (Alphabet) rose 30 per cent; Amazon over the same period soared by 85 per cent, and Netflix gained more than 100 per cent. When investors get nervous about soaring valuations, it doesn't take much to get them into profit-taking mode. (Look at Amazon, which declined five per cent in morning trading Wednesday after a second-hand report that Trump wanted to “go after” the online retailer.)
But beyond that, what, exactly, did Facebook get so wrong? Back in 2014, a researcher named Aleksandr Kogan used a Facebook personality test app to gather data on some 30 million users; he passed that data on to Cambridge Analytica. In 2015, Facebook found out about the transfer and, citing rules that prohibit sharing data with a third party for commercial reasons, demanded that Cambridge Analytica delete it. The firm says it did; Facebook, apparently, is not convinced. It suspended Cambridge Analytica and Kogan in mid-March. Media reports claim the data is still out there, and might have been used to help get Trump elected in 2016.
In the history of data breaches, this one is small potatoes compared to Yahoo (three billion accounts hacked), Adult Friend Finder (400 million accounts), and the Sony PlayStation Network (77 million accounts hacked, plus network outages), and no financial information was involved. Other unsavoury allegations against Cambridge Analytica have come to light, but it's not clear how the stink rubs off on Facebook. If the social media giant did something wrong, it was in not doing enough back in 2015, when it learned of the breach. That's a sin of omission, not commission, and one that hardly goes to the heart of Facebook's business model.
As for what regulators might do, I wonder what form increased government scrutiny might take. Public investigations into how Facebook uses algorithms or artificial intelligence to analyze data are likely to run into the claim — a merited one — that those are trade secrets. Some have raised the spectre of a drawn-out battle with regulators, drawing comparisons with Microsoft's two-decade-long antitrust case. Yet where is the alleged tort that Facebook committed?
If regulators go after the company for its business model, then they will have to cast a pretty wide net. Other social media platforms, broadcast companies, online publishers, newspapers and magazines, and a host of other going concerns make money doing what Facebook does: selling users to advertisers. All of those other businesses gather information about their audiences; Facebook just does it better than any of them. Do the users care? Billions of people log on to Facebook because it works, it's fun, and they don't have to pay for it. Most of those who are aware of the quid pro quo involved — they get Facebook, Facebook gets them — accept the terms. Those who aren't aware of the bargain are ipso facto not worried enough to find out.
Maybe the recent dustup will make them aware, and they'll abandon Facebook for some other social media platform that makes money the same way (or, like Instagram, is owned by Facebook anyway). Maybe the scandal will make more people aware of how data is being used, and encourage them to take steps to protect their privacy as they see fit. That's all well and good, but I doubt it would make much of a dent in Facebook's user base or revenue.
Speaking of which, the company took in US$40 billion in revenue last year, up more than 50 per cent from 2016. Analysts expect earnings growth over the next five years of more than 25 per cent annually. Until there's evidence of something substantial happening to alter those expectations, let's not write Facebook's obituary just yet.