February 24, 2018

With Social Media Be Careful What You Wish For

On Halloween of last year, representatives of Google, Facebook and Twitter appeared before the Senate Judiciary Committee’s Crime and Terrorism Subcommittee, and the senators who quizzed them treated them as though they were wearing Pinocchio costumes.

Both Democratic and Republican senators questioned the companies about Russia’s attempts to spread disinformation and discord on Google’s YouTube and on social media – Twitter and Facebook – and lamented the Kremlin’s efforts to disrupt the 2016 presidential election and tip it toward Donald Trump. The angry lawmakers stressed the need for Facebook, Google and Twitter to prevent this tampering from ever happening again.

In mid-November, 15 Democratic senators asked the Federal Election Commission to ensure that online political ads display disclaimers stating who paid for the advertising. “The FEC must close the loopholes that have allowed foreign adversaries to sow discord and misinform the American electorate,” Senators Mark Warner (Virginia), Amy Klobuchar (Minnesota), Claire McCaskill (Missouri) and a dozen others wrote, according to Media Daily News.

However, as the old idiom suggests, politicians should “be careful what they wish for lest it come true.”

Consider this: If the government regulates political advertising, it is giving digital media – primarily Google, Facebook and Twitter – rules on how to label political communication. If these companies follow the rules, they just might toe the line assiduously, adopt the position that they are merely “following orders” and wash their hands of any corporate responsibility or accountability for further action.

However, as malicious hackers have continually demonstrated, digital rules and guidelines are merely temporary obstacles – challenges they relish tackling. So the algorithms that Google, Facebook and Twitter write to comply with government-imposed rules and regulations could be hacked and compromised almost as soon as they are instituted.

Eventually, the media platform companies might be able to write algorithms that exercise good editorial judgment and taste, but probably not for several years. So what should the government do in the meantime – impose rules and regulations?

A look at the history of media regulation might be instructive. Before the invention of radio, the media consisted primarily of newspapers and magazines, which were not regulated by the government. Publishers used their Constitutional guarantee of freedom of speech to publish what they wanted, and to be as partisan and as contentious as they pleased.

But when radio broadcasting was invented, because it used the public airwaves to distribute its signals, the government regulated radio stations. The stations were given a license to broadcast on a designated frequency as long as they served the “public interest, convenience and necessity.” Later, when television came on the scene, TV stations were given the same public-service mandate.

When I was a V.P. of CBS in 1970, the CBS-owned radio and TV stations were required by corporate policy to have a community affairs director, who ascertained the needs and interests of the local community, and an editorial director, who researched and wrote editorials for the general manager to deliver on the air. CBS took seriously its obligation to serve the communities where its stations were located.

Reputable newspapers, such as the New York Times, even though the government did not regulate them, chose to maintain an editorial environment that was brand safe – one that appealed to advertisers concerned about their ads appearing near inappropriate content. Thus, the Times does not allow ads for X-rated films or breast-enlargement products to appear in its pages or on its website.

Decisions about serving communities and rejecting X-rated film ads are matters of judgment and taste, which algorithms have yet to master. Top executives at Google, Facebook, Twitter and other digital and social media platforms need to make decisions about the social responsibility of their businesses. For the long-term health of those businesses, duty to society must take priority over shareholder value, and good taste must take priority over higher profits.

Of course, marketers, advertisers and their agencies can’t dictate social responsibility and good taste to publishers and platforms, but they can be more vigilant about blacklisting irresponsible, bad-taste content. To implement this vigilance, it’s better to concentrate on advertising effectiveness rather than efficiency, especially in programmatic buying, which often places ads next to brand-toxic content.

A guideline for safe content should be this: content that uplifts, or at a minimum doesn’t diminish, human dignity. Note that it is a guideline, not a regulation. Broad guidelines force content aggregators such as Google, Facebook and Twitter to take active responsibility for their content and to exercise good judgment and good taste rather than merely follow orders.