We understand the potency of drugs and medicine — both to do harm and to heal — so we entrust the FDA to regulate them.
If a given drug is found to harm more than it heals, we’re encouraged not to use it. But sometimes a drug is so addictive that we use it anyway — even if it hurts us — and we go to extraordinary lengths to obtain another dose. For this reason, harmful drugs are often very good for business (see: Mexico), because, maddened by addiction, users stray from rationality and reason just to get their fix, regardless of the cost.
A lot of software is designed to be addictive. In Silicon Valley, the addictiveness of a piece of software is considered an asset. Companies strive to make their products “viral” and “sticky” so that “users keep coming back” to “get their daily fix.” This sounds a lot like dealing drugs. It might be good for business, but is it good for people?
As citizens, there are some things we could do.
We could introduce citizen oversight for software companies, by creating a crowd-sourced FDA for software.
We could call it the Ethical Software Administration (ESA) — a neutral third-party watchdog group to monitor the actions of major companies with more than ten million users. When such companies introduce new products and features, change their default settings, or modify their terms of service, the ESA could take a look at the changes and issue public warnings when needed.
Since the dynamics of software are so different from those of big pharma, the ESA oversight mechanisms would need to reflect the culture of software.
Technological innovation will always outpace any legislation that tries to constrain it; regulating technology tends not to work. So the ESA would need a different approach, perhaps an open online forum where anyone can post their concerns, and where every company receives an aggregated “ethics” score based on its actions. Users could file objections, rally support behind those objections, and pressure companies into reforming egregious policies by threatening boycotts of their products.
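To make the scoring idea concrete, here is a minimal sketch in Python of how an aggregated ethics score might be computed from user objections. Everything in it (the Objection record, the weighting, the 0-to-100 scale) is a hypothetical illustration, not a prescribed formula.

```python
# Hypothetical sketch: aggregating an "ethics" score from user objections.
# The data model and the formula are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Objection:
    company: str      # company the objection is filed against
    supporters: int   # users who rallied behind the objection
    resolved: bool    # whether the company reformed the policy

def ethics_score(objections: list[Objection], company: str) -> float:
    """Return a score from 0 (every objection open) to 100 (no objections)."""
    relevant = [o for o in objections if o.company == company]
    total = sum(o.supporters for o in relevant)
    if total == 0:
        return 100.0
    # Weight each objection by its support; resolved ones count for far less.
    open_weight = sum(o.supporters * (0.1 if o.resolved else 1.0) for o in relevant)
    return round(100.0 * (1.0 - open_weight / total), 1)

score = ethics_score(
    [Objection("ExampleCorp", supporters=5000, resolved=False)], "ExampleCorp"
)
print(score)  # 0.0: a large, unresolved objection drags the score down
```

The point of such a score is not the arithmetic but the visibility: a single public number that users can rally around when a company changes its defaults or its terms of service.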
The EFF could oversee this initiative, keeping a careful eye on transgressive companies and mustering legal support against violations that users find particularly bold. If companies repeatedly violate ethical norms, they could be forced to post warning labels on their websites, just as tobacco companies are forced to warn buyers that their products cause cancer. The irony is that many software companies already use this kind of language to promote their products (e.g., “Warning: this game is known to be highly addictive and could keep you from your friends and family,” which could easily be the latest ad campaign for Angry Birds or FarmVille).
Even when regulations do exist, companies often violate them anyway, preferring to pay fines rather than change profitable policies. Factories pay EPA fines so they can pollute rivers; power plants pay carbon taxes so they can keep spewing smoke. When profits are sufficiently large, no amount of reprimanding will change how a company acts. But with software, the dynamics are different: software depends on its users. When users choose to stop using software, the company producing that software no longer has a business. If we object to the policies of a given software company, all we have to do is stop using its software.
A complementary approach is to build awareness and accountability within the engineering community.
We could ask our educational institutions to add an ethics curriculum to every engineering program. Universities offering degrees in computer science, electrical engineering, applied math, and interaction design could create coursework to explore the ethical considerations of those fields, especially the tradeoffs between page views, corporate profit, personal health, social impact, and simply doing what’s “right.”
From a young age, engineering students could be taught to speak up for what they believe. Too many engineers remain silent, leaving decisions to “management” and simply writing the code they are told to write. This is the same division of ethical accountability that allowed the Manhattan Project to happen. Scientists say, “Oh, but I was only doing science,” politicians say, “Oh, but I was only using what the scientists built,” and businesspeople say, “Oh, but I was only connecting supply and demand.” When people don’t see the big picture, or when they think they’re responsible only for the thing that’s right in front of them, it’s easy for many individuals to be complicit in the creation of damaging things.
We could ask our engineers to take a Hippocratic Oath, as medical students are required to do before we call them doctors. The basic tenets of an Engineering Oath could mirror the medical ones:
- To recognize your power to improve or damage lives, and to pledge not to do the latter.
- To study the lessons learned in the past, and to share the lessons you learn with those who follow after.
- To understand there is an art, as well as a science, to engineering, and that a simple human encounter is often more helpful than a technological intervention.
- To admit, at times, ignorance of the effect a choice is likely to have, and in those cases to seek the wise counsel of colleagues.
- To understand that users are not only data points, but also living human beings with friends, families, and communities, and that affecting one user also means affecting the other humans in that person’s life.
- To remember you are not only an engineer but also a member of a society, with obligations to your fellow human beings.
We could draft an Engineering Oath, post it online, and allow individual technologists to add their signatures, stating their name, hometown, personal website, and affiliations. This browsable directory would make it easy to see what percentage of a given company’s engineers have taken the oath, and it would give the engineering community a common set of ethics to guide the evolution of software.
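As a small illustration of what that directory might look like under the hood, here is a hedged Python sketch; the Signature record, the affiliation matching, and the sample data are all assumptions made for the sake of example.

```python
# Hypothetical sketch: computing oath coverage for a company from a
# public directory of signatures. Names and data are illustrative only.
from dataclasses import dataclass

@dataclass
class Signature:
    name: str
    hometown: str
    website: str
    affiliation: str  # current employer or institution

def oath_coverage(signatures: list[Signature], company: str, engineer_count: int) -> float:
    """Percentage of a company's engineers who have signed the oath."""
    if engineer_count == 0:
        return 0.0
    signed = sum(1 for s in signatures if s.affiliation == company)
    return 100.0 * signed / engineer_count

directory = [
    Signature("Ada Example", "London", "https://example.org", "ExampleCorp"),
    Signature("Lin Example", "Taipei", "https://example.net", "ExampleCorp"),
]
print(f"{oath_coverage(directory, 'ExampleCorp', 40):.0f}%")  # 5% of 40 engineers
```

The exact mechanics matter less than the social effect: a public, per-company percentage would make an engineering team’s collective commitments visible to the people who use its products.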
As engineers, we can ask ourselves some basic questions:
Will we feel accountable for the behavioral outcomes of the software we introduce to the world? Will we recognize our responsibility to our fellow human beings to build them decent, useful, powerful, and ethical tools? Will we make things that trick and seduce, or things that nourish and teach? Will we optimize for page views and profit, or for social impact and beauty?