A Hippocratic Oath For Software Engineers?

Think, for a moment, about how much time passes between when you wake and when you have a screen in your face.

If you’re anything like me, that time has dropped from hours, to minutes, to seconds in the last few years. And tho we, as a startup community, celebrate apps as addictive, the effects of these addictions are beginning to mark indelible train tracks across the soft tissues of our societies.

Some of these new markings are wonderful and empowering. But, like any addictive substance, there are side effects that can become more and more pronounced over time. We talk of learning to program or be programmed, but that is actually happening under our noses. And the ranks of those doing the programming, tho small, are having an increasingly outsized impact on the world.

As the saying goes, with great power comes great responsibility.

So, it was with great interest that I read the recent proposal from Jonathan Harris calling for a Hippocratic Oath for Software Engineers. It’s a fantastic piece that I would encourage you all to read in its entirety, tho I’ve highlighted a few soundbites below.

He sets the stage for the importance of this kind of pact as follows:

We inhabit an interesting time in the history of humanity, where a small number of people, numbering not more than a few hundred, but really more like a few dozen, mainly living in cities like San Francisco and New York, mainly male, and mainly between the ages of 22 and 35, are having a hugely outsized effect on the rest of our species.

Through their inventions, they alter the behavior of millions of people, yet very few of them realize that this is what they are doing, and even fewer consider the ethical implications of that kind of power.

He continues:

On the Web, there are two main kinds of companies: marketplaces and attention economies. Marketplaces aim to eliminate urges by feeding them quickly (find a date, book a room, etc.), while attention economies aim to keep the urges going forever (continuous updates, another cool video, more new messages, etc.).

On the Web, where people have learned not to value things directly, the most common business model is to make a product, give it away for free, attain tremendous scale, and then, once you have a lot of users, to turn those users into the product by selling their attention to advertisers and their personal information to marketing departments.

This is a dangerous deal — not necessarily in economic terms, but in human terms — because, once the user has become the product, the user is no longer treated as an individual but as a commodity, and not even a precious commodity, but as one insignificant data point among many — a rounding error — meaningful only in aggregate.

Thinking of humans this way produces sociopathic behavior: rational in economic terms but very bad in human terms.

He contextualizes:

On a small scale, the effects of software are benign. But at large companies with hundreds of millions of users, something so apparently small as the choice of what should be a default setting will have an immediate impact on the daily behavior patterns of a large percentage of the planet.

He cautions:

In its capacity to transform the behavior of people, software is a kind of drug — a new kind of drug. As there are many kinds of drugs (caffeine, echinacea, Tylenol, Viagra, heroin, crack), so are there many kinds of software, feeding different urges and creating different outcomes. When designing technology, you should understand what human urge or condition you will be extending. So choose your urges wisely.

As the technology industry empowers “growth hackers” to aggressively increase scale, reach, and engagement, perhaps we need to be asking more questions about how our creations are impacting the lives of the people behind the eyeballs.

As with doctors who swear to uphold the Hippocratic Oath in their efforts to ethically practice and distribute medicine, is there a place for a similar creed for engineers designing a wholly different type of drug?