There’s a lot of pain in the world right now. Even in nations where freedom exists, the rights and privileges of citizens are eroding away. The world is changing. Freedom is under attack from every corner and by every means possible, be it war, oppression, apathy, or the destruction of rights by the very people entrusted to uphold and sustain them. As technologists, we have a duty and a responsibility to evaluate our roles in those changes.

Government

Technology is an essential part of modern government. No government, whether based on freedom or on oppression, can survive without technology to support it. In a 1999 Department of Energy whitepaper, author Jon Pietruszkiewicz stated:

Technology is also an essential ingredient in our country’s ability to achieve important public objectives and government missions in areas such as public health, environmental protection, defense, space, and energy.

https://www.nrel.gov/docs/gen/fy00/26970.pdf

This document was written in the early days of the Internet. In the 20+ years since, the world of technology and its role in government have expanded exponentially. Advances in hardware and connectivity, such as massive cloud data center infrastructures, have made technologies like machine learning, artificial intelligence, the Internet of Things, facial recognition, and ubiquitous cameras more powerful and accessible, helping them become pervasive at every level of government.

Facial Recognition

Take, for instance, facial recognition. Perhaps no other technology in recent years has faced more controversy over its role within the halls of government. It seems that every week there is a news story about a failure or misuse of facial recognition, or about protests against its use.

Law enforcement tells us it makes an ideal tool for finding wanted criminals. Government agencies tell us it’s an ideal tool for verifying someone’s identity and determining eligibility for government services. Businesses see it as an ideal tool for improving in-store sales and customer service.

On the opposite end, many others would tell us it represents an invasion of privacy, that we cannot give true informed consent, and that it will lead to massive amounts of identity theft and abuse. Most of all, they tell us that it leads to racial discrimination and the oppression of minorities.

Time and again we have seen that technologies such as facial recognition are far less accurate for minorities, and instances of mistaken identity leading to the arrest and incarceration of minorities abound. Some studies have found that women of color are misidentified by facial recognition as much as 35% of the time, while white males were misidentified only 1% of the time.

There are a number of contributing factors, including the fact that the majority of developers and engineers working on technology such as facial recognition are white males who failed to properly consider how skin tone affects the accuracy of the algorithms they were employing.

Another factor is that these systems are often being designed for use by law enforcement. As such, the images used to train these systems come largely from mug shots from police databases, which are often extremely skewed to minority ethnic groups due to biased policing policies in many parts of the world.

But facial recognition is just one example of the dangers of technology in government hands. What about video cameras?

Video Surveillance

Whether it’s driving down the road, shopping in a store, walking down the sidewalk, or even in and around our own homes, cameras are everywhere we go. Most of us even carry a camera around in our pockets or hands in the form of cell phones.

For the most part, we accept these as part of our lives and don’t give them much of a second thought. That is, until we get a ticket in the mail showing we were half a second late going through that red light or driving 6 miles per hour over the speed limit. Or until we get a notice that our iCloud or OneDrive has been hacked and our “private” photos are now in the hands of an extortionist demanding money to keep them private, or simply posted on the internet for all to see.

But cameras in the hands of the government take that to the next level. In some places, such as London, England, the government has gone all-in on video surveillance, spending as much as 20% of its annual law enforcement budget on it. Cameras are everywhere, watching everything people are doing. The government is watching you. It is “Big Brother” come to life. And it gets worse with each passing year.

And the funny part is, there is virtually no evidence that it works to stop crime. What it does do is give the government millions and millions of hours of video footage to feed into its AI and facial recognition databases to track what you do.

And some governments take that data to extreme ends. Evidence abounds that China, Russia, and others use video surveillance to oppress minority groups within their borders.

I could cite many more examples for every single one of the technologies I listed above, but you get the point.

The Role of Technologists

I use the word “technologists” deliberately, because it is all-inclusive. So many people are involved in creating this technology: developers, managers, engineers, data scientists, testers, steering committees, executives, and so forth.

Each and every one of us shares a measure of responsibility for what happens with the technologies we create. Some of us take the attitude that it’s “just a job” and that what happens with what we create is not our problem. If that’s your attitude, you ARE part of the problem.

How the technologies we create are used seems to get most of the public attention. When you see news stories about technology and ethics, it’s often because that technology has been misused or abused. But ensuring the ethical use of what we create is only part of our responsibility. Our ethical responsibility starts long before that technology gets deployed.

As creators, we are also responsible for the process by which these things get created. Are we using ethical methodologies to build and test our technology? Are discussions about ethics, diversity, and accessibility part of design discussions from the very beginning? If not, we are failing in our responsibilities. Let’s look at a telling example: the Microsoft Kinect.

Microsoft Kinect

The Kinect was an accessory with great potential. Its array of cameras could track your body movements and facial expressions and translate them into in-game actions. It was one of the reasons I got a launch-day Xbox One console. And for me and my family, it worked great and was a lot of fun. But guess what happened if you had dark skin?

More often than not, the Kinect would fail to recognize or track users with dark skin. Months of back and forth with Microsoft eventually led to revelations that, much like the failures of many facial recognition technologies, Microsoft had never really tested the device against a wide range of people with different skin tones and in different lighting. And while it wasn’t the only reason the Kinect eventually went down as a failure, it was certainly one of the major contributors to its demise. Microsoft developers and designers failed in their responsibility to develop ethically.

Volkswagen Emissions Scandal

In the first part of this century, Volkswagen diesel vehicles were having trouble keeping up with ever-tightening pollution standards. The company could have taken the ethical path and worked to lower the emissions of its vehicles. But that would have hurt performance and, they felt, sales of their diesel vehicles. So Volkswagen took another route. As James T. Kirk once said, they “changed the conditions of the test”.

Volkswagen developed a piece of code to include in its engine control software. This software could detect when the engine was operating under the kinds of testing conditions that government agencies such as the EPA put vehicles through when checking compliance with regulations.

When the software detected those testing conditions, it radically changed the engine’s performance parameters to lower its emissions for the duration of the test. They found a way to cheat the test. As a result, they were able to sell millions of vehicles that blatantly and intentionally violated environmental regulations in the US, Canada, and the EU.

The management of Volkswagen failed their ethical responsibilities by pushing profit over the law. And the engineers at Volkswagen failed their ethical responsibilities by writing the software and engineering the diesel engines to support that management decision.

Two examples: one an ethical breach of failing to consider large groups of society, the other an ethical breach of prioritizing profit over the law. In both cases, technologists failed to follow the ethical path.

Accessibility

Next, let’s look at an area where developers in particular are largely failing as an industry: accessibility. In the development of most software and websites, accessibility is an afterthought, if it gets thought of at all. And yet there are millions and millions of people the world over with physical challenges that make using those websites or that software difficult. And even in this “modern age”, there are millions more who don’t have access to decent computer or phone hardware, or to high-speed internet.

When we design our websites, are we considering the blind, the color-blind, those with dexterity or mobility issues, people with slow or restricted internet access, and so forth? Do we incorporate those considerations into our designs from the beginning? Is our HTML using only elements that are accessible to screen readers? For example, are we using the for attribute on our label tags to identify which input each label belongs to? Are we using button tags instead of a div with a JavaScript onclick event?
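To make that concrete, here is a minimal sketch of the difference (the field names and the subscribe() handler are just placeholders for illustration):

    <!-- Accessible: the label is tied to the input via for/id, and a real
         button is keyboard-focusable and announced correctly by screen readers. -->
    <label for="email">Email address</label>
    <input id="email" type="email" name="email">
    <button type="submit">Subscribe</button>

    <!-- Inaccessible: the label text is not associated with the input, and a
         clickable div is invisible to keyboard and screen reader users. -->
    <span>Email address</span>
    <input type="email" name="email">
    <div onclick="subscribe()">Subscribe</div>

Choices like these cost almost nothing when made at design time, but they are painful to retrofit after launch.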

Are we regularly testing our designs against the Web Content Accessibility Guidelines (WCAG) and Americans with Disabilities Act (ADA) requirements? Does our design provide accommodations for those with various types of color-blindness?
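One of the simplest WCAG requirements illustrates the point: color must never be the only means of conveying information, because a user with color-blindness may not perceive it at all. A minimal sketch, with placeholder IDs and class names:

    <!-- Color alone: a red border (via the error class) is the only
         indication that something is wrong. -->
    <input id="username" type="text" class="error">

    <!-- Color plus text: the error is spelled out and tied to the field,
         so screen readers announce it and color-blind users can read it. -->
    <input id="username" type="text" class="error" aria-describedby="username-error">
    <p id="username-error">Please enter a username.</p>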

There are a lot of parts to accessibility in software design, and if we aren’t considering them, we’re failing in our ethical design responsibilities.

Our Responsibility

So it’s not just potential government abuse that we have to guard against. We also have to consider how to support the groups outside of the “mainstream” customer base. These are all part of the legal, moral, and ethical responsibilities we have as creators of technology.

When we undertake work on a project of any kind, we have a responsibility to consider the ethical aspects from the beginning. It’s not easy, and there is a cost to being ethical. Companies will often push back in the interest of speed and short-term profits. I have experienced this on a number of occasions over my career. Sadly, it was most often at large companies, like banks and insurers, that should instead have been at the forefront of ethical business practices.

Pushing back takes courage, and it’s not easy, especially in a company culture that prioritizes profit over everything else. In companies like that, pushing ethical design can cost you. Sometimes that cost is career advancement and promotion opportunities. For some people, it can even cost them their jobs, whether they are fired or forced to resign. The news regularly provides high-profile examples of people losing their jobs for taking ethical stands. How many ethicists have resigned in protest or been fired in recent years at Google or Facebook?

That’s a high cost and not something to take lightly. Certainly, in most cases, we can’t rely on those who report to shareholders (i.e. executives) to prioritize ethical development. So if we, as the creators in our businesses, don’t take a stand, who will?

Further Reading

Here are some links to further reading.

Codifying Developer Ethics

There are some organizations that have made attempts at codifying ethics for software developers and engineers. Take a look at the following pages:

White Papers & Articles on Developer Ethics

Here are a few links to some other reading and discussions about developer ethics.