Value-Sensitive Design is, by definition, the practice of designing a technology while taking human values into account. It does so by systematically accounting for the ethical values of both direct and indirect stakeholders. For example, designing a neighbourhood involves the value of safety. Keeping safety in mind while designing influences your decisions about lighting, urban greening, traffic lights, and so on. If brighter lights or more lampposts are incorporated in your design, your solution could deter crime. In this way, the value of safety (for the users of a certain building or city block) is incorporated into your design.
How can we design for these values? Value-Sensitive Design defines three building blocks: values, norms, and design requirements. When innovating responsibly, we can either start from values and derive norms and design requirements from them, or start from concrete design requirements and work back towards norms and ultimately values. Both routes have proved effective and can lead to very different solutions to the same problem. Following a different route gives you a different view of the issue, which stimulates innovation.
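As a minimal sketch of this hierarchy (the class names and the example norms and requirements below are my own illustration, not part of any official VSD vocabulary), the three building blocks can be modelled as nested structures, with the safety example from above filled in:

```python
from dataclasses import dataclass, field

# Hypothetical model of the Values -> Norms -> Design Requirements hierarchy.
@dataclass
class DesignRequirement:
    description: str  # a concrete, testable demand on the design

@dataclass
class Norm:
    description: str  # a rule that makes a value concrete
    requirements: list[DesignRequirement] = field(default_factory=list)

@dataclass
class Value:
    name: str  # an abstract ethical value, e.g. "safety"
    norms: list[Norm] = field(default_factory=list)

# Top-down route: start from the value and derive norms and requirements.
safety = Value(
    "safety",
    norms=[
        Norm(
            "streets must be well lit at night",
            requirements=[
                DesignRequirement("install brighter lampposts"),
                DesignRequirement("add extra lampposts in dark alleys"),
            ],
        )
    ],
)

# Bottom-up route: start from a concrete requirement and ask which
# norm, and ultimately which value, it serves.
requirement = DesignRequirement("install brighter lampposts")
norm = Norm("streets must be well lit at night", [requirement])
value = Value("safety", [norm])
```

The two construction orders in the sketch mirror the two design routes described above: either direction ends in the same kind of structure, but starting points shape which norms and requirements you think of first.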
It is becoming less and less futuristic to imagine a world where computers aid us in our daily lives. That aid can be strengthened by giving computers a degree of autonomy, letting them make decisions by themselves. There are already initiatives for cars that drive themselves, with safety systems that can react to an accident before you can, and fridges that order food when they sense you are running out of apples. For household appliances like these, we most likely want to incorporate values such as cost-effectiveness and convenience.
But there is another side to this coin when it comes to drones. A drone is an unmanned aircraft. Drones became popular with the general public when they could be fitted with a GoPro camera to capture breathtaking aerial views of cities, memorable videos of weddings and vacations, or cool recordings of concerts (https://www.youtube.com/watch?v=fl2jeQDMZ28). But the technology is older, and more widely used in the military. Drone strikes have become more popular with the world's armies, as drones cost less than jet pilots and can target terrorists efficiently.
But anyone who has seen the movie Terminator knows the danger of giving technology too much power and autonomy. While that scenario does not seem very likely according to drone expert Prof. Alex Leveringhaus at TU Delft, it does raise questions about incorporating ethical values. How can a drone correctly determine who, how, and when to attack a target, and who is responsible when it is wrong? How big is this risk, and are we willing to take it?
The first questions relate to risk analysis: asking ourselves whether the added safety value is enough to justify innocent deaths caused by a drone's faulty judgment. The latter question relates back to responsibility, discussed in week 1: who is (in)directly responsible for innocent deaths, the machine or the army?
Thinking about these issues while the product is still being designed improves the value of the end product, in line with the Collingridge dilemma, although it does mean it will take longer before the technology is available. Above all, it shows that a corporation cares about its product as well as the values of its consumers, by taking responsibility for avoiding possible future product disasters.
And this is what Responsible Innovation is all about: incorporating values by thinking differently, and showing you care as much about your product as about the values of your customers. It means taking responsibility for what you design, and for what you need to change in your design to be inclusive. And by increasing the value of our designs and our thought processes, we become better people.