The paper on which this summary is based is online here: http://www.ischool.washington.edu/vsd/vsd-theory-methods-draft-june2003.pdf
Value-sensitive design (VSD) consists of iterative applications of conceptual, empirical, and technical investigations with the explicit goal of identifying and supporting "enduring human values." In the past, there has been a lot of research on specific values (privacy, property and ownership, physical welfare, freedom from bias, universal usability, autonomy, informed consent, trust, etc.), but not an overarching framework. Value-Sensitive Design combines and extends the related fields of computer ethics, social informatics (which analyzes deployed technologies), computer-supported cooperative work, and participatory design to create such a framework.
VSD has eight unique features:
- VSD is proactive in influencing design
- VSD expands the scope where values can arise beyond the traditional office environment
- VSD combines conceptual, empirical, and technical investigations
- VSD includes all values, especially moral ones ("moral" is philosophically defined and pertains to fairness, justice, human welfare, virtue, etc.)
- VSD includes both direct stakeholders (interact directly with the technology) and indirect stakeholders (can be affected by the technology)
- VSD does not see values as only "inscribed into technology" (endogenous) or separate from technology (exogenous), but both - a technology may more readily support some values and inhibit others
- VSD assumes that some values are universally held, though they may have different instantiations
- VSD distinguishes between usability and ethical values:
  - usability and values can be orthogonal
  - usability and values may be mutually supportive (e.g. election software)
  - values may compromise usability (e.g. cookie notifications)
  - usability may compromise values (e.g. covert surveillance software)
Values, loosely defined, include the things that one considers important in life. Values can't be derived only empirically, since they're rooted in the subjective interests of people. Twelve values that are often implicated in technology, and which aren't fundamentally distinct, are:
- human welfare
- ownership and property
- privacy
- freedom from bias
- universal usability
- trust
- autonomy
- informed consent
- accountability
- identity
- calmness
- environmental sustainability
The first nine values "hinge on the deontological and consequentialist moral orientations," and the last three are related to system design.
As mentioned above, VSD includes conceptual, empirical, and technical investigations, iteratively applied.
- The conceptual investigation addresses the following questions: Who are the direct/indirect stakeholders affected by the design, and how are they affected? What values are implicated? How can we make tradeoffs between competing values in design, implementation, and use? Should moral values (e.g. privacy) outweigh non-moral values (e.g. aesthetics)? It's also important to carefully conceptualize what each value means in the particular context.
- The empirical investigation tests the conceptual design through interviews, observations, user studies, etc. Useful questions are: How do stakeholders view the values in the context in question? How do they prioritize individual values, usability considerations, and competing values? Are there differences between what people say and what they do? If the design includes an organization, what are the organization's motives, training, incentives, etc.?
- The technical investigation focuses on how existing technological properties and underlying mechanisms support or hinder values, and how they can be changed to better support them.
Here is a recommended course of action for VSD.
- Start with a value, technology, or context of use - choose the aspect that is most central to your work and interests.
- Identify direct and indirect stakeholders. There may be subgroups, and people may be members of more than one group. Power structures are often orthogonal to the distinction between direct and indirect stakeholders.
- Identify benefits and harms to each stakeholder group. If there are a lot of indirect stakeholders, give priority to those strongly affected or large groups. Consider varying technical, cognitive, and physical competency. If you're defining personas, put them in multiple stakeholder groups (as appropriate).
- Map benefits and harms onto corresponding values. Sometimes mappings will be direct, sometimes vague or multifaceted.
- Do a conceptual investigation of key values (including literature).
- Identify potential value conflicts. These should be seen as constraints on the design space. Some may need discussion between stakeholders to find workable solutions.
- Integrate value considerations into the organizational structure (which may include conflicting economic or power goals).
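The middle steps above (identify stakeholders, enumerate benefits and harms, map them onto values, flag conflicts) can be sketched as a toy data model. Everything here is illustrative, not from the paper: the class names, the example stakeholders, and the benefit/harm-to-value mapping are all hypothetical, and a real VSD analysis would be qualitative rather than a lookup table.

```python
# Hypothetical sketch of the VSD mapping steps; all names and data are
# invented for illustration, not taken from the VSD paper.
from dataclasses import dataclass, field

@dataclass
class Stakeholder:
    name: str
    kind: str                      # "direct" or "indirect"
    benefits: list = field(default_factory=list)
    harms: list = field(default_factory=list)

# Assumed mapping from concrete benefits/harms to implicated values
# (the "map benefits and harms onto values" step).
VALUE_MAP = {
    "deters assault in parking lot": ["human welfare"],
    "continuous monitoring stress": ["human welfare", "privacy"],
}

def implicated_values(s: Stakeholder) -> set:
    """Collect the values implicated by one stakeholder's benefits/harms."""
    out = set()
    for item in s.benefits + s.harms:
        out.update(VALUE_MAP.get(item, []))
    return out

def value_conflicts(stakeholders) -> set:
    """Flag values that one group gains and another group loses --
    candidates for stakeholder discussion (the conflicts step)."""
    gained, harmed = set(), set()
    for s in stakeholders:
        for b in s.benefits:
            gained.update(VALUE_MAP.get(b, []))
        for h in s.harms:
            harmed.update(VALUE_MAP.get(h, []))
    return gained & harmed

# Example: workplace camera surveillance with one direct and one
# indirect stakeholder group.
guard = Stakeholder("security staff", "direct",
                    benefits=["deters assault in parking lot"])
employee = Stakeholder("warehouse employee", "indirect",
                       harms=["continuous monitoring stress"])

print(sorted(implicated_values(employee)))     # ['human welfare', 'privacy']
print(sorted(value_conflicts([guard, employee])))  # ['human welfare']
```

The conflict check makes the paper's point concrete: the same value (here, human welfare) can be supported for one stakeholder group while being compromised for another, which is why conflicts act as constraints on the design space rather than problems with a single right answer.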
- Heuristics for interviewing stakeholders: Ask "why?" (or a less challenging query) often. Ask about values both directly and indirectly (through hypothetical situations, common events, tasks, behaviors) - make sure to conceptualize what the topic entails beforehand.
- Heuristics for technical investigations: Make explicit how a design tradeoff maps onto a value conflict and affects stakeholders. Design flexibility into the system to accommodate unanticipated values and value conflicts. Note the control of information in underlying protocols, and add the ability to switch the release of information off.