True research is highly creative; the constant "re"-"searching" of truth; a mode of being and thinking in constant flow. Deep waters of knowledge are the foundation for understanding more of what we are looking for; a depth that we as researchers constantly need to nourish in ourselves. And still it is the ray of light or shadow in our minds that suddenly gives us a clue as to where truth may finally be buried. Only to find that it is never buried, but always in flow. My life is in this constant flow, and the drops of knowledge I could contribute can be found here in my publication list. But it must also be noted: research is not a lonely endeavour. It is also a matter of discourse, dispute and constant co-operation. The journey of my thoughts and my companionships with others is captured on this site. At the core was always one mission: understanding how people interact with technology and then building technology such that it optimally serves their Eudaimonia.
For access to publications go to Researchgate:
Sarah Spiekermann on ResearchGate
Since 2014, I have been engaged with the fundamental topic of ethics in IT. After almost 14 years of research on digital privacy from 2001 onwards, it was clear that privacy and control over machines are not the only values that count for our digitized future. We have to think fundamentally about a broad spectrum of values in IT system design. For this reason, I first published a textbook for students on ethical IT innovation with Taylor & Francis in New York. This book also contained a first guide to what values are and how we can design systems in a value-based, ethical way.
Spiekermann, S. (2016)
Ethical IT Innovation - A Value-based System Design Approach
New York, London and Boca Raton: CRC Press, Taylor & Francis.
Shortly afterwards, I started working with the IEEE engineering association to build a first standard on ethical engineering, going from value principles to practice. Working for three years with system engineers from around the world to build the IEEE P7000 standard, I learned a lot about the real problems of engineering. The first major review article on 'Value-based Engineering for Ethics by Design' resulted from this standardization work. It provides clear guidance and definitions on how to build value-based systems.
Spiekermann, S. and T. Winkler (2020, forthcoming)
"Value-based Engineering for Ethics by Design."
The value-based engineering approach is also described in the German-language Philosophical Handbook of Artificial Intelligence. Here I compare value-based thinking with the utilitarian approach to 'moral machines'. From my point of view, machines cannot have morals. They can only have technical dispositions built into them so that certain values can then unfold for us humans.
Spiekermann, S. (2020)
Digitale Ethik und Künstliche Intelligenz.
Philosophisches Handbuch der Künstlichen Intelligenz.
München, Springer Verlag.
In my popular science book on the subject, I describe what a world could look like in which digital ethics is lived as I imagine it, together with a description of some fundamental values for the digital age: freedom and knowledge.
Spiekermann, S. (2019)
Digitale Ethik - Ein Wertesystem für das 21. Jahrhundert.
München, Droemer.
When I started my research on personal data markets in 2012, I believed that people should be given ownership rights to their private data. I was able to support this belief in a major experiment. Together with Jana Korunovska, I studied more than 1,000 Facebook users, who only began to value their data (in monetary terms) at the moment they learned that there is a market for it. Moreover, users get very angry when they are not given control over their personal data.
Spiekermann, S. and J. Korunovska (2016)
"Towards a Value Theory for Personal Data."
Journal of Information Technology (JIT) 32(1): 62-84.
Given my belief in personal data ownership rights, I developed a market model for personal data markets together with Alexander Novotny. Together we interviewed the 'who-is-who' of the privacy world to test our model, which describes how personal data markets could be designed in such a way that personal data is traded, but under fair conditions for the data subjects as the natural owners of their data.
Spiekermann, S. and A. Novotny (2015)
"A vision for global privacy bridges: Technical and legal measures for international data markets."
Computer Law and Security Review 31(2): 181-200.
How difficult and conflict-laden such an idea is in practice though - also from a legal and ethical point of view - became clear to me when I worked with my colleagues Alessandro Acquisti and Rainer Böhme on a special issue on Personal Data Markets in the journal Electronic Markets. The joint reflection on the challenges of such markets has started to cloud my ideas on property rights.
Spiekermann, S., A. Acquisti, R. Böhme and K.-L. Hui (2015)
"The challenges of personal data markets and privacy."
Electronic Markets 25(2): 161–167.
I then started a project with the Viennese privacy activist Wolfie Christl, whose research on personal data markets I co-edited in the book 'Networks of Control'.
Christl, W. and S. Spiekermann (2016)
Networks of Control - A Report on Corporate Surveillance, Digital Tracking, Big Data & Privacy
Vienna, Facultas.
The book Networks of Control illustrates the point that Shoshana Zuboff also makes in her book on surveillance capitalism: using personal data as a commodity means commoditizing humankind and turning men and women into resources for economic profit. Against this background, I can no longer pursue the idea of ownership rights for personal data. My ultimate ethical view is that communication between people should, on principle, never be commercialized.
In my second book (my habilitation), I worked on the topic of human control in the Internet of Things ('Ubiquitous Computing'). Together with Frank Pallas, I coined the term "technology paternalism" in this context.
Spiekermann, S. and F. Pallas (2005)
"Technology Paternalism - Wider Implications of RFID and Sensor Networks."
Poiesis & Praxis - International Journal of Ethics of Science and Technology Assessment 4(1): 6-18.
In order to measure to what extent people still feel in control in an Internet of Things, I developed scales that measure the perceived control of technology users in the IoT.
Spiekermann, S. (2007)
Perceived Control: Scales for Privacy in Ubiquitous Computing.
Digital Privacy: Theory, Technologies and Practices.
A. Acquisti, S. D. Capitani, S. Gritzalis and C. Lambrinoudakis. New York, Taylor and Francis.
Unfortunately, however, the use of this measurement instrument in the Metro Future Store revealed how much people retreat into learned helplessness, regardless of whether privacy technology is available to them or not. More control technology therefore does not mean that people trust more; they continue to feel helpless.
Guenther, O. and S. Spiekermann (2005)
"RFID and Perceived Control - The Consumer's View."
Communications of the ACM 48(9): 73-76.
As a business informatics scientist, I not only investigate human perception of technology, but also make suggestions on how we can build better technology. The core article on privacy engineering that I wrote with Lorrie Cranor was a very important personal milestone for me in trying to contribute to better system design.
Spiekermann, S. and L. F. Cranor (2009)
"Engineering Privacy."
IEEE Transactions on Software Engineering 35(1): 67-82.
Later, together with Marc Langheinrich, I investigated whether and how software and systems engineers actually feel ready to implement privacy and security measures. The results of our joint study are sobering: almost 20 years after the invention of the World Wide Web and the widespread diffusion of technology into our everyday life, 40% of system engineers do not feel responsible for the systems they build.
Spiekermann, S., J. Korunovska and M. Langheinrich (2018)
"Inside the Organization: Why Privacy and Security Engineering Is a Challenge for Engineers."
Proceedings of the IEEE 107(3): 1-16.
We continue to work on better technology. In this vein, I supervise system design projects with ethical import. One such project was with Sabrina Kirrane and Peter Blank on the subject of privacy-friendly drones.
Blank, P., S. Kirrane and S. Spiekermann (2018)
"Privacy-Aware Restricted Areas for Unmanned Aerial Systems."
IEEE Security & Privacy 16(2): 70-79.
My empirical work on the privacy paradox and the helplessness of users in the Internet of Things has led me to commit politically to greater privacy. I became rapporteur at the EU Commission's DG Connect for the working group that was to develop the first 'Privacy Impact Assessment (PIA) for RFID', which was officially ratified by the European Commission in 2011. Upon completion of the working group, I led the group of European business partners who negotiated with US bodies on how to do PIAs (for RFID/NF) in practice. The history of the emergence of this PIA approach and the political struggles surrounding its coming-about are described in:
Spiekermann, S. (2012)
The RFID PIA - Developed by Industry, Agreed by Regulators.
Privacy Impact Assessment: Engaging Stakeholders in Protecting Privacy.
D. Wright and P. De Hert. Dordrecht, Springer Verlag.
The scientifically matured PIA method was then published in the European Journal of Information Systems, together with my former PhD student Marie Oetzel, who had always supported the PIA work in the background and had also prepared a corresponding guideline for the German Federal Office for Information Security (BSI).
Oetzel, M. and S. Spiekermann (2013)
"A systematic methodology for privacy impact assessments: a design science approach."
European Journal of Information Systems 23(2): 126-150.
What is crucial is that if you go through a PIA properly and analyze all the threats around a technology in such detail that you can take appropriate measures against them, then you end up with "Privacy by Design". I have described this flow, from PIA to Privacy by Design, in the ACM magazine Communications of the ACM.
Spiekermann, S. (2012)
"The Challenges of Privacy by Design."
Communications of the ACM 55(7).
In 2001, I first experimentally demonstrated a phenomenon that was later coined the "Privacy Paradox". This behavioral paradox describes how people, despite their explicit desire for privacy, disclose private information online to a vast extent; that is, they seem to forget all their good privacy intentions.
Spiekermann, S., J. Grossklags and B. Berendt (2001)
E-privacy in 2nd generation E-Commerce.
Proceedings of the 3rd ACM Conference on Electronic Commerce EC'01, Tampa, Florida, USA, ACM Press.
Later I was able to deepen this work with Hanna Krasnova with regard to interactions in social networks.
Krasnova, H., S. Spiekermann, K. Koroleva and T. Hildebrand (2009)
"Online Social Networks: Why we disclose."
Journal of Information Technology 25(2): 109-125.
For me personally, the most important basic research I could contribute to this subject is on the role of entropy in online communication. Entropy, i.e. the degree of chaos in our environment, seems to significantly influence our online disclosure and communication. In an online experiment, I could observe how increased entropy in an interface comes with more communication, at least in terms of the number of words. But at the same time the richness (depth) of the content of what is said collapses. More entropy, less content. The same is true for self-references: the more entropy in the online environment, the less people are able to refer to themselves. My data suggests: entropy is eating the self.
Spiekermann, S. and J. Korunovska (2014)
"About the Importance of Interface Complexity and Entropy for Online Information Sharing."
Behaviour & Information Technology 33(6): 336-345.
In my Ph.D. at Humboldt University of Berlin, I worked on the question of how people can be supported by anthropomorphic software agents while shopping online. I built a software agent called "IWA", which picks up on users' perceived purchasing risks and tries to reduce them in the sales dialogue with customers.
Spiekermann, S. and C. Paraschiv (2002)
"Motivating Human-Agent Interaction : Transferring Insights from Behavioral Marketing to Interface Design."
Journal of Electronic Commerce Research 1(2): 255-285.
An essential scientific thought in the development of this software agent was the aspect of privacy. I conceptualized the sharing of personal (sometimes delicate) information with such an AI as a new search cost category for internet users. It "costs" you privacy to reveal something about yourself to an AI.
Annacker, D., S. Spiekermann and M. Strobel (2001)
E-privacy: A new search cost dimension in online environments.
14th Bled Conference on Electronic Commerce, Bled, Slovenia.
Today we know that 90% of social network users deliberately withhold information because they fear for their privacy. However, people also reveal tremendous amounts of personal data. We do not know yet whether the reluctance to share will increase in the wake of surveillance capitalism. Technical developments, such as the MyData movement or decentralized federated identity projects, as well as end-to-end privacy architectures, could mean that users no longer have to worry about their private information. If this ideal course of technological development really unfolds in the coming twenty years, then my search cost category 'privacy' would no longer apply. However, if surveillance capitalism prevails, this search cost category is likely to become more important.