Blog • Dorothy R. Howard


November 16, 2017

REVIEW: D.E. Wittkower. "Lurkers, creepers, and virtuous interactivity: From property rights to consent and care as a conceptual basis for privacy concerns and information ethics." First Monday, Vol. 21, No. 10, October 3, 2016.
Dorothy Howard
“To be entangled is not simply to be intertwined with another, as in the joining of separate entities, but to lack an independent, self-contained existence. Existence is not an individual affair. Individuals do not preexist their interactions; rather, individuals emerge through and as part of their entangled intra-relating.” —Karen Barad, Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. p. ix

In this paper D.E. Wittkower suggests that “care, support, and intimacy” are a useful set of concepts to guide ethical approaches to privacy debates. The author argues that data is produced socially and interactionally, arriving at the idea that consent is insufficiently realized when individuals are assumed to be the agents who grant it. Wittkower's approach is to turn our attention to an ecosystem of legal scenarios, institutional actors, power directionalities, and attitudes. To do this work, he analyzes several domains, including, at the end of the paper, behaviors on digital platforms colloquially known as lurking and creeping, which represent challenging cases for applying ethical approaches to interactivity.

I'm going to indulge Wittkower and think about care, support, and intimacy as social formations and interactions afforded by design considerations. I'd like to think that introducing the concept of affordances into Wittkower's discussion might complement others who want to think about the effects of design on our lives. Hopefully we can continue to explore together the valences of mobilizing affect theory to more carefully think about the dynamics of exchange and solidarity in a variety of situations, including human-institution, human-object, and object-object relations.

My argument is that the author's approach is one example of a coupling of the affective experiences of intimacy, support, and care with affordances, a concept taken up in many fields including design and engineering. The evolving conversations about affordances, broadly speaking, have developed frameworks for imagining the ways in which possibilities and available activities (perceptual, cognitive, psycho-social) are embedded in forms and organization (Gibson 1977, 1979/1986; Norman 2002). I hope we can more critically examine Wittkower's assumption that care is a categorically positive intention with a categorically positive effect, when care has also been used to justify ethically ambiguous and/or questionable interventions (Martin et al. 2015). Affordances can be functional and dysfunctional, planned and unplanned, and there are various strands of research on this I still have to track down.

The "Lurkers, Creepers" paper begins with a discussion of the current ways in which information exchange and interactivity are construed in various (legal) relationships to property. The author says data is generated collectively through interaction, making it hard to locate who data is 'about'. I think of how people generate data by chatting, collaborating in Wikipedia, and sharing machines. There are many other examples. Nonetheless, there are a lot of exceptions in that sometimes data is more about some people/things than others, and even when data might be generated interactionally, some people might have more right to it, for instance, if it is sharing their personal information, if they are minors, or maybe even if they own the machine in which the data is being extracted. But if we were to start testing these ideas in technical scenarios, his argument would become very difficult to manage. But maybe we can withold those particular kinds of evaluations of his argument and take it more at the face value of what he is saying about intimacy, care, and support.

I wish he had cited distributed cognition work in cognitive science, since he is walking on well-trodden ground when he talks about the co-production of information (data), collective authorship, and property. What might make his work unique from previous research is that he focuses on how we can define the ownership of, and the right to forfeit rights to, the information produced through co-production. Moreover, he asks how ethical design practices might benefit from using 'care, support, and intimacy' as guiding principles for approaching questions of how to manage group information production, when those production processes involve the production of data-commodities on social platforms. This is not a new argument, but I think of the historical processes culminating in (neo)liberal individualism which have shaped modern, Western ideas of private property and intellectual property rights, and how those relate to what we consider individual privacy (Foucauldian) as far as data-ownership goes, but that's another can of worms.

I think it would be useful to extend his argument to look at how consent for dataveillance is construed legally and in the design of Terms of Service agreements signed by people when they do things as individuals, like download software or log in to something. Using his argument, we might suggest that one person's consent to the ToS could lead to a violation of another's privacy, since our interactions are commingled. For example, if someone on GMail were corresponding with someone using secure email channels, the person seeking more privacy might forfeit some of their work and security by interacting with the GMail user. I suppose we might also consider this in the context of students being pretty much required to use school computers for many things. If individuals are not sole authors of the data they produce, who should consent and how? As an engineering or technical problem, systems just aren't built for this right now, but that doesn't mean we shouldn't critique the current norms.
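To make that engineering gap concrete, here is a minimal, purely illustrative sketch in Python. The class, field, and account names are my own hypothetical inventions, not anything Wittkower proposes or any existing system implements; it only shows what it might mean for a piece of interactionally produced data to require consent from every stakeholder rather than from a single account holder.

```python
# Illustrative sketch only: names and structure are hypothetical,
# not drawn from Wittkower's paper or from any real platform.
from dataclasses import dataclass, field


@dataclass
class DataItem:
    """A piece of interactionally produced data, e.g., an email thread."""
    description: str
    # Everyone entangled in producing the data, not just the account holder.
    stakeholders: set = field(default_factory=set)
    # Stakeholders who have actually granted consent to a given use.
    consents: set = field(default_factory=set)

    def grant(self, person):
        """Record consent from one stakeholder."""
        if person in self.stakeholders:
            self.consents.add(person)

    def may_process(self):
        # Individualist ToS model: one signature suffices.
        # Collective model sketched here: every stakeholder must consent.
        return self.stakeholders <= self.consents


thread = DataItem("email thread",
                  stakeholders={"gmail_user", "secure_mail_user"})
thread.grant("gmail_user")      # only one correspondent clicked "I agree"
print(thread.may_process())     # False: the other party never consented
```

Even this toy version surfaces the problem in the paragraph above: a real system has no reliable way of enumerating the other stakeholders, let alone collecting their consent.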

But though I appreciate that Wittkower attends to interactional production, I wonder if this way of thinking might be putting the cart before the horse. Another way of framing this: before working on reinventing systems and designs to facilitate the collective or group granting of consent, how about supporting the current range of projects trying to deal with the "fine print" problem, where businesses go to great lengths to make agreements so small people can't read them, so fast (radio, TV) people can't hear them, so byzantine people can't understand them, or literally discouraged from being read at all by way of manipulative interface design? There are also, of course, technical problems like changes to the ToS when new updates happen, so-called 'negotiated consent.'

There has already been significant work to improve the intelligibility of privacy conditions and agreements, and Wittkower ignores it. Proposed solutions to the ToS fine-print problem which I've discussed with others include Creative-Commons-like, image-based icons symbolic of different licenses, to make privacy conditions more comprehensible. Mozilla made some cool privacy icons, which were discussed at a W3C Workshop on Privacy for Advanced Web APIs convened 12/13 July 2010. The paper trail is out there. Still, I grant that such projects don't answer the problem Wittkower raises, that agreeing to ToS is done at an individual level, which is ineffectual because we generate data collectively through interaction. Maybe we can think about both problems with current security and consent to data-gathering at the same time rather than separately?
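As a companion thought experiment (the label vocabulary below is hypothetical; it is not Mozilla's icon set, a W3C specification, or anything Wittkower discusses), one could imagine privacy conditions expressed as a small set of machine-readable terms that an interface could render as icons instead of fine print:

```python
# Hypothetical privacy-label vocabulary, loosely analogous to how
# Creative Commons layers a human-readable icon over a machine-readable term.
# None of these keys or values come from a real standard.
PRIVACY_LABELS = {
    "retention": "deleted after 30 days",
    "third_party_sharing": "none",
    "ad_targeting": "opt-in only",
    "law_enforcement_access": "only with a warrant",
}


def summarize(labels):
    """Render the machine-readable terms as a short human-readable summary."""
    return "; ".join(f"{key.replace('_', ' ')}: {value}"
                     for key, value in labels.items())


print(summarize(PRIVACY_LABELS))
# retention: deleted after 30 days; third party sharing: none; ...
```

The point of the sketch is simply that intelligibility is partly a representation problem, which is why these projects can proceed independently of the collective-consent question Wittkower raises.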

Wittkower's demand that businesses be more care-minded is also counterintuitive because businesses are definitionally ruled by the logic of capitalism and accumulation. On the other hand, as the author says, corporations can be motivated to care about anything if they stand to profit from it or to decrease the chances of getting sued. Whether or not it matters that there are ulterior motives (always expanding and collapsing) is up for debate. Altruism is always mediated by economics, even when done to oppose various economic realities, and we benefit from considering altruistic actions and intentionalities as they relate to stances about economics.

But people are coached into not caring about consent because they see dataveillance as an equal trade for whatever experience they are getting out of the platform (Karahalios). Wittkower says that today’s youth already takes this for granted: “Contrary to the view from a property-rights-based perspective, it is not at all incoherent or contradictory to say — as students have told me in classroom discussions — both that ‘I want to make sure I look good in a photo before I let someone tag me in it,’ and that ‘I like targeted ads because they show me stuff that I might actually want to buy.’” What he comes to is that they understand the “latent moral ambiguities” but don’t really care because they think it’s a fair trade. If business-user intimacy is created by things like transparent data collection, what matters most: the ethics guiding business practices, the on-the-ground implementation of those practices, or users’ perceptions of the process? Care, support, and intimacy are appealing at the level of guiding principles of action, but how can they be weighted when each one represents a different set of possible considerations and interpretations?

Part of the difficulty of talking about consent and data relates to the current ways in which the term is construed legally in a variety of contexts. What are the politics of taking such terms out of particularized legal contexts, including informed consent as it governs procedures for permissions, information disclosure, or the giving or receiving of physical contact?

Conversations about digital labor have contributed the provocation: who shares the profits when data is brokered? It might be useful for us to consult the intellectual contributions of the Wages for Facebook project, which drew from the Italian 1970s Marxist feminist movement Wages for Housework, to add nuance to this discussion, particularly for thinking through ways in which social movements have mobilized to demand fair compensation for work. The premise is that users should share in the wealth they generate through their engagement, from data brokering and ad revenue. This is the consumer-as-producer argument in a nutshell. Information activists have acted on this premise by devising ways to value the worth of social media engagement and content production. Some of those have already been implemented by businesses, like the YouTube Partners/Creators Program or micropayments for website clicks, but many have not. This all starts getting obscene really fast for lots of reasons, like bots. I can't help going back to a point my current advisor Lilly Irani made to me on related subjects: "These might be bourgeois property solutions to a problem of how communications constitute collective life."

Wittkower is pessimistic about the asymmetrical interpersonal relationships created by structured interaction in social spaces online, just as he is pessimistic about business practices, but for different reasons. In general, this section seems to wander slightly away from the main argument, being less about the ethics of care as a consent issue than about redefining lurkers and creepers. But maybe they are related, because these different categories of users, and user affects, have different ways of negotiating relationships with the information others post, some more asymmetrical than others. My question, though, is whether that information asymmetry matters in the context of the ethics of consent. We should grant users some agency in that they already know, when they post an image to Facebook or a geolocation on Twitter, that anyone can have that information.

Overall, the paper provides a provocative bridge between the concepts of care, support, and intimacy and structured business-user and user-user interactions. With further work, I believe we can continue to find use and resonance in moving between psycho-social affects and design concepts. Going forward, I would like to think more about this conversation about care, support, and intimacy in business practices regarding data and privacy in relation to affordances, in ways not addressed in this paper, including considerations about the allocation of resources, labor, and time.

Sources

Gibson, J. J. (1986). The ecological approach to visual perception. Hillsdale, NJ: Erlbaum. (Original work published 1979)

Gibson, J. J. (1977). The theory of affordances. In R. Shaw & J. Bransford (Eds.), Perceiving, acting, and knowing: Toward an ecological psychology (pp. 67-82). Hillsdale, NJ: Erlbaum.

Norman, D. (2002). The design of everyday things. New York: Basic Books.

Martin, A., Myers, N., & Viseu, A. (2015). The politics of care in technoscience. Social Studies of Science, 45(5), 625-641. Thank you Feminist Labor Lab-mates for related, relevant discussions on care.

Foucault, M. (1977). Discipline and punish: The birth of the prison. New York: Pantheon Books.

Thank you Oliver Keyes for pointing me to Wittkower's paper.