Bryan Stevenson offers a wonderfully rich perspective on attitudes in the United States. He points to some very real consequences of a general lack of social conversation, understanding, and appropriate action by our society. These consequences were fueled by intolerance and fear.
Coaching moment: What does this video have to do with digital identities? Everything. Bryan talks eloquently about the humanity in our lives. In the digital world, humanity is largely stripped away and reduced to points on a line, or organized according to a specific (not necessarily appropriate) framework, or a gathering of bits that someone can make assumptions about.
We all know that we’re more than the sum of our parts: we may have purchased a thing, or been involved in an event, or may know a particular person, but that thing, event, or person does not define our whole life. We need tools to help us explore and express our individual needs, wishes, and priorities online as well as off. When we are only subject to the rules of others, we cannot be ourselves.
This session features Lauren Gelman, BlurryEdge Strategies, and Kevin Mahaffey, Lookout Mobile Security. Kevin says the most powerful forces in a company are security and privacy. However, no start-up starts with a Chief Privacy Officer. Lookout uses a “New York Times test”: everything you’re doing should be publishable on the front page, including how your product works. “Everyone complains about privacy policies, but the more you communicate with users, the more you can avoid a whole world of pain.”
Lauren: what if your device was stolen? You probably don’t want to notify the thief that the device is being tracked. What’s your threat model? Who’s looking for your data?
Kevin: you have to choose between encrypting user data and supporting password resets. There are constraints from many interests that will prevent you from doing what you want. TRUSTe is doing some good work.
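Kevin’s encryption-versus-resets point is worth unpacking: if user data is encrypted under a key derived from the user’s password, then a password reset produces a different key, and everything encrypted under the old key becomes unrecoverable. A minimal Python sketch of that trade-off (the key-derivation parameters here are illustrative, not Lookout’s actual design):

```python
import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    # Derive a 32-byte encryption key from the user's password
    # using PBKDF2-HMAC-SHA256.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

salt = os.urandom(16)
key_v1 = derive_key("original-password", salt)

# The same password and salt always reproduce the same key...
assert derive_key("original-password", salt) == key_v1

# ...but a password reset means a new password, hence a different key:
# anything encrypted under key_v1 is now unrecoverable.
key_v2 = derive_key("reset-password", salt)
assert key_v1 != key_v2
```

Services that want both strong encryption and easy resets typically escrow a data key somewhere on the server side, which just moves the question to who holds that key.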
Each platform makes decisions about how users will make decisions about their use of the device. There is tremendous liability for companies that misuse customer data, and users are starting to weigh this as a decision point. Compliance is a smaller part of Lauren’s work; there’s a whole lot of unregulated activity going on. She gives a company a “gut check” on what users would think of its practices (collecting location info, what counts as reasonable notice), which she later translates into a document.
Compliance is not big for startups. The companies that succeed are likely to be those who handle privacy best in any new field.
Server location and data protection: different countries treat data differently, and what about later, when the data becomes valuable? This is a really hard problem; the best answer is to locate servers in countries with the best policies (Kevin Marks suggests Iceland). Have policies that spell out requirements: what you hold, retention periods, whether there is an alternative to the normal procedures, etc. At the other extreme, all user data is going into the cloud, such as Amazon’s services. This is an adjustment for people. Who holds the key?
New changes to Facebook? It’s a decision to work with them or not. Lauren doesn’t believe that Facebook-like practices will happen again. Using FB Connect is a decision to facilitate user authentication.
What do you think about AWS services, and the 80-page Terms of Service that allows a very invasive data policy in Amazon’s favor? Lauren: a lot of people are trusting what Amazon’s going to do. I’ve read their TOS and I don’t know what Amazon’s going to do. It’s important to ask about notice, and what kinds of policies need to be ported from cloud hosts into your products/services.
Not in this session but related: I Shared What?!? – a service that shows you what information you’re sharing when you use Facebook or FB Connect.
This session is a “behind the scenes look at Microsoft’s internal privacy program.” See the agenda for more information. Participants: Kim Howell, Reese Solberg, Michelle Bruno.
From Kim: For a website, the first round of questions: is this a new domain, and is there a link to a privacy statement? If there’s an existing privacy statement, does it match the service and cover everything? What data is collected? Send questions to the new site/organization, get information, iterate. More questions: authentication, communication, vendors. Are people creating new accounts? Use of email? Data access requests? Next round of questions: how well do IT, PR, and the lawyers work together? Does the privacy statement match the service? Where’s plausible deniability? Make sure what’s required is clear versus what’s optional. Provide better notice about use of information and data retention. Using HTTPS? How easy/obvious is it to obtain informed consent when signing up? Companies often think they can write a privacy statement at the last minute. (Wrong.)
Next iteration: What new data is being collected, and where is it being sent? What other (new) features are coming up? What info is shared? Location: is it always being sent, or only when the app is open? What other info (unique device ID, cell tower info, gender, etc.) is being sent with the location data? What’s the retention period? If the service changes, the company may need to re-obtain opt-in from application users. Privacy controls? (Example of the data circulating among different departments of the company: “the accounting department loves this data.”) Who needs access, and for what use? Access to raw data or aggregated statistics? Have data handlers been trained? Unique identifiers are not the only way of identifying a person. What’s the intended use of the collected data?
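The data-minimization questions above can be made concrete. Here is a small sketch (the field names and thresholds are hypothetical, not Microsoft’s actual schema) of reducing a raw location event to what a service plausibly needs before it leaves the device or enters long-term storage:

```python
import hashlib

def minimize_location_event(event: dict, salt: bytes) -> dict:
    """Reduce a raw location event to the minimum needed server-side."""
    return {
        # Coarsen coordinates to two decimal places (roughly 1 km)
        # instead of full GPS precision.
        "lat": round(event["lat"], 2),
        "lon": round(event["lon"], 2),
        # Replace the raw device ID with a salted one-way hash,
        # so logs can group events without storing the identifier itself.
        "device": hashlib.sha256(salt + event["device_id"].encode()).hexdigest()[:16],
        # Gender, cell tower info, and any other extra fields are
        # simply not copied over, so they never reach the server.
    }

raw = {"lat": 47.606209, "lon": -122.332069,
       "device_id": "IMEI-123456", "gender": "F"}
print(minimize_location_event(raw, b"per-deployment-salt"))
```

Note that a salted hash is still a stable pseudonym, so it supports longitudinal linkage across events; rotating the salt periodically breaks that linkage when it isn’t needed.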
Michelle Bruno, Technical Privacy Manager: see printed case study (not online). Focus areas:
Level setting: focus on use of customer data, customer expectations, opting out
Author guidance: “how to” guides, privacy review checklist, company activities, data sharing, research and betas
Position yourself: pro-business privacy message, culture of privacy as a value-add
Piggyback: identify existing processes that you can take advantage of: spec templates, guidelines, bug tracking, testing, release management…
Analyze and assess: comprehensive data-gathering plan to understand company’s risk
Educate: pro-privacy contacts in each group to help succeed, spread work to peers about new process/resources
Questions: is there tension between user controls and corporate collection? Make sure the value matches and is understood by both sides. Look at what the business can put in place to allow better user controls. Microsoft has a federated privacy team; Kim’s team defines what compliance looks like.
Not mentioned in this panel but of some related interest (about Terms, not Privacy Policies): TOSAmend and EFF’s TOSback.
Good attendance, very diverse industry representation! Thanks to Joseph from Broadridge for giving up his chair in our crowded room, allowing me to take notes.
Kaliya showed a slide of the PDEC landscape: a Personal Zone overlapping with Accountability (“Trust”) Frameworks containing a Personal Data Zone, and also overlapping with the Market. At the bottom of this landscape view: governance through Legal, Code, Identifiers, and Peers, who act as framework creators.
Slide of the PDEC Startup Circle. Joining is a peer-reviewed process: what open standards are they using, and what’s their value space/where are they coming from? Leaders consider whether a group qualifies; they are trying to cultivate “an industry collaborative, engaging with technologists and business leaders from banking and finance, telecom, cable, web, advertising, media and other industries seeking to understand opportunities, launch pilot projects and ultimately offer service in the ecosystem.”
Discussion about who “manages” your data as your IDP, and what personal control individuals have over that data. Is this like a bank, where you go in to withdraw all your money and get the bank’s response, “that’s our money”? Or can you withdraw your funds, walk across the street to another institution, and open a new account, because your money is portable? Why would a telco worry about risk? This is a most important concept for them. Similarly in banking: the board-level view is that they’re not going to be the first ones to jump. Either all jump at once or they get killed. The risk in the US of having all your funds in one institution is higher than with distributed accounts. The same goes for different kinds of data, e.g., health data vs. spending.
Fair Information Practices (the FTC standard used for enforcement): the framework worked when it started back in the 1970s, but systems are now more complex, and there is no notice and consent about which databases we’re now part of. Is it about time for a FIPs refresh? Kaliya is working on a paper: what are the core principles and guidelines that government could adopt? Where does the thinking need to be? We have more powerful devices in our pockets. Lots of privacy conversations are about do-not-track/store. The OECD principles are not regulations and are technology neutral (data minimization, etc.), but they don’t make assumptions about individual ownership and agency over one’s own data.
Refreshing the principles is a good exercise, but one thing missing from them is the concept of fairness. Control is about fairness, fair trade, and equality. There is striking asymmetry today. Notice and consent is not working; people can’t do much about it.
Mary quickly reviewed organizations stewarding user-driven personal data and ID. The slide includes: ProjectVRM (an ethos and conversation), WEF, PDEC, Kantara Initiative, IDCommons, UMA, Information Sharing Working Group, Open Identity Exchange, The Data Portability Project, W3C, and microformats.
Shift in focus back to PDEC’s work: What’s personal data and what’s not? What’s self-asserted data?
Kaliya showed a map of personal data (link to come), then briefly reviewed what some of the companies in the Startup Circle do. Question about business models and how those companies plan to make money. (Some uncertainty here.) What are they hoping to do, and how do they see working together? Respect; collaboratively working toward interoperability; getting big players to adopt or use emerging standards; faster adoption. Is this about policy or protocol standards? PDEC is about conversation, discovery, and education; documenting activities; and catalyzing an interactive, collaborative market. Paint common pictures, evolve common language.
Note: If you’re interested in this space, check back for updated links to slides and graphics that were in progress during this session.
Danah Boyd is an insightful researcher. She just wrote a post called “Real Names” Policies Are an Abuse of Power in which she takes Google to task for their changing policies and rather abrupt practice of kicking people off of Google Plus. I agree that being arbitrary is an abuse of power when it affects people so strongly (disabling an account removes the use of all services, not just Google Plus). However, there are two kinds of power: shared, and proprietary.
Google, like Facebook, Twitter, and in fact nearly all Internet-based services (Amazon, eBay, your Internet service provider, etc.), is proprietary. These services are run by companies that:
are private or beholden to shareholders (their “business model”),
have one-sided Terms of Service and Policy documents that users are required to agree to, and
are based on the selective delivery of their user base to their customers (usually advertisers).
A striking characteristic of these businesses is that they have a practice of reducing things to black and white. Our chosen (registered) name “is” or “is not” really us. See Doc’s post A Sense of Bewronging for more thought on this. In a simplified (business) sense, it is an abuse of social power to declare that many of us are not who we say we are, even if we’re known to many others by our chosen registered name.
Contrast this with a shared-power model, like a commons, or services implemented according to open standards. The underlying Internet protocols and their open implementations (TCP/IP, the Apache web server, sendmail, etc.) are not owned by anyone; everybody can use them, and anybody can improve them. These resources are shared: no terms of service is required to use the Internet or email with any device you choose, with any compatible software, from any location that has access. A “commons” is where you can be who you are, no matter what name you go by.
Coaching moment: This may be a non-issue for some. I have friends who use their name to create a “brand” for themselves, so people will recognize them everywhere and know what they’re about. However, that’s not an option for people in sensitive situations. Think of it this way: everyone has a moment when they choose not to disclose some bit of information to the world. Sometimes it’s a name. That’s not a bad thing, and it should be a choice.