Posts Tagged ‘law’

PII 2011: Mapping the PII Market: Players, Regulators, Stakeholders

November 15th, 2011
Comments Off

Session with Terence Craig and Mary Ludloff, PatternBuilders. Terence: their book is Privacy and Big Data (O’Reilly).

Things have changed in privacy and personal information. PII-driven business models (more on that later). Data collectors are the engine: giants like Google, Facebook, and Twitter; organizations and agencies like the Florida DMV (which sold data to LexisNexis); and mom & pop operations. What makes information valuable? Your health and wealth, the networking you do, the Internet of things (you). What role do the aggregators play? Markets for buying and selling data. Uses are infinite: research, monitoring, predictive modeling, advertising…

PII-driven business models:

  • Platform plays (SAS, Hadoop, Revolution, Microsoft’s SharingInsight, CouchDB, etc.) – where everything is phoning home all the time.
  • Social plays: LinkedIn, Facebook, Google Plus and Foursquare, but mobile is not this change. Also KISSmetrics, Klout, Zynga, HootSuite, Radian6.
  • Government plays: TSA, NSA, FBI, IRS; they can buy from Facebook. Also Palantir (DOD).
  • Privacy plays: SafetyWeb, TRUSTe, Singly; also Intellilight (in Detroit, attached to street lights: if a couple of people are there, it turns on an audio mike and calls police), Spokeo, Datong
  • Everyone plays: not just about advertising, many industries and business models benefit.

Implications for all PII players: privacy expectations, regulatory adherence (global), transparency (toward customers), crisis management. Privacy concerns are growing among consumers. Government is signalling that concern with new legislation. Companies must invest in this area, including training and certification.

Regulations: it’s confusing and will get more so. US: >30 federal statutes, >100 state regs for data security and privacy. In the EU, pending legislation adds more. Bottom line: you’re going to need help here. Be transparent; be explicit about what you can’t provide. Use opt-in data options only.

Crisis management: when things go wrong, know how you are going to deal with them. Get a team and process in place. It’s about staying with the story if you can (it used to be about getting ahead of the story; now it’s staying with it). How to avoid a train wreck: be transparent, think global, be ready for breaches, and behave as if you were worthy of your customers’ trust.

Question on opt-in: don’t sell out the long term for the short term; be transparent. Opt-in is a good way for customers to choose, and it’s sticky.

future, history, records, tools

PII 2011: Implementing a Privacy Program

November 15th, 2011

This session is a “behind the scenes look at Microsoft’s internal privacy program.” See the agenda for more information. Participants: Kim Howell, Reese Solberg, Michelle Bruno.

Kim Howell, (one of the) Privacy Directors at Microsoft: When you’re doing a privacy review (practical, intuitive), you need to ask questions. Role playing with Reese as a new company seeking a “privacy policy.” First questions (from our table discussions): what does the site do, how does it collect info, and what does it do with it? What’s the info flow path (is it resold?)? What’s the business model? How do you protect what you’ve collected? What controls does the individual have (can visitors remove their data? remediation? transparency?)? Cookies? Other passive data collection? Which countries are involved (collection, use, storage)?

From Kim: Website: is this a new domain? Does it link to a privacy statement? Is there an existing privacy statement, and does it match the service and cover everything? Data collection (see above). Send questions to the new site/organization, get information, iterate. More questions: authentication, communication, vendors. Are people creating new accounts? Use of email? Data access requests? Vendors? Next round of questions: how well do IT + PR + lawyers work together? Does the privacy statement match the service? Where’s plausible deniability? Make sure what’s required is clear, and what’s optional. Provide better notice about use of information and data retention. Using HTTPS? How easy/obvious is it to obtain informed consent when signing up? Companies often think they can write a privacy statement at the last minute. (Wrong.)

Next iteration: What new data is being collected? Where is it being sent? Other (new) features coming up? What info is shared? Location: is it always being sent, or only when the app is open? What other info (unique device ID, cell tower info, gender, etc.) is being sent with the location data? Data retention? If the service changes, the company may need to ask application users to opt in again. Privacy controls? (Example of circulating the data within different departments of the company: “the accounting department loves this data.”) Who needs access? For what use? Access to raw data or aggregated statistics? Have data handlers been trained? Unique identifiers are not the only way of identifying a person. What’s the intended use of the collected data?
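The “raw data or aggregated statistics?” question can be made concrete with a small sketch — purely hypothetical, not anything the panel described, with invented field names. The idea: analysts who only need counts get coarse aggregates, with identifiers dropped and small cells suppressed:

```python
# Hypothetical sketch: give analysts aggregated location statistics
# instead of raw pings. Field names (device_id, lat, lon) are invented.
from collections import Counter

def aggregate_pings(pings, min_count=5):
    """Count pings per coarse grid cell. Identifiers are never copied
    through, and cells with too few pings are suppressed to make
    re-identification harder."""
    cells = Counter()
    for p in pings:
        # Truncate coordinates to two decimals (roughly km-scale cells).
        cell = (round(p["lat"], 2), round(p["lon"], 2))
        cells[cell] += 1
    # Drop sparse cells entirely rather than publish near-unique points.
    return {cell: n for cell, n in cells.items() if n >= min_count}
```

A department that “loves this data” then sees only cell counts, never a device ID paired with a precise location.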

Michelle Bruno, Technical Privacy Manager: see printed case study (not online). Focus areas:

  1. Level setting: focus on use of customer data, customer expectations, opting out
  2. Author guidance: “how to” guides, privacy review checklist, company activities, data sharing, research and betas
  3. Position yourself: pro-business privacy message, culture of privacy as a value-add
  4. Piggyback: identify existing processes that you can take advantage of: spec templates, guidelines, bug tracking, testing, release management…
  5. Analyze and assess: comprehensive data-gathering plan to understand company’s risk
  6. Educate: pro-privacy contacts in each group to help succeed, spread word to peers about new process/resources
  7. Identify triage partners: incident handling, partnerships in legal, customer support, operations, PR
  8. Measure: what are your success metrics?

Questions: tension between user controls and corporate collection? Make sure the value matches and is understood by both sides. Look at what the business can put in place to allow better user controls. Microsoft has a federated privacy team; Kim’s team defines what compliance looks like.

Not mentioned in this panel but of some related interest (about Terms, not Privacy Policies): TOSAmend and EFF’s TOSback.

future, history, records, tools

IIW XIII: PDEC’s Legal Advisory Board

October 19th, 2011

This session was led by Mary Hodder and me. Our Agenda:

  1. What’s the problem?
    • Disparate efforts and projects
    • Differing legal perspectives (some driving innovation)
    • Desire to coordinate, support and educate
    • Include people who are new to this circle but doing related work
  2. Who should join?
  3. Monthly calls/Town Hall
  4. Working Areas/Groups (calls) – ad hoc discussions as needed

Notes from Markus Sabadello:

PDEC’s mission:

There are different efforts + projects. As a consequence, there are also different legal perspectives.
PDEC’s desire is not to build any concrete things (trust frameworks, technology, etc.), but:

  • to coordinate existing efforts, and help them find each other
  • to follow, track and educate about existing efforts and different views
  • to facilitate dialogue between the actors
  • to support new startups/initiatives to realize the vision
  • to bring together sociologists, technologists, lawyers, etc. working on personal data
  • to include people who are new to the circle but doing related work

Some concrete topics during the session:

  • multi-jurisdiction issues
  • multi-disciplinary groups with interdisciplinary focus
  • defining the new business layer (not just legal layer)
  • common elements of a trust framework
  • technologists (want to push things forward?) + lawyers (want to push things back?) need to work together!
  • how do we represent individuals?
  • legal practice becomes law
  • role of regulation (managing risk)
  • look at existing regulatory models for telecom + postal interop
  • what is institutionally required for certain services + systems
  • what can individuals do? where are groups + capital required and where are they not required?
  • relationship of IdPs to each other and to the nation-states who set policies

Kaliya presented the “Personal Data Ecosystem Landscape” diagram explaining the conceptual relationships between different parties in the Personal Data Ecosystem.

future

IIW XIII: Sneaky Bastards

October 18th, 2011

Renée Lloyd’s talk on how VRM disrupts and liberates legal practice. VRM challenges how legal practice is executed. What if new businesses had to negotiate with her? What messages are people giving us? Agreement-making in a human style: sharing thoughts, contextual, ceremonies and expressions vary. Contrast: agreement-making corporate style (handcuffs; “None are more hopelessly enslaved than those who falsely believe they are free.” –Goethe)

Clickwrap “agreements” are what we call contracts of adhesion: no negotiation, take it or leave it. It should not be assumed that we always want a fast track. Look especially at employment contracts, which are very restrictive. This makes sense in certain contexts but should not be a universal solution. Even if you’re the 800-pound gorilla, you don’t always have to act like one. The processes are insidious: things start small but quickly spread into areas where it’s impossible to right any wrongs.

Lawyers reuse existing terms from contracts to create new ones and limit liability. Nothing new: no innovation is allowed (social media prohibited). Caveat emptor is problematic because consumers can’t “be aware.” The FTC expects that markets create innovation, and that this will help protect consumers. But there are no plain-English rules, and regulation that’s reactive (because bad things happen) produces unworkable constraints and rules. The system is out of balance. “No number of legal victories, tech tools or whitepapers–however well-intended or important–are going to convince people to take ownership of agreement-making back from lawyers and companies.” It’s about people, not lawyers. We need to bring the balance back and help people take ownership.

Time for a change of attitude! What do you do when the most economical solution is a data exploiter for the big companies? What if there were a trusted voice? For example, what if the Terms of Use on a site, instead of the pages you see now, told you this (video): [x] Cede your rights? No site has fair terms right now. Or what if you had a “sneaky bastards” prompt that took your selected text, marked it up, and turned it into a visual game? (Sue for emotional distress: Amazon’s terms run to 80 pages, and Renée didn’t have it all, PLUS they can change those terms at any time. Outrageous.)

Activities to bind parties, even if they don’t understand, include having individuals initial every page. New advice in the legal community is to make it clearer that they knew. Relaxed formality standards. – a starter site with Renée’s rants, hall of shame, hall of fame, etc. Learning, doing, kicking butt, giving props, pure ferocious fun! Data on collaborative law shows fewer lawsuits in cases where people/individuals are consulted about change. Innovation: minimum viable contracts; curate, track and assess new contract models, etc. Check the site for things to develop.

future, history, records, tools

On Data and Disclosure

December 15th, 2009

I like to think about ways to customize my world, and the digital world writ large, in ways that support and help us explore our unique selves. It is in our very diversity that individual strengths can play out to become our personal best, to help each other grow, and create fertile new worlds.

However, under the guise of “increased security,” we are increasingly surrounded by tools and technologies that minimize and standardize us, including video surveillance and data storage and analysis. Regarding that last link, Google CEO Eric Schmidt recently said “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.”

This indiscriminate personal data hoarding is both an individual and a societal problem. Schmidt’s argument that we shouldn’t have anything to hide is specious (not to mention a double standard: it doesn’t apply to Schmidt). In a 2007 paper called ‘I’ve Got Nothing to Hide’ and Other Misunderstandings of Privacy, George Washington University Law School’s Daniel J. Solove convincingly critiques that argument. Indeed we have many things to hide, like our passwords and credit card numbers, certain personal habits and preferences, things that contribute to human dignity and respect. As noted security expert Bruce Schneier writes in his essay The Eternal Value of Privacy, “Too many wrongly characterize the debate as ‘security versus privacy.’ The real choice is liberty versus control.”

Ironically, Gary Wolf and Kevin Kelly host a blog called The Quantified Self where they report on people exploring ways to keep track of themselves. There’s a significant difference between the curiosity, personal need, and voluntary disclosure driving those data sets, and corporate ventures like Facebook (nod to jerking you around again with recent privacy policy changes), Google (Schneier’s response to Schmidt’s quote above), and damned near every corporate site you make an account with that tracks your every move these days.

I’m looking for examples of sites that encourage liberty and demonstrate some respect for their users/clients. I will be reporting on what I find. If you have suggestions, I welcome them.

Coaching moment: Here’s a little thought exercise. Think about a typical day in your life.

What kind of things do you do in private? These might be taking a shower, brushing your teeth, thinking about the day. Some things might be really private as in just you by yourself, and other things may be private in some context, like thinking about your day out loud with your spouse or partner. Once you get a good list, which of those things would make you uncomfortable if they were made public in some way?

Now think of the kind of things you do in public, like driving to work or the store, walking around, having a conversation over lunch. Think about stories that might be told about you from the perspective of someone not knowing what you were really doing. They might take cues from signs that you walk by, or maybe from other people (posture, groupings, facial expressions). Can you think of any stories that are not only wrong but might hurt you?

Finally, think about your online tools. Have you actually looked at the Terms of Service or Privacy Policies that you’re agreeing to? If you knew they were disrespectful to you or even abusive of your personal self and liberty, would you stop using them? Since the answer is “probably not,” what would you suggest these companies change?

friends/family, future, history, records, tools