Archive

Posts Tagged ‘advertising’

PII 2011: Making Privacy Portable

November 15th, 2011

Larry Downes moderating a panel with Chris Babel (TRUSTe), Jim Brock (PrivacyChoice), and Chris Kelly (Kelly Investments). Jim: PrivacyChoice’s mission is to make privacy easier: managing online, templates, partners and their APIs. We’re bootstrapped right now. Chris B: TRUSTe’s privacy services have evolved into the advertising, mobile and cloud spaces. Was a non-profit, but 2.5 years ago we went for-profit. Chris K: companies with user-behavior data components; concerns with the venture model.

Larry: privacy was a cost (or risk) of doing business; now we’re looking at empowering users in a way that generates profits. Anecdotal experience in making privacy profitable, and what have we learned? Jim: customers have been coming to us (on the business side) with a compliance model, wanting to see uplift on their site with the TRUSTe seal. Customers have concerns; the seal helps address that. Chris B: space between customer needs and marketing efforts. “Profile Choice” allows real-time bidding on aggregatable info; didn’t find the right mix at that time. Chris K: misunderstandings between what companies are trying to do and what customers believe they’re doing. Using data for ad targeting within a company privacy policy. Beacon became Facebook Connect.

Larry: Beacon and Google Buzz had unsuccessful launches: unclear purpose (benefits), generating FTC complaints. Is there something about the launch of a product or service that makes it more dangerous or risky than other times? Jim: uses of large datasets are prone to claims of changing the rules. If you’re working in areas that weren’t contemplated, that can be confusing; need to think about how to advance sharing practices. Navigating these waters is extraordinarily difficult. Jim: any future change may be viewed as a breach of privacy; unexpected changes (lack of or poor communication, offering choices, whether the company honors user choices, no accountability). Chris K: the FTC and government aren’t in a good position to deal on this level, but you don’t want to attract their attention.

Larry: sources of funding? Chris: the question is no longer whether privacy is big enough; now it’s what are the top-level matters? Investment community: advertising (every dollar spent wants to be more targeted). Jim: the process in the ad-targeting space is global, and how little of it is online: ad people are demanding more information about who’s receiving their ads.

Larry: what about your not taking public investments? Jim: a happy accident.

Chris K: forensics for providing choice or for analytics/response: there are techniques; as web providers we can take better control over this to help users. Data flow as an arms business: companies that need to control what’s happening on their site, or people who want to offer services to consumers. Chris B: targeted ads are now more transparent. Balance that against malware and cookies and their sources, which feels more like security.

Larry: given the FTC’s interest in these issues and pending legislation in Congress, how does the possibility of regulation affect the climate for investment? Chris K: uncertainty is a cloud; a straightforward means of regulation can move the industry forward. But interim finger-pointing and lobbying games are problems. Likes the EU model, but we’re moving away from that. Chris B: government is crowdsourcing communities; online advertising and ad-space initiatives are trying to be more self-regulating. Still uncertain; industry groups and co-regulation are being brought up and talked about. Chris K: Congress is a giant consumer of these targeting services. Behavioral targeting seems to be settling. Larry: what if a new regulation passes that takes a business model out or forces… Chris K: legislation takes time to take effect.

Questions. Did people who saw the TRUSTe seal click on the seal or just go with it? Chris: clicks were low; most people recognize the seal as an envelope. What are people choosing? (The site can collect, store, use for ad targeting, give to third parties.) Chris K: the policy should say. We can’t make sure people read the policy. Do I have a right not to have data collected? It ends up as different perspectives from people vs. industry and investment (collect data).

future, records, tools

IIW XIII: The Final Overview

October 27th, 2011
Photo: young person sitting alone, by xJason.Rogersx. Thanks!

What a head-filling event! If you’re interested, you can see notes from many of the sessions on the IIW wiki. Some of the sessions are rather technical, which is consistent with the roots of this unconference.

A few of the things I learned: people continue to amaze me with these projects: personal data projects (check out Personal–no notes from their demo; and soon The Locker Project), reputation sites (I was busy vouching for people whose work I know with Connect.me), the many stories of evented APIs (think actions: when something happens, it can trigger something else to happen, as in the “Internet of Things”), and of course the evolution of the Personal Data Ecosystem and PDEC.
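To make the “evented API” idea a bit more concrete, here is a minimal, hypothetical sketch in Python (not taken from any of the IIW demos): handlers register interest in a named event, and when that event is emitted–say, a sensor reading, in the “Internet of Things” spirit–the registered actions fire.

```python
# Minimal sketch of an "evented API": when something happens, it can
# trigger something else to happen. All names here are hypothetical,
# not taken from any specific IIW project.

from typing import Callable, Dict, List


class EventBus:
    """Registers handlers and dispatches named events to them."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[dict], None]]] = {}

    def on(self, event_name: str, handler: Callable[[dict], None]) -> None:
        self._handlers.setdefault(event_name, []).append(handler)

    def emit(self, event_name: str, payload: dict) -> None:
        for handler in self._handlers.get(event_name, []):
            handler(payload)


bus = EventBus()

# "Internet of Things" style rule: a sensor event triggers an action.
def turn_on_fan_if_hot(payload: dict) -> None:
    if payload.get("temperature_c", 0) > 30:
        print("Triggered: turning on the fan")

bus.on("thermostat.reading", turn_on_fan_if_hot)

# Something happens...
bus.emit("thermostat.reading", {"temperature_c": 32})
# ...and something else happens in response.
```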

Coaching moment: There are two major forces pushing forward. One is represented by Facebook: collect and manipulate, sell and distribute all of the personal data that can be found. This is a pulling, pillaging process with the “users” as the product being sold. The other force is not yet represented, but you might think of it as the opposite: individual people have access to their own data when they need it, using starter tools for organizing, permissioning, sharing and distributing it. What if you could say “No, Facebook, you can’t plunder my own and my friends’ data”–and mean it? What if advertisers came to you only when you wanted them to? The idea is to say “yes” and “no” to data sharing when it’s appropriate for you. It’s you who is important, not a product.

history, records, tools

Trust, and Using You

August 10th, 2011
Steve Woodruff’s tutorial (graphic steps) – click for Steve’s tutorial.

Steve Woodruff brings us a quick tutorial on how to reset LinkedIn’s new “Social Advertising” setting:

Apparently, LinkedIn has recently done us the “favor” of having a default setting whereby our names and photos can be used for third-party advertising. A friend forwarded me this alert (from a friend, from a friend…) this morning.

Since Facebook has been such a good model of creative “reuse” of our personal information, and consequent destruction of personal trust in social settings, it seems corporately fitting that LinkedIn would try the same.

Coaching moment: Doesn’t it bother you when people make self-serving assumptions about what you want to share with others? True, you did voluntarily share this information, but shouldn’t you be able to express clear limits on how this shared information is used—before it’s misused? I think so!

friends/family, future, records

Open Data Partnership

March 12th, 2011

When the government threatened to regulate an industry that has for some time been playing fast and loose with people’s personal data, the industry proposed to open its databases–at least a little. The Open Data Partnership is claimed to be a “market-wide collaboration that allows consumers to gain more control over the information that companies have collected about their interests in one easy-to-use portal.”

SmartPlanet quoted Mike Zaneis, Senior Vice President and General Counsel for the Interactive Advertising Bureau (IAB), who explained:

Better Advertising’s Open Data Partnership is exactly the kind of initiative that will enable us to remain self-regulated as an industry. The more transparency we can provide consumers that enables them to retain control over their own data, the more trusted our ecosystem becomes – to the benefit of everyone.

Interestingly, many of the big data-tracking companies have already signed on. (HubSpot, which just received an infusion of $32M from Google and Salesforce, is missing from the list.)

With predictions of a sharp increase in analytics and data mining in 2011, the window offered by the Open Data Partnership is an interesting third option alongside “Do Not Track” and laissez-faire. It gives people a better understanding of, and control over, what they’re sharing and why. That said, it’s still about advertising (in which people are the product, not the customers).

Coaching moment: This is an interesting situation. If you could know more about yourself by looking at the data being collected, would you? Once you saw this information, would you be inclined to help correct it? If not, why not?

history, records

Mining the new Gold

March 9th, 2011

The Wall Street Journal has been running a fantastic series of articles called What They Know. Today’s installment (the 15th in the series) is called “TV’s Next Wave: Tuning In to You.” The article states:

Data-gathering firms and technology companies are aggressively matching people’s TV-viewing behavior with other personal data—in some cases, prescription-drug records obtained from insurers—and using it to help advertisers buy ads targeted to shows watched by certain kinds of people.

What this means, the article explains, is that these companies are now tracking your surfing and life involvement at a level of detail that is highly customizable to your TV. (They don’t have to know your name; they know who you are by your habits.) Let’s say, for example, that you watched five cookie commercials (tracked), and then later in the week you bought a package of cookies (tracked from purchase data). These companies will start to get a picture of how many cookie commercials (or anything else that you watch) it takes to affect your behavior–a rough sketch of that kind of matching follows the quote below. Using an example from the article, the U.S. Army tested four different ads for recruitment:

One group, dubbed “family influencers” by Cablevision, saw an ad featuring a daughter discussing with her parents her decision to enlist. Another group, “youth ethnic I,” saw an ad featuring African-American men testing and repairing machinery. A third, “youth ethnic II,” saw soldiers of various ethnicities doing team activities.
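Just to show how simple this kind of viewing-to-purchase matching is, here is a toy sketch in Python. Everything in it is made up–the household IDs, the field names, the data–and it only illustrates the idea of counting ad exposures that precede a purchase, not any company’s actual method.

```python
# Illustrative only: a toy sketch of joining ad-exposure and purchase
# records per household to see how many exposures preceded a purchase.
# All data and field names are hypothetical.

from collections import defaultdict

# Hypothetical per-household logs keyed by a set-top-box ID (no name needed).
ad_exposures = [
    {"household": "stb-1017", "ad": "cookie_brand_x", "day": 1},
    {"household": "stb-1017", "ad": "cookie_brand_x", "day": 2},
    {"household": "stb-1017", "ad": "cookie_brand_x", "day": 4},
    {"household": "stb-2044", "ad": "cookie_brand_x", "day": 3},
]

purchases = [
    {"household": "stb-1017", "product": "cookie_brand_x", "day": 6},
]

# Group exposure days by (household, ad).
exposures_by_household = defaultdict(list)
for e in ad_exposures:
    exposures_by_household[(e["household"], e["ad"])].append(e["day"])

# For each purchase, count exposures to the matching ad that came before it.
for p in purchases:
    prior = [d for d in exposures_by_household[(p["household"], p["product"])]
             if d < p["day"]]
    print(f'{p["household"]} saw {len(prior)} ads before buying on day {p["day"]}')
    # => stb-1017 saw 3 ads before buying on day 6
```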

Someone will likely claim that there’s no personally identifiable information being exchanged. That will be a lie, as they could only make that claim by defining “personally identifiable information” in a very different way than regular people–or government regulators–would. This is more about tracking and compiling the most intimate details of our lives, so we can be manipulated into acting a certain way.

Coaching moment: Corporate behavior like this is an example of a slippery slope. There is no real end to the social destruction that could be wrought on our world by corporate visions of a “good society.” I doubt that any one person who works for these companies would wish to be tracked and manipulated in this way. But when that person goes to work for a company that does this, the person is “just doing their job.”

There’s a clear reason why “Do Not Track” legislation is being proposed. This story points out an example of tracking that, I would argue, crosses ethical boundaries. It’s one thing to use voluntarily shared data about people. It’s another to invade their homes and lives for corporate gain.

I might be over-reacting. How do you feel about this?

history, records, tools