User:Sidstamm/Notes IAPP Global Summit 2012


Overall Themes:

  • Regulate use, not technology.
  • Privacy is about people.
  • Are ethics appropriately present in self-regulatory bodies?
  • Privacy v. secrecy: secrecy isn't a prerequisite for privacy.


Thursday, March 8:

Keynote: Jeff Jarvis, Author and Journalist

  • Social Norms are meant to change -- they're not supposed to be static, and as people interact differently with technology, expectations and norms change.
  • We should poke the social norms frequently to make sure they're not stale.
  • Too often we blame the technology and not out-of-date social norms.

Danah Boyd has shown that many children under the age of 12 have a Facebook page, and many of them had help from their parents to create these pages. So is COPPA successful?

Jarvis asserts that COPPA actually makes children the worst-served sector on the web: technology is not available to them, and when it is, providers deny liability because access was obtained "illicitly".

The right to be forgotten is often contrary to the right to free speech. Exercising your right to be "erased" might affect what other individuals have expressed or constructed with technology. In fact, the right to free speech often works contrary to the right to be forgotten: people may re-share things you said without your approval, even after you have requested to be "forgotten".

Jarvis: Technology is almost never at fault. Regulating technology as a whole can damage the web! Regulate people, not technology.

Consider Google Street View: who grants permission for pictures taken in public places? Google? The people in those pictures?

Jarvis is mad about DNT and cookie "demonization". Claims DNT will damage the media industry. (I think he misunderstands DNT, at least from a technical viewpoint.) He argues the Internet is too young and it's too early to know what needs to be regulated.

Key Point: Regulate behavior, not the technology. Use over knowledge. (Danah Boyd also asserts this).

We MUST BECOME RADICALLY TRANSPARENT OVERNIGHT. (I assume Jarvis here means in how we collect and use data.)

The government is not the best protector of privacy; rather, it is the biggest threat. It can't be radically transparent. (Continuously blaming media organizations for screwing this one up.)

Turn around the "Privacy" discussion to talk about "understanding sharing." That's what we really need to do.

Jarvis proposes principles (draft) that can help manage a widely open society:

  1. The right to connect.
  2. The right to speak.
  3. The right to assemble and act.
  4. Privacy is an ethic, and context matters. Share to benefit others, not just yourself.
  5. What's public is a public good.
  6. Open by default, and secret only as necessary.
  7. A bit is a bit (net neutrality).
  8. Public is what makes the net the net. Do NOT GOVERN.

Privacy is important - so is public sharing. The net is not broken, so don't try and fix it.

Keynote: Brad Smith, GC and EVP Legal/Corporate Affairs, Microsoft

Privacy is about people.

How has the Internet shaped societal norms? Maybe these changes should be reflected in law...

Recommends reading the Sotomayor opinion (4 pages) on text messages and the employee right to privacy. Also recommends reading the Supreme Court opinion on GPS tracking equipment. (??)

Ever since 1791, secrecy has been considered a prerequisite for privacy. This is why we seal envelopes.

But it's no longer about secrecy; it's more about behaviors and the use of information once it is obtained. Smith suggests the White House's Consumer Privacy Bill of Rights aligns with this. Here's what we should do:

1) Focus on balance between complexity and simplicity. Details matter, but simple is sometimes more effective.

Privacy needs to be communicated in the form of transparency -- what is "heard", or collected, should be communicated.

A 47-page terms of use document is not effective. Too much is too much.

2) Approach your job without boundaries.

We need to create a private Internet but also one that encourages sharing.

PUT PEOPLE FIRST

We need "technology to serve people, not people to serve technology".

Session: The final FTC Privacy Report: Implications for Consumers, Businesses, Legislation and Enforcement

D. Reed Freeman Jr. (Moderator)
Julie Brill, Commissioner, FTC
J. Howard Beales, Associate Professor of Strategic Management and Public Policy, George Washington University

Q: What does the FTC hope to address with respect to privacy?

Brill: Consumers complain about stuff:

  • Revealing location
  • Pregnancy indicator from innocuous data
  • Disclosures and lack of clarity or inadequate disclosure
  • kids -- child issues
  • manipulating the default settings to circumvent privacy settings

Q: Do our traditional notions about privacy still hold?

Are disclosures of 100-page privacy policies realistic?

Can we have just-in-time or layered notice? The FTC is considering this.

Freeman: We need to think hard about what we're trying to fix. What consequences are "bad"? Otherwise we end up with too much disclosure.

(Doesn't like the draft consumer privacy bill of rights -- it's not clear what we are trying to solve.)

Brill: Yeah, what's the harm? Focus on that. OBA (online behavioral advertising) may feel creepy, but she can see situations where it inadvertently causes harm. For example, a kid's browsing history leaked to parents.

Profiles put to secondary uses are harmful (e.g., used to determine eligibility for health care products).

Brill is concerned about scraping social networks to change a credit line, insurance eligibility, or employment possibilities. To her it seems unfair.

Freeman: Problem is use, not collection. FTC is improperly focused on "do not collect" when the use is really the root problem.

Brill: Use is important, but so is collection. Collection that consumers aren't aware of, with the potential for alternate uses, is disconcerting.

Need to discern between unbridled collection and specific, acceptably-purposed collection. Unconstrained data is RISKY to companies.

"Lets think about privacy as risk management" -- and not compliance or data minimization.

Q: "Our competitors do it, it's not illegal, why can't I?" Are we approaching a situation where failing to have privacy-by-design is grounds for civil penalties?

Brill: [No]

There is no strict liability on privacy.

With respect to privacy, code designed to collect and import data is under the company's "control" -- this is different from security breach issues. We are looking for conscious actions that are contrary to fair business practices.

Freeman: WE NEED STRICT LIABILITY FOR PRIVACY [damnit]

Q: It seems that do-not-track swallowed up the initial FTC report -- were there things in there that got missed?

Brill: The DNT recommendation in the report is incredibly important.

  • Happy that industry stepped up
  • Admits there are definition issues
  • Says the DAA's About Ads program is good
  • Browser-based stuff is good too
  • The W3C work is good too

Q: Are we there? What does the FTC want next?

Brill: Want to see all these mechanisms working together. Worried about details, but watching progress.

Q: Use or Collection?

Brill: Both. DNT is do-not-collect; some collection still needs to happen (fraud prevention, analytics, etc.), but DNT must address collection, not just use.

Freeman: Not clear that DNT has identified a problem to solve. We need to track people that don't want to be tracked!? [This guy has no idea it's not a list]
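(Aside, to make the note above concrete: DNT is not a registry of users the way Do Not Call is; it's a header the browser attaches to every request. A minimal sketch, with a hypothetical handler name, of what honoring it could look like server-side:)

def wants_no_tracking(headers: dict) -> bool:
    """Return True if this request carries the Do Not Track signal."""
    # Browsers with DNT enabled send the header "DNT: 1" on every request.
    return headers.get("DNT") == "1"

# Example request headers as a server might see them:
request_headers = {"Host": "example.com", "DNT": "1"}
if wants_no_tracking(request_headers):
    print("skip third-party tracking for this request")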

Brill: First v. Third parties is a big part of the DNT discussions. Concerned with social plugins -- when are they first v. third parties? Should we require consumer interaction?

Freeman: Skeptical that first v. third is important. "How the info flow is organized" doesn't matter, just the information. [doesn't think context matters]

Brill: Consumers' reasonable expectations matter and they don't expect third-party data collection.

How does a consumer know who collects data and how to correct mistakes in collected and inferred data?

Brill is deeply concerned about this. Likes the model of credit agency corrections + disclosure. But consumers have no view into data brokering.

Brill wants the data broker industry to create a one-stop shop for identifying which organizations broker data and for correcting the data they hold.

WHO ARE THE BROKERS? This is the big question.

Q: Uses of data are often IP; how can use be regulated? (Secret sauce v. transparency.) Self-regulation comes down to ethics. Who is accountable, and how much transparency is required?

Freeman: Disclosure to users is not useful.


Session: Privacy Implications of Facial Recognition, Facial Detection and Digital Signage

Harley Lorenz Geiger, Policy Counsel, CDT
Brian Huseman, Senior Policy Counsel, Intel
Maneesha Mithal, Director of Privacy and Identity Protection, FTC

There's a difference between facial detection and facial recognition.

  • Detection = "is it a face?"
  • Recognition = "whose face is it?"

CDT position: Legislation on privacy doesn't deal well with biometrics. Additionally, legislation focused only on facial recognition is undesirable. (Again, seems like the theme is "regulate use, not technology").

FTC position:

  • What are the costs and benefits of using this technology? We must understand this.
  • Can these recognition/detection devices include privacy by design? Can they be created to avoid keeping data?
  • Lots of notice and consent issues: how do you opt out? Where do you post notices? How are consumers' choices exercised? Do we rope off areas and attenuate camera ranges?

Q: Is it ever ok to do facial recognition as opt-out?

FTC: Not really; it should be opt-in, because honoring an opt-out would require recognizing the very people who asked not to be recognized. CDT: Agreed.

Q: What about children? FTC: How do you get parental consent? We shouldn't target ads to kids using this tech.

Q: Are we allowed to detect a child is a child in order to ensure we show the right ads?

No -- non-targeted ads should already be family-friendly. (The difference here is that a child is not shown child-specific ads, just "generic" ads.)

Q: Can law enforcement utilize these systems to identify criminals?

CDT: Blending systems used for commercial and law enforcement purposes is dangerous. Not recommended.

Q: Aside from avoiding physical proximity, are there reasonable ways to opt-out?

Consensus: Not really. We can attenuate range and tape off areas.

Q: Can we use this for an employer to authenticate its employees? Log-in via face? Decryption via face?

What about facial identity theft? Twins? Eek.

Session: De-identification, Re-identification and the Definition of PII

Omer Tene, Associate Professor, College of Management School of Law (Moderator)
Yucel Saygin, Professor of Engineering, Sabanci University, Istanbul, Turkey
Boris Segalis, Partner at InfoLawGroup LLP

First, start with de-identification: take out the PII. But computer science literature has emerged showing how to re-identify that data.

In 2000, Latanya Sweeney showed that (ZIP code, birthdate, gender) is enough to uniquely identify about 87% of the US population.

Omer: We should not treat all data as PII when there's only a remote chance of re-identification. Doing so discourages de-identification and the public benefits of data sharing (e.g., a case study where data mining showed that taking a pair of very popular drugs together caused health problems).

Yucel: Anonymity in data sets relies on data diversity (many records sharing the same values in the "anonymous" set). Inference attacks exploit a lack of diversity.

Boris: Disclosure of anonymized data is still a risk, and we should think of it that way. No sufficient legal framework exists for classifying whether data is "sufficiently de-identified" or not. European frameworks are strict to the point that it is almost impossible to release any data; the opposite is true in the US.
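(A minimal sketch of the re-identification risk Sweeney and Yucel describe: compute k-anonymity, the size of the smallest group of records that share the same quasi-identifiers. The records and field names below are invented for illustration.)

from collections import Counter

# Toy "de-identified" records: direct PII removed, quasi-identifiers kept.
records = [
    {"zip": "02139", "dob": "1975-01-21", "sex": "F", "diagnosis": "flu"},
    {"zip": "02139", "dob": "1975-01-21", "sex": "F", "diagnosis": "asthma"},
    {"zip": "02141", "dob": "1983-07-04", "sex": "M", "diagnosis": "flu"},
]

quasi_identifiers = ("zip", "dob", "sex")
groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)

# k is the size of the smallest equivalence class; k == 1 means at least one
# record is unique on (zip, dob, sex) alone and can be linked to outside data.
k = min(groups.values())
print("k-anonymity =", k)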

Friday, March 9:

Keynote: Cory Doctorow, Author

Provided an example of a Disney game that sent kids with mobile phones on a riddle hunt. The neat part was that there was no central tracking tower keeping track of whether kids were near the right clues; it was inverted -- the phones picked up beacons coming from the clues, so the kids couldn't be tracked, but the game still worked. All the logic lived in the phones.
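(A minimal sketch of that inverted design as I understood it; the beacon ID and handler are hypothetical, not Disney's actual code.)

# The clue broadcasts an identifier; each phone checks it locally, so no
# location data ever leaves the device and no server learns where the kids are.
CURRENT_CLUE_BEACON = "clue-07"  # hypothetical ID baked into the riddle

def on_beacon_heard(beacon_id: str) -> None:
    """Runs on the phone whenever its radio hears a nearby beacon."""
    if beacon_id == CURRENT_CLUE_BEACON:
        print("You found the clue!")  # all game logic stays on the phone
    # Nothing is reported upstream, so players can't be tracked.

on_beacon_heard("clue-07")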

He then read a reverse terms-of-use contract in which all listeners released him from any other terms-of-service contracts they had entered into with him.

Facebook makes it hard to understand what you trade for the value the service provides. It's a big Skinner box. Complexity (as in the game of craps) hides the mechanics and makes it harder for users to make informed decisions. And intermittent reinforcement keeps subjects coming back, "pushing the button" in the hope of more.

It's hard for people to price privacy well (disclosure and benefit are immediate, but the consequences are long-term).

DNT makes it too easy for sites to ignore.

"Do we really want to pretend we can solve our social problems by making your data harder to copy?"

We can't censor without surveillance anymore. Censorship is no longer about filtering a broadcast to users; it's about stopping information as users generate it, so we must watch them in order to censor them. That has put us in a surveillance state.

Lead by example.

Regarding pop-ups: "Mozilla happened to pop-ups", not regulation.

Aside from laws, we also have technology to change society.

Cookie managers could be the new pop-up blocker.

Make it easier to manage cookies and users will do it. This can change the tracking-cookie landscape.

We can't tell how bad the situation is. Need moar transparency.


Keynote: Hasan Elahi, Associate Professor at the University of Maryland

Very amusing public display talk. Completely counter to Doctorow's focus on secrecy. Unfortunately I was too engaged and did not take notes.  :(

I spent the rest of the day in meetings.