Personal Data Ownership in a Corporate World
Tuesday 1I
Convener: Annabelle B.
Notes-taker(s): Hugh P.
Tags for the session - technology discussed/ideas considered:
Discussion notes, key understandings, outstanding questions, observations, and, if appropriate to this discussion: action items, next steps:
[Annabelle]
Want a discussion - lots of questions, but don't necessarily have any answers. We're in a corporate world: our information is handed over to companies for storage, communication, and so on. Companies hold our data, but can we trust them with it? E.g. Facebook, Google. Let's look at alternatives – e.g. “home box” type solutions – and then maybe have Facebook (etc.) integrate with them? How would we make that manageable for the consumer? Then, even if we solve those problems: how do we get the hosting companies to care about these private-cloud solutions?
[Patrick, Eric]
There’s a question of how to get people to care about privacy. It’s an uphill challenge. Can we give users a reason why they should *own* their data? Value or convenience?
[Annabelle, Didier]
Q: in the EU it seems the government is more involved in privacy issues; does that reflect a difference in public feelings? Maybe not; concerns about privacy in the US post-Snowden seem to be at least as important. In France, for example, there’s a history of more than 40 years of state regulation in this area.
[Hugh, Bob]
Apple has an interesting stance here – “we don’t want to see your data” – and a commercial model built on hardware. But saying “we don’t want to” is different from preventing it, or from protecting PII once it’s in the system; see the prominent iCloud hacks. Does the EULA say “don’t store private info”?
[Hugh, Annabel, Eric]
Q: Are there any very-decentralized options versus that war between the big cloud operators? Or: does decentralization matter? Are we trying to address a problem that doesn’t exist?
Companies are typically better stewards of data than the user. But there are questions around data ownership & sovereignty & control that the user gets.
With a platform with cloud storage & end-to-end encryption: confidentiality is one thing we can do easily under the user's control, but the availability guarantees are hard for an end-user.
How can you be sure you can trust that entity? How do you guarantee that it gives you any greater control over your data than any other third party would?
[David, Annabel, John]
Question over control of the data. What mechanisms exist for personal control of that data? (David’s embarking on a project where that's critical, but doesn't have a solution yet.)
Define “control”? => “consent to share” (or use) (or purpose)
When someone has access to your data you're no longer in control. Radio Shack promised it wouldn't share your data for marketing, but look what they did. The fact that they had access to that data implies someone else will probably gain access to it too, even if it’s the government or a creditor. To be in control, encrypt it at the source.
Separating “store” from “share” can help with that. But to share => you need the ability to view it in the clear.
As soon as you share something - even if using a distributed platform... trust that your friend doesn't share it onward. This moves the problem from the "cold & dry code of the computer" onto the “warm wet code of human relationships”.
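A minimal sketch of the “encrypt at the source” / store-versus-share point above, assuming a Python client using the cryptography package; the storage provider's upload call is hypothetical and only illustrates that nothing but ciphertext leaves the user's device:

```python
# Sketch: client-side ("at the source") encryption before anything is stored
# with a third party. Assumes the `cryptography` package; the provider's
# upload API shown in the comment is hypothetical.
from cryptography.fernet import Fernet

# The key is generated and kept on the user's device / personal key store;
# it is never handed to the storage provider.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"power usage: 12.4 kWh"
ciphertext = fernet.encrypt(record)

# "Store": only ciphertext goes to the third party.
# provider.upload("meter/today", ciphertext)   # hypothetical provider API

# "Share": to let anyone else read it, the owner has to decrypt locally
# (or hand over a key) - which is exactly where control starts to leak.
assert fernet.decrypt(ciphertext) == record
```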
[John, Eric, Judith]
Authentication on that control? We often authenticate the user who's sharing their data, but maybe we can't be sure your friend is really your friend when we do the sharing. That sounds like DRM, which is almost exclusively an enterprise thing.
The UMA group is working on protocols that may make it possible more generally.
[Judith, Annabel, Steve]
We’ve been talking about personal data as data that originates with the person. What about more disconnected data of mine, e.g. the records of my power use? Lots of data originates from my *actions*, but as soon as it originates it's out of my control. Do the same issues apply?
Arguably that's not solely your data; e.g. Amazon's order data is 'theirs and yours' (you bought a product, they sold it). Tied to you. Anonymization? Third-party?
[William, David, Andrew]
People are “creeped out” by ad tracking. Curious if there's a line between creepy and OK. That line varies with context. It only becomes creepy when that data shows up in a context you didn't expect, e.g. Snapchat content becoming more persistent or more widely shared after a hack.
[Judith, Hugh, Bill, Eric]
Some opportunity for stores of personal data where the user is the aggregator, e.g. health/fitness data from multiple devices.
Bill’s interested in potential abuses at the intersection of health and housing data. Is someone using my personal data in my best interest?
The VRM community should talk about "fiduciary first"; looking for a framework where policy supports sharing *in my best interest*. Does that imply a legislative or regulatory framework? For real-estate cases, the FTC established some disclosure laws; conflicts of interest are rampant though.
[Andrew, Annabel, Hugh, Eric, Christie, Dave]
There’s a necessary shift in social norms. Changed expectations. In the case of gov’t surveillance: some behaviors have changed around metadata.
Have we actually changed our actions? The general public doesn't care or understand.
Some signs say yes. Google encrypts its backbone now. Lots of software vendors are taking security very seriously. In Canada there’s a noticeable shift in the legislative environment now that people are more aware of the attacks.
It’s less of an anonymous web these days – your footprints are exposed. Is there a downside to that?
Certainly there are public effects of not having the expected level of privacy. Society closes in on itself. The outliers have to be brought in. People don't take risks.
One positive change that came out of the internet: people found shared interests. That requires privacy, versus social pressure.
See the Colbert/Snowden interview, framed as “do you care that the government has your dick pics”: yes, people care. Another useful framing is that it’s not about 80% of a population (& then outliers), it's about 80% of your life. There’s lots of very normal privacy. Parents’ conversations with their kids about dating are not public.
Worry that NSA has Congress' dick-pics (they do).
[Annabelle, John, Hugh, Didier]
Once you share something with a third party unencrypted, it's out of your control at that point. Interested in the privacy-by-design concept, where apps protect users from themselves. Some apps – RedPhone / Signal / TextSecure – are encrypted by default; you don't need to think about the mechanics of security, it just works. We know how to do that, right? – at least the tech, if not getting deployment to ubiquity.
That needs work in the UX.
Not a tech problem, it's a business-model problem for these apps. They don't want your data; the other apps are based on your data. There’s a non-profit building a browser extension that uses blockchain tech to do MITM-proof encryption on any web site.
There are legitimate reasons for the content provider to have the metadata: without it, they can't enforce usage restrictions – abuse, fraud, harassment investigations? Arguably those are "quality controls", so they need some insight into the content. If it's all encrypted, how do you do the quality control?
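To illustrate the “encrypted by default” point above: a minimal sketch of end-to-end encryption between two users, assuming PyNaCl (this is not the actual Signal/TextSecure protocol, which adds ratcheting and forward secrecy on top). The relay in the middle only ever sees ciphertext plus routing metadata – which is exactly the metadata tension noted above:

```python
# Sketch: end-to-end encryption between two parties; the service in the
# middle only relays ciphertext (plus to/from metadata). Assumes PyNaCl;
# real messengers like Signal layer key ratcheting and forward secrecy on top.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; private keys never leave it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at six?")

# The provider stores/forwards only `ciphertext` and the routing metadata.

# Bob decrypts on his device with his private key and Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at six?"
```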
[Andrew, John, Kazue, Eric, Annabel, Christie]
One of the hard problems that will take a diverse community is: working on social norms, identifying what is best practice. What if there is actually a *code of conduct* that orgs could sign up to and be accountable for? A "goodness policy” on the internet?
Corporate behaviour: part of Apple's/Google's standing is trust earned over years of operation & living up to their stated values. (As well as the monetization aspects.)
Is the identity monetized? When the data becomes the product. “classism”: you can purchase that privacy if you can afford it, but the alternatives are freemium providers who will sell your data. Somehow you have to pay for the platform, otherwise the business is not sustainable; but many startup models externalize that. Free services; defer monetization with VC money to build scale. But free has to fail eventually.