Your Data, Your Currency, Your Terms & What Do People Need to No Longer Need Facebook?

From IIW



Tuesday 1F

Convener: Lubna Dajani & Ricardo Mendez

Notes-taker(s): Scott Mace


Tags for the session - technology discussed/ideas considered:


Privacy, trust



Discussion notes, key understandings, outstanding questions, observations, and, if appropriate to this discussion: action items, next steps:


Lubna – Your data, your currency, your terms


Other leader: Merged with how to leave Facebook


The privacy paradox – short term convenience trumps long-term gains


Lubna: FB is convenient. First ubiquitous platform where people discovered childhood friends. People don’t understand the depth and magnitude of what’s being done with their data.


There are at least three privacy paradoxes academics are talking about. First, people will share things more intimately with strangers (Brookings Institution). Second, behavioral economics. Third, the difference between what's claimed and what's delivered. Search "privacy paradox."


Doc: I’m here for the terms. If we’re not talking about terms, I’m going to bail.


Lubna: How do we transition to the next world? I want terms I define for how you engage with me. This is my data. If data is the new money, and we are the generators of data, how do we take control of our own currency?


Doc: If you start with what people care about, you’ll never invent anything. Data is a head trip. We’re dealing with considerations. They may or may not involve data. We’re in control of what we reveal to other people. We don’t have that online.


Does it matter what motivates people?


Almost irrelevant. The lawyers look for harms, and there aren't a lot of harms. We haven't built the thing we need for ourselves yet. We're all wearing privacy technologies right now (clothing); they're norms. Online, we're still naked. What can be done with tech will be done, until we find out what's wrong. FB is a learning experience that is a model for itself. There are whole other things we could do, such as asking: what do I want here? We've already written no-stalking terms. If we do it, this saves publishing today.


Jeff O: Demo table tomorrow. GDPR; my bit is PDPR, a personal data protection regimen. We need to understand ourselves: "free" is appealing, and there's an association with dollars and cents, but really we're talking about cost. The freer it looks, the more exploitable the data. The word "free" should not be involved; the cost is your data. Free is a dark pattern. The human element.


Joyce: S. 1084, a legislative bill on dark patterns: the DETOUR Act, the Deceptive Experiences To Online Users Reduction Act. Become aware of private social networks, such as the one in Sweden.


Doc: We learned about it from Chris Savage.


Silicon Valley-funded think tanks are going to D.C.


Scott Mace: Another group active in D.C. is the Center for Humane Technology, Tristan Harris’ group, mentioned prominently in Roger McNamee’s book about Facebook, Zucked.


In Japan, students are getting free coffee in exchange for their data, which is matched with employers: a functioning business model for "your data, your currency," though clearly not on the students' terms. FCC lawyers say if you don't agree with the terms, don't sign up. That doesn't work well with LinkedIn, where you have to sign up to be employable.


What are we really afraid of? (Many think targeted ads are great.) How far do the terms go? They could be much larger than an EULA.


Objecting to surprising uses of data.


Joyce: Trust is actually a commons asset. When trust breaks down as a commons, nobody believes anything anymore. We need to ensure that we share trust. Chris Savage's academic paper is posted on the Project VRM list. People think it's just ads, but your information can be used in ways you never thought of. Do you know how your car insurance rates are set? Per Consumer Reports 2-3 years ago, the #1 factor should be what kind of driver you are; actually it's your credit profile, plus zip code and other things. American Express will turn you down for a credit card if people with bad credit go to a bar you go to. There are a bunch of examples like this. People have learned to trust companies and institutions: if FB were bad, my friends wouldn't use it. I can hardly open an email without asking, is this spam? We've lowered trust so much that we can't get it back up quickly, unless we have some agency to see who that person is. Tools are the way.


OnStar comes with the vehicle, built in: GPS and cellular tech, whether you enable it or not. An insurance company could easily find out your driving history from GM. Would an insurance company care about that? It came with the package.


Another example: your credit rating was determined partially by sharing an address with people with bad credit.


Lubna: We are living in two eras at the same time. Kids now can swipe before they can stand. We do need an interim solution for this time. What does my data mean?


Come to the MyData session.


Privacy: control what you share and with whom. This is my car, which I paid for; I should be able to control what my car shares, and with whom.


MyData guy: There’s a rights issue. Every right is bounded by other people’s rights. If I choose to share my information with a trusted party, I cede some control to them.


Jeff: Say you have a Nest thermostat with 20 years of data showing how much energy you used, while others were more frugal. A creepy potential; something like this is happening right now in China.


Joyce: Chris's paper is "Managing the Ambient Trust Commons: The Economics of Online Consumer Information Privacy." We take the commons model, which has much more value. The commercial model is destroying what we call trust. Many examples, hundreds of footnotes.


There are 20 people here, and 10,000 at F8 trying to do the exact opposite.


What about a law like the DMCA that allows takedown notices, so my data can be removed?


But beware of unintended consequences, like corrupt politicians using GDPR to get sources revealed or stories taken down.