
From IIW

Me2B Relationship Management/Tech Architecture

Tuesday 1F

Convener(s): Johannes Ernst & Kim Date

Notes-taker(s): Nick Roy

Tags for the session - technology discussed/ideas considered:

Discussion notes, key understandings, outstanding questions, observations, and, if appropriate to this discussion: action items, next steps:

Me2B Alliance

Original use cases: I want to change my address once across all web sites / etc.

How would we make this work?

Interop problems

Need to rethink what interoperability is

In the past: This device works with that device, this plug fits into that jack.

Is that what we want as people, around interoperability?

Interoperability means things actually work, people won't start crying because things don't work.

We can get our info from our digital life and have stuff that reasons over it. The data itself is interoperable.

From my perspective, my digital life is totally fragmented, and not just usernames and passwords. No unified messaging.

How do we make this happen?

Closest we have ever gotten was the old concept of portals.

Liam is working on a new type of portal related to this.

Adrian claims that the interop layer is an agent. The protocol is something that looks like UMA; the characteristic of that thing is that when you're dealing with one of your apps/service providers, you give them the address of your agent. Instead of giving them your email address, you're giving them the address of a thing that can learn, manage relationships, and act on your behalf.
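The "agent address" idea above can be sketched in a few lines. This is a hypothetical illustration, not any real protocol: the class and method names (`PersonalAgent`, `handle_request`) are invented for the sketch. The point is that the agent, not the service, accumulates knowledge of the relationship.

```python
# Minimal sketch of handing a service an agent's address instead of an
# email address. All names here are hypothetical illustrations.

class PersonalAgent:
    """Receives requests from services and applies the owner's policies."""

    def __init__(self, owner: str):
        self.owner = owner
        self.policies = {}       # request type -> allow/deny
        self.relationships = {}  # service -> history of request types

    def set_policy(self, request_type: str, allow: bool) -> None:
        self.policies[request_type] = allow

    def handle_request(self, service: str, request_type: str, payload: dict) -> dict:
        # Record the interaction, so the agent (not the service) learns
        # about the owner's data uses over time.
        self.relationships.setdefault(service, []).append(request_type)
        if self.policies.get(request_type, False):
            return {"status": "granted", "data": payload}
        return {"status": "denied"}


# A service only ever holds the agent's address, never a raw email address.
agent = PersonalAgent(owner="alice")
agent.set_policy("address_update", True)
response = agent.handle_request("shop.example", "address_update",
                                {"street": "123 Main St"})
```

Under this sketch, revoking a policy changes what every service gets on its next request, without contacting any of them individually.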

Nat: For a limited set of attributes/claims, don't we already have this?

Kim: We're not talking about identity info, we're talking about all the attributes of your online existence. Slack/email aren't in the same inbox. The services are transient, and as they come and go, holes get ripped out of your life.

Can't accept current architecture where things only live in one place. Needs to survive the failure of one or more parts of the ecosystem.

For data to be portable, the format has to be standardized. We have strong evidence of what this looks like in healthcare.

There is now a rush for data brokers that add value in the healthcare space, acting on this data through ML/etc. As soon as you have this kind of interop, you get people coming out of the woodwork figuring out how to use it for interesting things.

Liam: Built an agent using UMA/GraphQL/OAuth where everyone is their own OP, but it's faulty; wants to do a session on this.

Need to know not just where my data is generated, but where it is stored.

Data portability use case- interesting how we create receipts, but we haven't talked about what receipts we create.

Doc: Consent receipt conversation was all about us consenting to terrible terms of other parties. How do you get alignment about consent between people and services, so it's not one-sided? Have any of the password store companies ever been here? Why don't they come? As long as we're stuck in client-server password fail, we can never get out of that? They are in the best position to help us, but they are not interested in helping us.

Mary: Reason to collect consent stuff is not because it really matters to any of us. The consent group was looking for places to drive wedges in the market to put pressure. FTC can go after companies, because the only way to put pressure on companies is to catch them breaking an agreement.

Quintessential FTC issue- only thing they can do is punish bad behavior that has already happened. Need to create a system that creates a counterbalancing force.

Moving back up from the gory details to the high level.

How do we make it so that a user who doesn't know anything about this stuff has a less fragmented digital life?

Original intent of blogging - distributed personal repositories. Then all the decentralized part of that died. So how do I put all my stuff in distributed repos and have it just live "out there"?

Adrian: This is aggregation. Of attributes or of control.

Would be happy if there was at least just a list of all the places that have my address.

Mary: Propose a different word for this, not aggregation, it's indexing.

Kim: Indexing is one way to create an aggregated view. Copying data is another way to create an aggregated view. Should support both / all approaches. Need to be able to survive indexes/stores/etc. going away.
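Kim's two approaches to an aggregated view can be contrasted in a short sketch. Both functions and their data shapes are hypothetical; an index only records where attributes live, while a copy replicates the values and so survives a source disappearing.

```python
# Two aggregation strategies from the discussion: index vs copy.
# Hypothetical sketch; names and data shapes are illustrative only.

def build_index(sources: dict) -> dict:
    """Index: record which services hold each attribute, without copying it."""
    index = {}
    for service, attrs in sources.items():
        for attr in attrs:
            index.setdefault(attr, []).append(service)
    return index

def build_copy(sources_data: dict) -> dict:
    """Copy: replicate values locally, so they survive a source going away."""
    copy = {}
    for service, attrs in sources_data.items():
        for attr, value in attrs.items():
            copy[attr] = value  # last writer wins in this naive sketch
    return copy

index = build_index({"shop.example": ["address"],
                     "bank.example": ["address", "phone"]})
local = build_copy({"shop.example": {"address": "123 Main St"}})
```

If `shop.example` shuts down, the index entry becomes a dangling pointer, but the copied value is still readable, which is the survivability property Kim asks for.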

22,000 missing URLs on Kim's blog roll. Trying to fix that problem. All of those 404s were live beautiful things, and it's all gone down the drain.

"Site Deaths"

Doc: Put some easter eggs in old blogs to see if they still show up in indexes, and they don't. Old stuff he wrote is still there, just doesn't show up in indexes. Trying to change DNS has been a PITA as well.

Need to add persistence to the aggregation part of the problem.

Niels: Two questions: What is the incentive for all the services to engage? Because they could lose customers. Second: The API or whatever is always behind what is offered by the services. Interop enables people to do cool new stuff, and thus move off of old services to new innovative services.

Kim: Look at portability work the large corporations have undertaken. Why did they do that? In the past, they all wanted locked walled gardens. GDPR and other things made that impossible or hard. If you have the user at the center of the model, that changes the nature of the game. With SSI, it's possible for the user to be at the center of the model.

Google/Facebook/Microsoft have agreed on a format of data interchange. The goal is to be able to take your email/etc. from one service and put it into another. Wendell says that's export-only; Kim says no, it's both export and import. There is a spec on this worth reading. Moving data to the end user is out of scope in that spec.
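The interchange effort described here appears to match the Data Transfer Project's adapter model: each service implements an exporter and an importer against a shared data model, and transfers run service-to-service. A minimal sketch, with all class names invented for illustration:

```python
# Hypothetical sketch of the export/import adapter pattern: two services
# agree on a shared record format and move data directly between each other.
from dataclasses import dataclass

@dataclass
class ContactRecord:
    """A shared data model both services agree on."""
    name: str
    email: str

class ServiceA:
    def __init__(self):
        self._contacts = [("Alice", "alice@example.com")]

    def export_contacts(self) -> list:
        # Exporter adapter: internal format -> shared format.
        return [ContactRecord(name=n, email=e) for n, e in self._contacts]

class ServiceB:
    def __init__(self):
        self.contacts = []

    def import_contacts(self, records: list) -> None:
        # Importer adapter: shared format -> internal store.
        self.contacts.extend(records)

a, b = ServiceA(), ServiceB()
b.import_contacts(a.export_contacts())
```

Note the transfer never passes through the user's hands, which matches the note that moving data to the end user is out of scope in that spec.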

Incumbent big businesses of all kinds always have reasons not to do stuff. If you look at what happened to the internet, it ripped down all these silos like CompuServe, AOL, Prodigy. They had no choice. When the individual is able to use standards like TCP/IP, IMAP, SMTP, HTTP, that rips down barriers. While we don't have that for personal data, it's possible for the major players to continue to say 'no'.

I might have that data, but what good does it do me if there's no way to prove that the data is legitimate? Attestation for things like academic credentials.

Don't know what the right architecture is by which we can start to figure out how to interchange and store this stuff.

Lisa: In cellular - feels like disintermediating the carriers - which was a threat because it would just be a 'big dumb pipe', but there is still innovation going on.

Dedra: You have infrastructure providers, and you add standards that make the big dumb pipe be able to serve new needs. In personal data, there is no incumbent. Would users pay to have interop? Guess is not.

Vic: Key is effort: If this stuff works, effort goes away. Effort that I feel to reset my password as an example. People care about the amount of effort it takes. The part of an org that feels this pain is customer service. Every touch point with a customer causes pain. If you're able to eliminate that pain, you win.

Nick: But the monopolists don't have any way for you as an individual to even get help from them.

Jeff: Saving effort *and* building trust. Chrome's SameSite issue as an example of companies shooting themselves in the foot.

Kevin: Pessimistic about commercial interests solving this problem. They'd take advantage of each other. Could we drive this through government? Example of distribution of data changes through the Swedish personal identity number. Estonia, Finland, India all do this.

Johannes: The agent should be OSS. Then the dynamics of competition go away.

Niels: Software doesn't run itself, so ultimately there will be another entity who owns the instance of an agent. Unless you solve that at the protocol level, so no one can touch your stuff but you, the services will see everything.

Kim: What's amazing is the amount of money going into big companies supporting the idea of decentralization. Big companies by the dozens, with 150 people each working on decentralized identity. What are they working on? The wallet. Credential providers. Wallet providers. Free wallets. Likely that the agent layer will be subsidized by companies that want a better relationship with their customers.

Adrian: This experiment has been run by healthcare for 10 years. Health data portability was tried, nothing came of it. None of the SPs trusted each other to input the data. Cheap to send the data out, expensive to bring it in.

Kim: Where the data lives isn't important; what is important is that it provides persistence and portability. It's not important where it's stored; it's important how you access it, how it's distributed, how it's permissioned.

What is the incentive for the first services that sign up? The service doesn't have to change anything. Through the APIs and synchronization methods they're opening up because they have to, you get the ability to run software that reasons across those services. If you depend on the services, you fail on day one. Liam's work, Lifescope, is able to amalgamate and build indexes across applications and services. This is feasible without disturbing the other infrastructure. You could build up an SSI infrastructure that connects without going through the existing services. Don't know the architecture; need to think it out.

Jim: Example of the UX of having to contact places when you change credit card number. Terrible UX. Does the agent push out this info? Do SPs ask for it and get permission to access from you?
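Jim's question (push or pull?) can be made concrete with a sketch showing both patterns side by side. Everything here is hypothetical: `AttributeAgent`, the subscription model, and the service names are invented to illustrate the two flows, not a real design.

```python
# Hypothetical sketch of both answers to Jim's question: the agent can
# push an updated attribute to authorized subscribers, or a service can
# pull it and get data only if the owner has granted access.

class AttributeAgent:
    def __init__(self):
        self._attrs = {}
        self._subscribers = {}  # attribute -> services allowed access

    def subscribe(self, service: str, attribute: str) -> None:
        # The owner grants a service ongoing access to one attribute.
        self._subscribers.setdefault(attribute, set()).add(service)

    def update(self, attribute: str, value: str) -> dict:
        """Push model: one change fans out to every authorized subscriber."""
        self._attrs[attribute] = value
        return {s: value for s in self._subscribers.get(attribute, set())}

    def read(self, service: str, attribute: str):
        """Pull model: a service asks, and is answered only if authorized."""
        if service in self._subscribers.get(attribute, set()):
            return self._attrs.get(attribute)
        return None

agent = AttributeAgent()
agent.subscribe("airline.example", "credit_card")
pushed = agent.update("credit_card", "4111-XXXX")
```

Either way the owner updates the number once, which is the UX improvement Jim is after; the difference is whether services receive notifications or must ask.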

Adrian: Work on SSI separation of concerns: Non-correlation. Microsoft offers to do this through a tumbler or a VPN that doesn't keep logs. Design the architecture so there are standards around the tumbler. Anti-example: Sign-in with Apple.

Jim: Problem is that the services consider managing your credit card number your problem and not theirs.

By introducing the friction, the agent gets to learn about the data requests. If you give prior consent, with or without a consent receipt, they get to learn about the data uses and you don't. Everyone in the data brokering business is eager to get the benefit of the learning that's going on. We have to own the private agent.

Group thinks this is the right time to have this conversation. Sense that we've been wrong before.