Identity Assurance (merges with) Automated Policy Negotiation

From IIW
Revision as of 23:09, 15 November 2010 by IdentityWoman (talk | contribs)


Issue/Topic: Identity Assurance (merges with) Automated Policy Negotiation (3G)

Convener: Leif Johansson & Rainer Hoerbe

Session: 3G

Conference: IIW-Europe October 11, London Complete Notes Page

Notes-taker(s): Nicole Harris

Discussion notes:

Identity assurance for attributes: attribute assurance.

Evaluation of authentication assurance. The scopes of the various schemes don't really cover the top-level security schemas needed in a federation: level of privacy, level of protection in OX, availability, terms-of-use classification. Modelling levels of authentication assurance.

STORK.

Coupling of the registration process with the authentication process. Although they are processed separately, when represented in terms of messaging these two things get amalgamated.

What are the algorithms of assurance?

There is a difference between saying "I trust you to tell me your age because you have gone through a registration process" and "I trust that this data as presented is valid." Attributes are vetted in a certain way, i.e. this attribute has been proved by government department X. Source of authority for the attribute.
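One way to picture that distinction is an attribute that carries its own provenance, so a relying party can tell a vetted value from a self-asserted one. This is a hypothetical sketch (the field names and the "gov-dept-x" authority are invented for illustration, not from any scheme discussed in the session):

```python
from dataclasses import dataclass

# Hypothetical sketch: an attribute assertion that carries its own
# provenance, so a relying party can distinguish "self-asserted"
# from "vetted by government department X".
@dataclass
class VettedAttribute:
    name: str                 # e.g. "date_of_birth"
    value: str
    source_of_authority: str  # who vetted the value, e.g. "gov-dept-x"
    vetting_method: str       # e.g. "document-check" or "self-asserted"

attr = VettedAttribute(
    name="date_of_birth",
    value="1980-01-01",
    source_of_authority="gov-dept-x",
    vetting_method="document-check",
)
print(attr.source_of_authority)  # → gov-dept-x
```

The point is only that the vetting source travels with the attribute rather than being implied by the channel it arrived over.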

What is the role of proxies in this space? In order to be able to manage this information, there are going to have to be multiple conversations, so we end up with an assertion that says: I assert that I know what he asserts to be true.
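The "I assert that I know what he asserts" structure can be sketched as a nested assertion: the proxy does not restate the attribute, it wraps the upstream party's assertion in one of its own. A minimal sketch, with invented party names:

```python
# Hypothetical sketch of proxied assurance: the proxy wraps the
# upstream assertion instead of restating the claim itself.
def direct_assertion(issuer, claim):
    return {"issuer": issuer, "claim": claim}

def proxy_assertion(proxy, upstream):
    # "I (the proxy) assert that I know what upstream asserts to be true."
    return {"issuer": proxy, "claim": upstream}

idp = direct_assertion("gov-dept-x", {"age_over_18": True})
chained = proxy_assertion("campus-proxy", idp)

print(chained["issuer"])            # → campus-proxy
print(chained["claim"]["issuer"])   # → gov-dept-x
```

A relying party evaluating `chained` has to trust two links, not one, which is exactly why the assurance of each hop needs to be stated.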

Trust in the identification provided by banks – using small payments to receive information about a person. We put more trust in this than is required.

Assurance policies as broker role. We can all claim level 1 but we need to have a good understanding of what we are saying at level 1.
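A broker acting on assurance policies would, in effect, map each scheme's "level 1" to a shared vocabulary before comparing them, since the same label can mean different things under different schemes. A hypothetical sketch (scheme names and mappings invented):

```python
# Hypothetical sketch: a broker normalises scheme-specific assurance
# labels into a shared vocabulary, because "level 1" in one scheme
# need not mean the same as "level 1" in another.
SCHEME_MAPPINGS = {
    ("scheme-a", "level-1"): "loa-low",
    ("scheme-b", "level-1"): "loa-substantial",  # same label, stronger meaning
}

def normalize(scheme, label):
    return SCHEME_MAPPINGS.get((scheme, label), "unknown")

print(normalize("scheme-a", "level-1"))  # → loa-low
print(normalize("scheme-b", "level-1"))  # → loa-substantial
```

Without such a mapping, "we can all claim level 1" tells a relying party very little.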

Lack of portability – i.e. I quit being a student.

In Germany you can get fined for having an open network.

The finance industry shares assertions made by individuals to look for discrepancies, e.g. how much did I tell bank A and bank B I earn? Who asserted what to whom? That is your basic social graph. Whom are you willing to let know that you asserted that? In existing business systems, this could use some work.
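The discrepancy check itself is trivial once the assertions are shared: compare what the same person declared to each party. A hypothetical sketch (the bank names and figures are invented):

```python
# Hypothetical sketch: surface discrepancies between what one person
# asserted to different banks (names and figures are invented).
def has_discrepancy(declared):
    # True if the person told different parties different stories.
    return len(set(declared.values())) > 1

declared_income = {"bank_a": 40000, "bank_b": 55000}
print(has_discrepancy(declared_income))  # → True
```

The hard part raised in the session is not the comparison but the social-graph side effect: every such check reveals who asserted what to whom.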

"Can I tell the same lie consistently?" is an interesting level of assurance. More eyes.

We like it together – the social context

What is my personal data and how personal is it?

The they to me and me to we…and back to they again.

The man – myself – my social graph – the man.

Can banks give data back to the individual and actually permit that data to be reused in a certain way? What is the benefit to the bank? Give your data back: "It is yours – but we would like to use it too, is that OK?"

The operational cost of maintaining the accuracy of data is huge – and banks don't really have the proper permission to use it.