12B/ Veres One (did:v1) Rubric Evaluation
Wednesday 12B
Convener: Joe Andrieu
Notes-taker(s): Erica Connell
Tags for the session - technology discussed/ideas considered:
Veres One, DID Rubric Evaluation, DID methods, DIDs
Discussion notes, key understandings, outstanding questions, observations, and, if appropriate to this discussion: action items, next steps:
Link to Joe Andrieu's slide deck: http://legreq.com/pres/v1.rubric.iiw.2021.04.21.pdf
Link to Evaluation Rubric: http://legreq.com/media/rubric.v1.2021.04.20.pdf
What we did
Reviewed the current W3C did-rubric draft
Found criteria that express Digital Bazaar’s rationale
Identified potential new criteria to cover new elements
Refined criteria to better address Veres One's distinguishing features (a brief sketch of the did:v1 identifier syntax follows this list)
Shared with the cohort
Engaged the cohort for additional feedback and insights
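For context on the identifiers being evaluated, a minimal illustrative sketch of the did:v1 syntax, assuming the "nym"/"uuid" sub-namespaces and optional "test" network prefix from the Veres One method spec; the example DID value below is hypothetical:

    # Rough syntactic check for a did:v1 identifier (illustrative only;
    # the authoritative grammar is defined by the Veres One DID method spec).
    import re

    # did:v1[:test]:(nym|uuid):<method-specific identifier>
    V1_DID = re.compile(r"^did:v1:(?:test:)?(?:nym|uuid):[A-Za-z0-9._-]+$")

    def looks_like_did_v1(did: str) -> bool:
        """Return True if the string matches the rough did:v1 shape sketched above."""
        return V1_DID.match(did) is not None

    # Hypothetical example (not a real registered DID):
    print(looks_like_did_v1("did:v1:nym:z6MkExampleExampleExampleExample"))  # True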
What we learned
The rubric is still in its infancy
Need structure-variable questions (questions that adapt to each method's structure), for example:
1.3 Separation of Power
4.6 Consensus layers
Enforcement (initial draft of real questions)
How do we talk about trade-offs from one wallet maker to another?
Evaluations are focused on the method
What about adversaries?
Christopher Allen identifies 27 adversaries (including forgetting one's password)
Particular methods are designed to address particular adversaries
The questions and criteria are young; there is still a learning curve:
Learning the rubric
Learning each method
Need better tools for community engagement
Criteria discussion
Custom rubric development
Shared rubric evaluations
Questions/Comments:
Looks like Common Criteria (NIST)
Used for evaluating the security of systems
https://www.nist.gov/publications/common-criteria-launching-international-standards
Others have encountered the same challenges
Sometimes questions do not map to use cases
Multiple evaluators have differing opinions on how to answer questions
Those differences helped discussion
There may be value in comparing evaluations
How much effort for this evaluation?
Rough estimate: under $10k in billable time
But there was already familiarity with the system, so the learning curve had largely been climbed
Tricky, especially with governance:
How the blockchain is constructed
How things are agreed on for protocols, etc.
This makes certain questions hard to answer
The ad industry has been developing this kind of thing for the last 18 months
Cookies are going away, prompting new developments (these kinds of rubrics)
Focused on who gets to do business and under what terms
It was noted how disconnected these technologies are from the single sign-on and media-measurement concerns being addressed in advertising
Could a privacy rubric be developed, one that evaluates privacy rather than DID methods?
The W3C Privacy Interest Group had a hard time with how DIDs work (learning-curve issues)
DIDs shouldn't have a privacy problem baked in, but that assumption skirts the hard stuff
“The web shouldn’t have an identity layer,” but single sign-on is an important aspect
The ad world is moving to a single sign-on kind of approach in the absence of cookies
There are lots of opportunities here to take the tech into a practical use area
Invitation to take this work and campaign it a little more across the W3C
Props for the titanic work to get this out, and same for did:web
A governance framework, for example, needs to be a living document for it to be useful to the community at large
Criteria will not be static
Community-wide decentralization, à la Wikipedia, is a possibility
The learning curve cannot be eliminated, but there is opportunity to streamline other aspects of putting an evaluation together, making future evaluations easier
Notes from Chat:
11:31:06 From Erica Connell to Everyone : http://legreq.com/pres/v1.rubric.iiw.2021.04.21.pdf
11:36:36 From Erica Connell to Everyone : https://w3c.github.io/did-rubric/
11:37:40 From Erica Connell to Everyone : http://legreq.com/media/rubric.v1.2021.04.20.pdf
11:37:56 From Eric Schuh to Everyone : https://github.com/WebOfTrustInfo/rwot9-prague/blob/master/draft-documents/decentralized-did-rubric.md
11:43:24 From Erica Connell to Everyone : http://legreq.com/media/rubric.v1.2021.04.20.pdf
12:00:11 From Joe Andrieu to Everyone : https://en.wikipedia.org/wiki/Common_Criteria
12:20:27 From Joe Andrieu to Everyone : https://www.youtube.com/watch?v=DVK5G9DIKf8
12:26:45 From ns to Everyone : oh hi, I just hopped in because I was curious; I was one of the evaluators that worked on the DID eval with Markus Sabadello. Thanks for the great talk :)
12:26:54 From Dmitri Z to Everyone : oh excellent!
12:26:59 From Dmitri Z to Everyone : we’re looking forward to reading that!
12:29:27 From ns to Everyone : It was an interesting challenge. Like Walid said, it's hard to fit all these different designs, and sometimes methods targeted at a particular use case, into the rubric.
I think this is an ongoing process, and it's great that different people are working on this, coming from different angles