...

  • Produce a set of toolkits and associated educational materials by the end of 2017 whose purpose is to accelerate the ability of those in the following roles to adopt, deploy, and use UMA-enabled services in a manner consistent with protecting privacy rights:
    • Individuals ("natural persons")
    • Organizations ("legal persons" such as businesses and governments)
    • Legal representatives of the above
  • Focus on GDPR-related toolkits first and foremost. A toolkit could be anything that helps people use or comply with an existing piece of legislation or framework, such as an SDK, a checklist, consent receipt templates or profiles, or a set of CommonAccord text, and could relate to the GDPR itself, the EU-U.S. Privacy Shield, BCRs, and so on.
  • Develop a roadmap by the end of 2016 that identifies specific deliverables and timelines within 2017.
  • By the end of 2016, develop two initial deliverables to inform the roadmap work:
    • Comparative analysis of UMA and GDPR concepts such as data subject, processor, and controller
    • Roundup of contractual and regulatory use cases
  • Leverage specialist legal expertise wherever possible to complete and review the deliverables.

Notes on mission progress

The subgroup found funding to work with a legal expert starting in 2017, with a schedule to produce three staged deliverables. The first, Use Cases for Analyzing and Determining a Legal Framework, was delivered on 28 Feb 2017. See the meeting notes and in-document edits for subgroup review activity.

Subgroup meetings

The subgroup's meeting times and notes are here. We meet on Fridays at 8am PT.

...

  • When Alice sets up criteria for access to a digital data resource of hers, such as "Only Bob can access this", can she ensure that the other actors in the authorization chain are doing their best to verify that Bob "is who he says he is" by the time he, or someone claiming to be him, actually gets access?
  • If Alice wants to impose limitations on how Bob uses her resources through business-legal methods rather than technical ones such as encryption or DRM, for example "Bob must promise not to share this data with third parties", how can she ensure these limitations stand up?
  • Can the host of some sensitive information of Alice's, such as personal data, trust an authorization service that promises to do the job of protecting that information in an outsourced fashion? This is roughly akin to the challenges of federated authentication, only for authorization.
  • Can Alice trust an authorization service to do as she bids when it comes to protecting her stuff, if she didn't personally hand-code it?
  • Can an authorization service rely on the hosts of Alice's data and the client applications that Bob uses to operate correctly in their UMA roles?
  • Can the host of Alice's data ensure that it can keep out of legal trouble even if Alice's authorization service appears to want it to share data with a recipient who is in a jurisdiction to which personal data is not allowed to be sent?
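The first question above can be made concrete with a small sketch of an authorization-server decision. This is a hypothetical illustration, not actual UMA protocol code: the class names, the `assurance_level` field, and the `authorize` function are all invented for the example. The idea it shows is that Alice's condition ("Only Bob") is only as strong as the identity claims the authorization server is willing to trust.

```python
from dataclasses import dataclass

@dataclass
class Claim:
    """An identity claim presented by the requesting party."""
    issuer: str            # who vouches for the claim
    subject: str           # who the claim says is asking
    assurance_level: int   # how rigorously the issuer verified the subject

@dataclass
class Policy:
    """Alice's access criteria, as configured at her authorization service."""
    allowed_subject: str   # "Only Bob can access this"
    trusted_issuers: set   # issuers the authorization server will believe
    min_assurance: int     # how strongly "Bob is Bob" must be established

def authorize(policy: Policy, claim: Claim) -> bool:
    """Grant access only if the claim names the allowed subject, comes from
    a trusted issuer, and meets the minimum assurance level."""
    return (claim.subject == policy.allowed_subject
            and claim.issuer in policy.trusted_issuers
            and claim.assurance_level >= policy.min_assurance)

policy = Policy("bob@example.com", {"idp.example.com"}, min_assurance=2)

# Bob, strongly verified by a trusted issuer: granted.
print(authorize(policy, Claim("idp.example.com", "bob@example.com", 3)))  # True
# A self-asserted claim naming Bob, from an untrusted issuer: refused.
print(authorize(policy, Claim("self-signed", "bob@example.com", 3)))      # False
```

Note that even in this toy form, Alice's policy cannot make Bob honest; it can only raise the bar on who the authorization chain will treat as Bob, which is why the business-legal questions in the list above matter alongside the technical ones.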

Work to date on use cases

...

Here is draft deliverable #1.

Work to date on model text

The model text work is being encoded in the CommonAccord.org system. CommonAccord is:

...

Here is the draft model text. The definitions are more mature than the clauses. It helps to know how to navigate CommonAccord text: click 0.md to see the current full text, then click the "Document" link to expand the Source view into a proper document. Note that all of this text predates the analysis now being performed and may change radically.

Additional artifacts

A UMA in Contractual and Regulatory Contexts primer/manual is in early draft form. This slide deck, presented at the Digital Contracts, Identities, and Blockchain event at MIT in May 2016, shares some key use cases. A few additional artifacts are available on the WG's GitHub wiki. All of this work predates the analysis now being performed; we may produce a completely new primer/manual at the end of this process.