ISO SC27/WG5 meetings are in two weeks' time. There appears to be some movement with ISO 29115 and 29003. NIST has renewed interest in updating these as part of the 800-63 team's effort to look across the board and figure out the internationalization side. AHughes will report back on any action.
Spring conference season is approaching. Kantara is 15 years old and is planning a few “birthday parties”.
Continue discussion on second criteria question #0180 (superior v. strong evidence) with a “tidied” updated version of Richard’s proposed alternative/comparable criteria (sent 2024.03.14)
FAL Criteria - Jimmy Jung’s email and subsequent thread
Any Other Business
Attendees
Voting: Andrew Hughes, Michael Magrath, Richard Wilsher, Yehoshua Silberstein, Jimmy Jung
Nonvoting: Sarath Laufer
Staff: Amanda Gay, Lynzie Adams
Guests:
Quorum determination
Meeting is quorate when 50% + 1 of voting participants attend
There are 7 voters as of 2024-03-28
Approval of Prior Minutes
Motion to approve meeting minutes listed below:
Moved by: Jimmy Jung for unanimous consent, no opposition.
Continue discussion on second criteria question #0180 (superior v. strong evidence) with a “tidied” updated version of Richard’s proposed alternative/comparable criteria (sent 2024.03.14)
Richard
Richard’s recap: This is a proposed set of criteria that would allow the use of comparable alternatives in a manner that actually assesses the criteria which have to be met if a comp/alt approach is to be used. The criteria are derived from 63 Rev 3 Clause 5.4; that clause constrains how comparability is determined (the functional elements one might choose to replace with something comparable) and could be deployed in both 63A and 63B contexts. At the time, it was agreed to think of comp/alts only in terms of 63A (just to get things done), and specifically to allow an alternative form of evidence. Thus, to invoke or connect to the comp/alt criteria directly, 63A 192 and 212 were inserted: if a comp/alt is used, the additional criteria (700-710) come into play, and the provider must meet them. These were written based on 5.4 with an attempt to instill a degree of rigor and process into how a comp/alt is justified. An analysis of the risk that NIST assessed would be needed (somewhat of a guessing game). Then, under (b), state that this is the alternative approach and the risks that may be introduced by applying this different approach, look at the risks that may be reduced through the alternative controls, and select/implement those controls. Overall: the alternative plan should be as good, the risk analysis should be accepted by top management, and, for each comp/alt applied, the service consumers should understand that this comparability examination and risk assessment has taken place. The requirements for applying the comparable controls must be documented so any user can deploy them effectively. Overall goal: set up a consistent approach that helps the CSPs, the assessors, and the ARB with comp/alts.
AHughes: not sure about constraining the risk assessment to evidence only. If we are allowing alternatives of any kind, why are we restricting it? On 63A700: why are we talking only about IAL? Shouldn’t it apply to any criteria set in the group? Why mention any specific set of criteria at all; wouldn’t this apply to any set in the Kantara IAF?
Lynzie: when these were presented to the ARB, they were in reference to a specific service; would they be applicable to everyone else? Are they comprehensive enough to publish?
Richard: these were presented to IAWG/ARB 18 months ago much as they are here.
They have been around for some time and used for specific purposes.
They are only stated in 63 rev3, so they are only meaningfully applicable to IAL and AAL.
They do not extend to OpSac/CoSac.
Jimmy: Perhaps because it may be difficult to achieve. He concurs with having it at the bottom of all 3 spreadsheets (some form of this set of criteria) and if someone can make it through the hoops–go for it
Richard: if we take out the focus on evidence selection and validation, it may open the floodgates to people turning up with things that aren’t really IAL2/AAL2, which we would then have to assess because “comparability” is claimed.
Yehoshua: are we skipping a step in requiring someone to justify how it is comparable? Richard notes that this is covered by (a)-(e):
the CSP has determined it is as good as what NIST expects.
Yehoshua notes that this is a conclusory statement.
Richard: could combine (b) and (c) into one statement about analyzing the risk that alternative controls may introduce.
Yehoshua: The CSP must do two things:
Make your case for what the NIST control is getting at.
Make your case for how you are trying to achieve it and how the comparable solution addresses the new risks introduced.
Richard: If there is a theory about what the NIST controls are trying to achieve and the residual risks are assessed/measured, then what is the degree of residual risk for the comp/alt? That is where comparability is determined, because the residual risk of the alternative approach should be as good as (if not better than) the residual risk from the original controls.
Hughes: NIST has controls with residual risk; we are proposing alternative controls with their own risks; we need to ensure the risks before and after the swap are acceptable.
Yehoshua: Is risk the primary concern for NIST? Can or should we view comparability only from the viewpoint of risk? That is problematic because we can’t quantify confidence at IAL2.
Andrew Hughes: 800-63 is written with the concept of risk analysis introduced, but it is qualitative risk analysis. Substitutability of controls is harder to determine from a risk-evaluation point of view.
5.4 focuses on the agency’s implementing lens, not on the acceptability of the service providing it. Controls can’t be altered based on capabilities; they are implemented/adjusted based on the ability to mitigate risk. He recommends using the 5.5 methodology.
Should we use the 5.5 framework for our (Kantara) criteria?
Yehoshua: Is this a separate discussion about IAWG needing to articulate (publicly) that a given service has comparable controls?
A SOC 2 is still a SOC 2 even with exceptions (the report would detail them), but what is the medium for sharing the comp/alts? Is it the report in the SoCA that is downloadable?
Richard: Perhaps require an acknowledgement/statement/description in the public service description in the S3A that a comp/alt is being used, as this goes into the TSL
Could also insist that the CRP makes it clear that there is a comp/alt being used, so that whoever is using the service would be informed.
Final step could be requiring an explicit statement in the TSL
Richard notes that 5.4 is about agencies, and agencies often don’t build these solutions; they buy them from approved CSPs. So the implementation of the comp/alts isn’t in the agency’s hands, it’s in the CSP’s hands. This may mean the CSP’s services are available to anyone.
Open Action items