I have been invited to speak at the second annual NIST Advancing Cybersecurity Risk Management Conference, 27-28 May 2020 (pandemic dependent!). If you are attending, be sure to reach out to me via email or LinkedIn.
Data Compatibility: Understanding the Measurement and Metrics Implications of Developing and Updating a Cyber Framework
What happens when cybersecurity frameworks, models, and standards expand beyond their traditional normative, descriptive, and prescriptive roles? More and more, these documents serve not only as the best practices to which organizations should or shall align, but also as the assessment tool (read: questionnaire) by which that alignment is assessed and scored. This talk will explore the data ramifications of the framework-cum-assessment-and-scoring-tool and how standards development organizations (SDOs) can better support assessment-derived data consistency, aggregation, and cross-version compatibility.
Cybersecurity assessment scoring data are used in a variety of ways. Regulators and trade associations use assessment results to benchmark their sector and member organizations. Cyber insurance products, which increasingly require pre-binding cybersecurity evaluations, leverage scores to determine premiums and track trends. And organizations perform annual self- and third-party evaluations to track security maturity and to develop and prioritize investment roadmaps. If the framework document is itself the assessment tool, what happens to forward and backward data compatibility, and thus to measurements and metrics, when the framework receives an update?
This talk will examine recent updates to the NIST Cybersecurity Framework v1.1, CIS Controls Version 7.1, and the proposed version 2.0 update of the Cybersecurity Capability and Maturity Model (C2M2v2) to illustrate how subtle (and not-so-subtle) changes in form, architecture, and language can improve or impede data compatibility. This session will highlight common data compatibility pitfalls created by these changes and discuss their data ramifications, along with future-proofing techniques and change management strategies.
The session will conclude with an overview of the research and development of a proposed normative model that, if adopted by SDOs, could improve a framework's accessibility and thus its data consistency, and could mitigate, and in some cases prevent, data incompatibility when a framework receives an update. We will discuss two recent applications of the proposed model. The first use case will demonstrate how the model was used to increase data consistency in the API 1164 Pipeline Security Standard (2019) by developing guidelines for creating control statements. The second will showcase the C2M2 version 2.0 data compatibility working group's efforts to document and map changes to support forward and backward data compatibility.
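To make the mapping idea concrete, here is a minimal sketch of how a documented change map (crosswalk) could carry assessment scores from one framework version to the next. The control IDs, the change map, and the scores are all invented for illustration; they do not come from any of the frameworks discussed above.

```python
# Hypothetical sketch: projecting v1 assessment scores onto v2 control
# IDs via a documented change map. All identifiers are invented.

# A v1 control may be renamed, split into several v2 controls, or
# retired (mapped to an empty list).
CHANGE_MAP = {
    "AC-1": ["AM-1"],          # renamed in v2
    "AC-2": ["AM-2", "AM-3"],  # split into two v2 controls
    "AC-3": [],                # retired in v2
}

def migrate_scores(v1_scores, change_map):
    """Project v1 scores onto v2 control IDs.

    Split controls inherit the parent's score; retired controls are
    dropped but reported, so any loss of data is explicit rather
    than silent.
    """
    v2_scores, dropped = {}, []
    for control, score in v1_scores.items():
        targets = change_map.get(control, [])
        if not targets:
            dropped.append(control)
        for target in targets:
            v2_scores[target] = score
    return v2_scores, dropped

v1 = {"AC-1": 3, "AC-2": 2, "AC-3": 4}
v2, dropped = migrate_scores(v1, CHANGE_MAP)
```

Even this toy example shows why an SDO-published change map matters: without one, every assessor invents their own crosswalk, and aggregated or year-over-year scores silently stop being comparable.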
Jason is the author of The Cybersecurity Assessment Handbook (April 2020). He has spent much of his career developing and performing cybersecurity assessments, including the development of a pre-binding assessment and metrics aggregation tool for a $200MM Lloyd's cyber line slip. Jason led the data compatibility working group for the DOE's Cybersecurity Capability Maturity Model version 2.0 update (C2M2v2) and was a member of the API 1164 Pipeline Security Standard leadership & architecture team. Jason is a veteran of the United States Marine Corps.