By now, we’ve all heard how the President’s Management Agenda (PMA) sets bold goals for IT modernization. A vital element of the PMA is the creation of a Federal Data Strategy. This is a wise choice. Data is the new oil and the key to our digital government.
The Federal Data Strategy is still in draft form. Over the summer, the data strategy development team produced its first draft and accepted an initial round of public comments. The result was the establishment of three key principles for federal data: Ethical Governance, Conscious Design and Learning Culture.
The principles form a foundational framework for agencies and the public to leverage federal data sets to improve programs and generate value. Now, the big question is how to bring these ideas to life. The starting point is the list of 47 draft data practices included in the latest update.
While described as “aspirational goals,” these practices require significant effort to operationalize. As a result, initial feedback from stakeholders is that the list is too unwieldy and complicated. Nevertheless, the current set of practices offers key concepts for agencies to consider. I’d like to highlight a few that I think are the most important but require further discussion.
The first group of practices focuses on data governance. Core to data governance in any organization is the creation of a data governance committee. This group determines data risks, how compliant an agency is with existing regulations and how best to retain data. To be effective, it should be a multi-disciplinary team drawn from across the organization. The data governance committee also needs an executive sponsor with the ability to drive change.
Although not stated explicitly, the draft strategy requires agencies to understand their data, a mandate with implications well beyond data governance. If agencies don’t understand what or where their data is, they are never going to be able to protect it or use it strategically. Whether the goal is to inventory data assets, manage high-value assets or coordinate and preserve federal data as a national asset, the process begins with data visibility.
From a technology perspective, data classification is key to visibility. Agencies need automated tools to quickly scan and tag data to ensure that sensitive information is understood, protected and shared. Classification is also necessary to meet compliance regulations. These rules require agencies to implement and enforce retention policies across the organization’s entire data estate, regardless of where the data lives.
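To make the idea of automated scanning and tagging concrete, here is a minimal sketch of pattern-based classification. The pattern names and regexes are hypothetical illustrations; real classification tools use far richer rule sets (dictionaries, machine learning, document fingerprinting) than two regular expressions.

```python
import re

# Hypothetical detection patterns for illustration only; production
# classification tools ship with hundreds of curated rules.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text):
    """Return the set of sensitive-data tags found in a text blob."""
    return {tag for tag, pattern in PATTERNS.items() if pattern.search(text)}

record = "Contact jane.doe@agency.gov, SSN 123-45-6789"
print(classify(record))  # {'ssn', 'email'} (set order may vary)
```

In practice, the tags produced by a scan like this become metadata attached to each file or record, which downstream retention and access-control policies can act on regardless of where the data lives.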
The next group of practices focuses on data protection and security. Again, the first step is to gain visibility into your data by classifying it. Classification capabilities help agencies determine what sensitive data can be made available for use, perhaps in redacted or anonymized fashion, and what data must remain locked down. This concept is core to the practices related to protecting confidentiality and ties directly to inspiring public trust and safeguarding privacy.
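The redaction step mentioned above can be sketched in a few lines. This is an assumption-laden illustration, not a complete anonymization scheme: it masks one pattern (SSNs) with a placeholder, whereas real de-identification must consider re-identification risk across all fields.

```python
import re

# Hypothetical: mask Social Security numbers before a data set is shared.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact(text):
    """Replace SSNs with a placeholder so the record can be released."""
    return SSN.sub("[REDACTED]", text)

print(redact("Applicant SSN: 123-45-6789"))  # Applicant SSN: [REDACTED]
```

The key design point is that classification drives the decision: only data first tagged as sensitive needs to pass through a redaction step, while everything else can be shared as-is.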
Further, data strategy practices call for prioritizing and evolving data security. This requires a risk-based approach in which the most sensitive data receives the highest level of security. Applying that level of protection to every data set would be prohibitively expensive, so it’s necessary to understand your data in order to protect it appropriately.
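A risk-based approach can be thought of as a mapping from sensitivity tags to control sets. The tiers and control names below are hypothetical placeholders; in the federal context, actual baselines come from frameworks such as FIPS 199 impact levels and the NIST SP 800-53 control catalog.

```python
# Hypothetical tiers and controls for illustration; real baselines are
# defined by frameworks like FIPS 199 / NIST SP 800-53.
CONTROLS = {
    "high": ["encrypt-at-rest", "encrypt-in-transit", "mfa", "audit-log"],
    "moderate": ["encrypt-at-rest", "audit-log"],
    "low": ["audit-log"],
}

def required_controls(tags):
    """Pick a protection tier from the sensitivity tags on a data set."""
    if "ssn" in tags or "health" in tags:
        tier = "high"
    elif "email" in tags:
        tier = "moderate"
    else:
        tier = "low"
    return tier, CONTROLS[tier]

print(required_controls({"ssn"}))
# ('high', ['encrypt-at-rest', 'encrypt-in-transit', 'mfa', 'audit-log'])
```

This is why classification comes first: without tags on the data, there is nothing for the tiering logic to act on, and every data set defaults to either over-protection or under-protection.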
The strategy also places a focus on interacting with stakeholders and partners. Critical here is for the public to trust that the government uses data appropriately, especially their data. One of the practices calls for the public to access and amend federal data about themselves, similar to recent privacy regulations in the EU and California. The Privacy Act of 1974 and other federal regulations already require agencies to protect personal data from damage, loss or breach, but this appears to go further. To manage, protect and secure PII, agencies must know where it’s stored, who can access it and how long it must be retained. Establishing transparency into data protection and security processes for PII will ensure agencies can fulfill audits and compliance requests, furthering trust with the public.
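Knowing how long PII must be retained lends itself to a simple check. The seven-year period below is a made-up example; actual retention periods are set by NARA-approved records schedules and vary by record type.

```python
from datetime import date, timedelta

# Hypothetical retention schedule for illustration; real periods come
# from NARA-approved records schedules and differ by record type.
RETENTION = {"pii": timedelta(days=365 * 7)}

def past_retention(record_type, created, today):
    """True if a record of this type has exceeded its retention period."""
    return today - created > RETENTION[record_type]

print(past_retention("pii", date(2010, 1, 1), date(2018, 11, 13)))  # True
```

A check like this only works if every record already carries a type tag and a creation date, which again points back to classification as the prerequisite for enforceable retention policy.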
Taking steps to classify and understand data provides the foundation for access and sharing across federal, state and local governments. It can further enhance well-established and successful open data practices and support researchers working on national priorities.
Additionally, integration with modern cloud computing platforms enhances data sharing and usage. However, the initial draft of the Cloud Smart Strategy hardly mentioned data management. To further these goals, the Federal Data Strategy and Cloud Smart teams must work together to ensure agencies focus on proper governance, compliance and portability for data as it moves to the cloud.
Cost is also an important consideration for the federal data strategy. Therefore, the ability of agencies to leverage their buying power through best-in-class contracts for data management is crucial for the efficient use of data assets.
The federal data strategy development team is currently accepting comments on the draft practices. The next version, including action steps to implement the practices, is due in January.
Today, agencies’ lack of data visibility increases risk and complexity and makes it much harder to protect and secure data, let alone manage it as a strategic asset or promote its efficient use. So, I’m cautiously optimistic that the next round of updates will include guidelines for agencies to use data classification tools to find and understand their data, ensuring our government’s digital future.
This article was originally published by GovLoop.com on November 13th, 2018.