Federal Directive Hits Two Birds (RIM and eDiscovery) with One Stone
The eagerly awaited Directive from the Office of Management and Budget (OMB) and the National Archives and Records Administration (NARA) was released at the end of August. In an attempt to go behind the scenes, we’ve asked the Project Management Office (PMO) and the Chief Records Officer for NARA to respond to a few key questions.

We know that the Presidential Mandate was the impetus for the agency self-assessments that were submitted to NARA. Now that NARA and the OMB have distilled those reports, what are the biggest challenges going forward for the government regarding recordkeeping, information governance and eDiscovery?

“In each of those areas, the biggest challenge is the rapid emergence and deployment of technology. Technology has changed the way Federal agencies carry out their missions and create the records required to document that activity. It has also changed the dynamics of records management. In the past, agencies would maintain central file rooms where records were stored and managed. Now, with distributed computing networks, records are likely to be in a multitude of electronic formats, on a variety of servers, and to exist as multiple copies. Records management practices need to move forward to solve that challenge. If done right, good records management (especially of electronic records) can also provide a solid foundation for applying best practices in other areas, including eDiscovery, FOIA, and all aspects of information governance.”

What is the biggest action item from the Directive for agencies to take away?

“The Directive creates a framework for records management in the 21st century that emphasizes the primacy of electronic information and directs agencies to begin transforming their current processes to identify and capture electronic records.
One milestone is that by 2016, agencies must be managing their email in an electronically accessible format (with tools that make this possible, not by printing emails to paper). Agencies should begin planning for the transition, where appropriate, from paper-based records management processes to those that preserve records in an electronic format.

The Directive also calls on agencies to designate a Senior Agency Official (SAO) for Records Management by November 15, 2012. The SAO is intended to raise the profile of records management within an agency and to ensure that each agency commits the resources necessary to carry out the rest of the goals in the Directive. A meeting of SAOs, convened by the Archivist of the United States, is to be held at the National Archives by the end of this year. Details about that meeting will be distributed by NARA soon.”

Does the Directive holistically address information governance for the agencies, or is it likely that agencies will continue to deploy different technology even within their own departments?

“In general, as long as agencies are properly managing their records, it does not matter what technologies they are using. However, one of the drivers behind the issuance of the Memorandum and the Directive was identifying ways in which agencies can reduce costs while still meeting all of their records management requirements. The Directive specifies actions (see A3, A4, A5, and B2) in which NARA and agencies can work together to identify effective solutions that can be shared.”

Finally, although FOIA requests have increased and the backlog has decreased, how will litigation and FOIA intersect over, say, the next five years? We know from the retracted decision in NDLON that metadata still remains an issue for the government. Are we getting to a point where records created electronically will be able to be produced electronically as a matter of course for FOIA litigation and requests?
“In general, an important feature of the Directive is that the Federal government’s record information (most of which is in electronic format) stays in electronic format. Therefore, all of the inherent benefits remain as well: metadata is retained, searches to locate records are easier and faster, and there are efficiencies in compilation, reproduction and transmission, along with a reduction in the cost of producing the requested information. All of this should improve the ability of federal agencies to respond to FOIA requests by producing records in electronic formats.”

Fun fact: is NARA really saving every tweet produced?

“Actually, the Library of Congress is the agency that is preserving Twitter. NARA is interested in preserving only those tweets that a) were made or received in the course of government business and b) are appraised to have permanent value. We talked about this on our Records Express blog.”

“We think President Barack Obama said it best when he made the following comment on November 28, 2011: ‘The current federal records management system is based on an outdated approach involving paper and filing cabinets. Today’s action will move the process into the digital age so the American public can have access to clear and accurate information about the decisions and actions of the Federal Government.’”

Paul Wester, Chief Records Officer at the National Archives, has said that this Directive is very exciting for the Federal records management community: “In our lifetime, none of us has experienced attention like this to the challenges we encounter every day in managing our records management programs. These are very exciting times to be a records manager in the Federal government.
Full implementation of the Directive by the end of this decade will take a lot of hard work, but the government will be better off for doing this, and we will be better able to serve the public.”

Special thanks to NARA for the ongoing dialogue that is key to transparent government and to the effective practice of eDiscovery, Freedom of Information Act requests, records management and thought leadership in the government sector. Stay tuned as we continue to cover these crucial issues as the government wrestles with important information governance challenges.

Responsible Data Citizens Embrace Old World Archiving With New Data Sources
The times are changing rapidly as the data explosion mushrooms, but the more things change, the more they stay the same. In the archiving and eDiscovery world, organizations are increasingly pushing content from multiple data sources into information archives. Email was the first data source to take the plunge into the archive, but other data sources are following quickly as we increase both the amount of data we create (volume) and the types of data sources (variety). While email is still a paramount data source for litigation, internal and external investigations, and compliance, other data sources, namely social media and SharePoint, are quickly catching up.

This transformation is happening for multiple reasons. The main reason for this expansive push of different data varieties into the archive is that centralizing an organization’s data is paramount to healthy information governance. For organizations that have deployed archiving and eDiscovery technologies, the ability to archive multiple data sources is the Shangri-La they have been looking for to increase efficiency and to create a more holistic and defensible workflow. Organizations can now deploy document retention policies across multiple content types within one archive and can identify, preserve and collect from the same, singular repository. No longer do separate retention policies need to apply to data that originated in different repositories.

The increased ability to archive more data sources into a centralized archive provides unparalleled storage, deduplication, document retention, defensible deletion and discovery benefits in an increasingly complex data environment. Prior to this capability, SharePoint was another data source in the wild that needed disparate treatment. This meant that in-place legal hold, as well as insight into the corpus of data, was not as clear as it was for email.
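Two of the centralized-archive benefits described above, deduplication and retention-driven expiration, are mechanical enough to sketch in a few lines of Python. This is a minimal illustration only; the `Archive` class and its method names are hypothetical and stand in for what commercial archiving platforms do at much larger scale:

```python
import hashlib
from datetime import datetime, timedelta

class Archive:
    """Toy centralized archive: dedupes by content hash, expires by age."""

    def __init__(self, retention_days: int):
        self.retention = timedelta(days=retention_days)
        self._store = {}  # content hash -> (content, ingest timestamp)

    def ingest(self, content: bytes, now: datetime) -> str:
        """Store content once, no matter how many copies arrive."""
        digest = hashlib.sha256(content).hexdigest()
        if digest not in self._store:       # deduplication: skip known content
            self._store[digest] = (content, now)
        return digest

    def expire(self, now: datetime) -> int:
        """Delete items past the retention period; return the count removed."""
        stale = [h for h, (_, ingested) in self._store.items()
                 if now - ingested > self.retention]
        for h in stale:
            del self._store[h]
        return len(stale)
```

Under this sketch, a second copy of the same document never consumes additional storage, and expiration becomes a single pass over one repository rather than a per-silo exercise, which is the efficiency argument for centralization.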
This lack of transparency into the organization’s data environment for early case assessment led to unnecessary outsourcing, over-collection and disparate, time-consuming workflows. All of these detractors cost organizations money, resources and time that could be better utilized elsewhere. Bringing data sources like SharePoint into an information archive increases an organization’s ability to comply with document retention schedules and legal hold requirements, and to reap the benefits of a comprehensive information governance program.

If SharePoint is where an organization’s employees are storing documents that are valuable to the business, order needs to be brought to the repository. Additionally, many projects are abandoned and left to die on the vine in SharePoint. These projects need to be expired, and that capacity must be recycled for a higher business purpose. Archives can now capture document libraries, wikis, discussion boards, custom lists, “My Sites” and SharePoint social content for increased storage optimization, retention and expiration of content, and eDiscovery. As a result, organizations can better manage complex SharePoint projects such as migrations, versioning, site consolidations and expiration.

Data can be analogized to a currency, where the archive is the bank. In treating data as a currency, organizations must ask themselves: why are companies valued the way they are on Wall Street? Companies that provide services, or services in combination with products, are often valued on customer lists, consumer data that can be repurposed (Facebook), and various other databases. A recent Forbes article discusses people, value and brand as predominant indicators of value. While these valuation metrics are sound, the valuation stops short of measuring the quality of the actual data within an organization and examining whether it is organized and protected.
The valuation also does not consider the risks and benefits of how the data is stored and protected, or whether it is searchable. The value of the data inside a company is what supports all three of the aforementioned valuation factors, without exception. Without managing the data in an organization, eDiscovery and storage costs are not only a legal and financial risk; all three factors are compromised. If employee data is not managed and monitored appropriately, if the brand is compromised due to a lack of social media monitoring and response, or if litigation ensues without a proper information governance plan, then value is lost because value has not been assessed and managed. Ultimately, an organization is only as good as its data, and this means there is a new asset on Wall Street: data.

It’s not a new concept to archive email, and in turn it isn’t novel that data is an asset. It has simply been a less understood asset because, even though massive amounts of data are created each day in organizations, storage has become cheap. SharePoint is becoming more archivable because more critical data is being stored there, including business records, contracts and social media content. Organizations cannot fear what they cannot see, until an event forces them to go back and collect, analyze and review that data. Costs associated with this reactive eDiscovery process can range from $3,000 to $30,000 a gigabyte, compared to roughly 20 cents per gigabyte for storage. The downstream eDiscovery costs are obviously significant, especially as organizations begin to deal in terabytes and zettabytes.

Hence, plus ça change, plus c’est la même chose (the more things change, the more they stay the same), and we will see this trend continue as organizations push more valuable data into the archive and expire data that has no value. Multiple data sources have been collection sources for some time, but the ease of pulling everything into an archive is allowing for economies of scale and increased defensibility in data management.
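The cost gap cited above can be made concrete with a quick back-of-the-envelope calculation. This is a simplified sketch assuming flat per-gigabyte rates taken from the figures quoted in the text; the function and constant names are illustrative:

```python
# Per-gigabyte figures as cited above: ~$0.20/GB to store data,
# versus $3,000-$30,000/GB for reactive eDiscovery collection and review.
STORAGE_COST_PER_GB = 0.20
EDISCOVERY_COST_PER_GB = (3_000, 30_000)  # (low estimate, high estimate)

def cost_comparison(gigabytes: float) -> dict:
    """Compare storage cost to reactive eDiscovery cost for a data volume."""
    low, high = EDISCOVERY_COST_PER_GB
    return {
        "storage": gigabytes * STORAGE_COST_PER_GB,
        "ediscovery_low": gigabytes * low,
        "ediscovery_high": gigabytes * high,
    }

# A single terabyte (1,024 GB) costs about $205 to keep,
# but millions of dollars to review reactively.
costs = cost_comparison(1024)
```

Even at the low end of the cited range, reviewing one terabyte reactively costs four orders of magnitude more than storing it, which is the economic case for proactive archiving and defensible deletion.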
This will decrease the risks associated with litigation and compliance, as well as boost the value of companies.

Gartner Publishes eDiscovery MarketScope (Precursor To eDiscovery Magic Quadrant)
Earlier today, Gartner published its eDiscovery MarketScope for 2009. Written by Debra Logan, John Bace, and Whit Andrews, it is perhaps the most comprehensive “buyer’s guide” available for companies interested in using electronic discovery technology to lower costs. The eDiscovery MarketScope analyzes about 20 software companies focused on electronic data discovery. Based on extensive interviews with end customers and data from the companies themselves, Gartner rates the companies using criteria similar to those used in its famous Magic Quadrant reports. It also identifies market trends and makes predictions for 2009 and beyond. This report is required reading for anyone considering an investment in eDiscovery software, and I strongly recommend that you get a copy, either from Gartner or from another authorized source. To give you a flavor of Gartner’s analysis, a few of its main conclusions are as follows:

1. Bringing eDiscovery In-House Dramatically Reduces Cost

This is a claim that electronic discovery software vendors often make, and prospective customers rightly question. Gartner investigates and finds that many of its corporate clients are saving large amounts of money by using eDiscovery software to reduce the amount they spend on lawyers and legal service providers. It reports that customers typically recover the cost of eDiscovery software within 3-6 months of implementation.

2. There’s No Single, End-To-End Solution For eDiscovery

Gartner addresses what is probably the most common question I get from corporate counsel and litigation support managers, namely: “Isn’t there a single product I can buy that will do end-to-end eDiscovery, covering all aspects of the EDRM?” The answer, of course, is “no”, and Gartner goes further by predicting that the answer will remain “no” until at least 2011.
So, for the foreseeable future, customers will need to buy best-of-breed products from different vendors for different stages of the EDRM model and ensure they integrate smoothly.

3. There Are 4 Leading eDiscovery Software Companies

Of all the companies it analyzed, Gartner gives only four its highest rating of “strong positive” (from Figure 2, page 10):

Company     Product
Clearwell   Clearwell E-Discovery Platform
FTI         Attenex, RingTail
Symantec    Discovery Accelerator
ZyLAB       E-Discovery Management Module

Each of the four has different strengths. For processing, analysis and review, Clearwell is “fast-to-install and easy-to-use” (page 12), while FTI’s ability to offer Attenex/RingTail either hosted or on-premise “positions it well for the future” (page 13). Symantec’s leadership in email archiving makes Discovery Accelerator a good option for customers who need to search and export data from Enterprise Vault. Finally, ZyLAB is well known within law-enforcement circles and has a strong presence in Europe and Asia.

4. There Will Be Consolidation In The Next 12 Months

As the market matures, Gartner predicts that as many as 25% of eDiscovery software providers will merge, be acquired, or exit the business. AccessData’s ambitious bid for Guidance has publicly put Guidance in play. Beyond that, Gartner suggests that Kazeon and several other players are likely acquisition targets for larger companies wishing to enter the eDiscovery space.

Of course, Gartner is not the only influential voice in eDiscovery. Earlier this year, George Socha and Tom Gelbmann published their Socha-Gelbmann Survey, which also provides a valuable perspective on the market. How do the two reports compare? That will be the subject of my next post.