We’re just finishing off another great SIFMA Annual Conference, meeting with compliance and legal executives from some of the world’s largest financial institutions. This year’s event also showcased an outstanding Lunch & Learn session on how social and interactive applications are disrupting today’s discovery and regulatory responses. Big thanks to Marty Colburn, Brian Panicko, and Marianna Shafir for providing their expertise in guiding the interactive discussion.
These sessions confirmed what we have been saying over the past several months: the rapid adoption of new communications and collaboration tools has created a new series of complications for large firms dealing with e-discovery and supervisory review. For the largest firms, however, the scope and scale of this disruption is truly mind-bending.
Here are a few examples of concerns raised over the course of the week:
- “How can we supervise a persistent chat when there are more than 10,000 participants?”
- “We are seeing items captured from our collaboration tools that exceed 2GB each. Our existing archive cannot ingest or index that.”
- “Younger employees and clients are consistently introducing new tools to work. As a result, we are supporting more than 50 different communications networks. Our IT team cannot keep up with the demand for new networks.”
- “We are capturing content from clients in 10 different countries – we are struggling to meet the unique data privacy, locality, and geographic restrictions for each when it takes multiple months to deploy a new archiving location.”
What struck me about these concerns is that they are coming from the compliance and legal team leaders of their respective organizations. Sure, there are always questions about new features offered by the latest communications tool, there are always policy questions to sort through before firms are comfortable deploying new technologies, and there are always emerging best practices that practitioners can learn from their peers to reduce false positive rates. But, to me, these concerns are leading indicators of a broader recognition that the underlying capture and archiving technologies that firms rely upon are no longer fit for purpose. It is a recognition that adopting new, interactive tools is not as simple as bolting an additional set of capabilities on top of an old, outdated archiving solution. Just as one would not add a new room to a house with a creaky foundation, companies are increasingly recognizing that the underlying architecture supporting their communications may be in need of a complete renovation.
Architecture really does matter to those outside of the IT function in attempting to address the disruption caused by today’s communication and collaboration tools. Given this, what should global firms think about when evaluating new communications technologies like Slack, Symphony, or Microsoft Teams? Where can users find the robustness and flexibility to address today’s volume and variety of communications – and to respond to the inevitable arrival of the next network? At the highest level, I’d encourage business leaders to consider the following attributes when conferring with their technologists during due diligence of new communications tools:
- Think Data Objects, Not Messages: Most of today’s collaborative tools provide a combination of modalities. Unfortunately, the challenge of indexing and storing social, collaboration, video, images, and other unstructured content is exponentially more complex than simply storing email. Consequently, firms should be thinking about the ability of their capture and archiving technologies to treat content agnostically, respecting and preserving its native properties and metadata to ensure that context is retained when reviewing those conversations in e-discovery or supervisory review. Let’s face it: the days of treating everything like an email message should be numbered.
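To make the “data objects, not messages” idea concrete, here is a minimal sketch of what a content-agnostic archive record might look like. The class and field names are hypothetical, purely for illustration: the point is that the native payload and its source-specific metadata travel together, instead of being flattened into an email envelope.

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class ArchivedObject:
    """Hypothetical content-agnostic archive record: the native payload is
    preserved as-is, and source-specific properties travel with it as
    metadata so context survives into e-discovery and supervisory review."""
    object_id: str
    source_network: str          # e.g. "slack", "teams", "email"
    content_type: str            # e.g. "message", "video", "image"
    payload: bytes               # native content, never recast as an email
    metadata: dict[str, Any] = field(default_factory=dict)

# A chat message keeps its channel, thread, and reaction context intact,
# rather than being squeezed into a message-and-attachment model.
msg = ArchivedObject(
    object_id="obj-001",
    source_network="slack",
    content_type="message",
    payload=b"Can we move the call to 3pm?",
    metadata={"channel": "#trading-desk",
              "thread_ts": "1565000000.000100",
              "edited": False,
              "reactions": ["thumbsup"]},
)
print(msg.source_network, msg.metadata["channel"])
```

A reviewer examining this record later sees the conversation as it actually occurred – channel, thread, edits – rather than a lossy email rendering of it.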
- Support Best-in-Class Cloud Standards: Capture, indexing, search, and storage technologies have undergone multiple step-function rounds of innovation since the days when on-premises email archiving systems were designed. Today, any organization can leverage solutions that were designed with proven, web-scale technologies for content processing, object storage, search, and indexing. Similarly, market-leading cloud infrastructure options have emerged, making reliance upon vendors who host data in their own data centers a relic of the past. For compliance and discovery users, leveraging solutions built with leading cloud standards ensures that their organization can remain agile in embracing the latest technology innovations to support the search, review, and processing of its communications data.
- Design for Throughput vs. Search Speed Only: For those dealing with time-pressured e-discovery and regulatory response, maximizing throughput is vital – data needs to be quickly ingested, easy to search and retrieve at large scale, and fast to move to the next step in the regulatory or discovery workflow. Unfortunately, legacy on-premises and hosted archives suffered from monolithic design, where adding compute or processing resources to one area of the system adversely impacted other parts of the system. While some vendors proudly touted their search speeds and impressive sounding SLAs, they weren’t as quick to trumpet their content ingestion or export performance metrics. For today’s communication variety and complexity, firms should be looking for solutions that leverage distributed architectural models to deliver scale across all components of the system. For example, content, indices, and metadata can be scaled independently for storage optimization and performance, while also eliminating the single-point-of-failure risk that has plagued legacy archiving solutions.
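The contrast with monolithic design can be sketched in a few lines. In the toy capacity plan below (all tier names and numbers are illustrative, not drawn from any real product), each component scales on its own axis, so resizing ingest capacity never forces a resize of the search or storage tiers:

```python
# Hypothetical capacity plan for a distributed archive: each tier scales
# independently, unlike a monolith where adding resources to one area
# can degrade another.
capacity_plan = {
    "ingest_workers": {"count": 8,   "scales_with": "daily message volume"},
    "object_store":   {"nodes": 12,  "scales_with": "total content size"},
    "search_index":   {"shards": 24, "replicas": 2,
                       "scales_with": "query load and corpus size"},
    "metadata_db":    {"replicas": 3, "scales_with": "record count"},
}

def scale(plan, tier, **updates):
    """Resize one tier without touching the others -- the essence of a
    distributed, non-monolithic design."""
    plan = {name: dict(settings) for name, settings in plan.items()}  # copy
    plan[tier].update(updates)
    return plan

# Double ingest capacity for a large e-discovery load; the search index
# configuration is untouched.
bigger = scale(capacity_plan, "ingest_workers", count=16)
print(bigger["ingest_workers"]["count"], bigger["search_index"]["shards"])
```

The replicated tiers also illustrate the resilience point: with more than one copy of indices and metadata, no single node is a single point of failure.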
- Openness and Extensibility are Pre-Requisites: Moving applications and content sources to the cloud should not come at the expense of creating another data silo. As firms look to create a central point of control for unstructured content, they also create an opportunity to share that data with other applications like legal document review, content surveillance, or business intelligence apps. This means that systems should be designed to enable highly reliable, high speed, high volume information delivery with published APIs to fully leverage these information assets. Extensibility also means that systems should be designed to easily integrate to custom or legacy content sources and support the enrichment of archived data by external analytics tools.
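As a sketch of what “published APIs” means in practice, the snippet below builds (but does not send) an HTTP request against a hypothetical export endpoint, handing archived search results to a downstream application such as legal review. The base URL, endpoint path, and payload fields are all assumptions for illustration, not a real vendor API.

```python
import json
import urllib.request

def build_export_request(base_url, token, query, target):
    """Build a request against a hypothetical published export API.
    The /v1/exports path and the payload field names are illustrative
    only -- any real integration would follow the vendor's API docs."""
    payload = json.dumps({"query": query, "deliver_to": target}).encode()
    return urllib.request.Request(
        f"{base_url}/v1/exports",
        data=payload,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Push a scoped slice of the archive to a (hypothetical) review tool.
req = build_export_request(
    "https://archive.example.com", "TOKEN",
    'channel:"#trading-desk" after:2019-01-01',
    "legal-review",
)
print(req.full_url, req.get_method())
```

Because the same interface can feed surveillance or business intelligence tools just as easily, the archive becomes a shared information asset rather than another silo.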
- Data Privacy by Design and Default: Given today’s emphasis on data privacy and security, financial services firms should be evaluating solutions that were purposely designed to protect sensitive client information and meet the rigors of regulatory books-and-records and storage requirements. Architectural design is critical to ensure that the highest level of policy controls can be enforced – across multiple geographies, user classes, and content types. Firms should also be inspecting security, management, and operational controls that are covered under standards such as SSAE 16 SOC 2 and audited by independent third parties.
Thinking about architecture, rather than simply deploying each new network independently, can add complexity to buying decisions, but it pays dividends by establishing a central point of control instead of adding yet another haystack in which to search for needles. Obviously, each of these attributes is a topic in its own right, and we will follow this post over the coming weeks with a series of pieces from our subject matter experts that give each the depth and clarity it deserves. Stay tuned for more.