Employee or Corporate-Owned Devices: The Tough Choice for Organizations in Regulated Industries
Contributing authors: Blane Warrene, VP, Product Management and Fi Pickul, Sr. Product Marketing Manager
Mobile devices are an integral part of our lives, offering instant communication, convenience and accessibility for users. This is becoming just as true in our professional lives. Calls, texts and messages on platforms like Microsoft Teams or WhatsApp have become an inherent part of our work routine and some of the most direct ways to communicate.
However, the use of mobile devices for work, particularly in regulated industries — such as financial services, healthcare and legal — raises significant concerns related to compliance and data security. These organizations must decide whether to support a bring-your-own-device (BYOD) policy or offer corporate-owned personally enabled (COPE) devices to support collaboration among their employees and customers.
In this post, we delve into the pros and cons of both approaches. Additionally, we’ll touch upon permission vs. prohibition and ways to uphold mobile compliance when adopting new messaging apps and collaboration platforms for work purposes.
A comprehensive capture solution is required for both BYOD and COPE
When deciding on the best approach for your organization, whether employee-owned, corporate-owned or a combination of both, it's important to consider the diverse ways we communicate today.
While calls and texts are important, it's equally crucial to address the use of mobile messaging apps like WhatsApp and Telegram and collaboration platforms like Microsoft Teams and Zoom. Many have become integral to daily interactions between employees and customers.
Collaboration lies at the heart of these apps. That's why it's important to work with a trusted vendor that can capture conversations in their native format, then store and monitor both the messages and their contextual details, such as join/leave events, emojis and reactions.
Compliance teams cannot ignore mobile communications. Mobile devices have become indispensable: text messaging has surpassed email as the most engaging communication channel, and colleagues and customers alike expect rapid responses.
Finding the right balance between convenience and the complex, ever-evolving challenges of compliantly capturing, storing and retaining these communications is essential. The significant recordkeeping fines the SEC imposed last year are a reminder of what's at stake.
Understand your industry regulations
To make the right decision for your organization, you must understand your needs, budget and, most importantly, your regulatory requirements.
In Europe, Article 16(7) of MiFID II states “... an investment firm shall take all reasonable steps to record relevant telephone conversations and electronic communications, made with, sent from or received by equipment provided by the investment firm to an employee or contractor or the use of which by an employee or contractor has been accepted or permitted by the investment firm.”
Similarly, for financial services firms in the U.S., both the SEC and FINRA have recordkeeping requirements and guidelines, including:
- SEC Advisers Act Rule 204-2
- SEC Rule 17a-3 and 17a-4
- FINRA Rule 3110(b)(4)
- FINRA Rule 2010
- FINRA Rule 4511
If third-party vendors are involved, additional considerations around data control and security apply. Banks must engage in proper risk management and ongoing oversight when using third-party solutions or working with vendors on AI applications to ensure compliance, consumer protection and privacy.
Governance plays a crucial role in managing risks associated with AI. Banks must demonstrate proper documentation, testing protocols, model management, and vendor management. Ongoing audits are necessary to ensure compliance and effectiveness. Proactive communication with examiners before formal examinations is encouraged to provide an overview of AI initiatives and address any questions or concerns.
AI can strengthen risk analysis, enabling deeper assessment of larger and more complex data sets. However, the lack of explainability in some AI models poses challenges, which is why adoption in certain areas remains limited.
AI is being used in audit processes, specifically in natural language processing for analyzing large data sets. It helps identify narratives that lack certain elements, enabling targeted sampling and data assessment. AI tools assist in data visualization and assessment, identifying anomalies or patterns that require further investigation.
The examination of AI and machine learning applications is often approached from an operational risk standpoint. The Office of the Comptroller of the Currency's (OCC) Comptroller's Handbook booklet on Model Risk Management is a must-read for any bank deploying AI models. It provides insight into the OCC's approach to AI examination and helps banks prepare their model risk management procedures accordingly.