AI Automation Readiness with Copilot

News about AI is everywhere today, and businesses around the world are focused on understanding its impact and harnessing the immense power it offers. Some companies are at the forefront of this transformation and are willing to jump in headfirst. Others are more cautious, choosing to learn more about the risks and impacts of AI before diving in.


More broadly, automation has been a slow but steady march over the last 10 years. There has been a tangible push toward integrated, automated digital solutions, with varying degrees of success across industries. Tools like Robotic Process Automation were the leading edge of automation in the late 2010s, and the interest those tools generated, combined with a growing appetite for automation, has created a market opportunity that today's AI-enabled tools are stepping in to fill.


While AI-based features are working their way into a variety of products (such as SharePoint Premium, Adobe Photoshop, etc.), the public's attention is fixated on generative AI tools like ChatGPT which can help connect people with information and create new content with ease. These tools, which are considered to be at the forefront of AI technology, stand to disrupt many types of work when used in the right way.


In the same vein, using such powerful technologies requires modern governance and regulation, an issue that regulatory bodies continue to struggle with. While this will be an ongoing evolution, the near-term governance questions around using an AI tool in a business setting are what tend to keep business leaders up at night, and they may ultimately be contributing to slower adoption of these tools.



Let's take Microsoft Copilot as an example. The product is widely available and deeply integrated throughout the Microsoft ecosystem, positioned to users as a way to increase productivity inside Microsoft tools. In the latest version of Windows 11, a Copilot icon is now featured prominently in the taskbar, a beacon to users hoping to simplify their day-to-day lives.


While user demand for a tool like Copilot is certainly palpable, the tool alone does not help organizations cope with key considerations such as:

  • The ramifications of generated content and its use in work products.

  • The distraction/novelty factor and how to retain real productivity gains.

  • Concerns about data privacy and security, including accidental exposure of data to people who are not privy to it.

Whichever of these concerns you most, and whatever others weigh on you personally, it is important to understand that while the journey to using Copilot carries real risk, there are steps you can take to reduce and control those risks while still reaping the benefits of the tool:


Write a comprehensive policy: While it is nearly impossible to write a policy that covers every aspect of a rapidly changing field like AI, you should start with an initial version that answers today's questions and remains open to future revision. In this policy, define who (or which roles) may use Copilot features in their work and for what purposes Copilot can be used. The policy will also guide you in implementing technical guardrails.


Execute training and change management: Make every permitted user aware of what the tool can be used for and how it fits into their day-to-day work. This training should also cover Copilot's pitfalls and the things the policy does not allow them to do.



Do a Microsoft 365 Health Check:

  • Use a tool such as SharePoint Advanced Management to check for security gaps and other missed opportunities to prevent data misuse.

  • Identify any gaps in your data storage (for example, in Azure or SharePoint) where access control is not well-defined or applied consistently.

  • Check your Exchange settings and ensure all available protections are in place.

  • Check your Entra policies and ensure access and identity are under control and as least-privileged as possible (see the sketch after this list for one way to spot-check this).

  • Consider usage of Defender, Purview, and Sentinel to increase security and reduce risk.

  • Consider implementing DLP policies which can help protect against data misuse.
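
One way to make parts of this health check concrete is to script it against Microsoft Graph. The Python sketch below is a minimal, illustrative starting point rather than a complete audit: it lists your Entra conditional access policies, then walks the top level of each SharePoint site's default document library and counts the permission grants on each item as a rough signal of over-shared content. The tenant ID, app registration, and client secret are placeholders, and the Graph application permissions it assumes (for example, Policy.Read.All and Sites.Read.All) would need to be granted in your tenant.

```python
# Minimal health-check sketch against Microsoft Graph.
# Assumptions: an Entra app registration with a client secret and application
# permissions such as Policy.Read.All and Sites.Read.All; all IDs are placeholders.
import msal
import requests

TENANT_ID = "<your-tenant-id>"       # placeholder
CLIENT_ID = "<your-app-client-id>"   # placeholder
CLIENT_SECRET = "<your-app-secret>"  # placeholder: store in a vault, not in code

GRAPH = "https://graph.microsoft.com/v1.0"


def get_token() -> str:
    """Acquire an app-only Graph token via the client credentials flow."""
    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
    if "access_token" not in result:
        raise RuntimeError(f"Token request failed: {result.get('error_description')}")
    return result["access_token"]


def graph_get(path: str, token: str) -> dict:
    """GET a Graph path and return the JSON body (pagination omitted for brevity)."""
    resp = requests.get(f"{GRAPH}{path}", headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.json()


def main() -> None:
    token = get_token()

    # Entra check: list conditional access policies and note their state
    # (a disabled or report-only policy may point to a gap worth reviewing).
    for policy in graph_get("/identity/conditionalAccess/policies", token).get("value", []):
        print(f"Conditional access policy '{policy['displayName']}': state={policy['state']}")

    # SharePoint check: count permission grants (direct grants and sharing links)
    # on the top-level items of each site's default document library.
    for site in graph_get("/sites?search=*", token).get("value", []):
        try:
            drive = graph_get(f"/sites/{site['id']}/drive", token)
            for item in graph_get(f"/drives/{drive['id']}/root/children", token).get("value", []):
                perms = graph_get(f"/drives/{drive['id']}/items/{item['id']}/permissions", token)
                print(f"{site.get('displayName')}/{item['name']}: {len(perms.get('value', []))} permission grant(s)")
        except requests.HTTPError:
            # Some sites may not expose a default drive to this app; skip them here.
            continue


if __name__ == "__main__":
    main()
```

In practice you would page through results, scope the scan to the sites that matter most, and feed the output into a report for review rather than printing it to the console.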


Roll out sensitivity labels in Purview: Adding labels to sensitive information in your organization can help Copilot skip data a user doesn't have rights to use, even if they technically have permission to access it. Enabling the labels is only the first step; labeling your sensitive content is a large and likely ongoing effort involving many people, and it will require additional change management and training so that people use the labels properly.
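
Once labels are published, part of the labeling effort itself can be scripted. The sketch below is an assumption-laden illustration of applying an existing label to a single file through Microsoft Graph's assignSensitivityLabel action on a drive item: the drive ID, item ID, and label GUID are placeholders, the caller must supply a Graph token with appropriate permissions (such as Files.ReadWrite.All or Sites.ReadWrite.All), and you should verify the licensing and metering requirements of this API in your tenant before relying on it.

```python
# Minimal sketch: apply an existing Purview sensitivity label to one file via
# Microsoft Graph. All IDs below are placeholders; the caller supplies a token
# (for example, from the get_token() helper in the health-check sketch above).
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

DRIVE_ID = "<drive-id>"                # placeholder: the document library's drive ID
ITEM_ID = "<item-id>"                  # placeholder: the file to label
LABEL_ID = "<sensitivity-label-guid>"  # placeholder: GUID of a published label


def assign_label(token: str) -> None:
    """Ask Graph to apply the label; the operation completes asynchronously."""
    resp = requests.post(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{ITEM_ID}/assignSensitivityLabel",
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        json={
            "sensitivityLabelId": LABEL_ID,
            "assignmentMethod": "standard",
            "justificationText": "Labeled during Copilot readiness cleanup",
        },
    )
    resp.raise_for_status()  # Graph accepts the request and applies the label later
    print(f"Label assignment accepted (HTTP {resp.status_code})")
```

Because the assignment is applied in the background, bulk labeling jobs typically queue many such requests and verify the results afterward rather than treating each call as instantly complete.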


If you are on the path to rolling out Microsoft Copilot in your organization, you should consider these four prerequisites to reduce risk and improve adoption of the tool.


Talk to a Cadence Expert to get started with your Health Check and discuss your automation readiness.


