‘AI won’t replace you, but someone using AI will’ has already become a favourite maxim. Copilot is a generative artificial intelligence chatbot developed by Microsoft and launched in February 2023. It is already quite widely used: among the participants in the discussion were organisations holding anywhere from 20 to 300 licences.

The member who initiated the event gave a presentation that set the scene and got the discussion started.

Using GenAI tools

In addition to Copilot, organisations are of course also experimenting with tools such as ChatGPT and Bing Chat. The participants agreed that GenAI tools are most useful for specific profiles, such as marketers, programmers, or legal experts. One reason to start with Copilot is to (potentially) bring the benefits of GenAI to all profiles. While participants said Copilot works well for Teams, Outlook and searching documents, they had doubts about using it for Excel or PowerPoint. Measuring the real benefit still needs improvement: the automated usage figures from Microsoft were not considered adequate, so most organisations are using surveys to determine the ROI.

Guidance for Copilot users

Another discussion topic was what guidance to give to employees, and especially to the Copilot users. One suggested best practice was to reverse the logic of what data can be accessed or used with Copilot: instead of blocklisting files containing sensitive data (e.g. salaries) that cannot be used, create a list of data that CAN be used – an allowlist. Participants pointed to FAQs, product descriptions and practical HR information (such as how to request vacation days, or how time off is calculated) as good data to use with Copilot.
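The allowlist idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration of default-deny filtering; the path prefixes and function name are invented for the example and are not part of any Copilot configuration.

```python
# Sketch of allowlist-based filtering for content exposed to a GenAI tool.
# Default-deny: a file is usable only if it matches an approved prefix.
# All paths below are illustrative examples, not real locations.

ALLOWLIST_PREFIXES = (
    "shared/faq/",
    "shared/product-descriptions/",
    "shared/hr/practical/",   # e.g. how to request vacation days
)

def may_be_used(path: str) -> bool:
    """Return True only for files under an explicitly approved prefix."""
    return path.startswith(ALLOWLIST_PREFIXES)

print(may_be_used("shared/faq/returns.md"))             # on the allowlist
print(may_be_used("finance/confidential/salaries.xlsx"))  # denied by default
```

The design point is the inversion: anything not explicitly listed stays out, so a newly created sensitive file is protected without anyone having to remember to blocklist it.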

Data access by Microsoft through the use of Copilot also came up, as did the question of where data is stored. Both are issues to look into before deciding whether to purchase Copilot.

Other best practices included:

  • creating a usage document with no more than 5 points
  • giving employees a prompt library or some templates to use with ChatGPT
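A prompt library like the one suggested above can be as simple as a set of named, fill-in-the-blanks templates. The sketch below uses Python's standard `string.Template`; the template names and wording are invented examples, not a recommended set.

```python
# Minimal sketch of a shared prompt library: named templates with
# placeholders that employees fill in before pasting into a GenAI tool.
from string import Template

PROMPT_LIBRARY = {
    "summarise_meeting": Template(
        "Summarise the following meeting notes in $language, "
        "in at most $max_points bullet points:\n\n$notes"
    ),
    "draft_faq_answer": Template(
        "Write a short, friendly FAQ answer in $language "
        "to the question: $question"
    ),
}

# An employee picks a template and substitutes their own values.
prompt = PROMPT_LIBRARY["summarise_meeting"].substitute(
    language="English", max_points=5, notes="...paste notes here..."
)
print(prompt)
```

Keeping the templates in one shared place gives employees a consistent starting point and makes it easy to update wording centrally.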

Furthermore, in all the pilots carried out, training was provided through both e-learning and ‘train the trainer’ programmes.

Attention points

Several attention points surfaced during the roundtable as well:

  • When starting out with Copilot, organisations are forced to move to a monthly patching cycle and to run the newest version of M365. This means both more frequent patching and a less stable version. Especially in production environments, the patching requirements can be critical. We later ran a poll with our Security task force: while the number of responses was small, most of the organisations found out about the patching issues AFTER they started with Copilot.
  • It is important to ensure clear communication around tools to measure productivity and ROI. If this isn’t well communicated, it can sink the whole use case.
  • Using different languages is problematic for many of the AI tools. If possible, stick to one language.
  • Transcription in Teams works rather well, but it can easily make storage explode, so pay attention to retention periods. It is also unclear whether participants can individually opt out.
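To see why retention periods matter for transcript storage, a back-of-the-envelope estimate helps. Every figure below is an illustrative assumption for the sake of the arithmetic, not a Microsoft number or a measurement from the discussion.

```python
# Rough estimate of annual transcript storage growth without retention limits.
# All inputs are assumptions chosen only to illustrate the calculation.

employees = 1000
transcribed_hours_per_week = 5    # per employee, assumed
kb_per_transcript_hour = 100      # assumed plain-text transcript size

weekly_kb = employees * transcribed_hours_per_week * kb_per_transcript_hour
yearly_gb = weekly_kb * 52 / 1024 / 1024
print(f"~{yearly_gb:.0f} GB of transcripts per year")
```

Even with modest per-hour sizes, the volume accumulates steadily year over year, which is why setting a retention period early was flagged as a best practice.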

The conclusion from the discussion was that experimentation is ongoing: while the advantages for certain specific profiles and use cases are clear, the case for general use has yet to be proven.