According to a survey by Software AG, employees are increasingly using their own AI tools without the approval of their employers’ IT departments, a practice known as ‘shadow AI’. Many cite the lack of AI tools provided by their employers or a desire for more flexibility in their choice of tools. For some, it is quicker and easier to apologise later than to seek permission now.
However, using shadow AI comes with risks, including the potential exposure of trade secrets and data breaches: company data may be stored in AI services outside the employer’s control, often without employees being aware of the vulnerability. Harmonic Security, a company that helps firms identify shadow AI, has seen more than 5,000 AI applications in use, including custom versions of ChatGPT and business software with built-in AI features. To address these concerns, some employers are creating internal AI tools that allow employees to use AI safely.
What should you do if you discover shadow AI use? Simon Haighton-Williams, Chief Executive Officer of The Adaptavist Group, a UK-based software services group, says, ‘Be patient and understand what people are using and why, and figure out how you can embrace it and manage it rather than demand it’s shut off. You don’t want to be left behind as the organisation that hasn’t [adopted AI].’
Source: Why employees smuggle AI into work, bbc.co.uk, 4 February 2025.