New research from Zscaler claims that companies are rushing to adopt generative AI tools too early, often overlooking their own cybersecurity.
In this article from TechRadar (Companies are rushing to use generative AI, but are ignoring the security risks, written by Craig Hale and available here), a survey of more than 900 global IT decision-makers found that 95% of organizations use generative AI tools like ChatGPT in their businesses, yet 89% consider them risky.
One-third (33%) of the businesses surveyed had not implemented any additional security measures to protect against generative AI risks, although some had started to explore the matter. Nearly another quarter (23%) were not monitoring GenAI usage at all.
Sanjay Kalra, VP Product Management at Zscaler, said: “With the current ambiguity surrounding their security measures, a mere 39% of organizations perceive their adoption as an opportunity rather than a threat. This not only jeopardizes their business and customer data integrity, but also squanders their tremendous potential.”
According to the research, smaller businesses are more likely to perceive generative AI usage as risky.
More than half (51%) of Zscaler’s respondents expect interest in generative AI to increase substantially by the end of the year, leaving companies only a matter of weeks to fine-tune their processes.
Maybe I should extend my BlackBerry lesson from eDiscovery providers to companies implementing AI technology as well! Adopting new technology – any technology – without a plan to ensure security standards are met is highly flawed. It’s not just the technology that needs to be sound – the plan for implementing and using it needs to be sound as well.
So, what do you think? Are you surprised that this study finds that companies are rushing to use generative AI tools too early? Please share any comments you might have or if you’d like to know more about a particular topic.
Image created using Microsoft Bing’s Image Creator Powered by DALL-E, using the term “open door with data flowing out of it”.
Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.