Beware Of Shadow AI – Shadow IT’s Less Well-Known Brother

beSpacific 2024-12-31

SecurityWeek: “While AI tools can enable employees to be innovative and productive, significant data privacy risks can stem from their usage. Shadow IT is a fairly well-known problem in the cybersecurity industry: employees use unsanctioned systems and software as a workaround to bypass official IT processes and restrictions. Similarly, with AI tools popping up for virtually every business use case or function, employees are increasingly using unsanctioned or unauthorized AI tools and applications without the knowledge or approval of IT or security teams – a new phenomenon known as Shadow AI. Research shows that between 50% and 75% of employees are using non-company-issued AI tools, and the number of these apps is growing substantially. A visibility problem emerges: do companies know what is happening on their own networks? According to our research, beyond the popular use of general AI tools like ChatGPT, Copilot and Gemini, another set of more niche AI applications being used at organizations includes:

  • Bodygram (a body measurement app)
  • Craiyon (an image generation tool)
  • Otter.ai (a voice transcription and note taking tool)
  • Writesonic (a writing assistant)
  • Poe (a chatbot platform by Quora)
  • HIX.AI (a writing tool)
  • Fireflies.ai (a note taker and meeting assistant)
  • PeekYou (a people search engine)
  • Character.AI (a virtual character creation platform)
  • Luma AI (a 3D capture and reconstruction tool)

Why Shadow AI Is A Major Cybersecurity Risk – Even though AI brings great productivity gains, Shadow AI introduces a distinct set of risks: data leakage, compliance risks, vulnerabilities to cyberattacks, lack of oversight, and legal risks…”