At its recent Ignite event, Microsoft intensified its efforts to encourage businesses to adopt its AI assistant, Microsoft 365 Copilot. However, customer feedback points to a significant challenge for the initiative: the potential for the AI to inadvertently expose sensitive information to employees.
In response to these concerns, Microsoft has introduced a suite of new tools designed to bolster data security and governance. The enhancements include new capabilities in SharePoint Advanced Management and Purview, along with a deployment blueprint to help organisations roll out the generative AI assistant effectively.
Jennifer Glenn, research director for IDC's Security and Trust Group, acknowledged the rising anxiety among data security professionals regarding AI tools such as Copilot. Speaking to Computerworld, she said, “AI tools like Copilot are an increasing concern for data security professionals due to the amount and nature of data that these tools have access to.” Glenn also emphasised the distinct reach of Microsoft 365 within the enterprise, noting that fears about Copilot inappropriately accessing or disclosing sensitive data are widespread amongst her peers.
Glenn underscored the importance of Microsoft's new governance and security solutions in addressing the oversharing issue and giving enterprises greater confidence in adopting AI tools. She noted, "The new data governance and security tools from Microsoft to address oversharing are essential for enterprises to feel confident in adopting AI tools like Copilot."
In summary, as organisations weigh incorporating AI technology into their operations, Microsoft's latest advancements aim to ease concerns about data exposure while fostering a safer environment for using AI applications such as Microsoft 365 Copilot.
Source: Noah Wire Services