Microsoft Employees Might Review Your Azure AI Prompts and Responses: Artificial Intelligence Trends

Ruh-roh. Based on Casey Flaherty’s review of the “fine print”, it seems that Microsoft employees might review your Azure AI prompts and responses.

Casey’s LinkedIn article (AI Data Privacy Concern: Microsoft employees might review your Azure AI prompts and responses, available here) discusses his assumption that relying on Microsoft for enterprise AI was the same as relying on Microsoft for enterprise cloud storage, and he says flatly: “I was wrong.”

Casey walks through the fine print to illustrate the concern, as follows:

While your prompts (inputs) and completions (outputs), your embeddings, and your training data are NOT available to other customers or OpenAI, or used to improve OpenAI models or any Microsoft or 3rd party products or services, “the Azure OpenAI Service includes both content filtering and abuse monitoring features” and “[t]o detect and mitigate abuse, Azure OpenAI stores all prompts and generated content securely for up to thirty (30) days.”

Human reviewers (which are Microsoft employees) “assessing potential abuse can access prompts and completions data only when that data has been flagged by the abuse monitoring system.”

Don’t want them to do that? “No prompts or completions are stored if the customer is approved for and elects to configure abuse monitoring off… Microsoft allows customers who meet additional Limited Access eligibility criteria and attest to specific use cases to apply to modify the Azure OpenAI content management features by completing this form.”

No problem, right? Well, the form states: “Modified abuse monitoring for Azure OpenAI is only available to managed customers and partners working with Microsoft account teams and only for the use cases listed in this form.”

So, how do you become a managed customer? Apparently, you can’t – at least for now. As Casey points out, the “Limited Access features for Azure AI services” page states this: “Managed customers work with Microsoft account teams. We invite you to submit a registration form for the features you’d like to use, and we’ll verify your eligibility for access. We are not able to accept requests to become a managed customer at this time.”

So, please submit a form that we cannot accept. Thank you.

And that is as far as Casey was able to get, per his article. He then goes on to point this out:

“But you know who does qualify to turn off abuse monitoring, and has, in fact, opted out of abuse monitoring by Microsoft employees? Microsoft. Specifically, Microsoft Copilot.”

“Seems like a bad look,” Casey says. Gee, you think? 😮

Casey goes into much more detail about the issue in his article linked above and points out that he is “all deliberate speed” on Generative AI for the enterprise and “pro Microsoft,” so this is a “statement against interest.” He concludes by saying this:

“I hope Microsoft reverses course. I hope they were moving fast and meaning well. My call to action is for customers and partners to pressure Microsoft by whatever legitimate channels are available.”

So, there’s the issue, about which I’m helping to spread awareness. If it bothers you, consider doing your part to pressure Microsoft on it. Otherwise, it seems that Microsoft employees might review your Azure AI prompts and responses.

Hat tip to Kelly Twigger for the heads up on this issue!

So, what do you think? Are you concerned about data privacy with Azure AI? Please share any comments you might have or if you’d like to know more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.
