Microsoft may store your conversations with Bing if you’re not an enterprise user

The new AI Services policies also prohibit reverse engineering Microsoft's products and collecting data from their output.

Published on 22nd August 2023

Microsoft prohibits users from reverse engineering or harvesting data from its AI software to train or improve other models, and will store inputs passed into its products as well as any output generated.

The details emerged as companies face fresh challenges with the rise of generative AI. People want to know what corporations are doing with information provided by users. And users are likewise curious about what they can do with the content generated by AI. Microsoft addresses these issues in a new clause titled “AI Services” in its terms of service.

The five new policies, which were introduced on 30 July and will come into effect on 30 September, cover reverse engineering, data extraction, the use of AI output to build other models, and Microsoft's storage of users' inputs and generated output.

It is sometimes easy to peer under the hood of Microsoft's AI products. When the company launched its AI-powered Bing chatbot, for example, users uncovered the prompt used to shape its behavior, which revealed its secret codename, “Sydney.” Developers have also tried to poke around at the underlying system to get a better sense of how it generates text. All of that is now technically banned under the new policies.

Rules preventing developers from using AI-generated output to train or improve other systems have also been adopted by companies such as Meta, Google, OpenAI, and Anthropic. It's not clear how well these rules can be enforced, however, especially as more and more of the text on the internet is written by machines, and all of these companies scrape such material to train their own models.

A spokesperson from Microsoft declined to comment on how long the company plans to store user inputs into its software.

“We regularly update our terms of service to better reflect our products and services. Our most recent update to the Microsoft Services Agreement includes the addition of language to reflect artificial intelligence in our services and its appropriate use by customers,” the representative told us in a statement.

Microsoft has previously said, however, that it doesn't save conversations or use that data to train its AI models for its Bing Enterprise Chat mode. The policies are a little murkier for Microsoft 365 Copilot: although it doesn't appear to use customer data or prompts for training, it does store information.

“[Copilot] can generate responses anchored in the customer’s business content, such as user documents, emails, calendar, chats, meetings, contacts, and other business data. Copilot combines this content with the user’s working context, such as the meeting a user is in now, the email exchanges the user has had on a topic, or the chat conversations the user had last week. Copilot uses this combination of content and context to help deliver accurate, relevant, contextual responses,” it said.

In short, if you're not an enterprise user and don't want Microsoft to store your prompts or the conversations you have with the chatbot, be mindful of what you type into Bing.
