Microsoft's new Services Agreement is rather questionable. In short: it does not clarify whether Microsoft will use your data to train its AI.

So Mozilla is issuing a call to arms to sign its petition demanding a proper answer from Microsoft! You can sign it here -> https://foundation.mozilla.org/en/campaigns/microsoft-ai/

Mozilla's context:

Ask Microsoft: Are you using our personal data to train AI? We had four lawyers, three privacy experts, and two campaigners look at Microsoft’s new Service Agreement, and none of our experts could tell if Microsoft plans on using your personal data – including audio, video, chat, and attachments from 130 products, including Office, Skype, Teams, and Xbox – to train its AI models.

If nine experts in privacy can’t understand what Microsoft does with your data, what chance does the average person have? That’s why we’re asking Microsoft to say if they’re going to use our personal data to train its AI.

    • dinckel@lemmy.world · 8 months ago

      • They harvest it without your consent
      • They don’t tell you what they harvest
      • They don’t tell you what they use it for
      • It’s your personal data

      Yes, you could argue that by signing up for their services you grant them perpetual permission to do what they want with your data, which is what usually happens, but the real issue is that this is considered acceptable to begin with.

    • tux@lemmy.world · 8 months ago

      Why should a company get to use my work and data for free to train its AI, from which it’ll make a ton of money, without compensating me? At a minimum, they should inform me so I can make that choice with full knowledge.

      This isn’t a university or an educational research project either; this is one of the largest companies in the world, with billions of dollars in annual revenue. And to top it off, I already have to pay them for their operating system and annually for their office suite. So not only am I paying them for their products, but they’ll also steal my data to train an AI and then try to sell that to us too?

      That’s not even taking into account the “AI might replace me at my job” concerns that a number of folks have.

      • restingboredface@sh.itjust.works · 8 months ago

        Not to mention that if they include Office products in this, it’s not just personal information.

        A lot of IP gets produced there, even if it’s not purchased or created within an enterprise license. So if they train on that, they’ll basically be stealing corporate information that they definitely have no rights to.

      • ItsComplicated@sh.itjust.works · 8 months ago

        In theory, they shouldn’t. Society has given in too much regarding what data can be used, and here we are.

        I don’t think third parties or tech companies should be allowed access to any personal information. Your personal information is just that… personal. Unfortunately, it all comes down to bottom-line profit.

      • wmassingham@lemmy.world · 8 months ago

        They compensate you by providing products like Bing for free, the same way Facebook pays its bills by running ads.

    • DarkThoughts@kbin.social · 8 months ago

      Here’s a deal. Give me unlimited lifetime access to your AI and you can use my public online data. Should be fair enough, right?

    • whileloop@lemmy.world · 8 months ago

      Beyond what everyone else has said, it has already been shown that LLMs have a chance of regurgitating training data, which means that someone’s personal data could get returned in a Bing Chat query.