• Fat Tony@lemmy.world

    Why do I feel like every time I hear news such as “Thing will fight against A.I.,” it’s either a scam or a band-aid on a flood?

    I don’t mean to be negative, but fighting software with software like this feels like a cat-and-mouse game.

    • Appoxo@lemmy.dbzer0.com

      I also want my camera to give the best representation of the object I photograph.
      If I want post-processing done, I will do it later.

      • Fat Tony@lemmy.world

        I’m not quite sure what you’re getting at. The basic idea is to be able to easily trace whether or not a photo has been edited, as well as to provide some level of proof that the photo is authentic. To me, though, this is just a lock waiting to get picked. Hence my cat-and-mouse analogy.

        • LUHG@lemmy.world

          Your initial comment said you feel like it’s a band-aid or a scam. It’s cryptography. It’s not a lock waiting to be picked. It’s a fantastic start.

          • aax@lemmy.world

            It literally is a lock waiting to get picked. The keys have to be somewhere on the device to create the signature of the photo. This can be reverse engineered, although it may not be trivial.
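
            To make that concrete, here is a minimal sketch of the signing step in Python, using the cryptography package; the key handling and file name are purely illustrative, not Leica’s actual scheme. The point is that whoever holds private_key can sign any bytes at all, which is exactly why extracting it from the camera breaks the guarantee:

                from cryptography.exceptions import InvalidSignature
                from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

                # In a real camera the key would live in tamper-resistant hardware;
                # here it is just generated in memory for illustration.
                private_key = Ed25519PrivateKey.generate()
                public_key = private_key.public_key()

                image_bytes = open("photo.jpg", "rb").read()

                # "Sign" the photo: anyone holding the key can do this for ANY bytes.
                signature = private_key.sign(image_bytes)

                # Verification only proves the key holder signed these exact bytes.
                try:
                    public_key.verify(signature, image_bytes)
                    print("signature valid")
                except InvalidSignature:
                    print("signature invalid")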

            • LUHG@lemmy.world

              Seriously, if cryptography can be reverse engineered, we have a big fucking problem and photography will be the least of our issues.

              • aax@lemmy.world

                It’s clear you don’t have a great understanding of how this works. You don’t have to break the cryptography. You simply need to extract the cryptographic keys from the device and then reverse the algorithm it uses to create the signature of the photos.

              • 2xsaiko@discuss.tchncs.de

                So you’re saying there’s never been an instance of private keys getting leaked or extracted, ever? And there are probably easier ways to break this than extracting the keys, especially if they’re in some kind of secure chip. People can get the hardware; they can do whatever they want to it. Of course it’s most likely going to be a lot harder than copying someone’s SSH keys off a hard drive.

  • NuXCOM_90Percent@lemmy.zip

    For those wondering why some artists were actually pro-NFT: it was because of this concept. Tie the art to the artist, with an online database to look it up.

    But also: this is worthless, because the credentials are tied to the image itself. So either remove the metadata (which I would expect people to do for privacy reasons anyway) or run the image through a very simple filter/quality downgrade to garble any hidden pixels.
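
    For what it’s worth, stripping an in-file credential is a couple of lines with Pillow; the file names here are made up, and this is only to show how little effort the attack takes:

        from PIL import Image

        img = Image.open("signed_photo.jpg")

        # Re-saving the pixels produces a fresh file: EXIF/XMP-style metadata
        # (and any credential stored there) is not carried over by default.
        img.save("stripped.jpg")

        # A downscale/re-upscale pass additionally garbles hidden-pixel marks.
        w, h = img.size
        img.resize((w // 2, h // 2)).resize((w, h)).save("downgraded.jpg")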

    The ACTUAL solution to this is indeed the online database. But the artist registers the image to themselves, and then Google Images or whatever does image recognition (similar to how you can get DMCA-struck for singing a few words of a song) to match it to the database. Lower the quality and it still matches. And if they find your rendition of Kim Possible’s feet in an AI image, it can potentially give you some of the revenue from that.

    But that wouldn’t require new proprietary hardware.

    • SzethFriendOfNimi@lemmy.world

      Why not go with some kind of certificate chain instead?

      Here’s the image… signed… here’s who signed it.

      What about edits/changes?

      Here’s an image that was edited from an earlier image. Here’s who signed that… and its base image’s hash, which can then be looked up if someone decides to see what the source images were.
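
      Roughly what such a chain could look like, sketched in Python; the record fields and the flat in-memory “registry” are invented for illustration, not any real standard:

          import hashlib

          # Hypothetical registry: image hash -> provenance record.
          registry = {}

          def record_edit(base_bytes, edited_bytes, signer):
              base_hash = hashlib.sha256(base_bytes).hexdigest()
              new_hash = hashlib.sha256(edited_bytes).hexdigest()
              registry[new_hash] = {"signer": signer, "base": base_hash}
              return new_hash

          def trace(image_hash):
              # Walk the chain back toward the original capture.
              while image_hash in registry:
                  entry = registry[image_hash]
                  print(f"{image_hash[:12]}… signed by {entry['signer']}")
                  image_hash = entry["base"]
              print(f"{image_hash[:12]}… (no record: original or unregistered)")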

      • dual_sport_dork 🐧🗡️@lemmy.world

        That only works if everyone plays by the rules. Literally everyone.

        Here’s the image, signed. Here’s an unauthorized copy of the image, or of a portion of the image, with the pixels extracted and saved as a .jpeg with none of the identifying signature or certificate data. Here’s that same image posted to 4chan and Reddit.

        A certificate chain would only work if every image-displaying piece of software in the world not only played by its rules, but was also incapable of displaying or modifying an unsigned image. I don’t think I have to spell out for you what kind of nightmare that would be.

        • snooggums@kbin.social

          Basically, screenshots bypass any security built into the metadata?

          Double-checking, as I assume that is the case but don’t know for certain.

          • dual_sport_dork 🐧🗡️@lemmy.world

            Yes, if it’s truly metadata that’s not in the image itself. For instance, the image could theoretically be digitally watermarked (this technology already exists, actually) in a manner that humans can’t see or would find tough to notice, but that an algorithm looking for it can spot. That can be defeated too, although depending on the robustness of the watermark technology it may take more effort.
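
            A toy version of that watermark idea: hide a bit pattern in each pixel’s least significant bit, invisible to eyes but trivial for code to read back. (Real schemes are far more robust; a single JPEG re-encode would wipe an LSB mark like this one.)

                import numpy as np

                def embed(pixels: np.ndarray, mark: np.ndarray) -> np.ndarray:
                    # Write the watermark bits into each pixel's least significant bit.
                    return (pixels & 0xFE) | (mark & 1)

                def extract(pixels: np.ndarray) -> np.ndarray:
                    # Read the hidden bit plane back out.
                    return pixels & 1

                pixels = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
                mark = np.random.randint(0, 2, (4, 4), dtype=np.uint8)
                assert np.array_equal(extract(embed(pixels, mark)), mark)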

            The output loophole always exists: Any time you produce any output capable of being understood by a human (eyes, ears, both…) somebody can record and reproduce it. Probably not bit-for-bit, pixel-for-pixel, but you can always point a camera at the screen. (Or put your screen face down on a flatbed scanner that’s had its lightbar defeated, or put a microphone in front of the speakers, or…)

      • NuXCOM_90Percent@lemmy.zip

        That is the metadata solution, tied to the image itself. It doesn’t work, because all I have to do is strip the metadata. This is why there is an almost ritualistic worship of certs in software development and internet traffic.

        The key is that you need the validation to be decoupled from the image. Computer vision is pretty much perfect for this, which is why I specifically referenced how DMCA violations are detected now: Google and Amazon do the scan, not the end user.
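
        That kind of decoupled matching is typically done with perceptual hashes rather than exact ones, e.g. with the Python imagehash library (file names are placeholders). A recompressed or downscaled copy still lands within a small Hamming distance of the registered original:

            import imagehash
            from PIL import Image

            registered = imagehash.phash(Image.open("registered_original.png"))
            candidate = imagehash.phash(Image.open("recompressed_copy.jpg"))

            # Subtracting two hashes gives the Hamming distance; small means near-duplicate.
            if registered - candidate <= 8:
                print("match: likely a copy of the registered image")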

        • OneCardboardBox@lemmy.sdf.org

          I think that’s not the problem that this technology is intended to solve.

          It’s not a “Is this picture copied from someone else?” technology. It’s a “Did a human take this picture, and did anyone modify it?” technology.

          E.g.: Photographer Bob takes a picture of Famous Fiona driving her Camaro and posts it online with this metadata. Attacker Andy uses photo editing tools to make it look like Fiona just ran over a child. Maybe his skills are so good that the edits are undetectable.

          Andy has two choices: Strip the metadata, or keep it.

          If Andy keeps the metadata, anyone looking at his image can see that it was originally taken by Bob, and that Fiona never ran over a child.

          If Andy strips the metadata (and if this technology is widely accessible and accepted by social media, news sites, and everyday people) then anyone looking at the image can say “You can’t prove this image was actually taken. Without further evidence I must assume that it’s faked”.
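
          The check a platform or news site would run boils down to something like this (sketched in Python; get_manifest and verify_signature are hypothetical helpers standing in for a real Content Credentials library):

              def assess(image_bytes):
                  manifest = get_manifest(image_bytes)  # hypothetical: parse embedded credentials
                  if manifest is None:
                      return "no credentials: provenance unknown, treat as unverified"
                  if not verify_signature(manifest, image_bytes):  # hypothetical signature check
                      return "credentials present but invalid: image was modified"
                  return f"taken by {manifest['signer']}; edits: {manifest['edits']}"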

          I think spinning this as a tool to fight AI is just clickbait because AI is hot in the news. It’s about provenance and limiting misinformation.

          • NuXCOM_90Percent@lemmy.zip

            Which does not solve that at all.

            The vast majority of “paparazzi” and controversy pictures aren’t taken by Jake Gyllenhaal. They are taken by randos on the street with phones, who then sell their pictures to TMZ or whatever.

            And they aren’t going to be paying for an expensive Leica camera. And Samsung and Apple aren’t going to be licensing that tech.

              • NuXCOM_90Percent@lemmy.zip

                People can write whatever they want

                5.1, 5.2, 5.3, 5.5, and 5.6 all require basically universal adoption for this to be at all useful. And 5.4 and 5.7 (as well as many of the rest) already fall apart once you realize this is metadata that people have to opt in to keeping. 5.4 in particular feels prone to breaking if a video is edited for flow or to remove sensitive information.

                Much like “The Blockchain” and NFTs, this sort of touches on a real issue but is a horrendously bad and pointless implementation.

                • OneCardboardBox@lemmy.sdf.org

                  I don’t quite get why some of those cases require universal adoption. News photos: You just need one big news company to say “we’re giving all our photographers a camera with this tech” and then it serves its purpose.

                  You see a headline “SHOCKING photo published by MegaNewsCorp will send you into a coma!” then you can validate that it came from a MegaNewsCorp photographer. If you trust MegaNewsCorp, then the tech has done its job. If you didn’t trust MegaNewsCorp already, then this tech changes nothing. I think there is moderate value in that, overall.

                  The story of this tech is getting picked up and thrown around by bad tech journalism, being game-of-telephone’d into some kind of game changer.

                  Plenty of open standards live and die by whether or not one big player decides to adopt them.

        • hyperhopper@lemmy.ml

          This is nothing like a blockchain. Blockchains are distributed and assume zero trust in any actor. This is just a database that you have to trust fully. Literally the opposite.

  • AutoTL;DR@lemmings.world (bot)

    This is the best summary I could come up with:


    Content Credentials, announced in October, includes encrypted metadata detailing where and when the photo was taken and with what camera and model.

    When a photographer opts to use the feature, they’ll see a Content Credentials logo in the camera’s display, and images will be signed through the use of an algorithm.

    Users who find these images online can click on the CR icon in the [pictures’] corner to pull up all of this historical manifest information as well, providing a clear chain of provenance, presumably, all the way back to the original photographer.

    Content Credentials can help instill trust in shared images, but only if it sees notable adoption.

    “The Leica M11-P launch will advance the CAI’s goal of empowering photographers everywhere to attach Content Credentials to their images at the point of capture, creating a chain of authenticity from camera to cloud and enabling photographers to maintain a degree of control over their art, story, and context,” the CAI, whose 2,000-member roster includes Leica, Adobe, the Associated Press, Microsoft, and Reuters, said in a blog post Thursday.

    Other M11-P specs include a 60 MP BSI CMOS sensor, Leica’s Maestro-III processor, and 256GB of storage.


    The original article contains 437 words, the summary contains 195 words. Saved 55%. I’m a bot and I’m open source!