• nebulaone@lemmy.world · 28 points · 2 days ago

    Reputation and PGP signatures could be used to verify real, human-made content. That is, of course, only if people actually care, which I think will be rare.

    There might be no-AI communities that require this and are kept closed to avoid being scraped for AI training.

    Edit: Also, AI is already enshittifying itself, which might get worse as it becomes even more widespread than it already is.

      • CarrotsHaveEars@lemmy.ml · 3 points · 1 day ago

        Throw the technical bit away and just think of it as a signature. Yes, an old-school, written-by-hand signature.

        Does the bank trust that I'm giving you $100 when you present this cheque? Yes. Why? Because I told them what my name is and what my signature looks like.

        Will the bank give you $100 if you stole my cheque and sign your name on it?

      • groet@feddit.org · 12 points · 1 day ago

        It's not about "just having a signature". It's about a web of trust. It only works if you verify that the key actually belongs to a creator who is a real person.

        Basically, creators go to a convention, hand out their public keys in person, and have other creators sign their keys. If you trust that creator A is real and they signed the key of creator B, you can have some trust that B is also real. And if your buddy went to the convention, met A and B, got their public keys, and tells you they are real, you can also trust they are real.

        The more steps/signatures you are away from a creator, the less trustworthy they are, and nothing really ensures a (human) creator doesn't secretly use AI. If somebody is found to be a fraud, everyone has to distrust their key.
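        A minimal sketch of the hop-counting idea above (the names and the signature graph are made up for illustration): trust falls off with the number of signatures between you and a creator, and vanishes when no chain exists.

```python
from collections import deque

# Hypothetical signature graph: an edge A -> B means "A signed B's key
# in person". "me" has only verified creator A directly.
signatures = {
    "me": ["A"],
    "A": ["B"],
    "B": ["C"],
}

def trust_distance(start, target):
    """Return the number of signature hops from start to target,
    or None if no chain of signatures exists (no trust at all)."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == target:
            return dist
        for nxt in signatures.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

print(trust_distance("me", "A"))  # 1 hop: I verified A myself
print(trust_distance("me", "B"))  # 2 hops: less trust than A
print(trust_distance("me", "Z"))  # None: no chain, no trust
```

        Real PGP implementations refine this with signature counts and per-key trust levels, but the core is the same: shorter chains of in-person signatures mean more confidence.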

      • PieMePlenty@lemmy.world · 6 points · 1 day ago

        Trust is the most important part. You trust that someone made something themselves. They digitally sign their work with their private key, and the matching public key is known to be theirs. You can now verify that they (the person you trust) made it.
        Once a trusted creator's private key is leaked, they can no longer be trusted for future works.
        AI-made content can be freely signed as well, but if you don't trust the origin, the signature doesn't matter anyway, since it will just verify that the content comes from the AI's operator.
        The key thing is trust; the signature is just there to verify.
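        A toy sketch of that sign-then-verify flow, using textbook RSA with tiny fixed primes (illustration only, nowhere near secure; real tools like GPG apply the same principle with much larger keys and proper padding):

```python
import hashlib

# Toy RSA key pair with tiny fixed primes -- NOT secure, demo only.
p, q = 61, 53
n = p * q                           # 3233, part of the public key
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent, kept secret

def sign(message: bytes) -> int:
    """Only the holder of the private key d can produce this."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Anyone with the public key (n, e) can check it."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

sig = sign(b"my original work")
print(verify(b"my original work", sig))   # True: signature matches
print(verify(b"tampered work", sig))      # False: content was altered
```

        Note the signature only proves which key signed the content, exactly as the comment says: if the key belongs to an AI operator, verification succeeds just the same, and trust in the key holder has to come from elsewhere.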

        • prole@lemmy.blahaj.zone · 1 point · 11 hours ago

          What is stopping a human from building a reputation on a signature and then selling it (à la reddit accounts)?

    • alekwithak@lemmy.world · +4/−2 · 1 day ago

      I like webghost0101's idea:

      […] a blockchain linked video camera where metadata of footage gets written into the chain to combat fake news and misinformation.

      The goal would be to create a proof and record of original footage, to which media publishers and people who share can link to verify authenticity and authorship.

      If the media later gets manipulated or reframed you would be able to verify this by comparing it to the original record.

      • sus@programming.dev · +14/−1 · 1 day ago

        The blockchain is a totally useless extra bit glued on there. All the real evidence would be the cryptographic signatures added by the hardware manufacturer (which can be faked, but that requires extracting the keys from the "security chip" in the camera, which may be very difficult).

        All the blockchain does at that point is provide a timestamp, i.e. "a signed hash of the picture + metadata was uploaded on date X", which can easily be done without a blockchain too.
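        A sketch of that point (the footage bytes and metadata fields are made up): the useful record is just a hash of the footage plus its metadata. Publishing that hash anywhere timestamped (an RFC 3161 timestamping service, a transparency log, even a newspaper) lets anyone detect later manipulation, no chain required.

```python
import hashlib
import json

def record_hash(footage: bytes, metadata: dict) -> str:
    """Hash of footage + canonicalized metadata; this short string is
    the only thing that needs to be published with a timestamp."""
    blob = footage + json.dumps(metadata, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

meta = {"time": "2025-01-01T12:00Z", "gps": "0,0"}
published = record_hash(b"raw sensor data", meta)   # published at capture time

# Later: check a circulating copy against the published record.
genuine = record_hash(b"raw sensor data", meta)
edited = record_hash(b"manipulated data", meta)

print(genuine == published)  # True: matches the original record
print(edited == published)   # False: manipulation detected
```

        The blockchain adds nothing to this beyond an expensive way of hosting the published hash; the hardware signature is what ties the hash to a real camera.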

        • Buddahriffic@lemmy.world · 1 point · 10 hours ago

          Yeah, the interface between analog reality and its digital representation will always be the weak point. But such a scheme could at least mean that whatever version of reality is going to be presented must be decided relatively quickly after the moment has passed, rather than it being possible to create whatever video helps the most well after the fact.

          Like, if someone is trying to frame someone for a crime, with video authentication they need to create and authenticate the video based on when they want the supposed crime to have occurred. Without it, they could find out when their target has alibi gaps and just target that time after the fact.

          Though another point on reality's side is corroboration from other nearby cameras. If my camera says you walked onto my property at 8:00 and left at 8:30, there should be nearby cameras I don't own that pick you up before and after those times; every camera with an angle should pick it up. Though the advantage goes to those with many cameras, since, real footage or fake, they'd be better able to build a longer narrative and dispute conflicting ones.