Brand MU Day

    AI PBs

    Game Gab
    126 Posts, 36 Posters, 3.8k Views
    This topic was forked from PBs Tez

    • Pavel

      So AI isn’t so much coming for my job as it is creating more work for my colleagues. I see.

      He/Him. Opinions and views are solely my own unless specifically stated otherwise.
      BE AN ADULT

    • Gashlycrumb @MisterBoring

        @MisterBoring said in AI PBs:

        Are there any other options that might represent a truly ethical source of PB art?

        My tabletop players have made HeroForge minis of their PCs. Not actually downloaded them or had them 3d printed or anything, just screenshots of the miniature online. So, it’s free, and I don’t think anybody minds. Mind you they do sometimes buy stuff from HeroForge, too.

        "This is Liberty Hall; you can spit on the mat and call the cat a bastard!"
        – A. Bertram Chandler

    • ProperPenguin @Gashlycrumb

          @Gashlycrumb said in AI PBs:

          @MisterBoring said in AI PBs:

          Are there any other options that might represent a truly ethical source of PB art?

          My tabletop players have made HeroForge minis of their PCs. Not actually downloaded them or had them 3d printed or anything, just screenshots of the miniature online. So, it’s free, and I don’t think anybody minds. Mind you they do sometimes buy stuff from HeroForge, too.

          I have also seen people use things like BG3 and Cyberpunk’s character creators in a similar fashion for VTT.
          (And it’s becoming more common for games to release character creators for free in advance of the game’s release.)

    • MisterBoring

      I’ve actually been looking at using Unreal Engine’s MetaHuman Creator to make PBs, but my abilities in that engine are… awful.

            Proud Member of the Pro-Mummy Alliance

    • Wizz @Pavel

              @Pavel said in AI PBs:

              So AI isn’t so much coming for my job as it is creating more work for my colleagues. I see.

      Here’s another article about it more generally. I find it kind of funny but also deeply sad, considering that these are probably just…what, undiagnosed narcissists who have badly needed intervention for years? And now they’ve got Fancy Autocorrect endlessly validating them into psychosis.

    • MisterBoring @Wizz

                @Wizz There’s a dark part of my psyche that hopes it leads one of them to this:

      [Image: a man sitting in a theater, eating popcorn and smiling]

                Proud Member of the Pro-Mummy Alliance

    • Pavel @Wizz

                  @Wizz Yeah, I was being a bit flippant, but some colleagues of mine have already reported (anecdotally, their study is as-yet unpublished) an increase in cases of delusion being “fed” by hallucinating LLMs supposedly confirming cases of unreality.

                  Alas, most research I’ve seen thus far has been on clinical applications of LLMs rather than their clinical impact.

                  He/Him. Opinions and views are solely my own unless specifically stated otherwise.
                  BE AN ADULT

    • somasatori @Pavel

                    @Pavel said in AI PBs:

                    Alas, most research I’ve seen thus far has been on clinical applications of LLMs rather than their clinical impact.

                    “I don’t need a therapist anymore, I talk to ChatGPT and it helps” - BPD patient I was working with

      [three attached screenshots]

                    "And the Fool says, pointing to the invertebrate fauna feeding in the graves: 'Here a monarchy reigns, mightier than you: His Majesty the Worm.'"
      – Italo Calvino, The Castle of Crossed Destinies

    • RedRocket @Faraday

      @Faraday Artists can’t win most of these A.I. cases in court, because training an A.I. isn’t the same as copying someone’s work.

      A lot of people picture A.I. art as theft because they think the program is doing something like cutting a woman out of a painting, pasting her over a photographer’s picture of a mountain cabin by a lake, grabbing a picture of someone’s dog with a stick in its mouth from Facebook, and slapping a filter over it all to make it the same style. That isn’t how it works, though. Everything the A.I. makes is entirely original.

      The training process teaches it to draw in the same way humans learn to do art: trial and error, with self-reinforced learning when it gets it right and tweaks to its method when it gets it wrong. It tests to see how similar what it makes is to the original training material. The reason it’s so powerful is that it can learn 24 hours a day without humans being involved at all past the initial setup.
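
      In very rough terms the loop is: generate something, measure how far off it is from the training material, nudge the parameters, repeat. Here’s a toy Python sketch of that shape (just an illustration of the idea, not any real model’s training code; real systems use gradient descent over a neural network with billions of parameters, not random nudging of four numbers):

          import random

          # Toy sketch of the generate -> compare -> tweak training loop.
          training_example = [0.2, 0.8, 0.5, 0.9]             # stand-in for real training data
          params = [random.random() for _ in range(4)]         # the model's adjustable "method"

          def generate(p):
              # A real generator is a neural network; this one just mixes its parameters.
              return [(p[i] + p[(i + 1) % 4]) / 2 for i in range(4)]

          def error(output, target):
              # How similar is what it made to the training material?
              return sum((o - t) ** 2 for o, t in zip(output, target))

          for _ in range(20000):
              candidate = [w + random.gauss(0, 0.01) for w in params]   # try a small tweak
              if error(generate(candidate), training_example) <= error(generate(params), training_example):
                  params = candidate                                    # keep tweaks that help

          print(error(generate(params), training_example))              # the error shrinks as it "learns"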

      Nothing is actually copied. Artistic style is emulated, which doesn’t violate copyright law.

    • Faraday @RedRocket

                        @RedRocket said in AI PBs:

      Everything the A.I. makes is entirely original.

                        GenAI makes nothing original. Every single thing it does is algorithmically based on the work it’s been trained on. Without that trained work, they’ve got no product.

                        That trained work was used without the permission of the creators. That is the crux of the lawsuits, and while the results have been mixed so far, I believe ultimately the creators will prevail in some form or another (probably a watered-down global licensing pool, but it’s at least something). I believe this because one of the cornerstones of the fair use doctrine is that the transformative work does not replace or compete with the original. That is demonstrably not the case here. This has been theft and plagiarism on a scale that would make Napster blush.

      ETA: The Getty and Disney lawsuits are probably the strongest, as they show pretty compelling evidence that their artwork/photos are baked into these GenAI tools to such a degree that the tools can faithfully reproduce them when prompted. It’s not just stylistic inspiration.

                        @RedRocket said in AI PBs:

                        The training process teaches it to draw in the same way humans learn to do art…

                        GenAI does not learn in the same way a human does. It’s a false equivalence. People keep wanting to anthropomorphize these things like they’re actually intelligent, but they’re not. They’re fancy word- and image-predicting algorithms. Autocomplete on steroids. They do not fundamentally understand the world the way a human does. They have no actual creativity, insight, or originality. They match patterns and generate similar ones. They do it really well, which is why the tools work, but that is not the way humans think or learn.

    • RedRocket @Faraday

                          @Faraday said in AI PBs:

                          Every single thing it does is algorithmically based on the work it’s been trained on. Without that trained work, they’ve got no product.

                          Yes, but the same is true for humans.

      If you never learned to draw by trial and error, by comparing your work to other people’s, by learning anatomy and seeing how close you can get to a goal you set for yourself, you wouldn’t be able to make anything either. Your brain and the A.I. brain work the same way. That’s why, legally speaking, it isn’t copying. It’s a very skilled imitation, yes, but it isn’t copying.

    • RedRocket @Faraday

                            @Faraday said in AI PBs:

                            They have no actual creativity, insight, or originality. They match patterns and generate similar ones.

                            That’s exactly what artists do when they make art. The human artist has the ability to choose which patterns to combine into a new product but it’s all alchemy! It’s just mixing things together to get something new.

                            You can’t paint a baseball game without painting a baseball and a bat.
                            You can’t paint a skyscraper with no walls.
                            You can’t paint an ocean with no water.

                            Artists are doing the exact same pattern repetition the A.I. does.

    • Jennkryst

      Everyone is forgetting that this forum would not exist, save for an AI PB who went ‘beep boop, respect muh authoritah’ and then banned everyone who went ‘lol, no’.

                              … wait, does AI PB not mean Robotic Purring Barrister?

                              Mummy Pun? MUMMY PUN!
                              She/her

    • Pavel @RedRocket

                                @RedRocket said in AI PBs:

                                That’s why, legally speaking, it isn’t copying.

                                That very much remains to be seen.

                                He/Him. Opinions and views are solely my own unless specifically stated otherwise.
                                BE AN ADULT

    • Faraday @RedRocket

                                  @RedRocket said in AI PBs:

                                  That’s exactly what artists do when they make art. The human artist has the ability to choose which patterns to combine into a new product but it’s all alchemy! It’s just mixing things together to get something new.

      Just because the output is the same as a human’s doesn’t mean that the process is the same. A human, an abacus, a calculator, Google Sheets, and an LLM (sometimes) can all calculate 8+6, but only the human actually understands math and can merge that understanding with an understanding of the actual world.

      This reminds me of something Gary Marcus said in an article about AI hallucinations, where he explains how ChatGPT hallucinated a simple fact (birthplace) about a celebrity (Harry Shearer).

      Because LLMs statistically mimic the language people have used, they often fool people into thinking that they operate like people.

      But they don’t operate like people. They don’t, for example, ever fact-check (as humans sometimes, when well motivated, do). They mimic the kinds of things people say in various contexts. And that’s essentially all they do.

                                  You can think of the whole output of an LLM as a little bit like Mad Libs.

                                  [Human H] is a [Nationality N] [Profession P] known for [Y].

      By sheer dint of crunching unthinkably large amounts of data about words co-occurring together in vast corpora of text, sometimes that works out. Shearer and Spinal Tap co-occur in enough text that the system gets that right. But that sort of statistical approximation lacks reliability. It is often right, but also routinely wrong. For example, some of the groups of people that Shearer belongs to, such as entertainers, actors, comedians, musicians and so forth, include many people from Britain, and so words for entertainers and the like co-occur often with words like British. To a next-token predictor, a phrase like “Harry Shearer” lives in a particular part of a multidimensional space. Words in that space are often followed by words like “British actor”. So out comes a hallucination.

                                  That is just not how human brains operate.
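
      To make the “next-token” point concrete, here is a toy Python bigram predictor (an illustration only; real LLMs use neural networks trained on enormous corpora and attend to far more context, but, like the toy, they never look a fact up):

          import random
          from collections import Counter, defaultdict

          # Toy bigram "language model": count which word follows which in a tiny corpus,
          # then generate text by repeatedly picking a statistically likely next word.
          corpus = (
              "harry shearer is an actor known for spinal tap . "
              "rowan atkinson is a british actor known for mr bean . "
              "stephen fry is a british actor known for qi ."
          ).split()

          follows = defaultdict(Counter)
          for prev, nxt in zip(corpus, corpus[1:]):
              follows[prev][nxt] += 1

          def generate(start, length=9):
              words = [start]
              for _ in range(length):
                  options = follows[words[-1]]
                  if not options:
                      break
                  nxt, = random.choices(list(options), weights=list(options.values()))
                  words.append(nxt)
              return " ".join(words)

          print(generate("harry"))
          # e.g. "harry shearer is a british actor known for mr bean"
          # Statistically plausible, factually wrong: a miniature hallucination.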

    • RedRocket @Faraday

                                    @Faraday
      A deeper understanding of the context and meaning of the image created is not a factor of its legality. In court it doesn’t matter that the A.I. doesn’t understand what the words really mean; it only matters if the output can be proven to have used other people’s intellectual property in such a way that it is wholly unoriginal.

      They can’t do that with A.I., because when you put in a prompt the A.I. doesn’t pull up an image and manipulate it; it creates an entirely new image based on pattern predictions.

                                    This is a good video to understand how A.I. actually functions.

                                    https://m.youtube.com/watch?v=1aM1KYvl4Dw&t=2019s&pp=ygUcaG93IGltYWdlIGdlbmVyYXRvciBhaSB3b3Jrc9IHCQnHCQGHKiGM7w%3D%3D

    • Faraday @RedRocket

                                      @RedRocket said in AI PBs:

                                      A deeper understanding of the context and meaning of the image created is not a factor of its legality.

                                      I agree. My comments about understanding and context were in response to your assertion that AI learns like a human does.

                                      it only matters if the output can be proven to have used other people’s intellectual property in such a way that it is wholly unoriginal.

      Originality is not the only thing that matters from a legal perspective. I might take the characters/setting/etc. from Lord of the Rings and use them in an entirely original manner, and it could still be copyright infringement. I might also use a screencap/clip/etc. verbatim in commentary/review/etc., and it could be entirely fair use. The transformative nature of a derivative work is merely one factor in a complicated test for fair use. Being transformative alone is not a “get out of jail free card” for using someone else’s copyrighted works.

      The legal battle over whether AI is fair use is ongoing and messy, and it will take years to sort out. We are certainly not going to settle it here amongst a bunch of internet gamers, most of whom are not lawyers, but we are nonetheless entitled to our opinions.

      I personally think that GenAI is going to have an uphill battle to claim that the copyrighted works are not baked into its product somehow when it’s capable of generating something like this (source: Hollywood Reporter):

      [Image: near-identical images of Yoda generated by Midjourney alongside original Star Wars stills]

    • RedRocket @Faraday

                                        @Faraday

      Yes, if Midjourney were using its A.I. to make fake Yoda merch and sell it, then that would be copyright infringement, but that isn’t what is happening. The I.P. holders are trying to sue the A.I. companies for things people /might/ do using their product. That will never hold up in a court of law, because you can’t be convicted of possible crimes, only crimes you actually commit.

                                        Just ask the anti-gun people. They have been trying to sue gun manufacturers for decades because the streets of America are flooded with guns and no company has ever been held accountable for the mass shootings that their weapons were used in.

                                        Yes, A.I. can be used to steal your I.P. and so can the eyes of a human artist.

      If you don’t want people stealing your fursona, don’t post it in public. That is the only perfect protection against copyright violations.

      The people suing these A.I. companies are making an unreasonable ask. No one can design an A.I. that checks everything it makes against every copyright held by everyone, ever, just to make sure its users aren’t planning on using the results to sell fake merch.

      If you want to fight copyright fraud, sue the people doing the fraud, not the one who drew the picture.

    • Faraday @RedRocket

                                          @RedRocket said in AI PBs:

      If you want to fight copyright fraud, sue the people doing the fraud, not the one who drew the picture.

                                          I think you have a fundamental misunderstanding of copyright law. The person who draws the picture of Yoda is violating the copyright (unless it falls under the very narrow definition of fair use). It doesn’t matter whether they sell it or not.

                                          Now, in practice Disney isn’t going to come after every random fan that draws Yoda stuff. That’s impractical, a waste of their time and resources, and a PR nightmare. So they choose to focus on the people making money from it (who then have money for them to take).

                                          I (and more importantly, many actual lawyers who specialize in this stuff) allege that by drawing pictures of Yoda and pictures derived from the pictures of Yoda (based on its training data), Midjourney is violating Disney’s copyright. AND they’re making money from it.

    • RedRocket @Faraday

                                            @Faraday

      Okay, let me rephrase so I am clearer. It is the person writing the prompt who is committing the fraud, not the computer making the image.

      It’s like suing a pencil manufacturer because someone used a pencil to draw Mickey Mouse. The people making the pencil didn’t commit the crime. In the same way, the people making the A.I. shouldn’t be held responsible if someone else uses it for fraud, yet that is exactly what these litigious companies are after.

      They want to establish a precedent so that A.I. companies can be held liable for damages if their code is used to commit fraud.

                                            Courts would never even consider this for any other industry.
                                            If you went to court and sued Fruit of the Loom for making cotton T-shirts that criminals bought and used to make fake merch, you would be laughed out of court. A.I. companies are no different.

      They can’t be expected to regulate everything the users of their products might ever do wrong. It’s an impossible ask, or in legal terms, an “unfair burden”.

      Any half-decent lawyer should be able to win this case without breaking a sweat.
