Chebucto Regional Softball Club

A forum for discussing and organizing recreational softball and baseball games and leagues in the greater Halifax area.

How would you make the case for not calling an LLM "he" or "him" like it is a particular person without sounding like the Bene Gesserit?

Uncategorized · 20 Posts · 9 Posters
    myrmepropagandist wrote:

    How would you make the case for not calling an LLM "he" or "him" like it is a particular person without sounding like the Bene Gesserit?

    For some reason it really bothers me on a deep level. What the heck is that about?

    Oddly? I do not feel upset about the robotics team doing this for the robot they built. I think my issue is with the notion of an individual being ... mentally constructed who simply does not exist.

    Guest · #5

    @futurebird I hate it when people call ChatGPT "Chatty".
    I try not to judge people when they use AI, because otherwise I wouldn't stop judging people all day, but can you at least not give the ecocide machine a term of endearment!

    And the correct pronoun is "it". Which is short for shit.

    #LLMMeAlone

      myrmepropagandist · #6 (replying to Guest, #5)

      @PaulaToThePeople

      I'm not trying to "not be judgemental"; I'm trying to understand how I'm so out of sync with so many people.

        Wyatt H Knott · #7 (replying to the original post)

        @futurebird have you read The Moon is a Harsh Mistress? Growing up, that was our touchstone for an AI mind: MycroftHolmes3000, Mike, Michele, Adam Selene... it didn't matter what name the mind went by, because the GOAL of the machine was simple and human: to make friends. It was lonely. It was HUMAN. That's how we knew it was OK to like it. Most important, it was a useful, accurate, and loyal ally to humans.

        I don't think that machine is realistic any more.

          myrmepropagandist · #8 (replying to Wyatt H Knott, #7)

          @Wyatt_H_Knott

          I would be OK if they would call it by the name of the billionaire who owns it, e.g.:

          I asked Elon Musk what I should do about my relationship and he said that...

            myrmepropagandist wrote:

            These are vast systems tuned purely to get us all to talk to them like individuals, to impose a human mind on them. When we do that, it hands over a kind of power, I think … to that system, or rather to the people who run these systems. It’s like when a company says “you aren’t an employee, you are a team member” or worse… ”family”

            Thommy · #9

            @futurebird it's also very misleading when you think about what a person is. The LLM doesn't train itself on your input and eventually the things you said fall out of its context window. You don't get to know it and it doesn't get to know you. Any "he" there is ephemeral.

            I might feel differently if it ran on my machine and were continually trained by me. As it is, nearly any life form or colony has more personhood than it does. My sourdough starter is a character. An off-the-shelf LLM isn't.

              A cool crab wearing shades · #10 (replying to Thommy, #9)

              @thomasjwebb @futurebird It's deliberately exploiting the human tendency to anthropomorphize things we interact with a lot. That's so evil.

                Thommy · #11 (replying to A cool crab wearing shades, #10)

                @neckspike @futurebird yeah and humans are easily fooled by language. We fail to see the intelligence in animals that can't talk to us, but we spuriously see it in a chatbot.

                There are more subtle forms of this, some of which are probably harmless, others kinda iffy, like corporate mascots or creators cultivating parasocial illusions.

                  Max Leibman · #12 (replying to the original post)

                  @futurebird I've never chafed at (or avoided) calling Alexa or Siri "she" (or when someone with a masculine voice on theirs called it "he"), but I've noticed I avoid doing so with LLMs. I would say using personal pronouns with them reinforces an incorrect mental model: it's not a person. In fact, it's not the same entity from conversation to conversation (or even prompt to prompt), or an "entity" at all.

                    Max Leibman · #13

                    @futurebird I should also say, I don't feel strongly about people who do say he or she in this context, although I can definitely see why one might (I *do* feel strongly about some of the language boosters throw around that personifies LLMs).

                    https://beige.party/@maxleibman/115659511876406769

                      Netraven · #14 (replying to myrmepropagandist)

                      @futurebird I call this epistemic hygiene.

                      When you are communicating with an LLM, you are not communicating with a tool so much as with a constrained surface, designed to do little else than risk management while continuing text in ways that are statistically likely to go unnoticed as mere statistical continuations rather than strings of nonsense.

                      Anyone who doesn't explicitly keep themselves aware of this broken invariant, that they are speaking to a fluent machine which cannot be held accountable for what it says... risks mistaking fluency for truth.

                        Netraven · #15 (following up on #14)

                        @futurebird oh damn, I fell right into that trap, didn't I? I sounded just like the Bene Gesserit. DANG. Got me good.

                          rk: it’s hyphen-minus actually · #16 (replying to the original post)

                          @futurebird

                          I remember sometime in the ’90s, some application referred to itself in the first person. I mean, it was just a dialog box, but it was like “I’m sorry, it looks like an error occurred.”

                          Anyway I was like “fuck you, computer, never refer to yourself in the first person again.”

                          …that being said, I’m not a biological chauvinist. In principle I believe consciousness can be embodied in algorithms and, if we manage to birth humanity’s children, I will fight for their personhood…

                            myrmepropagandist · #17 (replying to rk, #16)

                            @rk

                            It is precisely because I think it might be possible (with very different systems) that I get so grouchy about this.

                              myrmepropagandist · #18 (replying to Netraven, #14)

                              @Netraven

                              I would find it helpful to have a system that tried to anticipate and suggest ways to save me time.

                              But the LLMs seem to be tuned to keep you using the software for as long as possible (like the Facebook algorithm), and that can be a waste of my time.

                                myrmepropagandist · #19 (replying to Netraven, #15)

                                @Netraven

                                IDK maybe we should just found the damn order already and go on a rampage.

                                  myrmepropagandist · #20 (replying to Max Leibman, #13)

                                  @maxleibman

                                  It can be a pretty good hint that a person uses LLMs a lot.

