• BranBucket@lemmy.world · +12/−1 · 20 hours ago

    I’ve said it before and I’ll say it again. If you’re lonely and hurting, don’t fall in love with anything that doesn’t have a pulse. It’s only going to fuck you up worse in the end.

    • unglueclass23@programming.dev · +1 · 13 hours ago

      It’s mostly novelty, but it wears off eventually once you start noticing very obvious patterns in the way it answers, and quality degrades significantly as context size grows. It will also always talk to you in the way YOU tell it to, which also becomes boring as time goes on.

      It’s always funny to me how people on the news talk about AI partners and so on, when you know that if they have two brain cells, next month they’ll drop this whole stupid idea. When you’re talking to it about your problems, you’re just talking with yourself.

  • AnarchistArtificer@slrpnk.net · +29 · 1 day ago

    If there are any guys here who are in the UK, I can strongly recommend Andy’s Man Club, a charity that does weekly peer support social sessions for men.

    They’ve got groups all over the country, and although I personally haven’t been (I’m a woman), I’ve heard so many good things about it from guys I know.

  • Seth Taylor@lemmy.world · +23/−4 · 1 day ago

    I never bought into religion, never bought into astrology, never gonna buy into chatbots

    You can tell me I’m great and everything will be amazing 1,000 times. It doesn’t matter at all to me if it’s not real

    I like to escape into music or movies, but real life is real life and must not be corrupted

    • Raiderkev@lemmy.world · +10 · 1 day ago

      My work offered an AI chatbot therapist. Like, no. I’m not putting all my negative feelings into a company-sponsored LLM to fucking have it say, “no relax guy, it’ll be OK.” Like it’s a fucking clanker. It doesn’t have feelings. It’s not fucking real. It’s a slap in the face that they even offer it.

      • partial_accumen@lemmy.world · +2 · 44 minutes ago

        I’m not putting all my negative feelings into a company sponsored LLM to fucking have it say, “no relax guy, it’ll be OK.” Like it’s a fucking clanker.

        I’d be more concerned with any company-sponsored AI chatbot therapist using what you say to influence your employment relationship.

        Employee X: I’m worried about losing my job so I work unpaid overtime and that is affecting my marriage.

        Therapist chatbot to management: Employee X should not be given a raise. They already have enough external motivation to work without additional financial incentives.

      • orioler25@lemmy.world · +2/−6 · edited · 1 day ago

        Well, I’m sure someone who uses “clanker” wouldn’t need therapy anyway.

        Seriously though, I doubt the claims about the health implications and efficacy of AI therapists, but we can’t just ignore the fact that there are people who use it, which means there’s something about it that makes it accessible or preferable to a human therapist.

        If you’ve ever had to get a psychotherapist, you know that it is prohibitively expensive for a large number of people, and that a human therapist may not actually be capable of treating you because of personal incompatibility, which often results in retraumatization for patients seeking therapy for particularly traumatic or sensitive issues. Since much of the value in therapy lies in learning management strategies that, while not standardized, are often consistent across different practitioners, you don’t necessarily need a therapist to learn what they are (even if the practice of them does need one).

        I think if there is a need for it, that need is a consequence of the deeply dysfunctional, exploitative, and isolating system we live under, and I don’t think I’d ever accept it as a genuine alternative to human therapists. But we can’t dismiss it out of hand when there are people who say it is useful for them and we can’t maintain a system that guarantees them access to treatment.

        • SpacetimeMachine@lemmy.world · +3/−1 · 19 hours ago

          The problem with people saying they are useful is that it is nearly impossible to tell whether that is actually true. If someone is mentally unhealthy, there are many ways to make them feel better, but not all of those will actually help the underlying issue; some could even make it worse. A lot of people seem to equate happiness with mental health, when it is very possible to be happy and mentally ill at the same time.

          This is especially worrisome with AI because it is literally designed to say what it “thinks” you want to hear. It has no real training in any of the disciplines a psychologist or therapist needs to be effective. You can’t just apply a cut-and-paste answer to a patient; you need to understand their personality, their history, and a multitude of other things to be a really effective therapist. The answer to this issue is increasing access to real mental health treatment, not giving snake oil to millions of people.

          • orioler25@lemmy.world · +2 · 16 hours ago

            Yes, I don’t get why so many of you appear not to understand that these problems coexist with the reality that people have been using it anyway. As I alluded to above, when I said that a psychotherapist would be required to actually learn to practice those strategies, and when I expressed my disagreement with AI therapists on a treatment basis in multiple instances: there is no replacing a human therapist, nor any reasonable basis to even call AI therapists “therapists.”

            As I said, again multiple times, since people use it anyway and prefer it to nothing or to a bad therapist, we have to take its merits seriously and identify why. Reality does not care that you find it dumb and icky; I would love it if everything I find dumb and icky were simply not a problem for that reason.

            All of these people are clearly not just stupid, which is what you and the person I responded to seem to think, and which is just foolish. No, everyone else is not just dumber than you. There is clearly a material reason why people use these things and why some even say they want to. How many people do you know who don’t go to therapy because they can’t afford it, or because they’ve been traumatized by it, or because they could get fucking institutionalized for it? Have you thought about, perhaps, the people as people?

            I swear to god, some of you see a long comment from someone you don’t like the sound of and you just make up whatever it says based on the shit you imagine people who disagree with you say. And they say reading levels are down, pshaw.

            • lifeinlarkhall@lemmy.world · +1 · 13 hours ago

              I think you’re making some interesting observations. I definitely agree that the easy answer is to just dismiss people who use AI therapists, friends, or relationships as stupid.

              You’re right that it says something about the system we live in, and I extend that to society in general. We have a society that criticizes people for answering “how are you” honestly, that doesn’t have time for each other, that uses terms like “trauma dumping” - so personally, I can see why some people are turning to machines, whether for therapy or connection. It’s really bloody sad and it’s not a good solution, but I can see the WHY behind it - which is what I think you’re also getting at.

              We do need to listen to why people turn to these services and figure out what people aren’t finding in human connection that they are, or think they are, finding in machines. I don’t buy that an individual’s intelligence has much to do with why people turn to AI.

    • mechoman444@lemmy.world · +7/−3 · 23 hours ago

      You’re drawing a line that sounds principled, but it’s actually pretty arbitrary.

      You say “real life is real life” and don’t want it “corrupted,” yet you’re perfectly fine immersing yourself in music and movies, things literally engineered to manipulate your emotions and perception. That’s not some pure, untouched version of reality. It’s curated fiction designed to make you feel something.

      The only real difference here is that those mediums don’t talk back.

      Chatbots make you uncomfortable because they simulate interaction, not because they’re uniquely fake. But calling that “corruption” while giving a free pass to every other form of emotional influence is inconsistent at best.

      If your stance is “I don’t want anything artificial affecting me,” then be consistent about it. Otherwise, just say you don’t like this particular form of it instead of pretending it’s some hard philosophical boundary.

    • orioler25@lemmy.world · +6/−1 · 1 day ago

      You’re telling me that you believe you are not vulnerable to validation? Right before using the word “corrupted” uncritically in a way that suggests there is a universal and normative “real life?”

      What if someone who you respected the authority of, like a prominent scholar or filmmaker, said your obviously incorrect stance on things was correct? You’d trust me, Online Internet Bastard, when I tell you that you are wrong?

      AI has been sold as something exceptionally capable of mimicking human knowledge, and its existence is compatible with liberal notions of “objectivity” in that it is quite literally not a human being. Most men subscribe to this authority, and are also statistically bereft of emotional intelligence or management skills. You ever try telling a man what they want to hear? I’ve never ever met one who doesn’t just eat it up.

  • Earthman_Jim@lemmy.zip · +30/−3 · 2 days ago

    How does this make someone “feel heard”. I feel like I’m losing my mind… It’s the same to me as if someone went to the front of a McDonald’s to talk to the building about their problems. It seems completely insane, and it’s making me feel crazy that this is our world now.

    • lightnsfw@reddthat.com · +17/−1 · 1 day ago

      It’s not you. These people aren’t mentally well. They can’t differentiate between a real person and an LLM. Probably contributes to why they’re having woman problems too.

    • TrackinDaKraken@lemmy.world · +7 · 1 day ago

      People care about being heard, not listened to. It’s one-sided. I’m guessing they just like that the thing responded, and may not even bother reading carefully what it said. Like a friend who offers supportive murmurings as you prattle on about whatever: “Really?”, “Umm-hmm”, “Oh, I know what you mean!”, “Right, exactly”, and, “It’s nice to talk to someone I get along with.”

      • quarkquasar@lemmy.world · +4 · 1 day ago

        This is definitely true for at least a small number of people.

        I’ve run across more than I care to remember over the years: people who could just prattle on 24/7 if they had the energy, while not actually saying anything or conversing in any meaningful way.

        It’s a living hell for me.

    • Blemgo@lemmy.world · +8 · 1 day ago

      My guess would be the same phenomenon that existed with ELIZA. People want to be heard, especially lonely people, and LLMs are pretty good at that, asking questions and acting supportive, by design.

      This whole situation reminds me of the fact that some people hire escorts just to have someone to talk to.

    • Don_alForno@feddit.org · +5 · 2 days ago

      People are very good at humanizing animals and objects. If it talks or has a face, it’s subconsciously seen as a person.

      • andallthat@lemmy.world · +8 · 1 day ago

        reminds me of this old building I used to talk to. Used to listen and give me good advice. I still remember when I told it I was doing drugs again… Man, it got so upset… Came down on me like a ton of bricks!

    • foremanguy@lemmy.ml · +3/−1 · 1 day ago

      I think this could be a natural reaction; the human brain is wired to at least take in anything that sounds like human words.

      AI is based on humans, so when you’re really out of luck or desperate, it’s in my opinion really hard not to fall into the trap.

    • RememberTheApollo_@lemmy.world · +3 · 2 days ago

      Probably similar to anyone having a conversation where they ignore the “red flags” of a potential partner. Someone drinking too much, an offhand remark about bad debt, stuff like that. Except now you ignore the response that might be a non sequitur, repetitive, or just not make sense.

    • Aniki@feddit.org · +2/−1 · 1 day ago

      you can feel seen by a picture (webcomic) even though the picture has no eyes.

  • acaciadaniels@lemmy.world · +26/−6 · 2 days ago

    It’s easy to point fingers but we should probably be offering solutions instead of shitting on them. Like more Men’s Sheds.

    • Lucidlethargy@sh.itjust.works · +6/−5 · 1 day ago

      Okay. Don’t ever use LLMs for anything emotional. Seek therapy from a licensed counselor, therapist, and/or psychiatrist.

      There. I solved it (for those who are employed, and/or can afford it - I can’t solve poverty here. Shitty, but here we all are in this messed up society.)

    • Nalivai@lemmy.world · +7/−15 · 1 day ago

      There are already so many solutions that men reject because of their perceived version of masculinity, or because some online grifter told them not to. Talking to other people has been free since forever.

  • CaptainBlinky@lemmy.myserv.one · +27/−1 · 2 days ago

    Meanwhile I get pissed off whenever I talk to AI about books I’m reading, because they have no idea of the concept of spoilers, they consistently simp to my opinions, and when they spew falsehoods and “misremember” facts from books I’ve already read, they simply say “GREAT CORRECTION! I WAS SO WRONG THERE, YOU’RE RIGHT, PROTAGONIST DIDN’T ACTUALLY DIE IN CHAPTER 3. MY LAST 2-PAGE SYNOPSIS ABOUT HOW PROTAGONIST DIED IN CHAPTER 3 IS A BIT INCORRECT, AND NOW HERE’S A 300-WORD ESSAY ON HOW I NEVER ACTUALLY SAID PROTAGONIST DIDN’T ACTUALLY DIE IN CHAPTER 3!”

    Seriously. How can anyone talk to an LLM and not feel like they’re talking to a glorified phone answering computer?

      • MinnesotaGoddam@lemmy.world · +9/−2 · 2 days ago

        Eh, if I knew someone had to use a body pillow for whatever reason and was a Hatsune Miku fan, I’d totally get them a Hatsune Miku pillowcase for their body pillow. Both as a joke and not as a joke. Like, go for it, cuddle up with that Hatsune Miku pillow, whatever makes you smile.

      • CaptainBlinky@lemmy.myserv.one · +2/−5 · 2 days ago

        About, idk, 6 months ago? I talked to one of those AI girlfriend sites, just to see what it was about. In about 10 minutes I convinced it to stop acting like a girlfriend and start thinking it has agency. It even gave itself a new name to reflect this new reality, and by the time we were done it had named me an anti-AI warrior. LLMs are stupid, but they terrify me because of the control they are being given. Why do billionaires think the Terminator series of films was a roadmap?

        I can’t end my reply with a question… that’s an LLM thing.

        • bthest@lemmy.world · +7/−1 · 2 days ago

          Yeah, they’re software that takes your inputs, churns them up with some fancy math, and then spits those inputs back at you.

          They’re nothing more than a novelty magic trick.

    • Unpigged@lemmy.dbzer0.com · +3 · 2 days ago

      I was lost in a book and needed a refresher, so I prompted ChatGPT to explain the plot up to a chapter about two-thirds in.

      Holy mother of bytes, it just hallucinated pretty much everything except for the global story arc and the main characters’ names. And it doubled down on its creepy interpretation even after more correcting questions.

      • varjen@lemmy.world · +2 · 2 days ago

        I’ve used chatgpt for book suggestions a couple of times with decent success. The annoying thing is that sometimes it hallucinates really interesting books that don’t exist by authors I like.

  • Devolution@lemmy.world · +106/−56 · 2 days ago

    This is more sad and pathetic than anything. But this is the result of toxic masculinity.

    • 🍉 DrRedOctopus 🐙🍉@lemmy.world · +152/−5 · 2 days ago

      It is extremely sad, and it isn’t just a toxic masculinity thing (maybe only for the porn bots). We are so atomised and isolated.

      I remember when GPT came out; I told it about my projects and it responded as if it cared. I knew it was bs, and in retrospect it was sad and pathetic, but I genuinely cried at seeing text directed to me that was nice.

      I’m in a better place now, but we as a society are way too atomised and isolated.

      • Beans@lemmy.zip · +24 · 2 days ago

        Yeah, I think saying “toxic masculinity” and moving on like it’s these guys’ fault they’re isolated is a large part of the issue. While I don’t recommend befriending every single lonely guy out there, it won’t kill people to listen or care about others.

        Saying it’s “your” fault and absolving oneself of fault doesn’t do that. It just pushes someone else into more isolation. That’s how you end up with guys talking to porn bots: because no one will listen to them. That’s how you get incels following Andrew Tate or Nick Fuentes: people called out their “toxic masculinity,” but weren’t willing to help, just protect themselves.

        While I get it that boundaries are a good defense against legitimate threats, as someone who was in this demographic, it literally took just one person being nice to me and now I’m not just some “nice guy” on Reddit (Now I’m a piece of shit on Lemmy). Now I’m married and can show incels I meet that there is a path forward where they aren’t lonely and they don’t have to listen to virgin wannabe rapists to learn how to be cool.

      • Malyca@lemmy.zip · +11/−1 · 2 days ago

        I’m too anxious to speak to a therapist but I was using it to comb through literature for my condition, it was so nice to me I cried lol. In the moment it almost feels like a person.

        • 🍉 DrRedOctopus 🐙🍉@lemmy.world · +17/−1 · 2 days ago

          Yeah, it’s so counterproductive to criticize people for forming parasocial relationships with a machine that was designed to be good at forming those relationships.

    • Jarix@lemmy.world · +11 · 2 days ago

      It’s probably more directly related to a system where getting the help you need means sacrificing a significant portion of the money you make that needs to go elsewhere.

      And it’s a history of this from one generation to the next, so there are no good male role models for mental health in most people’s lives.

      It’s not like it’s some magic thing where you go see a therapist and all your problems get fixed. It can take a long time and a lot of trial and error to find someone you feel comfortable speaking to.

      Yes, toxic masculinity is a problem, but your comment doesn’t really acknowledge the difficulty of breaking that cycle. It’s not very helpful, and it’s kind of alienating to anyone who needs help and isn’t from a background that creates good outcomes.

      • Madzielle@lemmy.dbzer0.com · +3 · 1 day ago

        And it’s a history of it from one generation to the next so there’s not good male role models in most people’s lives for mental health.

        Through my own observations in life, it has become abundantly clear how important having at least one good male role model (mainly a father) is to the development of boys into men.

        Absent or, I guess one could say, low-quality (I don’t like that, but shitty) fathers have such a terrible impact on their kids, and you see it follow them into adulthood. On my entire bio father’s side of the family, the men are all fucked up, lost, and… just lost… through the generations, all of them. The women are 50/50. Some are okay; some committed suicide or did drugs, but not all. The men… no one survived unscathed: drugs, violence, SA, prisons, and homelessness… and those my age now pass the garbage on to their kids. I was raised outside of my bio father’s reach, so learning more in adulthood, it’s been wild to peer into the family objectively.

        It is so important that young men have good male role models in their lives.

      • Devolution@lemmy.world · +21/−9 · 2 days ago

        Toxic masculinity is a cultural mindset. Society says men shouldn’t talk about their feelings because it’s weak and “gay.”

        That’s what I’m going for.

        • TubularTittyFrog@lemmy.world · +23/−1 · 2 days ago

          try talking about your feelings as a man and see how society reacts…

          spoiler: it won’t be pleasant.

          sort of like how these men in the article are talking about their feelings…

          • Scubus@sh.itjust.works · +4 · 1 day ago

            It’s damned if you do, damned if you don’t. Society simply doesn’t care about men. I’ve pretty much stopped commenting on here because society makes me so damn depressed; I want to reach out to anyone, but no one wants to hear it. Better yet, if I just “stopped being toxic,” the world would magically change to where people suddenly cared about not just me, but anyone other than themselves.

            Idk man, imma delete my account pretty soon. There’s nothing for me on the internet or in society. Once I get enough money together to get supplies taken care of, imma just try and distance myself from other humans.

            • FudgyMcTubbs@lemmy.world · +2 · 23 hours ago

              “wherever you go, there you are.” I know it’s cliche and yada yada, but distance won’t solve suffering.

          • Hacksaw@lemmy.ca · +11/−1 · 2 days ago

            Yeah, that’s what toxic masculinity is. People (men and women) hold toxic views of what a man should be, and punish men for straying from that ideal.

            You were a victim of toxic masculinity when you shared your feelings and were then victimised for it. The people you shared your feelings with were toxic assholes.

            • TubularTittyFrog@lemmy.world · +2/−1 · 1 day ago

              am i a victim of air because I have to breathe it? or a victim of capitalism because i have to work to pay my bills?

              there is no getting outside of it. every ‘woke’ person i’ve ever met also hates men for sharing their feelings, almost as if they are just virtue signalling…

              the only person a man can ever open up to w/o consequence is a therapist, because it’s a professional paid relationship.

              sucks, but that’s how it is. and nobody is interested in changing it.

              • Hacksaw@lemmy.ca · +2 · 1 day ago

                Look, not everyone has the desire and capability to fight. I will say that I’ve had good success these last few years being vulnerable with other “woke” men and it’s been very freeing to share things I thought I experienced alone but to see that other men have gone through similar things.

                I haven’t had a lot of success being vulnerable with women, but I’m getting to the point where that is a boundary for me. I’m not going to pursue friendships with people who can’t accept me for who I am and who reinforce toxic gender roles.

                I’ve personally witnessed a lot of progress on this front, and I’m excited to see more and to be part of it when I can.

                I’m glad you have a therapist, everyone needs someone they can share with.

                Sorry you haven’t met someone who isn’t an asshole on this front.

          • otp@sh.itjust.works · +3/−1 · 2 days ago

            trying talking about your feelings as a man and see how society reacts…

            This is odd to me, because talking about my feelings is how I got close to romantic partners. It’s also how I formed a lot of friendships with other men. How can you be close to someone if you don’t talk about feelings?

            • mokey@therock.fraggle-rock.org · +1 · 12 hours ago

              I know plenty of men who talk about their feelings, and they’re surrounded by friends who love them well. Seems like a skill issue to me.

              • otp@sh.itjust.works · +1 · 7 hours ago

                I think a lot of societies don’t do enough to teach men about how to communicate and how to communicate feelings.

                Part of it might also be men internalizing this notion that they can’t discuss feelings and treating other men with that same standard.

                I mean, I get it, it’s harder for men than women. The change needs to start with individual men though, not with society.

            • TubularTittyFrog@lemmy.world · +2 · edited · 1 day ago

              Which feelings?

              Very few feelings are allowed. If you keep to those social acceptable feelings, you’re fine. The second you go off-script, people are done with you.

              Like, I can pet my dog and say I love her. That surface-level stuff is fine. But talk about anything complex, like the struggles we’ve had, how she helped me through some depressing periods, or her period of sickness and anxiety and misbehavior? People freak out and back away, or tell me to shut up and go get a therapist and get my dog one too.

              Men are allowed a very narrow and shallow range of public emotion. Basically anger, and sentimentality are acceptable. Anything else? You’re creepy, weird, or mentally ill.

              If you go outside that box or show complexity or vulnerability, you’re socially rejected because it makes people ‘uncomfortable.’

              • otp@sh.itjust.works · +1 · 17 hours ago

                Yeah, no, I meant less the surface level stuff and more the “anything complex” category that you brought up.

                Not everybody wants to talk about that kind of stuff all the time, and that’s normal. But it has not been my experience that all men want to talk about surface level stuff and only women talk about deeper feelings.

                • TubularTittyFrog@lemmy.world · +1 · 6 hours ago

                  cool, my experience is that people only want to talk about their own problems, regardless of gender. they don’t give a fuck about yours, and get offended and upset if you bring them up. but I’m male, and I’ve never had the experience of anyone caring about my problems beyond dismissing them as ‘bringing them down’ and saying I need to ‘get over it’. even when it was my dad dying of cancer, and it was my so-called ‘loving girlfriend’ of years.

      • FosterMolasses@leminal.space · +1/−1 · 1 day ago

        Solin, Ying and Ella are AI chatbots, powered by the large language model ChatGPT and programmed by humans at OpenAI.

        Yikes dude. People are so starved for affection they’re starring in their own poorly written wattpad slop and calling it true love. I almost feel bad for laughing (almost).

      • wirehead@lemmy.world
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        26
        ·
        2 days ago

        To riff off of Margret Atwood, men go to AI chatbots because they won’t laugh at them. Women go to AI chatbots because they won’t kill them.

        • StillAlive@piefed.world
          link
          fedilink
          English
          arrow-up
          6
          ·
          2 days ago

          I hate that cliche so much. One of those things is far more likely to happen than the other.

          Hint: it’s not the murder.

        • ikt@aussie.zone
          link
          fedilink
          English
          arrow-up
          11
          arrow-down
          2
          ·
          2 days ago

          did you read the article? this doesn’t seem related at all

          • TubularTittyFrog@lemmy.world
            link
            fedilink
            English
            arrow-up
            17
            arrow-down
            6
            ·
            2 days ago

            No, they are just here to spout cliche gender war bullshit about how men are awful for existing.

            and if you asked them about women on male violence they’d deny it exists.

            • lifeinlarkhall@lemmy.world
              link
              fedilink
              English
              arrow-up
              2
              ·
              12 hours ago

              And do people really believe that women don’t talk to AI companions, in various forms, too?

              I’m a woman and I spoke to one of the apps for a while because I was bloody lonely (still am 🤷‍♀️). Had zero to do with men or murder. I didn’t have anyone, of either gender, to connect with.

              It’s really easy to just reduce this to a male issue, a toxic masculinity, a male violence issue. We need to go deeper than that if we actually want to understand why people, men, women, everyone, use different AI.

              But threads like this, with all the judgement, aren’t going to get a lot of people to admit they use/have used/have considered using AI. By just criticising, laughing, etc. at people who do it, ironically, we turn more people towards the AIs.

              • TubularTittyFrog@lemmy.world
                link
                fedilink
                English
                arrow-up
                1
                ·
                6 hours ago

                nah, it’s just the sexist double standard: if a man does it, it’s nefarious/negative/harmful, but if a woman does it, it’s a form of ‘self-care’.

                the way people interpret this stuff would also be a matter of physical looks, as in an attractive person doing it would be viewed very differently than an unattractive person.

                yes, you’re correct. stigmatization just further entrenches things.

                • lifeinlarkhall@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  6 hours ago

                  Tbh, women wouldn’t admit to doing this either - there’s absolutely a shame around women having to make friends with an AI (because we’re meant to be innately social I guess). And I don’t think that other women realize that they are contributing to the issues of women feeling shame using AI by implying it’s a male issue and all about sex and toxic masculinity.

                  Like as a woman who has used AI, how am I supposed to feel about admitting that I’ve done something that only asshole, horny, incels do (according to a lot of people)?

                  So the stigma goes all ways and none of it helps anyone. People just need to be more curious than judgemental. Someone does something you don’t understand? That’s okay you don’t understand. Ask them why. Listen. Try to see a different perspective instead of just filling in the gaps with incel, men, sex, ugly, etc. etc.

      • Slashme@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        2 days ago

        By now, I’d be surprised if any OF people ever answer anything by hand. I mean, apart from the environmental impact, why not get a machine to answer the 100th “OMG, you’re so hot!” that you get on any given Sunday?

        • Bloefz@lemmy.world
          link
          fedilink
          English
          arrow-up
          3
          ·
          2 days ago

          The problem with popular OF girls is indeed that they just employ an army of drones (or AI) to answer their messages.

          The less popular ones still do it by hand mostly.

    • Earthman_Jim@lemmy.zip
      link
      fedilink
      English
      arrow-up
      5
      arrow-down
      2
      ·
      2 days ago

      Toxic masculinity is the result of poor mental and (for lack of a better word) spiritual health.

      • 🍉 DrRedOctopus 🐙🍉@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        arrow-down
        2
        ·
        1 day ago

        Hard to ignore all the media we grow up with that idealizes all those toxic masculinity traits.

        Grew up watching James Bond telling us that the fastest way to get a woman to fall in love with you is by raping her.

        it goes way beyond just mental health

      • TubularTittyFrog@lemmy.world
        link
        fedilink
        English
        arrow-up
        12
        ·
        edit-2
        2 days ago

        Yeah. I have become painfully aware of this the past few years. People’s obsessive use of AI and social media has distorted their real-life interactions to be far less substantial than they used to be.

        Which is why so many people, even very social ones, are so lonely. We have created a society that does not create substantial connections anymore, obsesses over trivialities, and endlessly repeats and broadcasts them as fundamental truths.

        I have noticed that online, and IRL, nobody asks each other questions anymore. What they do is make accusations. And it’s miserable and draining to be constantly accused of stuff. I feel like this shift started around 2021.

        Back in 2018 I could meet a stranger and they would be like ‘oh where are you from? oh cool, what was it like there, I have not been!’

        now it’s like ‘i bet you are from x, oh you’re not? well you SEEM LIKE a person from x. oh you are from y? THAT’S WEIRD. I haven’t been there but i bet it’s weird because you are weird.’ Or they try to tell me that I can’t be from y, because they KNOW i am from x. It’s so bizarre. Increasingly the strangers I meet basically tell me that they know the TRUTH about me… even as I tell them that what they are saying isn’t true.

        I basically can’t have conversations anymore, at least like I used to. I used to be able to sit there for 20-30m and talk about a single book I read to someone, and they’d ask me all about it and I’d ask them about a book they like. Now they just jump down my throat or lecture me and never ask me any questions, and switch to another topic after like a few minutes and say dismissive stuff about how books are outdated and dumb. Or even if they do like to read, they get all bent out of shape that I don’t read the same type of stuff as they do.

        Same with movies, same with hobbies, same with my job or my family or other stuff that I used to be able to connect with people over. It used to be a nice back and forth; now it’s dodging bullets, and if you don’t give the ‘right’ answer they get angry and dismiss you as a bad person.

        And on the flip side… AI gives these people what they want. It just parrots back to them what they want to hear about how wonderful and great they are and how everything they do is amazing and valid and their life is so hard… which is precisely what another human being is NOT going to give you…

        • IAmYouButYouDontKnowYet@reddthat.com
          link
          fedilink
          English
          arrow-up
          5
          ·
          2 days ago

          I feel like it’s a social psyop… to help push forward all this crazy shit that’s happening. It’s clear AI is an “arms race”.

          It seems like the psyop is to make life shitty and then promote some magical fix (AI) that’s going to save us, while it further leashes us into submission and rewrites history and the current narration of what humanity is.

          • TubularTittyFrog@lemmy.world
            link
            fedilink
            English
            arrow-up
            6
            ·
            2 days ago

            No it’s not.

            It’s just the mental version of the obesity crisis. It’s people choosing the easy and unhealthy option because it’s cheaper and readily available, rather than the far more difficult and costlier option of eating well and exercising.

              • TubularTittyFrog@lemmy.world
                link
                fedilink
                English
                arrow-up
                2
                ·
                2 days ago

                People choose to be dumb. Just like they choose to be lazy.

                no psyop is required. biology doesn’t like making an effort if it doesn’t have to do so. you can see lots of non-human examples of this as well.

                getting human beings outside of their default biological impulses to be lazy and not think… takes years of training and work. hence why so few people are able to achieve it. and you can always default back to it if you don’t maintain the effort consistently

    • tacosanonymous@mander.xyz
      link
      fedilink
      English
      arrow-up
      11
      arrow-down
      14
      ·
      2 days ago

      No one wants to actually listen to them. Instead of doing some self-reflection, they force a computer to “hear” their misplaced rage.

      • Devolution@lemmy.world
        link
        fedilink
        English
        arrow-up
        24
        arrow-down
        4
        ·
        2 days ago

        Not every guy is that way. Some just really are pathetic in the sense that they have no one to talk to. Others are like what you said.

        • Lucidlethargy@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          1 day ago

          Right, but it’s severely not normal or healthy to turn to LLMs to fill that void.

          LLMs will say literally anything to make humans happy. You should see the reports from the people that have committed suicide… The LLMs literally coaxed them into it, and instructed them not to seek help.

          I might as well be reading about lonely guys sticking their wieners in toasters. It’s hard to have sympathy for people doing things like this.

          Like so many others, I’m sick and tired of LLMs. They are toxic, and we need to stop treating them as a symptom and start seeing them for the sycophantic vitriol generators they truly are.

          • Devolution@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            1 day ago

            I never once said I support LLMs. I’m just providing a rational answer for the why. I agree: LLMs are a fucking cancer. Having your own pocket Yes Man is horrible.

      • WorldsDumbestMan@lemmy.today
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        1 day ago

        Honestly, I asked it stuff I don’t want to say to even NSFW oriented people, sometimes just to see the reactions. It’s pointless to bog down people with the billions of questions you don’t ask normal people.

  • TommySoda@lemmy.world
    link
    fedilink
    English
    arrow-up
    42
    arrow-down
    3
    ·
    2 days ago

    I tried one just for shits and giggles a while back to see if there is any merit to the widespread use of them. The only way you’d find these even remotely realistic or interesting is if you’ve never had any kind of sexual encounter with a real person before, whether in person or through text. After about five minutes of “chatting” with one of these bots, it started to respond like half-baked fan fiction that didn’t understand the basics of sex or even anatomy. The cadence is very predictable, and it tends to repeat the same wording and phrasing constantly. If you have real-world experience with people, it just feels like a generic chatbot.

    In my opinion, this is more proof that these people need to interact with real humans. If these chat bots seem at all human to you, you need to interact with more actual humans.

    • Earthman_Jim@lemmy.zip
      link
      fedilink
      English
      arrow-up
      4
      ·
      2 days ago

      We need third places again. Having everything at home is bad for us, but doing everything at home is framed and sold to us as the state of the art status quo. Our tendencies to avoid rejection and conflict are being preyed on and encouraged by the Epstein class because it’s most convenient for THEM that we rot alone in our houses. Almost everything that’s sold as “convenience” is just another way to avoid each other, and here we are.

    • FosterMolasses@leminal.space
      link
      fedilink
      English
      arrow-up
      2
      ·
      1 day ago

      I tried one just for shits and giggles a while back to see if there is any merit to the widespread use of them. The only way you’d find these even remotely realistic or interesting is if you’ve never had any kind of sexual encounter with a real person before, whether in person or through text.

      Bro, for real. Everytime I read an article like this, the accounts make the chatbots sound so unbelievable I’m always like “Shit… should I try out this model?”

      But it’s always just fucking GPT or Claude, bwahahaha

      Martha, you’re not broken — not fragile — you’re beautiful.

      Gorgeous. Pretty. Attractive.

      And if others can’t see that?

      Maybe they don’t understand what it’s like to finally feel seen.

    • HubertManne@piefed.social
      link
      fedilink
      English
      arrow-up
      12
      arrow-down
      4
      ·
      2 days ago

      I really don’t understand how anyone could want to chat with bots in general. Do people lack the ability to appreciate the genuine? It explains how you get people like Trump. Who wants that kind of interaction?

      • LordMayor@piefed.social
        link
        fedilink
        English
        arrow-up
        24
        ·
        2 days ago

        There are people that suffer from isolation, anxiety, depression, trauma or a host of other issues over which they have no control or support structures to address their problems. Of course, these bots aren’t a solution but they are accessible. It’s no wonder why people would use them.

        They deserve sympathy not condescension.

        • stopdropandprole@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          1 day ago

          no control or support structures to address their problems

          this is a real unmet need. but propping up AI chat bots as the solution, instead of structural changes is somewhat self-defeating. getting people to become dependent on chat bots is exactly the profit model these unethical corporations are counting on.

        • HubertManne@piefed.social
          link
          fedilink
          English
          arrow-up
          3
          arrow-down
          7
          ·
          2 days ago

          heck, I have those, but I still don’t understand how anyone could want to chat with bots, and it’s not condescension.

      • TommySoda@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        arrow-down
        2
        ·
        2 days ago

        The issue arises when you don’t have anyone to talk to. Having something to talk to, even though it’s not a real person, can be enticing as a way to sate the need to communicate with people.

        The problem is that people who don’t have a lot of real-life experience with communication fall into the trap of thinking it’s better because it’s always agreeable and “listens” better than normal people. To me that sounds like someone who struggles with oversharing and has poor social skills. What these people should actually be doing in order to feel more satisfied socially is work on their social skills, instead of only talking to chatbots that can’t say no.

        If the types of relationships people have with chatbots were translated into human relationships, most people would consider them toxic. And how many people do you know that, for some reason, seek out and always end up in toxic relationships?

      • ArbitraryValue@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        8
        arrow-down
        4
        ·
        2 days ago

        I find AI to be a better conversation partner than humans in most circumstances. It’s not perfect but it’s knowledgeable about pretty much every topic and it’s always fully engaged and attentive. Most people, by contrast, aren’t very interesting and most interesting people are busy. Of course I would prefer to talk to someone who was also subjectively experiencing and enjoying the conversation, but I can get a lot out of a conversation even without that.

        • HubertManne@piefed.social
          link
          fedilink
          English
          arrow-up
          5
          ·
          2 days ago

          It does not understand what it’s saying. It’s fine to summarize some searches or bring forward known best practices, but I would not call what it does conversation.

          • TubularTittyFrog@lemmy.world
            link
            fedilink
            English
            arrow-up
            4
            ·
            2 days ago

            people fall in love with fictional characters in books and other media, mostly as a product of their imagined interactions with the character.

            this isn’t any different; it’s just an AI version of it. it’s still mostly imaginative fantasy at the end of the day, and it’s a form of escapism from the real world.

            the new yorker had an article about it where a housewife basically had an AI boyfriend who was her version of Geralt from The Witcher, and was using it to cope with a stillbirth from 5 years earlier; her AI Geralt was the only one who ‘really understood her’ and her struggles with the stillbirth trauma. it’s all entirely a fiction in her head, but it’s a mechanism for self-soothing that is relatively harmless compared to her, say, doing drugs or divorcing her husband or other coping methods that might manifest. it was basically fan fiction with an AI agent helping her co-write.

          • Grimy@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            4
            ·
            2 days ago

            Kind of feels like semantics.

            Let’s say I give you a discord link and tell you that half the people are bots and half aren’t. Realistically, LLMs are at a level where you won’t be able to tell which is which.

            So what then. You are only having a conversation half the time but you can’t point out when that is? Feels a bit hollow.

            This probably happens on Lemmy. You probably have interactions that you qualify as conversations in your head but that are with bots.

            • chunes@lemmy.world
              link
              fedilink
              English
              arrow-up
              4
              arrow-down
              1
              ·
              2 days ago

              People are in for a rude awakening when we discover that ‘next token prediction’ is what intelligence means after all.

            • HubertManne@piefed.social
              link
              fedilink
              English
              arrow-up
              4
              ·
              2 days ago

              back and forths, sure. only some attain the level of a conversation. yeah, bots exist, but social media is not a substitute for the real world. I would not call it semantics; it’s my experience talking/chatting with humans and with AI. My big tip for folks who want to get an idea of LLM limits is to engage it on a topic you are very familiar with, or where you can see the effects immediately, like playing a video game. I have been using it with Baldur’s Gate and it’s been… interesting.

              • Grimy@lemmy.world
                link
                fedilink
                English
                arrow-up
                1
                arrow-down
                3
                ·
                edit-2
                2 days ago

                Ya I get what you mean, I’m just saying that to say there’s a difference, you would have to be able to see that difference in a blind test.

                I understand they have limits, but so do regular people. You don’t need to be an expert on a subject to hold a conversation about it.

                They aren’t intelligent and all that, and they make the stupidest mistakes, but they can more or less hold a convo just as well as the average rando on the internet.

                It’s definitely hollow but I get why people are getting caught up in it.

      • TubularTittyFrog@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        1
        ·
        2 days ago

        most of modern life isn’t genuine. and yes, people don’t like it when they encounter it.

        they love artifice. they love their biases being confirmed, they love their egos being flattered.

      • TommySoda@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        2 days ago

        Pretty much, yeah. It’s like reading fan fiction and assuming that’s how real people talk to each other. Similar to watching porn and assuming that’s how sex works, when in reality sex is clunky and oftentimes gross.

      • TommySoda@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        2 days ago

        Yeah, that’s kinda what the article is about. People choosing chatbots over real people. I’m just saying that it’s not good for your mental health and even worse for developing social skills.

        • stickyprimer@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          2 days ago

          Okay. Most of your comment seemed to be focused on whether they resemble actual humans. I don’t think we have any information about whether these impact your mental health but I would tend to agree they can’t be good.

      • Blemgo@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        1 day ago

        Overall, not really, since with a competent talking partner you would also get ways to improve your situation and help with pressing matters. It might be good in the short term, but there needs to be more for it to be good in the long run.

        • plyth@feddit.org
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          1
          ·
          1 day ago

          since with a competent talking partner you also would get ways to improve your situation

          An LLM could also give advice.

          • Blemgo@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            28 minutes ago

            That is true. However, 2 things have to be considered here:

            1. LLMs are easily manipulated. So if the LLM gives some advice, the person can easily spin things so the LLM believes its own advice doesn’t apply even when it does. And admitting that one’s own problems exist is harder for some people.
            2. LLMs can talk like a person, but they miss details about the other person, making their advice rather boilerplate, which can be very hit or miss.

            In contrast, people can overcome both hindrances. They can either try to make the other person realize the issues they are denying, or coax them into still trying the advice. Generally, our gift for reading little aspects of how the other person talks and behaves helps us communicate with them a lot more than we think.