I really don’t understand how anyone could want to chat with bots in general. Do people lack the ability to appreciate the genuine? It explains how you get people like trump. Who wants that kind of interaction?
There are people who suffer from isolation, anxiety, depression, trauma, or a host of other issues over which they have no control or support structures to address their problems. Of course, these bots aren’t a solution, but they are accessible. It’s no wonder people would use them.
They deserve sympathy, not condescension.
no control or support structures to address their problems
this is a real unmet need. but propping up AI chat bots as the solution, instead of pushing for structural changes, is somewhat self-defeating. getting people to become dependent on chat bots is exactly the profit model these unethical corporations are counting on.
heck, I have those, but I still don’t understand how anyone could want to chat with bots, and it’s not conversation.
The issue arises when you don’t have anyone to talk to. Having something to talk to, even though it’s not a real person, can be enticing as a way to sate the need to communicate with people. The problem is that people who don’t have a lot of real-life experience with communication fall into the trap of thinking it’s better because it’s always agreeable and “listens” better than normal people. To me that sounds like someone who has difficulties with oversharing and has poor social skills. What these people should actually be doing, in order to feel more satisfied socially, is working on their social skills instead of only talking to chatbots that can’t say no. If the types of relationships people have with chatbots were translated into human relationships, most people would consider them toxic. And how many people do you know who, for some reason, seek out and always end up in toxic relationships?
I find AI to be a better conversation partner than humans in most circumstances. It’s not perfect but it’s knowledgeable about pretty much every topic and it’s always fully engaged and attentive. Most people, by contrast, aren’t very interesting and most interesting people are busy. Of course I would prefer to talk to someone who was also subjectively experiencing and enjoying the conversation, but I can get a lot out of a conversation even without that.
It does not understand what it’s saying. It’s fine for summarizing some searches or bringing forth known best practices, but I would not call what it does conversation.
people fall in love with fictional characters in books and other media, mostly as a product of their imagined interactions with the character.
this isn’t any different, it’s just an AI version of it. it’s still mostly imaginative fantasy at the end of the day, and it’s a form of escapism from the real world.
the new yorker had an article about it where a housewife basically had an AI boyfriend who was her version of Geralt from the witcher, and was using it to cope with a stillbirth from 5 years earlier. her AI Geralt was the only one who ‘really understood her’ and her struggles with the stillbirth trauma. it’s all entirely a fiction in her head, but it’s a mechanism for self-soothing that is relatively harmless compared to, say, her doing drugs or divorcing her husband or other coping methods that might manifest. it was basically fan-fiction with an AI agent helping her co-write.
this I understand. I mean, as a video game or a laugh, sure. but it’s not conversation.
Kind of feels like semantics.
Let’s say I give you a discord link and tell you that half the people are bots and half aren’t. Realistically, LLMs are at a level where you won’t be able to tell which is which.
So what then? You’re only having a conversation half the time, but you can’t point out when that is? Feels a bit hollow.
This probably happens on Lemmy. You probably have interactions that you qualify as conversations in your head but that are with bots.
People are in for a rude awakening when we discover that ‘next token prediction’ is what intelligence means after all.
back and forths, sure. only some attain the level of a conversation. yeah, bots exist, but social media is not a substitute for the real world. I would not call it semantics. it’s my experience talking/chatting with humans and with ai. My big thing with folk who want to get an idea of llm limits is to engage it on a topic you are very familiar with, or where you can see the effects immediately, like playing a video game. I have been using it with Baldur’s Gate and it’s been… interesting.
Ya, I get what you mean. I’m just saying that to claim there’s a difference, you would have to be able to see that difference in a blind test.
I understand they have limits, but so do regular people. You don’t need to be an expert on a subject to hold a conversation about it.
They aren’t intelligent or anything, and they make the stupidest mistakes, but they can more or less hold a convo just as well as the average rando on the internet.
It’s definitely hollow but I get why people are getting caught up in it.
most of modern life isn’t genuine. and yes, people often don’t like the genuine when they encounter it.
they love artifice. they love their biases being confirmed, they love their egos being flattered.