Sunday, August 31, 2025

A.I. as opposed to what?

One popular use of A.I. - specifically chatbots using LLMs - that I did not foresee is as a source of life advice. Sometimes, people refer to this in the context of therapy, as if the user is using ChatGPT or another A.I. as a substitute therapist. That implies a certain level of intimacy and privacy that probably applies to a minority of the many advice-seeking queries users offer every day.

When thinking about the effects of anything, we often search for comparisons, and the comparisons we make are often shaped by the metaphors we use - I think that's what is happening with how we think about the effects of using A.I. for advice-seeking. Instead of choosing the comparison that comes to mind quickest, or one that suits our existing biases for or against the phenomenon in question (e.g., how do the answers given by ChatGPT compare to those given by a professional therapist?), we might choose our comparisons in a more deliberate way. Instead of starting with the connections our minds make upon learning of a few vivid exemplars, we might first define the field of inquiry: what is the behavior we're interested in, how many folks are engaging in it, and how can we observe a representative sample of it?

The next step is to consider the behavior in the context of people's lives. Here, the choice of comparison is not what comes to mind for us, but our reasonable guess as to what people engaging in the behavior would have done otherwise. If they didn't have access to ChatGPT, who or what would they have gone to for advice (or would they have gone to anyone or anything for advice at all)? 

Therapists strike me as sources of very good advice on many topics. They're also better than self-help books because they can tailor their advice to you as an individual and they engage in a back-and-forth exchange. There's also some intangible humanity to your connection with a therapist that clearly helps. On the downside, it is hard to access therapy. There are only so many professionally trained therapists to go around. Efforts to make them more accessible by moving therapy online sacrifice some benefits of the therapeutic experience, namely the intangible human connection, in addition to leading to burnout among overbooked therapists. Even a society that is fully committed to serving the mental and emotional needs of its citizens runs up against the limits of therapeutic supply.

Of course, people have turned to many other sources for life advice, including friends, family, clergy, authors, and artists. You may not approach a movie, TV show, or novel with an advice-seeking intention, and yet its lessons may be your guide through any number of emotional or existential straits.

Then there are sources of advice that we encounter online: in spaces explicitly marked as such (and which derive their format and logic from newspaper advice columns) but also on YouTube, TikTok, and podcasts. Again, these life lessons may not be something we sought out, but they may still inform how we navigate challenging times in our lives. 

As I consider whether it's a good idea for people to turn to ChatGPT for advice, I find that the most apt comparison - that is, the most plausible source of advice for most of the people asking ChatGPT for advice - is either googling it, searching YouTube or TikTok, or running across it on those platforms via an algorithm. The quality of advice coming from these sources is, well, mixed. In many cases, the answers aren't as high quality as one would get from a professional, but they're much more accessible, which is why they're used more often. In some cases - googling something that is commonly googled and finding results from trusted sources - the quality is similar to what you'd get from a professional.

Are the types of life advice questions users ask ChatGPT more like the impersonal kind that people have, for the past 20 years, tended to google? Or are they more like the personal questions that people turned to friends, family, and - if they have access - therapists for? My sense is that a fair number of young people have been getting that kind of advice from YouTube, TikTok, and other social media platforms for the past decade. In that sense, social media is an appropriate point of comparison for that type of advice.

I'd imagine advice of this sort given by ChatGPT would be pretty homogeneous when compared to what you would find on social media platforms. The latter varies greatly in the values it reflects and the perspectives and experiences from which it draws. It leaves the existing variety of humanity intact, while ChatGPT probably flattens it.

I guess I imagine cases in which people get bad life advice - "bad" in the sense that it results in some harm to themselves and/or others - being more numerous on social media platforms than on ChatGPT. I also imagine that the flat homogeneity of its answers won't necessarily spread to other aspects of culture more broadly, as some suspect it will. ChatGPT tends to qualify its advice and allow for some degree of variety and nuance in its answers. By merely asserting that there is some variety and nuance to certain life advice questions, ChatGPT would be contradicting the values of some of its users. This is less likely to happen on social media, where answers are sorted by algorithms to confirm the existing biases of users.

And so that would seem to be the trade-off: leave the echo chambers of social media advice intact - some of which generate harmful outcomes - or replace them with answers that guide most users away from harmful outcomes but homogenize...something. But is it really Culture that's being homogenized if people are encouraged to cope with crisis in particular ways? This is where my speculation reaches its limit and I feel the need for a more systematic examination of the life advice questions people ask ChatGPT and the answers they receive. 
