

Seriously. What happens if it hallucinates and decides that I said I was planning to harm myself or others? Could I end up being committed because an LLM thought I said something I didn’t?
Or, more realistically, how does this affect something like body language? When taking notes, a therapist does more than just write down the words you say. They also take note of any body language or behavior that might be relevant to your case. If AI is replacing all the note-taking, that leaves two possibilities. One is that the therapist simply won't ever have a record of nonverbal communication. The second is even worse: you try to get an AI to create that record by feeding it a video of the session. Now you've brought in even more subjectivity.




My husband and I don’t have a TV in our bedroom. We’ll go to bed at different times, and a TV going is just too much light and noise. If one of us wants to fall asleep to a TV show, there’s a very comfortable couch in the living room.
I think it might be different if we had kids. In that case, a TV in the bedroom can be useful if you want to watch something the kids aren't old enough for yet. But it's just the two of us, so we keep the TV out of the bedroom. The bedroom is for sleeping and other activities.