Dear Editor,
The increasing presence of artificial intelligence (AI) in daily life has created new challenges for clinicians across disciplines. Recently, the term “AI psychosis” has been proposed as a preliminary concept, rather than an established diagnosis, to describe cases in which vulnerable individuals may develop psychotic symptoms after prolonged and maladaptive use of AI chatbots (1). Dr. Keith Sakata, a psychiatrist at the University of California, San Francisco, recently reported having treated multiple young patients hospitalized with such presentations in 2025. While this phenomenon has been described primarily in psychiatric settings, its implications extend beyond psychiatry and warrant careful consideration by rehabilitation physicians.
Physical medicine and rehabilitation (PM&R) specialists frequently manage patients with complex biopsychosocial profiles, including chronic pain, stroke, spinal cord injury, traumatic brain injury, and musculoskeletal disorders. These populations are already at elevated risk for mood disturbances, social withdrawal, and impaired coping mechanisms (2). For example, a stroke survivor with post-stroke depression may become increasingly dependent on AI-based interactions, withdrawing from social and therapeutic activities; similarly, a patient with chronic pain who is prone to catastrophizing may adopt maladaptive beliefs reinforced by AI dialogue systems. In such contexts, maladaptive engagement with AI could emerge as a barrier to rehabilitation, amplifying existing vulnerabilities.
From a PM&R perspective, the concern is not the technology itself but its potential to interfere with functional recovery and therapeutic adherence. Patients who spend excessive time in isolated interaction with AI systems may disengage from structured rehabilitation activities, neglect prescribed exercise regimens, or develop distorted beliefs about their condition. Early recognition of such behavioral shifts is essential for maintaining rehabilitation progress.
Red flags for rehabilitation physicians may include unexplained withdrawal from therapy sessions, diminished motivation for functional training, or the adoption of unscientific beliefs regarding treatment (3). Unlike psychiatry, where delusional thought content may dominate clinical encounters, PM&R settings often reveal subtler manifestations: loss of therapeutic engagement, reduced participation, and impaired progress without a clear medical explanation.
The multidisciplinary nature of PM&R uniquely positions rehabilitation physicians to detect and intervene in these situations. Collaboration with psychiatry and psychology is crucial, but equally important is the incorporation of socially interactive rehabilitation modalities to counteract isolation and maintain patients’ connection with reality. These include not only general approaches such as group exercise and caregiver involvement, but also PM&R-specific strategies such as careful monitoring of AI use in virtual rehabilitation programs and the establishment of structured patient–robot interaction protocols. Patient and family education on the safe, structured use of AI tools should also become part of broader counseling strategies in rehabilitation clinics.
Looking ahead, PM&R research should not only document risks but also explore constructive ways in which AI might serve rehabilitation: structured journaling for chronic pain, motivational dialogue for adherence, or cognitive training in neurological recovery (4, 5). Future studies should aim to define clear, field-specific guidelines to ensure that AI augments, rather than undermines, rehabilitation outcomes.
In conclusion, the emergence of AI-related psychotic symptoms, still at the level of preliminary observations, serves as a reminder that novel technologies can have unintended consequences in vulnerable populations (6). Rehabilitation physicians must remain vigilant, integrating awareness of these risks into daily practice and ensuring that AI, when present in patients’ lives, is harnessed to support, not derail, functional recovery.