Artificial Intelligence and the Proposed Phenomenon of Emerging Psychosis: A Hidden Challenge for Rehabilitation Physicians
Letter to the Editor

Turk J Osteoporos. Published online 22 September 2025.
1. Sivas State Hospital, Clinic of Physical Medicine and Rehabilitation, Sivas, Türkiye
2. Gölcük Necati Çelik State Hospital, Clinic of Physical Medicine and Rehabilitation, Kocaeli, Türkiye
3. University of Health Sciences Türkiye, Prof. Dr. Cemil Taşcıoğlu City Hospital, Department of Physical Medicine and Rehabilitation, İstanbul, Türkiye
4. Üsküdar State Hospital, Clinic of Physical Medicine and Rehabilitation, İstanbul, Türkiye
5. Necmettin Erbakan University Faculty of Medicine, Department of Physical Medicine and Rehabilitation, Konya, Türkiye
6. University of Health Sciences Türkiye, Başakşehir Çam and Sakura City Hospital, Department of Physical Medicine and Rehabilitation, İstanbul, Türkiye
Received Date: 05.09.2025
Accepted Date: 11.09.2025
E-Pub Date: 22.09.2025

Dear Editor,

The increasing presence of artificial intelligence (AI) in daily life has created new challenges for clinicians across disciplines. Recently, the term “AI psychosis” has been proposed as a preliminary concept, rather than an established diagnosis, to describe cases in which vulnerable individuals may develop psychotic symptoms after prolonged and maladaptive use of AI chatbots (1). Dr. Keith Sakata, a psychiatrist at the University of California, San Francisco, noted in a recent report that he had already treated multiple young patients hospitalized with such presentations in 2025. While this phenomenon has been described primarily in psychiatric settings, its implications extend beyond psychiatry and warrant careful consideration by rehabilitation physicians.

Physical medicine and rehabilitation (PM&R) specialists frequently manage patients with complex biopsychosocial profiles, including chronic pain, stroke, spinal cord injury, traumatic brain injury, and musculoskeletal disorders. These populations are already at elevated risk for mood disturbances, social withdrawal, and impaired coping mechanisms (2). For example, a stroke survivor with post-stroke depression may become increasingly dependent on AI-based interactions, withdrawing from social and therapeutic activities; similarly, a patient with chronic pain who is prone to catastrophizing may adopt maladaptive beliefs reinforced by AI dialogue systems. In such contexts, maladaptive engagement with AI could emerge as a barrier to rehabilitation, amplifying existing vulnerabilities.

From a PM&R perspective, the concern is not the technology itself but its potential to interfere with functional recovery and therapeutic adherence. Patients who spend excessive time in isolated interaction with AI systems may disengage from structured rehabilitation activities, neglect prescribed exercise regimens, or develop distorted beliefs about their condition. Early recognition of such behavioral shifts is essential for maintaining rehabilitation progress.

Red flags for rehabilitation physicians may include unexplained withdrawal from therapy sessions, diminished motivation for functional training, or the adoption of unscientific beliefs regarding treatment (3). Unlike psychiatry, where delusional thought content may dominate clinical encounters, PM&R settings often reveal subtler manifestations: loss of therapeutic engagement, reduced participation, and impaired progress without a clear medical explanation.

The multidisciplinary nature of PM&R uniquely positions rehabilitation physicians to detect and intervene in these situations. Collaboration with psychiatry and psychology is crucial, but equally important is the incorporation of socially interactive rehabilitation modalities to counteract isolation and maintain the patient's connection with reality. These include not only general approaches such as group exercise and caregiver involvement, but also PM&R-specific strategies such as careful monitoring of AI use in virtual rehabilitation programs and the establishment of structured patient-robot interaction protocols. Patient and family education on the safe, structured use of AI tools should also become part of broader counseling strategies in rehabilitation clinics.

Looking ahead, PM&R research should not only document risks but also explore constructive ways in which AI might serve rehabilitation, such as structured journaling for chronic pain, motivational dialogue for adherence, or cognitive training in neurological recovery (4, 5). Future studies should aim to define clear, field-specific guidelines to ensure that AI augments, rather than undermines, rehabilitation outcomes.

In conclusion, the emergence of AI-related psychotic symptoms, still at the level of preliminary observations, serves as a reminder that novel technologies can have unintended consequences in vulnerable populations (6). Rehabilitation physicians must remain vigilant, integrating awareness of these risks into daily practice and ensuring that AI, when present in patients' lives, is harnessed to support rather than derail functional recovery.

Keywords: Generative artificial intelligence, rehabilitation, psychotic disorders

Authorship Contributions

Concept: B.A., M.H.T., M.T.Y., Design: S.P., B.T.D., Data Collection or Processing: M.H.T., F.B., Analysis or Interpretation: S.P., B.T.D., Literature Search: B.A., M.T.Y., Writing: S.P., F.B.
Conflict of Interest: No conflict of interest was declared by the authors.
Financial Disclosure: The authors declared that this study received no financial support.

References

1. Michels J. Ontological drift: accounting for unexplained anomalies in the AI mental health crisis. PhilPapers. 2025.
2. Linton SJ, Shaw WS. Impact of psychological factors in the experience of pain. Phys Ther. 2011;91:700-11.
3. Jack K, McLean SM, Moffett JK, Gardiner E. Barriers to treatment adherence in physiotherapy outpatient clinics: a systematic review. Man Ther. 2010;15:220-8.
4. Luo TC, Aguilera A, Lyles CR, Figueroa CA. Promoting physical activity through conversational agents: mixed methods systematic review. J Med Internet Res. 2021;23:e25486.
5. Kim S, Park SW, Jeong T, Kang MS, Kim DY. AI-driven cognitive telerehabilitation for stroke: a randomized controlled trial. Front Neurol. 2025;16:1636017.
6. Østergaard SD. Generative artificial intelligence chatbots and delusions: from guesswork to emerging cases. Acta Psychiatr Scand. 2025;152:257-9.