Artificial intelligence (AI) is coming for therapists’ jobs and we should be afraid, perhaps very afraid. Or should we be rejoicing in the added richness – and relief from tedious bureaucratic admin – that it potentially brings?

AI is certainly high on the current news agenda, spurred by the launch of ChatGPT in November last year. ChatGPT takes AI to a whole new level of sophistication. You can have conversations with ChatGPT that you might easily mistake for a human-to-human interaction; it can write essays, answer questions intelligently, code data, compose emails and engage in social media chit-chat. But what it can’t do is empathise or feel.

Digital technology is already transforming the delivery of healthcare, and not just in terms of administration. In mental health, apps offer easily accessible psychoeducation, activity and compliance monitoring and CBT-based therapies; virtual reality is providing new and effective ways of challenging phobias and paranoia; and chatbots are delivering basic talking therapy and conducting assessment interviews.

Some argue that AI brings exciting new tools that can only benefit more people and improve access to therapy. Others fear that it threatens that most essential element of talking therapy – the human-to-human relationship. And some echo the AI industry leaders who, earlier this year, put out a warning that the AI technology they themselves are building could one day threaten the human race. Could therapy delivered by ChatGPT actively do harm?

A recent article published by a group of leading academics on how AI could change psychotherapy sought to envisage how it could be done safely and responsibly.1 Done right, AI can help clinicians with intake interviews, documentation, notes and other basic tasks, they say; it is a tool to make their lives easier.2 ‘Handing these lower-level tasks and processes to automated systems could free up clinicians to do what they do best: careful differential diagnosis, treatment conceptualisation and big-picture insights.’

To a certain extent, this is already offered by the AI-based apps now widespread in the mental health arena, especially ones focused on self-help and mental wellbeing. They are also used increasingly in the mental health services to monitor clients in the community and ensure they are taking their meds and following their treatment regimes.

In May, NICE fast-tracked approval for nine mental health apps to be offered within the NHS Talking Therapies primary care counselling services to treat anxiety and depression. Some are already in widespread circulation, but NICE approval is needed if they are to be offered through the NHS.3 Six of the apps are recommended for use only with the support of a high-intensity therapist, by people with anxiety disorders such as body dysmorphic disorder, generalised anxiety, PTSD and social anxiety disorder. Three are online CBT programmes for depression that should be delivered with support from a practitioner or therapist, including regular monitoring of progress and patient safety. These are likely to be the first batch in an increasing number of such apps, as the NHS seeks to reduce the huge backlog of people waiting for talking therapies.

Professor Til Wykes is a member of the NICE committee that approved the apps for provisional use, pending outcomes and user feedback. A psychologist and Head of the School of Mental Health and Psychological Sciences at King’s College London, Wykes remains sceptical about the notion that apps could replace a live therapist. ‘I do think they are effective for some people, but not necessarily effective for all, and not necessarily effective if you don’t have some other support system in place.’