Notes
Article history
The research reported in this issue of the journal was funded by the PHR programme as project number 17/149/03. The contractual start date was in October 2019. The final report began editorial review in March 2022 and was accepted for publication in November 2022. The authors have been wholly responsible for all data collection, analysis and interpretation, and for writing up their work. The PHR editors and production house have tried to ensure the accuracy of the authors’ report and would like to thank the reviewers for their constructive comments on the final report document. However, they do not accept liability for damages or losses arising from material published in this report.
Permissions
Copyright statement
Copyright © 2024 Scior et al. This work was produced by Scior et al. under the terms of a commissioning contract issued by the Secretary of State for Health and Social Care. This is an Open Access publication distributed under the terms of the Creative Commons Attribution CC BY 4.0 licence, which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. See: https://creativecommons.org/licenses/by/4.0/. For attribution the title, original author(s), the publication source – NIHR Journals Library, and the DOI of the publication must be cited.
Chapter 1 Introduction
Sections of this chapter have been reproduced from the Standing Up for Myself (STORM) study protocol,1 reproduced under licence CC-BY-4.0 from the National Institute for Health and Care Research (NIHR) Journals Library.
People with intellectual disabilities are more likely than the general population to experience ‘some of the worst of what society has to offer’, including low incomes, unemployment, poor housing, social isolation and loneliness, bullying and abuse. 2 The NHS-commissioned report that presents this conclusion recommended more action to reduce stigma and discrimination as a means to improve lives and health outcomes for people with intellectual disabilities.
Intellectual disability is characterised by an IQ below 70 and associated deficits in everyday adaptive skills, arising before the age of 18, and is estimated to affect 1.4–2% of the UK population. 3 Adults and young people with intellectual disabilities are at increased risk of mental health problems, with recent prevalence estimates of diagnosable psychiatric disorders at 30–50%. 4,5 They face substantial social and health inequalities, at least partly due to stigmatising attitudes and barriers within health, social care and education systems, and wider society. 2,6 The increased risk of mental health problems is due to a range of biological, psychological and social factors – one important social factor is stigma: the negative stereotypes held by society about people with intellectual disabilities, which often lead to prejudice and discrimination. Despite positive changes in policies, service provision and societal views in general, young people and adults with intellectual disabilities still frequently face negative attitudes and discrimination. 7 These in turn render them more vulnerable to a negative sense of self and low self-esteem,2,8,9 poor quality of life2,6,10 and mental health problems. 11,12 Accordingly, interventions that seek to reduce stigma and that ideally empower people with intellectual disabilities to challenge stigma themselves have the potential to improve their well-being and to reduce inequalities.
Stigma
Young people and adults with intellectual disabilities often face attitudinal barriers and negative interactions arising from their stigmatised status in society, including bullying, harassment, hostility and other negative encounters in the community. This is in addition to discrimination in education, employment, access to and receipt of health and social care services, and housing. 13 However, due to the social and cognitive limitations associated with a diagnosis of intellectual disability and reduced social support, their capacity to deal with others’ negative responses is often diminished. Disability hate crimes are not infrequent; in response, many are afraid to move freely within their local communities, severely restrict their movements and even move home, if able to do so. 14,15
Consistent associations have been reported in this population between stigma and poorer self-reported health outcomes,6 increased anxiety and depression11,12 and lower self-esteem. 8,9 Consequently, intellectual disability stigma needs to be tackled at multiple levels, as articulated in our theoretical framework, which calls for evidence production and remedial action at four interlinked levels (Figure 1): the stigmatised person; their family, who may be subject to courtesy and affiliate stigma themselves;16,17 the community, which generates and maintains stigma through prejudicial attitudes and discrimination; and, finally, society and systems, such as legislation and policy. 18
Interventions are in place to tackle such stigma at the institutional level and to promote more positive attitudes towards people with intellectual disabilities among the public and among key target groups. However, a number of systematic reviews evidence a lack of effective interventions for: (1) reducing the negative effects of stigma as experienced by people with intellectual disabilities, and (2) managing stigma. 18–20 Psychological and psychosocial approaches suited to this end, such as cognitive behavioural therapy, narrative approaches and acceptance and commitment therapy, have been used successfully, for example, in the mental health field to help buffer individuals against stigma and its negative consequences, including low self-esteem. 21,22 Evidence that empowering the targets or ‘victims’ of stigma can be effective comes, for example, from meta-analyses of interventions with victims of bullying. 23,24 Developing effective ways of enhancing the capacity of people with intellectual disabilities to manage and resist stigma, both individually and collectively, is likely to have positive effects on their self-esteem, mental health and general well-being. In turn, reducing the negative impact of stigma may reduce demands on (mental) health and social care services as a result of improved well-being (see Chapter 2, Logic model).
A meta-analysis of ‘stigma resistance’ in the mental health field found strong positive associations between stigma resistance and self-stigma, self-efficacy, quality of life and recovery in people with mental health problems. 25 These authors described stigma resistance as an ongoing, active process that involves using one’s experiences, knowledge, and sets of skills at the (1) personal, (2) peer and (3) public levels. 26 They described stigma resistance at the personal level as ‘(a) not believing stigma or catching and challenging stigmatising thoughts, (b) empowering oneself by learning about one’s condition and recovery, (c) maintaining one’s recovery and proving stigma wrong and (d) developing a meaningful identity apart from’ (p.182) one’s condition (in their case, mental health challenges). At the peer level, stigma resistance involves ‘using one’s experiences to help others fight stigma and at the public level, resistance involved (a) education, (b) challenging stigma, (c) disclosing one’s lived experience and (d) advocacy work’ (p.182). 26 As yet though, this understanding has not been translated into evidence-based interventions that seek to increase stigma resistance among members of different stigmatised groups.
Looking to disability stigma specifically, the first Global Disability Summit in 2018 stressed that more action is needed to reduce high levels of stigma faced by people with disabilities around the world. The Summit led to 170 commitments by governments and multilateral organisations, including the World Health Organization and several UN agencies, to do more to reduce disability stigma. As noted, this project is firmly located within an understanding that disability stigma occurs at and needs to be tackled at multiple levels. Hence, the decision in this project to focus on empowering people with intellectual disabilities to resist stigma, if they so choose, does not for a moment suggest that parallel anti-stigma interventions are not also needed at the family, community and institutional levels.
Standing Up for Myself as a public health intervention
In the intellectual disability field, a few interventions have sought to employ psychosocial approaches to enhance the capacity of people with intellectual disabilities to manage their stigmatised status in society. These include psychoanalytically informed consciousness-raising groups,27 and groups drawing on cognitive behaviour therapy. 28 While such interventions may be helpful at an individual level by supporting the person to ‘come to terms’ with their disability and learn to cope with a stigmatised identity, they do not go beyond stigma management, nor do they explicitly name oppression, and thus do not empower people with intellectual disabilities to challenge the status quo. Furthermore, they were designed to be delivered by highly skilled clinicians and are therefore limited in reach.
Standing Up for Myself, a psychosocial group intervention, works directly with groups of young people and adults with intellectual disabilities to enhance their capacity to manage and resist stigma. STORM was designed from the outset to be scalable by being brief (four sessions plus one follow-up session) and suitable for delivery by group facilitators with a modest amount of preparation and training but without requiring any specific qualifications. By being delivered within the context of established groups, STORM provides a safe space to tackle sensitive subjects, maximises the potential for peer support and does not require substantial new delivery mechanisms which would affect its potential future implementation.
The STORM programme
Standing Up for Myself is a manualised psychosocial group intervention developed with close input from people with intellectual disabilities and experienced facilitators of groups for members of this population. STORM was designed for delivery by staff in education, social care and third sector organisations who have experience of facilitating groups for people with intellectual disabilities, without requirement for any specialist qualification. ‘Third sector organisations’ refers to voluntary and community organisations, typically providing services to achieve social goals. These are non-governmental and operate on a not-for-profit basis (unlike the private sector). The STORM intervention is delivered to established groups to ensure members feel comfortable and safe discussing sensitive and potentially distressing topics, to offer peer support, and to ensure that group facilitators who know group members can monitor responses and offer additional support, where necessary. Peer support available through a group intervention is seen as an integral part of STORM with hypothesised benefits for well-being, self-worth and responses to stigma, based on evidence from the mental health field. 29,30
Standing Up for Myself consists of four weekly 90-minute sessions and a 90-minute follow-up session (delivered around 4 weeks after session 4); an overview of the STORM programme sessions and key messages is provided in Table 1. The intervention involves: (1) watching short films of people with intellectual disabilities talking about the meaning of intellectual disability to them personally, their first-hand experiences of social interactions (both positive and negative) and how they deal with negative interactions with others; (2) group discussions of this material, guided by questions posed by the group facilitator as per the manual; (3) sharing of personal experiences relating to the topic under discussion; (4) problem-solving in relation to different possible responses to stigmatising experiences; and (5) action planning for managing and resisting stigma in future either individually and/or as a group. STORM uses existing film footage produced with and by people with intellectual disabilities as stimuli for discussion in sessions 1–3. This consists of short film clips (2–7 minutes in length) used as part of the STORM intervention with permission from the films’ original makers/producers.
| STORM sessions and key messages | STORM activities |
|---|---|
| Session 1: What does ‘learning disability’ mean to people with learning disabilities? What does it mean to me? Key message: My learning disability is only one part of me. | Videos (4 clips) followed by a discussion. |
| Session 2: How are people with learning disabilities treated? Key message: It’s not OK for people to treat me badly. I don’t have to put up with it. | Videos (3–4 clips) followed by a discussion. |
| Session 3: How do people with learning disabilities respond to being treated negatively? Key message: I can stand up for myself when people treat me badly. | Videos (3–4 clips) followed by a discussion. |
| Session 4: What am I already doing when others treat me in a way I don’t like? What else do I want to try? Key message: I can make a plan to help me stand up for myself. People I trust can help me with it. | Begin action planning. Celebration event. |
| Follow-up session: What worked and what got in the way of my plan? Key message: Things can get in the way of my plan. Talking to others can help me decide what to do next and not give up. | Review and discussion of the action plan. |
Of note, the intervention uses the term ‘learning disability’ as the most commonly used term in the UK to denote ‘intellectual disability’.
The STORM manual is available as a PDF document and is supported by a web-based version (a Wiki) that contains all training and preparation materials, film clips, session materials, information for participants in an accessible Easy Read format, and optional activity and work sheets, in a format designed to make it as easy as possible for facilitators to deliver each session in accordance with the manual. These materials were fully updated following feedback from an initial pilot study. 31
The rise of digital interventions and the impact of COVID-19
The use of digital health (also known as e-health) interventions has grown rapidly over the last decade, a development that has accelerated greatly in the wake of the COVID-19 pandemic and associated social restrictions. 32,33 The effects of the COVID-19 pandemic were acutely felt in the UK from March 2020, with the first national lockdown. The way in which services and support were delivered, including to people with intellectual disabilities, has been responsive to successive lockdowns and the need for distancing. Face-to-face (F2F) meetings were largely suspended at times during the pandemic and, wherever possible, organisations shifted to supporting their members and users through virtual methods, in particular, web-based video meetings and WhatsApp calls (see Chapter 3, Usual practice). Consultations with partner and third sector organisations over the course of the COVID-19 pandemic have highlighted that this shift has been successful for many (if not all) people with mild-to-moderate intellectual disabilities. With initial support, many have learnt how to use virtual meeting software/apps. Many individuals with intellectual disabilities have underlying health conditions that place them at increased risk of COVID-19, and many organisations anticipate that they may continue to use virtual methods for the foreseeable future to engage with people with mild-to-moderate intellectual disabilities, and as part of a more diverse offer once they are able to resume F2F meetings. The widespread move towards much greater use of digital technologies, including by people with intellectual disabilities, through force of circumstance has offered an opportunity to assess the use and value of digital methods to promote well-being in this population. This is a priority in current health policy, yet an area where people with intellectual disabilities were largely excluded pre-pandemic. 34,35
Evidence supporting the acceptability and effectiveness of digital mental health interventions among both children and young people and adult populations has been generated in a range of settings. 36–41 However, as Lehtimaki et al. 37 note, the effectiveness of many digital mental health interventions remains inconclusive and more rigorous and consistent demonstrations of effectiveness and cost effectiveness are needed. In addition, further analyses are required to provide stronger recommendations regarding relevance for specific populations. 40 The current STORM study was originally designed as a cluster-randomised feasibility trial of the manualised STORM psychosocial group programme for people with intellectual disabilities, including economic and process evaluation. However, following the onset of the COVID-19 pandemic, the study was initially paused, and subsequently (in the autumn of 2020) the study design was revised. Adapting STORM so that it would be suitable for digital delivery became the primary aim, followed by a small pilot study to test the feasibility and acceptability of the adapted digital group intervention.
Digital interventions for people with intellectual disabilities
The use of digital health interventions with people with intellectual disabilities has been limited and this population has generally been excluded from e-health research. 42 Even though the scope of digital mental health is vast and people with intellectual disabilities who face multiple threats to their mental well-being might well benefit, they seem to have been relatively neglected in the discourse around digital mental health and the development and implementation of new (digital) interventions. 35 Historically, lower levels of access to the internet among this population have created a ‘digital divide’. 34 In 2019, Ofcom reported that people with intellectual disabilities in the UK had lower access to the internet compared with people without disabilities. This discrepancy was partly explained by lower levels of household ownership of PCs/laptops and tablets among this population (69% vs. 85% of people with no disabilities) and of smartphones (70% vs. 81%). 43 Contextual barriers to the use of digital interventions with people with intellectual disabilities also include concerns among many service providers about their own lack of digital skills which limit their opportunity to support people with intellectual disabilities. 44 However, it has been suggested that such barriers can be overcome with appropriate support and adaptations and a small, but growing, literature attests to the value of digital technologies to improve health as well as educational, vocational and leisure opportunities for this population. 35
The rationale for a digital version of the STORM intervention
The need to tackle the digital divide among people with intellectual disabilities, if we are to avoid further increasing existing health inequities, has never been more evident than in the context of the pandemic. Increasing access to digital technology and the internet and generating e-health interventions that can improve the well-being of large numbers of people with intellectual disabilities should be a priority. It is also in line with the need to challenge health inequalities in the UK health and social care system stipulated in the National Institute for Health and Care Excellence Evidence Standards Framework for Digital Health Technologies. 45
Having the option to meet virtually can promote self-determination and social participation for people with intellectual disabilities,46 with benefits including increased communication and social interaction. 47,48 Digital technology could also be a means to widen access for members of this population to therapy. 42 The potential benefits of digital interventions for people with intellectual disabilities are supported through consultation work we carried out at various time points since the first UK lockdown (see Chapter 3, Usual practice). Self-advocates and third sector organisations reported that for many people with intellectual disabilities virtual meetings were a preferred means of connecting with others as home was seen as a safe and familiar space to discuss personal experiences, and it could allow increased confidence to speak up and engage in discussions. Many people with intellectual disabilities experience anxiety about moving around their local community or travelling alone or are reliant on support from others to attend services, both of which can act as barriers to joining and attending groups in person. Virtual meetings have been suggested as one way to help overcome this, with some organisations informing us that they have been able to reach more of their members since replacing F2F with virtual meetings. Furthermore, people with intellectual disabilities living in rural areas of the UK are often prevented from using community services and joining groups due to the absence or inaccessibility of public transport and/or support required to travel. Organisations we consulted, particularly those that cover rural areas, informed us that attendance at F2F meetings is lower among their members during the winter months and greater use of digital meetings could fill an important gap. We should stress that digital methods are by no means for everyone. In a recent large-scale study on the impact of the COVID-19 pandemic on the lives of people with intellectual disabilities in the UK, 42% were reported never to have been keen on online activities. 48 Therefore, adjustments to service delivery continue to be called for to meet the diverse needs of this population, and to reduce barriers to the inclusion and active participation of people with intellectual disabilities in their communities, both ‘real world’ and virtual.
The COVID-19 pandemic has highlighted both the immediate need for and importance of STORM in helping people with intellectual disabilities discuss negative experiences with their peers and resist stigma, and the discrimination and inequalities people with intellectual disabilities have experienced during this time. 49,50 Self-advocacy groups have often felt neglected by the government while their members had to shield, and have complained about a lack of reasonable adjustments, including a failure to communicate social restrictions and rules. By continuing to meet virtually, these organisations have provided their members with space to connect with others and have sought to reduce the negative consequences of social isolation. They have emphasised the need for a space to discuss negative experiences and the stigma they face more generally, as well as specifically at this time when discrimination and social inequalities have been heightened.
Looking to the future, it seems important to offer a choice of delivery formats to access the STORM intervention to increase inclusion and respond to diverse preferences for engaging with peers. This aligns with our long-term goal to offer STORM as a widely and freely accessible public health intervention that can enhance the ability of people with intellectual disabilities to manage and resist the stigma they often face in their everyday lives. By potentially extending the reach of STORM among this population through the creation of a digital version, we seek to challenge exclusion and promote equality at multiple levels.
Adaptation of STORM for digital delivery and pilot study
In view of the aforementioned issues, the main aim of this project was to (1) adapt the STORM intervention for digital delivery (Digital STORM) to groups of people with mild-to-moderate intellectual disabilities, and (2) test delivery of the adapted intervention and outcome measurement in a virtual environment through a small pilot with four groups. The adaptation process is described in detail in Chapter 2. The pilot study is reported according to the progression criteria set out in the revised study protocol (see Chapter 2, Revised study design). The acceptability, adherence and fidelity of the intervention delivery are detailed in Chapter 3.
Chapter 2 Methods
Sections of this chapter have been reproduced from the STORM study protocol,1 reproduced under licence CC-BY-4.0 from the NIHR Journals Library.
Original study design
Standing Up for Myself was originally designed as a cluster-randomised feasibility trial of the manualised STORM psychosocial group programme for people with intellectual disabilities, including economic and process evaluation. The plan was to invite third sector, education and social care organisations that work with groups of people with intellectual disabilities to identify one established group, with an average of six to seven members, interested in participating in the trial. Groups were to be randomised to STORM or the control arm in a 1:1 ratio using variable block randomisation, with the group as the unit of randomisation. STORM was to be delivered via F2F groups as weekly 90-minute sessions over the course of 4 weeks, with a 90-minute follow-up session (delivered around 4 weeks after session 4). The control group would receive usual practice (UP) plus access to STORM after the 10–12 months’ follow-up period (wait-list control). Using mixed methods, we intended to examine delivery of the intervention and adherence, as well as to gain stakeholder views on the acceptability of the intervention and on barriers and facilitators that may affect not only its future implementation but also plans for a future definitive trial.
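To illustrate the group-level allocation described above, the sketch below shows, in Python, one way variable block randomisation with the group as the unit of randomisation could be implemented. It is a minimal illustration only, not the trial's randomisation system; the group identifiers, block sizes and seed are hypothetical.

```python
"""Illustrative sketch of cluster (group-level) block randomisation, 1:1."""
import random


def randomise_groups(group_ids, block_sizes=(2, 4), seed=None):
    """Allocate whole groups to 'STORM' or 'control' using variable blocks.

    Each block contains a near-equal split of the two arms, so allocation
    stays balanced; the unit of randomisation is the group, not the person.
    """
    rng = random.Random(seed)
    allocations = {}
    remaining = list(group_ids)
    while remaining:
        size = min(rng.choice(block_sizes), len(remaining))  # final block may be shorter
        arms = (["STORM", "control"] * size)[:size]          # near-1:1 split within the block
        rng.shuffle(arms)                                    # random order within the block
        for gid, arm in zip(remaining[:size], arms):
            allocations[gid] = arm
        remaining = remaining[size:]
    return allocations


if __name__ == "__main__":
    # Hypothetical group identifiers and seed, for illustration only.
    print(randomise_groups([f"group_{i}" for i in range(1, 9)], seed=42))
```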
Revised study design
In view of the obstacles that the COVID-19 pandemic posed to delivering the study as originally intended, the study protocol was revised to include work to adapt the STORM intervention to make it suitable for online delivery via web-based meeting platforms to people with mild-to-moderate intellectual disabilities. The adaptation was a 4-month phase of work, undertaken with in-depth input and oversight from a multistakeholder Intervention Adaptation Group (IAG). The outcome of this phase was to produce intervention materials appropriately adapted to the new delivery context and to revise the intervention manual and training materials, where necessary. The newly adapted digital intervention would then be piloted through delivery to groups already meeting online and as part of a pilot study to examine its feasibility and acceptability (Figure 2).
Objectives
The objectives were:
1. To adapt the existing STORM intervention for online delivery (Digital STORM), ensuring the content, number of sessions and direct contact time were the same for both STORM and Digital STORM.
2. To pilot the Digital STORM intervention in order to investigate the feasibility of recruitment to and retention of participants in Digital STORM; and adherence, fidelity and acceptability of Digital STORM, when delivered to groups of people with mild-to-moderate intellectual disabilities online.
3. To test digital administration of the study outcome and health economics measures.
4. To build on community assessments to describe what ‘UP’ for organisations delivering groups for people with mild-to-moderate intellectual disabilities may look like in the wake of COVID-19, to inform a potential future trial.
Progression criteria related to study objectives
Undertaking the pilot study of a digital adaptation of STORM was contingent on the study team being able to meet objective 1: to revise and adapt the intervention materials to the satisfaction of the IAG and study management group (SMG). This objective was met by the end of March 2021.
The study progression criteria for objectives 2 and 3 are outlined in Table 2. For objective 2, regarding the feasibility and acceptability of Digital STORM, the progression criteria have been divided into two parts. Part one outlines criteria assessed using data collected from facilitators and through session recordings, and part two focuses on acceptability from the perspective of the pilot participants. Finally, for objective 3, progression criteria are defined to assess the feasibility of digital administration of the study outcome measures.
Research objective 2: To investigate the feasibility and acceptability of Digital STORM, when delivered to groups of people with intellectual disabilities online.

2.1. Feasibility of delivery of Digital STORM to participants, assessed via facilitator notes, supervision notes, facilitator interviews and a review of session recordings to demonstrate:

| Progression not indicated | Indicator for progression, with further adaptations needed | Strong indicator for progression |
|---|---|---|
| Attendance was poor (participants on average attended less than three sessions) and within-session presence and engagement were poor, unlikely to be resolved with adaptations. Technical problems were significant, were not resolvable and likely to severely impair future implementation. | Good attendance across sessions (participants on average attended at least three sessions), BUT presence and engagement within sessions were somewhat impaired by technical issues. These would be resolvable with some additional modifications/guidance. | Good attendance across sessions (participants on average attended at least three sessions) and presence and engagement within sessions were good, any technical problems were resolvable and did not unduly impact on running the session or engagement (with brief connectivity problems not exceeding 10–15 minutes). |
| Would not recommend Digital STORM to others nor consider running it with other groups. | Would only consider recommending Digital STORM to others or running another group if specific changes were made to the content, structure or accessibility of the technology. | Would recommend Digital STORM to others and consider running it with another group. |
| Facilitators felt unable to monitor participants’ emotional responses during sessions and to respond accordingly, modifications unlikely to resolve this. | Facilitators felt modifications were needed to monitor participants’ emotional responses during sessions and to respond accordingly, but these would be straightforward to implement through additional guidance or adaptations. | Facilitators felt able to monitor participants’ emotional responses during sessions and to respond accordingly. |
| Privacy threats were evident and unmanageable or significantly affected engagement and participation. Further modifications unlikely to adequately resolve this. | Privacy could be better managed with some modifications. | Able to manage any threats to privacy for group members either by following the manual or taking additional steps. |

2.2. Acceptability of Digital STORM to participants, assessed via feedback obtained from focus group meetings to demonstrate:

| Progression not indicated | Indicator for progression, with further adaptations needed | Strong indicator for progression |
|---|---|---|
| A majority of participants judged the delivery format as not acceptable and would not recommend Digital STORM to others. Changes unlikely to remedy this. | 50% or fewer judged the digital delivery format as acceptable and would recommend Digital STORM to others. | A majority of participants judged the digital delivery format as acceptable and would recommend Digital STORM to others. |
| On occasion when participants found intervention contents distressing, they either felt unsupported by the group facilitator and/or their peers, or were unable to access support outside of the group. Changes unlikely to remedy this. | On occasion when participants found the intervention contents distressing, they did not always feel well supported by the group facilitator and/or their peers, or were not able to access support outside of the group when they needed it. Changes would mean that support in this area could be improved. | On occasion when participants may have found the intervention contents distressing, they either felt well supported by the group facilitator and/or their peers, or were able to access support outside of the group. |
| Participant privacy was not maintained in a way that allowed them to fully engage in the sessions. Changes unlikely to remedy this. | Participant privacy was mostly maintained but there were occasions when threats to privacy affected engagement and/or raised concern. Changes in this area could make this more acceptable in the future. | Participant privacy was maintained in a way that allowed full engagement in the sessions. |

Research objective 3: To test the feasibility of digital administration of the study outcome measures:

| Progression not indicated | Indicator for progression, with further adaptations needed | Strong indicator for progression |
|---|---|---|
| < 70% of collected outcome data found to be useable. | 70–79% of collected outcome data found to be useable. | 80%+ of outcome data (collected at baseline) found to be useable. |
Three categories of progression indicator are provided: a strong indicator for progression, progression indicated subject to further amendments, and progression not indicated. All progression criteria were agreed with the SMG, Study Steering Committee (SSC) and funder as part of the approvals of the amended protocol.
In addition to the above, the risk of exclusion from Digital STORM due to technical issues was considered, and a target was set of no participant missing more than two sessions, or parts thereof, due to technical problems (other than brief connectivity problems not exceeding 10–15 minutes).
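As an illustration of how the objective 3 progression criterion and the technical-exclusion target above could be operationalised, a minimal Python sketch follows. The thresholds are taken from the text; the function names and example figures are hypothetical.

```python
"""Sketch: classify data usability and check the technical-exclusion target."""


def data_usability_category(usable: int, collected: int) -> str:
    """Classify the proportion of usable outcome data against the criterion."""
    pct = 100 * usable / collected
    if pct >= 80:
        return "strong indicator for progression"
    if pct >= 70:
        return "progression indicated, with further adaptations needed"
    return "progression not indicated"


def technical_target_met(sessions_missed: dict) -> bool:
    """Target: no participant misses more than two sessions (or parts thereof)
    due to technical problems, beyond brief connectivity issues."""
    return all(missed <= 2 for missed in sessions_missed.values())


if __name__ == "__main__":
    # Hypothetical figures for illustration only.
    print(data_usability_category(usable=19, collected=22))
    print(technical_target_met({"P01": 0, "P02": 1, "P03": 2}))
```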
Adaptation of STORM for digital delivery
Intervention Adaptation Group
In October 2020, the IAG was established to oversee the adaptation of the STORM intervention. The IAG consisted of our four PPI advisors (STORM expert advisors), the independent co-chair of the PPI group (CB), three group facilitators (from our stakeholder group), two research and impact staff from Mencap (as our intervention delivery partner), members of the research team (KS, LR, ER, MO, KD) and the director and a member of staff of Unthinkable Digital. Digital inclusion experts Unthinkable Digital were commissioned to work with the team because they brought expertise in digital inclusion and learning design specific to people with intellectual disabilities. Their particular knowledge and skills were deemed important to ensure that the adapted digital version of STORM would be optimised for the engagement of people with intellectual disabilities and ease of delivery for facilitators who would all be relatively new to online group facilitation.
Attention to potential risks
In adapting the intervention, careful consideration was given to potential risks to ensure every effort was made to mitigate these. The risks were broadly categorised as falling within the following three areas. Firstly, potential risks to acceptability and inclusion arising from technical barriers to accessing online group meetings due to limited technology/digital equipment (e.g. phone/laptop/tablet), and/or internet connectivity problems (in recognition of the digital divide highlighted in the literature, see Chapter 1).
Secondly, we were concerned about potential risks to confidentiality and privacy while exploring personal and sensitive issues, particularly when participants join web-based meetings from remote environments where threats to privacy can be less controlled than in F2F meetings.
Thirdly, risks related to the facilitator’s ability to monitor and manage any potential emotional distress during the intervention, for example, while sharing information and videos via the screenshare feature in Zoom or MS Teams. Similarly, ensuring group members would feel supported and/or able to access support was seen as very important in the context of digital delivery of the intervention.
Ensuring access and engagement with Digital STORM
Given the above aims and orientation to risks, the IAG reviewed the following aspects of the STORM intervention and made recommendations to the research team regarding necessary adaptations to the manual and other resources:
- intervention format and structure, including changes to group dynamics and interactions when moving from a F2F to a virtual environment, and mechanisms for managing potential challenges to participants’ right to privacy
- intervention content, including how to manage demands on group facilitators tasked with delivering Digital STORM
- delivery mechanisms and resources, including how to ensure participants had access to and familiarity with technology to be used as part of Digital STORM
- participant needs for support, including the role of carers/family members in supporting participants to engage with the intervention.
Outcomes of the Intervention Adaptation Group meetings
The IAG met four times, at monthly intervals, between November 2020 and February 2021. Each meeting focused on one or several of the topics outlined above. During the meetings, the IAG was able to draw on community assessments already conducted by the research team to identify key considerations in adapting the STORM intervention for digital delivery. Also informing the IAG’s discussions, ideas and plans were insights generated through a web survey disseminated with support from our partner organisations. The survey sought to understand what facilitators had learnt regarding what works in making the shift from F2F to web-delivered activities, including how to prepare and support access to and engagement with web-based meetings and manage privacy concerns (further details about the survey are presented below and in Chapter 3).
A summary of the key recommendations for adaptation from the IAG meetings is provided in Table 3.
Intervention structure
- Number and timings of sessions
- Group sizes

Intervention content
- Videos

Delivery mechanisms and resources
- Zoom/MS Teams or other web-based meeting platforms
- White board/flip chart notes
- Communication cards
- Posters for key messages and worksheets
- Manual and Wiki

Managing threats to privacy
- Facilitator's role in managing privacy

Participants’ need for support
- Monitoring for emotional distress and how to manage this during sessions
Digital STORM intervention
Logic model
A logic model for the Digital STORM intervention is shown in Figure 3. The model was somewhat refined from the original version, which pertained to the F2F version of STORM, to reflect the adaptations made for digital delivery.
Digital STORM facilitator training and supervision
Facilitator training
In addition to being provided with the intervention manual (paper and electronic copy), access to the Wiki and other Digital STORM resources, facilitators received brief training to deliver the intervention. The training consisted of two 2-hour sessions, jointly delivered by the Study Manager (LR) and a member of Mencap’s research and impact team (HB) via Zoom. Session 1 provided an overview and background to Digital STORM and an introduction to the resources. A demonstration of the Wiki was provided, as well as an overview of the structure of the Digital STORM sessions and key messages. Guidance was shared about managing the technology involved, followed by a discussion of possible solutions to problems that might present in the course of delivering Digital STORM. Session 2 was designed to be more interactive in nature, providing the opportunity for facilitators to ask questions about Digital STORM resources and offering advice on preparing and delivering Digital STORM sessions. Further detail and discussion around action planning was included, as was detail about the supervision and support available.
Supervision for facilitators
Regular one-to-one supervision sessions with HB were offered to each facilitator throughout the Digital STORM intervention delivery, in addition to ad hoc support via e-mail correspondence. The supervision sessions were designed to be informal in nature, hosted via Zoom or MS Teams (depending on facilitator preference). Each facilitator was offered a ‘check in’ session before they delivered the first session. This was to ensure that they felt prepared and had the opportunity to ask any final questions before starting delivery of the intervention. The other supervision sessions were offered shortly after each STORM session; attendance was not mandatory.
Digital STORM pilot phase
The pilot evaluation of the Digital STORM intervention focused on issues of feasibility and acceptability. As described in Objectives, this study set out to evaluate:
- recruitment (of organisations, facilitators and participants), retention and adherence (participants);
- intervention fidelity;
- feasibility of intervention delivery and acceptability.
Recruitment and retention to the pilot study
Sample size
Following the adaptation phase, the Digital STORM intervention was piloted with four groups (N = 22), with the group size ranging from three to seven members. The smaller group size (compared to the 5–10 recommended for the original intervention) was determined by feedback received during our stakeholder consultations that smaller group sizes work better in web meetings.
Inclusion and exclusion criteria for groups and participants
The feasibility of recruitment was established through identifying group facilitators and consenting group members who met the study inclusion criteria and did not meet the exclusion criteria. A record of the screening and recruitment processes and outcomes was kept in a database developed and maintained by the research team.
Groups were eligible to take part in the Digital STORM pilot if they:
- were in place already (although group meetings might have been disrupted due to COVID-19 and associated social restrictions);
- intended to continue or restart meeting as a group for at least 3 further months;
- had at least three and no more than eight members with intellectual disabilities who wished to participate in the intervention;
- were willing to replace five of their usual meetings with Digital STORM;
- had a group facilitator willing to receive training and facilitate the Digital STORM intervention in digital form and in line with the manual;
- had organisational support to deliver the study intervention.
Groups were excluded if:
- they were run as part of the NHS;
- some of their regular members declined taking part in Digital STORM and it was not possible to find alternative meeting times to run Digital STORM (in the event, this did not occur during the pilot).
Participants were included if they:
- were aged 16+ years;
- had a mild-to-moderate intellectual disability as defined by an administrative definition (use of services for people with intellectual disabilities);
- were able to communicate in English;
- were a member of an established group for people with intellectual disabilities (educational, activity, social or self-advocacy focused);
- were able to complete the outcome measures (with or without support) and engage with the STORM intervention;
- had access to the internet and a device that could access web meetings, be this via Zoom, MS Teams or Google Meet;
- were able to access support to access web-based meetings, where needed;
- had capacity to provide informed consent to participation in the study.
Screening groups and participants for eligibility
Facilitators of groups were sent a study information leaflet (see Appendix 1). After they and their group expressed an initial interest in the Digital STORM project, a researcher contacted facilitators individually as part of the screening process to check eligibility. Questions were asked about the usual activities of the group, including how well established their online meetings were, the resources available to them (including digital equipment), group members’ ages and ability to join in discussions, and the support available to them (including from carers, and supporters within and beyond the participating organisation). The screening questions were guided by the inclusion and exclusion criteria listed above.
Obtaining informed consent from participants
All potential participants from the identified groups were provided with study information in an accessible (Easy Read) information sheet (this can be accessed at www.storm-ucl.com/). Potential participants had the option of receiving the information sheet by e-mail or post. A researcher also attended one of the groups’ usual (virtual) meetings to present the information to group members and answer any questions they might have, using Zoom or MS Teams, whichever software was familiar to the group. The researcher presented the information verbally while also showing the information sheet using the screenshare function.
Group members were given at least 24 hours to consider the study information before making a decision and informing the group facilitator of this. As part of confirming their interest in participating in the study, group members provided their contact details to the group facilitator to pass on to the researcher. Once this information was provided to the research team, a researcher made contact with group members to set up individual meetings using the online platform they were familiar with.
At the initial individual meetings with potential participants, the researcher shared and read through the information sheet again, checking whether the information was understood by the participant, further explaining anything that the participant appeared unsure about and answering any questions they might have. The researcher proceeded to share the consent form via screenshare and talked through each information point, checking the participant’s retention and understanding. The participant’s verbal responses of ‘yes’/‘no’ to each item on the consent form were recorded (using end-to-end encryption, in line with data protection regulations). Participants were also asked to state their name and the date for the purposes of recording their verbal consent. Recordings of verbal consent were stored on a University College London (UCL) secure research drive and separate from all other data collected.
Retention, attendance and adherence
Retention, attendance and adherence were assessed by group facilitators recording participants’ attendance at each session using a bespoke attendance and feedback form (see Appendix 5). The form captured attendance at each session and included a rating by the facilitator for each participant of whether technical issues affected their engagement with the session. Ratings were on a three-point scale: (1) no/minimal issues for the person (did not unduly impact on running the session or their engagement), (2) some issues affecting the person’s engagement in the session (e.g. missed up to 15 minutes) or (3) significant issues affecting the person’s presence/engagement (e.g. missed more than 15 minutes). There was space for facilitators to make notes about how the technical issues affected the person. Additional space was provided with prompts for facilitators to comment on: (1) what went well during the session, (2) challenges to delivering the session as planned and (3) whether anyone had become unduly upset during the session due to the session content.
Facilitators were asked to complete the form after each session and to send it to the intervention partner (HB), so as to facilitate an understanding of the experiences and to inform discussions in supervision. Following supervision sessions, the completed document was sent to the Study Manager (LR) with any additional supervision notes added. In case of any missing attendance and engagement data, session recordings were used to assess what proportion of sessions participants were able to attend (to account for participants dropping out at any point, either due to poor internet connectivity or for other reasons).
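The following minimal sketch illustrates how attendance and technical-issue ratings captured on such a form might be summarised against the attendance criterion. The record structure and example data are hypothetical; only the three-point rating scale follows the description above.

```python
"""Sketch: summarise attendance and technical-issue ratings per participant."""
from collections import Counter
from statistics import mean

# Hypothetical records: one entry per participant per session.
records = [
    {"participant": "P01", "session": 1, "attended": True, "tech_rating": 1},
    {"participant": "P01", "session": 2, "attended": True, "tech_rating": 2},
    {"participant": "P02", "session": 1, "attended": False, "tech_rating": None},
    {"participant": "P02", "session": 2, "attended": True, "tech_rating": 1},
]

# Sessions attended per participant and the group's average attendance.
sessions_attended = Counter(r["participant"] for r in records if r["attended"])
participants = {r["participant"] for r in records}
average_attended = mean(sessions_attended.get(p, 0) for p in participants)

# Distribution of technical-issue ratings across attended sessions.
rating_distribution = Counter(r["tech_rating"] for r in records if r["tech_rating"])

print(f"Average sessions attended: {average_attended:.1f}")
print(f"Technical-issue ratings: {dict(rating_distribution)}")
```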
Intervention fidelity
All Digital STORM sessions were video recorded to allow ratings of the intervention’s fidelity. Recordings were made using the record function available in Zoom/MS Teams. To enable recordings to be saved to local UCL computers/secure drives and to avoid more complex data-sharing procedures, a researcher (MO) set up the meetings in the relevant app and sent the joining details to facilitators to share with their group members. MO would then start the meeting, start the recording and share hosting control of the meeting app with the facilitator. She would then turn off her video, microphone and sound, would not interact with or observe the group during sessions, and informed group members that this was the case. Recordings of Zoom meetings were end-to-end encrypted via Zoom’s recording function, while recordings via MS Teams were stored in the secure UCL cloud.
To assess fidelity to the Digital STORM manual, a checklist of core requirements was developed using a coding frame (Microsoft Excel spreadsheet) for (1) adherence to the manual, (2) group process and (3) facilitator engagement with group members (see Appendix 6). The checklist was used to rate the video recordings of sessions to determine whether core elements were definitely present/somewhat present/absent. The checklist was adapted from an existing fidelity instrument developed for group interventions and by taking into account the particular social and communication needs of people with intellectual disabilities. 51 Fidelity ratings were completed by the chief investigator (CI) (KS) and a second rater (KD). Both independently rated a session and discussed points of divergence to achieve a high level of reliability in their ratings. The data manager (SJ) randomised the remaining recordings, to identify session 1 or 2 and session 3 or 4 to be rated for each of the four groups. All follow-up sessions were rated. Rater 1 (KD) rated nine sessions across three groups and rater 2 (KS) rated three sessions from one group. Both raters kept a reflective log while rating the session recordings. Once finalised, the data were sent to the data manager (SJ) for analysis. Quantitative analyses were employed to explore the variation of fidelity scores across groups and the association between fidelity scores and outcomes. Furthermore, the recordings were used alongside the facilitators’ attendance records, notes and log entries to identify what challenges to implementing Digital STORM as planned arose and how these were managed, in order to identify the need for further revisions to the manual and/or delivery mechanisms ahead of a feasibility study.
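As a hedged illustration of how checklist ratings of this kind can be turned into a per-session fidelity score, a short Python sketch follows. The 2/1/0 numeric coding and the example checklist items are assumptions, not the study's actual scoring rules.

```python
"""Sketch: compute a per-session fidelity score from checklist ratings."""

# Assumed numeric coding of the three rating categories described above.
RATING_SCORES = {"definitely present": 2, "somewhat present": 1, "absent": 0}


def fidelity_score(ratings: dict) -> float:
    """Return the percentage of the maximum possible score for one session."""
    total = sum(RATING_SCORES[r] for r in ratings.values())
    return 100 * total / (2 * len(ratings))


# Hypothetical checklist items and ratings for a single session.
session_ratings = {
    "key message introduced": "definitely present",
    "video clips shown and discussed": "definitely present",
    "all members invited to contribute": "somewhat present",
    "facilitator checks emotional responses": "definitely present",
}
print(f"Fidelity: {fidelity_score(session_ratings):.0f}%")
```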
Feasibility of intervention delivery and acceptability – qualitative methods
All Digital STORM participants were invited to participate in a virtual focus group interview with their fellow group members. STORM facilitators were invited to a virtual individual interview. Views were sought on study participation, barriers/facilitators to the intervention’s implementation, as well as potential future improvements to the content or delivery of Digital STORM sessions (see Appendix 2). During the interviews, we also asked about the perceived value, benefits and harm or unintended consequences of the Digital STORM intervention to develop a full understanding of the likely mechanisms of change and to ensure these are fully measured in a full study.
Both focus groups with participants and individual interviews with facilitators were conducted via Zoom or MS Teams, in line with their preferences. Recordings were made using the record function available in Zoom/MS Teams and later transcribed using Otter for qualitative coding and analysis.
Three of the focus groups were co-facilitated by a peer researcher from the STORM Expert Advisory Panel (HR) and a researcher (MO), and one was conducted by a researcher (MO) alone. The focus group meetings took place following the final STORM session and once all post-intervention data had been collected. Group facilitators were also present for the discussions. Not all group members were present for the full duration of the focus group meetings, with some arriving late, leaving early, or being absent altogether (full details in Appendix 7). Semistructured interviews were conducted with the four group facilitators either by the CI or MO. Both facilitator interviews and focus groups were conducted via a video call platform (Zoom or MS Teams), with facilitator interviews lasting 40–60 minutes and focus groups lasting up to 90 minutes.
Questions were developed to obtain participants’ and facilitators’ views on the feasibility and acceptability of the Digital STORM intervention, informed by the progression criteria set out in Chapter 4, Adapting and piloting the existing STORM intervention for online delivery. This was supplemented with questions to capture the impact of Digital STORM on participants (see Appendix 2). The interviews and focus group recordings were transcribed using the Otter artificial intelligence software and anonymised by a researcher (MO) prior to analysis.
Qualitative analyses
Framework analysis52,53 is a method of thematic analysis which is widely used in health research and is an appropriate approach when multiple researchers are working together. 54 This analytical method was applied to the qualitative data collected via interviews with facilitators and focus groups with group members. In applying this method, both inductive and deductive approaches were used. This allowed the identification of themes that address the progression criteria for the study, as well as other unanticipated themes generated through unrestricted coding of the data. As such, themes were formulated that related to the feasibility and acceptability of the Digital STORM intervention, as well as observations by group members and facilitators that pointed to wider aspects or suggestions for improvement to be held in mind.
Following familiarisation with the transcripts, two researchers (LR and MO) independently rated the same transcripts until any differences in coding had been discussed and resolved. Thereafter the focus group data were coded by one researcher (MO). Facilitator interviews, together with facilitator session and supervision notes, were coded by the Study Manager (LR). Coding involved line-by-line reading of the transcript and applying a ‘code’ describing that passage of text. Initial codes related to issues of feasibility, acceptability and perceived impact of the Digital STORM intervention and were refined through initial open coding. This process was undertaken in MS Word, using the comment function to highlight passages of text and apply a code. A working analytical framework was developed from the progression criteria and further categorisation of the codes identified and agreed on by both researchers in discussion with the CI (KS). Passages of data from transcripts were then charted into a framework matrix using a table in MS Word. During the charting, a content count was made for specific areas of interest, where these would help address questions in the process evaluation or enable the team to report results relating to the progression criteria (it is recognised that producing counts is not typically an outcome of framework analysis). 54 The charted data were then compared and contrasted to support the interpretation of the data and development of themes and subthemes.
Testing digital administration of outcome and economic evaluation measures
The third objective was to test digital administration of the study’s outcome and economic evaluation measures undertaken at baseline and post intervention for all participants. All procedures for supporting remote administration of study measures were finalised as part of the adaptation phase. Concurrently with the adaptation of the intervention, the research team reviewed materials and procedures for supporting administration of study outcome measures using video meeting platforms and the web-based survey platform QualtricsXM.
Health and social outcome measures
The health-related and social outcome measures used in the pilot were as follows (the combined measures as presented in QualtricsXM are presented in Appendix 3):
-
Mental well-being measured using an adapted version of the Warwick-Edinburgh Mental Wellbeing Scale (WEMWBS). 55 The WEMWBS is a 14-item scale validated for adolescents aged 13+ and adults. The scale was adapted by the research team by amending the reference period from 2 weeks to 1 week; simplifying the wording of some items, for example, changing ‘optimistic’ to ‘hopeful’ and turning gerunds to simple past tense forms; and reducing the response scale from a five- to a four-point scale.
-
Self-esteem, measured using a six-item version of the Rosenberg self-esteem scale, validated for people with intellectual disabilities. 56
-
Self-efficacy in rejecting prejudice (SERP), single self-rated item used in our pilot: ‘At this moment, how confident do you feel about standing up to prejudice?’, rated on a four-point scale (‘not at all confident’ to ‘very confident’).
-
Reactions to Discrimination (RtD) four-item subscale of the Intellectual Disabilities Self-Stigma Scale, measuring emotional reactions to stigma in people with intellectual disabilities. 57
-
Sense of Social Power: an adapted four-item version of the Sense of Power Scale, not yet validated for people with intellectual disabilities. 58
All of the above scales were rated on a four-point Likert response scale; apart from the SERP, the response options were ‘never’, ‘sometimes’, ‘often’ and ‘always’. The Likert scale for all items was supported by a pictorial representation of the rating scale.
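For illustration only, and assuming each scale is scored as the simple sum of its item responses (a common convention for Likert-type measures such as the WEMWBS), a minimal sketch of the scoring logic is shown below; the numeric coding and the example item responses are hypothetical and do not reflect study data.

```python
# Minimal scoring sketch (illustrative only): assumes each adapted measure is
# scored as the sum of its four-point Likert item responses.
LIKERT = {"never": 1, "sometimes": 2, "often": 3, "always": 4}

def score_measure(responses):
    """Return the total score, or None if any item is missing or unrecognised."""
    if any(r not in LIKERT for r in responses):
        return None  # form treated as incomplete in this simple sketch
    return sum(LIKERT[r] for r in responses)

# Hypothetical participant responses to the 14 adapted WEMWBS items
wemwbs_responses = ["sometimes", "often", "often", "always", "sometimes", "often",
                    "never", "sometimes", "often", "often", "always", "sometimes",
                    "often", "sometimes"]
print(score_measure(wemwbs_responses))  # total lies between 14 (minimum) and 56 (maximum)
```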
Measures for economic evaluation
The measures for economic evaluation used in the pilot were as follows:
-
A Service Information Schedule (SIS) was developed to capture comprehensive costs associated with the Digital STORM intervention. Information was sought on staff salaries, on-costs, overheads, training costs, materials and postage costs (where applicable). Data were collected from the research team on the adaptation of STORM for online delivery, from the delivery partner on the provision of training, supervision and materials, and from facilitators on time spent preparing for delivery, feedback and follow-up, and materials.
The following information was required on staff involved in intervention-related activities:
-
profession/job title (to approximate salaries if missing)
-
salary (annual or hourly)
-
grade where applicable (to approximate salaries if missing)
-
on-costs (employer pension and National Insurance contributions)
-
overheads
-
number of hours spent on each task.
The SIS was completed by intervention providers in collaboration with the research team and further explored during qualitative interviews with facilitators.
-
EuroQol-Youth (EQ-5D-Y),59 a self-report measure of health-related quality of life across five domains, each rated on a three-point scale, was administered. The EQ-5D-Y is a version of the EQ-5D-3L aimed at children and adolescents; the main difference is in the wording of the questions, which aims to make the measure more accessible for this population. The EQ-5D-Y allows for the calculation of quality-adjusted life years (QALYs), a common measure in health economic evaluation (a worked sketch of how QALYs can be derived between two assessment points is given after this list).
-
Client Service Receipt Inventory (CSRI),60 a self-report measure of use of services and supports. A short version covering a retrospective 3-month period was developed and adapted for use with input from the PPI Advisory Group. Participants were asked to provide information about contacts with general health services, mental health services, third sector organisations and education support as well as informal help received from supporters/carers and friends (see Appendix 4).
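As a worked illustration of the QALY calculation referred to in the EQ-5D-Y item above, the sketch below assumes the common area-under-the-curve (trapezoidal) approach between two assessment points; the utility values and the 84-day interval are purely illustrative, and the mapping from EQ-5D-Y responses to a utility index (via a value set) is taken as given.

```python
# Hedged illustration of QALY calculation between two assessments using the
# area-under-the-curve (trapezoidal) approach. Utility values are assumed to
# have been derived from EQ-5D-Y responses via a value set.
def qalys_between_assessments(utility_baseline, utility_followup, days_between):
    """QALYs accrued between the baseline and follow-up assessments."""
    years = days_between / 365.25
    mean_utility = (utility_baseline + utility_followup) / 2
    return mean_utility * years

# Example: utility 0.75 at baseline and 0.80 roughly 3 months (84 days) later
print(round(qalys_between_assessments(0.75, 0.80, 84), 3))  # about 0.178 QALYs
```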
Scoping the acceptability of using data linkage in future studies
As part of the post-intervention data collection, we also asked participants questions to gauge their willingness to consent to researchers accessing routine data concerning their health, education and social care use via data linkage, to inform the methods to be used in a possible future study.
The question asked of participants was phrased as follows:
Instead of asking you lots of questions, another way for researchers to get this information is from your records. For example, asking your doctor for this information from your medical record. Do you understand this?
If we’d asked to get this information from your medical records, how happy would you be about this?
The response options given were:
-
I would be happy for researchers to ask for this information from my records.
-
I would not be happy with this. I prefer they ask me.
-
I am not sure.
Assessment time points
A schedule of enrolment and assessments for group members at baseline and post intervention is provided in Table 4. The intervention commenced 1–20 days after baseline assessments. All baseline data were collected by the end of April 2021, prior to participants undertaking session 1 of the intervention. Post-intervention data were collected approximately 3 months after baseline (range 10–12 weeks). All post-intervention data were collected following session 5 and before the qualitative focus groups and interviews took place (by the end of June 2021).
Time point | Screening | Baseline | Post intervention |
---|---|---|---|
Informed consent | X | | |
Demographics | X | | |
Assessment against inclusion criteria | X | | |
Mental well-being | | X | X |
Self-esteem | | X | X |
SERP | | X | X |
RtD | | X | X |
Sense of Social Power | | X | X |
EQ-5D-Y | | X | X |
CSRI | | X | X |
Procedures for digital administration of study measures
Following informed consent, baseline measures were administered to groups 1–3 by two research assistants (MO, KD), and to group 4 by two Doctorate in Clinical Psychology trainees supervised by the CI and Study Manager. All research staff carrying out data collection were appropriately qualified and completed relevant training. Clarification of any questions relating to STORM that arose during sessions was referred to the CI or Study Manager.
Administration of all measures followed an assessment manual, which allowed for some flexibility depending on participants’ support needs. Items were read by the researcher one by one and visual supports and practice items were used to support understanding and familiarisation. Participants’ responses were entered directly into the study QualtricsXM database via an online link, set up by the data manager at the clinical trials unit (SJ). Data collection sessions were audio-recorded using the record function available in Zoom/MS Teams. Participants provided consent for the recordings to be used for potential training of researchers in the future.
After completing the baseline measures, all participants were asked three questions to assess the acceptability of collecting measures via video meeting platforms:
-
What did you think about doing the interview?
-
Would it be ok for others to do this?
-
Is there anything we could do to make it better?
Participants received a £10 retail voucher for a retailer of their choice at each assessment point in recognition of the time taken to complete the measures.
Outcome data analyses
Descriptive statistics were used to summarise baseline demographics and responses on the proposed outcome measures; all statistical methods are therefore descriptive in nature. Significance tests are not reported because this is a pilot study and not a test of the intervention. Categorical data are presented using counts and percentages, and continuous data using means and standard deviations or medians and interquartile ranges, as appropriate. For each outcome measure, the number and percentage of useable forms (those in which the participant had completed at least one item) are reported. Baseline data completeness results are reported in relation to the progression criteria described in Progression criteria related to study objectives.
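As an illustration of this descriptive approach, a minimal sketch in Python/pandas is shown below; the variable names and toy data are hypothetical and do not reflect study data, and no significance tests are computed.

```python
import pandas as pd

# Hypothetical toy data standing in for the baseline dataset
df = pd.DataFrame({
    "gender": ["Female", "Male", "Female", "Female"],
    "age": [27, 34, 37, 21],
    "wemwbs_total": [37.0, None, 42.0, 30.0],  # None = no items completed (unusable form)
})

# Categorical data: counts and percentages
gender_counts = df["gender"].value_counts()
gender_pct = (gender_counts / len(df) * 100).round(1)

# Continuous data: mean (SD) or median (IQR), as appropriate
age_mean, age_sd = df["age"].mean(), df["age"].std()
age_median = df["age"].median()
age_iqr = (df["age"].quantile(0.25), df["age"].quantile(0.75))

# Useable forms: number and percentage of participants with at least one item completed
useable_n = df["wemwbs_total"].notna().sum()
useable_pct = round(useable_n / len(df) * 100, 1)

print(gender_counts.to_dict(), gender_pct.to_dict())
print(f"Age: mean {age_mean:.1f} (SD {age_sd:.1f}); median {age_median:.1f} (IQR {age_iqr[0]:.1f}-{age_iqr[1]:.1f})")
print(f"WEMWBS useable forms: {useable_n}/{len(df)} ({useable_pct}%)")
```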
Economic evaluation analyses
The feasibility of economic evaluation was assessed using rates of completion of information about the cost of the intervention (SIS), rates of completion of information about access to formal and informal sources of support (CSRI) and rates of completion of the EQ-5D-Y. In addition, an attempt was made to compare index values with those reported in other studies that have used the youth version of the measure.
A comprehensive intervention cost for Digital STORM was calculated based on SIS data, including information on staff salaries, on-costs, training costs, materials and travel time.
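As an illustration only, the sketch below shows how a per-group delivery cost might be assembled from SIS-style inputs (staff time, salary, on-costs, overheads, training and materials); all figures, rates and the annual working-hours divisor are hypothetical and do not reflect study data.

```python
# Illustrative cost build-up from SIS-style inputs. All numbers are hypothetical.
WORKING_HOURS_PER_YEAR = 1620  # assumed divisor for converting an annual salary to an hourly rate

def hourly_staff_cost(annual_salary, oncost_rate=0.25, overhead_rate=0.20):
    """Hourly cost including employer on-costs and overheads (rates are assumptions)."""
    return annual_salary * (1 + oncost_rate + overhead_rate) / WORKING_HOURS_PER_YEAR

facilitator_hours = 12        # preparation, delivery, feedback and follow-up
trainer_hours = 4             # training and supervision provided by the delivery partner
materials_and_postage = 35.0  # booklets, printing and postage

total_cost = (facilitator_hours * hourly_staff_cost(24_000)
              + trainer_hours * hourly_staff_cost(32_000)
              + materials_and_postage)
print(f"Illustrative cost per group: £{total_cost:.2f}")
```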
The proportion of returned CSRIs and the proportion of questions completed at each time point are reported. The proportion of participants reporting contacts with a given service was examined to determine whether it would be feasible, in a full trial, to assess cost-effectiveness from (1) a public sector perspective or (2) a wider societal perspective.
Establishing usual practice
The Template for Intervention Description and Replication (TIDieR)61 guidance for trials emphasises the importance of a complete published description of interventions to enable implementation and research replication. Having a complete description of a comparator is good practice for the same reasons. Therefore, objective 4 for this study was to describe what usual practice (UP), the potential comparator for a future trial, might look like for groups of people with mild-to-moderate intellectual disabilities in the wake of the COVID-19 pandemic. To achieve this objective, during the adaptation period, information was sought from third sector and education providers to ascertain how the pandemic had affected the offer and format of group meetings for people with intellectual disabilities and what UP looked like in this new context. A survey was designed to explore UP in groups for people with intellectual disabilities run by third sector, education and social sector organisations, both before and since the onset of the pandemic (see Appendix 8). The facilitators were asked to provide details about the nature of the activities their groups were engaged in before and since the COVID-19 pandemic (where they had adapted to meeting online). We also sought to understand how facilitators and group members were adapting to meeting and engaging with their groups on digital platforms, to inform the intervention’s digital adaptation. The survey, hosted on the web survey platform QualtricsXM, included questions about the type of group activities, who delivers them, mode of delivery, group size and details of changes to UP.
An invitation to complete the survey was disseminated between November 2020 and February 2021. Organisations initially contacted with an invitation to complete the survey were known to the research team, including partner organisations [Mencap, Foundation for People with Learning Disabilities (FPLD) and People First Dorset]; groups that had taken part in the original STORM pilot; or organisations that had expressed interest in the feasibility randomised controlled trial (RCT) in January 2020. Additional organisations with large networks were contacted via e-mail, such as Learning Disability England and Choice Forum; these advertised information about the survey via e-mail to their networks. The survey was also advertised on social media.
Data management
Outcome data and economic evaluation data were collected using a web-based QualtricsXM database. The data were saved into a secure, encrypted bespoke online database at the Centre for Trials Research (CTR). This secure encrypted system was accessed by username and password (restricted only to those who needed direct access) according to CTR standard operating procedures (SOPs) and in compliance with Good Clinical Practice and the General Data Protection Regulation (GDPR). Back-up paper case report forms (CRFs) were available in the event the web-based system was not accessible. Fidelity checks and data cleaning were performed as detailed in the data management plan. Following data cleaning, the database was shared with the statistician (MW) and health economist (EB) for analysis. A full data management plan was signed off prior to data collection.
Safety reporting
Procedures were put in place for the reporting of any serious adverse event (SAE) within 24 hours of knowledge of the event to the study team. There were no adverse events (AEs) or SAEs for STORM. Any planned treatments that participants were already receiving at the start of the study were not considered as AEs or SAEs.
Trial registration, governance and ethics
The study was funded by the NIHR PHR programme (17/149/03). The trial protocol was registered with Current Controlled Trials (ISRCTN16056848) and published as an amended version. The adaptation and pilot protocol was approved by the Centre for Trials Research, all co-investigators, the facilitator and STORM expert advisors (PPI), the wider SMG, the SSC, and the funder. The study was overseen by the SSC, comprising an independent chairperson and six further independent members, including two self-advocates with intellectual disabilities, supported in the SSC role by FPLD, an organisation they had previously worked with. The SSC met four times in total to review the study progress, methods and management.
The study was approved by the UCL Research Ethics Committee prior to recruitment and data collection commencing (Project ID 0241/005). An amendment to the study methods was approved by the committee before the start of the pilot of the adapted Digital STORM intervention.
Patient and public involvement and stakeholder involvement
Our reporting of the patient and public involvement (PPI) for this study follows the GRIPP2 reporting checklist.
Patient and public involvement for the project overall consisted of an advisory group of four expert advisors with intellectual disabilities. A separate stakeholder group of experienced group facilitator advisors from the third and education sectors was also involved in advising the project team throughout the study. Both groups were familiar with the STORM programme through involvement or participation in an earlier pilot study of STORM. Representatives from the STORM Expert Advisor Panel (SEAP) and all facilitator advisors were invited to SMG meetings. Two STORM experts initially represented SEAP at each SMG meeting and were provided with additional support for this. Following the outbreak of the pandemic, it was not possible to sustain this model in a meaningful and inclusive way and instead, CB represented the views of SEAP at the SMG and provided feedback to SEAP in turn. Members of the stakeholder group also attended SMG meetings. Both groups attended the IAG. Finally, as noted, there were two independent representatives with intellectual disabilities on the SSC. Figure 4 depicts the various PPI and stakeholder groups and their relationships.
The patient and public involvement advisory group
The PPI advisory group consisted of three men and one woman with intellectual disabilities, all of whom were advisors during the earlier development and initial piloting of the STORM programme (hereafter referred to as STORM experts). The group named themselves the SEAP. SEAP meetings were co-chaired on a rotational basis by Christine-Koulla Burke (CB), Director of one of our partner organisations, the FPLD, and one of the STORM experts with intellectual disabilities. The meetings were also attended by the Study Manager and Research Assistant to present materials and matters for discussion and feedback to the group.
The aims for PPI in the current project were:
-
to support the inclusion of people with intellectual disabilities in research that matters to them;
-
to bring to the team lived experience and knowledge of both intellectual disabilities and stigma. The team sought to engage with this experience and knowledge to:
-
shape the participation of organisations and people with intellectual disabilities in the research;
-
ensure the accessibility of all study materials and processes for potential participants in the research (e.g. information sheets, consent forms);
-
adapt outcome measures where relevant and advise on appropriate data collection processes.
-
Following agreement of the amended protocol to adapt the STORM programme for digital delivery, the aims of the PPI evolved to include:
-
co-design of Digital STORM resources and approaches through participation in an IAG, details of which are provided above.
Patient and public involvement activities
Early SEAP meetings were used to re-familiarise the STORM experts with the STORM programme, discuss the PPI plans, clarify roles and to provide training (e.g. on PPI and research methodology relevant to this study, such as RCTs). The panel discussed and agreed how they wished meetings to be run, including sharing responsibility for co-chairing meetings. To enable this, the co-chairs (i.e. one of the STORM experts and CB) would meet 30 minutes ahead of each meeting to run through the Easy Read agenda and discuss preferences for co-chairing the respective meeting. The co-chairs had an annotated version of the agenda with small prompts (in red text) to help them in the role. For each meeting, a folder with a record of new words and concepts that arose (which might otherwise be considered jargon) was available to support everyone to work together. Easy Read minutes were also produced. At the start of each meeting, the research team provided feedback verbally on actions taken in response to previous input from the STORM experts. At the end of each meeting, dedicated time allowed everyone to reflect on how the meeting had gone. This allowed us to adjust how we worked and to put additional support in place where necessary. We continued to meet with the STORM experts via Zoom following the COVID-19 pandemic outbreak. In addition to Zoom meetings, the research team consulted individual advisors directly by telephone.
To check the materials and processes developed with the STORM experts, the project team made presentations to the FPLD Advisory Group and a group of Mencap Research Champions (a group of Mencap employees with intellectual disabilities who advise on research at Mencap). The team provided a taster of the first STORM session to the Mencap Research Champions to gauge views about presenting intervention resources in the planned format (the STORM Wiki, see Chapter 1). This group also commented on the draft project information sheets to check understanding, especially around the explanation of the RCT methodology. Information about STORM outcome measures was reviewed with both the Mencap Research Champions and the FPLD Advisory group members to ensure wording, practice items and visual supports were easy to understand and accessible. A representative from the FPLD Advisory group also worked with the team allowing them to pilot the online delivery of the study measures. These groups’ comments and feedback were presented back to the STORM experts who agreed the final amendments to be made by the research team.
Stakeholder group involvement
A group of experienced group facilitators from the third sector and education providers was established. The group met at key points during the study. These meetings were more informal and were co-led by the CI and Study Manager. Initially, this group provided input on plans for recruitment, study materials and procedures. As the work evolved to co-designing Digital STORM, the model of separate meetings ceased and all advisors joined a larger IAG. Members of this stakeholder group were also invited to all study steering group meetings and regularly provided input in this forum.
Outcomes from patient and public involvement and stakeholder contributions
The PPI for the study led to significant influence in a number of key areas. First, considering how to help potential participants with intellectual disabilities understand the study design, all advisors shared their thoughts and frustrations in understanding the RCT design themselves and shared ideas for presenting this information to others through project information sheets. This work also led to the production of a video by SEAP members to promote the benefits of taking part in the research, no matter which arm of a trial one might be randomised to. Owing to the pandemic halting recruitment, we did not get to use these resources further.
Significant contributions were made to the adaptation of two study measures (the WEMWBS and the CSRI). The data presented in Chapter 4 on the feasibility and acceptability of the study measures are testament to the impact of the PPI work in these measures’ development.
The PPI advisors’ input was critical to our decision-making about whether to take an extended pause to the study in response to the pandemic or to proceed to the development of a protocol for a digital adaptation of STORM and an associated pilot. The STORM experts made clear to the team and to the NIHR how important it was, in their view, to continue to tackle stigma; their own and others’ experiences of discrimination during the pandemic had brought this into stark relief in new ways. They conveyed this via a letter to the funder, and also first hand in a virtual meeting with the funder arranged to discuss the proposed adaptation for digital delivery. The STORM experts were keen to adapt to the current context and to use the study resources and expertise appropriately. Both STORM experts and facilitator advisors encouraged the team to think about delivering STORM via online meetings, which were already happening in many organisations. They noted that, for some, digital meetings were preferred and that digital options supported individuals to engage who might not do so in a F2F context.
Facilitator advisors (from the stakeholder group) made a good case to the team for having a digital option which would still have relevance beyond the pandemic, for example, allowing people living in more remote areas to join in, to support activities to continue in the winter months when engagement in person typically reduces, or to run groups at times of the day when people with intellectual disabilities may be reluctant to venture out. All advisors highlighted concerns about people who may experience digital exclusion and for this reason did not want to limit the future of STORM as a public health intervention to a digital-only context, advising us that many people will want to return to F2F group meetings in the future.
The STORM experts also worked with the team to write a chapter in a book (Hierarchies of Disability Human Rights, Routledge, Interdisciplinary Disability Studies Book Series, in press). This focused on their involvement in the STORM research and on how being involved in research and within the work of universities intersects with the human rights agenda. The group also co-wrote a blog post for BMJ Open with the CI on the reinforcement, during the pandemic, of society’s negative stereotypes of people with intellectual disabilities as ‘vulnerable’ and ‘victims’ (rather than active agents), also making reference to the STORM study and the need for more anti-stigma interventions in this field. 62
Critical reflections on patient and public involvement
Meeting regularly with the PPI group at the start of the project was important, but it proved challenging, particularly in the early stages, as developing accessible materials and ensuring that all PPI group members were adequately supported to attend meetings was very time-consuming. This eased over time once meeting materials could be reused and good systems had been developed for providing support.
Resources to support the involvement of people with intellectual disabilities as project advisors may require additional funding to allow flexibility, for example, if some advisors need more training or support than others and to be able to adapt to the different activities that advisors may want to be involved in. For example, in this project, the advisors asked to be involved in meeting other groups that we consulted but this had not been planned or costed for. In addition, advisors adopted different roles, often on a rotational basis. While this allowed for variability and broader skills development, it reduced the scope for the development of more specific and transferable skills and also proved more resource-intensive in needing to prepare and support PPI advisors for different activities and roles. Going forward, it should be considered whether individual advisors should lead different activities and assume more narrowly defined, specific roles for more complex activities. For example, one PPI advisor might adopt a governance role and represent the wider panel at SMG and other meetings; another advisor might have more of a role as a peer researcher; while another might represent the panel when the research team consults with other groups and so on.
The pandemic has been a challenging time for all of our advisors, with some experiencing its negative effects directly or, in the case of our facilitator advisors, needing to respond to the pandemic and provide support to members of their organisations in new ways, while also navigating fluctuating staffing levels (and stresses in their personal lives). This at times affected our advisors’ ability to contribute to the project and thus necessitated more flexible and less structured ways of working together. The process of working together remotely has been one of continual learning; all advisors have given us constructive feedback which has helped us work together more effectively. However, meetings in person are felt to be more beneficial for ensuring good communication and for developing trust and positive relationships.
Chapter 3 Results
Setting and group recruitment
Fourteen third sector organisations and one special education college were contacted via e-mail and telephone during February 2021 regarding their potential participation in the study. Ten of these organisations had expressed interest in participating in the original STORM feasibility RCT (during recruitment undertaken between January and March 2020). However, only one group facilitator from these 10 came forward to express interest in participating in the pilot of Digital STORM, and they later withdrew their interest due to other competing demands.
As a result, an additional four organisations that potentially met the inclusion criteria (see Objectives) were identified through our group facilitator survey and invited to take part in the Digital STORM pilot. Of these four organisations, one did not reply to our invitation and three responded with interest; these were either social or self-advocacy groups. A further organisation, an educational college (known to the research team through its participation in an earlier STORM pilot in 2017–18), expressed interest in running a Digital STORM group. The Digital STORM facilitator and participants at the college had not experienced STORM before in any format.
All four groups were screened by the UCL researcher (MO) and found to meet the study inclusion criteria (see below). As a result, three third sector organisations for people with intellectual disabilities and one special educational college were recruited to participate in the pilot study. This was achieved as planned by April 2021. As detailed in Table 5, these organisations are located in the north and south-east of England and in Wales. Including participants beyond London and the home counties was only possible due to the approach of collecting informed consent and outcome measures using digital platforms. Some of the group members from group 2 did not meet the eligibility criteria, as they were considered by the group facilitator to have more severe intellectual disabilities. For this group, the Digital STORM sessions were run at a mutually agreed alternative time, so that the usual activities and meeting time could be maintained throughout the period in which the Digital STORM intervention was delivered. Group members with more severe intellectual disabilities were therefore not excluded from meeting with their group or taking part in activities as usual.
Group number/type | Location | Facilitator | Participantsa |
---|---|---|---|
1. Social group | Bradford, England | Project Coordinator (female) | 6 |
2. Self-advocacy/social | Bridgend, Wales | Training and Development Worker (female) | 6 |
3. Self-advocacy | London, England | Learning Disabilities User Involvement Co-ordinator (female) | 7 |
4. Educational (College) | Surrey, England | College Pathway Lead (female) | 3 |
Total | | | 22 |
Facilitator training and supervision uptake
The group size at each training session varied according to facilitator availability. For the first training session, two facilitators completed the training together with both LR and HB; the other two facilitators undertook the training individually with LR. For the second training session, it was possible to bring everyone together in a joint session. While there was the opportunity to try out the STORM materials and practise as a group, none of the facilitators had enough time to take this up.
All facilitators engaged with the supervision sessions offered by the intervention partner. The average number of supervision sessions attended was three (range 2–4). All facilitators had a supervision session prior to running session 1. Thereafter, three facilitators had a supervision session after session 1, one had supervision following session 2 (instead of after session 1, as this coincided with the college half-term holiday) and one facilitator had a supervision session after session 4. Supervision sessions ranged from 15 to 50 minutes in length. After sessions 2, 3 and 4, those who chose not to have a formal supervision session exchanged e-mails with the intervention partner to discuss minor issues or share experiences.
The facilitators seemed enthusiastic and engaged throughout the time they were delivering the intervention and were in contact with HB regularly via e-mail. Discussion during supervision sessions mostly focused on practical issues related to the session content (e.g. issues relating to streaming and the format of the videos). The sessions also gave facilitators the reassurance that they were doing a good job at delivering the intervention and likely helped to maintain intervention fidelity. No issues relating to safeguarding or the well-being of group members arose during the pilot.
Baseline characteristics of participants
As can be seen from Table 6, of the 22 participants approximately two-thirds were female and the majority identified as ‘White British/white other’ (73%). The median age of participants was 34 years. Forty-one per cent reported attending mainstream school and 36% a special school. Sixty-eight per cent reported attending a self-advocacy group.
Characteristic | All participants (N = 22) |
---|---|
Age in years: median (IQR) | 34 (27–37) |
Range | 21–59 |
Gender: n (%) | |
Female | 15 (68) |
Male | 7 (32) |
Ethnicity: n (%) | |
White British/white other | 16 (73) |
Black British/African/Caribbean/other | 4 (18) |
Asian British/Asian other | 1 (5) |
Other | 1 (5) |
Type of school attended: n (%) | |
Mainstream | 9 (41) |
Special school | 8 (36) |
Both | 2 (9) |
Unsure | 3 (14) |
Self-advocacy group attendance: n (%) | |
Yes | 15 (68) |
No | 4 (18) |
Unsure | 3 (14) |
Retention
Of the 22 participants who provided informed consent and baseline measurements, 21 were retained to post intervention. One participant withdrew from group 1 before the start of the first session, due to personal reasons, with the facilitator confirming that this was not related to STORM.
Attendance
After delivering a STORM session, facilitators were required to complete an attendance register and facilitator notes (see Appendix 5). The attendance registers and facilitator notes were always completed and sent over promptly to HB and were described by facilitators as easy to complete.
The progression criteria set out that a strong indicator for progression would be participants attending, on average, at least three sessions, whereas a poor indicator would be participants attending fewer than three sessions. Of the 22 participants, 20 (90.9%) attended three or more sessions.
The median number of sessions attended was 5 [interquartile range (IQR) = 1]. The majority of participants (63.6%) attended all five sessions, with three participants attending four sessions, three attending three sessions and two attending only one session (Figure 5).
The number of participants attending each session decreased slightly as the sessions progressed (Figure 6). However, the minimum number of participants attending any one session was 16 (72.7%), indicating sustained high attendance through the intervention.
Impact of technical issues during Digital STORM sessions
Prior to the onset of the pandemic, all participating groups had been meeting face to face and digital means of meeting had only been adopted in the wake of the pandemic. The risk of exclusion from Digital STORM due to technical issues was considered in relation to a target of no participant missing more than two sessions, or parts thereof (other than brief connectivity problems not exceeding 10–15 minutes), due to technical problems.
The majority of participants (n = 18) did not miss any session, or any part exceeding 15 minutes, due to technical issues. While four participants missed one session, no participant missed two or more sessions due to technical issues. No or minimal technical issues in terms of joining the online meetings were recorded for participants across the majority of sessions (82.1%; Figure 7). While 13.7% of all sessions involved some issues affecting participants’ presence and engagement (up to 15 minutes), only 4.2% of all sessions involved significant issues (> 15 minutes). This pattern was consistent across all individual sessions, except for session 2, for which more participants (35%) experienced some challenges to presence and engagement (up to 15 minutes).
Fidelity of delivery in line with the manual
The process for assessing fidelity to the manual was set out in Chapter 2 (see Intervention fidelity). A random selection of the recordings was rated: either session 1 or 2 and either session 3 or 4 for each of the four groups, plus all follow-up sessions. For group 2, session 1 was also rated as a practice exercise to develop consistency between the two raters. An overview of the session recordings that were rated for fidelity for each group is presented in Table 7.
Group | Session 1 | Session 2 | Session 3 | Session 4 | Session 5 |
---|---|---|---|---|---|
1 | √ | √ | √ | ||
2 | √a | √ | √ | √ | |
3 | √ | √ | √ | ||
4 | √ | √ | √ |
The intervention was delivered with a high degree of fidelity to the Digital STORM manual (Table 8). Across all sessions, on average, 91.8% of the core requirements of the intervention were assessed as definitely present, 5.1% as somewhat present and only 3.1% as absent. Requirements judged as absent included facilitators occasionally omitting to summarise discussion points and in one case not holding an optional celebration event. The high degree of fidelity was consistent across assessed sessions and groups (see Table 8 and Figure 8).
Group | Session 1/2 | Session 3/4 | Session 5 | Overall | ||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|
Definitely present (%) | Somewhat present (%) | Absent (%) | Definitely present (%) | Somewhat present (%) | Absent (%) | Definitely present (%) | Somewhat present (%) | Absent (%) | Definitely present (%) | Somewhat present (%) | Absent (%) | |
1 | 89.5 | 5.3 | 5.3 | 93.8 | 0.0 | 6.2 | 93.8 | 0.0 | 6.2 | 92.3 | 1.8 | 5.9 |
2 | 100.0 | 0.0 | 0.0 | 75.0 | 12.5 | 12.5 | 87.5 | 12.5 | 0.0 | 87.5 | 8.3 | 4.2 |
3 | 100.0 | 0.0 | 0.0 | 93.8 | 0.0 | 6.2 | 87.5 | 12.5 | 0.0 | 93.8 | 4.2 | 2.1 |
4 | 97.4 | 1.3 | 1.3 | 100.0 | 0.0 | 0.0 | 81.3 | 18.7 | 0.0 | 93.8 | 6.2 | 0.0 |
Average | 97.4 | 1.3 | 1.3 | 90.7 | 3.1 | 6.2 | 87.5 | 10.9 | 1.6 | 91.8 | 5.1 | 3.1 |
As part of the qualitative interviews and focus groups, researchers checked the acceptability of recording the intervention sessions with facilitators and group members to facilitate fidelity checks. Across the four groups, 14 group members felt the set-up and recording of sessions by researchers to be acceptable; some participants even found the researchers’ set-up useful when technical problems arose:
It didn’t get in the way, it was good to be recorded.
G3
I think it went well, because there were times when we were cut out because of technical issues, coming back from the previous question, but because of how the email, and all that was sent, it was easy to get back on.
G4
One participant from group 4 said they would have preferred sessions not to have been recorded. All four facilitators found the recording to be acceptable, although one did comment that they would have preferred to be more in control of the set-up of the meetings.
Qualitative evaluation
To understand the feasibility and acceptability of the Digital STORM intervention from the perspective of facilitators, individual interviews were undertaken. Acceptability of the intervention to group members was assessed via focus groups. Of the 21 group members, 20 attended a focus group, spread across the four participating groups. However, not all participants were present for the duration of focus groups. Attendance for each focus group question is shown in Appendix 7.
Feasibility of delivering Digital STORM
Eight main themes and associated subthemes were identified concerning the feasibility of Digital STORM delivery from the perspective of group facilitators. These are organised into three areas of interest: (1) feasibility of using Digital STORM resources/support; (2) feasibility of digital technology; and (3) feasibility of managing other digital delivery challenges. Of the eight main themes, three were developed during the design of the analytical framework to inform decisions on the study progression criteria: managing technical difficulties; maintaining/managing privacy; monitoring and managing emotional responses. The other themes were developed during the initial coding and categorisation of the data: preparing to deliver STORM; use of resources within sessions; action planning; facilitating engagement in the intervention; and access to a range of support. An overview of the themes and their constituent subthemes, together with the frequency with which facilitators commented on each is presented in Table 9. Facilitators could be recorded as endorsing and commenting on subthemes both positively and negatively; hence, some of the frequencies total more than four.
Main themes | Subthemes | Frequency (n = 4 facilitators) | |
---|---|---|---|
B. Feasibility of using Digital STORM resources | | | |
Preparing to deliver STORM | | Positive | Negative |
| | 4 | 2 |
| | 3 | – |
| | – | 2 |
| | 2 | 1 |
| | 1 | – |
Use of resources within sessions | | Positive | Negative |
| | 2 | 3 |
| | 3 | 2 |
| | 2 | 2 |
| | – | 1 |
| | – | 1 |
Action planning | | Positive | Negative |
| | 3 | 3 |
| | 3 | 2 |
C. Feasibility of digital technology | | | |
Managing technical difficulties | | | |
| | 4 | |
| | 4 | |
D. Feasibility of managing other challenges in the digital environment | | | |
Maintaining/managing privacy | | Positive | Negative |
| | 4 | 2 |
Monitoring and managing emotional responses | | Positive | Negative |
| | 4 | 2 |
E. Engagement of group members in the intervention and access to other support | | | |
Facilitating engagement in the intervention | | | |
| | 3 | |
| | 4 | |
Access to a range of support | | | |
| | 2 | |
| | 3 | |
Feasibility of using Digital STORM resources
Preparing to deliver Digital STORM
A number of resources were made available to facilitators to enable them both to prepare for and to deliver the intervention. This theme focuses on the use of resources and support to prepare for intervention delivery. Five subthemes were formulated, relating to the manual, training, time available, past experience that aided preparation, and supervision. Each of these subthemes received positive reviews, negative reviews or both.
The purpose of the intervention manual was to guide facilitators in the preparation of resources, technology and delivery of each session’s activities. In this sense, reading through the manual and using the prompts for preparation formed part of the intervention training. All four facilitators gave positive reviews about the manual in helping them to prepare to deliver Digital STORM.
The book [manual] was really helpful. I’m much more of a paper person. So, I liked having the book and I sort of scribbled notes.
F2
Two facilitators also commented on negative aspects of the manual, for example the sheer volume of information.
There was probably too much information.
F2
Details about the training provided by the UCL team (LR) and the intervention partner (HB) are given in Chapter 2. The rationale for the training was to provide facilitators with an overview of the Digital STORM intervention, its underpinning theories and structure, and guidance on how it was intended to be delivered, as well as to offer support to help facilitators be aware of and prepare for potential challenges of delivering the intervention via web-based meeting platforms. Facilitators were positive about the training provided; in particular, they commented on the flexibility offered in its organisation, the chance to meet other facilitators, gaining an understanding of what STORM is about, and being able to ask questions. While the training was found to be helpful in familiarising facilitators with the intervention and resources, facilitators still did not feel fully prepared for the reality of juggling the various resources during delivery.
I thought the training was really good. And [the team] was brilliant at the fact that I couldn’t make the first training, so she … moved everything around, … I was really grateful for that. And then the second training, having it with other facilitators as well was really nice.
F1
I think it was good, because it meant that I could ask the questions, and just get an idea of what sort of vibe we wanted from it.
F2
I think the training gave me more understanding on what STORM was about.
F3
Limited availability of time for facilitators to prepare for delivering the intervention was raised as a concern by two facilitators. Both facilitators commented on time being a scarce commodity which affected how much they were able to prepare for each session.
I found myself sort of reading through the plan, sort of 5/10 minutes beforehand. It’s just because I just literally haven’t had the time to sort of sit down.
F1
I did struggle with and maybe that was my own problem with, you know, I couldn’t spend a lot of time preparing for it.
F4
During the interviews, all facilitators reflected on past experiences of facilitating groups which had prepared them for delivering Digital STORM. These included experiences of facilitating groups and/or teaching, facilitation training, having previously facilitated groups online, familiarity with web-based meeting platforms, knowing the group well, and group members themselves having become familiar with online meetings. One facilitator acknowledged that they felt less experienced with facilitating online and this affected how prepared they felt to deliver the intervention.
Having to learn everything myself, how to navigate sharing my screen and everything. I have been delivering online before but I think when I started the STORM, I was still new to it. So, it was more challenges on me.
F3
Supervision provided for Digital STORM facilitators is described in Chapter 2. When asked during their interviews, the facilitators had relatively little to say about the supervision available and received. However, in some cases, supervision had provided reassurance following sessions and an opportunity to reflect on the facilitator’s stance.
To be able to sort of run through it and say, look, this happened, this happened, and her to go, ‘fair enough. No that’s absolutely fine. Don’t worry about it. Or maybe just talk a little bit more about it’.
F1
… this pushed me to give them a bit more independence, that sort of thing we spoke about, like, my insecurities about my work or their capabilities, actually a nice space to test them out.
F2
Use of resources within sessions
This theme focuses on the use of resources when delivering the intervention. The five subthemes relate to: ease of use of the Wiki, playing and sharing videos, sharing other content, ease of use of the manual and ease of use of the group member booklet. Each of these subthemes received both positive and negative feedback from facilitators.
The ease of use of the STORM Wiki (a web platform designed as both a repository of intervention resources and an aide to delivering session content), while acceptable to some, was also found wanting by three facilitators. Concerns were raised particularly around sharing videos.
I didn’t like the STORM Wiki at all, I found it really clunky. […] I don’t know, it wasn’t clear to me. […], I’d try and kind of set up a video and have everything prepared in advance. But you went to click on a video, and it needed to reload.
F1
So, when I got into the Wiki and read through the notes, it was all familiar to me. I understood how to do it. Beforehand, you think, oh, yeah, yeah, I’ve got that. I’ve got my resources. That’s fine. I’ve got my Wiki. I can go through that, everything’s good. And then when you’re actually doing it, and trying to engage three people at the same time, it was like, oh, okay, this is quite a juggling act.
F4
I didn’t find that there was obviously enough information on the Wiki for me to follow it. Whereas with the manual is much better broken down, it was more pointless for me, then I didn’t have to do so much prep because it was in the book. I think if I just had the Wiki, I probably would have had to have basically written this myself anyway.
F3
Another facilitator recognised the potential of the Wiki to be a useful resource if it supported and guided the facilitator more in the way that the manual did. However, in its current form, it felt like a few too many things to negotiate on top of managing delivery via the web meeting platform.
So, if the Wiki could replicate the facilitator notes of how they go through it a little bit more. As a deliverer, that would be a lot more helpful, I think.
F4
In line with the comments about the Wiki, one facilitator reported that sharing videos through the Wiki was not optimal as they could not find a way to enlarge the video. This was more of a problem for participants joining via a mobile phone than those joining using other devices.
I did find it a bit of a waste of screen space on videos. Because obviously, the box with the video was like in the centre and was really little. And then there was all this screen. And some of my guys got a laptop. So that was okay. But some of them might be using mobiles, which meant … the video was tiny.
F2
When there were challenges playing the videos from the STORM Wiki, facilitators had recourse to access the same video content via links to YouTube. Having this resource helped ensure the intervention could still be delivered as intended.
I think for one of the sessions, the videos wouldn’t work, but I had the YouTube …. So, I managed to find what I needed because I had the titles and stuff are on there.
F2
When it came to sharing other content, two facilitators reflected on the challenges of using web-based meeting platforms and how they tried to minimise the anticipated problems by keeping screen sharing to a minimum.
I couldn’t see everyone. … that’s more of a Zoom thing, isn’t it? But we got around it. We didn’t really need to see the screens too much.
F2
Zoom has got a whiteboard function, which could have possibly been used. But I chose not to put too much attention on taking away a chunk of the screen.
F3
Two facilitators were positive about the ease of use of the manual for delivering the intervention sessions and activities. They said it provided a place where they could add their own notes, and aided group discussions by providing prompts and alternative approaches or phrases. This was something that was lacking in the Wiki and for this reason using the manual as a guide was preferred.
I found that really helpful because … rather than just giving me titles, it … gave me questions and sentences, and if I felt like people weren’t getting it, … there were alternative ways of wording it.
F2
I needed to do one or the other, I either needed to use the notes [manual], or I needed to use the Wiki. And I knew that the Wiki didn’t have enough of the key points on it. I couldn’t use the Wiki alone, I needed to use the facilitator points [in the manual].
F4
One facilitator commented on the content of the group member booklet in relation to images used to stimulate ideas for individual or group action plans in session 4. They felt that these did not adequately stimulate ideas about how people could stand up for themselves.
Pictures … I found confused our members a little bit because they were thinking more of, you know, I want to be really good at painting or I really want to do cooking, or I really want to do this and not to say that isn’t about standing up for yourself.
F1
Action planning
In session 4 of the intervention, group members undertook action planning to convey that they themselves can take action, and to offer an initial space to consider how they may wish to manage or resist stigma (building on discussions in session 3). One facilitator saw supporting group members to plan as really important for their group members and was very proactive in supporting action plan development and implementation.
I took quite an involved approach. But we were really keen for this to not just be, yeah, you’ve done a programme and shove it in the drawer. Yeah, I wanted them to have a legacy from it. I want them to be able to come out in a year and say, ah, the reason I started speaking at events is because I did it on the STORM project.
F3
Another facilitator’s reflections on this aspect of the intervention were less positive, pointing to a lack of time for developing and implementing action plans, which led them to reflect on the need to be proactive in supporting group members.
So, we did do kind of mini action plans. Sort of on the cuff almost. But no one had, … written anything down in particular. And mainly because a lot of them said they hadn’t had time.
F1
I sort of just left everyone to it after the fourth session. So, I think it’s probably a bit my fault, not really sort of following up with it, but also sort of that time gap and other things that were going on as well.
F1
When it came to implementing action plans, proactive support in the planning was emphasised as important to their success, and, as a result, it had the potential to be one of the most rewarding aspects of the intervention.
We properly planned it. And it actually meant that it was a success.
F2
I think the best for me was seeing how excited people were to take part and seeing the action plans happen …
F2
In some groups, group members did not implement any action plans; in the educational setting, the timing of the holidays prevented the group from undertaking their planned joint project within the time frame set out in the manual (a 4-week gap). Other external factors, such as COVID-19 restrictions, also affected implementation in instances where group members needed support to carry out their action plans.
The restrictions meant I couldn’t travel to him to meet him for any length of time. But he didn’t want to do a video on zoom. So, it was one of those, oh, how are we going to do this? … I am literally waiting for a kind of moment where we’re legally allowed to do it.
F2
Feasibility of digital technology
Technical difficulties
A range of technical difficulties were encountered during the delivery of Digital STORM. These included poor internet connections, individuals’ videos freezing, issues with Zoom account updates and logging-in processes, the use of mobile phone devices to join the group sessions as well as devices running out of battery during the session. Many of these difficulties are not specific to the STORM intervention but rather are common challenges of digital meetings.
Like we didn’t really have many issues with it, it was more sort of members, having issues with internet or, …. She needed to upgrade her zoom.
F1
No technical problems apart … every time I shared the screen, she couldn’t see it. So, I had to keep sharing two to three times so that she could get it. I think because she’s using her phone and maybe there’s something in that maybe?
F3
Despite technical difficulties that were encountered, facilitators reported no or minimal impact of issues related to joining online meetings on delivering the intervention.
We didn’t really have any issues, erm nothing, nothing dramatic, that stopped the flow of what we were doing anyway.
F4
The types of minimal impact referred to included delaying the start of a session and people joining late and therefore missing some content; it was acknowledged that this could be distracting for other group members.
Where problems with technology were encountered, these were all described as manageable for the facilitators. Resolutions to technical problems were found, such as phoning group members to assist them in joining the meeting, providing a re-cap, or catching up with individuals separately on any content they had missed. Facilitators emphasised that some difficulties were mitigated by providing clear advice in advance or by group members having support in place within the home from family or support staff.
She had, she was using a phone, which whenever someone rung her, it cut out. So, there were a few times where [group member’s name] would come back in and go ‘oh, what have I missed, what have I missed, tell me what I’ve missed!’ And I think the first two sessions I was trying to recap, but I was thinking actually this is a waste of time. So, the next couple of times that [group member’s name] would drop in and out, I’d ring her after the session and just say right, the only bit you missed was ….
F2
Feasibility of managing other digital delivery challenges
Maintaining/managing privacy
All four facilitators were confident that privacy was maintained and managed to an acceptable level during the delivery of the intervention. Where there were potential compromises to privacy these were considered unavoidable and beyond the control of the facilitators.
I wouldn’t say there were any sort of particular issues of privacy really, because they were all in their own space in their own homes.
F1
Everybody seems to be on their own apart from [name] and [name], they work in the [staff] office and there’s nothing we can do about that really.
F3
Monitoring and managing emotional responses
Another concern during the adaptation phase was whether facilitators in a digital environment would be able to detect any potential distress in response to the intervention content. For this reason, it was agreed that group sizes should be smaller than for F2F intervention delivery to ensure facilitators could see all group members on the screen at the same time. Keeping group sizes small was perceived to facilitate the monitoring of emotional responses. Facilitators did not report any significant distress or negative emotional reactions by their group members and so were unable to comment any further on being able to manage and contain any emotional responses during the sessions.
I only had six people taking part. So, I found it okay.
F2
…didn’t have any other sort of issues around … anyone being upset or, … having to take time out or not wanting to share anything. So, I think overall, it was okay.
F1
Engaging group members with the intervention and access to support
Engagement of group members in the intervention
Two subthemes were formulated in relation to facilitators engaging group members in the intervention. These were not necessarily related to the digital delivery format, but concerned content that was emotionally or conceptually difficult. The first relates to emotionally difficult content: the topic of stigma can bring up challenging thoughts, feelings and emotions, and facilitators and group members varied in whether and how they engaged with these. For some group members, these thoughts and feelings were perhaps new, as they had not faced the topic so explicitly before.
I think it’s brought up some, definitely some sort of questions, some kind of different thoughts.
F1
One facilitator in particular felt the sensitivity of the topic necessitated more time devoted to it (around sessions 2 and 3), as they wanted to be sensitive to group members and validate their experiences. Given the sensitivity of the topic, the facilitator emphasised how important it was that the group knew her and one another.
… I think the fact that we all knew each other quite well, none of these stories are particularly new.
F2
In trying to engage group members in the Digital STORM sessions, facilitators observed that the material contained some conceptually difficult content, especially where people did not have direct experience of stigma to draw upon, or could not imagine themselves in a scenario where they might need to stand up for themselves. One reason facilitators suggested was that some people always had support in the community, which afforded them some protection.
I think it was the being treated badly thing. They couldn’t sort of put it on themselves. They could … I saw my friend being bullied and I said it was wrong. But they couldn’t really put it on themselves and think of themselves as the person that was the victim almost.
F2
I found when I asked them questions, they would be very repetitive with their responses, and not really processing the question that I asked them.
F4
They found it hard to imagine a time they might be challenged in the community. And … for some of my members being alone in the community is an impossibility because of their epilepsy. So, they’re generally always with a trusted adult.
F2
This meant facilitators spent more time prompting group members than anticipated, thus working harder to keep them engaged.
It was good to see people thinking for themselves, but … I don’t think they were able to express all of it. I don’t know if they understood it fully. Because when you go back to them, they kind of, I had to keep, keep prompting them.
F3
I’d have thought they’d be a bit more forthcoming with, actually, this happened to me, and I did this and I didn’t know what to do. And then I’d be like oh right, and then I’d have to do more thinking on my feet how to probably solve it or suggest something or direct them, but there was a lot more of me giving them scenarios. But maybe they haven’t gone through it as much as I assumed, so.
F3
Concepts around action planning and encouraging group members to generate ideas for their action plans also appeared conceptually difficult for some.
I don’t think they understood what to do. And I try to say ‘You think of something that you want to do’. So, I had to, I kind of had said, I gave them a scenario ….
F3
Given the at times challenging content, one facilitator felt it would be better to have more time to cover the content.
The digital bit didn’t make a huge difference. Apart from perhaps breaking it down. But I think if it had been over three sessions or two sessions, it would have been easier to understand.
F2
Access to a range of support
The theme of peer support encapsulates the support that participants gave to each other within Digital STORM sessions, for example sharing ideas to develop action plans.
Peer support was really important … it all kind of came together when people were throwing their ideas at her.
F2, FG
He said, you know, this is what I want to do … And we sort of talked it through together and he said, well, how do I get there? So, we chatted as a group for support and said what could [group member’s name] do to get to that point? So, we said, well, contact people that he knows. So, he knows people at X, which is a learning disability theatre, … so you could talk to them? And who could help him get to that … Is it you know, a friend? Is it a carer, or supporter? Is it someone from [organisation name]?
F1
Peer support within sessions also generated an awareness of the possibility of receiving support from peers beyond STORM sessions.
I think that everyone’s learnt actually that they can lean on other people with learning disabilities. It doesn’t have to be paid staff or mums and dads.
F2, FG
Facilitators also spoke about the support from others; this included support to join sessions, during sessions and outside of sessions (in particular in relation to supporting the implementation of action plans).
So, she actually had her dad with her at the time, and they’d managed to sort of set up the laptop.
F1
His staff in his house supported him because we’d obviously liaised with the staff to explain this was really important, and it was, yeah. And it was on Teams, which he doesn’t usually use. So, we had separate Teams training sessions to prepare him for that. And he smashed it. Absolutely smashed it. He was amazing. And they asked him to come back and speak at their next meeting.
F2
I said, “Mum, [group member’s name] has got a mini project to do, will you let her do it?” She went, “Oh, I don’t know, I don’t know, I’ll think about it.” And then five minutes later, she came back. She went “right, I don’t like you anymore. But yeah, I will let her do it”. And she was brilliant, mum was really supportive. So, I sent a letter explaining what it was for, why she was doing it. And I sent out like a nice little shopping list pad to her in the post, like a treat thing. And they went to Asda and Mum sat outside and [name] went in and bought two bottles of dandelion and burdock. While mum had a nervous breakdown in the car park, [group member’s name] came out and was so happy Mum phoned me and was like “she’s done it. It was amazing. She’s brilliant. She’s my hero.” Mum was so proud of her that she’s going to give [name] a little shopping list next time to go and get a couple of little items. … So, everyone’s just taking it and run with it and supporters have been brilliant and parents and things.
F2
For the facilitator of group 2, good relationships with family and support staff were key to having conversations and getting them on board with supporting group members’ action plans. This did require an additional investment of their time.
Everyone’s responsibilities were like, laid out in there as well. So, everyone’s expectations were kind of included, like mum has to wait outside the shop. [Facilitator’s name] has to keep her phone on. I’d say about three hours making those plans after the STORM session.
F2
In contrast, the facilitator of group 3 felt the support of others could vary, depending on other demands on them, suggesting consistent support is not something that can be assumed.
I spoke to the support team for [group member’s name] and [group member’s name] and saying this is what they’ve been tasked to do. “Can you help them?” And of course, they’ll say “yes”, but it depends on their day-to-day routines and everything.
F3
Despite this, they felt more involvement from support staff could be beneficial in helping participants implement the lessons learnt in STORM sessions.
If someone was cruel to them or said something horrible to them, they have the tools but I don’t think they’d maybe put them into play straight away because they’d need to be supported, and maybe if their support team was with them throughout STORM, maybe that could have helped as well.
F3
Acceptability of the Digital STORM intervention
Of the eight main themes related to acceptability of the intervention, five themes were derived from the objectives and progression criteria: acceptability of the resources and activities, acceptability of digital technology, acceptability of privacy and support available, recommending Digital STORM and views on intervention delivery mode. The remaining three themes, perceived benefits of the intervention, likes and dislikes and suggested improvements, were developed during the initial coding of the transcripts and before the development of the coding framework.
Acceptability of Digital STORM resources and activities
Use of resources within sessions
In one group, two participants described times when the videos did not play at first, which they found slow and frustrating.
Like it was frustrating. Like [facilitator’s name] was saying that we were really struggling to watch some of them, just couldn’t get them playing.
G1
Across the four groups, 14 participants felt the group member booklet was easy to use and a useful aid to accompany the session content. Most used the booklet to look at during sessions rather than to write notes in, relying instead on the facilitator to take notes. Few participants used the booklet outside of Digital STORM sessions, as no ‘homework’ was set by the facilitator.
… it was everything sort of in one place. I don’t think I used it outside of the sessions. But yeah, I did have it in sessions, yeah.
G1
I give the booklet 10/10. When I looked at the pictures, then it helps me to read.
G2
But now, I use it all the time, in the session and outside the session … I write in it.
G3
It was helpful with understanding what the sessions were going to be about.
G4
Individual actions – planning and implementing
During the focus groups, seven participants talked in positive terms about implementing their action plans, reporting an increased sense of confidence and believing the plans would be helpful in the future.
But I went into Asda’s myself and my mum waited outside for me … Yeah. So, I will feel that my confidence is up … I think it’s been brilliant. I’m really proud of it.
G2
It’s very creative, one of my skills is being creative.
G3
I think it’s going to be a really useful action plan for the future.
G4
Acceptability of digital technology for meetings
Across the four groups, four participants reported having experienced technical problems during Digital STORM sessions. These included poor internet connectivity, Zoom update requests and low device batteries. Participants either felt that the technical problems did not affect their experience of taking part or described the impact as minimal. However, one participant described feeling stressed when they could not connect to the internet.
I think it went well, because there were times when we were cut out because of technical issues, coming back from the previous question, but because on how the email, and all that was sent, it was easy to get back on.
G4
No, I did a little bit but when you were talking about that plan but then I started to repeat it and then I was able to get back on track and catch up.
G4
I couldn’t come to that meeting because I couldn’t connect to any internet here. So, I was like getting all stressed out.
G2
Participants described the ways in which they managed the technical problems; examples included phoning the facilitator in the moment and catching up on missed content via information posted to them.
Acceptability of privacy and support available
Across the four groups, all of those present when privacy was discussed (n = 18) felt their privacy was maintained, with views that any potential threats to privacy were acceptable or well managed. Some participants described how they self-managed potential threats to privacy, for example moving to other rooms or using headphones:
Then if someone came in or someone came back, I’d have plugged in my earphones so they couldn’t hear.
G4
Like, sometimes when my father was doing work in the house, where I was, I’d go in my room or in my mum’s room or in the dining room.
G2
One participant from group 4 (who joined Digital STORM sessions from a room within their residential college) found it difficult at times when staff members tried to enter the room, although they felt able to manage the situation.
It was ok, if some people knock on the door and then you’re like oh no … I found it a bit difficult.
G4
It was important to understand in the context of a digital intervention whether group members considered the content distressing and whether they were able to access additional support if or when they wanted it. Across three groups, five participants commented on finding some of the videos difficult to watch due to seeing the individual featured being treated unfairly. However, they did not feel they needed additional support to process feelings associated with the material, nor did they disengage from the intervention.
The thing that I don’t really like is sometimes when I watch the videos, seeing the person’s reaction, how they felt after that in certain situations so, but all round is a good session, but I didn’t like that really.
G3
When someone was not being nice to a woman and they talk behind her back and they say nasty, and they are kids and all that … I don’t …
G2
One participant did not find anything about the STORM content upsetting but found it difficult talking about their personal life during discussions.
Not really but when it comes to my personal life, it was a bit sad.
G1
Across the four groups, none of the participants present during the focus groups felt they wanted or needed any additional support while taking part in Digital STORM. Nine participants explicitly reported they had access to support if needed. In addition to support from the facilitators, group members also talked about peer support and support from others.
Group members spoke about peer support; however, this centred more on support towards others outside of STORM sessions, specifically to stand up for and help others, rather than providing support to one another within the sessions.
If they have any problems, I can help them to sort their problems out. If they need anything, I can help them out. Stick up for them.
G2
We can stick up for people now and people who have learning difficulties.
G2
The STORM meetings have helped me to speak up, I speak up for other people, [for example, making sure] they understand what other people are saying.
G3
Participants across two groups spoke about seeking support from others to talk through problems and manage difficult situations, as well as to develop and implement action plans as part of the STORM intervention.
It helped me to stand up for myself by asking other people to help me, asking the support worker to help me to stand up for myself.
G3
I learnt how to stand up for myself when I go to a local shop if somebody bullies me. I’ll go to a local shop or church and ask them to help me, so I don’t have to do it on my own.
G3
Start with someone, not on my own.
G2
Perceived benefits of the intervention
Three main themes were identified concerning the perceived benefits of Digital STORM according to participants and facilitators: personal awareness, personal learning and personal growth. Group members commented on an increased personal awareness of learning disabilities, including through talking about themselves, and became more aware that they can stand up for themselves. Personal learning involved new knowledge of ways to stand up for themselves and others, and of how to manage difficult situations more generally. Group members experienced personal growth through increased confidence to stand up for themselves, as well as greater confidence in general; additionally, they reported feeling pride and an increased sense of independence. Facilitators had less to report about the impact of Digital STORM on group members; however, they were clear that there was no detrimental impact.
Personal awareness
Across two groups, three participants said they had learnt what having a learning disability means, both for others with a learning disability and for themselves.
How people with a learning disability are treated … And the story is only part and the choices you make for that story will complete it.
G2
Across three groups, four participants spoke about learning to talk about themselves, their experiences and their feelings with others.
Like talking about my personal life … telling my stories and what I’m feeling about that.
G1
Learning about how to talk about the good values you’ve got and the bad ones.
G3
Across four groups, eight participants commented that they now know they can stand up for themselves, providing examples of doing so and of being assertive about what they like, do not like or want.
When it comes to standing up to bullies, I can stand up.
G1
And it wasn’t right, so I thought I’m not having any of this, I’m going to stand up for myself so I told him straight and then I just carried on walking then.
G2
I learned to … be strong and say how you feel really, instead of not saying anything at all.
G3
Personal learning
Across four groups, eight participants spoke about learning how to stand up for themselves.
It’s been telling us how to speak up and I can’t believe it, all the work I’ve done with you lot … that meeting STORM is 10/10.
G3
It has helped me to … think about how you act in that when somebody makes you mad on the street.
G3
For me, it was learning new things and also the different scenarios on how to stand up for myself if I was in those scenarios.
G4
From one group, two participants spoke about learning how to stand up for others too.
We can watch the video of people bullying us and now we can help people who are like us.
G2
We can stick up for people now and people who have learning difficulties.
G2
Across three groups, four participants provided examples of how they have managed difficult situations since participating in STORM, for example when in a crowded place, by themselves or through seeking support from others.
It helped me like calm down in certain situations, helps me to be sometimes motivated in stuff instead of thinking about it in general.
G3
Personal growth
Across two groups, nine participants suggested they have grown in confidence to stand up for themselves.
It gives me … the confidence because … if I don’t like something I can always say.
G1
I know what to do. And it gives me encouragement to stick up for myself.
G3
Across two groups, five participants commented on their improved confidence generally.
It’s brought confidence to me. So that’s why it’s so good doing this project.
G2
… it’s boosted my confidence compared to how I was before when I first started my advocacy rep job, I was quiet as mice but now you can’t stop me from talking and so it’s boosted my confidence.
G3
Across two groups, three participants commented on a sense of pride, with regard to implementing their action plan and the overall experience of taking part in the intervention.
I think it’s been brilliant. I’m really proud of it.
G2
I’m so proud of myself, thanks so much. All the work I’ve done, I’m so proud of, thanks so much.
G3
From group 2, five participants commented on their increased independence.
Helping me to be more independent, learning me on how to be more independent.
I caught the bus on my own I did. I went to the chemist by myself.
Guess what I had, I went to doctors on my own to have my COVID-19 jab.
All four facilitators felt the Digital STORM intervention had a positive impact on group members but were less forthcoming with specific examples. Where they did comment, they had observed an increase in confidence, self-reflection and a shared passion. One facilitator felt it might be too soon to judge the impact.
I think there’s definitely a lot of skills in there, where our members are now sort of thinking about actually, what do I want in my life? And, you know, kind of I do these set activities each week. Is this something that actually I want to do? And how can I get to this point, as well.
F1
It gets them thinking about situations that they’ve been in, that they could stand up to, I think it is really beneficial.
F1
He’d gone to a cinema, and they tried to, there was a problem with it and they tried to fob them off with something. And he’s like, no, we’ve paid our tickets, and we want to do this. And he really did. He really did have his moment.
F4
For the project, … they totally believe that they can stand up for themselves, whether they can or whether they can’t, they believe it, which is half the battle as far as I’m concerned. And they’re really keen to spread that message. … That was so nice to see at the end of it, they’re so motivated to get together and do this thing that they want to do. And that STORM gave them a vehicle to do that.
F4
No detrimental impact
Facilitators from each group were clear that, to the best of their knowledge, there was no detrimental impact on group members taking part in Digital STORM.
I think it’s brought up some questions, some kind of different thoughts. But I don’t think anyone particularly became upset or thought, you know, I don’t want to do this anymore. […] And I think, yeah, I wouldn’t say there’s been a negative impact at all.
F1
I think it’s been a positive experience all around. They’ve enjoyed the sessions, they’ve enjoyed the one-to-ones. They’ve enjoyed the fact that there’s been that the action plan.
F2
Recommending Digital STORM
Recommend to others
All four facilitators indicated they would recommend Digital STORM to other facilitators and groups. Across the four groups, 16 participants said that they would recommend Digital STORM to others, with none saying they would not. Participants spoke about their reasons for recommending it, including what they had learnt and enjoyed about taking part. One participant also spoke about the impact of COVID-19 and the need to ensure the intervention is available as a digital version.
Because it would help other people stick up for themself. And it’s very, very good to be in like a group discussing different life situations as a group … it helps … with motivation.
G3
Yeah, I would recommend it because we don’t know how long this Covid is going to go on for and obviously tamper our ability to meet up. So, like to get all the sessions done then on like digital seems to be a good way to move forward with it … if we can do it, why can’t anyone else?
G4
Researcher: Would you recommend digital STORM to others?
Our members would love to repeat it again. And I know there’s other facilitators, who’ve got a similar kind of attitude to things as me. And they’ve already asked about it, because obviously, the members chat.
F2
Complete Digital STORM again
All four facilitators stated that they would facilitate a Digital STORM group again; two suggested they would like to deliver Digital STORM content to their pilot group, as refreshers, to ensure that group members continued to benefit from the intervention’s key messages over the longer term. Several participants also reported that they would like to take part in Digital STORM again.
It’s been fantastic and I wouldn’t mind doing it again. 100%.
G2
Yes. Now I know more about it. I definitely would do it [again]. I think it’s, something beneficial. And I think it gets our members thinking … about situations that they’ve been in, that they could stand up, you know, to, I think it’s really beneficial.
F1
I think it would be something that we would want to revisit sort of like once a year, we’re gonna do STORM. And everyone will know the format. And everyone will remember and with our […] clients,[…], we’ve got to be so repetitive.
F2
Views on intervention delivery mode
Facilitators had a range of views on the mode of delivering STORM, if they were to deliver the intervention again in the future. Two had a preference for working F2F and two had a mixed preference. Facilitators could see the pros and cons of both delivery formats: for F2F delivery, the increased ability to use augmentative communication to generate discussion and prompt ideas; for digital delivery, increased accessibility and no reliance on transport.
When you’re face to face, I can do objects of reference as well and prompt ideas.
F3
I think the digital version would be a really good supportive thing for people who aren’t together. And that is the only way that they can get together to do a project like this. I think it’s an alternative version for people who aren’t sitting in a college together or some other establishment together.
F4
… if it was something that was during the day, they’d definitely be able to travel or they’d get a lift or but some would probably be more hesitant. And possibly it could have been the reason why we had the members that we did have because, you know, they were able just to jump onto a call rather than travel somewhere.
F1
Four participants (from across three groups) stated a clear preference for taking part in a digital version of STORM. Reasons for this included that the space felt safe and quiet and that it was a good option in the context of the COVID-19 pandemic; others pointed to personal growth through learning new technical skills.
I like doing it online so I can sit in my space with all of you, it’s because it’s useful and it helps me and it helps my feelings. It’s just sometimes I get distracted but erm I’m actually enjoying doing it online at the moment.
G1
I was I was quite nervous because I never done Zoom before and I thought erm logging in, it was quite difficult for me at first, but when I got gradually used to it, I started doing it all the time. And it worked out perfectly.
G3
Three participants (from across two groups) stated a preference for an F2F version.
I think that’s a hard one that, I kind of enjoyed it. Oh, but I do think it would have been better if we were able to meet up like this while things … I think like we get more better conversation when we’re all together.
G4
Likes and dislikes
During the focus groups and interviews facilitators and group members were asked what they liked or disliked about Digital STORM. Aspects that group members liked included: learning new things, meeting with peers to share ideas and stories, resources (in particular the group member booklet and communications cards) and the videos.
We all loved the STORM group, it was fantastic to learn new things.
G2
I liked that we all worked as a group together.
G1
And it’s very, very good to be in like a group discussing different life situations as a group … it helps … with motivation.
G3
All the different pictures, it was fantastic the book was.
G2
I feel uplifted when I saw those movies.
G3
I actually found them really interesting … and when you learnt about what everything everyone is up to, it was really inspiring.
G3
The videos we did watch were really good. It’s really interesting to hear people’s stories and it felt like it was speaking to people with learning disabilities, rather than asking other people about, you know, life for people with learning disabilities.
G1
Facilitators liked the opportunity to focus on a different and serious topic, learning more about their group members, having a well-structured and ready-designed programme, and a range of helpful resources.
I like the fact that the key message is quite clear, that’s at the top. So, if you sort of lose your track a little bit, you can bring that back in there as well. And the first bit sort of reminding them about the session before and what the key message is in that time, and then having the sort of timings out as well, I thought that was really good. Because we didn’t run over at all, even though, you know, some points we were chatting away for, you know, a good 15/20 minutes about one particular topic.
F1
Having that sort of serious conversation with our members and learning a little bit more about them, what things that they struggle with, or have had to sort of overcome in their backgrounds has been really interesting. And then, you know, possibly will help us be able to support them a little bit better in the future. […] having sort of a smaller group to do it in as well. Learning a bit more about members that don’t necessarily talk in socials as well.
F1
I liked the … it kind of felt like there was a repetitive nature, you know, with the kind of taglines, the little kind of, ‘this is who I am’ … ‘it’s not okay for people to treat me badly’. I like the fact that … they kept reappearing …
F2
Personally, it was a great thing to engage in and to have spent some time with students.
F4
Facilitators’ dislikes included delivering the intervention during a difficult period of change, and using the Wiki.
For me, personally, it came at quite a … busy time. And just as sort of coming out of lockdown again, and sort of moving our project back from being online to being out in the community again and all the sort of issues that have been around that in itself. So, I don’t feel personally … I don’t think I’ve given it enough time to prepare.
F1
Suggested improvements
Both facilitators and group members suggested a number of improvements, detailed below.
Digital STORM resources:
- use PowerPoint slides instead of the Wiki;
- stream videos via a YouTube channel;
- if the Wiki is retained, offer more facilitator prompts;
- reduce the volume of information;
- reduce session 3 content;
- balance positive and negative content more (sessions 2 and 3);
- more choice over video content;
- introduce action plans earlier, with examples or video instructions;
- plans for revisiting content (refreshers).
Delivering the session online:
- recruit an additional facilitator;
- more, but shorter, sessions to break up the content, or longer sessions.
Actions for facilitators:
- facilitator to provide reminders about action plans.
Support:
- facilitators of different groups getting together to share experiences;
- supporters joining people in the sessions;
- more 1:1 time with facilitators outside of sessions.
Feasibility and acceptability of collecting study measures
Feasibility of collecting outcome measures
Very high levels of data completeness were achieved (Table 10). The target set out in the progression criteria was to reconsider the appropriateness of the measures for remote data collection if less than 80% of collected data at baseline were useable. With the exception of one measure, 100% of completed forms were useable. One participant missed just one item on the RtD Scale. Outcome variables are summarised at baseline and post intervention. Twenty-one participants provided data post intervention.
Measure (participants, N = 22) | Baseline | Post intervention |
---|---|---|
Warwick-Edinburgh Mental Wellbeing scale adapted, score range 0–42, higher scores indicate higher levels of mental well-being | ||
Forms completed [n (%)] | 22 (100) | 21 (95) |
Useable forms [n (% of those completed)] | 22 (100) | 21 (100) |
Mean total score (SD) | 32.41 (6.18) | 32.66 (8.20) |
Median (IQR) | 33 (28–38) | 35 (28–38) |
Range | 19–40 | 15–42 |
Rosenberg self-esteem scale, score range 0–18, higher scores indicate higher self-esteem | ||
Forms completed [n (%)] | 22 (100) | 21 (95) |
Useable forms [n (% of those completed)] | 22 (100) | 21 (100) |
Mean score (SD) | 14.08 (2.80) | 14.14 (2.67) |
Range | 7–18 | 8–18 |
Self-efficacy in rejecting prejudice score range 0–3, higher scores indicate more confidence to reject prejudice | ||
Forms completed [n (%)] | 22 (100) | 21 (95) |
Useable forms [n (% of those completed)] | 22 (100) | 21 (100) |
Mean score (SD) | 1.86 (1.17) | 2.00 (1.10) |
Range | 0–3 | 0–3 |
Reactions to Discrimination, adapted, score range 0–12, higher scores indicate more negative emotional reactions to stigma | ||
Forms completed [n (%)] | 22 (100) | 21 (95) |
Useable forms [n (% of those completed)] | 21 (95) | 21 (100) |
Mean score (SD) | 5.52 (2.87) | 4.90 (3.14) |
Range | 0–12 | 0–12 |
Sense of Social Power, adapted, score range 0–3, higher scores indicate a higher sense of power | ||
Forms completed [n (%)] | 22 (100) | 21 (95) |
Useable forms [n (% of those completed)] | 22 (100) | 21 (100) |
Mean score (SD) | 1.93 (0.68) | 2.17 (0.67) |
Range | 0.75–3 | 1–3 |
Feasibility of assessing cost effectiveness
Service Information Schedule data availability
Data related to intervention costs were complete. Information about on-costs and overheads was not sought. This information can be approximated using national data or published accounts and the fact it is missing is therefore not a barrier to estimating the cost of the intervention. SIS data can therefore be considered complete.
Client Service Receipt Inventory data availability
At baseline, 22 respondents provided CSRI data, with 21 providing CSRI data post intervention. There were two instances of missing data in responses about the use of inpatient services. Across other services (not prompted), there was one instance of a missing value for the type of service or support (received every day). CSRI data were therefore mostly complete.
Intervention cost
While not strictly part of the study objectives, a preliminary calculation of intervention costs was undertaken.
Adaptation costs
The costs of adapting STORM for online delivery are a one-off occurrence and are not included in the cost of the intervention, which is calculated as a long-run marginal opportunity cost. However, the cost associated with adaptation is presented in Table 11 for information.
Staff costs
Staff costs by group and activity at 2021 rates are shown in Figure 9. All costs other than the provision of training and supervision were borne by participating organisations. Variation in the cost of delivery and training received, which is based on a constant number of hours across all groups, is due to differences in hourly pay. We can also note variation in the cost associated with preparation and feedback, which varies both due to differences in time and differences in pay. Supervision time also varies, but less so than preparation and feedback. The variation in hours is shown in Table 12.
Category | Group 1000 | Group 2000 | Group 3000 | Group 4000 | Average |
---|---|---|---|---|---|
Delivery | 7.50 | 7.50 | 7.50 | 7.50 | 7.50 |
Feedback | 0.00 | 7.00 | 2.50 | 0.00 | 2.38 |
Preparation | 0.58 | 16.00 | 3.75 | 2.50 | 5.71 |
Training | 2.75 | 2.75 | 2.75 | 2.75 | 2.75 |
Supervision | 4.00 | 4.83 | 3.00 | 2.33 | 3.54 |
Total | 14.83 | 38.08 | 19.50 | 15.08 | 21.87 |
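To make the link between the hours in Table 12 and the costs in Figure 9 explicit, the staff cost attributed to a group for a given activity is the facilitator time shown above multiplied by that facilitator’s hourly pay at 2021 rates; differences in hourly pay (and, for preparation and feedback, in time) drive the variation in cost. The worked figure below uses a purely hypothetical hourly rate for illustration only and does not reproduce any actual pay rate from the study.

$$ \text{staff cost}_{\text{group, activity}} = \text{hours}_{\text{group, activity}} \times \text{hourly rate}_{\text{group}} $$

For example, at an illustrative rate of £15 per hour, the 7.50 hours of delivery recorded for each group would correspond to a delivery cost of 7.50 × £15 = £112.50 for that group.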
Other resource costs
Only one facilitator (group 2000) reported costs for postage and materials, with other groups reporting no or small costs. It was not possible to obtain postage costs for all groups. These costs have therefore been excluded from the total, and materials costs have been apportioned equally to all groups (Table 13).
Item | Total cost (£) | Cost per group (£) |
---|---|---|
250 traffic light cards | 145 | 36.25 |
50 STORM booklets | 345 | 86.25 |
8 STORM manuals | 116 | 29.00 |
Materials | 21.59 | 5.40 |
Stampsa | 41.04 | - |
Total | 668.63 | 157.16 |
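Because, with the exception of stamps, item costs are apportioned equally across the four pilot groups (as described above), the ‘cost per group’ column is simply each item’s total divided by four, for example:

$$ \frac{\pounds 145}{4} = \pounds 36.25 \ \text{(traffic light cards)}, \qquad \frac{\pounds 21.59}{4} \approx \pounds 5.40 \ \text{(materials)} $$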
Service use
Data on the CSRI were available for all 21 participants who provided both baseline and post-intervention data. It is therefore feasible to collect data on service use to assess cost effectiveness in a full trial.
The proportion of study participants reporting contacts with a given service is reported to determine whether participants rely primarily on formal services or informal care for support.
Table 14 shows responses regarding service use on the core CSRI at baseline and post intervention. As noted above, one participant dropped out before the end of the study.
Service | Baseline, n reporting contact | Post intervention, n reporting contact |
---|---|---|
Inpatient | ||
Yes | 1 | 1 |
No | 20 | 19 |
Missing | 1 | 1 |
Outpatient | ||
Yes | 3 | 2 |
No | 19 | 18 |
Don’t know | 0 | 1 |
GP | ||
Yes | 11 | 15 |
No | 10 | 6 |
Don’t know | 1 | 0 |
Other help with health | ||
Yes | 14 | 14 |
No | 8 | 7 |
LD team | ||
Yes | 5 | 10 |
No | 9 | 4 |
Not applicable | 8 | 7 |
College | ||
Yes | 2 | 2 |
No | 12 | 12 |
Not applicable | 8 | 4 |
Help with other things | ||
Yes | 21 | 20 |
No | 1 | 1 |
Support worker | ||
Yes | 17 | 15 |
No | 4 | 5 |
Not applicable | 1 | 1 |
Unpaid care | ||
Yes | 14 | 13 |
No | 7 | 7 |
Not applicable | 1 | 1 |
Social worker | ||
Yes | 7 | 9 |
No | 14 | 11 |
Not applicable | 1 | 1 |
Table 15 shows the list of other services noted by participants. Participants received a range of services and support across many domains, including formal as well as informal care. In a number of cases, when asked about ‘other services’, that is, those outside the core CSRI, participants responded with a service that is also found on the core CSRI. These have been reconciled to core CSRI responses accordingly.
Service | Baseline | Post intervention |
---|---|---|
Care service | 1 | 1 |
Health service (other)a | 1 | 2 |
Charity | 2 | 2 |
Friends and family | 10 | 14 |
Specialist service | 1 | 1 |
Support group | 5 | 5 |
Work | 2 | 2 |
Other | 3 | 1 |
Data on medication use were also collected. Participants were asked whether they used any medication, whether they knew what it was for, and what that reason was (Table 16).
Item | Baseline (N = 22) | Post intervention (N = 21) |
---|---|---|
Used | 7 | 8 |
Reason known | 6 | 5 |
Reason given | 4 | 5 |
Analysis of EuroQol-Youth
Data on the EQ-5D-Y were available for all 21 participants who provided both baseline and post-intervention data. It is therefore feasible to collect data on health-related quality of life from this population.
EuroQol-Youth descriptive system
The descriptive results showing the individual ratings of problems in five domains on three levels are shown below for the baseline (Table 17) and post-intervention (Table 18) time points.
Levels/dimensions | Mobility | Self-care | Usual activities | Pain and discomfort | Anxiety and depression |
---|---|---|---|---|---|
No problems | 17 (80.95%) | 17 (80.95%) | 14 (66.67%) | 14 (66.67%) | 13 (61.90%) |
Some problems | 2 (9.52%) | 4 (19.05%) | 7 (33.33%) | 4 (19.05%) | 5 (23.81%) |
Unable/extreme problems | 2 (9.52%) | 0 (0%) | 0 (0%) | 3 (14.29%) | 3 (14.29%) |
Levels/dimensions | Mobility | Self-care | Usual activities | Pain and discomfort | Anxiety and depression |
---|---|---|---|---|---|
No problems | 18 (85.71%) | 19 (90.48%) | 18 (85.71%) | 12 (57.14%) | 14 (66.67%) |
Some problems | 1 (4.76%) | 2 (9.52%) | 2 (9.52%) | 6 (28.57%) | 6 (28.57%) |
Unable/extreme problems | 2 (9.52%) | 0 (0%) | 1 (4.76%) | 3 (14.29%) | 1 (4.76%) |
There is a general trend of problems resolving over time, except in the pain and discomfort domain (Figure 10). While the small sample size and study design do not permit any generalisations or conclusions to be drawn, the proportion with any level of self-reported problems at baseline is such that, were the same pattern to be found in a larger sample, changes effected by the intervention may be captured on this measure. The ‘usual activities’ dimension showed the largest movement between baseline and post intervention – of note, this may well be due to the impact of changing COVID-19-related social restrictions over time.
EQ-5D-Y index values
The EQ-5D-Y is very new and, to date, value sets are only available for Slovenia and Japan. 63,64 While values derived from adult populations for the EQ-5D-3L should not be applied to the youth version, it is desirable to apply a country-specific valuation. In addition, our study population does not consist of children, and while the population may benefit from the modified wording, it does not follow that the youth-specific valuation should be applicable to them. We therefore present, in Table 19, index values derived from: (1) the youth-specific value set for Slovenia, which was considered likely to be culturally closer to the UK context than that for Japan; and (2) the UK value set for the adult EQ-5D-3L version of the measure.
Value set and time point | Mean | SD | Min | Max |
---|---|---|---|---|
Slovenia – Baseline | 0.740 | 0.365 | 0.254 | 1 |
Slovenia – Post intervention | 0.773 | 0.307 | 0.009 | 1 |
UK – Baseline | 0.701 | 0.407 | 0.322 | 1 |
UK – Post intervention | 0.726 | 0.365 | 0.157 | 1 |
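For readers unfamiliar with how such index values are produced, the expression below shows, in simplified and generic form, how an EQ-5D-style value set converts the five-dimension profiles in Tables 17 and 18 into an index: a full-health score of 1 is reduced by tariff-specific decrements for each reported problem level. This is an illustrative sketch only; the actual Slovenian and UK value sets have their own coefficients, and real tariffs may also include constant or interaction terms, none of which are reproduced here.

$$ \text{index} = 1 - \sum_{d=1}^{5} \delta_{d}(\ell_{d}) $$

Here ℓ_d is the problem level reported on dimension d (mobility, self-care, usual activities, pain and discomfort, anxiety and depression) and δ_d(ℓ_d) is the value-set decrement for that level, with δ_d(no problems) = 0. A respondent reporting no problems on all five dimensions therefore scores 1, and more or more severe problems accumulate larger decrements and a lower index value.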
Given that this is a feasibility study, we do not endeavour to test for statistical significance nor to draw conclusions about the effect of the intervention. We observe the following trends:
- higher average scores and less variability (smaller standard deviations) at post intervention compared to baseline;
- higher scores and less variability using the Slovenian utility scores, compared to the UK data set. Note that this is likely due to utility scores generally being higher for the same health state in the Slovenian data set;
- a larger difference in means between baseline and post intervention using the Slovenian data set.
In conclusion, while neither valuation tariff is entirely appropriate for our study population, our findings demonstrate that the measure can successfully be used, and there appear to be changes in health-related quality of life over the study period.
Acceptability of outcome and health economic measures
After completing the baseline measures, all participants were asked three questions:
- What did you think about doing the interview?
- Would it be ok for others to do this?
- Anything we could do to make it better?
All 22 participants responded positively about their experience of completing measures. All said that they felt the questionnaires would be okay for others to complete and did not feel there was anything that could have been improved.
Some of the responses included ‘Really good, really interesting’, ‘Thought it was good and felt fab doing it’, ‘It was great, brilliant and satisfying’ and ‘I found it challenging in a good way’.
It is notable that participants responded to these questions following an often-lengthy baseline data collection session; thus responses may have been limited. However, the responses are supported by data collected several months following baseline measures. In two of the focus groups, participants spoke about completing the questionnaires.
Yeah, when she did the questionnaire with me. Yeah, that was good. I enjoyed that, it was just me and her.
G3
I enjoyed like, [researcher name], I, I enjoyed speaking to her and telling her my experiences.
G3
I really liked it when it was just me and [researcher name], meeting up and doing questionnaires on Wednesdays.
G1
From the researcher’s (MO) perspective of administering the outcome measures, participants appeared to engage well with the measures.
The adapted CSRI was interactive and participants appeared to enjoy the timeline activity which provided the opportunity to talk about memorable events such as birthdays or cultural festivals. None of the participants expressed distress during completion of the CSRI.
Acceptability of using routinely collected data
Immediately after completing the CSRI post intervention, all 21 participants were asked whether they would be happy for researchers to find out the same information from their records (data linkage); it was clearly explained that the purpose of this question related to future research and not the current research study.
Out of 21 participants, 20 said they would be happy for researchers to gather their health information from their records. One participant said that they would not be happy with this and they would prefer it if researchers asked them.
Usual practice
Forty-eight facilitators across the UK started our online survey; however, completion rates varied across the different sections of the survey. Five facilitators had not transitioned to running groups via web-based meeting platforms following the social restrictions imposed by the government. Collectively, the 48 responding facilitators facilitated 81 different groups (range 1–5 types of groups per facilitator). The types of groups the facilitators ran varied, as detailed in Table 20 (note some facilitators ran more than one type of group).
Type of group | N (%) |
---|---|
Self-advocacy | 28 (58%) |
Social | 16 (33%) |
Activities based | 11 (23%) |
Educational | 7 (15%) |
Vocational | 7 (15%) |
Othera | 12 (25%) |
Total | 81 |
Most groups met once a week (15 groups prior to the pandemic, 18 groups since the pandemic). Meetings generally lasted up to 2 hours (20 groups before and 22 groups since the pandemic). Advocacy groups varied more in the duration of their meetings before the pandemic, with an equal split between groups meeting for up to 2 hours and those meeting for longer (8 groups in each category). The consistency of attendance at group meetings was variable both before and since the pandemic.
The types of activities groups were engaged in before and since the pandemic are detailed in Table 21, which summarises facilitators’ descriptive comments on their groups’ activities (note that not all facilitators provided such comments). Many groups adapted their usual activities for online meetings; however, some groups’ adaptations were more significant than others. Many made the focus of online meetings more social in nature, and naturally a focus on COVID-19 and associated guidance for keeping safe and following the rules also featured.
Type of group | Nature of activities: before | Nature of activities: since |
---|---|---|
Advocacy (n = 28) | n = 18 | n = 13 |
Social (n = 16) | n = 10 | n = 7 |
Activities based (n = 11) | n = 4 | n = 3 |
Vocational (n = 7) | n = 5 | n = 5 |
Educational (n = 7) | n = 5 | n = 5 |
Other (n = 12) | n = 6 | n = 6 |
The results indicate that, in many cases, organisations sought to maintain the usual nature of group activities but switched to online delivery. In many instances, the range of activities and opportunities available to group members had reduced. While some respondents noted reduced participation, overall, the responses suggest that people with intellectual disabilities are willing and able to engage in online activities and means of interacting with others. Finally, when asked about their observations and feedback from group members about their preferences for future (longer-term) meeting modalities, there was a mixed picture. Of the 33 facilitators responding to this question, only 1 indicated a preference for online only meetings and 7 for in-person-only meetings. The remaining facilitators suggested the preference would be for a hybrid offer, combining online and in-person attendance, that people could choose from (19 facilitators, 58%) or that the preference is likely to vary across individuals (6 facilitators, 18%).
Chapter 4 Discussion
Key results
Adapting and piloting the existing STORM intervention for online delivery
Research objective 1 was to adapt the existing STORM intervention for online delivery. This objective was met through the work of the IAG group between November 2020 and February 2021.
The IAG made changes to the intervention structure, to enable both a positive engagement experience and ease of monitoring group members. The intervention content remained largely unchanged; only some streamlining of the number of videos per session was undertaken to ensure the content in each session could be covered within the shorter 60-minute time frame. The delivery mechanism (web-based meetings) was the biggest change and necessitated additional guidance being produced in the intervention manual, as well as changes to the configuration of materials on the Wiki. Alternative resources were also deemed necessary, which led to the production of a new group member booklet. Guidance for facilitators around managing privacy and meeting participants’ needs for support was also considered, and additional details were added to the manual and facilitator training. It was, however, accepted that these issues were to some extent out of the control of the research team and would therefore require close monitoring as part of the facilitators’ supervision meetings and through feedback sought in the qualitative interviews.
Investigating the feasibility and acceptability of Digital STORM
Research objective 2 was to investigate the feasibility and acceptability of Digital STORM when delivered to groups of people with mild-to-moderate intellectual disabilities online. We were able to meet the recruitment target of four groups to pilot the intervention, and of the 22 participants who provided informed consent, 21 were retained to post intervention. The majority (90.9%) of participants attended three or more sessions. Indeed, most participants (63.6%) attended all five Digital STORM sessions. The majority did not miss any session time, or missed no more than 15 minutes of any session, due to significant technical issues. This was considered acceptable as it had parity with technical problems that commonly occur with any meetings that rely on digital technology and internet connectivity and did not create an unreasonable expectation or burden on facilitators or participants to ‘catch up’ on any material missed. Fidelity of the intervention delivery in line with the manual was assessed through reviewing and rating video recordings of selected sessions. Facilitators reported the recording of sessions as acceptable and over 90% of the core intervention requirements were assessed as fully met. Accordingly, all the progression criteria for objective 2 were fully met.
During individual interviews with facilitators and focus groups with group members, the Digital STORM intervention was found to be feasible and acceptable. The detailed manual and training enabled facilitators to familiarise themselves with the intervention aims, activities and resources. Another factor that influenced facilitators’ preparedness for delivering the intervention was the time available to them to prepare. Facilitators’ previous experiences of facilitating and of delivering activities via web-based meeting platforms helped them feel prepared.
In relation to the use of the intervention resources within the sessions, a more mixed picture emerged. Facilitators noted challenges using the Wiki, in particular in relation to playing and sharing videos, as the amount of inactive time on the Wiki website ahead of accessing a video meant the Wiki often logged facilitators out. They also noted that the Wiki limited the size of the viewing window, which made it more difficult for those on mobile phones to view videos. Sharing other content was less difficult, but in general, facilitators tried to keep screen sharing to a minimum. The manual was helpful in guiding facilitators through the sessions. However, facilitators reported that trying to co-ordinate between the Wiki, the manual and the web-meeting functions could make facilitating the sessions complex. The group member booklet was welcomed by facilitators and participants. Some improvements were suggested to the images in the booklet section on action planning. Facilitators found the action planning (session 4) more challenging than earlier STORM sessions to deliver and support group members with. There was variability in the implementation of action plans; factors influencing this included how proactive facilitators were in providing support outside sessions, how easy it was to generate ideas with group members, and whether group members had the support they needed.
While facilitators acknowledged that there were technical challenges (see above), these were felt to be manageable. Other digital delivery challenges were also found to be manageable, for example, facilitators felt appropriate levels of privacy were maintained and that any threats to privacy were easily navigated. On the whole, facilitators felt it was possible to monitor group members’ emotional responses to the intervention content and to provide support. The most challenging time to monitor participants’ reactions was during screen sharing, in particular when sharing the videos.
Facilitators’ reflections on engaging group members in the intervention brought to the fore that the content could be both emotionally and conceptually difficult for some group members. It could be emotionally difficult in the sense of hearing about negative ways in which people with intellectual disabilities are often treated; this was also found in our earlier pilot where groups met in person. In both instances, group members felt it was important to acknowledge these issues, even if it did elicit feelings of sadness in some. Furthermore, in our earlier pilot of the F2F version of STORM, group members’ and facilitators’ narrative accounts suggested that the intervention gave them an opportunity to process such emotions. 31 Where participants did not have personal experiences of stigmatising encounters to draw upon, this also posed a challenge as it required group members to engage their imagination to consider ‘what if’ type scenarios, which could make some of the intervention content more conceptually difficult to engage with. This only arose with the group run in a special education college and was not observed in our earlier pilot of the F2F version of STORM with ten groups (N = 67). This suggests that the appropriateness of delivering STORM within very sheltered environments may need further careful consideration. Another area that was conceptually difficult related to action planning, particularly the need to generate ideas that were both personally meaningful and realistic to implement. In some instances, participants chose to plan something as a group or with others. This was in keeping with the intervention manual and some groups took a similar approach in our earlier F2F pilot. 31
A key mechanism in the intervention logic model is that of peer support. Peer support was enabled throughout the intervention sessions, and facilitators also reflected on this being an important component. Group members received support from others besides the facilitator and peers; this included family and paid carers who provided technical support for some participants to join the intervention sessions, and some also provided support with the implementation of action plans. Ensuring support from others required facilitators to be proactive. Good pre-existing relationships were important in galvanising family members’ or paid carers’ involvement. There was a suggestion that there may be benefits of family or paid supporters joining the intervention sessions. This is an area that would require careful consideration both from a choice and empowerment stance and in relation to the most effective ways to engage carers in health and well-being interventions for people with intellectual disabilities, which as yet appear unclear. 65 Importantly, it is possible that carers could impede as well as facilitate engagement and that their inclusion could well override participant choice and opportunities for participants developing greater self-efficacy and independence. 66
The intervention was considered to be acceptable to group members and facilitators. All said they would run or take part in Digital STORM again and would recommend it to other facilitators or people with intellectual disabilities. It should be acknowledged that these recommendations, while taken here as a proxy for acceptability, may reflect a desire to please the research team and not to be critical of the intervention. However, viewed in combination with other reports of participants’ experiences of the Digital STORM intervention (including what they liked about it), the perceived acceptability does seem to be promising. There were mixed views about digital versus in-person delivery of the intervention, with a range of arguments for and against each mode. None of the facilitators or group members had experienced the intervention in person and, therefore, they were not able to make an informed judgement with regard to delivery mode. Furthermore, it is important to note that, when exploring UP, the facilitators running groups online during the pandemic indicated that their (and their group members’) longer-term preference would be for a hybrid offer of both in-person and online attendance of meetings. Our facilitator stakeholders from the third sector also suggested (during IAG meetings) that this would likely be how they would operate in future. Subsequently, as restrictions have eased, this has become a reality for their offer to the people with intellectual disabilities they support. This validates the work undertaken to adapt STORM for digital delivery, as we are now in a position to offer an intervention which could be delivered to groups in person and online, reflecting a new era of approaches to supporting people with intellectual disabilities.
Finally, the analysis of the interviews and focus groups allowed an understanding of the perceived benefits of the intervention. While this was not an a priori objective of the research, these insights should be considered in planning a future trial, in particular to inform considerations around study engagement and retention, as well as suitable health and social outcomes to measure. Group members in particular spoke about personal awareness, learning and growth. They felt they had gained an awareness and deeper understanding about what it means to have a learning disability, were more able to talk about themselves and had recognised that they could stand up for themselves. They had learnt how to stand up for themselves and others and how to manage difficult situations. They had grown personally in terms of general confidence, but also confidence to assert their needs and rights, and some talked about a sense of pride and increased independence. These themes triangulate with those seen in our initial F2F pilot. 31
Digital administration of study outcome and health economic measures
The third research objective was to test the digital administration of the study outcome and health economic measures via web-based meetings. All procedures and materials for supporting remote administration of study measures were finalised during the adaptation phase. Digital administration of the study outcome and health economic measures was undertaken with all pilot participants at baseline and post intervention. Collecting outcome data via a web-based meeting platform enabled the team to recruit people from further afield (e.g. Wales and North of England as well as South East England) than would have been possible if data collection had taken place in person.
The results show very high levels of data completeness achieved at both baseline and post intervention for the outcome measures. Other than one measure with one item response missing, completed forms were 100% useable. Overall, data availability for economic evaluation was very good, and there were no major issues identified that would pose a barrier to conducting a cost-effectiveness analysis alongside a full trial.
Group members’ narrative accounts of their experience of completing the study measures were overall very positive. These positive responses to the research assessment sessions may in part reflect the fact that people were experiencing varying degrees of isolation and reduced opportunities for meaningful engagement and relationships because of the pandemic. 48,67 They may therefore have particularly enjoyed and appreciated having one-to-one time with an interested researcher.
Group members felt it would be acceptable for other people with intellectual disabilities to complete the measures in the way they had, via web-based meetings with a researcher. Further, the proposal to obtain some health economics data from routinely collected data was appraised positively by all but one participant.
Based on these findings, we can conclude that it was both feasible and acceptable to collect study outcomes and health economic data remotely via web-based meeting platforms. Moreover, the data produced via this method of collection met the progression criterion of more than 80% of the data being useable. The implications of these findings for data collection during a later trial are discussed below.
Usual practice
The fourth and final research objective was to build on community assessments to describe what UP might look like for groups of people with mild-to-moderate intellectual disabilities in the wake of COVID-19, to inform a potential future trial. Many organisations had transferred group meetings to a digital environment, suggesting that it would be possible to have an in person or digital ‘UP’ control arm in a potential future trial.
Strengths of the study
The study team, guided by our PPI group and other stakeholders, was able to be creative in response to the unprecedented challenges brought about by the COVID-19 pandemic. The study funding was utilised to develop an innovative digital adaptation of an intervention for a population that is often excluded from e-health initiatives. The PPI for the study has been a key strength, with proactive and responsive adjustments made throughout to ensure our advisors with intellectual disabilities could be meaningfully involved in discussing and informing a range of issues the study faced. The PPI advisors were also supported to develop new skills in relation to research, such as interviewing and co-authoring.
The pilot of the adapted intervention included four groups across a wide geographical area (including England and Wales). All facilitators were recruited to the interviews and only one participant was lost to post-intervention outcome measurement.
The qualitative evaluation provided useful insights into the perspectives of facilitators and group members, which will help inform future optimisation of the intervention and design of a trial. The study included an assessment of the feasibility and acceptability of collecting outcome and economic evaluation measures via web-based meetings; this proved highly successful, largely due to the research team’s detailed planning and the development of robust support processes which ensured a positive experience for participants. The potential for collecting study data in this way will be of benefit to a future trial where larger numbers of people, spread across a greater geographical region, will be reachable in a pragmatic and cost-effective way.
Limitations of the study
Recruitment and retention
While we were successful in recruiting the target number of groups and participants during the pandemic, the numbers involved were small. It is possible that, with the move to web-based meetings, facilitators were looking for new activities to engage their groups in, and this could have aided recruitment to the pilot. Furthermore, all facilitators and group members consented on the basis that they would receive the intervention. It therefore remains unclear whether the process of recruitment to a future trial would be feasible and acceptable.
It needs emphasising that for the three community-based groups, all participants were actively engaged with the respective organisations and as such may have been more socially engaged and motivated than a representative sample of people with intellectual disabilities. In general, the group format and the need to be part of an established group for people with intellectual disabilities, as articulated in the Logic Model (see Figure 3), unquestionably make it harder to reach individuals who may be very socially isolated and perhaps at greatest risk of poor well-being.
Finally, we have not been able to assess whether it would have been possible to retain participants to a follow-up 12 months from baseline as originally planned for the feasibility RCT.
Intervention materials
Feedback from facilitators indicates that the Wiki is not the best format for the delivery of the intervention. Some facilitators said they would have preferred having the materials on PowerPoint slides, together with the videos streamed via a YouTube channel. The participant booklet, introduced as part of the digital adaptation to facilitate sharing of notes and resources, was generally received positively but requires some modifications to the images in the action planning section.
Methodological
There were some limitations with the focus group methodology, namely that we were not able to achieve attendance from everyone who completed Digital STORM and that some participants only attended part of the focus group meetings. Every effort was made to ensure the focus groups would be a comfortable experience for participants; for example, to avoid overburdening participants, focus groups were purposefully not scheduled for the end of the fifth session but on a separate occasion (as advised by our stakeholder advisors and reinforced by the group facilitators when making arrangements). In prompting discussion about group members’ experiences of Digital STORM, it was difficult to discern whether, for some questions, group members agreed with others’ comments because they shared the same perspective or because they were unsure how to respond to the question. Going forward, it may be more beneficial to carry out individual interviews with STORM participants. Moreover, the individual interviews with facilitators did not elicit as rich a description of the perceived benefits of the intervention as the group member focus groups did. One reason for this, noted by some of the facilitators, is that intervention effects take time to materialise, yet the time frame for obtaining feedback from facilitators and participants was tight because the focus of the present pilot was on feasibility and acceptability.
While the use of the EQ-5D-Y is feasible with this study population, no specific utility tariff currently exists that reflects the population appropriately in all respects. Prior to a future trial, the available measures and tariffs should be reviewed and a decision taken on whether the EQ-5D-Y is indeed the most appropriate measure.
Data protection regulations relating to the recording of sessions and data storage meant that a researcher had to set up and start all online group sessions. While this was acceptable to facilitators and participants, it necessitated careful co-ordination between the research team and facilitators, reduced flexibility for facilitators and was resource intensive for the research team. Further consideration will need to be given to developing appropriate and practical procedures for a larger trial.
Implications for intervention development and future research
Intervention optimisation
The findings from the pilot of the Digital STORM intervention point to a need to consider optimising some aspects of the intervention and the context for its delivery further. While this was only hinted at in the current findings, one issue to consider is whether individuals already engaged in self-advocacy may stand to benefit less from the intervention than those who have had only limited support and skills development in standing up to stigma. As such, in a future trial, recruitment should seek to include more community groups and education settings that are not specifically related to self-advocacy. Furthermore, a small number of participants in the pilot (particularly some from the specialist college setting) appeared to have little personal experience of stigmatising encounters similar to those covered in STORM. This might be explained by these participants only accessing the community in the company of a supporter. As this resulted in a more hypothetical engagement with the issues covered, going forward it should be considered whether at least occasional use of community resources without support should be a criterion for participant inclusion.
Storage and display of session materials, such as key messages and summaries, require some refinement. In particular, the use of the Wiki to stream videos should be reconsidered, as should the need for facilitators to use both the Digital STORM manual and the Wiki. As suggested by some facilitators, having the STORM materials on PowerPoint slides in the order set out in the manual, alongside the facility to stream the videos via YouTube, may offer a more user-friendly solution. This would certainly avoid some of the technical challenges noted in relation to playing the videos from within the Wiki.
For the action planning, the focus of session 4, feedback indicates that the images in the participant booklet should be reviewed, to ensure they refer more closely to actions implicated in resisting stigma. In addition, options for galvanising others’ support for the implementation of action plans should be considered.
Future trial of the STORM intervention
The present study indicates that the Digital STORM intervention and digital administration of outcome and health economics measures are feasible and acceptable. However, the fact that they were tested in an uncontrolled pilot leaves some questions unanswered. In particular, whether randomisation in an RCT design would be acceptable to organisations and participants remains untested and could be examined in an internal pilot. While retention and data completeness were very good, post-intervention data collection took place around 3 months from baseline, and the feasibility of collecting outcome and health economics data 12 months from baseline, as originally intended, is as yet untested.
Given that digital collection of data was very successful, retaining this method of data collection seems indicated for a future trial and similar future studies, due to its reduced costs, convenience and the opportunity to expand the geographical region for the research.
How STORM delivery format, take-up and retention interact with a changing pandemic context is an important matter for further consideration. There were clear arguments for a digital intervention format to not only ensure flexibility at a time of continuing uncertainty but also increase the potential for inclusion of people with intellectual disabilities who live in more rural locations or may be reluctant to venture out after dark during winter months. Going forward, a careful assessment of access to and engagement with the digital and F2F versions of the STORM intervention in changing external circumstances is merited. It is conceivable that groups may choose to start STORM using a F2F delivery format but may need to be prepared to switch to an online format should circumstances require this. Any such transition mid-way through the intervention is likely to require additional preparation and support, and it may well affect facilitator and participant experiences of the intervention as well as delivery costs.
Furthermore, while digital delivery of STORM was found to be acceptable, feasible and useful in this study, two questions need further consideration in the context of the diverse needs and backgrounds of people with intellectual disabilities: for whom digital delivery of this and similar interventions is suitable, and to what extent offering a choice of delivery format is appropriate.
Notwithstanding these considerations, the present study suggests that with support and adjustments people with intellectual disabilities can participate in online research and interventions. As such, their frequent a priori exclusion from e-health research should be questioned. Furthermore, the success of collecting data digitally in the present study indicates that digital data collection methods should be considered in research with this population; it also counters the a priori exclusion of people with mild to moderate intellectual disabilities from digitally conducted studies on the assumption that they would not be able to give valid responses to web-delivered surveys.
Conclusions and recommendations
The findings indicate that all progression criteria were fully satisfied. Accordingly, the SMG and SSC have recommended to the funder that progression to a full trial, with an internal pilot, is justified. Further issues to be assessed concern the feasibility of randomisation in an RCT design, and the potential to offer choice over the format of intervention delivery to those randomised to the intervention arm. Given continuing uncertainty in relation to the COVID-19 pandemic and the intermittent re-introduction of social restrictions, offering a choice between the F2F and digitally delivered versions of STORM seems appropriate and reflects the new hybrid service offer that many organisations are rapidly adopting. Of note, the choice over intervention format would have to be at group and not individual level as combining F2F and online delivery is fraught with difficulties. Optimisation of the intervention, such as revising the format of storing and displaying session materials and videos, will need some planning as the use of the Wiki in addition to the manual was found to be cumbersome.
Acknowledgements
We thank the following for their contribution to the research presented in this report:
Carla Barrett, Abi Goldsmith Sumner, Jun Yi Lee, Maya Patel, Larissa Raab and Unthinkable Digital.
We thank the SSC for their valuable advice and support.
The study was approved by the University College London Research Ethics Committee on 6 December 2019, prior to recruitment and data collection commencing (Project ID 0241/005). An amendment to the study methods was approved by the committee on 19 January 2021 before the start of the pilot of the adapted Digital STORM intervention.
Contributions of authors
Katrina Scior (https://orcid.org/0000-0002-4679-0090) was the project lead, responsible for the planning and oversight of all aspects of the project’s delivery and reporting. She chaired the SMG, supervised the study manager and other study staff, and co-led the qualitative evaluation and intervention adaptation. She was a lead member of the IAG, interviewed group facilitators and rated recordings of intervention sessions to assess fidelity. With LR she led on writing the final report.
Lisa Richardson (https://orcid.org/0000-0002-9753-0184) was the study manager responsible for the day-to-day operational delivery of the research. Along with KS, she co-led the qualitative evaluation and the intervention adaptation. She was lead supervisor for the study’s research assistants, with support from KS. She supported the PPI along with CB and co-delivered the intervention training with HB. She conducted the qualitative analysis of facilitator interviews and co-supervised with KS the analysis of the group member focus groups. She led on writing the report and synthesised others’ contributions.
Elizabeth Randell (https://orcid.org/0000-0002-1606-3175) was the senior trial manager, providing support to all aspects of the project from design through to reporting. She co-ordinated the work of colleagues at the CTU. She advised on the process evaluation, intervention adaptation and structure of the final report. She was also a member of the IAG. She led writing the abstract, scientific summary and introduction as well as commenting on report drafts.
Michaela Osborne (https://orcid.org/0000-0001-5436-1088) was the main research assistant on the study. She contributed to PPI meetings, supported the adaptation of outcome measures, and developed the data collection protocol and survey of usual practice. She conducted the study recruitment, consented participants and collected study outcome data. Along with HR, she collected focus group data and conducted a qualitative analysis of the transcripts. She contributed to reporting of the recruitment, usual practice and reporting of the focus group analysis.
Harriet Bird (https://orcid.org/0000-0003-0266-695X) co-delivered the STORM intervention training and provided supervision to facilitators, and contributed to reporting of these. She was also a member of the IAG and supported additional PPI through the Mencap Research Champions group.
Afia Ali (https://orcid.org/0000-0002-0104-9370) advised on all aspects of the project and commented on report drafts.
Eva-Maria Bonin (https://orcid.org/0000-0001-9123-9217) was the lead Health Economist. She designed the economic evaluation and conducted the data analysis. She co-wrote the report sections on the health economic methods and results with KZ. She commented on report drafts.
Adrian Brown and Celia Brown advised on all aspects of the project as members of the PPI group. They co-chaired PPI meetings, reviewed all participant materials and methods for administration of outcome measures, were members of the IAG and contributed to SMG meetings.
Christine-Koulla Burke (https://orcid.org/0000-0002-5562-8896) co-chaired the patient and public involvement group and was a member of the IAG. She contributed to the report section on PPI and commented on report drafts.
Lisa Bush advised on all aspects of the project as group facilitator advisor to the SMG and research team. She was a member of the IAG and contributed to SMG meetings.
Jason Crabtree (https://orcid.org/0000-0002-0542-1504) advised on all aspects of the project and commented on report drafts.
Karuna Davies (https://orcid.org/0000-0003-1434-9580) was a second research assistant on the study. She consented participants and collected outcome data for the project. She coded recordings of intervention sessions for fidelity assessments and analysed the usual practice survey. She contributed to writing the methods related to fidelity assessments and reporting of usual practice.
Paul Davies advised on all aspects of the project as a member of the PPI group. He co-chaired PPI meetings, reviewed all participant materials and methods for administration of outcome measures, was a member of the IAG and contributed to SMG meetings.
David Gillespie (https://orcid.org/0000-0002-6934-2928) advised on statistical analysis and commented on report drafts.
Andrew Jahoda (https://orcid.org/0000-0002-3985-6098) advised on all aspects of the project and the fidelity and qualitative methods in particular. He commented on report drafts.
Sean Johnson (https://orcid.org/0000-0003-4641-835X) was the data manager. He led the data management plan, set up the database for outcome data and led on analysing and reporting of the attendance and fidelity data. He commented on report drafts.
Richard Hastings (https://orcid.org/0000-0002-0495-8270) contributed to all aspects of project planning and management and commented on report drafts.
Laura Kerr advised on all aspects of the project as group facilitator advisor to the SMG and research team. She was a member of the IAG and contributed to SMG meetings.
Rachel McNamara (https://orcid.org/0000-0002-7280-1611) advised on all aspects of the project and commented on report drafts.
Jane Menzies advised on all aspects of the project as group facilitator advisor to the SMG and research team. She was a member of the IAG and contributed to SMG meetings.
Harry Roche advised on all aspects of the project as a member of the PPI group. He co-chaired PPI meetings, reviewed all participant materials and methods for administration of outcome measures, was a member of the IAG and contributed to SMG meetings. He also conducted focus group interviews with MO.
Melissa Wright (https://orcid.org/0000-0002-1011-4795) was the study statistician, conducting the quantitative analysis of outcome data and reporting of the sample demographics as well as the completeness and usability of the outcome data. She commented on report drafts.
Kyann Zhang (https://orcid.org/0000-0002-0612-1503) conducted the health economic analysis under EB’s supervision and co-wrote the report sections on the health economics methods and results with EB.
Data-sharing statement
Data can be obtained from the corresponding author upon reasonable request.
Ethics statement
Ethical approval was granted by the University College London Research Ethics Committee on 5 December 2019 (Project ID 0241/005). An amendment was approved on 22 January 2021.
Disclaimers
This report presents independent research funded by the National Institute for Health and Care Research (NIHR). The views and opinions expressed by authors in this publication are those of the authors and do not necessarily reflect those of the NHS, the NIHR, the PHR programme or the Department of Health and Social Care. If there are verbatim quotations included in this publication the views and opinions expressed by the interviewees are those of the interviewees and do not necessarily reflect those of the authors, those of the NHS, the NIHR, the PHR programme or the Department of Health and Social Care.
References
- Scior K, Ali A, Gillespie D, Bonin E-M, Crabtree J, McNamara R, et al. National Institute for Health Research Study Protocol (v2). The STanding up FOR Myself (STORM) Psychosocial Group Intervention for Young People and Adults With Intellectual Disabilities: Feasibility Study n.d. https://fundingawards.nihr.ac.uk/award/17/149/03 (accessed 17 November 2023).
- Rickard W, Donkin A. A Fair, Supportive Society: A Social Determinants of Health Approach to Improving the Lives of People with Learning Disabilities. London: UCL Institute of Health Equity; 2018.
- Hatton C, Glover G, Emerson E, Brown I. People With Learning Disabilities in England 2015. Durham: Public Health England; 2016.
- Buckles J, Luckasson R, Keefe E. A systematic review of the prevalence of psychiatric disorders in adults with intellectual disability, 2003–2010. J Ment Health Res Intellect Disabil 2013;6:181-207. https://doi.org/10.1080/19315864.2011.651682.
- Cooper S, Smiley E, Morrison J, Williamson A, Allan L. Mental ill-health in adults with intellectual disabilities: prevalence and associated factors. Br J Psychiatry 2007;190:27-35. https://doi.org/10.1192/bjp.bp.106.022483.
- Emerson E, Madden R, Graham H, Llewellyn G, Hatton C, Robertson J. The health of disabled people and the social determinants of health. Public Health 2011;125:145-7. https://doi.org/10.1016/j.puhe.2010.11.003.
- Mencap. Bullying Wrecks Lives: The Experiences of Children and Young People With a Learning Disability. 2007.
- Beart S, Hardy G, Buchan L. How people with intellectual disabilities view their social identity: a review of the literature. J Appl Res Intellect Disabil 2005;18:47-56. https://doi.org/10.1111/j.1468-3148.2004.00218.x.
- Paterson L, McKenzie K, Lindsay B. Stigma, social comparison and self‐esteem in adults with an intellectual disability. J Appl Res Intellect Disabil 2012;25:166-76. https://doi.org/10.1111/j.1468-3148.2011.00651.x.
- Jahoda A, Markova I. Coping with social stigma: people with intellectual disabilities moving from institutions and family home. J Intellect Disabil Res 2004;48:719-29. https://doi.org/10.1111/j.1365-2788.2003.00561.x.
- Ali A, King M, Strydom A, Hassiotis A. Self-reported stigma and symptoms of anxiety and depression in people with intellectual disabilities: findings from a cross sectional study in England. J Affect Disord 2015;187:224-31. https://doi.org/10.1016/j.jad.2015.07.046.
- Dagnan D, Waring M. Linking stigma to psychological distress: testing a social–cognitive model of the experience of people with intellectual disabilities. Clin Psychol Psychother 2004;11:247-54. https://doi.org/10.1002/cpp.413.
- Scior K, Werner S. Intellectual Disability and Stigma: Stepping Out from the Margins. Basingstoke: Palgrave Macmillan; 2016.
- Quarmby K. Scapegoat: Why We Are Failing Disabled People. Portobello Books; 2011.
- Richardson L, Beadle-Brown J, Bradshaw J, Guest C, Malovic A, Himmerich J. “I felt that I deserved it” – experiences and implications of disability hate crime. Tizard Learn Disab Rev 2016;21:80-8. https://doi.org/10.1108/TLDR-03-2015-0010.
- Mitter N, Ali A, Scior K. Stigma experienced by family members of people with intellectual and developmental disabilities: multidimensional construct. BJPsych Open 2018;4:332-8. https://doi.org/10.1192/bjo.2018.39.
- Mitter N, Ali A, Scior K. Stigma experienced by families of individuals with intellectual disabilities and autism: a systematic review. Res Dev Disabil 2019;89:10-21. https://doi.org/10.1016/j.ridd.2019.03.001.
- Werner S, Scior K. Intellectual Disability and Stigma: Stepping Out from the Margins. Cham: Springer Nature; 2016.
- Scior K, Werner S. Changing Attitudes to Learning Disability. London: Mencap; 2015.
- Seewooruttun L, Scior K. Interventions aimed at increasing knowledge and improving attitudes towards people with intellectual disabilities among lay people. Res Dev Disabil 2014;35:3482-95. https://doi.org/10.1016/j.ridd.2014.07.028.
- Mittal D, Sullivan G, Chekuri L, Allee E, Corrigan PW. Empirical studies of self-stigma reduction strategies: a critical review of the literature. Psychiatr Serv 2012;63:974-81. https://doi.org/10.1176/appi.ps.201100459.
- Yanos PT, Lucksted A, Drapalski AL, Roe D, Lysaker P. Interventions targeting mental health self-stigma: a review and comparison. Psychiatr Rehabil J 2015;38:171-8. https://doi.org/10.1037/prj0000100.
- Barbero JAJ, Hernández JAR, Esteban BL, García MP. Effectiveness of antibullying school programmes: a systematic review by evidence levels. Child Youth Serv Rev 2012;34:1646-58. https://doi.org/10.1016/j.childyouth.2012.04.025.
- Merrell KW, Gueldner BA, Ross SW, Isava DM. How effective are school bullying intervention programs? A meta-analysis of intervention research. Sch Psychol Q 2008;23:26-42. https://doi.org/10.1037/1045-3830.23.1.26.
- Firmin RL, Luther L, Lysaker PH, Minor KS, Salyers MP. Stigma resistance is positively associated with psychiatric and psychosocial outcomes: a meta-analysis. Schizophr Res 2016;175:118-28. https://doi.org/10.1016/j.schres.2016.03.008.
- Firmin RL, Luther L, Lysaker PH, Minor K, Mcgrew J, Cornwell M. Stigma resistance at the personal, peer, and public levels: a new conceptual model. Stigma Health 2017;2. https://doi.org/10.1037/sah0000054.
- Szivos S, Griffiths E. Consciousness raising and social identity theory: a challenge to normalization. Clin Psychol Forum 1990;1:11-5.
- Doswell S, Caplan K, Brooks M. ‘Knowing me, knowing you’ – a group exploring life with a learning disability and how to maximise strengths. BPS Bull Faculty People Intel Disabil 2015;13:27-31.
- Mahlke CI, Krämer UM, Becker T, Bock T. Peer support in mental health services. Curr Opin Psychiatry 2014;27:276-81. https://doi.org/10.1097/YCO.0000000000000074.
- Burke EM, Pyle M, Machin K, Morrison AP. Providing mental health peer support 1: a Delphi study to develop consensus on the essential components, costs, benefits, barriers and facilitators. Int J Soc Psychiatry 2018;64:799-812. https://doi.org/10.1177/0020764018810307.
- Scior K, Cooper R, Fenn K, Poole L, Colman S, Ali A, et al. ‘Standing up for myself’ (STORM): development and qualitative evaluation of a psychosocial group intervention designed to increase the capacity of people with intellectual disabilities to manage and resist stigma. J Appl Res Intellect Disabil 2022;35:1297-306. https://doi.org/10.1111/jar.13018.
- European Parliament. The Rise of Digital Health Technologies During the Pandemic. 2021. www.europarl.europa.eu/RegData/etudes/BRIE/2021/690548/EPRS_BRI(2021)690548_EN.pdf (accessed 17 November 2023).
- Horton T, Hardie T, Mahadeva S, Warburton W. Securing a Positive Health Care Technology Legacy from COVID-19 2021. www.health.org.uk/publications/long-reads/securing-a-positive-health-care-technology-legacy-from-covid-19 (accessed 17 November 2023).
- Lussier-Desrochers D, Normand CL, Romero-Torres A, Lachapelle Y, Godin-Tremblay V, Dupont A-E, et al. Bridging the digital divide for people with intellectual disability. Cyberpsychol: J Psychosoc Res Cybersp 2017;11. https://doi.org/10.5817/CP2017-1-1.
- Sheehan R, Hassiotis A. Digital mental health and intellectual disabilities: state of the evidence and future directions. Evid Based Ment Health 2017;20:107-11. https://doi.org/10.1136/eb-2017-102759.
- Deady M, Choi I, Calvo RA, Glozier N, Christensen H, Harvey SB. eHealth interventions for the prevention of depression and anxiety in the general population: a systematic review and meta-analysis. BMC Psychiatry 2017;17:1-14. https://doi.org/10.1186/s12888-017-1473-1.
- Lehtimaki S, Martic J, Wahl B, Foster KT, Schwalbe N. Evidence on digital mental health interventions for adolescents and young people: systematic overview. JMIR Mental Health 2021;8. https://doi.org/10.2196/25847.
- Luo C, Sanger N, Singhal N, Pattrick K, Shams I, Shahid H, et al. A comparison of electronically-delivered and face to face cognitive behavioural therapies in depressive disorders: a systematic review and meta-analysis. EClinicalMedicine 2020;24. https://doi.org/10.1016/j.eclinm.2020.100442.
- Newby JM, Twomey C, Li SSY, Andrews G. Transdiagnostic computerised cognitive behavioural therapy for depression and anxiety: a systematic review and meta-analysis. J Affect Disord 2016;199:30-41. https://doi.org/10.1016/j.jad.2016.03.018.
- Sasseville M, LeBlanc A, Boucher M, Dugas M, Mbemka G, Tchuente J, et al. Digital health interventions for the management of mental health in people with chronic diseases: a rapid review. BMJ Open 2021;11. https://doi.org/10.1136/bmjopen-2020-044437.
- Wagner B, Horn AB, Maercker A. Internet-based versus face-to-face cognitive-behavioral intervention for depression: a randomized controlled non-inferiority trial. J Affect Disord 2014;152:113-21. https://doi.org/10.1016/j.jad.2013.06.032.
- Vereenooghe L, Gega L, Langdon PE. Intellectual disability and computers in therapy: views of service users and clinical psychologists. Cyberpsychology 2017;11. https://doi.org/10.5817/CP2017-1-11.
- Ofcom. Disabled Users Access to and Use of Communication Devices and Services: Learning Disability Research Summary. 2019. www.ofcom.org.uk/__data/assets/pdf_file/0026/132965/Research-summary-learning-disability.pdf (accessed 17 November 2023).
- Oudshoorn CE, Frielink N, Nijs SL, Embregts PJ. eHealth in the support of people with mild intellectual disability in daily life: a systematic review. J Appl Res Intellect Disabil 2020;33:1166-87. https://doi.org/10.1111/jar.12758.
- National Institute for Health and Care Excellence (NICE). Evidence Standards Framework for Digital Health Technologies. 2019. www.nice.org.uk/Media/Default/About/what-we-do/our-programmes/evidence-standards-framework/digital-evidence-standards-framework.pdf (accessed 17 November 2023).
- Ayres KM, Mechling L, Sansosti FJ. The use of mobile technologies to assist with life skills/independence of students with moderate/severe intellectual disability and/or autism spectrum disorders: considerations for the future of school psychology. Psychol Sch 2013;50:259-71. https://doi.org/10.1002/pits.21673.
- Reichenberg M. Who will present it during the broadcast? A case study at a daily activity centre. J Res Spec Educ Needs 2016;16:65-73. https://doi.org/10.1111/1471-3802.12061.
- Flynn S, Hayden N, Clarke L, Caton S, Hatton C, Hastings RP, et al. Coronavirus and People With Learning Disabilities Study Wave 3 Results (Full Report) 2021.
- Lodge K. COVID-19 shows that the lives of people with a learning disability are still not treated as equal. BMJ Opin 2020. https://blogs.bmj.com/bmj/2020/09/01/COVID-19-shows-that-the-lives-of-people-with-a-learning-disability-are-still-not-treated-as-equal/ (accessed 17 November 2023).
- Thomas R. Unprecedented number of DNR orders for learning disabilities patients. Health Serv J 2020. www.hsj.co.uk/coronavirus/unprecedented-number-of-dnr-orders-for-learning-disabilities-patients/7027480.article (accessed 17 November 2023).
- Jahoda A, Willner P, Rose J, Stenfert Kroese B, Lammie C, Shead J, et al. Development of a scale to measure fidelity to manualized group-based cognitive behavioural interventions for people with intellectual disabilities. Res Dev Disabil 2013;34:4210-21. https://doi.org/10.1016/j.ridd.2013.09.006.
- Ritchie J, Spencer L. Analysing Qualitative Data. London: Routledge; 1994.
- Ritchie J, Lewis J. Qualitative Research Practice: A Guide for Social Science Students and Researchers. London: SAGE; 2003.
- Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol 2013;13:1-8. https://doi.org/10.1186/1471-2288-13-117.
- Tennant R, Hiller L, Fishwick R, Platt S, Joseph S, Weich S, et al. The Warwick–Edinburgh Mental Well-being Scale (WEMWBS): development and UK validation. Health Qual Life Outcomes 2007;5:1-13. https://doi.org/10.1186/1477-7525-5-63.
- Dagnan D, Sandhu S. Social comparison, self‐esteem and depression in people with intellectual disability. J Intellect Disabil Res 1999;43:372-9. https://doi.org/10.1046/j.1365-2788.1999.043005372.x.
- Ali A, Strydom A, Hassiotis A, Williams R, King M. A measure of perceived stigma in people with intellectual disability. Br J Psychiatry 2008;193:410-5. https://doi.org/10.1192/bjp.bp.107.045823.
- Anderson C, John OP, Keltner D. The personal sense of power. J Pers 2012;80:313-44. https://doi.org/10.1111/j.1467-6494.2011.00734.x.
- Wille N, Badia X, Bonsel G, Burström K, Cavrini G, Devlin N, et al. Development of the EQ-5D-Y: a child-friendly version of the EQ-5D. Qual Life Res 2010;19:875-86. https://doi.org/10.1007/s11136-010-9648-y.
- Beecham J, Knapp M. Measuring Mental Health Needs. London: Gaskell; n.d.
- Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ 2014;348. https://doi.org/10.1136/bmj.g1687.
- Brown C, Davies P, Roche H, Brown A, Scior K. People with learning disabilities are equal members of society, not vulnerable burdens. BMJ Opin 2021. https://blogs.bmj.com/bmj/2021/03/18/people-with-learning-disabilities-are-equal-members-of-society-not-vulnerable-burdens/ (accessed 17 November 2023).
- Prevolnik Rupel V, Ogorevc M; IMPACT HTA HRQoL Group. EQ-5D-Y value set for Slovenia. PharmacoEcon 2021;39:463-71. https://doi.org/10.1007/s40273-020-00994-4.
- Shiroiwa T, Ikeda S, Noto S, Fukuda T, Stolk E. Valuation survey of EQ-5D-Y based on the international common protocol: development of a value set in Japan. Med Decis Making 2021;41:597-606. https://doi.org/10.1177/0272989X211001859.
- Hithersay R, Strydom A, Moulster G, Buszewicz M. Carer-led health interventions to monitor, promote and improve the health of adults with intellectual disabilities in the community: a systematic review. Res Dev Disabil 2014;35:887-90. https://doi.org/10.1016/j.ridd.2014.01.010.
- Owens R, Earle S, McNulty C, Tilley E. What works in community health education for adults with learning disabilities: a scoping review of the literature. J Appl Res Intellect Disabil 2020;33:1268-83. https://doi.org/10.1111/jar.12746.
- Flynn S, Caton S, Gillooly A, Bradshaw J, Hastings R, Hatton C, et al. The experiences of adults with learning disabilities in the UK during the COVID-19 pandemic: qualitative results from wave 1 of the coronavirus and people with learning disabilities study. Tizard Learning Disability Review 2021;4:224-9. https://doi.org/10.1108/TLDR-09-2021-0027.
Appendix 1 Facilitator information leaflet
UCL Research Ethics Committee Approval ID Number: 0241/005
We are working on a study which aims to advance our understanding of how people with learning disabilities respond to prejudice and discrimination they may experience, and whether a new group programme can support them in this process.
Our research team
- We are a team of researchers from University College London (UCL); we are part of the UCL Unit for Stigma Research (UCLUS).
- Our research team also includes researchers, clinicians and self-advocates with learning disabilities from other parts of the UK.
Contact details for the research team:
Should you have any queries or require any further information, please do not hesitate to contact the research team.
Please contact Michaela Osborne (Research Assistant) if your group may be interested in taking part.
E-mail: m.osborne@ucl.ac.uk
Please contact Lisa Richardson (Study Manager) or Katrina Scior (Study Lead) in case of any complaints.
E-mail: lisa.richardson@ucl.ac.uk (Working days: Tuesday and Wednesday)
E-mail: k.scior@ucl.ac.uk
This study is funded by the National Institute for Health Research (NIHR) Public Health Research programme (NIHR PHR Project 17/149/03).
Background to this research study
- Research shows that people with learning disabilities often face negative consequences because of negative stereotypes, prejudice, bullying and discrimination associated with having a learning disability (what we collectively refer to as ‘stigma’ hereafter).
- Despite positive changes in policies, service provision and societal views, negative attitudes and discrimination remain everyday realities for many people with learning disabilities.
- Despite a clear need to do more to empower people with learning disabilities to manage and resist stigma, to date few interventions have targeted this and none have been shown to be effective.
- Developing effective ways of promoting skills and confidence in standing up to stigma is likely to have positive effects on their mental well-being and social interactions.
About the research study
- We have developed a new psychosocial group programme called Standing Up for Myself (STORM), to help people with learning disabilities cope with and stand up to the stigma they often have to face on account of having a learning disability. More information about STORM is provided below.
- We have adapted STORM for digital delivery (e.g. using Zoom meetings); we refer to this as ‘Digital STORM’.
- We are now looking to find four to six groups for people with learning disabilities to try out Digital STORM as part of our pilot evaluation (more information about eligibility for the pilot is provided below). If Digital STORM is found to be acceptable, it may be included in a larger feasibility study in the future.
- Our vision for both STORM (for F2F meetings) and Digital STORM (digital meetings) is that it will be a freely available resource for group facilitators to draw on when working with groups of people with learning disabilities.
Who can participate in the study?
- We are looking to include existing groups of people with learning disabilities (educational, activity, social or self-advocacy focused) in this study.
- It is important that the group:
  - Is in place already (although group meetings may have been disrupted due to COVID-19 and associated restrictions)
  - Intends to meet regularly (at least fortnightly) between March and April 2021
  - Has at least three and no more than six members with learning disabilities who wish to participate
  - Is able to meet for five 90-minute sessions weekly or fortnightly (i.e. is willing to replace five usual meetings with Digital STORM)
  - Has a group facilitator willing to receive training and supervision to facilitate the Digital STORM programme
  - Has organisational support to deliver the programme
- At least two groups taking part in the pilot need to be already set up for online working and meetings. We would also like to work with one group that may not be set up for this way of working, to whom we would provide some support for online working and meetings (group members would need to have some resources as outlined below).
- As part of this, it is important that the group members:
  - Have a mild-to-moderate learning disability. This is because the study has ethical approval only for people who have capacity to consent. Also, the activities in this programme involve watching films and discussing these – we are in no way looking to exclude people with severe and profound learning disabilities but feel an intervention that meaningfully involves them would need to be designed in line with their needs.
  - Are aged 16 or over.
  - Have capacity to provide informed consent to participate in the study.
  - Are able to communicate in English.
  - Know one another.
  - Have someone that they can talk to for support if they find the content of the sessions upsetting.
  - Have access to the internet and a device that can access web meetings, be this via Zoom, Skype, MS Teams or Google Meet, to engage with the Digital STORM programme.
  - Where needed, have support to access web-based meetings.
  - Are able to complete questionnaires in an online 1:1 meeting (with support) on one occasion before starting Digital STORM.
If you are unsure about your group's or a group member's suitability to take part in the programme, we are happy to discuss your concerns, so please do get in touch.
What will I need to do as a group facilitator?
- Clarify in discussion with the research team whether the group as a whole and individual group members are suitable to take part in the study.
- If yes, we would like your assistance in collecting some information from group members:
  - Once group members have had the study presented to them and expressed interest in taking part, a member of the research team would need to meet with each group member individually via video call to obtain their informed consent and complete a questionnaire. These meetings would be organised through you as the group facilitator.
  - We will meet with every group member twice to complete the questionnaire: the first time before the first Digital STORM group session, soon after they have expressed interest in taking part, and the second time following the final Digital STORM group session.
  - The questionnaire contains questions about how individuals feel about themselves, their well-being, how empowered they feel, how they feel having a learning disability affects them, and about services they access for support. Questions will also involve looking at stories about being treated fairly and unfairly; group members will be asked to imagine they are in the story and how they might respond in these situations.
  - Following the final Digital STORM session, supporting two members of the research team to attend and ask group members questions about their experiences of the Digital STORM programme, including how they found the web-based format and issues around privacy and support received.
- You will be asked to support the organisation of a focus group session with group members. This will be led by two researchers who will ask about group members' experiences of Digital STORM.
- You will be asked to take part in an individual interview led by a researcher. Questions will ask about your experiences of delivering Digital STORM, including any barriers you experienced.
- The focus groups and interviews will be audio-recorded, transcribed and analysed. Transcripts will be anonymised, and you will therefore not be identifiable.
- The Digital STORM programme will involve:
  - Facilitating the delivery of sessions 1–5 via web-based meetings
  - You will be supported by Mencap, our partner in this research, to deliver the programme, including:
    - Receiving 2–3 hours of training via a web-based meeting
    - ‘Supervision’ to support you in delivering the programme to your group – in addition to planned meetings or telephone calls, you will be able to contact the Mencap lead for the STORM programme if any queries or concerns arise at any other times
- We will ask you to record the group sessions (with group members’ consent); this is so that we can learn more about how the STORM programme works via a digital delivery format.
- If you have any queries or concerns about the study itself, you will be able to contact the Study Manager at any time.
- We will ask you to monitor and keep note of any problems that arise with regard to group members' privacy, confidentiality and support needs (given they will be in their personal environment) as well as details of any practical problems accessing the meetings.
- You will be required to explain the confidentiality limits within the group. This includes ensuring that group members understand that what they say in the group will be kept confidential unless group members share information that may cause harm to themselves or someone else. The safeguarding procedures at the organisation where the groups are held should be followed as per usual.
The Standing Up for Myself (Digital STORM) programme
- Digital STORM is a four-session group programme (plus one follow-up session); each session lasts 1 hour and 30 minutes.
- The group will watch filmed first-hand testimonials by people with learning disabilities, and engage in discussions and practical exercises.
- It is designed to be interactive, thought-provoking and fun, despite the serious topic.
- The STORM programme draws on psychological theories and evidence.

The themes of the four sessions are:
1. What does ‘learning disability’ mean to people with learning disabilities? What does it mean to me?
2. How are people with learning disabilities treated by others?
3. How do people with learning disabilities respond to negative treatment from others? What strategies can I use?
4. What do I want to try and do more of to respond to negative treatment from others?
What will happen to the questionnaires and group recordings?
The anonymised responses to the online questionnaires will be available to the UCL research team, with the exception of answers to questions about service use, which will be shared with our colleagues at London School of Economics. No one outside the research team will have access to any personal or identifying information about participants. The information will be stored safely on a password-protected computer. The group recordings will be transcribed with any identifiable information about participants removed and will be saved securely on a password-protected computer. The UK Data Protection Act 2018 will be adhered to at all times.
What will happen to the results of the research study?
The research findings will determine whether Digital STORM might be included in a future research study to test the feasibility and acceptability of Digital STORM for people with learning disabilities. This pilot study may be written into reports, which could be published. It will not be possible to identify any of the individuals who take part in the study from the reports, as all the information will be anonymised, with information from many individuals grouped together.
Local data protection privacy notice
The controller for this project will be UCL. The UCL Data Protection Officer provides oversight of UCL activities involving the processing of personal data.
This ‘local’ privacy notice sets out the information that applies to this particular study. Further information on how UCL uses participant information can be found in our ‘general’ privacy notice for participants in health and care research studies.
Data will be collected using web-based meeting platforms. The categories of personal data used will be the name of the organisation through which the person participates in this research, name of person, age and gender, as well as contact details (to facilitate arranging meetings for the purposes of data collection). The lawful bases that will be used to process this personal data are: ‘Public task’ for personal data and ‘Research purposes’ for special category data. The data will be processed for as long as it is required for the research project. If we are able to anonymise or pseudonymise the personal data provided, we will undertake this and will endeavour to minimise the processing of personal data wherever possible.
If you are concerned about how personal data are being processed, or if you would like to contact us about your and/or the rights of the participants you are supporting, please contact a member of the research team in the first instance (contact details on page 1). In addition, the UCL data protection team can also be contacted at dataprotection@ucl.ac.uk should you have any further concerns.
What if something goes wrong?
If you have any concerns or complaints about the way the research is being managed, you can contact Lisa Richardson or Katrina Scior in the first instance, our contact details are on page 1.
You can also contact the Chair of the UCL Research Ethics Committee with complaints – ethics@ucl.ac.uk
Thank you for reading this information sheet
Appendix 2 Interview and focus group guides/scripts
Facilitator interview schedule
The schedule covered the following topics, each accompanied by possible questions for the interviewer (questions not reproduced here):
- Experience of delivering STORM
- Digital delivery
- Supporting group members
- Training and Manual
- Supervision
- Positive impact of STORM
- Adverse impact of STORM
- Overall
Focus group topic guide
1. What did you like about STORM sessions?
2. What did you not like about STORM sessions?
3. What did you learn?
4. Has STORM helped you change anything? Has it made anything better for you? Has it made anything worse for you?
5. Has STORM made you feel more confident about standing up for yourself?
6. What was good about doing STORM online?
7. What was not good about doing STORM online?
8. Did you or your group have problems with technology? What happened? Was it ok in the end? Did it get in the way of doing STORM?
9. What did you think of the STORM booklet? Did you use it during STORM sessions? Did you look at it outside of STORM sessions?
10. Did you find anything upsetting about doing STORM?
11. Did you feel you had support?
12. Who supported you?
13. Should we do more to make sure group members feel well supported?
14. What should we do?
15. Your STORM sessions were recorded by a researcher on Zoom, was this ok or not ok?
16. If not ok, please can you tell us what was not ok?
17. Were you able to talk about personal things in a private way in STORM sessions?
18. How can we change something to make it more private?
19. Would you recommend Digital STORM to others?
20. If you said yes, why would you recommend Digital STORM?
21. If you said no, why would you not recommend Digital STORM?
22. Could we change something to make Digital STORM better?
23. If we make the changes you asked for, would you then recommend Digital STORM to others?
Appendix 3 Standing Up for Myself outcome measures
Note the same questionnaire was delivered at baseline and post intervention. The sociodemographic data were only collected at baseline. EQ-5D-Y removed for copyright reasons.
Appendix 4 Standing Up for Myself Client Service Receipt Inventory
Appendix 5 Standing Up for Myself session attendance record and facilitator notes
Appendix 6 Sample of the STORM fidelity checklist
STORM fidelity checklist for session 1 | ||||
---|---|---|---|---|
Site ID | Date of session | Rater initials | Date of rating | |
Component | Subcomponent/description | Definitely present | Somewhat present | Absent |
1. Introduce STORM programme content and structure | 1.1. Introduction to the STORM programme content provided. | |||
1.2. Introduction to the programme structure provided. | ||||
2. Recap group rules | 2.1. Facilitates discussion of group rules, with members invited to contribute. | |||
3. Introduce the session aims | 3.1. The key message and aims of the session are introduced. S1: We will be looking at what ‘learning disability’ means to different people with learning disabilities. We will be looking at what ‘learning disability’ means to you | |||
4. Video led discussions about what having LD means | 4.1. Facilitators show one video about what having a LD means. Video 1.1 ‘Mencap’s Young Ambassadors tackle stereotypes’ (Mencap) | |||
4.2. Facilitates group discussion following the video content on LD. Discussion should cover both what people in the video said about having an LD and what does LD mean to group members. | ||||
5. Video led discussion about other aspects of individual’s identity | 5.1. Facilitators show at least one video about achievements/pride/interests/skills. Video 1.2 ‘Young Ambassadors talk about achievements’ (Mencap); Video 1.3 ‘... things that are important...’ (Wiltshire Voices) | |||
5.2. Facilitates group discussion following the video content on achievements/pride/interests/skills. Discussion should cover what people in the video said about their achievements and the group members achievements. | ||||
6. Further discussion to elicit personal narrative from group members | 6.1. Facilitates a broader discussion around identity. Discussion should cover ‘other things’ about themselves, for example, what is important to them, things other people may not know about them – hobbies, achievements, what proud of or want others to know about them. | |||
7. Summarise key points from the session | 7.1. Facilitator recaps the key message of the session Summarise the session’s key points including the key message: My learning disability is only one part of me. | |||
8. Check out | 8.1. Facilitator checks how people are feeling at the end of the session. Facilitator asks: How did you find today’s session? Is everyone feeling okay? | |||
STORM generic items (rated for all sessions) | ||||
1. Balance of presentation | A balance is struck between the facilitator presenting material and promoting discussion amongst the group | |||
2. Session is focused | Degree to which the delivery of the session is focused and in line with the manual | |||
3. Engaging group members | The facilitator seeks to engage all group members in the session | |||
4. Conveying understanding | Facilitator listens carefully to group members, is empathic and responsive to contributions | |||
5. Adapting communication | Delivers session in an accessible fashion and adjusts the content or style of their own communication, where indicated | |||
6. In control of the session | Facilitator in control of the session, maintaining a good pace of communication and activity | |||
7. Warmth and respect | Facilitator shows warmth and respect to group members throughout the session | |||
8. Promotes peer support | Facilitator promotes opportunities for peer support |
Appendix 7 Number of participants present during the focus groups
Focus group questions | Participants present (n) | ||||
---|---|---|---|---|---|
G1 (/5) | G2 (/6) | G3 (/7) | G4 (/3) | Total (/21) | |
1. What did you like about STORM sessions? | 4 | 6 | 5 | 3 | 18 |
2. What did you not like about STORM sessions? | 4 | 6 | 5 | 3 | 18 |
3. What did you learn? | 4 | 6 | 6 | 3 | 19 |
4. Has STORM helped you change anything? | 4 | 6 | 7 | 3 | 20 |
5. Has STORM made you feel more confident about standing up for yourself? | 4 | 6 | 7 | 3 | 20 |
6. What was good about doing STORM online? | 4 | 6 | 7 | 3 | 20 |
7. What was not good about doing STORM online? | 4 | 6 | 7 | 3 | 20 |
8. Did you or your group have problems with technology? | 4 | 6 | 7 | 3 | 20 |
9. What did you think of the STORM booklet? | 4 | 6 | 7 | 3 | 20 |
10. Did you find anything upsetting about doing STORM? | 3 | 6 | 7 | 3 | 19 |
11. Did you feel you had support? | 3 | 6 | 7 | 3 | 19 |
12. Your STORM sessions were recorded by a researcher on Zoom, was this ok or not ok? | 3 | 6 | 6 | 3 | 18 |
13. If not ok, please can you tell us what was not ok? | 3 | 6 | 6 | 3 | 18 |
14. Were you able to talk about personal things in a private way in STORM sessions? | 3 | 6 | 6 | 3 | 18 |
15. How can we change something to make it more private? | 3 | 6 | 6 | 3 | 18 |
16. Would you recommend Digital STORM to others? | 3 | 6 | 6 | 3 | 18 |
17. If you said yes, why would you recommend Digital STORM? | 3 | 6 | 6 | 3 | 18 |
18. If you said no, why would you not recommend Digital STORM? | 3 | 6 | 6 | 3 | 18 |
19. Could we change something to make Digital STORM better? | 2 | 5 | 6 | 3 | 16 |
20. If we make the changes you asked for, would you then recommend Digital STORM to others? | 2 | 5 | 6 | 3 | 16 |
Appendix 8 Survey to establish ‘usual practice’
Group facilitators survey
We are seeking the views and experiences of facilitators/supporters who run discussion-based groups for young people and adults (age 16+) with mild-to-moderate learning disabilities. The aim of this survey is to seek information about discussion-based groups that exist for people with learning disabilities in the context of the COVID-19 pandemic and associated restrictions, and how these groups, including the activities involved, have changed since before the COVID-19 pandemic. We will use your responses to consider how we can best develop a web-based group programme for people with learning disabilities to access. The survey is purely exploratory, and your responses will only be used to further our understanding – they will not be published, nor will your responses be shared outside of the project team. The survey will take approximately 15 minutes to complete.
Q1 What type of group/s for people with learning disabilities did/do you facilitate/support? (Please select all that apply)
- ▢ Educational Group (e.g. college class)
- ▢ Vocational Group
- ▢ Social Group
- ▢ Activity-based Group (e.g. drama)
- ▢ Self-Advocacy Group
- ▢ Therapy Group
- ▢ Other (Please specify) ______ _____ ______ ________ _____ __________
The following questions are about discussion-based groups for people with learning disabilities that you facilitated/supported before the COVID-19 pandemic.
Q1 Did you facilitate or support discussion-based groups for young people and/or adults with mild-to-moderate learning disabilities before the COVID-19 pandemic?
- ○ Yes
- ○ No
Q2 Please can you tell us more about the group/s you facilitated before the COVID-19 pandemic? (Type your answers below each question for each group you facilitated/supported)
 | Average number of attendees per session | Number of facilitators/supporters per session | How often groups meet (e.g. once per week/fortnightly) | Duration of group sessions (minutes)
---|---|---|---|---
Group 1 | | | |
Group 2 (If applicable) | | | |
Group 3 (If applicable) | | | |
Group 4 (If applicable) | | | |
Q3 Please can you describe the activities that were involved in your group/s?
- ○ Group 1 ___________ _______________ _________ _____ ________
- ○ Group 2 (If applicable) ________ ____________ __________ _____ ________
- ○ Group 3 (If applicable) ___ _______________ ______________ _______ _____
- ○ Group 4 (If applicable) __ _________ ______ _____________ _________ _____
The following questions are about discussion-based groups for people with learning disabilities that you have facilitated/supported since the start of the COVID-19 pandemic and associated restrictions.
Q4 Since the COVID-19 pandemic, have some/all of the groups you facilitate continued to meet?
- ○ Yes, groups have continued as before
- ○ Yes, but with some changes
- ○ No, all groups have been suspended until further notice

Q4a Have your group/s attempted to meet at all since the start of the COVID-19 pandemic?

- ○ No
- ○ Yes, but unable to continue (Please explain why, including details of any barriers and/or attempts to overcome these) ___ ________ ____________ ____________ ____
Q5 Please can you tell us more about the group/s you have facilitated/supported since the COVID-19 pandemic and associated restrictions began? (Type your answers below each question for each group where applicable.)
 | Method of meeting (online/in person) | Average number of attendees per session | Is member attendance consistent or changeable? | Number of facilitators/supporters per session | How often groups meet (e.g. once per week/fortnightly) | Duration of group sessions (minutes)
---|---|---|---|---|---|---
Group 1 | | | | | |
Group 2 (If applicable) | | | | | |
Group 3 (If applicable) | | | | | |
Group 4 (If applicable) | | | | | |
Q6 Please can you describe the activities that are involved in your group/s and, if applicable, how activities have changed since before the COVID-19 pandemic and why?
_________ ______ ___________________ _________ ____________ _________
The following questions are about discussion-based groups for young people and/or adults with learning disabilities meeting online.
Q7 Have you supported/facilitated an online group meeting for people with learning disabilities due to the COVID-19 pandemic and associated restrictions?
- ○ No
- ○ Yes
- ○ Yes, but not due to COVID-19 (Please explain further) ______ ______ _______
Q8 What online software have you used for group meetings? (Please tick all that apply)
- ▢ Zoom
- ▢ Microsoft Teams
- ▢ Skype
- ▢ WhatsApp video calls
- ▢ Other (Please specify) _____ _____ _________ _________ _____________
Q9 How many of your members approximately have access to the technology required to meet online? (Please tick one box on the scale below for each device)
 | All | More than half | About half | Less than half | None | Unsure
---|---|---|---|---|---|---
Phone | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Tablet | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Laptop/computer | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Wifi/internet | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Other, please specify | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Q10 What support do your members require to join group meetings, if any? (Tick all that apply)
- ▢ Call members before meetings to prepare
- ▢ Provide Easy Read instructions (e.g. how to join)
- ▢ Ongoing training sessions
- ▢ None, they receive help from their carer/supporter at home
- ▢ None, they can join independently after some support at the beginning
- ▢ None
- ▢ Other, please specify ______ ________ ____________ _______________
Q11 How have you found the different aspects of facilitating groups online? (Please read the following statements and for each select one response on the scale)
 | Very difficult | Somewhat difficult | Okay | Somewhat easy | Very easy | Unsure
---|---|---|---|---|---|---
Identifying that a member is having technical issues | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Supporting a member with technical issues | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Managing group dynamics (e.g. engaging everyone in discussions) | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Sharing videos/documents and other materials on screen | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Monitoring members’ emotional well-being | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Supporting a member if they become distressed | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Other, please specify | ⃝ | ⃝ | ⃝ | ⃝ | ⃝ | ⃝
Q11a For the statements that you answered ‘Very Difficult’ or ‘Somewhat Difficult’, were you able to manage these difficulties and if so, how? Please describe below.
__________ ______________ ___________ _________________ ____________
Q12 During online group meetings, what measures, if any, have you put in place to ensure privacy and confidentiality for members? Please describe below.
______ _______ _________ __________ _____________ _________ __________
Q12a During online group meetings, how have you found ensuring privacy and confidentiality for members? (Please tick one box on the scale below)
- ○ Very difficult
- ○ Somewhat difficult
- ○ Okay
- ○ Somewhat easy
- ○ Very easy
- ○ Unsure
Q12b Please can you specify what the difficulties have been in ensuring privacy and confidentiality for members, and whether you have managed to overcome these and, if so, how?
________ _____________ _______________ _______ _____________________
Q13 Have you needed to provide emotional support to any group members following an online meeting?
- ○ Yes
- ○ No
Q13a How have you been able to emotionally support your members? (Tick all that apply)
- ▢ Phone call
- ▢ Online video call
- ▢ Socially distanced visit at a person’s home
- ▢ Socially distanced meeting in the community
- ▢ Other, please explain ____________ _____________ _____________ _____
Q13b Do you feel the support provided was sufficient to meet members’ needs?
- ○ Yes
- ○ Unsure
- ○ No, please explain ______ ______ _______ ______________ _____________
Q14 What, if any, other challenges have you experienced when facilitating groups for members online and how have you managed these?
______________ ________ __________ _________ ________ _______________
Q19 Have you facilitated/supported any group/s for people with learning disabilities online?
- ○ Definitely yes
- ○ Probably yes
- ○ Might or might not
- ○ Probably not
- ○ Definitely not
The following final questions are about your thoughts on the near to long-term future of groups for people with learning disabilities.
Q15 How likely do you think it is that groups run by your organisation will continue to meet online in the longer term? (Please tick one box on the scale below)
- ○ Extremely likely
- ○ Somewhat likely
- ○ Unsure
- ○ Somewhat unlikely
- ○ Extremely unlikely
Q16 Through your observations and speaking to group members, how do you think they would prefer to meet in the future? (Please tick one box)
- ○ Online
- ○ In person
- ○ No preference
- ○ Varied across members
- ○ Not sure
Q17 When do you think your organisation’s discussion-based groups will be able to return to meeting in person, with changes to reflect government guidelines? (Please tick one box) Please note, we understand that this is very difficult to predict during an uncertain time; we are only seeking your immediate thoughts.
- ○ Already started
- ○ Hopefully within the next 2 months
- ○ Hopefully within 3–6 months
- ○ No sooner than 6 months
- ○ Unsure
List of abbreviations
- AE: adverse event
- CI: chief investigator
- CSRI: Client Service Receipt Inventory
- CTR: Centre for Trials Research (Cardiff University)
- EQ-5D-Y: EuroQol-Youth
- F2F: face-to-face
- FPLD: Foundation for People with Learning Disabilities
- IAG: Intervention Adaptation Group
- IQR: interquartile range
- NIHR: National Institute for Health and Care Research
- PPI: patient and public involvement
- RCT: randomised controlled trial
- RtD: Reactions to Discrimination
- SAE: serious adverse event
- SEAP: STORM Expert Advisor Panel (Study PPI group)
- SERP: self-efficacy in rejecting prejudice
- SIS: Service Information Schedule
- SMG: study management group
- SSC: Study Steering Committee
- STORM: Standing Up for Myself
- UCL: University College London
- WEMWBS: Warwick-Edinburgh Mental Wellbeing Scale