Development and usability testing of an online platform for provider training and implementation of cognitive-behavioral therapy guided self-help for eating disorders
BMC Digital Health volume 3, Article number: 2 (2025)
Abstract
Background
Most individuals with eating disorders (EDs) do not receive treatment, and those who do receive care typically do not receive evidence-based treatment, partly due to lack of accessible provider training. This study developed a novel “all-in-one” online platform for disseminating training for mental health providers in cognitive-behavioral therapy guided self-help (CBTgsh) for EDs and supporting its implementation. The aim of the study was to obtain usability data from the online platform prior to evaluating its effects on provider training outcomes and patient ED symptom outcomes in an open pilot trial.
Methods
Nine mental health provider participants (n = 4 in Cycle 1; n = 5 in Cycle 2) and 9 patient participants (n = 4 in Cycle 1; n = 5 in Cycle 2) were enrolled over two cycles of usability testing. In Cycle 1, we recruited providers and patients separately to complete brief platform testing sessions. In Cycle 2, we recruited provider-patient dyads; providers completed training using the platform and subsequently delivered CBTgsh to a patient for three weeks. Usability was assessed using the System Usability Scale (SUS), the Usefulness, Satisfaction, and Ease of Use Questionnaire (USE), and semi-structured interviews.
Results
Interview feedback converged on two themes for providers (applicability of program for real-world use, platform structure and function) and two themes for patients (barriers and facilitators to engagement, perceived treatment effects). SUS and USE scores were in the “average” to “good” ranges across cycles.
Conclusions
Findings from this study demonstrate preliminary feasibility and acceptability of the online platform. Data collected in this study will inform further refinements to the online platform. The platform’s effects on provider training outcomes and patient ED symptom outcomes will be evaluated in an open pilot trial. Given the wide treatment gap for EDs and barriers to dissemination and implementation of evidence-based treatments, the online platform represents a scalable solution that could improve access to evidence-based care for EDs.
Background
Eating disorders (EDs) are serious mental illnesses that affect 10% of people in their lifetime [1]. EDs are associated with high medical and psychiatric comorbidity, poor quality of life, and high mortality [2]. Evidence-based treatments for EDs have been well-established [3, 4] and are recommended by treatment guidelines [5]. Yet, fewer than 20% of those with EDs receive treatment [6, 7], and when individuals with EDs do receive care, it is typically not an evidence-based treatment [8,9,10]. Further, some research has shown that providers working in certain settings, including community mental health clinics and rural areas, may be even less likely to use evidence-based protocols [11,12,13]. Lack of accessible provider training in evidence-based treatments has been cited as a major contributor to the research-practice gap [14, 15]. Standard methods for provider training, which typically consist of a one- or two-day workshop delivered by an expert and provision of a manual [16], require substantial time and resources, making dissemination difficult [9]. This approach is also unsustainable: as trained providers leave a site and new ones enter, incoming providers have no access to training. Further, although workshops increase knowledge, their impact on skills is short-lived without ongoing support [16]. Scalable and sustainable methods for provider training in evidence-based treatments for EDs, paired with ongoing support, are needed.
Online platforms can overcome barriers to dissemination and implementation of training [17] and have several advantages over traditional methods of training: 1) training can be offered to geographically dispersed providers; 2) training is accessible anytime, anywhere; 3) providers can repeatedly review material, reinforcing learning; 4) the platform can be regularly updated; 5) data on most-used features can be collected, informing refinement; and 6) online training is a sustainable resource to address the issue of turnover [18,19,20]. Websites for evidence-based treatment training are starting to be developed (e.g., for motivational interviewing, cognitive-behavioral therapy, interpersonal psychotherapy, dialectical behavior therapy), have demonstrated efficacy (e.g., [21,22,23,24,25,26,27]), and hold promise for training providers in rural areas [28]. Although research has found that ongoing support is needed to sustain the impacts of training [16], providing traditional ongoing expert support can increase training costs by 50% or more [29]. Electronic support tools may be a scalable solution for providing ongoing support and enhancing treatment implementation in several ways. First, checklists can help providers ensure essential components are delivered [30]. Second, routine outcome monitoring, including electronic feedback systems, improves patient outcomes [31,32,33]. Finally, electronic support tools can enhance homework compliance and facilitate information transfer to providers [34].
When selecting an evidence-based treatment to disseminate, it is important to consider efficacy; cost-effectiveness; clinical range; ease of training/learning; and mode of treatment delivery (e.g., with limited external input and by providers with minimal training) [9]. Further, the difficulties that are often encountered when attempting to scale-up evidence-based treatments may be exacerbated by design problems, which may be addressed by user-centered design (also known as human-centered design or design thinking) [35]. User-centered design is an approach to product development that grounds the process in information about the individuals and settings with which products will ultimately be used [35].
With end-user input, this study developed a novel “all-in-one” online platform for both training mental health providers in an evidence-based treatment for EDs and supporting its implementation. We used the Consolidated Framework for Implementation Research (CFIR) as a guiding framework to drive the design of the online platform and its implementation [36]. Cognitive-behavioral therapy guided self-help (CBTgsh) was selected as the treatment of focus for several reasons. First, CBTgsh is effective in treating adults with bulimia nervosa (BN) and binge-eating disorder (BED) [37, 38] and is recommended by the National Institute for Health and Care Excellence (NICE) as a first-line treatment for adults with BN and BED [39]. Second, CBTgsh is acceptable to patients, cost-effective, requires significantly less training than standard approaches, and is intended to be implemented by a wide variety of providers, including non-specialists [37]. Providers delivering CBTgsh serve as “guides” to patients to promote continued use of the self-help program and provide support, and thus it is not required for providers to have advanced therapeutic knowledge or training. Finally, given CBTgsh’s self-help format, all patient-facing self-help content can be built into the online platform, creating a “one-stop shop” for providers and patients. This paper describes the process of developing the online platform and conducting two iterative cycles of usability testing with mental health providers without prior training in CBTgsh and adult patients with EDs. The aim of the study was to garner feasibility and acceptability data on the online platform prior to evaluating its effects on provider training outcomes and patient ED symptom outcomes in an open pilot trial.
Methods
Participants
Nine mental health provider participants (n = 4 in Cycle 1; n = 5 in Cycle 2) and nine patient participants (n = 4 in Cycle 1; n = 5 in Cycle 2) were enrolled over two cycles of usability testing. In Cycle 1, eligibility criteria for providers included: 1) 18 years or older, 2) mental health provider, 3) US resident, 4) English-speaking, and 5) no experience in providing CBTgsh (experience treating EDs with other modalities was permitted but not required or assessed). In Cycle 2, eligibility criteria were identical to those in Cycle 1, with the addition that eligible providers had to currently have or anticipate having a patient with an ED in the next two months. In Cycle 1, inclusion criteria for patients included: 1) 18 years or older, 2) US resident, 3) English-speaking, and 4) screening positive for clinical/subclinical BN or BED. Patients were excluded if they met criteria for clinical/subclinical anorexia nervosa (for which CBTgsh is not an evidence-based treatment). In Cycle 2, inclusion criteria for patients included: 1) 18 years or older, 2) US resident, 3) English-speaking, and 4) identified by their provider as experiencing binge eating with or without accompanying compensatory behaviors.
Cycle 1 recruitment
In Cycle 1, we recruited providers and patients separately to complete supervised usability testing of the online platform. Provider participants were recruited through social media posts, emails, partnerships with community mental health and psychology training clinics in the midwestern U.S., and mental health provider listservs (e.g., Missouri Eating Disorders Council, Academy for Eating Disorders). Recruitment materials highlighted the opportunity to participate in a study testing an online provider training and treatment platform for EDs. Providers interested in participating self-directed to an online eligibility screen. Providers who met inclusion criteria on the screen were given the opportunity to provide their contact information to be invited to participate in the study and subsequently completed a phone call with a study team member, during which the team member explained the study aims and procedures. Providers who were eligible and interested at this stage subsequently completed a baseline survey and provided informed consent to participate.
Patient participants in Cycle 1 were recruited through social media posts and emails. Recruitment materials for patients were directed to individuals with eating or body image concerns and highlighted the opportunity to participate in a study testing an online treatment platform for EDs. Patients interested in participating self-directed from recruitment materials to an online eligibility screen, which contained questions that screened for EDs. Patients meeting inclusion criteria on the screen were given the opportunity to provide their contact information to be invited to participate in the study and subsequently completed a phone call with a study team member, during which the team member explained the study aims and procedures. Eligible patients were subsequently sent a baseline survey, during which informed consent was obtained.
Cycle 2 recruitment
In Cycle 2, we recruited provider-patient dyads to complete unsupervised usability testing of the online platform. All procedures for recruiting providers in Cycle 2 were identical to those in Cycle 1.
During the initial phone call with a study team member, prospective providers in Cycle 2 were informed of the types of patients for whom the online CBTgsh platform would be a good fit (i.e., those with binge eating with or without accompanying compensatory behaviors) and verified whether they had or anticipated having a patient with a binge-type ED in the next two months. Providers who had already begun treating their patient’s ED were eligible for the study, provided that a formal CBT protocol was not already being used. Providers were instructed to use their typical methods of assessing presence of eating pathology, an approach that has been used in other implementation research in EDs [21]; patients in Cycle 2 did not complete an eligibility screen. After completing the baseline survey, enrolled providers were instructed to share general information about the study with their patients with a binge-type ED using IRB-approved information sheets about the study provided by the study team. Specifically, providers were instructed to inform patients that they were receiving remote training in an evidence-based treatment for EDs through a research study, and that if their patient was interested in participating, providers would begin guiding patients through CBTgsh material using the online platform to address their ED symptoms, as part of their usual care. Members of the study team provided support and reminders for providers making referrals to patients. Patients interested in participating self-directed to an online eligibility screen from the information sheets, and those who were eligible were sent a baseline survey on which they provided informed consent to participate. Patient participation in the study with their provider was voluntary, and data from providers whose patients were not interested in participating were not retained; that is, we only analyzed data from providers whose patients enrolled in the study.
Online platform design
The CBTgsh web-based platform in this study was developed, hosted, and maintained by an industry partner, 3C Institute. Participants were able to access the platform using any device with an internet connection (e.g., computer, smartphone).
Online platform content and features
The CBTgsh content included in the online platform was based on the Overcoming Binge Eating, 2nd Edition self-help program for EDs [40]. We created the original prototype online platform prior to usability testing based on prior implementations of CBTgsh [37, 41, 42] and consultation with experts involved in the original self-help program.
Provider-facing end
The provider-facing end of the online platform contained CBTgsh training materials, broken down into modules and delivered in numerous formats. Specifically, we created PDFs, videos, PowerPoints, and module summary sheets summarizing content in the Fairburn [40] self-help book. The training provided psychoeducation about EDs, a comprehensive description of the CBTgsh approach, guidelines for how to assess eating and body image problems in patients, and a session-by-session instructional walkthrough of how to deliver CBTgsh on a weekly basis with patients.
The provider-facing end also contained tools to support the implementation of CBTgsh with use of the platform, including session checklists with essential goals for each session and interactive sheets to take session notes on. Another key feature was that providers were given access to their patients’ real-time symptom self-monitoring data, which could be used to track patients’ progress.
Patient-facing end
The patient-facing end of the platform contained self-help content directly derived from Fairburn’s program [40], which included psychoeducation, goal-setting, and assignments broken down into modules. Specifically, the platform provided patients with chapters from the self-help book, as well as psychoeducational module cheat-sheets that we created to summarize the key learning points of each module and activity sheets (e.g., a shape-checking self-monitoring form). The platform also hosted digital self-monitoring logs, where patients could record their eating and ED symptoms; once entered, these data were immediately made visible to their providers. Screenshots of the provider- and patient-facing ends of the platform are shown in Fig. 1 and Fig. 2, respectively.
Procedures
Usability testing of the online CBTgsh platform was conducted over two cycles. After Cycle 1, refinements were made to the platform’s features and functionality based on feedback gathered from participants. All usability testing was conducted remotely to facilitate inclusion of participants across the United States. All procedures were overseen and approved by the Washington University in St. Louis Institutional Review Board.
Cycle 1: supervised usability testing
Upon completion of the baseline survey in Cycle 1, enrolled participants scheduled a 30-min virtual testing session with a member of the research team. The testing session procedures were identical for both providers and patients. During the testing session, the research team member directed participants to the main features in the platform (i.e., instructions, dashboard, video, documents), a standard practice for assessing usability of an implementation strategy [43]. During the walkthrough of the platform, participants were asked to use the “think aloud” strategy and voice aloud their thoughts and immediate reactions to the platform content [44]. Following the testing session, participants completed a 30-min semi-structured qualitative interview to further assess their experience with the platform and its feasibility. Participants were subsequently emailed a post-engagement survey, which contained quantitative measures of usability of the platform. Completion of all study activities in Cycle 1 took approximately one hour. Provider and patient participants were compensated with a $25 electronic Amazon gift card.
Cycle 2: unsupervised usability testing
Upon completion of the baseline survey in Cycle 2, enrolled providers were given access to the online platform (a unique account was created for each participant) and instructed to complete the CBTgsh training (which took about 3 h to complete) via the online platform within one week. During this time, providers were also instructed to share information about the study with one of their patients for whom they believed this approach was a good fit (i.e., patient with binge eating with or without accompanying compensatory behaviors). After providers completed CBTgsh training and patients were consented and enrolled, providers were instructed to deliver CBTgsh to their patients using the online platform as part of usual care over a 3-week period. At the end of the testing period, semi-structured interviews were conducted by research assistants with providers and patients separately to assess their experiences and feedback on the platform. Participants also completed post-engagement surveys which contained quantitative usability measures. Provider and patient participants were compensated with a $25 electronic Amazon gift card. Following completion of the study activities, providers and patients were able to continue using the platform if they wished.
Measures
Quantitative data
At baseline, participants reported on demographic information, including race, ethnicity, sex, gender identity, sexual orientation, household income, and living region (including if they lived in a rural area).
Providers were also asked to indicate their profession (response options: 1) Psychiatrist; 2) Psychologist; 3) Therapist; 4) Counselor; 5) Social worker; 6) Mental health worker; 7) Other [please specify]); the highest degree they had received; whether they practiced in a rural area; and whether they practiced in a community mental health center.
The System Usability Scale (SUS) [45] was used at post-engagement to evaluate the usability of the online platform. This measure contains 10 items, with response options ranging from strongly disagree (1) to strongly agree (5). Possible scores range from 0–100; overall scores above the established cutoff of 68 reflect “above average” usability. The SUS has been validated for use in small sample sizes [46, 47].
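The paper does not restate the SUS scoring rule, but it is standardized: odd-numbered (positively worded) items contribute their rating minus 1, even-numbered (negatively worded) items contribute 5 minus their rating, and the summed contributions (0–40) are multiplied by 2.5 to yield the 0–100 score. A minimal sketch in Python (the function name `sus_score` is illustrative, not from the study):

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100).

    `responses` is a list of 10 item ratings from 1 (strongly
    disagree) to 5 (strongly agree), in questionnaire order.
    Odd-numbered items are positively worded (contribution =
    rating - 1); even-numbered items are negatively worded
    (contribution = 5 - rating). The summed contributions
    (range 0-40) are scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = sum(
        # i is 0-based, so an even index corresponds to an
        # odd-numbered (positively worded) questionnaire item
        (r - 1) if i % 2 == 0 else (5 - r)
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Strongly agreeing with every positive item and strongly
# disagreeing with every negative item yields the maximum score.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # -> 100.0
```

Under this scoring, a neutral response of 3 to every item produces a score of 50, which is why the empirical cutoff of 68 (not 50) marks "above average" usability.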
Participants also completed the Usefulness, Satisfaction, and Ease of Use (USE) Questionnaire at post-engagement. This 30-item measure assesses usefulness, ease of use, ease of learning, and satisfaction of users [48], with response options ranging from strongly disagree (1) to strongly agree (7). For each subscale, items were averaged to generate a score. Possible total scores range from 19–133.
The Stanford-Washington University Eating Disorder Screen (SWED) [49] was used on the eligibility screener in Cycle 1 to assess whether patients met criteria for clinical/subclinical BN or BED using the established criteria of endorsing 6+ binge eating episodes, 6+ vomiting episodes, and/or 6+ laxative/diuretic use episodes over the past 3 months. The SWED demonstrates good sensitivity and specificity for identifying DSM-5 ED diagnoses [49].
Qualitative feedback
Semi-structured interview questions solicited participant feedback on the individual platform components, the utility and design of the platform, and overall positive and negative experiences. The interview script for each cycle can be found in the Supplementary Material.
Analytic strategy
Quantitative analysis
Descriptive statistics on participant characteristics and quantitative usability data were calculated using R version 4.1.3. Inferential statistics were not used given the small sample size.
Qualitative analysis
Iterative development
The initial version of the online CBTgsh platform was tested by Cycle 1 participants. Interviews with participants were transcribed and qualitative feedback was assessed and used to inform refinements to the platform before Cycle 2. For example, in response to participant feedback, we worked with 3C Institute to: 1) modify the data fields in the self-monitoring surveys (i.e., separate fields for place and time of logged eating event); 2) add language to self-monitoring surveys to instruct patients to save data before leaving the page; 3) improve organization of psychoeducational content; and 4) provide more training and treatment content on body image problems. Refinements were made based on feasibility and how frequently suggestions were made by participants. Some suggestions were not feasible given budget limitations. Suggestions that we were not able to implement included: 1) creating a tracker to visually depict participants’ progress with training (for providers) or reading and completing modules (for patients); and 2) creating a button for providers to instantly send self-monitoring surveys and relevant modules to patients’ email addresses (instead of the patient having to log into the platform and navigate to the resources themselves).
Thematic analysis
To examine provider and patient participants’ feedback on the online platform, the study team transcribed the recordings of the semi-structured qualitative feedback interviews from Cycles 1 and 2. We expected that our sample size (n = 18) would suffice for the purposes of qualitative analyses, given that sample sizes over 9 typically achieve coding saturation and sample sizes between 16–24 achieve meaning saturation [50, 51]. During analyses, we determined that no new themes were being identified by the 18th participant; at this point, we concluded that we had achieved saturation. We analyzed the transcripts using qualitative inductive thematic analysis with a realist lens focused on understanding the realities and experiences of the participants [48]. Thematic analysis aims to identify repeating patterns and contexts in participant feedback and fits our goal of assessing this feedback through an inductive, realist lens. In line with Braun & Clarke [52], we read the transcripts to understand participant feedback, created separate codebooks for provider feedback and patient feedback, coded the transcripts (two independent coders coded each transcript), and defined and named themes.
Coding procedures
During the coding process, each coder (n = 5) independently read and reviewed the transcripts and drafted preliminary codes. Then, the coding team came together and created an initial codebook, which was used to code a subset of the transcripts (1–2 transcripts per coder). After test coding using the initial codebook, all coders met several times to refine and finalize the codebook. Each transcript was coded by two independent coders using the finalized codebook. Coding discrepancies were identified and discussed by all coders until a consensus was reached. After completing coding, we used a bottom-up approach to: (1) group codes into subthemes based on their relationships within the transcripts, (2) group subthemes into themes, and (3) re-review transcripts and reevaluate themes and subthemes as needed. Finally, coders named and defined the themes and subthemes. The coding process and theme development were completed using the Dedoose software [53].
Results
Participant characteristics
Nine mental health providers (M age = 41.8 ± 8.6, 88.9% female, 100% White and non-Hispanic) without expertise in CBTgsh participated in the study. Five providers (55.6%) practiced in community mental health centers and four (44.4%) practiced in rural areas. Table 1 describes characteristics of the provider participants.
Nine patients with probable EDs (77.8% female, 100% White and non-Hispanic) participated in the study. The mean age of patients in Cycle 1 was 53.8 (SD = 5.6); age of patients was not collected in Cycle 2. In Cycle 1, 3 of 4 participants had probable subclinical bulimia nervosa, and the 1 remaining patient had probable subclinical binge eating disorder; 2 of 4 patients reported engaging in purging behaviors. Patient symptom data were not collected in Cycle 2. Table 2 describes patient participant characteristics.
Quantitative usability data
See Table 3 for detailed usability data. In Cycle 1, providers reported a mean SUS score of 83.1 (SD = 12.6) and patients reported a mean score of 86.3 (SD = 18.0). These scores represent “good” and “excellent” usability, respectively. Providers reported a mean USE score of 111.5 (SD = 15.2) and patients reported a mean score of 124.5 (SD = 9.3).
In Cycle 2, providers reported a mean SUS score of 77.5 (SD = 10.2) and patients reported a mean score of 66.0 (SD = 15.9). These scores reflect “good” and “average” usability, respectively. Providers reported a mean USE score of 98.8 (SD = 25.1) and patients reported a mean score of 71.8 (SD = 30.8). Across the sample, usability scores declined between cycles but remained in the “average” to “good” ranges.
Thematic analysis
Thematic analysis revealed that data converged on two themes for providers and two themes for patients.
Table 4 contains illustrative quotes from provider participants for each theme and subtheme. Themes are briefly described and illustrated below.
Provider theme 1: applicability of the program for real-world use
Providers’ feedback centered on implications for real-world use of the online platform. Specifically, their comments reflected their experiences using the platform and their thoughts about relevant factors for future use of the platform with providers and patients.
Barriers and catalysts for use
Providers commented on factors that facilitated and deterred them from using the platform during usability testing. In terms of catalysts of their use of the platform, they highlighted the ease of use of the platform, including overall easy navigation and ability to access resources and tools. They also cited specific features that they found useful as motivators for use of the platform. For instance, one provider commented, “I think if I had a patient for whom I really wanted to sort of track how they were eating and when they were eating and making sure they were eating regularly, then I would find it really helpful to be able to glance at [the self-monitoring logs] and just to have like a snapshot of how they're doing” (P14). Providers also noted potential barriers to real-world use of the platform with patients, including scheduling challenges, limited time which may impede ability to use the platform, and seeing patients with limited tech-savviness who may find the platform content overwhelming or difficult to navigate. Other potential barriers included providers needing more scaffolding and instructions around how to use provider resources in the platform and difficulty getting buy-in to use the platform from some patients.
Quality of online training experience
Providers shared their impressions of the online platform’s CBTgsh training material, including the platform’s ability to provide training in a treatment approach with which they had no prior experience. They praised specific characteristics of the training content, including the quantity, completeness, and organization of the content, as well as how relevant it seemed to themselves and patients. Several providers offered positive feedback on the various modalities of training content that were available in the platform (e.g., videos, PDFs). P10 commented, “I appreciated how the information was presented via video and then you had access to the actual slides and a checklist, like I feel like depending on what kind of a learner you have in front of you like you cover all of the different ways to absorb the information.” Providers found the training content to be informative and highlighted the introductory overview of CBTgsh as a standout component.
Utility of the platform to deliver treatment
In addition to their experiences receiving training through the platform, providers discussed their impressions of how useful the platform was as an aid for delivering treatment for EDs. They mentioned the utility of specific features and tools for treatment delivery, such as session checklists, the food log, and the symptom tracker. One provider noted, “I really liked the checklists for the sessions and being able to kind of know I could go to that one tab and find everything I needed like, oh I’m running late, I have session one and I could pull it all up and print it all easily from there” (P11). Providers also offered their views on potential benefits to both providers (e.g., being able to review patients’ recent symptoms before session) and patients (e.g., being able to track their own progress in the platform) for providers using the platform to deliver CBTgsh.
Provider theme 2: platform structure and function
Feedback from providers also centered on the ease of use of the platform, the platform’s aesthetics, and the functions of features and tools. Several providers highlighted that the platform was user-friendly, intuitive to use, and easy to navigate. One provider (P15) noted, “It's certainly easy to log on, easy to find the main tabs, provider resources, patient resources, and to click along with the different modules.”
Table 5 contains illustrative quotes from patient participants for each theme and subtheme. Themes are briefly described and illustrated below.
Patient theme 1: barriers and facilitators to engagement
Patients’ feedback on their experiences with the online platform focused on factors that facilitated and detracted from their use of the platform.
Platform functionality
Patients discussed the functionality of the platform, including how well features and tools functioned, the ease of use of the platform, and the layout and aesthetics of the platform. Several patients highlighted challenges with functionality in the self-monitoring log and food log. For example, P08 noted, “Well, the only thing that I saw was like that technical issue, was, it just seems a little glitchy, like if I enter my data and then say I wanna go back to the dashboard and print it off, it will come up to be 0, there’s 0 entries in there. So I have to completely re-log in, then go to the dashboard, then it will load, then I can bring it up, choose to download it to a PDF, and print it off.” Patients commented on how such technical challenges detracted from the usability of the logs. Patients also highlighted that the platform was overall easy to navigate.
Patient-specific factors
Patients also described motivating and detracting factors for using the platform that were specific to their own preferences and experiences (i.e., not related to platform functionality). They mentioned factors that may influence real-world use of the platform, such as level of comfort with technology. Patients provided suggestions for future iterations of the platform with improved accessibility for patients with lower comfort with technology. One patient stated, “[It would be helpful] having like a little instructional walkthrough, so that people don’t get too confused because there’s going to be people…that are not tech savvy at all that want to be able to use it” (P03). Patients also suggested simplification of platform features to improve accessibility: “That was the hardest part for me, [the amount of detail required]. Since I’ve gotten older, things need to be more simplified, I guess. When I’m younger I could multitask like everybody and their son” (P06).
Patient theme 2: perceived treatment effects
Patients’ feedback also focused on the extent to which the platform was (or could have been) effective at addressing their ED symptoms.
Relevance to therapeutic goals
Patients provided feedback on the platform’s alignment with their goals for therapy. They specifically discussed the relevance of features and tools (e.g., the food log, symptom tracker, psychoeducation) and characteristics of platform content (e.g., quantity, personal relevance, clarity, format, completeness). Patients also reported their willingness to use the platform for further ED treatment, their willingness to recommend the platform to others, their perceived benefits of the platform (e.g., increased awareness, accountability), and the degree to which they felt that they could be honest while using the platform. P08 commented, “[What I liked about the platform was] the accountability of actually having to log information in, to have to self-assess what you would characterize as a binge, knowing this is going to be looked at by somebody else. So you're under a microscope, you know, so it's time to get real, time to be honest with yourself, and that really helped me. That really helps me. Just seeing it in black and white. I’d say the accountability factor was huge and that's what I was really afraid of losing, to be honest.” In addition to accountability, patients mentioned that the content (e.g., psychoeducation) and tools (e.g., symptom tracking) in the platform allowed them to develop more awareness of their symptoms: “I love the idea that you could actually write your response to gauge how you're feeling at the time you're eating. I love that. That was the absolute best part. Because it actually made me think a lot about things I never thought of before where my food was concerned. When I would sit down to read, I felt almost like somebody understood me while reading these articles. When I was writing, when I filled out the food journals, [I would think], ‘Hold up, is this why I'm eating this? Am I hungry, or…?’ I felt like somebody finally got it.” (P06).
Patient-provider communication
Patients commented on how they used the platform to communicate with their treatment provider. They reflected that the self-monitoring logs allowed them to send timely reports on their symptoms to their providers. Patients also highlighted how the platform’s tools allowed for private forms of communication with their providers, which facilitated honest disclosure. One patient stated, “[I think the self-monitoring surveys would be helpful for providers to assess patients’ progress] because some people aren’t comfortable talking to their providers about it even though that’s what they’re there for, so sometimes it might be easier just to do this and submit that information and then they can possibly try to do some diagnosing with what they’ve submitted” (P01).
Discussion
This study employed user-centered design to develop a prototype online platform for disseminating training for mental health providers in CBTgsh and supporting its implementation. We conducted two iterative cycles of usability testing with mental health providers without prior training in CBTgsh and adult patients with EDs. To our knowledge, this was the first “all-in-one” online platform developed to support both scalable training of providers in an evidence-based treatment and intervention delivery for EDs.
In Cycle 1 of usability testing, we recruited providers and patients separately to complete brief platform testing sessions. In Cycle 2, we recruited provider-patient dyads; providers completed training using the platform and subsequently delivered CBTgsh to a patient for three weeks. Although the platform was refined based on Cycle 1 feedback prior to Cycle 2, usability scores decreased between cycles for both providers and patients, a finding consistent with previous digital mental health intervention research when shifting from laboratory to real-world usability testing [54]. Given the unsupervised nature of Cycle 2 testing, it is plausible that the lower usability scores were influenced by the complexities of integrating the platform into routine clinical care, such as individual preferences for care, time constraints, and varying levels of comfort with learning new technologies. Indeed, thematic analysis highlighted that patients experienced challenges with the functionality of some platform tools (e.g., self-monitoring logs) in routine practice. Patients also discussed personal factors (e.g., lack of comfort with technology) that contributed to poorer usability. These findings underscore the importance of collecting usability data under real-world conditions to inform refinements that serve users’ needs, as many researchers in the digital mental health implementation field have called for [55, 56]. Despite the decrease, usability scores remained in the average to good range across both cycles. Qualitative feedback suggested that providers and patients saw utility in the platform’s training and treatment capabilities; together, these data suggest that further refining the platform’s functionality and accessibility could improve its usability.
Thematic analysis of participant feedback revealed provider themes of applicability of the platform for real-world use and platform structure and function. On the whole, providers reported high ease of use of the platform. They found the training material to be informative, well organized, and well formatted. These findings align with prior research that has found online methods of training providers in evidence-based treatments for EDs to be feasible and acceptable [57]. Providers also had positive impressions of the platform’s treatment implementation support tools, such as the session checklists and patient self-monitoring surveys, which helped them prepare for sessions and track their patients’ progress.
Patients’ qualitative feedback centered on barriers and facilitators to platform engagement and perceived treatment effects. Patients reported considerably lower ease of use of the platform relative to providers, citing challenges with navigation and issues with the functionality of the self-monitoring logs. They cited low comfort with technological tools as a barrier to using the platform and provided feedback for improving accessibility for less tech-savvy patients. However, patients generally found the platform’s treatment content and tools to align with their treatment goals and reported benefits such as increased accountability and awareness of their symptoms following use of the platform. Patients also highlighted the platform’s ability to facilitate discreet and timely patient-provider communication. This feedback suggests that patients found the platform to have high potential to address their ED symptoms, and that further refining the platform to improve functionality and accessibility could improve its effectiveness.
Strengths of this study included the use of a rigorous thematic analysis protocol, in line with Braun & Clarke [52], and the diversity of the mental health providers in terms of practice setting. Another strength was the use of user-centered design with target users (i.e., mental health providers without expertise in CBTgsh and patients with EDs) under real-world conditions, which will enhance scale-up of the online platform [35]. Limitations included lack of diversity in the sample in terms of race, ethnicity, gender identity, and sex, which may limit generalizability to settings that often serve patients from minoritized backgrounds (e.g., community mental health settings). At the same time, the demographic makeup of our sample may have been influenced by broader structural issues in the mental health care landscape, as most mental health providers in the U.S. are White [58], and patients from minoritized backgrounds may be less likely to seek treatment for their ED symptoms [59, 60]. Another limitation was that we were not able to make all suggested refinements between cycles due to the limited budget of our pilot study, which may have impacted usability. In addition, the type of usual care that patients in this study were receiving was not assessed (e.g., whether CBTgsh was used as an adjunct to treatment for other mental health concerns or as the sole treatment for the duration of the protocol); this may also have impacted usability. Importantly, we did not assess providers’ prior experience with treating EDs, which is a further limitation. Although usability data from the present sample are highly valuable, future research evaluating the provision of online training to non-specialist providers is critically needed to uncover CBTgsh’s true potential as a scalable solution for addressing the ED research-practice gap.
Finally, we did not collect data on patients’ age, which represents a limitation and future direction given participants’ feedback about level of comfort with technology. Following pilot testing, future research aimed toward the implementation of the online platform in mental health settings is needed. In line with recommended guidance for implementation of digital mental health interventions in health care settings, organizational-level assessment of need, buy-in, and cost-effectiveness represents a key next step for successful implementation of the online platform [61].
Conclusions
Taken together, findings from this study demonstrate preliminary feasibility and acceptability of the online platform. Results indicate areas for improvement to increase the platform’s usability, yet the approach was largely well received by both providers and patients. Data collected in this study will inform further refinements to the online platform, and the platform’s effects on provider training outcomes and patient ED symptom outcomes will be evaluated in an open pilot trial. Given the wide treatment gap for EDs [6, 7] and barriers to dissemination of evidence-based treatments [14], the online platform represents a scalable solution that could improve access to evidence-based care for EDs.
Data availability
The datasets used and/or analyzed in the current study are available from the corresponding author on reasonable request.
Abbreviations
- CBTgsh: Cognitive-behavioral therapy guided self-help
- EDs: Eating disorders
- SWED: Stanford-Washington Eating Disorder Screen
References
Schaumberg K, Welch E, Breithaupt L, Hübel C, Baker JH, Munn-Chernoff MA, et al. The Science Behind the Academy for Eating Disorders’ Nine Truths About Eating Disorders. Eur Eat Disord Rev. 2017;25(6):432–50.
Klump KL, Bulik CM, Kaye WH, Treasure J, Tyson E. Academy for eating disorders position paper: Eating disorders are serious mental illnesses. Intl J Eating Disorders. 2009;42(2):97–103.
Kass AE, Kolko RP, Wilfley DE. Psychological treatments for eating disorders. Curr Opin Psychiatry. 2013;26(6):549–55.
Lock J. An Update on Evidence-Based Psychosocial Treatments for Eating Disorders in Children and Adolescents. J Clin Child Adolesc Psychol. 2015;44(5):707–21.
Hilbert A, Hoek HW, Schmidt R. Evidence-based clinical guidelines for eating disorders: international comparison. Curr Opin Psychiatry. 2017;30(6):423–37.
Kazdin AE, Fitzsimmons-Craft EE, Wilfley DE. Addressing critical gaps in the treatment of eating disorders. Int J Eat Disord. 2017;50(3):170–89.
Hart LM, Granillo MT, Jorm AF, Paxton SJ. Unmet need for treatment in the eating disorders: A systematic review of eating disorder specific treatment seeking among community cases. Clin Psychol Rev. 2011;31(5):727–35.
Cooper Z, Bailey-Straebler S. Disseminating Evidence-Based Psychological Treatments for Eating Disorders. Curr Psychiatry Rep. 2015;17(3):12.
Fairburn CG, Wilson GT. The dissemination and implementation of psychological treatments: Problems and solutions. Int J Eat Disord. 2013;46(5):516–21.
Waller G. Treatment Protocols for Eating Disorders: Clinicians’ Attitudes, Concerns, Adherence and Difficulties Delivering Evidence-Based Psychological Interventions. Curr Psychiatry Rep. 2016;18(4):36.
Wolitzky-Taylor K, Zimmermann M, Arch JJ, De Guzman E, Lagomasino I. Has evidence-based psychosocial treatment for anxiety disorders permeated usual care in community mental health settings? Behav Res Ther. 2015;72:9–17.
Dotson JAW, Roll JM, Packer RR, Lewis JM, McPherson S, Howell D. Urban and Rural Utilization of Evidence-Based Practices for Substance Use and Mental Health Disorders. J Rural Health. 2014;30(3):292–9.
Weaver A, Capobianco J, Ruffolo M. Systematic Review of EBPs for SMI in Rural America. Journal of Evidence-Informed Social Work. 2015;12(2):155–65.
Cook JM, Biyanova T, Coyne JC. Barriers to Adoption of New Treatments: An Internet Study of Practicing Community Psychotherapists. Adm Policy Ment Health. 2009;36(2):83–90.
Beck AJ, Page C, Buche MJ. Characteristics of the Rural Behavioral Health Workforce: A Survey of Medicaid/Medicare Reimbursed Providers. 2018 [cited 2024 Apr 28]; Available from: https://www.behavioralhealthworkforce.org/wp-content/uploads/2019/01/Y3-FA2-P1-Rural-Pop_Full-Report.pdf
Beidas RS, Kendall PC. Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clin Psychol Sci Pract. 2010;17(1):1–30.
Fairburn CG, Patel V. The impact of digital technology on psychological treatments and their dissemination. Behav Res Ther. 2017;1(88):19–25.
Mor Barak ME, Nissly JA, Levin A. Antecedents to Retention and Turnover among Child Welfare, Social Work, and Other Human Service Employees: What Can We Learn from Past Research? A Review and Metanalysis. Social Service Review. 2001;75(4):625–61.
Rollins AL, Salyers MP, Tsai J, Lydick JM. Staff Turnover in Statewide Implementation of ACT: Relationship with ACT Fidelity and Other Team Characteristics. Adm Policy Ment Health. 2010;37(5):417–26.
Beidas RS, Marcus S, Wolk CB, Powell B, Aarons GA, Evans AC, et al. A Prospective Examination of Clinician and Supervisor Turnover Within the Context of Implementation of Evidence-Based Practices in a Publicly-Funded Mental Health System. Adm Policy Ment Health. 2016;43(5):640–9.
Karam Jones AM, Fitzsimmons‐Craft EE, D’Adamo L, Eichen DM, Graham AK, Kolko Conlon RP, et al. A pilot study evaluating online training for therapist delivery of interpersonal psychotherapy for eating disorders. Intl J Eating Disorders. 2024;eat.24197.
Kobak KA, Wolitzky-Taylor K, Craske MG, Rose RD. Therapist Training on Cognitive Behavior Therapy for Anxiety Disorders Using Internet-Based Technologies. Cogn Ther Res. 2017;41(2):252–65.
Cooper Z, Bailey-Straebler S, Morgan KE, O’Connor ME, Caddy C, Hamadi L, et al. Using the internet to train therapists: randomized comparison of two scalable methods. J Med Internet Res. 2017;19(10): e355.
Dimeff LA, Koerner K, Woodcock EA, Beadnell B, Brown MZ, Skutch JM, et al. Which training method works best? A randomized controlled trial comparing three methods of training clinicians in dialectical behavior therapy skills. Behav Res Ther. 2009;47(11):921–30.
O’Connor M, Morgan KE, Bailey-Straebler S, Fairburn CG, Cooper Z. Increasing the availability of psychological treatments: a multinational study of a scalable method for training therapists. J Med Internet Res. 2018;20(6): e10386.
Clancy R, Taylor A. Engaging clinicians in motivational interviewing: Comparing online with face-to-face post-training consolidation. Int J Mental Health Nurs. 2016;25(1):51–61.
Curran GM, Woo SM, Hepner KA, Lai WP, Kramer TL, Drummond KL, et al. Training substance use disorder counselors in cognitive behavioral therapy for depression: Development and initial exploration of an online training program. J Subst Abuse Treat. 2015;58:33–42.
Bennett-Levy J, Perry H. The Promise of Online Cognitive Behavioural Therapy Training for Rural and Remote Mental Health Professionals. Australas Psychiatry. 2009;17(1_suppl):S121–4.
Adrian M, Lyon AR. Can E-Mail Reminders Sustain Training Gains From Continuing Education? PS. 2016;67(8):932–3.
Bailey-Straebler S, Cooper Z, Dalle Grave R, Calugi S, Murphy R. Development of the CBT-E Components Checklist: A tool for measuring therapist self-rated adherence to CBT-E. IJEDO. 2022;5(4):6–10.
Boswell JF, Kraus DR, Miller SD, Lambert MJ. Implementing routine outcome monitoring in clinical practice: Benefits, challenges, and solutions. Psychother Res. 2015;25(1):6–19.
Knaup C, Koesters M, Schoefer D, Becker T, Puschner B. Effect of feedback of treatment outcome in specialist mental healthcare: meta-analysis. Br J Psychiatry. 2009;195(1):15–22.
Fortney JC, Unützer J, Wrenn G, Pyne JM, Smith GR, Schoenbaum M, et al. A Tipping Point for Measurement-Based Care. FOC. 2018;16(3):341–50.
Tregarthen JP, Lock J, Darcy AM. Development of a smartphone application for eating disorder self-monitoring. Intl J Eating Disorders. 2015;48(7):972–82.
Lyon AR, Koerner K. User-centered design for psychosocial intervention development and implementation. Clin Psychol Sci Pract. 2016;23(2):180.
Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Sci. 2009;4(1):50.
Wilson GT, Zandberg LJ. Cognitive–behavioral guided self-help for eating disorders: Effectiveness and scalability. Clin Psychol Rev. 2012;32(4):343–57.
Wilson GT, Wilfley DE, Agras WS, Bryson SW. Psychological Treatments of Binge Eating Disorder. Arch Gen Psychiatry. 2010;67(1):94–101.
National Institute for Health and Care Excellence. Eating disorders: recognition and treatment. 2017.
Fairburn CG. Overcoming binge eating: The proven program to learn why you binge and how you can stop [Internet]. Guilford Press; 2013 [cited 2024 Apr 28]. Available from: https://books.google.com/books?hl=en&lr=&id=9A_VJal_FWwC&oi=fnd&pg=PP1&dq=Fairburn,+C.+G.+(2013).+Overcoming+binge+eating:+The+proven+program+to+learn+why+you+binge+and+how+you+can+stop.+Guilford+Press.&ots=t4H9mH_hIF&sig=9X-1EfE-kf5gCs3mrPAaPjS_3Bk
Traviss GD, Heywood-Everett S, Hill AJ. Guided self-help for disordered eating: a randomised control trial. Behav Res Ther. 2011;49(1):25–31.
Fairburn CG, Carter JC. Self-help and guided self-help for binge-eating problems. 1997; Available from: https://psycnet.apa.org/record/1997-08478-030. [cited 2024 Apr 28].
Lyon AR, Coifman J, Cook H, McRee E, Liu FF, Ludwig K, et al. The Cognitive Walkthrough for Implementation Strategies (CWIS): a pragmatic method for assessing implementation strategy usability. Implement Sci Commun. 2021;2(1):78.
Jaspers MW, Steen T, Van Den Bos C, Geenen M. The think aloud method: a guide to user interface design. Int J Med Informatics. 2004;73(11–12):781–95.
Brooke J. SUS: A “Quick and Dirty” Usability Scale. In: Usability Evaluation In Industry. CRC Press; 1996.
Bangor A, Kortum PT, Miller JT. An Empirical Evaluation of the System Usability Scale. International Journal of Human-Computer Interaction. 2008;24(6):574–94.
Lewis JR, Sauro J. The Factor Structure of the System Usability Scale. In: Kurosu M, editor. Human Centered Design. Berlin, Heidelberg: Springer; 2009. p. 94–103. (Lecture Notes in Computer Science; vol. 5619). Available from: https://doi.org/10.1007/978-3-642-02806-9_12. [cited 2024 Apr 28].
Lund AM. Measuring usability with the USE Questionnaire. Usability Interface. 2001;8(2):3–6.
Graham AK, Trockel M, Weisman H, Fitzsimmons-Craft EE, Balantekin KN, Wilfley DE, et al. A screening tool for detecting eating disorder risk and diagnostic symptoms among college-age women. J Am Coll Health. 2019;67(4):357–66.
Creswell JW, Creswell JD. Research design: Qualitative, quantitative, and mixed methods approaches. Sage publications; 2017. Available from: https://books.google.com/books?hl=en&lr=&id=KGNADwAAQBAJ&oi=fnd&pg=PP1&dq=Creswell,+J.+W.,+%26+Creswell,+J.+D.+(2018).+Research+design:+Qualitative,+quantitative,+and+mixed+methods+approaches.+Fifth+edition.+Los+Angeles,+SAGE.&ots=XEEd9a4fe0&sig=SL7ToYTKehK4OOYKJxFFkVIL7zg . [cited 2024 Apr 28].
Hennink MM, Kaiser BN, Marconi VC. Code Saturation Versus Meaning Saturation: How Many Interviews Are Enough? Qual Health Res. 2017;27(4):591–608.
Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.
Dedoose Version 9.0.17, cloud application for managing, analyzing, and presenting qualitative and mixed method research data. Los Angeles, CA: SocioCultural Research Consultants, LLC; 2021. Available from: www.dedoose.com
Callan JA, Dunbar Jacob J, Siegle GJ, Dey A, Thase ME, DeVito DA, et al. CBT MobileWork©: User-Centered Development and Testing of a Mobile Mental Health Application for Depression. Cogn Ther Res. 2021;45(2):287–302.
Mohr DC, Riper H, Schueller SM. A solution-focused research approach to achieve an implementable revolution in digital mental health. JAMA Psychiat. 2018;75(2):113–4.
Glasgow RE, Phillips SM, Sanchez MA. Implementation science approaches for integrating eHealth research into practice and policy. Int J Med Informatics. 2014;83(7):e1-11.
Fairburn CG, Allen E, Bailey-Straebler S, O’Connor ME, Cooper Z. Scaling up psychological treatments: a countrywide test of the online training of therapists. J Med Internet Res. 2017;19(6): e214.
Luona L, Ginsberg A. American Psychological Association uses ACS data to identify need for mental health services, education, and training. 2021.
Grammer AC, Shah J, Laboe AA, McGinnis CG, Balantekin KN, Graham AK, et al. Predictors of treatment seeking and uptake among respondents to a widely disseminated online eating disorders screen in the United States. Intl J Eating Disorders. 2022;55(9):1252–8.
Marques L, Alegria M, Becker AE, Chen CN, Fang A, Chosak A, et al. Comparative prevalence, correlates of impairment, and service utilization for eating disorders across US ethnic groups: Implications for reducing ethnic disparities in health care access for eating disorders. Int J Eat Disord. 2011;44(5):412–20.
Graham AK, Lattie EG, Powell BJ, Lyon AR, Smith JD, Schueller SM, et al. Implementation strategies for digital mental health interventions in health care settings. Am Psychol. 2020;75(8):1080–92.
Funding
This research was supported by a pilot grant from the Washington University Institute of Public Health Center for Dissemination and Implementation and NIH Grants F31 MH138068, K08 MH120341, and T32 HL130357.
Author information
Authors and Affiliations
Contributions
LD: investigation, formal analysis, writing – original draft & review/editing; AAL: formal analysis, writing – review/editing; JG: formal analysis, writing – review & editing; CH: formal analysis, writing – review & editing; MF: formal analysis, writing – review & editing; BD : investigation, writing – review/editing; MLF: investigation, writing – review & editing; ZC: conceptualization, investigation, writing – review/editing; DEW: funding acquisition, conceptualization, investigation, writing – review/editing; EEFC: conceptualization, funding acquisition, investigation, writing – original draft & review/editing.
Corresponding author
Ethics declarations
Ethics approval and consent to participate
All study procedures were approved by the Washington University Institutional Review Board. Informed consent was obtained from all study participants.
Consent for publication
Dr. Ellen Fitzsimmons-Craft (shown in Fig. 1) has provided written consent for publication of her photo in Fig. 1. Guilford Publications provided written permission to use and reproduce content from "Fairburn CG. Overcoming binge eating: The proven program to learn why you binge and how you can stop [Internet]. Guilford Press; 2013" in our research. We have permission to use the System Usability Scale, the Usefulness, Satisfaction, and Ease of Use Questionnaire, and the Stanford-Washington University Eating Disorder Screen, which are all in the public domain and freely available for use in research.
Competing interests
Dr. Fitzsimmons-Craft receives royalties from UpToDate, is on the Clinical Advisory Board for Beanbag Health, and is a consultant for Kooth.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
D’Adamo, L., Laboe, A.A., Goldberg, J. et al. Development and usability testing of an online platform for provider training and implementation of cognitive-behavioral therapy guided self-help for eating disorders. BMC Digit Health 3, 2 (2025). https://doi.org/10.1186/s44247-024-00140-6