
Centralizing prescreening data collection to inform data-driven approaches to clinical trial recruitment

Abstract

Background

Recruiting to multi-site trials is challenging, particularly when striving to ensure the randomized sample is demographically representative of the larger disease-suffering population. While previous studies have reported disparities by race and ethnicity in enrollment and randomization, they have not typically investigated whether disparities exist in the recruitment process prior to consent. To identify participants most likely to be eligible for a trial, study sites frequently include a prescreening process, generally conducted by telephone, to conserve resources. Collection and analysis of such prescreening data across sites could provide valuable information to improve understanding of recruitment intervention effectiveness, including whether traditionally underrepresented participants are lost prior to screening.

Methods

We developed an infrastructure within the National Institute on Aging (NIA) Alzheimer’s Clinical Trials Consortium (ACTC) to centrally collect a subset of prescreening variables. Prior to study-wide implementation in the AHEAD 3–45 study (NCT04468659), an ongoing ACTC trial recruiting older cognitively unimpaired participants, we completed a vanguard phase with seven study sites. Variables collected included age, self-reported sex, self-reported race, self-reported ethnicity, self-reported education, self-reported occupation, zip code, recruitment source, prescreening eligibility status, reason for prescreen ineligibility, and the AHEAD 3–45 participant ID for those who continued to an in-person screening visit after study enrollment.

Results

Each of the sites was able to submit prescreening data, providing data on a total of 1029 participants. The total number of prescreened participants varied widely among sites (range 3–611), with the differences driven mainly by the time to receive site approval for the main study. Key learnings informed design, informatics, and procedural changes prior to study-wide launch.

Conclusion

Centralized capture of prescreening data in multi-site clinical trials is feasible. Identifying and quantifying the impact of central and site recruitment activities, prior to participants signing consent, has the potential to identify and address selection bias, instruct resource use, contribute to effective trial design, and accelerate trial enrollment timelines.

Background

The recruitment phase of multi-site clinical trials represents a large, but modifiable, component of total trial duration and cost [1, 2]. “Successful” recruitment for a trial includes not only accruing the sample on schedule [3] but also enrolling a sample that is representative of the larger disease-suffering population [4,5,6,7]. There are several known barriers to participating in clinical trials, particularly in historically underrepresented communities [8, 9]. Few interventions have been demonstrated to overcome these barriers [10].

The evaluation of recruitment strategies is generally focused on actual enrollment, including successful screening and randomization of study participants. However, measuring activity prior to trial enrollment may provide valuable information for both central and local efforts to accelerate and diversify recruitment, instruct resource expenditures, adjust recruitment campaigns, and, if needed, amend protocols to address observed selection bias.

Efforts to capture prescreening recruitment data may be particularly valuable in preclinical Alzheimer’s disease (AD) trials. These trials enroll cognitively unimpaired older volunteers who are screened for biological markers of AD [11]. Traditional clinical trial recruitment methods may not be effective in these trials, especially when the goal is to recruit a demographically representative cohort. In the first multi-site preclinical AD trial, the Anti-Amyloid treatment in Asymptomatic AD (A4) study [12], participants from underrepresented racial and ethnic groups were more frequently recruited through local, compared to centralized or national, efforts [13]. Yet it has been difficult to determine the most effective recruitment strategies, and it remains unclear whether specific central efforts led to more successful local recruitment [14], in part because of a lack of prescreening data.

Systematizing and centralizing prescreening data capture is uncommon [15], in part because of regulations that restrict formal data collection prior to consent and the inclusion of prescreening data in parent trial databases, and in part because of the limited resources available to support this effort. Yet, to understand the effectiveness of recruitment strategies and potential sample bias, it is critical for trialists to assess the full recruitment process. This process begins with efforts to increase awareness of and interest in trials, and data are needed to examine this “top of the funnel” (Fig. 1). Limiting recruitment data to those collected after consent at in-person screening visits tells only part of the story.

Fig. 1 Clinical trial recruitment “funnel”

To evaluate recruitment prior to enrollment, we designed and developed a centralized prescreening database, the data-driven approach to recruitment (DART), for an ongoing preclinical AD trial conducted by the NIA-funded Alzheimer’s Clinical Trials Consortium (ACTC). In this manuscript, we describe the design of the DART database and share pilot data obtained from the initial vanguard phase used to assess the feasibility of centralizing prescreening data collection.

Methods

The prescreening database was implemented under the AHEAD 3–45 study, a clinical trial evaluating the safety and efficacy of lecanemab (BAN2401, Eisai Inc.) in individuals who may be at increased biological risk for AD dementia [16, 17].

Design

The DART database was collaboratively developed by a working group of ACTC coordinating center personnel and participating study sites. This working group met monthly from September 2020 to June 2021, first to design the data collection form and then to establish methods to minimize barriers and maximize the likelihood of adoption by many study sites. We viewed the inclusion of site personnel in this working group as essential to the success of this initiative.

Variable selection

Variables selected for the DART database were aligned with the ACTC Minimal Data Set (MDS) recruitment and demographic variables (Table 1). To minimize site burden, we restricted the number of variables collected to eleven, including seven categorical and four free-text fields.

Table 1 Prescreening database variable description
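To illustrate, the eleven fields and their broad types can be sketched as a simple codebook in R; the field names and the categorical/free-text assignments below are assumptions for illustration rather than the actual ACTC MDS definitions.

```r
# Illustrative codebook for the eleven DART prescreening variables.
# Field names and the categorical/free-text split are assumed for this sketch;
# the actual ACTC MDS-aligned names and definitions may differ.
dart_codebook <- data.frame(
  field = c("age", "sex", "race", "ethnicity", "education", "occupation",
            "zip_code", "recruit_source", "prescreen_status",
            "prescreen_fail_reason", "ptid"),
  type  = c("free-text", "categorical", "categorical", "categorical",
            "categorical", "free-text", "free-text", "categorical",
            "categorical", "categorical", "free-text"),
  stringsAsFactors = FALSE
)

table(dart_codebook$type)  # seven categorical and four free-text fields
```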

Site selection and responsibilities

In the DART vanguard phase, seven active AHEAD 3–45 sites collected prescreening data for approximately eight months. Vanguard sites were selected with attention to balance across site type and experience, including experience in similar trials, existing infrastructure to capture prescreening data, and existing prescreening databases. Vanguard sites were reimbursed for their participation.

A key component of the vanguard phase was a monthly meeting with representatives from each site to discuss implementation and to share and review metrics generated from prescreening data. The goal was to use preliminary site experiences to identify opportunities to improve database design, reduce site burden, facilitate timely data entry, and improve data integrity.

Electronic data capture system (EDC)

We developed a separate EDC system specifically for this prescreening initiative, using the same framework as the AHEAD 3–45 study EDC [18]. Given that preexisting methods of capturing prescreen data varied widely across sites, we offered two options for sites to transmit prescreening data. For sites not capturing prescreening data electronically, data were entered directly into the EDC by site personnel. For sites with preexisting prescreening databases, batched upload was permitted, if it was performed at least every 2 weeks. A Data Transfer Agreement ensured the uploaded data were coded and formatted appropriately. Summary data reports were developed and distributed to sites and study leadership.
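For sites using the batched upload path, a lightweight format check before transmission can catch the kinds of issues later described in the Results (e.g., wrong separators or unrecognized codes). The sketch below is a minimal example in R, assuming a pipe-delimited file and illustrative column names and codes; the actual file specification is set by the Data Transfer Agreement.

```r
# Minimal pre-upload check for a batched prescreening file.
# Assumes a pipe-delimited ("|") layout with illustrative column names/codes.
validate_prescreen_batch <- function(path) {
  dat <- read.delim(path, sep = "|", stringsAsFactors = FALSE,
                    colClasses = "character")

  required <- c("age", "sex", "race", "ethnicity", "zip_code",
                "recruit_source", "prescreen_status")
  missing_cols <- setdiff(required, names(dat))
  if (length(missing_cols) > 0)
    stop("Missing required columns: ", paste(missing_cols, collapse = ", "))

  valid_sex <- c("1", "2", "9")  # assumed codes, e.g., 1 = male, 2 = female, 9 = unknown
  bad_rows <- which(!dat$sex %in% valid_sex)
  if (length(bad_rows) > 0)
    warning("Unrecognized sex codes in rows: ", paste(bad_rows, collapse = ", "))

  invisible(dat)
}
```

Running a check of this kind at the site before each upload could surface separator and coding errors without requiring coordinating center involvement.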

Institutional review board approval and informed consent

The central IRB governing the AHEAD 3–45 study (Advarra, Columbia, MD) determined that the prescreening database was of minimal risk and did not require a formal informed consent process. Advarra granted the study a Waiver of Consent and Waiver of HIPAA after determining that the waiver satisfied the Common Rule and the criteria set forth in the HIPAA Privacy Rule at 45 CFR 164.512(i)(2). Only deidentified information is collected in the central database.

Statistical analysis

Time to contract execution and time to IRB approval for this initiative were defined as the time from when the site’s participation was confirmed to the time the contract was fully executed and the time central IRB approval was received by each site, respectively. Time to data entry was defined as the time from when the contract was fully executed to the time data entry/upload was initiated at the respective site.

Continuous variables were summarized by means and standard deviations while categorical variables were summarized using percentages. All statistical analyses were performed using R (Version 4.1.0).
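A minimal sketch of these interval calculations and summaries is shown below, assuming a site-level table with one row per site; the column names are illustrative and may differ from the actual DART tracking variables.

```r
# Compute the milestone intervals defined above and summarize them.
# Column names are illustrative; date columns are assumed to be of class Date.
summarize_milestones <- function(sites) {
  sites$days_to_irb      <- as.numeric(sites$date_irb_approval  - sites$date_confirmed)
  sites$days_to_contract <- as.numeric(sites$date_contract_exec - sites$date_confirmed)
  sites$days_to_entry    <- as.numeric(sites$date_first_entry   - sites$date_contract_exec)

  intervals <- c("days_to_irb", "days_to_contract", "days_to_entry")
  t(sapply(sites[intervals], function(x)
    c(mean = mean(x, na.rm = TRUE), sd = sd(x, na.rm = TRUE),
      min = min(x, na.rm = TRUE),   max = max(x, na.rm = TRUE))))
}
```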

Results

Each of the seven vanguard sites was able to provide data in this initiative. Six of the sites opted to directly enter their data into the EDC, and one site utilized the batched upload functionality. Mean time to IRB approval after site selection was 124.1 days (range: 62–157 days), and mean time to contract execution was 213.1 days (range: 167–299 days). Mean time to data entry was 47.6 days (range: 2–128 days) across sites.

Sites reported some challenges while initiating this protocol. The most common barrier was difficulty identifying staff to complete the data entry, which primarily affected time to data entry. Other issues raised included the handling of incomplete records (i.e., when participants decided not to proceed with screening before demographic information was collected), difficulty tracking prescreening status and updating records accordingly, and inconsistent entry of the PTID in the prescreening EDC after eligible participants attended a study screening visit. For the batched upload site, minor formatting issues arose that required correction before the data could be incorporated into the database; these included use of incorrect separators (“/” instead of “|”) and incorrect coding of gender. Once these barriers were addressed, subsequent data entry issues were minimal.

Table 2 displays the demographic summaries and recruitment source of participants prescreened during the vanguard phase, by site. Most prescreened participants in this vanguard phase self-reported female sex, White race, and non-Hispanic ethnicity. The total number of prescreened participants varied widely among the vanguard sites (range: 3–611 participants), as did the recruitment sources of the participants. Referrals through websites (including the study website, site websites, and ClinicalTrials.gov) consistently produced a meaningful proportion of prescreening activity (55.8% overall; site range: 36.0–75.0%).

Table 2 Demographics and recruitment source by site
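The site-level tabulations behind Table 2 reduce to row-percentage tables over the prescreening records. The sketch below assumes one record per prescreened participant, with illustrative column names and recruitment-source codes.

```r
# Recruitment-source percentages by site, plus the share of prescreens
# attributable to any website referral. Column names and source codes are
# assumptions for this sketch.
source_by_site <- function(prescreen) {
  tab <- table(prescreen$site_id, prescreen$recruit_source)
  pct <- round(100 * prop.table(tab, margin = 1), 1)   # row % within each site

  web_sources <- c("study_website", "site_website", "clinicaltrials_gov")
  is_web <- prescreen$recruit_source %in% web_sources
  list(
    pct_by_site     = pct,
    pct_web_by_site = round(100 * tapply(is_web, prescreen$site_id, mean), 1),
    pct_web_overall = round(100 * mean(is_web), 1)
  )
}
```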

Table 3 summarizes recruitment source by race and ethnicity. Websites were the most common recruitment source across racial groups. Hispanic participants appeared to have been recruited through registries and local recruitment efforts more often than non-Hispanic participants.

Table 3 Recruitment source by race and ethnicity

Table 4 presents eligibility results by race and ethnicity. During the vanguard phase, 19% of prescreened participants were deemed eligible for in-person screening, though many remained in the prescreening process at the time of the data freeze.

Table 4 Eligibility by race and ethnicity

Table 5 shows the distribution of reasons participants did not continue to in-person screening, by race. The most frequent reason was loss of interest or concern about study burden, though for many participants (50%) who did not proceed to in-person screening the reason was entered as “other.”

Table 5 Reason for prescreen fail by race

Qualitative feedback collected from the vanguard sites through the monthly meetings suggested that the metrics routinely shared were helpful in guiding local recruitment efforts. Therefore, site-specific reports will be generated for all participating sites, and study-level web reports will be made available in real time to study leadership during the study-wide implementation phase of this initiative.

Discussion

Through the DART initiative, we demonstrated that the collection of prescreening data in a multi-site clinical trial is feasible. Seven vanguard sites were able to enter or upload prescreening data into an EDC developed specifically for this purpose. We also demonstrated that meaningful questions can be answered by capturing key variables from the prescreening phase.

From these preliminary data, we identified recruitment strategies that yielded prescreened participants more often than others. We also observed early trends suggesting that the effectiveness of recruitment sources may differ among racial and ethnic groups. For example, multiple sources of outreach, including local campaigns such as local television or radio interviews, accounted for a slightly higher percentage of prescreens among Hispanic than among non-Hispanic participants, though the sample size remains small. Moreover, the AHEAD 3–45 study website accounted for a high percentage of prescreened participants across several racial and ethnic groups. Given that most of the central recruitment strategies implemented for the study promoted the study website, these efforts may be a critical element in improving demographic representation in this study.

Measuring prescreen failure rates may offer important guidance to trial leadership. At these vanguard sites, loss of interest and unwillingness to take on trial burden were more frequent reasons for not proceeding to in-person screening than ineligibility based on trial enrollment criteria. This finding could help inform changes to recruitment materials or site practices that reduce burden or make research participation more appealing to potential participants. Notably, had trial enrollment criteria been a primary reason for failure to advance to in-person screening, such data would give the study team the opportunity to review and potentially revise the trial inclusion/exclusion criteria.

Next steps

As we move toward the study-wide implementation phase of DART, some changes have been made to the data collection form. As noted above, a high percentage of prescreen-fail reasons were entered as “other,” accompanied by a free-text description. In response, we expanded the options that sites can select for “Reason for Prescreen Fail.” We used the reasons written in the free-text field to expand the categories to match the exact inclusion/exclusion criteria of the trial, including “age,” “does not have additional risk factor (< 65 years old only),” “already enrolled in another clinical trial,” “no longer interested—lives too far from study site,” and “lost to follow-up/unable to contact.” The addition of these options will limit the number of “other” reasons in the expanded initiative and permit more useful and analyzable data. We also decided to eliminate the collection of occupation and education, as these variables were infrequently collected in the initial stages of the prescreening process and hence resulted in substantial missing data.
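As an illustration of how free-text descriptions can inform such recoding, the hypothetical keyword-based mapping below assigns “other” free-text reasons to the expanded categories; the keyword patterns and function name are assumptions, not the actual DART logic.

```r
# Hypothetical recoding of free-text "other" prescreen-fail reasons into the
# expanded categories. Keyword patterns are illustrative only.
recode_other_reason <- function(free_text) {
  text    <- tolower(trimws(free_text))
  recoded <- rep("other", length(text))
  recoded[grepl("\\bage\\b", text)]                         <- "age"
  recoded[grepl("risk factor", text)]                       <- "does not have additional risk factor (< 65 years old only)"
  recoded[grepl("another (clinical )?trial", text)]         <- "already enrolled in another clinical trial"
  recoded[grepl("too far|distance", text)]                  <- "no longer interested—lives too far from study site"
  recoded[grepl("lost to follow|unable to contact", text)]  <- "lost to follow-up/unable to contact"
  recoded
}

# Example:
recode_other_reason(c("Lives too far from the site",
                      "Unable to contact after three attempts"))
```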

Using these methods to centralize prescreening data collection requires effort from site personnel, project management, data management, and biostatistics, making funding an essential component of success. Ideally, resources to collect prescreening data would be included in the original study budget. Some, but not all, trials offer start-up funds for securing IRB approval, other preparatory needs, and recruiting participants for initial screens. The site effort to put a prescreening database and infrastructure in place might ideally be included as a line item in start-up budgets. Alternatively, the resources for maintaining prescreening databases might be provided as part of infrastructure resources for new or established trial site consortia.

Limitations

We acknowledge some important limitations. The DART vanguard phase included only seven sites from the AHEAD 3–45 study and a small number of data variables. This was done to limit site, participant, and coordinating center burden and to enable collection of preliminary experiences with the initiative. The prescreening initiative may duplicate effort for at least some sites that already capture prescreening data electronically, though we offered the batched upload option to minimize burden for those sites. The initiative has costs, which may limit the ability of small and/or underfunded trials and trial networks to create this infrastructure, potentially limiting the generalizability of this effort. Alternatively, collecting these data may enable efficient use of recruitment resources, potentially reducing overall trial costs. The initiative started early in the recruitment phase of the AHEAD 3–45 study, which may have affected the number of prescreens sites entered. This may accurately reflect start-up in future trials, but we have limited information related to the main stages of study accrual. The COVID-19 pandemic may also have influenced these results, given its impact on site staffing during the vanguard phase and possible effects on willingness to participate in the AHEAD 3–45 study. Finally, though the study website yielded the most prescreens, it is unclear how participants found the study website, as other advertisements may have directed them toward it. Though this is a limitation, it does support the utility of a study website as a mechanism for potential participants to connect with sites.

Conclusions

Recruitment for clinical trials is challenging and time consuming. Relying on post-consent screening data is insufficient to fully capture the effectiveness of centralized and local efforts to accrue a full sample and identify sources of selection bias. The centralized collection of prescreening data may increase the efficiency, speed, and effectiveness of study recruitment, including enrolling a cohort more representative of the population at large. The vanguard phase of this innovative prescreening database initiative demonstrated the feasibility of establishing such a database and allowed the project team to learn important lessons to increase the likelihood of a successful study-wide implementation.

Availability of data and materials

Not applicable.

Abbreviations

NIA: National Institute on Aging

ACTC: Alzheimer’s Clinical Trials Consortium

AD: Alzheimer’s disease

A4 Study: Anti-Amyloid Treatment in Asymptomatic AD study

DART: Data-driven approach to recruitment

EDC: Electronic data capture system

PTID: Participant ID

References

1. Schneider LS. Recruitment methods for United States Alzheimer disease prevention trials. J Nutr Health Aging. 2012;16(4):331–5.

2. Vellas B, Hampel H, Rouge-Bugat ME, Grundman M, Andrieu S, Abu-Shakra S, et al. Alzheimer’s disease therapeutic trials: EU/US Task Force report on recruitment, retention, and methodology. J Nutr Health Aging. 2012;16(4):339–45.

3. Kasenda B, von Elm E, You J, Blumle A, Tomonaga Y, Saccilotto R, et al. Prevalence, characteristics, and publication of discontinued randomized trials. JAMA. 2014;311(10):1045–51.

4. Gilmore-Bykovskyi AL, Jin Y, Gleason C, Flowers-Benton S, Block LM, Dilworth-Anderson P, et al. Recruitment and retention of underrepresented populations in Alzheimer’s disease research: a systematic review. Alzheimers Dement (N Y). 2019;5:751–70.

5. Nuno MM, Gillen DL, Dosanjh KK, Brook J, Elashoff D, Ringman JM, et al. Attitudes toward clinical trials across the Alzheimer’s disease spectrum. Alzheimers Res Ther. 2017;9(1):81.

6. FDA. Enhancing the diversity of clinical trial populations - eligibility criteria, enrollment practices, and trial designs: guidance for industry. 2020.

7. Grill JD, Sperling RA, Raman R. What should the goals be for diverse recruitment in Alzheimer clinical trials? JAMA Neurol. 2022;79(11):1097–8.

8. Oh SS, Galanter J, Thakur N, Pino-Yanes M, Barcelo NE, White MJ, et al. Diversity in clinical and biomedical research: a promise yet to be fulfilled. PLoS Med. 2015;12(12):e1001918.

9. Wendler D, Kington R, Madans J, Van Wye G, Christ-Schmidt H, Pratt LA, et al. Are racial and ethnic minorities less willing to participate in health research? PLoS Med. 2006;3(2):e19.

10. Grill JD, Galvin JE. Facilitating Alzheimer disease research recruitment. Alzheimer Dis Assoc Disord. 2014;28(1):1–8.

11. Sperling RA, Karlawish J, Johnson KA. Preclinical Alzheimer disease - the challenges ahead. Nat Rev Neurol. 2013;9(1):54–8.

12. Sperling RA, Rentz DM, Johnson KA, Karlawish J, Donohue M, Salmon DP, et al. The A4 study: stopping AD before symptoms begin? Sci Transl Med. 2014;6(228):228fs13.

13. Raman R, Quiroz YT, Langford O, Choi J, Ritchie M, Baumgartner M, et al. Disparities by race and ethnicity among adults recruited for a preclinical Alzheimer disease trial. JAMA Netw Open. 2021;4(7):e2114364.

14. Tarrant SD, Bardach SH, Bates K, Nichols H, Towner J, Tamatha C, et al. The effectiveness of small-group community-based information sessions on clinical trial recruitment for secondary prevention of Alzheimer’s disease. Alzheimer Dis Assoc Disord. 2017;31(2):141–5.

15. NIA/NIH. Alzheimer’s Disease and Related Dementias Clinical Studies Recruitment Planning Guide. 2019. Available from: https://www.nia.nih.gov/sites/default/files/2019-05/ADEAR-recruitment-guide-508.pdf.

16. ClinicalTrials.gov. AHEAD 3–45 study (NCT04468659). Available from: https://clinicaltrials.gov/ct2/show/NCT04468659.

17. Rafii MS, Sperling RA, Donohue MC, Zhou J, Roberts C, Irizarry MC, et al. The AHEAD 3–45 Study: design of a prevention trial for Alzheimer’s disease. Alzheimers Dement. 2022;19(4):1227–33.

18. Jimenez-Maggiora GA, Bruschi S, Qiu H, So JS, Aisen PS. Corrigendum to: ATRI EDC: a novel cloud-native remote data capture system for large multicenter Alzheimer’s disease and Alzheimer’s disease-related dementias clinical trials. JAMIA Open. 2022;5(1):ooac008.


Acknowledgements

The authors would like to acknowledge the staff that contributed to this effort at ATRI and seven vanguard sites: Martha Muniz, Nitya Rajasekaran, Raymond Scott Turner, Megan Hall, Anita Ranta, Yolanda Tillman, Jaimie Ziolkowski, Shirley Sirivong, Beatriz Yanez, Dan Hoang, Mary Nguyen, Wilma Burns, Lauren Mackenzie, Yazleen Reyes, Gaby Campos-Cortes, Olusegun Adegoke, Ana Rodriguez, and Akpevweoghene Ikoba.

Funding

This initiative was supported by the NIH/NIA (5U24AG057437) and Eisai Inc.

Author information

Authors and Affiliations

Authors

Contributions

Dylan Kirn, Rema Raman, and Josh Grill wrote the main manuscript text. Dylan Kirn prepared Figure 1. Karin Ernstrom and Shunran Wang prepared Tables 1–5. All authors reviewed the manuscript and significantly contributed to the design and implementation of this initiative. The authors read and approved the final manuscript.

Corresponding author

Correspondence to Dylan R. Kirn.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the central IRB (Advarra, Columbia, MD) and was conducted in accordance with the Declaration of Helsinki. Advarra granted the study a Waiver of Consent and Waiver of HIPAA after determining that the waiver satisfied the Common Rule and the criteria set forth in the HIPAA Privacy Rule at 45 CFR 164.512(i)(2).

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Kirn, D.R., Grill, J.D., Aisen, P. et al. Centralizing prescreening data collection to inform data-driven approaches to clinical trial recruitment. Alz Res Therapy 15, 88 (2023). https://doi.org/10.1186/s13195-023-01235-4
