Portable stroke detection devices: a systematic scoping review of prehospital applications

Abstract

Background

The worldwide burden of stroke remains high, and longer time-to-treatment is associated with worse outcomes. Yet the key determinations of stroke versus non-stroke and ischemic versus hemorrhagic stroke are not confirmed until in-hospital CT diagnosis, resulting in suboptimal prehospital triage and delayed treatment. In this study, we survey portable, non-invasive diagnostic technologies that could streamline triage by making this initial determination of stroke type, thereby reducing time-to-treatment.

Methods

Following PRISMA guidelines, we performed a scoping review of portable stroke diagnostic devices. The search was executed in PubMed and Scopus, and all studies testing technology for the detection of stroke or intracranial hemorrhage were eligible for inclusion. Extracted data included type of technology, location, feasibility, time to results, and diagnostic accuracy.

Results

After screening 296 studies, 16 papers were selected for inclusion. Studied devices utilized various types of diagnostic technology, including near-infrared spectroscopy (6), ultrasound (4), electroencephalography (4), microwave technology (1), and volumetric impedance spectroscopy (1). Three devices were tested prior to hospital arrival, 6 were tested in the emergency department, and 7 were tested in unspecified hospital settings. Median measurement time was 3 minutes (IQR: 3 to 5.6 minutes). Several technologies showed high diagnostic accuracy in severe stroke and intracranial hematoma detection.

Conclusion

Numerous emerging portable technologies have been reported to detect and stratify stroke to potentially improve prehospital triage. However, the majority of these current technologies are still in development and utilize a variety of accuracy metrics, making inter-technology comparisons difficult. Standardizing evaluation of diagnostic accuracy may be helpful in further optimizing portable stroke detection technology for clinical use.

Background

Stroke is a severe medical emergency and a leading cause of morbidity and mortality worldwide, [1] causing approximately 1 in every 19 deaths in the United States alone [2].

Advances in stroke treatment, particularly endovascular therapy (EVT), have been shown to be highly effective in improving functional outcome in patients with emergent large vessel occlusion (LVO) [3,4,5,6]. However, EVT outcomes are time-dependent, with every 15-minute decrease in time from stroke onset to EVT arterial puncture associated with an increased chance of independent ambulation (absolute increase 1.14% [95% CI: 0.75–1.53%]) and modified Rankin Scale 0–2 (absolute increase 0.91% [95% CI: 0.45–1.36%]) at discharge [6, 7]. With only 10% of stroke centers capable of providing EVT, [8, 9] delayed or inaccurate prehospital diagnosis may increase the need for interhospital transfer, which is associated with an average 116-minute delay, thereby increasing time-to-revascularization for LVO patients and placing undue burden on stroke centers tasked with treating incorrectly diagnosed stroke mimics [10, 11]. Finally, emerging evidence that timely treatment of intracranial hemorrhage is associated with improved clinical outcomes further suggests that accurate early recognition of different stroke types could have a substantial effect on recovery after stroke [12].

Currently, prehospital diagnosis of stroke relies on stroke triage scales, and the results of these assessments provide the basis for emergency medical service (EMS) transport decisions. The diagnostic accuracy of such scales in detecting large vessel occlusions, however, is low, ranging from 55 to 89% sensitivity, 40–92% specificity, and 0.73–0.78 area under the ROC curve (AUC) [13,14,15,16,17,18,19]. This variability in accuracy may be the result of varying stroke types, severity, and presenting symptoms, as well as inter-state protocol differences, resulting in misdiagnosis of stroke types and, as a consequence, selection of less appropriate hospital types for initial stroke admission and increased necessity for interhospital transfer [20,21,22,23].

Thus, the integration of portable diagnostic technology in standard prehospital stroke care may improve triage and play an important role in reducing transportation time to the appropriate hospital, allowing for increased treatment efficiency and improved functional outcomes for stroke patients. Therefore, the purpose of this study was to conduct a systematic scoping review intended to (1) identify and characterize novel portable technologies with the potential to diagnose stroke in the prehospital setting; (2) report diagnostic accuracy and feasibility of use of identified technologies; and (3) assess the quality of any included studies.

Methods

A systematic scoping review methodology was selected to identify available diagnostic accuracy and feasibility of use data for novel portable stroke detection devices and to identify any knowledge gaps in this emerging field, as recommended by previous literature [24] and an experienced librarian. Due to the scoping nature of this study, its protocol has not been prospectively registered. This study utilizes the methodology framework described by Arksey and O’Malley [25] and follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR) reporting guidelines [26].

Review inclusion criteria

Any study reporting a portable, non-invasive technology or device with potential for prehospital detection of stroke or intracranial hemorrhage was eligible for inclusion. Studies were required to report diagnostic accuracy results (including, but not limited to, specificity, sensitivity, and/or area under the curve) after testing on patient cohorts experiencing ischemic stroke, hemorrhagic stroke, or intracranial hemorrhage in clinical settings for inclusion.

Review exclusion criteria

Studies were excluded if they reported on technology that requires use of specialized vehicles (e.g., mobile stroke units using mobile CT technology) and could not be translated into handheld, easily transported devices. Studies solely testing computer algorithm-based stroke detection without other portable technology applications were also excluded, as were studies that tested technology only on phantoms rather than real-world patients. Case studies, studies not written in English, and studies without available full texts were also excluded.

Search strategy

A search strategy was developed alongside an experienced librarian following a series of preliminary searches identifying studies and key terms relevant to the study questions. Final search strategies were applied to PubMed and Scopus until January 2021 and are found in Additional file 1: Appendix 1. Similar articles and articles cited by included studies were also retrieved.

Study selection

Retrieved articles were imported into EndNote X9 citation software, and following de-duplication, two reviewers (S.C. and R.K.) screened titles and abstracts and retained full-text studies based on the identified inclusion and exclusion criteria to select the final list of included studies. Any concerns about inclusion were resolved by discussion with author C.P.K.

Data extraction

Data were extracted independently by two reviewers (S.C. and R.K.). Any discrepancies were resolved by discussion. A data extraction tool was developed using previous reporting frameworks and published reviews as guidance [27, 28]. Included fields consisted of general study design as well as more in-depth characterization of technologies, diagnostic accuracy, and feasibility of use. The extraction tool also included a field for cost of application, but as no identified studies reported cost data, that field is not reported here.

Data synthesis

No further data analysis was undertaken due to the lack of standardized reporting for diagnostic accuracy in this field and the scoping nature of this review.

Quality assessment

The quality of included studies was assessed independently by two reviewers (S.C. and R.K.) using the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool, [29] with any discrepancies resolved by discussion. The QUADAS-2 tool reviews risk of bias and applicability specific to diagnostic accuracy reporting across four domains: patient selection, index test, reference standard, and flow and timing. Risk of bias is reported in each domain as high, low, or unclear.

Results

Our review identified 227 studies after application of the search strategy and deduplication. A total of 81 studies were selected for full text screening, during which 65 studies were excluded for the reasons listed in Fig. 1 based on review criteria. Sixteen studies were included in the final analysis. Assessed risk of bias of each included study is shown in Table 1.

Fig. 1 PRISMA flow diagram for study inclusion

Table 1 QUADAS-2 tool for risk of bias assessment for included studies

Characteristics of the included studies are summarized in Table 2. Six of the included studies described near-infrared spectroscopy (NIRS) technology, 4 tested ultrasound technologies, 4 used diagnostic electroencephalography (EEG) scanning, 1 studied microwave technology, and 1 used a volumetric impedance phase shift spectroscopy (VIPS) device. Most studies were conducted in North America (7) and Europe (6), with the largest number of devices tested in the United States (6). Three of the studies described prehospital testing of devices (either on-site or during hospital transport). Six devices were tested in emergency departments (EDs), and the remaining 7 were tested in hospital or stroke center settings. Most studies required trained personnel or clinicians for device use, while output was often computer or device-generated. All identified studies described non-invasive, portable devices with potential prehospital applications. Median time for measurement was 3 minutes (IQR: 3 to 5.6 minutes). No cost data were available in any study.

Table 2 Characterization of Included Studies

Stroke detection

Ten devices were used for the purpose of stroke detection, including differentiating strokes from stroke mimics or healthy controls, detecting middle cerebral artery (MCA) occlusion, detecting or differentiating severe stroke or large vessel occlusions (LVOs), and differentiating between stroke types (Table 3).

Table 3 Characterization of included portable stroke technologies and associated diagnostic accuracy metrics

Three devices were used to differentiate stroke from stroke mimics or healthy controls, including two studies testing EEG devices and one study using ultrasound technology.

Michelson, et al. [30] presented a US multicenter study analyzing the diagnostic accuracy of a hand-held EEG device (BrainScope Co., Inc.) in detecting stroke in 183 patients (31 ischemic stroke, 17 hemorrhagic stroke, 135 stroke mimic confirmed by CT/MRI) presenting to the ED. Ten minutes of EEG data were recorded, and data were analyzed using a derived Structural Brain Injury Index (SBII) algorithm. Stroke detection had a sensitivity of 91.7% and a specificity to stroke mimic of 50.4%. The study reported > 90% accuracy in detecting ischemic and hemorrhagic stroke, with 80% sensitivity to CT-negative/MRI-positive ischemic stroke. The authors noted that many included patients may have experienced prior strokes or transient ischemic attacks, which may disrupt baseline EEG recordings, and thus clinical applicability may benefit from inclusion of a control group.

Wilkinson, et al. [31] discussed a small single-center study that used the Muse (InteraXon Inc., Toronto, ON) electroencephalography system to detect stroke severity in 25 patients presenting to a university hospital in Canada. Patients were assessed an average of 3.71 days after stroke onset. The authors noted an increase in the delta/alpha ratio and the (delta+theta)/(alpha+beta) ratio in ischemic stroke patients with increased severity, as well as a low-frequency decrease and high-frequency increase in the pairwise-derived Brain Symmetry Index in stroke patients compared to controls. The device was able to differentiate moderate/severe stroke from small strokes and controls (as diagnosed by CT, MRI, and stroke scale assessment) with a sensitivity of 63% and specificity of 86%. Scans were recorded for 3 minutes in eyes-open and eyes-closed states. Feasibility of prehospital use may benefit from automatability of EEG interpretation, but results from this study are limited by the small sample size and may be less applicable to more acute stroke presentations.
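
To make the reported spectral measures concrete, the sketch below computes a delta/alpha ratio and a (delta+theta)/(alpha+beta) ratio from a single EEG channel using standard frequency-band definitions. It is an illustrative sketch only: the band edges, sampling rate, window length, and synthetic signal are assumptions and are not parameters taken from Wilkinson, et al.

```python
# Illustrative sketch of EEG band-power ratios of the kind reported by
# Wilkinson, et al.; band edges and sampling rate are assumed, not study values.
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # Hz

def band_powers(eeg: np.ndarray, fs: float) -> dict:
    """Estimate absolute power in each frequency band from one EEG channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))  # 2-second Welch windows
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[mask], freqs[mask])  # integrate PSD over the band
    return powers

def stroke_ratios(eeg: np.ndarray, fs: float) -> tuple:
    """Return (delta/alpha ratio, (delta+theta)/(alpha+beta) ratio)."""
    p = band_powers(eeg, fs)
    dar = p["delta"] / p["alpha"]
    dtabr = (p["delta"] + p["theta"]) / (p["alpha"] + p["beta"])
    return dar, dtabr

if __name__ == "__main__":
    fs = 256.0  # assumed sampling rate
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal(int(fs * 180))  # 3 minutes of placeholder signal
    print(stroke_ratios(eeg, fs))
```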

Herzberg, et al. [32] described a single-site study in Germany that used two portable ultrasound machines capable of transcranial color-coded sonography (TCCS), the SonoSite Micromaxx (SonoSite Inc., Bothell, Wash., USA) and the Philips CX50 (Philips Ultrasound, Bothell, Wash., USA), to detect stroke in 102 patients on-site or during ambulance transport. Examinations were performed by a TCCS-experienced neurologist, lasted 5.6 minutes on average, and were reviewed by a certified sonographer. Compared with in-hospital CT, CTA, or MRA imaging, TCCS was able to differentiate stroke from stroke mimics with 94% sensitivity and 48% specificity. The authors noted increased diagnostic accuracy in detection of MCA occlusion with the addition of TCCS examination.

Detection of MCA occlusion

One study, Schlachetzki, et al. [33], also described the use of SonoSite Micromaxx and Philips CX50 in the prehospital setting for the detection of middle cerebral artery (MCA) occlusion on-site or during ambulance transit in Germany. Contrast agent microbubbles were added in some cases and improved results in cases with inadequate temporal bone windows. The device was used to examine 113 enrolled patients with symptoms of acute ischemic stroke, and examinations were performed by board-certified stroke neurologists or a senior resident with neurosonography certification. Based on CT angiography, MR angiography, and in-house neurosonography, 9 out of 10 MCA occlusions were identified correctly, while patent MCAs were correctly identified in 75 out of 76 cases. Sensitivity and specificity were reported to be 90 and 98%, respectively. The average time needed to perform the TCCS scan was 5.6 minutes and did not delay prehospital management. Prehospital use may be partially limited by the expertise and training needed to perform ultrasound examinations.

Detection of LVO/severe stroke

Five studies used devices to detect LVO or severe stroke, with 2 studies testing ultrasound technologies, 2 testing EEG devices, and 1 using a VIPS device. One study [34] also reported diagnostic accuracy of ICH detection with the same ultrasound device, and another study [35] reported higher diagnostic accuracy in detecting any stroke/TIA than detecting LVO alone with an EEG device.

Antipova, et al. [34] described a single-center study that performed non-contrast TCCS using a SonoSite M-Turbo Point-of-Care ultrasound machine, Philips Sparq, or Philips CX50 ultrasound on 107 patients presenting to a general hospital in the UK with acute stroke symptoms. Imaging was performed by either an experienced sonographer or a neurologist with several years of experience in transcranial ultrasonography; results were interpreted by the neurology team. Compared to reference CT imaging, large vessel occlusion (LVO) was detected correctly in 7/13 suspected LVO patients (4 cases were missed and 2 had inadequate temporal windows). ICH was correctly identified in 10/18 suspected patients. Use of transcranial ultrasound improved ICH detection from clinical assessment alone by 10% (57% increased sensitivity with no change in specificity). LVO detection based on clinical assessment and transcranial ultrasound showed a sensitivity of 55% and specificity of 97%, with a 6% overall improvement in detection. Time needed for scan completion ranged from 7 to 49 minutes, with a median of 20 minutes. The study presents a triage model combining both clinical assessment and ultrasound, which could reduce the need for interhospital transfers. However, clinical use may be limited by the time required for scan completion, expertise requirements for operation, and temporal window insufficiencies.

Erani, et al. [35] tested the Quick-20 (Cognionics, Inc., San Diego, CA) electroencephalography system in 100 ED patients with suspected acute stroke, including 43 patients with ischemic stroke, 7 with ICH, and 13 with TIA. While EEG variables alone resulted in a 65% sensitivity at 80% specificity with an AUC of 78.2 in detecting acute stroke/TIA, combined clinical and EEG measures with deep learning increased sensitivity to 79% with an AUC of 87.8. Similarly, EEG alone had a 41% sensitivity and an AUC of 68.9 in identifying LVO, while the combination increased sensitivity to 76% and AUC to 86.4. Overall, the authors noted that the diagnostic accuracy of clinical data and EEG combined was better than either alone at detecting acute stroke/TIA or LVO. Scans were recorded for 3 minutes.

Kellner, et al. [19] tested a volumetric impedance phase shift spectroscopy (VIPS) visor device manufactured by Cerebrotech in 248 subjects including 41 stroke code patients, 79 healthy volunteers, and 128 patients presenting to a comprehensive stroke center with varied neurological pathologies. The diagnostic accuracy of the device in differentiating severe stroke from minor stroke and severe stroke from all other pathologies was evaluated. Severe stroke was defined as emergent large vessel occlusion (ELVO), severe intracranial stenosis with National Institutes of Health Stroke Scale (NIHSS) scores ≥6, ICH ≥60 mL, and large territorial strokes. Authors noted differences in detected mean bioimpedance asymmetry (MBA) between severe stroke, mild stroke, and control patients. The device performed with a specificity of 92% and sensitivity of 93% in differentiating severe stroke from small stroke and a specificity of 87% and sensitivity of 93% in differentiating severe stroke subjects from all other cases. Three scans were taken in succession by study personnel trained by Cerebrotech in device usage, and total scan time was approximately 30 seconds. Future studies validating results from this preliminary analysis are needed.

Thorpe, et al. [36] studied the diagnostic accuracy of two transcranial doppler (TCD) ultrasound metrics, Velocity Asymmetry Index (VAI) and Velocity Curvature Index (VCI), with 2-MHz handheld ultrasound probes to detect LVO in 66 subjects (33 CTA-confirmed LVO, 33 in-hospital controls). TCD scans were performed by trained technicians and were recorded over 30 seconds. Means of both VAI and VCI were found to be greater in control subjects relative to LVO. Sensitivities and specificities for VAI and VCI metrics varied slightly based on specified statistical thresholds, yet the authors noted the superiority of the VCI metric compared to VAI.

Sergot, et al. and the EDGAR Study Group [37] conducted a multicenter study testing the PLD (Forest Devices, Inc., Pittsburgh, PA) in detecting LVO in a cohort of 109 patients (25 LVO, 38 non-LVO ischemic, 14 hemorrhagic, 32 mimics confirmed by CTA). The device demonstrated 80% sensitivity and specificity in LVO discrimination. Scans were performed by users after a 1-hour training session and required a median of 4.6 minutes to conduct. The authors noted that PLD data were obtained after imaging and intravenous thrombolysis in most cases, which may have impacted PLD accuracy, indicating the need for validation studies prior to clinical use.

Stroke characterization

Persson, et al. [38] performed two proof-of-principle clinical studies testing two different microwave-based prototype devices with machine-learning-derived data analysis to differentiate ischemic stroke from hemorrhagic stroke. In the first study, neurophysiology and engineering staff tested the first prototype helmet device on 20 acute stroke patients (9 ischemic, 11 hemorrhagic). When the detector was set to diagnose all ICH patients, 7/11 ischemic stroke (IS) patients could be differentiated from ICH patients. Diagnostic accuracy, measured by AUC, was determined to be 0.88. In the second study, nursing staff tested the second prototype on 25 patients (15 IS, 10 ICH). When the detector was set to diagnose all 10 ICH patients, 14/15 IS patients were correctly differentiated from ICH patients. The authors noted an AUC of 0.85 in differentiating ICH and IS patients and an AUC of 0.87 in differentiating ICH patients from healthy controls. Scan time was not reported. Larger cohort studies are likely needed to better characterize the potential of microwave-based prehospital stroke diagnosis.

Detection of intracranial hemorrhage

Six studies tested an Infrascanner NIRS device to detect intracranial hemorrhage (Table 3).

Robertson, et al. [39] conducted a multicenter study evaluating the diagnostic accuracy of the near-infrared (NIR)-based Infrascanner device (InfraScan, Inc.) in detecting intracranial traumatic hematomas in 365 patients (269 controls and 96 patients with intracranial hemorrhage). Scanning was completed by study personnel who completed a half-day NIR device training. Device performance was affected by type and size of hemorrhage. Authors reported a 68.70% sensitivity and 90.70% specificity in detecting any intracranial hemorrhage and 88% sensitivity and 90.70% specificity in detecting hematomas with volume > 3.5 mL and distance < 2.5 cm from brain surface, the detection limits of the device. The entire scan required less than 2 minutes to complete and was performed within 40 minutes of the comparator CT scan. Clinical use of this device may be limited by lack of diagnostic accuracy data in patients with scalp lacerations or head injuries.
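
As context for how NIRS-based screening of this kind works, devices such as the Infrascanner are commonly described as comparing reflected near-infrared light intensity at symmetric left/right scalp locations, since a hematoma absorbs more light and reduces the reflected signal on the affected side. The sketch below is a minimal illustration of such a left/right optical-density comparison; the 0.2 decision threshold and the example intensity values are assumptions for illustration, not parameters reported in the included studies.

```python
# Minimal sketch of a left/right optical-density comparison of the kind
# described for NIRS hematoma screening. Threshold and intensities are
# illustrative assumptions, not device or study parameters.
import math

def delta_optical_density(intensity_left: float, intensity_right: float) -> float:
    """Absolute left/right difference in optical density of reflected NIR light."""
    return abs(math.log10(intensity_left / intensity_right))

def flag_possible_hematoma(intensity_left: float, intensity_right: float,
                           threshold: float = 0.2) -> bool:
    """Flag a symmetric site pair if the reflected-light asymmetry exceeds the threshold."""
    return delta_optical_density(intensity_left, intensity_right) > threshold

if __name__ == "__main__":
    # Hypothetical reflected-intensity readings (arbitrary units)
    print(flag_possible_hematoma(1.00, 0.55))  # marked asymmetry -> True
    print(flag_possible_hematoma(1.00, 0.95))  # near-symmetric -> False
```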

The remaining five studies (Liang, et al. [40]; Xu, et al. [41]; Peters, et al. [42]; Yuksen, et al. [43]; and Kontojannis, et al. [44]) were smaller single-center studies in China, the Netherlands, Thailand, and the UK that used the newer Infrascanner Model 2000 device (InfraScan, Inc., Philadelphia, PA, USA). Sensitivity for intracranial hematoma detection ranged from 75 to 100%, and specificity ranged from 44.4 to 93.6%. Restricting identification to hematomas within the Infrascanner detection limits (volume > 3.5 mL and depth < 2.5 cm from the brain surface) improved sensitivity to 89.36–100%. Scans were all conducted by trained operators and required 3–4 minutes for completion.

Differences in diagnostic accuracy values may be explained by size of hematomas (sensitivity was reported to be higher in patients with larger bleeds [39]), which may indicate the necessity of future studies testing the device in larger patient cohorts and stratifying diagnostic accuracy by hemorrhage size or type. Differences could also be explained by selection of different patient populations (with differences in skin and hair color potentially altering device sensitivity), differences in operator training protocols, and differences in study setting (two studies reported incomplete scans that did not assess all brain regions due to difficulty accessing those regions in prehospital or emergency environments).

Discussion

Many emerging portable stroke technologies can detect and differentiate stroke subtypes and severity. Three studies tested devices differentiating stroke from stroke mimics and controls, [30,31,32] one detected MCA occlusion, [33] five detected LVO and severe stroke, [19, 34,35,36,37] one differentiated between ischemic and hemorrhagic stroke, [38] and six detected intracranial hematoma [39,40,41,42,43,44].

While current prehospital stroke diagnosis relies on stroke triage scales, diagnostic accuracy for EMS stroke or TIA identification with triage scales alone remains low (positive predictive value 34.3% [95% CI: 33.7–35.0], sensitivity 64.0% [95% CI: 63.0–64.9]) [45]. Many of the selected studies presented technologies that were capable of detecting and stratifying stroke with higher diagnostic accuracy than clinical assessment or triage scales alone. Integration of technology with clinical assessment may enhance stroke detection and reduce false-positive stroke diagnosis in the prehospital setting, which could allow EMS to make more informed decisions about bypass transportation to EVT-capable comprehensive stroke centers, thereby reducing time-to-treatment and improving clinical outcomes for severe stroke patients.

All the included studies were designed in a manner that allowed them to calculate diagnostic accuracy, and most described some level of blinding, with either device operators blinded to reference standard findings or clinicians blinded to device findings. Blinding to clinical presentation was not possible in any study, and time between device scanning and reference test varied by study, increasing risk of bias. Only three studies were conducted in prehospital settings, [32, 33, 42] which may limit the generalizability of findings from ED or hospital settings to prehospital use. Prehospital applications may also be limited by training requirements for device use. Only a few identified articles were larger multicenter studies; thus, validation studies with larger sample sizes may be needed prior to clinical application.

Previous studies have reported similar categories of portable stroke detection technology. Walsh, et al. [46] reported ten devices in development, most of which had not been tested nor published in peer-reviewed journals. Martinez-Gutierrez, et al. [47] presented similar devices in development, along with mobile stroke unit technology and stroke scale applications, both of which were excluded here. Lumley, et al. [27] discussed a large variety of prehospital diagnostic technologies, including blood biomarkers and telemedicine technology, but only included two studies involving imaging technology [48, 49]. Shahrestani, et al. [50] reviewed stroke point-of-care technologies tested on human subjects or phantom head models, several of which are presented here; this review, however, focuses on devices with available human-centered diagnostic accuracy metrics. Several other studies [51,52,53,54,55,56,57,58,59,60,61] identified by the search strategy are not presented here due to a lack of complete diagnostic accuracy data, but may prove to be promising technologies for prehospital stroke detection in the future.

While this paper provides a systematic approach to identifying emerging portable stroke detection devices, it also has several limitations. The systematic nature of this paper limits its scope in that it excludes studies that are unpublished, not peer-reviewed, or lacking diagnostic accuracy data. Thus, emerging technologies in preliminary stages of testing, without published diagnostic accuracy in human subjects, were not identified. Furthermore, this study does include technologies with intended prehospital use that have not yet been tested in prehospital settings, many of which may require validation in that setting prior to clinical use. Finally, the protocol for this systematic scoping review was not registered, and due to the nature of the included literature, no pooled analyses were performed.

Future directions

In the future, diagnostic accuracy reporting for portable stroke detection devices should be standardized. Differences in study design and study populations, compounded by differences in reported accuracy metrics and protocols, make inter-technology comparisons difficult. More comprehensive reporting of reference imaging standards, average imaging time and time to imaging results, device invasiveness and portability, expertise requirements for device use and result analysis, device training protocols, and eventually expected device costs would better inform clinical applications of each device and reduce study bias. Furthermore, patient populations assessed with device technology varied widely across the included studies; clear reporting of patient populations, characteristics, and clinical presentations would clarify potential uses and settings for technology implementation. While early device validation in hospital settings provides a valuable method to assess a preliminary patient population within reasonable limits of time and personnel, performing validation studies of early promising devices with larger, multicenter studies in prehospital settings would better establish the value and efficacy of current technologies. Finally, few studies compared the diagnostic accuracy of device use with the accuracy of clinical assessment alone; such comparisons may better inform the value of these devices in prehospital triage beyond current standards of practice.

The impact of portable stroke technology on reducing time-to-treatment, improving patient outcomes, and reducing healthcare costs should also be investigated by modeling the implementation of reviewed technologies at various points of care in prehospital and hospital settings. By integrating literature-based estimates of functional outcome benefits and cost reduction relative to the time saved through technology integration, such a model may inform future methods to optimize prehospital triage and EMS bypass policies.
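
As a minimal sketch of such a model, the snippet below scales the per-15-minute absolute outcome benefits cited in the Background (a 1.14% increase in independent ambulation and a 0.91% increase in mRS 0–2 at discharge per 15 minutes saved [6, 7]) by a hypothetical time saving and patient volume. The assumption of linear scaling, the minutes saved, and the annual patient count are illustrative placeholders, not estimates derived from the included studies.

```python
# Toy model: scales the per-15-minute absolute benefit estimates cited in the
# Background by a hypothetical time saving. Linear scaling, minutes saved, and
# patient volume are assumptions for illustration only.

BENEFIT_PER_15_MIN = {
    "independent_ambulation": 0.0114,   # absolute increase per 15 min saved [6, 7]
    "mRS_0_2_at_discharge": 0.0091,
}

def projected_benefit(minutes_saved: float, annual_lvo_patients: int) -> dict:
    """Project additional patients per year reaching each outcome, assuming the
    cited per-15-minute effects scale linearly with time saved."""
    scale = minutes_saved / 15.0
    return {
        outcome: {
            "absolute_increase": rate * scale,
            "additional_patients_per_year": rate * scale * annual_lvo_patients,
        }
        for outcome, rate in BENEFIT_PER_15_MIN.items()
    }

if __name__ == "__main__":
    # Hypothetical scenario: 30 minutes saved by avoiding interhospital transfer
    # for 200 LVO patients per year at one EVT-capable center.
    for outcome, result in projected_benefit(30, 200).items():
        print(outcome, result)
```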

Key recommendations

The results of this review should be considered when designing future detection technologies and reporting on these trials. To begin, we recommend standardization in reporting of diagnostic accuracy statistics. Future analyses should report, at minimum, the following variables: specificity, sensitivity, ROC curves, positive predictive value (PPV), and negative predictive value (NPV). The comparator reference standard should also be clearly indicated. Reporting these values will allow meaningful and complete comparisons to be drawn between devices and will be important in guiding future research.
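
To make this recommended minimum reporting set concrete, the sketch below derives sensitivity, specificity, PPV, and NPV from a 2 × 2 table of device results against a reference standard (e.g., CT/CTA). The counts used in the example are invented for illustration and do not correspond to any included study.

```python
# Minimal illustration of the recommended core metrics, computed from a
# hypothetical 2x2 table of device result vs. reference standard (e.g., CTA).

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV, and NPV from true/false positive/negative counts."""
    return {
        "sensitivity": tp / (tp + fn),  # proportion of reference-positive cases detected
        "specificity": tn / (tn + fp),  # proportion of reference-negative cases ruled out
        "ppv": tp / (tp + fp),          # probability a positive device result is truly positive
        "npv": tn / (tn + fn),          # probability a negative device result is truly negative
    }

if __name__ == "__main__":
    # Invented counts: 40 true positives, 15 false positives,
    # 10 false negatives, 85 true negatives.
    for name, value in diagnostic_metrics(tp=40, fp=15, fn=10, tn=85).items():
        print(f"{name}: {value:.2f}")
```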

The patient populations assessed by technologies should be clearly defined. Most included studies reported eligibility criteria, as well as settings and dates of participant enrollment. The method of patient enrollment (randomized, consecutive, etc.) should be stated. Importantly, included population characteristics should be described in detail. Beyond demographic characteristics, studies should include the clinical presentation and impression of suspected stroke as well as results of any conducted pre-hospital or neurological triage scales. Such reporting would allow for better comparison between reported devices and potential applications.

In addition, standard operating characteristics should be reported in future studies. This information should clearly indicate the following: level of training required for device use, identification of the individual operating the device and any relevant qualifications (physician, trained research technician, etc.), setting and location of technology use, timing of device use within the chain of stroke care, and time to scan completion. The method of scan analysis should also be reported, including a description of device results and any rationale or algorithms underlying the final determination of stroke identification. Additionally, the degree of blinding in the research protocol (for device operators, individuals assessing device results, and individuals assessing the reference test) should be clearly stated. These data are essential to reproducibility, quality assessment, and evaluation of potential for prehospital use; while several studies did include this information in some capacity, it should be clearly stated in the methods section of all future literature analyzing novel devices.
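
A hypothetical structured record capturing these operating characteristics is sketched below; the field names and example values are illustrative assumptions rather than part of any existing reporting standard or included study.

```python
# Hypothetical structured record for the operating characteristics recommended
# above; field names and example values are illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeviceOperatingReport:
    operator_role: str                 # e.g., "physician", "trained research technician"
    operator_training: str             # level/duration of device-specific training
    setting: str                       # e.g., "ambulance", "emergency department"
    timing_in_care_chain: str          # when the device was applied relative to triage/imaging
    scan_time_minutes: float           # time to scan completion
    analysis_method: str               # e.g., "automated algorithm", "expert interpretation"
    operator_blinded_to_reference: bool
    result_assessor_blinded_to_reference: bool
    reference_assessor_blinded_to_device: Optional[bool] = None

# Example entry for a hypothetical prehospital EEG study
example = DeviceOperatingReport(
    operator_role="EMS paramedic",
    operator_training="1-hour device training session",
    setting="ambulance",
    timing_in_care_chain="before hospital arrival, prior to CT",
    scan_time_minutes=4.6,
    analysis_method="automated device-generated classification",
    operator_blinded_to_reference=True,
    result_assessor_blinded_to_reference=True,
)
```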

Furthermore, we propose that ease of use and applicability must be prioritized when designing future interventions. As noted above, some of the included devices required highly specialized operators, limiting the potential to make these devices universal. Devices that require minimal training for use and interpretation will be the most feasible to incorporate into prehospital personnel training for use in the prehospital environment. Few studies reported the additional benefit of devices over the diagnostic accuracy of clinical assessment or prehospital triage scales alone. Importantly, the applicability and value of stroke detection devices in a real-world setting would be greatly informed by such comparisons, which should be considered in the design of future studies of device-based diagnostic accuracy.

Finally, although all the devices included in this study were portable, they were tested in various contexts, from prehospital settings to specialized hospital units. Early device validation in hospital settings may be more feasible for device assessment in a standard patient population within reasonable limits of time and expertise. However, future RCTs may benefit from analyzing device use in multiple settings, particularly after initial validation, and should report this setting-specific diagnostic data. This future literature can hopefully lead to the development of stroke detection devices that are specialized for use in different parts of the care chain, from emergency situations in the prehospital environment to daily monitoring during in-patient care.

As highlighted by this study, there is a current lack of standardization in reporting of diagnostic accuracy data for portable stroke detection devices. In 2015, Cohen et al. developed a checklist, the Standards for Reporting of Diagnostic Accuracy Studies (STARD), that may be used to ensure sufficient reporting of data in a range of fields [62]. Subsequently, field-specific STARD guidelines have emerged for the research of various disease states, including dementia [63] and various infectious processes, [64] among others [65]. In line with this past literature, we recommend the development of STARD-PSD guidelines, focused on creating standards for portable stroke device technology.

Conclusion

While numerous portable, non-invasive technologies have emerged as promising tools for the detection and stratification of stroke subtypes, most are still in development and have not yet been tested in large multicenter or prehospital settings. Moreover, included studies report a variety of study designs, study populations, and diagnostic accuracy metrics, making inter-technology and inter-device comparisons particularly difficult. Standardized reporting of diagnostic accuracy metrics, requirements for device training and use, studied patient populations and characteristics, and comparison of device accuracy with that of clinical assessment alone may better inform the value of portable stroke detection technology in prehospital triage.

Availability of data and materials

The authors declare that all data supporting the findings of this study are available within the article and its supplementary files.

Abbreviations

EVT:

Endovascular therapy

LVO:

Large vessel occlusion

EMS:

Emergency medical services

AUC:

Area under the curve (ROC curve)

CSC:

Comprehensive stroke center

NIRS:

Near-infrared spectroscopy

EEG:

Electroencephalography

VIPS:

Volumetric impedance phase shift spectroscopy

TCCS:

Transcranial color-coded sonography

MCA:

Middle cerebral artery

TCD:

Transcranial doppler

IS:

Ischemic stroke

ICH:

Intracerebral hemorrhage

PPV:

Positive Predictive Value

NPV:

Negative Predictive Value

References

  1. Katan M, Luft A. Global Burden of Stroke. Semin Neurol. 2018;38:208–11.

  2. Virani SS, Alonso A, Aparicio HJ, Benjamin EJ, Bittencourt MS, Callaway CW, et al. Heart Disease and Stroke Statistics—2021 Update. Circulation. 2021;143:e254–743.

  3. Campbell BCV, Mitchell PJ, Kleinig TJ, Dewey HM, Churilov L, Yassi N, et al. Endovascular therapy for ischemic stroke with perfusion-imaging selection. N Engl J Med. 2015;372:1009–18.

  4. Berkhemer OA, Fransen PSS, Beumer D, van den Berg LA, Lingsma HF, Yoo AJ, et al. A randomized trial of intraarterial treatment for acute ischemic stroke. N Engl J Med. 2015;372:11–20.

  5. Goyal M, Demchuk AM, Menon BK, Eesa M, Rempel JL, Thornton J, et al. Randomized assessment of rapid endovascular treatment of ischemic stroke. N Engl J Med. 2015;372:1019–30.

  6. Jovin TG, Chamorro A, Cobo E, de Miquel MA, Molina CA, Rovira A, et al. Thrombectomy within 8 hours after symptom onset in ischemic stroke. N Engl J Med. 2015;372:2296–306.

  7. Jahan R, Saver JL, Schwamm LH, Fonarow GC, Liang L, Matsouaka RA, et al. Association Between Time to Treatment With Endovascular Reperfusion Therapy and Outcomes in Patients With Acute Ischemic Stroke Treated in Clinical Practice. JAMA. 2019;322:252.

  8. Gerschenfeld G, Muresan I-P, Blanc R, Obadia M, Abrivard M, Piotin M, et al. Two Paradigms for Endovascular Thrombectomy After Intravenous Thrombolysis for Acute Ischemic Stroke. JAMA Neurol. 2017;74:549–56.

  9. Yi J, Zielinski D, Ouyang B, Conners J, Dafer R, Chen M. Predictors of false-positive stroke thrombectomy transfers. J Neurointerv Surg. 2017;9:834–6.

  10. Froehler MT, Saver JL, Zaidat OO, Jahan R, Aziz-Sultan MA, Klucznik RP, et al. Interhospital transfer before thrombectomy is associated with delayed treatment and worse outcome in the STRATIS registry (Systematic Evaluation of Patients Treated With Neurothrombectomy Devices for Acute Ischemic Stroke). Circulation. 2017;136:2311–21.

  11. Rinaldo L, Brinjikji W, McCutcheon BA, Bydon M, Cloft H, Kallmes DF, et al. Hospital transfer associated with increased mortality after endovascular revascularization for acute ischemic stroke. J Neurointerv Surg. 2017;9:1166–72.

  12. Kellner CP, Schupper AJ, Mocco J. Surgical Evacuation of Intracerebral Hemorrhage: The Potential Importance of Timing. Stroke. 2021;52(10):3391–8.

  13. Hastrup S, Damgaard D, Johnsen SP, Andersen G. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales. Stroke. 2016;47:1772–6.

  14. Nazliel B, Starkman S, Liebeskind DS, Ovbiagele B, Kim D, Sanossian N, et al. A brief prehospital stroke severity scale identifies ischemic stroke patients harboring persisting large arterial occlusions. Stroke. 2008;39:2264–7.

  15. Pérez de la Ossa N, Carrera D, Gorchs M, Querol M, Millán M, Gomis M, et al. Design and validation of a prehospital stroke scale to predict large arterial occlusion: the rapid arterial occlusion evaluation scale. Stroke. 2014;45:87–91.

  16. Katz BS, McMullan JT, Sucharew H, Adeoye O, Broderick JP. Design and validation of a prehospital scale to predict stroke severity: Cincinnati Prehospital Stroke Severity Scale. Stroke. 2015;46:1508–12.

  17. Lima FO, Silva GS, Furie KL, Frankel MR, Lev MH, Camargo ÉCS, et al. Field Assessment Stroke Triage for Emergency Destination: A Simple and Accurate Prehospital Scale to Detect Large Vessel Occlusion Strokes. Stroke. 2016;47:1997–2002.

  18. Singer OC, Dvorak F, du Mesnil de Rochemont R, Lanfermann H, Sitzer M, Neumann-Haefelin T. A simple 3-item stroke scale: comparison with the National Institutes of Health Stroke Scale and prediction of middle cerebral artery occlusion. Stroke. 2005;36:773–6.

  19. Kellner CP, Sauvageau E, Snyder KV, Fargen KM, Arthur AS, Turner RD, et al. The VITAL study and overall pooled analysis with the VIPS non-invasive stroke detection device. J Neurointerv Surg. 2018;10:1079–84.

  20. Tu TM, Tan GZ, Saffari SE, Wee CK, Chee DJM, Tan C, et al. External Validation of Stroke Mimic Prediction Scales in the Emergency Department. BMC Neurol. 2020;20(1):269.

  21. Nor AM, Davis J, Sen B, Shipsey D, Louw SJ, Dyker AG, et al. The Recognition of Stroke in the Emergency Room (ROSIER) scale: development and validation of a stroke recognition instrument. Lancet Neurol. 2005;4:727–34.

  22. Jiang H-L, Chan CP-Y, Leung Y-K, Li Y-M, Graham CA, Rainer TH. Evaluation of the Recognition of Stroke in the Emergency Room (ROSIER) Scale in Chinese Patients in Hong Kong. PLoS One. 2014;9:e109762.

  23. Chuck CC, Martin TJ, Kalagara R, Madsen TE, Furie KL, Yaghi S, et al. Statewide Emergency Medical Services Protocols for Suspected Stroke and Large Vessel Occlusion. JAMA Neurol. 2021;78(11):1404–6. https://doi.org/10.1001/jamaneurol.2021.3227.

  24. Munn Z, Peters MDJ, Stern C, Tufanaru C, McArthur A, Aromataris E. Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Med Res Methodol. 2018;18:1–7.

  25. Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8:19–32.

  26. Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann Intern Med. 2018;169:467–73.

  27. Lumley HA, Flynn D, Shaw L, McClelland G, Ford GA, White PM, et al. A scoping review of pre-hospital technology to assist ambulance personnel with patient diagnosis or stratification during the emergency assessment of suspected stroke. BMC Emerg Med. 2020;20:30.

  28. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.

  29. Whiting PF, Rutjes AWS, Westwood ME, Mallett S, Deeks JJ, Reitsma JB, et al. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011;155:529–36.

  30. Michelson EA, Hanley D, Chabot R, Prichep LS. Identification of acute stroke using quantified brain electrical activity. Acad Emerg Med. 2015;22:67–72.

  31. Wilkinson CM, Burrell JI, Kuziek JWP, Thirunavukkarasu S, Buck BH, Mathewson KE. Predicting stroke severity with a 3-min recording from the Muse portable EEG system for rapid diagnosis of stroke. Sci Rep. 2020;10:18465.

  32. Herzberg M, Boy S, Hölscher T, Ertl M, Zimmermann M, Ittner K-P, et al. Prehospital stroke diagnostics based on neurological examination and transcranial ultrasound. Crit Ultrasound J. 2014;6:3.

  33. Schlachetzki F, Herzberg M, Hölscher T, Ertl M, Zimmermann M, Ittner KP, et al. Transcranial Ultrasound from Diagnosis to Early Stroke Treatment – Part 2: Prehospital Neurosonography in Patients with Acute Stroke – The Regensburg Stroke Mobile Project. Cerebrovasc Dis. 2012;33:262–71.

  34. Antipova D, Eadie L, Makin S, Shannon H, Wilson P, Macaden A. The use of transcranial ultrasound and clinical assessment to diagnose ischaemic stroke due to large vessel occlusion in remote and rural areas. PLoS One. 2020;15:e0239653.

  35. Erani F, Zolotova N, Vanderschelden B, Khoshab N, Sarian H, Nazarzai L, et al. Electroencephalography Might Improve Diagnosis of Acute Stroke and Large Vessel Occlusion. Stroke. 2020;51:3361–5.

  36. Thorpe SG, Thibeault CM, Canac N, Wilk SJ, Devlin T, Hamilton RB. Decision Criteria for Large Vessel Occlusion Using Transcranial Doppler Waveform Morphology. Front Neurol. 2018;9:847.

  37. Sergot PB, Maza AJ, Derrick BJ, Smith LM, Berti LT, Wilcox MR, et al. Portable Neuromonitoring Device Detects Large Vessel Occlusion in Suspected Acute Ischemic Stroke. Stroke. 2021;52:1437–40.

  38. Persson M, Fhager A, Trefná HD, Yu Y, McKelvey T, Pegenius G, et al. Microwave-based stroke diagnosis making global prehospital thrombolytic treatment possible. IEEE Trans Biomed Eng. 2014;61:2806–17.

  39. Robertson CS, Zager EL, Narayan RK, Handly N, Sharma A, Hanley DF, et al. Clinical evaluation of a portable near-infrared device for detection of traumatic intracranial hematomas. J Neurotrauma. 2010;27:1597–604.

  40. Liang C-Y, Yang Y, Shen C-S, Wang H-J, Liu N-M, Wang Z-W, et al. Chinese Military Evaluation of a Portable Near-Infrared Detector of Traumatic Intracranial Hematomas. Mil Med. 2018;183:e318–23.

  41. Xu L, Tao X, Liu W, Li Y, Ma J, Lu T, et al. Portable near-infrared rapid detection of intracranial hemorrhage in Chinese population. J Clin Neurosci. 2017;40:136–46.

  42. Peters J, Van Wageningen B, Hoogerwerf N, Tan E. Near-Infrared Spectroscopy: A Promising Prehospital Tool for Management of Traumatic Brain Injury. Prehosp Disaster Med. 2017;32:414–8.

  43. Yuksen C, Sricharoen P, Puengsamran N, Saksobhavivat N, Sittichanbuncha Y, Sawanyawisuth K. Diagnostic properties of a portable near-infrared spectroscopy to detect intracranial hematoma in traumatic brain injury patients. Eur J Radiol Open. 2020;7:100246.

  44. Kontojannis V, Hostettler I, Brogan RJ, Raza M, Harper-Payne A, Kareem H, et al. Detection of intracranial hematomas in the emergency department using near infrared spectroscopy. Brain Inj. 2019;33:875–83.

  45. Powers WJ, Rabinstein AA, Ackerson T, Adeoye OM, Bambakidis NC, Becker K, et al. 2018 Guidelines for the Early Management of Patients With Acute Ischemic Stroke: A Guideline for Healthcare Professionals From the American Heart Association/American Stroke Association. Stroke. 2018;49:e46–110.

  46. Walsh KB. Non-invasive sensor technology for prehospital stroke diagnosis: Current status and future directions. Int J Stroke. 2019;14:592–602.

  47. Martinez-Gutierrez JC, Chandra RV, Hirsch JA, Leslie-Mazwi T. Technological innovation for prehospital stroke triage: ripe for disruption. J Neurointerv Surg. 2019;11:1085–90.

  48. Murphy D, deKerillis P, Frabizzio J, Nash B, Shah Q. Abstract T MP107: Measurement of Acute Brain Hemorrhage in the Pre-hospital Setting. Stroke. 2015;46:ATMP107.

  49. Coutinho J. EEG Controlled Triage in the Ambulance for Acute Ischemic Stroke (Electra-Stroke). 2018. https://clinicaltrials.gov/ct2/show/NCT03699397. Accessed 27 November 2021.

  50. Shahrestani S, Wishart D, Han SMJ, Strickland BA, Bakhsheshian J, Mack WJ, et al. A systematic review of next-generation point-of-care stroke diagnostic technologies. Neurosurg Focus. 2021;51:E11.

  51. Xu J, Chen J, Yu W, Zhang H, Wang F, Zhuang W, et al. Noninvasive and portable stroke type discrimination and progress monitoring based on a multichannel microwave transmitting–receiving system. Sci Rep. 2020;10:21647.

  52. Ludewig P, Gdaniec N, Sedlacik J, Forkert ND, Szwargulski P, Graeser M, et al. Magnetic Particle Imaging for Real-Time Perfusion Imaging in Acute Stroke. ACS Nano. 2017;11:10480–8.

  53. Yan Q, Jin G, Ma K, Qin M, Zhuang W, Sun J. Magnetic inductive phase shift: a new method to differentiate hemorrhagic stroke from ischemic stroke on rabbit. BioMed Eng Online. 2017;16:63.

  54. Muehlschlegel S, Selb J, Patel M, Diamond SG, Franceschini MA, Sorensen AG, et al. Feasibility of NIRS in the Neurointensive Care Unit: A Pilot Study in Stroke Using Physiological Oscillations. Neurocrit Care. 2009;11:288–95.

  55. Shreve L, Kaur A, Vo C, Wu J, Cassidy JM, Nguyen A, et al. Electroencephalography Measures are Useful for Identifying Large Acute Ischemic Stroke in the Emergency Department. J Stroke Cerebrovasc Dis. 2019;28:2280–6.

  56. Ljungqvist J, Candefjord S, Persson M, Jönsson L, Skoglund T, Elam M. Clinical Evaluation of a Microwave-Based Device for Detection of Traumatic Intracranial Hemorrhage. J Neurotrauma. 2017;34:2176–82.

  57. Mobashsher AT, Abbosh AM. On-site Rapid Diagnosis of Intracranial Hematoma using Portable Multi-slice Microwave Imaging System. Sci Rep. 2016;6:37620.

  58. Gottlibe M, Rosen O, Weller B, Mahagney A, Omar N, Khuri A, et al. Stroke identification using a portable EEG device – A pilot study. Neurophysiologie Clinique. 2020;50:21–5.

  59. Lamonte MP, Sewell J, Bahouth MN, Sewell C. A Noninvasive Portable Acoustic Diagnostic System to Differentiate Ischemic From Hemorrhagic Stroke. J Neuroimag. 2005;15:57–63.

  60. Vasquez JAT, Tobon Vasquez JA, Scapaticci R, Turvani G, Bellizzi G, Rodriguez-Duarte DO, et al. A Prototype Microwave System for 3D Brain Stroke Imaging. Sensors. 2020;20:2607.

  61. Zhang H, Chen M, Jin G, Xu J, Qin M. Experimental study on the detection of cerebral hemorrhage in rabbits based on broadband antenna technology. Comput Assist Surg. 2019;24:96–104.

  62. Cohen JF, Korevaar DA, Altman DG, Bruns DE, Gatsonis CA, Hooft L, et al. STARD 2015 guidelines for reporting diagnostic accuracy studies: explanation and elaboration. BMJ Open. 2016;6:e012799.

  63. Noel-Storr AH, McCleery JM, Richard E, Ritchie CW, Flicker L, Cullum SJ, et al. Reporting standards for studies of diagnostic test accuracy in dementia: The STARDdem Initiative. Neurology. 2014;83:364–73.

  64. Gardner IA, Nielsen SS, Whittington RJ, Collins MT, Bakker D, Harris B, et al. Consensus-based reporting standards for diagnostic test accuracy studies for paratuberculosis in ruminants. Prev Vet Med. 2011;101:18–34.

  65. Hong PJ, Korevaar DA, McGrath TA, Ziai H, Frank R, Alabousi M, et al. Reporting of imaging diagnostic accuracy studies with focus on MRI subgroup: Adherence to STARD 2015. J Magn Reson Imaging. 2018;47:523–44.

Acknowledgements

We would like to acknowledge Rachel Pinotti for her assistance in the development of search strategy and consultation during the scoping review process.

Funding

This study did not have any funding sources.

Author information

Contributions

SC and CPK completed an initial literature search to finalize the aims of this scoping review. SC, RK, and CPK were major contributors in search strategy development, study inclusion and exclusion, data extraction, and manuscript writing. The remaining authors provided input on the future directions and recommendations sections in the manuscript, which provide valuable additions to the scope and applicability of this manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Susmita Chennareddy.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

Dr. De Leacy is a consultant for Stryker, Penumbra, Siemens, and Cerenovus. Dr. Mokin is supported by a grant from the NIH (NIH R21NS109575) and is a consultant for Medtronic and Cerenovus. Additionally, Dr. Mokin is an investor in BrainQ, Endostream, Serenity medical, Synchron. Dr. Fifi is a stockholder in Cerebrotech. Dr. Mocco is the PI on research trials funded by: Stryker Neurovascular, Microvention, and Penumbra and he is an investor in: Cerebrotech, Imperative Care, Endostream, Viseon, BlinkTBI, Myra Medical, Serenity, Vastrax, NTI, RIST, Viz.ai, Synchron, Radical, and Truvic. He serves, or has recently served, as a consultant for: Cerebrotech, Viseon, Endostream, Vastrax, RIST, Synchron, Viz.ai, Perflow, and CVAid. Dr. Kellner has received research grant support from Cerebrotech, Siemens, Penumbra, Minnetronix, Viz.AI, Integra, Longeviti, and Irras and has ownership in Metis Innovative and Precision Recovery. The remaining authors have no conflicts of interest to declare.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Cite this article

Chennareddy, S., Kalagara, R., Smith, C. et al. Portable stroke detection devices: a systematic scoping review of prehospital applications. BMC Emerg Med 22, 111 (2022). https://doi.org/10.1186/s12873-022-00663-z

Download citation

  • Received:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1186/s12873-022-00663-z

Keywords