Open Access
15 April 2021

Smartphone-based imaging systems for medical applications: a critical review
Brady Hunt, Alberto J. Ruiz, Brian W. Pogue
Abstract

Significance: Smartphones come with an enormous array of functionality and are being more widely utilized with specialized attachments in a range of healthcare applications. A review of key developments and uses, with an assessment of strengths/limitations in various clinical workflows, was completed.

Aim: Our review studies how smartphone-based imaging (SBI) systems are designed and tested for specialized applications in medicine and healthcare. An evaluation of current research studies is used to provide guidelines for improving the impact of these research advances.

Approach: First, the established and emerging smartphone capabilities that can be leveraged for biomedical imaging are detailed. Then, methods and materials for fabrication of optical, mechanical, and electrical interface components are summarized. Recent systems were categorized into four groups based on their intended application and clinical workflow: ex vivo diagnostic, in vivo diagnostic, monitoring, and treatment guidance. Lastly, strengths and limitations of current SBI systems within these various applications are discussed.

Results: The native smartphone capabilities for biomedical imaging applications include cameras, touchscreens, networking, computation, 3D sensing, audio, and motion, in addition to commercial wearable peripheral devices. Through user-centered design of custom hardware and software interfaces, these capabilities have the potential to enable portable, easy-to-use, point-of-care biomedical imaging systems. However, due to barriers in programming of custom software and on-board image analysis pipelines, many research prototypes fail to achieve a prospective clinical evaluation as intended. Effective clinical use cases appear to be those in which handheld, noninvasive image guidance is needed and accommodated by the clinical workflow. Handheld systems for in vivo, multispectral, and quantitative fluorescence imaging are a promising development for diagnostic and treatment guidance applications.

Conclusions: A holistic assessment of SBI systems must include interpretation of their value for intended clinical settings and how their implementations enable better workflow. A set of six guidelines is proposed to evaluate appropriateness of smartphone utilization in terms of clinical context, completeness, compactness, connectivity, cost, and claims. Ongoing work should prioritize realistic clinical assessments with quantitative and qualitative comparison to non-smartphone systems to clearly demonstrate the value of smartphone-based systems. Improved hardware design to accommodate the rapidly changing smartphone ecosystem, creation of open-source image acquisition and analysis pipelines, and adoption of robust calibration techniques to address phone-to-phone variability are three high priority areas to move SBI research forward.

1.

Introduction

Smartphone-based imaging (SBI) has been proposed for numerous biomedical applications, many of which use an optical attachment to augment or extend the native device capabilities (Fig. 1). In the past decade, the most common application for SBI has been diagnostic analysis of ex vivo specimens (i.e., point-of-care testing), which has utilized smartphones in a variety of microscopy and microfluidic detection schemes.1–4 SBI is also frequently proposed for noninvasive monitoring and diagnosis of externally accessible tissues, particularly in dermatological applications.5 More recently, SBI for minimally invasive procedures and treatment guidance has also been reported, including photodynamic therapy (PDT),6–9 endoscopy,10–13 in vivo microscopy,14–17 and surgery.18–22 As ex vivo diagnostic applications have been extensively reviewed elsewhere,1–4 this review focuses on SBI systems for real-time tissue imaging applications (i.e., in vivo monitoring, diagnosis, and treatment guidance). However, recent developments in ex vivo diagnostic system designs, which may have relevance in tissue imaging applications, are also discussed for comparison and contrast.

Fig. 1

SBI for various biomedical imaging applications grouped into four clinical workflows.


The review is structured as follows. First, the established and emerging smartphone capabilities as well as methods and materials for SBI system interface design are reviewed, classifying them by their optical, mechanical, and electrical components. Each of these components can be passive, simply extending what the phone camera itself is doing, or active, adding function that the phone itself could not achieve. Then, the emerging applications of SBI systems are presented within the three aforementioned roles of tissue imaging (monitoring, diagnosis, and treatment guidance). Finally, the pros and cons of smartphone utilization in emerging applications are discussed alongside recommendations to improve clinical translation and uptake of research advances in SBI.

2.

System Interface Design

SBI systems leverage built-in sensors of modern smartphones in addition to various optical, mechanical, electrical, and software components to augment native device capabilities. When developing smartphone-based optical instrumentation, a fundamental design choice is how custom hardware and software will interface with the smartphone. Current SBI system interfaces vary greatly at both the hardware and software levels, ranging from basic utilization of the unmodified smartphone camera with built-in or third party software to application-specific optical attachments being actively controlled with custom software. The terms “smartphone-based” and “using a smartphone” appear frequently in biomedical optics research abstracts but do not adequately capture the diversity of the underlying interface designs and degree of smartphone utilization. In this section, we first discuss the built-in capabilities of modern smartphones that can be leveraged for biomedical imaging applications. We then characterize systems from the literature in terms of the hardware and software componentry used to augment built-in capabilities, highlighting commonly used materials and methods for developing SBI systems.

2.1.

Built-in Capabilities

Driven by global demand for mobile computing and telecommunications, smartphones have been at the forefront of consumer electronics innovation for nearly two decades. As a result, built-in sensor capabilities continue to evolve at a rapid pace, making smartphones the “Swiss Army knife” of mobile computing. Using an internet database,23 we compared smartphone specifications for several iOS and Android smartphones released over the past decade and created a summary of eight established and emerging capabilities which are relevant for biomedical imaging applications (Fig. 2). We defined established capabilities as specifications which have been available for over 5 years and are common for current entry-level smartphones, whereas emerging capabilities are those available only on current high-end smartphones and may become more widely available in the future.

Fig. 2

Established and emerging smartphone capabilities for biomedical imaging applications.


2.1.1.

Cameras

Camera technology continues to be an area of fierce competition and innovation among smartphone manufacturers. Modern smartphones are equipped with several compact camera modules for front and rear photo acquisition at various magnifications. These preassembled modules are small form factor optical systems typically consisting of multielement lenses, apertures, optical filters, CMOS sensors, and motors for autofocus and image stabilization.

The primary engineering constraint is the small form factor of the sensor and lens elements. Most smartphones utilize 1/3″ format sensors (active pixel area ~17 mm²) with between 5 and 12 MP (1.1 to 1.8 μm pixels). Newer models are moving toward larger (1/1.3″, ~65 mm² active pixel area) and ultrahigh-resolution sensors (50+ MP with 2.4 μm effective pixels after processing). These high-resolution CMOS sensors now support “4K” (2160p) video acquisition at up to 60 fps, as well as ultrafast acquisition at lower resolutions (1080p at 240 fps and 720p at 960 fps).

Another major trend for smartphone cameras in the past few years has been a shift from a single rear camera to multiple rear cameras with additional lenses for ultrawide, macro, and telephoto acquisition. Periscope lenses are becoming more common and use folding mirror geometries to achieve longer focal lengths and up to 10× optical zoom. Equivalent focal lengths listed by manufacturers currently range from around 18 mm for ultrawide lenses up to 240 mm for the longest periscope lenses, with most primary widefield lenses in the 25 to 30 mm range. Adjusting for a 3- to 10-fold crop factor based on the 1/4″ to 1/1.3″ sensor format range, actual focal lengths for current smartphones range from around 3 to 30 mm. Having a variety of lenses with the possibility for multicamera acquisition has not been extensively utilized in SBI systems but could prove useful for biomedical imaging applications.
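These relationships can be made concrete with a short, illustrative calculation (a Python sketch; the helper names are ours, not from any SBI toolkit). It converts a manufacturer's 35-mm-equivalent focal length to the true optical focal length using the crop factor, and estimates pixel pitch from sensor area and megapixel count, consistent with the figures quoted above.

```python
import math

def actual_focal_length_mm(equivalent_f_mm, crop_factor):
    """Convert a 35-mm-equivalent focal length to the true optical focal length."""
    return equivalent_f_mm / crop_factor

def pixel_pitch_um(active_area_mm2, megapixels):
    """Approximate pixel pitch in microns (square pixels) from the active sensor
    area and the resolution in megapixels."""
    area_um2 = active_area_mm2 * 1e6          # mm^2 -> um^2
    return math.sqrt(area_um2 / (megapixels * 1e6))

# A 26-mm-equivalent primary lens on a ~1/3" sensor (crop factor ~7.2)
# has a true focal length of roughly 3.6 mm; a 17 mm^2, 12 MP sensor
# has a pixel pitch of roughly 1.2 um, matching the ranges cited above.
```

For example, a 240-mm-equivalent periscope lens behind a sensor with a 10-fold crop factor corresponds to a true focal length near 24 mm, which is why folded optics are needed to fit it inside a phone body.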

In conjunction with larger sensors and multilens systems, there has been movement toward more sophisticated integrated signal processing for image denoising and reconstruction using multiple acquisitions. While these advances in computational photography may be advantageous for some applications, the lack of fine-grained control of image acquisition and processing pipelines is not ideal for medical and scientific imaging. Over the last few years, smartphones have gained the ability to fix imaging parameters and access the unprocessed (RAW) imaging data, which is essential for quantitative imaging applications. Section 2.3 provides a detailed discussion of these advances in image acquisition controls.

2.1.2.

Other optical sensors

Current smartphones are also equipped with other built-in and/or peripheral optical technologies that can be utilized for biomedical imaging applications, including ambient light, proximity/depth, thermal, and wearable sensors. Recent advances and references to additional topical reviews of these sensors are summarized in this section.

In contrast with cameras, ambient light sensors are simple photodetectors which only measure the intensity of incident light. The primary purpose of ambient light sensing (ALS) is to automatically adjust the user screen brightness based on lighting conditions; however, several reports have demonstrated use of this sensor to measure intensity changes due to light attenuation24,25 as well as light emission26 from chemical assays. A recent review on the use of ALS in point-of-care testing stated that it can provide “resolution as low as 0.01 lux over a wide range of wavelengths from 350 to 1050 nm.”27

Low-resolution proximity sensors to detect when a phone is being held close to the face/ear have been on smartphones for over a decade but have had little utility for biomedical applications. However, on-board depth sensing technologies and associated software development kits to support augmented reality are becoming more capable. Depth estimation using dual-camera stereoscopic images has been utilized to segment background from foreground objects and to create depth of field maps for synthetic “bokeh.” Although stereo depth estimations have good spatial resolution, they are relatively low-precision and susceptible to variability in lighting/imaging conditions.28 Light-field imaging is an emerging alternative to stereoscopic imaging but has been evaluated in a relatively small number of studies and has yet to be integrated into commercial systems.29–31 Most recently, 3D time-of-flight cameras are an emerging capability which can potentially provide depth information at sufficiently quantitative spatial and temporal resolutions for dynamically measuring distances and volumes.32,33 Ulrich et al.34 provided a comprehensive review of these emerging methods and other RGB-D sensing technologies.

In addition to depth sensing, compact IR cameras have also enabled thermal imaging on smartphones. Two commercially available attachments are the FLIR One Pro and SEEK CompactPRO. Beyond clip-on thermal attachments, some phone models, such as the Caterpillar CAT S61, have now incorporated on-board thermal imaging sensors.35 Kirimtat et al.36 recently conducted a head-to-head comparison of the FLIR and SEEK attachments and broadly summarized related works in biomedicine, which use both smartphone-based and nonsmartphone-based handheld thermography. This review includes applications using the FLIR One smartphone sensor in both diagnostic and treatment guidance applications.22,37–41

Wearable optical sensors that wirelessly interface with smartphones hold great potential for more consistent and noninvasive health monitoring.42 Established capabilities for wrist-based sensors are activity and heart rate monitoring, which use motion and optical sensing, respectively. In recent years, newer devices have added sensing capability for electrocardiography, skin temperature, blood oxygenation, and blood pressure monitoring. Wearable systems have been extensively reviewed elsewhere.43–47 This review focuses on approaches which utilize native smartphone optical sensors or otherwise use the smartphone to actively control an external optical system, as opposed to approaches that passively acquire point-based optical measurements.

2.1.3.

Ancillary capabilities (touchscreen, networking, motion, and computation)

Other built-in capabilities that distinguish smartphones from traditional computing platforms include touchscreen displays, networking, motion/audio interface control, and computational power. Here, we briefly summarize advances in these areas which may play a role in SBI systems moving forward.

Modern smartphones provide high-performance displays which can be versatile interfaces for many applications. Displays with 10-bit color could provide advantages compared to established 8-bit displays for applications involving high dynamic range imaging. Newer displays can also achieve higher display and touch refresh rates (up to 120 and 240 Hz, respectively), which could be helpful in applications where image data need to be played back and/or annotated with high temporal fidelity.

Wireless networking is another great strength of smartphones, which can facilitate untethered handheld imaging. Mature communications protocols are well supported on smartphones, including Wi-Fi 802.11ac, 4G cellular networks, and Bluetooth Low Energy. In applications where device-to-device communication is needed, Bluetooth can support relatively low-latency communication (<100 ms) at up to 2 Mbps and over large distances (100 to 400 m).48 In cases where even lower latency and higher bandwidth device-to-device communication is needed, wired USB connections can be utilized for up to 10 Gbps and submillisecond latency.

Motion and audio sensors are well established on smartphones but have not yet been widely utilized for biomedical imaging applications. Motion sensors will continue to play a role in photography image stabilization as well as for 3D depth sensing. One area for consideration in medical imaging is the use of motion and audio for contactless user interfaces through gesture or voice commands. For example, millimeter wave radar is an emerging sensing technology which could enable enhanced, 360-deg hand gesture recognition for user interface control.49,50

Embedded computing architectures continue to improve and enable more data-intensive applications on smartphones, including image processing pipelines. However, improvements in mobile computing performance do not readily translate for biomedical imaging applications without appropriate software frameworks to support full utilization of the hardware. This remains a barrier in the research community as computationally efficient image analysis on smartphones requires significant programming expertise. However, machine learning-based image analysis could provide a relatively versatile, easy-to-use, and computationally efficient strategy compared to traditional image processing toolkits, which are not well supported on mobile devices. Dedicated processors to support accelerated machine learning inference are likely to become standard moving forward.

2.2.

Hardware

Hardware design of SBI systems ranges along a design spectrum from passive interfaces with minimal adaptation of built-in optics to active interfaces with battery-powered, smartphone-controlled optical attachments. Figure 3 illustrates this spectrum with examples from research literature as well as commercial products. SBI systems occupy the space between a native smartphone camera system and a fully external optical sensing system, such as a wrist-based wearable device. Moving up and to the right, more sophisticated hardware interfaces which can enable greater system control and optical performance are observed; however, these designs often become phone-specific and potentially diminish the longer term stability of optical designs in integrating with newer smartphones. Advances in 3D printing have been instrumental in overcoming this challenge for research prototyping, but further developments to address the scalability of SBI hardware interfaces are important to achieve greater impact and translation. For example, some designs minimize the reliance on smartphone hardware by interfacing external optical systems through wireless or wired connections to the phone.13,51,52 Recently, Alawsi and Al‐Bawi reviewed smartphone-based adapter designs across a large variety of both ex vivo and in vivo point-of-care imaging applications.53 Here, we characterize common materials and methods utilized to create optical, mechanical, and electrical interfaces for SBI systems with particular attention to considerations for the in vivo applications covered in this review.

Fig. 3

Smartphone-based optical interface design spectrum. The spectrum of optical interfaces for smartphone-based biomedical imaging spans from use of the native device only to fully external optical systems that interface with the phone via wired/wireless communication. Attachment designs vary along mechanical and optoelectronic axes from minimal/passive attachments to more complex and actively controlled attachments. Examples of commercial and research prototype systems are plotted within the spectrum for reference. Visuals adapted from the following sources: assay imaging box,54 foot imaging box,55 droplet lens,56 microscopy clip,57 retinal imaging module,58,59 dermatology clip,60 photosensitizer fluorescence module,6 endoscope,10 and multispectral imaging module.52


2.2.1.

Optical

The primary optical interface for SBI systems is naturally the built-in camera. Smartphone cameras have been modified using off-the-shelf optical components including lenses (spherical, aspheric, achromats, infinity-corrected objectives, Fresnel lenses, and reversed smartphone lenses from disassembled camera modules), filters (bandpass, longpass, neutral density, polarizers), apertures, mirrors (folding, scanning, and dichroic), optical fibers for light relay of the flash LED, diffusers, and diffraction gratings. Integration of these passive optics in front of the smartphone camera is fairly straightforward; however, it does impose some design limitations. One major challenge toward computer-aided design of SBI systems is variability and/or unspecified optical properties of the built-in lenses, filters, and LEDs of preassembled smartphone camera modules. Bae et al.10 approximated the smartphone lens of their system for ray tracing simulation using the crop factor of the CMOS sensor, the fixed aperture specified by the vendor, and the assumption that the lens is set to infinite focus. In both in vivo and ex vivo microscopy applications, others have utilized a reversed smartphone lens to match the light collection angle of the built-in phone lens and relay distortion-free conjugate images that fill the entire CMOS sensor.16,57,61,62 For reproducible calibration of spectral response across different smartphone models, several reports have utilized commercially available reference color targets (e.g., the X-Rite ColorChecker) to apply phone-by-phone calibration factors.63–69 Quantitative methods for calibration of SBI systems have also been proposed.70
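The phone-by-phone calibration idea can be sketched minimally as a least-squares fit of a 3×3 color correction matrix. The Python sketch below is illustrative only (function names are ours, and it assumes patch-averaged linear RGB values have already been extracted from an image of the color target):

```python
import numpy as np

def fit_color_correction(measured_rgb, reference_rgb):
    """Fit a 3x3 matrix M minimizing ||measured_rgb @ M - reference_rgb|| over
    N color-target patches; measured_rgb and reference_rgb are N x 3 arrays."""
    M, *_ = np.linalg.lstsq(measured_rgb, reference_rgb, rcond=None)
    return M

def apply_color_correction(image_rgb, M):
    """Apply the per-phone correction matrix to an H x W x 3 float image."""
    return (image_rgb.reshape(-1, 3) @ M).reshape(image_rgb.shape)
```

In practice, black-level offsets and display/sensor nonlinearity (gamma) would need to be removed first, which is one reason access to linear RAW pixel data matters for this kind of calibration.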

One area that is not as well appreciated is control of stray light in SBI systems. Systems for low light level applications require exclusion of ambient room lighting to preserve signal specificity or purity. This is especially important in spectroscopic, chromatic, and filtered light applications for external tissue measurements.6,17,52,64,68,71–73 Additionally, light emission from tissue occurs over a high numerical aperture, so control over access to these signals requires careful lens design and external light control. Filtering of signals is always challenging given the range of choices and the typically short camera–tissue distances, so evaluation of the contaminating signals is important, as are choices about the use of potentially multiple filters.6

Future developments in fabrication of customized optics could enable greater flexibility in smartphone optical interface design.74,75 Miniaturized polymer “droplet” lenses for both light collection76,77 and filtering56,78 are a promising development as they can potentially be fabricated at very low cost and assembled with minimal adaptation of smartphone optics. Rapid advances in computational imaging and optical design also hold great promise to be utilized in conjunction with existing smartphone optical sensors or to be added through additional dedicated sensors.79–84

2.2.2.

Mechanical

Mechanical interfaces for SBI systems serve several functions depending on the use case including optical alignment and coupling of custom optics to the smartphone, background light rejection in fluorescence applications, and ergonomic setup and clinical use. Custom enclosures for optical attachments are typically fabricated using 3D printing to accommodate unique smartphone geometries. Fused filament printing is most common and typically provides alignment precision within a few hundred microns. Stereolithography 3D printing is beginning to be more widely used and can fabricate enclosures with sub-100 μm precision. This is often adequate for positioning most optical components but may not be ideal for lens alignment, particularly high numerical aperture lenses. For applications with custom lenses needing more precise alignment, cage rod assemblies mounted inside a 3D-printed enclosure have been used.15,16

In terms of attachment mechanisms, SBI systems can be broadly categorized into one of three categories: (1) no attachment, (2) clip attachment, and (3) case attachment. SBI systems with no attachment utilize only the native device or otherwise may communicate wirelessly with external optics. For example, He and Wang demonstrated Wiener estimation to reconstruct “pseudohyperspectral” images in an attachment-free manner,68 whereas Cai et al.51 demonstrated a smartphone-interfaced wireless spectrometer for in vivo measurement of biosamples. Both approaches can, in principle, work for various smartphone models without requirement for hardware customization.

Clip attachment designs are intended to be lower profile, less dependent on a specific phone geometry, and easily attached/detached from the smartphone. Clip attachments at the top of the phone near the cameras and at the base of the phone using the charging port have both been demonstrated.57,85,86 Spigulis et al.87 demonstrated a sticky platform-like attachment that was not as low profile as other clip designs but enabled easy attachment/detachment across various smartphones. Such designs are appealing in the sense that they could work with multiple phones; however, clip attachments that include custom lens assemblies are often impractical as manual alignment of lenses with the phone camera is cumbersome and error prone.

A more common design for SBI systems with custom optics is a case attachment. Case attachments accommodate larger optics/electronics systems and more precisely couple the system to the camera. As smartphone manufacturers often publish detailed specifications for third party manufacturing of phone cases, 3D design of case attachments is straightforward. Their primary drawback is that they often require design modifications for each smartphone model, and with the rapid succession of models produced today, frequent changes would be needed to keep pace with updates. This change in phone shape with successive models on an annual basis is one of the most difficult challenges in these attachment device approaches.

Sterilizability or sanitation is another important and potentially overlooked mechanical design constraint. Attachments that are easily assembled/disassembled are preferable for frequent sterilization or cleaning. For applications where SBI systems are used in contact with tissue, use of biocompatible materials should be encouraged. Most 3D-printed prototypes do not comply with these needs and so merely serve as rapid prototypes that would need to be implemented in medical grade materials for production. In sterile clinical environments, such as the surgical suite, clear sterile plastic sleeves are commonly used for camera and microscope systems.18,20

2.2.3.

Electrical

Some SBI systems also use embedded electronics to facilitate controlled light delivery, active optical components (scanning mirrors, tunable filters, and motorized mounts), wireless/wired data relay, and microcontroller logic. Such active attachment designs are more common for fluorescence and multispectral imaging applications requiring controlled light delivery.52,66,88–90 For example, Cavalcanti et al.88 used a multiplexed system of fiber optics to deliver colored light from eight different LED sources across the visible spectrum to the tip of an otoscope. Wired connections have also been used to interface smartphones with USB cameras for tethered capsule endoscopy as well as multispectral dermal imaging.13,52 Cai et al.51 developed a pencil-like spectrometer based on a compact WiFi-enabled camera. Currently, the use of wired/wireless communication to embedded electronics seems somewhat underutilized and is one promising avenue to increase control and customization of SBI optical systems.

2.3.

Software

In terms of software interfaces, there is a great deal of variety in the level of functionality supported in research prototypes, with many systems relying on third party camera software and manual postprocessing of images. Figure 4 highlights approaches and core functionalities of software supported by SBI systems in unrealistic and realistic deployments with increasing degrees of control and customization. Deploying SBI systems for real-time use in clinical settings necessitates custom software to facilitate both data acquisition and analysis. State-of-the-art SBI software interfaces should accomplish both of these functions in addition to facilitating easy operation for the end user.

Fig. 4

Smartphone-based software design spectrum. Approaches and core functionalities supported by SBI systems in unrealistic and realistic deployments with increasing degrees of customization are highlighted. Unfortunately, the vast majority of SBI systems reported in the current literature use unrealistic acquisition and analysis pipelines.


Acquisition is one area where smartphones have great potential to streamline and enhance the usability for biomedical imaging. The core functionalities of good acquisition software are fivefold: (1) provide a video-rate preview of the camera sensor data, (2) control relevant acquisition parameters (focus, exposure, gain, white balance, RAW pixel data, etc.), (3) trigger start and stop of camera acquisition, (4) facilitate storage and organization of acquired data (by patient or sample, for example), and (5) interface with downstream image analysis pipelines (either through on-board or wireless communication). Despite smartphones having perfected these functionalities for everyday photography, these advances may not readily transfer to SBI systems with customized optics and/or acquisition pipelines.

As medical providers are often multitasking while performing clinical examinations, real-time visualization and easy triggering of acquisition are of particular importance for diagnostic and treatment guidance systems. For example, Bae et al.10 developed a custom app for their endoscopy system, which facilitates image relay to a head-mounted display. Other SBI systems facilitate improved ease of use and contactless acquisition through hand-gesture or voice commands.91–93 The highly networked capabilities of smartphones seem largely underutilized, given the potential ability to interface with many peripherals at the same time via multiple communication protocols (i.e., Bluetooth, Wi-Fi, cellular, or direct cable connection).

Another crucial function for acquisition software in scientific and medical applications is the capability to control acquisition and postprocessing parameters, retain adequate bit depth, and interface with downstream image analysis pipelines. The ability to acquire reproducible images has been an ongoing challenge for smartphone-based systems due to their “point-and-shoot” design, which automates image acquisition and postprocessing.94 Autofocus and autoexposure are generally beneficial for improved usability of imaging systems; however, developers should be careful of relying on high-level native and third party libraries to provide this functionality as it may inhibit quantitative reproducibility. In recent years, the ability to fix imaging parameters and access the unprocessed RAW pixel data has become more accessible on both iOS and Android platforms. This has been leveraged in more recently reported systems for improved quantitative accuracy in low-light applications, although incorporating RAW image processing capability in downstream on-board analysis on phones has not yet been demonstrated.6,16,64,73
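Why fixed parameters and RAW access matter for quantitative work can be illustrated with a simple normalization, sketched below in Python. The parameter names are illustrative; in a real pipeline the black level, exposure time, and gain would be read from the RAW metadata, and the sketch assumes a linear sensor response above the black level:

```python
import numpy as np

def normalize_raw(raw_counts, black_level, exposure_s, iso, iso_base=100.0):
    """Convert raw sensor counts to a relative radiance estimate so that frames
    captured with different (but known, fixed) exposure and gain settings become
    directly comparable. Assumes linear sensor response above the black level."""
    signal = np.clip(np.asarray(raw_counts, dtype=np.float64) - black_level,
                     0.0, None)                  # remove sensor offset
    gain = iso / iso_base                        # relative analog/digital gain
    return signal / (exposure_s * gain)          # counts per second at base ISO
```

Such a normalization is only meaningful on unprocessed RAW data; auto-processed JPEG output already has nonlinear tone mapping and white balance baked in, which is precisely what inhibits quantitative reproducibility.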

In terms of analysis, common functions for SBI systems include image review/consultation,91,95 region of interest selection,19,96 colorimetric quantification,73,97 intensity measurements,6,98,99 diagnostic classification,65,71,90,100 and segmentation of tissue structures.14,15,55,101 However, implementing custom image analysis routines on smartphone operating systems requires substantial programming expertise, leading researchers to continue to rely on manual image processing workflows. The problem is further exacerbated by phone-to-phone variability, requiring additional work to ensure smartphone-based analysis routines function properly for multiple phone cameras and operating systems. Both one-time calibration of phone cameras using optical targets as well as per-measurement calibration by measuring ambient lighting conditions have been proposed to improve quantitative reproducibility.68,73,86,94,102 An alternative solution to phone-by-phone calibration for image analysis pipelines could be to leverage cloud-based computing with deep learning. Such an approach has been demonstrated by Song et al.100 for oral imaging. Centralizing image data acquired from multiple users/phones to an appropriate privacy-compliant server enables collection of larger, more diverse datasets which can in turn be used to develop more robust image analysis pipelines and deploy them without having to continually update smartphone software.
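The cloud-based alternative to on-board analysis can be sketched as a thin client that packages each acquisition for upload. The Python sketch below is purely illustrative (the payload fields and endpoint are hypothetical, and a real deployment would add privacy-compliant handling of any patient identifiers); the point is that device model and fixed capture parameters travel with the pixels so that server-side pipelines can apply phone-specific calibration:

```python
import base64
import hashlib

def build_upload_payload(image_bytes, device_model, capture_params):
    """Package an acquired image for upload to a (hypothetical) cloud analysis
    service. The content hash supports de-duplication and integrity checks;
    device model and capture parameters enable server-side, phone-specific
    calibration. Field names here are illustrative, not a published API."""
    return {
        "image_b64": base64.b64encode(image_bytes).decode("ascii"),
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "device_model": device_model,
        "capture_params": capture_params,  # e.g., exposure, ISO, white balance
    }
```

Keeping the phone-side code this thin is what allows analysis models to be retrained and redeployed centrally without continually updating the smartphone app.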

3.

Context and Applications

As noted in Sec. 1, SBI systems have been proposed for a variety of diagnostic applications and are increasingly being proposed for noninvasive monitoring of disease conditions. There are also emerging reports proposing the use of SBI to guide treatment procedures (surgical or PDT) or to conduct more invasive diagnostic imaging of deep tissues (endoscopy). This section contains a structured summary of recent reports using SBI tissue imaging for three types of applications: monitoring, diagnosis, and treatment guidance. While there are some commonalities in how SBI could be advantageous in all three of these categories, understanding the clinical context (i.e., where the measurement is being taken and who is taking it) is important in assessing the strengths and weaknesses of smartphone utilization. Some SBI systems and applications may well fit into more than one of the three aforementioned categories; this categorization is not intended to be rigid, but rather a broad grouping for purposes of discussion.

3.1.

Monitoring

Monitoring applications often require repeated sampling over a sustained period (from hours up to months) to assess appreciable differences in disease states, and therefore benefit from being low-cost and noninvasive. In recent years, SBI systems have been proposed for monitoring of vital signs,103–109 blood glucose,110–113 blood pressure,114,115 blood oxygenation,68,116 hemoglobin concentration,72 atrial fibrillation,117,118 jaundice,73,97,119,120 skin cancer,121 and diabetic foot ulcers.55,122 All of these applications propose utilizing either contact-based or contactless optical measurements using the smartphone camera, most often by individuals on themselves (i.e., self-monitoring). With the exception of blood glucose monitoring, all are proposed for use through installation of a software app onto native devices and do not require external optical attachments.

Vital sign monitoring using contact-based video recording of fingers (i.e., photoplethysmography) was an early SBI application proposed for smartphones,103 and there are continued reports of novel ways to utilize this approach to extract additional hemodynamic metrics (blood pressure, oxygenation, cardiac arrhythmia, etc.).115,116,118 However, this is clearly one area where practical considerations of the context have been overlooked and the smartphone use case is increasingly questionable. Although contact-based vital sign monitoring is achievable using smartphone cameras, continuous monitoring is infeasible without dedicating the smartphone to that purpose. Low-cost, dedicated wearable sensors that can relay data to the smartphone are clearly a more practical and reliable long-term solution for obtaining contact-based vital sign measurements.
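The core signal processing behind such contact photoplethysmography is simple enough to sketch: blood volume pulsations modulate the mean intensity of each video frame of the fingertip, and the dominant frequency in the physiological band gives the pulse rate. The example below is a minimal, generic version operating on a simulated recording, not the method of any cited report; the band limits and frame rate are illustrative.

```python
import numpy as np

def estimate_heart_rate(frame_means, fps):
    """Estimate pulse rate (beats/min) from per-frame mean pixel values.

    frame_means: 1D array of mean intensities (e.g., of the green channel)
    from a contact video of the fingertip; fps: camera frame rate.
    """
    signal = frame_means - np.mean(frame_means)   # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.5)        # roughly 42 to 210 bpm
    peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak

# Synthetic 10-s recording at 30 fps with a 1.2-Hz (72 bpm) pulse.
fps, duration, f_pulse = 30, 10, 1.2
t = np.arange(fps * duration) / fps
frame_means = 120 + 2.0 * np.sin(2 * np.pi * f_pulse * t)
print(estimate_heart_rate(frame_means, fps))  # ~72 bpm
```

The simplicity of this computation underscores the point in the text: the measurement itself does not require a smartphone, and a dedicated wearable sensor can run it continuously.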

Noncontact-based methods for extracting hemodynamics using video recording are a more recent development which more fully utilizes the spatial information provided by smartphone imaging.68,72,104,123,124 Two recent works, by Park et al.72 and He and Wang,68 used computational techniques to infer higher resolution spectral responses and predict hemoglobin content in tissue using only RGB smartphone image data. Park et al. evaluated their method on a dataset acquired from 153 participants referred for a blood count at an academic health center in Kenya, whereas He and Wang performed their assessments on a few volunteers in a “dark” lab environment. Park et al. also performed a more rigorous quantitative comparison of their noninvasive hemoglobin concentration predictions against the gold standard obtained by venous blood draw, observing good quantitative agreement (R2=0.91 for 15 test patients).72 The work by Park et al. thus represents a rigorous and realistic assessment of the method in a clinical setting. However, both methods required careful calibration of the smartphone camera spectral response, used post-hoc image analysis routines, and were conducted only by research personnel, so it remains to be seen whether they can be effectively deployed at scale on smartphones.

Other monitoring applications for SBI involve photographic surveillance of dermal conditions, for example skin cancer or diabetic foot ulcers. In these applications, there is less need for optical instrumentation; rather, software design must facilitate standardized acquisition, automated analysis, and data relay to clinical providers as needed. For example, in the case of diabetic foot ulcers, Yap et al.125 developed an app that provides a “ghost outline” of the user’s foot based on prior measurements to ensure that repeated measurements are well coregistered, and Ploderer et al.93 proposed the use of the front-facing camera and voice guidance to enable users to acquire photos of the bottoms of their feet more conveniently. While less novel in their extraction of optical tissue properties, these approaches demonstrate a focus on real-world usability in SBI monitoring applications.

3.2.

Diagnosis

Diagnostic measurements of tissue are typically performed to examine a suspicious area of tissue more closely to assess the underlying cause of abnormality and/or severity of disease. As diagnostics are typically used for triage to treatment interventions, such applications are more likely to be performed by medical personnel in clinical settings rather than by individuals at home. Some of the primary in vivo diagnostic modalities proposed for SBI systems include white light imaging,52,65,66,91,100–102,126–133 autofluorescence imaging,17,65,66,71,100 multispectral/hyperspectral imaging,52,64,88 endoscopy,10–13 in vivo spectroscopy,14,51,85 and in vivo microscopy.14–16,62,134 For these modalities, the most frequent imaging sites are external tissues (dermis, facial, and retinal), externally accessible tissues (oral cavity, cervix, and ear), as well as some deeper tissues in the case of endoscopy (bladder, larynx, and esophagus).

In past years, a number of novel SBI systems have been developed for the application of skin cancer diagnosis and surveillance through smartphone-based dermascopy. In 2016, Kim et al. published one of the first smartphone-based multispectral imaging systems and explored its potential use for dermal lesion assessment.89 Their system consisted of a motor-controlled wheel of optical filters placed in front of the LED flash with embedded electronics and a custom app to synchronize acquisition of a spectral image cube from images acquired at nine different wavelengths from 440 to 690 nm. They validated their spectral measurements against a non-SBI liquid-crystal tunable filter system using a colored optical target and performed exploratory imaging of two dermal sites on a normal volunteer (one acne region and one nevus). Acquisition speed was not reported, but image cube processing speed was reportedly 30 s on the phone and could be sped up to 3 s per image cube with cloud-based processing. More recently, Uthoff et al. reported a multispectral system that performs sequential illumination using eight different colored LEDs across the visible to near-infrared regime (450 to 940 nm) and was actively controlled using a custom smartphone app.52 Their system implemented two different camera acquisition methods: one using the built-in camera and one using a tethered USB camera. The authors performed measurements using both camera setups and semiquantitatively assessed skin chromophores (hemoglobin, oxygenated hemoglobin, and melanin) in two clinical cases (one benign and one malignant). Acquisition reportedly required 20 s per image cube, and it was not specified whether on-board image cube analysis was achieved.
While both systems are state-of-the-art implementations in terms of integration of custom hardware and software into novel, compact imaging systems, their primary shortcoming is a lack of integration into a practical clinical workflow to more concretely demonstrate the advantages of having these modalities on an SBI system. This can be done but requires more extensive testing than was reported.
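The chromophore assessment these multispectral systems perform typically reduces to linear spectral unmixing under a Beer-Lambert-type model: measured absorbance at each wavelength is treated as a weighted sum of chromophore extinction spectra, and the weights are recovered by least squares. A generic sketch of this step follows; the extinction values are made up for the example and are not tabulated hemoglobin/melanin data.

```python
import numpy as np

def unmix_chromophores(absorbance, extinction):
    """Least-squares unmixing of per-wavelength absorbance.

    absorbance: (n_wavelengths,) measured values, e.g., -log10(R / R_ref).
    extinction: (n_wavelengths, n_chromophores) extinction spectra.
    Returns relative chromophore concentrations (arbitrary units).
    """
    conc, *_ = np.linalg.lstsq(extinction, absorbance, rcond=None)
    return conc

# Illustrative extinction spectra at 6 wavelengths for two chromophores.
extinction = np.array([[1.0, 0.2],
                       [0.8, 0.4],
                       [0.5, 0.7],
                       [0.3, 0.9],
                       [0.2, 1.0],
                       [0.1, 0.6]])
true_conc = np.array([0.7, 0.3])
absorbance = extinction @ true_conc      # noiseless synthetic measurement
print(unmix_chromophores(absorbance, extinction))  # ~[0.7, 0.3]
```

Running this unmixing on the phone itself, per pixel and per image cube, is precisely the kind of on-board analysis that neither reported system demonstrated.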

Endoscopic procedures using both rigid and flexible optical probes have been incorporated into SBI systems for otolaryngological, esophageal, and cervical examination of epithelial tissues.10–15,135 In the context of endoscopy applications, handheld systems with rigid optical probes seem more synergistic with smartphone utilization, as the probe and the device are compact enough to simultaneously manipulate and position.10,135 By contrast, other microendoscope systems utilize thin flexible coherent fiber bundles for image relay.14,15 These systems benefit less from smartphone utilization, as the optical probe is typically manipulated independently of the monitor screen and the optical probes can relay images over longer distances (often several feet), reducing the need for an ultracompact optical enclosure. Similarly, tethered capsule endoscopes benefit minimally from smartphone utilization as the tethered connection enables relay over long distances and to any computer monitor.

3.3.

Treatment Guidance

Treatment guidance imaging systems may have many overlapping requirements with diagnostic applications but primarily differ in that they have more stringent requirements for real-time visualization and analysis to provide active procedural guidance and support more rapid clinical decision making. Recent reports for SBI systems and applications that involve treatment guidance include image-guided surgery,18–20,22,37,38,41 management of severe burn injuries,40,95 PDT,69 and venipuncture.63,136

The starkest example of SBI for treatment guidance is surgery. Because surgical procedures are very costly in terms of medical infrastructure and human resource requirements, the typical justification of smartphone utilization based on low cost becomes irrelevant, and the usability merits of the form factor, interactive interface design, and networking capabilities of the smartphone platform must clearly take precedence. In 2014, Teichman et al.19 proposed the use of a smartphone app to take images and subsequent spatial measurements using software to postoperatively verify placement of toric intraocular lenses during cataract surgery. While the feasibility of SBI in this application was demonstrated, no substantive assessment of clinical outcomes was undertaken and the approach does not appear to have been widely adopted. In 2018, a team of neurosurgical clinicians reported a retrospective analysis on the use of a smartphone-based rigid endoscope to visualize the surgical field in more than 42 neurosurgical procedures over a span of five years. The study was a nonrandomized retrospective analysis and was limited to a qualitative case report of the usability of the smartphone in this application. The authors noted that the placement of the smartphone screen in front of the endoscope made it “more intuitive” and “enhanced 3D perception” during operation. The attachment utilized with the rigid endoscope appears to have since been discontinued.137 Overall, these clinical assessments of SBI systems in surgery were quite small scale and qualitative in nature, perhaps indicating a lack of confidence among providers and ethics committees in evaluating these techniques in a more substantive manner.

In two of the aforementioned applications (severe burn management and PDT), SBI systems have been proposed to provide noninvasive, quantitative quality assurance of treatment procedures which, in current clinical workflows, require less expertise to deliver but can be more subjective in nature. These applications are much better suited to leverage SBI systems. In photodynamic therapy, for example, Ruiz et al.6 developed a quantitative, handheld fluorescence imaging system for PpIX-PDT dosimetry before, during, and after phototherapy. The system consists of a battery-powered case attachment which includes an embedded LED ring (405-nm illumination) and a 600-nm longpass emission filter, and the cylindrical enclosure enables contact-based measurement at a fixed working distance and focal length, which helps ensure easy operation and reproducibility of measurements. The system was assessed using intralipid phantoms with known PpIX concentrations as well as in an animal model for PDT treatment and showed excellent quantitative precision in both applications. A clinical evaluation of another SBI system for photodynamic therapy of oral lesions at a medical center was recently reported by Khan et al.9 to assess low-cost technological treatments for this disease. Twenty-nine patients with confirmed oral squamous cell carcinoma lesions who were undergoing PDT were imaged using white light, ultrasound, autofluorescence, and PpIX fluorescence imaging pre- and post-treatment. Fluorescence imaging was demonstrated to provide visual guidance to demarcate lesion boundaries and to quantitatively confirm the treatment region pre- and post-treatment via photobleaching.
The authors noted some practical limitations of their current instrumentation (offline analysis routines as well as the handheld device being too bulky for easy access to all regions of the oral cavity), but it is more evident that the clinical assessments were rigorously performed in the intended clinical setting and that future developments will continue to leverage SBI in appropriate and meaningful ways.
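The photobleaching-based confirmation described above can be quantified very simply once pre- and post-treatment fluorescence images are coregistered: the fractional loss of signal within the lesion region indicates how completely the photosensitizer was consumed. The sketch below is a generic illustration of that metric on synthetic data, not the analysis pipeline of either cited system.

```python
import numpy as np

def percent_photobleaching(pre, post, mask):
    """Fractional loss of fluorescence signal within a lesion mask.

    pre, post: background-subtracted fluorescence images acquired with
    identical acquisition settings; mask: boolean lesion region.
    """
    f_pre = np.mean(pre[mask])
    f_post = np.mean(post[mask])
    return 100.0 * (f_pre - f_post) / f_pre

# Synthetic 8x8 images: lesion fluorescence drops from 200 to 50 counts.
pre = np.full((8, 8), 20.0)
post = np.full((8, 8), 20.0)
mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True
pre[mask], post[mask] = 200.0, 50.0
print(percent_photobleaching(pre, post, mask))  # 75.0
```

Note that this metric is only meaningful if acquisition parameters are fixed between the two measurements, which is exactly why fixed working distance and locked camera settings feature so prominently in the hardware designs discussed above.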

4.

Discussion

The pervasive rationale for smartphone utilization in biomedical imaging has emphasized a multitude of factors including cost, portability, connectivity, ease-of-use, and scalability. While these factors are clearly desirable features of biomedical imaging systems, rigorous justification for how SBI systems outperform non-SBI systems is often lacking. The ubiquity of smartphones is frequently cited in literature as a justification that SBI systems are inherently low-cost, easy-to-use, and scalable biomedical imaging solutions. However, undermining this claim is the reality that the majority of original research for SBI systems is limited to a single phone model and utilizes manual, often fragmented image acquisition and analysis pipelines. While this is partly the nature of research prototyping, it is important to openly discuss these limitations and continually move toward practices that will enable greater reproducibility and translation of research advances in SBI.

4.1.

Guidelines for Appropriate Use of Smartphone for Biomedical Optics

As discussed in the prior sections, smartphone-based hardware and software interface design varies greatly, with some designs more fully/appropriately utilizing the capabilities of the smartphone platform and others less so. Here, we provide six keys (the six C’s) as recommended guidelines to assess the appropriateness of smartphone use in biomedical imaging systems.

Clinical context represents an understanding of the clinical need and the intended user. These factors should always provide the overarching context for device design and development. Smartphone utilization should be justified primarily within this context: do the workflow needs of the clinic match the capabilities of an SBI system, or would other, more dedicated systems be superior? The ideal scenario is to assess new devices in their intended setting (at home, diagnostic lab, health clinic, central hospital, etc.) and in the hands of the intended operator (patient, lab technician, nurse, physician). When that is not feasible, device developers should be careful not to overstate the impact of their systems and should acknowledge this limitation.

Completeness represents achieving a complete implementation of the intended clinical workflow, including custom app development and use and testing by nonresearch personnel. Manual image processing steps using desktop software or evaluations performed only in lab settings are often signs that a complete workflow has not been achieved. Achieving a complete implementation is clearly the best way to test the value of the approach.

Compactness relates to the importance of portability and small size for the intended application. For diagnostic and treatment guidance systems, if handheld use and/or easy transport between exam rooms will enhance usability, smartphone use is more appropriate. Alternatively, if the use case is in healthcare outside the medical center or in a remote setting, does the compact nature of the phone and its attachment match the economics and portability needs of the situation?

Connectivity refers to the importance of wireless communication for the intended application. Applications that use wireless communication, through either Bluetooth or Wi-Fi, and features such as cloud computing for centralized data are ideal for SBI systems, as are those requiring scalability and multiuser deployment.

Cost is often the most emphasized reason for development of SBI systems, but this argument is rarely well founded. Low-cost optoelectronic systems are now widespread, and the cost of the most advanced smartphones has grown. The cost issue should be carefully evaluated, particularly if the application requires a hardware attachment and is intended for diagnostic use or treatment guidance. In these contexts, costs associated with regulatory approval and marketing will far outweigh material costs. Designs that prioritize the aforementioned C’s as opposed to minimizing prototyping costs are more likely to make an impact in medicine.

Claims refers to general statements regarding ease-of-use, cost, or scalability of SBI systems by virtue of smartphone utilization alone. Such claims should not be promulgated in research literature but rather appropriately justified through quantitative assessments. Some studies have quantified usability improvements by measuring time for task completion, performing surveys, or conducting blinded image reviews.11,129,133 Broader use of such assessments to substantiate improved usability of SBI systems should be encouraged. When possible, usability should be assessed in the intended clinical setting, but anatomical models and/or imaging phantoms can be good alternatives when clinical evaluation is infeasible.138

Key questions to embody the six proposed guidelines are contained in Fig. 5. These questions are intended to be used as a self-assessment for biomedical optics developers to encourage more careful consideration of the design choices made during SBI system development and improve the overall quality and rigor of SBI system assessments reported in the literature.

Fig. 5

Six guidelines for evaluating appropriateness of smartphone utilization in biomedical imaging applications.


Ultimately, smartphone utilization in biomedical imaging is a multifaceted design choice, which should be carefully justified and evaluated by system developers. That is to say, any biomedical imaging system can, in theory, be prototyped with better overall optical and computational performance using scientific-grade components. On the other hand, many systems prototyped as “low-cost” systems on smartphones can likely be implemented at even lower fabrication cost with greater reproducibility using single board computers or embedded processors connected to peripheral camera sensors.80,139143 Therefore, the burden should be placed on SBI system developers to demonstrate the unique advantages of their systems through prospective clinical assessments. Achieving this will require working toward greater reproducibility and translation of SBI systems.

4.2.

Achieving Reproducibility and Translation of Smartphone-Based Optical System Design

In the past decade, several startups have launched SBI products targeted toward dermatology and ophthalmology clinics as well as telehealth applications, but none have yet gained significant traction in medical practice. After more than a decade of development of SBI systems for healthcare applications, this lack of commercial success should raise concern. While it is challenging to comprehensively identify barriers to translation of SBI systems, we postulate that the speed at which smartphone technology evolves and the short lifecycle of these products are not readily conducive to medical device manufacturing standards. For hardware interfaces, new form factors and camera modules are launched each year, necessitating redesign of optical attachments. The lack of standardization in software development and reproducibility of results for SBI is a barrier for research progress. Here, we propose the following three items to move toward wide adoption of SBI systems: (1) focusing on hardware design that facilitates adoption of varying phone models, (2) creation of open-source software for SBI system development, and (3) adoption of robust calibration methods to best facilitate quantitative reproducibility.

Hardware design that focuses on attachments adaptable to different camera placements will be imperative for this field to gain long-term traction. Alternatively, the cost of attachment development could be made sufficiently low as to allow ease of development for multiple platforms, similar to the smartphone case marketplace today. Today, most devices are made for a specific phone model and customized around it, but further thought into adaptive design for constant changes in camera placement and phone sizes will be important. Hardware is more difficult to standardize, as people will likely elect to use different smartphones for development. As a starting point, sharing of CAD files for optical attachments, custom enclosures, and electronic schematics with publication should be encouraged. Many research groups and startups focus on 3D printing of the hardware enclosures, which is now a reliable and reasonable way to prototype. The shift from 3D printing technology to automated production of attachment hardware via machining, injection molding, or thermosetting will likely be important. Attachments with optical components can take advantage of the highly developed optomechanical engineering that has already revolutionized the smartphone camera industry. The major benefits of spectral, polarization, or gated sensing and imaging remain to be fully exploited with custom attachments.

Open-source software toolkits and starter applications for biomedical imaging are a good place to start addressing existing development and reproducibility problems in SBI. Effective smartphone app development and maintenance requires significant programming expertise and is currently a barrier for many researchers, who might otherwise be developing their software from scratch. In order for research prototypes to achieve clinical translation, standardized methods for SBI software development are needed. Cho et al.144 proposed a concept for a “retargetable application development platform for healthcare mobile applications.” Such a project is a worthy goal. In another recent review on smartphone point-of-care adapters, Alawsi and Al-Bawi proposed cross-platform app development using Ionic or Xamarin as a possible solution.53 Although cross-platform app development could help in principle, it would likely be limited to the subset of functionality common to all operating systems and would rely on the phone’s built-in compression algorithms; this would not be ideal for quantitative fluorescence imaging, for example. An alternative starting point is to create and maintain platform-specific templates that support core functionality needed for biomedical imaging, including support for RAW image acquisition and standardized processing routines for common biomedical image analysis tasks.

Robust calibration of SBI systems is essential for addressing reproducibility problems and achieving clinical translation. Two major factors in this regard are: (1) lack of characterization of sensor performance (dynamic range, SNR, absorption spectra of built-in filters, demosaicing, and data acquisition rates) and (2) “black-box” processing that phones perform on the CMOS imaging data to generate traditional 8-bit RGB images. The use of RAW pixel data to confirm suitable processing pipelines is a straightforward way to circumvent this issue for all applications, including colorimetric and quantitative techniques. Given the increased complexity of accessing and analyzing RAW pixel data, suitable alternatives include color and gray-scale calibration targets (the X-Rite ColorChecker, for example).67,69 Moving toward full system characterization using radiometric calibration methods to understand results presented in studies should also be encouraged. Relative radiometric calibration methods for smartphones have been proposed.70 Absolute radiometric calibration methods optimized for SBI systems should be developed to aid in the development of quantitative applications such as fluorescence imaging. Additionally, public or app-specific sharing of image measurements from commercially available optical phantoms/targets can help ensure reproducibility of optical measurements. Development of easily networked access to file spaces will enable platforms that take advantage of off-phone computing resources, such as deep learning algorithms that interpret the image data. At a minimum, these steps would enable relative calibration and comparison across hardware systems in the literature.
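When RAW access is unavailable, the gray-scale target approach amounts to characterizing the phone's tone response and inverting it. A minimal sketch of this idea, assuming a simple power-law (gamma) encoding, is shown below; real phone tone curves are more complex, and the patch luminances here are invented for the example.

```python
import numpy as np

def fit_response_exponent(luminance, pixel_values):
    """Fit pixel = luminance**g via log-log linear regression.

    luminance: known relative luminances of gray-scale target patches.
    pixel_values: normalized (0-1) camera readings of the same patches.
    Inverting with pixel**(1/g) recovers an approximately linear signal.
    """
    g, _ = np.polyfit(np.log(luminance), np.log(pixel_values), 1)
    return g

# Synthetic 6-step gray target imaged through a gamma-2.2 encoding.
luminance = np.array([0.03, 0.09, 0.19, 0.36, 0.59, 0.90])
pixel_values = luminance ** (1 / 2.2)
g = fit_response_exponent(luminance, pixel_values)
linearized = pixel_values ** (1 / g)
print(np.allclose(linearized, luminance))  # True
```

Publishing fitted response parameters alongside shared phantom measurements, as proposed above, would let other groups map their own phones' readings onto a common relative scale.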

5.

Conclusions

SBI systems have demonstrated a large array of applications and exhibit great potential to facilitate compact, easy-to-use biomedical imaging systems. However, for SBI systems to achieve that potential, more holistic assessments are needed to enable greater reproducibility and demonstrate value within their intended clinical settings. Evaluation of SBI systems should take into account the clinical context, completeness, compactness, connectivity, cost, and claims associated with novel systems. Claims regarding the scalability and low cost of SBI systems based on the ubiquity of smartphones should not be sufficient to justify their novelty and impact. Ongoing work in SBI for medical applications should prioritize realistic clinical assessments with quantitative and qualitative comparisons to non-SBI systems in order to more clearly demonstrate the value of SBI systems within their intended applications. Improved hardware design to accommodate the rapidly changing smartphone ecosystem, creation of open-source software and starter applications for SBI system development, and adoption of robust calibration techniques to address phone-to-phone variability are three high-priority areas to move SBI research in biomedical imaging forward.

Disclosures

Two of the authors (BWP and AJR) are cofounders and employed part time by QUEL Imaging, developing tools for fluorescence imaging, including smartphone applications.

Acknowledgments

This work has been partially supported by the National Institutes of Health (Grant No. P01CA084203).

References

1. 

A. Roda et al., “Smartphone-based biosensors: a critical review and perspectives,” TRAC Trends Anal. Chem., 79 317 –325 (2016). https://doi.org/10.1016/j.trac.2015.10.019 Google Scholar

2. 

S. Kanchi et al., “Smartphone based bioanalytical and diagnosis applications: a review,” Biosens. Bioelectron., 102 136 –149 (2018). https://doi.org/10.1016/j.bios.2017.11.021 BBIOE4 0956-5663 Google Scholar

3. 

J. Liu et al., “Point-of-care testing based on smartphone: the current state-of-the-art (2017–2018),” Biosens. Bioelectron., 132 17 –37 (2019). https://doi.org/10.1016/j.bios.2019.01.068 BBIOE4 0956-5663 Google Scholar

4. 

T. R. Kozel and A. R. Burnham-Marusich, “Point-of-care testing for infectious diseases: past, present, and future,” J. Clin. Microbiol., 55 2313 –2320 (2017). https://doi.org/10.1128/JCM.00476-17 JCMIDW 1070-633X Google Scholar

5. 

B. P. Hibler, Q. Qi and A. M. Rossi, “Current state of imaging in dermatology,” Semin. Cutan. Med. Surg., 35 2 –8 (2016). https://doi.org/10.12788/j.sder.2016.001 Google Scholar

6. 

A. J. Ruiz et al., “Smartphone fluorescence imager for quantitative dosimetry of protoporphyrin-IX-based photodynamic therapy in skin,” J. Biomed. Opt., 25 056003 (2020). https://doi.org/10.1117/1.JBO.25.5.056003 JBOPFO 1083-3668 Google Scholar

7. 

J. Hempstead et al., “Low-cost photodynamic therapy devices for global health settings: characterization of battery-powered LED performance and smartphone imaging in 3D tumor models,” Sci. Rep., 5 10093 (2015). https://doi.org/10.1038/srep10093 SRCEC3 2045-2322 Google Scholar

8. 

H. Liu et al., “Development and evaluation of a low-cost, portable, LED-based device for PDT treatment of early-stage oral cancer in resource-limited settings,” Lasers Surg. Med., 51 345 –351 (2019). https://doi.org/10.1002/lsm.23019 LSMEDI 0196-8092 Google Scholar

9. 

S. Khan et al., “Clinical evaluation of smartphone-based fluorescence imaging for guidance and monitoring of ALA-PDT treatment of early oral cancer,” J. Biomed. Opt., 25 063813 (2020). https://doi.org/10.1117/1.JBO.25.6.063813 JBOPFO 1083-3668 Google Scholar

10. 

J. K. Bae et al., “Smartphone-based endoscope system for advanced point-of-care diagnostics: feasibility study,” JMIR Mhealth Uhealth, 5 e99 (2017). https://doi.org/10.2196/mhealth.7232 Google Scholar

11. 

S. Lu et al., “Endockscope: a disruptive endoscopic technology,” J. Endourol., 33 960 –965 (2019). https://doi.org/10.1089/end.2019.0252 Google Scholar

12. 

S. E. Maurrasse, T. W. Schwanke and A. Tabaee, “Smartphone capture of flexible laryngoscopy: optics, subsite visualization, and patient satisfaction,” Laryngoscope, 129 2147 –2152 (2019). https://doi.org/10.1002/lary.27803 Google Scholar

13. 

G. Sharma et al., “Smartphone-based multimodal tethered capsule endoscopic platform for white-light, narrow-band, and fluorescence/autofluorescence imaging,” J. Biophotonics, 14 e202000324 (2020). https://doi.org/10.1002/jbio.202000324 Google Scholar

14. 

X. Hong et al., “A dual-modality smartphone microendoscope for quantifying the physiological and morphological properties of epithelial tissues,” Sci. Rep., 9 15713 (2019). https://doi.org/10.1038/s41598-019-52327-x SRCEC3 2045-2322 Google Scholar

15. 

B. D. Grant et al., “A mobile-phone based high-resolution microendoscope to image cervical precancer,” PLoS One, 14 e0211045 (2019). https://doi.org/10.1371/journal.pone.0211045 POLNCL 1932-6203 Google Scholar

16. 

E. E. Freeman et al., “Smartphone confocal microscopy for imaging cellular structures in human skin in vivo,” Biomed. Opt. Express, 9 1906 –1915 (2018). https://doi.org/10.1364/BOE.9.001906 BOEICL 2156-7085 Google Scholar

17. 

Q. He, T. Liu and R. K. Wang, “Handheld swept-source optical coherence tomography guided by smartphone-enabled wide-field autofluorescence photography for imaging facial sebaceous glands,” Opt. Lett., 45 5704 (2020). https://doi.org/10.1364/OL.405765 OPLEDP 0146-9592 Google Scholar

18. 

M. Mandel et al., “Smartphone-assisted minimally invasive neurosurgery,” J. Neurosurg., 130 90 –98 (2019). https://doi.org/10.3171/2017.6.JNS1712 JONSAC 0022-3085 Google Scholar

19. 

J. C. Teichman, K. Baig and I. I. K. Ahmed, “Simple technique to measure toric intraocular lens alignment and stability using a smartphone,” J. Cataract Refract. Surg., 40 1949 –1952 (2014). https://doi.org/10.1016/j.jcrs.2014.09.029 Google Scholar

20. 

O. Aly, “Assisting vascular surgery with smartphone augmented reality,” Cureus, 12 e8020 (2020). https://doi.org/10.7759/cureus.8020 Google Scholar

21. M. M. Çelikoyar, O. Topsakal and S. Gürbüz, "Mobile technology for recording surgical procedures," J. Visual Commun. Med. 42, 120–125 (2019). https://doi.org/10.1080/17453054.2019.1612234

22. T. Han et al., "Indocyanine green angiography predicts tissue necrosis more accurately than thermal imaging and near-infrared spectroscopy in a rat perforator flap model," Plast. Reconstr. Surg. 146, 1044–1054 (2020). https://doi.org/10.1097/PRS.0000000000007278

23. GSMArena, "GSMArena compare specs tool," GSMArena.com

24. Y. M. Park et al., "Ambient light-based optical biosensing platform with smartphone-embedded illumination sensor," Biosens. Bioelectron. 93, 205–211 (2017). https://doi.org/10.1016/j.bios.2016.09.007

25. Y. Zhao et al., "A nanozyme- and ambient light-based smartphone platform for simultaneous detection of dual biomarkers from exposure to organophosphorus pesticides," Anal. Chem. 90, 7391–7398 (2018). https://doi.org/10.1021/acs.analchem.8b00837

26. I. Hussain et al., "Design of a smartphone platform compact optical system operational both in visible and near infrared spectral regime," IEEE Sens. J. 18, 4933–4939 (2018). https://doi.org/10.1109/JSEN.2018.2832848

27. S. Dutta, "Point of care sensing and biosensing using ambient light sensor of smartphone: critical review," TRAC Trends Anal. Chem. 110, 393–400 (2019). https://doi.org/10.1016/j.trac.2018.11.014

28. A. Breitbarth et al., "Measurement accuracy and dependence on external influences of the iPhone X TrueDepth sensor," Proc. SPIE 11144, 1114407 (2019). https://doi.org/10.1117/12.2530544

29. B. Krolla, M. Diebold and D. Stricker, "Light field from smartphone-based dual video," Lect. Notes Comput. Sci. 8926, 600–610 (2015). https://doi.org/10.1007/978-3-319-16181-5_46

30. D. W. Palmer et al., "Glare-free retinal imaging using a portable light field fundus camera," Biomed. Opt. Express 9, 3178–3192 (2018). https://doi.org/10.1364/BOE.9.003178

31. H. M. Kim et al., "Miniaturized 3D depth sensing-based smartphone light field camera," Sensors 20, 2129 (2020). https://doi.org/10.3390/s20072129

32. S. Lee et al., "A time-of-flight range sensor using four-tap lock-in pixels with high near infrared sensitivity for LiDAR applications," Sensors 20, 116 (2020). https://doi.org/10.3390/s20010116

34. L. Ulrich et al., "Analysis of RGB-D camera technologies for supporting different facial usage scenarios," Multimedia Tools Appl. 79, 29375–29398 (2020). https://doi.org/10.1007/s11042-020-09479-0

35. "Cat S61—full phone specifications," https://www.gsmarena.com/cat_s61-9076.php

36. A. Kirimtat et al., "FLIR vs SEEK thermal cameras in biomedicine: comparative diagnosis through infrared thermography," BMC Bioinf. 21, 88 (2020). https://doi.org/10.1186/s12859-020-3355-7

37. J. T. Hardwicke, O. Osmani and J. M. Skillman, "Detection of perforators using smartphone thermal imaging," Plast. Reconstr. Surg. 137, 39–41 (2016). https://doi.org/10.1097/PRS.0000000000001849

38. E. Y. Xue et al., "Use of FLIR ONE smartphone thermography in burn wound assessment," Ann. Plast. Surg. 80, S236–S238 (2018). https://doi.org/10.1097/SAP.0000000000001363

39. R. F. M. van Doremalen et al., "Validation of low-cost smartphone-based thermal camera for diabetic foot assessment," Diabetes Res. Clin. Pract. 149, 132–139 (2019). https://doi.org/10.1016/j.diabres.2019.01.032

40. J. Goel et al., "A prospective study comparing the FLIR ONE with laser Doppler imaging in the assessment of burn depth by a tertiary burns unit in the United Kingdom," Scars Burn Heal. 6, 2059513120974261 (2020). https://doi.org/10.1177/2059513120974261

41. M. P. B. Obinah, M. Nielsen and L. R. Hölmich, "High-end versus low-end thermal imaging for detection of arterial perforators," Plast. Reconstr. Surg. Glob. Open 8, e3175 (2020). https://doi.org/10.1097/GOX.0000000000003175

42. D. M. Roblyer, "Perspective on the increasing role of optical wearables and remote patient monitoring in the COVID-19 era and beyond," J. Biomed. Opt. 25, 102703 (2020). https://doi.org/10.1117/1.JBO.25.10.102703

43. S. C. Mukhopadhyay, "Wearable sensors for human activity monitoring: a review," IEEE Sens. J. 15, 1321–1330 (2014). https://doi.org/10.1109/JSEN.2014.2370945

44. A. Nag, S. C. Mukhopadhyay and J. Kosel, "Wearable flexible sensors: a review," IEEE Sens. J. 17, 3949–3960 (2017). https://doi.org/10.1109/JSEN.2017.2705700

45. Z. S. Ballard and A. Ozcan, "Wearable optical sensors," in Mobile Health, J. Rehg, S. Murphy and S. Kumar, Eds., 313–342, Springer, Cham, Switzerland (2017).

46. J. Heikenfeld et al., "Wearable sensors: modalities, challenges, and prospects," Lab Chip 18, 217–248 (2018). https://doi.org/10.1039/C7LC00914C

47. A. Kamišalić et al., "Sensors and functionalities of non-invasive wrist-wearable devices: a review," Sensors 18, 1714 (2018). https://doi.org/10.3390/s18061714

48. "Understanding reliability in Bluetooth® technology," (2020). https://www.bluetooth.com/bluetooth-resources/understanding-reliability-in-bluetooth-technology/

49. "Project Soli—Google ATAP," https://atap.google.com/soli/technology/

50. C. Gu and J. Lien, "A two-tone radar sensor for concurrent detection of absolute distance and relative movement for gesture sensing," IEEE Sens. Lett. 1, 1–4 (2017). https://doi.org/10.1109/LSENS.2017.2696520

51. F. Cai et al., "Pencil-like imaging spectrometer for bio-samples sensing," Biomed. Opt. Express 8, 5427–5436 (2017). https://doi.org/10.1364/BOE.8.005427

52. R. D. Uthoff et al., "Point-of-care, multispectral, smartphone-based dermascopes for dermal lesion screening and erythema monitoring," J. Biomed. Opt. 25, 066004 (2020). https://doi.org/10.1117/1.JBO.25.6.066004

53. T. Alawsi and Z. Al-Bawi, "A review of smartphone point-of-care adapter design," Eng. Rep. 1, e12039 (2019). https://doi.org/10.1002/eng2.12039

54. J.-S. Yang et al., "Smartphone diagnostics unit (SDU) for the assessment of human stress and inflammation level assisted by biomarker ink, fountain pen, and origami holder for strip biosensor," Sens. Actuators B 241, 80–84 (2017). https://doi.org/10.1016/j.snb.2016.10.052

55. L. Wang et al., "Smartphone-based wound assessment system for patients with diabetes," IEEE Trans. Biomed. Eng. 62, 477–488 (2015). https://doi.org/10.1109/TBME.2014.2358632

56. B. Dai et al., "Colour compound lenses for a portable fluorescence microscope," Light Sci. Appl. 8, 75 (2019). https://doi.org/10.1038/s41377-019-0187-1

57. A. Orth et al., "A dual-mode mobile phone microscope using the onboard camera flash and ambient light," Sci. Rep. 8, 3298 (2018). https://doi.org/10.1038/s41598-018-21543-2

58. A. Russo et al., "A novel device to exploit the smartphone camera for fundus photography," J. Ophthalmol. 2015, 823139 (2015). https://doi.org/10.1155/2015/823139

59. "The portable ophthalmoscope for your iPhone | D-EYE," https://www.d-eyecare.com/en_US/product

60. "Tech specs of MoleScope II smart dermoscopy," https://www.dermengine.com/tech-specs-of-molescope-ii

61. N. A. Switz, M. V. D'Ambrosio and D. A. Fletcher, "Low-cost mobile phone microscopy with a reversed mobile phone camera lens," PLoS One 9, e95330 (2014). https://doi.org/10.1371/journal.pone.0095330

62. G. N. McKay et al., "Visualization of blood cell contrast in nailfold capillaries with high-speed reverse lens mobile phone microscopy," Biomed. Opt. Express 11, 2268–2276 (2020). https://doi.org/10.1364/BOE.382376

63. J. H. Song, C. Kim and Y. Yoo, "Vein visualization using a smart phone with multispectral Wiener estimation for point-of-care applications," IEEE J. Biomed. Health Inf. 19, 773–778 (2015). https://doi.org/10.1109/JBHI.2014.2313145

64. I. Kuzmina et al., "Study of smartphone suitability for mapping of skin chromophores," J. Biomed. Opt. 20, 090503 (2015). https://doi.org/10.1117/1.JBO.20.9.090503

65. R. D. Uthoff et al., "Point-of-care, smartphone-based, dual-modality, dual-view, oral cancer screening device with neural network classification for low-resource communities," PLoS One 13, e0207493 (2018). https://doi.org/10.1371/journal.pone.0207493

66. R. D. Uthoff et al., "Small form factor, flexible, dual-modality handheld probe for smartphone-based, point-of-care oral and oropharyngeal cancer screening," J. Biomed. Opt. 24, 106003 (2019). https://doi.org/10.1117/1.JBO.24.10.106003

67. M. Nixon, F. Outlaw and T. S. Leung, "Accurate device-independent colorimetric measurements using smartphones," PLoS One 15, e0230561 (2020). https://doi.org/10.1371/journal.pone.0230561

68. Q. He and R. Wang, "Hyperspectral imaging enabled by an unmodified smartphone for analyzing skin morphological features and monitoring hemodynamics," Biomed. Opt. Express 11, 895–910 (2020). https://doi.org/10.1364/BOE.378470

69. B. Cugmas and E. Štruc, "Accuracy of an affordable smartphone-based teledermoscopy system for color measurements in canine skin," Sensors 20, 6234 (2020). https://doi.org/10.3390/s20216234

70. O. Burggraaff et al., "Standardized spectral and radiometric calibration of consumer cameras," Opt. Express 27, 19075–19101 (2019). https://doi.org/10.1364/OE.27.019075

71. A. Lihachev et al., "Differentiation of seborrheic keratosis from basal cell carcinoma, nevi and melanoma by RGB autofluorescence imaging," Biomed. Opt. Express 9, 1852–1858 (2018). https://doi.org/10.1364/BOE.9.001852

72. S. M. Park et al., "mHealth spectroscopy of blood hemoglobin with spectral super-resolution," Optica 7, 563–573 (2020). https://doi.org/10.1364/OPTICA.390409

73. F. Outlaw et al., "Smartphone screening for neonatal jaundice via ambient-subtracted sclera chromaticity," PLoS One 15, e0216970 (2020). https://doi.org/10.1371/journal.pone.0216970

74. N. Vaidya and O. Solgaard, "3D printed optics with nanometer scale surface roughness," Microsyst. Nanoeng. 4, 18 (2018). https://doi.org/10.1038/s41378-018-0015-4

75. T. Gissibl et al., "Two-photon direct laser writing of ultracompact multi-lens objectives," Nat. Photonics 10, 554–560 (2016). https://doi.org/10.1038/nphoton.2016.121

76. W. M. Lee et al., "Fabricating low cost and high performance elastomer lenses using hanging droplets," Biomed. Opt. Express 5, 1626–1635 (2014). https://doi.org/10.1364/BOE.5.001626

77. Y. L. Sung et al., "Fabricating optical lenses by inkjet printing and heat-assisted in situ curing of polydimethylsiloxane for smartphone microscopy," J. Biomed. Opt. 20, 047005 (2015). https://doi.org/10.1117/1.JBO.20.4.047005

78. J. Long et al., "Frugal filtering optical lenses for point-of-care diagnostics," Biomed. Opt. Express 11, 1864–1875 (2020). https://doi.org/10.1364/BOE.381014

79. J. K. Adams et al., "In vivo fluorescence imaging with a flat, lensless microscope," bioRxiv (2020). https://doi.org/10.1101/2020.06.04.135236

80. L. Belcastro et al., "Handheld multispectral imager for quantitative skin assessment in low-resource settings," J. Biomed. Opt. 25, 082702 (2020). https://doi.org/10.1117/1.JBO.25.8.082702

81. X. Li et al., "Fast confocal microscopy imaging based on deep learning," in IEEE Int. Conf. Comput. Photogr., 1–12 (2020). https://doi.org/10.1109/ICCP48838.2020.9105215

82. Y. Li et al., "Diffuser-based computational imaging funduscope," Opt. Express 28, 19641–19654 (2020). https://doi.org/10.1364/OE.395112

83. C. Liu et al., "High resolution diffuse optical tomography using short range indirect subsurface imaging," in IEEE Int. Conf. Comput. Photogr., 1–12 (2020). https://doi.org/10.1109/ICCP48838.2020.9105173

84. W. Chen et al., "Optical design and fabrication of a smartphone fundus camera," Appl. Opt. 60, 1420 (2021). https://doi.org/10.1364/AO.414325

85. P. Edwards et al., "Smartphone based optical spectrometer for diffusive reflectance spectroscopic measurement of hemoglobin," Sci. Rep. 7, 12224 (2017). https://doi.org/10.1038/s41598-017-12482-5

86. K. G. Shah et al., "Mobile phone ratiometric imaging enables highly sensitive fluorescence lateral flow immunoassays without external optical filters," Anal. Chem. 90, 6967–6974 (2018). https://doi.org/10.1021/acs.analchem.8b01241

87. J. Spigulis et al., "Smartphone snapshot mapping of skin chromophores under triple-wavelength laser illumination," J. Biomed. Opt. 22, 091508 (2017). https://doi.org/10.1117/1.JBO.22.9.091508

88. T. C. Cavalcanti et al., "Smartphone-based spectral imaging otoscope: system development and preliminary study for evaluation of its potential as a mobile diagnostic tool," J. Biophotonics 13, e2452 (2020). https://doi.org/10.1002/jbio.201960213

89. S. Kim et al., "Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis," Biomed. Opt. Express 7, 5294–5307 (2016). https://doi.org/10.1364/BOE.7.005294

90. S. Kim et al., "Smartphone-based multispectral imaging and machine-learning based analysis for discrimination between seborrheic dermatitis and psoriasis on the scalp," Biomed. Opt. Express 10, 879–891 (2019). https://doi.org/10.1364/BOE.10.000879

91. J. Mink and C. Peterson, "MobileODT: a case study of a novel approach to an mHealth-based model of sustainable impact," Mhealth 2(4), 9832 (2016). https://doi.org/10.21037/mhealth.2016.03.10

92. R. Gross, "How to get better images with a colposcope," (2019). https://www.mobileodt.com/blog/get-better-images-with-a-colposcope/

93. B. Ploderer et al., "Promoting self-care of diabetic foot ulcers through a mobile phone app: user-centered design and evaluation," JMIR Diabetes 3, e10105 (2018). https://doi.org/10.2196/10105

94. A. Skandarajah et al., "Quantitative imaging with a mobile phone microscope," PLoS One 9, e96906 (2014). https://doi.org/10.1371/journal.pone.0096906

95. L. A. Wallis et al., "A smartphone app and cloud-based consultation system for burn injury emergency care," PLoS One 11, e0147253 (2016). https://doi.org/10.1371/journal.pone.0147253

96. C. Liu et al., "Wound area measurement with 3D transformation and smartphone images," BMC Bioinf. 20, 724 (2019). https://doi.org/10.1186/s12859-019-3308-1

97. A. T. Sufian et al., "Chromatic techniques for in vivo monitoring jaundice in neonate tissues," Physiol. Meas. 39, 095004 (2018). https://doi.org/10.1088/1361-6579/aadbdb

98. J. L. D. Nelis et al., "A randomized combined channel approach for the quantification of color- and intensity-based assays with smartphones," Anal. Chem. 92, 7852–7860 (2020). https://doi.org/10.1021/acs.analchem.0c01099

99. M. K. Hasan et al., "Smartphone-based human hemoglobin level measurement analyzing pixel intensity of a fingertip video on different color spaces," Smart Health 5–6, 26–39 (2018). https://doi.org/10.1016/j.smhl.2017.11.003

100. B. Song et al., "Automatic classification of dual-modality, smartphone-based oral dysplasia and malignancy images using deep learning," Biomed. Opt. Express 9, 5318–5329 (2018). https://doi.org/10.1364/BOE.9.005318

101. B. Askarian, S. C. Yoo and J. W. Chong, "Novel image processing method for detecting strep throat (Streptococcal pharyngitis) using smartphone," Sensors 19, 3307 (2019). https://doi.org/10.3390/s19153307

102. C. Otero et al., "Comparison of different smartphone cameras to evaluate conjunctival hyperaemia in normal subjects," Sci. Rep. 9, 1339 (2019). https://doi.org/10.1038/s41598-018-37925-5

103. E. Jonathan and M. Leahy, "Investigating a smartphone imaging unit for photoplethysmography," Physiol. Meas. 31, N79–N83 (2010). https://doi.org/10.1088/0967-3334/31/11/N01

104. M. Kumar, A. Veeraraghavan and A. Sabharwal, "DistancePPG: robust non-contact vital signs monitoring using a camera," Biomed. Opt. Express 6, 1565–1588 (2015). https://doi.org/10.1364/BOE.6.001565

105. N. Koenig et al., "Validation of a new heart rate measurement algorithm for fingertip recording of video signals with smartphones," Telemed. e-Health 22, 631–636 (2016). https://doi.org/10.1089/tmj.2015.0212

106. Y. Nam et al., "Monitoring of heart and breathing rates using dual cameras on a smartphone," PLoS One 11, e0151013 (2016). https://doi.org/10.1371/journal.pone.0151013

107. M. Alafeef, "Smartphone-based photoplethysmographic imaging for heart rate monitoring," J. Med. Eng. Technol. 41, 387–395 (2017). https://doi.org/10.1080/03091902.2017.1299233

108. J. R. Maestre-Rendon et al., "A non-contact photoplethysmography technique for the estimation of heart rate via smartphone," Appl. Sci. 10, 154 (2020). https://doi.org/10.3390/app10010154

109. A. Aitkulov and D. Tosi, "Optical fiber sensor based on plastic optical fiber and smartphone for measurement of the breathing rate," IEEE Sens. J. 19, 3282–3287 (2019). https://doi.org/10.1109/JSEN.2019.2894834

110. V. Dantu, J. Vempati and S. Srivilliputhur, "Non-invasive blood glucose monitor based on spectroscopy using a smartphone," in 36th Annu. Int. Conf. IEEE Eng. Med. and Biol. Soc., 3695–3698 (2014). https://doi.org/10.1109/EMBC.2014.6944425

111. K. Sun et al., "Ultrabright polymer-dot transducer enabled wireless glucose monitoring via a smartphone," ACS Nano 12, 5176–5184 (2018). https://doi.org/10.1021/acsnano.8b02188

112. H. C. Wang et al., "Design, fabrication, and feasibility analysis of a colorimetric detection system with a smartphone for self-monitoring blood glucose," J. Biomed. Opt. 24, 027002 (2019). https://doi.org/10.1117/1.JBO.24.2.027002

113. H. C. Wang et al., "Development and clinical trial of a smartphone-based colorimetric detection system for self-monitoring of blood glucose," Biomed. Opt. Express 11, 2166–2177 (2020). https://doi.org/10.1364/BOE.389638

114. H. Luo et al., "Smartphone-based blood pressure measurement using transdermal optical imaging technology," Circ. Cardiovasc. Imaging 12, e008857 (2019). https://doi.org/10.1161/CIRCIMAGING.119.008857

115. P. Schoettker et al., "Blood pressure measurements with the OptiBP smartphone app validated against reference auscultatory measurements," Sci. Rep. 10, 17827 (2020). https://doi.org/10.1038/s41598-020-74955-4

116. X. Ding, D. Nassehi and E. C. Larson, "Measuring oxygen saturation with smartphone cameras using convolutional neural networks," IEEE J. Biomed. Health Inf. 23, 2603–2610 (2019). https://doi.org/10.1109/JBHI.2018.2887209

117. K. Lee et al., "A comparative evaluation of atrial fibrillation detection methods in Koreans based on optical recordings using a smartphone," IEEE Access 5, 11437–11443 (2017). https://doi.org/10.1109/ACCESS.2017.2700488

118. G. Rozen et al., "Diagnostic accuracy of a novel mobile phone application for the detection and monitoring of atrial fibrillation," Am. J. Cardiol. 121, 1187–1191 (2018). https://doi.org/10.1016/j.amjcard.2018.01.035

119. A. Mariakakis et al., "BiliScreen: smartphone-based scleral jaundice monitoring for liver and pancreatic disorders," in Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., 1–26 (2017).

120. J. A. Taylor et al., "Use of a smartphone app to assess neonatal jaundice," Pediatrics 140, e20170312 (2017). https://doi.org/10.1542/peds.2017-0312

121. E. Chao, C. K. Meenan and L. K. Ferris, "Smartphone-based applications for skin monitoring and melanoma detection," Dermatol. Clin. 35, 551–557 (2017). https://doi.org/10.1016/j.det.2017.06.014

122. R. B. Kim et al., "Utilization of smartphone and tablet camera photographs to predict healing of diabetes-related foot ulcers," Comput. Biol. Med. 126, 104042 (2020). https://doi.org/10.1016/j.compbiomed.2020.104042

123. M. Kumar et al., "PulseCam: a camera-based, motion-robust and highly sensitive blood perfusion imaging modality," Sci. Rep. 10, 4825 (2020). https://doi.org/10.1038/s41598-020-61576-0

124. A. Pai, A. Veeraraghavan and A. Sabharwal, "HRVCam: robust camera-based measurement of heart rate variability," J. Biomed. Opt. 26, 022707 (2021). https://doi.org/10.1117/1.JBO.26.2.022707

125. M. H. Yap et al., "A new mobile application for standardizing diabetic foot images," J. Diabetes Sci. Technol. 12, 169–173 (2018). https://doi.org/10.1177/1932296817713761

126. A. Goldstein et al., "Assessing the feasibility of a rapid, high-volume cervical cancer screening programme using HPV self-sampling and digital colposcopy in rural regions of Yunnan, China," BMJ Open 10, e035153 (2020). https://doi.org/10.1136/bmjopen-2019-035153

127. R. N. Maamari et al., "A mobile phone-based retinal camera for portable wide field imaging," Br. J. Ophthalmol. 98, 438–441 (2014). https://doi.org/10.1136/bjophthalmol-2013-303797

128. D. Toslak et al., "Wide-field smartphone fundus video camera based on miniaturized indirect ophthalmoscopy," Retina 38, 438–441 (2018). https://doi.org/10.1097/IAE.0000000000001888

129. P. Li et al., "Usability testing of a smartphone-based retinal camera among first-time users in the primary care setting," BMJ Innov. 5, 120–126 (2019). https://doi.org/10.1136/bmjinnov-2018-000321

130. T. P. Patel et al., "Smartphone-based, rapid, wide-field fundus photography for diagnosis of pediatric retinal diseases," Transl. Vision Sci. Technol. 8, 29 (2019). https://doi.org/10.1167/tvst.8.3.29

131. M. W. M. Wintergerst et al., "Non-contact smartphone-based fundus imaging compared to conventional fundus imaging: a low-cost alternative for retinopathy of prematurity screening and documentation," Sci. Rep. 9, 19711 (2019). https://doi.org/10.1038/s41598-019-56155-x

132. A. D. Desai et al., "Open-source, machine and deep learning-based automated algorithm for gestational age estimation through smartphone lens imaging," Biomed. Opt. Express 9, 6038–6052 (2018). https://doi.org/10.1364/BOE.9.006038

133. T. N. Kim et al., "A smartphone-based tool for rapid, portable, and automated wide-field retinal imaging," Transl. Vision Sci. Technol. 7, 21 (2018). https://doi.org/10.1167/tvst.7.5.21

134. M. J. Fliotsos et al., "Qualitative and quantitative analysis of the corneal endothelium with smartphone specular microscopy," Cornea 39, 924–929 (2020). https://doi.org/10.1097/ICO.0000000000002277

135. J. K. Bae et al., "Quantitative screening of cervical cancers for low-resource settings: pilot study of smartphone-based endoscopic visual inspection after acetic acid using machine learning techniques," JMIR Mhealth Uhealth 8, e16467 (2020). https://doi.org/10.2196/16467

136. W. Lewis and W. Franco, "Smartphone imaging of subcutaneous veins," Lasers Surg. Med. 50, 1034–1039 (2018). https://doi.org/10.1002/lsm.22949

137. "ClearSCOPE 2.0 endoscope adaptor for smartphones announced; endoscopic video capture reinvented," https://www.prweb.com/releases/2014/09/prweb12187302.htm

138. A. Langley and G. M. Fan, "Comparison of the GlideScope®, flexible fibreoptic intubating bronchoscope, iPhone modified bronchoscope, and the Macintosh laryngoscope in normal and difficult airways: a manikin study," BMC Anesthesiol. 14, 10 (2014). https://doi.org/10.1186/1471-2253-14-10

139. M. Pagnutti et al., "Laying the foundation to use Raspberry Pi 3 V2 camera module imagery for scientific and engineering purposes," J. Electron. Imaging 26, 013014 (2017). https://doi.org/10.1117/1.JEI.26.1.013014

140. Y. Kim et al., "Real-time localization of the parathyroid gland in surgical field using Raspberry Pi during thyroidectomy: a preliminary report," Biomed. Opt. Express 9, 3391 (2018). https://doi.org/10.1364/BOE.9.003391

141. P. Gordon et al., "Portable bright-field, fluorescence, and cross-polarized microscope toward point-of-care imaging diagnostics," J. Biomed. Opt. 24, 096502 (2019). https://doi.org/10.1117/1.JBO.24.9.096502

142. C. Li et al., "Handheld projective imaging device for near-infrared fluorescence imaging and intraoperative guidance of sentinel lymph node resection," J. Biomed. Opt. 24, 080503 (2019). https://doi.org/10.1117/1.JBO.24.8.080503

143. S. Parra et al., "Development of low-cost point-of-care technologies for cervical cancer prevention based on a single-board computer," IEEE J. Transl. Eng. Health Med. 8, 1–10 (2020). https://doi.org/10.1109/JTEHM.2020.2970694

144. C. H. Cho et al., "A novel re-targetable application development platform for healthcare mobile applications," Int. J. Comput. Sci. Softw. Eng. 6(9), 196–201 (2019).

Biography

Brady Hunt is a research scientist at the Thayer School of Engineering at Dartmouth, where he is developing optical systems for treatment guidance in dermatology and radiotherapy applications. He completed his doctorate under the mentorship of Rebecca Richards-Kortum at Rice University, contributing significant clinical assessments of point-of-care imaging technologies in medically underserved populations in Brazil. Prior to that, he graduated magna cum laude with his bachelor’s degree in biophysics from Brigham Young University.

Alberto J. Ruiz is a PhD candidate and Innovation fellow working in the Optics in Medicine Lab at Dartmouth College. He received his BS degree in applied physics from Harvey Mudd College in 2014. He has worked as an applications engineer at Thorlabs Inc., and as an applications engineering manager at Cree Inc. He is passionate about translational technologies, education, and social responsibility. His current research interests include photodynamic therapy, fluorescence-guided surgery, and low-cost optical system design.

Brian Pogue is the editor-in-chief of the Journal of Biomedical Optics and the MacLean Professor of Engineering at Dartmouth. His work focuses on the role of optics in medicine, spanning biomedical engineering and medical physics, radiation therapy dosimetry, molecular-guided surgery, photodynamic therapy, and optically activated therapies. He has published over 400 peer-reviewed publications at the intersection of optics, imaging, therapy, and cancer.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Brady Hunt, Alberto J. Ruiz, and Brian W. Pogue "Smartphone-based imaging systems for medical applications: a critical review," Journal of Biomedical Optics 26(4), 040902 (15 April 2021). https://doi.org/10.1117/1.JBO.26.4.040902
Received: 31 December 2020; Accepted: 29 March 2021; Published: 15 April 2021
KEYWORDS
Imaging systems, Cameras, Biomedical optics, Interfaces, Sensors, Diagnostics, Calibration
