
Original research
Novel web application for self-assessment of distance visual acuity to support remote consultation: a real-world validation study in children
  Louise Allen¹, Arun James Thirunavukarasu², Simon Podgorski², Deborah Mullinger¹

  ¹Department of Ophthalmology, Cambridge University Hospitals NHS Foundation Trust, Cambridge, UK
  ²School of Clinical Medicine, University of Cambridge, Cambridge, UK

  Correspondence to Dr Louise Allen; louise.allen{at}addenbrookes.nhs.uk

Abstract

Objective The difficulty of accurately assessing distance visual acuity (VA) at home limits the usefulness of remote consultation in ophthalmology. A novel web application, DigiVis, enables automated VA self-assessment using standard digital devices. This study compares the accuracy and reliability of DigiVis self-assessment in children with clinical assessment by a healthcare professional.

Methods and Analysis Children aged 4–10 years were recruited from a paediatric ophthalmology service. Those with VA worse than +0.8 logMAR (Logarithm of the Minimum Angle of Resolution) or with cognitive impairment were excluded. Bland-Altman statistics were used to analyse both the accuracy and repeatability of VA self-testing. User feedback was collected by questionnaire.

Results The left eyes of 89 children (median age 7 years) were tested. VA self-testing showed a mean bias of 0.023 logMAR, with 95% limits of agreement (LOA) of ±0.195 logMAR and an intraclass correlation coefficient (ICC) of 0.816. A second test was possible in 80 (90%) children. Test–retest comparison showed a mean bias of 0.010 logMAR, with LOA of ±0.179 logMAR, an ICC of 0.815 and a repeatability coefficient of 0.012. 96% of children rated the test as good or excellent, as did 99% of their parents.

Conclusion Digital self-testing produced distance VA assessments comparable with clinical testing in children and was well accepted. Since DigiVis self-testing can be performed under direct supervision using medical video consultation software, it may be a useful tool to enable a proportion of paediatric eye clinic attendances to move online, reducing time off school and releasing face-to-face clinical capacity for those who need it.

  • visual acuity
  • remote consultation
  • telemedicine
  • software validation


This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


Key messages

What is already known about this subject?

  • Distance visual acuity (VA) is fundamental to decision making in ophthalmic consultation.

  • The NHS Long Term Plan target is for a third of consultations to be undertaken remotely.

  • Most vision testing applications are not certified medical devices or clinically validated.

What are the new findings?

  • The novel web application, DigiVis, is accurate and repeatable in children from the age of 4 years in a real-world setting.

  • The test is well accepted by parents and children.

  • No training (other than the application’s instruction video) or professional support is needed for successful self-testing.

How might these results change the focus of research or clinical practice?

  • VA self-testing may be observed in real time (synchronously) during video consultation to ensure viewing distance and eye occlusion are effective.

  • Asynchronous testing may enable home monitoring of conditions such as amblyopia.

  • Either method could reduce the frequency of, and need for, clinic attendance.

Introduction

Clinic backlogs were growing even before the COVID-19 pandemic, but disruption during lockdown and ongoing social distancing requirements have added to clinic delays and the risk of preventable visual impairment. The potential benefits of undertaking consultations remotely include reducing the burden of unnecessary hospital attendance on patients while optimising face-to-face capacity for those who need it. The UK NHS Long Term Plan aims for a third of appointments to become virtual to meet the demands of an ageing population within the constraints of limited clinical capacity.1 2

A distance visual acuity (VA) assessment is fundamental to any ophthalmic assessment, and a validated method for assessing VA at home is needed to support remote consultation.3 Although more than 20 vision testing applications are available, very few are clinically validated, designed to be used without a trained observer or certified for medical use. Those available are difficult for the clinician to supervise remotely.3–7

Paediatric ophthalmology clinics have high footfall, with many children requiring frequent reviews of VA for amblyopia therapy, resulting in time off school and expense for the family. An accurate system for self-testing VA which can be used at home may enable a proportion of paediatric appointments to be undertaken remotely.

DigiVis is a recently developed web application, certified as a medical device, that enables self-testing of distance VA. It can be integrated within medical video consultation software, allowing the clinician to supervise the test and ensure correct viewing distance set-up, use of glasses correction and effective eye occlusion. In this paper, we report the accuracy and repeatability of DigiVis self-testing during eye clinic attendance in children between 4 and 10 years of age.

Materials and methods

This was a prospective validation study comparing DigiVis VA self-testing with standard clinical testing. Patients and the public were involved in the design, conduct, reporting and dissemination plans of our research.

All children between 4 and 10 years of age attending routine paediatric ophthalmology clinic appointments within a 6-week period were invited to participate. Those with documented VA worse than +0.8 logMAR (Logarithm of the Minimum Angle of Resolution, 6/38 Snellen) were excluded, as were children with cognitive impairment. The children’s parents gave informed written consent and the children gave informed verbal or written assent. Parents used DigiVis to self-test their children’s vision at the time of clinic attendance, using provided internet-connected devices under the supervision of a medical student. Occluding glasses or occlusive patches were provided but no other help was given. DigiVis VA results were documented by the student after testing, and parents and children were asked to complete a usability and acceptance questionnaire. A standard, age-appropriate clinical assessment of VA was undertaken by a trained nurse, optometrist or orthoptist masked to the DigiVis results. Where the standard vision assessment was undertaken using a Snellen chart, the value was converted to logMAR in Microsoft Excel.
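
For reference, the Snellen-to-logMAR conversion is simply the base-10 logarithm of the inverted Snellen fraction, so 6/6 corresponds to 0.00 logMAR and 6/38 to approximately +0.80 logMAR, the exclusion threshold above. A minimal illustrative sketch of this calculation follows; the study performed this step in Microsoft Excel, and the Python helper below is a hypothetical equivalent, not part of DigiVis or the study’s analysis.

```python
# Illustrative sketch of the Snellen-to-logMAR conversion described above.
# The study performed this step in Microsoft Excel; this helper function is
# a hypothetical equivalent, not part of DigiVis or the study's analysis.
import math

def snellen_to_logmar(numerator: float, denominator: float) -> float:
    """Convert a Snellen fraction (e.g. 6/12) to logMAR: log10(den/num)."""
    return math.log10(denominator / numerator)

if __name__ == "__main__":
    for num, den in [(6, 6), (6, 9), (6, 12), (6, 38), (6, 60)]:
        print(f"{num}/{den} Snellen = {snellen_to_logmar(num, den):+.2f} logMAR")
        # 6/6 -> +0.00, 6/38 -> +0.80 (the exclusion threshold), 6/60 -> +1.00
```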

The technology

The DigiVis test requires two digital devices connected to the internet, with no download necessary. A tablet, laptop or desktop computer is used to display the distant test chart. A paired smartphone or tablet is held by the child sitting 2 m away from the test chart display (figure 1A) and functions as an interactive ‘matching card’ (figure 1B). An animated instruction video in the application demonstrates the steps for screen calibration, measuring the viewing distance and pairing the devices. Sloan letter optotypes are presented on the larger, distant screen, with adjacent letters and indicator arrows providing crowding consistent with the letter size, in a similar manner to standard linear logMAR charts. Where fewer than five letters can be displayed on the distant screen (from 0.8 logMAR upwards), a crowding box is used instead. The child is asked to select, from a group of five letters displayed on the handheld device (four of which are randomised), the optotype that matches the letter indicated on the distant screen. The child is encouraged during the test by collecting cartoon animals after each correctly matched letter. Optotype sizing follows a modified García-Pérez psychophysical staircase starting at 0.6 logMAR with three reversal points, facilitating calculation of the VA threshold.8 For this study, a lower limit of 0.00 logMAR was set to reduce test duration for children. The test usually takes between 30 s and 2 min per eye, depending on the consistency of the subject’s responses. Results are displayed in logMAR, Snellen and ETDRS (Early Treatment Diabetic Retinopathy Study) chart letters.

Figure 1

(A) Randomised optotype presentation on the distant device; the arrow indicates the letter to match. (B) The appearance of randomised letters on the handheld device, one of which matches the indicated letter on the distant test chart. The ‘not sure’ button registers as an incorrect attempt.
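
The article specifies the staircase’s starting level (0.6 logMAR), the lower limit used in this study (0.00 logMAR) and its three reversal points, but not its step sizes or up/down rule. The sketch below is therefore only a generic illustration of how a reversal-terminated staircase estimates a VA threshold; the 0.1 logMAR steps, the 1-up/1-down rule and the simulated observer are assumptions for demonstration, not DigiVis’s published algorithm.

```python
import random

# Generic staircase sketch: DigiVis uses a modified Garcia-Perez staircase,
# but its exact parameters are not given in the article. The step size and
# the 1-up/1-down rule below are assumptions for illustration only.
START_LOGMAR = 0.6              # starting level reported in the article
FLOOR_LOGMAR = 0.0              # lower limit used in this study
STEP = 0.1                      # assumed step size (logMAR)
N_REVERSALS = 3                 # reversal points reported in the article

def run_staircase(respond):
    """Estimate a VA threshold as the mean level at the reversal points.

    `respond(level)` must return True if the letter presented at `level`
    (in logMAR) is matched correctly.
    """
    level, last_direction, reversals = START_LOGMAR, 0, []
    for _ in range(50):                         # safety cap for this sketch
        if len(reversals) >= N_REVERSALS:
            break
        correct = respond(level)
        direction = -1 if correct else +1       # correct -> smaller letters
        if last_direction and direction != last_direction:
            reversals.append(level)             # direction changed: a reversal
        last_direction = direction
        level = max(FLOOR_LOGMAR, level + direction * STEP)
    return sum(reversals) / len(reversals) if reversals else level

if __name__ == "__main__":
    true_threshold = 0.30                       # simulated child's acuity (made up)

    def simulated_child(level):
        # Correct whenever the letter is at or above threshold, with a small
        # lapse rate to mimic occasional mistakes.
        return level >= true_threshold and random.random() > 0.05

    print(f"Estimated VA: {run_staircase(simulated_child):.2f} logMAR")
```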

Analysis

Data from the left eyes only were analysed to avoid inter-eye codependence. Where standard clinical test results were <0.00 logMAR, the value was rounded up to 0.00 to enable comparability with DigiVis scores. Agreement between DigiVis and clinical VA measurements, as well as test–retest (TRT) agreement, were evaluated with Bland-Altman plots, looking specifically at 95% limits of agreement (LOA) and mean bias, and with intraclass correlation coefficients (ICC) and repeatability coefficients. A priori standards were used to facilitate appraisal of agreement as quantified by ICC.9 Analysis and data visualisation were conducted in R (V.3.6.1; R Foundation for Statistical Computing, Vienna, Austria) and Affinity Designer (V.1.8.6; Serif, Nottingham, UK).
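
For readers unfamiliar with these statistics, the Bland-Altman bias is the mean of the paired differences and the 95% LOA are the bias ±1.96×SD of those differences. The sketch below illustrates these calculations, plus one common definition of the repeatability coefficient, on made-up paired logMAR values; it is not the study’s R code, and the ICC (which requires a two-way model) is not reproduced here.

```python
import numpy as np

# Illustrative Bland-Altman sketch on made-up paired VA values (logMAR).
# The study's actual analysis was performed in R; none of the numbers
# below are the study's data.
clinical = np.array([0.00, 0.10, 0.20, 0.00, 0.30, 0.10, 0.40, 0.00])
digivis  = np.array([0.05, 0.10, 0.15, 0.10, 0.25, 0.20, 0.35, 0.00])

diff = digivis - clinical
bias = diff.mean()                           # mean bias
sd = diff.std(ddof=1)                        # SD of the paired differences
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

print(f"Bias: {bias:+.3f} logMAR")
print(f"95% LOA: {loa_low:+.3f} to {loa_high:+.3f} logMAR")

# One common definition of the repeatability coefficient for two repeated
# measurements (1.96 * sqrt(2) * within-subject SD); whether the study used
# this exact formula is an assumption.
test1 = np.array([0.10, 0.20, 0.00, 0.30])
test2 = np.array([0.15, 0.20, 0.05, 0.25])
within_sd = np.sqrt(np.mean((test1 - test2) ** 2) / 2)
print(f"Repeatability coefficient: {1.96 * np.sqrt(2) * within_sd:.3f} logMAR")
```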

Results

The left eyes of 89 children aged 4–10 years (mean 7.4 years, median 7 years) were tested using the children’s version of the DigiVis app and by standard, age-appropriate clinical assessment. Of these children, 80 (90%) completed two DigiVis tests, enabling TRT agreement to be appraised. VA based on standard clinical testing ranged from 0.00 to 0.80 logMAR (mean 0.09 logMAR; IQR 0.00–0.13 logMAR).

In both comparisons, good agreement was indicated by the ICC values (p<0.001) and repeatability coefficients (table 1). Bland-Altman plots showed 95% LOA of ±0.195 logMAR for accuracy, comparing DigiVis with clinical assessment (figure 2), and ±0.179 logMAR for TRT agreement (figure 3). Bias was minimal in both cases, indicating a lack of systematic error between measurement techniques. No significant correlation (p>0.1) was observed between mean VA and difference in VA in either Bland-Altman plot, suggesting that agreement was consistent over the tested range. The repeatability coefficients suggest that the smallest detectable difference in vision with DigiVis is around 0.012 logMAR (table 1).

Table 1

Mean bias and LOA, with 95% CIs, for DigiVis compared with standard clinical assessment of VA and for TRT agreement

Figure 2

Bland-Altman plot comparing DigiVis visual acuity measurements with standard clinical testing to evaluate accuracy. The bias and 95% limits of agreement (dashed lines) are labelled and have 95% CIs (dotted lines) shaded. LOA, limits of agreement. logMAR, Logarithm of the Minimum Angle of Resolution.

Figure 3

Bland-Altman plot comparing repeated DigiVis measurements to evaluate test–retest agreement. The bias and 95% LOA (dashed lines) are labelled and have 95% CIs (dotted lines) shaded. LOA, limits of agreement. logMAR, Logarithm of the Minimum Angle of Resolution.

Of 89 children, 85 (95.6%) rated the test as good or excellent, as did 88 of the 89 (98.9%) parents. Of the 89 parents, 86 (96.7%) said that they would consider using the test to monitor their child’s vision at home.

Discussion

Conventional chart-based assessment of VA in children aged 6–11 years with corrected VA of 0.20 logMAR or better has reported a TRT LOA of ±0.15 logMAR.10 Validated digital distance VA testing systems include Peek Acuity, with an LOA between the app and clinical measurements of ±0.444 logMAR and a TRT LOA of ±0.414 logMAR.6 COMPlog, a distance VA test requiring a specifically sized computer monitor, recorded a TRT LOA of ±0.10–0.12 logMAR and an ICC of 0.964 in adults when comparing face-to-face with remote testing.11 12 Digital Kay picture symbol testing in children has an LOA between the app and ETDRS chart of ±0.21 logMAR and a TRT LOA of ±0.14 logMAR.13 Together, these data provide a priori standards against which DigiVis can be evaluated, although it should be noted that these validation studies used trained examiners to assess visual threshold rather than self-testing.

In this study, self-assessment with DigiVis, without trained input from an eyecare professional, had minimal bias, LOA of ±0.195 logMAR when compared with standard clinical testing, and TRT LOA of ±0.179 logMAR, with high ICC values of 0.816 and 0.815 and low repeatability coefficients, reinforcing evidence of its accuracy and reliability. The narrowness of CIs for calculated statistics suggests that the sampled population was sufficiently large to provide robust results.

There were several limitations to this study. Standard clinical testing was carried out using a variety of standard charts: Snellen, ETDRS and children’s logMAR flip charts. This reflects real-world variation in paediatric ophthalmology clinics but may have reduced the reliability of clinical measurements. A potential advantage of DigiVis is that it provides uniformity of testing from 4 years upwards and removes observer bias. A further limitation of this analysis was the exclusion of children with VA worse than +0.8 logMAR, a decision made due to the presumed difficulties these individuals may have in accessing the test. Additionally, the IQR of 0.00–0.13 logMAR illustrates bias towards good VA levels in the studied population. Further investigation is required to verify the app’s potential in children with poorer VA levels since they were under-represented in this study. Children with special educational needs and developmental delay were also excluded from this study since prior attempts to use DigiVis in children with Down syndrome had failed due to difficulty in understanding the concept of letter matching. Finally, the apparent consistency of DigiVis in tested subjects may have been inflated by the study participants repeating the test in quick succession, in the same testing environment and on the same devices. However, rapid retesting of children might also have been expected to result in poorer concentration and less repeatability.

Despite the limitations of the study, the results indicate that self-testing with DigiVis is comparable with age-appropriate VA assessment by a trained examiner in this childhood population, agreeing with our findings in a wider population containing older children and adults.14 The accuracy of VA assessment is dependent on viewing distance, correct use of glasses and effective occlusion. Confidence in home testing results may be improved by synchronising testing with remote consultation, using the share screen function of medical video conferencing software. This enables the clinician to observe test set-up and directly monitor both the child’s performance and the test chart in the conferencing window in real time. Based on the clinician’s observation and satisfaction with the parent and child’s ability to undertake the synchronised test effectively, unsupervised asynchronous home monitoring of VA may be considered. This could reduce the need for children to miss school in order to attend a face-to-face consultation. Further studies to determine the take-up and accuracy of both synchronous and asynchronous home vision self-testing using DigiVis are in progress. Vision self-testing was well accepted by both children and parents in this study, with almost all willing to use it for future home monitoring. Home testing and monitoring may encourage parents to take a more active role in their child’s eyecare and clinical capacity could be freed for children needing face-to-face clinic time.

A disadvantage of DigiVis is that it requires the family to have two internet-connected devices. Although most young families will have a smartphone and a tablet, a proportion of families will not be able to access the test. There is a recognised relationship between digital exclusion and the risk of poor health; inability to access digital testing could flag this risk and help prioritise these families for face-to-face appointments.15

Data availability statement

All data relevant to the study are included in the article or uploaded as supplementary information.

Ethics statements

Patient consent for publication

Ethics approval

This study was approved by the Health Research Authority and Health and Care Research Wales Ethics Committee (IRAS 196573). All procedures adhered to the tenets of the Declaration of Helsinki for research involving human subjects.

Acknowledgments

The authors extend their thanks to Sarah Laidlaw, Sarah Hays, Ruth Proffitt, Ciara O’Sullivan and Emily March for their assistance in the clinic and thank the parents and children who participated.

References

Footnotes

  • Contributors All named authors contributed to trial design, recruitment and manuscript development and are accountable for the integrity of the study. LA led the study design and ethics submission. AJT undertook the statistical analysis. SP and DM were integral in recruitment and testing.

  • Funding The development of the DigiVis web application was funded by an MRC Confidence in Concept grant from the University of Cambridge. This study was funded by Addenbrooke’s Charitable Trust.

  • Competing interests LA is the inventor and developer of DigiVis and founding director of Cambridge Medical Innovations. An international patent application has been made by Cambridge Enterprise.

  • Patient and public involvement Patients and/or the public were involved in the design, or conduct, or reporting, or dissemination plans of this research. Refer to the Methods section for further details.

  • Provenance and peer review Not commissioned; externally peer reviewed.