Abstract

Reaction time is one of the most commonly used measures in online cognitive assessments. However, there are significant technical problems with the methods commonly deployed to obtain this measure. Most online cognitive toolkits obtain reaction time measures with a visual cue and some type of mechanical response (keyboard, mouse, or touchscreen). Both the hardware and the software of the computer systems that these online cognitive tests depend on introduce significant delays and, more significantly, variation in those delays. The variability introduced by these systems leads to inaccurate results on which health care professionals have nonetheless come to rely. In this report, reaction time data collected with a tactile-based device accurately calibrated to sub-millisecond precision (the Brain Gauge) are compared with data from a visual reaction time test that relies on consumer-grade computer systems in a manner that parallels the methods commonly used in online cognitive testing. Thirty-nine healthy controls took both the tactile-based and the visual reaction time test, and the results demonstrated a significant difference in both reaction time and reaction time variability. Most significant was the difference in reaction time variability, which was 16.8 msec for the tactile test and 80.7 msec for the visual test. While the differences could be partially accounted for by differences between the tactile and visual biological pathways, the variability of the results from the visual task is in the range predicted by the error measured in a previous report that used robotic testing to characterize the differences between the two modalities of testing.

Citation

Kim J, Francisco E, Holden J, Lensch R, Kirsch B, Dennis R, Tommerdahl M. (2020). Visual vs. Tactile Reaction Testing Demonstrates Problems with Online Cognitive Testing. Journal of Science and Medicine; 2(2): 1-7. https://doi.org/10.37714/JOSAM.V2I2.39.

Introduction

A recent report described problems that are predicted to be prevalent in reaction time testing with broadly and commonly used online cognitive testing systems [1]. The results of that paper demonstrated that inherent system delays make the majority of these online systems extremely inaccurate and are most likely responsible for problems in using reaction time as an accurate and objective measure of brain function. Visual reaction time is the modality most commonly deployed in these online systems, and the results of robotic testing of commercially available generic visual reaction time systems were recently compared with the results of a less commonly used, commercially available tactile-based system (the Brain Gauge). In that study, the inaccuracies measured in the visual reaction time systems ranged from 40 to 400 msec, while the temporal accuracy of the Brain Gauge was measured to be approximately 0.3 msec. The objective of this study is to directly compare the measures obtained from visual and tactile reaction time testing of healthy controls to determine whether the latencies measured robotically in Holden et al. [1] are predictive of human-based results.

Methods

Figure 1. Two-point vibrotactile stimulator (the Brain Gauge; Cortical Metrics; Carrboro, NC).

Tactile Stimulator

A two-point vibrotactile stimulator, pictured in Figure 1 (the Brain Gauge by Cortical Metrics, Carrboro, NC), was used to deliver stimuli to the tips of digits 2 and 3. Testing with the Brain Gauge has been previously described in a number of reports [2-9], and the device delivers stimuli and stimulus protocols in the same manner as a previously described four-point tactile stimulator [10].

During the evaluation session, subjects were seated comfortably in a chair with the dominant hand situated on the Brain Gauge. The independent, computer-controlled probe tips can deliver a wide range of sinusoidal vibrotactile stimulation of varying amplitudes and frequencies. Typically, protocols deliver vibrotactile flutter stimulation (25 Hz) to the glabrous tips of one or both of the second (index, D2) and third (middle, D3) digits of the dominant hand. These digits were chosen as test sites for convenience and comfort, and also because of the wealth of neurophysiological data that supports the evaluation of the associated somatotopic regions in the non-human primate cerebral cortex.
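For readers unfamiliar with this class of stimulus, the sketch below (Python) generates the kind of sinusoidal flutter waveform described above. Only the 25 Hz flutter frequency and the 300 μm amplitude used later in the reaction time task come from the protocols; the duration, sample rate, and function name are illustrative assumptions, not Brain Gauge specifications.

```python
import numpy as np

def flutter_waveform(amplitude_um=300.0, freq_hz=25.0,
                     duration_s=0.5, sample_rate_hz=10_000):
    """Probe displacement (micrometers) for a sinusoidal flutter stimulus.

    Only the 25 Hz frequency and 300 um amplitude are taken from the
    protocols described above; all other values are illustrative.
    """
    t = np.arange(0.0, duration_s, 1.0 / sample_rate_hz)
    return t, amplitude_um * np.sin(2.0 * np.pi * freq_hz * t)

t_s, displacement_um = flutter_waveform()
```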

A computer monitor provided visual cueing during each of the experimental runs. The cues indicated when the experimental stimuli would be expected to be delivered and when subjects were to respond. Training trials conducted prior to each task familiarized subjects with the test; correct responses on three consecutive training trials were required before the start of the assessment.

Tactile Reaction Time

For the simple tactile reaction time task, a single tap (300 μm, 40 msec) was delivered to D3, and subjects were instructed to respond by pressing down with D2 as soon as the tap was perceived. A randomized delay ranging from 2 to 7 s separated the trials. Response times were recorded for each of the 10 trials. This method was first reported in Zhang et al. [11] and has been reported many times since then [4,5,7,12-27]. The standard deviation of the 10 reaction times was used as a measure of reaction time variability.
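For concreteness, the following is a minimal sketch (Python) of how the two measures are derived from a 10-trial session. The trial values are invented for illustration, not data from this study, and since the text does not specify whether the sample or population standard deviation was used, the sample form is assumed here.

```python
import statistics

# Ten hypothetical response times in msec (illustrative values only,
# not data from this study).
trial_rts_ms = [238, 251, 229, 244, 260, 237, 248, 233, 242, 255]

reaction_time = statistics.mean(trial_rts_ms)    # mean of the 10 trials
rt_variability = statistics.stdev(trial_rts_ms)  # sample SD (assumed form)

print(f"RT = {reaction_time:.1f} msec, RTv = {rt_variability:.1f} msec")
```

The same computation applies to the visual task described next.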

Visual Reaction Time

For the simple visual reaction time task, a visual cue was flashed on the computer monitor and individuals were instructed to respond with a mouse click as quickly as possible. A randomized delay ranging from 2 to 7 s separated the trials. The standard deviation of the 10 reaction times was used as a measure of reaction time variability. The method was partially described in Holden et al. [1]; in that study, however, robotic testing was used in place of human subjects.

Subjects

Thirty-nine individuals (average age = 21) were recruited into the study and performed the tests. None of the subjects reported any neurological disorders. The experimental procedures were reviewed and approved in advance by an institutional review board.

Results

Thirty-nine individuals completed the tactile reaction time test with the Brain Gauge and also completed the visual reaction time test. Results are summarized in Figure 2. The average tactile reaction time across all subjects was 241 msec, and the average visual reaction time was 329 msec. Inter-trial variability was used to calculate reaction time variability, which was much lower for the tactile task than for the visual task (16.8 msec vs. 80.7 msec).

Figure 2. Comparison of average tactile vs. visual reaction times and reaction time variabilities. Note that not only is average visual reaction time much slower than average tactile reaction time, but average visual reaction time variability is over four times higher than that observed for tactile testing.

Discussion

Holden and colleagues [1] reported a very significant 80 msec latency, introduced non-biologically, for visual reaction time delivered in the manner described in this report, and a non-significant non-biological latency for the Brain Gauge (~0.3 msec). The difference in reaction time means observed in this group of individuals between the tactile and visual tasks is approximately 85 msec, implying that the majority of the difference in reaction time is due to technical latencies and not biological ones. One could speculate that reaction time measured through the visual system is genuinely and significantly different from reaction time measured through the tactile system, but for this cohort, the evidence leans heavily towards the difference being attributable to the technology.
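The inference can be made explicit with a simple additive decomposition. The assumption, consistent with but not stated in [1], is that a recorded reaction time is the sum of a biological response time and a technical system latency:

```latex
\begin{aligned}
RT^{\mathrm{obs}} &= RT^{\mathrm{bio}} + L_{\mathrm{sys}}\\
\Delta RT^{\mathrm{obs}} &= \Delta RT^{\mathrm{bio}} + (L_{\mathrm{vis}} - L_{\mathrm{tac}})\\
85~\mathrm{msec} &\approx \Delta RT^{\mathrm{bio}} + (80 - 0.3)~\mathrm{msec}
\quad\Longrightarrow\quad \Delta RT^{\mathrm{bio}} \approx 5~\mathrm{msec}
\end{aligned}
```

Under this assumption, only about 5 msec of the observed 85 msec difference would be attributable to the biological pathways themselves.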

Pearce and colleagues compared somatosensory vs. visual reaction time (RT) between two groups – one group having persistent post-concussion symptoms (PPCS) and the other group being asymptomatic [4]. Even though the average age of the study participants was ~38 (16 years older than the average age of this study’s participants), the somatosensory RT of the asymptomatic group in the Pearce study was slightly lower than in this study (230 msec vs. 241 msec). However, the difference between somatosensory RT and visual RT was the same as that found in this study (85 msec). Although different methods of visual RT were deployed (in-lab programming versus a commercially available online cognitive test), both methods relied on similar computer systems – the same type of system that Holden and colleagues predicted would have an approximately 80 msec latency. Interestingly, the difference between the asymptomatic and PPCS groups was reported as significant for somatosensory RT but not for visual RT. Thus, it appears that the inaccuracy introduced by the inferior method (the use of consumer-grade computers) leads to results that are not only inaccurate but also support a false conclusion – that there is no difference in information processing speed between the PPCS and asymptomatic cohorts. The majority of publications that report reaction time use visual reaction time testing, and the authors suspect, based on the work of Holden et al. [1], that these publications are mis-reporting or misrepresenting the populations being described. In other words, poor methods lead to poor results. On the other hand, studies utilizing the Brain Gauge have consistently reported reaction times in the same range as this report for healthy controls [4-7,28].

Why would the increased latency in visual RT result in a decrease in the difference between the PPCS and asymptomatic RT in the Pearce et al. study? The answer is that the increased latency of the visual RT is not consistent: the system delays inherent in the technology have a high degree of variability. In other words, while the average latency of a system may be 80 msec, it may vary by as much as 40 msec [1], and the same biological RT response (without the technical latency) may randomly range from 200 to 280 msec. Averaging several trials of visual RT with high latency and latency variability results in a significant loss of accuracy and makes it difficult to differentiate cohorts on the basis of RT. The difference in reaction time variability (RTv) between the somatosensory and visual RT tasks was significant (16.8 vs. 80.7 msec), and this clearly reflects the differences in system variability between the two methods. The RTv collected with the Brain Gauge in this report was similar to the range reported for the asymptomatic group in Pearce et al. [4], and since the accuracy of the Brain Gauge is approximately 0.3 msec, the 14-16 msec range should be considered to be due primarily to the actual biological variability of the task. Other studies utilizing the Brain Gauge have reported reaction time variabilities for healthy controls consistent with those reported in this study [4-7,28].
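The mechanism can be illustrated with a minimal Monte Carlo sketch (Python). The distributions are illustrative assumptions loosely matched to the figures quoted above (a biological RT with roughly 16 msec of trial-to-trial variability, plus a system latency averaging 80 msec that varies by as much as 40 msec); they are not measured properties of any particular system.

```python
import random
import statistics

random.seed(0)  # reproducible illustration

def simulated_session(n_trials=10, latency_spread_ms=40.0):
    """Return the SD of measured RTs for one simulated 10-trial session.

    Biological RT: normal, mean 240 msec, SD 16 msec (illustrative values
    loosely matched to the tactile results above, not measured data).
    System latency: uniform over 80 +/- latency_spread_ms / 2 msec.
    """
    measured = []
    for _ in range(n_trials):
        biological = random.gauss(240.0, 16.0)
        latency = random.uniform(80.0 - latency_spread_ms / 2.0,
                                 80.0 + latency_spread_ms / 2.0)
        measured.append(biological + latency)
    return statistics.stdev(measured)

# Average the measured RTv over many simulated sessions,
# with and without latency jitter.
n_sessions = 2000
jittered = statistics.mean(simulated_session() for _ in range(n_sessions))
fixed = statistics.mean(simulated_session(latency_spread_ms=0.0)
                        for _ in range(n_sessions))
print(f"mean RTv with 40 msec latency jitter: {jittered:.1f} msec")
print(f"mean RTv with a fixed latency:        {fixed:.1f} msec")
```

Even this modest, uniformly distributed jitter noticeably inflates the measured RTv (from roughly 16 to roughly 20 msec in this sketch); larger or less well-behaved latency distributions inflate it further and obscure genuine group differences.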

It would be a considerable injustice for the authors to take full credit for the observation of the widespread and, unfortunately, growing use of inaccurate visual reaction time testing on consumer-grade computers and human interface devices. The inaccuracies and the lack of good laboratory practices in reaction time testing were first pointed out by Richard Plant [29-33], and this is fully described in the Holden et al. [1] report. It is curious that so many researchers have conveniently ignored that work and continue to allow so many inferior consumer-grade computer products to be used for purposes for which they were expressly not designed. The authors believe that this is one significant reason that online concussion testing is widely regarded as simply not working. However, having multiple methods in place that do not work allows many researchers to continue doing the same research, ostensibly refining their methods, while actually refusing to take the time to understand why their basic experimental approach is both ineffective and demonstrably unsuited to human performance research. This culture of fixation on a method, no matter how flawed, is quite pervasive. Another good example of such poor methodology is balance testing, the accuracy of which for concussion detection has consistently been shown to be inferior to that of flipping a coin. However, insistence on the continued use of this method in research and clinical settings props up the careers of many, with no fear that the problem will be “cracked” any time soon, since the fundamental method itself is known to be ineffective. It is sad that narrow-minded technical incompetence has evolved into a primary strategy for job security among academic researchers.

Further entrenching the culture of fact avoidance in academic human performance research is the recent trend of limiting cited references to those “within the most recent 10 years,” with older (and therefore seminal) references prohibited by the publication policies of some journals. A thorough treatment of the details of a method is most likely to be published early in the adoption of any new method or scientific instrument; the de facto effect of such policies is therefore to promote a less critical and less incisive list of references, one far less likely to call into question the suitability of a method for the use to which it is being put. The simple fact is that instituting a policy of “recent references only” is tantamount to an overt refusal to learn from history; to repeating mistakes that were clearly pointed out decades ago; and to pretending to do novel work when, as often as not, the work presented in such journals is nothing more than an inferior rehash of work that had been done at least as well more than a decade ago. The inappropriate use of consumer-grade computing hardware for the measurement of human performance, specifically reaction time, is simply one result of a decaying culture of sloppiness and technical illiteracy among a majority of academic researchers in this area.

Conclusion

There is growing evidence that the visual reaction time measures commonly used by online cognitive assessment tools are very inaccurate, and the data in this report are but one of many data points providing evidence for that. The technological underpinnings of this problem are more fully described in Holden et al. [1], but this report, which directly compares visual to somatosensory reaction times in healthy controls, is fully consistent with the predictions made by the robotic testing conducted by Holden and colleagues. Similar technical artifacts are to be expected in other areas where consumer-grade computing hardware is used in place of well-characterized, laboratory-grade scientific instruments.

References

  1. Holden J, Francisco E, Tommerdahl A, Lensch R, Kirsch B, Zai L, Dennis R, Tommerdahl M. Accuracy of different modalities of reaction time testing: Implications for online cognitive assessment tools. bioRxiv. 2019.
  2. Holden Jameson K., Nguyen Richard H., Francisco Eric M., Zhang Zheng, Dennis Robert G., Tommerdahl Mark. A novel device for the study of somatosensory information processing. Journal of Neuroscience Methods. 2012; 204(2)DOI
  3. Favorov Oleg V, Francisco Eric, Holden Jameson, Kursun Olcay, Zai Laila, Tommerdahl Mark. Quantification of Mild Traumatic Brain Injury via Cortical Metrics: Analytical Methods. Military Medicine. 2019; 184(Supplement_1)DOI
  4. King DA, Hume P, Tommerdahl M. Use of the Brain-gauge Somatosensory Assessment for Monitoring Recovery from Concussion: A Case Study. Journal of Physiotherapy Research. 2018; 2(1):1-13.
  5. King Doug. Use of the King-Devick test and Brain Gauge for the management of concussion. Journal of Science and Medicine in Sport. 2018; 21DOI
  6. Tommerdahl AP, Francisco EF, Lensch R, Holden JK, Favorov OV, Tommerdahl M. Response Time in Somatosensory Discrimination Tasks is Sensitive to Neurological Insult. Neurology and Neurobiology. 2019. DOI
  7. Pearce Alan J., Tommerdahl Mark, King Doug A.. Neurophysiological abnormalities in individuals with persistent post-concussion symptoms. Neuroscience. 2019; 408DOI
  8. Pearce Alan J., Kidgell Dawson J., Frazer Ashlyn K., King Doug A., Buckland Michael E., Tommerdahl Mark. Corticomotor correlates of somatosensory reaction time and variability in individuals with post concussion symptoms. Somatosensory & Motor Research. 2019; 37(1)DOI
  9. Zhang Zheng, Francisco Eric M., Holden Jameson K., Dennis Robert G., Tommerdahl Mark. Somatosensory Information Processing in the Aging Population. Frontiers in Aging Neuroscience. 2011; 3DOI
  10. Puts Nicolaas A.J., Edden Richard A.E., Wodka Ericka L., Mostofsky Stewart H., Tommerdahl Mark. A vibrotactile behavioral battery for investigating somatosensory processing in children and adults. Journal of Neuroscience Methods. 2013; 218(1)DOI
  11. Puts Nicolaas A., Wodka Ericka L., Tommerdahl Mark, Mostofsky Stewart H., Edden Richard A.. Reply to Dickinson and Milne. Journal of Neurophysiology. 2014; 112(6)DOI
  12. Puts Nicolaas A. J., Wodka Ericka L., Tommerdahl Mark, Mostofsky Stewart H., Edden Richard A. E.. Impaired tactile processing in children with autism spectrum disorder. Journal of Neurophysiology. 2014; 111(9)DOI
  13. Puts Nicolaas A. J., Harris Ashley D., Crocetti Deana, Nettles Carrie, Singer Harvey S., Tommerdahl Mark, Edden Richard A. E., Mostofsky Stewart H.. Reduced GABAergic inhibition and abnormal sensory symptoms in children with Tourette syndrome. Journal of Neurophysiology. 2015; 114(2)DOI
  14. Puts Nicolaas A.J., Wodka Ericka L., Harris Ashley D., Crocetti Deana, Tommerdahl Mark, Mostofsky Stewart H., Edden Richard A.E.. Reduced GABA and altered somatosensory function in children with autism spectrum disorder. Autism Research. 2016; 10(4)DOI
  15. Puts Nicolaas A. J., Harris Ashley D., Mikkelsen Mark, Tommerdahl Mark, Edden Richard A. E., Mostofsky Stewart H.. Altered tactile sensitivity in children with attention-deficit hyperactivity disorder. Journal of Neurophysiology. 2017; 118(5)DOI
  16. Mikkelsen Mark, He Jason, Tommerdahl Mark, Edden Richard A. E., Mostofsky Stewart H., Puts Nicolaas A. J.. Reproducibility of flutter-range vibrotactile detection and discrimination thresholds. Scientific Reports. 2020; 10(1)DOI
  17. Nguyen R.H., Ford S., Calhoun A.H., Holden J.K., Gracely R.H., Tommerdahl M.. Neurosensory assessments of migraine. Brain Research. 2013; 1498DOI
  18. Güçlü Burak, Tanıdır Canan, Çanayaz Emre, Güner Bora, İpek Toz Hamiyet, Üneri Özden Ş., Tommerdahl Mark. Tactile processing in children and adolescents with obsessive–compulsive disorder. Somatosensory & Motor Research. 2015; 32(3)DOI
  19. Wodka Ericka L., Puts Nicolaas A. J., Mahone E. Mark, Edden Richard A. E., Tommerdahl Mark, Mostofsky Stewart H.. The Role of Attention in Somatosensory Processing: A Multi-trait, Multi-method Analysis. Journal of Autism and Developmental Disorders. 2016; 46(10)DOI
  20. Bryant Lauren K., Woynaroski Tiffany G., Wallace Mark T., Cascio Carissa J.. Self-reported Sensory Hypersensitivity Moderates Association Between Tactile Psychophysical Performance and Autism-Related Traits in Neurotypical Adults. Journal of Autism and Developmental Disorders. 2019; 49(8)DOI
  21. Houghton David C., Tommerdahl Mark, Woods Douglas W.. Increased tactile sensitivity and deficient feed-forward inhibition in pathological hair pulling and skin picking. Behaviour Research and Therapy. 2019; 120DOI
  22. Ruitenberg Marit F. L., Cassady Kaitlin E., Reuter-Lorenz Patricia A., Tommerdahl Mark, Seidler Rachael D.. Age-Related Reductions in Tactile and Motor Inhibitory Function Start Early but Are Independent. Frontiers in Aging Neuroscience. 2019; 11DOI
  23. Cassady Kaitlin, Ruitenberg Marit F L, Reuter-Lorenz Patricia A, Tommerdahl Mark, Seidler Rachael D. Neural Dedifferentiation across the Lifespan in the Motor and Somatosensory Systems. Cerebral Cortex. 2020; 30(6)DOI
  24. Rahman Md Shoaibur, Barnes Kelly Anne, Crommett Lexi E., Tommerdahl Mark, Yau Jeffrey M.. Auditory and tactile frequency representations are co-embedded in modality-defined cortical sensory systems. NeuroImage. 2020; 215DOI
  25. Jorgensen-Wagners K. Brain Gauge: Measuring Brain Health for Concussion Recovery. Military Medicine. 2020; In Review
  26. Hanley Claire J., Burianová Hana, Tommerdahl Mark. Towards Establishing Age-Related Cortical Plasticity on the Basis of Somatosensation. Neuroscience. 2019; 404DOI
  27. Plant Richard R.. A reminder on millisecond timing accuracy and potential replication failure in computer-based psychology experiments: An open letter. Behavior Research Methods. 2015; 48(1)DOI
  28. Plant Richard R., Turner Garry. Millisecond precision psychological research in a world of commodity computers: New hardware, new problems?. Behavior Research Methods. 2009; 41(3)DOI
  29. Plant Richard R., Hammond Nick, Turner Garry. Self-validating presentation and response timing in cognitive paradigms: How and why?. Behavior Research Methods, Instruments, & Computers. 2004; 36(2)DOI
  30. Plant Richard R., Hammond Nick, Whitehouse Tom. How choice of mouse may affect response timing in psychological studies. Behavior Research Methods, Instruments, & Computers. 2003; 35(2)DOI
  31. Plant Richard R., Hammond Nick, Whitehouse Tom. Toward an Experimental Timing Standards Lab: Benchmarking precision in the real world. Behavior Research Methods, Instruments, & Computers. 2002; 34(2)DOI
  32. Tommerdahl Mark, Lensch Rachel, Francisco Eric, Holden Jameson, Favorov Oleg. The Brain Gauge: a novel tool for assessing brain health. The Journal of Science and Medicine. 2019; 1(1)DOI
  33. Tommerdahl Mark, Francisco Eric, Holden Jameson, Lensch Rachel, Tommerdahl Anna, Kirsch Bryan, Dennis Robert, Favorov Oleg. An Accurate Measure of Reaction Time can Provide Objective Metrics of Concussion. The Journal of Science and Medicine. 2020; 2(2)DOI