
Resources

devfOLD: A Toolbox for Designing Age-Specific fNIRS Channel Placement 

Collaborators: John Richards and NIRx Team

The toolbox provides age-specific probabilistic channel-to-ROI mapping information in a MATLAB-based GUI and Excel look-up tables. This information can be used to design source-detector channel placement for study-specific age groups and cortical regions of interest (ROIs). The first generation of the toolbox provided channel-to-ROI correspondence data from age 2 weeks to 2 years at narrow age intervals, as well as for selected child (4 years and 12 years) and adult (20-24 years) ages. Since then, we have expanded the age groups to provide full coverage from 2 weeks to 30-34 years.
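The look-up logic can be illustrated with a minimal Python sketch. The table below is hypothetical (the channel names, ROI labels, and probabilities are invented for illustration and are not actual devfOLD data); the idea is simply to select channels whose probability of overlapping a target ROI exceeds a chosen threshold.

```python
# Hypothetical channel-to-ROI probability table (illustrative values only,
# NOT actual devfOLD data): probability that a source-detector channel's
# sensitivity falls within a given cortical ROI for one age group.
channel_roi_prob = {
    ("Fp1-Fp2", "prefrontal"): 0.72,
    ("F3-F7", "prefrontal"): 0.41,
    ("T7-P7", "superior temporal"): 0.65,
    ("P3-Pz", "inferior parietal"): 0.58,
}

def channels_for_roi(table, roi, threshold=0.5):
    """Return channels whose probability of overlapping `roi`
    meets or exceeds `threshold`."""
    return sorted(
        ch for (ch, r), p in table.items() if r == roi and p >= threshold
    )

print(channels_for_roi(channel_roi_prob, "prefrontal"))  # ['Fp1-Fp2']
```

In the real toolbox this selection is driven by age-specific anatomical atlases; the sketch only shows the thresholded-lookup pattern.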

When using devfOLD, please refer to and cite our publication: Fu, X., & Richards, J. E. (2021). devfOLD: A toolbox for designing age-specific fNIRS channel placement. Neurophotonics, 8(4), 045003.

The toolbox and probabilistic channel-to-ROI mapping data are accessible on GitHub for ages: 2 weeks, 1 month, 2 months, 3 months, 4.5 months, 6 months, 7.5 months, 9 months, 10.5 months, 12 months, 15 months, 18 months, 2 years, 4 years, 12 years, and 20-24 years.

Please contact Jessie Fu ([email protected]) to request data for additional child and adult age groups (3, 3.5, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18-19, 25-29, and 30-34 years).  

Mobile Eye Tracking Tools 

Collaborators: Koraly E. Pérez-Edgar, Jessica Bradshaw, Julia Yurkovic-Harding, Samuel Harding, John M. Franchak, Leigha A. MacNeill, Kelley E. Gunther, and Jeremy I. Borjon 

Mobile eye tracking (MET), or head-mounted eye tracking, is a tool for recording participant-perspective gaze in the context of active behavior. Recent technological developments in MET hardware enable researchers to capture egocentric vision as early as infancy and across the lifespan. However, challenges remain in MET data collection, processing, and analysis.  

We provide programs for inspecting MET data accuracy and precision, estimating calibration error tolerance, visualizing gaze events, and modeling within-subject moment-to-moment changes in gaze events.
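The accuracy and precision checks can be sketched with standard eye-tracking metrics: accuracy as the mean absolute angular offset of gaze samples from a known calibration-target direction, and precision as the root-mean-square of successive sample-to-sample differences. This is a minimal Python illustration of those conventional definitions, not the actual programs in the repository.

```python
import math

def accuracy_and_precision(gaze_angles, target_angle=0.0):
    """Given gaze samples (degrees of visual angle along one axis) recorded
    while the participant fixated a calibration target at `target_angle`:
    - accuracy: mean absolute offset of samples from the target (degrees)
    - precision: RMS of successive sample-to-sample differences (degrees)
    """
    offsets = [abs(a - target_angle) for a in gaze_angles]
    accuracy = sum(offsets) / len(offsets)
    diffs = [b - a for a, b in zip(gaze_angles, gaze_angles[1:])]
    precision = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return accuracy, precision

# Example: five samples during a fixation on a target at 0 degrees.
acc, prec = accuracy_and_precision([0.4, 0.6, 0.5, 0.7, 0.5])
print(round(acc, 3), round(prec, 3))  # 0.54 0.18
```

Comparing these values against the vendor-reported tolerance for a given headset is one simple way to decide whether a calibration should be redone before data collection continues.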

The programs and associated details are accessible on GitHub.