A low-cost open source eye tracking system
One of my favorite pieces of technology is the digital camera. The camera is a close analog to the human eye, and the data it produces is analogous to vision, perhaps our most important sense. I’ve always felt there was a serious lack of great software for exploring digital cameras and machine vision, so I decided to build my own: the Jevons Camera Viewer. The project is currently competing in the 2018 Hackaday Open Hardware Challenge: https://hackaday.io/project/153293-low-cost-open-source-eye-tracking

The apparatus fits onto the user’s head as shown below. It is remarkably comfortable and can easily be worn for long periods. The USB cables can be extended to a maximum safe length of 20 ft, which gives good flexibility for experimentation.

The apparatus is constructed from two C270 webcams. The C270 offers excellent color depth and decent definition from its 1280×720 sensor. Exposure can be adjusted manually through the Camera Viewer, which is essential for making this technology work reliably.

The head-mounted camera is attached to the center of the glasses with a single zip tie. Foam-backed double-sided tape on the back of the camera keeps the circuit board off the user’s head; this is important, because the board can get hot to the touch.

The eye-tracking camera is attached to the outside of the glasses as shown; a second zip tie is used to adjust the camera’s angle.

The user’s gaze is displayed on the top screen in real time. To capture a session, simply record the screen with an app such as OBS Studio. The tracked points can also be recorded to a file for later analysis in R, MATLAB, or NumPy.

The software uses corneal reflection to map the user’s gaze onto the image generated by the front-facing camera. The matrix of calibration points is generated by looking at a target and advancing through the grid with the space bar. In this view, 100 points are used.
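The post doesn’t show the mapping code itself, but a common way to turn a calibration grid like this into a gaze mapping is a least-squares polynomial fit from eye-camera coordinates to scene-image coordinates. The sketch below is an assumption about the approach, not the project’s actual implementation; the function names (`fit_gaze_map`, `map_gaze`) are hypothetical.

```python
import numpy as np

def fit_gaze_map(eye_pts, scene_pts):
    """Fit a second-order polynomial mapping (least squares) from
    eye-camera points (N, 2) to scene-image points (N, 2).

    Hypothetical sketch: assumes each calibration sample pairs a tracked
    eye position with the known target position in the scene image.
    """
    x, y = eye_pts[:, 0], eye_pts[:, 1]
    # Design matrix with constant, linear, cross, and quadratic terms.
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, scene_pts, rcond=None)
    return coeffs  # shape (6, 2): one column per scene coordinate

def map_gaze(coeffs, eye_pt):
    """Map a single eye-camera point to scene-image coordinates."""
    x, y = eye_pt
    a = np.array([1.0, x, y, x * y, x**2, y**2])
    return a @ coeffs
```

With a 10×10 grid of calibration points (the 100 points mentioned above), the fit is heavily overdetermined, which helps average out noise in the individual pupil detections.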
The program uses linear interpolation between calibration points to enhance the effective resolution of the gaze estimate.
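For a regular calibration grid, linear interpolation between the four nearest calibration points amounts to bilinear interpolation. The sketch below shows one way this could look; it is an illustrative assumption, not the program’s actual code, and `bilinear_gaze` is a hypothetical name.

```python
import numpy as np

def bilinear_gaze(grid_x, grid_y, scene_map, ex, ey):
    """Bilinearly interpolate mapped scene coordinates between the four
    calibration points surrounding the eye-camera point (ex, ey).

    grid_x, grid_y: sorted 1-D arrays of eye-camera grid coordinates.
    scene_map: array of shape (len(grid_x), len(grid_y), 2) holding the
    scene-image position recorded at each calibration point.
    """
    # Index of the grid cell containing (ex, ey), clamped to the grid.
    i = int(np.clip(np.searchsorted(grid_x, ex) - 1, 0, len(grid_x) - 2))
    j = int(np.clip(np.searchsorted(grid_y, ey) - 1, 0, len(grid_y) - 2))
    # Fractional position inside the cell.
    tx = (ex - grid_x[i]) / (grid_x[i + 1] - grid_x[i])
    ty = (ey - grid_y[j]) / (grid_y[j + 1] - grid_y[j])
    # Weighted blend of the four corner mappings.
    p00, p10 = scene_map[i, j], scene_map[i + 1, j]
    p01, p11 = scene_map[i, j + 1], scene_map[i + 1, j + 1]
    return ((1 - tx) * (1 - ty) * p00 + tx * (1 - ty) * p10
            + (1 - tx) * ty * p01 + tx * ty * p11)
```

The result is a gaze estimate that varies smoothly between calibration points instead of snapping to the nearest one.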