├── .gitignore
├── LICENSE
├── README.md
├── licenses
│   ├── .gitkeep
│   └── se_internal_license_for_system_tests
└── src
    ├── core
    │   ├── __init__.py
    │   └── main.py
    ├── libs
    │   ├── __init__.py
    │   └── tobii_pro_wrapper.py
    ├── methods
    │   ├── __init__.py
    │   ├── finders.py
    │   ├── gazers.py
    │   ├── getters.py
    │   └── wrappers.py
    ├── notebooks
    │   └── .gitkeep
    ├── tests
    │   └── __init__.py
    ├── utils
    │   └── TobiiTest
    │       ├── PyGazeTobiiTest.osexp
    │       ├── desktop.ini
    │       ├── tobii-legacy_opensesame_log.csv
    │       ├── tobii-legacy_output_analysed.xlsx
    │       ├── tobii-legacy_pygaze_log.tsv
    │       ├── tobii-new_opensesame_log.csv
    │       ├── tobii-new_output_analysed.xlsx
    │       └── tobii-new_pygaze_log.tsv
    └── variables
        └── __init__.py
/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 | *$py.class
5 |
6 | # C extensions
7 | *.so
8 |
9 | # Distribution / packaging
10 | .Python
11 | build/
12 | develop-eggs/
13 | dist/
14 | downloads/
15 | eggs/
16 | .eggs/
17 | lib/
18 | lib64/
19 | parts/
20 | sdist/
21 | var/
22 | wheels/
23 | *.egg-info/
24 | .installed.cfg
25 | *.egg
26 | MANIFEST
27 |
28 | # PyInstaller
29 | # Usually these files are written by a python script from a template
30 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
31 | *.manifest
32 | *.spec
33 |
34 | # Installer logs
35 | pip-log.txt
36 | pip-delete-this-directory.txt
37 |
38 | # Unit test / coverage reports
39 | htmlcov/
40 | .tox/
41 | .coverage
42 | .coverage.*
43 | .cache
44 | nosetests.xml
45 | coverage.xml
46 | *.cover
47 | .hypothesis/
48 | .pytest_cache/
49 |
50 | # Translations
51 | *.mo
52 | *.pot
53 |
54 | # Django stuff:
55 | *.log
56 | local_settings.py
57 | db.sqlite3
58 |
59 | # Flask stuff:
60 | instance/
61 | .webassets-cache
62 |
63 | # Scrapy stuff:
64 | .scrapy
65 |
66 | # Sphinx documentation
67 | docs/_build/
68 |
69 | # PyBuilder
70 | target/
71 |
72 | # Jupyter Notebook
73 | .ipynb_checkpoints
74 |
75 | # pyenv
76 | .python-version
77 |
78 | # celery beat schedule file
79 | celerybeat-schedule
80 |
81 | # SageMath parsed files
82 | *.sage.py
83 |
84 | # Environments
85 | .env
86 | .venv
87 | env/
88 | venv/
89 | ENV/
90 | env.bak/
91 | venv.bak/
92 |
93 | # Spyder project settings
94 | .spyderproject
95 | .spyproject
96 |
97 | # Rope project settings
98 | .ropeproject
99 |
100 | # mkdocs documentation
101 | /site
102 |
103 | # mypy
104 | .mypy_cache/
105 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2019 Medical Imaging Diagnosis Assistant
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Eye Tracker Setup
2 |
3 |
4 |
5 | Setup of the on-screen gaze information of the [Tobii Eye Tracker 4C](https://gaming.tobii.com/product/tobii-eye-tracker-4c/) for [usability testing](https://github.com/MIMBCD-UI/prototype-breast-screening/wiki/User-Research#user-test-evaluations-) purposes. The [Tobii Eye Tracker 4C](https://gaming.tobii.com/product/tobii-eye-tracker-4c/) aims at providing an immersive reality without a headset; nothing stands between the screen and the immersive experience, so our clinicians can work with no interference from the device. This repository includes functions for the setup of the eye-tracking routines, including: (i) calibration of the eye-tracker; (ii) finding eye positions; and (iii) validation of the eye-tracker calibration settings. It contains functions for working with the new [Tobii Pro SDK](https://www.tobiipro.com/product-listing/tobii-pro-sdk/) for [Python](http://developer.tobiipro.com/python.html), along with essential eye-tracking routines, in a `TobiiHelper` class. The repository is part of the work done by [SIPg](http://sipg.isr.tecnico.ulisboa.pt/), an [ISR-Lisboa](http://welcome.isr.tecnico.ulisboa.pt/) research group, and [M-ITI](https://www.m-iti.org/), two [R&D Units](http://larsys.pt/index.php/facilities/) of [LARSyS](http://larsys.pt/). The project also involves the collaborative effort of [INESC-ID](http://www.inesc-id.pt/). Both [ISR-Lisboa](http://welcome.isr.tecnico.ulisboa.pt/) and [INESC-ID](http://www.inesc-id.pt/) are [Associate Laboratories](https://tecnico.ulisboa.pt/en/research-and-innovation/rd/associate-laboratories/) of [IST](http://tecnico.ulisboa.pt/) from [ULisboa](https://www.ulisboa.pt/).
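As a rough illustration of routine (iii), a calibration can be validated by measuring the mean distance between each calibration target and the gaze estimates recorded while the participant fixated it. The sketch below works in normalized screen coordinates; `calibration_error` is a hypothetical helper for illustration, not a function of this repository:

```
import math

def calibration_error(targets, samples):
    # Mean Euclidean distance between each calibration target and the
    # gaze estimate recorded for it, all in normalized screen coordinates.
    pairs = zip(targets, samples)
    dists = [math.hypot(tx - gx, ty - gy) for (tx, ty), (gx, gy) in pairs]
    return sum(dists) / len(dists)

# Example: two targets, the second gaze estimate slightly offset.
err = calibration_error([(0.1, 0.1), (0.9, 0.9)],
                        [(0.1, 0.1), (0.9, 0.8)])
```

A per-point version of this error is what a calibration validation screen would typically display, so that poorly calibrated points can be redone.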
6 |
7 |
8 | ## Citing
9 |
10 | We kindly ask **scientific works and studies** that make use of the repository to cite it in their associated publications. Similarly, we ask **open-source** and **closed-source** works that make use of the repository to inform us of such use.
11 |
12 | You can cite our work using the following BibTeX entry:
13 |
14 | ```
15 | @article{CALISTO2021102607,
16 | title = {Introduction of human-centric AI assistant to aid radiologists for multimodal breast image classification},
17 | journal = {International Journal of Human-Computer Studies},
18 | volume = {150},
19 | pages = {102607},
20 | year = {2021},
21 | issn = {1071-5819},
22 | doi = {https://doi.org/10.1016/j.ijhcs.2021.102607},
23 | url = {https://www.sciencedirect.com/science/article/pii/S1071581921000252},
24 | author = {Francisco Maria Calisto and Carlos Santiago and Nuno Nunes and Jacinto C. Nascimento},
25 | keywords = {Human-computer interaction, Artificial intelligence, Healthcare, Medical imaging, Breast cancer},
26 | abstract = {In this research, we take an HCI perspective on the opportunities provided by AI techniques in medical imaging, focusing on workflow efficiency and quality, preventing errors and variability of diagnosis in Breast Cancer. Starting from a holistic understanding of the clinical context, we developed BreastScreening to support Multimodality and integrate AI techniques (using a deep neural network to support automatic and reliable classification) in the medical diagnosis workflow. This was assessed by using a significant number of clinical settings and radiologists. Here we present: i) user study findings of 45 physicians comprising nine clinical institutions; ii) list of design recommendations for visualization to support breast screening radiomics; iii) evaluation results of a proof-of-concept BreastScreening prototype for two conditions Current (without AI assistant) and AI-Assisted; and iv) evidence from the impact of a Multimodality and AI-Assisted strategy in diagnosing and severity classification of lesions. The above strategies will allow us to conclude about the behaviour of clinicians when an AI module is present in a diagnostic system. This behaviour will have a direct impact in the clinicians workflow that is thoroughly addressed herein. Our results show a high level of acceptance of AI techniques from radiologists and point to a significant reduction of cognitive workload and improvement in diagnosis execution.}
27 | }
28 | ```
29 |
30 | ## Pre-Requisites
31 |
32 | The following list shows the set of dependencies for this project. Please install and build the recommended versions on your machine.
33 |
34 | List of dependencies for this project:
35 |
36 | - [Git](https://git-scm.com/) (>= [v2.20](https://raw.githubusercontent.com/git/git/master/Documentation/RelNotes/2.20.1.txt))
37 |
38 | - [Python](https://www.python.org/) (>= [v2.7](https://docs.python.org/2/))
39 |
40 | - [Setuptools](https://pypi.org/project/setuptools/) (>= [v40.8](https://pypi.org/project/setuptools/40.8.0/))
41 |
42 | - [Tobii Research](https://pypi.org/project/tobii-research/) (>= [v1.6](https://pypi.org/project/tobii-research/1.6.1/))
43 |
44 | ### Analytical Use
45 |
46 | Tobii’s consumer eye trackers are primarily intended for personal interaction use and not for analytical purposes. Any application that stores or transfers eye tracking data must have a special license from Tobii ([Read more](https://analyticaluse.tobii.com/)). Please apply for a license [here](https://analyticaluse.tobii.com/license-application-form/).
47 |
48 | ## Instructions
49 |
50 | The instructions are as follows. We assume that you already have knowledge of [Git](https://git-scm.com/) and [GitHub](https://github.com/). If not, please follow this [support](https://guides.github.com/activities/hello-world/) information. If you need support, just open a [New issue](https://github.com/mida-project/eye-tracker-setup/issues/new).
51 |
52 | ### Clone
53 |
54 | To clone this repository, follow the guidelines below. It is as easy as that.
55 |
56 | 1.1. Please clone the repository by typing the command:
57 |
58 | ```
59 | git clone https://github.com/mida-project/eye-tracker-setup.git
60 | ```
61 |
62 | 1.2. Get inside of the repository directory:
63 |
64 | ```
65 | cd eye-tracker-setup/
66 | ```
67 |
68 | 1.3. For the installation and running of the source code, follow the next steps.
69 |
70 | ### Install
71 |
72 | The installation guidelines are as follows. Please be sure to follow them correctly.
73 |
74 | 2.1. Run the following command to install the [library](http://docs.python-requests.org/en/master/user/install/#install) using [pip](https://pypi.org/project/pip/):
75 |
76 | #### On Linux or OS X
77 |
78 | ```
79 | pip install -U pip setuptools
80 | pip install tobii-research
81 | ```
82 |
83 | #### On Windows
84 |
85 | ```
86 | python -m pip install -U pip setuptools
87 | pip install tobii-research
88 | ```
89 |
90 | 2.2. Follow the next step.
91 |
92 | ### Run
93 |
94 | The running guidelines are as follows. Please be sure to follow them correctly.
95 |
96 | 3.1. Run the sample using the following command:
97 |
98 | ```
99 | python2 src/core/main.py
100 | ```
101 |
102 | 3.2. Enjoy our source code!
103 |
104 | ### Notebooks
105 |
106 | You can also run a notebook to view some of our `models` chart plots. For this goal we use the well-known [Jupyter Notebook](http://jupyter.org/) web application. To run [Jupyter Notebook](http://jupyter.org/), just follow these steps.
107 |
108 | 4.1. Get inside our project directory:
109 |
110 | ```
111 | cd eye-tracker-setup/src/notebooks/
112 | ```
113 |
114 | 4.2. Run [Jupyter Notebook](http://jupyter.org/) application by typing:
115 |
116 | ```
117 | jupyter notebook
118 | ```
119 |
120 | > If you have any questions regarding [Jupyter Notebook](http://jupyter.org/), just follow their [Documentation](http://jupyter.org/documentation). You can also ask for help from the [Community](http://jupyter.org/community).
121 |
122 | ## Information
123 |
124 | To find out how to apply the [Upgrade Key](https://www.tobiipro.com/siteassets/tobii-pro/product-descriptions/tobii-pro-upgrade-key-product-description.pdf/?v=1.5) to a [Tobii Eye Tracker 4C](https://gaming.tobii.com/product/tobii-eye-tracker-4c/), follow the [Tobii Pro Upgrade Key – User Instructions](https://www.tobiipro.com/siteassets/tobii-pro/user-manuals/tobii-pro-upgrade-key-user-instructions.pdf/?v=1.1) document. Nevertheless, the [Tobii Pro SDK Python API Documentation](http://devtobiipro.azurewebsites.net/tobii.research/python/reference/1.6.1.22-alpha-g9a43723/index.html) page is of chief importance to this repository, as well as their [Examples](http://devtobiipro.azurewebsites.net/tobii.research/python/reference/1.6.1.22-alpha-g9a43723/examples.html) page for [Python](http://developer.tobiipro.com/python.html). For the first configurations, please follow both the [Python - Getting started](http://developer.tobiipro.com/python/python-getting-started.html) and [Python - Step-by-step guide](http://developer.tobiipro.com/python/python-step-by-step-guide.html) pages, or follow the presented steps. For any questions regarding the [Eye-Tracking topic](https://en.wikipedia.org/wiki/Eye_tracking), just follow the [StackOverflow tag](https://stackoverflow.com/questions/tagged/eye-tracking) for the purpose.
125 |
126 | ### Acknowledgements
127 |
128 | The work is also based on, and highly indebted to, the [`tobii_pro_wrapper`](https://github.com/oguayasa/tobii_pro_wrapper) repository, developed by [Olivia Guayasamin](http://collectivebehaviour.com/) ([oguayasa](https://github.com/oguayasa)), whom we would like to thank. That repository shows pretty much everything we need to connect to a Tobii eye-tracker, calibrate it, get gaze, eye, and time synchronization data from the device, and convert between the Tobii coordinate system units.
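To make the coordinate conversion concrete: Tobii's active display area (ADA) coordinates are normalized, with (0, 0) at the top-left of the screen, while PsychoPy's 'pix' units place (0, 0) at the screen center with the y-axis pointing up. The wrapper implements this as `ada2PsychoPix`; below is a minimal standalone sketch of the same mapping, with a hypothetical function name:

```
def ada_to_psycho_pix(xy, screen_px):
    # Map normalized ADA coordinates (origin top-left, y down) to
    # PsychoPy pixel coordinates (origin screen center, y up).
    w, h = screen_px
    return (xy[0] * w - w / 2.0, -(xy[1] * h - h / 2.0))

# The ADA screen center (0.5, 0.5) maps to PsychoPy's origin, and the
# ADA top-left corner (0, 0) maps to the upper-left of the screen.
center = ada_to_psycho_pix((0.5, 0.5), (1920, 1080))
top_left = ada_to_psycho_pix((0.0, 0.0), (1920, 1080))
```

Note the y-axis flip: ADA y grows downward, PsychoPy pixel y grows upward, hence the negation of the vertical term.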
129 |
130 | ### Authors
131 |
132 | - [Francisco Maria Calisto](http://www.franciscocalisto.me/) [[ResearchGate](https://www.researchgate.net/profile/Francisco_Maria_Calisto) | [GitHub](https://github.com/FMCalisto) | [Twitter](https://twitter.com/FMCalisto) | [LinkedIn](https://www.linkedin.com/in/fmcalisto/)]
133 |
134 | ## Sponsors
135 |
136 |
137 |
138 |
139 |
140 |
141 |
142 |
143 |
144 |
145 |
146 |
147 |
148 |
149 |
150 |
151 |
152 |
153 |
154 |
155 |
156 |
157 |
158 |
159 |
160 |
161 |
162 | ## Departments
163 |
164 |
165 |
166 |
167 |
168 |
169 |
170 |
171 |
172 |
173 |
174 |
175 | ## Laboratories
176 |
177 |
178 |
179 |
180 |
181 |
182 |
183 |
184 |
185 |
186 |
187 |
188 |
189 |
190 |
191 |
192 |
193 |
194 |
195 |
196 |
197 |
198 |
199 |
200 |
201 |
202 |
203 | ## Domain
204 |
205 |
206 |
207 |
208 |
209 |
210 |
211 |
212 |
213 |
214 |
215 |
--------------------------------------------------------------------------------
/licenses/.gitkeep:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mida-project/eye-tracker-setup/ade35074eeb951af96508be796ea43afd6d7e08f/licenses/.gitkeep
--------------------------------------------------------------------------------
/licenses/se_internal_license_for_system_tests:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mida-project/eye-tracker-setup/ade35074eeb951af96508be796ea43afd6d7e08f/licenses/se_internal_license_for_system_tests
--------------------------------------------------------------------------------
/src/core/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mida-project/eye-tracker-setup/ade35074eeb951af96508be796ea43afd6d7e08f/src/core/__init__.py
--------------------------------------------------------------------------------
/src/core/main.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | #coding=utf-8
3 |
4 | """
5 | main.py: When creating a Python module, it is common to make
6 | the module execute some functionality (usually contained
7 | in a main function) when run as the entry point of
8 | the program.
9 | """
10 |
11 | __author__ = "Francisco Maria Calisto"
12 | __maintainer__ = "Francisco Maria Calisto"
13 | __email__ = "francisco.calisto@tecnico.ulisboa.pt"
14 | __license__ = "MIT"
15 | __version__ = "1.0.1"
16 | __status__ = "Development"
17 | __copyright__ = "Copyright 2019, Instituto Superior Técnico (IST)"
18 | __credits__ = [
19 | "Bruno Oliveira",
20 | "Carlos Santiago",
21 | "Jacinto C. Nascimento",
22 | "Pedro Miraldo",
23 | "Nuno Nunes"
24 | ]
25 |
26 | import os
27 | import sys
28 |
29 | from os import path
30 |
31 | import tobii_research as tr
32 | import time
33 |
34 | # The current folder path.
35 | basePath = os.path.dirname(__file__)
36 |
37 | # The path to the repository "src" folder.
38 | joinRepoPath = os.path.join(basePath, '..', '..')
39 | pathRepoAbsPath = os.path.abspath(joinRepoPath)
40 | # Add the directory containing the module to
41 | # the Python path (wants absolute paths).
42 | sys.path.append(pathRepoAbsPath)
43 |
44 | # The path to the repository "src" folder.
45 | joinPath = os.path.join(basePath, '..')
46 | pathAbsPath = os.path.abspath(joinPath)
47 | # Add the directory containing the module to
48 | # the Python path (wants absolute paths).
49 | sys.path.append(pathAbsPath)
50 |
51 | # Appending variables path.
52 | varsPath = os.path.join(joinPath, 'variables')
53 | varsAbsPath = os.path.abspath(varsPath)
54 | sys.path.append(varsAbsPath)
55 | sys.path.insert(0, varsAbsPath)
56 |
57 | # Appending methods path.
58 | methodsPath = os.path.join(joinPath, 'methods')
59 | methodsAbsPath = os.path.abspath(methodsPath)
60 | sys.path.append(methodsAbsPath)
61 | sys.path.insert(0, methodsAbsPath)
62 |
63 | # Importing available methods
64 | from finders import *
65 | from gazers import *
66 | from getters import *
67 |
68 | def apply_licenses(eyetracker):
69 | # license_file_path = "licenses/se_internal_license_for_system_tests"
70 | license_file_path = os.path.join(pathRepoAbsPath, 'licenses', 'se_internal_license_for_system_tests')
71 | print(license_file_path)
72 | print("Applying license from {0}.".format(license_file_path))
73 | with open(license_file_path, "rb") as f:
74 | license = f.read()
75 | failed_licenses_applied_as_list_of_keys = eyetracker.apply_licenses([tr.LicenseKey(license)])
76 | failed_licenses_applied_as_list_of_bytes = eyetracker.apply_licenses([license])
77 | failed_licenses_applied_as_key = eyetracker.apply_licenses(tr.LicenseKey(license))
78 | failed_licenses_applied_as_bytes = eyetracker.apply_licenses(license)
79 | if len(failed_licenses_applied_as_list_of_keys) == 0:
80 | print("Successfully applied license from list of keys.")
81 | else:
82 | print("Failed to apply license from list of keys. Validation result: {0}.".
83 | format(failed_licenses_applied_as_list_of_keys[0].validation_result))
84 | if len(failed_licenses_applied_as_list_of_bytes) == 0:
85 | print("Successfully applied license from list of bytes.")
86 | else:
87 | print("Failed to apply license from list of bytes. Validation result: {0}.".
88 | format(failed_licenses_applied_as_list_of_bytes[0].validation_result))
89 | if len(failed_licenses_applied_as_key) == 0:
90 | print("Successfully applied license from single key.")
91 | else:
92 | print("Failed to apply license from single key. Validation result: {0}.".
93 | format(failed_licenses_applied_as_key[0].validation_result))
94 | if len(failed_licenses_applied_as_bytes) == 0:
95 | print("Successfully applied license from bytes object.")
96 | else:
97 | print("Failed to apply license from bytes object. Validation result: {0}.".
98 | format(failed_licenses_applied_as_bytes[0].validation_result))
99 |
100 | def main():
101 | eyetracker = find_eyetrackers_meta()
102 | apply_licenses(eyetracker)
103 | gaze_data(eyetracker)
104 | # new_eyetracker.subscribe_to(tr.EYETRACKER_GAZE_DATA, gaze_data_callback, as_dictionary=True)
105 | # time.sleep(5)
106 | # new_eyetracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA, gaze_data_callback)
107 | gaze_output_frequencies(eyetracker)
108 | get_and_set_display_area(eyetracker)
109 |
110 | if __name__ == '__main__':
111 | main()
112 |
113 | # ==================== END File ==================== #
114 |
--------------------------------------------------------------------------------
/src/libs/__init__.py:
--------------------------------------------------------------------------------
1 | from .tobii_pro_wrapper import *
2 |
--------------------------------------------------------------------------------
/src/libs/tobii_pro_wrapper.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 |
3 | # Psychopy supported Tobii controller for the new Pro SDK
4 |
5 | # Authors: Olivia Guayasamin
6 | # Date: 8/3/2017
7 |
8 | # Modifiers: Francisco Maria Calisto
9 | # Date: 25/03/2019
10 |
11 | # Requirements: Python 2.7 32 Bit (SDK required)
12 | # Tobii Pro SDK 1.0 for Python, and all dependencies
13 | # Psychopy, Psychopy.iohub, and all dependencies
14 | # numpy, scipy, and win32api
15 |
16 | # Summary: Currently provides all functionality for running a FULL CALIBRATION
17 | # ROUTINE for 5 and 9 point calibrations, and converting between Tobii
18 | # Trackbox, Tobii ADA, and Psychopy coordinate systems.
19 |
20 | # This code also contains functionality for finding/calibrating the
21 | # experimental monitor, connecting to keyboard/mouse devices, selecting and
22 | # connecting to a tobii device, getting tobii device parameters, and getting
23 | # real time gaze and eye position data from the tobii tracker.
24 |
25 | # Notes: This code is currently designed for working with a tobii eyetracker
26 | # installed on the same device as the one for running experiments (laptop set-
27 | # up with a single connected eyetracker, no external monitors, and no tobii
28 | # external processors). It should be straightforward to adapt to other
29 | # computer/monitor set-ups, but adaptation is required. Created on Windows OS.
30 | # Not guaranteed.
31 |
32 | # Please contact for questions. This will be updated as more functionality is
33 | # added.
34 |
35 | # -----Import Required Libraries-----
36 | import pyglet
37 |
38 | from psychopy import core as pcore
39 | from psychopy import monitors, visual, gui, data, event
40 | from psychopy.iohub import launchHubServer
41 |
42 | import datetime as dt
43 | import numpy as np
44 | from scipy.spatial import distance
45 |
46 | import tobii_research as tobii
47 |
48 | import collections
49 |
50 | # -----Class for working with Tobii Eyetrackers -----
51 | class TobiiHelper:
52 |
53 | def __init__(self):
54 |
55 | self.eyetracker = None
56 |
57 | self.adaCoordinates = {}
58 |
59 | self.tbCoordinates = {}
60 |
61 | self.calibration = None
62 |
63 | self.tracking = False
64 |
65 | self.win = None
66 |
67 | self.gazeData = {}
68 |
69 | self.syncData = {}
70 |
71 | self.currentOutData = {}
72 |
73 |   # ----- Functions for initializing the eyetracker and class attributes -----
74 |
75 | # find and connect to a tobii eyetracker
76 | def findTracker(self, serialString = None):
77 |
78 | # try to find all eyetrackers
79 | allTrackers = tobii.find_all_eyetrackers()
80 |
81 | # if there are no eyetrackers
82 | if len(allTrackers) < 1:
83 | raise ValueError("Cannot find any eyetrackers.")
84 |
85 | # if there is no serialString specified, use first found eyetracker
86 | if serialString is None:
87 | # use first found eyetracker
88 | eyetracker = allTrackers[0]
89 | address = eyetracker.address
90 | print("Address: " + eyetracker.address)
91 | print("Model: " + eyetracker.model)
92 | print("Name: " + eyetracker.device_name)
93 | print("Serial number: " + eyetracker.serial_number)
94 | # create eyetracker object
95 | self.eyetracker = tobii.EyeTracker(address)
96 | # if serial number is not given as a string
97 | elif not isinstance(serialString, basestring):
98 | raise TypeError("Serial number must be formatted as a string.")
99 | # if serial number is given as a string
100 | else:
101 | # get information about available eyetrackers
102 | for eyetracker in allTrackers:
103 | if eyetracker.serial_number == serialString:
104 | address = eyetracker.address
105 | print("Address: " + eyetracker.address)
106 | print("Model: " + eyetracker.model)
107 | # fine if name is empty
108 | print("Name: " + eyetracker.device_name)
109 | print("Serial number: " + eyetracker.serial_number)
110 |
111 | # create eyetracker object
112 | self.eyetracker = tobii.EyeTracker(address)
113 |
114 | # check to see that eyetracker is connected
115 | if self.eyetracker is None:
116 | print("Eyetracker did not connect. Check serial number?")
117 | else:
118 | print("Eyetracker connected successfully.")
119 |
120 |
121 |   # function for getting trackbox (tb) and active display area (ada) coordinates;
122 |   # returns coordinates in two separate dictionaries with values in mm
123 | def getTrackerSpace(self):
124 |
125 | # check to see that eyetracker is connected
126 | if self.eyetracker is None:
127 | raise ValueError("There is no eyetracker.")
128 |
129 | # get active display area information in mm as a dictionary
130 | displayArea = self.eyetracker.get_display_area()
131 | self.adaCoordinates['bottomLeft'] = displayArea.bottom_left
132 | self.adaCoordinates['bottomRight'] = displayArea.bottom_right
133 | self.adaCoordinates['topLeft'] = displayArea.top_left
134 | self.adaCoordinates['topRight'] = displayArea.top_right
135 | self.adaCoordinates['height'] = displayArea.height
136 | self.adaCoordinates['width'] = displayArea.width
137 |
138 | # get track box information in mm, return only the 2d coordinates
139 | # of the cube side closest to the eyetracker
140 | trackBox = self.eyetracker.get_track_box()
141 | self.tbCoordinates['bottomLeft'] = trackBox.front_lower_left
142 | self.tbCoordinates['bottomRight'] = trackBox.front_lower_right
143 | self.tbCoordinates['topLeft'] = trackBox.front_upper_left
144 | self.tbCoordinates['topRight'] = trackBox.front_upper_right
145 | # calculate box height and width
146 | trackBoxHeight = np.absolute(trackBox.front_lower_left[1] -
147 | trackBox.front_upper_right[1])
148 | trackBoxWidth = np.absolute(trackBox.front_lower_left[0] -
149 | trackBox.front_lower_right[0])
150 | self.tbCoordinates['height'] = trackBoxHeight
151 | self.tbCoordinates['width'] = trackBoxWidth
152 |
153 |
154 | # define and calibrate experimental monitor, set monitor dimensions
155 | def setMonitor(self, nameString = None, dimensions = None):
156 |
157 | # find all connected monitors
158 | allMonitors = monitors.getAllMonitors()
159 |
160 |     # if there are no monitors
161 | if len(allMonitors) < 1:
162 | raise ValueError("Psychopy can't find any monitors.")
163 |
164 | # if no dimensions given
165 | if dimensions is None:
166 | # use current screen dimensions
167 | platform = pyglet.window.get_platform()
168 | display = platform.get_default_display()
169 | screen = display.get_default_screen()
170 | dimensions = (screen.width, screen.height)
171 | # if dimension not given as tuple
172 | elif not isinstance(dimensions, tuple):
173 | raise TypeError("Dimensions must be given as tuple.")
174 |
175 |     # if no monitor name is defined, use the first default monitor
176 | if nameString is None:
177 | # create monitor calibration object
178 | thisMon = monitors.Monitor(allMonitors[0])
179 | print("Current monitor name is: " + allMonitors[0])
180 | # set monitor dimensions
181 | thisMon.setSizePix(dimensions)
182 | # save monitor
183 | thisMon.saveMon() # save monitor calibration
184 | self.win = thisMon
185 |     # if monitor name is not given as a string
186 | elif not isinstance(nameString, basestring):
187 | raise TypeError("Monitor name must be formatted as a string.")
188 |     # if monitor name is given as a string
189 | else:
190 | # create monitor calibration object
191 | thisMon = monitors.Monitor(nameString)
192 | print("Current monitor name is: " + nameString)
193 | # set monitor dimensions
194 | thisMon.setSizePix(dimensions)
195 | # save monitor
196 | thisMon.saveMon() # save monitor calibration
197 | self.win = thisMon
198 |
199 |
200 | # ----- Functions for starting and stopping eyetracker data collection -----
201 |
202 | # function for broadcasting real time gaze data
203 | def gazeDataCallback(self,startGazeData):
204 | self.gazeData = startGazeData
205 |
206 |
207 | # function for subscribing to real time gaze data from eyetracker
208 | def startGazeData(self):
209 |
210 | # check to see if eyetracker is there
211 | if self.eyetracker is None:
212 | raise ValueError("There is no eyetracker.")
213 |
214 | # if it is, proceed
215 | print("Subscribing to eyetracker.")
216 | self.eyetracker.subscribe_to(tobii.EYETRACKER_GAZE_DATA,
217 | self.gazeDataCallback,
218 | as_dictionary = True)
219 | self.tracking = True
220 |
221 |
222 |   # function for unsubscribing from gaze data
223 | def stopGazeData(self):
224 |
225 | # check to see if eyetracker is there
226 | if self.eyetracker is None:
227 | raise ValueError("There is no eyetracker.")
228 | # if it is, proceed
229 | print("Unsubscribing from eyetracker")
230 | self.eyetracker.unsubscribe_from(tobii.EYETRACKER_GAZE_DATA,
231 | self.gazeDataCallback)
232 | self.tracking = False
233 |
234 |
235 | # ----- Helper functions -----
236 | # function for checking tracker and computer synchronization
237 | def timeSyncCallback(self, timeSyncData):
238 | self.syncData = timeSyncData
239 |
240 |
241 | # broadcast synchronization data
242 | def startSyncData(self):
243 | #check that eyetracker is connected
244 | if self.eyetracker is None:
245 | raise ValueError('Eyetracker is not connected.')
246 |
247 |     # if it is, proceed
248 | print("Subscribing to time synchronization data")
249 | self.eyetracker.subscribe_to(tobii.EYETRACKER_TIME_SYNCHRONIZATION_DATA,
250 | self.timeSyncCallback,
251 | as_dictionary=True)
252 | print("We just did it...")
253 |
254 |
255 | # stop broadcasting synchronization data
256 | def stopSyncData(self):
257 | self.eyetracker.unsubscribe_from(tobii.EYETRACKER_TIME_SYNCHRONIZATION_DATA,
258 | self.timeSyncCallback)
259 | print("Unsubscribed from time synchronization data.")
260 |
261 | # function for converting positions from trackbox coordinate system (mm) to
262 | # normalized active display area coordinates
263 | def tb2Ada(self, xyCoor = tuple):
264 |
265 | # check argument values
266 | if xyCoor is None:
267 | raise ValueError("No coordinate values have been specified.")
268 | elif not isinstance(xyCoor, tuple):
269 | raise TypeError("XY coordinates must be given as tuple.")
270 |     elif isinstance(xyCoor, tuple) and len(xyCoor) != 2:
271 | raise ValueError("Wrong number of coordinate dimensions")
272 | # check tracker box and ada coordinates
273 | if self.tbCoordinates is None or self.adaCoordinates is None:
274 | raise ValueError("Missing trackbox coordinates. \n" +\
275 | "Try running getTrackerSpace()")
276 |
277 | # get tb and ada values from eyetracker
278 | tbDict = self.tbCoordinates
279 | tbLowLeft = (tbDict.get('bottomLeft')[0],
280 | tbDict.get('bottomLeft')[1])
281 | adaDict = self.adaCoordinates
282 | adaLowLeft = ((adaDict.get('width')/-2),
283 | (adaDict.get('height')/-2))
284 |
285 | # create ratios for x and y coordinates
286 | yRatio = tbLowLeft[1]/adaLowLeft[1]
287 | xRatio = tbLowLeft[0]/adaLowLeft[0]
288 |
289 | # convert and return coordinates
290 | adaNorm = ((xyCoor[0] * xRatio), (xyCoor[1] * yRatio))
291 | return adaNorm
292 |
293 |
294 |   # function for converting trackbox normalized coordinates to normalized
295 |   # coordinates based on the psychopy window
296 | def tb2PsychoNorm(self, xyCoor = tuple):
297 |
298 | # check argument values
299 | if xyCoor is None:
300 | raise ValueError("No coordinate values have been specified.")
301 | elif not isinstance(xyCoor, tuple):
302 | raise TypeError("XY coordinates must be given as tuple.")
303 |     elif isinstance(xyCoor, tuple) and len(xyCoor) != 2:
304 | raise ValueError("Wrong number of coordinate dimensions")
305 |
306 | # convert track box coordinates to ada coordinates
307 | adaCoors = self.tb2Ada(xyCoor)
308 | # correct for psychopy window coordinates
309 | centerScale = self.tb2Ada((1, 1))
310 | centerShift = ((centerScale[0] / 2), (centerScale[1] / 2))
311 | psychoNorm = (adaCoors[0] - centerShift[0],
312 | adaCoors[1] - centerShift[1])
313 | # return coordinates in psychowin 'norm' units
314 | return psychoNorm
315 |
316 |
317 | # function for converting from Tobii's ADA coordinate system in normalized
318 | # coordinates, where (0,0) is the upper left corner, to psychopy window
319 | # coordinates in pix, where (0,0) is at the center of the psychopy window.
320 | def ada2PsychoPix(self, xyCoor = None):
321 |
322 | # check argument values
323 | if xyCoor is None:
324 | raise ValueError("No coordinate values have been specified.")
325 | elif not isinstance(xyCoor, tuple):
326 | raise TypeError("XY coordinates must be given as tuple.")
327 | elif len(xyCoor) != 2:
328 | raise ValueError("Wrong number of coordinate dimensions")
329 |
330 | if np.isnan(xyCoor[0]) and np.isnan(xyCoor[1]):
331 | psychoPix = (np.nan, np.nan)
332 | return psychoPix
333 |
334 | # convert to pixels and correct for psychopy window coordinates
335 | monHW = (self.win.getSizePix()[0],
336 | self.win.getSizePix()[1])
337 | wShift, hShift = monHW[0] / 2 , monHW[1] / 2
338 | psychoPix = ((((xyCoor[0]* monHW[0]) - wShift)),
339 | (((xyCoor[1] * monHW[1]) - hShift) * -1))
340 | # return coordinates in psychowin 'pix' units
341 | return psychoPix
342 |
343 |
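The ada2PsychoPix() math above scales the normalized coordinate by the monitor size, shifts the origin from the upper-left corner to the screen center, and flips the y axis (ADA y grows downward, PsychoPy y grows upward). A minimal self-contained sketch (the function name is illustrative, not the wrapper's API):

```python
def ada2psychopix_sketch(xy, mon_size):
    """Map normalized ADA coords ((0,0) = upper left) to PsychoPy 'pix'
    units ((0,0) = screen center, y pointing up)."""
    w, h = mon_size
    x = xy[0] * w - w / 2.0           # shift origin to horizontal center
    y = -(xy[1] * h - h / 2.0)        # shift to vertical center and flip y
    return (x, y)
```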
344 | # function for converting from Tobii's active display coordinate system in
345 | # normalized coordinates, where (0,0) is the upper left corner, to monitor
346 | # coordinates in pix, where (0,0) is the upper left corner
347 | def ada2MonPix(self, xyCoor = None):
348 |
349 | # check argument values
350 | if xyCoor is None:
351 | raise ValueError("No coordinate values have been specified.")
352 | elif not isinstance(xyCoor, tuple):
353 | raise TypeError("XY coordinates must be given as tuple.")
355 | elif len(xyCoor) != 2:
355 | raise ValueError("Wrong number of coordinate dimensions")
356 |
357 | if np.isnan(xyCoor[0]) and np.isnan(xyCoor[1]):
358 | monPix = (np.nan, np.nan)
359 | return monPix
360 |
361 | # convert so point of gaze on monitor is accurate
362 | monPix = (int(xyCoor[0] * self.win.getSizePix()[0]),
363 | int(xyCoor[1] * self.win.getSizePix()[1]))
364 | return monPix
365 |
366 |
367 | # ----- Functions for collecting eye and gaze data -----
368 |
369 | # function for collecting gaze coordinates in psychopy pixel coordinate
370 | # system. currently written to return the average (x, y) position of both
371 | # eyes, but can be easily rewritten to return data from one or both eyes
372 | def getAvgGazePos(self):
373 |
374 | # check to see if the eyetracker is connected and turned on
375 | if self.eyetracker is None:
376 | raise ValueError("There is no eyetracker.")
377 | if self.tracking is False:
378 | raise ValueError("The eyetracker is not turned on.")
379 |
380 | # while tracking
381 | while True:
382 | # access gaze data dictionary to get gaze position tuples
383 | lGazeXYZ = self.gazeData['left_gaze_point_on_display_area']
384 | rGazeXYZ = self.gazeData['right_gaze_point_on_display_area']
385 | # get 2D gaze positions for left and right eye
386 | xs = (lGazeXYZ[0], rGazeXYZ[0])
387 | ys = (lGazeXYZ[1], rGazeXYZ[1])
388 |
389 | # if both eyes returned valid data on each axis
390 | if all([x != -1.0 for x in xs]) and all([y != -1.0 for y in ys]):
391 | # take x and y averages
392 | avgGazePos = np.nanmean(xs), np.nanmean(ys)
393 | else:
394 | # or if no data, hide points by showing off screen
395 | avgGazePos = (np.nan, np.nan)
396 | return self.ada2PsychoPix(avgGazePos)
397 |
398 |
399 | # function for finding the avg 3d position of subject's eyes, so that they
400 | # can be drawn in the virtual track box before calibration. The x and y
401 | # coordinates are returned in normalized "tobii track box" units.
402 | def trackboxEyePos(self):
403 |
404 | # check to see if the eyetracker is connected and turned on
405 | if self.eyetracker is None:
406 | raise ValueError("There is no eyetracker.")
407 | if self.tracking is False:
408 | raise ValueError("The eyetracker is not turned on.")
409 |
410 | # while tracking
411 | while True:
412 | # access gaze data dictionary to get eye position tuples,
413 | # in trackbox coordinate system
414 | lTbXYZ = self.gazeData['left_gaze_origin_in_trackbox_coordinate_system']
415 | rTbXYZ = self.gazeData['right_gaze_origin_in_trackbox_coordinate_system']
416 |
417 | # left eye validity
418 | lVal = self.gazeData['left_gaze_origin_validity']
419 | # right eye validity
420 | rVal = self.gazeData['right_gaze_origin_validity']
421 |
422 | # if left eye is found by the eyetracker
423 | if lVal == 1:
424 | # update the left eye positions if the values are reasonable
425 | # scale left eye position so that it fits in track box
426 | leftTbPos = (-self.tb2PsychoNorm((lTbXYZ[0],
427 | lTbXYZ[1]))[0] * 1.7,
428 | self.tb2PsychoNorm((lTbXYZ[0],
429 | lTbXYZ[1]))[1])
430 | else:
431 | # hide by drawing in the corner
432 | leftTbPos = [0.99, 0.99]
433 |
434 | # if right eye is found by the eyetracker
435 | if rVal == 1:
436 | # update the right eye positions if the values are reasonable
437 | # scale right eye position so that it fits in track box
438 | rightTbPos = (-self.tb2PsychoNorm((rTbXYZ[0], rTbXYZ[1]))[0] * 1.7,
439 | self.tb2PsychoNorm((rTbXYZ[0],
440 | rTbXYZ[1]))[1])
441 | else:
442 | # hide by drawing in the corner
443 | rightTbPos = [0.99, 0.99]
444 | # return values for position in track box
445 | return leftTbPos, rightTbPos
446 |
447 |
448 | # x, y, and z dimensions are given in mm from the tracker origin, gives the
449 | # average 3d position of both eyes, but can be easily rewritten to yield
450 | # the position of each eye separately
451 | def getAvgEyePos(self):
452 |
453 | # check to see if the eyetracker is connected and turned on
454 | if self.eyetracker is None:
455 | raise ValueError("There is no eyetracker.")
456 | if self.tracking is False:
457 | raise ValueError("The eyetracker is not turned on.")
458 |
459 | # while tracking
460 | while True:
461 | # access gaze data dictionary to get eye position tuples, given in
462 | # mm from the eyetracker origin
463 | lOriginXYZ = self.gazeData['left_gaze_origin_in_user_coordinate_system']
464 | rOriginXYZ = self.gazeData['right_gaze_origin_in_user_coordinate_system']
465 |
466 | # create arrays with positions of both eyes on x, y, and z axes
467 | xs = (lOriginXYZ[0],rOriginXYZ[0])
468 | ys = (lOriginXYZ[1],rOriginXYZ[1])
469 | zs = (lOriginXYZ[2],rOriginXYZ[2])
470 |
471 | # if every axis has data from at least one eye
472 | if not np.isnan(xs).all() and not np.isnan(ys).all() and not np.isnan(zs).all():
473 | # update the distance if the values are reasonable
474 | avgEyePos = (np.nanmean(xs), np.nanmean(ys), np.nanmean(zs))
475 | else:
476 | # otherwise set to zero
477 | avgEyePos = (0, 0, 0)
478 | # return average eye position in mm
479 | return avgEyePos
480 |
481 |
482 | # get average distance of the eyes from the tracker origin, given in cm
483 | def getAvgEyeDist(self):
484 |
485 | # check to see if the eyetracker is connected and turned on
486 | if self.eyetracker is None:
487 | raise ValueError("There is no eyetracker.")
488 | if self.tracking is False:
489 | raise ValueError("The eyetracker is not turned on.")
490 |
491 | # while tracking
492 | while True:
493 | # get eye positions
494 | eyeCoors = self.getAvgEyePos()
495 |
496 | # if eyes were found (getAvgEyePos returns zeros when they are not)
497 | if eyeCoors != (0, 0, 0):
498 | # calculate the euclidean distance of eyes from tracker origin
499 | avgEyeDist = distance.euclidean((eyeCoors[0]/10,
500 | eyeCoors[1]/10,
501 | eyeCoors[2]/10), (0, 0, 0))
502 | else: # if eyes were not found, return zero values
503 | avgEyeDist = 0
504 | # return distance value in cm
505 | return avgEyeDist
506 |
507 |
508 | # get average size of pupils in mm, can easily be rewritten to return
509 | # pupil size values for both eyes
510 | def getPupilSize(self):
511 |
512 | # check to see if the eyetracker is connected and turned on
513 | if self.eyetracker is None:
514 | raise ValueError("There is no eyetracker.")
515 | if self.tracking is False:
516 | raise ValueError("The eyetracker is not turned on.")
517 |
518 | # while tracking
519 | while True:
520 | lPup = self.gazeData['left_pupil_diameter']
521 | rPup = self.gazeData['right_pupil_diameter']
522 | pupSizes = (lPup, rPup)
523 |
524 | # if pupils were found
525 | if lPup != -1 and rPup != -1:
526 | avgPupSize = np.nanmean(pupSizes)
527 | else: # otherwise return zero
528 | avgPupSize = 0.0
529 |
530 | # return pupil size
531 | return avgPupSize
532 |
533 |
534 | # check the validities of right and left eyes, returns as a tuple of
535 | # true/false values
536 | def checkEyeValidities(self):
537 |
538 | # check to see if the eyetracker is connected and turned on
539 | if self.eyetracker is None:
540 | raise ValueError("There is no eyetracker.")
541 | if self.tracking is False:
542 | raise ValueError("The eyetracker is not turned on.")
543 |
544 | # while tracking
545 | while True:
546 | # get validity values
547 | lVal = self.gazeData['left_gaze_origin_validity']
548 | rVal = self.gazeData['right_gaze_origin_validity']
549 | # default validity value
550 | validities = 0 # neither eye is valid
551 |
552 | # if both eyes are valid, return 3
553 | if lVal == 1 and rVal == 1:
554 | validities = 3
555 | # if just left eye is valid, return 1
556 | elif lVal == 1 and rVal == 0:
557 | validities = 1
558 | # if just right eye is valid, return 2
559 | elif lVal == 0 and rVal == 1 :
560 | validities = 2
561 |
562 | # return validity values
563 | return validities
564 |
565 |
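The validity codes returned by checkEyeValidities() form a two-bit encoding: bit 0 for the left eye, bit 1 for the right, giving 0 (neither), 1 (left only), 2 (right only), and 3 (both). A minimal standalone sketch of the same encoding (the helper name is hypothetical, not part of the wrapper):

```python
def encode_validities(left_valid, right_valid):
    """Pack per-eye validity flags into the wrapper's 0-3 code:
    0 = neither eye, 1 = left only, 2 = right only, 3 = both."""
    code = 0
    if left_valid:
        code += 1   # bit 0: left eye
    if right_valid:
        code += 2   # bit 1: right eye
    return code
```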
566 | # ----- Functions for running calibration -----
567 |
568 | # function for drawing representation of the eyes in virtual trackbox
569 | def drawEyePositions(self, psychoWin):
570 |
571 | # check that psychopy window exists
572 | if psychoWin is None:
573 | raise ValueError("There is no psychopy window available. " +\
574 | "Try calling runTrackbox() instead.")
575 |
576 | # Set default colors
577 | correctColor = [-1.0, 1.0, -1.0]
578 | mediumColor = [1.0, 1.0, 0.0]
579 | wrongColor = [1.0, -1.0, -1.0]
580 |
581 | # rectangle for viewing eyes
582 | rectScale = self.tb2Ada((1, 1))
583 | eyeArea = visual.Rect(psychoWin,
584 | fillColor = [0.0, 0.0, 0.0],
585 | lineColor = [0.0, 0.0, 0.0],
586 | pos = [0.0, 0.0],
587 | units = 'norm',
588 | lineWidth = 3,
589 | width = rectScale[0],
590 | height = rectScale[1])
591 | # Make stimuli for the left and right eye
592 | leftStim = visual.Circle(psychoWin,
593 | fillColor = eyeArea.fillColor,
594 | units = 'norm',
595 | radius = 0.07)
596 | rightStim = visual.Circle(psychoWin,
597 | fillColor = eyeArea.fillColor,
598 | units = 'norm',
599 | radius = 0.07)
600 | # Make a dummy message
601 | findmsg = visual.TextStim(psychoWin,
602 | text = " ",
603 | color = [1.0, 1.0, 1.0],
604 | units = 'norm',
605 | pos = [0.0, -0.65],
606 | height = 0.07)
607 |
608 | # while tracking
609 | while True:
610 | # find and update eye positions
611 | leftStim.pos, rightStim.pos = self.trackboxEyePos()
612 | eyeDist = self.getAvgEyeDist()
613 |
614 | # change color depending on distance
615 | if eyeDist >= 55 and eyeDist <= 75:
616 | # correct distance
617 | leftStim.fillColor, leftStim.lineColor = correctColor, correctColor
618 | rightStim.fillColor, rightStim.lineColor = correctColor, correctColor
619 | elif eyeDist <= 54 and eyeDist >= 45 or eyeDist >= 76 and eyeDist <= 85:
620 | leftStim.fillColor, leftStim.lineColor = mediumColor, mediumColor
621 | rightStim.fillColor, rightStim.lineColor = mediumColor, mediumColor
622 | else:
623 | # not really correct
624 | leftStim.fillColor, leftStim.lineColor = wrongColor, wrongColor
625 | rightStim.fillColor, rightStim.lineColor = wrongColor, wrongColor
626 |
627 | # if left eye is not found, don't display eye
628 | if leftStim.pos[0] == 0.99:
629 | leftStim.fillColor = psychoWin.color # make the same color as bkg
630 | leftStim.lineColor = psychoWin.color
631 |
632 | # if right eye is not found, don't display eye
633 | if rightStim.pos[0] == 0.99:
634 | rightStim.fillColor = psychoWin.color # make same color as bkg
635 | rightStim.lineColor = psychoWin.color
636 |
637 | # give distance feedback
638 | findmsg.text = "You're currently " + \
639 | str(int(eyeDist)) + \
640 | (" cm away from the screen.\n"
641 | "Press 'c' to calibrate or 'q' to abort.")
642 |
643 | # update stimuli in window
644 | eyeArea.draw()
645 | leftStim.draw()
646 | rightStim.draw()
647 | findmsg.draw()
648 | psychoWin.flip()
649 |
650 | # depending on response, either abort script or continue to calibration
651 | if event.getKeys(keyList=['q']):
652 | self.stopGazeData()
653 | psychoWin.close()
654 | pcore.quit()
655 | raise KeyboardInterrupt("You aborted the script manually.")
656 | elif event.getKeys(keyList=['c']):
657 | print("Proceeding to calibration.")
658 | self.stopGazeData()
659 | psychoWin.flip()
660 | return
661 |
662 | # clear events not accessed this iteration
663 | event.clearEvents(eventType='keyboard')
664 |
665 |
666 | # function for running validation routine post calibration to check
667 | # calibration precision and accuracy
668 | def runValidation(self, pointDict = None):
669 |
670 | # check the values of the point dictionary
671 | if pointDict is None:
672 | print('pointDict has no value. Using 5 point default.')
673 | pointList = [('1',(0.1, 0.1)), ('2',(0.9, 0.1)), ('3',(0.5, 0.5)),
674 | ('4',(0.1, 0.9)), ('5',(0.9, 0.9))]
675 | pointDict = collections.OrderedDict(pointList)
676 | if not isinstance(pointDict, dict):
677 | raise TypeError('pointDict must be a dictionary with number ' +\
678 | 'keys and coordinate values.')
679 | # check window attribute
680 | if self.win is None:
681 | raise ValueError('No experimental monitor has been specified.\n' +\
682 | 'Try running setMonitor().')
683 | # start eyetracker
684 | self.startGazeData()
685 | # let it warm up briefly
686 | pcore.wait(0.5)
687 |
688 | # get points from dictionary
689 | curPoints = pointDict.values()
690 |
691 | # convert points from normalized ada units to psychopy pix
692 | pointPositions = [self.ada2PsychoPix(x) for x in curPoints]
693 |
694 | # window stimuli
695 | valWin = visual.Window(size = [self.win.getSizePix()[0],
696 | self.win.getSizePix()[1]],
697 | pos = [0, 0],
698 | units = 'pix',
699 | fullscr = True,
700 | allowGUI = True,
701 | monitor = self.win,
702 | winType = 'pyglet',
703 | color = [0.8, 0.8, 0.8])
704 | # stimuli for showing point of gaze
705 | gazeStim = visual.Circle(valWin,
706 | radius = 50,
707 | lineColor = [1.0, 0.95, 0.0], # yellow circle
708 | fillColor = [1.0, 1.0, 0.55], # light interior
709 | lineWidth = 40,
710 | units = 'pix')
711 | # Make a dummy message
712 | valMsg = visual.TextStim(valWin,
713 | text = 'Wait for the experimenter.',
714 | color = [0.4, 0.4, 0.4], # grey
715 | units = 'norm',
716 | pos = [0.0, -0.5],
717 | height = 0.07)
718 | # Stimuli for all validation points
719 | valPoints = visual.Circle(valWin,
720 | units = "pix",
721 | radius = 20,
722 | lineColor = [1.0, -1.0, -1.0], # red
723 | fillColor = [1.0, -1.0, -1.0]) # red
724 |
725 | # create array for smoothing gaze position
726 | gazePositions = np.array([0.0, 0.0])
727 | maxLength = 6
728 |
729 | # while tracking
730 | while True:
731 |
732 | # smooth gaze data with moving window
733 | gazePositions = np.vstack((gazePositions,
734 | np.array(self.getAvgGazePos())))
735 | curPos = np.nanmean(gazePositions, axis = 0)
736 | # remove previous position values
737 | if len(gazePositions) == maxLength:
738 | gazePositions = np.delete(gazePositions, 0, axis = 0)
739 |
740 | # update stimuli in window and draw
741 | drawStim = self.ada2PsychoPix(tuple(curPos))
742 |
743 | # draw gaze position only if found
744 | if drawStim[0] != self.win.getSizePix()[0]:
745 | gazeStim.pos = drawStim
746 | gazeStim.draw()
747 |
748 | # points
749 | for point in pointPositions:
750 | valPoints.pos = point
751 | valPoints.draw()
752 |
753 | # text
754 | valMsg.draw()
755 | valWin.flip()
756 |
757 | # depending on response, either abort script or continue to calibration
758 | if event.getKeys(keyList=['q']):
759 | valWin.close()
760 | self.stopGazeData()
761 | pcore.quit()
762 | raise KeyboardInterrupt("You aborted the script manually.")
763 | elif event.getKeys(keyList=['c']):
764 | valWin.close()
765 | print("Exiting calibration validation.")
766 | self.stopGazeData()
767 | return
768 |
769 | # clear events not accessed this iteration
770 | event.clearEvents(eventType='keyboard')
771 |
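The validation loop above smooths the gaze cursor with a short moving window: each new sample is stacked onto an array, a NaN-aware mean is taken, and the oldest sample is dropped once the window is full. A self-contained sketch of the same idea using a bounded deque (names are illustrative, not the wrapper's):

```python
import numpy as np
from collections import deque

def smooth_gaze(samples, window=6):
    """Running NaN-aware mean over the last `window` (x, y) samples,
    mirroring the moving-window smoothing used in runValidation()."""
    buf = deque(maxlen=window)   # old samples fall off automatically
    out = []
    for s in samples:
        buf.append(s)
        # nanmean ignores dropped-track (NaN) samples inside the window
        out.append(tuple(np.nanmean(np.array(buf, dtype=float), axis=0)))
    return out
```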
772 |
773 | # function for getting the average left and right gaze position coordinates
774 | # for each calibration point in psychopy pix units
775 | def calculateCalibration(self, calibResult):
776 |
777 | # check the values of the point dictionary
778 | if calibResult is None:
779 | raise ValueError('No argument passed for calibResult')
780 |
781 | # create an empty list to hold values
782 | calibDrawCoor = []
783 |
784 | # iterate through calibration points
785 | for i in range(len(calibResult.calibration_points)):
786 | # current point
787 | curPoint = calibResult.calibration_points[i]
788 | pointPosition = curPoint.position_on_display_area # point position
789 | pointSamples = curPoint.calibration_samples # samples at point
790 | # empty arrays for holding left and right eye gaze coordinates
791 | leftOutput = np.zeros((len(pointSamples), 2))
792 | rightOutput = np.zeros((len(pointSamples), 2))
793 |
794 | # find left and right gaze coordinates for all samples in point
795 | for j in range(len(pointSamples)):
796 | curSample = pointSamples[j]
797 | leftEye = curSample.left_eye
798 | rightEye = curSample.right_eye
799 | leftOutput[j] = leftEye.position_on_display_area
800 | rightOutput[j] = rightEye.position_on_display_area
801 |
802 | # get average x and y coordinates using all samples in point
803 | lXY = tuple(np.mean(leftOutput, axis = 0))
804 | rXY = tuple(np.mean(rightOutput, axis = 0))
805 | point = tuple((pointPosition[0], pointPosition[1]))
806 | # put current calibration point coordinates , l and r eye coordinates
807 | # into list, and convert to psychopy window coordinates in pix
808 | newList = [self.ada2PsychoPix(point), self.ada2PsychoPix(lXY),
809 | self.ada2PsychoPix(rXY), pointPosition]
810 | calibDrawCoor.insert(i, newList)
811 |
812 | # the calibration result always includes a spurious point at (0,0) at
813 | # index 0, so remove it here
814 | calibDrawCoor.pop(0)
815 | # return as list
816 | return(calibDrawCoor)
817 |
818 |
819 | # function for drawing the results of the calibration
820 | def drawCalibrationResults(self, calibResult = None, calibWin = None, curDict = None):
821 |
822 | # check argument values
823 | if self.calibration is None:
824 | raise ValueError('No calibration object exists.')
825 | # check values of calibration result
826 | if calibResult is None:
827 | raise ValueError('No calibration result object given.')
828 | # check the values of the point dictionary
829 | if curDict is None:
830 | raise ValueError('No dictionary object given.')
831 | elif not isinstance(curDict, dict):
832 | raise TypeError('curDict must be a dictionary with number \n' +\
833 | 'keys and coordinate values.')
834 | # check value of calibration window
835 | if calibWin is None:
836 | raise ValueError('No psychopy window object given.')
837 |
838 | # get gaze position results
839 | points2Draw = self.calculateCalibration(calibResult)
840 |
841 | # create stimuli objects for drawing
842 | # outlined empty circle object for showing calibration point
843 | calibPoint = visual.Circle(calibWin,
844 | radius = 50,
845 | lineColor = [1.0, 1.0, 1.0], # white
846 | lineWidth = 10,
847 | fillColor = calibWin.color,
848 | units = 'pix',
849 | pos = (0.0, 0.0))
850 | # line object for showing right eye gaze position during calibration
851 | rightEyeLine = visual.Line(calibWin,
852 | units ='pix',
853 | lineColor ='red',
854 | lineWidth = 20,
855 | start = (0.0, 0.0),
856 | end = (0.0, 0.0))
857 | # line object for showing left eye gaze position during calibration
858 | leftEyeLine = visual.Line(calibWin,
859 | units ='pix',
860 | lineColor ='yellow',
861 | lineWidth = 20,
862 | start = (0.0, 0.0),
863 | end = (0.0, 0.0))
864 | # number for identifying point in dictionary
865 | pointText = visual.TextStim(calibWin,
866 | text = " ",
867 | color = [0.8, 0.8, 0.8], # lighter than bkg
868 | units = 'pix',
869 | pos = [0.0, 0.0],
870 | height = 60)
871 | # Make a dummy message
872 | checkMsg = visual.TextStim(calibWin,
873 | text = 'Wait for the experimenter.',
874 | color = [1.0, 1.0, 1.0],
875 | units = 'norm',
876 | pos = [0.0, -0.5],
877 | height = 0.07)
878 |
879 | # make empty lists for holding points to be recalibrated
880 | holdRedoDict = []
881 | holdColorPoints = []
882 |
883 | # clear events not accessed this iteration
884 | event.clearEvents(eventType='keyboard')
885 |
886 | # draw and update screen
887 | while True:
888 |
889 | # iterate through calibration points and draw
890 | for i in range(len(points2Draw)):
891 | # update point and calibration results for both eyes
892 | point = points2Draw[i]
893 | pointPos = point[3]
894 | pointKey = 0
895 |
896 | # update text
897 | for key, point in curDict.items():
898 | if point == pointPos:
899 | pointText.text = key
900 | pointKey = key
901 |
902 | # if current point is selected for recalibrate, make it noticeable
903 | if int(pointKey) in holdColorPoints:
904 | calibPoint.lineColor = [-1.0, 1.0, -1.0] # green circle
905 | else:
906 | calibPoint.lineColor = [1.0, 1.0, 1.0] # no visible change
907 |
908 | # update point and calibration results for both eyes
909 | point = points2Draw[i]
910 | startCoor, leftCoor, rightCoor = point[0], point[1], point[2]
911 | # update positions and draw on window
912 | calibPoint.pos = startCoor # calibration point
913 | leftEyeLine.start = startCoor # left eye
914 | leftEyeLine.end = leftCoor
915 | rightEyeLine.start = startCoor # right eye
916 | rightEyeLine.end = rightCoor
917 | pointText.pos = startCoor # point text
918 |
919 | # update stimuli in window
920 | calibPoint.draw() # has to come first or else will cover other
921 | # stim
922 | pointText.draw()
923 | leftEyeLine.draw()
924 | rightEyeLine.draw()
925 | checkMsg.draw()
926 |
927 | # show points and lines on window
928 | calibWin.flip()
929 |
930 | # determine problem points
931 | # list of acceptable key input !!IF PRESSED KEYS ARE NOT IN KEYLIST, KEYBOARD EVENT MAY CRASH!!
932 | pressedKeys = event.getKeys(keyList = ['c', 'q', '1', '2', '3', '4',
933 | '5', '6', '7', '8', '9'])
934 |
935 | # depending on response, either...
936 | # abort script
937 | for key in pressedKeys:
938 | if key in ['q']:
939 | calibWin.close()
940 | self.calibration.leave_calibration_mode()
941 | pcore.quit()
942 | raise KeyboardInterrupt("You aborted the script manually.")
943 |
944 | # else if recalibration point is requested
945 | elif key in curDict.keys():
946 | # iterate through each of these presses
947 | for entry in curDict.items():
948 | # if the key press is the same as the current dictionary key
949 | if entry[0] == key:
950 | # append that dictionary entry into a holding dictionary
951 | holdRedoDict.append(entry)
952 | # append integer version to a holding list
953 | holdColorPoints.append(int(key))
954 |
955 | # continue with calibration procedure
956 | elif key in ['c']:
957 | print("Finished checking. Resuming calibration.")
958 | checkMsg.pos = (0.0, 0.0)
959 | checkMsg.text = ("Finished checking. Resuming calibration.")
960 | checkMsg.draw()
961 | calibWin.flip()
962 |
963 | # return dictionary of points to be recalibrated
964 | redoDict = collections.OrderedDict([]) # empty dictionary for holding unique values
965 | # dont put repeats in resulting dictionary
966 | tempDict = collections.OrderedDict(holdRedoDict)
967 | for keys in tempDict.keys():
968 | if keys not in redoDict.keys():
969 | redoDict[keys] = tempDict.get(keys)
970 |
971 | # return dictionary
972 | return redoDict
973 |
974 | # clear events not accessed this iteration
975 | event.clearEvents(eventType='keyboard')
976 |
977 |
978 | # function for drawing calibration points, collecting and applying
979 | # calibration data
980 | def getCalibrationData(self, calibWin, pointList = None):
981 |
982 | # check argument values
983 | if self.calibration is None:
984 | raise ValueError('No calibration object exists.\n' +\
985 | 'Try running runFullCalibration()')
986 | # check value of calibration window
987 | if calibWin is None:
988 | raise ValueError('No psychopy window object given')
989 | # check the values of the point dictionary
990 | if pointList is None:
991 | raise ValueError('No list object given for pointList.')
992 | elif not isinstance(pointList, list):
993 | raise TypeError('pointList must be a list of coordinate tuples.')
994 |
995 | # defaults
996 | pointSmallRadius = 5.0 # point radius
997 | pointLargeRadius = pointSmallRadius * 10.0
998 | moveFrames = 50 # number of frames to draw between points
999 | startPoint = (0.90, 0.90) # starter point for animation
1000 |
1001 | # calibration point visual object
1002 | calibPoint = visual.Circle(calibWin,
1003 | radius = pointLargeRadius,
1004 | lineColor = [1.0, -1.0, -1.0], # red
1005 | fillColor = [1.0, -1.0, -1.0],
1006 | units = 'pix')
1007 |
1008 | # draw animation for each point
1009 | # converting psychopy window coordinate units from normal to px
1010 | for i in range(len(pointList)):
1011 |
1012 | # if first point draw starting point
1013 | if i == 0:
1014 | firstPoint = [startPoint[0], startPoint[1]]
1015 | secondPoint = [pointList[i][0], pointList[i][1]]
1016 | else:
1017 | firstPoint = [pointList[i - 1][0], pointList[i - 1][1]]
1018 | secondPoint = [pointList[i][0], pointList[i][1]]
1019 |
1020 | # draw and move dot
1021 | # step size for dot movement is new - old divided by frames
1022 | pointStep = [((secondPoint[0] - firstPoint[0]) / moveFrames),
1023 | ((secondPoint[1] - firstPoint[1]) / moveFrames)]
1024 |
1025 | # Move the point in position (smooth pursuit)
1026 | for frame in range(moveFrames - 1):
1027 | firstPoint[0] += pointStep[0]
1028 | firstPoint[1] += pointStep[1]
1029 | # draw & flip
1030 | calibPoint.pos = self.ada2PsychoPix(tuple(firstPoint))
1031 | calibPoint.draw()
1032 | calibWin.flip()
1033 | # wait to let eyes settle
1034 | pcore.wait(0.5)
1035 |
1036 | # allow the eye to focus before beginning calibration
1037 | # point size change step
1038 | radiusStep = ((pointLargeRadius - pointSmallRadius) / moveFrames)
1039 |
1040 | # Shrink the outer point (gaze fixation) to encourage focusing
1041 | for frame in range(moveFrames):
1042 | pointLargeRadius -= radiusStep
1043 | calibPoint.radius = pointLargeRadius
1044 | calibPoint.draw()
1045 | calibWin.flip()
1046 | # first wait to let the eyes settle
1047 | pcore.wait(0.5)
1048 |
1049 | # conduct calibration of point
1050 | print("Collecting data at point {0}.".format(i + 1))
1051 | while self.calibration.collect_data(pointList[i][0],
1052 | pointList[i][1]) != tobii.CALIBRATION_STATUS_SUCCESS:
1053 | self.calibration.collect_data(pointList[i][0],
1054 | pointList[i][1])
1055 |
1056 | # feedback from calibration
1057 | print("{0} for data at point {1}."
1058 | .format(self.calibration.collect_data(pointList[i][0],
1059 | pointList[i][1]), i + 1))
1060 | pcore.wait(0.3) # wait before continuing
1061 |
1062 | # Return point to original size
1063 | for frame in range(moveFrames):
1064 | pointLargeRadius += radiusStep
1065 | calibPoint.radius = pointLargeRadius
1066 | calibPoint.draw()
1067 | calibWin.flip()
1068 | # let the eyes settle and move to the next point
1069 | pcore.wait(0.2)
1070 |
1071 | # check to quit
1072 | # depending on response, either abort script or continue to calibration
1073 | if event.getKeys(keyList=['q']):
1074 | calibWin.close()
1075 | self.calibration.leave_calibration_mode()
1076 | raise KeyboardInterrupt("You aborted the script manually.")
1078 |
1079 | # clear events not accessed this iteration
1080 | event.clearEvents(eventType='keyboard')
1081 |
1082 | # clear screen
1083 | calibWin.flip()
1084 | # print feedback
1085 | print("Computing and applying calibration.")
1086 | # compute and apply calibration to get calibration result object
1087 | calibResult = self.calibration.compute_and_apply()
1088 | # return calibration result
1089 | return calibResult
1090 |
1091 |
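The smooth-pursuit animation in getCalibrationData() moves the dot from the previous point to the next in equal per-frame steps: the step is (new - old) / frames, applied frames - 1 times. A minimal sketch of that interpolation on its own (the function name is illustrative):

```python
def pursuit_path(start, end, frames=50):
    """Positions for animating a dot from start toward end in equal steps,
    as in getCalibrationData(). Like the wrapper's loop, it takes
    frames - 1 steps, stopping one step short of the target."""
    step = ((end[0] - start[0]) / frames,
            (end[1] - start[1]) / frames)
    x, y = start
    path = []
    for _ in range(frames - 1):
        x += step[0]
        y += step[1]
        path.append((x, y))
    return path
```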
1092 | # function for running simple gui to visualize subject eye position. Make
1093 | # sure that the eyes are in optimal location for eye tracker
1094 | def runTrackBox(self):
1095 |
1096 | # check to see that eyetracker is connected
1097 | if self.eyetracker is None:
1098 | raise ValueError('There is no eyetracker object. \n' +\
1099 | 'Try running findTracker().')
1100 | # check window attribute
1101 | if self.win is None:
1102 | raise ValueError('No experimental monitor has been specified.\n' +\
1103 | 'Try running setMonitor().')
1104 |
1105 | # start the eyetracker
1106 | self.startGazeData()
1108 | # wait for it to warm up
1108 | pcore.wait(0.5)
1109 |
1110 | # create window for visualizing eye position and text
1111 | trackWin = visual.Window(size = [self.win.getSizePix()[0],
1112 | self.win.getSizePix()[1]],
1113 | pos = [0, 0],
1114 | units = 'pix',
1115 | fullscr = True,
1116 | allowGUI = True,
1117 | monitor = self.win,
1118 | winType = 'pyglet',
1119 | color = [0.4, 0.4, 0.4])
1120 |
1121 | # feedback about eye position
1122 | self.drawEyePositions(trackWin)
1123 | # close track box
1124 | pcore.wait(2)
1125 | trackWin.close()
1126 | return
1127 |
1128 |
1129 | # function for running a complete calibration routine
1130 | def runFullCalibration(self, numCalibPoints = None):
1131 |
1132 | # check that eyetracker is connected before running
1133 | if self.eyetracker is None: # eyeTracker
1134 | raise ValueError("No eyetracker is specified. " +\
1135 | "Aborting calibration.\n" +\
1136 | "Try running findTracker().")
1137 | # check window attribute
1138 | if self.win is None:
1139 | raise ValueError('No experimental monitor has been specified.\n' +\
1140 | 'Try running setMonitor().')
1141 |
1142 | # create dictionary of calibration points
1143 | # if nothing entered then default is nine
1144 | if numCalibPoints is None:
1145 | pointList = [('1',(0.1, 0.1)), ('2',(0.5, 0.1)), ('3',(0.9, 0.1)),
1146 | ('4',(0.1, 0.5)), ('5',(0.5, 0.5)), ('6',(0.9, 0.5)),
1147 | ('7',(0.1, 0.9)), ('8',(0.5, 0.9)), ('9',(0.9, 0.9))]
1148 | elif numCalibPoints == 5:
1149 | pointList = [('1',(0.1, 0.1)), ('2',(0.9, 0.1)), ('3',(0.5, 0.5)),
1150 | ('4',(0.1, 0.9)), ('5',(0.9, 0.9))]
1151 | elif numCalibPoints == 9:
1152 | pointList = [('1',(0.1, 0.1)), ('2',(0.5, 0.1)), ('3',(0.9, 0.1)),
1153 | ('4',(0.1, 0.5)), ('5',(0.5, 0.5)), ('6',(0.9, 0.5)),
1154 | ('7',(0.1, 0.9)), ('8',(0.5, 0.9)), ('9',(0.9, 0.9))]
1155 |
1156 | # randomize points as ordered dictionary
1157 | np.random.shuffle(pointList)
1158 | calibDict = collections.OrderedDict(pointList)
1159 |
1160 | # create window for calibration
1161 | calibWin = visual.Window(size = [self.win.getSizePix()[0],
1162 | self.win.getSizePix()[1]],
1163 | pos = [0, 0],
1164 | units = 'pix',
1165 | fullscr = True,
1166 | allowGUI = True,
1167 | monitor = self.win,
1168 | winType = 'pyglet',
1169 | color = [0.4, 0.4, 0.4])
1170 | # stimuli for holding text
1171 | calibMessage = visual.TextStim(calibWin,
1172 | color = [1.0, 1.0, 1.0], # text
1173 | units = 'norm',
1174 | height = 0.08,
1175 | pos = (0.0, 0.1))
1176 | # stimuli for fixation cross
1177 | fixCross = visual.TextStim(calibWin,
1178 | color = [1.0, 1.0, 1.0],
1179 | units = 'norm',
1180 | height = 0.1,
1181 | pos = (0.0, 0.0),
1182 | text = "+")
1183 |
1184 | # track box to position participant
1185 | # subject instructions for track box
1186 | calibMessage.text = ("Please position yourself so that the\n" + \
1187 | "eye-tracker can locate your eyes." + \
1188 | "\n\nPress 'c' to continue.")
1189 | calibMessage.draw()
1190 | calibWin.flip()
1191 | # turn keyboard reporting on and get subject response
1192 | event.waitKeys(maxWait = 10, keyList = ['c']) # proceed with calibration
1193 |
1194 |         # run track box routine
1195 | calibWin.flip() # clear previous text
1196 | self.runTrackBox()
1197 |
1198 | # initialize calibration
1199 | self.calibration = tobii.ScreenBasedCalibration(self.eyetracker) # calib object
1200 | # enter calibration mode
1201 | self.calibration.enter_calibration_mode()
1202 | # subject instructions
1203 | calibMessage.text = ("Please focus your eyes on the red dot " + \
1204 | "and follow it with your eyes as closely as " + \
1205 | "possible.\n\nPress 'c' to continue.")
1206 | calibMessage.draw()
1207 | calibWin.flip()
1208 |
1209 | # turn keyboard reporting on and get subject response
1210 | event.waitKeys(maxWait = 10, keyList = ['c']) # proceed with calibration
1211 |
1212 | # draw a fixation cross
1213 | fixCross.draw()
1214 | calibWin.flip()
1215 | pcore.wait(3)
1216 |
1217 |         # start by treating every calibration point as needing (re)calibration
1218 |         redoCalDict = calibDict
1219 |
1220 | # loop through calibration process until calibration is complete
1221 | while True:
1222 |
1223 |             # create point order from randomized dictionary values
1224 | pointOrder = list(redoCalDict.values())
1225 |
1226 | # perform calibration
1227 | calibResult = self.getCalibrationData(calibWin, pointOrder)
1228 |
1229 | # Check status of calibration result
1230 | # if calibration was successful, check calibration results
1231 | if calibResult.status != tobii.CALIBRATION_STATUS_FAILURE:
1232 | # give feedback
1233 | calibMessage.text = ("Applying calibration...")
1234 | calibMessage.draw()
1235 | calibWin.flip()
1236 | pcore.wait(2)
1237 | # moving on to accuracy plot
1238 | calibMessage.text = ("Calculating calibration accuracy...")
1239 | calibMessage.draw()
1240 | calibWin.flip()
1241 | pcore.wait(2)
1242 |
1243 | # check calibration for poorly calibrated points
1244 | redoCalDict = self.drawCalibrationResults(calibResult,
1245 | calibWin,
1246 | calibDict)
1247 |
1248 | else: # if calibration was not successful, leave and abort
1249 | calibMessage.text = ("Calibration was not successful.\n\n" + \
1250 | "Closing the calibration window.")
1251 | calibMessage.draw()
1252 | calibWin.flip()
1253 | pcore.wait(3)
1254 | calibWin.close()
1255 | self.calibration.leave_calibration_mode()
1256 | return
1257 |
1258 | # Redo calibration for specific points if necessary
1259 | if not redoCalDict: # if no points to redo
1260 | # finish calibration
1261 | print("Calibration successful. Moving on to validation mode.")
1262 | calibMessage.text = ("Calibration was successful.\n\n" + \
1263 | "Moving on to validation.")
1264 | calibMessage.draw()
1265 | calibWin.flip()
1266 | pcore.wait(3)
1267 | self.calibration.leave_calibration_mode()
1268 | # break loop to proceed with validation
1269 | break
1270 |
1271 | else: # if any points to redo
1272 | # convert list to string for feedback
1273 | printString = " ".join(str(x) for x in redoCalDict.keys())
1274 | # feedback
1275 |                 print("Still need to calibrate the following points: %s"
1276 |                       % printString)
1277 | calibMessage.text = ("Calibration is almost complete.\n\n" + \
1278 | "Prepare to recalibrate a few points.")
1279 | calibMessage.draw()
1280 | calibWin.flip()
1281 | pcore.wait(3)
1282 | # draw fixation cross
1283 | fixCross.draw()
1284 | calibWin.flip()
1285 | pcore.wait(3)
1286 |
1287 | # iterate through list of redo points and remove data from calibration
1288 | for newPoint in redoCalDict.values():
1289 | print(newPoint)
1290 | self.calibration.discard_data(newPoint[0], newPoint[1])
1291 |
1292 | # continue with calibration of remaining points
1293 | continue
1294 |
1295 | # Validate calibration
1296 | # draw fixation cross
1297 | fixCross.draw()
1298 | calibWin.flip()
1299 | pcore.wait(3)
1300 | # run validation
1301 | self.runValidation(calibDict)
1302 | # close window
1303 | calibMessage.text = ("Finished validating the calibration.\n\n" +\
1304 | "Calibration is complete. Closing window.")
1305 | calibMessage.draw()
1306 | calibWin.flip()
1307 | pcore.wait(3)
1308 | calibWin.close()
1309 | return
1310 |
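The point-randomization step at the top of runFullCalibration can be reproduced on its own; a minimal, tracker-free sketch of the same shuffle-then-OrderedDict pattern (the grid coordinates are the defaults from the method above):

```python
import collections
import numpy as np

# the default nine-point grid used by runFullCalibration (normalized coordinates)
pointList = [('1', (0.1, 0.1)), ('2', (0.5, 0.1)), ('3', (0.9, 0.1)),
             ('4', (0.1, 0.5)), ('5', (0.5, 0.5)), ('6', (0.9, 0.5)),
             ('7', (0.1, 0.9)), ('8', (0.5, 0.9)), ('9', (0.9, 0.9))]

np.random.shuffle(pointList)                    # randomize presentation order in place
calibDict = collections.OrderedDict(pointList)  # labels still map to their coordinates

print(len(calibDict))  # 9
```

Shuffling the list of (label, coordinates) pairs before building the dictionary randomizes presentation order while keeping each label bound to its point, which is what lets the redo loop discard and recollect specific points later.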
1311 | # ----- Functions for exporting gaze data -----
1312 |
1313 |     # Function for getting all gaze and event data from the current sample
1314 |     # collected by the eyetracker, returned as a dictionary that can easily
1315 |     # be converted into a pandas dataframe. We strongly suggest putting the
1316 |     # output into a psychopy data object, as psychopy.data comes with many
1317 |     # convenient functions for organizing experiment flow, recording data,
1318 |     # and saving files. Gaze position is given in psychopy pixels; eye
1319 |     # position, eye distance, and pupil size are given in mm.
1320 | def getCurrentData(self):
1321 | # check gaze Data
1322 | if not self.tracking:
1323 | raise ValueError("Data is not being recorded by the eyetracker.")
1324 |
1325 | # output file at same frequency as eyetracker
1326 | timeCur = np.datetime64(dt.datetime.now())
1327 | timeNow = timeCur
1328 | timeDelta = np.absolute((timeCur - timeNow)/np.timedelta64(1, 'ms'))
1329 |
1330 |         # wait until just under one sample period has elapsed since the
1331 |         # last request before asking the eyetracker for new data
1332 |         while timeDelta < 7.0: # ms; change according to eyetracker frequency
1333 | pcore.wait(0.001)
1334 | timeNow = np.datetime64(dt.datetime.now())
1335 | timeDelta = np.absolute((timeCur - timeNow)/np.timedelta64(1, 'ms'))
1336 |
1337 | # code can easily be modified to get more than averages
1338 | timeMidnight = np.datetime64(dt.datetime.date(dt.datetime.today()))
1339 |
1340 |         self.currentData = {}
1341 |         self.currentData['DeviceTimeStamp'] = np.absolute((timeNow - timeMidnight)/np.timedelta64(1, 'ms'))
1342 |         self.currentData['AvgGazePointX'] = self.getAvgGazePos()[0]
1343 |         self.currentData['AvgGazePointY'] = self.getAvgGazePos()[1]
1344 |         self.currentData['AvgPupilDiam'] = self.getPupilSize()
1345 |         self.currentData['AvgEyePosX'] = self.getAvgEyePos()[0]
1346 |         self.currentData['AvgEyePosY'] = self.getAvgEyePos()[1]
1347 |         self.currentData['AvgEyePosZ'] = self.getAvgEyePos()[2]
1348 |         self.currentData['AvgEyeDistance'] = self.getAvgEyeDist() * 10 # convert to mm
1349 |         self.currentData['EyeValidities'] = self.checkEyeValidities()
1350 | 
1351 |         return self.currentData
1353 |
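The millisecond bookkeeping in getCurrentData is plain numpy datetime64/timedelta64 arithmetic and can be exercised without a tracker; a self-contained sketch of the same pattern with fixed, illustrative timestamps:

```python
import datetime as dt
import numpy as np

def ms_between(t0, t1):
    # absolute difference between two datetimes, in milliseconds
    return float(np.absolute((np.datetime64(t0) - np.datetime64(t1)) / np.timedelta64(1, 'ms')))

t0 = dt.datetime(2019, 1, 1, 12, 0, 0, 0)
t1 = dt.datetime(2019, 1, 1, 12, 0, 0, 7000)  # 7000 microseconds = 7 ms later

# the DeviceTimeStamp field is the same computation taken against midnight
midnight = np.datetime64(dt.datetime.combine(t0.date(), dt.time.min))

print(ms_between(t0, t1))        # 7.0
print(ms_between(midnight, t0))  # 43200000.0 (12 hours in ms)
```

The 7 ms threshold in the polling loop corresponds to slightly less than one sample period of a 120 Hz tracker (about 8.3 ms); for other output frequencies the threshold should be adjusted accordingly, as the in-code comment notes.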
--------------------------------------------------------------------------------
/src/methods/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mida-project/eye-tracker-setup/ade35074eeb951af96508be796ea43afd6d7e08f/src/methods/__init__.py
--------------------------------------------------------------------------------
/src/methods/finders.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | #coding=utf-8
3 |
4 | """
5 | .py:
6 | """
7 |
8 | __author__ = "Francisco Maria Calisto"
9 | __maintainer__ = "Francisco Maria Calisto"
10 | __email__ = "francisco.calisto@tecnico.ulisboa.pt"
11 | __license__ = "MIT"
12 | __version__ = "1.0.1"
13 | __status__ = "Development"
14 | __copyright__ = "Copyright 2019, Instituto Superior Técnico (IST)"
15 | __credits__ = [
16 | "Bruno Oliveira",
17 | "Carlos Santiago",
18 | "Jacinto C. Nascimento",
19 | "Pedro Miraldo",
20 | "Nuno Nunes"
21 | ]
22 |
23 | import os
24 | import sys
25 |
26 | from os import path
27 |
28 | import tobii_research as tr
29 | import time
30 |
31 | # The current folder path.
32 | basePath = os.path.dirname(__file__)
33 |
34 | # The path to the repository "src" folder.
35 | joinPath = os.path.join(basePath, '..')
36 | pathAbsPath = os.path.abspath(joinPath)
37 | # Add the directory containing the module to
38 | # the Python path (wants absolute paths).
39 | sys.path.append(pathAbsPath)
40 |
41 | def find_eyetrackers_meta():
42 |     found_eyetrackers = tr.find_all_eyetrackers()
43 | 
44 |     # guard against an empty result; without it the return statement
45 |     # below would raise an UnboundLocalError
46 |     if not found_eyetrackers:
47 |         print("No eye trackers were found.")
48 |         return None
49 | 
50 |     # capability flags paired with a short description for reporting
51 |     capabilities = [
52 |         (tr.CAPABILITY_CAN_SET_DISPLAY_AREA, "have its display area set"),
53 |         (tr.CAPABILITY_HAS_EXTERNAL_SIGNAL, "deliver an external signal stream"),
54 |         (tr.CAPABILITY_HAS_EYE_IMAGES, "deliver an eye image stream"),
55 |         (tr.CAPABILITY_HAS_GAZE_DATA, "deliver a gaze data stream"),
56 |         (tr.CAPABILITY_HAS_HMD_GAZE_DATA, "deliver an HMD gaze data stream"),
57 |         (tr.CAPABILITY_CAN_DO_SCREEN_BASED_CALIBRATION, "do a screen based calibration"),
58 |         (tr.CAPABILITY_CAN_DO_MONOCULAR_CALIBRATION, "do a monocular calibration"),
59 |         (tr.CAPABILITY_CAN_DO_HMD_BASED_CALIBRATION, "do an HMD based calibration"),
60 |         (tr.CAPABILITY_HAS_HMD_LENS_CONFIG, "get/set the HMD lens configuration"),
61 |     ]
62 | 
63 |     for available_eyetracker in found_eyetrackers:
64 |         print("Address: " + available_eyetracker.address)
65 |         print("Model: " + available_eyetracker.model)
66 |         print("Name (It's OK if this is empty): " + available_eyetracker.device_name)
67 |         print("Serial number: " + available_eyetracker.serial_number)
68 | 
69 |         for capability, description in capabilities:
70 |             if capability in available_eyetracker.device_capabilities:
71 |                 print("The eye tracker can " + description + ".")
72 |             else:
73 |                 print("The eye tracker can not " + description + ".")
74 | 
75 |     # return the last eye tracker that was inspected
76 |     return available_eyetracker
89 |
90 | # ==================== END File ==================== #
91 |
--------------------------------------------------------------------------------
/src/methods/gazers.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | #coding=utf-8
3 |
4 | """
5 | .py:
6 | """
7 |
8 | __author__ = "Francisco Maria Calisto"
9 | __maintainer__ = "Francisco Maria Calisto"
10 | __email__ = "francisco.calisto@tecnico.ulisboa.pt"
11 | __license__ = "MIT"
12 | __version__ = "1.0.1"
13 | __status__ = "Development"
14 | __copyright__ = "Copyright 2019, Instituto Superior Técnico (IST)"
15 | __credits__ = [
16 | "Bruno Oliveira",
17 | "Carlos Santiago",
18 | "Jacinto C. Nascimento",
19 | "Pedro Miraldo",
20 | "Nuno Nunes"
21 | ]
22 |
23 | import os
24 | import sys
25 |
26 | from os import path
27 |
28 | import tobii_research as tr
29 | import time
30 |
31 | # The current folder path.
32 | basePath = os.path.dirname(__file__)
33 |
34 | # The path to the repository "src" folder.
35 | joinPath = os.path.join(basePath, '..')
36 | pathAbsPath = os.path.abspath(joinPath)
37 | # Add the directory containing the module to
38 | # the Python path (wants absolute paths).
39 | sys.path.append(pathAbsPath)
40 |
41 | global_gaze_data = None
42 |
43 | def gaze_data_callback(gaze_data):
44 | global global_gaze_data
45 | global_gaze_data = gaze_data
46 |
47 | # def gaze_data_callback(gaze_data):
48 | # # Print gaze points of left and right eye
49 | # print("Left eye: ({gaze_left_eye}) \t Right eye: ({gaze_right_eye})".format(
50 | # gaze_left_eye=gaze_data['left_gaze_point_on_display_area'],
51 | # gaze_right_eye=gaze_data['right_gaze_point_on_display_area']))
52 |
53 | def gaze_data(eyetracker):
54 | global global_gaze_data
55 |
56 | print("Subscribing to gaze data for eye tracker with serial number {0}.".format(eyetracker.serial_number))
57 | eyetracker.subscribe_to(tr.EYETRACKER_GAZE_DATA, gaze_data_callback, as_dictionary=True)
58 |
59 | # Wait while some gaze data is collected.
60 | time.sleep(5)
61 |
62 | eyetracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA, gaze_data_callback)
63 | print("Unsubscribed from gaze data.")
64 |
65 | print("Last received gaze package:")
66 | print(global_gaze_data)
67 |
68 | def gaze_output_frequencies(eyetracker):
69 | initial_gaze_output_frequency = eyetracker.get_gaze_output_frequency()
70 | print("The eye tracker's initial gaze output frequency is {0} Hz.".format(initial_gaze_output_frequency))
71 | try:
72 | for gaze_output_frequency in eyetracker.get_all_gaze_output_frequencies():
73 | eyetracker.set_gaze_output_frequency(gaze_output_frequency)
74 | print("Gaze output frequency set to {0} Hz.".format(gaze_output_frequency))
75 | finally:
76 | eyetracker.set_gaze_output_frequency(initial_gaze_output_frequency)
77 | print("Gaze output frequency reset to {0} Hz.".format(initial_gaze_output_frequency))
78 |
79 | # ==================== END File ==================== #
80 |
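The subscribe/callback flow used in gaze_data() can be exercised without hardware by standing in for the tracker object; a minimal sketch, where FakeTracker and its push() method are illustrative stubs and not part of tobii_research:

```python
# Illustrative stub mimicking the subscribe/unsubscribe flow of tobii_research;
# FakeTracker is hypothetical and only mirrors the call pattern used above.
EYETRACKER_GAZE_DATA = "gaze_data"

class FakeTracker:
    def __init__(self):
        self._subscribers = []

    def subscribe_to(self, stream, callback, as_dictionary=True):
        self._subscribers.append(callback)

    def unsubscribe_from(self, stream, callback):
        self._subscribers.remove(callback)

    def push(self, sample):
        # the real SDK invokes callbacks from a background thread
        for callback in list(self._subscribers):
            callback(sample)

global_gaze_data = None

def gaze_data_callback(gaze_data):
    # keep only the most recent sample, as in gazers.py
    global global_gaze_data
    global_gaze_data = gaze_data

tracker = FakeTracker()
tracker.subscribe_to(EYETRACKER_GAZE_DATA, gaze_data_callback, as_dictionary=True)
tracker.push({'left_gaze_point_on_display_area': (0.5, 0.5)})
tracker.unsubscribe_from(EYETRACKER_GAZE_DATA, gaze_data_callback)

print(global_gaze_data)
```

Because the callback only overwrites a module-level variable, the pattern keeps the latest sample with minimal work inside the callback, which matters when the real SDK delivers samples at the tracker's full output frequency.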
--------------------------------------------------------------------------------
/src/methods/getters.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | #coding=utf-8
3 |
4 | """
5 | .py:
6 | """
7 |
8 | __author__ = "Francisco Maria Calisto"
9 | __maintainer__ = "Francisco Maria Calisto"
10 | __email__ = "francisco.calisto@tecnico.ulisboa.pt"
11 | __license__ = "MIT"
12 | __version__ = "1.0.1"
13 | __status__ = "Development"
14 | __copyright__ = "Copyright 2019, Instituto Superior Técnico (IST)"
15 | __credits__ = [
16 | "Bruno Oliveira",
17 | "Carlos Santiago",
18 | "Jacinto C. Nascimento",
19 | "Pedro Miraldo",
20 | "Nuno Nunes"
21 | ]
22 |
23 | import os
24 | import sys
25 |
26 | from os import path
27 |
28 | import tobii_research as tr
29 | from tobii_research import DisplayArea
30 | import time
31 |
32 | # The current folder path.
33 | basePath = os.path.dirname(__file__)
34 |
35 | # The path to the repository "src" folder.
36 | joinPath = os.path.join(basePath, '..')
37 | pathAbsPath = os.path.abspath(joinPath)
38 | # Add the directory containing the module to
39 | # the Python path (wants absolute paths).
40 | sys.path.append(pathAbsPath)
41 |
42 | def get_and_set_display_area(eyetracker):
43 | display_area = eyetracker.get_display_area()
44 |
45 | print("Got display area from tracker with serial number {0}:".format(eyetracker.serial_number))
46 |
47 | print("Bottom Left: {0}".format(display_area.bottom_left))
48 | print("Bottom Right: {0}".format(display_area.bottom_right))
49 | print("Height: {0}".format(display_area.height))
50 | print("Top Left: {0}".format(display_area.top_left))
51 | print("Top Right: {0}".format(display_area.top_right))
52 | print("Width: {0}".format(display_area.width))
53 |
54 |     # To set the display area, either use a previously saved instance of the
55 |     # DisplayArea class or create a new one, as shown below.
56 | new_display_area_dict = dict()
57 | new_display_area_dict['top_left'] = display_area.top_left
58 | new_display_area_dict['top_right'] = display_area.top_right
59 | new_display_area_dict['bottom_left'] = display_area.bottom_left
60 |
61 | new_display_area = DisplayArea(new_display_area_dict)
62 |
63 | eyetracker.set_display_area(new_display_area)
64 |
65 | # ==================== END File ==================== #
66 |
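Three corners suffice in the dictionary above because the display area is a planar rectangle: the fourth corner and the width/height follow from vector arithmetic. A sketch with hypothetical corner coordinates (in mm; not taken from any real tracker):

```python
import numpy as np

# hypothetical corners of a display area, in mm
top_left = np.array([-255.0, 180.0, 0.0])
top_right = np.array([255.0, 180.0, 0.0])
bottom_left = np.array([-255.0, -107.0, 0.0])

# width and height are the edge lengths of the rectangle
width = float(np.linalg.norm(top_right - top_left))     # 510.0
height = float(np.linalg.norm(bottom_left - top_left))  # 287.0

# the remaining corner is implied by the other three
bottom_right = top_right + (bottom_left - top_left)

print(width, height, bottom_right)
```

This is why get_and_set_display_area only copies top_left, top_right, and bottom_left into the dictionary before constructing the new DisplayArea.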
--------------------------------------------------------------------------------
/src/methods/wrappers.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | #coding=utf-8
3 |
4 | """
5 | .py:
6 | """
7 |
8 | __author__ = "Francisco Maria Calisto"
9 | __maintainer__ = "Francisco Maria Calisto"
10 | __email__ = "francisco.calisto@tecnico.ulisboa.pt"
11 | __license__ = "MIT"
12 | __version__ = "1.0.3"
13 | __status__ = "Development"
14 | __copyright__ = "Copyright 2019, Instituto Superior Técnico (IST)"
15 | __credits__ = [
16 | "Bruno Oliveira",
17 | "Carlos Santiago",
18 | "Jacinto C. Nascimento",
19 | "Pedro Miraldo",
20 | "Nuno Nunes"
21 | ]
22 |
23 | import os
24 | import sys
25 |
26 | from os import path
27 |
28 | # The current folder path.
29 | basePath = os.path.dirname(__file__)
30 |
31 | # The path to the repository "src" folder.
32 | joinPath = os.path.join(basePath, '..')
33 | pathAbsPath = os.path.abspath(joinPath)
34 | # Add the directory containing the module to
35 | # the Python path (wants absolute paths).
36 | sys.path.append(pathAbsPath)
37 |
38 | # Appending libs path.
39 | libsPath = os.path.join(joinPath, 'libs')
40 | libsAbsPath = os.path.abspath(libsPath)
41 | sys.path.append(libsAbsPath)
42 | sys.path.insert(0, libsAbsPath)
43 |
44 | # Import available libs
45 | from libs import tobii_pro_wrapper as tpw
46 |
47 | # Create a TobiiHelper object
48 | thobj = tpw.TobiiHelper()
49 |
50 | # Identify and define the experimental monitor
51 | thobj.setMonitor(nameString = None, dimensions = None)
52 |
53 | # Find eyetrackers and connect
54 | thobj.findTracker(serialString = None)
55 |
56 | # Determine the coordinates for the eyetracker's
57 | # tracking spaces (trackbox and active display area)
58 | thobj.getTrackerSpace()
59 |
60 | # Run a full 5 point calibration routine
61 | # thobj.runFullCalibration(numCalibPoints = 5)
62 |
63 | # start the eyetracker
64 | thobj.startGazeData()
65 |
66 | # thobj.getAvgGazePos()
67 |
68 | # # to get real time gaze data, place this command within a "while" loop
69 | # # during each trial run
70 | # while True:
71 | # toPrint = thobj.getCurrentData()
72 | # print(toPrint)
73 |
74 | # stop the eyetracker
75 | thobj.stopGazeData()
76 |
77 | # ==================== END File ==================== #
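The commented-out real-time loop above can be sketched without a tracker by substituting a stub for TobiiHelper; StubHelper below is hypothetical and only mirrors the getCurrentData() call pattern:

```python
# Stub standing in for TobiiHelper so the polling loop runs without hardware;
# the sample dictionaries use field names from getCurrentData(), values invented.
class StubHelper:
    def __init__(self, samples):
        self._samples = iter(samples)

    def getCurrentData(self):
        # the real method blocks until roughly one sample period has elapsed
        return next(self._samples)

helper = StubHelper([
    {'AvgGazePointX': 0.1, 'AvgGazePointY': 0.2},
    {'AvgGazePointX': 0.3, 'AvgGazePointY': 0.4},
])

collected = []
for _ in range(2):  # in an experiment: while the trial is running
    collected.append(helper.getCurrentData())

print(len(collected))  # 2
```

Collecting the per-sample dictionaries into a list like this is what makes the later conversion to a pandas dataframe or a psychopy data object straightforward.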
--------------------------------------------------------------------------------
/src/notebooks/.gitkeep:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mida-project/eye-tracker-setup/ade35074eeb951af96508be796ea43afd6d7e08f/src/notebooks/.gitkeep
--------------------------------------------------------------------------------
/src/tests/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mida-project/eye-tracker-setup/ade35074eeb951af96508be796ea43afd6d7e08f/src/tests/__init__.py
--------------------------------------------------------------------------------
/src/utils/TobiiTest/PyGazeTobiiTest.osexp:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mida-project/eye-tracker-setup/ade35074eeb951af96508be796ea43afd6d7e08f/src/utils/TobiiTest/PyGazeTobiiTest.osexp
--------------------------------------------------------------------------------
/src/utils/TobiiTest/desktop.ini:
--------------------------------------------------------------------------------
1 | [.ShellClassInfo]
2 | IconFile=C:\Program Files\Dropbox\Client\Dropbox.exe
3 | IconIndex=-3301
4 | InfoTip=A secure home for all your photos, documents, and videos.
5 |
--------------------------------------------------------------------------------
/src/utils/TobiiTest/tobii-legacy_opensesame_log.csv:
--------------------------------------------------------------------------------
1 | "acc","accuracy","average_response_time","avg_rt","background","bidi","canvas_backend","clock_backend","color_backend","compensation","coordinates","correct","correct_instructions","count_block_loop","count_block_sequence","count_end_of_experiment","count_experiment","count_experimental_loop","count_fixation","count_instructions","count_logger","count_pygaze_init","count_pygaze_log__stim_off","count_pygaze_log__stim_on","count_pygaze_start_recording","count_pygaze_stop_recording","count_sketchpad","count_trial_sequence","datetime","description","disable_garbage_collection","experiment_file","experiment_path","font_bold","font_family","font_italic","font_size","font_underline","foreground","form_clicks","fullscreen","height","keyboard_backend","live_row","live_row_block_loop","live_row_experimental_loop","logfile","mouse_backend","opensesame_codename","opensesame_version","practice","repeat_cycle","response","response_end_of_experiment","response_instructions","response_time","response_time_end_of_experiment","response_time_instructions","round_decimals","sampler_backend","sound_buf_size","sound_channels","sound_freq","sound_sample_size","start","stim1","stim2","subject_nr","subject_parity","time_block_loop","time_block_sequence","time_end_of_experiment","time_experiment","time_experimental_loop","time_fixation","time_instructions","time_logger","time_pygaze_init","time_pygaze_log__stim_off","time_pygaze_log__stim_on","time_pygaze_start_recording","time_pygaze_stop_recording","time_sketchpad","time_trial_sequence","title","total_correct","total_response_time","total_responses","uniform_coordinates","width"
2 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","0","0","0","0","0","0","0","0","0","0","10/06/17 14:45:25","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","0","0","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","766.900392948","NA","766.900392948","2","legacy","1024","2","48000","-16","experiment","stim_a9","stim_b9","0","even","30454.1843216","30454.132148","NA","1693.59949848","30453.506065","30484.2914742","29686.2177913","34324.7848481","1693.89030203","34226.0181119","31228.5750453","30480.2377576","34226.2101278","31231.2500107","30480.1522272","PyGaze Tobii Test","0","0","0","yes","1366"
3 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","1","0","1","0","1","1","1","1","1","1","10/06/17 14:45:25","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","1","1","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","766.900392948","NA","766.900392948","2","legacy","1024","2","48000","-16","experiment","stim_a5","stim_b5","0","even","30454.1843216","30454.132148","NA","1693.59949848","30453.506065","34441.403076","29686.2177913","38270.3746406","1693.89030203","38183.6309223","35185.4240686","34436.1861451","38184.6666963","35187.9976804","34436.1083124","PyGaze Tobii Test","0","0","0","yes","1366"
4 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","2","0","2","0","2","2","2","2","2","2","10/06/17 14:45:25","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","2","2","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","766.900392948","NA","766.900392948","2","legacy","1024","2","48000","-16","experiment","stim_a2","stim_b2","0","even","30454.1843216","30454.132148","NA","1693.59949848","30453.506065","38346.8298987","29686.2177913","42179.0674442","1693.89030203","42088.2289548","39090.9411259","38340.9860295","42088.4017263","39093.4993423","38340.9013543","PyGaze Tobii Test","0","0","0","yes","1366"
5 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","3","0","3","0","3","3","3","3","3","3","10/06/17 14:45:25","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","3","3","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","766.900392948","NA","766.900392948","2","legacy","1024","2","48000","-16","experiment","stim_a8","stim_b8","0","even","30454.1843216","30454.132148","NA","1693.59949848","30453.506065","42263.0438241","29686.2177913","46095.5905622","1693.89030203","46004.7084523","43007.0921865","42256.7551964","46004.8739538","43009.812483","42256.6739424","PyGaze Tobii Test","0","0","0","yes","1366"
6 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","4","0","4","0","4","4","4","4","4","4","10/06/17 14:45:25","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","4","4","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","766.900392948","NA","766.900392948","2","legacy","1024","2","48000","-16","experiment","stim_a3","stim_b3","0","even","30454.1843216","30454.132148","NA","1693.59949848","30453.506065","46185.2752456","29686.2177913","50013.0262903","1693.89030203","49926.8132887","46929.1298814","46178.307506","49926.9779348","46931.8732712","46178.2253967","PyGaze Tobii Test","0","0","0","yes","1366"
7 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","5","0","5","0","5","5","5","5","5","5","10/06/17 14:45:25","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","5","5","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","766.900392948","NA","766.900392948","2","legacy","1024","2","48000","-16","experiment","stim_a7","stim_b7","0","even","30454.1843216","30454.132148","NA","1693.59949848","30453.506065","50099.1503403","29686.2177913","53921.5796794","1693.89030203","53841.3533828","50843.6738244","50091.3084793","53841.5424051","50846.2461533","50091.2259424","PyGaze Tobii Test","0","0","0","yes","1366"
8 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","6","0","6","0","6","6","6","6","6","6","10/06/17 14:45:25","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","6","6","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","766.900392948","NA","766.900392948","2","legacy","1024","2","48000","-16","experiment","stim_a6","stim_b6","0","even","30454.1843216","30454.132148","NA","1693.59949848","30453.506065","54006.9874116","29686.2177913","57822.0187927","1693.89030203","57748.6999369","54751.0178126","54000.7150347","57748.8697149","54753.4665499","54000.6354914","PyGaze Tobii Test","0","0","0","yes","1366"
9 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","7","0","7","0","7","7","7","7","7","7","10/06/17 14:45:25","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","7","7","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","766.900392948","NA","766.900392948","2","legacy","1024","2","48000","-16","experiment","stim_a4","stim_b4","0","even","30454.1843216","30454.132148","NA","1693.59949848","30453.506065","57907.1724996","29686.2177913","61805.6542482","1693.89030203","61649.9443197","58651.7083856","57900.6011937","61650.1081105","58654.2986758","57900.5212227","PyGaze Tobii Test","0","0","0","yes","1366"
10 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","8","0","8","0","8","8","8","8","8","8","10/06/17 14:45:25","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","8","8","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","766.900392948","NA","766.900392948","2","legacy","1024","2","48000","-16","experiment","stim_a10","stim_b10","0","even","30454.1843216","30454.132148","NA","1693.59949848","30453.506065","61892.1084458","29686.2177913","65796.8715521","1693.89030203","65634.2824079","62636.350107","61884.5907453","65634.4654431","62639.2290626","61884.508636","PyGaze Tobii Test","0","0","0","yes","1366"
11 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","9","0","9","0","9","9","9","9","9","9","10/06/17 14:45:25","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","9","9","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","766.900392948","NA","766.900392948","2","legacy","1024","2","48000","-16","experiment","stim_a1","stim_b1","0","even","30454.1843216","30454.132148","NA","1693.59949848","30453.506065","65881.9581174","29686.2177913","69697.5997585","1693.89030203","69624.3061341","66626.0424026","65876.2134635","69624.4729185","66628.8747441","65876.1326373","PyGaze Tobii Test","0","0","0","yes","1366"
12 |
--------------------------------------------------------------------------------
/src/utils/TobiiTest/tobii-legacy_output_analysed.xlsx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mida-project/eye-tracker-setup/ade35074eeb951af96508be796ea43afd6d7e08f/src/utils/TobiiTest/tobii-legacy_output_analysed.xlsx
--------------------------------------------------------------------------------
/src/utils/TobiiTest/tobii-new_opensesame_log.csv:
--------------------------------------------------------------------------------
1 | "acc","accuracy","average_response_time","avg_rt","background","bidi","canvas_backend","clock_backend","color_backend","compensation","coordinates","correct","correct_instructions","count_block_loop","count_block_sequence","count_end_of_experiment","count_experiment","count_experimental_loop","count_fixation","count_instructions","count_logger","count_pygaze_init","count_pygaze_log__stim_off","count_pygaze_log__stim_on","count_pygaze_start_recording","count_pygaze_stop_recording","count_sketchpad","count_trial_sequence","datetime","description","disable_garbage_collection","experiment_file","experiment_path","font_bold","font_family","font_italic","font_size","font_underline","foreground","form_clicks","fullscreen","height","keyboard_backend","live_row","live_row_block_loop","live_row_experimental_loop","logfile","mouse_backend","opensesame_codename","opensesame_version","practice","repeat_cycle","response","response_end_of_experiment","response_instructions","response_time","response_time_end_of_experiment","response_time_instructions","round_decimals","sampler_backend","sound_buf_size","sound_channels","sound_freq","sound_sample_size","start","stim1","stim2","subject_nr","subject_parity","time_block_loop","time_block_sequence","time_end_of_experiment","time_experiment","time_experimental_loop","time_fixation","time_instructions","time_logger","time_pygaze_init","time_pygaze_log__stim_off","time_pygaze_log__stim_on","time_pygaze_start_recording","time_pygaze_stop_recording","time_sketchpad","time_trial_sequence","title","total_correct","total_response_time","total_responses","uniform_coordinates","width"
2 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","0","0","0","0","0","0","0","0","0","0","10/06/17 13:58:43","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","0","0","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","1155.52646568","NA","1155.52646568","2","legacy","1024","2","48000","-16","experiment","stim_a2","stim_b2","0","even","36699.7097951","36699.6584769","NA","1707.29592","36699.0037411","37785.2701993","35543.0675845","41632.577358","1707.67567528","41554.7176896","38529.4609699","36726.5206034","41582.4291337","38559.4783155","36726.4406324","PyGaze Tobii Test","0","0","0","yes","1366"
3 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","1","0","1","0","1","1","1","1","1","1","10/06/17 13:58:43","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","1","1","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","1155.52646568","NA","1155.52646568","2","legacy","1024","2","48000","-16","experiment","stim_a6","stim_b6","0","even","36699.7097951","36699.6584769","NA","1707.29592","36699.0037411","42852.1802571","35543.0675845","46759.8342933","1707.67567528","46663.5269857","43596.5095871","41824.2318936","46682.267139","43667.9728595","41824.1502121","PyGaze Tobii Test","0","0","0","yes","1366"
4 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","2","0","2","0","2","2","2","2","2","2","10/06/17 13:58:43","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","2","2","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","1155.52646568","NA","1155.52646568","2","legacy","1024","2","48000","-16","experiment","stim_a9","stim_b9","0","even","36699.7097951","36699.6584769","NA","1707.29592","36699.0037411","47919.1771285","35543.0675845","51833.325493","1707.67567528","51698.600465","48663.2237802","46851.1432002","51724.8091387","48703.53429","46851.0602357","PyGaze Tobii Test","0","0","0","yes","1366"
5 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","3","0","3","0","3","3","3","3","3","3","10/06/17 13:58:43","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","3","3","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","1155.52646568","NA","1155.52646568","2","legacy","1024","2","48000","-16","experiment","stim_a4","stim_b4","0","even","36699.7097951","36699.6584769","NA","1707.29592","36699.0037411","52944.3748322","35543.0675845","56800.9090178","1707.67567528","56706.4260751","53688.82476","51909.5759057","56749.9512476","53711.0930452","51909.4925135","PyGaze Tobii Test","0","0","0","yes","1366"
6 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","4","0","4","0","4","4","4","4","4","4","10/06/17 13:58:43","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","4","4","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","1155.52646568","NA","1155.52646568","2","legacy","1024","2","48000","-16","experiment","stim_a8","stim_b8","0","even","36699.7097951","36699.6584769","NA","1707.29592","36699.0037411","57894.4665208","35543.0675845","61801.449998","1707.67567528","61681.4601591","58638.6572914","56873.3678648","61750.508051","58686.3516466","56873.2870385","PyGaze Tobii Test","0","0","0","yes","1366"
7 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","5","0","5","0","5","5","5","5","5","5","10/06/17 13:58:43","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","5","5","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","1155.52646568","NA","1155.52646568","2","legacy","1024","2","48000","-16","experiment","stim_a3","stim_b3","0","even","36699.7097951","36699.6584769","NA","1707.29592","36699.0037411","62911.9694759","35543.0675845","66834.9326106","1707.67567528","66714.4963025","63655.7454237","61875.3487505","66733.867243","63719.5229282","61875.2670689","PyGaze Tobii Test","0","0","0","yes","1366"
8 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","6","0","6","0","6","6","6","6","6","6","10/06/17 13:58:43","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","6","6","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","1155.52646568","NA","1155.52646568","2","legacy","1024","2","48000","-16","experiment","stim_a1","stim_b1","0","even","36699.7097951","36699.6584769","NA","1707.29592","36699.0037411","67953.9260195","35543.0675845","71868.6752357","1707.67567528","71723.7639565","68698.6406643","66910.4239403","71793.3789153","68728.6370549","66910.3431141","PyGaze Tobii Test","0","0","0","yes","1366"
9 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","7","0","7","0","7","7","7","7","7","7","10/06/17 13:58:43","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","7","7","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","1155.52646568","NA","1155.52646568","2","legacy","1024","2","48000","-16","experiment","stim_a10","stim_b10","0","even","36699.7097951","36699.6584769","NA","1707.29592","36699.0037411","73054.0620986","35543.0675845","76910.5979948","1707.67567528","76824.9734428","73798.4380426","72026.5952715","76860.3958864","73829.2952974","72026.5161559","PyGaze Tobii Test","0","0","0","yes","1366"
10 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","8","0","8","0","8","8","8","8","8","8","10/06/17 13:58:43","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","8","8","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","1155.52646568","NA","1155.52646568","2","legacy","1024","2","48000","-16","experiment","stim_a5","stim_b5","0","even","36699.7097951","36699.6584769","NA","1707.29592","36699.0037411","78012.7722649","35543.0675845","81968.5359065","1707.67567528","81831.7213691","78756.9343827","76986.067942","81860.8222557","78836.5974611","76985.9819839","PyGaze Tobii Test","0","0","0","yes","1366"
11 | "undefined","undefined","undefined","undefined","#808080","no","psycho","psycho","psycho","0","relative","undefined","undefined","0","0","0","0","0","9","0","9","0","9","9","9","9","9","9","10/06/17 13:58:43","A template for eye-tracking experiments","yes","PyGazeTobiiTest.osexp","C:\Users\Epsi\Desktop\TobiiTest","no","mono","no","22","no","black","no","yes","768","psycho","9","9","0","C:/Users/Epsi/Desktop/TobiiTest/subject-0.csv","psycho","Jazzy James","3.1.9","no","0","space","NA","space","1155.52646568","NA","1155.52646568","2","legacy","1024","2","48000","-16","experiment","stim_a7","stim_b7","0","even","36699.7097951","36699.6584769","NA","1707.29592","36699.0037411","83096.5061659","35543.0675845","86977.8732678","1707.67567528","86858.8101515","83840.6738432","82062.8140037","86877.005048","83863.6327869","82062.7220585","PyGaze Tobii Test","0","0","0","yes","1366"
12 |
--------------------------------------------------------------------------------
/src/utils/TobiiTest/tobii-new_output_analysed.xlsx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mida-project/eye-tracker-setup/ade35074eeb951af96508be796ea43afd6d7e08f/src/utils/TobiiTest/tobii-new_output_analysed.xlsx
--------------------------------------------------------------------------------
/src/variables/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mida-project/eye-tracker-setup/ade35074eeb951af96508be796ea43afd6d7e08f/src/variables/__init__.py
--------------------------------------------------------------------------------