How to switch between different camera modes
How to report a bug or request a new feature
If you have found a bug or would like to request a new feature, you can let us know. Fill in this online form with your idea or bug report. Please note that your response will be shared publicly (without your email address) on our bug tracking system.
If you cannot use the form or want to provide more detail, then email me at ghenderson@acecentre.org.uk
If you are familiar with GitHub then you can open a detailed issue on our GitHub repository outlining your request and we will get back to you.
How to install EyeCommander
Visit the download page, here.
Once there, download the EyeCommander-x.x.x.Setup.exe file.
Once the download has completed, click on the downloaded file. Windows will then ask if you're sure you want to run it; click 'More info' and then 'Run anyway'.
Once you've done that, the installer will appear on your screen. Wait a minute and the installation process will complete automatically.
Once it's installed, EyeCommander will open and be ready to use.
It may prompt you to enter the administrator password. This is expected; enter your password and click OK.
How to use EyeCommander to output keypresses
EyeCommander is a desktop application that uses your camera to detect when you blink. Once a blink has been detected, it can output a keypress to any desktop application, including communication software like Grid3.
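If it helps to picture what is going on under the hood: the app watches webcam frames, decides when the eyes have closed and reopened, and then sends a keypress to whichever application has focus. The Python sketch below is only a rough illustration of that idea, not EyeCommander's actual code; the openness threshold, the placeholder eye_openness function and the choice of the spacebar are all assumptions made for the example.

```python
# Conceptual sketch only -- NOT EyeCommander's actual source code.
# Idea: read webcam frames, decide whether the eyes are closed, and emit a
# keypress (here, space) when a blink is detected.
import cv2                                     # pip install opencv-python
from pynput.keyboard import Controller, Key    # pip install pynput

keyboard = Controller()

BLINK_THRESHOLD = 0.2    # hypothetical "eye openness" value that counts as closed
MIN_CLOSED_FRAMES = 2    # require the eyes to stay closed for a couple of frames

def eye_openness(frame) -> float:
    """Placeholder for the real per-frame analysis (face landmarks, etc.).

    EyeCommander runs a machine learning model on every frame; here we just
    return a dummy value so the sketch stays runnable.
    """
    return 1.0

closed_frames = 0
cap = cv2.VideoCapture(0)                      # default webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    if eye_openness(frame) < BLINK_THRESHOLD:
        closed_frames += 1
    else:
        if closed_frames >= MIN_CLOSED_FRAMES:
            # A blink just ended: send a keypress to whichever app has focus.
            keyboard.press(Key.space)
            keyboard.release(Key.space)
        closed_frames = 0
cap.release()
```

In the real application the per-frame analysis is done by a machine learning model, and the key that is pressed is configurable in 'Keyboard Emulator' mode.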
To help explain EyeCommander we created two personas that are not based on specific individuals, but on a collection of people who could benefit from using this technology.
Our first persona is Alexis Anderson, who had a brain stem stroke and cannot use body movement to trigger switches. She uses paper-based AAC with a limited group of communication partners who understand how she indicates her selections by blinking. Alexis is frustrated that she can only communicate when one of these communication partners is available. Her current solution also means that she cannot access environmental control.
Our second persona is Ben Brown, who has MND and currently uses eye gaze when he has the energy to do so. Typically, he has to move to a low-tech solution as he becomes tired throughout the day. Ben is very proficient with his high-tech device and has learnt where all his vocabulary is stored. It is very problematic that his access depends on having enough energy to use eye gaze, and Ben’s family is worried that he will not be able to access his eye gaze as time goes on.
Ben and Alexis are both examples of people who would benefit from using EyeCommander. Ben could use it when he is tired and he can continue to use the same vocabulary no matter his energy levels. Alexis can use EyeCommander to communicate independently and not be reliant on the presence of a specific communication partner.
How to calibrate EyeCommander
Find out about the latest changes to EyeCommander here
Raise the maximum volume of the click
Fix bug that prevented users from mapping the keyboard emulator to spacebar
Add Mind Express option
Improve the workflow for forcing the app to run as admin
Fix bug where the app wouldn't work when the Windows username included a space
Improve descriptions of blink modes following feedback
Fix bug that caused the window to resize after you opened the settings
Add the ability to hide the sliders once you have finished using them so they don't create a cluttered UI
You can now edit the key that is pressed in 'Keyboard Emulator' mode.
Update links to docs
Add the ability to launch at startup
You can now track a specific eye in all blink modes. This can be useful if a user can only make clear voluntary movements with one eye
Can now detect when a user gazes left and right; enable this in the ‘blink mode’ settings area [BETA]
Add anonymous analytics so we can understand how EyeCommander is used; you can see them here
Add visual indicator to show blink level
Drop support for Linux and macOS
Make sure the user's face is always in frame
Automatically make sure that the user is an admin; no need to ‘run as admin’
The hardware we recommend to run EyeCommander on
EyeCommander is compatible with all modern versions of Windows; it has been tested on Windows 10 and up.
Currently it only runs on Windows. We can make a version for macOS or Linux, so if this is something you are interested in then contact us.
EyeCommander uses a lot of CPU power to run because every frame has to be analysed by a machine learning model to extract all of your facial features.
EyeCommander shows a 'frames per second' counter in the top left-hand corner of the video feed. The higher the number of frames per second, the more responsive and accurate the blink detection will be. The highest you will get is 30 frames per second, and anything lower than 5 frames per second will be too low to work at all.
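EyeCommander's built-in counter is the easiest way to check this, but if you want a rough, standalone idea of the frame rate your webcam and CPU can sustain, a small script like the one below gives a ballpark figure. It is only a sketch: the five-second sample window and the trivial analyse placeholder are assumptions, and a real face-landmark model will be considerably heavier per frame.

```python
# Rough, standalone frame-rate check -- not part of EyeCommander.
# Reads frames from the default webcam for a few seconds and reports the
# average frames per second actually achieved.
import time
import cv2   # pip install opencv-python

def analyse(frame):
    """Placeholder for per-frame work (EyeCommander runs a face-landmark
    model here, which is why the real figure depends heavily on the CPU)."""
    cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

cap = cv2.VideoCapture(0)
frames, start = 0, time.time()
while time.time() - start < 5.0:    # sample for roughly five seconds
    ok, frame = cap.read()
    if not ok:
        break
    analyse(frame)
    frames += 1
cap.release()

elapsed = time.time() - start
print(f"~{frames / elapsed:.1f} frames per second")
```

Compare the printed figure against the thresholds above: around 30 frames per second is ideal, and anything below 5 frames per second is unlikely to be usable.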
You can run EyeCommander on GridPad devices, but we have found that they get fairly low frame rates. EyeCommander still works at a low frame rate, however it will be less accurate and responsive; you might still find it usable for your use case.
In our experience we have had the best success with Surface Pro tablets. They have enough processing power to run EyeCommander easily at 30 frames per second and work well for our clients' needs.
Please note, this is just a recommendation, so feel free to try it on other devices and let us know how you get on.