What is the latency between a facial expression and a mouse click?
Posted: Wed May 27, 2015 11:00 am
One of the biggest issues I've struggled with in hands-free gaming is the latency between the input and the mouse click.
When I'm using voice commands, there is about a half-second lag between the time I finish speaking and the time the click is processed. In Iris, when I want to click freely on the screen with a delayed right-click macro, I need to time my eye movements just right for the macro to work.
Managing that latency has always been a challenge for me. I'm wondering if a facial expression in Microsoft Kinect for Windows can be recognized faster than a voice command in VoiceAttack.
What is the latency between an expression and the input? Have you measured it? If not, is there a video that shows someone's face alongside the effects on the screen?