Hi!
I have some questions about the usage of the NSLR-HMM algorithm (the classify_nslr_hmm() function).
I have x and y coordinates (in pixels) from an eye-tracking recording. The origin (0, 0) is in the upper-left corner of the image.
Should I convert them to coordinates relative to the center and then use the cateyes.pixel_to_degree function?
Thank you
Giuseppe
The NSLR-HMM uses angular data (i.e. the angle of the eye relative to the center/fixation point) as input. If you have flat coordinates/pixels, you can use cateyes.utils.coords_to_degree or cateyes.utils.pixel_to_degree to convert both gaze arrays to degrees.
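For illustration, here is a minimal sketch of that conversion done by hand with NumPy: shift the pixel origin to the screen center, then turn the offsets into visual degrees. The screen geometry values (screen_res, screen_size_cm, viewing_dist_cm) are made up, and the commented calls to the cateyes functions only indicate the idea; their exact signatures are an assumption, so check the cateyes docstrings before using them.

```python
import numpy as np

# --- assumed recording / screen geometry (replace with your own values) ---
screen_res = np.array([1920.0, 1080.0])   # screen resolution in pixels
screen_size_cm = np.array([53.0, 30.0])   # physical screen size in cm
viewing_dist_cm = 60.0                    # eye-to-screen distance in cm

# fake gaze data: x/y in pixels, origin (0, 0) at the upper-left corner
rng = np.random.default_rng(0)
times = np.arange(0, 2.0, 0.002)          # 2 s sampled at 500 Hz
gaze_px = rng.uniform([0, 0], screen_res, size=(times.size, 2))

# 1) shift the origin to the screen center (as asked in the question)
centered_px = gaze_px - screen_res / 2.0

# 2) convert pixels -> cm -> visual degrees relative to the center
centered_cm = centered_px * (screen_size_cm / screen_res)
gaze_deg = np.degrees(np.arctan2(centered_cm, viewing_dist_cm))

x_deg, y_deg = gaze_deg[:, 0], gaze_deg[:, 1]

# cateyes.utils.pixel_to_degree / coords_to_degree wrap a similar conversion;
# once the data are in degrees you can feed them to the classifier, e.g.
# (argument order is an assumption, see the cateyes documentation):
# from cateyes import classify_nslr_hmm
# segments, classes = classify_nslr_hmm(x_deg, y_deg, times)
```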
Hey @beppefolder, I just updated the code and the documentation for these functions. I hope they are more intuitive and easier to understand now. If they helped solve your problem, feel free to close the issue :)
Hi @DiGyt,
In the end I had to start another project and put this one on hold. I'll let you know in a few days, as soon as I pick the project back up.
For the moment, thank you for the help and for enhancing the documentation.