Many experimental setups record data from multiple sensors in parallel. This data needs to be synced temporally for a joint analysis. All eye tracking data you record with Pupil Invisible is accurately timestamped using Unix timestamps in nanoseconds. This makes synchronisation with other sensors easily possible. The only requirement is that the other data also has absolute timestamps like this, i.e. including the date and exact time when every sample was recorded. Note that these differ from relative timestamps, which count time since, e.g., the start of the recording. If the sensor you are using only provides relative timestamps, it is often still possible to convert them to absolute timestamps. Typically, the start time of the recording is available as an absolute timestamp. If you add the relative time to this absolute start point, you get absolute timestamps.

In this guide, you will learn how to sync data streams with absolute timestamps using the pd.merge_asof function of Pandas. As an example, we will sync a heart rate sensor (from a Garmin Fenix3 HR running watch) with a Pupil Invisible recording. We will also produce a visualization to show the real-time heart rate of a person jogging alongside the eye tracking video. To build this visualization, we will actually be matching three data streams:

1. The scene video
2. The gaze data
3. The heart rate data

This guide is about syncing data post hoc after you have made a recording. In some experimental setups it can be handy to sync already at recording time using Lab Streaming Layer (LSL). See here for an introduction on LSL with Pupil Invisible. If you need to sync data in real-time while recording, see the real-time API instead.

You can find all code accompanying this guide here. To run it you need to install the following dependencies: `pip install numpy pandas av fitdecode tqdm opencv-python datetime typing`

The heart rate data used in the example is located in data/eye-tracking-run.FIT. The Pupil Invisible recording used in the example is available here. Unpack it inside the data/demo-recording folder.

# Loading all Data

For the example visualization we need the scene video and gaze data from the Pupil Invisible recording, and the heart rate data.

The heart rate data can be read using the fitdecode module. For more details check out the implementation of the load_fit_data function.
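The load_fit_data implementation itself lives in the accompanying code. As a rough sketch of what reading heart rate samples with fitdecode might look like, assuming the values are stored in the FIT "record" messages under the field names "timestamp" and "heart_rate" (assumptions of this sketch, not something this post states):

```python
import fitdecode
import pandas as pd


def load_fit_data(fit_path):
    """Rough sketch: collect heart rate samples from a FIT file into a DataFrame."""
    timestamps = []
    heart_rates = []
    with fitdecode.FitReader(fit_path) as fit:
        for frame in fit:
            # Only data messages named "record" carry the per-sample sensor values.
            if isinstance(frame, fitdecode.FitDataMessage) and frame.name == "record":
                values = {field.name: field.value for field in frame.fields}
                if values.get("timestamp") is not None and values.get("heart_rate") is not None:
                    timestamps.append(values["timestamp"])
                    heart_rates.append(values["heart_rate"])
    return pd.DataFrame({"timestamp": timestamps, "heart rate": heart_rates})


hr = load_fit_data("data/eye-tracking-run.FIT")
```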
The gaze data CSV file can be read using Pandas. All we need from it is the timestamps and the gaze values. For the scene video, we initially only need its timestamps and the corresponding frame indices for matching; we don't have to touch the actual video frames yet.

```python
import pandas as pd

gaze_path = "data/demo-recording/running_rd-4a40d94d/gaze.csv"
gaze = pd.read_csv(gaze_path)
gaze = gaze[["timestamp [ns]", "gaze x [px]", "gaze y [px]"]]

world_ts_path = "data/demo-recording/running_rd-4a40d94d/world_timestamps.csv"
world_ts = pd.read_csv(world_ts_path)
world_ts["timestamp [ns]"] = pd.to_datetime(world_ts["timestamp [ns]"])
```

# Timestamp Matching

The challenge with syncing the three data streams is that while they are all timestamped, their timestamps are not identical. Every stream is sampled independently and at different rates. For example, the gaze data is sampled at 200 Hz, while the scene video is only sampled at 30 Hz, so there are a lot more gaze samples than video frames in our recording.

For our visualization, we will need to overlay every scene video frame with a gaze circle that shows where the wearer was looking. We therefore have to decide between two options:

1. Given the timestamp of the video frame, we search for the gaze sample that is closest in time and choose it for the overlay. This would imply that most of the gaze samples are not visible in the overlay, because there are more gaze samples than frames.
2. We match every single gaze sample to its closest video frame. All the gaze samples that match to the same frame are averaged, and this value is used for the overlay.

For gaze data, option 2 is usually better, as the averaging contributes a bit of noise reduction (see the matching sketch below).

The heart rate sensor is not sampling data at regular intervals, but instead only records a new sample when the data changes. Also, it is sampled much more sparsely at 1 Hz (or lower in case there are no changes to report). This changes how we need to match the data a bit. First of all, given the lower sampling rate, we will now match every heart rate sample to multiple world frames. Second, instead of matching a heart rate sample to the world frames that are closest in time to it, we need to match it to world frames whose timestamps are larger than or equal to the heart rate sample's timestamp. The heart rate sample is a point of change, so it is valid until the next change, but not before. All of this can be implemented using the Pandas function pd.merge_asof.
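For the gaze matching, a minimal sketch of option 2 could look as follows, assuming the `gaze` and `world_ts` DataFrames loaded above and treating the row order of world_timestamps.csv as the scene video's frame index (an assumption of this sketch, not something this guide states):

```python
# Use the row order of world_ts as the scene video frame index (assumption).
world_ts = world_ts.sort_values("timestamp [ns]").reset_index(drop=True)
world_ts["frame idx"] = world_ts.index

# Convert the gaze timestamps to datetimes as well, so both merge keys have the same dtype.
gaze["timestamp [ns]"] = pd.to_datetime(gaze["timestamp [ns]"])

# Option 2: match every gaze sample to the world frame that is closest in time...
matched = pd.merge_asof(
    gaze.sort_values("timestamp [ns]"),
    world_ts[["timestamp [ns]", "frame idx"]],
    on="timestamp [ns]",
    direction="nearest",
)

# ...and average all gaze samples that fall onto the same frame.
gaze_per_frame = matched.groupby("frame idx")[["gaze x [px]", "gaze y [px]"]].mean()
```

`gaze_per_frame` then contains one averaged gaze position per scene video frame, which is what the overlay needs.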
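For the heart rate stream the direction of the match flips: every world frame should receive the most recent heart rate sample recorded at or before the frame's timestamp, which is what `direction="backward"` does in pd.merge_asof. The sketch below assumes the `hr` DataFrame from the load_fit_data sketch above; the column names and timezone handling are illustrative, not the guide's actual code.

```python
# FIT timestamps are timezone-aware UTC datetimes; drop the timezone so they
# are comparable with the naive UTC datetimes derived from the Unix nanosecond
# timestamps in world_timestamps.csv (assumption of this sketch).
hr["timestamp"] = pd.to_datetime(hr["timestamp"], utc=True).dt.tz_localize(None)
hr = hr.sort_values("timestamp")

# Every world frame gets the last heart rate sample at or before its own
# timestamp; that sample stays valid until the next change is reported.
world_with_hr = pd.merge_asof(
    world_ts.sort_values("timestamp [ns]"),
    hr,
    left_on="timestamp [ns]",
    right_on="timestamp",
    direction="backward",
)
```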