Automatic alignment of surgical videos using kinematic data

This is the companion web page for our paper titled "Automatic alignment of surgical videos using kinematic data".

This paper has been presented at the Conference on Artificial Intelligence in Medicine (AIME 2019).

The source code

The software is developed using Python 3.5. You will need the JIGSAWS (JHU-ISI Gesture and Skill Assessment Working Set) dataset to re-run the experiments in the paper. The source code can be downloaded here upon acceptance of the paper.
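JIGSAWS kinematic trials are distributed as plain-text files of whitespace-separated numeric values, one time step per row. As a minimal sketch (the function name and the assumption about file layout are illustrative, not part of the released code), loading one trial with the standard library might look like:

```python
def load_kinematics(path):
    """Parse a whitespace-separated kinematic trial file.

    Returns a list of rows, each a list of floats (one row per time step).
    Blank lines are skipped.
    """
    rows = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                rows.append([float(v) for v in line.split()])
    return rows
```

In practice a library such as NumPy (`numpy.loadtxt`) would do the same job in one call; the pure-Python version above only serves to show the expected file shape.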

To run the code, you will also need to download and install the following dependencies separately:

Videos illustrating the method

For each surgical task, we present the result of aligning four different videos using our Non-Linear Temporal Scaling (NLTS) algorithm.

Suturing

Knot Tying

Needle Passing


Last update: February 2019