New loss implementation in PyTorch: Soft-DTW

Keywords
Time Series, DTW, PyTorch
Objective
We re-implemented a differentiable version of Dynamic Time Warping (DTW) known as Soft-DTW, which allows robust comparison of time series that differ in shift, length, etc., thereby expanding the utility of DTW in machine learning. Our work contributes to the open-source ecosystem: we propose an optimised loss compatible with PyTorch on GPU, with our own backward pass. A minimal sketch of the recurrence is given below.
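For reference, here is a minimal pure-PyTorch sketch of the Soft-DTW recurrence (Cuturi & Blondel, 2017), assuming a squared Euclidean cost and a smoothing parameter `gamma`; the function name `soft_dtw` and these choices are illustrative assumptions, and unlike the optimised version described above, this sketch leaves the backward pass to autograd rather than using a hand-written one.

```python
import torch

def soft_dtw(x, y, gamma=1.0):
    """Illustrative Soft-DTW between sequences x (n, d) and y (m, d).

    Returns a differentiable scalar; gradients come from autograd,
    not from the optimised custom backward mentioned above.
    """
    # Pairwise squared Euclidean cost: D[i, j] = ||x_i - y_j||^2.
    D = torch.cdist(x, y, p=2) ** 2
    n, m = D.shape
    inf = D.new_tensor(float("inf"))

    # R[i][j] is the soft alignment cost up to cell (i, j); the +inf
    # borders force every alignment to start at cell (1, 1).
    R = [[inf] * (m + 1) for _ in range(n + 1)]
    R[0][0] = D.new_tensor(0.0)

    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Smoothed minimum over the three predecessor cells:
            # softmin_gamma(a, b, c) = -gamma * log(sum(exp(-./gamma))).
            prev = torch.stack([R[i - 1][j], R[i][j - 1], R[i - 1][j - 1]])
            softmin = -gamma * torch.logsumexp(-prev / gamma, dim=0)
            R[i][j] = D[i - 1, j - 1] + softmin
    return R[n][m]


# Example usage: the loss is differentiable w.r.t. the inputs.
x = torch.randn(10, 3, requires_grad=True)
y = torch.randn(14, 3)
loss = soft_dtw(x, y, gamma=0.1)
loss.backward()
```

The Python double loop is kept here for readability; the optimised loss in the repository instead pairs the recurrence with its own backward pass for efficient GPU execution.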

Links
GitHub · Report