Compute SoftDTWLossPyTorch with normalization option and time series of different lengths #473

Open
@denisbeslic

Description

Describe the bug
Hello, I want to use SoftDTWLossPyTorch with normalization (the Soft-DTW divergence) to compare predicted time series of different lengths, but it throws the following error:

  File ".../lib/python3.10/site-packages/tslearn/metrics/soft_dtw_loss_pytorch.py", line 146, in forward
    xxy = torch.cat([x, x, y])

RuntimeError: Sizes of tensors must match except in dimension 0. Expected size 99 but got size 100 for tensor number 2 in the list.

Is it possible to compute a Soft-DTW loss for time series of unequal lengths, or is this a bug?

To Reproduce

import numpy as np
import torch
from tslearn.metrics import SoftDTWLossPyTorch

targets = torch.tensor(np.random.randn(16, 100, 1))
prediction = torch.tensor(np.random.randn(16, 99, 1))
loss_func = SoftDTWLossPyTorch(gamma=1.0, normalize=True, dist_func=None)
loss = loss_func(prediction, targets)
print(loss.shape)
print(loss)

Environment (please complete the following information):

  • OS: Ubuntu 20.04.5 LTS
  • tslearn version 0.6.1
