
NaNs when following the Re-training Parametric UMAP with landmarks tutorial #1180

@EHenryPega

Hey umap team!

Firstly, a big thanks for all the work on this library, it is incredibly useful! The ability to retrain a ParametricUMAP whilst preserving the mapping for embeddings that have already been processed would be incredible.

I tried this out for my own use case, using the example here in the umap-learn documentation as a reference. However, when it came to the retraining phase, the reported loss for every epoch was nan.

I assumed this was an issue with my own setup, so I copied the example verbatim. Unfortunately, I got exactly the same outcome: the model does not retrain successfully.

p_embedder.fit(x2_lmk, landmark_positions=landmarks)
Epoch 1/10
3921/3921 ━━━━━━━━━━━━━━━━━━━━ 21s 5ms/step - loss: nan
Epoch 2/10
3921/3921 ━━━━━━━━━━━━━━━━━━━━ 20s 5ms/step - loss: nan
Epoch 3/10
3921/3921 ━━━━━━━━━━━━━━━━━━━━ 20s 5ms/step - loss: nan
Epoch 4/10
3921/3921 ━━━━━━━━━━━━━━━━━━━━ 20s 5ms/step - loss: nan
Epoch 5/10
3921/3921 ━━━━━━━━━━━━━━━━━━━━ 20s 5ms/step - loss: nan
Epoch 6/10
3921/3921 ━━━━━━━━━━━━━━━━━━━━ 19s 5ms/step - loss: nan
Epoch 7/10
3921/3921 ━━━━━━━━━━━━━━━━━━━━ 19s 5ms/step - loss: nan
Epoch 8/10
3921/3921 ━━━━━━━━━━━━━━━━━━━━ 19s 5ms/step - loss: nan
Epoch 9/10
3921/3921 ━━━━━━━━━━━━━━━━━━━━ 19s 5ms/step - loss: nan
Epoch 10/10
3921/3921 ━━━━━━━━━━━━━━━━━━━━ 19s 5ms/step - loss: nan
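
For context, here is roughly the setup leading up to that call. This is a from-memory sketch rather than the exact tutorial code: the data is a random placeholder, and the landmark construction (previous embedding coordinates for points from the first fit, np.nan for new points so they remain free to move) reflects my understanding of the tutorial, so the details may differ slightly from the example.

```python
import numpy as np
from umap.parametric_umap import ParametricUMAP

# Placeholder data standing in for the two batches used in the tutorial.
rng = np.random.default_rng(42)
x1 = rng.normal(size=(5000, 64)).astype("float32")  # first batch
x2 = rng.normal(size=(5000, 64)).astype("float32")  # new data for re-training

# Initial fit on the first batch.
p_embedder = ParametricUMAP()
z1 = p_embedder.fit_transform(x1)

# Combined data for re-training: previously seen points plus new ones.
x2_lmk = np.concatenate([x1, x2], axis=0)

# Landmark positions: old points are pinned to their previous 2D embedding,
# new points get np.nan so the optimizer can place them freely.
landmarks = np.concatenate(
    [z1.astype("float32"), np.full((x2.shape[0], 2), np.nan, dtype="float32")],
    axis=0,
)

# This is the step whose loss comes out as nan for me.
p_embedder.fit(x2_lmk, landmark_positions=landmarks)
```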

I suspect there has either been a regression, or the library has been updated in a way that is not reflected in the example.

Any help or suggestions would be greatly appreciated. Cheers!
