This repository was archived by the owner on Jul 4, 2023. It is now read-only.

Commit ff1736b

Create non-existent cache dirs for pretrained embeddings

1 parent 9925127 commit ff1736b

File tree: 1 file changed, +2 −0 lines

torchnlp/word_to_vector/pretrained_word_vectors.py

Lines changed: 2 additions & 0 deletions
@@ -171,6 +171,8 @@ def cache(self, name, cache, url=None):
             self.vectors = torch.Tensor(vectors).view(-1, dim)
             self.dim = dim
             logger.info('Saving vectors to {}'.format(path_pt))
+            if not os.path.exists(cache):
+                os.makedirs(cache)
             torch.save((self.itos, self.stoi, self.vectors, self.dim), path_pt)
         else:
             logger.info('Loading vectors from {}'.format(path_pt))
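The patch guards `torch.save` with a check-then-create so the cache directory exists before writing. As a standalone sketch (not the library's code; the helper name `ensure_cache_dir` is made up for illustration), the same pattern on Python 3 can use `os.makedirs(..., exist_ok=True)`, which is also race-free if two processes create the directory concurrently:

```python
import os
import tempfile

def ensure_cache_dir(cache):
    # Equivalent to the patch's `if not os.path.exists(cache): os.makedirs(cache)`,
    # but exist_ok=True makes the call a no-op when the directory already exists,
    # avoiding a crash if another process creates it between check and create.
    os.makedirs(cache, exist_ok=True)
    return os.path.isdir(cache)

# Usage: create a nested cache directory that does not exist yet
# (hypothetical path layout, for demonstration only).
root = tempfile.mkdtemp()
cache = os.path.join(root, '.word_vectors_cache', 'glove')
print(ensure_cache_dir(cache))
```

The commit predates wide `exist_ok` adoption and the explicit existence check is equally valid for single-process use; the `exist_ok=True` form simply collapses the two lines into one.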

0 commit comments
