I'm not sure if I understand you correctly, but if you want to predict the next word vector given the previous vectors, you could just use a vanilla neural network or an RNN. The focus of this repo is on the PyTorch embedding, so while you could theoretically reuse the neural network part, you would have to remove the embedding step from the forward function and replace it with the GloVe equivalent. Is this more or less what you are trying to achieve? I'm not too familiar with GloVe, so I'm sorry if I misunderstood you.
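In case it helps, here is a rough sketch of what that swap might look like, assuming the GloVe vectors have already been loaded into a (vocab_size, embedding_dim) tensor whose row order matches the vocabulary indexing. `glove_weights`, `NGramGloveModel`, and the toy sizes below are made-up names for illustration, not part of this repo:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for real GloVe rows, ordered to match the vocabulary index.
vocab_size, embedding_dim, context_size = 10, 50, 2
glove_weights = torch.randn(vocab_size, embedding_dim)

class NGramGloveModel(nn.Module):
    def __init__(self, glove_weights, context_size):
        super().__init__()
        # Replace the trainable nn.Embedding with one initialised from GloVe and frozen.
        self.embeddings = nn.Embedding.from_pretrained(glove_weights, freeze=True)
        self.linear1 = nn.Linear(context_size * glove_weights.size(1), 128)
        self.linear2 = nn.Linear(128, glove_weights.size(0))

    def forward(self, inputs):
        # Look up the pre-trained vectors and flatten the context window.
        embeds = self.embeddings(inputs).view(1, -1)
        out = torch.relu(self.linear1(embeds))
        return torch.log_softmax(self.linear2(out), dim=1)

model = NGramGloveModel(glove_weights, context_size)
log_probs = model(torch.tensor([3, 7]))  # indices of the two context words
```

Passing `freeze=False` instead would let the GloVe vectors be fine-tuned during training.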
Hi!
I'm wondering if it's possible to use this model with GloVe pre-trained word vectors?
How would I substitute this part in that case?
data = []
# Predict word i from the two preceding words.
for i in range(2, len(raw_text)):
    context = [raw_text[i - 2], raw_text[i - 1]]
    target = raw_text[i]
    data.append((context, target))
So I don't have raw text, only word vectors, and I want to predict the next word of the sentence.
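For concreteness, a minimal sketch of what that substitution could look like when the input is already a sequence of GloVe vectors; `sentence_vectors` and the sizes below are hypothetical placeholders:

```python
import torch

# Hypothetical input: one pre-trained GloVe vector per word of the sentence.
seq_len, embedding_dim = 8, 50
sentence_vectors = torch.randn(seq_len, embedding_dim)  # stand-in for the real vectors

data = []
for i in range(2, seq_len):
    # Context: the two preceding word vectors, concatenated into a single input.
    context = torch.cat([sentence_vectors[i - 2], sentence_vectors[i - 1]])
    target = sentence_vectors[i]  # vector of the word to be predicted
    data.append((context, target))
```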