Sep 8, 2018
Hi Sabeer,
Thanks for the wonderful tutorial. It's great to see how you are using an existing embedding with the Keras Embedding layer.
I'm a little confused about the dimensions here. Say I have 3700 sentences with a max_length of 160 words, and each word has a vector representation of length 300.
So I get an embedding matrix of shape (3700, 160, 300) for all sentences.
How should the embedding layer be built?
Further, in the Keras documentation I don't see the weights and trainable input parameters for the Embedding layer. Is there any other documentation?
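To make my confusion concrete, here is roughly how I was picturing the layer, following your tutorial. The vocab_size value and the zero-filled matrix are just placeholders I made up, and weights/trainable are the parameters you used:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Embedding

vocab_size = 10000      # number of distinct words in my corpus (placeholder)
max_length = 160        # max words per sentence
embedding_dim = 300     # length of each pretrained word vector

# Lookup table with one row per vocabulary word: shape (vocab_size, embedding_dim)
embedding_matrix = np.zeros((vocab_size, embedding_dim))

model = Sequential()
model.add(Embedding(vocab_size,
                    embedding_dim,
                    weights=[embedding_matrix],   # pretrained vectors, as in your tutorial
                    input_length=max_length,
                    trainable=False))             # keep the pretrained weights frozen

# The model input would be integer word indices of shape (num_sentences, max_length),
# e.g. (3700, 160), and the Embedding layer's output would be (3700, 160, 300).
```

Is this the right way to think about it, i.e. the (3700, 160, 300) tensor is the layer's output rather than the embedding matrix itself?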
Thanks
