Antony A
Nov 4 · 1 min read

Hi,

Thanks for the tutorial.

So which one of these represents the sentence embedding vector: hidden_reps or cls_head?

print(hidden_reps.shape)  # Out: torch.Size([1, 12, 768])

print(cls_head.shape)  # Out: torch.Size([1, 768])

What is the exact use of the [CLS] token? Is it just a dummy vector, since every sentence gets the same [CLS] token prepended? Why do we need it exactly?

When should we use the hidden_reps vector?

When should we use cls_head?
