Keyword | CPC | PCC | Volume | Score | Length (characters) |
---|---|---|---|---|---|
transformer encoder decoder attention | 0.2 | 0.3 | 3981 | 80 | 37 |
transformer | 0.84 | 0.9 | 1314 | 86 | 11 |
encoder | 1.05 | 0.6 | 8158 | 66 | 7 |
decoder | 1.82 | 0.6 | 1063 | 31 | 7 |
attention | 1.25 | 0.9 | 9296 | 80 | 9 |

Keyword | CPC | PCC | Volume | Score |
---|---|---|---|---|
transformer encoder decoder attention | 1.45 | 0.6 | 8604 | 4 |
transformer decoder masked attention | 0.56 | 0.7 | 4211 | 86 |
transformer decoder cross attention | 0.66 | 0.6 | 5226 | 66 |
encoder decoder attention model | 1.99 | 0.9 | 2513 | 56 |
transformer decoder attention mask | 1.34 | 0.2 | 1281 | 15 |
transformer encoder and decoder | 0.89 | 0.5 | 2726 | 24 |
encoder decoder cross attention | 1.3 | 0.4 | 961 | 12 |
encoder-decoder attention | 1.19 | 0.7 | 2867 | 70 |
attention based encoder decoder | 0.82 | 0.3 | 2562 | 96 |
transformer encoder decoder architecture | 0.71 | 1 | 6303 | 85 |
transformer model encoder decoder | 0.1 | 0.6 | 9912 | 93 |
transformer_encoder | 1.5 | 0.9 | 8234 | 75 |