Keyword | CPC | PCC | Volume | Score | Keyword length (chars) |
---|---|---|---|---|---|
encoder decoder attention model | 1.74 | 1 | 3757 | 36 | 31 |
encoder | 0.17 | 1 | 5457 | 10 | 7 |
decoder | 0.55 | 0.3 | 1815 | 16 | 7 |
attention | 1.77 | 0.3 | 5988 | 79 | 9 |
model | 0.61 | 0.6 | 387 | 94 | 5 |

Keyword | CPC | PCC | Volume | Score |
---|---|---|---|---|
encoder decoder attention model | 1.76 | 0.4 | 5137 | 30 |
attention encoder-decoder | 1.93 | 0.8 | 5334 | 62 |
encoder decoder attention transformer | 1.76 | 0.3 | 3036 | 26 |
self attention and encoder-decoder attention | 0.57 | 0.8 | 1168 | 21 |
attention-based encoder-decoder | 1.96 | 0.4 | 1555 | 77 |
encoder decoder cross attention | 0.11 | 1 | 4778 | 41 |
encoder-decoder attention layer | 1.66 | 0.5 | 7433 | 4 |
attention-based encoder-decoder network | 0.88 | 0.3 | 3572 | 5 |
self-attention encoder | 0.41 | 0.8 | 5281 | 67 |
whether to output attention in encoder | 1.27 | 0.9 | 5733 | 95 |
encoder and decoder model | 0.98 | 0.6 | 5708 | 74 |
masked decoder self attention | 0.56 | 0.2 | 2315 | 26 |
encoder_attention_mask | 1.25 | 0.1 | 1347 | 32 |
decoder_attention_mask | 0.38 | 0.1 | 1841 | 3 |
encoder_attention_heads | 1.41 | 0.6 | 3206 | 10 |
graph attention auto-encoders | 0.52 | 0.5 | 4411 | 98 |