Self-Attention
Self-attention is a special case of attention where the query, key, and value all come from the same source. It allows each element of a sequence to consider (or “attend to”) all other elements in the same sequence.
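The idea above can be sketched in NumPy. This is a minimal illustration, not a production implementation: the projection matrices `W_q`, `W_k`, `W_v`, the dimensions, and the scaled dot-product scoring are standard but illustrative choices not specified in the text.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    # Query, key, and value are all projections of the SAME input X:
    # this is what makes the attention "self"-attention.
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = Q.shape[-1]
    # scores[i, j]: how much position i attends to position j,
    # so every element can attend to every other element.
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V

# Illustrative dimensions: a sequence of 4 tokens, model width 8.
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.standard_normal((seq_len, d_model))
W_q = rng.standard_normal((d_model, d_k))
W_k = rng.standard_normal((d_model, d_k))
W_v = rng.standard_normal((d_model, d_k))

out = self_attention(X, W_q, W_k, W_v)
print(out.shape)  # one output vector per input position: (4, 8)
```

Each output row is a weighted mixture of the value vectors of all positions, with weights determined by how well that position's query matches every position's key.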