# Transformer Architectures
2 articles with this tag
AI Research
Unlocking Transformer Potential Beyond Semantics
Researchers propose SIREN-RoPE, unlocking a novel 'rotation space' in Transformers for dynamic relational encoding, yielding consistent performance gains with minimal overhead.
11 days ago
AI Research
PoM: Linear Complexity Attention Replacement
Polynomial Mixer (PoM) offers a linear complexity replacement for self-attention, matching performance and drastically cutting costs for long sequences.
about 1 month ago