attention package on CRAN
The attention R package, which shows how to implement the attention mechanism - the building block of transformers - from scratch in the R language, is now available on CRAN. A key example of what (much larger and more complex forms of) transformers can achieve is the change from AlphaFold (which relied primarily on convolutional neural networks) to AlphaFold2 (which is built around transformers). This change pushed the results in the CASP14 protein structure prediction competition to a level of accuracy sufficient for practical use: a major scientific breakthrough whose impact can hardly be overstated.
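
To give a sense of the mechanism the package covers, here is a minimal sketch of scaled dot-product attention written in base R. The softmax() and attention() functions below are illustrative helpers for this example only and are not assumed to be the package's own API.

# Minimal sketch of scaled dot-product attention in base R
# (illustrative only; not the package's exported functions).

softmax <- function(x) {
  e <- exp(x - max(x))                      # subtract max for numerical stability
  e / sum(e)
}

attention <- function(Q, K, V) {
  d_k <- ncol(K)                            # key dimension
  scores <- Q %*% t(K) / sqrt(d_k)          # scaled dot-product scores
  weights <- t(apply(scores, 1, softmax))   # row-wise softmax gives attention weights
  weights %*% V                             # weighted sum of values
}

# Example: three queries, keys and values of dimension four
set.seed(1)
Q <- matrix(rnorm(12), nrow = 3)
K <- matrix(rnorm(12), nrow = 3)
V <- matrix(rnorm(12), nrow = 3)
attention(Q, K, V)

Each row of the result is a mixture of the value vectors, weighted by how strongly the corresponding query matches each key; this is the core operation that transformers stack and scale up.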