Abstract: In the modern computing landscape, the requirements for floating-point arithmetic have diverged across fields. Some applications require greater precision and performance, ...
Abstract: Transformers have significantly advanced AI and machine learning through their powerful attention mechanism. However, computing attention on long sequences can become a computational ...
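The cost this snippet alludes to is easy to see in a minimal sketch of standard scaled dot-product attention (a standalone NumPy illustration under assumed shapes, not code from the cited paper): materializing the score matrix softmax(QKᵀ/√d) takes O(n²) time and memory in the sequence length n.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.

    The (n, n) score matrix is the bottleneck on long sequences:
    compute and memory grow quadratically with sequence length n.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # (n, n): quadratic in n
    scores -= scores.max(axis=-1, keepdims=True)   # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                             # (n, d)

# Illustrative sizes: doubling n quadruples the score-matrix footprint.
n, d = 1024, 64
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
print(attention(Q, K, V).shape)  # (1024, 64)
```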
Lower-precision floating-point arithmetic is becoming more common, moving beyond the usual IEEE 64-bit double-precision and 32-bit single-precision formats. Today, hardware accelerators and software ...
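As a standalone illustration of what moving below single precision entails (a NumPy sketch, not code from the cited source): IEEE half precision (float16) carries roughly 3 decimal digits against float32's ~7 and float64's ~16, so naive accumulation of small values breaks down quickly.

```python
import numpy as np

# Machine epsilon widens as precision drops.
for dtype in (np.float16, np.float32, np.float64):
    info = np.finfo(dtype)
    print(f"{dtype.__name__:>8}: eps={info.eps:.3e}, max={info.max:.3e}")

# Naive float16 accumulation: once the running sum reaches ~0.25, the
# increment 1e-4 is smaller than half the gap between adjacent float16
# values, so each addition rounds back down and the sum stalls there.
x = np.float16(0.0)
for _ in range(10_000):
    x += np.float16(1e-4)
print("float16 sum:", x)                            # stalls near 0.25
print("float64 sum:", sum(1e-4 for _ in range(10_000)))  # ~1.0
```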