Abstract: Recently, the Mixture of Experts (MoE) architecture, such as LR-MoE, has often been used to alleviate the impact of language confusion on the multilingual ASR (MASR) task. However, it still faces ...
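To make the routing idea concrete, below is a minimal sketch of a language-routed MoE feed-forward layer of the general kind the abstract refers to. This is not the authors' LR-MoE implementation; the class name, layer sizes, and soft (posterior-weighted) routing are illustrative assumptions.

```python
# Minimal sketch of a language-routed MoE feed-forward block (illustrative only):
# a router predicts a per-frame language posterior, and each frame is processed
# by a posterior-weighted mixture of per-language expert FFNs.
import torch
import torch.nn as nn


class LanguageRoutedMoE(nn.Module):
    def __init__(self, d_model: int = 256, d_ff: int = 1024, num_langs: int = 4):
        super().__init__()
        # One expert feed-forward network per language (sizes are assumptions).
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_langs)
        )
        # Frame-level language router (a simple language-identification head).
        self.router = nn.Linear(d_model, num_langs)

    def forward(self, x: torch.Tensor):
        # x: (batch, time, d_model) acoustic encoder features.
        lang_posterior = torch.softmax(self.router(x), dim=-1)          # (B, T, L)
        expert_out = torch.stack([e(x) for e in self.experts], dim=-1)  # (B, T, D, L)
        # Soft routing: mix expert outputs by the predicted language posterior.
        y = (expert_out * lang_posterior.unsqueeze(2)).sum(dim=-1)      # (B, T, D)
        return y, lang_posterior


if __name__ == "__main__":
    layer = LanguageRoutedMoE()
    feats = torch.randn(2, 50, 256)  # dummy encoder output: 2 utterances, 50 frames
    out, post = layer(feats)
    print(out.shape, post.shape)     # (2, 50, 256) and (2, 50, 4)
```

In practice, the router output can also be used for hard (top-1) expert selection or supervised with language labels; the soft mixture above is just one simple variant.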