Large transformer models are mainstream nowadays, achieving SoTA results for a variety of tasks. They are powerful but very expensive to train and use. The extremely high inference cost, in both time and memory, is a big bottleneck for adopting a powerful transformer to solve real-world tasks at scale.
Why is it hard to run inference for large transformer models? Besides the increasing size of SoTA models, there are two main factors contributing to the inference challenge (Pope et al. 2022):

1. **Large memory footprint.** Both the model parameters and intermediate states need to be kept in memory at inference time. In particular, the KV cache has to be stored throughout decoding, and attention cost scales with the input sequence length.
2. **Low parallelizability.** Generation runs in an autoregressive fashion, producing one token at a time, which makes the decoding process hard to parallelize.
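To make the memory-footprint point concrete, here is a rough back-of-the-envelope sketch (not tied to any specific model; all dimensions such as `n_layers`, `n_heads`, `head_dim`, and the 13B parameter count are illustrative assumptions) that estimates how much memory the weights and the KV cache occupy for a decoder-only transformer at inference time:

```python
# Rough estimate of inference memory for a decoder-only transformer:
# model weights plus the per-token KV cache kept around during
# autoregressive decoding. All dimensions are illustrative assumptions.

def kv_cache_bytes(batch_size: int, seq_len: int, n_layers: int,
                   n_heads: int, head_dim: int, bytes_per_elem: int = 2) -> int:
    """Memory for cached keys and values: 2 tensors (K and V) per layer,
    each of shape [batch, heads, seq_len, head_dim]."""
    return 2 * n_layers * batch_size * n_heads * seq_len * head_dim * bytes_per_elem


def param_bytes(n_params: float, bytes_per_elem: int = 2) -> int:
    """Memory for model weights at the given precision (2 bytes ~ fp16/bf16)."""
    return int(n_params * bytes_per_elem)


if __name__ == "__main__":
    # Hypothetical 13B-parameter config: 40 layers, 40 heads, 128-dim heads.
    n_params = 13e9
    n_layers, n_heads, head_dim = 40, 40, 128
    batch_size, seq_len = 8, 2048

    weights = param_bytes(n_params)  # ~24 GiB in fp16
    cache = kv_cache_bytes(batch_size, seq_len, n_layers, n_heads, head_dim)

    gib = 1024 ** 3
    print(f"weights:  {weights / gib:6.1f} GiB")
    print(f"KV cache: {cache / gib:6.1f} GiB for batch={batch_size}, seq_len={seq_len}")
```

Even at this modest batch size and context length, the KV cache adds on the order of ten GiB on top of the weights, and it grows linearly with both batch size and sequence length, which is why memory is such a central concern for serving.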