Efficient scaling of large language models with mixture of experts and 3D analog in-memory computing.
Nat Comput Sci 2025-01-08
2-D Transformer: Extending Large Language Models to Long-Context With Few Memory.
IEEE Trans Neural Netw Learn Syst 2025-03-21
Harnessing the Power of Single Cell Large Language Models with Parameter Efficient Fine-Tuning using scPEFT.
Res Sq 2025-05-02