Abstract
In this work, we introduce Gemma 2, a new addition to the Gemma family of lightweight, state-of-the-art open models, ranging in scale from 2 billion to 27 billion parameters. In this new version, we apply several known technical modifications to the Transformer architecture, such as interleaving local-global attentions (Beltagy et al., 2020a) and group-query attention (Ainslie et al., 2023). We also train the 2B and 9B models with knowledge distillation (Hinton et al., 2015) instead of next token prediction. The resulting models deliver the best performance for their size, and even offer competitive alternatives to models that are 2-3 times bigger. We release all our models to the community.
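The abstract names two architectural changes: interleaving local (sliding-window) and global attention layers, and grouped-query attention, where several query heads share one key/value head. The following is a minimal NumPy sketch of both ideas, not the paper's implementation; the window size, head counts, layer count, and the reuse of one weight set across layers are illustrative assumptions.

```python
# Sketch only: alternating local/global causal attention plus toy grouped-query
# attention (GQA). Sizes are illustrative, not Gemma 2's actual configuration.
import numpy as np

def causal_mask(seq_len, window=None):
    """True where a query position may attend to a key position.

    window=None -> full causal ("global") attention;
    an integer  -> sliding-window ("local") causal attention.
    """
    q = np.arange(seq_len)[:, None]
    k = np.arange(seq_len)[None, :]
    mask = k <= q                        # causal: keys up to the query position
    if window is not None:
        mask &= (q - k) < window         # keep only the last `window` tokens
    return mask

def grouped_query_attention(x, wq, wk, wv, n_q_heads, n_kv_heads, mask):
    """Toy GQA: n_q_heads query heads share n_kv_heads key/value heads."""
    seq_len, d_model = x.shape
    head_dim = d_model // n_q_heads
    group = n_q_heads // n_kv_heads      # query heads per shared K/V head

    q = (x @ wq).reshape(seq_len, n_q_heads, head_dim)
    k = (x @ wk).reshape(seq_len, n_kv_heads, head_dim)
    v = (x @ wv).reshape(seq_len, n_kv_heads, head_dim)

    out = np.zeros_like(q)
    for h in range(n_q_heads):
        kv = h // group                  # index of the shared K/V head
        scores = (q[:, h] @ k[:, kv].T) / np.sqrt(head_dim)
        scores = np.where(mask, scores, -1e9)
        probs = np.exp(scores - scores.max(axis=-1, keepdims=True))
        probs /= probs.sum(axis=-1, keepdims=True)
        out[:, h] = probs @ v[:, kv]
    return out.reshape(seq_len, d_model)

# Interleave: even layers use local attention, odd layers use global attention.
seq_len, d_model, n_q_heads, n_kv_heads, window = 16, 32, 4, 2, 8
head_dim = d_model // n_q_heads
rng = np.random.default_rng(0)
x = rng.normal(size=(seq_len, d_model))
wq = rng.normal(size=(d_model, n_q_heads * head_dim)) * 0.1
wk = rng.normal(size=(d_model, n_kv_heads * head_dim)) * 0.1
wv = rng.normal(size=(d_model, n_kv_heads * head_dim)) * 0.1

for layer in range(4):
    mask = causal_mask(seq_len, window if layer % 2 == 0 else None)
    x = grouped_query_attention(x, wq, wk, wv, n_q_heads, n_kv_heads, mask)
print(x.shape)  # (16, 32)
```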
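The abstract also states that the 2B and 9B models are trained with knowledge distillation instead of next-token prediction. Below is a minimal sketch, assuming hypothetical teacher and student logit arrays, of how a distillation objective replaces the one-hot cross-entropy target with the teacher's full next-token distribution; the vocabulary size, temperature, and random logits are illustrative only and are not the paper's training setup.

```python
# Sketch only: KL-based distillation loss vs. standard next-token cross-entropy.
import numpy as np

def softmax(logits, axis=-1):
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=1.0):
    """Mean KL(teacher || student) over token positions."""
    p_teacher = softmax(teacher_logits / temperature)
    log_p_student = np.log(softmax(student_logits / temperature) + 1e-12)
    log_p_teacher = np.log(p_teacher + 1e-12)
    kl = (p_teacher * (log_p_teacher - log_p_student)).sum(axis=-1)
    return kl.mean()

def next_token_loss(student_logits, target_ids):
    """Standard next-token prediction: cross-entropy against one-hot targets."""
    log_probs = np.log(softmax(student_logits) + 1e-12)
    return -log_probs[np.arange(len(target_ids)), target_ids].mean()

rng = np.random.default_rng(0)
seq_len, vocab = 8, 50                       # toy sizes, not Gemma's vocabulary
teacher_logits = rng.normal(size=(seq_len, vocab))
student_logits = rng.normal(size=(seq_len, vocab))
hard_targets = rng.integers(0, vocab, size=seq_len)

print("distillation loss:", distillation_loss(student_logits, teacher_logits))
print("next-token loss:  ", next_token_loss(student_logits, hard_targets))
```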
arXiv information
Authors | Gemma Team,Morgane Riviere,Shreya Pathak,Pier Giuseppe Sessa,Cassidy Hardin,Surya Bhupatiraju,Léonard Hussenot,Thomas Mesnard,Bobak Shahriari,Alexandre Ramé,Johan Ferret,Peter Liu,Pouya Tafti,Abe Friesen,Michelle Casbon,Sabela Ramos,Ravin Kumar,Charline Le Lan,Sammy Jerome,Anton Tsitsulin,Nino Vieillard,Piotr Stanczyk,Sertan Girgin,Nikola Momchev,Matt Hoffman,Shantanu Thakoor,Jean-Bastien Grill,Behnam Neyshabur,Olivier Bachem,Alanna Walton,Aliaksei Severyn,Alicia Parrish,Aliya Ahmad,Allen Hutchison,Alvin Abdagic,Amanda Carl,Amy Shen,Andy Brock,Andy Coenen,Anthony Laforge,Antonia Paterson,Ben Bastian,Bilal Piot,Bo Wu,Brandon Royal,Charlie Chen,Chintu Kumar,Chris Perry,Chris Welty,Christopher A. Choquette-Choo,Danila Sinopalnikov,David Weinberger,Dimple Vijaykumar,Dominika Rogozińska,Dustin Herbison,Elisa Bandy,Emma Wang,Eric Noland,Erica Moreira,Evan Senter,Evgenii Eltyshev,Francesco Visin,Gabriel Rasskin,Gary Wei,Glenn Cameron,Gus Martins,Hadi Hashemi,Hanna Klimczak-Plucińska,Harleen Batra,Harsh Dhand,Ivan Nardini,Jacinda Mein,Jack Zhou,James Svensson,Jeff Stanway,Jetha Chan,Jin Peng Zhou,Joana Carrasqueira,Joana Iljazi,Jocelyn Becker,Joe Fernandez,Joost van Amersfoort,Josh Gordon,Josh Lipschultz,Josh Newlan,Ju-yeong Ji,Kareem Mohamed,Kartikeya Badola,Kat Black,Katie Millican,Keelin McDonell,Kelvin Nguyen,Kiranbir Sodhia,Kish Greene,Lars Lowe Sjoesund,Lauren Usui,Laurent Sifre,Lena Heuermann,Leticia Lago,Lilly McNealus,Livio Baldini Soares,Logan Kilpatrick,Lucas Dixon,Luciano Martins,Machel Reid,Manvinder Singh,Mark Iverson,Martin Görner,Mat Velloso,Mateo Wirth,Matt Davidow,Matt Miller,Matthew Rahtz,Matthew Watson,Meg Risdal,Mehran Kazemi,Michael Moynihan,Ming Zhang,Minsuk Kahng,Minwoo Park,Mofi Rahman,Mohit Khatwani,Natalie Dao,Nenshad Bardoliwalla,Nesh Devanathan,Neta Dumai,Nilay Chauhan,Oscar Wahltinez,Pankil Botarda,Parker Barnes,Paul Barham,Paul Michel,Pengchong Jin,Petko Georgiev,Phil Culliton,Pradeep Kuppala,Ramona Comanescu,Ramona Merhej,Reena Jana,Reza Ardeshir Rokni,Rishabh Agarwal,Ryan Mullins,Samaneh Saadat,Sara Mc Carthy,Sarah Cogan,Sarah Perrin,Sébastien M. R. Arnold,Sebastian Krause,Shengyang Dai,Shruti Garg,Shruti Sheth,Sue Ronstrom,Susan Chan,Timothy Jordan,Ting Yu,Tom Eccles,Tom Hennigan,Tomas Kocisky,Tulsee Doshi,Vihan Jain,Vikas Yadav,Vilobh Meshram,Vishal Dharmadhikari,Warren Barkley,Wei Wei,Wenming Ye,Woohyun Han,Woosuk Kwon,Xiang Xu,Zhe Shen,Zhitao Gong,Zichuan Wei,Victor Cotruta,Phoebe Kirk,Anand Rao,Minh Giang,Ludovic Peran,Tris Warkentin,Eli Collins,Joelle Barral,Zoubin Ghahramani,Raia Hadsell,D. Sculley,Jeanine Banks,Anca Dragan,Slav Petrov,Oriol Vinyals,Jeff Dean,Demis Hassabis,Koray Kavukcuoglu,Clement Farabet,Elena Buchatskaya,Sebastian Borgeaud,Noah Fiedel,Armand Joulin,Kathleen Kenealy,Robert Dadashi,Alek Andreev |
Publication date | 2024-10-02 15:22:49+00:00 |
arXiv site | arxiv_id(pdf) |
Provider, services used
arxiv.jp, Google