Gemma 2: Improving Open Language Models at a Practical Size

Abstract

In this work, we introduce Gemma 2, a new addition to the Gemma family of lightweight, state-of-the-art open models, ranging in scale from 2 billion to 27 billion parameters. In this new version, we apply several known technical modifications to the Transformer architecture, such as interleaving local-global attentions (Beltagy et al., 2020a) and group-query attention (Ainslie et al., 2023). We also train the 2B and 9B models with knowledge distillation (Hinton et al., 2015) instead of next token prediction. The resulting models deliver the best performance for their size, and even offer competitive alternatives to models that are 2-3 times bigger. We release all our models to the community.
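The "interleaving local-global attentions" mentioned in the abstract means that some Transformer layers restrict attention to a sliding window of recent tokens while others attend over the full causal context. The following is a minimal illustrative sketch of such an interleaved mask; the window size, the even/odd layer ordering, and the function name are assumptions for illustration, not details taken from the paper.

```python
def attention_mask(layer_idx, seq_len, window=4):
    """Causal attention mask for one layer of an interleaved local-global stack.

    Illustrative assumption: even-indexed layers use sliding-window (local)
    attention over the previous `window` tokens; odd-indexed layers use full
    causal (global) attention. Returns mask[i][j] == True when query position
    i may attend to key position j.
    """
    local = layer_idx % 2 == 0
    mask = []
    for i in range(seq_len):
        row = []
        for j in range(seq_len):
            causal = j <= i                  # never attend to future tokens
            in_window = i - j < window       # only recent tokens, if local
            row.append(causal and (in_window if local else True))
        mask.append(row)
    return mask
```

Local layers keep per-token attention cost bounded by the window size, while the interleaved global layers preserve access to the full context; grouped-query attention (several query heads sharing one key/value head) further reduces the key/value cache on top of this.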
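Training with knowledge distillation instead of next-token prediction means the smaller student model is trained to match a larger teacher's full probability distribution over the vocabulary, rather than a one-hot target for the single observed next token. A minimal sketch of that objective on a single next-token distribution, assuming a standard KL-divergence formulation with an optional temperature (the paper's abstract does not specify these details):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=1.0):
    """KL(teacher || student) for one next-token prediction.

    The student is pushed to reproduce the teacher's whole distribution,
    which carries more signal per token than a one-hot next-token target.
    """
    p = softmax(teacher_logits, temperature)  # teacher: the training target
    q = softmax(student_logits, temperature)  # student: being trained
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Toy vocabulary of four tokens: the teacher is confident in token 0.
teacher = [4.0, 1.0, 0.5, 0.5]
student = [2.0, 2.0, 1.0, 1.0]
loss = distillation_loss(student, teacher)
```

The loss is zero when the student exactly matches the teacher and positive otherwise; in an actual training loop it would be averaged over all token positions in a batch.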

arXiv information

Authors: Gemma Team,Morgane Riviere,Shreya Pathak,Pier Giuseppe Sessa,Cassidy Hardin,Surya Bhupatiraju,Léonard Hussenot,Thomas Mesnard,Bobak Shahriari,Alexandre Ramé,Johan Ferret,Peter Liu,Pouya Tafti,Abe Friesen,Michelle Casbon,Sabela Ramos,Ravin Kumar,Charline Le Lan,Sammy Jerome,Anton Tsitsulin,Nino Vieillard,Piotr Stanczyk,Sertan Girgin,Nikola Momchev,Matt Hoffman,Shantanu Thakoor,Jean-Bastien Grill,Behnam Neyshabur,Olivier Bachem,Alanna Walton,Aliaksei Severyn,Alicia Parrish,Aliya Ahmad,Allen Hutchison,Alvin Abdagic,Amanda Carl,Amy Shen,Andy Brock,Andy Coenen,Anthony Laforge,Antonia Paterson,Ben Bastian,Bilal Piot,Bo Wu,Brandon Royal,Charlie Chen,Chintu Kumar,Chris Perry,Chris Welty,Christopher A. Choquette-Choo,Danila Sinopalnikov,David Weinberger,Dimple Vijaykumar,Dominika Rogozińska,Dustin Herbison,Elisa Bandy,Emma Wang,Eric Noland,Erica Moreira,Evan Senter,Evgenii Eltyshev,Francesco Visin,Gabriel Rasskin,Gary Wei,Glenn Cameron,Gus Martins,Hadi Hashemi,Hanna Klimczak-Plucińska,Harleen Batra,Harsh Dhand,Ivan Nardini,Jacinda Mein,Jack Zhou,James Svensson,Jeff Stanway,Jetha Chan,Jin Peng Zhou,Joana Carrasqueira,Joana Iljazi,Jocelyn Becker,Joe Fernandez,Joost van Amersfoort,Josh Gordon,Josh Lipschultz,Josh Newlan,Ju-yeong Ji,Kareem Mohamed,Kartikeya Badola,Kat Black,Katie Millican,Keelin McDonell,Kelvin Nguyen,Kiranbir Sodhia,Kish Greene,Lars Lowe Sjoesund,Lauren Usui,Laurent Sifre,Lena Heuermann,Leticia Lago,Lilly McNealus,Livio Baldini Soares,Logan Kilpatrick,Lucas Dixon,Luciano Martins,Machel Reid,Manvinder Singh,Mark Iverson,Martin Görner,Mat Velloso,Mateo Wirth,Matt Davidow,Matt Miller,Matthew Rahtz,Matthew Watson,Meg Risdal,Mehran Kazemi,Michael Moynihan,Ming Zhang,Minsuk Kahng,Minwoo Park,Mofi Rahman,Mohit Khatwani,Natalie Dao,Nenshad Bardoliwalla,Nesh Devanathan,Neta Dumai,Nilay Chauhan,Oscar Wahltinez,Pankil Botarda,Parker Barnes,Paul Barham,Paul Michel,Pengchong Jin,Petko Georgiev,Phil Culliton,Pradeep Kuppala,Ramona Comanescu,Ramona Merhej,Reena Jana,Reza Ardeshir Rokni,Rishabh Agarwal,Ryan Mullins,Samaneh Saadat,Sara Mc Carthy,Sarah Perrin,Sébastien M. R. Arnold,Sebastian Krause,Shengyang Dai,Shruti Garg,Shruti Sheth,Sue Ronstrom,Susan Chan,Timothy Jordan,Ting Yu,Tom Eccles,Tom Hennigan,Tomas Kocisky,Tulsee Doshi,Vihan Jain,Vikas Yadav,Vilobh Meshram,Vishal Dharmadhikari,Warren Barkley,Wei Wei,Wenming Ye,Woohyun Han,Woosuk Kwon,Xiang Xu,Zhe Shen,Zhitao Gong,Zichuan Wei,Victor Cotruta,Phoebe Kirk,Anand Rao,Minh Giang,Ludovic Peran,Tris Warkentin,Eli Collins,Joelle Barral,Zoubin Ghahramani,Raia Hadsell,D. Sculley,Jeanine Banks,Anca Dragan,Slav Petrov,Oriol Vinyals,Jeff Dean,Demis Hassabis,Koray Kavukcuoglu,Clement Farabet,Elena Buchatskaya,Sebastian Borgeaud,Noah Fiedel,Armand Joulin,Kathleen Kenealy,Robert Dadashi,Alek Andreev
Published: 2024-08-02 17:52:12+00:00
arXiv page: arxiv_id (pdf)

Source, services used

arxiv.jp, DeepL

Categories: cs.AI, cs.CL