The first improvement is a serious language boost. PaLM 2 is trained on more than 100 languages, and it is trained not only to translate literally from one language to another but also to apply context and understand idioms, poems, and riddles. The model promises to be the building block for Google's AI products, serving as a massive general-purpose LLM with expanded reasoning, language, and coding capabilities. While Google did not disclose how many parameters PaLM 2 has (the original PaLM was trained with 540 billion parameters), Google promises that PaLM 2 will deliver improved capabilities along with faster, more efficient performance. According to Google, PaLM 2 has already been quietly powering Google Bard, and many future integrations are planned. Google is positioning PaLM 2 as the fundamental AI model behind its future AI products.