Google's Gemini Pro 1.5 Enters Public Preview on Vertex AI
Tuesday, April 9, 2024, 16:01, by Slashdot
One million tokens is equivalent to around 700,000 words or around 30,000 lines of code. That is about four times the amount of data that Anthropic's flagship model, Claude 3, can take as input, and about eight times the maximum context of OpenAI's GPT-4 Turbo. A model's context, or context window, refers to the initial set of data (e.g. text) the model considers before generating output (e.g. additional text). A simple question -- "Who won the 2020 U.S. presidential election?" -- can serve as context, as can a movie script, email, essay or e-book. Read more of this story at Slashdot.
https://tech.slashdot.org/story/24/04/09/1332218/googles-gemini-pro-15-enters-public-preview-on-vert...
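The token-to-word arithmetic above can be sketched in a few lines. This is a rough back-of-the-envelope estimate, not an official tokenizer: the 0.7 words-per-token ratio is taken directly from the article's "one million tokens ≈ 700,000 words" figure, and the context sizes other than Gemini's one million are illustrative assumptions.

```python
# Rough context-window capacity arithmetic, using the ratio quoted
# in the article: 1,000,000 tokens ~ 700,000 English words.
WORDS_PER_TOKEN = 700_000 / 1_000_000  # ~0.7, per the article's estimate

def approx_words(context_tokens: int) -> int:
    """Estimate how many English words fit in a given context window."""
    return int(context_tokens * WORDS_PER_TOKEN)

def capacity_ratio(tokens_a: int, tokens_b: int) -> float:
    """How many times larger context A is than context B."""
    return tokens_a / tokens_b

gemini_15_pro = 1_000_000  # context size stated in the article
print(approx_words(gemini_15_pro))  # 700000 words, matching the article
```

Actual word counts vary with language and tokenizer, so these figures should be read as order-of-magnitude estimates.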