Topic: gemma

Google launches 2 million token context window for Gemini 1.5 Pro

Google has announced that developers now have access to a 2 million token context window for Gemini 1.5 Pro. For comparison, GPT-4o has a 128k context window. This context window length was first announced at Google I/O, when it was accessible only through a waitlist, but now it is available to everyone. Longer context windows can lead to higher costs, … continue reading

Gemini improvements unveiled at Google Cloud Next

Google Cloud Next took place this week, and the company unveiled a number of AI-related innovations, such as two new Gemma models for code generation and inference. Google announced that Gemini 1.5 Pro will be entering public preview for Google Cloud customers, available through Vertex AI. This version of the model was … continue reading

Google announces two new variants of Gemma: CodeGemma and RecurrentGemma

Google has announced that it is extending the Gemma family of AI models with two new variants, one for code generation and one for inference. For code generation, it is releasing CodeGemma, which provides intelligent code completion and generation. It is capable of producing entire blocks of code at a time, Google claims. According to … continue reading

Google releases Gemma, a new AI model designed with AI researchers in mind

Google is building on the success of its Gemini launch with the release of a new family of lightweight AI models called Gemma. The Gemma models are open and are designed to be used by researchers and developers to innovate safely with AI. “We believe the responsible release of LLMs is critical for improving the … continue reading