All functionality related to Google Cloud, Google Gemini and other Google products.
- Google Generative AI (Gemini API & AI Studio): Access Google Gemini models directly via the Gemini API. Use Google AI Studio for rapid prototyping and get started quickly with the langchain-google-genai package. This is often the best starting point for individual developers.
- Google Cloud (Vertex AI & other services): Access Gemini models, Vertex AI Model Garden and a wide range of cloud services (databases, storage, document AI, etc.) via the Google Cloud Platform. Use the langchain-google-vertexai package for Vertex AI models and specific packages (e.g., langchain-google-cloud-sql-pg, langchain-google-community) for other cloud services. This is ideal for developers already using Google Cloud or needing enterprise features such as MLOps, model tuning, or enterprise support.
See Google's guide on migrating from the Gemini API to Vertex AI for more details on the differences.
Integration packages for Gemini models and the Vertex AI platform are maintained in the langchain-google repository.
You can find a host of LangChain integrations with other Google APIs and services in the googleapis GitHub organization and in the langchain-google-community package.
Google Generative AI (Gemini API & AI Studio)
Access Google Gemini models directly using the Gemini API, best suited for rapid development and experimentation. Gemini models are available in Google AI Studio.
pip install -U langchain-google-genai
Start for free and get your API key from Google AI Studio.
export GOOGLE_API_KEY="YOUR_API_KEY"
Chat Models
Use the ChatGoogleGenerativeAI class to interact with Gemini 2.0 and 2.5 models. See details in this guide.
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_core.messages import HumanMessage
llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash")
# Simple text invocation
result = llm.invoke("Sing a ballad of LangChain.")
print(result.content)
# Multimodal invocation with an image URL
message = HumanMessage(
    content=[
        {
            "type": "text",
            "text": "What's in this image?",
        },
        {"type": "image_url", "image_url": "https://picsum.photos/seed/picsum/200/300"},
    ]
)
result = llm.invoke([message])
print(result.content)
The image_url can be a public URL, a GCS URI (gs://...), a local file path, a base64-encoded image string (data:image/png;base64,...), or a PIL Image object.
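For example, the same multimodal call can pass a base64-encoded local file instead of a public URL (a minimal sketch; local_image.png is a placeholder path):
import base64
from langchain_core.messages import HumanMessage
from langchain_google_genai import ChatGoogleGenerativeAI
llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash")
# Read a local file and encode it as a data URI (local_image.png is a placeholder)
with open("local_image.png", "rb") as f:
    b64 = base64.b64encode(f.read()).decode("utf-8")
message = HumanMessage(
    content=[
        {"type": "text", "text": "Describe this image."},
        {"type": "image_url", "image_url": f"data:image/png;base64,{b64}"},
    ]
)
print(llm.invoke([message]).content)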
Embedding Models
Generate text embeddings using models like gemini-embedding-exp-03-07 with the GoogleGenerativeAIEmbeddings class.
See a usage example.
from langchain_google_genai import GoogleGenerativeAIEmbeddings
embeddings = GoogleGenerativeAIEmbeddings(model="models/gemini-embedding-exp-03-07")
vector = embeddings.embed_query("What are embeddings?")
print(vector[:5])
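Continuing the example above, the same class can embed a batch of texts via the standard embed_documents method (a minimal sketch):
# Embed several documents at once; returns one vector per input text
doc_vectors = embeddings.embed_documents(
    ["LangChain integrates with Gemini.", "Embeddings map text to vectors."]
)
print(len(doc_vectors), len(doc_vectors[0]))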
LLMs
Access the same Gemini models using the (legacy) LLM interface with the GoogleGenerativeAI class.
See a usage example.
from langchain_google_genai import GoogleGenerativeAI
llm = GoogleGenerativeAI(model="gemini-2.0-flash")
result = llm.invoke("Sing a ballad of LangChain.")
print(result)
Google Cloud
Access Gemini models, Vertex AI Model Garden and other Google Cloud services via Vertex AI and specific cloud integrations.
Vertex AI models require the langchain-google-vertexai package. Other services might require additional packages such as langchain-google-community, langchain-google-cloud-sql-pg, etc.
pip install langchain-google-vertexai
# pip install langchain-google-community[...] # For other services
Google Cloud integrations typically use Application Default Credentials (ADC). Refer to the Google Cloud authentication documentation for setup instructions (e.g., using gcloud auth application-default login).
Chat Models
Vertex AI
Access chat models like Gemini via the Vertex AI platform.
See a usage example.
from langchain_google_vertexai import ChatVertexAI
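A minimal usage sketch, assuming ADC is configured (the project and location values below are placeholders and can usually be inferred from the environment):
llm = ChatVertexAI(model="gemini-2.0-flash", project="my-project", location="us-central1")
result = llm.invoke("Sing a ballad of LangChain.")
print(result.content)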
Anthropic on Vertex AI Model Garden
See a usage example.
from langchain_google_vertexai.model_garden import ChatAnthropicVertex
Llama on Vertex AI Model Garden
from langchain_google_vertexai.model_garden_maas.llama import VertexModelGardenLlama
Mistral on Vertex AI Model Garden
from langchain_google_vertexai.model_garden_maas.mistral import VertexModelGardenMistral
Gemma local from Hugging Face
Local Gemma model loaded from Hugging Face. Requires langchain-google-vertexai.
from langchain_google_vertexai.gemma import GemmaChatLocalHF
Gemma local from Kaggle
Local Gemma model loaded from Kaggle. Requires langchain-google-vertexai.
from langchain_google_vertexai.gemma import GemmaChatLocalKaggle
Gemma on Vertex AI Model Garden
Requires langchain-google-vertexai.
from langchain_google_vertexai.gemma import GemmaChatVertexAIModelGarden
Vertex AI image captioning
Implementation of the Image Captioning model as a chat. Requires langchain-google-vertexai.
from langchain_google_vertexai.vision_models import VertexAIImageCaptioningChat
Vertex AI image editor
Given an image and a prompt, edit the image. Currently only supports mask-free editing. Requires langchain-google-vertexai.
from langchain_google_vertexai.vision_models import VertexAIImageEditorChat
Vertex AI image generator
Generates an image from a prompt. Requires langchain-google-vertexai.
from langchain_google_vertexai.vision_models import VertexAIImageGeneratorChat
Vertex AI visual QnA
Chat implementation of a visual QnA model. Requires langchain-google-vertexai.
from langchain_google_vertexai.vision_models import VertexAIVisualQnAChat
LLMs
You can also use the (legacy) string-in, string-out LLM interface.
Vertex AI Model Garden
Access Gemini and hundreds of OSS models via the Vertex AI Model Garden service. Requires langchain-google-vertexai.
See a usage example.
from langchain_google_vertexai import VertexAIModelGarden
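For example, calling a model deployed to a Model Garden endpoint (a minimal sketch; the project and endpoint ID are placeholders for a deployment in your own project):
# project and endpoint_id below are placeholders
llm = VertexAIModelGarden(project="my-project", endpoint_id="1234567890")
print(llm.invoke("What is the meaning of life?"))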
Gemma local from Hugging Face
Local Gemma model loaded from Hugging Face. Requires langchain-google-vertexai.
from langchain_google_vertexai.gemma import GemmaLocalHF
Gemma local from Kaggle
Local Gemma model loaded from Kaggle. Requires langchain-google-vertexai.
from langchain_google_vertexai.gemma import GemmaLocalKaggle
Gemma on Vertex AI Model Garden
Requires langchain-google-vertexai.
from langchain_google_vertexai.gemma import GemmaVertexAIModelGarden
Vertex AI image captioning
Implementation of the Image Captioning model as an LLM. Requires langchain-google-vertexai.
from langchain_google_vertexai.vision_models import VertexAIImageCaptioning
Embedding Models
Vertex AI
Generate embeddings using models deployed on Vertex AI. Requires langchain-google-vertexai.
See a usage example.
from langchain_google_vertexai import VertexAIEmbeddings
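A minimal usage sketch (the model name below is one of the Vertex AI text embedding models and is given here as an assumed example):
embeddings = VertexAIEmbeddings(model_name="text-embedding-004")
vector = embeddings.embed_query("What are embeddings?")
print(vector[:5])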
Document Loaders
Load documents from various Google Cloud sources.
AlloyDB for PostgreSQL
Google Cloud AlloyDB is a fully managed PostgreSQL-compatible database service.
Install the python package:
pip install langchain-google-alloydb-pg
See usage example.
from langchain_google_alloydb_pg import AlloyDBLoader # AlloyDBEngine also available
BigQuery
Google Cloud BigQuery is a serverless data warehouse.
Install with BigQuery dependencies:
pip install langchain-google-community[bigquery]
See a usage example.
from langchain_google_community import BigQueryLoader
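For example, loading the rows returned by a query as documents (a minimal sketch; the project, dataset, and table names are placeholders):
# Each returned row becomes one Document
loader = BigQueryLoader("SELECT title, abstract FROM `my_project.my_dataset.my_table` LIMIT 10")
docs = loader.load()
print(docs[0].page_content)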
Bigtable
Google Cloud Bigtable is a fully managed NoSQL Big Data database service.
Install the python package:
pip install langchain-google-bigtable
See usage example.
from langchain_google_bigtable import BigtableLoader
Cloud SQL for MySQL
Google Cloud SQL for MySQL is a fully-managed MySQL database service.
Install the python package:
pip install langchain-google-cloud-sql-mysql
See usage example.
from langchain_google_cloud_sql_mysql import MySQLLoader # MySQLEngine also available
Cloud SQL for SQL Server
Google Cloud SQL for SQL Server is a fully-managed SQL Server database service.
Install the python package:
pip install langchain-google-cloud-sql-mssql
See usage example.
from langchain_google_cloud_sql_mssql import MSSQLLoader # MSSQLEngine also available
Cloud SQL for PostgreSQL
Google Cloud SQL for PostgreSQL is a fully-managed PostgreSQL database service.
Install the python package:
pip install langchain-google-cloud-sql-pg
See usage example.
from langchain_google_cloud_sql_pg import PostgresLoader # PostgresEngine also available
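A rough sketch of the engine/loader pattern used by this package (all connection values are placeholders, and exact method names may differ between package versions):
from langchain_google_cloud_sql_pg import PostgresEngine, PostgresLoader
# All connection values below are placeholders
engine = PostgresEngine.from_instance(
    project_id="my-project",
    region="us-central1",
    instance="my-instance",
    database="my-database",
)
loader = PostgresLoader.create_sync(engine, table_name="my_table")
docs = loader.load()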
Cloud Storage
Cloud Storage is a managed service for storing unstructured data.
Install with GCS dependencies:
pip install langchain-google-community[gcs]
Load from a directory or a specific file:
from langchain_google_community import GCSDirectoryLoader
See file usage example.
from langchain_google_community import GCSFileLoader
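A minimal sketch of both loaders (the project, bucket, prefix, and blob names are placeholders):
# Load every blob under a bucket prefix, or a single blob, as documents
dir_loader = GCSDirectoryLoader(project_name="my-project", bucket="my-bucket", prefix="reports/")
file_loader = GCSFileLoader(project_name="my-project", bucket="my-bucket", blob="reports/2024.pdf")
docs = file_loader.load()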
Cloud Vision loader
Load data using the Google Cloud Vision API.
Install with Vision dependencies:
pip install langchain-google-community[vision]
from langchain_google_community.vision import CloudVisionLoader
El Carro for Oracle Workloads
Google El Carro Oracle Operator runs Oracle databases in Kubernetes.
Install the python package:
pip install langchain-google-el-carro
See usage example.
from langchain_google_el_carro import ElCarroLoader
Firestore (Native Mode)
Google Cloud Firestore is a NoSQL document database.
Install the python package:
pip install langchain-google-firestore
See usage example.
from langchain_google_firestore import FirestoreLoader
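For example, loading every document in a collection (a minimal sketch; the collection name is a placeholder):
# "my-collection" is a placeholder Firestore collection name
loader = FirestoreLoader("my-collection")
docs = loader.load()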
Firestore (Datastore Mode)
Install the python package:
pip install langchain-google-datastore
See usage example.
from langchain_google_datastore import DatastoreLoader
Memorystore for Redis
Google Cloud Memorystore for Redis is a fully managed Redis service.
Install the python package:
pip install langchain-google-memorystore-redis
See usage example.
from langchain_google_memorystore_redis import MemorystoreDocumentLoader