Hi Rajan,
OpenAI
To access a corporate-hosted OpenAI service, you can set base_url in the vectorizer configuration (see the docs for all config options).
With classic vectorizer
from weaviate.classes.config import Configure

# client is an existing, connected weaviate.WeaviateClient (v4) instance
client.collections.create(
    name="DemoCollection",
    vectorizer_config=Configure.Vectorizer.text2vec_openai(
        model="text-embedding-3-small",
        base_url="<custom_openai_url>",
    ),
)
With named vectorizer
from weaviate.classes.config import Configure

client.collections.create(
    name="DemoCollection",
    vectorizer_config=[
        Configure.NamedVectors.text2vec_openai(
            name="title_vector",
            source_properties=["title", "description"],  # properties to vectorize
            model="text-embedding-3-small",
            base_url="<custom_openai_url>",
        )
    ],
)
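For completeness, here's a minimal sketch of how I'd wire this up end to end. The connection helper, environment variable, and test data below are assumptions on my side (adjust them to your deployment); the point is that the API key goes in at connection time, and Weaviate forwards embedding requests to your base_url.

import os
import weaviate

# Minimal sketch: the API key for the corporate endpoint is passed as a header
# when connecting; Weaviate forwards embedding calls to the base_url set above.
client = weaviate.connect_to_local(
    headers={"X-OpenAI-Api-Key": os.environ["OPENAI_API_KEY"]}  # assumed env var
)

collection = client.collections.get("DemoCollection")

# Quick check that vectorization works against the custom endpoint
collection.data.insert({"title": "Test", "description": "Hello base_url"})
response = collection.query.near_text(query="greeting", limit=1)
print(response.objects[0].properties)

client.close()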
Azure
If your OpenAI models are hosted on Azure, your code should look like this instead (see the docs):
With classic vectorizer
from weaviate.classes.config import Configure

client.collections.create(
    name="DemoCollection",
    vectorizer_config=Configure.Vectorizer.text2vec_azure_openai(
        deployment_id="text-embedding-3-small",  # your Azure deployment name
        resource_name="<azure-resource-name>",   # your Azure resource name
        base_url="<custom_openai_url>",
    ),
)
With named vectorizer
from weaviate.classes.config import Configure

client.collections.create(
    name="DemoCollection",
    vectorizer_config=[
        Configure.NamedVectors.text2vec_azure_openai(
            name="title_vector",
            source_properties=["title", "description"],  # properties to vectorize
            deployment_id="text-embedding-3-small",
            resource_name="<azure-resource-name>",
            base_url="<custom_openai_url>",
        )
    ],
)
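One extra note for Azure: the key itself isn't part of the collection config, it's passed when you connect the client. A minimal sketch, assuming the X-Azure-Api-Key header and env var name (please double-check the header name in the docs for your client version):

import os
import weaviate

# Minimal sketch, assuming the X-Azure-Api-Key header and env var name:
# the Azure OpenAI key is supplied at connection time, while deployment_id,
# resource_name, and base_url stay in the collection config above.
client = weaviate.connect_to_local(
    headers={"X-Azure-Api-Key": os.environ["AZURE_OPENAI_API_KEY"]}
)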