hi @coolrazor !!
Welcome to our community
I'm not sure I understood your question correctly.
Here is a working example using Weaviate (in Docker) and Ollama (running natively):
```python
import weaviate
from weaviate import classes as wvc

client = weaviate.connect_to_local()
print(f"Client: {weaviate.__version__}, Server: {client.get_meta().get('version')}")

# recreate the collection from scratch
client.collections.delete("Test")
collection = client.collections.create(
    name="Test",
    vectorizer_config=wvc.config.Configure.Vectorizer.text2vec_ollama(
        api_endpoint="http://host.docker.internal:11434",
        model="snowflake-arctic-embed",
    ),
    generative_config=wvc.config.Configure.Generative.ollama(
        api_endpoint="http://host.docker.internal:11434",
        model="llama3.1",
    ),
)

collection.data.insert({"text": "Why is the sky blue?"})

# print the length of the stored vector for the first object
print(len(collection.query.fetch_objects(include_vector=True).objects[0].vector.get("default")))
# run the single prompt against the fetched object and print the generated answer
print(collection.generate.fetch_objects(single_prompt="answer: {text}").objects[0].generated)

client.close()
```
Let me know how else I can help with this.
Thanks!