Really nice job! As I was curious about how it could work locally (using LLaMA 3), I did my part and wanted to share it in case anyone else is interested—just like I was.
I just added a few lines and modified others.
First, the libraries:
from langchain_ollama import OllamaEmbeddings, ChatOllama
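If you want to double-check that your local Ollama install and the llama3 model respond before going further, a quick smoke test might look like this (it assumes the Ollama server is already running and the model has been pulled; the prompt text is just an example):
# Minimal smoke test for the local Ollama setup
# (assumes `ollama pull llama3` was done and the Ollama server is running on its default port).
probe_embeddings = OllamaEmbeddings(model="llama3")
print(len(probe_embeddings.embed_query("hello")))  # prints the embedding dimension

probe_chat = ChatOllama(model="llama3", temperature=0)
print(probe_chat.invoke("Reply with one word: OK").content)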
Then, for the embeddings—since the original version uses OpenAI and requires an API key—I wanted to change that. Here's what I came up with:
embeddings = OllamaEmbeddings(
    model="llama3",
)
And for the interaction with the DB:
tables_vector_store = IRISVector.from_documents(
    embedding=embeddings,
    documents=tables_docs,
    connection_string=iris_conn_str,
    collection_name="sql_tables",
    pre_delete_collection=True
)
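If you want to sanity-check the vector store before wiring everything up, a plain similarity search against it is a quick test (the query string and the k value here are just examples; similarity_search is the standard LangChain VectorStore method):
# Quick check that the embeddings and the IRIS collection work end to end.
docs = tables_vector_store.similarity_search("tables related to patients", k=3)
for d in docs:
    print(d.page_content[:80])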
And that’s it! The rest keeps working just as smoothly as before.
And of course, here's the change you need if you want the prompt to interact with Ollama:
def get_sql_from_text(context, prompt, user_input, use_few_shots, tables_vector_store, table_df, example_selector=None, example_prompt=None):
    # Pull only the tables relevant to the question and add them to the prompt context
    relevant_tables = get_relevant_tables(user_input, tables_vector_store, table_df)
    context["table_info"] = "\n\n".join(relevant_tables)

    # Few-shot examples, if an example selector was provided
    examples = example_selector.select_examples({"input": user_input}) if example_selector else []
    context["examples_value"] = "\n\n".join([
        example_prompt.invoke(x).to_string() for x in examples
    ])

    # Local model served by Ollama instead of the OpenAI chat model
    model = ChatOllama(
        model="llama3",
        temperature=0,
        # other params...
    )

    # StrOutputParser comes from langchain_core.output_parsers, as in the original article
    output_parser = StrOutputParser()
    chain_model = prompt | model | output_parser

    response = chain_model.invoke({
        "top_k": context["top_k"],
        "table_info": context["table_info"],
        "examples_value": context["examples_value"],
        "input": user_input
    })
    return response
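For reference, here's roughly how I call it. Everything except the Ollama pieces (prompt, context, tables_vector_store, table_df, get_relevant_tables) comes from the original article, and the question string is just an illustrative example:
# Hypothetical call, only to show the wiring.
context = {"top_k": 3}
question = "How many patients were admitted in 2023?"  # example question
sql = get_sql_from_text(
    context,
    prompt,                     # the same prompt template used in the article
    question,
    use_few_shots=False,
    tables_vector_store=tables_vector_store,
    table_df=table_df,
)
print(sql)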
And that’s it! The rest keeps working just as smoothly as before.
A couple of notes: Ollama is running locally on my machine (to take advantage of the GPU), serving the LLaMA 3 model specified in the model parameter.
I also tried changing the language of the prompt to Spanish, and that worked perfectly too—no issues at all!
Hope this helps someone out there who's looking to do the same. Enjoy!