User bio
404 bio not found
Member since Feb 14, 2018
Posts:
Ariel has not published any posts yet.
Replies:

Really nice job! Since I was curious about how it could work locally (using LLaMA 3), I gave it a try myself and wanted to share the result in case anyone else is interested, just like I was.

I just added a few lines and modified others.

First, the libraries:

from langchain_ollama import OllamaEmbeddings, ChatOllama
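
The rest of the snippets also rely on imports that were already part of the original article. For reference, I believe these are the relevant ones (the exact module paths are my assumption and may vary with your langchain version):

from langchain_core.output_parsers import StrOutputParser  # turns the model output into a plain string
from langchain_iris import IRISVector  # vector store backed by InterSystems IRIS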

Then, for the embeddings: since the original version uses OpenAI and requires an API key, I wanted to change that. Here's what I came up with:

embeddings = OllamaEmbeddings(
    model="llama3",  # model name as known to the local Ollama server
)
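
To sanity-check that the local model is actually serving embeddings, you can embed a test string first (the query text here is just an example):

vec = embeddings.embed_query("Which tables store patient data?")
print(len(vec))  # the dimensionality depends on the embedding model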

And for the interaction with the DB:

tables_vector_store = IRISVector.from_documents(
    embedding=embeddings,             # the Ollama embeddings defined above
    documents=tables_docs,
    connection_string=iris_conn_str,
    collection_name="sql_tables",
    pre_delete_collection=True        # drop any existing collection before loading
)
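
Since IRISVector implements the standard langchain vector store interface, a quick way to check what got indexed is a plain similarity search (the query and k are illustrative):

docs = tables_vector_store.similarity_search("patient admissions", k=4)
for d in docs:
    print(d.page_content)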

And of course, in case you need the prompt to interact with Ollama:

def get_sql_from_text(context, prompt, user_input, use_few_shots, tables_vector_store, table_df, example_selector=None, example_prompt=None):
    # get_relevant_tables is a helper from the original article; it picks the tables relevant to the question
    relevant_tables = get_relevant_tables(user_input, tables_vector_store, table_df)
    context["table_info"] = "\n\n".join(relevant_tables)

    # few-shot examples are added only when an example_selector is provided
    examples = example_selector.select_examples({"input": user_input}) if example_selector else []
    context["examples_value"] = "\n\n".join([
        example_prompt.invoke(x).to_string() for x in examples
    ])

    model = ChatOllama(
        model="llama3",
        temperature=0,
        # other params...
    )
    output_parser = StrOutputParser()
    chain_model = prompt | model | output_parser

    response = chain_model.invoke({
        "top_k": context["top_k"],
        "table_info": context["table_info"],
        "examples_value": context["examples_value"],
        "input": user_input
    })
    return response
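
Calling it then looks something like this, assuming the prompt template and table_df DataFrame from the original article (the question and top_k value below are placeholders):

context = {"top_k": 3}  # top_k is consumed by the prompt template
sql = get_sql_from_text(
    context,
    prompt,
    "How many patients were admitted last month?",
    use_few_shots=False,
    tables_vector_store=tables_vector_store,
    table_df=table_df,
)
print(sql)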

And that’s it! The rest keeps working just as smoothly as before.

A couple of notes: Ollama is running locally on my machine (to take advantage of the GPU), and it's running the LLaMA 3 model as specified in the parameter.
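
(If the model is not available locally yet, ollama pull llama3 downloads it first; the Ollama server also needs to be running so the langchain_ollama calls can reach it, by default at http://localhost:11434.)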

I also tried changing the language of the prompt to Spanish, and that worked perfectly too—no issues at all!

Hope this helps someone out there who's looking to do the same. Enjoy!

Hi @reachgr-g

Which TrakCare version are you working on? I cannot see the CT_Staff table in my version (2020), only CT_StaffType, which is related to users rather than CT_Hospital.

Or is CT_Staff a made-up name, and you are looking for how to get data from some tables that are not directly linked to others?

Happy to help, but I may need something more specific.

Hi @Bukhtiar Ahmad,

Patient Registry is intended to maintain or visualize patient data, but not to "create" it. Did you try to add a patient directly through some SDA or HL7 file?

Another alternative to "create" patients is from the Registry using the "Test Utility", but there you can only enter demographics or upload documents using some IHE profile... the other alternative is using the Registry's API.

Certifications & Credly badges:
Ariel has no Certifications & Credly badges yet.
Global Masters badges:
Ariel has no Global Masters badges yet.
Followers:
Ariel has no followers yet.
Following: