- HealthShare Unified Care Record Overview – Virtual May 15-16, 2025
- The HealthShare Unified Care Record Overview course is a great way to become familiar with Unified Care Record, especially for those who need to understand its capabilities but not how to configure it.
- This is a non-technical, instructor-led virtual training course that provides a comprehensive introduction to HealthShare Unified Care Record.
- This course is for anyone who needs to know about the functionality and architecture of HealthShare Unified Care Record. (If you need information on configuring and troubleshooting Unified Care Record, consider the HealthShare Unified Care Record Fundamentals class.)
- No prior knowledge or experience is required for the Overview class, and any InterSystems employee may enroll.
- Self-Register Here
My AI use case - need help with Ollama and/or LangChain
I am brand new to using AI. I downloaded some medical visit progress notes from my Patient Portal and extracted the text from the PDF files. I found a YouTube video that showed how to extract metadata using an OpenAI query/prompt such as this one:
ollama-ai-iris/data/prompts/medical_progress_notes_prompt.txt at main · oliverwilms/ollama-ai-iris
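(For context, the PDF-to-text step can be done with something like the sketch below, assuming the pypdf package; the file names are placeholders, not the actual files from the repo.)

# PDF-to-text sketch, assuming pypdf is installed (pip install pypdf);
# "visit_note.pdf" / "visit_note.txt" are placeholder names.
from pypdf import PdfReader

reader = PdfReader("visit_note.pdf")
# extract_text() can return None for pages with no extractable text, hence the "or ''".
text = "\n".join(page.extract_text() or "" for page in reader.pages)

with open("visit_note.txt", "w", encoding="utf-8") as f:
    f.write(text)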
I combined @Rodolfo Pscheidt's https://github.com/RodolfoPscheidtJr/ollama-ai-iris with some files from @Guillaume Rongier's https://openexchange.intersystems.com/package/iris-rag-demo.
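From the traceback below, the query path goes through a LlamaIndex query engine backed by the Ollama LLM integration, so query_data.py presumably does something roughly like this sketch. The model names and folder paths are my placeholders, and the actual projects wire in an IRIS vector store rather than the default in-memory index.

# Rough sketch of the query path seen in the traceback, assuming LlamaIndex
# with its Ollama LLM and Ollama embeddings integrations; model names and
# paths are placeholders, and the real projects use an IRIS vector store
# instead of the default in-memory index.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.ollama import OllamaEmbedding

Settings.llm = Ollama(model="llama3", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Index the extracted progress-note text files.
documents = SimpleDirectoryReader("data/notes").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# Feed the metadata-extraction prompt to the query engine -- this is the
# query_engine.query() call that fails in the traceback.
with open("data/prompts/medical_progress_notes_prompt.txt") as f:
    prompt_data = f.read()

print(query_engine.query(prompt_data))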
I attempted to run the following command and got this traceback:
python3 query_data.py
Traceback (most recent call last):
File "/irisdev/app/query_data.py", line 39, in <module>
response = query_engine.query(prompt_data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py", line 322, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/base/base_query_engine.py", line 52, in query
query_result = self._query(str_or_query_bundle)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py", line 322, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/query_engine/retriever_query_engine.py", line 179, in _query
response = self._response_synthesizer.synthesize(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py", line 322, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/response_synthesizers/base.py", line 241, in synthesize
response_str = self.get_response(
^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py", line 322, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/response_synthesizers/compact_and_refine.py", line 43, in get_response
return super().get_response(
^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py", line 322, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/response_synthesizers/refine.py", line 179, in get_response
response = self._give_response_single(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/response_synthesizers/refine.py", line 241, in _give_response_single
program(
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py", line 322, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/response_synthesizers/refine.py", line 85, in __call__
answer = self._llm.predict(
^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py", line 322, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/llms/llm.py", line 605, in predict
chat_response = self.chat(messages)
^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py", line 322, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/core/llms/callbacks.py", line 173, in wrapped_llm_chat
f_return_val = f(_self, messages, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/llama_index/llms/ollama/base.py", line 322, in chat
response = self.client.chat(
^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/ollama/_client.py", line 333, in chat
return self._request(
^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/ollama/_client.py", line 178, in _request
return cls(**self._request_raw(*args, **kwargs).json())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/irisowner/.local/lib/python3.12/site-packages/ollama/_client.py", line 124, in _request_raw
raise ConnectionError(CONNECTION_ERROR_MESSAGE) from None
ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download
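If I understand the final ConnectionError correctly, the ollama Python client cannot reach an Ollama server at the address it is using (the default is http://localhost:11434). Since the script runs inside a container (/irisdev/app, /home/irisowner), localhost there is the container itself, not the machine where Ollama is actually running. A sketch of the connectivity check I have in mind; the host value is only an example and depends on the Docker setup:

# Connectivity check, assuming Ollama's default port 11434. The host is an
# example: from inside a Docker container the server is often reached via
# host.docker.internal or a compose service name rather than localhost.
import ollama

client = ollama.Client(host="http://host.docker.internal:11434")
print(client.list())  # raises ConnectionError if the server is unreachable

# The LlamaIndex Ollama LLM accepts the same kind of override, e.g.:
#   Ollama(model="llama3", base_url="http://host.docker.internal:11434")

If the server turns out to be reachable, I assume the model named in the script also needs to be pulled on the Ollama side (ollama pull <model>), or the next request will fail with a model-not-found error.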