Problem with ollama container in langchain-iris-tool
I cloned @Yuri.Marx's langchain-iris-tool repo and modified its docker-compose.yml per this post:
https://community.intersystems.com/post/error-when-trying-langchain-iris...
Now I see this:
docker ps -a
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
c585beb367e6 ollama/ollama:latest "/usr/bin/bash /mode…" 6 minutes ago Restarting (1) 55 seconds ago ollama
c59535140780 langchain-iris-tool-iris "/tini -- /docker-en…" 6 minutes ago Up 6 minutes (healthy) 2188/tcp, 8501/tcp, 54773/tcp, 0.0.0.0:51972->1972/tcp, :::51972->1972/tcp, 0.0.0.0:53795->52773/tcp, :::53795->52773/tcp, 0.0.0.0:32770->53773/tcp, :::32770->53773/tcp langchain-iris-tool-iris-1
e898e27c7275 yourstreamlitapp:latest "streamlit run /usr/…" 6 minutes ago Up 6 minutes 0.0.0.0:8501->8501/tcp, :::8501->8501/tcp langchain-iris-tool-streamlit-1
The ollama container is stuck in a restart loop (status "Restarting (1)", i.e. it keeps exiting with code 1). Why?
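To find out why it keeps exiting, the container's logs and entrypoint can be inspected. A sketch of the diagnostic commands I'd run (the container name ollama is taken from the docker ps output above):

```shell
# Show the last output of the failing entrypoint script
# (the "/usr/bin/bash /mode…" command from docker ps)
docker logs --tail 50 ollama

# Show the full entrypoint and arguments the container was started with
docker inspect --format '{{.Path}} {{.Args}}' ollama

# Confirm the exit code and how many times Docker has restarted it
docker inspect --format '{{.State.ExitCode}} {{.RestartCount}}' ollama
```

If the log shows something like "No such file or directory" for the script passed to bash, the volume mount or script path in docker-compose.yml is the likely culprit.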
And I get this error when I try the Iris Classes Chat feature:
ValueError: Error raised by inference endpoint: HTTPConnectionPool(host='ollama', port=11434): Max retries exceeded with url: /api/embeddings (Caused by NameResolutionError("<urllib3.connection.HTTPConnection object at 0x7f4d75c88d90>: Failed to resolve 'ollama' ([Errno -5] No address associated with hostname)"))
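My reading of the second error: the app resolves the hostname ollama through Docker's embedded DNS, which only has an entry for a service while its container is running, so the NameResolutionError looks like a direct consequence of the restart loop rather than a separate problem. For context, a minimal sketch of how such a service is typically declared in docker-compose.yml (the port, volume name, and comments are illustrative assumptions, not my actual file):

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"             # the port the inference endpoint targets
    volumes:
      - ollama_data:/root/.ollama # persist pulled models across restarts
    # The real file overrides the entrypoint with a script (per docker ps);
    # if that script is missing or fails, the container exits with code 1
    # and Docker keeps restarting it.
    restart: unless-stopped

volumes:
  ollama_data:
```

Is there a known fix for the entrypoint script the modified compose file uses, or a working docker-compose.yml for this repo?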