Yeah, OpenWebUI would be more on the retrieval side, though it does have its own RAG and vector storage built in. To go a level further, would it be an idea to build the IRIS capabilities around/into OpenWebUI, since it is open source? That way you save the effort on the front-end retrieval/chat interface (plus the LLM and Ollama integrations for local models), which are maintained there by others, and build the functionality you have for IRIS around that. E.g. add a section for direct semantic search, and, for the documents piece, the ability to configure an IRIS DB as a target for vector/document data storage. (I understand this might be a bit too much for now though!)
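To make the "IRIS DB as a vector/document target" idea concrete, it would roughly mean pointing the document pipeline at an IRIS table with a VECTOR column. A rough sketch of the IRIS SQL involved (table and column names are made up for illustration, and this assumes an IRIS version with vector search support, i.e. 2024.1+):

```sql
-- Hypothetical table for document chunks (names are illustrative)
CREATE TABLE RagDocs (
    id        INT IDENTITY PRIMARY KEY,
    content   VARCHAR(4000),
    embedding VECTOR(DOUBLE, 384)   -- dimension must match the embedding model used
)

-- Retrieval: top-3 nearest chunks for a query embedding passed in as a parameter
SELECT TOP 3 id, content
FROM RagDocs
ORDER BY VECTOR_COSINE(embedding, TO_VECTOR(?, DOUBLE, 384)) DESC
```

The retrieval/chat UI would then only need to know how to write chunks into that table and run the similarity query; everything else stays in OpenWebUI.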
Great idea! I've been using open-webui to do similar stuff, but without IRIS obviously. There may be some ideas there, but the one I think would be good to plug into this is its built-in LiteLLM integration, which proxies LLM connections out to various providers. It's pretty simple to set up within the open-webui interface, and then through the UI you can hit multiple LLMs easily, even querying several at once to compare results.
It could also be used to expose different embedding models for use.
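For reference, the LiteLLM proxy is driven by a small YAML config that maps friendly model names to providers, including local Ollama models and embedding models. A sketch of what that looks like (the model names and the Ollama port are just illustrative defaults, not anything this project requires):

```yaml
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY   # read from the environment
  - model_name: local-llama                # served by a local Ollama instance
    litellm_params:
      model: ollama/llama3
      api_base: http://localhost:11434
  - model_name: local-embeddings           # embedding models can be proxied the same way
    litellm_params:
      model: ollama/nomic-embed-text
      api_base: http://localhost:11434
```

Anything registered here shows up as a selectable model in the UI, so comparing providers (or swapping embedding backends) is just a config change.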
https://github.com/open-webui/open-webui
https://litellm.vercel.app