Benjamin De Boe · Feb 26, 2016

Hi Benjamin,

In order to enable a web application to use iKnow, you need to check the "iKnow" box on the SMP's Web Application management page (System Administration > Security > Applications > Web Applications). This was mentioned in the release notes of the first version that introduced the stricter security policies (or at least the routine behind the checkbox is), but it isn't mentioned prominently enough in the iKnow guide. We'll look into that.

This is actually only related to the web interfaces, so Atelier is not involved here. To create iKnow domain definitions through the Management Portal, look for the "iKnow Architect" in the SMP menu for iKnow.

regards,
benjamin
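As an aside, the iKnow Architect essentially maintains a domain definition class for you. The sketch below shows roughly what such a class looks like when written by hand, using a made-up class name MyApp.MyDomain and a file lister pointing at a hypothetical directory of plain-text files; the XML element and attribute names are quoted from memory, so please double-check them against the %iKnow.DomainDefinition documentation.

    Class MyApp.MyDomain Extends %iKnow.DomainDefinition
    {

    XData Domain [ XMLNamespace = "http://www.intersystems.com/iknow" ]
    {
    <domain name="MyDomain">
    <data>
    <!-- file lister pointing at a (hypothetical) directory of plain-text files;
         element and attribute names are from memory, check the class reference -->
    <files path="C:\data\texts\" />
    </data>
    </domain>
    }

    }

Compiling such a class registers the domain, and the iKnow Architect is essentially a UI for creating and maintaining these classes.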
Benjamin De Boe · Feb 1, 2016

Hi Jack,

This is not an out-of-the-box feature of the iKnow technology. iKnow's semantic analysis is targeted at identifying the semantic entities of a sentence, but not at interpreting them, which is typically an application-specific activity. However, we do have some building blocks that will help you create such applications, combining the iKnow analysis of a sentence with domain knowledge you already have.

If you look at the indexing results for such a sentence, you'll see that the entities iKnow identifies will usually already present a good structure for your sentence, and human questions are often not that complicated. However, if the database you'll be querying is just un-interpreted free text as well, you'll need much more magic. If you're looking at querying a well-known data structure, it's much more feasible. I once wrote a crude text-to-MDX query tool that translated natural language questions into MDX by matching the concepts in the question to the labels on the dimensions and measures of a DeepSee cube definition. In this case, iKnow played its part in decomposing the question into concepts and relationships, which were then easily "interpreted" by custom code as cube elements and MDX constructs; see the sketch below.

So, in short, iKnow will help you in the semantic analysis of natural language text, but depending on the complexity of the domain, more dedicated (and expensive) tools are usually needed for the subsequent interpretation and inference of results.

benjamin
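To make that matching step a bit more concrete, here is a minimal, hypothetical sketch. It assumes the question has already been run through the iKnow engine and its concepts are handed in as a $LIST of strings, and it relies on a made-up helper method MatchCubeElement() that looks each concept up against the captions of a cube's dimensions and measures. This is not the actual tool described above, just an illustration of the idea.

    /// Sketch: turn a list of iKnow concepts into a simple MDX query by matching
    /// each concept against the elements of a (hypothetical) DeepSee cube.
    ClassMethod BuildMDX(pConcepts As %List, pCube As %String = "Patients") As %String
    {
        set tRows = "", tCols = "[Measures].[%COUNT]", tPtr = 0
        while $listnext(pConcepts, tPtr, tConcept) {
            // MatchCubeElement() is a made-up helper returning the MDX spec
            // (e.g. "[DiagD].[H1].[Diagnosis].[diabetes]") of the dimension member
            // or measure whose caption matches this concept, or "" if none does
            set tSpec = ..MatchCubeElement(pCube, tConcept, .tIsMeasure)
            continue:tSpec=""
            if $get(tIsMeasure) {
                set tCols = tSpec
            } else {
                set tRows = tRows_$select(tRows="":"",1:" * ")_tSpec
            }
        }
        set tMDX = "SELECT "_tCols_" ON 0"
        set:tRows'="" tMDX = tMDX_", "_tRows_" ON 1"
        quit tMDX_" FROM ["_pCube_"]"
    }

A question like "how many diabetes patients in 2015" would then boil down to one measure or count on columns and a crossjoin of the matched members on rows.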
Benjamin De Boe · Jan 28, 2016

Interesting, thanks for trying this out. Maybe I was asking too much when I tested with a subdirectory of the root web application, in order to still see other CSP pages from my abc namespace. I'd also still need to look for a convenient way to map JavaScript files in the same way. But at least we're halfway there :o)
Benjamin De Boe · Jan 28, 2016

Hi Ben, thanks for your reply. That's what I tested first, but it didn't seem to work, maybe because it somehow still needs the CSP file to be in the install/CSP/xyz/ folder, whereas it only exists in install/CSP/abc/. I also tried adding a web app /csp/xyz/test/ that referred to the abc folder and the xyz namespace, but that was probably too optimistic (or messy).
Benjamin De Boe · Jan 19, 2016

Hi Jack,

What exactly do you mean by searching using natural language? If you're referring to combined search strings like "snow AND (ski OR ice-skat*)", this is possible today; see the example below. Or do you mean asking a real, literal question? Can you give an example?

thanks,
benjamin
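For instance, assuming the iFind index from the stemming example in the reply further down (index MyBasicIndex on ThePackage.MyClass), such a combined search string can be passed straight into search_index():

    SELECT %ID, MyStringProperty
    FROM ThePackage.MyClass
    WHERE %ID %FIND search_index(MyBasicIndex, 'snow AND (ski OR ice-skat*)')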
Benjamin De Boe · Jan 18, 2016

Hi Jack,

You can enable stemming by setting the INDEXOPTION index parameter to 1 (or by leveraging the more flexible TRANSFORMATIONSPEC index parameter if you are on 2016.1):

    Class ThePackage.MyClass Extends %Persistent
    {
    Property MyStringProperty As %String;

    Index MyBasicIndex On (MyStringProperty) As %iFind.Index.Basic(INDEXOPTION=1);
    }

The class reference for %iFind.Index.Basic also explains how you can toggle between stemmed and normal search by using the search mode argument:

    SELECT * FROM ThePackage.MyClass
    WHERE %ID %FIND search_index(MyBasicIndex, 'interesting')

for normal search, versus using search option 1 for stemmed search:

    SELECT * FROM ThePackage.MyClass
    WHERE %ID %FIND search_index(MyBasicIndex, 'interesting', 1)

We do not discard stop words in iFind, in order to ensure you can query for any literal word sequence afterwards. If you start looking at the projections for entities (cf. the %iKnow.Index.Analytic class reference), you'll see how iKnow offers you a more insightful view of what a sentence is about through the "entity" level, where classic search tech may only offer you the words of a sentence minus the stop words.

regards,
benjamin
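As an illustration of that last point: if I recall the iFind search string syntax correctly, double quotes ask for a literal word sequence, so a query like the one below should only return records containing that exact phrase, stop words included (do verify the quoting syntax against the %iFind documentation before relying on it):

    SELECT * FROM ThePackage.MyClass
    WHERE %ID %FIND search_index(MyBasicIndex, '"out of the box"')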
Benjamin De Boe · Jan 18, 2016

Hi Jack, thanks for sharing your question.

iFind actually only uses the iKnow engine, the internal piece of machinery that analyzes natural language text to identify semantic entities and their context. It does not use the iKnow domain infrastructure, which most of the documentation is focused on, but files the output of the iKnow engine into index structures that can be queried using the %FIND syntax or through some of the additional projections in search scenarios.

In order to create an iFind index on your table, you simply add it to the class definition (more info here) and then call the regular %BuildIndices() method (if the table already contained data); see the sketch below. In a sense, iFind is a more lightweight, search-oriented SQL index type, whereas the iKnow domain infrastructure offers a broader environment for exploring entities and their context.

FYI, I've posted an example search portal built on top of iFind here.
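For completeness, a minimal sketch of those two steps, using a made-up class MyApp.Review with a string property holding the text to be indexed:

    Class MyApp.Review Extends %Persistent
    {
    Property ReviewText As %String(MAXLEN = "");

    Index ReviewFTI On (ReviewText) As %iFind.Index.Basic;
    }

followed by a one-off build of the new index for the rows that already exist, for example from the terminal:

    do ##class(MyApp.Review).%BuildIndices($lb("ReviewFTI"))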