Announcement
Simon Player · Sep 12, 2017
Modern businesses need new kinds of applications: ones that are smarter, faster, and able to scale more quickly and cost-effectively to accommodate larger data sets, greater workloads, and more users. With this in mind, we have unveiled InterSystems IRIS Data Platform™, a complete, unified solution that provides a comprehensive and consistent set of capabilities spanning data management, interoperability, transaction processing, and analytics. It redefines high performance for application developers, systems integrators, and end-user organizations who develop and deploy data-rich, mission-critical solutions. InterSystems IRIS Data Platform provides all of the following capabilities in a single unified platform:
Data Management
An ultra-high-performance, horizontally scalable, multi-model database stores and accesses data modeled as objects, schema-free data, relational data, and multi-dimensional arrays in a single, highly efficient representation. It simultaneously processes both transactional and analytic workloads in a single database at very high scale, eliminating latencies between event, insight, and action, and reducing the complexity of maintaining multiple databases.
Interoperability
A comprehensive integration platform provides application integration, data coordination, business process orchestration, composite application development, API management, and real-time monitoring and alerting to support the full spectrum of integration scenarios and requirements.
Analytics
A powerful open analytics platform supports a wide range of analytics, including business intelligence, predictive analytics, distributed big data processing, real-time analytics, and machine learning. It can analyze real-time and batch data simultaneously at scale, and developers can embed analytic processing into business processes and transactional applications, enabling sophisticated programmatic decisions based on real-time analyses.
The analytics platform also provides natural language processing capabilities to extract meaning and sentiment from unstructured text, allowing organizations to streamline processes that reference customer emails, knowledge databases, social media content, and other unstructured text data.
Cloud Deployment
Automated "cloud-first" deployment options simplify public cloud, private cloud, on-premises, and virtual machine deployments and updates.
You can learn more about this new data platform by visiting our online learning page.
Simon Player, Director of Development, Data Platforms and TrakCare
Hi, did we have an IRIS cube like Caché and Ensemble, or is IRIS different? Please explain how to work with IRIS; I am really confused about it.
Here is a lot of information on InterSystems IRIS to reduce confusion. Your InterSystems sales rep will have more.
Announcement
Josh Lubarr · Oct 19, 2017
Hi, Community! We are pleased to invite you to participate in the InterSystems Documentation Satisfaction survey. As part of ongoing efforts to make our content more usable and helpful, Learning Services wants your feedback about InterSystems documentation. The survey covers many different areas of documentation, and you can complete it in about five minutes. The deadline for responses is October 30, and responses are anonymous.
The survey is over, thanks!
You can also earn 300 points in Global Masters for completing the survey.
Announcement
Developer Community Admin · Dec 2, 2015
InterSystems Atelier Field Test is now available. Supported customers who have credentials to access the WRC can find the client download here: https://wrc.intersystems.com/wrc/BetaPortalCloud2.csp
Thank you for your work! The Atelier field test is really cool.
Just two questions:
- Is there any chance to extend the 15-day test period? It would be great (I can't test it full time).
- Can we access the Atelier Ensemble test server with Studio? I've tried, but I couldn't. It would be very helpful to compare working with Atelier and Studio at the same time with the same code.
Kind regards!
Since you work on the client with Atelier, and only use the server for deployment/compilation/debugging, it does not matter if you need to restart the server every two days, and it allows bug fixes to be deployed automatically. If you export to XML from Atelier and use source control on a system using Studio, then you should see the code automatically refresh in Studio when you open the file.
The test is not limited to two weeks. That was simply the expiration on the license. We updated the beta facility with a new version and a more generous license. Happy coding!
I can confirm that it doesn't matter when my cloud instance goes away; I just go back to the portal and deploy the new version, and bugs that I found over the last few weeks are usually simply fixed.
Is there a way to force my "instance" to expire/update without waiting the two weeks?
What's the right support channel to use for the cloud-based Atelier FT server? I log in at https://wrc.intersystems.com/wrc/BetaPortalCloud.csp and click the "Launch Server" button. The button disappears immediately and is replaced by text saying "Your Atelier Server has been launched,
it will run for the next (expired)." and when I click the xxx-.isc.appsembler.com/csp/sys/UtilHome.csp URL I am offered, I get a page reading "
No Application Configured
This domain is not associated with an application.
"
John -
The WRC is handling any calls with regard to the beta cloud portal. I saw this and reached out to the cloud provider as the issue is coming from their side. I will let you know as soon as I hear back.
Hi Bill, when will the official release be out? We have been waiting since 2015, after our 2014 conference, when we heard it would be done by the end of 2015. Really excited not to have to work on Windows anymore, and also not to use Studio over remote connections, as it chews bandwidth and tends to be really slow.
Please read the announcement here: https://community.intersystems.com/post/announcement-about-cach%C3%A9-20162-and-20163-field-test-programs We have been holding updates of Atelier to finish refactoring to conform to the changes outlined there. Kits are being published for 16.2 today, and a new Atelier kit will be available in "Check for Updates" today or tomorrow.
Any progress on this yet? My 1.0.90 Atelier's "Check for Updates" still tells me it finds none. I've already updated my 2016.2 FT instance to the latest you've published (build 721), so I expect my Atelier won't play well with that until it gets updated too.
A new kit will be posted this week, probably today.
Updated just now and received build 232. Thanks.
Question
Ankita JAin · May 31, 2017
Hi, I am stuck on a unit test failure with InterSystems. When a unit test fails, the build in Jenkins succeeds, while it should fail. In Caché programming I am using the %UnitTest.Manager class and its DebugRunTestCase method. I'm able to link Studio with Jenkins. I want to fail my build in Jenkins if any of the test cases fail. Could anyone help?
Please provide details: how do you call Caché from Jenkins?
Please tell us clearly how to exit the Caché terminal via the command line so that the batch script can fail the Jenkins job, and whether this is feasible with InterSystems. We have tried all the methods from the InterSystems documentation library but still could not make progress (the Jenkins job is not failing based on the boolean value passed from Caché). Please suggest.
Hi Ankita, would you be able to provide some more details about how you currently run the unit tests from your batch script? Regards, Caza
Hi Caza,
We are facing the following issues :-
1. We are getting the boolean value for the passed and failed unit test cases in Caché, but we are not able to assign this boolean value to a variable in the batch script (r3 in this case).
2. @ECHO %FAILUREFLAG% is not giving any output; can you help with that as well?
Please suggest solutions for these problems or an alternative approach. The batch script is here:
:: Switch output mode to utf8 - for comfortable log reading
@chcp 65001
@SET WORKSPACE="https://github.com/intersystems-ru/CacheGitHubCI.git"
:: Check the presence of the variable initialized by Jenkins
@IF NOT DEFINED WORKSPACE EXIT 1
@SET SUCCESSFLG =%CD%\successflag.txt
@SET FAILUREFLAG =%CD%\failureflag.txt
@ECHO %FAILUREFLAG%
@DEL "%SUCCESSFLG%"
@DEL "%FAILUREFLAG%"
:: The assembly may end with different results
:: Let the presence of a specific file in the directory tell us about the problems with the assembly
:: %CD% - [C]urrent [D]irectory is a system variable
:: it contains the name of the directory in which the script (bat) is run
:: Delete the bad completion file from the previous start
::@DEL "%ERRFLAG%"
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
:: CREATING CACHE CONTROL SCRIPT
:: write the commands to manage the Cache line by line into build.cos
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
:: If the Cache is installed with normal or increased security,
:: the first two lines must be the Cache user name and password
@ECHO user>build.cos
@ECHO password>>build.cos
:: Go to the necessary NAMESPACE
@ECHO zn "MYNAMESPACE" >>build.cos
@ECHO do ##class(Util.SourceControl).Init() >>build.cos
@ECHO zn "User" >>build.cos
@ECHO set ^^UnitTestRoot ="C:\test" >>build.cos
@ECHO do ##class(%%UnitTest.Manager).RunTest("myunittest","/nodelete") >>build.cos
@Echo set i=$order(^^UnitTest.Result(""),-1) >>build.cos
@Echo write i >>build.cos
@Echo set unitResults=^^UnitTest.Result(i,"myunittest") >>build.cos
@ECHO Set r3 = $LISTGET(unitResults,1,1) >>build.cos
@ECHO if r3'= 0 set file = "C:/unittests/successflag.txt" o file:("NWS") u file do $SYSTEM.OBJ.DisplayError(r3) >>build.cos
@ECHO if r3'= 1 set file = "C:/unittests/failureflag.txt" o file:("NWS") u file do $SYSTEM.OBJ.DisplayError(r3) >>build.cos
::@ECHO if 'r3 do $SYSTEM.Process.Terminate(,1)
::@ECHO halt
@IF EXIST "%FAILUREFLAG%" EXIT 1
::@ECHO do ##class(%%UnitTest.Manager).RunTest("User") >>build.cos
::@ECHO set ut = do ##class(%%UnitTest.Manager).RunTest("User") >>build.cos
::set var = 1 >>build.coss
:: If it did not work out, we will show the error to see it in the logs of Jenkins
::@ECHO if ut'= 1 do $SYSTEM.OBJ.DisplayError(sc) >>build.cos
::echo %var%
@Echo On
@ECHO zn "%%SYS" >>build.cos
@ECHO set pVars("Namespace")="MYDEV" >>build.cos
@ECHO set pVars("SourceDir")="C:\source\MYNAMESPACE\src\cls\User" >>build.cos
@ECHO do ##class(User.Import).setup(.pVars)>>build.cos
:: Download and compile all the sources in the assembly directory;
::%WORKSPACE% - Jenkins variable
::@ECHO set sc = $SYSTEM.OBJ.ImportDir("%WORKSPACE%","*.xml","ck",.err,1) >>build.cos
:: If it did not work out, we will show the error to see it in the logs of Jenkins
::@ECHO if sc'= 1 do $SYSTEM.OBJ.DisplayError(sc) >>build.cos
:: and from the cos script create an error flag file to notify the bat script
::@ECHO if sc'= 1 set file = "%ERRFLAG%" o file:("NWS") u file do $SYSTEM.OBJ.DisplayError(sc) >>build.cos
:: Finish the Cache process
::@ECHO halt >>build.cos
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
:: Call the Cache control program and pass the generated script to it
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
C:\InterSystems\HealthShare\bin\cache.exe -s C:\InterSystems\HealthShare\mgr -U %%SYS <%CD%\build.cos
:: If the error file was created during the cos script execution - we notify Jenkins about this
::@IF EXIST "%ERRFLAG%" EXIT 1
Hi Ankita,
I found a couple of issues in the script that might affect your end results:
- The folder C:\unittests doesn't exist (at least not on my computer); unless the value of the WORKSPACE env variable is C:\unittests, you have to ensure the folder exists (you can create it either with the batch mkdir command or with the COS ##class(%File).CreateDirectoryChain() method).
- What is stored in the ^UnitTest.Result(i,"myunittest") global is not a status code but a numeric value, so I would suggest replacing do $SYSTEM.OBJ.DisplayError(r3) with a simple write command, like this:
@ECHO if r3'= 0 set file = "C:/unittests/successflag.txt" o file:("NWS") u file write r3 >>build.cos
@ECHO if r3'= 1 set file = "C:/unittests/failureflag.txt" o file:("NWS") u file write r3 >>build.cos
Regarding "@ECHO %FAILUREFLAG%" - make sure there are no spaces before or after the = character in the following two commands:
@SET SUCCESSFLG =%CD%\successflag.txt
@SET FAILUREFLAG =%CD%\failureflag.txt
When I did copy/paste of the example script I ended up with a space character before the = character.
Can you try these changes and let me know how you go?
Cheers, Caza
Thanks for your response, it worked!
If you add to your batch script something like:
do ##class(com.isc.UnitTest.Manager).OutputResultsXml("c:/unittests/unit_test_results.xml")
then you could pass the unit_test_results.xml file to the JUnit plugin in Jenkins. This will give you useful reports: duration, individual result breakdowns, etc.
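For a Jenkins Pipeline job, the wiring might look like the sketch below. This is a hedged example: the stage layout, script name, and results path are assumptions, while the junit step is the standard one provided by the Jenkins JUnit plugin.

```groovy
// Hypothetical declarative Jenkinsfile sketch: run the batch script that
// drives Caché, then publish the JUnit-format XML produced by
// OutputResultsXml(). Paths and names are assumptions.
pipeline {
    agent any
    stages {
        stage('Unit tests') {
            steps {
                // build.bat generates build.cos and feeds it to cache.exe;
                // a nonzero exit code fails this stage
                bat 'build.bat'
            }
        }
    }
    post {
        always {
            // JUnit plugin step: parses the XML and renders trend graphs
            junit 'unittests/unit_test_results.xml'
        }
    }
}
```

Publishing in the post/always block means the test report is recorded even when the test stage itself fails the build.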
For example: https://support.smartbear.com/testcomplete/docs/_images/working-with/integration/jenkins/trend-graph.png
Yay!
You can create a Windows batch file where you write commands to call the Caché terminal using cterm.exe.
Here are object and SQL approaches to get unit test status.
One gotcha on Windows is making sure the encoding of your .scr file is UTF-8. Windows thinks a .scr file is a screensaver file and can give it a weird encoding. Once I switched it to UTF-8, it worked as described in the documentation.
I don't know how you call the Caché method from Jenkins, but in any case you can use $SYSTEM.Process.Terminate in the Caché script to exit with an exit status. Something like this:
set tSC=##class(%UnitTest.Manager).DebugRunTestCase(....)
if 'tSC do $SYSTEM.Process.Terminate(,1)
halt
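On the Jenkins side, what actually fails the job is simply the nonzero exit status of the command that runs the COS script. A minimal shell sketch of that wiring, where run_cache_tests is a hypothetical stand-in for the real csession/cterm invocation:

```shell
# Sketch of how Jenkins sees a failing unit-test run: the exit status of
# the command that drives Caché is what fails the build step.
# "run_cache_tests" stands in for something like:
#   csession INSTANCE -U USER < build.cos
run_cache_tests() {
  return 1   # simulate $SYSTEM.Process.Terminate(,1) exiting with status 1
}

if run_cache_tests; then
  result="Unit tests passed"
else
  result="Unit tests failed"   # Jenkins marks the step as failed on exit 1
fi
echo "$result"
```

In a real job you would end the script with `exit $?` (or let the shell's last status propagate) so Jenkins fails the build.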
I suggest that you use csession or cterm to call the Caché code; you can then get the exit code and pass it to Jenkins, which will recognize a nonzero code as an error and fail the job.
Hi, thanks Dmitry Maslennikov for the response. Could you please help clarify the query below? Is there a method/way to find out whether any of the unit test cases are failing (we get a URL of a CSP page with a report showing the passed/failed status), as we need to send this failure status to Jenkins for the build to fail (at the moment we achieve the build failure/success based on a hardcoded boolean).
Hi,
We use something like the below to output the unit test results to an xml file in JUnit format.
/// Extend %UnitTest manager to output unit test results in JUnit format.
/// This relies on the fact that unit test results are stored in <b>^UnitTest.Result</b> global. Results displayed on CSP pages come from this global.
Class com.isc.UnitTest.Manager Extends %UnitTest.Manager
{
ClassMethod OutputResultsXml(pFileName As %String) As %Status
{
set File=##class(%File).%New(pFileName)
set i=$order(^UnitTest.Result(""),-1)
if i="" quit $$$OK // no results
kill ^||TMP // results global
set suite="" for {
set suite=$order(^UnitTest.Result(i,suite))
quit:suite=""
set ^||TMP("S",suite,"time")=$listget(^UnitTest.Result(i,suite),2)
set case="" for {
set case=$order(^UnitTest.Result(i,suite,case))
quit:case=""
if $increment(^||TMP("S",suite,"tests"))
set ^||TMP("S",suite,"C",case,"time")=$listget(^UnitTest.Result(i,suite),2)
set method="" for {
set method=$order(^UnitTest.Result(i,suite,case,method))
quit:method=""
set ^||TMP("S",suite,"C",case,"M",method,"time")=$listget(^UnitTest.Result(i,suite,case,method),2)
set assert="" for {
set assert=$order(^UnitTest.Result(i,suite,case,method,assert))
quit:assert=""
if $increment(^||TMP("S",suite,"assertions"))
if $increment(^||TMP("S",suite,"C",case,"assertions"))
if $increment(^||TMP("S",suite,"C",case,"M",method,"assertions"))
if $listget(^UnitTest.Result(i,suite,case,method,assert))=0 {
if $increment(^||TMP("S",suite,"failures"))
if $increment(^||TMP("S",suite,"C",case,"failures"))
if $increment(^||TMP("S",suite,"C",case,"M",method,"failures"))
set ^||TMP("S",suite,"C",case,"M",method,"failure")=$get(^||TMP("S",suite,"C",case,"M",method,"failure"))
_$listget(^UnitTest.Result(i,suite,case,method,assert),2)
_": "_$listget(^UnitTest.Result(i,suite,case,method,assert),3)
_$char(13,10)
}
}
if ($listget(^UnitTest.Result(i,suite,case,method))=0)
&& ('$data(^||TMP("S",suite,"C",case,"M",method,"failures"))) {
if $increment(^||TMP("S",suite,"failures"))
if $increment(^||TMP("S",suite,"C",case,"failures"))
if $increment(^||TMP("S",suite,"C",case,"M",method,"failures"))
set ^||TMP("S",suite,"C",case,"M",method,"failure")=$get(^||TMP("S",suite,"C",case,"M",method,"failure"))
_$listget(^UnitTest.Result(i,suite,case,method),3)
_": "_$listget(^UnitTest.Result(i,suite,case,method),4)
_$char(13,10)
}
}
if $listget(^UnitTest.Result(i,suite,case))=0
&& ('$data(^||TMP("S",suite,"C",case,"failures"))) {
if $increment(^||TMP("S",suite,"failures"))
if $increment(^||TMP("S",suite,"C",case,"failures"))
if $increment(^||TMP("S",suite,"C",case,"M",case,"failures"))
set ^||TMP("S",suite,"C",case,"M",case,"failure")=$get(^||TMP("S",suite,"C",case,"M",case,"failure"))
_$listget(^UnitTest.Result(i,suite,case),3)
_": "_$listget(^UnitTest.Result(i,suite,case),4)
_$char(13,10)
}
}
}
do File.Open("WSN")
do File.WriteLine("<?xml version=""1.0"" encoding=""UTF-8"" ?>")
do File.WriteLine("<testsuites>")
set suite="" for {
set suite=$order(^||TMP("S",suite))
quit:suite=""
do File.Write("<testsuite")
do File.Write(" name="""_$zconvert(suite,"O","XML")_"""")
do File.Write(" assertions="""_$get(^||TMP("S",suite,"assertions"))_"""")
do File.Write(" time="""_$get(^||TMP("S",suite,"time"))_"""")
do File.Write(" tests="""_$get(^||TMP("S",suite,"tests"))_"""")
do File.WriteLine(">")
set case="" for {
set case=$order(^||TMP("S",suite,"C",case))
quit:case=""
do File.Write("<testsuite")
do File.Write(" name="""_$zconvert(case,"O","XML")_"""")
do File.Write(" assertions="""_$get(^||TMP("S",suite,"C",case,"assertions"))_"""")
do File.Write(" time="""_$get(^||TMP("S",suite,"C",case,"time"))_"""")
do File.Write(" tests="""_$get(^||TMP("S",suite,"C",case,"tests"))_"""")
do File.WriteLine(">")
set method="" for {
set method=$order(^||TMP("S",suite,"C",case,"M",method))
quit:method=""
do File.Write("<testcase")
do File.Write(" name="""_$zconvert(method,"O","XML")_"""")
do File.Write(" assertions="""_$get(^||TMP("S",suite,"C",case,"M",method,"assertions"))_"""")
do File.Write(" time="""_$get(^||TMP("S",suite,"C",case,"M",method,"time"))_"""")
do File.WriteLine(">")
if $data(^||TMP("S",suite,"C",case,"M",method,"failure")) {
do File.Write("<failure type=""cache-error"" message=""Cache Error"">")
do File.Write($zconvert(^||TMP("S",suite,"C",case,"M",method,"failure"),"O","XML"))
do File.WriteLine("</failure>")
}
do File.WriteLine("</testcase>")
}
do File.WriteLine("</testsuite>")
}
do File.WriteLine("</testsuite>")
}
do File.WriteLine("</testsuites>")
do File.Close()
kill ^||TMP
quit $$$OK
}
}
Question
Jose Sampaio · Sep 19, 2018
Hi community members! Please, I'm looking for any references or experiences using InterSystems technologies with the MQTT (Message Queuing Telemetry Transport) protocol. Thanks in advance!
Hi Evgeny! I will take a look at this. Thanks.
Hi, Jose! Have you seen this article? Also pinging @Attila.Toth in hopes he can provide the most recent updates.
Announcement
Daniel Kutac · Oct 29, 2018
We had our first meetup of the Prague Meetup for InterSystems Data Platform last Thursday!
As it was our first such venue, the attendance was not large, but we believe it was a good start. Those who attended could learn about new features that InterSystems IRIS brings to our partners and customers as well as listen to a presentation discussing what it takes to migrate from Caché or Ensemble to InterSystems IRIS and eventually containerizing their applications.
We all enjoyed an excellent assortment of tea varieties, accompanied by vegetarian food (so you know what you can expect next time :) ).
Attached you will find a picture taken at the meetup.
Looking forward to seeing you next time, perhaps in a bigger group!
Dan Kutac
Congratulations! In the past, we had a similar event in Austria named "Tech Talk" that formed a national user community over time. I wish you a lot of success, Robert
Thank you, Robert! Dan
Announcement
Kristina Lauer · Jun 20, 2023
Dive into the details of InterSystems technologies with long-term online programs and classroom courses. Then, show off your skills with program and certification badges! Find all the details in this month's Learning newsletter. Don't forget to subscribe to receive the newsletter in your inbox!
Announcement
Vadim Aniskin · Jun 21, 2023
Hi Developers!
Welcome to the 7th edition of the InterSystems Ideas news bulletin! Read on to learn what has happened on the Ideas Portal since the previous bulletin:
✓ More than 200 ideas are already on the portal
✓ An idea was implemented by a Community member
✓ Implement an idea and get a tech bonus on the Grand Prix 23 Contest
✓ Ideas posted recently
Since the launch of the Ideas Portal, 204 ideas have been posted there. Now 25 of them have already been implemented, and 17 are planned for implementation.
@Francisco.López1549 was added to the Hall of Fame for implementing the idea IRIS classes for OpenAI API by @Yuval.Golendginer.
👏Thank you for implementing this idea👏
Developers participating in the InterSystems Grand Prix 23 Programming Contest can get 4 technical bonus points for the implementation of Community Opportunity ideas.
Recently posted ideas
1. Add a "Type-to-Filter" ability in dropdown selections by @Victoria.Castillo2990
2. Make Data Transformation UI guess message types automatically by @Evgeny.Shvarov
3. Make every Operation and Service expose its message classes by @Evgeny.Shvarov
4. Custom Visualizations for Physicians by @Ikram.Shah
5. Introduce InterSystems IRIS support for Apache Airflow by @Evgeny.Shvarov
6. Introduce an Interoperability module (adapter, operation) for pdf.co by @Evgeny.Shvarov
7. Add a parameter in Visual Trace to see message contents in XML or JSON by @Sylvain.Guilbaud
8. Specific cache buffers per DB by @Yaron.Munz8173
9. Module deployment support via Production Export in Dev environment by @Alexander.Woodhead
10. Environment variable support in System Default Settings by @Alexander.Woodhead
11. Settings should be a part of Mirroring by @Scott.Roth
12. Have nicknames for community users by @Minoru.Horita
13. Make all Production Item Settings available in Default Settings by @Stefan.Cronje1399
14. Add hyperlink to documentation web page from management portal options by @LuisAngel.PérezRamos
15. Delete drafts from InterSystems Developer Community by @Yuri.Gomes
👏Thank you for generating new ideas👏
Don't forget to vote, comment, and subscribe to the ideas to track and influence their progress.
And look out for the next news bulletin!
Hey Community! 👋 While reading comments on ideas, I found out that the idea "Include support for GraphQL" has been implemented. @Gevorg.Arutiunian9096, welcome to the "Hall of Fame" page of the Ideas Portal for implementing this idea! 👏
Hi Developers!
One more idea is implemented by Community members. 👏
@John.Murray, thank you so much for implementing the idea Unit testing in VSCode, and welcome to the "Hall of Fame" page! 🏆
@Lorenzo.Scalese 👋 thank you for implementing the idea "REST API for Security Package" and congratulations on your second entry in the "Hall of Fame" page! 👏
Article
Nikolay Solovyev · Jun 21, 2023
The Telegram Adapter for InterSystems IRIS serves as a bridge between the popular Telegram messaging platform and InterSystems IRIS, facilitating seamless communication and data exchange. By leveraging the capabilities of the Telegram API, the adapter allows developers to build robust chatbots, automate tasks, and integrate Telegram with InterSystems IRIS applications.
The most common scenarios where the Telegram Adapter can be used include:
Real-Time System Notifications: Sending immediate notifications about specific events occurring in the system. This can include alerts, updates, or important system information delivered to users or administrators via Telegram.
Customer Interaction and Chatbot Applications: This can involve various activities such as appointment scheduling, appointment reminders, and conducting NPS (Net Promoter Score) surveys to gather customer feedback. Intelligent chatbots can be created that integrate with existing systems, have access to extensive knowledge bases, and are capable of answering user questions.
Training and Testing: It can be used to deliver training materials, interactive quizzes, or conduct assessments through Telegram.
Key Features and Functionality
Bi-Directional Communication
All features of the Telegram Bot API are supported (https://core.telegram.org/bots/api)
Out-of-the-box business services, business operations, and messages
Adapter Technical Details
The telegram-adapter package includes not only the adapter itself but also ready-to-use business services (Telegram.LongPollingService.cls and Telegram.WebHookService.cls), a business operation (Telegram.BusinessOperation), and the Telegram.Request message class.
It is assumed that you will not need to create your own business services or business operations.
Each of these business services will create and send a message containing all the raw data (received from Telegram) to the designated production component specified in the settings. Additionally, you have the option to configure automatic saving of files (documents) from incoming messages.
To send a message, you need to create a Telegram.Request message, specifying which API method to invoke, and in the Data field, include the JSON with all the required fields for the Telegram API. For example, let's send a text message:
Set msg = ##class(Telegram.Request).%New()
Set msg.Method = "sendMessage"
// All possible fields are described here https://core.telegram.org/bots/api#sendmessage
Set msg.Data = {
"chat_id" : (..ChatId),
"text": ("*bold text*" _$$$NL_ "_italic text_ " _$$$NL_ "```"_$$$NL_"pre-formatted fixed-width "_$$$NL_"code block```"),
"parse_mode": "MarkdownV2",
"disable_notification": "true"
}
Return ..SendRequestAsync("Telegram.BusinessOperation", msg)
To send files such as images, videos, audio, and other types of files, you need to add the full file name (including the path) to the Files collection of the Telegram.Request object and use the same path in the corresponding fields of the JSON message. Here's an example:
Set filePath = "/path/to/photo.jpg"
Set msg = ##class(Telegram.Request).%New()
Set msg.Method = "sendPhoto"
Do msg.Files.Insert(filePath)
// All possible fields are described here https://core.telegram.org/bots/api#sendphoto
Set msg.Data = {
"chat_id": (..ChatId),
"photo": (filePath),
"caption": ("IRIS Telegram Adapter Demo")
}
Return ..SendRequestAsync("Telegram.BusinessOperation", msg)
Telegram Adapter Demo
The telegram-adapter-demo package includes an example of an Echo chatbot implementation.
The bot responds to any text message sent to it. If you send a .png file, it will also reply with a thumbnail of that image, automatically generated by Telegram.
To run this example, follow all the installation steps outlined in the documentation provided at https://github.com/nsolov/telegram-adapter-demo#readme.
By completing the installation process, you will be able to launch and experience this Echo chatbot example.
Announcement
Bob Kuszewski · Jun 30, 2023
When IRIS 2023.2 reaches general availability, we’ll be making some improvements to how we tag and distribute IRIS & IRIS for Health containers.
IRIS containers have been tagged using the full build number format, for example 2023.1.0.235.1. Customers have been asking for more stable tags, so they don’t need to change their dockerfiles/Kubernetes files every time a new release is made. With that in mind, we’re making the following changes to how we tag container images.
Major.Minor Tags: Containers will be tagged with the year and release, but not the rest of the full build number. For example, an image that is currently accessed as
containers.intersystems.com/intersystems/iris:2023.2.0.606.0
will now be accessed as
containers.intersystems.com/intersystems/iris:2023.2
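In practice this means a Dockerfile can pin just the major.minor tag and pick up maintenance and security builds automatically on rebuild; a minimal sketch (assuming access to containers.intersystems.com):

```dockerfile
# Pinning the stable major.minor tag: each rebuild pulls the newest
# 2023.2 maintenance/security build without editing this file.
FROM containers.intersystems.com/intersystems/iris:2023.2
```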
latest-em and latest-cd Tags: The most recent extended-maintenance and continuous-delivery releases will be tagged with latest-em and latest-cd, respectively. This provides a shorthand notation that can be used in documentation, examples, and development environments. We do not advise using these tags for production environments.
Preview: Developer preview releases will all be clearly tagged with -preview so you can easily separate out pre-release containers from production-ready containers. The most recent preview release will helpfully be tagged with latest-preview.
If you’re looking for the full build number for an InterSystems container, that’s available as a label, which you can view with the docker inspect command. For example:
docker inspect --format '{{ index .Config.Labels "com.intersystems.platform-version"}}' containers.intersystems.com/intersystems/iris:2023.1.0.235.1
Containers will no longer be distributed via the WRC download site. If you’re one of the few customers downloading containers from the WRC download site, now’s the time to switch to the InterSystems Container Registry (docs).
While we’re here, a reminder that we have been publishing multi-architecture manifests for IRIS containers. This means that pulling the IRIS container tagged 2022.3 will download the right container for your machine’s CPU architecture (Intel/AMD or ARM).
If you need to pull a container for a specific CPU architecture, tags are available for architecture-specific containers. For example, 2022.3-linux-amd64 will pull the Intel/AMD container and 2022.3-linux-arm64v8 will pull the ARM container.
We’ll stop posting to the iris-arm64 repositories soon since multi-architecture streamlines the process of getting the right containers on the right machines.
If you have questions or concerns, please reach out to the WRC.
Hi Bob,
That's good news and I like new names much more than old ones. I hope old tags for old releases will still be available?
A couple of entries from my container tagging wishlist (one can dream, you know):
- provide a :latest tag for all containers, but especially for the community ones, which would just pull the latest working release without having to rebuild dockerfiles every year when the license expires
- provide a 2023.1.x tag which would follow the latest minor version
🎉 Great news! I join Sergei on the latest tag, in addition to latest-cd and latest-em.
Awesome! Please add latest too.
Great improvements! Looking forward to it, Bob.
- provide 2023.1.x tag which will follow the latest minor version
It's not clear to me whether or not this is already the plan for the new Major.Minor tags. For instance, when IRIS 2024.1.1 is released, will it be at containers.intersystems.com/intersystems/iris:2024.1 (where IRIS 2024.1.0 images would likely have existed previously) or containers.intersystems.com/intersystems/iris:2024.1.1? @Robert.Kuszewski, can you please clarify?
The plan for new IRIS containers is to use the YYYY.R format only. Let's walk through an example of a few releases for a hypothetical IRIS 2024.1 release:
Release           | Old Tag        | New Tag
Initial GA        | 2024.1.0.123.0 | 2024.1
First maintenance | 2024.1.1.234.0 | 2024.1
Security fix      | 2024.1.1.234.1 | 2024.1
This new scheme greatly simplifies your work to keep up with whatever mix of maintenance and security releases we provide. All you need to do is reference iris:2024.1 and you'll pick up the latest bug and security fixes without making any changes to your code.
As for the latest tag, we like the idea so much that we're giving you two. One lets you get the latest release (iris:latest-cd), and one lets you get the latest long-term-support release (iris:latest-em). You can also use intersystemsdc/iris-community:latest or just intersystemsdc/iris-community for the latest InterSystems IRIS Community Edition release.
And intersystemsdc/iris-community:preview for the latest preview build.
intersystemsdc/irishealth-community and intersystemsdc/irishealth-community:preview for InterSystems IRIS For Health Community Edition.
Well, you gave me two, so I should have one spare wish :)
Make iris:latest-cd just iris:latest; this way, we can skip the "latest" bit altogether and use iris without any tag at all.
Announcement
Preston Brown · Apr 11, 2024
As an InterSystems HealthShare Developer, you will be part of an Agile team building healthcare applications and implementing new features while adhering to the best coding and development standards.
Your key responsibilities include:
Design and implement data integrations using InterSystems HealthShare/IRIS between HIE partners and internal On-prem/Cloud systems
Debug and fix defects in InterSystems HealthShare/IRIS related to HIE connected Healthcare participants
Develop XSLTs and DTLs to transform CCDA/FHIR/HL7 messages to and from Summary Document Architecture (SDA) format
Ensure systems development and support, and create design and technical specifications from business requirement specifications
Oversee best practices design and development, participate in code reviews, and collaborate with fellow developers and Technical Leads
Required Skills
Must have strong software development experience
Should have experience in InterSystems HealthShare UCR, IRIS database, ObjectScript, and XSLT Development
Must have experience in integration protocols such as TCP/IP (MLLP), SFTP, REST, and SOAP
Must be able to create DTL mappings from SDA and workflows to process documents
Should have proficiency in CCDA, HL7, JSON and XML messaging
If interested, reach out to Preston Brown at pbrown@cybercodemasters.com.
Please provide your updated resume/Full Name/Contact phone number and your expected salary.
1099 rate is $60 - $70/hour.
This position will be for 2+ years.
Announcement
Derek Robinson · Apr 17
Hi, Community!
Looking to get actionable insights from your healthcare research? See how InterSystems OMOP can help:
👨‍🔬 What is InterSystems OMOP?
With InterSystems OMOP—a cloud-based software-as-a-service—you can transform clinical data into the OMOP format and get faster insights.
Benefits include:
Creating research data repositories efficiently.
Easily ingesting, transforming, and storing data.
🎬Watch the video to learn more!
Article
Kurro Lopez · Apr 14
As we all know, InterSystems is a great company.
Their products can be just as useful as they are complex.
Yet, our pride sometimes prevents us from admitting that we might not understand some of the concepts or products that InterSystems offers us.
Today we are beginning a series of articles explaining how some of the more intricate InterSystems products work, simply and clearly, of course.
In this essay, I will clarify what Machine Learning is and how to take advantage of it.... because this time, you WILL KNOW for sure what I am talking about.
What (the hell) is Machine Learning?
Machine Learning is a branch of artificial intelligence that focuses on developing algorithms and models that enable computers to learn to perform specific tasks based on data, without being explicitly programmed for each task. Instead of following specific instructions, machines learn through experience, identifying patterns in data and making predictions or decisions based on them.
The process involves feeding algorithms with datasets (called training sets) to make them learn and improve their performance over time. Those algorithms can be designed to perform a wide range of tasks, including image recognition, natural language processing, financial trend prediction, medical diagnosis, and much more.
In summary, Machine Learning allows computers to learn from data and improve with experience, enabling them to perform complex tasks autonomously, without explicit programming for each situation...
It is a lovely definition. Yet, I guess you need an example, so here we go:
Well, imagine that every day you write down somewhere the time of sunrise and sunset. If somebody asked you whether the sun would rise the next day, what would you say? All you have recorded is the time of sunrise and sunset.
By observing your data, you would conclude that with 100% probability, the sun will rise tomorrow. However, you cannot ignore the fact that there is a chance that, due to a natural catastrophe, you will not be able to see the sun rising the next day. That is why you should say that the likelihood of witnessing a sunrise the following day is, in fact, 99.99%.
Considering your personal experience, you can provide an answer that matches your data. Machine Learning is the same thing, but done by a computer.
Look at the table below:
A | B
1 | 2
2 | 4
3 | 6
4 | 8
How do columns A and B relate to each other?
The answer is easy: the value of B is double that of A. B = A*2 is the pattern.
Now, examine the other table:
A | B
1 | 5
2 | 7
3 | 9
4 | 11
This one is a bit more complicated… If you haven't uncovered the pattern, it is B = (A*2) + 3.
A human can deduce the formula from a handful of rows, and the more data you have, the easier it is to guess the pattern behind this mystery.
So, Machine Learning uses the same logic to reveal the pattern hidden in the data.
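A quick way to verify this with a computer is a least-squares fit, which recovers the same pattern from the rows of the second table. This is a minimal sketch in plain Python with NumPy, not tied to any InterSystems product:

```python
import numpy as np

# Rows from the second table, where the hidden pattern is B = (A*2) + 3
A = np.array([1, 2, 3, 4], dtype=float)
B = np.array([5, 7, 9, 11], dtype=float)

# Fit a first-degree polynomial B = slope*A + intercept
slope, intercept = np.polyfit(A, B, 1)
print(round(slope, 2), round(intercept, 2))  # 2.0 3.0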
How to start?
First, you will need a computer. Yes, since this article is about Machine Learning, having only a notebook and a pencil will not be enough.
Second, you will require an instance of IRIS Community. You can download a Docker image and run your tests there. Note that it must have ML integrated, e.g., the latest version of InterSystems IRIS Community:
docker pull intersystems/iris-ml-community:latest-em
or
docker pull intersystems/iris-community:latest
If you need another platform, check https://hub.docker.com/r/intersystems/iris-ml-community/tags or https://hub.docker.com/r/intersystems/iris-community/tags.
Then, create a container from this image and run it:
docker run --name iris-ml -d --publish 1972:1972 --publish 52773:52773 intersystems/iris-ml-community:latest-em
If you are "old-school", you can download a free version for evaluation. Yet, it is important to have an InterSystems account. Check it out at https://login.intersystems.com/login/SSO.UI.Register.cls.
Afterward, ask for an evaluation copy at https://evaluation.intersystems.com/Eval/.
Install it and run your instance.
Now, access the IRIS portal. http://localhost:52773/csp/user/EnsPortal.ProductionConfig.zen
User: Superuser
Pass: SYS
Note: You might be asked to change the password the first time. Do not be afraid, just come up with a password that you can easily remember.
Open the "Machine learning configuration" to review the versions you installed.
At this point, you can see the ML provider configurations that are installed.
Earth, "water" and fire... what is the best?
All of them are good. The important thing is how to train your dragon, I mean... your data.
Explore more info about the existing models:
AutoML: AutoML is an automated Machine Learning system developed by InterSystems and housed within the InterSystems IRIS® data platform. It is designed to build accurate predictive models quickly using your data. It automates several key components of the machine-learning process.
Click the link below to see more info: https://docs.intersystems.com/iris20241/csp/docbook/Doc.View.cls?KEY=GAUTOML_Intro
H2O: It is an open-source Machine Learning platform. The H2O provider does not support the creation of time series models.
Follow the link below to discover more: https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GIML_Configuration_Providers#GIML_Configuration_Providers_H2O
PMML (Predictive Model Markup Language): It is an XML-based standard that expresses analytics models. It provides a way for applications to define statistical and data mining models so that they can be easily reused and shared.
Check out the link below for more info: https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=APMML
What is the first step?
Just like in the sunset and sunrise example, we need some data to train our model.
It is essential to know the objective of the data and the values that should be predicted. It is also critical to have clean data without any duplicates. You must find out what the minimum set of data is as well.
I am going to use the AutoML provider because it is from InterSystems, ha-ha 😉
There are a few kinds of algorithms:
Decision trees: The data is split by a sequence of questions, each answer leading to the next question, until a prediction is reached. Example: Will it rain tomorrow? Check if the sky is cloudy (very or slightly) or clear. If it is very cloudy, then check the humidity. After that, check the temperature... If it is very cloudy, with high humidity and low temperature, then it will rain tomorrow.
Random forests: It is a set of decision trees, each of which "votes" for a class. The majority of the votes defines the predicted class.
Neural networks: It does not mean that Skynet is coming... However, it is too complicated to explain in just a few words. The general idea is to "copy" the function of human neurons: each input gets analyzed by a "neuron", which, in turn, provides its output to the next "neuron" for further analysis.
If you wish to play around with neural networks using Python, you can create one and check how it works. Please, have a look at https://colab.research.google.com/drive/1XJ-Lph5auvoK1M4kcHZvkikOqZlmbytI?usp=sharing.
Through the link above, you can run a routine in Python with the help of the TensorFlow library. To get the pattern of tables A and B, do the following:
import tensorflow as tf
import numpy as np

# Training data: tableB follows the hidden pattern B = (A*2) + 3
tableA = np.array([1, 2, 3, 4, 5, 6, 7], dtype=float)
tableB = np.array([5, 7, 9, 11, 13, 15, 17], dtype=float)

# Two hidden layers of three neurons each, plus one output neuron
hidden1 = tf.keras.layers.Dense(units=3, input_shape=[1])
hidden2 = tf.keras.layers.Dense(units=3)
output = tf.keras.layers.Dense(units=1)
model = tf.keras.Sequential([hidden1, hidden2, output])

model.compile(
    optimizer=tf.keras.optimizers.Adam(0.1),
    loss='mean_squared_error'
)

print("Start training...")
history = model.fit(tableA, tableB, epochs=1000, verbose=False)
print("Model trained!")

# Plot how the loss decreases over the epochs
import matplotlib.pyplot as plt
plt.xlabel("# Epoch")
plt.ylabel("Loss magnitude")
plt.plot(history.history["loss"])

print("Making a prediction!")
result = model.predict(np.array([[100.0]]))
print("The result is " + str(result))

print("Internal variables of the model")
print(hidden1.get_weights())
print(hidden2.get_weights())
print(output.get_weights())
The code above utilizes the values of A and B to create a model that discovers the relation between the two columns.
When the prediction is done, it retrieves approximately the correct value; in this sample, the prediction is 203 (that is, 100*2 + 3).
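The "voting" idea behind random forests, mentioned earlier, can also be sketched in a few lines of plain Python. The three hand-written "stumps" below are hypothetical rules invented for this illustration; a real random forest learns many deeper trees from data:

```python
from collections import Counter

# Three tiny hand-written "trees" (really just stumps), each voting on
# whether it will rain tomorrow based on one simple weather feature
def tree_clouds(w):
    return "rain" if w["clouds"] == "very" else "no rain"

def tree_humidity(w):
    return "rain" if w["humidity"] > 80 else "no rain"

def tree_temperature(w):
    return "rain" if w["temperature"] < 15 else "no rain"

def forest_predict(weather):
    votes = [tree(weather) for tree in (tree_clouds, tree_humidity, tree_temperature)]
    # The class with the majority of the votes is the prediction
    return Counter(votes).most_common(1)[0][0]

weather = {"clouds": "very", "humidity": 85, "temperature": 20}
print(forest_predict(weather))  # rain (2 votes against 1)
```

Each stump alone is a weak predictor; the majority vote is what makes the ensemble robust.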
How does it work in IRIS?
Machine Learning in IRIS is called "IntegratedML". It has been implemented since InterSystems IRIS 2023.2 as an Experimental Feature, meaning that it is not supported for production environments. However, the feature is well-tested, and InterSystems believes it can add significant value to customers. You can find more information in the Using IntegratedML documentation.
Still, since this is an ML lesson for beginners, I will explain how to operate it as simply as possible.
Note: I am utilizing a Docker container with an image from containers.intersystems.com/iris-ml-community
docker pull containers.intersystems.com/iris-ml-community
You can download the IRIS image and samples from https://github.com/KurroLopez/iris-mll-fordummies.
📣TIP: You can open the terminal from Docker with the following command:
docker-compose exec iris iris session iris
The Sleepland University study
Sleepland University has done extensive research on insomnia, conducting thousands of interviews and building a database with various parameters of patients with and without sleeplessness.
The collected data includes the following:
Gender (male/female)
Age (The age of the person in years)
Occupation (The occupation or profession of the person)
Sleep Duration (The number of hours the person sleeps per day)
Quality of Sleep (A subjective rating of the quality of sleep, ranging from 1 to 10)
Physical Activity Level (The number of minutes the person engages in physical activity daily)
Stress Level (A subjective rating of the stress level experienced by the person, ranging from 1 to 10)
BMI Category (The BMI category of the person: Underweight, Normal, Overweight)
Systolic (Systolic blood pressure)
Diastolic (Diastolic blood pressure)
Heart Rate (The resting heart rate of the person in BPM)
Daily Steps (The number of steps the person takes per day)
Sleep Disorder (None, Insomnia, Sleep Apnea)
For the first sample, I created a class (St.MLL.insomniaBase) with the columns mentioned above:
Class St.MLL.insomniaBase Extends %Persistent
{
/// Gender of patient (male/female)
Property Gender As %String;
/// The age of the person in years
Property Age As %Integer;
/// The occupation or profession of the person
Property Occupation As %String;
/// The number of hours the person sleeps per day
Property SleepDuration As %Numeric(SCALE = 2);
/// A subjective rating of the quality of sleep, ranging from 1 to 10
Property QualitySleep As %Integer;
/// The number of minutes the person engages in physical activity daily
Property PhysicalActivityLevel As %Integer;
/// A subjective rating of the stress level experienced by the person, ranging from 1 to 10
Property StressLevel As %Integer;
/// The BMI category of the person: Underweight, Normal, Overweight
Property BMICategory As %String;
/// Systolic blood pressure
Property Systolic As %Integer;
/// Diastolic blood pressure
Property Diastolic As %Integer;
/// The resting heart rate of the person in BPM
Property HeartRate As %Integer;
/// The number of steps the person takes per day
Property DailySteps As %Integer;
/// None, Insomnia, Sleep Apnea
Property SleepDisorder As %String;
}
Then, I built some classes extending from insomniaBase called insomnia01, insomniaValidate01, and insomniaTest01. It allowed me to have the same columns for each table.
Eventually, we will need to populate our tables with sample values, so I designed a class method for that purpose.
Class St.MLL.insomnia01 Extends St.MLL.insomniaBase
{
/// Populate values
ClassMethod Populate() As %Status
{
write "Init populate "_$CLASSNAME(),!
&sql(TRUNCATE TABLE St_MLL.insomnia01)
……
write $CLASSNAME()_" populated",!
Return $$$OK
}
}
Using the terminal, call the Populate method of this class:
Do ##class(St.MLL.insomnia01).Populate()
If we do everything right, we will have a table with the values for training our ML.
We also need to create a new table for validation. It is easy because you will only require a part of the data provided for the training. In this case, it will be 50% of the items.
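Outside IRIS, this is the classic train/validation split. Below is a minimal sketch of the idea in plain Python; the record IDs are made up for the illustration:

```python
import random

# 50 hypothetical record IDs standing in for the interview data
records = list(range(1, 51))

random.seed(42)        # fixed seed so the split is reproducible
random.shuffle(records)

# Half of the data trains the model, the other half validates it
half = len(records) // 2
training = records[:half]
validation = records[half:]

print(len(training), len(validation))  # 25 25
```

Shuffling before splitting matters: it keeps both halves representative of the whole data set.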
Please, run the following command in the terminal.
Do ##class(St.MLL.insomniaValidate01).Populate()
Finally, we will prepare some test data to see the results of our training.
Do ##class(St.MLL.insomniaTest01).Populate()
Train, train, and train... you will become stronger
Now, we have all the data needed to train our model. How to do it?
You will only need 4 simple instructions:
CREATE MODEL
TRAIN MODEL
VALIDATE MODEL
SELECT PREDICT
Creating the model
CREATE MODEL creates the Machine Learning model metadata by specifying the model’s name, the target field to be predicted, and the dataset that will supply the target field.
In our sample, we have some parameters to evaluate sleep disorders, so we will design the following models:
insomnia01SleepModel: By gender, age, sleep duration and quality of sleep.
Check if the age and sleeping habits affect any sleep disorder type.
insomnia01BMIModel: By gender, age, occupation and BMI category.
Examine whether age, occupation and BMI affect any sleep disorder type.
insomnia01AllModel: All factors.
Inspect if all factors affect any sleep disorder type.
We are going to create all those models now.
Using the SQL manager in the IRIS portal, run the following statement:
CREATE MODEL insomnia01AllModel PREDICTING (SleepDisorder) FROM St_MLL.insomnia01
At this point, our model knows which column to predict.
You can check what was created and what the predicting column contains with the statement below:
SELECT * FROM INFORMATION_SCHEMA.ML_MODELS
Ensure that the predicting column name and the columns themselves are correct.
However, we also want to add different model types since we wish to predict sleep disorders according to other factors, not all fields.
In this case, we are going to use the "WITH" clause to specify the columns that should be used as parameters to make the prediction.
To utilize the "WITH" clause, we must indicate the name of the columns and their type.
CREATE MODEL insomnia01SleepModel PREDICTING (SleepDisorder) WITH(Gender varchar, Age integer, SleepDuration numeric, QualitySleep integer) FROM St_MLL.insomnia01
CREATE MODEL insomnia01BMIModel PREDICTING (SleepDisorder) WITH(Gender varchar, Age integer, Occupation varchar, BMICategory varchar) FROM St_MLL.insomnia01
Make sure that all those models have been successfully created.
Training the model
The TRAIN MODEL command runs the AutoML engine and specifies the data that will be used for training. The FROM syntax is generic and allows the same model to be trained multiple times on various data sets. For instance, you may train the model with data from Sleepland University or Napcity University. The most important thing, though, is for the data to share the same model: the same fields, with the same names and the same types.
The AutoML engine automatically performs all necessary machine-learning tasks. It identifies relevant candidate features from the selected data, evaluates feasible model types based on the data and problem definition, and sets hyperparameters to create one or more viable models.
Our training table has 50 records, which is enough for this exercise.
TRAIN MODEL insomnia01AllModel FROM St_MLL.insomnia01
Do the same with other models.
TRAIN MODEL insomnia01SleepModel FROM St_MLL.insomnia01
TRAIN MODEL insomnia01BMIModel FROM St_MLL.insomnia01
You can find out whether your model has been properly trained with the following statement:
SELECT * FROM INFORMATION_SCHEMA.ML_TRAINED_MODELS
It is necessary to validate the model and the training with the command VALIDATE MODEL.
Validating the model
At this stage, we need to confirm that the model has been trained properly. So, we should run the command VALIDATE MODEL.
📣Remember: the validation class was populated earlier with 50% of the data from the training data source.
VALIDATE MODEL returns simple metrics for regression, classification, and time series models based on the provided testing set.
Check what has been validated with the statement below:
VALIDATE MODEL insomnia01AllModel FROM St_MLL.insomniaValidate01
Repeat it with other models.
VALIDATE MODEL insomnia01SleepModel FROM St_MLL.insomniaValidate01
VALIDATE MODEL insomnia01BMIModel FROM St_MLL.insomniaValidate01
Consuming the model
Now, we will consume this model and inspect whether it has accurately learned how to predict the SleepDisorder value.
With the help of the "SELECT PREDICT" statement, we are going to forecast what the value of SleepDisorder will be. To do that, we will use the insomniaTest01 table populated before.
SELECT *, PREDICT(insomnia01AllModel) FROM St_MLL.insomniaTest01
The result looks weird, even after validating with 50% of the data used to train the model... Why has a 29-year-old female nurse been diagnosed with "Insomnia", whereas the model predicted "Sleep Apnea"? (see ID 54).
We should examine other models (insomnia01SleepModel and insomnia01BMIModel), created with different columns, but don't worry! I will display the columns used to design them.
SELECT Gender, Age, SleepDuration, QualitySleep, SleepDisorder, PREDICT(insomnia01SleepModel) As SleepDisorderPrediction FROM St_MLL.insomniaTest01
You can see again that a 29-year-old female has been diagnosed with “insomnia”, whereas the prediction states “sleep apnea”.
Ok, you are right! We also need to know what probability lies behind this final predicted value.
How can we know the percentage of a prediction?
To find out the percentage of the prediction, we should use the "PROBABILITY" function.
It returns a value between 0 and 1. However, it is not the probability of the predicted class; it is the probability of getting the specific value that you wish to check.
This is a good example:
SELECT *, PREDICT(insomnia01AllModel) As SleepDisorderPrediction, PROBABILITY(insomnia01AllModel FOR 'Insomnia') as ProbabilityPrediction FROM St_MLL.insomniaTest01
It is the probability of getting “Insomnia” as a sleep disorder.
Our nurse, a woman, 29 years old, diagnosed with “Insomnia” has a 49.71% chance of having Insomnia. Still, the prediction is “Sleep Apnea” … Why?
Is the probability the same for other models?
SELECT Gender, Age, SleepDuration, QualitySleep, SleepDisorder, PREDICT(insomnia01SleepModel) As SleepDisorderPrediction, PROBABILITY(insomnia01SleepModel FOR 'Insomnia') as ProbabilityInsomnia,
PROBABILITY(insomnia01SleepModel FOR 'Sleep Apnea') as ProbabilityApnea
FROM St_MLL.insomniaTest01
Finally, it is a bit clearer now. According to the data (sex, age, sleep quality, and sleep duration), the probability of having insomnia is only 34.63%, whereas the chance of having sleep apnea is 64.18%.
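In other words, PREDICT simply returns the class with the highest probability, while PROBABILITY ... FOR 'X' returns the probability of one specific class. Here is a small Python sketch of the idea, using the numbers from the example above (the leftover probability is assumed to belong to "None"):

```python
# Class probabilities the model assigned to our nurse (taken from the example
# above; the small remainder is assumed to belong to "None")
probs = {"Insomnia": 0.3463, "Sleep Apnea": 0.6418, "None": 0.0119}

# PREDICT: the class with the highest probability
prediction = max(probs, key=probs.get)

# PROBABILITY ... FOR 'Insomnia': the probability of that one specific class
p_insomnia = probs["Insomnia"]

print(prediction, p_insomnia)  # Sleep Apnea 0.3463
```

This is why a patient can be predicted as "Sleep Apnea" even though the probability of "Insomnia" is far from zero: the prediction only tells you which class won.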
Wow… It is very interesting! Still, we were using only a small portion of data inserted directly into a table with a class method… How can we upload a huge file with data?
Please, wait for the next article! It is coming soon.
Announcement
Irène Mykhailova · Apr 24
Hi Community!
Welcome to Issue #21 of the InterSystems Ideas newsletter! This edition highlights the latest news from the Ideas Portal, such as:
✓ General statistics
✓ Community Opportunity ideas
Here are some March numbers for you. During this month, we had:
19 new ideas
1 implemented idea
6 comments
72 votes
👏 Thanks to everyone who contributed in any way to the Ideas Portal last month.
In recent months, you've added several ideas that were categorized as Community Opportunity, which means that any member of the Developer Community is welcome to implement them and thus pave their way to the Hall of Fame! So here they are:
Idea | Author
Dapper support in IRIS | Vadim Cherdak
InterSystems IRIS Project Initializer | @Yuri.Gomes
Use InterSystems Interoperability as a Traceability tool for GenAI LLM pipelines | @Evgeny.Shvarov
Introduce Flyway Support in IRIS | @Evgeny.Shvarov
Load Data on VSCode | @Yuri.Gomes
Add InterSystems wrapper for Supabase | @Evgeny.Shvarov
✨ Share your ideas, support your favorites with comments and votes, and bring the most interesting ones to life! 🙏
Announcement
Irène Mykhailova · Mar 25
Hi Community!
Welcome to Issue #20 of the InterSystems Ideas newsletter! This edition highlights the latest news from the Ideas Portal, such as:
✓ General statistics
✓ Results of the "DC search" sweepstakes
✓ DC search ideas to vote for
Here are some numbers from February for you. During this month, we had:
20 new ideas
2 implemented ideas
17 comments
59 votes
👏 Thanks to everyone who contributed in any way to the Ideas Portal last month.
In our sweepstakes dedicated to improving the DC search, we've received 17 unique ideas! We will review them in the next few weeks and try our best to implement your great suggestions! Moreover, we're happy to announce the winner of our sweepstakes — @Jiayan.Xiang, who will soon receive the prize.
More sweepstakes are in the works, so don't miss your chance to become a lucky winner!
To wrap up this bulletin, check out the DC search suggestions and vote for your favorites, so we know what you're looking forward to the most:
Idea | Author
Search by key topics and parameters | @Andre.LarsenBarbosa
Improving Search Results Display | @Andrew.Leniuk
Empty Search Query | @Andrew.Leniuk
Typo Correction and Synonym Search | @Andrew.Leniuk
Personalized search | @Andrew.Leniuk
Search by version | @DavidUnderhill
DC Search by date range | @Yuri.Gomes
Remember user picked search result | @Jiayan.Xiang
Perform results query through translation | @Andre.LarsenBarbosa
Combined filter for search | @Andre.LarsenBarbosa
In relevance search order by date desc | @Iryna.Mykhailova
Phonetic search in the community | @Andre.LarsenBarbosa
AI-Powered Recommendations | @diba
Add filtering by type of post in DC search | @Iryna.Mykhailova
People Also Search For | @Yuri.Gomes
Voice search | @Yuri.Gomes
Improve selectivity of Articles and Questions in DC | @Robert.Cemper1003
✨ Share your ideas, support your favorites with comments and votes, and bring the most interesting ones to life! 🙏