Announcement
Olga Zavrazhnova · Aug 19, 2024

InterSystems at HackMIT 2024

InterSystems’ team is heading to the HackMIT 2024 hackathon, taking place on September 14-15, 2024! HackMIT is a weekend-long event where students from around the globe work together to design and build innovative technology projects. Students will focus on four main areas of innovation: healthcare, sustainability, education, and interactive media. Interested in becoming a mentor? Send us a direct message!
Announcement
Vadim Aniskin · Mar 27, 2024

InterSystems Ideas News #12

Hi Community! Welcome to the 12th edition of the InterSystems Ideas news! Here's what you can expect from it:

​​​​✓ Call for ideas about InterSystems IRIS Cloud SQL
✓ Get inspired by ideas from the InterSystems Ideas portal
✓ Recently implemented ideas

Many Developer Community members have already tested and used InterSystems' new service, InterSystems IRIS Cloud SQL. You must have bright ideas related to this service, and we are waiting for them in the new Ideas Portal category dedicated to this database-as-a-service (DBaaS).

If you are on the lookout for an interesting idea to implement, or just want inspiration to create a new application and publish it on the Open Exchange, you can now get to the Ideas Portal directly from the "Applications" item of the Open Exchange main menu.

Since the launch of the Ideas Portal, 54 ideas related to InterSystems products and resources for developers have already been implemented, and 12 of them were realized in the first quarter of 2024 (Idea | Implementation of the idea (project) | Developer):

- Add access to Ideas Portal from Global Masters by @LuisAngel.PérezRamos | see the Global Masters interface | InterSystems
- Make a module which will let to use IRIS and Firebase framework by @Evgeny.Shvarov | irisfirebase | @Daniel.Aguilar
- Ideas contributor of the year by @Yuri.Gomes | Top InterSystems Ideas Contributors for 2023 | InterSystems
- Web Interface to convert HL7 V2 to FHIR by @Muhammad.Waseem | iris-fhir-lab | @Muhammad.Waseem
- Expose Global Masters Badges by @henry | Developer Community Release, January 2024 | InterSystems
- On Developer Community put animated GIFs behind a play/pause button by @John.Murray | Developer Community Release, January 2024 | InterSystems
- Tutorial about custom login with class extending %CSP.Login by @Yuri.Gomes | custom-login | @Yuri.Gomes
- Add a Support category to the Services offered by @Paul.Hurley | see the Partner Directory interface | InterSystems
- Sync Community and Open Exchange users by @LuisAngel.PérezRamos | see the Community and Open Exchange interfaces | InterSystems
- Have nicknames for community users by @Minoru.Horita | Developer Community Release, January 2024 | InterSystems
- Introduce an interoperability adapters to import/export data from Google Docs and Google Sheets by @Evgeny.Shvarov | googlesheets-adapter | @Nikolay.Soloviev
- Tribute banner and interview for GM million pointers by @Yuri.Gomes | Celebrating a True Pillar of the Developer Community: A Journey of Dedication and Expertise | @Robert.Cemper1003 and InterSystems

👏 Thank you for implementing and posting these ideas 👏 Stay tuned for our next announcements! In the meantime, post your brilliant ideas, and advertise the ideas you like to your peers and all Developer Community members!
Announcement
Vadim Aniskin · Jun 19, 2024

InterSystems Ideas News #14

Hi Developers! Welcome to the 14th edition of the InterSystems Ideas news! This time, you can read about the following:

​​​​✓ Hints & tips for the participants of the 3rd InterSystems Ideas Contest
✓ Brand new Ideas Portal landing page
✓ Recently implemented ideas

After only 10 days, we already have 14 accepted ideas. We thank all Community members who posted their ideas for the Contest. Special thanks to @Victoria Castillo for posting 4 brilliant ideas! 👏 Make a difference by posting ideas focused on solving real-world problems. Share your creativity with all Community members and the InterSystems team!

Some other ideas were posted during the Contest period but were not accepted because they:

- contained AI-generated text
- duplicated existing ideas on the Portal
- were related to the InterSystems TrakCare product
- were too general and non-specific

Please read the requirements for ideas in the Ideas Contest announcement carefully to ensure your idea is accepted. If you are new to the Ideas Portal, you can read the "Ideas promotion rules" on the Portal Guide page and an article dedicated to the creation of new ideas. We hope these hints will help you post your ideas for the Ideas Contest.

The Ideas Portal landing page was launched recently. If you are new to the Ideas Portal, please look at it to understand how this resource works and to find useful links. If you are an experienced ideas author, please share your feedback on how we can improve it.

To round up this newsletter, please find below the ideas implemented during the second quarter of 2024 (Idea | Implementation of the idea (project) | Developer):

- Assert CodeAssist prompting by @Alex.Woodhead | starting from InterSystems IRIS 2023.3.0 | InterSystems
- Introduce WITH into IRIS SQL engine by @Evgeny.Shvarov | starting from InterSystems IRIS 2024.1 | InterSystems
- Implement support for FHIRPath Patch resources on InterSystems FHIR Server by @Maksym.Shcherban | see the InterSystems FHIR Server documentation | InterSystems
- Reference architectures for VIPs on SDNs in GCP by @Eduard.Lebedyuk | VIP in GCP | @Mikhail.Khomenko and @Eduard.Lebedyuk
- EF Core Provider by @Evgeny.Shvarov | InterSystems

👏 Many thanks to the implementors and authors of these ideas! 👏

💡 Post your ideas by the 30th of June to participate in the 3rd InterSystems Ideas Contest, and vote and comment on the ideas participating in the Contest! 💡
Announcement
Olga Zavrazhnova · Jan 17

InterSystems at IC HACK ‘25

InterSystems’ team is heading to the IC HACK ‘25 hackathon, taking place on February 1-2, 2025 in London, UK. IC HACK ‘25 is an annual hackathon organised by the Imperial College London Department of Computing Society. It is the biggest student-run hackathon in Europe, where over 700 of the UK's most creative and talented students work together for 24 hours of learning, building, fun, and networking.
Announcement
Olga Zavrazhnova · Jan 17

InterSystems at TreeHacks 2025

InterSystems’ team is heading to the TreeHacks 2025 hackathon, taking place on February 14-16, 2025! TreeHacks 2025 is Stanford's premier hackathon. The country's brightest engineering students are flown to Stanford's campus to build solutions to the world's largest challenges for 36 hours straight. Students will focus on five main areas of innovation: healthcare, sustainability, education, fintech & security, and entertainment. Interested in becoming a mentor? Send us a direct message!
Article
Yuri Marx · Mar 4

Salesforce Integration with InterSystems IRIS

In 2023, according to IDC, Salesforce's market share in CRM reached 21.7%. The company hosts a substantial amount of critical corporate business processes and data, so InterSystems IRIS needs an interoperability connector to fetch data from the Salesforce data catalog. This article will show you how to get any data hosted by Salesforce and create an interoperability production that retrieves data and sends it to targets such as files and relational databases.

Create your development account on Salesforce:

1. Go to https://developer.salesforce.com/signup.
2. Fill out the form and click "Sign Me Up" to create your DEV account.
3. To explore the Salesforce data catalog, log in and proceed to https://[YOUR-INSTANCE]-dev-ed.develop.lightning.force.com/lightning/setup/ObjectManager/home (in my case, https://ymservices-dev-ed.develop.lightning.force.com/lightning/setup/ObjectManager/home).
4. Click the Account label to go to the Account data object details (the API Name is Account).
5. Head to the Fields & Relationships tab to see the Account fields.
6. We will consume the Account data in our sample.

Configure the OAuth credentials:

To consume the Salesforce API, you must configure the OAuth credentials:

1. Go to https://[YOUR SALESFORCE INSTANCE].lightning.force.com/lightning/setup/NavigationMenus/home.
2. Click the "New Connected App" button.
3. Select "Create a Connected App" and click "Continue".
4. Fill in the form according to the options shown in the image (use your email as the Contact Email) and click "Save".
5. Wait for 30 to 45 minutes and click "Continue". Then click "Manage Consumer Details".
6. You must validate your identity (check your email to get the verification code).
7. Copy the consumer key and consumer secret, since you will need them later.
8. Now you should edit a few OAuth policies.
9. Proceed to https://[your salesforce instance].lightning.force.com/lightning/setup/ConnectedApplication/home and click "Edit" next to the IRISInterop connected application.
10. In the OAuth Policies section, pick the following options and click the "Save" button at the bottom:
   a. Permitted Users: All users may self-authorize
   b. IP Relaxation: Relax IP restrictions
11. At this point, we are ready to consume the Salesforce APIs.

Test the consumption of the Salesforce API in Postman:

1. If you do not have Postman, you can download it at https://www.postman.com/downloads/.
2. Start Postman and test the login API to get a new token.
3. Create a POST call with the following form fields:
   a. grant_type: password
   b. client_id: your consumer key
   c. client_secret: your consumer secret
   d. username: your username (the one you use to access your Salesforce instance web interface)
   e. password: your user password (the one you use to access your Salesforce instance web interface)
4. Copy the access_token to use it in the Authorization header for all the upcoming Salesforce API calls.
5. Now we are ready to consume the Salesforce Query API to get the current leads. Set up your call as shown below:
   a. URL: https://[your salesforce instance].my.salesforce.com/services/data/v58.0/query?q=SELECT+Id,+Name+FROM+Lead
   b. In the headers, add the Authorization header with the value: Bearer [the access_token returned from the login service]
6. Check out the records with the data about leads.
7. All set! It is time to create an interoperability adapter and production to allow us to integrate Salesforce data with any other system, service, API, or data repository.
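The Postman login call above is just an HTTP POST of form fields to the token endpoint, so it can be reproduced in any language. Below is a small illustrative Python helper (not part of the original article) that assembles the endpoint URL and the form body; the instance name and credential values are placeholders:

```python
from urllib.parse import urlencode

def build_token_request(instance, client_id, client_secret, username, password):
    """Build the token endpoint URL and the form body for the OAuth 2.0
    password grant described above. All argument values are placeholders."""
    url = "https://" + instance + "/services/oauth2/token"
    body = urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "client_secret": client_secret,
        "username": username,
        "password": password,
    })
    return url, body

url, body = build_token_request("myorg-dev-ed.develop.my.salesforce.com",
                                "MY_KEY", "MY_SECRET",
                                "user@example.com", "secret")
# POST this body with any HTTP client; the JSON response contains the
# access_token used as "Authorization: Bearer <token>" in later calls.
```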
About the Salesforce API

We will employ the following APIs to interoperate with Salesforce:

Authentication API, to get the authorization token:

curl --location 'https://ymservices-dev-ed.develop.my.salesforce.com/services/oauth2/token' \
--form 'grant_type="password"' \
--form 'client_id="3MVG91oqviqJK………….fT_0CZ9M.80M_THhxS51LKc7IHk2c5qQt63wD_"' \
--form 'client_secret="6B8860E……………..B3AC840E568D44837094ADC"' \
--form 'username="yurimarx@gmail.com"' \
--form 'password="**************"'

Query API (this API finds records using the Salesforce Object Query Language (SOQL), which is similar to SQL):

SELECT one or more fields
FROM an object
WHERE filter statements (and, optionally, results are ordered)

Example: SELECT Id, Name FROM Account WHERE Name = 'Sandy'

A cURL sample:

curl --location --request GET 'https://ymservices-dev-ed.develop.my.salesforce.com/services/data/v62.0/query?q=SELECT+Id%2C+Name+FROM+Lead' \
--header 'Authorization: Bearer 00Dbm00000CTkBv!AQEAQFIMdVSCElZDlb0axaaWBGm5Hacr6EN.77fgNvc4zpq4N22Hgf1THQydrewNQNM5W9cttWKyOtZMfDYTHTS4FqlS2uov'

For more details about the language capabilities, check out https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql_select.htm.

sObject API (this API executes CRUD operations and describes the Salesforce objects and their metadata). The URL path is /services/data/[salesforce version]/sobjects/[salesforce object]/[salesforce record ID] (the record ID is omitted when creating). Please note that you must set the Authorization header with the login token. Examine the samples below:

1. Create a record:
curl https://MyDomainName.my.salesforce.com/services/data/v63.0/sobjects/Account/ -H "Authorization: Bearer token" -H "Content-Type: application/json" -d "@newaccount.json"
2. Update a record:
curl https://MyDomainName.my.salesforce.com/services/data/v63.0/sobjects/Account/001D000000INjVe -H "Authorization: Bearer token" -H "Content-Type: application/json" -d @patchaccount.json -X PATCH
3. Delete a record:
curl https://MyDomainName.my.salesforce.com/services/data/v63.0/sobjects/Account/001D000000INjVe -H "Authorization: Bearer token" -X DELETE
4. Get by Id:
curl https://MyDomainName.my.salesforce.com/services/data/v63.0/sobjects/Account/001D000000INjVe -H "Authorization: Bearer token" -X GET
5. Get metadata information:
curl https://MyDomainName.my.salesforce.com/services/data/v63.0/sobjects/Account/ -H "Authorization: Bearer token"

More information about the sObject API is available at https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/intro_rest.htm.

Creating the interoperability adapter and production

Get the sample application: To illustrate how to interoperate with Salesforce, first get my sample app, the iris-salesforce application, on Open Exchange via https://openexchange.intersystems.com/package/Iris-Salesforce. Download or clone the application from GitHub to see the implementation details described in the upcoming sections.
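To make the URL conventions above concrete, here is a short illustrative Python sketch (not from the original article; the domain is the same placeholder used in the cURL samples). It maps each CRUD operation to the HTTP method and sObject path, and shows that the `+`/`%2C` escapes in the Query API call are ordinary URL encoding of the SOQL string:

```python
from urllib.parse import urlencode

BASE = "https://MyDomainName.my.salesforce.com/services/data/v63.0"  # placeholder domain

def sobject_request(operation, sobject, record_id=None):
    """Map a CRUD operation to the HTTP method and sObject URL path
    described above. The record id is omitted only when creating."""
    methods = {"create": "POST", "get": "GET", "update": "PATCH", "delete": "DELETE"}
    url = BASE + "/sobjects/" + sobject + "/"
    if operation != "create":
        url += record_id
    return methods[operation], url

print(sobject_request("update", "Account", "001D000000INjVe"))
# ('PATCH', 'https://MyDomainName.my.salesforce.com/services/data/v63.0/sobjects/Account/001D000000INjVe')

# The Query API's q parameter is plain URL encoding of the SOQL string:
print(urlencode({"q": "SELECT Id, Name FROM Lead"}))  # q=SELECT+Id%2C+Name+FROM+Lead
```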
Creating the Interoperability Adapter for Salesforce

Since there is no official interoperability adapter for Salesforce on InterSystems IRIS, we need to create one:

Class dc.irissalesforce.SalesforceOutboundAdapter Extends Ens.OutboundAdapter
{

Property SFVersion As %String(VALUELIST = ",v62.0,v61.0,v60.0,v59.0,v58.0,v57.0,v56.0,v55.0,v54.0,v53.0,v52.0,v51.0,v50.0,v49.0,v48.0,v47.0,v46.0,v45.0,v44.0,v43.0,v42.0,v41.0,v40.0,v39.0,v38.0,v37.0,v36.0,v35.0,v34.0,v33.0,v32.0,v31.0") [ InitialExpression = "v62.0" ];

Property SFCredsFile As %String(MAXLEN = 1000);

Property SSLConfig As %String;

Parameter SETTINGS = "SFVersion:Connect,SFCredsFile:Connect:fileSelector,SSLConfig:Connect:sslConfigSelector";

Method PrepareRequest(sfToken As %DynamicObject) As %Net.HttpRequest
{
    Set sfReq = ##class(%Net.HttpRequest).%New()
    Set sfReq.Server = sfToken.instance
    Set sfReq.Https = 1
    Set sfReq.SSLConfiguration = ..SSLConfig
    Do sfReq.SetHeader("Authorization", "Bearer "_sfToken."access_token")
    Return sfReq
}

/// Query Salesforce with an SQL query to get a JSON response with business data
Method Query(queryRequest As dc.irissalesforce.QueryMessage, Output queryResponse As Ens.StreamContainer) As %Status
{
    Set sc = $$$OK
    Set sfToken = {}
    Set queryResponse = ##class(Ens.StreamContainer).%New()
    Try {
        Do ..GetAuthorizationToken(.sfToken)
        Set sfReq = ..PrepareRequest(sfToken)
        Do sfReq.SetParam("q", queryRequest.Query)
        Do sfReq.Get("/services/data/"_..SFVersion_"/query")
        Do queryResponse.StreamSet(sfReq.HttpResponse.Data)
    } Catch ex {
        Set sc = ex.AsStatus()
    }
    Return sc
}

/// Get Salesforce data by ID to get a JSON response with business data
Method GetById(idRequest As dc.irissalesforce.ByIdMessage, Output queryResponse As Ens.StreamContainer) As %Status
{
    Set sc = $$$OK
    Set sfToken = {}
    Set queryResponse = ##class(Ens.StreamContainer).%New()
    Try {
        Do ..GetAuthorizationToken(.sfToken)
        Set sfReq = ..PrepareRequest(sfToken)
        Do sfReq.Get("/services/data/"_..SFVersion_"/sobjects/"_idRequest.ClassName_"/"_idRequest.Id)
        Do queryResponse.StreamSet(sfReq.HttpResponse.Data)
    } Catch ex {
        Set sc = ex.AsStatus()
    }
    Return sc
}

/// Create a new Salesforce object with JSON data
Method Create(createRequest As dc.irissalesforce.CreateMessage, Output createResponse As Ens.StreamContainer) As %Status
{
    Set sc = $$$OK
    Set sfToken = {}
    Set createResponse = ##class(Ens.StreamContainer).%New()
    Try {
        Do ..GetAuthorizationToken(.sfToken)
        Set sfReq = ..PrepareRequest(sfToken)
        Set sfReq.ContentType = "application/json"
        Do sfReq.EntityBody.Write(createRequest.Content)
        Do sfReq.Post("/services/data/"_..SFVersion_"/sobjects/"_createRequest.ClassName_"/")
        Do createResponse.StreamSet(sfReq.HttpResponse.Data)
    } Catch ex {
        Set sc = ex.AsStatus()
    }
    Return sc
}

/// Update the Salesforce object with JSON data
Method Update(updateRequest As dc.irissalesforce.UpdateMessage, Output updateResponse As Ens.StreamContainer) As %Status
{
    Set sc = $$$OK
    Set sfToken = {}
    Set updateResponse = ##class(Ens.StreamContainer).%New()
    Try {
        Do ..GetAuthorizationToken(.sfToken)
        Set sfReq = ..PrepareRequest(sfToken)
        Set sfReq.ContentType = "application/json"
        Do sfReq.EntityBody.Write(updateRequest.Content)
        Do sfReq.Patch("/services/data/"_..SFVersion_"/sobjects/"_updateRequest.ClassName_"/"_updateRequest.Id_"/")
        Do sfReq.HttpResponse.OutputToDevice()
        Do updateResponse.StreamSet(sfReq.HttpResponse.Data)
    } Catch ex {
        Set sc = ex.AsStatus()
    }
    Return sc
}

/// Delete the Salesforce object by id
Method Delete(deleteRequest As dc.irissalesforce.DeleteMessage, Output deleteResponse As Ens.StreamContainer) As %Status
{
    Set sc = $$$OK
    Set sfToken = {}
    Set deleteResponse = ##class(Ens.StreamContainer).%New()
    Try {
        Do ..GetAuthorizationToken(.sfToken)
        Set sfReq = ..PrepareRequest(sfToken)
        Set sfReq.ContentType = "application/json"
        Do sfReq.Delete("/services/data/"_..SFVersion_"/sobjects/"_deleteRequest.ClassName_"/"_deleteRequest.Id_"/")
        Do sfReq.HttpResponse.OutputToDevice()
        Do deleteResponse.StreamSet(sfReq.HttpResponse.Data)
    } Catch ex {
        Set sc = ex.AsStatus()
    }
    Return sc
}

/// Get the OAuth token
Method GetAuthorizationToken(Output oauthToken As %DynamicObject) As %Status
{
    Set sc = $$$OK
    Set sfCreds = {}
    Try {
        Set sfCreds = {}.%FromJSONFile(..SFCredsFile)
        Set sfReq = ##class(%Net.HttpRequest).%New()
        Set sfReq.Server = sfCreds.instance
        Set sfReq.Https = 1
        Set sfReq.SSLConfiguration = ..SSLConfig
        Do sfReq.InsertFormData("grant_type", "password")
        Do sfReq.InsertFormData("client_id", sfCreds.clientId)
        Do sfReq.InsertFormData("client_secret", sfCreds.clientSecret)
        Do sfReq.InsertFormData("username", sfCreds.username)
        Do sfReq.InsertFormData("password", sfCreds.password)
        Do sfReq.Post("/services/oauth2/token")
        Set oauthToken = {}.%FromJSON(sfReq.HttpResponse.Data.Read())
        Set oauthToken.instance = sfCreds.instance
    } Catch ex {
        Set sc = ex.AsStatus()
    }
    Return sc
}

}

Explore the notes about this implementation:

1. The class extends Ens.OutboundAdapter because Salesforce is not used as the first step of the interoperability production (it does not initiate the integration process, since it is an output, not an input).
2. Developers use the SFVersion property to choose the Salesforce version.
3. The SFCredsFile property sets the credentials file with the sensitive information needed to connect to Salesforce (instance name, username, password, client id, and client secret).
4. SSLConfig sets the SSL configuration used to call the Salesforce HTTPS endpoint.
5. SETTINGS is used by IRIS to show the developer input fields for setting the SFVersion, SFCredsFile, and SSLConfig properties.
6. The PrepareRequest method creates the HttpRequest object and establishes the properties common to all the other methods.
7. The Query method calls the Query API with the SELECT statement passed in the QueryMessage.
8. The GetById method calls the sObject API, passing the Salesforce object class name and the unique Id required to get the record data.
9. The Create method calls the sObject API, passing the Salesforce object class name and the data in JSON format to save a new record with the business data.
10. The Update method calls the sObject API, passing the Salesforce object class name, the unique Id, and the data that will be altered.
11. The Delete method calls the sObject API, passing the Salesforce object class name and the unique Id needed to delete the record.
12. The GetAuthorizationToken method calls the oauth2 API, passing the credentials data to get the authorization token needed to call all the other APIs.
13. The adapter methods return Ens.StreamContainer to support any content size.

Creating the Interoperability Operation for Salesforce

Since InterSystems IRIS does not have an official interoperability operation for Salesforce, we must build one with the help of the created adapter:

Class dc.irissalesforce.SalesforceOperation Extends Ens.BusinessOperation
{

Parameter ADAPTER = "dc.irissalesforce.SalesforceOutboundAdapter";

Parameter INVOCATION = "Queue";

/// Get Salesforce data from an SQL query
Method Query(queryRequest As dc.irissalesforce.QueryMessage, Output queryResponse As Ens.StreamContainer) As %Status
{
    Set sc = $$$OK
    Set queryResponse = ##class(Ens.StreamContainer).%New()
    Set sc = ..Adapter.Query(queryRequest, .queryResponse)
    Return sc
}

/// Get Salesforce object data from an object ID
Method GetById(idRequest As dc.irissalesforce.ByIdMessage, Output queryResponse As Ens.StreamContainer) As %Status
{
    Set sc = $$$OK
    Set queryResponse = ##class(Ens.StreamContainer).%New()
    Set sc = ..Adapter.GetById(idRequest, .queryResponse)
    Return sc
}

/// Create a Salesforce object with data from JSON
Method Create(createRequest As dc.irissalesforce.CreateMessage, Output createResponse As Ens.StreamContainer) As %Status
{
    Set sc = $$$OK
    Set createResponse = ##class(Ens.StreamContainer).%New()
    Set sc = ..Adapter.Create(createRequest, .createResponse)
    Return sc
}

/// Update a Salesforce object with data from JSON
Method Update(updateRequest As dc.irissalesforce.UpdateMessage, Output updateResponse As Ens.StreamContainer) As %Status
{
    Set sc = $$$OK
    Set updateResponse = ##class(Ens.StreamContainer).%New()
    Set sc = ..Adapter.Update(updateRequest, .updateResponse)
    Return sc
}

/// Delete a Salesforce object by Id
Method Delete(deleteRequest As dc.irissalesforce.DeleteMessage, Output deleteResponse As Ens.StreamContainer) As %Status
{
    Set sc = $$$OK
    Set deleteResponse = ##class(Ens.StreamContainer).%New()
    Set sc = ..Adapter.Delete(deleteRequest, .deleteResponse)
    Return sc
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="dc.irissalesforce.QueryMessage"> <Method>Query</Method> </MapItem>
  <MapItem MessageType="dc.irissalesforce.ByIdMessage"> <Method>GetById</Method> </MapItem>
  <MapItem MessageType="dc.irissalesforce.CreateMessage"> <Method>Create</Method> </MapItem>
  <MapItem MessageType="dc.irissalesforce.UpdateMessage"> <Method>Update</Method> </MapItem>
  <MapItem MessageType="dc.irissalesforce.DeleteMessage"> <Method>Delete</Method> </MapItem>
</MapItems>
}

}

Examine the notes about this implementation:

1. The class extends Ens.BusinessOperation (the base class for producing custom business operations).
2. The ADAPTER parameter is dc.irissalesforce.SalesforceOutboundAdapter, our custom adapter with the methods to integrate with Salesforce.
3. The INVOCATION parameter with the Queue value means the message is created within one background job and placed on a queue, simultaneously releasing the original job.
4. The methods Query, GetById, Create, Update, and Delete call the respective methods on the adapter class (dc.irissalesforce.SalesforceOutboundAdapter) using ..Adapter.[method name].
5. Each method has a specific message class:
   a. Query: QueryMessage, with a Query property used to execute the SQL statement.
   b. GetById: ByIdMessage, with an Id property holding the unique id value and a ClassName property with the object class name to be returned.
   c. Create: CreateMessage, with a ClassName property for the object class name to be created and a Content property with the JSON business data to be saved.
   d. Update: UpdateMessage, with a ClassName property for the object class name, an Id property to select the record to be updated, and a Content property with the JSON business data to be saved.
   e. Delete: DeleteMessage, with a ClassName property for the object class name and an Id property for the record to be deleted.
6. The XData MessageMap maps the methods to be called based on the MessageType sent to the operation. For instance, if the Salesforce operation receives a CreateMessage, the Create method will be called.
7. The message classes extend %Persistent to preserve the message history, and Ens.Util.RequestBodyMethods to better monitor, audit, and trace messages.

Creating a Production to use the new Operation

Finally, to test our Salesforce business operation, we will build a new Production:

1. Check the port used by IRIS (in my case, it is 51925).
2. Go to the Management Portal: http://localhost:51926/csp/sys/UtilHome.csp?$NAMESPACE=IRISAPP
3. Proceed to Interoperability > Configure > Production.
4. Click the New button.
5. Set the values for the new Production and click OK.
6. Move to the Settings tab > Development and Debugging section, check the Testing Enabled option, and click the Apply button to save your selection.
7. Click the plus button located near Operations.
8. Set the values for the new operation (the Operation Class is our SalesforceOperation class).
9. Well done! We have TestOperation ready for our tests.
10. Select TestOperation, go to the Settings tab > Connect section, and set the following values:
   a. SFVersion: v62.0
   b. SFCredsFile: the path to the credentials file (in my case, /usr/irissys/creds.json)
   c. SSL Configuration: the SSL configuration (in my case, pm.community.intersystems.com)
11. Click the Apply button to save the TestOperation configuration.
12. Click the production Start button to execute the operation.
13. At this point, the operation's color should turn dark green.
14. Pick the TestOperation operation and proceed to the Actions tab. Then click the Test button.
15. Set the values to test Create:
   Request Type: dc.irissalesforce.CreateMessage
   Content: { "LastName": "InterSystems", "FirstName": "IRIS", "Salutation": "Mr.", "Title": "Datafabric sales", "Company": "InterSystems", "MobilePhone": "555555", "Email": "test@gmail.com", "LeadSource": "Web", "Status": "Working - Contacted", "Rating": "Warm" }
   ClassName: Lead
16. Click the Invoke Testing Service button and check out the results.
17. Click the Visual Trace link to see the details of the tests.
18. Click the [3] step and move to the Contents tab to explore the result details.
19. The record was created successfully, so copy the ID (in my case, it is 00Qbm00000B4YrDEAV).
20. Close the Visual Trace, and choose the Request Type dc.irissalesforce.ByIdMessage. Then expand the Request Details section.
21. Set Id to the ID you copied and ClassName to Lead.
22. Click the Invoke Testing Service button and study the results.
23. Click the Visual Trace, and pick the [3] step. It will display the results on the Contents tab.
24. At this point, you should see the JSON content that you entered in the previous steps.
25. Test the remaining message types and enjoy!
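For reference, the credentials file pointed to by the SFCredsFile setting (e.g. /usr/irissys/creds.json) is a plain JSON document. Based on the fields the adapter reads with %FromJSONFile, it presumably looks like the sketch below; all values are placeholders:

```json
{
  "instance": "myorg-dev-ed.develop.my.salesforce.com",
  "clientId": "YOUR_CONSUMER_KEY",
  "clientSecret": "YOUR_CONSUMER_SECRET",
  "username": "you@example.com",
  "password": "YOUR_PASSWORD"
}
```

Since this file holds sensitive credentials, keep it out of source control and restrict its file permissions.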
Announcement
Derek Robinson · Apr 17

[Video] What is InterSystems OMOP?

Hi, Community! Looking to get actionable insights from your healthcare research? See how InterSystems OMOP can help:

👨‍🔬 What is InterSystems OMOP? With InterSystems OMOP, a cloud-based software-as-a-service, you can transform clinical data into the OMOP format and get faster insights. Benefits include:

- Create research data repositories efficiently.
- Easily ingest, transform, and store data.

🎬 Watch the video to learn more!
Article
Kurro Lopez · Apr 14

InterSystems for dummies – Machine learning

As we all know, InterSystems is a great company. Their products can be just as useful as they are complex. Yet, our pride sometimes prevents us from admitting that we might not understand some of the concepts or products that InterSystems offers us. Today we are beginning a series of articles explaining how some of the intricate InterSystems products work, simply and clearly, of course... because this time, you WILL KNOW for sure what I am talking about.

What (the hell) is Machine Learning?

Machine Learning is a branch of artificial intelligence that focuses on developing algorithms and models that enable computers to learn to perform specific tasks from data, without being explicitly programmed for each task. Instead of following specific instructions, machines learn through experience, identifying patterns in data and making predictions or decisions based on them. The process involves feeding algorithms with datasets (called training sets) to make them learn and improve their performance over time. These algorithms can be designed to perform a wide range of tasks, including image recognition, natural language processing, financial trend prediction, medical diagnosis, and much more. In summary, Machine Learning allows computers to learn from data and improve with experience autonomously, enabling them to perform complex tasks without explicit programming for each situation... It is a lovely definition. Yet, I guess you need an example, so here we go:

Well, imagine that every day you write down somewhere the time of sunrise and sunset. If somebody asked you whether the sun would rise the next day, what would you say? All you have noted was the time of sunrise and sunset. By observing your data, you would conclude that with 100% probability, the sun will rise tomorrow.
However, you cannot ignore the fact that there is a chance that, due to a natural catastrophe, you will not be able to see the sun rising the next day. That is why you should say that the likelihood of witnessing a sunrise the following day is, in fact, 99.99%. Considering your personal experience, you can provide an answer that matches your data. Machine Learning is the same thing, but done by a computer. Look at the table below:

A  B
1  2
2  4
3  6
4  8

How do columns A and B relate to each other? The answer is easy: the value of B is double the value of A. B = A * 2 is the pattern. Now, examine another table:

A  B
1  5
2  7
3  9
4  11

This one is a bit more complicated... If you haven't uncovered the pattern, it is B = (A * 2) + 3. A human can deduce the formula, and the more data you have, the easier it is to guess the pattern behind this mystery. So, Machine Learning uses the same logic to reveal the pattern hidden in the data.

How to start?

First, you will need a computer. Yes, since this article is about Machine Learning, having only a notebook and a pencil will not be enough. Second, you will require an instance of IRIS Community. You can download a Docker image and run your tests there. Note that it must have ML integrated, e.g., the latest version of InterSystems IRIS Community:

docker pull intersystems/iris-ml-community:latest-em

or

docker pull intersystems/iris-community:latest

If you need another platform, check https://hub.docker.com/r/intersystems/iris-ml-community/tags or https://hub.docker.com/r/intersystems/iris-community/tags. Then, create a container from this image and run it:

docker run --name iris-ml -d --publish 1972:1972 --publish 52773:52773 intersystems/iris-ml-community:latest-em

If you are "old-school", you can download a free evaluation version. Yet, it is important to have an InterSystems account. Check it out at https://login.intersystems.com/login/SSO.UI.Register.cls.
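Going back to the A/B tables above: because the hidden pattern is linear, even two rows are enough to recover it exactly. Here is a tiny illustrative Python check (not from the original article) that derives the slope and intercept from two (A, B) pairs:

```python
def fit_line(points):
    """Recover slope and intercept of a linear pattern B = slope*A + intercept
    from two (A, B) pairs."""
    (a1, b1), (a2, b2) = points
    slope = (b2 - b1) / (a2 - a1)
    intercept = b1 - slope * a1
    return slope, intercept

slope, intercept = fit_line([(1, 5), (2, 7)])  # two rows from the second table
print(slope, intercept)         # 2.0 3.0  ->  B = (A * 2) + 3
print(slope * 100 + intercept)  # 203.0 for A = 100
```

Real-world data is noisy and rarely fits a formula exactly, which is why Machine Learning models approximate patterns statistically instead of solving them algebraically like this.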
Afterward, ask for an evaluation copy at https://evaluation.intersystems.com/Eval/. Install it and run your instance. Now, access the IRIS portal at http://localhost:52773/csp/user/EnsPortal.ProductionConfig.zen (User: Superuser, Pass: SYS). Note: You might be asked to change the password the first time. Do not be afraid; just come up with a password that you can easily remember.

Open the "Machine learning configuration" to review the versions you installed. At this point, you can see the ML provider configurations installed. Earth, water, and fire... which is the best? All of them are good. The important thing is how to train your dragon, I mean... your data. Explore more info about the existing providers:

AutoML: an automated Machine Learning system developed by InterSystems and housed within the InterSystems IRIS® data platform. It is designed to build accurate predictive models quickly using your data, automating several key components of the machine-learning process. For more info, see https://docs.intersystems.com/iris20241/csp/docbook/Doc.View.cls?KEY=GAUTOML_Intro

H2O: an open-source Machine Learning platform. The H2O provider does not support the creation of time series models. To discover more, see https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GIML_Configuration_Providers#GIML_Configuration_Providers_H2O

PMML (Predictive Model Markup Language): an XML-based standard that expresses analytics models. It provides a way for applications to define statistical and data mining models so that they can be easily reused and shared. For more info, see https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=APMML

What is the first step?

Just like in the sunrise and sunset example, we need some data to train our model. It is essential to know the objective of the data and the values that should be predicted.
It is also critical to have clean data without any duplicates, and to find out what the minimum data set is. I am going to use the AutoML provider because it is from InterSystems, ha-ha 😉

There are a few kinds of algorithms:

Decision trees: The information is classified question by question to evaluate a probability. Example: will it rain tomorrow? Check whether the sky is cloudy (very or slightly) or clear. If it is very cloudy, then check the humidity. After that, check the temperature... If it is very cloudy, with high humidity and low temperature, then it will rain tomorrow.

Random forests: A set of decision trees, each of which "votes" for a class. The class with the majority of the votes wins.

Neural networks: This does not mean that Skynet is coming... but it is too complicated to explain in just a few words. The general idea is to "copy" the way human neurons work: each "neuron" analyzes its input data and passes its output to the next "neuron" as input.

If you wish to play around with neural networks using Python, you can create one and check how it works. Please have a look at https://colab.research.google.com/drive/1XJ-Lph5auvoK1M4kcHZvkikOqZlmbytI?usp=sharing. Through the link above, you can run a Python routine with the help of the TensorFlow library.
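As a side note, the "will it rain tomorrow?" decision tree described above can be sketched in a few lines of Python. This sketch uses scikit-learn rather than IRIS, and the weather rows are invented purely for illustration:

```python
# A toy version of the "will it rain tomorrow?" decision tree from the text.
# Uses scikit-learn (not IRIS); the weather data is made up for illustration.
from sklearn.tree import DecisionTreeClassifier

# Features: [cloud_cover (0=clear, 1=slightly, 2=very), humidity %, temperature C]
X = [
    [2, 90, 10],   # very cloudy, humid, cold   -> rain
    [2, 85, 12],   # very cloudy, humid, cool   -> rain
    [0, 40, 25],   # clear, dry, warm           -> dry
    [1, 50, 20],   # slightly cloudy, mild      -> dry
    [2, 30, 28],   # very cloudy but dry, warm  -> dry
    [0, 80, 15],   # clear but humid            -> dry
]
y = ["rain", "rain", "dry", "dry", "dry", "dry"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(tree.predict([[2, 88, 11]]))  # a day that looks like the rainy rows
```

The fitted tree asks the same kind of questions the text describes (is the humidity high? is the sky very cloudy?) and follows the answers down to a class.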
To get the pattern of tables A and B, do the following:

import tensorflow as tf
import numpy as np

tableA = np.array([1, 2, 3, 4, 5, 6, 7], dtype=float)
tableB = np.array([5, 7, 9, 11, 13, 15, 17], dtype=float)

hidden1 = tf.keras.layers.Dense(units=3, input_shape=[1])
hidden2 = tf.keras.layers.Dense(units=3)
output = tf.keras.layers.Dense(units=1)  # renamed from "exit", which shadows a Python builtin
model = tf.keras.Sequential([hidden1, hidden2, output])

model.compile(
    optimizer=tf.keras.optimizers.Adam(0.1),
    loss='mean_squared_error'
)

print("Start training...")
history = model.fit(tableA, tableB, epochs=1000, verbose=False)
print("Model trained!")

import matplotlib.pyplot as plt
plt.xlabel("# Epoch")
plt.ylabel("Loss magnitude")
plt.plot(history.history["loss"])

print("Doing a prediction!")
result = model.predict(np.array([100.0]))
print("The result is " + str(result))

print("Internal variables of the model")
print(hidden1.get_weights())
print(hidden2.get_weights())
print(output.get_weights())

The code above uses the values of A and B to build a model that discovers the relation between both columns. When I run the prediction for A = 100, it returns the correct value: in this sample, the prediction is 203.

How does it work in IRIS?

Machine Learning in IRIS is called "IntegratedML". It has shipped since InterSystems IRIS 2023.2 as an experimental feature, meaning that it is not supported for production environments. However, the feature is well tested, and InterSystems believes it can add significant value to customers. You can find more information in the "Using IntegratedML" documentation. Still, since this is an ML lesson for beginners, I will explain how to operate it as simply as possible.

Note: I am using a Docker container with an image from containers.intersystems.com/iris-ml-community:

docker pull containers.intersystems.com/iris-ml-community

You can download the IRIS image and samples from https://github.com/KurroLopez/iris-mll-fordummies.
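A quick sanity check on the network above: because the A/B pattern is exactly linear, a plain least-squares fit recovers it without any training loop. This is a side note, not part of the article's IRIS workflow:

```python
import numpy as np

# The same pairs as tableA/tableB above, where B = A*2 + 3.
A = np.array([1, 2, 3, 4, 5, 6, 7], dtype=float)
B = np.array([5, 7, 9, 11, 13, 15, 17], dtype=float)

# Fit a line B = slope*A + intercept. The data is exactly linear,
# so least squares recovers the pattern directly.
slope, intercept = np.polyfit(A, B, deg=1)
print(f"B = A*{slope:.0f} + {intercept:.0f}")  # B = A*2 + 3
print(slope * 100 + intercept)                 # the prediction for A = 100
```

The neural network needs 1000 epochs to approximate what a closed-form fit gives immediately here; the network only pays off when the hidden pattern is not a simple line.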
📣 TIP: You can open a terminal from Docker with the following command:

docker-compose exec iris iris session iris

Sleepland University study

Sleepland University has done extensive research on insomnia, conducting thousands of interviews and building a database with various parameters of patients with and without sleeplessness. The collected data includes the following:

Gender (male/female)
Age (the age of the person in years)
Occupation (the occupation or profession of the person)
Sleep Duration (the number of hours the person sleeps per day)
Quality of Sleep (a subjective rating of the quality of sleep, ranging from 1 to 10)
Physical Activity Level (the number of minutes the person engages in physical activity daily)
Stress Level (a subjective rating of the stress level experienced by the person, ranging from 1 to 10)
BMI Category (the BMI category of the person: Underweight, Normal, Overweight)
Systolic (systolic blood pressure)
Diastolic (diastolic blood pressure)
Heart Rate (the resting heart rate of the person in BPM)
Daily Steps (the number of steps the person takes per day)
Sleep Disorder (None, Insomnia, Sleep Apnea)

For the first sample, I created a class (St.MLL.insomniaBase) with the columns mentioned above:

Class St.MLL.insomniaBase Extends %Persistent
{

/// Gender of patient (male/female)
Property Gender As %String;

/// The age of the person in years
Property Age As %Integer;

/// The occupation or profession of the person
Property Occupation As %String;

/// The number of hours the person sleeps per day
Property SleepDuration As %Numeric(SCALE = 2);

/// A subjective rating of the quality of sleep, ranging from 1 to 10
Property QualitySleep As %Integer;

/// The number of minutes the person engages in physical activity daily
Property PhysicalActivityLevel As %Integer;

/// A subjective rating of the stress level experienced by the person, ranging from 1 to 10
Property StressLevel As %Integer;

/// The BMI category of the person: Underweight, Normal, Overweight
Property BMICategory As %String;

/// Systolic blood pressure
Property Systolic As %Integer;

/// Diastolic blood pressure
Property Diastolic As %Integer;

/// The resting heart rate of the person in BPM
Property HeartRate As %Integer;

/// The number of steps the person takes per day
Property DailySteps As %Integer;

/// None, Insomnia, Sleep Apnea
Property SleepDisorder As %String;

}

Then, I built some classes extending insomniaBase, called insomnia01, insomniaValidate01, and insomniaTest01. This gave me the same columns in each table. Eventually, we will need to populate our tables with sample values, so I designed a class method for that purpose.

Class St.MLL.insomnia01 Extends St.MLL.insomniaBase
{

/// Populate values
ClassMethod Populate() As %Status
{
    write "Init populate "_$CLASSNAME(),!
    &sql(TRUNCATE TABLE St_MLL.insomnia01)
    ……
    write $CLASSNAME()_" populated",!
    Return $$$OK
}

}

📣 TIP: To open the terminal, type the following command:

docker-compose exec iris iris session iris

Using the terminal, call the Populate method of this class:

Do ##class(St.MLL.insomnia01).Populate()

If we did everything right, we now have a table with the values for training our ML model. We also need to create a new table for validation. It is easy, because you will only require a part of the data provided for the training; in this case, it will be 50% of the items. Please run the following statement in the terminal:

Do ##class(St.MLL.insomniaValidate01).Populate()

Finally, we will prepare some test data to see the results of our training:

Do ##class(St.MLL.insomniaTest01).Populate()

Train, train, and train... you will become stronger

Now we have all the data needed to train our model. How do we do it?
You will only need four simple instructions:

CREATE MODEL
TRAIN MODEL
VALIDATE MODEL
SELECT ... PREDICT

Creating the model

CREATE MODEL creates the Machine Learning model metadata by specifying the model's name, the target field to be predicted, and the dataset that will supply the target field. In our sample, we have several parameters for evaluating sleep disorders, so we will design the following models:

insomnia01SleepModel: by gender, age, sleep duration, and quality of sleep. Checks whether age and sleeping habits affect any sleep disorder type.
insomnia01BMIModel: by gender, age, occupation, and BMI category. Examines whether age, occupation, and BMI affect any sleep disorder type.
insomnia01AllModel: all factors. Inspects whether all factors together affect any sleep disorder type.

We are going to create all those models now. Using the SQL manager in the IRIS portal, type the following statement:

CREATE MODEL insomnia01AllModel PREDICTING (SleepDisorder) FROM St_MLL.insomnia01

At this point, our model knows which column to predict. You can check what was created and what the predicting column contains with the statement below:

SELECT * FROM INFORMATION_SCHEMA.ML_MODELS

Ensure that the predicting column name and the columns themselves are correct. However, we also want to add different model types, since we wish to predict sleep disorders according to specific factors rather than all fields. In this case, we are going to use the WITH clause to specify the columns that should be used as parameters to make the prediction. To use the WITH clause, we must indicate the names of the columns and their types.

CREATE MODEL insomnia01SleepModel PREDICTING (SleepDisorder) WITH (Gender varchar, Age integer, SleepDuration numeric, QualitySleep integer) FROM St_MLL.insomnia01

CREATE MODEL insomnia01BMIModel PREDICTING (SleepDisorder) WITH (Gender varchar, Age integer, Occupation varchar, BMICategory varchar) FROM St_MLL.insomnia01

Make sure that all those models have been successfully created.
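The WITH clause restricts which columns a model may see. In plain Python terms, this is just column projection; the two records and the project() helper below are hypothetical, shown only to illustrate the idea:

```python
# Two invented patient records with the article's column names.
rows = [
    {"Gender": "Female", "Age": 29, "Occupation": "Nurse", "BMICategory": "Normal",
     "SleepDuration": 6.1, "QualitySleep": 6, "SleepDisorder": "Insomnia"},
    {"Gender": "Male", "Age": 41, "Occupation": "Engineer", "BMICategory": "Overweight",
     "SleepDuration": 7.5, "QualitySleep": 8, "SleepDisorder": "None"},
]

def project(records, columns):
    """Keep only the feature columns a given model is allowed to see."""
    return [[record[c] for c in columns] for record in records]

# insomnia01SleepModel only sees gender, age, and the sleep columns:
sleep_features = project(rows, ["Gender", "Age", "SleepDuration", "QualitySleep"])
# insomnia01BMIModel only sees gender, age, occupation, and BMI:
bmi_features = project(rows, ["Gender", "Age", "Occupation", "BMICategory"])
print(sleep_features[0])  # ['Female', 29, 6.1, 6]
```

Each model is then trained on its own slice of the same underlying table, which is exactly what the three CREATE MODEL statements above set up.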
Training the model

The TRAIN MODEL command runs the AutoML engine and specifies the data that will be used for training. The FROM syntax is generic and allows the same model to be trained multiple times on various data sets. For instance, you may train a model with data from Sleepland University or Napcity University; the important thing is for the data sources to have the same fields, with the same names and the same types.

The AutoML engine automatically performs all necessary machine-learning tasks. It identifies relevant candidate features from the selected data, evaluates feasible model types based on the data and problem definition, and sets hyperparameters to create one or more viable models. Our data set has 50 records, which is enough for this training.

TRAIN MODEL insomnia01AllModel FROM St_MLL.insomnia01

Do the same with the other models:

TRAIN MODEL insomnia01SleepModel FROM St_MLL.insomnia01
TRAIN MODEL insomnia01BMIModel FROM St_MLL.insomnia01

You can find out whether your models have been properly trained with the following statement:

SELECT * FROM INFORMATION_SCHEMA.ML_TRAINED_MODELS

Next, it is necessary to validate the model and the training with the VALIDATE MODEL command.

Validating the model

At this stage, we need to confirm that the model has been trained properly, so we run the VALIDATE MODEL command.

📣 Remember: The validation class was populated with 50% of the data from the training data source.

VALIDATE MODEL returns simple metrics for regression, classification, and time series models based on the provided testing set. Validate the first model with the statement below:

VALIDATE MODEL insomnia01AllModel FROM St_MLL.insomniaValidate01

Repeat it with the other models:

VALIDATE MODEL insomnia01SleepModel FROM St_MLL.insomniaValidate01
VALIDATE MODEL insomnia01BMIModel FROM St_MLL.insomniaValidate01

Consuming the model

Now we will consume the model and inspect whether it has accurately learned how to produce the result value.
With the help of SELECT ... PREDICT, we are going to forecast the value of the result. To do that, we will use the insomniaTest01 table populated before.

SELECT *, PREDICT(insomnia01AllModel) FROM St_MLL.insomniaTest01

The result looks odd, considering that the test data is 50% of the data used to train the model... Why has a 29-year-old female nurse been diagnosed with "Insomnia", whereas the model predicted "Sleep Apnea"? (See ID 54.)

Let's examine the other models (insomnia01SleepModel and insomnia01BMIModel), created with different columns. Don't worry: I will display only the columns used to design them.

SELECT Gender, Age, SleepDuration, QualitySleep, SleepDisorder, PREDICT(insomnia01SleepModel) As SleepDisorderPrediction FROM St_MLL.insomniaTest01

You can see again that a 29-year-old female has been diagnosed with "Insomnia", whereas the prediction states "Sleep Apnea". OK, you are right: we also need to know how confident the model was in this final value.

How can we know the probability of a prediction?

To find out, we use the PROBABILITY function. It returns a value between 0 and 1. Note that it is not the probability of the predicted class: it is the probability of the specific value that you ask about. Here is a good example:

SELECT *, PREDICT(insomnia01AllModel) As SleepDisorderPrediction, PROBABILITY(insomnia01AllModel FOR 'Insomnia') as ProbabilityPrediction FROM St_MLL.insomniaTest01

This is the probability of getting "Insomnia" as a sleep disorder. Our nurse, a 29-year-old woman diagnosed with "Insomnia", has a 49.71% chance of having insomnia. Still, the prediction is "Sleep Apnea"... Why? Is the probability the same for other models?
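The PREDICT/PROBABILITY distinction above exists in most ML libraries: the predicted class is the one with the highest score, while you can separately ask for the score of any single class, winner or not. A minimal scikit-learn sketch of the same idea (invented sleep-hours data, not the AutoML engine):

```python
# PREDICT returns the most probable class; PROBABILITY(model FOR 'x') returns
# the score of the one class you name, which need not be the winner.
# Same distinction in scikit-learn, with made-up data for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[5.0], [5.5], [6.0], [7.5], [8.0], [8.5]])   # hours of sleep
y = ["Sleep Apnea", "Sleep Apnea", "Sleep Apnea", "None", "None", "None"]

clf = LogisticRegression().fit(X, y)
sample = np.array([[5.8]])

predicted = clf.predict(sample)[0]                            # like PREDICT(...)
proba = dict(zip(clf.classes_, clf.predict_proba(sample)[0]))
print(predicted)             # the winning class
print(proba["Sleep Apnea"])  # like PROBABILITY(... FOR 'Sleep Apnea')
```

This is why a row can show a 49.71% probability for "Insomnia" while the prediction is "Sleep Apnea": the class you asked about simply is not the one with the highest score.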
SELECT Gender, Age, SleepDuration, QualitySleep, SleepDisorder, PREDICT(insomnia01SleepModel) As SleepDisorderPrediction, PROBABILITY(insomnia01SleepModel FOR 'Insomnia') as ProbabilityInsomnia, PROBABILITY(insomnia01SleepModel FOR 'Sleep Apnea') as ProbabilityApnea FROM St_MLL.insomniaTest01

Finally, it is a bit clearer now. According to the data (sex, age, sleep quality, and sleep duration), the probability of having insomnia is only 34.63%, whereas the chance of having sleep apnea is 64.18%.

Wow, this is very interesting! Still, we were using only a small portion of data inserted directly into a table with a class method... How can we upload a huge file with data? Please wait for the next article; it is coming soon.

Thank you @Francisco.López1549, it is very deep and easy to read. Have you ever had a real use case for Machine Learning in your daily work? I'm just asking out of curiosity.
Announcement
Irène Mykhailova · Apr 24

InterSystems Ideas News #21

Hi Community! Welcome to Issue #21 of the InterSystems Ideas newsletter! This edition highlights the latest news from the Ideas Portal, such as:

✓ General statistics
✓ Community Opportunity ideas

Here are some March numbers for you. During this month, we had:

19 new ideas
1 implemented idea
6 comments
72 votes

👏 Thanks to everyone who contributed in any way to the Ideas Portal last month.

In recent months, you've added several ideas that were categorized as Community Opportunity, which means that any member of the Developer Community is welcome to implement them and thus pave their way to the Hall of Fame! So here they are:

Dapper support in IRIS (Vadim Cherdak)
InterSystems IRIS Project Initializer (@Yuri.Gomes)
Use InterSystems Interoperability as a Traceability tool for GenAI LLM pipelines (@Evgeny.Shvarov)
Introduce Flyway Support in IRIS (@Evgeny.Shvarov)
Load Data on VSCode (@Yuri.Gomes)
Add InterSystems wrapper for Supabase (@Evgeny.Shvarov)

✨ Share your ideas, support your favorites with comments and votes, and bring the most interesting ones to life! 🙏
Announcement
Anastasia Dyubaylo · Jan 27

[Video] Moving to InterSystems Reports

Hey Community,

Enjoy the new video on InterSystems Developers YouTube:

⏯ Moving to InterSystems Reports @ Global Summit 2024

Learn how SHD Einzelhandelssoftware GmbH is leveraging InterSystems Reports, why they chose to use it, the benefits, and the lessons learned.

Presenters:
🗣 Eric Hoelper, Managing Director, SHD Einzelhandelssoftware GmbH
🗣 @Michael.Braam, Senior Sales Engineer, InterSystems

Watch, learn, and grow with us. Subscribe to never miss a beat! 👍
Announcement
arun kumar · Mar 19

Looking for Opportunities in InterSystems Technology

Hi everyone,

I am looking for new opportunities in the InterSystems technology space and would love to connect with professionals and organizations working in this ecosystem. With 12+ years of experience in software development, I specialize in:

✅ InterSystems Technologies:
• InterSystems IRIS & IRIS for Health, Caché, MUMPS, GT.M, Ensemble
• InterSystems HealthShare 2023.1, ODBC & MSSQL Integration, Data Exchange & System Design
• Clinical Viewer, Registries (Patient, Clinical, Facility), Consent Policies

✅ Healthcare IT & Interoperability:
• VA VistA EHR Applications, EMR, EHR, HL7, FHIR, CCDA
• Data Interoperability, Clinical Data Exchange, and Scalable Healthcare Solutions

✅ Enterprise & Solution Architecture:
• Scalable, High-Performance Systems in Healthcare & BFSI
• Investment & Wealth Management: fintech innovations, risk analytics, and digital transformations

✅ AI & Data Science:
• AI, NLP, Digital Image Processing for Healthcare & Fintech Applications
• Courses in NLP & Digital Image Processing from IIIT Hyderabad

Additionally, I hold an Executive Program in Business Management (EPBM) from IIM Calcutta, strengthening my strategic decision-making and leadership capabilities. Previously, I worked at Franklin Templeton Investments, Optum (UnitedHealth Group), and IQVIA, contributing to enterprise solutions, fintech innovations, and large-scale system integrations.

I am eager to explore collaborations and contribute to cutting-edge InterSystems-based solutions. If you know of any opportunities or would like to connect, feel free to reach out! Looking forward to engaging with the community!

Best regards,
Arun Kumar Durairaj
+918408803322
darunk67@gmail.com
linkedin.com/in/arun-kumar-d-14159457
Announcement
Irène Mykhailova · Mar 25

InterSystems Ideas News #20

Hi Community! Welcome to Issue #20 of the InterSystems Ideas newsletter! This edition highlights the latest news from the Ideas Portal, such as:

✓ General statistics
✓ Results of the "DC search" sweepstakes
✓ DC search ideas to vote for

Here are some numbers from February for you. During this month, we had:

20 new ideas
2 implemented ideas
17 comments
59 votes

👏 Thanks to everyone who contributed in any way to the Ideas Portal last month.

In our sweepstakes dedicated to improving the DC search, we've received 17 unique ideas! We will review them in the next few weeks and try our best to implement your great suggestions! Moreover, we're happy to announce the winner of our sweepstakes, @Jiayan.Xiang, who will soon receive the prize. More sweepstakes are in the works, so don't miss your chance to become a lucky winner!

To wrap up this bulletin, check out the DC search suggestions and vote for your favorites, so we know what you're looking forward to the most:

Search by key topics and parameters (@Andre.LarsenBarbosa)
Improving Search Results Display (@Andrew.Leniuk)
Empty Search Query (@Andrew.Leniuk)
Typo Correction and Synonym Search (@Andrew.Leniuk)
Personalized search (@Andrew.Leniuk)
Search by version (@DavidUnderhill)
DC Search by date range (@Yuri.Gomes)
Remember user picked search result (@Jiayan.Xiang)
Perform results query through translation (@Andre.LarsenBarbosa)
Combined filter for search (@Andre.LarsenBarbosa)
In relevance search order by date desc (@Iryna.Mykhailova)
Phonetic search in the community (@Andre.LarsenBarbosa)
AI-Powered Recommendations (@diba)
Add filtering by type of post in DC search (@Iryna.Mykhailova)
People Also Search For (@Yuri.Gomes)
Voice search (@Yuri.Gomes)
Improve selectivity of Articles and Questions in DC (@Robert.Cemper1003)

✨ Share your ideas, support your favorites with comments and votes, and bring the most interesting ones to life! 🙏

Fantastic 👏
Announcement
Irène Mykhailova · Feb 25

InterSystems Ideas News #19

Hi Community! Welcome to Issue #19 of the InterSystems Ideas newsletter! This edition highlights the latest news from the Ideas Portal, such as:

✓ General statistics
✓ New "DC search" sweepstakes
✓ New ideas to support

Here are some numbers from January for you. During this month, we had:

8 new ideas
1 implemented idea
8 comments
24 votes

👏 Thanks to everyone who contributed in any way to the Ideas Portal last month.

After the resounding success of the previous sweepstakes, we've launched a new one dedicated to the DC search. Share your idea(s) on how we can improve our search and get in the running to win a prize! More details are in the announcement.

To wrap up this bulletin, check out the ideas submitted to the Ideas Portal last month. Maybe you will find something to vote for or even implement!

Native JSON Datatype exclusively for JSON (Ashok Kumar) - Future Consideration
Option for Assign append to escape automatically in DTL editor (Mark OReilly) - Needs Review
Add InterSystems wrapper for Supabase (Evgeny Shvarov) - Community Opportunity
Provide list for export (Lewis Houlden) - Needs Review
Load Data on VSCode (Yuri Marx) - Community Opportunity
IRIS Architectural solutions Contest (Yuri Marx) - Needs Review
Introduce Flyway Support in IRIS (Evgeny Shvarov) - Community Opportunity
Add support for SQL Host Variables (Robert Cemper) - Done by Community

✨ Share your ideas, support your favorites with comments and votes, and bring to life the ones you believe matter the most to the Developer Community! 🙏
Announcement
Vadim Aniskin · Dec 14, 2022

InterSystems Ideas News #2

Hello Community! Welcome to the new edition of the InterSystems Ideas News! Learn what we've been up to these past couple of weeks.

Curious about what is going on with all the great ideas you've been submitting to our InterSystems Ideas Portal? Here is the current status breakdown:

✓ 58 ideas are being triaged by InterSystems Product Managers.
✓ 43 ideas can be implemented by Developer Community members.
✓ 11 ideas are being implemented by InterSystems.
✓ 2 ideas are already implemented by InterSystems.
✓ 9 ideas are implemented by Developer Community members.

To make it clearer what stages your ideas go through, here is the diagram:

And to round up this newsletter, here is a list of ideas posted after the Idea-A-Thon:

Improve Ukrainian translation in IRIS
Code example Full Code Debugger
Promote video contest
Create a tool for IRIS BI to test all the pivots and dashboards if they still work after changes made
Improve Spanish translation in IRIS
Add IRIS as a supported database for Apache Superset
For community articles, let admins (and possibly article authors) pin particular comments to the top
Add address standardization to Normalization (using Project US@ standards)

A tiny reminder: you can filter ideas by status, post new ideas for public discussion, vote for existing ideas, and comment on them on our InterSystems Ideas Portal! Stay tuned for the next InterSystems Ideas news bulletin, and get creative in the meantime!

Hello Community! Several users have asked me why the number of ideas in some statuses they see in InterSystems Ideas differs from the number I mentioned in the news bulletin. I will try to explain it in this comment. Experts analyze your ideas daily, so the status of analyzed or implemented ideas changes. In general, the number of ideas in the "Needs review" status is decreasing, while the number of ideas implemented by InterSystems and ideas that can be implemented by members of the Developer Community is increasing. Have a nice day.
Vadim.

Hello Community! I want to inform you that InterSystems Ideas now has a Portal Guide. The Portal Guide contains: information about the Ideas Portal goal, a complete list of idea statuses, and some links related to the portal. Please share your thoughts here or message @Vadim.Aniskin about what else we can add there. Have a nice day, Vadim.
Announcement
Anastasia Dyubaylo · Jan 13, 2023

InterSystems Developer Tools Contest

Hey Developers,

We'd like to invite you to join our next contest, dedicated to creating useful tools to make your fellow developers' lives easier:

🏆 InterSystems Developer Tools Contest 🏆

Submit an application that helps to develop faster, contributes higher-quality code, and helps to test, deploy, support, or monitor your solution with InterSystems IRIS.

Duration: January 23 - February 12, 2023
Prize pool: $13,500

The topic

💡 InterSystems IRIS developer tools 💡

In this contest, we expect applications that improve the developer experience with IRIS: help to develop faster, contribute higher-quality code, and help to test, deploy, support, or monitor your solution with InterSystems IRIS.

General Requirements:

Accepted applications: apps new to Open Exchange, or existing ones with a significant improvement. Our team will review all applications before approving them for the contest.
The application should work on InterSystems IRIS Community Edition.
Types of applications that match: UI frameworks, IDEs, database management, monitoring, deployment tools, etc.
The application should be open source and published on GitHub.
The README file should be in English, contain the installation steps, and contain either a video demo or/and a description of how the application works.
One developer can enter the competition with a maximum of 3 applications.

Prizes

1. Experts Nomination - a specially selected jury will determine the winners:
🥇 1st place - $5,000
🥈 2nd place - $3,000
🥉 3rd place - $1,500
🏅 4th place - $750
🏅 5th place - $500
🌟 6-10th places - $100

2. Community winners - the applications that receive the most votes in total:
🥇 1st place - $1,000
🥈 2nd place - $750
🥉 3rd place - $500

If several participants score the same number of votes, they are all considered winners, and the money prize is shared among them.
Important Deadlines:

🛠 Application development and registration phase:
January 23, 2023 (00:00 EST): Contest begins.
February 5, 2023 (23:59 EST): Deadline for submissions.

✅ Voting period:
February 6, 2023 (00:00 EST): Voting begins.
February 12, 2023 (23:59 EST): Voting ends.

Note: Developers can improve their apps throughout the entire registration and voting period.

Who can participate?

Any Developer Community member, except for InterSystems employees. Create an account!

👥 Developers can team up to create a collaborative application. Teams of 2 to 5 developers are allowed. Do not forget to highlight your team members (their DC user profiles) in the README of your application.

Helpful resources

✓ Example applications:
iris-rad-studio - RAD for UI
cmPurgeBackup - backup tool
errors-global-analytics - errors visualization
objectscript-openapi-definition - open API generator
Test Coverage Tool - test coverage helper
and many more.

✓ Templates we suggest you start from:
iris-dev-template
rest-api-contest-template
native-api-contest-template
iris-fhir-template
iris-fullstack-template
iris-interoperability-template
iris-analytics-template

✓ For beginners with IRIS:
Build a Server-Side Application with InterSystems IRIS
Learning Path for beginners

✓ For beginners with ObjectScript Package Manager (ZPM):
How to Build, Test and Publish ZPM Package with REST Application for InterSystems IRIS
Package First Development Approach with InterSystems IRIS and ZPM

✓ How to submit your app to the contest:
How to publish an application on Open Exchange
How to submit an application for the contest

Need Help?

Join the contest channel on InterSystems' Discord server or talk with us in the comments to this post.

We can't wait to see your projects! Good luck 👍

By participating in this contest, you agree to the competition terms laid out here. Please read them carefully before proceeding.

What a contest!
It'd be great if someone could implement a tool to export Interoperability components into a local folder, with every change saved in the Interoperability UI. Currently git-source-control can do the job, but it is not complete: some Interoperability components (e.g., business rules) are not being exported, and lookup tables are exported in a non-importable format. I published an idea regarding it.

For your inspiration, see also the ideas with the "Community Opportunity" status on the Ideas Portal. Here are more ideas for your inspiration by @Guillaume.Rongier7183: https://community.intersystems.com/post/kick-webinar-intersystems-developer-tools-contest#comment-212426

@Evgeny.Shvarov as we've covered in GitHub issues, the business rule issue is a product-level issue (in the new Angular rule editor only, not the old Zen rule editor). I clarified https://github.com/intersystems/git-source-control/issues/225 re: the importable format. The non-"wrapped" XML export format is importable by git-source-control and, I believe, by IPM itself, although not by $System.OBJ.Load. It's just a matter of preference/readability. In a package manager context, being loadable by $System.OBJ.Load isn't as important, and while the enclosing <Export> and <Document> tags aren't as annoying for XML files as for XML-exported classes/routines/etc., they're still annoying and distract from the true content of the document.

Also, git-source-control 2.1.0 fixes issues with the import of its own export format. You should try it out. ;)

Nice job, Tim! Thank you for continuing to improve this :)

Hey Developers,

Watch the recording of the Kick-off Webinar on InterSystems Developers YouTube:
⏯ [Kick-off Webinar] InterSystems Developer Tools Contest

Community! There are 5 apps that have been added to the contest already!
GlobalStreams-to-SQL by @Robert.Cemper1003
helper-for-objectscript-language-extensions by @Robert.Cemper1003
gateway-sql by @MikhailenkoSergey
xml-to-udl by @Tommy.Heyding
iris-persistent-class-audit by @Stefan.Cronje1399

Who's the next application going to be?!

Dear Developers! Please use technology bonuses to collect more votes and get closer to victory. 🥳 Happy coding! ✌

I am now in the running with DX Jetpack for VS Code.

Hi Evgeny, The ompare tool has a Schedule option that happily exports classes, routines, productions, lookup tables, and HL7 schemas. It can be configured to traverse multiple namespaces in a single run, generating an export file for each namespace. Wonder if this is useful for your case: a periodic export / snapshot of some components. Cheers, Alex

Devs! Four more applications have been uploaded to the contest!

DX Jetpack for VS Code by @John.Murray
JSONfile-to-Global by @Robert.Cemper1003
apptools-admin by @MikhailenkoSergey
irissqlcli by @Dmitry.Maslennikov

Check them out!

I just uploaded my app... It is the first time I have participated in one of the programming contests... I hope you like it.

My helper was eliminated today by censors: helper-for-objectscript-language-extensions, tag version 0.0.3, released 2023-01-26 22:33:33.