Announcement
Anastasia Dyubaylo · Dec 4, 2023
Hi Community,
Let's meet together at the online meetup with the winners of the InterSystems Java Programming Contest 2023 – a great opportunity to have a chat with the InterSystems Experts team as well as our contestants.
Winners' demo included!
Date & Time: Thursday, December 7, 12 pm EST | 6 pm CET
Join us to learn more about winners' applications and to have a talk with our experts.
➡️ REGISTER TODAY
See you all at our virtual meetup!

Hi Devs,
The "Online Meetup with the winners of the InterSystems Java Programming Contest 2023" starts in 15 minutes!
Follow this link https://us02web.zoom.us/j/9822194974?pwd=bnZBdFhCckZ6c0xOcW5GT1lLdnAvUT09
Or join our YouTube stream - https://youtube.com/live/nET2xSLUwfE?feature=share
Announcement
Anastasia Dyubaylo · Oct 27, 2023
Hi Community,
We're excited to share with you the recording of the next webinar in the series of InterSystems UKI Tech Talk:
👉 Analytics Capabilities using InterSystems IRIS 👈
In this tech talk, we put the spotlight on the analytics capabilities developers have using both the InterSystems IRIS data platform and InterSystems IRIS for Health, including the following.
Adaptive Analytics allows developers to create a business-oriented virtual OLAP model layer between InterSystems IRIS and popular BI client tools like Microsoft Excel, Power BI, or Tableau. By having a centralised common data model, enterprises solve the problem of differing definitions and calculations and provide their end users with one consistent view of business metrics and data characterisation.
Embedded real-time analytics can be created directly on the transactional data model, and a fully automated synchronisation option avoids the need for ETL processing.
Columnar Storage is a new storage option for IRIS SQL tables that offers order-of-magnitude faster analytical queries compared to traditional row storage on IRIS. A new $vector data type supports columnar storage for SQL tables.
To watch the recording, you need to complete the form.
We trust that you'll find this webinar to be valuable 😉
Announcement
Anastasia Dyubaylo · Mar 3, 2023
Hi Community,
Watch this video to explore common security pitfalls within the industry and how to avoid them when building applications on InterSystems IRIS:
⏯ The OWASP Top 10 & InterSystems IRIS Application Development @ Global Summit 2022
Presenters:
🗣 @Timothy.Leavitt, Application Services Development Manager
🗣 @Pravin.Barton, Developer, Application Services
🗣 @Wangyi.Huang, Technical Specialist, Application Services
Subscribe to our YouTube channel InterSystems Developers to stay up to date!
Announcement
Anastasia Dyubaylo · Feb 19, 2023
Hey Developers,
Enjoy watching the new video on InterSystems Developers YouTube:
⏯ Understanding your InterSystems Login Account & Where to Use It @ Global Summit 2022
Learn about your InterSystems Login Account, how to use it to get access to InterSystems Services like the Developer Community, Evaluation Service, Open Exchange, Online Learning, WRC and others. This will also cover the new features for controlling your personal communication preferences.
Presenters:
🗣 @Timothy.Leavitt, AppServices Development Manager, InterSystems
🗣 @Pravin.Barton, Internal Application Developer, InterSystems
Hope you like it and stay tuned! 👍

Correction on this post - I was originally supposed to present but unfortunately was unable to attend Global Summit due to testing positive for COVID the day before :( Call out to @Timothy.Leavitt, who stepped in and presented in my place and did a great job. Watch the video!

I got way too much air time last Summit. Thanks for noticing, guys! Fixed ;)
Article
Evgeny Shvarov · Dec 24, 2022
Hi InterSystems Developers!
Recently I've updated the FHIR dev template so that it now publishes an IPM package, fhir-server, that makes setting up an InterSystems FHIR server a trivial one-command procedure, whether manual, automated, or programmatic.
Please see below how you can benefit from it.
TLDR
USER>zpm "install fhir-server"
All the details below.
Setting up InterSystems FHIR Server without IPM
Of course, you can set up an InterSystems FHIR server without using the IPM package manager. Here are the options:
1. You can set up a cloud FHIR server with a trial of several days by following these instructions. This will be an InterSystems FHIR server in the AWS cloud.
2. You can set up an InterSystems FHIR server on a running InterSystems IRIS for Health instance by following these steps.
3. You can also git clone the repository of this template and run, in the cloned directory:
$ docker-compose up -d
to have InterSystems FHIR server up and running on your laptop.
What I suggest in this article is a replacement for option 2: you can skip all the manual steps and have the FHIR server up and running on a laptop IRIS, either in Docker or on the host OS.
Setting up FHIR server with IPM
DISCLAIMER!! The steps described below refer to a newly installed IRIS for Health instance or to usage with Docker images. The package creates a new namespace and a new web application, so it could possibly affect a setup you already have.
IPM stands for InterSystems Package Manager, previously known as ZPM. Make sure you have the IPM client installed. You can check this by running the zpm command in the IRIS terminal; you should see the following:
IRISAPP>zpm
=============================================================================
|| Welcome to the Package Manager Shell (ZPM). ||
|| Enter q/quit to exit the shell. Enter ?/help to view available commands ||
=============================================================================
zpm:IRISAPP>
You will need IRIS for Health version 2022.x or newer.
How to run IRIS for Health on your laptop
Running on the host operating system
Download the latest IRIS for Health that fits your platform (Windows, Mac, Linux) from the InterSystems Evaluation site and install it. Then install ZPM. Here is a one-liner:
USER>zn "%SYS" d ##class(Security.SSLConfigs).Create("z") s r=##class(%Net.HttpRequest).%New(),r.Server="pm.community.intersystems.com",r.SSLConfiguration="z" d r.Get("/packages/zpm/latest/installer"),$system.OBJ.LoadStream(r.HttpResponse.Data,"c")
Running a docker version
Run this in your terminal to launch it:
docker run --rm --name iris4h -d --publish 9091:1972 --publish 9092:52773 intersystemsdc/irishealth-community
Then start terminal:
docker exec -it iris4h iris session IRIS
Installing FHIR Server
Once IRIS is running, either on the host or in Docker, run in the IRIS terminal:
USER>zpm "install fhir-server"
This will install the FHIR server in the FHIRSERVER namespace with the following parameters:
Set appKey = "/fhir/r4"
Set strategyClass = "HS.FHIRServer.Storage.Json.InteractionsStrategy"
Set metadataPackages = $lb("hl7.fhir.r4.core@4.0.1")
Set metadataConfigKey = "HL7v40"
FHIR REST API will be available at http://yourserver/fhir/r4.
It will also add some synthetic data.
How to check that the server is working
To test the host version:
http://localhost:52773/fhir/r4/metadata
To test the Docker version:
http://localhost:9092/fhir/r4/metadata
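As a quick sanity check, the /metadata endpoint should return a FHIR CapabilityStatement. A minimal Python sketch of what to look for in the response; the helper functions here are illustrative, not part of the fhir-server package:

```python
# Sketch: build the capability-statement URL and validate the response shape.
# Host/port values match the examples above.

def metadata_url(host: str, port: int, app_key: str = "/fhir/r4") -> str:
    """Build the /metadata URL for a FHIR endpoint."""
    return f"http://{host}:{port}{app_key}/metadata"

def looks_like_capability_statement(doc: dict) -> bool:
    """A valid /metadata response is a FHIR CapabilityStatement resource."""
    return doc.get("resourceType") == "CapabilityStatement"

# Docker mapping from the example: container port 52773 -> host port 9092
print(metadata_url("localhost", 9092))  # http://localhost:9092/fhir/r4/metadata
```

Fetching that URL (for example with curl or urllib) and checking `resourceType` confirms the server is up.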
IPM also installs a simple UI, which is available at: yourserver/fhirUI/FHIRAppDemo.html
And you'll see something like this (with patient id=1 entered):
How does it work?
In fact, you can observe what is being installed with this ZPM module in the module.xml scenario. As you can see, it imports code, installs the demo frontend application fhirUI, and runs the post-install script, which calls the following method. The script in that method performs the FHIR server setup.
Installing FHIR server programmatically
You can also install it programmatically via the following command:
set sc=$zpm("install fhir-server")
Happy FHIR coding!
Added host setup and Docker run examples
Article
Alex Woodhead · Jun 15, 2023
A demonstration example for the current Grand Prix contest, using a more complex parameter template to test the AI.
Interview Questions
There is documentation. A recruitment consultant wants to quickly challenge candidates with some technical questions relevant to a role.
Can they automate making a list of questions and answers from the available documentation?
Interview Answers and Learning
One of the most effective ways to cement new facts into accessible long term memory is with phased recall.
In essence you take a block of text information, reorganize it into a series of self-contained Questions and Facts.
Now imagine two questions:
What day of the week is the trash-bin placed outside for collection?
When is the marriage anniversary?
Quickly recalling correct answers can mean a happier life!!
Recalling the answer to each question IS the mechanism to enforce a fact into memory.
Phased Recall re-asks each question with longer and longer time gaps when the correct answer is recalled. For example:
You consistently get the right answer: The question is asked again tomorrow, in 4 days, in 1 week, in 2 weeks, in 1 month.
You consistently get the answer wrong: The question will be asked every day until it starts to be recalled.
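The schedule described above can be sketched in a few lines. This is an illustrative model of phased recall, not Anki's actual scheduling algorithm; the interval list and function names are my own:

```python
# Illustrative phased-recall schedule: correct answers push the next
# review further out; a wrong answer resets to daily review.

INTERVALS_DAYS = [1, 4, 7, 14, 30]  # tomorrow, 4 days, 1 week, 2 weeks, 1 month

def next_interval(streak: int, correct: bool) -> tuple:
    """Return (new_streak, days_until_next_review)."""
    if not correct:
        return 0, 1  # asked again every day until recalled
    streak += 1
    return streak, INTERVALS_DAYS[min(streak - 1, len(INTERVALS_DAYS) - 1)]

# A run of answers: right, right, wrong, right
streak, gap = 0, 1
for answer in [True, True, False, True]:
    streak, gap = next_interval(streak, answer)
print(gap)  # 1 (back to daily review after the lapse)
```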
Since you can easily see which answers are challenging, it is productive to re-work the difficult ones to make them more memorable.
There is a free software package called Anki that provides this full phased recall process for you.
If you can automate the creation of questions and answers into a text file, Anki will create the new flashcards for you.
Hypothesis
We can use LangChain to transform InterSystems PDF documentation into a series of Questions and answers to:
Make interview questions and answers
Make Learner Anki flash cards
Create new virtual environment
mkdir chainpdf
cd chainpdf
python -m venv .
scripts\activate
pip install openai
pip install langchain
pip install wget
pip install lancedb
pip install tiktoken
pip install pypdf
set OPENAI_API_KEY=[ Your OpenAI Key ]
python
Prepare the docs
import glob
import wget
import zipfile

url = 'https://docs.intersystems.com/irisforhealth20231/csp/docbook/pdfs.zip'
wget.download(url)

# extract docs
with zipfile.ZipFile('pdfs.zip', 'r') as zip_ref:
    zip_ref.extractall('.')
Extract PDF text
from langchain.document_loaders import PyPDFLoader
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.text_splitter import CharacterTextSplitter
from langchain.prompts.prompt import PromptTemplate
from langchain import OpenAI
from langchain.chains import LLMChain
# To limit for the example
# From the documentation site I could see that documentation sets
# GCOS = Using ObjectScript
# RCOS = ObjectScript Reference
pdfFiles=['./pdfs/pdfs/GCOS.pdf','./pdfs/pdfs/RCOS.pdf']
# The prompt will be really big and need to leave space for the answer to be constructed
# Therefore reduce the input string
text_splitter = CharacterTextSplitter(
separator = "\n\n",
chunk_size = 200,
chunk_overlap = 50,
length_function = len,
)
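To make the chunk_size/chunk_overlap parameters concrete, here is a minimal sketch of what fixed-size splitting with overlap does. LangChain's CharacterTextSplitter is more sophisticated (it splits on the separator first, then merges pieces up to chunk_size); this helper is my own simplification:

```python
# Minimal fixed-window splitter: each chunk is chunk_size characters,
# and consecutive chunks share `overlap` characters of context.

def split_with_overlap(text: str, chunk_size: int = 200, overlap: int = 50) -> list:
    step = chunk_size - overlap
    return [text[i:i + chunk_size]
            for i in range(0, max(len(text) - overlap, 1), step)]

chunks = split_with_overlap("x" * 500, chunk_size=200, overlap=50)
print([len(c) for c in chunks])  # [200, 200, 200]
```

The overlap keeps sentences that straddle a chunk boundary visible to the LLM in at least one chunk.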
# split document text into chunks
documentsAll = []
for file_name in pdfFiles:
    loader = PyPDFLoader(file_name)
    pages = loader.load_and_split()
    # Strip unwanted padding
    for page in pages:
        del page.lc_kwargs
        page.page_content = "".join(page.page_content.split('\xa0'))
    documents = text_splitter.split_documents(pages)
    # Ignore the cover pages
    for document in documents[2:]:
        # skip table of contents
        if '........' in document.page_content:
            continue
        documentsAll.append(document)
Prep search template
_GetDocWords_TEMPLATE = """From the following documents create a list of distinct facts.
For each fact create a concise question that is answered by the fact.
Do NOT restate the fact in the question.
Output format:
Each question and fact should be output on a separate line delimited by a comma character
Escape every double quote character in a question with two double quotes
Add a double quote to the beginning and end of each question
Escape every double quote character in a fact with two double quotes
Add a double quote to the beginning and end of each fact
Each line should end with {labels}
The documents to reference to create facts and questions are as follows:
{docs}
"""
PROMPT = PromptTemplate(
input_variables=["docs","labels"], template=_GetDocWords_TEMPLATE
)
llm = OpenAI(temperature=0, verbose=True)
chain = LLMChain(llm=llm, prompt=PROMPT)
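The output format the prompt asks for is essentially CSV-style quoting. A hypothetical helper (not part of the article's code) showing what a well-formed line should look like, per the escaping rules in the template:

```python
# Format one (question, fact, label) record per the prompt's rules:
# embedded double quotes are doubled, question and fact are wrapped
# in double quotes, fields are comma-delimited, label ends the line.

def format_line(question: str, fact: str, label: str) -> str:
    q = question.replace('"', '""')
    f = fact.replace('"', '""')
    return f'"{q}", "{f}", {label}'

print(format_line('What is a "global"?',
                  'A global is a sparse, multidimensional database array.',
                  'Using ObjectScript'))
```

Comparing the LLM's output against lines like this makes the deviations (missing commas, unquoted fields) easy to spot.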
Process each document and place output in file
# open an output file
with open('QandA.txt', 'w') as file:
    # iterate over each text chunk
    for document in documentsAll:
        # set the label for the Anki flashcard
        source = document.metadata['source']
        if 'GCOS.pdf' in source:
            label = 'Using ObjectScript'
        else:
            label = 'ObjectScript Reference'
        output = chain.run(docs=document, labels=label)
        file.write(output + '\n')
        file.flush()
There were some retry and force-close messages during the loop.
I anticipate this is the OpenAI API enforcing fair-use limits.
Alternatively, a local LLM could be used instead.
Examine the output file
"What are the contexts in which ObjectScript can be used?", "You can use ObjectScript in any of the following contexts: Interactively from the command line of the Terminal, As the implementation language for methods of InterSystems IRIS object classes, To create ObjectScript routines, and As the implementation language for Stored Procedures and Triggers within InterSystems SQL.", Using ObjectScript,
"What is a global?", "A global is a sparse, multidimensional database array.", Using ObjectScript,
"What is the effect of the ##; comment on INT code line numbering?", "It does not change INT code line numbering.", Using ObjectScript,
"What characters can be used in an explicit namespace name after the first character?", "letters, numbers, hyphens, or underscores", Using ObjectScript
"Are string equality comparisons case-sensitive?", "Yes" Using ObjectScript,
"What happens when the number of references to an object reaches 0?", "The system automatically destroys the object.",Using ObjectScript
Question: "What operations can take an undefined or defined variable?", Fact: "The READ command, the $INCREMENT function, the $BIT function, and the two-argument form of the $GET function.", Using ObjectScript, a
While the model made a good attempt at the requested formatting, there is some deviation.
Reviewing manually, I can pick some questions and answers to continue the experiment.
Importing FlashCards into Anki
Reviewed text file:
"What are the contexts in which ObjectScript can be used?", "You can use ObjectScript in any of the following contexts: Interactively from the command line of the Terminal, As the implementation language for methods of InterSystems IRIS object classes, To create ObjectScript routines, and As the implementation language for Stored Procedures and Triggers within InterSystems SQL.", "Using ObjectScript","What is a global?", "A global is a sparse, multidimensional database array.", "Using ObjectScript","What is the effect of the ##; comment on INT code line numbering?", "It does not change INT code line numbering.", "Using ObjectScript","What characters can be used in an explicit namespace name after the first character?", "letters, numbers, hyphens, or underscores", "Using ObjectScript""Are string equality comparisons case-sensitive?", "Yes", "Using ObjectScript","What happens when the number of references to an object reaches 0?", "The system automatically destroys the object.","Using ObjectScript""What operations can take an undefined or defined variable?", "The READ command, the $INCREMENT function, the $BIT function, and the two-argument form of the $GET function.", "Using ObjectScript"
Creating new Anki card deck
Open Anki and select File -> Import
Select the reviewed text file
Optionally create a new Card Deck for "Object Script"
A basic card type is fine for this format
There was mention of a "Field 4", so the records should be checked.
Anki import success
Let's Study
Now choose the reinforcement schedule
Happy Learning !!
References
Anki software is available from https://apps.ankiweb.net/
Article
Claudio Devecchi · Jun 20, 2023
In this article, I will share the theme that @Rochael.Ribeiro and I presented at the Global Summit 2023, in the Tech Exchange room.
On this occasion, we talk about the following topics:
Open Exchange Tools for Fast APIs
Open API Specification
Traditional versus Fast API development
Composite API (Interoperability)
Spec-First or API-First Approach
API Governance & Monitoring
Demo (video)
Open Exchange Tools for Fast APIs
As we are talking about fast, modern API development (REST/JSON), we will use two InterSystems Open Exchange tools:
The first is a framework for rapid development of APIs which we will detail in this article.
https://openexchange.intersystems.com/package/IRIS-apiPub
The second is Swagger as a user interface for the specification and documentation of the REST APIs developed on the IRIS platform, as well as for their use/execution. The basis for its operation is the Open API Specification (OAS) standard, described below:
https://openexchange.intersystems.com/package/iris-web-swagger-ui
What is the Open API Specification (OAS)?
It is a standard used worldwide to define, document, and consume APIs. In most cases, APIs are designed even before implementation. I'll talk more about it in the next topics.
It is important because it defines and documents REST APIs for use on both the provider and consumer sides. This standard also helps speed up tests and API calls in REST API client tools on the market, such as Swagger, Postman, Insomnia, etc.
The traditional way to publish an API using IRIS
Imagine we have to build and publish a Rest API from an existing IRIS method (picture below).
In the traditional way:
1: We have to think about how consumers will call it. For example: which path and verb will be used, and what the response will look like, whether a JSON object or plain text.
2: Build a new method in a %CSP.REST class that will handle the HTTP request and call it.
3: Transform the method's response into the intended HTTP response for the end user.
4: Think about how we're going to provide the success code and how we're going to handle exceptions.
5: Map the route for our new method.
6: Provide the API documentation to the end user. We will probably build the OAS content manually.
7: And if, for example, we have a request or response payload (object), the implementation time will increase, because it must also be documented in OAS.
How can we be faster?
By simply tagging the IRIS method with the [WebMethod] attribute. Whatever the method is, the framework will take care of its publication, using the OAS 3.x standard.
Why is the OAS 3.x standard so important?
Because it also documents in detail all the properties of the input and output payloads.
In this way, any REST client tool on the market, like Insomnia, Postman, or Swagger, can instantly couple to the APIs and provide sample content to call them easily.
Using Swagger we will already visualize our API (image above) and call it. This is also very useful for testing.
API customization
But what if I need to customize my API?
For example: Instead of the name of the method I want the path to be something else. And I want the input parameters to be in the path, not like a query param.
We define a specific notation on top of the method, where we can complement the meta-information that the method itself does not provide.
In this example we are defining another path for our API and complementing the information for the end user to have a more friendly experience.
Projection Map for Rest API
This framework supports numerous types of parameters.
In this map we can highlight the complex types (the objects). They will be automatically exposed as a JSON payload, and each property will be properly documented (OAS) for the end user.
Interoperability (Composite APIs)
By supporting complex types, you can also expose Interoperability Services.
It is a favorable scenario for building composite APIs, which use the orchestration of multiple external components (outbounds).
This means that objects or messages used as request or response will be automatically published and read by tools like swagger.
And it's an excellent way to test interoperability components, because usually a payload template is already loaded so that the user knows which properties the API uses.
First the developer can focus on testing, then shape the API through customization.
Spec-first or Api-first Approach
Another concept widely used today is having the API defined even before its implementation.
With this framework it is possible to import an Open API spec. It creates the method structure (spec) automatically, leaving only the implementation to be done.
API Governance and Monitoring
For API governance, it is also recommended to use IAM.
In addition to having multiple plugins, IAM can quickly couple to APIs through the OAS standard.
apiPub offers additional tracing for APIs (see the demo video).
Demo
Download & Documentation
InterSystems Open Exchange: https://openexchange.intersystems.com/?search=apiPub
Complete documentation: https://github.com/devecchijr/apiPub
Very rich material on a new development method, bringing convenience and innovation. This will help a lot with the agility of API development.
Congratulations, Claudio, for sharing the knowledge. Thank you @Thiago.Simoes
Announcement
Bob Kuszewski · Sep 28, 2023
Red Hat Insights alerts now available for InterSystems IRIS
InterSystems and Red Hat are working together to add IRIS-specific alerts to Red Hat Insights.
Red Hat Insights is a service to predict and recommend remediations for system risks in Red Hat Enterprise Linux environments. Insights is free with nearly every RHEL, OpenShift, or Ansible subscription. You can learn more about Insights at Red Hat’s site.
Swappiness Recommendation
The first recommendation “Apply swappiness recommendation for better performance of InterSystems IRIS” has been activated.
This recommendation checks the system memory and, if swap is used at a level above what we recommend, sends our recommendation for the ideal level of swappiness. Swappiness really only comes into play under memory pressure. The file buffer cache can grow as large as fits in memory, and Linux will only push out buffer cache when it needs to free pages for computation pages, so we recommend keeping swappiness at a low level.
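For readers who want to check their current setting by hand, a small sketch of reading the kernel's swappiness value on Linux. The threshold used here is illustrative, not Red Hat's or InterSystems' exact rule:

```python
# Read and evaluate the current vm.swappiness value on a Linux host.

def parse_swappiness(text: str) -> int:
    """Parse the contents of /proc/sys/vm/swappiness."""
    return int(text.strip())

def is_low_swappiness(value: int, threshold: int = 10) -> bool:
    """Illustrative cut-off; the actual recommended level may differ."""
    return value <= threshold

try:
    with open("/proc/sys/vm/swappiness") as fh:
        current = parse_swappiness(fh.read())
        verdict = "low" if is_low_swappiness(current) else "consider lowering"
        print("swappiness:", current, "-", verdict)
except FileNotFoundError:
    pass  # not a Linux host
```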
Upcoming Recommendations
Red Hat is currently working on Insights recommendations for:
Huge Pages settings
Upgrading old installations
Shmmax
… and more
If you have suggestions for further Red Hat Insights alerts for IRIS, please contact your account team or reach out to me directly.
If you have questions or problems with Red Hat Insights, please contact Red Hat support.

Hi Bob,
This is great! We are already seeing some recommendations in Red Hat Insights for one of our dev systems.
The performance of InterSystems IRIS server may be impacted when Transparent Huge Pages is enabled
Recommend running SystemPerformance 24-hour daily for InterSystems IRIS
Map the Write Image Journaling (WIJ) disk for better performance of InterSystems IRIS
Enable FreezeOnError for the integrity and recoverability of InterSystems IRIS database
Apply swappiness recommendation for better performance of InterSystems IRIS
Apply hugepages recommendation for better performance of InterSystems IRIS
Question
Lauri Kummala · Nov 1, 2023
Hello, I am quite new to InterSystems, SAM, and Grafana. I am looking for a way to get Log Files and Application Errors from InterSystems into SAM, like in this post: https://community.intersystems.com/post/grafana-support-intersystems-iris. Is this plugin still under development, or is there any other way to get those errors? Using SAM, I do not get those errors.

Hi!
May be with something like this
https://community.intersystems.com/post/get-most-out-intersystems-sam-implement-your-own-alert-handler
or with a custom setup of grafana/prometheus to send alert and not only visualize them.
Be advised also that it's going to be deprecated:
https://community.intersystems.com/post/deprecation-intersystems-system-alerting-and-monitoring-sam

Thank you!
We heard that SAM is being deprecated, and we were thinking of another solution.
Article
Luis Angel Pérez Ramos · Feb 7, 2024
In this article we are going to see how we can use the WhatsApp instant messaging service from InterSystems IRIS to send messages to different recipients. To do this we must create and configure an account in Meta and configure a Business Operation to send the messages we want.
Let's look at each of these steps in more detail.
Setting up an account on Meta
This is possibly the most complicated part of the entire configuration, since we will have to configure a series of accounts before we have the messaging functionality.
Here you can read the official Meta documentation.
First we will create a personal Meta account, thus giving our soul to Zuckerberg:
The next step is the creation of a business account that will allow us to use WhatsApp services in our applications and that will be linked to our personal account:
Then we register as developers from here. The next step, once inside the developer account, is to create an application:
Following the instructions in the documentation, we select the application type "Other":
And then a business application type:
In the last step we will assign the name of the application and link it with our business account to be able to use the WhatsApp functionalities.
Finally, after this long and tedious process of creating several accounts, we will have our application ready to be configured with the WhatsApp functionality.
You can see in the menu on the left that a new option called WhatsApp will be available once configured. By accessing the API Setup option, you can see everything we need to connect to the messaging functionality.
What we see on this screen:
We have a test number from which the messages will be sent (From) to our recipients, identified by an ID that we will later use to make calls to the API from our IRIS.
We have defined a destination number (To) to which we will send our test messages (we must register it previously to accept receiving the messages).
An Authentication Token has been generated with a validity of 24 hours.
In our call we must send our data in JSON format as follows:
{
  "messaging_product": "whatsapp",
  "to": "",
  "type": "template",
  "template": {
    "name": "hello_world",
    "language": { "code": "en_US" }
  }
}
For this example we are going to use a message template, although we could send any text. As you can also see, all we need is to make an HTTP POST call to the URL defined in the example:
https://graph.facebook.com/{Version}/{PhoneID}/messages
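To make the URL and payload shapes concrete, here is a Python sketch of the same call. The version, phone ID, and recipient values are placeholders; actually sending requires a valid 24-hour token from the Meta developer console:

```python
import json

GRAPH_HOST = "https://graph.facebook.com"

def build_request(version: str, phone_id: str, to: str, template: str):
    """Build the URL and JSON body for a WhatsApp template message."""
    url = f"{GRAPH_HOST}/{version}/{phone_id}/messages"
    body = {
        "messaging_product": "whatsapp",
        "to": to,
        "type": "template",
        "template": {"name": template, "language": {"code": "en_US"}},
    }
    return url, json.dumps(body)

# Placeholder values for illustration only
url, payload = build_request("v18.0", "123456789", "34600000000", "hello_world")
print(url)  # https://graph.facebook.com/v18.0/123456789/messages
```

The IRIS Business Operation later in the article builds the same request with %Net.HttpRequest.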
For our example we are going to create 3 different templates, so we can see how we could configure different messages. We have accessed this option from the link shown in step 2 of the API configuration.
Well, now we have everything to start sending messages to our client. Let's proceed to configure our IRIS instance.
Configuring IRIS
For our example we are going to configure a small production that simulates the reception of HL7 ORU type messages with glucose level data for a certain patient. Depending on the level received, we will send the patient one message template or another.
Business Service
We will start by creating a Business Service to capture HL7 messaging from a file:
It will receive a message like this:
MSH|^~\&|HIS|HULP|APP||20230330133551||ORU^R01|71186|P|2.5.1
PID|||1502935519^^^SERMAS^SN~184001^^^HULP^PI||CABEZUELA SANZ^PEDRO^^^||20160627|M|||PASEO JULIA ÁLVAREZ 395 3 E^^MADRID^MADRID^28909^SPAIN||6XXXXXXXX^PRN^^PEDRO.CABEZUELA@GMAIL.COM|||||||||||||||||N|
PV1||O|||||0545128263Q^MARTÍNEZ FERNÁNDEZ^SUSANA^^MD^^^^|||||||1|||||173815|||||||||||||||||||||||||20230330133551|20230330133551
ORC|1|921099|131777||||^^^20231126133551||20230330133551|||0269410060K^URDANETA LÓPEZ^SUSANA^^MD^^^^|HULP||||||||HULP||||||||LAB
OBR|1|921099|131777|LAB^LABORATORY^L||||||||||||0269410060K^URDANETA LÓPEZ^SUSANA^^MD^^^^|||||||||F
OBX|1|NM|GLU^Glucosa|1|200|mg/dL|70-105|N|||F|||20231123124525||Lectura desde dispositivo de control|1|
Business Process
Once the message is received, we will send it to a Business Process that will transform the HL7 message into a type of message created by us and that will have the information that is relevant to us. As you can see it will be a very simple BPL:
If we take a look at the transformation, we will see how, depending on the glucose level and the defined limits, we indicate the message template that we are going to use:
Business Operation
This component will be responsible for sending the POST call to the WhatsApp server. To do this, we will define the EnsLib.HTTP.OutboundAdapter class as the component's adapter. Here you can see the class code:
Class Whatsapp.BO.WhatsAppBO Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.HTTP.OutboundAdapter";

Parameter INVOCATION = "Queue";

Property Version As %String(MAXLEN = 5);

Property PhoneNumberId As %String(MAXLEN = 15);

Property Token As %String(MAXLEN = 1000);

Parameter SETTINGS = "Version,PhoneNumberId,Token";

Method SendMessage(pRequest As Whatsapp.Message.WhatsAppRequest, Output pResponse As Whatsapp.Message.WhatsAppResponse) As %Status
{
    set tSC = $$$OK
    // Build the message body for the WhatsApp template API
    set body = {
        "messaging_product": "whatsapp",
        "to": "",
        "type": "template",
        "template": {
            "name": "",
            "language": {
                "code": "en"
            }
        }
    }
    do body.%Set("to", pRequest.telephone)
    do body.template.%Set("name", pRequest.template)
    $$$TRACE(body.%ToJSON())
    // Prepare the HTTPS POST request with the Bearer token
    set request = ##class(%Net.HttpRequest).%New()
    set request.Authorization = "Bearer "_..Token
    set request.ContentType = "application/json"
    set request.Https = 1
    set request.SSLConfiguration = "default"
    set request.Location = "/"_..Version_"/"_..PhoneNumberId_"/messages"
    do request.EntityBody.Write(body.%ToJSON())
    set status = ..Adapter.SendFormData(.response, "POST", request)
    $$$TRACE(response.StatusCode)
    set pResponse = ##class(Whatsapp.Message.WhatsAppResponse).%New()
    quit tSC
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="Whatsapp.Message.WhatsAppRequest">
    <Method>SendMessage</Method>
  </MapItem>
</MapItems>
}

}
We have defined three new settings for the component, in which we will indicate:
The version of the API we will invoke.
The identifier of the phone from which the message will be sent and which we have seen previously in the information of our Meta developers account application.
The token that we will send in the header of our call (remember that it is valid for 24 hours).
Since the required connection is HTTPS we have created a default SSL configuration:
Well, we would have everything configured to launch our messaging tests. We will try sending 3 HL7 files, each one with a different glucose value for our patient:
Value 80: which will generate a notification message indicating that our levels are normal.
Value 110: in which it will warn us that we are exceeding the limit and will urge us to exercise to lower the levels.
Value 200: in which it will alert us of our level and urge us to visit a medical center.
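The threshold logic the DTL applies when choosing a template can be sketched as follows. This is a hypothetical illustration: the cut-offs (105 from the HL7 reference range 70-105, and 180 for the alert) and the template names are my assumptions, not the project's actual values:

```python
# Illustrative template selection by glucose level (mg/dL).
# Cut-offs and template names are assumptions for this sketch.

def pick_template(glucose_mg_dl: float) -> str:
    if glucose_mg_dl <= 105:      # within the 70-105 reference range
        return "glucose_normal"
    if glucose_mg_dl < 180:       # elevated: suggest exercise
        return "glucose_warning"
    return "glucose_alert"        # high: advise visiting a medical center

for value in (80, 110, 200):
    print(value, "->", pick_template(value))
```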
Let's copy the messages to the defined folder:
*Attention: if you want to test the example with the associated project, you must configure the HL7 message with your phone number (search and replace the value 6XXXXXXXX in the example message) and modify the DTL to take into account the national prefix of your test phone.
Let's see the result:
Here we have our 3 messages received. Another new success from InterSystems IRIS! If you have any questions or comments, you can write them in the comments and I will be happy to answer you. 💡 This article is considered InterSystems Data Platform Best Practice.
Article
Timothy Leavitt · Jun 4, 2020
Over the past year or so, my team (Application Services at InterSystems - tasked with building and maintaining many of our internal applications, and providing tools and best practices for other departmental applications) has embarked on a journey toward building Angular/REST-based user interfaces to existing applications originally built using CSP and/or Zen. This has presented an interesting challenge that may be familiar to many of you - building out new REST APIs to existing data models and business logic.
As part of this process, we've built a new framework for REST APIs, which has been too useful to keep to ourselves. It is now available on the Open Exchange at https://openexchange.intersystems.com/package/apps-rest. Expect to see a few more articles about this over the coming weeks/months, but in the meanwhile, there are good tutorials in the project documentation on GitHub (https://github.com/intersystems/apps-rest).
As an introduction, here are some of our design goals and intentions. Not all of these have been realized yet, but we're well on the way!
Rapid Development and Deployment
Our REST approach should provide the same quick start to application development that Zen does, solving the common problems while providing flexibility for application-specific specialized use cases.
Exposing a new resource for REST access should be just as easy as exposing it as a Zen DataModel.
Addition/modification of REST resources should involve changes at the level being accessed.
Exposure of a persistent class over REST should be accomplished by inheritance and minimal overrides, but there should also be support for hand-coding equivalent functionality. (This is similar to %ZEN.DataModel.Adaptor and %ZEN.DataModel.ObjectDataModel.)
Common patterns around error handling/reporting, serialization/deserialization, validation, etc. should not need to be reimplemented for each resource in each application.
Support for SQL querying, filtering, and ordering, as well as advanced search capabilities and pagination, should be built-in, rather than reimplemented for each application.
It should be easy to build REST APIs to existing API/library classmethods and class queries, as well as at the object level (CRUD).
Security
Security is an affirmative decision at design/implementation time rather than an afterthought.
When REST capabilities are gained by class inheritance, the default behavior should be to provide NO access to the resource until the developer actively specifies who should receive access and under what conditions.
Standardized implementations of SQL-related features minimize the surface for SQL injection attacks.
Design should take into consideration the OWASP API Top 10 (see: https://owasp.org/www-project-api-security)
Sustainability
Uniformity of application design is a powerful tool for an enterprise application ecosystem.
Rather than accumulating a set of diverse hand-coded REST APIs and implementations, we should have similar-looking REST APIs throughout our portfolio. This uniformity should lead to:
Common debugging techniques
Common testing techniques
Common UI techniques for connecting to REST APIs
Ease of developing composite applications accessing multiple APIs
The set of endpoints and format of object representations provided/accepted over REST should be well-defined, such that we can automatically generate API documentation (e.g., Swagger/OpenAPI) based on these endpoints.
Based on industry-standard API documentation, we should be able to generate portions of client code (e.g., typescript classes corresponding to our REST representations) using third-party/industry-standard tools.
Awesome! @Timothy.Leavitt this is amazing!
I'll be making use of it in my application :)
I was looking into the OpenExchange description, and in the Tutorial and User Guide, I think the links are broken. I got a "Not found" message when I try to access the URLs.
https://openexchange.intersystems.com/docs/sample-phonebook.md
https://openexchange.intersystems.com/docs/user-guide.md
Thank you for your interest, and for pointing out that issue. I saw it after publishing and fixed it in GitHub right away. The Open Exchange updates from GitHub at midnight, so it should be all set now.
minimum platform version of InterSystems IRIS 2018.1
Porting old apps with a framework available only on the new version of the platform (IRIS) - no contradiction here? :) Is there something fundamental preventing the framework from being used on Caché too? Maybe I'm wrong, but I think the minimum requirement here is because you don't have %JSON.Adaptor on Caché.
%JSON.Adaptor is missing in Caché, but %JSON.Formatter was backported half a year ago; it is available on Open Exchange. @Henrique is right - that's the reason for the minimum requirement.
IMO, getting an old app running on the new version of the platform is a relatively small effort compared to a Zen -> Angular migration (for example). Hi @Timothy.Leavitt, I'm testing AppS.REST to create a new application. Following the Tutorial and Sample steps on GitHub, I created a Dispatch Class:
Class NPM.REST.Handler Extends AppS.REST.Handler
{
ClassMethod AuthenticationStrategy() As %Dictionary.CacheClassname
{
Quit ##class(AppS.REST.Authentication.PlatformBased).%ClassName(1)
}
ClassMethod GetUserResource(pFullUserInfo As %DynamicObject) As AppS.REST.Authentication.PlatformUser
{
Quit ##class(AppS.REST.Authentication.PlatformUser).%New()
}
}
And a simple persistent class:
Class NPM.Model.Task Extends (%Persistent, %Populate, %JSON.Adaptor, AppS.REST.Model.Adaptor)
{
Parameter RESOURCENAME = "task";
Property RowID As %String(%JSONFIELDNAME = "_id", %JSONINCLUDE = "outputonly") [ Calculated, SqlComputeCode = {Set {*} = {%%ID}}, SqlComputed, Transient ];
Property TaskName As %String(%JSONFIELDNAME = "taskName");
/// Checks the user's permission for a particular operation on a particular record.
/// <var>pOperation</var> may be one of:
/// CREATE
/// READ
/// UPDATE
/// DELETE
/// QUERY
/// ACTION:<action name>
/// <var>pUserContext</var> is supplied by <method>GetUserContext</method>
ClassMethod CheckPermission(pID As %String, pOperation As %String, pUserContext As AppS.REST.Authentication.PlatformUser) As %Boolean
{
Quit ((pOperation = "QUERY") || (pOperation = "READ") || (pOperation = "CREATE") || (pOperation = "UPDATE"))
}
}
But when I try the REST API using Postman GET: http://localhost:52773/csp/npm-app-rest/api/task/1
I'm getting a 404 Not Found message.
Am I doing something wrong or missing something?
Thanks
@Henrique , do you have a record with ID 1? If not, you can populate some data with the following (since you extend %Populate):
Do ##class(NPM.Model.Task).Populate(10) Yes, I already populated the class.
Give the CSPSystem user access to the database with a REST broker. @Eduard.Lebedyuk is probably right on this. If you add auditing for <PROTECT> events you'll probably see one before the 404. I added auditing on everything, and the <PROTECT> error never showed up. So I started everything from scratch and found a typo in Postman.
Thanks, @Eduard.Lebedyuk @Timothy.Leavitt PS: Sorry, guys. I think not sleeping enough hours isn't good for your health and causes this kind of mistake. This is really cool, and we will be using this in a big way.
But I have encountered an issue I can't fix.
I took one of my data classes (Data.DocHead) and had it inherit from AppS.REST.Model.Adaptor and %JSON.Adaptor, set the RESOURCENAME and other things and tested using Postman and it worked perfectly! Excellent!
Due to the need to have multiple endpoints for that class for different use cases, I figured I would set it up using the AppS.REST.Model.Proxy, so I created a new class for the Proxy, removed the inheritance in the data class (left %JSON.Adaptor), deleted the RESOURCENAME and other stuff in the data class.
I used the same RESOURCENAME in the proxy that I had used in data class originally.
I compiled the proxy class, and get the message:
ERROR #5001: Resource 'dochead', media type 'application/json' is already in use by class Data.DocHead > ERROR #5090: An error has occurred while creating projection RestProxies.Data.DocHead:ResourceMap.
I've recompiled the entire application with no luck. So there must be a resource defined somewhere that is holding dochead like it was still attached to Data.Dochead via a RESOURCENAME, but that parameter is not in that class anymore.
How do I clear that resource so I can use it in the proxy? @Richard.Schilke, I'm glad to hear that you're planning on using this, and we're grateful for your feedback.
Quick fix should just be: Do ##class(AppS.REST.ResourceMap).ModelClassDelete("Data.DocHead")
Background: metadata on REST resources and actions is kept in the AppS.REST.ResourceMap and AppS.REST.ActionMap classes. These are maintained by projections and it seems there's an edge case where data isn't getting cleaned up properly. I've created a GitHub issue as a reminder to find and address the root cause: https://github.com/intersystems/apps-rest/issues/5 That did the trick - thank you so much!
Best practice check: When I have a data class (like Data.DocHead) that will need multiple Mappings (Base, Expanded, Reports), then the recommended way is to use the proxy class and have a different proxy class for Data.DocHead for each mapping?
For example, RESTProxies.Data.DocHead.Base.cls would be the proxy for the Base mapping in Data.DocHead, while RESTProxies.Data.DocHead.Expanded.cls would be the proxy for the Expanded mapping in Data.DocHead, etc. (the only difference might be the values for the JSONMAPPING and RESOURCENAME parameters)? I'm fine with that, just checking that you don't have some other clever way of doing that... @Timothy.Leavitt, I've run into another issue.
The proxy is set up and working great for general GET access. But since my system is multi-tenant, wide-open queries are not something I can use, so I decided to try using a defined class query in the data class Lookups.Terms:
Query ContactsForClientID(cClientOID As %String) As %SQLQuery
{
SELECT * FROM Lookups.Terms
WHERE ClientID = :cClientOID
ORDER BY TermsCode
}
Then I set up the Action Mapping in my proxy class RESTProxies.Lookups.Terms.Base:
XData ActionMap [ XMLNamespace = "http://www.intersystems.com/apps/rest/action" ]
{
<actions xmlns="http://www.intersystems.com/apps/rest/action">
<action name="byClientID" target="class" method="GET" modelClass="Lookups.Terms" query="Lookups.Terms:ContactsForClientID">
<argument name="clientid" target="cClientOID" source="url"/>
</action>
</actions>
}
And I invoked this using this URL in a GET call using Postman (last part only):
terms_base/$byClientID?clientid=290
And the result:
406 - Client browser does not accept the MIME type of the requested page.
In the request, I verified that both Content-Type and Accept are set to application/json (snip from the Postman):
So what have I missed? @Richard.Schilke , yes, having a separate proxy for each mapping would be best practice. You could also have Data.DocHead extend Adaptor for the primary use case and have proxies for the more niche cases (if one case is more significant - typically this would be the most complete representation). What's the MEDIATYPE parameter in Lookups.Terms (the model class)? The Accept header should be set to that.
Also, you shouldn't need to set Content-Type on a GET, because you're not supplying any content in the request. (It's possible that it's throwing things off.)
If you can reproduce a simple case independent of your code (that you'd be comfortable to share), feel free to file a GitHub issue and I'll try to knock it out soon. I'll also note - the only thing that really matters from the class query is the ID. If nothing else is using the query you could just change it to SELECT ID FROM ... - it'll constitute the model instances based on that. (This is handy because it allows reuse of class queries with different representations.) Good to know and, yes, very handy!
Thank you! I posted an issue with my source to Github.
Surfaced another issue this weekend. (I remember when I used to take weekends off, but no whining!)
So I have a multiple linked series of classes in Parent/Child relationships:
DocHead->DocItems->DocItemsBOM->DocItemsBOMSerial
So if I wanted to express all of this in a JSON object, I would need to make the "Default" mapping the one that exposes all the Child Properties, because it looks like I can't control the Mapping of the Child classes from the Parent class.
This doesn't bother me, as I had already written a shell that does this, and your Proxy/Adaptor makes it work even better, but just wanted to check that the Parent can't tell the Child what Proxy the child should use to display its JSON. It's even more complicated than that, as sometimes I want to show DocHead->DocItems (and stop), while, in other Use Cases, I have to show DocHead, DocItems, and DocItemsBOM (and stop), while in other Use Cases, I need the entire stack. Thanks for posting - I'm taking a look now. This issue is starting to ring a bell; I think this looks like a bug we fixed in another branch internally to my team. (I've had reconciling the GitHub branch and our internal branch on my list for some time - I'll try to at least get this fix in, soon.)
Re: customizing mappings of relationship/object properties, see https://docs.intersystems.com/healthconnectlatest/csp/docbook/Doc.View.cls?KEY=GJSON_adaptor#GJSON_adaptor_xdata_define - this is doable in %JSON.Adaptor mapping XData blocks via the Mapping attribute for an object-valued property included in the mapping. Wow - I think that means I can handle all my Use Cases with that capability. Nice!
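For illustration, such a mapping might look like the sketch below. This is a hedged example, not code from the thread: the XData name, the DocItems property, and the child mapping name "Basic" are all hypothetical; only the jsonmapping namespace and the Mapping attribute come from the documentation linked above.

```objectscript
/// Hypothetical sketch of a %JSON.Adaptor mapping on the parent class.
/// The Mapping attribute tells the object-valued DocItems property to
/// serialize its children using their "Basic" mapping (names illustrative).
XData HeadWithItems [ XMLNamespace = "http://www.intersystems.com/jsonmapping" ]
{
<Mapping xmlns="http://www.intersystems.com/jsonmapping">
  <Property Name="DocItems" Include="inout" Mapping="Basic"/>
</Mapping>
}
```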
Thanks again! @Timothy.Leavitt , have you had a chance to see if this error I'm getting on Actions was resolved? @Richard.Schilke I'm planning to address it tomorrow or Friday. Keep an eye out for the next AppS.REST release on the Open Exchange - I'll reply again here too. (This will also include a fix for the other issue you reported; I've already merged a PR for that.) @Timothy.Leavitt , I will be looking for it.
I'm trying to do something with a custom Header that I want to provide for the REST calls. Do I have access to the REST Header somewhere in the service that I can pull the values, like a %request?
And in something of an edge case, we're calling these REST services from an existing ZEN application (for now as we start a slow pull away from Zen), so the ZEN app gets a %Session created for it, and then calls the REST service. It seems that Intersystems is managing the License by recognizing that the browser has a session cookie, and it doesn't burn a License for the REST call - that's very nice (but I do have a request in to the WRC about whether that is expected behavior or not so I don't get surprised if it gets "fixed"!). Does that mean your REST service can see that %Session, as that would be very helpful, since we store User/Multi-tenant ID, and other important things in there (the %Session, not the cookie). @Richard.Schilke - on further review, it's an issue with the Action map. See my response in https://github.com/intersystems/apps-rest/issues/7 (and thank you for filing the issue!). I'll still create a new release soon to pick up the projection bug you found.
Regarding headers - you can reference %request anywhere in the REST model classes, it just breaks abstraction a bit. (And for the sake of unit testing, it would be good to behave reasonably if %request happens not to be defined, unless you're planning on using Forgery or equivalent.)
Regarding sessions - yes, you can share a session with a Zen application via a common session cookie path or using GroupById. You can reference this as needed as well, though I'd recommend wrapping any %session (or even %request) dependencies in the user context object that gets passed to CheckPermissions(). @Timothy.Leavitt - thanks so much for the response. The Action worked perfectly with your corrections!
I will take your advice and work with the %session/headers in the context object, since that makes the most sense.
What are the plans (if any) to enable features in a resultset such as pagination, filters, and sorting?
Users are horrible, aren't they? No matter what good work you do, they always want more! I appreciate what you have done here, and it will save my company probably hundreds of hours of work, plus it is very elegant... @Richard.Schilke - great!
We have support for filtering/sorting on the collection endpoints already, though perhaps not fully documented. Pagination is a challenge from a REST standpoint but I'd love to add support for it (perhaps in conjunction with "advanced search") at some point. I'm certainly open to ideas on the implementation there. :)
Users are the best, because if you don't have them, it's all just pointlessly academic. ;) @Timothy.Leavitt - stuck again.
I'm in ClassMethod UserInfo, and found out some interesting things.
First off, I was wrong about the REST service using the session cookie from the Zen application when it is called from the Zen application. Displaying the %session.SessionId parameters for each call shows that they are all different, and none match the SessionId of the Zen application calling the REST service. So the idea that it holds a license for 10 seconds can't be correct, as it seems almost immediate. I ran 20 REST calls to different endpoints in a loop, and I saw a single license increase.
You said I should be able to expose the session cookie of the Zen application, but I don't see a way to do that either.
I can't even find a way to see the header data in the UserInfo ClassMethod of the current REST call.
Sorry to be a pest... but since you're giving answers, I'll keep asking questions!
Have a nice evening... @Richard.Schilke , you should be able to share a session by specifying the same CSP session cookie path for your REST web application and the web application(s) through which your Zen pages are accessed. Alternatively, you could assign the web applications the same GroupById in their web application configuration.
You likely also need to configure your REST handler class (your subclass of AppS.REST.Handler) to use CSP sessions (from your earlier description, I assumed you had). This is done by overriding the UseSession class parameter and setting it to 1 (instead of the default 0).
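As a sketch of that override, reusing the NPM.REST.Handler class from earlier in this thread (assumptions: the class name and the authentication methods already shown there):

```objectscript
/// Sketch: enable CSP sessions in the REST handler so it can share a
/// session with the Zen web application (class name from this thread).
Class NPM.REST.Handler Extends AppS.REST.Handler
{

/// Use CSP sessions; AppS.REST.Handler defaults this to 0
Parameter UseSession = 1;

}
```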
To reference header data in the UserInfo classmethod, you should just be able to use %request (an instance of %CSP.Request) and %response (an instance of %CSP.Response) as appropriate for request/response headers. 💡 This article is considered an InterSystems Data Platform Best Practice. How would the AppS.REST Handler co-exist with a 'spec-first' approach, where the dispatch class should not be modified manually - only by re-importing the API spec?
The AppS.REST user-guide states: 'To augment an existing REST API with AppS.REST features, forward a URL from your existing REST handler to this subclass of AppS.REST.Handler.' How would this work in practice with the above?
Thanks in advance.
Announcement
Anastasia Dyubaylo · Sep 16, 2020
Hi Community!
Please welcome the new video on InterSystems Developers YouTube:
⏯ Get InterSystems IRIS from the Docker Store
In this video, you'll learn how to navigate to the InterSystems IRIS listing on the Docker Store and pull a Community Edition image.
⬇️ InterSystems IRIS Community Edition on Docker
Enjoy and stay tuned! 👍🏼
Announcement
Anastasia Dyubaylo · Apr 28, 2021
Hi Community,
Enjoy watching the new video on InterSystems Developers YouTube:
⏯ InterSystems API Manager: Gummy Bear Factories
InterSystems API Manager allows developers to manage multiple APIs and consumers. This demo environment uses InterSystems API Manager, or IAM, to monitor and control the HTTP-based API traffic coming from three different candy factories, leading to an endpoint in InterSystems IRIS data platform.
Stay tuned! 👍🏼
Announcement
Marcus Wurlitzer · Apr 21, 2021
Hi Developers, I am glad to announce Git for InterSystems IRIS, my first submission to OpenExchange and part of the current Developer Tools Contest.
Git for InterSystems IRIS is a source control package that aims to facilitate a native integration of the Git workflow with the InterSystems IRIS platform. It is designed to work as a transparent link between InterSystems IRIS and a Git-enabled code directory that, once set up, requires no user interaction. A detailed description can be found on GitHub.
I am looking forward to learning what you think about this approach. Does it make sense? Would this help you establish a Git-based deployment pipeline? Are there any issues that may have been overlooked?
A ready-to-run docker demo is available on OpenExchange. The application is in a usable proof-of-concept state, with some features still to be implemented. I am happy to receive any feedback from you. Thank you for publishing!!
I am curious... did you start with one of the existing open-source Git hooks for ObjectScript, or did you start from scratch with this project? Hi Ben, the project started as a fork of Caché Tortoise Git, which was a good starting point, and initially I intended to change only a few things. As development went on, however, most of the code was rewritten, and I think only 10-20% of the original code is left. There were just too many differences in the basic concepts, including the Globals structure, handling of namespaces and projects, and interaction with Git (hooks -> REST) and Studio (none). This is really interesting - I've been starting on a similar project with the same starting point. Got it Marcus - thanks for the history :) Thank you Marcus, great initiative! Any thoughts about how to manage environment-specific variables in the pipeline, e.g. different interoperability host configurations for dev/prod? @Janne.Korhonen - typically these are managed using System Default Settings: https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=ECONFIG_other#ECONFIG_other_default_settings Hello Marcus,
Thank you for sharing.
I'm building a dockerised DEV environnement.
The main issue I am encountering at the moment concerns modifications made from the portal, to a Business Process or a Transformation for example. I have to export my new processes manually. If I forget, I just lose them ...
I am trying to install and configure Git directly on my image.
The aim is to link my local repo to the container so that I don't have to deal with Git manually at all. This way, I will have an automatic export.
But it is quite difficult to use from the command line.
I always have to perform an INIT when I start my container.
I tried to perform an init from my Dockerfile (do ##class(SourceControl.Git.Utils).UserAction("","%SourceMenu,Init")) but it does not work.
Do you have a clean way to install and configure Git from a Dockerfile?
Thanks
Regards,
Matthieu.
How IRIS starts in my Dockerfile:
RUN iris start IRIS \
&& iris session IRIS -U %SYS < /tmp/iris.script \
&& iris stop IRIS quietly
My iris.script:
// Install ZPM
set $namespace="%SYS", name="DefaultSSL" do:'##class(Security.SSLConfigs).Exists(name) ##class(Security.SSLConfigs).Create(name) set url="https://pm.community.intersystems.com/packages/zpm/latest/installer" Do ##class(%Net.URLParser).Parse(url,.comp) set ht = ##class(%Net.HttpRequest).%New(), ht.Server = comp("host"), ht.Port = 443, ht.Https=1, ht.SSLConfiguration=name, st=ht.Get(comp("path")) quit:'st $System.Status.GetErrorText(st) set xml=##class(%File).TempFilename("xml"), tFile = ##class(%Stream.FileBinary).%New(), tFile.Filename = xml do tFile.CopyFromAndSave(ht.HttpResponse.Data) do ht.%Close(), $system.OBJ.Load(xml,"ck") do ##class(%File).Delete(xml)
do ##class(%SYSTEM.Process).CurrentDirectory("/opt/irisapp")
// Load the installers and deploy them
do $SYSTEM.OBJ.Load("InstallerLibrary.cls", "ck")
// Install the namespaces
set sc = ##class(App.InstallerLibrary).setup()
// Not sure why, but I have to reset the working directory
do ##class(%SYSTEM.Process).CurrentDirectory("/opt/irisapp")
// Import the default settings + the Git plugin (BE SURE TO KEEP THE LINE BREAKS)
zn "LIBRARY"
zpm "install git-source-control"
d ##class(SourceControl.Git.API).Configure()
/irisdev/app/LIBRARY/
// To avoid having to change the SuperUser password.
zn "%SYS"
w ##class(Security.Users).UnExpireUserPasswords("*")
// For the Git plugin to work, the configured path must exist; by default it is an empty string, which makes the plugin crash. Removing it makes it work.
k ^SYS("SourceControl","Git","%gitBinPath")
zn "LIBRARY"
do ##class(SourceControl.Git.Utils).UserAction("","%SourceMenu,Init")
halt Hi Matthieu,
so you want to use Git for IRIS for an automated export of classes and set it up from the iris.script, which will be invoked in the Dockerfile.
From the code you have pasted, it seems like you use a different Git source control implementation (zpm "install git-source-control"). The implementation discussed in this thread would be installed with
zpm "install git-for-iris“
There, you can use the API functions in SourceControl.Git.Utils in the iris.script:
do ##class(SourceControl.Git.Utils).AddDefaultSettings()
do ##class(SourceControl.Git.Utils).AddPackageToSourceControl("<My.Package.Name>", "<MyNamespace>")
do ##class(SourceControl.Git.Utils).SetSourceControlStatus(1)
A default package is added to source control via module.xml for demo purposes, as well as the /csp/user/sc web application for callbacks from git, both of which you may want to remove.
As a final step, you will have to activate the source control class in IRIS. The manual process is described here https://docs.intersystems.com/irislatest/csp/docbook/Doc.View.cls?KEY=ASC#ASC_Hooks_activating_sc_class; you might look into the corresponding CSP page to find out how to do it programmatically.
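For illustration, the programmatic route might look like the sketch below. This is a hedged example: the source control class name "SourceControl.Git.Main" is a placeholder, and you should verify the %Studio.SourceControl.Interface call against your IRIS version's class reference before relying on it.

```objectscript
// Hypothetical sketch: activate a source control class for a namespace
// programmatically. "SourceControl.Git.Main" is a placeholder class name;
// check SourceControlClassSet() in your version's class reference.
Do ##class(%Studio.SourceControl.Interface).SourceControlClassSet("SourceControl.Git.Main", "USER")
```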
Hope this helps. Currently there are a select few of us in the group who use Git and local repos for VS Code, but I want to make this more widespread for our team, as most use the editors in the Management Portal to do their coding.
Does anyone have steps they have used in the past to move towards server-side source control - from creating the repos on your server, getting the IRIS code into the new repo, to pushing it to GitHub? Git for IRIS has been updated to v0.3. It now provides source control for lookup tables and supports deletion of classes. Also, improvements were made to provisioning default settings, and Git hooks are now disabled by default to avoid unwanted side effects (writing hooks to the .git directory, setting a random password for the technical user).
A complete guide for deploying Git for IRIS to an existing IRIS instance has been added to README.md along with detailed descriptions of settings, Globals and behaviour.
Announcement
Anastasia Dyubaylo · Sep 13, 2021
Hi Community,
Enjoy watching the new video on InterSystems Developers YouTube:
⏯ iKnow: Open Source NLP in InterSystems IRIS
Learn about this open-source NLP technology and its unique approach to text analytics. See how the new Python library works, and learn about the exciting journey towards a robust public CI/CD pipeline.
Presenters: 🗣 @Benjamin.DeBoe, Product Manager, InterSystems 🗣 @Aohan.Dang, Systems Developer, InterSystems
Enjoy and stay tuned!