Setting the TZ Environment Variable on Linux

The Update Checklist for v2015.1 recommends setting the TZ environment variable on Linux platforms and points to the manpage for tzset. Doing so improves the performance of Caché's time-related functions. You can find out more here:

https://community.intersystems.com/post/linux-tz-environment-variable-not-being-set-and-impact-caché
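The variable is usually exported in the environment of the user that starts Caché, for example export TZ=America/New_York in a shell profile (the zone name is only an illustration). A quick way to confirm that a running instance actually sees the variable is to check from a Terminal session; a minimal sketch, assuming $SYSTEM.Util.GetEnviron is available in your version:

checktz ; print the TZ variable as seen by this Caché process
    s tz=$SYSTEM.Util.GetEnviron("TZ")
    w $s(tz="":"TZ is not set",1:"TZ="_tz),!
    q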


NewBie's Corner Session 26 Globals and Arrays Part 2

Welcome to NewBie's Corner, a weekly or biweekly post covering basic Caché material.

Globals, Arrays, and Variables Part 2

A thorough understanding of Globals, Arrays, and Variables is foundational for every Caché developer.
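For example, the only syntactic difference between a local array (held in process memory) and a global (persisted in the database) is the leading caret; a minimal sketch with illustrative names:

example ; local array vs. global
    s arr("name")="Alice"           ; local array: private to this process, gone when it ends
    s ^Patient(1,"name")="Alice"    ; global: stored in the database, shared by all processes
    w ^Patient(1,"name"),!
    q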

Uniqueness of the Caché Global Structure or database


Cross-Origin Resource Sharing (CORS) is one of the basic security features built into browsers. CORS controls access from an HTML page to resources in domains other than the page's original domain. It is particularly important for AJAX calls. Since RESTful services can serve as a data provider for any AJAX call, you have to be able to control cross-origin access. By default, services do not allow CORS. You are going to learn how to enable it for Ensemble RESTful services.
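In an Ensemble REST dispatch class (a subclass of %CSP.REST), CORS processing can be switched on with the HandleCorsRequest class parameter, or per route with a Cors="true" attribute in the UrlMap. A minimal sketch with illustrative class, route, and method names:

Class Demo.REST.Service Extends %CSP.REST
{

/// Process CORS headers for every route in this dispatch class
Parameter HandleCorsRequest = 1;

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
<Route Url="/widgets" Method="GET" Call="GetWidgets"/>
</Routes>
}

ClassMethod GetWidgets() As %Status
{
    write "[]"    ; placeholder JSON payload
    quit $$$OK
}

}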


Many mobile applications that enable users to get information about road fines and pay them send notifications about newly added fines. This functionality can be implemented efficiently using push notifications sent to users' devices.

Our application was no exception. The server side is based on the Ensemble platform, which offers integrated support for push notifications starting with version 2015.1.


Our last lesson ended with our page displaying a nice (but garish) Angular Material Toolbar and our Widget data displaying in a list of Material cards. Our page feels a bit static, and we already know that the large number of Widgets we will be dealing with will not be especially usable in a static list. What can we do to help?

Article · Dec 9, 2017 · 1m read
Simple Game Code (21 Sticks Game)
sample ; 21 Sticks game: players alternately remove 1 to 4 sticks
    w "Total No. of Sticks: 21",!
    s sticks=21
    r "Enter machine name: ",a,!
    r "Enter dev name: ",b,!
    w "Display 1 to 21 sticks",!
    f i=1:1:21 {
        w " "_i_" "
    }
    d user(.sticks)
    q
user(sticks)
    w !,"User selects 1, 2, 3, or 4 sticks:",!
    r "User enter sticks: ",us
    if (us<1)||(us>4) {
        w "Please select 1 to 4 sticks",!
        r "User enter sticks: ",us
    }
    s sticks=sticks-us
    ; redisplay the sticks that remain
    f i=1:1:sticks {
        w " "_i_" "
    }
    s dev=1
    s machine=0
    ; the user took the last stick, so the dev side has lost
    d:sticks=0 lost(dev,machine)
    q:sticks=0
    d machine(.sticks,us)
    q
machine(sticks,us)
Article · May 28, 2021 · 1m read
Fetch Upstream in GitHub

Hi colleagues!

Often, when we collaborate on someone's repo in GitHub, we follow this cycle:

Fork-Clone-Change-Commit-Push-Pull-Request-Merge to the original repo.

This is all great and works fine!

And if we want to make a second contribution right after the merge, we first need to perform "Fetch upstream" on our forked repo to "ingest" our own pull request from the original repo.

Geeky git professionals do this with ease, but it was always a headache for me, so I usually just deleted the fork and created a new one.

And today I discovered that GitHub added a new UI feature that lets me easily fetch upstream for my fork, bringing it up to date with the original repo and ready for new pull requests (the button is essentially the UI equivalent of running git fetch upstream and merging).

Here is where the button is:

This is a relief! )

Wanted to share this relief and productivity tip with you!

Bring more collaborations to GitHub repos!

And speaking of PRs: I just made a PR adding a Docker deployment to Google Cloud Run for the FHIRaaS demo made by @Anton Umnikov for the current FHIR Contest! Looking forward to more of your contributions!


The first installment of this article series discussed how to read a big chunk of data from the raw body of an HTTP POST method and save it to a database as a stream property of a class. The second installment discussed how to send files and their names wrapped in JSON.

Now let’s look closer at the idea of sending large files to the server in parts. There are several approaches we can use to do this. This article discusses using the Transfer-Encoding header to indicate chunked transfer. The HTTP/1.1 specification introduced the Transfer-Encoding header, and RFC 7230, Section 4.1, describes it, but it is absent from the HTTP/2 specification.
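On the client side, Caché's %Net.HttpRequest can produce a chunked body via a subclass of %Net.ChunkedWriter assigned to the request's EntityBody. A minimal sketch, assuming a file stream as the data source; the class name and chunk size are illustrative:

Class Demo.ChunkedUpload Extends %Net.ChunkedWriter
{

/// The file whose contents will be sent in chunks
Property FileStream As %Stream.FileBinary;

/// Called by %Net.HttpRequest when the request is sent
Method OutputStream()
{
    set ..TranslateTable="RAW"                    ; pass the bytes through untranslated
    do ..WriteFirstChunk(..FileStream.Read(32000))
    while '..FileStream.AtEnd {
        do ..WriteChunk(..FileStream.Read(32000))
    }
    do ..WriteLastChunk("")                       ; zero-length chunk terminates the body
}

}

Assigning an instance to the request (set request.EntityBody=writer) before calling Post should make the client send Transfer-Encoding: chunked instead of a Content-Length header.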


I work as an Integration Engineer for the United States Department of Veterans Affairs (VA), on a Health Connect production that processes many RecordMap files. I do not fully understand RecordMaps, and I wanted to develop an application for the Interoperability contest through which I could learn more about working with them. I browsed the InterSystems documentation for inspiration on how to start and was happy to find the CSV Record Wizard.


Requirement: transform a source XML message into target JSON.

Step 1: First, create JSON-equivalent XML using any online tool.

Step 2: Using the add-in in Studio, create persistent classes from the JSON-equivalent XML and compile them; the target persistent classes are now ready.

Step 3: Do the same for the source XML (or XSD) to generate the source persistent classes, and compile them.

Step 4: Complete the DTL by selecting the root nodes of the source and target persistent classes, complete the mappings, and compile it.
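Once everything compiles, the transform can be smoke-tested from a Terminal session. A minimal sketch; the class names Demo.Msg.SourceRoot and Demo.XmlToJson.DTL are illustrative assumptions:

    ; build an instance of the generated source class from the inbound XML
    s source=##class(Demo.Msg.SourceRoot).%New()
    ; run the compiled DTL; the target object is returned by reference
    s sc=##class(Demo.XmlToJson.DTL).Transform(source,.target)
    i $system.Status.IsError(sc) d $system.Status.DisplayError(sc)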

Article · Jul 21, 2022 · 11m read
ECP With Docker

Hi community,

This is the third article in the series about initializing IRIS instances with Docker. This time, we will focus on Enterprise Cache Protocol (ECP).

In a very simplified way, ECP allows configuring some IRIS instances as application servers and others as data servers. Detailed technical information can be found in the official documentation.
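To give a flavor of what the configuration involves, an application server has to register its data server before it can mount remote databases from it, and this can be scripted from the %SYS namespace. A minimal sketch, assuming the Config.ECPServers API is available; the host and connection names are illustrative:

    ; run in %SYS on the application server
    s props("Address")="datasrv"     ; hostname of the data server
    s props("Port")=1972             ; superserver port of the data server
    s sc=##class(Config.ECPServers).Create("DATASRV",.props)
    i $system.Status.IsError(sc) d $system.Status.DisplayError(sc)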

This article aims to describe:


One of the most important tasks of a dashboard is, on the one hand, to help you perceive data in aggregated form and, on the other hand, not to lose the depth of immersion in the data when you need it. One of the tools that helps us achieve this is drill down. It enables us to display several hierarchical levels of data, from aggregated to detailed.


Welcome, community members, to a new article! This time we are going to test the interoperability capabilities of IRIS for Health for working with DICOM files.

Let's configure a short workshop using Docker. At the end of the article, you'll find the GitHub URL in case you want to run it on your own computer.

Before any configuration, let's explain what DICOM is:
