Hi Developers!

Sometimes we need to import CSV data into InterSystems IRIS programmatically, either from a file or from a URL, and we expect a class with proper datatypes to be created and the data to be imported.

I published the csvgen module on Open Exchange, which does exactly that.

If you just need a CSV file to be imported into IRIS, you can do the following:

USER>do ##class(community.csvgen).Generate("/usr/data/titanic.csv",,"Data.Titanic")

Class name: Data.Titanic
Header: PassengerId INTEGER,Survived INTEGER,Pclass INTEGER,Name VARCHAR(250),Sex VARCHAR(250),Age INTEGER,SibSp INTEGER,Parch INTEGER,Ticket VARCHAR(250),Fare MONEY,Cabin VARCHAR(250),Embarked VARCHAR(250)
Records imported: 891
USER>
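
Once the class is generated you can query the data immediately. Here is a minimal check against the class generated above (by default, the class Data.Titanic is projected as the SQL table Data.Titanic):

USER>set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT COUNT(*) AS Total FROM Data.Titanic")
USER>do rs.%Next()
USER>write rs.Total
891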

Or if the CSV is on the internet, e.g. the COVID-19 data on GitHub, you can get the data in the following way:

USER>d ##class(community.csvgen).GenerateFromURL("https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_daily_reports/05-29-2020.csv",",","Data.Covid19")

Class name: Data.Covid19
Header: FIPS INTEGER,Admin2 VARCHAR(250),Province_State VARCHAR(250),Country_Region VARCHAR(250),Last_Update DATE,Lat MONEY,Long_ DOUBLE,Confirmed INTEGER,Deaths INTEGER,Recovered INTEGER,Active INTEGER,Combined_Key VARCHAR(250),Incidence_Rate DOUBLE,Case-Fatality_Ratio DOUBLE
Records imported: 3522
USER>


For some years I have missed being able to offer everybody interested in ObjectScript a more or less complete tutorial for getting started with it. Something that could help the new developers who come to our technology and make things easier for them... something intermediate, halfway between the usual "Hello World!", which doesn't really get you very far, and the "Advanced Training", which is unaffordable because of lack of time, etc.

If there were something truly helpful not only as an introduction to the ecosystem, but as a starting point, a boost, to really step into ObjectScript and move forward on your own... wouldn't that be awesome?


Hi! For the Opendataset contest I've built a Docker container app stack that uses InterSystems IRIS with the OpenFlights dataset in one container and Apache Zeppelin in a second container. You can find the details here: https://github.com/andreas5588/openflights_demo

With that you can query the OpenFlights dataset from Apache Zeppelin with zero configuration. The containers are on Docker Hub, so you can use them very easily.
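
For example, a SQL paragraph in Zeppelin could run something like the query below against the imported data (the table name openflights.airports is only an assumption for illustration; the actual schema is described in the repository):

-- hypothetical example: countries with the most airports in the OpenFlights data
SELECT country, COUNT(*) AS airports
FROM openflights.airports
GROUP BY country
ORDER BY airports DESC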


Hi everyone,

I want to talk about our project and how it uses the dataset theme of this contest.

Our intention was never to be a data curator, especially because sometimes my precious data means a lot to me, but not to the rest of the world.

My Precious

We want to go a step further and empower the user to find the perfect dataset for their needs.

Our project is a bridge between the data science community and the developer community, using InterSystems IRIS to achieve this mission.


Previously I published an article about a dataset from a real web server, which demonstrates how the activity and load of an Apache web server depend on the day of the week, search engine indexing, and some network noise.

Now I want to describe a function that is useful for most webmasters and system administrators who are interested in obtaining exact information about visitors, hardware usage, and the errors their clients encounter.

Here it is


Hey community! How are you doing?

I hope to find everyone well, and a happy 2022 to all of you!

Over the years, I've been working on a lot of different projects, and I've been able to find a lot of interesting data.

But most of the time, the datasets I worked with were customer data. When I started joining the contests over the past couple of years, I began to look for specific datasets on the web.

I've curated a few datasets myself, but I kept thinking, "Is this dataset enough to help others?"


With the release of the InterSystems IRIS 2021.2 Preview and the all-new LOAD DATA functionality, a dataset can be added with the ObjectScript Package Manager (ZPM).

Medical Datasets contains the following 12 datasets. For dataset tables and data details, please visit the ONLINE DEMO using the SuperUser | SYS credentials.
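
As a sketch, installing such a dataset package from the IRIS terminal with ZPM looks like this (the package name below is a placeholder, not the actual module name):

USER>zpm "install <medical-datasets-package>"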

Real Webserver Logs Dataset

I'm happy to share with the community a web server log dataset from our longtime customer, an operating company.

Their web server runs on Apache, and the logs contain data that can be useful for analysing load and search engine activity.

After installing the project, you will get a few months of data that show the typical load and activity of clients and robots, and you can also see how it depends on the day of the week, holidays, and the time of day.

The cube is also included in the package.


Hi community,

Prediction is critical to maternal healthcare. The Health Dataset application (https://openexchange.intersystems.com/package/Health-Dataset) has 10 real health datasets for predicting the most important diseases and health problems, including Maternal Risk.

This article details the steps to predict Maternal Risk using InterSystems IRIS IntegratedML, an InterSystems technology for making predictions using SQL commands. Great!

Follow these steps:
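
In outline, the IntegratedML part of those steps boils down to three SQL commands, sketched below (the table and column names are assumptions for illustration; use the actual names from the Health Dataset package):

-- 1. Create a model that predicts the risk level column (names assumed)
CREATE MODEL MaternalRiskModel PREDICTING (RiskLevel) FROM MaternalHealthRisk

-- 2. Train the model on the loaded dataset
TRAIN MODEL MaternalRiskModel

-- 3. Compare predictions with the actual values
SELECT PREDICT(MaternalRiskModel) AS PredictedRisk, RiskLevel FROM MaternalHealthRisk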


In this article, we’ll build a highly available IRIS configuration using Kubernetes Deployments with distributed persistent storage instead of the “traditional” IRIS mirror pair. This deployment would be able to tolerate infrastructure-related failures, such as node, storage and Availability Zone failures. The described approach greatly reduces the complexity of the deployment at the expense of slightly extended RTO.


Hi Community!

I think everyone keeps the source code of their projects in a repository nowadays: GitHub, GitLab, Bitbucket, etc. The same goes for InterSystems IRIS projects: check any on Open Exchange.

What do we do every time we start or continue working with a certain repository based on InterSystems Data Platform?

We need a local InterSystems IRIS instance, the environment for the project set up, and the source code imported.

So every developer performs the following:

  1. Check out the code from repo
  2. Install/Run local IRIS installation
  3. Create a new namespace/database for a project
  4. Import the code into this new namespace
  5. Set up the rest of the environment
  6. Start/continue coding the project 

If you dockerize your repository, this list can be shortened to these 3 steps:

  1. Check out the code from repo
  2. Run docker-compose build 
  3. Start/continue coding the project 

Profit: no hands-on work for steps 3-5, which could take minutes and sometimes bring headaches.

You can dockerize (almost) any of your InterSystems repos with the following few steps. Let’s go!


JSON is a data document format free of types and validation rules. However, in some scenarios it is important for a JSON document to have type and business rule validation, especially in interoperability scenarios. This article demonstrates how you can leverage JSON Schema, a market-defined technology that is open for everyone to use, to do advanced validations.


In this article I will explain the usage of the %SQL_Diag.Result and %SQL_Diag.Message tables along with the all-new LOAD DATA functionality.

It is recommended to go through the LOAD DATA documentation first.

After a successful operation, LOAD DATA inserts one record into the %SQL_Diag.Result table, and the details are inserted into the %SQL_Diag.Message table.
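
That means you can inspect the outcome of a load directly with SQL, for example:

-- one row per LOAD DATA operation
SELECT * FROM %SQL_Diag.Result

-- the detailed messages linked to those results
SELECT * FROM %SQL_Diag.Message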


Below is the basic command for when the table is already created and the source file does not contain a header row.

LOAD DATA FROM FILE 'C://TEMP/mydata.txt' 
INTO MyTable

The file name must include a .txt or .csv (comma-separated values) suffix, and both the source and the target must have the same sequence of data columns.

 

Loading from File Source: Header
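
As a rough sketch, when the source file does contain a header row you tell LOAD DATA about it through the USING clause; the option name below is an assumption and should be verified against the LOAD DATA documentation:

-- assumed option name; see the LOAD DATA documentation for the exact USING syntax
LOAD DATA FROM FILE 'C://TEMP/mydata.txt'
INTO MyTable
USING {"from":{"file":{"header":true}}}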


Hi folks!

Sometimes we need the Docker image of the InterSystems IRIS solution we build to be published to some Docker registry. The cases could be:

  1. Deploying it in a Kubernetes cluster
  2. Letting your pal run the image from your public repo without building it locally.

You can push the image to the Docker Hub Registry or the GitHub Container Registry.

In this very short article, I provide a way to do it automatically on every push to your GitHub repository.
