Question
· Sep 17, 2020
Bulk load CSV into a global

Is there a way to bulk load a CSV file into a global? I have a CSV with 283 lines in the following pattern:

123, first text

234, second text

456, third text

I want to load them into a global ^loader(code, text). Is there a way to load them using code?
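For illustration, a minimal ObjectScript sketch along these lines could work. The file path is an assumption, and it stores the text under the code subscript, i.e. ^loader(code) = text; adjust if both values should be subscripts:

    // Sketch: read codes.csv line by line and build the ^loader global
    set stream = ##class(%Stream.FileCharacter).%New()
    set sc = stream.LinkToFile("C:\data\codes.csv")      // assumed path
    if $$$ISERR(sc) quit                                 // or handle the error
    while 'stream.AtEnd {
        set line = stream.ReadLine()
        continue:line=""                                 // skip blank lines
        set code = $zstrip($piece(line, ",", 1), "<>W")  // part before the first comma
        set text = $zstrip($piece(line, ",", 2, *), "<>W")
        set ^loader(code) = text
    }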


Hi Devs!

Last weekend I was testing the newborn csvgen module and was looking for a CSV file to test it with; that's how I came across an interesting data file on Data.World with Game of Thrones episode statistics. Death statistics, to be precise. These folks documented all the murders across all 8 seasons and noted where each one happened, and who, from which clan and with what weapon, killed whom.

So I imported it and made an IRIS Analytics dashboard.

[Image: "You Know Nothing, Jon Snow" meme]

Don't worry, Jon, with this dashboard we can figure something out :) See the details below.


Hi!

I believe the simplest is something like this (for a CSV delimited by ";"):


    set file = ##class(%File).%New("data.csv")
    set sc = file.Open("R")
    if $$$ISERR(sc) quit  // or do smth

    while 'file.AtEnd {
        set str = file.ReadLine()
        for i=1:1:$length(str, ";") {
            set id = $piece(str, ";", i)
            write !, id  // or do smth
        }
    }
    do file.Close()

Possible options:

Different variants of error handling with the sc status code.

Wrapping the while loop in a try/catch block, as sketched below.
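For example, a rough sketch of the try/catch variant (same file and delimiter assumptions as above) might look like:

    try {
        set file = ##class(%File).%New("data.csv")
        set sc = file.Open("R")
        if $$$ISERR(sc) { throw ##class(%Exception.StatusException).CreateFromStatus(sc) }
        while 'file.AtEnd {
            set str = file.ReadLine()
            for i=1:1:$length(str, ";") {
                write !, $piece(str, ";", i)
            }
        }
        do file.Close()
    } catch ex {
        write !, "Error: ", ex.DisplayString()
    }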

And what's yours?


Hi,

I've set up an operation that pulls text from an MDM message and uses it to generate a CSV file. I'm having issues because our vendor sends the file with a carriage return (as shown below) to signify the next line, but HealthShare recognises this as a new segment, which causes an error.
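As an illustration of one possible workaround (the variable and stream names here are assumptions, not from the original post), the extracted text could have bare carriage returns normalised before the CSV line is written:

    // Replace bare carriage returns in the extracted text with a space
    set txt = $translate(rawText, $char(13), " ")
    // or keep the breaks but mark them with a literal "\n" for the CSV consumer
    set txt = $replace(rawText, $char(13), "\n")
    do csvStream.WriteLine(txt)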


Hello,

I am reading an X12 document into my production that needs to be processed and returned as a CSV file. I have created a record map to support the fields I want to extract, with a batch class to store headers. I have a DTL mapping the data to the appropriate fields in the record map, and am sending the record map to an EnsLib.RecordMap.Operation.BatchFileOperation.
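For context, a hypothetical fragment of that hand-off inside a custom business process might look like this (the DTL class name and operation config name below are placeholders, not from the original post):

    // Transform the inbound X12 document into a record map record,
    // then send it on to the CSV batch file operation
    set sc = ##class(My.X12ToCSV.DTL).Transform(pRequest, .csvRecord)
    quit:$$$ISERR(sc) sc
    quit ..SendRequestAsync("CSV.BatchFileOperation", csvRecord)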


Hi,

I have written a procedure using ObjectScript to export data to a CSV file. There is more data than the CSV limit.

Can anyone please tell me how to get the row count of a CSV file using ObjectScript, so that I can write an if condition and write to a second CSV file?
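For illustration (this is not the poster's code), a minimal sketch that counts the rows already written to a CSV file might look like:

    // Count the lines in an existing CSV file (path assumed)
    set stream = ##class(%Stream.FileCharacter).%New()
    set sc = stream.LinkToFile("C:\export\data.csv")
    if $$$ISERR(sc) quit
    set rows = 0
    while 'stream.AtEnd {
        do stream.ReadLine()
        set rows = rows + 1
    }
    write !, "Row count: ", rows
    // if rows exceeds the limit, open a second file and continue writing there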

Please find the code that writes to the CSV given below.

Question
· Nov 19, 2018
Copy CSV data into a global object

Hi, I have a CSV file with a list of 5000 records in the following format

Name, Acc, division

Eric, 1234, 567

John, 1235, 987

Peter, 3214, 879

I just want to copy the Acc and division to a global, so eventually the global would be like the following:

^People("Customers", "Acc.division")

Can you advise how I can perform this from the terminal? This is a one-time task. I want to read all the values from the CSV file and insert them into the global.
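For illustration, a one-off sketch along these lines (wrapped in a routine or class method so it can be called from the terminal; the file path is assumed, and the header line is skipped) could build that global:

    set file = ##class(%File).%New("C:\data\people.csv")
    set sc = file.Open("R")
    if $$$ISERR(sc) quit                              // or report the error
    do file.ReadLine()                                // skip the "Name, Acc, division" header
    while 'file.AtEnd {
        set line = file.ReadLine()
        continue:line=""
        set acc = $zstrip($piece(line, ",", 2), "<>W")
        set div = $zstrip($piece(line, ",", 3), "<>W")
        set ^People("Customers", acc_"."_div) = ""    // subscript in "Acc.division" form
    }
    do file.Close()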

Regards,

Eric


Hi All,

I need urgent help.

I want to export the values from a global to a CSV file.

The values in the global are:

^Global1(1)="1,2,3,4"
^Global1(2)="5,6,7,8"
.
.
.
^Global1(n)="n,n,n,n"

I want the output in the CSV file as:
1,2,3,4
5,6,7,8
.
.
.
n,n,n,n
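For illustration, a minimal sketch of such an export (output path assumed) could be:

    // Write each ^Global1(n) node as one line of the CSV file
    set file = ##class(%Stream.FileCharacter).%New()
    set file.Filename = "C:\export\global1.csv"
    set key = ""
    for {
        set key = $order(^Global1(key))
        quit:key=""
        do file.WriteLine(^Global1(key))
    }
    set sc = file.%Save()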

I made a class:


Hi Guys,

I have a file located at \\servername\Myfiles\pull.csv, but for some reason my Cache routine can't find it, although it can when I use the local drive path C:\servername\Myfiles\pull.csv.

FYI, I can access \\servername\Myfiles\pull.csv via Windows Explorer with no problems.

Any idea how I can fix this?
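A quick check from the terminal (run under the same account the routine runs as) can confirm whether Cache can see the share at all:

    write ##class(%File).Exists("\\servername\Myfiles\pull.csv")   // 1 = visible, 0 = not found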

Thanks Guys


I came across "How to import a tab separated text file into a SQL table programmatically?", which appears to indicate that I can pass the filename and all records will be imported. However, when I use the Import method only a single record gets imported, whereas all records are imported if I use the management console to import the file by selecting it and choosing the options.

Has anyone run into this in the past?


I have a business service that brings an XML virtual document into the production, and also a CSV service that brings in a CSV file. I have a process that transforms both to an XML output, but I have a problem with the CSV, as it gives me this error when I try to transform it: ```ERROR <Ens>ErrException: <PROPERTY DOES NOT EXIST>zOnRequest+1 ^EnsLib.MsgRouter.VDocRoutingEngine.1 *DocType```. I have read here and followed the suggestion, but now I do not get any errors but my m


I have implemented a CSV record mapper to read files into the production in Ensemble, but it errors every time I read a file with headers on the columns, as these are not recognised as the specified data types. Is there a way to ignore the headers when reading in values from a file with headers? Please note this has been built using the pre-built components.


Dear All,

I'm having trouble creating the following business service. Its intention is to pick up a CSV via FTP and pass it to a business process that transforms it into an HL7 message.

I have created a record map for the CSV file, which I am trying to call in the business service to parse the file into a new message class that can be transformed in the business process.

Please could you advise:

Business Service:


Hi, we have a business process that receives back a character stream with CSV content from a SOAP operation call. I was hoping to make use of the record mapper to map the content to a record map so that I can process the transactions.

I only see examples/documentation of how to use a file or FTP business operation/service to map the stream to a record map.

Is it at all possible to get the CSV stream into a record map batch object?

Regards

Thomas


A short post for now to answer a question that came up. In post two of this series I included graphs of performance data extracted from pButtons. I was asked off-line if there is a quicker way than cut/paste to extract metrics for mgstat etc from a pButtons .html file for easy charting in Excel.

See: - Part 2 - Looking at the metrics we collected
