@Eduard Lebedyuk, the code you provided is correct, but it does not do a 24-hour calculation from the moment the purge task is started. Instead, it does a midnight-to-midnight calculation in UTC time.

I should have responded to both @Evgeny Shvarov  and @Eduard Lebedyuk at the same time, so see above, but I still don't understand what this means:

Every persistent class (a class which stores data on disk) has the map of how the data should be stored in a particular global (globals). 

Stores data on disk? Meaning that if we create these persistent classes, the data is stored twice? Once in the global node and once in some other format, as defined (or not defined) by the class?

But! As @Robert Cemper mentioned if you delete it via SQL API with DROP statement it will delete data in a global too (because we suppose the data to be deleted if we DROP a table, right?).

But just to reiterate: how does the DDL interpret a read definition into a write definition? The data stored in the global node and the definition don't necessarily line up exactly (in our case, almost not at all).

It's all about the multi-model nature of IRIS and Caché.

Everything in Caché and InterSystems IRIS is stored in globals - sparse key-value arrays persisted on disk.

Every persistent class (a class which stores data on disk) has the map of how the data should be stored in a particular global (globals).

The ObjectScript class definition provides the API for how the data will be stored in and read from a global (or globals) via ObjectScript or SQL access.

You can specify a standard or custom storage strategy for a class.

A persistent class 'SomeClass' with the standard storage strategy will provide the API to store data in the ^SomeClassD global for the data itself and the ^SomeClassI global for indexes.

You can see the storage strategy of a class in the source code of a persistent class immediately after successful compilation - it will be visible in a related XML block at the bottom of the class.
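For illustration, here is a minimal sketch (class, property, and index names are made up) of a persistent class with the standard storage strategy, and the globals it produces after a save (output is indicative):

Class Demo.SomeClass Extends %Persistent
{
Property Name As %String;
Index NameIdx On Name;
}

USER>set obj = ##class(Demo.SomeClass).%New(), obj.Name = "Alice"
USER>write obj.%Save()
1
USER>zwrite ^Demo.SomeClassD
^Demo.SomeClassD=1
^Demo.SomeClassD(1)=$lb("","Alice")
USER>zwrite ^Demo.SomeClassI
^Demo.SomeClassI("NameIdx"," ALICE",1)=""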

In your case you have a custom storage strategy, which could be anything. Check @Brendan Bannon's series on that, which could be really helpful for you.

As @Eduard Lebedyuk mentioned, if you delete the class as source code (e.g. via Management Portal)  it will not delete the data itself.

But! As @Robert Cemper mentioned if you delete it via SQL API with DROP statement it will delete data in a global too (because we suppose the data to be deleted if we DROP a table, right?).

HTH

Hello! Thank you for posting the question.

Thanks to @Francisco López and @Eduard Lebedyuk for pointing to the right place.

Indeed, there was a little mess with links and downloads: the information was spread across 3 repositories on GitHub. I moved all the stuff to the intersystems-community repository and added releases there. This repository can now be treated as the main repository of the project. Also, there is a releases page with the latest XMLs you can download to install ObjectScript Visual Editor. This page is also linked in Open Exchange and the readme file.

Would love to hear your feedback! Thank you!

Nikita, the developer of this app.

As @Eduard Lebedyuk pointed out, the method should return a status as indicated by its signature, so at the very minimum you should have a "Return $$$OK" at the end of the method.

That said, %FromJSON() throws an exception when it fails, and the generic error you'll get in the Event Log won't tell you (or your support team) much. You may want to wrap a try/catch around the call to %FromJSON() so that you can return a status that provides a little more insight:

    ...

    Try {
        set tJSON = {}.%FromJSON(pInput)
    } Catch ex {
        // surface the parser's own message instead of a generic error
        Return $$$ERROR($$$GeneralError,"Badly formed JSON: "_ex.DisplayString())
    }

    ...

    Return $$$OK
}

Hi Community!

This is the update on the new applications submitted to Open Exchange in April 2019.

New Applications

 

Arduino Snippets published by @Eduard Lebedyuk 

Connect your Arduino to InterSystems IRIS or Caché via com port (or usb<->com)

Japanese Calendar published by @Hiroshi Sato 

Japanese Calendar Converter for InterSystems products

Cache Quality for Atelier published by @Daniel Tamajon 

IDE extension that helps you detect and fix quality issues as you write code. Like a spell checker, this extension squiggles flaws so they can be fixed before committing code. You can install it directly from Atelier and it will then detect new bugs and quality issues as you code (ObjectScript and JavaScript).

ETL Interoperability Adapter published by @Guillaume Rongier

Extend EnsLib.SQL.OutboundAdapter to add batch and fetch support on JDBC connections for Ensemble and IRIS.

R Gateway published by Shiao Bing Sung

Use R language with InterSystems IRIS

The Folding Stuff published by @John Murray

'The Folding Stuff' is a simple VSCode extension that adds Visual Studio Code's existing code folding / unfolding features to the editor context menu.


@Eduard Lebedyuk No, it is not a known node. What I am trying to do is pull those characters and pass them to $TRANSLATE($SYSTEM.Encryption.Base64Encode(streamString),$C(10,13)) as part of a document I need to convert to a PDF, but I have tried all the encodings I could use, from UTF-8 to Windows-1252, and I get an error like this:

ERROR <Ens>ErrException: <ILLEGAL VALUE>zEncodeStream+18 ' set encString = $TRANSLATE($SYSTEM.Encryption.Base64Encode(streamString),$C(10,13))'. Is there any way to get around the Base64 encoding?
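For readers hitting the same wall, here is a hedged workaround sketch (the stream variable and chunk size are assumptions): $SYSTEM.Encryption.Base64Encode() operates on strings, so feeding it very large or wide (non-8-bit) data can raise <ILLEGAL VALUE>; encoding the stream in chunks whose length is a multiple of 3 keeps the output free of mid-stream "=" padding:

    // minimal sketch, assuming 'stream' holds 8-bit (binary) data
    set encoded = ##class(%Stream.GlobalCharacter).%New()
    do stream.Rewind()
    while 'stream.AtEnd {
        set chunk = stream.Read(24000)  ; 24000 is a multiple of 3 - no mid-stream padding
        do encoded.Write($TRANSLATE($SYSTEM.Encryption.Base64Encode(chunk), $C(10,13)))
    }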

Hi Community!

This is the update on the new applications submitted to Open Exchange in March 2019.

New Applications

isc-tar  published by @Dmitry Maslennikov 

Compact files as TAR or Extract files from TAR files

Light weight EXCEL download v.1.0 published by @Robert Cemper 

This is a working example of a lightweight export to Excel based on data in the SAMPLES namespace. Good old CSP is well equipped to produce HTML tables accepted by Excel as input. With modern browsers you don't even need the usual wrapper tags. So the required code around your SQL result set is really slim. And you are free to add any formatting you need, either in HTML or in SQL.

PythonGateway v.0.7 published by @Eduard Lebedyuk 

Python Gateway for InterSystems Data Platforms.

Adopted Bitmaps v.1.0 published by @Robert Cemper 

This is a running example of the Bitmap Adoption

WebSockets Tutorial v.1.0 published by @Lily Taub 

A short tutorial on WebSockets in InterSystems IRIS 2018.1+ and Caché 2016.2+

Sync Data with DSTIME v.1.0.0 published by @Robert Cemper

Other sync tools just work from Caché/IRIS to Caché/IRIS. Synchronizing your data to some external DB requires some other solution. DSTIME can do it.

HL7 and SMS Interoperability Demo v.1.3 published by @Amir Samary 

This demo shows how easy it is to integrate an Electronic Medical Record system that is sending HL7 messages with AWS.


Hey Developers!

Do you want to reap the benefits of the advances in the fields of artificial intelligence and machine learning? With InterSystems IRIS and the Machine Learning (ML) Toolkit it’s easier than ever.

Join InterSystems Sales Engineers, @Sergey Lukyanchikov and @Eduard Lebedyuk, for the Machine Learning Toolkit for InterSystems IRIS webinar on Tuesday, April 23rd at 11 a.m. EDT to find out how InterSystems IRIS can be used as both a standalone development platform and an orchestration tool for predictive modelling that helps stitch together Python and other external tools.


Hi Alexey!

Thank you for teaching me COS :)

Didn't even try, you know ;) 

But for those who see your code sample, I, as a Community Manager, must say that dot syntax and "do label" are never good practice in ObjectScript today.

And I can't imagine that InterSystems product manager who introduced return in ObjectScript could see it called from dot-level ;)

I think we have {} in ObjectScript to replace dot-syntax usage, and do ..Method() for the do Label cases.

The rewritten sample is not a functional equivalent of the original one and will return a different result (once the syntax error is corrected).

This illustrates how dot-syntax can easily obfuscate the logic if you want ;)

My sample was not about "old" syntax vs. "new" syntax; I just wanted to emphasize that simply substituting the "quit" command with "return" may or may not improve the readability of an already existing class/routine - it all depends...

Agree. 

But IMHO, for new ObjectScript code in InterSystems IRIS, return value is always more readable than quit value.

And I think that in the majority of cases where you don't use dot-syntax, this replacement could be applicable too. @Eduard Lebedyuk - is it eligible for your

Quit:$$$IsError(sc) sc

case? 

And to All: this is not an instruction to action!

This is a discussion and an activity to introduce Developer Community Coding Guidelines / Best Practices for InterSystems IRIS ObjectScript. You may use them or not, but it's good to have them.

P.S. I think we need to stop shrinking the levels of the thread in DC engine :)

Alexey, I do not argue the difference between "return" and "quit". There is a difference.

In this case it's a safe guess that if we check the error and return the value with "quit", it is 100% a quit from the method rather than any of the other use cases of "quit".

So it is "return". @Eduard Lebedyuk ?

Answering your point: IMO "return" is a more readable command for getting a direct answer to the question "What does this method return?"
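To make the distinction concrete, a small sketch (a made-up method, not code from this thread): an argumentless quit exits only the innermost loop, while return always exits the method:

ClassMethod Demo() As %String
{
    for i=1:1:5 {
        quit:i=3        ; leaves the for loop only; execution continues below it
    }
    return "done"       ; leaves the method from any nesting depth, with a value
}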

@Eduard Lebedyuk , I am running into an issue when attempting to run the commands in the Cache Terminal.

When typing "S sc=##class(isc.rabbitmq.API).sendMessage()"

I get "<METHOD DOES NOT EXIST> *sendMessage,isc.rabbitmq.API"

Also, when I created the Java Gateway, I got this error:

"ERROR #5117: In class 'com.rabbitmq.tools.jsonrpc.JsonRpcException' element type 'Method', element 'getMessage' and 'getmessage' have the same name but differ in case. > ERROR #5030: An error occurred while compiling class 'com.rabbitmq.tools.jsonrpc.JsonRpcException'
1 com.rabbitmq.tools.jsonrpc.JsonRpcException imported."

Thank you @Eduard Lebedyuk @Bernd Mueller for the tips - yes, it works with a 32-bit Apache server. For some reason, even though CSPGateway-2018.1.1.643.0-win_x64 is the one installed (it's the only one downloaded), the CSPa24.dll is 32-bit according to the above test.

Regards,

Olga

IMHO this deserves an enhancement request. Data for mapped lib classes is stored in user namespaces, but the tuning parameters for this data live in the lib namespace. It looks difficult to use persistent classes as part of a library in this case.

Maybe you could automatically generate the storage class @Eduard Lebedyuk mentioned on the first call from a user namespace?

Agree with @Eduard Lebedyuk's answer; I want to introduce another toolset:

1. Import the ISC_DEV utility into a DEFAULT_INSTANCE, say in the USER namespace, and map the classes of the utility to %All.

2. Set up the workdir to which the code will be exported:

YOURNAMESPACE> w ##class(dev.code).workdir("/path/to/your/working/directory/")

3. Export the code by calling:

YOURNAMESPACE> w ##class(dev.code).export()

This will export classes, routines, and DFI files (DeepSee) into separate files.

4. Create a repository in git and commit all the files from the directory into it (and even push it, if you use GitHub/GitLab).

5. Repeat steps 1-3 for a PRODUCTION_INSTANCE and export its classes into the same directory.

6. Compare the changes. If you open the directory in VSCode with the ObjectScript plugin by @Dmitry Maslennikov, you will immediately see the changes in the Source Control section of VSCode. E.g. I changed one line and saved the class, and it shows the files changed since the latest commit and the line with the change.

Alternatively, you can commit and push the changes to GitHub/GitLab and see the diff since the latest commit, e.g. like the changes in this commit.

If you don't have DeepSee resources, step 1 can be replaced with Atelier or VSCode - both have out-of-the-box functionality to export the source into files in UDL form.

HTH

@Eduard Lebedyuk There is one problem: the file gets successfully saved in S3, but when we try to retrieve the content, the data is damaged.

This works fine for text files, but if we have a PDF or an image, then when we try to view that same file we get completely damaged data in the stream, not the actual image or PDF we uploaded.


How can I get the original file (image/PDF) that I uploaded back as a response, instead of a damaged stream?
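In case it helps others reading along, here is a minimal sketch (the method, file path, and content type are assumptions, not the actual code above) of returning binary content from a %CSP.REST handler without character re-encoding:

ClassMethod GetFile() As %Status
{
    set %response.ContentType = "application/pdf"   ; or image/png, etc.
    set stream = ##class(%Stream.FileBinary).%New()
    do stream.LinkToFile("/tmp/sample.pdf")          ; hypothetical source file
    do stream.OutputToDevice()                       ; write the raw bytes to the reply
    quit $$$OK
}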


Hi Jose!

If your dashboards need authentication for access, it's logical that they demand a login and password.

If you want the embedded DSW iframe to use the same login/password as another application on the same web page where the user is already logged in, it's a matter of how to transfer the access to the embedded iframe.

I cannot answer at the moment how to manage this, so I'm pinging @Eduard Lebedyuk: Eduard, do you think it is possible to transfer the session to the MDX2JSON app of the embedded DSW iframe?

Hi Community!

New badges are already on the Global Masters Advocacy Hub!

We're happy to announce that this year we again introduced three annual badges on Global Masters Advocacy Hub to let you remember how much you contributed to Developer Community in 2018. Here they are:

 DC Best-Selling Author 2018
 DC Expert 2018
 DC Opinion Leader 2018

Let's take a closer look at the DC Wall of Fame 2018 and greet everyone with big applause! 


Hi Community!

Please welcome a new video on Developer Community YouTube Channel:

Continuous Delivery with Containers

 


Hi Community!

I'm pleased to announce that InterSystems Developer Community reached 5,000 registered members!

Thank you, developers, not only for registering but rather for making this place more and more helpful for everyone who develops and supports solutions on InterSystems Data Platforms all over the world! Big applause to all of us! 


Hi, Stefan!

In this situation, I would suggest using a RELEASES approach for the different InterSystems Data Platforms. E.g. you can generate an XML release package from the code base in git and then replace all the version-specific places during the release build phase. As a result you will have different XML packages for Ensemble and for InterSystems IRIS while keeping one code base, say in IRIS.

For example, you can see the approach @Eduard Lebedyuk used for RESTForms distribution. There are 2016.1 and 2016.2 release packages which can be installed in 2016.1 and 2016.2+ Caché or Ensemble versions respectively.

To simplify release building I can recommend the ISC.DEV utility, which can export a release or patch file (based on git commits) for a given mask of classes.

HTH

Hi Community!

This year we will have a special section for Flash Talks, which gives you an opportunity to introduce your tool or solution at InterSystems Global Summit 2018!



What are Flash Talks?

It's a 15-min session you have on the Technology Exchange stage: 10 min for your pitch, 5 min for Q&A.

The session WILL BE live streamed on Developer Community YouTube Channel.

Developer Community Flash Talks!

Today, 10/02, Flash Talks Stage @ InterSystems Global Summit 2018!

2:00 pm - Open source approaches to work with Documents, @Eduard Lebedyuk, InterSystems

2:15 - InterSystems IRIS on Kubernetes by @Dmitry Maslennikov

2:30 - Visual Studio Code IDE for InterSystems Data Platforms by @John Murray, George James Software

2:45 - Static Analysis for ObjectScript with CacheQuality by @Daniel Tamajon, Lite Solutions

3:00 - InterSystems Open Exchange by @Evgeny Shvarov, InterSystems

3:15 - Q&A Session on Developer Community, Global Masters, and Open Exchange


A few days before Global Summit 2018!

And I can announce yet another presenter: @Eduard Lebedyuk, InterSystems.

Title: Open source approaches to work with Documents.

And we have the day and time!

Find the Developer Community Flash Talks "Share Your InterSystems IRIS Solution!" on Tuesday, the 2nd of October, on the Flash Talks Stage from 2 pm to 3:30 pm.

Thank you, John!

Actually, I don't know of a case where we would need to change the passwords of ALL users to one specified value.

So, don't run this code on your production system.

@Eduard Lebedyuk is there any close-to-reality case where this snippet could be useful?

This code snippet uses %ZEN.Auxiliary.jsonSQLProvider. The namespace and the SQL string can be edited for different situations. The class method "test" runs the code:


Class eduardlebedyuk.passQuestionParams
{
    classmethod test(pValue = 50) {
        s ns = $Namespace
        zn "samples"
        s tSQL = "SELECT ID, Name FROM Sample.Person WHERE Id > ?"
        s tPR = ##class(%ZEN.Auxiliary.jsonSQLProvider).%New()
        s tPR.sql = tSQL
        s tPR.%Format = "tw"
        s tPR.maxRows = 100
     
        s tParam = ##class(%ZEN.Auxiliary.parameter).%New()
        s tParam.value = pValue
        d tPR.parameters.SetAt(tParam,1)
      
        d tPR.%DrawJSON() 
        //d ##class(%ZEN.Auxiliary.jsonSQLProvider).%WriteJSONFromSQL(,,,,,tPR)  //same thing
        zn ns
    }
}

(Originally posted to InterSystems CODE by @Eduard Lebedyuk, 5/13/15)

Here's a link to the code on GitHub
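Usage is a one-liner (assuming a namespace where Sample.Person from SAMPLES is available):

USER> do ##class(eduardlebedyuk.passQuestionParams).test(10)

This writes the JSON for the persons with ID > 10 to the current device.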


(Originally posted on InterSystems CODE by @Eduard Lebedyuk, 10/12/15) The following code snippet outputs all filenames in the directory "dir" to the Caché/IRIS terminal. The class method "test" runs the code:


Class eduardlebedyuk.filenamesInDir Extends %RegisteredObject
{
	classmethod test() {
		// replace dir with file path you want
		set dir = "D:\directory" 
		set dir = ##class(%File).NormalizeDirectory(dir)
		set file=$ZSEARCH(dir_"*")
		while file'="" {
			write !,file
			set file=$ZSEARCH("")
		}
	}
}


@Eduard Lebedyuk I have tried your suggestions and still get the error when opening the file with Adobe Reader. If I try a simple pass-through operation like the one below, everything works fine.

ERROR: Adobe Acrobat Reader could not open file.pdf because it is either not a supported file type or because the file has been damaged.

The service:

Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
#dim pt as TestingEnvironment.ECGTrace.TEST.FSMREQ=##class(TestingEnvironment.ECGTrace.TEST.FSMREQ).%New()
  Set tSource=pInput.Attributes("Filename"), pInput=$zobjclassmethod(..#CONTAINERCLASS,"%New",pInput)
 Set tSC=..resolveAndIndex(pInput) Quit:$$$ISERR(tSC) tSC
 set pt.filestream=pInput
 set pt.path=tSource
 Set tWorkArchive=(""'=..Adapter.ArchivePath)&&(..Adapter.ArchivePath=..Adapter.WorkPath || (""=..Adapter.WorkPath && (..Adapter.ArchivePath=..Adapter.FilePath)))
 $$$SyncCommitSet(tSyncCommit)
 For iTarget=1:1:$L(..TargetConfigNames, ",") { Set tOneTarget=$ZStrip($P(..TargetConfigNames,",",iTarget),"<>W")  Continue:""=tOneTarget
  $$$sysTRACE("Sending input Stream "_pInput.Stream_"("_pInput.Stream.Size_")"_$S(tWorkArchive:" Async",1:" Sync")_" from '"_tSource_"' to '"_tOneTarget_"'")
  If tWorkArchive {
   Set tSC1=..SendRequestAsync(tOneTarget,pt)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
   //Set tSC1=..SendRequestAsync(tOneTarget,pInput)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
  } Else {
   #; If not archiving send Sync to avoid Adapter deleting file before Operation gets it
   //Set tSC1=..SendRequestSync(tOneTarget,pInput)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
   Set tSC1=..SendRequestSync(tOneTarget,pt)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
  }
 }
 $$$SyncCommitClear(tSyncCommit)
 Quit tSC
}

The operation:

Method OnMessage(pREs As TestingEnvironment.ECGTrace.TEST.FSMREQ, pRequest As Ens.StreamContainer, Output pResponse As %Persistent) As %Status
{
 set pRequest=pREs.filestream
 Quit:'$IsObject(pRequest.Stream) $$$ERROR($$$EnsErrGeneral,"No Stream contained in StreamContainer Request")
 Set tFilename=..Adapter.CreateTimestamp(##class(%File).GetFilename(pRequest.OriginalFilename),..Filename)
 Set tSC=..Adapter.PutStream(tFilename, pRequest.Stream)
 Do pRequest.%Save() ; re-save in case PutStream() optimization changed the Stream filename
 Quit tSC
}

My code - the service:

Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
 #dim meta as DocumentUpload.GenericUploadMREQ=##class(DocumentUpload.GenericUploadMREQ).%New()
 
   ;get the filepath from the request 
   ;; wrap the stream object into a container for easy transportation
    Set tFileName=pInput.Attributes("Filename") , pInput=$zobjclassmethod(..#CONTAINERCLASS,"%New",pInput)
 $$$TRACE(tFileName) 
 ;get the file name
    set dataPiece=##class(%File).GetFilename(tFileName)
 $$$TRACE(dataPiece)
      Set tSC=..resolveAndIndex(pInput) Quit:$$$ISERR(tSC) tSC
      ;check if the file path data is populated
 if (dataPiece'="")
    {
    ;build the ECG Message
          set meta.ClientID =$Piece(dataPiece,"_",1) 
          set meta.LastName=$Piece(dataPiece,"_",2)
          set meta.FirstName=$Piece(dataPiece,"_",3)
          set meta.DateOfBirth=$Piece(dataPiece,"_",4)
          set meta.Directory =$Piece(tFileName,"\",*-1)
          set meta.OGFileName  =dataPiece
          set meta.Fullpath  =tFileName
          ;get the date to testing
          set DateOfTest=$Piece(dataPiece,"_",5)
          ;get the time of testing
          set mtim=$Piece(dataPiece,"_",6)
          ;separate the extension of the file path and the time
          set TimeOfTest=$Piece(mtim,".",1)
          set meta.FileTimeStamp =DateOfTest_""_TimeOfTest
          set meta.recordAdded =$ZDT($ZTIMESTAMP,3,1,3)
          set meta.TargetConfig ="RIO.DocumentUpload.RiOFileOPRN"
          set meta.payLoad=pInput
          set meta.DocumentType=..DocumentType
       set meta.Description=..Description
       set meta.Title=..Title
       set meta.FinalRevision=..Revesion
       set meta.Author=..Author_""_$Piece(tFileName,"\",*-1)
       set messagetype=$PIECE(..Author," ",1)
       set meta.UserId=..UserID
          set meta.sourceConfig =tFileName
          set meta.TypeMes=messagetype
         
         
          Set tWorkArchive=(""'=..Adapter.ArchivePath)&&(..Adapter.ArchivePath=..Adapter.WorkPath || (""=..Adapter.WorkPath && (..Adapter.ArchivePath=..Adapter.FilePath)))
 $$$SyncCommitSet(tSyncCommit)
 For iTarget=1:1:$L(..TargetConfigNames, ",") { Set tOneTarget=$ZStrip($P(..TargetConfigNames,",",iTarget),"<>W")  Continue:""=tOneTarget
  $$$sysTRACE("Sending input Stream "_pInput.Stream_"("_pInput.Stream.Size_")"_$S(tWorkArchive:" Async",1:" Sync")_" from '"_tFileName_"' to '"_tOneTarget_"'")
  If tWorkArchive {
   Set tSC1=..SendRequestAsync(tOneTarget,meta)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
   
  } Else {
   #; If not archiving send Sync to avoid Adapter deleting file before Operation gets it
   Set tSC1=..SendRequestSync(tOneTarget,meta)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
  }
 }
 $$$SyncCommitClear(tSyncCommit)
         
         
       }
            quit tSC
}

The operation. Please note that I have a router in between which simply transforms the message to the message expected by the operation - it's just a simple mapping scenario:

Method WriteOutFiles(pRequest As DocumentUpload.FileMREQ, pInput As Ens.StreamContainer, Output pResponse As DocumentUpload.GenericRESP) As %Status
{
 set pInput=pRequest.FileStream
     ;the variable to hold the status for the method
 #dim status as %Status=$$$OK
 ;clear the pResponse
  kill pResponse
     set pResponse=$$$NULLOREF
 ;set the file name to the sequence number
  set ..Filename=pRequest.NewFileName
 
  ;the filepath set on the settings of this OPERATION
   set origDirectory = ..Adapter.FilePath
  ;the file directory to drop the file
  set ..Adapter.FilePath = ..Adapter.FilePath_"\"_..StubDirectory
  
 ;start writing out file data to the stub file
    set:$$$ISOK(status) status= ..Adapter.PutLine(..Filename_"."_..stubExtension,
  $CHAR(34)_ pRequest.ClientID_$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.UserId_$CHAR(34)_$CHAR(44)_$CHAR(34)_
  pRequest.DocumentType _$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.Title_$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.Description_$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.Author _$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.DocumentDate_$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.FinalRevision_$CHAR(34))
 
 if ($$$ISOK(status))
 {
  ;set back to the operation settings
  set ..Adapter.FilePath = origDirectory
    ;set the file name  to write out to
  set ..Filename=pRequest.NewFileName_"."_..DocExtension
  ;set the filepath on the production settings to this variable
  set origDirectory = ..Adapter.FilePath
  ;set the new filepath
  set ..Adapter.FilePath = ..Adapter.FilePath_"\"_..DocumentDirectory
     
 
 // set:$$$ISOK(status) status=  ..Adapter.PutStream(..Filename, pInput.Stream) //error here to file
  set:$$$ISOK(status) status=..OriginalFileOut(pInput,pResponse,..Filename)
 
      ;set adapter to its original file path
   set ..Adapter.FilePath = origDirectory
  
   ;check writing out file worked
   if ($$$ISOK(status))
 {
  set pResponse=##class(DocumentUpload.GenericRESP).%New()
  set pResponse.Process="FileOPRN_files Written to their respective directories"
  set pResponse.Status=status
  set status=pResponse.%Save()
 }
  
 }
 
 ;return status
 return status
}

// passthrough original method to write out file

Method OriginalFileOut(pRequest As Ens.StreamContainer, Output pResponse As %Persistent, filenamess) As %Status
{
 
 Quit:'$IsObject(pRequest.Stream) $$$ERROR($$$EnsErrGeneral,"No Stream contained in StreamContainer Request")
 Set tFilename=..Adapter.CreateTimestamp(##class(%File).GetFilename(pRequest.OriginalFilename),filenamess)
 Set tSC=..Adapter.PutStream(tFilename, pRequest.Stream)
 Do pRequest.%Save() ; re-save in case PutStream() optimization changed the Stream filename
 Quit tSC
}

@Eduard Lebedyuk I looked at the EnsLib.RecordMap.Service.BatchFileService and EnsLib.RecordMap.Service.BatchStandard classes; apparently these are fine, but for some reason the custom class acts as if it goes into a loop. I have put some traces in my code and tried to capture the status and change it to a failure if a certain validation fails, but that makes the whole service log an error after successfully delivering the first message.

Here are the changes made. Thanks for your help.

ClassMethod GetBatchHeader(pIOStream As %IO.DeviceStream, pTimeout As %Numeric = -1, Output pBatch As EnsLib.RecordMap.SimpleBatch, ByRef pLookAhead As %String) As %Status
{
    $$$TRACE("Begin")
 Try {
  #dim cpStatus as %Status=$$$OK
    Set tStatus = $$$OK
    Set pBatch = ""
    Set tTerm = ..GetHeaderTerm()
    //Set tFullHeader = 63 _ tTerm
    Set tHeaderLen =63+$length(tTerm)
    set stage=""
    If tHeaderLen
       {
       Set tFound = 0
       $$$TRACE("0")
       Set tLeadingJunk = ""
       Set pLookAhead = $get(pLookAhead)
       Set tTimeout = pTimeout
       Set tEndTime = $zhorolog + pTimeout
      // While ('tFound) && ('pIOStream.AtEnd)
        //  {
          Set tReadLen = tHeaderLen - $length(pLookAhead)
          If tReadLen > 0
             {
             Set tData = pLookAhead _ pIOStream.Read(tReadLen, .tTimeout, .tStatus)
             $$$TRACE("1")
             If $$$ISERR(tStatus) Quit
             If tTimeout
                {
                Set tStatus = $$$ERROR($$$EnsErrTCPReadTimeoutExpired, pTimeout, tReadLen)
                Quit
                }
                  Set pLookAhead = ""
             }
             Else
             {
             $$$TRACE("Else treadlen less than 0-[2]")
             Set tData = $extract(pLookAhead, 1, tHeaderLen)
             Set pLookAhead = $extract(pLookAhead, tHeaderLen + 1, *)
             }
             If ($extract(tData,1,3) = "001" )
                 {
                Set pBatch = ..%New()
                set pBatch.BatchHeader = tData
                Set tFound = 1
                $$$TRACE("Foundlee[3]"_tFound)
                Quit
                 }
                 Else
                 {
                $$$TRACE("else $extract(tData1,3)Failed[3]")
                Set pLookAhead = pLookAhead _ tData
                 #; Check if we should start discarding leading data
                If ($length(pLookAhead) >= tHeaderLen)
                   {
                   If ($length(tLeadingJunk) < 400)
                      {
                     Set tLeadingJunk = tLeadingJunk _ $extract(pLookAhead,1)
                      }
                      Set pLookAhead = $extract(pLookAhead,2,*)
                   }
                   set cpStatus=0
                   set stage="Extractfailed" //check here
                  // Quit
                  //Continue
     
                 }
                 If (pTimeout = -1)
                    {
                    Set tTimeout = -1
                    }
                    Else
                    {
                     $$$TRACE("time out ok [4]")
                    Set tCurrTime = $zhorolog
                    If (tCurrTime > tEndTime)
                       {
                      Set tStatus = $$$ERROR($$$EnsErrTCPReadTimeoutExpired, pTimeout, tReadLen)
                      Quit
                       }
                       Set tTimeout = tEndTime - tCurrTime
                   }
                   If $$$ISERR(tStatus) Quit
         // } //while
         
          $$$TRACE("while end"_tFound)
         
          If $$$ISERR(tStatus) Quit
          #; Clear the lookahead buffer if we didn't find the batch header
          $$$TRACE("###Status###[5]"_tStatus)
          If (('tFound) && ($length(tLeadingJunk) < 400)&&('pLookAhead="")&&($$$ISERR(cpStatus)))
             {
             Set tLeadingJunk = tLeadingJunk _ $get(pLookAhead)
             Set pLookAhead = ""
             $$$TRACE("###Not Found###[6]"_tFound)
             }
            If (tLeadingJunk '= "") && ('..#IgnoreLeadingData)
               {
               #; Use JS escaping to handle control characters
               Set tLoggedJunk = $zconvert($extract(tLeadingJunk,1,400),"O","JS") _ $select($length(tLeadingJunk) > 400: "...", 1: "")
               $$$LOGWARNING($$$FormatText($$$Text("[]Discarding unexpected leading data: '%1'","Ensemble"),tLoggedJunk))
               }
               If (('tFound)&&($$$ISERR(cpStatus)))
                  {
                 Set pBatch = ""
                 $$$TRACE("Not Found[8]")
                 //Set tStatus = $$$ERROR($$$EnsRecordMapErrBatchHeaderNotFound,$classname($this))
                 if (stage="Extractfailed" &&($$$ISERR(cpStatus)))
                          {
                          
                        
                         set tStatus=cpStatus
                        
                         $$$LOGWARNING($classname($this)_":: Empty Batch Discarded")
                        
                          set tStatus=$$$OK
                          }
                          else
                          {
                            // Set tStatus = $$$EnsSystemError
                            Set tStatus = $$$ERROR($$$EnsRecordMapErrBatchHeaderNotFound,$classname($this))
                             }
                
                 Quit
                  }
     }
     Else
     {
   Set pBatch = ..%New()
   Set pLookAhead = $get(pLookAhead)
   Quit
     }
     }
 Catch ex
    {
      $$$TRACE("in catch]")
    Set tStatus = $$$EnsSystemError
      quit
      }
      
 Quit tStatus
}

Since this is made up of three classes, how do I control the status to send with the call from within the block? Have a look at where I have newStatus - I would like to quit there and pass the new status.

Hi, Community!

Continuous Delivery is a software engineering approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time. It aims at building, testing, and releasing software faster and more frequently. The approach helps reduce the cost, time, and risk of delivering changes by allowing for more incremental updates to applications in production.

Join us at 07:00 UTC on April 24th for a webinar with a live demo, "Git flows and Continuous Delivery", by @Eduard Lebedyuk.

The language of the webinar is Russian.

Also, see the related articles on DC.


Sorry to be late. I was busy today:

Thanks to  @Eduard Lebedyuk

I did it in a traditional one-liner: 79  77 char.   + 4 char extra to read the size.

I'll wrap it into a method later to see how much waste of space this generates.

5 min. later:
OK. Method consumes 3 char. extra {} to enclose it + blank at the start   =>>> 82   80  ~3.9% overhead

Sizes 1...4 look odd, but from there on it improves.

f j=1:1:15 zw j d ##class(DC.size).main(j) 

j=1
#
j=2
##
##
j=3
###
###
###
j=4
####
####
####
####
j=5
#####
## ##
# # #
## ##
#####
j=6
######
##  ##
# ## #
# ## #
##  ##
######
j=7
#######
##   ##
# # # #
#  #  #
# # # #
##   ##
#######
j=8
########
##    ##
# #  # #
#  ##  #
#  ##  #
# #  # #
##    ##
########

Hi Everyone!

A new webinar, "REST API Design and Development", is available now on the DC YouTube Channel:

 


2) It's an OK approach for relatively small external source tables, because in this case you need to build the cube over all the records and have no option to update/sync.

If you are OK with the cube-building timings, you are very welcome to use the approach.

If the build time is significant for the application, consider the approach @Eduard Lebedyuk has already advised: import only new records from the external database (a hash method, a sophisticated query, or some other bright idea) and do cube syncing, which performs faster than cube rebuilding.
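For reference, a minimal sketch of the two operations (the cube name is hypothetical; synchronization additionally requires the DSTIME mechanism to be enabled on the cube's source class):

    // full rebuild - recomputes every fact row
    do ##class(%DeepSee.Utils).%BuildCube("MyCube")
    // incremental sync - processes only rows changed since the last sync
    do ##class(%DeepSee.Utils).%SynchronizeCube("MyCube")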

Hi, Max!

I think you have two questions here.

1. How to import data into a Caché class from another DBMS.

2. How to update the cube which uses an imported table as a fact table.

About the second question: I believe you can introduce a field in the imported table holding a hash code of the record's fields and import only new rows, so the DeepSee cube update will work automatically in this case. Inviting @Eduard Lebedyuk to describe the technique in detail.
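A rough sketch of that idea (all class, table, and property names here are hypothetical) - hash the imported fields and skip rows whose hash is unchanged:

    // iterate the external (e.g. linked) table
    set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT ID, Name, DOB FROM External.Person")
    while rs.%Next() {
        // hash the fields the cube cares about
        set hash = $SYSTEM.Encryption.MD5Hash(rs.Name_"|"_rs.DOB)
        set row = ##class(Local.Person).SourceIdOpen(rs.ID)  ; assumed unique-index opener
        continue:$isobject(row)&&(row.RowHash=hash)          ; unchanged - skip
        // ... create or update the local row, store the new hash, and %Save() ...
    }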

Regarding the 1st question: I didn't get your problem, but you can test a linked table via the SQL Gateway UI in the Control Panel.

Containers were invented for it. 

The idea is that you have a git repo with a Docker container definition in it, and a new developer can just check out the repo and call "docker build" or even "docker-compose up" to get the fully prepared environment: one server in a container, two servers for client-server apps, and more for complex solutions.

See @Luca Ravazzolo's posts (one, two) on containers with InterSystems. Also, see @Eduard Lebedyuk's series on continuous delivery with GitLab - it is a very related topic.

Hi, Community!

I have some good news for you!

I'm pleased to announce that Robert Cemper is a new Developer Community Moderator for 2018!

Robert joined DC in June 2017 and brings a significant amount of experience, best practices, and deep skills in InterSystems technology to the InterSystems Developer Community!

Congratulations, Robert! And thanks for saying Yes to working as a Moderator in the InterSystems Community!


Hi, Community!

This year we again introduced three annual badges on Global Masters Advocacy Hub to let you remember how much you contributed to Developer Community in 2017. Here they are:

  • DC Best-Selling Author 2017
  • DC Expert 2017
  • DC Opinion Leader 2017


In part of this post from 2016, @Eduard Lebedyuk asked if anyone knew what is meant by an "expanded class", as referred to in the text that appears when we run the ShowQualifiers classmethod of %SYSTEM.OBJ, thus:

SAMPLES>DO $system.OBJ.ShowQualifiers()
...
            Name: /checkuptodate
    Description: Skip classes or expanded classes that are up-to-date.


And to start it: for me the most helpful article this year was REST FORMS Queries - yes, I'm using REST FORMS a lot, thanks [@Eduard Lebedyuk]!

Another is Search InterSystems documentation using iKnow and iFind technologies

Three helpful questions were mine (of course ;):

How to find duplicates in a large text field

and Storage Schema in VCS: to Store Or Not to Store?

and How to get the measure for the last day in a month in DeepSee

Thanks for the advice. I have changed my code but have come to a problem while reading my stream, as there seems to be an error with the XSLT: XML Transformer Error: SAXParseException. I do not see what the problem is, as I tested my XSLT with the XSLT Transform Wizard and it all worked fine. Could you please have a look at the XML and the XSLT and see if you can spot where I am going wrong? Thanks in advance.

XML

 

<Message><Header><Code>HOT</Code><Date>2017-12-08 11:22:34.658</Date></Header><Body><Code>HOT</Code><Name>SIDE</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>F</Nder><TBC>21</TBC><BO>14</BO><DBOC>0</DBOC><LBC>5</LBC><AB>2</AB></Body><Body><Code>HOT</Code><Name>DARS</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>OADULT</ardge><Nder>U</Nder><TBC>20</TBC><BO>16</BO><DBOC>0</DBOC><LBC>2</LBC><AB>2</AB></Body><Body><Code>HOT</Code><Name>ENTLE</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>M</Nder><TBC>22</TBC><BO>18</BO><DBOC>0</DBOC><LBC>3</LBC><AB>1</AB></Body><Body><Code>HOT</Code><Name>ROOXED</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>OADULT</ardge><Nder>U</Nder><TBC>17</TBC><BO>7</BO><DBOC>0</DBOC><LBC/><AB>9</AB></Body><Body><Code>HOT</Code><Name>DUK</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>F</Nder><TBC>20</TBC><BO>17</BO><DBOC>0</DBOC><LBC>1</LBC><AB>2</AB></Body><Body><Code>HOT</Code><Name>DALE</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>M</Nder><TBC>18</TBC><BO>15</BO><DBOC>0</DBOC><LBC>2</LBC><AB/></Body><Body><Code>HOT</Code><Name>DAN</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>OADULT</ardge><Nder>U</Nder><TBC>16</TBC><BO>8</BO><DBOC>0</DBOC><LBC/><AB>8</AB></Body><Body><Code>HOT</Code><Name>DUN</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>U</Nder><TBC>2</TBC><BO>0</BO><DBOC>0</DBOC><LBC>1</LBC><AB>1</AB></Body><Body><Code>HOT</Code><Name>DUW</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>U</Nder><TBC>3</TBC><BO>2</BO><DBOC>0</DBOC><LBC>1</LBC><AB/></Body><Body><Code>HOT</Code><Name>DUM</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>OADULT</ardge><Nder>U</Nder><TBC>22</TBC><BO>18</BO><DBOC>0</DBOC><LBC>2</LBC><AB>2</AB></Body><Body><Code>HOT</Code><Name>DUR</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>U</Nder><TBC>16</TBC><BO>13</BO><DBOC>0</DBOC><LBC>2</LBC><AB/></Body></Message>

 

XSLT

 

<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <xsl:output method="text" indent="yes"/>
    <xsl:template match="/*[node()]">
        <xsl:text>{</xsl:text><xsl:text>&#xa;</xsl:text>
        <xsl:apply-templates select="." mode="detect" />
  <xsl:text>&#xa;</xsl:text>
        <xsl:text>}</xsl:text>
    </xsl:template>
    <xsl:template match="*" mode="detect">
        <xsl:choose>
            <xsl:when test="name(preceding-sibling::*[1]) = name(current()) and name(following-sibling::*[1]) != name(current())">
                    <xsl:apply-templates select="." mode="obj-content" />
     <xsl:text>&#xa;</xsl:text>
                <xsl:text>]</xsl:text>
                <xsl:if test="count(following-sibling::*[name() != name(current())]) &gt; 0">, </xsl:if>
            </xsl:when>
            <xsl:when test="name(preceding-sibling::*[1]) = name(current())">
                    <xsl:apply-templates select="." mode="obj-content" />
                    <xsl:if test="name(following-sibling::*) = name(current())">, </xsl:if>
            </xsl:when>
            <xsl:when test="following-sibling::*[1][name() = name(current())]">
                <xsl:text>"</xsl:text><xsl:value-of select="name()"/><xsl:text>" : [</xsl:text>
        <xsl:text>&#xa;</xsl:text>
                    <xsl:apply-templates select="." mode="obj-content" /><xsl:text>, </xsl:text>
     <xsl:text>&#xa;</xsl:text>
            </xsl:when>
            <xsl:when test="count(./child::*) > 0 or count(@*) > 0">
                <xsl:text>"</xsl:text><xsl:value-of select="name()"/>" : <xsl:apply-templates select="." mode="obj-content" />
                <xsl:if test="count(following-sibling::*) &gt; 0">, </xsl:if>
            </xsl:when>
            <xsl:when test="count(./child::*) = 0">
                <xsl:text>"</xsl:text><xsl:value-of select="name()"/>" : "<xsl:apply-templates select="."/><xsl:text>"</xsl:text>
                <xsl:if test="count(following-sibling::*) &gt; 0">, </xsl:if>
    <xsl:text>&#xa;</xsl:text>
            </xsl:when>
        </xsl:choose>
    </xsl:template>
    <xsl:template match="*" mode="obj-content">
     <xsl:text>&#xa;</xsl:text>
        <xsl:text>{</xsl:text>
  <xsl:text>&#xa;</xsl:text>
            <xsl:apply-templates select="@*" mode="attr" />
            <xsl:if test="count(@*) &gt; 0 and (count(child::*) &gt; 0 or text())">, </xsl:if>
            <xsl:apply-templates select="./*" mode="detect" />
            <xsl:if test="count(child::*) = 0 and text() and not(@*)">
                <xsl:text>"</xsl:text><xsl:value-of select="name()"/>" : "<xsl:value-of select="text()"/><xsl:text>"</xsl:text>
            </xsl:if>
            <xsl:if test="count(child::*) = 0 and text() and @*">
                <xsl:text>"text" : "</xsl:text><xsl:value-of select="text()"/><xsl:text>"</xsl:text>
            </xsl:if>
        <xsl:text>}</xsl:text>
        <xsl:if test="position() &lt; last()">, </xsl:if>
    </xsl:template>
    <xsl:template match="@*" mode="attr">
        <xsl:text>"</xsl:text><xsl:value-of select="name()"/>" : "<xsl:value-of select="."/><xsl:text>"</xsl:text>
        <xsl:if test="position() &lt; last()">,</xsl:if>
    </xsl:template>
    <xsl:template match="node/@TEXT | text()" name="removeBreaks">
        <xsl:param name="pText" select="normalize-space(.)"/>
        <xsl:choose>
            <xsl:when test="not(contains($pText, '&#xA;'))"><xsl:copy-of select="$pText"/></xsl:when>
            <xsl:otherwise>
                <xsl:value-of select="concat(substring-before($pText, '&#xD;&#xA;'), ' ')"/>
                <xsl:call-template name="removeBreaks">
                    <xsl:with-param name="pText" select="substring-after($pText, '&#xD;&#xA;')"/>
                </xsl:call-template>
            </xsl:otherwise>
        </xsl:choose>
    </xsl:template>
</xsl:stylesheet>

 

 

 

There are several options for delivering a user interface (UI) for DeepSee BI solutions. The most common approaches are:

  • use native DeepSee Dashboards: get a web UI in Zen and deliver it in your web apps.

  • use the DeepSee REST API: fetch your data and build your own UI widgets and dashboards.

The 1st approach is good because it lets you build BI dashboards relatively quickly without coding, but you are limited to the preset widget library, which is expandable only with a lot of development effort.

The 2nd gives you a way to use any comprehensive JS framework (D3, Highcharts, etc.) to visualize your DeepSee data, but you need to code the widgets and dashboards on your own.

Today I want to tell you about yet another approach, which combines both of those listed above and provides an Angular-based web UI for DeepSee Dashboards - the DeepSee Web library.


Note that apart from Export and Import options -

If you are using a %Installer manifest for your deployment (for any environment - test or prod), you can also include in that manifest the creation of security elements such as Resources, Roles, etc.

For example:

<Resource
    Name="%accounting_user" 
    Description="Accounting"
    Permission="RW"/>

And:

<Role 
    Name="%DB_USER"
    Description="Database user"
    Resources="MyResource:RW,MyResource1:RWU"
    RolesGranted=""/>

See more information here (in the docs).

[Defining a Role as part of a manifest is also included in an example in [@Eduard Lebedyuk]'s post here]
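For context, here is a minimal sketch of where such elements live (class and namespace names are made up; the setup() body is the standard %Installer generator boilerplate):

Class MyApp.Installer
{

XData Install [ XMLNamespace = INSTALLER ]
{
<Manifest>
  <Namespace Name="MYAPP" Create="no">
    <Resource Name="%accounting_user" Description="Accounting" Permission="RW"/>
    <Role Name="AccountingRole" Description="Accounting role" Resources="%accounting_user:RW"/>
  </Namespace>
</Manifest>
}

ClassMethod setup(ByRef pVars, pLogLevel As %Integer = 3, pInstaller As %Installer.Installer, pLogger As %Installer.AbstractLogger) As %Status [ CodeMode = objectgenerator, Internal ]
{
    // generates the installation code from the XData manifest above
    Quit ##class(%Installer.Manifest).%Generate(%compiledclass, %code, "Install")
}

}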

 

As well as our RE/* tools, please also consider Yuzinji.

In addition to the main UI that is illustrated in the short video here, the output from Yuzinji can also be browsed in a web app. An example is available at http://demo.georgejames.com:8080/s101g/tracker/home.html where you can get some insights into how a couple of codebases have changed over time. One comes from the InterSystems SAMPLES namespace, and the other is from [@Eduard Lebedyuk]'s RESTForms project.

Hi, Community!

If you do not know much about DeepSee technology, this video is exactly for you:

DeepSee Webinar

 


Hi, Community!

For those developers attending Global Summit 2017 this year: you have an opportunity to share your solutions, frameworks, and experience with the rest of the GS attendees and the Developer Community.

On Monday the 11th we will have Developer Community Sessions in the Tech Exchange Open House (see the agenda).


Hi, Community!

Hope you have already put a visit to InterSystems Global Summit 2017 in your schedule - it takes place on 10-13 September at the remarkable JW Marriott Desert Springs Resort and Spa.

This year we have the Experience Lab, the Unconference, and 50 more sessions covering performance, cloud, scalability, FHIR, high availability, and other solutions and best practices.


Hi, Vineeth!

See this sample from an @Eduard Lebedyuk post of exporting a global to a gzipped file on the fly and importing it back:

set ^dbg=123
; export the global into a gzip-compressed file via a stream
set s=##class(%Stream.FileBinaryGzip).%New()
do s.LinkToFile("1.xml")
do $System.OBJ.ExportToStream("dbg*.GBL", s)
do s.%Save()
kill  ; clear local variables, including the stream reference
kill ^dbg  ; remove the global so the import below can prove itself
; import the global back from the same file
set s=##class(%Stream.FileBinaryGzip).%New()
do s.LinkToFile("1.xml")
do $System.OBJ.LoadStream(s)
write ^dbg
>123

Hope that helps.

Hi, Ponnumani!

$$ is the way to call a label with parameters in .mac or .int code and get the result.

See your previous question and answers about that.

$$$ is the way to call a macro which was previously defined with the #define directive.

See the good article about macros by @Eduard Lebedyuk
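A minimal sketch of both (routine, label, and macro names are made up). A label call with $$ in a .mac routine:

demo ; demo routine
    write $$double(21),!   ; prints 42
    quit
double(x)
    quit x*2

And a macro call with $$$:

#define Double(%x) ((%x)*2)
    write $$$Double(21),!  ; also prints 42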

And mark the answer as "accepted" if it works for you ;)