@Eduard Lebedyuk There is one problem: the file gets successfully saved in S3, but when we try to retrieve the content, the data is damaged.

This works fine for text files, but if we upload a PDF or an image and then try to view that same file, what we get back in the stream is completely damaged; it is not the actual image or PDF which we had uploaded.


How can I get the original file (image/PDF) which I uploaded back as a response, instead of a stream?
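To illustrate what I mean, here is a rough sketch of how I would expect to stream the stored binary back, assuming a %CSP.REST handler (the class and property names are made up):

ClassMethod GetFile(id As %String) As %Status
{
    // MyApp.Document is a hypothetical persistent class holding the S3 content
    set doc = ##class(MyApp.Document).%OpenId(id)
    if '$isobject(doc) quit ..ReportHttpStatusCode(..#HTTP404NOTFOUND)
    // Send the original content type so the browser renders the PDF/image
    set %response.ContentType = doc.ContentType   // e.g. "application/pdf"
    do doc.Contents.Rewind()                      // Contents is a %Stream.GlobalBinary
    do doc.Contents.OutputToDevice()              // raw bytes, no character conversion
    quit $$$OK
}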


Hi Jose!

If your dashboards need the authentication for the access it's logical that it demands to enter login and password.

If you want embedded DSW iframe to use same login/password as another application on the same web page where the user already logged in, it's the matter of how to transfer the access to an embedded iframe.

I cannot answer at the moment how to manage this, and pinging @Eduard Lebedyuk: Eduard, do you think it is possible to transfer session to an MDX2JSON app of the embedded DSW iframe?

Hi Community!

New badges are already available on the Global Masters Advocacy Hub

We're happy to announce that this year we again introduced three annual badges on the Global Masters Advocacy Hub to recognize how much you contributed to the Developer Community in 2018. Here they are:

 DC Best-Selling Author 2018
 DC Expert 2018
 DC Opinion Leader 2018

Let's take a closer look at the DC Wall of Fame 2018 and greet everyone with big applause! 


Hi Community!

Please welcome a new video on the Developer Community YouTube Channel:

Continuous Delivery with Containers

 


Hi Community!

I'm pleased to announce that InterSystems Developer Community reached 5,000 registered members!

Thank you, developers, not only for registering but, more importantly, for making this place more and more helpful for everyone who develops and supports solutions on InterSystems Data Platforms all over the world! Big applause to all of us!


Hi, Stefan!

In this situation, I would suggest using a RELEASES approach for the different InterSystems Data Platforms. E.g. you can generate an XML release package from the code base in git and then replace all the version-specific places during the build/release phase. As a result you will have different XML packages for Ensemble and for InterSystems IRIS while keeping one code base, say in IRIS.

For example, you can see the approach @Eduard Lebedyuk used for the RESTForms distribution. There are 2016.1 and 2016.2 release packages which can be installed on 2016.1 and 2016.2+ Caché or Ensemble versions respectively.

To simplify release building I can recommend the ISC.DEV utility, which can export a release or patch file (based on git commits) for a given mask of classes.
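For illustration, a minimal sketch of producing per-version XML packages with $System.OBJ.Export (the class mask and file names are made up):

// Export all classes matching a mask into one XML release package per platform;
// version-specific substitutions would happen between export and packaging.
do $System.OBJ.Export("MyApp.*.CLS", "/releases/myapp-2016.1.xml")
do $System.OBJ.Export("MyApp.*.CLS", "/releases/myapp-2016.2.xml")
// On the target instance, a package is then installed with:
do $System.OBJ.Load("/releases/myapp-2016.1.xml", "ck")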

HTH

Hi Community!

This year we will have a special section of Flash Talks which gives you an opportunity to introduce your tool or solution at InterSystems Global Summit 2018!



What is Flash Talks? 

It's a 15-minute session on the Technology Exchange stage: 10 minutes for your pitch and 5 minutes for Q&A.

The session WILL BE live streamed on the Developer Community YouTube Channel.

Developer Community Flash Talks!

Today, 10/02, Flash Talks Stage @ InterSystems Global Summit 2018!

2:00 pm Open source approaches to work with Documents by @Eduard Lebedyuk, InterSystems

2:15 pm InterSystems IRIS on Kubernetes by @Dmitry Maslennikov

2:30 pm Visual Studio Code IDE for InterSystems Data Platforms by @John Murray, George James Software

2:45 pm Static Analysis for ObjectScript with CacheQuality by @Daniel Tamajon, Lite Solutions

3:00 pm InterSystems Open Exchange by @Evgeny Shvarov, InterSystems

3:15 pm Q&A Session on Developer Community, Global Masters, and Open Exchange


A few days left before Global Summit 2018!

And I can announce yet another presenter: @Eduard Lebedyuk, InterSystems.

Title: Open source approaches to work with Documents.

And we have the day and time!

Find the Developer Community Flash Talks session "Share Your InterSystems IRIS Solution!" on Tuesday, the 2nd of October, on the Flash Talks Stage from 2:00 pm to 3:30 pm.

Thank you, John!

Actually, I don't know of a case where we would need to change the passwords of ALL users to a single specified one.

So, don't run this code on your production system.

@Eduard Lebedyuk is there any realistic case where this snippet could be useful?

This code snippet uses %ZEN.Auxiliary.jsonSQLProvider. The namespace and the SQL string can be edited to fit different situations. The class method "test" runs the code:


Class eduardlebedyuk.passQuestionParams
{

ClassMethod test(pValue = 50)
{
    set ns = $Namespace
    zn "samples"                // switch to the SAMPLES namespace
    set tSQL = "SELECT ID, Name FROM Sample.Person WHERE Id > ?"
    set tPR = ##class(%ZEN.Auxiliary.jsonSQLProvider).%New()
    set tPR.sql = tSQL
    set tPR.%Format = "tw"
    set tPR.maxRows = 100

    // bind pValue to the "?" placeholder in the query
    set tParam = ##class(%ZEN.Auxiliary.parameter).%New()
    set tParam.value = pValue
    do tPR.parameters.SetAt(tParam,1)

    do tPR.%DrawJSON()
    //do ##class(%ZEN.Auxiliary.jsonSQLProvider).%WriteJSONFromSQL(,,,,,tPR)  //same thing
    zn ns                       // switch back to the original namespace
}

}

(Originally posted to InterSystems CODE by @Eduard Lebedyuk, 5/13/15)

Here's a link to the code on GitHub


(Originally posted on InterSystems CODE by @Eduard Lebedyuk, 10/12/15) The following code snippet outputs all filenames in the directory "dir" to the Caché/IRIS terminal. The class method "test" runs the code:


Class eduardlebedyuk.filenamesInDir Extends %RegisteredObject
{
	classmethod test() {
		// replace dir with file path you want
		set dir = "D:\directory" 
		set dir = ##class(%File).NormalizeDirectory(dir)
		set file=$ZSEARCH(dir_"*")
		while file'="" {
			write !,file
			set file=$ZSEARCH("")
		}
	}
}


@Eduard Lebedyuk I have tried your suggestions and still get the error when opening the file with Adobe Reader. If I try a simple pass-through operation like the one below, everything works fine.

ERROR: Adobe acrobat reader could not open file.pdf because it is either not a supported file type or because the file has been damaged.

The service:

Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
#dim pt as TestingEnvironment.ECGTrace.TEST.FSMREQ=##class(TestingEnvironment.ECGTrace.TEST.FSMREQ).%New()
  Set tSource=pInput.Attributes("Filename"), pInput=$zobjclassmethod(..#CONTAINERCLASS,"%New",pInput)
 Set tSC=..resolveAndIndex(pInput) Quit:$$$ISERR(tSC) tSC
 set pt.filestream=pInput
 set pt.path=tSource
 Set tWorkArchive=(""'=..Adapter.ArchivePath)&&(..Adapter.ArchivePath=..Adapter.WorkPath || (""=..Adapter.WorkPath && (..Adapter.ArchivePath=..Adapter.FilePath)))
 $$$SyncCommitSet(tSyncCommit)
 For iTarget=1:1:$L(..TargetConfigNames, ",") { Set tOneTarget=$ZStrip($P(..TargetConfigNames,",",iTarget),"<>W")  Continue:""=tOneTarget
  $$$sysTRACE("Sending input Stream "_pInput.Stream_"("_pInput.Stream.Size_")"_$S(tWorkArchive:" Async",1:" Sync")_" from '"_tSource_"' to '"_tOneTarget_"'")
  If tWorkArchive {
   Set tSC1=..SendRequestAsync(tOneTarget,pt)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
   //Set tSC1=..SendRequestAsync(tOneTarget,pInput)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
  } Else {
   #; If not archiving send Sync to avoid Adapter deleting file before Operation gets it
   //Set tSC1=..SendRequestSync(tOneTarget,pInput)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
   Set tSC1=..SendRequestSync(tOneTarget,pt)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
  }
 }
 $$$SyncCommitClear(tSyncCommit)
 Quit tSC
}

The operation:

Method OnMessage(pREs As TestingEnvironment.ECGTrace.TEST.FSMREQ, pRequest As Ens.StreamContainer, Output pResponse As %Persistent) As %Status
{
 set pRequest=pREs.filestream
 Quit:'$IsObject(pRequest.Stream) $$$ERROR($$$EnsErrGeneral,"No Stream contained in StreamContainer Request")
 Set tFilename=..Adapter.CreateTimestamp(##class(%File).GetFilename(pRequest.OriginalFilename),..Filename)
 Set tSC=..Adapter.PutStream(tFilename, pRequest.Stream)
 Do pRequest.%Save() ; re-save in case PutStream() optimization changed the Stream filename
 Quit tSC
}

My code (the service):

Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
 #dim meta as DocumentUpload.GenericUploadMREQ=##class(DocumentUpload.GenericUploadMREQ).%New()
 
   ;get the filepath from the request 
   ;;wrap the stream object into container for easy transpotation
    Set tFileName=pInput.Attributes("Filename") , pInput=$zobjclassmethod(..#CONTAINERCLASS,"%New",pInput)
 $$$TRACE(tFileName) 
 ;get the file name
    set dataPiece=##class(%File).GetFilename(tFileName)
 $$$TRACE(dataPiece)
      Set tSC=..resolveAndIndex(pInput) Quit:$$$ISERR(tSC) tSC
      ;check if the file path data is populated
 if (dataPiece'="")
    {
    ;build the ECG Message
          set meta.ClientID =$Piece(dataPiece,"_",1) 
          set meta.LastName=$Piece(dataPiece,"_",2)
          set meta.FirstName=$Piece(dataPiece,"_",3)
          set meta.DateOfBirth=$Piece(dataPiece,"_",4)
          set meta.Directory =$Piece(tFileName,"\",*-1)
          set meta.OGFileName  =dataPiece
          set meta.Fullpath  =tFileName
          ;get the date to testing
          set DateOfTest=$Piece(dataPiece,"_",5)
          ;get the time of testing
          set mtim=$Piece(dataPiece,"_",6)
          ;separate the extension of the file path and the time
          set TimeOfTest=$Piece(mtim,".",1)
          set meta.FileTimeStamp =DateOfTest_""_TimeOfTest
          set meta.recordAdded =$ZDT($ZTIMESTAMP,3,1,3)
          set meta.TargetConfig ="RIO.DocumentUpload.RiOFileOPRN"
          set meta.payLoad=pInput
          set meta.DocumentType=..DocumentType
       set meta.Description=..Description
       set meta.Title=..Title
       set meta.FinalRevision=..Revesion
       set meta.Author=..Author_""_$Piece(tFileName,"\",*-1)
       set messagetype=$PIECE(..Author," ",1)
       set meta.UserId=..UserID
          set meta.sourceConfig =tFileName
          set meta.TypeMes=messagetype
         
         
          Set tWorkArchive=(""'=..Adapter.ArchivePath)&&(..Adapter.ArchivePath=..Adapter.WorkPath || (""=..Adapter.WorkPath && (..Adapter.ArchivePath=..Adapter.FilePath)))
 $$$SyncCommitSet(tSyncCommit)
 For iTarget=1:1:$L(..TargetConfigNames, ",") { Set tOneTarget=$ZStrip($P(..TargetConfigNames,",",iTarget),"<>W")  Continue:""=tOneTarget
  $$$sysTRACE("Sending input Stream "_pInput.Stream_"("_pInput.Stream.Size_")"_$S(tWorkArchive:" Async",1:" Sync")_" from '"_tFileName_"' to '"_tOneTarget_"'")
  If tWorkArchive {
   Set tSC1=..SendRequestAsync(tOneTarget,meta)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
   
  } Else {
   #; If not archiving send Sync to avoid Adapter deleting file before Operation gets it
   Set tSC1=..SendRequestSync(tOneTarget,meta)  Set:$$$ISERR(tSC1) tSC=$$$ADDSC(tSC,tSC1)
  }
 }
 $$$SyncCommitClear(tSyncCommit)
         
         
       }
            quit tSC
}

The operation (please note I have a router in between which simply transforms the message into the message expected by the operation; it's just a simple mapping scenario):

Method WriteOutFiles(pRequest As DocumentUpload.FileMREQ, pInput As Ens.StreamContainer, Output pResponse As DocumentUpload.GenericRESP) As %Status
{
 set pInput=pRequest.FileStream
     ;the variable to hold the status for the method
 #dim status as %Status=$$$OK
 ;clear the pResponse
  kill pResponse
     set pResponse=$$$NULLOREF
 ;set the file name to the sequence number
  set ..Filename=pRequest.NewFileName
 
  ;the filepath set on the settings of this OPERATION
   set origDirectory = ..Adapter.FilePath
  ;the file directory to drop the file
  set ..Adapter.FilePath = ..Adapter.FilePath_"\"_..StubDirectory
  
 ;start writing out file data to the stub file
    set:$$$ISOK(status) status= ..Adapter.PutLine(..Filename_"."_..stubExtension,
  $CHAR(34)_ pRequest.ClientID_$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.UserId_$CHAR(34)_$CHAR(44)_$CHAR(34)_
  pRequest.DocumentType _$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.Title_$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.Description_$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.Author _$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.DocumentDate_$CHAR(34)_$CHAR(44)_$CHAR(34)_pRequest.FinalRevision_$CHAR(34))
 
 if ($$$ISOK(status))
 {
  ;set back to the operation settings
  set ..Adapter.FilePath = origDirectory
    ;set the file name  to write out to
  set ..Filename=pRequest.NewFileName_"."_..DocExtension
  ;set the filepath on the production settings to this variable
  set origDirectory = ..Adapter.FilePath
  ;set the new filepath
  set ..Adapter.FilePath = ..Adapter.FilePath_"\"_..DocumentDirectory
     
 
 // set:$$$ISOK(status) status=  ..Adapter.PutStream(..Filename, pInput.Stream) //error here to file
  set:$$$ISOK(status) status=..OriginalFileOut(pInput,pResponse,..Filename)
 
      ;set adapter to its original file path
   set ..Adapter.FilePath = origDirectory
  
   ;check writing out file worked
   if ($$$ISOK(status))
 {
  set pResponse=##class(DocumentUpload.GenericRESP).%New()
  set pResponse.Process="FileOPRN_files Written to their respective directories"
  set pResponse.Status=status
  set status=pResponse.%Save()
 }
  
 }
 
 ;return status
 return status
}

// passthrough original method to write out file

Method OriginalFileOut(pRequest As Ens.StreamContainer, Output pResponse As %Persistent, filenamess) As %Status
{
 
 Quit:'$IsObject(pRequest.Stream) $$$ERROR($$$EnsErrGeneral,"No Stream contained in StreamContainer Request")
 Set tFilename=..Adapter.CreateTimestamp(##class(%File).GetFilename(pRequest.OriginalFilename),filenamess)
 Set tSC=..Adapter.PutStream(tFilename, pRequest.Stream)
 Do pRequest.%Save() ; re-save in case PutStream() optimization changed the Stream filename
 Quit tSC
}

@Eduard Lebedyuk I looked at the EnsLib.RecordMap.Service.BatchFileService class and the EnsLib.RecordMap.Service.BatchStandard class; apparently these are fine, but for some reason the custom class acts as if it goes into a loop. I have put some traces in my code and tried to capture the status and change it to a failure if a certain validation fails, but that makes the whole service log an error after successfully delivering the first message.

Here are the changes I made, thanks for your help:

ClassMethod GetBatchHeader(pIOStream As %IO.DeviceStream, pTimeout As %Numeric = -1, Output pBatch As EnsLib.RecordMap.SimpleBatch, ByRef pLookAhead As %String) As %Status
{
    $$$TRACE("Begin")
 Try {
  #dim cpStatus as %Status=$$$OK
    Set tStatus = $$$OK
    Set pBatch = ""
    Set tTerm = ..GetHeaderTerm()
    //Set tFullHeader = 63 _ tTerm
    Set tHeaderLen =63+$length(tTerm)
    set stage=""
    If tHeaderLen
       {
       Set tFound = 0
       $$$TRACE("0")
       Set tLeadingJunk = ""
       Set pLookAhead = $get(pLookAhead)
       Set tTimeout = pTimeout
       Set tEndTime = $zhorolog + pTimeout
      // While ('tFound) && ('pIOStream.AtEnd)
        //  {
          Set tReadLen = tHeaderLen - $length(pLookAhead)
          If tReadLen > 0
             {
             Set tData = pLookAhead _ pIOStream.Read(tReadLen, .tTimeout, .tStatus)
             $$$TRACE("1")
             If $$$ISERR(tStatus) Quit
             If tTimeout
                {
                Set tStatus = $$$ERROR($$$EnsErrTCPReadTimeoutExpired, pTimeout, tReadLen)
                Quit
                }
                  Set pLookAhead = ""
             }
             Else
             {
             $$$TRACE("Else treadlen less than 0-[2]")
             Set tData = $extract(pLookAhead, 1, tHeaderLen)
             Set pLookAhead = $extract(pLookAhead, tHeaderLen + 1, *)
             }
             If ($extract(tData,1,3) = "001" )
                 {
                Set pBatch = ..%New()
                set pBatch.BatchHeader = tData
                Set tFound = 1
                $$$TRACE("Foundlee[3]"_tFound)
                Quit
                 }
                 Else
                 {
                $$$TRACE("else $extract(tData1,3)Failed[3]")
                Set pLookAhead = pLookAhead _ tData
                 #; Check if we should start discarding leading data
                If ($length(pLookAhead) >= tHeaderLen)
                   {
                   If ($length(tLeadingJunk) < 400)
                      {
                     Set tLeadingJunk = tLeadingJunk _ $extract(pLookAhead,1)
                      }
                      Set pLookAhead = $extract(pLookAhead,2,*)
                   }
                   set cpStatus=0
                   set stage="Extractfailed" //check here
                  // Quit
                  //Continue
     
                 }
                 If (pTimeout = -1)
                    {
                    Set tTimeout = -1
                    }
                    Else
                    {
                     $$$TRACE("time out ok [4]")
                    Set tCurrTime = $zhorolog
                    If (tCurrTime > tEndTime)
                       {
                      Set tStatus = $$$ERROR($$$EnsErrTCPReadTimeoutExpired, pTimeout, tReadLen)
                      Quit
                       }
                       Set tTimeout = tEndTime - tCurrTime
                   }
                   If $$$ISERR(tStatus) Quit
         // } //while
         
          $$$TRACE("while end"_tFound)
         
          If $$$ISERR(tStatus) Quit
          #; Clear the lookahead buffer if we didn't find the batch header
          $$$TRACE("###Status###[5]"_tStatus)
          If (('tFound) && ($length(tLeadingJunk) < 400)&&('pLookAhead="")&&($$$ISERR(cpStatus)))
             {
             Set tLeadingJunk = tLeadingJunk _ $get(pLookAhead)
             Set pLookAhead = ""
             $$$TRACE("###Not Found###[6]"_tFound)
             }
            If (tLeadingJunk '= "") && ('..#IgnoreLeadingData)
               {
               #; Use JS escaping to handle control characters
               Set tLoggedJunk = $zconvert($extract(tLeadingJunk,1,400),"O","JS") _ $select($length(tLeadingJunk) > 400: "...", 1: "")
               $$$LOGWARNING($$$FormatText($$$Text("[]Discarding unexpected leading data: '%1'","Ensemble"),tLoggedJunk))
               }
               If (('tFound)&&($$$ISERR(cpStatus)))
                  {
                 Set pBatch = ""
                 $$$TRACE("Not Found[8]")
                 //Set tStatus = $$$ERROR($$$EnsRecordMapErrBatchHeaderNotFound,$classname($this))
                 if (stage="Extractfailed" &&($$$ISERR(cpStatus)))
                          {
                          
                        
                         set tStatus=cpStatus
                        
                         $$$LOGWARNING($classname($this)_":: Empty Batch Discarded")
                        
                          set tStatus=$$$OK
                          }
                          else
                          {
                            // Set tStatus = $$$EnsSystemError
                            Set tStatus = $$$ERROR($$$EnsRecordMapErrBatchHeaderNotFound,$classname($this))
                             }
                
                 Quit
                  }
     }
     Else
     {
   Set pBatch = ..%New()
   Set pLookAhead = $get(pLookAhead)
   Quit
     }
     }
 Catch ex
    {
      $$$TRACE("in catch]")
    Set tStatus = $$$EnsSystemError
      quit
      }
      
 Quit tStatus
}

Since this is made up of three classes, how do I control the status to send with the call within the block? Have a look at where I set the new status (cpStatus); I would like to quit and pass the new status.

Hi, Community!

Continuous Delivery is a software engineering approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time. It aims at building, testing, and releasing software faster and more frequently. The approach helps reduce the cost, time, and risk of delivering changes by allowing for more incremental updates to applications in production.

Join us at 07:00 UTC on April 24th for a webinar with a live demo, "Git flows and Continuous Delivery", by @Eduard Lebedyuk.

The language of the webinar is Russian.

Also, see the related articles on DC.


Sorry to be late. I was busy today:

Thanks to  @Eduard Lebedyuk

I did it in a traditional one-liner: 79 → 77 chars, + 4 chars extra to read the size.

I'll wrap it into a method later to see how much waste of space this generates.

5 min later:
OK. The method consumes 3 chars extra ({} to enclose it, plus a blank at the start) =>>> 82 → 80, ~3.9% overhead.

Sizes 1...4 look odd, but it improves from there.

f j=1:1:15 zw j d ##class(DC.size).main(j) 

j=1
#
j=2
##
##
j=3
###
###
###
j=4
####
####
####
####
j=5
#####
## ##
# # #
## ##
#####
j=6
######
##  ##
# ## #
# ## #
##  ##
######
j=7
#######
##   ##
# # # #
#  #  #
# # # #
##   ##
#######
j=8
########
##    ##
# #  # #
#  ##  #
#  ##  #
# #  # #
##    ##
########

Hi Everyone!

New webinar "Rest API Design and Development" is available now on  DC YouTube Channel:

 


2) It's an OK approach for relatively small external source tables, because in this case you need to build the cube for all the records and have no option to update/sync.

If you are OK with the timings of cube building, you are very welcome to use the approach.

If the build time is significant for the application, consider using the approach @Eduard Lebedyuk has already advised: import only new records from the external database (a hash method, a sophisticated query, or some other bright idea) and then sync the cube, which will perform faster than rebuilding it.
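For the syncing step, a minimal sketch (the cube name is made up, and cube synchronization requires the DSTIME parameter to be enabled on the source class):

// After importing only the new/changed rows into the source (fact) class,
// synchronize the cube instead of rebuilding it:
set tSC = ##class(%DeepSee.Utils).%SynchronizeCube("PatientsCube", 1, .tFactsUpdated)
if $$$ISERR(tSC) do $System.Status.DisplayError(tSC)
write "Facts updated: ", tFactsUpdated, !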

Hi, Max!

I think you have two questions here.

1. How to import data into a Caché class from another DBMS.

2. How to update the cube which uses an imported table as a fact table.

About the second question: I believe you can introduce a field in the imported table with a hash code of the record's fields and import only new rows, so the DeepSee cube update will work automatically in this case. Inviting @Eduard Lebedyuk to describe the technique in detail.

Regarding the 1st question: I didn't get your problem, but you can test a linked table via the SQL Gateway UI in the Control Panel.

Containers were invented for it. 

The idea is that you have a git repo with a docker container definition in it, and a new developer can just check out the repo and call "docker build" or even "docker-compose up" to get the fully prepared environment: one server in a container, two for client-server apps, and more for complex solutions.

See @Luca Ravazzolo's posts (one, two) on containers with InterSystems. Also, see @Eduard Lebedyuk's series on continuous delivery with GitLab; it is a very related topic.

Hi, Community!

This year we again introduced three annual badges on the Global Masters Advocacy Hub to recognize how much you contributed to the Developer Community in 2017. Here they are:

  • DC Best-Selling Author 2017
  • DC Expert 2017
  • DC Opinion Leader 2017


In part of this post in 2016 @Eduard Lebedyuk asked if anyone knew what is meant by an "expanded class", as referred to in the text that appears when we run the ShowQualifiers classmethod of %SYSTEM.OBJ thus:

SAMPLES>DO $system.OBJ.ShowQualifiers()
...
            Name: /checkuptodate
    Description: Skip classes or expanded classes that are up-to-date.


And to start: for me the most helpful article this year was REST FORMS Queries. Yes, I'm using REST FORMS a lot, thanks [@Eduard Lebedyuk]!

Another is Search InterSystems documentation using iKnow and iFind technologies

Three helpful questions were mine (of course ;):

How to find duplicates in a large text field

and Storage Schema in VCS: to Store Or Not to Store?

and How to get the measure for the last day in a month in DeepSee

Thanks for the advice. I have changed my code but have come to a problem while reading my stream, as there seems to be an error with the XSLT: XML Transformer Error: SAXParseException. I do not seem to see what the problem is, as I tested my XSLT with the XSLT Transform Wizard and it all worked fine. Could you please have a look at the XML and the XSLT and see if you can spot where I am going wrong? Thanks in advance.

XML

 

<Message><Header><Code>HOT</Code><Date>2017-12-08 11:22:34.658</Date></Header><Body><Code>HOT</Code><Name>SIDE</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>F</Nder><TBC>21</TBC><BO>14</BO><DBOC>0</DBOC><LBC>5</LBC><AB>2</AB></Body><Body><Code>HOT</Code><Name>DARS</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>OADULT</ardge><Nder>U</Nder><TBC>20</TBC><BO>16</BO><DBOC>0</DBOC><LBC>2</LBC><AB>2</AB></Body><Body><Code>HOT</Code><Name>ENTLE</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>M</Nder><TBC>22</TBC><BO>18</BO><DBOC>0</DBOC><LBC>3</LBC><AB>1</AB></Body><Body><Code>HOT</Code><Name>ROOXED</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>OADULT</ardge><Nder>U</Nder><TBC>17</TBC><BO>7</BO><DBOC>0</DBOC><LBC/><AB>9</AB></Body><Body><Code>HOT</Code><Name>DUK</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>F</Nder><TBC>20</TBC><BO>17</BO><DBOC>0</DBOC><LBC>1</LBC><AB>2</AB></Body><Body><Code>HOT</Code><Name>DALE</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>M</Nder><TBC>18</TBC><BO>15</BO><DBOC>0</DBOC><LBC>2</LBC><AB/></Body><Body><Code>HOT</Code><Name>DAN</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>OADULT</ardge><Nder>U</Nder><TBC>16</TBC><BO>8</BO><DBOC>0</DBOC><LBC/><AB>8</AB></Body><Body><Code>HOT</Code><Name>DUN</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>U</Nder><TBC>2</TBC><BO>0</BO><DBOC>0</DBOC><LBC>1</LBC><AB>1</AB></Body><Body><Code>HOT</Code><Name>DUW</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>U</Nder><TBC>3</TBC><BO>2</BO><DBOC>0</DBOC><LBC>1</LBC><AB/></Body><Body><Code>HOT</Code><Name>DUM</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>OADULT</ardge><Nder>U</Nder><TBC>22</TBC><BO>18</BO><DBOC>0</DBOC><LBC>2</LBC><AB>2</AB></Body><Body><Code>HOT</Code><Name>DUR</Name><Type>AADULT</Type><Sec>MSec</Sec><ardge>Adult</ardge><Nder>U</Nder><TBC>16</TBC><BO>13</BO><DBOC>0</DBOC><LBC>2</LBC><AB/></Body></Message>

 

XSLT

 

<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <xsl:output method="text" indent="yes"/>
    <xsl:template match="/*[node()]">
        <xsl:text>{</xsl:text><xsl:text>&#xa;</xsl:text>
        <xsl:apply-templates select="." mode="detect" />
  <xsl:text>&#xa;</xsl:text>
        <xsl:text>}</xsl:text>
    </xsl:template>
    <xsl:template match="*" mode="detect">
        <xsl:choose>
            <xsl:when test="name(preceding-sibling::*[1]) = name(current()) and name(following-sibling::*[1]) != name(current())">
                    <xsl:apply-templates select="." mode="obj-content" />
     <xsl:text>&#xa;</xsl:text>
                <xsl:text>]</xsl:text>
                <xsl:if test="count(following-sibling::*[name() != name(current())]) &gt; 0">, </xsl:if>
            </xsl:when>
            <xsl:when test="name(preceding-sibling::*[1]) = name(current())">
                    <xsl:apply-templates select="." mode="obj-content" />
                    <xsl:if test="name(following-sibling::*) = name(current())">, </xsl:if>
            </xsl:when>
            <xsl:when test="following-sibling::*[1][name() = name(current())]">
                <xsl:text>"</xsl:text><xsl:value-of select="name()"/><xsl:text>" : [</xsl:text>
        <xsl:text>&#xa;</xsl:text>
                    <xsl:apply-templates select="." mode="obj-content" /><xsl:text>, </xsl:text>
     <xsl:text>&#xa;</xsl:text>
            </xsl:when>
            <xsl:when test="count(./child::*) > 0 or count(@*) > 0">
                <xsl:text>"</xsl:text><xsl:value-of select="name()"/>" : <xsl:apply-templates select="." mode="obj-content" />
                <xsl:if test="count(following-sibling::*) &gt; 0">, </xsl:if>
            </xsl:when>
            <xsl:when test="count(./child::*) = 0">
                <xsl:text>"</xsl:text><xsl:value-of select="name()"/>" : "<xsl:apply-templates select="."/><xsl:text>"</xsl:text>
                <xsl:if test="count(following-sibling::*) &gt; 0">, </xsl:if>
    <xsl:text>&#xa;</xsl:text>
            </xsl:when>
        </xsl:choose>
    </xsl:template>
    <xsl:template match="*" mode="obj-content">
     <xsl:text>&#xa;</xsl:text>
        <xsl:text>{</xsl:text>
  <xsl:text>&#xa;</xsl:text>
            <xsl:apply-templates select="@*" mode="attr" />
            <xsl:if test="count(@*) &gt; 0 and (count(child::*) &gt; 0 or text())">, </xsl:if>
            <xsl:apply-templates select="./*" mode="detect" />
            <xsl:if test="count(child::*) = 0 and text() and not(@*)">
                <xsl:text>"</xsl:text><xsl:value-of select="name()"/>" : "<xsl:value-of select="text()"/><xsl:text>"</xsl:text>
            </xsl:if>
            <xsl:if test="count(child::*) = 0 and text() and @*">
                <xsl:text>"text" : "</xsl:text><xsl:value-of select="text()"/><xsl:text>"</xsl:text>
            </xsl:if>
        <xsl:text>}</xsl:text>
        <xsl:if test="position() &lt; last()">, </xsl:if>
    </xsl:template>
    <xsl:template match="@*" mode="attr">
        <xsl:text>"</xsl:text><xsl:value-of select="name()"/>" : "<xsl:value-of select="."/><xsl:text>"</xsl:text>
        <xsl:if test="position() &lt; last()">,</xsl:if>
    </xsl:template>
    <xsl:template match="node/@TEXT | text()" name="removeBreaks">
        <xsl:param name="pText" select="normalize-space(.)"/>
        <xsl:choose>
            <xsl:when test="not(contains($pText, '&#xA;'))"><xsl:copy-of select="$pText"/></xsl:when>
            <xsl:otherwise>
                <xsl:value-of select="concat(substring-before($pText, '&#xD;&#xA;'), ' ')"/>
                <xsl:call-template name="removeBreaks">
                    <xsl:with-param name="pText" select="substring-after($pText, '&#xD;&#xA;')"/>
                </xsl:call-template>
            </xsl:otherwise>
        </xsl:choose>
    </xsl:template>
</xsl:stylesheet>
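For reference, a minimal sketch of reproducing the transform in the terminal with %XML.XSLT.Transformer, assuming the XML and XSLT above are saved to files (the paths are illustrative):

// Load the source XML and the stylesheet from files, then transform
set xml = ##class(%Stream.FileCharacter).%New()
do xml.LinkToFile("C:\temp\message.xml")
set xsl = ##class(%Stream.FileCharacter).%New()
do xsl.LinkToFile("C:\temp\transform.xsl")
set tSC = ##class(%XML.XSLT.Transformer).TransformStream(xml, xsl, .out)
if $$$ISERR(tSC) do $System.Status.DisplayError(tSC)
if $$$ISOK(tSC) do out.OutputToDevice()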

 

 

 

There are several options for how to deliver a user interface (UI) for DeepSee BI solutions. The most common approaches are:

  • use native DeepSee Dashboards: you get a web UI in Zen and deliver it in your web apps.

  • use the DeepSee REST API to fetch the data and build your own UI widgets and dashboards.

The 1st approach is good because you can build BI dashboards relatively fast without coding, but you are limited to the preset widget library, which is expandable only with a lot of development effort.

The 2nd gives you the way to use any comprehensive JS framework (D3, Highcharts, etc.) to visualize your DeepSee data, but you need to code the widgets and dashboards on your own; an example call is sketched below.
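A minimal sketch of that 2nd approach from ObjectScript: running an MDX query through the DeepSee REST API with %Net.HttpRequest. The port, namespace, endpoint path, and MDX text are assumptions for illustration.

// Run an MDX query via the DeepSee REST API (endpoint and payload assumed)
set req = ##class(%Net.HttpRequest).%New()
set req.Server = "localhost", req.Port = 57772
set req.ContentType = "application/json"
do req.EntityBody.Write("{""MDX"":""SELECT FROM [PATIENTS]""}")
set tSC = req.Post("/api/deepsee/v1/SAMPLES/Data/MDXExecute")
if $$$ISOK(tSC) write req.HttpResponse.Data.Read(32000)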

Today I want to tell you about yet another approach which combines both of those listed above and provides an Angular-based web UI for DeepSee Dashboards: the DeepSee Web library.


Note that apart from the Export and Import options:

If you are using a %Installer manifest for your deployment (for any environment, test or prod), you can also include in that manifest the creation of security elements such as Resources and Roles, etc.

For example:

<Resource
    Name="%accounting_user" 
    Description="Accounting"
    Permission="RW"/>

And:

<Role 
    Name="%DB_USER"
    Description="Database user"
    Resources="MyResource:RW,MyResource1:RWU"
    RolesGranted=""/>

See more information here (in the docs).
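For context, a minimal sketch of a manifest class such elements could live in (the class name is made up; the pattern follows the %Installer documentation):

Class MyApp.Installer
{

XData Install [ XMLNamespace = INSTALLER ]
{
<Manifest>
  <Resource Name="%accounting_user" Description="Accounting" Permission="RW"/>
  <Role Name="%DB_USER" Description="Database user" Resources="MyResource:RW,MyResource1:RWU"/>
</Manifest>
}

/// Entry point; the method body is generated from the XData block above.
ClassMethod setup(ByRef pVars, pLogLevel As %Integer = 3) As %Status [ CodeMode = objectgenerator, Internal ]
{
    quit ##class(%Installer.Manifest).%Generate(%compiledclass, %code, "Install")
}

}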

[Defining a Role as part of a manifest is also included in an example in [@Eduard Lebedyuk]'s post here]

 

As well as our RE/* tools, please also consider Yuzinji.

In addition to the main UI that is illustrated in the short video here, the output from Yuzinji can also be browsed in a web app. An example is available at http://demo.georgejames.com:8080/s101g/tracker/home.html where you can get some insights into how a couple of codebases have changed over time. One comes from the InterSystems SAMPLES namespace, and the other is from [@Eduard Lebedyuk]'s RESTForms project.

Hi, Community!

If you do not know much about DeepSee technology, this video is exactly for you:

DeepSee Webinar

 


Hi, Community!

Hope you have already put the visit to InterSystems Global Summit 2017 in your schedule; it will take place on 10-13 September at the remarkable JW Marriott Desert Springs Resort and Spa.

This year we have the Experience Lab, The Unconference, and 50 more sessions regarding performance, cloud, scalability, FHIR, high availability, and other solutions and best practices.


Hi, Vineeth!

See this sample of exporting and importing a global to a gzip file on the fly, from an @Eduard Lebedyuk post:

set ^dbg=123
set s=##class(%Stream.FileBinaryGzip).%New()
do s.LinkToFile("1.xml")
do $System.OBJ.ExportToStream("dbg*.GBL", s)
do s.%Save()
kill
kill ^dbg
set s=##class(%Stream.FileBinaryGzip).%New()
do s.LinkToFile("1.xml")
do $System.OBJ.LoadStream(s)
write ^dbg
>123

Hope that helps.

Hi, Ponnumani!

$$ is a way to call a label with parameters in MAC or INT code and get the result.

See your previous question and answers about that.

$$$ is the way to call a macro which was previously defined with the #define directive.
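For illustration, a minimal sketch showing both in one hypothetical MAC routine:

    ; demo.mac, a made-up routine showing $$ vs $$$
#define Square(%x) ((%x)*(%x))
demo
    write $$add(2,3),!      ; $$ calls the "add" label below and prints 5
    write $$$Square(4),!    ; $$$ expands the Square macro at compile time and prints 16
    quit
add(a,b)
    quit a+b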

See the good article about macros by @Eduard Lebedyuk

And mark this answer as "accepted" if it works for you ;)