Stefan Cronje · Feb 10, 2023

An installer manifest would be helpful.

Also, maybe add clearer installation instructions? For example: how to clone the git repo, what to import, and the compile order, if any.

The root package name being "Demo" does not distinguish the "author/provider". My boss would not want to see something in the Live environment with the name of Demo.

Stefan Cronje · Feb 9, 2023

Good question. I have had this requirement a few times already, but created a BPL instance every time.

This becomes an effort when exposing services for different API versions: you build your BPL and logic to be compatible with the latest, and all you need to do is transform the old version to the new one. But you want that to happen somewhere "between" the service and the BPL.

The same goes for message versioning on Business Operations. Some systems you work with use different versions of the same API.

I have started adding APIVersion properties to the services and operations, but then I still need to write a lot of code to cater for each version of the message, or an additional BPL for every transformation.
Also, I need a Lookup to map versions to Business Host names.

It would be great if the following was possible:

  • Ens.Request and Ens.Response had a MessageVersion parameter or something similar.
  • The Business host has a MessageVersion property as a SETTING
  • The transformation could be handled by the Business Host class in some way: get the MessageVersion value of the message and check the MessageVersion of the TargetHost. If they differ, it invokes a transformation class to transform the message. This class will have to have some rules in it to convert between different versions of the messages.
  • On the Response message the same principle should apply. Some operations work on older versions of an API.
  • There will of course have to be a transformation class configuration of some sort per business host for the request and responses.

There may even be much simpler solutions than this. But I agree with the question asked.
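To illustrate the idea: none of these properties or classes exist in the product today, so MessageVersion, TargetVersion, and the transform-class naming convention below are all hypothetical, but the dispatch inside a business host could be sketched like this:

```
/// Hypothetical sketch only: MessageVersion, TargetVersion and the
/// transform-class naming convention do not exist in Ens.* today.
Class MyPackage.VersionAwareOperation Extends Ens.BusinessOperation
{

/// Version of the API this host speaks, exposed as a SETTING
Property TargetVersion As %String [ InitialExpression = "v2" ];

Parameter SETTINGS = "TargetVersion:Basic";

Method OnMessage(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
	set sc = $$$OK
	// If the message version differs from this host's version,
	// invoke a transformation class for that version pair first
	set tMsgVersion = $property(pRequest, "MessageVersion")
	if (tMsgVersion '= ..TargetVersion) {
		set tTransform = "MyPackage.Transform."_tMsgVersion_"To"_..TargetVersion
		set sc = $classmethod(tTransform, "Transform", pRequest, .pRequest)
		quit:$$$ISERR(sc) sc
	}
	// ... normal operation logic, built against TargetVersion ...
	quit sc
}

}
```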

Stefan Cronje · Feb 9, 2023

Using the proposed approach, there will not be a port listening for REST messages on the Production.

All WS requests will have to go through the CSP gateway.
If there are other Business Operations utilising that REST service, you will have to connect via the web-server and set up credentials to use.

Also, as suggested by @Dmitry Maslennikov , you can split the services into smaller services if control is required at that level.

Then, for local services consumed by the same or other productions on the same server, you can use EnsLib.REST.Service and control access to the port on the OS firewall, allowing only traffic from localhost.

Also have a look at the class documentation. You can disable the local port, so that it will not listen from the production, and enable it to go via CSP. You then need to set the CSP application authentication options.
From the class documentation:
property EnableStandardRequests as %Boolean [ InitialExpression = 1 ];

Listen via the CSP WebServer in addition to listening on the HTTP.InboundAdapter's custom local port, if the Adapter is defined.

Note that SSLConfig only applies to the custom local port. To use SSL via the CSP WebServer, you must configure the WebServer separately.

If the Service is invoked via the CSP WebServer, the ?CfgItem= URL parameter may be used to distinguish between multiple configured same-class Services but the standard csp/namespace/classname URL must be used.

Stefan Cronje · Feb 8, 2023

Which Business Service class are you using?

How is the production receiving the message?

Usually for REST services on a production, I do the following:

  • Create a "dispatch" class which extends %CSP.REST.
  • This REST class will receive the messages, then invoke a business service in the production.
  • You set up a CSP application that uses this class as the dispatcher.
    • Set it as authenticated, which will use basic authentication.

Something like the below. This is in one class, but you can put it into two. You then use this class as the dispatch class on the CSP application:

Class MyPackage.RESTService Extends (%CSP.REST, Ens.BusinessService)
{

Parameter UseSession = 0;

/// Name of Ensemble Business Service. Set in implementation
Parameter BUSINESSSERVICENAME = "Production Clever Methods";

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
	<Route Url="/somemethod" Method="POST" Call="HandleSomeFancyMethod"/>
</Routes>
}



ClassMethod HandleSomeFancyMethod() As %Status
{
	set sc = $$$OK
	try {
		// Check Content Type
		if (%request.ContentType '= "application/json") {
			// Throw some error here or respond
		}
		set %response.ContentType = "application/json"

		// Check Data Received
		if (%request.Content = "") {
			// Empty data error or "bad request or something"
		}

		// Parse the input into a Dynamic Object or whatever and validate as you'd like. You can also just send the stream to the service as indicated further down

		// Create a business service
		set sc = ##class(Ens.Director).CreateBusinessService(..#BUSINESSSERVICENAME, .tService)
		if $$$ISERR(sc) {
			// throw some error
		}

		// Create input for Service from the request body stream
		set tInput = %request.Content
		do tInput.Rewind()
		set tEnsRequest = ##class(Ens.StreamContainer).%New()
		set sc = tEnsRequest.StreamSet(tInput)
		if $$$ISERR(sc) {
			// throw some error
		}

		Set tAttrs=##class(%ArrayOfDataTypes).%New()
		do tAttrs.SetAt(%response.ContentType,"ContentType")
		do tEnsRequest.SetAttributes(.tAttrs)

		// Process the input
		set sc = tService.ProcessInput(tEnsRequest, .tEnsOutput)
		// handle the sc however you see fit
		set sc = tEnsOutput.StreamGet().OutputToDevice()
		// handle the sc however you see fit

	} catch tEx {
		// error 500
	}

	quit sc
}

Method OnProcessInput(pInput As Ens.StreamContainer, Output pOutput As Ens.StreamContainer) As %Status
{
	set sc = $$$OK
	try {
		// do whatever you want to do
		// You can send to other business hosts and so forth


		// Set the response object into the stream
		set tStream = ##class(%Stream.GlobalCharacter).%New()
		// tDynamicObj in this case is the response object
		set sc = tStream.Write(tDynamicObj.%ToJSON())
		set sc = pOutput.StreamSet(tStream)
	} catch ex {
		set sc = ex.AsStatus()
	}

	quit sc
}

}
Stefan Cronje · Feb 3, 2023

Thanks.

I have found the solution. This is for everyone who uses Ubuntu and needs to use docker-compose with BuildKit.
Do not use docker-compose up -d, but rather docker compose up -d. In other words, do not use docker-compose, use the Compose plugin of Docker.

See the link below for information on what to do:

Install the Compose plugin | Docker Documentation

Stefan Cronje · Feb 3, 2023

This looks promising.
Struggling with docker-compose though. Running it on Ubuntu, and docker-compose does not use BuildKit, or so it says. So the --mount option is not working.
I am trying to find a way around it to check this out.

Stefan Cronje · Jan 31, 2023

Maybe have a look at %Library.FunctionalIndex, and at the "Defining Indexes" section of the documentation for BuildValuesArray().

Stefan Cronje · Jan 26, 2023

Hi there,

Are you using VSCode?

If so, you can convert the EOL for new files you create and ones you edit. In VSCode you can use LF on Windows too without issues.

Otherwise, after exporting the classes, do the following in the terminal:

Set tOldFile = ##class(%Stream.FileCharacter).%New()
Write tOldFile.LinkToFile("C:\whereever\code-with-crlf.xml")
Set tNewFile = ##class(%Stream.FileCharacter).%New()
Write tNewFile.LinkToFile("C:\whereever\code-with-lf.xml")
Do tOldFile.Rewind()
While ('tOldFile.AtEnd) {
    Set tTempStr = tOldFile.ReadLine()
    Do tNewFile.Write($ZSTRIP(tTempStr,"*",$CHAR(13)) _ $CHAR(10))
}
Write tNewFile.%Save()
Do tOldFile.%Close()
Do tNewFile.%Close()

Then import that file and see if this solution broke your code. :)

Stefan Cronje · Jan 25, 2023

With -1 the connection probably "dies" on the other server.

The Passthrough Operation does not know it died, so the FTP is going to fail immediately. It will then retry if E=R is set on the Reply Code Actions and, within the timeout period, it will reconnect and transfer the file. But soon the next file will come, and the same will happen if the connection has died again.

I recommend changing that setting to 0 first and monitoring what happens.

UPDATED:
To ensure delivery:

  • Stay Connected: 0
  • Reply Code Actions: E=R
  • Failure Timeout: -1
Stefan Cronje · Jan 25, 2023

But, before you jump in and write a lot of code, check the "Stay Connected" setting on the Passthrough Operation.

I have found that most FTP servers do not like long connections. I have solved many SFTP issues on business operations by setting "Stay Connected" to 0.

Stefan Cronje · Jan 25, 2023

So this code runs on Server A?

Server A handles the passthrough and then needs to check that the file landed on Server B, correct?

If that is the case, the solution is not going to be this simple. You will have to have another operation, called by the BP, to check if the file is on Server B. In short:

  • A custom operation that uses the FTP Outbound Adapter to FTP to Server B and pull a list of files.
  • The operation will need the filename, which should be provided by the Ensemble message from the BP.
  • The operation should then check if the filename received in the Ensemble message is in the File List on Server B. You might be able to filter for the filename directly with FTP, I will have to confirm that.
  • The operation should then return a message to the BP that contains a "result" of the check. The BP can then act on that by creating an Alert Message and sending it to Ens.Alert.
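A skeleton of such an operation might look like the following. This is a sketch only: the placeholder message classes and the FOUND/MISSING convention are made up, and the adapter's NameList() signature should be verified against the EnsLib.FTP.OutboundAdapter class reference.

```
/// Sketch only: verify NameList() in the EnsLib.FTP.OutboundAdapter class
/// reference; Ens.StringRequest/StringResponse are used as placeholder messages.
Class MyPackage.FTP.CheckFileOperation Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.FTP.OutboundAdapter";

Property Adapter As EnsLib.FTP.OutboundAdapter;

Method OnMessage(pRequest As Ens.StringRequest, Output pResponse As Ens.StringResponse) As %Status
{
	set pResponse = ##class(Ens.StringResponse).%New()
	// Pull the list of files in the configured remote directory on Server B
	set sc = ..Adapter.NameList(..Adapter.FilePath, .tFileList)
	quit:$$$ISERR(sc) sc
	// Report back to the BP whether the filename from the request is present
	set pResponse.StringValue = $select(tFileList.Find(pRequest.StringValue) '= "": "FOUND", 1: "MISSING")
	quit sc
}

}
```

The BP can then route an Alert Message to Ens.Alert based on the response.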

I hope this helps. Let me know if I am misunderstanding the requirement.

Stefan Cronje · Jan 25, 2023

Do you want to check the file on the local disk or on a different machine?

What is the end goal of the BP? If the file did not transfer, what do you want it to do?

Stefan Cronje · Jan 25, 2023

As a starting point, you can try the following to force the IS NULL filter to be applied first - assuming there is an index on it.
From %FIRSTTABLE Records SQLUser.Books Books

Stefan Cronje · Jan 25, 2023

Will you please get the query plan for this, without running it with %PARALLEL first, so that we can see what it does internally.

From there we can determine full scans on tables, etc.

Also, as mentioned, which fields are indexed and the types of indices.

Stefan Cronje · Jan 22, 2023

Thank you.

The tools used to run updates are not up to me. The end-customer has very strict security policies, so the only way to run SQL is via the SMP. Only that web port is open to most personnel, and then you are doing this on a remote desktop as well.

Stefan Cronje · Jan 22, 2023

Thank you for clarifying. That makes sense.

It would have been nice to still have that information accessible somehow. If you keep audit logs and other things based on triggers, you lose that traceability when the SMP is used, which creates a bit of a gap in the audit trail. The Dynamic SQL audit event can be used, but you can't link it directly to a record that has been updated or deleted if you are using information that is only available in %session.

Can the InBackground behaviour be changed in a configuration?

Stefan Cronje · Jan 22, 2023

For simplicity I am updating the table using the System Management Portal SQL.

I used auditing to get the PID and logged the other in the trigger.
Both are the same value: 95437

The ^ERRORS contains 3472 lines just for this. Do you want to see something specific in it?

There is no reference to %session in it.

Stefan Cronje · Jan 16, 2023

I am back on this posting again.
I started running the tests on an Ubuntu server.

The %objlasterror still contained the error about being unable to stop the %Monitor, but this is very misleading.

The cause of the issue, and this might not be obvious to everyone, was the line terminators in the coverage.list file.

On Windows it was CRLF, which worked fine. When using those files on Ubuntu, the file lines are read including the trailing CR, which caused the packages listed in the file not to be detected correctly ("PKG"_$CHAR(13) '= "PKG").

It would be nice to have the file loader strip out the CRs from the file stream before performing the ReadLines on a *nix OS.
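As a workaround when reading such a file yourself, the trailing CR can be stripped after each ReadLine (tStream here stands for whatever %Stream you are reading):

```
// Strip a trailing CR left behind by Windows CRLF line endings
set tLine = tStream.ReadLine()
set tLine = $ZSTRIP(tLine, ">", $CHAR(13))
```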

Stefan Cronje · May 6, 2022

It does not seem like the XData block parameters for the JSON mappings support something like that.
I did the following to get a different output value on the JSON. It is not the prettiest solution.

Class Test.JSONThings Extends (%Persistent, %XML.Adaptor, %JSON.Adaptor)
{

Property Name As %String;

Property Surname As %String;

Property Gender As %String(DISPLAYLIST = ",Male,Female", JSONLISTPARAMETER = "DISPLAYLIST", VALUELIST = ",M,F");

Property SomeOtherGender As %String(DISPLAYLIST = ",Man,Vrou", JSONLISTPARAMETER = "DISPLAYLIST", VALUELIST = ",M,F") [ Calculated, SqlComputeCode = { set {*}={Gender} }, SqlComputed, Transient ];

Property Age As %Integer;

Property Notes As %String(MAXLEN = "");

Property Code As Test.JSONRef(%JSONREFERENCE = "ID");

XData NSMapping
{
<Mapping xmlns="http://www.intersystems.com/jsonmapping">
        <Property Name="Name" FieldName="Name"/>
        <Property Name="Surname" FieldName="Surname"/>
        <Property Name="SomeOtherGender" FieldName="Gender"/>
</Mapping>
}

}
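The named mapping is then selected at export time. If I remember the %JSON.Adaptor method signatures correctly (worth double-checking in the class reference), it looks something like this:

```
set tObj = ##class(Test.JSONThings).%New()
set tObj.Name = "Jan"
set tObj.Gender = "M"
// Export using the NSMapping XData block defined above
set sc = tObj.%JSONExportToString(.tJSON, "NSMapping")
write tJSON
```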
Stefan Cronje · May 6, 2022

Hi,

Apologies. My mind must have been in the wrong place. I was looking at it from the Ensemble message browser's side for some reason.

Stefan Cronje · May 5, 2022

Hiya,

Should it be a WebSocket, or can it be a TCP counted socket connection?

If TCP counted, there are adapters available in Ensemble/IRIS; you just need to create the business operation for the server side and the business service for the client side.
You can use the EnsLib.TCP.CountedXMLOutboundAdapter, and the related inbound adapter.
In the operation you will have to serialise the Ensemble message to XML, and vice versa on the service.
Both these adapters have built-in functionality to ensure connections are up and running, and you can set up alerting controls on it.

I have never sent Ensemble messages directly like this to another instance, so I can't comment on if it will work or not.

Stefan Cronje · May 5, 2022

Hiya,

Look at the %Library.String documentation.

You can use the JSONLISTPARAMETER:
Property Type As %String(DISPLAYLIST = ",Sveže,Zmrznjeno,Dodelava", VALUELIST = ",FRE,FRO,FIN", JSONLISTPARAMETER = ",Sveže,Zmrznjeno,Dodelava");

I have found that setting this parameter to the DISPLAYLIST parameter did not work, so I just put the same values in this parameter.

Stefan Cronje · Apr 28, 2022

I dug around a bit.
It works from the dictionary classes to create the table being displayed.
I did see in there that if you add an XData block called "FormDefinition" to your class and omit a property from it, then that property will not be displayed.

Have a look at this:
InterSystems CSP AutoForm

I also saw that the table generator goes up to 4 levels deep of recursive processing of a message's properties.

Stefan Cronje · Apr 28, 2022

I don't know if this will work, but did you try setting XMLPROJECTION="NONE" on the inverse property?

Stefan Cronje · Feb 26, 2022

Hi,

Can you please provide an example of what you are doing in the DTL?

As far as I know, the DTL does not handle the line endings; they are handled at deserialization of the input into the source object, and at serialization of the target object into whichever format it is put in, which happens after the execution of the DTL.

i.e. Source Data -> Deserialize into object -> DTL -> serialize object into File/XML, etc.

The source and targets can be Record Maps, Virtual Docs and any other object you can think of.

Stefan Cronje · Feb 26, 2022

Hi,

This might be an obvious question... What is the license usage when this happens?

Stefan Cronje · Feb 2, 2022

If all else fails:

  • Uninstall all the extensions related to CacheQuality and ObjectScript.
  • Restart VSCode.
  • Install the Extension Pack.

It works.