If you want to use the GROUP BY, then you should probably do the COUNT where the grouping happens, and use DISTINCT BY as you had it:

select distinct by(serial,hq) hq, count(1)
from thetable
group by hq

That gives you the count per grouping.

There are no filters in the query, so it is going to do a full table scan and compare and calculate values for every row. Considering the amount of work that involves, the time it takes is actually fast.

Maybe look into bitslice indexes. They might help, but at the cost of insert and update performance; see the InterSystems Documentation on bitslice indexes.
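As a rough sketch of what such an index looks like (the class and property names are hypothetical), it is declared like any other index, with Type = bitslice, and suits numeric fields used in aggregates:

Class MyApp.Visit Extends %Persistent
{

Property Amount As %Numeric;

/// Bitslice index on a numeric property. Aggregates like SUM(Amount) or AVG(Amount)
/// can be answered from the bitstrings, but every INSERT and UPDATE must maintain them.
Index AmountBS On Amount [ Type = bitslice ];

}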

For a start:

select distinct by (VISIT_SERIAL_NO,HQ_ORG_CODE) VISIT_SERIAL_NO,HQ_ORG_CODE
can be changed to
select distinct VISIT_SERIAL_NO,HQ_ORG_CODE

It will do the same.

Secondly:
Will you please remove the %PARALLEL keyword and click on "Show Plan"? Post that plan here; it will help to determine where the query is slow. It might be using the wrong index; there are many of them.
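If you prefer the SQL shell to the Management Portal, newer IRIS versions can also return the plan via the EXPLAIN command (check that your version supports it):

EXPLAIN
select distinct VISIT_SERIAL_NO, HQ_ORG_CODE
from thetable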

Lastly:
Have you tuned the table and checked the result after that?
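If not: Tune Table gathers the statistics (extent size, selectivity) that the optimizer uses to choose indexes. Assuming the table lives in the SQLUser schema, from a SQL prompt:

TUNE TABLE SQLUser.thetable

Re-run the query afterwards and compare the plans.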

This is a great tool.

I am wondering if it will work for everyone, though. In the world of finance, you do not get SSH access to servers.
Most of the time the super-server port is also closed off for everything except the web gateway.

If the web version can be run on it, that is great - but in a banking environment not everyone is on the "containerised" buzz yet, so this will not be allowed.

Sure, I can probably install and configure the package and set up the web application.

Now there are two things left I want to raise:

  1. Multi-line SQL without SSH access (and without SCP or SFTP access either).
    1. If this is present, and I have missed it, I apologise.
  2. Database transactions.
    1. I have a SQL shell I built a long time ago, which supported DB transactions.
    2. When doing DML, you may want to verify the results before committing them to the DB, and have the option to roll back (see the sketch after this list).
      1. It would be really great if the app could handle this.
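A minimal sketch of the flow I mean, in plain SQL (the table and column names are made up):

START TRANSACTION;
UPDATE thetable SET HQ_ORG_CODE = 'HQ01' WHERE VISIT_SERIAL_NO = '123';
-- inspect the change before making it permanent
SELECT VISIT_SERIAL_NO, HQ_ORG_CODE FROM thetable WHERE VISIT_SERIAL_NO = '123';
COMMIT;
-- or: ROLLBACK; if the result looks wrong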

Thank you for your response on this. I see what you are saying regarding the different problems being solved.

This is basically the agenda I am pushing - let's talk about packages and what should and should not be packaged together, what we need, etc. BUT without adding "red tape" that will demotivate community members from contributing.

I seem to be giving a lot of 2c today. ;-)

I understand completely. As a side comment, which is a bit off topic: I like what you've done and think I will contribute to it when time allows. I created the old Dynamic Object Adapter package for Ensemble. There are things in there we could add to the OpenAPI suite if needed, for example the reverse of API-first: generating a Swagger spec from existing class definitions.

Good question. I have had the requirement to do this a few times already, but have created a BPL for it every time.

This becomes an effort when exposing services for different API versions, where you build your BPL and logic to be compatible with the latest one. All you need to do is transform the old version to the new one, but you want that to happen somewhere "between" the service and the BPL.

The same goes for message versioning on Business Operations: some systems you work with use a different version of the same API.

I have started adding APIVersion properties to the services and operations, but then I still need a lot of code to cater for each version of the message, or an additional BPL for every transformation.
I also need a lookup to map versions to Business Host names.

It would be great if the following was possible:

  • Ens.Request and Ens.Response had a MessageVersion parameter or something similar.
  • The Business Host has a MessageVersion property as a SETTING.
  • The transformation could be handled by the Business Host class in some way: get the MessageVersion value of the message and check the MessageVersion of the target host. If they differ, it invokes a transformation class to transform the message. This class would have to have some rules in it to convert between different versions of the messages.
  • On the response message the same principle should apply; some operations work on older versions of an API.
  • There would of course have to be a transformation class configuration of some sort per Business Host for the requests and responses.
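A minimal sketch of what I am imagining (all class and property names are hypothetical; none of this exists in Ensemble today):

Class MyPkg.VersionedRequest Extends Ens.Request
{

/// Hypothetical: the schema version carried by this message instance
Property MessageVersion As %String [ InitialExpression = "v2" ];

}

Class MyPkg.SomeOperation Extends Ens.BusinessOperation
{

/// Hypothetical SETTING: the API version this host understands
Property MessageVersion As %String [ InitialExpression = "v1" ];

Parameter SETTINGS = "MessageVersion:Basic";

}

The framework could then compare the two values before delivering a message and, where they differ, invoke the configured transformation class to up- or down-convert it.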

There may even be much simpler solutions than this, but I agree with the question asked.

Using the proposed approach, there will not be a port listening for REST messages on the Production.

All WS requests will have to go through the CSP gateway.
If there are other Business Operations utilising that REST service, they will have to connect via the web server and have credentials set up to use it.

Also, as suggested by @Dmitry Maslennikov, you can split the services into smaller ones if control is required at that level.

Then, for local services consumed by the same or other productions on the same server, you can use EnsLib.REST.Service and control access to the port on the OS firewall, allowing only traffic from localhost.

Also have a look at the class documentation.
You can disable the local port, so that the production does not listen on it, and let requests come in via CSP instead. You then need to set the CSP application's authentication options.
From the class documentation:
property EnableStandardRequests as %Boolean [ InitialExpression = 1 ];

Listen via the CSP WebServer in addition to listening on the HTTP.InboundAdapter's custom local port, if the Adapter is defined.

Note that SSLConfig only applies to the custom local port. To use SSL via the CSP WebServer, you must configure the WebServer separately.

If the Service is invoked via the CSP WebServer, the ?CfgItem= URL parameter may be used to distinguish between multiple configured same-class Services but the standard csp/namespace/classname URL must be used.
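For example (hypothetical server, namespace and config item names):

http://myserver/csp/myns/MyPkg.MyRestService.cls?CfgItem=ServiceA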

Which Business Service class are you using?

How is the production receiving the message?

Usually for REST services on a production, I do the following:

  • Create a "dispatch" class with the basic which extends from %CSP.REST
  • This REST class will receive the messages, then invoke a business service in the production.
  • You set up a CSP application that uses this class as the dispatcher.
    • Set it as authenticated, which will use basic authentication.

Something like the below. This is in one class, but you can put it into two. You then use this class as the dispatch class on the CSP application:

Class MyPackage.RESTService Extends (%CSP.REST, Ens.BusinessService)
{

Parameter UseSession = 0;

/// Name of Ensemble Business Service. Set in implementation
Parameter BUSINESSSERVICENAME = "Production Clever Methods";

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
	<Route Url="/somemethod Method="POST" Call="HandleSomeFancyMethod"/>
</Routes>
}



ClassMethod HandleSomeFancyMethod() As %Status
{
	set sc = $$$OK
	try {
		// Check Content Type
		if (%request.ContentType '= "application/json") {
			// Throw some error here or respond
		}
		set %response.ContentType = "application/json"

		// Check Data Received
		if (%request.Content = "") {
			// Empty data error or "bad request or something"
		}

		// Parse the input into a Dynamic Object or whatever and validate as you'd like. You can also just send the stream to the service as indicated further down

		// Create a business service
		set sc = ##class(Ens.Director).CreateBusinessService(..#BUSINESSSERVICENAME, .tService)
		if $$$ISERR(sc) {
			// throw some error
		}

		// Create input for the Service; pass the request body stream straight through
		set tInput = %request.Content
		set tEnsRequest = ##class(Ens.StreamContainer).%New()
		do tInput.Rewind()
		set sc = tEnsRequest.StreamSet(tInput)
		if $$$ISERR(sc) {
			// throw some error
		}

		set tAttrs = ##class(%ArrayOfDataTypes).%New()
		do tAttrs.SetAt(%response.ContentType,"ContentType")
		do tEnsRequest.SetAttributes(.tAttrs)

		// Process the input
		set sc = tService.ProcessInput(tEnsRequest, .tEnsOutput)
		// handle the sc however you see fit
		set sc = tEnsOutput.StreamGet().OutputToDevice()
		// handle the sc however you see fit

	} catch tEx {
		// error 500
	}

	quit sc
}

Method OnProcessInput(pInput As Ens.StreamContainer, Output pOutput As Ens.StreamContainer) As %Status
{
	set sc = $$$OK
	try {
		// do whatever you want to do
		// You can send to other business hosts and so forth


		// Set the response object into the stream
		set tStream = ##class(%Stream.GlobalCharacter).%New()
		// tDynamicObj in this case is the response object
		set sc = tStream.Write(tDynamicObj.%ToJSON())
		set sc = pOutput.StreamSet(tStream)
	} catch ex {
		set sc = ex.AsStatus()
	}

	quit sc
}

}

Hi there,

Are you using VSCode?

If so, you can set the EOL for new files you create and convert it for the ones you edit. In VSCode you can use LF on Windows too without issues.
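The relevant setting is files.eol; putting this in your VS Code settings.json makes new files use LF even on Windows:

"files.eol": "\n"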

Otherwise, after exporting the classes, do the following in the terminal:

set tOldFile = ##class(%Stream.FileCharacter).%New()
write tOldFile.LinkToFile("C:\whereever\code-with-crlf.xml")
set tNewFile = ##class(%Stream.FileCharacter).%New()
write tNewFile.LinkToFile("C:\whereever\code-with-lf.xml")
do tOldFile.Rewind()
while ('tOldFile.AtEnd) {  set tTempStr = tOldFile.ReadLine()  do tNewFile.Write($ZSTRIP(tTempStr,"*",$CHAR(13))_$CHAR(10))  }
write tNewFile.%Save()
do tOldFile.%Close()
do tNewFile.%Close()

Then import that file and see if this solution broke your code. :)