Stefan Cronje · Feb 12, 2023 If you want to use GROUP BY, then you should probably do the COUNT where the GROUP BY is being done, and use DISTINCT BY as you had it: select distinct by (serial, hq) hq, count(1) from thetable group by hq — if you want the count per that grouping. There are no filters in the query, so it is going to do a full table scan and compare and calculate values for each row; considering that, the time it takes is actually fast. Maybe look into bitslice indexes. They might help, but at a cost of performance on INSERT and UPDATE — see the InterSystems documentation.
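To illustrate the bitslice suggestion, a minimal sketch of an index definition (the class and property names here are invented for the example, not taken from the poster's schema):

```objectscript
/// Hypothetical persistent class - names are illustrative only
Class Demo.Visit Extends %Persistent
{

Property Amount As %Numeric;

/// A bitslice index speeds up aggregates such as COUNT/SUM/AVG over the
/// property, at the cost of extra maintenance work on INSERT and UPDATE
Index AmountIdx On Amount [ Type = bitslice ];

}
```

After compiling, existing rows would still need the index built (e.g. via the %BuildIndices() classmethod) before the optimizer can use it.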
Stefan Cronje · Feb 12, 2023 For a start: select distinct by (VISIT_SERIAL_NO, HQ_ORG_CODE) VISIT_SERIAL_NO, HQ_ORG_CODE can be changed to select distinct VISIT_SERIAL_NO, HQ_ORG_CODE — it will do the same. Secondly: will you please remove the %PARALLEL, click on "Show Plan", and post that plan here? It will help to determine where the query is slow. It might be using the wrong index; there are many. Lastly: have you tuned the table and checked the result after that?
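For reference, tuning the table can be done from the terminal; a hedged sketch (the table name is a placeholder, and the exact utility varies by product version — check your version's documentation):

```objectscript
// Gather statistics (extent size, selectivity) so the query optimizer
// can choose a better plan; re-check "Show Plan" afterwards
Do $SYSTEM.SQL.TuneTable("MySchema.MyTable")
```

On recent IRIS versions the equivalent is gathering table statistics via the SQL statistics utilities or the Management Portal's "Tune Table" action.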
Stefan Cronje · Feb 11, 2023 Thank you for the clarification. If rollback and commit are supported, then verifying the results is just a matter of doing a SELECT before committing, in order to verify the update/insert was correct and as expected. Nothing special to it or automated in any way. This is great. Thank you.
Stefan Cronje · Feb 11, 2023 I updated it like that last week. Apparently, I did not send it for approval, which I thought I had.
Stefan Cronje · Feb 10, 2023 Hi there, For iris-persistent-class-audi, I uploaded a video to YouTube a week ago and linked it on OEX.
Stefan Cronje · Feb 10, 2023 This is a great tool. I am wondering if it will work for everyone, though. In the world of finance, you do not get SSH access to servers. Most of the time, the super-server port is also closed off for everything except the web gateway. If the web version can be run on it, that is great — but in a banking environment, not everyone is on the "containerised" buzz yet, so this will not be allowed. Sure, I can probably install and configure the package and set up the web application. That leaves two things I want to raise: Multi-line SQL without having SSH access (we also do not have SCP or SFTP access); if this is present and I have missed it, I apologise. Database transactions: I have a SQL shell I built a long time ago, which worked with DB transactions. When doing DML, you may want to verify the results before committing them to the DB, and have the option to roll back. It would be really great if the app could handle this.
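To illustrate the verify-before-commit idea, a minimal ObjectScript sketch (the table and WHERE clause are hypothetical examples, not from any real schema):

```objectscript
// Wrap DML in a transaction: apply the change, verify it with a SELECT,
// then commit only if the result is what you expected
TSTART
&sql(UPDATE Demo.Person SET Name = 'Updated' WHERE ID = 1)
If SQLCODE = 0 {
    // At this point you would run a SELECT to inspect the changed rows
    TCOMMIT
} Else {
    // Something went wrong - undo the change
    TROLLBACK
}
```

This is the behaviour being requested of the tool: hold the transaction open so the user can inspect results before deciding between commit and rollback.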
Stefan Cronje · Feb 10, 2023 Thank you for your response on this. I see what you are saying regarding the different problems being solved. This is basically the agenda I am pushing: let's talk about packages — what should and should not be packaged together, what we need, etc. — BUT without adding "red tape" that will demotivate community members from contributing. I seem to be giving a lot of 2c today.
Stefan Cronje · Feb 10, 2023 I understand completely. As a side comment, which is a bit off topic: I like what you've done and think I will contribute to it when time allows. I created the old Dynamic Object Adapter package for Ensemble. There are things in there we can add to the OpenAPI suite if needed — for example, the reverse of API-first: generating a Swagger-type specification from existing class definitions.
Stefan Cronje · Feb 10, 2023 Great idea, this. May I suggest one change: use a different tag for the unit test code. The <example> tag is used to display nicely formatted code in Documatic for a developer — like a one-liner on how to use a method, or just a block of code for context. If you put the unit tests in <example>, they are going to be mixed in with the actual examples.
Stefan Cronje · Feb 10, 2023 Thank you for the information and the proposal to have a brainstorming session as a community.
Stefan Cronje · Feb 10, 2023 An installer manifest would be helpful. Also, maybe clearer instructions on the installation: that you clone the Git repo, what to import, the compile order if any? The root package name being "Demo" does not distinguish the author/provider. My boss would not want to see something in the Live environment with the name Demo.
Stefan Cronje · Feb 10, 2023 Good concept, good start. Would you mind contributions on GitHub?
Stefan Cronje · Feb 9, 2023 Good question. I have had this requirement a few times already, but created a BPL for it every time. This becomes an effort when exposing services for different API versions: you build your BPL and logic to be compatible with the latest version, and all you need to do is transform the old version to the new one — but you want that somewhere "between" the service and the BPL. The same goes for message versioning on Business Operations; some systems you work with use a different version of the same API. I have started adding APIVersion properties to the services and operations, but then I still need a lot of code to cater for each version of the message, or an additional BPL for every transformation. I also need a Lookup to map versions to Business Host names. It would be great if the following were possible: Ens.Request and Ens.Response had a MessageVersion property or something similar. The Business Host had a MessageVersion property as a SETTING. The transformation could be handled by the Business Host class in some way: get the MessageVersion value of the message and check the MessageVersion of the target host; if they differ, invoke a transformation class to transform the message. That class would have to have some rules in it to convert between the different versions of the messages. On the response message, the same principle should apply — some operations work on older versions of an API. There would of course have to be a transformation class configuration of some sort per Business Host for the requests and responses. There may even be much simpler solutions than this, but I agree with the question asked.
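A rough sketch of what such a hook might look like. To be clear, this is not an existing Ensemble feature: the MessageVersion property, GetTargetVersion method, and TRANSFORMCLASS parameter are all hypothetical names invented for illustration; only SendRequestSync is a real Business Host method.

```objectscript
/// Hypothetical wrapper: compare the message's version to the target
/// host's configured version and transform before sending if they differ
Method SendVersioned(pRequest As Ens.Request, pTarget As %String) As %Status
{
    // Hypothetical: resolve the target host's expected version,
    // e.g. from a Lookup table keyed on the host name
    Set tTargetVersion = ..GetTargetVersion(pTarget)
    If pRequest.MessageVersion '= tTargetVersion {
        // Hypothetical per-host transformation class holding the
        // version-to-version conversion rules
        Set tSC = $classmethod(..#TRANSFORMCLASS, "Transform",
                               pRequest, tTargetVersion, .pRequest)
        Quit:$$$ISERR(tSC) tSC
    }
    Quit ..SendRequestSync(pTarget, pRequest, .tResponse)
}
```

The same check, run in reverse on tResponse, would cover operations that reply in an older API version.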
Stefan Cronje · Feb 9, 2023 Using the proposed approach, there will not be a port listening for REST messages on the Production; all web service requests will have to go through the CSP gateway. If there are other Business Operations utilising that REST service, they will have to connect via the web server, and you will have to set up credentials to use. Also, as suggested by @Dmitry Maslennikov , you can split the services into smaller services if control is required at that level. Then, for local services consumed by the same or other Productions on the same server, you can use EnsLib.REST.Service and control access to the port on the OS firewall to allow traffic only from localhost. Also have a look at the class documentation. You can disable the local port, so that the Production will not listen on it, and enable it to go via CSP instead; you then need to set the CSP application's authentication options. From the class documentation: property EnableStandardRequests as %Boolean [ InitialExpression = 1 ]; Listen via the CSP WebServer in addition to listening on the HTTP.InboundAdapter's custom local port, if the Adapter is defined. Note that SSLConfig only applies to the custom local port. To use SSL via the CSP WebServer, you must configure the WebServer separately. If the Service is invoked via the CSP WebServer, the ?CfgItem= URL parameter may be used to distinguish between multiple configured same-class Services, but the standard csp/namespace/classname URL must be used.
Stefan Cronje · Feb 8, 2023 Which Business Service class are you using? How is the Production receiving the message? Usually for REST services on a Production, I do the following: create a "dispatch" class which extends %CSP.REST. This REST class will receive the messages, then invoke a Business Service in the Production. You set up a CSP application that uses this class as the dispatcher, and set it as authenticated, which will use basic authentication. Something like the below. This is in one class, but you can put it into two. You then use this class as the dispatch class on the CSP application:

Class MyPackage.RESTService Extends (%CSP.REST, Ens.BusinessService)
{

Parameter UseSession = 0;

/// Name of Ensemble Business Service. Set in implementation
Parameter BUSINESSSERVICENAME = "Production Clever Methods";

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
<Route Url="/somemethod" Method="POST" Call="HandleSomeFancyMethod"/>
</Routes>
}

ClassMethod HandleSomeFancyMethod() As %Status
{
    set sc = $$$OK
    try {
        // Check content type
        if (%request.ContentType '= "application/json") {
            // Throw some error here or respond
        }
        set %response.ContentType = "application/json"
        // Check data received
        if (%request.Content = "") {
            // Empty data error, "bad request" or something
        }
        // Parse the input into a Dynamic Object or whatever and validate as you'd like.
        // You can also just send the body stream to the service, as done here:
        set tInput = %request.Content
        // Create a business service
        set sc = ##class(Ens.Director).CreateBusinessService(..#BUSINESSSERVICENAME, .tService)
        if $$$ISERR(sc) {
            // throw some error
        }
        // Create input for the service
        set tEnsRequest = ##class(Ens.StreamContainer).%New()
        do tInput.Rewind()
        set sc = tEnsRequest.StreamSet(tInput)
        if $$$ISERR(sc) {
            // throw some error
        }
        set tAttrs = ##class(%ArrayOfDataTypes).%New()
        do tAttrs.SetAt(%response.ContentType, "ContentType")
        do tEnsRequest.SetAttributes(.tAttrs)
        // Process the input
        set sc = tService.ProcessInput(tEnsRequest, .tEnsOutput)
        // handle the sc however you see fit
        set sc = tEnsOutput.StreamGet().OutputToDevice()
        // handle the sc however you see fit
    } catch tEx {
        // error 500
    }
    quit sc
}

Method OnProcessInput(pInput As Ens.StreamContainer, Output pOutput As Ens.StreamContainer) As %Status
{
    set sc = $$$OK
    try {
        // do whatever you want to do
        // You can send to other business hosts and so forth
        // Set the response object into the stream
        set tStream = ##class(%Stream.GlobalCharacter).%New()
        // tDynamicObj in this case is the response object
        set sc = tStream.Write(tDynamicObj.%ToJSON())
        set sc = pOutput.StreamSet(tStream)
    } catch ex {
        set sc = ex.AsStatus()
    }
    quit sc
}

}
Stefan Cronje · Feb 3, 2023 Thanks. I have found the solution. This is for everyone who uses Ubuntu and needs to use docker-compose with BuildKit: do not use docker-compose up -d, but rather docker compose up -d. In other words, do not use the standalone docker-compose, use the Compose plugin of Docker. See the below link for information on what to do: Install the Compose plugin | Docker Documentation
Stefan Cronje · Feb 3, 2023 This looks promising. Struggling with docker-compose, though. Running it on Ubuntu, and docker-compose does not use BuildKit — or so it says — so the --mount option is not working. I am trying to find a way around it so I can check this out.
Stefan Cronje · Jan 31, 2023 Maybe have a look at %Library.FunctionalIndex, and at the section on defining indexes in the documentation for BuildValueArray().
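For context on the BuildValueArray suggestion, a minimal sketch of the per-property hook (class and property names are invented for the example):

```objectscript
/// Hypothetical class showing the <Property>BuildValueArray convention:
/// the classmethod splits one stored value into many index entries
Class Demo.Doc Extends %Persistent
{

Property Keywords As %String(MAXLEN = 500);

/// Break the space-delimited string into individual values so that
/// each word gets its own entry in the index
ClassMethod KeywordsBuildValueArray(value, ByRef valueArray) As %Status
{
    For i = 1:1:$Length(value, " ") {
        Set valueArray(i) = $Piece(value, " ", i)
    }
    Quit $$$OK
}

Index KeywordIdx On Keywords(ELEMENTS);

}
```

With this in place, a query can match any single keyword via the index rather than scanning the whole string.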
Stefan Cronje · Jan 27, 2023 Rules changing to get additional points halfway through the contest period?
Stefan Cronje · Jan 26, 2023 Hi there, Are you using VS Code? If so, you can convert the EOL for new files you create and ones you edit; in VS Code you can use LF on Windows too without issues. Otherwise, after exporting the classes, do the following in the terminal:

Set tOldFile = ##class(%Stream.FileCharacter).%New()
Write tOldFile.LinkToFile("C:\whereever\code-with-crlf.xml")
Set tNewFile = ##class(%Stream.FileCharacter).%New()
Write tNewFile.LinkToFile("C:\whereever\code-with-lf.xml")
Do tOldFile.Rewind()
While ('tOldFile.AtEnd) {
    Set tTempStr = tOldFile.ReadLine()
    // Strip any trailing CR and write the line back terminated with LF only
    Do tNewFile.Write($ZSTRIP(tTempStr, "*", $CHAR(13)) _ $CHAR(10))
}
Write tNewFile.%Save()
Do tOldFile.%Close()
Do tNewFile.%Close()

Then import that file and see if this solution broke your code. :)