Hey Patty.

If you simply need the empty NTE to be added in by the DTL, you can set its first field to an empty string to force it to appear.

For example, adding a set action to the DTL that assigns an empty string ("") to the first field of the target NTE will force the segment to appear in the output message.
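As a rough sketch in code form (the "OBXgrp(1).NTE(1)" path is an assumption for illustration; adjust it to match your own schema and document structure):

// Force the empty NTE to appear by setting its first field to an empty string
Set tSC = target.SetValueAt("","OBXgrp(1).NTE(1):1")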

Note that my example simply hardcodes the NTE of the first OBX repetition with no care for the content. You will likely need a For Each loop where you evaluate whether the source NTE:1 has a value, and only set the empty string when there is no content in the source.
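Something along these lines (again, the paths are illustrative and need matching to your schema):

// Loop over each OBX group in the source message
For i=1:1:source.GetValueAt("OBXgrp(*)") {
    // Only force an empty NTE into the target when the source NTE:1 has no content
    If source.GetValueAt("OBXgrp("_i_").NTE(1):1")="" {
        Set tSC = target.SetValueAt("","OBXgrp("_i_").NTE(1):1")
    }
}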

Seeing as I just completed a production upgrade yesterday:

  • What InterSystems products + versions are you running? ($zv is ideal.)
    • IRIS for Windows (x86-64) 2022.1 (Build 209U) Tue May 31 2022 12:16:40 EDT [Health:3.5.0]
  • What makes you decide to upgrade?
    • New features + security fixes
  • What are your blockers to upgrading?
    • Bugs in new releases + being limited to the non-CD releases due to our current configuration
  • What is your process for evaluating and planning a possible upgrade?
    • Install the new version in our NPE and use it, running tests against the most heavily used elements
  • What documentation resources do you use?
    • Release notes + any upgrade guides that explicitly call out versions you can/can't upgrade from
  • What gaps/issues do you see in our existing documentation around upgrades?
    • It's a small thing, but a link to the release notes from the online distribution page on WRC, alongside the release in question, would be greatly received.
  • What would make your InterSystems software upgrade process better?
    • One step that always bothers me is the need to do a recompile post-upgrade, as it's not been made quite clear to me at what stage this needs to be done when working in a mirrored environment. This could be a step handled by the installer, given that it should happen with any major version change.
  • What has made an upgrade fail?
    • Not to hammer on the same point, but I did have an upgrade "fail" due to a miscommunication about whether the version change was major or minor, and we hadn't run the recompile of all namespaces.
  • When have you had to roll back?
    • Never had to fully roll back, but we have had to fall back to a secondary mirror member after noting that the upgraded mirror was having issues (see above). Otherwise, we aim for a "fix forward" approach to issues found following an upgrade.

So upon further review, it seems that the first ACK is being generated by the Operation, and the second one is the body of the HTTP response.

Basically, the operation will attempt to parse the HTTP response into an HL7 message and, if that doesn't happen, it will then "generate" an ACK and write the HTTP response data at the end of the generated ACK.

In my case, although there is an HL7 message in the response, it's not being parsed for some reason, so the code moves on to generating its own ACK, followed by the HTTP response body, which is the second ACK I was seeing.

I'm now replicating the HTTP operation and attempting to pin down exactly where it's falling down; failing that, I will likely reach out to the WRC, as it seems to be an issue deeper than I can dive.

Hey Lionel.

I did write an article about this a little while ago which I hope can walk you through what you're looking to achieve; the difference is that you'd pivot from using the ORCGrp counts as I did to using RXCgrp, and then set the transform to have a source and target class of RDE O11.

If you do try to follow this approach, I'm more than happy to answer any questions you have.

If you have a record map configured and have the generated class, then the next step would be to use that generated class as the source of a transform. From there, you can transform the data into the SDA class you have set as the target of the transform.

Once you have this SDA message, you can send it wherever your requirements need it to go.
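As a very rough skeleton (the source class name and field mappings below are made-up examples; yours will come from your record map), the transform could look something like this:

Class Demo.RecordToSDA Extends Ens.DataTransformDTL
{

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='Demo.MyRecordMap.Record' targetClass='HS.SDA3.Container' create='new' language='objectscript' >
<assign value='source.LastName' property='target.Patient.Name.FamilyName' action='set' />
<assign value='source.FirstName' property='target.Patient.Name.GivenName' action='set' />
</transform>
}

}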

If you're working with a CSV, then you could look at the Record Mapper or Complex Record Mapper depending on the input format. From there, you could then create a transform that uses the generated class from the record mapper as your source, and then the appropriate SDA class as your destination.

However, if you're working with an actual xls/xlsx file, then I'm out of ideas.

Your Process is most likely using ..SendRequestAsync() to send to the Operation and has "pResponseRequired" set to 1 (or not set at all, so it's using the default value of 1).

There's nothing inherently wrong with this, but if you just want to send to the Operation and not worry about the response going back to your process, you could change the "pResponseRequired" flag to 0 in your call. So it would look a little like this:

Set tSC = ..SendRequestAsync("TargetOperationName",ObjToSend,0)

However, you may wish to consider whether this approach is appropriate for your setup, or if you would be better off using "SendRequestSync()" and dealing with the response synchronously.
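For reference, a synchronous version would look a little like this (the response handling is just a sketch):

// Block here until the Operation responds (or the call times out)
Set tSC = ..SendRequestSync("TargetOperationName",ObjToSend,.pResponse)
If $$$ISERR(tSC) Quit tSC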

To parse the JSON, the below is a starting point for reading the content of the stream into a dynamic object and then saving each value into its own variable.

// Parse the incoming stream's JSON content into a dynamic object
Set DynamicObject = {}.%FromJSON(pRequest.Stream)
// Pull each property out into its own variable
Set Name = DynamicObject.name
Set DOB = DynamicObject.DOB
Set SSN = DynamicObject.SSN

You could then store these wherever you need to. If your SQL table is external, then you could have your Operation use the SQL Outbound Adapter to write these to your external DB.
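For example, a rough sketch from within an Operation (the table and column names are invented for illustration):

// Parameterised insert via EnsLib.SQL.OutboundAdapter
Set tSC = ..Adapter.ExecuteUpdate(.tRows,"INSERT INTO MyDB.Patients (Name, DOB, SSN) VALUES (?,?,?)",Name,DOB,SSN)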

ETA: If you then need to pick out the values within the content of name (which I assume has come from an HL7 message), you could use $PIECE to pick out the data from the delimited string you're receiving.
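For example, if name arrives as a ^-delimited value such as "Doe^John":

Set FamilyName = $PIECE(Name,"^",1)  // "Doe"
Set GivenName = $PIECE(Name,"^",2)   // "John"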

Hey Daniel.

As a starting point, I would not add the trace when viewing the DTL in Studio; instead, I would add it when using the Data Transformation Builder, which will then display the trace output when testing the transform.

If this is not working for you, make sure that the router has "Log Trace Events" enabled in its settings, and that it has been restarted since enabling the trace. I have been caught out numerous times by enabling the trace and then forgetting to restart the process/router in the production.

The issue of no results being returned is likely elsewhere in your query.

To test this, I created a basic table with the following:

CREATE TABLE Demo.BBQ (
    Name varchar(100),
    Type varchar(50),
    isActive bit
)

And I then added a few rows:

INSERT INTO Demo.BBQ (Name, Type, isActive)
VALUES ('Super Grill''s BBQ Hut', 'Outdoor', 1)

INSERT INTO Demo.BBQ (Name, Type, isActive)
VALUES ('Bobs BBQ Bistro', 'Indoor', 1)

INSERT INTO Demo.BBQ (Name, Type, isActive)
VALUES ('Rubbish Grill''s BBQ Resort', 'Not Known', 0)

This then gave me a table that looks like this (note that the doubled single quotes used in the inserts are stored as single quotes in the table):
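Name                        Type       isActive
Super Grill's BBQ Hut       Outdoor    1
Bobs BBQ Bistro             Indoor     1
Rubbish Grill's BBQ Resort  Not Known  0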

If I then run a query using the LIKE function, something like:
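SELECT * FROM Demo.BBQ WHERE Name LIKE '%Grill''s BBQ%'

I get back both the active and inactive "Grill's" locations.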

And if I want to exclude the inactive location, I can add a check on the isActive flag:
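SELECT * FROM Demo.BBQ WHERE Name LIKE '%Grill''s BBQ%' AND isActive = 1

This leaves just the "Super Grill's BBQ Hut" row.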

The use of doubling up a single quote to escape the character is not an InterSystems-specific approach, but is the standard SQL way of escaping a single quote.

There are two ways around this:

  1. If this is something you're running as a one-off in, say, the SQL query tool within the Management Portal, then you will want to double up the single quote, i.e. "Select * from cnd.Facilities where name like '%Grill''s BBQ%'", and this will escape the single quote.
  2. If this is a query being called within your code and the value could be anything passed to it, then you will want to use parameterised SQL queries, as sketched below. An example can be found for SQL adapters here.
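For example, with the SQL outbound adapter, the parameter value is passed separately from the query text, so the single quote needs no escaping at all (a sketch, assuming an Operation built on EnsLib.SQL.OutboundAdapter):

// The "?" is replaced by the parameter value when the query is executed
Set tSC = ..Adapter.ExecuteQuery(.tRS,"SELECT * FROM cnd.Facilities WHERE Name LIKE ?","%Grill's BBQ%")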

Hey Scott.

I'm not sure if this will work at all, but have you tried extending the timeout for your CSP Gateway?

Management Portal -> System Administration -> Configuration -> Web Gateway Management is the route to this. The username you'll then need is "CSPSystem" and the password should be the default password used when installing the system.

From within here you can navigate to "Default Parameters" and increase the Server Response Timeout parameter.