Assuming all the classes are identical in both namespaces, I think you can just copy over the underlying global arrays that hold the table data. You'll need to know a bit about the naming rules, etc. to find the globals, and a bit of programming to clear them out in the target beforehand. (The classes could be copied over separately.) If the namespaces are on the same network and the tables are small, then Merge commands could be used.
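For example, something like this run in the target namespace (just a sketch: the "MyApp.Patient" globals and the "SRC" namespace are made up, but for a default-storage class the data and index globals are typically ^package.ClassD and ^package.ClassI):

    ; clear the existing table data in the target first
    KILL ^MyApp.PatientD,^MyApp.PatientI
    ; copy the data and index globals from the source namespace via extended references
    MERGE ^MyApp.PatientD=^|"SRC"|MyApp.PatientD
    MERGE ^MyApp.PatientI=^|"SRC"|MyApp.PatientI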

The documentation examples all have <sequence> blocks around the calls and sync. Your code only has a <scope> block in the bit we can see. Maybe try adding a <sequence>... </sequence>?

Actually, thinking about it, the behaviour is as if the <sync> had the attribute:
type='any' 
but I can see you have:
type='all'
Maybe try changing the "type", or removing it completely (as the default is "all")?

Hi,

I've asked the Cody people about this, and they replied:

Cody can only recognize files that are directly in the repository. So files outside the repository won't be fetched as context. The InterSystems ObjectScript extension does not seem to put the documents in the repository, but in its own local storage.

However, I'm surprised that Cody can’t at least see the contents of the current edit window or retrieve currently selected text. Surely it doesn’t pull data from a file system for those? Cody must call a VS Code API, which you would assume would know how to return the current text regardless of where the original is stored.

Maybe InterSystems have not configured the FileSystemProvider correctly to allow other extensions to request document info? I'm well out of my skill set here. Just guessing.

Whatever, I can see that the Cody supplier is going to assume it's InterSystems' fault, and InterSystems will point back at them. So I'm back to copying and pasting in and out of Claude. :-)

Mike

Just to tidy this one up, we eventually were told that the other end was supplying HL7 v2 messages, so the input became a single string with a v2 message in it.

And it's still not finished. We eventually found out the source could actually make SOAP calls directly, before converting to HL7, so it might even end up with us defining a simple class with some properties and them pulling in the WSDL to use it. That's development for you. :-)

Hi,

After many years of development and support I've become wary of one-line requests. 🤔 I wonder why you need this (Five whys - Wikipedia). For example, if you are trying to debug a mysterious state in a background job then maybe you just need "D LOG^%ETN" to store the variables in the error log. Or, at least, you could look in that routine for ways to use $ORDER and $QUERY to scan local variables without involving ^SPOOL (or, as we once did, opening a file to ZW to, closing it, and then reading it back).
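For reference, that old file trick is roughly this (a rough sketch; the file name is just an example):

    SET file="/tmp/vars.txt"       ; example path
    OPEN file:("WNS"):5            ; write, new, stream format
    USE file ZWRITE                ; argumentless ZWRITE dumps every local variable
    CLOSE file

...and then read the file back, or just open it in an editor.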

Hi. It's been a long time since I dealt with this, but when I was trying to get a stats extract to run once a day I used ADAPTER = "Ens.InboundAdapter" (with some custom params for which operation to send to and an email address). In the production I set up a schedule as described above to define a run for a few minutes each day, but then defined CallInterval as 999999, so it ran once at the start of the interval and not again. I don't know if it was the best solution, but it seemed to work and kept the schedule visible in the production should we need to change it. The service just did queries on the Caché database, but you could use a SQL adapter to query other databases.
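The shape of the service was roughly this (a sketch from memory, with made-up names; CallInterval and the schedule are set on the item in the production):

Class Demo.DailyStats.Service Extends Ens.BusinessService
{

/// Plain inbound adapter: OnProcessInput gets called once per CallInterval
Parameter ADAPTER = "Ens.InboundAdapter";

/// Where to send the results (configured in the production)
Property TargetConfigName As %String;

Parameter SETTINGS = "TargetConfigName";

Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
	// run the daily query here, build a request message and pass it to the operation
	Set tRequest = ##class(Ens.StringRequest).%New()
	Set tRequest.StringValue = "results go here"
	Quit ..SendRequestAsync(..TargetConfigName, tRequest)
}

}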

Update: I've just found that I actually used the same trick for middle-of-the-night SQL extractions (a mass update of medical instrument reference data).

And looking back at the original question: the answer is that in the ideal solution (not always the best one) things should not "magically" appear in the middle of a BPL; there should be a service starting from the left, sending stuff to the right in a visible path, even when the input is from "inside" the system. So here you might need that SQL adapter reading rows and sending a message per row. (Though the solution might get a lot more complex, I know mine did, with special message objects, streams and stored procedures!)

Thanks for the response. I'd love to move them to IRIS, but at the moment we are struggling to even get them from v2007 to v2018, so we are stuck on Caché for now.

It looks like you are confirming my thoughts. The FHIR data is complex and unpacking will be hard. Good point though about only needing classes for each data type.

I'll have to take it to the supplier of the other end of our link. This is a one-off custom feed that will only ever be used once, so we can define it in whatever way is easiest to both ends. My end is "legacy" but maybe the other end will want to use the FHIR standard as it could need it for other links.

Hi. Not entirely sure what you mean, but I usually write a noddy test routine to clear variables, set up objects and properties, run methods, display and compare results, etc. Have it open in Studio alongside the classes being edited, then call it from the terminal. Note that, to save typing, <Ctrl>+P can be used to get the previous entries at the command line (and <Ctrl>+N for the next) and allow editing. Alternatively, for repeated test cases I often just copy and paste code from Word or OneNote. / Mike
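P.S. Something along these lines (an invented example; the class and method names are hypothetical):

TEST    ; noddy test routine - edit in Studio, then D ^TEST from the terminal
    KILL                                  ; start with a clean set of locals
    SET obj=##class(My.Thing).%New()      ; the class under test
    SET obj.Name="Fred",obj.Count=3
    WRITE "Result: ",obj.Describe(),!     ; run the method and show what came back
    ZWRITE obj                            ; dump the object's state for checking
    QUIT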

Hi. You say the codebase is over 30 years old. Well, I have a solution from 1991...

The PAS system I (still) work on has some standard search and display ("code list") software that has to return all codes "starting with", but it's not pretty. As the starting point for the $O() it does this:

	; Return a string immediately preceding input in collating sequence (returns null if input is null)
	; ABC becomes ABB||, -1 becomes -1.0000000001, -.1 becomes -.1000000001, 0 becomes -.0000000001, .1 becomes .0999999999, 1 becomes 0.9999999999
SEED(A)	Q:A="" ""  ; null string
	I '$$NUMERIC(A) S LEN=$L(A),T=$E(A,1,LEN-1)_$C($A($E(A,LEN))-1)_"||" Q T
	Q A-(1E-10)

Since you know your target global only has non-numeric subscripts, you won't need to see the 5 lines of nastiness that is the NUMERIC call. :-) The end condition of your SEED=$O(@CLGREF@(SEED)) loop is either the usual "", or ZIN'=$E(SEED,1,$L(ZIN)) (again ignoring the horrible code dealing with numeric values).
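So the calling loop ends up something like this (sketch only, reusing the names above: ZIN is the user's input and CLGREF references the code list global):

	SET SEED=$$SEED(ZIN)	; start just before the first possible match
	FOR  SET SEED=$O(@CLGREF@(SEED)) QUIT:SEED=""  QUIT:ZIN'=$E(SEED,1,$L(ZIN))  DO
	. WRITE SEED,!		; process/display each code starting with ZIN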

Apologies for the ancient coding style (not mine, but I wrote similar back then).  / Mike

Hi. If you mean the ODBC driver, then it gets installed when you install Caché. So any Caché install file for that version has it. I don't know if you can select to install only the driver and nothing else, as I always want the full lot on my PC.

(... just tried and a "custom" setup allows you to remove everything but the ODBC driver, but it's fiddly.)

Hi. Interesting topic.

Why did you choose to become a software engineer / developer?

- Always had an interest in science and tech growing up. Introduced to programming at school and enjoyed it. When looking for a job, there was a programming one, so I chose it. And never looked back.

How and when did you start to generate a "flow state of mind" during your career?

- Again, always had it, I think. Give me a good programming problem and I can lose hours without noticing.

What are recommended habits, inside and outside work, during your own time and during your work time, to stay focused during your coding sessions and daily tasks?

- As I'm lucky and it "just happens" I'm not sure I'm a good source of ideas on this. But at work I find it helps to try to stop other things breaking my concentration. So I will "clear the decks": very little in my email in tray (it all goes into "later" or is set up as a task, or done, or deleted) and all big things to do are recorded as tasks, and scheduled, so that I don't have that "I must remember to do x" popping into my head.

The hard part is avoiding interruptions like new emails and chats from colleagues. Sometimes I just don't notice them, though, as I'm in "flow". So the problem is the other way around: missing important, urgent stuff! Best to find/allocate time when I'm not expected to respond quickly. Often I split my day into two: first I'll do all the other stuff like emails, admin, training, meetings, etc. and as a reward, play with code later. :-)

Hi. Label printers are a pain. :-) In the application I worked on, the printing was originally to matrix/line printers via a "driver" that knew the escape codes to use. But then along came the "thermal" printers using things like ZPL, and squeezing that into a line-by-line driver was not easy. We had to keep track of a virtual print-head "position".

Using a tool to build a template and just providing the field data from IRIS sounds like a much easier route. I think you can save templates in the printer memory, if you don't want to store the text in IRIS.

If you do want to build the printer commands yourself, I recommend hiding them behind a class that stores a virtual label, with methods to add the commands that take sensible inputs like x and y in mm (or inches), rotation in degrees, font in points, etc. Then internal functions can convert to dots, replace special characters, work out the font to use, etc. And then have a final "print" method that writes the whole set of label commands out.
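As a very rough illustration (a sketch only, with made-up names and ZPL details; check the printer manual for the real font and dots-per-mm figures):

Class Demo.Label.ZPL Extends %RegisteredObject
{

/// Printer resolution: a 203dpi head is roughly 8 dots per mm
Property DotsPerMM As %Integer [ InitialExpression = 8 ];

/// The accumulated label commands
Property Commands As list Of %String;

/// Add a text field at x/y in mm with a font size in points
Method AddText(pXmm As %Numeric, pYmm As %Numeric, pText As %String, pFontPt As %Numeric = 10) As %Status
{
	Set x = pXmm * ..DotsPerMM \ 1, y = pYmm * ..DotsPerMM \ 1
	Set dots = pFontPt * 3 \ 1	// points-to-dots factor is a guess; real code would also escape ^ and ~ in pText
	Do ..Commands.Insert("^FO"_x_","_y_"^A0N,"_dots_","_dots_"^FD"_pText_"^FS")
	Quit $$$OK
}

/// Write the whole label to the current device (e.g. a TCP connection to the printer)
Method Print() As %Status
{
	Write "^XA",!	// start of label
	For i=1:1:..Commands.Count() Write ..Commands.GetAt(i),!
	Write "^XZ",!	// end of label
	Quit $$$OK
}

}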

Have fun,

Mike

When we've had license problems in the past, the WRC have supplied us with a routine that watches for the limit warnings and then does a one-off dump of what processes are running, for later investigation. We also wrote our own code to take a snapshot of license usage to a file, which is called by the Caché Task Manager at regular intervals so that we can analyze license usage over time. It helps when trying to see if the site needs more, or sometimes fewer, licenses. There's nothing like a good graph to prove your point. :-)
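The snapshot itself can be tiny; roughly this (a sketch only, in a utility class of your own, with the file name just an example), called from a scheduled task:

ClassMethod LogLicenseUse(pFile As %String = "/data/licenses.csv") As %Status
{
	// append a timestamped count of consumed license units to a CSV file
	Set used = $SYSTEM.License.LUConsumed()
	Open pFile:("WAS"):5 Else  Quit $$$ERROR($$$GeneralError,"Cannot open "_pFile)
	Use pFile Write $ZDATETIME($HOROLOG,3),",",used,!
	Close pFile
	Quit $$$OK
}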

I'm no particular fan of this aspect of the language, but changing it as an "option" (by namespace? by routine?) would be a nightmare for maintenance. :-) Only recently I fell foul of it with code like IF type="X"!type="Y", which can never be true.

You could argue that it is at least very simple to understand. Just one rule to remember - left to right, except for brackets - rather than a precedence order for every single operation you might use!
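So, for the example above, the cure is just brackets:

	IF type="X"!type="Y"		; evaluates as ((type="X")!type)="Y", which is never true
	IF (type="X")!(type="Y")	; what was actually meant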

Hi. It's an awfully long time since I looked at that tool, but Michel is probably right: it's to do with how the lines are defined. I have a vague memory that "multiline" was actually provided as an array, not one string with delimiters, and the simple top node of the array held a line count, e.g.

remarks=2
remarks(1)="first line"
remarks(2)="last line"

But I could be wrong.  :-)  Try the help on that "options" field.  / Mike

Hi. This was a few years ago, but we could not find any way to do this in the standard Ensemble setup at the time. The Message Viewer can be asked the right sort of questions, but it then times out because the query takes too long.

For monthly stats we wrote embedded SQL in a class method that we ran at the command prompt. This wrote out the results as CSV to the terminal, and once it finished we copied and pasted the various blocks into Excel. Very slow and manual, but only once a month, so not so bad. (Our system only purged messages after many months.)

Then we wanted daily numbers to feed into another spreadsheet. So we built an Ensemble service running once a day that ran similar embedded SQL, but only for the one previous day of messages. It put the CSV results into a stream in a custom message that went to an operation, which then sent emails out to a user-defined list. Essential bits below. Hope it helps.

    Set today = $ZDATE($H,3)_" 00:00:00"
    Set yesterday = $SYSTEM.SQL.DATEADD("day",-1,today)
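    ; [yesterday, today) covers the whole of the previous calendar day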
    Set tRequest = ##class(xxx.Mess.Email).%New()
    Set tRequest.To = ..SendTo
...etc
    &sql(DECLARE C1 CURSOR FOR
    SELECT
          SUBSTRING(DATEPART('sts',TimeCreated) FROM 1 FOR 10),
          SourceBusinessType,
          SourceConfigName,
          TargetBusinessType,
          TargetConfigName,
          COUNT(*)
        INTO :mDate,...
        FROM Ens.MessageHeader
        WHERE TimeCreated >= :yesterday AND TimeCreated < :today
        GROUP BY...
        ORDER BY...
    )
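    ; loop over the cursor, writing one CSV line per row into the email attachment stream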
    &sql(OPEN C1)
    &sql(FETCH C1)
    While (SQLCODE = 0) {
        Set tSC = tRequest.Attachment.WriteLine(mDate_","_...
        &sql(FETCH C1)
    }
    &sql(CLOSE C1)
    Set tSC = ..SendRequestAsync($ZStrip(..TargetConfigName,"<>W"),tRequest) If $$$ISERR(tSC)...