Clayton Lewis · Sep 26 You most likely need to configure the Format setting in your Business Operation, assuming you're using EnsLib.EDI.XML.Operation.FTPOperation or EnsLib.EDI.XML.Operation.FileOperation. Set that to "w" to get the Windows-style CRLF line terminator. Click the label in the Settings panel to get the full list of options. Note that these are the same format codes used by the Output* methods in EnsLib.EDI.XML.Document, so you can do the same thing in your own code as well as in the Operation. You can combine these codes. For example "wt" would indicate Windows-style line terminators and tab indentation.
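As a rough sketch of using the same format codes from your own code (the exact OutputToFile parameter list should be checked against the EnsLib.EDI.XML.Document class reference; the file path here is a placeholder):

```objectscript
 // Write an XML document with Windows-style CRLF terminators and tab indentation.
 // "wt" combines the "w" (Windows line terminator) and "t" (tab indent) codes.
 Set tSC = pDocument.OutputToFile("C:\out\claim.xml", "wt")
 If $$$ISERR(tSC) Do $SYSTEM.Status.DisplayError(tSC)
```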
Clayton Lewis · Sep 3 This is the part that's not behaving as expected:

W $ZTIMEH("12:05:38.975411826")
Output: 338

On my instance I get:
Output: 43538.975411826

$ZTH takes a second argument that specifies the time format. If you omit that value it uses the default, which you can see with this:

WRITE ##class(%SYS.NLS.Format).GetFormatItem("TimeFormat")

I suspect you'll get 3 or 4, since those are the formats for a 12-hour clock. Try sending 1 or 2 for a 24-hour clock:

W $ZTIMEH("12:05:38.975411826", 1)

https://docs.intersystems.com/iris20241/csp/docbook/Doc.View.cls?KEY=RCO...
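To spell out why format 1 gives that result:

```objectscript
 // Format 1 = 24-hour clock, so the full time string parses cleanly:
 WRITE $ZTIMEH("12:05:38.975411826", 1)  ; 43538.975411826
 // 43538 = 12*3600 + 5*60 + 38 seconds past midnight,
 // with the fractional seconds carried through unchanged.
```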
Clayton Lewis · Aug 26 Assuming this is code that runs outside of a Production and is trying to invoke a Business Service that does run within a Production in the same Namespace, the approach would generally be like this:

1) Create an instance of the Business Service:
set tSC = ##class(Ens.Director).CreateBusinessService(tServiceName, .tServiceInstance)

2) Invoke the Business Service:
set tSC = tServiceInstance.ProcessInput(tRequest, .tResponse)

This technique is used for SOAP services, REST APIs, and Tasks that invoke functionality provided by a Production.
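Putting the two steps together with error checking (the service name "MyService" and the use of Ens.StringRequest are just placeholders; substitute your own configured service name and request class):

```objectscript
 // Sketch: invoke a Business Service from code running outside the Production
 // framework but in the same Namespace as the running Production.
 Set tSC = ##class(Ens.Director).CreateBusinessService("MyService", .tServiceInstance)
 If $$$ISERR(tSC) Quit tSC
 Set tRequest = ##class(Ens.StringRequest).%New()
 Set tRequest.StringValue = "hello"
 // ProcessInput hands the request to the service and returns the response by reference.
 Set tSC = tServiceInstance.ProcessInput(tRequest, .tResponse)
```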
Clayton Lewis · Apr 10 Will that work for ObjectScript? Don't we need to save before we can compile and even find out if there are errors?
Clayton Lewis · Apr 9 I think the 4000-character read from the unencoded source is the problem. 4000 isn't a multiple of 3 bytes, meaning that the encoded version of each chunk will have trailing "=" signs as padding, in this case 2 of them. Changing to 3000 characters got it to work for me, because the encoded chunks then have no trailing "=".

An easy test is to look at your final encoded text. If you see "=" characters anywhere except at the end, this is your problem. The trailing "=" is a problem because when you concatenate the encoded chunks, those padding characters end up embedded in the result, making it invalid base64.

You need to arrange it so you don't get trailing "=" for any chunk except the last one. You do that by ensuring that your unconverted chunks have a byte count divisible by 3: base64 encodes each 3-byte group as 4 characters, and only a final partial group triggers "=" padding.
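A minimal sketch of the chunked loop, assuming the source is a stream and using $SYSTEM.Encryption.Base64Encode (the second argument suppressing inserted line breaks is an assumption to verify against your version's documentation):

```objectscript
 // Sketch: encode a large stream in padding-free chunks.
 // 3000 is divisible by 3, so no chunk except possibly the last
 // produces "=" padding, and the concatenation stays valid base64.
 Set tEncoded = ""
 While 'tStream.AtEnd {
     Set tChunk = tStream.Read(3000)
     Set tEncoded = tEncoded _ $SYSTEM.Encryption.Base64Encode(tChunk, 1)
 }
```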
Clayton Lewis · Apr 9 This is helpful, but I feel like it's still just a workaround for the real problem, which is that the compiler is reformatting code. Even if I allow that the compiler knows more about how code should be formatted than I do (it doesn't, if only because correct formatting is a matter of personal opinion), it shouldn't be doing that until it has fully parsed and validated the code. If it hasn't successfully parsed the code, it can't possibly make good decisions about how the code should be formatted.
Clayton Lewis · Jan 23 You may have messages that are still queued in an incomplete status. Try Purge Management Data with these settings:

Include Message Bodies: Yes
Purge Only Complete Sessions: No
Do Not Purge Most Recent Days: 0
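If you'd rather run the same purge from code, a sketch using the purge task class might look like this (the class and property names here are assumptions based on the task's settings; verify them against your version's class reference before relying on this):

```objectscript
 // Sketch: run Purge Management Data programmatically.
 Set tTask = ##class(Ens.Util.Tasks.Purge).%New()
 Set tTask.BodiesToo = 1           ; Include Message Bodies: Yes
 Set tTask.KeepIntegrity = 0       ; Purge Only Complete Sessions: No
 Set tTask.NumberOfDaysToKeep = 0  ; Do Not Purge Most Recent Days: 0
 Set tSC = tTask.OnTask()
```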
Clayton Lewis · Jan 8 You need to look at timestamps in 2 globals:

^ROUTINE contains the source code of the routine.
^ROUTINE(RoutineName,0) = Timestamp when the routine was last saved ($ZTS format, local timezone).

^rOBJ contains the compiled object code. This will not exist if the routine has never been compiled.
^rOBJ(RoutineName,"INT") = Timestamp when the INT routine was last compiled.

If the date in ^ROUTINE is later than in ^rOBJ, the compiled code may be out of sync with the source. This isn't guaranteed, though, since the last save doesn't necessarily reflect a code change that would require a recompile.

For MAC routines you'll want to look at:

^rOBJ(RoutineName,"MAC") = Timestamp when the MAC routine was last compiled
^rMAC(RoutineName,0) = Timestamp when the MAC routine was last saved

For Classes I believe you want TimeChanged from %Dictionary.CompiledClass. You can get that using SQL:

select TimeChanged from %Dictionary.CompiledClass where ID = FullyQualifiedClassName
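The comparison above can be sketched in code like this ("MyRoutine" is a placeholder; since $ZTS values are "days,seconds", the two pieces are compared numerically rather than as one string):

```objectscript
 // Sketch: flag an INT routine whose source was saved after its last compile.
 Set tName = "MyRoutine"
 Set tSaved = $GET(^ROUTINE(tName, 0))
 Set tCompiled = $GET(^rOBJ(tName, "INT"))
 If tCompiled = "" {
     WRITE tName, " has never been compiled", !
 } ElseIf (+tSaved > +tCompiled)
       || ((+tSaved = +tCompiled) && (+$PIECE(tSaved, ",", 2) > +$PIECE(tCompiled, ",", 2))) {
     // Later save date, or same date with later seconds-past-midnight.
     WRITE tName, " source may be newer than its object code", !
 }
```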
Clayton Lewis · Jul 12, 2023 I agree that developers need this information presented in a better way, but I'm thinking the approach would be to have the UI read the Message Map rather than adding properties to the Operation class. Here's my reasoning. One concern with this idea is that a Business Operation can accept many kinds of request messages. In a sense the Business Operation doesn't actually have a request message at all. It's the individual methods within the Operation that do, and the Message Map specifies how they align. I suppose the Operation class could have a property for each method-level request class, but that doesn't seem to be very helpful. To make it useful there'd need to be a way to match the properties with their respective methods, such as by naming convention. Even then, it would just be a form of documentation because the Operation code has no technical need for properties of that kind. We could certainly rework things so it does use them, but that seems forced and I think it would make things more complicated rather than simpler. For one thing, it would mean that the calling framework needs to set the correct property of the Operation object based on what method it's about to call. Then it needs to clear the reference after the method returns. If you didn't clear the reference it would persist until the next call to that method, in the meantime preventing the request object from being garbage collected. That could cause any number of problems related to resources like locks, network connections, file handles, etc. not being released. You'd have similar bookkeeping for the response message if you moved that to a property.
Clayton Lewis · May 2, 2023 I'm not sure it's really necessary, but I typically do this after a hosts file change: ipconfig /flushdns
Clayton Lewis · Apr 21, 2022 I've used this approach before and it generally works fine, but there are some additional things to consider:

1) If your code contains SQL that references tables within the package you're renaming, you need to find/replace the schema as well as the package name. E.g., package my.demo would be schema my_demo. If you don't do this, the SQL in your new package will reference the tables in your old package.

2) If you have Persistent classes in your package, you'll likely want to export with the /skipstorage qualifier. That will cause export to omit storage maps, so that the new package gets new storage maps and new globals when you compile. If you don't do this, your new package might use the same globals as the old one, because find/replace wouldn't change compressed global names like ^package.foobar9876.Class1D.

3) If you follow the previous suggestion, you may run into another problem if you try to copy your old globals to the new ones, so that you're bringing forward your data as well as your code into the new package. The issue is that the new storage map will have all properties in the order they're declared. If you've added properties over time to the original class, they may be in a different order, making the storage maps incompatible. That happens because new properties are always added at the end of an existing storage map, regardless of where they're declared. That prevents the need to convert data to a new structure when new properties are added to a deployed class. In that case you'll need to manually fix the new storage map by copying forward the old <Value> tags, while retaining the new global references.
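For point 2, the export with /skipstorage can be sketched like this (the package name and output path are placeholders; check the qualifier and the $SYSTEM.OBJ.ExportPackage signature against your version's documentation):

```objectscript
 // Sketch: export a package without storage maps, so the renamed copy
 // gets fresh storage maps and fresh globals when it is compiled.
 Set tSC = $SYSTEM.OBJ.ExportPackage("my.demo", "C:\temp\mydemo.xml", "/skipstorage")
 If $$$ISERR(tSC) Do $SYSTEM.Status.DisplayError(tSC)
```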
Clayton Lewis · Feb 23, 2022 Thanks Jorge, this is helpful. Your example of computing BMI is actually what the customer is trying to do, although they want to do it in Health Insight. That allows them to report BMI and use it in analytics, such as to compute a risk score. After discussing with them we agreed that converting on consumption is safer than on ingestion, which is consistent with what you showed for the CV.
Clayton Lewis · Jun 1, 2018 The premise is that when you have a timestamp property that's set at the time of row insert, there will be a guarantee of those timestamps being "in order" with respect to Row IDs.

At first glance that sounds reasonable, and is probably almost always true, but I'm not sure it's guaranteed. Isn't there a race condition around saving the timestamp and generating the new row ID? That is, couldn't you have a flow like this:

Process 1, Step 1: Call %Save. In %OnSave: set ..CreatedAt = $zts (let's say this gives me 2018-06-01 16:00:00.000)
Process 2, Step 1: Call %Save. In %OnSave: set ..CreatedAt = $zts (let's say this gives me 2018-06-01 16:00:00.010) << +10ms
Process 2, Step 2: Generate new Row ID using $increment, and complete %Save (let's say this gives me RowID = 1)
Process 1, Step 2: Generate new Row ID using $increment, and complete %Save (let's say this gives me RowID = 2)

Is that likely? Definitely not, but I don't think it's impossible. Actually, it might be fairly likely in an ECP environment where multiple ECP Clients are inserting data into the same table, one reason being that system clocks could be out of sync by a few milliseconds.

Does that make sense, or am I missing something? For example, would this all be okay unless I did something dumb with Concurrency? If so, would that still be the case in an ECP environment?
Clayton Lewis · May 1, 2018 Okay, thanks for updating. That error didn't seem to make sense based on what you showed before.

This very simple BPL might help you see how to declare and use a List:

<process language='objectscript' request='Ens.Request' response='Ens.Response' height='2000' width='2000' >
<context>
  <property name='MyList' type='%Integer' collection='list' instantiate='0' />
</context>
<sequence xend='227' yend='451' >
  <assign name="Append 1" property="context.MyList" value="1" action="append" xpos='278' ypos='291' />
</sequence>
</process>

Note that I don't need to initialize my list property. That will happen automatically.

Also note that I'm using action='append'. That will insert the new value at the end of the list. It corresponds to this in COS:

do context.MyList.Insert(1)

BPL also has action='insert', but that inserts into a specific location. It's equivalent to InsertAt for lists, or SetAt for arrays.
Clayton Lewis · May 1, 2018 Probably unrelated, but you most likely do want MSI.IN835.EOBList to have storage.

The reason is that if your BPL suspends for any reason (such as a Call) between the time you set and use that data, you'll lose whatever values you were trying to save. That's because the job that's executing your BPL will %Save the Context object and go work on another task while it's waiting. When the Call returns, it will reload the Context object and resume work. If you extend %RegisteredObject, your context property won't survive the save/reload.

It might be tempting to ignore that if you're not currently doing any intervening Calls, but things tend to change over time, so doing it now could prevent a hard-to-find bug later.

%SerialObject is probably better than %Persistent for this, because that way you won't have to implement your own purge of old objects.

Or, if you only need to store a list of integers, you could just declare your context property as that and skip the custom wrapper class.
Clayton Lewis · Dec 1, 2017 How about an XML Stylesheet?

https://stackoverflow.com/questions/24122921/xsl-to-convert-xml-to-json
https://stackoverflow.com/questions/43355563/convert-xml-to-json-using-xslt
Clayton Lewis · Jan 30, 2017 You need to provide subscript values for the three loops. This will give you the 1st member of each collection:

source.{loop2000A(1).loop2000B(1).loop2300(1).CLM:ClaimSubmittersIdentifier}

It's likely that in a real DTL you'll want to loop over each collection, because there will probably be multiple claims in the message. Use ForEach to do that:

<foreach property='source.{loop2000A()}' key='k2000A' >
  <foreach property='source.{loop2000A(k2000A).loop2000B()}' key='k2000B' >
    <foreach property='source.{loop2000A(k2000A).loop2000B(k2000B).loop2300()}' key='k2300' >
      <assign value='source.{loop2000A(k2000A).loop2000B(k2000B).loop2300(k2300).CLM:ClaimSubmittersIdentifier}' property='target.ClaimInvoiceNo' action='set' />
    </foreach>
  </foreach>
</foreach>

Note that as written, this will leave only the last ClaimInvoiceNo in your target. You'll need to adjust it to make sure you process each of them.