Probably not, judging by the underlying code.

I would say it's being raised by cspxmlhttp.js when it gets a non-200 status code.

If there was a server-side option then we would probably see some kind of conditional around either of these two functions...

function cspProcessResponse(req) {
  if(req.status != 200) {
    var errText='Unexpected status code, unable to process HyperEvent: ' + req.statusText + ' (' + req.status + ')';
    var err = new cspHyperEventError(req.status,errText);
    return cspHyperEventErrorHandler(err);
  }

...

}

function cspHyperEventErrorHandler(error)
{
  if (typeof cspRunServerMethodError == 'function') return cspRunServerMethodError(error.text,error);
  alert(error.text);
  return null;
}

Hi Bapu,

There is a really simple solution, no Zen required.

Put some pre tags on your web page...

<pre id="json-preview-panel"></pre>

If your JSON is an object then...

document.getElementById("json-preview-panel").innerHTML=JSON.stringify(json, undefined, 2);

Note that the third argument in stringify() is the number of spaces to insert for prettifying.

If your JSON is a string already then you will need to convert it to an object and then back again...

document.getElementById("json-preview-panel").innerHTML=JSON.stringify(JSON.parse(json),undefined,2);

Sean.

Hi Paul,

Quotes inside quotes need to be escaped. In a COS string literal a double quote is escaped by doubling it, so to match a value of two double quotes you need six quotes in total (the outer pair delimiting the string, plus two doubled quotes). Your condition is only looking for one double quote; you will need to try this...

source.{PV1:DischargeDateTime()}=""""""

On a side note, quotes sent in HL7 can be used to nullify a value, e.g. if a previous message had sent a discharge date and time by mistake then "" would be a request to delete that value (as opposed to an empty value).

Sean.

Try this...

ClassMethod Transform(source As EnsLib.HL7.Message, Output target As EnsLib.HL7.Message) As %Status
{
    // clone the source so that the original (immutable) message is never modified
    set target=source.%ConstructClone(1)
    set seg=target.FindSegment("OBX",.idx,.sc)
    while idx'="",$$$ISOK(sc)
    {
        // build an NTE from the OBX observation value (OBX:5), numbering it with $INCREMENT
        set ntestr = "NTE|"_$I(ident)_"|"_seg.GetValueAt(5)
        set nte = ##class(EnsLib.HL7.Segment).ImportFromString(ntestr,.sc,source.Separators) if $$$ISERR(sc) goto ERROR
        // swap the new NTE in at the OBX's index, then look for the next OBX
        set sc=target.SetSegmentAt(nte,idx) if $$$ISERR(sc) goto ERROR
        set seg=target.FindSegment("OBX",.idx,.sc)
    }
    }
ERROR
    quit sc
}
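You can test it from a terminal with something like this (the class name is hypothetical, and source is an EnsLib.HL7.Message you already have in hand)...

set sc=##class(My.Transforms.OBXToNTE).Transform(source,.target)
if 'sc do $System.Status.DisplayError(sc)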

Hi Tom,

Should have spotted this earlier. GetSegmentAt will return an immutable segment, so you shouldn't be able to recycle that segment for other purposes.

If you create a new segment, then you might be able to set it at the old segment's idx, but having never done it this way I wouldn't be 100% sure that it would work.

By all means give it a go, but you should at least test the status and bubble it back up the stack so that you don't end up with silent failures. If there is an error it will appear in the logs.

set sc=target.SetSegmentAt(newsegment,idx)
if $$$ISERR(sc) quit sc

BUT, if I was doing this by hand, I would remove the segment with...

set sc=target.SetValueAt(,"PIDgrpgrp(1).ORCgrp(1).OBXgrp(1).OBX","remove","")

I've explicitly hard-coded the groups. Note that this path is a 2.4 schema path and may be different for other schemas. If your data does have repeating groups in it, then you will need to set these indices logically.

I would then set the two values using...

set sc=target.SetValueAt(pObservationIdentifier,"PIDgrpgrp(1).ORCgrp(1).NTE(1):SetIDNTE","set","") 
set sc=target.SetValueAt(pObservationValue,"PIDgrpgrp(1).ORCgrp(1).NTE(1):SourceofComment","set","")

However, I wouldn't do this by hand at all. Having developed thousands of DTLs over the years, 95% of them have been built via the Data Transformation Build tool. The code it generates will have no typos in the schema paths, it will handle immutability for you, it will trap errors for you, and you will end up with a more maintainable solution.

If anything, use the tool and inspect the code it generates to see the right way to develop by hand.

Sean

Hi Evgeny,

Not exactly one command, but it can be done on one line...

set file="foo.zip" do $System.OBJ.ExportToStream("foo*.GBL",.s) open file:("WNS":/GZIP=1) use file Do s.OutputToDevice() close file do s.Clear()

This should work in reverse: open the file with the GZIP flag, read the contents into a temporary binary stream, and then use $System.OBJ.LoadStream on that stream.
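Something along these lines should do it (an untested sketch; it relies on the &lt;ENDOFFILE&gt; error ending the read loop)...

set file="foo.zip", tmp=##class(%Stream.TmpBinary).%New()
open file:("RS":/GZIP=1) use file
try { for { read chunk#32000 do tmp.Write(chunk) } } catch {}
close file do tmp.Rewind()
do $System.OBJ.LoadStream(tmp,"ck")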

Sean.

Hi Scott,

Sounds like classic teapotism from the vendor.

Typically at this stage I would put Wireshark on the TCP port so that I have absolute truth as to what's going on at the TCP level.

If you see no evidence of these messages in Wireshark then you can bounce the problem back to the vendor with the Wireshark logs.

If you see evidence of messages, then you will have something more to go on.

One thing to look out for is whether the HL7 messages are correctly wrapped. If you don't see evidence of the ending 1c 0d hex values then the message will get stuck in the buffer, and if they then drop the connection this data can get discarded. You might see warnings relating to this, something like "discarding TCP buffer".
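For reference, a correctly wrapped (MLLP) message looks like this on the wire...

0b MSH|^~\&|... message bytes ... 1c 0d

...where 0b is the start block, 1c the end block and 0d the trailing carriage return.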

The fact that they think they are getting HL7-level ACKs back is a bit odd. Again, with Wireshark you will be able to prove or disprove their observations. There is a scenario whereby a timed-out connection can collect the previous message's ACK; again, it would be obvious once you look at the Wireshark logs.

If you need help with Wireshark then I can dig around for some notes that might help.

Sean.

> Let me be crystal clear and honest - this is horrible

LOL, well, let's crack open Ensemble and explore some macro code...

In all honesty, this post was not advocacy but exploration.

Map, Reduce and Filter are functions that I use every day in other languages but never think to emulate in COS. Seeing the original post got me thinking: why can't we have them in COS as well?

It's good to explore these ideas, particularly as other languages are outpacing COS in a very big way. How else would these features end up in the core language?

I agree, the dot syntax is a bit old school.

At the moment it's the only way that I can think of to pass code into the context of a map/reduce function.

It doesn't look so bad when part of a wider COS code block...

ClassMethod Test2()
{
  set originalCollection = ##class(%ListOfDataTypes).%New()
  do originalCollection.Insert("Sean")
  do originalCollection.Insert("Mark")
  do originalCollection.Insert("Bob")

  $$$map(originalCollection,newCollection,item)
  .$$$return($ZCONVERT(item,"U"))

  $$$foreach(newCollection,item)
  .write !,item

}

If COS implemented lambda syntax using arrow functions then it would look a lot cleaner.

It wouldn't be hard for the COS compiler to implement. The inner code block would be scoped off to its own underlying M function, with its return value being a quit back to the output of the macro or classmethod call.

ClassMethod Test2()
{
  set originalCollection = ##class(%ListOfDataTypes).%New()
  do originalCollection.Insert("Sean")
  do originalCollection.Insert("Mark")
  do originalCollection.Insert("Bob")

  set newCollection=$$$map(originalCollection, (item) => {
    return $ZCONVERT(item,"U")
  })

  $$$foreach(newCollection, (item) => {
    write !,item
  })

}

Nice article Eduard.

Re: last comment. Not sure how having to change the implementation of a macro is any more of a pain than having to change hundreds of class method arguments. If anything, macros are a great way to abstract application-wide changes with a single line of code. Granted, recompilation is required, but code that can't withstand recompilation all day long has a deeper problem.

My only negative about macros is when you can't read the code for macro soup; less is more for me.

One tip to add: I like to have auto-complete work for macros. If you precede each macro with a triple-slash comment, then it will appear in the auto-complete suggestions...

///
#define LogNone(%message)         $$$LogEvent("NONE", %message)
///
#define LogError(%message)        $$$LogEvent("ERROR", %message)
///
#define LogFatal(%message)        $$$LogEvent("FATAL", %message)
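
Here $$$LogEvent would be your own macro; purely as a hypothetical illustration, it might be defined as...

///
#define LogEvent(%type,%message)  do ##class(%SYS.System).WriteToConsoleLog(%type_": "_%message)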