Is the timeout occurring while waiting for the System Management Portal web page to return HTML content to the web browser?

In the "Web Gateway" configuration, "Default Parameters" page the setting "Server Response Timeout" is default 60 Seconds.

Also, if using the internal Apache web server for the SMP (install dir/httpd/conf/httpd.conf), the default "Timeout" is 300 seconds.

The lower of these values is applied, so you could start by increasing the Web Gateway timeout setting first to see if it gives the page more time to return content.
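
For reference, a minimal sketch of the Apache side (the 300-second value is the default mentioned above; adjust the path for your install directory):

# <install dir>/httpd/conf/httpd.conf
# Apache's own limit. The lower of this and the Web Gateway
# "Server Response Timeout" wins, so raise both if needed.
Timeout 300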

Great questions to consider.

1. The single message.
  The principle with the example is a single base message for maximum reuse.
  However, you can define many methods with the prefix "Test" and these are automatically run by the UnitTest runner.
  Within each method you can send one or more variations of the base test message.
  For example: Consider RoutingRules with different behaviour for Age and Gender.
  Within the same TestMethod you can implement small updates to the base message to ensure full coverage of your routing rule logic.
  ie: A TestAgeGender method could send adjusted messages for various age ranges and genders (see the sketch after this list).
2a. Yes, the automatic correlation of one base message in BeforeAllTests was done as a convenience.
  It is trivial to add a helper method to pull messages from different XData sources as needed. I will add this. Good suggestion.
 b. Yes. I could also implement a much simpler utility that simply reads HL7 files from a directory and outputs a CSV report of:
   * Source filename
   * Routing Result: which rule number was applied
   * Routing Return Value: where it was sent and what transform was employed
   This doesn't need the UnitTest runner or the corresponding web report to be generated.
   I expect you could run the "Test Reporting" class on both the pre-migration and the post-migration platform and compare the output reports.
3. Yes, I have experience with UnitTests for DTLs.
   The particular challenge I find with these is generated fields, like some dates and the episode number.
   ie: You want to compare most things you transform about a message, but exclude certain fields from UnitTest validation because they always have a different / unique value.
   A non-intelligent comparison tool will just say the output files are always different.
   For this I have some other code that analyses all DTLs in a namespace and generates corresponding UnitTest classes with targeted "Compare" methods, generated by parsing the DTL XML.
   This allows you to easily go in after generation and comment out the few specific HL7 fields you wish to exclude from comparison between the expected and the actual HL7 transform result.
   The reuse here is in adding new XData message pairs (the source and the expected transformed message) as tests in the class definition.
   The UnitTest automatically picks up the new messages to transform and validate, so you could actually employ an analyst rather than a developer to enhance DTL UnitTests for edge cases.
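
To make item 1 concrete, here is a minimal sketch of that TestAgeGender idea. The BaseMessage property and the SendThroughRouter helper are hypothetical placeholders for however your test base class loads the base message (e.g. in OnBeforeAllTests) and invokes the routing rule; the PID paths and target names are purely illustrative.

Class Test.RoutingAgeGender Extends %UnitTest.TestCase
{

/// Hypothetical: assumed to be populated from the base XData message
/// (with DocType set) before the Test* methods run
Property BaseMessage As EnsLib.HL7.Message;

Method TestAgeGender()
{
	// Each scenario varies date of birth (PID:7) and gender (PID:8),
	// then asserts the routing target expected for that combination
	for scenario=$lb("19970101","M","ToGeneral"),$lb("19970101","F","ToGeneral"),$lb("19520101","M","ToGeriatrics") {
		set msg=..BaseMessage.%ConstructClone(1)
		do msg.SetValueAt($list(scenario,1),"PID:7")
		do msg.SetValueAt($list(scenario,2),"PID:8")
		set target=..SendThroughRouter(msg)
		do $$$AssertEquals(target,$list(scenario,3),"Routing for DOB "_$list(scenario,1)_" / gender "_$list(scenario,2))
	}
}

/// Hypothetical stub: replace with however your test framework sends the
/// message through the routing rule and returns the resolved target
Method SendThroughRouter(pMessage As EnsLib.HL7.Message) As %String
{
	quit ""
}

}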

The main value of UnitTests is for continuous integration and/or avoiding future changes breaking code behaviour in unexpected ways.
It also allows you to validate parts of a production in isolation.
It automates regression tests of code.
It can save time on rework. For example, I had a routing rule for routing and filtering messages based on their SNOMED classifications. The rules were quite complex and would take hours to test manually.
However, with one Routing Rule UnitTest I could add an enhancement and retest in minutes, with the confidence that previous behaviour was preserved.
There is a judgement on which approach brings the most immediate value, but in the longer term incrementally adding UnitTests can enforce more robust and controlled transformation and routing behaviors.

Taking a step back, and please don't be offended, but I just wanted to confirm something.
When accessing the Web Gateway, does the URL in the browser contain the internal web server port number?
ie: Is this: http://servername:52773/csp/bin/Systems/Module.cxw  (Internal WebServer)
or http://servername/csp/bin/Systems/Module.cxw  (External WebServer)

Also...
In the Web Gateway, under Configuration -> Default Parameters, do you have a non-default value set for "Event Log File"?

Interesting challenge, as these are not overridable properties of the Task Definition.

However, you can hijack the current device to send the output somewhere else.

Here is an example that dynamically adds a datetime suffix to the log file for a built-in system task.

So if you enable logging to "c:\temp\T" (Windows example) on this task and select the OutputFileSuffix value "[YY]YY-MM-DD", then the task output is actually directed to a new file such as "c:\temp\T20220711_153600".

/// Seems to need %SYS.Task.Definition in super list to be
/// visible in Task Schedule Wizard
Class Test.SuperRunLegacyTask Extends (%SYS.Task.RunLegacyTask, %SYS.Task.Definition)
{

/// Example flexible property to enable selecting a given Date and Time format 
/// from the Schedule Wizard, to use as Suffix appended to log file name (When in use)
Property OutputFileSuffix As %String(DISPLAYLIST = ",MM/DD/[YY]YY,DD Mmm [YY]YY,[YY]YY-MM-DD,DD/MM/[YY]YY", VALUELIST = ",1,2,3,4") [ InitialExpression = 3 ];

Method OnTask() As %Status
{
	set current=$IO
	set useDev=0
	
	// Check the device has at least 2 path separators
	// Check it is not a null device
	if $L($TR(current,"\/"))<($L(current)-1),$L(current,":")<2 {
		set dev=current_$TR($ZDT($ZTS,..OutputFileSuffix),", :/()-","__")
		Open dev:"NWS":2
		if $T set useDev=1
	}
	if useDev {
		use dev do {
			set tSC=##super()
			close dev	
		} while 0
	} else {
		set tSC=##super()
	}
	quit tSC
}

}

$ sudo find / -name "CSP.log" 2>/dev/null

Also, if SELinux is enabled it may be blocking access:
$ sestatus
SELinux status: enabled

Validate the current SELinux context (substituting the path found above):

$ ls -Z /opt/webgateway/logs/CSP.log

Correct it if needed (again substituting the path found above):

$ sudo semanage fcontext -a -t httpd_log_t /opt/webgateway/logs/CSP.log
$ sudo restorecon -v /opt/webgateway/logs/CSP.log

Hi Ramesh,

Good questions.

Either statement is correct.

By default MAXLEN is 50 characters.

There is also a TRUNCATE parameter, which is "0" (false) by default.

This means that if the string is more than 50 characters, you will get a validation error when you try to save it.

Setting TRUNCATE to "1" will sliently remove content over the default 50 characters.

No, there is no wasted storage if you don't define a MAXLEN parameter.

In default IRIS storage, all of the simple properties (strings, numbers) of a record are stored in a single list.

This list cannot exceed approximately 32K in length.

So one reason to set MAXLEN on all properties of big tables is to have documented sizes, so you can easily total the maximum size of the underlying storage list and know whether you are approaching 32K before adding more properties.

Then new properties can be mapped to a different subscript (storage list) to avoid the storage limitation, as sketched below.
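
For illustration, a sketch of how the default storage of the example class above could map a later property to its own subscript. The node names and the LaterAddedNote property are hypothetical, and the class compiler normally maintains this definition for you.

Storage Default
{
<Data name="MaxLenDemoDefaultData">
<Value name="1">
<Value>%%CLASSNAME</Value>
</Value>
<Value name="2">
<Value>StrictNote</Value>
</Value>
<Value name="3">
<Value>ClippedNote</Value>
</Value>
</Data>
<Data name="MaxLenDemoDefaultData1">
<Subscript>"1"</Subscript>
<Value name="1">
<Value>LaterAddedNote</Value>
</Value>
</Data>
<DataLocation>^Test.MaxLenDemoD</DataLocation>
<DefaultData>MaxLenDemoDefaultData</DefaultData>
<IdLocation>^Test.MaxLenDemoD</IdLocation>
<IndexLocation>^Test.MaxLenDemoI</IndexLocation>
<StreamLocation>^Test.MaxLenDemoS</StreamLocation>
<Type>%Storage.Persistent</Type>
}

The second <Data> node stores the later property in its own subscript of the data global, leaving the original storage list untouched.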

Hope this helps.

There is a lightweight, no-install mini-version of this for comparing differences in the IRIS platform in your web browser.

It will give you an idea of whether you wish to invest time in this approach: https://alexatwoodhead.github.io/ompare/index.html

I can look at making the reporting side of the main project runnable in a Docker container if that would be an easier experience. In addition to properties, methods and parameters, the main project does drill down into real code diffs of methods, routines and XData.

For production setting comparison I use the SQL projection comparison feature.

Suggest use "Return" keyword instead of "Quit"

For example:

Class Test.ReturnOfTheTry [ Abstract ]
{

ClassMethod GetReturn(state = 0)
{
	try {
		if state=1 return "In Try"
		set x=3/0
	} catch {
		if state=2 return "In Catch"
	}
	quit "At Methods End"
}

}

// Usage example

Write ##class(Test.ReturnOfTheTry).GetReturn(1)
In Try
Write ##class(Test.ReturnOfTheTry).GetReturn(2)
In Catch
Write ##class(Test.ReturnOfTheTry).GetReturn()
At Methods End

Hi Alberto,

I have an alternative approach that uses Class based Unit Tests for DTL.

Essentially it uses pairs of XDATA blocks to hold the input and expected output messages.

It automatically generates the UnitTest compare-by-assertion implementation by first analysing the existing path expressions in your real DTL code.
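
For a flavour of the idea, a minimal sketch of one such pair (the class name, XData names and HL7 content are purely illustrative placeholders, not the actual generated code):

Class Test.DTL.ADTTransform Extends %UnitTest.TestCase
{

/// Illustrative source message for one test pair
XData ADT01Source [ MimeType = text/plain ]
{
MSH|^~\&|SENDER|FAC|RECEIVER|FAC|20220711153600||ADT^A01|0001|P|2.4
PID|1||12345^^^MRN||SMITH^JOHN||19800101|M
}

/// Illustrative expected output for the same test pair
XData ADT01Expected [ MimeType = text/plain ]
{
MSH|^~\&|SENDER|FAC|RECEIVER|FAC|20220711153600||ADT^A01|0001|P|2.5
PID|1||12345^^^MRN||SMITH^JOHN||19800101|M
}

}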

The advantages of this are:

1) It doesn't require external folders to store input and output messages.

2) The test input and output messages are source-controlled with the unit-test class.

3) You can easily ignore comparison of specific paths set in the DTL that you don't want to compare. For example, paths set to today's date or a numeric sequence would always be different, so we can simply comment out those specific generated assertions. ie: There is no need to compare the whole input and output file.

4) There is a utility method that, given a class package, will seek out DTLs and generate unit test classes with tailored "compare assertion" methods.

5) There are assertions for paths being found in the input document. For example, after a recent schema change your existing tests might appear to pass because a target field is empty as expected; this unit test will still pick up that the source field is inaccessible using the current schema.

6) The ordering of assertions in path loops is rearranged to be more efficient to process.

7) It uses a more "row-driven" unit test methodology. ie: An analyst can keep adding pairs of XData input and expected output messages to the same UnitTest class without needing to know how to implement unit tests. I had considered providing a Studio template to facilitate this, or pulling sample messages into a DTL class where sample messages are available on the system in focus.

Limitations: It works for HL7-to-HL7 and HL7-to-Object. It needs further work for Object-to-Object, to auto-generate the compare assertion method.

Note: I also have a class-based UnitTest base for stressing routing rules in an existing production, along similar lines.

Let me know if you are interested. Maybe I can share it with the community.

Kind regards,

Alex