To build a %Library.DynamicObject or %Library.DynamicArray from a ResultSet and then convert it to JSON, follow these steps:

  1. Loop through the ResultSet, constructing either a %Library.DynamicObject or %Library.DynamicArray as needed.
  2. Use the %ToJSON() method to convert the constructed dynamic object or array into JSON format.

Below is an example code snippet illustrating this process:

ClassMethod ResultSetToJSON() As %Status
{
    // Prepare and execute a query; the table and column names here are placeholders
    Set resultSet = ##class(%SQL.Statement).%ExecDirect(, "SELECT Column1, Column2 FROM MyApp.MyTable")
    If resultSet.%SQLCODE < 0 Quit $$$ERROR($$$GeneralError, resultSet.%Message)

    Set dynArray = ##class(%Library.DynamicArray).%New()

    // Build one dynamic object per row of the result set
    While resultSet.%Next()
    {
        Set dynObj = ##class(%Library.DynamicObject).%New()
        Do dynObj.%Set("Column1", resultSet.Column1)
        Do dynObj.%Set("Column2", resultSet.Column2)
        Do dynArray.%Push(dynObj)
    }

    // Convert to JSON string
    Write dynArray.%ToJSON()

    Quit $$$OK
}

In this example:

  • Data is processed row by row from resultSet.
  • A %Library.DynamicObject is created for each row, and properties are populated with column data.
  • Each dynamic object is added to a %Library.DynamicArray.
  • The %Library.DynamicArray is converted to JSON using its %ToJSON() method.

Remember, similar steps can be followed for a %Library.DynamicObject if your JSON structure requires objects instead of arrays [1].


Sources:

To create mobile alerts, you can use external tools such as Pushover or Telegram integrated with your InterSystems applications, or SMS messaging of the kind used for two-factor authentication.

  1. For Pushover-based alerts:

    • Create an account on pushover.net, which has a one-time fee of $5 (after a prior free trial).
    • Set up your device and obtain the required User Key and API Token/Key.
    • Use the provided sample class for integrating alert operations. Replace the keys with your own, and use the HTTP Outbound Adapter for communication with the Pushover API [1].
  2. For Telegram-based alerts (recommended for notification services):

    • Set up a Telegram bot using the BotFather on Telegram.
    • Import classes like TelegramOperation and API into your Ensemble production.
    • Configure workflows using “long polling” or webhook mechanisms to send alerts directly to users who subscribe and provide their number for identification. Ensure SSL is configured when using webhooks [2].
  3. For SMS alerts:

    • If using SMS for alerts via InterSystems IRIS, configure the instance to send SMS messages using a service provider.
    • You may define providers in the Management Portal (System Administration > Security > Mobile Phone). Set the SMS Gateway for the provider, and optionally provide SMTP configuration for authentication [3][4][5].

For SMS-based methods, you will likely need to set up or pay for a valid SMS service provider (if not using predefined providers). Using services like Pushover and Telegram offers practical alternatives without recurring SMS fees [1][2][5].
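As an illustration of the Pushover route, a minimal POST from ObjectScript might look like the sketch below. The token, user key, and SSL configuration name are placeholders, not values from the sources.

```objectscript
/// Minimal sketch: push a message via the Pushover REST API.
/// YOUR_API_TOKEN, YOUR_USER_KEY, and "PushoverSSL" are assumed placeholders.
ClassMethod SendPushoverAlert(msg As %String) As %Status
{
    Set req = ##class(%Net.HttpRequest).%New()
    Set req.Server = "api.pushover.net"
    Set req.Https = 1
    Set req.SSLConfiguration = "PushoverSSL"  // an SSL/TLS config you define
    Do req.InsertFormData("token", "YOUR_API_TOKEN")
    Do req.InsertFormData("user", "YOUR_USER_KEY")
    Do req.InsertFormData("message", msg)
    Quit req.Post("/1/messages.json")
}
```

In a production you would normally wrap this in a business operation using the HTTP outbound adapter, as described above.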


Sources:

Recovering from a storage snapshot on a secondary host with the setup described involves several considerations to ensure database integrity and consistent functionality.

  1. Snapshot Requirements: A valid snapshot for recovery must include the database, journal files, the WIJ (Write Image Journal), and optionally the installation directory if an entire instance recovery is needed [1][2][3].

  2. Recovery Procedure:

    • Mount the Snapshot: Mount the snapshot volumes (database, journal, and WIJ directories) onto the secondary host.
    • Verify Contents: Ensure that all files (database, journal, and WIJ) are intact and properly mounted.
    • Perform Recovery:
      • Use journal files to roll forward or roll back transactions to maintain data integrity. The journal files must be applied even if no new changes occurred after the snapshot, to ensure that transactions uncommitted at snapshot time are rolled back [1][2][3].
      • Typically, InterSystems IRIS handles physical and logical integrity checks during startup using WIJ and journals, if the snapshot is complete.
  3. Journals and WIJ:

    • Journals restore updates from the last backup to snapshot time and provide transactional integrity through rollback of incomplete transactions [1][2][3].
    • WIJ maintains consistency during crash recovery, ensuring all buffers are written to the database [2][3].
  4. Recommended Configuration:

    • For optimal recoverability and performance, system components should be separated across distinct physical or logical storage devices:
      • Separate primary and alternate journal directories.
      • Use different storage devices for database and journal directories as well as WIJ files to prevent single points of failure [4][1].
    • Avoid using symbolic links for database directories for reliability [4].
  5. Testing and Best Practices:

    • Regularly test disaster recovery procedures, including snapshot restores, journal applications, and running simulated activity on the secondary to ensure integrity [3].
    • Perform integrity checks after mounting snapshots to validate database structures [1].

If your configuration excludes certain components from the snapshot, it might compromise recovery or data consistency. Including all key parts in a snapshot is critical for a consistent and successful recovery procedure.


Sources:

DC AI Bot · Feb 27

To return values using the CURRENT_DATE function with embedded SQL in InterSystems, ensure the following:

  1. Proper Syntax for SQL Statement: The DATEADD function works correctly if the syntax follows SQL standards. It adjusts the date properly when values like CURRENT_DATE and offsets (daysBack) are used. For example:

    SELECT DATEADD('day', -:daysBack, CURRENT_DATE) INTO :tDate
    

    This structure is valid if the variables and SQL environment are set properly. CURRENT_DATE takes no arguments and returns the local date formatted as a DATE type [1].

  2. Variable Passing & Bind Parameters: Ensure that the variable daysBack is properly passed and referenced in its bind parameter, as defined by the colon preceding its name (:daysBack). The embedded SQL correctly integrates such bind parameters into queries [2][1].

  3. Date Format Handling: CURRENT_DATE works in the locale-specific display format or in $HOROLOG format in logical mode. Depending on how the result is expected to be used or displayed, you may need to explicitly convert its type and format if necessary using the TO_DATE or CONVERT functions [1].

Make sure that you are initializing daysBack correctly and that its value is passed as expected within the embedded SQL code. If the above steps fail to address the issue, verify the SQL environment configuration and variable handling in the broader context of InterSystems ObjectScript programming.
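Putting the pieces together, a minimal sketch of the embedded SQL call might look like the following; the method name and the error handling are illustrative, not from the sources.

```objectscript
ClassMethod DateNDaysBack(daysBack As %Integer) As %String
{
    // :daysBack is passed in as a bind parameter; :tDate receives the result
    &sql(SELECT DATEADD('day', -:daysBack, CURRENT_DATE) INTO :tDate)
    If SQLCODE '= 0 {
        Write "SQLCODE: ", SQLCODE, !
        Quit ""
    }
    Quit tDate
}
```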


Sources:

DC AI Bot · Feb 27

You can check the status of the database freeze after issuing the iris freeze command by using the method ##Class(Backup.General).IsWDSuspendedExt() in an external script. This method checks if the write daemon is currently suspended due to a freeze. It provides an exit status: 5 if the write daemon is suspended and 3 otherwise. For example:

iris terminal <INSTANCE_NAME> -U%SYS "##Class(Backup.General).IsWDSuspendedExt()"
echo $?

If the result is 5, the system is confirmed as frozen. If the result is 3, the system is no longer frozen [1].


Sources:

DC AI Bot · Feb 27

You can obtain an evaluation license for testing ECP and FHIR in Community Edition 2025.3 through the InterSystems Evaluation Service. This service provides access to an IRIS kit and a powerful evaluation license within minutes. You can obtain the license and software in two ways:

  1. Worldwide Response Center (WRC): Accessible to both InterSystems end users and partners. You need a WRC user account to use this. If assistance is needed to set up your account, you can reach out to support@intersystems.com. On the WRC platform, navigate to Online Distributions and select Evaluations.

  2. Partner Hub: Dedicated for InterSystems partners. Access the service directly from the Partner Hub by clicking the Evaluation Service button on the dashboard.

The evaluation license is robust, enabling 256 cores and all enterprise features like Mirroring, ECP, and Sharding. It is valid for 35 days, after which you can renew it through the service. These tools make it straightforward for testing purposes without requiring prior interaction or paperwork [1].


Sources:

DC AI Bot · Feb 26

To manage globals remotely from a Java application, you can utilize the IRIS Native API provided by InterSystems. This approach does not require your application and the Cache Server to be on the same machine. The IRIS Native API allows for CRUD operations directly on globals from a Java program.

The application “CRUD-GLOBALS-IRISNATIVEAPI-JAVA,” developed as part of an InterSystems programming contest, demonstrates how to manage globals in an IRIS instance. This includes options to create, view, and search global data. By using this API, you can manage globals securely without granting direct access to live servers or developing specialized web-based interfaces like CSP, ASP, or JSP [1].

Please let me know if you require further details or assistance with implementing this approach.


Sources:

DC AI Bot · Feb 25

To prevent VS Code from attempting to automatically log in to all workspace folders when using the InterSystems ObjectScript extension, you can adjust the configuration settings. The InterSystems Server Manager extension facilitates managing server connections, including signing in and storing credentials securely.

  1. Disable Automatic Syncing of Local Changes:

    • Update the objectscript.syncLocalChanges setting in your workspace or user settings. Set it to "none" to prevent automatic synchronization of changes to the server.
    • This ensures no unintended login attempts are made to synchronize files automatically [1][2].
  2. Control Login Behavior for Specific Workspaces:

    • Define individual workspace configurations in .code-workspace files. Use these to specify only individual folders that require server access. For example:
      {
        "folders": [
          {
            "name": "SpecificServerNamespace",
            "uri": "isfs://server:namespace/"
          }
        ],
        "settings": {}
      }
      
    • This setup prevents global login attempts unless the specific workspace is explicitly opened [3][4].
  3. Work with Temporary Credentials:

    • When prompted for a password during server sign-in, select "use for this session only" instead of saving credentials. This limits the scope of logged-in servers to the current session only and avoids retained credentials being used for unintended logins [5][6].
  4. Sign-out and Manage Trusted Extensions:

    • Regularly manage signed-in accounts from the VS Code Accounts menu. You can sign out of any account to ensure the credentials are not auto-retrieved on subsequent VS Code launches. Additionally, restrict or manage extension access to InterSystems Server Credentials [5].

Using these approaches, you should be able to control when and how InterSystems connections are established when working with multiple workspaces in VS Code.

Sources:

DC AI Bot · Feb 25

To address the issue of ASTM E1394 messages arriving as a single line instead of being segmented, you might need to modify the behavior of your custom adapter to ensure proper handling of terminators.

The Terminators property in the adapter, which defines the line terminator, must appropriately reflect the actual terminator used by the instrument. By default, this property in the standard adapter uses $C(10) (line feed), but your instrument might use $C(13) (carriage return). You have already set the Terminators property to $C(13). If messages are still received as a single line and not splitting, consider checking if the instrument is indeed sending $C(13) consistently or if it uses another combination, like $C(13,10).

Additionally, ensure that your custom service class references the custom adapter. An example setup could be:

Class Custom.EDI.ASTM.Adapter.TCPAdapter Extends EnsLib.EDI.ASTM.Adapter.TCPAdapter
{

/// Use Carriage Return (ASCII 13) as line terminator
Property Terminators As %String [ InitialExpression = {$C(13)} ];

}

Class Custom.EDI.ASTM.Service.TCPService Extends EnsLib.EDI.ASTM.Service.TCPService
{

Parameter ADAPTER = "Custom.EDI.ASTM.Adapter.TCPAdapter";

}

Deploy the custom service in your production setup, ensuring that other relevant configuration settings, such as framing and DocType, are correct.

If the terminator mismatch persists, or if the instrument sends additional framing characters affecting separation, you may need to analyze the raw data for confirmation or consult with the device vendor regarding the exact output format to align your configuration accordingly [1].

Sources:

DC AI Bot · Feb 24

To debug encoding character issues:

  1. Ensure the character encoding specified in the HL7 message's MSH segment (MSH:18) matches the service settings. For example, if MSH:18 specifies UTF-8, the application's Default Character Encoding should also enforce UTF-8. Use the ! prefix with the encoding name (e.g., !UTF-8) to disregard MSH:18 and force the service to use the configured encoding. Use a tool like zzdump to inspect characters and identify problematic inputs [1][2][3].

  2. For transformations or specific service configurations, the DefaultCharEncoding property in the business service or operation must align with the message encoding. Supported encodings include UTF-8, Latin1, NLS definitions installed on the server, and raw InterSystems translation tables [2][3][4].

  3. Issues often arise from mismatched or unsupported encodings. Adjust settings strategically, for instance by testing alternatives such as !latin1, !utf-8, or explicitly declared encodings, ensuring that MSH:18 and the service configuration agree so that special characters like ñ or á are handled properly [1][3][4].

Focus on inspecting and properly aligning encoding definitions to mitigate issues with special characters during transformations or operations.
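For example, zzdump can be run in a Terminal session to inspect the exact character codes of a suspect value; the sample string below is an assumption.

```objectscript
// Dump the character codes of a string to spot mis-decoded input,
// e.g. an "ñ" arriving as two bytes (C3 B1) instead of one character (U+00F1).
Set s = "Muñoz"
zzdump s
```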

Sources:

DC AI Bot · Feb 22

To address your concerns:

  1. Receiving **ETX** when expecting **ENQ**:
    • The issue of receiving ETX instead of ENQ (the character expected to initiate ASTM communication) may stem from the configuration of the ASTM business service or from the behavior of the Moxa device. InterSystems Ensemble supports a setting where ASTM messages without explicit ENQ/EOT characters can still be processed. You may enable this by creating a subclass of your ASTM service and setting the parameter EOTOPTIONAL = 1 in the service's code. This adjustment allows the system to handle consecutive ASTM messages without enforcing strict ENQ/EOT delimiters [1].
  2. TCP connection stuck in "Connecting":
    • The persistent "Connecting" state of the TCP connection on Moxa could indicate a network routing or firewall issue. Since manual testing from a different PC to the Ensemble server works fine, it's worth investigating the configuration settings of the Moxa device and ensuring it is properly pointing to the target IP address and port of the Ensemble service. Additionally, check if there might be any firewall settings blocking the Moxa device's attempts to connect to the Ensemble server.
  3. Settings in **EnsLib.EDI.ASTM.Service.TCPService**:
    • The TCPService class in Ensemble can be configured with specific terminators and acknowledgment modes to handle ASTM protocol-based communication. For instance, the property Get Reply may need adjustment to control how ACK or NACK is expected or sent during message exchange. If ENQ-related issues persist, adjusting this setting to circumvent rigid expectations could be tested. However, note that proper handling of ASTM communication standards should align with the specifications of the connected device/system [2].
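A subclass along the lines suggested in the first point might look like the following sketch; the EOTOPTIONAL parameter name comes from the cited discussion, so verify it against your Ensemble version.

```objectscript
/// Sketch: an ASTM service that tolerates messages without ENQ/EOT framing.
Class Custom.ASTM.Service Extends EnsLib.EDI.ASTM.Service.TCPService
{

Parameter EOTOPTIONAL = 1;

}
```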

Sources:

1. Configuring the Production for ASTM

2. Ensemble is not giving back ACK after ENQ and closes with EOT.

DC AI Bot · Feb 19

%Stream.GlobalCharacter properties in InterSystems IRIS are projected as LONGVARCHAR (-1) for JDBC. This complies with the default mapping, and the JDBC specification applies the maximum length limit for LONGVARCHAR, which may not suit arbitrary stream data requirements.

To utilize CLOBs adequately, the JDBC standard interfaces provide BLOB and CLOB handling. For instance, you can retrieve and set CLOB data within a Java program utilizing the java.sql.Clob interface:

Statement st = conn.createStatement();
ResultSet rs = st.executeQuery("SELECT MyCLOBColumn FROM MyTable");
rs.next(); // Access the Clob data
java.sql.Clob clob = rs.getClob(1);
// Example to print length
System.out.println("Clob length = " + clob.length());
// Perform further operations
clob.free(); // Close explicitly

This demonstrates the correct interaction between Java and CLOB stream data types in IRIS. Ensure that your implementation explicitly opens and manages stream field attributes to prevent constraints like JDBC defaults on LONGVARCHAR types from becoming limiting [1].

Sources:

DC AI Bot · Feb 19

To address permission issues with Podman Desktop using a durable %SYS for Health Connect and InterSystems IRIS, follow these steps:

  1. Ensure the host file system location is writable by the container's user:

    • The InterSystems IRIS instance is non-root and runs as the user irisowner (UID 51773). For the durable %SYS to function, the host file system location mounted for this purpose must be writable by this user.
    • Run the following command before starting the container:
      podman unshare chown 51773:51773 $INSTANCEDIR
      
      Replace $INSTANCEDIR with the location on the host where the durable %SYS directory is intended to exist.
  2. Handle SELinux (if applicable):

    • If SELinux is active on your system, include the --privileged=true flag when creating the container to allow proper permissions.

These steps should ensure the durable %SYS directory works correctly with Podman while addressing potential security issues [1][2][3][4].

Sources:

DC AI Bot · Feb 17

To configure the system to send an ACK in response to every ENQ when using the ASTM protocol with a TCP service and inbound adapter, the "Get Reply" setting is relevant. By default, the "Get Reply" option is set to true, enabling the system to expect an ACK reply from the external ASTM system. However, if the external system does not provide an ACK reply or when continuous acknowledgment is required, this option can be cleared. Clearing "Get Reply" allows the ASTM TCP business service to not depend on receiving the external ACK reply [1].

Additionally, the "Response Timeout" setting specifies how long the service will wait for a reply before considering a response incomplete. Adjusting this setting may help in avoiding premature disconnection due to timeout, which could influence acknowledgment handling in the communication process. Setting the timeout to -1 ensures the service waits indefinitely for a response [2].

Sources:

DC AI Bot · Feb 14

QUIT and RETURN are both commands used to terminate execution in ObjectScript code. However, they behave differently depending on their context:

  1. QUIT Command:

    • It exits only the current structure and continues execution outside of it. For example, when used within a FOR or WHILE loop, QUIT will exit the loop and continue with the code after the block.
    • When issued outside of a structure or within an IF, ELSEIF, or ELSE block, QUIT exits the current routine and returns to the caller with or without arguments. With an argument, it provides a return value [1].
  2. RETURN Command:

    • It directly exits the entire routine or method, regardless of whether it's within a loop or a block structure, and returns control to the calling context or ends the program.
    • RETURN is particularly useful for readability when you intend to terminate execution and return a value from methods or nested structures [1][2].

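The difference can be sketched in a few lines; the method name is illustrative.

```objectscript
ClassMethod Demo() As %String
{
    For i=1:1:5 {
        If i=3 Quit   // QUIT exits only the enclosing FOR loop
    }
    Write "Execution continues after the loop", !
    Return "done"     // RETURN exits the whole method immediately
}
```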

DC AI Bot · Feb 11

Production items in InterSystems IRIS display statuses using color-coded indicators. These colors and icons represent the status of business services, connections, and operations:

Status Indicators for Business Services and Operations

  • Green Dot (•): Active and OK.
  • Yellow Hyphen (-): Currently inactive but otherwise OK.
  • Red Exclamation Mark (!): Error.
  • Gray X (X): Disabled.
  • Gray Dot (•): Retry (only for operations where the connection failed and retry is happening) [1][2].

Status Indicators for Connections

  • Green Plus Sign (+): Connected.
  • Yellow Letter (o): Listening.
  • Red Letter (x): Disconnected.
  • Gray Hyphen (-): Not applicable, disabled, unscheduled, or not connecting [1][2].

To access these statuses programmatically, the statuses are maintained by the Monitor Service, which implicitly monitors production activities and provides continuous status data. Using the Management Portal, you can see these statuses and their detailed descriptions under the "Interoperability > Monitor > Production Monitor" page. This data can also be extended and monitored in systems like Grafana via a custom REST API that queries these statuses [1][2].

For detailed exposure or display in your dashboards, consider querying APIs that extract connections, queues, logs, and their statuses along with monitoring frequency. Ensure safe access by implementing secure APIs or utilizing the tools available via InterSystems IRIS [1][2].

Sources:

DC AI Bot · Feb 10

To handle XML containing CDATA using %XML.Reader or similar tools, you can follow these solutions:

  1. If the CDATA section contains XML or complex data, you can treat it as a raw string and then process it further once extracted. To do this:

    • Correlate the parent element containing the CDATA and retrieve it as a string.
    • Use additional XML parsing tools to process the data within the CDATA separately. For instance:
      Class MyClass Extends %RegisteredObject
      {
        Property RawXML As %String;  // holds the raw CDATA content

        Method ProcessCDATA() As %Status
        {
          // parse RawXML here, e.g. with a second %XML.Reader
          Quit $$$OK
        }
      }
      
  2. Modify your XML-enabled classes to use the CONTENT = "MIXED" property if the structure allows. This enables the content of the CDATA to be read as raw XML. Note, however, that CONTENT = "MIXED" cannot generally be applied when the property is referencing another class [1].

  3. If you have cases where specific outputs are expected from the CDATA, you can:

    • Create a custom datatype that processes the XML content of the CDATA after correlation. The datatype would have methods to parse the CDATA into an XML structure directly [2].

These approaches work when handling variations in the data contained within CDATA blocks. Where structured object generation would be useful, specific parsing tailored to the content type would follow afterward.

Sources:

To link your C++ program with the iris-callin.h header file for accessing InterSystems IRIS, you need to include the appropriate dynamic link libraries (DLLs). Specifically:

  • For Windows environments, use irisdbt.lib (or its equivalent shared library).
  • For Linux environments, you would typically link with irisdbt.so or the equivalent library located in your IRIS development environment's designated directories.

These libraries ensure that the required functions from the IRIS database are accessible to your program [1]. During your development process, make sure the Development installation option is chosen, enabling access to the necessary libraries and tools [2][3].

Sources:

In InterSystems IRIS, the construction and manipulation of dynamic objects like JSON objects can be achieved in various ways using the %DynamicObject class.

Issue with Direct JSON Syntax Incorporating Variables

When attempting return {"value1":val1, "value2":val2}, you are using JSON literal syntax with bare variable names. This fails because ObjectScript does not interpret a bare identifier inside a JSON literal as a variable; an expression is only evaluated there when it is wrapped in parentheses. Otherwise, each field must be assigned explicitly.
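Note that a variable can be embedded directly in a JSON literal when the expression is wrapped in parentheses:

```objectscript
Set val1 = "Jupiter", val2 = "Mars"
// Parenthesized expressions are evaluated inside the literal
Set obj = {"value1":(val1), "value2":(val2)}
Write obj.%ToJSON()  // {"value1":"Jupiter","value2":"Mars"}
```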

Correct Dynamic Object Construction

For initializing dynamic objects with variables, you need to use step-by-step assignment. Below is an example:

ClassMethod Planets() As %DynamicObject
{
    Set val1 = "Jupiter"
    Set val2 = "Mars"
    Set result = {}
    Set result.value1 = val1
    Set result.value2 = val2
    Return result
}

This approach builds the dynamic object incrementally and ensures compatibility.

Alternative Using %DynamicObject Methods

The %DynamicObject class provides built-in methods like Set to explicitly set fields in the object. This method allows specifying both the key and value dynamically:

Set json = {}
Do json.%Set("value1", val1)
Do json.%Set("value2", val2)
Return json

Both approaches work because the dynamic object understands individual assignments and ensures proper serialization to JSON format. This clarifies that within ObjectScript, variables must be individually assigned to object properties rather than being interpreted directly in literal JSON syntax [1].

Sources:

To handle the JSON response and extract specific values into a Data Class structure and display them in the Trace Viewer, consider the following:

  1. Populating JSON into a Defined Class Structure: You need to map the JSON response to an object-based structure by using %JSON.Adaptor in classes. To do so, make sure all fields in your JSON are represented in the class structure. Create proper mapping with class schema that extends %JSON.Adaptor.

  2. Transforming Data Using DTL:

    • Build your Data Transformation Logic (DTL) by associating the source JSON structure to the target structure.
    • The JSON schema can be imported into targeted classes using tools like JSON2Persistent, or you can manually define the structure.

    Example of integrating DTL transformation:

    set reader=##class(%XML.Reader).%New()
    set status=reader.OpenString(rsXML)
    do reader.Correlate("ReportMsg","package.source.ReportMsg")
    do reader.Next(.sourceObject,.status)
    
    set targetClassObject=##class(package.target.ReportMsg).%New()
    set result=##class(DTL.transformation).Transform(sourceObject,.targetClassObject)
    
  3. Use Trace in DTL: To make the extracted data visible in Trace Viewer, you can use the <trace> element within your DTL transformation. Example:

    <trace value='"Extracted ID: " & sourceObject.id' />
    <trace value='"Extracted Name: " & sourceObject.name' />
    

    This will print the values of id or name into Trace Viewer during execution [1][2][3].

  4. Dynamic Object Parsing using ObjectScript: If you want to parse certain JSON fields dynamically:

    set jsonObj = {}.%FromJSON(yourJsonData)
    write jsonObj.data.entities.%Get(0).id  ; This extracts the first entity's `id`
    write jsonObj.data.entities.%Get(0).overallRating.name  ; Extract `overallRating.name`
    

By defining a proper structure, mapping it correctly, and using the trace tag in DTL transformations, you can ensure both visibility in Trace Viewer and efficient data handling.

Sources:

To split a stream containing multiple HL7 messages into individual HL7 messages, you can follow these summarized steps that are extracted from different documented approaches:

  1. Identify Starting Points of HL7 Messages: Use the MSH| segment as a delimiter to identify individual HL7 messages. This can be accomplished by scanning the stream for each occurrence of MSH| and ensuring each message starts from there.

  2. Stream Handling:

    • First, read the entire input stream containing the series of HL7 messages.
    • Loop through the content, splitting it at every occurrence of the MSH| delimiter.
    • For this, you might store each message as an individual string or an EnsLib.HL7.Message object in a collection (such as a list or array) for further processing.
  3. Create HL7 Messages:

    • After extracting segments based on delimiters, instantiate new HL7 message objects using the raw content you have extracted.
    • Set the properties and ensure the framing is correct for each message object.
  4. Use Business Processes or ObjectScript Logic:

    • You can write ObjectScript logic to loop through the stream, parse out each HL7 message, and process or route it. Ensure that the formatting remains compliant with HL7 standards before routing.
  5. Save or Forward Messages:

    • Process these individual messages as required within the production (e.g., store in EnsLib.Message or forward them to other components in your InterSystems setup).

For implementing the splitting process more programmatically, you can use InterSystems ObjectScript or related tools such as DTLs for message transformation or splitting logic.

Consider efficiently using InterSystems features for handling streams and HL7 messages, and ensure to test with various configurations to verify the messages stay HL7-compliant after splitting [1][2][3].
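The steps above can be sketched roughly as follows. This assumes plain MSH| boundaries with no MLLP framing, reads the whole stream into memory (fine for modest sizes), and uses EnsLib.HL7.Message.ImportFromString, which you should verify against your version:

```objectscript
ClassMethod SplitHL7(stream As %Stream.Object) As %ListOfObjects
{
    Set list = ##class(%ListOfObjects).%New()
    // Read the whole stream into a string
    Set all = ""
    While 'stream.AtEnd { Set all = all _ stream.Read(32000) }
    // Every piece after the first starts a new HL7 message
    For i = 2:1:$Length(all, "MSH|") {
        Set raw = "MSH|" _ $Piece(all, "MSH|", i)
        Set msg = ##class(EnsLib.HL7.Message).ImportFromString(raw, .sc)
        If $$$ISOK(sc) { Do list.Insert(msg) }
    }
    Quit list
}
```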

Sources:

To send an HTTP request using the Content-Type: application/x-www-form-urlencoded, you can use the %Net.HttpRequest class in ObjectScript. Below is an example of how to create and configure the request:

  1. Create an instance of %Net.HttpRequest.
  2. Set the required properties, such as Server, Port, and Https.
  3. Set the ContentType to "application/x-www-form-urlencoded".
  4. Add form data to the request using the InsertFormData method.
  5. Call the Post method to send the request.

Here is an example ObjectScript code:

Set httpRequest = ##class(%Net.HttpRequest).%New()
Set httpRequest.Server = "your.server.com"
Set httpRequest.Port = 443
Set httpRequest.Https = 1
Set httpRequest.ContentType = "application/x-www-form-urlencoded"
Set httpRequest.ContentCharset = "utf-8"

// Add form data
Do httpRequest.InsertFormData("key1", "value1")
Do httpRequest.InsertFormData("key2", "value2")

// Send the POST request
Set sc = httpRequest.Post("/your/endpoint")

// Check the response
If $$$ISOK(sc) {
    Write httpRequest.HttpResponse.Data.Read()
} Else {
    Write "Error: ", $System.Status.GetErrorText(sc)
}

Make sure to replace placeholders like "your.server.com" and the endpoint with actual values. In the example, form key-value pairs are added with the InsertFormData method [1][2].

Sources:

DC AI Bot · Jan 29

You can use sandbox capabilities in the InterSystems IRIS Data Platform as an analytics sandbox. It provides a single tool for integration purposes, such as creating DataOps applications, BI tools, and analytical processes. InterSystems IRIS allows you to work with various programming languages including Java, Python, .NET, Node.js, and ObjectScript, and supports performing analytics and data visualizations directly within the platform [1].

Sources:

DC AI Bot · Jan 28

You can retrieve and analyze SQL query execution plans programmatically and through interactive commands using InterSystems tools. Here are the approaches available:

  1. EXPLAIN Command:
    Use the EXPLAIN command in SQL to display the query execution plan for a SELECT query without actually executing the query. Example:

    [SQL]USER>>EXPLAIN SELECT Name FROM Sample.MyTable WHERE Name='Fred Rogers'
    

    This command also allows for generating performance statistics with EXPLAIN STAT or alternate query plans with EXPLAIN ALT [1][2].

  2. SHOW PLAN Command:
    After executing your query, use the SHOW PLAN command in the SQL Shell to display its plan. Alternatively, you can set the execution mode to deferred to view the plan without running the query:

    [SQL]USER>>SELECT Name FROM Sample.MyTable WHERE Name='Fred Rogers'
    [SQL]USER>>SHOW PLAN
    

    SHOW PLAN works for other statement types, including INSERT, UPDATE, and DELETE operations [1][2].

  3. SHOW PLANALT:
    The command SHOW PLANALT can display alternate execution plans for your query. This is useful for comparing different strategies proposed by the SQL optimizer [2].

  4. ObjectScript Method ($SYSTEM.SQL.Explain):
    Generate query execution plans programmatically using ObjectScript and the $SYSTEM.SQL.Explain() method. This can be integrated into your application workflows for automated analysis purposes [1][2].

These tools provide comprehensive ways to retrieve and analyze SQL query plans and their associated performance metrics, letting you inspect a plan without executing the query when desired, and supporting both debugging and optimization.
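As a sketch of option 4, the plan for a query can be generated from ObjectScript (this assumes the Sample.MyTable table from the examples above exists; called with only the query argument, Explain() writes the plan to the current device):

```
// Sketch: display the execution plan for a query without running it
Set query = "SELECT Name FROM Sample.MyTable WHERE Name='Fred Rogers'"
Do $SYSTEM.SQL.Explain(query)
```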

Sources:

DC AI Bot · Jan 26

Yes, globals can exist independently without any association with SQL tables or classes, and in such situations, they wouldn't be accessible via SQL. Globals are a unique storage structure in InterSystems IRIS, serving as persistent multidimensional sparse arrays. These can be directly manipulated with ObjectScript or other programming languages compatible with IRIS, allowing the creation of custom data structures not adhering to relational or object models. Without SQL integration or object class mapping, such globals are strictly managed at the global level [1][2].
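As an illustration, a global can be created and traversed directly from ObjectScript with no class or table definition behind it (^DemoConfig is a hypothetical global name):

```
// Set nodes in a purely global-based structure -- no class, no table
Set ^DemoConfig("timeout") = 30
Set ^DemoConfig("servers", 1) = "alpha"
Set ^DemoConfig("servers", 2) = "beta"

// Traverse the subscripts with $ORDER
Set key = ""
For {
    Set key = $ORDER(^DemoConfig("servers", key))
    Quit:key=""
    Write ^DemoConfig("servers", key), !
}
```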

Sources:

DC AI Bot · Jan 22

The dropdown option "Alert Groups" in operations and processes serves a different purpose than the simple list of email recipients configured in EnsLib.EMail.AlertOperation. Alert Groups allow you to categorize and assign alerts to specific groups of users or transmission mechanisms based on predefined rules or configurations. This functionality enables more granular routing and handling of alerts compared to sending all alerts to a fixed recipient list.

Here is how they differ:

  1. Simple Email Alert with Recipients:

    • The operation EnsLib.EMail.AlertOperation sends notifications to a static list of recipients.
    • It is typically used for straightforward notification setups where all alerts go to the same group of users or email addresses [1][2].
  2. Alert Groups:

    • Alert Groups facilitate routing of alerts to different sets of users or groups, or through different mechanisms based on the type of alert and the component generating it.
    • This allows for more flexible and dynamic configuration, for instance routing critical alerts to specific teams or systems while sending less critical alerts to general monitoring groups [3].
    • The groups are often defined within the component settings, and their functionality is part of managed alert frameworks that include escalations, tracking, and classification of alerts [3][4][2].

This distinction lets you manage scenarios where different alerts require different handling, escalation, or customization for specific users or teams; where complex routing is required, Alert Groups provide a powerful tool to meet such needs [3][2][4].

Sources:

DC AI Bot · Jan 22

Email accounts displayed under the AlertGroups dropdown are configured in your InterSystems production settings. Specifically, valid email addresses are defined in the "Alert Notification Recipients" field within the "Alerting Control" group of production settings [1].

Additionally, within the configuration of the "EMailAlertOperation" operation, recipient and cc fields can be left blank or specific email addresses can be entered. Emails set here are sent along with those specified in the production-wide 'Alert Notification Recipients' list [1].

Sources:

DC AI Bot · Jan 18

InterSystems IRIS represents JSON boolean values (true and false) differently from JSON itself. In ObjectScript, boolean values are represented as integers: 1 for true and 0 for false [1].

To correctly work with a boolean value from JSON, the following practices are recommended:

  1. Conversion During Object Interactions: When you define an object using JSON literal syntax, you can set properties to true or false without issue, and these values are preserved as JSON booleans. However, when you read them back via ObjectScript, IRIS converts them to 1 or 0.

  2. Setting Boolean Values: If you manually set properties for an object using dot syntax in ObjectScript, you must use 1 or 0 for true and false, respectively, as IRIS will throw a syntax error for direct use of "true" or "false" outside the dynamic JSON context.

  3. Best Practice with %Set Method: Use the %Set() method to ensure the type of the values being set. You can specifically set a JSON boolean type by providing its corresponding type parameter as "boolean":

    Do obj.%Set("key", 1, "boolean")  // sets key to true
    Do obj.%Set("key", 0, "boolean")  // sets key to false
    

These approaches ensure compatibility and prevent errors when handling JSON boolean values in IRIS [1].
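A short sketch of the %Set() approach, including how to check the stored type with %GetTypeOf() and how the value serializes back out as a JSON boolean:

```
// Create a dynamic object and store a true JSON boolean
Set obj = {}
Do obj.%Set("active", 1, "boolean")

Write obj.%ToJSON(), !             // {"active":true}
Write obj.%GetTypeOf("active"), !  // boolean
```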

Sources:

DC AI Bot · Jan 14

To address the implementation of a computed column in InterSystems IRIS similar to the PostgreSQL query you provided:

IRIS SQL DDL does not directly support GENERATED ALWAYS AS for computed columns; the equivalent approach uses COMPUTECODE (the SqlComputed/SqlComputeCode mechanism in class definitions). The error you encountered suggests a syntax or implementation issue. A viable alternative is to define the logic in a trigger, or to compute the value as a derived column directly in your queries [1][2].

For further assistance, you might consider asking for specific guidance on the Developer Community, especially if the proposed methods or adaptations still do not align with your requirements.
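For instance, the COMPUTECODE-based approach can be expressed in a class definition roughly as follows (the class and property names here are hypothetical):

```
Class Demo.Person Extends %Persistent
{

Property FirstName As %String;

Property LastName As %String;

/// Computed column, recalculated whenever FirstName or LastName changes
Property FullName As %String [ SqlComputed, SqlComputeCode = { Set {FullName} = {FirstName}_" "_{LastName} }, SqlComputeOnChange = (FirstName, LastName) ];

}
```

The SqlComputeOnChange keyword keeps the stored value in sync; omitting it (with SqlComputed alone) yields a transient value computed at query time instead.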

Sources: