You could always use process-private globals instead of local variables if you want to avoid collisions in local variable names.

ClassMethod GetVariables() As %List
{
    // Process-private globals are used for the working storage here so that this
    // method's own variables don't show up in the list it builds.
    Set ^||VariableNameList = ""
    Set ^||VariableName = ""
    // $Order over the indirected variable name walks the local variable names
    // visible in the current scope, in collating order.
    Set ^||VariableName = $Order(@^||VariableName)
    While (^||VariableName '= "") {
        Set ^||VariableNameList = ^||VariableNameList_$ListBuild(^||VariableName)
        Set ^||VariableName = $Order(@^||VariableName)
    }
    Quit ^||VariableNameList
}

This might do some things you don't expect depending on variable scope, though, which may or may not matter for the use case you have in mind.

Class Demo.Variables
{

ClassMethod OuterMethod()
{
    Set x = 5
    Do ..InnerMethod()
}

ClassMethod InnerMethod() [ PublicList = y ]
{
    Set y = 10
    Write $ListToString(..GetVariables())
}

ClassMethod NoPublicListMethod()
{
    Set y = 10
    Write $ListToString(..GetVariables())
}

ClassMethod GetVariables() As %List
{
    Set ^||VariableNameList = ""
    Set ^||VariableName = ""
    Set ^||VariableName = $Order(@^||VariableName)
    While (^||VariableName '= "") {
        Set ^||VariableNameList = ^||VariableNameList_$ListBuild(^||VariableName)
        Set ^||VariableName = $Order(@^||VariableName)
    }
    Quit ^||VariableNameList
}

}

Results:

SAMPLES>kill  set z = 0 d ##class(Demo.Variables).OuterMethod()
y,z
SAMPLES>kill  set z = 0 d ##class(Demo.Variables).NoPublicListMethod()
z

If you specified credentials during installation and don't remember them, you can use this process to get to the terminal prompt:

http://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=...

After starting in emergency access mode, you can run the command:

Do ^SECURITY

And navigate through the prompts to see what users there are and change the password of one so you can log in.

That's odd - something may have gone wrong with your build. (This is an internal issue.)

You can remove/change the "Source Control" class for the namespace in Management Portal:

Go to System Administration > Configuration > Additional Settings > Source Control, select the proper namespace, select "None," and click "Save."
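This can also be done programmatically, something like the following (a sketch; run it in the affected namespace, and verify the method name against the %Studio.SourceControl.Interface class documentation for your version):

Set tSC = ##class(%Studio.SourceControl.Interface).SourceControlClassSet("")
Write $System.Status.GetErrorText(tSC)

Passing an empty string removes the source control class for the namespace, which should be equivalent to selecting "None" in the portal.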

The class you listed fails for me too, but it's because there's no package name, not because of the bracket mismatch. In Atelier, the class name gets an error marker.

If I change it to:

class somepackage.myclass {
//if someVar {
}

Then the file will happily sync and compile.

If adding the package doesn't fix things, it would be helpful to know what Atelier and Caché versions you're using.

Export with File > Export > General > Preferences; check "Keys Preferences" (which only appears if you've customized any preferences)

Import with File > Import > General > Preferences; select file then check "Keys Preferences"

See: http://stackoverflow.com/questions/481073/eclipse-keybindings-settings

It seems that the CSV export from Window > Preferences, General > Keys, Export CSV ... doesn't have a corresponding import feature.

Ah - yes, a number of things log to ^%ISCLOG. It's very important to set ^%ISCLOG = 0 at the end to keep it from continuing to record. The command I mentioned previously is an easy way to make sure that you're only logging for a brief period of time - paste the command into Terminal and hit enter to start logging, then load the page, then hit enter in Terminal to stop logging. Still, there could be lots of other stuff in there even from having it enabled briefly depending on how busy the server is.
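For reference, that start/stop pattern is a one-liner like this (paste it into Terminal and hit enter to start logging; the read x is what makes the second press of enter stop it):

k ^%ISCLOG s ^%ISCLOG = 2 read x s ^%ISCLOG = 0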

It might make sense for you to contact InterSystems' outstanding Support department - someone could work through this with you directly and help you find a solution more quickly.

Here's a sample, using %ToDynamicObject (2016.2+):

Class DC.CustomJSONSample Extends %RegisteredObject
{
Property myProperty As %String [ InitialExpression = "hello" ];

Property other As %String [ InitialExpression = "world" ];

/// Rename myProperty to custom_property_name
Method %ToDynamicObject(target As %Object = "", ignoreUnknown = 0) [ ServerOnly = 1 ]
{
	Do ##super(.target,.ignoreUnknown)
	Do target.$set("custom_property_name",target.myProperty,target.$getTypeOf("myProperty"))
	Do target.$remove("myProperty")
}

ClassMethod Run()
{
	Set tObj = ..%New()
	Write tObj.$toJSON()
}

}

Output:

SAMPLES>d ##class(DC.CustomJSONSample).Run()
{"other":"world","custom_property_name":"hello"}

For other discussions with solutions/examples involving %ToDynamicObject, see:
https://community.intersystems.com/post/json-cache-and-datetime
https://community.intersystems.com/post/create-dynamic-object-object-id

The problem is that REST uses IO redirection itself, and OutputToStr changes the mnemonic routine but doesn't change it back at the end.

For a great example of the general safe approach to cleaning up after IO redirection (restoring to the previous state of everything), see %WriteJSONStreamFromObject in %ZEN.Auxiliary.jsonProvider.

Here's a simple approach that works for me, in this case:

set tOldIORedirected = ##class(%Device).ReDirectIO()
set tOldMnemonic = ##class(%Device).GetMnemonicRoutine()
set tOldIO = $io
try {
	set str=""

	//Redirect IO to the current routine - makes use of the labels defined below
	use $io::("^"_$ZNAME)

	//Enable redirection
	do ##class(%Device).ReDirectIO(1)

	if $isobject(pObj) {
		do $Method(pObj,pMethod,pArgs...)
	} elseif $$$comClassDefined(pObj) {
		do $ClassMethod(pObj,pMethod,pArgs...)
	}
} catch ex {
	set str = ""
}

//Return to original redirection/mnemonic routine settings
if (tOldMnemonic '= "") {
	use tOldIO::("^"_tOldMnemonic)
} else {
	use tOldIO
}
do ##class(%Device).ReDirectIO(tOldIORedirected)

quit str
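For completeness, the redirection labels that use $io::("^"_$ZNAME) points at would look roughly like this if the code lives in a plain routine (a sketch - the label names are the standard redirection entry points, but the bodies, which just accumulate into str, are an assumption about what you want; wtab in particular is only a rough approximation):

wstr(s) Set str=str_s Quit
wchr(c) Set str=str_$char(c) Quit
wnl() Set str=str_$char(13,10) Quit
wff() Set str=str_$char(13,10,13,10) Quit
wtab(n) Set str=str_$justify("",n) Quit
rstr(len,time) Quit ""
rchr(time) Quit ""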

It would be cool if something like this could work instead:

new $io
try {
	set str=""

	//Redirect IO to the current routine - makes use of the labels defined below
	use $io::("^"_$ZNAME)

	//Enable redirection
	do ##class(%Device).ReDirectIO(1)

	if $isobject(pObj) {
		do $Method(pObj,pMethod,pArgs...)
	} elseif $$$comClassDefined(pObj) {
		do $ClassMethod(pObj,pMethod,pArgs...)
	}
} catch ex {
	set str = ""
}

quit str

But $io can't be new'd.

You could map the package containing the class related to that table using a package mapping, and the globals containing the table's data using global mappings.

You can see which globals the class uses in its storage definition - since the entire package is mapped, it might make sense to add a global mapping with the package name and a wildcard (*).

After taking those steps, you can insert to the table the way you usually would, without any special syntax or using zn/set $namespace.
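If you'd rather script those mappings than click through the portal, the Config classes can create them; here's a sketch (run from the %SYS namespace - the namespace, package, global, and database names below are all placeholders to adjust):

Set tProps("Database") = "SOURCEDB"
Set tSC = ##class(Config.MapPackages).Create("TARGETNS","My.Package",.tProps)
Write $System.Status.GetErrorText(tSC)
Set tSC = ##class(Config.MapGlobals).Create("TARGETNS","My.Package*",.tProps)
Write $System.Status.GetErrorText(tSC)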

There was a similar question and answer at https://community.intersystems.com/post/how-include-dynaform-custom-task-form-ensemble-workflow that might be helpful.

In short, the simplest solution (and possibly the only one) would be to put the Zen page in an <iframe> in a CSP page that EnsLib.Workflow.TaskRequest.%FormTemplate points to.
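To make that concrete, the wiring when creating the task might look something like this (the wrapper page name and form value key are placeholders, and I'm assuming the usual EnsLib.Workflow.TaskRequest properties):

Set tRequest = ##class(EnsLib.Workflow.TaskRequest).%New()
Set tRequest.%Subject = "Review order"
// Points at the plain CSP page that just wraps the Zen page in an <iframe>
Set tRequest.%FormTemplate = "MyApp.TaskFormWrapper.csp"
// Anything the Zen page needs can be passed through %FormValues and then on its URL
Do tRequest.%FormValues.SetAt(tOrderId,"OrderId")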

I really would recommend creating a %All namespace (if there isn't already one), via %Installer or something that works similarly.

One project on the intersystems-ru GitHub, Caché Web Terminal, has the same requirement (use from any namespace); this class might be helpful for reference: https://github.com/intersystems-ru/webterminal/blob/master/export/WebTerminal/Installer.xml. It doesn't actually use %Installer, so the configuration changes are implemented in COS rather than generated from XML, but it works similarly.

Particularly, see methods CreateAllNamespace and Map/UnMap. You should be able to adapt these without too much effort. If your code coverage project eventually has a UI, then the web application setup method will be useful too (for simple installation).
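If you do go the %Installer route, a minimal manifest for the %All namespace might look roughly like this (the package, global, and database names are placeholders, and the mapping details should be checked against the %Installer documentation):

Class MyApp.Installer
{

XData Install [ XMLNamespace = INSTALLER ]
{
<Manifest>
  <Namespace Name="%All" Create="yes" Code="USER" Data="USER">
    <Configuration>
      <ClassMapping Package="MyCoverage" From="USER"/>
      <GlobalMapping Global="MyCoverage.*" From="USER"/>
    </Configuration>
  </Namespace>
</Manifest>
}

ClassMethod setup(ByRef pVars, pLogLevel As %Integer = 3, pInstaller As %Installer.Installer) As %Status [ CodeMode = objectgenerator, Internal ]
{
    Quit ##class(%Installer.Manifest).%Generate(%compiledclass, %code, "Install")
}

}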

Sure - although it'd be a property, not a parameter. Looking at utcov.ClassLookup (glad to see it's not %utcov now, by the way), this should work fine for you. Here's a sample:

Class Sample.ClassQueryProperty Extends %RegisteredObject
{

Property SubclassQuery As %SQL.Statement [ InitialExpression = {##class(%SQL.Statement).%New()}, Private, ReadOnly ];

Method %OnNew() As %Status [ Private, ServerOnly = 1 ]
{
	Quit ..SubclassQuery.%PrepareClassQuery("%Dictionary.ClassDefinition","SubclassOf")
}

Method Demo()
{
	Set tRes = ..SubclassQuery.%Execute("%UnitTest.TestCase")
	While tRes.%Next(.tSC) {
		$$$ThrowOnError(tSC)
		Write tRes.%Get("Name"),!
	}
	$$$ThrowOnError(tSC)
}

}

Then:

SAMPLES>d ##class(Sample.ClassQueryProperty).%New().Demo()
%UnitTest.IKnowRegression
%UnitTest.PMMLRegression
%UnitTest.SQLDataRegression
%UnitTest.SQLRegression
%UnitTest.SetBuilder
%UnitTest.TSQL
%UnitTest.TestCacheScript
%UnitTest.TestProduction
%UnitTest.TestScript
%UnitTest.TestSqlScript
Wasabi.Logic.Test.InventoryTest
Wasabi.Logic.Test.PricingTest

The SVG diagram is loaded in Eclipse's internal browser, which will always be IE for you. The preference you found applies to "external" browsers.

Within the internal browser in Eclipse, you can right click and select "view source." When you do so, you should see something like this near the top:

<meta http-equiv="X-UA-Compatible" content="IE=9" />

It would be interesting to know what <meta> tag you see, if any. It would also be useful to know the value of the User-Agent header sent by the internal browser. There are several ways to find that; here's one quick option:

  1. Open a BPL class in Atelier
  2. Run the following code in Terminal:
    k ^%ISCLOG s ^%ISCLOG = 2 read x s ^%ISCLOG = 0
  3. In Atelier, right click in the BPL class and click the "Open diagram editor" popup menu item
  4. Hit enter in Terminal to stop logging.

If you then zwrite ^%ISCLOG you should see the user-agent in a $listbuild list near the end of the output. I see:

^%ISCLOG("Data",180,0)=$lb(900,,0,5532241409,"0²!t"_$c(28,16)_"IÎ"_$c(22)_"F40"_$c(133)_"¯4_ài"_$c(156)_"èB_9}%"_$c(144,155,9)_"!`"_$c(135)_"ü",2,"ENSDEMO","001000010000OoTvE12bLJWATFMLUAodU0gK1Z8HvjdbJWLK3M",,0,"en-us","OoTvE12bLJ",2,1,"/csp/ensdemo/",$lb("UnknownUser","%All","%All",64,-559038737),"","","","2016-04-22 13:28:27","2016-04-22 13:28:30","","Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; Win64; x64; Trident/6.0)","","",0,"",$lb($lb("%ZEN.SessionEvents","ENSDEMO",)),"","%iscmgtportal:5ykW4kOfOzwr7O8gcok8XQ--",0,"","","","","")

(It's awesome how IE says it's Mozilla, for compatibility reasons.)

You're really close; the key is using the stream's OID (from %Oid()). Here's a simple example; you can substitute any appropriate file path.

Class Demo.DynamicImage Extends %ZEN.Component.page
{

/// This XML block defines the contents of this page.
XData Contents [ XMLNamespace = "http://www.intersystems.com/zen" ]
{
<page xmlns="http://www.intersystems.com/zen" title="">
<image id="myImage" src="" />
<button onclick="zenPage.ChangeImage(zen('myImage'))" caption="Dynamically Change Image" />
</page>
}

ClassMethod ChangeImage(pImage As %ZEN.Component.image) [ ZenMethod ]
{
    Set tStream = ##class(%Stream.FileBinary).%New()
    Do tStream.LinkToFile(##class(%File).ManagerDirectory()_"..\CSP\broker\images\einstein.jpg")
    Set tOID = ..Encrypt(tStream.%Oid())
    Set pImage.src = "%25CSP.StreamServer.cls?STREAMOID="_tOID
}

}

I'm really curious what that image is doing in /csp/broker/...

I think this was a caution for anyone changing their username, since it's shared across InterSystems' sites/applications.

IIRC you use CCR (Change Control Record). The username change may prevent you from using the version control integration in that application. It might be good to ensure that it's still working, or at least to make a note that if it doesn't work, you'll need to change the username back (and then probably log out and back in again for the change to take effect in CCR).

Others may not be impacted as much.

Here's some code from the application I'm working on that might help. The "load/delete the test classes" behavior was annoying enough that we decided to always have the classes loaded on development/testing systems.

First, I think it's useful to have a Run() method in each unit test class, or in a subclass of %UnitTest.TestCase that your unit tests will extend. This code could live somewhere else too, but it's useful to be able to say:

do ##class(my.test.class).Run()

and not have to remember/type the test suite format and /nodelete. Sample implementation:

Class Tools.UnitTest.TestCase Extends %UnitTest.TestCase
{

/// Runs the test methods in this unit test class.
ClassMethod Run(ByRef pUTManager As %UnitTest.Manager = "", pBreakOnError As %Boolean = 0)
{
    If '$IsObject(pUTManager) {
        Set pUTManager = ##class(%UnitTest.Manager).%New() //Or Tools.UnitTest.Manager if you have that
        Set pUTManager.Debug = pBreakOnError
        Set pUTManager.Display = "log,error"
    }
    Set tTestSuite = $Piece($classname(),".",1,*-1)
    Set qspec = "/noload/nodelete"
    Set tSC = $$$qualifierParseAlterDefault("UnitTest","/keepsource",.qspec,.qstruct)
    Do pUTManager.RunOneTestSuite("",$Replace(tTestSuite,".","/"),tTestSuite_":"_$classname(),.qstruct)
}

}

This allows you to specify an instance of a %UnitTest.Manager to capture the test results in, which is useful if you're running a bunch of specific unit test classes (like you suggested, from a Studio project). My team organizes tests in packages rather than in projects, which makes more sense for us.
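For example, running a few specific test classes against one shared manager might look like this (the test class names are placeholders):

Set tManager = ##class(%UnitTest.Manager).%New()
Set tManager.Display = "log,error"
Set tStart = $zh
Do ##class(MyApp.Test.Widgets).Run(.tManager)
Do ##class(MyApp.Test.Orders).Run(.tManager)
// Save one combined result for all of the above and show where to view it
Do tManager.SaveResult($zh-tStart)
Do tManager.PrintURL()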

Next up, here's our %UnitTest.Manager subclass that works with the %UnitTest.TestCase subclass shown above, allowing all the classes in a particular namespace or package (or, really, with class names that contain a particular string) to be run without deleting them afterward:

Class Tools.UnitTest.Manager Extends %UnitTest.Manager
{

/// Runs all unit tests (assuming that they're already loaded)
/// May filter by package or output to a log file rather than terminal
ClassMethod RunAllTests(pPackage As %String = "", pLogFile As %String = "") As %Status
{
    Set tSuccess = 1
    Try {
        Set tLogFileOpen = 0
        Set tOldIO = $io
        If (pLogFile '= "") {
            Open pLogFile:"WNS":10
            Set tLogFileOpen = 1
            Use pLogFile
        }
        
        Write "*** Unit tests starting at ",$zdt($h,3)," ***",!
    
        Set tBegin = $zh
    
        Set tUnitTestManager = ..%New()
        Set tUnitTestManager.Display = "log,error"
        Set tStmt = ##class(%SQL.Statement).%New()
        $$$THROWONERROR(tSC,tStmt.%PrepareClassQuery("%Dictionary.ClassDefinition","SubclassOf"))
        Set tRes = tStmt.%Execute("Tools.UnitTest.TestCase")
        While tRes.%Next(.tSC) {
            If $$$ISERR(tSC) $$$ThrowStatus(tSC)
            Continue:(pPackage'="")&&(tRes.%Get("Name") '[ pPackage)
            Do $classmethod(tRes.%Get("Name"),"Run",.tUnitTestManager)
        }
    
        If $IsObject(tUnitTestManager) {
            Do tUnitTestManager.SaveResult($zh-tBegin)
            Do tUnitTestManager.PrintURL()
    
            &sql(select sum(case when c.Status = 0 then 1 else 0 end) as failed,
                        sum(case when c.Status = 1 then 1 else 0 end) as passed,
                        sum(case when c.Status = 2 then 1 else 0 end) as skipped
                        into :tFailed, :tPassed, :tSkipped
                   from %UnitTest_Result.TestSuite s
                   join %UnitTest_Result.TestCase c
                     on s.Id = c.TestSuite
                  where s.TestInstance = :tUnitTestManager.LogIndex)

            If (tFailed '= 0) {
                Set tSuccess = 0
            }
        } Else {
            Write "No unit tests found matching package: ",pPackage,!
        }
    } Catch anyException {
        Set tSuccess = 0
        Write anyException.DisplayString(),!
    }
    Write !,!,"Test cases: ",tPassed," passed, ",tSkipped," skipped, ",tFailed," failed",!
    If 'tSuccess {
        Write !,"ERROR(S) OCCURRED."
    }
    Use tOldIO
    Close:tLogFileOpen pLogFile
    Quit $Select(tSuccess:1,1:$$$ERROR($$$GeneralError,"One or more errors occurred in unit tests."))
}

}
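With both classes in place, a typical invocation would be something like (the package name and log path are placeholders):

Do ##class(Tools.UnitTest.Manager).RunAllTests("MyApp.Test","C:\Temp\unittests.log")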

This could probably be tweaked to use a project instead without too much work, but I think packages are a more reasonable way of organizing unit tests.

There are good options for what you want available in 2016.2, and possibly better answers for SQL -> JSON after that.

In 2016.2, %RegisteredObject also supports $toJSON and $fromJSON, so there won't be any need to use %ZEN.Auxiliary.jsonProvider to do that conversion. Under the hood, the path is really RegisteredObject -> Dynamic Object (via $compose) -> JSON, and JSON -> Dynamic Object -> RegisteredObject (via $compose).

Therefore, the behavior of $toJSON and $fromJSON can be modified for %RegisteredObject subclasses by overriding (typically) %ToDynamicObject and %FromObject. Here's an example that might serve as a useful starting point for Object -> JSON/JSON -> Object on 2016.2+:

Class DCDemo.JSONDateTime Extends (%Persistent, %Populate)
{

Property Name As %String;

Property DateField As %Date;

Property "Time_Stamp_Field" As %TimeStamp;

Property TimeField As %Time;

ClassMethod Run()
{
    Do ..%KillExtent()
    Do ..Populate(1)
    
    Set tObj = ..%OpenId(1)
    Write "Object ID 1",!
    zw tObj
    Write !
    
    Set tJSON = tObj.$toJSON()
    Write "JSON for that object:",!
    Write tJSON,!,!
    
    Set tObj2 = ..$fromJSON(tJSON)
    Write "Object from that JSON:",!
    zw tObj2
    Write !
}

Method %ToDynamicObject(target As %Object = "", ignoreUnknown = 0) [ ServerOnly = 1 ]
{
    Set tObj = ##super(target,ignoreUnknown)
    Do ..DateTimeToISO8601(tObj)
    Quit tObj
}

ClassMethod %FromObject(source = "", target = "", laxMode As %Integer = 1) As %RegisteredObject [ ServerOnly = 1 ]
{
    Set tObj = ##super(source,target,laxMode)
    If source.%IsA("%Library.AbstractObject") {
        Do ..ISO8601ToDateTime(tObj)
    }
    Quit tObj
}

ClassMethod DateTimeToISO8601(pObj As %Library.AbstractObject) [ CodeMode = objectgenerator ]
{
    #dim tProp As %Dictionary.CompiledProperty
    Set tKey = ""
    For {
        Set tProp = %compiledclass.Properties.GetNext(.tKey)
        Quit:tKey=""
        
        If (tProp.Type '= "") && 'tProp.ReadOnly && 'tProp.Calculated {
            Set tType = tProp.Type
            Set tExpr = ""
            If $ClassMethod(tType,"%Extends","%Library.Date") {
                Set tExpr = "Set %arg = $zd(%arg,3)"
            } ElseIf $ClassMethod(tType,"%Extends","%Library.Time") {
                Set tExpr = "Set %arg = $zt(%arg,1)"
            } ElseIf $ClassMethod(tType,"%Extends","%Library.TimeStamp") {
                Set tExpr = "Set %arg = $Case(%arg,"""":"""",:$Replace(%arg,"" "",""T"")_""Z"")"
            }
            Do:tExpr'="" %code.WriteLine($c(9)_$Replace(tExpr,"%arg","pObj."_$$$QN(tProp.Name)))
        }
    }
}

ClassMethod ISO8601ToDateTime(pObj As DCDemo.JSONDateTime) [ CodeMode = objectgenerator ]
{
    #dim tProp As %Dictionary.CompiledProperty
    Set tKey = ""
    For {
        Set tProp = %compiledclass.Properties.GetNext(.tKey)
        Quit:tKey=""
        
        If (tProp.Type '= "") && 'tProp.ReadOnly && 'tProp.Calculated {
            Set tType = tProp.Type
            Set tExpr = ""
            If $ClassMethod(tType,"%Extends","%Library.Date") {
                Set tExpr = "Set %arg = $zdh(%arg,3)"
            } ElseIf $ClassMethod(tType,"%Extends","%Library.Time") {
                Set tExpr = "Set %arg = $zth(%arg,1)"
            } ElseIf $ClassMethod(tType,"%Extends","%Library.TimeStamp") {
                Set tExpr = "Set %arg = $Extract($Replace(%arg,""T"","" ""),1,*-1)"
            }
            Do:tExpr'="" %code.WriteLine($c(9)_$Replace(tExpr,"%arg","pObj."_$$$QN(tProp.Name)))
        }
    }
}

}

The output of this is:

USER>d ##class(DCDemo.JSONDateTime).Run()
Object ID 1
tObj=<OBJECT REFERENCE>[1@DCDemo.JSONDateTime]
+----------------- general information ---------------
|      oref value: 1
|      class name: DCDemo.JSONDateTime
|           %%OID: $lb("1","DCDemo.JSONDateTime")
| reference count: 2
+----------------- attribute values ------------------
|       %Concurrency = 1  <Set>
|          DateField = 40424
|               Name = "North,Richard G."
|          TimeField = 74813
|   Time_Stamp_Field = "1963-11-18 01:49:29"
+-----------------------------------------------------
 
JSON for that object:
{"$CLASSNAME":"DCDemo.JSONDateTime","$REFERENCE":"1","DateField":"1951-09-05","Name":"North,Richard G.","TimeField":"20:46:53","Time_Stamp_Field":"1963-11-18T01:49:29Z"}
 
Object from that JSON:
tObj2=<OBJECT REFERENCE>[4@DCDemo.JSONDateTime]
+----------------- general information ---------------
|      oref value: 4
|      class name: DCDemo.JSONDateTime
| reference count: 2
+----------------- attribute values ------------------
|       %Concurrency = 1  <Set>
|          DateField = 40424
|               Name = "North,Richard G."
|          TimeField = 74813
|   Time_Stamp_Field = "1963-11-18 01:49:29"
+-----------------------------------------------------

The matter of SQL -> JSON is a bit more complicated. Date/time values in ODBC select mode are close to ISO 8601, but not quite (the timestamp format is different). One option would be to create a class (extending %RegisteredObject) to represent a query result with date/time fields in ISO 8601 format, and to override the same methods in it so that:

  • It can be $compose'd from a %SQL.IResultSet (done in %FromObject)
  • Based on query column metadata, dates/times/timestamps are converted to the correct format when the object is represented as a %Object/%Array or, indirectly, in JSON (done in %ToDynamicObject / %ToDynamicArray).

This could probably be done in 2016.2, but might be less work to accomplish in a future version when SQL result sets support $fromJSON/$toJSON. (I think this plan was mentioned in a different post.)
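In the meantime, a rough manual sketch of the idea against the sample class above (this is not the metadata-driven version described in the bullets; the query and the string tweaks are hard-coded assumptions):

ClassMethod QueryToJSON() As %String
{
    Set tArray = []
    Set tStatement = ##class(%SQL.Statement).%New()
    Set tStatement.%SelectMode = 1 // ODBC select mode
    $$$ThrowOnError(tStatement.%Prepare("SELECT Name, DateField, Time_Stamp_Field FROM DCDemo.JSONDateTime"))
    Set tRS = tStatement.%Execute()
    While tRS.%Next(.tSC) {
        $$$ThrowOnError(tSC)
        Set tRow = {}
        Do tRow.$set("Name",tRS.%Get("Name"))
        // ODBC-format dates are already yyyy-mm-dd; timestamps still need the " " -> "T" tweak
        Do tRow.$set("DateField",tRS.%Get("DateField"))
        Do tRow.$set("Time_Stamp_Field",$Replace(tRS.%Get("Time_Stamp_Field")," ","T"))
        Do tArray.$push(tRow)
    }
    Quit tArray.$toJSON()
}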

I suppose there are some possible complications with all this, depending on whether times/timestamps in your application are actually local or UTC. (Or worse, a mix...)

This is a really good point.

At some level, this is part of the behavior of %Studio.SourceControl.ISC, the studio extension class for source control using Perforce. Studio doesn't automatically recompile the class and dependent classes after checkout either. This has bitten me before - I've undone a checkout, but forgotten to recompile, leaving the old compiled version in effect. It might be reasonable for %Studio.SourceControl.ISC to have an option to automatically compile edited items after undo of a checkout, or even to just do that all the time.

Also, Atelier actually does have a separate "compile" option, in the toolbar at the top. (The icon has a file with "010" on it.)

This is an important feature; in addition to the case you noted, there are several situations I can think of offhand where a class would need to be recompiled even though it hasn't changed:

  • The behavior of a macro defined in a .inc file changes. Classes that use that macro must be recompiled to get the new behavior (see the small sketch after this list).
  • A method in Class A is called from a [ CodeMode = objectgenerator ] method in Class B. If the implementation of the method in Class A changes, Class B may need to be recompiled. (It won't be recompiled automatically.)
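For instance, a minimal sketch of the macro case: suppose a made-up include file MyApp.Macros.inc defines #define AppVersion "1.0". A class like the one below keeps returning the old value after the .inc changes until the class itself is recompiled:

Include MyApp.Macros

Class MyApp.VersionInfo
{

/// The macro expands at compile time, so a change to its definition
/// only shows up here after this class is recompiled.
ClassMethod Version() As %String
{
    Quit $$$AppVersion
}

}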

One downside to automatically compiling impacted/dependent classes is that it can take a while - if a minor change impacts hundreds of classes, it might be reasonable to save the class and compile it in separate actions. There's a preference in Atelier (Preferences -> Atelier -> Save Settings, "server save action") to not compile files automatically when they're saved to the server. Atelier is much better about this than Studio, though; when compiling hundreds of dependent classes, Studio tends to freeze up. With Atelier there's the possibility of a timeout, but the editor should remain responsive while the compilation is happening.

In short: you could put a Zen page with a dynaForm in an <iframe>, or use something other than Zen/dynaForm.

The documentation about custom workflow task forms says that the form should be a fragment of HTML in a CSP page, not an entire page. Although Zen pages are CSP pages, it looks like Zen pages can't be used directly as the form template. Under the hood, the inclusion of this CSP page bypasses %OnPreHTTP, which does some necessary setup for Zen pages (particularly, initializing %page and %application). Even if this wasn't the case, and a full Zen page could be inserted, it would end up looking pretty weird.

A fairly simple solution would be to create a very simple CSP page that has an <iframe> containing your Zen page, and to use that CSP page as the form template. Any necessary data from %task could be passed along in the Zen page's URL. The onAction method could also be propagated to the iframe, perhaps using Window.postMessage (etc.) to define how the frames can interact.

If that's getting too complicated, perhaps consider using something other than Zen/dynaForm that would fit more naturally in a CSP page. (Perhaps modern JS libraries, REST, etc.)