go to post Dmitry Maslennikov · Aug 27, 2019 There is one more option, for the root of the sources, so the final settings are "objectscript.export.folder": "src", "objectscript.export.addCategory": true. But it will be /src/CLS, not /src/cls, until the next version.
go to post Dmitry Maslennikov · Aug 27, 2019 There is an option in the settings, "objectscript.export.addCategory": when it's set to true, it will add categories for different file types: uppercase CLS for "cls" files, and RTN for "int", "mac", and "inc". The next version will support a way to have more control over it, where the left side is the file extension and the right side is the category name:
 "objectscript.export.addCategory": {
   "cls": "_cls",
   "mac": "_mac",
   "int": "_int",
   "inc": "_inc"
 }
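For reference, a complete settings.json fragment combining these export options might look like this (a sketch; the exact keys and their behavior depend on the version of the VSCode ObjectScript extension):

```json
{
  // export sources into the ./src folder of the workspace
  "objectscript.export.folder": "src",
  // group exported files into CLS / RTN subfolders by type
  "objectscript.export.addCategory": true
}
```

VSCode settings.json is JSONC, so the comments above are allowed there.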
go to post Dmitry Maslennikov · Aug 12, 2019 Look at this exercise; it may help you see how to achieve it:
 Set id = 1
 Set streamGN = $Name(^IRIS.TempStream)
 Kill @streamGN
 Set @streamGN@(id, 1) = "some binary data chunk 1"
 Set @streamGN@(id, 2) = "some binary data chunk 2"
 Set lastChunk = $Order(@streamGN@(id, ""), -1)
 Set @streamGN@(id) = lastChunk
 Set size = 0
 For chunk=1:1:lastChunk {
   Set size = size + $Length(@streamGN@(id, chunk))
 }
 Set @streamGN@(id, 0) = size
 Set stream = ##class(%Stream.GlobalBinary).%Open($Listbuild(id, , streamGN))
 While 'stream.AtEnd {
   Write !,stream.Read()
 }
 Quit
go to post Dmitry Maslennikov · Aug 8, 2019 You are already using concatenation, which means the leading zero was lost somewhere earlier. You may have some arithmetic operations on your parameters, which would cause those zeros to be lost.
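A quick illustration of how arithmetic, unlike concatenation, drops leading zeros in ObjectScript (the variable name is just for the example):

```objectscript
 Set code = "00123"
 Write code + 0    // arithmetic coerces the string to a number: 123
 Write "ID" _ code // concatenation keeps the string intact: ID00123
```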
go to post Dmitry Maslennikov · Aug 8, 2019 Remember the second parameter in methods like Get and Post, named test, which you can use for debugging purposes. If test is 1, then instead of connecting to a remote machine it will just output to the current device what it would have sent to the web server; if test is 2, it will output the response to the current device after the Get. This can be used to check that it sends what you are expecting.
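For example, with %Net.HttpRequest (the server and path here are placeholders):

```objectscript
 Set req = ##class(%Net.HttpRequest).%New()
 Set req.Server = "www.example.com"
 Do req.Get("/index.html", 1) // test=1: print the outgoing request instead of sending it
 Do req.Get("/index.html", 2) // test=2: send the request, then print the response
```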
go to post Dmitry Maslennikov · Aug 3, 2019 Just install this extension for Docker, and you will be able to bring the compose configuration up and down from the context menu on the file, or from the command palette. And of course, after that, you will be able to set any shortcut for those commands. Another solution would be to use tasks in VSCode. So, you can add a new file or edit .vscode/tasks.json, with content like this:
 {
   // See https://go.microsoft.com/fwlink/?LinkId=733558
   // for the documentation about the tasks.json format
   "version": "2.0.0",
   "tasks": [
     {
       "label": "Compose Build",
       "type": "shell",
       "command": "docker-compose build"
     },
     {
       "label": "Compose Up",
       "type": "shell",
       "command": "docker-compose up"
     }
   ]
 }
 And with the Run Task command from the command palette, it will offer to select which task to run.
go to post Dmitry Maslennikov · Aug 2, 2019 How about the Community Edition on Docker Hub — when should we expect it there?
go to post Dmitry Maslennikov · Aug 1, 2019 $SYSTEM.Util.Decompress() and $SYSTEM.Util.Compress() can help you decompress and compress any data from/to gzip directly from a string.
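A minimal round-trip sketch, assuming these functions gzip and un-gzip a string as the post describes:

```objectscript
 Set original = "some text to be gzipped"
 Set gz = $SYSTEM.Util.Compress(original)
 Set restored = $SYSTEM.Util.Decompress(gz)
 Write restored = original // 1 if the round-trip preserved the data
```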
go to post Dmitry Maslennikov · Aug 1, 2019 That's strange, then. Maybe somebody from InterSystems can say what's wrong.
go to post Dmitry Maslennikov · Aug 1, 2019 How many files did you try to sync? Could you try to sync fewer files?
go to post Dmitry Maslennikov · Aug 1, 2019 As you are getting a <STORE> error, it is related to the process's memory. So, I think you can solve it easily by increasing the maximum memory per process. The default should be 256 MB, but maybe you have too many CSP files. Temporarily increasing the maximum memory per process may solve the issue.
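From a terminal session, the per-process limit can be checked and raised with the $ZSTORAGE special variable (a sketch; the value is in kilobytes, and the change applies only to the current process):

```objectscript
 Write $ZSTORAGE        // current per-process memory limit in KB (262144 = 256 MB default)
 Set $ZSTORAGE = 524288 // temporarily raise this process's limit to 512 MB
```

The system-wide default can also be changed in the Management Portal's memory settings.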
go to post Dmitry Maslennikov · Jul 26, 2019 You can use any newer version of Studio with older versions of Caché. Or you can even use Studio for IRIS, which you can now download and install separately. The other way I would recommend is to migrate to VSCode ObjectScript. You can contact me directly, and I can help you with this process.
go to post Dmitry Maslennikov · Jul 23, 2019 You just have to enable Ensemble in the installer <Namespace Name="${NAMESPACE}" Code="${DBNAME}-CODE" Data="${DBNAME}-DATA" Create="yes" Ensemble="1">
go to post Dmitry Maslennikov · Jul 22, 2019 I think it would be good to add a screenshot like this to show how to configure memory limits in macOS. On Windows it should be quite similar, I think.
go to post Dmitry Maslennikov · Jul 17, 2019 Did you try to set it as is? There is no way to get duplication as you describe. You can check if the same value already exists with the $DATA function: Write $Data(^Data("Athens"))
go to post Dmitry Maslennikov · Jul 8, 2019 You can delete all the .DS_Store entries and repeat the install. This command will delete them; they are not supposed to be there. find /Users/Downloads/cache-2018.1.2.309.0su-macx64/ -name '.DS_Store' -delete There are some reasons why you have them, but it is safe to just delete them.
go to post Dmitry Maslennikov · Jul 4, 2019 Good to hear that you solved it. Unfortunately, the development workflow guide is still on the way, but I'm going to do it this month. First of all, any logging related to this extension goes to Output → ObjectScript, so you may find errors or the compile log there. Actually, it should compile a class and notify about success or errors right after you change and save any class/routine. Import and compile from the explorer is good to use when you have changed many files in the sources folder, by a Source Control System for example. You can also trigger compile from the command palette or with the shortcut Cmd+F7.
go to post Dmitry Maslennikov · Jul 4, 2019 Could you please create a new issue here? I'll try to help solve it. I don't have an AIX system, but I'm not sure whether that could be the problem. If you see something unexpected, you can try looking at the Output for ObjectScript and the main Log; there may be errors there which could help diagnose it.
go to post Dmitry Maslennikov · Jul 1, 2019 I would not agree with using an "in-memory global" for logging instead. It would be easier to have one ClassMethod Log, which would log everything that needs to be logged; it can do it with objects, which would have indexes for faster access in future usage. It can also temporarily switch off journaling entirely, or just suspend the transaction. In any normal application, logging should already be centralized, so it would not add any complexity to the application. But in some cases it is quite difficult to debug issues when you lose some logging because it was rolled back.
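As a sketch of what such a centralized Log classmethod could look like (App.LogEntry and its properties are hypothetical names, not from any real application):

```objectscript
/// Hypothetical centralized logger; App.LogEntry is an assumed
/// persistent class with When, Level and Message properties.
ClassMethod Log(level As %String, message As %String) As %Status
{
    Set entry = ##class(App.LogEntry).%New()
    Set entry.When = $ZDateTime($Horolog, 3)
    Set entry.Level = level
    Set entry.Message = message
    // Saving through a persistent object keeps logging centralized
    // and lets you add indexes later for faster queries
    Quit entry.%Save()
}
```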