This is definitely not JSON.
Obviously, response.Data does not contain valid JSON. You can simply check the received data by stashing it in a temporary global, something like this:
do request.HttpResponse.Data.Rewind()
set ^temp.debug($j,"size")=request.HttpResponse.Data.Size
set ^("data")=request.HttpResponse.Data.Read(request.HttpResponse.Data.Size) // or just the first 1000 bytes
zw ^temp.debug
Now you can take a look at the incoming data; maybe there is an encoding problem, or the data does not adhere to the JSON specification.
According to your code, the variable obx5 contains the base64-encoded TIFF image. There is one thing I do not understand: what are those "\.br\" character sequences, and how did they get into the base64 stream?
Anyway, I suppose those "\.br\"s are OK, so put all the pieces together and decode them all at once:
set input = ""
for i=1:1:$L(obx5,"\.br\") { set input = input _ $P(obx5,"\.br\",i) }
Do obj.Write($system.Encryption.Base64Decode(input))
Now you should have a correctly decoded image, provided the obx5 variable really contains the original base64-encoded TIFF image with randomly inserted "\.br\" sequences (for whatever reason).
Are you sure that for each and every $P(obx5,"\.br\",i) the equation $L($P(obx5,"\.br\",i))#4=0 holds?
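A quick sanity check (a sketch; obx5 is assumed to already hold the delimited base64 data) could loop over the pieces and report which ones violate the multiple-of-four rule:

```objectscript
// hypothetical check: print the length of each piece between "\.br\"
// delimiters and whether it is a multiple of 4 (a requirement for
// padding-free concatenation of base64 fragments)
for i=1:1:$length(obx5,"\.br\") {
    set piece = $piece(obx5,"\.br\",i)
    write i,?5,$length(piece),?12,$select($length(piece)#4=0:"OK",1:"NOT a multiple of 4"),!
}
```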
Ah, the line breaks, good point, thank you!
Good Morning Vietnam... ach, I meant Good Morning Julius!
After 1977 (the year I first met MUMPS), now is the time to learn M the right way, and entirely!
OK, the truth is, I have never used either the call or the expression code mode, hence there was no need to check how parameter passing works...
Nice solution, but just one question: how does your S routine get the parameter <t>?
Is there some trick, I don't know? I would have written this way
ClassMethod ToNato(t) [ CodeMode = call ]
{
^S(t)
}
but then that makes 5 chars
Oh, believe me, I can top even myself
include macrodefs
ClassMethod ToNato(t = "If, you can read?" ) [ CodeMode = expression ]
{
$$$S
}
ClassMethod s(t)
{
// whatever solution you have, put it there.
}
macrodefs.inc
#define S ..s(t)
Are four chars short enough?
The possibilities to get a corrupted file are:
- you do not read the (Base64) encoded file in chunks of N*4 (where N is an integer)
- your stream wasn't rewound before you started reading
- your (incoming) stream is (already) corrupted (but in some cases this would trigger an error in the Base64Decode() method).
Just for a test, try this:
// str is your incoming Base64 stream
set filename = "test.tiff"
open filename:"nwu":1
if '$test write "Can't open",! quit
do str.Rewind()
use filename
while 'str.AtEnd { write $system.Encryption.Base64Decode(str.Read(32000)) } // 32000 = 4 * 8000
close filename
If the incoming stream is not corrupted, the newly created TIFF file should be readable.
I bet this one, with only 6 chars, is shorter:
ClassMethod ToNato(t = "If, you can read?" ) [ CodeMode = expression ]
{
..s(t)
}
ClassMethod s(t)
{
// whatever solution you have, put it there.
}
I have no idea what the date format of PID 7.1 is, but I'm sure you can convert this date to $h format, so the answer to your question is
set age = $h - $zdh(PID7.1Date,?) \ 365.25
now <age> contains the patient's age in full years
Can you please give us an example for each of those "variantes" (I mean, those JSON strings)?
Something like:
{"sent":"2021-06-10 09:00:00", "received":"2021-06-10 09:05:00", variante1... }
{"sent":"2021-06-10 09:00:00", "received":"2021-06-10 09:05:00", variante2... }
Thank you.
If it helped you to understand how things work, then everything is OK. Have a nice day.
Somehow I don't quite get you. To save obj.%Size() in a variable, just do a simple assignment:
set myVariable = obj.%Size()
but I'm pretty sure this was not your intended question.
I suppose you have JSON-formatted data (a string or a stream) and you want to store that data in a table. Am I right?
If yes, then follow the next steps:
1) create a class which describes your JSON objects (strings)
Class DC.SehindeRaji Extends (%Persistent, %JSON.Adaptor)
{
Property byr As %String(%JSONFIELDNAME = "byr:");
Property iyr As %String(%JSONFIELDNAME = "iyr:");
Property eyr As %String(%JSONFIELDNAME = "eyr:");
// do the same for all other fields
ClassMethod Import(data)
{
set obj=..%New() // create a new DC.SehindeRaji object
set sts=obj.%JSONImport(data,"") // import the (JSON) data
if sts {
set sts = obj.%Save()
if sts {
write "Saved, ID=",obj.%Id(),!
quit 1
} else {
write "Not saved, Err=",$system.Status.GetOneErrorText(sts),!
quit 0
}
} else {
write "Can't import: ",$system.Status.GetOneErrorText(sts),!
quit 0
}
}
}
2) You can create some test data (interactively) in a terminal session
set dynObj = {"byr:":"1937", "iyr:":"2017", "eyr:":"2020"}
set data = dynObj.%ToJSON()
or get your data somehow from an input (possibly from a file); the only important thing is that your data should look like this
write data --> {"byr:":"1937","iyr:":"2017","eyr:":"2020"}
3) import those data
write ##class(DC.SehindeRaji).Import(data) --> Saved, ID=1
4) Now open the saved data and check the result
set oref = ##class(DC.SehindeRaji).%OpenId(1)
write oref.byr --> 1937
write oref.iyr --> 2017
write oref.%JSONExportToString(.exported,"") --> 1
write exported --> {"byr:":"1937","iyr:":"2017","eyr:":"2020"}
zw ^DC.SehindeRajiD
^DC.SehindeRajiD=1
^DC.SehindeRajiD(1)=$lb("","1937","2017","2020")
I hope this is what you want to do...
The facts:
1) According to the error message: "The system cannot find the file specified."
2) Furthermore, the error message shows slashes and backslashes; mixing them is rarely good. Windows uses "\", Unix "/".
What to do is:
1) check the filename, you want to send (including the path)
2) check the existence of the file
3) Under which user account is IRIS/Cache running?
4) Does this user have permission to read the file?
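The checks above can be scripted from a terminal session; a minimal sketch (the path is a made-up example, replace it with the file you are actually trying to send):

```objectscript
// hypothetical path; substitute the file from your error message
set filename = "C:\data\export\report.pdf"

// 1 if the file exists and is visible to the OS user running IRIS/Cache
write ##class(%File).Exists(filename),!

// file size in bytes, or -1 if the file can't be accessed
write ##class(%File).GetFileSize(filename),!
```

If Exists() returns 0 while the file is clearly there, it is usually the service account's permissions (point 3 and 4).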
Why so complicated?
write obj.%Size()
should do the job
It's not clear to me what you want to do.
A property like
Property MyData as %(Global-or-File)Stream;
means the size of MyData can be anything between 0 and the free space on your (hard) drive.
That's the reason why MyData is defined as a stream and not as a %String.
On the other hand, an Excel cell can hold no more than 32,767 characters, hence the plan to extract those data to a spreadsheet will only work if the MyData properties do not contain more than 32,767 chars, see
https://support.microsoft.com/en-us/office/excel-specifications-and-lim…
Nevertheless, you could use the following stored procedure to extract the first 32,767 chars from that stream data:
Class Your.Table Extends %Persistent
{
Property StreamData As %GlobalCharacterStream;
// other properties
ClassMethod StreamDataAsText(ID) As %String [ SqlProc ]
{
set obj = ..%OpenId(ID,0), text = ""
if obj { do obj.StreamData.Rewind() set text = obj.StreamData.Read(32767) }
quit text
}
}
Now you can get, besides the other data, the first 32,767 chars of that stream data too:
select Your.Table_StreamDataAsText(ID), * from Your.Table
If you can call a JavaScript function, then you could do something like this...
<html>
<head><title>Test</title>
<link id="fav" rel="icon" href="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAIAAACQkWg2AAAABnRSTlMAAAAAAABupgeRAAAACXBIWXMAAA7EAAAOxAGVKw4bAAAAF0lEQVQokWP8z0AaYCJR/aiGUQ1DSAMAQC4BH5CRCM8AAAAASUVORK5CYII=">
<script>
function changeFavicon() {
var lid=document.getElementById("fav");
if (lid) {
lid.setAttribute("href","data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAIAAACQkWg2AAAABnRSTlMAAAAAAABupgeRAAAACXBIWXMAAA7EAAAOxAGVKw4bAAAAGElEQVQokWNk+M9AEmAiTfmohlENQ0kDAD8vAR+xLJsiAAAAAElFTkSuQmCC");
}
}
</script>
</head>
<body>
<button onclick="changeFavicon();">Change my favicon</button><br>
</body>
</html>
The (red and green) icons are just a demo example.
Oh yes, the idea ... I still have to think about that and get some sleep
The idea is good, I just have to sleep on it...
I can't provide you with .Net help, but there are methods in IRIS/Cache to create QR codes:
##class(%SYS.QRCode).GenerateFile(...) and
##class(%SYS.QRCode).GenerateImage(...)
so your developers could use them directly instead of messing with passing data back and forth between IRIS/Cache and .Net
The correct timestamp format is YYYY-MM-DD HH:MM:SS but, according to the error message, your data does not meet this format.
104 Field validation failed in INSERT, or value failed ...MyTimeStampField' (value '2021-05-26 11:45:40 ') Do you see the space or tab character after the seconds?
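A simple remedy (a sketch; tsValue stands for your incoming timestamp string) is to strip leading and trailing whitespace with $ZSTRIP before the INSERT:

```objectscript
set tsValue = "2021-05-26 11:45:40 "     // note the trailing space
set tsValue = $zstrip(tsValue,"<>W")     // "<>W" removes leading/trailing whitespace
write "'",tsValue,"'",!                  // the quotes show the space is gone
```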
Yes, use the TRACE utility.
See also https://community.intersystems.com/post/macro-know-all-parameters-trans…
If I got you correctly... for IRIS (and newer Cache Versions) you can use
select * from INFORMATION_SCHEMA.ROUTINES where ROUTINE_NAME='...'
and for older Cache versions try
select * from %Dictionary.CompiledMethod where SqlProc=1 and Name='...'
(but be patient, this takes some time)
You have a string of digits... like
set result="12345678900987654321"
then you can easily extract groups of four digits as
for i=1:4:$length(result) write $extract(result,i,i+3),!
this gives you
1234
5678
9009
8765
4321
assuming there are no other characters between those numbers...
[1,2,,3] is just as arguable as [1,2,3,,] or [1,2,3,,,,,] and IRIS/Cache accepts all of them.
Nothing against a system which is input tolerant (forgiving, in your words), but then this tolerance should be obvious and in some way logical. An example: I tolerate trailing comma(s), because they could be leftovers of editing. So I would say, all the arrays
[1,2,3]
[1,2,3,]
[1,2,3,,,]
have a size of 3 - there are three elements, no more. But IRIS/Cache says the sizes are 3, 4 and 6. So let's check the last one:
set obj=[].%FromJSON("[1,2,3,,,]")
write obj.%Size() --> 6
for i=0:1:9 write i,?3,obj.%Get(i),?7,obj.%GetTypeOf(i),!
The output of the FOR loop is:
0 1 number
1 2 number
2 3 number
3 unassigned
4 unassigned
5 unassigned
6 unassigned
7 unassigned
8 unassigned
9 unassigned
The elements with indices 3, 4 and 5 are unassigned, and in some way I can understand that. But if the higher indices, like 6, 7, 88 or 1000 etc., are also unassigned, then I ask you: why is the size 6 and not, say, 12 or 573?
For me the logical size should be 3, because there are three intended elements; the others are the result of tolerated delimiters!
Finally, I don't want to start a war over how to interpret JSON strings. It was just my 2 cents on a topic which, in my opinion, is out of round.
OK, take a more simple case:
set obj=[1,2,,3] // again, this gives a <SYNTAX> error
set obj=[].%FromJSON("[1,2,,3]") // this is OK
but in both cases the problem was the bouncing comma key on my keyboard.
The first was flagged by the compiler, the second was "forgiven" by the JSON reader! BUT the real question is: WHAT IS the third item in the above array? The latter shows obj has a size of 4, so the desired third element could be null, 3, or maybe something else!
I wrote my very first program somewhere in 1971 or 1972, I can't remember anymore. But one thing I have learned is, one should accept checked data only.
Imagine you accept that incorrect (aka forgiven) data and, besides processing it, store it in your database; then later, for whatever reason, you send the (original string) data to an external party... bang! They can't read it, because it's not JSON conformant.
For the sake of completeness, there is one more validator: https://jsonlint.com/ (which shows the above settings.json file as incorrect).
One more problem: it seems IRIS (and Cache) speaks with a forked tongue (but this has nothing to do with the above problem):
set string = "{""Value"":123, }" // note the extra comma!
set stream=##class(%Stream.TmpCharacter).%New()
do stream.WriteLine(string)
set obj1={"Value":123, } --> gives a <SYNTAX> error
set obj2={}.%FromJSON(string) --> accepts the incorrect (json) string!
I'm not sure... but I think your settings.json does NOT conform to the JSON specification. It seems you like extra commas, but JSON does not like them. Take a look at:
...
"active":true , <----extra comma
}, <----------------------- extra comma
}
This produces in IRIS/Cache a <SYNTAX> error (which starts with a "<").
Long time ago I did some connections to external databases (MySql and PostGres).
The essential parts of such a connection are:
1) First, you have to create in your OS the corresponding ODBC Data Source entries
(System-DSN) after installing the required DB-Driver
2) The connection
set gtwConn=##class(%SQLGatewayConnection).%New(), gtwHandle=0
if gtwConn.Connect(OdbcName, OdbcUser, OdbcPass) {
if gtwConn.AllocateStatement(.gtwHandle) {
// check gtwConn.GatewayStatus
// etc.
} else { write "Can't Allocate: "_OdbcName }
} else { write "Can't connect to "_OdbcName }
3) SQL-Commands
do gtwConn.CloseCursor(gtwHandle)
if gtwConn.PrepareW(gtwHandle, sqlStatement) {
if gtwConn.Execute(gtwHandle) {
...
...
} else { /* check gtwConn.GatewayStatus */ }
} else { /* check.gtwConn.GatewayStatus */ }
4) Finish
if gtwConn {
do gtwConn.DropStatement(gtwHandle), gtwConn.Disconnect()
set gtwConn="", gtwHandle=""
}