Robert Cemper · Dec 3, 2018 go to post

If you don't define an explicit MAXLEN, it is just ignored during validation.
But you may run into problems with ODBC/JDBC when your VARCHAR or similar has no maximum size.

So you might instead store your super-long string in a GlobalStream.
But then you lose all string-related operations. It's an option I personally dislike.

Robert Cemper · Dec 2, 2018 go to post

Hi,

I had to prepare a suitable file.

Previous case: open parameter "RU" (/UNDEFINED) ignores line terminators:

 
USER>open file:("RU":1000000)
 
USER>use file read x use 0 write $l(x)
164405
USER>write $e(x,*-30,*)
e></xs:complexType></xs:schema>

USER>close file


With open parameter "RS" (/STREAM), line terminators are honored:

USER>open file:("RS":1000000) s l=0
 
USER>use file read x set l=l+$l(x) use 0 write $l(x)," ",l
4126 4126
USER>use file read x set l=l+$l(x) use 0 write $l(x)," ",l
18433 22559
USER>use file read x set l=l+$l(x) use 0 write $l(x)," ",l
61497 84056
USER>use file read x set l=l+$l(x) use 0 write $l(x)," ",l
80343 164399

USER>write $e(x,*-30,*)
e></xs:complexType></xs:schema>

USER>close file

The difference in length results from the skipped line terminators.
 

Robert Cemper · Dec 2, 2018 go to post

Hi,

I just tried this:

USER>set file="Eduard.txt"
 
USER>open file:("RU":1000000)

USER>use file read x#1000000

USER>write $length(x)
183237

USER>

"RU":reclen seems to do the trick.

I used it in the READ just as a fallback. The content was as expected.

Robert Cemper · Nov 20, 2018 go to post

There's still the option to place an enhancement request @ WRC.
If granted, version 2019.2, more likely 2020.* or later, may include it.
Until then you depend on writing it yourself.

Robert Cemper · Nov 20, 2018 go to post

There is no option 'NO_JSON' or similar for strings,
so you have to do it by hand.

Assumption: according to your description, the basic table looks like this (except for Name):

select ID, Name, Options from Whatever.Whatever

| ID | Name                  | Options                      |
|----|-----------------------|------------------------------|
| 1  | Zucherro,Michelle Q.  | {"Color":"Green","Count":4}  |
| 2  | Paraskiv,Alexandra E. | {"Color":"Purple","Count":6} |
| 3  | Ramsay,Jules T.       | {"Color":"White","Count":8}  |
| 4  | Grabscheid,Julie K.   | {"Color":"Orange","Count":2} |
| 5  | Edwards,Mark S.       | {"Color":"Red","Count":1}    |

Then this might do the trick:
you manually mask out the critical characters first and mask them back in after JSON processing.
No help from the system, just your own responsibility.

SELECT top 5 ID, REPLACE(REPLACE(REPLACE(
JSON_OBJECT('ID':ID,'Name': Name,'Options': $TRANSLATE(Options,'{}"','()^') )
,'"(','{')
,')"','}')
,'^','"')

FROM Whatever.Whatever

 
1  {"ID":1,"Name":"Zucherro,Michelle Q.","Options":{"Color":"Black","Count":7}}
2  {"ID":2,"Name":"Paraskiv,Alexandra E.","Options":{"Color":"Red","Count":1}}
3  {"ID":3,"Name":"Ramsay,Jules T.","Options":{"Color":"Purple","Count":6}}
4  {"ID":4,"Name":"Grabscheid,Julie K.","Options":{"Color":"Green","Count":4}}
5  {"ID":5,"Name":"Edwards,Mark S.","Options":{"Color":"White","Count":8}}

 

Not funny, but working.

You could as well compose your 'personalized' JSON result in a ClassMethod and project it as an SqlProcedure.
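The same mask-out/mask-in round trip can be sketched in Python (purely illustrative; the function names are mine, and it shares the SQL version's caveat that the mask characters ( ) ^ must not occur in the real data):

```python
import json

def mask(options):
    """Hide the JSON-critical characters of an already-built JSON string."""
    return options.translate(str.maketrans('{}"', '()^'))

def unmask(doc):
    """Reverse the masking after the outer JSON document is built."""
    return doc.replace('"(', '{').replace(')"', '}').replace('^', '"')

row = {"ID": 1, "Name": "Zucherro,Michelle Q.",
       "Options": mask('{"Color":"Green","Count":4}')}
print(unmask(json.dumps(row, separators=(',', ':'))))
# → {"ID":1,"Name":"Zucherro,Michelle Q.","Options":{"Color":"Green","Count":4}}
```

Because the masked string contains nothing the JSON builder needs to escape, the nested object survives as real JSON instead of an escaped string literal.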

Robert Cemper · Nov 19, 2018 go to post

For display in the terminal:

ZWRITE ^People
or
ZWRITE ^People("Customers")
or
ZWRITE ^People("Customers", "Acc.division")

depending on your needs

Robert Cemper · Nov 16, 2018 go to post

I'm really surprised by this discussion, especially having actual numbers. What is 6400 supposed to mean with just 15000 rows in total? Sorry, I oppose!
It's a matter of selectivity. If you can collect 2% or more of your records by a single value, then a BITMAP makes sense.
Even the extent bitmap that filters exists-or-not falls under this rule, though it isn't really property based.

Robert Cemper · Nov 14, 2018 go to post

1)  YES, you can! 
And it will not affect stability and usability. Though understanding COS is definitely an advantage in understanding what is going on. Similar to all other DBs: understanding concepts and internals is always a benefit. Other DBs are just not as open to investigation, and not as flexible in doing the "undoable".

2a) Importing a Caché DB into IRIS and running it works for at least 98%.
For the remaining 2%, ISC engineers are very open to assisting you and solving the issue.

2b) Converting COS to anything else depends mostly on the COS code you have in hand. I know of no converter that does it for you.
As COS allows coding styles that were in use 40 years back, the range of styles is a very broad and unpredictable field until you touch it.
#1 You depend on the quality of the external documentation.
#2 You depend on inline documentation: comments and remarks in the code. This can be excellent or simply nonexistent.
#3 You depend on how tricky the code is designed and written.
At that point just knowing COS might not be enough, and even experts can get a headache from what I call "dirty coding".
#4 ISC also has experts to read and understand old styles and their side effects.
#5 You have this bright community to ask.
#6 You have excellent online training facilities to learn COS. I've done this with ~12 people over the last few years.
If they understand objects, SQL, Java (or generic OO programming concepts), it's a matter of a few weeks to get inside COS.
And they have to be willing to break out and see something new, with other limits, other possibilities, other horizons.

HTH 

Robert Cemper · Nov 13, 2018 go to post

The key issue in a DR scenario is network performance between the instances.

You will most likely run an async mirror to have a reasonable distance between the production and DR sites.
I wouldn't assume enough bandwidth for a sync mirror.

The other issue is the performance of the DR site. You require enough performance to process all the synchronization within a reasonably short delay. This is often underestimated: production servers grow and leave their DR site behind.

Not specific to the cloud, but no less important: how can you verify that the content of your DR site is really identical to your production?
For a heavy transactional operation, this can be a really tricky exercise.

And last but not least: if you don't train your team for a disaster situation and verify your instructions step by step at least once a year, all your investment could be wasted money.

Especially this last point is skipped quite often, as it means in most cases a lot of effort with no immediate ROI.
 

Robert Cemper · Nov 10, 2018 go to post

From the Caché prompt:

USER>$uname -a
Linux MYSERVER 4.15.0-38-generic #41-Ubuntu SMP Wed Oct 10 10:59:38 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux

 
Robert Cemper · Nov 10, 2018 go to post

And for Linux:

(I don't have a Caché installation at hand) but CPIPE should do it:

rcemper@ubuntu:~$ uname -a
Linux anyServer
4.15.0-38-generic #41-Ubuntu SMP Wed Oct 10 10:59:38 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
rcemper@ubuntu:~$ 

 
Robert Cemper · Nov 10, 2018 go to post

Something fast for Windows:

USER>$wmic os list brief
 
BuildNumber  Organization  RegisteredUser  SerialNumber             SystemDirectory      Version
17134                      cemper          0                      0 C:\WINDOWS\system32  10.0.17134

Robert Cemper · Nov 9, 2018 go to post

I oppose:

- All text coloring and font sizes help to express the importance of text.

- Text from right to left is only useful for Hebrew or Arabic writing: not my world.

- Special characters help a lot if you don't have them on your keyboard. ¿Isn't it? My 2 ¢

- What's bad with smileys? 😢

I do spell checking with Grammarly; the embedded one only confused me.

The source view could use an improvement to wrap the text in the window. The single-line display is cumbersome.

Robert Cemper · Nov 5, 2018 go to post

To separate physical storage from applications, you may do this inside Caché using ECP.

Robert Cemper · Nov 5, 2018 go to post

As stated in the 2nd paragraph:

"Unlike the standard .NET binding, the eXTreme APIs do not use TCP/IP to communicate with Caché. Instead, they use a fast in-memory connection."

So, by definition, it can't be on a different server. It acts like COS, but in the .NET-based language of your choice.
 

Robert Cemper · Nov 4, 2018 go to post

Still another approach, using your original "solution" in PHP, following the idea of a micro-service.

Instead of mimicking what PHP might do, I use it directly for this purpose.
That way, more sophisticated functionality can be used without recoding.

I extended your test to include doubled double quotes:

USER>write  %dstr
ABC Company,"123 Main St, Ste 102","Anytown, DC",10001,234-567-8901,"hi ""rcc"" was here"
USER>set reply=$$^phpCSV(%dstr) write reply,!! zwrite reply
ABC Company       123 Main St, Ste 102    Anytown, DC 10001   234-567-8901    hi "
rcc" was here
 
reply="ABC Company"_$c(9)_"123 Main St, Ste 102"_$c(9)_"Anytown, DC"_$c(9)_"10001"_$c(9)_"234-567-8901"_$c(9)_"hi ""rcc"" was here"


and here the code:

phpCSV(str) {  ; use PHP for the conversion
#define php "............\php.exe "   ; add the location of php.exe
#define pipe "|CPIPE|1"
  set file="myTest.php"
  open file:"WN" use file             ; write a one-shot PHP script
  write "<?php "
       ,!,"$str='"_str_"';"
       ,!,"$strtotab = implode('\t', str_getcsv($str, ','));"
       ,!,"print_r($strtotab);"
       ,!,"?>",!
  close file
  open $$$pipe:$$$php_file            ; run it through a command pipe
  use $$$pipe read result
  close $$$pipe
  use 0
  quit $replace(result,"\t",$c(9))    ; PHP emits a literal \t; turn it into real TABs
}

Robert Cemper · Nov 2, 2018 go to post

GREAT!
All well-documented code!
No (dirty) Harry_Potter_Coding!

I was sure you knew it.

Robert Cemper · Nov 1, 2018 go to post

The Unix/Linux world often uses LF := $C(10) as the line terminator,
while in the Windows (and VMS) world CRLF := $C(13,10) is the default.
So you depend on the source system providing the data.

Suggested approach: use LF as the (common) line terminator and just drop $C(13) or &#x0D; from your input record by

$replace($translate(record,$c(13)),"&#x0D;","")

before any other processing.
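A hedged Python sketch of that same normalization step (the entity string is taken literally from the suggestion above):

```python
def normalize_record(record):
    """Drop raw CR characters first, then the XML-escaped form &#x0D;,
    leaving LF as the only line terminator."""
    return record.replace('\r', '').replace('&#x0D;', '')

print(repr(normalize_record('line1\r\nline2&#x0D;\n')))
# → 'line1\nline2\n'
```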

Robert Cemper · Nov 1, 2018 go to post

OK, this handles double quotes, but only INSIDE a quoted string:

parseCSV(string,newsep,sep=",",quote="""",newquote) {  ; adjusted for flexible quoting
    set res="",newsep=$g(newsep,$c(9)),newquote=$g(newquote,quote)
    for {
        if $g(string)[sep {
            if $e(string)=quote {
                set string=$replace(string,"""""",$c(2))  ; hide doubled double quotes
                set part=$p(string,quote,2)
                    ,string=$replace($p(string,part_quote_sep,2,*),$c(2),"""""")
                    ,res=res_newquote_$replace(part,$c(2),"""""")_newquote_newsep
            } else {
                set part=$p(string,sep),string=$p(string,sep,2,*)
                    ,res=res_part_newsep
            }
        } else {
            set res=res_$g(string) quit
        }
    }
    quit res
}
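For comparison only: the same CSV-to-TAB conversion, including doubled double quotes inside quoted fields, can be done with Python's csv module, which already implements this quoting rule.

```python
import csv, io

def csv_to_tab(line):
    """Parse one CSV line (quoted fields, doubled quotes) and re-join with TABs."""
    fields = next(csv.reader(io.StringIO(line)))
    return '\t'.join(fields)

print(csv_to_tab('ABC Company,"123 Main St, Ste 102","hi ""rcc"" was here"'))
```

This reproduces the tab-separated result shown in the PHP micro-service example earlier in the thread.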


HTH 

Robert Cemper · Nov 1, 2018 go to post

You are right. I didn't think of empty parts and doubled double quotes 👍,
which I never encountered in CSV.
A next-level exercise 😉

Robert Cemper · Nov 1, 2018 go to post

I have discouraged the use of the $ZU(...) functions for years, as they haven't been documented since 2010.
I recently had to dig back to 2009 for just a weak hint of what might happen.

It is even worse with all the internal stuff around %occ* and similar.
No docs. No guarantee of the life cycle. No (external) notice of eventual changes. Mostly shipped as deployed code.

If it is used inside $system.* or as part of generated code, that's OK; the responsibility is not on the user's side.

Verifying those "specials" with every release change can be a very heavy exercise.
(I'm just experiencing this on a system locked down to an older version, unable to migrate.)

Robert Cemper · Nov 1, 2018 go to post

Not being verbose in the %occ* world, I had this solution, which also allows changing the quoting:
 

parseCSV(string,newsep,sep=",",quote="""",newquote) {  ; adjusted for flexible quoting
  set res="",newsep=$g(newsep,$c(9)),newquote=$g(newquote,quote)
  for {
    if $g(string)[sep {
      if $e(string)=quote {
        set part=$p(string,quote,2),string=$p(string,part_quote_sep,2,*)
            ,res=res_newquote_part_newquote_newsep
      } else {
        set part=$p(string,sep),string=$p(string,sep,2,*)
            ,res=res_part_newsep
      }
    } else {
      set res=res_$g(string) quit
    }
  }
  quit res
}

BTW, it's an excellent test exercise for new COS programmers.
I'll add it to my collection.

Thanks 👍