Well, that's a new one.  

Do you have Long Strings enabled on this installation?  If not, you should.  Enabling it will fix this problem, and it also raises the maximum size of a string from 32KB to roughly 3.6MB, which gives you a lot more flexibility in your development decisions.  But that's another topic.
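A quick way to check, assuming a reasonably recent version ($SYSTEM.SYS.MaxLocalLength and the EnableLongStrings CPF setting are from memory, so verify against the docs for your release):

NSP>w $SYSTEM.SYS.MaxLocalLength()

If this prints 32767, Long Strings are off; with them on it prints roughly 3.6 million.  To enable them, go to Management Portal: System Administration->Configuration->System Configuration->Memory and Startup and check "Enable Long Strings" (or set EnableLongStrings=1 in the [config] section of your .cpf file), then restart the instance.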

To delete your query history you can run the following code in Caché terminal:

NSP>k ^%sqlcq("NSP","SMPQueryHistory")

Where NSP is the name of the namespace that's having the problem.  You can also clear the history from the Management Portal, though I expect you can't get there with this error. Let me know if this works - if not, I'd advise you to open a new WRC issue by going to wrc.intersystems.com, emailing support@intersystems.com, or calling 617-621-0700.  If you're lucky, you might even get to talk to me! :-)
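If you'd like to see what you're about to delete first, ZWRITE will dump the node (same namespace placeholder NSP as above):

NSP>zw ^%sqlcq("NSP","SMPQueryHistory")

That lets you confirm you're only killing the query history and nothing else under ^%sqlcq.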

When I want to run a bunch of statements I find it easier to open the SQL Shell and parameterize the queries.  Like so:

SAMPLES>d $SYSTEM.SQL.Shell()

SQL Command Line Shell

----------------------------------------------------

The command prefix is currently set to: <<nothing>>.

Enter q to quit, ? for help.

SAMPLES>>update sample.person set name=? where name=?

1. update sample.person set name=? where name=?

Enter the value for parameter '1': Kyle

Enter the value for parameter '2': Tester

executing statement with parameter values: set %tResult=%tStatement.%Execute("Kyle","Tester")

1 Rows Affected

statement prepare time(s)/globals/lines/disk: 0.1260s/4915/70580/0ms

          execute time(s)/globals/lines/disk: 0.0033s/10/110/0ms

                          cached query class: %sqlcq.SAMPLES.cls14

---------------------------------------------------------------------------

SAMPLES>># 

     1. update sample.person set name=? where name=?

SAMPLES>>#1

update sample.person set name=? where name=?

1. update sample.person set name=? where name=?

Enter the value for parameter '1': Sexy Ginger God

Enter the value for parameter '2': Kyle

executing statement with parameter values: set %tResult=%tStatement.%Execute("Sexy Ginger God","Kyle")

1 Rows Affected

statement prepare time(s)/globals/lines/disk: 0.0002s/5/98/0ms

          execute time(s)/globals/lines/disk: 0.0002s/5/113/0ms

                          cached query class: %sqlcq.SAMPLES.cls14

---------------------------------------------------------------------------

SAMPLES>>

Some notes:

           1) Entering the hash/pound/tic-tac-toe sign (#) gives you a list of the statements that have been run

           2) You can rerun any of those statements by following that sign with a number.  So #1 runs the first statement from this session (history is actually saved per process)

           3) Parameterized queries don't need quotes around the values, and can be easily rerun

           4) Not allowing multiple statements per line helps make us more resilient against SQL injection attacks (that said, parameterization is still key).

The second option will be faster because it doesn't need to pull the whole object into memory, which the first option has to do.  Here's a quick test that shows the difference:

SAMPLES>s ts = $P($ZTS,",",2) f i=1:1:100000 { s name= ##class(Sample.Person).NameGetStored(1) } w "Time: "_(($P($ZTS,",",2))-ts)

Time: .139673

SAMPLES>s ts = $P($ZTS,",",2) f i=1:1:100000 { s p= ##class(Sample.Person).%OpenId(1) s name=p.Name } w "Time: "_(($P($ZTS,",",2))-ts)

Time: .504776

Now, if you want to go SUPER-fast, you can skip all this objects mumbo-jumbo and get that info right from the global:

SAMPLES>s ts = $P($ZTS,",",2) f i=1:1:100000 { s name = $LG(^Sample.PersonD(1),2)} w "Time: "_(($P($ZTS,",",2))-ts)

Time: .029287

However, this has no safeguards built in, and should only be used for your most dire of performance needs.  

Hi David,

Well, you're in luck, because you're almost done!  First you have to link the table.  To do this, go to the Management Portal: System Explorer->SQL.  Then go to Actions->Linked Table Wizard.  Choose the SQL Gateway connection from the drop-down and choose your table.  Go through the remaining screens, where you can usually accept the defaults.

Once you link a table, then you can interact with it as if it were local!  That is, you can pretend the table is not linked, and use the SQL and/or Object access you're used to in Caché, and access the data in the remote tables.   If you are having ANY trouble with this, contact the WRC and one of us will be happy to walk you through the procedure.
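For example, supposing the wizard generated a linked class called User.RemotePerson (your generated class name will differ), both access modes work exactly as if the table were local:

SAMPLES>>select count(*) from SQLUser.RemotePerson

SAMPLES>s p=##class(User.RemotePerson).%OpenId(1) w p.Name

The SQL goes through the gateway connection and the object access opens the remote row, but you write the code the same way you would for a local table.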

I want to take a moment here and advise you to be very careful with iKnow.  iKnow is NOT a solution, it is a way for you to develop your own solutions (much like Caché and Ensemble, actually).  While iKnow can give structure to your free text fields, it cannot tell you what to do with that information once you gather it.  So before implementing iKnow and developing a solution, you need to know what it is you want to look for, the purpose of putting the iKnow structure on your data, and what you are going to display or show off once you get it.  

You should contact the WRC at wrc.intersystems.com to help you debug this issue - we'd be happy to help!

As a first guess, are you using Caché 5.0.2?  If so, those DLLs might be 32-bit and not match your 64-bit web server, causing problems.  I would suggest using the most recent CSP Gateway client, which you can also download from wrc.intersystems.com - make sure you get the 64-bit version.  From there, following those instructions has always led me to success in configuring web servers.

Shouldn't be too bad.  I think all you need to do is to set up Caché as an ODBC Data source on the system.  Steps are as follows:

    1) Download Caché ODBC Driver from wrc.intersystems.com or .intersystems.com/pub/cache/odbc/2016

    2) Go to Control Panel->Administrative Tools->Data Sources (ODBC)->System DSN and create a new DSN with the InterSystems ODBC driver.  You will need to know the IP address, port, namespace, and credentials for your Caché server.

    3) Configure Crystal Reports to use that DSN.

    4) ????

    5) Profit!

I think any answer here should be thinking about using triggers.  I don't know MySQL well enough to know whether it has linked tables, but if Caché were the master copy, this is pretty easy.  You have your master table, which contains your data, and an external table that holds a copy.  You link the external table into Caché and add triggers on the master table that call INSERT/UPDATE/DELETE on the linked table.
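Here's a sketch of what the master-side trigger might look like.  The class and table names (Demo.Master, Demo_Linked.Person) are made up for illustration; {Name} is the trigger syntax for referencing the affected row's column value:

Class Demo.Master Extends %Persistent
{

Property Name As %String;

/// After each row-level INSERT on the master, push the new row to the linked copy
Trigger PushInsert [ Event = INSERT, Foreach = row/object, Time = AFTER ]
{
    new val
    set val = {Name}
    &sql(INSERT INTO Demo_Linked.Person (Name) VALUES (:val))
}

}

You'd add similar triggers for UPDATE and DELETE, keyed on the row's ID.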

Find in Files is a nice Studio feature, but I would say that the Atelier paradigm allows for a much more powerful search using grep.  All the files in your workspace exist on your hard drive, in your workspace directory, in UDL (that is, plaintext) format.  So if you wanted to find in files for, say, ^KMB, you can do this:

kbaxterMAC:User kbaxter$ grep -iR "KMB" ./

.//TwoGWQueries.mac: s ^KMB = 1

Remember, the whole point of Atelier is to move the source of truth from the Database to your Source Control, which will in some cases be linked into your filesystem.  I think grep is a great solution here.

Two examples below, one with a name and one without.  It looks like this is supposed to work like a stored procedure (according to my understanding of the PostgreSQL docs).  Having a property that uses the function as its SqlComputeCode is pretty trivial.


Class Test.Seq
{

ClassMethod KyleSeq() As %Integer [ CodeMode = expression, SqlName = "KyleSeq", SqlProc ]
{
$I(^KMB("KyleSeq"))
}

ClassMethod KyleSeqName(name As %String) As %Integer [ CodeMode = expression, SqlName = "KyleSeqName", SqlProc ]
{
$I(^KMB($G(name,"KyleSeq")))
}

}

SAMPLES>>select Test.KyleSeqName('Fabio')

8. select Test.KyleSeqName('Fabio')

Expression_1

3



SAMPLES>>select Test.KyleSeq()           

9. select Test.KyleSeq()


Expression_1

5

There is no bandwidth requirement that I'm aware of.  Studio uses ODBC to connect to the server, and it has to bring over the class/routine list for the Open dialog to work; on a large system this can take some time.  As long as Studio is receiving information, though, I think it should work (albeit slowly).  If you find Studio is working too slowly, you can import your project into a local instance and work off of that.

This is working as expected (to the best of my knowledge).  If you have a property that is SqlComputed and not Calculated, then the property value is stored on disk.  We compute that value on the first INSERT/%Save() and store it then.  After that, since there is no SqlComputeOnChange value, the value won't be recomputed on any UPDATE.  However, I believe you can still set that property directly via INSERT/%Save().
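To illustrate with a made-up class (the names are mine, not from your question): UpperName below is SqlComputed but not Calculated, so it's computed and stored on the first INSERT/%Save() and then left alone on UPDATEs:

Class Demo.Computed Extends %Persistent
{

Property FirstName As %String;

/// Computed and stored at INSERT time; add SqlComputeOnChange = FirstName
/// to the keyword list if you want it recomputed whenever FirstName changes.
Property UpperName As %String [ SqlComputed, SqlComputeCode = { set {UpperName} = $ZCONVERT({FirstName}, "U") } ];

}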