John Lisa · Sep 23, 2020 I have another question. Now that the project is underway, it appears that I have over 86 million invoices to extract into a flat file. When I run the SELECT statement (now using DBeaver), the job bombs out. I'm not a system admin, so I can't add tables to the Caché database, only pull data out of it, and I was thinking about importing the flat files into SQL Server for further manipulation. Is there a best practice for getting this enormous table out of Caché?
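One common answer to the question above is to avoid a single 86M-row SELECT and instead export in ID-ordered chunks (keyset pagination), appending each chunk to the flat file. The sketch below is a minimal illustration of that pattern, with sqlite3 standing in for a real pyodbc/JDBC connection to Caché, and with a hypothetical `Invoice` table and columns; note that Caché SQL would use `SELECT TOP n ... WHERE ID > ? ORDER BY ID` rather than sqlite's `LIMIT`.

```python
import csv
import sqlite3  # stand-in only; against Caché you would open a pyodbc connection instead


def export_in_chunks(conn, out_path, chunk_size=50_000):
    """Export the (hypothetical) Invoice table to CSV in ID-ordered chunks,
    so no single query has to materialize all rows at once."""
    cur = conn.cursor()
    last_id = 0
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ID", "Amount"])  # hypothetical column names
        while True:
            # Keyset pagination: resume after the last ID seen.
            # In Caché SQL this would be:
            #   SELECT TOP ? ID, Amount FROM Invoice WHERE ID > ? ORDER BY ID
            cur.execute(
                "SELECT ID, Amount FROM Invoice WHERE ID > ? ORDER BY ID LIMIT ?",
                (last_id, chunk_size),
            )
            rows = cur.fetchall()
            if not rows:
                break
            writer.writerows(rows)
            last_id = rows[-1][0]


# Demo with a small in-memory table standing in for the real 86M-row source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Invoice (ID INTEGER PRIMARY KEY, Amount REAL)")
conn.executemany(
    "INSERT INTO Invoice VALUES (?, ?)", [(i, i * 1.5) for i in range(1, 101)]
)
export_in_chunks(conn, "invoices.csv", chunk_size=30)
```

Because each chunk restarts from the last ID written, a failed run can resume from where it stopped instead of starting the whole extract over.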
John Lisa · Aug 8, 2020 Thanks. Looks like I'm going to have my infrastructure team install this.
John Lisa · Aug 7, 2020 I have tried a small sample extract using SQL code in the System Explorer function of the Management Portal. With just a few joins and a small sampling of invoices, I get an error message. I have to extract large volumes of various data for an integration project. Are the other tools mentioned (DBeaver and DataGrip) better suited to handling large files?
John Lisa · Aug 6, 2020 I have tried a small sample extract using SQL code in the System Explorer function of the Management Portal. With just a few joins and a small sampling of invoices, I get an error message. Is there a more efficient way to extract very large amounts of various data via SQL?
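For the efficiency question above, one frequent culprit is fetching the entire result set into client memory at once. Iterating the cursor with `fetchmany()` and writing each batch out as it arrives keeps memory flat regardless of result size. A minimal sketch, again with sqlite3 standing in for a Caché connection and a hypothetical `Invoice` table:

```python
import sqlite3  # stand-in; the same cursor pattern works with a pyodbc connection to Caché

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Invoice (ID INTEGER PRIMARY KEY, Total REAL)")
conn.executemany(
    "INSERT INTO Invoice VALUES (?, ?)", [(i, float(i)) for i in range(1, 1001)]
)

cur = conn.cursor()
cur.execute("SELECT ID, Total FROM Invoice")

total_rows = 0
while True:
    batch = cur.fetchmany(200)  # pull rows in small batches instead of fetchall()
    if not batch:
        break
    total_rows += len(batch)    # in a real extract, write each batch to the flat file here

print(total_rows)
```

The batch size is a tuning knob: larger batches mean fewer round trips, smaller ones mean less memory per step; neither changes the rows returned.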
John Lisa · Jul 25, 2020 Thank you, Stephen, for your response. The extraction source is indeed GE Centricity, but the hospital is advising that it is 'Groupcast'. I will find out in the coming week. If that's the case, I will see if I can use the IRIS management system via System Explorer.
John Lisa · Jul 16, 2020 Hi Henrique, Thank you for the speedy reply. Very eager to give this a run! Regards, John