Good day. This fall I had to solve this problem, and for the conversion I wrote the ZDev tool (https://github.com/MyasnikovIA/ZDev). Maybe someone will find it useful. The whole project lives in %ZDev.xml (the project includes an example of copying a global: %ZDev.Demo.demo5CopyGlobal). Briefly, the principle of operation: a socket server is started on the remote database, the target client connects and executes the command ZW ^MyGlobal.tab, then reads the result and restores it locally, converting the encoding with $ZCVT(...). The same could be done by exporting to a file, but that mechanism was unacceptable in my case, since the database this tool was written for exceeds two terabytes.
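The idea above can be sketched roughly as follows. This is not the actual ZDev code — the host, port, and unquoting logic are illustrative assumptions — just a minimal picture of a client that reads ZW output over a TCP socket, converts the encoding, and restores the global locally:

```objectscript
 ; Hypothetical client-side sketch (not the real ZDev implementation).
 ; Assumes the remote side streams plain "^ref(subs)=value" lines, one per Read.
 Set dev="|TCP|1"
 Open dev:("remote.host":9090):10 Use dev
 For {
     Read line  Quit:line=""          ; e.g. ^MyGlobal.tab(1)="value"
     Set line=$ZCVT(line,"I","UTF8")  ; adjust encoding for the local database
     Set ref=$Piece(line,"=",1)       ; the global reference part
     Set val=$Piece(line,"=",2,*)     ; the quoted value part
     Set @ref=$Extract(val,2,*-1)     ; naive unquoting; real code must handle
 }                                    ; embedded quotes, $LB values, long lines
 Close dev
```

A real implementation also has to deal with ZW line continuation, escaped quotes, and list ($LB) values, which is where most of the work in such a tool goes.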
Yes, I would like to list it on Open Exchange. What do I need to do for that?
For me, this is also a problem, and it remains unsolved. In one run, about 80 gigabytes were transferred within a single process over 14 hours. To smooth out the database conversion, the tables were split into important and non-important ones. All the small tables were then transferred in 20 threads, while the large tables were transferred by iterating over them in reverse with $Order(^XXX.tab(ind),-1). Again, this is a solution tailored to my particular case, and it does not solve the problem of continuous operation. If there are ready-made solutions, I would be very glad to hear about them.
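The reverse-traversal splitting can be sketched like this. The class and method names are hypothetical — the point is only to show how $Order(...,-1) walks the top-level subscripts backwards so each chunk can be handed to a separate background job:

```objectscript
 ; Hypothetical sketch of parallelizing a large global transfer.
 ; My.Copy and CopyRange are illustrative names, not part of ZDev.
 Set ind=""
 For {
     Set ind=$Order(^XXX.tab(ind),-1)     ; next subscript, in reverse order
     Quit:ind=""                          ; "" means we walked past the first one
     Job ##class(My.Copy).CopyRange(ind)  ; one background job per subscript range
 }
```

Starting from the end of the subscript range means newly inserted rows (which typically land at the high end) are picked up first, but as noted above this does nothing for rows that change while the copy is running.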