I have a large file, over 500,000 rows, that needs to be processed. Each row must be verified against various data checks, and all the results then collated into a single report. I tried calling the processing function with the Job command, so that each jobbed process would report its results to a different node of a common global. But the jobbed functions are not updating their respective rows, even though I am passing the global name and root node. What is the most efficient way to process this large file?
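One pattern that does work is to pass each jobbed process its row range and the name of its result node, and have it write through subscript indirection; globals, unlike local variables, are shared between processes, and a jobbed process starts with an empty symbol table, so everything it needs must arrive as an argument or be read from a global. This is only a minimal sketch, with hypothetical names (routine FileCheck, a per-row Validate helper, a ^Results global):

ProcessFile ; driver: split the rows across background jobs
 New rows,jobs,chunk,i
 Set rows=500000,jobs=4,chunk=rows\jobs   ; the last worker could also take any remainder
 Kill ^Results
 For i=1:1:jobs {
     ; pass the row range and the NAME of the result node to each worker
     Job Worker^FileCheck(((i-1)*chunk)+1,i*chunk,"^Results("_i_")")
 }
 ; wait for every worker to flag completion before collating the report
 For i=1:1:jobs { While '$Data(^Results(i,"done")) { Hang 1 } }
 Quit

Worker(startRow,endRow,resultNode) ; runs in its own background process
 New row
 For row=startRow:1:endRow {
     ; writing through subscript indirection on the passed node name is
     ; visible to the driver, because globals are shared between processes
     Set @resultNode@(row)=$$Validate^FileCheck(row)
 }
 Set @resultNode@("done")=1
 Quit

Once all the "done" flags are set, the driver can walk ^Results and produce the single report in one place.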

I am trying to return the maximum of the values of two fields, LastViewed and LastDownloaded, as a local variable LastAccessed for each row, using a SQL query. These values are stored in $H format. Is there an existing SQL command that compares two column values? I could not find one, so I tried using a $Select statement and got an error: "A term expected, beginning with either of: identifier, constant, aggregate, $$, (, :, +..."

Here is the SQL query I am trying to run:
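$SELECT is an ObjectScript function rather than a SQL one, which is why the parser rejects it; in SQL the same comparison can be written with a CASE expression. A minimal sketch, assuming a hypothetical table Media with VARCHAR columns LastViewed and LastDownloaded holding "days,seconds" $HOROLOG strings; the pieces are cast to integers so the comparison is numeric rather than string collation:

SELECT ID, LastViewed, LastDownloaded,
       CASE
         WHEN CAST($PIECE(LastViewed,',',1) AS INT) > CAST($PIECE(LastDownloaded,',',1) AS INT)
           OR (CAST($PIECE(LastViewed,',',1) AS INT) = CAST($PIECE(LastDownloaded,',',1) AS INT)
               AND CAST($PIECE(LastViewed,',',2) AS INT) >= CAST($PIECE(LastDownloaded,',',2) AS INT))
         THEN LastViewed
         ELSE LastDownloaded
       END AS LastAccessed
FROM Media

If the value is needed in ObjectScript rather than in the result set, the same CASE expression works in embedded SQL with LastAccessed selected INTO a host variable.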

A method to convert certain non-readable ASCII characters in a %Stream.FileCharacter object first copies it into a %Stream.FileBinary object and then loops through the stream one character at a time (while 'bStream.AtEnd) to find and convert the offending characters to our interpretation of their readable ASCII equivalents. This sequential loop is taking too long for large files.
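Reading the stream in large chunks and converting each chunk with $TRANSLATE avoids the one-character-at-a-time loop entirely. A minimal sketch, with hypothetical file paths and an example character map (substitute the actual offending characters and their replacements); error checking on LinkToFile and %Save is omitted for brevity:

ConvertStream(inFile,outFile) ; chunked clean-up instead of a per-character loop
 New in,out,chunk,from,to
 Set in=##class(%Stream.FileBinary).%New()
 Do in.LinkToFile(inFile)
 Set out=##class(%Stream.FileCharacter).%New()
 Do out.LinkToFile(outFile)
 Set from=$Char(145)_$Char(146)_$Char(150)    ; e.g. curly quotes and a dash
 Set to="''-"                                 ; their plain ASCII replacements
 While 'in.AtEnd {
     Set chunk=in.Read(32000)                 ; read ~32KB per call, not one byte
     Do out.Write($Translate(chunk,from,to))  ; convert the whole chunk at once
 }
 Quit out.%Save()

$Translate maps every occurrence of each character in from to the character at the same position in to in a single call. If some substitutions are multi-character, $Replace on each chunk is the equivalent, with some care taken for sequences that could straddle a chunk boundary.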

I am having a problem getting key pair authentication to work on my local PC. I am using freeSSHd as the SFTP server. I generated private and public keys with PuTTYgen and used them to log into the root directory successfully from both WinSCP and PuTTY. But this is not working with an FTP Operation. Below is the error that results. The freeSSHd server log shows that I log in but immediately disconnect.

I am working on HS2015.1.1. Please help!
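One way to narrow this down is to test the key pair directly from a terminal with %Net.SSH.Session, which, as far as I know, is what the adapter uses underneath for SFTP. Note that PuTTYgen saves keys in .ppk format; they generally need to be exported to OpenSSH format (Conversions > Export OpenSSH key) before they can be used here. The host, user, and key paths below are placeholders; this is only a sketch:

TestSFTP ; quick key-pair test outside the production
 New ssh,sftp,sc
 Set ssh=##class(%Net.SSH.Session).%New()
 Set sc=ssh.Connect("127.0.0.1")                     ; placeholder host
 If $System.Status.IsError(sc) Do $System.Status.DisplayError(sc) Quit
 ; key files must be in OpenSSH format, not .ppk
 Set sc=ssh.AuthenticateWithKeyPair("myuser","C:\keys\id_rsa.pub","C:\keys\id_rsa","")
 If $System.Status.IsError(sc) Do $System.Status.DisplayError(sc) Quit
 Set sc=ssh.OpenSFTP(.sftp)
 If $System.Status.IsError(sc) Do $System.Status.DisplayError(sc) Quit
 Write "SFTP session established with key pair",!
 Quit

If this test succeeds but the FTP Operation still fails, the problem is more likely in the Operation's credential and key-file settings than in the keys themselves.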
