Here at InterSystems, we often deal with massive datasets of structured data. It's not uncommon to see customers with tables spanning >100 fields and >1 billion rows, each table totaling hundreds of GB of data. Now imagine joining two or three of these tables together, with a schema that wasn't optimized for this specific use case. Just for fun, let's say you have ten years' worth of EMR data from 20 different hospitals across your state, and you've been tasked with finding….

Like many others probably do, we found ourselves stuck doing live data mapping in our Interface Engine that we really didn't want to do, but we had no good alternative. We wanted to keep mappings only for as long as they might be needed and then purge expired rows based on a TTL value. We actually had four use cases for it ourselves before we built this; a sketch of the TTL idea follows below.
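
To make the TTL mechanism concrete, here is a minimal sketch in Python with SQLite. It is purely illustrative, not the actual implementation: the table layout, function names, and key format are all assumptions, and in practice the purge step would presumably run on a schedule inside the engine rather than inline.

```python
import sqlite3
import time

# Illustrative mapping store: each row carries an absolute expiry timestamp
# derived from its TTL, so a purge is a single DELETE over expired rows.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE mapping (
        source_key   TEXT PRIMARY KEY,
        mapped_value TEXT NOT NULL,
        expires_at   REAL NOT NULL  -- epoch seconds when this row goes stale
    )
""")

def put(key: str, value: str, ttl_seconds: float) -> None:
    """Store (or refresh) a mapping that expires ttl_seconds from now."""
    conn.execute(
        "INSERT OR REPLACE INTO mapping VALUES (?, ?, ?)",
        (key, value, time.time() + ttl_seconds),
    )

def get(key: str):
    """Return the mapped value, or None if missing or already expired."""
    row = conn.execute(
        "SELECT mapped_value FROM mapping"
        " WHERE source_key = ? AND expires_at > ?",
        (key, time.time()),
    ).fetchone()
    return row[0] if row else None

def purge_expired() -> int:
    """Delete rows past their TTL; returns the number of rows removed."""
    cur = conn.execute(
        "DELETE FROM mapping WHERE expires_at <= ?", (time.time(),)
    )
    conn.commit()
    return cur.rowcount

# Usage: map an inbound identifier (hypothetical key format), keep it for
# an hour, and purge stale rows later.
put("MRN|12345", "FHIR-Patient/abc", ttl_seconds=3600)
print(get("MRN|12345"))   # -> FHIR-Patient/abc
print(purge_expired())    # -> 0 (nothing has expired yet)
```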
