Article · Oct 21, 2015 · 2m read

Solving the Problem of Data Silos: Process and Architecture

Introduction

The lack of visibility across data silos — data sources that are not integrated with enterprise systems — is a threat to business efficiency and profits in many industries. In financial services, front-office silos may develop where operations are segregated by product and region without coordination on data model design. Mergers and acquisitions may result in additional disparate silos, or regulations may require that data in one arm of the firm be inaccessible to another. When risk managers and compliance officers in financial services firms cannot see how activities in one silo are related to activities in another, the chance of rogue risk-taking, rate manipulation, or financial fraud is high. This May 2015 headline is just one example of the consequences: “Five global banks to pay $5.7 billion in fines over rate rigging.”

Most firms have risk and crime prevention operations aimed at forestalling such headline events. But the applications they use cannot give them a clear line of sight into the data across all of the firm’s silos. Without this awareness, it is nearly impossible to recognize anomalies and make adjustments before they become larger problems. Despite the best efforts of risk managers and compliance officers, negative events still occur.

This white paper describes a data line-of-sight solution based on InterSystems Ensemble® technology. It illustrates how you can gain visibility into data and activity across all of your silos to reduce the risk of negative events.
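
To make the line-of-sight idea concrete, the toy sketch below (in Python, not Ensemble itself) normalizes trade records from two hypothetical silo feeds into a common schema so that activity can be aggregated across silos. All field names and the exposure roll-up are invented for illustration.

```python
# Toy sketch: unify records from two silos with different schemas so that
# cross-silo activity becomes visible. Everything here is illustrative.

silo_a = [{"trader": "T1", "product": "IRS", "notional": 5_000_000}]
silo_b = [{"dealer": "T1", "instr": "FX-FWD", "amount": 9_000_000}]

def normalize(rec: dict) -> dict:
    """Map each silo's field names onto one shared schema."""
    return {
        "trader": rec.get("trader") or rec.get("dealer"),
        "product": rec.get("product") or rec.get("instr"),
        "notional": rec.get("notional") or rec.get("amount"),
    }

unified = [normalize(r) for r in silo_a + silo_b]

# With a single view, a simple cross-silo exposure check becomes possible:
exposure = {}
for rec in unified:
    exposure[rec["trader"]] = exposure.get(rec["trader"], 0) + rec["notional"]
print(exposure)  # {'T1': 14000000} -- visible only across both silos
```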

Article · Oct 21, 2015 · 1m read

Using Two-Factor Authentication

Introduction

If the administrators responsible for securing applications had their way, passwords would be long complex strings of random symbols, and users would memorize different passwords for every application they use. But in the real world, few people are capable of such prodigious feats of memory. The typical user can only remember a handful of relatively short passwords.

That’s why an increasing number of applications are requiring two-factor authentication. In addition to asking for a password (something the user knows), applications can be configured to ask for a supplementary password delivered in real time via a device (something the user has). Two-factor authentication provides an extra layer of assurance that the person logging on to an application is, in fact, who he or she claims to be.
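
The second factor (something the user has) is often realized as a one-time code computed from a shared secret and the current time, as standardized in TOTP (RFC 6238). The sketch below illustrates that general scheme, not InterSystems’ implementation; the shared secret is a made-up demo value.

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238) for the current time window."""
    counter = int(time.time()) // interval           # current 30-second step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Server and device share `secret` out of band; both compute the same code
# for the current window, so a matching code proves possession of the device.
shared_secret = b"example-demo-secret"               # made-up value for the sketch
print(totp(shared_secret))
```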

This paper outlines how InterSystems supports two-factor authentication in all of our products.

Article · Oct 21, 2015 · 2m read

Addressing the Healthcare Connectivity Challenge: Selecting a Health Service Bus

Introduction

In healthcare, information accessibility can impact the outcome of a medical decision or the success of a bundled payment initiative. To ensure that the right information is available at the right place and time, healthcare organizations have typically used HL7® interface engines to share data among clinical applications. But the demands on healthcare information technology are changing so rapidly that these simple engines are no longer sufficient.

  • New data sharing and interoperability standards and protocols arise and evolve continuously
  • The volume, variety, and velocity of data — including images, genomic data, and other outputs from new diagnostics, therapeutics, and monitoring devices — are constantly growing
  • Business models are evolving from fee-for-service to pay-for-value, creating demand for greater care coordination across care teams that include the patient and health plans
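
For context on the traffic these engines handle, the sketch below parses a pipe-delimited HL7 v2 message into its segments. The message content is invented for illustration, and the toy parser keeps only the last occurrence of any repeated segment.

```python
# Invented ADT (admit/discharge/transfer) message; \r separates HL7 segments.
SAMPLE_ADT = "\r".join([
    "MSH|^~\\&|SEND_APP|SEND_FAC|RECV_APP|RECV_FAC|20151021||ADT^A01|MSG00001|P|2.5",
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19700101|F",
])

def parse_hl7(message: str) -> dict:
    """Split a pipe-delimited HL7 v2 message into {segment_id: field_list}."""
    segments = {}
    for segment in message.split("\r"):
        fields = segment.split("|")
        segments[fields[0]] = fields  # toy parser: repeated segments overwrite
    return segments

msg = parse_hl7(SAMPLE_ADT)
print(msg["MSH"][8])  # message type: ADT^A01
print(msg["PID"][5])  # patient name: DOE^JANE
```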

To keep pace with these demands and reduce the burden they place on in-house software development, many healthcare organizations have turned to enterprise service bus (ESB) products and their support for service-oriented architecture (SOA). Specifically, they have been looking to:

  • Make the organization’s data easier to capture and use
  • Deliver information into existing applications and clinician workflows for better care coordination
  • Keep up with rapidly changing healthcare communication and interoperability standards by leaving that task to the ESB vendor

The utility of ESBs and SOA is not limited to a specific industry. It is important to remember, however, that these technologies were developed to address the needs of e-commerce and online retail. They were not built with the special challenges of the healthcare environment in mind.
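
To make the ESB pattern itself concrete, here is a toy content-based router: incoming messages are inspected and dispatched to whichever handler is registered for their type. It sketches the general pattern rather than any particular ESB product; the handlers and message shapes are hypothetical.

```python
from typing import Callable, Dict

class ServiceBus:
    """Toy content-based router: the core ESB dispatch pattern."""

    def __init__(self) -> None:
        self.routes: Dict[str, Callable[[dict], None]] = {}

    def register(self, message_type: str, handler: Callable[[dict], None]) -> None:
        """Associate a handler with a message type."""
        self.routes[message_type] = handler

    def dispatch(self, message: dict) -> None:
        """Inspect the message and route it to the registered handler."""
        handler = self.routes.get(message["type"])
        if handler is None:
            raise ValueError("no route for message type: " + message["type"])
        handler(message)

bus = ServiceBus()
bus.register("ADT^A01", lambda m: print("admit patient:", m["patient"]))
bus.register("ORU^R01", lambda m: print("file lab result:", m["patient"]))
bus.dispatch({"type": "ADT^A01", "patient": "DOE^JANE"})  # -> admit patient: DOE^JANE
```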

Article · Oct 21, 2015 · 2m read

Why You Should Consider the Cloud

Introduction

By now, anybody working in the technology sector will have heard of Cloud computing. But the concept is attracting increasing attention outside of IT departments too, with growing recognition among board-level executives of the potential of this range of innovations. Senior personnel frequently hear stories about how the Cloud helps organizations reduce costs, boost efficiency, and expand their operations, so they are understandably eager to learn what the Cloud can do for them.

As a result, Cloud is one of the fastest-growing parts of the IT industry. Gartner forecasts that by 2016, this technology will make up the bulk of new spending in the sector. Meanwhile, International Data Corporation (IDC) predicts that in 2014, Cloud spending will surge by 25 percent, taking it past the $100 billion milestone for the first time. Consequently, the number of options available to businesses when it comes to choosing Cloud solutions and other related technologies is constantly growing.

Because the decisions a business takes now will affect its operations for years to come, migrating operations from on-premise networks to the Cloud is not a task that should be undertaken lightly. The key for many organizations will be knowing when they will benefit from making such a transition, what services they should be investigating and what potential risks they need to be aware of.

The choices available to organizations can be bewildering, particularly if they do not have a great deal of expertise in the area. There are Cloud-based as-a-service options available for almost any activity, including software (SaaS), infrastructure (IaaS) and platform (PaaS), as well as several others.

Determining which aspects of a business to place in the Cloud and which would be better served remaining on-premise is one of the first decisions a company must make. In addition to this, there are choices about whether to opt for private or public options, or even a combination of the two as part of a hybrid package.

This may seem like a lot to think about. But by asking a few basic questions at the start of the process about what they expect to achieve from the Cloud, organizations can identify the most appropriate tools and get their migration projects off on the right foot.

Article · Oct 21, 2015 · 2m read

Deploying an Elastic Data Fabric with Caché

Executive Summary

For twenty years or more, large financial institutions have been locked in a battle between the need for extremely high-performance transaction processing and the demands of downstream applications that can deliver competitive advantage if they can get real-time access to this transactional data. When individual database servers could no longer handle simultaneous transaction and query workloads, many firms turned to replication, offloading data access onto read-only copies of production databases. While this strategy worked well for smaller volumes, growing data volumes (years of accelerating trading velocity, more data sources) have stretched data replication architectures to the breaking point. Many are exhibiting problems with performance, scalability, manageability, maintainability, and data governance. In short, they have become an unacceptable risk to firms’ continued growth and market responsiveness.

InterSystems Caché offers a different approach, enabling customers to deploy an elastic Data Fabric that is massively scalable, runs on commodity hardware, and can be deployed in the cloud. Instead of replication, Caché uses its Enterprise Cache Protocol (ECP), which transparently delivers in-memory access speeds for massive amounts of data distributed in local- and wide-area configurations. To oversimplify, Caché replaces hordes of database replicas and database servers with a distributed, shared data cache designed for fault tolerance, data integrity, and linear scalability.
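
The sketch below illustrates only the general read-through pattern that underlies this approach: reads are served from a shared in-memory layer and fall back to the authoritative store on a miss. It is a conceptual stand-in, not a model of ECP’s distributed protocol or fault-tolerance machinery.

```python
class ReadThroughCache:
    """Toy read-through cache: serve reads from memory, fetch on miss."""

    def __init__(self, backing_store: dict) -> None:
        self.store = backing_store   # authoritative database (dict stand-in)
        self.cache = {}              # shared in-memory layer

    def get(self, key):
        if key not in self.cache:    # miss: one fetch from the store
            self.cache[key] = self.store[key]
        return self.cache[key]       # subsequent reads stay in memory

    def invalidate(self, key) -> None:
        self.cache.pop(key, None)    # drop a stale entry after an update

db = {"trade:1": {"symbol": "XYZ", "qty": 100}}
cache = ReadThroughCache(db)
print(cache.get("trade:1"))  # first read hits the backing store
print(cache.get("trade:1"))  # second read is served from memory
```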

Most importantly, Caché includes robust SQL capabilities so that existing SQL-based applications can use the elastic Data Fabric without disruption. And, because Caché natively delivers other powerful data paradigms – objects, documents, key/value pairs, and so on – the elastic Data Fabric also transparently enables future transactional and analytic development, even with Big Data, on an enterprise-grade massively scalable foundation.
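
As a minimal illustration of the multi-model idea, the sketch below reaches the same stored record through standard SQL and through a thin key/value wrapper. SQLite stands in for the database purely for illustration; the table and key names are hypothetical, and Caché’s native multi-model support is of course far richer.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id TEXT PRIMARY KEY, symbol TEXT, qty INTEGER)")
conn.execute("INSERT INTO trades VALUES ('t1', 'XYZ', 100)")

# Relational path: plain SQL, as existing applications already expect.
print(conn.execute("SELECT symbol, qty FROM trades WHERE id = 't1'").fetchone())

# Key/value path: a thin wrapper exposing the same rows by key.
def kv_get(key: str):
    """Fetch one record by key, hiding the SQL behind a key/value call."""
    return conn.execute(
        "SELECT symbol, qty FROM trades WHERE id = ?", (key,)
    ).fetchone()

print(kv_get("t1"))  # the same record through a different paradigm
```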
