API management is the process of publishing, promoting, and overseeing application programming interfaces (APIs) in a secure, scalable environment. APIs not only facilitate the integration of new features into existing applications but also provide standard specifications for accessing and sharing data through remote Web service calls, such as SOAP and RESTful APIs. API management includes creating end-user support resources that define and document the API, as well as setting attributes and parameters for publishers and subscribers. APIs are essential for establishing a virtualized data environment and an event-driven architecture (EDA). With proper API management, publishers make APIs available to subscribers, and those transactions are managed at the application level. Pushing this capability to the applications promotes higher performance, flexibility, and scalability in the architecture.
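The publisher/subscriber relationship described above can be sketched as a toy, in-process gateway. All names here (`ApiGateway`, `publish`, `subscribe`, `invoke`) are illustrative assumptions, not a real product's API; a production gateway would add authentication, rate limiting, and monitoring.

```python
# Toy sketch of application-level API management: publishers expose named
# APIs through a gateway, and subscribers invoke them with an API key.
class ApiGateway:
    def __init__(self):
        self._apis = {}          # API name -> handler function
        self._subscribers = {}   # API key -> set of API names the key may call

    def publish(self, name, handler):
        """Publisher makes an API available under a name."""
        self._apis[name] = handler

    def subscribe(self, api_key, name):
        """Grant a subscriber key access to a published API."""
        self._subscribers.setdefault(api_key, set()).add(name)

    def invoke(self, api_key, name, *args, **kwargs):
        """Managed transaction: check the subscription, then call the API."""
        if name not in self._subscribers.get(api_key, set()):
            raise PermissionError(f"key not subscribed to {name!r}")
        return self._apis[name](*args, **kwargs)

# Hypothetical publisher and subscriber:
gateway = ApiGateway()
gateway.publish("patient.lookup",
                lambda patient_id: {"id": patient_id, "status": "admitted"})
gateway.subscribe("key-123", "patient.lookup")
record = gateway.invoke("key-123", "patient.lookup", "P001")
```

Because the gateway mediates every call, the "transaction managed at the application level" reduces to one method (`invoke`) where access control and, in a fuller version, logging and throttling would live.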

Data virtualization is an approach to data management that allows an application to retrieve and manipulate data without needing technical details about that data, such as its location, structure, format, or storage technology. Data virtualization provides an abstraction layer that data consumers use to access data in a consistent manner. A data consumer is any application that retrieves or manipulates data, such as a reporting or data entry application. The abstraction layer hides all technical aspects of data storage. Data virtualization is typically implemented with an enterprise service bus (ESB) acting as the abstraction layer: applications access data by invoking Web services (i.e., SOAP or RESTful APIs) through the ESB. The ESB can also publish APIs and provide access to data in a cloud repository. Data virtualization is also used in conjunction with in-memory databases, where APIs allow analytics to be processed at the source so that only the results, rather than the underlying data, are returned. This provides significant performance and scalability advantages.
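A minimal sketch of such an abstraction layer, assuming invented class names (`VirtualDataLayer` and the two source adapters): consumers request a dataset by logical name, and adapters hide whether the data lives in memory or in a SQLite table.

```python
# Data-virtualization sketch: one access pattern, multiple storage backends.
import sqlite3

class InMemorySource:
    """Adapter for data already held in application memory."""
    def __init__(self, rows):
        self._rows = rows
    def fetch(self):
        return list(self._rows)

class SqliteSource:
    """Adapter for data stored in a relational database."""
    def __init__(self, conn, query):
        self._conn, self._query = conn, query
    def fetch(self):
        return [row[0] for row in self._conn.execute(self._query)]

class VirtualDataLayer:
    """Maps logical dataset names to physical sources."""
    def __init__(self):
        self._sources = {}
    def register(self, name, source):
        self._sources[name] = source
    def get(self, name):
        # The consumer never sees which backend answers.
        return self._sources[name].fetch()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT)")
conn.executemany("INSERT INTO claims VALUES (?)", [("C1",), ("C2",)])

layer = VirtualDataLayer()
layer.register("admissions", InMemorySource(["A1", "A2"]))
layer.register("claims", SqliteSource(conn, "SELECT claim_id FROM claims ORDER BY claim_id"))

admissions = layer.get("admissions")  # -> ['A1', 'A2']
claims = layer.get("claims")          # -> ['C1', 'C2']
```

Moving a dataset from one storage technology to another then means swapping the registered adapter; no consumer code changes, which is the point of the abstraction layer.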

EDA is a design pattern that builds on the fundamental aspects of Service-Oriented Architecture (SOA): event notifications are transmitted between decoupled software components and services to enable immediate information dissemination and reactive business process execution. In an EDA, information can be propagated in near real time throughout a highly distributed environment, enabling the organization to respond proactively to business “events.” (Examples include submission of a new patient admission form, payment of a claim, approval of a procedure, and physician orders.) Because EDA uses asynchronous messaging to communicate between application processes, senders do not block waiting for responses, making it considerably more efficient than synchronous SOA request/response interactions. Within an EDA, business processes are modeled as discrete state transitions (rather than sequential process workflows), with event-based triggers and decoupled interactions. EDA relies on data sources sharing a common gateway or ESB. EDA also supports and extends the following:

  • Enterprise application integration—emphasis on reuse, interoperability and scalability, open standards, and shared services
  • Common development platform—a stable foundation for agile development and rapid deployment of business, operational, and clinical applications
  • Business process management—flexible design patterns, business rules inventory, and process definitions for workflow automation
  • Technical interoperability—standard ESB, data services layer, asynchronous messaging, and publication and subscription of transactions
  • Best practices—data management, access, quality, and stewardship
  • Common data model—metadata and master data management to achieve a virtualized data environment
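The decoupled publish/subscribe interaction at the heart of EDA can be sketched as a small event bus. The class and topic names below are illustrative; a production bus would queue events asynchronously (e.g., over an ESB or message broker) rather than dispatch them in-process.

```python
# Pub/sub sketch of EDA: handlers subscribe to event topics and react when
# a notification is published; the publisher never knows who is listening.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._handlers = defaultdict(list)  # topic -> subscribed callbacks

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, payload):
        # Decoupling: components react independently to the same event.
        for handler in self._handlers[topic]:
            handler(payload)

reactions = []
bus = EventBus()
# Two independent components react to the same hypothetical business event:
bus.subscribe("patient.admitted", lambda e: reactions.append(("audit", e["id"])))
bus.subscribe("patient.admitted", lambda e: reactions.append(("billing", e["id"])))
bus.publish("patient.admitted", {"id": "P001"})
# reactions -> [('audit', 'P001'), ('billing', 'P001')]
```

Adding a new reactive behavior (say, a notification service) is just another `subscribe` call; neither the publisher nor the existing subscribers change, which is what makes state-transition-driven processes easier to extend than sequential workflows.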

Areas of Expertise