An Introduction to AS4: A B2B Integration Standard That’s Low on Frills, High on Functionality

In the beginning was EDI, and B2B integration standards were born, and it was good. And the value-added networks that carried the EDI settled over businesses everywhere, and that was good, too (if more than a little expensive). And then came the Internet, cheap data transport, and with it EDIINT – a leaner, meaner integration standard that made for lightweight, Internet-based integration.

Pretty soon almost every business had migrated to the Internet for sending and receiving B2B data, and AS2 (a version of EDIINT) became the ruling standard – until web services came along, a standard unto themselves for platform-agnostic interfaces between systems.

Confused yet? You’re not alone. The world of B2B integration standards is no longer as simple as it once was, and it has evolved rapidly. But it needs to be simple again, because B2B integration is no longer a luxury – it’s a mission-critical necessity, in the Internet-driven, demand-driven, ad hoc business universe that now contains all commerce.

The size of the door

A lean and mean standard for mapping business communications – one that can be rapidly implemented and easily supported – makes all the sense in the world, because it makes B2B integration practical (and affordable) for businesses of all sizes. Where, then, is the problem?

The problem is that web services – the doorway into and out of organizations doing B2B with other organizations – are robust and platform-agnostic, but also very complicated, because web services handle data transfers of many different kinds. This complexity is a barrier in itself, because a great deal of detail must go into sculpting a particular transaction to work correctly within a web service.

AS2 accommodates that complexity, but that makes AS2 itself very complicated. While it may be lean and mean on the data formatting side (which is its most important feature), the protocol side is clunky and difficult.

Think of it this way: web services are doors, yes, but in practice they resemble bank vault doors – heavy, complicated, hard to open and close correctly. That’s what you want, when you’re trucking in a great deal of valuable content.

But what if the content is bite-sized, immediate, simple? Isn’t it a waste of time and brainpower to figure out how to get small and numerous chunks of data through that ungainly door?

Think now of what most organizations really need: a doggy door. Small, simple, sized to admit only what’s important and nothing else.

Enter AS4

AS4 is everything its predecessor is, in terms of B2B integration. The utility that services business document sharing in AS2 is all there in AS4.

But AS4 is the answer to the doggy-door problem. The bank-vault complexity of web services messaging – which in the B2B world emerges from a broad and complicated technical specification called ebMS, the ebXML Messaging Service – is bypassed by AS4. The AS4 profile uses, and allows for, only those features of ebMS that are really necessary for conventional B2B integration. All the other forms of data transfer that ebMS accommodates are simply ignored. Comparable machinery exists, unnecessarily, in AS2, making AS2 somewhat exasperating in practice. In AS4, all of that exasperation goes away, because AS4 requires only the basics to get in and out of the web services doorway.

AS4 is payload-agnostic – meaning it can carry any type of business document, and the transport protocol doesn’t care what it is. A single AS4 message can carry multiple payloads (a must for effective B2B integration). It is friendly to a wide range of security specifications, and it supports business receipts (notification of the disposition of a message once it’s been sent). And it supports ebMS’s One-Way/Push and One-Way/Pull exchange patterns: the first allows transactions to be either synchronous or asynchronous; the second lets endpoints that can’t accept inbound connections pull messages when they’re ready.
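To make that concrete, here is a minimal sketch in Python (using the standard library’s ElementTree) of the kind of ebMS 3.0 header an AS4 user message carries: a sender, a receiver, and one part reference per payload, with no assumptions about what those payloads contain. The namespace URIs, party identifiers, and content-IDs below are illustrative assumptions, not a reference implementation; a real AS4 stack adds MIME packaging, signing, and receipts on top of this.

    import xml.etree.ElementTree as ET

    # Namespace URIs (the ebMS 3.0 core namespace as commonly published; shown for illustration).
    EB = "http://docs.oasis-open.org/ebxml-msg/ebms/v3.0/ns/core/200704/"
    SOAP = "http://www.w3.org/2003/05/soap-envelope"  # SOAP 1.2

    def build_user_message(sender, receiver, payload_cids):
        """Build a bare-bones eb:Messaging header referencing one or more payloads."""
        envelope = ET.Element(f"{{{SOAP}}}Envelope")
        header = ET.SubElement(envelope, f"{{{SOAP}}}Header")
        messaging = ET.SubElement(header, f"{{{EB}}}Messaging")
        user_msg = ET.SubElement(messaging, f"{{{EB}}}UserMessage")

        # Who is sending and who should receive; AS4 doesn't care what the payloads are.
        party_info = ET.SubElement(user_msg, f"{{{EB}}}PartyInfo")
        ET.SubElement(ET.SubElement(party_info, f"{{{EB}}}From"), f"{{{EB}}}PartyId").text = sender
        ET.SubElement(ET.SubElement(party_info, f"{{{EB}}}To"), f"{{{EB}}}PartyId").text = receiver

        # One PartInfo per payload: a single AS4 message can reference several documents.
        payload_info = ET.SubElement(user_msg, f"{{{EB}}}PayloadInfo")
        for cid in payload_cids:
            ET.SubElement(payload_info, f"{{{EB}}}PartInfo", href=f"cid:{cid}")

        return envelope

    # Hypothetical example: one message carrying an invoice and a despatch advice as MIME parts.
    env = build_user_message("urn:example:buyer", "urn:example:seller",
                             ["invoice-001@example.com", "despatch-001@example.com"])
    print(ET.tostring(env, encoding="unicode"))

The point of the sketch is the shape of the message, not the plumbing: the header simply says who is talking to whom and which attached parts belong to the exchange, which is why the protocol can stay indifferent to the documents themselves.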

Put simply, AS4 is following in AS2’s footsteps. The latter simplified the lumbering giant of EDI and made it Internet-friendly, an essential step in the evolution of B2B integration. AS4 has in turn simplified AS2, and made the sharing of messages friendlier still – which removes one of the last excuses for not implementing B2B integration.

Is Standards Adoption Keeping Up With the Pace of Healthcare Evolution?

The Health Information Exchange is a key enabling technology in healthcare reform in the United States, and conceptually a seminal step in B2B partner integration. Its intent, to standardize and unify the management of private clinical data, is not only progressive but essential in the emerging architecture of healthcare systems that must be efficient, highly responsive, and pervasive across the industry and its ancillaries.

But the evolution of HIE standards is outpacing the speed with which healthcare partners can adapt, creating bottlenecks in industry adoption of the improved methodologies needed to support the provisions of the Affordable Care Act.

The need for innovation was stressed at this year’s Healthcare Information and Management Systems Society (HIMSS) conference, held in Orlando in February. Keynote speakers addressed the issue in a session titled “Innovation Drives Value in HIE.”

“There should be a free and easy exchange of [personal health] information,” said Brian Patrick Kennedy, health reform advocate and Rhode Island state representative, “because if you need to go in and see a doctor, and another doctor has previously seen you about something else, that information should all be together in that medical record so that you don’t have to have another procedure, for example, or another test…” Kennedy delivered the keynote remarks that opened the conference along with Dr. Michael Hankins.

“One of the key benefits of the whole health information exchange is that the portability of that record, that brings along all the information that’s in that record, so that when you go to see a doctor or a specialist…all of that information is gathered together.”

Included in that clinical data is information useful to every participant in the delivery of healthcare: the patient, clinicians, specialists, hospitals, suppliers, pharmacies, and health insurance companies.

HIE standards, useful as they are, nonetheless present a huge challenge to many participants in integrated healthcare, due to the rapidity with which they are evolving. Each year, HIMSS reports on industry uptake of the revised standards – and, each year, many entities in the healthcare delivery system remain behind.

An example of this is the Stage 2 Meaningful Use Rules, which represent a major step forward in secure information exchange between healthcare delivery participants, including common implementation specifications for electronic information exchange; rules for formatting structured data; coding for procedures, medications, lab results, diagnoses, and other clinical data; status codes; care plans; and medical diagnostic data. (The Meaningful Use adoption deadline is July 1.)

But implementing the standard is easier said than done. Stage 2 requires providers to complete two years of participation in conformity with the standard; but enough participants are lagging behind in implementation that Stage 2, which commenced over a year ago, has been extended into 2016.

According to the US Department of Health and Human Services, barriers to implementation include restrictions imposed upon providers who do not meet the required security standards; lack of reliable source data from government organizations feeding data into the system; and the failure of some providers and participants to meet eligibility requirements for participation.

But one of the biggest shortfalls, according to a survey of CIOs in health information management, is that HIE standards have outpaced the evolution of computerized provider order entry systems. The computer systems in the offices of doctors and specialists – the initial point of entry for medical records – do not measure up to the standards now in place among other participants in healthcare delivery.

“Policymakers and other stakeholders should consider strategies that maintain the critical elements of Meaningful Use,” the survey read, “while adequately supporting hospitals that desire to become Meaningful Use [compliant] but are impeded by specific technological, cultural and organizational adoption and use challenges.”

With the Meaningful Use adoption deadline looming, many organizations are scrambling to complete technological upgrades in order to come into compliance; many will succeed, some will not. But the impetus – the advancement of industry initiatives accommodating healthcare reform in general and the Affordable Care Act in particular – is strong, and the technological momentum is undeniable. HIE has assumed a unique and prominent stature among B2B integration initiatives.

The evolution of health information standards is greatly increasing interoperability, and in upcoming phases of its evolution will enable new models of care delivery. Despite the growing pains, HIE is providing a positive demonstration of the complexity and high standards that integrated partnership can accommodate.

Is the Cloud the Place to Go for B2B Collaboration?

The cloud seems ideal for many things, and high on the list is easy sharing of data between partners. Far more than EDI, XML or integration engines, shared cloud resources seem to solve the integration problems that B2B partners commonly face.

Or so it seems, in theory. While it’s true that an easily accessed, secure, shared storage resource is theoretically a boon for partners swapping logistics and analytics data, there’s a dark side to hosted storage that can be a serious concern, and even a deal-breaker.

Cloud Trade-Offs

The upside of cloud-based resources, from a B2B standpoint, is centralization – the gathering of shared resources in a single location, which eliminates the need for partners to pitch and catch data (a troublesome and expensive exercise). At a stroke, this centralization removes most (though not all) of the headache of data-sharing. And while it’s true that one partner in a supply chain could step up and provide such a resource, it’s more convenient – and more economical – to simply lease the space and not upset the balance of power. Infrastructure maintenance and server administration are somebody else’s problem, and the supply chain partners have no burden bigger than divvying up the bill.

The problem with that scenario is that freedom from responsibility for server administration and maintenance also means loss of server control, and that’s where the supply chain can get into trouble.

Cloud-based collaboration can be a wonderful thing, when everything is up and running and going right. But it can be terrible when things go wrong.

Crash Time

When things go wrong on the ground, armies mobilize. Service level agreements between supply chain partners tend to be intricate and well-considered, because any one glitch in a process can cause a chain reaction, interrupting the operations of every participant. When something does go wrong, every partner managing a resource can be counted on to mitigate the damage, for the good of all.

But in the cloud, outages and interruptions are administered independently, by the provider. That blessing becomes a curse, because many carefully negotiated SLAs now boil down to one, and that one is rendered by a disinterested third party.

This is where the happy theory of cloud-based supply chain collaboration comes off the rails. B2B integration, now more than ever, requires rapid response, with carefully tiered, data-driven mitigations ready to go. In the cloud, that capability largely disappears.

Azure, For Example

Microsoft’s cloud platform showcases the dangers of surrendering server control. Take Office 365, for instance, which hosts SharePoint Online – in principle an ideal collaboration platform for B2B partners: the SLA does not extend to root cause analysis. Microsoft support will see a crash through to resolution, but no farther, unless it affects many or most tenants.

Now, that’s bad enough, in itself; but Microsoft also denies access to the logs that would allow a B2B maintenance team to troubleshoot the crash themselves.

If a B2B-integrated supply chain were to trust its collaborative logistical data to such a platform, and an unexplained crash led to an outage – interrupting operations for one or more supply chain participants – all partners in the chain would remain in limbo, in terms of understanding what went wrong and why. Microsoft would get things rolling again, but that would be it. It could happen again. And again.

It isn’t overstating the case to say that the supply chain would be at the host’s mercy.

How It Should Work

In principle, the cloud is a great place for supply chain partners to host B2B integration resources. And, of course, data that isn’t time-sensitive can easily be hosted that way. But for cloud-hosted real-time B2B integration to be practical, immediate mitigation of unexpected crashes, service interruptions, and other calamities must be under the control of the supply chain partners themselves.

Why, exactly, do most cloud platforms refuse this control? Because multi-tenant environments have to be extremely efficient and scalable to be useful at all, and both per-tenant administration and customization work against that efficiency and scalability. This makes for a “fast food” style of hosting – a limited menu of capabilities and only generic, client-by-client support. And that’s before we even begin discussing disaster recovery.

It’s easy to wag a finger at Microsoft (or any cloud provider) over such tight policy, but in their defense, this is just the state of the art as it is today. Despite Steve Ballmer’s promises over the past few years, the Azure platform remains unfriendly to both customization and truly remedial support. And it’s easy to see that a big reason for his promises remaining unfulfilled is the rapid growth of the platform.

For now, the best B2B integration solution is flexible, highly configurable in-house resources within each partner company – and a strong collaborative attitude among the partners.

How Technology Can Support Strategic B2B in 2014

Recently, IDC released its annual end-of-year predictions for manufacturing in 2014. The research firm predicts that a convergence of operational, information, and consumer technologies will push organizations to “reshape approaches to information management.”

This shift will require modernizing the underlying B2B commerce backbone, and IDC predicts 2014 will be the year IT makes these investments a major priority.

Here’s why B2B leaders may want to update B2B technologies, including integration and automation tools, to support these emerging trends:

CFOs will collaborate more with supply chain leaders to create a more strategic B2B organization. CFOs love hard numbers, particularly money-related metrics such as time-to-pay, order turnaround, and supplier performance. While supply chain leaders may monitor some of these KPIs, CFOs will expect the data to be precise and comprehensive.
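For illustration only, here is a minimal sketch of how two of those numbers might be computed from plain order records; the field names and sample dates are assumptions, not a real schema.

    from datetime import date
    from statistics import mean

    # Hypothetical order records: when the order was placed, shipped, and paid.
    orders = [
        {"ordered": date(2014, 1, 6),  "shipped": date(2014, 1, 9),  "paid": date(2014, 2, 3)},
        {"ordered": date(2014, 1, 13), "shipped": date(2014, 1, 20), "paid": date(2014, 2, 24)},
    ]

    # Order turnaround: days from order placement to shipment.
    turnaround_days = mean((o["shipped"] - o["ordered"]).days for o in orders)

    # Time-to-pay: days from shipment to payment received.
    time_to_pay_days = mean((o["paid"] - o["shipped"]).days for o in orders)

    print(f"Average order turnaround: {turnaround_days:.1f} days")
    print(f"Average time-to-pay: {time_to_pay_days:.1f} days")

The precision CFOs expect comes less from the arithmetic than from the completeness of the underlying records, which is exactly what integrated, automated B2B data capture provides.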
