
Process Modeling for e-Business

Advantages, challenges, caveats, and implications

Those who haven't been involved in EDI standards development might be surprised to learn that process modeling has long been a contentious topic. After being studied, developed, and promoted for several years, it recently became a key component of UN/CEFACT's work program. ANSI ASC X12 also decided in 2000 to incorporate modeling using CEFACT's methodology, though without the "must do" mandate that CEFACT has imposed on its constituent work groups. CEFACT carried this modeling emphasis over into ebXML, and business process analysis using modeling techniques and methodologies is a core part of ebXML. However, there are still many people in both CEFACT and X12 who are not particularly enthusiastic about modeling and who question its usefulness. The controversy has many sources, but the primary one is the tension between two basic facts about the current state of software engineering:

  1. Business process analysis is "good" software engineering.
  2. Very few organizations place a high priority on "good" software engineering.

For those for whom these facts aren't obvious, I'll elaborate.

Good engineering

One of the foundation principles of software engineering is that it is much easier and less costly to correct requirements mistakes in the analysis phase of a project than in the later phases of design, coding, and testing. Good analysis is of paramount importance in the efficient production of software systems that satisfy user requirements. Analysis techniques have evolved considerably over the years. Today's prevalent technique, the Unified Modeling Language (UML), is an object-oriented approach that incorporates key features of earlier techniques (such as data flow diagrams and state charts) and adds its own object-oriented viewpoint, which meshes very well with today's prevalent OO design and programming approaches. Likewise, today's analysis methodologies, such as UN/CEFACT's UMM (the UN/CEFACT Modeling Methodology N090, based on Rational Software's Rational Unified Process), are grounded in years of experience. Consistency in techniques and methodologies not only forces discipline on analysts and produces better analysis, but also produces analysis in a form that can be more easily validated by subject matter experts and used by software developers. Moreover, in the classical software development model the analysis is kept distinctly separate from architecture and design. Besides making both the analysis and the design more understandable by not mixing them up, this separation allows the consideration of several different implementations that might satisfy the most important functional requirements; one can then be chosen that best satisfies the nonfunctional, or quality, requirements. Like Mom and apple pie, it is very hard to argue against these approaches to analysis.

Good engineering is expensive

On the other hand, very few organizations place a high enough priority on "good" software engineering to pay for it. Several years ago, when I worked for Digital Equipment, I occasionally taught project management and led software process improvement projects. We used to quote studies by Carnegie Mellon University's respected Software Engineering Institute (www.sei.cmu.edu), which found that over 95% of the development organizations they assessed and surveyed had practices best characterized as "anarchic and chaotic". In the framework of the SEI's five-level Capability Maturity Model, organizations at this lowest level basically can't do something the same way twice. In the years since, I have seen very little among the organizations for which I've consulted to suggest that this is no longer generally the case. Organizations pay lip service to "good" engineering, but more often than not they cut corners due to schedule or budget constraints. The ebXML initiative itself was a classic example of this phenomenon: the accepted practice of doing requirements, followed by architecture, followed by detailed design was collapsed, with all of the teams starting at the same time. The ostensible reason was that the aggressive eighteen-month schedule didn't permit us to do things the "right" way. Of the few organizations that seriously consider significant process improvement projects, most run into trouble because management doubts that the long-term return will offset the high initial investment.

 

Except for the rare few who have ISO 9000 or other such quality-control criteria, people who purchase software generally don't know and don't care how it was developed. They just want it to perform their required functions, at a reasonable price, with a reasonable degree of reliability (tired of Win9X crashes, yet you still run it?). Only parts of NASA, developers of control software for nuclear power plants, certain defense contractors, and a handful of similar organizations care enough about "good" software engineering to actually do it. Most other organizations that develop software just barely get by.

 

This is not to say that organizations that don't do "good" engineering do no analysis at all; it just means that they probably don't do "good" analysis. Ever seen a product fail in the market because its features couldn't match or exceed the competition's? How about a program that does exactly what was specified, but still doesn't do what you wanted it to do? Both are failures of analysis. How many times have you seen major IT initiatives fail because they didn't deliver the expected cost savings or didn't come in on time? These are generally problems of execution, but they can often still be traced back to not adequately identifying priorities during the analysis phase.

It's not just about the process

Aside from these two basic facts, there are some other factors that complicate the picture. The two key themes that run throughout computing are data and processing (remember when it used to be called "data processing"?). At the machine level we have memory stores and instructions. In procedural programming we have data structures and algorithms. In the object-oriented world, objects have attributes and methods. The same is true of the various approaches to analysis. In the past, the EDI standards bodies were concerned only with the data portion of analysis. This led to problems in understanding how EDI messages are to be used. It also led to some "bad" message design, with some messages becoming very complicated because they were designed for more than one business purpose. Adding process to the analysis (and putting a formal process in place where there was only an informal one!) is a significant improvement in how B2B standards are developed. Business process analysis completes the picture by showing the contexts and steps in which messages are exchanged. However, there is a limitation to the benefits: as a general rule, BP analysis doesn't help very much in identifying the contents of messages. In the extreme case of tax or regulatory reporting, the data requirements are set by statute or other government mandate. In other cases, such as procurement, the data required by the exchange is dictated by the application systems of the buyer and seller, which are themselves mostly based on long-established custom. One determines those data requirements by examining system user documentation that is probably not in the form of UML models. The exception to this rule is redesigning processes or designing new ones, which usually also entails significant changes to internal systems to support the new external processes. In these cases, BP analysis may be of benefit in identifying message contents. However, it probably won't be sufficient to model only the process that occurs between two organizations; it will likely be necessary also to model internal processes and the data expected from external systems.

It's none of your business

CIO Insight recently ran a critique of RosettaNet entitled "Deciphering RosettaNet" that discussed its adoption problems in detail. It pointed out that companies are reluctant to change their business processes to conform to "standard" processes defined in models; they see competitive advantage in doing things their own way, even if it means that other companies' systems must be more complicated to handle their way of doing things. Beyond the RosettaNet example discussed in the article, companies are sometimes very reluctant to reveal how they do business for fear that competitors will copy their processes. It is not uncommon to find that EDI implementation guides are considered proprietary and have restricted distribution.

There is no "standard"

The upshot of this is that it is difficult to define a "standard" process that everyone will adopt. Years of experience in EDI have shown us that companies persist in requiring a set of information in a business document that varies at least slightly from what other companies require. In an XML environment it's widely assumed that there won't be a single "standard" schema for a purchase order, but rather a multitude of company variations on that "standard", each tailored to reflect a company's mandatory data. However, we can move to a higher level of abstraction and consider just the steps in the process, not the data exchanged at each step. At this level there will probably be a great deal more uniformity in processes; this is the level at which the ebXML BPSS operates. But even with the greater uniformity, it's likely that companies will still have a few tweaks of their own. On top of this, there is the question of SME systems being able to support part of a process but not the complete process. For example, they may be able to receive purchase orders and generate invoices, but not generate advance ship notices. Does this mean that they only partially support the process, or does it become a different process? A small sketch of the data-level tailoring follows.
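As a minimal sketch of that data-level tailoring (in Python, with entirely hypothetical partner names and field names), consider a bare "standard" purchase order on which each trading partner layers its own mandatory-data rules:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class PurchaseOrder:
        # The bare "standard": partner-specific fields are optional here.
        po_number: str
        buyer_id: str
        seller_id: str
        department_code: Optional[str] = None  # some buyers demand this
        contract_ref: Optional[str] = None     # others demand this

    # Each partner's tweak on the "standard" -- hypothetical examples.
    PARTNER_REQUIRED = {
        "ACME": ["department_code"],
        "GLOBEX": ["contract_ref", "department_code"],
    }

    def missing_fields(po, partner):
        # Which of this partner's mandatory fields did the PO omit?
        return [f for f in PARTNER_REQUIRED.get(partner, [])
                if getattr(po, f) is None]

    po = PurchaseOrder("PO-1001", "GLOBEX", "ACME-SUPPLY", department_code="D7")
    print(missing_fields(po, "GLOBEX"))   # ['contract_ref']

A single schema can validate the shared core, but each trading relationship still needs its own mandatory-field check; multiply that by hundreds of partners and the "multitude of variations" problem becomes clear.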

So where does this leave us?

CEFACT is to be commended for trying to improve the way in which analysis is performed for developing EDI (and more broadly B2B) standards. The issues really come down to emphasis, real benefits, and the appropriateness and feasibility of their particular approach.

 

First, let's look at emphasis. One can certainly get the impression that all CEFACT cares about is the models, and that everything else is secondary. I get different readings on this depending on whom I talk with and how I interpret what I read. The bottom line, however, is that systems don't care about models. What's primary are the interfaces, defined at the technical level, and the protocols between systems. In B2B and EDI land, that means the message definitions or XML schemas, and the business process "choreography", i.e., the sequence in which the messages are exchanged. At the operational level, the particular techniques and methodologies used to develop these are completely irrelevant.
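To ground what "choreography" means at the operational level, here is a minimal sketch, again in Python with hypothetical message and state names (not drawn from any published standard), expressing a message-exchange sequence as a simple state machine. Note the path that skips the advance ship notice, the SME situation raised earlier:

    # Legal orderings of messages between two parties; systems care about
    # this and the message definitions, not the models that produced them.
    CHOREOGRAPHY = {
        "START":    {"PurchaseOrder": "ORDERED"},
        "ORDERED":  {"POAcknowledgment": "ACKED"},
        "ACKED":    {"AdvanceShipNotice": "SHIPPED",
                     "Invoice": "INVOICED"},  # SME path: no ASN
        "SHIPPED":  {"Invoice": "INVOICED"},
        "INVOICED": {},  # terminal state
    }

    def valid_sequence(messages):
        # Walk the exchange, rejecting any out-of-sequence message.
        state = "START"
        for msg in messages:
            if msg not in CHOREOGRAPHY[state]:
                return False
            state = CHOREOGRAPHY[state][msg]
        return state == "INVOICED"

    print(valid_sequence(["PurchaseOrder", "POAcknowledgment", "Invoice"]))  # True
    print(valid_sequence(["PurchaseOrder", "Invoice"]))                      # False

Whether the three-message SME path is the same process partially supported, or a different process entirely, is exactly the question posed above; the state machine itself is agnostic.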

 

As noted above, there are overwhelming benefits to good analysis. There are also benefits to using business process analysis for B2B standards, although with some limitations, and some further benefits to the particular approach promoted by ebXML and CEFACT. Using one particular technique (UML) with a specific set of expected deliverables (the UML profile, or UMM metamodel) lends consistency, which in turn leads to more understandable standards. Another purported benefit of complying with the UMM metamodel is that models which conform to it can be compared to determine whether they describe the same business process. Although this may sound nice in theory, it will probably be of little benefit to most companies involved in e-Business. The "golden rule" applies, i.e., he who has the gold makes the rules: sellers generally conform to the processes dictated by buyers.

 

UML, when used as proposed by ebXML and CEFACT, can also be used to develop XML schemas. One can develop and view XSD complex types as object class hierarchies in UML. For many designers this may be more intuitive than tracing extension upward from derived types in XML-specific tools such as XML Spy. However, non-standard extensions to UML may be required if XML schemas are to be generated directly from the UML, and developers who work primarily in an XML environment may find that the overhead of using a UML tool in addition to their XML tools is not justified. An ebXML BPSS instance document describing a business process can also be generated from a UML model. It is notable, however, that the ebXML BP team didn't want to force people to use UML; they instead provided a series of worksheets that enable a BPSS document to be created without doing a full UML model. At any rate, I've noted previously that the BPSS document, regardless of how it's created, is probably going to be of little use to SMEs.
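As an illustration of the schema-design point (a sketch only, with made-up type names, not an extract from any real schema), the XSD pattern of deriving a complex type by extension corresponds naturally to the kind of class hierarchy a UML tool displays:

    from dataclasses import dataclass

    @dataclass
    class AddressType:
        # corresponds to <xsd:complexType name="AddressType">
        street: str
        city: str

    @dataclass
    class USAddressType(AddressType):
        # corresponds to <xsd:extension base="AddressType">
        state: str
        zip_code: str

    # In a class diagram the derivation is visible at a glance; in an
    # XML-specific editor you trace each xsd:extension reference by hand.
    addr = USAddressType("123 Main St", "Springfield", "TX", "75001")

In a class diagram the whole derivation tree is on one page, which is the intuitiveness argument; the trade-off is maintaining a UML model alongside the schema files themselves.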

 

Finally, we need to consider the UMM as a whole. The full set of documents is several hundred pages long, and it is mostly concerned with just the analysis phase of software engineering. If we look at past experience with complex, multi-volume methodologies (such as Andersen Consulting's Method/1), we find that they generally don't work very well. People find them very hard to follow to the letter. One of the main problems is that one size doesn't fit all: most of them don't scale down well to small projects, and many B2B exchanges and processes fall into that category. XP (Extreme Programming, not the new M* operating system) is only the latest in a series of attempts to formalize somewhat less disciplined and simpler development practices. Even such luminaries as Ed Yourdon (in his book "Death March") are now advising people to throw away the book and do what is expedient.

 

For CEFACT, the UMM is a big improvement over having practically no formal analysis process. However, it may be a case of trying to go too far too fast. The bottom line is that even CEFACT work groups, which have a mandate to use the UMM, are probably going to find it hard to use; the benefits will come at a fairly high price. Other organizations may find the UMM helpful as a reference and may see benefit in developing some or even most of the components required by the UMM metamodel. But I think it unlikely that any organization outside of CEFACT will adopt the UMM in toto, and even those that do are almost certainly not going to follow it to the letter.

 

What is needed to make this all practical and feasible for individual companies (and smaller standards organizations) is a scaled-down version of the UMM, say no more than twenty or thirty pages' worth. This would most likely require a scaled-down version of the metamodel as well. Something like this was started in X12 this past year, though it has yet to be rolled out. If CEFACT's ebTWG or TMWG don't already have something like this in mind, they should. The ebXML "Business Process Analysis Worksheets and Guidelines" is a worthwhile contribution and the closest thing we now have to a "UMM for Dummies". But I'm concerned that even it is not quite what is needed. It packs a lot of information and examples into over 100 pages of text, yet still frequently has to deal with important topics by offering only a reference to the appropriate part of the UMM. This is really not the fault of the ebXML BP team; the problem is that the UMM is too complex. In this regard the ebXML document is right on target and even somewhat (intentionally?) amusing: it draws an analogy between the BP worksheets and guidelines and U.S. Internal Revenue Service tax forms and instructions, and compares the UMM to the U.S. tax code.

 

When we consider the caveats and obstacles to feasibility, one must ask whether BP modeling, particularly as promoted by ebXML and embodied in CEFACT's UMM, is really the right approach for developing e-Business standards and systems. This approach tends to view building standards and systems for the world of commerce the same way we approach building bridges or buildings: design and build once, and then it stays static (at least for a while). The real world of commerce, on the other hand, has a lot more in common with a living organism or ecosystem than with a building. Commerce evolves, relationships change, buyers and sellers come and go, the weak and inefficient get eaten by the strong, and the small and nimble evade the larger predators. There is some very interesting work being done at the Santa Fe Institute on complex adaptive systems (also often referred to as self-organizing systems). This field of thought, an outgrowth of their earlier work on chaos theory, involves applying models found in living organisms to disciplines as diverse as computer network design and economics. I think that in the long term this kind of approach to e-Business will probably be more fruitful. However, the discipline of complex adaptive systems is still in its infancy, and I know of no efforts yet to apply it to e-Business. For now, even with their problems, UML and methodologies like the UMM are probably close to the best that we can do.

 

So, about all I can recommend is that you decide for yourself the best way for your organization to pursue the "process" part of analysis, given your own unique situation. Some may go for full-blown UMM while others may decide to keep sketching their analysis and design on the backs of napkins. I expect that most will fall somewhere in between.

 

My next article, the next-to-last in the series, will deal with the "data" part of analysis: ebXML's Core Components.

 

December 13, 2001

© Michael C. Rawlins