Books on process modeling generally warn against getting bogged down in detail. They tend to recommend a top-down methodology that starts with a big-picture end-to-end view and drills down just as far as you need for your modeling purpose. In preparing my Process Modeling with BPMN training I stumbled across a pretty good recipe for how to do this in Workflow Modeling by Sharp and McDermott - one of the few books on the topic I can recommend. My training overlaps slightly with the information-gathering phase that Sharp and McDermott describe, but mainly focuses on how to put that knowledge into BPMN, so you can share it, run it through a simulation engine, or even generate an executable implementation.
Subprocesses are a very handy concept in BPMN. Besides providing a natural way to draw a condensed top-down view with drill-down to any level of detail, BPMN subprocesses also define the scope within which an event - a change to an in-flight order, for example - can be received and handled. But I find that one really basic property of BPMN subprocesses gets in the way of top-down modeling as described by Sharp and McDermott, or by anyone else for that matter.
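To make the scoping point concrete, here's a toy sketch in Python - my own illustration, nothing from the BPMN spec or any tool, and the names are hypothetical: an event handler attached to a subprocess only fires while that subprocess is active.

```python
# Toy illustration of BPMN-style event scoping (hypothetical names):
# a subprocess defines the span during which an attached event, such
# as "order-changed", can actually be caught.

class Subprocess:
    def __init__(self, name, handlers=None):
        self.name = name
        self.handlers = handlers or {}   # event name -> reaction
        self.active = False

    def __enter__(self):
        self.active = True               # event scope opens
        return self

    def __exit__(self, *exc):
        self.active = False              # event scope closes
        return False

    def on_event(self, event):
        if self.active and event in self.handlers:
            print(f"{self.name}: handling '{event}'")
            self.handlers[event]()
        else:
            print(f"{self.name}: '{event}' ignored (out of scope)")

fulfill = Subprocess("Fulfill Order",
                     handlers={"order-changed": lambda: print("  -> redo picking")})

fulfill.on_event("order-changed")        # ignored: subprocess not yet active
with fulfill:
    fulfill.on_event("order-changed")    # handled: scope is open
fulfill.on_event("order-changed")        # ignored again: scope closed
```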
One of the reasons for my absence from the blogosphere this month is that I've been heads-down putting together my training on Process Modeling with BPMN. But when I talk to vendors about what I'm trying to do - teach business analysts how to use things like gateways and events correctly and with a recognizable business purpose - they kind of chuckle at my quixotic delusion. You'll never get business analysts to understand that stuff, they say. But I'm still of the mind that the main reason business people don't quite get BPMN yet is that the tool vendors themselves don't follow the spec. Come on, guys, it's not that hard.
A couple of months ago I posted about the deficient simulation capabilities of most process modeling tools. More recently I've been working with ITP Commerce - the tool provider for my upcoming BPMN training - on enhancing their simulation features to address the use cases that figure most prominently in process analysis. It's been a really instructive exercise, and I'm still struggling through it. I'm interested in hearing about the experience of BPMS Watch readers who have used simulation successfully. From my own thinking about the problem, here's what I've come up with so far.
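For readers who haven't worked with simulation, here's the flavor of what a process simulator computes - a minimal single-activity sketch of my own in Python, emphatically not ITP Commerce's engine, with made-up parameters: random instance arrivals, a capacity-one resource, and the resulting average cycle time and resource utilization.

```python
import random

# Minimal process-simulation sketch (my own toy, not any vendor's
# engine): instances arrive at random intervals, one resource works
# them FIFO, one at a time; we measure cycle time and utilization.

random.seed(42)
N = 10_000                    # process instances to simulate
MEAN_ARRIVAL_GAP = 10.0       # minutes between arrivals (exponential)
MEAN_WORK_TIME = 7.0          # minutes of handling per instance

clock = 0.0                   # arrival clock
resource_free_at = 0.0        # when the one resource becomes available
busy_time = 0.0
total_cycle_time = 0.0

for _ in range(N):
    clock += random.expovariate(1.0 / MEAN_ARRIVAL_GAP)   # next arrival
    start = max(clock, resource_free_at)                  # queue if busy
    work = random.expovariate(1.0 / MEAN_WORK_TIME)
    resource_free_at = start + work
    busy_time += work
    total_cycle_time += resource_free_at - clock          # wait + work

print(f"avg cycle time : {total_cycle_time / N:6.1f} min")
print(f"utilization    : {busy_time / resource_free_at:6.1%}")
```

Even a toy like this shows the analysis questions simulation is supposed to answer: where the queues form, how loaded the resources are, and how cycle time blows up as utilization climbs.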
Air travel is God's punishment for living in California. At 5:30 on Friday evening, when most attendees are safely home with loved ones, my journey home is just beginning, many miles before I sleep. Day 3 of Process World ("User Day") was the best for me, since I came to find out what ARIS actually is and does, rather than the corporate vision.
Long-time readers of BPMS Watch know I've learned the hard way that for most people who self-identify with an interest in BPM, the big leap is not executing processes and rules but simply documenting them - writing them down. Now that I'm waist-deep in that world myself with the new BPMN training, I decided to trek over to IDS Scheer's user conference in Florida. It's been an eye-opener, for sure.
Next week at Gartner BPM, Appian will take the wraps off Appian Anywhere, a hosted subscription-based version of Appian Enterprise. Appian Anywhere includes the full BPM suite, and leverages Appian's ability to host the entire environment - design tools, engine, object persistence, rule engine, portal, etc. - on the web. Initially the systems will reside on Amazon's hosting services. Pricing varies with the scale of the hosting hardware; entry-level is around $15 per user per month.
I had a briefing recently with Global 360's CEO Michael Crosno, and it's interesting to see how far that organization has come since the management buyout last year. Although G360 is one of the largest BPM vendors by total software revenue - Gartner/DQ had them at #2 behind DST in 2005, but... well, let's let Gartner defend their own numbers - they don't get a lot of respect, or even recognition, outside of their base of 1900+ customers.
[Submitted via email re my What is Case Management? column on bpminstitute.org. Posted with Ken's permission.] I enjoyed your article on BPM and case management. It is a sticky subject. The BPM folks don't understand case management very well, and unfortunately, the case management folks don't understand BPM any better. At the heart of the problem is that "case management" is a catchall phrase that often refers to nothing more than a glorified electronic filing system.
While I've been shouting from the rooftops that process modeling (in BPMN, ARIS, or whatever) is not that hard, Lombardi Software has been hearing from its customers that it's not that easy, either. The tools are complex, expensive, and only a small fraction of their features are used. Collaborating on models while they're being developed is near impossible. Making the models understandable to executives or business users means reducing them to a simple PowerPoint diagram or Visio flowchart. So process modeling - step #1 in BPM - is already a barrier.
That barrier is what Lombardi aims to blow away with Blueprint, launched officially today. I've seen a lot of tools that their vendors insist are cool and different, but Blueprint really is cool and different.
It seems the smartest guys in the room, BPM-wise, have suddenly discovered content management. Or, more accurately, have begun to imagine what an intersection between the two might look like... as if that were almost possible. Ismael tosses out three candidates:

1. Manage the lifecycle of content objects in CM, and store just the links to them in the process instance. (Memo to FileNet: hmmm, might work, maybe try this...) This one is sketched below.
2. Use the CM repository as a source control system for BPM developers. (Memo to engineering... Oh, never mind.)
3. Embed a subset of BPM inside CM to do document approval workflows. (Is there a CM product anywhere that doesn't do this natively?)
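Of the three, candidate 1 is at least concrete enough to sketch. A minimal illustration in Python - hypothetical names throughout, not FileNet's API or anyone else's: the process instance carries only stable references into the CM repository, while lifecycle and versioning stay on the CM side.

```python
# Hypothetical sketch of candidate 1: the BPM side stores only links;
# versioning and content lifecycle stay in the CM repository.

from dataclasses import dataclass, field

@dataclass
class ContentLink:
    repository: str            # which CM system holds the object
    object_id: str             # stable id, not a copy of the content
    version: str = "latest"    # pin a version, or track the latest

@dataclass
class ProcessInstance:
    process: str
    case_id: str
    attachments: list = field(default_factory=list)

order = ProcessInstance("order-approval", "ORD-1234")
order.attachments.append(
    ContentLink(repository="cm-prod", object_id="doc-98765"))
# The process never owns the document; it resolves the link on demand,
# so CM-side lifecycle changes don't invalidate in-flight instances.
```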