Usability - Productivity - Business - The web - Singapore & Twins

The folly of root cause analysis

IT support's dealing with management is a funny business. Whenever something goes wrong, support teams engage in "defensive blaming" and the elusive quest for a root cause.
I've seen this quest (and the blaming of god and country along the way if no root cause appears) take priority over problem resolution and prevention. The twisted thought is: "If I'm not sure about the (single) root cause, I can neither fix it nor prevent it from happening again".

Why is that a folly?
  • It paralyses: If a bleeding person enters an ER, the first call to action is to stop the bleeding - not to ask: Do all the suppliers of the manufacturer of the blade that caused the wound hold ISO 9000 certifications? Where was the ore mined to make the blade? Root cause analysis is like that. IT support, however, is an ER room
  • It is not real: There is never a single reason. Example: Two guys walk along the street in opposite directions. One is distracted because he has freshly fallen in love. The other is distracted because he has just been dumped. They run into each other and bump their heads. What is the root cause?
    Remove any single factor - the route chosen, falling in love, being dumped, the time of leaving the house, the walking speed, the lack of discipline to put attention ahead of emotion etc. - and the incident would never have happened.
    Apply an IT-style root cause analysis and it will turn out that the second guy's grandfather is the root cause: he was such a fierce person that his son never developed trust, which later led to the breakup of a marriage while guy two was young, traumatising every breakup for him - thus the distraction
  • There is more than one: As the example shows, removing one factor could prevent the incident, but might leave an unstable situation behind. Once a root cause has been announced, the fiction spreads: "everything else is fine". Example: A database crashes. The root cause gets determined: only 500 users can be handled, so a limit is introduced. However, the real reason is a faulty hardware controller that malfunctions when heating up, which happens on prolonged high I/O (or when the data center aircon gets maintained).
    The business changes, the application gets used more heavily by the 500 users, the temperature threshold is crossed and the database crashes again.
  • Cause and effect are not linear: At the latest since Heisenberg it has been clear that the world is interdependent. Even the ancient Chinese knew that. So it is not, as Newton described, a simple action/reaction pair (which suffers from the assumption of "ceteris paribus" - invalid in dynamic systems), but a system of feedback loops subject to ever changing constraints.
Time is better spent understanding a problem using systems thinking: Assess and classify risk factors and problem contributors. Map them by impact, probability, ability to improve and effort to act. Then go and move the factors until the problem vanishes. Don't stop there - build in a safety margin.
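The factor-mapping step can be sketched in code. This is a hypothetical illustration - the Factor class and the impact x probability / effort score are my own assumptions, not a standard method - but it shows the idea of ranking contributors instead of hunting a single cause:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class FactorMap {

    // A problem contributor, scored on simple 1-5 scales (hypothetical model)
    static class Factor {
        final String name;
        final int impact;      // how bad it gets (1-5)
        final int probability; // how likely it strikes (1-5)
        final int effort;      // how hard it is to move (1-5)

        Factor(String name, int impact, int probability, int effort) {
            this.name = name;
            this.impact = impact;
            this.probability = probability;
            this.effort = effort;
        }

        // Higher score = act on this factor first
        double score() {
            return (double) (impact * probability) / effort;
        }
    }

    // Returns the factors ordered by descending score
    static List<Factor> prioritize(List<Factor> factors) {
        factors.sort(Comparator.comparingDouble(Factor::score).reversed());
        return factors;
    }

    public static void main(String[] args) {
        List<Factor> factors = Arrays.asList(
                new Factor("faulty I/O controller", 5, 4, 2),
                new Factor("user limit too low", 3, 3, 1),
                new Factor("aircon maintenance window", 2, 2, 3));
        prioritize(factors).forEach(f ->
                System.out.println(f.name + " -> " + f.score()));
    }
}
```

Moving the top-scoring factors first attacks the biggest risk per unit of effort; the safety margin comes from continuing down the list after the symptom disappears.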
As usual YMMV

Posted by on 2014-07-09 01:45 | Comments (2) | categories: Software

Documents vs eMails

With a public sector customer I had an interesting discussion on non-repudiation, messaging and regulatory control. We were discussing how to ensure awareness of information that has behavioural or legal consequences. While "I didn't know" is hardly a viable defence, relying on the other party to keep itself updated is just asking for trouble. In a collaborative environment, where a regulator sees itself primarily as the facilitator of orderly conduct and only secondarily as its police, this is inefficient.
An efficient way is a closed loop system of information dissemination and acknowledgement. The closed loop requirement isn't just for regulators, but for anybody who shares information that should result in specific behaviour. Just look at the communication pattern of a pilot with air traffic control (paraphrased): Tower: "Flight ABC23 turn to runway 270, descend to 12 thousand feet" - Pilot: "Roger that, turning to 270, descending to 12 thousand"

When we look at eMail, the standard mechanism that seems to come closest to this pattern is the Return Receipt:
Standard eMail flow
Using RFC 3798 - Message Disposition Notification (MDN), commonly referred to as a Return Receipt - to capture the "state of acknowledgement" is a folly:
  1. The RFC is completely optional and a messaging system can have it switched off (or delete the notification from the outbox), so it isn't suitable as a guarantee
  2. MDN only indicates that a message has been opened. It does not indicate: was it read, was it understood, were the actions understood, was the content accepted (the latter might not be relevant in a regulatory situation). It also doesn't indicate - and this is its biggest flaw - what content was opened. If the transmission was incomplete, damaged or intercepted, a return receipt wouldn't care.
So some better mechanism is needed!
Using documents that carry more context, a closed loop system can be designed. When I say "document" I don't mean a proprietary binary or text format file sitting in a file system, but an entity (most likely in a database) that has content and meta information. The interesting part is the meta information:
  • Document hierarchy: where does it fit in. For a car manufacturer recalling cars that could be: model, make, year. For a legislator: the act and provisions it belongs to
  • Validity: when does it come into effect (so one can browse by enactment date), when does (or did) it expire
  • History: which document(s) did it supersede, which document(s) superseded it
  • Audience: who needs to acknowledge it and how fast. Level of acknowledgement needed (simple confirmation or some questionnaire)
  • Pointer to discussion, FAQ and comments
  • Tags
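As a sketch, the meta information above could be modeled as a simple entity. Every name below is an illustrative assumption, not a prescribed schema:

```java
import java.time.LocalDate;
import java.util.List;

// Hypothetical model of a regulatory document's meta information
public class RegulatoryDocument {
    String id;
    String content;

    // Document hierarchy, e.g. act/provision or model/make/year
    List<String> hierarchy;

    // Validity window, so one can browse by enactment date
    LocalDate effectiveFrom;
    LocalDate expires; // null = still in force

    // History: cross references to other documents by id
    List<String> supersedes;
    List<String> supersededBy;

    // Audience: who must acknowledge, by when, and at what level
    List<String> audience;
    int acknowledgeWithinDays;
    boolean questionnaireRequired; // false = simple confirmation

    // Pointers to discussion, FAQ and comments
    List<String> discussionLinks;

    List<String> tags;
}
```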
An eMail has no structured way to carry such information forward, so a document repository solution is required. On a high level it can look like this:
Document Flow to acknowledge
Messaging is only used to notify the intended audience. Acknowledgement is not automatic, but a conscious act of clicking a link and confirming the content. The confirmation takes a copy of the original text and signs it, so it becomes clear who acknowledged what. An ideal candidate would be XML Signature, but there is no established model for signing it from a browser; the emerging W3C standard for browser-based crypto enjoys varying levels of adoption. Once you have dedicated records of who has acknowledged a document, you can start chasing the missing participants, reliably and automatically, and, if you are a regulator, take punitive action when chasing fails. It also opens the possibility to run statistics on how fast which types of documents get acknowledged.
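The "sign a copy of what was acknowledged" step can be sketched with the JDK's built-in crypto. This is a hypothetical server-side illustration using plain RSA signatures (java.security), not the XML Signature or browser crypto mentioned above - it just shows how a signature binds an identity to the exact acknowledged content:

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

public class Acknowledgement {

    public static void main(String[] args) throws Exception {
        // The user's key pair (in reality issued by a PKI, not generated ad hoc)
        KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
        gen.initialize(2048);
        KeyPair userKeys = gen.generateKeyPair();

        // A copy of the exact content the user confirmed
        String acknowledgedCopy = "Provision 4.2: report incidents within 24h";

        // Sign the copy, so it is clear who acknowledged what
        Signature signer = Signature.getInstance("SHA256withRSA");
        signer.initSign(userKeys.getPrivate());
        signer.update(acknowledgedCopy.getBytes(StandardCharsets.UTF_8));
        byte[] signature = signer.sign();

        // Later: verify the signature against the stored copy;
        // a tampered or incomplete copy would fail verification
        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(userKeys.getPublic());
        verifier.update(acknowledgedCopy.getBytes(StandardCharsets.UTF_8));
        System.out.println("Acknowledgement valid: " + verifier.verify(signature));
    }
}
```

Unlike a return receipt, the signature covers the content itself, so damaged or substituted text is detectable.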
The big BUT usually is: "I don't want to deploy 2 additional servers for document storage and web access." The solution for that is - you might have guessed it - Domino. One server can provide all 3 roles easily:
Document Flow on a Domino server
As usual YMMV

Posted by on 2014-07-04 11:41 | Comments (0) | categories: Software

The taxi loyalty program isn't working and how to fix it

Singapore is a little like New York: trains and taxis are a mainstay of the daily commute. The taxi market is highly regulated and fiercely competitive. Unsurprisingly, taxi companies try to bind customers before their loyalty switches to alternative booking channels or the disruptors.
So Comfort & CityCab started CabRewards. After all, loyalty cards work well for their inventors.
In a smart move, instead of creating a new piece of plastic, Comfort teamed up with ezLink, Singapore's leading provider of cash cards. Everyone in Singapore has an ezLink card, since they are used for train access and road tolls.
From there it all went downhill.
In usability studies one of the key activities is to watch users and refine based on their feedback. So I loaded my ezLink card to see how taxi drivers would handle it. In a (futile, as they told me after I inquired with Comfort) attempt to make things "consistent", the designer decided to add a prompt "CabRewards Yes/No" to the touchscreen application before the driver can process payment. So something the driver has no benefit from stands between him and his livelihood. To no surprise, 95% of the drivers don't bother to ask "Are you a CabRewards member?", even when I announce "I will pay by ezLink". They just click NO and proceed to payment. If they answered yes, they would have to tap once for the points and another time for the actual payment. They have no benefit, so they skip it (and rightly so - their job is driving, not administering loyalty programs).
I asked Comfort and they explained that I could ask the taxi driver to switch back and add the points - but I'm not their program admin either. So how to fix this?
Plan, Reality and Fix
The graphic above shows the intended workflow, the actual workflow and a possible fix. There are several touchpoints where an automated IT system can determine if I'm a known passenger and a CabRewards member: I call a cab, I use a mobile app to get a cab, or I pay with a registered method of payment (ezLink, Nets, credit card). In none of these cases does the taxi driver need to be bothered with a question. Most likely that would also cover most reward members. The ones paying cash might not have registered for the program anyway - or a simple change of terms (points only with electronic payments) would eliminate the question completely. So far I haven't seen a change. I wonder if the person in charge of the process is trying to cover up the problem?
I would say: it is OK to have an idea, even if problems arise - just fix them. Despite opinions to the contrary, there is no such thing as an "honest mistake". There is trial, error and correction (and starting over).
Simplicity after all isn't simple
Comfort, you can do better!

Posted by on 2014-06-16 10:43 | Comments (1) | categories: Business Singapore

Let's ditch IBM Notes and Domino

Finally you have decided it is time to move on: legacy no longer means "tried and tested" but "we need to move on" to you. After all, you never really liked Cher.
Notes data is available via LotusScript, dotNet (through the COM bridge), in Java, Corba, C++, XML, REST, MIME, so how hard can it be? Actually not very hard, you just need to:
  1. Find suitable replacement application platform(s)
  2. Rewrite the applications (don't dream: there is no such thing as "migrate an app")
  3. Migrate your users
  4. Migrate your data
... and then live happily ever after. Probably the easiest part is migrating the users, after all directories are more or less the same when you look at the LDAP side of them.
Migrating the users' encryption keys and secured documents could be considered "an application function", so you dodge a potential problem here.
Getting the data out, short of that pesky RichText content (yeah, the one with sections and embedded OLE controls), is easy - never mind reader and author access protection.
The transformation into the target format happens in a black box (the same one the local magician uses for his tricks) that you buy from a service provider, so that's covered. That leaves platform and applications. (Bring your own <irony /> tags)
Let's have a look at the moving parts (listing OpenSource components only, you can research commercial ones on your own):
What Domino does for you
  • Directory : OpenLDAP comes with most Linux distributions and is well understood. It doesn't take you down the proprietary-extension hell (and you can remove obsolete schema parts) - but it requires an RDBMS to run (not a blue one)
  • PKI : PrimeKey
  • Sync Server : For databases you might get away with the databases' own replication, but for mobile devices you need one: Z-Push
  • Document Server : Pick any - or go with: Dropbox, Box, Copy etc.
  • Database Server : Notes is the mother of NoSQL, so you want to pick one of those: CouchDB, OrientDB or you go with DB/2 PureXML
  • Application Server : XPages is full fledged JSF, so you might opt for TomEE, but you will need some extra libraries. Or you ditch Java and adopt the cool kid on the block
  • Mail Server : the venerable SendMail does, as the name implies, send mail. Goes nicely with Mozilla Thunderbird
  • Web Server : That's easy. There's NGinX or Apache
  • Enterprise Integration : Might be quite tricky. Domino's DECS and LEI shuffle data from all sorts of sources. OpenESB might do the trick
  • Task scheduling : You can use OS level CRON (Just find a nice UI) or application level Quartz
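To give one concrete flavour of the gap: what Domino's agent manager does with a few form fields becomes code or configuration elsewhere. Here is a minimal stand-in for the scheduling piece alone, using the plain JDK instead of CRON or Quartz - a sketch, and the task body is obviously a placeholder:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class NightlyJob {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        // Run every 24 hours, starting one minute after startup
        // (Quartz or cron would add calendars, misfire handling and a UI)
        scheduler.scheduleAtFixedRate(
                () -> System.out.println("Running nightly cleanup..."),
                1, 24 * 60, TimeUnit.MINUTES);
        // In a real server this runs for the lifetime of the process
    }
}
```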
Keep in mind, this replaces one server - Domino. Each of the tools needs to be:
  1. configured
  2. hardened
  3. populated with data
The ease of clustering you had with Domino gets replaced with different methods and capabilities for each of the tools. Today Cloud is the new infrastructure, so you might get lucky and have someone else configure all of the above for you.

Once you have all the moving parts in place, you need to redevelop your apps. Don't be fooled - run the analysis to see the full magnitude of the task ahead.
As usual YMMV

Posted by on 2014-05-30 07:05 | Comments (7) | categories: IBM Notes

Value, Features and Workflows

In sales school we are taught to sell value. Initially that approach was designed to defang the threat of endless haggling over price, but it took an extra twist in the software industry. Since software companies rely on users' desire to "buy the next version" to secure revenue from maintenance and upgrade sales, a feature war was the consequence.
As a result, buyers frequently request feature comparison tables, driving the proponents of "value & vision" up the wall. It also creates tension inside a sales organisation, when a customer asks for a specific feature and the seller is reprimanded for "not selling value". How can this split in expectations be reconciled? As learned in Negotiation Basics, we need to step back and see beyond positions at the interest that drives them:
The seller doesn't want to do feature-to-feature comparisons, since they never match and are time consuming and tedious. Showing how trustworthy, visionary and future-proof the product is, on the other hand, makes creating confidence much easier.
The buyer is very aware that software doesn't have any inherent value - only its application has. Using software means invoking its features, so the feature comparison is a proxy for the quality of the workflows it can provide in the buyer's organisation. The challenge here is that any change in feature set will break somebody's workflow.
An example:
Outlook users quite often add themselves as BCC to an outgoing eMail, so the message appears in the inbox again. From there it is dragged into a folder in an archive, so it is kept in the local, not size-restrained PST file where it can be found. IBM Notes doesn't allow dragging from the inbox into a specific folder in an archive. The equivalent workflow: the user doesn't add herself to BCC, but simply uses Send & File on the original message. The scheduled Archive task moves the message later without user action required. The search box will find messages regardless of their location in the main mail file or in one of the archives - same result, IMHO less work, but a different flow.
The solution to this is consultative selling, where a seller looks at the workflows (that are mapped to features of existing tools or practices) and proposes improved workflows based on the feature set of his products and services. A nice little challenge arises when the flow isn't clear or the proposed product has no advantage.
A little story from the trenches to highlight this: once upon a time, when files were still mainly paper, a sales guy tried to sell one of my customers a fax server, claiming that having the fax on screen - eliminating the need to walk to the fax machine - would be really beneficial. He looked quite dumbfounded when the manager asked: "And how do I write on this?". The manager's workflow was to scribble instructions onto incoming faxes, or to document that they had been acted upon. The software couldn't do that.

In conclusion, there is a clear hierarchy in software: to have a goal and destination there needs to be a vision; that vision needs to be supported by software that has value in implementing it. Value is generated by the software supporting and improving workflows. Workflows use one or more features of the application. Comparing features is aiming one level too low - the workflows are the real value generators. And a change in software most likely requires a change in workflow.

Quite a challenge!

Posted by on 2014-05-11 10:36 | Comments (1) | categories: Software

You want to move to Domino? You need a plan!

Cloud services are all en vogue, the hot kid on the block and irresistible. So you decided to move there - but your luggage has to come along. And suddenly you realize that flipping a switch won't do the trick. Now you need to listen to the expert.
The good folks at Amazon have compiled a table that gives you some idea how long it would take to transfer data:

Available Internet Connection     Theoretical Min. Number of Days to Transfer 1 TB
                                  at 80% Network Utilization
T1 (1.544 Mbps)                   82 days
10 Mbps                           13 days
T3 (44.736 Mbps)                  3 days
100 Mbps                          1 to 2 days
1000 Mbps                         Less than 1 day

Some stuff for your math
(Reproduced without asking)
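The table is easy to reproduce (assuming 1 TB is counted as 2^40 bytes, which matches the numbers above): divide the bits to move by 80% of the line rate.

```java
public class TransferMath {

    // Days to push 1 TB (2^40 bytes) through a line at 80% utilization
    static double daysToTransferOneTB(double lineRateMbps) {
        double bits = Math.pow(2, 40) * 8;          // 1 TB in bits
        double usableBitsPerSecond = lineRateMbps * 1_000_000 * 0.8;
        return bits / usableBitsPerSecond / 86_400; // seconds per day
    }

    public static void main(String[] args) {
        System.out.printf("T1   (1.544 Mbps): %.0f days%n", daysToTransferOneTB(1.544));
        System.out.printf("10 Mbps          : %.0f days%n", daysToTransferOneTB(10));
        System.out.printf("T3  (44.736 Mbps): %.1f days%n", daysToTransferOneTB(44.736));
        System.out.printf("100 Mbps         : %.1f days%n", daysToTransferOneTB(100));
    }
}
```

Multiply by your actual volume: at 100 Mbps, 400 TB works out to well over a year of continuous transfer - hence the need for daily batches.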
Talking to customers gung-ho to move, I came across data volumes of 10-400 TB. Now go and check your pipe and do the math. A big-bang, just-flip-the-switch migration is out of the picture. You need a plan. Here is a cheat sheet to get you started:
  1. Create a database that contains all information of your existing users and how they will be once provisioned on Domino (If you are certified for IBM SmartCloud migrations IBM has one for you)
  2. Gather intelligence on data size and connection speed. Design your daily batch size accordingly
  3. Send a message to your existing users, where you securely collect their credentials and offer them a time slot for their migration. A good measure is to bundle your daily slots into bigger units of 3-7 days, so you have some wiggle room. Using an intelligent lookup you only present slots that have not been taken up
  4. Send a nice confirmation message with the date and the steps to be taken. Let the user know that on cut-over day they can use the new mail system instantly, but it might take a while (replace "while" with "up to x hours", based on your measurements and the mail size intelligence you have gathered) before existing messages show up
  5. When the mailbox is due, send another message to let the user kick off the process (or confirm her consent that it kicks off). In that message it is a good idea to point to learning resources like a "what's new" summary, training videos or classes
  6. Once the migration is completed, send another message with some nice looking stats, thanking the user for their patience
  7. Communicate, Communicate, Communicate!
The checklist covers the user facing part of your migration. You still have to plan DNS cut-over, routing while moving, https access, redirection of mail links etc. Of course the list also applies to your pilot group/test run.
As usual: YMMV

Posted by on 2014-04-17 04:36 | Comments (0) | categories: IBM Notes Cloud Computing

Domino Design Pattern: Secret documents

Domino's stronghold is security. However, security is only as good as your design. A frequent requirement in applications is to store a data set that is partially confidential and partially available to a wider audience. When you store these 2 data sets in one document, it isn't too hard for the confidential information to slip out:
  • using the document properties in a Notes client
  • using the document REST service
  • using the property control from OpenNTF
In a nutshell: if you have 2 sets of data with different levels of read access requirements, don't store them in one document. A well working pattern in Domino is the "secret document". The following picture illustrates the concept:
Use 2 documents to store 2 sets of information with different security requirements
The user is presented with one form, but the entered data is saved into two documents. The documents are cross referenced using their UNIDs. This can happen two-way (as shown in the picture): the public document's UNID is saved in the secret document and vice versa - or one-way, with only the secret document's UNID in the public document. A few pointers:
  • Based on the application's needs, some of the public data gets repeated inside the secret document if it needs to be displayed on its own (e.g. a salary list in an HR application)
  • To avoid data drifting apart, the respective data only ever gets updated in the public document and is then copied to the secret document. In classic Notes that is done using an on-change agent, while in XPages a session-as-signer code snippet will suffice
  • For very sensitive data (which even normal users shall not see), the data sets could be stored in their own encrypted NSF. Then the UNID might not be enough and the full notes:// URL would make more sense
  • In classic Notes the embedded form editor makes the user experience with 2 documents seamless
  • In XPages two (or more) data sources sitting on one page will do the trick
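The save flow can be modeled in a few lines of plain Java. This is a sketch, not Domino API code - the maps stand in for the two documents and all names are illustrative:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

// Plain-Java model of the "secret document" pattern; the store stands in
// for an NSF and the maps for documents (names are illustrative)
public class SecretDocumentPattern {

    static Map<String, Map<String, String>> store = new HashMap<>();

    // Saves one form's input as two cross-referenced documents
    static String[] save(Map<String, String> publicFields, Map<String, String> secretFields) {
        String publicUnid = UUID.randomUUID().toString();
        String secretUnid = UUID.randomUUID().toString();

        // Two-way cross reference via the UNIDs
        publicFields.put("secretRef", secretUnid);
        secretFields.put("publicRef", publicUnid);

        // In Domino a reader field on the secret document would
        // restrict who can open it at all
        store.put(publicUnid, publicFields);
        store.put(secretUnid, secretFields);
        return new String[] { publicUnid, secretUnid };
    }

    public static void main(String[] args) {
        Map<String, String> pub = new HashMap<>();
        pub.put("name", "Jane Doe");
        Map<String, String> secret = new HashMap<>();
        secret.put("salary", "120000");

        String[] unids = save(pub, secret);
        // A reader with access to the public document only
        // learns the name, not the salary
        System.out.println(store.get(unids[0]).get("name"));   // Jane Doe
        System.out.println(store.get(unids[0]).get("salary")); // null
    }
}
```

The point of the split: the confidential fields never travel inside the public document, so document properties or REST access to it reveal nothing.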
As usual YMMV

Posted by on 2014-04-17 03:38 | Comments (3) | categories: XPages IBM Notes

SmartCloud Notes little agent helper

Now that we have all drunk the Cloud Computing Kool-Aid, we need to make it work. IBM's SmartCloud Notes looks enticing, since it offers 25 GB of eMail storage, way beyond what IT departments usually want to commit to.
SmartCloud Notes even allows customisation, albeit within clear limits. So before you upload your extension forms you need to plan well.
One of the most unpleasant restrictions is: "No customer agents or scripts will be executed on the server", so no agents, no DOLS tasks. However, you can run an agent (or other code) on an on-premises server. The interesting question is when and how to trigger such code. Looking at the basic iNotes customization article you can find the Custom_Scene_PreSubmit_Lite JavaScript function. This could be the place to launch such a trigger. More on that in the next installment.
This article outlines the receiving end - the stuff that runs on your on-premises server. Instead of running agents, I'll outline a plug-in that allows processing of the submitted document. The interface between SCN and this service is a JSON submission in the form of:
  "userName": "TestUser@acme.com",
  "action": "DemoTask",
  "unid": "32char-unid"

Once the plug-in receives this data, processing can commence. Of course the server (that's the ID the plug-in runs with) needs to have access to the mail file at the level the task requires. Let's get started:
In a plugin project we define a new servlet:
<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.4"?>

Then our servlet looks like this:
package com.notessensei.cloudproxy;

import java.io.IOException;
import java.io.InputStream;
import java.io.PrintWriter;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import lotus.domino.Base;
import lotus.domino.NotesException;

public class TaskServlet extends HttpServlet {

	private static final long		serialVersionUID	= 1L;

	// Number of threads allowed to run concurrently for data sync
	private static final int		THREADPOOLSIZE		= 16;

	// The background executor for talking to the cloud
	private final ExecutorService	service				= Executors.newFixedThreadPool(THREADPOOLSIZE);

	// The Cache where we keep our user lookup objects, handle with care!
	private final UserCache			userCache			= new UserCache();

	protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
		// Takes in a JSON String and makes a task object
		InputStream in = req.getInputStream();
		CloudNotesTask it = CloudNotesTask.load(in);
		String result;

		if (it != null) {
			// Hand the task to the background executor
			// (assuming CloudNotesTask implements Runnable)
			this.service.submit(it);
			result = "{\"status\" : \"task accepted\"}";
		} else {
			result = "{\"status\" : \"task failed\"}";
		}

		// Prepare the reply back
		PrintWriter out = resp.getWriter();
		out.println(result);
		out.close();
	}

	/**
	 * Get rid of all Notes objects
	 *
	 * @param morituri
	 *            = the ones designated to die, read your Caesar!
	 */
	public static void shred(final Base... morituri) {
		for (Base obsoleteObject : morituri) {
			if (obsoleteObject != null) {
				try {
					obsoleteObject.recycle();
				} catch (NotesException e) {
					// We don't care, we want to get
					// rid of it anyway
				} finally {
					obsoleteObject = null;
				}
			}
		}
	}
}
Read more

Posted by on 2014-04-17 08:52 | Comments (1) | categories: IBM Notes

Mustache and CKEditor - Round two

Having just a few static values in the CKEditor drop-down list really doesn't cut it. So today we extend the bean to provide more flexible options. A few spring to mind:
  1. List of all items in a given document
  2. List of all fields in a form (including its subforms), optionally with or without the $ fields
  3. List of provided field names
So here we go:
<?xml version="1.0" encoding="UTF-8"?>
<xp:view xmlns:xp="http://www.ibm.com/xsp/core">
	<xp:scriptBlock id="scriptBlock1">
		<xp:this.value><![CDATA[#{javascript:mustache.getFormFields(database, "Memo,Person", false);}]]></xp:this.value>
	</xp:scriptBlock>
	<h1>Mustache and CKEdit demo</h1>
	<xp:inputRichText id="inputRichText1">
		<xp:this.dojoAttributes>
			<xp:dojoAttribute name="extraPlugins" value="mustache"></xp:dojoAttribute>
		</xp:this.dojoAttributes>
	</xp:inputRichText>
</xp:view>
The big change here is the replacement of the EL expression mustache.sampleData with a SSJS expression, so we can hand over all needed parameters. The pattern is the same for the other variations, so I won't repeat it further on. The interesting part is the Java method. Since a form might contain subforms you are interested in, I use an array-like string that I split, plus a boolean parameter to include system fields. Of course one could vary the approach and automatically figure out the subforms in use (have fun with that once they are conditionally computed), or first present a list of forms and then the respective fields. Also, to not depend on some magic, I add the database dependency as a parameter. So you have options to play with.
	public String getFormFields(Database db, String formNameString, boolean includeSystemFields) {
		StringBuilder result = new StringBuilder();
		// Get a sorted set first
		Set<String> fieldNames = new TreeSet<String>();
		String[] formNames = formNameString.split(",");
		for (String formName : formNames) {
			try {
				Form form = db.getForm(formName);
				if (form != null) {
					Vector fields = form.getFields();
					for (int i = 0; i < fields.size(); i++) {
						String curField = fields.get(i).toString();
						if (includeSystemFields || !curField.startsWith("$")) {
							fieldNames.add(curField);
						}
					}
				}
			} catch (NotesException e) {
				// Too bad
			}
		}
		// Now the content
		for (String f : fieldNames) {
			this.add(result, f);
		}
		return result.toString();
	}

Read more

Posted by on 2014-04-14 09:06 | Comments (0) | categories: XPages

Lotus de la Mancha

One of my personal heroes is Don Quixote de la Mancha. He is a bat-shit crazy knight, who is true in his courtship of his Lady Dulcinea and never tires of picking a fight with a giant (windmill). His charge against the windmills is regarded as a result of his craziness, but dig deeper and you will find a nobility worthy of a true knight: standing in for what you deem right, regardless of the odds of success.
Being true to your calling resonates with me. Wikipedia has an image of the crest of La Mancha.
Based on it, I hereby present the coat of arms of Lotus de la Mancha
Lotus de la Mancha - Crest of arms

Posted by on 2014-04-09 11:35 | Comments (3) | categories: After hours