Who makes the decision anyway?

You’re Fired

I know that anyone who watches ‘The Apprentice’ is not doing so for an insight into how a modern business is run, but hearing the words ‘You’re Fired’ frequently bellowed through an office door couldn’t be further from my own experience. It represents a clichéd and caricatured view of management that I last saw to ‘comedic’ effect in Terry and June, a 1970s BBC sitcom. I am sure it is a style that exists, but hopefully in a diminishing minority of organisations that haven’t found a way to deal with the bullying and haranguing of greying and dysfunctional dinosaurs.

A New Generation of Decision Makers

I was born in the 1960s, which you have probably already worked out given the reference to Terry Scott and June Whitfield. I don’t recall being consulted by my parents on family decisions too often. Loving and supportive as they were, they were part of a generation that didn’t ask what kind of party we wanted, what cut of jeans we preferred or which destination we fancied for a day out.

Of course their choices were far narrower, but this was the generation of parents that pre-dated Parenting magazine, let alone Parenting.com.

Compare this with the generation entering the workforce today. Most have been involved in choices that affect them, carefully consulted in family decisions. Some, like the Montessori-educated Google founders Larry Page and Sergey Brin, have taken their progressive education and created progressive and hugely successful organisational cultures.

Waning Autonomy in Decision Making

The connection between Alan Sugar’s pantomime boss and the future of BI is this. The purpose of BI is to make better decisions. Two decades ago, those decisions used to be made by one person (and in the main it was a man). Increasingly, those decisions are made by teams, peer groups, special interest groups and the staff that are impacted by them.

Waxing Collaboration in Decision Making

The drivers for increasing collaboration in decision making are largely cultural. They include the small matter of a whole generation entering the workforce who expect to be consulted and who are sociologically predisposed to sharing responsibility for the outcomes of those decisions. This means growing engagement in successful outcomes from a much broader group across the organisation.

Until very recently, this was just too difficult to do. Cultural implications aside, how do you poll groups, get their input, collate views, share opinions and establish any kind of consensus without committees, sub-committees and employee councils? How do you distribute the information, the hard numbers, that are needed to make a decision that aligns the needs of the business and its stakeholders with the needs of those that participate in the business as employees and partners?

Social business tools and their convergence with BI are an enabler. They have made collaboration in decision making possible.

The opportunity is an engaged and informed workforce that can positively participate in the decision making process. Even if individuals did not support the outcome, they will know that their voice was heard.

No, You’re Fired

If an engaged workforce sounds ‘fluffy’ then ask yourself what your organisation’s largest cost is. It isn’t usually paperclips. Estimates vary, but some suggest that knowledge workers will account for 80% of the cost of the US labour force in 2012.

If 80% of costs were in a machine, would we be content with it idling, running at 25% capacity, which some HR studies suggest is the current average level of engagement? I doubt it.
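The arithmetic behind that machine analogy can be sketched with illustrative figures. The 80% and 25% numbers are the hedged estimates above; the payroll figure is entirely hypothetical:

```python
# Back-of-envelope cost of disengagement, using the estimates quoted
# above. The total payroll figure is invented for illustration only.
total_labour_cost = 10_000_000          # hypothetical annual payroll
knowledge_worker_share = 0.80           # estimate: 80% of labour cost
engagement_level = 0.25                 # estimate from HR studies

knowledge_worker_cost = total_labour_cost * knowledge_worker_share
engaged_cost = knowledge_worker_cost * engagement_level
idle_cost = knowledge_worker_cost - engaged_cost

print(f"Spend on knowledge workers: {knowledge_worker_cost:,.0f}")
print(f"Effectively engaged spend:  {engaged_cost:,.0f}")
print(f"Idling, disengaged spend:   {idle_cost:,.0f}")
```

On these illustrative numbers, three quarters of the largest cost in the business is idling, which is rather harder to dismiss as ‘fluffy’.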

So the manager of yore, jealously keeping information to themselves so that they can exercise power and control and ultimately make autonomous decisions without offering or taking counsel… You’re Fired.

If only BI was as efficient as Facebook

BI, Facebook and Decision Loops

I was at an analyst briefing event with IBM last week, where they shared their thinking on Social Business and what I believe is the inspired and innovative pairing of Connections Collaboration and Cognos Business Intelligence. IBM’s Social Business Leader for Northern Europe, Jon Mell, shared a slide that compared the number of operations it takes to share a photo and gather feedback from friends on Facebook with the number of operations it takes to do the same over email.

This set my mind racing. If there are efficiency gains on something simple like sharing and getting feedback on a photo, imagine the productivity gains on sharing critical business information through Business Intelligence reports.

Why do I say this? Because sharing a photo is typically a single ‘sharing loop’ process. Someone publishes the photo, others contribute with their clever and witty observations. Done.

A single loop … Count ‘em … One. (A quote from Muppet Treasure Island, btw)

The outdated view of BI is that it is shared this way too: that it’s published and the job is done. This just doesn’t hold true any more, and I am not sure it ever did. BI requires many sharing or decision loops. Ten distinct loops, to be precise, but some of these are repeated, which means there can often be more decision loops than a bowl full of fruit loops (an all too infrequent guilty pleasure of mine).

  1. Meaning Loop. Gain agreement on the meaning of the information
  2. Implication Loop. Decide if the implication is neutral or if there is a problem or opportunity
  3. Investigation Loop. If there is an issue then it will be rare that one piece of business intelligence provides the full story. This loop is about investigating the problem or opportunity in more detail.
  4. Solution Loop. Determine possible solutions to exploit the opportunity or resolve the problem
  5. Decision Loop. To decide on the best possible solution
  6. Action Loop. Once the solution is determined it will be broken down into tasks and assigned to individuals to be actioned.
  7. Progress Loop. Providing feedback on the progress of the solution
  8. Monitoring Loop. To determine if the solution has been successful or if the group need to return to refine the tasks or redo some loops.
  9. Conclusion Loop. Closure. Establish agreement that there are no further actions and that the problem or opportunity is resolved.
  10. Celebration Loop. Acknowledge the support and contributions of those involved

That’s ten loops, which means that if sharing a photograph on Facebook is more efficient than sharing it in the office using email, the productivity benefits of doing ‘real’ business this way are tenfold.

There are those who are sceptical about Facebook-styled social platforms in the office because they may waste time. I understand this, I really do. However, the opposite is true. Organisations need social platforms, particularly for collaborative decision making. Without them, they are wasting time.

Decision Making Black Holes

 

A Funny Thing Happens at the Forum

Meetings are one of the most common decision making ‘forums’ we are all regularly involved in. In fact, one in five company meetings we attend is held to make a decision. As a way of making decisions, though, they can be problematic. Once the meeting has concluded, the connection between information shared, decisions made and actions taken can be weak, even lost. It’s as if the meeting itself were a decision making black hole.

Some Decisions are More Equal Than Others

Some decision making meetings are impromptu, for making a timely, tactical decision quickly. Others are regular, formal and arranged around the ‘drum beat’ or ‘cadence’ of a business to make more strategic decisions. The more strategic the decisions and the longer term the impact, the less frequent the forum, so a Senior or Executive Management Team may only meet quarterly for a business review (QBR).

How a QBR ‘Rolls’

A typical QBR will see Senior Managers sharing results in PowerPoint, possibly with financial results in spreadsheets, which I would hope have at least been extracted from a Business Intelligence application.

If the SMT is reasonably well organised, they will summarise their conclusions and actions in meeting minutes. The minutes will be typed up by an assistant in a Word document and then distributed by email.

Throughout, they will all have been keeping individual notes so will walk out with these in their daybooks. The most senior manager in the room might not do this particularly if it’s their assistant who’s taking the minutes.

Later, actions from daybooks and minutes are likely transferred to individuals’ to-do lists, and all follow-up will be conducted in email and phone calls.

An Implosion of Information, Conclusion and Decision

So let’s recap. Critical decisions about how resources are going to be allocated will be discussed in a QBR, and yet the artefacts of this critical decision making forum are scattered into Word documents, Excel spreadsheets, emails and Outlook tasks. Tiny fragments of the discussion, information, conclusions, decisions and activities implode around the organisation. To be frank, the team are now only going to make progress because the forum was recent and can be relatively easily recalled.

Of course, once time or people move on so does the corporate memory of the decision. Conversations begin with ‘what did we agree to do about that cost over-run?’ or ‘why did we say we were ok with the revenue performance in Q1?’

Executive Attention Deficit Syndrome

Many executives complain of a syndrome that feels like attention deficit syndrome. This is because the more senior the manager, the more things they will probably have to deal with at an increasingly superficial level. A functional head will probably spend no more than 15 minutes on any one thing. To productively make decisions they will need to have the background, status and related information to hand so that they can deal with each item quickly and move on to the next. Decision making black holes contribute to this feeling of Executive Attention Deficit Syndrome.

CDM and Corporate Memory

Corporate Decision Making platforms will be successful when they connect:

  • Decisions
  • Information on which the decision was made
  • Insight derived from the information
  • Actions taken on the decision
  • Results of the actions

This means total recall of corporate decisions, good and bad, so that, over time, decisions can be recalled, evaluated, re-used or improved. A far cry from current decision making forums, which, whilst functional, are inherently flawed and fragmented and are not improving the timeliness and quality of decisions in our organisations.

BI Requirements Should not just be Gathered

There are many resources remonstrating with the IT community on the importance of gathering requirements. Failing to gather requirements, they warn, will lead to a poor solution delivered late and over budget. This is largely inarguable.

However, I would warn that simply ‘gathering’ requirements is as big a risk. Fred Brooks, author of ‘The Mythical Man-Month’, once said that ‘the hardest part of building a software system is deciding what to build’. And deciding what to build is a two-way process rather than the act of listening, nodding and documenting that we all too often see in Business Intelligence projects.

From time to time, I hear someone cry foul on this assertion. They argue that it seems like the tail is wagging the dog or that the business cannot compromise on the requirement. I usually point out that simply building what the user asked for doesn’t happen in any other field of engineering. Architects advise on the cost of materials when planning a major new office building, city officials take advice on the best possible location for a bridge, and environmental consultants are actively engaged in deciding exactly if and what should be built in any major civil engineering project.

And this is exactly how we should approach business analytic requirements. As a two-way exploration of what is required, possible solutions and the implications of each. Incidentally, this is particularly difficult to do if business users are asked to gather and document their own requirements without input from their implementation team.

An example of why this is important is rooted in the fact that many BI technologies (including IBM Cognos) are tools, not programming languages. They have been built around a model to increase productivity. That is, if you understand and work with the assumptions behind the model, reports, dashboards and other BI application objects can be built very quickly. Bend the model and development times increase. Attempting to work completely around the model may result in greatly reduced productivity and therefore vastly increased development time.

So be wary of treating ‘gathering’ and ‘analysis’ as distinct and separate steps. Instead, the process should be an iterative collaboration between users and engineers. Requirements should be understood but so should the implications from a systems perspective. The resulting solution will almost undoubtedly be a better fit and it will significantly increase the chance of it being delivered on time, at the right cost and with an increased understanding between those that need the systems and those that build them.

"We fail more often because we solve the wrong problem than because we get the wrong solution to the right problem", Russell Ackoff, 1974

Single version of the truth, philosophy or reality?

Assuming you want the truth and you can handle it, then you will have heard this a lot: the purpose of our new (BI/Analytics/Data Warehouse) project is to deliver ‘a single version of the truth’. In a project we are engaged with right now the expression is ‘one version of reality’, or 1VOR. For UK boomers that will almost undoubtedly bring to mind a steam engine, but I digress.

I have to admit, I find the term jarring whenever I hear it because it implies something simple and attainable through a single system, which is rarely the reality.

In fact it’s rarely attained, causing some of our community to ponder its viability or even whether it exists. Robin Bloor’s ‘Is there a single version of the Truth’ and ‘Beyond a single version of the truth’ on the Obsessive Compulsive Data Quality blog are great examples.

Much on this subject has been written by data quality practitioners, and it speaks to master data management and the desire, for example, for a single and consistent view of a customer. Banks often don’t understand customers; they understand accounts. And if the number of (err, for example Hotel Chocolat) home shopping brochures I receive is anything to go by, then many retailers don’t get it either. Personally, I want my bank and my chocolatier to know when I am interacting with them. I’m a name, not a number, particularly when it comes to chocolate.

This problem is also characterised by the tired and exasperated tone of a Senior Manager asking for (and sometimes begging for) a single version of the truth. This is usually because they had a ‘number’ (probably revenue) and went to speak to one of their Department Heads about it (probably because it was unexpectedly low), and rather than spending time on understanding what the number means or what the business should do, they spent 45 minutes comparing the Senior Manager’s ‘number’ with the Department Head’s ‘number’. In trying to reconcile them, they also found some more ‘numbers’ too. It probably passed the time nicely. Make this a monthly meeting or a QBR involving a number of department heads and the 45 minutes will stretch into hours without any real insight from which decisions might have been made.

This is partly about provenance. Ideally the number came from a single system of record (Finance, HR) or corporate BI, but it most likely came from a spreadsheet, or even worse, a presentation with a spreadsheet embedded in it.

It’s also about purity (or the addition of impurities, at least). The number might have started pure, but the Department Head, or an analyst who works in their support and admin team, calculated it based on an extract from the finance system and possibly some other spreadsheets. The numbers were probably adjusted because of some departmental nuance. For example, if it’s a Sales Team, the Sales Manager might include all the sales for a rep that joined part way through the year, whilst Finance left the revenue with the previous team.

It will be no comfort (or surprise) to our Senior Manager that it is a Master Data Management problem too. Revenue by product can only make sense if everyone in the organisation can agree on the brands, categories and products that classify the things that are sold. Superficially this sounds simple, but even this week I have spoken with a global business that is launching a major initiative, involving hundreds of man hours, to resolve just this issue.

It’s also about terminology. We sacrifice precision in language for efficiency. In most organisations we dance dangerously around synonyms and homonyms because it mostly doesn’t catch us out. Net revenue… net of what? And whilst we are on the subject… revenue. Revenue as it was ordered, as it was delivered, as it was invoiced and as it is recognised according to GAAP rules in the finance system. By the way, does your number include credit notes? And this is a SIMPLE example. Costs are often centralised, allocated or shared in some way, all dependent on a set of rules that only a handful of people in the finance team really understand.
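To make the terminology point concrete, here is a minimal sketch with invented figures for one quarter’s trading at a hypothetical company. Each ‘revenue’ below is an honest answer to the same question, yet no two agree:

```python
# All figures are invented for illustration. Each department's
# definition of "revenue this quarter" is defensible on its own terms.
orders_signed   = 1000   # value of contracts signed in the quarter
goods_delivered =  800   # value actually shipped in the quarter
invoiced        =  700   # value invoiced in the quarter
credit_notes    =   50   # credits issued against earlier invoices

revenue_sales_view   = orders_signed              # "we sold 1000"
revenue_ops_view     = goods_delivered            # "we delivered 800"
revenue_finance_view = invoiced - credit_notes    # nearer recognised revenue

print(revenue_sales_view, revenue_ops_view, revenue_finance_view)
```

Put a Senior Manager with one of these numbers in a room with a Department Head holding another, and you have the 45-minute reconciliation meeting described above.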

Finally, it’s about perspective. Departments in an organisation often talk about the same things but mean subtly different things because they have different perspectives. The sales team mean ordered revenue, because once someone has signed hard (three copies) their job is done, whilst the SMT are probably concerned about the revenue that they share with the markets in their statutory accounts.

So is a single version of the truth philosophy? Can it really be achieved? The answer is probably that there are multiple versions of the truth, but in many organisations they are all wrong. Many organisations are looking at different things from differing perspectives, and they are ALL inaccurate.

A high-performing organisation should be trying to unpick these knots, in priority order, one at a time. Eventually they will be able to look at multiple versions of the truth and understand their business from multiple perspectives. Indeed, the differences between the truths will probably tell them something they didn’t know from what they used to call ‘the single version of the truth’.

More and more choices for BI Solution Architects

We analytics practitioners have always had the luxury of alternatives to the RDBMS as part of our data architectural choices. OLAP of one form or another has been providing what one of my colleagues calls ‘query at the speed of thought’ for well over a decade. However, the range of options available to a solutions architect today is bordering on overwhelming.

First off, the good old RDBMS offers hashing, materialised views, bitmap indexes and other physical implementation options that don’t really require us to think too differently about the raw SQL. The columnar database, and implementations of it in products like Sybase IQ, are another option. The benefits are not necessarily obvious. We data geeks always used to think the performance issues were about joining, but then the smart people at InfoBright, Kickfire et al told us that shorter rows are the answer to really fast queries on large data volumes. There is some sense in this given that disk I/O is an absolute bottleneck, so fewer columns mean less redundant data read. The Oracle and Microsoft hats are in the columnar ring (if you will excuse the garbled geometry and mixed metaphor) with Exadata 2 and Gemini/Vertipaq, so they are becoming mainstream options.
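The ‘shorter rows’ argument lends itself to a back-of-envelope estimate. The row count, column names and widths below are invented, and real engines complicate the picture with compression and caching, but the shape of the saving holds: a row store must read whole rows, while a column store reads only the columns a query touches:

```python
# Hypothetical fact table: 100m rows, seven columns.
rows = 100_000_000
columns = {                    # column name -> width in bytes (invented)
    "order_id": 8, "customer_id": 8, "product_id": 8,
    "order_date": 8, "quantity": 4, "price": 8, "discount": 8,
}
row_width = sum(columns.values())

# A query like SUM(price) GROUP BY order_date touches two columns.
touched = ["order_date", "price"]

row_store_bytes = rows * row_width                        # reads every column
col_store_bytes = rows * sum(columns[c] for c in touched)  # reads two columns

print(f"row store scans    {row_store_bytes / 1e9:.1f} GB")
print(f"column store scans {col_store_bytes / 1e9:.1f} GB")
```

With disk I/O as the bottleneck, reading roughly a third of the bytes is the whole trick, before compression makes the gap wider still.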


Data Warehouse appliances are yet another option. These combined hardware, operating system and software solutions, usually using massively parallel processing (MPP), deliver high performance on really large volumes. And by large we probably mean peta not tera. Sorry NCR, tera just doesn’t impress anyone anymore. And whilst we are on the subject of Teradata, it was probably one of the first appliances, but then NCR strategically decided to go open shortly before the data warehouse appliance market really opened up. The recent IBM acquisition of Netezza and the presence of Oracle and NCR are reshaping what was once considered niche and special into the mainstream.


We have established that the absolute bottleneck is disk I/O, so in-memory options should be a serious consideration. There are in-memory BI products, but the action is really where the data is. Databases include TimesTen (now Oracle’s) and IBM’s solidDB. Of course, TM1 fans will point out that they had in-memory OLAP when they were listening to Duran Duran CDs, and they would be right.

The cloud has to get a mention here because it is changing everything. We can’t ignore the databases that have grown out of the need for massive data volumes, like Google’s BigTable, Amazon’s RDS and Hadoop. They might not have been built with analytics in mind, but they offer ways of dealing with unstructured and semi-structured data, and this is becoming increasingly important as organisations include data from online editorial and social media sources in their analytics. All of that being said, large volumes and limited pipes are keeping many on premises for now.

So, what’s the solution? Well, that is the job of the Solutions Architect. I am not sidestepping the question (well actually, I am a little). However, it’s time to examine the options and identify what information management technologies should form part of your data architecture. It is no longer enough to simply choose an RDBMS.