BI and Poor Decision Making

Good Decision/Bad Decision

This has been something of a preoccupation for me of late. We spend much of our time debating the technologies. We invest valuable time in deciding whether we should go with the mega-vendors (IBM, Oracle, SAP) or a challenger. We agonise over whether it should be cloud or on-premises, mart or warehouse, dimensional or relational. And it is all, frankly, academic if the business is not making good decisions.

There is no shortage of material that tries to make sense of why good people and great businesses make monumentally bad decisions. In the book ‘Think Again: Why Good Leaders Make Bad Decisions’ by Sydney Finkelstein, Jo Whitehead and Andrew Campbell, the focus is on the strategic decisions that have dramatic and highly visible consequences for the organisation.

Good People in Great Organisations Can Make Poor Decisions

An example is one of the UK’s premier retailers, Boots, which enjoys one of the largest footfalls in the UK. Established in the 19th century, it is now a subsidiary of the £20 billion Alliance Boots. In September 1998, the Chief Executive, Steve Russell, excitedly announced a range of healthcare offerings including dentistry, chiropody and laser hair removal. Five years later, the initiative had lost in the region of £100m and Boots needed to break open the piggy bank and look down the back of the sofa for another £50m just to close down the operation and convert that premier retail space back to being … retail. It almost goes without saying that the changes were implemented by a new CEO, Richard Baker.

Apparently, one of the chief reasons for making the move into Healthcare services was that a slowdown in the Beauty business ‘had been detected’. However, a spokesman was later quoted in the Telegraph as saying that ‘they recognised that these areas are still growing strongly’.

Let’s stop there for a second. Spotting trends in sales and revenue by product category is probably marketing and business 101. And even the most rudimentary business intelligence solution should be trending sales over time. Yet the trend in sales in a key category for Boots was diagnosed as a slowdown and, only a few months later, as growth. Of course, the slowdown may have been a short-term blip, but the point of trending is to smooth such blips out for the purpose of longer-term planning. And the error in trending might be more understandable had it not been for the fact that the later growth was characterised as ‘strong’.
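To make the point concrete, here is a minimal sketch of that kind of smoothing (in Python, with invented sales figures): a single weak month barely moves a long trend line, and the long trend line is the signal longer-term planning should read.

    # A minimal sketch of trend smoothing, using invented monthly sales figures.
    monthly_sales = [102, 104, 103, 106, 108, 96,   # month 6 dips: blip or trend?
                     110, 112, 111, 114, 116, 118]

    def moving_average(series, window):
        """Trailing moving average; one value per complete window."""
        return [sum(series[i - window:i]) / window
                for i in range(window, len(series) + 1)]

    print(moving_average(monthly_sales, 3))   # short window over-reacts to the dip
    print(moving_average(monthly_sales, 12))  # the full-year view still says: growth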

Of course, I am not on the board of Boots and I have an advantage shared with all those analysts and commentators that put the boot (or should that be Boots) into Mr Russell … hindsight. Indeed, it’s a testament to the strength of Boots as a high street giant that they can make major boo-boos and still go on to survive and thrive.

The Problem with Decisions …

And organisations are complex systems of individuals and interactions. Large organisations are very complex. This is why organisational decision making doesn’t always stand up to scrutiny when we, as individuals, retrospectively apply the logic of rational decision making to such mistakes.

There are a number of problems associated with individuals making decisions. Individuals have biases, self-interest and preconceptions. There are also a number of problems with organisational decisions. Groups have to manage conflict and disagreement, and there are dynamics that can produce undesirable outcomes like groupthink.

Today BI’s only Contribution is a Report, Chart or Dashboard

So if we accept that the purpose of Business Intelligence is to help organisations make better decisions (surely there is no debate here?), then Business Intelligence applications have to be more than reports, dashboards and charts.

They need to make decisions easier to collaborate around, and they need to link decisions directly to the information required to make them. Furthermore, decisions need to be open, transparent and accountable, not just for the regulators but so that the whole organisation can buy into them.

Decision Making Black Holes

 

A Funny Thing Happens at the Forum

Meetings are one of the most common decision making ‘forums’ we are all regularly involved in. In fact, one in five company meetings we attend is held to make a decision. As a way of making decisions, though, they can be problematic. Once the meeting has concluded, the connection between information shared, decisions made and actions taken can be weak, even lost. It’s as if the meeting itself were a decision making black hole.

Some Decisions are More Equal Than Others

Some decision making meetings are impromptu, convened to make a timely, tactical decision quickly. Others are regular, formal and arranged around the ‘drum beat’ or ‘cadence’ of a business to make more strategic decisions. The more strategic the decision and the longer-term the impact, the less frequent the forum, so a Senior or Executive Management Team may meet only quarterly for a business review (QBR).

How a QBR ‘Rolls’

A typical QBR will see Senior Managers sharing results in PowerPoint, possibly with financial results in spreadsheets, which I would hope have at least been extracted from a Business Intelligence application.

If the SMT are reasonably well organised, they will summarise their conclusions and actions in meeting minutes. The minutes will be typed up by an assistant in a Word document and then distributed by email.

Throughout, they will all have been keeping individual notes, so they will walk out with these in their daybooks. The most senior manager in the room might not do this, particularly if it’s their assistant who’s taking the minutes.

Later, actions from daybooks and minutes are likely to be transferred to individuals’ to-do lists, and all follow-up will be conducted in email and phone calls.

An Implosion of Information, Conclusion and Decision

So let’s recap. Critical decisions about how resources are going to be allocated will be discussed in a ‘QBR’ and yet the artefacts of this critical decision making forum are scattered into Word documents, Excel spreadsheets, emails and Outlook tasks. Tiny fragments of the discussion, information, conclusions, decisions and activities implode around the organisation. To be frank, the team are now only going to make progress because the forum was recent and can be relatively easily recalled.

Of course, once time or people move on, so does the corporate memory of the decision. Conversations begin with ‘what did we agree to do about that cost over-run?’ or ‘why did we say we were ok with the revenue performance in Q1?’

Executive Attention Deficit Syndrome

Many executives complain of a syndrome that feels like attention deficit syndrome. This is because the more senior the manager, the more things they will probably have to deal with at an increasingly superficial level. A functional head will probably spend no more than 15 minutes on any one thing. To make decisions productively, they need the background, status and related information to hand so that they can deal with each item quickly and move on to the next. Decision making black holes contribute to this feeling of Executive Attention Deficit Syndrome.

CDM and Corporate Memory

Corporate Decision Making platforms will be successful when they connect:

  • Decisions
  • Information on which the decision was made
  • Insight derived from the information
  • Actions taken on the decision
  • Results of the actions

This means total recall of corporate decisions, good and bad, so that, over time, decisions can be recalled, evaluated, re-used or improved. It is a far cry from current decision making forums which, whilst functional, are inherently flawed and fragmented, and are not improving the timeliness and quality of decisions in our organisations.
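As a sketch of what connecting those five elements might look like in practice, here is a hypothetical record structure of my own (in Python); the names are illustrative only, not any vendor’s schema.

    # A hypothetical sketch of a connected decision record; the field names
    # are illustrative only, not any product's schema.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Decision:
        statement: str                                         # the decision itself
        information: List[str] = field(default_factory=list)   # evidence it rested on
        insight: str = ""                                      # what the evidence was taken to mean
        actions: List[str] = field(default_factory=list)       # what was done about it
        results: List[str] = field(default_factory=list)       # what happened next

    # Total recall: a QBR decision, its evidence and its outcome in one place.
    q1_review = Decision(
        statement="Hold marketing spend flat for Q2",
        information=["Q1 revenue report", "category trend dashboard"],
        insight="Q1 shortfall looks like a timing blip, not a trend",
        actions=["Re-phase campaign budget to Q3"],
        results=["Q2 revenue back on plan"],
    )
    print(q1_review.statement, "->", q1_review.results)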

BI Requirements Should not just be Gathered

There are many resources remonstrating with the IT community on the importance of gathering requirements. Failing to gather requirements, they warn, will lead to a poor solution delivered late and over budget. This is largely inarguable.

However, I would warn that simply ‘gathering’ requirements is as big a risk. Fred Brooks, author of ‘The Mythical Man-Month’, once said that ‘the hardest part of building a software system is deciding what to build’. And deciding what to build is a two-way process rather than the act of listening, nodding and documenting that we all too often see in Business Intelligence projects.

From time to time, I hear someone cry foul on this assertion. They argue that it seems like the tail is wagging the dog or that the business cannot compromise on the requirement. I usually point out that simply building what the user asked for doesn’t happen in any other field of engineering. Architects advise on the cost of materials when planning a major new office building, city officials take advice on the best possible location for a bridge, and environmental consultants are actively engaged in deciding if, and exactly what, should be built in any major civil engineering project.

And this is exactly how we should approach business analytics requirements: as a two-way exploration of what is required, the possible solutions and the implications of each. Incidentally, this is particularly difficult to do if business users are asked to gather and document their own requirements without input from their implementation team.

An example of why this is important is rooted in the fact that many BI technologies (including IBM Cognos) are tools, not programming languages. They have been built around a model to increase productivity. That is, if you understand and work with the assumptions behind the model, reports, dashboards and other BI application objects can be built very quickly. Bend the model and development times increase. Attempting to work completely around the model may result in greatly reduced productivity and therefore vastly increased development time.

So be wary of treating ‘gathering’ and ‘analysis’ as distinct and separate steps. Instead, the process should be an iterative collaboration between users and engineers. Requirements should be understood, but so should the implications from a systems perspective. The resulting solution will almost certainly be a better fit, and the process significantly increases the chance of it being delivered on time, at the right cost and with an increased understanding between those that need the systems and those that build them.

"We fail more often because we solve the wrong problem than because we get the wrong solution to the right problem", Russell Ackoff, 1974

Death of the BI Generalist

I have fond memories of the 90s. I spent a fair amount of it being a ‘Cognos Consultant’, travelling the length and breadth of the UK being an expert in all things Cognos. There was a great sense of independence and autonomy. It was possible to pitch up at a new client with two CDs (one for PowerPlay, one for Impromptu) and install the software on a handful of PCs on Monday morning, and by Friday I would have interviewed for requirements, built a metadata model, cubes and reports, and would be huddled around a desktop with the users putting the final touches to what I dared to call a BI application.

It was the day of the renaissance BI consultant, the polymath. We all felt like experts from the moment we opened the jewel case to the point the user signed off their brand new Business Intelligence application.

As much as I enjoyed this, I don’t yearn for those simpler times. The solutions we built were great and delivered real and incremental value. They didn’t, however, have the breadth, depth and coverage of the best implementations today. These are deployed over the web to hundreds, sometimes thousands of users covering multiple business domains. Modern performance management solutions have brought together Sales, Marketing, Operations, Finance and the Senior Management Team in ways that were less achievable, perhaps even impossible, ten years ago.

There is a cost though. The range of skills required to deploy a modern BI (or Business Analytics) application is broad. Take the individual that installs the software. What used to require a working knowledge of Windows and an installation manual now also needs an understanding of web servers, operating systems, virtualisation, application servers and security, all in multiple flavours. It may also require systems management conversations to configure for fail-over and load-balancing in a way that wasn’t required before because it wasn’t possible. And whilst the internet means we are all effectively sharing a single network, it stands to reason that it is important to understand all the different ways of communicating and processing on this single, global network.

So far, we have only unwrapped the cellophane, but things get really interesting when we get our hands on data. The possibilities for delivering value from the vast array of corporate data assets have improved dramatically, but this means that they need to be integrated too. This requires an in-depth understanding of relational and dimensional modelling techniques, databases, SQL and at least one ETL tool. Whilst there are products that simplify this process with built-in transformations and (usually) OLAP cubes, they are typically departmental in scale. Modern solutions may also require a mix of on-premises and cloud data and a mix of structured and unstructured sources, which increases the richness of the solution but also the number of things a developer needs to keep in their head.
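As a flavour of the dimensional side of that skill set, here is a minimal star-schema sketch (Python with sqlite3; the tables and figures are invented for illustration): one fact table, one dimension, and the category trend falls out of a simple join.

    # A minimal star-schema sketch; table and column names are invented.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY,
                                  product_name TEXT, category TEXT);
        CREATE TABLE fact_sales  (product_key INTEGER REFERENCES dim_product,
                                  month TEXT, revenue REAL);
    """)
    con.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                    [(1, "Shampoo", "Beauty"), (2, "Plasters", "Healthcare")])
    con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                    [(1, "2011-01", 120.0), (1, "2011-02", 115.0),
                     (2, "2011-01", 40.0)])

    # The payoff of working with the model: category trends are one join away.
    query = """SELECT d.category, f.month, SUM(f.revenue)
               FROM fact_sales f JOIN dim_product d USING (product_key)
               GROUP BY d.category, f.month ORDER BY d.category, f.month"""
    for row in con.execute(query):
        print(row)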

So where does this leave us? If you have a team of homogeneous ‘BI Developers’ then their skills may be too general. I am working with one client in an advisory capacity at the moment and they have one consultant that has implemented everything. He has been very successful too. Their business, though, is of a size where he can wrap his arms around the requirement, the infrastructure and the data. He also has exceptional aptitude and is an outstanding consultant, throwing herculean effort at their projects. The rest of us mortals need to divvy up the responsibilities.

Many of my clients tend to do this across technical/systems, data and application. It’s not the only way, but it works really well. The IS team pick up the installation and maintenance of the application server environment, a data team create a single reporting database/data warehouse/marts, and a BI team maintain the metadata and the reporting application in all its variants of reports, analysis, dashboards, scorecards and so on.

There are interesting implications for the modern, complex out-sourcing and off-shoring organisation. Another of my clients, based in the City, out-sources their databases including data warehouse development. Because there is a cost each time they want to add a new data item for the purposes of BI, they tended to try to resolve new requirements in metadata or the reporting application. Inevitably, the quality of their solution deteriorated over time because they were not always extending or fixing in the right place. Eventually, the sticking plasters gave out and they were forced to go back to first principles, which had cost and re-work implications.

The more common issue is one of a shared design. Changes to data impact ETL, metadata, reports and the application, and a new requirement might need a change in one or all of these places. Whilst these are usually a number of small and straightforward changes, they do need to be applied consistently.

To co-ordinate activity when the solution is expanded or enhanced there needs to be a common data model. In our experience this needs to be two models: first, a logical (and relational) model that represents the business data in a perfect and integrated world; second, a dimensional model that represents the data as the business sees it in their reporting application through reports and metadata. These are surprisingly rare in our experience, and usually not because they take a long time to draw (they don’t). Our suspicion is that what makes them time-consuming is the need for consensus. However, if there is no consensus then there is a risk that the solution is already flawed.

So, a BI/Analytics solution cannot be built single-handedly. It requires a range of skills and it’s difficult to be an expert in all of them. The generalist, then, is fading away, replaced by a team working to a shared design who can deploy solutions with greater reach and richness than we could ever have believed possible a few short years ago.

Single version of the truth: philosophy or reality?

Assuming you want the truth, and you can handle it, you will have heard this a lot. The purpose of our new (BI/Analytics/Data Warehouse) project is to deliver ‘a single version of the truth’. In a project we are engaged with right now, the expression is ‘one version of reality’ or 1VOR. For UK boomers that will almost undoubtedly bring to mind a steam engine, but I digress.

I have to admit, I find the term jarring whenever I hear it because it implies something simple and attainable through a single system, which is rarely the reality.

In fact, it’s rarely attained, causing some of our community to ponder its viability or even whether it exists. Robin Bloor’s ‘Is There a Single Version of the Truth?’ and ‘Beyond a Single Version of the Truth’ on the Obsessive-Compulsive Data Quality blog are great examples.

Much has been written on this subject by data quality practitioners, and it speaks to master data management and the desire, for example, for a single and consistent view of a customer. Banks often don’t understand customers, they understand accounts, and if the number of home shopping brochures I receive (from, err, Hotel Chocolat, for example) is anything to go by then many retailers don’t get it either. Personally, I want my bank and my chocolatier to know when I am interacting with them. I’m a name, not a number, particularly when it comes to chocolate.

This problem is also characterised by the tired and exasperated tone of a Senior Manager asking for (and sometimes begging for) a single version of the truth. This is usually because they had a ‘number’ (probably revenue) and went to speak to one of their Department Heads about it (probably because it was unexpectedly low) and, rather than spending time on understanding what the number means or what the business should do, they spent 45 minutes comparing the Senior Manager’s ‘number’ with the Department Head’s ‘number’. In trying to reconcile them, they also found some more ‘numbers’ too. It probably passed the time nicely. Make this a monthly meeting or a QBR involving a number of department heads and the 45 minutes will stretch into hours without any real insight from which decisions might have been made.

This is partly about provenance. Ideally the number came from a single system of record (Finance, HR) or corporate BI, but it most likely came from a spreadsheet or, even worse, a presentation with a spreadsheet embedded in it.

It’s also about purity (or the addition of impurities, at least). It might have started pure, but the department head, or an analyst that works in their support and admin team, calculated the number based on an extract from the finance system and possibly some other spreadsheets. The numbers were probably adjusted because of some departmental nuance. For example, if it’s a Sales Team, the Sales Manager might include all the sales for a rep that joined part way through the year whilst Finance left the revenue with the previous team.

It will be no comfort (or surprise) to our Senior Manager that it is a Master Data Management problem too. Revenue by product can only make sense if everyone in the organisation can agree the brands, categories and products that classify the things that are sold. Superficially this sounds simple, but even this week I have spoken with a global business that is launching a major initiative, involving hundreds of man-hours, to resolve just this issue.

It’s also about terminology. We sacrifice precision in language for efficiency. In most organisations we dance dangerously around synonyms and homonyms because it mostly doesn’t catch us out. Net revenue … net of what? And whilst we are on the subject … revenue. Revenue as it was ordered, as it was delivered, as it was invoiced and as it is recognised according to GAAP rules in the finance system. By the way, does your number include credit notes? And this is a SIMPLE example. Costs are often centralised, allocated or shared in some way, all dependent on a set of rules that only a handful of people in the finance team really understand.
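To make the terminology trap concrete, here is a toy illustration (Python, with invented figures) of how one small order book yields several defensible ‘revenue’ numbers:

    # One set of transactions, several defensible "revenue" numbers.
    # All figures and statuses are invented for illustration.
    orders = [
        {"value": 100, "ordered": True, "invoiced": True,  "credit_note": 0},
        {"value": 250, "ordered": True, "invoiced": True,  "credit_note": 40},
        {"value": 300, "ordered": True, "invoiced": False, "credit_note": 0},
    ]

    ordered_revenue  = sum(o["value"] for o in orders if o["ordered"])
    invoiced_revenue = sum(o["value"] for o in orders if o["invoiced"])
    net_of_credits   = sum(o["value"] - o["credit_note"]
                           for o in orders if o["invoiced"])

    # Sales, Finance and the SMT can each quote a different, "correct" number.
    print(ordered_revenue, invoiced_revenue, net_of_credits)  # 650 350 310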

Finally, it’s about perspective. Departments in an organisation often talk about the same things but mean subtly different things because they have different perspectives. The sales team mean ordered revenue, because once someone has signed hard (three copies) their job is done, whilst the SMT are probably concerned about the revenue that they share with the markets in their statutory accounts.

So is a single version of the truth philosophy? Can it really be achieved? The answer is probably that there are multiple versions of the truth but they are, in many organisations, all wrong. Many organisations are looking at different things with differing perspectives and they are ALL inaccurate.

A high-performing organisation should be trying to unpick these knots, in priority order, one at a time. Eventually they will be able to look at multiple versions of the truth and understand their business from multiple perspectives. Indeed, the differences between the truths will probably tell them something they didn’t know from what they used to call ‘the single version of the truth’.

What 127 Hours Tells us About Social Networks

In an interview this week for Mark Kermode’s film review show, Danny Boyle made it clear that he used the success of Slumdog Millionaire to make a film that might otherwise not have been made. To leverage the success of 2010’s most acclaimed film is an indication that 127 Hours is more than this year’s ‘would you?’ movie.

It is the true story of Aron Ralston, who gets trapped under a boulder whilst canyoneering alone in Utah. The desperate measure that he takes to free himself is well documented, so it is not giving anything away to say that he is trapped by his arm, has a multi-tool (a really cheap one) and a little under 127 hours to debate if he should … or should not.

Before I go on, you might be wondering what the connection is between BI and social networks, let alone the connection between Danny Boyle’s latest movie and social networks. Those that follow my posts and tweets will know that social platforms interest me because I think they are changing the way we share and use information in business and will profoundly change the Business Analytics space over the coming years. A social platform has already made it into IBM Cognos 10 because these guys, again, are ahead of the game. Many don’t see it yet because the original uses of social platforms have trivialised their significance, but it’s there nonetheless.
 
Back to the connection. Aron Ralston, played by James Franco, is an all-American hero. He’s young, fit, strong, intrepid and independent. He is good at what he does, he has spent a lot of time in his chosen wilderness and is able to navigate it with speed and ease. In fact, at one point in his story, he briefly but convincingly takes the role of park guide. The hopelessness of his literal and figurative fall takes a long time to sink in for our hero. Indeed, when it does dawn on him that he could have shared his hiking plan with his friends or family, it wouldn’t be exaggerating to call it an epiphany. It’s clearly a powerful realisation for Aron that he’s not a hero, he’s an arse.
 
There is a moment in the movie where Aron says ‘thank you’. It’s a strange moment. I don’t want to give away why it is strange but once you have seen the movie, you will know why. For me, it was significant because he knew that if he made it home alive (which was still by no means certain) then he would be changed forever. He would live the rest of his life in the knowledge that, however strong, smart and experienced he was, those tiny connections we all make each day matter. Sometimes in small ways, because it’s just about sharing. Sometimes in significant and surprising ways.

For me, I am continually and pleasantly surprised by what I learn on the subjects of analytics, organisational leadership, productivity, start-ups and social media in my Twitter stream. It’s full of links to content covering important ideas from solid thinkers. Admittedly none of them are life-saving but, at a stretch, a rare few might be described as life-changing. Each of them makes a tiny but positive difference, and sometimes someone in my network helps me (or I them) in a surprising way.

More and more choices for BI Solution Architects

We analytics practitioners have always had the luxury of alternatives to the RDBMS as part of our data architecture choices. OLAP of one form or another has been providing what one of my colleagues calls ‘query at the speed of thought’ for well over a decade. However, the range of options available to a solutions architect today is bordering on overwhelming.

First off, the good old RDBMS offers hashing, materialised views, bitmap indexes and other physical implementation options that don’t really require us to think too differently about the raw SQL. The columnar database, and implementations of it in products like Sybase IQ, are another option. The benefits are not necessarily obvious. We data geeks always used to think the performance issues were about joining, but then the smart people at Infobright, Kickfire et al told us that shorter rows are the answer to really fast queries on large data volumes. There is some sense in this: given that disk I/O is an absolute bottleneck, reading fewer columns means reading less redundant data. The Oracle and Microsoft hats are in the columnar ring (if you will excuse the garbled geometry and mixed metaphor) with Exadata 2 and Gemini/VertiPaq, so they are becoming mainstream options.
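A toy model of why this matters (pure Python, with invented sizes): compare the bytes a query must scan when it needs only two of fifty columns.

    # Toy model of scan cost, with invented sizes: a query needs 2 of 50
    # columns from 100 million rows of 8-byte values.
    ROWS, COLS, BYTES_PER_VALUE = 100_000_000, 50, 8
    COLS_NEEDED = 2

    row_store_bytes = ROWS * COLS * BYTES_PER_VALUE            # whole rows scanned
    column_store_bytes = ROWS * COLS_NEEDED * BYTES_PER_VALUE  # only 2 columns scanned

    print(f"row store:    {row_store_bytes / 1e9:.1f} GB scanned")
    print(f"column store: {column_store_bytes / 1e9:.1f} GB scanned")  # 25x less I/O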


Data warehouse appliances are yet another option. The combined hardware, operating system and software solution, usually using massively parallel processing (MPP), delivers high performance on really large volumes. And by large we probably mean Peta not Tera. Sorry NCR, Tera just doesn’t impress anyone anymore. And whilst we are on the subject of Teradata, it was probably one of the first appliances, but then NCR strategically decided to go open shortly before the data warehouse appliance market really opened up. The recent IBM acquisition of Netezza and the presence of Oracle and NCR are reshaping what was once considered niche and special into the mainstream.


We have established that the absolute bottleneck is disk I/O, so in-memory options should be a serious consideration. There are in-memory BI products, but the action is really where the data is. Databases include TimesTen (now Oracle’s) and IBM’s solidDB. Of course, TM1 fans will point out that they had in-memory OLAP when they were listening to Duran Duran CDs, and they would be right.

The cloud has to get a mention here because it is changing everything. We can’t ignore those databases that have grown out of the need for massive data volumes, like Google’s BigTable, Amazon’s RDS and Hadoop. They might not have been built with analytics in mind, but they are offering ways of dealing with unstructured and semi-structured data, and this is becoming increasingly important as organisations include data from on-line editorial and social media sources in their analytics. All of that being said, large volumes and limited pipes are keeping many on-premises for now.

So, what’s the solution? Well, that is the job of the Solutions Architect. I am not sidestepping the question (well, actually, I am a little). However, it’s time to examine the options and identify which information management technologies should form part of your data architecture. It is no longer enough to simply choose an RDBMS.