A Data Science Central Community
For original blog post: http://sctr7.com/2013/02/20/achilles_heel/
Business Analytics Achilles' Heel: Organizational Politics
In spite of overwhelming power and strength, a single deadly vulnerability can lead to utter destruction. The current generation has likely seen this principle in Star Wars, in the guise of the Death Star’s fatal engineering flaw; the ancient Greeks embodied it in the myth of Achilles. It is proposed here that organizational politics is the Achilles’ heel of business analytics programs. To overcome this vulnerability, SARK7 applies an integrated analytics decision methodology which embeds robust organizational validation: http://www.sark7.com/docs/integrated_analytics.pdf
Achilles, the invulnerable warrior-hero of the Iliad, was prophesied to die an early death. To forestall the fates, his loving mother, Thetis, dipped him in the River Styx as a baby, a rite said to confer invulnerability. However, because she held him by the ankle while dousing him in the river, her hand covered his heel and kept it from being treated. After surviving many tumultuous military battles as a young warrior, Achilles was in the end dispatched by a poisoned arrow that lodged in his vulnerable heel. Thus stands the enduring notion that even the mighty can be felled by a small but deadly vulnerability.
Here it is proposed that the ‘deadly weakness’ of powerful business analytics solutions is hiding in plain sight: our own organizational decision processes, or the lack thereof. On a daily basis we are tantalized with ever more powerful tools and techniques for business analytics and ‘big data’ insight. We might say that analytics is being positioned as the ‘mighty Achilles’, a warrior capable of achieving business victory amid the tumult of chaos, complexity, and disorder besieging the global enterprise.
The advancement of business analytics capabilities is multifaceted and emergent: a fusion of advancing methodological techniques (statistical procedures, methods, and understanding), improved software tools (spanning the analytics lifecycle, from data handling and management through visualization), and raw computing power (storage capacity, retrieval speed, in-memory transformation, processing speed, transmission). However, celebrating such powerful technology-driven capabilities often ignores a stark, stubbornly social limitation: ourselves, particularly our intrinsic political nature within organizations.
While it may disturb a software marketer or two, business analytics and ‘big data’, stripped of their hyped ‘gee whiz’ technological, mathematical, and scientific trappings, are simply about making high-quality organizational decisions. In the end, making high-quality organizational decisions is a social, political, even anthropological vocation. While impractical and decidedly inefficient, it is entirely possible to conceive of ‘big data’ decisions being driven by manual calculators, paper, and meetings.
This gets to the very origin of business analytics: World War II, lacking sophisticated computers, saw the implementation of increasingly complex, literal configurations of ‘organizational calculators’ in military computation rooms to drive multi-staged quantitative analysis. Analytics took the form of staged rows of sequenced, desk-bound analysts performing step-wise component calculations to deliver manual linear programming, matrix analysis, and similar complex results in support of both tactical and strategic decision making. Ultimately, however, it was still the war-room generals who interpreted and hashed out the final decisions from such analysis.
A ‘decision’, the end result of any analysis (even if the decision is to ‘do nothing’), regardless of its technical or mathematical sophistication, is ultimately a psychological, and finally, an organizational (sociological and political) process: humans make their own judgments, both individually and in groups, all the way from problem identification, framing, and model design, through to results interpretation, decision making, and policy implementation.
It is therefore worth pausing briefly on the etymological roots of the term ‘decision’. ‘Decision’, via Middle French and thence Latin, implies a ‘settlement’ or ‘agreement’. The central notion is that a ‘decision’ signifies the end of some process of deliberation (implicitly a psychological and/or sociological process), the result of which is an agreement to accept the outcome of the analysis and a subsequent commitment to action.
The problem faced by the modern enterprise is that the complexity of problem sets, analytical models, and decision-vetting processes often obfuscates decision outcomes and thus invites their derailment. As sophisticated as software-driven processes and solutions become, they cannot escape the natural ‘speed limit’ of human organizations. Indeed, increasingly sophisticated technical and methodological solutions run the danger of wrapping core assumptions in a ‘black box’ of hidden algorithms, methods, and computations. It becomes tempting to ‘fetishize the machine’ as an oracle, to trust the complex ‘clockwork gears’ and thus to deify whatever results magically spring forth. To the degree that organizational decision-making processes lack robust checks and balances, there is a continual risk that analytics decisions are hijacked by human bias, both implicit (unaware) and explicit (conscious).
As a result of these unintended perverse organizational tendencies, thoroughness, lack of bias, and ‘objectivity’ must be inculcated in business analytics decision making processes for the results of a given analysis to have ‘scientific merit’ or simple pragmatic robustness. Such process robustness can only be assured when proper organizational model and decision validation practices are put in place and applied appropriately. This is, at root, a socio-organizational problem, not a technical one.
Analytics models and decisions must be vetted by robust political-organizational mechanisms. To the degree such mechanisms are poorly designed and communicated in organizations, there is a continual risk that decisions are hijacked and perverted. Because organizational decisions are a collaboration between experts and management stakeholders, both heavily motivated in commercial settings by financial and power-political agency incentives, there is always the risk that experts and stakeholders will collude to bias the results of an analytics inquiry. This temptation to sway an analytics or experimental inquiry can occur at any stage of the analytics decision process, from problem identification and framing through results interpretation and communication. In other words, to ensure robustness, analytics models and decisions must undergo an organizational validation process which consciously programs out the temptations of implicit and explicit bias.
Tools can help drive such validation procedures, but if the organizational structure and culture are, at base, unwilling, sophisticated business analytics technology and solutions are all for naught. We see this, for example, in organizations that are heavily bureaucratic and pursue analytical inquiry merely to ‘check a box’, with aberrant results sent back for reframing and fidgety manipulation until they conform to the expected conclusion. In an experimental setting, this is called ‘bad science’; in an organizational setting, it is at times the status quo, as embodied in Disraeli’s (or Twain’s) quip: “There are three kinds of lies: lies, damned lies, and statistics.”
Every analytics practitioner I know has a story about being pressured, influenced, or incentivized, sometimes all at once, to orient the results of an analytical inquiry in a particular direction. The incentives are complex and diverse, but come down to stakeholder politics: managers seeking to increase funding or headcount (empire building), the desire to portray an initiative in a positive light to increase professional visibility or reputation, kowtowing to a patronage-network obligation, fear of criticism or reprisal over negative news, simple sycophancy and advancement-seeking, and the subtle influence of hidden heuristic biases (e.g. overconfidence, availability, etc.: http://en.wikipedia.org/wiki/List_of_biases_in_judgment_and_decisio...).
Without standardized practices and procedures for consciously removing agency conflicts and bias, and so fortifying ‘objectivity’, analytics efforts can easily result in decisions which are hamstrung, biased, or captive to special interests. Organizational decision processes, given the power of incentives, thus run the risk of being perverted toward biased or self-centered interests, no matter how fancy or expensive the analytics solution supporting the inquiry. Thus it can be said that poor or absent organizational validation is the central ‘Achilles’ heel’ of analytics implementations. Buyer beware: failing to put proper organizational controls in place can turn a massive investment in business analytics technology into an expensive paperweight.
How do we, as analytics practitioners and managers, navigate this treacherous yet subtle channel? Urging caution, cursory ‘check-boxing’ procedures, finger wagging, prescribing placebos, and pure lip service are too easily rendered ineffective, victims of good intentions. However, there is hope: if we ‘flip our frame’, we can consider any organization as, in and of itself, a type of ‘decision-making computer’, that is, a unified mechanism for framing and making the strategic decisions that guide the behavior of a commercial entity as a whole. Indeed, this is the central principle of the ‘knowledge-based theory of the firm’ (http://en.wikipedia.org/wiki/Knowledge-based_theory_of_the_firm), namely that a firm is a composite of agents, each granted decision-making rights and access to particular information. The ‘dynamic tension’ of these decision agents, bound together through incentives, assessment systems, and particular powers, determines the robustness and outcome of firm-wide decision making.
While an exciting principle, this notion may strike some as compelling yet impractical. Wouldn’t the overhead required to map and contain such entrenched social complexity be prohibitive? The answer, and increasingly so, is: no. While political willpower may be lacking, there are rapid and powerful techniques for ferreting out potential bias in organizational decision-making processes. For instance, there are increasingly powerful tools and methods for quickly mapping social networks, propelled in part by the emerging power and visibility of social network websites such as Facebook and LinkedIn. The underlying principles and methods go back as far as the 1930s, to the sociologist Moreno’s sociograms. Today, there is a robust methodology called Social Network Analysis (SNA): http://en.wikipedia.org/wiki/Social_network_analysis .
In future articles, I will go into greater detail concerning the application of SNA to validating business analytics decision-making processes. In short, structured understandings of organizational social networks, focused on decision-making processes, give insight into hidden potential weaknesses in organizational decision making. For instance: is there a lack of redundancy in organizational decision networks? Does one stakeholder monopolize a ‘gatekeeper’ role in some aspect of the decision process? Is there a lack of direct connection between different parts of the decision analytics process, for instance problem framing, data analysis, and results interpretation, such that there is the risk of a ‘game of secrets’ whereby intent and meaning are lost across a broad, tenuous communication chain? Are there stakeholders with pronounced conflicts of interest in a decision network, for instance a manager who both sponsors an analytics inquiry and stands to gain from a particular decision outcome, particularly when objective oversight is lacking? Is there a ‘shadow’ decision network in place whereby an independent group reviews the robustness of the analytics modeling and validation process? Is there an independent expert body, for instance an Analytics Center of Excellence, which monitors and ensures the robustness of the decision process?
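A few of these diagnostics can be sketched directly in code. The example below is a hypothetical illustration, not a SARK7 deliverable: the stakeholder names and hand-off edges are invented, and it assumes the open-source networkx library. Betweenness centrality flags potential ‘gatekeepers’ who monopolize the flow between stages, while articulation points expose the lack of redundancy behind a ‘game of secrets’ hand-off chain.

```python
# Sketch: probing a (hypothetical) analytics decision network with SNA.
import networkx as nx

# A pure hand-off chain from problem framing to final sign-off,
# with no direct oversight or cross-links between stages.
G = nx.Graph()
chain = ["sponsor", "framing_lead", "analyst", "interpreter", "decision_maker"]
G.add_edges_from(zip(chain, chain[1:]))

# Gatekeeper check: betweenness centrality measures how often a
# stakeholder sits on the shortest paths between all other pairs.
betweenness = nx.betweenness_centrality(G)
gatekeeper = max(betweenness, key=betweenness.get)

# Redundancy check: articulation points are single points of failure --
# removing any one of them disconnects the decision network.
single_points_of_failure = set(nx.articulation_points(G))

print(gatekeeper)                        # the mid-chain analyst dominates
print(sorted(single_points_of_failure))  # every internal node is one
```

In a richer map, adding an independent review link (say, from an Analytics Center of Excellence to both the framing and interpretation stages) would lower the gatekeeper's centrality and remove the single points of failure, which is exactly the structural remedy the questions above are probing for.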
The raison d’être of business analytics is ultimately organizational decision making. Business analytics technical and methodological solutions are thus limited by the relative maturity of organizational decision-making processes themselves. A lack of organizational analytics maturity, particularly identifiable in weaknesses in organizational decision networks, can easily derail even the most sophisticated analytics inquiry.
Methods such as SNA give organizations a tool to strengthen evidence-based management via deeper sociological and anthropological understandings of organizational decision processes. A practitioner-focused methodology, based upon SARK7 consulting engagements, is the use of SNA in an integrated analytics process. An overview can be found, with our compliments, here: http://www.sark7.com/docs/integrated_analytics.pdf