MIS40650 – Big Data, Analytics, and Evidence-based Management

This session covered the connections between Big Data, Analytics and the evidence-based paradigm. Goldenberg’s article on the evidence-based approach in medicine was discussed. In particular, the debate centred on the notion, articulated in the paper, that the appeal to the authority of evidence that characterises evidence-based practices does not increase objectivity but rather obscures the subjective elements that inescapably enter all forms of human enquiry. Evidence was defined as ‘some conceptual warrant for belief or action’, and the centrality of evidence in science was accepted.

Goldenberg – On evidence and EBM – lessons from the philosophy of science

It is the practice of basing all beliefs and practices strictly on evidence that allegedly separates science from other activities. The evidence-based medicine (EBM) movement purports to eschew the unsystematic and intuitive methods of individual clinical practice in favour of a more scientifically rigorous approach. This rigour is achieved through methodical clinical decision making based on the examination of evidence derived from the latest clinical research. Evidence-based techniques are an extension of the philosophical system of logical positivism, which recognises only scientifically verifiable propositions as meaningful. This school of thought originated in Vienna in the 1920s, and as a number of members of the Vienna Circle emigrated to the UK and US, logical positivism came to exert a strong influence on Anglo-American analytic philosophy.

The EBM movement centres around five linked ideas:

  1. Clinical decisions should be based on the best available scientific evidence
  2. The clinical problem, and not habits or protocols, should determine the type of evidence to be sought
  3. Identifying the best evidence means adopting an epidemiological and biostatistical way of thinking
  4. Conclusions derived from identifying and critically appraising evidence are useful only if put into action in managing patients or making health care decisions
  5. Performance should be constantly evaluated

The synthesis of large amounts of clinical trial data into manageable “clinical summaries” or “meta-analyses” in EBM projects such as the Cochrane Collaboration can be seen as a first step towards the Big Data concept. A fixed-effect meta-analysis, for instance, pools each trial’s effect estimate weighted by its precision, as the sketch below illustrates.
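
A minimal sketch of inverse-variance pooling in Python (the numbers are purely made up for illustration, not drawn from any actual Cochrane review):

```python
import numpy as np

# Hypothetical effect estimates (e.g. log odds ratios) and standard
# errors from three trials -- illustrative numbers, not real trial data.
effects = np.array([-0.30, -0.15, -0.45])
std_errs = np.array([0.12, 0.20, 0.25])

# Fixed-effect (inverse-variance) pooling: weight each trial by 1/SE^2,
# so more precise trials contribute more to the summary estimate.
weights = 1.0 / std_errs**2
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

print(f"pooled effect: {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI)")
```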

EBM has been critiqued on two grounds. First, Hanson (1958), Kuhn (1970, 1996) and Feyerabend (1978) claimed that observation is theory-laden, coloured by our background beliefs and assumptions, and can therefore never be an unmediated perception of the nature of things. Second, Duhem (1982) and Quine (1960) argued that our theory choices are never determined exclusively by evidence; instead, a given body of evidence may support numerous, even contradictory, theories.

Phenomenological approaches to science and medicine further challenge notions of evidence in EBM by questioning why relevant evidence is assumed to come primarily from clinical trials and other objective measures. They argue instead that patients’ self-understanding and experience of illness also offer a legitimate source of relevant medical knowledge. This theoretical approach is grounded in the philosophy of Edmund Husserl and his followers, who questioned the philosophical completeness of the natural sciences. They argued that Cartesian dualism, which splits the world into minds and bodies, fails to explain human understanding, leading to a crisis of meaning.

Next, the dictum “You can’t manage what you don’t measure”, attributed to both Deming and Drucker, was explored in McAfee & Brynjolfsson’s article on Big Data.

McAfee & Brynjolfsson – Big Data – The Management Revolution

The claim that the more we measure, the better we can manage can be justified statistically by showing that data-driven companies are more profitable than others. But this is not a given: it all depends on how the data is analysed and how committed senior management is to data analytics. The article describes how Big Data differs from the established field of analytics and why it has become important in recent years. It outlines three key differences:

Volume: As of 2012, about 2.5 exabytes of data are created each day, and that number is doubling every 40 months or so. More data cross the internet every second than were stored in the entire internet just 20 years ago. This gives companies an opportunity to work with many petabytes of data in a single data set, and not just from the internet (a rough projection of this doubling rate is sketched after the three differences below).

Velocity: For many applications, the speed of data creation is even more important than the volume. Real-time or nearly real-time information makes it possible for a company to be much more agile than its competitors.

Variety: Big data takes the form of messages, updates, and images posted to social networks; readings from sensors; GPS signals from cell phones; and more. Many of the most important sources of big data are relatively new.
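
To make the Volume figures concrete, here is a minimal sketch of what the article’s growth rate implies (the 2.5 EB/day baseline and 40-month doubling time come from the article; the function itself is our own illustration):

```python
# Rough exponential growth projection using the article's figures:
# ~2.5 exabytes/day in 2012, doubling roughly every 40 months.
def daily_volume_exabytes(months_since_2012: float,
                          base_eb: float = 2.5,
                          doubling_months: float = 40.0) -> float:
    """Daily data volume, doubling every `doubling_months` months."""
    return base_eb * 2 ** (months_since_2012 / doubling_months)

# Five years (60 months) after 2012:
print(f"{daily_volume_exabytes(60):.1f} EB/day")  # about 7.1 EB/day
```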

The article describes five management challenges that come with Big Data:

Leadership: Companies succeed in the big data era not simply because they have more or better data, but because they have leadership teams that set clear goals, define what success looks like, and ask the right questions. Big data’s power does not erase the need for vision or human insight.

Talent management: As data become cheaper, the complements to data become more valuable. Some of the most crucial of these are data scientists and other professionals skilled at working with large quantities of information. Along with the data scientists, a new generation of computer scientists are bringing to bear techniques for working with very large data sets. The best data scientists are also comfortable speaking the language of business and helping leaders reformulate their challenges in ways that big data can tackle. Not surprisingly, people with these skills are hard to find and in great demand.

Technology: The tools available to handle the volume, velocity, and variety of big data have improved greatly in recent years. In general, these technologies are not prohibitively expensive, and much of the software is open source. Hadoop, the most commonly used framework, combines commodity hardware with open-source software. However, these technologies do require a skill set that is new to most IT departments, which will need to work hard to integrate all the relevant internal and external sources of data.

Decision making: An effective organization puts information and the relevant decision rights in the same location. In the big data era, information is created and transferred, and expertise is often not where it used to be. The artful leader will create an organization flexible enough to minimize the “not invented here” syndrome and maximize cross-functional cooperation.

Company culture: The first question a data-driven organization asks itself is not “What do we think?” but “What do we know?” This requires a move away from acting solely on hunches and instinct. It also requires breaking a bad habit the authors have noticed in many organizations: pretending to be more data-driven than they actually are.

Marcus’s article in The New Yorker explores similar themes.

Marcus – Steamrolled by Big Data (The New Yorker)

The article recounts the story of Google improving spell checkers using Big Data, along with the case of Oren Etzioni, who created Farecast (eventually sold to Microsoft, and now part of Bing Travel), which scraped data from the Web to make good guesses about whether an airline fare would rise or fall.
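
Farecast’s actual models were proprietary, so the following is only a toy sketch of the general idea; the function name and the mean-reversion heuristic are our own assumptions, not Etzioni’s method:

```python
import numpy as np

# Toy heuristic (not Farecast's real model): guess that a fare will rise
# when the current price sits below its recent trailing average.
def likely_to_rise(price_history: list[float], window: int = 7) -> bool:
    recent = np.array(price_history[-window:])
    return price_history[-1] < recent.mean()

fares = [420, 415, 430, 410, 405, 395, 390]
print("Buy now" if likely_to_rise(fares) else "Wait")  # prints "Buy now"
```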

The case study of Numenta, founded by Jeff Hawkins of Palm Pilot fame, is discussed in some detail. According to Numenta’s Web site, their software, Grok, “finds complex patterns in data streams and generates actionable predictions in real time…. Feed Grok data, and it returns predictions that generate action. Grok learns and adapts automatically.” Numenta boasts that “As the age of the digital nervous system dawns, Grok represents the type of technology that will convert massive data flows into value.”

Marcus claims that every problem is different and that there are no universally applicable solutions. An algorithm that is good at chess isn’t going to be much help parsing sentences, and one that parses sentences isn’t going to be much help playing chess. A faster computer will be better than a slower computer at both, but solving problems will often (though not always) require a fair amount of what some researchers call “domain knowledge”: specific information about particular problems, often gathered painstakingly by experts. Big Data is a powerful tool for inferring correlations, not a magic wand for inferring causality.
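
The correlation-versus-causality point is easy to demonstrate: two series with no causal link whatsoever can still show a large sample correlation. A minimal sketch (our own illustration, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent random walks: by construction there is no causal
# relationship between them, yet their sample correlation is often
# far from zero, purely because both series trend over time.
x = np.cumsum(rng.standard_normal(1000))
y = np.cumsum(rng.standard_normal(1000))

print(f"correlation: {np.corrcoef(x, y)[0, 1]:.2f}")
```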

The article also presents a critique of Big Data by invoking a chat the author had with Anthony Nyström of the Web software company Intridea, in which Nyström claimed that selling Big Data is a great gig for charlatans, because they never have to admit to being wrong. “If their system fails to provide predictive insight, it’s not their models, it’s an issue with your data.” You didn’t have enough data, there was too much noise, you measured the wrong things. The list of excuses can be long.

Morozov’s New Yorker article on Project Cybersyn, “The Planning Machine”, was discussed next.

http://www.newyorker.com/magazine/2014/10/13/planning-machine

The article describes the origins of the Big Data concept with the story of Stafford Beer, a leading theorist of cybernetics, who envisaged Project Cybersyn to help Chile’s socialist government manage the country and its economy with the help of computers. Beer helped design systems like Datafeed, which had four screens that could show hundreds of pictures and figures of historical and statistical information on the state of production in the country. Another screen simulated the future state of the Chilean economy under various conditions.

One wall was reserved for Project Cyberfolk, an ambitious effort to track the real-time happiness of the entire Chilean nation in response to decisions made in the op room. Beer built a device that would enable the country’s citizens, from their living rooms, to move a pointer on a voltmeter-like dial that indicated moods ranging from extreme unhappiness to complete bliss. The plan was to connect these devices to a network—it would ride on the existing TV networks—so that the total national happiness at any moment in time could be determined. The algedonic meter, as the device was called (from the Greek algos, “pain,” and hedone, “pleasure”), would measure only raw pleasure-or-pain reactions to show whether government policies were working.

As Eden Medina shows in “Cybernetic Revolutionaries,” her entertaining history of Project Cybersyn, Beer set out to solve an acute dilemma that Allende faced. How was he to nationalize hundreds of companies, reorient their production toward social needs, and replace the price system with central planning, all while fostering the worker participation that he had promised? Beer realized that the planning problems of business managers—how much inventory to hold, what production targets to adopt, how to redeploy idle equipment—were similar to those of central planners. Computers that merely enabled factory automation were of little use; what Beer called the “cussedness of things” required human involvement. It’s here that computers could help—flagging problems in need of immediate attention, say, or helping to simulate the long-term consequences of each decision. By analyzing troves of enterprise data, computers could warn managers of any “incipient instability.” In short, management cybernetics would allow for the reëngineering of socialism—the command-line economy.

Yet central planning had been powerfully criticized for being unresponsive to shifting realities, notably by the free-market champion Friedrich Hayek. The efforts of socialist planners, he argued, were bound to fail, because they could not do what the free market’s price system could: aggregate the poorly codified knowledge that implicitly guides the behavior of market participants. Beer and Hayek knew each other; as Beer noted in his diary, Hayek even complimented him on his vision for the cybernetic factory, after Beer presented it at a 1960 conference in Illinois. (Hayek, too, ended up in Chile, advising Augusto Pinochet.) But they never agreed about planning. Beer believed that technology could help integrate workers’ informal knowledge into the national planning process while lessening information overload.

Next, Harford’s article on Big Data was discussed, in which he details what might be going wrong with the whole concept.

Harford, T. (2014). Big Data: Are we making a big mistake? The Financial Times.

(http://www.ft.com/intl/cms/s/2/21a6e7d8-b479-11e3-a09a-00144feabdc0.html#axzz3VCAz0yo4)

The article refers to “Google Flu Trends”, which was quick, cheap and theory-free, yet still made reasonably accurate predictions of flu trends across America. Google’s engineers didn’t bother to develop a hypothesis about which search terms (“flu symptoms” or “pharmacies near me”, say) might be correlated with the spread of the disease itself. The Google team just took their top 50 million search terms and let the algorithms do the work.
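
In spirit, the “theory-free” step is just a massive correlation screen. A minimal sketch with synthetic data (the term counts and flu series here are randomly generated, and the variable names are our own):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic weekly counts for many search terms, plus an official flu
# incidence series. The "theory-free" step: rank terms purely by how
# strongly they correlate with the flu series, with no hypothesis.
n_weeks, n_terms = 104, 500
term_counts = rng.poisson(100, size=(n_terms, n_weeks)).astype(float)
flu = rng.poisson(50, size=n_weeks).astype(float)

corrs = np.array([np.corrcoef(t, flu)[0, 1] for t in term_counts])
top = np.argsort(-np.abs(corrs))[:10]
print("most correlated terms:", top)
# Since everything here is random, the "top" terms correlate by chance
# alone -- exactly the multiple-comparisons trap Harford warns about.
```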

The article urges caution about four claims concerning Big Data that are prevalent among businesses:

  • Data analysis produces uncannily accurate results
  • Every single data point can be captured, making old statistical sampling techniques obsolete
  • It is passé to fret about what causes what, because statistical correlation tells us what we need to know
  • Scientific and statistical models aren’t needed as “with enough data, the numbers speak for themselves”

A Big Data set is often said to be one where “N = All”: we have the whole population, so no sampling is required. But this notion can be challenged, since it is virtually impossible to capture every data point, and a large sample collected through a biased channel can mislead more than a small random one.
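
A minimal sketch of why “N = almost all” is not “N = All” (entirely synthetic numbers; the bias mechanism is our own assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "population" of one million values with true mean 50.
population = rng.normal(loc=50, scale=10, size=1_000_000)

# A small but properly random sample...
random_sample = rng.choice(population, size=1_000, replace=False)
# ...versus a huge sample whose collection channel silently drops
# low values (e.g. people who never generate the data being scraped).
biased_sample = population[population > 45][:100_000]

print(f"true mean:         {population.mean():.2f}")
print(f"random, n=1,000:   {random_sample.mean():.2f}")
print(f"biased, n=100,000: {biased_sample.mean():.2f}")  # ~55, far off
```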


MIS40650 – Trust in Global Networks of Innovation

This session covered the issue of trust, which is becoming more important with the advent of virtual organisations. The concept of virtual organisations is discussed in Introna’s article, but to say that the concept has been fully articulated or nailed down would be a fallacy.

Introna and Tiow – Virtual Organisations

The notion of a virtual organisation is difficult to grasp, and the reasons why virtual organisations have started to work well in recent years are still being thought through. Introna’s article suggests that in recent years ‘virtual’ has successfully become the metaphor for technology. The dictionary definition of ‘virtual’ is ‘almost, even if not exactly or in every way’, and this is now evident in the neologisms made popular in the IT industry, such as ‘virtual memory’, ‘virtual computer’, ‘virtual reality’ and ‘virtual space’. In each of these instances, ‘virtual’ connotes information technology’s ability to:

  1. provide a way of making a computer system act as if it had more capacity than it really possessed
  2. give users the illusion that something exists at whatever time or place it is needed

Introna’s article highlights some key elements of virtual organisations:

  1. Strategic alliance: The key attribute of virtual organisations is partnering; they are built around alliances and outsourcing agreements
  2. Core competence: The concept of core competencies underpins the creation of virtual organisations, forging a form of partnership in which each partner is expected to apply its core competencies to deliver world-class products and services.
  3. Trust: The partners in a virtual organisation exhibit ‘unprecedented levels of trust and commitment’
  4. Organisation restructuring: A virtual organisation is greatly dependent on its structure for the successful execution of identified work tasks.

The second part of Introna’s article presents a critique of the notion of virtual organisations, covering the following points:

  1. Trust & conflict: Virtual organisations have an issue with trust, as people are remotely located and miscommunication often results in conflict
  2. Whole & parts: Virtual organisations assume that all partners will bring their core competencies together, resulting in knowledge sharing and better execution, but that rarely happens
  3. Knowledge & language: Organisational knowledge is largely tacit, and the question remains how to locate it and make it available to the partners in a virtual organisation.

The concept of trust was taken up in more detail with reference to the offshoring case study in Kelly & Noonan’s article, involving an Irish start-up firm and a large Indian outsourcing provider.

Kelly & Noonan – Anxiety and psychological security in offshoring, Journal of Information Technology

The case study highlights the process of trust building between two organisations that begin as strangers, and the challenges faced during this journey.

Seamas also touched upon Giddens’s distinctive, non-cognitive conception of trust, defined as ‘emotional commitment’, and how this makes the parties involved vulnerable and can potentially risk their very existence.

The reasons why trust is so difficult to generate and maintain in the postmodern world were analysed. The following reasons for rising mistrust were given:

  1. Globalisation: Giddens (1990) argues that the risk profile of the modern globalized world has been dramatically altered as institutional reflexivity has increased and social relations are disembedded from local contexts and stretched over extended tracts of time–space. These new social arrangements have problematized the means by which individuals establish and maintain a sense of psychological security and coherent identity (Giddens, 1991), which has resulted in the simultaneous transformation, and renewed importance, of trust relations.
  2. Decay in social institutions: The decay of institutions like the Church, kinship networks, the family and trade unions has resulted in the loss of social settings that had earlier helped generate trust.
  3. Technology and the complexities around it: Giddens distinguishes between two types of trust relations prevalent in modern societies: trust in abstract systems and personal trust. The former is based, to a large extent, on faceless commitments, while the latter depends on facework commitments (trust relations sustained through face-to-face interaction). The investment of trust in abstract systems, such as air travel or a computer operating system, is a central feature of modern life. No one can completely opt out of the abstract systems involved in modern institutions, yet, due to their diversity and complexity, our knowledge of their workings is necessarily limited. Therefore, trust (or faceless commitment) becomes a very important means of generating the ‘leap of faith’ that practical engagement with them demands. Often, however, engagement with abstract systems involves encounters with individuals who ‘represent’ or are ‘responsible’ for them (e.g. visiting a medical doctor, who represents a broader system of medical knowledge). Such contacts with experts are very consequential and take place at access points, which form the meeting ground of facework and faceless commitments.

Dublin City Centre


You start dying slowly – Pablo Neruda

You start dying slowly

If you do not travel,

If you do not read,

If you do not listen to the sounds of life,

If you do not appreciate yourself.

You start dying slowly

When you kill your self-esteem;

When you do not let others help you.

You start dying slowly

If you become a slave of your habits,

Walking everyday on the same paths…

If you do not change your routine,

If you do not wear different colours

Or you do not speak to those you don’t know.

You start dying slowly

If you avoid feeling passion

And its turbulent emotions;

Those which make your eyes glisten

And your heart beat fast

You start dying slowly

If you do not change your life when you are not satisfied with your job, or with your love,

If you do not risk what is safe for the uncertain,

If you do not go after a dream,

If you do not allow yourself,

At least once in your lifetime,

To run away from sensible advice…

Pablo Neruda


The latest use for drones: spotting exam cheats – ScienceAlert.

"Those Irish are a disgrace to mankind!"

– a political cartoon from a German newspaper after the marriage equality referendum


MIS40650 – ICT and Knowledge Working in Practice

This session continued the last session’s discourse on McDermott’s six characteristics of knowledge. There was broad agreement with concepts such as: knowing is a human act; knowledge is a residue of thinking; and knowledge belongs to the present moment.

But contentious issues were raised about the claim that knowledge belongs to communities. Knowledge might belong to communities, but the importance of communities in knowledge generation is debatable. Communities help disseminate knowledge, but being a knowledge producer requires more than membership of communities of practice. A true genius who produces knowledge needs the ability to absorb existing knowledge and then brood over it, connect the dots and generate new insights, arriving at a radically different way of thinking.

Being a member of a community can also be a distraction, as it exposes one to different viewpoints and can cause confusion. To produce knowledge it is important to remain focused and on course. If Einstein, instead of sitting and brooding in his small patent office room, had been a professor at a university or a middle manager in an organisation, would he have been able to postulate the theory of relativity? It is quite possible that, as an active member of a community, Einstein would have had enough distractions to prevent the sustained deliberation over existing knowledge that is a prerequisite for knowledge creation.

Next the case studies for the session were discussed.

Kirkpatrick’s article on groupware was adjudged to be a PR brochure for Price Waterhouse, full of triumphant rhetorical claims about the efficiency and core competency of PWC in using groupware solutions. The article claims that PWC is a competent user of groupware technologies and can help its clients become more efficient through the use of such technologies. It presents a case where groupware usage is presented as a success story.

Kirkpatrick, D. (1993). “Groupware goes boom.” Fortune, Vol. 128.

The next article, from Orlikowski, presents a case where a groupware implementation was a failure. Seamus in fact claimed that the implementer in the article was actually PWC, though this is not explicitly mentioned in the article. These two articles thus represent cases at opposite ends of the spectrum, highlighting both the success and the failure of groupware implementations.

Orlikowski, W. J. (1993). “Learning from Notes: organizational issues in groupware implementation.” The Information Society, 9: 237-250.

Seamus then gave the background to the next article, which he himself co-authored. It presents a case where groupware was implemented in a firm with radically different outcomes across locations within the same firm. In most locations the implementation was a failure, but in one particular location it was a success, because the leader there had already invested in community building and was ready to take responsibility for any adverse consequences of using the new technology.

An interesting point raised by Seamus concerned the theory of gift giving and how it relates to community building within an organisation. Inside an organisation, people help each other; in effect, they gift their own or their team’s expertise to other teams. There is a sense of obligation in these exchanges, and that is what makes different teams help each other in different circumstances. But most groupware systems built for organisations ignore this concept of obligation and so fail to represent reality.

Another theme is the disconnect between management and the workforce. Management wants to implement groupware to gain insight into the workings of the workforce, but what is in it for the workforce to enter those details into the groupware systems?

Also, to build successful communities of practice it is important that there is a core group of experts who both engage in and enjoy information and knowledge sharing among themselves. Only then will there be something of value for junior members in the knowledge-sharing process.

Kelly, S. and Jones, M. (2001). “Groupware and the social infrastructure of communication.” Communications of the ACM, 44(12): 77-79.

The Hayes & Walsham and Orlikowski & Hofman articles were discussed only briefly, as they largely reiterate concepts raised in the previous articles.

Hayes, N. and Walsham, G. (2001). “Participation in groupware-mediated communities of practice: a socio-political analysis of knowledge working.” Information and Organization, 11(4): 263-288.

Orlikowski, W. J. and Hofman, J. D. (1997). “An improvisational model for change management: the case of groupware technologies.” Sloan Management Review, (Winter): 11-21.

The Boeing article was discussed in detail, as it represents a successful implementation of collaborative practices in a group within Boeing that produced an extremely successful outcome. But the article was debatable, as it claimed three key success factors (KSFs) for successful collaboration within an organisation:

  1. Strategy-Setting: Establishing an umbrella agreement in advance of team formation
  2. Technology Use: Using collaborative technology not only to collaborate but also to manage knowledge
  3. Work restructuring: Restructure work processes without changing the core creative needs of the team

But most collaborative teams use these three success factors to some degree and still fail dramatically. So there must be something beyond the KSFs themselves, perhaps how well they were applied within the team; what is missing from the article is insight into how this team managed to apply the KSFs so effectively as to achieve the successful outcome.

During the deep dive it was also concluded that the principles behind these three KSFs essentially mirror Agile methodology, and that the team was using Agile processes without explicitly knowing it.

Malhotra, A., Majchrzak, A., Carman, R. and Lott, V. (2001). “Radical innovation without collocation: a case study at Boeing-Rocketdyne.” MIS Quarterly, 25(2): 229-249.
