Introduction to Cryptography

Secure communication in the presence of third parties (adversaries) is an age-old problem, and cryptography is the practice and study of techniques for solving it.

The history of cryptography can be split into two eras: the classical era and the modern era. In the classical era, cryptography was synonymous with encryption – the conversion of a human-readable message into incomprehensible information, so that interceptors could not make sense of the communication. The breakthrough that marked the start of the modern era came with the Diffie-Hellman key exchange in 1976 and the RSA algorithm in 1977. These algorithms were ground-breaking because they were the first viable cryptographic schemes whose security was based on number theory, enabling for the first time secure communication between two parties without a shared secret.

In the modern era, with the advent of emerging technologies like Blockchain, cryptography has much more to offer than just encryption: integrity, authentication, digital signatures, interactive proofs and secure computation. Modern cryptography is founded on the idea that the key you use to encrypt your data can be made public while the key used to decrypt it is kept private – encryption with the public key can only be undone by decrypting with the private key. The public key is generated by transforming the private key through a one-way function that is easy to compute in one direction and difficult in the other. The whole security of modern communication depends on the “hardness” of this one-way function. Here hardness means computational complexity, i.e. the time taken to compute one key from the other. It should be easy and computationally cheap to calculate the public key from the private key, but practically impossible (computationally infeasible) to calculate the private key from the public one. Such systems are known as public key cryptographic systems.

The first, and still most widely used, of these systems is known as RSA – made up of the initial letters of the surnames of Ron Rivest, Adi Shamir, and Leonard Adleman, who first publicly described the algorithm in 1977. Its security rests on integer factorization.

Integer factorization is the process of determining which prime numbers divide a given positive integer. Computers don’t cope well with arbitrarily large numbers, so in RSA it is important to keep the numbers bounded by choosing a maximum and only dealing with numbers less than that maximum. Any calculation that produces a number larger than the maximum gets wrapped back into the valid range. In RSA, this maximum value (max) is obtained by multiplying two random prime numbers. The public and private keys are two specially chosen numbers greater than zero and less than the maximum (call them pub and priv). To encrypt a number, you multiply it by itself pub times, wrapping around whenever the result exceeds max. To decrypt, you multiply the encrypted number by itself priv times, again wrapping around, and you get back the original number.
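This multiply-and-wrap procedure is ordinary modular arithmetic. A minimal sketch in Python (the names wrap and rsa_apply are mine, purely for illustration):

```python
# "Wrapping around" a value that exceeds the maximum is just taking
# the remainder on division by the maximum (the modulo operation).
def wrap(value, max_val):
    return value % max_val

# Encrypting (or decrypting) multiplies a number by itself `times`
# times, wrapping after every step -- i.e. modular exponentiation.
def rsa_apply(number, times, max_val):
    result = 1
    for _ in range(times):
        result = wrap(result * number, max_val)
    return result
```

Python’s built-in pow(number, times, max_val) computes the same thing far more efficiently.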

It works like magic. Let’s try to encrypt and decrypt the word CITI using this technique.

Take the prime numbers 13 and 7. Their product gives a maximum value of 91. Let’s choose the public encryption key to be the number 5. Then, using the fact that 13 and 7 are the factors of 91 and applying an algorithm called the extended Euclidean algorithm, the private key can be deduced as the number 29. These parameters (max: 91, pub: 5, priv: 29) define a fully functional RSA system. You can take a number, multiply it by itself 5 times to encrypt it, then take the encrypted number and multiply it by itself 29 times to get the original number back.
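The “deduce the private key” step can be sketched with the extended Euclidean algorithm: priv is the inverse of pub modulo (13−1)×(7−1) = 72, i.e. the number that undoes the encryption exponent. The function names below are my own, for illustration only:

```python
def extended_gcd(a, b):
    # Returns (g, x, y) such that a*x + b*y == g == gcd(a, b).
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    return g, y, x - (a // b) * y

def derive_private_key(p, q, pub):
    # priv is the modular inverse of pub modulo (p-1)*(q-1).
    phi = (p - 1) * (q - 1)
    g, x, _ = extended_gcd(pub, phi)
    if g != 1:
        raise ValueError("pub must share no factor with (p-1)*(q-1)")
    return x % phi

# With the primes 13 and 7 and public key 5, this yields 29.
```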

In the case of CITI, to represent it mathematically, let’s first turn the letters into numbers. A common representation of the Latin alphabet is UTF-8 (which for these letters matches ASCII); each character corresponds to a number.



So CITI can be represented mathematically as 67 73 84 73.

Starting with the letter C, which is number 67 in UTF-8, let’s multiply 67 by itself five times to get the encrypted value for C.

67×67 = 4489

Since 4489 is larger than max (91), it needs to be wrapped around. This is done by dividing by 91 and taking the remainder: 4489 = 91×49 + 30, so the wrapped value is 30.

30×67 = 2010 → wraps to 8

8×67 = 536 → wraps to 81

81×67 = 5427 → wraps to 58

This means the encrypted version of 67 (or ‘C’) is 58. The same calculation for the rest of the letters gives ‘I’ = 47, ‘T’ = 28, ‘I’ = 47.

Hence the encrypted value for CITI becomes 58 47 28 47.
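The whole encryption pass can be reproduced in a few lines of Python. This is a toy illustration of the scheme above, not real RSA; pow with a third argument does the multiply-and-wrap in one step:

```python
MAX, PUB = 91, 5  # the toy parameters from the example

def encrypt(text):
    # ord() turns each character into its code (67 for 'C'), and
    # pow(m, PUB, MAX) multiplies it by itself PUB times, wrapping at MAX.
    return [pow(ord(ch), PUB, MAX) for ch in text]

# encrypt("CITI") gives [58, 47, 28, 47]
```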

Now let’s see how decryption works.

To decrypt, each encrypted value needs to be multiplied by itself 29 times. Starting again with C: its encrypted value is 58, so 58 is multiplied by itself 29 times, wrapping around whenever the result exceeds max.

58×58 = 3364 → wraps to 88 (remember, we wrap around whenever the number exceeds max)

88×58 = 5104 → wraps to 8

… (repeated 29 times in total)

9×58 = 522 → wraps to 67

There you are, back to 67. Similarly we get back 73 for ‘I’, 84 for ‘T’ and 73 for ‘I’ again.

And CITI gets decrypted back to original UTF-8 representation as 67 73 84 73.
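Decryption is the same operation with the private exponent; a matching sketch:

```python
MAX, PRIV = 91, 29  # the toy parameters from the example

def decrypt(blocks):
    # Raising each encrypted value to the 29th power modulo 91 recovers
    # the original character codes, which chr() turns back into letters.
    return "".join(chr(pow(block, PRIV, MAX)) for block in blocks)

# decrypt([58, 47, 28, 47]) gives "CITI"
```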

Though factorization-based systems come with rigorous security arguments, factoring is not the hardest problem on a bit-for-bit basis, and advances in cryptanalysis have made it feasible to factor keys that were previously thought secure. The simple fix cryptographers have come up with is to increase the bit size of the keys; and since the resources available to attackers keep growing, key sizes must grow even faster. This is not sustainable for mobile and low-powered devices with limited computational power. The widening gap between the cost of factoring and the cost of multiplying is not sustainable in the long term.

In 1985, new types of cryptographic algorithms were proposed based on an esoteric branch of mathematics called elliptic curves.

Elliptic curves are what most common browsers use today to secure communication. An elliptic curve is the set of points that satisfy a specific mathematical equation. The equation for an elliptic curve looks like this:

y² = x³ + ax + b
The property of elliptic curves that makes them interesting for cryptography is the horizontal symmetry of their graphs: any point on the curve can be reflected over the x-axis and still lie on the curve. Another, more interesting property is that any non-vertical line will intersect the curve in at most three places.



Take any two points on the curve and draw a line through them; the line will intersect the curve at exactly one more place. In elliptic curve systems, we start with a private key (q) and a fixed, well-known base point (P) on the curve. We then compute the point q·P by “dotting” P with itself q times; that point is the public key corresponding to the private key. The mathematics works out so that every such multiple of P also lands on the curve. It turns out that if an initial point is dotted with itself n times to arrive at a final point, then finding n when you only know the initial and final points is hard. In other words, on elliptic curves it is very easy to derive the public key by multiplication but computationally hard to reverse the process and recover the private key from the public one.
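The “dotting” described above is elliptic curve point addition, and repeating it is scalar multiplication. Here is a toy sketch over a small prime field; the curve y² = x³ + 2x + 2 (mod 17), the base point (5, 1) and the private key 9 are illustrative choices only, far too small for real security:

```python
# Toy curve y^2 = x^3 + A*x + B over the integers modulo P.
A, B, P = 2, 2, 17
INF = None  # the point at infinity, the group's identity element

def point_add(p1, p2):
    # Draw the line through the two points, find the third intersection
    # with the curve, and reflect it over the x-axis.
    if p1 is INF:
        return p2
    if p2 is INF:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return INF  # vertical line: the points cancel out
    if p1 == p2:
        slope = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P  # tangent line
    else:
        slope = (y2 - y1) * pow(x2 - x1, -1, P) % P  # chord through p1, p2
    x3 = (slope * slope - x1 - x2) % P
    y3 = (slope * (x1 - x3) - y1) % P
    return (x3, y3)

def scalar_mult(n, point):
    # "Dot" the point with itself n times (double-and-add).
    result = INF
    while n:
        if n & 1:
            result = point_add(result, point)
        point = point_add(point, point)
        n >>= 1
    return result

# Private key q = 9, base point G = (5, 1); the public key is q*G.
G = (5, 1)
public_key = scalar_mult(9, G)
```

Recovering 9 from public_key and G alone is the elliptic curve discrete logarithm problem; at realistic curve sizes no efficient algorithm for it is known.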

After a slow start, elliptic curve based algorithms are gaining popularity, and the pace of adoption is accelerating. Elliptic Curve Cryptography (ECC) is now used in a wide variety of applications: the US government uses it to protect internal communications; the Tor project uses it to help assure anonymity; it is the mechanism used to prove ownership of bitcoins; and it provides signatures in Apple’s iMessage service. First-generation cryptographic algorithms like RSA and Diffie-Hellman are still the norm in most arenas, but ECC is quickly becoming the go-to solution for privacy and security online.

Tarun Rattan

Credits: Nick Sullivan

Posted in Blockchain

Fascinating Words

If you’re fascinated by words, then this is for you…

Glabella – The space between your eyebrows is called a glabella.

Petrichor – The way it smells after the rain is called petrichor.

Aglet – The plastic or metallic coating at the end of your shoelaces is called an aglet.

Barm – The foam on beer is called a barm.

Wamble – The rumbling of the stomach is actually called a wamble.

Vagitus – The cry of a newborn baby is called a vagitus.

Tines – The prongs on a fork are called tines.

Phosphenes – The sheen or light that you see when you close your eyes and press your hands on them is called phosphenes.

Box Tent – The tiny plastic table placed in the middle of a pizza box is called a box tent.

Overmorrow – The day after tomorrow is called overmorrow.

Minimus – Your tiny toe or finger is called minimus. 

Agraffe – The wired cage that holds the cork in a bottle of champagne is called an agraffe.

Vocables – The ‘na na na’ and ‘la la la’, which don’t really have any meaning in the lyrics of any song, are called vocables.

Interrobang – When you combine an exclamation mark with a question mark (like this ?!), it is referred to as an interrobang.

Columella Nasi – The space between your nostrils is called columella nasi.

Armscye – The armhole in clothes, where the sleeves are sewn, is called armscye.

Dysania – The condition of finding it difficult to get out of the bed in the morning is called dysania.

Griffonage – Unreadable handwriting is called griffonage. (Are you reading this, dear doctors?)

Tittle – The dot over an “i” or a “j” is called tittle.

Crapulence – That utterly sick feeling you get after eating or drinking too much is called crapulence.

Brannock Device – The metallic device used to measure your feet at the shoe store is called a Brannock device.

Posted in Literature

MIS40650 – Emerging Modes of Organising Work and their Implications

This session explored the concept of privacy and its implications in modern IT-based social culture. Introna’s article summarizes the various theoretical notions of privacy and the current debate about privacy in academic circles.

Introna – Why we need privacy

The article notes that, surprisingly, privacy did not get explicit attention from any of the great liberals. Liberal philosophers such as John Locke, Rousseau, Wilhelm von Humboldt and J. S. Mill did not spend as much as a page of their voluminous writings on the subject. Moreover, significant philosophical debate on the subject only emerged in the late 1960s. Why is this so? Could it be that privacy is, as some suggest, a very modern and suspect concept, invented by Warren and Brandeis in 1890 in response to a personal situation?

In the first part of the article the author tries to define privacy and accepts that it is such a primordial notion that any universally acceptable definition is difficult. Still, Introna groups the various definitions into three fairly distinct but not mutually exclusive categories, namely: 1. privacy as no access to the person or the personal realm; 2. privacy as control over personal information; and 3. privacy as freedom from judgment or scrutiny by others.

Privacy as no access to a person or personal realm

Warren and Brandeis (1890, 205) defined privacy as “the right to be let alone.” But this definition was critiqued on various points. By this definition, a person or institution can watch your every move, yet as long as they leave you alone their spying on you is acceptable. Also, certain institutions or individuals have a legitimate right not to leave you alone, such as the tax service or your creditors. Another definition, from Van Den Haag (1971), goes deeper: “privacy is the exclusive access of a person to a realm of his own. The right to privacy entitles one to exclude others from (a) watching, (b) utilizing, (c) invading his private [personal] realm.” But again this definition implies that there is a certain realm, here expressed as personal, to which one may legitimately limit access. The obvious problem is defining what is private or personal. Most scholars agree that to a large extent the exact demarcation of the personal realm is culturally defined; there is no ontologically defined personal realm.

Nevertheless, from a legal and communicative perspective, personal information can be defined as “those facts, communications or opinions which relate to the individual and which it would be reasonable to expect him to regard as intimate or confidential and therefore to want to withhold or at least to restrict their circulation” (Wacks, 1980). Gross (1967) is in agreement with this notion of privacy as “the condition of human life in which acquaintance with a person or with affairs of his life which are personal to him is limited.” He also refers to “intellectual” access by using the word “acquaintance”.

The above definitions, however, do not enable one to differentiate between the loss of privacy and the question of whether one’s right to privacy has been violated. An individual may voluntarily give access to his personal realm to various other individuals, intimately known or perhaps unknown to him. In such a case the person may be said to be less private, but no one has violated his right to privacy. This leads to the issue of control, which is made explicit in the next group of definitions.

Privacy as control over personal information

Fried (1968) defines privacy as “control over knowledge about oneself.” This notion of control of personal information is also captured by Westin, who defines privacy as “the claim of individuals, groups or institutions to determine for themselves when, how and to what extent information about them is communicated to others” (1967, 7, 42), or, in a more general sense, by Parker (1974) as the “control over when and by whom the various parts of us can be sensed by others.”

Clearly, from a legal point of view, the violation of the right to privacy is very important. However, from a social-relationship perspective it is the actual loss of privacy that is at stake. Gavison (1980) defines a loss of privacy as occurring when “others obtain information about an individual, pay attention to him, or gain access to him.” In this definition, as in the previous group, the need for privacy is implicitly assumed, and there is no mention of the ‘other’ in the relationship, even though privacy is a relational notion. This is where the notion of judgment by others, made explicit in the next group of definitions, comes in.

Privacy as freedom from judgment or scrutiny by others

The real issue of privacy according to Johnson (1989) is the judgment by others. He expresses it as follows:

Privacy is a conventional concept. What is considered private is socially or culturally defined. It varies from context to context. It is dynamic, and it is quite possible that no single example can be found of something which is considered private in every culture. Nevertheless, all examples of privacy have a single common feature. They are aspects of a person’s life which are culturally recognized as being immune from the judgment of others.

It is the knowledge that others would judge us in a particular way, perhaps based on preconceived ideas and norms, that makes the individual desire a personal or private space of immunity.

The author summarises the notion of privacy in these definitions as follows:

  1. Privacy is a relational concept. It comes to the fore in a community. Where people interact, the issue of privacy emerges.
  2. Privacy is directed towards the personal domain. What is deemed personal is, to some extent at least, culturally defined. In general one may state that personal or private aspects of my life are those aspects that do not, or tend not to, affect the significant interests of others.
  3. To claim privacy is to claim the right to limit access or control access to my personal or private domain.
  4. An effective way to control access to my personal realm is to control the distribution of textual images or verbal information about it.
  5. To claim privacy is to claim the right to a (personal) domain of immunity against the judgments of others.
  6. Privacy is a relative concept. It is a continuum. Total privacy may be as undesirable as total transparency. It is a matter of appropriateness for the situation at hand. It is unfortunately (or fortunately) a matter of judgment.

Next, Introna’s article delves into the question of why privacy matters, which is explained from four perspectives.

Privacy as the context of social relationships

The perspective here is that all social relationships, whether of collaboration or of competition, require at least some level of privacy. Imagine a world without it: there would be no private thoughts and no private places. Every thought and every act would be completely transparent, from motive right through to the actual thought or behaviour; body and mind immediately and completely transparent to each and every “other”. In such a world, how would you differentiate yourself? How would you compete? Is creativity possible? How could you say “this is my idea” or “this is what I think”?

Privacy and intimate relationships

Fried argues that privacy provides the “moral capital” required by intimate relationships of love and friendship:

Love and friendship, as analysed here, involve the initial respect for the rights of others which morality requires of everyone. They further involve the voluntary and spontaneous relinquishment of something between friend and friend, lover and lover. The title to information about oneself [one’s beliefs, emotions, feelings, dreams, desires, etc.] conferred by privacy provides the necessary something. (p. 483)

It is this possibility of exchanging personal information about oneself (within a context of caring) that creates the possibility for intimacy. Gerstein argues that intimacy is an experience of a relationship in which one is deeply engrossed and in which one fully and wholly participates. It is a relationship where we relinquish our role as independent observer to lose ourselves in the experience. The key point is that we cannot at the same time be lost in an experience and be observers of it. Thus, privacy creates the moral capital (the personal information) and the possibility to participate (share the information) in a relationship in which I am deeply and exclusively engrossed as participant. Without privacy such intimate relationships would not be possible, or at least they would be extremely difficult to maintain.

Privacy and social roles

It is a generally accepted fact that individuals maintain a variety of relationships by assuming or acting out different roles. It is in fact different patterns of behaviour, or roles, that to a large degree define the different relationships and make them what they are. Privacy, through the rules, rituals, etc. that demarcate the private/public domain for a specific class of relationships, creates simplified relational structures that allow the individual to cope with this complexity and to invest appropriately in a selected set of intimate relationships. Gavison (1984) argues that privacy “permits individuals [in the reciprocal relationship] to do what they would not do without it for fear of an unpleasant or hostile reaction from others.”

Privacy and self-constitution or autonomy

One of the most common arguments for privacy is its role in the creation and preservation of individual autonomy. If a person is aware that he is being observed, he becomes conscious of himself as an object of observation. As an object of observation, the person will then structure his actions not merely according to his own will or intention, but also in line with (or in realisation of) what he believes those who observe would expect to see. Without privacy there would be no self. It would be difficult, even impossible, to separate the self from the other, since no act or thought could be said to be, in any significant way, original. Without privacy, a person would not be a creator or originator, but merely a copier or enactor. As Reiman (1976) concludes: “privacy is necessary to the creation of selves out of human beings, since a self is at least in part a human being who regards his existence, his thoughts, his body, his actions as his own”.

The last section of the article tries to unravel the linkage between the notion of privacy and the information society.

Information technology, through electronically mediated communication that removes the limitations of time and space, is rapidly multiplying interaction possibilities by orders of magnitude. As the technological infrastructure expands, the issues of social relationships, roles and autonomy will become more and more urgent. A whole new set of rules, rituals and gestures will have to evolve to deal with this new, more abstract set of electronically induced social roles. The whole notion of trust, so important for social roles and relations, is becoming ambiguous in the modern world, and the appropriate demarcation of private and public (in terms of appropriate behaviour and knowledge) for a specific type of role is now very vague.

Reiman argued that privacy is a social ritual by means of which an individual’s moral title to his existence is conferred. Privacy is an essential part of the complex social practice by means of which the social group recognizes and communicates to individuals that their existence is their own. Thus, as information technology (cellular telephones, television, the Internet, Groupware, etc.) progressively invades more and more private space (turning it into public space), individuals will be faced with fewer possibilities for making their existence their own. This is the essence of Foucault’s argument that modern society, through its panoptic universal “gaze”, is creating mechanisms of power that are far more subtle and encompassing than ever before.

Introna concludes the article by saying that it is for the ultimate good of society as a whole that privacy is preserved, even at the expense of legitimate social control. Without some preserved private spaces, society would lose its most valuable asset: the true individual.

Next, Orlikowski’s article on the matrix of control was discussed, exploring the extent to which IT deployed in work processes facilitates changes in forms of control and forms of organising.

Orlikowski – Matrix of Control

Information technology facilitates decentralization and flexible operations on the one hand, while increasing dependence and centralizing knowledge and power on the other. The article discusses two forms of control.

Internal forms of control

Pennings and Woiceshyn (1987) examine various forms of internal control in organisations; two important ones are personal and systemic control. Personal control operates through a dyadic relationship between supervisors and subordinates, finding its usual expression in direct supervision, where one individual assumes authority over the actions of others and closely monitors those actions to ensure compliance with orders. Systemic control represents a shift from personal relations to more transparent, indirect and impersonal forms of control, and is vested in three interrelated structural properties of organisations: technology, social structure and culture. In control through technology, control is embedded in the technical infrastructure of the production process. In control through social structure, control is embedded in a firm’s policies, procedures and rules, its well-defined job descriptions, career ladders and incentive schemes. In control through culture, workers’ shared norms and values shape behaviour, order perception and influence attitudes.

External forms of control

Professional control is also employed by organisations where they delegate a large part of the indoctrination and training of their specialist employees to outside institutions such as professional schools and occupational communities. Organisations resort to this form of control as production processes become complex and dependent on highly specialized skill and knowledge. The authority invested in individual professionals is based on the special occupational competence they apply under conditions of task uncertainty, risk, complexity and variability e.g. accountants, engineers.

Systemic and professional forms of control can be seen as instances of Foucault’s (1979) disciplinary power, in that control is exercised indirectly and impersonally through a range of institutional, technical and normative regulations and does not emanate directly or physically from individuals.

Next, the article explores the linkage between IT and forms of control, drawing on Giddens’s theory of structuration. Structure exists only as it is instantiated in action, and information technology can be interpreted as an occasion for structuring organizations that both facilitates and constrains action. Forms of control in organisations are mechanisms by which agents seek to achieve and maintain the compliance of others. In Giddens’s analysis, such forms of control are premised on, and expressed through, an asymmetrical distribution of resources. Two kinds of resources are distinguished: allocative resources, used to generate power over objects, and authoritative resources, used to generate power over persons.

The author then applies these concepts in a field study conducted in a large, multinational software consulting firm, Software Consulting Corporation (SCC). The author found that SCC uses systemic forms of control through production knowledge, socialization and impression management, as well as personal forms of control through direct supervision and electronic supervision.

Next Solove’s article on End of Privacy was discussed.

Solove – End Of Privacy

The key concepts picked up in the article are:

  • Social-networking sites allow seemingly trivial gossip to be distributed to a worldwide audience, sometimes making people the butt of rumours shared by millions of users across the Internet.
  • Public sharing of private lives has led to a rethinking of our current conceptions of privacy.
  • Existing law should be extended to allow some privacy protection for things that people say and do in what would have previously been considered the public domain

The session also covered an interview with MIT Media Lab’s Alex “Sandy” Pentland where Big Data and its impact on privacy was discussed.

Pentland – With Big Data comes Big Responsibility

Big data and the “internet of things” – in which everyday objects can send and receive data – promise revolutionary change to management and society. But their success rests on an assumption: that all the data being generated by internet companies and devices scattered across the planet belongs to the organizations collecting it. What if it doesn’t? Pentland outlined the concept of the New Deal on Data, a proposed legal framework to rebalance the ownership of data in favour of the individual whose data is collected.

Below is a good video of Turkle’s lecture exploring the concept of privacy in modern world

PODCAST: Sherry Turkle – “Alone Together” – London School of Economics Public Lecture Series – (96 mins)

Posted in iBusiness

MIS40650 – Big Data, Analytics, and Evidence-based Management

This session covered the connections between Big Data, analytics and the evidence-based paradigm. Goldenberg’s article on the evidence-based approach in medicine was discussed. In particular, the debate was around the notion articulated in the paper that the appeal to the authority of evidence that characterises evidence-based practices does not increase objectivity but rather obscures the subjective elements that inescapably enter all forms of human enquiry. Evidence was defined as ‘some conceptual warrant for belief or action’, and the centrality of evidence in science was accepted.

Goldenberg – On evidence and EBM – lessons from the philosophy of science

It is the practice of basing all beliefs and practices strictly on evidence that allegedly separates science from other activities. The evidence-based medicine (EBM) movement purports to eschew the unsystematic and intuitive methods of individual clinical practice in favour of a more scientifically rigorous approach. This rigour is achieved through methodical clinical decision making based on examination of evidence derived from the latest clinical research. Evidence-based techniques are an extension of the philosophical system of logical positivism, which recognises only scientifically verifiable propositions as meaningful. This school of thought originated in Vienna in the 1920s, and as a number of members of the Vienna Circle emigrated to the UK and US, logical positivism came to exert a strong influence on Anglo-American analytic philosophy.

The EBM movement centres around five linked ideas:

  1. Clinical decisions should be based on the best available scientific evidence
  2. The clinical problem, and not habits or protocols, should determine the type of evidence to be sought
  3. Identifying the best evidence means adopting an epidemiological and biostatistical way of thinking
  4. Conclusions derived from identifying and critically appraising evidence are useful only if put into action in managing patients or making health care decisions
  5. Performance should be constantly evaluated

The synthesis of large amounts of clinical trial data into manageable “clinical summaries” or “meta-analyses” in EBM projects like the Cochrane Collaboration is a first step towards the Big Data concept.

The critique of EBM rests on two grounds. First, Hanson (1958), Kuhn (1970, 1996) and Feyerabend (1978) have claimed that observation is theory-laden, coloured by our background beliefs and assumptions, and can therefore never be an unmitigated perception of the nature of things. Second, Duhem (1982) and Quine (1960) have argued that our theory choices are never determined exclusively by evidence; a given body of evidence may support numerous, even contradictory, theories.

Phenomenological approaches to science and medicine further challenge the notion of evidence in EBM by questioning why relevant evidence is assumed to come primarily from clinical trials and other objective measures. They argue instead that patients’ self-understanding and experience of illness also offer a legitimate source of relevant medical knowledge. This theoretical approach is grounded in the philosophy of Edmund Husserl and his followers, who questioned the philosophical completeness of the natural sciences. They argued that Cartesian dualism, which splits the world into minds and bodies, fails to explain human understanding, leading to a crisis of meaning.

Next, the dictum “You can’t manage what you don’t measure”, attributed to Deming and Drucker, was explored through McAfee and Brynjolfsson’s article on Big Data.

McAfee & Brynjolfsson – Big Data – The Management Revolution

The claim that the more data we measure, the better we can manage can be justified statistically by showing that data-driven companies are more profitable than others. But this is not a given; it all depends on how the data is analysed and how committed senior management is to data analytics. The article describes how Big Data differs from the field of analytics and why it has become important in recent years, outlining three key differences.

Volume: As of 2012, about 2.5 exabytes of data were created each day, and that number is doubling every 40 months or so. More data cross the internet every second than were stored in the entire internet just 20 years ago. This gives companies an opportunity to work with many petabytes of data in a single data set – and not just from the internet.

Velocity: For many applications, the speed of data creation is even more important than the volume. Real-time or nearly real-time information makes it possible for a company to be much more agile than its competitors.

Variety: Big data takes the form of messages, updates, and images posted to social networks; readings from sensors; GPS signals from cell phones; and more. Many of the most important sources of big data are relatively new.

The article describes five management challenges posed by Big Data:

Leadership: Companies succeed in the big data era not simply because they have more or better data, but because they have leadership teams that set clear goals, define what success looks like, and ask the right questions. Big data’s power does not erase the need for vision or human insight.

Talent management: As data become cheaper, the complements to data become more valuable. Some of the most crucial of these are data scientists and other professionals skilled at working with large quantities of information. Along with the data scientists, a new generation of computer scientists are bringing to bear techniques for working with very large data sets. The best data scientists are also comfortable speaking the language of business and helping leaders reformulate their challenges in ways that big data can tackle. Not surprisingly, people with these skills are hard to find and in great demand.

Technology: The tools available to handle the volume, velocity, and variety of big data have improved greatly in recent years. In general, these technologies are not prohibitively expensive, and much of the software is open source. Hadoop, the most commonly used framework, combines commodity hardware with open-source software. However, these technologies do require a skill set that is new to most IT departments, which will need to work hard to integrate all the relevant internal and external sources of data.

Decision making: An effective organization puts information and the relevant decision rights in the same location. In the big data era, information is created and transferred, and expertise is often not where it used to be. The artful leader will create an organization flexible enough to minimize the “not invented here” syndrome and maximize cross-functional cooperation

Company culture: The first question a data-driven organization asks itself is not “What do we think?” but “What do we know?” This requires a move away from acting solely on hunches and instinct. It also requires breaking a bad habit the authors have noticed in many organizations: pretending to be more data-driven than they actually are.

Marcus’s article in The New Yorker explores similar themes.

Marcus – Steamrolled by Big Data (The New Yorker)

The article recounts the story of Google improving spell checkers using Big Data, along with the case of Oren Etzioni, who created Farecast (eventually sold to Microsoft, and now part of Bing Travel), which scraped data from the Web to make good guesses about whether airline fares would rise or fall.

The case study on Numenta, founded by Jeff Hawkins of Palm Pilot fame, is covered in some detail. According to Numenta’s Web site, their software, Grok, “finds complex patterns in data streams and generates actionable predictions in real time…. Feed Grok data, and it returns predictions that generate action. Grok learns and adapts automatically.” Numenta boasts that “As the age of the digital nervous system dawns, Grok represents the type of technology that will convert massive data flows into value.”

Marcus does claim that every problem is different and that there are no universally applicable solutions. An algorithm that is good at chess isn’t going to be much help parsing sentences, and one that parses sentences isn’t going to be much help playing chess. A faster computer will be better than a slower computer at both, but solving problems will often (though not always) require a fair amount of what some researchers call “domain knowledge”—specific information about particular problems, often gathered painstakingly by experts. Big Data is a powerful tool for inferring correlations, not a magic wand for inferring causality.

The article also presents a critique of Big Data by recounting a chat the author had with Anthony Nyström of the Web software company Intridea, in which Nyström claimed that selling Big Data is a great gig for charlatans, because they never have to admit to being wrong. “If their system fails to provide predictive insight, it’s not their models, it’s an issue with your data.” You didn’t have enough data, there was too much noise, you measured the wrong things. The list of excuses can be long.

Morozov’s article on the planning machine was discussed next.

The article traces the origins of the Big Data concept to the story of Stafford Beer, a leading theorist of cybernetics, who envisaged Project Cybersyn to help Chile’s socialist government control the country and its economy with the help of computers. Beer helped design systems like Datafeed, which had four screens that could show hundreds of pictures and figures of historical and statistical information on the state of production in the country. Another screen simulated the future state of the Chilean economy under various conditions.

One wall was reserved for Project Cyberfolk, an ambitious effort to track the real-time happiness of the entire Chilean nation in response to decisions made in the op room. Beer built a device that would enable the country’s citizens, from their living rooms, to move a pointer on a voltmeter-like dial that indicated moods ranging from extreme unhappiness to complete bliss. The plan was to connect these devices to a network—it would ride on the existing TV networks—so that the total national happiness at any moment in time could be determined. The algedonic meter, as the device was called (from the Greek algos, “pain,” and hedone, “pleasure”), would measure only raw pleasure-or-pain reactions to show whether government policies were working.

As Eden Medina shows in “Cybernetic Revolutionaries,” her entertaining history of Project Cybersyn, Beer set out to solve an acute dilemma that Allende faced. How was he to nationalize hundreds of companies, reorient their production toward social needs, and replace the price system with central planning, all while fostering the worker participation that he had promised? Beer realized that the planning problems of business managers—how much inventory to hold, what production targets to adopt, how to redeploy idle equipment—were similar to those of central planners. Computers that merely enabled factory automation were of little use; what Beer called the “cussedness of things” required human involvement. It’s here that computers could help—flagging problems in need of immediate attention, say, or helping to simulate the long-term consequences of each decision. By analyzing troves of enterprise data, computers could warn managers of any “incipient instability.” In short, management cybernetics would allow for the reëngineering of socialism—the command-line economy.

Yet central planning had been powerfully criticized for being unresponsive to shifting realities, notably by the free-market champion Friedrich Hayek. The efforts of socialist planners, he argued, were bound to fail, because they could not do what the free market’s price system could: aggregate the poorly codified knowledge that implicitly guides the behavior of market participants. Beer and Hayek knew each other; as Beer noted in his diary, Hayek even complimented him on his vision for the cybernetic factory, after Beer presented it at a 1960 conference in Illinois. (Hayek, too, ended up in Chile, advising Augusto Pinochet.) But they never agreed about planning. Beer believed that technology could help integrate workers’ informal knowledge into the national planning process while lessening information overload.

Next, Harford’s article on Big Data was discussed, in which he details what might be going wrong with the whole concept.

Harford, T. (2014). Big Data: Are we making a big mistake? The Financial Times.


The article refers to “Google Flu Trends”, which was quick, cheap and theory-free, yet still made reasonably accurate predictions of flu trends across America. Google’s engineers didn’t bother to develop a hypothesis about which search terms – “flu symptoms” or “pharmacies near me” – might be correlated with the spread of the disease itself. The Google team just took their top 50 million search terms and let the algorithms do the work.
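The theory-free approach amounts to ranking candidate terms purely by how well their query frequency correlates with reported flu incidence. A toy sketch of that idea (all data and term names below are invented for illustration; Google’s actual pipeline worked over 50 million terms):

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

flu_rate = [1.0, 1.2, 2.1, 3.5, 2.8, 1.4]            # weekly flu incidence (made up)
term_freq = {
    "flu symptoms":     [10, 12, 25, 40, 30, 15],    # closely tracks the flu curve
    "pharmacy near me": [20, 30, 22, 28, 35, 21],    # noisy, weakly related
    "cat videos":       [50, 48, 52, 49, 51, 50],    # unrelated background term
}

# Rank terms by correlation with flu incidence, most correlated first:
ranked = sorted(term_freq, key=lambda t: pearson(term_freq[t], flu_rate), reverse=True)
print(ranked)  # ['flu symptoms', 'pharmacy near me', 'cat videos']
```

No hypothesis about *why* a term tracks flu enters the ranking — which is exactly the strength, and later the weakness, that Harford’s article dissects.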

The article urges caution about four claims about Big Data prevalent among businesses:

  • Data analysis produces uncannily accurate results
  • Every single data point can be captured, making old statistical sampling techniques obsolete
  • It is passé to fret about what causes what, because statistical correlation tells us what we need to know
  • Scientific and statistical models aren’t needed as “with enough data, the numbers speak for themselves”

A Big Data set is one where “N = All”: the whole population is captured and no sampling is required. But this notion can be challenged, since it is virtually impossible to capture every data point.
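The danger when “N = All” quietly fails is selection bias, and a huge biased sample can mislead more than a small random one. A simulation of this point (all numbers invented; a population where the true rate of some attribute is 30%, but the “big data” collection channel over-samples those who have it):

```python
import random

random.seed(1)
population = [1] * 300_000 + [0] * 700_000  # true rate of the attribute: 0.30
random.shuffle(population)

# A small simple random sample of 1,000 people estimates the rate well:
srs = random.sample(population, 1_000)
srs_rate = sum(srs) / len(srs)
print(srs_rate)  # close to 0.30

# "N = almost All": ~800,000 records, but people with the attribute are more
# likely to land in the data set (90% capture vs 75% for everyone else):
biased = [x for x in population if random.random() < (0.9 if x else 0.75)]
biased_rate = sum(biased) / len(biased)
print(biased_rate)  # noticeably above 0.30, despite the huge N
```

Expected value of the biased estimate here is 0.27 / (0.27 + 0.525) ≈ 0.34 — the extra volume buys precision around the wrong answer.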


MIS40650 – Trust in Global Networks of Innovation

This session covered the issue of trust, which is becoming more important with the advent of virtual organisations. The concept of virtual organisations is discussed in Introna’s article, but to say that the concept has been fully articulated or nailed down would be a fallacy.


The notion of the virtual organisation is difficult to grasp, and the reasons why virtual organisations have started to work well in recent years are still being thought through. Introna’s article suggests that in recent years ‘virtual has successfully become the metaphor for technology’. The dictionary definition of ‘virtual’ is ‘almost, even if not exactly or in every way’, and this is now evident in the neologisms the IT industry has made popular: ‘virtual memory’, ‘virtual computer’, ‘virtual reality’, ‘virtual space’. In each of these instances, virtual connotes information technology’s ability to:

  1. make a computer system act as if it had more capacity than it really possessed
  2. give the user the illusion of existing at any time or place they are needed

Introna’s article highlights some defining elements of virtual organisations:

  1. Strategic alliance: The key attribute of virtual organisations is partnering; they are built on alliances and outsourcing agreements.
  2. Core competence: The concept of core competencies underpins the creation of virtual organisations, in that it forges a form of partnership essentially made up of partners who apply their core competencies to deliver world-class products and services.
  3. Trust: The partners in a virtual organisation exhibit ‘unprecedented levels of trust and commitment’.
  4. Organisation restructuring: A virtual organisation is greatly dependent on its structure for the successful execution of identified work tasks.

The second part of Introna’s article presents a critique of the notion of virtual organisations, covering the following points:

  1. Trust & conflict: Virtual organisations have an issue with trust, as people are remotely located and miscommunication often results in conflict.
  2. Whole & parts: Virtual organisations assume that all partners will bring their core competences together, resulting in knowledge sharing and better execution, but that rarely happens.
  3. Knowledge & language: Organisational knowledge is tacit, and the question remains how to locate it and make it available to the partners in a virtual organisation.

The concept of trust was taken up in more detail with reference to the offshoring case study in Kelly & Noonan’s article, involving an Irish start-up firm and a large Indian outsourcer.

Kelly & Noonan – Anxiety and psychological security in offshoring (JIT final)

The case study highlights the process of trust-building between two organisations that begin as strangers, and the challenges faced during this journey.

Seamas also touched upon Giddens’s distinctive, non-cognitive conception of trust, defined as ‘emotional commitment’, and how this makes the parties involved vulnerable and can potentially risk their very existence.

The reasons why trust is so difficult to generate and maintain in the postmodern world were analysed. The following reasons for rising mistrust were given:

  1. Globalisation: Giddens (1990) argues that the risk profile of the modern globalized world has been dramatically altered as institutional reflexivity has increased and social relations are disembedded from local contexts and stretched over extended tracts of time–space. These new social arrangements have problematized the means by which individuals establish and maintain a sense of psychological security and coherent identity (Giddens, 1991), which has resulted in the simultaneous transformation, and renewed importance, of trust relations.
  2. Decay in social institutions: The decay of institutions like the Church, kinship, family, and workers’ unions has resulted in the loss of social settings that earlier helped generate trust.
  3. Technology & the complexities around it: Giddens distinguishes between two types of trust relations prevalent in modern societies: trust in abstract systems and personal trust. The former are based, to a large extent, on faceless commitments, while the latter depend on facework commitments (trust relations sustained in circumstances of co-presence). The investment of trust in abstract systems like aeroplanes or computer operating systems is a central feature of modern life. No one can completely opt out of the abstract systems involved in modern institutions, yet, due to their diversity and complexity, our knowledge of their workings is necessarily limited. Therefore, trust (or faceless commitment) becomes a very important means of generating the ‘leap of faith’ that practical engagement with them demands. Often, however, engagement with abstract systems involves encounters with individuals who ‘represent’ or are ‘responsible’ for them (e.g. visiting a medical doctor who represents a broader system of medical knowledge). Such contacts with experts are very consequential and take place at access points, which form the meeting ground of facework and faceless commitments.

Dublin City Centre


You start dying slowly – Pablo Neruda

You start dying slowly

If you do not travel,

If you do not read,

If you do not listen to the sounds of life,

If you do not appreciate yourself.

You start dying slowly

When you kill your self-esteem;

When you do not let others help you.

You start dying slowly

If you become a slave of your habits,

Walking everyday on the same paths…

If you do not change your routine,

If you do not wear different colours

Or you do not speak to those you don’t know.

You start dying slowly

If you avoid to feel passion

And their turbulent emotions;

Those which make your eyes glisten

And your heart beat fast

You start dying slowly

If you do not change your life when you are not satisfied with your job, or with your love,

If you do not risk what is safe for the uncertain,

If you do not go after a dream,

If you do not allow yourself,

At least once in your lifetime,

To run away from sensible advice…

Pablo Neruda
