
Posts Tagged ‘Analytics’

Revelations last month (March 2018) about the Facebook–Cambridge Analytica affair provoked a heated public discussion about data privacy and users’ control over their personal information in social media networks, particularly on Facebook. The central allegation in this affair is that personal data from social media was misused to help win the presidential campaign of Donald Trump. It offers ‘juicy’ material for anyone interested in American politics. But the importance of the affair goes much beyond that, because the impact of the concerns it has raised extends to the daily lives of millions of users-consumers socially active on the Facebook platform; it could potentially touch a multitude of commercial marketing contexts (i.e., products and services) in addition to political marketing.

Having a user account as a member of the Facebook social media network is free of charge, a boon hard to resist. In Q2 of 2017 Facebook surpassed the mark of two billion monthly active users, double the former record of one billion reached five years earlier (Statista). No monetary price is explicitly demanded of users. Yet users are subject to alternative prices, embedded in their activity on Facebook, implicit and less noticeable as a cost to bear.

Some users may realise that the advertisements they receive and see are the ‘price’ they have to tolerate for not having to pay ‘in cash’ for socialising on Facebook. It is less of a burden if the content is informative and relevant to the user. What users are much less likely to realise is how personally related data (e.g., profile, posts and photos, other activity) is used to produce personally targeted advertising, and possibly other forms of direct offerings or persuasive appeals to take action (e.g., a user receives an invitation from a brand, based on a friend’s post, about a product purchased or photographed). The recent affair led to exposing — in news reports and the testimony of CEO Mark Zuckerberg before Congress — not only the direct involvement of Facebook in advertising on its platform but also how permissive it has been in allowing third-party apps to ‘borrow’ users’ information from Facebook.

According to reports on the affair, psychologist Aleksandr Kogan, with colleagues, developed as part of academic research a model to deduce personality traits from the behaviour of users on Facebook. Aside from his position at Cambridge University, Kogan started a company named Global Science Research (GSR) to advance commercial and political applications of the model. In 2013 he launched an app on Facebook, ‘this-is-your-digital-life’, in which Facebook users would answer a self-administered questionnaire on personality traits and some personal background. In addition, the GSR app prompted respondents to give consent to pull personal and behavioural data related to them from Facebook. Furthermore, at that time the app could get access to limited information on friends of respondents — a capability Facebook removed by 2015 (The Guardian [1], BBC News: Technology, 17 March 2018).

Cambridge Analytica (CA) contracted with GSR to use its model and the data it collected. The app was able, according to initial estimates, to harvest data on as many as 50 million Facebook users; by April 2018 Facebook updated the estimate to 87 million. It is unclear how many of these users were involved in the project for Trump’s campaign, because CA was specifically interested, for this project, in eligible voters in the US; it is said that CA applied the model with data in other projects (e.g., pro-Brexit in the UK), and GSR made its own commercial applications with the app and model.

In simple terms, as can be learned from a more technical article in The Guardian [2], the model is constructed around three linkages:

(1) Personality traits (collected with the app) → data on user behaviour on the Facebook platform, mainly ‘likes’ given by each user (possibly additional background information was collected via the app and from the users’ profiles);

(2) Personality traits → behaviour in the target area of interest — in the case of Trump’s campaign, past voting behaviour (CA associated geographical data on users with statistics from the US electoral registry).

Since model calibration was based on data from a subset of users who responded to the personality questionnaire, the final stage of prediction applied a linkage:

(3) Data on Facebook user behaviour (→ predicted personality) → predicted voting intention or inclination (applied to the greater dataset of Facebook users-voters)
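The three linkages amount to a two-stage prediction: a mapping calibrated on questionnaire respondents turns ‘likes’ into trait scores, a second mapping ties traits to voting inclination, and both are then applied to users who never answered the questionnaire. A minimal Python sketch of this logic follows; the page names, weights and linear form are entirely hypothetical and for illustration only (the actual model reportedly used far richer methods over thousands of ‘likes’):

```python
# Stage 1 (calibrated on questionnaire respondents): map 'likes' to a trait score.
# Page names and weights are hypothetical, chosen only to illustrate the idea.
TRAIT_WEIGHTS = {"page_politics": 0.6, "page_hiking": -0.1, "page_news": 0.3}

def predict_trait(likes):
    """Predict a single personality-trait score from the set of pages a user liked."""
    return sum(TRAIT_WEIGHTS.get(page, 0.0) for page in likes)

# Stage 2 (calibrated against voting records): map the trait score to an inclination.
def predict_inclination(trait_score, slope=0.8, intercept=0.1):
    """Map a predicted trait score to a voting-inclination score clipped to [0, 1]."""
    raw = slope * trait_score + intercept
    return max(0.0, min(1.0, raw))

# Stage 3: chain both mappings for a user who never answered the questionnaire.
user_likes = {"page_politics", "page_news"}
inclination = predict_inclination(predict_trait(user_likes))
```

The essential point the sketch makes is that only the questionnaire subset needs ground-truth traits; everyone else is scored from behavioural data alone.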

The Guardian [2] suggests that ‘just’ 32,000 American users responded to the personality-political questionnaire for Trump’s campaign (while at least two million users from 11 states were initially cross-referenced with voting behaviour). The BBC gives an estimate of as many as 265,000 users who responded to the questionnaire in the app, which corresponds to the larger pool of 87 million users-friends whose data was harvested.

A key advantage credited to the model is that it requires only data on ‘likes’ by users and does not have to use other detailed data from posts, personal messages, status updates, photos, etc. (The Guardian [2]). However, the modelling concept raises some critical questions: (1) How many repeated ‘likes’ of a particular theme are required to infer a personality trait? (i.e., the model should account for a stable pattern of behaviour in response to a theme or condition across different situations or contexts); (2) ‘Liking’ is frequently spurious and casual — ‘likes’ do not necessarily reflect thought-out agreement or strong identification with content or with another person or group (e.g., ‘liking’ content on a page may not imply it personally applies to the user who likes it); (3) Since the app was allowed to collect only limited information on a user’s ‘friends’, how much of it could be truly relevant and sufficient for inferring personality traits? On the other hand, for whatever traits could be deduced, data analyst and whistleblower Christopher Wylie, who brought the affair out to the public, suggested that the project for Trump had picked up on various sensitivities and weaknesses (‘demons’ in his words). Personalised messages were devised accordingly to persuade or lure voters-users likely to favour Trump to vote for him. This is probably not the way users would want sensitive and private information about them to be utilised.

  • Consider users in need of help who follow and ‘like’ content of pages of support groups for bereaved families (e.g., of soldiers killed in service), combatting illnesses, or facing other types of hardship (e.g., economic or social distress): making use of such behaviour for commercial or political gain would be unethical and disrespectful.

Although GSR’s app may have properly received the consent of users to draw information about them from Facebook, it is argued that deception was committed on three counts: (a) the consent was given for academic use of the data — users were not consenting to participate in a political or commercial advertising campaign; (b) data on associated ‘friends’, according to Facebook, was allowed at the time only for the purpose of learning how to improve users’ experiences on the platform; and (c) GSR was not permitted at any time to sell or transfer such data to third-party partners. We are in the midst of a ‘blame game’ among Facebook, GSR and CA over the transfer of data between the parties and how it was used in practice (e.g., to what extent Kogan’s model was actually used in Trump’s campaign). It is a magnificent mess, but this is not the space to delve into its small details. The greater question is what lessons will be learned and what corrections will be made following the revelations.

Mark Zuckerberg, founder and CEO of Facebook, gave testimony at the US Congress in two sessions: a joint session of the Senate Commerce and Judiciary Committees (10 April 2018) and before the House of Representatives Energy and Commerce Committee (11 April 2018). [Zuckerberg declined a call to appear in person before a parliamentary committee of the British House of Commons.] Key issues about the use of personal data on Facebook are reviewed below in light of the opening statements and replies given by Zuckerberg to explain the policy and conduct of the company.

Most pointedly, Facebook is charged that despite receiving reports concerning GSR’s app and CA’s use of data in 2015, it failed to ensure in time that the personal data in the hands of CA was deleted from their repositories and that users were warned about the infringement (before the 2016 US elections), and that it took at least two years for the social media company to confront GSR and CA more decisively. Zuckerberg answered in his defence that Cambridge Analytica had told them “they were not using the data and deleted it, we considered it a closed case”; he immediately added: “In retrospect, that was clearly a mistake. We shouldn’t have taken their word for it”. This line of defence is acceptable coming from an individual acting privately. But Zuckerberg is not in that position: he is the head of a network of two billion users. Despite his candid admission of a mistake, such conduct is not becoming of a company the size and influence of Facebook.

At the start of both hearing sessions Zuckerberg voluntarily and clearly took personal responsibility and apologized for mistakes made by Facebook, while committing to take measures (some already in place) to prevent such mistakes from being repeated. A very significant realization came when Zuckerberg conceded in the House: “We didn’t take a broad view of our responsibility, and that was a big mistake” — it goes right to the heart of the problem in Facebook’s approach to the personal data of its users-members. Privacy of personal data may not seem worth money to the company (i.e., vis-à-vis revenue coming from business clients or partners), but the company’s whole network business apparatus depends on its user base. Zuckerberg committed that Facebook under his leadership will never give priority to advertisers and developers over the protection of users’ personal information. He will surely be held to these words.

Zuckerberg argued that the advertising model of Facebook is misunderstood: “We do not sell data to advertisers”. According to his explanation, advertisers describe to Facebook the target groups they want to reach, and Facebook traces those users and places the advertising items. It is less clear who composes and designs the advertising items, which also needs to be based on knowledge of the target consumers-users. However, there seems to be even greater ambiguity and confusion in distinguishing between the use of personal data in advertising by Facebook itself and the access to and use of such data by third-party apps hosted on Facebook, as well as in distinguishing between the types of data about users (e.g., profile, content posted, responses to others’ content) that may be used for marketing actions.

Zuckerberg noted that the ideal of Facebook is to offer people around the world free access to the social network, which means it has to feature targeted advertising. He suggested in the Senate that there will always be a pay-free version of Facebook, yet refrained from saying when, if ever, there will be a paid, advertising-free version. It remained unclear from his testimony what information is exchanged with advertisers and how. Zuckerberg insisted that users have full control over their own information and how it is being used. He added that Facebook will not pass personal information to advertisers or other business partners, to avoid an obvious breach of trust, but it will continue to use such information to the benefit of advertisers because that is how its business model works (NYTimes.com, 10 April 2018). It should be noted that whereas users can choose who is allowed to see information like posts and photos they upload for display, that does not seem to cover other types of information about their activity on the platform (e.g., ‘likes’, ‘shares’, ‘follow’ and ‘friend’ relations) and how it is used behind the scenes.

Many users would probably want to continue to benefit from being exempt from paying a monetary membership fee, but they can still be entitled to some control over which adverts they value and which they reject. The smart systems used for targeted advertising could be less intelligent than they purport to be. Hence more feedback from users may help to assign them well-selected adverts that are of real interest, relevance and use to them, and thereby increase efficiency for advertisers.

At the same time, while Facebook may not sell information directly, the greater problem appears to be the information it allows apps of third-party developers to collect about users without their awareness (or rather their attention). In a late wake-up call at the Senate, Zuckerberg said that the company is reviewing app owners who obtain a large amount of user data or use it improperly, and will act against them. Following Zuckerberg’s effort to go into the details of the terms of service and to explain how advertising and apps work on Facebook, and especially how they differ, Issie Lapowsky reflects in Wired: “As the Cambridge Analytica scandal shows, the public seems never to have realized just how much information they gave up to Facebook”. Zuckerberg emphasised that an app can get access to raw user data from Facebook only by permission, yet this standard, according to Lapowsky, is “potentially revelatory for most Facebook users” (“If Congress Doesn’t Understand Facebook, What Hope Do Its Users Have”, Wired, 10 April 2018).

There can be great importance to how an app asks for the permission or consent of users to pull their personal data from Facebook, and how clearly and explicitly it is presented, so that users understand what they agree to. The new General Data Protection Regulation (GDPR) of the European Union, coming into effect within a month (May 2018), is specific on this matter: it requires explicit ‘opt-in’ consent for sensitive data and unambiguous consent for other data types. The request must be clear and intelligible, in plain language, separated from other matters, and include a statement of the purpose of the data processing attached to the consent. It is yet to be seen how well this ideal standard is implemented, and whether it extends beyond the EU. Users are of course advised to read such requests for permission to use their data carefully, in whatever platform or app they encounter them, before they proceed. However, even if no information is concealed from users, they may not be attentive enough to comprehend the request correctly. Consumers engaged in shopping often attend to only some prices, remember them inaccurately, and rely on a more general ‘feeling’ about the acceptable price range or its distribution. If applying users’ data for personalised marketing is a form of price they are expected to pay, a company taking this route should approach the data as fairly as it sets monetary prices, regardless of how well its customers are aware of the price.

  • The GDPR specifies that personal data relating to an individual is protected if it “can be used to directly or indirectly identify the person”. This leaves room for interpretation of what types of data about a Facebook user are ‘personal’. If data is used, and even transferred, at an aggregate level of segments, there is little risk of identifying individuals; but for personally targeted advertising or marketing one needs data at the individual level.

Zuckerberg agreed that some form of regulation over social media will be “inevitable”, but cautioned that “We need to be careful about the regulation we put in place” (Fortune.com, 11 April 2018). Democratic House Representative Gene Green posed a question about the GDPR, which “gives EU citizens the right to opt out of the processing of their personal data for marketing purposes”. When Zuckerberg was asked “Will the same right be available to Facebook users in the United States?”, he replied “Let me follow-up with you on that” (The Guardian, 13 April 2018).

The willingness of Mark Zuckerberg to take responsibility for mistakes and apologise for them is commendable. It is regrettable, nevertheless, that Facebook under his leadership did not act a few years earlier to correct those mistakes in its approach and conduct. Facebook should be ready to act in time on its responsibility to protect its users from harmful use of data personally related to them. It can be optimistic and trusting yet realistic and vigilant. Facebook will need to care as much for the rights and interests of its users as it does for its other stakeholders in order to gain the continued trust of all.

Ron Ventura, Ph.D. (Marketing)




Human thinking processes are rich and variable, whether in search, problem solving, learning, perceiving and recognizing stimuli, or decision-making. But people are subject to limitations on the complexity of their computations and especially on the capacity of their ‘working’ (short-term) memory. As consumers, they frequently have to struggle with large amounts of information on numerous brands, products or services with varying characteristics, available from a variety of retailers and e-tailers, stretching consumers’ cognitive abilities and patience. Wait no longer: a new class of increasingly intelligent decision aids is being put forward to consumers by the evolving field of Cognitive Computing. Computer-based ‘smart agents’ will get smarter; most importantly, they will be more human-like in their thinking.

Cognitive computing is set to upgrade human decision-making, consumers’ in particular. According to IBM, a leader in this field, cognitive computing is built on methods of Artificial Intelligence (AI) yet intends to take the field a leap forward by making it “feel” less artificial and more similar to human cognition. That is, human-computer interaction will feel more natural and fluent if the thinking processes of the computer more closely resemble those of its human users (e.g., manager, service representative, consumer). Dr. John E. Kelly, SVP at IBM Research, provides the following definition in his white paper introducing the topic (“Computing, Cognition, and the Future of Knowing”): “Cognitive computing refers to systems that learn at scale, reason with purpose and interact with humans. Rather than being explicitly programmed, they learn and reason from interactions with us and from their experiences with their environment.” The paper seeks to rebut claims of any intention behind cognitive computing to replace human thinking and decisions. The motivation, as suggested by Kelly, is to augment the human ability to understand and act upon the complex systems of our society.

Understanding natural language was long a human cognitive competence that computers could not imitate. However, comprehension of natural language, in text or speech, is now considered one of the important abilities of cognitive computing systems. Another important ability concerns the recognition of visual images and the objects embedded in them (e.g., face recognition receives particular attention). Furthermore, cognitive computing systems are able to process and analyse unstructured data, which constitutes 80% of the world’s data according to IBM. They can extract contextual meaning so as to make sense of unstructured data (verbal and visual). This is a marked difference between the new cognitive computing systems and traditional information systems.

  • The Cognitive Computing Forum, which organises conferences in this area, lists a dozen characteristics integral to those systems. In addition to (a) natural language processing; and (b) vision-based sensing and image recognition, they are likely to include machine learning, neural networks, algorithms that learn and adapt, semantic understanding, reasoning and decision automation, sophisticated pattern recognition, and more (note that there is an overlap between some of the methodologies on this list). They also need to exhibit common sense.

The power of cognitive computing derives from its combination of cognitive processes attributed to the human brain (e.g., learning, reasoning) with the enhanced computation (complexity, speed) and memory capabilities of advanced computer technologies. In terms of intelligence, it is acknowledged that the cognitive processes of the human brain are superior to anything computers could achieve through conventional programming. Yet the actual performance of human cognition (‘rationality’) is bounded by memory and computation limitations. Hence, we can employ cognitive computing systems that are capable of handling much larger amounts of information than humans can, while using cognitive (‘neural’) processes similar to humans’. Kelly posits in IBM’s paper: “The true potential of the Cognitive Era will be realized by combining the data analytics and statistical reasoning of machines with uniquely human qualities, such as self-directed goals, common sense and ethical values.” It is not yet sufficiently understood how cognitive processes physically occur in the human central nervous system. But, it is argued, knowledge and understanding of their operation or neural function is growing sufficient for emulating at least some of them by computers. (This argument refers to the concept of different levels of analysis that may and should prevail simultaneously.)

The distinguished scholar Herbert A. Simon studied thinking processes from the perspective of information processing theory, which he championed. In the research he and his colleagues conducted, he traced and described in a formalised manner strategies and rules that people utilise to perform different cognitive tasks, especially solving problems (e.g., his comprehensive work with Allen Newell on Human Problem Solving, 1972). In his theory, any strategy or rule specified — from more elaborate optimizing algorithms to short-cut rules (heuristics) — is composed of elementary information processes (e.g., add, subtract, compare, substitute). On the other hand, strategies may be joined in higher-level compound information processes. Strategy specifications were subsequently translated into computer programmes for simulation and testing.

Simon’s main objective was to gain a better understanding of human thinking and the cognitive processes involved therein. He proclaimed that computer thinking is programmed in order to simulate human thinking, as part of an investigation aimed at understanding the latter (1). Thus, Simon did not explicitly aim to overcome the limitations of the human brain but rather to simulate how the brain may work around those limitations to perform various tasks. His approach, followed by other researchers, was based on recording how people perform given tasks and testing the efficacy of the process models through computer simulations. This course of research differs from the goals of novel cognitive computing.

  • We may identify multiple levels in research on cognition: an information processing level (‘mental’), a neural-functional level, and a neurophysiological level (i.e., how elements of thought emerge and take form in the brain). Moreover, researchers aim to obtain a comprehensive picture of brain structures and areas responsible for sensory, cognitive, emotional and motor phenomena, and how they inter-relate. Progress is made by incorporating methods and approaches of the neurosciences side-by-side with those of cognitive psychology and experimental psychology to establish coherent and valid links between those levels.

Simon created explicit programmes of the steps required to solve particular types of problems, though he also aimed at developing more generalised programmes that would be able to handle broader categories of problems (e.g., the General Problem Solver embodying the Means-End heuristic) and other cognitive tasks (e.g., pattern detection, rule induction) that may also be applied in problem solving. Yet cognitive computing seeks to reach beyond explicit programming and construct guidelines for far more generalised processes that can learn and adapt to data, and handle broader families of tasks and contexts. If necessary, computers would generate their own instructions or rules for performing a task. In problem solving, computers are taught not merely how to solve a problem but how to look for a solution.
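To make the Means-End idea concrete, here is a toy sketch of difference reduction: at each step the solver applies whichever operator brings the current state closest to the goal. The numeric state and the two operators are invented purely for illustration; they are not Simon’s actual programmes:

```python
# Toy Means-End (difference-reduction) heuristic: the state is a number, the
# goal is another number, and the operators transform the state. Both the
# representation and the operators are hypothetical, for illustration only.
OPERATORS = {"add_3": lambda x: x + 3, "double": lambda x: x * 2}

def means_end_search(state, goal, max_steps=20):
    """Greedily apply the operator that most reduces the difference to the
    goal; return the list of operator names, or None if the search stalls."""
    path = []
    for _ in range(max_steps):
        if state == goal:
            return path
        # Pick the operator whose result lies closest to the goal.
        name, op = min(OPERATORS.items(), key=lambda kv: abs(kv[1](state) - goal))
        next_state = op(state)
        if abs(next_state - goal) >= abs(state - goal):
            break  # no operator reduces the difference; the heuristic gives up
        state = next_state
        path.append(name)
    return path if state == goal else None
```

Note that the greedy strategy can fail on solvable problems (returning None), which itself illustrates the point made above: heuristics trade guaranteed success for tractability.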

While cognitive computing can employ greater memory and computation resources than are naturally available to humans, the aim is not truly to create a fully rational system. The computer cognitive system should retain some properties of bounded rationality, if only to maintain resemblance to the original human cognitive system. First, forming and selecting heuristics is an integral property of human intelligence. Second, cognitive computing systems try to exhibit common sense, which may not be entirely rational (i.e., based on good instincts and experience), and to introduce effects of emotions and ethical or moral values that may alter or interfere with rational cognitive processes. Third, cognitive computing systems are allowed to err:

  • As Kelly explains in IBM’s paper, cognitive systems are probabilistic, meaning that they have the power to adapt and interpret the complexity and unpredictability of unstructured data, yet they do not “know” the answer and therefore may make mistakes in assigning the correct meaning to data and queries (e.g., IBM’s Watson misjudged a clue in the quiz game Jeopardy against two human contestants — nonetheless “he” won the competition). To reflect this characteristic, “the cognitive system assigns a confidence level to each potential insight or answer”.
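The confidence-assignment step described above can be sketched roughly as follows; the candidate answers, raw evidence scores and threshold are invented for illustration (the example playfully echoes Watson’s famous ‘Toronto’ misjudgement, though the numbers are made up):

```python
import math

def confidences(scored_candidates):
    """Convert raw evidence scores into confidence levels that sum to 1
    (a softmax over the scores)."""
    exps = {a: math.exp(s) for a, s in scored_candidates.items()}
    total = sum(exps.values())
    return {a: e / total for a, e in exps.items()}

def answer(scored_candidates, threshold=0.5):
    """Return (best_candidate, confidence), or (None, confidence) to abstain
    when even the best candidate falls below the confidence threshold."""
    conf = confidences(scored_candidates)
    best = max(conf, key=conf.get)
    return (best, conf[best]) if conf[best] >= threshold else (None, conf[best])

# Hypothetical candidate answers with made-up evidence scores.
scored = {"Toronto": 2.0, "Chicago": 0.5, "Boston": 0.1}
```

With a stricter threshold the same system abstains rather than risk a wrong answer, which is exactly the role the confidence level plays in Kelly’s description.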

Applications of cognitive computing are gradually growing in number (e.g., experimental projects with the cooperation and support of IBM on Watson). They may not be targeted directly for use by consumers at this stage, but consumers are seen as the end-beneficiaries. The users could first be professionals and service agents who help consumers in different areas. For example, applied systems in development and trial would:

  1. help medical doctors in identifying (cancer) diagnoses and advising their patients on treatment options (it is projected that such a system will “take part” in doctor-patient consultations);
  2. perform sophisticated analyses of financial markets and their instruments in real-time to guide financial advisers with investment recommendations to their clients;
  3. assist account managers or service representatives to locate and extract relevant information from a company’s knowledge base to advise a customer in a short time (CRM/customer support).

The health-advisory platform WellCafé by Welltok provides an example of an application aimed at consumers: the platform guides consumers on healthy behaviours recommended for them, whereby the new assistant, Concierge, lets them converse in natural language to get help on resources and programmes personally relevant to them, as well as various health-related topics (e.g., dining options). (2)

Consider domains such as cars, tourism (vacation resorts), or real estate (second-hand apartments and houses). Consumers may encounter a tremendous amount of information in these domains on numerous options, with many attributes to consider (for cars there may also be technical detail that is more difficult to digest). A cognitive system would help the consumer to study the market environment (e.g., organising information from sources such as company websites and professional and peer reviews [social media], detecting patterns in structured and unstructured data, screening and sorting) and to learn the consumer’s preferences and habits in order to prioritise and construct personally fitting recommendations. Additionally, it is noteworthy that in any of these domains visual information (e.g., photographs) could be most relevant and valuable to consumers in their decision process — the visual appeal of car models, mountain or seaside holiday resorts, and apartments cannot be disregarded. Cognitive computing assistants may raise very high consumer expectations.
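The screening-and-prioritising step can be sketched, in greatly simplified form, as weighted additive scoring of options against a consumer’s preference weights. The attributes, weights and resort options below are purely illustrative; a real cognitive assistant would learn such weights from behaviour rather than take them as given:

```python
def score(option, weights):
    """Weighted additive value of one option (a dict of attribute ratings)."""
    return sum(weights.get(attr, 0.0) * value for attr, value in option.items())

def recommend(options, weights, top_n=2):
    """Rank the named options by personal fit and return the best top_n names."""
    ranked = sorted(options, key=lambda name: score(options[name], weights), reverse=True)
    return ranked[:top_n]

# Hypothetical preference weights elicited or learned for one consumer.
preferences = {"price_value": 0.5, "visual_appeal": 0.3, "location": 0.2}

# Hypothetical holiday-resort options with attribute ratings on a 0-1 scale.
resorts = {
    "seaside": {"price_value": 0.6, "visual_appeal": 0.9, "location": 0.8},
    "mountain": {"price_value": 0.8, "visual_appeal": 0.7, "location": 0.4},
    "city": {"price_value": 0.4, "visual_appeal": 0.5, "location": 0.9},
}
```

Even this crude additive model shows why personalisation matters: a consumer who weights visual appeal more heavily would receive a different ranking from the same option set.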

Cognitive computing aims to mimic human cognitive processes, performed by intelligent computers with enhanced resources on behalf of humans. The capabilities of such a system would facilitate consumers, or the professionals and agents who help them, with decisions and other tasks — saving them time and effort (sometimes frustration), and providing them with well-organised information and customised recommendations for action that users would feel they have reached themselves. Time and experience will tell how comfortably people interact and engage with the human-like intelligent assistants, how productive they indeed find them, and whether using a cognitive assistant becomes the most natural thing to do.

Ron Ventura, Ph.D. (Marketing)

Notes:

1. Herbert A. Simon, “Thinking by Computers”, 1966/2008, reprinted in Economics, Bounded Rationality and the Cognitive Revolution, Massimo Egidi and Robin Marris (eds.) [pp. 55-75], Edward Elgar.

2. The examples given above are described in IBM’s white paper by Kelly and in: “Cognitive Computing: Real-World Applications for an Emerging Technology”, Judit Lamont (Ph.D.), 1 Sept. 2015, KMWorld.com


Retail banking is built on trust; it is at the core of the ‘public license’ to manage the accounts of customers. Think of phrases such as “People trust the bank with their money” or “We entrust our income in the hands of a banker”. Consumers often have a lot at stake held in the bank: their livelihoods and their hopes of using the accumulated funds to improve their quality of life in the future. They expect to have ready access to the money in their accounts, before seeking more money via credit and loans from the bank. Banks are additionally expected to offer account holders means of making a financial profit on their money. Since the financial crisis of 2008, the depletion of consumer trust in the banking system has been troubling many countries. A question still hangs, as valid as it was five years ago: how should banks regain consumer trust and improve their relationships with customers?

Digital banking and financial services are proliferating, and not just since yesterday. For example, consumers can view account information and perform a selection of banking operations in their accounts by ‘self-service’ on the Internet; the practice of these activities is gradually spreading from desktop and laptop computers to mobile devices. Yet digital financial services or features are also provided by a variety of non-banking companies, non-profit organizations and institutions, most notably in the area of digital ‘remote’ payment, whether via a debit/credit card or a third-party utility (e.g., PayPal). The features are becoming increasingly available through mobile apps. Undoubtedly, applying digital banking services remotely and independently can smooth and facilitate consumers’ everyday account follow-up and operations, save them time, and increase efficiency in managing their accounts. But digital banking may prove the opposite of the course needed to help banks regain and rebuild their customers’ trust — it risks instead increasing the distance between banks and customers. For instance, is reliance on digital banking appropriate in managing an investment portfolio?

  • Complicating matters, many of the digital service tools are developed by financial technology (fintech) companies for execution online or in mobile apps. These companies are leading the field in developing such tools, and are said to be leaving most banks lagging behind. Fintech companies allow retailers to offer shoppers different options for digital payment, and even to run some form of current or expense accounts with them; investment houses and financial consultants can employ advanced tools to better update and communicate with their customers; other fintech work includes applications for assisting consumers in managing their personal finances and portals for mediating peer-to-peer loans.

At a conference of the Bank of Israel (the central bank), titled “The Technology Changes the Face of Banking” (3/3/16, Hebrew), the Banking Supervisor, Dr. Hedva Ber, embraced the expansion of digital banking, in vision and in action. She encouraged increased communication between banks and customers by digital means, guided by rules of conduct set by her department. Consumers less accustomed to using digital services will have to be accommodated to help them adjust through the process (e.g., by operating limited or temporary ‘pop-up’ branches where ‘fixed’ branches are to close down). But eventually a broad transition will take place, and the intention is to include all parts of the population in the transformation of retail banking. The key instrument to achieve that goal will be digital education of banking customers, joined by enforcing a principle of customers’ ownership of their personal information and creating a ‘credit profile’ for each customer. There is also a plan to advance the establishment of a fully digital ‘branchless’ bank. Dr. Ber further spoke in favour of computer-automated (AI) replies to customers on the phone.

This transition is likely to result in a significant reduction in the number of employees (mainly those engaged in back-office processes). The Supervisor projected that the digital transformation of banking will give customers better control over their financial situation, bring greater transparency, expand banks’ baskets of products and services, and foremost contribute to increased efficiency. Several references to ‘efficiency’ could be noticed in the presentation, but none regarding ‘trust’.

An initial requisite for trust is competence: the fundamental ability of the organisation to perform the tasks it took upon itself. The building blocks of the expected competence are knowledge, skills and resources. Chaudhuri and Holbrook (2001) used the definition: “The willingness of the average consumer to rely on the ability of the brand to perform its stated function” (p. 82). The researchers studied the effect of brand trust and brand affect on brand performance, mediated through loyalty. In their view, brand trust develops through a deliberate, well-thought-out process, whereas brand affect develops more spontaneously, immediately and with less careful reasoning. They found that trust and affect each contribute to purchase (behavioural) and attitudinal brand loyalty, whereby purchase loyalty is positively related to market share and attitudinal loyalty contributes to higher price premiums. In particular, brand trust and commitment are both important for developing a valued customer relationship (1).

With respect to retail banking, the key competence asked of banks is to protect the money of their customers; it is about safekeeping, or the customer’s feeling that his or her money is ‘kept in good hands’. That kind of attitude may be hard to foster if all the contacts a customer has with the bank are indirect, through computers. Trust is built between people; therefore, customers should be able to meet at least a few representatives of the bank who will instill in them the notion that someone cares about them and is taking good care of their money. Such a representative could be an adviser or ‘advocate’ for the customer in the bank.

  • Taking good care of the customer’s money includes warning him against taking excessive investment risks, just as the bank should act responsibly in its own risk management.

Another vital requisite for trust maintains that the organisation (bank) should act in the interest of its customers and not just in its self-interest. For example, it means that the bank creates and offers saving programmes that are fair and beneficial to the customer, protecting her money with the plus of a reasonable interest rate (as opposed to reducing cost by paying rates that are too low). The risk of self-interest on the bank’s side may be more pronounced in offering so-called ‘structured’ investment products that oftentimes use complex rules, obscuring from the investor in whose interest the product will work best. Peppers and Rogers offer the concept of a ‘trusted agent’: in a relationship wherein the customer trusts the enterprise to act in his own interest, “the customer perceives the enterprise to be his trusted agent, making recommendations and giving advice that furthers the customer’s interest, even when it occasionally conflicts with the enterprise’s self-interest, at least in the short-term” (p. 78). Although relationships can exist without trust, it should be obvious that they can become stronger, and grow in value, only when built on trust — trust-based relationships evoke greater dedication (2).

  • We can see how the position of a ‘customer advocate’ relates to fulfilling this requisite, ensuring that the bank is acting in the customer’s interest.

Credibility and reliability are additional important antecedents of trust. Credibility would manifest in the bank’s practice of providing correct information about the products and services it offers or delivers, being able to provide them, and standing behind them. Furthermore, in the current state of customer relationship management, offering a financial product would be more credible if it is selected as more suitable for a specific customer, based for example on his current bank assets and risk attitude. That is, the offer would be more credible if based on knowledge of the customer, to fit him better. Reliability concerns more specifically aspects of the accuracy of information and the execution of instructions in time as intended (i.e., predictability). Objectives of credibility and reliability can be achieved in offerings made through platforms of online or mobile digital banking, but trust relies on more than these two criteria alone.

Charles Green (President of Trusted Advisor Associates, 2004) formulated that credibility, reliability and intimacy enhance customer trust whereas self-orientation diminishes trust in the company (acting as a discount factor). Green describes intimacy as follows: “Intimacy has to do with perceived safety: ‘I can trust talking with him about…'”. He associates intimacy with security and integrity (3). The aspect of intimacy is noteworthy because in banking it corresponds most closely to the kind of delicate affairs that may arise in bank-customer relationships about one’s finances. It is about the level of confidence a customer can put in the bank, based on the integrity and consideration he or she can find during any dealings with it and its employees. It is hard to talk about intimacy in human-computer interactions. Integrity, too, is reflected in the conduct of human bank representatives, much less through digital interactions.
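Green’s formulation is often arranged into his well-known ‘trust equation’, with the three factors in the numerator building trust and self-orientation discounting it. A minimal sketch (the numeric scores below are purely illustrative, not measurements):

```python
def trust_score(credibility, reliability, intimacy, self_orientation):
    """Green's trust equation: T = (C + R + I) / S.
    Credibility, reliability and intimacy add to trust;
    self-orientation acts as the discount factor."""
    return (credibility + reliability + intimacy) / self_orientation

# A bank strong on digital credibility and reliability but weak on intimacy
digital_only = trust_score(credibility=8, reliability=9, intimacy=2, self_orientation=5)
# The same bank with a human adviser raising intimacy and lowering perceived self-orientation
with_adviser = trust_score(credibility=8, reliability=9, intimacy=7, self_orientation=3)
print(round(digital_only, 2), round(with_adviser, 2))  # 3.8 8.0
```

The arithmetic makes the argument of this post concrete: improving credibility and reliability alone, as digital banking does, moves the score far less than also adding intimacy and reducing perceived self-orientation.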

Intimacy should not be confused with the personalisation that can be achieved with analytics-based digital tools (e.g., a ‘Digital First’ strategy that puts most weight on digital channels, as suggested by Accenture). It is wrong to equate computer-based personalisation with the intimacy of talking with another person. Talking with an expert adviser on more complex financial services is especially not equivalent to automated customisation, though analytic tools may help the adviser in making her recommendations. Demitry Estrin (Vision Critical) addresses banks’ eroding relationships with customers, who blame banks for treating people as numbers. He explains: “Nothing would address the problem better than face-to-face encounter, but these are increasingly rare. In fact, the problem is self-perpetuating: the less people interact with financial services professionals, the less they value them, and the companies they work for.”

Customers are looking to combine interactions in different modes (e.g., mobile, online, phone, face-to-face), but those human and digital interactions have to be streamlined, and information exchanged in them should be coordinated within the bank. In a white paper of IBM on “Rebuilding Customer Trust in Retail Banking” (Sept. 2012), the technology and consulting company claims that banks’ working methods (e.g., rewards, targets, metrics) have created more competition than co-ordination between channels. Banks have taken different measures that seem to make customers feel they are treated more conveniently and friendly, efficiently, even fairly, but not necessarily feeling that the bank thinks of each of them as a person. In that respect, consumers see banks as falling behind other companies they interact with on digital platforms.

The IBM paper optimistically argues: “Fortunately, trust and digital communication channels can be and are best built together.” This is true, but only to a limited extent. It is possible to maintain a certain degree of trust to allow digital communication to succeed, but that trust can grow only so far. Digital banking can provide efficiency, convenience, reliability, even credibility, but that is not enough for building the high level of trust that breeds commitment and dedication. It is doubtful that digital banking can remedy the deeper problems of trust in banks. Perhaps the answer is better found in a combination of human and digital modes of delivering banking services for fostering trust.

  • Digital banking, particularly communication via the Internet, raises additional issues of protecting data from cyber-attacks and securing customer data privacy. Acting on those matters to reduce threats is vital to building trust, yet it would not address the original causes of declining trust, which are not digital-related.

Even within a bank branch the scene can change — a new model is emerging, presenting a novel combination of digital self-service and human service. Most likely, future branches will no longer have human tellers; otherwise, however, digital and human services will be intertwined in new design concepts. In the near future, a customer may find in a branch a central arena with personal working posts equipped with self-service terminals, where each customer can view account information and perform various operations; the customer will then be able to proceed to talk with ‘advisers’ sitting in the periphery and settle more complex issues such as loans or investments (e.g., RBC-Royal Bank of Canada, HSBC’s flagship branch in Singapore). At RBC, customers may sit comfortably to read materials (print, online) or watch instructive videos on a large screen about financial products and related topics, so that they may prepare before talking with an adviser. BMO Harris Bank is experimenting with ‘video tellers’ for assisting customers; representatives on stand-by, holding tablets, are available to help with any difficulty. There is also a trend to change the visual design of branches to make them look and feel more like shops: less formal, more friendly and rejoicing in colour and form.

Customers are seeking a combination of user-friendly digital tools and human expert advisory on more complex issues. To that end, Mike Baxter and Darrell Rigby advocate a combined ‘digical’ approach: a mashup of digital technologies and physical facilities (“Rethinking the Bank Branch in a Digital World“, HBR, 15 Sept. ’14). The authors argue that combined technological and human services can be implemented on-site within a branch — as illustrated above. They note that financial products and services are often complicated, and security and trust are paramount. Baxter and Rigby conclude: “Physical banking is evolving rapidly, but not disappearing. Branches may be fewer in number, but they will be more useful and efficient, and banks without branches are likely to find themselves at a competitive disadvantage.”

Human banking and digital banking are like two arms of the retail bank. Banks have to provide digital ‘self-service’ tools to allow customers to manage their accounts of different kinds more conveniently and efficiently, at an acceptable level of reliability; banks gain from this as well, in efficiency and cost reduction. Digitisation of banking services extends from the long-running ATMs to more advanced information ‘kiosk’ terminals and remote online and mobile banking utilities. However, digital banking is becoming a necessity, not a basis for competitive advantage for banks. If it were all about digital services, customers would find it even easier to look for more friendly and useful financial services from non-banking companies, and their commitment to retail banks could decline further.

Retail banks need the ‘human arm’ to differentiate themselves from external competition and to develop excellence in competition with other banks. The human arm is also essential to regaining and fostering trust, and to tightening and strengthening banks’ relationships with their customers. In branches, it will be a question of creating a friendly atmosphere and striking a useful balance between digital utilities and the assistance and expertise of human personnel.

Ron Ventura, Ph.D. (Marketing)

Notes:

1. The Chain of Effects from Brand Trust and Brand Affect to Brand Performance: The Role of Brand Loyalty; Arjun Chaudhuri and Morris B. Holbrook, 2001; Journal of Marketing, 65 (2), pp. 81-93.

2. Customer Relationships: Basic Building Blocks of IDIC and Trust (Ch. 3), Managing Customer Relationships: A Strategic Framework; Don Peppers and Martha Rogers, 2004; John Wiley & Sons, Inc.

3. The Trust Equation: Generating Customer Trust; Charles H. Green; in (2), pp. 72-77.

 


In late February the annual Mobile World Congress (MWC) 2016 took place in Barcelona, including a large festive exhibition and a conference next to it. The leading motto of the MWC declared that “Mobile Is Everything”. This motto, directed primarily at people involved in the mobile industry, on either the technology side or the management side, could help to increase their interest in the event, create a uniting theme, and energise them to be part of the congress and its community. But what does this ‘invitation’ tell client companies operating mainly outside the field of mobile telecom and technology? Moreover, what does this call suggest for the lives of consumers?

A little over 100,000 people from 204 countries attended the MWC this year, according to the MWC official website. Some 2,200 companies were represented in the exhibition; alongside it, the conference hosted speeches and panel discussions by experts and business leaders. Intensive media coverage on TV, online and in the press made sure that news from the event reached almost everyone. Everything important, it would appear, happened that week at the MWC.

In the exhibition, companies presented their technological solutions, methods and products. Each company could summarily describe its areas of specialisation by classification into any of 90 different product categories (companies most frequently applied 3-5 categories). A remarkable variety of mobile-related products, applications and services were shown in the exhibition: mobile devices (i.e., latest models of smartphones and tablets); accessories and mobile-supported peripheral equipment (e.g., virtual reality [VR], 3D printing, Internet of Things [IoT]); mobile apps; and equipment and services connected with mobile communication (e.g., infrastructure, business & tech consulting, data analysis). While some companies demonstrated apps designed to be used by consumers, most exhibitors offered platforms for developing apps (custom or adapted) and mobile-oriented methodologies and services intended for business clients.

  • The classification highlights the salience of mobile apps these days. It is interesting to note that out of the ninety categories, five were dedicated to App Development: General, Film, Gaming, Music, and Shopping.

Key areas associated with digital marketing (e.g., data analysis, CRM, content management) need to be extended from online (PC-based) platforms to smart mobile devices. Clearly, technology companies that were not originally in the mobile industry have to adapt and add digital solutions for the mobile channel accordingly. Yet it is no less a challenge for companies in lines of business that only use digital technologies to improve their performance (e.g., food, cosmetics, fashion, retail) to keep pace with the latest developments — in mobile communication in this case. Some companies may produce their solutions in-house, but many others have to hire specialist companies to provide them with systems or services tailored to their needs. Companies of that kind, offering business solutions in a mobile context, would most likely be found at the MWC.

Mobile Advertising and Marketing was one of the more crowded categories (290 companies classified). One of the issues receiving particular attention in companies’ offerings is targeted advertising on mobile devices, as well as improved targeting techniques for mobile apps. This category is closely tied to data analysis (e.g., to provide input for implementing more accurate personalised targeting), and is also connected with topics of customer relationship management (e.g., loyalty clubs) and content management in the mobile environment. For example, Ingenious Technologies (Germany) is an independent provider of cloud utilities for business analytics and marketing automation (e.g., omni-channel activities, tracking customer journeys), and Jampp (UK) specialises in app marketing, offering ways to grow consumer engagement in mobile apps (e.g., combining machine learning with methods of big data and programmatic buying). Exhibitors also addressed an increasing concern with monetization, that is, the ability of businesses to charge and collect payments for content or for products and services that can be ordered on mobile devices, especially via apps.

In an era that promotes digital and data-driven marketing, it becomes imperative to cover and analyse data from mobile touchpoints. The category of Data Analysis (148 companies) includes the marketing aspect, yet relates to applications in other fields as well. Among the applications concerned: integrating predictive analytics with campaign management (e.g., Lumata [UK]); an analytic database platform for IoT and processing app-based queries (e.g., Infobright [Canada]); and traffic analytics for enhancing urban mobility of vehicles and people (e.g., INRIX [UK]).

In the category of Consumer Electronics (222 companies) one may find: (a) devices (e.g., Samsung Galaxy S7 smartphones); (b) accessories (e.g., SanDisk’s portable data storage solutions, fast charging [Zap-go-charger, UK] or portable power backup [CasePower, Sweden]); and (c) components (e.g., LED components by Ledmotive [Spain]). But there were also some less usual devices such as a wearable device for tracking a dog’s health and fitness, which comes with an app (Sense of Intelligence [Finland]).

  • The area of audio (music) and video playing gains special interest, and is further connected to gaming and mobile entertainment overall. A couple of examples under the heading of consumer electronics: software for audio enhancement (AM3D A/S [Denmark]); a mobile video platform supporting live streaming and video chat (avinotech [Germany]). Video also appears in the context of content management, such as an advanced technology for accelerating the display of video content in HD TV quality (Giraffic [Israel]).

This brief review would not be complete without the rising category of Location Technologies and Services (141 companies). Location technologies and their applications can be found in different areas, not just marketing or shopping. For instance, a French company (Sensineo) offers an ultra-low-power GPS tracking and positioning device which may help in locating cars or dogs, but, more importantly, in tracing vulnerable people who may have lost their way and need support or medical assistance — location apps and mobile alarm devices emerge as new aids to healthcare. In the context of advertising, we may refer to technologies that bridge the online and offline domains (e.g., targeting by combining text analysis of consumers’ conversations in social media with intelligence on where they go in the physical world [Cluep, Canada], or eliciting online-to-offline engagement in brand or retail campaigns [Beintoo, Italy]). Another technology (by Pole Star [France]) specialises in indoor location, involving analytics through precise geofencing (i.e., activation as people enter specified perimeters) and proximity detection. The last three examples have apparent relevance to consumer behaviour during shopping trips.
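At its simplest, the geofencing idea mentioned above — activation as a person enters a specified perimeter — reduces to a distance test against a circular fence. A minimal sketch, where the coordinates and radius are hypothetical and not any vendor’s actual implementation:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(user, centre, radius_m):
    """Trigger when a user's position falls inside the specified perimeter."""
    return haversine_m(user[0], user[1], centre[0], centre[1]) <= radius_m

# Hypothetical store perimeter in Barcelona, 150 m radius
store = (41.3874, 2.1686)
print(in_geofence((41.3880, 2.1690), store, 150))  # True: the point is roughly 75 m away
```

Production indoor-location systems add proximity detection via beacons and far more precise positioning, but the enter/exit test against a defined perimeter remains the core mechanism.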

  • In regard specifically to the development of shopping mobile apps (46 companies), exhibitors seemed to refer more to technologies that may support shopping utilities than to examples of apps that truly connect retailers and shoppers. As an example of a more relevant app, Tiendeo Web Marketing (Spain) offers an app, working in partnership with retail chains, that informs consumers of weekly ads, deals or coupons in their area of residence.

For businesses that are client-users of technologies and associated services, the message is very clear — in order to be accessible and relevant to consumers, a business must have mobile presence. Consumer brands of products and services, and in retail, cannot afford to neglect the mobile channel. Moreover, they must make a strong showing because the competition is intense and ‘mobile is everything’. The need to be present and useful via mobile devices (mobile websites and apps) is undisputed. As more consumers are engaged with their smartphones much of the time, and perform more tasks in mobile mode, companies should be there, available to them. The idea, however, that this is all that matters for marketing and customer service is dubious. Companies are under endless pressure to keep up to date with continuous advances in technology. Technology and consulting companies remind their clients all the time that in order to be competitive they must apply the most advanced mobile features and tools. But companies have to be available, effective and attractive through multiple channels, and the kind of pressure implied by the MWC’s motto is neither helpful nor productive.

The danger is that companies engaged in consumer marketing may neglect other important channels in the attempt to develop a strong mobile presence. In fact, this kind of shift to interactions through newer technological channels has been happening for years. The latest shift advised to companies is from Web 2.0 on personal computers to mobile websites and apps. It could mean that companies would be forced to invest more in the mobile compatibility of their websites while neglecting improvement of the functionality and visual attractiveness of their regular websites. One of the implications of the shift to online and mobile touchpoints is a reduction in direct human interactions (e.g., fewer brick-and-mortar service branches, fewer service hours, not enough trained and skilled personnel in call centres). But consumers continue to appeal to call centres for help, and when faced with inadequate assistance they are encouraged to prefer computer-based interactions. More companies offer customers options to chat by text, audio and video, but on the other hand they also refer customers more frequently to virtual agents. The mobile facilities are not desirable for everyone, and at least not all of the time; having the most advanced technology is not always an advantage, except for tech enthusiasts.

Companies that develop technologies and market hardware and software products and associated services are in a constant race to provide more advanced, competent solutions. It starts to be a problem when too many companies pursue a single main course — mobile, in our case. It is the kind of push induced by the MWC’s organizers that should worry us. The interest of the GSMA — a consortium of mobile telecom operators, joined by device manufacturers, software companies etc. (the “broader mobile ecosystem”) — in putting mobile under the spotlight is clear. However, following the claim that “mobile is everything” can have negative consequences for many stakeholders in the industry and also for the general public. There is a concerning sense of rush to develop apps and all other sorts of mobile products and utilities. It may never develop into a bubble like the one fifteen years ago, because the conditions are different and better (i.e., stronger technological foundations, greater experience), but there are disturbing signs that should alert stakeholders.

It is hard to argue with the many conveniences that mobile phones, particularly smartphones, provide to consumers. Basically, if one is late for a meeting, wants to set a meeting point with a friend in the city, or just needs to update a colleague in the office about anything, he or she can call while out on the way somewhere. The phone has become an invaluable time saver, as one can settle professional or business issues at work while travelling. Yet the elevation of mobile phones to computer-based ‘smart’ phones (and, in addition, tablets) has greatly expanded the number and types of tasks people can perform while away from home or office. It is not just sending and receiving voice calls and SMS but also e-mails and various forms of updates on social media networks. One can then check the news and stock prices, prepare shopping lists and compare products and prices while visiting shops, schedule a forgotten appointment with the doctor, order a table at a restaurant for the evening, listen to favourite music, and far more. The point is that at any minute one can find something to do with the smartphone; people cannot lose hold and sight of their smartphones. Smartphones no longer just serve consumers for their convenience; rather, the consumers ‘serve’ the smartphones.

The motto of the MWC could be right in arguing that for consumers ‘mobile is everything’, yet it is also complicit in inducing consumers to become even more preoccupied with their mobile devices and to adopt forms of behaviour that are not honestly to their benefit. Consumers bear a responsibility to notice these effects and moderate their use of mobile devices reasonably. For instance, people not only can call others when convenient but may also be reached by others at less convenient times (e.g., by an employer). Talking and messaging while travelling on a bus, taxi or train is fine, but there are stronger warnings now that people put themselves and others in greater danger if doing so while driving, because it diverts their attention from the road. Being preoccupied with their smartphones causes people in general to look less around them and be less communicative with other people. Immediately sorting out every query on a website or app may lead consumers to hasten purchase decisions unnecessarily and to ignore other channels of resolution (e.g., consulting staff in-store). Finally, relying on mobile devices to find any information instantly online leads people to make less effort to remember and accumulate new knowledge, to retrieve information from memory, and to think (i.e., less cognitive effort).

The motto “Mobile Is Everything” sounds shallow and simplistic. Sweeping generalisations usually do not do much good — they cannot be taken too seriously. Perhaps this title was meant to be provocative, so as to fuel the MWC with enthusiasm, but it can end up aggravating. The field of mobile telecom and digital technology has much to show for itself in achievements in recent years. There is no need to suggest that businesses and consumers cannot do without ‘mobile’ and should invest themselves even more fully in it. Using such a motto is not acting out of strength.

Mobile indeed is a great deal, yet is definitely not everything.

Ron Ventura, Ph.D. (Marketing)

 


Companies are increasingly concerned with the “customer journey“, covering any dealings customers have with their brands, products and services; it has become one of the key concepts associated with customer experience in recent years. Companies are advised to map typical journeys of their customers, then analyse and discuss their implications and consequences with the aim of ameliorating their customers’ experiences.

At the foundation of the customer journey lies a purchase decision process, but the developed concept of a “journey” now expands beyond purchase decisions to a variety of activities and interactions customers (consumers) may engage in, relating to marketing, sales and service. This broad spectrum of reference as to what a journey may encompass could be either the concept’s strength (establishing a very general framework) or its weakness (too generalised, weakly defined). Another important emphasis with respect to contemporary customer journeys accentuates consumers’ tendency to utilise the multiple channels and touch-points available to them, especially technology-supported channels, on their pathway to accomplish any task. Furthermore, interactions in different channels are inter-related in consumers’ minds and actions (i.e., a cross-channel journey). This post-article reviews propositions, approaches and solutions in this area offered by selected consultancy, technology and analytics companies (based on content in their webpages, white papers, brochures and blogs).

Multi-channel, omnichannel, cross-channel — These terms are used repeatedly and most frequently in association with the customer journey. Oracle, for instance, positions the customer journey squarely in the territory of cross-channel marketing. But companies do not always make it sufficiently clear whether these terms are synonymous or have distinct meanings. All of the above descriptive terms agree that consumers more frequently utilise multiple channels and touch-points to accomplish their tasks, yet “cross-channel” refers more explicitly to the flow of the journey across channels — the connectivity and inter-relations between the interactions or activities customers engage in.

Writing for the blog of Nice, “Perfecting Customer Experience”, Natalia Piaggio (5 Feb. 2015) stresses that, for better understanding the end-to-end customer experience through customer journey maps (CJMs), focus should be directed to the flow of interactions between touch-points and not to any single touch-point. She explains that customers usually encounter problems during transitions between touch-points (e.g., inconsistency of information, the company is unable to deliver on a promise, the next channel transferred to cannot resolve the customer’s problem), and therefore touch-points must be considered connectedly. Oracle notes in its introduction to cross-channel marketing that companies should see the big picture and consider how devices (i.e., laptops, smartphones and tablets) are being used in tandem at different points or stages in the customer journey (whether customers use their email inbox, the Web or social media). Paul Barrett (22 Feb. 2010), an industry expert contributing to a blog of Teradata, adds a nice clarification: when talking about (multiple) channels, moments-of-truth relate to individual and separate channels; yet in a cross-channel environment those moments-of-truth are connected into a customer journey. In other words, the customer journey puts moments-of-truth in context. Therefore, cross-channel customer journeys refer to the flow, as well as the inter-dependencies, of the channels and touch-points engaged by a customer.

TeleTech highlights the salience of the multi-channel and cross-channel aspects of the customer journey, and further adds some valuable observations (TeleTech is the parent company of Peppers & Rogers Group, its consultancy arm). First, they propose an association between all three terms above when defining a customer ‘path’ or ‘journey’:

Multichannel signifies the digital and physical channels that customers use in their path to purchase or when seeking support for a product or service. Omnichannel represents the cross-channel path that customers take for product research, support and purchasing.

Notably, in the view of TeleTech, “omnichannel” is more directly associated with “cross-channel”. Also noteworthy is TeleTech’s inclusion of both physical and digital channels. TeleTech emphasise the need to characterise different customer personas and construct for each persona a map of her typical journey through channels and touch-points; thereafter a company should be ready to notice changes in customer behaviour and modify the map accordingly (“Connecting the Dots on the Omnichannel Customer Journey“, 2015 [PDF]). Additionally, Jody Gilliam contends in a blog of TeleTech that companies should attend not only to the inter-relations between touch-points but also to the (reported) mood of customers during their interactions. It is important to describe and map the whole experience ecosystem (The Relationship Dynamic, Blog: How We Think, 19 July 2013).

  • Teradata addresses the complexity introduced by the use of multiple channels through a customer journey from an analytic viewpoint. They propose a multi-touch approach to attribution modelling (i.e., evaluating to what extent each touch-point contributed to a final desired action by the customer). Three model types for assigning weights are suggested: unified (equal) weighting, decay-driven attribution (exponential: the later an interaction, the higher its weight), and precision (customised) weighting.
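The three weighting schemes can be sketched in a few lines of code. This is a minimal illustration of the idea only; the function name, the decay parameter and the normalisation are my own assumptions, not Teradata’s actual implementation:

```python
def attribute_credit(touchpoints, scheme="uniform", decay=0.5, custom=None):
    """Split credit for a conversion across the touch-points of one journey.

    scheme="uniform": every interaction gets equal weight.
    scheme="decay":   exponential weighting; the later an interaction
                      occurs in the journey, the higher its weight.
    scheme="custom":  analyst-supplied weight per touch-point type.
    Returns a list of (touchpoint, share) pairs; shares sum to 1.
    """
    n = len(touchpoints)
    if scheme == "uniform":
        raw = [1.0] * n
    elif scheme == "decay":
        raw = [decay ** (n - 1 - i) for i in range(n)]  # last interaction gets weight 1
    elif scheme == "custom":
        raw = [custom[tp] for tp in touchpoints]
    else:
        raise ValueError("unknown scheme: " + scheme)
    total = sum(raw)
    return [(tp, w / total) for tp, w in zip(touchpoints, raw)]

journey = ["display_ad", "email", "search", "website"]
uniform = attribute_credit(journey)                  # each touch-point gets share 0.25
decayed = attribute_credit(journey, scheme="decay")  # "website", last, weighs most
```

Under decay-driven attribution the final touch-point before conversion receives the largest share of the credit, which matches the intuition that later interactions are usually more influential on the outcome.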

The scope of the customer journey — Consensus is not easy to find on what a customer journey encompasses. On one hand, professional services providers focus on particular components of a journey (e.g., interactions, digital touch-points, purchase or service); on the other hand, there are attempts to present a more all-inclusive approach (e.g., reference to a “customer lifecycle”). It may also be said that a gap currently exists between aims to cover and link all channels and the ability to implement — some of those companies talk more openly about their challenges, particularly of including both digital (e.g., web, social media) and physical (in-store) channels, and linking all types of channels during a journey of a given customer. Oracle relates specifically to the problem of identity multiplicity, that is, the difficulty of establishing that it is actually the same customer across all the channels or touch-points he or she uses; overcoming this challenge is essential to unfolding the whole journey (“Modern Marketing Essentials Guide: Cross-Channel Marketing“, 2014 [PDF]). This challenge is also echoed by Nice, termed identity association (Customer Journey Optimization [webpage]).
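The core of identity association can be illustrated with a small sketch (my own simplification, not Oracle’s or Nice’s actual method): channel-specific records are linked into one customer whenever they share an identifier such as an email address or phone number, using a union-find structure to chain indirect matches together.

```python
from collections import defaultdict

def associate_identities(records):
    """Cluster channel-specific records into customers via shared identifiers.

    records: list of (record_id, [identifier, ...]) pairs, e.g. a web
    profile, a store loyalty card and a call-centre contact.
    Returns a list of sets; each set holds the record_ids of one customer.
    """
    parent = {}

    def find(x):                      # union-find with path compression
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    seen = {}                         # identifier -> first record carrying it
    for rec_id, identifiers in records:
        find(rec_id)                  # register the record even if it matches nothing
        for ident in identifiers:
            if ident in seen:
                union(rec_id, seen[ident])
            else:
                seen[ident] = rec_id

    clusters = defaultdict(set)
    for rec_id, _ in records:
        clusters[find(rec_id)].add(rec_id)
    return list(clusters.values())
```

Note how the web profile and the call-centre contact below are linked only indirectly, through a loyalty card that shares an identifier with each; chaining such indirect matches is exactly what makes the problem hard at scale.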

Another key issue that needs to be addressed is whether a customer journey includes only direct interactions between a customer and a focal company through channels where it operates (e.g., call centre, website, social media) or are there other activities consumers perform towards accomplishing their goal to be accounted for (e.g., searching other websites, consulting a friend, visiting brick-and-mortar stores).

  • In a blog of Verint (In Touch), Koren Stucki refers to a definition of the customer journey as a series of interactions performed by the customer in order to complete a task. Stucki thereafter points out a gap between the straightforward definition and the complexity of the journey itself in the real world. It may not be too difficult to understand the concept and its importance for customer engagement and experience, but capturing customer journeys in practice, identifying and linking all channels the customer uses for a given type and purpose of a journey (e.g., product purchase, technical support), can be far more complicated. Understanding these processes is truly imperative for being able to enhance them and optimise customer engagement (“Why Customer Journeys?“, 16 Sept. 2014).
  • Piaggio (Nice) also relates to the frustration of companies with difficulties in mapping customer journeys. She identifies possible causes as complexity, technical and organizational obstacles to gathering and integrating data, and the dynamic nature of consumer behaviour. She then suggests seven reasons for using CJMs. Accordingly, in their brochure on customer journey optimization, Nice see their greatest challenge in gathering data of different types from various sources and channels, and integrating the data to generate complete sequences of customer journeys; the three main analytic capabilities they offer in their solution are event-sequencing and visualisation in real-time, contact reasoning (a predictive tool), and real-time optimization and guidance (identifying opportunities for improvement).
  • In the first of their four steps to a customer journey strategy — namely, map the current customer journey — IBM state that the customer journey “signifies the series of interactions a customer has” with a brand (IBM refers specifically to digital channels). Importantly, they suggest that customer journeys should be mapped around personas representing target segments. The CJMs should help managers put themselves in their customers’ shoes (“Map and Optimize Your Customer Journey“, 2014 [PDF]).
  • In the blog of TeleTech (How We Think), Niren Sirohi writes about the importance of defining target segments and mapping typical customer journeys for each one. Sirohi emphasises that all stages and modes engaged and all activities involved should be included, not only those in which the company plays a role. Next, companies should identify and understand who the potential influencers are at every stage of the journey (e.g., self, retailer, friend). Then ideas may be generated as to how to improve on customer experiences where the company can exert influence (“A Framework for Influencing Customer Experience“, 16 Oct. 2014).
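The basic analytic step behind all of these mapping exercises, assembling a time-ordered journey per customer from separate channel logs, can be sketched as follows (a hypothetical minimal example, not any vendor’s actual event-sequencing solution):

```python
from collections import defaultdict

def build_journeys(events):
    """Merge raw interaction events from multiple channel logs into one
    time-ordered journey per customer.

    events: iterable of (customer_id, timestamp, channel, action) tuples,
    typically pooled from web, store, call-centre and mobile logs.
    Returns {customer_id: [(channel, action), ...]} ordered by timestamp.
    """
    by_customer = defaultdict(list)
    for customer_id, ts, channel, action in events:
        by_customer[customer_id].append((ts, channel, action))
    # sort each customer's events by timestamp, then drop the timestamp
    return {cid: [(ch, act) for _, ch, act in sorted(evts)]
            for cid, evts in by_customer.items()}
```

In practice the hard part is not the sorting but the preceding steps: resolving customer identity across the logs and deciding where one journey ends and the next begins, which is where the complexity Stucki and Piaggio describe actually lies.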

Customer engagement — This is another prominent viewpoint from which companies approach the customer journey. Nice direct to Customer Journey Optimization via Multi-Channels and Customer Engagement. Verint also present customer journey analysis as part of their suite of Customer Engagement Analytics (also see their datasheet). The analytic process includes “capturing, analysing, and correlating customer interactions, behaviours and journeys across all channels”.  For IBM, the topic of customer journey strategy belongs in a broader context of Continuous Customer Engagement. The next steps for a strategy following mapping (see above) are to pinpoint areas of struggle for customers, determine gaps to fill wherein customer needs and preferences are unmet by current channels and functionalities they offer, and finally strategize to improve customer experiences.

  • Attention should be paid not only to the sequence of interactions but also to what happens during an interaction and how customers react or feel about their experiences. As cited above, Gilliam of TeleTech refers to the mood of customers. Verint say that they apply metrics of customer feedback regarding effort and satisfaction while Nice use text and speech analytics to extract useful information on the content of interactions.

Key issues in improving customer engagement that professional services providers recognize as crucial are reducing customer effort and lowering friction between channels. Effort and struggle by customers may arise during interaction at a single touch-point, but also due to frictions experienced while moving between channels. Behind the scenes, companies should work to break down walls between departments, better co-ordinate functions within marketing and with other areas (e.g., technical support, delivery, billing), and remove the silos that separate departmental data pools and software applications. These measures are necessary to obtain a complete view of customers. At IBM they see departmental separation of functions in a company, and their information silos, as a major “enemy” of capturing complete customer journeys. Ken Bisconti (29 May 2015) writes in their blog Commerce on steps companies can take, from simple to sophisticated (e.g., integrated mapping and a contextual view of customers across channels), to improve their performance in selling to and serving customers across channels, increase loyalty and reduce churn. Genesys see the departmental separation as a prime reason for discrete and disconnected journeys; continuity between touch-points has to be improved in order to reduce customer effort (solution: omnichannel Customer Journey Management). Piaggio (Nice) suggests that input from CJMs can help to detect frictions and reduce customer effort; she also relates to the need to reduce silos and eliminate unnecessary contacts. Lastly, TeleTech also call in their paper on “Connecting the Dots” to break down walls between customer-facing and back-office departments to produce a more seamless cross-channel customer experience.
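One simple analytic device for spotting such friction is to count channel-to-channel transitions across journeys: unusually frequent back-and-forth movements (e.g., web to call centre and back) often flag unresolved hand-offs. A sketch of the counting step (illustrative only, not any vendor’s method):

```python
from collections import Counter

def transition_counts(journeys):
    """Count channel-to-channel transitions across customer journeys.

    journeys: iterable of channel sequences, one per customer journey.
    Returns a Counter mapping (from_channel, to_channel) to frequency;
    unusually frequent pairs are candidates for friction analysis.
    """
    counts = Counter()
    for path in journeys:
        for a, b in zip(path, path[1:]):
            if a != b:                # ignore repeat contacts in the same channel
                counts[(a, b)] += 1
    return counts
```

A high count for a pair such as ("web", "call-centre") suggests that the website fails to resolve something customers then escalate by phone, which is precisely the kind of inter-channel struggle the providers above aim to reduce.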

  • Technology and analytics firms compete on their software (in the cloud) for mapping customer journeys, the quality of journey visualisation (as pathways or networks), their analytic algorithms, and their tool-sets for interpreting journeys and supporting decision-making (e.g., Nice, Verint, Teradata, TeleTech while IBM intend to release their specialised solution later this year).

Varied approaches may be taken to define a journey. From the perspective of a purchase decision process, multiple steps involving search, comparison and evaluation up to the purchase itself may be included, plus at least some early post-purchase steps such as feedback and immediate requests for technical assistance (e.g., how to install acquired software). In addition, a journey of a long-term relationship may refer to repeated purchases (e.g., replacement or upgrade, cross-sell and up-sell). Alternatively, a journey may focus on service-related issues (e.g., technical support, billing). How a journey is defined depends mostly on the purpose of analysis and planning (e.g., re-designing a broad process-experience, resolving a narrow problem).

As consumers’ use of digital applications, interfaces and devices grows and expands to perform many more tasks in their lives (e.g., in self-service platforms), we can expect reliance of CJMs on digital channels and touch-points to become more valid and accurate. But we are not there yet, and it is most plausible that consumers will continue to perform various activities and interactions non-digitally. Consumers also see the task they need or want to perform, not merely the technology employed. Take for example physical stores: shoppers may not wish to spend every visit with a mobile device in hand (and incidentally transmit their location to the retailer). Don Peppers laments that companies have designed customer experiences with a technology-first, customer-second approach whereas the order should be reversed. Adopting a customer perspective is required foremost for effectively identifying frictions on a journey pathway and figuring out how to remove them (“Connecting the Dots”, TeleTech). Excessive focus on technologies can hamper that.

Bruce Temkin (Temkin Group, Blog: Experience Matters) provides lucid explanations and most instructive guidance on customer journey mapping. However, it must be noted, Temkin advocates qualitative research methods for gaining deep understanding of meaningful customer journeys. Quantitative measures are only secondary. He does not approve of confusing CJMs with touch-point maps. His concern about such interpretation is that it may cause managers to lose the broader context in which touch-points fit into consumers’ goals and objectives. Temkin puts even more emphasis on adopting a form of Customer Journey Thinking by employees to be embedded in everyday operations and processes, following five questions he proposes as a paradigm.

There are no clear boundaries to the customer journey, and it is doubtful whether they should be set too firmly — flexibility should be preserved in defining the journey according to managerial goals. A journey should allow for various types of activities and interactions that may help the customer accomplish his or her goals, and it should account not only for their occurrence and sequence but also for content and sentiment. A viewpoint focusing on channels and touch-points, leading further to technology-driven thinking, should be modified. An approach that emphasises customer engagement, but from the perspective of customers and their experiences, is more appropriate and conducive.

Ron Ventura, Ph.D. (Marketing)

Read Full Post »

Since the mid-1990s the dominant approach to marketing has been centered on the customer (cf. previous approaches that emphasised production, the product and sales); more fully, the customer-centric approach evolved from a modern marketing approach, conceived somewhat earlier (1970s to early 1980s), as it sharpened the focus on the customer (*). In this era, theories and concepts have developed of relationship marketing (and customer relationship management, CRM, more generally), customer experience and data-driven marketing. Retrospectively, brand theory has been the bridge linking the early stages of the marketing approach and the advanced customer approach, and to this day the brand and customer views are inter-dependent and should not be separated.

In the past twenty years we have further witnessed intensive developments in digital technologies (e.g., computer information processing, Internet and communication). Their effects on marketing and retailing now call into debate whether the technologies still constitute a progression in the execution of the customer-centric approach or already its evolution into a new approach, entering an era of “digital marketing”. This question is at the core of a recent article in the McKinsey Quarterly magazine titled “The Dawn of Marketing’s New Golden Age” (Issue 1 of 2015). The authors (Jonathan Gordon, New York City, and Jesko Perrey, Düsseldorf) outline five forces driving this new age: science, substance, story, speed and simplicity.

The picture emerging from the article entails consumers conducting most or all of their interactions with companies through digital portals or applications on computer-based appliances and mobile devices, and communicating among themselves and with companies about products and services on social media platforms; companies on their part analytically employ huge streams of data associated with their customers (active as well as prospects) to perform automated processes for selling to and servicing the customers. What we are about to see is a formidable, large-scale enhancement of digital methods and programmes already familiar from the past few years. The engine of marketing will be increasingly powered by modelling, segmenting and predicting customer preferences and behavioural actions with little need for day-to-day human inspection and intervention.

Managerial thinking usually views instruments, data and methods as the tools for executing a well-specified strategy, as in customer-oriented marketing. Undoubtedly the new digital technologies have been vital for engaging customers at an individual level on a large scale (e.g., one-to-one marketing, personalising and customising). But there are strong signs that in the new golden age the digital technologies, their tools and data-driven methods, will become the essence, the fundamental way in which marketing and retailing function, and not just a means to an end. They will not be used to perform a customer-driven strategy — they will be the strategy in and of itself. That is what a new digital approach to marketing could mean. McKinsey & Co. already seem to adopt and support that kind of marketing empowered by Big Data, and they are not alone in this attitude. However, a prognosis for such a new age of marketing should be put up for debate in business circles and in consumer or social circles.

Science has made significant contributions in extracting meaningful information and insights from large amounts of data for marketing purposes. It is the primary engine of the new age foreseen by the authors. The broader impact of science now spans from measurement by sensors and cameras (e.g., in smart and wearable devices) through analytics and modelling to the utilities and services that apply the derived information. Scientific advancement in the area of Big Data enables the automated estimation of multiple statistical models and handling of their results in marketing platforms. Just two examples of applications are (a) customised recommendations based on learned preferences of users; and (b) geo-location and mapping utilities that can direct shoppers to relevant stores in their vicinity. Yet science in marketing had also led to the development of more sophisticated models and better optimization and estimation techniques even before Big Data. The authors note that advanced analytic capabilities also play an important role in managerial decision-making by enabling quicker responses (e.g., in the area of hospitality, noticing trends and changes in hotel room reservations).

It is fully agreed that managers should be trained and encouraged to base their decisions more on information derived from research and analyses of customer and marketing data than on intuition. For achieving that aim, managers need to better understand analytics and their outputs, and wisely combine those insights with knowledge from their practical experience. But a problem arises when more processes are channelled to automation and managers are not required to interfere and make decisions. Certainly, when a company needs to handle transactions, calls and other activities from hundreds of thousands to millions of customers, automation of procedures is essential to let the marketing system work, but managers keeping an open eye is just as essential, particularly to make sure that customers are well-served. Automation is desirable to the extent that it allows decision-makers to devote their time to more complex issues requiring their judgement while not sacrificing the quality and sensibility of the automated processes. Human reason and sense of fairness are still valuable.

Of course not every marketing and service process is automated (as yet); customer service representatives (CSRs) are required to navigate the information provided to them on any individual customer to decide on the best approach or solution for helping him or her. Information in the customer profile may include characteristics and recommendations produced by prior modelling and analytic processes. It should ultimately be the responsibility of the CSR to utilise the information and choose the best-apparent mode of action. The CSRs can be presented with a few feasible alternatives for a type of service or other assistance requested and should be trained in how to assess and choose the most appropriate solution for the situation at hand and the customer served. As the authors Gordon and Perrey importantly observe, “Knowing what can be automated, when judgement is required, and where to seek and place technical talent are becoming increasingly central to effective marketing leadership”. Taking the position that employees, from CSRs to managers, are inadequate evaluators or judges of information who are bound to make mistakes, and that therefore their decisions are better computer-automated, is misguided. It may produce the opposite, negative outcome in which employees rely on the information system to also provide the best solution and do not think for themselves which possible solution is the most appropriate or the most effective.

  • Take for example the domain of healthcare: suppose that an elderly patient calls her HMO to make an appointment for a clinical test. The system may suggest a medical center or clinic in a neighbouring town because that is the closest date available or because performing the test in that (out-sourced) facility is less expensive for the HMO. Yet especially for patients in their golden age, a CSR should also consider the distance from the patient’s home and the time of day (e.g., not too early) so that it would be convenient enough and not too complicated for the patient to keep the appointment.

The article does not neglect the Substance of marketing and business overall. The authors suggest in particular the experiences of customers, the delivery of functional benefits, and the development of new products and services as the core interests of substance. In this important section they truly explain, through examples, how Big Data, analytics and digital technologies are used by companies to adapt to changes in the market and achieve customer-driven marketing goals.

In another article of McKinsey Quarterly, Getting Big Impact from Big Data (January 2015), its author (David Court, Dallas) acknowledges that the predictions of the McKinsey Global Institute (MGI) on the adoption of Big Data in their report from 2012 may have been too optimistic, saying that achieving the expected impact has proved difficult. The article appears as a new effort to re-ignite the growth of Big Data implementation. Some of the explanations given for lagging behind, however, are puzzling. A general claim made in the article is that companies did not realise the expected returns because their financial investments and efforts were not big enough: “many senior managers are reluctant to double down on their investments in analytics — investments required for scale, because early efforts have not yielded a significant return.” How can managers be expected to expand their investment in an initiative if they were not convinced by earlier tests of its benefits? There would have to be special circumstances to convince them that a project which did not work well at small scale would work if undertaken at large scale. While that may be the case with Big Data projects, managers should not be blamed for not seeing it or for not trusting the claim blindly.

The article further points out that companies were not focused enough and did not plan their analytic initiatives with well-specified goals. But responsibility is also put at the doorsteps of analytic vendors and data scientists for misleading managers by making unfounded promises about the kind of valuable information they could extract (or mine) from a company’s data pools. As told by Court, it was not unusual for executives to hear the claim: “just give us your data and we will find new patterns and insights to drive your business” — yet executives became disappointed and discouraged from investing further. Notably, despite the author’s charge about the insufficient scale of investment in Big Data, he leads to the more welcome conclusion that it is “better to pursue scale that’s achievable than to overreach and be disappointed or to scatter pilots all over the organization”.

  • Automated dynamic pricing: With regard to setting prices, this article maintains that “it’s great to have real-time data and automated pricing engines, but if management processes are designed to set prices on a weekly basis, the organization won’t be able to realize the full impact of these new technologies”. Here lurks another enigma about the new way of thinking. It is technology that should adjust to management processes, which in turn accommodate the structure and behaviour of the market (e.g., consumers, shoppers), and not the other way round. For one, if prices change daily or hourly (e.g., in an online store), this is likely to be perceived by consumers as a lack of stability, unreliability, an attempt to manipulate, or unfair conduct by a retailer not to be trusted. Moreover, it may not even be economically justified: if most consumers perform concentrated shopping trips to supermarkets between weekly and monthly, it should be neither necessary nor beneficial to update prices much more frequently.

The third driver of the new golden age — Story — is an interesting contribution in Gordon and Perrey’s article. However, it brings up again the discussion on who creates and who owns the story of a brand or a company. It is well appreciated that consumers participate in and contribute to the story of a brand. Agreeably, the story would not be able to exist without the customers. Yet composing the story should not be relinquished to consumers — the company must remain in charge of designing and presenting it. First, a brand’s story is built around its history and heritage. Second, the story is enriched by the customers’ experiences with the brand. Nevertheless, a company cannot rely on the discourse of customers in digital social media networks (e.g., in text and photos) to tell the whole story. The company is responsible for developing the shared experiences and customer interactions into a narrative and coming up with a compelling story. It may use its maps of customer journeys as input to develop the story.

Speed and Simplicity entail the measures that companies have to take to organise themselves better for the new age. These may be structural, functional and logistic measures that improve the implementation of data-driven processes and marketing initiatives (e.g., reducing layers and connecting silos, sharing data and smoothing operations, more agile product development).

  • Digital self-service, through Internet websites or mobile apps, is becoming widespread for product or service ordering and customer support. But managers should remember that not all consumers feel equally comfortable with these platforms or have the skill and confidence to use them; consider in particular that the proportion of people aged 65 and above is forecast to rise in developed countries and may reach 20% in two to three decades. Furthermore, many people do not like to “talk” with algorithms; they prefer to talk with other people to get the assistance and advice they seek.

It is important to draw a line and respect a distinction between the customer-centric approach (“what”) and the technologies, data and methods that can be employed to implement it (“how”). There is no need to declare a new age of marketing, at least not on behalf of digital technologies or Big Data. Advancement of the latter may signal a new phase of progression in the implementation of the customer approach (i.e., ‘marketing in a digital age’), but suggesting anything beyond that may lead to dilution of the focus on the customer. Nonetheless, the time may be ripe for a mature integrated approach guided by a triad of Customer-Product & Service-Brand, as the complex of these entities and the relations between them is at the foundation of modern marketing.

Ron Ventura, Ph.D. (Marketing)

(*) The marketing approach was already oriented towards the customer as its focal target but largely at a segment-level; it advanced strategic thinking beyond sales. Consumer marketing most progressed during this period.

Read Full Post »

Big Data, Big Science, Data Science — This triad of concepts exemplifies the new age of utilisation of data in large Volume by companies to produce information and insights for guiding their operations, such as in marketing, to perform more effectively and profitably. Yet Big Data also means that data exhibit great Variety (e.g., types and structures), and are generated and transformed in high Velocity. The data may be retrieved from internal or external sources. To be sure, non-business organisations also utilise Big Data and Data Science methods and strategies for a range of purposes (e.g., medical research, fraud detection), though our interest is focused here on marketing, inter-linked with sales and customer service, as well as retailing.

It is not quite easy to separate or draw the line between the concepts above because they are strongly connected and cover similar ideas. Big Data may seem to emphasise the properties of the data but it is tied-in with specialised technologies and techniques needed to store, process and analyse it. Likewise, Data Science (and Big Science) may imply greater emphasis on research strategies, scientific thinking, and analytic methods, but they are directed towards handling large and complex pools of data, namely Big Data. Nonetheless, we may distinguish Data Science by reference to occupation or position: Professionals recognized as “data scientists” are identified as distinct from all other business analysts or data analysts in this field – data scientists are considered the superior analysts, experts, and mostly, the strategists who also connect between the analytic domain and the business domain.

The Trend Lab VINT (Vision – Inspiration – Navigation – Trends), part of the Sogeti network of experts (Netherlands), published an instructive e-book on Big Data. In the e-book, titled “No More Secrets With Big Data Analytics” (2013), the team of researchers propose a logical linkage between these concepts while relating them to Big Business. Big Science was conceived already in the early 1960s (attributed to atomic scientist Alvin Weinberg) to describe the predicted rise of large-scale scientific projects. It was not associated necessarily with the amount of data (typical contexts have been physics and the life sciences). Big Data as a concept emerged nearly ten years ago and turned the spotlight onto data. Data Science is introduced by VINT as the toolbox of strategies and methods that allows Big Data to bring us from Big Science to Big Business. Data Science is “the art of transforming existing data to new insights by means of which an organization can or will take action” (p. 33). Originally, Big Science emphasised a requirement of scientific projects that is true today with regard to Big Data projects: collaboration between researchers with different areas of expertise to successfully accomplish the research task.

  • The researchers of VINT note that some scientists disapprove of connotations of the word “big” and prefer to use instead the term “extreme” which is in accordance with statistical theory.

The VINT e-book cites a profile for the position of data scientist suggested by Chirag Metha (a former technology, design and innovation strategist at SAP). In the headline, Metha stated that the role of a data scientist is not to replace any existing BI people but to complement them (p. 34; BI = Business Intelligence). He defined requirements of a data scientist in four areas: (a) deep understanding of data, their sources and patterns; (b) theoretical and practical knowledge of advanced statistical algorithms and machine learning; (c) strategically connecting business challenges with appropriate data-driven solutions; and (d) devising an enterprise-wide data strategy that will accommodate patterns and events in the environment and foresee future data needs of the organisation. Therefore, primary higher-level contributions expected from a data scientist include the capacity to bridge between the domains of business and data/analytics (i.e., translate business needs to analytic models and solutions and back to [marketing] action plans), and an overview of data sources and types of data, structured and unstructured, and how to combine them properly and productively.

The pressure on companies to implement data-driven marketing programmes is growing all the time. As one company becomes publicly commended for successfully using, for instance, feedback on its website and in social media to create better-tailored product offerings, it gains an advantage that puts its competitors under pressure to follow suit. It may also inspire and incentivize companies in other industries to take similar measures. Such published examples are increasing in number in recent years. Furthermore, companies are encouraged to apply individual-level data of customer interactions with them (e.g., personal information submitted online, stated preferences and tracking page visits and item choices made on viewed pages) in order to devise customized product offerings or recommendations for each customer. Already in the late 1990s the grocery retailer Tesco leveraged its business in the UK and gained a leading position by utilising the purchase and personal data of customers gathered through their loyalty Clubcard to generate offerings of greater relevance to specific customer segments they identified. Amazon developed its e-commerce business by recommending to individual customers books related to those they view or purchase based on similar books purchased by other customers and on customers’ own history of behaviour.
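The core of item-to-item recommendations of the Amazon kind (“customers who bought this also bought…”) can be sketched in a few lines by counting how often items appear together in purchase baskets. This is a toy illustration of the idea only; Amazon’s production system is far more elaborate:

```python
from collections import Counter, defaultdict
from itertools import combinations

def build_recommender(baskets):
    """Item-to-item co-purchase recommender.

    baskets: iterable of purchase baskets (lists of item ids).
    Returns a function recommend(item, k) giving the k items most
    frequently bought together with `item`.
    """
    co_counts = defaultdict(Counter)
    for basket in baskets:
        for a, b in combinations(set(basket), 2):  # each unordered pair once
            co_counts[a][b] += 1
            co_counts[b][a] += 1

    def recommend(item, k=3):
        return [other for other, _ in co_counts[item].most_common(k)]

    return recommend
```

The same counting logic extends naturally from purchases to page views or wish-list additions, which is how behavioural data beyond transactions can feed the customised offerings described above.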

A key challenge facing many companies is to implement an integrative approach that enforces a single view of the customer across organisational functions and channels. Marketing programmes and operations must therefore be coordinated, and share data, with sales and customer service activities. Moreover, data on interactions with customers, and with consumers overall (as prospects), need to be examined and incorporated across multiple channels: offline, online, and mobile. This mission is of utmost importance for companies these days; ignoring it or lagging behind could mean losing ground in a market and losing relevance to customers. Customers' experience extends over the different stages of a journey in their relationship with a company, and across the multiple alternative channels or touchpoints they may use to fulfil their objectives. They expect the data that become available to companies to be employed to improve their experience in some way, anywhere and anytime they interact with the company. For companies, this requires not only gathering the data but also analysing them in meaningful and productive ways. Whether the interactions occur in-store, over the phone, on a company's website, in social media networks, or through mobile apps, customers expect those interactions, within and between channels, to be smooth and frictionless. Companies, for their part, need to be able to share and join data from the different channels to obtain a comprehensive view of customers and to coordinate between channels.

  • The leading American pharmacy retailer Walgreens established a platform for monitoring, analysing and managing its inventory jointly across all of its outlets, over 8,000 physical stores and four online stores, so as to allow shoppers to find, purchase and collect the products they need in as seamless a manner as possible. It integrates point-of-sale customer data with data from additional sources (e.g., social media, third-party healthcare organisations) in order to improve patient care.
  • Procter & Gamble, which does not have direct access to sales data as retailers do, created an independent channel of communication with consumers; with the help of Teradata, it uses personal data provided by consumers online, together with other data (e.g., social media), to put forward more personalised product offerings.
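At its core, building a single customer view amounts to joining per-channel records on a shared customer identifier. The following minimal sketch uses hypothetical field names and data; real implementations face identity resolution, conflicting values, and scale problems that this deliberately ignores:

```python
# Hypothetical per-channel records, each keyed by a shared customer ID.
store_visits = [
    {"customer_id": 17, "channel": "in-store", "last_purchase": "2014-05-02"},
    {"customer_id": 23, "channel": "in-store", "last_purchase": "2014-04-11"},
]
web_sessions = [
    {"customer_id": 17, "channel": "online", "pages_viewed": 12},
    {"customer_id": 31, "channel": "online", "pages_viewed": 3},
]

def single_view(*channel_records):
    """Merge per-channel records into one profile per customer ID,
    recording which channels each customer has been seen in."""
    profiles = {}
    for records in channel_records:
        for rec in records:
            profile = profiles.setdefault(rec["customer_id"],
                                          {"channels": set()})
            profile["channels"].add(rec["channel"])
            for key, value in rec.items():
                if key not in ("customer_id", "channel"):
                    profile[key] = value
    return profiles

profiles = single_view(store_visits, web_sessions)
# Customer 17 now appears once, with fields from both channels.
```

The join key is the crux: a customer who is "17" in the store loyalty system but an anonymous cookie online cannot be merged, which is why identity resolution is usually the hardest part of the single-view mission described above.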

An additional important aspect is the need to join different types of data, both structured (e.g., from relational customer databases) and unstructured (e.g., open-ended text in blog posts and in social media posts and discussions). The data that companies may utilise are becoming ever more heterogeneous in type, structure and form, posing greater technical and analytical challenges, but also offering better opportunities. Companies may also consider using digital images, voice tracks (i.e., not only for verbal content but also for tone and pitch), and all sorts of traffic data (e.g., electronic, digital-online and mobile, and even human-physical traffic in-store). For example, suppose a company identifies photo images posted by its customers online and recognises that the images contain its product items; it may then complement that information with personal data on those customers and on the various interactions or activities they perform online (e.g., the company's websites, social media) to learn more about their interests, perceptions, and preferences as reflected through the images.

  • The US airline JetBlue uses the Net Promoter Score (NPS) metric to trace suspected problems of customer satisfaction; it then utilises survey data and content from social media networks, blogs and other consumer-passenger communications to identify the possible source and nature of a problem and devise an appropriate fix (an award-winning initiative by Peppers & Rogers).
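The NPS metric itself is simple arithmetic: the percentage of promoters (ratings of 9-10 on the standard 0-10 "likelihood to recommend" scale) minus the percentage of detractors (ratings of 0-6). A minimal sketch, with hypothetical survey ratings:

```python
def net_promoter_score(ratings):
    """NPS = % promoters (9-10) minus % detractors (0-6),
    on the standard 0-10 'likelihood to recommend' scale."""
    n = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / n

# Hypothetical survey responses:
scores = [10, 9, 9, 8, 7, 6, 4, 10]
print(net_promoter_score(scores))  # 4 promoters, 2 detractors of 8 -> 25.0
```

The score alone only flags that something may be wrong; as in the JetBlue example, diagnosing why requires pairing it with unstructured sources such as social media comments.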

But there is reason for some concern. In a report titled “Big Data: The Next Frontier for Innovation, Competition, and Productivity” (2011), the consulting firm McKinsey & Co. warned of an expected shortage of highly advanced analytics professionals and data-proficient managers. It estimated that by 2018 organisations in the US alone could face a shortage of 140,000 to 190,000 people with “deep analytical skills”. In addition, the report predicted a shortage of 1.5 million managers and analysts “with the know-how to use the analysis” of Big Data and to apply it effectively in decision-making. The first figure refers to the professional-technical level, whereas the second points to utilisation of Big Data at the business level. McKinsey & Co. thus appear even more concerned about companies' inadequate ability, at the managerial level, to realise the business advantages, such as marketing-related objectives, that Big Data can produce. Data scientists may be counted in both categories of this forecast, but because they need to be simultaneously expert analysts and business-savvy, they belong more closely with the managers.

However, the situation may not improve as quickly as hoped. The problem may be that young people are not attracted, encouraged, educated or trained enough to attain high proficiency in the exact sciences of mathematics and statistics, at least not at the pace the industry requires. The problem seems particularly acute in Western countries. Popular areas of study such as technology, computer science and business administration cannot compensate for a lack of sound knowledge and skills in mathematics and statistics where the utilisation of Big Data in marketing in particular, and in management in general, is concerned. Yet business students, MBAs included, are more inclined to avoid their courses and tasks in statistics and data analysis than to embrace them; and the number of graduates in the exact sciences is not increasing fast enough (in some cases it is even decreasing). Some indicative figures help to illuminate the problem:

  • In the latest PISA exams, carried out by the OECD in 2012 among school students aged 15-16, seven of the ten top-ranking countries (or economies) in math are from the Far East, including Shanghai and Hong Kong (China), Singapore, the Republic of Korea, and Japan. Three European countries complete the top ten: Switzerland, adjacent Liechtenstein, and the Netherlands. Their scores range from 523 to 613, all above the OECD mean of 494.
  • Western countries are nevertheless among the next ten countries whose math scores still exceed the OECD mean, including Germany, Finland, Canada, Australia, Belgium and Ireland. But the United Kingdom is in 26th place (score 494) and the United States even lower, in 36th place (481). Israel is positioned a bit further down the list (41st, score 466). [34 OECD members and 31 partner countries participated.]

  • In Israel, the rate of high school students taking their matriculation exam in math at an enhanced level (4 or 5 units) has declined in recent years. Between 1998 and 2006 it ranged from 52% to 57%, but from 2009 to 2012 it dropped dramatically to 46% of those eligible for a matriculation certificate, according to a press release of the Israeli Central Bureau of Statistics (CBS). The CBS notes that this decrease occurred in parallel with an increase in the total number of students who obtain the certificate, which suggests that no effort was made to train and prepare the additional students to a high level in mathematics.

  • Statistics published by UNESCO on the proportion of academic graduates (ISCED levels 5 or 6, equivalent to bachelor through PhD) in Science fields of study show that this proportion decreased from 2001 to 2012 in countries such as Australia (14.2% to 9%), Switzerland (11.5% to 9%), the Republic of Korea (9.8% to 8.5%), the UK (17.4% to 13.7%), and Israel (11.7% to 8.5% in 2011).
  • The rate is stable in the US (8.6%) and in Japan (though low, at 2.9%), while in Finland it has been relatively stable (10%-11%) but shifting down lately. Notable rises are observed in Poland (5% to 8%), Germany (13% to 14.5%), and the Netherlands (5.7% to 6.5%); Italy is also improving (up from 7.5% to 8%). [Levels refer to the 1997 ISCED scheme; a new scheme takes effect this year.]

The impression that emerges is that the supply of math- and science-oriented graduates may not come closer to meeting the demand of companies and non-business organisations in the coming years; in some countries it could even worsen. Companies can expect continued difficulty recruiting well-qualified analysts with the potential to become highly qualified data scientists, as well as managers with good data and analytics proficiency. Managers and data scientists may have to work harder to train analysts to a satisfactory level. They may need to consider recruiting analysts from different areas of specialisation (e.g., computer programming, math and statistics, marketing), each with a partial skill set in one or two areas, continue to train them in complementary areas, and, above all, oversee the work and performance of mixed-qualification teams of analysts.

Big Data and Data Science offer companies a range of new possibilities and business opportunities to meet consumer needs better and provide better customer experiences, functionally and emotionally. They are set to change the way marketing, customer service, and retailing are managed and executed. However, reaching a higher level of marketing effectiveness and profitability will continue to demand large investments, not only in technology but also in human capital. The challenge for qualified managers and data scientists will be to work together to harvest the promised potential of Big Data.

 

Ron Ventura, Ph.D. (Marketing)
