Posts Tagged ‘Social media’

Revelations about the Facebook – Cambridge Analytica affair last month (March 2018) provoked a heated public discussion about data privacy and users’ control over their personal information in social media networks, particularly on Facebook. The central allegation in this affair is that personal data from social media was misused for the winning presidential campaign of Donald Trump. It offers ‘juicy’ material for all those interested in American politics. But the importance of the affair goes much beyond that, because the impact of the concerns it has raised radiates to the daily lives of millions of users-consumers socially active on the Facebook platform; it could potentially touch a multitude of commercial marketing contexts (i.e., products and services) in addition to political marketing.

Having a user account as a member of the Facebook social media network is free of charge, a boon hard to resist. In Q2 of 2017 Facebook surpassed the mark of two billion monthly active users, double the former record of one billion reached five years earlier (Statista). No monetary price is explicitly charged to users. Yet users are subject to alternative prices, embedded in their activity on Facebook, implicit and less noticeable as a cost to bear.

Some users may realise that the advertisements they receive and see are the ‘price’ they have to tolerate for not having to pay ‘in cash’ for socialising on Facebook. It is less of a burden if the content is informative and relevant to the user. What users are much less likely to realise is how personally related data (e.g., profile, posts and photos, other activity) is used to produce personally targeted advertising, and possibly to create other forms of direct offerings or persuasive appeals to take action (e.g., a user receives an invitation from a brand, based on a post of his or her friend, about a product purchased or photographed). The recent affair exposed — in news reports and the testimony of CEO Mark Zuckerberg before Congress — not only the direct involvement of Facebook in advertising on its platform but also how permissive it has been in allowing third-party apps to ‘borrow’ users’ information from Facebook.

According to reports on this affair, psychologist Aleksandr Kogan developed with colleagues, as part of academic research, a model to deduce personality traits from the behaviour of users on Facebook. Aside from his position at Cambridge University, Kogan started a company named Global Science Research (GSR) to advance commercial and political applications of the model. In 2013 he launched an app on Facebook, ‘this-is-your-digital-life’, in which Facebook users would answer a self-administered questionnaire on personality traits and some personal background. In addition, the GSR app prompted respondents to give consent to pull personal and behavioural data related to them from Facebook. Furthermore, at that time the app could get access to limited information on friends of respondents — a capability Facebook has since removed, by 2015 at the latest (The Guardian [1], BBC News: Technology, 17 March 2018).

Cambridge Analytica (CA) contracted with GSR to use its model and the data it collected. The app was able, according to initial estimates, to harvest data on as many as 50 million Facebook users; by April 2018 Facebook updated the estimate to 87 million. It is unclear how many of these users were involved in the project of Trump’s campaign, because for this project CA was specifically interested in eligible voters in the US; it is said that CA applied the model and data in other projects (e.g., pro-Brexit in the UK), and that GSR made its own commercial applications of the app and model.

In simple terms, as can be learned from a more technical article in The Guardian [2], the model is constructed around three linkages:

(1) Personality traits (collected with the app) —> data on user behaviour on the Facebook platform, mainly ‘likes’ given by each user (possibly additional background information was collected via the app and from the users’ profiles);

(2) Personality traits —> behaviour in the target area of interest — in the case of Trump’s campaign, past voting behaviour (CA associated geographical data on users with statistics from the US electoral registry).

Since model calibration was based on data from a subset of users who responded to the personality questionnaire, the final stage of prediction applied a linkage:

(3) Data on Facebook user behaviour ( —> predicted personality ) —> predicted voting intention or inclination (applied to the greater dataset of Facebook users-voters)
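To make the chain of linkages concrete, here is a minimal sketch in Python of how such a two-step prediction could be wired together. It is an illustration only, since the actual model has not been published, and every dataset, feature and model choice below (random data, ridge regression for the traits, logistic regression for vote propensity) is a hypothetical stand-in.

```python
# A minimal sketch of the two-step prediction chain described above.
# All data, feature names and model choices here are hypothetical.
import numpy as np
from sklearn.linear_model import Ridge, LogisticRegression

rng = np.random.default_rng(0)

# Stage 1 data: the subset of users who answered the personality questionnaire.
# X_likes: binary matrix of 'likes' (users x liked pages); y_traits: Big Five scores.
n_respondents, n_pages, n_traits = 1000, 500, 5
X_likes_respondents = rng.integers(0, 2, size=(n_respondents, n_pages))
y_traits = rng.normal(size=(n_respondents, n_traits))

# Linkage (1): 'likes' -> personality traits (one ridge regression per trait).
trait_model = Ridge(alpha=1.0).fit(X_likes_respondents, y_traits)

# Linkage (2): personality traits -> past voting behaviour (from matched registry data).
y_voted_for_candidate = rng.integers(0, 2, size=n_respondents)
vote_model = LogisticRegression().fit(y_traits, y_voted_for_candidate)

# Linkage (3): applied to the much larger pool of users for whom only 'likes' exist.
X_likes_all_users = rng.integers(0, 2, size=(50_000, n_pages))
predicted_traits = trait_model.predict(X_likes_all_users)
predicted_vote_propensity = vote_model.predict_proba(predicted_traits)[:, 1]
```

The point of the sketch is the structure, not the numbers: a model calibrated on the small questionnaire sample is chained so that users who never answered the questionnaire can still be scored from their ‘likes’ alone.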

The Guardian [2] suggests that ‘just’ 32,000 American users responded to the personality-political questionnaire for Trump’s campaign (while at least two million users from 11 states were initially cross-referenced with voting behaviour). The BBC gives an estimate of as many as 265,000 users who responded to the questionnaire in the app, which corresponds to the larger pool of 87 million users-friends whose data was harvested.

A key advantage credited to the model is that it requires only data on ‘likes’ by users and does not have to use other detailed data from posts, personal messages, status updates, photos etc. (The Guardian [2]). However, the modelling concept raises some critical questions: (1) How many repeated ‘likes’ of a particular theme are required to infer a personality trait? (i.e., it should account for a stable pattern of behaviour in response to a theme or condition across different situations or contexts); (2) ‘Liking’ is frequently spurious and casual — ‘likes’ do not necessarily reflect thought-out agreement or strong identification with content or another person or group (e.g., ‘liking’ content on a page may not imply it personally applies to the user who likes it); (3) Since the app was allowed to collect only limited information on a user’s ‘friends’, how much of it could be truly relevant and sufficient for inferring the personality traits? On the other hand, for whatever traits could be deduced, data analyst and whistleblower Christopher Wylie, who brought the affair out to the public, suggested that the project for Trump had picked up on various sensitivities and weaknesses (‘demons’ in his words). Personalised messages were then devised to persuade or lure voters-users likely to favour Trump to vote for him. This is probably not the way users would want sensitive and private information about them to be utilised.

  • Consider users in need of help who follow and ‘like’ content of pages of support groups for bereaved families (e.g., of soldiers killed in service), combating illnesses, or facing other types of hardship (e.g., economic or social distress): making use of such behaviour for commercial or political gain would be unethical and disrespectful.

Although the app of GSR may have properly received the consent of users to draw information about them from Facebook, it is argued that deception was committed on three counts: (a) The consent was given for academic use of the data — users were not giving consent to participate in a political or commercial advertising campaign; (b) Data on associated ‘friends’, according to Facebook, was allowed at the time only for the purpose of learning how to improve users’ experiences on the platform; and (c) GSR was not permitted at any time to sell or transfer such data to third-party partners. We are in the midst of a ‘blame game’ among Facebook, GSR and CA over the transfer of data between the parties and how it has been used in practice (e.g., to what extent the model of Kogan was actually used in Trump’s campaign). It is a magnificent mess, but this is not the space to delve into its finer details. The greater question is what lessons will be learned and what corrections will be made following the revelations.

Mark Zuckerberg, founder and CEO of Facebook, gave testimony at the US Congress in two sessions: a joint session of the Senate Commerce and Judiciary Committees (10 April 2018) and before the House of Representatives Energy and Commerce Committee (11 April 2018). [Zuckerberg declined a call to appear in person before a parliamentary committee of the British House of Commons.] Key issues about the use of personal data on Facebook are reviewed below in light of the opening statements and replies given by Zuckerberg to explain the policy and conduct of the company.

Most pointedly, Facebook is charged that despite receiving reports concerning GSR’s app and CA’s use of data in 2015, it failed to ensure in time that personal data in the hands of CA was deleted from their repositories and that users were warned about the infringement (before the 2016 US elections), and that it took at least two years for the social media company to confront GSR and CA more decisively. Zuckerberg answered in his defence that Cambridge Analytica had told them “they were not using the data and deleted it, we considered it a closed case”; he immediately added: “In retrospect, that was clearly a mistake. We shouldn’t have taken their word for it”. This line of defence is acceptable when coming from an individual person acting privately. But Zuckerberg is not in that position: he is the head of a network of two billion users. Despite his candid admission of a mistake, this conduct is not becoming of a company of the size and influence of Facebook.

At the start of both hearing sessions Zuckerberg voluntarily and clearly took personal responsibility and apologised for mistakes made by Facebook, while committing to take measures (some already taken) to prevent such mistakes from being repeated. A very significant realisation by Zuckerberg in the House was his concession: “We didn’t take a broad view of our responsibility, and that was a big mistake” — it goes right to the heart of the problem in Facebook’s approach to the personal data of its users-members. Privacy of personal data may not seem to be worth money to the company (i.e., vis-à-vis revenue coming from business clients or partners), but the whole network business apparatus of the company depends on its user base. Zuckerberg committed that Facebook under his leadership will never give priority to advertisers and developers over the protection of users’ personal information. He will surely be held to these words.

Zuckerberg argued that the advertising model of Facebook is misunderstood: “We do not sell data to advertisers”. According to his explanation, advertisers describe to Facebook the target groups they want to reach, and Facebook identifies those users and places the advertising items. It is less clear who composes and designs the advertising items, which also needs to be based on knowledge of the target consumers-users. However, there seems to be even greater ambiguity and confusion in distinguishing between the use of personal data in advertising by Facebook itself and the access to and use of such data by third-party apps hosted on Facebook, as well as in distinguishing between types of data about users (e.g., profile, content posted, responses to others’ content) that may be used for marketing actions.

Zuckerberg noted that the ideal of Facebook is to offer people around the world free access to the social network, which means it has to feature targeted advertising. He suggested in the Senate that there will always be a pay-free version of Facebook, yet refrained from saying when, if ever, there will be a paid, advertising-free version. It remained unclear from his testimony what information is exchanged with advertisers and how. Zuckerberg insisted that users have full control over their own information and how it is being used. He added that Facebook will not pass personal information to advertisers or other business partners, to avoid an obvious breach of trust, but it will continue to use such information to the benefit of advertisers because that is how its business model works (NYTimes.com, 10 April 2018). It should be noted that whereas users can choose who is allowed to see information like posts and photos they upload for display, that does not seem to cover other types of information about their activity on the platform (e.g., ‘likes’, ‘shares’, ‘follow’ and ‘friend’ relations) and how it is used behind the scenes.

Many users would probably want to continue to benefit from being exempt from paying a monetary membership fee, but they can still be entitled to some control over which adverts they value and which they reject. The smart systems used for targeted advertising could be less intelligent than they purport to be. Hence more feedback from users may help to assign them well-selected adverts that are of real interest, relevance and use to them, and thereby increase efficiency for advertisers.

At the same time, while Facebook may not sell information directly, the greater problem appears to be with the information it allows apps of third-party developers to collect about users without their awareness (or rather their attention). In a late wake-up call at the Senate, Zuckerberg said that the company is reviewing app owners who obtain a large amount of user data or use it improperly, and will act against them. Following Zuckerberg’s effort to go into the details of the terms of service and to explain how advertising and apps work on Facebook, and especially how they differ, Issie Lapowsky reflects in Wired: “As the Cambridge Analytica scandal shows, the public seems never to have realized just how much information they gave up to Facebook”. Zuckerberg emphasised that an app can get access to raw user data from Facebook only by permission, yet this standard, according to Lapowsky, is “potentially revelatory for most Facebook users” (“If Congress Doesn’t Understand Facebook, What Hope Do Its Users Have”, Wired, 10 April 2018).

Much depends on how an app asks for users’ permission or consent to pull their personal data from Facebook, and how clearly and explicitly the request is presented so that users understand what they agree to. The new General Data Protection Regulation (GDPR) of the European Union, coming into effect within a month (May 2018), is specific on this matter: it requires explicit ‘opt-in’ consent for sensitive data and unambiguous consent for other data types. The request must be clear and intelligible, in plain language, separated from other matters, and include a statement of the purpose of the data processing attached to the consent. It is yet to be seen how well this ideal standard is implemented, and extended beyond the EU. Users are of course advised to read such requests for permission to use their data carefully, in whatever platform or app they encounter them, before they proceed. However, even if no information is concealed from users, they may not be attentive enough to comprehend the request correctly. Consumers engaged in shopping often attend to only some prices, remember them inaccurately, and rely on a more general ‘feeling’ about the acceptable price range or its distribution. If applying users’ data for personalised marketing is a form of price they are expected to pay, a company taking this route should approach the data fairly, just as with setting monetary prices, regardless of how well its customers are aware of the price.

  • The GDPR specifies that personal data related to an individual is to be protected if it “can be used to directly or indirectly identify the person”. This leaves room for interpretation of what types of data about a Facebook user are ‘personal’. If data is used and even transferred at an aggregate level of segments there is little risk of identifying individuals, but for personally targeted advertising or marketing one needs data at the individual level.
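To make the consent requirements above more concrete, the sketch below shows what a purpose-bound, opt-in consent record might capture. It is a hypothetical illustration, not an actual Facebook feature or a GDPR-mandated schema; all field names are assumptions.

```python
# Hypothetical sketch of an explicit, purpose-bound consent record in the spirit
# of the GDPR requirements described above; illustrative only, not a real schema.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purposes: list[str]            # stated, specific purposes of processing
    data_categories: list[str]     # e.g. profile fields, 'likes', friends list
    presented_text: str            # the plain-language request shown to the user
    opt_in: bool = False           # must result from an affirmative, unambiguous action
    granted_at: datetime | None = None

    def grant(self) -> None:
        self.opt_in = True
        self.granted_at = datetime.now(timezone.utc)

# Example: consent limited to academic research does not cover political targeting.
consent = ConsentRecord(
    user_id="u123",
    purposes=["academic research on personality and online behaviour"],
    data_categories=["questionnaire answers", "page likes"],
    presented_text="We ask your permission to use your answers and page likes "
                   "for academic research only.",
)
consent.grant()
```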

Zuckerberg agreed that some form of regulation over social media will be “inevitable” but qualified that “We need to be careful about the regulation we put in place” (Fortune.com, 11 April 2018). Democratic House Representative Gene Green posed a question about the GDPR, which “gives EU citizens the right to opt out of the processing of their personal data for marketing purposes”. When Zuckerberg was asked “Will the same right be available to Facebook users in the United States?”, he replied “Let me follow-up with you on that” (The Guardian, 13 April 2018).

The willingness of Mark Zuckerberg to take responsibility for mistakes and apologise for them is commendable. It is regrettable, nevertheless, that Facebook under his leadership did not act a few years earlier to correct those mistakes in its approach and conduct. Facebook should be ready to act in time on its responsibility to protect its users from harmful use of data personally related to them. It can be optimistic and trusting yet realistic and vigilant. Facebook will need to care for the rights and interests of its users as much as it does for its other stakeholders in order to gain the continued trust of all.

Ron Ventura, Ph.D. (Marketing)


Companies are increasingly concerned with the “customer journey“, covering any dealings customers have with their brands, products and services; it has become one of the key concepts associated with customer experience in recent years. Companies are advised to map the typical journeys of their customers, then analyse and discuss their implications and consequences with the aim of ameliorating their customers’ experiences.

At the foundation of the customer journey lies a purchase decision process, but the developed concept of a “journey” now expands beyond purchase decisions to a variety of activities and interactions customers (consumers) may engage in, relating to marketing, sales, and service. This broad spectrum of reference as to what a journey may encompass could be either the concept’s strength (establishing a very general framework) or its weakness (too generalised, weakly defined). Another important emphasis with respect to contemporary customer journeys is consumers’ tendency to utilise the multiple channels and touch-points available to them, especially technology-supported channels, on their pathway to accomplish any task. Furthermore, interactions in different channels are inter-related in consumers’ minds and actions (i.e., a cross-channel journey). This post-article reviews propositions, approaches and solutions in this area offered by selected consultancy, technology and analytics companies (based on content in their webpages, white papers, brochures and blogs).

Multi-channel, omnichannel, cross-channel — These terms are used repeatedly and most frequently in association with the customer journey. Oracle, for instance, positions the customer journey squarely in the territory of cross-channel marketing. But companies do not always make it sufficiently clear whether these terms are synonymous or have distinct meanings. All of the above terms agree that consumers more frequently utilise multiple channels and touch-points to accomplish their tasks, yet “cross-channel” more explicitly refers to the flow of the journey across channels, and to the connectivity and inter-relations between the interactions or activities customers engage in.

Writing for the Nice blog “Perfecting Customer Experience”, Natalia Piaggio (5 Feb. 2015) stresses that to better understand the end-to-end customer experience through customer journey maps (CJMs), focus should be directed to the flow of interactions between touch-points and not to any single touch-point. She explains that customers usually encounter problems during transitions between touch-points (e.g., inconsistency of information, the company is unable to deliver on a promise, the next channel transferred to cannot resolve the customer’s problem) and therefore touch-points must be considered connectedly. Oracle notes in its introduction to cross-channel marketing that companies should see the big picture and consider how devices (i.e., laptops, smartphones and tablets) are being used in tandem at different points or stages in the customer journey (whether customers use their email inbox, the Web or social media). Paul Barrett (22 Feb. 2010), an industry expert contributing to a blog of Teradata, adds a nice clarification: when talking about (multiple) channels, moments-of-truth relate to individual and separate channels; yet in a cross-channel environment those moments-of-truth are connected into a customer journey. In other words, the customer journey puts moments-of-truth in context. Therefore, cross-channel customer journeys refer to the flow as well as the inter-dependencies of the channels and touch-points a customer engages.

TeleTech similarly stresses the multi-channel and cross-channel aspects of the customer journey and adds some valuable observations (TeleTech is the parent company of Peppers & Rogers Group, its consultancy arm). First, they propose an association between all three terms above when defining a customer ‘path’ or ‘journey’:

Multichannel signifies the digital and physical channels that customers use in their path to purchase or when seeking support for a product or service. Omnichannel represents the cross-channel path that customers take for product research, support and purchasing.

Notably, in the view of TeleTech, “omnichannel” is more directly associated with “cross-channel”. Also noteworthy is TeleTech’s inclusion of both physical and digital channels. TeleTech emphasise the need to characterise different customer personas, and to construct for each persona a map of her typical journey through channels and touch-points; thereafter a company should be ready to notice changes in customer behaviour and modify the map accordingly (“Connecting the Dots on the Omnichannel Customer Journey“, 2015 [PDF]). In addition, Jody Gilliam contends in a blog of TeleTech that companies should attend not only to the inter-relations between touch-points but also to the (reported) mood of customers during their interactions. It is important to describe and map the whole experience ecosystem (The Relationship Dynamic, Blog: How We Think, 19 July 2013).

  • Teradata addresses the complexity introduced by the use of multiple channels through a customer journey from an analytic viewpoint. They propose a multi-touch approach to attribution modelling (i.e., evaluating to what extent each touch-point contributed to a final desired action by the customer). Three model types for assigning weights are suggested: unified (equal) weighting, decay-driven attribution (exponential: the later an interaction, the higher its weight), and precision (customised) weighting.
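As a rough illustration of these three weighting schemes (and not Teradata’s actual implementation), the following sketch assigns credit shares to the touch-points of a single converting journey; the function name, decay parameter and example journey are all hypothetical.

```python
# Sketch of the three multi-touch attribution weightings mentioned above:
# uniform (equal), decay-driven (later touches weigh more), and custom weights.
# Illustrative only; not any vendor's actual implementation.
import numpy as np

def attribution_weights(n_touchpoints: int,
                        scheme: str = "uniform",
                        decay: float = 0.5,
                        custom: list[float] | None = None) -> np.ndarray:
    if scheme == "uniform":
        w = np.ones(n_touchpoints)
    elif scheme == "decay":
        # Exponential decay backwards in time: the last touch gets the most credit.
        w = decay ** np.arange(n_touchpoints - 1, -1, -1, dtype=float)
    elif scheme == "custom":
        w = np.asarray(custom, dtype=float)
    else:
        raise ValueError(f"unknown scheme: {scheme}")
    return w / w.sum()   # credit shares sum to 1 for the converting journey

# A journey of four touch-points ending in a purchase:
journey = ["email", "search ad", "website visit", "call centre"]
for scheme in ("uniform", "decay"):
    print(scheme, dict(zip(journey, attribution_weights(len(journey), scheme).round(2))))
```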

The scope of the customer journey — Consensus is not easy to find on what a customer journey encompasses. On the one hand, professional services providers focus on particular components of a journey (e.g., interactions, digital touch-points, purchase or service); on the other hand, there are attempts to present an all-inclusive approach (e.g., reference to a “customer lifecycle”). It may also be said that a gap currently exists between the aim to cover and link all channels and the ability to implement it — some of those companies talk more openly about their challenges, particularly of including both digital (e.g., web, social media) and physical (in-store) channels, and of linking all types of channels during a journey of a given customer. Oracle relates specifically to the problem of identity multiplicity, that is, the difficulty of establishing that it is actually the same customer across all the channels or touch-points he or she uses; overcoming this challenge is essential to unfolding the whole journey (“Modern Marketing Essentials Guide: Cross-Channel Marketing“, 2014 [PDF]). This challenge is also echoed by Nice, termed identity association (Customer Journey Optimization [webpage]).

Another key issue that needs to be addressed is whether a customer journey includes only direct interactions between a customer and a focal company through channels where it operates (e.g., call centre, website, social media), or whether other activities consumers perform towards accomplishing their goal should also be accounted for (e.g., searching other websites, consulting a friend, visiting brick-and-mortar stores).

  • In a blog of Verint (In Touch), Koren Stucki refers to a definition of the customer journey as a series of interactions performed by the customer in order to complete a task. Stucki thereafter points out a gap between the straightforward definition and the complexity of the journey itself in the real world. It may not be too difficult to understand the concept and its importance for customer engagement and experience, but capturing customer journeys in practice — identifying and linking all the channels the customer uses for a given type and purpose of journey (e.g., product purchase, technical support) — can be far more complicated. Understanding these processes is truly imperative for being able to enhance them and optimise customer engagement (“Why Customer Journeys?“, 16 Sept. 2014).
  • Piaggio (Nice) also relates to the frustration of companies with difficulties in mapping customer journeys. She identifies possible causes as complexity, technical and organisational obstacles to gathering and integrating data, and the dynamic nature of consumer behaviour. She then suggests seven reasons for using CJMs. Accordingly, in their brochure on customer journey optimization, Nice see their greater challenge in gathering data of different types from various sources and channels, integrating the data, and generating complete sequences of customer journeys (a minimal sketch of such journey sequencing appears after this list); the three main analytic capabilities they offer in their solution are event-sequencing and visualisation in real time, contact reasoning (a predictive tool), and real-time optimization and guidance (identifying opportunities for improvement).
  • In the first of their four steps to a customer journey strategy — namely, map the current customer journey — IBM state that the customer journey “signifies the series of interactions a customer has” with a brand (IBM refers specifically to digital channels). Importantly, they suggest that customer journeys should be mapped around personas representing target segments. The CJMs should help managers put themselves in their customers’ shoes (“Map and Optimize Your Customer Journey“, 2014 [PDF]).
  • In the blog of TeleTech (How We Think), Niren Sirohi writes about the importance of defining target segments and mapping typical customer journeys for each one. Sirohi emphasises that all stages and modes engaged and all activities involved should be included, not only those in which the company plays a role. Next, companies should identify and understand who the potential influencers are at every stage of the journey (e.g., self, retailer, friend). Then ideas may be generated as to how to improve customer experiences where the company can exert influence (“A Framework for Influencing Customer Experience“, 16 Oct. 2014).
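Returning to the journey sequencing mentioned in the Nice item above, here is a minimal sketch of the basic mechanics: grouping raw interaction events from several channels by a shared customer key and ordering them in time. The event fields are hypothetical, and the hard part in practice, resolving that the same person appears behind different identifiers in different channels, is assumed to be already solved.

```python
# Sketch: assembling raw interaction events from different channels into
# time-ordered journeys per customer. Field names are hypothetical, and
# identity association across channels is assumed to be resolved already.
from collections import defaultdict
from datetime import datetime

events = [
    {"customer_id": "cust-42", "channel": "web",         "action": "product page view", "ts": datetime(2015, 3, 1, 10, 0)},
    {"customer_id": "cust-42", "channel": "call centre", "action": "pricing question",   "ts": datetime(2015, 3, 2, 9, 30)},
    {"customer_id": "cust-42", "channel": "store",       "action": "purchase",           "ts": datetime(2015, 3, 5, 17, 45)},
    {"customer_id": "cust-17", "channel": "mobile app",  "action": "support ticket",     "ts": datetime(2015, 3, 3, 8, 15)},
]

journeys: dict[str, list[dict]] = defaultdict(list)
for event in events:
    journeys[event["customer_id"]].append(event)

for customer_id, journey in journeys.items():
    journey.sort(key=lambda e: e["ts"])          # order each journey in time
    path = " -> ".join(f'{e["channel"]}:{e["action"]}' for e in journey)
    print(customer_id, path)
```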

Customer engagement — This is another prominent viewpoint from which companies approach the customer journey. Nice direct to Customer Journey Optimization via Multi-Channels and Customer Engagement. Verint also present customer journey analysis as part of their suite of Customer Engagement Analytics (also see their datasheet). The analytic process includes “capturing, analysing, and correlating customer interactions, behaviours and journeys across all channels”.  For IBM, the topic of customer journey strategy belongs in a broader context of Continuous Customer Engagement. The next steps for a strategy following mapping (see above) are to pinpoint areas of struggle for customers, determine gaps to fill wherein customer needs and preferences are unmet by current channels and functionalities they offer, and finally strategize to improve customer experiences.

  • Attention should be paid not only to the sequence of interactions but also to what happens during an interaction and how customers react or feel about their experiences. As cited above, Gilliam of TeleTech refers to the mood of customers. Verint say that they apply metrics of customer feedback regarding effort and satisfaction while Nice use text and speech analytics to extract useful information on the content of interactions.

Key issues in improving customer engagement that professional services providers recognise as crucial are reducing customer effort and lowering friction between channels. Effort and struggle by customers may arise during an interaction at a single touch-point, but also due to frictions experienced while moving between channels. Behind the scenes, companies should work to break down walls between departments, better co-ordinate functions within marketing and with other areas (e.g., technical support, delivery, billing), and remove the silos that separate departmental data pools and software applications. These measures are necessary to obtain a complete view of customers. IBM see the departmental separation of functions in a company, and its information silos, as a major “enemy” of capturing complete customer journeys. Ken Bisconti (29 May 2015) writes in their Commerce blog on steps companies can take, from simple to sophisticated (e.g., integrated mapping and a contextual view of customers across channels), to improve their performance in selling to and serving customers across channels, increase loyalty and reduce churn. Genesys see the departmental separation as a prime cause of discrete and disconnected journeys; continuity between touch-points has to be improved in order to reduce customer effort (solution: omnichannel Customer Journey Management). Piaggio (Nice) suggests that input from CJMs can help to detect frictions and reduce customer effort; she also relates to the need to reduce silos and eliminate unnecessary contacts. Last, TeleTech also call in their paper “Connecting the Dots” to break down walls between customer-facing and back-office departments to produce a more channel-seamless customer experience.

  • Technology and analytics firms compete on their software (in the cloud) for mapping customer journeys, the quality of journey visualisation (as pathways or networks), their analytic algorithms, and their tool-sets for interpreting journeys and supporting decision-making (e.g., Nice, Verint, Teradata, TeleTech, while IBM intend to release their specialised solution later this year).

Varied approaches may be taken to define a journey. From the perspective of a purchase decision process, multiple steps involving search, comparison and evaluation up to the purchase itself may be included, plus at least some early post-purchase steps such as feedback and immediate requests for technical assistance (e.g., how to install acquired software). In addition, a journey of long-term relationship may refer to repeated purchases (e.g., replacement or upgrade, cross-sell and up-sell). Alternatively, a journey may focus on service-related issues (e.g., technical support, billing). How a journey is defined depends mostly on the purpose of the analysis and planning (e.g., re-designing a broad process-experience, resolving a narrow problem).

As consumers’ use of digital applications, interfaces and devices grows and expands to perform many more tasks in their lives (e.g., in self-service platforms), we can expect the reliance of CJMs on digital channels and touch-points to become more valid and accurate. But we are not there yet, and it is most plausible that consumers will continue to perform various activities and interactions non-digitally. Consumers also see the task they need or want to perform, not merely the technology employed to perform it. Take for example physical stores: shoppers may not wish to spend every visit with a mobile device in hand (and incidentally transmit their location to the retailer). Don Peppers laments that companies have designed customer experiences with a technology-first, customer-second approach whereas the order should be the reverse. Taking the customer’s perspective is required foremost for effectively identifying frictions on a journey pathway and figuring out how to remove them (“Connecting the Dots”, TeleTech). Excessive focus on technologies can hamper that.

Bruce Temkin (Temkin Group, Blog: Experience Matters) provides lucid explanations and most instructive guidance on customer journey mapping. However, it must be noted, Temkin advocates qualitative research methods for gaining deep understanding of meaningful customer journeys. Quantitative measures are only secondary. He does not approve of confusing CJMs with touch-point maps. His concern about such interpretation is that it may cause managers to lose the broader context in which touch-points fit into consumers’ goals and objectives. Temkin puts even more emphasis on adopting a form of Customer Journey Thinking by employees to be embedded in everyday operations and processes, following five questions he proposes as a paradigm.

There are no clear boundaries to the customer journey, and it is doubtful whether they should be set too firmly — flexibility should be preserved in defining the journey according to managerial goals. A journey should allow for the various types of activities and interactions that may help the customer accomplish his or her goals, and it should account not only for their occurrence and sequence but also for their content and sentiment. A viewpoint focusing on channels and touch-points, leading further to technology-driven thinking, should be modified. An approach that emphasises customer engagement, but from the perspective of customers and their experiences, is more appropriate and conducive.

Ron Ventura, Ph.D. (Marketing)


Big Data, Big Science, Data Science — This triad of concepts exemplifies the new age of utilisation of data in large Volume by companies to produce information and insights for guiding their operations, such as in marketing, to perform more effectively and profitably. Yet Big Data also means that data exhibit great Variety (e.g., of types and structures), and are generated and transformed at high Velocity. The data may be retrieved from internal or external sources. To be sure, non-business organisations also utilise Big Data and Data Science methods and strategies for a range of purposes (e.g., medical research, fraud detection), though our interest here is focused on marketing, inter-linked with sales and customer service, as well as retailing.

It is not easy to separate or draw the line between the concepts above because they are strongly connected and cover similar ideas. Big Data may seem to emphasise the properties of the data, but it is tied in with the specialised technologies and techniques needed to store, process and analyse them. Likewise, Data Science (and Big Science) may imply greater emphasis on research strategies, scientific thinking, and analytic methods, but they are directed towards handling large and complex pools of data, namely Big Data. Nonetheless, we may distinguish Data Science by reference to occupation or position: professionals recognised as “data scientists” are identified as distinct from all other business analysts or data analysts in this field – data scientists are considered the superior analysts, experts, and mostly, the strategists who also connect the analytic domain with the business domain.

The Trend Lab VINT (Vision – Inspiration – Navigation – Trends), part of the Sogeti network of experts (Netherlands), published an instructive e-book on Big Data. In the e-book, titled “No More Secrets With Big Data Analytics” (2013), the team of researchers propose a logical linkage between these concepts while relating them to Big Business. Big Science was conceived already in the early 1960s (attributed to atomic scientist Alvin Weinberg) to describe the predicted rise of large-scale scientific projects. It was not associated necessarily with the amount of data (typical contexts have been physics and the life sciences). Big Data as a concept emerged nearly ten years ago and turned the spotlight on data. Data Science is introduced by VINT as the toolbox of strategies and methods that allows Big Data to bring us from Big Science to Big Business. Data Science is “the art of transforming existing data to new insights by means of which an organisation can or will take action” (p. 33). Originally, Big Science emphasised a requirement of scientific projects that is true today with regard to Big Data projects: collaboration between researchers with different areas of expertise to successfully accomplish the research task.

  • The researchers of VINT note that some scientists disapprove of connotations of the word “big” and prefer to use instead the term “extreme” which is in accordance with statistical theory.

The VINT e-book cites a profile for the position of data scientist suggested by Chirag Metha (a former technology, design and innovation strategist at SAP). In the headline Metha stated that the role of a data scientist is not to replace any existing BI people but to complement them (p. 34; BI = Business Intelligence). He defined the requirements from a data scientist in four areas: (a) deep understanding of data, their sources and patterns; (b) theoretical and practical knowledge of advanced statistical algorithms and machine learning; (c) strategically connecting business challenges with appropriate data-driven solutions; and (d) devising an enterprise-wide data strategy that will accommodate patterns and events in the environment and foresee the future data needs of the organisation. Therefore, the primary higher-level contributions expected from a data scientist include the capacity to bridge between the domains of business and data/analytics (i.e., translate business needs into analytic models and solutions and back into [marketing] action plans), and an overview of data sources and types of data, structured and unstructured, and of how to combine them properly and productively.

The pressure on companies to implement data-driven marketing programmes is growing all the time. As one company becomes publicly commended for successfully using, for instance, feedback on its website and in social media to create better-tailored product offerings, it gains an advantage that puts its competitors under pressure to follow suit. It may also inspire and incentivise companies in other industries to take similar measures. Such published examples have been increasing in number in recent years. Furthermore, companies are encouraged to apply individual-level data of customer interactions with them (e.g., personal information submitted online, stated preferences, and tracking of page visits and item choices made on viewed pages) in order to devise customised product offerings or recommendations for each customer. Already in the late 1990s the grocery retailer Tesco leveraged its business in the UK and gained a leading position by utilising the purchase and personal data of customers, gathered through their loyalty Clubcard, to generate offerings of greater relevance to specific customer segments it identified. Amazon developed its e-commerce business by recommending to individual customers books related to those they view or purchase, based on similar books purchased by other customers and on the customers’ own history of behaviour.
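As a rough illustration of the ‘customers who bought this also bought’ logic mentioned above (and not Amazon’s actual algorithm), a minimal item-based sketch could simply count how often pairs of items are purchased together and recommend the most frequent co-purchases:

```python
# Minimal item-based 'also bought' sketch: recommend items that co-occur most
# often with a given item across customers' purchase histories.
# Illustrative only; not Amazon's actual recommendation algorithm.
from collections import Counter
from itertools import combinations

purchase_histories = {
    "alice": {"book_a", "book_b", "book_c"},
    "bob":   {"book_a", "book_b"},
    "carol": {"book_b", "book_c", "book_d"},
}

co_occurrence: Counter[tuple[str, str]] = Counter()
for basket in purchase_histories.values():
    for i, j in combinations(sorted(basket), 2):
        co_occurrence[(i, j)] += 1
        co_occurrence[(j, i)] += 1

def also_bought(item: str, top_n: int = 2) -> list[str]:
    scores = Counter({j: c for (i, j), c in co_occurrence.items() if i == item})
    return [j for j, _ in scores.most_common(top_n)]

print(also_bought("book_a"))   # e.g. ['book_b', 'book_c']
```

Production systems add normalisation, implicit-feedback weighting and much larger-scale data handling, but the underlying idea of leaning on other customers’ behaviour is the same.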

A key challenge facing many companies is to implement an integrative approach that enforces a single view of the customer across organisational functions and channels. Thus, marketing programmes and operations must be coordinated and share data with sales and customer service activities. Moreover, data of interactions with customers, and consumers overall (as prospects), need to be examined and incorporated across multiple channels — offline, online, and mobile. This is a mission of utmost importance for companies these days; ignoring or lagging behind on this mission could mean losing ground in a market and relevance to customers. This is because customers’ experience extends over different stages of a journey in their relationship with a company and across multiple alternative channels or touchpoints they may use to fulfill their objectives. They expect that data that become available to companies be employed to improve in some way their customer experience anywhere and anytime they interact with the company. For companies, it definitely requires that they not only gather but also analyse the data in meaningful and productive ways. Whether the interactions occur in-store, over the phone, on a company’s website, in social media networks, or through mobile apps, customers consequently expect those interactions in and between channels to be smooth and frictionless. As for companies, they need to be able to share and join data from the different channels to obtain a comprehensive view of customers and co-ordinate between channels.

  • The leading American pharmacy retailer Walgreens established a platform for monitoring, analysing and managing its inventory jointly across all of its outlets, over 8,000 physical stores and four online stores, so as to allow shoppers to find, purchase and collect the products they need in as seamless a manner as possible. They integrate point-of-sale data on customers with data from additional sources (e.g., social media, third-party healthcare organisations) in order to improve patient care.
  • Procter & Gamble, which does not have direct access to sales data as retailers do, created an independent channel of communication with consumers; with the help of Teradata, they use personal data provided by consumers online and other data (e.g., social media) to put forward more personalised product offerings for them.

An additional important aspect is the need to join different types of data, both structured (e.g., from relational customer databases) and unstructured (e.g., open-ended text in blog posts and social media posts and discussions). The data that companies may utilise become ever more heterogeneous in type, structure and form, posing greater technical and analytical challenges to companies, but also offering better opportunities. Companies may also consider using digital images, voice tracks (i.e., not only for verbal content but also tone and pitch), and all sorts of traffic data (e.g., electronic, digital-online and mobile, and even human-physical traffic in-store). For example, suppose that a company identifies photo images posted by its customers online and recognises that the images include product items; it may then complement that information with personal data of those customers and the various interactions or activities they perform online (e.g., company websites, social media) to learn more about their interests, perceptions, and preferences as reflected through images.

  • The US airline JetBlue uses the Net Promoter Score (NPS) metric to trace suspected problems of customer satisfaction, and then utilises survey data and content from social media networks, blogs and other consumer-passenger communications to identify the possible source and nature of a problem and devise an appropriate fix (an award-winning initiative by Peppers & Rogers).
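For reference, the NPS metric mentioned here is computed from a single 0-10 ‘how likely are you to recommend us?’ question: the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6). A minimal sketch, with made-up scores:

```python
# Net Promoter Score: % promoters (scores 9-10) minus % detractors (scores 0-6),
# based on the standard 0-10 'How likely are you to recommend us?' question.
# The scores below are made up for illustration.
def net_promoter_score(scores: list[int]) -> float:
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

survey_scores = [10, 9, 9, 8, 7, 6, 5, 10, 3, 9]
print(f"NPS = {net_promoter_score(survey_scores):+.0f}")   # NPS = +20
```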

But there is reason for some concern. In a report titled “Big Data: The Next Frontier for Innovation, Competition, and Productivity” (2011), McKinsey & Co. cautioned of an expected shortage of highly advanced analytic professionals and data-proficient managers. They estimated that by 2018 organisations in the US alone could face a shortage of 140,000 to 190,000 people with “deep analytical skills”. In addition, the report predicts a shortage of 1.5 million managers and analysts “with the know-how to use the analysis” of Big Data and its effective application for decision-making. The first part seems to refer to the professional-technical level whereas the second part points to the utilisation of Big Data at the business level. Thus, McKinsey & Co. appear to be even more concerned by the inadequate ability of companies at the managerial level to benefit from the business advantages, such as with marketing-related objectives, that Big Data can produce. Data scientists may be counted in both categories of this forecast, but because they need to be simultaneously expert analysts and business-savvy they could belong more closely with the managers.

However, the situation may not improve as quickly as hoped. The problem may be that young people are not attracted, not encouraged, and not educated and trained enough to obtain high proficiency and skills in the exact sciences of mathematics and statistics, at least not at the growing pace that the industry may require. This problem seems to be particularly acute in Western countries. Popular areas of study such as technology, computer science and business administration cannot compensate for a lack of sound knowledge and skills in mathematics and statistics as far as the utilisation of Big Data in marketing in particular, and management in general, is concerned. Yet business students, MBAs included, are more inclined to shy away from their courses and tasks in statistics and data analysis than to embrace them; and the number of graduates in exact sciences is not increasing fast enough (in some cases it is even decreasing). Here are some figures indicative of the problem that may help to illuminate it:

  • In the latest PISA exams, carried out by the OECD in 2012 for school students aged 15-16, seven of the ten top-ranking countries (or economies) in math are from the Far East, including Shanghai and Hong Kong of China, Singapore, the Republic of Korea, and Japan. Three European countries close the top list: Switzerland, adjacent Liechtenstein, and the Netherlands. Their scores are above the mean OECD score (494), ranging between 523 and 613.
  • Western countries are nevertheless among the next ten countries that still obtain a math score above the OECD mean, including Germany, Finland, Canada, Australia, Belgium and Ireland. But the United Kingdom is in 26th place (score 494) and the United States is even lower, in 36th place (481). Israel is positioned a bit further down the list (41st, score 466). [34 OECD members and 31 partner countries participated.]

  • In Israel, the rate of high school students taking their matriculation exam in math at an enhanced level (4 or 5 units) has declined in recent years. It ranged between 52% and 57% in the years 1998-2006, but from 2009 to 2012 it dropped dramatically to 46% of those eligible for a matriculation certificate, according to a press release of the Israeli Central Bureau of Statistics (CBS). The CBS notes that this decrease occurred in parallel with an increase in the total number of students who obtain the certificate, but this suggests that the effort was not made to train and prepare the additional students to a high level in mathematics.

  • In statistics published by UNESCO on the proportion of academic graduates (ISCED levels 5 or 6 — equivalents of bachelor to PhD) in Science fields of study, we find that this proportion decreased from 2001 to 2012 in countries like Australia (14.2% to 9%), Switzerland (11.5% to 9%), the Republic of Korea (9.8% to 8.5%), the UK (17.4% to 13.7%), and Israel (11.7% to 8.5% in 2011).
  • This rate is stable in the US (8.6%) and Japan (though low at 2.9%), while in Finland it has been relatively stable (10%-11%) but shifting down lately. Encouraging rises are observed in Poland (5% to 8%), Germany (13% to 14.5%), and the Netherlands (5.7% to 6.5%); Italy is also improving (up from 7.5% to 8%). [Levels are of the ISCED scheme of 1997; a new scheme enters into effect this year.]

The impression received is that the supply of math- and science-oriented graduates may not come closer to meeting the market demand of companies and non-business organisations in the coming years; it could even get worse in some countries. Companies can expect to encounter continued difficulties in recruiting well-qualified analysts with the potential to become highly qualified data scientists, and managers with good data and analytics proficiency. Managers and data scientists may have to work harder to train analysts to a satisfying level. They may need to consider recruiting analysts from different areas of specialisation (e.g., computer programming, math and statistics, marketing), each with a partial skill set in one or two areas, continue to train them in complementary areas, and foremost oversee the work and performance of mixed-qualification teams of analysts.

Big Data and Data Science offer a range of new possibilities and business opportunities to companies for better meeting consumer needs and providing better customer experiences, functionally and emotionally. They are set to change the way marketing, customer service, and retailing are managed and executed. However, reaching a higher level of marketing effectiveness and profitability will continue to command large investments, not only in technology but also in human capital. This will be a challenge for qualified managers and data scientists, working together, to harvest the promised potential of Big Data.

Ron Ventura, Ph.D. (Marketing)


Social media networks are flourishing with activity. Most attention is given to Facebook, which reached one billion members in the summer of this year. The lively arena of Facebook, humming with human interaction, and its potential to provide easy access to millions of consumers, soon attracted the interest of marketers. A particular area of interest is the opportunity to study consumer perceptions, attitudes, preferences and behaviour through research activity in online social media networks, primarily in Facebook.

We may distinguish two tracks of research:

  • One track entails the collation and analysis of personal content created by network members with minimal or no intervention by companies. This track falls mainly within the domain of Big Data analytics, which has evolved dramatically in the past few years and keeps growing. Analytic processes may include text mining in search of keywords and key phrases in discussions, frequencies of “like”s, and movement between pages (a toy sketch of such keyword counting appears after this list).
  • The other track, which is the focus of this post-article, includes interaction between a company and consumers, usually within a community or forum set up by the company in its corporate name or in the name of one of its brands (e.g., its “page” on Facebook). This activity may take the form of regular discussions initiated by the company (e.g., introducing an idea or a question on the topic of enquiry to which members are invited to comment) but also invitations to participate in surveys and moderated focus-group discussions online.
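As a toy illustration of the keyword-counting end of the text mining mentioned in the first track above (real analytic tools add tokenisation, phrase detection, sentiment scoring and much more), one could tally keyword frequencies across collected posts:

```python
# A toy illustration of keyword-frequency counting over social media posts;
# the posts and stopword list here are made up for illustration.
from collections import Counter
import re

posts = [
    "Love the new phone, battery life is great",
    "Battery drains too fast, disappointed with the phone",
    "Great camera, great battery",
]

stopwords = {"the", "is", "with", "too", "new", "a", "and"}
tokens = (word for post in posts
          for word in re.findall(r"[a-z']+", post.lower())
          if word not in stopwords)

print(Counter(tokens).most_common(5))
# e.g. [('battery', 3), ('great', 3), ('phone', 2), ...]
```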

Online marketing research has been prevalent for at least ten years now, and the methods associated with this field, including surveys, experiments and focus-group discussions, continue to improve. However, the belief taking hold among marketers that they can reliably and transparently shift their research studies to the environment of social media is illusory and misleading (see articles in The New York Times, TheMarker [Israel]).

Advantages in speed and cost may tempt marketers to replace established methods with new techniques adapted to social media, or to attempt launching the former from within social media networks. But social media has distinctive features, particularly in the structure of information and the coverage of its audiences, that do not allow an easy and simple transition into the new environment, certainly not to the extent of rendering traditional marketing research methods redundant.

The problem starts with the “rules of the game” typical of a social media network. The codes or norms of discourse between members of the network do not generally fit well with the requirements of rigorous research tools for data collection. Questions in surveys usually have specially designed structures and formats and are specific in defining what the respondent is asked about. They are formulated to achieve satisfactory levels of validity and reliability. The social network, on the other hand, gives utmost freedom of expression in writing entries or comments. It tries to avoid constraining members to particular modes of reply. Questions prompted to members are usually written in everyday friendly language, as informal as possible. One may normally post one to three questions at most in such a mode of discussion. It lacks the discipline that robust research usually demands. The mode of questioning normally feasible within the pages of the social media website may be acceptable for some forms of qualitative research but, reasonably, it takes more than a few questions to properly investigate any topic.

A marketer may get some idea of the direction consumers or customers are driving at in their thoughts and feelings by scrutinising their answers subjectively and individually. But it would be presumptuous to derive quantitative estimates at any reasonable level of accuracy (e.g., purchase intentions and willingness-to-pay).

  • Critics of surveys argue that the reliability of responses is often compromised when respondents attempt to second-guess what the client of the survey wants to hear, or when they are subject to “social desirability”, that is, they try to give the answer believed to be approved by others. However, this problem is no less likely to surface in comments in the setting of social media. When writing in their own words in the less formal setting of a social media community, members may feel freer to express their opinions, preferences, thoughts and feelings; yet they are still expressing only what they are ready to share. Furthermore, social media is a great venue for people to promote the way they wish to be perceived by others, that is, their “other-image”, so we should not assume that they are not “fixing” or “improving” some of their answers about their preferences, attitudes, the brands they use, etc.

One may use a web application to upload a short survey questionnaire embedded in one’s own page or as a pop-up window. The functionality of such surveys is rather limited, with only a few questions, and is usually more of a gadget than a research tool. The appropriate alternative for launching a more substantive study is to invite and refer participants to a different specialised website where an online survey is conducted with a self-administered questionnaire, or where a remote focus-group session can be carried out. Here we should become concerned: who answers the survey questions or takes part in a study? Whom do the participants represent?

This concern is a more critical issue in the case of surveys for quantitative research than in forms of qualitative research. Firms are normally allowed and able to address members of their own pages or communities who are “brand advocates” or “brand supporters”. The members-followers are most likely to be customers, but in addition to buying customers they may also include consumers who are just favourable towards the brand (e.g., for luxury brands). If the target population the marketer wishes to study matches this audience, then it is acceptable to use the social media network as a source, and at least for a qualitative study it can be sufficient and satisfactory. However, for a quantitative study it is vital to meet additional requirements on the process of selection or sampling of participants in order to allow valid inferences. Unfortunately, the match is in many cases inadequate or very poor (e.g., the pool of accessible members covers only a fraction of the customer base with particular demographic and lifestyle characteristics). For quantitative research the problem is likely to be more severe because the ability to draw probabilistic samples is limited or non-existent, and recruitment relies mostly on self-selection by the volunteering members.

The field of online research is still developing, and issues such as sampling from panels, for example, are still debated. There are also misconceptions about the speed of online surveys because, in practice, one may need to wait as long as a week for late respondents in order to obtain a more representative sample. Yet advocates of marketing research through social media networks like Facebook try, quite prematurely, to pave the way into this special territory while facing even more difficult methodological challenges.

There are certainly advantages to focusing research initiatives on the company’s customers, particularly in matters of new product development. Customers, and possibly even more broadly “brand supporters”, are likely to be more ready and motivated to help their favourite company, contributing their opinions and sharing information about their preferences. They are also likely to have closer familiarity with the company or brand and better knowledge of its products and services than consumers in general. Hearing first what its own customers think of an early idea or a product concept in development makes much sense to help put the company on the right track. However, as the configuration of a product concept becomes more advanced and specific, more specialised research techniques are required to adequately measure preferences or purchase intentions. Wider consumer segments also need to be studied. Even at an early stage of an idea there is a risk of missing out on real opportunities (or vice versa) if an inappropriate audience is consulted or insufficient and superficial measurement techniques are used. Using the responses from “brand supporters” in a social media network can be productive for an exploratory examination to “test the water” before plunging in with greater financial investment. But such evidence should be evaluated with care; relying on evidence from social media for making final decisions can be reckless and damaging.

Nevertheless, marketers should distinguish between interaction and collaboration between a company and its customers on the one hand, and research activity on the other. Not every input should be quickly regarded as data for research and analysis. First of all, mutual communication between customers or advocates and a company or brand is essential to maintaining and enhancing the relationship between them, and the company should therefore encourage customers to interact and, furthermore, to contribute to its function and performance. Hence, when product users offer their own genuine ideas for new products or product improvements (e.g., hobbyists and enthusiasts who develop and build new Lego models), their contributions are welcome, and the better ones are implemented. And when a company (the Strauss food company in Israel) gives feedback on its Facebook page about ideas submitted by followers, indicating which ideas are inapplicable, which may be applied "maybe another time", and which are under initial review, this activity is to be commended. But these interactions belong in the domain of collaboration, not research. Survey-like initiatives on Facebook may help reinforce a feeling of partnership between a company and its customers (as commented to TheMarker by the Osem food company). An extended debate on this issue of "partnership" questions whether the reward to originators of successful ideas should be only a sense of achievement and contribution, or whether they should also receive material rewards from the benefiting companies.

Social media networks seem foremost appropriate as a source for qualitative research. If those who advocate performing marketing research on Facebook refer primarily to qualitative types of research, then the practice seems reasonable and may more often be admissible. It is also generally appropriate for exploratory and preliminary examinations of marketing initiatives, provided it is done with caution in view of the limitations of social media forums. It is much less appropriate as a venue and source for quantitative studies.

While interesting and valid studies can be conducted on how consumers behave on social media websites (e.g., what subjects they talk about, with whom, and the narrative of discourse they use), using a social media network as a source of research on other topics is a different matter. When done for marketing purposes, there are ethical issues regarding the analysis of personal content in social media that cannot be fully discussed in the current post. Primarily at stake is whether companies are entitled to analyse the content of conversations between consumer-members, a practice that suggests they are spying on and eavesdropping on network members. Even in discussions on the company's own page, the use of analytic techniques may not be appropriate or effective. Access to background information on members who activate web apps on the company's page (with their permission) is another contentious issue. For most users, this is the kind of privacy they have to give up in return for participating in a network free of charge, but to what extent will consumers agree to go on like this?

The use of social media networks for marketing research, as well as analytics, is therefore more complex and less straightforward than many marketers appear to perceive. Above all, explorations in social media should not be viewed simply as a substitute for the more traditional methods of marketing research.

Ron Ventura, Ph.D. (Marketing)

Read Full Post »

In product categories crowded with brands and models, entering a new alternative for consumer choice is a serious challenge. Whether one thinks of shampoos, TV screens, or cars, the competition for consumers' minds, hearts, and pockets is tough. Consumers have also become more experienced and more critical about the products and services made available to them. While the knowledge of the majority of consumers in any particular category is likely to be partial and lacking in detail about brands, models, and product attributes, they can rather easily gather additional information as needed by reading articles online and offline, visiting stores or company websites, and consulting savvier (‘expert’) consumers. Under these contemporary market conditions, trying to launch a product without the consumer point of view would be an irresponsible, if not reckless, move.

Large corporations are already persistent in using research to learn about consumers' preferences and expectations when developing many of their new products (i.e., at least when those are new concepts, lines, or generations of products). Medium- and small-sized companies are much less committed. Nevertheless, it seems that companies of every size recognise the potential of social media and other forms of direct interaction between a company and its customers as a source of feedback from consumers. In many cases, using these channels is more convenient, more readily accessible, and less expensive than applying formal research methodology. So it can be quite tempting to shift from research to these new modes of interaction.

But research is not like social media and other channels of interaction with a company in which consumers can voice their preferences and expectations. The latter lack the rigour of research methods in collecting input from consumers. In addition, the pool of consumers who contribute feedback via these channels may not represent closely enough the target segment for the new product. Thus the information received can be grossly biased and misleading. Consumers should be allowed to participate in the process of developing new products for their utility and enjoyment; yet great care should be taken as to how such participation takes place.

Modes to be considered include:

  • Social media in a public domain (e.g., Facebook) or as a private community hosted by the company;
  • Interaction or dialogue with customers via e-mail or a message web-interface built originally for service and support;
  • Focus group discussions and in-depth interviews;
  • Customer and market/consumer surveys employing methods and techniques designated for New Product Development (NPD) research.

Companies should aspire to use more of these modes by advancing down the list: any approach lower on the list may be used in addition to, or at the expense of, an approach higher on the list, depending on budget.

For small businesses, social media may be the only feasible and efficient option for hearing from consumers, and it is better than relying solely on their own conceptions. Still, managers who rely on this approach should be aware of its limitations and disadvantages.

The primary problem is the lack of structure and consistency that characterises information derived from the contributions of participants in social media, as well as from messages in other forms of interaction. The consumers participating in the process may be well-meaning, but their contributions must be carefully scrutinized to derive real value. And even if we try to impose some order on contributions by prompting participants with guiding questions, there can still be problems with the validity of our inferences.

The ability to control and correct for biases in the characteristics of participants in social media and other forms of interaction is very limited when the participants volunteer to contribute their viewpoints. This means that the validity of the marketing conclusions drawn from these inputs is in jeopardy. The suggestions and expectations traced in social media can be used in a manner similar to focus groups, that is, to screen possibilities, set priorities, and guide further investigation with quantitative research methods. However, generalization to a target population for the product will not be valid, no matter how many hundreds or thousands of consumers contributed feedback to the company.
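
One partial, and only partial, remedy worth sketching here is post-stratification weighting, which re-weights self-selected respondents so that their demographic mix matches known figures for the target population. The minimal sketch below (in Python) uses invented data, a hypothetical 'age_group' variable and assumed population shares; such weighting reduces imbalances on observed characteristics but cannot remove the self-selection bias itself.

    # A minimal sketch of post-stratification weighting for self-selected feedback.
    # The data, the 'age_group' variable and the population shares are all invented.
    import pandas as pd

    respondents = pd.DataFrame({
        "respondent_id": range(1, 9),
        "age_group": ["18-34", "18-34", "18-34", "18-34", "18-34", "35-54", "35-54", "55+"],
        "intends_to_buy": [1, 1, 0, 1, 0, 1, 0, 0],
    })

    # Assumed shares of each age group in the target population.
    population_shares = {"18-34": 0.35, "35-54": 0.40, "55+": 0.25}

    # Weight per respondent = population share of the group / sample share of the group.
    sample_shares = respondents["age_group"].value_counts(normalize=True)
    respondents["weight"] = respondents["age_group"].map(
        lambda g: population_shares[g] / sample_shares[g]
    )

    # Compare the raw and the weighted estimates of purchase intention.
    unweighted = respondents["intends_to_buy"].mean()
    weighted = (respondents["intends_to_buy"] * respondents["weight"]).sum() / respondents["weight"].sum()
    print(f"Unweighted: {unweighted:.2f}, weighted: {weighted:.2f}")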

Furthermore, convenience, availability, and cost should be balanced against confidentiality and protection from competition. Management can use contributions from social media to guide internal product ideation, but the inputs will tend to be unfocused and spread in different directions, sometimes at the whim of the community members. As feedback is sought on a more mature and specific product concept, the more restrictive and secure forums are preferable (e.g., a social media website sponsored by the company and dedicated to its areas of activity). As we go down the proposed list of modes of consumer participation, fewer consumers or customers will actually be exposed to the particular product, yet without decreasing the value and validity of the information gathered.

Conducting research among customers chosen from the company's customer database has its advantages. These can be members of a loyalty club or of a customer community on the web; importantly, they should be consumers identified as customers prior to the research. Turning first to existing customers, rather than to consumers in general, can be a good start: customers already have some level of commitment to the company and are therefore more likely to be willing to participate and contribute. They are also likely to be more familiar with the company, its brands, and its products, which makes their expectations, suggestions, and other forms of feedback more valuable and meaningful to the company. In a survey, the sample should be randomly drawn from the customer database.
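
For illustration, the following minimal sketch (in Python) shows two common ways of drawing such a random sample, a simple random sample and a proportionate stratified sample by customer segment, under the assumption that the customer database can be exported as a table; the file name, the 'segment' column and the sample sizes are hypothetical.

    # A minimal sketch of drawing a random survey sample from a customer database.
    # The file name, the column 'segment' and the sample sizes are hypothetical.
    import pandas as pd

    customers = pd.read_csv("customer_database.csv")  # assumed export of the customer database

    # Simple random sample of 500 customers, without replacement, with a fixed seed.
    srs = customers.sample(n=500, random_state=42)

    # Proportionate stratified sample: 5% of the customers in every segment.
    stratified = customers.groupby("segment").sample(frac=0.05, random_state=42)

    print(len(srs), len(stratified))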

On the other hand, there are limitations to customer research, and situations in which reliance on customers alone is insufficient. First, the so-called "inside" perspective of customers can be a drawback, because they may see things too much as the company does, or may wish to please management and compliment its ideas. A partial solution to this pitfall is to ensure a mixture of customers with different levels of satisfaction with, and loyalty to, the company and its brands. Second, the preferences and expectations of existing customers may not be similar to, or representative of, those of the target segment of consumers as a whole for the new product. If the company wishes, as is normally expected, to attract new customers with its new product, then research should be conducted among both existing customers and consumers from the general public. A research programme for a company developing a new product may involve an initial customer survey followed by a consumer survey. Respondents in the consumer survey who report being users of one of the company's brands should be compared on some key characteristics with the customer sample, and the researchers should decide how to evaluate and incorporate their responses.
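
As an illustration of that comparison step, the minimal sketch below (in Python, with invented counts and a hypothetical age-group breakdown) contrasts the brand users identified in the consumer survey with the customer sample, using a chi-square test to check whether the two groups share a similar profile on one key characteristic.

    # A minimal sketch comparing brand users found in a general consumer survey
    # with the customer sample on one key characteristic (invented age-group counts).
    import pandas as pd
    from scipy.stats import chi2_contingency

    comparison = pd.DataFrame({
        "customer_sample": [120, 150, 80],        # respondents in the customer survey
        "brand_users_in_survey": [40, 35, 45],    # self-reported brand users in the consumer survey
    }, index=["18-34", "35-54", "55+"])

    # Chi-square test of whether the two groups share a similar age distribution.
    chi2, p_value, dof, expected = chi2_contingency(comparison.values)
    print(comparison)
    print(f"chi2 = {chi2:.2f}, p-value = {p_value:.3f}")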

Various methods and techniques have been developed over the past few decades for generating ideas (e.g., brainstorming, creative workshops), measuring and modelling preferences, evaluating model designs, and, not less important, setting appropriate prices for candidate products. In the next post I will refer more extensively to a particular set of methods and techniques developed and organized as a comprehensive programme by a team of researchers at MIT, namely the "Virtual Customer Initiative".

Letting consumers participate in the process of new product development should be encouraged: it can contribute refreshing viewpoints to the product developers and allow the company to create a product that better fits the needs and preferences of consumers. Engaging consumers in the process can considerably increase the chances that the product developed will be useful and beneficial, and that its design will be visually appealing to the target consumers. For that purpose, it is desirable that several modes of participation and information gathering be employed and that the appropriate combination of them be carefully selected.

Ron Ventura, Ph.D. (Marketing)       

Read Full Post »