Zombies come in various forms, at least in the entertainment world. Sometimes they have magic as their animus, rising from the dead with an insatiable appetite for healthy flesh, thanks to an evil spell. Sometimes a virus is what turns the living into undead corpses.
Zombie behaviour also varies. Sometimes they are slow-moving monsters that struggle to navigate anything that blocks their path. On TV’s The Walking Dead, for example, a closed door or gate is typically all it takes to keep a zombie out. In the movie World War Z, on the other hand, zombies are lightning fast and can scale walls like an army of ants. Sometimes zombies are self-aware, not to mention capable of love. But most often, they have no free will, moving about not by choice, but through manipulation by external factors beyond their control.
Most people believe zombies are nothing more than a product of our imagination. According to History.com, however, they “have a basis in fact, and several verified cases of zombies have been reported from Haitian voodoo culture.” Either way, traditional zombie behaviour is real enough. In fact, our world faces an epidemic of it thanks to the dark magic of oppositional marketing and its deployment by big data platforms.
Every day more people go online to inhabit virtual spaces like Instagram or Facebook than live physically in the United States and Canada combined. In return for being able to share their thoughts and images with others within these spaces, consumers allow their own data to be collected. The largest private-sector gatherer of data is Google, which reportedly stores an estimated 10–15 exabytes of data (one exabyte = one billion gigabytes). Not all of Google’s collected data is user-specific, but much of it is. Facebook—which reportedly stores roughly 0.3 exabytes of data in total—is estimated to have 100 MB of data and metadata for each user.
Most users understand their data is being collected, but few fully comprehend how it is used to do far more than simply better understand consumer needs or improve user satisfaction. This article refers to “users” because consumers who use search engines and social media platforms are usually not making a conscious decision to engage in a value exchange with an enterprise. While big data platforms may provide value in the form of seemingly free services, they operate without any real accountability. In this Wild West environment, consumers are developing a collusive dependence on major platforms, which deploy the data they collect to manage, modify, moderate, magnify, and magnetize consumer behaviours. As a result, millions of individuals flock to social media applications and search engines every day, where oppositional marketing moves them about—as if by magic—like the mindless zombies on The Walking Dead being herded with air horns.
Not all firms that collect data at scale actively seek to animate users. Nevertheless, major big data platforms are essentially designed to liberate users of data, time, and money by liberating them of free will. In fact, some platforms have become near monopolists partly because of the wills they bend. These trusted brands aggressively accumulate personal information as a raw material—from which value can be collusively extracted in opposition to customer interests—because internal values, leadership, and regulatory intervention are not yet sufficient to curtail the negative side of the rise of big data platforms, at least not to the extent hapless users might prefer, or society might need.
Since most major businesses are now essentially data companies, the objective of this paper is to raise awareness of the ethical questions that arise when oppositional marketing is deployed to turn consumers into online zombies, and to highlight the strategic benefits to be gained by suboptimizing oppositional marketing. Dialing back the corporate use of personal data can preserve organizational reputations if a public backlash occurs or regulators bring down the hammer. It would also bring the customer relationship back into focus.
The Rise of Oppositional Marketing
For years, marketers have collected information related to customer transactions, preferences, and demographics, mining it for ways to cater to a “market of one.” This is reasonable because using consumer data to develop insights that improve customer satisfaction and relationships by enabling customized goods and services is in the interest of consumers, provided that privacy issues are managed appropriately.
Because consumer data collection started as a win–win that delivered a mutual benefit to customers and companies, it has long been seen as a legitimate driver of competitive advantage. As noted in my book Relationship Marketing, “the competitive position of a company and its relative profitability is likely tied directly to the cumulative volume of data it maintains on its customers, relative to its competitors.”
While that book was published over two decades ago, the central point remains: collecting customer data enables customer-specific management, which, in turn, can drive better business performance. Unfortunately, the use of personal data to drive performance has evolved somewhat along with the business models of big data platforms.
In the early years, big data platforms generally hemorrhaged cash while perfecting their solutions and business models. Facebook was launched in 2004, but it didn’t turn cashflow positive until 2009, when it posted a profit of US$229 million. Last year, the social network’s net income was counted in the tens of billions. What changed? Long story short, the key to boosting revenue was offering advertisers more than access to user eyeballs. Big data platforms figured out how to offer advertisers predictable changes in consumers’ behaviours, which reduces advertising risk and increases the value that can be extracted from users. As a result, the central objective of data collection shifted.
Instead of being used by corporations to increase value in concert with the interest of consumers, much of the consumer data collected today is used to simultaneously support and oppose the interests of consumers. Big data platforms treat consumer information like engineers deploying a control theory “black box” to measure and manage inputs, outputs, and feedback loops. The negative side of this evolution of data collection and use was largely ignored, since consumers willingly gave up personal data in return for the benefits of using social media applications, search engines, etc.
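The control-theory analogy can be made concrete with a toy sketch. Everything below is purely illustrative—the class name `EngagementLoop`, the setpoint, and the simple proportional correction are my own stand-ins, not any platform’s actual design. Content intensity is the input to the “black box,” observed engagement is the output, and the gap between the two feeds back into the next adjustment.

```python
class EngagementLoop:
    """Toy closed-loop controller: nudge content intensity until
    observed engagement approaches a target level."""

    def __init__(self, target, gain=0.5):
        self.target = target      # desired engagement (the output setpoint)
        self.gain = gain          # how aggressively each correction is applied
        self.intensity = 0.0      # current input fed to the "black box"

    def step(self, observed_engagement):
        # Feedback signal: how far the output is from the target.
        error = self.target - observed_engagement
        # Proportional correction of the input.
        self.intensity += self.gain * error
        return self.intensity


loop = EngagementLoop(target=1.0)
engagement = 0.0
for _ in range(20):
    intensity = loop.step(engagement)
    engagement = intensity  # toy user model: output simply follows input
# engagement converges toward the target of 1.0
```

The point of the sketch is only that the user sits inside the loop: their measured behaviour is the feedback that tunes the next round of stimulus.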
According to informal surveys I have conducted, many individuals, especially younger ones, appear quite prepared to make this trade. Most consider the convenience a good value exchange for the data they give away. But few have made up their minds about the tradeoffs in a completely free, rational, and informed manner. Technological black magic is deployed to water down their free will, generally making them ill-prepared to recognize the hidden drawbacks. With so many friends and acquaintances using big data services, consumers are also heavily influenced by the power of FOMO (fear of missing out). Not using popular apps like Instagram and Facebook almost seems irrational when everyone else is on the platforms.
Every minute of every day, millions of consumers happily use their connected devices to post thoughts, videos, and images; communicate with friends and family; surf the Internet; and shop. And most have no real understanding of what’s happening on the other side of their screens—where oppositional marketers are hard at work deploying algorithms, artificial intelligence (AI), machine learning, and other advanced technologies, prioritizing the interests of advertisers and shareholders over those of the platforms’ users.
All marketers seek customer loyalty, and that means habituating users; the usual route to this end, however, has been to manage customer perceptions by engaging cognition and emotion. Changed perceptions modify attitudes, which in turn drive new behaviours. Oppositional marketers reverse this process. They use personal data to get users to behave in certain ways repeatedly until the behaviours become habitual—often without users making a conscious choice—knowing that users will then modify their prior attitudes and perceptions to align with their new behaviours. After all, no one wants to behave in a way they cannot justify.
Oppositional marketing does not replace relationship marketing. It operates in parallel. User value may be created, but the primary goal isn’t about serving user needs. It is all about data acquisition and user management. By increasing “attention and engagement” and “time on platform,” big data platforms aim to out-collect their data-collecting competitors and lead the market in the control of user behaviour.
Think about those three little dots that flash after you send a message to someone. The dots indicate that the other party is preparing to respond, enticing you to remain on the platform to await the reply. Does the name Pavlov ring a bell? It should. After all, more than a few marketers and technologists who helped big data platforms become profitable studied the work of professor B. J. Fogg, director of the Persuasive Technology Lab at Stanford University, who coined the term “captology,” which refers to the study of how interactive technology like smartphones and websites can be used to change user attitudes and behaviours. In order to serve their real customers—ranging from goods and services companies to not-for-profits, political entities, and scammers—big data platforms regularly fog the free will of users via the brain’s limbic system, inducing adrenaline surges by serving up videos that they know will outrage us into staying online or designing the user experience to provide pleasurable shots of dopamine every time something we post scores a “like.”
Big data platforms learned from the gaming industry. As anthropologist Natasha Dow Schüll highlighted in Addiction by Design, slot machines are designed to operate with a mechanical rhythm that puts players in a trance-like state in which daily worries, demands, and greater awareness fade away. This euphoric condition is enhanced by machine ergonomics, casino architecture, customer tracking algorithms, and cash access systems, which help keep players playing to the point of physical and economic exhaustion. The industry’s ambience management is so effective that many gamblers can’t wait to return even after leaving a casino penniless because they have been habituated to desire the mental happy zone that gaming companies manufacture to override logical reasoning and regrets.
Like slot machines, the user experience offered by big data firms is designed to entice consumers to forget about doing other things and keep coming back for more. Eli Pariser’s The Filter Bubble, an excellent early book on this subject, describes how big data companies started captivating online consumers by spoon-feeding them only information and opinions that conform to and reinforce their own beliefs, leaving less room for unexpected encounters that spark creativity and innovation and arguably support democratic societies by providing diversity in the exchange of ideas.
Pariser’s book identified how the personalization trend was being driven in ways that isolated consumers and undermined the Internet’s original purpose as an open platform for the spread of ideas. That was almost a decade ago. Today, the ways in which data collected about consumers can be used to manipulate and control their behaviour pose a real threat to the public good.
As things stand, major market players like Amazon, Facebook, Google, LinkedIn, TikTok, Tinder, and Uber are highly advanced at using the data they collect online—along with whatever they may learn about their users off platform—to make users feel comfortable and secure while online. Artificial intelligence, algorithms, and machine learning help the platforms understand how to increase engagement and feed filter bubbles. This is done repeatedly, with incremental changes, while measuring what works and what does not, in order to better serve advertisers who seek a custom list of users with particular behaviours or users who resemble current customers. Figure 1 describes this graphically, with Facebook as an exemplar.
Figure 1: Facebook, Users, and Advertisers
Platforms design for behavioural modification by doing four things very well: understanding the current emotional and cognitive states and contexts of their users, increasing user motivation, simplifying the route to new behaviours, and then, when user motivations are heightened, triggering the desired behaviours.
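Those four steps can be sketched as a simple pipeline. The function names and rules below (`profile_user`, `boost_motivation`, and so on) are hypothetical labels for the stages just described, not real platform code:

```python
def profile_user(events):
    """Step 1: infer a crude emotional/contextual state from recent activity."""
    outrage = sum(1 for e in events if e == "angry_react")
    return {"aroused": outrage >= 2}

def boost_motivation(state):
    """Step 2: choose an appeal that raises motivation given the state."""
    return "threat_alert" if state["aroused"] else "social_validation"

def simplify_route(action):
    """Step 3: shorten the path to the desired behaviour (e.g., one click)."""
    return "one_click:" + action

def trigger(state, appeal, route):
    """Step 4: fire the prompt only once motivation is heightened."""
    if state["aroused"] or appeal == "social_validation":
        return route
    return None

# Illustrative run: an agitated user gets a one-click prompt to rejoin a thread.
state = profile_user(["angry_react", "angry_react", "share"])
appeal = boost_motivation(state)
prompt = trigger(state, appeal, simplify_route("rejoin_thread"))
```

Crude as it is, the sketch captures the sequencing the paragraph describes: state first, motivation second, friction removal third, and only then the trigger.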
Motivation can be increased by appealing to the Maslow hierarchy of needs, which lists five categories of human need that dictate behaviour: physiological needs, safety needs, love and belonging needs, esteem needs, and self-actualization needs. Not all psychologists agree with this model, but satisfying lower-level needs appears to enable higher-level needs to be met, so lower-level needs receive the most attention when big data platforms seek to increase user motivation. LinkedIn, for example, levers social acceptance by sending users emails saying: “You are getting noticed. See who’s looking at your profile.” This appeals to social needs, vanity, curiosity, and FOMO simultaneously—a powerful mix—and triggers more time spent on the site.
Fear is also a powerful motivator; it can involve creating apprehension of loss or diminishment (e.g., in relation to friends, status, income, wealth, health, or safety) or threats that demand attention. Big data platforms actively stoke fear and the spread of conspiracy theories because these provide the adrenaline and dopamine referred to previously. In other words, platforms benefit from trolls, foreign operatives, and conspiracy theorists not just because of their purchases of advertising or other services, but also because truthful news is often too tepid to sustain users looking to stimulate their neurotransmitters. Facts take a backseat to hypotheses that cannot withstand credible scrutiny but that drive increased user time on the platform, greater user retention, and the acquisition of new users.
Platforms remove trolls and other undesirable users, but they constantly reappear, so suppression is an ongoing game of “whack-a-mole.” And after users read about the evil scientists who created COVID-19, the latest enemy within, or a new caravan of foreign marauders, machine learning takes over and amplifies the outrage or concern by directing threatened users to a likeminded group or website. How exactly the machine learning operates is not always easy to understand, even for the platforms themselves, but platforms know it supports business by driving fear while they make a public show of de-platforming.
Generally speaking, big data platforms are addicted to the heightening of users’ emotional intensity because this helps the platforms achieve their growth objectives. Figure 2 describes progressively higher levels of users’ emotional intensity. Users at a low level of emotional intensity can be induced to spend more time online by raising their emotional intensity to increase engagement. Appeals to the highest level of emotional intensity can include clickbait or messaging that seeks to create loathing, rage, or vigilance, for example.
Figure 2: Increasing Emotional Intensity
Source: Based in part on content from Neil V. Watson and S. Marc Breedlove, The Mind’s Machine: Foundations of Brain and Behavior.
The adoption of new behaviours can be accelerated when the user journey is simplified or processes are combined to make it easier for users to act. Amazon’s introduction of the shopping cart is one such example. One-click purchasing is another. A third is the one-click contact-upload feature that several platforms request. With this information, they have an increased ability to change behaviours through social suasion, both to attract new users and to tweak existing ones: “Your contact Susie just bought a new Widget. Click here to find out more.”
With user emotions heightened, motivation to stay online increased, and the journey to user action simplified, the behavioural trigger can be pulled using a variety of tactics based on predictive analytics. For example, purchases can be accelerated by offering time-limited discounts, communicating products are in short supply, sending reminders of gift-giving opportunities such as friends’ birthdays, or, potentially more perniciously, serving links to content that resonate with a user’s preference bubbles or emotional state.
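A minimal sketch of what such trigger selection might look like, assuming a hypothetical per-user feature dictionary (field names like `cart_items` and `days_to_friend_birthday` are my own inventions, chosen only to mirror the tactics listed above):

```python
def pick_trigger(user):
    """Return the highest-priority behavioural trigger for a user profile.
    Purely illustrative rules mirroring the tactics named in the text."""
    if user.get("cart_items") and user.get("price_sensitive"):
        return "time_limited_discount"   # accelerate a stalled purchase
    if user.get("viewed_item_stock", 99) < 5:
        return "low_stock_warning"       # scarcity messaging
    if user.get("days_to_friend_birthday", 99) <= 7:
        return "gift_reminder"           # gift-giving opportunity
    return "bubble_content"              # fall back to preference-bubble links


# Example: a user whose friend's birthday is three days away.
chosen = pick_trigger({"days_to_friend_birthday": 3})  # "gift_reminder"
```

Real systems would of course rank options with predictive models rather than hand-written rules, but the structure is the same: the profile selects the trigger, not the user.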
Where Are the Regulators?
The following discussion is not intended to serve as a legal primer on regulations. It is simply meant to illustrate the environment in which big data platforms remain more or less free to follow the motto of Facebook founder Mark Zuckerberg—which is “move fast and break things”—without facing much accountability for individual and societal impact.
Regulators are notoriously slow at adjusting to technological changes, especially when complex technology-based “solutions” supposedly address the negative impact of the changes in question. As a result, one of the major things that is broken today is the level of protection provided to users of big data platforms who are unable to protect themselves from marketing that opposes their interests.
Privacy regulations differ from country to country, and even within national jurisdictions. As things stand, the strongest rules exist in the European Union, where the General Data Protection Regulation (GDPR) limits the gathering of personal data by stating that it can only be collected “for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes.” As a result of this wording, individuals who go looking can find data-collection practices spelled out explicitly in end-user licensing agreements and posted privacy policies. Whether individuals in the EU know it or not, the GDPR ensures they can access data collected about them by big data platforms, have it erased, and contest its use for purposes not related to the service that collected it.
In the United States, privacy provisions are an uneven patchwork across federal and state levels. The California Consumer Privacy Act (CCPA) requires data-collecting companies to inform consumers about the categories of data being gathered along with what is done with it. While consumers have no control over what is collected, the CCPA does attempt to limit the use of data to the purpose for which it was originally gathered. That said, the CCPA does not limit the gathering of sensitive information like geolocation data or biometric data, including DNA, fingerprints, and retinal scans. As noted in a DataGuidance comparison of major privacy laws, California also leaves companies free to make inferences about an individual’s psychological state, predispositions, behaviours, attitudes, and intelligence.
Canada’s Personal Information Protection and Electronic Documents Act leaves much to interpretation by stating: “An organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances.” The Office of the Privacy Commissioner of Canada also opens the door to different interpretations of what is allowable when stating “significant harm” should not occur when consumers trade some of their privacy for convenience and choice. Is incremental behavioural modification—where one single change may not be particularly problematic but which can be egregious in aggregate across time—a significant harm? And is harm considered only in respect to individuals (as seems to be the case), or can a broad negative societal impact be considered significant enough to limit the use of personal data?
A Zombie Apocalypse Future
Big data platforms don’t just seek to manipulate consumer behaviour using their online posts, search histories, interaction and transaction data, and entertainment preferences. According to a recently updated “Big brother brands report” by cybersecurity firm Clario, the information currently being collected “ranges from the things you might expect—like your name, date of birth and email address—to the more obscure, like your pets, hobbies, height, weight and even what you like to get up to in the bedroom.” And it is not just personal information being collected. Of all the brands that collect data, 6.25 per cent store facial images, and some go further by requesting access to your entire image library, which is full of insights related to your interests, health, activities, and lifestyle. And where voice recognition is used, voice files are also stored for later use or analysis.
If the Wild West environment remains in place, it is reasonable to assume the amount of user data collected will grow, perhaps by orders of magnitude. It is also safe to assume big data platforms will continue to develop the ability to modify and manipulate user behaviour, and do it seamlessly without users noticing, at scale, and in real time. Data gathering will also likely become more invasive, and not just online. Platforms acquire competitors and firms in other markets to provide new opportunities for data gathering, integration, insights, algorithm deployment, and machine learning. Google’s acquisition of Nest and Fitbit, for example, will extend its data gathering to our doorsteps and beyond, on our bodies. The EU has imposed some restrictions on the use of Fitbit’s data for providing ads, but other jurisdictions have not been as quick to recognize potential downsides, such as integration of Fitbit data with other user data, emerging technologies (like the Fossil Group tech Google has acquired), electronic health records (where Google has patents), and the firm’s various health industry initiatives (like Verily Life Sciences).
While most users trust Google not to do evil, the current regulatory environment doesn’t ensure this trust won’t be broken, especially in labyrinthine ways enabled by metadata, which is why the current round of acquisition activity should raise concerns over the direction data collection is going. As noted by reporter Abby Vesoulis, Amazon’s recently announced plan to acquire MGM Studios from a group of private equity firms for US$8.45 billion will, if completed, do more than help the company compete with streaming giants like Netflix. It sets up this data-collecting scenario:
Imagine you invite friends over for a movie night on a new flatscreen TV purchased on Amazon Prime. The gathering is last minute, but the television was delivered to you in two days through Amazon’s speedy fulfillment services. You swing by Amazon-owned Whole Foods to get some snacks and pizza beforehand, which you’ll get a discount on because you’re a Prime member. When your friends arrive, you may stream some tunes on Amazon Music via your Amazon Echo speaker, and then queue up the thousands of movie options on Amazon Prime Video. Before finalizing the selection, your friends compare movie reviews on IMDB, an Amazon subsidiary since 1998.
Brands like Apple have long been in the product ecosystem business. Now we are living within behavioural ecosystems managed and directed by major platforms.
Facebook and other platforms are investing heavily in the development of consumer hardware products, so it is not hard to imagine implantable data-gathering devices becoming as common as today’s wearables. Imagine a Nest-branded pacemaker or glucose meter implant for people with diabetes. Now imagine what kind of data might be gathered by these devices and how it might be used.
Public resistance to the intrusion of big data platforms will likely increase as data gathering becomes even more advanced. But most users will naively carry on, and the interests of big data platforms will likely further converge as they continue to vacuum up all possible sources of user data. With even larger data repositories in all spheres of a user’s life, and with an even greater capability to apply data to user behaviour management in real time, it will be increasingly hard for a user to assert their individualism and independence, at least online.
Clearly, ignoring where data collection is going in the current regulatory environment would be a profound mistake. Think about what can be done when data-collecting platforms really start analyzing everything users are doing and saying in the moment and combine what they learn with everything an individual user has ever said or done online. Now include data on users’ location, age, education, and social connections along with users’ psychographics and idiosyncrasies and information on users’ political leanings, assets, income, brand preferences, and spending patterns, not to mention health, sexual orientation, and gender identification.
What if big data platforms started measuring one’s emotional state by assessing pupil dilation, heart rate, facial expressions, and reaction time? Many have access to all the data they need to do this now. What if they used in-home devices to analyze speech patterns (e.g., how many times the words “I” or “we” are used) and looked for insights related to a person’s egocentricity or lack thereof? What if they conducted cluster analysis, A/B tests, conjoint analysis, longitudinal analysis (across time), and pricing analytics to separate users from their money? What if the content delivered was designed to be so engaging based on how users felt at the moment that we stopped doing anything else and remained captivated to the point of exhaustion?
Once, much of this would have sounded impossible or unlikely—the stuff of movies such as A Clockwork Orange and Minority Report or TV shows like Black Mirror. But this is indeed the apparent direction of big data platforms, and it threatens to exacerbate social issues related to mental health, neediness, outrage, polarization, pornification, radicalization, and sub-cultural affiliation, as well as loneliness in a sea of digital “friends.”
Restoring Respect for Free Will
When Amazon first introduced profile-based recommendations based on purchases made by other users with similar interests, people who valued them were conditioned to click again next time a recommendation appeared. This was a fairly benign behaviour modification. But big data’s oppositional marketers have since become far more advanced at introducing new behaviours, which users then rationalize with modified cognition.
If the Canadian government used our personal information to manipulate citizens this way, we would revolt. The major big data platforms should explore business model innovation, including reduced or different dependence on advertisers. But until consumer sentiment radically changes, or regulators catch up and seriously limit how advanced technologies and personal data can be used to manipulate and influence individuals, big data platforms appear to have enough room to keep hacking minds.
The question for other companies is whether they want to be in the business of turning consumers into zombies. As mentioned above, all major companies are now essentially data companies, which is why organizations of all sizes are moving to mimic the oppositional marketing practices of big data platforms. For many, the case for doing so appears clear-cut, since it is legal and the profits that can be made are significant. But being able to easily rationalize something doesn’t make it right, especially for organizations that support the emerging concept of stakeholder capitalism—which dictates managing a business in the interests of all stakeholders, including society.
Simply put, deploying oppositional marketing to manipulate consumer behaviour is like using loopholes to avoid paying corporate taxes, meaning it raises ethical questions, or at least it should. If truth be told, what occurs in the complex world of data collection is often opaque enough to defy comprehensive ethical inspection. But this complexity should raise executives’ eyebrows. After all, as the 2008 financial crisis taught the banking world, complexity raises corporate risk and not understanding what is being done in the pursuit of profits does not negate the responsibility for oversight and corporate leadership.
The simple fact that data collection has become so complex is precisely why this paper argues that responsible firms looking to get into the game need to go beyond regulatory compliance and ask how far is too far in the planning phase for any data-collecting strategy. To what extent is oppositional marketing to be embraced? That’s the big question, and getting to the answer involves exploring whether the gains that can be obtained from full revenue optimization via oppositional marketing are warranted or if suboptimization might lead to better long-term results, including preserving the reputations of firms in future regulatory and societal environments.
When considering how far to go with oppositional marketing, firms should seek guidance from their own statement of values along with their customer bill of rights, if one exists. At a high level, the “Golden Rule”—do unto others as you would have them do unto you—provides conceptual guidance for all ethical action. The postmodern philosopher Jacques Derrida is known for examining the paradoxes that afflict notions like giving. For a gift to be genuine, he argued, it must be given altruistically rather than come with strings attached that trigger the cycle of giving and taking. In a similar vein, when offering consumers free services, companies should ensure the orientation of the enterprise is genuinely and fully focused on the interests of users by offering the option to opt out of data collection, or at least by giving users more control over data retention, use, sharing, aggregation, and value extraction.
Despite the enormous wealth currently being created by big data platforms, it remains highly questionable whether any business can be sustained long term through oppositional marketing and the zombification of consumers. Firms just entering the data collection game should recognize the opportunity to prepare for the future: forgo the gains to be had by bending the will of customers to their ultimate detriment, in favour of creating new and mutual value collaboratively with individual customers—the essence of relationship marketing.
After all, if users won’t object, regulators surely will eventually move to fend off a zombie apocalypse by bringing down the hammer on oppositional marketing. And at that point, the data collectors may find that it was much cheaper and easier to create zombies than to undo the strategies that created them.