Tapping into big data, researchers and planners are building mathematical models of personal and civic behavior. But the models may hide rather than reveal the deepest sources of social ills.
By Nicholas Carr on April 16, 2014
In 1969, Playboy published a long, freewheeling interview with Marshall McLuhan in which the media theorist and sixties icon sketched a portrait of the future that was at once seductive and repellent. Noting the ability of digital computers to analyze data and communicate messages, he predicted that the machines eventually would be deployed to fine-tune society’s workings. “The computer can be used to direct a network of global thermostats to pattern life in ways that will optimize human awareness,” he said. “Already, it’s technologically feasible to employ the computer to program societies in beneficial ways.” He acknowledged that such centralized control raised the specter of “brainwashing, or far worse,” but he stressed that “the programming of societies could actually be conducted quite constructively and humanistically.”
The interview appeared when computers were used mainly for arcane scientific and industrial number-crunching. To most readers at the time, McLuhan’s words must have sounded far-fetched, if not nutty. Now they seem prophetic. With smartphones ubiquitous, Facebook inescapable, and wearable computers like Google Glass emerging, society is gaining a digital sensing system. People’s location and behavior are being tracked as they go through their days, and the resulting information is being transmitted instantaneously to vast server farms. Once we write the algorithms needed to parse all that “big data,” many sociologists and statisticians believe, we’ll be rewarded with a much deeper understanding of what makes society tick.
One of big data’s keenest advocates is Alex “Sandy” Pentland, a data scientist who, as the director of MIT’s Human Dynamics Laboratory, has long used computers to study the behavior of businesses and other organizations. In his brief but ambitious new book, Social Physics, Pentland argues that our greatly expanded ability to gather behavioral data will allow scientists to develop “a causal theory of social structure” and ultimately establish “a mathematical explanation for why society reacts as it does” in all manner of circumstances. As the book’s title makes clear, Pentland thinks that the social world, no less than the material world, operates according to rules. There are “statistical regularities within human movement and communication,” he writes, and once we fully understand those regularities, we’ll discover “the basic mechanisms of social interactions.”
What’s prevented us from deciphering society’s mathematical underpinnings up to now, Pentland believes, is a lack of empirical rigor in the social sciences. Unlike physicists, who can measure the movements of objects with great precision, sociologists have had to make do with fuzzy observations. They’ve had to work with rough and incomplete data sets drawn from small samples of the population, and they’ve had to rely on people’s notoriously flawed recollections of what they did, when they did it, and whom they did it with. Computer networks promise to remedy those shortcomings. Tapping into the streams of data that flow through gadgets, search engines, social media, and credit card payment systems, scientists will be able to collect precise, real-time information on the behavior of millions, if not billions, of individuals. And because computers neither forget nor fib, the information will be reliable.
To illustrate what lies in store, Pentland describes a series of experiments that he and his associates have been conducting in the private sector. They go into a business and give each employee an electronic ID card, called a “sociometric badge,” that hangs from the neck and communicates with the badges worn by colleagues. Incorporating microphones, location sensors, and accelerometers, the badges monitor where people go and whom they talk with, taking note of their tone of voice and even their body language. The devices are able to measure not only the chains of communication and influence within an organization but also “personal energy levels” and traits such as “extraversion and empathy.” In one such study of a bank’s call center, the researchers discovered that productivity could be increased simply by tweaking the coffee-break schedule.
Pentland dubs this data-processing technique “reality mining,” and he suggests that similar kinds of information can be collected on a much broader scale by smartphones outfitted with specialized sensors and apps. Fed into statistical modeling programs, the data could reveal “how things such as ideas, decisions, mood, or the seasonal flu are spread in the community.”
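To see what such a model might look like in miniature, consider a deliberately crude sketch — nothing from Pentland's lab, just an invented toy in Python — in which a contagion (an idea, a mood, the seasonal flu) hops across a contact network of the kind that proximity data could supply. The names, the network, and the transmission probability are all made up for illustration.

```python
import random

# Invented contact network: who regularly encounters whom, as might be
# inferred from proximity or call-log data. Purely illustrative.
contacts = {
    "ana":   ["ben", "carla"],
    "ben":   ["ana", "carla", "dev"],
    "carla": ["ana", "ben"],
    "dev":   ["ben"],
}

def simulate_spread(seed, transmit_prob=0.3, days=10):
    """Simple susceptible-infected spread: each day, every 'infected'
    person passes the contagion to each contact with a fixed probability."""
    infected = {seed}
    for _ in range(days):
        newly_infected = set()
        for person in infected:
            for peer in contacts[person]:
                if peer not in infected and random.random() < transmit_prob:
                    newly_infected.add(peer)
        infected |= newly_infected
    return infected

random.seed(1)
print(simulate_spread("ana"))  # who the contagion has reached after ten days
```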
The mathematical modeling of society is made possible, according to Pentland, by the innate tractability of human beings. We may think of ourselves as rational actors, in conscious control of our choices, but most of what we do is reflexive. Our behavior is determined by our subliminal reactions to the influence of other people, particularly those in the various peer groups we belong to. “The power of social physics,” he writes, “comes from the fact that almost all of our day-to-day actions are habitual, based mostly on what we have learned from observing the behavior of others.” Once you map and measure all of a person’s social influences, you can develop a statistical model that predicts that person’s behavior, just as you can model the path a billiard ball will take after it strikes other balls.
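A minimal sketch of that billiard-ball logic — again invented for illustration, not taken from the book — might weight each peer's observed behavior by how much time that peer spends with the person and feed the sum through a logistic curve:

```python
import math

def adoption_probability(peer_behaviors, peer_weights, bias=-1.0):
    """Toy logistic model: the more (and the more influential) the peers
    who already exhibit a habit, the higher the predicted chance that the
    person picks it up. The weights and bias here are arbitrary."""
    exposure = sum(w * b for w, b in zip(peer_weights, peer_behaviors))
    return 1.0 / (1.0 + math.exp(-(bias + exposure)))

# Three peers: two already have the habit (1), one does not (0); the weights
# stand in for how often each peer is observed interacting with the person.
print(round(adoption_probability([1, 1, 0], [0.8, 0.5, 0.3]), 2))  # 0.57
```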
Deciphering people’s behavior is only the first step. What really excites Pentland is the prospect of using digital media and related tools to change people’s behavior, to motivate groups and individuals to act in more productive and responsible ways. If people react predictably to social influences, then governments and businesses can use computers to develop and deliver carefully tailored incentives, such as messages of praise or small cash payments, to “tune” the flows of influence in a group and thereby modify the habits of its members. Beyond improving the efficiency of transit and health-care systems, Pentland suggests, group-based incentive programs can make communities more harmonious and creative. “Our main insight,” he reports, “is that by targeting [an] individual’s peers, peer pressure can amplify the desired effect of a reward on the target individual.” Computers become, as McLuhan envisioned, civic thermostats. They not only register society’s state but bring it into line with some prescribed ideal. Both the tracking and the maintenance of the social order are automated.
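As a back-of-the-envelope illustration of that claim — with numbers chosen only to show the mechanism, not to reproduce Pentland's results — one can compare spending an incentive budget directly on a target individual against spending it on the target's peers, whose behavior then feeds back into the target's own likelihood of adopting the habit:

```python
import math

def adoption_probability(peer_adoption_rate, direct_incentive=0.0,
                         bias=-1.0, peer_weight=2.5):
    """Toy model: adoption depends on the share of peers who have already
    adopted plus any incentive paid directly to the person. All parameters
    are invented for illustration."""
    score = bias + peer_weight * peer_adoption_rate + direct_incentive
    return 1.0 / (1.0 + math.exp(-score))

# Scenario A: the whole budget goes straight to the target individual.
direct = adoption_probability(peer_adoption_rate=0.2, direct_incentive=1.0)

# Scenario B: the same budget rewards the target's peers instead, pushing up
# the share of peers who adopt; the target gets no direct payment.
peer_targeted = adoption_probability(peer_adoption_rate=0.7)

print(f"direct reward: {direct:.2f}, peer-targeted: {peer_targeted:.2f}")
# Under these made-up numbers, the peer-targeted strategy comes out ahead.
```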
Ultimately, Pentland argues, looking at people’s interactions through a mathematical lens will free us of time-worn notions about class and class struggle. Political and economic classes, he contends, are “oversimplified stereotypes of a fluid and overlapping matrix of peer groups.” Peer groups, unlike classes, are defined by “shared norms” rather than just “standard features such as income” or “their relationship to the means of production.” Armed with exhaustive information about individuals’ habits and associations, civic planners will be able to trace the full flow of influences that shape personal behavior. Abandoning general categories like “rich” and “poor” or “haves” and “have-nots,” we’ll be able to understand people as individuals—even if those individuals are no more than the sums of all the peer pressures and other social influences that affect them.
Replacing politics with programming might sound appealing, particularly given Washington’s paralysis. But there are good reasons to be nervous about this sort of social engineering. Most obvious are the privacy concerns raised by collecting ever more intimate personal information. Pentland anticipates such criticisms by arguing for a “New Deal on Data” that gives people direct control over the information collected about them. It’s hard, though, to imagine Internet companies agreeing to give up ownership of the behavioral information that is crucial to their competitive advantage.
Even if we assume that the privacy issues can be resolved, the idea of what Pentland calls a “data-driven society” remains problematic. Social physics is a variation on the theory of behavioralism that found favor in McLuhan’s day, and it suffers from the same limitations that doomed its predecessor. Defining social relations as a pattern of stimulus and response makes the math easier, but it ignores the deep, structural sources of social ills. Pentland may be right that our behavior is determined largely by social norms and the influences of our peers, but what he fails to see is that those norms and influences are themselves shaped by history, politics, and economics, not to mention power and prejudice. People don’t have complete freedom in choosing their peer groups. Their choices are constrained by where they live, where they come from, how much money they have, and what they look like. A statistical model of society that ignores issues of class, that takes patterns of influence as givens rather than as historical contingencies, will tend to perpetuate existing social structures and dynamics. It will encourage us to optimize the status quo rather than challenge it.
Politics is messy because society is messy, not the other way around. Pentland does a commendable job in describing how better data can enhance social planning. But like other would-be social engineers, he overreaches. Letting his enthusiasm get the better of him, he begins to take the metaphor of “social physics” literally, even as he acknowledges that mathematical models will always be reductive. “Because it does not try to capture internal cognitive processes,” he writes at one point, “social physics is inherently probabilistic, with an irreducible kernel of uncertainty caused by avoiding the generative nature of conscious human thought.” What big data can’t account for is what’s most unpredictable, and most interesting, about us.
Nicholas Carr writes on technology and culture. His new book, The Glass Cage: Automation and Us, will be published in September.