I was given a look at the Whisper moderation process because Michael Heyward, Whisper’s CEO, sees moderation as an integral feature and a key selling point of his app. Whisper practices “active moderation,” an especially labor-intensive process in which every single post is screened in real time; many other companies moderate content only after it has been flagged as objectionable by users, a practice known as reactive moderation. “The type of space we’re trying to create with anonymity is one where we’re asking users to put themselves out there and feel vulnerable,” he tells me. “Once the toothpaste is out of the tube, it’s tough to put it back in.”
Watching Baybayan’s work makes terrifyingly clear the amount of labor that goes into keeping Whisper’s toothpaste in the tube. (After my visit, Baybayan left his job, and the Bacoor office of TaskUs was raided by the Philippine version of the FBI for allegedly using pirated software on its computers. The company has since moved its content moderation operations to a new facility in Manila.) He begins with a grid of posts, each of which is a rectangular photo, many with bold text overlays—the same rough format as old-school Internet memes. In its freewheeling anonymity, Whisper functions for its users as a sort of externalized id, an outlet for confessions, rants, and secret desires that might be too sensitive (or too boring) for Facebook or Twitter. Moderators here view a raw feed of Whisper posts in real time. Shorn of context, the posts read like the collected tics of a Tourette’s sufferer. Any bisexual women in NYC wanna chat? Or: I hate Irish accents! Or: I fucked my stepdad then blackmailed him into buying me a car.
“There is a very strong track record of places that attract talent becoming places of long-term success,” said Edward Glaeser, an economist at Harvard and author of “Triumph of the City.” “The most successful economic development policy is to attract and retain smart people and then get out of their way.”
The economic effects reach beyond the work the young people do, according to Enrico Moretti, an economist at the University of California, Berkeley, and author of “The New Geography of Jobs.” For every college graduate who takes a job in an innovation industry, he found, five additional jobs are eventually created in that city, such as for waiters, carpenters, doctors, architects and teachers.
“It’s a type of growth that feeds on itself — the more young workers you have, the more companies are interested in locating their operations in that area and the more young people are going to move there,” he said.
About 25 percent more young college graduates live in major metropolitan areas today than in 2000 — double the percentage increase in cities’ total population. All of the 51 biggest metros except Detroit have gained young talent, either from net migration to the cities or from residents graduating from college, according to the report. The report is based on data from the federal American Community Survey and was written by Joe Cortright, an economist who runs City Observatory and Impresa, a consulting firm on regional economies.
We then watched a clip on YouTube in which monkeys in adjacent cages in a university laboratory perform the same task for food. Monkey A does the task and gets a grape – delicious. Monkey B, who can see Monkey A, performs the same task and is given cucumber – yuck. Monkey B looks pissed off but eats his cucumber anyway. The experiment is immediately repeated, and you can see that Monkey B is agitated when his uptown, up-alphabet neighbour is again given a grape. When he is presented with the cucumber this time, he is furious – he throws it out of the cage and rattles the bars. I got angry on his behalf and wanted to give the scientist a cucumber in a less amenable orifice. I also felt a bit pissed off with Monkey A, the grape-guzzling little bastard. I’ve not felt such antipathy towards a primate since the one in Raiders of the Lost Ark with the little waistcoat that betrayed Indy.
Slingerland explained, between great frothing gobfuls of munched hazelnut, that this inherent sense of fairness is found in humans everywhere, but that studies show that it’s less pronounced in environments where people are exposed to a lot of marketing. “Capitalist, consumer culture inures us to unfairness,” he said. That made me angry.
The YouTube personality with the most subscribers isn’t Justin Bieber (8 million) or Rihanna (12.5 million). That honor goes to a 24-year-old Swede named Felix Kjellberg, better known by his YouTube handle, PewDiePie.
PewDiePie doesn’t sing or dance, no. PewDiePie has made his name—and a fortune—posting videos of himself playing video games. In one November video, for instance, he plays the Xbox Indie game “Techno Kitten Adventure,” helping a feline avatar navigate dangerous terrain filled with unicorns and narwhals, and shrieking in frustration each time his cat crashes into an obstacle.
I would argue, even, that programmer salaries are low when taking a historical perspective. The trend is flat, adjusted for inflation, but the jobs are worse. Thirty years ago, programming was an R&D job. Programmers had a lot of autonomy: the kind of autonomy it takes if one is going to invent C or Unix or the Internet or a new neural network architecture. Programmers controlled how they worked and what they worked on, and answered either to other programmers or to well-read scientists, rather than to anti-intellectual businessmen who regard them as cost centers. Historically, companies sincerely committed to their employees’ careers and training. You didn’t have to change jobs every 2 years just to keep getting good projects and stay employable. The nature of the programming job, over the past couple of decades, has become more stressful (open-plan offices), and careers have become shorter (ageism). Job volatility (unexpected layoffs and even phony “performance-based” firings in lieu of proper layoffs, in order to skimp on severance because that’s “the startup way”) has increased. Given all the negatives associated with a programming job in 2014 that just didn’t exist in the 1970s and ’80s, flat performance on the salary curve is disappointing. Finally, salaries in the Bay Area and New York have kept abreast of general inflation, but the costs of living have skyrocketed in those “star cities,” while the economies of the still-affordable second-tier cities have declined. In the 1980s and ’90s, there were more locations in which a person could have a proper career, and that kept housing prices down. In 2014, that $142,000 doesn’t even enable one to buy a house in a place where there are jobs.
The scale I’m about to define comes from one insight about human organizations. Teams, in general, have four categories into which a person’s contribution can fall: dividers, subtracters, adders, and multipliers. Dividers are the cancerous people who have a broad-based negative effect on productivity. This usually results from problems with a person’s attitude or ethics – “benign incompetence” (except in managers, whose job descriptions allow them only to be multipliers or dividers) is rarely enough to have a “divider” effect. This is an “HR issue” (dividers must improve or be fired) but outside the scope of this professional-development scale, which assumes good faith and a wish for progress. Subtracters are people who produce less than they cost, including the time of others who must coach and supervise them. As a temporary state, there’s nothing wrong with being a subtracter – almost every software engineer starts out his career as one, and it’s common to be a subtracter in the first weeks of a new job. Adders are the workhorses: competent individual contributors who deliver most of the actual work. Finally, multipliers are those who, often in tandem with “adder” contributions, make other people more productive. In many industries, being a multiplier is thought to be the province of management alone, but in technology that couldn’t be further from the truth, because architectural and infrastructural contributions (such as reusable code libraries) have a broad-based impact on the effectiveness of the entire company.
Today’s young adults are constantly rebuked for not following the life cycle popular in 1960. But a quick look at earlier eras shows just how unusual mid-20th-century young people were. A society in which people married out of high school and held the same job for 50 years is the historical outlier. Some of that era’s achievements were enviable, but they were not the norm. The anxieties that 19th-century young people poured into their New Year’s diary entries are more common. Americans considered young adulthood the most dangerous part of life, and struggled to find a path to maturity. Those who did best tended to accept change, not to berate themselves for breaking with tradition. Young adults might do the same today: stop worrying about how they appear from the skewed perspective of the mid-20th century, and find a new home, a new stability and a new community in the new year.