Gigged
For my family: Debra, Steve, Richard, and Alex
Preface
When I first heard about the “future of work” in 2011, I was working as a reporter at a tech blog—a job that involved wading through an endless stream of startup pitches.
This future, dozens of young entrepreneurs explained to me, didn’t involve jobs. Nobody liked jobs: The boredom! The rigid structure! The obedience! What the world really needed were gigs.
The pitch came in different versions. Some startups had created ecommerce stores for labor. Small businesses and Fortune 500 companies alike could sift through worker profiles by skill and hire them on a project-by-project basis. Other startups worked more like dispatchers. Drivers, dog walkers, and errand runners could get notifications on their phones when a job became available and choose to either accept it or reject it. A small handful of companies had taken a third approach, breaking work into tiny tasks that took only minutes and paid only cents. They assigned online crowds of people to work on large, tedious projects, like transcribing audiotapes, or checking to make sure that grocery stores across the country remembered to put a certain brand of cola in a prime location.
The rise of these new apps, their founders assured me, meant that soon we would all be working on the projects we chose, during the hours that we wanted. We would no longer be laboring for the man, but for our own tiny businesses. This meant that in the future, it wouldn’t matter how many jobs got shipped overseas or were taken by robots. We could work for our neighbors, connect with as many projects as we needed to get by, and fit those gigs in between our band rehearsals, gardening, and other passion projects. It would be more than the end of unemployment. It would be the end of drudgery.
The idea was deeply appealing to me. In addition to sounding like more fun than a job, this version of the future of work relieved a deep uncertainty I had about the future.
From a young age, my baby boomer parents had instilled in me that the mission of becoming an adult—the path to dignity, security, and independence—was to obtain a job. Most adults I knew in my rural Wisconsin town had a straightforward profession like teacher, lawyer, or mechanic. They worked at the grocery store or for the postal service. A large Nestlé factory in a town nearby made the air smell like chocolate if the wind blew just right, and another factory made Kikkoman soy sauce. Becoming employable—not following dreams, seeking some sort of personal fulfillment, or whatever it is they tell kids in coastal states—was in itself deserving of respect and dignity.
So eager was I to become a real person, a person with a job, that I’d spent a good chunk of my summer vacation before high school at a greenhouse, picking aphids off of herbs and pulling the hard-to-reach weeds (being 13, I was skinny enough to squeeze between plant stands). My parents didn’t need the money. I didn’t need the money. Jobs just felt instinctively important.
I’m told most millennials don’t feel that way, but I haven’t really met many people in general who don’t value stability and safety. Maybe what makes millennials different is that those things feel particularly elusive. My peers and I came of age at a time when everything everyone believed about work was at best in flux and at worst already clearly no longer the case.
In 2005, when I was a junior in high school, I decided I would become a journalist. In 2007, as newsrooms were scrambling to move their business models online, the Great Recession started. And three years after that, the winter of my senior year of college, the unemployment rate in the United States hit double digits. Only the computer programmers, it seemed, were excited for graduation. As I conjured a frantic storm of resumes, informational interviews, and job fair mailing lists, I had trouble sleeping and, at times, breathing. Though at the time I was narrowly focused on my own employment prospects (or lack thereof), my anxiety was small by comparison to many. I had a college degree, parents willing to help me, and connections at a local greenhouse that would have been happy to have me back for another summer. I was going to be ok. About the future, the world around me, I wasn’t so sure.
Media wasn’t the only industry being remade by technology. As newsrooms were announcing layoffs, other companies were using internet freelance marketplaces and staffing agencies to zap white-collar jobs overseas. Artificial intelligence and robotics were replacing others. Many of the jobs that remained in the United States no longer came with security. Companies had, under pressure from shareholders, cut the fat from their benefits packages for employees, piling more and more risk onto their shoulders. As the economy recovered, the companies hired temp workers, contract workers, freelancers, seasonal workers, and part-time workers, but full-time jobs that had been lost to the recession were never coming back. Over the next five years, nearly all of the jobs added to the US economy would fall into the “contingent” category.1 That “job” that we’d all been told was the key to our secure life no longer seemed like a natural path.
As a young person, you’re not allowed to sit out the future. You don’t get to put off learning how to use email because you’d rather fax. Nobody thinks that’s endearing. When you see a trend coming down the pike, you know it’s going to hit you. So perhaps when entrepreneurs described for me a world in which work would be like shopping at a bazaar (a gig economy startup had picked up this concept in its name, Zaarly), it appealed to me more than it would have to someone with more gray hairs: I’ll take that vision of the future—no need to play that horrifying mass unemployment and poverty vision that I had all lined up and ready to go.
I wrote my first story about the gig economy in 2011, long before anyone had labeled it the “gig economy.” The headline was “Online Odd Jobs: How Startups Let You Fund Yourself.”2 Though my job changed throughout the next seven years, my fascination with the gig economy didn’t. I first watched as the gig economy became a venture capital feeding frenzy, a hot new topic and a ready answer to the broader economy’s problems. Then, as stories of worker exploitation emerged, I listened as the same companies that had once boasted about creating the “gig economy” worked to distance themselves from the term. I saw the gig economy start a much-needed conversation about protecting workers as technology transforms work.
The more I learned, the more I understood that the startup “future of work” story, as consoling as it was, was also incomplete. Yes, the gig economy could create opportunity for some people, but it also could amplify the same problems that made the world of work look so terrifying in the first place: insecurity, increased risk, lack of stability, and diminishing workers’ rights. The gig economy touched many people. Some of them were rich, some poor, some had power, and some didn’t. Its impact on each of them was different.
The chapters of this book alternate between five of their stories. It’s not intended to be a complete, bird’s-eye view of the gig economy. Any economy is built by humans, and this book is about them.
PART I
THE END OF THE JOB
CHAPTER 1
A VERY OLD NEW IDEA
At South by Southwest 2011, the napkins featured QR codes. Flyers rained down from party balconies, and the grilled cheese—provided by group messaging app GroupMe—was free.
Startups looked forward to the tech-focused “Interactive” portion of the famous music festival in Austin, Texas, like a popular high school student looks forward to the prom. One of the new companies among them, it was widely assumed, would be crowned a “breakout hit,” just as Twitter had once “broken out” by introducing its app to the tech-savvy SXSW crowd. It was only a matter of attracting enough attention—an effort that usually involved a marketing gimmick.
At the time, Uber was a little-known app that worked as a dispatch service for local owners of licensed private car companies. Its attempt at guerrilla marketing was an on-demand pedicab service.
The startup decorated 100 rented pedicabs with banners that said “I U” next to a solid black shape of Texas (“I Uber Texas,” I suppose), and in interviews with bloggers, its executives hopefully suggested that riders post photos of themselves with the hashtag #Uberspotting. “If you’re an Uber virgin, prepare to experience the future of transportation,” its blog explained, helpfully noting that the process of calling an Uber pedicab would be easy to navigate “even when drunk.”
Within a few short years, Uber would become one of the most valuable companies in the world. It would allow anyone—not just the professional drivers with which it had begun—to earn money as a taxi driver, and its fares (then $15 at minimum) would drop so low that in some cities they’d compete with public transportation. The startup would raise more than $12 billion in venture capital funding at a valuation that made it, on paper, worth more than 100-year-old companies like GM and Ford, and the Uber business model would give rise to an entire category of startups. The transportation service would also set a new expectation among consumers: that everything should come to them “on demand,” at the push of a button—an idea that would reshape service industries, retail, and digital interface design.
But at SXSW 2011, Uber just looked like yet another dream.
At the time, I was working as a reporter at a tech blog. My list of “13 Potential Breakout Apps to Watch at SXSW 2011,”1 published the week before the festival, featured four group messaging apps, an app that turned a cell phone into a walkie-talkie (because I somehow thought walkie-talkies were better than phones?), and two nearly identical photo-sharing apps (one of which was Instagram). Uber didn’t make the cut.
I wasn’t alone in ignoring Uber. Despite its earnest attempt at social media marketing, only about five of SXSW Interactive’s nearly 20,000 attendees that year participated in #Uberspotting.
Uber attracted nearly as little attention a year later with an offer to deliver barbecue to SXSW attendees. The hype that year instead surrounded an app called Highlight, which made phones buzz when strangers in the same proximity had mutual friends and interests, as determined by their social media accounts. “The way we find people has been terribly inefficient,” Highlight’s founder told me earnestly in an interview.2 “We don’t realize how horrible it is because it’s always been that way.” He was dead serious about human interaction being broken, and his pitch for fixing it with an app was quite effective. The Highlight hype became so pervasive that at one panel I attended, a waggish moderator instituted a fake drinking game: “Every time Highlight is mentioned, drink twice … and then punch yourself.” Nobody, by contrast, was talking about Uber.
I had so little expectation of Uber becoming a mainstream utility that when I took my Uber-sponsored pedicab ride, I used my work email address to sign up for the app. I didn’t want my personal email account to get spam.
It wasn’t for another two years, by which time Highlight had been all but forgotten, that Uber finally emerged as a darling of Silicon Valley. Its “breakout” had nothing to do with a marketing stunt.
In 2013, the company raised a $258 million round of funding led by Google’s investment arm, Google Ventures—an amount that Gawker’s tech blog called “stupefying.”3
The $258 million investment seemed remarkable partly because Uber had so little in common with the hot apps of the time, those for sharing photos, turning phones into walkie-talkies, or making social connections on the street. Though some of these “potential breakout apps” sound trivial or silly in retrospect, they all had the potential to become quickly and massively profitable—Instagram and Snapchat both emerged from this period—which isn’t the case for most companies. By the time Facebook bought Instagram, the most successful of my “2011 breakout apps,” for $1 billion in 2012, the photo-sharing service had 30 million users but only 13 employees, including its cofounders. That’s more than $75 million of value per person.
Venture capitalists love companies that scale massively with as little infrastructure as Instagram. They generally ignore companies that grow slowly and sustainably over time, which, until around 2013, included most companies that sold in-person services like transportation.
Uber, though, was changing the game. Instead of buying cars or hiring employees, it made two apps: one for customers, one for drivers. When a customer requested a ride, Uber sent a notification to a nearby driver, who used his own car to do the job. Uber handled payments and charged a commission. All it needed to grow was the same thing that Instagram needed to grow: app downloads. The startup had figured out how to scale an analog service company as though it were a software company.
Uber avoided medallions, special license plates, and other government-created systems aimed at regulating taxi and limousine companies by claiming that it was a technology company rather than a transportation company. This would soon cause a dramatic confrontation with regulators. But another key to the startup’s seemingly endless potential for growth was—as important, powerful things so often are—extremely boring, at least at surface level. It was essentially a tax classification.
Uber had called its drivers “independent contractors.” This relieved the company from government-mandated employer responsibilities in most countries, and in the United States, where Uber started, it relieved the company of almost all of them. Workers who are classified as “employees” must be paid while they take coffee breaks and must be treated according to anti-discrimination laws. They come with commitments to contribute to government safety net programs for retirement and unemployment benefits. And they can be difficult to fire when business circumstances change.
Independent contractors come with none of these responsibilities. They also do not have the right to unionize under US federal collective bargaining laws, and there’s no requirement to provide them with training, equipment to do the job, or benefits.4 The situation is similar, albeit to a lesser extent, elsewhere. UK employers, for instance, do not need to offer sick days, holiday pay, a guaranteed minimum wage, or other benefits to self-employed contractors.
When a driver signed up to work for Uber as an independent contractor, he or she (but most likely he, as 81% of US drivers, as of December 2015, were men5) supplied his own car, gas, and overly pungent air fresheners. He paid for his own coffee breaks and his own health insurance. All of the responsibility of being in business, including taxes, rested on his shoulders. An Uber driver, in other words, was as close to a piece of code as Uber could find without having the cars drive themselves (an initiative that quickly became the company’s priority).
It seemed to investors like a smart strategy, but it wasn’t a new one. Decades before Uber started, companies in Silicon Valley had begun shifting work to independent contractors, subcontractors, and temporary workers as a way to reduce cost and liability. As an ad for the temporary staffing agency Kelly Services put it in 1971, the type of worker clients could expect to hire through such an agency:
Never takes a vacation or holiday.
Never asks for a raise.
Never costs you a dime for slack time. (When the workload drops, you drop her.)
Never has a cold, slipped disc or loose tooth. (Not on your time anyway!)
Never costs you for unemployment taxes and Social Security payments. (None of the paperwork, either!)
Never costs you for fringe benefits. (They add up to 30% of every payroll dollar.)
Never fails to please. (If your Kelly Girl employee doesn’t work out, you don’t pay.)6
By 2009, the year Uber launched, nearly all taxi drivers and around 13% of the US workforce were already self-employed or working as independent contractors. Other alternatives to hiring employees were also on the rise. Around 45% of accountants, 50% of IT workers, and 70% of truck drivers were working for contractors rather than as employees at the companies for which they provided services.7 And the number of temp workers in the United States was on its way to an all-time high. By 2016, 20% to 30% of the working-age population in the United States and European Union had engaged in freelance work.8 Add part-time work to the mix, and some estimates put the percentage of the US workforce that did not have a full-time job as high as 40%.9 Uber merely took a trend among corporations—employing as few people as possible—and adapted it for the smartphone era.
The Uber model worked great for both venture capitalists and customers. Uber’s technology was inarguably a huge improvement over the incumbent system for hailing a ride (which in an era of online shopping and dating apps somehow still involved raising a hand and hoping a cab would pass). Several months after Uber confirmed the massive Google Ventures investment, data about its users leaked to the press. They showed that around 80,000 new customers were signing up for Uber every week (about as many new users as Instagram added per week in late 2010) and suggested the company was on track to make around $210 million by the end of the year.10 Success seemed inevitable.
While any successful startup spawns imitators, with Uber, it felt like a gold rush. Entrepreneurs and venture capitalists suddenly wanted to apply the Uber business model to every analog industry that had once seemed too slow for Silicon Valley.