'We refuse to create technology for warfare and oppression.' 'We’re not trying to take a political side.' These are just a few of the ways the tech industry, from rank-and-file workers to leaders like Jeff Bezos and Peter Thiel, has weighed in on the growing controversy over government contracts.
For as long as there’s been a technology industry, its leading companies have closely collaborated with the US government and its military.
One of the first big scores for the company that would go on to become IBM was a government contract to complete the 1890 census. Forty years later, a Navy air base would bring a glut of scientists and researchers to the area now known as Silicon Valley. Even Hewlett-Packard sold radar and artillery technology during World War II.
But for all that the military-industrial complex has done to build the tech sector into what it is today, tech workers are increasingly questioning the moral implications of their employers’ relationships with the US government.
In some cases, such as at Google, tech companies have made small concessions. Other companies, like Amazon, have stood firm on military involvement, even when unpopular. Elsewhere, pro-military executives and entrepreneurs like Peter Thiel have continued pursuing government contracts with zeal.
Below, several leaders weigh in on this conflict, and how tech’s role in the government could be the next big threat or benefit to society — including opinions from executives who’ve lived it, like Jeff Bezos, Satya Nadella, Marc Benioff, and more.
Table of Contents
- The role of AI in warfare
- New ethics questions
- Government data sharing
- The Silicon Valley culture wars
- Security and freedom
The role of AI in warfare
Diane Greene, former Google Cloud CEO
WE’LL STOP BUILDING AI FOR WEAPONIZED SYSTEMS
In April 2018, a group of over 3,100 Google employees wrote a letter to the company’s CEO in protest of its work on Project Maven, a military AI project that would help drone warfare tools become more accurate.
The letter explained, “We believe that Google should not be in the business of war. Therefore we ask that Project Maven be cancelled, and that Google draft, publicize and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology… This plan will irreparably damage Google’s brand and its ability to compete for talent.”
Former Google Cloud CEO Diane Greene responded to workers’ concerns by announcing in June 2018 that the company would fulfill the requirements of its existing Department of Defense contract for Project Maven, but that it would decline to pursue follow-on contracts or similar projects in the future.
“Earlier today Google released a set of AI principles designed to provide clarity about the use of Google AI technology. As part of that, we announced that Google, including Google Cloud, will not support the use of AI for weaponized systems… While this means that we will not pursue certain types of government contracts, we want to assure our customers and partners that we are still doing everything we can within these guidelines to support our government, the military and our veterans.”
Google’s measured stance sought to retain trust and good will on both sides of the issue — workers and consumers on one end, and the government and military on the other.
The $9M Project Maven contract expired in March 2019.
Satya Nadella, Microsoft CEO
MICROSOFT WON’T WITHHOLD TECHNOLOGY FROM DEMOCRATIC GOVERNMENTS
In February 2019, dozens of Microsoft workers wrote a letter demanding the company cancel its $479M contract to provide technology for the DoD’s Integrated Visual Augmentation System, an augmented reality headset made for the battlefield.
“We are a global coalition of Microsoft workers, and we refuse to create technology for warfare and oppression. We are alarmed that Microsoft is working to provide weapons technology to the US military, helping one country’s government “increase lethality” using tools we built. We did not sign up to develop weapons, and we demand a say in how our work is used.”
Three days after the workers’ letter, Microsoft CEO Satya Nadella made it clear the company would move forward with its augmented reality contract.
“We made a principled decision that we’re not going to withhold technology from institutions that we have elected in democracies to protect the freedoms we enjoy… We were very transparent about that decision and we’ll continue to have that dialogue [with employees].”
Two months later, the Army unveiled the headsets to the public.
Palmer Luckey, Oculus founder & Anduril co-founder
IT’S OUR DUTY TO MAKE THE US A LEADER IN AI WARFARE
Oculus founder Palmer Luckey co-founded the defense company Anduril in 2017 to help the US military stay ahead of rival states such as Russia and China in the military technology arms race.
“I knew that we needed more companies with smart people and lots of investment working to make sure that Russia and China don’t dictate the future of warfare, so they don’t dictate the norms behind how artificial intelligence is used, behind how cyber warfare tools are used… And I felt like I had a responsibility to do something with the money that I had made that would make a difference.”
Luckey argues that the United States’ commitment to ethics and democracy means it will use its position as a leader in AI more responsibly than other countries that are developing advanced military technology.
“We have to realize that countries like China are weaponizing artificial intelligence and using it not just to create totalitarian police states in their own countries, but exporting that technology to other countries that are going to use it to build their own totalitarian police states… When you give a government really advanced technology and there aren’t any safeguards in place against the way you use it and there aren’t any thoughts about the ethics behind it, you’re going to end up trending towards building a police state. The United States is a very different place.”
New ethics questions
Marc Benioff, Salesforce CEO
TECH EXECUTIVES NEED AN ETHICS TEAM
In June 2018, Salesforce workers circulated a petition asking the company to re-examine its contract with US Customs and Border Protection (CBP) in light of the agency’s detention of children at the southern border. CBP had been using Salesforce software to improve its recruiting process.
Salesforce CEO Marc Benioff decided to keep the contract, but felt a need for a new team to handle similar ethical concerns moving forward.
“[Salesforce employees] ask me questions I don’t have the answer to and I don’t have the authority or understanding to be able to opine on… I said I need a team that I can pivot to to say, “What is the right thing to do here?” And I’m like, it’s crazy that we don’t have a team like this. And it’s crazy that no company does.”
In December 2018, Benioff hired Omidyar Network’s Paula Goldman as Salesforce’s first chief ethical and humane use officer.
Jeff Bezos, Amazon CEO
EXECUTIVES NEED TO MAKE THE RIGHT DECISION, EVEN WHEN IT’S UNPOPULAR
Amazon CEO Jeff Bezos has expressed surprise that companies like Google are pulling back from military contracts — and perhaps even that tech workers were critical of these relationships to begin with.
“If big tech companies are going to turn their back on the US Department of Defense, this country is going to be in trouble… It doesn’t make any sense to me… One of the jobs of the senior leadership team is to make the right decision, even when it’s unpopular.”
From Bezos’ perspective, technological advancements are neither good nor bad. Rather, it’s up to the people who use them to make sure they are doing so in a way that benefits society.
“The reason that we have pulled ourselves up as a species — by our bootstraps — is because we have continued to make technological progress… Technologies always are two-sided. There are ways they can be misused… The book was invented and people could write really evil books and lead bad revolutions with them and create fascist empires with books. That doesn’t mean the book is bad.”
Brad Smith, Microsoft president
THE TECH SECTOR’S EXPERTISE IS CRUCIAL TO ETHICAL AI USE
In October 2018, Microsoft president Brad Smith released a blog post affirming the company’s commitment to outfitting the government with “the best technology we create.” In it, he made the case that the expertise of Microsoft engineers would be key to helping society make sound decisions about how we use AI technology.
“Artificial intelligence, augmented reality and other technologies are raising new and profoundly important issues, including the ability of weapons to act autonomously. As we have discussed these issues with governments, we’ve appreciated that no military in the world wants to wake up to discover that machines have started a war. But we can’t expect these new developments to be addressed wisely if the people in the tech sector who know the most about technology withdraw from the conversation.”
Andy Jassy, Amazon Web Services CEO
GOVERNMENT IS RESPONSIBLE FOR REGULATING TECHNOLOGY
In 2018, Amazon faced backlash from its employees over its decision to market its facial recognition software, Rekognition, to government and law enforcement agencies, including US Immigration and Customs Enforcement (ICE). In a letter, employees expressed their concern that Rekognition would be used to surveil and harm marginalized members of society, including black activists and undocumented immigrants.
Amazon Web Services CEO Andy Jassy responded to these concerns in November 2018 by pledging that the company would cut off service to customers if it felt they were violating people’s constitutional rights. Further, he expressed his perspective that it was up to the government to set guidelines for how companies should use technology.
“You want to make sure that people use the technology responsibly, and we have a set of terms and services in AWS. And with all our services, including Rekognition, where if people violate those terms of service and don’t use them responsibly, they won’t be able to use our services any longer. In fact, if we find people are violating folks’ constitutional rights, they won’t be able to use the services any longer. I also think, by the way, in a democracy [it] is also often the role and the responsibility of the government to help specify what the guidelines and regulations should be about technology. And if and when that happens, we will abide by those as well.”
In August 2019, an ACLU test found that Rekognition incorrectly matched 26 California elected officials to mugshots from a 25,000-image database. The advocacy organization seeks to ban the use of facial recognition in police body cameras.
Government data sharing
Chris Sonderby, Facebook VP & deputy general counsel
WE CAREFULLY SCRUTINIZE GOVERNMENT REQUESTS FOR INFORMATION
The conversation surrounding Silicon Valley’s relationship with the government is, in a sense, a sequel to the conversation that occurred in 2013, after leaks revealed that the US National Security Agency (NSA) had been using a tool called PRISM to collect private data from major tech services.
But while many activists were outraged, there was not a mass mobilization of workers the way there has been in more recent years.
Facebook has responded to the public’s concerns by releasing a report on government requests for its users’ data every year since 2013.
“As we’ve shared in previous reports, we carefully scrutinize every government request we receive to protect the information of the people who use our services. Each request must be legally sufficient, and if a request appears to be defective or overly broad, we push back and will fight in court, if necessary. This is true no matter which government makes the request. We’ll also keep working with partners in industry and civil society to encourage governments around the world to reform surveillance in a way that protects their citizens’ safety and security while respecting their rights and freedoms.”
Katherine Adams, Apple senior VP & general counsel
WE’RE WORKING TO TRAIN LAW ENFORCEMENT AGENCIES
It remains to be seen whether tech workers will push back against government data sharing in this highly politicized moment.
In September 2018, Apple — which has historically positioned itself as a leader in privacy — announced that it was creating an online portal to help law enforcement more easily submit requests for its information.
“We are building a team of professionals dedicated to training law enforcement officers globally, which will significantly increase our ability to reach smaller police forces and agencies. This will include the development of an online training module for officers. This will assist Apple in training a larger number of law enforcement agencies and officers globally, and ensure that our company’s information and guidance can be updated to reflect the rapidly changing data landscape.”
The Silicon Valley culture wars
Jeffrey Sonnenfeld, The Yale Chief Executive Leadership Institute founder & president
SILICON VALLEY COMPANIES ARE A DIFFERENT KIND OF INVESTMENT
One of the biggest reasons tech companies have faced so much employee pushback is that Silicon Valley has long billed itself as an economic sector committed to social progress. Yale’s Jeffrey Sonnenfeld suggests that this altruistic culture makes big tech companies a different sort of investment opportunity than other businesses that are driven purely by profit.
“Anybody who invested in these companies who harbored the misconception that these companies will do anything for money should quickly sell their stock and buy into a tobacco company… That’s not what these companies stood for from the beginning. They stood for values beyond the quickest short-term buck that can be made.”
Alex Karp, Palantir CEO
MILITARY SKEPTICS ARE OUT OF TOUCH WITH EVERYDAY AMERICANS
Amid the tech sector’s growing skepticism of military and security state partnerships, a countervailing force has risen in response. Led by PayPal co-founder Peter Thiel, these executives embrace the opportunity to work with the US military, which they believe provides the security necessary for the entire tech industry to flourish.
Palantir CEO Alex Karp, whose data analytics company frequently collaborates with the security state, believes that the rest of the tech industry is out of step with how most Americans perceive the military. Despite recent protests from his employees over the company’s work with ICE, Karp has remained steadfast in his support of Palantir’s government contracts.
“Now Silicon Valley is creating micro communities that break the consensus of larger society while simultaneously telling the average American, “I will not support your defense needs,” and then selling their products that are adversarial to America… That is a loser position. It is not intelligible. It is not intelligible to the average person. It’s academically not sustainable. And I am very happy we’re not on that side of the debate.”
James Baron, Yale School of Management professor
SILICON VALLEY FIRMS NEED TO KEEP SCARCE TALENT HAPPY
Tech workers’ position in the labor market gives them the leverage to hold their employers socially responsible.
As Yale professor James Baron notes, tech companies are reliant on highly skilled computer programmers who can make or break their businesses. If even a handful of high-end engineers are turned off by a company’s work with the DoD or ICE, it could cost the firm millions of dollars in productivity.
“These tech companies are all extremely dependent on scarce talent… It would not serve companies well that are struggling mightily to attract top talent, to engage in actions that would antagonize employees and have them feel that their ability to express themselves would be forfeited upon their employment there. Other employers that are less dependent on top talent might be able to get away with a hard-line stance.”
Steve Conine, Wayfair co-founder
WE’RE NOT TRYING TO TAKE A POLITICAL SIDE
If one thing is clear, it’s that it will be increasingly difficult for tech companies to avoid grappling with their relationships with the government.
This past June, 500 Wayfair employees walked out of the company’s Boston headquarters after it was revealed that Wayfair had sold $200,000 worth of beds to immigrant detention centers along the southern US border.
The company’s co-founder, Steve Conine, made the case that it’s inappropriate for workers to make their workplace a site of political struggle.
“The level of your citizenship as citizens is really the appropriate channel to try and attack an issue like this. To pull a business into it — we’re not a political entity. We’re not trying to take a political side.”
Alexandria Ocasio-Cortez, congresswoman for New York’s 14th district
WORKERS HAVE THE POWER
Congresswoman Alexandria Ocasio-Cortez disagreed with Conine’s assessment. She took to Twitter to applaud Wayfair employees for using their collective power to demand more ethical behavior from their bosses.
“Wayfair workers couldn’t stomach they were making beds to cage children. They asked the company to stop. CEO said no. Tomorrow, they’re walking out. This is what solidarity looks like — a reminder that everyday people have real power, as long as we’re brave enough to use it.”
Wayfair responded to the protest by donating $100,000 to the Red Cross, but it affirmed its commitment to fulfill the orders.
Security and freedom
Phebe Novakovic, General Dynamics CEO
THE MILITARY GIVES TECH COMPANIES THEIR FREEDOM
General Dynamics CEO Phebe Novakovic believes that the strength of the US government gives tech companies the freedom to pursue profits and innovation.
Where idealistic Silicon Valley companies remain ambivalent about working with the military, traditional defense contractors like General Dynamics are more than happy to take on its business.
“I’m frankly alarmed when I see some companies, to whom much is given, not want to work with the US government… Who do they think provides them this freedom? Where do they think the platform for their technology and innovation comes from? It comes from the security and stability of this nation.”
Peter Thiel, PayPal co-founder and Palantir chairman
TECH COMPANIES CAN CREATE PEACE BY SUPPORTING STRONG NATIONAL SECURITY
Thiel has been outspoken in favor of tech-military collaboration for years, well before the recent wave of debates on the issue.
In 2016, he argued that a strong national security apparatus can help prevent the US from going to war.
“While households struggle to keep up with the challenges of everyday life, the government is wasting trillions of dollars of taxpayer money on faraway wars. Right now we’re fighting five of them, in Iraq, Syria, Libya, Yemen, and Somalia… If we can pinpoint real security threats, we can defend ourselves without resorting to the crude tactic of invading other countries.”