Arjuna Capital, 1 Elm Street, Manchester, MA 01944 has represented that it is the
beneficial owner of at least $2,000 in market value of Twitter’s common stock and has given notice of its intention to present the proposal below at the Annual Meeting. The proposal and the proponent’s supporting statement appear below.
The board of directors opposes adoption of the proposal and asks stockholders to review
our opposition statement, which follows the proponent’s proposal and supporting statement.
Proposal and Supporting Statement by Stockholder Proponent
RESOLVED
Shareholders request that Twitter Inc. (“Twitter”) nominate for the next Board election at
least one candidate who: (a) has a high level of human and/or civil rights experience and is widely recognized as such, as determined by the Board, and (b) will qualify as an independent director.
SUPPORTING STATEMENT
Shareholders believe Twitter requires expert, board-level oversight of civil and human rights issues to assess risk and develop strategies to avoid causing or contributing to widespread violations of human or civil rights, such as voter suppression, disinformation and hate campaigns, or violence.
Twitter reports “…if we are not able to address user concerns regarding the safety and
security of our products and services or if we are unable to successfully prevent or mitigate…abusive…behavior on our platform, the size of our engaged user base may decline.”
White supremacists were responsible for the most domestic extremist violence since 1995 –
39 out of 48 deaths in 2019. Henry Fernandez of the Center for American Progress said, “The muted efforts of giant social media companies to address racial violence and hate crimes perpetuated via their platforms have had terrible consequences,” noting
“white nationalist rhetoric being fueled on social media leading to real-world violence including mass killings in El Paso, Texas; Gilroy, California; and, Christchurch, New Zealand.”
In October 2020, the Chairwoman of the Subcommittee on Cybersecurity, Infrastructure Protection, and Innovation called out Twitter’s use by malicious actors attempting to silence Black voters and sow racial division, requesting disclosure of “measures put in place to counter voter suppression, interference, and disinformation targeting Black voters.” A Senate report on the role of Russia and social media platforms in United States elections concluded, “No single group of Americans was targeted by information operatives more than African Americans.”
Twitter enabled police to surveil Black Lives Matter protests through a data startup, “a practice that potentially exposes people — particularly Black, Indigenous, and people of color — to further surveillance and state violence.”
Amnesty International revealed a “shocking scale of online abuse against women,” with “troublesome” tweets sent once every 30 seconds on average, “disproportionately targeting black women…contributing to the silencing of already marginalized voices.”
Ranking Digital Rights reports: “Facebook, Google (YouTube), and Twitter lack oversight
and risk assessment mechanisms that could help them identify and mitigate the ways that their platforms can be used by malicious actors to organize and incite violence or manipulate public opinion.”
As fiduciaries, our Board is responsible for stewardship of business performance and long-term strategic planning in light of risk factors like widespread violations of human and civil rights.
The Company’s Statement of Opposition
Promoting healthy conversation is built into every facet of Twitter’s culture, including
our product and policy implementation and design. From our work to prevent dehumanizing speech, to the ways in which we leverage tech to tackle abuse, and our work with industry peers to counter terrorism, the impact we have made in these areas
continues to grow and evolve as the public conversation around us does too. Using a combination of product, technology, and human review, we are scaling our efforts and the sophistication of our review at the Tweet and account level so that we can take increasingly proactive action and reduce the burden on our customers.
We have a Human Rights team at Twitter that thinks through and works on human rights issues
for the company every day, and we have established a Civil Rights Task Force. We also look to outside experts — safety advocates, academics and researchers, public comment periods, expert organizations, and community groups — to help us evaluate
our products, policies, and programs. Much of this work happens through the Twitter Trust and Safety Council, its issue-specific advisory groups, and special ad-hoc groups that