
Responsible Tech Guide

The people, organizations, and ideas of the Responsible Tech ecosystem and actionable ways to get involved.

There's a vibrant Responsible Tech community.

ALL TECH IS HUMAN | 2023

Welcome to the Responsible Tech Guide!

The Responsible Tech Guide is designed to provide an overview of the people, organizations, and ideas of the growing Responsible Tech movement.

In order to build a tech future aligned with the public interest, we need a diverse and robust responsible tech ecosystem that promotes knowledge-sharing and collaboration and moves at the speed of tech to tackle wicked tech and society issues.

The purpose of the Responsible Tech Guide is to help you learn about the ecosystem and find actionable ways to get involved.

The Responsible Tech Guide is the flagship resource of All Tech Is Human, a non-profit organization dedicated to making a more connected, inclusive, and equitable environment to collectively approach complex tech and society issues. Together we can build a better tech future.

You can find the latest version, along with additional resources like our responsible tech org list, at responsibletechguide.com!

Table of Contents

01 Overview and Welcome
The Five Subject Matter Areas of Responsible Tech, About All Tech Is Human, Ten Principles, Welcome Letter | Pages 5-10

02 Getting Involved
Common hurdles, ways to get involved, diverse range of backgrounds, ways to affect positive change, what a better tech future looks like, and responsible tech activities | Pages 11-28

03 Profile Interviews
Learn from others about their roles in responsible tech, their career trajectory, and issue areas crucial to building a better tech future | Pages 29-59

04 Five Areas of Responsible Tech
Responsible AI, Trust & Safety, Tech & Democracy, Public Interest Tech, and Youth, Tech, & Wellbeing, plus our list of contributors | Pages 60-94

05 What We're Learning
Highlights from recent All Tech Is Human panels, reports, podcast series, and key takeaways | Pages 95-107

06 Staying in Touch
Learn about All Tech Is Human's team and how to stay in touch! | Pages 108-110

Five subject matter areas of responsible tech (see Page 60)

1. Responsible AI
Responsible AI is at the forefront of ethical technology development. It emphasizes the need for transparency, fairness, and accountability in AI systems. Our community of ethical practitioners works to ensure applications of AI benefit society without causing harm or reinforcing bias. In the responsible technology space, it's important to craft guidelines, implement robust testing, and advocate for policies that prioritize ethical considerations while respecting human rights and equity.

2. Trust & Safety
Trust & Safety teams play a vital role in maintaining the integrity of online platforms and digital spaces. Practitioners are not only dedicated to combating harmful content, disinformation, and cyber threats, they also work to foster an environment of trust among users. The Trust & Safety field is rapidly evolving and grappling with difficult tradeoffs and challenges in promoting a secure and trustworthy digital ecosystem.

3. Tech & Democracy
The intersection of technology and democracy involves the collective endeavor of safeguarding our digital public squares and the legitimacy of democratic institutions. In the complex interplay between technology and democratic processes, practitioners in this intersection advocate for transparency and election integrity, safeguard against foreign interference, and ensure our digital spaces remain open for discourse and civic engagement. This work is pivotal to preserving the principles that underpin our society.

4. Public Interest Tech
Public Interest Tech involves the championing of technology serving the greater good. Practitioners in this space work to address societal challenges through innovative, sustainable solutions, with expertise spanning issue areas like civic tech, data, and digital accessibility. This work involves cross-sector collaboration with governments, civil society, and communities to create digital tools and policies that prioritize wellbeing and equitable access for all.

5. Youth, Tech, & Wellbeing
The intersection of youth, tech, and wellbeing features a range of stakeholders steeped in the unique and emerging challenges between young people and technology. There is an emphasis on digital literacy, online safety, and the responsible use of technology, particularly in the realm of social media. Youth advocacy in the space promotes meaningful tech legislation, digital literacy, and wellbeing tools, ensuring mental and emotional health while navigating the digital landscape.

About All Tech Is Human

All Tech Is Human is a non-profit organization based in NYC committed to coalescing people and ideas to tackle wicked tech and society issues at the speed of tech. We believe we can build a better tech future by diversifying the people involved in the process, having a more cohesive and collaborative responsible tech ecosystem, and creating a conducive environment for considering technology's impact at a much faster pace.

Since our founding in 2018, we have directly interacted with thousands of individuals from a broad range of backgrounds around the world, through activities centered in multistakeholder convening and community building, multidisciplinary education, and the diversification of the traditional tech pipeline with more backgrounds, disciplines, and lived experiences.

We simultaneously learn from the responsible tech community while influencing its future. Our activities include operating a Slack group of over 6k members across 77 countries, a talent pool of over 1.7k members, a global mentorship program, regular summits and mixers, and much more. We are highly participatory and naturally inclusive, weaving together the emerging with the established to strengthen the ecosystem and seed the next generation. See all of our projects here, and learn about our theory of change here.

All Tech Is Human’s activities are entirely free — thereby creating a low
barrier for entry — through the support of the Patrick J. McGovern
Foundation, Schmidt Futures, and the Siegel Family Endowment.
The ten principles of
All Tech Is Human

1. The future of technology is intertwined with the future of democracy and the human condition.
2. In order to align our tech future with the public interest, we need to involve the public.
3. We need collective action in tech, not just individual thought leadership.
4. No application without representation — not about us without us.
5. Combining multiple stakeholders, disciplines, and perspectives requires an agnostic space for understanding and knowledge-sharing.
6. People often struggle to "find the others" and discover the wide variety of people and orgs committed to co-creating a better tech future.
7. Technology is not just for technologists; we need all disciplines involved.
8. Top-down models have power but often lack a diversity of ideas; grassroots models have ideas but often lack power. We unite these models.
9. Tech innovation moves fast, while our consideration of its impact often moves slow. We need to reduce the gulf between these.
10. There is a growing awareness of the root causes of our current dilemma, but limited action toward understanding values, trade-offs, and best paths forward.

We cannot align our tech future with the public interest unless we
actively involve the public. All Tech Is Human’s approach brings
together people of all backgrounds and skill levels to learn from
each other, build community, and co-create a better tech future.
Find out more at AllTechIsHuman.org

It's time for a better approach to tackling wicked tech & society issues
As a society, we are facing a slew of complex tech and society challenges
that evolve each day. Whether it’s understanding the impact of generative
AI, reducing harms online, or considering emerging technologies’ effect on
our civil liberties, the problem space feels endless. But one question
remains: What can we do to ensure our tech future works for all of us?

All Tech Is Human has built a better approach for tackling wicked tech and
society issues. Learning from our interactions with tens of thousands of
individuals around the world through our activities (our Slack community,
mentorship program, summits, mixers, and working groups), we are
disrupting the current approach to tech problem-solving (that is not
working). The three main problems we are committed to resolving are:

Tech innovation outstripping our ability as a society to understand its impacts and create necessary guardrails.
The lack of clear pathways or adequate support for the diverse range of professionals wanting to engage the responsible tech ecosystem.
The ecosystem's inability to adequately leverage collective intelligence and collaboration.

It's time for a better approach to tackling wicked tech & society issues (continued)

The Responsible Tech Guide is designed to address and offer remedies to
the three problems, in conjunction with our activities centering
multistakeholder convening and community-building, multidisciplinary
education, and the diversification of the traditional tech pipeline with
more backgrounds, disciplines, and lived experiences. We need to move
at the speed of tech when considering the impacts of technology, elevate
new voices and perspectives into the ecosystem, and promote greater
knowledge-sharing and collaboration.

Tech and society issues will never be resolved by relying on the wisdom
of a small sliver of society. Social media and emerging technology have
profound impacts on our lives, so it behooves us to have an approach
that incorporates these viewpoints.

The good news? There is a vibrant community committed to aligning our technology and its related policies with the public interest. The Responsible Tech Guide is the premier resource to discover and amplify the people at the forefront of this movement.

Let’s co-create a better tech future.

DAVID RYAN POLGAR


Founder and President
All Tech Is Human
New York, New York
david@alltechishuman.org

Three common hurdles to getting more involved

01 Where to start?
It can be overwhelming to understand what the responsible tech ecosystem looks like and determine how to get more involved. Individuals often get inspired by a book, a movie, or a personal experience as a catalyst to making a positive difference in our tech future; however, they may struggle to understand how.

02 Finding community
It can oftentimes feel like a solitary pursuit for individuals who want to make a difference, so it is essential to illuminate pathways into the ecosystem. There are thousands of people and organizations committed to the responsible tech movement.

03 Getting support and mentorship to grow in responsible tech
Many struggle to find the necessary support to better understand the ecosystem, expand their network, upskill, and find career opportunities in responsible tech. Our activities at All Tech Is Human shine a light on this issue area and offer solutions.
Ways to get involved with All Tech Is Human


01 Join a working group: We mix the emerging with the established in responsible tech in our working groups.

02 Join the Slack community: Share resources, discover new jobs and events, and meet others.

03 Attend a mixer: Thousands have come together in-person around the world through All Tech Is Human.

04 Use our free career resources: We offer a robust job board, talent pool, support materials, and more.

05 Participate in our mentorship program: Be paired with a mentor based on topical interest and geographic location.

06 Attend a livestream: Our community is intentionally global; individuals are able to gather virtually.

07 Read our reports and hub resources: Dive deep into a number of organizations and resources in responsible tech.

08 Contribute your voice: By participating in our activities, you add to the collective intelligence of the responsible tech community.

No matter your skill level, you are needed

Starting out: Students or career-changers just getting started in Responsible Tech.
For young people and students, we created a global Responsible Tech University Network with intentional agnosticism to disciplines. For career-changers, we recommend reading our resources to understand the ecosystem and find support through mentorship, mixers, and the Slack community.

Mid-career: Some level of experience in Responsible Tech.
Many in our community seek mentorship while mentoring others getting started in the field. By getting involved in our working groups, attending our gatherings, and using our free resources, they can find their niche within the responsible tech ecosystem.

Established: Experienced in Responsible Tech.
Individuals with years of experience in Responsible Tech pay their knowledge forward by mentoring others, speaking at our gatherings, and getting interviewed for our reports.

We need a diverse range of backgrounds in the responsible tech ecosystem

Computer Science + Engineering: How can I develop technologies responsibly?
Environmental Studies: How are technologies and computing altering our environment?
Anthropology: How does technology and culture influence one another?
International Relations: What role can technology play in international affairs and politics?
Economics: In what ways can we balance responsible innovation and economic growth?
Law: How can we ensure the legal protections of individuals in digital spaces?
Digital Design: What is the impact of thoughtful design on technology?
Education: How will technology shape the way we learn?
Statistics: How can we demystify the statistical foundations of "AI" and "Big Data"?
Information Science: How is information stored and disseminated online?
Philosophy: How can we harness theories of philosophy and ethics to better shape technology?
Sociology: In what ways does technology impact our social organizations and relationships?
Psychology: How can technology influence our minds and behavior?
Community Development: How can communities leverage technology for equity and access?
Health: What role can technology play in developing a more equitable healthcare system?
Art: How does technology shape the way we view art and media?
Social Work: How can we apprise individuals about their digital rights and protections?
Policy: As technology becomes more ingrained in society, how can policy evolve to reflect the voice of citizens?

We can't solve complex tech and society issues alone. Instead, we must incorporate these disciplines and backgrounds to ensure a better understanding of the evolving ways technology impacts us — and the options for improving the current situation.

Three ways to affect positive change

01 Change happens from the inside.
Having more socially responsible and ethically-minded tech companies hinges on tech workers making a difference from the inside. Our aim is to diversify the pipeline of talent for better outcomes.

02 Change happens from the outside.
We need continued research and greater oversight in tech spaces. There are civil society organizations, think tanks, university outlets, and governmental bodies that play an important part in the responsible tech ecosystem. We maintain a list of over 500 organizations at alltechishuman.org.

03 Change happens from reimagining potential tech futures.
While a good portion of the Responsible Tech community focuses on improving our current approach to technology, there are artists, designers, entrepreneurs, technologists, and more thinking through and reimagining possibilities.
Creating a tech future that is aligned with the public interest


What’s your vision for a better tech future?

One of the problems with our current approach to these challenges is that we allow a small sliver of society to determine the design, development, and deployment of technologies that impact society at large.

Here at All Tech Is Human, we say "no application without representation." In other words, if you are impacted by technology, you deserve some modicum of control or a mechanism for input.

This means our tech future should be determined collectively, not by a select group that may have values, concerns, and visions of a future that are out of step with the public's general consensus.

The responsible tech movement needs to allow for speaking, but also listening. A few voices should not drown out many.

What does a "better tech future" look like?

Responses to the question, "What does your better tech future look like and what can we do to achieve it?" surfaced eight distinct categories related to a perceived root cause of today's problems or an avenue for improving systems and structures:

Better Tech Education
Proactive Tech Policy
Underlying Systemic Tech Issues
Multi-stakeholder Collaboration on Tech Issues
Diversifying the Tech Pipeline
Human Flourishing Alongside Tech
Greater Tech Oversight
Reimagining Tech Futures

What does a "better tech future" look like?


A better tech future is one where communities – especially
historically overlooked communities – are centered in the design,
development, and deployment of technology. Even more so, humans’
relationship with technology has changed because humans’
relationships with each other have changed. Cross-sector endeavors
and cross-functional teams allow people with myriad skillsets to
successfully collaborate to prioritize problems, identify opportunities
for improvement, develop technology while mitigating systemic bias,
and implement tech and data solutions that address community-
identified needs. Technical innovation and complexity continues to
progress at incredible rates, and access to advanced technology is
widespread.

A better tech future also includes structures that provide enough support for people and organizations to positively impact the world in sustainable ways. This means the world has business structures that allow both for-profit and not-for-profit organizations to benefit when they design for equity and inclusion.

Additionally, policies and regulations encourage strong technical talent in all sectors. Finally, investors fund not just technical innovation but also process innovation, and support incremental improvements in addition to new products and services.

AFUA BRUCE, PRINCIPAL, ANB ADVISORY GROUP LLC
CO-AUTHOR OF THE TECH THAT COMES NEXT

What does a "better tech future" look like?

[The future of technology] should inspire us to human creativity. It should let us do the things that bring us joy, whether that's art or literature. It should inspire us to create more and better and augmented things. It should create shared economic opportunity, which means that as these technologies are creating new models for how we might build economic systems, we need to move to a world where there's a cornucopia of bounty and everybody shares in it equally.

We need to live in a world where technology transforms political power, where individuals understand what's happening and are able to use technology to influence policy. I have a deep hope and confidence that if we empowered communities to be a part of shaping our shared destiny, we'd live in a world where all of these things I've just described would happen naturally.

So really, to me, the best form of a technology future is one where people are at the center and have the ability to make decisions that create a dignified future for all of us.

[From our podcast series archive]

VILAS DHAR, PRESIDENT OF THE PATRICK J. MCGOVERN FOUNDATION

What does a "better tech future" look like?

A better tech future must be centered on public interest values that shape both the development of technological innovation, and the public policy that sets guardrails around it. The public cannot afford to be passive consumers of technology any longer. Internet centered innovation provides the public with great power to communicate, create, share, and form bonds in community.

This democratic power to communicate never existed before the internet. The public also has distributed responsibility for the use of that power, especially when it is used to create harm. Values like free expression, competition and consumer choice, privacy and control of individual data, quality content moderation, and affordable access, and other values have to be as important as a tech company's bottom line. It will take the public will, through democratic government and civil society organizations, to set this expectation in tech innovation...and to enforce it through regulation that is specific and flexible as innovation develops.

CHRIS LEWIS, PRESIDENT & CEO AT PUBLIC KNOWLEDGE

Three ways to build a better tech future

01 Creating a more cohesive ecosystem
Having a more cohesive responsible tech ecosystem that promotes knowledge-sharing and collaboration leverages collective intelligence and participation. Our activities are designed to bring together people across civil society, government, industry, and academia.

02 Moving at the speed of tech
Our current issues stem from tech innovation moving much faster than society's ability to grapple with its ramifications and determine appropriate guardrails and policies. Our ability to consider the impacts of technology needs to move at the speed of tech.

03 Getting new voices into the responsible tech ecosystem
Complex tech and society issues require a diverse range of backgrounds, disciplines, and perspectives involved. Too often, individuals with valuable insight and ideas are not sitting at the proverbial table; this needs to change.
There is a global network focused on tackling wicked tech & society issues

If you are looking to get more involved in the growing responsible tech movement, All Tech Is Human has multiple in-person and virtual options. Our Slack community now has over 6k members across 77 countries. Individuals in our Slack are sharing resources, learning from each other, and meeting in person in cities around the world.

Our open working groups feature a mix of established leaders and emerging voices from various backgrounds across the globe, and our mentorship program has involvement from over 40 countries.

So, no matter the location, there is a community waiting for you. Find all of our projects here or at alltechishuman.org.


Mentorship Program
All Tech Is Human’s Responsible Tech Mentorship Program has continued to grow
throughout the last two years. In our 2023 cohort, 117 mentors representing 19
countries are leading 300 mentees representing 40 countries. We have had
nearly 1,000 mentees complete the program since its first cohort in 2021!

Below are just some of our incredible mentors who are paying their knowledge
forward to build a more robust responsible tech ecosystem.


Mentorship Program
The Responsible Tech Mentorship Program is a free program run annually to help
build the Responsible Tech pipeline. The program accomplishes this by
facilitating connections and career development opportunities among talented
students, career changers, and practitioners.

Our mentorship program cohorts represent people from a range of fields who
work in Responsible Tech all over the world. In 2023, we had mentors and
mentees from the following fields: Ethical AI, Digital Governance, Tech &
Democracy, Public Interest Technology, Research, UX Research, Product,
Responsible Technology in Healthcare, Tech Journalism, Technology & Wellbeing,
Trust & Safety, Privacy, and Tech Policy.

By expanding pathways for more disciplines and backgrounds to actively get involved, we believe we can help build a better tech future.

What does it look like to participate in the program?

The All Tech Is Human team reviews applications and creates mentorship pods
that consist of one mentor and three mentees. Mentors are fully vetted, and
many return to participate with us every year. Mentees are college students, new
grads, early and mid career practitioners, and seasoned professionals.

Mentors lead one 1-hour virtual meeting per month with an optional monthly curriculum provided by All Tech Is Human. Meeting topics include insight into what it looks like to be a responsible tech practitioner, career search advice, navigating the field, and more. Depending on what the pod looks like, mentors have the freedom to tailor the program however they wish. Some groups choose to work together on a group project over the course of the program, which might be an article, webinar, or podcast.

Mentors and mentees have opportunities to network with other participants through designated channels in the All Tech Is Human Slack, and virtual meetings and mixers.

How to Get Involved

If you're interested in applying to participate in the future, join the waitlist linked on the mentorship program page of our website to be notified when applications open for the next cohort. You can find additional info at AllTechIsHuman.org or reach out to our team.

Responsible Technology Career and Talent Pipeline


Transforming the Responsible Tech talent pipeline to make it more diverse, multidisciplinary, and aligned with the public interest is a crucial component of co-creating a better tech future. The homogenous sliver of society currently in the position to design, develop, and deploy the technologies that we use every day is failing to effectively and comprehensively navigate the complexities of these emerging technologies, leaving communities vulnerable to negative and unintended impacts. Our theory of change involves increasing the discoverability of the growing number and types of available opportunities and facilitating participation from talent with a significantly wider variety of backgrounds, disciplines, and lived experiences.

As the meta-connector for the people, organizations, and ideas of the Responsible Tech movement, All Tech Is Human is concerned with multi-stakeholder cooperation, bringing together talented job seekers alongside leading responsible technologists, practitioners, and hiring managers into one collaborative community. We feature roles that are focused on reducing the harms of technology, diversifying the tech pipeline, and ensuring that tech is aligned with the public interest.

We curate a list, updated daily, of hundreds of jobs, internships, and fellowships in fields like:
Responsible or Ethical AI
Public Interest Technology
Online Trust & Safety
Tech Policy
Data Privacy
Accessible & Human-Centered Design
Digital Governance
Youth, Tech, and Wellbeing
Tech & Democracy

"At All Tech Is Human, we are focused on illuminating career pathways for responsible tech practitioners and aspirants within our growing community and on connecting these passionate individuals with impactful positions throughout the Responsible Tech ecosystem."
- Rebekah Tweed, Executive Director

Learnings from the Job Board

Rebekah Tweed first started the Responsible Tech Job Board in September of 2020 in an effort to curate into a single resource the many disparate opportunities that constellate our shared center of gravity – tackling the thorny issues at the intersection of tech and society. This job board, now curated by Elisa Fox and expanded to regularly feature more than 500 opportunities, has quickly grown into a go-to resource for both applicants and hiring managers to understand the evolving field of Responsible Tech. We track roles across sectors, including:

Academia: Responsible Tech-oriented Faculty; University-based Research Institutes
Civil Society: Global NGOs; Non-profits; Think Tanks; Research Institutes; Philanthropic Foundations
Government: Federal; State; Local
Industry: Tech Industry; Other Industries (Finance, Textiles, Energy, Communication, Automotive, Pharmaceuticals, and more); Responsible Tech Startups; Global Consultancies

Despite the recent challenges in the hiring environment across the tech industry, there has been an uptick in available opportunities related to trust and safety and artificial intelligence, thanks to shifting priorities of many companies in the wake of the widespread availability of generative AI tools and the corresponding regulatory interest from policymakers across the globe. We expect to see early career opportunities grow as Responsible Tech departments within the industry continue to grow throughout the next year.

Our conversations with hiring managers provide insights into the skills, experiences, and educational backgrounds that are most highly sought after, and we incorporate these learnings into the advice we give to job seekers for how to best prepare themselves to become great candidates, secure these roles, and contribute to the field of Responsible Tech.

Responsible Tech Talent Pool & Matchmaking Service

All Tech Is Human offers a personalized Talent Matchmaking Service to connect hiring managers and recruiters with Responsible Tech talent within the All Tech Is Human community and our extensive network of talented individuals who are ideal for these hard-to-place roles, and we have a large Responsible Tech Talent Pool of job seekers who are interested in connecting with these employers!

26
ALL TECH IS HUMAN | 2023

This private platform is open to those University Network


currently looking for a role as well as
those already in a role but interested in With our Responsible Tech University
being notified about additional Network, we are aiming to connect all
opportunities in Responsible Tech: stakeholders engaged in the tremendous
level of activity happening at universities
Responsible AI across the globe around tackling wicked
Online Trust & Safety and Integrity tech and society issues.
work
Public Interest Technology In particular, our organization takes a
Tech Policy bottom-up approach that finds and
Digital Governance connects key professors, researchers,
Data Privacy career counselors, and student leaders
Accessible & Human-centered across a wide variety of disciplines,
Design uniting them all in a network that
Tech & Wellbeing promotes knowledge-sharing and
Tech & Democracy & more! collaboration and has the agility to move
at the speed of tech.
We have built up extensive relationships
with hiring managers, amassing Whether you are an undergraduate
invaluable insights into emerging careers, student, graduate student, or
desired skill sets, and an understanding professional pursuing a certification, All
of how we can better align individual Tech Is Human’s Responsible Tech
career paths, employer needs and job University Network is for all students and
descriptions, and new university prospective students who are entering
programs being launched to educate the or advancing in the field through the
next generation. pursuit of an academic degree, as well as
responsible tech practitioners, academic
With our Talent Matchmaking Service, faculty, and university staff who are
our business model (to offset our contributing to the field within the
reliance on foundations as a small non- academic sector.
profit) is to charge employers a 10%
finder's fee for successful matches The Responsible Tech University
which allows us to assist thousands of Network provides the opportunity to
individuals freely through our job board, connect with like-minded people across
mentorship program, office hours, campuses in a collaborative
mixers, and more. Cost should never be environment, to deepen engagement
a barrier to bring in new voices! and involvement in the nascent
responsible tech movement, and to
share ideas and compare perspectives
that are informed by the unique campus
experiences of each member of the
network.

The Responsible Tech University Network convenes advisory sessions around important topics as well as facilitates webinars and provides resources, including best practices for starting a student club or organizing an event (with resources for finding ideal speakers, such as our list of previous speakers), and providing support for members of underrepresented groups in tech.

The Responsible Tech University Network provides the opportunity to engage in important conversations within a like-minded community that spans stakeholder groups in a more effective convening, intentionally structured to break the silos that traditionally create friction in academic settings.

All Tech Is Human's University Network attracts students across the globe who want to create Responsible Tech student clubs, organize and host Responsible Tech events, expand the Responsible Tech movement on their own campuses, and learn more about the field – to network, to find peers and mentors, to deepen their understanding of the space and to explore the varied professional and educational options, as well as to give back to the larger community.

Our University Network provides an opportunity to connect with likeminded students across the globe who are often navigating without a clear sense of common degree programs that lead directly to jobs in the field of Responsible Tech. This lack of clearly defined academic pathways is why we have undertaken a Responsible Tech University Ecosystem Mapping Initiative to examine the many emerging degree programs at the intersection of tech and society, as well as the faculty who are researching these issues and teaching these courses, and the student clubs addressing topics in Responsible Tech. This resource is available on our website and serves as a helpful guide for students looking into academic programs for their undergraduate majors or trying to navigate where to head for grad school to pursue masters and doctoral degrees and graduate certificates.

As part of this initiative, we're also tracking professional certifications, which is an increasingly important component of the ecosystem as career changers look for upskilling opportunities. We find Responsible Tech degree programs offered within colleges and schools focused on a variety of disciplines, including: Philosophy, Public Policy, Law, Engineering, Computer Science, Data Science, Information, Design, Sociology, Anthropology, Social Work, Education, Communication, Business, Arts & Sciences.

Many students were first introduced to the Responsible Tech movement through a single, typically undergraduate, course. The University Network also provides an opportunity for interested students to join our group peer review initiative – we've partnered with Springer AI and Ethics Journal for students to gain experience of the anonymous peer review process and are preparing a Topical Collection of the Springer AI and Ethics Journal examining the impacts of artificial intelligence on children and youth, set for release in 2024.

Profile Interviews

Hear from individuals in the All Tech Is Human community on career advice, what a better tech future looks like, and more!

Our organization has featured and learned from hundreds of responsible tech professionals through our reports, summits, and more.
Alix Fraser
Director, Council for Responsible Social Media at Issue One

Tell us about your role and what success looks like for you.

I am the Director of the Council for Responsible Social Media at Issue One, a cross-partisan advocacy organization that is fighting to build a better online world that can enhance, rather than undermine, American democracy. In this role, I have the great fortune to lead the Council, which is a diverse and bipartisan coalition of leaders–including former politicians, tech insiders, national security and religious leaders, and impacted individuals, like parents and Gen Z advocates who have felt the negative impacts of social media firsthand. It's an incredible group of people to work with as we advocate for responsible social media safeguards that protect our kids, communities, and U.S. national security.

As an advocacy organization bringing Republicans, Democrats, and independents together, success is real change that impacts people's daily lives. That change can come in many forms, but we are primarily working to pass legislation and make policy changes that will fundamentally alter how social media, AI, or other technology impacts our lives and our democracy.

"My advice to anyone looking to improve our online world is that you should think about where you can add the most value to this work–whether that is as a researcher, journalist, data scientist, or advocate. Talk to the people doing this work and figure out where you would be best suited to make an impact and then go for it!"

Profile Interviews | Alix Fraser

How did you carve out your career, and what advice would you give to others wanting a similar role?

I found my way into responsible tech by continuing to follow my passion for democracy and politics and looking at where I could make the biggest impact. After working in government and trying to protect democracy around the world at the State Department, I moved into state and local politics working to improve our education system. Then I watched January 6th unfold and felt deeply compelled to find a way to improve the health of American democracy. For me, that started with the information environment and making social media a much healthier place. January 6th didn't happen overnight. While much blame has been put onto one individual, it is unlikely that the lies could have spread in a healthier online world that wasn't focused on maximizing engagement at any cost–including at the cost of American democracy.

My advice to anyone looking to improve our online world is that you should think about where you can add the most value to this work–whether that is as a researcher, journalist, data scientist, or advocate. Talk to the people doing this work and figure out where you would be best suited to make an impact and then go for it!

What advice would you give to individuals looking to be involved in the Responsible Tech ecosystem?

Whether you want to make this your full-time career, or just know that you want to spend some of your energy working to make technology better, safer, and healthier–do it! We need more voices, more volunteers, and more incredible professionals among our ranks to help create a world where technology enhances the human experience and lifts democracies around the world. That reality is possible, but the opposition to it is immense. The tech platforms are incredibly powerful and spend hundreds of millions of dollars to ensure that they continue to have free rein over our digital lives — and continue to rake in the money that follows. The only way for that to change is for people to step up, be brave, and fight to make this better–from inside these platforms to the halls of Congress and beyond.
Amira Dhalla
Director, Impact Partnerships and Program (Privacy and Security) at Consumer Reports

Tell us about your role.

Consumer Reports is an independent, non-profit that speaks directly to, and for, consumers across the nation, while standing up for the issues they care about. In my role, I work closely with organizations helping to advance similar missions to develop impactful and collaborative projects. These projects focus on how we have to improve the cybersecurity and privacy of products and tools in the marketplace, while also tackling topics like discriminatory technologies and the impacts algorithmic biases have in the systems that surround us.

"A better tech future depends on the people. Whether it is people combining power to create movements that improve technology for everyone, or it is individuals who are developing new technology to address the harm others have felt. I'm optimistic about people's abilities to hold each other and the systems accountable."

How did you pave your career in the Responsible Tech field?

As someone who's always been excited about the future of technology, I started my career early working at start-ups, where I got first-hand

Profile Interviews | Amira Dhalla

experience of what it was like to build, launch, and develop emerging tech. As I continued a career in technology, going beyond start-ups, I found myself chasing projects and roles that were guided by the issues faced by communities I cared about. Working with communities around the world, I was able to thread similarities in their concerns with the technology they use and how it impacts them on different levels. I realized the various threats emerging technology poses to historically oppressed people and have spent many days since then advocating for a more inclusive and safe digital world.

Where do you see the future of Responsible Tech headed?

In a recent Consumer Reports survey, we asked individuals who were responsible for protecting the privacy and security of individuals – not surprisingly, most selected the federal government (33%), which was followed closely by companies (32%) and individuals (25%). The future of Responsible Tech includes a framework for how we understand and practice accountability among those respective stakeholders. All of us have to be more accountable to how technology impacts the most vulnerable people using it and our respective role. In doing so, we will see more roles evolve that involve ethical decision-making that is directed towards companies or trust and safety teams that protect individuals. We will also see more resources and information that shows how our digital systems impact people without them even realizing it.

What does a better tech future look like to you?

One thing I'm looking forward to seeing in the future is how the term "Responsible Technology" is integrated in academic institutions. Schools are vital to the development of technology skills, but we're starting to see more conversations on how Responsible Tech becomes a core part of curriculum. Instead of just learning how to become a software engineer or a designer, a school can also have mandatory curriculum on how to reduce biases in code or prevent deceptive design patterns. In turn, we'll see a field that is filled with people who are creating audience-facing technology in a more responsible way.

Looking ahead, what makes you optimistic that we can build a better tech future?

A better tech future depends on the people. Whether it is people combining power to create movements that improve technology for everyone, or it is individuals who are developing new technology to address the harms others have felt. I'm optimistic about people's abilities to hold each other and the systems accountable.

Bilva Chandra
Technology and Security Policy Fellow at the RAND Corporation

In your opinion, what are the most important issues in Responsible Tech today?

The most important issues in Responsible Tech today are AI governance and mitigating present risks and harms from advanced AI systems, data privacy and preserving the rights of the individual online, and building robust regulatory frameworks to reduce technology misuse. This triumvirate of issues is often intersectional--for example, large-scale AI systems pose significant data privacy and security risks given new avenues for adversarial exploitation and have been built largely without external legal and regulatory oversight. The advent of generative AI has captured sweeping public attention about the benefits of its use, along with its potential to accelerate current societal harms--misinformation, discrimination and bias, cyber attacks, and much more. These systems largely depend on vast amounts of online data--including personally identifiable information--and are functionally inoperable without human data and human model-level intervention. We are at a critical juncture in history to protect against elevated risks and harms resulting from emerging technologies, to ensure an individual's right to privacy is not infringed while determining a path forward to utilize technology to benefit the whole of society.

"Diversity, inclusion, and belonging are not just helpful but essential in the AI safety domain, if we want AI systems to be less biased, less discriminatory, and more reflective of the global population. Similarly, representation is a national security concern, as it is how the best and the brightest minds can bring fresh perspectives into government and civil society."

Profile Interviews | Bilva Chandra

How did you carve out your career, and what advice would you give to others wanting a similar role?

I have carved my career with intention while enjoying the many unexpected twists, turns, and opportunities along the way. While I was in the Master's Security Studies Program (SSP) at Georgetown, learning about national security risks, Section 230 of the Communications Decency Act, and how the extremist far-right operates online, I had an epiphany that I wanted to work in tech. More specifically, work on the nexus of technology, policy, and national security to bolster online safety and write policies to govern new technologies. With that in mind, I pursued several roles after graduation, from busting malicious influence actors and preserving election integrity at LinkedIn to building a product safety pipeline for AI image generation at OpenAI.

The main piece of advice that I would give to anyone starting in this domain would be to focus on a values-driven career first--how will your work impact society for the better? Are you nourished by the culture and environment you are working in? Are you supported by the people around you? I do not like to perceive any opportunity as the end goal, but rather a piece of a broader journey towards career satisfaction and success. What you're working on today should be a building block for what you do 5 years from now--but it does not have to fit into a perfect box. My desires for my career have changed over time, but a constant goal has been to work on issues at the intersection of technology and society that matter in a values-driven environment.

Within your area of practice, who still needs to be included in your field?

The area in which I practice is largely dominated by men, both in the national security domain and in the AI domain. There must be a conscious effort to bring more women and people of color to the forefront of technology. Diversity, inclusion, and belonging are not just helpful but essential in the AI safety domain, if we want AI systems to be less biased, less discriminatory, and more reflective of the global population. Similarly, representation is a national security concern, as it is how the best and the brightest minds can bring fresh perspectives into government and civil society.

Brittney Smith
Senior Manager of Education Partnerships at the News Literacy Project

Tell us about your role and what success looks like for you.

For the News Literacy Project, success means building a movement for news literacy throughout American society, creating better informed, more engaged, and more empowered individuals, and ultimately a stronger democracy.

In my role, I collaborate with educators across the country to cultivate partnerships that encourage schools to teach their students news literacy skills. Schools are a crucial part of how we will build this movement. That means we need to support requirements for news literacy instruction as part of a robust civics education, with the same emphasis and importance as English or math classes. We also need to prepare educators and develop their skills in news literacy instruction.

"Just because young people are 'digital natives' doesn't mean that they know how to determine what makes information and sources credible. And it's not just adolescents and teens – people of all ages are struggling to know what's fact and what's fiction."

In your opinion, what are the most important issues in Responsible Tech today?

Misinformation and disinformation are existential threats to our democracy, and widely available

Profile Interviews | Brittney Smith

artificial intelligence technologies will likely only add to this problem. We need to teach everyone how to navigate today's information environment by becoming more news-literate. News literacy means knowing how to identify credible sources of news and other information.

Just because young people are "digital natives" doesn't mean that they know how to determine what makes information and sources credible. And it's not just adolescents and teens – people of all ages are struggling to know what's fact and what's fiction.

We can teach students and people of all ages how to spot misinformation and identify credible information. This requires supporting requirements for media literacy instruction in schools, so young people learn transferable skills that are flexible enough to apply to the ever-changing information and technology landscapes. It also requires building a movement for news literacy, so these skills are an integral part of American life.

Ultimately, we need to teach people how to think, not what to think.

Which individuals or organizations do you believe are doing impactful work in building a better tech future?

Educators, school librarians, and school district leaders across the country are doing impactful work to build a better tech future by ensuring that young people have news literacy skills to find credible information online and resist misinformation.

They have the support of many nonprofits like mine, the News Literacy Project, which provides free resources and training to educators in all 50 states. Currently, our resources have reached about 450,000 students and 60,000 educators. We also partner with organizations like AARP to bring our training and resources to the public.

Other champions in this area include the National Association for Media Literacy Instruction, which advocates media literacy instruction, and Media Literacy Now, which tracks legislative efforts to make media literacy instruction required in schools.

If you could travel back in time 5 to 10 years, what would you urge companies/govt to do in order to prevent some of the tech-related problems we are facing now in the 2020s?

If I could travel back in time five to 10 years, I would urge tech companies and the government to prevent the messy information space we live in today by taking the following actions proactively: First, requesting social media and tech companies to have clear community standards for their products, that are enforced at the very launch, and algorithms that reduce the spread of misinformation and promote credible information. Second, urging school leaders to require media literacy instruction beginning at kindergarten to prepare students to think critically and safely navigate our information environment.

Deepti Doshi
Co-Director, New_ Public

What does a better tech future look like to you?

A better tech future is one that is not controlled by capitalist interests. It's not necessarily one that only serves public needs, but one where there is an ecosystem of talent and funding that supports the creation of public social spaces online - that are welcoming to everyone, support people to be in healthy community with one another, and work together - without profitability as the motive.

"I think it's important to ask yourself, what's the problem facing people that you want to contribute to solving? I stumbled into tech because I was interested in how you help people build civic power by working together..."

How did you carve out your career, and what advice would you give to others wanting a similar role?

I think it's important to ask yourself, what's the problem facing people that you want to contribute to solving? I stumbled into tech because I was interested in how you help people build civic power by working together, and had created Haiyya in India to run grassroots door-to-door neighborhood campaigns. Through that work and in the wake of the Arab Spring, I saw the value of social media to helping ordinary people build power through relationships - and decided to go

Profile Interviews | Deepti Doshi

to Facebook to work on Groups. I began to feel the limits and pressures of an incentive system governed by advertising revenue, and so joined Eli Pariser and Talia Stroud to build New_ Public. But the problem I am trying to solve over the last decade has always been the same: how to help people build their power through relationships and community. Figure out what your question is!

Here at All Tech Is Human, we aim for a diverse range of disciplines to be involved in tackling wicked tech & society issues. In your opinion, what background(s) should be included more?

At New_ Public, we believe in and practice sociotechnical design — by that we mean that we bring together technical skills with practical knowledge about community formation, maintenance and governance. Individuals who have this practical experience are often under-represented in the product development process - even across responsible tech. But if one of the problems we are trying to solve online is how to support people to be together better, then we must turn to people - like pastors, neighborhood community leaders, as well as online moderators - who have this lived experience.

Everyone has a different motivation for being involved in the Responsible Tech movement. What is your motivation?

I come from a Jain family, and one really important tenet of our faith is Anekäntaväd, which means that multiple truths can exist at the same time. My belief in this principle brought me to this work, and my curiosity: how can we build technology and social media that serve as a space for different kinds of people who may hold different truths to be in healthy and functional communities, and build their power together?

I see community as the foundation of democracy. It's not the only thing a good democracy needs — we need reform across many pillars — but spaces for people to get to know people different from them is an important part that's decaying and often underlooked. And today, digital innovation is a large part of making it happen.

Evelyn Aswad
Co-Chair and Member of the Oversight Board

Tell us about your role and what success looks like for you.

I'm a Co-Chair and Member of the Oversight Board, which issues binding decisions on content on Facebook and Instagram as well as makes recommendations on a broad range of issues to both companies. In this capacity, success for me means applying oversight of these companies' content moderation over the speech of billions through a global Board that is independent of the companies and which applies an international human rights framework to all of its decisions. The Oversight Board remains a bold experiment in responsible tech. I think in our (just over) three years of existence, we are showing that through careful work such oversight is capable of increasing transparency in content moderation, promoting fairness in the treatment of users, and highlighting the utility of human rights principles as an ethical framework in this space.

"For me, a better tech future means more tech companies espouse the UN Guiding Principles on Business & Human Rights, which is a global framework that calls on corporations to respect international human rights standards in their business operations."

What does a better tech future look like to you?

For me, a better tech future means more tech companies espouse the UN Guiding Principles on
Profile Interviews | Evelyn Aswad

Business & Human Rights, which is a global framework that calls on corporations to respect international human rights standards in their business operations. As part of implementing this framework, companies are supposed to engage in human rights impact assessments before launching new products and services. These assessments should, among other things, include consultation with a variety of stakeholders and experts. I think we'll have a better tech future if companies engage in such human rights impact assessments before launching new products and services rather than leaving society to clean up the problems after they arise.

How did you carve out your career, and what advice would you give to others wanting a similar role?

I started as a corporate lawyer at a private law firm in DC working with information and communication technology companies. Then I clerked for a judge at a U.S. federal appellate court that had a big focus on tech and intellectual property. Afterward, I was a lawyer at the U.S. State Department for about 14 years. I spent 9 years in the human rights law division, with the last four as the director of that office. My endeavors at the Department included working on Internet freedom issues, participating in consultations and negotiations on the UN Guiding Principles & Human Rights, and promoting freedom of expression. When I became a law professor about 10 years ago, I focused on applying those principles, frameworks, and multilateral approaches to tech issues in my scholarship. My advice is to seek professional experiences that give you a strong grounding in both technology issues and international human rights principles. Then bring that combined perspective to responsible tech issues.

Which individuals or organizations do you believe are doing impactful work in building a better tech future?

There are many organizations doing very impactful work in this field. For example, I think the Center for Democracy and Technology does great work in identifying issues as they arise, providing a useful analysis of freedom of expression and privacy problems, and recommending ways forward. Witness is doing cutting-edge work in tackling the problem of manipulated media in the digital age. Access Now does tremendous work in flagging global trends in digital rights issues as well as convening a wide variety of stakeholders from around the world at its RightsCon conference to discuss digital rights issues in a timely way and with an interdisciplinary approach.

Jackie Lho
Global Policy and Engagement Manager, Naver Z

Tell us about your role and what success looks like for you.

I work on the Global Policy and Engagement team at NAVER Z. Our flagship service, ZEPETO, is an immersive 3D avatar-based social universe of over 400 million global users that provides a powerful platform and intuitive tools for users to express their creative visions, find community, and explore new forms of entertainment. I'm responsible for developing minor safety policies, communications and external engagement with multi-stakeholders in the responsible tech space, and the creation of educational resources for ZEPETO to empower the next generation of healthy digital citizens. Because I wear a few different hats in this role, success takes different forms, but chief among them is seeing high levels of engagement and understanding across our user base (especially our younger users) in online safety knowledge -- both specific to our platform and online spaces in general.

"What we need to focus on, in my opinion, isn't to put all of our resources into trying to win every race, but to channel them into equipping the next generation with the knowledge and tools to navigate the landscape -- a 'teach a person to fish' approach. This requires collaboration and active involvement from all parts of society."

In your opinion, what are the most important issues in Responsible Tech today?
Perhaps I'm biased because my role is geared very much towards minor safety, but one of the most critical issues we face is keeping kids safe. We are constantly playing catch up in the online safety space, just by the nature and pace of technological advances. It will always be challenging to get ahead of new harms. What we need to focus on, in my opinion, isn't to put all of our resources into trying to win every race, but to channel them into equipping the next generation with the knowledge and tools to navigate the landscape -- a "teach a person to fish" approach. This requires collaboration and active involvement from all parts of society.

What advice would you give to individuals looking to be involved in the Responsible Tech ecosystem?

One of the things that drew me to All Tech is Human was its firm belief that anyone who uses or is affected by technology can and should be involved in it in some capacity, no matter how big or small, and no matter how much direct experience you have in tech. The beauty of the responsible tech ecosystem is that fresh perspectives and diverse skill sets are necessary to improve the direction that tech is taking. It's a complex problem that isn't going to be fixed by a line of code or a set of policies. There are so many ways to get involved, whether it's at the individual level of learning what responsible tech is and sharing that knowledge, joining an organization focused on responsible tech or online safety, working at a tech organization and infusing those values into its operations, and so much more.

What group would you like to see more active in the Responsible Tech movement and why?

It would likely require multiple groups to effect this type of change, but I would love to see a digital safety curriculum made mandatory in our education systems. Our online and offline lives are inextricably linked, perhaps even indistinguishable from one another in many areas, and youth's ability to navigate these complex and dynamic online spaces is critical to their safety and wellbeing. Foundational youth education should include the knowledge about protecting digital privacy, detecting harmful actors, being respectful and responsible online, and understanding how, when, and where to seek help.

Kathy Baxter
Architect, Ethical AI Practice at Salesforce

Tell us about your role.

As an Architect of Ethical AI Practice at Salesforce, I work with teams and roles across the company including research scientists, product management, engineering, UX, legal, and government affairs. I work with our research scientists to examine ethical considerations in their AI research to ensure it is safe. I collaborate with product teams to identify features that empower our customers to use our AI platform (Einstein) responsibly and to build those features with ethics in mind.

A chunk of my time is also focused on developing processes that bake ethical considerations into our development cycle. And finally, I engage with customers, policy makers, civil society groups, peers at other tech companies, and consumers broadly to share what we've learned and to learn from them so we can create a more fair and just world together.

"Block off at least two hours per week, every single week, to stay up-to-date on the latest research and resources. Follow experts on social media, attend conferences and meetups, and build your network."


How did you pave your career in the Responsible Tech field? What advice would you give to college & grad students looking to be involved in the Responsible Tech ecosystem?

I have a BS in Applied Psychology and MS in Engineering Psychology/Human Factors Engineering. It is a technical degree situated in humanities with an emphasis on research ethics -- ensuring what we are doing provides benefit and avoids harm. I began my career in user experience research, examining people's needs, context of use, and values. In 2016 I transitioned to research in AI and focused on the ethical risks AI can present. In 2018 I pitched and created a role to focus on ethics in AI full time.

My advice: There are loads of existing resources and research in this area. Block off at least two hours per week, every single week, to stay up-to-date on the latest research and resources. Follow experts on social media, attend conferences and meetups, and build your network. This area is rapidly developing & if you don't invest in keeping up, you'll be left behind. There is no shortcut, summary course, or CliffsNotes to learning about the vast world of ethics in technology as a foundation and ethics in AI specifically.

Where do you see the future of Responsible Tech headed?

I'm encouraged by the increased public focus on issues of racial injustice in technology in the wake of the BLM movement and COVID. There have been more discussions about how much surveillance we are comfortable with, whether it is for health or security purposes. Big tech companies are reconsidering who they sell facial recognition to and for what purposes. There are questions about the harms vs. anticipated benefits of predictive policing and if it can ever be applied fairly. There is greater awareness of the risks of deepfakes and disinformation to our democracy and who is responsible for keeping it in check.

The push for AI regulation by a concerned and angry society will only increase. AI regulation is already being implemented in California, Illinois and Massachusetts, with more US states to follow. Just as the EU's GDPR changed the way US-based companies handled the data and privacy of customers in the EU, we will see significant changes in how EU and US-based companies work following AI regulations in the EU. This is a good thing. Despite fears that regulation will harm innovation, I believe it will elevate the ethical playing field and stop the race to the bottom for profits by any means necessary.

Katie Harbath
Founder and CEO of Anchor Change

Tell us about your role.

I currently work for myself doing a variety of projects for organizations at the intersection of technology and democracy. This includes fellowships at the Bipartisan Policy Center, the Integrity Institute, Atlantic Council, National Conference on Citizenship and the International Republican Institute. My work specifically focuses on how tech companies protect the integrity of elections on their platforms as well as their content policies.

"We are in the middle of a massive transformation in how we consume content online. This means we are rewriting societal norms and laws for how we hold people accountable for what they say and do on the internet. By having more people involved in Responsible Tech work and sharing their experiences we can learn from the lessons of the past to build the future we want to have."

How did you pave your career in the Responsible Tech field?

My career started in the early 2000s running digital campaigns for the Republican party. I then spent 10 years at Facebook where I built and led global teams that managed elections and helped government and political figures use the social network to connect with their constituents. This work included managing the global elections strategy across the company by working closely with product teams to develop and deploy civic engagement and election integrity products including political ads transparency features;
developing and executing policies around elections; building the teams that support the government, political, and advocacy partners; working with policymakers on shaping the regulation of elections online, and serving as a spokesperson for the company about these issues. It was at Facebook where I started to get more into the integrity field and I am continuing that work mainly through the Integrity Institute.

What advice would you give to college & grad students looking to be involved in the Responsible Tech ecosystem?

Nearly every job I've had in my career never existed before I created it. This is true for the Responsible Tech ecosystem which has exploded in the last five years and will continue to grow and evolve. For those that want to get into this line of work I highly recommend spending time working at a technology company on integrity and/or content policy issues. There is no better learning ground for the really hard tradeoffs and decisions that companies face every day. It's also worth getting experience in policymaking as well through a job or fellowship in a state legislature or Congress. TechCongress is a great fellowship to consider for that. Finally, read a lot. There are a ton of amazing people writing and talking about these topics.

Looking ahead, what makes you optimistic that we can build a better tech future?

Just the fact that we are having these conversations in the open gives me great hope. We are in the middle of a massive transformation in how we consume content online. This means we are rewriting societal norms and laws for how we hold people accountable for what they say and do on the internet. By having more people involved in Responsible Tech work and sharing their experiences we can learn from the lessons of the past to build the future we want to have.

Which individuals or organizations do you believe are doing impactful work in building a better tech future?

I continue to be inspired by my colleagues at the Integrity Institute who are the engineers, data scientists, researchers, analysts, and policy experts that have or currently work at the platforms on integrity issues. By sharing their knowledge publicly they are helping policymakers, civil society, academics, and the media better understand the challenges this work faces every day. Moreover, some of the most impactful work is still happening at the platforms themselves. There are thousands of people doing the hard work every day to make difficult decisions, invent new ways to prevent harm, and write policies.

Dr. Murtaza Shaikh
Online Hate & Terrorism Lead at Ofcom, and Advisor to UN Special Rapporteur on Minority Issues

What are the most pressing and important topics in Responsible Tech?

I am biased, but I think online hate speech poses an incredibly challenging and complex societal problem that in the context of social media platforms is global and cross-cultural. It is distinct from other online harms as its precise definition continues to be contested and highly dependent on context. It takes place on public channels and will always take place at incredible scale. We may improve tech/human hybrid solutions, but when it is mixed with far-right violent extremism, disinformation and violent conspiracy theories, the result is a concoction that poses a real threat to the fabric of societies, risks animosity towards minorities and increases levels of real-world violence.

"Soon after in 2016, when I gave oral evidence to the UK Home Affairs Select Committee on Islamophobic hate speech on a number of social media platforms, it dawned on me the huge potential benefit that could result from such tech companies' exerting sincere efforts at improvement that went beyond reputational or commercial concerns."

How did YOU pave your career in the Responsible Tech field?

I saw and realised that my expertise in international human rights law and conflict
prevention relating to decreasing religious hatred and countering extremism was far more relevant and meaningful in the online context. Advocating for changes to laws and policies was a slower and more difficult process, which comparably had less societal impact. So I sought to apply my legal, policy and international relations knowledge to online content policies and moderation.

I worked with the late Jo Cox MP, who was murdered by a far-right extremist radicalised through online channels. Societal divisions and toxic campaigning for the UK to leave the EU also contributed to this. Soon after in 2016, when I gave oral evidence to the UK Home Affairs Select Committee on Islamophobic hate speech on a number of social media platforms, it dawned on me the huge potential benefit that could result from such tech companies' exerting sincere efforts at improvement that went beyond reputational or commercial concerns.

What impact is your organization having on the Responsible Tech ecosystem?

Ofcom is one of the first national regulators to take on the task of ensuring online safety for UK online users. It has just begun to do this in relation to UK-established video sharing platforms with incitement to hatred expressly within its scope. This role will expand to all user-generated content hosting and search services under the Online Safety Bill, once passed. It will seek to protect online users from illegal and harmful content.

Prior to joining Ofcom, I helped draft 'Effective Guidelines on Hate Speech, Social Media and Minorities' for the UN Special Rapporteur on Minority Issues. Once published, it is hoped that it will provide a blueprint for the application of international human rights standards to how social media companies can practically ensure better online safety for minorities and other protected groups. The resort to social media channels in inciting ethnic violence in Myanmar and Sri Lanka and associated failures should never be allowed to repeat.

Randy Fernando
Co-Founder, Center for Humane Technology

What does a better tech future look like to you?

In the short term, it's a world where incentive structures are more in sync with humanity's best interests—where the costs of invisible harms show up on company balance sheets. Many companies that seem incredibly profitable are only that way because we aren't accounting for their invisible harms.

"...People who deploy and profit from powerful technology must have matching responsibility and liability. The reality is that if all the unanticipated harms from new technology are added up and put back on company balance sheets promptly, companies will be forced to slow down and proceed much more thoughtfully."

In the longer term, it's a world where the mindsets that drive the technology world aren't rooted in competition and extraction. What we need instead is a mentality that is deeply rooted in interdependence: how our minds and bodies influence and are influenced by our experiences, relationships, and the world around us. This one principle unlocks all sorts of positive downstream thinking, such as better economic goals, metrics, and product designs.

Without folding at least some elements of interdependence, our abundance of technology and resources will most likely lead to more
concentrations of wealth and power, broken sensemaking, widespread mental health harms, and a world that leaves the vast majority of humans behind.

Rapidly advancing technology that's tied to competition, extraction, and harmful incentives may force a reckoning sooner than later.

What changes would you like to see happen over the next year?

There's a lot here, but here are two high-level ideas:

Binding of power and responsibility: people who deploy and profit from powerful technology must have matching responsibility and liability. The reality is that if all the unanticipated harms from new technology are added up and put back on company balance sheets promptly, companies will be forced to slow down and proceed much more thoughtfully.

More nuanced discussions about tech, and especially AI: we'll be able to make better collective sense of our situation if more people try to represent different viewpoints more accurately. We can ask: Where is the truth in a viewpoint that makes us cringe? Why might someone else feel so strongly about that viewpoint? Can we add elements of those answers to our work?

Here at All Tech Is Human, we aim for a diverse range of disciplines to be involved in tackling wicked tech & society issues. In your opinion, what background(s) should be included more?

We need so much help! One idea I'd like to emphasize is taking some time to round out areas we don't know as much about.

A technologist can learn more about relevant social sciences—perhaps the history and power structures of a society that technology reinforces. A social scientist can learn more about how a relevant piece of tech works and how it impacts communities. Specialization still has tremendous value, but the time invested in exploring unfamiliar areas can help us make better sense of complex, cross-disciplinary situations. It can also build appreciation for experts in those areas and help us invite those voices into our work.

Rebecca Portnoff
Head of Data Science, Thorn

Tell us about your role and what success looks like for you.

I lead the Data Science team at Thorn. Thorn is an NGO dedicated to building technology to combat child sexual abuse. My team builds the ML/AI technology that accelerates our mission across the three main intervention points we pursue - supporting victim identification, stopping revictimization (ending the viral spread of child sexual abuse material), and preventing abuse from happening in the first place.

"Technology can be a double-edged sword - are we working to mitigate both how current harms against children manifest, as well as how technological trends and advancements will impact harms against children in the future?"

We have ambitious goals here at Thorn. If I think about what success looks like on a day-to-day basis, it's: does my team have what they need to keep building and having impact? Are we supporting our partners well, and providing tools that are easy to use, easy to integrate, meet their needs, and are highly effective? Are we ahead of the curve on technological trends and advancements, ensuring the best of technology is used to help kids first? Technology can be a double-edged sword - are we working to mitigate both how current harms against children manifest,
as well as how technological trends and advancements will impact harms against children in the future?

Stepping out of the day-to-day, to the broader perspective: success looks like every single kid who has been identified and recovered from their abusive situation, every single image or video documenting this abuse taken down and reported, and every single kid who has been reached and moved away from a dangerous situation in time as a result of the work we do in partnership with the broader child safety ecosystem.

What are some trends / growing possibilities in the future in your field?

Generative AI is front of mind for me. I said earlier that technology is a double-edged sword. On the one hand, we're seeing this exciting burst of technical advancement with large language models and transformer models—advancements that my team is incorporating into our tech stack.

On the other hand, we're seeing generative AI technology get misused by bad actors to further scale harm against children. They use this technology to create AIG-CSAM (AI-generated child sexual abuse material). Victim identification is already a needle in the haystack problem for law enforcement, where they have to sift through huge amounts of content to find that child in active harm's way. Anything that adds to this haystack makes their job more difficult. Bad actors also use this technology to further re-victimization, using existing CSAM to generate more explicit images of those same children. Sextortion is another area of impact—bad actors accelerate their efforts by using GAI technology to support the content creation necessary to target a child.

The good news is that while the prevalence of AIG-CSAM is growing, it's still small. Now is the moment for safety by design: prioritizing child safety across the entire ML/AI life cycle of development, deployment, and maintenance. We can clean training datasets of CSAM, include content provenance as part of deployment, and share hashes of known AIG-CSAM with the proper authorities. There are opportunities at each stage of the lifecycle to prioritize child safety, and now is our moment to do so.

Sarah Gold
CEO, Projects by IF

Tell us about your role and what success looks like for you.

I'm founder and CEO at Projects by IF. Success looks like more organisations designing, shipping and maintaining trustworthy products and services. I want that to be as a result of more teams understanding what it takes to make trustworthiness table stakes. That they have the tools, measurements and examples that support them in this work so it's easier to achieve.

"I want more digital teams to engage in the innovation work that is Responsible Technology. To make the space for the messy, choppy, imagination work that is needed to move from where we are to a more responsible future. To be courageous and try new things. That means getting uncomfortable, challenging your preconceptions and existing ways of thinking."

In your opinion, what are the most important issues in Responsible Tech today?

I have been working in this space for the last 7 years (if not a little longer!) and there are still lots of challenges. I don't want to pretend that there are any silver bullets.

For me, the most pressing challenge remains the gap between principles and practise. Responsible Technology and its adjacent fields have so often been dominated by academic research and philosophy. There's lots of good that comes from this work! However, when you're working at the
coal face of products and services, and making decisions every day about the position of buttons or what words to use - we need more practical guidance. Because once you get into the practicalities of what it takes to ship product, you need the problem and opportunity framed differently in order to help you do your job. 7 years on I still see an absence of smart thinking and prototypes in the bridging between research and product. It's full of wicked problems - like how much friction can you actually add to an interface - but very rewarding work when you uncover new insights and make progress.

What is one major change that you would like to see in the next few years? What does a better tech future look like to you?

We all deserve products and services that are worth trusting. I want a future where technology embeds care. Where technology supports each of us as we live, work and play. I want more digital teams to engage in the innovation work that is Responsible Technology. To make the space for the messy, choppy, imagination work that is needed to move from where we are to a more responsible future. To be courageous and try new things. That means getting uncomfortable, challenging your preconceptions and existing ways of thinking. Because Responsible Technology work demands new ideas, conversations, and teamwork. Because processes don't solve problems. Teams do. Thinkers, makers and doers. Multidisciplinary teams that lean into complexity, and move through that unknown territory to discover new knowledge. New ways of being. This is all so we can collectively see the world in a different way. Where technology enables meaningful outcomes that are better shared between people, organisations and society. And in doing that work, we make it possible for everyone to see that new world too. A trustworthy world that opens up new markets and new possibilities that we all benefit from. To be inspired by it.

How did you carve out your career, and what advice would you give to others wanting a similar role?

I started working in Responsible Technology very early on in my career without realising it. I was helping to start up a project and organisation called WikiHouse, which used open source and 3D manufacturing to democratise access to home building. Then through my masters at Central Saint Martins in London I started to look at themes of technology and societal change and that led me to start my company Projects by IF. I have 2 pieces of advice for people wanting to get into the Responsible Technology field - one is to access and participate in communities and networks already embedded in this space. All Tech is Human is a brilliant example of this. You'll make amazing connections and find opportunities and experience through opportunities that become available. The second piece of advice is to publish your thinking, there is space for all of us and we need more practitioners who are practically building responsible technology to share their learning and experiences. So get sharing! I look forward to reading about your work!
Tamara Kneese
Senior Researcher and AIMLab Project Director

"Having a deep ethnographic understanding of potential and existing harms is important, and this requires building trusted relationships and examining a problem from many angles. We will partner with a variety of organizations, emphasizing and foregrounding marginalized communities, to assess impacts in the public interest."

Tell us about your role and what success looks like for you.

In my role as a Senior Researcher and Project Director of Data & Society's newly launched Algorithmic Impact Methods Lab (AIMLab), we are attempting to gain access to systems so we can figure out the best methods, both qualitative and quantitative, for assessing technologies' broad range of social impacts across their entire life cycles. The only way to hold the developers of algorithmic systems accountable is to gather rich, empirical data and make actionable recommendations. Having a deep ethnographic understanding of potential and existing harms is important, and this requires building trusted relationships and examining a problem from many angles. We will partner with a variety of organizations, emphasizing and foregrounding marginalized communities, to assess impacts in the public interest. Success to us means providing transparent documentation and in-depth methodological frameworks for understanding algorithmic impacts in different sectors and according to particular, granular use cases, helping
advocates, industry leaders, and policymakers make informed decisions.

What are some trends / growing possibilities in the future in your field?

I'm excited about the work I see that provides a more coalition-based, interdisciplinary approach to addressing the biggest problems in the tech industry. Technologists and academics can listen to workers, artists, and communities that are most affected by algorithmic systems or other technological realities. Rather than panels of academics and tech insiders talking about the impacts of AI on workers, it's great to hear from affected workers themselves, from gig workers who are experiencing algorithmic wage discrimination to Amazon Mechanical Turkers experiencing information precarity or other forms of asymmetry in their relationships with requestors, or artists who are having their work stolen by AI art generators. This does seem to be happening more at events like Mozilla Festival, within nonprofit research organizations, and on academic conference panels.

How did you carve out your career, and what advice would you give to others wanting a similar role?

Throughout my career, including during my time as a graduate student, I have always been involved in external political causes. My background as a labor organizer and my work with various coalitions in the cities in which I lived have helped frame my ethnographic and historical work on computing. I worked a number of jobs in nonprofits, academia, and in tech, which helped me to see how power operates in different settings and taught me how to communicate with a wide variety of audiences. There is no right or wrong path, and there is difficulty in any field. But I do think that having professional experience in a variety of sectors, along with my background in organizing, opened up different kinds of opportunities.

What does a better tech future look like to you?

Right now, a handful of companies have a great deal of money and power. But the vast majority of people who interface with platforms do not have much control over their working conditions or their experiences of technology more generally. In an ideal world, technology for good would also mean changing the social, political, and economic conditions that produce and deploy technologies in the first place. Historically vulnerable communities and contingent tech workers around the world, from ride-hail drivers to content moderators, must have more say over how technology is used in their daily lives. In addition to changing labor conditions, there must be more attention paid to the resources that go into software and hardware production. There is a dire need for circularity and reuse, given the growing problem of e-waste, and a greater effort to switch to renewable energy and forsake fossil fuels.

Theodora Skeadas
Deputy Director of Strategy for the State of Massachusetts' Executive Office of Technology Services and Security

Tell us about your role and what success looks like for you.

My organization's mission is to provide secure and quality digital information, services, and tools to customers and constituents when and where they need them. The Commonwealth's lead IT and cybersecurity organization provides secure and quality digital information, services and tools across the Executive Branch. In this role, I advise executive agencies across Massachusetts state government around technology decisions to provide effective constituent services. Our team has assisted agency partners by managing procurements (from drafting the RFX to evaluating responses), mapping business processes, conducting market research and RFIs, and developing requirements for technology. Success means that we have worked thoughtfully with an agency partner to articulate a clear need and work with them to implement their vision.

"People for whom technology directly impacts them, such as those who have experienced online discrimination, harassment, and account hacking, need to be included in the tech policy conversation. Their voices are essential to ensuring that tech policy is designed to protect and empower everyone."


How did you carve out your career, and what advice would you give to others wanting a similar role?

I started my career working with civil society organizations in Eastern Europe, the Middle East, and North Africa. In Greece, the West Bank, Morocco, and Turkey, I worked on issues including refugee integration and immigration, youth violence, community development, poverty alleviation, conflict resolution, and education. I observed firsthand how people used platforms to advance meaningful political discourse and social movements around restrictive government policies. Later, at Booz Allen Hamilton, I examined public sentiment, social movements, and disinformation using social media for the U.S. Federal Government. I witnessed how digital technologies proliferated in the hands of political and social organizers and violent extremists. This work educated me on global conversations.

At Twitter, I managed our Trust and Safety Council, a trusted partners program to support journalists and human rights defenders globally, and a research hub for the Public Policy team. My team's work sat at the heart of global debates around online speech governance, content moderation, and trust and safety. Most recently, I have consulted with civil society organizations and companies. These have included Carnegie Endowment for International Peace's Partnership for Countering Influence Operations on government efforts to combat disinformation in Ukraine, National Democratic Institute on online violence against women in politics and public life, and the Committee to Protect Journalists on a new chat-based safety initiative that delivers journalist safety information. Now, I am the Deputy Director of Strategy for the Massachusetts Executive Office of Technology Services and Security, where I advise executive agencies across the Massachusetts government around technology issues.

Within your area of practice, who still needs to be included in your field?

There are several groups that still need to be included in my field:

People from underrepresented groups: The tech policy profession remains dominated by one traditional group. People from all racial, ethnic, geographic, and religious backgrounds need to be represented in this field, so that we can develop policies that reflect everyone's diverse needs. This includes people who are women, Muslim, African, and Southeast Asian.

People with lived experience: People for whom technology directly impacts them, such as those who have experienced online discrimination, harassment, and account hacking, need to be included in the tech policy conversation. Their voices are essential to ensuring that tech policy is designed to protect and empower everyone.

People with expertise in other fields: Technology policy is a complex field that requires expertise in a variety of areas, such as conflict resolution, philosophy, and sociology. People with expertise in these other fields need to be included in the tech policy conversation, so that we can develop policies that are informed by a wide range of perspectives.
perspectives.

Five subject matter areas of responsible tech

Our global working group of 100 individuals from a range of backgrounds, disciplines, and experience levels came together to offer an overview of Responsible AI, Trust & Safety, Tech & Democracy, Public Interest Tech, and Youth, Tech, & Wellbeing.

Responsible AI
FROM OUR AI WORKING GROUP CONTRIBUTORS

Overview

The capabilities of artificial intelligence (AI) are developing rapidly and in multiple directions. Recent breakthroughs in Generative AI are currently the most visible, but AI technologies are impacting decision-making and automation in a wide range of fields, with implications for industry, government, and civil society. This rapid pace of evolution and deployment is resurfacing important and complex questions surrounding AI ethics. As a result, a diverse community of advocates for responsible technology, including responsible artificial intelligence, is actively working to provide practical thought leadership to guide AI development and deployment - with a focus on equity, inclusion, societal benefits, harm reduction, and environmental viability.

The technical and social complexity of AI systems requires a multi-voice effort to explore what AI can do, what it should do, and what it could do in the future. The responsible tech ecosystem is a venue where such issues are examined, value propositions are defined, tradeoffs are explored, and guardrails are proposed. Responsible AI focuses on addressing ethical, social, and safety concerns associated with AI systems. Responsible AI systems should be designed to make ethical decisions and align with and prioritize values of human well-being, fairness, transparency, and accountability to have a positive and sustainable impact on society.

AI should not perpetuate or reinforce existing biases and discrimination. Responsible AI thus ensures that algorithms treat all individuals fairly and without discrimination based on factors such as race, gender, ethnicity, age, ability, geographic location, or socioeconomic status.
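To make the non-discrimination point above concrete, here is a minimal, illustrative Python sketch (not from the guide) of one basic check practitioners run on a deployed model: comparing its positive-decision rate across demographic groups, often called a demographic parity check. The records, group labels, and threshold are hypothetical.

```python
from collections import defaultdict

# Hypothetical audit log: (group label, whether the model approved the case)
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

approved = defaultdict(int)
total = defaultdict(int)
for group, was_approved in decisions:
    total[group] += 1
    approved[group] += int(was_approved)

# Approval rate per group, then the gap between the best- and worst-treated groups.
rates = {group: approved[group] / total[group] for group in total}
gap = max(rates.values()) - min(rates.values())

print("Approval rates by group:", rates)
print(f"Demographic parity gap: {gap:.2f}")

# A large gap (for example, above a team-chosen threshold such as 0.2) is a signal
# to investigate the data and model, not proof of unfairness on its own.
```

Real audits use larger datasets, multiple fairness metrics (such as equalized odds and calibration), and qualitative review alongside a single summary number like this.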


Responsible AI must safeguard user data and respect individual privacy. AI systems should be designed with safety and security in mind and should have mechanisms in place for human oversight and control to prevent undue reliance or inappropriate use.

Creating Responsible AI systems requires collaboration between engineers, ethicists, researchers, policymakers, and the public. Only a multidisciplinary approach ensures that AI is developed with a broad understanding of its implications.

Key Terms and Definitions
(From ActiveFence, TSPA, and Digital Trust & Safety Partnership glossaries)

AI Bias: Bias in AI is the presence of unfair or discriminatory outcomes arising from the incorporation of biased data or flawed algorithms. It occurs when the AI's predictions or decisions disproportionately favor or disadvantage certain groups, thereby replicating existing societal biases present in the training data. Addressing AI bias involves recognizing, understanding, and rectifying these disparities to ensure outcomes that are as equitable and unbiased as possible.

AGI vs ANI: Artificial General Intelligence (AGI) is the hypothetical concept of AI systems that possess general intelligence, similar to human intelligence. AGI systems would have the ability to understand, learn, and apply relevant knowledge across various domains and tasks. On the contrary, Artificial Narrow Intelligence (ANI) refers to AI systems that are designed for specific tasks or narrow domains.

Black Box AI vs Glass Box/White Box: A 'black box' is a system that is so complex that its behavior cannot be explained in terms of its individual components. In AI and machine learning, the components of interest are the features, or inputs, and the parameters that the system learns from data. Although it is possible to grasp these components mathematically and understand them, the system as a whole is not accessible—hence 'black box.' Glass box models, often referred to as "white box," are the opposite of Black Box models. With these models, users can understand the decision-making process and trace the relationship between inputs and outputs.

Generative AI: Refers to AI systems or models that can create or generate new content, such as images, music, or text, based on patterns learned from training data. See also: Foundation/Frontier Model, https://www.adalovelaceinstitute.org/resource/foundation-models-explainer/

LLMs (Large Language Models): Refers to advanced AI models that are trained on large amounts of text data and can generate human-like text responses. These models use deep learning techniques, such as transformer architectures, to understand and generate language.
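As a small, hedged illustration of the Black Box vs Glass Box distinction above (not from the guide), the sketch below fits a shallow decision tree, a model whose entire decision logic can be printed and read, assuming scikit-learn is installed; a deep neural network trained on the same data would offer no comparable human-readable trace. The loan data and feature names are hypothetical.

```python
# Glass-box illustration: a shallow decision tree whose learned rules can be printed.
# Assumes scikit-learn is installed; the loan data below is purely hypothetical.
from sklearn.tree import DecisionTreeClassifier, export_text

X = [[20, 3], [35, 2], [50, 1], [75, 0], [90, 1], [120, 0]]  # [income_k, open_debts]
y = [0, 0, 1, 1, 1, 1]                                       # 0 = denied, 1 = approved

# Limiting depth keeps the model simple enough to inspect end to end.
model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# export_text prints the learned if/else rules, so the decision path for any
# applicant can be traced feature by feature.
print(export_text(model, feature_names=["income_k", "open_debts"]))
```

Teams often pair more accurate black-box models with post-hoc explanation tools, but for high-stakes decisions an inherently interpretable model can be a safer starting point.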


Human-in-the-loop: A human operator is involved in every step of the machine learning process, in which human oversight, intervention or decision-making is integrated into an automated or AI-driven process. This approach ensures that humans remain actively involved in critical tasks, allowing them to monitor, guide, and correct the system's actions as needed.

Responsible AI: RAI involves developing and using artificial intelligence systems ethically, considering their potential impacts on society. It requires adhering to human values, legal frameworks, and ethical standards, while ensuring transparency, accountability, fairness, and privacy. The goal of responsible AI is to harness the benefits of AI while minimizing any adverse effects on individuals and society.

TESCREAL: The acronym stands for Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective Altruism and Longtermism. See also: Timnit Gebru.

Key Moments in Responsible AI

October 2022: The White House Blueprint for an AI Bill of Rights is released. This "Blueprint" identifies five core principles to guide and govern the development and implementation of AI systems with particular emphasis on the unintended consequences of civil and human rights abuses. The "Blueprint" calls for safe and effective systems, algorithmic discrimination protections, and data privacy. – The White House

November 2022: OpenAI releases GPT-3.5.

February 2023: A reporter's unsettling conversation with Bing Chat implies there is still work to do. – The New York Times

March 2023: OpenAI releases GPT-4, its largest LLM. GPT-4 is publicly available via the paid ChatGPT Plus and OpenAI's API. GPT-4 is a multimodal model, accepting image and text-based input.

March 2023: The Future of Life Institute published a letter calling for "all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4." The Future of Life Institute focuses on mitigating long-term "existential" risks to humanity such as superintelligent AI, which they argue could lead to extreme automation of jobs and even human obsolescence. The letter was signed by more than 20,000 people, including academic AI researchers as well as industry CEOs. The letter has been criticized for diverting attention from immediate societal risks such as algorithmic bias and the lack of a transparency requirement for training data. The pause did not occur. – Future of Life Institute

May 2023: Statement on AI Risk released. – Center for AI Safety


August 2023: AI experts inform Congress about the advantages and drawbacks of artificial intelligence, as well as provide insights on how to effectively regulate this swiftly advancing technology. – Bloomberg Law

August 2023: The Federal Election Commission begins a process to potentially regulate AI-generated deepfakes in political ads ahead of the 2024 election. – Federal Election Commission

Global Perspectives on AI

AI has become increasingly intertwined with our daily lives, influencing what we see, where to go, what to buy, and even how we vote. Governments and local legislatures are working to create actionable laws and regulatory practices, as companies increase the availability of various generative AI models to the general public. The following is an international list of the current major happenings and trends surrounding the use of artificial intelligence.

Global: G7 Hiroshima Leaders' Communiqué published with a reference to a commitment to Responsible AI. In May, leaders from the G7 countries announced they will be setting up the Hiroshima AI Process this year, in collaboration with the OECD. The nations are set to discuss AI governance, IP rights, and transparency.

North America: The United Nations Security Council held the first-ever session on artificial intelligence. The Council emphasized the risks AI poses to international peace and discussed how to mitigate potential security implications. – The New York Times

The Canadian government sought input on a voluntary code of practice for generative AI, aiming to ensure that participating firms adopt safety measures, testing protocols, and disclosure practices. – Venture Beat

Europe: The European Parliament passed its version of the AI Act, triggering the final stage of the Union's regulatory process. The EU is expected to vote through and implement the law in early 2024. The Act sets out a comprehensive framework for regulating the development and use of AI in the EU. – EU AI Act

149 civil society organizations called on EU institutions to put people first in the AI Act. – Algorithm Watch

The National Risk Register officially classified AI as a long-term security threat to the UK's safety and critical systems. – CSO

Asia: China unveiled new rules governing AI. Beijing's controls on internet content and U.S. curbs on semiconductor exports to the world's second-largest economy are thought to hamper progress. – Reuters

India's telecom regulator, Trai, recommends an independent statutory authority, the Artificial
Intelligence and Data Authority of India (AIDAI), to regulate responsible AI use across sectors. – Mint

Africa: Microsoft & DSM in collaboration with Data Scientists Network and Data Science Nigeria and Federal Government of Nigeria (FGN) hosted a responsible AI workshop in Nigeria on the usage of the new Responsible AI Dashboards in decision making. – Technext

A report called Mankind and AI was released as part of Africa Tech Radio. African researchers across the continent come together as an open research forum to better understand the African landscape. – Africa Tech Radio

Nigeria calls on experts to support the launch of a National AI Strategy. – Digwatch

Launch of the Centre for Artificial Intelligence: Malawi University of Science and Technology. – Malawi University of Science and Technology

Latin America and the Caribbean: The Caribbean Artificial Intelligence Initiative is launched. – AI 4 Caribbean

The first webinar organized by UNESCO's Ibero-American Business Council on Artificial Intelligence and Ethics was held. – UNESCO

Oceania: The Australian government aligns and updates its AI strategy. – Australian Institute of International Affairs

New Zealand updates expectations on the use of Generative AI. – RegulationAsia

The Way Forward

More codified enforcement of AI safety and protocols for businesses: Companies may not take into account the implications of their software's impact on consumers, including youth. Social media platforms and big tech have designed their software with large language models that have increased privacy violations, racial bias, manipulation, and pressuring and deceptive marketing tactics to turn over a profit using an individual's data. State and federal governments must act to uphold strict standards.

Increased collaboration globally: Big tech is dominated by Western culture; we're seeing this hold true for the development of AI as well. Responsible AI needs to address biases and training, red-teaming, and other aspects that need to include input from diverse groups, especially the Global South. There could be an international body that sets standards so that AI machine learning will have a globally integrated understanding instead of a biased perspective.

Shift in focus from future harms to present-day harms: The AI doomsday talk is not only a distraction from present-day AI harms, like bias in loan rates, but it's also creating an AI arms race because (as the logic goes) if you don't rush to make/control the next advancement in AI, someone else will. We'd like to see AI slowed down so we can be more thoughtful about its uses and build tools that don't perpetuate bias, violate privacy rights, and erode democracy.

Resources

Your Face Belongs to Us by Kashmir Hill: The story of a small AI company that gave facial recognition to law enforcement, billionaires, and businesses, threatening to end privacy as we know it.

A Human Algorithm by Flynn Coleman: A groundbreaking narrative on the urgency of ethically designed AI and a guidebook to reimagining life in the era of intelligent technology.

Race After Technology by Ruha Benjamin: From everyday apps to complex algorithms, Ruha Benjamin cuts through tech-industry hype to understand how emerging technologies can reinforce White supremacy and deepen social inequity.

Ethical Machines by Reid Blackman: In "Ethical Machines," Reid Blackman gives you all you need to understand AI ethics as a risk management challenge.

Machine See, Machine Do by Patrick K. Lin: Patrick K. Lin's Machine See, Machine Do: How Technology Mirrors Bias in Our Criminal Justice System takes a deep and thorough look into the use of technology in the criminal justice system, and investigates the instances of coded bias present at every level.

Artificial Unintelligence by Meredith Broussard: A guide to understanding the inner workings and outer limits of technology and why we should never assume that computers always get it right.

Dr. Ravit Dotan's Resource Hub with Responsible AI write-ups, guides, listicles, and more.

Montreal AI Ethics Institute's reports, blog, and newsletter.

Bad Input by Consumer Reports in partnership with the Kapor Center: Three short films look at how biases in algorithms and data sets result in unfair practices for communities of color, often without their knowledge. Directed by filmmaker Alice Gu.

Dataiku's AI and Us web series exploring how AI is changing our everyday lives: from how we dress, to insurance, perceptions, or the gender pay gap.

Anti-Defamation League's Online Hate Index: The ADL Center for Technology and Society (CTS) built the Online Hate Index (OHI), a set of machine learning classifiers that detect hate targeting marginalized groups on online platforms.

Check out responsibletechguide.com for more on Responsible AI!


Here is a snapshot of recent job postings related to Responsible AI:

Lead/Principal Technical AI Ethicist - NLP at Salesforce


Responsible AI Analyst at Indeed
Responsible AI Senior Technical Program Manager at Workday
Research Associate – Project FAIR Fairness and Transparency theme
at Alan Turing Institute
Industry Partnerships Program Manager at Stanford Institute for
Human-Centered Artificial Intelligence
Research Scientist, AI Ethicist at Northeastern University Institute for
Experiential AI
AI Ethics Technical Program Manager at Sony AI
Senior Communications Manager, Responsible AI at TikTok
Responsible AI Solutions Engineer at Credo AI
William J. Brennan Fellowship, Speech, Privacy & Technology Project
at ACLU

Visit our Responsible Tech Job Board for the latest listings and freely
join our Responsible Tech Talent Pool to be matched up with potential
opportunities.

Are you an employer looking for Responsible Tech talent? Reach out to
rebekah@alltechishuman.org and learn more here.

Trust & Safety
FROM OUR TRUST & SAFETY WORKING GROUP CONTRIBUTORS

Overview

Trust and Safety (T&S) is a field that developed as a group of professionals emerged to identify and address the risks and harms impacting communities online. Trust and Safety, as we know it, is a relatively young area - although relevant work has been around since the early 2000s, the term was first adopted by eBay with specified efforts to establish and sustain trust among users, keeping them safe on the platform (Boyd, 2022). Since then, familiarity with the concept of Trust and Safety, and what it includes, has increased exponentially. Nonetheless, it's still the case that most Trust and Safety teams are born in/out of a crisis; a good example came from Zoom growing its T&S team at the height of the pandemic, when the product experienced a massive increase in its user base (Maxim, Parecki, Cornett, 2022).

T&S professionals ensure users of an online platform, tool, or community feel welcome, safe, and secure. They develop community guidelines and moderate content. T&S addresses many issues such as disinformation, offensive content, harassment, fraud, online child safety, phishing, and spam. When it comes to types of work in this space, there could be product-specific or even regional needs. That said, there are three core functions within a Trust & Safety team: policy (individuals who create the principles and policies that define acceptable online behaviors), operations (individuals who ensure the policies are being followed and taking action against violative content and behaviors), and engineering and product (individuals who create the infrastructure/tooling systems for detecting and enforcing abusive content on the platform). On top of these, depending on the size of the company and product needs, we might also see threat detection/intelligence, investigations, and specialized issue teams, such as child safety.

Although Trust and Safety has existed as long as Internet services have been offered, the field has grown beyond niche communities, is leveraged at large companies, and has grown rapidly in the past few years. Organizations like the Trust and Safety Professional Association (founded in 2020), the TS Collective, the Digital Trust & Safety Partnership, and the Integrity Institute were formed to support T&S professionals and improve the public's understanding of T&S. These organizations' existence serves as a milestone for the professionalization of the Trust and Safety field.

Key Terms and Definitions
(From TSPA, Digital Trust & Safety Partnership, and ActiveFence)

Adversarial Behavior: Intentional actions by an actor or a network of actors to circumvent detection or interrupt moderation rules.

Child Safety / Child Sexual Abuse Material (CSAM): Any visual depiction of the sexual abuse and exploitation of children.

Disinformation: Intentionally misleading information distributed to deceive and influence an audience.

Hashing: Creates a unique, fixed-length string of letters and numbers to represent content, or a digital signature for an image.

Hash sharing: The cross-industry sharing of hash data to create databases of hashes related to malicious content. Platforms can then compare image hashes from their content to hashes of known malicious content, without exposing human moderators to potentially harmful content. (See the short sketch at the end of this page.)

Hate Speech: Any speech or content that incites or justifies hatred, discriminates against, or promotes violence against an individual or group.

Impersonation: Apps or websites that are created to resemble existing apps or services in order to gain access to personal data, passwords, or other sensitive data. Impersonation of individuals is the creation of fake accounts, using the target's identifying information and/or images, in order to cause harm to that individual.

Memes: Content, intended to be amusing or interesting, that is widely shared online.

Misinformation: Incorrect or misleading information, often posted or shared unwittingly.

Non-consensual sharing of intimate imagery (NCII): When images or videos of people who are naked, showing their genitals, engaging in sexual activity or poses, or wearing underwear in compromising positions are shared without the consent of all people involved. The imagery may have been taken unknowingly, unwillingly, or consensually.

Protected categories: A set of traits that are used to discriminate against a person or a group of people, such as race, ethnicity, national origin, disability, religious affiliation, sexual orientation, gender identity, and age.
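To make the Hashing and Hash sharing entries above concrete, here is a minimal, illustrative sketch (in Python) of the matching step a platform might run against a shared hash database. It uses exact SHA-256 matching for simplicity, and the names (KNOWN_BAD_HASHES, matches_known_bad) are hypothetical; production systems typically use perceptual hashes that tolerate re-encoding and small edits.

```python
import hashlib

# Hypothetical, illustrative database of hashes received through an
# industry hash-sharing program. Real databases hold millions of entries
# and are distributed under strict access controls.
KNOWN_BAD_HASHES: set[str] = set()

def content_hash(content: bytes) -> str:
    """Create a fixed-length digital signature for a piece of content."""
    return hashlib.sha256(content).hexdigest()

def matches_known_bad(content: bytes) -> bool:
    """Compare an upload's hash against the shared database.

    Only hashes are compared, so reviewers are not exposed to the
    underlying material unless a match is escalated for verification.
    """
    return content_hash(content) in KNOWN_BAD_HASHES
```

A match would typically be routed to a specialized review and enforcement queue rather than actioned automatically, since database errors and collisions, while rare, do occur.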

ALL TECH IS HUMAN | 2023


69
Trust & Safety

Recidivism: The evasion of suspensions or bans, such as the creation of new accounts after a previous account ban.

Reverse Engineering (Red Team): A process to replicate a system, process, device, or software, often used by cybersecurity teams.

Risk Assessment: An analysis of the types, potential severity, and likelihood of harms of a product, service, or feature.

Sextortion: The act of seeking financial gains, favors, or private content by threatening to share sexually intimate information about a target.

Terms of Service/Terms of Use: Legal agreements between users and service providers under which the user can utilize services.

Transparency Reports: Issued by a service to disclose metrics and insights about its approach to salient risks and relevant enforcement practices, including how it enforced its policies and how it handled requests to remove or restrict user content. They often detail government requests for user records.

True Positive/True Negative: Content correctly flagged as violative (a true positive) or correctly identified as non-violative (a true negative); content wrongly flagged or missed is a false positive or false negative, respectively. (See the short metrics sketch at the end of this page.)

Virality: When content gains high, rapid, and wide reach amongst the users of a service or multiple services. This may occur within seconds to hours on some platforms like social media.
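The True Positive/True Negative entry above is part of a standard confusion-matrix vocabulary that Trust & Safety teams use when evaluating detection systems. The sketch below is illustrative only (the labels and counts are invented); it shows how the four outcomes relate and how precision and recall are derived from them.

```python
# Illustrative review outcomes for a small batch of moderated items.
# "flagged" = the detection system marked the item as violative;
# "violative" = a human reviewer confirmed the item actually violated policy.
decisions = [
    {"flagged": True,  "violative": True},   # true positive
    {"flagged": True,  "violative": False},  # false positive
    {"flagged": False, "violative": True},   # false negative (missed harm)
    {"flagged": False, "violative": False},  # true negative
]

tp = sum(d["flagged"] and d["violative"] for d in decisions)
fp = sum(d["flagged"] and not d["violative"] for d in decisions)
fn = sum(not d["flagged"] and d["violative"] for d in decisions)
tn = sum(not d["flagged"] and not d["violative"] for d in decisions)

precision = tp / (tp + fp)  # of everything flagged, how much was truly violative
recall = tp / (tp + fn)     # of everything truly violative, how much was caught
print(tp, fp, fn, tn, precision, recall)
```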

ALL TECH IS HUMAN | 2023


70
Trust & Safety

Key Moments in Trust & Safety

Dismantling Trust and Safety at Twitter: The X (formerly Twitter) Trust and Safety Council was formed in 2016. It consisted of volunteers from several advisory groups that addressed issues like online safety, harassment, human and digital rights, suicide prevention, mental health, child sexual exploitation, and dehumanization. After Elon Musk's acquisition of Twitter in October 2022, many employees responsible for addressing prohibited content and misinformation were laid off. Three key members of the Trust and Safety Council, Eirliani Abdul Rahman, Anne Collier, and Lesley Podesta, resigned in December 2022. They were disappointed in the new leadership's disregard for T&S, including Twitter's move to rely heavily on automated content moderation, which "can only go so far in protecting users from ever-evolving abuse and hate speech before detectable patterns have developed." As Musk advocated for free speech, many have noted a rise in misinformation, disinformation, harassment, and hate speech on the platform. - Net Family News

Reddit Moderator Protests: Reddit's API has been open to developers since 2008. In April 2023, Reddit announced new API terms under which it would begin charging for access. This move was intended to monetize Reddit's data and prevent the platform's content from being used to train large language models (LLMs) for free. Christian Selig, the developer of a popular Reddit client for iOS called Apollo, announced he would shut the app down due to the $20 million cost of keeping it running under the new API terms. Other third-party developers of Reddit clients shut down in June. Thousands of subreddits went dark in protest. Reddit threatened and removed some moderators for restricting access to subreddits in protest, on the grounds that the protests violated the Code of Conduct. The removed moderators were eventually reinstated. - The Verge

Volokh v. James: New York's Online Hate Speech Law was slated to take effect in December 2022. The law required social media networks to develop and publish a policy describing how they will address visitor complaints of hate speech, create a "clear and easily accessible mechanism" for visitors to complain about perceived hate speech on the site, and inform complainants of how the matter is being handled. The law originated, in part, as a response to the 2022 mass shooting in Buffalo, NY. Prior to the shooting, the shooter wrote a manifesto describing himself as an ethno-nationalist and supporter of white supremacy motivated to commit acts of political violence. Eugene Volokh, a First Amendment law professor, along with Rumble (a video platform intended to be a YouTube alternative), filed a complaint in federal court seeking to stop New York's Online Hate Speech Law. The plaintiffs argued the law infringes upon the First Amendment of the US Constitution, which prevents the government from making laws that abridge the freedom of speech. The plaintiffs also declined to endorse the state's definition of what constitutes hate speech or to weigh in on the debate over how hate speech should be defined; instead, they argued users should be able to communicate freely. The court ultimately blocked the law in February 2023, and similar debates on content moderation and free speech persist across the U.S.

The UK's Online Safety Bill: The upcoming bill has several requirements to make tech companies more responsible for content on their platforms, intended to keep online users safe. Requirements include: preventing the spread of illegal content by requiring organizations to remove it as soon as they see it; age-verification processes to access certain websites (e.g. pornography); securing adults from 'legal but harmful' content (e.g. abuse, harassment, self-harm and eating disorders) by removing such content from their platforms; and forcing the biggest platforms to take action against paid-for scam adverts published or hosted on their services. This legislation has received backlash from those who fear freedom of expression and user privacy will be threatened. The content scanning and surveillance required by the bill pose threats to end-to-end encrypted (E2EE) communication services such as WhatsApp and Signal. E2EE is intended to prevent data from being read or modified by anyone other than the sender and recipient, so companies that provide E2EE are unable to hand over the text of their customers' messages to the authorities. Security and privacy researchers argue that "nobody but us" cryptographic backdoors have historically failed and created vulnerabilities for attackers to exploit. WhatsApp and other tech platforms have indicated they may leave the UK if forced to weaken encryption for the bill. They also argue that AI models scanning people's messages for CSAM will likely produce false positives, subjecting innocent users to having their private messages widely viewed and to false accusations of viewing CSAM. - TechRadar
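To illustrate why researchers in the Online Safety Bill debate above worry about false positives, here is a back-of-the-envelope calculation. The figures are assumed and purely illustrative, not taken from the bill or from any vendor; the point is that even a very accurate scanner produces large absolute numbers of wrong flags at messaging scale.

```python
# Assumed, illustrative figures: a large messaging service and a scanner
# that wrongly flags 0.1% of benign messages.
messages_per_day = 100_000_000_000   # roughly 100 billion messages per day
false_positive_rate = 0.001          # 0.1% of benign messages wrongly flagged

wrong_flags_per_day = messages_per_day * false_positive_rate
print(f"{wrong_flags_per_day:,.0f} innocent messages flagged per day")
# Under these assumptions: 100,000,000 innocent messages flagged per day.
```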

ALL TECH IS HUMAN | 2023


71
Trust & Safety

The Way Forward
Change should come in three categories: more transparency and inclusion in key partnership and collaboration efforts in the industry, building technology-neutral and future-proof policy and regulation, and ensuring intergovernmental coordination on regulations applied to global firms that operate in multiple jurisdictions.

Key partnership and collaboration efforts in the industry (and inclusion of youth): The safety and wellbeing of minors and younger audiences is a top priority for platforms as well as regulators, and it is important to give youth a voice while companies are building products and policies around the platforms they interact with. Some examples: the recently announced TikTok Youth Council, Meta's Safety Advisory Board, the Co-design program and Youth Advisors, and YouTube's Youth and Families Advisory Committee, to name a few. We would like to see this become a common practice in the industry, with companies offering knowledge sharing and best practices around the inclusion of youth.

Technology-neutral and future-proof policy and regulation: With the rise of ChatGPT and general buzz around generative AI, there has been little to no consensus on a) whether existing laws and regulations cover this new area and b) how to regulate or create guardrails around usage where coverage is unclear. Generative AI is not the first emerging technology and will not be the last, hence we need more technology-neutral and future-proof guardrails to evaluate and prevent potential harm.

Standardized regulations with minimum-to-no deviation: With the increased focus on efforts to regulate social platforms, specifically within the EU (e.g. the Digital Services Act and Online Safety Bill), trust and safety practices will become more standardized and formalized across the industry. That said, we don't always see coordination between governments in how they approach platforms and their risk assessment and prevention efforts. Given the delicate balance between innovation and ensuring online safety, it is crucial to have good coordination between different governments in their approach to regulating platforms.
ALL TECH IS HUMAN | 2023
72
Trust & Safety

Resources

Glossaries: ActiveFence, Trust and Safety Professional Association, and Digital Trust & Safety Partnership

Frameworks, Guides, and Best Practices: Best Practices Framework, ActiveFence Research Hub, Integrity Institute Research Hub, and Trust & Safety Foundation Case Studies

Events and Conferences

Trust & Safety Forum (Europe): Since 2021, the Trust & Safety Forum (T&SF) has offered a cohesive space open to all stakeholders, from platforms to regulators, inclusive of trusted flaggers and solutions providers, committed to a trusted and safer digital environment today and for the future.

Trust and Safety Research Conference (Stanford, US): Hosted at Stanford University's Frances C. Arrillaga Alumni Center, the Trust and Safety Research Conference convenes participants working on trust and safety issues across academia, industry, civil society, and government. The event brings together a cross-disciplinary group of academics and researchers in fields including computer science, sociology, law, and political science to connect with practitioners and policymakers on challenges and new ideas for studying and addressing online trust and safety issues.

TrustCon (San Francisco, US): TrustCon is the global conference dedicated to trust and safety professionals who are responsible for the challenging work of keeping our platforms and communities safe. TrustCon, the only conference of its kind, is the culmination of TSPA's vision to create and foster a global community of practice among trust and safety professionals.

Africa Internet Governance Forum (Nigeria, Africa): The Africa Internet Governance Forum (AfIGF) is a regional initiative that brings together various stakeholders to discuss and address internet-related issues in Africa. It serves as a platform for governments, civil society organizations, academia, private sector representatives, and technical communities to engage in meaningful dialogue and collaboration on matters concerning Internet governance.

Online Safety Conference (South Pacific/Oceania): Brings together leading online safety experts and practitioners from Aotearoa, Australia, and internationally to share knowledge and insights and to participate in discussion and debate, exploring themes including:
Legislative and policy responses
Diversity and inclusion
Innovative education
Pacific collaboration

RightsCon (location rotates): The world's leading summit on human rights in the digital age.

Check out responsibletechguide.com for more on Trust & Safety!

ALL TECH IS HUMAN | 2023


73
Trust & Safety

Here is a snapshot of recent job postings related to Trust & Safety:

Sr. Director, Trust & Safety Communications at Hinge


Data Scientist, Trust & Safety - Operations at Roblox
Senior Machine Learning Engineer (User Trust & Safety) at Canva
Digital Life Initiative (DLI) Director at Cornell Tech
Program Manager - Sensitive Content at Scale AI
Managing Director at Integrity Institute
Digital Safety Lead Technical Advisor - Principal Group Product
Manager at Microsoft
Exploitative Content Lead at Discord
Sr. Product Manager, Trust & Safety at Vimeo
Product Manager, Safety Experience at Bumble

Visit our Responsible Tech Job Board for the latest listings and freely
join our Responsible Tech Talent Pool to be matched up with potential
opportunities.

Are you an employer looking for Responsible Tech talent? Reach out to
rebekah@alltechishuman.org and learn more here.

ALL TECH IS HUMAN | 2023


74
Tech &
Democracy

FROM OUR TECH & DEMOCRACY


WORKING GROUP CONTRIBUTORS
ALL TECH IS HUMAN | 2023

Overview

The convergence of technology and democracy has reshaped the way societies engage, decide, and operate. From deepfake presidential campaigns in South Korea to the Chilean government's initiative to boost local entrepreneurial activity, tech saturates democracy by shaping political processes, civic engagement, information dissemination, and government operations. This section presents an overview of four main components within tech and democracy: consumer rights, tech policy, digital platforms and political participation, and civic tech.

As digital tools and platforms become widely available to the everyday consumer, governments must balance the line between fostering innovation and protecting consumers from big tech. Meanwhile, the proliferation of digital communication technologies — and more recently, images and text manipulated by machine learning and AI — raises complex issues around the trade-offs between freedom of expression, privacy, and public safety, as well as the notion of truth and authenticity. Yet, digital technologies fuel civic mobilization and citizen journalism, equalizing the playing field for information sharing and commentary. But they have also led to the spread of misinformation and disinformation, as well as surveillance and censorship. Emerging tech is also deeply personal and rooted in cultural context, so governments around the world have created their own guidance on how technology will thrive within their democracies. Charting the dynamic intersection of technology and democracy, this section uncovers the tools, challenges, and opportunities that shape the modern political landscape.

Key Terms and Definitions

Data portability – The ability of individuals to transfer their personal data from one service provider to another, enabling greater control over their data.
75
Tech & Democracy

Digital footprint – The trail of data left behind by a person's online activities, including social media interactions, website visits, and other online actions.

Digital rights management – Technologies or strategies used by content creators or distributors to control the access, usage, and distribution of digital content.

Right to be Forgotten (RTBF) – A legal concept that allows individuals to request the removal of certain online information about them from search engine results and other online platforms.

Surveillance capitalism – A concept in political economics that denotes the widespread collection and commodification of personal data by corporations.

Data governance – The practice of organizing and implementing policies, procedures, and standards for the effective use of an organization's structured and unstructured information assets.

Net neutrality – The idea that network operators shouldn't discriminate against any network traffic based on source, destination, protocol, content, application, or device.

Disinformation – False information that is deliberately intended to mislead.

Echo chamber – An environment where a person only encounters information or opinions that reflect and reinforce their own.

Civic hacking – Collaborative and often grassroots efforts to use technology to address civic issues and create apps, tools, and platforms that benefit communities.

Election integrity – Ensuring the security, accuracy, and fairness of digital voting systems to maintain the trust and legitimacy of electoral processes.

Predictive policing – The use of algorithms, predictive analytics, and other techniques in law enforcement to identify potential criminal suspects and activities.

Techplomacy – A term coined by the Danish government to describe the connection between governments and tech companies.

Key Moments in Tech & Democracy

Africa
Nigeria – The Data Protection Act, 2023: Provides a legal framework for the protection of personal information and establishes the Nigeria Data Protection Commission. The Act aligns with international standards and establishes principles for the processing of personal data, outlining specific requirements for the processing of sensitive and children's data.

South Africa – National Policy on Data and Cloud: Creates guidelines for government bodies to safely utilize cloud services while adhering to the appropriate data privacy and security measures. The policy also encourages universal access to broadband, eliminates regulatory barriers to foster competition in the data and cloud sector, promotes ICT research and development, and creates alignment with the Fourth Industrial Revolution (4IR), the OECD Framework, and EU standards.
ALL TECH IS HUMAN | 2023
76
Tech & Democracy

Uganda – The Income Tax Bill amendment: Inserts a new section that levies a 5% fee on the revenue of foreign providers of digital communications services operating in Uganda. It taxes foreign providers of digital services in Uganda, such as Meta, Twitter, Amazon, or any other foreign-owned company offering digital services, to increase tax collections for the country's burgeoning digital economy.

Asia
China – Management Measures for Generative Artificial Intelligence Services: The Cyberspace Administration of China introduced draft measures listing rules that generative AI services have to follow, including the type of content these products are allowed to generate — within the framework set up by China's national trifecta of data laws: the Cyber Security Law, Data Security Law, and Personal Information Protection Law.

India – The Digital Personal Data Protection Act: Requires companies to get user consent before collecting personal data, allows the government to limit the transfer of data outside India, and penalizes companies for violating the rules.

Singapore – Enabling Service Hubs: A civic tech initiative to strengthen support for persons with disabilities and their caregivers within the community. Offers residents courses on daily living and digital skills.

Europe
EU – The Digital Services Act (DSA): Established to better protect consumers and their rights online, establish transparency and a clear accountability framework for online platforms, and foster innovation, growth, and competitiveness within the single market. By 17 February 2024, online platforms and search engines will be required to publish the number of monthly average users in the EU.

EU – The Digital Markets Act (DMA): Establishes specific criteria for qualifying a large online platform as a "gatekeeper". The DMA applies from 2 May 2023; by 3 July 2023, gatekeepers needed to notify their "core platform services" to the Commission.

EU – The Artificial Intelligence Act: A proposed EU regulation targeted at regulating AI systems in the EU; it aims to maintain trust in AI systems and to create an ecosystem of excellence for AI.

South America
Brazil – The Fake News Law (Bill 2630): Requires internet companies, search engines, and social messaging services to find and report illegal material. Recently, Brazil's government and judiciary objected to big tech firms campaigning against the bill, alleging undue interference in the debate in Congress.
ALL TECH IS HUMAN | 2023
77
Tech & Democracy

Chile – "Chile takes first steps towards AI regulation": The Chilean parliament is engaging in discussions of a proposed bill that would address legal and ethical considerations in AI development and usage, aiming to strike a balance between protecting citizens' rights and promoting the accessibility and advancement of these technologies.

Costa Rica – "Lawmakers use ChatGPT to draft AI regulation bill": Costa Rican legislators asked ChatGPT to draft legislation aimed at governing AI systems within the country. The generated bill advocates for the establishment of a dedicated institution responsible for overseeing AI regulation, guided by principles such as accountability, explainability, bias prevention, and safeguarding human rights.

North America
Canada – Digital Services Tax Act: The Digital Services Tax Act would impose a 3% tax on revenue for large tech companies and online marketplaces, companies like Walmart, Amazon, and Meta.

US – The National AI Commission Act (H.R.4223): Introduced to establish an artificial intelligence commission and for other purposes.

US – The AI Disclosure Act of 2023 (H.R.3831): Would require that any content produced by AI contain the phrase: "DISCLAIMER: this output has been generated by artificial intelligence."

US – The REAL Political Advertisements Act (S.1596): Provides further transparency and accountability for the use of AI-generated content in political advertisements by requiring a disclaimer that AI was used.

US – The Digital Platform Commission Act of 2023 (S.1671): Establishes a commission to regulate digital platforms.

Tech & Democracy Information Hubs

Tech Policy
Data & Society Research Library
MIT Internet Policy Research Initiative Research – Produces policy research in a variety of technical fields, including cybersecurity, AI policy, privacy, advanced network architectures, decentralized web, and app development.
AI Ethicist – A global repository of reference and research material for research on AI ethics, responsible governance, and social impacts of AI.

Digital Platforms and Political Participation
Center for an Informed Public at the University of Washington Resources – Research, workshops, and talks on misinformation and disinformation.
UNC Center for Information, Technology, and Public Life Research – Information on the Political and Civic Applications Division (PCAD), which develops software to support research into information environments; critical disinformation studies; and resources tracking how platform policies, state laws, and ethics shape campaign communications.

ALL TECH IS HUMAN | 2023
78
Tech & Democracy

Harvard Shorenstein Center on Media, Politics, & Public Policy
HKS Misinformation Review
Media Manipulation Casebook

Civic Tech
Center for Civic Design Tools – Tools and resources designed through experiences working with election offices across the US. They are free to use and adapt; election officials should check state law to see if they are able to use them.
Code for America Brigade
Code For All's international working groups
Open Data Handbook – Guides, case studies, and resources for government & civil society on the "what, why & how" of open data.

Consumer Rights
Consumers International, Digital Rights – A global resource for policy-makers, regulators, the tech industry, and consumers.
Deceptive Design Hall of Shame – Hundreds of examples of deceptive patterns used by companies around the world.
World Bank Digital Regulation Platform, Consumer Affairs

Insights from the Tech & Democracy Report

"I was opened up to my work in tech policy upon joining Pollicy after the completion of my fellowship program and then becoming a data and digital rights researcher there. I have since co-led Pollicy's AI work. My advice to individuals looking in this field of work — especially young people across the African continent — would be to interest themselves in fellowships and other such career-shaping programs by organizations in the space both on the continent and elsewhere." – Bobina Zulfa, Digital Rights Researcher at Pollicy

"Data protection issues are now squarely societal and human rights issues. There is a societal impact on every sector that relies on data, affecting the future of healthcare, transportation, and marketing – the list goes on. Many of these impacts will extend to the future of free speech and, ultimately, our democracy." – Jules Polonetsky, CEO, Future of Privacy Forum

"I built my career by simply doing three things, which I like to call the 3C framework: Consume, Create and Collaborate. When I was just starting out, my first line of action was to consume as much content as I could about tech policy. As you consume more content, you begin to identify gaps and ignite a burning desire to fill those gaps with your own content. After consuming and creating, you will naturally begin receiving collaboration requests — which helps to broaden your reach, letting more people know about you and what you do." – Faith Obafemi, Data Protection and Privacy Writer, Captain Compliance

Check out responsibletechguide.com for more on Tech & Democracy.
ALL TECH IS HUMAN | 2023
79
Tech & Democracy

Here is a snapshot of recent job postings related to Tech &


Democracy:

Research Analyst, Technology and International Affairs Program at


Carnegie Endowment for International Peace
Digital Democracy Program Officer, Arlington at International
Foundation for Electoral Systems
Research Director, Transformative Technologies and Governance at
Centre for International Governance Innovation (CIGI)
Data Analyst at Protect Democracy
Senior Digital Democracy Program Officer, Arlington at International
Foundation for Electoral Systems
Director, Technology and Democracy at National Endowment for
Democracy
Assistant Professor - Technology Policy, Governance, and Society at
Goldman School of Public Policy / School of Information, University
of California, Berkeley
Senior Technologist / Senior Policy Analyst / Senior Counsel,
Elections & Democracy at Center for Democracy & Technology
(CDT)
Director, Independent Media and Information Space at National
Endowment for Democracy (NED)

Visit our Responsible Tech Job Board for the latest listings and freely
join our Responsible Tech Talent Pool to be matched up with potential
opportunities.

Are you an employer looking for Responsible Tech talent? Reach out to
rebekah@alltechishuman.org and learn more here.

ALL TECH IS HUMAN | 2023


80
Public Interest
Tech

FROM OUR PUBLIC INTEREST TECH


WORKING GROUP CONTRIBUTORS
ALL TECH IS HUMAN | 2023

Overview

Many people have created definitions for what public interest technology (PIT) is, and most definitions agree on this idea: PIT is technology created for the public good, rather than for individual or commercial gain. Several definitions of PIT also emphasize that PIT should aim for equity, to ensure that PIT is inclusive and accessible to all.

It is not just technologists who can be involved in PIT, but also lawyers, government workers, nonprofit workers, activists, scientists, policymakers, and any others who can provide a perspective that will ensure that PIT is better shaped towards the public good. As such, the PIT ecosystem is made up of a mix of public and private stakeholders.

Similar to PIT are civic technology and government technology. Civic technology is often regarded as technology that governments and citizens use to communicate with each other. Government technology is often regarded as any technology where governments are the intended users. PIT, civic technology, and government technology can overlap, but in our definition, PIT does not necessarily have to overlap with either. Any technology that serves the public interest could be considered PIT, no matter who is creating it or what type of problem it is trying to solve. For example, PIT creators, supporters, and service providers may be corporations, startups, non-profits, design firms, public benefit corporations, technology consultancies, teams in government, and so on.

Problem and opportunity domains in the public interest may include improving the quality of, and access to, education, healthcare, public programs and services, civic participation, a healthy and safe environment, digital privacy, digital equity, and more.
81
Public Interest Tech

Key Terms and Definitions

Accessibility: Accessibility is about ensuring that digital technology is usable by people with disabilities. Checklists, standards, and laws are important tools to help achieve accessibility — yet sometimes they get the focus instead of the fundamental goal of accessibility: meeting the needs of disabled people in the real world. Accessibility is an important aspect of diversity, equity, and inclusion (DEI).

Anti-racist Technology: Structural racism is a system in which public policies, institutional practices, cultural representations, and other norms work in mutually reinforcing ways to perpetuate racial group inequity. Anti-racist technology is designed to combat structural racism and mitigate the harms (current and inherited) it causes and the access, opportunities, and rights it denies, with this built in as part of the design process; it also actively seeks to generate racial equity as part of its design.

Assistive Technology: Assistive technology is technology used by individuals with disabilities in order to perform or improve functions that might otherwise be difficult or impossible. It can include mobility devices such as walkers and wheelchairs, as well as hardware, software, and peripherals that assist people with disabilities in accessing computers or other information technologies.

Civic Tech: Civic Tech is technology that enables greater participation in government or otherwise assists the government in delivering citizen services and strengthening ties with the public.

Consentful Technology: Consentful technologies are digital applications and spaces that are built with consent at their core, and that support the self-determination of people who use and are affected by these technologies.

Cybersecurity (also Public Interest Cybersecurity, Cyber Civil Defense): Ensures confidentiality, integrity, and availability of information, and reduces the risk of cyberattacks. When applied to public interest organizations such as hospitals, city governments, and non-profits, which serve the public and typically lack the capacity to defend against cyber criminals or politically motivated attacks, we get Public Interest Cybersecurity or Cyber Civil Defense.

Deceptive Design Patterns: Deceptive design patterns are tricks used by websites and apps to get you to do things that you didn't mean to, or that you might not otherwise do, like buy things, sign up for services, or switch your settings.

GovTech: GovTech is the technology used to deliver public sector services, as well as the processes involved in modernizing them (aka digital transformation), with an emphasis on citizen-centric, universally accessible public services and a whole-of-government approach to digital government transformation.

ALL TECH IS HUMAN | 2023


82
Public Interest Tech

Human-Centered Design: Human-centered design is a practice where designers focus on four key aspects. They focus on people and their context. They seek to understand and solve the right problems, the root problems. They understand that everything is a complex system with interconnected parts. Finally, they do small interventions: they continually prototype, test, and refine their products and services to ensure that their solutions truly meet the needs of the people they focus on. Cognitive science and user experience expert Don Norman sees it as a step above user-centered design.

Inclusive Design: Inclusive design describes methodologies to create products that understand and enable people of all backgrounds and abilities. Inclusive design may address accessibility, age, culture, economic situation, education, gender, geographic location, language, and race. The focus is on fulfilling as many user needs as possible, not just as many users as possible.

Public Interest Technology: Public Interest Technology (PIT) is a broad and emergent field that is synonymous with Responsible Tech. Many people have created definitions for what PIT is, and most definitions agree on this idea: PIT is technology created for the public good, rather than for individual or commercial gain. Several definitions of PIT also emphasize that PIT should aim for equity, to ensure that PIT is inclusive and accessible to all.

Web Accessibility: Web accessibility means that websites, tools, and technologies are designed and developed so that people with disabilities can use them.

Key Moments in Public Interest Tech

The New York State government has recently signed the Digital Fair Repair Act into law. This made New York the first state in the U.S. to guarantee people the right to repair their digital devices, protecting consumers from anticompetitive efforts to limit repair. (See Governor Hochul Signs the Digital Fair Repair Act Into Law.)

Some PIT stakeholders in the U.S. are optimistic about the future of civic tech. This is due to a number of factors: recently laid-off private sector tech workers showing great enthusiasm for public sector tech jobs; governments making improvements in building up their technical capacity; and governments becoming more human-centered in their approach to technology. (See Why 2023 could be a year for civic-tech optimism and To Build A Better Internet, Put Laid Off Tech Workers Back to Work in the Public Interest.)

Some of the most pressing ethical issues in technology today are: misuse of personal information, misinformation and deepfakes, lack of oversight and acceptance of responsibility, use of AI, and autonomous technology. (See 5 Ethical Issues in Technology to Watch for in 2023.) PIT is not immune to these ethical issues. For example, when governments rely on technology created and maintained by external consultancies, it becomes more difficult to ensure that citizens' personal data is kept private and secure (representing the ethical issue of lack of oversight and acceptance of responsibility).

ALL TECH IS HUMAN | 2023


83
Public Interest Tech

Consumer Reports is creating an app called Permission Slip, which will provide people more control over how for-profit entities use their consumer data.

Organizations like TechCongress and Presidential Innovation Fellows are helping influence tech policy and government technology by placing technologists as fellows in the offices of federal policymakers and government agencies.

The Way Forward

Build up internal technical capacity within governments so that governments do not need to rely on external experts and piecemeal projects to improve their technology. One way to make this a reality is by ensuring that pay for these government roles is competitive with similar roles in the private sector. (See In Public Service, Technology Is Only as Good or Bad as We Are.)

Educate students from the broad range of fields that contribute to PIT to be prepared to think and work in PIT. An existing example is the Public Interest Technology University Network (PIT-UN).

When creating a new policy or piece of legislation, think through how that policy or legislation will play out, down to the very end user. Ensure there is a real plan for funding and implementation, as well as feedback loops for collecting data and adjusting course according to that data. (See In Public Service, Technology Is Only as Good or Bad as We Are.)

Get the broader public informed and involved in discussions on tech policy, to ensure that decisions truly reflect the public interest.

Incorporate AI with caution. Continuously educate ourselves about what AI can and cannot do. When AI is used, monitor for errors.

Check out responsibletechguide.com for more on Public Interest Tech.

ALL TECH IS HUMAN | 2023


84
Public Interest Tech

Here is a snapshot of recent job postings related to Public Interest


Tech:

Full-Stack Software Engineer at State of New Jersey Office of


Innovation
Chief Data Officer at City and County of Honolulu
Director, Digital Experience at Franklin County, OH
Program Officer, Technology in the Public Interest at MacArthur
Foundation
Chief Information Technology Officer at New York County District
Attorney’s Office
Digital Equity Officer and Director of Broadband and Cable at City of
Boston
Supervisory Digital Services Manager (12-Month Register) at Internal
Revenue Service, U.S. Department of the Treasury
Digital Service Expert at State of Colorado, Governor's Office of
Information Technology
Digital Service Director at State of Arizona
Deputy Chief Digital and Artificial Intelligence Officer for Business
Analytics at Office of the Secretary of Defense, U.S. Department of
Defense

Visit our Responsible Tech Job Board for the latest listings and freely
join our Responsible Tech Talent Pool to be matched up with potential
opportunities.

Are you an employer looking for Responsible Tech talent? Reach out to
rebekah@alltechishuman.org and learn more here.

ALL TECH IS HUMAN | 2023


85
Youth, Tech, &
Wellbeing

FROM OUR YOUTH, TECH, &


WELLBEING CONTRIBUTORS
ALL TECH IS HUMAN | 2023

Overview

Over the past few years, our relationship with technology has undergone a transformation that distinctively impacts youth and well-being. The COVID-19 pandemic and the Black Lives Matter protests that coincided with it brought our attention to the impact of our relationship with technology on our mental health. Although young people today are digital natives, the past few years highlighted how their lack of power, rights, and autonomy in the design process of digital technologies and spaces has placed them in a vulnerable position.

Caregivers, technologists, and policymakers who have recognized young people's vulnerability have begun the process of transforming digital technologies and spaces so that they are intentionally designed with youth and well-being at the forefront of the design process. Although these efforts are impactful, the lack of youth voices in the process of identifying a problem and developing a solution indicates a gap that needs to be filled within the responsible tech ecosystem. All Tech Is Human aims to address this gap by highlighting the need for, and power of, multiple stakeholders, disciplines, and perspectives.

For this section of the report, the contributors wish to offer a holistic lens on the perspectives of today's youth in their relationship to tech. Without initiating the feelings of doom and gloom that some resources may project, we want to foster a grounded yet actionable hope that motivates change, while allowing a chance to see how tech can be integrated into the lives of youth in better and healthier ways. We want to show that youth have a voice and autonomy over their wellbeing and how it relates to tech.

ALL TECH IS HUMAN | 2023


86
Youth, Tech, &
Wellbeing

Key Terms and Definitions

Youth: The period between childhood and adulthood; the age group is also referred to as "young adult." For the purposes of the Responsible Tech Guide, we define "youth" as between the ages of 18 and 25.

Digital Natives: Youth born into generations already immersed in technology from an early age. These individuals think and learn in the context of a hyper-connected world. Understanding how to navigate this digital landscape is crucial for their personal and professional development. (See John Palfrey and Urs Gasser, Born Digital: Understanding the First Generation of Digital Natives.)

Emerging Technology: New or evolving technologies, and the process of continued development of existing technologies. This also includes Artificial Intelligence (AI): Artificial Narrow Intelligence (ANI), Artificial General Intelligence (AGI), and Artificial Super Intelligence (ASI). More on the types of AI at mygreatlearning.com.

Social Media: A digital technology that facilitates the sharing of text and multimedia through virtual networks and communities. It connects friends, families, and even businesses, which use it to market, promote, and track customers. More than 4.7 billion people around the world use social media.

Wellbeing: While there is no global consensus on a single definition of well-being, a working definition includes having a fulfilled, abundant, and healthy life. It may also include striving towards positive emotions such as happiness and contentment, as well as having autonomy, purpose, and meaning in one's life.

Key Moments in Youth, Tech, & Wellbeing

Youth, tech, and wellbeing are interconnected because technology plays a significant role in young people's lives. Equipping digital natives with the skills, knowledge, and tools to use technology responsibly and for their personal growth contributes to their overall wellbeing and prepares them for success in the digital age. Digital technologies provide opportunities for communication and social interaction. Social media, messaging apps, and online communities enable young people to connect with peers across the globe. However, using technology mindfully is important to prevent negative impacts on mental health and social relationships. While technology can offer numerous benefits, it also brings challenges, such as increased screen time, cyberbullying, and the pressure to curate a perfect online image. Teaching youth about responsible technology use is vital for maintaining their mental health and wellbeing. It also opens doors to various opportunities, such as online learning, remote work, and entrepreneurship, which can positively impact the future prospects of young individuals.

ALL TECH IS HUMAN | 2023


87
Youth, Tech, &
Wellbeing

Youth are empowered to advocate for their digital rights, and government officials across the world are more resolved to increase digital well-being for youth. Illinois passed the first US law aimed at protecting child influencers, which will "entitle influencers under the age of 16 to a percentage of earnings based on how often they appear on video blogs or online content," discouraging youth exploitation (Teen Vogue).

However, the pursuit of digital well-being for youth has made progress with differing outcomes. In 2023, the US Congress reintroduced the Kids Online Safety Act ("KOSA") as a measure to protect children online by increasing monitoring and limiting access to sensitive information, after revisions proposed by over 100 civil organizations. However, advocates for online safety continue to highlight potential dangers. The Electronic Frontier Foundation describes KOSA as putting the "tools of censorship in the hands of state attorneys general," warning it "would greatly endanger the rights, and safety, of young people online" (EFF). Also see: the Children's Online Privacy Protection Rule ("COPPA"), which protects children's privacy by giving parents tools to control what information is collected from their children online (FTC). The "European strategy" to foster a safe environment for youth is Better Internet for Kids (BIK+), which aims to "improve age-appropriate digital services and to ensure that every child is protected, empowered and respected online" and incorporates the European Parliament Resolution on children's rights.

There still needs to be a human-centered, people-led approach that considers the safety and rights of all users, including youth. In fact, youth are taking the lead in their own advocacy with organizations like Design It For Us, a US-based "coalition of young activists and organizations fighting for safer social media and online platforms for kids, teens, and young adults" (Design It For Us). In Africa, the African Union Youth Envoy uses the Google Digital Skills Campaigns to encourage Africa's growing youth population who want to position themselves to benefit from Africa's digital revolution and establish a strong digital economy, "part of the larger African Union's Digital Transformation campaign, which seeks to reach 100,000 young people with digital skills for the creation of jobs by 2024 through a country acceleration strategy across the African continent" (Unlocking Africa's Potential). There are a number of initiatives by youth-led and established institutions focused on youth digital well-being.

ALL TECH IS HUMAN | 2023


88
Youth, Tech, &
Wellbeing

The Way Forward

EdTech: Technology is a fundamental part of modern education. Equipping young people with tech skills prepares them for the job market of the future, where digital literacy is essential. Familiarity with technology also enhances their problem-solving, critical thinking, and creativity skills. Technology offers tools and platforms for creative expression. Young people can explore various forms of digital art, music production, video creation, and more. Encouraging their creativity in these digital mediums can foster innovation and self-expression.

Digital Literacy: By fostering tech literacy among youth, we empower them to effectively use technology to be active participants in shaping their digital experiences, and ultimately to lead discussions about digital ethics, privacy, cybersecurity, and the social implications of emerging technologies. Youth may become creators of content, advocates for positive online communities, and contributors to the digital landscape in meaningful ways. Learning how to analyze information, evaluate sources, and make informed decisions in the digital age is crucial for youth to navigate a rapidly changing world. Understanding technology and its impact on society helps young people become responsible global citizens.

Advocacy: The decisions made in the tech industry today will have far-reaching consequences for the future. Youth advocacy ensures that the voices and concerns of future generations are heard and considered in policy-making and technology development. Through youth advocacy efforts, we harness the unique perspectives, digital fluency, and passion of young people to drive positive change in the tech industry, ensuring that technology serves the best interests of society as a whole.

Bridging Generational Gaps: Fostering meaningful intergenerational collaboration in responsible tech can help bridge the gap in tech understanding and adoption. Young people can facilitate communication between older generations and younger ones, fostering collaboration and knowledge sharing.

Civic Engagement: Youth advocacy on tech issues is needed to encourage young people to engage in civic activities and become informed and active participants in shaping government policies related to technology.

Diversity, Equity, Inclusion, and Belonging: Involving new voices in responsible tech promotes diversity and inclusivity in the tech industry, which has historically lacked representation from underrepresented groups. Advocacy efforts can help ensure that technology is developed with a broader range of perspectives and experiences in mind.

Sustainability and Innovation: Involving a wide range of perspectives in the development and deployment of technologies and policies can serve as a catalyst for innovation on emerging challenges. With tech's significant impact on the environment, youth advocacy should raise awareness about sustainable practices and alternatives, and advocate for ethically minded solutions.

ALL TECH IS HUMAN | 2023


89
Youth, Tech, &
Wellbeing

Resources

Organizations and Initiatives
Connected Wellbeing Initiative
Design It For Us
Erasmus Student Network
Social Media and Youth Mental Health: An Event to Confront the Moral Panic, a collaboration between Stanford Medicine's Center for Youth Mental Health and Wellbeing, the d.school at Stanford, and GoodforMEdia.
Headstream Innovation Festival, a youth-focused accelerator (of Second Muse) on the theme of digital solutions to wellness for youth.
Inspired Internet Pledge, a cross-sector initiative of the Digital Wellness Lab and various social media companies.
Responsible Technology Youth Power Fund
Safer Internet Day, a collaboration amongst organizations coordinated by UK Safer Internet Centre
Stop Non-Consensual Intimate Image Abuse | StopNCII.org
Thriving Youth in a Digital Age
UN Internet Governance Forum: Youth Initiatives
Youth Tech Health

Publications and Media
Algorithmic Rights and Protections for Children, edited by Mizuko Ito, Remy Cross, Karthik Dinakar, and Candice Odgers: Essays on the challenges and risks of designing algorithms and platforms for children, with an emphasis on algorithmic justice, learning, and equity.
AI 101 for Teachers, a collaboration of online videos about AI in education by code.org, ETS, ISTE, and Khan Academy.
Ask the Experts Webinar Series, by Children and Screens: Institute of Digital Media and Child Development.
EdSurge Podcast, with episodes on AI in education.
Connection, Creativity and Drama: Teen Life on Social Media in 2022, Pew Research Center
Disrupted Childhood: The Cost of Persuasive Design, 5Rights Foundation
Swimming with Sharks and Walking on Mars: Synthesis of a Cross-Sector Forum on Immersive Technology in Secondary Education, Joan Ganz Cooney Center at Sesame Workshop
Gen Z in the Room: Making Public Media By and With Youth for the Future, Joan Ganz Cooney Center at Sesame Workshop
2023 State of Kids' Privacy, Common Sense Media
Fostering An Inclusive And Technology Responsive Education For Youth Living With Disabilities In Africa, African Union Development Agency

Check out responsibletechguide.com for more on Youth, Tech, & Wellbeing.

ALL TECH IS HUMAN | 2023


90
Youth, Tech, &
Wellbeing

Here is a snapshot of recent job postings related to Youth, Tech &


Wellbeing:

Child Safety Researcher at ActiveFence


Data Analytics Intern at Institute for Youth in Policy
Global Issue Policy Lead - Youth Safety & Wellbeing - Trust & Safety
at TikTok
allcove Data Systems Manager at Psychiatry Center for Youth Mental
Health & Wellbeing, Stanford University
Remote Student Contractor, Youth & Education Privacy at Future of
Privacy Forum
Senior Data Scientist - Child Safety at Roblox
Senior Policy Manager at Thorn
Game Design Manager for Empowerment and Well-being at Lego
Program Manager, EdTech & Digital Health at SecondMuse
UX Research Manager for Youth Initiatives at YouTube

Visit our Responsible Tech Job Board for the latest listings and freely
join our Responsible Tech Talent Pool to be matched up with potential
opportunities.

Are you an employer looking for Responsible Tech talent? Reach out to
rebekah@alltechishuman.org and learn more here.

ALL TECH IS HUMAN | 2023


91
CONTRIBUTORS
Abhinav Mittal; Responsible AI Content Strategy, Narrative, and RTG Editor; LinkedIn; Website
Aditi Peyush; Tech & Democracy, Youth, Tech & Wellbeing; LinkedIn; Website
Alexandra Jacoby; Public Interest Technology; LinkedIn; Website
Alison Chai; Tech & Democracy, Responsible AI
Alison Kim; Youth, Tech & Wellbeing, Responsible AI
Amari Cowan; Responsible AI
Amber Hunt; Responsible AI; LinkedIn
Andrea Ramos; Tech & Democracy, Responsible AI
Andy McAdams; Responsible AI, RTG Editor; LinkedIn, Website
Angela Garabet; Responsible AI
Anisa Bora; Youth, Tech & Wellbeing
Anjali Mehta; Responsible AI
Anugya Srivastava; Responsible AI, LinkedIn
April Yoder; Tech & Democracy; LinkedIn
Ayse Kok Arslan; Public Interest Technology, Responsible AI
Bruce Strauss; Youth, Tech & Wellbeing
Chandu Avni; Public Interest Technology, Youth, Tech & Wellbeing, Trust & Safety Researcher and Writer; LinkedIn; Website
Claire Weingarten; Responsible AI, LinkedIn
Conrad Schwellnus; Youth, Tech & Wellbeing; LinkedIn
Danielle Lim; Responsible AI; LinkedIn; Website
Daphne Saavedra; Tech & Democracy
Darshita Chaturvedi; Responsible AI
Dominique Greene-Sanders; Responsible AI; LinkedIn
Dorcas Nyamwaya; Responsible AI
Ece Ozkan; Trust & Safety Researcher and Writer; LinkedIn
Elijah Uwakarhiomien Otor; Responsible AI
Elizabeth Kellie Aguado; Responsible AI; LinkedIn
Elizabeth Rood; Youth, Tech & Wellbeing; LinkedIn
Emily Lippolis; UX Design; LinkedIn; Website
Enrique Planells-Artigot; Responsible AI, Tech & Democracy, Youth, Tech & Wellbeing
Evelina Ayrapetyan; Tech & Democracy
Firuza Huseynova; Youth, Tech & Wellbeing
Gigi Kenneth; Youth, Tech & Wellbeing; Responsible AI
Grace Volante; Responsible AI, Youth, Tech & Wellbeing; LinkedIn
Halima Khatun; Tech & Democracy
Hana Gabrielle Bidon; Youth, Tech & Wellbeing; Responsible AI; LinkedIn, Website
Hannah Smith; Responsible AI

ALL TECH IS HUMAN | 2023


92
CONTRIBUTORS
Heather Waugh; Responsible AI
Hilary Dockray; Public Interest Technology, UX Design; LinkedIn; Website
Ingrid Woods; Responsible AI, Tech & Democracy
Jacob Hagelberg; Responsible AI
James Gresham; Trust & Safety
Jeremiah Azurin; Tech & Democracy; Responsible AI
Jillian Drummond; Youth, Tech & Wellbeing; Responsible AI
Julie Lee; Responsible AI
Katleho Mokoena; Responsible AI; LinkedIn
Kendrea Beers; Responsible AI
Kim Fernandes; Responsible AI; LinkedIn
Kimberly Wright; Public Interest Technology, Responsible AI, UX Design; LinkedIn
Kwynn Gonzalez-Pons; Youth, Tech & Wellbeing; Responsible AI
Lama Mohammed; Responsible AI, Tech & Democracy; LinkedIn
Laura Gacho; Tech & Democracy, Responsible AI; LinkedIn
Lauran Howard; Youth, Tech & Wellbeing, Responsible AI; LinkedIn
Lavina Ramkissoon; Responsible AI
Leah Farrar; Tech & Democracy, Responsible AI; Editor
Lindsey Washburn; Responsible AI; LinkedIn
Lisa D. Dance; UX Design; LinkedIn; Website
Liz Oh; Responsible AI
Lyn Muldrow; Youth, Tech & Wellbeing, Responsible AI
Maira Elahi; Responsible AI, Tech & Democracy, Youth, Tech & Wellbeing; LinkedIn; Website
Mari Cairns; Responsible AI; LinkedIn
Maria Filippelli; Tech & Democracy
María Picado; Tech & Democracy; Responsible AI
Marie Roker-Jones; Youth, Tech & Wellbeing; Responsible AI
Marisa Zalabak; Responsible AI, Youth, Tech & Wellbeing; LinkedIn; Website
Martina Howard; Public Interest Technology, Responsible AI, UX Design, Visual Design; LinkedIn
Matt Rosenbaum; Tech & Democracy; LinkedIn
Mia Casesa; Responsible AI
Michelle Mol; Responsible AI
Mohsen Monji; Responsible AI
Natalia Kucirkova; Youth, Tech & Wellbeing; LinkedIn
Nicola Brown; Youth, Tech & Wellbeing
Nikki Love Kingman; Public Interest Technology, Tech & Democracy
Patricia Liebesny Broilo; Youth, Tech & Wellbeing; LinkedIn
Priscilla Wahome; Youth, Tech & Wellbeing
ALL TECH IS HUMAN | 2023
93
CONTRIBUTORS
Raashee Gupta Erry; Responsible AI
Rebecca Scott Thein; Tech & Democracy; LinkedIn
Renata Mares; Tech & Democracy; Responsible AI, Public Interest Technology, Trust & Safety; LinkedIn
Riitta Mettomäki; Trust & Safety; LinkedIn
Rosalyn Bejrsuwana; Responsible AI, Tech & Democracy; LinkedIn; Website
Ryan Bell; Youth, Tech & Wellbeing
Ryan Lee; Tech & Democracy
Sachi Bafna; Youth, Tech & Wellbeing
Sara Heaser; Youth, Tech & Wellbeing
Sarah D'Andrea; Responsible AI
Shivani Rao; Youth, Tech & Wellbeing
Shruti Vellanki; Responsible AI
Soumya Khedkar; Responsible AI
Sree Lathika; Responsible AI
Susmitha Tutta; Tech & Democracy; Responsible AI
Tracy Kadessa; Responsible AI
Urba Mandrekar; Tech & Democracy; Youth, Tech & Wellbeing; Responsible AI
Verena Bryan; Responsible AI; LinkedIn
Yee Carter; Youth, Tech & Wellbeing; Responsible AI
Yfat Barak-Cheney; Responsible AI
Zach Deocadiz; Youth, Tech & Wellbeing, UX Design; LinkedIn
ALL TECH IS HUMAN | 2023
94
Learning from the Responsible Tech community

ALL TECH IS HUMAN | 2023

Panel Highlight:
How To Build A Career in
Responsible Tech
All Tech Is Human recently held a panel discussion featuring Danielle
Sutton, Kristina Francis, Ginny Fahs, and Flynn Coleman, moderated by
Executive Director Rebekah Tweed. This was part of our Responsible
Tech Mixer and Speaker Series, which brings together 200 people
each month in NYC to build community. This panel was held at
Betaworks on July 26, 2023.

The panel focused on the ways people can build a career in responsible technology, covering career pathways, the importance of bringing multiple backgrounds and perspectives into the field, and why responsible technology is one of the greatest civil rights issues of our time. Click here to watch the full video.

In the coming pages, you will find high-level overviews of each panelist
and key insights from the discussion. You can see all the videos from
our series here.

95
Danielle Sutton
Senior Consultant at Deloitte and Trustworthy AI Strategist

“I think one of the big things that I've realized very early on was that responsible tech is truly one of the greatest social justice issues of our time and it is also one of the greatest market opportunities. And that rarely happens honestly, those, this convergence of events and so I found it really exciting in my own journey to be able to explore and navigate that space.” -Danielle Sutton

Danielle Sutton is a 5th generation Harlemite and a Senior Consultant at Deloitte in their Government and Public Services Strategy & Analytics Practice, where she works as a Trustworthy AI Strategist. She has been with the firm for 4 years, focusing her work on the intersection of Trustworthy AI and criminal justice.

ALL TECH IS HUMAN | 2023


96
Flynn Coleman
Writer, International Human Rights Attorney

What kind of impact do you ultimately hope to make in your career?

“A lot is this idea that ultimately we can't take any of what we do just for ourselves with us, right? What we do for ourselves dies with us…We live very much in an individualist culture. And that can bring incredible innovation and incredible independence. But thinking about this collective community, legacy I think is incredibly important. And it also takes the pressure off having to do all of these things for ourselves because what we do for ourselves dies with us. What could we give away? What could we take and then bring back to our communities?” -Flynn Coleman

Flynn Coleman is an author, an international human rights attorney, an environmental advocate, a Fellow at Harvard and Yale, and a professor. Flynn is the Fernand Braudel Senior Fellow in the Department of Law at the European University Institute in Florence, Italy, and is also a Visiting Researcher at the University of Copenhagen in the Law Faculty. She has been named a Technology & Human Rights Fellow at the Harvard Kennedy School of Government and the Carr Center for Human Rights Policy.

ALL TECH IS HUMAN | 2023


97
Kristina Francis
Executive Director, JFFLabs

What kind of impact do you ultimately hope to make in your career?

“I tell people a lot of times that my mission in life is to make sure that people can live in their genius. I say it all the time, and so I said that even before I went to JFF. And when we talk about walking through the door and allowing life to kind of bring you to the area that allows you to live your mission. That's what JFF allows me to do. And for me it is making sure that everyone in this room, all the people in different communities, those who are suffering, that we can come together as a people, as a community, as a humanity, and make sure that people can live in the way that gives them dignity and allows them to give the gifts and talents to the world.” -Kristina Francis

Kristina Francis is the Executive Director of JFFLabs. In this role, she oversees advisory, acceleration, data, and investing initiatives that connect traditional systems with systems disruptors to enable equitable economic mobility. Kristina has more than 20 years of experience in corporate operations and entrepreneurial ventures focused on management consulting, business development, software and data integration, and impact investing competencies.

ALL TECH IS HUMAN | 2023


98
Ginny Fahs
Director, Product R&D - Innovation Lab

What advice or tips do you have, especially from your own perspective, your own little slice of the responsible tech ecosystem?

“I would say for folks who are transitioning into responsible technology for the first time there are ways to run small scale experiments. To see what sorts of roles are gonna be a good fit for you. Particularly if you're interested in working for a nonprofit. Lots of nonprofits and governments, too, have kind of smaller consulting engagements where you're able to embed with a team, try on what it looks like to work in this way and, and better understand the environment you'd be working for and with. And that's been really crucial for me in my career as well. Most of my big career transitions have started with some sort of consulting contract or internship or apprenticeship or way to try on the role. And I feel like that's really helped clarify decision making to get, to experience a day in the life myself, see the team I'll get to be working with, and make calls about where to move based on the people, the project, the mission, and just how it feels day to day.” -Ginny Fahs

Ginny Fahs is the Director, Product R&D - Innovation Lab at Consumer Reports. Prior to joining Consumer Reports, Ginny was a software engineer at Uber and a Technology Policy Fellow at the Aspen Institute. At Aspen, she focused on cybersecurity for the elderly and contributed research and design prototypes that are currently being adopted by U.S. government agencies including the Federal Bureau of Investigation, Federal Trade Commission, and Department of Homeland Security. She also co-founded #MovingForward, a nonprofit social enterprise that fights harassment and discrimination in startup investing. Ginny holds a Bachelor of Arts in History & Literature from Harvard University and a Master’s in Business Administration from INSEAD, a global business school with campuses in Europe and Asia.


ALL TECH IS HUMAN | 2023
99
Learning from the Responsible Tech community

ALL TECH IS HUMAN | 2023

Panel Highlight:
Technology Is
Infrastructure
All Tech Is Human recently held a panel discussion featuring Dr. Saima
Akhtar, Matt Mitchell, Claire Liu Yang, and Lyel Resner, moderated by
Program Associate Elisa Fox. This was part of our Responsible Tech
Mixer and Speaker Series, which brings together 200 people each
month in NYC to build community. This panel was held at Betaworks
on August 24, 2023.

The panel centered on the need for innovation in workforce development in order to recruit a wide range of disciplines and voices and foster a more equitable tech ecosystem. It also explored why a diverse range of backgrounds and voices is required in cybersecurity and public interest technology. Click here to watch the video.

In the coming pages you will find high-level overviews of each panelist
and key insights from the discussion.

100
Claire Liu-Yang
Chief of Staff at Silicon Harlem

Can you share why a diverse and inclusive ecosystem and workforce are important?

“So I'm gonna return to that sentence that I hope all of you remember, technology is infrastructure, right? What does that mean? What does that mean when you think about it? Technology is infrastructure…Do you think one person built [the] New York City subway? Do you think one person can build that? Is that possible? No. Because infrastructure is not a single skillset, right?…So there's so many different skillsets needed, and I bet that in this room we have a good combination of them. And that's what's beautiful about it.” -Claire Liu-Yang

Claire Liu-Yang is an emerging leader who demonstrates that female leadership can transcend social stigmas and shatter barriers. With a passion for building infrastructure that’s blind to gender, age, income, race, and disabilities, the Chief of Staff at Silicon Harlem manages the broadband that provides internet service for affordable housing in underserved communities.
ALL TECH IS HUMAN | 2023
101
Lyel Resner
Visiting Faculty and the Head of the Public Interest Technology Studio at Cornell Tech

How would you persuade industry and funders to invest in and collaborate with academia and civil society on workforce development and education programs?

“...we run around a lot trying to convince influential company builders and investors that responsible tech in some way is a business imperative. And I think there's a growing set of data that suggests that that's true...I mean, one, we talk most importantly about how the talent cares. We, you can look at this room as some evidence. There are all sorts of studies about how millennials and Gen Z care more about buying products and services and being part of communities that resonate with their values and generation....Second and relatedly is consumers care. I think we're entering a time where trust in the private sector generally and in technology companies specifically, very justifiably, is at all time lows. And that's empirical...And then thankfully, although to a lesser degree here domestically than in the EU, there is an increasingly aggressive and sophisticated regulatory regime. And so to kind of build social capital and maybe get ahead of some of those things is useful.” -Lyel Resner

Lyel Resner is currently Visiting Faculty and the Head of the Public Interest Technology Studio at Cornell Tech, where he leads programming for 400+ graduate students on creating tech to create a more just future, and co-leads the Startups & Society Initiative (SSI), a nonprofit research project backed by Ford, OSF, and Omidyar to support founders and investors with responsible innovation practices. As part of SSI, Lyel co-founded the Responsible Innovation Founders Summit, an annual event that has attracted 700+ global tech leaders including founders backed by Y Combinator, General Catalyst, and Sequoia, and published the Responsible Innovation Primer for Founders, a distillation of 100+ interviews of influential tech and civil society leaders about building tech companies responsibly.
ALL TECH IS HUMAN | 2023
102
Matt Mitchell
Senior Cybersecurity Program Manager, The Ford Foundation & Founder, CryptoHarlem

How does CryptoHarlem’s work at the Harlem Business Alliance expand cybersecurity education to underrepresented groups in the cybersecurity field?

“If you look in the mirror and you see the identity of the community that's being, you know, targeted, it's really hard for you to show up as your full self every day to work...CryptoHarlem is about like, tugging on that piece of thread. Until the evil sweater falls apart, you know? You can donate all your money, you could be a whistleblower. You know, the realities of capitalism is what forces so many good minds from your program into that temptation of those factories. You know what I'm saying? And we're about just trying to Harriet Tubman this whole situation and get people outta there, you know?” -Matt Mitchell

Matt Mitchell is a well-known security researcher, operational security trainer, and data journalist who founded and leads CryptoHarlem, impromptu workshops teaching basic cryptography tools to the predominantly African American community in upper Manhattan. He hosts a weekly livestream that educates all people on how to stay safe from digital harms. Matt is also the Senior Cybersecurity Program Manager at The Ford Foundation.

ALL TECH IS HUMAN | 2023


103
Saima Akhtar
Senior Associate Director of the Vagelos Computational Science Center (CSC), Barnard College

“Technology is infrastructure and the way that I think about it really is that I studied the built environment, right? I think about the inequities that are built into the world around us. We think about these buildings as just magically appearing. No, there was an architect, there was a patron, there was a plan, and in that same way, the internet is, whether it's virtual or infrastructure. So those same inequities are gonna be built in our virtual world if they don't get resolved in our physical world. And so I think that it's really important to think about all hands on deck ways of thinking about the future of technology and its impact on society.” -Saima Akhtar

Saima Akhtar is the Senior Associate Director of the Vagelos Computational Science Center (CSC) at Barnard College. She is a computational social scientist with a background in architecture and software engineering. Prior to joining Barnard, Saima was a postdoctoral associate in the Yale University Department of Computer Science, where she managed digital cultural heritage preservation projects between the fields of computer science and architecture. At Barnard, Saima works with faculty and students to creatively and critically think about the application of computing across disciplines.
ALL TECH IS HUMAN | 2023 104
Read our previous reports with
interviews, resources, ideas, and
more!

All Tech Is Human recently released three reports that involved hundreds of diverse community members working collaboratively around the world. These reports feature dozens of profile interviews and resources from hundreds of organizations in the responsible tech ecosystem.

Download Tech & Democracy: People, Organizations, and Ideas for a Better Tech Future at techanddemocracy.com

Download HX Report: Aligning Our Tech Future With Our Human Experience at hxreport.org

Download AI and Human Rights: Building a Tech Future Aligned With the Public Interest at aihumanrightsreport.com

Our organization is able to assemble hundreds of individuals quickly on emerging topics in responsible tech, allowing us to map the ecosystem, provide an overview, and conduct dozens of profile interviews for the community’s education.

ALL TECH IS HUMAN | 2023


105
Listen to podcast interviews with
responsible tech professionals

All Tech Is Human conducted 16 live interviews with inspiring individuals involved in the responsible tech ecosystem at Unfinished Live in September 2022 in NYC. We asked each person, "What does your ideal tech future look like?" You can listen to all 16 interviews in our podcast series here.

ALL TECH IS HUMAN | 2023


106

ALL TECH IS HUMAN | 2023


Key Takeaways

01
Gain confidence and a better understanding of the ecosystem
There are thousands of individuals just like you looking to plug into this community. Treat it with a high level of commitment: learn about the ecosystem, read relevant books and articles, expand your network, and find mentors.

02
Play an active role in responsible tech
After gaining a better understanding of the ecosystem, attend a responsible tech gathering to meet others in the field. Participate in the many activities from hundreds of organizations working to make a better tech future.

03
Stay involved with All Tech Is Human

Join our newsletter, which covers the responsible tech movement, contribute to a working group, attend a summit or mixer, and join our Slack. There are numerous ways to take an active part in our work!

107
All Tech Is Human’s Team
ALL TECH IS HUMAN | 2023

David Ryan Polgar: Founder & President


Elisa Fox: Program Associate, Tekalo
Josh Chapdelaine: Special Projects & Multimedia
Rebekah Tweed: Executive Director
Renée Cummings: Senior Fellow for AI, Data, and Public Policy
Sandra Khalil: Head of Partnerships & Trust & Safety vertical
Sara M. Watson: Siegel Family Endowment Research Fellow
Sarah Welsh: Program Manager, Mentorship & Tekalo

Our team is small but growing and is complemented by a core of dedicated advisors and volunteers. Read an overview of our organization here. For general inquiries, write to us at hello@alltechishuman.org.

108
Stay involved with All Tech
Is Human and the responsible tech
community!
ALL TECH IS HUMAN | 2023

109
Get in touch
Stay in touch with All Tech Is Human by joining our newsletter and Slack community, attending our livestreams, and meeting in person at our summits and mixers!

Find all of our projects and links at link.tree/alltechishuman.


ALL TECH IS HUMAN | 2023

Our non-profit is based in NYC with a global community and approach. We hold in-person gatherings in NYC, San Francisco, DC, and London and are exploring additional locations. There are also informal meet-ups happening around the world organized independently by members of our Slack community.

Find additional resources related to the Responsible Tech Guide at responsibletechguide.com and stay in touch with All Tech Is Human at alltechishuman.org.

Contact: hello@alltechishuman.org.

110
