Wilneida Negron, Strategic Advisor to the Ford Foundation and Atlantic Fellow for Racial Equity, discusses the ethical concerns that ran through the series, from the economic drivers of technology to its complexity and its potential to impact society.
TRANSCRIPT
MUSIC Intro sting – Rovador for AudioJungle, Fantasy Cinematic
F Sweeney: Hello and welcome to Atlantic Fellows Conversations. I’m Fionnuala Sweeney.
Throughout this series, Being Human When Digital, we have been exploring how AR and VR can further the Atlantic Fellows’ aim of building healthier and fairer societies for all. As a community of hope, we wanted to use the series to explore and envision better tech futures, while recognising that big tech often exacerbates pre-existing racial, social and economic inequities.
We knew it was vital to rigorously interrogate the key ethical concerns around AR/VR and big tech. So, we asked Atlantic Fellow for Racial Equity Wilneida Negron, a strategic advisor at the Ford Foundation, to help moderate our discussions across a range of topics.
Today, Wilneida joins me to reflect on some of those discussions. I began by asking what, in her view, were the ethical concerns that ran through the entire Being Human When Digital series.
W Negron: You can probably pin it down to two things. One of them is the origin story of the technology: the economic drivers and the economic actors that actually give birth to the tools that we in civil society, or in different movements, later get hold of and want to use. These tools present a lot of promise, and a lot of opportunity for really adapting them to do transformative work in our various sectors – whether it’s health or indigenous rights or education; you could go on with the number of sectors. But sometimes we don’t truly understand the economic side of where the technology comes from: who are the people shaping its development, and who is creating the incentives that contribute to problematic business models and to practices that can undermine labour standards or, you know, have a variety of really negative implications for society and the economy. We are just, as a society, coming to understand that. And we’re reacting, really, to the past 20 years of technology innovation.
So, we’re just now getting to grips with what Amazon, Facebook, Twitter and Instagram have presented in the way they shape society. When Twitter first came out, my fieldwork for my dissertation focused on the transformative aspect of Twitter for creating movements like the Arab Spring, or student movements in Taiwan or Latin America. It was this platform that allowed anyone to amplify their voice, and to really build power in a very transformative, collective way. Now we can fast-forward and understand that Twitter and Facebook were founded in a very homogenous, privatised tech innovation system that comes out of Silicon Valley. They have business models they’ve adapted, and shareholders they answer to, and those have created the economic side of these tools. Now we’re understanding the negative side of that. And so, unpacking the origin story was the really deep ethical component. Again, the origin story goes into: who are the people creating the technology? What are the business models they’re utilising to finance these tools and make them free and accessible? And what are the implications of all of those things for society, in ways that we can’t see?
The second, meta-level ethical issue is that, to unpack the complexity and leverage the power of these new technologies – specifically AR/VR – we are going to need new ways of working across sectors and across issue areas, and that is going to really test humanity. It’s already testing humanity: how do we adapt when technology innovation happens so rapidly? To find new ways to work together and deal with the complexity of these issues, you need a multidisciplinary perspective, and you need to be able to do a lot of translation across policy, as well as on the economic side of things, as well as the beautiful reimagining that happens. And we need to do all of those things, kind of, at the same time. It’s super-complex. The innovation happens so quickly that what is concerning is that we may never be able to catch up to it early enough to be early shapers and early thinkers in how those technologies get developed and implemented – so that it’s not 10 years after the fact that we’re using these things and saying, “Oh, wow. I’m realising that the fabric of society has been breaking while I’ve been buying all these AR/VR headsets.” And then there are things like privacy and surveillance. They’re all tied to those two main buckets.
F Sweeney: And yet, you would argue that AR/VR technology is essential to social justice work – that it can powerfully support it?
W Negron: Absolutely. Particularly now, in the pandemic. There is an acceleration in trying to think about how to build communities in non-physical spaces, and how to support cross-learning and develop empathy towards things that we can’t experience directly. And there’s already a lot of really, really powerful work happening across those areas.
One of the Fellows spoke about working on a project that is trying to increase sensitivity around certain health issues. And AR/VR has the power to really flip the switch on our positionality in relation to society and how we experience things, in a really powerful, transformative way that we’re only beginning to tap. And then there’s the storytelling component, where Nonny and Dylan’s work is deeply humanistic and deeply artistic; it has the power to wrap all that complexity in a storytelling format that’s really impactful. And so, the potential is there, and the time is really now, to engage the promise and the challenges at the same time.
F Sweeney: And a lot was said about balancing hope and optimism with the criticisms that surround the introduction of AR/VR. What was it about this series that gave you hope, or reason for optimism?
W Negron: What gives me hope is that there are lots of people, from many walks of life, areas of expertise and issue areas, asking the same questions and coming to the same conclusions about where we can go from here – both in exploring the promise and in thinking about working across disciplines, or cross-sectorally. We are so much more equipped, prepared and knowledgeable about how to embrace technology innovation, and how to move forward with it in a very proactive, substantive way. That’s a muscle and an awareness that wasn’t there 15, 20 years ago.
And people are thinking about this with a very strong equity lens as well – both in disrupting the drivers of structural racism, and in addressing the disproportionate impact on access and the negative implications the technology could have for vulnerable communities. When I was involved in these conversations 15 years ago, you had to talk to a lot of people to make that argument, and technology seemed pretty neutral, for the most part. The resounding feedback from Fellows was that equity was very much in their focus and their lens, alongside a real desire to explore the potential of the technology.
F Sweeney: You say it’s difficult to catch up with Silicon Valley and its business models. Is it still possible, at this relatively early stage, to engage Silicon Valley and discuss their business models – the ethical implications, for social justice work in particular, that their technology brings to the fore? And might it somehow be possible to influence those business models?
W Negron: Yeah. You’ve just walked into some emerging work, actually. Because the policy and regulatory frameworks are just beginning – at least in the US. Obviously, the EU is much more advanced, with a broader policy framework to protect different stakeholders in the economy, and vulnerable communities, from some of the privacy implications of the different technology products. The Global South, not as much. It’s a fragmented global system of protection, and because we work globally in the Atlantic Fellows community, that gives us opportunities for some really interesting cross-collaboration, supporting each other’s work in the policy and regulatory spaces as it relates to our communities and our countries. Because a lot of that policy and regulatory work also requires building collective people power.
In the US, in particular, there’s a lot of movement-building work happening across labour, immigration rights, gender rights and racial equity communities. Different stakeholders in society are building power and starting to name some of the ways these products affect their livelihoods, and their sense of agency and ability to seek employment and live without surveillance. Building collective power – both in our local community or our country, and globally, as a sort of pushback – is what helps to strengthen the policy framework as well as the regulatory framework. Because without citizens speaking up about some of these practices, there is no trigger for a regulatory response, and not enough resources go into the regulatory function that actually does some of the pushback, or the investigations of some of these practices. And so, the policy and regulatory side is super-complex. It requires multi-layered work, bringing different groups together, to really strengthen that piece of it.
But on the economic side – because the policy and regulatory side is fragmented, slow-moving and requires a lot of multi-stakeholder collective power-building – there is a lot of emerging work happening with venture capital and with tech companies, helping them to understand social justice and racial equity, and the need not to undo the fabric of society and the labour protections that exist for workers, for example. But a lot of that work requires a lot of really clunky translation. There aren’t many people working in those spaces who can translate, in a meaningful way, how those actors can do better due diligence.
An example of that: a lot of venture capital funds this privatised tech innovation. Venture capitalists are beginning to understand things like ESG – environmental, social and governance indicators – which helps them do some due diligence in how they invest. It basically adds a few questions for them to ask a tech company, or a way to analyse a particular business model and its impact, and that work is really emerging. But the people from civil society who need to be informing the sorts of questions that go into the ESG frameworks used in capital markets – there aren’t many of them doing that translation and that kind of cross-collaborative work.
A lot of this deeper economic work is very complex and involves many types of investors beyond venture capital. Venture capital firms get their money from institutional investors – foundations and sovereign wealth funds, and increasingly hedge funds and private equity. So you can go to venture capital and ask them to do more due diligence on certain business models these tech companies may have, but the institutional investors they get their money from are not looking at these things as much. It’s a very complex, fragmented set of actors to engage, with a lot of translation to do. There’s less of that happening than there should be, and it’s exactly what needs to be happening now. But it’s an emerging field.
F Sweeney: Staying with the question of big tech for a moment. Can activists commit to decolonisation whilst using hardware which is produced by big tech?
W Negron: That is a challenge that has long existed in activist spaces. The desire to have an alternative is always there. But the alternative doesn’t have the audience that social justice actors need in order to communicate and spread their message. You know, if there were an alternative to Facebook but only 50,000 people were there, you probably wouldn’t be using that platform to communicate. You’d use Facebook. And so, people go where the community is.
Also, philanthropy – or even private capital – doesn’t yet fully know how to support more democratically owned alternative platforms. Maybe people will use Facebook for communication, but they want to use some other tools. Even if there were a public-interest version of AR/VR that was ethical and responsible – and I’m sure there are people out there imagining and thinking of that – the money for them to find the resources to create that responsible, ethical alternative AR/VR headset is not there. And the business case for why it’s needed still has to be made. So, right now, activists are somewhat trapped in needing to use the hardware produced by big tech while also pushing a decolonisation agenda, and it feels very incongruent. But in the absence of a viable alternative, it’s complicated.
F Sweeney: Let’s go back to the series itself. There are very powerful VR experiences pitched as training empathy – for example, Traveling While Black. But we know that, as a white person, this could perhaps heighten my empathy as a white person, but I will never know what it feels like to travel whilst black. So, is it possible that sometimes VR experiences could be branded as a shortcut to each other’s lived experience? One word that’s commonly used is appropriation. Where is the line between appropriation and empathy?
W Negron: That’s really one of the key sets of questions to think about. There are, you know, lots of questions about whether you can move from individual consciousness to a collective consciousness – one that would actually move someone in the direction of taking action in society, action that would help to strengthen the bonds they may have with different groups, or different actors. So, there’s a lot of yet-to-be-determined, really exciting exploratory work in terms of how to centre people in the lived experiences of other communities in a way that’s substantive, and in a way that translates into a type of collective consciousness that is more than just, OK, you take the headset off, and then you go off and continue with your implicit biases. That is the research that needs to happen right now: understanding how to maximise for that. I think the potential for doing that is there.
F Sweeney: That’s really interesting – that we all might share experiences, but we may not have the specific experience needed to render the appropriate amount of empathy. And that is surely a challenge for this emerging technology to be able to generate or develop?
W Negron: And there are always unintended implications that we’re not aware of. And so, understanding what those could be – you know, some people are already doing some of that research – of how, in an era of misinformation, some of these tools could be used to manipulate people’s thinking or feelings in the opposite direction. And so, what are the safeguards, in the absence of any policy or regulatory protections (which I don’t envision happening in the next five or 10 years) around this particular type of technology? What are the safeguards that these artists and these creative, intelligent people using these technologies can implement right now, to account for the potential of unintended consequences?
F Sweeney: We’ve talked generally, so far, about how AR/VR can support social justice, but can we explain to listeners where it might have an impact?
W Negron: AR/VR is being used in many ways across a variety of settings, and it has the potential to be really transformative in the health space, especially in the time of the global pandemic we find ourselves in. AR/VR is being used to help doctors treat patients dealing with issues that can contribute to depression or other mental health needs, and its ability to transport a patient into a different, more rehabilitative setting has shown a lot of promise. There’s also a lot happening in terms of how we do narrative-change work, and how we elevate stories that are not always heard, in powerful new ways – whether it’s using AR/VR to elevate the experiences and stories of indigenous communities, not only as a way to tell their stories, but in a way that can be preserved in a medium that is much more than capturing them in a book, or even a film. We are potentially looking at losing some of that heritage from indigenous communities; being able to preserve it, and to continue to grow it, in a medium like AR/VR will give it a lot of power and strength in the years to come. If it does turn out that an indigenous community is not able to sustain itself, its stories and its experiences will have been captured.
And for social justice actors, there is a lot happening in terms of how to tell the stories of communities – and this goes back to what we were talking about – how to really talk about racial equity or structural racism in a way that centres someone right inside what that experience is like. That is something no other storytelling format has been able to do, or can do. And so, for storytelling with that positionality – especially when you’re trying to disrupt structural racism and the implicit bias around it – AR/VR is uniquely positioned, more than any other medium, to empower activists and storytellers to engage audiences in a new, more powerful way.
F Sweeney: I’m just wondering, where generally do you think we’ll be in the next 10, 12 years, in terms of AR/VR and its daily use in our lives?
W Negron: It’s only going to grow. Really, if you look at what is being imagined right now in innovation spaces like Silicon Valley, the ecosystem of what is termed a metaverse is already being developed. They’re starting to build collective, virtual, shared spaces that enhance physical reality and also include virtual worlds, all in one. The conversation around metaverses – and the funding of companies to explore the development of a metaverse – is happening right now. And when you look at what those conversations look like, and at the people innovating in that space, our ability to go from virtual to physical to augmented is going to be seamless within the next 10 to 12 years.
And so, it’s going to happen much faster than we realise. It’s both exciting and frightening to think that, in 10 years, our sense of physical reality is going to include these other mediums, and it’s all going to be one seamless experience. AR/VR has been around for a very long time, but the pandemic has accelerated both the need for it and the companies developing the products, making them cheaper and more democratically accessible. This is very much what the world is going to look like. How do we start talking to people about what that reality looks like, and engaging them around it?
F Sweeney: And that’s against the concerns that we talked about earlier – about data privacy and surveillance. Do you think that, given the speed of development of this emerging technology, we can still unleash the activist potential of virtual spaces?
W Negron: I think that we can, but it’s going to take a lot of proactive, collective sensemaking that begins with a small set of actors who are concerned, interested and excited – building and sharing that knowledge, and developing that community and that voice. I’d much rather attempt that than just throw our hands up in the air and say, “OK. This is what’s happening, and we should just wait until we’re in the metaverse in 10 more years and deal with the implications then.” I’d much rather know what’s coming up ahead, and work with anybody who is engaged, again, proactively, experimenting and creating the safeguards. That work starts very small, but that small can be pretty powerful, as well.
F Sweeney: That was Wilneida Negron, strategic advisor at the Ford Foundation and Atlantic Fellow for Racial Equity. This brings us to the end of our series, Being Human When Digital. We hope you have enjoyed listening.
This is Atlantic Fellows Conversations and I’m Fionnuala Sweeney. To find out more, visit our website at www.atlanticfellows.org.
MUSIC Outro sting – Rovador for AudioJungle, Fantasy Cinematic.
ENDS | Duration 22’ 40