Carolyn Herzog, EVP and General Counsel at ARM, talks to Johanna Hamilton AMBCS about AI, diversity, inclusivity and the importance of matching the algorithm to the community.

After a twenty-year career in cybersecurity, Carolyn Herzog became ARM’s General Counsel in 2017. ARM is, of course, the world’s leading provider of silicon chip designs - its technology sits at the heart of billions of phones, tablets, cars and computers.

With its unique place in the tech industry supply chain, ARM is keen to explore how it can make products that are good - not just for our phones - but for society too.

You’ve talked elsewhere about the gender data gap. Where do you see bias happening?

I saw a data gap even from the time I was growing up. My father was an obstetrics and gynaecology specialist and he would talk to me about how there was all this medicine and information for men, and yet there was this huge gap around information for women. As I progressed through my studies and eventually began my career in the Africa Department at the World Bank, I continued to see signs of information gaps between men and women as I worked on health and education projects in developing countries.

Certainly in AI we know that there is a bias that exists for women - both an unconscious bias and a conscious bias.

These days, many companies are doing unconscious bias training and I really like the effort. It levels the playing field and removes the blame factor, creating a safer environment for everyone to take action. However, it does make us realise that in AI, if we don't pay attention to gender bias in areas like facial recognition technology, there will be a massive gap in understanding where gender and other biases exist.

Bias exists along racial lines and it exists along gender lines, so certainly we know that if you don't have women at the table designing for AI, there will be a massive gap, from both an unconscious and a conscious perspective.

How do we counteract this gap?

We need women designing, we need a diverse steering committee of people looking at the ultimate product design and reviewing the design, and we need people using the technology once it's built to really understand whether or not that bias exists in the technology.

AI is such a unique technology where the ultimate use can be for good, or it could be used for harm. So, it's critically important for this technology, I think probably more than any other right now, to have women at the table designing the technology, using the technology, examining the technology.

ARM, in particular, is thinking about this, not from a financial perspective but from a ‘technology for good’ perspective and an ‘AI for good’ perspective.

So how is ARM tackling bias?

We're thinking about it through the entire technology ecosystem. There's been such a negative, tech-lash view of companies that we needed to look at creating a trust manifesto. [ARM is] in a perfect, non-threatening position to engage in a dialogue with all companies to say “[We could be] a company that could bring other companies together to talk about this. Could we create a framework where bias is just one of the pillars in a framework of ethical considerations for AI?”

We also think it's critical to bring government, universities, society and companies together to have a cohesive dialogue. It's not something that companies can do alone. And then there's the question of adoption: what does this mean for you as a company? What does it mean for the government? Again, the UK has been leading in terms of AI and thinking about this ethical framework. I think there's been a lot of really healthy dialogue around that.

Do you think there will ever be a United Nations for the positive use of tech?

I'd like to think there would be and certainly around data protection. What the EU has done around data privacy is critically important and I would like to see a broader governing body around data privacy at least with some governing principles, internationally.

AI needs to be used for good and we're already seeing it being used in health care. And yet, it could be misused and if companies don't adopt a framework then the government will come in and legislate.

Regulation can be very fragmented, and in that case AI cannot succeed. Unlike the EU with its adoption of the GDPR, the US hasn't yet adopted a GDPR-like framework for data privacy.

If you imagine what the US could eventually look like without a GDPR, the fragmentation of privacy regulations would be impossible to manage. Imagine if Virginia, West Virginia, California and Texas all had different data privacy regulations - how are companies going to manage the adoption of each of those regulations when they're trying to manage data? It would be extremely expensive, difficult and doomed to fail.

What about unintentional gender bias?

Siri is a woman's voice. Home assistants default to women's voices. That bias already exists, automatically.

Why isn't everybody questioning bias? Why isn't equality happening faster? I do talk to people about this but my question is, why aren't we adopting standards faster? I think it should be a standard. We've been working on this for some time. It is challenging, it is complicated, I think companies are interested.

Why is it so hard? I think the question is, just, why?

There is a tendency now to blame the algorithm and not the person. When did computers become culpable?

When you think about what's been happening in the courts - and there's a huge issue with race in the United States right now - we have a pervasive and recognisable problem with bias, racism and injustice in the criminal justice system that has existed since the dawn of America.

What if you could build in some kind of algorithm that had no bias? Would it make a difference? And where's the human factor? Juries are the pride of the criminal justice system in America, but it would seemingly be a dream if you simply had no bias in the jury system. Would the United States be a different place if you could guarantee an unbiased jury?

If you are purely looking at engineering, the United States and England perhaps have greater opportunity for diverse coding than some other countries just by nature of our universities and company recruiting.

Tell me about how you deal with datasets.

I talked to the general counsel of one company, and they're taking their data privacy mechanisms, doing data privacy product checks, and using that mechanism for testing their AI. The more companies that can share information and say what works in terms of AI checks and balances, the more we'll all learn from each other. I would also like to see checks and balances from governments and from society, along with universities and governing bodies such as BCS. We all need to ask: "What can we do to make sure this is safe and trustworthy, while also promoting reasonable speed and transparency?"

How does ARM promote diversity?

My team works through our panel of law firms: we ask them diversity questions, and I make sure that when the panels come and talk to me, they know I'm grading them on that.

When we look at hiring, we make sure that we have a diverse panel of candidates, as well as a diverse panel of interviewers. We always want to hire the best candidate for the role, which may mean filling in a gap in experience - again, looking at that from a diverse perspective.

Diversity has to be reviewed holistically. We are speaking with our engineers about this to ensure that they are also designing AI with the same principles and ethical framework in mind. Working through this manifesto is heightening the awareness of our engineering community, so that they are thinking about these key concerns as they work on AI.

Girls aren't going into STEM subjects at university. How do we counteract that?

STEM programmes alone haven't worked. ARM has a tremendous programme called Gen ARM 2Z promoting diversity, and its participants have come up with the most amazing inventions - some of which are being used and making a difference in society today. But if girls can't see it, they won't go into these programmes. There have been some wonderful movies that have started to highlight the hidden women in pivotal programmes from years ago. That's the way you inspire girls.

ARM is involved in programmes in both the UK and the US to show women, girls and our communities what our engineers are working on, and to showcase our engineering programmes.