- Natasha Crampton got her start as an attorney and is now Microsoft's first chief responsible AI officer.
- Crampton works across disciplines, including engineering and research, to establish best AI practices.
- She suggests experimenting with AI, leaning into non-technical skills, and participating in relevant discussions.
This as-told-to essay is based on a conversation with Natasha Crampton, Microsoft's first chief responsible AI officer, based in Redmond, Washington. The following has been edited for length and clarity.
I'm Microsoft's first chief responsible AI officer, and I've been in the role for almost seven years now — well before ChatGPT and the agentic AI systems we're navigating now.
My job has two parts. There's an inward-facing part that involves working side-by-side with our engineering, sales, and research teams to ensure they uphold our principles as they build AI systems. There's also an external-facing part, which is contributing to the dialogue and conversation about the new laws, norms, and standards we need in this space.
That work can be anything from contributing comments on a pending bill to working with other leading AI labs to consolidate our best practices and share them with others.
I always had an innate interest in the intersection of technology, law, and society. I tried to take an interdisciplinary approach to my studies even when I was at school. So, in addition to studying law, I also studied information systems.
When I was strictly in the legal part of my career, I always worked on technological issues, like helping Microsoft draft contracts. Sometimes I would work on pre-AI challenges, like online safety.
I had the fortune of essentially being at the University of Microsoft during this time. Our responsible AI program is a deep collaboration between our in-house research institution, Microsoft Research, and our engineering teams. For those coming from different starting points, there are a huge number of self-driven certifications that you can get.
There are challenges to working across disciplines
This has been the most rewarding and challenging role I've had in my career.
One of the challenges is working with engineers and researchers to define a new approach to mitigating a risk we've seen in our technology, while pushing the frontiers of science forward. There aren't really that many things I do on a day-to-day basis that have an obvious answer or a playbook I can refer to.
I find that energizing overall, but it does involve making sure that we are working very collaboratively across a lot of different disciplines at the company.
Sometimes I've found that people from different disciplines can really talk past one another because they're using different language — they come from different professional backgrounds.
Finding the right language to communicate across those disciplines has been important. Sometimes it is the nature of my work to make decisions where, for good reasons, reasonable minds might differ. It's important to have enough humility to recognize that we might need to pivot in the future depending on how things unfold.
How to pivot into an AI role
If you want to pivot into technology, it's first a great idea to really use the technology yourself. There's no substitute for building your own understanding of how the technology works and what it's not good at.
Second, it's really important not to discount your ability to shape this technology just because you're not a deeply technical person. Many of those skills can be learned.
In my experience, a huge amount of the value comes at the intersection of technical understanding and perspectives from the social sciences, as well.
It's important to make sure you have enough of a literacy with the technology to understand how it works and how it's built. But we need as many social scientists shaping this technology as we do deeply technical people.
Third, you should be part of discussions. There are many very public conversations that are happening right now, whether it's about a new law, policy, or initiative in the workplace. You can be a part of that, too.
I think it's about expressing your interest and then finding a like-minded group of people.
When I'm hiring and looking for people to transition into this space, showing your commitment by taking some of those certifications and doing some extra study goes a long way. I also look for skill sets that you can acquire in lots of different jobs, like critical thinking, the ability to work across disciplines, and communicating your ideas.
I think my ability to bridge between disciplines was the primary quality that helped me find success in this role. That involves talking to engineers and policymakers and helping to find solutions that sit at the intersection of those two disciplines in particular.
Read the original article on Business Insider