Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
The Risk Management, Trust and Safety Operations team enables Microsoft and third parties to conduct business in a safe, compliant, and ethical manner. Our industry-leading expertise spans multiple jurisdictions and disciplines, bringing diverse perspectives and solutions to current and emerging risks in the areas of artificial intelligence (AI), digital safety, global trade, third-party compliance, and other trust services. We relentlessly focus on building a culture of compliance, managing threats, and delivering best-in-class services at speed and scale. We leverage Microsoft’s innovative technologies and partner across the enterprise to drive the right outcomes for Microsoft, our customers, and our partners.
As a Trust & Safety Guidelines Developer, you will play a critical role on the Microsoft teams responsible for driving machine learning, Responsible AI technology, and Digital Safety, enabling Microsoft’s customers to solve real business problems across a variety of use cases.
If you love the pursuit of excellence and are inspired by the challenges that come with driving innovations that impact how the world lives, works, and plays, then we invite you to learn more about the Risk Management, Trust and Safety Operations team and the value we deliver across Microsoft and to our customers and partners. We offer unique opportunities to work on interesting global projects in an environment that appreciates diversity, focuses on talent development, and recognizes and rewards great work.