Head of Investigations and Machine Learning, Trust and Safety


Headquarters: Sunnyvale, CA

Google's brand is only as strong as our users' trust, and their steadfast belief that our guiding principles are what's best for them. Our Trust and Safety team has the critical responsibility of protecting Google's users by fighting web abuse and fraud across Google products like Search, Maps, Google Ads and AdSense. On this team, you're a big-picture thinker and strategic leader. You understand the user's point of view and are passionate about using your combined technical, sales and customer service acumen to protect our users. You work globally and cross-functionally with Google developers and Product Managers to navigate challenging online safety situations and handle abuse and fraud cases at Google speed (read: fast!). Help us prove that quality on the Internet trumps all.

As the Head of Investigations and Machine Learning, you'll be responsible for ensuring responsible and high-impact AI development and deployment in Trust and Safety (T&S) and across Google. You'll manage a team of investigations analysts and ML specialists who support and help to drive improvements in processes, products and AI from a data- and user-first perspective. As the leader of this multi-disciplinary team, you will provide the strategy and leadership to build its capabilities and scale its impact. You will stay focused on aligning the highest-level company priorities with effective day-to-day operations, and help evolve early-stage ideas into future growth, retention and user trust initiatives.
At Google we work hard to earn our users’ trust every day. Gaining and retaining this trust is critically important to Google’s success. We defend Google's integrity by fighting spam, fraud and abuse, and develop and communicate state-of-the-art product policies. The Trust and Safety team reduces risk and protects the experience of our users and business partners in more than 40 languages and across Google's expanding base of products. We work with a variety of teams from Engineering to Legal, Public Policy and Sales Engineering to set policies and combat fraud and abuse in a scalable way, often with an eye to finding industry-wide solutions. Trust and Safety team members are motivated to find innovative solutions, and use technical know-how, user insights and proactive communication to pursue the highest possible quality and safety standards for users across Google products.
  • Set vision, strategy and direction for the team, build talent, foster innovation and nurture development.
  • Collaborate with senior leadership across Google (including Product, Engineering, Legal, Privacy, Policy, Communications leads) to ensure goals are met.
  • Coordinate efforts across Trust and Safety to address Machine Learning-driven abuse and support product inclusion efforts.
  • Leverage knowledge in computer science or software engineering, including familiarity with machine learning concepts, to inform methods of fighting large-scale Machine Learning-driven abuse in teams across the organization.
  • Design and implement strategic programs and operational processes to investigate and close blockers to inclusive product experiences prior to launch.
Minimum qualifications:
  • Master's degree in a research field (Human Computer Interaction, User Experience Research, Ethnography), a social sciences field (Philosophy, Psychology, Ethics) or an analytical field (Engineering, Computer Science, Mathematics, Science), or equivalent practical experience.
  • 15 years of relevant experience in operations or consulting.
  • 5 years of experience in people management.
  • Experience in defining and managing global strategic programs that span multiple organizations, as well as leading and running a globally spread operation.

Preferred qualifications:
  • Experience in risk management (Data Analysis, Hazard Assessment, Fraud Investigation, Risk Management, Security Vulnerabilities, Penetration Testing, Privacy and Security).
  • Experience collaborating with Engineering/Product Management stakeholders and understanding of abuse issues faced by advertisers, partners and end users. Understanding of the product development process in tech.
  • Ability to manage, influence and persuade others without direct authority in highly dynamic, fast-paced environments.
  • Ability to define projects and execute within timelines and with multiple stakeholders. Creative, analytically minded, logical thinker with strong problem-solving and communication skills.
  • Technical or policy knowledge of the key policy issues affecting the internet (intellectual property, free expression, and online safety, ethics and the socio-technical considerations of technology and the future of AI).

At Google, we don’t just accept difference; we celebrate it, we support it, and we thrive on it for the benefit of our employees, our products and our community. Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you have a disability or special need that requires accommodation, please let us know.
To all recruitment agencies: Google does not accept agency resumes. Please do not forward resumes to our jobs alias, Google employees or any other company location. Google is not responsible for any fees related to unsolicited resumes.