
Manager, Data Operations & Management
Job Description
Position Summary:
As the Manager of Data Accessibility Engineering Support within Data Operations & Management, you will play a critical role in ensuring that enterprise data is secure, discoverable, and accessible for advanced analytics, AI/ML, and operational use. You will oversee the implementation and support of data governance tooling, metadata management, and access controls across cloud-native platforms. This role is both hands-on and strategic: ensuring compliance with organizational policies while enabling scalable data accessibility across GCP, AWS, BigQuery, Redshift, and other modern data environments.
Primary Responsibilities:
Data Accessibility & Governance Enablement:
- Lead the implementation and support of data accessibility solutions, ensuring efficient access to governed and trusted data assets.
- Oversee data governance tools and platforms (e.g., Collibra) for metadata management, lineage, and policy enforcement.
Cloud Platform Integration:
- Design and implement data accessibility frameworks for GCP and AWS environments, with a strong focus on BigQuery, Redshift, and cloud-native storage layers (GCS/S3).
AI/ML Support & Lifecycle Management:
- Partner with AI/ML teams to support model lifecycle management through reliable access to training and scoring datasets.
- Ensure data quality and accessibility standards are embedded in MLOps workflows and pipelines.
Data Quality, Policy & Compliance:
- Implement and monitor enterprise data quality frameworks to support regulatory compliance and business confidence.
Cross-Functional Collaboration & Support:
- Work closely with data stewards, data engineers, data scientists, and compliance teams to continuously improve data operations.
Qualifications
- 7 to 11 years of experience in data operations, data governance, or data quality engineering roles.
- Data governance platforms, especially Collibra, with 3+ years of recent experience.
- Metadata management, cataloging, and technical data lineage.
- Hands-on experience building workflows and REST API integrations using Groovy and Python.
- AI/ML data workflows, including structured and unstructured data access for model training and inference (AI governance experience preferred).
- Cloud platforms: Google Cloud Platform (GCP), Amazon Web Services (AWS).
- Data warehouses: BigQuery, Redshift (and/or Snowflake).
- SQL and enterprise-scale ETL/ELT pipelines.
- Strong analytical and problem-solving skills in large-scale, distributed data environments.
- Familiarity with data security, privacy regulations, and compliance standards (e.g., GDPR, CCPA).
- Excellent collaboration and communication skills across technical and non-technical teams.
- Bachelor’s or Master’s degree in Data Science, Information Systems, Computer Science, or a related field.
Preferred Experience:
- Experience in Retail or QSR environments with complex multi-region data access needs.
- Exposure to enterprise data catalogs, automated data quality tooling, and access request workflows.
- Current GCP Associate (or Professional) certification.