Data Handling: Work with SQL for data extraction, transformation, and analysis.
ETL Workflows: Develop and maintain ETL workflows using PySpark or equivalent technologies.
Data Pipeline Development: Assist in developing and maintaining data pipelines for structured and unstructured datasets.
Platform Implementation: Build and maintain data solutions on Databricks or Snowflake.
Collaboration: Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data products.
Data Quality and Compliance: Ensure data quality, reliability, and compliance with regulatory standards.
Domain-Specific Solutions: Contribute to domain-specific solutions for AML, fraud detection, and risk analytics.
