DeepLight AI is a specialist AI and data consultancy with extensive experience implementing intelligent enterprise systems across multiple industries, and particular depth in financial services and banking. Our team combines deep expertise in data science, statistical modeling, AI/ML technologies, workflow automation, and systems integration with a practical understanding of complex business operations.
We are seeking a skilled AWS Glue Data Engineer to join our Data Factory Squad, responsible for migrating source systems into the Lakehouse ingestion zone. This role focuses on building scalable ingestion pipelines, optimizing performance, and ensuring compliance with architectural and data assurance standards.
You will ideally have a background in financial services, along with strong experience in AWS Glue, PySpark, and ETL pipeline development.
Your responsibilities as the AWS Glue Data Engineer will include:
- Data Ingestion Development
  - Building and implementing AWS Glue jobs for Bronze layer ingestion using defined standards and templates.
  - Implementing correct loading methods based on source requirements (CDC, full load, delta, snapshot).
  - Designing and executing historical loading mechanisms to bring legacy data into the Lakehouse.
- Performance Optimisation
  - Optimising Glue job performance (DPU allocation, parallelisation, partitioning) according to best practices.
  - Collaborating with platform teams to ensure alignment on tooling and optimisation.
- Migration & Automation
- Aggressively migrating source tables to Bronze layer, initially using manual approaches with standards/templates, later leveraging AI-enabled acceleration.
- Ensuring jobs are version-controlled and production deployment is automated via Git and Terraform.
- Governance & Monitoring
- Implementing source system connectivity into CDP in collaboration with source system owners.
- Ensuring jobs comply with data contracts and are properly monitored.
- Preparing documentation and handover to operational support teams.
- Collaboration
  - Working closely with the Data Architect on ingestion patterns and standards.
  - Coordinating with the Data Assurance Lead to apply quality checks across all jobs.
  - Partnering with platform engineers on tooling and optimisation.