Marcura·7 months ago
To bring domain expertise in data engineering to the team, including building and running ETL processes with modern tooling.
1. Data engineering best practices
You will contribute to the data team’s ability to adhere to data engineering best practices across pipeline design, data quality monitoring, storage, versioning, security, testing, documentation, cost, and error handling.
2. Data transformation in dbt
Ensure that the daily dbt build is successful, including full test coverage of existing models.
Create new data models in collaboration with the data analysts.
Add new tests to enhance data quality.
Incorporate new data sources.
3. Data extraction
Develop and maintain our data pipelines in Stitch, Fivetran, Segment and Apache Airflow. Evaluate when it is appropriate to use off-the-shelf tools versus building custom pipelines in Apache Airflow.
Ensure that data extraction jobs run successfully each day. Collaborate with engineers from MarTrust to add new data sets to our data extraction jobs.
4. Data warehousing in BigQuery
Ensure that the data in our data warehouse is kept secure and that daily jobs in BigQuery run successfully.
5. Maritime domain expertise
You will be expected to quickly develop a deep understanding of MarTrust’s data domain in order to support the organisation with accurate data models and tests.
· A bachelor’s degree in a relevant field such as Computer Science, Information Technology, Data Science, Engineering (e.g., Computer Science Engineering), Mathematics, or Statistics.
· Domain experience in FinTech or a high-growth startup.
· Familiarity with the tools we work with, or equivalent ones (dbt, BigQuery, Stitch, Segment, Fivetran, Metabase, Apache Airflow).
· Experience of end-to-end ownership in data engineering
· A collaborative approach with the data team, where data analysts also contribute to our data models and stand in for data engineers when needed.
· Strong data modelling and organisational skills.
· Proven track record in creating robust data processes for accuracy and reliability.