Your role will include, but will not be limited to:
- Designing, developing, and coordinating secure, effective, efficient and scalable data pipelines.
- Applying your knowledge of data infrastructure and capabilities to build effective technological solutions to support core business (data) functions and integrations with a broader technology stack.
- Collaborating with internal teams to fully understand business requirements and desired business outcomes.
- Ensuring all technical documentation is kept up to date and maintained to the highest level of quality.
- Continually learning and researching best practices and tooling for Big Data delivery.
- Advising on approaches and associated tooling to support data governance activities, with a focus on information security and data quality.
Mandatory
Experience with Data Engineering technologies such as:
- Azure Data Factory / Synapse Analytics / Event Hubs
- Azure Functions / Logic Apps
- Databricks / Delta Lake / SQL Analytics / MLflow
- Python
- Java
- SQL / T-SQL
- DBT
- R
- Scala
- Legacy ETL approaches and technologies (e.g. SSIS / SQL stored procedures).
- Azure & Databricks Administration experience.
- DevOps / Pipelines.
- An understanding of the considerations involved in working with big data, such as security, governance, reliability, and scalability.
- Hands-on experience with streaming and batch processing tools and technologies.
- A rich understanding of data in all its forms.
- Performance tuning.
- Data modelling.
- The ability to articulate data engineering concepts and make recommendations on technology choices.
Desirable
- Experience with AI / ML
- Power BI / DAX / Measures
- Purview / Unity Catalog / Data Lineage
- Fabric