Overview
About the Role
We are seeking a highly skilled Data Engineer with a strong DevOps mindset to help build a reusable Microsoft Fabric accelerator solution. This role is ideal for someone with a solid software and backend engineering background and a deep understanding of how to build efficient, secure, and scalable data pipelines and orchestrate complex workflows.
You will play a key part in designing and developing infrastructure and data solutions that are high-performance, portable, and aligned with industry best practices in security and DevOps. This is not an analytics or data modeling role — the focus is on engineering excellence and infrastructure.
Key Responsibilities:
- Design, develop, and maintain efficient and scalable data pipelines using best-in-class engineering practices.
- Build orchestration workflows to extract and move data across cloud and hybrid environments.
- Develop portable, metadata-driven solutions for rapid deployment across clients.
- Collaborate with teams to implement CI/CD pipelines and infrastructure-as-code for data solutions.
- Architect and implement secure systems that comply with enterprise security standards.
- Contribute to the development of a Microsoft Fabric accelerator for client deployment.
Key Requirements:
- Proven experience in backend/data engineering with a strong DevOps focus.
- Proficient in Python, Spark, PySpark, and SQL.
- Hands-on experience with ETL processes and real-time data ingestion.
- Strong knowledge of data orchestration tools like Apache Airflow and Azure Data Factory.
- Experience with Microsoft Fabric, including Lakehouses, Pipelines, Dataflows Gen2, and Notebooks.
- Familiarity with containerization and IaC tools: Docker, Kubernetes, Bicep, Terraform.
- CI/CD best practices for data workflows and cloud deployments.
- Experience with Delta Lake or Apache Iceberg in the Microsoft ecosystem.
- Deep understanding of Azure services: Data Lake, Synapse Analytics, Event Hubs.
- Experience with real-time streaming platforms such as Kafka or Azure Event Hubs.
- API integration and Change Data Capture (CDC) methodologies.
- Experience with PowerShell, C#, or .NET.
- Familiarity with reusable solution templates in consultancy/client environments.
- Relevant certifications including:
- DP-700 (Implementing Data Engineering Solutions Using Microsoft Fabric)
- Azure Data Engineer Associate
- Databricks certifications
- Kubernetes certifications
Minimum Requirements:
- Computer: Intel Core i5 (10th Gen) or equivalent, 8 GB RAM, 2 GB video card or integrated graphics, Windows 10 or macOS (no netbooks or Chromebooks)
- Headset with noise cancellation
- Primary internet connection: at least 20 Mbps, wired only
- Antivirus software installed, activated, and kept up to date
- Backup internet connection
Job Requirements
Please refer to the job description above.