Responsibilities
- Design and maintain scalable ETL/ELT pipelines in Azure Databricks to process structured, semi-structured, and unstructured insurance data from diverse sources.
- Collaborate with architects, modellers, analysts, and stakeholders to deliver tailored data assets for analytics, operations, and regulatory reporting.
- Develop and optimize batch and streaming data solutions using Azure Data Factory, Data Lake Storage Gen2, Azure Event Hubs, and Kafka.
- Implement robust data validation, cleansing, and quality controls to ensure reliability for critical insurance use cases.
- Integrate Informatica tools to support governance, metadata management, lineage tracking, and cataloguing.
- Enforce data security best practices, including RBAC, managed identities, encryption, and compliance with the PDPO, GDPR, and other regulations.
- Automate deployment workflows using GitHub Actions for efficient, repeatable, and auditable data operations.
- Troubleshoot pipeline issues, conduct root cause analysis, and proactively resolve data quality and performance challenges.
- Maintain detailed technical documentation for data pipelines, transformation logic, and operational procedures.
- Apply domain expertise in Hong Kong Life and General Insurance to ensure solutions meet local business and regulatory standards.

Requirements
- Bachelor's degree in Computer Science, IT, or a related field.
- 5+ years of experience with Azure cloud platforms, with a focus on Databricks for insurance data workloads.
- Strong skills in provisioning and managing Azure Databricks clusters and workspaces for varied data types.
- Experience integrating Azure Data Factory and Data Lake Storage Gen2 with Databricks for seamless data flows.
- Proficiency in Terraform for infrastructure-as-code deployment of Azure and Databricks services.
- Hands-on experience with Informatica for metadata management, cataloguing, and governance.
- Deep understanding of Azure security (RBAC, NSGs, Key Vault), monitoring, and cost optimization in regulated environments.
- Skill in CI/CD automation using GitHub Actions for platform and pipeline deployments.
- Proven ability to troubleshoot, optimize, and support mission-critical data workloads in insurance.
- Strong documentation and communication skills to support cross-functional teams and stakeholders.

All applications submitted through our system are delivered directly to the advertiser, and the privacy and security of applicants' personal data will be ensured.