Location: Hybrid in Wayne, PA (2 days on site required)
Salary: $130K-$160K
Position Description:
- Lead the design and optimization of the data engineering platform, ensuring robust administration and operational efficiency.
- Build and maintain scalable Azure-based data processing environments integrated with tools such as Azure DevOps, HVR/Fivetran, Snowflake, Collibra, Power BI, and Tableau.
- Apply deep cloud engineering expertise across Azure, AWS, or GCP, leveraging Kubernetes, containers, serverless technologies, and modern architectural patterns.
- Automate infrastructure provisioning and management using Infrastructure-as-Code tools (Terraform, Ansible).
- Apply strong foundational knowledge of networking, security, identity and access management (IAM), monitoring, and automation.
- Collaborate with data architecture, engineering, and governance teams to deliver a secure, resilient, and scalable cloud data ecosystem.
Required Skills:
- 4+ years of hands-on Azure Databricks experience in enterprise environments
- Experience with on-prem Hadoop technologies, Snowflake, or other big data platforms
- 1+ year of Terraform (or another Infrastructure-as-Code tool such as Ansible) is a must
- Experience with Azure DevOps is a must (GitHub, deployment pipelines from the engineering side).
- Experience in a regulated industry
- Strong proficiency in PySpark, Python, PowerShell, or SQL, along with lakehouse architecture concepts, is a must
- Familiarity with Delta Live Tables, Unity Catalog, and MLflow is a must
