Location: King of Prussia, PA (hybrid; must be able to work on-site 2-3 days a week)
Salary: $145K-$155K
Position Description
- Serve as the Data Architect for the Healthcare Payer domain, responsible for designing, developing, and maintaining data platforms (including data warehouses and data lakes) to support the secure, efficient collection, storage, and analysis of healthcare data.
- Partner with business and IT stakeholders to ensure the data architecture aligns with organizational needs while meeting compliance and security requirements.
- Lead the design and implementation of scalable, secure data systems and interoperability solutions compliant with USCDI/US Core standards.
- Build and maintain conceptual, logical, and physical data models across healthcare domains such as clinical, claims, member, provider, and EHR data.
- Assess and recommend data management technologies, frameworks, and methodologies to improve data integration and analytics capabilities.
- Enforce data governance practices to ensure data quality, integrity, security, and regulatory compliance, including HIPAA, GDPR, and other privacy laws.
- Act as an independent thought leader who can work with minimal direction and drive architectural decisions.
- Apply strong data management principles, backed by hands-on experience with end-to-end data processing.
- Collaborate with analysts, data scientists, IT teams, and business leaders to understand data needs and ensure architectural alignment.
- Optimize data storage and retrieval to minimize latency and support real-time and historical analytics.
- Contribute to establishing and maintaining data governance standards across systems.
- Stay current with emerging technologies in healthcare data management and propose innovative solutions.
- Provide mentorship, training, and guidance to team members.
Required Skills
- Healthcare experience is a must.
- Deep expertise in data architecture, database technologies, and cloud platforms (Azure, AWS, GCP).
- Strong understanding of healthcare interoperability standards such as HL7, FHIR, CCDs, HIE, and USCDI/US Core.
- Proficiency with SQL and NoSQL databases, Python, and PySpark.
- Experience using data modeling tools (e.g., Erwin).
- Hands-on experience with integration and data engineering tools such as Azure Data Factory, Databricks, Kafka, IBM DataStage, and Informatica.
- Familiarity with business intelligence and analytics platforms.
- Strong analytical and problem-solving abilities.
- Excellent communication and collaboration skills for working with cross-functional teams.
