Responsibilities
- Design and architect large-scale, on-premises data platforms supporting mission-critical data across multiple classification levels
- Define infrastructure and system requirements for highly scalable environments spanning large compute, storage, and networking footprints
- Develop distributed data architectures for ingestion, storage, processing, and dissemination of large data volumes
- Translate mission and user needs into technical requirements, system designs, and engineering artifacts
- Support capacity planning efforts, including scalability strategies and growth modeling
- Collaborate with system architects, engineers, cybersecurity teams, and stakeholders to define and implement technical solutions
- Evaluate vendor technologies, support technical reviews, and contribute to acquisition and procurement planning activities
- Produce architecture documentation and engineering packages to support system development and deployment
Requirements
- Bachelor’s degree in Computer Science, Data Science, Engineering, or related quantitative field, with an active TS/SCI clearance
- 7+ years of experience across data engineering and data science, including strong proficiency in Python and SQL
- Experience designing and implementing ETL/ELT pipelines and deploying machine learning models
- Hands-on experience with big data technologies such as Apache Spark, Airflow, or Hadoop
- Knowledge of enterprise data architecture, data center design principles, and storage systems (SAN, NAS, object storage)
- Experience working with large-scale or distributed data environments, including high-performance computing platforms
- Understanding of data governance practices and experience supporting secure, multi-classification environments
- Experience supporting technical requirements development, infrastructure planning, and government or regulated environments
