Posted in General Business 30+ days ago.
Type: Full-Time
Type of Requisition: Regular
Clearance Level Must Currently Possess: Top Secret SCI + Polygraph
Clearance Level Must Be Able to Obtain: None
Suitability: No Suitability Required
Public Trust/Other Required:
Job Family: Software Engineering
Job Description:
JITR 890 - Project: Data Science Program
The Contractor shall perform the tasks documented in this narrative on a best-effort, Level of Effort (LOE) basis with the number of FTEs listed under the labor categories. Hours will be managed at the overall JITR level.
A. Background/Introduction
The Sponsor requires a Contractor to support a growing data science/data engineering ecosystem currently in development. The Sponsor’s organization operates under shifting demands and priorities. The Contractor shall work with a small team to execute its operations and partner across the organization to ensure mission success.
B. Work Requirement
The Contractor shall architect systems using enterprise tools and capabilities (e.g., authentication and audit services) as follows: The Contractor shall architect a new data science system for the Sponsor that will connect with the Sponsor’s existing systems. The Contractor shall design and build a system that emphasizes data ingestion and pushes data and services to users. The Contractor shall maximize efficiencies and reduce long-term O&M costs. The Contractor shall manage and maintain the Sponsor’s existing software applications and environments, including managing uptime, scaling, and system administration. The Contractor shall develop and maintain connections between the Sponsor’s existing systems and third-party systems.
The Contractor shall gather requirements, develop program schedules, and monitor program execution. The Contractor shall develop, coordinate, and manage system documentation and the Sponsor’s A&A process for the Sponsor’s existing systems. The Contractor shall develop and disseminate progress reports for external audiences.
Required:
• Expertise in building automated ETL pipelines/workflows in a Python-based environment (e.g., Apache Airflow or Prefect).
• Demonstrated knowledge of techniques for entity resolution/aggregation and linking rich data together across diverse and broad datasets.
• Demonstrated experience successfully building and deploying applications to serve and process large volumes of data.
• Demonstrated knowledge of data governance best practices on data sourcing and data compliance management.
• Demonstrated experience building, deploying, and managing cloud-based applications.
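To make the first required skill concrete, here is a minimal, illustrative sketch of the kind of automated ETL pipeline the posting describes. It is plain Python (in practice an orchestrator such as Apache Airflow or Prefect would schedule these steps as tasks in a DAG or flow); the function names and sample records are hypothetical, not part of the requisition:

```python
# Minimal ETL sketch: extract -> transform -> load.
# A scheduler (Airflow/Prefect) would wrap each function as a task.

def extract():
    # Stand-in for pulling raw records from a source system.
    return [{"id": 1, "value": "  Alice "}, {"id": 2, "value": "Bob"}]

def transform(records):
    # Normalize whitespace and casing before loading.
    return [{"id": r["id"], "value": r["value"].strip().lower()}
            for r in records]

def load(records, store):
    # Stand-in for writing cleaned records to a destination store.
    for r in records:
        store[r["id"]] = r["value"]
    return store

def run_pipeline():
    # Wire the three stages together, as an orchestrator would.
    return load(transform(extract()), store={})
```

Calling `run_pipeline()` on the sample records above yields `{1: "alice", 2: "bob"}`; a production pipeline would replace the stand-in extract and load steps with real source and destination connectors.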
Desired:
• Experience developing and coordinating system-to-system interfaces across enterprise applications and data stores.
• Experience using Apache NiFi.
• Demonstrated experience managing heterogeneous data holdings in accordance with Sponsor’s regulations.
#OpportunityOwned
#GDITCareers
#GDITLife
#WeAreGDIT
#GDITInterns
Scheduled Weekly Hours: 40
Travel Required: Less than 10%
Telecommuting Options: Telecommuting Not Allowed
Work Location: USA VA McLean
Additional Work Locations:
We are GDIT. The people supporting some of the most complex government, defense, and intelligence projects across the country. We deliver. Bringing the expertise needed to understand and advance critical missions. We transform. Shifting the ways clients invest in, integrate, and innovate technology solutions. We ensure today is safe and tomorrow is smarter. We are there. On the ground, beside our clients, in the lab, and everywhere in between. Offering the technology transformations, strategy, and mission services needed to get the job done.

GDIT is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other protected class.