Data Engineer (Multiple Locations) - Top Secret with SCI Eligibility at Logistics Management Institute in Charlottesville, Virginia

Job Description:

Data Engineer (Multiple Locations) - Top Secret with SCI Eligibility


LMI is a consultancy dedicated to powering a future-ready, high-performing government, drawing from expertise in digital and analytic solutions, logistics, and management advisory services. We deliver integrated capabilities that incorporate emerging technologies and are tailored to customers' unique mission needs, backed by objective research and data analysis. Founded in 1961 to help the Department of Defense resolve complex logistics management challenges, LMI continues to enable growth and transformation, enhance operational readiness and resiliency, and ensure mission success for federal civilian and defense agencies.

LMI is currently seeking an innovative, experienced, and highly skilled Data Engineer to join our growing Advanced Analytics & AI team in Charlottesville, VA. In this role, you will create and develop custom solutions for our users in a collaborative, fast-paced, state-of-the-art environment. To be successful in this role, you will be thorough, creative, and exceptionally skilled in all phases of the development lifecycle, with a passion for continued learning and collaboration.

*This position is located in Charlottesville, VA and requires an active TS clearance with SCI eligibility*

*Also considering candidates in these locations: Charlottesville (VA), Fort Bragg (NC), Fort Gordon (GA), Fort Meade (MD), Hawaii, San Antonio (TX), Germany, Italy, and South Korea*


Responsibilities:

  • Perform data acquisition (5+ years of experience): identify relevant data sources and datasets, and provide data system enhancements as required, including but not limited to product reformatting and data quality assessments to support the acquisition of new datasets.
  • Design, develop, test, and manage the overall architecture that analyzes and processes data the way the organization needs it.

  • Integrate external or new datasets into existing data pipelines.

  • Process and clean data, and verify its integrity, accuracy, completeness, and uniformity.

  • Assess the effectiveness and accuracy of new data sources and data-gathering techniques; perform all network administration and data system operations (e.g., computer and peripheral device operations, system backups) and any related operations associated with data acquisition, data maintenance, maintaining and updating metadata, and other data and information services for stakeholders.

  • Develop, construct, test and maintain databases.

  • Build data and analytics tools that will offer deeper insight into the pipeline, allowing for critical discoveries surrounding key performance indicators and customer activity.

  • Recommend and implement ways to improve data reliability, efficiency, and quality; evaluate, compare, and improve different approaches, including design-pattern innovation, data lifecycle design, data ontology alignment, annotated datasets, and Elasticsearch approaches.

  • Act as the lead data strategist, identifying and integrating new datasets that can be leveraged.

  • Document all processes, models and activities.

  • Research and keep up to date with the latest tradecraft and technology.

  • Collaborate with systems architects, data scientists, and analysts to direct and optimize the flow of data within the pipeline and ensure consistency of data delivery and utilization across multiple projects.

  • Curate and collect data from a variety of traditional and non-traditional sources: extract data from sources, transform and integrate it with existing data, and load it into data stores for access by others.

Languages, Tools, and Techniques:

  • Data pipeline/workflow management tools such as Azkaban and Airflow.

  • AWS cloud services such as EC2, EMR, RDS, and Redshift.

  • Working knowledge of algorithms and data structures, distributed computing, Hadoop cluster management, HDFS, MapReduce, stream-processing solutions such as Storm or Spark, big data querying tools, frameworks, messaging systems, and big data toolkits.

  • Advanced SQL knowledge, including query authoring, and experience working with a variety of relational databases.

  • Experience building and optimizing data pipelines, architectures and data sets.

  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

  • Knowledge of ETL tools, data APIs, data modeling, and data warehousing solutions.

  • R, Python, Ruby, C++, Perl, Java, SAS, SPSS, and MATLAB.

  • Demonstrated ability to work with enterprises to develop processes that support data transformation, data structures, metadata, dependency and workload management.

  • Comfort working in a dynamic environment with several ongoing concurrent projects; able to multitask, prioritize, and manage time effectively.

  • Creative problem solver who thrives when presented with a challenge; able to analyze problems and strategize better solutions; strong problem-solving skills with an emphasis on building for reuse.


Requirements:

  • Active TS/SCI clearance

  • MS in Computer Science, Information Systems, or an equivalent field and 5+ years of experience in a similar data engineer role; BS in Computer Science, Information Systems, or an equivalent field and 7+ years of experience in a similar data engineer role; or AA in Computer Science, Information Systems, or an equivalent field and 10+ years of experience in a similar data engineer role

  • 3 years of experience with databases and software development preferred

  • Experience working with AWS cloud services such as EC2, EMR, RDS, and Redshift

  • R, Python, Ruby, C++, Perl, Java, SAS, SPSS, and MATLAB skills desired

  • Advanced data engineering experience required

  • Significant experience with databases

  • Experience and familiarity with Agile-Scrum software development

  • Experience gathering and decomposing requirements

  • Proven record of solution development and deployment

  • Familiarity with web-based application development

  • Experience with testing, use case, and user stories

  • Outstanding communication skills, written and verbal

  • Highly organized and able to manage multiple projects simultaneously

  • Team-player mentality with a positive attitude

  • Keen attention to detail and solid analytical skills

  • Able to articulate complex, abstract concepts concisely and effectively


LMI is an Equal Opportunity Employer. LMI is committed to the fair treatment of all and to our policy of providing applicants and employees with equal employment opportunities. LMI recruits, hires, trains, and promotes people without regard to race, color, religion, sex (including pregnancy), sexual orientation, gender identity, age, national origin, disability, veteran status, or any other factors protected by applicable law.
