Senior Data Engineer TS/SCI w/ poly

Clearance Level
Top Secret SCI + Polygraph
Category
Data Science
Location
Chantilly, Virginia

REQ#: RQ87500

Travel Required: Less than 10%
Requisition Type: Regular

Basic Qualifications

Bachelor's degree in Engineering, Computer Science, Systems Engineering, or another computer or information technology-related field, plus a minimum of 12 years of relevant experience.

CLEARANCE REQUIREMENTS:

A TS/SCI security clearance with the ability to obtain a Polygraph is required at time of hire. Candidates must be able to obtain the Polygraph within a reasonable amount of time from date of hire. Applicants selected will be subject to a U.S. Government security investigation and must meet eligibility requirements for access to classified information. Due to the nature of work performed within our facilities, U.S. citizenship is required.

Job Description

Developing mission-critical systems that help keep people safe is what we do. You will be part of the team that helps heroes make a true impact. The work we do is important. The challenges we face are career-defining. The opportunity we can offer is one-of-a-kind.

We apply advanced technologies such as Artificial Intelligence, Blockchain, AR/VR, Cloud Native and Quantum Physics to solve our customers’ missions in cyber, RF, undersea, interstellar space and everything in between.

As a Data Engineer, you’ll lead modeling and simulation activities as you participate in requirements analysis and management, functional analysis, performance analysis, system design, trade studies, and systems integration and test (verification). It’s your chance to step up to the challenge and prove you’re ready to lead the world.

REPRESENTATIVE DUTIES AND TASKS:

We are seeking a Data Engineer to support the Insider Threat mission. Data Engineers work with various security system data owners to automate data integration and collection strategies, and work closely with the data science team to ensure data cleanliness and accuracy.

  • Support the data science team by designing, developing, and implementing scalable ETL processes that move disparate datasets into a Hadoop infrastructure
  • Design, develop, implement, and maintain data ingestion processes from various disparate datasets using StreamSets (prior StreamSets experience is not mandatory)
  • Develop processes to identify data drift and malformed records
  • Develop technical documentation and standard operating procedures
  • Lead technical tasks for small teams or projects

KNOWLEDGE, SKILLS, AND ABILITIES:

  • Working knowledge of entity resolution systems
  • Experience with messaging systems such as Kafka
  • Experience with NoSQL and/or graph databases such as MongoDB or ArangoDB
  • Experience with relational (SQL) databases such as Oracle or PostgreSQL
  • Working experience with ETL processing
  • Working experience with data workflow products like StreamSets or NiFi
  • Working experience with Python RESTful API services and JDBC
  • Experience with Hadoop and Hive/Impala
  • Experience with Cloudera Data Science Workbench is a plus
  • Understanding of PySpark
  • Leadership experience
  • Creative thinker
  • Ability to multi-task
  • Excellent use and understanding of data engineering concepts, principles, and theories

We are GDIT. The people supporting some of the most complex government, defense, and intelligence projects across the country. We deliver. Bringing the expertise needed to understand and advance critical missions. We transform. Shifting the ways clients invest in, integrate, and innovate technology solutions. We ensure today is safe and tomorrow is smarter. We are there. On the ground, beside our clients, in the lab, and everywhere in between. Offering the technology transformations, strategy, and mission services needed to get the job done.

GDIT is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other protected class.