Big Data/Hadoop Developer

Clearance Level
Not Applicable
Systems Engineering
Falls Church, Virginia

REQ#: RQ43984

Requisition Type: Pipeline

General Dynamics Information Technology is the premier provider of high-tech IT solutions to the government IT market. At General Dynamics Information Technology, we deliver cost-effective, next-generation IT solutions and services to the Department of Defense, the intelligence community and federal civilian agencies as they modernize their information systems.

GDIT is looking for a reliable, motivated, and results-oriented platform administrator for a Big Data platform built on the Big Data technology stack in AWS GovCloud. The candidate must have 8-10 years of progressive experience administering large-scale enterprise technology solutions, including at least 3 years of experience administering Big Data solutions on the Hadoop stack. In addition to the Hadoop stack, the candidate will administer AWS PaaS services such as RDS databases, S3, and Glacier.


Specifically, this position involves the following responsibilities:

·        Install, configure, and deploy technology solutions using a DevOps tool chain (e.g., Ansible)

·        Hadoop cluster administration, expansion, and upgrades in cloud-hosted environments

·        Monitor Hadoop cluster availability, connectivity, and security

·        Set up Linux users, groups, Kerberos principals, and keys

·        Thorough understanding of AD/LDAP integration with RHEL, SSSD, and KDC configurations

·        Incident management, Issue/problem resolution, troubleshooting and performance tuning, logs management, alerts monitoring

·        At least 3 years of experience architecting, designing, implementing, and administering Hadoop infrastructure

·        At least 2 years of experience in project life cycle activities on development and maintenance projects

·        Familiarity with FedRAMP compliance requirements

·        In addition, this position works with our customers as an infrastructure solutions engineer on assigned projects throughout the development cycle.

·        Operational expertise in troubleshooting; understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks

·        Required skills: cluster maintenance, cluster security, setting up production clusters, Kerberos, Ranger, Knox topology, cluster tuning and optimization, and implementing backup and failover; debugging experience preferred

·        3-5 years of Hadoop administration experience required

·        3-5 years of hands-on experience with Hortonworks and Cloudera preferred

·        3-5 years of Amazon Web Services knowledge and expertise preferred (setup and configuration)

·        Experience with Data Lakes and Big Data
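As a hypothetical illustration of the DevOps-driven administration described above (Ansible deployment plus Linux user and Kerberos keytab setup), a minimal playbook sketch might look like the following. The host group, user names, and keytab path are illustrative assumptions, not actual GDIT configuration:

```yaml
# Hypothetical example -- host group, account names, and paths are
# illustrative assumptions, not actual GDIT configuration.
- name: Provision Hadoop service accounts and Kerberos keytabs
  hosts: hadoop_workers
  become: true
  tasks:
    - name: Create the hadoop group
      ansible.builtin.group:
        name: hadoop
        state: present

    - name: Create service users for core Hadoop daemons
      ansible.builtin.user:
        name: "{{ item }}"
        group: hadoop
        shell: /sbin/nologin
      loop: [hdfs, yarn, mapred]

    - name: Deploy a pre-generated keytab for the HDFS service principal
      ansible.builtin.copy:
        src: files/hdfs.service.keytab
        dest: /etc/security/keytabs/hdfs.service.keytab
        owner: hdfs
        group: hadoop
        mode: "0400"
```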

This position requires being fully vaccinated against COVID-19 by December 8, 2021 or the start date, if after December 8.

We are GDIT. The people supporting some of the most complex government, defense, and intelligence projects across the country. We deliver. Bringing the expertise needed to understand and advance critical missions. We transform. Shifting the ways clients invest in, integrate, and innovate technology solutions. We ensure today is safe and tomorrow is smarter. We are there. On the ground, beside our clients, in the lab, and everywhere in between. Offering the technology transformations, strategy, and mission services needed to get the job done.

GDIT is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status, or any other protected class.