JOB DESCRIPTION: The Hadoop Engineer will design, develop, code, test, and debug complex software products, or make significant enhancements to existing software. The ideal candidate is a hands-on platform builder with significant experience developing scalable data platforms for business intelligence, analytics, data science, and data products. They must have strong, firsthand technical expertise in a variety of configuration management and big data technologies and a proven ability to build robust, scalable solutions that can manage large data sets. They must be at ease working in an agile environment with little supervision, and they should embody a passion for continuous improvement and innovation.
REQUIRED KNOWLEDGE/SKILLS:
1. Bachelor’s Degree in Computer Science, Electrical or Computer Engineering, or a related technical discipline, or the equivalent combination of education, technical training, or work/military experience
2. 10-15 years of related software development experience
3. Hands-on development experience with Java, Scala, and Python
DESIRED KNOWLEDGE/SKILLS:
1. Hands-on experience with data formats including XML, PCAP, images, and media
2. Hands-on experience working with Hadoop, Hive, MapReduce, Spark, NiFi, Kafka, and HBase
3. DevOps experience building and deploying cloud infrastructure with technologies such as Ansible, Chef, and Puppet
4. Experience with test-driven development and automated testing frameworks
5. Experience with Scrum/Agile development methodologies
6. Capable of delivering on multiple competing priorities with little supervision
7. Excellent verbal and written communication skills
8. We’re looking for someone with 3-5 years of experience who is familiar with the following software/tools:
• Infrastructure automation technologies such as Docker and Kubernetes
• Build automation technologies such as Maven and Jenkins
• Monitoring technologies such as Nagios, Ganglia, and Grafana
• Modern programming languages such as Java and Python
• Building APIs and services using REST and GraphQL
• Elasticsearch and non-relational databases
KEY RESPONSIBILITIES:
1. Analyze, design, and develop tests and test-automation suites
2. Design and develop a processing platform using various configuration management technologies
3. Apply test-driven software development methodology in an agile environment
4. Provide ongoing maintenance, support, and enhancements for existing systems and platforms
5. Collaborate cross-functionally with data scientists, analysts, project managers, and other engineers
6. Troubleshoot complex problems and provide customer support for software systems and application issues
7. Provide recommendations for continuous improvement
8. Work alongside other engineers on the team to sustain and advance our organization’s capabilities
We are GDIT. The people supporting some of the most complex government, defense, and intelligence projects across the country. We deliver. Bringing the expertise needed to understand and advance critical missions. We transform. Shifting the ways clients invest in, integrate, and innovate technology solutions. We ensure today is safe and tomorrow is smarter. We are there. On the ground, beside our clients, in the lab, and everywhere in between. Offering the technology transformations, strategy, and mission services needed to get the job done.
GDIT is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other protected class.