ETL/Hadoop Developer - Cloud - US Citizen (Public Trust) - Telework/Remote

Clearance Level
None
Category
Cloud
Location
Alexandria, Virginia

REQ#: RQ99461

Travel Required: Less than 10%
Public Trust: Other
Requisition Type: Regular

Big Data Cloud ETL/Hadoop Developer

GDIT is your place! You make it your own by bringing your ideas and unique perspective to our culture. By owning your opportunity at GDIT, you are helping us ensure today is safe and tomorrow is smarter. Our work depends on a Big Data Cloud ETL/Hadoop Developer joining our team to support GDIT's customer activities in Alexandria, VA.

Role & Responsibilities:

Design and implement Big Data analytic solutions on the Cloudera Data Platform. Create custom analytic and data mining algorithms to help extract knowledge and meaning from vast stores of data. Refine data processing pipelines for unstructured and semi-structured data. Support both quick-turn implementations and larger-scale, longer-duration analytic capability implementations. Telework/Remote. Responsibilities include:
  • Design, develop, and implement solutions in Cloudera (CDP and SDX).
  • Leverage CDP features to build hybrid-cloud architectures (CDP Public Cloud).
  • 5+ years of experience required.
  • Work as an ETL (Extract Transform Load) Developer for the Hadoop enterprise data platform, specifically with Cloudera ecosystem components such as HDFS, Sentry, HBase, Impala, Hue, Spark, Hive, Kafka, YARN, and ZooKeeper.
  • Schedule jobs using Apache NiFi or Airflow.
  • Design Big Data/Hadoop platforms (Hive, HBase, Kafka, YARN, Impala, etc.) and identify and handle possible failure scenarios.
  • Develop components for big data platforms related to data ingestion, storage, transformation and analytics.
  • Execute and troubleshoot Spark and Hive jobs.
  • Develop shell, Scala, and Python scripts to transform data in HDFS and automate workflows.
  • Debug, configure and tune components of Hadoop ecosystems as part of the development activities.
  • Import and export data between HDFS and relational database systems using Sqoop for new data pipelines.
  • Analyze, recommend, and implement improvements to support Environment Management initiatives.
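The ingest-transform-load pattern the responsibilities above describe can be sketched in plain Python. This is only an illustration of the ETL shape, not the production stack: in this role the same logic would typically run as a Spark or Hive job over HDFS, and the field names (`id`, `name`) are hypothetical.

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse semi-structured CSV text into dict records (the 'E' step)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Normalize fields and drop malformed rows (the 'T' step)."""
    cleaned = []
    for rec in records:
        if not rec.get("id"):
            continue  # skip rows missing the required key
        cleaned.append({"id": int(rec["id"]), "name": rec["name"].strip().lower()})
    return cleaned

def load(records: list[dict], sink: list) -> None:
    """Append cleaned records to the target store (a list stands in for a table)."""
    sink.extend(records)

# Example run: one well-formed row, one malformed row, one more well-formed row.
raw = "id,name\n1, Alice \n,orphan\n2,Bob\n"
table: list[dict] = []
load(transform(extract(raw)), table)
```

On a real cluster, `extract` would read from HDFS or Kafka, `transform` would be a Spark DataFrame operation, and `load` would write to Hive or Impala; the three-stage decomposition stays the same.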

Requirements/Skills:

  • Proven core skills to ingest, transform, and process data using Apache Spark
  • Strong ETL Developer experience is needed
  • Hands-on experience with all Cloudera Data Platform tools (on-premises and in the cloud)
  • Experience with Cloudera Data Science Workbench and Cloudera Data Flow products
  • Cloudera certifications such as Cloudera Certified Professional (CCP) Data Engineer or Spark and Hadoop Developer
  • Experience with Hadoop and the HDFS Ecosystem
  • 5+ years of strong experience with Apache Spark, Storm, and Kafka is a must
  • Experience with Python, R, Knox, Tomcat and Ambari
  • A minimum of 5 years working with HBase, Hive, and MapReduce (MRv1/MRv2) is required
  • Experience in integrating heterogeneous applications is required
  • Experience working to resolve a variety of infrastructure issues
  • Experience with Core Java, Scala
  • Experience with Relational Database Systems/SQL and hierarchical data management
  • Strong experience with ETL tools such as Sqoop and Pig
  • Experience with data-modeling and implementation
  • Machine learning or AI experience, especially with Python/TensorFlow, is a big plus
  • Telework/Remote work allowed.

WHAT YOU’LL NEED:

  • Degree in a related field or equivalent work experience
  • 5+ years of relevant ETL experience
  • 5+ years of Hadoop and Apache Spark technical skills
  • US Citizen or Green Card required.
  • Must be able to obtain a Public Trust security clearance

WHAT GDIT CAN OFFER YOU:

  • Full-flex work week
  • 401K with company match
  • Internal mobility team dedicated to helping you own your career
  • Collaborative teams of highly motivated critical thinkers and innovators
  • Ability to make a real impact on the world around you

This position requires being fully vaccinated against COVID-19 by December 8, 2021 or the start date, if after December 8.

We are GDIT. The people supporting some of the most complex government, defense, and intelligence projects across the country. We deliver. Bringing the expertise needed to understand and advance critical missions. We transform. Shifting the ways clients invest in, integrate, and innovate technology solutions. We ensure today is safe and tomorrow is smarter. We are there. On the ground, beside our clients, in the lab, and everywhere in between. Offering the technology transformations, strategy, and mission services needed to get the job done.

GDIT is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status, or any other protected class.