Cognitio
Data Engineer (ETL)
McLean, VA
$73K - $146K (Glassdoor Est.)
Rating Highlights
Compensation & Benefits: 1.3
Culture & Values: 1.7
Career Opportunities: 1.7
Work/Life Balance: 1.3
Job & Company Insights
Job Type: Full-time
Job Function: Data Engineer
Industry: N/A
Size: 201 to 500 Employees
Job
The Data Engineer will manipulate data and data flows for both existing and new systems. Additionally, they will provide support in the areas of data extraction, transformation, and load (ETL); data mapping; analytical support; operational support; database support; and maintenance of data and associated systems. As a member of the team, candidates will work in a multi-tasking, fast-paced, dynamic, process-improvement environment that requires experience with the principles of large-scale (terabytes) database development, large-scale file manipulation, data modeling, data mapping, data testing, data quality, and documentation preparation.
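As a concrete illustration of the ETL work described above, here is a minimal sketch of an extract-transform-load step in Python. All specifics (the CSV source, the `payments` table, and the column names) are hypothetical examples, not details from this posting; production pipelines here would use tools such as NiFi or Pentaho rather than hand-rolled scripts.

```python
# Minimal illustrative ETL sketch: extract rows from a CSV source,
# transform them (normalize a name, convert dollars to integer cents),
# and load them into a SQLite table. Names and schema are hypothetical.
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: strip/title-case names, cast dollar amounts to cents."""
    return [
        (row["name"].strip().title(), int(float(row["amount"]) * 100))
        for row in rows
    ]

def load(records, conn):
    """Load: write transformed records into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (name TEXT, cents INTEGER)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", records)
    conn.commit()

source = "name,amount\n alice ,12.50\n BOB ,3.00\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(source)), conn)
print(conn.execute("SELECT name, cents FROM payments").fetchall())
# → [('Alice', 1250), ('Bob', 300)]
```

The same extract/transform/load separation scales up: in a real pipeline each stage would be a NiFi processor or Pentaho step, with data quality checks and error routing between stages.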

KEY RESPONSIBILITIES
  • Researches, designs, develops, and/or modifies enterprise-wide systems and/or application software.
  • Develops complex data flows or makes significant enhancements to existing pipelines.
  • Resolves complex hardware/software compatibility and interface design considerations.
  • Conducts investigations and tests of considerable complexity.
  • Provides input to staff involved in writing and updating technical documentation.
  • Troubleshoots complex problems and provides customer support for the ETL process.
  • Prepares reports on analyses, findings, and project progress.
  • Provides guidance and work leadership to less-experienced software engineers.



Requirements

REQUIRED KNOWLEDGE/SKILLS:

  • Candidate must have an active TS/SCI Full Scope Polygraph.
  • Bachelor’s Degree in Computer Science, Electrical or Computer Engineering, or a related technical discipline, or the equivalent combination of education, technical training, or work/military experience.
  • 8-10 years of related software engineering and ETL experience.
  • Experience building and maintaining data flows in NiFi or Pentaho.
  • Excellent organizational, coordination, interpersonal, and team-building skills.
  • Familiarity with NoSQL datastores.

DESIRED KNOWLEDGE/SKILLS

  • Familiarity with executing jobs in big data technologies (e.g., Hadoop or Spark)
  • Experience with the following languages: Java/J2EE, C, C++, SQL, XML, XQuery, XPath, Ruby on Rails, HTML/XHTML, CSS, Python, Shell Scripting, JSON
  • Knowledge of server operating systems (Windows, Linux), distributed computing, blade centers, and cloud infrastructure
  • Strong problem-solving skills
  • Ability to comprehend database methodologies
  • Focus on continual process improvement with a proactive approach to problem solving
  • Ability to follow directions and finish tasks