Kasasa (4.0)
Data Engineer
Austin, TX
$65K - $129K (Glassdoor Est.)
Rating Highlights
Compensation & Benefits: 3.2
Culture & Values: 4.5
Career Opportunities: 3.4
Work/Life Balance: 3.9
Job & Company Insights
Job Type: Full-time
Job Function: Data Engineer
Industry: Finance
Size: 201 to 500 Employees
Job
SUMMARY OF PURPOSE:
The Data Engineer designs, develops, enhances, tests, and implements data-centric components that create or enhance value for Kasasa and its clients. Duties include staging first-party data into Kasasa’s new Data Ecosystem, integrating new data from third-party sources, and creating new data services that power Kasasa’s product suite.
The Data Engineer is expected to develop clean, well-designed, well-documented, tested, reusable code in Kasasa’s agile development environment. The Data Engineer operates as part of a cross-functional development team that includes Architecture, CloudOps (DevOps), Product Owners, and other developers, and is expected to participate in architectural discussions, facilitate infrastructure improvements, contribute to agile ceremonies, and estimate and groom new development work.
The Data Engineer works on a team of Data Warehouse Developers/Modelers and a Product Owner, and reports to Kasasa’s Vice President of Data & Business Intelligence.
JOB REQUIREMENTS:
  • Collaborate with Product and Architecture to identify development opportunities that allow Kasasa to achieve its enterprise data vision
  • Participate in scrum team ceremonies to flesh out work for upcoming iterations: provide designs, negotiate scope against feature requests, etc.
  • Build and maintain:
    • systems that help Kasasa determine whether its data is correct, detect problems, and trigger alerts
    • systems that help Kasasa understand data freshness for batch/daily data feeds and other less frequently arriving data assets, and raise alerts (see the sketch after this list)
    • systems that manage metadata about Kasasa’s data assets, supporting auto-documentation and end-user query/use, with which warehouse developers can instrument their code
    • systems that manage Kasasa’s data in a secure and compliant manner according to its policies and legal requirements
    • systems that assist Kasasa in its data governance efforts
    • systems that orchestrate (or help other developers to orchestrate) the transformation, loading, unloading, and publishing of data into and out of Kasasa’s data warehouse ecosystem, including SLA maintenance and monitoring, graceful error handling, etc.
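As a rough illustration of the data-correctness and freshness items above, the following minimal Python sketch checks batch feeds against SLA windows and emits alert messages. The feed names, SLA durations, and hard-coded load ages are hypothetical stand-ins; a real check would read load timestamps from warehouse metadata and route alerts through an actual notification channel.

    from datetime import datetime, timedelta, timezone

    # Hypothetical feeds and their SLA windows.
    FRESHNESS_SLA = {
        "core_transactions_daily": timedelta(hours=26),
        "third_party_rewards_feed": timedelta(days=7),
    }

    def latest_load_time(feed, now):
        # Stand-in for a warehouse metadata query (e.g. a load-audit table);
        # ages are hard-coded so the sketch runs on its own.
        fake_age = {
            "core_transactions_daily": timedelta(hours=30),   # breaches its 26h SLA
            "third_party_rewards_feed": timedelta(days=2),    # within its 7-day SLA
        }
        return now - fake_age[feed]

    def check_freshness(now=None):
        """Return alert messages for feeds whose latest load breaches the SLA window."""
        now = now or datetime.now(timezone.utc)
        alerts = []
        for feed, window in FRESHNESS_SLA.items():
            age = now - latest_load_time(feed, now)
            if age > window:
                alerts.append(f"{feed} is stale: last load {age} ago (SLA window {window})")
        return alerts

    if __name__ == "__main__":
        for message in check_freshness():
            print(message)  # stand-in for a real alert channel (email, Slack, etc.)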
PREFERRED EXPERIENCE:
  • 2–3 years of development experience building enterprise applications using Python to automate complex workflows, consume web APIs, and extract data from first- and third-party systems
  • Experience working with AWS services such as S3, Athena, Glue Catalog, Redshift (especially bulk loading and unloading, Redshift Spectrum, and External Schemas), and Lambda functions, or similar solutions from other cloud providers
  • Experience with Infrastructure as Code (Terraform) and Continuous Integration (GitLab CI/CD)
  • Experience with Airflow, including DAG design and development (see the DAG sketch below)
  • Experience with Kubernetes and Docker environments
  • History of Agile software development
  • Excellent listening, interpersonal, written, and oral communication skills.
  • Strong sense of initiative and ability to contribute and improve upon established systems.
  • Bachelor’s Degree in Computer Science, Computer Engineering, or equivalent experience.
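By way of example for the Airflow and Redshift items above, here is a hypothetical DAG sketch (Airflow 2.4+ style API) that orchestrates a nightly extract-and-load into Redshift. The DAG id, schedule, owner, and task callables are illustrative placeholders rather than Kasasa’s actual pipeline; the callables are left as stubs.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_from_source(**context):
        """Placeholder: pull first- or third-party data and stage it in S3."""

    def load_into_redshift(**context):
        """Placeholder: COPY the staged S3 files into Redshift, with error handling."""

    default_args = {
        "owner": "data-engineering",     # hypothetical owner
        "retries": 2,
        "retry_delay": timedelta(minutes=10),
    }

    with DAG(
        dag_id="nightly_warehouse_load",  # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule="0 5 * * *",             # daily at 05:00 UTC
        catchup=False,
        default_args=default_args,
    ) as dag:
        extract = PythonOperator(
            task_id="extract_from_source",
            python_callable=extract_from_source,
        )
        load = PythonOperator(
            task_id="load_into_redshift",
            python_callable=load_into_redshift,
        )
        extract >> load   # load runs only after extraction succeeds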