Archived: Data Engineer


About the vacancy

An American financial startup that specializes in loans and insurance for families is looking for a Data Engineer.

The company's goal is to make insurance great again, because at this point it is too painful and time-consuming. That is why the startup was founded: to create the easiest possible experience for users. Enabled by disruptive technologies, it aims for a one-click user experience in financial product offerings that traditionally require users to fill out long forms.
Based in Silicon Valley, the company is looking for a Data Engineer to work in a distributed team. The startup is backed by Y Combinator, SV Angel, Funders Club, and many other prominent Silicon Valley investors. It was founded by serial entrepreneurs who previously built and scaled YourMechanic (“Uber for car repair,” the nation’s largest on-demand car repair site).

About the role:

We are looking for a Data Engineer who is passionate and motivated to make an impact by creating a robust and scalable data platform. In this role, you will own the company’s core data pipeline that powers top-line metrics. You will also leverage your data expertise to help evolve data models across various components of the data stack. You will architect, build, and launch highly scalable and reliable data pipelines to support the company’s growing data processing and analytics needs. Your efforts will open up access to business and user-behavior insights, fueling other functions such as Analytics, Data Science, Operations, and many others.

Responsibilities:

  • Own the core company data pipeline and scale up the data processing flow to meet rapid data growth.
  • Continuously evolve the data model and schema based on business and engineering needs.
  • Implement systems that track data quality and consistency (an illustrative sketch follows this list).
  • Develop tools that support self-service data pipeline management (ETL).
  • Tune SQL and MapReduce jobs to improve data processing performance.
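
The data quality bullet above is the most concrete of these responsibilities, so here is a minimal, hypothetical sketch of the kind of check such a system might run after an ETL load. It is only an illustration: the in-memory sqlite3 database, the loans table, and the thresholds are made up and stand in for the actual Postgres/Hadoop/AWS stack mentioned in the requirements below.

import sqlite3

# Hypothetical quality thresholds for a freshly loaded "loans" table.
MAX_NULL_RATE = 0.01   # at most 1% of rows may have a missing amount
MIN_ROW_COUNT = 1      # an empty load is treated as a failure

def check_loans_table(conn):
    """Return a list of human-readable data-quality violations."""
    problems = []

    total = conn.execute("SELECT COUNT(*) FROM loans").fetchone()[0]
    if total < MIN_ROW_COUNT:
        return ["loans table is empty after load"]

    null_amounts = conn.execute(
        "SELECT COUNT(*) FROM loans WHERE amount IS NULL"
    ).fetchone()[0]
    if null_amounts / total > MAX_NULL_RATE:
        problems.append(f"{null_amounts}/{total} rows have a NULL amount")

    negative = conn.execute(
        "SELECT COUNT(*) FROM loans WHERE amount < 0"
    ).fetchone()[0]
    if negative:
        problems.append(f"{negative} rows have a negative amount")

    return problems

if __name__ == "__main__":
    # In-memory stand-in for the real warehouse, just to keep the sketch runnable.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE loans (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO loans VALUES (?, ?)",
                     [(1, 100.0), (2, None), (3, -5.0)])
    for issue in check_loans_table(conn):
        print("DATA QUALITY:", issue)

In practice, a pipeline would run checks like these as a post-load step and fail the job (or raise an alert) whenever the list of violations is non-empty.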

Requirements:

  • Proficient in SQL, especially the Postgres dialect.
  • Expertise in either Python or JavaScript for developing and maintaining data pipeline code.
  • Experience with BI software (such as Tableau).
  • Experience with the Hadoop (or similar) ecosystem.
  • Experience deploying and maintaining data infrastructure in the cloud (experience with AWS preferred).
  • Comfortable working directly with data analysts to bridge business requirements with data engineering.

 

Send your CV to hr@digitalhr.ru
