Our client is looking for an experienced Data Engineer to join its Data Management team, with a particular focus on building data pipelines for reporting.
- The day-to-day work involves meeting with business stakeholders to confirm reporting requirements, confirming metric definitions with those stakeholders and with data and risk analysts, designing flexible data structures and data marts to transform the data efficiently, and then authoring, optimizing, deploying, and maintaining the queries at the core of the data pipeline that delivers each report.
- The team is also responsible for building and maintaining data pipelines for critical reports that keep the business running.
- You will not only handle the day-to-day work above but also proactively identify and propose improvements to the reporting workflow itself: for example, optimizing ETL pipelines and the shared ETL architecture behind reports with similar upstream data, proposing new processes to streamline report creation, or trying new techniques and technologies to validate report results and minimize report maintenance time.
Duties & Responsibilities
- Consult with business stakeholders on reporting needs, collect requirements, and document report background and technical specifications. Work with the same stakeholders to collect feedback on report contents.
- Conduct ad-hoc analysis to confirm requirements and the definitions of important report metrics, often working together with analysts in credit risk and/or data science. Identify, investigate, and resolve data discrepancies in reports.
- Design and document ETL pipelines and data structures upstream of the final generated reports. Find ways to optimize the ETL in the pipeline and adjust upstream data models/structures to increase flexibility, efficiency, or ease of achieving high accuracy in reports.
- Monitor report generation and provide first-line support for any issues with deployed reporting pipelines and generated reports.
- Working with data scientists, design and deploy BI tool dashboards to add value to reporting pipelines for stakeholders (we use Looker).
- Contribute to data lake/warehouse development and other data integration projects.
【会社概要 | Company Details】
A successful fintech company that provides a payment platform letting online merchants accept real-time payments from consumers without credit cards.
【就業時間 | Working Hours】
Work Location: From anywhere in Japan
【休日休暇 | Holidays】
Saturday, Sunday, and National Holidays, Year-end and New Year Holidays, Paid Holidays, Other Special Holidays
【待遇・福利厚生 | Services / Benefits】
Full social insurance (employees' pension, health insurance, workers' accident compensation insurance, employment insurance), no smoking indoors as a rule (outdoor smoking area available), commuting allowance, etc.
Requirements
- A Bachelor's degree in Computer Science, Information Technology, or a related subject
- Experience in data engineering or software development of data-intensive applications
- Strong skills in Python, Git, Docker, and SQL, and experience building ETL pipelines with an ETL/job orchestration tool such as Airflow or Prefect (we use Prefect)
- Experience with financial/accounting products or services, in particular an understanding of their technical implementation impact and considerations
- Creative problem-solving skills and a passion for programming and solving problems with code