Data Engineer (P2/P3) (Data Engineering Team)

Job ID: 20862
Job Type: Permanent
Salary: JPY 7,500,000 - JPY 11,000,000 per year
Japanese Level: Advanced (JLPT Level 1)
English Level: High Intermediate (TOEIC 730)
Start Date: ASAP
Location: Tokyo

Description

One of the world's largest insurance companies is looking for an experienced Data Engineer (P2/P3) within the Data Engineering Team.

Responsibilities

- Build the first iterations of high-quality, sustainable data pipelines and ETL processes that extract data from a variety of APIs and relational databases and ingest it into AWS services (a minimal sketch follows this list).

- Efficiently develop complex SQL queries and data models to aggregate and transform data for reporting and analytics teams.

- Optimize, refactor, and improve our code base, processes, and best practices.

- Monitor existing solutions and work proactively to rapidly resolve errors and identify future problems before they occur.

- Own and keep the system design and operations documents up to date.
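
A rough illustration of the pipeline-building responsibility in the first bullet above: a minimal sketch of one ETL iteration that pulls records from a REST API and lands them as date-partitioned raw JSON in S3 via boto3. The API URL, bucket, and key prefix are hypothetical placeholders, not the client's actual systems.

```python
"""Minimal sketch of one ETL iteration: pull records from a REST API and
land them as raw JSON in S3. All names (URL, bucket, key prefix) are
hypothetical placeholders."""
import json
from datetime import datetime, timezone

import boto3      # AWS SDK for Python
import requests   # HTTP client for the source API

API_URL = "https://api.example.com/v1/policies"  # hypothetical source API
BUCKET = "raw-data-landing-zone"                 # hypothetical S3 bucket


def extract(page: int) -> list[dict]:
    """Fetch one page of records; fail loudly on HTTP errors."""
    resp = requests.get(API_URL, params={"page": page}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("items", [])


def load(records: list[dict]) -> None:
    """Write the batch under a date-partitioned key so Glue/Athena can pick it up."""
    now = datetime.now(timezone.utc)
    key = f"policies/dt={now:%Y-%m-%d}/batch-{now:%H%M%S}.json"
    boto3.client("s3").put_object(
        Bucket=BUCKET, Key=key, Body=json.dumps(records).encode("utf-8")
    )


if __name__ == "__main__":
    page = 1
    batch = extract(page)
    while batch:  # stop when the API returns an empty page
        load(batch)
        page += 1
        batch = extract(page)
```

In practice a step like this would typically be orchestrated with Step Functions or a Glue job and monitored through CloudWatch, in line with the responsibilities above.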


【Company Details】

Our client is one of the world's largest insurance and financial groups, trusted by over 50 million customers. The company provides financial protection throughout the customer's lifetime, including general insurance, life insurance, retirement funds, and inheritance planning, for individual customers, small businesses, and large companies.

【Working Hours】
9:00am - 5:00pm Mon-Fri (Hybrid: 2 days in office and 3 days WFH every week)


【Holidays】
Saturday, Sunday, and National Holidays, Year-end and New Year Holidays, Paid Holidays, Other Special Holidays


【Services / Benefits】

Social insurance, transportation expenses covered, no smoking indoors (designated smoking areas), etc.

Required Skills

- Experience in data/analytics in an engineering or analyst role

- Experience working on data pipelines or analytics projects using SQL and either Python or Java

- Experience gathering requirements, designing data models, and developing DWHs or data marts

- Knowledge of and experience working with the following AWS services: S3, IAM, ECS/EC2, Lambda, Glue, Athena, Kinesis/Spark Streaming, Step Functions, CloudWatch, Redshift, RDS (see the Athena sketch after this list)

- Ability to work in a Linux/Unix environment and on remote machines
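
To illustrate the SQL-plus-Python combination the skills list calls for, here is a hedged sketch that runs an aggregation query through Athena via boto3 and polls until it finishes. The database, table, column names, and results bucket are hypothetical placeholders.

```python
"""Sketch: run an aggregation query on a DWH-style table through Athena
from Python and poll until it reaches a terminal state. Database, table,
columns, and the results bucket are hypothetical placeholders."""
import time

import boto3

QUERY = """
SELECT dt,
       COUNT(*)     AS policy_count,   -- hypothetical columns
       SUM(premium) AS total_premium
FROM analytics_db.policies             -- hypothetical database.table
GROUP BY dt
ORDER BY dt
"""


def run_query(sql: str) -> str:
    """Start the query, then poll get_query_execution until it terminates."""
    athena = boto3.client("athena")
    qid = athena.start_query_execution(
        QueryString=sql,
        # hypothetical results bucket
        ResultConfiguration={"OutputLocation": "s3://athena-results-bucket/"},
    )["QueryExecutionId"]
    while True:
        status = athena.get_query_execution(QueryExecutionId=qid)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return state
        time.sleep(2)


if __name__ == "__main__":
    print(run_query(QUERY))
```

A query like this is the kind of aggregation the reporting and analytics teams would consume, with results landing in S3 for downstream use.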