Scala Hadoop Developer

Job Type
Outsource
Salary
700,000 JPY - 1,000,000 JPY per month + Transportation + Health benefits + Flexible Working Hours
Japanese Level
None
English Level
Advanced (TOEIC 860)
Start Date
ASAP
Location
Tokyo

Description

Our client, one of the largest global insurance services providers in Japan (MetLife), is looking for a talented, creative, and passionate Hadoop Developer with strong knowledge of Scala to join their Data Development Team. You must be a strong logical thinker, as you will be developing applications for data ingestion, data transformation, and data distribution projects.

 

- You will develop on the Hadoop ecosystem using Scala.

- You will report to the Data Development Manager and be part of a 17-member team.

 

DUTIES & RESPONSIBILITIES

- Develop and maintain applications on the Hadoop ecosystem

- You will develop applications to support data ingestion, data transformation, and data distribution projects (a sketch of this kind of job follows this list)

- You will be responsible for providing data feeds, reports, and coordination for downstream applications

- You will coordinate with internal teams, such as the operations and platform teams, for day-to-day work
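
For illustration only, here is a minimal sketch of the kind of Spark job in Scala this role might involve, covering ingestion, transformation, and distribution in one pass. The table and column names (landing.policies_raw, curated.policies, policy_id, premium) are hypothetical assumptions, not details from the client.

    import org.apache.spark.sql.{SaveMode, SparkSession}
    import org.apache.spark.sql.functions.col

    // Hypothetical example: table and column names are illustrative,
    // not taken from the client's environment.
    object PolicyIngestJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("policy-ingest")
          .enableHiveSupport() // read and write Hive tables on the cluster
          .getOrCreate()

        // Ingestion: read a raw landing table from Hive
        val raw = spark.table("landing.policies_raw")

        // Transformation: basic cleansing and typing
        val curated = raw
          .filter(col("policy_id").isNotNull)
          .withColumn("premium_jpy", col("premium").cast("decimal(18,2)"))
          .dropDuplicates("policy_id")

        // Distribution: publish a curated table for downstream consumers
        curated.write
          .mode(SaveMode.Overwrite)
          .saveAsTable("curated.policies")

        spark.stop()
      }
    }

A job like this would typically be packaged with sbt and scheduled on the cluster, for example via Oozie.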

 

【Company Details】
A global insurance company with over 40 years of experience in Japan, with strengths in its diverse sales channels and product lineup. The company is committed to building a diverse work environment, with a particular focus on promoting women.

【Working Hours】
9:00 - 18:00 (Mon - Fri)

【Holidays】
Full two-day weekend (Saturdays, Sundays, and national holidays off), year-end and New Year holidays, annual paid leave, and other special holidays

【Services / Benefits】

Full social insurance (employees' pension, health insurance, workers' accident compensation insurance, employment insurance), no smoking indoors as a rule (outdoor smoking area available), commuting allowance, etc.

 

Required Skills

TECHNICAL SKILLS

  • Sound understanding of the Hadoop ecosystem
  • Hands-on experience with Hadoop components: HDFS, Hive, HBase, Phoenix, Solr, Oozie
  • At least 3 years of experience with Scala
  • Strong coding expertise in Scala and Spark
  • Good understanding of database concepts and SQL
  • Experience with Unix and shell scripts
  • Good knowledge of Git and sbt (a sample build definition follows this list)
  • Strong logical thinker
  • Strong analytical mind to help solve complicated problems
  • Desire to resolve issues and dive into potential issues
  • Self-starter who works with minimal supervision and can collaborate in a team with diverse skill sets
  • Ability to comprehend customer requests and provide an optimized solution
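
As a rough illustration of the Git and sbt toolchain above, a minimal build.sbt for a Spark project of this kind might look like the following. The organization, project name, and version numbers are hypothetical assumptions, not requirements of the role.

    // Hypothetical build.sbt; organization, name, and versions are illustrative.
    ThisBuild / organization := "com.example.data"
    ThisBuild / scalaVersion := "2.12.18"

    lazy val root = (project in file("."))
      .settings(
        name := "hadoop-data-pipelines",
        libraryDependencies ++= Seq(
          // "Provided" because the cluster supplies Spark at runtime
          "org.apache.spark" %% "spark-sql"  % "3.3.2" % Provided,
          "org.apache.spark" %% "spark-hive" % "3.3.2" % Provided,
          "org.scalatest"    %% "scalatest"  % "3.2.17" % Test
        )
      )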