A global life insurance firm is hiring a Data Modeler to design, maintain, and deliver flexible, enterprise-wide data models.
- Understand and translate business needs into data models supporting long-term solutions.
- Work with the Application Development team to implement data strategies, build data flows and develop data models.
- Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
- Optimize and update logical and physical data models to support new and existing projects.
- Maintain logical and physical data models along with corresponding metadata.
- Develop data models following standard naming conventions and modeling best practices to keep them consistent with company standards.
- Recommend opportunities for reuse of data models in new environments/projects.
- Evaluate data models and physical databases for variances and discrepancies.
- Validate business data objects for accuracy and completeness.
- Analyze data-related system integration challenges and propose appropriate solutions.
- Guide System Analysts, Engineers, Programmers and others on project limitations and capabilities, performance requirements and interfaces.
- Review modifications to existing databases to improve efficiency and performance.
- Discover metadata of the source database, including value patterns and distributions, key candidates, foreign-key candidates, and functional dependencies.
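For illustration, the metadata-discovery duty above (key candidates, functional dependencies) can be sketched in plain Python; the table name, columns, and sample rows below are hypothetical, and a real profiling pass would run against the source database rather than in-memory rows:

```python
from itertools import permutations

# Hypothetical sample rows from a source table (illustrative data only).
rows = [
    {"policy_id": 1, "holder": "A", "branch": "Tokyo", "branch_code": "TK"},
    {"policy_id": 2, "holder": "B", "branch": "Osaka", "branch_code": "OS"},
    {"policy_id": 3, "holder": "A", "branch": "Tokyo", "branch_code": "TK"},
]

def candidate_keys(rows):
    """Columns whose values are unique across all rows (single-column key candidates)."""
    cols = rows[0].keys()
    return [c for c in cols if len({r[c] for r in rows}) == len(rows)]

def functional_dependencies(rows):
    """Single-column dependencies X -> Y: every X value maps to exactly one Y value."""
    cols = list(rows[0].keys())
    deps = []
    for x, y in permutations(cols, 2):
        mapping = {}
        # setdefault stores the first Y seen for each X; any later mismatch breaks the FD.
        if all(mapping.setdefault(r[x], r[y]) == r[y] for r in rows):
            deps.append((x, y))
    return deps

print(candidate_keys(rows))           # policy_id is the only unique column here
print(functional_dependencies(rows))  # includes e.g. branch -> branch_code
```

In practice this kind of profiling is done with dedicated tools or SQL (`COUNT(DISTINCT …)` versus `COUNT(*)` per column), but the logic is the same: uniqueness suggests key candidates, and a stable one-to-one value mapping suggests a functional dependency.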
【会社概要 | Company Details】
Global insurance company with over 40 years of experience in Japan, with strengths across multiple sales channels and a broad product lineup. The company focuses on fostering a diverse workplace, including promoting the appointment of women.
【就業時間 | Working Hours】
9:00 - 17:00 (Mon - Fri)
【休日休暇 | Holidays】
Saturday, Sunday, and National Holidays, Year-end and New Year Holidays, Paid Holidays, Other Special Holidays
【待遇・福利厚生 | Services / Benefits】
Social insurance, Transportation Fee, etc.
- Strong data modeling skills using Erwin required.
- Experience working with HDFS, HBase, Solr, and RDBMSs (Oracle/SQL Server).
- Proven knowledge of and experience with data structures and data management.
- Experience with SQL and NoSQL databases.
- Ability to take on new platforms and adapt to a continuously evolving ecosystem of multiple moving components, including Hive, HBase, Zeppelin, Spark, Livy, Solr, QlikSense, PowerBI, and other BI tools.
- Basic engineering skills, such as a background in mathematics or statistics, and analytical thinking (able to drive reporting and analytical requests).
- Extended knowledge of Hadoop ecosystem and Spark is a plus.
- Knowledge on graph theory and graph databases is a plus.