Job Information

General Electric Data Engineer in Bengaluru, India

Job Description Summary

You will be responsible for building, testing, and maintaining data orchestration across different layers, and for working with the internal audit/development team to identify and resolve discrepancies. You will work as part of the operations team to provide technical support, collaborating with developers to enhance existing products and introduce new changes as part of process improvement.

Job Description

Roles and Responsibilities

In this role, you will:

  • Leverage technical data dictionaries and business glossaries to analyze the datasets

  • Perform data profiling and data analysis for any source systems and the target data repositories

  • Understand metadata and the underlying data structures needed to standardize the data load processes.

  • Develop data mapping specifications based on the results of data analysis and functional requirements

  • Perform a variety of data loads & data transformations using multiple tools and technologies.

  • Build automated Extract, Transform & Load (ETL) jobs based on data mapping specifications (a brief illustrative sketch follows this list)

  • Validate the data mapping results against the expected results

  • Implement the Data Quality (DQ) rules provided
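
To illustrate the kind of mapping-driven ETL work described above, here is a minimal sketch in Python using pandas. The file names, column names, mapping specification, and data quality rules are hypothetical placeholders for illustration only, not GE's actual tooling or pipeline.

```python
# Minimal illustrative sketch of a mapping-driven ETL job (hypothetical names throughout).
import pandas as pd

# Hypothetical mapping specification: source column -> (target column, per-value transform).
MAPPING_SPEC = {
    "cust_id": ("customer_id", str.strip),
    "ord_amt": ("order_amount", float),
    "ord_dt":  ("order_date", pd.to_datetime),
}


def extract(path: str) -> pd.DataFrame:
    """Extract: read the raw source file as strings."""
    return pd.read_csv(path, dtype=str)


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Transform: rename and convert columns according to the mapping specification."""
    out = pd.DataFrame(index=raw.index)
    for src_col, (tgt_col, convert) in MAPPING_SPEC.items():
        # na_action="ignore" leaves missing values untouched instead of failing the conversion.
        out[tgt_col] = raw[src_col].map(convert, na_action="ignore")
    return out


def apply_dq_rules(df: pd.DataFrame) -> pd.DataFrame:
    """Simple Data Quality rules: require a customer key and a non-negative amount."""
    df = df.dropna(subset=["customer_id"])
    return df[df["order_amount"] >= 0]


def load(df: pd.DataFrame, path: str) -> None:
    """Load: write the standardized dataset to the target location."""
    df.to_csv(path, index=False)


if __name__ == "__main__":
    load(apply_dq_rules(transform(extract("orders_raw.csv"))), "orders_clean.csv")
```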

Education Qualification

Bachelor's Degree with basic experience.

Desired Characteristics

Technical Expertise:

  • Ability to understand logical and physical data models, big data storage architecture, data modeling methodologies, metadata management, master data management & data lineage techniques

  • Hands-on experience in programming languages like Java, Python or Scala

  • Hands-on experience in writing SQL scripts for Oracle, MySQL, PostgreSQL or Hive

  • Experience in handling both Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) data models

  • Experience with, or a good understanding of, Spark-based ETL development leveraging data from object storage such as AWS S3 (a brief sketch follows this list)

  • Experience with Big Data technologies such as Hadoop, Spark, Hive, and NoSQL database engines (e.g., Cassandra or HBase)

  • Exposure to unstructured datasets and the ability to handle XML and JSON file formats

  • Exposure to Extract, Transform & Load (ETL) tools like Informatica/Talend and open-source technologies such as Airflow and Drill
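
As a sketch of the Spark-based ETL from object storage mentioned above, the following minimal PySpark example reads hypothetical JSON files from an S3 path, standardizes a few columns, and writes partitioned Parquet back. Bucket names, columns, and paths are placeholders, and the snippet assumes a Spark environment already configured for s3a access.

```python
# Minimal illustrative PySpark ETL sketch; all paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("orders-etl-sketch")
    .getOrCreate()
)

# Extract: read raw JSON files from an S3 bucket (placeholder path).
raw = spark.read.json("s3a://example-bucket/raw/orders/")

# Transform: standardize column names and derive a simple daily aggregate.
orders = (
    raw.withColumnRenamed("cust_id", "customer_id")
       .withColumn("order_date", F.to_date("ord_dt"))
)
daily_totals = (
    orders.groupBy("customer_id", "order_date")
          .agg(F.sum("order_amount").alias("daily_amount"))
)

# Load: write the curated dataset back to object storage as partitioned Parquet.
(
    daily_totals.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-bucket/curated/daily_totals/")
)

spark.stop()
```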

Domain Expertise:

  • Exposure to finance and accounting data domains

  • Exposure to customer and sourcing data domains

Leadership skills:

  • Partner with other team members to understand the project objectives and resolve technical issues.

  • Communicate project status and challenges clearly and concisely to cross-functional team members.

Note:

To comply with US immigration and other legal requirements, it is necessary to specify the minimum number of years’ experience required for any role based within the USA. For roles outside of the USA, to ensure compliance with applicable legislation, the JDs should focus on the substantive level of experience required for the role and a minimum number of years should NOT be used.

This Job Description is intended to provide a high level guide to the role. However, it is not intended to amend or otherwise restrict/expand the duties required from each individual employee as set out in their respective employment contract and/or as otherwise agreed between an employee and their manager.

Additional Information

Relocation Assistance Provided: Yes
