

 
Company Info
AMAZON
Brisbane, CA, United States

Company Profile


Big Data Engineer



Job ID:

12361

Location:

Seattle, WA, United States 

Category:

Analytics, Big Data Analytics, Big Data Architect, ETL Developer, Hadoop

Job Views:

225

Posted:

11.05.2018

Job Description:

The big data engineer develops, supervises, and maintains data integration processes for the data lake, building data pipelines with state-of-the-art ETL tools in alignment with business representatives and their information needs. The big data engineer also executes the master data management policies developed by the data architect, runs data quality evaluations, and works with business representatives to raise data quality to the required levels.

Job Requirements:

What are my responsibilities?

• Integration of large structured and unstructured data volumes into the existing cloud platforms
• Development of scalable end-to-end data pipelines for batch and stream processing
• Execution of data integration activities (ETL / ELT) to populate the data lake and integrate diverse data sources
• Execution and further development of the physical implementation of the logical data model in the data lake
• Implementation of solutions for reference data and master data management within the mobility data business
• Execution of data quality measurements and implementation of data quality improvements to reach the required levels
• Support of the build-up and maintenance of a data directory for all data relevant to the mobility data business
• Representation of the Data Architecture team in selected data architecture, data modeling, and metadata management work teams inside Mobility

What do I need to qualify for this job?

• University degree in a relevant field (e.g. informatics)
• At least 2 years of relevant work experience
• Experience with modern big data technologies such as Hadoop, MapReduce, Kafka, Hive, Presto, and Spark
• Experience with cloud solutions such as AWS
• Experience with programming languages such as SQL, Scala, Python, and Java
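To illustrate the kind of batch ETL and data-quality work described above, here is a minimal, self-contained sketch in Python. It is not code from Amazon or any specific toolchain: the function names, the sample data, and the "amount is required" quality rule are all hypothetical, and a production pipeline would typically use Spark or a managed ETL service rather than the standard library and an in-memory SQLite table standing in for the data lake.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; in practice this would come from S3, Kafka, etc.
RAW_CSV = """order_id,customer,amount
1,alice,19.99
2,bob,
3,carol,42.50
"""

def extract(source: str):
    """Extract: parse raw CSV rows (here from an in-memory string)."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    """Transform: apply a simple data-quality rule and cast types."""
    clean = []
    for row in rows:
        if not row["amount"]:  # illustrative rule: amount is mandatory
            continue
        clean.append((int(row["order_id"]), row["customer"], float(row["amount"])))
    return clean

def load(rows, conn):
    """Load: write validated rows into a target table (stand-in for the lake)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, customer TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # bob's row is dropped by the quality rule, leaving 2
```

The same extract → transform → load shape carries over to the tools the posting names: the transform step is where the data-quality evaluations and master-data rules live, regardless of whether it runs in plain Python or as a Spark job.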


