Data Engineer Position
1 opening | Posted on 29/06
Salary: to be negotiated
About the job
PURPOSE OF THE JOB
As a Data Engineer, you will work on one of the world's largest social media platforms, which ingests a few petabytes of data daily. You will contribute as part of a self-organized R&D team working in a challenging, innovative environment for our client. You will investigate, create, and implement solutions to many technical challenges using cutting-edge technologies, including building and enhancing a data processing platform that powers software used by hundreds of millions of users.

MAIN TASKS AND RESPONSIBILITIES
- Obtains tasks from the project lead or Team Lead (TL), prepares functional and design specifications, and approves them with all stakeholders.
- Ensures that assigned areas are delivered within set deadlines and to the required quality objectives.
- Provides estimations, agrees on task duration with the manager, and contributes to the project plan for the assigned area.
- Analyzes the scope of alternative solutions and makes decisions about area implementation based on his/her experience and technical expertise.
- Leads the functional and architectural design of assigned areas; makes sure design decisions on the project meet architectural and design requirements.
- Addresses area-level risks; provides and implements a mitigation plan.
- Reports on area readiness/quality and raises red flags in crisis situations that are beyond his/her area of responsibility (AOR).
- Responsible for resolving crisis situations within his/her AOR.
- Initiates and conducts code reviews; creates code standards, conventions, and guidelines.
- Suggests technical and functional improvements to add value to the product.
- Constantly improves his/her professional level.
- Collaborates with other teams.

REQUIRED EDUCATION AND EXPERIENCE
Must have:
- University degree in Computer-Related Sciences or similar
- + years' experience as a Data Engineer with strong Python and PySpark coding skills.
- + years' experience working on data migration, data preparation, data gateway, and data warehousing projects.
- + years of experience with ETL orchestration and workflow management tools such as Airflow or Oozie.
- Experience in Spark, Snowflake, and Databricks.
- Expert in database fundamentals, SQL, and distributed computing.
- + years' experience with Apache Superset, Apache Airflow, and Spark, and an understanding of how these technologies work internally.
- Rigor in high code quality, automated testing, and other engineering best practices.
- Strong OOP skills.
- Effective communication, collaboration, and interpersonal skills.
- Results-oriented approach.
- Good English (oral and written) and communication skills in general.

Would be a plus:
- Experience working with Apache Parquet.
- Experience in AWS.