Data Scientist Informatica PowerCenter jobs in Alacant

These are the latest Data Scientist Informatica PowerCenter job offers found in Alacant.


  • 18/05/2022

    Comunitat Valenciana

    We seek new teammates with a can-do attitude, a creative mindset, curiosity, problem-solving skills and a thirst for knowledge. We'd welcome you with open arms! Reporting to the Head of Data Engineering, your main objectives are:

    MAJOR AREAS OF ACCOUNTABILITY:

      • Design and implement data pipelines to ingest heterogeneous data into the data lake / data warehouse in different scenarios (batch, streaming, ...), managing large and complex data sets (a minimal illustrative sketch of such a batch pipeline follows this listing).
      • Ensure data accuracy and correctness in the implemented pipelines.
      • Create custom software components and analytics applications.
      • In coordination and collaboration with the data insights team, which is in charge of the front end and of delivering data products, develop data preparation for different purposes and use cases (reporting, machine learning, data sharing, ...) and identify opportunities for data acquisition.
      • In coordination and collaboration with the data governance team, explore ways to enhance data quality and reliability.
      • Integrate up-and-coming data management and software engineering technologies into existing data structures.
      • Use agile software development processes to iteratively improve our back-end systems.
      • Ensure that all systems meet the business/company requirements as well as industry practices.

    INTERNAL AND EXTERNAL RELATIONSHIPS:

      • Internal: All Business Services, Product Lines, Architecture, Security, Data Insights & Governance, IT Ops, QA.
      • External: IT Partners, external consultants.

    PROFILE:

      • Previous experience: Proven experience in data engineering or software engineering around data solutions in modern data architectures.
      • Education level / certificates: Bachelor's or Engineering degree or higher.
      • Languages: Written and verbal proficiency in English (mandatory). Other languages (French, Spanish, etc.) are appreciated.

    TECHNICAL SKILLS:

      • Strong Python and SQL knowledge.
      • Strong knowledge of data integration / ETL and orchestration tools.
      • Experience with relational SQL and NoSQL databases.
      • Experience with cloud data platforms (GCP, AWS, Snowflake).
      • Experience with continuous integration development techniques.

    WISHED:

      • Good knowledge of other programming languages, such as Java / Kotlin.
      • Knowledge of frameworks for developing streaming pipelines: Kafka, Apache Beam, Spark, ...
      • Knowledge of Terraform / Helm / K8s / Docker.
      • Knowledge of BI and visualization tools.

    PERSONAL CHARACTERISTICS:

      • Team player with a positive attitude and the ability to collaborate effectively.
      • Strong-willed and self-motivated.
      • Analytical mindset, process-focused and structured.
      • Proactive and self-starting.
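    The listing's first responsibility is batch and streaming ingestion into a data lake / data warehouse using Python and SQL. As a rough illustration of the batch case only, here is a minimal extract-transform-load sketch; the source file orders.csv, its column names, and the SQLite file standing in for the warehouse are hypothetical choices for this example, not part of the role description.

    import csv
    import sqlite3
    from pathlib import Path

    # Hypothetical source and target for illustration only.
    SOURCE_CSV = Path("orders.csv")
    WAREHOUSE_DB = Path("warehouse.db")

    def extract(path: Path) -> list[dict]:
        """Read raw rows from a CSV source file."""
        with path.open(newline="") as fh:
            return list(csv.DictReader(fh))

    def transform(rows: list[dict]) -> list[tuple]:
        """Basic data-accuracy step: skip incomplete records, cast amounts to float."""
        cleaned = []
        for row in rows:
            if not row.get("order_id") or not row.get("amount"):
                continue  # drop rows missing required fields
            cleaned.append((row["order_id"], float(row["amount"]), row.get("country", "")))
        return cleaned

    def load(rows: list[tuple], db: Path) -> None:
        """Write the cleaned batch into a warehouse table (SQLite as a stand-in)."""
        with sqlite3.connect(db) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS orders ("
                "order_id TEXT PRIMARY KEY, amount REAL, country TEXT)"
            )
            conn.executemany(
                "INSERT OR REPLACE INTO orders (order_id, amount, country) VALUES (?, ?, ?)",
                rows,
            )

    if __name__ == "__main__":
        load(transform(extract(SOURCE_CSV)), WAREHOUSE_DB)

    In practice this structure would be wrapped in an orchestration tool and pointed at cloud storage rather than local files, but the extract / transform / load separation is the shape of the pipeline work the role describes.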
