DATA ENGINEER (GCP/HADOOP)

Poland

20000 - 27000 PLN

Level
Regular
Contract
B2B
Company size
100 - 249
Time remaining
Closed
Tech stack
The employer has not listed any technology requirements
Cities
Remote, Kraków, Tricity, Warsaw
Description

CAPCO POLAND

Capco Poland is a global technology and management consultancy specializing in driving digital transformation across the financial services industry. We are passionate about helping our clients succeed in an ever-changing industry.

We are also:

  • Experts in banking and payments, capital markets, wealth and asset management
  • Focused on maintaining our nimble, agile, and entrepreneurial culture
  • Committed to growing our business and hiring the best talent to help us get there
 
THINGS YOU WILL DO

We are looking for Data Engineers to collect, store, process, and analyze large data sets as part of the client's Wholesale Chief Data & Analytics Office. Our client's Big Data Lake is the largest aggregation of data ever assembled within financial services, with over 300 sources and a rapidly growing book of work.

  • Deliver an ecosystem of curated, enriched, and protected sets of data – created from global, raw, structured, and unstructured sources
  • Collect, store, analyze, and leverage data
  • Integrate data with the architecture used across the company
  • Build core services that power Machine Learning and analytics systems
  • Data Engineering and Management
  • Own the data development process: design, build, and test complex, large-scale data products
  • Promote development standards through code reviews, mentoring, testing, and scrum story writing
  • Cooperate with customers/stakeholders

Challenges:
1. Refactoring the current technology stack and architecture from on-premise Hadoop to Google Cloud Platform
2. Integrating with an established, complex multitenant Hadoop-based platform

SKILLS & EXPERIENCES YOU NEED TO GET THE JOB DONE

  • Experience working with data pipeline building technologies: PySpark, Scala, Hive, Java
  • Good knowledge of Data warehouse concepts
  • Proficient in SQL and relational database design
  • Elasticsearch experience (Elasticsearch, Logstash, Kibana, etc.)
  • Google Cloud Platform knowledge
  • Knowledge and experience of the Hadoop ecosystem and data management frameworks
  • Knowledge of CI/CD, Agile, DevOps, and the Software Development Life Cycle (SDLC)
  • Excellent communication, interpersonal, and decision-making skills
  • Good command of English

WHY IT'S WORTH JOINING US

  • Employment contract or B2B contract, whichever you prefer
  • Possibility to work remotely
  • Speaking English on a daily basis, mainly with foreign stakeholders and peers
  • Multiple employee benefits packages (multisport card, private medical care, lunch card)
  • Access to a platform with 3,000+ business courses (Udemy)
  • Access to required IT equipment
  • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
  • Being part of the core squad focused on the growth of the Polish business unit
  • A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
  • A work culture focused on innovation and creating lasting value for our clients and employees

ONLINE RECRUITMENT PROCESS STEPS

  • Screening call with the Recruiter
  • Home assignment if required
  • Technical/Competencies interview with Capco Hiring Manager
  • Client’s interview
  • Feedback/Offer