Senior Data Engineer

Polska

19000 - 24000 PLN

Level
Senior
Contract
Contract of employment
Company size
100 - 249
Technology stack
  • Python: Senior
  • ETL: Senior
  • SQL: Senior
  • NoSQL: Senior
  • English: Senior
  • Data warehouse: Senior
  • Kubernetes: Regular
  • AWS: Regular
  • Azure: Regular
  • Airflow: Regular
  • Tableau: Regular
  • Apache Superset: Regular
  • Spark: Regular
  • PySpark: Regular
  • GCP: Regular
Cities
Remote
Description
Join the newly established Altimetrik Poland division, with amazing opportunities to grow in our star team. You will join the data team that develops integrated analytical solutions for one of our clients, a provider of comprehensive financial services for institutions in the healthcare sector.

You will:
  • Design efficient and scalable databases and data warehouses.
  • Implement robust cloud-based data pipelines.
  • Ensure data quality and consistency, and monitor ETL processes.
  • Work with key business stakeholders and technology experts to gather requirements and develop solutions.
  • Mentor and coach teammates. 
Mandatory Requirements:
  • Bachelor's or Master's degree in Computer Science, Statistics, or another data-related field.
  • At least five years of industry experience in:
    • implementing data pipelines using ETL tools (e.g. pygrametl, Talend, Luigi, or others),
    • designing relational databases using a popular DBMS (e.g. Oracle, MS SQL, PostgreSQL, or others),
    • designing and implementing data warehousing solutions (e.g. Snowflake, Redshift, BigQuery, or others),
    • extracting data from SQL databases, messaging systems (e.g. Kafka), and NoSQL databases (e.g. MongoDB, Cassandra),
    • unit testing and end-to-end testing of data pipelines, including validating data against the source.
  • Strong SQL Knowledge and Data Analysis skills.
  • Strong Python programming skills.
  • Experience participating in all phases of the software development lifecycle (plan, design, develop, test, release, maintain, and support).
  • Experience with cloud environments (e.g. AWS, Azure, GCP, OpenShift).
  • A DevOps mindset.
  • Excellent communication skills.
  • Experience mentoring and leading a team of data engineers.
Experience and skills which are strong assets:
  • Analytics engines such as Spark and PySpark.
  • Designing Business Intelligence visualization dashboards with Tableau or Apache Superset.
  • Workflow tools: Apache Airflow.
  • Experience in Kubernetes.

We offer:
  • Work in the Polish time zone.
  • Equipment provided by Altimetrik.
  • Salary: 19 000 - 24 000 PLN (net on B2B / gross on contract of employment).