Who we are
ITSG is an
innovative software house that has operated on the Polish market for over 15
years. We stand out for the complexity and innovativeness of the projects we
have completed and the products we have implemented for companies in the
financial, medical, pharmaceutical, telecommunications, and media sectors.
We are
looking for a Senior Data Engineer for the Warsaw Data Science Lab to
support us in a project for a global leader in the financial transactions
industry. The engineer will be responsible for designing, developing, and
implementing data integration and analytics solutions on SQL Server and/or
on-premise Hadoop. This position requires close collaboration with business
partners to understand business goals, gather requirements, and design,
develop, and implement data analytics solutions.
Duties
and responsibilities:
- Build resilient data pipelines for reporting and data
science/ML use cases
- Develop workflows and ETL/ELT code using Hive scripts,
Spark, Airflow, NiFi, Sqoop, Oozie, and other utilities on Hadoop.
- Participate in technology project delivery activities
such as gathering business requirements, defining the conceptual approach,
design, development, test case preparation, unit/integration test execution,
and support documentation.
- Develop source-to-target mapping documents and
ETL/ELT code to load data from source systems into the data warehouse or
target applications
- Partner with IT groups such as CIT, Engineering,
Product, Security, and Infrastructure on project delivery activities and
security remediations
- Provide insights from data and present them to IT and
non-technical users to improve operations and productivity
- Unit test data loads and write scripts for data
validation
- Support QA, UAT, and performance testing phases of the
development cycle and implement DevOps principles from development to
deployment to production
- Provide production support for jobs running on the
cluster, and monitor and manage SLAs for downstream applications
- Build data quality frameworks
- Build analytical applications on the Hadoop ecosystem
using packaged or open-source technologies
Requirements:
- Minimum 5 years of experience in developing and delivering analytics
projects in the field of ETL/ELT and reporting
- A completed degree (Computer Science, Software Engineering, or a related field)
- Experience in data management in SQL Server
- Experience in writing complex SQL queries
- Experience in integrating with internally developed APIs
- Experience in Hadoop development and data integration
(Hive, Spark, Presto, Airflow, Sqoop, NiFi, etc.) preferred
- Experience in building data pipelines that support machine learning and
advanced analytics models
- Experience in building data engineering pipeline frameworks
- English at a communicative level
Nice
to have:
- Experience in system design and infrastructure
construction and maintenance
- Experience in scripting (Python, Shell/Perl), scheduling,
version control, and CI/CD technologies
- Experience with real-time data processing
using Kafka/Spark Streaming
- Strong skills in identifying, analyzing, and solving
problems
- Ability to learn new information technologies and
apply them quickly
- Exceptional communication and customer engagement
skills, able to interact effectively with diverse groups of global
stakeholders, both technical and business users
- Experience working in a globally dispersed team
We
offer:
- Salary: 130-170 PLN + VAT / hour
- 90-95% remote work; we need you in the Warsaw office 1-2 times a month
- Work with an experienced team of experts using the
latest tools and technologies
- Individual training and career growth opportunities
- Benefits: English lessons, Multisport cards, private
medical insurance, integration events