EPAM

Senior Data Integration Engineer Czech Republic or Remote

Senior Data Integration Engineer Description

EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.


We are currently looking for a Senior Data Integration Engineer to join our Prague office.

Project technologies and tools

  • Cloud provider stacks (AWS/Azure/GCP): storage, compute, networking, identity and security
  • Data warehousing and DB solutions (Redshift, Snowflake, BigQuery, Azure Synapse, etc.)
  • Experience with some industry-standard Data Integration tools (Azure Data Factory, AWS Glue, GCP Dataflow, Talend, Informatica, Pentaho, Apache NiFi, KNIME, SSIS, etc.)
  • Experience in coding with one of the data-oriented programming languages: SQL, Python, SparkSQL, PySpark, R, Bash, Scala
  • Experience working with at least one relational database (RDBMS: MS SQL Server, Oracle, MySQL, PostgreSQL)
  • Dataflow orchestration tools, data replication tools and data preparation tools
  • Version Control Systems (Git, SVN)
  • Testing: component / integration / reconciliation testing


Responsibilities
  • Design and implement Data Integration solutions, model databases, and contribute to building data platforms using classic Data technologies and tools (Databases, ETL/ELT technology & tools, MDM tools, etc.) as well as implementing modern Cloud or Hybrid data solutions
  • Work with product and engineering teams to understand data product requirements, evaluate new features and architecture to help and drive decisions
  • Build collaborative partnerships with architects and key individuals within other functional groups
  • Perform detailed analysis of business problems and technical environments and use this in designing high-quality technical solutions
  • Actively participate in code review and testing of solutions to ensure they meet specification and quality requirements
  • Build and foster a high-performance engineering culture, supervise junior/middle team members, and provide them with technical leadership
  • Write project documentation
  • Be self-managing: implement functionality without supervision, test your work thoroughly using test cases, and/or supervise less experienced colleagues


Requirements
  • At least 3 years of relevant development experience and practice with data management, data storage, data modeling, data analytics, data migration, and database design
  • Practical hands-on experience in developing Data Solutions in at least one major public Cloud environment (AWS, Azure, GCP)
  • Practical knowledge of leading cloud data warehousing solutions (e.g. Redshift, Azure Synapse Analytics, Google BigQuery, Snowflake, etc.)
  • Production coding experience in one of the data-oriented programming languages
  • Solid background in developing Data Analytics & Visualization, Data Integration or DBA & Cloud Migration Solutions
  • Experienced and highly self-motivated professional with outstanding analytical and problem-solving skills
  • Ability to play the role of a Key Developer and a Designer, or a Team Lead of 2-5 engineers, ensuring that delivered solutions meet business requirements and expectations
  • Able to read and understand project and requirement documentation; able to create design and technical documentation, including high-quality documentation of your code
  • Experienced in working with modern Agile development methodologies and tools
  • Able to work closely with customers and other stakeholders
  • Advanced knowledge of Data Integration tools (Azure Data Factory, AWS Glue, GCP Dataflow, Talend, Informatica, Pentaho, Apache NiFi, KNIME, SSIS, etc.)
  • Advanced knowledge of Relational Databases (SQL optimization, Relations, Stored Procedures, Transactions, Isolation Levels, Security)
  • Practical hands-on experience of development of Data Solutions in Cloud environments (AWS, Azure, GCP) - designing, implementing, deploying, and monitoring scalable and fault-tolerant data solutions
  • Solid understanding of core cloud technologies and approaches. Awareness of niche and case-specific cloud services
  • Ability to troubleshoot outages of average complexity and to identify and trace performance issues
  • Pattern-driven solution design, choosing the best fit for particular business requirements and technical constraints
  • Advanced knowledge of Data Security (Row-level data security, audit, etc.)
  • Production experience of one of the data-oriented programming languages: SQL, Python, SparkSQL, PySpark, R, Bash
  • Production project experience in Data Management, Data Storage, Data Analytics, Data Visualization, Data Integration, MDM (for MDM profiles), Disaster Recovery, Availability, Operations, Security, etc.
  • Experience with data modeling (OLAP, OLTP, ETL, and DWH / Data Lake / Delta Lake / Data Mesh methodologies; Inmon vs. Kimball; staging areas; SCD and other dimension types)
  • Good understanding of online and streaming integrations and micro-batching; understanding of CDC methods and delta extracts
  • General understanding of Housekeeping processes (archiving, purging, retention policies, hot/cold data, etc.)
  • Good understanding of CI/CD principles and best practices; understanding of canary, blue-green, and red-black deployment models
  • Data-oriented focus and compliance awareness (e.g., PII, GDPR, HIPAA)
  • Experience in direct customer communications
  • Experienced in different business domains
  • English proficiency

We offer

  • Opportunity to work in a fast-paced, agile, software engineering culture
  • English-speaking environment
  • Unlimited access to LinkedIn learning solutions
  • Comfortable modern offices in Prague 4 or remote work from any location in the Czech Republic
  • Benefits program (5 weeks of vacation, paid sick days, paid days off for special occasions, meal vouchers, Flexi Pass, an annual Prague public transport pass, Multisport cards, pension fund contributions, health insurance for a family member)
  • English language courses
  • Czech language courses upon request
  • Relocation assistance
  • Rotation program - possibility to relocate for short and long-term projects within 30 countries
  • Referral bonuses for recommended candidates
  • Mobile phone tariff program for managerial-level candidates
  • EPAM Employee Stock Purchase Plan (ESPP) (subject to certain eligibility requirements)
  • Some of these benefits may be available only after you have passed your probationary period
