
Lead Data DevOps Engineer Hungary or Remote

Lead Data DevOps Engineer Description

EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.

No less important is the safety, well-being, and experience of our applicants. Therefore, until further notice, all EPAM employment interviews will be conducted remotely. Our recruitment professionals and hiring managers are standing by to ensure a robust and engaging virtual candidate experience. We look forward to speaking with you! For further information, please visit our website.


We are currently looking for a Lead Data DevOps Engineer for our Data Practice in Hungary to make the team even stronger.

Learn more about our Data Practice here.

What You’ll Do

  • Design, implement, and integrate modern data analytics solutions in the cloud
  • Build and maintain high-performance, scalable, cloud-based distributed systems
  • Build proactive monitoring systems (Prometheus, Grafana, etc.)
  • Maintain and optimize cloud-based infrastructure environments
  • Create and maintain CI/CD pipelines (Jenkins, GitLab CI, TeamCity, JFrog, etc.)
  • Communicate with engineers and customers to clarify requirements and ensure the continuous, smooth delivery of working software to production
  • Mentor less experienced colleagues
  • Foster efficient collaboration within the team

What You Have

  • Bachelor’s / Master’s degree in Information Technology, or in Economics with a strong IT aptitude
  • Experience with any cloud platform (AWS, Azure, GCP, etc.)
  • Practical knowledge of containers (Docker)
  • Up-to-date knowledge of CI/CD processes
  • Understanding of Big Data concepts
  • Knowledge of networking principles (basic protocols, routing, NAT, VPN, etc.)
  • Scripting experience (Bash, Python, Java, JavaScript, etc.)
  • Experience writing Infrastructure as Code (Ansible, Terraform, AWS CDK, etc.)
  • Knowledge of SQL and/or NoSQL databases (HBase, Cassandra, MongoDB, Redis, etc.)
  • Self-directed, with strong problem-solving and troubleshooting skills
  • Fluent English

Nice to have

  • Cloudera (HDP), Hadoop
  • Kafka and/or Spark
  • Elasticsearch (ELK)

We offer

  • Dynamic, entrepreneurial, high-speed, high-growth corporate environment
  • Diverse multicultural, multi-functional, and multilingual work environment
  • Opportunities for personal and career growth in a progressive industry
  • Global scope, international projects
  • Widespread training and development opportunities
  • Unlimited access to LinkedIn Learning solutions
  • Competitive salary and various benefits
  • Sport and social teams support, recreation area, advanced CSR programs
