
Data Software Lead Hungary or Remote

Data Software Lead Description

EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.


As a Data Software Lead, you will design, develop, and maintain data systems and solutions, working collaboratively with cross-functional teams to ensure data integration, quality, and security. We are looking for a hands-on lead data software engineer focused on the Azure data stack (Databricks) to join our growing Data Practice.

What You’ll Do

  • Lead the implementation of modern data platforms, from requirements identification through production releases
  • Design and develop highly scalable, maintainable, data-centric solutions using the Azure cloud and Databricks
  • Design, create, optimize, and maintain EL/ETL/ELT data pipelines and orchestration processes
  • Design, create, optimize, and maintain data models
  • Establish data quality processes, monitoring, and data cleansing
  • Contribute to data governance activities
  • Collaborate with cross-functional teams to integrate data into various systems and applications
  • Drive and facilitate direct communication with project sponsors and business stakeholders

What You Have

  • Azure data stack experience: Databricks, Data Lakehouse, Delta Lake, ADF, ADLS, Apache Spark, SQL/NoSQL, Key Vault
  • Strong experience with Python, PySpark
  • Strong understanding of data processing design patterns: EL, ETL, ELT, incremental/delta loads, and (optionally) data streaming
  • Strong knowledge of RDBMS, data marts and Lakehouse design concepts
  • Ability to handle infrastructure, security, networking, and data exchange and distribution topics
  • Ability to work collaboratively with cross-functional teams
  • Excellent problem-solving skills and attention to detail
  • Fluent English (B2 minimum)

Nice to have

  • Multi-cloud experience, including Azure, AWS, and GCP
  • Data streaming skills, including Kafka
  • Professional Databricks certification
  • Technical team leadership competence

We Offer

  • Permanent job with remote work opportunity
  • Extensive training and development opportunities, including language courses and soft-skill training
  • Vast opportunities for self-development, with unlimited access to LinkedIn Learning and GAL trainings
  • Multilingual work environment
  • Competitive salary and benefit packages (private health care, sport card, fringe benefits)
  • International projects, working in hybrid teams with highly skilled peers
  • Sport and social teams support, advanced CSR programs
