
Big Data Developer (Poland or Remote)


Big Data Developer Description

Job #: 55278
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.

The remote option applies only to candidates who will be working from a location in Poland.

DESCRIPTION


We are looking for (aspiring) Big Data Developers with at least 1 year of commercial experience to join our growing Data Practice and make our team even stronger. The projects and technologies we work with vary widely and cover the technologies currently available on the market and in open-source communities.

We provide our services to clients in different domains: finance, healthcare, insurance and many others, so you will have a chance to develop in any direction you want.

You can join one of our offices, located in Warsaw, Krakow, Wroclaw and Gdansk. We support an Employment Contract or B2B, depending on your preference.
#LI-Remote

Responsibilities

  • Data ingestion from different data sources that expose data through different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, time-series data, SAP and many others built on various proprietary systems. You will research and implement data ingestion using Big Data technologies
  • Data processing/transformation using various technologies such as Spark and cloud services. You will understand the business logic of your part of the pipeline and implement it in any language supported by the underlying data platform (see the sketch after this list)
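As a rough illustration of the kind of ingestion and transformation work described above, here is a minimal Spark sketch in Scala. It reads one table from an RDBMS over JDBC and a set of flat CSV files, joins them and writes an aggregated result to Parquet. All connection details, table names, columns (orders, customers, customer_id, amount) and paths are hypothetical placeholders, not specifics of any EPAM project.

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.functions._

  object IngestionSketch {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("ingestion-sketch")
        .getOrCreate()

      // Ingest from an RDBMS over JDBC (connection details are placeholders)
      val orders = spark.read
        .format("jdbc")
        .option("url", "jdbc:postgresql://db-host:5432/sales")
        .option("dbtable", "public.orders")
        .option("user", "reader")
        .option("password", sys.env.getOrElse("DB_PASSWORD", ""))
        .load()

      // Ingest flat files (CSV) landed on distributed storage
      val customers = spark.read
        .option("header", "true")
        .csv("hdfs:///data/raw/customers/*.csv")

      // Simple transformation: join and aggregate revenue per customer
      val revenuePerCustomer = orders
        .join(customers, Seq("customer_id"))
        .groupBy("customer_id", "customer_name")
        .agg(sum("amount").as("total_revenue"))

      // Persist the result as Parquet for downstream consumers
      revenuePerCustomer.write
        .mode("overwrite")
        .parquet("hdfs:///data/curated/revenue_per_customer")

      spark.stop()
    }
  }

Equivalent pipelines can be written in Python or Java, since Spark exposes the same DataFrame API in every language it supports.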

Requirements

  • Advanced knowledge of one of the following languages: Java, Scala, Python, C#
  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow
  • Google Cloud Platform, Amazon Web Services, Cloudera (or any other) Hadoop distribution, Databricks

Nice to have

  • Spark Streaming
  • Kafka Streaming / Kafka Connect
  • Snowflake
  • ELK Stack
  • Docker / Kubernetes
  • Cassandra / MongoDB
  • CI/CD: Jenkins, GitLab, Jira, Confluence and other related tools

We offer

  • Team & working conditions:
    • Friendly team and enjoyable working environment
    • Engineering community of industry professionals
    • Flexible schedule and opportunity to work remotely
    • Relocation within our offices
    • Corporate and social events
    • Benefits package (health insurance, multisport, shopping vouchers)
  • Stable income:
    • Employment Contract or B2B
    • Regular assessments and salary reviews
    • Participation in the Employee Stock Purchase Plan
    • Referral bonuses
  • Career development:
    • Innovative solutions delivery and engineering excellence
    • Outstanding career roadmap
    • Leadership development, career advising, soft skills and well-being programs
    • Certification (GCP, Azure, AWS)
    • Unlimited access to LinkedIn Learning, Get Abstract, O’Reilly, Cloud Guru
    • English and Polish language classes for foreigners
Please note that only selected candidates will be contacted.
