EPAM

Data Architect Slovakia


Data Architect Description

Job #: 90118
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.


We are looking for Data Architects of any seniority level for data-driven projects. Together we design and drive solutions that generate value from data, taking advantage of scalable platforms, cutting-edge technologies, and machine learning algorithms.

The set of technologies we use is very wide, so any technology background is acceptable for a Data Architect. We provide a solid architecture framework, educational programs, and a strong SA community to support your deep dive into the data domain.

Some architectural areas we are focusing on:

• Data Processing Architecture
• Streaming Architecture
• Data Platform Operations
• Metadata Management Architecture
• Cloud Data Services Architecture
• ML and MLOps Architecture
• Data Warehouse Architecture
• Data Management
• Business Intelligence Solutions
• Data Integration Architecture
• Data Security Architecture

Some examples from tool set/technology stack we are using:

• Clouds: AWS, Azure, GCP
• Distributed data processing & ETL frameworks: Apache Spark (and related cloud-specific technologies such as AWS EMR, GCP Dataproc, Azure HDInsight), GCP Dataflow, AWS Glue, Databricks
• Distributed Environments: Kubernetes, Docker, AWS ECS, Google Kubernetes Engine, Azure Kubernetes Services
• Analytical Data Warehousing: Snowflake, AWS Redshift, Azure Synapse, GCP BigQuery
• Relational Databases: PostgreSQL, Azure SQL DB, GCP Cloud SQL
• Lightweight/Serverless Compute: AWS Lambda Functions, GCP Cloud Functions, Azure Functions
• No-SQL/Specialized Databases: Cassandra, MongoDB, Azure Cosmos DB, GCP BigTable, Redis (including cloud analogs)
• Data Catalogs & Metadata Management: Collibra, Alation, Informatica, Azure Purview, Google Dataplex
• Integration & flow management: AWS Step Functions, Airflow / GCP Cloud Composer, Azure Data Factory, Kafka Connect
• Data Streaming: Kafka, AWS Kinesis, GCP Pub/Sub, Azure Event Hub
• Object storages: S3, ADLS, GCS, HDFS, Minio
• Search platforms: Solr, Elasticsearch
• ML: MLflow, Kubeflow, AWS SageMaker, Azure ML, GCP AI Platform
• Data Visualization: Power BI, Tableau, QlikView, Spotfire, Jupyter
• Platform Operations: IaC (Terraform, AWS CloudFormation, Azure DevOps, etc.), IAM (Azure AD, AWS Cognito, etc.), monitoring (Prometheus, Splunk, Azure Monitor, etc.), CI/CD (Jenkins, GCP Cloud Build, etc.), cloud cost management, security & networking tools
• Programming Languages: Java, Scala, Python


Responsibilities

  • Design and evolve large-scale data-driven solutions
  • Drive direct communications with business stakeholders
  • Elaborate on all technical aspects for the development team and provide justification for every architectural decision
  • Lead implementation of the solutions from establishing project requirements and goals to solution "go-live"
  • Create and present solution architecture documentation with deep technical details to customer and implementation teams
  • Participate in the full cycle of pre-sale activities
  • Lead solution architecture evaluation and assessment activities
  • Continuously research emerging technologies, participate in company level knowledge sharing initiatives, PoCs and training programs


Requirements

  • Experience in requirements engineering, solution architecture, systems development, deployment, and maintenance
  • Knowledge of architecture, design patterns and technological landscape in at least 3 technology domains (Data Platforms, IoT, ML, Backend, Mobile, etc.)
  • Profound knowledge of the technology internals in at least one technology domain
  • Solid understanding of the core concepts in data and analytics platform architectures, data warehousing, business intelligence, data management, integration, security, and operations
  • Broad experience in the design, implementation, deployment, troubleshooting, and replatforming of distributed systems, both on-premises and in the cloud
  • Structured and systematic knowledge of the entire architecture design process (requirements, quality attributes, technology selection, estimation, proposal verification, documentation, etc.)
  • Experience in all phases of the software development life cycle using different development methodologies and best practices
  • Highly organized and detail-oriented
  • Good communication skills
  • Fluent English

We offer

  • Possibility to work on the full product lifecycle – from concept to delivery into production
  • Opportunity to work on leading-edge platforms in a fast-paced, agile software engineering culture
  • Using English on a daily basis
  • Unlimited access to LinkedIn learning solutions
  • Benefit program (5 weeks of vacation, 5 paid sick days, meal vouchers, reimbursement of glasses, contribution to pension fund)
  • Rotation program - possibility to relocate for short and long-term projects within 30 countries
