Data Engineer

Job Location: Poland
Job Type: Full Time

Client:

Our client is a premier data solutions firm specializing in architecture, migration, and modernization projects. They are known for delivering long-term international projects as an external vendor rather than through traditional outsourcing. Their cloud portfolio covers Azure and GCP; AWS is deliberately out of scope. As they continue to win new projects, they are looking for a skilled Mid-Level Data Engineer to join their team. The role offers the opportunity to work on innovative, high-impact projects around the world while contributing to the growth of a leading industry player.

Role:

As a Mid-Level Data Engineer, you’ll be an integral part of the team, building full-scale projects from scratch. You’ll contribute to architecture design, migration, and modernization work, collaborating closely with clients and teammates to deliver high-quality data solutions.

Key Responsibilities:

  • Design and implement data pipelines and ETL processes
  • Develop and maintain data infrastructure on cloud platforms
  • Optimize data storage and retrieval systems
  • Collaborate with cross-functional teams to understand data requirements
  • Participate in client-facing activities, including sprint calls and briefings
  • Contribute to the continuous improvement of our data engineering practices

Requirements:

  • Proven experience as a Data Engineer, with a focus on big data
  • Strong knowledge of SQL and experience with relational and non-relational databases
  • Proficiency in a programming language such as Python, Java, or Scala
  • Experience with big data tools such as Hadoop, Spark, and Kafka
  • Experience with data pipeline and workflow management tools such as Airflow or Luigi
  • Experience with cloud services: Azure or GCP (AWS is not used)
  • Knowledge of stream-processing systems such as Storm or Spark Streaming

Nice to have:

  • Experience with Apache Hive for data warehousing
  • Proficiency in CI/CD pipelines and tools (e.g. Azure DevOps, Jenkins, GitLab CI)
  • Knowledge of Snowflake for cloud data warehousing
  • Familiarity with Apache Kafka for stream processing
  • Experience with data visualization tools

Offer:

  • Salary: 18,000 – 25,000 PLN net (B2B)
  • 20 days of paid holidays
  • Medical/healthcare insurance plan
  • Remote work opportunity (must be based in Poland)
  • Flexible working hours (core hours 10:00–15:00)
  • Long-term cooperation and growth opportunities
  • Challenging projects with cutting-edge technologies

Apply for this position

Allowed file types: .pdf, .doc, .docx