
Location: Novi Sad

What You’ll Do

We are seeking a Data Engineer to help develop a data lake for the order management system and infrastructure product line. The candidate should be a highly motivated individual who will collaborate with development, sales, customer support, and technical operations teams in a growing world-class organization.

Job Overview

You will be responsible for building and supporting our next generation cloud data lake infrastructure. As a Data Engineer, you’ll work on developing the infrastructure to store and process diverse structured and semi-structured data and building Business Intelligence and Analytics applications to access and visualize the data. You’ll also be responsible for integrating these applications with the architecture used across the organization. Adjacent responsibilities include establishing best practices with respect to data integration, data visualization, schema design, performance and reliability of data processing systems, supporting data quality, and enabling convenient access to data for both internal and external consumers.

Role and Responsibilities:

  • Design and enable a modern cloud-based data lake supporting structured and semi-structured data in native form as needed
  • Design and develop complex, highly scalable, and extensible ETL and batch processing data pipelines from internal and external sources
  • Design and develop a cloud-based analytics and presentation layer
  • Work on cross-functional teams to design, develop, and deploy data-driven applications and products
  • Lead in prototyping emerging technologies involving data ingestion and transformation, distributed file systems, databases, and frameworks
  • Define, develop, and automate data quality checks
  • Review developed solutions to solve specific business problems. Optimize queries, data models, and storage formats to support common usage patterns
  • Develop code at a high level of abstraction
  • Influence strategy related to processes and workflows across the division

Qualifications:

  • In-depth knowledge of SQL and NoSQL databases, and of columnar storage formats such as ORC
  • Extensive hands-on Python programming experience. Able to employ design patterns and generalize code to address common use cases
  • Applied knowledge of cloud data warehousing (AWS, GCP, Snowflake, etc.); certification in either AWS or GCP
  • Experience architecting, provisioning, and operating integrated, serverless, and elastic cloud-based systems at scale
  • Knowledge of Infrastructure as Code best practices, including experience provisioning immutable, cloud-based infrastructure using Terraform or CloudFormation
  • Experience with advanced cloud platform configuration and services to optimize cost and security of data both at rest and in transit
  • Experience using Apache Spark to execute distributed computation workflows on large data sets; additional knowledge of and experience in HDFS is beneficial
  • Solid data understanding and business acumen in data-rich industries
  • Strong understanding of database internals, such as indexes, binary logging, and transactions
  • Experience using containerization solutions such as Docker and Kubernetes
  • Experience with embedded business intelligence tools such as Looker, Qlik, or Sisense
  • Experience with software engineering tools and workflows (e.g., JIRA, Jenkins, CI/CD, Git)
  • Ability to make sound decisions in a fast-paced, high-pressure environment

We offer:

  • Permanent contract and a competitive salary
  • Innovative projects in fintech and crypto
  • Cutting-edge technologies
  • Flexible working hours and remote working options
  • Career and growth plan
  • Employee appreciation program
  • Personal development and training
  • Private healthcare plan
  • Friendly and pleasant working environment
  • Wellness and mindfulness support
  • Referral bonuses
  • Paternity leave – 10 working days for new dads