NIX Tech, a global supplier of software engineering and IT outsourcing services, is looking for a Medior/Senior Data Engineer for its office in Budapest (Váci Greens, 13th district). You'll join a team of professionals ready to craft tailor-made IT solutions and solve complex problems for multinational clients across various industries.
Main goals and responsibilities:
– Collaborate with product owners and team leads to identify, design, and implement new features to support the growing data needs;
– Build and maintain optimal architecture to extract, transform, and load data from a wide variety of data sources, including external APIs, data streams, and data lakes;
– Implement data privacy and data security requirements to ensure solutions stay compliant with security standards and frameworks;
– Monitor and anticipate trends in data engineering, and propose changes in alignment with organizational goals and needs;
– Share knowledge with other teams on various data engineering or project-related topics;
– Collaborate with the team to decide on which tools and strategies to use within specific data integration scenarios.
Required skills:
– 4+ years of proven experience developing software using an object-oriented or a functional language;
– Strong programming skills in Python or Scala;
– Proficient with stream processing using current industry standards (AWS Kinesis, Kafka Streams, Spark/PySpark, Flink, etc.);
– Solid understanding of distributed computing approaches, patterns, and technologies (Spark/PySpark, Hadoop; Storm, Hive, and Beam as a plus);
– Experience working with Cloud Platforms (GCP, AWS, Azure) and their data-oriented components;
– Proficiency in SQL and query tuning;
– Understanding of data warehousing principles and modeling concepts (knowledge of data model types and terminology including OLTP/OLAP, (De)normalization, dimensional, star, snowflake modeling, cubes, and graph/NoSQL);
– Proven experience in modern data warehouse building using Snowflake, AWS Redshift or BigQuery;
– Expertise in the use of relational databases (PostgreSQL, MSSQL, or MySQL) as well as non-relational (MongoDB);
– Experience with orchestration of data flows (Apache Airflow, Talend, Glue, Azure Data Factory);
– A team player with excellent collaboration skills;
– Intermediate or higher level of English.
Would be a plus:
– Expertise in data storage design principles. Understanding of pros and cons of SQL/NoSQL solutions, their types, and configurations (standalone/cluster, column/row-oriented, key-value/document stores);
– Deep knowledge of Spark internals (tuning, query optimization);
– Experience with data integration and business intelligence architecture;
– Experience with data lakes and data lakehouses (Azure Data Lake, Apache Hudi, Apache Iceberg, Delta Lake);
– Experience with containerized (Docker, ECS, Kubernetes) or serverless (Lambda) deployment;
– Good knowledge of popular data standards and formats (JSON, XML, Protobuf, Parquet, Avro, ORC, etc.);
– Experience with Informatica, Databricks, Talend, Fivetran, or similar;
– Experience in data science and machine learning, including building ML models.
What we offer:
– Stable long-term work environment
– Paid English courses and conversation clubs
– Opportunities for professional and personal growth
– Mentoring program, internal and external professional training programs
– Comfortable office in Budapest (Váci Greens)
– All the tools and devices needed to perform project tasks comfortably: computers, meeting rooms, spacious modern kitchens with professional coffee machines, and comfortable recreation areas with game consoles, board games, and a selection of literature for every taste
– Support and care from our friendly team