
Data Engineer - Deadline 19/06/25
- On-site
- Brussels, Belgium
Job description
DESCRIPTION OF THE TASKS:
• Define scalable data pipelines and ETL processes, typically in a cloud environment
• Integrate data from various sources to support analytics and business intelligence
• Ensure data quality, security, and compliance with relevant regulations and specifications.
• Work closely with data scientists and analysts to provide clean and structured data.
• Optimize data storage and retrieval for performance and cost-effectiveness in cloud platforms.
• Follow up on and coordinate all efforts on the design, implementation, and operations of the data and metadata technical processing aspects of a cloud solution
• Ensure that the necessary requirements are complied with and that all stakeholders have the means to consume the data according to security, availability, and performance criteria, in accordance with their assigned user roles
• Provide support and training on data manipulation and processing to the different stakeholders of the project
• Ensure technical aspects of the design, implementation, and operations are aligned with data principles, requirements, and expectations, particularly in terms of data availability and processing performance.
• Design, develop, and implement data cleansing, data preparation, or other types of data processing solutions using the tools available in the context of the project
• Contribute to the ICS2 and SSA projects in other data-related aspects, delivering on data modelling, interface definitions, technical and functional documentation, etc.
• Support the definition, design, and use of the SSA platform, focusing on efficient integration, storage, accessibility, and processing of data sources, data transmissions, interfaces, and data security
• Participate in the ICS2 analytics usage and test scenarios identification and investigation
• Participate in the platform tool deployment, integration and administration
• Identify, document and apply operational procedures for the use of the platform
• Define and document technical specifications for the platform integration (interfaces, data models, etc.).
• Design or participate in the design of data processing algorithms for specific use cases.
• Report on the status, risks, and mitigation actions in this respect.
• Contribute to an ICS2 SSA data and integration architecture that satisfies the above requirements and supports future needs without significant rework for subsequent ICS2 releases and potential ICS2 Analytics evolution
• Provide direct support to the users of the platform, facilitating their work while identifying and documenting potential improvements
• Liaise with Architects on the design of the system and the way it fulfils the business and non-functional requirements (volume, scalability, stability, confidentiality, security, integrity, availability, usability), within an appropriate data governance approach from the customs perspective
• Propose systems, components and/or services architecture, COTS software products, and standards, and/or evaluate technical offers
• Liaise with Architects on the investigation and evaluation of the technical solutions proposed by TAXUD contractors, and/or evaluate assessments/benchmarking done by external parties such as contractors and other DGs
• Make proposals for architecture and data governance, taking into account the need for the analytics solution to support flexible multidisciplinary collaboration between Member States' customs experts and the Commission, within the prevailing legal and governance framework, on permanent and ad-hoc use cases; or evaluate equivalent technical offers from contractors and other vendors
• Stay up to date with industry evolution, vendor product offerings, and analytics technologies, and increase awareness of these within the organization
• Analyse the integration of the ICS2 SSA platform with the ICS2 Common Repository data model and ensure their interoperability
• Liaise with Architects on providing the technology and architectural guidance for the ICS2 SSA design and implementation
• Ensure compatibility of the versions of the different software components used together in a system, and the respective build management
• Design the analytics data models and data processing/data flows
• Design and deploy (or support the deployment of) new data processing workflows
• Design and ensure integration with ICS2 internal and external data sources
• Coordinate with the projects' various stakeholders
• Report on the status (task progress, plan, actions, risks, issues, decisions, changes, etc.) of activities related to the role's responsibilities
• Escalate data-related technical issues
• Produce and/or review large sets of technical documents
LEVEL OF EDUCATION: Bachelor's or Master's degree
The following skills and knowledge are required for the performance of the tasks listed above:
• Very good knowledge of technologies like Kubernetes, Docker, Cloudera, Spark, Kafka, Microservices, relational DBMS, RESTful APIs, AWS, Azure, etc.
• Excellent knowledge of ETL processes to ensure data quality, accuracy, and accessibility for data science teams
• Ability to take up responsibilities efficiently and fast, even with limited documentation and knowledge transfer; ability to identify and cover gaps in the documentation
• Ability to give business and technical presentations
• Ability to apply high quality standards
• Ability to cope with fast changing technologies used in the field of big data analytics
• Very good communication skills with technical and non-technical audiences
• Analysis and problem-solving skills
• Capability to write clear and structured technical documents
• Ability to participate effectively in technical meetings
The following specific expertise is mandatory for the performance of the tasks:
• Good knowledge in microservices and architecture in the cloud.
• Good knowledge in applications design.
• Excellent knowledge of Relational DBMS.
• Good knowledge of interoperability technologies (web services, message-oriented middleware, service-oriented bus, event architecture).
• Good knowledge of EC processes and business knowledge of Customs processes would be an additional asset.
• Knowledge of central high-availability systems and big data and analytics solutions; good knowledge of machine learning platforms would be a benefit.
• Good knowledge of systems and tools like Oracle, SAP, Dataiku, Denodo, Kafka, Superset, LDAP, Spark, Presto or other engines for large-scale data analytics, etc.
• Good knowledge of security aspects; system security, data security, etc.
• Knowledge of programming languages
• Knowledge of network protocols
• Excellent knowledge of XML, HTML, JSON
Level 6 to 9
Delivery mode: Near Site (Brussels)