Data Engineer (SNS01613)

April 17, 2025

Job Description

Welcome to Sansaone, a dynamic force in ICT talent acquisition. Born out of a passion for excellence and a vision for connecting outstanding professionals with forward-thinking organizations, we are your strategic partner in building transformative teams in the Information and Communication Technology sector.

We currently have a vacancy for a Data Engineer, fluent in English, to offer their services as an expert in Valletta, Malta.

  • The work will be carried out either remotely or on site at customer premises. In the context of the first assignment, the successful candidate will be integrated into the company's Development team, which will cooperate closely with a major client's IT team on site.
Your tasks
  • Design, develop, document, and maintain ETL/ELT processes for data integration, cleaning, transformation, dissemination and automation.
  • Design, develop, document and maintain data architecture, data modelling and metadata.
  • Develop and support data warehouse/lakehouse architectures and data processing, ensuring data quality, lineage, auditing, metadata, logging, linkage across datasets and impact assessments.
  • Develop and maintain business intelligence models, interactive dashboards, reports and analytics using tools such as Databricks, Jupyter Notebooks, and Power BI.
  • Design, develop, document, improve and maintain the Data Warehouse/Lakehouse ecosystem (e.g. the DataDevOps lifecycle, architecture).
  • Contribute to the definition and documentation of data governance policies, procedures, standards, and metadata models.
Requirements
  • University degree in IT or a relevant discipline, combined with a minimum of 6 years of relevant working experience in IT.
  • Experience with development and data processing using, e.g., Python, SQL, Power Query M and DAX.
  • Experience with structured, semi-structured and unstructured data types and related file formats (e.g. JSON, Parquet, Delta).
  • Experience with gathering business requirements and transforming them into data collection, integration and analysis processes.
  • Experience in Microsoft On-Prem and Azure Data Platform tools (such as Azure Data Factory, Azure Functions, Azure Logic Apps, SQL Server, ADLS, Azure Databricks, Microsoft Fabric/Power BI, Azure DevOps, Azure AI Services, PowerShell).
  • Experience in CI/CD lifecycle using Azure DevOps.
  • Experience in Databricks ecosystem, Apache Spark and Python data processing libraries.
  • Experience with Data Modelling principles and methods.
  • Experience with Data Lakes and Data Lakehouse architecture, concepts and governance.
  • Experience with data integration and data warehouse/lakehouse modelling techniques, concepts and methods (e.g. SCD, Functional Engineering, Data Vault, Data Streaming).
  • Experience with data governance and data management standards, policies, processes, metadata and quality.
  • Experience with WebAPIs and OpenAPI standard.
  • Knowledge of DAMA Data Management best practices and standards.
  • Knowledge of Data Governance and Discovery tools such as Azure Purview.
  • Knowledge of Master data and reference data management concepts.
  • Knowledge of Business glossaries, data dictionaries, and data catalogues.
  • Knowledge of Moodle or other Learning Management System.
  • Excellent command of the English language.

Hiring Team Member

Avula Srivalli
Recruitment Coordinator