Senior Data Engineer
Freemarket
Date: 16 hours ago
City: London
Contract type: Full time

Job Title: Senior Data Engineer
Location: Freemarket offers a hybrid working model. You should be able to attend the office in London Bridge when required, on average twice per week.
Department: Data / Engineering
Reports To: Head of Data
Employment Type: Permanent, Full-Time
About FreemarketFX
Freemarket is a provider of digital solutions for FX and cross-border payment needs. Anchored by deep sector expertise, rigorous compliance-led onboarding, and unmatched oversight of regulated flows, we give clients a partner that values their relationship like no other. Through our proprietary digital platform, clients can access an instant settlement network and seamless real-time money movement globally within an interconnected community of like-minded companies.
At Freemarket, our success is driven by our commitment to core behaviours that shape how we work and deliver value. We take accountability, ensuring outcomes are met with urgency and transparency. Our data-driven approach blends rigorous analysis with intuition to guide sound decision-making. We encourage innovation by being curious learners, always seeking new knowledge, skills, and perspectives. We act as team players, prioritising team success over individual recognition, and our client-centric mindset ensures we consistently understand and meet the needs of our clients, adding value at every step. These behaviours run through everything we do, enabling us to exceed expectations and support our clients' growth effectively.
About the Role
We are seeking a highly skilled Senior Data Engineer to design, implement, and maintain scalable, reliable, and high-performance data pipelines and architectures in a modern cloud environment. You will play a key role in building and optimising our Medallion architecture (Bronze, Silver, Gold layers), working with modern tools such as Databricks, dbt, Azure Data Factory, and Python/SQL to support critical business analytics and AI/ML initiatives.
Key Responsibilities
- ETL Development: Design and build robust and reusable ETL/ELT pipelines through the Medallion architecture in Databricks.
- Data Transformation: Create and manage data models and transformations using dbt, ensuring clear lineage, version control, and modularity.
- Pipeline Orchestration: Develop and manage workflow orchestration using Azure Data Factory, including setting up triggers, pipelines, and integration runtimes.
- System Maintenance: Monitor, maintain, and optimise existing data pipelines, including cron job scheduling and batch/stream processing.
- Error Handling: Design and implement effective logging, monitoring, and alerting strategies for robust error management and recovery.
- Scalability & Future-Proofing: Contribute to architectural discussions and decisions, ensuring scalability, data quality, and future-proof data systems.
- Collaboration: Work closely with data analysts, finance teams, and engineers to ensure data availability and usability across business domains.
- Documentation: Maintain comprehensive documentation covering data models, architecture decisions, transformation logic, and operational procedures.
- Data Governance & Security: Ensure compliance with data security policies, data retention rules, and privacy regulations.
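To give candidates a flavour of the layered ETL work described above, here is a deliberately simplified, plain-Python sketch of Medallion-style processing: raw records land in Bronze, are cleaned and typed into Silver, and aggregated into Gold for analytics. The field names and schema are hypothetical illustrations, not Freemarket's actual data model, and in practice this logic would live in Databricks/dbt rather than hand-rolled Python.

```python
# Toy Medallion-layer sketch: Bronze (raw) -> Silver (cleaned) -> Gold (aggregated).
# All schemas here are hypothetical examples for illustration only.

from datetime import date


def to_silver(bronze_rows):
    """Clean raw Bronze records: drop malformed rows, normalise types."""
    silver = []
    for row in bronze_rows:
        try:
            silver.append({
                "trade_date": date.fromisoformat(row["trade_date"]),
                "currency": row["currency"].upper(),
                "amount": float(row["amount"]),
            })
        except (KeyError, ValueError):
            continue  # malformed record: skip (a real pipeline would quarantine it)
    return silver


def to_gold(silver_rows):
    """Aggregate Silver into a Gold-layer summary: total amount per currency."""
    totals = {}
    for row in silver_rows:
        totals[row["currency"]] = totals.get(row["currency"], 0.0) + row["amount"]
    return totals


bronze = [
    {"trade_date": "2024-05-01", "currency": "gbp", "amount": "100.50"},
    {"trade_date": "2024-05-01", "currency": "USD", "amount": "200"},
    {"trade_date": "not-a-date", "currency": "EUR", "amount": "50"},  # malformed
]

gold = to_gold(to_silver(bronze))
print(gold)  # {'GBP': 100.5, 'USD': 200.0}
```

The malformed third record is filtered out at the Silver stage, illustrating why cleaning and aggregation are kept as separate layers.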
Requirements
- 5+ years of experience in data engineering or similar roles.
- Strong experience with Databricks, including notebooks, cluster configuration, and Delta Lake.
- Proficiency in dbt for transformation logic and version-controlled data modeling.
- Deep knowledge of Azure Data Factory, including pipeline orchestration and integration with other Azure services.
- Experience with data integration (e.g. APIs, JSON, XML, web services) is essential.
- Expertise in Python and SQL for data manipulation and pipeline development.
- Hands-on experience implementing and maintaining Medallion Architecture (Bronze/Silver/Gold).
- Familiarity with CI/CD, Git version control, and agile development methodologies.
- Strong understanding of data warehousing principles, data modeling, and performance optimization.
- Experience with cron jobs, job orchestration, and error monitoring tools.
- Experience with Azure Bicep or other Infrastructure-as-Code tools.
- Exposure to real-time/streaming data (Kafka, Spark Streaming, etc.).
- Understanding of data mesh, data contracts, or domain-driven data architecture.
- Hands-on experience with MLflow and Llama.
How to apply
To apply for this job, you need to sign in on our website. If you don't have an account yet, please register.