Data Platform Engineer

London, London £80000 - £90000 Annually Rise Technical Recruitment

Posted 2 days ago

Job Description

Permanent

Data Platform Engineer
London - Hybrid
£80,000 - £90,000 + 38 Days Holiday + Private Healthcare + Life Assurance + Flexible Working + Pension + Package


Excellent opportunity for a data-focused Site Reliability Engineer or a Data Platform Engineer with a grounding in DevOps principles to join a forward-thinking, high-growth Fintech company offering a hybrid work environment, a great benefits package, and opportunities for further progression!

This company is a bleeding-edge technology business driving innovation across the global financial services sector. Their platform processes vast amounts of real-time and batch data, supporting business-critical reporting and analytics. With a strong culture rooted in integrity, creativity, and technical excellence, they've become a trusted partner to a wide range of businesses.

In this role you'll take ownership of the reliability and performance of large-scale data pipelines built on AWS, Apache Flink, Kafka, and Python. You'll play a key role in diagnosing incidents, optimising system behaviour, and ensuring reporting data is delivered on time and without failure.

The ideal candidate will have strong experience with streaming and batch data systems, a solid understanding of monitoring and observability, and hands-on experience with AWS, Apache Flink, Kafka, and Python.

This is a fantastic opportunity to step into an SRE role focused on data reliability in a modern cloud-native environment, with full ownership of incident management, architecture, and performance.

The Role:
*Maintain and monitor real-time and batch data pipelines using Flink, Kafka, Python, and AWS
*Act as an escalation point for critical data incidents and lead root cause analysis
*Optimise system performance, define SLIs/SLOs, and drive reliability (a minimal SLI/SLO sketch follows this list)
*Work closely with other departments and teams to architect scalable, fault-tolerant data solutions
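
To make the SLI/SLO responsibility concrete, here is a minimal, self-contained Python sketch of a freshness SLI with an error-budget check. The thresholds, names, and sample data are illustrative assumptions rather than anything specified in the advert.

```python
# Illustrative sketch only: a freshness SLI for a reporting pipeline and an error-budget
# check against a target SLO. All thresholds and names are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

FRESHNESS_TARGET = timedelta(minutes=15)  # reporting data should land within 15 minutes
TARGET_SUCCESS_RATIO = 0.999              # SLO: 99.9% of deliveries meet the target

@dataclass
class Delivery:
    expected_at: datetime   # when the reporting data was due
    delivered_at: datetime  # when the pipeline actually delivered it

def freshness_sli(deliveries: list[Delivery]) -> float:
    """Fraction of deliveries that met the freshness target (the SLI)."""
    if not deliveries:
        return 1.0
    on_time = sum(d.delivered_at - d.expected_at <= FRESHNESS_TARGET for d in deliveries)
    return on_time / len(deliveries)

def error_budget_remaining(sli: float) -> float:
    """Fraction of the allowed failure budget still unspent (negative means breached)."""
    allowed_failure = 1.0 - TARGET_SUCCESS_RATIO
    return 1.0 - (1.0 - sli) / allowed_failure

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    sample = [Delivery(now, now + timedelta(minutes=m)) for m in (5, 10, 20)]
    sli = freshness_sli(sample)
    print(f"SLI={sli:.3f}, error budget remaining={error_budget_remaining(sli):.2f}")
```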

The Person:
*Experience in a data-focused SRE, Data Platform, or DevOps role
*Strong knowledge of Apache Flink, Kafka, and Python in production environments
*Hands-on experience with AWS services (Lambda, EMR, Step Functions, Redshift, etc.)
*Comfortable with monitoring tools, distributed systems debugging, and incident response

Reference Number: BBBH(phone number removed)

To apply for this role, or to be considered for further roles, please click "Apply Now" or contact Tommy Williams at Rise Technical Recruitment.

Rise Technical Recruitment Ltd acts as an employment agency for permanent roles and an employment business for temporary roles.

The salary advertised is the bracket available for this position. The actual salary paid will be dependent on your level of experience, qualifications and skill set. We are an equal opportunities employer and welcome applications from all suitable candidates.


Data Platform Engineer

London, London Lyst

Posted today

Job Description

Permanent

Lyst is a global fashion shopping platform founded in London in 2010 and catering to over 160M shoppers per year. We offer our customers the largest assortment of premium & luxury fashion products in one place, curating pieces from 27,000 of the world's leading brands and stores. In 2025, Lyst joined Zozo, operators of Zozotown, the leading fashion e-commerce platform in Japan. This partnership marks a bold new era for Lyst, as we accelerate our vision and work together to transform the future of fashion shopping through AI and technology. 

At Lyst, we obsess over the customer, providing a search & discovery experience which offers inspiration, fulfilment, and personalisation. We believe that fashion is amazing but shopping for fashion often isn't, and use our technology, data and creativity to bring more joy, greater choice and fewer fails. Our mission is to help fashion shoppers make better choices and help fashion partners find better audiences as the category-leading destination for every fashion shopper.

The Role

Lyst is looking for a Software Engineer to join our Data Platform team. You will help build upon and improve our customer-facing analytics ingestion pipelines that power our business intelligence and decision making.

You will work in a fast-paced environment with a high level of autonomy. While your work will be peer reviewed and subject to approvals, it is important that you are able to liaise with both team members and wider partners in the organisation to gather requirements and lead on appropriate solutions.

You will be maintaining and improving a system that handles a high volume of traffic, so experience with cloud and data warehouse infrastructure will help you in this role (we use AWS, Cloudflare and Snowflake). Familiarity with infrastructure as code will also help when updating our cloud architecture (we use Terraform).

We place a large focus on data quality, so you'll be expected to review both the small picture (data contracts via versioned schemas) and the wider picture (data integrity and data completeness across tables and databases). We also value team members who propose ways to improve data quality and drive those initiatives across the business.
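
As an illustration of "data contracts via versioned schemas", the sketch below shows one minimal way to express and validate a versioned event contract in plain Python. The event name, fields, and versions are hypothetical examples, not Lyst's actual contracts.

```python
# Illustrative sketch only: a versioned event schema ("data contract") expressed with
# plain dataclasses. Event name, fields, and versions are hypothetical.
import dataclasses
from dataclasses import dataclass

@dataclass(frozen=True)
class ProductViewV1:
    schema_version: int
    event_id: str
    product_id: str
    viewed_at: str            # ISO-8601 timestamp

@dataclass(frozen=True)
class ProductViewV2(ProductViewV1):
    currency: str = "GBP"     # additive, defaulted field: v1 producers remain valid

CONTRACTS = {1: ProductViewV1, 2: ProductViewV2}

def validate(event: dict):
    """Return a typed record if the event satisfies its declared contract version."""
    contract = CONTRACTS.get(event.get("schema_version"))
    if contract is None:
        raise ValueError(f"unknown schema_version: {event.get('schema_version')!r}")
    known = {f.name for f in dataclasses.fields(contract)}
    try:
        return contract(**{k: v for k, v in event.items() if k in known})
    except TypeError as exc:  # a required field is missing
        raise ValueError(f"event violates contract v{event['schema_version']}: {exc}") from exc

if __name__ == "__main__":
    record = validate({"schema_version": 2, "event_id": "e1", "product_id": "p9",
                       "viewed_at": "2024-01-01T12:00:00Z"})
    print(record)
```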

Within the first three months, you will be able to:

  • Contribute to every part of our system, ranging from code and tests to infrastructure changes.
  • Ensure the stability of our system by implementing and improving monitoring and observability tools.
  • Write resilient code that is well tested.
  • Be curious, not just about the code, but about the architecture of our platforms and everything that enables the business to thrive.
  • Gain expertise in our tools and services: Python, Docker, GitHub Actions, Snowflake, AWS
  • Participate in all team ceremonies and have direct input in the team's ways of working.

This is a high-trust, supportive, and collaborative environment where you will have plenty of opportunities to make an impact on your team and the wider company. We value attitude as much as we do direct experience—we want to hire people to grow into the role and beyond.

About the team:

  • Python is our bread and butter.
  • The wider data platform team uses dbt, Snowflake, and Looker to model, transform, and expose data for analytics and reporting across the business.
  • We use Docker and Kubernetes to manage our production services.
  • We use GitHub Actions for continuous integration and deployment. Lyst deploys new changes to production more than 500 times a month.
  • The Data Platform team is one of Lyst's fundamental teams, and the data we manage is critical for the business. We have a strong relationship across the rest of the organisation, and almost all of Lyst engineering engages with us on a regular basis.
  • We care about robustness and integrity in our pipelines and use observability tools to monitor them.

Requirements

  • Experience in developing robust and secure software solutions and data pipelines. 
  • Effective communication skills, comfortable working with technical and non-technical individuals and teams.
  • Proficiency in developing with Python.
  • Experience with delivering software solutions and data pipelines within public cloud technologies and architecture (preferably AWS experience).
  • Experience with containers (Docker) and container orchestration.
  • Experience with Infrastructure as Code (we use Terraform).
  • Experience utilising monitoring, observability and logging tools.
  • Experience with Git, GitOps, and GitHub Actions.
  • Exposure to or experience with cloud data warehouses/data platforms (we use Snowflake).

Things that matter to us:

  • You have a passion for utilising your software engineering skills in data engineering.
  • You enjoy building sustainable, robust software that scales, and you automate and drive efficiencies at every chance.
  • You are excited to learn and equally share your knowledge with peers.
  • You enjoy solving problems as an individual, within a team, and in wider collaboration across teams.
  • You take pride in code ownership and maintain your responsibilities to a high standard, with good documentation and test coverage.
  • You understand the importance of good data management practices and broader data governance principles, and their impact on things like data quality.
  • You effectively support and maintain production data pipelines, ensuring monitoring and observability are in place, and respond promptly to sustain service levels and data consistency within data platforms.

Benefits

    • Our Ways of Working: We all come into the office on Tuesdays and Thursdays, with the option to work remotely or come into the office on the other days. We believe that in-person collaboration and community spirit are super important, which is why we spend some of our time in the office and some of our time at home.
    • Time Off: In addition to the 8 statutory

Data Platform Engineer

London, London BMLL Technologies

Posted 5 days ago

Job Description

Permanent

About BMLL

We are the leading independent provider of harmonized Level 3 historical data and analytics to the world’s most sophisticated capital market participants. BMLL offers banks, brokers, asset managers, hedge funds and global exchange groups immediate and flexible access to the most granular Level 3, T+1 order book data and advanced analytics, enabling them to accelerate research, optimize trading strategies and generate alpha at unparalleled speed and scale.

Our culture is inclusive and highly collaborative, with a flat management structure that empowers our employees to get involved in decision making as we continue to grow and scale.  We give all our employees share options so they participate in the growth and development of the business.

We offer a combination of remote and office (London based) working, weekly team lunches and plenty of office snacks!

For more information, please visit our website, or visit our Twitter, @bmlltech or LinkedIn, @BMLL.

About the Role

At BMLL, we process terabytes of historical market data every day, for which we have a powerful data processing platform built on AWS to provide best-in-class capacity, scalability and reliability.

We are looking for a Data Platform Engineer to join BMLL's Core Engineering team, where you'll architect the core platform in which BMLL's development teams execute highly complex data pipelines. You'll design and build solutions that scale compute to millions of concurrent job executions, optimised to meet performance, efficiency and cost-effectiveness requirements, while ensuring high availability.

Data Platform Engineers bridge the gap between software and infrastructure, and are essential to the success of BMLL's technology strategy.

Responsibilities

  • Design and build software solutions to scale AWS compute resources to meet application performance requirements (a minimal sketch follows this list).
  • Ensure 24/7 system reliability by implementing company and industry best practices in replication, redundancy and monitoring.
  • Implement workflow management software to automate operational tasks and optimise the utilisation of infrastructure and applications.
  • Design and implement CI/CD workflows to maintain software quality via continuous and automated deployment and testing.
  • Work with development and operations teams to design solutions to complex problems, involving large data pipelines that process terabytes of historical market data, in the most efficient and cost-effective manner.
  • Regularly review new tools as they become available in the industry and assess how they could be integrated into the platform for continuous improvement.
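
As a rough illustration of the compute fan-out mentioned above, the sketch below submits a partitioned pipeline step as an AWS Batch array job with boto3. The queue, job definition, region, and parameter names are hypothetical, and AWS Batch (listed only as a desirable familiarity below) is just one of several services that could be used; the advert does not prescribe this approach.

```python
# Illustrative sketch only: fanning one pipeline step out as an AWS Batch array job.
# Queue, job definition, region, and environment variable names are hypothetical.
import boto3

batch = boto3.client("batch", region_name="eu-west-2")

def submit_partitioned_step(trade_date: str, partitions: int) -> str:
    """Submit one array job whose child tasks each process a single data partition."""
    response = batch.submit_job(
        jobName=f"orderbook-normalise-{trade_date}",
        jobQueue="core-data-queue",              # hypothetical job queue
        jobDefinition="orderbook-normalise:3",   # hypothetical job definition revision
        arrayProperties={"size": partitions},    # children read AWS_BATCH_JOB_ARRAY_INDEX
        containerOverrides={
            "environment": [{"name": "TRADE_DATE", "value": trade_date}],
        },
    )
    return response["jobId"]

if __name__ == "__main__":
    print(submit_partitioned_step("2024-06-03", partitions=500))
```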

Requirements

Essential

  • Industry experience with cloud computing tools and services in complex systems, preferably in AWS.
  • Strong Python programming skills.
  • Industry experience with software development lifecycle processes and tools.
  • Experience working in a Linux environment.
  • Experience with Docker.
  • Experience with SQL and relational databases.
  • An avid learner and problem solver with strong attention to detail.
  • Excellent teamwork and the ability to communicate and work in multidisciplinary teams in a collaborative manner.
  • Computer science or other STEM degree.
  • At least two years of industry experience.

Desirable

  • Familiarity with distributed systems concepts and tools, such as Spark, Ray, RabbitMQ, Kafka, AWS Batch.
  • Familiarity with DevOps practices and tools, such as Terraform.
  • Familiarity with job execution and orchestration tools, such as Celery and Airflow.

Benefits

  • Competitive salary
  • 25 days holiday plus

Data Platform Engineer

London, London YouLend

Posted 48 days ago

Job Description

Permanent

About Us  

YouLend is the preferred global embedded financing platform for many of the world’s leading e-commerce sites, tech companies and


Lead Data Platform Engineer (Python & AWS)

Lyst

Posted 568 days ago

Job Description

Permanent

Lyst is a global Fashion Tech company and premium shopping app, founded in London in 2010 and catering to over 200M shoppers per year. We offer our customers the largest assortment of premium & luxury fashion items & products in one place, via an assortment of 8.5M+ items from over 17,000 of the world’s leading brands. We are a scale-up business with a current team size of c. 160 people in London, combining an agile mentality with a proven business model and over a decade of experience. This provides a balance between foundations and structure, and autonomy and pace.

At Lyst we obsess over the customer, providing a search & discovery experience which offers inspiration, fulfilment, and personalisation. We believe that fashion is amazing but shopping for fashion often isn’t, and use our technology, data and creativity to bring more joy, greater choice and fewer fails. Our mission is to help fashion shoppers make better decisions and help fashion partners find better audiences as the category-leading destination for every fashion shopper. Lyst has raised over $160m from leading investors including Accel, Balderton, Molten Ventures, Fidelity International, and LVMH.

We’re looking for a Lead Data Platform Engineer to join our central Analytics Engineering team, and help drive the Data Engineering discipline within the Data Platform chapter. This is a great opportunity for someone who wants to have impact from day one and make their mark as an expert in a high performing, collaborative team.

We record nearly 100 million lines of data daily, and we are looking for a senior engineer to support and maintain a data platform that is scalable and reliable, and that enables data-led decision making business-wide.

As a data engineer within the central data platform function, you will maintain the data ingestion infrastructure in AWS for data owners across Lyst to leverage and to ensure the data created is of high quality. You will identify, design and implement improvements to these processes, including data contracts and testing. This role will work closely with our engineers, analytics engineers and analysts to help us successfully deliver our data platform and data strategy.

Key responsibilities:

  • Maintain our data ingestion infrastructure, and liaise with Engineering data owners on the infrastructure and standards
  • Identify, design and implement process improvements for data ingestion and testing
  • Create documentation and guidelines on building reliable, scalable data pipelines
  • Own our data contract capabilities, and develop guidelines for data owners
  • Advocate for data quality across the organisation
  • Support data owners on data ingestion, for new and existing data pipelines
  • Work closely with Analytics Engineering to bridge the gap between data creation and transformation
  • Support in delivering our data strategy
  • We work mainly in Python and SQL, running on a range of AWS technologies such as S3, EKS, SQS, Firehose, Kinesis & Lambda, and non-AWS tools such as Snowflake, dbt, Looker, CircleCI, Docker, Terraform & GitHub. A minimal ingestion sketch using a couple of these tools follows this list.
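
As a minimal illustration of the ingestion tooling listed above, the sketch below pushes a schema-versioned event onto a Kinesis Data Firehose delivery stream with boto3. The stream name, region, and event fields are hypothetical assumptions made for the example.

```python
# Illustrative sketch only: an ingestion step that writes a schema-versioned event to a
# Kinesis Data Firehose delivery stream. Stream name, region, and fields are hypothetical.
import json
import boto3

firehose = boto3.client("firehose", region_name="eu-west-2")

def ingest(event: dict, stream: str = "analytics-events") -> str:
    """Serialise one event and hand it to Firehose for downstream delivery (e.g. to S3)."""
    payload = json.dumps({"schema_version": 1, **event}).encode("utf-8") + b"\n"
    response = firehose.put_record(
        DeliveryStreamName=stream,
        Record={"Data": payload},
    )
    return response["RecordId"]

if __name__ == "__main__":
    print(ingest({"event": "product_view", "product_id": "p9"}))
```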

Requirements

  • Experience with Python, AWS data tooling, and software fundamentals (SQL, Git)
  • Experience with data processing & testing in the cloud, with a large amount of data
  • Excellent Python knowledge and experience, up to date with Python best practices, as well as experience writing efficient SQL
  • Experience working on full data engineering projects, from design to implementation
  • Experience iterating & refactoring existing code as well as writing new code
  • Communication: You are able to communicate clearly and be humble when sharing ideas and solutions. You can communicate with both technical and non-technical stakeholders
  • You strive to write clear, well documented, tested code, and look to improve existing code and ways of working. When you find an opportunity, you're excited to go for it
  • (Bonus) Experience with data contract implementation

Things that matter to us:

  • You are curious at heart and like to take ownership of something to make it better
  • You are detail oriented
  • You are a team player and communicate with your peers and other stakeholders in the company on a day to day basis
  • Being confronted with a difficult or strange problem makes you feel like a detective that wants to crack the mystery
  • You have a sense of ownership over products, features and services the team looks after

Benefits

    • Our Ways of Working: We come into the office 2-4 days a week and we're always in on Tuesdays and Thursdays. We strongly believe that in-person collaboration and time spent together as a team allow us to deliver high impact work, but the flexibility to also work remotely supports our diverse team.
  • Time Off : In addition to the 8 statutory

Big Data Engineer

London, London Northrop Grumman

Posted 4 days ago

Job Description

UK CITIZENSHIP REQUIRED FOR THIS POSITION: Yes
RELOCATION ASSISTANCE: Relocation assistance may be available
CLEARANCE TYPE: UK-Highest Level of Government Clearance
TRAVEL: Yes, 10% of the Time
**Salary: £77,400 - £116,000**
**Define Possible at Northrop Grumman UK**
At Northrop Grumman UK, our mission is to solve the most complex challenges by shaping the technology and solutions of tomorrow. We call it Defining Possible.
This mind-set goes beyond our customer solutions; it's the foundation for your career development and the impact we have within the community. So, what's your possible?
**Opportunity:**
This is more than just a job; it's a mission.
As a Big Data Systems Engineer, you will be responsible for designing, implementing, and maintaining scalable big data systems that support our data science, machine learning, and AI workloads. You will work closely with cross-functional teams to ensure seamless data integration, security, and compliance with GDPR and privacy regulations. Your expertise in scripting, troubleshooting, and integration testing will be essential in optimizing our data pipelines and orchestration processes.
Our UK Cyber & Intelligence business combines modern software development approaches with a rich heritage and experience in the Defence and security sectors. Our customers have complex and sensitive data and information requirements requiring a mission partner who quickly understands the context, delivering and sustaining a portfolio of challenging technology projects at scale and pace, supporting them through an ambitious digital transformation programme.
If you are looking for a career with meaning where you can make a difference to the nation's security and support critical missions, then look no further.
_"My purpose; to lead a team of engineers with the brightest minds, to push the boundaries and define possible together."_
**Role responsibilities:**
+ Scripting and Automation: Develop and maintain scripts for automating data processes and workflows.
+ Troubleshooting: Diagnose and resolve technical issues related to big data systems, data pipelines, and integrations.
+ Integration Testing: Conduct thorough testing to ensure seamless integration of new tools and technologies into existing systems.
+ Linux Internals: Utilize in-depth knowledge of Linux internals to optimize performance and reliability of big data infrastructure.
+ Network Protocols: Apply understanding of TCP/IP and OSI models in the design and troubleshooting of networked systems.
+ Data Pipeline Support: Manage and optimize data pipelines and orchestration tools such as NiFi and Airflow (a minimal Airflow sketch follows this list).
+ Scalable System Design: Design scalable big data systems with a focus on security, GDPR compliance, and privacy.
+ Hybrid Cloud Management: Leverage hybrid cloud experience to manage on-premise and AWS cloud resources effectively.
+ Data Science Support: Support data science, machine learning, and AI workloads using tools like Jupyter, Spacy, Transformers, and NLTK.
+ Big Data Platforms: Utilize big data NoSQL engines and platforms such as Hive, Impala, and Elasticsearch for data storage and processing.
+ BI and Visualization: Implement and support business intelligence and visualization tools like Tableau, Kibana, and PowerBI to provide actionable insights.
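
The sketch below is a minimal Apache Airflow 2.x DAG of the kind used to orchestrate a batch pipeline, included purely for illustration; the DAG id, schedule, task names, and the work done in each task are hypothetical, and the advert names Airflow and NiFi without specifying any particular workflow.

```python
# Illustrative sketch only: a minimal Airflow 2.x DAG orchestrating a three-step batch
# pipeline. The DAG id, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw records from the source system")

def transform():
    print("cleanse, enrich, and validate the records")

def load():
    print("write curated records to the analytics store")

with DAG(
    dag_id="example_batch_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older 2.x releases use schedule_interval instead
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```
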
**We are looking for:**
+ Strong troubleshooting skills and the ability to diagnose and resolve complex technical issues.
+ Experience with integration testing and ensuring seamless tool integration.
+ In-depth knowledge of Linux internals and system administration.
+ Understanding of TCP/IP and OSI models.
+ Hands-on experience with data pipeline tools like NiFi and Airflow.
+ Proven ability to design scalable big data systems with a focus on security and GDPR compliance.
+ Hybrid cloud experience, specifically with on-premise and AWS cloud environments.
+ Familiarity with data science, machine learning, and AI tools such as Jupyter, Spacy, Transformers, and NLTK.
+ Experience with big data NoSQL engines/platforms such as Hive, Impala, and Elasticsearch.
+ Proficiency with business intelligence and visualization tools like Tableau, Kibana, and PowerBI.
+ Excellent communication and collaboration skills.
**Preferred Qualifications:**
+ Certification in AWS or other cloud platforms.
+ Experience with additional data orchestration tools.
+ Familiarity with other big data tools and technologies.
+ Previous experience in a similar role within a dynamic and fast-paced environment.
Experience in Cloudera, Hadoop, Cloudera Data Science Workbench (CDSW), Cloudera Machine Learning (CML) would be highly desirable.
**Work Environment:**
+ Full time on-site presence required
**If you don't meet every single requirement, we still encourage you to apply.**
Sometimes, people hesitate to apply because they can't tick every box. We encourage you to apply if you believe the role will suit you well, even if you don't meet all the criteria. You might be exactly who we are looking for, either for this position or for our opportunities at Northrop Grumman UK. We are on an exciting growth trajectory and growing our teams across the UK.
**Security clearance:**
You must hold the highest level of UK Government security clearance. Our recruitment team is on hand to answer any questions and we will guide you through the process: .
**Benefits:**
We can offer you a range of flexible working options to suit you, including an optional compressed working schedule with every other Friday off. Our benefits include private health care, a cash health plan, holiday buy and sell, career development opportunities and performance bonuses. For a comprehensive list of benefits, speak to our recruitment team.
**Why join us?**
+ **A mission to believe in** **-** Every day we contribute to building a more secure and connected world, expanding our reach from land, sea, and air to space and cyberspace. From engineering data and intelligence solutions, to developing maritime navigation and control systems and innovating command and control systems for the UK and NATO, what we do together matters.
+ **A place to belong and thrive** **-** Every voice matters at our table meaning you can bring your authentic self to work. From our Employee Resource Groups backed by thousands of employees to our partnerships with the Association For Black and Minority Ethnic Engineers, Forces Transition Group, Mind, and Women in Defence - we are passionate about growing and supporting our inclusive community where everyone can belong.
+ **Your career, your way** - Shape your career journey with diverse roles, mentorship, and development opportunities that fuel your curiosity, channel your expertise, and nurture your passion. Looking for flexibility? Balance your professional career with your personal life through our health and wellbeing benefits, discount schemes, and investment in your future development. Speak to our team to find the balance that's right for you.
**Ready to apply?**
**Yes** - Submit your application online. Your application will be reviewed by our team and we will be in touch.
**Possibly, I'd like to find out more** **about this role** - Reach out to our team for more information and support: .
**No, I don't think this role is right for me** - Our extensive UK growth means we have exciting, new opportunities opening all the time. Speak to our team to discuss your career goals.
Northrop Grumman is committed to hiring and retaining a diverse workforce, and encourages individuals from all backgrounds and all abilities to apply and consider becoming a part of our diverse and inclusive workforce.

Cloud Data Analytics Platform Engineer - AVP

London, London Citigroup

Posted 12 days ago

Job Description

Join our rapidly expanding team as a **hands-on** Cloud Data Analytics Platform Engineer and play a pivotal role in shaping the future of data at Citi. We're building a cutting-edge, multi-cloud data analytics platform that empowers our users with secure, scalable, and efficient data insights. This role sits at the intersection of infrastructure, data engineering, and architecture, offering a unique opportunity to work with the latest cloud-native technologies and influence our data strategy. This is a hands-on role requiring deep technical skills and a passion for building and optimizing data platforms.
**What You'll Do:**
+ **Architect and Build:** Design and implement a robust, cloud-native data analytics platform spanning AWS, GCP, and other emerging cloud environments. You'll leverage services like S3/GCS, Glue, BigQuery, Pub/Sub, SQS/SNS, MWAA/Composer, and more to create a seamless data experience. **(Required)**
+ **Data Lake, Data Zone, Data Governance:** Design, build, and manage data lakes and data zones within our cloud environment, ensuring data quality, discoverability, and accessibility for various downstream consumers. Implement and maintain enterprise-grade data governance capabilities, integrating with data catalogs and lineage tracking tools to ensure data quality, security, and compliance. **(Required)**
+ **Infrastructure as Code (IaC):** Champion IaC using Terraform, and preferably other tools like Harness, Tekton, or Lightspeed, developing modular patterns and establishing CI/CD pipelines to automate infrastructure management and ensure consistency across our environments. **(Required, with expanded toolset)**
+ **Collaboration and Best Practices:** Work closely with data engineering, information security, and platform teams to define and enforce best practices for data infrastructure, fostering a culture of collaboration and knowledge sharing. **(Required)**
+ **Kubernetes and Orchestration:** Manage and optimize Kubernetes clusters, specifically for running critical data processing workloads using Spark and Airflow. **(Required)**
+ **Cloud Security:** Implement and maintain robust security measures, including cloud networking, IAM, encryption, data isolation, and secure service communication (VPC peering, PrivateLink, PSC/PSA). **(Required).** Your knowledge of compliance frameworks relevant to cloud data will be invaluable in maintaining a secure and compliant data environment. **(Optional)**
+ **Snowflake and Databricks (Optional, but highly desired):** Leverage your experience with Snowflake and Databricks to enhance our data platform's capabilities and performance. While not mandatory, experience with these technologies is a significant advantage.
+ **Event-Driven Architectures, FinOps and Cost Optimization (Optional):** Contribute to the development of event-driven data pipelines using Kafka and schema registries, enabling real-time data insights and responsiveness (a minimal producer sketch follows this list). Apply FinOps principles and multi-cloud cost optimization techniques to ensure efficient resource utilization and cost control.
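
As a minimal illustration of the event-driven pattern above, the sketch below publishes a schema-versioned event to Kafka. The kafka-python client, broker address, topic, and payload shape are assumptions made for the example, and the schema version is embedded in the message rather than managed by a schema registry.

```python
# Illustrative sketch only: publishing a schema-versioned event to Kafka. The kafka-python
# client, broker, topic, and payload fields are assumptions; in practice the schema would
# typically be registered and validated via a schema registry.
import json
from datetime import datetime, timezone

from kafka import KafkaProducer  # assumes the kafka-python package is installed

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",                        # hypothetical broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish_trade_event(account: str, amount: float) -> None:
    """Publish one versioned event for downstream analytics consumers."""
    event = {
        "schema_version": 1,
        "event_type": "trade_booked",
        "account": account,
        "amount": amount,
        "occurred_at": datetime.now(timezone.utc).isoformat(),
    }
    producer.send("trade-events", value=event)

if __name__ == "__main__":
    publish_trade_event("ACC-123", 250.0)
    producer.flush()  # block until the broker acknowledges the send
```
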
**What You'll Bring:**
+ **Hands-on Engineering Expertise:** You're a builder who enjoys diving into the technical details and getting your hands dirty. You thrive in a fast-paced environment and are eager to make a direct impact.
+ **Experience:** 5-8 years of relevant experience in Data Engineering & Infrastructure Automation
+ **Cloud Expertise:** Proven hands-on experience with AWS and/or GCP, including a deep understanding of their data analytics service offerings.
+ **Data Lake/Zone/Governance Experience:** Demonstrable experience designing, building, and managing data lakes and data zones. Familiarity with data governance tools and frameworks.
+ **IaC Proficiency:** Solid experience with Terraform and preferably Harness, Tekton, or Lightspeed for CI/CD pipeline management.
+ **Kubernetes Mastery:** Strong command of Kubernetes, especially in the context of data processing workloads.
+ **Security Focus:** A firm grasp of cloud security principles and best practices.
+ **Financial Services Experience:** Experience working in financial services, banking, or on data-related cloud transformation projects within the financial industry. (Highly Desired)
**We offer:**
By joining Citi London, you will not only be part of a business casual workplace with a hybrid working model (up to 2 days working at home per week), but also receive a competitive base salary (which is annually reviewed), and enjoy a whole host of additional benefits such as:
+ 27 days annual leave (plus

Data Solutions Architect (Modern Big Data)

London, London NTT America, Inc.

Posted 11 days ago

Job Description

**The team you'll be working with:**
We are seeking a highly experienced and visionary Data Solutions Architect (Modern Big Data) to join our Data & AI practice. The successful candidate will bring extensive expertise in architecting and delivering modern big data platforms that are scalable, reliable, and business-aligned. This role is pivotal in enabling clients to harness the power of streaming data, data lakes, lakehouses, and advanced analytics platforms, while guiding them on their data modernisation journeys.
As a trusted advisor, you will collaborate with executives, stakeholders, and technical teams to define modern big data strategies, design cloud-native architectures, and implement industry-leading best practices. You will thrive in a fast-paced, evolving technology environment, continuously expanding your knowledge to ensure NTT DATA and our clients remain leaders in data-driven innovation.
**What you'll be doing:**
**Primary Responsibilities:**
+ Client Engagement & Delivery
+ Solution Design & Implementation
+ Modernisation & Transformation
+ Thought Leadership & Knowledge Sharing
+ Collaboration & Leadership
**Business Relationships:**
+ Client Partners
+ Practice Leaders and Members
+ Peer-level relationships within client organisations up to Head of Data Engineering, Chief Data Architect, CIO, and CDO level
**What experience you'll bring:**
**Key Competencies:**
+ Demonstrated expertise in designing and implementing modern big data architectures for large enterprises.
+ Strong consulting values with the ability to engage effectively with senior client stakeholders.
+ Hands-on experience across the data lifecycle: ingestion, storage, transformation, governance, analytics, and consumption.
+ Deep understanding of data product strategies, Data Mesh, and Data Fabric concepts.
+ Ability to align data strategies with business objectives, driving measurable value.
+ Excellent analytical, problem-solving, and communication skills.
+ Proven experience leading data modernisation initiatives across multiple sectors.
**Technical Expertise:**
+ Proficiency with big data technologies: Apache Spark, Kafka, Confluent, Databricks, Unity Catalog.
+ Strong knowledge of cloud ecosystems (AWS, Azure, GCP) and infrastructure-as-code (Terraform, CloudFormation).
+ Hands-on expertise with data lake/lakehouse architectures and modern streaming solutions.
+ Experience with SQL, NoSQL, ETL/ELT pipelines, and programming languages (Python, R, Java).
+ Familiarity with BI, data visualisation, and DevOps principles.
+ Strong understanding of data governance, security, and compliance frameworks (GDPR, CCPA, HIPAA).
+ Exposure to machine learning and AI integration within modern data platforms.
**Experience, Qualifications:**
+ Experience: Minimum 8-12 years in data architecture, engineering, or consulting, with at least 4+ years in modern big data solution architecture.
+ Education: University degree required.
+ Preferred: BSc/MSc in Computer Science, Data Engineering, or related field.
+ Relevant certifications in Databricks, Kafka, or cloud platforms highly desirable.
**Measures of Success:**
+ Delivery of modern big data solutions aligned to client business objectives.
+ High client satisfaction and repeat engagements.
+ Leadership in client transformation journeys to cloud-native data platforms.
+ Contribution to practice growth through assets, accelerators, and thought leadership.
**Who we are:**
We're a business with a global reach that empowers local teams, and we undertake hugely exciting work that is genuinely changing the world. Our advanced portfolio of consulting, applications, business process, cloud, and infrastructure services will allow you to achieve great things by working with brilliant colleagues, and clients, on exciting projects.
Our inclusive work environment prioritises mutual respect, accountability, and continuous learning for all our people. This approach fosters collaboration, well-being, growth, and agility, leading to a more diverse, innovative, and competitive organisation. We are also proud to share that we have a range of Inclusion Networks such as: the Women's Business Network, Cultural and Ethnicity Network, LGBTQ+ & Allies Network, Neurodiversity Network and the Parent Network.
For more information on Diversity, Equity and Inclusion please click here: Creating Inclusion Together at NTT DATA UK | NTT DATA
**What we'll offer you:**
We offer a range of tailored benefits that support your physical, emotional, and financial wellbeing. Our Learning and Development team ensure that there are continuous growth and development opportunities for our people. We also offer the opportunity to have flexible work options.
You can find more information about NTT DATA UK & Ireland here:
We are an equal opportunities employer. We believe in the fair treatment of all our employees and commit to promoting equity and diversity in our employment practices. We are also a proud Disability Confident Committed Employer - we are committed to creating a diverse and inclusive workforce. We actively collaborate with individuals who have disabilities and long-term health conditions which have an effect on their ability to do normal daily activities, ensuring that barriers are eliminated when it comes to employment opportunities. In line with our commitment, we guarantee an interview to applicants who declare to us, during the application process, that they have a disability and meet the minimum requirements for the role. If you require any reasonable adjustments during the recruitment process, please let us know. Join us in building a truly diverse and empowered team.

Cloud & AI Solution Engineer - Data Platform

London, London Microsoft Corporation

Posted 13 days ago

Job Description

Are you insatiably curious, deeply passionate about the realm of databases and analytics, and ready to tackle complex challenges in a dynamic environment in the era of AI? If so, we invite you to join our team as a **Cloud & AI Solution Engineer** in **Data Platform** for **enterprise customers** at Microsoft. Here, you'll be at the forefront of innovation, working on cutting-edge projects that leverage the latest technologies to drive meaningful impact. Join us and be part of a team that thrives on collaboration, creativity, and continuous learning.
Databases & Analytics is a growth opportunity for Microsoft Azure, as well as its partners and customers. It includes a rich portfolio of products including IaaS and PaaS services on the Azure Platform in the age of AI. These technologies empower customers to build, deploy, and manage database and analytics applications in a cloud-native way.
As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment, all while enjoying flexible work opportunities.
As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform.
Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
**Responsibilities**
+ Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments.
+ Lead hands-on engagements-hackathons and architecture workshops-to accelerate adoption of Microsoft's cloud platforms.
+ Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions
+ Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
+ Maintain deep expertise in Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance and Azure Databases: SQL DB, Cosmos DB, PostgreSQL.
+ Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop & BI solutions.
+ Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums
**Qualifications**
+ Proven technical pre-sales or technical consulting experience
+ OR Bachelor's Degree in Computer Science, Information Technology, or related field AND a few years of technical pre-sales or technical consulting experience
+ OR Master's Degree in Computer Science, Information Technology, or related field AND a couple of years of technical pre-sales or technical consulting experience
+ OR equivalent experience
+ Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL) from migration & modernize and creating new AI apps.
+ Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligent, and reporting using integrated Data Security & Governance.
+ Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.
Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations ( .

International Standard Verifier - BTEC Cloud Computing

London, London Pearson

Posted 10 days ago

Job Description

This vacancy is for candidates who are based in the UK and willing to travel internationally
**About Pearson**
At Pearson we're committed to a world that's always learning. From bringing lectures vividly to life to turning textbooks into laptop lessons, we are always re-examining the way people learn best. We are bold thinkers and standout innovators who motivate each other to explore new frontiers in an environment that supports and inspires us to be better. By pushing the boundaries of technology, and each other to surpass these boundaries, we create seeds of learning that become the catalyst for the world's innovations, personal and global.
We are the UK's largest awarding body and offer qualifications that are globally recognised and benchmarked, with educational excellence rooted in a range of General and Vocational courses.
**Purpose**
The Standards Verifier plays a key role in maintaining the integrity and consistency of vocational qualifications by ensuring that assessment and internal verification practices meet national standards. This involves providing expert subject-specific support to centres, conducting sampling and verification activities, delivering clear and constructive feedback, and contributing to the ongoing development of quality assurance processes. Through regular training, collaboration with internal teams, and engagement with centres, the Standards Verifier helps uphold Pearson's commitment to high-quality, reliable assessment.
**Core Services**
When carrying out your role, please be mindful that you represent Pearson and that you should always maintain your professional integrity. Therefore, it is essential that you:
Ensure compliance with all parts of the terms and conditions and other policies.
Act at all times in a way which will not bring Pearson, its employees and its representatives into disrepute.
Adhere to Pearson policies and guidance related to data privacy and security of information.
Remain respectful and advocate for others to ensure diversity, equity, and inclusion, in a professional manner.
Raise any concerns if you suspect or are made aware of any malpractice, maladministration and/or safeguarding issues.
Adhere to deadlines given by Pearson and ensure that all work is carried out to the best of your ability.
Respect the confidentiality of centres and learners.
You are responsible for the provision of your own IT equipment and must therefore ensure it is capable of running Pearson-required software, with relevant malware protection, appropriate anti-virus software and a suitable internet connection. As well as this, we require a personal email address attached to a secure