441 Data Engineer jobs in the United Kingdom
Big Data Engineer

Posted 1 day ago
Job Description
RELOCATION ASSISTANCE: Relocation assistance may be available
CLEARANCE TYPE: UK-Highest Level of Government Clearance
TRAVEL: Yes, 10% of the Time
**Salary: £77,400 - £116,000**
**Define Possible at Northrop Grumman UK**
At Northrop Grumman UK, our mission is to solve the most complex challenges by shaping the technology and solutions of tomorrow. We call it Defining Possible.
This mind-set goes beyond our customer solutions; it's the foundation for your career development and the impact we have within the community. So, what's your possible?
**Opportunity:**
This is more than just a job; it's a mission.
As a Big Data Systems Engineer, you will be responsible for designing, implementing, and maintaining scalable big data systems that support our data science, machine learning, and AI workloads. You will work closely with cross-functional teams to ensure seamless data integration, security, and compliance with GDPR and privacy regulations. Your expertise in scripting, troubleshooting, and integration testing will be essential in optimizing our data pipelines and orchestration processes.
Our UK Cyber & Intelligence business combines modern software development approaches with a rich heritage and experience in the Defence and security sectors. Our customers have complex and sensitive data and information requirements requiring a mission partner who quickly understands the context, delivering and sustaining a portfolio of challenging technology projects at scale and pace, supporting them through an ambitious digital transformation programme.
If you are looking for a career with meaning where you can make a difference to the nation's security and support critical missions, then look no further.
_"My purpose; to lead a team of engineers with the brightest minds, to push the boundaries and define possible together."_
**Role responsibilities:**
+ Scripting and Automation: Develop and maintain scripts for automating data processes and workflows.
+ Troubleshooting: Diagnose and resolve technical issues related to big data systems, data pipelines, and integrations.
+ Integration Testing: Conduct thorough testing to ensure seamless integration of new tools and technologies into existing systems.
+ Linux Internals: Utilize in-depth knowledge of Linux internals to optimize performance and reliability of big data infrastructure.
+ Network Protocols: Apply understanding of TCP/IP and OSI models in the design and troubleshooting of networked systems.
+ Data Pipeline Support: Manage and optimize data pipelines and orchestration tools such as NiFi and Airflow (a minimal orchestration sketch follows this list).
+ Scalable System Design: Design scalable big data systems with a focus on security, GDPR compliance, and privacy
+ Hybrid Cloud Management: Leverage hybrid cloud experience to manage on-premise and AWS cloud resources effectively
+ Data Science Support: Support data science, machine learning, and AI workloads using tools like Jupyter, Spacy, Transformers, and NLTK.
+ Big Data Platforms: Utilize big data NoSQL engines and platforms such as Hive, Impala, and Elasticsearch for data storage and processing
+ BI and Visualization: Implement and support business intelligence and visualization tools like Tableau, Kibana, and PowerBI to provide actionable insights.
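As a flavour of the pipeline-support responsibility above, the sketch below shows what a minimal Apache Airflow DAG for a daily ingest-then-validate flow might look like. The DAG id, task names, and the work inside each task are illustrative assumptions, not details of the actual systems behind this role.

```python
# Minimal Airflow 2.x sketch of a daily ingest-then-validate pipeline.
# DAG id, task names, and the work done in each task are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest():
    # Placeholder: pull a batch of records from an upstream source.
    print("ingesting batch")


def validate():
    # Placeholder: run integrity checks before anything loads downstream.
    print("validating batch")


with DAG(
    dag_id="example_ingest_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)

    # validate only runs after ingest succeeds
    ingest_task >> validate_task
```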
**We are looking for:**
+ Strong troubleshooting skills and the ability to diagnose and resolve complex technical issues.
+ Experience with integration testing and ensuring seamless tool integration.
+ In-depth knowledge of Linux internals and system administration.
+ Understanding of TCP/IP and OSI models.
+ Hands-on experience with data pipeline tools like NiFi and Airflow.
+ Proven ability to design scalable big data systems with a focus on security and GDPR compliance.
+ Hybrid cloud experience, specifically with on-premise and AWS cloud environments.
+ Familiarity with data science, machine learning, and AI tools such as Jupyter, Spacy, Transformers, and NLTK.
+ Experience with big data NoSQL engines/platforms such as Hive, Impala, and Elasticsearch.
+ Proficiency with business intelligence and visualization tools like Tableau, Kibana, and PowerBI.
+ Excellent communication and collaboration skills.
Preferred Qualifications:
+ Certification in AWS or other cloud platforms.
+ Experience with additional data orchestration tools.
+ Familiarity with other big data tools and technologies.
+ Previous experience in a similar role within a dynamic and fast-paced environment.
Experience in Cloudera, Hadoop, Cloudera Data Science Workbench (CDSW), Cloudera Machine Learning (CML) would be highly desirable.
**Work Environment:**
+ Full time on-site presence required
**If you don't meet every single requirement, we still encourage you to apply.**
Sometimes, people hesitate to apply because they can't tick every box. We encourage you to apply if you believe the role will suit you well, even if you don't meet all the criteria. You might be exactly who we are looking for, either for this position or for our opportunities at Northrop Grumman UK. We are on an exciting growth trajectory and growing our teams across the UK.
**Security clearance:**
You must hold the highest level of UK Government security clearance. Our recruitment team is on hand to answer any questions and will guide you through the process.
**Benefits:**
We can offer you a range of flexible working options to suit you, including an optional compressed working schedule with every other Friday off. Our benefits include private health care, a cash health plan, holiday buy and sell, career development opportunities and performance bonuses. For a comprehensive list of benefits, speak to our recruitment team.
**Why join us?**
+ **A mission to believe in** - Every day we contribute to building a more secure and connected world, expanding our reach from land, sea, and air to space and cyberspace. From engineering data and intelligence solutions, to developing maritime navigation and control systems and innovating command and control systems for the UK and NATO, what we do together matters.
+ **A place to belong and thrive** - Every voice matters at our table, meaning you can bring your authentic self to work. From our Employee Resource Groups backed by thousands of employees to our partnerships with the Association For Black and Minority Ethnic Engineers, Forces Transition Group, Mind, and Women in Defence - we are passionate about growing and supporting our inclusive community where everyone can belong.
+ **Your career, your way** - Shape your career journey with diverse roles, mentorship, and development opportunities that fuel your curiosity, channel your expertise, and nurture your passion. Looking for flexibility? Balance your professional career with your personal life through our health and wellbeing benefits, discount schemes, and investment in your future development. Speak to our team to find the balance that's right for you.
**Ready to apply?**
**Yes** - Submit your application online. Your application will be reviewed by our team and we will be in touch.
**Possibly, I'd like to find out more about this role** - Reach out to our team for more information and support.
**No, I don't think this role is right for me** - Our extensive UK growth means we have exciting, new opportunities opening all the time. Speak to our team to discuss your career goals.
Northrop Grumman is committed to hiring and retaining a diverse workforce, and encourages individuals from all backgrounds and all abilities to apply and consider becoming a part of our diverse and inclusive workforce.
Big Data Engineer / Developer - Remote - Contract
Posted 29 days ago
Job Description
Location: Remote (Virginia / EST Time)
Long Term Contract – W2 / C2C Only
Required Skills & Experience:
Overall IT Experience: 8-10 Years.
+ years of experience in application development including Python, SQL, Scala, or Java
4+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud)
4+ years' experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL)
4 years' experience working on real-time data and streaming applications (a minimal streaming sketch follows this list)
4+ years of experience with NoSQL implementation (Mongo, Cassandra)
4+ years of data warehousing experience (Redshift or Snowflake)
4+ years of experience with UNIX/Linux including basic commands and shell scripting
2+ years of experience with Agile engineering practices
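To illustrate the real-time and streaming requirement above, here is a minimal PySpark Structured Streaming sketch that reads from a Kafka topic. The broker address and topic name are placeholder assumptions, and a real job would need the spark-sql-kafka connector package on the classpath and a durable sink rather than the console.

```python
# Minimal PySpark Structured Streaming sketch reading from Kafka.
# Requires the spark-sql-kafka connector; broker and topic are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("example-stream").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "events")                         # placeholder topic
    .load()
)

# Kafka delivers the payload as bytes; cast to string for downstream parsing.
decoded = events.selectExpr("CAST(value AS STRING) AS payload")

query = (
    decoded.writeStream
    .format("console")      # a real job would write to a durable sink instead
    .outputMode("append")
    .start()
)
query.awaitTermination()
```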
If interested, please share your resume.
Data Engineer
Posted today
Job Description
Data Engineer // 1 Day/Week Onsite Worcestershire // Initial 3 Months // Up to £400/day Outside IR35
REED Technology are working with a client who are seeking a contract Data Engineer to join their dynamic Data Engineering team. You'll play a key role in designing and implementing complex data flows that power analytics and business intelligence (BI) systems across their growing financial services organisation.
Key responsibilities
- Designing, building, and maintaining robust data pipelines and platforms
- Collaborating with BI, tech, and business teams to deliver scalable solutions
- Leading on the implementation of new data infrastructure
- Analysing data for integrity and correlation across systems
- Translating stakeholder requirements into technical deliverables
- Producing and maintaining high-quality technical documentation
- Championing data engineering best practices and standards across the business
Technical skills
- Cloud data platforms - Azure, AWS, or GCP (Azure preferred)
- Snowflake - Deep knowledge and hands-on experience
- Matillion - Expertise in ETL orchestration
- Data warehousing and advanced analytics
- Dimensional modelling and data vault methodologies
- Stakeholder engagement and cross-functional collaboration
Flexible hybrid working - 1 day per week onsite in Worcestershire (Tuesday)
This is a great opportunity for a seasoned Data Engineer to make a real impact on a forward-thinking financial services company that is investing heavily in its data capabilities.
If you have the skills and experience to carry out the role, please apply using the link provided.
Data Engineer
Posted 1 day ago
Job Description
Data Engineer (MS Fabric) / Remote with travel to Yorkshire or London / £45,000 - £55,000
Are you a Data Engineer with solid SQL skills and experience building tabular models? Want to join a business investing in its cloud journey, where you'll help modernise data platforms and influence core business reporting?
This is a great opportunity to enhance your technical capability while contributing to meaningful projects across logistics, ERP, and finance systems.
What do we need from you?
- Strong SQL - able to write efficient queries and build ETL processes
- SQL Server / RDBMS experience
- Proven experience building Tabular models
Nice to have:
- Exposure to Microsoft Fabric
- Python scripting
- Experience working in a cloud environment (e.g., Azure)
You'll be joining a collaborative data function to expand and optimise existing tabular models. The business already has strong models across finance and stock and now needs support to extend these into areas like fleet, transport, and logistics. You'll also help with modernising legacy models, supporting system changes, and contributing to their ongoing cloud transition via Microsoft Fabric.
Key focus areas:
- Enhance and build tabular models, especially around underdeveloped areas such as fleet data
- Support integration and modelling around Core ERP, Transport, and Logistics systems
- Help scale and transition to Microsoft Fabric as part of the company's cloud journey
- Improve performance and efficiency of existing architecture to meet growing user demand
Day-to-day responsibilities:
- Work closely with Business Analysts to turn business requirements into robust technical solutions
- Spend 60-70% of your time on model development, with the remainder focused on support and optimisation
- Re-engineer legacy tabular models to improve performance
- Adapt models in response to system or source data changes
- Deliver traditional ETL and modern data modelling support to the wider business
Why join?
- Clear career and skill progression opportunities
- Excellent team culture with strong employee tenure
- Opportunity to broaden your technical stack - training, certifications, and hands-on project work
- Remote working and hybrid working options which provide a great work-life balance
If you're interested, please send your CV to Dominic Brown at by close of play on Friday 25th July, to avoid disappointment, as interviews will follow shortly after.
"At Corecom, we don't just accept differences, we celebrate them and thrive on them for the benefit of our employees, our clients and our candidates. Internally, we thrive from our differences and want our employees to be proud to be themselves and proud to be Corecom. Externally, we utilise those differences to help our clients and candidates strive for a more diverse and inclusive world.
Data Engineer (MS Fabric) / Remote with travel to Yorkshire or London / £45,000 - £55,000
Data Engineer
Posted 4 days ago
Job Description
Exciting opportunity to work with a Big4 Tech Company as a Data Engineer, based in London, UK!
Data Engineer
Contract Length: 15th September - 31st August 2026
Location: London (3 days onsite, 2 days WFH)
Summary:
The main function of the Data Engineer is to develop, evaluate, test and maintain architectures and data solutions within our organization. The typical Data Engineer executes plans, policies, and practices that control, protect, deliver, and enhance the value of the organization's data assets.
Job Responsibilities:
* Design, construct, install, test and maintain highly scalable data management systems.
* Ensure systems meet business requirements and industry practices.
* Design, implement, automate and maintain large scale enterprise data ETL processes.
* Build high-performance algorithms, prototypes, predictive models and proofs of concept.
Qualifications:
* Ability to work as part of a team, as well as work independently or with minimal direction.
* Excellent written, presentation, and verbal communication skills.
* Collaborate with data architects, modelers and IT team members on project goals.
* Strong PC skills including knowledge of Microsoft SharePoint.
* Bachelor's degree in a technical field such as computer science, computer engineering or related field required.
Data Engineer
Posted 5 days ago
Job Description
JOB DETAILS
- £550 PER DAY
- INSIDE IR35
- REMOTE ROLE
- ORGANISATION BASED IN BRISTOL
- 3-MONTH CONTRACT
- IMMEDIATE START
SKILLS
- Extensive experience in Azure Data Factory, SQL and Python.
- Strong understanding of ETL processes.
- Prior working experience with Databricks and Unity Catalog.
RESPONSIBILITIES
- Taking a Proof of Concept into production.
- Fixing issues that may arise during the Proof of Concept production process.
- The candidate must be a self-starter and have the ability to work independently.
What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.
If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.
Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at (url removed)
Data Engineer
Posted 5 days ago
Job Description
I am recruiting for a Data Engineer to work in Glasgow 3 days a week, with 2 days remote.
The role falls inside IR35 so you will have to work through an umbrella company.
Banking / Financial Services experience is required.
You will have a number of years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
Experience in data development and solutions in highly complex data environments with large data volumes.
SQL / PLSQL experience with the ability to write ad-hoc and complex queries to perform data analysis.
Experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. (a minimal sketch follows below).
You will be able to develop solutions in a hybrid data environment (on-Prem and Cloud).
Hands on experience with developing data pipelines for structured, semi-structured, and unstructured data and experience integrating with their supporting stores (e.g. RDBMS, NoSQL DBs, Document DBs, Log Files etc).
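As a flavour of the Python and Pandas pipeline work described above, here is a minimal sketch of a batch step that checks and aggregates trade data. The file names and columns are illustrative assumptions, not the client's actual schema.

```python
# Minimal pandas sketch of a batch pipeline step: load, check, aggregate, write.
# File names and column names are illustrative assumptions only.
import pandas as pd

trades = pd.read_csv("trades.csv", parse_dates=["trade_date"])

# Basic integrity check before anything loads downstream.
assert trades["trade_id"].is_unique, "duplicate trade IDs found"

daily_volume = (
    trades
    .groupby([trades["trade_date"].dt.date, "desk"])["notional"]
    .sum()
    .reset_index(name="total_notional")
)

daily_volume.to_parquet("daily_volume.parquet", index=False)
```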
Please apply ASAP to find out more!
Data Engineer
Posted 7 days ago
Job Description
Data Engineer - GCP | £500-£550 per day | Inside IR35
6-Month Contract | Hybrid (2 Days Onsite - Osterley)
83zero are partnered with a leading media and broadcasting organisation on the lookout for a skilled Data Engineer to join their Data & Analytics team on an initial 6-month contract.
This role is perfect for someone with strong ETL expertise, deep experience in Google Cloud Platform (GCP), and a passion for building scalable, cloud-native data pipelines. You'll work with cutting-edge tech in a fast-paced environment, helping to deliver critical insights and analytics to the business.
What You'll Be Doing:
- Designing and developing scalable ETL pipelines to process and deliver large volumes of data.
- Working hands-on with GCP services including BigQuery, Pub/Sub, and Dataflow (a minimal BigQuery sketch follows this list).
- Automating infrastructure using Terraform, Ansible, and CI/CD tooling.
- Writing clean, efficient code in Python, Go, and Bash.
- Supporting and maintaining a secure Linux-based data engineering environment.
- Collaborating with stakeholders to ensure data pipelines meet business needs and SLAs.
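As an illustration of the BigQuery side of that work, the sketch below runs a simple aggregation with the google-cloud-bigquery client. The project, dataset, and table names are placeholder assumptions rather than the organisation's real resources.

```python
# Minimal google-cloud-bigquery sketch: run an aggregation and print the rows.
# Project, dataset, and table names are placeholder assumptions.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT channel, COUNT(*) AS plays
    FROM `example-project.analytics.stream_events`
    GROUP BY channel
    ORDER BY plays DESC
"""

for row in client.query(query).result():
    print(row.channel, row.plays)
```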
What We're Looking For:
- Proven experience in data engineering with a strong focus on cloud-based ETL workflows.
- Solid background with Google Cloud Platform (GCP) and associated data tools.
- Skilled in Infrastructure as Code - Terraform and Ansible preferred.
- Confident working with CI/CD pipelines (Jenkins, GitLab CI, GoCD, etc.).
- Proficient in Python, Go, and shell scripting (Bash).
- Strong Linux system administration skills.
- Ability to work 2 days per week onsite in Osterley.
Data Engineer
Posted 10 days ago
Job Description
Data Engineer
Location: Uxbridge (Hybrid – 2 days onsite)
Salary: Up to £60,000 + Bonus + Benefits
Type: Permanent | Mid-Senior Level
Are you a technically strong Data Engineer with a passion for bridging the gap between global data functions and commercial strategy? This is your opportunity to join one of the world’s fastest-growing consumer brands, playing a key role in developing the EMEA data landscape and working alongside a high-performing global team.
A global leader in the energy drinks space, this business continues to dominate its category with operations across 140+ countries. With its bold brand, dynamic culture, and continued double-digit growth across EMEA, they’re investing in their data capabilities to better support business-critical decisions across Commercial, Marketing, and wider business functions.
The Role
This role sits within the EMEA Data & Analytics function and will play a pivotal role in supporting the commercial team while collaborating with internal US engineers and external vendors (TCS). You’ll work to align EMEA capabilities with the more mature US function, helping standardise data reporting, architecture, and tooling across regions.
You’ll be responsible for data pipeline development, modelling, and transformation across Microsoft Azure services, while gathering requirements, engaging stakeholders, and documenting processes clearly and proactively.
Key Responsibilities
- Design, build and maintain data pipelines across Microsoft Fabric (Azure Data Factory, Synapse, Databricks)
- Develop scalable data models to support business needs in the commercial space and beyond
- Collaborate with global teams and vendors to ensure alignment and effective knowledge transfer
- Translate business needs into technical solutions through effective stakeholder engagement
- Document data architecture, processes and reporting logic to ensure repeatability and transparency
- Work with SQL and PySpark to transform and load data (a minimal PySpark sketch follows this list)
- Support Power BI reporting needs where required
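As a flavour of the SQL and PySpark transformation work in the responsibilities above, here is a minimal PySpark sketch that aggregates sales data into a reporting table. The source path, column names, and target table are illustrative assumptions, not the company's actual model.

```python
# Minimal PySpark sketch: read, aggregate, and load into a reporting table.
# Source path, column names, and target table are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("emea-sales-transform").getOrCreate()

sales = spark.read.parquet("/landing/emea/sales/")

monthly = (
    sales
    .withColumn("month", F.date_trunc("month", F.col("order_date")))
    .groupBy("month", "country")
    .agg(F.sum("net_value").alias("net_sales"))
)

monthly.write.mode("overwrite").saveAsTable("analytics.monthly_net_sales")
```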
What We’re Looking For
- Previous experience in data engineering
- Strong hands-on experience with Azure data tools (Data Factory, Synapse, Databricks)
- Advanced SQL and PySpark knowledge
- Strong stakeholder engagement skills with experience in requirement gathering and documentation
- Microsoft certification and Power BI experience are desirable
- Background in mid-to-large scale businesses preferred – complexity and data maturity essential
- A proactive, solutions-oriented personality who thrives in fast-paced, evolving environments
Interested?
Click “Apply” or email your CV to (url removed) to learn more.
The Advocate Group is a leading recruitment partner to the FMCG and consumer product sectors. We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation, or age.
By applying for this role, you are agreeing to our Privacy Policy, which can be found on our website. The Advocate Group is acting as an employment agency in relation to this vacancy.