631 Big Data jobs in the United Kingdom
Big Data Engineer

Posted 6 days ago
Job Description
RELOCATION ASSISTANCE: Relocation assistance may be available
CLEARANCE TYPE: UK-Highest Level of Government Clearance
TRAVEL: Yes, 10% of the Time
**Salary: £77,400 - £116,000**
**Define Possible at Northrop Grumman UK**
At Northrop Grumman UK, our mission is to solve the most complex challenges by shaping the technology and solutions of tomorrow. We call it Defining Possible.
This mind-set goes beyond our customer solutions; it's the foundation for your career development and the impact we have within the community. So, what's your possible?
**Opportunity:**
This is more than just a job; it's a mission.
As a Big Data Systems Engineer, you will be responsible for designing, implementing, and maintaining scalable big data systems that support our data science, machine learning, and AI workloads. You will work closely with cross-functional teams to ensure seamless data integration, security, and compliance with GDPR and privacy regulations. Your expertise in scripting, troubleshooting, and integration testing will be essential in optimizing our data pipelines and orchestration processes.
Our UK Cyber & Intelligence business combines modern software development approaches with a rich heritage and experience in the Defence and security sectors. Our customers have complex and sensitive data and information requirements, and need a mission partner who quickly understands the context, delivers and sustains a portfolio of challenging technology projects at scale and pace, and supports them through an ambitious digital transformation programme.
If you are looking for a career with meaning where you can make a difference to the nation's security and support critical missions, then look no further.
_"My purpose; to lead a team of engineers with the brightest minds, to push the boundaries and define possible together."_
**Role responsibilities:**
+ Scripting and Automation: Develop and maintain scripts for automating data processes and workflows.
+ Troubleshooting: Diagnose and resolve technical issues related to big data systems, data pipelines, and integrations.
+ Integration Testing: Conduct thorough testing to ensure seamless integration of new tools and technologies into existing systems.
+ Linux Internals: Utilize in-depth knowledge of Linux internals to optimize performance and reliability of big data infrastructure.
+ Network Protocols: Apply understanding of TCP/IP and OSI models in the design and troubleshooting of networked systems.
+ Data Pipeline Support: Manage and optimize data pipelines and orchestration tools such as NiFi and Airflow (an orchestration example is sketched after this list).
+ Scalable System Design: Design scalable big data systems with a focus on security, GDPR compliance, and privacy.
+ Hybrid Cloud Management: Leverage hybrid cloud experience to manage on-premise and AWS cloud resources effectively.
+ Data Science Support: Support data science, machine learning, and AI workloads using tools like Jupyter, Spacy, Transformers, and NLTK.
+ Big Data Platforms: Utilize big data NoSQL engines and platforms such as Hive, Impala, and Elasticsearch for data storage and processing.
+ BI and Visualization: Implement and support business intelligence and visualization tools like Tableau, Kibana, and PowerBI to provide actionable insights.
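
To give a concrete sense of the pipeline orchestration mentioned in the list above, here is a minimal Apache Airflow sketch. The DAG id, schedule, scripts, and paths are illustrative assumptions, not details of this role's actual systems:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def validate_extract(**context):
    # Placeholder check; a real task would verify row counts, schemas, etc.
    print("extract validated for", context["ds"])


# Hypothetical daily ingest pipeline: extract -> validate -> load.
with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = BashOperator(task_id="extract", bash_command="python /opt/pipelines/extract.py")
    validate = PythonOperator(task_id="validate", python_callable=validate_extract)
    load = BashOperator(task_id="load", bash_command="python /opt/pipelines/load.py")

    extract >> validate >> load
```

The explicit dependency chain and retry settings are what make a failed run easy to diagnose and recover, which is the day-to-day troubleshooting this role describes.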
**We are looking for:**
+ Strong troubleshooting skills and the ability to diagnose and resolve complex technical issues.
+ Experience with integration testing and ensuring seamless tool integration.
+ In-depth knowledge of Linux internals and system administration.
+ Understanding of TCP/IP and OSI models.
+ Hands-on experience with data pipeline tools like NiFi and Airflow.
+ Proven ability to design scalable big data systems with a focus on security and GDPR compliance.
+ Hybrid cloud experience, specifically with on-premise and AWS cloud environments.
+ Familiarity with data science, machine learning, and AI tools such as Jupyter, Spacy, Transformers, and NLTK.
+ Experience with big data NoSQL engines/platforms such as Hive, Impala, and Elasticsearch.
+ Proficiency with business intelligence and visualization tools like Tableau, Kibana, and PowerBI.
+ Excellent communication and collaboration skills.
**Preferred Qualifications:**
+ Certification in AWS or other cloud platforms.
+ Experience with additional data orchestration tools.
+ Familiarity with other big data tools and technologies.
+ Previous experience in a similar role within a dynamic and fast-paced environment.
Experience in Cloudera, Hadoop, Cloudera Data Science Workbench (CDSW), and Cloudera Machine Learning (CML) would be highly desirable.
**Work Environment:**
+ Full time on-site presence required
**If you don't meet every single requirement, we still encourage you to apply.**
Sometimes, people hesitate to apply because they can't tick every box. We encourage you to apply if you believe the role will suit you well, even if you don't meet all the criteria. You might be exactly who we are looking for, either for this position or for our opportunities at Northrop Grumman UK. We are on an exciting growth trajectory and growing our teams across the UK.
**Security clearance:**
You must hold the highest level of UK Government security clearance. Our recruitment team is on hand to answer any questions and we will guide you through the process.
**Benefits:**
We can offer you a range of flexible working options to suit you, including an optional compressed working schedule with every other Friday off. Our benefits include private health care, a cash health plan, holiday buy and sell, career development opportunities, and performance bonuses. For a comprehensive list of benefits, speak to our recruitment team.
**Why join us?**
+ **A mission to believe in** **-** Every day we contribute to building a more secure and connected world, expanding our reach from land, sea, and air to space and cyberspace. From engineering data and intelligence solutions, to developing maritime navigation and control systems and innovating command and control systems for the UK and NATO, what we do together matters.
+ **A place to belong and thrive** **-** Every voice matters at our table meaning you can bring your authentic self to work. From our Employee Resource Groups backed by thousands of employees to our partnerships with the Association For Black and Minority Ethnic Engineers, Forces Transition Group, Mind, and Women in Defence - we are passionate about growing and supporting our inclusive community where everyone can belong.
+ **Your career, your way** - Shape your career journey with diverse roles, mentorship, and development opportunities that fuel your curiosity, channel your expertise, and nurture your passion. Looking for flexibility? Balance your professional career with your personal life through our health and wellbeing benefits, discount schemes, and investment in your future development. Speak to our team to find the balance that's right for you.
**Ready to apply?**
**Yes** - Submit your application online. Your application will be reviewed by our team and we will be in touch.
**Possibly, I'd like to find out more about this role** - Reach out to our team for more information and support.
**No, I don't think this role is right for me** - Our extensive UK growth means we have exciting, new opportunities opening all the time. Speak to our team to discuss your career goals.
Northrop Grumman is committed to hiring and retaining a diverse workforce, and encourages individuals from all backgrounds and all abilities to apply and consider becoming a part of our diverse and inclusive workforce.
Big Data Architect
Posted 2 days ago
Job Description
For this role, you will be responsible for providing the architectural framework that appropriately reflects the Big Data needs of a company utilizing data.
Essential requirements:
- More than 3 years of presales experience in the design of Big Data and Data analytics solutions according to customer requirements
- Previous experience preparing high-quality, engaging customer presentations; excellent communication skills; experience in conversations at CxO level; and the ability to adapt the message to customer feedback
- Experience in preparing RFP answers: organizing the offer solution team, solution definition, and effort and cost estimation
- Past experience in dealing with partners, tool vendors, etc.
- Business Domain Knowledge
- More than 5 years of experience in Big Data implementation projects
- Experience in the definition of Big Data architecture with different tools and environments: Cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc.
- Past experience in Data Engineering and data quality tools (Informatica, Talend, etc.)
- Previous experience working in a multilingual and multicultural environment
- Proactive, tech passionate and highly motivated
Desirable requirements:
- Experience in Data analysis and visualization solutions: MicroStrategy, Qlik, PowerBI, Tableau, Looker, etc.
- Background in Data Governance and Data Catalog solutions: Axon, Informatica EDC, Collibra, Purview, etc.
- Previous experience in Artificial Intelligence techniques: ML/Deep Learning, Computer Vision, NLP, etc.
General information:
- Start Date: ASAP
- Length of Contract: 1 year (minimum)
- Work Location: Madrid
- Remote working (occasional on-site presence at the customer's office in Madrid may be required).
We look forward to receiving your application!
Databricks Architect (Modern Big Data)

Posted 6 days ago
Job Description
We are seeking a highly experienced and visionary Data Solutions Architect (Modern Big Data) to join our Data & AI practice. The successful candidate will bring extensive expertise in architecting and delivering modern big data platforms that are scalable, reliable, and business-aligned. This role is pivotal in enabling clients to harness the power of streaming data, data lakes, lakehouses, and advanced analytics platforms, while guiding them on their data modernisation journeys.
As a trusted advisor, you will collaborate with executives, stakeholders, and technical teams to define modern big data strategies, design cloud-native architectures, and implement industry-leading best practices. You will thrive in a fast-paced, evolving technology environment, continuously expanding your knowledge to ensure NTT DATA and our clients remain leaders in data-driven innovation.
**What you'll be doing:**
**Primary Responsibilities:**
+ Client Engagement & Delivery
+ Solution Design & Implementation
+ Modernisation & Transformation
+ Thought Leadership & Knowledge Sharing
+ Collaboration & Leadership
**Business Relationships:**
+ Client Partners
+ Practice Leaders and Members
+ Peer-level relationships within client organisations up to Head of Data Engineering, Chief Data Architect, CIO, and CDO level
**What experience you'll bring:**
**Must-Have Competencies:**
+ 8+ years data architecture experience - Enterprise-scale solutions across multiple sectors with proven delivery track record
+ Technical leadership at scale - Leading 15+ person cross-functional teams and serving as technical escalation point for C-level stakeholders
+ Full data lifecycle mastery - End-to-end expertise from ingestion to consumption, including governance, security, and both batch/real-time processing
+ Business-technology translation - Ability to align data strategy with business objectives and communicate across all stakeholder levels
+ Databricks platform expertise - Deep hands-on experience with Databricks Lakehouse architecture, Delta Lake, Unity Catalog, and multi-cloud implementations (a lakehouse example is sketched after this list)
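
For illustration of the lakehouse pattern this competency refers to, here is a minimal PySpark sketch of a bronze-to-silver flow on Delta Lake; the paths, table names, and columns are assumptions for illustration, not details of any client engagement:

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative only: landing path, table names, and columns are assumed,
# and Delta Lake is assumed to be available on the cluster.
spark = SparkSession.builder.appName("lakehouse-sketch").getOrCreate()

# Bronze: land raw events as-is in a Delta table.
raw = spark.read.json("/mnt/landing/events/")
raw.write.format("delta").mode("append").saveAsTable("bronze_events")

# Silver: deduplicate and standardise types for downstream consumers.
silver = (
    spark.read.table("bronze_events")
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver_events")
```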
**Must be eligible for SC clearance**
**Nice to Have:**
+ Cloud-native architecture expertise - Hands-on experience with AWS/Azure/GCP, data lakes, real-time streaming, and infrastructure-as-code
+ Presales & business development experience - Track record supporting opportunity qualification, bid reviews, proposal development, and client-facing sales activities
+ Data governance & compliance - Strong background in security frameworks, regulatory compliance (GDPR), data lineage, and quality management
+ AI/ML integration capabilities - Experience with MLOps, analytics platforms, and integrating AI/ML into data architectures
+ Agile delivery & thought leadership - Proven agile/hybrid delivery experience with contribution to practice growth through proposition development and knowledge sharing
**Experience, Qualifications:**
+ Experience: Minimum 8-12 years in data architecture, engineering, or consulting, with at least 4 years in modern big data solution architecture.
+ Education: University degree required.
+ Preferred: BSc/MSc in Computer Science, Data Engineering, or related field.
+ Relevant certifications in Databricks, Kafka, or cloud platforms highly desirable.
**Who we are:**
We're a business with a global reach that empowers local teams, and we undertake hugely exciting work that is genuinely changing the world. Our advanced portfolio of consulting, applications, business process, cloud, and infrastructure services will allow you to achieve great things by working with brilliant colleagues, and clients, on exciting projects.
Our inclusive work environment prioritises mutual respect, accountability, and continuous learning for all our people. This approach fosters collaboration, well-being, growth, and agility, leading to a more diverse, innovative, and competitive organisation. We are also proud to share that we have a range of Inclusion Networks such as: the Women's Business Network, Cultural and Ethnicity Network, LGBTQ+ & Allies Network, Neurodiversity Network and the Parent Network.
For more information on Diversity, Equity and Inclusion please click here: Creating Inclusion Together at NTT DATA UK | NTT DATA.
**What we'll offer you:**
We offer a range of tailored benefits that support your physical, emotional, and financial wellbeing. Our Learning and Development team ensure that there are continuous growth and development opportunities for our people. We also offer the opportunity to have flexible work options.
You can find more information about NTT DATA UK & Ireland here.
We are an equal opportunities employer. We believe in the fair treatment of all our employees and commit to promoting equity and diversity in our employment practices. We are also a proud Disability Confident Committed Employer - we are committed to creating a diverse and inclusive workforce. We actively collaborate with individuals who have disabilities and long-term health conditions which have an effect on their ability to do normal daily activities, ensuring that barriers are eliminated when it comes to employment opportunities. In line with our commitment, we guarantee an interview to applicants who declare to us, during the application process, that they have a disability and meet the minimum requirements for the role. If you require any reasonable adjustments during the recruitment process, please let us know. Join us in building a truly diverse and empowered team.
Senior Data Engineer - Big Data & Analytics
Posted 1 day ago
Job Description
Key Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines and ETL/ELT processes using big data technologies (e.g., Spark, Hadoop ecosystem); a small example is sketched after this list.
- Develop and optimize data models and structures within data warehouses and data lakes.
- Ensure data quality, integrity, and security across all data systems.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver data solutions.
- Implement and manage cloud-based data platforms (e.g., AWS Redshift, S3; Azure Data Lake, SQL Data Warehouse; GCP BigQuery).
- Monitor data pipeline performance, troubleshoot issues, and implement performance enhancements.
- Develop and maintain data dictionaries, metadata management, and documentation.
- Stay abreast of emerging trends and technologies in big data and data engineering.
- Mentor junior data engineers and contribute to best practices within the data team.
- Automate data processes and infrastructure where possible.
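
As a rough illustration of the pipeline work described in the first bullet, the sketch below shows a small PySpark batch ETL job; the bucket, columns, and aggregation are assumptions for illustration only:

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative only: source paths, join key, and metrics are assumed.
spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")
customers = spark.read.parquet("s3://example-bucket/raw/customers/")

# Transform: enrich orders with customer attributes and aggregate by day and country.
daily_revenue = (
    orders.join(customers, "customer_id", "left")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "country")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Load: write the curated aggregate for the analytics layer.
daily_revenue.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_revenue/")
```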
Qualifications and Experience:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
- Minimum of 5-7 years of experience in data engineering or a related role, with a strong focus on big data technologies.
- Proficiency in programming languages such as Python, Scala, or Java.
- Hands-on experience with distributed data processing frameworks like Apache Spark.
- Strong SQL skills and experience with various database systems (relational and NoSQL).
- Experience with cloud data platforms (AWS, Azure, GCP) is essential.
- Solid understanding of data warehousing concepts, data modeling, and ETL/ELT principles.
- Excellent problem-solving, analytical, and critical thinking skills.
- Strong communication and collaboration skills, with the ability to work effectively in a remote team.
- Experience with containerization technologies (e.g., Docker, Kubernetes) is a plus.
- Familiarity with data visualization tools is beneficial.
Lead Data Engineer - Big Data Platforms
Posted 15 days ago
Job Description
The ideal candidate will possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field, with a minimum of 7 years of professional experience in data engineering. Extensive hands-on experience with distributed big data technologies such as Apache Spark, Hadoop, Kafka, and cloud-based data services (e.g., AWS EMR, Azure Databricks, Google Cloud Dataflow) is essential. Proficiency in SQL and NoSQL databases, as well as strong programming skills in Python or Scala, are required. You should have a deep understanding of data warehousing concepts, data modeling, and ETL/ELT best practices. Experience in designing and implementing robust data governance and data quality frameworks is highly desirable. Strong leadership qualities, with the ability to mentor junior engineers, guide technical strategy, and collaborate effectively with cross-functional teams (data scientists, analysts, business stakeholders), are paramount.
This role offers a unique opportunity to shape the future of data architecture within a dynamic and growing company. You will be instrumental in architecting solutions that handle vast amounts of data, unlock new insights, and drive business value. The position involves a hybrid working model, allowing for flexibility while fostering team collaboration. You will be based in our state-of-the-art offices in London, England, UK, working with cutting-edge technologies and solving complex data challenges. Join a team committed to excellence and innovation in the heart of the tech industry.
Key Responsibilities:
- Design, build, and maintain scalable big data pipelines and infrastructure.
- Develop and optimize ETL/ELT processes for data ingestion and transformation (a streaming ingestion example is sketched after this list).
- Manage and enhance data lakes and data warehouses.
- Ensure data quality, integrity, and reliability.
- Implement data governance and security best practices.
- Lead and mentor a team of data engineers.
- Collaborate with data scientists and analysts to meet their data needs.
- Evaluate and adopt new data technologies and tools.
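
To make the streaming side of this concrete, here is a minimal Spark Structured Streaming sketch that ingests a Kafka topic into the lake; the broker address, topic, schema, and paths are assumptions, and the spark-sql-kafka connector would need to be on the classpath:

```python
from pyspark.sql import SparkSession, functions as F, types as T

# Illustrative only: broker, topic name, schema, and paths are assumed.
spark = SparkSession.builder.appName("clickstream-ingest").getOrCreate()

event_schema = T.StructType([
    T.StructField("user_id", T.StringType()),
    T.StructField("page", T.StringType()),
    T.StructField("event_ts", T.TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append micro-batches to the lake; the checkpoint enables recovery after failure.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/lake/clickstream/")
    .option("checkpointLocation", "/data/checkpoints/clickstream/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```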
Senior Data Engineer - Big Data & Cloud
Posted 17 days ago
Job Description
Responsibilities:
- Design, construct, install, and maintain scalable data pipelines and data warehousing solutions.
- Develop and implement ETL/ELT processes for ingesting, transforming, and loading data from various sources.
- Optimize data storage and processing for performance, reliability, and cost-efficiency on cloud platforms.
- Build and maintain data infrastructure using big data technologies such as Spark, Hadoop, and Kafka.
- Collaborate with data scientists, analysts, and other engineers to understand data needs and deliver solutions.
- Implement data governance, quality, and security best practices (a data-quality check is sketched after this list).
- Monitor data systems, troubleshoot issues, and implement solutions to ensure high availability.
- Develop and maintain robust data models for analytical purposes.
- Stay current with emerging technologies and trends in data engineering and big data.
- Mentor junior data engineers and contribute to the team's technical growth.
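
As an illustration of the data-quality bullet above, a lightweight gate like the sketch below can run at the end of a pipeline; the dataset, columns, and expectations are assumptions, not an actual framework used here:

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative only: dataset path, column names, and expectations are assumed.
spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("/data/curated/daily_revenue/")

total_rows = df.count()
null_dates = df.filter(F.col("order_date").isNull()).count()
negative_revenue = df.filter(F.col("revenue") < 0).count()

# Fail the run early and loudly if basic expectations are violated.
assert total_rows > 0, "dataset is empty"
assert null_dates == 0, f"{null_dates} rows have a null order_date"
assert negative_revenue == 0, f"{negative_revenue} rows have negative revenue"
print(f"Data quality checks passed for {total_rows} rows")
```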
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
- 5+ years of experience in data engineering, with a focus on big data and cloud environments.
- Proficiency in at least one major cloud platform (AWS, Azure, or GCP) and its data services.
- Strong experience with programming languages like Python, Scala, or Java.
- Expertise in SQL and experience with distributed data processing frameworks (e.g., Apache Spark, Hadoop ecosystem).
- Solid understanding of data warehousing concepts, data modeling, and ETL/ELT design patterns.
- Experience with real-time data streaming technologies (e.g., Kafka, Kinesis) is a plus.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in a remote, fast-paced environment.
- This is a fully remote role, ideal for a skilled Data Engineer located near **Manchester, Greater Manchester, UK**, or elsewhere.
Senior Data Engineer - Big Data & Cloud
Posted 19 days ago
Job Description
Responsibilities:
- Design, develop, and optimize robust and scalable ETL/ELT data pipelines using cloud-based technologies.
- Implement and manage data warehousing solutions and data lake architectures on platforms like AWS, Azure, or GCP.
- Ensure data quality, integrity, and reliability across all data systems.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver effective solutions.
- Develop and maintain data models that support business needs and analytical capabilities.
- Automate data processes and implement monitoring solutions to ensure system performance and reliability.
- Optimize data storage and retrieval processes for performance and cost-efficiency (a partitioning example is sketched after this list).
- Contribute to the development and enforcement of data governance policies and best practices.
- Troubleshoot and resolve complex data-related issues.
- Mentor junior data engineers and contribute to team knowledge sharing.
- Stay current with emerging technologies and trends in big data and data engineering.
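
To illustrate the storage-optimisation bullet, the sketch below partitions a dataset on write so that date-filtered queries prune files instead of scanning everything; the paths and columns are assumptions only:

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative only: input/output paths and the partition column are assumed.
spark = SparkSession.builder.appName("partitioned-write").getOrCreate()
events = spark.read.parquet("/data/raw/events/")

(
    events.withColumn("event_date", F.to_date("event_ts"))
    .repartition("event_date")            # group rows so each partition writes fewer, larger files
    .write.mode("overwrite")
    .partitionBy("event_date")            # physical layout: .../event_date=2024-01-01/...
    .parquet("/data/curated/events/")
)
```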
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
- Proven experience (5+ years) in data engineering, with a strong focus on building and managing large-scale data pipelines.
- Expertise in at least one major cloud platform (AWS, Azure, GCP) and their data services (e.g., S3, Redshift, BigQuery, Snowflake, Databricks).
- Proficiency in programming languages such as Python, SQL, or Scala.
- Experience with big data technologies like Spark, Hadoop, Kafka, etc.
- Strong understanding of data warehousing concepts, data modeling, and database design.
- Familiarity with containerization technologies (e.g., Docker, Kubernetes) is a plus.
- Excellent problem-solving skills and ability to work with complex datasets.
- Strong communication and collaboration skills, with the ability to translate technical concepts to non-technical stakeholders.
- Ability to work effectively in a hybrid work environment.
Remote Lead Data Scientist - Cloud & Big Data
Posted 10 days ago
Job Description
Key Responsibilities:
- Lead and mentor a team of data scientists and engineers.
- Develop and implement advanced statistical and machine learning models (a minimal training example is sketched after this list).
- Design and optimize big data processing pipelines on cloud platforms.
- Collaborate with stakeholders to identify business opportunities for data science.
- Translate complex analytical results into actionable business insights.
- Define and manage the technical strategy for data science projects.
- Ensure the scalability, reliability, and performance of data science solutions.
- Stay abreast of the latest advancements in AI, ML, and big data technologies.
- Drive innovation and best practices in data science within the organization.
- Oversee data governance and ethical considerations in model development.
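
For a flavour of the modelling work in the second bullet, here is a minimal scikit-learn sketch using synthetic data; nothing in it reflects the organisation's actual models or data:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic data stands in for a real feature set; features and labels are made up.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# A scaling + classifier pipeline keeps preprocessing and the model bundled together.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out ROC AUC: {auc:.3f}")
```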
Qualifications:
- PhD or Master's degree in Computer Science, Statistics, Mathematics, or a related quantitative field.
- Extensive experience in data science, with a proven track record of leading successful projects.
- Deep expertise in machine learning, statistical modeling, and data mining techniques.
- Strong proficiency in Python or R, and SQL.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP) and big data technologies (Spark, Hadoop).
- Demonstrated leadership and team management skills.
- Excellent problem-solving, analytical, and critical thinking abilities.
- Outstanding communication and presentation skills, with the ability to explain complex concepts to non-technical audiences.
- Experience with data visualization tools.
- Ability to thrive in a remote, collaborative environment.
Project Manager (AI & Big Data) - Abu Dhabi Relocation
Posted today
Job Description
Location: Abu Dhabi
Presight is looking for a dynamic, self-motivated Project Manager to serve as the primary business contact between Presight AI and its customers, supporting all initiatives and customer requests through astute account management, efficient project and program management, and in-depth product knowledge.
Key Responsibilities:
You will shape the growth curve of Presight by being the hands-on point of contact in the conceptualization and development of cutting-edge, next-gen analytics solutions for its international customer base. You will be responsible for the end-to-end delivery of projects and programs to the customer, through all deal phases.
- Ensure customer requirements and needs are clearly understood, met, and final delivery is made effectively on time.
- Serve as primary point of contact for any and all matters specific to customers and develop a trusted advisor relationship with key accounts, customer stakeholders, and executive sponsors by understanding their business needs and technical challenges.
- Use project management methodologies to provide milestones and timelines to set expectations with customer teams prior to beginning any new project initiatives.
- Take ownership of customer retention, sales, revenue generation, and project management activities.
- Ensure the solution environments remain operationally healthy while reducing cost and complexity with high levels of performance, security, scalability, maintainability, reusability, and reliability upon deployment.
- Accelerate customer adoption of Presight by leading the implementation journey.
- Engage customers in the development of reporting elements, scheduling, establishing guidelines, and setting milestones to measure success.
- Manage project timelines, milestones, migration goals and business transformation strategies, and ensure that all processes & procedures are executed within agreed timeframes at defined quality standards.
- Utilize technical acumen and drive technical discussions regarding incidents, trade-offs, and risk management.
- Provide detailed reviews of service disruptions, metrics, prelaunch planning, post-sales, and consultative expertise.
- Ensure that customer issues are dealt with in an efficient manner, following predefined escalation processes for any problems that may arise.
- Design and develop bi-weekly presentations called ‘Demo for customers’ for General Manager, CTO and other stakeholders.
- Comply with QHSE (Quality Health Safety and Environment), Business Continuity, Information Security, Privacy, Risk, Compliance Management and Governance of Organizations policies, procedures, plans and related risk assessments.
Requirements:
- Master's Degree in Artificial Intelligence, Business Analytics, Business Administration or related field.
- Minimum 10 years of experience in project management and solution/product delivery in an AI service organization or with analytics platforms & technologies.
- Previous experience in the IT or finance industry is a nice-to-have.
- PMP certification from PMI is a must.