454 Spark jobs in the United Kingdom
Spark/Scala Developer
Posted 2 days ago
Job Description
Get the future you want!
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you’d like, where you’ll be supported and inspired by a collaborative community of colleagues around the world, and where you’ll be able to reimagine what’s possible. Join us and help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
Your Role
We are looking for a skilled Spark/Scala Developer to join our data engineering team. The ideal candidate will have hands-on experience in designing, developing, and maintaining large-scale data processing pipelines using Apache Spark and Scala. You will work closely with data scientists, analysts, and engineers to build efficient data solutions and enable data-driven decision-making.
Key Responsibilities:
- Develop, optimize, and maintain data pipelines and ETL processes using Apache Spark and Scala (see the sketch after this list).
- Design scalable and robust data processing solutions for batch and real-time data.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Perform data ingestion, transformation, and cleansing from various structured and unstructured sources.
- Monitor and troubleshoot Spark jobs, ensuring high performance and reliability.
- Write clean, maintainable, and well-documented code.
- Participate in code reviews, design discussions, and agile ceremonies.
- Implement data quality and governance best practices.
- Stay updated with emerging technologies and propose improvements.
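To make the pipeline work described above concrete, here is a purely illustrative Spark/Scala sketch of a minimal batch ETL job: it ingests raw CSV events, cleanses and de-duplicates them, and writes partitioned Parquet. The storage paths, column names, and dataset are assumptions for the example, not details taken from this posting.

```scala
// Hypothetical batch ETL sketch: ingest raw CSV events, cleanse them, write Parquet.
// Paths and column names are illustrative assumptions, not details from the posting.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object CustomerEventsEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("customer-events-etl")
      .getOrCreate()

    // Ingest: raw CSV with a header row; schema inference keeps the sketch short.
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("s3a://example-bucket/raw/customer_events/")

    // Cleanse and transform: drop rows missing key fields, de-duplicate on the
    // event identifier, and derive an event_date column for partitioning.
    val cleaned = raw
      .na.drop(Seq("customer_id", "event_time"))
      .dropDuplicates("event_id")
      .withColumn("event_date", to_date(col("event_time")))

    // Load: write partitioned Parquet for downstream analytics.
    cleaned.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://example-bucket/curated/customer_events/")

    spark.stop()
  }
}
```

A production pipeline would normally add an explicit schema, data-quality checks, and job monitoring, but the read-transform-write shape shown here is the core of the role.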
About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. Get the future you want.
Software Engineer - SPARK - Bath
Posted 1 day ago
Job Description
At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world’s most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities. Where you can make a difference. Where no two days are the same.
Your role
Capgemini Engineering is looking for software engineers to join our High Integrity Software team. You will join a diverse team of excellent and experienced engineers who are committed to ensuring our projects meet our customers’ high expectations.
- We develop award-winning software for safety-related and mission-critical applications across Aerospace, Defence, Rail, Automotive, and Energy sectors, often supporting critical national infrastructure.
- Our projects deliver transformative improvements in operational efficiency and safety, earning high respect and trust from our customers.
Your Profile
Essential
- Background in Software Testing and/or Software development.
- Experience with static code analysis or with formal methods (SPARK, Ada, the Z notation).
- Experience working in a safety-critical environment, particularly in the nuclear industry.
- Experience with SPARK and/or the Ada programming language.
- Excellent interpersonal skills and the ability to quickly build rapport with others.
- Self-motivated and able to use initiative.
- Organised, with good time management and prioritisation skills.
- Methodical approach to work, with good attention to detail and strong logic and reasoning skills.
- Willingness to learn new languages, skills and techniques.
Desirable
- Degree qualified in software engineering or related/relevant subject.
- Experience in at least one of the following markets: aerospace, transport, defence, rail, automotive.
- Knowledge and experience in static analysis techniques
If you're excited about this role but don’t meet every requirement, we still encourage you to apply; your unique experience could be just what we need.
What you’ll love about working here
- Well-being hub and different wellbeing initiatives
- Hybrid working up to 70%
- Possibility to work up to 45 days per year from abroad
- Open access to digital learning platforms
- Active employee networks promoting diversity, equity and inclusion, such as OutFront and CapAbility, among others
Need to know
- This is a hybrid role; presence in the office is required 2-3 days per week.
- All roles require a level of security clearance: BPSS, Security Clearance, or Developed Vetting.
- You can bring your whole self to work. At Capgemini, building an inclusive future is part of everyday life and will be part of your working reality. We have built a representative and welcoming environment for everyone.
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
Data Processing Operator (6 Month Fixed Term Contract)
Posted 13 days ago
Job Description
Spotlight Sports Group is a global media and technology company specialising in content and data within sports betting, horse racing and fantasy sports. With over 400 employees, the group operates multiple award-winning brands, including Racing Post, the world’s largest horse racing affiliate, Pickswise, myracing and Free Super Tips. We partner with leading operators across the betting industry to produce and build multilingual, best-in-class digital products and content to engage and educate customers. ICS-digital, an international marketing agency including ICS-translate, also operates under the group.
This role will be working from our London office 3 days a week and 2 days working from home.
Job Purpose:
To provide a professional service inputting into and maintaining the horse racing database, ensuring a high degree of accuracy. The data held in the database is used to provide content for Racing Post’s publications and website, and is also syndicated to other publishers and betting operators.
Key Accountabilities/Responsibilities:
- Collate and input horse racing data into the company databases (Sybase and/or GRP) to a high degree of accuracy.
- Process horse racing data and maintain the database to the standards required.
- Manage and create silks images for international horse racing.
- Ensure all B2B products are released to clients in a timely fashion. Where issues occur, lead the resolution process and keep B2B clients informed.
- Train new or inexperienced staff.
- Write for various Racing Post papers/publications as and when required.
- Proofread work produced, ensuring the delivery of error-free pages (web and print).
- Occasionally take on other production and writing duties as required.
- Quality assurance.
Key Relationships:
- Content, including other Data Operations teams
- B2B
- External clients.
Skills and Attributes:
Essential
- Good knowledge of horse racing, greyhounds and sport.
- Ability to produce work to the highest standard.
- Outstanding attention to detail.
- Excellent team working skills.
- Excellent organisation skills with the ability to understand schedules and meet deadlines.
- Ability to work efficiently under pressure.
- Good communication skills.
- IT literate, including good operational knowledge of Microsoft or Google packages.
Desirable
- Previous experience of working in a publishing environment.
- Previous experience with InDesign, Adobe Photoshop, Google Drive
Benefits
We offer a range of well-being initiatives, including private medical insurance, excellent parental leave, a working globally policy, mental health support, assistance programs, and social gatherings. We also provide a pension scheme and various other benefit schemes. Plus, we all get our birthdays off work and enjoy 25 days of holiday per year, as well as the opportunity to buy 5 additional days per year and you can be flexible about when you use your public holidays.
We’ve also got you covered with life assurance and exclusive perks like the Star card and our Step Further Awards (our employee recognition program) to recognise your dedication. For those working via the hybrid model (in the office and at home) we’ve made commuting easier with our Season Ticket Loan and Cycle to Work Scheme.
You can also take advantage of complimentary access to our Racing Post Members Club, complete with an Ultimate Membership. We believe in making a positive impact beyond the workplace, and you'll have the chance to volunteer two days per year with our charity partner, Autism in Racing.
Big Data Engineer
Posted 6 days ago
Job Description
RELOCATION ASSISTANCE: Relocation assistance may be available
CLEARANCE TYPE: UK-Highest Level of Government Clearance
TRAVEL: Yes, 10% of the Time
**Salary: £77,400 - £116,000**
**Define Possible at Northrop Grumman UK**
At Northrop Grumman UK, our mission is to solve the most complex challenges by shaping the technology and solutions of tomorrow. We call it Defining Possible.
This mind-set goes beyond our customer solutions; it's the foundation for your career development and the impact we have within the community. So, what's your possible?
**Opportunity:**
This is more than just a job; it's a mission.
As a Big Data Systems Engineer, you will be responsible for designing, implementing, and maintaining scalable big data systems that support our data science, machine learning, and AI workloads. You will work closely with cross-functional teams to ensure seamless data integration, security, and compliance with GDPR and privacy regulations. Your expertise in scripting, troubleshooting, and integration testing will be essential in optimizing our data pipelines and orchestration processes.
Our UK Cyber & Intelligence business combines modern software development approaches with a rich heritage and experience in the Defence and security sectors. Our customers have complex and sensitive data and information requirements requiring a mission partner who quickly understands the context, delivering and sustaining a portfolio of challenging technology projects at scale and pace, supporting them through an ambitious digital transformation programme.
If you are looking for a career with meaning where you can make a difference to the nation's security and support critical missions, then look no further.
_"My purpose; to lead a team of engineers with the brightest minds, to push the boundaries and define possible together."_
**Role responsibilities:**
+ Scripting and Automation: Develop and maintain scripts for automating data processes and workflows.
+ Troubleshooting: Diagnose and resolve technical issues related to big data systems, data pipelines, and integrations.
+ Integration Testing: Conduct thorough testing to ensure seamless integration of new tools and technologies into existing systems.
+ Linux Internals: Utilize in-depth knowledge of Linux internals to optimize performance and reliability of big data infrastructure.
+ Network Protocols: Apply understanding of TCP/IP and OSI models in the design and troubleshooting of networked systems.
+ Data Pipeline Support: Manage and optimize data pipelines and orchestration tools such as NiFi and Airflow.
+ Scalable System Design: Design scalable big data systems with a focus on security, GDPR compliance, and privacy.
+ Hybrid Cloud Management: Leverage hybrid cloud experience to manage on-premise and AWS cloud resources effectively.
+ Data Science Support: Support data science, machine learning, and AI workloads using tools like Jupyter, spaCy, Transformers, and NLTK.
+ Big Data Platforms: Utilize big data NoSQL engines and platforms such as Hive, Impala, and Elasticsearch for data storage and processing (a minimal Spark-on-Hive sketch follows this list).
+ BI and Visualization: Implement and support business intelligence and visualization tools like Tableau, Kibana, and Power BI to provide actionable insights.
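As a purely illustrative example of the Hive-based processing mentioned in the Big Data Platforms responsibility above, the following Scala sketch queries a Hive-backed table from Spark and publishes a daily aggregate. The database, table, and column names are assumptions, and a configured Hive metastore is presumed; none of it is taken from the actual systems behind this role.

```scala
// Hypothetical Spark-on-Hive sketch: read a Hive table, aggregate, and save the
// result back as another Hive table. Database, table, and column names are assumptions.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyEventCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-event-counts")
      .enableHiveSupport() // requires a Hive metastore to be configured
      .getOrCreate()

    // Read an existing Hive table via Spark SQL.
    val events = spark.sql("SELECT event_type, event_date FROM analytics.events")

    // Aggregate events per type per day.
    val counts = events
      .groupBy("event_date", "event_type")
      .agg(count(lit(1)).as("event_count"))

    // Persist the result as another Hive table for downstream BI tools to query.
    counts.write
      .mode("overwrite")
      .saveAsTable("analytics.daily_event_counts")

    spark.stop()
  }
}
```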
**We are looking for:**
+ Strong troubleshooting skills and the ability to diagnose and resolve complex technical issues.
+ Experience with integration testing and ensuring seamless tool integration.
+ In-depth knowledge of Linux internals and system administration.
+ Understanding of TCP/IP and OSI models.
+ Hands-on experience with data pipeline tools like NiFi and Airflow.
+ Proven ability to design scalable big data systems with a focus on security and GDPR compliance.
+ Hybrid cloud experience, specifically with on-premise and AWS cloud environments.
+ Familiarity with data science, machine learning, and AI tools such as Jupyter, spaCy, Transformers, and NLTK.
+ Experience with big data NoSQL engines/platforms such as Hive, Impala, and Elasticsearch.
+ Proficiency with business intelligence and visualization tools like Tableau, Kibana, and Power BI.
+ Excellent communication and collaboration skills.
**Preferred Qualifications:**
+ Certification in AWS or other cloud platforms.
+ Experience with additional data orchestration tools.
+ Familiarity with other big data tools and technologies.
+ Previous experience in a similar role within a dynamic and fast-paced environment.
Experience in Cloudera, Hadoop, Cloudera Data Science Workbench (CDSW), Cloudera Machine Learning (CML) would be highly desirable.
**Work Environment:**
+ Full-time on-site presence required.
**If you don't meet every single requirement, we still encourage you to apply.**
Sometimes, people hesitate to apply because they can't tick every box. We encourage you to apply if you believe the role will suit you well, even if you don't meet all the criteria. You might be exactly who we are looking for, either for this position or for our opportunities at Northrop Grumman UK. We are on an exciting growth trajectory and growing our teams across the UK.
**Security clearance:**
You must hold the highest level of UK Government security clearance. Our recruitment team is on hand to answer any questions and will guide you through the process.
**Benefits:**
We can offer you a range of flexible working options to suit you, including an optional compressed working schedule with every other Friday off. Our benefits include private health care, a cash health plan, holiday buy and sell, career development opportunities and performance bonuses. For a comprehensive list of benefits, speak to our recruitment team.
**Why join us?**
+ **A mission to believe in** **-** Every day we contribute to building a more secure and connected world, expanding our reach from land, sea, and air to space and cyberspace. From engineering data and intelligence solutions, to developing maritime navigation and control systems and innovating command and control systems for the UK and NATO, what we do together matters.
+ **A place to belong and thrive** **-** Every voice matters at our table meaning you can bring your authentic self to work. From our Employee Resource Groups backed by thousands of employees to our partnerships with the Association For Black and Minority Ethnic Engineers, Forces Transition Group, Mind, and Women in Defence - we are passionate about growing and supporting our inclusive community where everyone can belong.
+ **Your career, your way** - Shape your career journey with diverse roles, mentorship, and development opportunities that fuel your curiosity, channel your expertise, and nurture your passion. Looking for flexibility? Balance your professional career with your personal life through our health and wellbeing benefits, discount schemes, and investment in your future development. Speak to our team to find the balance that's right for you.
**Ready to apply?**
**Yes** - Submit your application online. Your application will be reviewed by our team and we will be in touch.
**Possibly, I'd like to find out more about this role** - Reach out to our team for more information and support.
**No, I don't think this role is right for me** - Our extensive UK growth means we have exciting, new opportunities opening all the time. Speak to our team to discuss your career goals.
Northrop Grumman is committed to hiring and retaining a diverse workforce, and encourages individuals from all backgrounds and all abilities to apply and consider becoming a part of our diverse and inclusive workforce.
Big Data Architect
Posted 2 days ago
Job Description
For this role, you will be responsible for defining the architecture framework that meets the Big Data needs of a data-driven company.
Essential requirements:
- More than 3 years of presales experience in the design of Big Data and Data analytics solutions according to customer requirements
- Previous experience preparing high-quality, engaging customer presentations; excellent communication skills; experience in conversations at CxO level; and the ability to adapt the message to customer feedback.
- Experience preparing RFP responses: organizing the offer solution team, defining the solution, and estimating effort and cost.
- Past experience in dealing with partners, tools vendors, etc.
- Business Domain Knowledge
- More than 5 years of experience in Big Data implementation projects
- Experience in defining Big Data architectures with different tools and environments: cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc.
- Past experience in Data Engineering and data quality tools (Informatica, Talend, etc.)
- Previous experience working in a multilingual and multicultural environment.
- Proactive, passionate about technology, and highly motivated.
Desirable requirements:
- Experience in data analysis and visualization solutions: MicroStrategy, Qlik, Power BI, Tableau, Looker, etc.
- Background in Data Governance and Data Catalog solutions: Axon, Informatica EDC, Collibra, Purview, etc.
- Previous experience in Artificial Intelligence techniques: ML/Deep Learning, Computer Vision, NLP, etc.
General information:
- Start Date: ASAP
- Length of Contract: 1 year (minimum)
- Work Location: Madrid
- Remote working (occasional on-site presence at the customer's office in Madrid may be required).
We look forward to receiving your application!
Senior Data Engineer - Big Data & Analytics
Posted 1 day ago
Job Description
Key Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines and ETL/ELT processes using big data technologies (e.g., Spark, Hadoop ecosystem).
- Develop and optimize data models and structures within data warehouses and data lakes.
- Ensure data quality, integrity, and security across all data systems (a minimal data-quality check is sketched after this list).
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver data solutions.
- Implement and manage cloud-based data platforms (e.g., AWS Redshift, S3; Azure Data Lake, SQL Data Warehouse; GCP BigQuery).
- Monitor data pipeline performance, troubleshoot issues, and implement performance enhancements.
- Develop and maintain data dictionaries, metadata management, and documentation.
- Stay abreast of emerging trends and technologies in big data and data engineering.
- Mentor junior data engineers and contribute to best practices within the data team.
- Automate data processes and infrastructure where possible.
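To give the data-quality responsibility above a concrete shape, here is a hypothetical Scala/Spark sketch that profiles null rates and checks primary-key uniqueness before a dataset is released downstream. The table location, key column, and failure policy are assumptions made for the example.

```scala
// Hypothetical data-quality check: profile null rates and detect duplicate keys
// in a curated table. The path and column names are assumptions.
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object OrdersQualityCheck {
  // Null rate per column, computed as a single global aggregation.
  def nullRates(df: DataFrame): DataFrame =
    df.select(df.columns.map { c =>
      (sum(when(col(c).isNull, 1).otherwise(0)) / count(lit(1))).as(s"${c}_null_rate")
    }: _*)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("orders-quality-check").getOrCreate()

    val orders = spark.read.parquet("s3a://example-bucket/curated/orders/")

    // 1. Null-rate profile for every column.
    nullRates(orders).show(truncate = false)

    // 2. Primary-key uniqueness: any order_id appearing more than once is a defect.
    val duplicates = orders.groupBy("order_id").count().filter(col("count") > 1).count()

    // Fail the job loudly if the key constraint is violated.
    require(duplicates == 0, s"Found $duplicates duplicate order_id values")

    spark.stop()
  }
}
```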
Qualifications and Experience:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
- Minimum of 5-7 years of experience in data engineering or a related role, with a strong focus on big data technologies.
- Proficiency in programming languages such as Python, Scala, or Java.
- Hands-on experience with distributed data processing frameworks like Apache Spark.
- Strong SQL skills and experience with various database systems (relational and NoSQL).
- Experience with cloud data platforms (AWS, Azure, GCP) is essential.
- Solid understanding of data warehousing concepts, data modeling, and ETL/ELT principles.
- Excellent problem-solving, analytical, and critical thinking skills.
- Strong communication and collaboration skills, with the ability to work effectively in a remote team.
- Experience with containerization technologies (e.g., Docker, Kubernetes) is a plus.
- Familiarity with data visualization tools is beneficial.
Lead Data Engineer - Big Data Platforms
Posted 15 days ago
Job Description
The ideal candidate will possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field, with a minimum of 7 years of professional experience in data engineering. Extensive hands-on experience with distributed big data technologies such as Apache Spark, Hadoop, Kafka, and cloud-based data services (e.g., AWS EMR, Azure Databricks, Google Cloud Dataflow) is essential. Proficiency in SQL and NoSQL databases, as well as strong programming skills in Python or Scala, are required. You should have a deep understanding of data warehousing concepts, data modeling, and ETL/ELT best practices. Experience in designing and implementing robust data governance and data quality frameworks is highly desirable. Strong leadership qualities, with the ability to mentor junior engineers, guide technical strategy, and collaborate effectively with cross-functional teams (data scientists, analysts, business stakeholders), are paramount.
This role offers a unique opportunity to shape the future of data architecture within a dynamic and growing company. You will be instrumental in architecting solutions that handle vast amounts of data, unlock new insights, and drive business value. The position involves a hybrid working model, allowing for flexibility while fostering team collaboration. You will be based in our state-of-the-art offices in London, England, UK, working with cutting-edge technologies and solving complex data challenges. Join a team committed to excellence and innovation in the heart of the tech industry.
Key Responsibilities:
- Design, build, and maintain scalable big data pipelines and infrastructure.
- Develop and optimize ETL/ELT processes for data ingestion and transformation.
- Manage and enhance data lakes and data warehouses.
- Ensure data quality, integrity, and reliability.
- Implement data governance and security best practices.
- Lead and mentor a team of data engineers.
- Collaborate with data scientists and analysts to meet their data needs.
- Evaluate and adopt new data technologies and tools.
Senior Data Engineer - Big Data & Cloud
Posted 17 days ago
Job Description
Responsibilities:
- Design, construct, install, and maintain scalable data pipelines and data warehousing solutions.
- Develop and implement ETL/ELT processes for ingesting, transforming, and loading data from various sources.
- Optimize data storage and processing for performance, reliability, and cost-efficiency on cloud platforms.
- Build and maintain data infrastructure using big data technologies such as Spark, Hadoop, and Kafka.
- Collaborate with data scientists, analysts, and other engineers to understand data needs and deliver solutions.
- Implement data governance, quality, and security best practices.
- Monitor data systems, troubleshoot issues, and implement solutions to ensure high availability.
- Develop and maintain robust data models for analytical purposes.
- Stay current with emerging technologies and trends in data engineering and big data.
- Mentor junior data engineers and contribute to the team's technical growth.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
- 5+ years of experience in data engineering, with a focus on big data and cloud environments.
- Proficiency in at least one major cloud platform (AWS, Azure, or GCP) and its data services.
- Strong experience with programming languages like Python, Scala, or Java.
- Expertise in SQL and experience with distributed data processing frameworks (e.g., Apache Spark, Hadoop ecosystem).
- Solid understanding of data warehousing concepts, data modeling, and ETL/ELT design patterns.
- Experience with real-time data streaming technologies (e.g., Kafka, Kinesis) is a plus (a minimal structured-streaming sketch follows this list).
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in a remote, fast-paced environment.
- This is a fully remote role, ideal for a skilled Data Engineer located near **Manchester, Greater Manchester, UK**, or elsewhere.
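As a purely illustrative sketch of the streaming experience mentioned above, the following Scala example uses Spark Structured Streaming to consume JSON click events from Kafka and land them as Parquet. The broker address, topic, schema, and paths are assumptions, and the spark-sql-kafka connector is assumed to be on the classpath.

```scala
// Hypothetical Structured Streaming sketch: consume JSON events from a Kafka topic
// and append them to a Parquet sink. Brokers, topic, schema, and paths are assumptions.
// Requires the spark-sql-kafka-0-10 connector on the classpath.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object ClickStreamIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("clickstream-ingest").getOrCreate()

    val clickSchema = new StructType()
      .add("user_id", StringType)
      .add("url", StringType)
      .add("clicked_at", TimestampType)

    // Read the Kafka topic as an unbounded stream of key/value byte arrays.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")
      .option("subscribe", "clicks")
      .load()

    // Parse the JSON payload into typed columns.
    val clicks = raw
      .select(from_json(col("value").cast("string"), clickSchema).as("click"))
      .select("click.*")

    // Append to Parquet; the checkpoint directory tracks Kafka offsets across restarts.
    val query = clicks.writeStream
      .format("parquet")
      .option("path", "s3a://example-bucket/streaming/clicks/")
      .option("checkpointLocation", "s3a://example-bucket/checkpoints/clicks/")
      .start()

    query.awaitTermination()
  }
}
```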