742 Senior Big Data Engineer Databricks Relocation To Abu Dhabi jobs in the United Kingdom

Senior Big Data Engineer (Databricks) - RELOCATION TO ABU DHABI

SoftServe


Job Description

Please note: this position requires relocation to Abu Dhabi for a minimum period of 12 months. Project duration: 36 months+. SoftServe will support the relocation of selected candidates.


WE ARE

SoftServe is a global digital solutions company with headquarters in Austin, Texas, founded in 1993. Our associates are currently working on 2,000+ projects with clients across North America, EMEA, APAC, and LATAM. We are about people who create bold things, who make a difference, who have fun, and who love their work.

The Big Data & Analytics Center of Excellence is the data consulting and data engineering branch at SoftServe. It started as a group of three enthusiasts back in 2013; today, hundreds of Data Engineers and Architects build end-to-end Data & Analytics solutions, from strategy through technical design and PoC to full-scale implementation. We have customers in the Healthcare, Finance, Manufacturing, Retail, and Energy domains.

We hold top-level partnership statuses with all the major cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others.


IF YOU ARE

Experienced with Python/PySpark

Proficient with the Databricks Lakehouse architecture and its principles

Experienced (2+ years) in designing data models, building ETL pipelines, and wrangling data to solve business problems

Experienced with Azure Modern Data Estate technologies such as Azure Data Factory, Azure DevOps, Azure Synapse, and Azure Data Lake Storage

Skilled in advanced SQL and in working with relational databases

Knowledgeable of database and data warehouse design best practices

Experienced designing, building, and scaling streaming and batch data pipelines

Able to recommend the best solution among several options, conduct trade-off analyses, and solve problems

Comfortable communicating with stakeholders in written and verbal form (Upper-Intermediate English level)


YOU WANT TO

Be part of a team of data-focused Engineers dedicated to continuous learning, improvement, and knowledge sharing every day.

Work with a cutting-edge technology stack, including pioneering services from major cloud providers that are at the forefront of innovation.

Engage with customers of diverse backgrounds, ranging from large global corporations to emerging start-ups preparing to launch their first product.

Be involved in the entire project lifecycle, from initial design and proof of concept (PoC) to minimum viable product (MVP) development and full-scale implementation.


TOGETHER WE WILL

Address different business and technology challenges, engage in impactful projects, use top-notch technologies, and drive multiple initiatives as a part of the Center of Excellence.

Support your technical and personal growth—we have a dedicated career plan for all roles in our company.

Investigate new technologies, build internal prototypes, and share knowledge with the SoftServe Data Community.

Upskill with full access to Udemy learning courses.

Pass professional certifications, encouraged and covered by the company.

Adopt best practices from experts while working in a team of top-notch Engineers and Architects.

Collaborate with world-leading companies and attend professional events.


All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability, sexual orientation, gender identity/expression, or protected veteran status. SoftServe is an Equal Opportunity Employer.

This advertiser has chosen not to accept applicants from your region.

Job No Longer Available

This position is no longer listed on WhatJobs. The employer may be reviewing applications, may have filled the role, or may have removed the listing.

However, we have similar jobs available for you below.

Senior Data Engineer - Abu Dhabi, UAE

Robert Walters

Posted 2 days ago


Job Description

Job Title: Senior Data Engineer


Key Requirements:

  • 4-8 years of experience from tier 1 or tier 2 big tech companies


Job Location:

Abu Dhabi, UAE


Benefits:

  • Work with cutting-edge technology through modern infrastructure and automation projects
  • Thrive in a growth-focused environment that prioritizes learning, innovation, and career development
  • Competitive salary and a comprehensive benefits package


Job Summary: As a Senior Data Engineer, you will be responsible for designing, developing, and maintaining advanced, scalable data systems that power critical business decisions. You will lead the development of robust data pipelines, ensure data quality and governance, and collaborate across cross-functional teams to deliver high-performance data platforms in production environments. This role requires a deep understanding of modern data engineering practices, real-time processing, and cloud-native solutions.


Key Responsibilities:

  • Data Pipeline Development & Management: Design, implement, and maintain scalable and reliable data pipelines to ingest, transform, and load structured, unstructured, and real-time data feeds from diverse sources.
  • Manage data pipelines for analytics and operational use, ensuring data integrity, timeliness, and accuracy across systems.
  • Implement data quality tools and validation frameworks within transformation pipelines.
  • Data Processing & Optimization: Build efficient, high-performance systems by leveraging techniques like data denormalization, partitioning, caching, and parallel processing.
  • Develop stream-processing applications using Apache Kafka and optimize performance for large-scale datasets.
  • Enable data enrichment and correlation across primary, secondary, and tertiary sources.


  • Cloud, Infrastructure, and Platform Engineering: Develop and deploy data workflows on AWS or GCP, using services such as S3, Redshift, Pub/Sub, or BigQuery.
  • Containerize data processing tasks using Docker, orchestrate with Kubernetes, and ensure production-grade deployment.
  • Collaborate with platform teams to ensure scalability, resilience, and observability of data pipelines.


  • Database Engineering: Write and optimize complex SQL queries on relational (Redshift, PostgreSQL) and NoSQL (MongoDB) databases.
  • Work with ELK stack (Elasticsearch, Logstash, Kibana) for search, logging, and real-time analytics.
  • Support Lakehouse architectures and hybrid data storage models for unified access and processing.


  • Data Governance & Stewardship: Implement robust data governance, access control, and stewardship policies aligned with compliance and security best practices.
  • Establish metadata management, data lineage, and auditability across pipelines and environments.


  • Machine Learning & Advanced Analytics Enablement: Collaborate with data scientists to prepare and serve features for ML models.
  • Maintain awareness of ML pipeline integration and ensure data readiness for experimentation and deployment.


  • Documentation & Continuous Improvement: Maintain thorough documentation including technical specifications, data flow diagrams, and operational procedures.
  • Continuously evaluate and improve the data engineering stack by adopting new technologies and automation strategies.


Required Skills & Qualifications:

  • 8+ years of experience in data engineering within a production environment.
  • Advanced knowledge of Python and Linux shell scripting for data manipulation and automation.
  • Strong expertise in SQL/NoSQL databases such as PostgreSQL and MongoDB.
  • Experience building stream processing systems using Apache Kafka.
  • Proficiency with Docker and Kubernetes in deploying containerized data workflows.
  • Good understanding of cloud services (AWS or Azure).
  • Hands-on experience with ELK stack (Elasticsearch, Logstash, Kibana) for scalable search and logging.
  • Familiarity with AI models supporting data management.
  • Experience working with Lakehouse systems, data denormalization, and data labeling practices.


Preferred Qualifications:

  • Working knowledge of data quality tools, lineage tracking, and data observability solutions.
  • Experience in data correlation, enrichment from external sources, and managing data integrity at scale.
  • Understanding of data governance frameworks and enterprise compliance protocols.
  • Exposure to CI/CD pipelines for data deployments and infrastructure-as-code.


Education & Experience:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
  • Demonstrated success in designing, scaling, and operating data systems in cloud-native and distributed environments.
  • Proven ability to work collaboratively with cross-functional teams including product managers, data scientists, and DevOps.


If you are interested in this exciting opportunity, please don't hesitate to apply.

This advertiser has chosen not to accept applicants from your region.


Big Data Engineer

London, London £100,000 - £115,000 Annually TEC Partners

Posted 4 days ago


Job Description

permanent

Company

TEC Partners are working with a world-leading defence and security company that works in collaboration with governments across the world to ensure international security across land, sea and air. They are leaders in highly sensitive international protection and continually seek to lead the way in innovation in their domain.

About this Big Data Engineer Role

As a Big Data Engineer, you will play a crucial role in creating and maintaining systems that support our client's data science, AI and machine learning workloads. You will work collaboratively with different departments and ensure effective data integration.

Why Work as a Big Data Engineer with Our Client?

  • Competitive salary up to £115,000
  • Flexible working hours
  • Private healthcare
  • Holiday buy and sell
  • Career development
  • Performance bonus

What is Expected of you as a Big Data Engineer with Our Client?

  • UKIC security vetting
  • A minimum of 5 years' experience working with Big Data as a Data Engineer
  • Excellent communication skills and the ability to work collaboratively with the wider team, explaining complex technical details to those of varying technical literacy
  • The ability to create scalable big data pipelines
  • A strong background in automation and scripting to develop and maintain scripts that streamline data workflows.

Responsibilities of a Big Data Engineer with Our Client

  • Troubleshoot big data environments and optimize pipelines
  • Conduct end-to-end testing to ensure seamless tool and system integrations across the data stack.
  • Design and support secure, scalable systems using network protocols (TCP/IP, OSI)
  • Enable machine learning and AI workflows through tools like Jupyter, SpaCy, Transformers, and NLTK.
  • Implement and support BI tools (Tableau, Power BI, Kibana) to drive actionable insights from complex data sets.

If you are interested in this Big Data Engineer role and would like to learn more about it or other Data opportunities, please contact Stuart at TEC Partners today.

This advertiser has chosen not to accept applicants from your region.

Big Data Engineer

London, London TEC Partners - Technical Recruitment Specialists

Posted 1 day ago


Job Description

Company

TEC Partners are working with a world-leading defence and security company that works in collaboration with governments across the world to ensure international security across land, sea and air. They are leaders in highly sensitive international protection and continually seek to lead the way in innovation in their domain.


About this Big Data Engineer Role

As a Big Data Engineer, you will play a crucial role in creating and maintaining systems that support our client’s data science, AI and machine learning workloads. You will work collaboratively with different departments and ensure effective data integration.


Why Work as a Big Data Engineer with Our Client?

  • Competitive salary up to £115,000
  • Flexible working hours
  • Private healthcare
  • Holiday buy and sell
  • Career development
  • Performance bonus


What is Expected of you as a Big Data Engineer with Our Client?

  • You will need to hold current eDV or UKIC security vetting
  • A minimum of 5 years' experience working with Big Data as a Data Engineer
  • Excellent communication skills and the ability to work collaboratively with the wider team, explaining complex technical details to those of varying technical literacy
  • The ability to create scalable big data pipelines
  • A strong background in automation and scripting to develop and maintain scripts that streamline data workflows.


Responsibilities of a Big Data Engineer with Our Client

  • Troubleshoot big data environments and optimize pipelines
  • Conduct end-to-end testing to ensure seamless tool and system integrations across the data stack.
  • Design and support secure, scalable systems using network protocols (TCP/IP, OSI)
  • Enable machine learning and AI workflows through tools like Jupyter, SpaCy, Transformers, and NLTK.
  • Implement and support BI tools (Tableau, Power BI, Kibana) to drive actionable insights from complex data sets.


If you are interested in this Big Data Engineer role and would like to learn more about it or other Data opportunities, please contact Stuart at TEC Partners today.

This advertiser has chosen not to accept applicants from your region.


Big Data Engineer

London, London Northrop Grumman

Posted 21 days ago


Job Description

UK CITIZENSHIP REQUIRED FOR THIS POSITION: Yes
RELOCATION ASSISTANCE: Relocation assistance may be available
CLEARANCE TYPE: UK-Highest Level of Government Clearance
TRAVEL: Yes, 10% of the Time
**Salary: £77,400 - £116,000**
**Define Possible at Northrop Grumman UK**
At Northrop Grumman UK, our mission is to solve the most complex challenges by shaping the technology and solutions of tomorrow. We call it Defining Possible.
This mind-set goes beyond our customer solutions; it's the foundation for your career development and the impact we have within the community. So, what's your possible?
**Opportunity:**
This is more than just a job; it's a mission.
As a Big Data Systems Engineer, you will be responsible for designing, implementing, and maintaining scalable big data systems that support our data science, machine learning, and AI workloads. You will work closely with cross-functional teams to ensure seamless data integration, security, and compliance with GDPR and privacy regulations. Your expertise in scripting, troubleshooting, and integration testing will be essential in optimizing our data pipelines and orchestration processes.
Our UK Cyber & Intelligence business combines modern software development approaches with a rich heritage and experience in the defence and security sectors. Our customers have complex and sensitive data and information requirements and need a mission partner who quickly understands the context, delivering and sustaining a portfolio of challenging technology projects at scale and pace while supporting them through an ambitious digital transformation programme.
If you are looking for a career with meaning where you can make a difference to the nation's security and support critical missions, then look no further.
_"My purpose; to lead a team of engineers with the brightest minds, to push the boundaries and define possible together."_
**Role responsibilities:**
+ Scripting and Automation: Develop and maintain scripts for automating data processes and workflows.
+ Troubleshooting: Diagnose and resolve technical issues related to big data systems, data pipelines, and integrations.
+ Integration Testing: Conduct thorough testing to ensure seamless integration of new tools and technologies into existing systems.
+ Linux Internals: Utilize in-depth knowledge of Linux internals to optimize performance and reliability of big data infrastructure.
+ Network Protocols: Apply understanding of TCP/IP and OSI models in the design and troubleshooting of networked systems.
+ Data Pipeline Support: Manage and optimize data pipelines and orchestration tools such as NiFi and Airflow.
+ Scalable System Design: Design scalable big data systems with a focus on security, GDPR compliance, and privacy.
+ Hybrid Cloud Management: Leverage hybrid cloud experience to manage on-premise and AWS cloud resources effectively.
+ Data Science Support: Support data science, machine learning, and AI workloads using tools like Jupyter, Spacy, Transformers, and NLTK.
+ Big Data Platforms: Utilize big data NoSQL engines and platforms such as Hive, Impala, and Elasticsearch for data storage and processing.
+ BI and Visualization: Implement and support business intelligence and visualization tools like Tableau, Kibana, and PowerBI to provide actionable insights.
**We are looking for:**
+ Strong troubleshooting skills and the ability to diagnose and resolve complex technical issues.
+ Experience with integration testing and ensuring seamless tool integration.
+ In-depth knowledge of Linux internals and system administration.
+ Understanding of TCP/IP and OSI models.
+ Hands-on experience with data pipeline tools like NiFi and Airflow.
+ Proven ability to design scalable big data systems with a focus on security and GDPR compliance.
+ Hybrid cloud experience, specifically with on-premise and AWS cloud environments.
+ Familiarity with data science, machine learning, and AI tools such as Jupyter, Spacy, Transformers, and NLTK.
+ Experience with big data NoSQL engines/platforms such as Hive, Impala, and Elasticsearch.
+ Proficiency with business intelligence and visualization tools like Tableau, Kibana, and PowerBI.
+ Excellent communication and collaboration skills.
Preferred Qualifications:
+ Certification in AWS or other cloud platforms.
+ Experience with additional data orchestration tools.
+ Familiarity with other big data tools and technologies.
+ Previous experience in a similar role within a dynamic and fast-paced environment.
Experience with Cloudera, Hadoop, Cloudera Data Science Workbench (CDSW), and Cloudera Machine Learning (CML) would be highly desirable.
**Work Environment:**
+ Full time on-site presence required
**If you don't meet every single requirement, we still encourage you to apply.**
Sometimes, people hesitate to apply because they can't tick every box. We encourage you to apply if you believe the role will suit you well, even if you don't meet all the criteria. You might be exactly who we are looking for, either for this position or for our opportunities at Northrop Grumman UK. We are on an exciting growth trajectory and growing our teams across the UK.
**Security clearance:**
You must hold the highest level of UK Government security clearance. Our recruitment team is on hand to answer any questions, and we will guide you through the process.
**Benefits:**
We can offer you a range of flexible working options to suit you, including an optional compressed working schedule with every other Friday off. Our benefits include private health care, a cash health plan, holiday buy and sell, career development opportunities, and performance bonuses. For a comprehensive list of benefits, speak to our recruitment team.
**Why join us?**
+ **A mission to believe in** - Every day we contribute to building a more secure and connected world, expanding our reach from land, sea, and air to space and cyberspace. From engineering data and intelligence solutions, to developing maritime navigation and control systems and innovating command and control systems for the UK and NATO, what we do together matters.
+ **A place to belong and thrive** - Every voice matters at our table, meaning you can bring your authentic self to work. From our Employee Resource Groups backed by thousands of employees to our partnerships with the Association For Black and Minority Ethnic Engineers, Forces Transition Group, Mind, and Women in Defence - we are passionate about growing and supporting our inclusive community where everyone can belong.
+ **Your career, your way** - Shape your career journey with diverse roles, mentorship, and development opportunities that fuel your curiosity, channel your expertise, and nurture your passion. Looking for flexibility? Balance your professional career with your personal life through our health and wellbeing benefits, discount schemes, and investment in your future development. Speak to our team to find the balance that's right for you.
**Ready to apply?**
**Yes** - Submit your application online. Your application will be reviewed by our team and we will be in touch.
**Possibly, I'd like to find out more about this role** - Reach out to our team for more information and support.
**No, I don't think this role is right for me** - Our extensive UK growth means we have exciting, new opportunities opening all the time. Speak to our team to discuss your career goals.
Northrop Grumman is committed to hiring and retaining a diverse workforce, and encourages individuals from all backgrounds and all abilities to apply and consider becoming a part of our diverse and inclusive workforce.
This advertiser has chosen not to accept applicants from your region.

Data Engineer (Databricks & Azure) - Clean Energy

South East, South East Data Science Talent

Posted 3 days ago


Job Description

Data Engineer (Databricks & Azure) - Clean Energy


Location: South East England (Hybrid - 1 day onsite per week)


Salary: £60k - £70k + benefits package


18 months. That’s all the time it took for the client’s Databricks platform to evolve into a key driver of innovative green technologies. Now, they’re looking for someone to take it even further.


Imagine joining a forward-thinking client at the forefront of clean energy innovation. Your work will directly contribute to a zero-carbon future by supporting advancements in electrolysis for green hydrogen production and fuel cells for future power solutions. Through powerful partnerships with major multinational companies, the client’s solid oxide platform is transforming energy systems and helping decarbonise emissions-heavy industries like steelmaking and future fuels.


What’s the Role?


You’ll join a highly skilled data team, part of a broader department focused on modelling and digitalisation. This team develops and maintains a cutting-edge Azure Databricks Data Lakehouse platform to support all core business functions. Your primary goal will be building and maintaining robust, secure data pipelines and models that deliver trusted datasets to internal and external stakeholders, enabling data-driven decisions across the organisation.


As a Data Engineer, you will maintain, monitor, and enhance the Databricks platform that powers the client’s data services. You’ll work on building robust pipelines using Azure Data Lake and Python while collaborating closely with data scientists, simulation engineers, and the wider business.


Reporting to the Head of Data Management, you’ll be a part of a collaborative team focused on data governance, engineering, and strategy. This role offers the chance to make a visible impact in a dynamic, fast-evolving field.



Why Join?


  • IMPACTFUL WORK : Help revolutionise electrolyzer technology, accelerating clean hydrogen production and decarbonisation on a global scale.


  • SEE RESULTS QUICKLY : Your work will directly influence live projects, delivering measurable results in real-world applications.


  • CULTURE OF INNOVATION : Collaborate with forward-thinking professionals in an environment where experimentation and creativity are encouraged.


  • SECURE GROWTH : Join a financially robust organisation investing heavily in cutting-edge technologies and talent development.


  • PURPOSE-DRIVEN MISSION : Be part of a team dedicated to advancing green technologies and creating a sustainable future.



What You Can Add


We’re looking for someone who thrives on solving complex problems and working in fast-paced environments. Here’s what you’ll need:


  • 18 months or more of Databricks experience , with a strong background in managing and maintaining data services on the platform.


  • Expertise in Azure Data Lake , Python , and CI/CD pipelines using Azure DevOps.


  • Practical experience in industries like manufacturing , product development , or automotive , with a focus on real-world applications.


  • Familiarity with data modelling, governance, and digitisation.


  • Bonus: Knowledge of Unity Catalog in Databricks.



Ready to shape the future of clean energy? Apply now to join our client as a Data Engineer and help drive the green energy revolution.

This advertiser has chosen not to accept applicants from your region.

Data Engineer - Big Data

Gloucester, South West £450 - £550 Daily BrightBox Group

Posted 8 days ago


Job Description

contract
Data Engineer
Hybrid working (2-3 days per week in Gloucester)
£450 - £550 per day (inside IR35)

We are seeking a skilled, security-cleared Data Engineer to join our team. The ideal candidate will have a strong background in data architecture and engineering, specifically within the AWS ecosystem. You will be responsible for designing and implementing data pipelines that support large-scale data processing and analytics.

Key Responsibilities:
- Develop, construct, test, and maintain data architectures and data processing systems.
- Collaborate with data scientists and analysts to understand data requirements and ensure data availability.
- Design and optimise data models to support business intelligence and reporting needs.
- Monitor and troubleshoot data systems to ensure high availability and performance.
- Implement best practices for data management, security, and compliance.

Required Skills:
- Active Security Clearance
- Proficiency in AWS services related to data processing and storage, such as Amazon S3, Redshift, and EMR.
- Strong experience with big data technologies and frameworks.
- Solid understanding of data modelling, ETL processes, and database management.
- Ability to work effectively in a team environment and communicate technical concepts clearly to non-technical stakeholders.
This advertiser has chosen not to accept applicants from your region.

Data Engineer/Azure/Databricks

London, London Kingdom People

Posted 8 days ago


Job Description

contract

An experienced Azure Data Engineer is required for my global client. Due to a major increase in workload, my client needs an experienced Data Engineer with very strong experience in data engineering, Azure, and Microsoft Dynamics 365 F&O. Please do not send me your CV unless you have worked as a senior data engineer on the data engineering for a Dynamics 365 F&O deployment. My client wants a real go-getting self-starter, someone who has gravitas and can be trusted to deliver from day 1. The role is remote with occasional trips into the office in central London and is outside IR35.

This advertiser has chosen not to accept applicants from your region.