53 Big Data Hadoop jobs in Ireland

Data Engineer

Dublin, Leinster Grifols Shared Services North America, Inc

Posted today

Job Description

Would you like to join an international team working to improve the future of healthcare? Do you want to enhance the lives of millions of people? Grifols is a global healthcare company that since 1909 has been working to improve the health and well-being of people around the world. We are leaders in plasma-derived medicines and transfusion medicine and develop, produce and market innovative medicines, solutions and services in more than 110 countries and regions.
At Grifols, we believe that diversity adds value to our business, our teams, and our culture. We are committed to equal employment opportunities that foster an inclusive environment.
**Position Overview:**
As a Data Engineer, you will play a pivotal role in designing, building and maintaining new and existing data pipelines that integrate data from different sources, including **Google BigQuery, Microsoft Azure, Oracle, Salesforce and other enterprise platforms**. You will be responsible for transforming raw data for advanced analysis and predictions, and your expertise in **SQL, Python and modern data engineering tools** will be instrumental in unlocking the potential of large-scale data sets, ensuring data consistency and quality across the organisation. You will collaborate with data scientists, analysts, software engineers, and business stakeholders to optimise data flow and the delivery of high-quality insights for data-driven decision-making.
**Key Responsibilities:**
+ Design and build robust data pipelines to extract, transform, and load **(ETL/ELT)** data from various sources into **Google BigQuery** and/or data warehouses, implementing efficient and scalable data processing solutions using **SQL and Python**, and ensuring data integrity, scalability, and performance.
+ Collaborate with different teams to integrate data from various systems **(Marketing Campaigns, Commercial Data, Microsoft Azure, Oracle, Salesforce and other data platforms)**, ensuring data consistency across the organisation, and implementing data synchronisation strategies to keep data up to date.
+ Transform and cleanse raw data to ensure data quality and consistency, integrating data from multiple sources to provide a unified view for analysis.
+ Ensure that data is accurate, consistent, and reliable by applying predefined rules, checks, and constraints before it is processed or used for analysis.
+ Design and implement data models and schemas in **BigQuery and Azure SQL Database** to support analytical and reporting requirements, developing and maintaining logical and physical data models.
+ Monitor **BigQuery and Azure SQL Database** performance and troubleshoot any data ingestion, processing, or query execution issues, implementing proactive measures to detect and address potential bottlenecks.
+ Optimise SQL queries for efficient execution and reduced processing time using best practices, and develop automation scripts and workflows in Python to streamline data processing and integration tasks.
+ Document data pipelines, data models, and processes for both technical and non-technical stakeholders, and ensure data security, governance, and compliance with relevant privacy regulations.
+ Collaborate with cross-functional teams, including data scientists, data analysts, software engineers, and business stakeholders, gathering requirements and delivering solutions that meet their needs, and effectively presenting data-related insights and findings.
+ Stay updated with emerging technologies and tools related to data engineering, and recommend appropriate technologies to enhance data processing capabilities.
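To make the validation bullet above concrete, here is a minimal, hypothetical sketch in Python of applying predefined rules and checks before data moves downstream. The field names and rules are purely illustrative, not Grifols code:

```python
# Minimal sketch of rule-based validation before data is loaded downstream.
# The rule set and field names are illustrative assumptions only.

RULES = {
    "country": lambda v: isinstance(v, str) and len(v) == 2,   # ISO-3166 alpha-2
    "revenue": lambda v: isinstance(v, (int, float)) and v >= 0,
    "email":   lambda v: isinstance(v, str) and "@" in v,
}

def validate(row: dict) -> list:
    """Return the list of fields that fail their predefined rule."""
    return [field for field, check in RULES.items()
            if field not in row or not check(row[field])]

def partition(rows: list) -> tuple:
    """Split rows into (clean, rejected) before further processing."""
    clean, rejected = [], []
    for row in rows:
        (clean if not validate(row) else rejected).append(row)
    return clean, rejected
```

In a real pipeline, the rejected rows would typically be routed to a quarantine table for review rather than silently dropped.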
**Qualifications:**
+ Bachelor's or master's degree in Data Analytics, Computer Science, Software Engineering, Data Engineering, or a related field.
+ Around 3 years of experience as a Data Engineer, with hands-on experience building data pipelines across multiple systems (**Google BigQuery, Microsoft Azure, Oracle, Salesforce and other enterprise platforms**).
+ Proficient in SQL (including query optimisation techniques for large-scale data sets) and Python.
+ Strong knowledge of cloud-based data storage and processing technologies, with a focus on **Google Cloud Platform (GCP)** and **Microsoft Azure cloud services**.
+ Experience with data modelling, database design principles and data integration techniques and tools.
+ Proven experience as a **Power BI Developer** or similar (Tableau, Looker Studio, Fabric) with a strong portfolio of Power BI reports and dashboards.
+ Strong expertise in **DAX (Data Analysis Expressions)** and **Power Query** languages.
+ Familiarity with **Google Analytics** data structures and **Salesforce CRM** data structures.
+ Excellent problem-solving and analytical skills.
+ Strong communication and collaboration skills with cross-functional teams.
**Our Benefits Include:**
+ Highly competitive salary
+ Group pension scheme - contribution rates of 1.5%, 3%, 5% or 7%, matched by the company
+ Private Medical Insurance for the employee (Irish Life)
+ Ongoing opportunities for career development in a rapidly expanding work environment
+ Succession planning and internal promotions
+ Education allowance
+ Wellness and social activities, e.g. padel and summer events
We understand that self-doubt can hold talented individuals back from applying for opportunities. We encourage everyone who meets the qualifications to apply - we're excited to hear from you.
#LI-FD1
**Location:** Grange Castle International Business Park, Grange, Co. Dublin, D22 K2R3
**Req ID:**
**Type:** Regular Full-Time
**Job Category:**

Senior Data Engineer

Letterkenny, Ulster UnitedHealth Group

Posted today

Job Description

Optum is a global organisation that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.
**About the role:**
In healthcare, evolution doesn't just happen. It takes innovation, imagination, and a passion for solving problems in new and better ways. And innovation is taking place at a lightning-fast pace every day at Optum.
As the fastest growing part of the UnitedHealth Group family of businesses, we're expanding our team in Ireland and creating excellent opportunities for those who want greater purpose and more impact in their work. We'll provide the investment, support, and resources to advance your career. You'll provide the talent, ambition, and drive.
As a Senior Data Engineer, you will be responsible for the development of complex data sources and pipelines into our data platform (i.e. Snowflake) along with other data applications (i.e. Azure, Airflow, etc.) and automation.
The Senior Data Engineer will work closely with the Data, Architecture, Business Analyst, and Data Steward teams to integrate and align the requirements, specifications, and constraints of each requirement. They will also help identify gaps in resources, technology, or capabilities, and work with the data engineering team to identify and implement solutions where appropriate.
_Careers with Optum offer flexible work arrangements and individuals who live and work in the Republic of Ireland will have the opportunity to split their monthly work hours between our Dublin or Letterkenny offices and telecommuting from a home-based office in a hybrid work model._
**Primary Responsibilities:**
- Integrate data from multiple on-premises and cloud sources and systems. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting.
- Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures, handle missing or inconsistent data, and prepare the data for analysis, reporting, or machine learning tasks.
- Implement data de-identification/data masking in line with company standards.
- Monitor data pipelines and data systems to detect and resolve issues promptly.
- Develop monitoring tools to automate error handling mechanisms to ensure data integrity and system reliability.
- Utilize data quality tools like Great Expectations or Soda to ensure the accuracy, reliability, and integrity of data throughout its lifecycle.
- Create and maintain data pipelines using Airflow and Snowflake as primary tools.
- Create SQL stored procedures to perform complex transformations.
- Understand data requirements and design optimal pipelines to fulfil the use cases.
- Create logical and physical data models to ensure data integrity is maintained.
- Create and automate CI/CD pipelines using Git and Git Actions.
- Tune and optimise data processes.
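As a flavour of the de-identification responsibility above, here is a hypothetical sketch of deterministic masking (pseudonymisation) in Python: identifiers are replaced by a salted hash so joins still work across tables, but raw values never leave the pipeline. Field names and the salt-handling are illustrative assumptions, not Optum standards:

```python
import hashlib

SALT = b"rotate-me-per-environment"   # in practice, fetched from a secrets manager
PII_FIELDS = {"member_id", "email"}   # hypothetical field names

def mask_value(value: str) -> str:
    """Stable pseudonym: the same input always maps to the same token."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

def mask_record(record: dict) -> dict:
    """Return a copy of the record with PII fields pseudonymised."""
    return {k: (mask_value(str(v)) if k in PII_FIELDS else v)
            for k, v in record.items()}
```

Because the mapping is deterministic, the masked identifier can still serve as a join key downstream while the original value stays inside the secure zone.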
You will be rewarded and recognised for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as providing development for other roles you may be interested in.
**Required Qualifications:**
- Bachelor's degree in Computer Science or a related field.
- Proven hands-on experience as a Data Engineer.
- Proficiency in SQL (any flavor), with experience using Window functions and advanced features.
- Excellent communication skills.
- Strong knowledge of Python.
- In-depth knowledge of Snowflake architecture, features, and best practices.
- Experience with CI/CD pipelines using Git and Git Actions.
- Knowledge of various data modeling techniques, including Star Schema, Dimensional models, and Data Vault.
- Hands-on experience developing data pipelines (Snowflake), writing complex SQL queries.
- Experience building ETL/ELT/data pipelines.
- Hands-on experience with related/complementary open-source software platforms and languages (e.g., Scala, Python, Java, Linux).
- Experience with both relational (RDBMS) and non-relational databases.
- Analytical and problem-solving skills applied to big data datasets.
- Experience working on projects with agile/scrum methodologies and high-performing teams.
- Good understanding of access control, data masking, and row access policies.
- Exposure to DevOps methodology.
- Knowledge of data warehousing principles, architecture, and implementation.
**Preferred Qualifications:**
- Bachelor's degree or higher in Database Management, Information Technology, Computer Science, or a related field.
- Motivated self-starter who excels at managing tasks independently and takes ownership.
- Experience orchestrating data tasks in Airflow to run on Kubernetes for data ingestion, processing, and cleaning.
- Expertise in designing and implementing data pipelines to process high volumes of data.
- Ability to create Docker images for applications to run on Kubernetes.
- Familiarity with Azure Services such as Blobs, Functions, Azure Data Factory, Service Principal, Containers, Key Vault, etc.
**Please note you must currently be eligible to work and remain indefinitely without any restrictions in the country to which you are making an application. Proof will be required to support your application.**
All telecommuters will be required to adhere to the UnitedHealth Group's Telecommuter Policy.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalised groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Diversity creates a healthier atmosphere: Optum is an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to gender, civil status, family status, sexual orientation, disability, religion, age, race, and membership of the Traveller community, or any other characteristic protected by law. Optum is a drug-free workplace. © 2025 Optum Services (Ireland) Limited. All rights reserved.
#BBMEMEA

Senior Data Engineer

Dublin, Leinster UnitedHealth Group

Posted today

Job Description

Optum is a global organisation that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start **Caring. Connecting. Growing together.**
In healthcare, evolution doesn't just happen. It takes innovation, imagination, and a passion for solving problems in new and better ways. And innovation is taking place at a lightning-fast pace every day at Optum. As the fastest growing part of the UnitedHealth Group family of businesses, we're expanding our team in Ireland and creating excellent opportunities for those who want greater purpose and more impact in their work. We'll provide the investment, support, and resources to advance your career. You'll provide the talent, ambition, and drive.
As a **Senior Data Engineer** you will work on developing and maintaining data pipelines that extract, transform, and load (ETL) data from various sources into a centralised data storage system, such as a data warehouse or data lake. You will ensure the smooth flow of data from source systems to destination systems while adhering to data quality and integrity standards. In addition to having an impact on a great team, you'll also discover the career opportunities you'd expect from an industry leader.
**Schedule:** Full-time position with standard working hours of Monday - Friday, 9am - 5pm.
_Careers with Optum offer flexible work arrangements and individuals who live and work in the Republic of Ireland will have the opportunity to split their monthly work hours between our Dublin or Letterkenny office and telecommuting from a home-based office in a hybrid work model._
**Primary Responsibilities:**
+ Data Integration: Integrate data from multiple sources and systems, including databases, APIs, log files, streaming platforms, and external data providers. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting
+ Data Transformation and Processing: Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures, handle missing or inconsistent data, and prepare the data for analysis, reporting, or machine learning tasks
+ Contribute to common frameworks and best practices in code development, deployment, and automation/orchestration of data pipelines
+ Implement data governance in line with company standards
+ Partner with Data Analytics and Product leaders to design best practices and standards for developing and productionizing analytic pipelines
+ Partner with Infrastructure leaders on architecture approaches to advance the data and analytics platform, including exploring new tools and techniques that leverage the cloud environment (Azure, Snowflake, others)
+ Monitoring and Support: Monitor data pipelines and data systems to detect and resolve issues promptly. Develop monitoring tools, alerts, and automated error handling mechanisms to ensure data integrity and system reliability
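The clean/normalise/aggregate routine described above might look something like this rough Python sketch, where the column names and defaults are invented for illustration:

```python
# Hypothetical transform routine: normalise values, default missing data,
# then aggregate per key. Column names are illustrative assumptions.
from collections import defaultdict

def normalise(rows):
    """Tidy region codes and default missing amounts to 0.0."""
    out = []
    for row in rows:
        out.append({
            "region": (row.get("region") or "unknown").strip().lower(),
            "amount": float(row["amount"]) if row.get("amount") is not None else 0.0,
        })
    return out

def aggregate(rows):
    """Total amount per region after normalisation."""
    totals = defaultdict(float)
    for row in normalise(rows):
        totals[row["region"]] += row["amount"]
    return dict(totals)
```

In production this logic would more likely live in PySpark or SQL, but the shape of the routine - handle missing or inconsistent values first, then consolidate - is the same.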
_You will be rewarded and recognised for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as providing development for other roles you may be interested in._
**Required Qualifications:**
+ Bachelor's Degree (or higher) in Database Management, Information Technology, Computer Science or similar
+ Proven experience designing data solutions, including data modelling
+ Hands-on experience in data processing using PySpark and SQL
+ Experience with orchestration tools like Azure Data Factory or Airflow
+ Fluent in SQL, including advanced functions and window operations
+ Demonstrated experience with DevOps tools and CI/CD pipelines
+ Experience applying data governance controls in regulated environments
+ Well versed in Python regarding data manipulation, cleaning, transforming, and analyzing structured data to support our data-driven initiatives
**Preferred Qualifications:**
+ Proven Data Engineering experience
+ A motivated self-starter who excels at managing their own tasks and takes ownership
+ Experience working in projects with agile/scrum methodologies
+ Experience with Azure Databricks and Snowflake
+ Familiarity with production quality ML and/or AI model development and deployment
**Soft Skills:**
+ Effective communication skills and the ability to understand and adapt to changing requirements
**Please note you must currently be eligible to work and remain indefinitely without any restrictions in the country to which you are making an application. Proof will be required to support your application.**
_All telecommuters will be required to adhere to the UnitedHealth Group's Telecommuter Policy._
_At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalised groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._
_Diversity creates a healthier atmosphere: Optum is an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to gender, civil status, family status, sexual orientation, disability, religion, age, race, and membership of the Traveller community, or any other characteristic protected by law. Optum is a drug-free workplace. © 2025 Optum Services (Ireland) Limited. All rights reserved._
#RPO #BBMRAQ

Senior Data Engineer

Dublin, Leinster UnitedHealth Group

Posted today

Job Description

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start **Caring. Connecting. Growing together.**
As a Fortune 5 business, we're one of the world's leading healthcare companies. There are no limits here on the resources you'll have or the challenges you'll encounter.
We have been supporting global healthcare systems from Ireland and the UK for more than 20 years, building a dynamic and diverse team of more than 2,100 talented individuals. With a continued record of growth and stability, we're on the constant lookout for fresh talent to join our expanding teams.
As a Senior Data Engineer, you will be working with the Provider Growth team and will be responsible for connecting and analyzing existing data assets across the enterprise, building out new data pipelines, and ensuring data connectivity, with the intent of crafting client-centric qualification and origination strategies. You will participate in efforts to grow the Provider Growth pipeline and coordinate campaign/outreach activities in support of predicted customer need. This will involve working with data scientists to support feature engineering that connects and aggregates data in new and innovative ways. You will work with various data domain experts to understand what data exists and how to maximize and model it for the data science team, who will be building predictive models to enhance overall sales strategy effectiveness.
**Working Schedule** : Full-time position, Monday - Friday with standard working hours.
_Careers with Optum offer flexible work arrangements and individuals who live and work in the Republic of Ireland will have the opportunity to split their monthly work hours between our Dublin or Letterkenny Office and telecommuting from a home-based office in a hybrid work model._
**Primary Responsibilities:**
+ Work with various data domain experts to understand & link what data exists across various domains (Operations, Product, Finance, Value Analytics, etc.)
+ Design data models to efficiently connect data across data domains to support customer targeting analytics
+ Develop and maintain data pipelines: Design, build, and manage data pipelines to ensure efficient data flow and integration across various systems
+ Analyze large datasets to identify trends, patterns, and actionable insights that support business decision-making
+ Ensure data accuracy and relevance: Collaborate with data scientists and other cross-functional teams to ensure the accuracy and relevance of data used in sales strategies
+ Bridge the gap between raw data and business insights by transforming complex datasets into usable formats for analysis
+ Perform exploratory data analysis (EDA) to support hypothesis generation and business strategy
+ Collaborate with cross-functional teams: Work closely with teams such as Value Analytics, Ops, Product, and Finance to ensure seamless data connectivity and integration
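For flavour, the trend-analysis responsibility above might reduce to something like this toy Python check during exploratory data analysis, where the window size and the "trending" criterion are illustrative assumptions:

```python
# Toy EDA sketch: smooth a metric with a trailing moving average and
# flag upward movement. Window size and criterion are assumptions.

def moving_average(series, window=3):
    """Trailing moving average; returns [] if the series is shorter than window."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

def is_trending_up(series, window=3):
    """Crude signal: last smoothed value exceeds the first smoothed value."""
    smoothed = moving_average(series, window)
    return bool(smoothed) and smoothed[-1] > smoothed[0]
```

A real pipeline would run such checks over data pulled via SQL and surface the results in a BI dashboard, but the underlying idea - smooth, then compare - is this simple.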
_You will be rewarded and recognized for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as providing development for other roles you may be interested in._
**Required Qualifications:**
+ Bachelor's Degree (or higher) in Data Engineering, Computer Science, Data Analytics, or a related field
+ Proven experience in data engineering, including building and maintaining data pipelines
+ Experience with BI tools (e.g., Tableau, Power BI) for dashboarding and reporting
+ SQL skills for data querying, transformation, and optimization
+ Proficiency in programming languages such as Python or Scala
+ Experience in Cloud-based technologies, data migration/automation and data engineering practices
+ Experience with ADF (Azure Data Factory), SSIS, or similar ETL tools
+ Experience with data integration tools and platforms
+ Communication skills to convey complex data insights to non-technical stakeholders
**Preferred Qualifications:**
+ Familiarity with Salesforce and sales operations activities
+ US Healthcare experience
+ Knowledge of Clinical Solutions, and/or Provider Revenue Cycle Management product and operations
+ Familiarity with Optum's Provider technology &/or services solutions
+ Familiarity with data warehousing solutions (e.g., Snowflake)
+ Experience working in projects with agile/scrum methodologies
**Soft Skills:**
+ Excellent analytical and problem-solving skills
+ Ability to work collaboratively in a team environment
**Please note you must currently be eligible to work and remain indefinitely without any restrictions in the country to which you are making an application. Proof will be required to support your application.**
_All telecommuters will be required to adhere to the UnitedHealth Group's Telecommuter Policy._
_At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._
_Diversity creates a healthier atmosphere: Optum is an Equal Employment Opportunity employer, and all qualified applicants will receive consideration for employment without regard to gender, civil status, family status, sexual orientation, disability, religion, age, race, and membership of the Traveller community, or any other characteristic protected by law. Optum is a drug-free workplace. © 2025 Optum Services (Ireland) Limited. All rights reserved._
#RPO #BBMEMEA

Senior Data Engineer

Dublin, Leinster UnitedHealth Group

Posted today

Job Description

Optum is a global organisation that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.
**About the role:**
In healthcare, evolution doesn't just happen. It takes innovation, imagination, and a passion for solving problems in new and better ways. And innovation is taking place at a lightning-fast pace every day at Optum.
As the fastest growing part of the UnitedHealth Group family of businesses, we're expanding our team in Ireland and creating excellent opportunities for those who want greater purpose and more impact in their work. We'll provide the investment, support, and resources to advance your career. You'll provide the talent, ambition, and drive.
As a Senior Data Engineer, you will be responsible for the development of complex data sources and pipelines into our data platform (i.e. Snowflake) along with other data applications (i.e. Azure, Airflow, etc.) and automation.
The Senior Data Engineer will work closely with the Data, Architecture, Business Analyst, and Data Steward teams to integrate and align the requirements, specifications, and constraints of each requirement. They will also help identify gaps in resources, technology, or capabilities, and work with the data engineering team to identify and implement solutions where appropriate.
_Careers with Optum offer flexible work arrangements and individuals who live and work in the Republic of Ireland will have the opportunity to split their monthly work hours between our Dublin or Letterkenny offices and telecommuting from a home-based office in a hybrid work model._
**Primary Responsibilities:**
- Integrate data from multiple on-premises and cloud sources and systems. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting.
- Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures, handle missing or inconsistent data, and prepare the data for analysis, reporting, or machine learning tasks.
- Implement data de-identification/data masking in line with company standards.
- Monitor data pipelines and data systems to detect and resolve issues promptly.
- Develop monitoring tools to automate error handling mechanisms to ensure data integrity and system reliability.
- Utilize data quality tools like Great Expectations or Soda to ensure the accuracy, reliability, and integrity of data throughout its lifecycle.
- Create and maintain data pipelines using Airflow and Snowflake as primary tools.
- Create SQL stored procedures to perform complex transformations.
- Understand data requirements and design optimal pipelines to fulfil the use cases.
- Create logical and physical data models to ensure data integrity is maintained.
- Create and automate CI/CD pipelines using Git and Git Actions.
- Tune and optimise data processes.
You will be rewarded and recognised for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as providing development for other roles you may be interested in.
**Required Qualifications:**
- Bachelor's degree in Computer Science or a related field.
- Proven hands-on experience as a Data Engineer.
- Proficiency in SQL (any flavor), with experience using Window functions and advanced features.
- Excellent communication skills.
- Strong knowledge of Python.
- In-depth knowledge of Snowflake architecture, features, and best practices.
- Experience with CI/CD pipelines using Git and Git Actions.
- Knowledge of various data modeling techniques, including Star Schema, Dimensional models, and Data Vault.
- Hands-on experience developing data pipelines (Snowflake), writing complex SQL queries.
- Experience building ETL/ELT/data pipelines.
- Hands-on experience with related/complementary open-source software platforms and languages (e.g., Scala, Python, Java, Linux).
- Experience with both relational (RDBMS) and non-relational databases.
- Analytical and problem-solving skills applied to big data datasets.
- Experience working on projects with agile/scrum methodologies and high-performing teams.
- Good understanding of access control, data masking, and row access policies.
- Exposure to DevOps methodology.
- Knowledge of data warehousing principles, architecture, and implementation.
**Preferred Qualifications:**
- Bachelor's degree or higher in Database Management, Information Technology, Computer Science, or a related field.
- Motivated self-starter who excels at managing tasks independently and takes ownership.
- Experience orchestrating data tasks in Airflow to run on Kubernetes for data ingestion, processing, and cleaning.
- Expertise in designing and implementing data pipelines to process high volumes of data.
- Ability to create Docker images for applications to run on Kubernetes.
- Familiarity with Azure Services such as Blobs, Functions, Azure Data Factory, Service Principal, Containers, Key Vault, etc.
**Please note you must currently be eligible to work and remain indefinitely without any restrictions in the country to which you are making an application. Proof will be required to support your application.**
All telecommuters will be required to adhere to the UnitedHealth Group's Telecommuter Policy.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalised groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Diversity creates a healthier atmosphere: Optum is an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to gender, civil status, family status, sexual orientation, disability, religion, age, race, and membership of the Traveller community, or any other characteristic protected by law. Optum is a drug-free workplace. © 2025 Optum Services (Ireland) Limited. All rights reserved.
#BBMEMEA
This advertiser has chosen not to accept applicants from your region.

Senior Data Engineer

Dublin, Leinster Mastercard

Posted today

Job Description

**Our Purpose**
_Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential._
**Title and Summary**
Senior Data Engineer
Mastercard strives to be the trusted network for empowering small businesses to grow and thrive through compelling products and solutions accessible from our B2B customers and partners. This segment is central to Mastercard's growth strategy - not only because of the impact Small & Medium Enterprises have on local, regional, and global economies but also because of their significant role in the payment ecosystem as buyers and sellers.
SME Engineering's organizational goal is to build innovative solutions and products that enable Mastercard to provide competitive solutions to small and medium businesses.
We believe in lean teams delivering a lot of value, so you will be part of a lean team potentially driving impact for the platform buildout from scratch. The team will be addressing complex technical problems and working with the latest technologies, such as Azure/AWS cloud-native services, GraphQL, high-throughput SQL/key-value/document stores, and cloud-based services, with exposure to other Mastercard tech stacks as part of the development process.
Be part of Mastercard and embrace the "Mastercard Way" of "Create Value, Grow Together and Move Fast" by doing the right thing, focused on "Decency, Inclusion, and Force for Good."
Role:
As a Senior Data Engineer, you will:
- Write high-quality, performant, clean, and testable code that adheres to best practices and coding standards, ensuring maintainability and scalability.
- Design, develop, and maintain data pipelines that automate tasks within data science and data engineering, enhancing workflow efficiency.
- Work in cross-functional teams and across different business units to tackle complex problems, fostering a collaborative and innovative environment.
- Assist in deploying and validating production artifacts, ensuring seamless integration and functionality.
- Identify opportunities to simplify and automate tasks, building reusable components that serve multiple use cases and teams.
- Create data assets that are well-modeled, thoroughly documented, and easy to understand and maintain, contributing to a robust data infrastructure.
- Develop functional requirements in complex environments, ensuring solutions meet business needs and technical specifications.
- Utilize advanced big data platforms and technologies to handle data at petabyte scale, pushing the boundaries of what is possible in data engineering.
- Build, optimize and maintain ETL pipelines using Hadoop ecosystem tools (HDFS, Hive, Spark).
- Collaborate with teams to ensure efficient and reliable data processing.
- Perform data modeling, quality checks, and system performance tuning.
- Contribute to modernization efforts, including potential cloud or Databricks integration.
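The ETL responsibilities above follow the same extract-transform-load shape whatever the engine. A minimal sketch, with SQLite standing in for Hive/Spark (the transaction table, columns, and rows are invented for illustration):

```python
import sqlite3

# Minimal extract-transform-load sketch. SQLite stands in for
# Hive/Spark here; the table and column names are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_txn (id INTEGER, amount_cents INTEGER)")
con.executemany("INSERT INTO raw_txn VALUES (?, ?)",
                [(1, 1250), (2, -300), (3, 4999)])

# Extract: pull rows out of the source table
rows = con.execute("SELECT id, amount_cents FROM raw_txn").fetchall()

# Transform: drop negative amounts, convert cents to euro
clean = [(i, cents / 100.0) for i, cents in rows if cents >= 0]

# Load: write the cleaned rows to the target table
con.execute("CREATE TABLE txn (id INTEGER, amount_eur REAL)")
con.executemany("INSERT INTO txn VALUES (?, ?)", clean)

# Quality check: no negative amounts should survive the transform
bad = con.execute("SELECT COUNT(*) FROM txn WHERE amount_eur < 0").fetchone()[0]
total = con.execute("SELECT COUNT(*) FROM txn").fetchone()[0]
```

At petabyte scale the same extract, transform, load, and quality-check steps would run as distributed Spark jobs over HDFS rather than in a single process.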
All About You:
- Essential Skills to be successful:
- Proven experience in Python, PySpark and SQL, showcasing the ability to write clean, readable, and maintainable code.
- Hands-on knowledge of any big data engine, with a preference for Spark, demonstrating proficiency in managing large-scale data.
- Strong experience with CI/CD tools such as Git and Jenkins, ensuring efficient and reliable code deployment.
- Excellent communication skills, enabling effective collaboration and knowledge sharing.
- Highly skilled in problem-solving, capable of addressing complex challenges with innovative solutions.
- Exhibits a high degree of initiative and a strong curiosity, with a desire to continuously learn and grow.
- Strong understanding of Agile methodologies, with the ability to drive iterative delivery and cross-team collaboration.
- Strong communicator with the ability to explain complex concepts to both technical and non-technical audiences, and to influence stakeholders across product, engineering, and acquisition teams.
- Bachelor's degree in Computer Science, Data Analytics, Mathematics, Software Engineering, or a related field or equivalent practical experience.
- Programming language: Java/Scala/Python
- Data processing framework: Spark
- Big Data Hadoop frameworks: Hive, Impala, Oozie, Airflow, HDFS
- Experience with any cloud provider, including services like S3, Athena, EMR, Redshift, Glue, Lambda, etc.
- Data & AI platform: Databricks
Nice to have:
- Experience with data engineering on petabyte-scale data
- A passion for machine learning
- Comfortable with pioneering new technology
**Corporate Security Responsibility**
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
+ Abide by Mastercard's security policies and practices;
+ Ensure the confidentiality and integrity of the information being accessed;
+ Report any suspected information security violation or breach; and
+ Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

Senior Data Engineer

Rithum

Posted today

Job Description

Rithum is the world's most trusted commerce network, accelerating how brands, suppliers, and retailers work together to deliver seamless e-commerce experiences. We provide an unmatched platform for brands and retailers, enabling them to accelerate growth, optimise operations across channels, scale product offerings and enhance margins.
Today, more than 40,000 companies trust Rithum to grow their business across hundreds of channels, representing over $50 billion in annual GMV. Using our commerce, marketing, and delivery solutions, our customers create optimised consumer shopping journeys from beginning to end.
**Overview**
As a **Senior Data Engineer** , you are a key architect in developing, delivering and maintaining intelligent data infrastructure that powers advanced analytics. You are an experienced data engineer with strong ETL pipeline and dimensional modeling skills, who is ready to leverage AI as a powerful collaborator and brings a depth of knowledge to ensure it is employed effectively for the long-term health of our business systems. You are responsible for building complex data pipelines from the ground up, delivering data into and enhancing a database infrastructure that powers advanced analytics, AI-driven decision-making, and autonomous data operations. You work across the organization, supporting and collaborating with our product teams, engineers, data scientists, and data analysts. In this context, you enhance and support our company's cloud data architecture to support our products and data initiatives, for external clients as well as internal users.
**Responsibilities**
_Core Data Infrastructure_
+ **Design and implement scalable ETL/ELT workflows** that support both batch and streaming data using AWS primitives (e.g., S3, Kinesis, Glue, Redshift, Athena)
+ **Architect and maintain cloud-native data platforms** with automated ingestion, transformation, and governance pipelines using modern tools like DBT, Apache Spark, Delta Lake, Airflow, and Databricks
+ **Work with stakeholders, including the Product, BI, and Support teams** , to assist with data-related technical challenges and support their data and infrastructure needs.
+ **Collaborate with cross-functional Engineering teams** using analytics to predict data needs and proactively deliver solutions.
+ **Assist in the optimization of data lake/lakehouse infrastructure** , to support AI workloads and large-scale analytics.
_Governance & Collaboration_
+ **Ensure data quality, lineage, and observability** .
+ **Develop and enforce data governance policies** , including compliance monitoring and privacy protection.
+ **Partner with Data Scientists** to optimize data pipelines for model training, inference, and continuous learning workflows.
_Advanced Data Operations_
+ **Build self-healing data pipelines** with AI-driven error detection, root cause analysis, and automated remediation capabilities.
+ **Implement intelligent data lineage tracking** to automatically discover relationships between datasets and predict downstream impact of changes.
+ **Create AI-assisted data discovery systems** that help stakeholders find relevant datasets and understand data semantics through natural language interfaces.
+ **Participate in on-call rotation** , as needed.
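"Self-healing" in the sense above can be approximated by a detect-remediate-retry loop. A minimal, library-free sketch (the missing-table failure mode and its remediation are invented for illustration):

```python
# Sketch of a self-healing pipeline step: on failure, run a
# remediation hook and retry. The failure ("target table missing")
# and its automated fix are invented for illustration.
def run_with_healing(step, remediate, attempts=3):
    last_err = None
    for _ in range(attempts):
        try:
            return step()
        except Exception as err:       # detect the error
            last_err = err
            remediate(err)             # attempt automated remediation
    raise last_err

state = {"table_exists": False, "tries": 0}

def load_step():
    state["tries"] += 1
    if not state["table_exists"]:
        raise RuntimeError("target table missing")
    return "loaded"

def create_table(err):
    state["table_exists"] = True       # remediation: create the table

result = run_with_healing(load_step, create_table)
```

A production system would add root-cause classification so each error type maps to its own remediation, but the control flow is the same.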
_AI-Enhanced Development_
+ **Leverage AI coding assistants** (GitHub Copilot, Cursor, etc.) to accelerate development cycles, generate complex SQL queries, and automatically optimize data pipeline code.
+ **Develop data quality monitoring**, using anomaly detection and data profiling tools to identify issues before they impact downstream systems.
+ **Optimize pipeline orchestration**, using ML to predict optimal scheduling, resource allocation, and failure recovery patterns.
+ **Generate and maintain living documentation** that evolves with code changes.
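One simple form of the anomaly-based quality monitoring described above is flagging load volumes that deviate sharply from the recent median. A sketch, with the daily counts and threshold invented for illustration:

```python
import statistics

# Data-quality monitoring sketch: flag daily row counts that deviate
# from the median by more than k median absolute deviations (MAD).
# The counts and the threshold k are illustrative.
def anomalies(counts, k=5.0):
    med = statistics.median(counts)
    mad = statistics.median(abs(c - med) for c in counts)
    if mad == 0:
        return []
    return [c for c in counts if abs(c - med) / mad > k]

daily_rows = [1000, 1020, 980, 1010, 995, 40]   # last load looks broken
flagged = anomalies(daily_rows)
```

Median-based deviation is used here rather than mean/standard deviation because a single broken load would otherwise inflate the baseline it is measured against.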
_Leadership & Innovation_
+ **Participate in the full software development lifecycle** , including both manual and AI-assisted requirements gathering, automated testing, and intelligent deployment strategies.
+ **Mentor junior engineers** on both traditional data engineering practices and effective use of AI development tools.
+ **Lead tool evaluation and adoption** for the data engineering team, establishing best practices for human-AI collaboration.
+ **Drive innovation in data architecture** by experimenting with emerging technologies.
**Qualifications**
Minimum Qualifications
+ 3+ years of experience in data engineering, including building and maintaining large-scale data pipelines
+ Extensive experience with SQL RDBMSs (SQL Server or similar), dimensional modeling using star schemas, and foundational data warehousing concepts
+ Hands-on experience with AWS services such as Redshift, Athena, S3, Kinesis, Lambda, Glue
+ Experience with DBT, Databricks or similar data platform tooling
+ Experience working with structured and unstructured data and implementing data quality frameworks
+ Excellent communication and collaboration skills
+ Demonstrated experience using AI coding tools (GitHub Copilot, Cursor, or similar), with understanding of prompt engineering, to enhance development productivity
+ Understanding of AI/ML concepts and data requirements, including feature stores, model versioning, and real-time inference pipelines
Preferred Qualifications
+ Bachelor's or Master's degree in Computer Science, Engineering, or a related field
+ Experience in a SaaS or e-commerce environment with AI/ML products
+ Knowledge of stream processing frameworks like Kafka, Flink, or Spark Structured Streaming
+ Familiarity with LLMOps and AI model deployment patterns in data infrastructure
+ Experience with AI-powered data tools such as automated data catalogs, intelligent monitoring systems, or AI-assisted query optimization
+ Proven ability to thrive in a fast-paced, agile environment with shifting priorities and emerging technologies
+ Experience with containerization and orchestration tools like Docker and Kubernetes
**Travel Required**
Up to 10%
**Other Duties**
_Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities required of the employee for this job. Duties, responsibilities, and activities may change at any time with or without notice._
**What it's like to work at Rithum**
When you join Rithum, you can expect to work with smart risk-takers, courageous collaborators, and curious minds.
As part of the Rithum team, you are valued, supported, and included. Guided by a transparent culture and accessible, approachable leadership, we offer career opportunities aligned to your ambitions and talents. To ensure work and life balance works for you, we also offer an array of resources to support you and your families, including comprehensive benefits and wellness plans.
At Rithum you will:
+ Partner with the leading brands and retailers.
+ Connect with passionate professionals who will help support your goals.
+ Participate in an inclusive, welcoming work atmosphere.
+ Achieve work-life balance through remote-first working conditions, generous time off, and wellness days.
+ Receive industry-competitive compensation and total rewards benefits.
**Benefits**
+ Medical coverage provided through Irish Life Health; premiums paid by the company
+ Life & disability insurance
+ Pension plan with 5% company match
+ Competitive time off package with 25 days of PTO, 11 company-paid holidays, 2 wellness days and 1 paid volunteer day
+ Access to tools to support your wellbeing such as the Calm App and an Employee Assistance Program
+ Professional development stipend and learning and development offerings to help you build the skills and connections you need to move forward in your career.
+ Charitable contribution match per team member
Rithum is an equal opportunity employer. We are committed to providing an environment of mutual respect where equal employment opportunities are available to all applicants and teammates without regard to race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other protected characteristic. All employment is decided on the basis of qualifications, merit, and business need.
We're committed to providing reasonable accommodations in accordance with the law for qualified applicants. If you require assistance during the interview process due to a medical condition or need support accessing our website or completing the application process, please reach out to us by completing the Accommodations Request Form. Your comfort and accessibility are important to us, and we're here to ensure a seamless experience as you explore opportunities with our team.

Senior Data Engineer

Galway, Connacht Cpl Resources - Galway

Posted today

Job Description

Senior Data Engineer
Location: Galway (Hybrid)
Type: Contract
Your Technical Expertise:
- Strong experience with database technologies such as Snowflake and Oracle.
- Object-oriented development experience (preferred language: Python).
- Practical experience with cloud technologies (AWS preferred).
- Proven ability to design and build scalable, robust ETL data flows.
- Solid design and analysis capabilities for large data platforms.
- Familiarity with DevOps and CI/CD practices and tools (Jenkins, GitHub, Terraform, Docker).
- Knowledge of messaging systems (e.g., SNS, SQS, Artemis MQ).
- Experience with job scheduling and batch processing (Control-M, AWS Batch).
- Comfortable working in an agile development environment.
- Excellent interpersonal, communication and cross-team collaboration skills.
- Financial services experience is preferred but not essential.
Your Skills & Experience:
- Hands-on application design and development experience with a complete understanding of the software development lifecycle.
- Capability to improve and uphold code quality, security, structure, and automation.
- Strong analytical, communication and organisational skills, with the ability to handle multiple concurrent tasks.
- A collaborative mindset: you actively contribute to how work gets delivered and thrive in team settings.
- Expertise conducting code reviews to enforce standards and efficient coding practices.
- Experience building monitoring and alerting solutions to surface failures or potential performance degradation.
- Ability to guide, encourage and motivate other engineers.
- A true team player who can also operate independently with minimal direction.
If you are interested in learning more about this opportunity, please drop me a message, forward your CV using the apply button or reach out to me directly via #LI-CF3
Skills: Snowflake, Python, ETL
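The messaging systems this listing names (SNS, SQS, Artemis MQ) all decouple producers from consumers. A rough in-process sketch of that pattern, with Python's standard `queue` standing in for a real broker (the message contents are invented):

```python
import queue

# In-process stand-in for a message queue such as SQS: a producer
# enqueues records, a consumer drains and processes them. Message
# contents are invented for illustration.
q = queue.Queue()

def produce(records):
    for r in records:
        q.put(r)
    q.put(None)                    # sentinel: end of stream

def consume():
    seen = []
    while True:
        msg = q.get()
        if msg is None:
            break
        seen.append(msg.upper())   # "process" the message
    return seen

produce(["order-1", "order-2"])
out = consume()
```

With a real broker the producer and consumer run in separate services, and the queue absorbs bursts so neither side blocks the other.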

Senior Data Engineer

Dublin, Leinster SSE

Posted today

Job Description

Base Location: Dublin or Reading
Salary: £42,600- £4,000 / €57,600 - €86,400 + performance-related bonus and a range of benefits to support your finances, wellbeing and family.
Working Pattern: Permanent | Full Time | Flexible First options available
As a Data Engineer at ECS, you will play a key role in supporting the delivery of our data strategy. You will contribute to the design, development, and optimisation of scalable data solutions that enable business insights and enhance customer experience, particularly through the use of cloud technologies and advanced analytics. You will also be involved in data migration and transformation initiatives, working closely with cross-functional teams to ensure data is accurate, accessible, and secure.
You will:
- Contribute to the delivery of data solutions across Distributed Energy and Business Energy within ECS, working as part of a collaborative technical team.
- Support and mentor junior data engineers, promoting knowledge sharing and teamwork.
- Apply strong analytical and problem-solving skills to help design and implement data pipelines and models.
- Communicate effectively with both technical and non-technical stakeholders, contributing to discussions and sharing insights.
- Show enthusiasm for innovation, particularly in AI and automation, and demonstrate a willingness to learn and adapt in a fast-paced environment.
You have:
- Hands-on experience with Azure data services, including Azure Data Factory, Databricks, Azure Storage (Data Lake / Delta Lake), Azure SQL Database, and Synapse.
- Proficiency in Python/PySpark, T-SQL, and data analysis libraries such as Pandas and NumPy. Experience with Power BI is desirable.
- Experience building and maintaining ETL pipelines, developing data models, and contributing to documentation and code reviews.
- A good understanding of data governance, data manipulation, machine learning concepts, and language models is preferable.
- Familiarity with Agile ways of working and tools such as Azure DevOps, including source control and CI/CD practices.
About SSE
SSE has a bold ambition - to be a leading energy company in a net zero world. We're investing around £10 million a day in homegrown energy to help power a cleaner, more secure future. Our investment will see us build the world's largest offshore wind farm and transform the grid to deliver greener electricity to millions. Our IT division powers growth across all SSE business areas by making sure we have the systems, software and security needed to take the lead in a low carbon world. They provide expertise, advice and day-to-day support in emerging technologies, data and analytics, cyber security and more.
Flexible benefits to fit your life
- Discounts on private healthcare and gym memberships.
- Wellbeing benefits like a free online GP and 24/7 counselling service.
- Interest-free loans on tech and transport season tickets, or a new bike with our Cycle to Work scheme.
- Generous family entitlements such as maternity and adoption pay, and paternity leave.
Work with an equal opportunity employer
SSE will make any reasonable adjustments you need to ensure that your application and experience with us is positive. Please contact / to discuss how we can support you. We're dedicated to fostering an open and inclusive workplace where people from all backgrounds can thrive. We create equal opportunities for everyone to succeed and especially welcome applications from those who may not be well represented in our workforce or industry.
Ready to apply?
Start your online application using the Apply Now box on this page. We only accept applications made online. We'll be in touch after the closing date to let you know if we'll be taking your application further. If you're offered a role with SSE, you'll need to complete a criminality check and a credit check before you start work.
To be considered for this role you will be redirected to and must complete the application process on our careers page. To start the process click the Continue to Application below.

Senior Data Engineer

Dublin, Leinster J.P MORGAN S.E Dublin Branch

Posted today

Job Description

Job Description
Elevate your career by working with new AI and machine learning technologies, focusing on delivering impactful solutions. We provide opportunities to help you reach your full potential, offering the support you need to achieve your career goals.
The Employee Platforms technology team is excited to invest in Dublin, establishing an engineering hub focused on the growth and application of innovative technologies to enhance the experience of our 300k+ employees. As a Data Engineer at JP Morgan Chase within the Employee Platforms team, you will design, build, and maintain scalable data pipelines and infrastructure to support our data-driven initiatives. Your expertise in AWS, Databricks, and graph database technologies will be crucial in delivering solutions that meet business needs. You will collaborate with data scientists, analysts, and other stakeholders to ensure data quality and consistency, while staying up to date with industry trends and technologies.
J.P. Morgan Dublin thrives as a collaborative, tight-knit community, passionately driven by innovation, where curiosity fuels the relentless pursuit of groundbreaking ideas. The culture is centred around exploring new frontiers together by fostering an environment that encourages growth, creativity and forward-thinking.
Job Responsibilities:
- Design, develop, and maintain scalable data pipelines using AWS services, Databricks, and graph databases.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
- Implement data integration and transformation processes to ensure data quality and consistency.
- Optimise and manage data storage solutions, ensuring efficient data retrieval and processing.
- Monitor and troubleshoot data pipelines to ensure reliability and performance.
- Stay up to date with the latest industry trends and technologies in data engineering and make recommendations for improvements.
- Document data engineering processes, workflows, and best practices.
Required Qualifications, Capabilities, and Skills:
- Formal training or certification in Computer Science or Information Technology, and proficient advanced experience.
- Proven experience as a Data Engineer or in a similar role.
- Strong proficiency in AWS services such as S3, Lambda, Glue, and Redshift.
- Experience with Databricks for data processing and analytics.
- Familiarity with thatDot Streaming Graph for real-time data processing.
- Proficiency in programming languages such as Python for scripting and automation.
- Strong SQL skills for querying and managing relational databases.
- Experience with Cypher for querying graph databases.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Preferred Qualifications, Capabilities, and Skills:
- Experience with data modelling and ETL processes.
- Knowledge of big data technologies such as Apache Spark and Kafka.
- Familiarity with data visualisation tools and techniques.
About Us
J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world's most prominent corporations, governments, wealthy individuals and institutional investors. Our first-class business in a first-class way approach to serving clients drives everything we do. We strive to build trusted, long-term partnerships to help our clients achieve their business objectives. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law.
We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
About the Team
Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we're setting our businesses, clients, customers and employees up for success.
To be considered for this role you will be redirected to and must complete the application process on our careers page. To start the process click the Continue to Application or Login/Register to apply button below.
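The Cypher querying this role calls for ultimately resolves to graph traversals. A rough stand-in using a plain adjacency list (the node names are invented; a real system would run Cypher against a graph store such as thatDot or Neo4j):

```python
from collections import deque

# Adjacency-list stand-in for a graph database: find everything
# reachable from a start node, roughly what a Cypher pattern like
# (a)-[*]->(b) matches. Node names are invented for illustration.
graph = {
    "employee":   ["team", "laptop"],
    "team":       ["department"],
    "department": [],
    "laptop":     [],
}

def reachable(start):
    seen, todo = set(), deque([start])
    while todo:
        node = todo.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                todo.append(nxt)
    return sorted(seen)

nodes = reachable("employee")
```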
 
