243 Big Data jobs in Ireland
Big Data Engineer
Posted today
Job Description
About the Role:
We are looking for a Big Data Engineer to join one of our leading clients on an exciting project. The ideal candidate will have hands-on experience with large-scale data processing, Hadoop ecosystem tools, and cloud platforms, and will play a key role in building and optimizing data pipelines.
Tech Stack
- Programming Languages: Java / Scala / Python
- Data Processing Framework: Spark
- Big Data / Hadoop Frameworks: Hive, Impala, Oozie, Airflow, HDFS
- Cloud Experience: AWS, Azure, or GCP (services such as S3, Athena, EMR, Redshift, Glue, Lambda, etc.)
- Data & AI Platform: Databricks
Roles & Responsibilities
- Build, optimize, and maintain ETL pipelines using Hadoop ecosystem tools (HDFS, Hive, Spark); a brief illustrative sketch follows this list.
- Collaborate with cross-functional teams to ensure efficient and reliable data processing workflows.
- Perform data modelling, implement quality checks, and carry out system performance tuning.
- Support modernization efforts, including migration and integration with cloud platforms and Databricks.
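As referenced above, here is a minimal PySpark sketch of the kind of ETL job the first responsibility describes. The paths, table names, and schema are illustrative assumptions, not details from the posting.
```python
# Minimal PySpark ETL sketch: read raw events from HDFS, clean them,
# and write a date-partitioned Hive table. All names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-events-etl")
    .enableHiveSupport()        # lets Spark write managed Hive tables
    .getOrCreate()
)

# Extract: raw JSON events landed on HDFS by an upstream process.
raw = spark.read.json("hdfs:///data/raw/events/2025-10-19/")

# Transform: drop malformed rows, normalise the timestamp, derive a date key.
clean = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Load: append into a partitioned Hive table for downstream queries.
(clean.write
      .mode("append")
      .partitionBy("event_date")
      .saveAsTable("analytics.events_clean"))

spark.stop()
```
Partitioning by the derived date column keeps downstream Hive and Impala queries cheap, since date-bounded reads only scan the relevant partitions.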
Preferred Qualifications
- Hands-on experience with large-scale data processing and distributed systems.
- Strong problem-solving and analytical skills.
- Familiarity with CI/CD pipelines and version control tools is a plus.
Job Types: Full-time, Permanent
Pay: €70,000.00-€85,000.00 per year
Work Location: In person
Application deadline: 10/10/2025
Reference ID: IJP - SBDE - DUBIR - 01
Expected start date: 19/10/2025
Big Data Developer
Posted today
Job Description
I'm working with a financial services client of mine, and they're looking for a Senior Data Software Engineer to join their team. This is a 12-month day-rate contract at a highly competitive rate.
What You Bring to the Table
- 5+ years of professional experience in software design and development
- Advanced hands-on expertise with Snowflake and Oracle databases
- Solid foundation in Java-based object-oriented programming
- Practical experience working with Apache Spark (using Java or Scala), ideally within AWS EMR environments (a brief sketch of submitting such a job follows this list)
- Strong familiarity with AWS
- Demonstrated ability to architect and build scalable, reliable ETL pipelines
- Background in DevOps workflows and tools like Maven, Jenkins, GitHub, Terraform, and Docker, with experience in CI/CD automation
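As mentioned in the Spark item above, here is a hedged sketch of submitting a Spark step to a running EMR cluster with boto3. The cluster ID, bucket, JAR, and class name are placeholders, not client specifics.
```python
# Submit a Spark step to an existing EMR cluster. The Spark job itself
# would be the Java/Scala assembly the role describes; this wrapper
# script is Python purely for illustration.
import boto3

emr = boto3.client("emr", region_name="eu-west-1")

response = emr.add_job_flow_steps(
    JobFlowId="j-XXXXXXXXXXXXX",  # placeholder cluster ID
    Steps=[
        {
            "Name": "nightly-etl",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                # command-runner.jar is EMR's standard step launcher.
                "Jar": "command-runner.jar",
                "Args": [
                    "spark-submit",
                    "--deploy-mode", "cluster",
                    "--class", "com.example.etl.NightlyJob",
                    "s3://example-bucket/jars/etl-assembly.jar",
                ],
            },
        }
    ],
)
print("Submitted step:", response["StepIds"][0])
```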
Please feel free to reach out for more information.
Lead Big Data Engineer
Posted today
Job Description
Genesys empowers organizations of all sizes to improve loyalty and business outcomes by creating the best experiences for their customers and employees. Through Genesys Cloud, the AI-powered Experience Orchestration platform, organizations can accelerate growth by delivering empathetic, personalized experiences at scale to drive customer loyalty, workforce engagement, efficiency and operational improvements.
We employ more than 6,000 people across the globe who embrace empathy and cultivate collaboration to succeed. And, while we offer great benefits and perks like larger tech companies, our employees have the independence to make a larger impact on the company and take ownership of their work. Join the team and create the future of customer experience together.
The Genesys Cloud Analytics platform is the foundation on which decisions are made that directly impact our customers' experience as well as their customers' experiences. We are a data-driven company, handling tens of millions of events per day to answer questions for both our customers and the business. From new features that enable other development teams, to measuring performance across our customer base, to offering insights directly to our end users, we use our terabytes of data to move customer experience forward.
In this role, you will be a technical leader, bringing your expertise to our Batch Analytics team, which manages EMR pipelines on Airflow that process petabytes of data. We're all about scale.
The ideal candidate will have a strong engineering background, won't shy away from the unknown, and will be able to turn vague requirements into something real. We are a team whose focus is to operationalize big data products and curate high-value datasets for the wider organization, and to build tools and services that expand the scope and improve the reliability of the data platform as our usage continues to grow.
Summary:
- Build and manage large-scale pipelines using Spark and Airflow; a minimal DAG sketch follows this list.
- Develop and deploy highly-available, fault-tolerant software that will help drive improvements towards the features, reliability, performance, and efficiency of the Genesys Cloud Analytics platform.
- Actively review code, mentor, and provide peer feedback.
- Engineer efficient, adaptable and scalable architecture for all stages of data lifecycle (ingest, streaming, structured and unstructured storage, search, aggregation) in support of a variety of data applications.
- Build abstractions and re-usable developer tooling to allow other engineers to quickly build streaming/batch self-service pipelines.
- Build, deploy, maintain, and automate large global deployments in AWS.
- Troubleshoot production issues and come up with solutions as required.
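As noted in the first summary bullet, here is a minimal Airflow 2.x DAG sketch that launches Spark batch jobs daily. The job scripts, bucket, and task names are assumptions, not Genesys internals.
```python
# Minimal Airflow 2.x DAG: two Spark batch jobs run daily in sequence.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "batch-analytics",
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="daily_batch_analytics",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # one run per day; Airflow templates {{ ds }} below
    catchup=False,
    default_args=default_args,
) as dag:
    aggregate = BashOperator(
        task_id="aggregate_events",
        bash_command=(
            "spark-submit --deploy-mode cluster "
            "s3://example-bucket/jobs/aggregate_events.py --date {{ ds }}"
        ),
    )

    publish = BashOperator(
        task_id="publish_dataset",
        bash_command=(
            "spark-submit --deploy-mode cluster "
            "s3://example-bucket/jobs/publish_dataset.py --date {{ ds }}"
        ),
    )

    # publish only runs once aggregation has succeeded
    aggregate >> publish
```
In practice a team like this might use a dedicated EMR operator rather than shelling out to spark-submit, but the dependency structure and daily scheduling pattern are the same.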
This may be the perfect job for you if:
- You have engineered scalable software using big data technologies (e.g., Hadoop, Spark, Hive, Presto, Elasticsearch).
- You have a strong engineering background with the ability to design software systems from the ground up.
- You have expertise in Java. Python and other object-oriented languages are a plus.
- You have experience in web-scale data and large-scale distributed systems, ideally on cloud infrastructure.
- You have a product mindset. You are energized by building things that will be heavily used.
- You are open to mentoring and collaborating with junior members of the team.
- You are adaptable and open to exploring new technologies and prototyping solutions at a reasonable cadence.
- You design not just with a mind for solving a problem, but also with maintainability, testability, monitorability, and automation as top concerns.
Technologies we use and practices we hold dear:
- Right tool for the right job over we-always-did-it-this-way.
- We pick the language and frameworks best suited for specific problems.
- Ansible for immutable machine images.
- AWS for cloud infrastructure.
- Automation for everything. CI/CD, testing, scaling, healing, etc.
- Hadoop and Spark for batch processing.
- Airflow for orchestration.
- Dynamo, Elasticsearch, Presto, and S3 for query and storage.
If a Genesys employee referred you, please use the link they sent you to apply.
About Genesys:
Genesys empowers more than 8,000 organizations worldwide to create the best customer and employee experiences. With agentic AI at its core, Genesys Cloud is the AI-Powered Experience Orchestration platform that connects people, systems, data and AI across the enterprise. As a result, organizations can drive customer loyalty, growth and retention while increasing operational efficiency and teamwork across human and AI workforces. To learn more, visit the Genesys website.
Reasonable Accommodations:
If you require a reasonable accommodation to complete any part of the application process, or are limited in your ability to access or use this online application and need an alternative method for applying, you or someone you know may contact us at
You can expect a response within 24–48 hours. To help us provide the best support, click the email link above to open a pre-filled message and complete the requested information before sending. If you have any questions, please include them in your email.
This email is intended to support job seekers requesting accommodations. Messages unrelated to accommodation—such as application follow-ups or resume submissions—may not receive a response.
Genesys is an equal opportunity employer committed to fairness in the workplace. We evaluate qualified applicants without regard to race, color, age, religion, sex, sexual orientation, gender identity or expression, marital status, domestic partner status, national origin, genetics, disability, military and veteran status, and other protected characteristics.
Please note that recruiters will never ask for sensitive personal or financial information during the application phase.
Senior Big Data Engineer
Posted today
Job Description
New Roles - Senior Big Data Engineers in Galway - permanent or 11-month contract
Location: Galway - hybrid working - suited to candidates within commutable distance
Are you ready to take your software engineering career to the next level? Our client is on the lookout for a passionate and skilled Senior Software Engineer to join their innovative team. This is your chance to work on a highly strategic initiative focused on developing cutting-edge performance measurement and analytics software.
Why Join?
At our client, they believe in fostering a culture of collaboration, creativity, and continuous learning. Here, you'll have the opportunity to work with a diverse range of technologies, allowing you to leverage your existing skills while also expanding your knowledge.
What You'll Do:
Collaborate with a dynamic team of talented engineers to design and develop scalable, robust data platforms.
Utilise your experience in database technologies, particularly Snowflake and Oracle, to enhance our performance measurement capabilities (an illustrative query sketch follows this list).
Engage in Object-Oriented Software development using Java and apply your hands-on experience with Spark (Java or Scala).
Contribute to building efficient ETL data flows and ensure high-quality software delivery through DevOps practices.
Work in an agile scrum development environment, promoting innovative solutions and best practices.
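As referenced above, here is an illustrative sketch of querying Snowflake from Python with the official snowflake-connector-python package. The account, credentials, and table are placeholders, and the role itself leans on Java/Scala rather than Python.
```python
# Illustrative Snowflake query: a monthly performance rollup.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",                   # placeholder identifier
    user="etl_service",
    password=os.environ["SNOWFLAKE_PASSWORD"],   # never hard-code credentials
    warehouse="ANALYTICS_WH",
    database="PERF",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # A typical performance-measurement rollup over a fact table.
    cur.execute(
        """
        SELECT portfolio_id,
               DATE_TRUNC('month', as_of_date) AS month,
               SUM(return_bps) AS total_return_bps
        FROM daily_returns
        GROUP BY portfolio_id, month
        ORDER BY portfolio_id, month
        """
    )
    for portfolio_id, month, total in cur.fetchall():
        print(portfolio_id, month, total)
finally:
    conn.close()
```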
The Expertise We're Looking For:
Bachelor's or Master's Degree in a technology-related field (e.g., Engineering, Computer Science) with at least 5 years of design and development experience.
Strong expertise in database technologies, particularly Snowflake and Oracle.
Proficiency in Object-Oriented Software development with Java.
Hands-on experience with Spark (Java or Scala).
Familiarity with AWS EMR is a plus.
Experience with Cloud technologies (AWS), including Docker and EKS.
Proven ability to build scalable and robust ETL data flows.
Strong design and analysis skills for large data platforms.
Familiarity with DevOps tools and practices (Maven, Jenkins, GitHub, Terraform, Docker).
Excellent interpersonal, communication, and collaboration skills.
What's In It For You?
A vibrant workplace that encourages sharing and collaboration.
Opportunities for growth and continuous learning in a supportive environment.
The chance to contribute to innovative projects that impact the financial industry.
If you are a motivated individual who thrives in a collaborative setting and is excited about leveraging your skills to drive success, we want to hear from you!
Join us and be a part of a team that values your input and expertise.
Apply Today
Embrace the opportunity to make a difference in a dynamic workplace. Let's shape the future together!
Adecco is a disability-confident employer. It is important to us that we run an inclusive and accessible recruitment process to support candidates of all backgrounds and all abilities to apply. Adecco is committed to building a supportive environment for you to explore the next steps in your career. If you require reasonable adjustments at any stage, please let us know and we will be happy to support you.
Adecco Ireland is acting as an Employment Agency in relation to this vacancy.
Big Data Operations Engineer
Posted today
Job Description
Job Responsibilities:
- Responsible for the operation, maintenance, deployment, management, scaling, and optimization of core big data storage components. Ensure service stability and availability, and identify and resolve performance bottlenecks.
- Support various business teams regarding the use of big data components and assist in technical solution selection.
- Responsible for the operation and maintenance of the big data platform, ensuring the stability and availability of both real-time and offline services, and identifying and resolving performance issues.
- Promote the development of automated operation platforms to enhance the efficiency of operational work.
Job Requirements:
- Familiar with the Hadoop/Elasticsearch ecosystem; possess a solid understanding of mainstream distributed development suites such as Elasticsearch/HBase/Hive/Kafka/Zookeeper/Yarn/MR/Spark/Flink. Experience in installation, operation, and performance tuning is preferred.
- Proficient in basic command operations of Linux-based operating systems and capable of writing scripts for daily operational tasks; a small example script follows this list.
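Here is a small example of the kind of daily operational script mentioned above: polling Elasticsearch cluster health and exiting non-zero on degradation. The endpoint is a placeholder for an internal cluster address.
```python
# Poll Elasticsearch cluster health and flag anything that is not green.
import sys
import requests

ES_HEALTH_URL = "http://es-coordinator.internal:9200/_cluster/health"

def check_cluster() -> int:
    resp = requests.get(ES_HEALTH_URL, timeout=5)
    resp.raise_for_status()
    health = resp.json()

    status = health["status"]          # "green", "yellow", or "red"
    unassigned = health["unassigned_shards"]

    print(f"status={status} nodes={health['number_of_nodes']} "
          f"unassigned_shards={unassigned}")

    # Non-zero exit codes let cron/alerting treat degradation as a failure.
    if status == "red":
        return 2
    if status == "yellow" or unassigned > 0:
        return 1
    return 0

if __name__ == "__main__":
    sys.exit(check_cluster())
```
Run from cron or a monitoring agent, the exit code maps cleanly onto OK/WARN/CRIT alerting levels.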
Staff DevOps Engineer - Big Data
Posted 8 days ago
Job Description
We are seeking a skilled engineer with exceptional DevOps skills to join our team. Responsibilities include automating and scaling Big Data and Analytics technology stacks on Cloud infrastructure, building CI/CD pipelines, setting up monitoring and alerting for production infrastructure, and keeping our technology stacks up to date.
**What you get to do in this role:**
+ Deploy, scale, and manage containerized Big Data applications using Kubernetes, Docker, and other related tools.
+ Proactively identify and resolve issues within Kubernetes clusters, containerized applications, and data pipelines. Provide expert-level support for incidents and perform root cause analysis (a brief triage sketch follows this list).
+ DevOps & Automation: Implement CI/CD automation, GitOps workflows, and Infrastructure-as-Code, automating with Python, Bash, and Go for infrastructure management and toil reduction.
+ Responsible for deploying, monitoring, maintaining and supporting Big Data infrastructure on ServiceNow Cloud and Azure environments.
+ Provide production support to resolve critical Big Data pipeline and application issues, mitigating or minimizing any impact on Big Data applications. Collaborate closely with Site Reliability Engineers (SRE), Customer Support (CS), Developers, QA, and System Engineering teams to replicate complex issues, leveraging broad experience with UI, SQL, full-stack, and Big Data technologies.
+ Enforce data governance policies and the Definition of Done (DoD) in all Big Data environments.
+ Strong understanding of traditional relational databases like MySQL or PostgreSQL. Ability to write queries, perform joins, and optimize basic SQL queries.
+ Responsible for MySQL and PostgreSQL operations, and hybrid cloud architectures across cloud and on-prem environments.
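As flagged in the troubleshooting bullet above, here is a hedged sketch using the official `kubernetes` Python client to surface unhealthy pods. The namespace and restart threshold are assumptions.
```python
# List pods in a namespace and report any that are not healthy,
# the kind of cluster triage described above.
from kubernetes import client, config

def report_unhealthy_pods(namespace: str = "big-data") -> None:
    # Loads credentials from ~/.kube/config; inside a cluster you would
    # call config.load_incluster_config() instead.
    config.load_kube_config()
    v1 = client.CoreV1Api()

    for pod in v1.list_namespaced_pod(namespace).items:
        phase = pod.status.phase
        if phase not in ("Running", "Succeeded"):
            print(f"{pod.metadata.name}: phase={phase}")
            continue
        # A Running pod can still have crash-looping containers.
        for cs in pod.status.container_statuses or []:
            if cs.restart_count > 5:
                print(f"{pod.metadata.name}/{cs.name}: "
                      f"{cs.restart_count} restarts")

if __name__ == "__main__":
    report_unhealthy_pods()
```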
**To be successful in this role you have:**
+ Bachelor's degree, or equivalent work experience
+ 8+ years of experience in DevOps or Site Reliability Engineering
+ Proficiency in CI/CD tools (Jenkins, Maven) and source control systems (GitLab, Bitbucket, or GitHub).
+ Working experience in artifact repositories (Nexus, Artifactory).
+ Demonstrated expertise in automation, testing, and operational best practices.
+ Strong experience with Kubernetes and Docker
+ Experience with Ansible and/or Terraform
+ Extensive experience with Big Data technologies: Hadoop, Kafka, Spark, Airflow, Hive, HDFS, Impala, Hue, VictoriaMetrics, Trino.
+ Strong background in Linux/Unix
+ Experience with Python, Bash scripting, and Helm charts
+ Experience working with monitoring and alerting tools such as Grafana and being part of on-call rotations
+ AI literacy and curiosity: you have either tried Gen AI in or outside of work, or are curious about Gen AI and have explored it.
**Work Personas**
We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories assigned to ServiceNow employees depending on the nature of their work and their assigned work location. To determine eligibility for a work persona, ServiceNow may confirm the distance between your primary residence and the closest ServiceNow office using a third-party service.
**Equal Opportunity Employer**
ServiceNow is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law. In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements.
**Accommodations**
We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact for assistance.
**Export Control Regulations**
For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities.
Data Management
Posted today
Job Description
At U.S. Bank, we're on a journey to do our best. Helping the customers and businesses we serve to make better and smarter financial decisions, enabling the communities we support to grow and succeed in the right ways, all more confidently and more often—that's what we call the courage to thrive. We believe it takes all of us to bring our shared ambition to life, and each person is unique in their potential. A career with U.S. Bank gives you a wide, ever-growing range of opportunities to discover what makes you thrive. Try new things, learn new skills and discover what you excel at—all from Day One.
As a wholly owned subsidiary of U.S. Bank, Elavon is committed to building the platforms and ecosystems that help over 1.5 million customers around the world to achieve their financial goals—no matter what they need. From transaction processing to customer service, to driving innovation and launching new products, we're building a range of tailored payment solutions powered by the latest technology. As part of our team, you can explore what motivates and energizes your career goals: partnering with our customers, our communities, and each other.
Job Description
U.S. Bank Global Corporate Trust Services is one of the largest providers of corporate trust services in the world. Our clients look to us for trustee, agency, escrow, document custody and money market issuing services via our 48 domestic offices and three international offices.
We are currently recruiting for our European Corporate Trust business within the Data Management & Control Group. This team is responsible for a variety of tasks, including but not limited to: ensuring client compliance with deal documents, ensuring receivables are paid fully and in a timely manner, liaising with the Finance department, ensuring clients' financials and compliance certificates are provided in a timely manner, and completing DMC projects to the level required by senior management within given timeframes.
Further to this, the team is responsible for deal set-up across various platforms (ACS, STA, CTAO, SEI, ABS Trans, VIPR, PIVOT, Issue Tracker) from deal inception, supporting testing across the various platforms, and gathering tax documentation and billing, among other tasks.
Essential Functions:
- Deal Document Oversight – Ensuring that information required from clients relating to deal documents is received and documented correctly in a timely fashion, escalating any issues arising to Relationship Management in a time-sensitive manner.
- Management of aged receivables process – interacting with Relationship Managers, Transaction Managers, Client, Finance and Admin groups to ensure aged receivables are paid fully in a timely manner.
- Deal Onboarding – Set up of deals across various systems, gathering of tax documentation, fee and new deal billing set up
- Queries & Escalations – Acting as escalation contact for business line queries and requests between the business line and client correspondents.
- Participate in Bank projects and UAT testing as required
- Assist with Business line reporting
- Escalating of issues in a timely manner to management
- Change Management – continuously looking for improvements, efficiencies and enhanced controls in DMC processes.
- Completion of IAR new deal and termination reviews, inclusive of clearing exceptions
- Ability to work on one's own initiative with 100% accuracy
- Understanding of regulations and risk attached to the role and when to escalate to avoid issues
Basic Qualifications
- Bachelor's degree in accounting or finance, or equivalent work experience
- Three to five years of experience in trust and securities operational functions
- Three to five years of management experience
Preferred Skills/Experience
- Good knowledge of trust and securities operational functions, systems, procedures, products and services
- Good knowledge and understanding of legal, regulatory and accounting principles which directly affect Wealth Management & Securities Services business lines and clients
- Well-developed analytical, problem-solving, organizational and project management skills
- Effective interpersonal, verbal and written communication skills
- Excellent supervisory and management skills, including a well-developed knowledge of human resources
- Ability to manage multiple, unrelated tasks
- Excellent verbal and written communication skills
- Understanding of the importance of timely and correct escalation
- Ability to create, implement and adhere to controls
- Working knowledge of Corporate Trust and its products
- Experience with receivables and Deal documentation
If there's anything we can do to accommodate a disability during any portion of the application or hiring process, please refer to our disability accommodations for applicants.
Benefits:
We offer an exciting, fast-paced and diverse working environment with employees of many different nationalities. We provide benefits to help you protect your health and financial security; and give you peace of mind. We also invest in your career growth with development resources that give you the opportunity to stretch and shine.
Posting may be closed earlier due to high volume of applicants.
Analyst - Data Management
Posted today
Job Description
Your career at Deutsche Börse Group
Your area of work
The Data Management Group assumes responsibility for the research, update and maintenance of all static data in our database relating to funds and transfer agents. The main purpose of the role is to set up new funds, review and investigate all incoming queries, and maintain fund rules on both our Vestima & Vestima Prime platforms. We also capture Agent Codes and set up new Trading Chains on Vestima.
This role will involve investigation and problem solving, ensuring all requests are responded to in a timely and accurate manner.
Your Responsibilities
Responsible for the set-up of new funds and the research, update and maintenance of all static information, ensuring all information is updated in a timely and accurate manner. The candidate will be working in a team environment which interacts with both internal and external parties and involves a diverse range of functions, some of which are outlined below:
- Set up of new funds and maintenance of fund information on both our Vestima & Vestima Prime Platforms
- Set up and maintenance of Administrator and Transfer Agent Information
- Source Agent Codes and set up Trading Chains on Vestima for Client Portfolios & BAU cases
- Ensure that all client queries in the team are dealt with in a prompt & professional manner
- Liaise with Transfer Agents and Clients to resolve complex cases
- Ensure that only accurate information received from the market is updated on our systems
- The candidate must be able to work to tight deadlines, be accurate, and have the ability to work on their own initiative
Your profile
- The ideal candidate will have 1 year's experience in the funds industry
- Third level qualification in Business, Accounting or Finance related area
- Proficient with MS Office (Outlook, Excel, Word and Access)
- Proactive with the ability to work on own initiative
- Strong organisational skills and excellent attention to detail
- Ability to work under pressure to meet deadlines
- Strong problem solver with good analytical skills
- Excellent interpersonal and communication skills
- Taking on responsibility
- Results orientation
- Integrated thinking
- Problem solving skills
- Communication skills
- Planning/Organisation
- Willingness to learn
- Quick learner
- Proficient across multiple Core Systems and local applications
NOTE:
This job description is not intended to be all-inclusive. Employees may perform other related duties to meet the ongoing needs of the organization.
Company Culture Cork
Our PEOPLE, our focus on RESULTS, and our commitment to our CUSTOMERS and COMMUNITY drive our success. Candidates must demonstrate an ability to understand and apply these four key elements (the building blocks) that shape our culture at Clearstream Cork: People, Customers, Results and Community. #Clearstream
Software Engineer – Data Management
Posted today
Job Description
Software Engineer – Data Management
PGIM Fixed Income
A GLOBAL LEADING ASSET MANAGER WITH A DIVERSE & INCLUSIVE CULTURE
As the Global Asset Management business of Prudential, we're always looking for ways to improve financial services. We're passionate about making a meaningful impact - touching the lives of millions and solving financial challenges in an ever-changing world.
We also believe talent is key to achieving our vision and are intentional about building a culture on respect and collaboration. When you join PGIM, you'll unlock a motivating and impactful career – all while growing your skills and advancing your profession at one of the world's leading global asset managers.
If you're not afraid to think differently and challenge the status quo, come and be a part of a dedicated team that's investing in your future by shaping tomorrow today.
At PGIM, You Can
What you will do
Our Technology Solutions Group is a dynamic, fast-paced environment, with exciting changes on the horizon under new senior leadership. We are looking for you to support and build out a scalable data platform for our Front, Back, and Middle Office stakeholder groups.
As a data engineer, you will design and develop robust data pipelines using Python and Java, alongside building scalable APIs to expose data services (a minimal sketch of such a service follows the list below). Collaborating with cross-functional teams, you'll translate business requirements into an efficient, high-quality, and scalable data platform. In addition, you will be responsible for providing Level-3 support for production incidents. We want you to see this challenge as a unique and valuable opportunity, so if this sounds interesting, then PGIM could be the place for you.
This position is performed in a hybrid manner with a mix of work performed from our office in Letterkenny, Ireland and remotely.
What you can expect
- Build applications ensuring that the code follows modern coding practices and industry standards, using best design patterns and architectural principles.
- Develop high-quality, well-documented, and efficient code adhering to all applicable company standards.
- Collaborate with tech leads to define technical designs and work with other team members to understand the system end-to-end.
- Partner with product owners to understand business needs, define feature stories, and deliver robust solutions with real business impact.
- Troubleshoot and resolve production incidents and service requests in a timely fashion.
- Develop unit tests, integration tests, and functional automation, researching and resolving problems discovered by quality assurance or product support.
- Work on complex problems requiring analytical skills and the ability to evaluate intangible variables.
- Identify opportunities to simplify the application development toolset, reducing unnecessary complexity and streamlining processes.
- Maintain a consistent feedback loop with development teams to champion modern technology adoption and decommissioning of legacy stacks.
- Develop data pipelines using programming languages, including but not limited to Java and Python.
- Work as part of a delivery team, collaborating with others to understand requirements, analyse and refine stories, design solutions, implement them, test them, and support them in production.
- Ensure that the software you build is reliable and easy to support in production, providing Level-3 support for production issues when needed.
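Here is the minimal data-service sketch referenced in the overview above: a FastAPI endpoint reading from Microsoft SQL Server via pyodbc. The connection string, table, and route are placeholders, not PGIM systems.
```python
# Tiny data-service API: expose positions data over HTTP.
import pyodbc
from fastapi import FastAPI, HTTPException

app = FastAPI(title="positions-data-service")

# Placeholder connection string; a real service would load this from config.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql.internal.example;DATABASE=FixedIncome;"
    "Trusted_Connection=yes;"
)

@app.get("/positions/{portfolio_id}")
def get_positions(portfolio_id: str):
    conn = pyodbc.connect(CONN_STR)
    try:
        cursor = conn.cursor()
        # Parameterised query guards against SQL injection.
        cursor.execute(
            "SELECT security_id, quantity, market_value "
            "FROM dbo.Positions WHERE portfolio_id = ?",
            portfolio_id,
        )
        rows = cursor.fetchall()
    finally:
        conn.close()

    if not rows:
        raise HTTPException(status_code=404, detail="portfolio not found")
    return [
        {
            "security_id": row.security_id,
            "quantity": float(row.quantity),
            "market_value": float(row.market_value),
        }
        for row in rows
    ]
```
A production version would add a connection pool, authentication, and monitoring hooks, in line with the support responsibilities the posting describes.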
What you will bring
- 3+ years of experience developing software applications with a primary focus on Java or Python
- Exposure to building applications and APIs using the Python programming language and its frameworks, libraries, and packages.
- Experience in writing and testing scalable code, debugging programs, and integrating applications with third-party web services.
- Fluency with relational databases, with experience in Microsoft SQL Server.
- Knowledge of Java enterprise development using Spring Boot, Spring Framework and REST APIs is a plus.
- Knowledge of building cloud-based applications on AWS or Azure.
- Knowledge of best practices for monitoring and supporting business critical processes and systems.
- Excellent analytical and problem-solving skills with the ability to think quickly and offer alternatives both independently and within teams.
What you will need
- 3+ years of experience working in a technology role, preferably in software engineering, BSA/QA, and/or application support.
- Knowledge of Fixed Income Asset management environments, for example trade lifecycle, operations, compliance, regulation, risk, financial reporting will be a plus.
- A hunger for continuous learning, constantly looking for opportunities to improve upon the status quo.
- Strong communication skills and an enthusiastic team player.
- Strong research, analytical, investigation and troubleshooting skills.
- A tenacious sense of ownership and a desire to bring incidents to resolution quickly.
- Experience with writing database queries using SQL (MS SQL preferred)
- Excellent documentation skills with ability to document processes, requirements, incident resolution steps etc.
What will set you apart?
- Experience working in Cloud technologies (AWS and/or Azure)
- Hands-on Java development experience in enterprise applications.
- Financial industry experience, specifically Fixed Income asset management.
- Direct experience supporting front, back and middle office teams (Investment, Operations, Compliance, Client Reporting, Data Governance).
- Ability to prioritize effectively within a dynamic global environment.
- Strong team player, results oriented with a flexible approach.
- Good interpersonal and communication skills, with excellent relationship building skills.
- Comfortable operating in high pressure environments whilst managing multiple incidents.
- Ability to deal with and navigate difficult situations through strong teamwork.
- PGIM welcomes all applicants, even if you don't meet every requirement. If your skills align with the role, we encourage you to apply.
What We Offer You
- Health Insurance: PGIM Ireland partners with Laya and BUPA to provide health insurance schemes that cover eligible employees' day-to-day medical and hospital expenses.
- Annual Leave of 23 days at full pay.
- Pension Scheme: Members of the scheme can contribute up to 8% of salary per annum and PGIM Ireland matches contributions up to 8% of salary. Members can also make voluntary contributions to the scheme.
- Annual Bonus Programme & Shop LK Vouchers / CleverCards (subject to eligibility): Along with an annual bonus employees are rewarded with Shop LK Vouchers/ CleverCards which are paid tax free.
- Life Assurance: fully paid by PGIM Ireland, employees are covered from their start date and beneficiaries are provided with a lump sum of four times an employee's salary.
- Education Assistance: PGIM Ireland have an Education Assistance Programme that reimburses eligible employees for furthering their education.
About PGIM Fixed Income
PGIM Fixed Income is a global asset manager offering active solutions across all fixed income markets.
Our business climate is a safe inclusive environment, centered around mutual respect, intellectual honesty, transparency, and teamwork. Our leaders are focused on talent & culture; dedicated to fostering growth & development at all levels to develop the industry leaders of tomorrow.
Prudential Financial, Inc. is focused on creating a fully inclusive culture, where all employees feel comfortable bringing their authentic selves to work. We don't just accept difference—we celebrate it, support it, and thrive on it. At Prudential, employees have a unique opportunity to build their career path by owning their development, their career and their future. We encourage employees to hone their skills and explore continued opportunities within Prudential. For more information, please visit PGIM Fixed Income
PGIM Ireland is proud to be an equal opportunity employer and is committed to equal employment opportunity regardless of applicants' gender, civil status, family status, sexual orientation, religion, age, disability, race or membership of the traveler community. PGIM Ireland's aim is to hire the best people for the open roles and all appointments will be made on merit.
Any offer of employment made by PGIM Ireland will be contingent on receiving satisfactory references. Applicants should be aware that background checks will be carried out on all candidates offered a position within PGIM Ireland.
PGIM Ireland has been awarded the IBEC Keep Well Accreditation Mark, this is in recognition of our commitment to making our employees wellbeing a priority.
Master Data Management Specialist
Posted today
Job Description
Job Title: Material Master Data Management Specialist
Location: Arklow, Ireland (3 days/week on-site)
Duration: Until March 2026
Role Overview
Seeking a Material Master Data Specialist to support daily operations and migration activities, including data validation and user acceptance testing (UAT).
Key Responsibilities
- Maintain and manage material master data in SAP.
- Support data migration (cleansing, mapping, validation); a small validation sketch follows this list.
- Execute and document UAT for migrated data.
- Collaborate with business and IT teams to resolve data issues.
- Ensure data accuracy and compliance with governance standards.
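Here is the small validation sketch referenced above: a pandas pass over a material master extract that flags empty mandatory fields and duplicate material numbers before load. Column names follow common SAP MARA conventions but are assumptions here.
```python
# Pre-load validation for a material master extract.
import pandas as pd

# Material number, description, base unit, material type (MARA-style names).
MANDATORY = ["MATNR", "MAKTX", "MEINS", "MTART"]

def validate_extract(path: str) -> pd.DataFrame:
    """Return a DataFrame of issues found in a material master extract."""
    df = pd.read_csv(path, dtype=str)
    issues = []

    # Flag rows where any mandatory field is empty or missing.
    for col in MANDATORY:
        bad = df[col].isna() | (df[col].str.strip() == "")
        for matnr in df.loc[bad, "MATNR"]:
            issues.append({"MATNR": matnr, "issue": f"missing {col}"})

    # Flag duplicate material numbers, which would collide on load.
    dupes = df[df.duplicated(subset="MATNR", keep=False)]
    for matnr in dupes["MATNR"].dropna().unique():
        issues.append({"MATNR": matnr, "issue": "duplicate material number"})

    return pd.DataFrame(issues)

if __name__ == "__main__":
    report = validate_extract("material_master_extract.csv")
    print(report.to_string(index=False) if not report.empty
          else "No issues found.")
```
The issues report doubles as UAT evidence: each flagged row can be traced back to the source extract and signed off once cleansed.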
Requirements
- Experience with SAP Material Master Data (ECC or S/4HANA).
- Background in data migration and UAT.
- Strong attention to detail and communication skills.
- Available 3 days per week on-site in Arklow.