40 Data Engineer jobs in Ireland

Data Engineer

Dublin, Leinster Reperio Human Capital Ltd

Posted 2 days ago

Job Description

Data Engineer | Salary: € | Fully remote

My client is seeking a talented Data Engineer to join their team as part of their major growth plans. You will develop and optimise ETL processes on the Azure stack, including Azure Data Factory (ADF) and Synapse Analytics, and play an important part in building scalable, secure, and high-performing data solutions across the Microsoft Azure ecosystem.

Requirements:
+ 3+ years of experience in a Data Engineering or similar role
+ Strong knowledge of Azure cloud services for data architecture
+ Hands-on experience building data pipelines and integrating structured and unstructured data
+ Proficient in SQL and Python (or Scala)
+ Solid understanding of data modelling, data warehousing, and modern data platform best practices
+ Experience with orchestration and monitoring tools (e.g. Azure Monitor, Log Analytics)
+ Comfortable working in Agile/Scrum teams

Benefits:
+ Pension
+ Healthcare
+ Bonus
+ Generous annual leave entitlement with the option to buy more
+ Fully remote working

If this role as a Data Engineer interests and suits you, apply using the link below. If you require any further information, get in touch with Jamie Sadlier at Reperio. Reperio Human Capital acts as an Employment Agency and an Employment Business.

Skills: Data Engineer, ETL, Azure, SQL, Python, Fully Remote
Benefits: Work From Home
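
The pipeline work described above centres on ADF for orchestration and Synapse for transformation. As a rough, hypothetical illustration of one such step (the lake paths, column names, and schema below are invented for the example, not taken from the advert), a Synapse-style PySpark cell might look like this:

```python
# Minimal sketch of one ETL step on the Azure stack: read raw JSON from the
# data lake, standardise a few columns, and write a curated Parquet layer.
# All paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount_eur", F.col("amount_eur").cast("double"))
       .filter(F.col("amount_eur").isNotNull())
)

(curated.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```

In an ADF pipeline this kind of notebook or Spark job would typically be one activity, scheduled and monitored by the factory rather than run by hand.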

Data Engineer

D18 Dublin, Leinster Fulcrum Digital

Posted 11 days ago

Job Description

Permanent

Overview

We are the global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Our team within Client's: The Services org is a key differentiator for Client's, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services.

Requirements

Advanced Analytics Program: Within the Services Technology Team, the Advanced Analytics program is a relatively new program that is comprised of a rich set of products that provide accurate perspectives on Credit Risk, Portfolio Optimization, and Ad Insights. Currently, we are enhancing our customer experience with new user interfaces, moving to API and web application-based data publishing to allow for seamless integration in other Client's products and externally, utilizing new data sets and algorithms to further analytic capabilities, and generating scalable big data processes. We are looking for an innovative data engineer who will lead the technical design and development of an Analytic Foundation. The Analytic Foundation is a suite of individually commercialized analytical capabilities that also includes a comprehensive data platform. These services will be offered through a series of APIs that deliver data and insights from various points along a central data store. This individual will partner closely with other areas of the business to build and enhance solutions that drive value for our customers. Engineers work in small, flexible teams. Every team member contributes to designing, building, and testing features. The range of work you will encounter varies from building intuitive, responsive UIs to designing backend data models, architecting data flows, and beyond. There are no rigid organizational structures, and each team uses processes that work best for its members and projects. Here are a few examples of products in our space:
• Portfolio Optimizer (PO) is a solution that leverages Client's data assets and analytics to allow issuers to identify and increase revenue opportunities within their credit and debit portfolios.
• Audiences uses anonymized and aggregated transaction insights to offer targeting segments that have a high likelihood to make purchases within a category, to allow for more effective campaign planning and activation.
• Credit Risk products are a new suite of APIs and tooling that provide lenders real-time access to KPIs and insights, serving thousands of clients making smarter risk decisions using Client's data.
Help found a new, fast-growing engineering team!
Position Responsibilities: As a Data Engineer within the Advanced Analytics team, you will:
• Play a large role in the implementation of complex features
• Push the boundaries of analytics and powerful, scalable applications
• Build and maintain analytics and data models to enable performant and scalable products
• Ensure a high-quality code base by writing and reviewing performant, well-tested code
• Mentor junior engineers and teammates
• Drive innovative improvements to team development processes
• Partner with Product Managers and Customer Experience Designers to develop a deep understanding of users and use cases, and apply that knowledge to scoping and building new modules and features
• Collaborate across teams with exceptional peers who are passionate about what they do
Ideal Candidate Qualifications:
• 4+ years of full-stack engineering experience in an agile production environment
• Experience leading the design and implementation of large, complex features in full-stack applications
• Ability to easily move between business, data management, and technical teams; ability to quickly intuit the business use case and identify technical solutions to enable it
• Experience leveraging open-source tools, predictive analytics, machine learning, advanced statistics, and other data techniques to perform analyses
• High proficiency in Python or Scala, Spark, Hadoop platforms and tools (Hive, Impala, Airflow, NiFi, Sqoop), and SQL to build Big Data products and platforms (a minimal orchestration sketch follows this list)
• Experience building and deploying production-level data-driven applications and data processing workflows/pipelines, and/or implementing machine learning systems at scale in Java, Scala, or Python, and delivering analytics involving all phases: data ingestion, feature engineering, modeling, tuning, evaluating, monitoring, and presenting
• Experience with cloud technologies such as Databricks/AWS/Azure
• Strong technologist with a proven track record of learning new technologies and frameworks
• Customer-centric development approach
• Passion for analytical/quantitative problem solving
• Experience identifying and implementing technical improvements to development processes
• Collaboration skills with experience working with people across roles and geographies
• Motivation, creativity, self-direction, and desire to thrive on small project teams
• Superior academic record with a degree in Computer Science or a related technical field
• Strong written and verbal English communication skills
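
The qualifications above name Airflow among the orchestration tools for this stack. Purely as an illustrative sketch (the DAG id, scripts, and schedule are hypothetical and not from the posting), a minimal daily Airflow DAG that chains an ingestion step and a Spark transform could look like this:

```python
# Sketch of a daily Airflow DAG: ingest raw files, then run a Spark-based
# transformation. Task names and the commands they run are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="analytic_foundation_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_files",
        bash_command="python /opt/jobs/ingest_raw.py --date {{ ds }}",
    )
    transform = BashOperator(
        task_id="spark_transform",
        bash_command="spark-submit /opt/jobs/transform.py --date {{ ds }}",
    )
    ingest >> transform
```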


Senior Data Engineer

Dublin, Leinster UnitedHealth Group

Posted 2 days ago

Job Description

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start **Caring. Connecting. Growing together.**
In healthcare, evolution doesn't just happen. It takes innovation, imagination, and a passion for solving problems in new and better ways. And innovation is taking place at a lightning-fast pace every day at Optum. As the fastest growing part of the UnitedHealth Group family of businesses, we're expanding our team in Ireland and creating excellent opportunities for those who want greater purpose and more impact in their work. We'll provide the investment, support, and resources to advance your career. You'll provide the talent, ambition, and drive.
As a **Senior Data Engineer,** you will be working on developing and maintaining data pipelines that extract, transform, and load (ETL) data from various sources into a centralized data storage system, such as a data warehouse or data lake. You will ensure the smooth flow of data from source systems to destination systems while adhering to data quality and integrity standards. In addition to having an impact on a great team, you'll also discover the career opportunities you'd expect from an industry leader.
**Schedule:** Full-time position with standard working hours of Monday - Friday, 9am - 5pm.
_Careers with Optum offer flexible work arrangements and individuals who live and work in the Republic of Ireland will have the opportunity to split their monthly work hours between our Dublin or Letterkenny office and telecommuting from a home-based office in a hybrid work model._
**Primary Responsibilities:**
+ Data Integration: Integrate data from multiple sources and systems, including databases, APIs, log files, streaming platforms, and external data providers. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting
+ Data Transformation and Processing: Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures, handle missing or inconsistent data, and prepare the data for analysis, reporting, or machine learning tasks
+ Contribute to common frameworks and best practices in code development, deployment, and automation/orchestration of data pipelines
+ Implement data governance in line with company standards
+ Partner with Data Analytics and Product leaders to design best practices and standards for developing and productionizing analytic pipelines
+ Partner with Infrastructure leaders on architecture approaches to advance the data and analytics platform, including exploring new tools and techniques that leverage the cloud environment (Azure, Snowflake, others)
+ Monitoring and Support: Monitor data pipelines and data systems to detect and resolve issues promptly. Develop monitoring tools, alerts, and automated error handling mechanisms to ensure data integrity and system reliability
_You will be rewarded and recognised for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as providing development for other roles you may be interested in._
**Required Qualifications:**
+ Bachelor's Degree (or higher) in Database Management, Information Technology, Computer Science or similar
+ Extensive experience designing data solutions including data modeling. Data Architecture experience is a plus
+ Extensive hands-on experience developing data processing jobs (PySpark / SQL) that demonstrate a strong understanding of software engineering principles
+ Experience orchestrating data pipelines using technologies like ADF, Airflow, etc.
+ Well versed in Python for data manipulation, cleaning, transforming, and analyzing structured data to support our data-driven initiatives
+ Experience with Snowflake
+ Fluent in SQL (any flavor), with experience using window functions and more advanced features (a short PySpark sketch follows this list)
+ Experience with DevOps tools, Git workflows, and building CI/CD pipelines
+ Experience applying data governance controls within a highly regulated environment
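
For the window-function requirement above, here is a minimal, hedged sketch written with the PySpark API rather than raw SQL; the member/claims columns and values are invented for the example and are not part of the advert:

```python
# Illustrative window logic: number each member's claims chronologically and
# keep a running total of the amounts. Data and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window_demo").getOrCreate()

claims = spark.createDataFrame(
    [("m1", "2024-01-05", 120.0), ("m1", "2024-02-03", 80.0), ("m2", "2024-01-20", 50.0)],
    ["member_id", "claim_date", "amount"],
)

w = Window.partitionBy("member_id").orderBy("claim_date")

enriched = (
    claims.withColumn("claim_seq", F.row_number().over(w))
          .withColumn("running_total", F.sum("amount").over(
              w.rowsBetween(Window.unboundedPreceding, Window.currentRow)))
)
enriched.show()
```
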
**Preferred Qualifications:**
+ Proven Data Engineering experience
+ Familiarity with frameworks and libraries such as Snowpark
+ Experience working in projects with agile/scrum methodologies
+ Experience with Azure Databricks
+ Familiarity with production quality ML and/or AI model development and deployment
**Soft Skills:**
+ Ability to communicate effectively with users and to understand new and changing requirements
+ A motivated self-starter who excels at managing their own tasks and takes ownership
**Please note you must currently be eligible to work and remain indefinitely without any restrictions in the country to which you are making an application. Proof will be required to support your application.**
_All telecommuters will be required to adhere to the UnitedHealth Group's Telecommuter Policy._
_At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._
_Diversity creates a healthier atmosphere: Optum is an Equal Employment Opportunity employer, and all qualified applicants will receive consideration for employment without regard to gender, civil status, family status, sexual orientation, disability, religion, age, race, and membership of the Traveler community, or any other characteristic protected by law. Optum is a drug-free workplace. © 2025 Optum Services (Ireland) Limited. All rights reserved._
#RPO #BBMEMEA

Senior Data Engineer

Dublin, Leinster UnitedHealth Group

Posted 2 days ago

Job Description

Optum is a global organisation that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start **Caring. Connecting. Growing together.**
In healthcare, evolution doesn't just happen. It takes innovation, imagination, and a passion for solving problems in new and better ways. And innovation is taking place at a lightning-fast pace every day at Optum. As the fastest growing part of the UnitedHealth Group family of businesses, we're expanding our team in Ireland and creating excellent opportunities for those who want greater purpose and more impact in their work. We'll provide the investment, support, and resources to advance your career. You'll provide the talent, ambition, and drive.
As a **Senior Data Engineer** you will be working on developing and maintaining data pipelines that extract, transform, and load (ETL) data from various sources into a centralized data storage system, such as a data warehouse or data lake. You will ensure the smooth flow of data from source systems to destination systems while adhering to data quality and integrity standards. In addition to having an impact on a great team, you'll also discover the career opportunities you'd expect from an industry leader.
**Schedule:** Full-time position with standard working hours of Monday - Friday, 9am - 5pm.
_Careers with Optum offer flexible work arrangements and individuals who live and work in the Republic of Ireland will have the opportunity to split their monthly work hours between our Dublin or Letterkenny office and telecommuting from a home-based office in a hybrid work model._
**Primary Responsibilities:**
+ Data Integration: Integrate data from multiple sources and systems, including databases, APIs, log files, streaming platforms, and external data providers. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting
+ Data Transformation and Processing: Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures, handle missing or inconsistent data, and prepare the data for analysis, reporting, or machine learning tasks
+ Contribute to common frameworks and best practices in code development, deployment, and automation/orchestration of data pipelines
+ Implement data governance in line with company standards
+ Partner with Data Analytics and Product leaders to design best practices and standards for developing and productionizing analytic pipelines
+ Partner with Infrastructure leaders on architecture approaches to advance the data and analytics platform, including exploring new tools and techniques that leverage the cloud environment (Azure, Snowflake, others)
+ Monitoring and Support: Monitor data pipelines and data systems to detect and resolve issues promptly. Develop monitoring tools, alerts, and automated error handling mechanisms to ensure data integrity and system reliability
_You will be rewarded and recognised for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as providing development for other roles you may be interested in._
**Required Qualifications:**
+ Bachelor's Degree (or higher) in Database Management, Information Technology, Computer Science or similar
+ Proven experience designing data solutions, including data modelling
+ Hands-on experience in data processing using PySpark and SQL
+ Experience with orchestration tools like Azure Data Factory or Airflow
+ Fluent in SQL, including advanced functions and window operations
+ Demonstrated experience with DevOps tools and CI/CD pipelines
+ Experience applying data governance controls in regulated environments
+ Well versed in Python for data manipulation, cleaning, transforming, and analyzing structured data to support our data-driven initiatives (a brief pandas sketch follows this list)
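
As a small, hypothetical illustration of the Python data-manipulation bullet above (the column names and cleaning rules are made up for the example, not taken from the advert), a pandas cleaning step might look like this:

```python
# Sketch of structured-data cleaning with pandas: trim identifiers, coerce
# dates, and drop unusable or duplicated rows. Columns are illustrative only.
import pandas as pd

def clean_members(df: pd.DataFrame) -> pd.DataFrame:
    out = df.dropna(subset=["member_id"]).copy()
    out["member_id"] = out["member_id"].astype(str).str.strip()
    out["join_date"] = pd.to_datetime(out["join_date"], errors="coerce")
    out = out.dropna(subset=["join_date"])
    return out.drop_duplicates(subset=["member_id"], keep="last")

raw = pd.DataFrame(
    {"member_id": [" 101", "101", "102", None],
     "join_date": ["2024-01-02", "2024-01-02", "not a date", "2024-03-01"]}
)
print(clean_members(raw))
```
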
**Preferred Qualifications:**
+ Proven Data Engineering experience
+ A motivated self-starter who excels at managing their own tasks and takes ownership
+ Experience working in projects with agile/scrum methodologies
+ Experience with Azure Databricks and Snowflake
+ Familiarity with production quality ML and/or AI model development and deployment
**Soft Skills:**
+ Effective communication skills and the ability to understand and adapt to changing requirements
**Please note you must currently be eligible to work and remain indefinitely without any restrictions in the country to which you are making an application. Proof will be required to support your application.**
_All telecommuters will be required to adhere to the UnitedHealth Group's Telecommuter Policy._
_At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalised groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._
_Diversity creates a healthier atmosphere: Optum is an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to gender, civil status, family status, sexual orientation, disability, religion, age, race, and membership of the Traveller community, or any other characteristic protected by law. Optum is a drug-free workplace. © 2025 Optum Services (Ireland) Limited. All rights reserved._
#RPO #BBMRAQ

Quantitative Data Engineer

Dublin, Leinster Brightwater

Posted 3 days ago

Job Description

Quantitative Data Engineer

About the Company
My client is a global trading and technology firm utilising cutting-edge and innovative technology. They work on highly complex in-house software that runs on the most recent platforms available, solving a wide range of problems. They are currently at the early stages of an initiative that will revolutionise the industry, and they are looking to attract the best talent to their team to achieve their mission.

About the Position
This is a full-time, permanent position based in Dublin City Centre, five days per week on site. The Quantitative Engineering and Data (QED) group in Dublin is a newly formed team focused on developing tools, processes, and datasets to drive data-driven revenue generation at my client's business. The QED Dublin group is on an exciting strategic mission to accelerate innovation across the business. You will collaborate with Quantitative Research, Trading, and Engineering teams both in Dublin and worldwide to analyze data and create datasets for research and trading evaluations. As you grow in the role, you'll develop an understanding of the quantitative trading business, using your data skills to provide valuable contributions to the company.

Key Responsibilities
+ Accountable for supporting research and trading activities by applying a variety of data science, analytics, and engineering expertise.
+ Acquire, clean, and manage data, while developing and maintaining data pipelines.
+ Collaborate with researchers throughout the development process, handling tasks that align with your skill set to boost research efficiency.
+ Enhance the research process by automating tasks or integrating new tools.
+ Gain in-depth knowledge of my client's systems and business operations.

Experience/Requirements
+ A degree in a STEM field, with demonstrated independent work through professional experience, an MSc, or a PhD.
+ Over 3 years of professional experience in Python, including proficiency in NumPy, Pandas, and other scientific libraries.
+ Experience working within a Linux environment.
+ A strong interest in collaborating with researchers to enhance and accelerate their research.
+ Ability to work autonomously, stay self-motivated, and identify opportunities for process improvement.
+ Excellent interpersonal and communication skills, enabling effective collaboration in a quantitative environment.
+ High attention to detail and the ability to adapt to shifting priorities.
+ A keen interest in quantitative trading.

Remuneration Package
Along with a market-leading salary and benefits package, you will enjoy an innovative environment and have access to on-site facilities such as a gym, food catering, and games rooms.

Contact
Please contact us or simply click the apply button. To view all live jobs with Brightwater and market insights, please visit our website.
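
The role leans on NumPy and Pandas for dataset preparation. As a hedged sketch only (the file name, columns, and the one-minute bar choice are assumptions, not details from the advert), a small Python step building a research dataset could look like this:

```python
# Illustrative research-dataset prep: resample tick-level prices to
# one-minute bars and add a simple log-return column. Inputs are hypothetical.
import numpy as np
import pandas as pd

ticks = pd.read_csv("ticks.csv", parse_dates=["ts"]).set_index("ts")

bars = ticks["price"].resample("1min").ohlc()
bars["volume"] = ticks["size"].resample("1min").sum()
bars["log_return"] = np.log(bars["close"]).diff()
bars = bars.dropna()

bars.to_parquet("bars_1min.parquet")
```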

Senior Data Engineer

Dublin, Leinster Reperio Human Capital Ltd

Posted 9 days ago

Job Description

Senior Data Engineer | Location: Dublin | Salary: € | Hybrid

Reperio are working with a fintech company who are seeking an experienced Data Engineer to add to their growing team. The successful candidate will contribute to shaping data architecture, building robust pipelines, and ensuring high data quality across the organization. This is a hands-on role with significant influence on technology choices, design patterns, and team culture.

Requirements:
+ 5+ years of experience in Data Engineering
+ Strong experience with AWS-native tools (e.g., Glue, Lambda, Step Functions)
+ Experienced in Python and SQL
+ Strong understanding of data warehousing, ETL/ELT processes, and cloud architecture
+ Experience with orchestration tools like Airflow or Step Functions
+ Experience with modern data platforms like Databricks or Snowflake
+ Familiarity with CI/CD and version control for data pipelines

Benefits:
+ Pension
+ Health insurance
+ Bonus
+ Flexible hybrid working

If this role as a Senior Data Engineer interests and suits you, apply using the link below. If you require any further information, get in touch with Jamie Sadlier at Reperio. Reperio Human Capital acts as an Employment Agency and an Employment Business.

Skills: Senior Data Engineer, AWS, ETL, Snowflake, Dublin, Hybrid
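
The AWS-native tools listed above (Glue, Lambda, Step Functions) often fit together as event-driven pipelines. As a hypothetical sketch (the Glue job name, event shape, and job argument are placeholders, not from the advert), a Lambda handler that starts a Glue job when a file lands in S3 might be:

```python
# Sketch of an event-driven pipeline trigger: an S3 put event invokes this
# Lambda, which starts a named Glue job with the new object's key.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # Standard S3 event payload: pull the object key from the first record.
    key = event["Records"][0]["s3"]["object"]["key"]
    response = glue.start_job_run(
        JobName="curate_transactions",          # hypothetical job name
        Arguments={"--input_key": key},
    )
    return {"job_run_id": response["JobRunId"]}
```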

Senior Data Engineer

Dublin, Leinster Reperio Human Capital Ltd

Posted 9 days ago

Job Description

Senior Data Engineer | Location: Dublin | Salary: € | Hybrid

My client is on the lookout for a Senior Data Engineer to join their expanding team in Dublin. In this role, you'll lead the development of robust, scalable data pipelines on AWS and Snowflake, ensuring high data quality, availability, and performance across the organization. You'll also be a strategic voice in shaping our data architecture, tools, and practices.

Requirements:
+ 6+ years of data engineering experience with a strong focus on cloud data platforms
+ Proven experience building data pipelines using AWS services: S3, Glue, Lambda, Step Functions, and IAM
+ Expert-level knowledge of Snowflake, including performance tuning, security, and cost optimization
+ Proficient in SQL and Python for data transformation and automation
+ Hands-on experience with orchestration tools (e.g., Airflow or AWS-native alternatives)
+ Familiar with dbt for transformation logic and data modelling (nice to have)
+ Strong understanding of data warehousing concepts and cloud architecture
+ Comfortable working in CI/CD and infrastructure-as-code environments
+ Excellent communication skills and a proactive, solution-driven mindset

Benefits:
+ Pension
+ Healthcare
+ Bonus
+ Share options
+ Flexible hybrid working

If this role as a Senior Data Engineer interests and suits you, apply using the link below. If you require any further information, get in touch with Jamie Sadlier at Reperio. Reperio Human Capital acts as an Employment Agency and an Employment Business.

Skills: Senior Data Engineering, AWS, Snowflake, Python, Dublin, Hybrid
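
Given the emphasis on Snowflake alongside AWS, a common pattern is staging curated files in S3 and loading them with COPY INTO. The following is an illustrative sketch only, using the Snowflake Python connector; the account, warehouse, stage, and table names are invented for the example:

```python
# Sketch of loading staged Parquet files into a Snowflake table.
import snowflake.connector

# Connection parameters are placeholders; in practice they would come from a
# secrets manager or key-pair auth, never hard-coded values.
conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",
    user="ETL_USER",
    password="<from-secrets-manager>",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="CURATED",
)
cur = conn.cursor()
try:
    # Copy Parquet files from an external stage into the target table.
    cur.execute("""
        COPY INTO curated.orders
        FROM @raw_stage/orders/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    cur.close()
    conn.close()
```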

Senior Data Engineer

Galway, Connacht Adecco

Posted 9 days ago

Job Description

Our client, a leading organisation in technology strategy and planning, is on the lookout for a dynamic and skilled Senior Data Engineer to join their vibrant team in Galway! This is your chance to make a real impact by contributing to cutting-edge data solutions that empower analytics and drive business decisions. We also have contract Data Engineer roles in Galway, if of interest.

As a Senior Data Engineer, you'll be at the heart of our innovative data landscape. Your expertise will help create solutions that enable leaders to optimise their work delivery and add significant value to the organisation and its customers. Your work will involve:
+ Designing and implementing robust data models that ensure our integrations are performant, reusable, and scalable.
+ Collaborating with diverse teams to understand dependencies and improve shared work progress.
+ Utilising a variety of applications and technologies, including PowerApps, PowerBI, Tableau, and more.
+ Managing complex data movement requirements from both internal and external data sources into a cohesive ecosystem.

Skills:
+ Experience: 5+ years in data engineering, data analysis, data warehouses, or data lakes.
+ Technical skills: proficiency in relational databases (Oracle SQL, Postgres) and experience with cloud data warehousing services like Snowflake or AWS RDS.
+ Data movement expertise: strong knowledge of ETL/ELT processes and tools, including Python and shell scripting.
+ Analytical skills: proven ability to navigate enterprise data and meet business analytic needs.
+ Outstanding SQL skills with a knack for deep data analysis across multiple platforms.
+ Experience in developing ELT/ETL pipelines and familiarity with data modelling techniques.
+ Knowledge of data ingestion tools (e.g., Apache NiFi, Kafka) and experience with APIs and PowerApps is a plus.
+ Strong organisational and communication abilities to simplify and convey technical challenges effectively.

If you're looking for a challenging yet rewarding opportunity and want to be part of an enthusiastic team dedicated to excellence, we'd love to hear from you! Apply now to become a Senior Data Engineer and help shape the future of data solutions in our organisation. Join this exciting journey; your next big career step awaits in Galway!

Adecco is a disability-confident employer. It is important to us that we run an inclusive and accessible recruitment process to support candidates of all backgrounds and all abilities to apply. Adecco is committed to building a supportive environment for you to explore the next steps in your career. If you require reasonable adjustments at any stage, please let us know and we will be happy to support you. Adecco Ireland is acting as an Employment Agency in relation to this vacancy.

Skills: Data Engineer, Oracle PL/SQL, ETL
Benefits: DOE, great benefits or contract rates
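
For the ETL/ELT and relational-database skills above, here is a minimal, assumed example (the connection string, table, and incremental rule are hypothetical, not details from the advert) of pulling an incremental Postgres slice into pandas and staging it as Parquet for a warehouse load:

```python
# ELT-style sketch: extract yesterday's changed rows from a Postgres source
# and stage them as Parquet. Connection details and schema are made up.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://etl_user:***@db-host:5432/ops")

query = """
    SELECT order_id, customer_id, amount, updated_at
    FROM orders
    WHERE updated_at >= NOW() - INTERVAL '1 day'
"""
df = pd.read_sql(query, engine)

df.to_parquet("stage/orders_incremental.parquet", index=False)
print(f"staged {len(df)} rows")
```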

Senior Data Engineer

Dublin, Leinster Reperio Human Capital Ltd

Posted 9 days ago

Job Description

Senior Data Engineer - Databricks (Initial 12 Month Contract)
Hybrid (1 day per week onsite, Ireland) | Competitive Daily Rate

My client is looking for a Senior Data Engineer on a 12-month rolling contract to support key data and analytics initiatives. This role will focus on building scalable, cloud-based data solutions and delivering meaningful insights across the business.

Role Highlights:
+ Develop and maintain robust data pipelines using Databricks, Python, and PySpark
+ Work across cloud and on-premises environments, including Snowflake and object storage
+ Integrate data from various sources, including APIs and flat files
+ Improve and streamline existing SQL logic, views, and stored procedures
+ Partner with business teams to build reports and dashboards using Power BI
+ Support data quality, lineage, and governance using tools such as Atlan or Monte Carlo

What You'll Bring:
+ Proven experience in data engineering within cloud-focused environments
+ Strong skills in Python, PySpark, and SQL
+ Hands-on experience with Databricks, Snowflake, and cloud platforms (AWS or Azure)
+ Solid understanding of data modelling techniques
+ Strong communication skills and experience working with business stakeholders

Nice to Have:
+ Background in Scaled Agile (SAFe) methodologies
+ Experience mentoring or guiding junior engineers

Interviews available immediately. Candidates must be eligible to work in Ireland without restriction. To learn more, contact Scott Hool in confidence. Reperio Human Capital acts as an Employment Agency and an Employment Business.

Skills: Data Engineer, Databricks, Snowflake
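
A typical Databricks pipeline step combines PySpark transformations with Delta tables. As an illustrative sketch only (the mount path, table name, and columns are assumptions, and the Delta format assumes a Databricks or Delta-enabled environment), one such step might be:

```python
# Sketch of a Databricks-style ingestion step: read a landed flat file,
# standardise it with PySpark, and append to a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` already exists

raw = (spark.read
            .option("header", "true")
            .csv("/mnt/landing/payments/2024-06-01.csv"))

clean = (raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
            .withColumn("ingested_at", F.current_timestamp()))

(clean.write
      .format("delta")
      .mode("append")
      .saveAsTable("finance.payments_bronze"))
```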

AWS Data Engineer

Dublin, Leinster REALTIME recruitment

Posted 9 days ago

Job Description

Skilled AWS Cloud Data Engineer required to join a global organisation. *Remote role* (if you reside in Ireland and have a relevant visa to work in Ireland). This role is ideal for someone who thrives in a collaborative, cloud-first environment and is excited to help design, build, and optimize scalable data infrastructure and pipelines in AWS.

Key Responsibilities:
+ Design, develop, and maintain robust data pipelines and ETL processes in AWS
+ Collaborate with Data Science and BI teams to enable seamless data access and analytics
+ Optimize data workflows for performance and cost-efficiency
+ Implement and manage data lakes, warehouses, and processing frameworks
+ Ensure data quality, reliability, and compliance with security standards

Required Qualifications:
+ 3+ years of hands-on experience working with Apache Spark in production environments
+ 3+ years of experience with cloud data warehouses such as Snowflake, Redshift, or BigQuery
+ Solid understanding of AWS data services (e.g., S3, Glue, EMR, Redshift, Lambda)
+ Proficiency in SQL and at least one programming language (e.g., Python or Scala)
+ Experience building and orchestrating data workflows using tools like Airflow, Step Functions, or similar

Nice to Have:
+ Exposure to Ray for distributed computing
+ Familiarity with CI/CD processes and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
+ Experience supporting machine learning pipelines or real-time data processing

Benefits: Work From Home
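
Workflow orchestration with Step Functions, mentioned above, is usually driven programmatically. As a hedged example (the state machine ARN, execution name, and payload are placeholders, not from the advert), kicking off a daily pipeline execution from Python could look like this:

```python
# Sketch of starting a Step Functions execution that wraps an EMR/Glue-based
# data workflow. All identifiers are hypothetical.
import json
import boto3

sfn = boto3.client("stepfunctions")

def run_daily_pipeline(run_date: str) -> str:
    response = sfn.start_execution(
        stateMachineArn="arn:aws:states:eu-west-1:123456789012:stateMachine:daily-etl",
        name=f"daily-etl-{run_date}",
        input=json.dumps({"run_date": run_date}),
    )
    return response["executionArn"]

if __name__ == "__main__":
    print(run_daily_pipeline("2024-06-01"))
```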