At iCIMS, we're redefining how people connect with opportunity through intelligent, human-centred technology. We're growing rapidly and are seeking Data Engineers at multiple levels of experience - from emerging to experienced professionals - to build the next generation of our Talent Cloud platform. You'll create the scalable data pipelines, storage systems, and analytics infrastructure that power our data-driven decision-making and AI capabilities.
You'll design, build, and optimise data infrastructure that supports analytics, business intelligence, and product development, working with software engineers, data scientists, and product experts in a culture that values innovation, ownership, and continuous learning.
Responsibilities
- Design, develop, and maintain scalable data pipelines to collect, process, and store data from multiple sources
- Build and optimise data infrastructure to support analytics, reporting, and AI/ML workloads
- Implement event sourcing and streaming architectures (e.g., Kafka, AWS Kinesis) for real-time data processing
- Apply data governance, security principles, and compliance frameworks to ensure data quality and regulatory adherence
- Collaborate with data scientists, software engineers, and product teams to deliver reliable data solutions
- Troubleshoot and resolve data-related issues whilst maintaining data quality and integrity
- Write automated tests and participate in code reviews to maintain code quality
- Contribute to best practices, frameworks, and tools for data engineering excellence
Qualifications
- Bachelor's degree in Computer Science, Engineering, Data Science, or related field (or equivalent professional experience)
- 0-2 years of experience building data pipelines and systems
- Proficiency in Python and SQL; familiarity with Java
- Experience with relational and non-relational databases (e.g., SQL Server, PostgreSQL, MySQL, MongoDB)
- Familiarity with cloud platforms (AWS preferred) and data storage services (e.g., S3, Redshift)
- Familiarity with streaming platforms (e.g., Kafka, AWS Kinesis) and event-driven architectures
- Understanding of data modelling, warehousing, and schema design principles
- Familiarity with data transformation tools (e.g., dbt), BI platforms (e.g., Looker), and API development for data consumption
- Knowledge of version control (Git), CI/CD pipelines, and security principles for data systems (encryption, IAM, compliance frameworks)
- Strong analytical and problem-solving skills with intellectual curiosity
- Strong communication and collaboration skills with both technical and non-technical teams