Our people love the exciting and meaningful work they do, the cutting-edge resources and technology they have access to, the benefits we offer and the great community we’ve built. Want to join them?
Job title: Data Engineer (Data training provided via 6 week academy programme)
Reports to: People Manager
Main purpose of role & level in the business:
The Data Engineer works within a multi-skilled agile team to develop large-scale data processing software to meet user needs in demanding production environments.
Most of our work comes through repeat business and direct referrals, which comes down to the quality of our people. The success of our Data Engineering teams means that customers are bringing us an increasing number of exciting data projects that use cutting-edge technology to solve real-world problems. We are seeking more high-calibre people to join our Data & Analytics capability, where you will grow and contribute to industry-leading technical expertise. Developing data processing software primarily for deployment on Big Data technologies, the role encompasses the full software lifecycle, including design, coding, testing and defect resolution.
Minimum (essential) requirements:
You will be a current software engineer/developer with an active interest in the Data sector.
Software development experience
You may currently work with distributed data processing technologies, including Apache Hadoop and Apache Spark, and with JVM languages
You may understand ETL/ELT data processing pipelines
Understanding of contemporary data storage technology, such as document, graph, log stores and other non-relational platforms
Ability to write easily testable code, including automated unit tests
Some knowledge of continuous integration tools and techniques (e.g. Jenkins)
Ability to work collaboratively with others using version control (e.g. Git)
Clear verbal presentation and an open attitude to sharing information
Able to prioritise and work to deadlines.
Working with operations teams to ensure operational readiness
Desirable requirements:
Software development experience with Cloudera’s distribution of Apache Hadoop.
Experience of data visualisation and complex data transformations, including ETL tools such as Talend.
Experience with streaming and event-processing architectures, including technologies such as Kafka and change-data-capture (CDC) products.
Comfortable with continuous improvement and sharing input on data best practices.
Participation in development and/or technology communities.
Experience of Python and R
Who you are:
Our vision is to enable outstanding people to create digital solutions that have a positive impact on people's lives. Our values aren't abstract; they are the behaviours we expect from each other every day, and they underpin everything that we do. We expect everyone to display our values by being determined in how obstacles are overcome; honest when dealing with others; respectful in how you treat others; creative in finding solutions to complex problems; and cooperative in sharing information, knowledge and experience.
These values, applied collectively, help to produce an outstanding Kainos person, team and culture.
Kainos is a high-growth IT services company providing digital technology solutions and agile software development to enterprise customers. Across our 30-year history, we have worked on transformational projects for government, the NHS and many private sector clients.