About Us
Like the modern Britain we serve, we’re evolving. We’re investing billions in our people, data and tech to transform the way we meet the ever-changing needs of our 26 million customers. We’re growing with purpose. Join us on our journey and you will too.
A great opportunity has arisen for a Data Engineer to join cross-functional product engineering teams within the Personalised Experiences and Communications Platform. As a Data Engineer, you will be responsible for delivering the highest-quality data capability, drawing on your engineering expertise while staying open-minded to the opportunities the cloud provides.
What you’ll be doing
- Building reusable data pipelines at scale, working with structured and unstructured data, engineering features for machine learning, and curating data to provide real-time, contextualised insights that power our customers’ journeys.
- Using industry-leading toolsets, as well as evaluating exciting new technologies, to design and build scalable real-time data applications.
- Spanning the full data lifecycle using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server), you’ll build capabilities with horizon-expanding exposure to a host of wider technologies and careers in data.
- Helping to adopt engineering best practices for data pipelines, such as test-driven development, code reviews, and continuous integration/continuous delivery (CI/CD).
- Mentoring other engineers to deliver high-quality, data-led solutions for our Bank’s customers.
What We Need You To Have Experience In
Coding
- Coding/scripting experience developed in a commercial/industry setting (Python, Java, Scala or Go, plus SQL)
Databases & frameworks
- Strong experience working with Kafka technologies
- Working experience with operational data stores, data warehouses, big data technologies and data lakes
- Experience working with relational and non-relational databases, such as SQL Server or Oracle, to build data solutions
- Experience with relational and dimensional data structures
- Experience using distributed frameworks (Spark, Flink, Beam, Hadoop)
Containerisation
- Good knowledge of containers (Docker, Kubernetes, etc.)
Cloud
- Experience with GCP
- Good understanding of cloud storage, networking and resource provisioning
It would be great if you had
- GCP Professional Data Engineer certification
- Confluent Certified Developer for Apache Kafka (CCDAK) certification
- Proficiency across the data lifecycle
We also offer a wide-ranging benefits package, which includes:
- A generous pension contribution of up to 15%
- An annual performance-related bonus
- Share schemes, including free shares
- Benefits you can adapt to your lifestyle, such as discounted shopping
- 30 days’ holiday, with bank holidays on top
- A range of wellbeing initiatives and generous parental leave policies
Hours: Full-time – 35 hours per week
Working Pattern: Our work style is hybrid, which currently involves spending at least two days per week, or 40% of your time, at our Bristol or London office.