Senior Data Engineer
At Holiday Extras we believe that time is precious and that holidays are some of the most precious times we have. We’re building the future of travel using innovative technology, a wide choice of products and unbeatable prices. We are looking for an experienced Data Engineer to use cutting-edge technology and help us give our customers (7 million and growing) less hassle, more holiday.
We’re in the process of moving our data storage to a scalable environment on Google Cloud. We are experimenting with live streaming data and improving our batch processing. We are continuously collecting and exposing data sources to feed new data sets, empowering the business to continue learning and growing.
We pride ourselves on being able to offer exciting challenges, whilst also providing a flexible, fulfilling environment in which to work, in the heart of the Kent countryside, 5 minutes from the beach. Our Learning Academy, quiet Lounge, supportive environment and focus on personal development help our team members grow and become experts in their field! We have fortnightly “Project Lounge” days where you have the chance to work on anything that extends your knowledge in different business areas, or simply collaborate on a project to make a difference to the business.
You will be joining at an exciting period of exceptional projected growth. Our people move fast and deliver excellence at pace, never accepting second best. We may have been in business for 34 years, but Holiday Extras is an innovative and entrepreneurial travel-tech business with a startup mentality.
Here’s what we’re looking for:
- A versatile and experienced data engineer with expertise in writing advanced SQL.
- Experience in at least one other popular programming language, such as Python, Bash or Java. You’ll need more than one tool in your toolbox!
- Demonstrable ability to work with a variety of data infrastructure, including relational databases (e.g. DB2, MySQL) and column stores (ideally Google BigQuery).
- Expertise in building and maintaining reliable ETL jobs.
- Experience designing new data models based on multiple data sources.
- Experience with version control software, e.g. Git.
- A desire to understand the business and where it’s heading.
- A track record of being solutions-focussed and an excellent problem solver, not afraid to make decisions and act quickly.
- A proactive communicator.
- A determination to acquire new skills and keep learning.
- Good numerical skills and a flexible ‘can-do’ attitude to every task.
- Experience working with Apache Airflow, Luigi or Talend.
- Exposure to business intelligence tools and dashboards (we use Looker).
Here’s what you’ll be doing:
- You will have autonomy and responsibility to deliver data requirements to the business.
- You will learn where our data is stored and how it is processed, so you can do this job well. You will have the opportunity to work on stimulating new projects as well as using our existing platforms.
- You will migrate our data pipelines from legacy platforms to our new cloud-based data stack.
- You will be learning and working with cutting-edge tech and solutions.
- You will develop your career in a strong team with a well-defined culture that is building a fast-moving, disruptive business.
- You will lead in solving challenges and delivering innovative solutions for the business.
- You will be responsible for making data accessible to users across the business, engaging with them to understand their data requirements.
- You will be supporting end-users to get the most from the data and tools available.
- You will be a self-organised individual who can follow processes well whilst collaborating with your team.
- You’re going to have a lot of fun!
If you’ve got a few minutes, check out our blog and see what our engineers have to say.
If you are interested, please submit your CV and cover letter by clicking ‘Apply’.