As a data engineer specialising in data technology, you'll work in the data engineering team on the design and implementation of data solutions for our customers.
- You have a background in data engineering or a related area
- You have a strong interest in and passion for all things data-related
- You are comfortable talking with stakeholders and communicating complex solutions clearly and concisely
- You are driven to develop high-quality solutions
- You are a motivated self-learner who can work with a degree of autonomy when needed
- Collaborate with data scientists and other data engineers on data problems
- Work on-site with senior data engineers, data scientists and customers to understand their data problems, identify needs and develop solutions
- Work on data integration solutions using a wide array of technologies and data sources
- Work with big data components and migration strategies
- Work with cloud-based infrastructure (AWS, Azure and Google Cloud) for hosting data solutions/applications
- Experience developing ETL and data integration solutions
- Strong SQL skills and experience working with a relational database (MySQL, PostgreSQL, Oracle, etc.)
- Experience working with data warehouse solutions, extracting and processing data in various ways using off-the-shelf and open-source tools
- 3+ years' experience with a general-purpose programming language such as Python (a plus), a JVM language, JavaScript or Ruby
- 3+ years working with Linux or Windows operating systems and version control systems (Git)
- Experience migrating on-premises data stores to cloud-based data stores (Hadoop cloud distributions a plus)
- Experience with various ETL strategies and architectures
- Hands-on experience with cloud environments (AWS preferred)
- Experience with data security, data governance and quality assessment
- Building APIs and applications using Python/JS or an alternative language
- Knowledge of Hadoop technology (MapReduce, HDFS, Hive)
- Experience with non-relational database solutions (e.g. MongoDB, Google BigQuery)
- Experience with AWS Data Pipeline, Azure Data Factory or Google Cloud Dataflow
- Working with containerisation technologies (Docker, Kubernetes, etc.)
- Knowledge of R and other statistical languages
- Willingness to travel to client sites, including internationally
This is a UK-based role at Mango Head Office in Chippenham or our London office. Applicants must have the right to work in the EU.