Data Engineer
- Duration: Initial 6 months
- Location: Remote
- Rate: £200 - £250 Outside IR35
This role focuses on accelerating the development of the Data Platform by contributing to the cross‑functional engineering teams within the Wealth business unit. The primary responsibilities involve technical development, ensuring the quality of data solutions, and collaborating with team members.
As part of the engineering team, you will work with engineering team leads, solution architects, software engineers, quality engineers, and product team members to contribute to the design, development, delivery, and operation of UK Wealth data products.
Technical Skills
Strong knowledge and recent experience in many of the following data technologies and methodologies:
- Data Architecture Principles: Data lakes, data warehouses, ETL/ELT processes (knowledge of Debezium advantageous), data modelling, and data governance.
- Programming Languages: Python (for data manipulation, analysis, and scripting), SQL (for database interaction and data querying), PySpark, Kafka, and potentially Java or Scala for big data processing.
- Database Technologies: Experience with relational databases such as MS SQL Server and PostgreSQL, as well as NoSQL databases (e.g., DynamoDB).
- Data Intelligence Platforms: Working knowledge of enterprise DI platforms, specifically Databricks, including data ingestion, streaming, transformation, and data integration patterns and tools.
- Data Integration and Pipelines: Building, maintaining, and optimising data pipelines using tools and frameworks designed for efficient data movement and transformation.
- Cloud Data Platforms: Familiarity with cloud services for data engineering, such as AWS (e.g., S3, Redshift, Glue).
- Data Testing and Quality: Implementing data quality checks, data validation, and unit testing for data pipelines to ensure data accuracy and reliability.
- Scripting and Automation: Using Bash, PowerShell, or Python scripts to automate data‑related tasks and workflows.
- Version Control: Proficient with Git and GitHub for managing code and data‑related configurations.
- Build and CI/CD for Data: Understanding and applying CI/CD principles for data pipelines and deployments, including familiarity with tools like Jenkins, GitLab CI, or similar platforms.
- Data Security: Awareness of data security practices, including access control, data encryption, and compliance with data privacy regulations.
- Data for AI Products/Solutions: Knowledge of the use of AI in the context of data. Practical experience of AI and AI tooling is beneficial but not required.
- Strong interpersonal skills with the ability to communicate effectively at all levels
- Analytical thinker with a logical approach to problem‑solving and solutions
- Flexible in approach and mindset to adapt to changing priorities and requirements
- Ability to thrive under pressure in a fast‑paced environment, able to prioritise tasks and manage your own time appropriately
- Able to work both independently and collaboratively within a team environment to achieve team goals and objectives