Job Description
What is the Opportunity? Are you a talented, creative, and results-driven professional who thrives on delivering high-performing applications? Come join us! Global Functions Technology (GFT) is part of RBC’s Technology and Operations division. GFT’s impact is far-reaching as we collaborate with partners from across the company to deliver innovative and transformative IT solutions. Our clients represent Risk, Finance, HR, CAO, Audit, Legal, Compliance, Financial Crime, Capital Markets, Personal and Commercial Banking, and Wealth Management. We also lead the development of digital tools and platforms to enhance collaboration.
Our team is focused on the design, development, and deployment of capabilities that enable the end-to-end delivery of the Enterprise ESG framework. The Senior Software Developer, Climate Solutions is responsible for developing applications for large-scale data processing and analysis. You will work with all stakeholders to design best-in-class technology solutions. We value a positive attitude, a willingness to learn, open communication, teamwork, and a commitment to clean, secure, and well-tested code.
What will you do?
Design, develop, and maintain scalable data pipelines and infrastructure to support various analytical and operational needs.
Contribute to a data pipeline architecture that follows a microservice format, with components deployable both on-premises and in the cloud.
Collaborate closely with stakeholders including data scientists, software engineers, and domain experts to understand data requirements and deliver robust solutions.
Ensure data quality, integrity, and reliability throughout the data lifecycle.
Support the integration of analytics and machine learning models into production systems, focusing on data availability and performance.
Optimize database performance and schema design for large-scale data sets.
Must Have:
3+ years of experience in Python; experience with other programming languages and frameworks is a plus.
Proficiency in SQL Server and other database engines, including experience writing queries and performance tuning.
Understanding of data pipelines and ETL; exposure to DevOps practices and tools for CI/CD pipelines.
Excellent problem-solving and analytical skills, with the ability to work in a collaborative team environment.
Ability to translate business requirements into technology implementations. Must be self-motivated and able to work with minimal supervision.
Ability to take full end-to-end ownership of a feature and successfully deliver it within a project.
Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.
Experience with PySpark, Airflow, and cloud platforms (AWS, Azure, GCP) and their data services.
Knowledge of data governance and security practices.
Experience with frontend and backend engineering, test-driven development, microservices, and architecture design principles.
Prior experience in the financial industry, especially supporting top-of-the-house risk analytics functions (such as Market Risk, Credit Risk, or Liquidity Risk) in the design and development of regulatory frameworks.
Job Skills
Analytics, Application Architecture Design, Application Development, Application Integrations,…