
Data Platform Developer

Job in Raleigh, Wake County, North Carolina, 27601, USA
Listing for: HUB International
Full Time position
Listed on 2026-03-14
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Database Administrator, Data Warehousing
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD yearly
Job Description & How to Apply Below

ABOUT US

At HUB International, we are a team of entrepreneurs. We believe in protecting and supporting the aspirations of individuals, families, and businesses. We help our clients evaluate their risks and develop solutions tailored to their needs. We believe in empowering our employees to learn, grow, and make a difference. Our structure enables our teams to maintain their own unique, regional culture while leveraging support and resources from our corporate centers of excellence.

HUB is a global insurance and employee benefits broker, providing a boundaryless array of business insurance, employee benefits, risk services, personal insurance, retirement, and private wealth management products and services. With over $5 billion in revenue and almost 20,000 employees in 600 offices throughout North America, HUB has grown substantially, in part due to our industry leading success in mergers and acquisitions.

Position Summary

We are seeking a skilled Data Platform Developer to design, develop, and maintain our database infrastructure and reporting capabilities. This role is critical to supporting data-driven decision-making across the organization, with a primary focus on Snowflake cloud data platform management and optimization.

Database Development & Management
  • Design, build, and maintain SQL databases, data models, and ETL processes
  • Write and optimize complex SQL queries, stored procedures, and functions
  • Ensure data integrity, security, and compliance with organizational standards
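The database-development responsibilities above could be sketched as follows. This is a minimal illustration, using SQLite as a stand-in for the actual warehouse; the `policies` table and its columns are hypothetical examples, not part of the posting.

```python
import sqlite3

def load_policies(conn, rows):
    """Load policy rows with basic integrity constraints enforced.

    Hypothetical example: a CHECK constraint rejects negative premiums,
    and parameterized inserts guard against malformed values.
    """
    conn.execute(
        """CREATE TABLE IF NOT EXISTS policies (
               policy_id TEXT PRIMARY KEY,
               premium   REAL NOT NULL CHECK (premium >= 0)
           )"""
    )
    conn.executemany(
        "INSERT OR REPLACE INTO policies (policy_id, premium) VALUES (?, ?)",
        rows,
    )
    conn.commit()
    # Simple post-load integrity check: confirm the row count.
    return conn.execute("SELECT COUNT(*) FROM policies").fetchone()[0]

conn = sqlite3.connect(":memory:")
count = load_policies(conn, [("P-100", 1200.0), ("P-101", 950.5)])
print(count)  # 2
```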
Snowflake Platform Administration
  • Develop and manage data pipelines within Snowflake
  • Optimize Snowflake performance, including query tuning and resource management
  • Implement and maintain role-based access controls and data governance policies
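Role-based access control in Snowflake, mentioned above, is typically expressed as GRANT statements. The sketch below generates such statements for a hypothetical read-only reporting role; the role, database, and schema names are illustrative assumptions.

```python
def reader_grants(role: str, database: str, schema: str) -> list[str]:
    """Build the minimal Snowflake grants for a read-only reporting role.

    Hypothetical sketch: FUTURE grants keep the policy applied
    automatically as new tables are created in the schema.
    """
    target = f"{database}.{schema}"
    return [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {target} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {target} TO ROLE {role};",
        f"GRANT SELECT ON FUTURE TABLES IN SCHEMA {target} TO ROLE {role};",
    ]

for stmt in reader_grants("REPORTING_READER", "ANALYTICS", "MARTS"):
    print(stmt)
```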
Reporting & Analytics Support
  • Create and maintain views, tables, and data structures to support reporting tools and dashboards
  • Partner with business stakeholders to translate requirements into technical solutions
  • Troubleshoot and resolve data discrepancies and performance issues
Collaboration & Documentation
  • Work cross-functionally with Operations, Analytics, and IT teams
  • Document data architecture, workflows, and standard operating procedures
  • Contribute to continuous improvement of data infrastructure and processes
Documentation & Knowledge Management
  • Data architecture diagrams (ERDs, data flow diagrams, system integration maps)
  • Data pipeline documentation (Coalesce/dbt DAGs, transformation logic, dependencies)
  • API documentation (endpoint specifications, authentication flows, payload schemas)
  • Stored procedure and function libraries (purpose, parameters, usage examples)
  • Data dictionaries – create and maintain comprehensive documentation including data dictionaries, ETL workflows, SOPs, runbooks, governance policies, process diagrams, and training materials
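A data dictionary like the one described above can often be generated from database metadata rather than written by hand. The sketch below uses SQLite's `PRAGMA table_info` as a stand-in for a warehouse catalog such as Snowflake's `INFORMATION_SCHEMA.COLUMNS`; the `claims` table is a hypothetical example.

```python
import sqlite3

def data_dictionary(conn, table: str) -> list[dict]:
    """Derive a simple data dictionary from table metadata.

    PRAGMA table_info returns (cid, name, type, notnull, default, pk)
    for each column of the given table.
    """
    rows = conn.execute(f"PRAGMA table_info({table})").fetchall()
    return [
        {"column": name, "type": ctype, "nullable": not notnull}
        for _cid, name, ctype, notnull, _default, _pk in rows
    ]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT NOT NULL, amount REAL)")
for entry in data_dictionary(conn, "claims"):
    print(entry)
```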
Required Qualifications
  • 3+ years of hands‑on experience in SQL development and database management
  • Proficiency with Snowflake – including data warehousing, query optimization, and administration
  • BigQuery – ability to build and optimize data tables and to leverage native connectors for Power BI and data integration
  • Strong experience with relational database concepts and data modeling
  • Familiarity with ETL tools (Dataflow, Pub/Sub) and with data integration and orchestration processes
  • Ability to write efficient, scalable SQL code
Preferred Qualifications
  • Experience with cloud platforms (AWS, Azure, or GCP) and serverless architecture (Lambda functions, event‑driven processing)
  • Python expertise for data engineering, including stored procedures, Lambda functions, and API integrations
  • Experience with BI/reporting tools (Tableau, Power BI, Looker) and data pipeline orchestration (dbt, Airflow)
  • Experience building Snowflake stored procedures (SQL/Python), UDFs, and data automation workflows
  • Knowledge of API integrations, REST APIs, and cloud data pipelines (S3, Power Automate)
  • Background in insurance, benefits, or financial services with familiarity in benefits administration and carrier data
Competency
  • Strong analytical and problem‑solving skills
  • Ability to communicate technical concepts to non‑technical stakeholders
  • Detail‑oriented with a focus on data accuracy and quality
  • S…