GCP Data Engineer
Job in Hartford, Hartford County, Connecticut, 06112, USA
Listed on 2026-01-12
Listing for: Inizio Partners Corp
Full Time position
Job specializations:
- IT/Tech: Data Engineer
- Engineering: Data Engineer
Job Description & How to Apply Below
As a Data Engineer, you will transform raw data into a usable format for further analysis by other teams. Cleansing, organizing, and manipulating data through pipelines are key responsibilities of this role. You will also apply data engineering principles on the Google Cloud Platform to optimize its services, and create interactive dashboards and reports for stakeholders.
Role and Responsibilities:
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re‑designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and other technologies
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Design intuitive and visually appealing visualizations to communicate complex data insights to users, using Tableau, Streamlit, Dash Enterprise, and Power BI
- Use Flask to develop APIs that integrate data from various sources and facilitate automation of business processes
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data‑related technical issues and support their data infrastructure needs
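As a rough illustration of the pipeline responsibilities above, the extract-transform-load flow can be sketched in plain Python; all source names, fields, and cleansing rules here are hypothetical examples, not part of this role's actual stack:

```python
# Minimal ETL sketch: extract raw records, cleanse/transform them,
# and load the result into an in-memory "warehouse" table.
# All source names and fields are hypothetical.

def extract():
    # Stand-in for pulling rows from a source system (API, file, SQL table).
    return [
        {"id": "1", "amount": " 120.50 ", "region": "ne"},
        {"id": "2", "amount": "75", "region": "SE"},
        {"id": "3", "amount": "", "region": "ne"},  # dirty row
    ]

def transform(rows):
    # Cleanse and normalize: drop rows with missing amounts,
    # cast types, and standardize the region code.
    cleaned = []
    for row in rows:
        amount = row["amount"].strip()
        if not amount:
            continue  # skip unusable records
        cleaned.append({
            "id": int(row["id"]),
            "amount": float(amount),
            "region": row["region"].upper(),
        })
    return cleaned

def load(rows, warehouse):
    # Stand-in for writing to a destination table (e.g. a warehouse).
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # number of rows that survived cleansing
```

In a production pipeline the same three stages would typically be orchestrated as scheduled tasks (for example with Composer/Airflow) rather than called inline.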
Required Qualifications and Skills:
- Expertise in Dash Enterprise for developing user-friendly UIs, with strong data-storytelling experience
- Ability to conduct real-time data exploration, integrate predictive modeling, and seamlessly deploy machine learning modules to GCP
- 5-8 years of overall experience with ETL technologies
- 3+ years of experience with data engineering technologies such as SQL, Hadoop, BigQuery, Dataproc, and Composer
- Hands-on experience with data visualization tools such as Tableau, Power BI, Dash, and Streamlit
- Experience in building interactive dashboards, visualizations, and custom reports for business users
- Knowledge of Flask for developing APIs and automating data workflows
- Experience with data automation and implementing workflows in a cloud environment
- GCP Data Engineer certification preferred
- Ability to understand and design the underlying data/schema
- Strong communication skills for delivering client updates effectively
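The SQL skills listed above center on metrics-style queries against a warehouse. A minimal sketch of such a query, using the standard-library sqlite3 module in place of a warehouse like BigQuery (table and column names are hypothetical):

```python
# Sketch of a business-metrics SQL query, using SQLite in place of a
# cloud warehouse such as BigQuery. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("Acme", 120.0), ("Acme", 80.0), ("Globex", 50.0)],
)

# Aggregate revenue per customer, a typical "key business metric" query.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS revenue "
    "FROM orders GROUP BY customer ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('Acme', 200.0), ('Globex', 50.0)]
conn.close()
```

The same GROUP BY aggregation would run unchanged (modulo dialect details) in BigQuery's SQL.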