GCP Data Integration Engineer
Job in Nashville, Davidson County, Tennessee, 37247, USA
Listing for: KANINI
Full Time position, listed on 2026-03-12
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Data Warehousing, Data Analyst
Job Description & How to Apply Below
We are looking for a detail-oriented and technically skilled GCP Data Integration Engineer to design, develop, and manage robust data integration solutions. The ideal candidate will have hands-on experience integrating data across disparate systems, building ETL/ELT pipelines, and ensuring the accuracy, quality, and consistency of enterprise data. You will play a key role in enabling seamless data flow between systems to support business intelligence, analytics, and operational needs.
Responsibilities:
- Design and implement data integration workflows between internal and external systems, including APIs, databases, SaaS applications, and cloud platforms.
- Develop and maintain scalable ETL/ELT pipelines for structured and unstructured data using tools like Informatica, Talend, SSIS, Apache NiFi, or custom Python/SQL scripts.
- Build and manage real-time and batch data pipelines leveraging technologies like Kafka and Spark Streaming.
- Ensure high data quality, accuracy, and consistency during data ingestion and transformation.
- Implement data validation, cleansing, deduplication, and monitoring mechanisms.
- Contribute to metadata management, data lineage, and data catalog initiatives.
- Collaborate with data engineers, business analysts, data scientists, and application teams to understand integration needs and deliver effective solutions.
- Troubleshoot and resolve data integration and pipeline issues in a timely manner.
- Provide documentation and knowledge transfer for developed solutions.
- Support data movement across hybrid environments (on-prem, cloud, third-party systems).
- Work with DevOps or platform teams to ensure scalability, security, and performance of data integration infrastructure.
Requirements:
- Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field.
- 4–8 years of experience in data integration or data engineering, with strong ETL and SQL skills.
- Strong experience with integration tools such as Informatica, Talend, MuleSoft, SSIS, or Boomi.
- Proficient in SQL, Python, and scripting for data manipulation and automation.
- Experience with cloud data platforms (GCP) and services such as Google Cloud Dataflow.
- Familiarity with REST/SOAP APIs, JSON, XML, and flat file integrations.
- Experience with message queues or data streaming platforms (Kafka, RabbitMQ, Kinesis).
- Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery).
- Knowledge of data security, privacy, and compliance best practices (HIPAA, GDPR, etc.).
- Prior experience in industries like healthcare, fintech, or e-commerce is a plus.
- Strong problem-solving and debugging skills.
- Excellent communication and collaboration abilities.
- Ability to manage multiple priorities and deliver in a fast-paced environment.
- Attention to detail and a commitment to delivering high-quality work.