F2F – Sr. Data Engineer; Data Lake MarTech Atlanta, GA; Hybrid Onsite
Job in Stockton, Lanier County, Georgia, 31649, USA
Listed on 2025-12-19
Listing for: Empower Professionals
Full Time position
Job specializations:
- IT/Tech: Data Engineer
Job Description & How to Apply Below
Location: Stockton
Role: Senior Data Engineer (Data Lake MarTech)
Locations: Atlanta, GA (Hybrid Onsite)
Duration: 12-Month Contract
F2F interview highly preferred for local candidates.
Note:
- Candidate needs to be in the office 3-4 days every week.
- Local candidates or candidates from adjacent states only.
Purpose:
- We are looking for a Senior Data Engineer to be part of our scrum teams and perform functional & system development for Hadoop applications for our Enterprise Data Lake initiative.
- This high-visibility, fast-paced key initiative will integrate data across internal and external sources, provide analytical insights, and integrate with our critical systems.
- Participate in the agile development process.
- Develop functional and technical specifications from business requirements for the commercial platform.
- Ensure application quality and adherence to performance requirements.
- Help create project estimates and plans. Represent the engineering team in project meetings and solution discussions.
- Participate in the code review process.
- Work with team members to achieve business results in a fast paced and quickly changing environment.
- Pair up with data engineers to develop cutting-edge analytic applications leveraging Big Data technologies: Hadoop, NoSQL, and in-memory data grids.
- Mentor and influence up and down the chain of command.
- Perform other duties and/or projects as assigned.
- Bachelor’s degree in a quantitative field (such as Engineering, Computer Science, Statistics, Econometrics) and a minimum of 10 years of experience
- Minimum 5 years’ experience working with and developing big data solutions
- Expert in the following Ab Initio tools: GDE (Graphical Development Environment), Co-Operating System, Control Center, Metadata Hub, Enterprise Meta Environment, Enterprise Meta Environment Portal, Acquire It, Express It, Conduct It, Data Quality Environment, and Query It.
- Hands-on experience writing shell scripts, complex SQL queries, and Hadoop commands, and working with Git.
- Ability to write abstracted, reusable code components.
- Programming experience in at least two of the following languages: Scala, Java, or Python.
- Strong business acumen.
- Critical thinking and creativity.
- Performance tuning experience.
- Experience developing with Hive, Sqoop, Spark, Kafka, and HBase on Hadoop.
- Familiarity with Ab Initio, Hortonworks, Zookeeper, and Oozie is a plus.
- Willingness to learn new technologies quickly.
- Superior oral and written communication skills, as well as the willingness to collaborate across teams of internal and external technical staff, business analysts, software support, and operations staff.
- Strong business acumen including a broad understanding of Synchrony Financial business processes and practices.