Tech Lead (Spark and Python expertise)
Job in McLean, Fairfax County, Virginia, USA
Listing for: Infinity Outsourcing
Full Time position
Listed on 2026-01-13
Job specializations:
- Software Development
- Data Engineer
About the job
Position: Tech Lead (Spark and Python expertise)
Location: McLean, VA (onsite from Day 1; hybrid: 3 days onsite, 2 days remote)
Duration: Long term
Pay rate: $67-$70/hr on C2C and $125,000-$130,000/annum
Client: Hexaware
Job Description:
Mandatory:
- 10+ years of experience in the solution design and development of applications using Java 8+/J2EE, Spring, Spring Boot, microservices, and RESTful services, with Big Data experience and a strong data-heavy background.
- Develop, program, and maintain applications using the Apache Spark open-source framework.
- Work with different aspects of the Spark ecosystem, including Spark SQL, DataFrames, Datasets, and streaming.
- Strong programming skills in Java, Scala, or Python.
- Familiarity with big data processing tools and techniques.
- Experience with the Hadoop ecosystem.
- Good understanding of distributed systems.
- Experience with streaming data platforms.
- Must be strong in AWS cloud event-based architecture, Kubernetes, and the ELK stack (Elasticsearch, Logstash, and Kibana).
- Must have excellent experience designing and implementing cloud-based solutions with various AWS services (S3, Lambda, Step Functions, AMQ, SNS, SQS, CloudWatch Events, etc.).
- Must be well experienced in the design and development of microservices using Spring Boot, REST APIs, and GraphQL.
- Must have solid knowledge of and experience with NoSQL (MongoDB).
- Good knowledge of and experience with queue-based implementations.
- Strong knowledge of and experience with ORM frameworks (JPA/Hibernate).
- Good knowledge of technical concepts such as security, transactions, monitoring, and performance.
- Should be well versed in TDD/ATDD.
- Should have experience with Java, Python, and Spark.
- 2+ years of experience in designing and implementing cloud-based solutions in various AWS Services.
- Strong experience with the DevOps toolchain (Jenkins, Artifactory, Ansible/Chef/Puppet/Spinnaker, Maven/Gradle, Atlassian tool suite).
- Very good knowledge of and experience with non-functional (technical) requirements such as security, transactions, and performance.
- Excellent analytical and problem-solving skills.
Nice to have:
- Experience with OAuth implementation using Ping Identity.
- Experience in API Management (Apigee) or Service Mesh (Istio).
- Good knowledge of and experience with queue/topic-based implementations (ActiveMQ).
- Good knowledge of and experience with schedulers and batch jobs.
- Experience with scripting languages on Unix.