Core Platform Engineer
Listed on 2026-01-26
IT/Tech
Cloud Computing, Data Engineering, Cybersecurity, Data Security
Outstanding contract opportunity! A well-known Financial Services Company is looking for a Core Platform Engineer in Charlotte, NC or Iselin, NJ (Hybrid Schedule).
Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package! Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.
Contract Duration: 24 Months
Primary Role:
Build and maintain secure, scalable infrastructure and services.
What You Will Be Doing
- Support a highly available and scalable infrastructure comprising object storage, OpenShift, Spark, Iceberg, YuniKorn, and Trino
- Monitor for configuration drift and enforce infrastructure policies.
- Configure and monitor Big Data ecosystem components alongside various BI and observability tools
- Build an automated regression and performance test suite to health-check all components of the platform
- Monitor system health and enforce runtime policies
- Implement and manage security protocols, including OAuth authentication, TLS encryption, and role-based access control (RBAC).
- Conduct regular maintenance, including cluster scaling, and perform regular security audits.
Skills
Programming & Scripting
- Languages: Python, Bash, Shell, SQL, Java (basic), Scala (for big data, good to have)
- Automation & Scripting: Python scripting for automation, Linux shell scripting
Operating Systems & Containers
- System programming, performance tuning, networking
- OCP, Kubernetes (K8s), Helm, Terraform, container orchestration and deployment
- Frameworks: Nexus One, Apache Spark, Hadoop, Hive, Trino, Iceberg
- ETL Tools: Apache Airflow, NiFi (good to have)
- Data Pipelines: Batch and streaming (Kafka, Flink)
AI/ML & MTC (Model Training & Consumption) (Nice to have)
- ML frameworks or LLM modeling
Security & Access Control
- Access Models: RBAC (Role-Based Access Control), ABAC (Attribute-Based Access Control)
- Data Protection: Encryption at rest and in transit, TLS/SSL, KMS (Key Management Services)
- Compliance: GDPR, HIPAA (if applicable), IAM policies
System Design & Architecture (good to have, at least at a conceptual level)
- Scalability: Load balancing, caching (Redis, Memcached), horizontal scaling
- High Availability: Failover strategies, disaster recovery, monitoring (Prometheus, Grafana)