
Mid-Level Platform Engineer – B2B Data Exchange

Job in Pierre, Hughes County, South Dakota, 57501, USA
Listing for: Eliassen Group
Full Time position
Listed on 2026-03-13
Job specializations:
  • IT/Tech
    Cybersecurity, Cloud Computing
Salary/Wage Range: $60.00 to $70.00 USD per hour
Job Description & How to Apply Below

Mid‑Level Platform Engineer – B2B Data Exchange

Anywhere

Type:
Contract

Category:
DevOps

Industry: Communications

Workplace Type:
Remote

Our client is seeking a mid‑level platform engineer to design and operate secure, scalable B2B data exchange capabilities on AWS. The role focuses on reusable patterns, guardrails, and automation that enable compliant, observable data flows across external partners and internal teams. Work includes standardized integrations, governance enablement, and reliability engineering within a regulated environment. The engineer will collaborate with stakeholders to improve integrations and drive continuous operational excellence.

Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a great benefits package that includes medical, dental, and vision benefits, 401(k) with company matching, and life insurance.

Rate: $60.00 to $70.00/hr W2

Responsibilities
  • Design and implement standardized B2B data exchange patterns including SFTP, AS2, and API using AWS services such as Transfer Family, API Gateway, PrivateLink, cross‑account S3, EventBridge, and managed MFT/EDI where applicable.
  • Build data ingestion, curation, and delivery pipelines using S3, Glue, Lake Formation, Step Functions, Lambda, Kinesis or MSK, and Redshift or Snowflake on AWS.
  • Apply end‑to‑end encryption and key management practices including TLS and KMS; implement data masking or tokenization and controls for PII or PHI handling.
  • Engineer secure VPC networking patterns, cross‑account access, and service‑to‑service connectivity.
  • Implement role‑ and attribute‑based access controls with IAM and Lake Formation to deliver least‑privilege access for business units.
  • Operationalize data contracts, schema validation, and cataloging with Glue Data Catalog and versioned interfaces.
  • Establish guardrails via AWS Organizations, Control Tower, and SCPs; enforce configuration baselines and drift detection.
  • Embed observability with CloudWatch, CloudTrail, AWS Config, structured logging, metrics, and traces.
  • Define and implement SLIs/SLOs, error budgets, and automated remediation; create runbooks, playbooks, and incident response workflows.
  • Implement secrets management, certificate rotation, partner credential lifecycle, and automated onboarding or offboarding.
  • Deliver infrastructure as code with Terraform or CloudFormation and GitOps workflows; create reusable modules, pipelines, and golden patterns.
  • Integrate CI/CD with testing, security scans, and linting; enable progressive delivery for data exchange components.
  • Partner with Security, Legal, and Compliance to meet SOX, SOC 2, GDPR, CCPA, and HIPAA requirements where applicable.
  • Collaborate with TPMs, Data Engineering, Security or IAM, and external partners to troubleshoot and improve integrations.
  • Provide Tier‑3 data engineering support, lead post‑incident reviews, and drive continuous improvement across reliability and security.
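The data-contract and schema-validation work in the responsibilities above can be sketched in plain Python. This is a minimal, hypothetical illustration only: the contract shape, field names, and payloads are invented for the example and are not the client's actual interfaces.

```python
# Hypothetical sketch: validate an inbound partner payload against a
# versioned data contract before it enters the curation pipeline.
# The contract and field names below are illustrative placeholders.

CONTRACT_V1 = {
    "partner_id": str,
    "exchange_type": str,   # e.g. "SFTP", "AS2", "API"
    "record_count": int,
}

def validate_payload(payload: dict, contract: dict) -> list:
    """Return a list of contract violations; an empty list means the payload conforms."""
    errors = []
    for field, expected_type in contract.items():
        if field not in payload:
            errors.append("missing required field: " + field)
        elif not isinstance(payload[field], expected_type):
            errors.append(
                field + ": expected " + expected_type.__name__
                + ", got " + type(payload[field]).__name__
            )
    return errors

good = {"partner_id": "acme", "exchange_type": "SFTP", "record_count": 120}
bad = {"partner_id": "acme", "record_count": "120"}

print(validate_payload(good, CONTRACT_V1))  # []
print(validate_payload(bad, CONTRACT_V1))
```

In practice this kind of check would run inside a Lambda or Glue job and reject non-conforming files before delivery; versioning the contract (V1, V2, …) is what allows partner interfaces to evolve without breaking consumers.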
Experience Requirements
  • 3 to 5+ years of platform, cloud, or data engineering with a track record of secure B2B integrations at scale.
  • Advanced proficiency with AWS services for data exchange and governance including S3, IAM, KMS, Transfer Family, API Gateway, PrivateLink, VPC, Glue or Lake Formation, Step Functions, Lambda, and EventBridge; familiarity with Kinesis or MSK and Redshift or Snowflake on AWS.
  • Hands‑on implementation of SFTP, AS2, EDI, and API‑based exchanges, managed file transfer, and partner connectivity patterns.
  • Strong security fundamentals including least privilege, encryption in transit and at rest, tokenization and PII handling, network segmentation, and cross‑account access.
  • Expertise with infrastructure as code using Terraform or CloudFormation, CI/CD, Git workflows, and scripting with Python or Bash.
  • Proven experience building observability and reliability using CloudWatch, CloudTrail, metrics and tracing, alerting, runbooks, and SLOs.
  • Excellent troubleshooting with the ability to lead complex incident response and root cause analysis.
  • Experience with enterprise identity and federation using SAML, OAuth, or OIDC, and ABAC or RBAC policy models (preferred).
  • Knowledge…
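The least-privilege and cross-account S3 access requirements above can be illustrated with a small Python sketch that builds a read-only bucket policy for a single partner prefix. Account ID, bucket, role, and prefix names are hypothetical placeholders, not real resources.

```python
import json

# Hypothetical sketch of a least-privilege, cross-account S3 bucket policy:
# one partner role, read-only, scoped to one prefix. All identifiers below
# (account id, bucket, role name, prefix) are illustrative placeholders.

def partner_read_policy(account_id: str, bucket: str, prefix: str) -> dict:
    """Build a bucket policy granting a single partner role s3:GetObject on one prefix."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PartnerReadOnly",
                "Effect": "Allow",
                "Principal": {
                    "AWS": "arn:aws:iam::" + account_id + ":role/partner-reader"
                },
                "Action": ["s3:GetObject"],
                "Resource": "arn:aws:s3:::" + bucket + "/" + prefix + "/*",
            }
        ],
    }

policy = partner_read_policy("111122223333", "example-b2b-exchange", "outbound/acme")
print(json.dumps(policy, indent=2))
```

Scoping each partner to its own prefix and granting only `s3:GetObject` (rather than `s3:*`) is the kind of least-privilege pattern the role calls for; ABAC tags or Lake Formation permissions would layer on top of this for finer-grained, per-business-unit access.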