AWS Data Architect
Job in Columbia, Lexington County, South Carolina, 29228, USA
Listed on 2026-03-02
Listing for: Diligent Tec, Inc
Full Time position
Job specializations:
- IT/Tech: Data Engineer, AWS, Cloud Computing
Job Description
Job Title: AWS Data Architect
Location: Columbia, SC
Work Model: Onsite
Job Type: Long-Term Contract
Experience: 14+ Years
We are seeking a highly experienced, hands‑on AWS Data Architect to lead the design and execution of our cloud data ecosystem. The primary focus is the end‑to‑end migration and modernization of legacy on‑premises Microsoft stacks (SQL Server EDW, SSIS, and multi‑platform reporting) to a cloud‑native AWS architecture.
Core Roles & Responsibilities
- Design and evolve a modern Lakehouse / Data Mesh architecture using AWS S3, Glue, and Amazon Redshift
- Lead migration of MS SQL Server EDW to AWS ensuring performance and data integrity
- Modernize pipelines by refactoring SSIS packages into AWS Glue, Step Functions, or MWAA (Airflow)
- Drive BI/report modernization (SSRS, Crystal Reports, Power BI, Tableau, Hyperion → Amazon QuickSight)
- Implement governance, scalability, and compute optimization (Athena/Lambda vs EMR/MSK)
- Convert legacy SSIS ETL logic into Python / Spark (Glue / EMR)
- Perform database migration using AWS DMS & SCT
- Build real‑time streaming solutions (Amazon Kinesis / MSK)
- Automate infrastructure via Terraform / AWS CDK / CloudFormation
- Tune Amazon Redshift (distribution styles, sort keys, query performance)
- Optimize AWS costs (S3 lifecycle, Glue job efficiency)
- Implement security & governance (IAM, Lake Formation, KMS, Secrets Manager)
Required Tech Stack
- Migration / Storage: AWS DMS, SCT, Amazon S3
- Processing & Analytics: AWS Glue, EMR, Lambda, Redshift (RA3), Athena
- Data Stores: DynamoDB, Aurora (PostgreSQL/MySQL), Neptune
- Messaging / Orchestration: Kinesis, MSK, SQS, Step Functions, MWAA (Airflow)
Required Skills
- Deep expertise in SQL Server, SSIS, SSRS
- Experience with Crystal Reports, Power BI, Tableau, Hyperion
- Advanced Python & SQL (T‑SQL / Spark SQL)
- DevOps / CI‑CD (Git, Jenkins / GitLab / CodePipeline)
- Strong understanding of Parquet, Avro, Delta Lake