Data Architect
Tallahassee, Franklin County, Florida, 32318, USA
Listed on 2026-02-28
Listing for: Capital Technology Alliance
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Data Analyst, Data Security, Data Warehousing
Job Description & How to Apply Below
The Data Architect will support the client’s Enterprise Data and Analytics Platform (EDAP) initiative. This role is responsible for architecting, designing, and engineering a modern cloud-based data and analytics ecosystem, while collaborating closely with Department stakeholders, system integrators, and security teams to ensure alignment with business, governance, and regulatory requirements.
Job Duties:
- Provide architectural leadership and hands-on engineering support for the Enterprise Data and Analytics Platform (EDAP).
- Collaborate with business, technical, and executive stakeholders to translate business needs into scalable data and analytics solutions.
- Design, document, and implement cloud-based data lake, data warehouse, and lakehouse architectures in AWS and Snowflake.
- Develop current-state and future-state conceptual, logical, and physical data models, including reverse engineering existing systems.
- Design and optimize data pipelines supporting batch, CDC, and streaming integrations using industry-standard tools.
- Implement data quality rules, standards, profiling, lineage, and observability across the data ecosystem.
- Architect and enforce data governance, metadata management, cataloging, and master data management (MDM) solutions.
- Design secure data access using RBAC, ABAC, PBAC, row-level, and column-level security controls.
- Ensure compliance with HIPAA and other regulatory requirements through encryption, masking, anonymization, and privacy controls.
- Support analytics, business intelligence, and data science platforms, including BI, ML, and AI capabilities.
- Collaborate with infrastructure and security teams to design secure, cost-optimized AWS cloud environments.
- Support DevOps/DataOps processes, including CI/CD, testing, monitoring, and performance optimization.
- Review and validate system integrator deliverables, architecture artifacts, and test plans.
- Participate in project meetings, documentation, status reporting, and stakeholder communications.
Required Qualifications:
- Current data and/or analytics certification (e.g., CDMP) OR 18+ hours of relevant data and analytics training/webinars within the last three years.
- 5+ years of experience interfacing directly with business stakeholders and explaining technical architectures and data models to non-technical audiences.
- 6+ years of experience architecting, engineering, implementing, and supporting enterprise data warehouses, including 2+ years using Snowflake.
- 3+ years of experience architecting and supporting cloud-based data lakes using AWS S3 and Apache-based technologies (e.g., Parquet).
- 2+ years of experience designing and implementing cloud-based data lakehouse platforms such as Databricks, Snowflake, Delta Lake, Hudi, or Iceberg.
- 10+ years of experience in data modeling (conceptual, logical, physical, ER models) and data profiling/reverse engineering; proficiency with Erwin preferred.
- 6+ years of experience designing and engineering data pipelines using ETL, CDC, and streaming approaches with tools such as Informatica, AWS Glue, Spark, Kafka, Kinesis, or MuleSoft.
- 6+ years of experience with SQL programming; 3+ years with Python or similar object-oriented languages; 1+ year developing AWS Lambda functions.
- 5+ years of experience architecting and engineering relational and NoSQL databases (document, graph, key-value, columnar, vector).
- 3+ years of experience designing and implementing AWS cloud infrastructure for enterprise data and analytics platforms.
- 3+ years of experience architecting data security and privacy solutions, including DLP, encryption, masking, RBAC/ABAC, and HIPAA compliance.
- 3+ years of experience designing internal and external data sharing hubs and API-based data exchange solutions.
- 2+ years of experience using DevOps or DataOps practices.
- 5+ years of experience in data and analytics testing, quality assurance, and acceptance processes.
- 3+ years of experience implementing data governance and management tools such as data quality, metadata/catalog, and lineage solutions (e.g., Collibra, Informatica, Precisely).
- 2+ years of experience implementing Master Data Management (MDM) solutions using tools such as Informatica MDM,…