
Quality Analyst

Job in Indianapolis, Marion County, Indiana, 46262, USA
Listing for: VLink Inc
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Data Analyst, Data Engineer, Data Security, IT QA Tester / Automation
Salary/Wage Range or Industry Benchmark: USD 60,000 – 80,000 per year
Job Description & How to Apply Below
Location: Indianapolis

VLink Inc

VLink is a leading global provider of software engineering services with next‑gen technologies and best‑in‑class talent.

Great Place to Work® Certified™

Best Places to Work in CT

Job Title:

Quality Analyst

Location:

Indianapolis, Indiana (hybrid role)
Duration:
Long-Term Contract
Responsibilities
  • Participate in the development of AMS/PAS data testing frameworks to validate data pipelines across ETL/ELT processes.
  • Ensure data accuracy, completeness, and consistency across systems such as Applied Epic, IVANS Download, Guidewire, and integrated platforms like Salesforce FSC.
  • Build automated testing solutions to validate large volumes of policy, claims, billing, and customer data in real time.
  • Leverage tools such as Great Expectations, Soda, or custom Python/SQL test harnesses to automate quality assurance processes.
  • Validate data migration efforts from legacy systems (e.g., on‑prem databases, flat files) to cloud platforms such as AWS, Azure, or GCP.
  • Ensure that all data transformations, mappings, and business rules retain fidelity during cloud transitions.
  • Integrate data testing into DevOps pipelines using Jenkins, GitHub Actions, Azure DevOps, or similar tools.
  • Support continuous integration and delivery practices by automating regression, smoke, and sanity testing on data assets.
  • Design reusable synthetic and masked data sets for comprehensive and compliant test coverage.
  • Develop self‑service test data provisioning models to support QA, UAT, and SIT phases in insurance data projects.
  • Work closely with data engineers, QA teams, and business analysts to understand evolving data requirements and downstream impacts.
  • Serve as the data quality advocate across the personal lines business—underwriting, claims, billing, and customer service.
  • Apply strong knowledge of insurance pricing, quoting, and manual testing of comparative raters.
  • Test agency-to-carrier binding, issuance processes, and workflows for personal lines.
  • Validate APIs and batch integrations between Applied Epic, Ivans Download, AMS Platforms, Vertafore, Guidewire PC, DC, and external platforms like Salesforce FSC, or partner portals.
  • Ensure that customer and policyholder data flows correctly through system interfaces and matches expected formats and rules.
  • Use anomaly detection techniques and tools to flag irregularities in premium, claims, and customer data early in the lifecycle.
  • Recommend and help implement fixes or automated alerts to reduce operational and regulatory risk.
  • Support regulatory audits and reporting by ensuring lineage, traceability, and testing documentation is always current.
  • Assist in maintaining data integrity for compliance with NAIC, HIPAA, CCPA, GDPR, and other standards.
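For illustration, the automated data checks described in the responsibilities above can be sketched as a small custom Python test harness. This is only a sketch: the record fields (policy_number, premium, effective_date, expiration_date) and the rules are hypothetical examples, not the actual schema of Applied Epic or Guidewire.

```python
# Minimal sketch of a custom data-quality harness in plain Python.
# All field names and rules here are hypothetical examples.
from datetime import date

def validate_policy_record(rec: dict) -> list[str]:
    """Return a list of data-quality failures for one policy record."""
    failures = []
    if not rec.get("policy_number"):
        failures.append("missing policy_number")
    premium = rec.get("premium")
    if premium is None or premium < 0:
        failures.append("premium must be a non-negative number")
    eff = rec.get("effective_date")
    exp = rec.get("expiration_date")
    if eff and exp and exp <= eff:
        failures.append("expiration_date must be after effective_date")
    return failures

def run_suite(records: list[dict]) -> dict:
    """Run all checks over a batch and summarize pass/fail counts."""
    report = {"checked": len(records), "failed": 0, "issues": []}
    for i, rec in enumerate(records):
        issues = validate_policy_record(rec)
        if issues:
            report["failed"] += 1
            report["issues"].append((i, issues))
    return report
```

In practice a framework such as Great Expectations or Soda would replace the hand-rolled checks, but the structure is the same: declarative rules per record or column, plus a summary report suitable for wiring into a CI/CD pipeline.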
Qualifications
  • 3–4 years of hands‑on experience in data testing, Guidewire datasets, or quality engineering, preferably within the insurance industry.
  • Strong domain knowledge in Personal Lines Insurance (Auto, Homeowners, Renters, etc.), including policy, billing, and customer data.
  • Applied Epic – understanding of agency management system data structure.
  • Guidewire (PolicyCenter) – familiarity with data schema, business rules, and integration points.
  • Exposure to Salesforce Financial Services Cloud (FSC) or Salesforce CRM data models (add‑on but highly desirable).
  • Understanding of insurance data flows, regulatory reporting, and compliance needs.
  • Strong expertise in SQL for data profiling, validation, and comparison across systems.
  • Hands‑on experience with data quality tools such as Great Expectations, Datafold, Soda, Informatica Data Quality, or Talend DQ.
  • Proficiency in at least one scripting language (e.g., Python) to build custom validation tools or test scripts.
  • Familiarity with ETL/ELT testing for data pipelines developed in tools like dbt, Apache Airflow, Informatica, or Talend.
  • Understanding of data warehousing, lakehouse architectures, and cloud‑native data processing frameworks.
  • Ability to test data across multiple systems and APIs, validating end‑to‑end data movement from source (e.g., Guidewire) to target (e.g., warehouse or BI tool).
  • Familiarity with API testing tools like Postman, SoapUI, or REST Assured (especially for validating insurance platform integrations).
  • Knowledge of data privacy and compliance testing practices related to GDPR, CCPA, HIPAA, and NAIC guidelines.
  • Ability to support audit trails and test documentation for regulatory data flows and reports.
  • Experience integrating data testing into CI/CD pipelines using Jenkins, GitLab, Azure DevOps, or similar tools.
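The SQL profiling and source-to-target comparison skills listed above can be sketched with a small reconciliation query pair. The table names, policy numbers, and premiums below are invented for illustration (using an in-memory SQLite database as a stand-in for the real source and target systems):

```python
# Sketch of source-vs-target reconciliation SQL, run against an in-memory
# SQLite stand-in. Table names and data are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE source_policies (policy_number TEXT PRIMARY KEY, premium REAL);
CREATE TABLE target_policies (policy_number TEXT PRIMARY KEY, premium REAL);
INSERT INTO source_policies VALUES ('P-1001', 850.0), ('P-1002', 1200.0), ('P-1003', 640.0);
INSERT INTO target_policies VALUES ('P-1001', 850.0), ('P-1002', 1250.0);
""")

# Completeness check: rows present in the source but missing from the target.
missing = cur.execute("""
    SELECT s.policy_number
    FROM source_policies s
    LEFT JOIN target_policies t USING (policy_number)
    WHERE t.policy_number IS NULL
""").fetchall()

# Accuracy check: rows whose premium drifted during migration.
mismatched = cur.execute("""
    SELECT s.policy_number, s.premium, t.premium
    FROM source_policies s
    JOIN target_policies t USING (policy_number)
    WHERE s.premium <> t.premium
""").fetchall()
```

The same two queries (a LEFT JOIN for completeness, an inner join with a value comparison for accuracy) carry over to any SQL engine, which is why they are a staple of data migration validation.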
