
Dataiku / Snowflake Analyst - SS

Job in New York City, Richmond County, New York, USA
Listing for: NavitasPartners
Full Time position
Listed on 2026-01-27
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing, Data Analyst
Job Description
Position: Dataiku / Snowflake Analyst - 0123 SS #1
Job Title: Dataiku / Snowflake Analyst
Location: NYC, NY
Duration: 6-12 Months
Rate: $50
Job Summary

We are seeking a Dataiku / Snowflake Analyst responsible for designing, developing, testing, and delivering high-quality data solutions. This role requires strong hands-on experience with Snowflake, cloud platforms, and modern data pipelines, along with the ability to quickly learn new technologies and contribute to enterprise-scale analytics solutions.
Key Responsibilities
  • Architect and implement large-scale data intelligence solutions using Snowflake Data Warehouse
  • Design, build, automate, and maintain modern data pipelines in cloud environments
  • Develop and test data workflows; clearly document implementations, requirements, and test cases
  • Provide flexible and scalable solutions to complex business problems
  • Contribute to the evolution of data architecture to improve flexibility and usability
  • Follow established development processes, standards, and best practices
  • Conduct code reviews and mentor other developers
  • Troubleshoot and resolve production issues
  • Stay current with emerging technologies and recommend improvements where appropriate
Minimum Qualifications
  • BE / B.Tech / MCA
  • Strong written and verbal communication skills
Preferred Qualifications & Skills
  • Proven experience designing and implementing production-grade, large-scale data solutions on Snowflake
  • Hands-on experience with AWS and/or Azure
  • Experience building ETL pipelines using Python and Snowflake SnowSQL
  • Strong SQL development experience on Snowflake
  • Hands-on experience building productionized data ingestion and processing pipelines using:
    • Java
    • Spark
    • Scala
    • Python
  • Experience with enterprise data warehousing platforms such as Teradata, Oracle, or DB2
  • Knowledge of backup, recovery, and restoration of data and database systems
  • Ability to create field enablement and technical documentation to support analytics integration across on-prem and cloud data lake architectures
  • Strong presentation and stakeholder communication skills