
Data Engineer

Remote / Online - Candidates ideally in 110006, Delhi, Delhi, India
Listing for: Confidential
Remote/Work from Home position
Listed on 2026-02-05
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Big Data
Job Description
Who we are

We are Fluxon, a product development team founded by ex-Googlers and startup founders. We offer full-cycle software development: from ideation and design to build and go-to-market. We partner with visionary companies, ranging from fast-growing startups to tech leaders like Google and Stripe, to turn bold ideas into products with the power to transform the world.

This is a remote position, with a preference for candidates located in Hyderabad, Bangalore, or Gurgaon, India.

About the role

As the first Data Engineer at Fluxon, you'll take the lead in designing, building, and maintaining the data infrastructure that powers our products and enables data-driven decision-making for our clients.

You'll be responsible for:

Designing and implementing data models and warehouse schemas to support analytics and reporting needs
Building and maintaining reliable data pipelines to ingest, transform, and load data from various sources (a minimal pipeline sketch follows this list)
Collaborating with product and engineering teams to understand data requirements and deliver scalable solutions
Ensuring data quality, integrity, and accessibility across the organization
Optimizing query performance and improving the efficiency of existing data infrastructure
Maintaining comprehensive documentation for data models, pipelines, and processes for team reference
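
To make the pipeline work concrete, here is a minimal sketch of a daily extract-transform-load job in Apache Airflow (one of the orchestration tools listed below), written in Python. The DAG name, task bodies, and schedule are illustrative assumptions, not details of Fluxon's actual pipelines.

    # Minimal daily ELT pipeline sketch using Apache Airflow (2.4+).
    # All names and task bodies are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_orders(**context):
        # Placeholder: pull raw records from a source system or landing bucket.
        print("extracting raw orders for", context["ds"])


    def transform_orders(**context):
        # Placeholder: clean, deduplicate, and conform rows to the warehouse schema.
        print("transforming orders for", context["ds"])


    def load_orders(**context):
        # Placeholder: load conformed rows into the warehouse (e.g. BigQuery).
        print("loading orders for", context["ds"])


    with DAG(
        dag_id="daily_orders_elt",        # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_orders)
        transform = PythonOperator(task_id="transform", python_callable=transform_orders)
        load = PythonOperator(task_id="load", python_callable=load_orders)

        extract >> transform >> load

Each task here only prints; in a real pipeline the callables would call source APIs, apply transformations, and write to the warehouse, with retries and alerting configured on the DAG.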

You'll work with technologies including:

Data & Analytics
Data Warehouse: Google BigQuery, Snowflake, AWS Redshift, Databricks
ETL/Pipeline Tools: Apache Spark, Apache Airflow, dbt
Streaming & Queuing: Apache Kafka, Pub/Sub, RabbitMQ

Languages
SQL
Python (good to have)

Cloud & Infrastructure
Platforms: Google Cloud Platform (GCP) or Amazon Web Services (AWS)
Storage: Google Cloud Storage (GCS) or AWS S3
Orchestration & Processing: Cloud Composer (Airflow), Dataflow, Dataproc

Data Stores
Relational: PostgreSQL

Monitoring & Observability
GCP Cloud Monitoring Suite

Qualifications

3-5 years of industry experience in data engineering roles
Strong proficiency in SQL and experience with data warehousing concepts (dimensional modeling, star/snowflake schemas); an example query follows this list
Experience building and maintaining ETL/ELT pipelines
Familiarity with cloud data platforms, preferably GCP and BigQuery
Understanding of data modeling best practices and data quality principles
Solid understanding of software development practices including version control (Git) and CI/CD
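
As a hedged illustration of the dimensional-modeling and cloud-warehouse experience above, the sketch below runs a star-schema aggregation through the BigQuery Python client. The dataset, table, and column names (analytics.fact_orders, dim_date, dim_customer) are hypothetical, not an actual Fluxon schema.

    # Star-schema query sketch against BigQuery; all identifiers are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()  # picks up default GCP credentials and project

    query = """
        SELECT
            d.calendar_month,
            c.customer_region,
            SUM(f.order_amount) AS total_revenue
        FROM analytics.fact_orders AS f                        -- fact table: one row per order
        JOIN analytics.dim_date AS d USING (date_key)          -- conformed date dimension
        JOIN analytics.dim_customer AS c USING (customer_key)  -- customer dimension
        GROUP BY d.calendar_month, c.customer_region
        ORDER BY d.calendar_month
    """

    for row in client.query(query).result():
        print(row.calendar_month, row.customer_region, row.total_revenue)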

Nice to have:

Experience with Python for data processing and automation
Experience with Apache Spark or similar distributed processing frameworks (see the sketch after this list)
Familiarity with workflow orchestration tools (Airflow, Prefect)
Exposure to dbt or similar transformation tools
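
For the Spark item above, here is a minimal PySpark sketch of a distributed batch aggregation; the input path, column names, and output location are assumptions for illustration only.

    # Distributed batch rollup sketch with PySpark; paths and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

    # Read raw events (e.g. from GCS or S3) and count them per user and day.
    events = spark.read.parquet("gs://example-bucket/raw/events/")

    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_timestamp"))
        .groupBy("event_date", "user_id")
        .agg(F.count("*").alias("event_count"))
    )

    # Write the aggregate back out, partitioned by date for downstream warehouse loads.
    daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
        "gs://example-bucket/curated/daily_event_counts/"
    )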

What we offer
Exposure to high-profile Silicon Valley startups and enterprise companies
Competitive salary
Fully remote work with flexible hours
Flexible paid time off
Profit-sharing program
Healthcare
Parental leave that supports all paths to parenthood, including fostering and adopting
Gym membership and tuition reimbursement
Hands-on career development