
DE: SQL + Python + Data Pipeline

Job in Bengaluru, 560001, Bangalore, Karnataka, India
Listing for: Confidential
Full Time position
Listed on 2026-02-05
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Database Administrator, Data Warehousing
Job Description
Position: DE (SQL+Python+Data Pipeline)
Location: Bengaluru

Wissen Technology is Hiring for Junior Data Engineer

About Wissen Technology:
Wissen Technology is a globally recognized organization known for building solid technology teams, working with major financial institutions, and delivering high-quality solutions in IT services. With a strong presence in the financial industry, we provide cutting-edge solutions to address complex business challenges.

Role Overview:

We are looking for a Junior Data Engineer with a strong foundation in SQL, databases, and cloud data platforms. This role is ideal for candidates who have hands-on experience querying large datasets and working with data frames in Python, and who are eager to grow in a data engineering environment involving cloud technologies, data pipelines, and real-time data streaming.

Key Responsibilities:

Write, optimize, and maintain complex SQL queries for data extraction, transformation, and reporting.
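As a rough illustration of the kind of extraction-and-aggregation query this responsibility describes (the `orders` table and its columns are made up for the example, run against an in-memory SQLite database):

```python
import sqlite3

# Hypothetical example: a small aggregation query of the kind a junior DE
# would write, using an in-memory SQLite database and a made-up "orders" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "south", 120.0), (2, "south", 80.0), (3, "north", 50.0)],
)

# Extract + transform: order count, total, and average amount per region.
rows = conn.execute(
    """
    SELECT region,
           COUNT(*)    AS n_orders,
           SUM(amount) AS total_amount,
           AVG(amount) AS avg_amount
    FROM orders
    GROUP BY region
    ORDER BY total_amount DESC
    """
).fetchall()

for region, n_orders, total, avg in rows:
    print(region, n_orders, total, avg)
```

In practice the same query pattern (GROUP BY plus aggregate functions) carries over to warehouse engines such as Snowflake or Redshift; only the connection layer changes.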

Work with relational databases and cloud data warehouses such as Snowflake and Redshift.

Apply data modeling principles, including normalization and understanding of various data types.

Utilize Python along with Pandas or PySpark for data processing and analysis.
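A minimal sketch of the data-frame work referred to above, using Pandas on a small invented dataset (the column names and the fill-with-mean rule are assumptions for illustration):

```python
import pandas as pd

# Hypothetical data-frame task: clean missing values in a tiny made-up
# dataset, then derive a per-category summary.
df = pd.DataFrame(
    {
        "category": ["a", "a", "b", "b", "b"],
        "value": [10.0, None, 3.0, 4.0, 5.0],
    }
)

# Impute missing values with the column mean, then aggregate per category.
df["value"] = df["value"].fillna(df["value"].mean())
summary = df.groupby("category")["value"].agg(["count", "sum"]).reset_index()
print(summary)
```

The same logic translates almost line for line to PySpark (`groupBy` plus `agg`) when the data no longer fits on one machine.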

Assist in the development and maintenance of ETL/ELT pipelines to manage large-scale data workflows.
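The extract/transform/load split mentioned above can be sketched in a few functions. Everything here is hypothetical: the CSV input, the `facts` table, and the "drop negative amounts" rule are invented to show the shape of a pipeline, with SQLite standing in for a warehouse:

```python
import csv
import io
import sqlite3

# Hypothetical raw input, as it might arrive from an upstream export.
RAW_CSV = "id,amount\n1,10.5\n2,-3.0\n3,7.25\n"

def extract(text):
    # Parse CSV text into a list of dicts, one per row.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Convert types and drop negative amounts (a made-up business rule).
    out = []
    for r in rows:
        amount = float(r["amount"])
        if amount >= 0:
            out.append((int(r["id"]), amount))
    return out

def load(conn, rows):
    # Load cleaned rows into a table standing in for a warehouse fact table.
    conn.execute("CREATE TABLE IF NOT EXISTS facts (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO facts VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW_CSV)))
result = conn.execute("SELECT COUNT(*), SUM(amount) FROM facts").fetchone()
print(result)
```

Real pipelines add scheduling, retries, and monitoring around this core, but the extract → transform → load decomposition stays the same.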

Collaborate with analysts, engineers, and business teams to ensure clean, consistent, and accessible data.

Gain exposure to NoSQL databases such as MongoDB or DynamoDB.

Use cloud platforms, preferably AWS, for managing and accessing data services.

Create reports and dashboards using visualization tools like Tableau, Power BI, or Looker.

Support initiatives involving real-time data streaming and help manage data streaming platforms (e.g., Kafka, Kinesis).

Required Skills:

1+ years of experience working with SQL and databases.

Strong proficiency in SQL and understanding of data modeling and data warehousing concepts.

Basic experience with Python and data frame libraries like Pandas or PySpark.

Familiarity with cloud-based data warehouse platforms like Snowflake and Redshift.

Understanding of NoSQL databases and unstructured data handling.

Exposure to ETL/ELT tools and practices.

Awareness of cloud platforms, especially AWS and its data services.

Working knowledge of data visualization tools such as Tableau, Power BI, or Looker.

Interest in or basic understanding of real-time data streaming platforms.