Python Developer
Listed on 2026-03-05
Software Development
Data Engineer, Python
About Us
Building Services 32BJ Benefit Funds (“the Funds”) is the umbrella organization responsible for administering Health, Pension, Retirement Savings, Training, and Legal Services benefits to over 185,000 SEIU 32BJ members. Our mission is to make significant contributions to the lives of our members by providing high quality benefits and services. Through our commitment, we embody five core values:
Flexibility, Initiative, Respect, Sustainability, and Teamwork (FIRST). By following our core values, employees are open to different and new ways of doing things, take active steps to improve the organization, create an environment of trust and respect, approach their work with the intent of a positive outcome, and work collaboratively with colleagues.
The Funds oversee and manage $11 billion in assets across many varied and complex funds. These dollars come from a number of sources, including property owners who pay into the funds on behalf of their employees; overseeing and managing this money therefore requires highly skilled financial management professionals.
32BJ Benefit Funds will continue to drive innovation, equity, and technology insights to further help the lives of our hard-working members and their families. We use cutting-edge technology such as M365, Dynamics 365 CRM, Dynamics 365 F&O, Azure, AWS, SQL, Snowflake, QlikView, and more.
Please take a moment to watch our video to learn more about our culture and contributions to our members.
Job Code: 1541
Department Name: IT Development
Reports To: Team Lead, Data Integration
FLSA Status: Exempt
Union Code: N/A
Management: No
Job Summary: Under the supervision of the Team Lead, Data Integration, the Python Developer is responsible for building secure, scalable data pipelines and integrating data from multiple sources. The role involves deep collaboration with business analysts, other developers, and analytics and data science teams. The ideal candidate will have hands-on experience using Python APIs, managing Python environments, and implementing data security practices such as secure configuration and encryption. The role involves working with Databricks and modern data platforms, and following solid SDLC, documentation, and DevOps practices.
- Design, develop, and maintain robust Python-based applications and scalable data pipelines
- Write clean, scalable, and efficient code following best practices
- Develop and consume REST APIs in Python for data ingestion and integration
- Configure and manage Python environments (virtual environments, dependency management)
- Optimize applications for maximum speed and scalability
- Implement data encryption and security best practices for configuration as well as data in transit and at rest
- Build and optimize data workflows using Databricks (PySpark)
- Write, optimize, and maintain T‑SQL queries for SQL Server and PostgreSQL
- Perform ETL/ELT data processing and transformations
- Support data integration using SSIS and Azure Data Factory
- Develop well‑documented Jupyter/Databricks notebooks and maintain clear technical and process documentation for data pipelines and workflows
- Follow SDLC best practices throughout development and deployment
- Use Git (Azure DevOps) for source code control and CI/CD collaboration
- Participate in code reviews and troubleshoot data quality or performance issues
- Perform tasks as required by management/supervisory staff
- Provide support after hours and on weekends as needed
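As a rough illustration of the ingestion and transformation duties above, a pipeline step often consumes a REST API and normalizes the records before loading. This is a minimal sketch only; the endpoint URL and field names (`member_id`, `name`) are hypothetical, not the Funds' actual systems:

```python
"""Illustrative sketch of a small REST-ingestion step (hypothetical endpoint)."""
import json
import urllib.request

API_URL = "https://api.example.com/members"  # hypothetical endpoint


def fetch_records(url: str = API_URL) -> list[dict]:
    """Consume a REST API and return its JSON payload as a list of records."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)


def transform(records: list[dict]) -> list[dict]:
    """Normalize raw records: trim whitespace and drop rows missing an id."""
    cleaned = []
    for rec in records:
        if rec.get("member_id") is None:
            continue  # skip rows that cannot be keyed
        cleaned.append({
            "member_id": rec["member_id"],
            "name": str(rec.get("name", "")).strip(),
        })
    return cleaned
```

Keeping the transform a pure function of its input makes it straightforward to unit-test offline, independent of the API call.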
- 2+ years of experience in Python-based Data Engineering
- Experience working with RESTful APIs in Python
- Experience configuring and managing Python environments (venv, conda, pip)
- Hands‑on experience with Azure Key Vault for secrets management
- Knowledge of data encryption and data security fundamentals
- Strong SQL skills with SQL Server, PostgreSQL, and T‑SQL
- Experience building ETL/ELT pipelines using Databricks (PySpark)
- Understanding of SDLC, version control (Git), and CI/CD processes
Preferred (Nice-to-Have) Skills
- Experience with SSIS
- Experience with Azure Data Factory
- Familiarity with Dremio
- Exposure to Azure cloud services and AWS
- Knowledge of data modeling techniques
- Experience working in Agile/Scrum environments
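As a rough illustration of the secrets-management and secure-configuration skills listed above, credentials are kept out of source code and fetched at runtime. This sketch uses environment variables as a standard-library stand-in for a managed store such as Azure Key Vault; the variable names are hypothetical:

```python
"""Sketch of runtime secret lookup (environment variables as a stand-in)."""
import os


def get_secret(name: str) -> str:
    """Read a secret from the environment; in production this lookup would
    typically go through a secrets manager such as Azure Key Vault."""
    value = os.environ.get(name)
    if value is None:
        raise KeyError(f"missing required secret: {name}")
    return value


# Usage (hypothetical variable name):
# conn_str = get_secret("WAREHOUSE_CONNECTION_STRING")
```

Failing loudly on a missing secret, rather than falling back to a default, keeps misconfigured deployments from silently connecting with wrong or empty credentials.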
Bachelor’s degree in Computer Science or a related discipline.
Reasoning Ability: High
Physical Demands:
- Under 1/3 of the time: Standing, Walking, Climbing or Balancing, Stooping, Kneeling, Crouching, or Crawling
- Over 2/3 of the time: Talking or Hearing
- 100% of the time: Using Hands
- 1/3 to 2/3 of the time: Work near moving or mechanical parts, exposure to radiation, moderate noise