Oracle Warehouse Builder/Developer
Oracle Warehouse Builder, OWB, Oracle Workflow Builder, Oracle TBSS
Oracle Warehouse Builder 9i (Client Version 9.0.2.62.3/Repository Version 9.0.2.0.0)
Oracle Warehouse Builder 4
Oracle Workflow Builder 2.6.2
Oracle Database 10g, TNS for IBM/AIX RISC System/6000, Version 10.2.0.5.0 - Production
More than 5 years of experience with Oracle Warehouse Builder (OWB) and Oracle Workflow Builder.
Expert knowledge of Oracle PL/SQL to develop everything from individual code objects to entire data marts.
Scheduling with Oracle TBSS (creating and running jobs) and trigger-based scheduling for file sources driven by control files.
Must have design and development experience building data pipeline solutions from different source systems (files, Oracle) into data lakes.
Must have been involved in designing and creating Hive tables, and in loading and analyzing data using Hive queries.
Must have knowledge of CA Workload Automation DE 12.2 for creating and scheduling jobs.
Extensive knowledge of the entire Change/Incident/Problem management life cycle using ServiceNow.
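As context for the PL/SQL expectation above, a minimal sketch of the kind of code object involved — merging staged rows into a data-mart dimension table. All table, column, and procedure names here are hypothetical, not taken from this role's actual environment:

```sql
-- Hypothetical sketch only: merge staged rows into a data-mart
-- dimension table. Names are illustrative, not from this role.
CREATE OR REPLACE PROCEDURE load_customer_dim AS
BEGIN
  MERGE INTO customer_dim d
  USING stg_customer s
     ON (d.customer_id = s.customer_id)
  WHEN MATCHED THEN
    UPDATE SET d.customer_name = s.customer_name,
               d.updated_on    = SYSDATE
  WHEN NOT MATCHED THEN
    INSERT (customer_id, customer_name, updated_on)
    VALUES (s.customer_id, s.customer_name, SYSDATE);
  COMMIT;
END load_customer_dim;
/
```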
Oracle Enterprise Manager 10gR1 (monitoring jobs and tablespace utilization).
Extensive knowledge of fetching mainframe COBOL files (ASCII and EBCDIC formats) to the landing area, and of processing (formatting) and loading (with error handling) these files into Oracle tables using SQL*Loader and external tables.
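To illustrate the external-table approach mentioned above, a sketch of a fixed-width mainframe extract exposed as an Oracle external table. The table name, directory object, and record layout are assumptions for illustration, not details from this posting:

```sql
-- Hypothetical external table over a landed mainframe file.
-- Directory object, file name, and layout are illustrative only.
CREATE TABLE ext_accounts (
  account_no VARCHAR2(10),
  balance    NUMBER(12,2)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY landing_dir
  ACCESS PARAMETERS (
    RECORDS FIXED 22            -- 10 + 12 bytes per record
    -- For EBCDIC sources, declare the source character set, e.g.:
    -- CHARACTERSET WE8EBCDIC500
    FIELDS (
      account_no CHAR(10),
      balance    CHAR(12)
    )
    REJECT ROWS WITH ALL NULL FIELDS
  )
  LOCATION ('accounts.dat')
)
REJECT LIMIT UNLIMITED;
```

Rejected rows land in the bad/log files named by the access driver, which is where the error handling mentioned above typically hooks in.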
Extensive knowledge of Oracle Forms 6 integration with OWB 4.
Work closely with the business owner teams and functional/data analysts throughout the entire development/BAU process.
Work closely with the AIX and DBA support teams on access privileges, storage issues, etc.
Work closely with the Batch Operations and MFT teams on file transfer issues.
Migration from Oracle to the Hadoop ecosystem:
Must have working experience with Hadoop ecosystem components such as HDFS, MapReduce, YARN, etc.
Must have working knowledge of Scala and Spark DataFrames to convert the existing code to Hadoop data lakes.
Must have knowledge of creating Hive partitions, dynamic partitions, and buckets.
Use Denodo for data virtualization to provide the required data access to end users.
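The Hive partitioning and bucketing items above can be sketched as follows — a partitioned, bucketed table plus a dynamic-partition load. All table and column names are hypothetical:

```sql
-- Hypothetical HiveQL sketch: partitioned, bucketed table and a
-- dynamic-partition insert. Names are illustrative only.
CREATE TABLE sales_part (
  txn_id BIGINT,
  amount DECIMAL(12,2)
)
PARTITIONED BY (txn_date STRING)
CLUSTERED BY (txn_id) INTO 8 BUCKETS
STORED AS ORC;

-- Enable dynamic partitioning for the load.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- The partition column comes last in the SELECT list.
INSERT INTO TABLE sales_part PARTITION (txn_date)
SELECT txn_id, amount, txn_date
FROM sales_staging;
```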
Position Requirements
5+ years of work experience