
Manager, Data Operations & Management T500-22541

Job in 500001, Hyderabad, Telangana, India
Listing for: McDonald's Global Office in India
Full Time position
Listed on 2026-02-05
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Position: Manager, Data Operations & Management [T500-22541]
About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Overview:

As a key member of the McDonald’s Global Data Platform & Operations team, this role focuses on the physical design, management, and optimization of cloud database platforms, with a strong emphasis on GCP BigQuery. The role supports the implementation and maintenance of database structures, metadata definitions, and source-to-target mappings, while ensuring database platforms are reliable, scalable, and cost-efficient in support of advanced analytics and AI use cases.

The ideal candidate is passionate about databases and brings hands-on experience with cloud databases, strong SQL skills, and a practical understanding of batch, event-based, and streaming data ingestion into database layers. They partner closely with data engineers, analysts, and platform stakeholders to ensure that physical data designs and platform operations meet performance, governance, and business needs.

Responsibilities:
  • Meet with data analysts, business intelligence teams, and other key stakeholders to gather, understand, and address complex data requirements. Ensure that the data solutions designed are both technically sound and aligned with business needs.
  • Hands-on design, development, deployment, and management of database architecture solutions. Ensure that the solutions are scalable, flexible, and tailored for advanced analytics and AI scenarios utilizing GCP BigQuery.
  • Execute ongoing GCP BigQuery platform administration and health checks, including configuration management, slot and reservation allocation, capacity monitoring, and remediation of platform-level issues to ensure stable, scalable, and cost-efficient BigQuery services.
  • Implement database operations best practices, including:
    • Alerts & Monitoring: Configure proactive alerts for query performance, slot utilization, storage consumption, and failed jobs; monitor cost and usage trends; automate health checks for schema integrity (a monitoring query sketch follows this list).
    • SQL Tuning: Review execution plans, optimize joins and aggregations, leverage partitioning and clustering, avoid SELECT *, and recommend materialized views for frequent queries (a tuning sketch follows this list).
    • User Recommendations: Provide query optimization guidelines, educate analysts on efficient SQL practices, encourage parameterized queries, and share dashboards for performance and cost transparency.
  • Plan and coordinate database downtime for maintenance and other platform activities.
  • Work closely with the IT security team to ensure that all database platforms adhere to strict security protocols, ensuring data privacy and protection.
  • Participate in data governance initiatives. Collaborate with cross-functional teams to drive best practices, emphasizing data accuracy and security and ensuring that all architectures align with regulatory compliance standards.
  • Stay up to date with the latest trends and best practices for database platform architecture. Continuously evaluate existing designs and processes, recommending and implementing improvements as necessary.
  • Convey complex ideas in a clear and understandable manner, tailoring communication to the audience and the nature of the project.
  • Engage in deployment efforts through CI/CD pipelines or manual intervention.
  • Be available to support a 24/7 environment, including on-call support.
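
For context on the Alerts & Monitoring item above, the following is a minimal BigQuery (GoogleSQL) sketch of a query over the jobs metadata view that surfaces failed and slot-heavy jobs; the region qualifier, 7-day lookback, and slot threshold are illustrative assumptions, not requirements stated in this posting.

    -- A minimal sketch, assuming jobs metadata in the US multi-region;
    -- the lookback window and slot threshold below are illustrative.
    SELECT
      user_email,
      job_id,
      total_slot_ms / 1000 / 60 AS slot_minutes,           -- approximate slot time consumed
      total_bytes_processed / POW(10, 9) AS gb_processed,  -- data scanned by the job
      error_result.reason AS error_reason                  -- non-NULL only for failed jobs
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
      AND (error_result IS NOT NULL OR total_slot_ms > 60 * 60 * 1000)  -- failures or > 1 slot-hour
    ORDER BY total_slot_ms DESC
    LIMIT 50;

A query along these lines can back scheduled alerts or the cost and performance dashboards mentioned under User Recommendations.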
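
For the SQL Tuning item, this is a minimal sketch of BigQuery DDL illustrating partitioning, clustering, and a materialized view for a frequent aggregate; the dataset, table, and column names (sales_dw.orders, order_ts, store_id) are hypothetical.

    -- A minimal sketch with hypothetical names (sales_dw.orders, order_ts, store_id);
    -- partition pruning and clustering reduce the bytes scanned per query.
    CREATE TABLE IF NOT EXISTS sales_dw.orders
    (
      order_id    STRING,
      store_id    STRING,
      order_ts    TIMESTAMP,
      order_total NUMERIC
    )
    PARTITION BY DATE(order_ts)   -- date-filtered queries scan only matching partitions
    CLUSTER BY store_id;          -- co-locates rows commonly filtered by store

    -- Materialized view for a frequent aggregate, so repeated dashboard queries
    -- read precomputed results instead of rescanning the base table.
    CREATE MATERIALIZED VIEW IF NOT EXISTS sales_dw.daily_store_sales AS
    SELECT
      store_id,
      DATE(order_ts) AS order_date,
      SUM(order_total) AS total_sales
    FROM sales_dw.orders
    GROUP BY store_id, order_date;

    -- Prefer explicit column lists over SELECT * and filter on the partition column.
    SELECT store_id, total_sales
    FROM sales_dw.daily_store_sales
    WHERE order_date = CURRENT_DATE();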

Qualifications:

  • Bachelor’s or Master’s degree in Information Technology or a related field.
  • 2+ years of experience in data design for logical and physical models, including ER, dimensional, and canonical modeling approaches for analytics and data warehousing.
  • 3+ years of experience with cloud services such as GCP and AWS (GCP preferred).
  • 3+ years of hands-on experience managing and optimizing GCP…