Data Engineering Scientist AI/ML - Remote
Minnetonka, Hennepin County, Minnesota, 55345, USA
Listed on 2026-02-28
IT/Tech
AI Engineer, Machine Learning/ ML Engineer, Data Scientist, Data Science Manager
Data Engineering Scientist with AI/ML – Remote
Company: United Health Group
Location: Minnetonka, MN (Remote/On-site)
Job Type: Full-time
Benefits: Retirement
Job Description
Optum Tech is seeking a highly skilled and motivated AI/ML Engineer to lead innovation in claims adjudication through advanced Generative AI solutions. This role emphasizes Large Language Models (LLMs), agentic frameworks, and prompt engineering to automate complex workflows. You will design and deploy secure, scalable, and responsible AI systems while collaborating across teams to deliver measurable impact.
You will enjoy the flexibility to work remotely from anywhere within the U.S. For all hires in the Minneapolis or Washington, D.C. area, you will be required to work in the office a minimum of four days per week.
Primary Responsibilities
- Design, develop, and deploy AI/ML and Generative AI models for predictive, prescriptive, and generative analytics across healthcare datasets.
- Implement advanced architectures including LLMs (GPT, Gemini, LLaMA), Retrieval-Augmented Generation (RAG), and Agentic Frameworks.
- Build and optimize end-to-end pipelines using Python (scikit-learn, pandas, Flask, LangChain), PySpark, T‑SQL, and SQL.
- Develop and fine‑tune multiple GenAI models for NLP, summarization, prompt engineering, and conversational AI.
- Apply MLOps best practices: model versioning, drift analysis, quantization, MLflow, containerization with Docker, and CI/CD pipelines.
- Work with cloud platforms: Azure (Databricks, ML Studio, Data Factory, Data Lake, Delta Tables), AWS, and GCP for scalable deployments.
- Integrate data warehousing solutions like Snowflake and manage large‑scale data pipelines.
- Collaborate in an Agile environment, participate in sprint planning, and maintain code repositories using GitHub/Git.
- Ensure compliance with security and governance standards for healthcare data.
- Coach and mentor junior team members.
- Design and implement machine learning and deep learning models for classification and NLP tasks.
- Build and maintain end-to-end ML pipelines including data preprocessing, model training, evaluation, and deployment.
- Develop and fine‑tune LLM‑based applications using LangChain, LangGraph, and other GenAI frameworks.
- Build multi-agent workflows and RAG pipelines for enterprise use cases.
- Leverage AWS Bedrock and Google Vertex AI for scalable and production‑grade GenAI deployments.
- Implement guardrails to prevent prompt injection, reduce hallucinations, and ensure safe model outputs.
- Apply best practices for LLM security, including output moderation, access control, and auditability.
- Ensure compliance with Responsible AI principles—fairness, transparency, and explainability.
- Deploy and manage GenAI solutions on AWS and Google Cloud, utilizing services like Bedrock, SageMaker, and Vertex AI.
- Integrate LLMs with enterprise systems using REST APIs, SDKs, and orchestration tools.
- Work closely with product managers, data scientists, and platform teams to translate business needs into GenAI solutions.
- Mentor junior engineers and contribute to internal knowledge‑sharing initiatives.
Required Qualifications
- 5 years of hands‑on experience with AI/ML techniques such as prompt engineering, Retrieval-Augmented Generation (RAG), and agentic AI.
- Solid expertise in Python, PySpark, T‑SQL, SQL, and big data technologies (Hadoop, Spark).
- Proven deep knowledge of statistics, data modeling, and simulation.
- Hands‑on experience with Generative AI frameworks/architectures (LangChain, Hugging Face, OpenAI APIs).
- Proficiency in cloud technologies: Azure (Databricks, ML Studio), AWS Bedrock, Azure Foundry, Kafka, and cloud‑native AI services.
- Demonstrated familiarity with CI/CD pipelines, GitHub Actions, and containerization tools.
- Proven excellent problem‑solving skills and ability to handle ambiguity.
- Proven solid understanding of LLM security, prompt engineering, and responsible AI practices.
- Experience with LLMs (GPT, Gemini, LLaMA) and prompt‑based learning.
- Knowledge of Kafka, TensorFlow, and advanced deep learning architectures (CNNs, autoencoders).
- Solid understanding of Agile methodologies and DevOps practices.
- Internal data management and big data handling experience.
- All employees working remotely will be required to adhere to United Health Group's Telecommuter Policy.
The salary for this role will range from $71,200 to $127,200 annually based on full‑time employment.
Final date to receive applications
This posting will remain open for a minimum of 2 business days or until a sufficient candidate pool has been collected. The job posting may close early due to volume of applicants.