Friday 

Room 1 

11:40 - 12:40 

(UTC+10)

Talk (60 min)

Fine-Grained Authorization: The Missing Piece in Agentic AI Security

As organizations integrate Generative AI into their systems, securing data access for both human users and AI agents has become a critical challenge. Traditional access control approaches fall short when AI systems need contextual, document-level permissions at scale and speed.

AI
Security

This talk demonstrates how Fine-Grained Authorization (FGA) provides robust security for Retrieval-Augmented Generation (RAG) and agentic AI systems. Learn how to implement permission models that protect sensitive information while enabling AI to access only authorized data.

The talk explores implementations using OpenFGA and LangChain, showcasing how to build security directly into AI retrieval pipelines.
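As a rough illustration of the idea, document-level authorization can be enforced by post-filtering vector-search results before they reach the LLM. The tuple store and `check()` below are simplified stand-ins for a real OpenFGA instance (the relation name "viewer" mirrors OpenFGA's relationship-tuple model but the code itself is hypothetical):

```python
# Relationship tuples: (user, relation, object) — a stand-in for an FGA store.
TUPLES = {
    ("user:alice", "viewer", "doc:q3-report"),
    ("user:alice", "viewer", "doc:handbook"),
    ("user:bob",   "viewer", "doc:handbook"),
}

def check(user: str, relation: str, obj: str) -> bool:
    """Placeholder for an authorization check (e.g. OpenFGA's Check API)."""
    return (user, relation, obj) in TUPLES

def filter_retrieved(user: str, retrieved_doc_ids: list[str]) -> list[str]:
    """Post-filter retrieval hits so the LLM only sees authorized documents."""
    return [d for d in retrieved_doc_ids if check(user, "viewer", d)]

# A vector store might return these as the top-k nearest neighbours:
hits = ["doc:q3-report", "doc:handbook", "doc:salaries"]
print(filter_retrieved("user:bob", hits))
```

In a production pipeline this filter would typically sit inside the retriever itself (LangChain retrievers can be wrapped this way), so unauthorized chunks never enter the prompt context.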

The presenters will draw on real-world case studies to show how enterprises can prevent data leakage, implement multi-tenant isolation, and maintain audit trails while scaling to billions of access decisions.

Open source tools like OpenFGA and techniques for integrating with vector databases will be featured, along with best practices for real-world deployment. Join us to learn how to maintain security without sacrificing performance or user experience in agentic and generative AI applications.

Shivay Lamba

Shivay Lamba is a software developer specializing in DevOps, machine learning, and full-stack development.

He is an open source enthusiast and has taken part in programs such as Google Code-in and Google Summer of Code as a mentor, and is currently an MLH Fellow. He has also worked at organizations including Amazon, EY, and Genpact. He is a TensorFlow.js SIG member and community lead from India.