
5. Connecting your data to an AI Agent

Learn how to connect your agent to internal files and systems using StackAI’s powerful Knowledge Base tools.

A powerful AI agent needs access to the right data. In this video, you’ll learn how to use knowledge bases in StackAI to give your agent access to private files, systems, and APIs, such as SharePoint, Google Drive, Notion, and more. You’ll see how to configure file indexing, set search filters and metadata, control access with role-based permissions, and let your agent search data only when needed using tools.

Summary

  • Why knowledge bases matter: build context-aware agents powered by private data

  • Two ways to configure knowledge bases: left-side panel and within workflows

  • Upload static documents (e.g. PDFs) and adjust chunking, indexing, and search settings

  • Connect knowledge bases to LLMs, referencing documents and questions inside prompts

  • Preview parsed files and assign metadata tags for better document discovery

  • Drag and drop other types of nodes: APIs, tables, web sources, or dynamic file systems

  • Connect to SharePoint, Google Drive, Dropbox, and more via secure credentials

  • Enable auto-sync to keep data up to date as files change

  • Use a knowledge base as a tool, so the LLM queries it only when needed

  • Configure search strategy, result length, and advanced tool settings

  • Set up role-based access control to restrict data access by person or group

  • Share knowledge bases securely via the central manager interface

  • Next up: deep dive into LLM tooling and advanced capabilities inside your workflows

────────────────────────────────────────────────────────────────────────────────────

Course Overview

At StackAI, our customers are builders, and we want to equip them with the training they need to create today’s leading AI agents. That’s why we’ve developed StackAI Academy.

StackAI Academy gives our customers a foundational set of skills on the StackAI platform, so they can create AI agents that solve critical use cases in finance, healthcare, education, and more.

Welcome to StackAI Academy - Course #5 - Deep-Dive on Knowledge Bases. In this course, you’ll learn everything you need to know about Knowledge Bases in StackAI.

Knowledge Bases: Where to Find Them

There are two main places to configure Knowledge Bases in StackAI. One is the Knowledge Bases tab on the main menu.

You can also access the Knowledge Bases from the Workflow Builder. 

Here you’ll see a variety of Knowledge Base node types: static documents, static websites, tables, documents hosted on file servers, and more.

Setting Up Your First Knowledge Base

The simplest way to set up a Knowledge Base is to drag and drop the basic Documents node from the sidebar.

Connect the node to your LLM. Then upload a supported file type.

This document will be uploaded to StackAI, and you can configure the parameters that control how StackAI breaks up the file and makes it accessible to large language models.

This includes different chunking strategies, metadata filtering, and the number of search results returned. 
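
To make the chunking setting concrete, here is a minimal, illustrative Python sketch of fixed-size chunking with overlap, the kind of strategy a knowledge base applies before indexing a file. The function name and parameters are hypothetical examples, not StackAI’s actual implementation.

```python
# Illustrative only: a simple fixed-size chunking strategy with overlap,
# similar in spirit to what a knowledge base does before indexing a file.
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks so no passage is cut off mid-context."""
    chunks = []
    start = 0
    while start < len(text):
        end = start + chunk_size
        chunks.append(text[start:end])
        start = end - overlap  # overlap keeps neighboring chunks connected
    return chunks

document = "Quarterly revenue grew 12% year over year. " * 40
print(len(chunk_text(document)))  # number of chunks that would be indexed
```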

Manage Your Documents

When you click on the file in the Knowledge Base, you can see a complete preview of the document.

You can also view and create metadata to help classify the document.

In the Parsed tab, you can see the parsed data. 

Additionally, you can view the individual chunks that will be retrieved and passed to the LLM. Chunking makes large documents easier for the model to search and reference.
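
As an illustration of how metadata tags improve document discovery, the sketch below filters chunks by metadata before a search. The field names (department, year) are hypothetical examples, not StackAI’s schema.

```python
# Illustrative only: filtering indexed chunks by metadata before searching.
# The metadata fields below are hypothetical examples, not StackAI's schema.
chunks = [
    {"text": "FY2024 revenue summary...", "metadata": {"department": "finance", "year": 2024}},
    {"text": "Onboarding checklist...", "metadata": {"department": "hr", "year": 2023}},
]

def filter_by_metadata(chunks: list[dict], **filters) -> list[dict]:
    """Keep only chunks whose metadata matches every requested filter."""
    return [c for c in chunks if all(c["metadata"].get(k) == v for k, v in filters.items())]

finance_2024 = filter_by_metadata(chunks, department="finance", year=2024)
print([c["text"] for c in finance_2024])
```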

How to Integrate Knowledge Bases with LLMs

You will need to provide the LLM with a reference to the Knowledge Base in order to utilize the data.

You can add in other Knowledge Bases in the same way. Simply drag and drop them onto the canvas, connect them to your LLMs, and reference the Knowledge Bases in the “Prompt” section.
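
Conceptually, referencing a Knowledge Base in a prompt means the most relevant chunks are retrieved and placed alongside the user’s question. The sketch below shows that general retrieval-augmented pattern in plain Python; it is an assumption about the typical flow, not StackAI’s internal prompt format.

```python
# Illustrative only: assembling a prompt from retrieved knowledge-base chunks.
def build_prompt(question: str, retrieved_chunks: list[str]) -> str:
    """Combine retrieved context with the user's question, as a typical RAG prompt does."""
    context = "\n\n".join(retrieved_chunks)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

prompt = build_prompt(
    "What was Q3 revenue?",
    ["Q3 revenue was $4.2M, up 12% year over year.", "Q2 revenue was $3.7M."],
)
print(prompt)
```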

You can also access documents stored on a file server, including platforms such as SharePoint, Google Drive, and Dropbox.

Create a Connection to Document Stores

To access documents stored on a file server, you must create a connection to the server with your login credentials. 

You can also create connections from the main menu, under “Connections”.  

Once you’ve created a connection, you can make the documents on your file server available to your LLM.
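
As a rough illustration, a connection is essentially a named set of provider credentials that the platform stores securely and reuses. The sketch below models that idea in Python, reading secrets from environment variables; the field names are hypothetical and real connections are configured in the StackAI interface, not in code.

```python
# Illustrative only: modeling a file-server connection as named credentials.
# Field names are hypothetical; real connections are configured in the StackAI UI.
import os
from dataclasses import dataclass

@dataclass
class Connection:
    provider: str       # e.g. "sharepoint", "google_drive", "dropbox"
    client_id: str
    client_secret: str  # never hard-code secrets; load them from the environment

sharepoint = Connection(
    provider="sharepoint",
    client_id=os.environ.get("SHAREPOINT_CLIENT_ID", ""),
    client_secret=os.environ.get("SHAREPOINT_CLIENT_SECRET", ""),
)
print(f"Configured connection for: {sharepoint.provider}")
```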

Knowledge Bases as a Tool

You can also leverage Knowledge Bases as Tools. In StackAI, a Tool is an app that runs only if the LLM decides it is needed.

When you use a Knowledge Base as a Tool, the LLM will only reference its documents if the prompt calls for it.
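
The idea behind a Tool is that the model requests it only when the question requires private data; otherwise it answers directly. Here is a minimal, hypothetical sketch of that dispatch logic, independent of any particular LLM API and not StackAI’s implementation.

```python
# Illustrative only: a knowledge base exposed as a tool the model may choose to call.
def search_knowledge_base(query: str) -> str:
    """Stand-in for a real KB search; returns the best-matching passage."""
    return "Policy doc: expenses over $500 require manager approval."

def run_agent(model_decision: dict) -> str:
    """Dispatch loop: only query the KB if the model asked for the tool."""
    if model_decision.get("tool") == "search_knowledge_base":
        return search_knowledge_base(model_decision["arguments"]["query"])
    return model_decision.get("answer", "")

# The model decided the prompt needs private data, so it requested the tool:
decision = {"tool": "search_knowledge_base", "arguments": {"query": "expense approval policy"}}
print(run_agent(decision))
```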

Role-Based Access Controls (RBAC)

Finally, let’s take a look at role-based access controls (RBAC).

In certain cases, you might not want particular users to have access to the underlying documents in a Knowledge Base. Restricted users can still ask questions about the documents through the LLM, but they cannot download the files themselves.

To set role-based access controls, you must first share the Knowledge Base with the user in question. Enter the settings section of the Knowledge Base and click on “Share Knowledge Base”. 

This will take you to a screen where you can invite team members to access the Knowledge Base, and also set permissions for each user.

You can also utilize these access control settings from the Knowledge Base section of the main menu.  

With this, you can control the information that users have access to, while still allowing those with lesser permissions to search the documents.
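
To make the permission model concrete, here is a hypothetical sketch of the query-versus-download distinction: a restricted role can still search and ask questions, while only higher roles can download source files. The role names and checks are illustrative, not StackAI’s implementation.

```python
# Illustrative only: role-based access control separating "query" from "download".
PERMISSIONS = {
    "admin":  {"query": True, "download": True},
    "editor": {"query": True, "download": True},
    "viewer": {"query": True, "download": False},  # can ask the LLM, cannot export files
}

def can(role: str, action: str) -> bool:
    """Check whether a role is allowed to perform an action on the knowledge base."""
    return PERMISSIONS.get(role, {}).get(action, False)

print(can("viewer", "query"))     # True: restricted users can still search
print(can("viewer", "download"))  # False: but they cannot download the documents
```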