The Anatomy of a GPTScript Chatbot Assistant

Aug 1, 2024 by Bill Maxwell

Chatbots have evolved in recent years, and with the introduction of Large Language Models (LLMs) into the mainstream, they are playing a prominent role in user interactions with cutting-edge AI systems. Early chatbots were very structured, based on rigid sets of rules developed by programmers. These systems gradually improved as they adopted natural language abilities in an attempt to become more conversational. However, they still required extensive scripts and rules to be developed before users could interact with them effectively.

With the relatively recent introduction of LLM-based systems, users can interact with chatbots naturally because of the deep understanding LLMs have of natural language. Leveraging powerful capabilities around function calling (tools), we can enable the LLM to pull in real-time data and take actions on behalf of the user. The aggregated tooling and chat capability is referred to as an agent. The idea is that you can have multiple agents working together, each performing tasks within a specific domain.

With GPTScript you can build powerful chatbots that interact with systems and pull in information as needed, all within minutes. Powering these capabilities, GPTScript provides a set of built-in constructs such as OpenAPI integration, chat, and contexts, along with the ability to easily add further capabilities through tools.

Let’s take a look at each of the components that make up a chatbot assistant.

Chat

Here is an example of a simple chatbot:

    Chat: true

    Welcome the user

With this simple GPTScript file you can start a chat session with the LLM. In this case, it is no different than chatting directly with the LLM provider.
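Assuming the snippet above is saved to a file (the name `chat.gpt` here is illustrative), the session is started with the gptscript CLI:

```
gptscript chat.gpt
```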

The main building block of the chatbot is the Chat: true directive. This changes the behavior from a run-once script into an interactive agent. When a GPTScript defines this line, it starts a dialog with the end user. Without any other configuration, it is simply a chat window with the model, similar to ChatGPT or other chat interfaces. It takes on more powerful capabilities when combined with tools that allow it to perform additional actions on the system.

Tools

Tools are what enable GPTScripts to carry out actions and interact with systems. Tools can gather more information for processing by the LLM; for instance, you can use the AWS CLI to analyze costs in your AWS account. You can use tools to provision new VMs or troubleshoot issues in a Kubernetes cluster. Multiple tools can be combined by the LLM to perform more complex tasks.

Building on our example above, we can add some built-in tools:

    Chat: true
    Tools: sys.read, sys.write, sys.ls

    Welcome the user

Now our script can answer questions about the directory, list file contents, and create new files. For instance, we could ask: “What programming language is used in the current directory?”

With tools, we could also pass a cloud provider CLI and, assuming it is logged in, ask the agent about VMs or security groups running in that provider. We could also drive complex queries to help with cost analysis over time. Or, if we have an OpenAPI spec for our REST API, we can pass that specification file to the agent, allowing it to make API requests. This integration is very powerful because it makes API consumption with AI very easy. Tools can even be programs written in any language that can be executed from a command line. GPTScript natively supports Python, Golang, and NodeJS tools, and can pull them from remote Git repos and install their dependencies.
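Custom tools can be defined in the same GPTScript file, separated by `---`. Here is a minimal sketch of a chatbot with a hypothetical word-count tool (the tool name, description, and body are illustrative, not from the original post); a tool's arguments are passed to its body as uppercase environment variables:

```
Chat: true
Tools: sys.ls, word-count

Welcome the user

---
Name: word-count
Description: Count the words in a file
Args: file: The path of the file to count

#!/bin/bash
# The "file" argument is exposed to the script as $FILE
wc -w "${FILE}"
```

When the user asks something like “How many words are in README.md?”, the LLM can decide to call the word-count tool with the appropriate file argument and use its output in the reply.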

Context

You can use contexts when you want to add instructions to the system prompt, provide a consistent tool set, or share consistent instructions across agents. A context can be static text or dynamically generated content. It is run before the consuming tool is invoked, and its output is available to the LLM while the main body of the GPTScript runs. A context also provides a convenient way to export a common set of tools across multiple agents, so each GPTScript tool does not have to define the same tools line. Multiple tools can import the same context, giving you a single place to shape the personality and behavior of the chatbot. As the end user converses with the chatbot, each agent can hand off to another agent to get the best answer possible for the user. Here is an example context that provides a consistent set of tooling and instructions:

    Share Tools: sys.exec

    #!sys.echo
    You have access to run commands on this system.

Anything using the above context snippet will be able to run commands on the system because the sys.exec tool is made available.

The sys.echo tool returns exactly the text that follows it, without the LLM modifying or interpreting the contents.
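A chatbot consumes a context through the Context directive. Assuming the context snippet above is saved as `shared-context.gpt` (a hypothetical filename), a minimal consumer might look like:

```
Chat: true
Context: ./shared-context.gpt

Welcome the user
```

Because the context shares sys.exec, this chatbot can run commands on the system without declaring that tool itself, and the echoed instructions are added to its system prompt.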

Wrapping up

Leveraging the building blocks of GPTScript, you can quickly and easily build AI-powered chatbot assistants. The scaffolding is handled by GPTScript, allowing even non-developers to create useful AI-powered chatbot assistants. This makes users more productive because they have an AI assistant tailored to their needs. To learn more about GPTScript, check out the docs. You can also check out Clio, our DevOps assistant project that uses these GPTScript elements to simplify day-to-day operations tasks.