
Generative UI

What is Generative UI?

Examples

Benefits

Vercel AI SDK

Currently, the most widely used solution is the Vercel AI SDK.

Example

"use server";

import { createStreamableUI } from "ai/rsc";

const askGPT = async ({ message }: { message: string }) => {
  const ui = createStreamableUI();

  // Invoke the agent task without awaiting it, so the streamable UI
  // can be returned to the client immediately.
  (async () => {
    // Create the LangChain agent workflow
    const workflow = createAgentExecutor();

    // Handle stream events emitted by the LLM while it is processing
    for await (const streamEvent of workflow.streamEvents(
      { input: message },
      { version: "v2" }
    )) {
      // Map each event to a UI update that is streamed to the client
      ui.update(<UI props={streamEvent.data} />);
    }

    // Mark the stream as finished
    ui.done();
  })();

  return ui.value;
};
"use client";

import { useState } from "react";

const Chat = () => {
  const [elements, setElements] = useState<React.ReactNode[]>([]);
  const [inputValue, setInputValue] = useState("");

  const handleSubmit = async (message: string) => {
    // Call the server action; the returned value is a streamable UI node
    const ui = await askGPT({ message });
    setElements((prev) => [...prev, ui]);
  };

  return (
    <form
      onSubmit={(e) => {
        e.preventDefault();
        handleSubmit(inputValue);
      }}
    >
      {elements}
      <input
        value={inputValue}
        onChange={(e) => setInputValue(e.target.value)}
      />
    </form>
  );
};

Pros:

Cons:

By observing the behavior of the Vercel AI SDK, we came up with a general idea and two approaches.

General Idea

Goal

Idea

Approach 1

// Example final message
[
  {
    type: "text",
    data: "......",
  },
  {
    type: "movie-search-tool",
    data: {
      title: ".....",
      description: "....",
    },
  },
];
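
As a rough illustration of how a client could consume such a final message, the sketch below maps each part's type to a component. This is a minimal sketch based only on the example above; the MovieCard component and the exact part shapes are assumptions, not part of the original approach.

// Sketch: render the final message by mapping each typed part to a component.
// MovieCard and the part shapes are assumed for illustration.
type MessagePart =
  | { type: "text"; data: string }
  | { type: "movie-search-tool"; data: { title: string; description: string } };

const Message = ({ parts }: { parts: MessagePart[] }) => (
  <>
    {parts.map((part, i) => {
      switch (part.type) {
        case "text":
          return <p key={i}>{part.data}</p>;
        case "movie-search-tool":
          return <MovieCard key={i} {...part.data} />;
      }
    })}
  </>
);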

Approach 2

Handle event flow

Below is an example of handling the event stream generated during LLM processing with LangChain.
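
This is a minimal sketch assuming a LangChain Runnable agent and the streamEvents v2 event names (on_chat_model_stream, on_tool_start, on_tool_end); the appendText, showToolLoading, and renderToolResult helpers are hypothetical placeholders for whatever UI updates your app performs.

import type { Runnable } from "@langchain/core/runnables";

// Sketch: iterate over LangChain streamEvents (v2) and branch on event type.
// appendText, showToolLoading, and renderToolResult are hypothetical helpers.
const handleEventStream = async (agent: Runnable, input: string) => {
  for await (const event of agent.streamEvents({ input }, { version: "v2" })) {
    switch (event.event) {
      case "on_chat_model_stream":
        // Token-by-token text chunks from the LLM
        appendText(event.data.chunk);
        break;
      case "on_tool_start":
        // A tool call has started; show a loading state for it
        showToolLoading(event.name);
        break;
      case "on_tool_end":
        // The tool finished; render its output as a dedicated component
        renderToolResult(event.name, event.data.output);
        break;
    }
  }
};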

References