LangGraph is a framework for building AI agents and workflows

Similar to LangChain, LangGraph walks through a series of discrete, LLM-based tasks, but with a few key differences. Instead of the LLM being the sole doer of each task, the LLM chooses the right tool based on the information provided.

Instead of a person or an event triggering a chain, a LangGraph application takes a prompt as input and passes all the relevant information to an LLM, which chooses a tool from a list of predefined options.
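
As a hedged sketch of that tool-selection step, the snippet below binds two hypothetical tools (get_weather and add_numbers) to an OpenAI chat model using LangChain's bind_tools and lets the model pick one. The model name and the tools themselves are illustrative assumptions, not part of the original notes.

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""
    return f"It is sunny in {city}."


@tool
def add_numbers(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b


# The LLM is not the sole doer of the task: given the prompt, it picks a tool
# from the predefined list and returns a structured tool call.
llm = ChatOpenAI(model="gpt-4o-mini").bind_tools([get_weather, add_numbers])
response = llm.invoke("What's the weather in Paris?")
print(response.tool_calls)  # e.g. [{"name": "get_weather", "args": {"city": "Paris"}, ...}]
```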

Building Blocks

  • Node: a function (often an LLM or tool call) that reads the shared state and returns updates to it
  • Edge: a fixed transition from one node to the next
  • Conditional Edge: a routing function that inspects the state and decides which node runs next
  • State: the shared data structure that is passed between nodes and updated as the graph runs

Examples
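
A minimal sketch tying the building blocks together, assuming LangGraph's Python StateGraph API. The State fields, the two node functions, and the route_question router are illustrative assumptions, not a prescribed design.

```python
from typing_extensions import TypedDict
from langgraph.graph import StateGraph, START, END


# State: shared data passed between nodes
class State(TypedDict):
    question: str
    answer: str


# Nodes: plain functions that read the state and return partial updates
def search_node(state: State) -> dict:
    return {"answer": f"Search results for: {state['question']}"}


def calculator_node(state: State) -> dict:
    return {"answer": f"Computed result for: {state['question']}"}


# Conditional edge: inspects the state and picks the next node by name
def route_question(state: State) -> str:
    return "calculator" if any(c.isdigit() for c in state["question"]) else "search"


builder = StateGraph(State)
builder.add_node("search", search_node)
builder.add_node("calculator", calculator_node)
builder.add_conditional_edges(START, route_question)  # conditional edge
builder.add_edge("search", END)                       # plain edges
builder.add_edge("calculator", END)

graph = builder.compile()
print(graph.invoke({"question": "What is 2 + 2?"}))
```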

Persistence

LangGraph has a persistence layer, which offers a number of benefits (see the sketch after this list):

  • Memory: LangGraph persists arbitrary aspects of your application’s state, supporting memory of conversations and other updates within and across user interactions.
  • Human-in-the-loop: Because state is checkpointed, execution can be interrupted and resumed, allowing for decisions, validation, and corrections via human input.
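
A short sketch of the persistence layer, assuming the graph builder from the example above and LangGraph's in-memory MemorySaver checkpointer; the thread_id value is an arbitrary placeholder.

```python
from langgraph.checkpoint.memory import MemorySaver

# Compile the graph with a checkpointer so state persists across invocations
checkpointer = MemorySaver()
graph = builder.compile(checkpointer=checkpointer)

# All calls that share a thread_id share the same persisted state (memory)
config = {"configurable": {"thread_id": "conversation-1"}}
graph.invoke({"question": "What is 2 + 2?"}, config)

# Later (or after a human-in-the-loop interruption), the checkpointed state
# for that thread can be inspected or resumed
print(graph.get_state(config).values)
```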

Streaming

LangGraph also provides support for streaming workflow / agent state to the user (or developer) over the course of execution. LangGraph supports streaming of both events (such as feedback from a tool call) and tokens from LLM calls embedded in an application.
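
A small sketch of streaming, continuing from the checkpointed graph above. With stream_mode="updates", each node's state update is emitted as soon as that node finishes; other modes (such as token-level streaming of LLM output) are configured the same way. The prompt is again a placeholder.

```python
# Stream state updates as each node completes, rather than waiting for the final result
for update in graph.stream({"question": "What is 2 + 2?"}, config, stream_mode="updates"):
    print(update)
```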

Debugging and Deployment

LangGraph provides an easy on-ramp for testing, debugging, and deploying applications via LangGraph Platform. This includes LangGraph Studio, an IDE for visualizing, interacting with, and debugging workflows or agents, as well as a number of deployment options.