The task context is a stateful Pydantic model used throughout the workflow. It provides a single reference point accessible from any node, so relevant data can be stored and retrieved as needed.

TaskContext Class

The TaskContext class serves as the central data container for workflow execution:
from typing import Any, Dict

from pydantic import BaseModel, Field


class TaskContext(BaseModel):
    """Context container for workflow task execution.

    TaskContext maintains the state and results of a workflow's execution,
    tracking the original event, intermediate node results, and additional
    metadata throughout the processing flow.

    Attributes:
        event: The original event that triggered the workflow
        nodes: Dictionary storing results and state from each node's execution
        metadata: Dictionary storing workflow-level metadata and configuration

    Example:
        context = TaskContext(
            event=incoming_event,
            nodes={"AnalyzeNode": {"score": 0.95}},
            metadata={"priority": "high"}
        )
    """

    event: Any
    nodes: Dict[str, Any] = Field(
        default_factory=dict,
        description="Stores results and state from each node's execution",
    )
    metadata: Dict[str, Any] = Field(
        default_factory=dict,
        description="Stores workflow-level metadata and configuration",
    )

    def update_node(self, node_name: str, **kwargs):
        """Merge keyword arguments into the named node's result dict."""
        self.nodes[node_name] = {**self.nodes.get(node_name, {}), **kwargs}
TaskContext attributes:
  • Event Data - Stores the original triggering event
  • Node Results - Maintains results from each node execution
  • Metadata - Stores workflow-level configuration and metadata
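The attributes above can be exercised with a short, runnable sketch. The class is re-declared here in minimal form so the snippet is self-contained; the node name "AnalyzeNode" and the event payload are placeholder values for illustration:

```python
from typing import Any, Dict

from pydantic import BaseModel, Field


class TaskContext(BaseModel):
    # Minimal re-declaration for a self-contained example.
    event: Any = None
    nodes: Dict[str, Any] = Field(default_factory=dict)
    metadata: Dict[str, Any] = Field(default_factory=dict)

    def update_node(self, node_name: str, **kwargs):
        self.nodes[node_name] = {**self.nodes.get(node_name, {}), **kwargs}


context = TaskContext(event={"text": "hello"}, metadata={"priority": "high"})
context.update_node("AnalyzeNode", score=0.95)
context.update_node("AnalyzeNode", label="positive")  # merges, does not overwrite
print(context.nodes["AnalyzeNode"])  # {'score': 0.95, 'label': 'positive'}
```

Note that update_node merges new keys into any existing entry for that node, so successive calls accumulate results rather than replacing them.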

The Event Attribute

The event attribute in TaskContext serves as the main entry point for workflow input data. When a workflow is initialized, the provided event is parsed according to the event_schema specified within the WorkflowSchema. Because each workflow defines its own event schema, the same shared infrastructure can handle many event formats: you can run multiple workflows for different inputs without changing any workflow internals. Example WorkflowSchema:
class PlaceholderWorkflow(Workflow):
    workflow_schema = WorkflowSchema(
        description="",
        event_schema=PlaceholderEventSchema,
        start=InitialNode,
        nodes=[
            NodeConfig(
                node=InitialNode,
                connections=[],
                description="",
                concurrent_nodes=[],
            ),
        ],
    )
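PlaceholderEventSchema is referenced above but not defined. A minimal sketch of what such an event schema might look like follows; the field names user_id and message are hypothetical, chosen only for illustration:

```python
from pydantic import BaseModel


class PlaceholderEventSchema(BaseModel):
    # Hypothetical fields; a real workflow defines whatever its input needs.
    user_id: str
    message: str


# Incoming event data is validated against the schema when the workflow starts.
event = PlaceholderEventSchema(user_id="u-1", message="hi")
print(event.message)  # hi
```

Because the schema is an ordinary Pydantic model, malformed input raises a validation error before any node runs.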

Type Hinting

By default, the event attribute of TaskContext has the type Any, which means you won’t get autocomplete or type checking when accessing its fields or methods. However, since each workflow defines its own event_schema, you already know the expected structure of event within that workflow. To benefit from IDE features like autocomplete and static type checking, explicitly type the event attribute when retrieving it from the TaskContext. This makes your code more readable and helps catch errors earlier. Example:
event: PlaceholderEventSchema = task_context.event
Implementation steps:
  1. Define Event Schema - Create a Pydantic model for your event structure
  2. Configure Workflow - Set the event_schema in your WorkflowSchema
  3. Type the Event - Cast the event to your schema type in node processing
  4. Enjoy Type Safety - Get full IDE support and compile-time error checking
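The steps above can be sketched end to end. The node function, the message field, and the "AnalyzeNode" key are illustrative assumptions, not part of the framework's API:

```python
from typing import Any, Dict

from pydantic import BaseModel, Field


class PlaceholderEventSchema(BaseModel):
    message: str  # hypothetical field for illustration


class TaskContext(BaseModel):
    event: Any  # typed as Any by default, hence no autocomplete
    nodes: Dict[str, Any] = Field(default_factory=dict)


def process(task_context: TaskContext) -> TaskContext:
    # Annotating the local variable restores autocomplete and static checking.
    event: PlaceholderEventSchema = task_context.event
    task_context.nodes["AnalyzeNode"] = {"length": len(event.message)}
    return task_context


ctx = process(TaskContext(event=PlaceholderEventSchema(message="hello")))
print(ctx.nodes["AnalyzeNode"])  # {'length': 5}
```

The annotation is a cast, not a runtime conversion: it tells the type checker what shape to expect, while the validated model itself was created when the workflow parsed the event.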

Best practices

  • Event schema design: Use specific, well‑described field names
  • Node result storage: Store results under the node class name for consistency
  • Metadata usage: Keep workflow‑level configuration in metadata
  • Type safety: Cast the event to your schema type in node processing
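The node-result-storage practice can be made concrete with a small sketch. Keying results by type(self).__name__ keeps the convention automatic; AnalyzeNode and its score value are placeholders:

```python
from typing import Any, Dict

from pydantic import BaseModel, Field


class TaskContext(BaseModel):
    event: Any = None
    nodes: Dict[str, Any] = Field(default_factory=dict)
    metadata: Dict[str, Any] = Field(default_factory=dict)

    def update_node(self, node_name: str, **kwargs):
        self.nodes[node_name] = {**self.nodes.get(node_name, {}), **kwargs}


class AnalyzeNode:
    def process(self, task_context: TaskContext) -> TaskContext:
        # type(self).__name__ stores results under the node class name,
        # so downstream nodes can look them up consistently.
        task_context.update_node(type(self).__name__, score=0.95)
        return task_context


ctx = AnalyzeNode().process(TaskContext())
print("AnalyzeNode" in ctx.nodes)  # True
```

Deriving the key from the class name means renaming a node automatically renames its results slot, avoiding stale string literals scattered across nodes.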