# Fallback to LLM Reasoning

In enterprise automation workflows, not all decision paths can be fully predefined. When structured logic reaches a point of uncertainty — missing data, ambiguous cases, or exceptions outside the configured rules — GLIK allows workflows to **fall back to LLM reasoning**.

This mechanism activates the LLM Block in a controlled manner to assess the situation, render a natural-language judgment, and allow workflow execution to continue.

***

### When Does Fallback Trigger?

Fallback to LLM is typically configured within a `Conditional Branch` or `Tool Node` decision graph. It is used:

* When no matching rule is found in policy memory
* When required data is missing (e.g., incomplete invoice fields)
* When downstream systems return ambiguous or null responses
* As a last step before human escalation
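A branch configuration covering these triggers might look like the following sketch. The block name, field paths, and the `fallback_to_llm` target are illustrative, not a documented GLIK schema:

```yaml
- block: Invoice Routing
  type: condition
  logic:
    # No matching rule in policy memory
    - if: policy_memory.matched_rule == null
      then: fallback_to_llm
    # Required invoice fields are missing
    - if: invoice_data.vendor_id == null
      then: fallback_to_llm
    # Downstream system returned an ambiguous response
    - if: erp_response.status == "unknown"
      then: fallback_to_llm
    # Otherwise, continue on the structured path
    - else: standard_processing
```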

***

### Why Use Fallbacks in Enterprise Workflows?

Fallbacks ensure that workflows don’t break or silently fail. Instead, they:

* Maintain continuity in data pipelines
* Capture human-like reasoning to explain edge cases
* Provide transparent justifications for difficult decisions

This is especially valuable in regulated or auditable environments where AI must be able to explain *why* a decision was made.

***

### Example Use Case: Expense Policy Decision Engine

If an invoice exceeds the policy threshold, or the threshold itself cannot be verified (for example, the policy file is missing), the workflow uses fallback logic:

```yaml
- block: Expense Evaluation
  type: condition
  logic:
    # Threshold missing or unreadable: the rule cannot be evaluated
    - if: policy_memory.threshold == null
      then: fallback_to_llm
    # Amount exceeds the configured threshold
    - if: invoice_data.amount > policy_memory.threshold
      then: fallback_to_llm
```

The LLM Block is then activated:

```yaml
- block: Fallback Reasoning
  type: llm
  prompt: >
    The policy threshold could not be verified.
    Please provide an explanation and recommend whether to APPROVE, REJECT, or ESCALATE.
```
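Because the LLM's reply is free text, downstream blocks should not trust it blindly. Here is a minimal sketch, outside any GLIK API and purely illustrative, of normalizing the reply into one of the three expected verdicts and escalating when the answer is ambiguous:

```python
import re

def parse_verdict(llm_reply: str) -> str:
    """Extract a single verdict from free-text LLM output.

    Falls back to ESCALATE when zero or multiple verdicts appear,
    so an ambiguous answer is never silently treated as approval.
    """
    found = set(re.findall(r"\b(APPROVE|REJECT|ESCALATE)\b", llm_reply.upper()))
    if len(found) == 1:
        return found.pop()
    return "ESCALATE"
```

Defaulting the ambiguous case to `ESCALATE` keeps the fail-safe direction toward human review rather than silent approval.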

***

### Best Practices

* Clearly define fallback conditions; avoid triggering unnecessarily
* Use deterministic and scoped prompts (include relevant context)
* Pair LLM output with traceable logs or memory writes
* Set guardrails (such as fallback-rate limits or mandatory human review for high-value cases) to avoid over-reliance on LLM judgment
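The "traceable logs or memory writes" practice above can be sketched as a simple audit record. This is an illustrative Python helper, not a GLIK API; the field names are assumptions:

```python
import json
import time
import uuid

def record_fallback(block: str, prompt: str, llm_output: str, verdict: str) -> dict:
    """Build an audit record for a single LLM fallback decision.

    Capturing the exact prompt and raw output alongside the final
    verdict lets auditors reconstruct why a decision was made.
    """
    return {
        "trace_id": str(uuid.uuid4()),
        "timestamp": time.time(),
        "block": block,
        "prompt": prompt,
        "llm_output": llm_output,
        "verdict": verdict,
    }

record = record_fallback(
    "Fallback Reasoning",
    "The policy threshold could not be verified.",
    "ESCALATE: missing policy file",
    "ESCALATE",
)
# One JSON line per decision, ready for an append-only log or memory write
line = json.dumps(record)
```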

***

### Alternatives

* Use a `Tool Node` connected to a human escalation system
* Write to GLIK Knowledge and queue for asynchronous review
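The human-escalation alternative might be wired up along these lines. As with the earlier sketches, the tool name and input fields are illustrative, not a documented schema:

```yaml
- block: Human Escalation
  type: tool
  tool: ticketing_system        # connector to the escalation queue (assumed name)
  input:
    summary: "Invoice could not be auto-evaluated"
    payload: invoice_data
```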


***

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.glik.ai/system-architecture/blocks-and-nodes/input-and-extraction/llm-block/fallback-to-llm-reasoning.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when:

* the answer is not explicitly present in the current page,
* you need clarification or additional context, or
* you want to retrieve related documentation sections.
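Such a query can be issued from Python using only the standard library. The endpoint behavior is as described above; error handling is omitted for brevity:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

PAGE = ("https://docs.glik.ai/system-architecture/blocks-and-nodes/"
        "input-and-extraction/llm-block/fallback-to-llm-reasoning.md")

def build_ask_url(question: str) -> str:
    """Append the `ask` query parameter, URL-encoding the question."""
    return f"{PAGE}?{urlencode({'ask': question})}"

def ask_docs(question: str) -> str:
    """GET the page with the question and return the answer text."""
    with urlopen(build_ask_url(question)) as resp:  # network call
        return resp.read().decode("utf-8")
```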
