Understanding Factors in the Validation API: How Assert Content v4 Evaluates Context and Assumptions
The Validation API is a versatile tool for analyzing and validating complex content. With the recent v4 enhancements to the validation/assert-content method, the API can now provide structured, evidence-backed reasoning that incorporates factors — elements that need to be considered when evaluating an assertion and its outcome. Factors can be assumptions, contextual information, or facts that influence the outcome, making the assessments more nuanced and actionable.
In this post, we’ll walk through a practical example — analyzing a feature specification for a digital banking app — to illustrate how the API works. Note that this is just one example of how the API can be applied.
Why Factors Matter
Assertions aren’t made in a vacuum. When evaluating whether a piece of content meets certain expectations, there are always contextual details, assumptions, or constraints that need to be considered. These are what we now call factors.
- Contextual factors: Assumptions or background conditions (e.g., developer experience, available frameworks, platform maturity).
- Determinant factors: Directly influence the assertion outcome (e.g., number of integrations, volume of data, extent of UI changes).
By explicitly including these factors in both the input and output, the API ensures that outcomes are grounded in context and can be better understood by reviewers, developers, or decision-makers.
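To make the two roles concrete, here is a minimal Python sketch of what a factor of each role looks like as payload data. The field names (`role`, `description`, `value`, `effects`, `impactLevel`) are taken from the JSON examples later in this post; treat them as illustrative rather than a formal schema.

```python
# A contextual factor: background conditions supplied with the request.
contextual_factor = {
    "role": "contextual",
    "description": "Developer experience",
    "value": "junior",
}

# A determinant factor: something that directly drives the assertion outcome.
determinant_factor = {
    "role": "determinant",
    "description": "The number and type of external/internal systems to be integrated",
    "effects": [
        {"description": "Higher integration complexity", "impactLevel": "high"}
    ],
}

def is_determinant(factor: dict) -> bool:
    """Determinant factors carry effects that bear directly on the outcome."""
    return factor["role"] == "determinant"

print(is_determinant(determinant_factor))  # True
print(is_determinant(contextual_factor))   # False
```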
Getting Started with Factors Using Generate-Assertions
The generate-assertions method is often the first step: you provide content and an objective, and it produces a set of assertions you might want to validate. With the new enhancement, you can now also request factors alongside those assertions.
Here’s an example request:
```json
[
  {
    "assertionReasoningLevel": "advanced",
    "content": "<same feature content as before>",
    "subject": "Software Feature Description",
    "objective": "Determine the list of assertions to assess the implementation complexity of any software features. Make sure the assertions are generic and not specific to the content provided.",
    "provideFactors": true
  }
]
```

And here's a sample of the assertions generated:
- Assertion: Assess the number of distinct functional requirements specified for the feature.
  - Factor: The count of functional requirements listed in the feature description.
- Assertion: Evaluate the diversity and number of system integrations required for the feature.
  - Factor: The number and type of external/internal systems to be integrated.
- Assertion: Determine the extent of data model changes required to support the feature.
  - Factor: The scope of changes to the data model.
Each assertion comes with determinant factors that explain what influences its outcome. These factors can then be fed directly into the assert-content method for deeper reasoning and evaluation.
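That handoff can be sketched as a small piece of glue code. The payload shapes below mirror the JSON examples in this post, but the exact assert-content schema (and whether factors nest under `desiredAssertions` this way) is an assumption for illustration.

```python
def build_assert_content_request(generated, contents, overall_factors=None):
    """Map generate-assertions output (assertions with determinant factors)
    onto a v4 assert-content request body."""
    return {
        "assertionMethodVersion": "v4",
        "contents": contents,
        "overallFactors": overall_factors or [],
        "desiredAssertions": [
            {
                "assertionInstruction": item["assertion"],
                "factors": item.get("factors", []),
            }
            for item in generated
        ],
    }

# Example: one generated assertion with its determinant factor.
generated = [
    {
        "assertion": "Evaluate the diversity and number of system integrations required.",
        "factors": [
            {
                "role": "determinant",
                "description": "The number and type of external/internal systems to be integrated",
            }
        ],
    }
]

request = build_assert_content_request(
    generated,
    contents=[{"text": "Feature: Budgeting & Spending Insights..."}],
)
print(request["desiredAssertions"][0]["assertionInstruction"])
```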
Using Assert-Content with Factors
The assert-content method takes both the content and the assertions (including their factors) to perform the actual evaluation. Outcomes are then:
- Clearly linked to assumptions or context.
- Easier to justify with structured evidence.
- Actionable through proposals that consider the influencing factors.
For example, if an assertion is about “system integrations,” the determinant factor (number and type of integrations) will directly influence whether the outcome is low, medium, or high complexity.
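As a purely illustrative rubric (not the API's internal scoring logic), the relationship between that determinant factor and the outcome might look like this:

```python
def integration_complexity(num_integrations: int) -> str:
    """Toy mapping from a determinant factor (integration count) to an
    outcome bucket. Thresholds are invented for illustration only."""
    if num_integrations <= 1:
        return "low"
    if num_integrations <= 3:
        return "medium"
    return "high"

print(integration_complexity(1))  # low
print(integration_complexity(5))  # high
```

The point is the shape of the reasoning: the factor is the explicit input that justifies the outcome label, rather than the label appearing unexplained.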
Example Walkthrough
Step 1 – The Input
For our example, we’ll use a software feature specification as input:
- Feature: Budgeting & Spending Insights
- Description: Categorize transactions, provide dashboards and summaries, allow budget setup, and trigger alerts.
- Contextual Factors (overallFactors):
- Developer experience: junior
- Development platform: Appian
- Testing framework: None
- Desired Assertions: Complexity across multiple dimensions.
```json
{
  "assertionMethodVersion": "v4",
  "contents": [ { "text": "Feature: Budgeting & Spending Insights..." } ],
  "overallFactors": [
    { "role": "contextual", "description": "Developer experience", "value": "junior" },
    { "role": "contextual", "description": "Development platform", "value": "Appian" },
    { "role": "contextual", "description": "Testing framework", "value": "None" }
  ],
  "desiredAssertions": [
    { "assertionInstruction": "Determine the complexity of data modeling..." },
    { "assertionInstruction": "Determine the complexity of UI/UX..." }
  ]
}
```

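Before sending, a client might sanity-check the request body. The required fields in this sketch are inferred from the examples in this post, not from a published schema, so adjust them to your actual API contract.

```python
import json

request_body = {
    "assertionMethodVersion": "v4",
    "contents": [{"text": "Feature: Budgeting & Spending Insights..."}],
    "overallFactors": [
        {"role": "contextual", "description": "Developer experience", "value": "junior"},
        {"role": "contextual", "description": "Development platform", "value": "Appian"},
        {"role": "contextual", "description": "Testing framework", "value": "None"},
    ],
    "desiredAssertions": [
        {"assertionInstruction": "Determine the complexity of data modeling..."},
        {"assertionInstruction": "Determine the complexity of UI/UX..."},
    ],
}

def validate(body):
    """Return a list of problems; an empty list means the body looks sendable."""
    problems = []
    if body.get("assertionMethodVersion") != "v4":
        problems.append("factors require assertionMethodVersion v4")
    if not body.get("contents"):
        problems.append("contents must not be empty")
    if not body.get("desiredAssertions"):
        problems.append("at least one desiredAssertion is required")
    return problems

problems = validate(request_body)
payload = json.dumps(request_body)  # serializable and ready to POST
print(problems)  # []
```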
Step 2 – The Response Highlighting Factors
For Data Modeling & Storage, the API considers both the contextual and determinant factors, showing how they influence the outcome:

```json
{
  "assertionInstruction": "Determine the complexity of data modeling and storage...",
  "outcome": "3",
  "factors": [
    {
      "role": "contextual",
      "description": "The experience level of the developer who will be building the feature is junior",
      "effects": [{ "description": "Perceived complexity is higher", "impactLevel": "high" }]
    },
    {
      "role": "determinant",
      "description": "The feature requires extending the transaction schema and storing new user-specific data, which is a moderate change.",
      "effects": [{ "description": "Moderate data model complexity", "impactLevel": "high" }]
    }
  ]
}
```

The response also includes supporting evidence for each outcome.
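A consumer of this response might surface only the factors that actually swayed the result. The response shape below mirrors the fragment above; the helper and its name are hypothetical.

```python
# Response fragment shaped like the assert-content output shown above.
response_fragment = {
    "assertionInstruction": "Determine the complexity of data modeling and storage...",
    "outcome": "3",
    "factors": [
        {
            "role": "contextual",
            "description": "The experience level of the developer is junior",
            "effects": [{"description": "Perceived complexity is higher", "impactLevel": "high"}],
        },
        {
            "role": "determinant",
            "description": "The feature requires extending the transaction schema",
            "effects": [{"description": "Moderate data model complexity", "impactLevel": "high"}],
        },
    ],
}

def high_impact_factors(assertion_result: dict) -> list:
    """Descriptions of factors with at least one high-impact effect."""
    return [
        f["description"]
        for f in assertion_result.get("factors", [])
        if any(e.get("impactLevel") == "high" for e in f.get("effects", []))
    ]

print(high_impact_factors(response_fragment))
```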

Similarly, for UI/UX complexity, factors show how assumptions or conditions such as multi-platform requirements and custom visualizations affect the outcome. Here is the corresponding API response fragment:
```json
{
  "assertionInstruction": "Determine the complexity of UI/UX...",
  "outcome": "5",
  "factors": [
    {
      "role": "contextual",
      "description": "The experience level of the developer who will be building the feature.",
      "effects": [{ "description": "Higher perceived complexity", "impactLevel": "high" }]
    },
    {
      "role": "determinant",
      "description": "Custom visualization and dashboard",
      "effects": [{ "description": "Complex interactive UI/UX", "impactLevel": "high" }]
    }
  ]
}
```

Key Takeaways
- Factors connect context to outcomes. They make the reasoning behind each assertion explicit.
- Generate-assertions helps you get started. It now provides assertions with determinant factors you can immediately use with assert-content.
- Assert-content uses factors in reasoning. The evaluation is no longer just a label—it’s contextualized, explainable, and actionable.
This addition makes the Validation API a more robust tool for any scenario requiring structured validation, whether it’s evaluating feature complexity, assessing compliance, or verifying document consistency.
Closing
With factors, the Validation API’s assert-content method goes beyond simple classification, validation or extraction. It considers factors that influence outcomes, provides evidence, and delivers structured reasoning, making it applicable to a wide range of scenarios where context and assumptions matter.
👉 Try it with your own content and explore how overallFactors and inferred factors shape the assessments.
