
Content Analyzer – Building on top of Dynamic Structuring of Content Analysis
In my previous post, I went into some technical detail about the core API capabilities built to support a project I’m working on. As the continuous improvement train keeps rolling, a few new capabilities were built and also made more easily consumable for non-technical people.
Recent Updates
Over the last week, a series of small improvements were made to the content analysis logic. Here are some of the highlights.
Assertion APIs
Probably the most significant update to the assertion API was the inclusion of the Proposal capability, which provides insights into how the outcome of an assertion can be remediated or improved. We’ll see later in this post how that comes into play.
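As a rough sketch of how an assertion outcome paired with a Proposal might be modeled, here is a minimal Python example. All names and shapes below are assumptions for illustration; they are not the actual assertion API.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical shapes for illustration only; the real assertion API's
# field names and structure may differ.
@dataclass
class Proposal:
    """Suggested remediation for an assertion that did not fully pass."""
    summary: str

@dataclass
class AssertionResult:
    statement: str                        # the assertion that was evaluated
    passed: bool                          # outcome of the assertion
    proposal: Optional[Proposal] = None   # how the outcome could be improved

result = AssertionResult(
    statement="The user story defines clear acceptance criteria.",
    passed=False,
    proposal=Proposal(summary="Add explicit, testable acceptance criteria."),
)

# When an assertion fails, the attached proposal tells us what to fix.
if not result.passed and result.proposal:
    print(result.proposal.summary)
```

The key idea is simply that a failed assertion is no longer a dead end: it carries a suggested next step alongside the verdict.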
A few tweaks were also made behind the scenes to generate higher quality assertions, by way of providing additional context about the subject of the content being asserted.
Enter Content Analyzer
In order to iterate on the content analysis functionality required for the project, I started building what I currently call the Content Analyzer. This rather unassuming feature taps into a lot of the APIs that were previously discussed to perform assertions and review content.
Setting the Stage
To run the content analysis, the following key things are required:
- Your objective with the analysis
- The subject of the content
- The content itself
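The three inputs above can be pictured as a simple request payload. This is a hypothetical sketch, not the real Content Analyzer API; the field names are assumptions for illustration.

```python
# Hypothetical analysis request; field names are illustrative only.
analysis_request = {
    "objective": "Review this user story and determine whether it is well defined.",
    "subject": "user story",
    "content": (
        "As a customer, I want to reset my password "
        "so that I can regain access to my account."
    ),
}

# All three inputs are required before an analysis can run.
missing = [key for key in ("objective", "subject", "content")
           if not analysis_request.get(key)]
assert not missing, f"missing inputs: {missing}"
```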
Let’s go through a couple of examples to help wrap your head around what this capability actually does.
Use Case #1 – User Story Analysis

You can see from the example above that we set the stage for the analysis by stating our objective: reviewing a particular user story (the content subject) to see if it’s well defined. We then provide the actual user story itself that we want to review.
Now that we’ve described the job to be done, let’s launch the analysis and see what comes out of that.
Analysis Summary
After a few seconds, the first piece displayed is an overall summary of the analysis. From there you can note some highlights about what was good in the content (i.e. which assertions were met) and some possible improvements that could be made.

Analysis Details
Moving on to the next section, we get more into the nitty gritty of what was done in the analysis.

As you can see above, the Content Analyzer generated a series of relevant assertions, based on what we provided, to critically review the content. We can also see the outcome of each assertion, as well as some insights on what you could do to address any shortcomings.
Use Case #2 – Review Analysis
Let’s jump to another type of analysis the Content Analyzer supports today. In this example, we’ll analyze a television review. Here’s what we provided to the Content Analyzer:

Essentially, we again describe the job to be done and provide the content, which in this case is the raw transcript of a YouTube video from RTINGS.com about the recently released Hisense U8N.
Analysis Summary
In this particular instance, the summary looks quite different from the previous example. Here the Content Analyzer opted for a different strategy to analyze the content: instead of generating and validating a particular set of assertions, the engine captured statements and categorized them automatically under relevant aspects.
For this specific analysis, since Dretza already knows about the Television category, there was no need to generate a dynamic list of aspects to consider for the review; instead, it was able to tap directly into a curated list of aspects automatically.
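To make the curated-versus-dynamic distinction concrete, here is a minimal sketch of how statements could be grouped under aspects. The aspect lists, function names, and stub behavior are all assumptions for illustration, not how the engine actually works.

```python
from collections import defaultdict

# Illustrative curated aspects for a known category; the real curated
# lists in Dretza are assumptions here, not actual data.
CURATED_ASPECTS = {
    "television": ["picture quality", "brightness", "gaming", "sound"],
}

def aspects_for(category: str) -> list:
    """Use the curated list when the category is known; otherwise a
    dynamic list would be generated (stubbed here for the sketch)."""
    return CURATED_ASPECTS.get(category, ["(dynamically generated aspects)"])

def categorize(statements: list) -> dict:
    """Group (aspect, statement) pairs under their aspect."""
    grouped = defaultdict(list)
    for aspect, statement in statements:
        grouped[aspect].append(statement)
    return dict(grouped)

grouped = categorize([
    ("brightness", "Peak brightness is excellent in HDR."),
    ("gaming", "Input lag is low at 120Hz."),
    ("brightness", "Blooming is visible in dark scenes."),
])
```

For an unknown category like a motorcycle, `aspects_for` would fall through to the dynamic path instead of the curated list.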

Just as an example, let’s switch gears (wink, wink) and use an item type that Dretza doesn’t currently know about: a motorcycle. As you can see below, we were able to generate a list of relevant aspects to consider while analyzing the content. You can also see that the reviewed content omitted many of the aspects to consider, which we can use as an indicator of the relevance and quality of the content.

Analysis Details
Going back to our original TV review example, let’s move to the Analysis Details section. We can now dig deeper into what the summary chart reflected.

Here we can see a list of statements made in the review that pertain to a particular aspect. Some are positive, some are negative. They can also be neutral, but this particular example doesn’t showcase that.
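A statement with its aspect and sentiment might be modeled like this. Again, this is a hypothetical sketch with assumed names, not the actual analysis output format.

```python
from dataclasses import dataclass
from typing import Literal

# Hypothetical statement record; the real output shape may differ.
@dataclass
class Statement:
    aspect: str
    text: str
    sentiment: Literal["positive", "negative", "neutral"]

statements = [
    Statement("picture quality", "Contrast is outstanding.", "positive"),
    Statement("sound", "Built-in speakers sound thin.", "negative"),
]

# Filtering by sentiment is one obvious way to surface the weak points.
negatives = [s for s in statements if s.sentiment == "negative"]
```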
Looking Ahead
There are still additional improvements in the pipeline for the Content Analyzer: continuing to improve the ease of use and the quality of the analysis output, and fully integrating the new capabilities into the rest of the product.
Conclusion
I hope this quick post was informative and perhaps piqued your curiosity. If you ever want to experiment with the Content Analyzer or the analysis APIs, feel free to reach out!