During the nine months since we announced Deep Content at Localization World 31 in Dublin, I have been travelling the world evangelizing the idea of automatic semantic enrichment of content to potential and existing customers, both on site and at industry conferences.
If you haven't yet heard about Deep Content, I urge you to contact us to find out more. Deep Content uses technologies such as entity recognition and internet knowledge graph queries to automatically annotate words and phrases in your digital content with facts and links to pertinent reference resources. This adds semantic depth to the content itself, making it more useful to end users and more valuable to content publishers and marketers.
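To make the idea concrete, here is a minimal sketch of that enrichment step. It is an illustration only, not Vistatec's implementation: a small hand-built lookup table stands in for a real entity-recognition model and a live knowledge-graph query (e.g. against Wikidata), and the annotator emits plain HTML5 links carrying a fact and a reference URL.

```python
import re

# Toy "knowledge graph": in a real pipeline this lookup would be a query
# against a public knowledge graph such as Wikidata or DBpedia.
KNOWLEDGE = {
    "Dublin": {
        "fact": "Capital of Ireland",
        "link": "https://en.wikipedia.org/wiki/Dublin",
    },
    "XLIFF": {
        "fact": "XML Localization Interchange File Format, an OASIS standard",
        "link": "https://en.wikipedia.org/wiki/XLIFF",
    },
}

def enrich(text: str) -> str:
    """Annotate recognized entities with a fact (as a title attribute)
    and a link to a reference resource, producing HTML5 <a> elements."""
    def annotate(match: re.Match) -> str:
        entity = KNOWLEDGE[match.group(0)]
        return (f'<a href="{entity["link"]}" title="{entity["fact"]}">'
                f'{match.group(0)}</a>')

    # Match any known entity name; unknown text passes through untouched.
    pattern = re.compile("|".join(re.escape(name) for name in KNOWLEDGE))
    return pattern.sub(annotate, text)

print(enrich("We announced Deep Content in Dublin."))
```

In a production setting the lookup table would be replaced by statistical entity recognition plus disambiguated graph queries, but the shape of the output, content annotated in place with machine-readable facts and links, is the same.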
Deep Content is one of a wave of state-of-the-art technologies, alongside Neural Machine Translation and Adaptive Machine Translation, that are sweeping through the industry and prompting buyers and suppliers alike to re-evaluate how they take their brands and messaging to target markets.
As well as piquing the interest of translation buyers, Deep Content has been noticed and endorsed by industry analysts such as Common Sense Advisory, who reported on it in their article "Zen and the Art of Localizing Code and Content" and then cited it among the four new technologies powering the new paradigm of the Augmented Translator.
Having educated the market about what is possible with integrated Natural Language Processing techniques and Big Data graphs, I am now working with customers to pilot Deep Content using their own data as the source of semantic enrichment.
If you assumed that Deep Content would not integrate with your content types and publishing mechanisms, reconsider: Deep Content is built on open standards such as RDF, XML, XLIFF, and HTML5 to enable the widest possible adoption.
For more information on Deep Content please contact Vistatec.