Who doesn’t remember the opening scene of the classic Terminator 2? Over the years, Arnold Schwarzenegger’s entrance into the biker bar has been the subject of endless parodies, homages and references. “I need your clothes, your boots and your motorcycle” were the first words of a character who would go on to become a cultural icon of the 90s. However, it is not the fight, the gestures or the robot’s vocabulary that best reveals the protagonist’s lack of humanity to the viewer: it is his way of understanding reality.
For several shots during this scene, we see reality as the Terminator sees it: a red-tinted thermal view, packed with data and detail. Its interface allows it to analyze and evaluate everything in sight at a glance: signs, vehicles, substances, sounds and people, helping it make quick, effective decisions in the search for the three things it needs to continue its mission. Wouldn’t it be wonderful to have such an intelligent tool in our daily work?
VSNExplorer MAM is powered by multiple AI engines
It is estimated that scientists spend approximately 60% of their time selecting and organizing data, and another 15% obtaining it. In other words, they spend the vast majority of their time on repetitive, inefficient and error-prone tasks. Something similar happens in the Broadcast & Media industry, where catalogers, editors and many other professionals who work with media asset management (MAM) systems on a daily basis are often bogged down in classification tasks that are rarely efficient. The main difference is that, as James Cameron predicted to some extent, AI technology can now help these professionals manage far more content with the very same hands.
For some time now, VSNExplorer MAM, VSN’s content management platform, has been integrating different Artificial Intelligence (AI) engines, such as IBM Watson, Google Cloud, Microsoft Azure, AWS and EtiqMedia, to obtain metadata from content automatically. This technology is far from limited to the standard attributes of the file itself (such as format, duration or creation date): it is also capable of recognizing elements in the images the system ingests and stores, including people, objects, logos and on-screen text (via OCR), and it offers speech-to-text transcription and translation of the resulting transcripts. The power of Artificial Intelligence does not end there: it can also recognize audio elements such as voices and music, a technology many will be familiar with from its use in copyright protection on popular video platforms such as YouTube and Twitch.
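To make the idea concrete, the sketch below shows the kind of call a MAM-style service might make to one of these engines (here AWS Rekognition via boto3) to pull detected objects and on-screen text from a keyframe and fold them into an asset’s metadata. The asset identifier and the metadata layout are hypothetical examples for illustration, not VSN’s actual schema or integration.

```python
# Illustrative sketch only: enriching an asset's metadata with objects and
# on-screen text detected by a cloud AI engine (AWS Rekognition via boto3).
# The asset_id and the metadata layout are hypothetical, not VSN's schema.
import boto3

rekognition = boto3.client("rekognition", region_name="eu-west-1")

def enrich_keyframe(asset_id: str, jpeg_bytes: bytes) -> dict:
    """Return AI-derived metadata for a single keyframe of an asset."""
    labels = rekognition.detect_labels(
        Image={"Bytes": jpeg_bytes}, MaxLabels=15, MinConfidence=80.0
    )
    text = rekognition.detect_text(Image={"Bytes": jpeg_bytes})

    return {
        "asset_id": asset_id,
        # Objects, people, logos, etc. recognized in the frame
        "labels": [
            {"name": l["Name"], "confidence": round(l["Confidence"], 1)}
            for l in labels["Labels"]
        ],
        # On-screen text picked up by OCR (signs, lower thirds, captions)
        "ocr_lines": [
            d["DetectedText"]
            for d in text["TextDetections"]
            if d["Type"] == "LINE"
        ],
    }
```

In a workflow of this kind, the returned fields would be written back into the catalog so that catalogers only validate suggestions instead of typing every value by hand.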
In-depth media analysis for professionals
But if any AI functionality integrated in VSNExplorer MAM can be called futuristic without hesitation, it is dynamic action recognition, which combines sound and image analysis to correlate verbs, mentions and terms, and to recognize the emotions on display (anger, joy, sadness, fear…). Together with the rest of the collected data, this results in media assets analyzed in great detail, which not only saves users a lot of time but also supports their businesses in their main mission: creating the content their audiences want to see (linear and non-linear channels), locating a crucial media segment (security companies and public organizations), bringing to light content of great historical value (documentalists), or choosing the best segments and most impactful images for a particular video edit (news and sports programs).
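As an illustration of how such detailed, time-coded metadata pays off in those scenarios, here is a minimal sketch of a segment search over AI-derived labels, emotions and transcripts; the Segment structure and the query logic are assumptions made for the example, not VSNExplorer’s internal data model.

```python
# Minimal sketch: querying time-coded AI metadata (labels, emotions,
# transcript terms) to jump straight to a relevant segment.
# The Segment structure and search logic are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Segment:
    asset_id: str
    start_s: float                                    # segment start, in seconds
    end_s: float                                      # segment end, in seconds
    labels: set[str] = field(default_factory=set)     # e.g. {"motorcycle", "bar"}
    emotions: set[str] = field(default_factory=set)   # e.g. {"anger"}
    transcript: str = ""                              # speech-to-text excerpt

def find_segments(catalog: list[Segment], *, label=None, emotion=None, term=None):
    """Return segments matching every criterion that was supplied."""
    hits = []
    for seg in catalog:
        if label and label not in seg.labels:
            continue
        if emotion and emotion not in seg.emotions:
            continue
        if term and term.lower() not in seg.transcript.lower():
            continue
        hits.append(seg)
    return hits

# Example: locate every clip where a motorcycle appears while anger is detected.
# clips = find_segments(catalog, label="motorcycle", emotion="anger")
```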
The best part? The potential of the AI integrated in VSNExplorer MAM does not end there. With a local service of this kind, we can adapt it to our specific needs, ‘training’ the system to recognize the metadata of greatest interest and automating processes ever more efficiently, which makes the functionality not only powerful but also flexible, customizable and scalable. With tools like this, it is not difficult to predict that fiction is drawing ever closer to reality, and that the future of content management is already intelligent.
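As a rough idea of what such ‘training’ could look like in practice, the sketch below fits a small text classifier (scikit-learn is assumed purely for illustration) on tags that catalogers have already assigned, so the system can suggest the organization’s own categories for newly analyzed assets; the field names and tag set are invented for the example.

```python
# Hedged sketch of 'training' a local engine on in-house categories:
# a small scikit-learn text classifier learns the organization's own tags
# from already-catalogued assets and suggests them for new ones.
# The descriptions and tag set below are illustrative assumptions.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Historical catalog entries: (AI-generated description, tag chosen by a cataloger)
history = [
    ("motorcycle engine roar exterior night bar", "action"),
    ("press conference podium microphones speech", "news"),
    ("stadium crowd goal celebration replay", "sports"),
    ("archive black-and-white parade newsreel", "historical"),
]
texts, tags = zip(*history)

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, tags)

# Suggest a tag for a freshly analysed asset; catalogers confirm or correct it,
# and their corrections feed the next training round.
print(model.predict(["penalty kick slow motion crowd cheering"])[0])  # e.g. "sports"
```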