It’s common for security cameras to generate alerts based on motion detection. But what if they could detect suspicious behaviour and prevent theft or a violent attack? Heptasense, a startup based in Lisbon, has been toiling away at an artificial intelligence that monitors gestures and interprets human movements.

The startup promises that its software will recognize and understand human gestures and the way people interact with each other. The goal is to detect suspicious behaviour early enough for appropriate measures to be taken. So will Heptasense’s software soon prevent the next terrorist attack?

Heptasense Is Turning Security Cameras into Smart Devices

Other industries are also interested in the Lisbon startup’s software. Heptasense is already working with BMW and Mercedes. Its software could be used both for quality control in production halls and for gesture control in the cars of the future.

Retailers, too, have long been looking for technologies that make their customers’ behaviour more transparent. The software developed by Heptasense can not only determine a customer’s age and gender, but also analyze their exact movements and interactions in the store. This lets retailers track customer behaviour patterns and work to increase customer retention.

One of the company’s goals is to replace security infrastructure that is today monitored by humans with a more reliable and efficient artificial intelligence. The company’s software is cloud-hosted and relies mainly on motion signals and muscle signals to detect human gestures. In addition, Heptasense offers its software as a tool to improve sales and build detailed customer profiles.
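Heptasense has not published the details of its detection pipeline, so as a hedged illustration only, the sketch below shows the simplest building block such camera systems typically start from: frame differencing, where motion is flagged when enough pixels change between consecutive frames. The frame sizes, threshold, and alert fraction here are illustrative assumptions, not values from Heptasense.

```python
def motion_score(prev_frame, curr_frame, threshold=25):
    """Fraction of pixels whose grayscale intensity changed by more
    than `threshold` between two frames of equal size.

    Frames are 2-D lists of 0-255 grayscale values.
    """
    changed = total = 0
    for prev_row, curr_row in zip(prev_frame, curr_frame):
        for p, c in zip(prev_row, curr_row):
            total += 1
            if abs(p - c) > threshold:
                changed += 1
    return changed / total if total else 0.0


def motion_alert(prev_frame, curr_frame, min_fraction=0.05):
    """Raise an alert when enough of the image has changed.

    `min_fraction` is an assumed tuning parameter: lower values
    make the detector more sensitive (and noisier).
    """
    return motion_score(prev_frame, curr_frame) >= min_fraction


# Two tiny synthetic 4x4 frames: a bright "object" appears in one corner.
frame_a = [[0] * 4 for _ in range(4)]
frame_b = [[0] * 4 for _ in range(4)]
frame_b[0][0] = frame_b[0][1] = 200

print(motion_score(frame_a, frame_b))  # 2 of 16 pixels changed -> 0.125
print(motion_alert(frame_a, frame_b))  # True
```

Gesture recognition of the kind the article describes goes well beyond this, classifying *which* movement occurred rather than just that something moved, but a per-frame change score like this is the usual first stage that decides whether heavier analysis runs at all.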

However, an AI is only as good as the programming behind it. With topics as sensitive as terrorist attacks and theft prevention, it is vital that the right criteria are used. This raises an important question: which gestures or interactions, exactly, will the software consider suspicious?