
SmartAEye

SmartAEye is an embedded machine vision software for IoT devices, based on deep learning techniques and targeting the “smart cities” and “smart surveillance” markets.

Keywords: Smart cities, smart surveillance, embedded machine vision, deep learning, IoT devices

Vision

The aim of the SmartAEye project is to develop embedded machine vision software based on deep learning techniques, designed to be integrated into IoT devices for the “smart cities” and “smart surveillance” markets. The SmartAEye software will be embedded on different hardware platforms (CPUs, GPUs or full SoCs) to demonstrate both its functionality and its commercial exploitation.

Functionalities
People Detection

People detection and counting, performed in a privacy-preserving (GDPR-compliant) way, for statistical purposes in indoor environments (e.g. shopping centers / malls, shops, public services) or outdoor environments (e.g. central shopping streets of interest, public squares, public stops)

Vehicle Detection

Vehicle detection and counting for statistical purposes (e.g. bridges, tunnels, toll stations etc.)

Smart Parking

Detection and classification of parking spaces

The main feature and comparative advantage of the SmartAEye approach is that all data processing (image or video) is performed on the edge device, thereby safeguarding citizens’ personal data.

Goals and Objectives

1. Create a complete set of data for different “smart city” and “smart surveillance” scenarios, captured in real-world conditions from sensors in different locations and under different environmental conditions.

2. Develop machine vision technology based on artificial intelligence and deep learning, trained with real-time data as well as with existing datasets.

3. Exploit deep learning techniques and IRIDA’s know-how in training neural network modules and in real-time implementation of deep learning architectures for commercial purposes, gained through collaboration with companies such as Qualcomm, Analog Devices, Arrow Electronics, Cadence and Xilinx.

4. Pilot implementation of the functionalities in embedded systems (edge devices) using heterogeneous computing techniques to optimally exploit available computing resources such as CPU, GPU and DSP (see the sketch below).
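As a rough illustration of the heterogeneous-computing idea in goal 4, the sketch below assumes the OpenVINO runtime (which the project names later in the Methodology section); the device names and the model file are illustrative assumptions, not the project’s actual configuration.

```python
# Minimal sketch: enumerate the accelerators available on an edge board and
# split inference across them. Device list and model file are illustrative only.
import openvino as ov

core = ov.Core()
print(core.available_devices)              # e.g. ['CPU', 'GPU'] on a given board

model = core.read_model("detector.xml")    # hypothetical IR file of a detector

# The HETERO device lets layers unsupported by the accelerator fall back to the
# CPU, so one network can span GPU/DSP-class devices and the CPU in one pipeline.
compiled = core.compile_model(model, device_name="HETERO:GPU,CPU")
```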

Methodology - Project Phases

Phase 1: Data Collection and Deep Learning Training

Dataset development and selection

Training of CNN / deep learning architectures

Phase 2: Embedded System Prototyping

Embedded platform selection

Embedded software development

Phase 3: Demo System Development and Validation

Testing based on datasets of Phase 1

Testing and validation in real-world conditions

Work Packages

Gantt chart of the SmartAEye work packages

Impact

The SmartAEye product targets the “smart cities” and “smart surveillance” markets. The implementation of the SmartAEye library, running on an integrated system (edge device), results in an extremely competitive product that does not violate personal data protection (GDPR compliant). A typical example of the use of the technology is the smart parking application: using a “sensor” that integrates the functions of SmartAEye, a scenario can be implemented in which citizens are informed about empty parking spaces.

These functionalities are particularly critical for the implementation of “smart surveillance” applications where, for example, shopping malls and shops can measure the number and flow of customers without collecting personal data.

Methodology

SmartAEye Methodology
Stages of training a neural network

Model optimization and conversion process

These steps are implemented through the OpenVINO Toolkit or the OpenVINO DL Workbench
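As an illustration of this optimization and conversion step, a minimal sketch using the OpenVINO Python API (2023+) is shown below; the ONNX file name and the AUTO device choice are assumptions made for the example, not details reported by the project.

```python
# Minimal sketch of the model optimization / conversion flow with OpenVINO.
# The input ONNX file is a placeholder for a trained SmartAEye detector.
import openvino as ov

# Convert the trained model (e.g. exported to ONNX) into OpenVINO IR
ov_model = ov.convert_model("people_vehicle_detector.onnx")

# save_model compresses weights to FP16 by default, shrinking the IR for edge use;
# the resulting .xml/.bin pair can also be inspected and tuned in DL Workbench
ov.save_model(ov_model, "people_vehicle_detector.xml")

# Compile for the target edge device; AUTO lets the runtime pick CPU/GPU/etc.
compiled = ov.Core().compile_model(ov_model, device_name="AUTO")
```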

SmartAEye Training

Results

Trajectory estimation using deep learning methods

We use neural networks to learn to predict the trajectory of people in the plane, given the current position and the last n positions. For the human trajectory problem, we need sequences of two-dimensional positions in the (x, y) plane, which constitute the trajectories. The network utilized is an LSTM (Long Short-Term Memory network).
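The project reports only that an LSTM is used for this sequence-to-position task; as a hedged illustration, the sketch below shows one way such a predictor could look in PyTorch, with the hidden size and the window length n = 8 chosen arbitrarily for the example.

```python
# Illustrative LSTM trajectory predictor: maps the last n (x, y) positions to
# the next (x, y) position. Hyperparameters are example values, not the project's.
import torch
import torch.nn as nn

class TrajectoryLSTM(nn.Module):
    def __init__(self, hidden_size=64, num_layers=1):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size,
                            num_layers=num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, 2)    # regress the next (x, y)

    def forward(self, seq):
        # seq: (batch, n, 2) -- the last n observed positions in the plane
        out, _ = self.lstm(seq)
        return self.head(out[:, -1, :])          # prediction from the last step

model = TrajectoryLSTM()
past = torch.randn(4, 8, 2)    # batch of 4 trajectories, n = 8 past positions each
next_xy = model(past)          # predicted next positions, shape (4, 2)
```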

LSTM model architecture for human trajectory estimation

Data Annotation for trajectory estimation

Prediction of trajectory tracking using the LSTM network

Dissemination

During the period from 12/05/2020 to 11/05/2022, the project team worked on the dissemination and commercial exploitation of SmartAEye through direct meetings with companies in Greece and the rest of Europe. In addition, the company participated in international exhibitions abroad, either as an exhibitor or as a guest of its international partners.

Participation in the DFAE meeting of Renesas in Düsseldorf

In May 2022, as part of its collaboration with Renesas Electronics and in order to promote the results of the SmartAEye project, Irida Labs was invited to participate in the Distributors and FAEs (DFAE) meeting organized in Düsseldorf. This invitation continued the collaboration with Renesas, as Irida Labs is a member of its preferred partner program.

Participation in the HikVision Innovation Summit exhibition

Irida Labs has developed a substantial collaboration with HikVision, one of the world’s top 3 camera manufacturers, for promoting the results of the SmartAEye project. As part of this collaboration, in May 2022 HikVision invited selected partners, including Irida Labs, to present solutions, such as those developed within the SmartAEye project, at an exhibition organized in Amsterdam specifically for this purpose. HikVision promoted this exhibition through a dedicated website.

Contact






Funding Details

This project has been co-financed by the European Regional Development Fund of the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call RESEARCH – CREATE – INNOVATE (project code: Τ2EDK-02263).
