Tutorial 1: Machine learning driven sensing for IoT building applications


  • Ashish Pandharipande, NXP Semiconductors
  • Avik Santra, Infineon Technologies

Abstract: Sensing technologies play an important role in realizing smart, energy-efficient and sustainable buildings. Sensors of different modalities are part of building infrastructure, such as lighting, HVAC and surveillance systems, which is increasingly becoming connected. Data from such multi-modal IoT sensors can be used to realize new and improved building applications using advanced signal processing and machine learning. In this tutorial, we will cover the following topics:

  • IoT building applications like lighting/HVAC controls, system monitoring and diagnostics, space management, and location-based services;
  • Sensor system and machine learning architectures;
  • Machine learning based sensor data processing techniques.

Tutorial 2: Tensor learning for signal processing


  • Yipeng Liu, University of Electronic Science and Technology of China (UESTC)

Abstract: Tensors are a natural representation for multi-dimensional data, and tensor-computation-based signal processing avoids the loss of multi-linear structure incurred by classical matrix-based methods. Recent advances in tensor computation have improved a number of classical signal processing techniques through their tensor extensions. This tutorial will provide a systematic introduction to recent advances in tensor learning for signal processing, such as tensor regression, tensor dictionary learning, low-rank tensor completion, tensor principal component analysis, tensor subspace clustering, support tensor machines, and tensor neural networks. These techniques can be applied to image reconstruction, image quality enhancement, background extraction, multi-view image clustering, weather forecasting, pose estimation, speech source separation, and image and speech classification. This tutorial serves the following objectives:

  • Introduction to tensors
  • Tensor learning for signal processing
  • Selected discussion
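As a concrete taste of the multi-linear structure these methods exploit, the following is a minimal NumPy sketch of mode-n unfolding and a truncated higher-order SVD (a basic Tucker-decomposition tool underlying several of the techniques listed above). The tensor sizes and multilinear ranks are illustrative assumptions, not from the tutorial itself.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` to the front, then flatten."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold for a tensor of the given target shape."""
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def hosvd(T, ranks):
    """Truncated higher-order SVD: factor matrices from each unfolding,
    then a core tensor obtained by projecting onto their subspaces."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T.copy()
    for mode, U in enumerate(factors):
        new_shape = core.shape[:mode] + (U.shape[1],) + core.shape[mode + 1:]
        core = fold(U.T @ unfold(core, mode), mode, new_shape)
    return core, factors

# A tensor with multilinear rank (2, 2, 2) is reconstructed exactly
# when truncated at its true ranks.
rng = np.random.default_rng(0)
A, B, C = rng.normal(size=(6, 2)), rng.normal(size=(7, 2)), rng.normal(size=(8, 2))
G = rng.normal(size=(2, 2, 2))
T = np.einsum('abc,ia,jb,kc->ijk', G, A, B, C)
core, (U1, U2, U3) = hosvd(T, (2, 2, 2))
T_hat = np.einsum('abc,ia,jb,kc->ijk', core, U1, U2, U3)
print(np.allclose(T, T_hat))  # → True
```

Matrix-based processing would flatten T once and lose the interplay between the three modes; the Tucker form keeps one factor matrix per mode, which is the structural advantage the tutorial refers to.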

Tutorial 3: Fundamental Understandings of Machine Learning by Local Information Geometry


  • Shao-Lun Huang, Tsinghua-Berkeley Shenzhen Institute

Abstract: In contemporary machine learning, it is critical to identify “informative” low-dimensional features from high-dimensional data, yet the notion of “informative” is not unified across learning problems. In this tutorial, we introduce a natural information metric for quantifying the informativeness of features, derived from the Hirschfeld-Gebelein-Rényi (HGR) maximal correlation and a modal decomposition of the dependence between random variables, via a local information geometric approach. We show that this information metric is locally equivalent to the objectives of several learning scenarios, including Tishby’s information bottleneck, hypothesis testing, universal feature selection, and Wyner’s common information. This establishes a theoretical connection between information theory, statistical learning, and machine learning algorithms through the underlying information structure. Moreover, we introduce the alternating conditional expectation (ACE) algorithm for computing the maximal correlation features of data. Finally, we show that the hidden layers of deep neural networks can be interpreted as fine-tuning the network parameters to extract such features, which provides a theoretical interpretation of the mechanism of deep learning. Outline of this tutorial:

  • The local information geometry
  • The geometric structure for two random variables
  • Local information geometry and machine learning
  • The algorithm and deep neural networks
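For discrete variables with a known joint distribution, the ACE iteration mentioned above reduces to a power iteration that alternates between the two conditional-expectation operators. The sketch below is a minimal illustration; the joint pmf is an invented example, not data from the tutorial.

```python
import numpy as np

# Alternating conditional expectation (ACE) for the HGR maximal
# correlation of two discrete random variables X and Y, given an
# (illustrative, assumed) joint pmf P[x, y].
P = np.array([[0.20, 0.05, 0.05],
              [0.05, 0.20, 0.05],
              [0.05, 0.05, 0.30]])   # sums to 1
px, py = P.sum(axis=1), P.sum(axis=0)

rng = np.random.default_rng(1)
f = rng.normal(size=P.shape[0])      # feature of X, arbitrary start
for _ in range(200):
    # g(y) = E[f(X) | Y = y], then center and normalize under p(y)
    g = (P.T @ f) / py
    g -= py @ g
    g /= np.sqrt(py @ g**2)
    # f(x) = E[g(Y) | X = x], then center and normalize under p(x)
    f = (P @ g) / px
    f -= px @ f
    f /= np.sqrt(px @ f**2)

rho = f @ P @ g                      # E[f(X) g(Y)]: the HGR maximal correlation
print(round(rho, 4))                 # → 0.5833
```

Centering removes the trivial constant feature, so the iteration converges to the top non-trivial singular functions of the conditional-expectation operator; in the modal-decomposition view of the tutorial, f and g are the first pair of maximal correlation features.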

Tutorial 4: Deep learning based sparse signal processing for massive wireless access


  • Wei Chen, Beijing Jiaotong University
  • Yanna Bai, Beijing Jiaotong University

Abstract: Massive machine-type communication (mMTC) is one of the three core services of 5G and supports various Internet-of-Things applications, e.g., smart factories, smart grids, environment sensing, and health monitoring. The central challenge in mMTC is to connect a large number of uncoordinated devices through a limited spectrum. The typical mMTC traffic pattern is sporadic, with short packets, and is therefore suited to grant-free random access, in which activity detection, channel estimation, and data recovery can be formulated as a sparse recovery problem and solved via compressive sensing algorithms. This formulation, however, brings new challenges in processing complexity and latency. Data-driven methods are a promising direction for addressing the complexity problem and meeting the low-latency needs of time-sensitive scenarios. Instead of using deep learning as a black box, we provide guidelines for exploiting different sparse structures in neural networks, integrating communication-system knowledge into the network, and overcoming the limitations of classical networks. In this tutorial, we present the latest research progress on deep learning for sparse signal processing, using compressive-sensing-based massive access as a running example. We cover how to formulate deep learning problems, how to combine deep learning with specific applications, and how to improve existing networks. The methods are not limited to massive access: attendees can also draw inspiration for applying deep learning to sparse signal processing, which arises widely across research areas and industrial applications.
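The sparse-recovery formulation behind grant-free access can be sketched with classical ISTA (iterative soft-thresholding); unrolling its iterations into layers, with the matrices and thresholds learned from data, is the idea behind LISTA-style networks of the kind this tutorial discusses. The problem sizes, sparsity level, and regularization weight below are illustrative assumptions; a noiseless model is used for clarity.

```python
import numpy as np

# Sparse recovery by ISTA: y = A x with x k-sparse, where nonzero
# entries of x model the few active devices among m candidates.
rng = np.random.default_rng(0)
n, m, k = 64, 128, 5                       # measurements, devices, active devices
A = rng.normal(size=(n, m)) / np.sqrt(n)   # assumed measurement/pilot matrix
x_true = np.zeros(m)
x_true[rng.choice(m, k, replace=False)] = rng.normal(size=k)
y = A @ x_true                             # noiseless observations

def soft(v, t):
    """Soft-thresholding, the proximal operator of the l1 penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
lam = 0.01                                 # illustrative l1 weight
x = np.zeros(m)
for _ in range(1000):
    # gradient step on ||y - A x||^2 / 2, then shrinkage
    x = soft(x + (A.T @ (y - A @ x)) / L, lam / L)

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

A learned unrolled network replaces the fixed step `A.T/L` and threshold `lam/L` with per-layer trainable parameters, which is how the data-driven methods in the tutorial cut the iteration count, and hence latency, relative to this fixed-point scheme.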