Learning Smoothly Varying Patterns

Learning a succinct set of substructures that predicts global network properties plays a key role in understanding complex network data. Existing approaches address this problem by sampling the exponential space of all possible subnetworks to find ones with high prediction accuracy. We develop a novel framework that avoids sampling by formulating the problem of predictive subnetwork learning as node selection, subject to network-constrained regularization. Our framework involves two steps: (i) subspace learning, and (ii) predictive substructure discovery with network regularization, and is based upon spectral graph learning and gradient descent optimization. We show that these solutions converge to a globally optimal solution---a desired property that cannot be guaranteed by sampling approaches. Through experimental analysis on a number of real-world datasets, we demonstrate the performance of our framework against state-of-the-art algorithms, not only in terms of prediction accuracy but also in terms of domain relevance of the discovered substructures.
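The core idea of network-regularized node selection can be illustrated with a minimal sketch. This is not the paper's implementation; it is a hypothetical toy example assuming a linear predictor, where node weights are fit by gradient descent on a squared-error loss plus a graph-Laplacian penalty `w^T L w` that encourages the selected (high-weight) nodes to vary smoothly over the network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy path graph over 5 nodes; Laplacian L = D - A encodes the network.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Synthetic data: each row is a sample, each column a node-level measurement.
# The truly predictive nodes (0 and 1) form a connected region of the graph.
X = rng.normal(size=(100, 5))
w_true = np.array([1.0, 1.0, 0.0, 0.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=100)

# Gradient descent on  ||y - X w||^2 + lam * w^T L w  (both terms convex,
# so the iterates approach the global optimum for a small enough step size).
lam, lr = 1.0, 1e-3
w = np.zeros(5)
for _ in range(2000):
    grad = 2 * X.T @ (X @ w - y) + 2 * lam * L @ w
    w -= lr * grad

# The learned weights concentrate on the adjacent nodes 0 and 1,
# yielding a connected predictive substructure rather than scattered nodes.
print(np.round(w, 2))
```

Because both the loss and the Laplacian penalty are convex in `w`, gradient descent converges to the global optimum, which is the property the sampling-free formulation exploits.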

Affiliated Faculty

Research interests: 

Applied Machine Learning, Complex Network Analysis, Non-convex Optimization, Multi-Agent Systems, Natural Language Processing

Omid received a B.Sc. in Computer Engineering in 2011 and an M.Sc. in Artificial Intelligence in 2014 from Sharif University of Technology, Tehran, Iran. Prior to joining Dynamo lab in 2015, he spent a few years as a software engineer in industry. He has a background in complex networks, analysis of financial data, and applied machine learning.

Research interests: 

Network Mining, Representation Learning, Graph Signal Processing, Transfer Learning

Zexi received his B.Eng. in Computer Science and Technology from the University of Electronic Science and Technology of China, Chengdu, in 2018, and joined Dynamo lab the same year. During his bachelor's studies, he also worked as a visiting research assistant at Nanyang Technological University, Singapore.