Dec 13, 2019
Neural architectures and many learning environments can conveniently be expressed by graphs. Interestingly, it has recently been shown that the notion of receptive field, and the corresponding convolutional computation, extends nicely to graph-based data domains with successful results. Graph neural networks (GNNs), on the other hand, were introduced by extending the notion of time-unfolding, which resulted in a state-based representation together with a learning process that requires relaxing the state to a fixed point. Algorithms based on this approach turn out to be more computationally expensive, when applied to learning tasks on collections of graphs, than recent graph convolutional nets. In this talk we advocate revisiting state-based graph representations, in the spirit of the early introduction of GNNs, for “network domains” that are characterized by a single graph (e.g. traffic nets, social nets). In those cases, the data on the graph form a continuous stream in which time plays a crucial role and blurs the classic statistical distinction between training and test set. By expressing the graphical domain and the neural network within the same Lagrangian framework for learning with constraints, we derive novel learning algorithms that appear well suited to network domains. Finally, we show that in the proposed learning framework the Lagrange multipliers are associated with the delta terms of Backpropagation, and we provide intriguing arguments for its biological plausibility.
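To make the Lagrangian view of learning concrete, the following is a minimal sketch, not the authors' actual algorithm: a tiny two-layer network is trained as a constrained optimization problem, where the hidden state is a free variable tied to the architecture by the constraint x1 = tanh(W1 x0), and each constraint gets a Lagrange multiplier. All sizes, learning rates, and the quadratic augmentation term are illustrative assumptions added for numerical stability; at a stationary point, the gradient condition on the free state links the multipliers to the backpropagated error (the "delta"), which is the association the abstract alludes to.

```python
import numpy as np

# Illustrative sketch (not the talk's exact algorithm): learning as
# constrained optimization. The hidden state x1 is a free variable,
# constrained by g = x1 - tanh(W1 @ x0) = 0, with multipliers lam.
# Augmented Lagrangian (quadratic penalty added for stability):
#   L = 0.5*||W2 @ x1 - y||^2 + lam @ g + (rho/2)*||g||^2
# Primal variables (W1, W2, x1) do gradient descent on L;
# the multipliers lam do gradient ascent.

rng = np.random.default_rng(0)

x0 = rng.normal(size=3)            # toy input (single example, illustrative)
y = np.array([1.0])                # toy target

W1 = rng.normal(scale=0.5, size=(4, 3))
W2 = rng.normal(scale=0.5, size=(1, 4))
x1 = np.tanh(W1 @ x0)              # initialize the state at feasibility
lam = np.zeros(4)                  # one multiplier per constraint component

eta, rho = 0.05, 1.0               # assumed step size and penalty weight

for step in range(3000):
    h = np.tanh(W1 @ x0)
    g = x1 - h                     # constraint residual
    err = W2 @ x1 - y              # output error
    # Gradients of the augmented Lagrangian w.r.t. primal variables:
    dx1 = W2.T @ err + lam + rho * g
    dW2 = np.outer(err, x1)
    dW1 = -np.outer((lam + rho * g) * (1 - h**2), x0)
    # Descent on primal variables, ascent on multipliers:
    x1 -= eta * dx1
    W2 -= eta * dW2
    W1 -= eta * dW1
    lam += eta * g

h = np.tanh(W1 @ x0)
g = x1 - h
loss = 0.5 * np.sum((W2 @ x1 - y) ** 2)
```

Note that setting the stationarity condition dL/dx1 = 0 with a satisfied constraint (g = 0) gives lam = -W2.T @ err, i.e. the multipliers coincide (up to sign) with the error term that Backpropagation would deliver to that layer; all updates here are local, which is the kind of property invoked in biological-plausibility arguments.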
Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.