INNS Webinar Series Archive
Explore our archive of past INNS Webinars. This rich collection provides a valuable learning resource for students and professionals interested in neural networks and related research. Dive into recordings of previous lectures from our bi-monthly Webinar Series. For information on upcoming live sessions and the overall series, please visit the INNS Webinar Series page.
2025
NeuroAI
Abstract: NeuroAI creates synergies between the study of brains and advances in AI because both rely on neural architectures that are massively parallel and densely connected, with weights trained by learning algorithms. Horace Barlow suggested that the oriented filters in visual cortex, generally thought to be edge detectors, might be a compact way to represent natural scenes. We confirmed this hypothesis with Independent Component Analysis (ICA), an unsupervised learning algorithm, by training it on patches of natural scenes. The independent components were edge filters, and each patch could be reconstructed from only a sparse set of components. When large-scale Convolutional Neural Networks (CNNs) with many processing layers became feasible a decade ago, they could recognize thousands of objects in images invariant to location, scale, and rotation. Similar progress has occurred in language processing, starting with NETtalk in the 1980s, which we trained to pronounce English text, a difficult problem in linguistics because of its many irregularities. Today, Large Language Models (LLMs) talk to us on almost any topic in perfect syntax. Watch here
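As a concrete illustration of the ICA experiment described in this abstract, the sketch below (hypothetical code, not the speaker's original pipeline; it assumes scikit-learn is available and substitutes a built-in sample photograph for a natural-scene corpus) extracts small patches from a grayscale image, fits FastICA, and checks that each patch is encoded by only a few active components.

```python
# Minimal ICA-on-image-patches sketch (illustrative only, not the speaker's
# original code). Assumes scikit-learn >= 1.1; a sample photograph stands in
# for a corpus of natural scenes.
import numpy as np
from sklearn.datasets import load_sample_image
from sklearn.decomposition import FastICA
from sklearn.feature_extraction.image import extract_patches_2d

# Grayscale "natural scene" stand-in.
image = load_sample_image("china.jpg").mean(axis=2)

# Sample 5000 random 16x16 patches and flatten them into vectors.
patches = extract_patches_2d(image, (16, 16), max_patches=5000, random_state=0)
X = patches.reshape(len(patches), -1)
X -= X.mean(axis=1, keepdims=True)   # remove each patch's mean (DC component)

# Learn 64 independent components; each row acts as a localized, oriented filter.
ica = FastICA(n_components=64, whiten="unit-variance", max_iter=500, random_state=0)
sources = ica.fit_transform(X)                    # per-patch coefficients
filters = ica.components_.reshape(64, 16, 16)     # filters, viewable as 16x16 images

# Sparseness check: for any given patch, most coefficients are near zero.
threshold = 0.1 * np.abs(sources).max()
print("fraction of near-zero coefficients:", (np.abs(sources) < threshold).mean())
```

Visualizing `filters` (for example with matplotlib) typically reveals localized, oriented, edge-like detectors, in line with Barlow's prediction mentioned above.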
The AI Act: Perspectives for the Technical and Scientific Communities
Abstract: This talk will provide an introductory overview of the EU's landmark regulation on AI, the AI Act. It will specifically address the role of experts in its development, the foreseen impact on the technical and research communities, and how these communities can support its implementation. Furthermore, the talk will give participants an opportunity to learn more about the role of the Joint Research Centre, the European Commission's science and knowledge service, in shaping the AI Act. Lastly, it will introduce the Commission's AI in Science Strategy, currently undergoing public consultation. Watch here
The AI-driven Hospital of the Future and Beyond
Abstract: AI today can pass the Turing test and is in the process of transforming science, technology, humans, and society. Surprisingly, modern AI is built out of two very simple and old ideas, rebranded as deep learning: neural networks and gradient descent learning. The storage of information in neural networks by gradient descent is distributed or "holographic", and since Dennis Gabor invented holography, I am particularly honored to be a recipient of the prize that bears his name. I will describe several applications of AI to problems in biomedicine developed in my laboratory, from the molecular level to the patient level, using omic data, imaging data, and clinical data. Examples include the analysis of circadian rhythms in gene expression data, the identification of polyps in colonoscopies, and the prediction of post-operative outcomes. I will discuss the opportunities and challenges for developing, integrating, and deploying AI in the first AI-driven hospitals of the future and present two frameworks for addressing some of the most pressing societal issues related to AI research and safety. Watch here
Dataset Distillation and Pruning: Streamlining Machine Learning Performance
Abstract: In the rapidly evolving field of machine learning, "Dataset Distillation and Pruning" has emerged as a key strategy for enhancing model efficiency. Dataset distillation involves extracting essential information from extensive datasets to create refined, smaller-scale data that maintains model robustness while reducing computational burden; it can be likened to distilling knowledge from vast amounts of data. Dataset pruning, on the other hand, is akin to pruning unnecessary branches from a tree: it removes redundant or minimally impactful data points, resulting in a more streamlined, faster, and resource-efficient machine learning model. By eliminating extraneous information, dataset pruning helps build lean algorithms with strong performance and without unnecessary computational overhead. Together, these two approaches address the challenges posed by the abundance of data in the digital age. Dataset distillation and pruning complement each other in model compression research, reduce the energy consumption of the entire machine learning workflow, and ultimately facilitate the sustainable deployment of large-scale data and models on endpoint devices. Watch here
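To make the pruning half of this picture concrete, here is a minimal, hypothetical sketch (not the speaker's method) of score-based dataset pruning: a cheap proxy model is trained on the full data, each training example is scored by its loss, and the easiest examples are discarded before retraining on the smaller set. It assumes scikit-learn and synthetic data.

```python
# Minimal score-based dataset-pruning sketch (illustrative only; assumes
# scikit-learn). A proxy model scores each training example by its loss,
# and the easiest examples are dropped before retraining.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# 1. Fit a cheap proxy model on the full training set.
proxy = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# 2. Score each example by its negative log-likelihood under the proxy.
probs = proxy.predict_proba(X_train)[np.arange(len(y_train)), y_train]
losses = -np.log(np.clip(probs, 1e-12, 1.0))

# 3. Keep the hardest 40% of examples (highest loss) and drop the rest.
keep = np.argsort(losses)[-int(0.4 * len(losses)):]
pruned_model = LogisticRegression(max_iter=1000).fit(X_train[keep], y_train[keep])

print("full-data accuracy  :", accuracy_score(y_test, proxy.predict(X_test)))
print("pruned-data accuracy:", accuracy_score(y_test, pruned_model.predict(X_test)))
print("training examples kept:", len(keep), "of", len(y_train))
```

In practice, pruning criteria such as forgetting scores or gradient norms are often used instead of a single proxy loss, and distillation goes further by synthesizing entirely new compact examples; the sketch only illustrates the basic score-and-drop workflow.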
The Critical Role of AI in Learning Analytics and Assessment in the Future of Education
Abstract: The increasing adoption of Artificial Intelligence (AI) in higher education presents both opportunities and challenges for institutions, teachers, and students. As AI-driven tools for personalized learning and alternative assessment approaches are poised to replace or transform traditional methods, this presentation delves into the transformative impact of AI on the future of education. We will explore current trends in learning and assessment, examining how AI technology is redefining these practices. This presentation aims to provide a comprehensive understanding of how AI is reshaping assessment practices and driving the future of educational success, catering to learners, educators, administrators, and policymakers. Watch here
2024
INNS Annual Lecture