Speaker: Prof Jesus Lizana
Time: 17:00-18:00
Levett Room, Wolfson College & Online (Teams link)
Jesus Lizana is Associate Professor in Engineering Science at the University of Oxford, with a unique experience profile spanning architecture and engineering. His research focuses on the cross-disciplinary challenges of the transition towards zero-carbon, climate-responsive buildings. At Oxford, Lizana is engaged in many research initiatives and has received several prestigious grants, including a Marie Curie Fellowship. He leads the research on Zero-Carbon Space Heating and Cooling at the ZERO Institute and supports the interdisciplinary research of the Future of Cooling Programme at the Oxford Martin School. Alongside his academic career, Lizana also serves as a consultant on building energy, data science, and sustainable cooling projects in various locations, including the United Kingdom, India, Spain, Morocco, and Saudi Arabia. Lizana received his PhD in low-carbon buildings from the University of Seville in Spain after completing a BSc in Architecture and an MSc in Building Engineering. Before his appointment at Oxford, he lectured and conducted research at the University of Seville (Spain), the University of Edinburgh (Scotland), the Technical University of Munich (Germany), Universidade de Lisboa (Portugal), and the Spanish National Research Council (Spain).
The rapid increase in global mean temperature and unprecedented heat events require new approaches to support and monitor the climate adaptation and heat resilience of cities. Crafting effective plans requires accurate data and tools that adapt to the ever-changing dynamics of urban environments. This presentation will show recent advances in accurately diagnosing and treating, city by city, the overheated urban areas, in both time and space, where climate adaptation should be prioritised to promote heat resilience. The research aims to fully integrate crowdsourced urban climate observations (citizen weather stations) with satellite and remote sensing data using machine learning techniques to generate observations of urban atmospheric states and dynamics at high spatio-temporal resolution. The results will support the development of an urban heat diagnosis tool with global applicability, enabling insight and evidence-supported actions to promote zero-carbon and sustainable cooling at different scales. This research is part of the Future of Cooling Programme of the Oxford Martin School.
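The data-fusion idea described above (calibrating a model on sparse citizen-station measurements against gridded satellite predictors, then predicting across the whole grid) can be sketched in a few lines. The synthetic data, the choice of predictors, and the simple linear model below are illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np

# Hypothetical sketch: fuse sparse point measurements from citizen
# weather stations with gridded satellite predictors to estimate air
# temperature on a fine grid. All names and the linear-model choice
# are assumptions for illustration only.

rng = np.random.default_rng(42)
n_stations, n_grid = 200, 10_000

# Satellite-derived predictors at station locations, e.g.
# [land surface temperature, vegetation index, built-up fraction].
X_sta = rng.normal(size=(n_stations, 3))
# Air temperature measured by the stations (synthetic ground truth).
y_sta = (25 + 0.8 * X_sta[:, 0] - 1.5 * X_sta[:, 1] + 2.0 * X_sta[:, 2]
         + rng.normal(scale=0.3, size=n_stations))

# Fit a linear calibration (with intercept) by least squares.
A = np.column_stack([np.ones(n_stations), X_sta])
coef, *_ = np.linalg.lstsq(A, y_sta, rcond=None)

# Apply the calibrated model to every grid cell: a dense temperature
# map recovered from sparse point observations.
X_grid = rng.normal(size=(n_grid, 3))
t_grid = np.column_stack([np.ones(n_grid), X_grid]) @ coef
print(t_grid.shape)  # (10000,)
```

In practice a nonlinear learner and careful spatial cross-validation would replace the least-squares fit, but the train-on-stations, predict-on-grid structure is the core of the approach.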
Speaker: Razvan Pascanu
Time: 17:00-18:00
Buttery, Wolfson College & Online (Teams link)
I'm currently a Research Scientist at DeepMind. I grew up in Romania and studied computer science and electrical engineering for my undergraduate degree in Germany. I got my MSc from Jacobs University Bremen in 2009 and hold a PhD from the University of Montreal (2014), which I did under the supervision of Prof. Yoshua Bengio. I was involved in developing Theano and helped write some of the deep learning tutorials for Theano. I've published several papers on topics surrounding deep learning and deep reinforcement learning (see my scholar page). I'm one of the organizers of EEML (www.eeml.eu) and of AIRomania. As part of the AIRomania community, I have organized RomanianAIDays since 2020 and helped build a course on AI aimed at high school students.
In this talk I will focus on State Space Models (SSMs), a subclass of Recurrent Neural Networks (RNNs) that has recently gained attention through works like Mamba, obtaining strong performance against transformer baselines. I will start by explaining how SSMs can be viewed as just a particular parametrization of RNNs, and what the crucial differences from previous recurrent architectures are that led to these results. My goal is to demystify the relatively complex parametrization of the architecture and to identify which elements are needed for the model to perform well. In this process I will introduce the Linear Recurrent Unit (LRU), a simplified linear layer inspired by existing SSM layers. In the second part of the talk, I will focus on language modelling and the block structure in which such layers tend to be embedded. I will argue that beyond the recurrent layer itself, the block structure borrowed from transformers plays a crucial role in the recent successes of this architecture, and I will present results at scale for well-performing hybrid recurrent architectures compared to strong transformer baselines. I will close the talk with a few open questions and thoughts on the importance of recurrence in modern deep learning models.
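The core ingredient behind layers of this kind, a linear recurrence with a diagonal (complex-valued) state transition, can be sketched compactly. The sketch below is a simplified illustration of the idea, not the exact layer from the talk; the stable-eigenvalue parametrization and all shapes are assumptions for exposition.

```python
import numpy as np

def lru_scan(u, nu_log, theta, B, C, D):
    """Forward pass of a simplified diagonal linear recurrent layer.

    The state transition is diagonal, so each state dimension evolves
    independently; the output projects the real part of the state back
    to the model dimension. Because the recurrence is linear, it can be
    computed with a parallel scan, though a plain loop is used here.

    u: (T, d_in) input sequence.
    nu_log, theta: (d_state,) log-magnitudes and phases of the diagonal
        eigenvalues lam = exp(-exp(nu_log) + i*theta); since
        exp(nu_log) > 0, we get |lam| < 1 and a stable recurrence.
    B: (d_state, d_in) complex input projection.
    C: (d_out, d_state) complex output projection.
    D: (d_out, d_in) real skip connection.
    """
    lam = np.exp(-np.exp(nu_log) + 1j * theta)  # diagonal eigenvalues
    x = np.zeros(lam.shape[0], dtype=complex)
    ys = []
    for t in range(u.shape[0]):
        x = lam * x + B @ u[t]                  # linear state update
        ys.append((C @ x).real + D @ u[t])      # real-valued readout
    return np.stack(ys)

# Tiny usage example with random parameters.
rng = np.random.default_rng(0)
d_in, d_state, d_out, T = 4, 8, 4, 16
u = rng.normal(size=(T, d_in))
nu_log = rng.normal(size=d_state)
theta = rng.uniform(0, 2 * np.pi, size=d_state)
B = rng.normal(size=(d_state, d_in)) + 1j * rng.normal(size=(d_state, d_in))
C = rng.normal(size=(d_out, d_state)) + 1j * rng.normal(size=(d_out, d_state))
D = rng.normal(size=(d_out, d_in))
y = lru_scan(u, nu_log, theta, B, C, D)
print(y.shape)  # (16, 4)
```

Note that the layer itself contains no nonlinearity; in the block structure discussed in the talk, the nonlinear mixing happens outside the recurrence, in the surrounding transformer-style block.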