Continual Learning Working Group: Haozhe Shan
CEPSR 620, Schapiro, 530 W. 120th St
Speaker: Haozhe Shan
Title: A theory of continual learning in deep neural networks: task relations, network architecture and learning procedure
Abstract: Imagine listening to this talk and afterwards forgetting everything else…
Multi-resource-cost Optimization for Neural Networks Models Working Group (NNMS): Tom Griffiths
Zuckerman Institute - L3-079, 3227 Broadway, New York, NY, United States
Title: Bounded optimality: A cognitive perspective on neural computation with resource limitations
Multi-resource-cost Optimization for Neural Networks Models Working Group (NNMS): Simon Laughlin
Zuckerman Institute - L3-079, 3227 Broadway, New York, NY, United States
Title: Neuronal energy consumption: basic measures and trade-offs, and their effects on efficiency
Zoom: https://columbiauniversity.zoom.us/j/98299154214?pwd=1J3J0lEpF6XdqHkHy02c7LuD6xUWx2.1
Continual Learning Working Group: Amogh Inamdar
CSB 488
Title: Taskonomy: Disentangling Task Transfer Learning
Abstract: TBD
Link: http://taskonomy.stanford.edu/taskonomy_CVPR2018.pdf
CTN: Brenden Lake
Zuckerman Institute - L3-079, 3227 Broadway, New York, NY, United States
Title: Meta-learning for more powerful behavioral modeling
Abstract: Two modeling paradigms have historically been in tension: Bayesian models provide an elegant way to incorporate prior knowledge, but they make simplifying and constraining…
Continual Learning Working Group: Lindsay Smith
CEPSR 620, Schapiro, 530 W. 120th St
Title: A Practitioner’s Guide to Continual Multimodal Pretraining
Reading: https://arxiv.org/pdf/2408.1447
Zoom: https://columbiauniversity.zoom.us/j/97176853843?pwd=VLZdh6yqHBcOQhdf816lkN5ByIpIsF.1
CTN: Benjamin Grewe
Zuckerman Institute - L5-084, 3227 Broadway, New York, NY, United States
Title: Target Learning rather than Backpropagation Explains Learning in the Mammalian Neocortex
Abstract: Modern computational neuroscience presents two competing hypotheses for hierarchical learning in the neocortex: (1) deep learning-inspired approximations of the…
Continual Learning Working Group: Yasaman Mahdaviyeh
CEPSR 620, Schapiro, 530 W. 120th St
Title: Meta Continual Learning Revisited: Implicitly Enhancing Online Hessian Approximation via Variance Reduction
Reading: https://openreview.net/pdf?id=TpD2aG1h0D
Zoom: https://columbiauniversity.zoom.us/j/97176853843?pwd=VLZdh6yqHBcOQhdf816lkN5ByIpIsF.1
ARNI Annual Retreat
Faculty House, 64 Morningside Dr
General Agenda: October 21st, Day 1, 8:45am to 5pm
Breakfast and Lunch Provided
- Opening
- 3 Keynote Speakers from ARNI Faculty
- Research Brainstorming and Discussions
- Project/Student Poster Session
- Education and…
CTN: Tatiana Engel
Zuckerman Institute - L5-084, 3227 Broadway, New York, NY, United States
Title: Unifying neural population dynamics, manifold geometry, and circuit structure
Abstract: Single neurons show complex, heterogeneous responses during cognitive tasks, often forming low-dimensional manifolds in the population state space. Consequently, it is…
CTN: Naureen Ghani
Zuckerman Institute - L5-084, 3227 Broadway, New York, NY, United States
Title: Mice wiggle a wheel to boost the salience of low visual contrast stimuli
Abstract: From the Welsh tidy mouse to the New York City pizza rat, movement belies rodent intelligence. We show that head-fixed…
CTN: Jacob Macke
Zuckerman Institute - L5-084, 3227 Broadway, New York, NY, United States
Title: Building mechanistic models of neural computations with simulation-based machine learning
Abstract: Experimental techniques now make it possible to measure the structure and function of neural circuits at an unprecedented scale and…