Multi-resource-cost Optimization for Neural Network Models Working Group (NNMS): Simon Laughlin
Zuckerman Institute - L3-079, 3227 Broadway, New York, NY, United States
Title: Neuronal energy consumption: basic measures and trade-offs, and their effects on efficiency
Zoom: https://columbiauniversity.zoom.us/j/98299154214?pwd=1J3J0lEpF6XdqHkHy02c7LuD6xUWx2.1
Continual Learning Working Group: Amogh Inamdar
CSB 488
Title: Taskonomy: Disentangling Task Transfer Learning
Abstract: TBD
Link: http://taskonomy.stanford.edu/taskonomy_CVPR2018.pdf
CTN: Brenden Lake
Zuckerman Institute - L3-079, 3227 Broadway, New York, NY, United States
Title: Meta-learning for more powerful behavioral modeling
Abstract: Two modeling paradigms have historically been in tension: Bayesian models provide an elegant way to incorporate prior knowledge, but they make simplifying and constraining assumptions; on the other hand, neural networks provide great modeling flexibility, but they make it difficult to incorporate prior knowledge. Here I describe how to…
Continual Learning Working Group: Lindsay Smith
CEPSR 620, Schapiro, 530 W. 120th St
Title: A Practitioner’s Guide to Continual Multimodal Pretraining
Reading: https://arxiv.org/pdf/2408.1447
Zoom: https://columbiauniversity.zoom.us/j/97176853843?pwd=VLZdh6yqHBcOQhdf816lkN5ByIpIsF.1
CTN: Benjamin Grewe
Zuckerman Institute - L5-084, 3227 Broadway, New York, NY, United States
Title: Target Learning rather than Backpropagation Explains Learning in the Mammalian Neocortex
Abstract: Modern computational neuroscience presents two competing hypotheses for hierarchical learning in the neocortex: (1) deep learning-inspired approximations of the backpropagation algorithm, where neurons adjust synapses to minimize error, and (2) target learning algorithms, where neurons reduce the feedback required to achieve a desired activity.…
Continual Learning Working Group: Yasaman Mahdaviyeh
CEPSR 620, Schapiro, 530 W. 120th St
Title: Meta Continual Learning Revisited: Implicitly Enhancing Online Hessian Approximation via Variance Reduction
Reading: https://openreview.net/pdf?id=TpD2aG1h0D
Zoom: https://columbiauniversity.zoom.us/j/97176853843?pwd=VLZdh6yqHBcOQhdf816lkN5ByIpIsF.1
ARNI Annual Retreat
Faculty House, 64 Morningside Dr
General Agenda
October 21st, Day 1, 8:45am to 5pm (Breakfast and Lunch Provided):
Opening
3 Keynote Speakers from ARNI Faculty
Research Brainstorming and Discussions
Project/Student Poster Session
Education and Broader Impact Discussions
October 22nd, Day 2, 9am to 1pm (Breakfast and Lunch Provided):
1 Keynote Speaker
Brainstorming and Discussion on Collaborations & Knowledge…
CTN: Tatiana Engel
Zuckerman Institute - L5-084, 3227 Broadway, New York, NY, United States
Title: Unifying neural population dynamics, manifold geometry, and circuit structure
Abstract: Single neurons show complex, heterogeneous responses during cognitive tasks, often forming low-dimensional manifolds in the population state space. Consequently, it is widely accepted that neural computations arise from low-dimensional population dynamics, while attributing functional properties to individual neurons is impossible. I will present recent work from…
CTN: Naureen Ghani
Zuckerman Institute - L5-084, 3227 Broadway, New York, NY, United States
Title: Mice wiggle a wheel to boost the salience of low visual contrast stimuli
Abstract: From the Welsh tidy mouse to the New York City pizza rat, movement belies rodent intelligence. We show that head-fixed mice develop an active sensing strategy while performing a visual perceptual decision-making task (The International Brain Laboratory, 2021). Akin to humans shaking a computer mouse…
CTN: Jacob Macke
Zuckerman Institute - L5-084, 3227 Broadway, New York, NY, United States
Title: Building mechanistic models of neural computations with simulation-based machine learning
Abstract: Experimental techniques now make it possible to measure the structure and function of neural circuits at an unprecedented scale and resolution. How can we leverage this wealth of data to understand how neural circuits perform computations underlying behaviour? A mechanistic understanding will require models that…
ARNI Seminar Series Kick Off: Speaker Jim DiCarlo
Zuckerman Institute - L7-119, 3227 Broadway, New York, NY, United States
Title: Do contemporary, machine-executable models (aka digital twins) of the primate ventral visual system unlock the ability to non-invasively, beneficially modulate high level brain states?
Abstract: In this talk, I will first briefly review the story of how neuroscience, cognitive science and computer science (“AI”) converged to create specific, image-computable, deep neural network models intended…
Continual Learning Working Group: Brainstorming
CEPSR 620, Schapiro, 530 W. 120th St
Zoom Link: https://columbiauniversity.zoom.us/j/97176853843?pwd=VLZdh6yqHBcOQhdf816lkN5ByIpIsF.1