  • CTN: Benjamin Grewe

    Zuckerman Institute - L5-084 3227 Broadway, New York, NY, United States

    Title: Target Learning rather than Backpropagation Explains Learning in the Mammalian Neocortex

    Abstract: Modern computational neuroscience presents two competing hypotheses for hierarchical learning in the neocortex: (1) deep learning-inspired approximations of the backpropagation algorithm, where neurons adjust synapses to minimize error, and (2) target learning algorithms, where neurons reduce the feedback required to achieve a desired activity.…

  • Continual Learning Working Group: Yasaman Mahdaviyeh

    CEPSR 620, Schapiro, 530 W. 120th St

    Title: Meta Continual Learning Revisited: Implicitly Enhancing Online Hessian Approximation via Variance Reduction

    Reading: https://openreview.net/pdf?id=TpD2aG1h0D

    Zoom: https://columbiauniversity.zoom.us/j/97176853843?pwd=VLZdh6yqHBcOQhdf816lkN5ByIpIsF.1

  • ARNI Annual Retreat

    Faculty House, 64 Morningside Dr

    General Agenda

    October 21st, Day 1, 8:45am to 5pm (Breakfast and Lunch Provided): Opening, 3 Keynote Speakers from ARNI Faculty, Research Brainstorming and Discussions, Project/Student Poster Session, Education and Broader Impact Discussions

    October 22nd, Day 2, 9am to 1pm (Breakfast and Lunch Provided): 1 Keynote Speaker, Brainstorming and Discussion on Collaborations & Knowledge…

  • CTN: Tatiana Engel

    Zuckerman Institute - L5-084 3227 Broadway, New York, NY, United States

    Title: Unifying neural population dynamics, manifold geometry, and circuit structure

    Abstract: Single neurons show complex, heterogeneous responses during cognitive tasks, often forming low-dimensional manifolds in the population state space. Consequently, it is widely accepted that neural computations arise from low-dimensional population dynamics, while attributing functional properties to individual neurons is impossible. I will present recent work from…

  • CTN: Naureen Ghani

    Zuckerman Institute - L5-084 3227 Broadway, New York, NY, United States

    Title: Mice wiggle a wheel to boost the salience of low visual contrast stimuli

    Abstract: From the Welsh tidy mouse to the New York City pizza rat, movement belies rodent intelligence. We show that head-fixed…

  • CTN: Jacob Macke

    Zuckerman Institute - L5-084 3227 Broadway, New York, NY, United States

    Title: Building mechanistic models of neural computations with simulation-based machine learning

    Abstract: Experimental techniques now make it possible to measure the structure and function of neural circuits at an unprecedented scale and…

  • ARNI Seminar Series Kick Off: Speaker Jim DiCarlo

    Zuckerman Institute - L7-119 3227 Broadway, New York, NY, United States

    Title: Do contemporary, machine-executable models (aka digital twins) of the primate ventral visual system unlock the ability to non-invasively, beneficially modulate high-level brain states?

    Abstract: In this talk, I…

  • CTN: Tanya Sharpee

    Zuckerman Institute - L5-084 3227 Broadway, New York, NY, United States

    Seminar Time: 11:30am

    Date: 11/8/2024

    Location: JLG, L5-084

    Host: Krishan Kumar

    Title: Building mechanistic models of neural computations with simulation-based machine learning

  • Continual Learning Working Group: Nikita Rajaneesh

    CEPSR 6LW4, Computer Science Department, 500 West 120th Street

    Title: Wandering Within a World

    A discussion of Wandering Within a World: Online Contextualized Few-Shot Learning. This 2021 paper by our very own Rich Zemel leverages contextual information in a…

  • CTN: Catherine Hartley

    Zuckerman Institute - L5-084 3227 Broadway, New York, NY, United States

    Title: TBD

    Abstract: TBD

  • CTN: Seminar Speaker Alessandro Ingrosso

    Zuckerman Institute - L3-079 3227 Broadway, New York, NY, United States

    Title: Statistical mechanics of transfer learning in the proportional limit

    Abstract: Transfer learning (TL) is a well-established machine learning technique to boost the generalization performance on a specific (target) task…