Past Events

March 07, 2024
Weekly Meeting
Continual Learning Working Group
Date & Time: Thursday, March 7, 1:30-2:40pm
Location: CEPSR 620
Zoom: https://columbiauniversity.zoom.us/j/94783759415?pwd=cTlDTDdCVk9vdEV0QzRKL0hKQW1Kdz09
Paper: https://arxiv.org/abs/1906.01076


February 29, 2024
Weekly Meeting
Continual Learning Working Group

Date & Time: Thursday, February 29, 1:30-2:40pm
Location: CEPSR 620
Zoom: https://columbiauniversity.zoom.us/j/97515072030?pwd=VGJONXR6bW9LVTN3VlZZSXdRZnNIdz09
Paper: https://arxiv.org/abs/2302.00487

February 22, 2024
Weekly Meeting
Continual Learning Working Group

Date & Time: Thursday, February 22, 1:30-2:40pm
Location: CEPSR 620 and https://columbiauniversity.zoom.us/j/3658091817?pwd=WHFJVzAwbDdQcFMzc2FreVplKzVMUT09
Paper: https://arxiv.org/abs/2302.00487

February 20, 2024
Generative AI, Free Speech & Public Discourse

Organized by: Columbia Engineering and the Knight First Amendment Institute

ARNI co-PI Kathy McKeown and ARNI faculty member Carl Vondrick participate in Panel 1: Empirical and Technological Questions: Current Landscape, Challenges, and Opportunities.

Date & Time: Tuesday, February 20, 1:00-2:15pm
Location: Forum Auditorium
Link: https://www.engineering.columbia.edu/symposium-generative-ai-free-speech-public-discourse
Article: https://www.engineering.columbia.edu/news/navigating-generative-ai-and-its-impact-future-public-discourse?utm_source=newsletter&utm_medium=email&utm_campaign=highlights030124

February 16, 2024

Animal Behavior Video Analysis Working Group

Title: Mapping the landscape of social behavior using high-resolution 3D tracking of freely interacting animals

Date & Time: Friday, February 16, 3 pm
Location: Zuckerman Institute, 3227 Broadway, Conf room: L5-084
Presenter: Ugne Klibaite, PhD

Harvard University, Department of Organismic & Evolutionary Biology (PI, Bence P. Ölveczky)

Abstract: Social interaction is a fundamental component of animal behavior. However, we lack tools to describe it with quantitative rigor, limiting our understanding of its principles and of the neuropsychiatric disorders, like autism, that perturb it. To address these limitations, my collaborators and I have developed a technique for high-resolution 3D tracking of freely interacting animals and their body-wide social touch patterns, solving the challenging subject occlusion and part assignment problems using 3D geometric reasoning, graph neural networks, and semi-supervised learning. Using this technology, I have collected and annotated over 34 million 3D postures in interacting rats, featuring five new monogenic autism models lacking reports of social behavioral phenotypes. I will introduce a novel multi-scale approach that I have used to identify a rich landscape of stereotyped interactions, synchrony, and body contact across strains. This deep phenotyping approach revealed a spectrum of changes in rat autism models and in response to amphetamine, and this framework has the potential to facilitate quantitative studies of social behaviors and their neurobiological underpinnings.

Join Zoom Meeting:
https://columbiauniversity.zoom.us/j/94848687512?pwd=d0d2L20wSUdZWGZ4dytuZ1YyaEt3QT09

Meeting ID: 948 4868 7512
Passcode: 446335

January 19, 2024

Animal Behavior Video Analysis Working Group

Title: Multimodal Learning from Pixels to People

Date & Time: Friday, January 19, 3 pm
Location: CSB 480 (Mudd Building, 500 W 120th Street)
Presenter: Carl Vondrick 

Abstract: People experience the world through modalities of sight, sound, words, touch, and more. By leveraging their natural relationships and developing multimodal learning methods, my research creates artificial perception systems with diverse skills, including spatial, physical, logical, and cognitive abilities, for flexibly analyzing visual data. This multimodal approach provides versatile representations for tasks like 3D reconstruction, visual question answering, and object recognition, while offering inherent explainability and excellent zero-shot generalization across tasks. By closely integrating diverse modalities, we can overcome key challenges in machine learning and enable new capabilities for computer vision, especially for the many upcoming applications where trust is required.

Join Zoom Meeting:
https://columbiauniversity.zoom.us/j/96127949475?pwd=TWxLa3A3a3lBRjdqbDBWMkRycHFMZz09

Meeting ID: 961 2794 9475
Passcode: 446335

December 01, 2023

Animal Behavior Video Analysis Working Group

Title: Precise quantification of natural behavior with computer vision

Date & Time: Friday, December 01, 3 pm
Location: Zuckerman Institute, 3227 Broadway, Conf room: L5-084

Abstract: To understand the neural control of movement, cognition, and social interaction, we need to precisely quantify motor behaviors. Deep learning tools now enable the extraction of meaningful behavioral signals from raw videos at high spatiotemporal resolution. These technologies are gaining increasing adoption in systems neuroscience and are transforming the field in many ways. We will provide an overview of the field, discuss the limitations of some standard approaches, and present some of our own work on pose tracking (keypoint detection) and, time permitting, behavioral segmentation (discovering discrete behavioral motifs). We look forward to exploring fresh perspectives on this important problem.

Join Zoom Meeting:
https://columbiauniversity.zoom.us/j/95557736296?pwd=V2tTNEVOellZMENGUDF5RXVwcUUyQT09

Meeting ID: 955 5773 6296
Passcode: 446335