More from Events Calendar
- Feb 10, 9:00 AM: Spring into Writing with Writing Together Online!
  Writing Together Online offers structured time to help you spring into writing and stay focused this semester. We offer writing sessions every workday, Monday through Friday. Join our daily 90-minute writing sessions and become part of a community of scholars who connect online, set realistic goals, and write together in the spirit of accountability and camaraderie. The program is open to all MIT students, postdocs, faculty, staff, and affiliates who are working on papers, proposals, thesis/dissertation chapters, application materials, and other writing projects. For more information and to register, go to this link or check the WCC website. Please spread the word and join with colleagues and friends.
  Register for Spring 2025 Writing Challenge 1. Choose the sessions you want to attend during Challenge 1 (February 10 through March 21):
  - Mondays: 9:00–10:30 AM
  - Tuesdays: 8:00–9:30 AM and 9:30–11:00 AM
  - Wednesdays: 9:00–10:30 AM
  - Thursdays: 8:00–9:30 AM and 9:30–11:00 AM
  - Fridays: 8:00–9:30 AM and 9:30–11:00 AM
  MIT students and postdocs who attend at least 5 sessions per challenge will be entered into a raffle for three $25 Amazon gift cards. The raffle will take place on Friday, March 21. The more you participate, the more entries you receive.
  Funding support for this program comes from the Office of Graduate Education.
- Feb 10, 10:00 AM: Thesis Defense - Sarah Greer
  Title: Geometrically-informed methods of wave-based imaging
  Speaker: Sarah Greer
- Feb 10, 12:00 PM: Language Conversation Exchange Lunch: meet, eat, and speak
  Let's meet, eat, and speak! Let's celebrate Lunar New Year together. Practice a language with a group of native speakers and other language learners, meet other language lovers, and learn about the LCE. Registration is here.
  Anyone who is affiliated with MIT can participate in the LCE. Our members include students, staff, visiting scientists and scholars, faculty members, and their spouses and partners. The lunch is sponsored by the ISO (International Students Office).
- Feb 10, 12:00 PM: NeuroLunch: Zinong Yang (Lewis Lab) & Guy Gaziv (DiCarlo Lab)
  Speaker: Zinong Yang (Lewis Lab)
  Title: Attentional failures after sleep deprivation represent moments of cerebrospinal fluid flow
  Abstract: Sleep deprivation rapidly disrupts cognitive function, and in the long term contributes to neurological disease. Why sleep deprivation has such profound effects on cognition is not well understood. Here, we use simultaneous fast fMRI-EEG to test how sleep deprivation modulates cognitive, neural, and fluid dynamics in the human brain. We demonstrate that after sleep deprivation, sleep-like pulsatile cerebrospinal fluid (CSF) flow events intrude into the awake state. CSF flow is coupled to attentional function, with high flow during attentional impairment. Furthermore, CSF flow is tightly orchestrated in a series of brain-body changes, including broadband neuronal shifts, pupil constriction, and altered systemic physiology, pointing to a coupled system of fluid dynamics and neuromodulatory state. The timing of these dynamics is consistent with a vascular mechanism regulated by neuromodulatory state, in which CSF begins to flow outward when attention fails, and flow reverses when attention recovers. The attentional costs of sleep deprivation may thus reflect an irrepressible need for neuronal rest periods and widespread pulsatile fluid flow.
  Speaker: Guy Gaziv (DiCarlo Lab)
  Title: Towards Noninvasive, Beneficial Modulation of Neural Population Activity via Natural Vision Perturbations
  Abstract: Precise control of neural activity is generally achieved through invasive techniques. In this talk, I will present our recent work investigating prospects within the primate ventral visual network for precisely modulating neural activity in high-level brain regions. These modulations are achieved through model-guided image perturbations that are adapted to arbitrary natural visual stimuli. In particular, these perturbations can selectively bias the activity of targeted high-level neurons upon presentation of the perturbed stimulus, with minimal effects on the activity of non-targeted neural sites. Motivated by model predictions on the viability of this approach, we tested it in three macaque IT sub-populations. We found strong quantitative agreement between the model-predicted and biologically realized modulation effects, allowing the injection of arbitrary neural population bias patterns. These results highlight that current machine-executable models of the ventral stream are now powerful enough to design non-invasive, vision-based neural interventions at neural site-level resolution.
- Feb 10, 12:10 PM: Tunnel Walk sponsored by getfit
  Want to get exercise mid-day but don't want to go outside? Join the tunnel walk for a 30-minute walk led by a volunteer through MIT's famous tunnel system. This walk may include stairs/inclines. Wear comfortable shoes. Free.
  Location details: Meet in the atrium by the staircase (location photo below). Tunnel Walk leaders will have a white flag they will raise at the meeting spot so you can find them.
  Prize drawing: Attend a walk and scan a QR code from the walk leaders to be entered into a drawing for a getfit tote bag at the end of the getfit challenge. The more walks you attend, the more entries you get. The winner will be drawn and notified at the end of April, and does not need to be a getfit participant.
  Disclaimer: Tunnel walks are led by volunteers. On the rare occasion when a volunteer isn't able to make it, we will do our best to notify participants. If we are unable to notify participants and a walk leader does not show up, we encourage you to walk as much as you feel comfortable doing. We recommend checking this calendar just before you head out. [As of Feb 7, this calendar is defaulting to the year 1899. Click "today" to be brought to the current month.]
  Getfit is a 12-week fitness challenge for the entire MIT community. These tunnel walks are open to the entire MIT community; you do not need to be a current getfit participant to join.
- Feb 10, 1:00 PM: Jarrod Hicks Thesis Defense: The role of texture in auditory scene analysis
  Title: The role of texture in auditory scene analysis
  Speaker: Jarrod Hicks
  Abstract: Everyday auditory scenes contain sounds from many sources. For example, when crossing the street, you might hear sounds produced from the rumble of passing cars, the chatter of pedestrians, and the rapid tick of crosswalk signals. To make sense of this complex mixture of sounds, the auditory system must separate the mixture into coherent perceptual representations that are likely to correspond to the underlying sources in the world. This process is known as auditory scene analysis. Although a rich body of work has probed auditory scene analysis with simple synthetic stimuli and revealed principles of perceptual organization, the extent to which these principles apply to real-world scenes with natural sounds remains unclear.
  This thesis empirically examines auditory scene analysis with realistic sounds. In particular, we study the perception of scenes containing a common class of environmental sounds known as "textures", investigating how the auditory system makes use of statistical structure to separate textures from other sources and how the underlying statistical representation both constrains and enables scene analysis. We first investigated the mechanisms of hearing in noise using real-world background "noise" textures. The results show that the auditory system estimates the properties of "noise" textures and stores them over time, using the resulting internal model to estimate other concurrent sounds. We then considered how concurrent sound texture sources are separated from each other. We found that auditory scene analysis with textures involves some principles identified in classical scene analysis work with simple sounds, but that these principles apply to the higher-order statistical representations that define natural textures.
  Together, the results reveal new aspects of auditory scene analysis with real-world sounds and clarify the role texture plays in everyday hearing. Our findings provide a bridge between the simple, synthetic stimuli studied historically and the rich complexity of real-world sounds.
  Zoom link: https://mit.zoom.us/j/97868598361?pwd=I5y3JhWyWSExh3SarlSpvVrBEvqRou.1