Thursday, July 3, 2025
- All day — A Wider Horizon: How Katharine Dexter McCormick Changed the World and MIT
  On view in the Hayden Library Loft (Floor 1M), May 12 - September 30, 2025.
  Throughout her life, Katharine Dexter McCormick widened the horizons of what was possible for women. A suffragist, philanthropist, and scientist, she broke boundaries from an early age, becoming one of the first women to graduate from MIT. She later went on to fund McCormick Hall, the first on-campus dormitory for women at MIT.
  Learn more about the exhibit.
- All day — Exhibit now in IMES E25-310, from May 23 onward! Stop by to visit and learn more!
- 10:00 AM (6h) — Refracted Histories: 19th-c. Islamic Windows as a Prism into MIT's Past, Present, and Future
  February 26, 2025 - September 4, 2025.
  Hidden within MIT's Distinctive Collections, many architectural elements from the earliest days of the Institute's architecture program still survive as part of the Rotch Art Collection. Among the artworks that conservators salvaged was a set of striking windows of gypsum and stained glass, dating to the late 18th- to 19th-century Ottoman Empire. This exhibition illuminates the life of these historic windows, tracing their refracted histories from Egypt to MIT, their ongoing conservation, and the cutting-edge research they still prompt.
  The Maihaugen Gallery (14N-130) is open Monday through Thursday, 10am - 4pm, excluding Institute holidays.
- 1:00 PM (1h) — Akhilan Boopathy Thesis Defense: Towards High-Dimensional Generalization in Neural Networks
  Date & Time: July 3rd, 1 PM
  Location: 46-3002
  Zoom: https://mit.zoom.us/my/akhilan
  Title: Towards High-Dimensional Generalization in Neural Networks
  Abstract: Neural networks excel in a wide range of applications due to their ability to generalize beyond training data. However, their performance degrades on high-dimensional tasks without large-scale data, a challenge known as the curse of dimensionality. This thesis addresses this limitation by pursuing three key objectives aimed at understanding and improving neural network generalization.
  1. We investigate the scaling laws underlying generalization in neural networks, including double descent, a phenomenon in which, as a model's capacity or training data increases, the test error temporarily rises at a certain point before continuing to decrease. In particular, we have two goals: (1) a better understanding of when double descent can and cannot be empirically observed, and (2) a better understanding of scaling laws with respect to training time.
  2. Inductive bias refers to the set of assumptions a learning algorithm makes to predict outputs on inputs it has not encountered. We propose quantifying the amount of inductive bias a model requires to generalize well from a fixed amount of training data. By developing methods to measure inductive bias, we can assess how much information model designers need to incorporate into neural networks to improve their generalizability. This quantification can also guide the design of harder tasks that better test a model's generalization.
  3. Finally, we develop new methods to enhance neural network generalization, focusing on reducing the exponential number of training samples required for high-dimensional tasks. This involves creating algorithms and architectures that can learn effectively from limited data by incorporating stronger inductive biases. We focus on two inductive biases: (1) learning features of the training loss landscape correlated with generalization, and (2) using modular neural network architectures. We expect these techniques to improve generalization, particularly on high-dimensional tasks.
  Together, these contributions aim to deepen our theoretical understanding and develop practical tools for enabling neural networks to generalize effectively from limited data.
  Thesis Committee: Ila Fiete (supervisor), Leslie Kaelbling, Paul Liang
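The double-descent phenomenon described in the abstract can be illustrated with a minimal toy sketch (our own illustration, not code from the thesis): fitting minimum-norm least-squares models on random cosine features while sweeping the feature count past the interpolation threshold (where the number of features equals the number of training points), the regime where the classic test-error peak tends to appear.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: linear target with noise (all sizes are illustrative choices).
n_train, n_test, d = 40, 200, 5
X_train = rng.normal(size=(n_train, d))
X_test = rng.normal(size=(n_test, d))
w_true = rng.normal(size=d)
y_train = X_train @ w_true + 0.5 * rng.normal(size=n_train)
y_test = X_test @ w_true

def random_features(X, W, b):
    """Random cosine feature map, a common nonlinearity for such demos."""
    return np.cos(X @ W + b)

test_errors = {}
for n_feat in [10, 20, 40, 80, 160]:  # capacity sweep; 40 = interpolation threshold
    W = rng.normal(size=(d, n_feat))
    b = rng.uniform(0, 2 * np.pi, size=n_feat)
    Phi_tr = random_features(X_train, W, b)
    Phi_te = random_features(X_test, W, b)
    # Minimum-norm least-squares fit via the pseudoinverse; with
    # n_feat >= n_train this interpolates the training data exactly.
    coef = np.linalg.pinv(Phi_tr) @ y_train
    test_errors[n_feat] = float(np.mean((Phi_te @ coef - y_test) ** 2))
```

Plotting `test_errors` against feature count typically shows error rising toward the interpolation threshold and falling again beyond it; as the abstract notes, whether the peak is empirically visible depends on the setup, which is part of what the thesis investigates.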
- 2:45 PM (15m) — MIT@2:50 - Ten Minutes for Your Mind
  Ten minutes for your mind @2:50, every day at 2:50 pm in multiple time zones:
  Europa@2:50, EET, Athens, Helsinki (UTC+2) (7:50 am EST): https://us02web.zoom.us/j/88298032734
  Atlantica@2:50, EST, New York, Toronto (UTC-4): https://us02web.zoom.us/j/85349851047
  Pacifica@2:50, PST, Los Angeles, Vancouver (UTC-7) (5:50 pm EST): https://us02web.zoom.us/j/85743543699
  Almost everything works better again if you unplug it for a bit, including your mind. Stop by and unplug. Get the benefits of mindfulness without the fuss. @2:50 meets at the same time every single day for ten minutes of quiet together.
  No prerequisites, no registration needed.
  Visit the website to view all @2:50 time zones each day: at250.org or at250.mit.edu