So much has changed in the 15 years since Kaiming He was a PhD student.
“When I was doing my PhD, there was a high wall between different disciplines and subjects, and there was even a high wall within computer science,” He says. “The guy sitting next to me could be doing things that I completely couldn’t understand.”
In the seven months since he joined the MIT Schwarzman College of Computing as the Douglas Ross (1954) Career Development Professor of Software Technology in the Department of Electrical Engineering and Computer Science, He says he is experiencing something that, in his opinion, is “very rare in human scientific history”: a lowering of the walls that extends across different scientific disciplines.
“There is no way I could ever understand high-energy physics, chemistry, or the frontier of biology research, but now we are seeing something that can help us break down these walls,” He says, “and that is the creation of a common language that has been found in AI.”
Building the AI bridge
According to He, this shift began in 2012 in the wake of the “deep learning revolution,” the point when it was realized that this set of machine-learning methods based on neural networks was so powerful that it could be put to much greater use.
“At this point, computer vision, helping computers to see and perceive the world as if they were human beings, began growing very rapidly, because as it turns out you can apply this same methodology to many different problems and many different areas,” says He. “So the computer vision community quickly grew really large, because these different subtopics were now able to speak a common language and share a common set of tools.”
From there, He says, the trend began to expand to other areas of computer science, including natural language processing, speech recognition, and robotics, creating the foundation for ChatGPT and other progress toward artificial general intelligence (AGI).
“All of this has happened over the past decade, leading us to a new emerging trend that I am really looking forward to, and that is watching AI methodology propagate to other scientific disciplines,” says He.
One of the most famous examples, He says, is AlphaFold, an artificial intelligence program developed by Google DeepMind that predicts protein structure.
“It’s a very different scientific discipline, a very different problem, but people are also using the same set of AI tools, the same methodology, to solve these problems,” He says, “and I think that’s just the beginning.”
The future of AI in science
Since coming to MIT in February 2024, He says he has talked to professors in almost every department. Some days he finds himself in conversation with two or more professors from very different backgrounds.
“I definitely don’t fully understand their area of research, but they will just introduce some context and then we can begin to discuss deep learning, machine learning, [and] neural network models in their problems,” He says. “In this sense, these AI tools are like a common language between these scientific areas: the machine-learning tools ‘translate’ their terminology and ideas into terms that I can understand, and then I can learn their problems, share my experience, and sometimes propose solutions or opportunities for them to explore.”
Expanding to different scientific disciplines has significant potential, from using video analysis to predict weather and climate trends to expediting the research cycle and reducing costs in new drug discovery.
While AI tools provide a clear benefit to the work of He’s scientist colleagues, He also notes the reciprocal effect they can have, and have had, on the creation and advancement of AI.
“Scientists provide new problems and challenges that help us continue to evolve these tools,” says He. “But it is also important to remember that many of today’s AI tools stem from earlier scientific areas. For instance, artificial neural networks were inspired by biological observations, and diffusion models for image generation were motivated by the physics concept of diffusion.”
“Science and AI are not isolated subjects. We have been approaching the same goal from different perspectives, and now we are getting together.”
And what better place for them to come together than MIT.
“It isn’t surprising that MIT can see this change sooner than many other places,” He says. “[The MIT Schwarzman College of Computing] created an environment that connects different people and lets them sit together, talk together, work together, and exchange their ideas while speaking the same language, and I’m seeing this begin to happen.”
As for when the walls will fully come down, He notes that this is a long-term investment that won’t happen overnight.
“A long time ago, computers were considered high tech and you needed specific knowledge to understand them, but now everyone is using a computer,” He says. “I expect that in 10 or more years, everyone will be using some form of AI in some way for their research; it will just be one of their basic tools, their basic language, and they can use AI to solve their problems.”