On 26 November 2019, AAAS announced the leading scientists elected as 2019 Fellows. Professor Jinchao Xu, the CCMA director, was among the seven mathematicians worldwide named as 2019 AAAS Fellows. He was honored at the 2020 AAAS Annual Meeting, held this February in Seattle, Wash.


Steven Chu, Chair of the Board of the American Association for the Advancement of Science (AAAS), inducts Prof. Jinchao Xu (left) as a member of the 2019 class of AAAS Fellows during the 2020 AAAS Annual Meeting, held this February in Seattle, Wash. Photo courtesy of Robb Cohen Photography & Video.

During the meeting, Jinchao was invited to give a presentation at the business meeting of the AAAS Mathematics Section (Section A) on the research that was cited as part of his nomination. Below are excerpts from his speech.

Good morning, everybody!
My name is Jinchao Xu and I am from Penn State, where I have been a faculty member in the Department of Mathematics for more than 30 years. I am greatly honored to be a newly elected AAAS Fellow, and I am especially happy to have the chance here to talk a little bit about my research.
I call myself a computational mathematician. I study mathematical algorithms for solving problems arising in scientific and engineering computing, including the theory, development, and application of finite element methods, multigrid methods, and domain decomposition methods.
On one hand, I consider myself a pure analyst, developing theories and algorithms from a purely mathematical viewpoint, without always worrying whether they are practically relevant. On the other hand, I work with domain scientists from different fields to develop algorithms, and actual numerical software, to solve very practical problems, without worrying too much about mathematical rigor.
One field that I work on is numerical methods for solving algebraic systems of equations. Mathematically, it sounds very simple: given a big matrix A, how do we solve the linear system Ax=b? Essentially everybody knows from high school how to solve this problem by Gaussian elimination.
The challenge here is of course the cost: how much time will it take to solve on a laptop or on a supercomputer?
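Since everybody knows the algorithm, it is worth seeing just how simple, and how expensive, it is. Below is a minimal NumPy sketch of Gaussian elimination with partial pivoting; it is not from the talk, and the test problem at the end is an arbitrary illustrative choice. The forward-elimination loop is where the O(n^3) cost lives.

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.

    A direct method: forward elimination reduces A to upper-triangular
    form, then back substitution recovers x. The cost is O(n^3) flops.
    """
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)

    # Forward elimination: the triple loop dominates, giving O(n^3) work.
    for k in range(n - 1):
        # Partial pivoting: bring the largest pivot up for stability.
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]

    # Back substitution: only O(n^2) work.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# Quick check against NumPy's built-in direct solver on a small,
# well-conditioned (diagonally dominant) random system.
A = np.random.rand(50, 50) + 50 * np.eye(50)
b = np.random.rand(50)
print(np.allclose(gaussian_elimination(A, b), np.linalg.solve(A, b)))
```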
In fact, Gaussian elimination is the algorithm that has been used to rank the top 500 supercomputers in the world. By this measure, the fastest computer in the world so far is Summit, a supercomputer developed by IBM for use at Oak Ridge National Laboratory.
Guess how long it would take this fastest computer in the world to solve Ax=b with, say, one million unknowns? According to the report, about 2 seconds! But how about a larger system? Say, 10 million? 1 billion? Since the complexity of Gaussian elimination scales like the cube of the number of unknowns, 10 million would take about half an hour, which is still bearable, but 1 billion would take more than 60 years! As you see, the linear system Ax=b is not so easy to solve!
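The arithmetic behind those numbers is simple: with cubic scaling, multiplying the problem size by 10 multiplies the time by 1,000. A back-of-the-envelope script, anchored to the reported 2 seconds for one million unknowns (all timings here are rough illustrations, not benchmarks):

```python
# O(n^3) scaling estimate, anchored to ~2 s at n = 1e6.
t_ref, n_ref = 2.0, 1e6
for n in (1e6, 1e7, 1e9):
    t = t_ref * (n / n_ref) ** 3
    print(f"n = {n:.0e}: about {t:,.0f} seconds ({t / 3.15e7:,.1f} years)")
# n = 1e+06: about 2 seconds          (0.0 years)
# n = 1e+07: about 2,000 seconds      (0.0 years)  ~ half an hour
# n = 1e+09: about 2,000,000,000 s    (63.5 years) ~ more than 60 years
```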
Now the question is: can we do better for algebraic systems that have special mathematical structures, say, equations from the discretization of models for elasticity, fluids, or electromagnetic fields? The answer is, of course, yes.
Take a linear algebraic system from the discretization of an elasticity equation: we can use a special type of method, called the multigrid method, to solve it. Using this method, it takes less than 1 second on a laptop to solve for 1 million unknowns! And since the complexity of multigrid scales linearly, 10 million would take only about 10 seconds, and 1 billion less than half an hour, on a laptop! From 60 years on a supercomputer to half an hour on a laptop is obviously quite an improvement! This is, of course, the performance of multigrid in a rather ideal situation. It has been a big challenge for years to make multigrid algorithms work for more practical and more complicated problems.
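The talk does not give the details of the method, but the idea is easy to sketch in one dimension. Below is a minimal, illustrative NumPy V-cycle for the 1D Poisson equation with zero boundary conditions; the weighted-Jacobi smoother, transfer operators, and test problem are standard textbook choices, not taken from the elasticity application. Each level does O(n) work, and the levels shrink geometrically, which is where the linear complexity comes from.

```python
import numpy as np

def apply_A(u, h2):
    """Matrix-free application of the 1D Poisson operator
    (1/h^2) * tridiag(-1, 2, -1) on the interior unknowns."""
    Au = 2.0 * u
    Au[:-1] -= u[1:]
    Au[1:] -= u[:-1]
    return Au / h2

def v_cycle(u, f, h2, nu=2):
    """One multigrid V-cycle for -u'' = f with zero Dirichlet BCs,
    n = 2^k - 1 interior points and mesh size h (h2 = h^2)."""
    n = len(u)
    if n <= 3:
        # Coarsest grid: solve the tiny system directly.
        A = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h2
        return np.linalg.solve(A, f)

    # Pre-smoothing: weighted Jacobi damps the oscillatory error modes.
    for _ in range(nu):
        u = u + (2 / 3) * (h2 / 2) * (f - apply_A(u, h2))

    # Restrict the residual to the coarse grid (full weighting).
    r = f - apply_A(u, h2)
    rc = 0.25 * (r[0:-2:2] + 2 * r[1:-1:2] + r[2::2])

    # Recursively solve the coarse-grid error equation.
    ec = v_cycle(np.zeros_like(rc), rc, 4 * h2, nu)

    # Prolong the correction by linear interpolation and add it.
    e = np.zeros(n)
    e[1::2] = ec
    e[0::2] = 0.5 * (np.concatenate(([0.0], ec)) + np.concatenate((ec, [0.0])))
    u = u + e

    # Post-smoothing.
    for _ in range(nu):
        u = u + (2 / 3) * (h2 / 2) * (f - apply_A(u, h2))
    return u

# Each V-cycle cuts the error by a fixed factor independent of n,
# so the total cost to solve is O(n).
n = 2**10 - 1
h2 = 1.0 / (n + 1) ** 2
x = np.linspace(0, 1, n + 2)[1:-1]
f = np.pi**2 * np.sin(np.pi * x)          # exact solution: sin(pi x)
u = np.zeros(n)
for _ in range(8):
    u = v_cycle(u, f, h2)
print(np.max(np.abs(u - np.sin(np.pi * x))))  # down to discretization error
```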
One of my major research interests is the development of efficient mathematical algorithms, such as multigrid, for complicated problems in more practical settings. Let me give you an example. Around 2006, in collaboration with a colleague, Ralf Hiptmair, from ETH, I developed a multilevel method for solving Maxwell's equations. I reported our method at a seminar in 2007 at Lawrence Livermore National Laboratory. Some colleagues there implemented our algorithm to solve magnetohydrodynamics problems arising from the modeling of fusion energy, and they found that, for a problem with 70 million unknowns, our new algorithm was 25 times faster than the best algorithms used in the DOE labs at the time.
In fact, it can be orders of magnitude faster for larger-scale simulations. These results were included in a DOE report that was used around 2008 to lobby Congress for funding to support fundamental research. This has been quite an interesting experience. We actually developed our algorithm by pure mathematical manipulation. It is very satisfying to see that basic mathematical research can have an impact on practical applications.
In practice, however, not all efficient algorithms are developed by mathematicians like us. One remarkable example is the deep learning algorithms in artificial intelligence, a hot topic these days. Deep learning has been developed mostly by computer scientists.
We mathematicians get curious about why and how these techniques work so well. I got curious about deep learning myself recently, since the core technology in deep learning is something called "deep neural networks" or "multi-layer networks," which sounds very much like the "multilevel grid" algorithms that I have studied for decades.
After months of study, I made a very interesting discovery: we can make a tiny modification of a classic multigrid algorithm to turn it directly into a new, and actually more efficient, class of convolutional neural networks, which are the core technology in AI these days. Since I know how the multigrid method works mathematically and have developed all kinds of theories for it, I hope that my new finding will open doors for mathematically understanding and improving deep learning algorithms. I have actually started to collaborate with some colleagues from the Microsoft AI lab next door here. I hope this will lead to something significant, both in the theory and in the applications of deep learning.
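He does not spell out the construction in the speech, but the structural kinship is easy to illustrate in one dimension: multigrid's full-weighting restriction is exactly a stride-2 convolution with a fixed kernel, while a convolutional network applies the same operation with learned kernels. The small NumPy check below is our own illustration of this analogy, not his actual construction.

```python
import numpy as np

def conv1d_stride2(x, w):
    """1D correlation with stride 2 -- the basic downsampling
    operation in a convolutional neural network layer."""
    k = len(w)
    return np.array([np.dot(w, x[i:i + k])
                     for i in range(0, len(x) - k + 1, 2)])

# Full-weighting restriction in multigrid is a stride-2 convolution
# with the FIXED kernel [1, 2, 1]/4; a CNN learns such kernels instead.
r = np.random.rand(9)                      # a fine-grid residual
restricted = 0.25 * (r[0:-2:2] + 2 * r[1:-1:2] + r[2::2])
convolved = conv1d_stride2(r, np.array([1, 2, 1]) / 4)
print(np.allclose(restricted, convolved))  # True
```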
