Shunichi Koshimura, Ph.D. (Engineering), Professor, International Research Institute of Disaster Science; Founding Deputy Director, Co-creation Center for Disaster Resilience, Tohoku University × Susumu Date, Ph.D. (Engineering), Associate Professor, Applied Information Systems Research Division, Cybermedia Center, Osaka University × Hiroaki Kobayashi, Ph.D. (Engineering), Professor, Graduate School of Information Sciences; Special Adviser for Digital Innovation to the President; Special Assistant of Cyberscience Center, Tohoku University × Yusaku Ohta, Ph.D. (Science), Associate Professor, Research Center for Prediction of Earthquakes and Volcanic Eruptions, Graduate School of Science, Tohoku University × Kazunori Sudo, Executive Vice President, NEC Corporation

Supercomputers that save the country from catastrophic disasters
The true value of a real-time tsunami inundation forecast system

On March 11, 2011, the Great East Japan Earthquake, with a magnitude of 9.0, occurred off the coast of eastern Japan, and a devastating tsunami subsequently struck the Pacific coast of eastern Japan. The tsunami destroyed buildings and infrastructure across an area of more than 500 km², and the number of fatalities reached 19,000. However, neither the Japanese national government nor the various local governments were able to grasp the full picture of the impact immediately after the tsunami, and they were thus unable to respond promptly to the disaster.
11 years after…

Just before the clock struck midnight on 16 March 2022, a real-time tsunami inundation forecast system quietly sprang into action. The system estimates, in real time, maximum inundation depths, the starting time of inundation, building damage in coastal land areas, and more. Built for the national government, it runs on the NEC supercomputer SX series (hereafter "supercomputer SX") and was developed through an industry-academia collaboration between Tohoku University, Osaka University, Kokusai Kogyo Corporation, A2 Corporation, RTi-cast Corporation, and NEC Corporation. In operation since 2018, the system was developed to function 24/7, on constant standby for earthquakes. And on that very night of 16 March 2022, a magnitude 7.4 earthquake occurred off the coast of Fukushima Prefecture, causing major damage including the derailment of a Tohoku Shinkansen bullet train. We spoke to the developers about this initiative, like no other in the world, in which the system produces a report on the estimated tsunami damage in less than 30 minutes.

Minimising tsunami damage

Shunichi Koshimura
Ph.D. (Engineering)
Professor, International Research Institute of Disaster Science
Founding Deputy Director, Co-creation Center for Disaster Resilience
Tohoku University

What was the situation on the day of the Fukushima earthquake? Was it quite tense?

Shunichi Koshimura (hereafter referred to as "Koshimura"): I felt strong ground motion here in Sendai and was sure that a tsunami had been generated. That was the first day the system was actually triggered to estimate tsunami inundation and damage to infrastructure. As I had been involved in the development, I remember being on high alert. We shared the system's operation status and estimated results with our operation team, communicating over a special priority mobile line used in the event of disasters.

Susumu Date
Ph.D. (Engineering)
Associate Professor, Applied Information Systems Research Division, Cybermedia Center
Osaka University

Susumu Date (hereafter referred to as "Date"): I was in Osaka and didn't feel the earthquake. When I learnt via the emergency contact network that the system was getting ready to start up, I was just about to go to bed. It was faster than the earthquake alert on TV. I kept in touch with those in charge at Tohoku University and NEC to understand the situation, but I was very tense.


Hiroaki Kobayashi (hereafter referred to as "Kobayashi"): The system's results indicated that no damage from the tsunami was expected, so our operation team was stood down. At the time it felt much longer, because we did not know how things would turn out, but looking back at how long it actually took, it was remarkably short.

Please tell us how this system will be useful when a tsunami actually occurs. How is the information from the system different from the earthquake and tsunami warnings issued by the Japan Meteorological Agency (hereafter referred to as "JMA")?

Koshimura: JMA's warnings tell us whether a tsunami will occur, together with nearshore tsunami height estimates. Our system, by contrast, provides an early understanding of inundation on land and of the damage situation following a tsunami disaster. It is a system like no other in the world.

Yusaku Ohta (hereafter referred to as "Ohta"): The system is activated when JMA issues tsunami advisories or warnings, or when the magnitude of an earthquake is estimated to be 6.5 or greater. In addition to JMA data such as hypocenters and magnitudes, we also use coseismic fault information estimated from the coseismic displacement data observed every second by the Geospatial Information Authority of Japan using GNSS (Global Navigation Satellite System).
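
As a rough illustration of the trigger rule just described (not the production logic, whose interfaces are not public), a minimal Python sketch with hypothetical field names might look like this:

```python
# Minimal sketch of the activation rule described above.
# Field names are hypothetical; the production system's interfaces are not public.

MAGNITUDE_THRESHOLD = 6.5  # activate at an estimated magnitude of 6.5 or greater

def should_activate(jma_event: dict) -> bool:
    """Return True if the forecast system should be triggered."""
    tsunami_alert = jma_event.get("tsunami_alert")          # e.g. "advisory", "warning", or None
    magnitude = jma_event.get("estimated_magnitude", 0.0)   # JMA's magnitude estimate

    if tsunami_alert in ("advisory", "warning", "major_warning"):
        return True
    return magnitude >= MAGNITUDE_THRESHOLD

# The 16 March 2022 event (M7.4 with a tsunami advisory) would satisfy both conditions.
print(should_activate({"tsunami_alert": "advisory", "estimated_magnitude": 7.4}))  # True
```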

Hiroaki Kobayashi
Ph.D. (Engineering)
Professor, Graduate School of Information Sciences
Special Adviser for Digital Innovation to President
Special Assistant of Cyberscience Center
Tohoku University

Koshimura: Based on these data, the system first analyzes how the fault ruptured to estimate the initial condition of the tsunami, then simulates the tsunami's propagation and inundation, and finally estimates the damage it causes. For each municipality on the Pacific coast of Japan, it estimates potential human casualties and damage to buildings. We are extending the system to cover the Sea of Japan side as well.

The sequence of operations is short: the tsunami simulation itself takes about 4 minutes, and the whole process completes in less than 30 minutes from the onset of the earthquake, including the wait for the coseismic fault motion to end (i.e., the end of the earthquake) and the automatic preparation of the simulation results.
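
To make the staged workflow easier to follow, here is a schematic Python sketch. The stage names mirror the description above, but every function body is a hypothetical placeholder rather than the production code:

```python
# Schematic of the forecast pipeline described above.
# All function bodies are placeholder stubs, not the production implementation.

def estimate_coseismic_fault(gnss_displacements, jma_event):
    # In reality: invert second-by-second GNSS displacement data for a fault model.
    return {"length_km": 100.0, "width_km": 50.0, "slip_m": 3.0}

def fault_to_initial_condition(fault_model):
    # In reality: convert fault slip into the initial sea-surface displacement.
    return {"initial_sea_surface": None}

def simulate_propagation_and_inundation(initial_condition):
    # In reality: ~4 minutes of propagation/inundation simulation on supercomputer SX.
    return {"max_inundation_depth_m": {"example_city": 0.0}}

def estimate_damage(inundation):
    # In reality: per-municipality estimates of building damage and casualties.
    return {"example_city": {"buildings_damaged": 0, "estimated_casualties": 0}}

def run_forecast(gnss_displacements, jma_event):
    fault = estimate_coseismic_fault(gnss_displacements, jma_event)
    initial = fault_to_initial_condition(fault)
    inundation = simulate_propagation_and_inundation(initial)
    return estimate_damage(inundation)   # report prepared in under 30 minutes overall

print(run_forecast(gnss_displacements=[], jma_event={"estimated_magnitude": 7.4}))
```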

Kobayashi: The system runs on supercomputer SX, which is normally used as HPC infrastructure for academic research at the universities. When the activation criteria are met, any research calculations already running are temporarily suspended and the forecast system is launched. To guard against system failures, the same system runs at both Tohoku University and Osaka University for redundancy.

Enabling prompt responses in less than 20 minutes

Fortunately, there was no damage caused by the tsunami this time. However, in the unlikely event that damage is predicted, what kind of actions will be taken?

Koshimura: The national and local governments will use the results of the system as basic data for their disaster response.

In addition, as one of the countermeasures, initial response drills are conducted with a similar system in Kochi Prefecture, for example, assuming the anticipated Nankai Trough earthquake. In the event of an earthquake, the local government immediately sets up a task force meeting led by the governor, to be held within 60 minutes. Normally, little damage information is available at the first meeting, and instructions are simply given to confirm and share the damage situation; with our system, however, the estimates show from the outset where and how much damage there is. In other words, based on the system's results, the local government can immediately start providing medical care, arranging supplies, and sharing evacuation routes. In the drill, the governor has set a policy of "responding based on the results of the forecast until the actual situation is clear."

Yusaku Ohta
Ph.D. (Science)
Associate Professor,
Research Center for Prediction of Earthquakes and Volcanic Eruptions,
Graduate School of Science
Tohoku University

Ohta: From the viewpoint of my specialty, seismology, I start cross-checking various data and verifying the validity of the estimated coseismic fault plane.

How accurate was the estimate?

Koshimura: The forecast of sea-level fluctuations due to the tsunami came out a few tens of centimeters higher than the observations, varying by location. One of the reasons is that, for safety, the background tide level on which the calculations are based is intentionally set high: so as not to underestimate the scale of the damage, the tsunami height is calculated on top of the maximum tide level during the six hours after the earthquake.

If a tsunami actually reaches land, what matters is the accuracy of the inundation forecast itself, because we simulate in detail how the tsunami overtops seawalls and how it penetrates inland, based on precise topographic information.
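
To illustrate the safety margin described above with made-up numbers: the forecast is computed on top of the highest tide predicted in the six hours after the earthquake, rather than the tide at the moment of the quake, so that the total sea level is not underestimated. A minimal sketch:

```python
# Minimal sketch of the conservative tide baseline described above (illustrative numbers only).

def conservative_sea_level(simulated_tsunami_height_m, tide_forecast_m):
    """Add the simulated tsunami height to the maximum tide predicted
    within the 6 hours after the earthquake, not to the current tide."""
    background_tide = max(tide_forecast_m)
    return background_tide + simulated_tsunami_height_m

# Hourly tide prediction (m above datum) for the 6 hours after the earthquake.
tide_next_6h = [0.3, 0.5, 0.7, 0.9, 0.8, 0.6]
print(conservative_sea_level(1.2, tide_next_6h))  # 2.1 m, rather than 1.5 m at the current 0.3 m tide
```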

Moving from research to social contribution

Was it the experience of the 2011 Great East Japan Earthquake (3/11) that led to the development of this system?

Koshimura: I was at Tokyo Station on that day. I couldn't go home on the bullet train, so I left the station, rented a car, drove without any sleep, and managed to get back to Sendai the next morning. At that time, information about the disaster could only be obtained through radio news, and it was not at all clear how far the tsunami had spread or how much damage it had caused. As a tsunami researcher, I was keenly aware that we must first understand the full extent of the damage. I had been doing research on tsunami modelling and damage estimation, but it was largely focused on understanding tsunami phenomena, and I had not advanced it to the point where it could be used in the response immediately after a disaster.

Reproduction of the tsunami that struck Sendai City during the 2011 Great East Japan Earthquake. The same simulation algorithm is now implemented on SX-Aurora TSUBASA (Source: Tohoku University)

Ohta: At the time of 3/11, the magnitude estimated by JMA immediately after the event was 8.1, an underestimate compared with the actual magnitude of 9.0. We had in fact been researching an estimation method based on real-time GNSS that would not underestimate the magnitude, but since it was research from a purely scientific point of view, its readiness for social implementation was insufficient. Later, in joint research with the Geospatial Information Authority of Japan, we implemented the method we had developed into their system. Furthermore, with the help of Professor Koshimura, connecting this technology to tsunami simulations opened up the possibility of social implementation from the viewpoint of tsunami inundation as well. It was an opportunity for us to make up for our regrets over 3/11.

Kobayashi: The development dates back to 2012, one year after 3/11. At that time, I asked Professor Koshimura for his cooperation in thinking about applications for next-generation supercomputers that would be useful to society.

What we developed from there became the prototype of the current system, which began in 2014 as a project of the Ministry of Internal Affairs and Communications. We have been working with Prof. Koshimura and Prof. Ohta, with other professors of seismology and disaster engineering, and with people from Osaka University and NEC.

Date: I have been involved in the project since around 2014 as the person in charge of the practical side of the supercomputers. At that time, supercomputers were used mainly for calculations in academic research, and much of the general public did not understand what they were useful for.

That's why I thought it would be a very good idea to deploy simulations that are directly useful to society on supercomputers. I was not the one who actually decided to collaborate with Tohoku University, but I agreed that if we could contribute to society by making use of existing supercomputers in an emergency, we should do it.

What were some of the difficulties in development?

Koshimura: Each of us specialized in a different research field. To make the forecast technology usable in a real disaster, however, those fields had to be connected seamlessly into the functionality the forecast system required, and that was the most challenging part. Because we were researchers from different fields - science, engineering, and computational science - working together towards a single goal was quite a challenge, but this is also what we consider the most significant part of the project.

Ohta: There was a kind of extra-curricular club atmosphere to the activities at the time. Researchers and business people from various fields would get together in the evenings and hold meetings until around 8pm. Each field had its own non-negotiables, so there were a great many matters to consider.

Koshimura: I am good at setting targets, and the first thing I did was come up with the motto "Triple 10 Challenge": estimate the fault model within 10 minutes, and complete the simulation of tsunami propagation, inundation, and damage within 10 minutes at 10-meter resolution.

Kobayashi: When we, the system implementers, received such demanding targets, we accepted them at first. But we also went back and forth, asking whether the model could be rethought, or whether some parts of the calculation could be omitted to minimize the computation.

Koshimura: "Extra-curricular club activities" are still ongoing: in March 2018, a start-up company called RTi-cast was established with investment from Tohoku University Venture Partners, Kokusai Kogyo, A2, NEC, and others, which oversees operating, promoting this system, and developing new ones. We are also members of this company and trying a new challenge of social implementation.

Even if calculations only start once the earthquake has occurred, they can still be made in time.

What were the advantages of using supercomputers for this system?

Koshimura: Tsunami warnings issued by JMA are based on simulation results for more than 100,000 earthquake scenarios stored in a database; when an earthquake occurs, the results matching the conditions of that earthquake are drawn from the database. This is called the "database method". In contrast, we use supercomputers, which allow us to run the simulations at high speed. So we realized that we could use the observed earthquake information and crustal deformation information directly to make accurate predictions, and that we could run the simulation after the earthquake and still get results in real time. That was around 2012.

Real-time computation makes it easier to obtain results that are closer to reality than the database method. Until then, supercomputers had mainly been used to perform huge numbers of calculations or to solve difficult equations; no thought had been given to using them for real-time problems.
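
A toy contrast between the two approaches, with a deliberately simplified matching rule (nearest magnitude and epicenter) standing in for JMA's actual selection logic, and a stub standing in for the real-time simulation:

```python
# Toy contrast between the "database method" (pre-computed scenarios) and
# direct real-time simulation. The matching rule is a simplification, not JMA's procedure.

scenario_db = [
    {"magnitude": 8.0, "lat": 38.0, "lon": 142.5, "max_height_m": 4.0},
    {"magnitude": 9.0, "lat": 38.1, "lon": 142.9, "max_height_m": 10.0},
    # ... in reality, more than 100,000 pre-computed scenarios
]

def database_method(event):
    """Pick the stored scenario closest to the observed event."""
    return min(
        scenario_db,
        key=lambda s: abs(s["magnitude"] - event["magnitude"])
        + abs(s["lat"] - event["lat"]) + abs(s["lon"] - event["lon"]),
    )

def realtime_method(event, fault_from_gnss):
    """Simulate this specific event on the supercomputer (placeholder stub)."""
    return {"max_height_m": 0.0, "note": "computed directly from the observed fault"}

event = {"magnitude": 7.4, "lat": 37.7, "lon": 141.7}
print(database_method(event))                        # nearest pre-computed answer
print(realtime_method(event, fault_from_gnss=None))  # computed on demand
```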

So, in a way, by using supercomputers to foresee real-world situations faster than they unfold in reality, damage can be predicted quickly and human casualties can be prevented. Among supercomputers, NEC's products are characterized by their unique vector processors. What are their advantages over other supercomputers?

Kobayashi: Tsunami simulation is implemented efficiently on vector processors. Around 2015, at the beginning of our system's development, NEC's supercomputer needed only a fraction of the cores to achieve the same performance as the K computer, the fastest supercomputer in Japan at the time. Vector processors are also well suited to other problems in the category of "field calculations", such as computing fluids and waves travelling through solids.

Performance comparison with NEC's SX-ACE supercomputer (Courtesy of Tohoku University)
*The red line represents NEC's SX-ACE.
The chart shows the computation time when the same tsunami inundation simulation is executed; performance is higher when fewer cores are needed and the calculation time is shorter.

Recently, some supercomputers use graphics processing units (GPUs) that incorporate a very large number of cores, but for field calculations they do not necessarily deliver their full performance. Increasing the number of cores often increases the load on memory, which creates a bottleneck. Vector processors have faster data paths per core and can exploit their performance efficiently.
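
To make the "field calculation" pattern concrete, here is a deliberately simplified linear shallow-water time step over a 2-D grid in NumPy. It is a teaching sketch, not the production tsunami model, but the long, regular array sweeps are exactly the kind of structure that vector processors and their high memory bandwidth handle well:

```python
# Simplified linear shallow-water time step on a 2-D grid, illustrating the
# "field calculation" pattern (long, regular array sweeps) that vectorizes well.
# This is a teaching sketch, not the production tsunami model.
import numpy as np

g = 9.81                         # gravity (m/s^2)
nx, ny = 512, 512                # grid points
dx = 100.0                       # grid spacing (m)
h = 1000.0                       # uniform water depth (m); flat-bathymetry toy case
dt = 0.4 * dx / np.sqrt(g * h)   # CFL-limited time step (s)

eta = np.zeros((nx, ny))         # sea-surface elevation (m)
u = np.zeros((nx, ny))           # depth-averaged velocity, x (m/s)
v = np.zeros((nx, ny))           # depth-averaged velocity, y (m/s)
eta[nx // 2, ny // 2] = 1.0      # initial hump standing in for coseismic uplift

def step(eta, u, v):
    # Momentum: accelerate the flow down the sea-surface gradient.
    u[1:-1, :] -= g * dt * (eta[2:, :] - eta[:-2, :]) / (2 * dx)
    v[:, 1:-1] -= g * dt * (eta[:, 2:] - eta[:, :-2]) / (2 * dx)
    # Continuity: raise or lower the surface according to the flow divergence.
    eta[1:-1, 1:-1] -= h * dt * (
        (u[2:, 1:-1] - u[:-2, 1:-1]) + (v[1:-1, 2:] - v[1:-1, :-2])
    ) / (2 * dx)

for _ in range(100):
    step(eta, u, v)

print(f"max surface elevation after 100 steps: {eta.max():.3f} m")
```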

Date: Vector processors also have aspects that make it easier for beginners to speed up calculations compared with GPUs. In fact, some students were able to use vector processors in about two weeks, even though they did not know any programming languages. GPUs are also available on Osaka University's supercomputers, but we often see people finding it time-consuming to migrate their programs to GPUs, or struggling to get the best performance out of them.

When Osaka University's latest supercomputer was installed in 2021, the real-time tsunami inundation forecast system was ported smoothly by continuing to use the vector processor, and it immediately benefited from the high performance the vector processor delivers. It is possible to run this system on CPUs or GPUs, but matching the performance of the vector processor would be difficult.

Even with the vector processor, absolute power consumption is considerable, as it is a supercomputer with high computing power. However, when compared at the same program performance, the vector processor is more efficient and can be said to operate with less power than supercomputers using other processors.

Kobayashi: Of course, not everything can be covered by vector processors, and it is important to use the right computing platform for each individual application. We are also looking into the use of quantum computers, which have recently become a hot topic. Targeting the city of Kochi in Kochi Prefecture, we are attempting to find optimal evacuation routes using a method called "quantum annealing".

Ultimate Goal: “Zero Fatality” with supercomputers

Please tell us how you intend to have the system evolve in the future.

Koshimura: Our goal is to provide information that will save lives by raising the level of our forecasts beyond the current one. One idea is to use the full capability of the supercomputers to run multiple inundation scenario forecasts and identify the worst case, which would greatly increase the reliability of inundation damage forecasts. Currently, we report "how many buildings will be damaged", but in order to save lives, we need to be able to show that "it is safe up to this point". The biggest challenge is to ensure the reliability of the forecast.

Ohta: Currently, only one scenario is simulated on supercomputer SX; if it becomes possible to run, say, 1,000 scenarios, we will be able to estimate the worst disaster situation consistent with the information available at the time.
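
A minimal sketch of that ensemble idea: run many scenarios (represented below by random placeholder arrays) and aggregate them cell by cell into a worst-case envelope and exceedance fractions:

```python
# Minimal sketch of aggregating many scenario runs into a worst-case envelope.
# The scenario results here are random placeholders, not real simulations.
import numpy as np

rng = np.random.default_rng(0)
n_scenarios, nx, ny = 1000, 100, 100

# Each scenario would come from one inundation simulation on the supercomputer.
depths = rng.gamma(shape=1.5, scale=0.5, size=(n_scenarios, nx, ny))  # inundation depth (m)

worst_case = depths.max(axis=0)          # cell-wise worst-case inundation depth
exceed_2m = (depths > 2.0).mean(axis=0)  # fraction of scenarios exceeding 2 m per cell

print(f"worst-case depth anywhere: {worst_case.max():.2f} m")
print(f"share of cells where >10% of scenarios exceed 2 m: {(exceed_2m > 0.1).mean():.1%}")
```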

Koshimura: We would like to increase the reliability of our forecasts while reducing the time to provide information to 3 minutes. JMA releases tsunami warning information within 3 minutes, and we want to provide a forecast of inundation conditions in the same amount of time. This would be communicated to people in real time and lead to saving lives. We hope to achieve this within the next five years. Ultimately, our goal is to achieve "zero fatality".

We look forward to its realization. Finally, as a pioneer in the use of supercomputers, what is your message to researchers and executives?

Koshimura: Many researchers tend to think that their work ends when they publish their results in a journal paper. However, if they want their work to be genuinely useful to society, I would say they should keep working through to the final stage of social implementation, the point at which the technology actually serves people and society. I believe it is the job, especially of disaster researchers, to go that far.

We consider our efforts to be part of "real-time disaster science”. We should be able to do something similar with volcanic eruptions, floods, and many other types of disasters.

Date: Up to 15% of the computing resources of Osaka University's supercomputers can be used by private companies for industrial purposes. While some companies are eager to use them, many people seem to feel that supercomputers are special and unapproachable. If you are interested and would like to know whether they could be used for a particular purpose, please do consult with us.

As was the case with the real-time tsunami inundation forecast system, connecting real-world data with the capabilities of a supercomputer could lead to the emergence of completely new services. I expect companies to propose novel ways of using supercomputers that are not bound by conventional ideas.

We interviewed Kazunori Sudo, NEC's Executive Vice President in charge of the system platform business, who oversees the development and maintenance of servers, supercomputers, and other computers, about the wide range of application possibilities that vector supercomputers hold.

Following the Fukushima earthquake, the "real-time tsunami inundation forecast system" running on NEC's supercomputer actually went into operation.

Kazunori Sudo
Executive Vice President
NEC Corporation

Kazunori Sudo (hereafter referred to as "Sudo"): I would like to express my deepest sympathies to all those who suffered damage from the earthquake that struck off the coast of Fukushima Prefecture in 2022, and I wish for the earliest possible recovery.

It has been a few years since the system was first introduced, and we are relieved that it has fully demonstrated its value and fulfilled its role. One could say that the times when the real-time tsunami inundation forecast system is not in operation are a sign of peace and tranquillity; but since Japan is one of the most earthquake-prone countries in the world, I am strengthening my resolve to contribute to the safety of as many people as possible with our advanced technology.

Are there other examples, like this system, of NEC's supercomputers being used for social contribution?

Sudo: While it is true that the main application of supercomputers is scientific and technological calculation, as the information society develops rapidly and the world changes drastically, examples are emerging where supercomputers are used to predict social problems from the vast and varied information being generated. For example, in the field of natural disasters, we have developed a service with Mitsui Kyodo Construction Consultants, Inc. that uses a supercomputer to predict river flooding six hours in advance, taking advantage of the features of NEC's vector supercomputers.

What are the applications in which vector-type supercomputers will show their strength? Are there any potential business applications?

Sudo: Vector supercomputers excel in large-scale matrix calculations that cannot be handled by general-purpose CPUs, and typical applications include fluid and electromagnetic wave analysis and structural analysis for scientific and technical calculations. In the business domain, the supercomputer has also recorded the fastest processing speed in financial benchmarks, and has been used to optimize recommendations in electronic commerce (EC).

In order to expand the application areas of vector supercomputers, which are unique in the world, we are investigating and proposing combinations of applications that are compatible with vector supercomputers. To further broaden the base of vector processors, an idea is to publicly solicit various applications in the form of a hackathon. In the past, supercomputers were very expensive and difficult to reach, but now there are supercomputer systems starting at around 2 million yen, so I think they are becoming easier to deploy.

As the areas in which supercomputers are used continue to expand, we will not offer only vector supercomputers in the future. There are various types of processors in the world, each with its own characteristics, such as general-purpose CPUs, GPUs that excel at deep learning for AI, and IPUs that specialize in AI. It is therefore important to combine them according to the application. This concept is called heterogeneous computing, and we will continue to promote it.

We are also currently focusing on quantum computing. In addition to promoting the development of quantum annealing simulators and hardware suitable for "combinatorial optimization problems" such as delivery route search and personnel allocation, we are also working on gate-model quantum computers that can be utilized across a wide range of applications.