6-2 Challenge to Measure the Power Distribution in Very-High-Temperature Core Environments

-Development of a Power Distribution Measurement Method Using Neutrons Leaked from an HTGR Core-


Fig.6-3 Sensitivity of the HTGR ex-core detectors to the in-core fuel

The sensitivity shows the contribution of neutrons generated in each fuel block to the detector signal when an ex-core detector is assumed to be placed near the pressure vessel in the HTTR geometry. The red cells are fuel blocks. The sensitivity is normalized so that its sum over the whole core is unity.

 


Fig.6-4 Concept of moving an ex-core detector

This is an example of an orbit along which an ex-core detector could be moved. Because the concept is based on a principle similar to that of computed tomography (CT), the ideal way to move the measurement detector is along a spiral orbit, as in a CT scanner.

 


The High Temperature Gas-cooled Reactor (HTGR) employs helium gas coolant to extract the fission energy generated in the reactor as thermal energy. The High Temperature Engineering Test Reactor (HTTR) was the first reactor in the world to achieve a reactor outlet coolant temperature of 950 ℃, on April 19, 2004. This high temperature is expected to open up many possibilities, including highly efficient power generation with a thermal efficiency of 50% and hydrogen production that contributes to carbon neutrality.

However, in the Light Water Reactors (LWRs) currently in commercial use, the temperature inside the core is only about 300 ℃, so a neutron detector can be inserted directly into the core. Such in-core detectors are used to measure the power distribution of each fuel assembly, enabling fuel management for efficient fuel burnup.

In contrast, the HTGR core is a very-high-temperature environment in which the temperature can reach up to 1000 ℃. Detectors therefore cannot be inserted into the core, and some of the operation and maintenance technologies developed for LWRs cannot be applied. Moreover, if the in-core power distribution could be measured during HTGR operation, a more economical HTGR design would become possible: in addition to efficient burnup through fuel management, improved accuracy of fuel-temperature estimation would allow the safety margins to be reduced reasonably.

This perspective led us to consider a method of measuring the power distribution in the core by observing the neutrons leaking from it with an ex-core detector. Fortunately, HTGRs have the advantage of a longer neutron flight path than LWRs because of the difference in neutron moderator (light water in an LWR versus graphite in an HTGR). Fig.6-3 shows the sensitivity of an ex-core detector to the in-core fuel of an HTGR, i.e., the contribution of neutrons generated in each fuel block to the detector signal; the sensitivity can be seen to extend to the center of the core. In an LWR, only the outermost fuel assemblies can be measured in this way.
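As a rough illustration of how such a sensitivity map can be used, the reading of a single ex-core detector can be modeled as a weighted sum of the power generated in each fuel block, with weights given by the normalized sensitivity of Fig.6-3. The short Python sketch below uses purely hypothetical numbers of fuel blocks, sensitivities, and powers; it is not the actual HTTR calculation.

import numpy as np

# Hypothetical size: 150 fuel blocks in the core (not the actual HTTR count).
n_blocks = 150
rng = np.random.default_rng(0)

# Hypothetical sensitivity of one ex-core detector position to each fuel block,
# i.e., the contribution of neutrons born in that block to the detector signal,
# normalized so that the sum over the whole core is unity (as in Fig.6-3).
sensitivity = rng.random(n_blocks)
sensitivity /= sensitivity.sum()

# Hypothetical block-wise power distribution (arbitrary units).
power = rng.uniform(0.5, 1.5, n_blocks)

# Linear forward model: the detector signal is the sensitivity-weighted sum
# of the power generated in every fuel block.
signal = sensitivity @ power
print(f"detector signal (arbitrary units): {signal:.3f}")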

Using this long range of ex-core detector sensitivity in the HTTR, we devised an inverse-analysis method that reconstructs the power distribution from many measurement points, based on a principle similar to computed tomography (CT), and we successfully demonstrated the principle numerically in the HTTR geometry. As shown in Fig.6-4, we currently assume that the detector is driven along a spiral orbit, as in CT.
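A minimal numerical sketch of this kind of CT-like inverse analysis is given below in Python. Detector readings taken at many positions along the orbit are stacked into a linear system d = S p, where S is the sensitivity matrix and p the block-wise power, and p is recovered by a regularized least-squares fit. The matrix, noise level, and solver here are illustrative assumptions, not the method actually used in the HTTR demonstration.

import numpy as np

rng = np.random.default_rng(1)
n_blocks = 150      # hypothetical number of fuel blocks
n_positions = 400   # hypothetical detector positions along the spiral orbit

# Hypothetical sensitivity matrix S: row i holds the normalized sensitivity of
# detector position i to each fuel block. In practice it would be obtained from
# neutron transport calculations for the HTTR geometry.
S = rng.random((n_positions, n_blocks))
S /= S.sum(axis=1, keepdims=True)

# "True" power distribution to be recovered, and noisy detector readings.
p_true = rng.uniform(0.5, 1.5, n_blocks)
d = S @ p_true + rng.normal(0.0, 1e-3, n_positions)

# Tikhonov-regularized least squares: minimize ||S p - d||^2 + lam ||p||^2.
lam = 1e-6  # illustrative regularization strength
A = np.vstack([S, np.sqrt(lam) * np.eye(n_blocks)])
b = np.concatenate([d, np.zeros(n_blocks)])
p_est, *_ = np.linalg.lstsq(A, b, rcond=None)

print("max relative error of reconstructed power:",
      np.max(np.abs(p_est - p_true) / p_true))

In a real measurement, the spiral orbit determines which rows of S are available, and the conditioning of S governs how finely the in-core power distribution can be resolved from the ex-core signals.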

The significance of this idea was recognized in FY2021, when the project was selected by MEXT as a Nuclear Energy System Research and Development Project (JPMXD0221459236), “Development of Nuclear Instrumentation System for Power Distribution Measurement of HTGR”; development toward practical application is now underway in collaboration among JAEA, ANSeeN Corporation, and Shizuoka University.

(Yuji Fukaya)

