Multi-scale Simulations on Preferential Absorption Behaviors of Cavities in BCC Fe
Wang, Yuhao
2025
Abstract
Structural materials degrade under intense particle irradiation due to ongoing microstructural evolution, a critical challenge in nuclear energy applications. Ferritic-martensitic steels stand out for their strong resistance to radiation-induced swelling compared with austenitic steels, making them primary candidates for next-generation nuclear reactors. However, advanced fast reactors and future fusion systems operate in the temperature range of maximum swelling propensity, heightening the need to understand how cavities grow and absorb point defects. Prior research has revealed that the classic thermal-based criterion in the critical bubble model (CBM) may lose accuracy outside the intermediate-temperature regime. An alternative bias-driven criterion has been proposed, emphasizing the concept of cavity bias and covering a broader temperature window. This thesis advances those ideas with a multi-scale simulation scheme, integrating atomistic calculations and kinetic Monte Carlo (KMC) simulations to elucidate preferential absorption by cavities in BCC Fe. First, atomistic calculations were used to compute interaction distances and capture radii for single self-interstitial atom (SIA) and vacancy defects around a representative void. A homogenization scheme accounted for the one-dimensional diffusion paths of larger SIA clusters, enabling effective bias evaluations at different void sizes and temperatures. When helium was added, molecular statics (MS) calculations showed that gas pressure greatly influenced the local interaction energy landscape, sometimes leading to unexpected results, especially at high He-to-vacancy ratios. Under these conditions, purely static methods implied unphysical trends in SIA absorption, contradicting established cavity-bias assumptions. The discrepancy stemmed from a simplified spherical approximation and the assumption that the SIA relaxes immediately to its most favorable dumbbell orientation.
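The capture-radius and bias quantities described above can be sketched with the standard rate-theory relations; the numerical capture radii and void number density below are purely illustrative assumptions, not values taken from the thesis, and the bias is defined here simply as the ratio of interstitial to vacancy capture radii.

```python
import math

def void_sink_strength(r_capture_m, number_density_m3):
    """Standard rate-theory sink strength of a dilute void population,
    k^2 = 4*pi*R_c*N, with R_c the effective capture radius (m) and
    N the void number density (1/m^3)."""
    return 4.0 * math.pi * r_capture_m * number_density_m3

def cavity_bias(r_cap_sia, r_cap_vac):
    """Illustrative bias: fractional excess of the SIA capture radius
    over the vacancy capture radius (positive => interstitial-biased)."""
    return r_cap_sia / r_cap_vac - 1.0

# Hypothetical numbers for a ~2 nm void (illustrative only)
N = 1.0e22           # voids per m^3
r_sia, r_vac = 2.4e-9, 2.2e-9  # capture radii in m
k2_i = void_sink_strength(r_sia, N)
k2_v = void_sink_strength(r_vac, N)
bias = cavity_bias(r_sia, r_vac)
```

A larger SIA capture radius than vacancy capture radius yields a positive bias, i.e. the cavity preferentially absorbs interstitials, which is the quantity the homogenization scheme evaluates across void sizes and temperatures.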
To address this limitation, a new in-house KMC code was developed. This approach took strain-affected migration barriers, extracted from molecular dynamics (MD) simulations, as input and modeled defect jumps near the cavity, thereby providing a more realistic time evolution of cavity absorption. Results demonstrated that equilibrium bubble pressure is a decisive factor, steering the sink strength and the bias between interstitial and vacancy preferences. The multi-scale integration of MS, MD, and KMC thus offers a more comprehensive description of the absorption behavior of point defects by cavities in BCC Fe, producing physically reasonable trends across different temperatures, cavity sizes, and internal gas pressures. These findings strengthen the cavity bias model by highlighting helium's impact on bubble-induced strain fields and illustrating that multi-scale simulations can resolve contradictions arising from static atomistic calculations. In addition, the calculated sink strengths may guide larger-scale mesoscale models, such as cluster dynamics, and help produce results that can be compared more directly with experiments. Finally, this multi-scale framework can accommodate more complex defect structures or migration mechanisms, offering a flexible foundation for studying sink-defect interactions in advanced reactor materials.
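The KMC scheme described above can be illustrated with a minimal residence-time (BKL-style) step: each available jump gets an Arrhenius rate from its strain-shifted barrier, one jump is selected with probability proportional to its rate, and the clock advances by an exponentially distributed waiting time. The attempt frequency and the barrier values below are generic, illustrative assumptions, not parameters from the thesis's in-house code.

```python
import math
import random

KB = 8.617333e-5   # Boltzmann constant, eV/K
NU0 = 1.0e13       # attempt frequency, 1/s (typical order of magnitude)

def kmc_step(barriers_eV, T, rng):
    """One residence-time KMC step over the jumps available to a defect:
    returns (index of chosen jump, time increment in seconds)."""
    rates = [NU0 * math.exp(-eb / (KB * T)) for eb in barriers_eV]
    total = sum(rates)
    # Select a jump with probability proportional to its rate
    u = rng.random() * total
    acc = 0.0
    for j, r in enumerate(rates):
        acc += r
        if u <= acc:
            break
    # Exponential waiting time for the whole event catalogue
    dt = -math.log(rng.random()) / total
    return j, dt

# Hypothetical strain-shifted barriers (eV) for an SIA near a cavity;
# a lower barrier toward the cavity makes that jump more likely.
rng = random.Random(0)
j, dt = kmc_step([0.30, 0.34, 0.26, 0.31], T=600.0, rng=rng)
```

Iterating this step for many defects and summing absorption events at the cavity surface yields the time-resolved absorption fluxes from which sink strength and bias can be extracted.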
Subjects
Ferritic-martensitic steels; Radiation-induced swelling; Cavity bias; Molecular dynamics; Kinetic Monte Carlo; Multi-scale simulations
Types
Thesis