Article ID: 7880018
Journal: Acta Materialia
Published Year: 2015
Pages: 10 Pages
File Type: PDF
Abstract
Following the Hume-Rothery rules, it is a longstanding notion that atomic size mismatch induces intrinsic residual strains in a common lattice, which may cause lattice instability and thus phase transitions in an alloy. For conventional alloys, such intrinsic residual strains can be derived from the continuum theory of elasticity; however, the lack of distinction between solvent and solute atoms in recently developed high entropy alloys defies such an approach. Here, we develop a general, self-contained geometric model that enables the calculation of intrinsic residual strains around elements of different sizes in a multi-component alloy, linking the average lattice constant of the alloy to a few critical geometric variables related to close atomic packing in that lattice, such as atomic size, atomic fraction and packing density. When applied to glass-forming high entropy alloys and bulk metallic glasses, our model reveals that amorphization occurs when the root-mean-square (R.M.S.) residual strain rises above ∼10%, in good agreement with Lindemann's lattice instability criterion. By comparison, the transition from a single-phase to a multi-phase solid solution takes place in crystalline high entropy alloys when the R.M.S. residual strain approaches ∼5%. Our findings provide quantitative insight into phase stability in multi-component alloys, which should be useful in the design of high entropy alloys with desired phases.
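The abstract's full geometric model (linking the average lattice constant to packing density) is not reproduced here, but the simplest proxy for its R.M.S. residual strain criterion is the atomic-size-mismatch parameter δ = sqrt(Σᵢ cᵢ (1 − rᵢ/r̄)²) widely used in the high entropy alloy literature. The sketch below computes that proxy; the radii table and the example compositions are illustrative assumptions, not data from the paper.

```python
import math

# Illustrative metallic (Goldschmidt-type) atomic radii in angstroms;
# values are approximate literature numbers, not taken from the paper.
RADII = {"Co": 1.25, "Cr": 1.28, "Fe": 1.26, "Ni": 1.24, "Al": 1.43}

def rms_mismatch_strain(composition):
    """R.M.S. atomic-size-mismatch strain:
    delta = sqrt(sum_i c_i * (1 - r_i / r_bar)^2),
    where c_i are atomic fractions and r_bar is the
    composition-weighted mean atomic radius."""
    total = sum(composition.values())
    fracs = {el: c / total for el, c in composition.items()}
    r_bar = sum(c * RADII[el] for el, c in fracs.items())
    return math.sqrt(sum(c * (1.0 - RADII[el] / r_bar) ** 2
                         for el, c in fracs.items()))

# Equiatomic CoCrFeNi: similar radii, so delta stays well below
# the ~5% single-to-multi-phase threshold discussed in the abstract.
d_small = rms_mismatch_strain({"Co": 1, "Cr": 1, "Fe": 1, "Ni": 1})

# Adding the larger Al atom raises the mismatch strain noticeably.
d_large = rms_mismatch_strain({"Co": 1, "Cr": 1, "Fe": 1, "Ni": 1, "Al": 1})
```

Against the thresholds quoted in the abstract, a composition with δ near or above ∼5% would be a candidate for multi-phase formation, and one above ∼10% a candidate glass former; whether δ is the exact strain measure of the paper's model is an assumption here.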
Related Topics
Physical Sciences and Engineering Materials Science Ceramics and Composites