
Study of Li atom diffusion in amorphous Li3PO4 with neural network potential.

To clarify atomic diffusion in amorphous materials, which is important for novel information and energy devices, theoretical methods that combine reliability with computational speed are eagerly anticipated. In the present study, we applied neural network (NN) potentials, a recently developed machine learning technique, to the study of atom diffusion in amorphous materials, using Li3PO4 as a benchmark material. The NN potential was used together with the nudged elastic band, kinetic Monte Carlo, and molecular dynamics methods to characterize Li vacancy diffusion behavior in the amorphous Li3PO4 model. By comparing these results with corresponding DFT calculations, we found that the average error of the NN potential is 0.048 eV for the energy barriers of diffusion paths and 0.041 eV for the diffusion activation energy. Moreover, the diffusion coefficients obtained from NN-potential molecular dynamics are consistent with those from ab initio molecular dynamics simulation, while the NN potential is 3-4 orders of magnitude faster than DFT. Lastly, the structure of amorphous Li3PO4 and its ion transport properties were studied with the NN potential using a large supercell model containing more than 1000 atoms. The formation of P2O7 units was observed, which is consistent with experimental characterization. The Li diffusion activation energy was estimated to be 0.55 eV, in good agreement with experimental measurements.
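The activation energy quoted in the abstract is the kind of quantity one extracts from an Arrhenius fit of temperature-dependent diffusion coefficients, D(T) = D0 exp(-Ea / kB T), with D(T) itself obtained from the mean-squared displacement in MD runs. The sketch below is illustrative only: the temperatures, prefactor D0, and diffusion coefficients are synthetic stand-ins (generated from an assumed Ea of 0.55 eV, the value reported in the abstract), not data from the paper.

```python
import numpy as np

k_B = 8.617333e-5  # Boltzmann constant in eV/K

# Hypothetical diffusion coefficients (cm^2/s) at several MD temperatures,
# generated from an assumed Arrhenius law purely for illustration.
temps = np.array([600.0, 800.0, 1000.0, 1200.0])  # K
E_a_assumed = 0.55                                # eV (value from the abstract)
D0 = 1.0e-3                                       # cm^2/s, assumed prefactor
D = D0 * np.exp(-E_a_assumed / (k_B * temps))

# Arrhenius fit: ln D = ln D0 - E_a / (k_B * T), so a linear fit of
# ln D against 1/T gives slope = -E_a / k_B.
slope, intercept = np.polyfit(1.0 / temps, np.log(D), 1)
E_a_fit = -slope * k_B

print(f"fitted activation energy: {E_a_fit:.3f} eV")
```

Because the synthetic data follow the Arrhenius law exactly, the fit recovers the assumed 0.55 eV; with real MD data the scatter of ln D about the fit line indicates how well the diffusion is described by a single activation energy over the sampled temperature range.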
