Deep neural networks for density functional theory Hamiltonian.
DeepH-pack is the official implementation of the DeepH (Deep Hamiltonian) method described in the paper *Deep-learning density functional theory Hamiltonian for efficient ab initio electronic-structure calculation* and in the Research Briefing.
DeepH-pack supports DFT results made by ABACUS, OpenMX, FHI-aims or SIESTA and will support HONPAS soon.
For more information, see the documentation and the talk (in Chinese).
How to cite

If you use DeepH-pack in your research, please consider citing the following papers:

```bibtex
@article{deeph,
  author  = {Li, He and Wang, Zun and Zou, Nianlong and Ye, Meng and Xu, Runzhang and Gong, Xiaoxun and Duan, Wenhui and Xu, Yong},
  title   = {Deep-learning density functional theory Hamiltonian for efficient ab initio electronic-structure calculation},
  journal = {Nature Computational Science},
  volume  = {2},
  number  = {6},
  pages   = {367-377},
  ISSN    = {2662-8457},
  DOI     = {10.1038/s43588-022-00265-6},
  url     = {https://doi.org/10.1038/s43588-022-00265-6},
  year    = {2022},
  type    = {Journal Article}
}

@article{deephe3,
  author  = {Gong, Xiaoxun and Li, He and Zou, Nianlong and Xu, Runzhang and Duan, Wenhui and Xu, Yong},
  title   = {General framework for E(3)-equivariant neural network representation of density functional theory Hamiltonian},
  journal = {Nature Communications},
  volume  = {14},
  number  = {1},
  pages   = {2848},
  ISSN    = {2041-1723},
  DOI     = {10.1038/s41467-023-38468-8},
  url     = {https://doi.org/10.1038/s41467-023-38468-8},
  year    = {2023},
  type    = {Journal Article}
}

@article{xdeeph,
  author  = {Li, He and Tang, Zechen and Gong, Xiaoxun and Zou, Nianlong and Duan, Wenhui and Xu, Yong},
  title   = {Deep-learning electronic-structure calculation of magnetic superstructures},
  journal = {Nature Computational Science},
  volume  = {3},
  number  = {4},
  pages   = {321-327},
  ISSN    = {2662-8457},
  DOI     = {10.1038/s43588-023-00424-3},
  url     = {https://doi.org/10.1038/s43588-023-00424-3},
  year    = {2023},
  type    = {Journal Article}
}
```
To use DeepH-pack, the following environments and packages are required:
Prepare the Python 3.9 interpreter and install the following required Python packages:
- NumPy
- SciPy
- PyTorch = 1.9.1
- PyTorch Geometric = 1.7.2
- e3nn = 0.3.5
- pymatgen
- h5py
- TensorBoard
- pathos
- psutil
On Linux, you can quickly satisfy these requirements by running
```bash
# install miniconda with python 3.9
wget https://repo.anaconda.com/miniconda/Miniconda3-py39_4.10.3-Linux-x86_64.sh
bash Miniconda3-py39_4.10.3-Linux-x86_64.sh

# install packages by conda
conda install numpy
conda install scipy
conda install pytorch==1.9.1 ${pytorch_config}
conda install pytorch-geometric=1.7.2 -c rusty1s -c conda-forge
conda install pymatgen -c conda-forge

# install packages by pip
pip install e3nn==0.3.5
pip install h5py
pip install tensorboard
pip install pathos
pip install psutil
```
with `${pytorch_config}` replaced by your own configuration. You can find how to set it on the official website of PyTorch.
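For example (the two commands below are illustrative assumptions based on the PyTorch 1.9.1 install matrix, not part of DeepH-pack; check the PyTorch website for the configuration matching your system), `${pytorch_config}` might expand to:

```bash
# CPU-only build (assumed example)
conda install pytorch==1.9.1 cpuonly -c pytorch
# CUDA 11.1 build (assumed example)
conda install pytorch==1.9.1 cudatoolkit=11.1 -c pytorch -c conda-forge
```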
Prepare the Julia 1.6.6 interpreter and install the following required Julia packages with Julia's built-in package manager:
- Arpack.jl
- HDF5.jl
- ArgParse.jl
- JLD.jl
- JSON.jl
- IterativeSolvers.jl
- DelimitedFiles.jl
- StaticArrays.jl
- LinearMaps.jl
- Pardiso.jl
On Linux, you can quickly satisfy these requirements by first running
```bash
# install julia 1.6.6
wget https://julialang-s3.julialang.org/bin/linux/x64/1.6/julia-1.6.6-linux-x86_64.tar.gz
tar xzvf julia-1.6.6-linux-x86_64.tar.gz

# open the julia REPL
julia
```
Then enter the Pkg REPL by pressing `]` in the Julia REPL. In the Pkg REPL, run
```
(@v1.6) pkg> add Arpack
(@v1.6) pkg> add HDF5
(@v1.6) pkg> add ArgParse
(@v1.6) pkg> add JLD
(@v1.6) pkg> add JSON
(@v1.6) pkg> add IterativeSolvers
(@v1.6) pkg> add DelimitedFiles
(@v1.6) pkg> add StaticArrays
(@v1.6) pkg> add LinearMaps
```
Follow these instructions to install Pardiso.jl.
One of the supported DFT packages is required to obtain the dataset and calculate the overlap matrix for large-scale material systems. DeepH-pack supports DFT results made by ABACUS, OpenMX, FHI-aims or SIESTA and will support HONPAS soon.
OpenMX:
- Install OpenMX package version 3.9 for density functional theory Hamiltonian matrix calculation to construct datasets. If you are using Intel MKL and Intel MPI environments, you can use the following variable definitions in the makefile; otherwise, edit the makefile yourself according to your environment to install OpenMX version 3.9:
  ```makefile
  CC = mpiicc -O3 -xHOST -ip -no-prec-div -qopenmp -I${MKLROOT}/include/fftw -I${MKLROOT}/include
  FC = mpiifort -O3 -xHOST -ip -no-prec-div -qopenmp -I${MKLROOT}/include
  LIB = ${CMPLR_ROOT}/linux/compiler/lib/intel64_lin/libiomp5.a ${MKLROOT}/lib/intel64/libmkl_blas95_lp64.a ${MKLROOT}/lib/intel64/libmkl_lapack95_lp64.a ${MKLROOT}/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group ${MKLROOT}/lib/intel64/libmkl_intel_lp64.a ${MKLROOT}/lib/intel64/libmkl_intel_thread.a ${MKLROOT}/lib/intel64/libmkl_core.a ${MKLROOT}/lib/intel64/libmkl_blacs_intelmpi_lp64.a -Wl,--end-group ${CMPLR_ROOT}/linux/compiler/lib/intel64_lin/libifcoremt.a -lpthread -lm -ldl
  ```
- A modified OpenMX package is also used to compute only the overlap matrices for large-scale material structures. Install the 'overlap only' OpenMX according to the readme documentation in this repository.
SIESTA: Install the SIESTA package for density functional theory Hamiltonian matrix calculation to construct datasets. DeepH-pack requires SIESTA version >= 4.1.5.
ABACUS: Install the ABACUS package for density functional theory Hamiltonian matrix calculation to construct datasets. DeepH-pack requires ABACUS version >= 2.3.2.
Note added: the DeepH-ABACUS interface currently suffers from a bug regarding the sparsity pattern of ABACUS's overlap matrix, which may cause errors in DeepH predictions. We are currently working on this issue, and this note will be removed once a fix is ready.
Run the following commands to install DeepH-pack:
```bash
git clone https://github.com/mzjb/DeepH-pack.git
cd DeepH-pack
pip install .
```
To perform efficient ab initio electronic structure calculations with the DeepH method for a class of large-scale material systems, one needs to design an appropriate dataset of small structures whose chemical bonding environments are close to those of the target large-scale material systems. Therefore, the first step of a DeepH study is to perform DFT calculations on this dataset to obtain the DFT Hamiltonian matrices in a localized basis. DeepH-pack supports DFT results made by ABACUS, OpenMX, FHI-aims or SIESTA and will support HONPAS soon.
For more information, see the documentation.
Preprocess
Preprocess is a part of DeepH-pack. Through Preprocess, DeepH-pack will convert the units of physical quantities, store the data files for each structure in a separate folder in text and HDF5 format, generate local coordinates, and perform basis transformation for the DFT Hamiltonian matrices. We use the following convention of units:
| Quantity | Unit |
|---|---|
| Length | Å |
| Energy | eV |
You need to edit a configuration file in the ini format, setting it up with reference to the default file DeepH-pack/deeph/preprocess/preprocess_default.ini. The meaning of the keywords can be found in the documentation. For a quick start, you must set up raw_dir, processed_dir and interface.
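A minimal sketch of such a file is shown below, overriding only the quick-start keywords (the section name and the example values are assumptions for illustration; keep all other keywords from preprocess_default.ini):

```ini
# Minimal preprocess configuration (sketch). The [basic] section name and
# the example values are assumptions; compare with preprocess_default.ini.
[basic]
raw_dir = /path/to/your/raw/DFT/calculations
processed_dir = /path/to/store/the/processed/dataset
interface = openmx
```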
With the configuration file prepared, run
```bash
deeph-preprocess --config ${config_path}
```

with `${config_path}` replaced by the path of your configuration file.
Train
Train is a part of DeepH-pack, which is used to train a deep learning model using the processed dataset.
Prepare a configuration file in the ini format, setting it up with reference to the default DeepH-pack/deeph/default.ini. The meaning of the keywords can be found in the documentation. For a quick start, you must set up graph_dir, save_dir, raw_dir and orbital; the other keywords can stay at their defaults and be adjusted later.
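A minimal sketch of the quick-start keywords is shown below (the section layout and example values are assumptions; check default.ini for where each keyword actually lives, and see the tips below for generating the orbital value):

```ini
# Minimal training configuration (sketch). Section name and values are
# assumptions; compare with DeepH-pack/deeph/default.ini.
[basic]
graph_dir = /path/to/store/the/graph/file
save_dir = /path/to/store/the/training/results
raw_dir = /path/to/the/processed/dataset
# the value below can be generated with DeepH-pack/tools/get_all_orbital_str.py
orbital = <output of get_all_orbital_str.py>
```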
With the configuration file prepared, run
```bash
deeph-train --config ${config_path}
```

with `${config_path}` replaced by the path of your configuration file.
Tips:
Name your dataset. Use dataset_name to name your dataset; identical names may overwrite each other.
Hyperparameters of the neural network. The neural network here contains some hyperparameters. For a specific problem, you should try adjusting the hyperparameters to obtain better results.
The keyword orbital. The keyword orbital states which orbitals or matrix elements are predicted. Its data structure is a little complicated to understand. To figure it out, you can refer to the documentation or to the method make_mask in the class DeepHKernel defined in DeepH-pack/deeph/kernel.py. Alternatively, the Python script DeepH-pack/tools/get_all_orbital_str.py can be used to generate a default configuration to predict all orbitals with one model.

Use TensorBoard for visualizations. You can track and visualize the training process through TensorBoard by running
```bash
tensorboard --logdir=./tensorboard
```
in the output directory (save_dir).
Inference
Inference is a part of DeepH-pack, which is used to predict the DFT Hamiltonian for large-scale material structures and perform sparse calculations of physical properties.
Firstly, one should prepare the structure file of the large-scale material and calculate the overlap matrix. The overlap matrix calculation does not require SCF; even if the material system is large, only a small amount of computation time and memory is required. The following are the steps to calculate the overlap matrix with the different supported DFT packages:
- ABACUS: Set the following parameter in the ABACUS input file INPUT:

  ```
  calculation get_S
  ```

  and run ABACUS like a normal SCF calculation. ABACUS version >= 2.3.2 is required.
- OpenMX: See this repository.
For the overlap matrix calculation, you need to use the same basis set and DFT software as when preparing the dataset.
Then, prepare a configuration file in the ini format, setting it up with reference to the default DeepH-pack/deeph/inference/inference_default.ini. The meaning of the keywords can be found in the INPUT KEYWORDS section of the documentation. For a quick start, you must set up OLP_dir, work_dir, interface, trained_model_dir and sparse_calc_config, as well as a JSON configuration file located at sparse_calc_config for the sparse calculation.
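As a rough sketch only, a band-structure JSON file for the sparse calculation might look like the following (every keyword name and value here is an assumption for illustration; the authoritative schema is the INPUT KEYWORDS section of the documentation):

```json
{
    "calc_job": "band",
    "which_k": 0,
    "fermi_level": 0.0,
    "num_band": 50,
    "k_data": ["20 0.0 0.0 0.0 0.5 0.0 0.0 Γ X"]
}
```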
With the configuration files prepared, run
```bash
deeph-inference --config ${config_path}
```

with `${config_path}` replaced by the path of your configuration file.
When the directory structure of the code folder is not modified, the scripts in it can be used to generate a dataset of non-twisted structures, train a DeepH model, make predictions on the DFT Hamiltonian matrix of a twisted structure, and perform sparse diagonalization to compute the band structure for the example study of bismuthene.
Firstly, generate example input files according to your environment paths by running the following command:
```bash
cd DeepH-pack
python gen_example.py ${openmx_path} ${openmx_overlap_path} ${pot_path} ${python_interpreter} ${julia_interpreter}
```
with `${openmx_path}`, `${openmx_overlap_path}`, `${pot_path}`, `${python_interpreter}`, and `${julia_interpreter}` replaced by the path of the original OpenMX executable, the modified 'overlap only' OpenMX executable, the VPS and PAO directories of OpenMX, the Python interpreter, and the Julia interpreter, respectively. For example,
```bash
cd DeepH-pack
python gen_example.py /home/user/openmx/source/openmx /home/user/openmx_overlap/source/openmx /home/user/openmx/DFT_DATA19 python /home/user/julia-1.5.4/bin/julia
```
Secondly, enter the generated example/ folder and run run.sh in each subfolder one-by-one, from 1 to 5. Please note that run.sh should be run in the directory where the run.sh file is located.
```bash
cd example/1_DFT_calculation
bash run.sh
cd ../2_preprocess
bash run.sh
cd ../3_train
bash run.sh
cd ../4_compute_overlap
bash run.sh
cd ../5_inference
bash run.sh
```
The third step, the neural network training process, is recommended to be carried out on a GPU. In addition, to obtain the energy bands faster, it is recommended to calculate the eigenvalues of different k points in parallel in the fifth step via the which_k interface.
After completing the calculation, you can find the band structure data (in OpenMX Band format) of twisted bilayer bismuthene with 244 atoms per supercell, computed from the predicted DFT Hamiltonian, in the file below:
example/work_dir/inference/5_4/openmx.Band
The plotted band structure will be consistent with the right panel of Figure 6c in our paper.
You can train DeepH models using the existing dataset to reproduce the results of our paper.
Firstly, download the processed dataset for graphene (graphene_dataset.zip), MoS2 (MoS2_dataset.zip), twisted bilayer graphene (TBG_dataset.zip) or twisted bilayer bismuthene (TBB_dataset.zip). Uncompress the ZIP file.
Secondly, edit the corresponding config files in DeepH-pack/ini/. raw_dir should be set to the path of the downloaded dataset. graph_dir and save_dir should be set to the paths for saving your graph file and results during training. For graphene, twisted bilayer graphene and twisted bilayer bismuthene, a single MPNN model is used for each dataset. For MoS2, four MPNN models are used. Run
```bash
deeph-train --config ${config_path}
```

with `${config_path}` replaced by the path of the config file for training.
After completing the training, you can find the trained model in save_dir, which can be used to make predictions on new structures by running
```bash
deeph-inference --config ${inference_config_path}
```

with `${inference_config_path}` replaced by the path of the config file for inference. Please note that the DFT results in this dataset were calculated using OpenMX, which means that if you want to use a model trained on this dataset to calculate properties, you need to use overlap matrices calculated with OpenMX. The orbital information required for the overlap calculations can be found in the paper.
Train the DeepH model on random graphene supercells and predict the Hamiltonian of a carbon nanotube using the ABACUS interface. See README.md in this file for details.
He Li
Tsinghua University
mzjb313 [at] gmail [dot] com
Zechen Tang
Tsinghua University
- Xiaoxun Gong
- Honggeng Tao
- Zun Wang
- Nianlong Zou
- Ting Bao
Yong Xu
Tsinghua University
yongxu [at] tsinghua [dot] edu [dot] cn
Wenhui Duan
Tsinghua University
duanw [at] tsinghua [dot] edu [dot] cn