 Research
 Open access
Prediction of geothermal temperature field by multiattribute neural network
Geothermal Energy volume 12, Article number: 22 (2024)
Abstract
Hot dry rock (HDR) resources are gaining increasing attention as a significant renewable resource due to their low carbon footprint and stable nature. When assessing the potential of a conventional geothermal resource, the temperature field distribution is a crucial factor. However, the available geostatistical and numerical simulation methods are often influenced by data coverage and human factors. In this study, the Convolutional Block Attention Module (CBAM) and bottleneck architecture were integrated into UNet (CBAMBUNet) for simulating the geothermal temperature field. The proposed CBAMBUNet takes a geological model containing parameters such as density, thermal conductivity, and specific heat capacity as input, and it simulates the temperature field by adaptively blending these multiple parameters through the neural network. The bottleneck architectures and CBAM reduce the computational cost while ensuring simulation accuracy. The CBAMBUNet was trained using thousands of geological models with various real structures and their corresponding temperature fields. The method's applicability was verified on a complex geological model of hot dry rock. Finally, the simulated temperature field was compared with the theoretical steady-state crustal ground temperature model of the Gonghe Basin; the small error between them further validates the method's superiority. The temperature field simulation also revealed the thermal evolution law of a symmetrical cooling front formed by the low thermal conductivity and high specific heat capacity in the center of the fault zone and in the granite on both sides, with the temperature gradually decreasing from the center towards the edges.
Introduction
Geothermal energy stands out as a renewable energy source known for its high stability and low vulnerability to external influences compared with other renewable sources such as tidal, wind, and solar energy (Zhao and Wan 2014; Wang et al. 2020; Qiu et al. 2022). As a clean energy alternative with minimal carbon dioxide emissions, geothermal energy has garnered considerable attention from researchers and governments globally (Zhu et al. 2015; Xia and Zhang 2019; Yang et al. 2022). Understanding the temperature field distribution within geothermal areas is paramount for assessing their reserves (Bassam et al. 2010; Forrest et al. 2005). The temperature field plays a critical role in identifying the optimal drilling location and depth before commencing geothermal drilling operations (Vogt et al. 2010). Given the substantial cost of measuring temperature fields in geothermal regions, it is imperative to adopt methods that can accurately simulate these temperature distributions from available data.
Currently, various methods are employed for simulating geothermal temperatures, including geostatistical methods (Williams and DeAngelo 2011; Siler et al. 2016) and numerical simulations (Song et al. 2018; Aliyu and Archer 2021; Salinas et al. 2021; Lv et al. 2022). These methods have been successfully applied to conventional geothermal fields. Fabbri (2001) post-processed indicator Kriging outcomes to derive probability maps highlighting areas with high probabilities of temperatures above 80 ℃, between 70 ℃ and 80 ℃, and below 70 ℃. Sepúlveda (2012) used Kriging to predict drill-hole temperatures and stratigraphic data sets in the Wairakei geothermal field, New Zealand. Cheng et al. (2019) developed a conceptual numerical model that employs a fully coupled thermo-poro-elastic finite-element model with a Discrete Fracture Network (DFN) to simulate the response of naturally fractured geothermal reservoirs to water injection. Akbar and Fathianpour (2021) devised a computational model utilizing geological, geophysical, and structural data to improve the understanding of high-enthalpy geothermal reservoirs; the model incorporates a Curie depth map to estimate heat sources and employs finite element methods to solve the governing equations. Lesmana et al. (2021) studied and compared two development strategies, full-scale and stepwise development, for the Tompaso field in North Sulawesi, Indonesia, based on numerical and thermodynamic simulations. Using numerical simulations, Li et al. (2022) investigated the impact of geological layering on the thermal energy performance of underground mines, evaluating the influence of geological stratification on heat storage capacity and performance using heat storage and insulation materials. However, these methods have a few weaknesses.
Both numerical simulation and geostatistical methods require detailed geological information and large quantities of data to complete the task, which limits the application of both methods.
Machine learning has emerged as a promising area of research and development in geothermal exploration, with increasingly widespread application across multiple research fields within geothermal energy. Currently, traditional machine learning techniques are predominantly employed in the exploration, reservoir characterization, petrophysics, and drilling aspects of the geothermal energy industry, whereas deep learning algorithms are primarily utilized in reservoir engineering, seismic activity, and production/injection engineering (Moraga et al. 2022; Okoroafor et al. 2022). Esen et al. (2007; 2008a, b, c, d, e; 2015) conducted extensive research on ground-coupled heat pumps (GCHP), ground heat exchangers (GHE), and ground source heat pumps (GSHP), utilizing machine learning techniques such as ANFIS, ANN, and SVM to model and predict their performance and providing diverse tools to enhance machine learning modeling and predictive capabilities. Rezvanbehbahani et al. (2017) applied the Gradient Boosted Regression Tree (GBRT) model to predict the heat flux distribution in Greenland using the simplified global geothermal heat flow (GHF) data set. Assouline et al. (2019) proposed a new methodology that combined the random forest algorithm with GIS data processing and physical modeling to assess Switzerland's shallow geothermal potential via the geothermal gradient, ground thermal conductivity, and ground thermal diffusivity. To forecast the temperature of geothermal reservoirs based on selected hydrogeochemistry data, Fusun and Mehmet Haklidir (2020) developed a Deep Neural Network (DNN) model, demonstrating promising results. Lösing and Ebbing (2021) suggested a machine learning-based method that employed the gradient-boosting regression technique to estimate geothermal heat flow (GHF) in Antarctica.
Gudala and Govindarajan (2021) improved the mathematical model through dynamic variations in rock, fracture, and fluid properties, and evaluated geothermal performance with their recently developed integrated machine learning-response surface model-ARIMA model. Ishitsuka et al. (2021) developed two methods, one based on Bayesian estimation and the other on a neural network, to estimate the temperature distribution of a geothermal field. Xiong et al. (2022) compared the deep learning GoogLeNet model with Support Vector Machine (SVM), Decision Tree (DT), K-Nearest Neighbor (KNN), and other traditional machine learning methods for recognizing geothermal surface manifestations. Yang et al. (2022) used a deep belief network (DBN) to identify the formation temperature field and successfully applied the network to the stratum temperature field of the southern Songliao Basin, China. Kiran et al. (2022) used FORGE well-logging data to synthesize the evolution of dynamic data, and analyzed and compared K-Nearest Neighbor, Random Forest, Decision Tree, Gradient Boosting, and deep learning models with hidden layers.
The above work on geothermal energy based on machine learning and deep learning fully demonstrates the practical significance of using deep learning to predict the geothermal temperature field. Therefore, we propose a novel network called CBAMBUNet for simulating temperature fields in complex hot dry rocks. Specifically, CBAMBUNet takes key parameters including density, specific heat capacity, and thermal conductivity as inputs, and these parameters are adaptively fused by the neural network to simulate the temperature fields of hot dry rocks. The data set used in this study was generated using the finite element method, and the numerical results were compared with logging data to verify the accuracy of the proposed model. Our findings indicate that CBAMBUNet is effective for simulating the temperature field of hot dry rocks. Furthermore, applying CBAMBUNet to complex models allows analysis of the evolution of fracture temperature fields, lithology, and other related factors.
Methodology
The challenge of simulating a temperature field from multiple rock parameters can be reframed as a nonlinear regression problem, for which neural networks are an effective solution. Thus, UNet is leveraged to address the complex task of temperature field simulation. To enhance UNet's capability to accurately simulate the temperature field of hot dry rocks and to mitigate the uncertainties of single-parameter simulations, a combination of bottleneck architectures and the Convolutional Block Attention Module is employed with UNet. By using the network to integrate three parameters and merging them with geological structure and additional information, a more precise geothermal temperature field can be generated.
CBAMBUNet architecture
The architecture of CBAMBUNet is based on UNet (Ronneberger et al. 2015). UNet's efficient representation capabilities allow accurate simulation of temperature fields from rock parameters. Specifically, the encoder component of the CBAMBUNet architecture extracts features from both the rock parameters (density, thermal conductivity, and specific heat capacity) and the temperature field data. The decoder subsequently generates the corresponding functional relationship between the two data types, enabling deep simulation of temperature fields once the network has been trained.
The modified UNet features a contraction path on the left side with four downsampling blocks and an expansion path on the right side with four upsampling blocks, in line with the original UNet structure. Each downsampling block in the left path comprises a bottleneck architecture, ReLU activation function, Convolutional Block Attention Module (CBAM), sigmoid activation function, and downsampling operation. The bottleneck architecture reduces the number of network parameters, thereby accelerating the training process. Incorporating the CBAM enhances the robustness and generalization capabilities of the UNet. Both ReLU and sigmoid activation functions are utilized as nonlinear transformations to increase network nonlinearity (Krizhevsky et al. 2012). The downsampling operation involves a 2 × 2 max-pooling layer with a stride of 2, reducing the feature map size by half while retaining the maximum value. The intermediate bottlenecks include the bottleneck architecture, ReLU activation function, CBAM, and sigmoid activation function. Each upsampling block includes an upsampling operation (2 × 2 bilinear interpolation with a stride of 2), concatenation to merge left-path features, bottleneck architecture, ReLU activation function, CBAM, and sigmoid activation function. The upsampling operation uses an upsampling layer to double the input image size. Finally, the temperature field is generated by a 1 × 1 convolution layer. By adjusting the number of output channels to 1, the network can accurately map rock parameters to the temperature field during the contraction and expansion learning phases, thereby achieving multi-parameter fusion. The structure of CBAMBUNet is shown in Fig. 1.
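The down-sampling block described above can be sketched in PyTorch. This is a minimal reading of the text, not the authors' code: the channel widths and the bottleneck reduction ratio are illustrative assumptions, and the attention module is left pluggable (an identity stand-in is used here).

```python
import torch
import torch.nn as nn

class DownBlock(nn.Module):
    """One contraction-path block: bottleneck convs -> ReLU -> attention -> sigmoid -> 2 x 2 max-pool."""

    def __init__(self, in_ch, out_ch, attention=None):
        super().__init__()
        mid = out_ch // 4  # bottleneck width; the 1/4 ratio is an assumption
        self.bottleneck = nn.Sequential(
            nn.Conv2d(in_ch, mid, kernel_size=1),
            nn.Conv2d(mid, mid, kernel_size=3, padding=1),
            nn.Conv2d(mid, out_ch, kernel_size=1),
        )
        self.relu = nn.ReLU(inplace=True)
        self.attention = attention if attention is not None else nn.Identity()  # CBAM plugs in here
        self.sigmoid = nn.Sigmoid()
        self.pool = nn.MaxPool2d(kernel_size=2, stride=2)  # halves the feature map size

    def forward(self, x):
        x = self.relu(self.bottleneck(x))
        x = self.sigmoid(self.attention(x))
        return self.pool(x), x  # pooled output, plus the pre-pool tensor for the expansion path

# Three input channels: density, thermal conductivity, specific heat capacity.
block = DownBlock(3, 64)
y, skip = block(torch.randn(1, 3, 160, 160))
```

The second returned tensor is what the matching up-sampling block concatenates with its bilinearly interpolated input.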
Convolutional block attention module (CBAM)
The convolutional block attention module (CBAM) was introduced by Woo et al. (2018). The module comprises two sequential submodules, the channel and spatial submodules, and serves as a straightforward and efficient attention mechanism for feedforward convolutional neural networks. CBAM directs attention to crucial features while enhancing the representation capability of the neural network: by leveraging attention mechanisms, it effectively focuses on informative features while suppressing redundant ones. In this study, the channel attention module and the spatial attention module are applied sequentially, facilitating the transmission of information in the neural network by learning to reinforce or suppress relevant characteristic information. Figure 2 shows the architecture of CBAM.
The channel attention module addresses the information loss commonly associated with a single pooling operation by performing two pooling operations (max-pooling and average-pooling) to obtain two different feature descriptors. These are then processed by a shared multilayer perceptron and added together; finally, the sigmoid activation function is applied to obtain the channel attention. Figure 3 shows the channel attention module.
The channel attention is as follows:

\[{{\varvec{M}}}_{c}({\varvec{F}})=\sigma (MLP(AvgPool({\varvec{F}}))+MLP(MaxPool({\varvec{F}})))\]

where \({{\varvec{M}}}_{c}({\varvec{F}})\) is the channel attention module; \(\sigma (\bullet )\) denotes the sigmoid function; \(MLP(\bullet )\) is a multilayer perceptron; \(AvgPool(\bullet )\) is average-pooling; \(MaxPool(\bullet )\) is max-pooling.
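As a sketch, the channel attention equation can be written out in NumPy. The shared two-layer MLP with a channel-reduction ratio \(r\) is standard for CBAM (Woo et al. 2018); the weight matrices here are random stand-ins, not trained values.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(F, W1, W2):
    """M_c(F) = sigmoid(MLP(AvgPool(F)) + MLP(MaxPool(F))) for a feature map F of shape (C, H, W)."""
    avg = F.mean(axis=(1, 2))  # global average-pooling -> (C,)
    mx = F.max(axis=(1, 2))    # global max-pooling -> (C,)
    mlp = lambda v: W2 @ np.maximum(W1 @ v, 0.0)  # shared MLP: reduce, ReLU, expand
    return sigmoid(mlp(avg) + mlp(mx))            # per-channel weights in (0, 1)

rng = np.random.default_rng(0)
C, r = 8, 2                                   # channel count and reduction ratio (illustrative)
F = rng.standard_normal((C, 16, 16))
W1 = 0.1 * rng.standard_normal((C // r, C))   # reduction layer
W2 = 0.1 * rng.standard_normal((C, C // r))   # expansion layer
w = channel_attention(F, W1, W2)
```

Each of the `C` outputs scales one feature channel, which is how informative channels are reinforced and redundant ones suppressed.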
The spatial attention module principally reflects the importance of input values in the spatial dimension. The attention map is obtained through max-pooling and average-pooling, concatenation, convolution by a standard convolution layer, and finally a sigmoid activation function. Figure 4 shows the spatial attention module.
The spatial attention can be expressed as:

\[{{\varvec{M}}}_{s}({{\varvec{F}}}{\prime})=\sigma (f(Cat(AvgPool({{\varvec{F}}}{\prime}),MaxPool({{\varvec{F}}}{\prime}))))\]

where \({{\varvec{M}}}_{s}({{\varvec{F}}}{\prime})\) is the spatial attention module; \(f(\bullet )\) is the convolution layer operation; \(Cat(\bullet )\) is the concatenation operation.
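Analogously, the spatial attention can be sketched in NumPy. Here the pooling runs along the channel axis, and the 7 × 7 convolution kernel size follows Woo et al. (2018); the kernel weights are random stand-ins.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def conv2d_same(x, k):
    """Naive 'same'-padded 2-D convolution; x: (2, H, W), k: (2, kh, kw) -> (H, W)."""
    _, H, W = x.shape
    _, kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((0, 0), (ph, ph), (pw, pw)))
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(xp[:, i:i + kh, j:j + kw] * k)
    return out

def spatial_attention(F, kernel):
    """M_s(F) = sigmoid(f(Cat(AvgPool(F), MaxPool(F)))), with pooling along the channel axis."""
    avg = F.mean(axis=0)           # (H, W)
    mx = F.max(axis=0)             # (H, W)
    stacked = np.stack([avg, mx])  # Cat -> (2, H, W)
    return sigmoid(conv2d_same(stacked, kernel))  # (H, W) spatial weights

rng = np.random.default_rng(1)
F = rng.standard_normal((8, 16, 16))
kernel = 0.1 * rng.standard_normal((2, 7, 7))  # 7 x 7 kernel as in Woo et al. (2018)
m = spatial_attention(F, kernel)
```

The resulting map weights each spatial position, complementing the per-channel weighting of the previous module.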
The input feature map is first multiplied element-wise by the channel attention, and the result is then multiplied element-wise by the spatial attention, yielding the final feature map after CBAM processing. This process is as follows:

\[{{\varvec{F}}}{\prime}={{\varvec{M}}}_{c}({\varvec{F}})\otimes {\varvec{F}}\]

\[{{\varvec{F}}}^{{\prime}{\prime}}={{\varvec{M}}}_{s}({{\varvec{F}}}{\prime})\otimes {{\varvec{F}}}{\prime}\]

where \(\otimes\) represents element-wise multiplication of matrices; \({\varvec{F}}\) is the input feature map; \({{\varvec{F}}}{\prime}\) is the intermediate, channel-refined feature map; \({{\varvec{F}}}^{{\prime}{\prime}}\) is the output feature map.
Bottleneck architectures
The bottleneck architecture is a crucial component of ResNet, featuring a distinctive bottleneck design (He et al. 2016). This module utilizes three convolutional kernels of sizes 1 × 1, 3 × 3, and 1 × 1 to reduce network parameters and accelerate network training. The introduction of the CBAM enhances the network's robustness and generalization capabilities, albeit at the cost of increased network parameters, heightened computational complexity, and slower training speed. To address this challenge, integrating the bottleneck architecture into UNet helps expedite network training. Figure 5 shows the bottleneck architecture.
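A ResNet-style bottleneck can be sketched in PyTorch to show where the parameter saving comes from. The reduction ratio of 4 follows He et al. (2016); the residual shortcut and batch normalization shown here are details the text does not spell out, so treat them as assumptions.

```python
import torch
import torch.nn as nn

class Bottleneck(nn.Module):
    """ResNet-style bottleneck: 1x1 reduce -> 3x3 -> 1x1 expand, with an identity shortcut."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        mid = channels // reduction
        self.body = nn.Sequential(
            nn.Conv2d(channels, mid, kernel_size=1, bias=False),
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.body(x) + x)  # residual addition keeps gradients flowing

# For comparison: one plain full-width 3x3 convolution at the same channel count.
plain = nn.Conv2d(64, 64, kernel_size=3, padding=1, bias=False)
block = Bottleneck(64)
```

The three thin convolutions carry far fewer weights than the single full-width 3 × 3 convolution, which is the source of the training speed-up described above.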
Simulation of the temperature field based on CBAMBUNet
CBAMBUNet takes the geological model containing rock parameters R (density, thermal conductivity, and specific heat capacity) as input and the temperature field T as expected output. That is, the relationship between R and T is established by CBAMBUNet:

\[{\varvec{T}}=CBAMBUNet({\varvec{R}};{\varvec{\theta}})\]

where \(CBAMBUNet(\bullet )\) denotes the CBAMBUNet mapping; \({\varvec{\theta}}=\{{\varvec{W}},{\varvec{b}}\}\), where W and b are learnable parameters: W represents the weight matrices and b the bias vectors.
In the training process of CBAMBUNet, the objective function is optimized and adjusted iteratively. By continually comparing the current objective function value with the expected one, the weights and biases of each layer in the neural network are adjusted. The mean square error (MSE) function, which is smooth, continuous, and differentiable, is chosen as the optimization criterion, making it well suited to gradient descent algorithms. In addition, the gradual decrease of the MSE as the error shrinks assists convergence, and even a fixed learning rate can converge quickly to the minimum. In this study, the MSE is utilized to assess the difference between the simulated temperatures \(CBAMBUNet({{\varvec{R}}}_{i};{\varvec{\theta}})\) and the actual temperatures \({{\varvec{T}}}_{i}\), and its calculation formula is as follows:

\[L({\varvec{\theta}})=\frac{1}{N}\sum_{i=1}^{N}{\Vert CBAMBUNet({{\varvec{R}}}_{i};{\varvec{\theta}})-{{\varvec{T}}}_{i}\Vert }_{F}^{2}\]

where \({\Vert \bullet \Vert }_{F}^{2}\) stands for the squared Frobenius norm; \({\{{{\varvec{R}}}_{i},{{\varvec{T}}}_{i}\}}_{i=1}^{N}\) is the set of \(N\) training samples.
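A minimal NumPy check of this loss, averaging the squared Frobenius norm of the residual field over the batch (an extra division by the pixel count, as in some MSE conventions, would only rescale the value):

```python
import numpy as np

def mse_frobenius(preds, targets):
    """Mean over N samples of the squared Frobenius norm of (prediction - target)."""
    diff = preds - targets                   # (N, H, W)
    return np.mean(np.sum(diff ** 2, axis=(1, 2)))

T = np.array([[[1.0, 2.0], [3.0, 4.0]]])     # one 2 x 2 "temperature field" label
P = np.array([[[1.0, 2.0], [3.0, 6.0]]])     # simulated field, off by 2 in one cell
loss = mse_frobenius(P, T)                   # -> 4.0
```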
CBAMBUNet minimizes the loss function using the Adam algorithm (Kingma and Ba 2014) and updates the learned parameters \({\varvec{\theta}}\) of the neural network by propagating the prediction error back through backpropagation:

\[{{\varvec{\theta}}}^{(i)}\leftarrow {{\varvec{\theta}}}^{(i)}-\alpha \frac{\partial L({\varvec{\theta}})}{\partial {{\varvec{\theta}}}^{(i)}}\]

where \({{\varvec{\theta}}}^{(i)}\) represents the neural network parameters of layer \(i\); \(\alpha\) represents the learning rate.
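At its core this update is the plain gradient step sketched below; Adam refines it with first- and second-moment estimates of the gradient, but the descent direction and the role of the learning rate \(\alpha\) are the same. The quadratic toy loss is purely illustrative.

```python
def gradient_step(theta, grad, alpha=0.1):
    """theta <- theta - alpha * dL/dtheta, the basic step that Adam refines with moment estimates."""
    return theta - alpha * grad

# Minimise L(theta) = (theta - 3)^2, whose gradient is 2 * (theta - 3).
theta = 0.0
for _ in range(100):
    theta = gradient_step(theta, 2.0 * (theta - 3.0))
# theta is now very close to the minimiser 3.0
```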
Given a geological model containing the rock parameters \({{\varvec{R}}}{\prime}\), the trained CBAMBUNet realizes end-to-end simulation of the temperature field \({{\varvec{T}}}{\prime}\):

\[{{\varvec{T}}}{\prime}=CBAMBUNet({{\varvec{R}}}{\prime};{\varvec{\theta}})\]
Numerical examples
Data set preparation
As a data-driven algorithm, the performance of CBAMBUNet is contingent upon the quality of the training data set. In this study, the training data set is built upon finite element simulations of the strata temperature field after 600,000 years. The rock parameters are derived from high-temperature and high-pressure petrophysical experiments and previous research (Zhang et al. 2021). To simplify the finite element calculation, this study posits several assumptions as the basis for establishing the labeled data:

(1) The rock matrix is considered to be homogeneous and isotropic, particularly with regard to thermal conductivity, which is assumed to be temperature independent.

(2) This study solely accounts for deep heat sources and does not take into consideration the generation of radioactive heat by rocks.

(3) Hydrothermal geothermal activity is not accounted for in the geological model, and the rock mass is assumed to be of the hot dry geothermal type. As water is not a part of the geological model, only heat conduction and energy transfer are considered.

(4) The dimensions of the geological model are 16 km × 16 km, with the heat source temperature set at 800 ℃ and the ground temperature set at 20 ℃.
The rock mass parameter ranges for all training data sets analyzed in this study are shown in Table 1. The geological structure model for the training data subset is shown in Fig. 6. Density, thermal conductivity, and specific heat capacity in the geological model are shown in Fig. 7. The temperature field was then solved using the finite element method, and the annotated results are shown in Fig. 8.
Data processing
The purpose of preprocessing is to modify the data to align with the requirements of the model and ensure compatibility between the data and the model. Variations in magnitude may result in the over-representation of attributes with greater values and increase the training time of the neural network, and algorithms based on sample distance are sensitive to the magnitude of the data. In this research, three parameters (density, thermal conductivity, and specific heat capacity) were selected to simulate the temperature field, and their values vary significantly, necessitating data preprocessing. To address this issue, a Z-score standardization technique was employed:

\[{x}^{*}=\frac{x-\mu }{\sigma }\]

where \({x}^{*}\) is the normalized data; \(x\) is the original data; \(\mu\) is the mean of the sample data; \(\sigma\) is the standard deviation of the sample data.
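As a quick illustration of the standardization, with made-up density values (the actual parameter ranges are those of Table 1):

```python
import numpy as np

def zscore(x):
    """x* = (x - mu) / sigma, using the sample mean and standard deviation."""
    return (x - x.mean()) / x.std()

density = np.array([2400.0, 2600.0, 2800.0])  # illustrative rock densities, kg/m^3
z = zscore(density)                           # roughly [-1.22, 0.0, 1.22]
```

After standardization every attribute has zero mean and unit variance, so density (thousands of kg/m³) no longer dominates thermal conductivity (a few W/(m·K)) during training.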
To enhance the training data, we segmented the training data and labels into 160 × 160 slices, resulting in 10,000 slices with corresponding labels. Of these, 8000 slices were utilized for training the neural network, while the remaining 2000 slices were assigned to the validation set. Note that when using the CBAMBUNet architecture for image processing, segmented images do not necessarily need to be divided into 160 × 160 slices.
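One way to produce such slices is simple non-overlapping tiling, sketched below. The paper does not state whether its slices overlap, so this shows only the non-overlapping case, with a hypothetical input grid size.

```python
import numpy as np

def tile_slices(field, size=160):
    """Cut a 2-D array into non-overlapping size x size slices (ragged edges are dropped)."""
    H, W = field.shape
    return [field[i:i + size, j:j + size]
            for i in range(0, H - size + 1, size)
            for j in range(0, W - size + 1, size)]

field = np.arange(320 * 480, dtype=float).reshape(320, 480)  # hypothetical model grid
slices = tile_slices(field)                                  # 2 x 3 = 6 slices of 160 x 160
```

The same function applied to each parameter plane and its temperature label keeps inputs and targets aligned slice by slice.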
Training CBAMBUNet
The CBAMBUNet parameters are optimized using the Adam optimizer (Kingma and Ba 2014), with an initial learning rate of 0.001. To prevent overfitting and improve model generalization, the learning rate is attenuated using the cosine annealing algorithm with warm restarts (Loshchilov and Hutter 2016). A batch size of 4 is used for training, and the model is trained for a total of 60 epochs.
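This setup maps directly onto PyTorch's optimizer and scheduler APIs. The restart period `T_0` and floor `eta_min` below are hypothetical choices, since the text gives only the initial rate, batch size, and epoch count, and a 1 × 1 convolution stands in for the full CBAMBUNet.

```python
import torch
from torch.optim import Adam
from torch.optim.lr_scheduler import CosineAnnealingWarmRestarts

model = torch.nn.Conv2d(3, 1, kernel_size=1)   # stand-in for CBAMBUNet
optimizer = Adam(model.parameters(), lr=1e-3)  # initial learning rate 0.001
scheduler = CosineAnnealingWarmRestarts(optimizer, T_0=10, eta_min=1e-5)

lrs = []
for epoch in range(60):      # 60 epochs; batches of 4 slices would be iterated inside
    optimizer.step()         # placeholder for the inner batch loop
    scheduler.step()         # cosine decay, warm restart every T_0 epochs
    lrs.append(optimizer.param_groups[0]["lr"])
```

Each restart jumps the learning rate back to its initial value, which helps the optimizer escape shallow minima between annealing cycles.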
Model test results
In this section, to verify the effectiveness and generalization ability of the neural network on the data set, the trained CBAMBUNet is applied to geological model 1.
This section establishes a geological model, shown in Fig. 9, comprising heat conduction channels, non-thermal conductors, and high-temperature granite conductors. The temperature field is simulated using CBAMBUNet and compared with the results obtained from a finite element simulation, as shown in Fig. 10. Although CBAMBUNet was trained on a data set generated by the finite element method, there are slight disparities between its output and the finite-element temperature field. Specifically, in the granite conductor the temperature should be higher than in the periphery; however, the isotherms from the finite element method do not exhibit characteristics consistent with this theoretical expectation, whereas the neural network simulation aligns more closely with the expected behavior. Consequently, the method proposed in this study offers enhanced performance in predicting the temperature field, especially where the granite conductor's temperature behavior differs from that observed in the finite element simulations, showcasing the method's improved predictive capabilities.
Study area
The Gonghe Basin, located in Qinghai Province, is a rhombic basin that developed during the Cenozoic Era. It lies on the northeastern edge of the Qinghai-Tibet Plateau (Fig. 11A) and was formed through tectonic movements of the Qilian and Kunlun Mountains (Fig. 11B) (Zeng et al. 2018; Zhang et al. 2018a, b). Boundary fault activity has uplifted the surrounding mountains, while the basin itself has remained relatively stable, allowing an extensive set of Cenozoic sediments to be deposited. These sediments comprise primarily Quaternary alluvial–diluvial deposits, fluvial–lacustrine deposits, and Neogene and Paleogene lacustrine deposits. The base of the basin is mainly composed of Triassic strata and intrusive rocks, consisting of granite, granodiorite, and porphyry granite (Fig. 11C) (Wang et al. 2015; Li et al. 2015). In recent years, a series of wells (e.g., DR3, DR4, GR1, and GR2) organized and implemented by the China Geological Survey and the Qinghai Provincial Department of Land and Resources has revealed the occurrence of high temperatures in the Gonghe Basin (Fig. 11D) (Zhang et al. 2018a, b; Yan et al. 2015). This highlights the significant development potential of hot dry rocks in the Gonghe Basin.
Gonghe model
To further validate the applicability of CBAMBUNet, a 12 km × 20 km Gonghe geological model was developed in this section, referencing the geological model of the Gonghe Basin established by Gao and Zhao (2024). This model (Fig. 12) comprises various geological structures with complex lateral geological conditions. It is structured into four layers: two uppermost caprocks, a middle geothermal reservoir, and a lower heat source. Tectonic activities have led to the development of numerous faults and cracks horizontally within the model, serving as conduits for underground geothermal energy. The trained CBAMBUNet is then utilized to simulate the temperature field of the Gonghe geological model (Fig. 13), and the result is compared with those simulated by 3D-UNet and the finite element method (Fig. 13) (Gao and Zhao 2024). The comparison indicates that the performance of CBAMBUNet aligns more consistently with theoretical expectations. Specifically, areas with a higher concentration of cracks at depths of 2–3 km, 6 km, 9 km, and 18–19 km exhibit increased temperatures compared with the surrounding regions. To verify the accuracy of the simulation, the actual logging temperature measurement curve is compared with the temperature field simulated by CBAMBUNet, as shown in Fig. 14. The results reveal a high degree of consistency with the actual geological conditions, underlining the reliability and feasibility of CBAMBUNet in accurately predicting the temperature field in complex geological settings.
Figure 14a shows the theoretical steady-state crustal geotherms of the Gonghe Basin and the temperature curve obtained from the CBAMBUNet-simulated temperature field. The difference between them is less than 20 ℃, corresponding to an error rate of less than 2% for CBAMBUNet (Fig. 14b). These results demonstrate the superiority of our method.
Discussion
Based on the aforementioned points, we remain confident in the potential of our proposed approach to simulate the temperature distribution of hot dry rocks. Consequently, this part of the study focuses on examining the impact of various factors, namely CBAM, bottleneck architectures, and the cosine annealing algorithm with warm restarts, on the performance of UNet. Additionally, we carry out an in-depth analysis of the limitations and possible future directions of our study. The following discussion is based on geological model 1.
Effect of CBAM
This study enhances the capability of neural networks in handling regression problems by incorporating Convolutional Block Attention Modules (CBAM) into the original UNet architecture. The impact of CBAM on neural network performance is assessed by comparing the temperature fields simulated by CBAMBUNet and the original UNet. The experimental outcomes reveal that, although the training duration of CBAMBUNet is longer, the integrated attention mechanism of CBAM significantly enhances the simulation accuracy of the temperature field (Fig. 15). Notably, CBAM-enhanced models outperform the original UNet in simulating the effects of heat conduction channels, aligning more precisely with contemporary geological insights. These results demonstrate the efficacy of integrating CBAM into neural network structures to enhance the precision of regression modeling.
Effect of bottleneck architectures
To mitigate the time overhead induced by merging CBAM, a bottleneck architecture has been introduced into the neural network. The time taken for one epoch by CBAMBUNet, CBAMUNet, and the original UNet under identical conditions, as well as the overall training duration, was compared. The experimental findings demonstrate that CBAMBUNet exhibits reduced time consumption, underscoring the benefits of incorporating the bottleneck architecture. Table 2 shows a comparative analysis of the time consumed by the three methods.
Effect of cosine annealing algorithm with warm restart
Hyperparameters are a set of free parameters that provide a means of controlling the entire algorithm. In this study, the learning rate was identified as a critical hyperparameter. To investigate its impact, two distinct learning rate adjustment strategies were compared in CBAMBUNet training, while keeping all other hyperparameters constant. Figure 16 shows the loss curves resulting from the two strategies. With the conventional algorithm, the loss value initially decreases gently and then stabilizes around the 30th epoch at a high loss value. Conversely, the cosine annealing algorithm with warm restarts exhibits only mild fluctuations during the attenuation process and stabilizes around the 30th epoch at a lower loss value. Notably, when both strategies complete training at the same time, the conventional algorithm still exhibits underfitting, while the cosine annealing algorithm with warm restarts trains the neural network better. Consequently, this study suggests that the cosine annealing algorithm with warm restarts outperforms the conventional algorithm in training the neural network.
Limitations and future work
As a data-reliant AI algorithm, the neural network's generalization capability depends critically on the training data set, which is instrumental in establishing the functional relationships used in this study. The training data set was created using the finite element method to simulate the temperature field; as a result, the performance of our neural network depends, to some extent, on the finite element method. When using the finite element method to establish the labels, a significant number of preconditions must be taken into account, including the initial temperature of the heat source, the heat source's location, the boundary conditions (e.g., a non-thermal-conduction boundary), and the specific time at which the temperature conduction occurs. These preconditions limit the neural network's ability to simulate the temperature field to certain circumstances only; consequently, it is not possible to simulate the temperature field of hot dry rocks over the time dimension. Moreover, the process of setting up the labels is time-consuming.
Conclusions
This study utilizes the CBAMBUNet method to simulate the temperature field of hot dry rock. The main findings are as follows:

1. Based on the simulated temperature field: The cover layer has a significant impact on the regional temperature field due to its low thermal conductivity. This results in the temperature field above the cover layer being lower than the surrounding temperature field. Compared to granite and the crust, thermal conductive channels exhibit higher heat transfer rates, with temperatures in the conductive channels also higher than in the surrounding layers. The temperature field inside granite is higher than the surrounding geothermal field, indicating a faster heat transfer speed compared to the surrounding layers.

2. Based on training the neural network: By incorporating attention mechanisms, a better calculation of the weights of three parameters and fitting of the spatial geological model have been achieved. Integration of bottleneck architectures enhances the training speed of the network and significantly reduces the time required for network training. The cosine annealing algorithm with warm restarts can improve the networkâ€™s fitting efficiency. Utilizing a multiparameter fusion network to simulate the temperature field can effectively leverage multiple parameters, leading to more accurate results.
Availability of data and materials
Not applicable.
References
Akbar S, Fathianpour N. Improving the conceptual–numerical model of Sabalan geothermal system using geological, geophysical and structural information. Geothermics. 2021;90:102001.
Aliyu MD, Archer RA. Numerical simulation of multifracture HDR geothermal reservoirs. Renew Energy. 2021;164:541–55.
Assouline D, Mohajeri N, Gudmundsson A, Scartezzini JL. A machine learning approach for mapping the very shallow theoretical geothermal potential. Geothermal Energy. 2019;7(1):1–50.
Bassam A, Santoyo E, Andaverde J, Hernández JA, Espinoza-Ojeda OM. Estimation of static formation temperatures in geothermal wells by using an artificial neural network approach. Comput Geosci. 2010;36(9):1191–9.
Cheng Q, Wang X, Ghassemi A. Numerical simulation of reservoir stimulation with reference to the Newberry EGS. Geothermics. 2019;77:327–43.
Esen H, Inalli M, Sengur A, et al. Artificial neural networks and adaptive neuro-fuzzy assessments for ground-coupled heat pump system. Energy Build. 2007;40(6):1074–83.
Esen H, Inalli M, Sengur A, et al. Modelling a ground-coupled heat pump system using adaptive neuro-fuzzy inference systems. Int J Refrig. 2008a;31(1):65–74.
Esen H, Inalli M, Sengur A, et al. Forecasting of a ground-coupled heat pump performance using neural networks with statistical data weighting preprocessing. Int J Therm Sci. 2008b;47(4):431–41.
Esen H, Inalli M, Sengur A, et al. Modeling a ground-coupled heat pump system by a support vector machine. Renew Energy. 2008c;8:33.
Esen H, Inalli M, Sengur A, et al. Performance prediction of a ground-coupled heat pump system using artificial neural networks. Expert Syst Appl. 2008d;35(4):1940–8.
Esen H, Inalli M, Sengur A, et al. Predicting performance of a ground-source heat pump system using fuzzy weighted preprocessing-based ANFIS. Build Environ. 2008e;12:43.
Esen H, Esen M, Ozsolak O. Modelling and experimental performance analysis of solar-assisted ground source heat pump system. J Exp Theor Artif Intell. 2015;29(1):1–17.
Fabbri P. Probabilistic assessment of temperature in the Euganean geothermal area (Veneto region, NE Italy). Math Geol. 2001;33(6):745–60.
Forrest J, Marcucci E, Scott P. Geothermal gradients and subsurface temperatures in the northern Gulf of Mexico. 2005.
Gao W, Zhao J. Deep-time temperature field simulation of hot dry rock: a deep learning method in both time and space dimensions. Geothermics. 2024;119:102978.
Gudala M, Govindarajan SK. Numerical investigations on a geothermal reservoir using fully coupled thermo-hydro-geomechanics with integrated RSM-machine learning and ARIMA models. Geothermics. 2021;96(1–12):102174.
He K, Zhang X, Ren S, Sun J. Deep residual learning for image recognition. In: Proceedings of the IEEE conference on computer vision and pattern recognition. 2016. pp. 770–778.
Ishitsuka K, Kobayashi Y, Watanabe N, Yamaya Y, Bjarkason E, Suzuki A, Saito R. Bayesian and neural network approaches to estimate deep temperature distribution for assessing a supercritical geothermal system: evaluation using a numerical model. Nat Resour Res. 2021;30(5):3289–314.
Kingma D, Ba J. Adam: a method for stochastic optimization. Computer Science. 2014.
Kiran R, Dansena P, Salehi S, Rajak VK. Application of machine learning and well log attributes in geothermal drilling. Geothermics. 2022;101:102355.
Krizhevsky A, Sutskever I, Hinton GE. ImageNet classification with deep convolutional neural networks. Commun ACM. 2012;60:84–90.
Lesmana A, Pratama HB, Ashat A, Saptadji NM. Sustainability of geothermal development strategy using a numerical reservoir modeling: a case study of Tompaso geothermal field. Geothermics. 2021;96:102170.
Li X, Mo X, Huang X, Dong G, Yu X, Luo M, Liu Y. U-Pb zircon geochronology, geochemical and Sr–Nd–Hf isotopic compositions of the Early Indosinian Tongren Pluton in West Qinling: petrogenesis and geodynamic implications. J Asian Earth Sci. 2015;97:38–50.
Li B, Zhang J, Yan H, Zhou N, Li M, Liu H. Numerical investigation into the effects of geologic layering on energy performances of thermal energy storage in underground mines. Geothermics. 2022;102:102403.
Loshchilov I, Hutter F. SGDR: stochastic gradient descent with warm restarts. In: 5th International Conference on Learning Representations (ICLR). 2017.
Lösing M, Ebbing J. Predicting geothermal heat flow in Antarctica with a machine learning approach. J Geophys Res Solid Earth. 2021;126(6):e2020JB021499.
Lv Y, Yuan C, Gan Q, Li H, Zhu X. Analysis of heat transfer based on complex embedded discrete fracture network (EDFN) for field-scale EGS. Geothermics. 2022;104:102463.
Moraga J, Duzgun HS, Cavur M, Soydan H. The geothermal artificial intelligence for geothermal exploration. Renew Energy. 2022;192:134–49.
Okoroafor ER, Smith CM, Ochie KI, Nwosu CJ, Gudmundsdottir H, Aljubran MJ. Machine learning in subsurface geothermal energy: two decades in review. Geothermics. 2022;102:102401.
Qiu Z, Zou C, Mills BJW, Xiong Y, Tao H, Lu B, Liu H, Xiao W, Poulton SW. A nutrient control on expanded anoxia and global cooling during the Late Ordovician mass extinction. Commun Earth Environ. 2022;3:82.
Rezvanbehbahani S, Stearns LA, Kadivar A, Walker JD, van der Veen CJ. Predicting the geothermal heat flux in Greenland: a machine learning approach. Geophys Res Lett. 2017;44(24):12271–9.
Ronneberger O, Fischer P, Brox T. U-Net: convolutional networks for biomedical image segmentation. In: Navab N, Hornegger J, Wells WM, Frangi AF, editors. International Conference on Medical Image Computing and Computer-Assisted Intervention. Cham: Springer; 2015. p. 234–41.
Salinas P, Regnier G, Jacquemyn C, Pain CC, Jackson MD. Dynamic mesh optimisation for geothermal reservoir modelling. Geothermics. 2021;94:102089.
Sepúlveda F, Rosenberg MD, Rowland JV, Simmons SF. Kriging predictions of drill-hole stratigraphy and temperature data from the Wairakei geothermal field, New Zealand: implications for conceptual modeling. Geothermics. 2012;42:13–31.
Siler DL, Hinz NH, Faulds JE, Queen J. 3D analysis of geothermal fluid flow favorability: Brady's, Nevada, USA. In: Siler DL, editor. Proceedings forty-first workshop on geothermal reservoir engineering. Stanford: Stanford University; 2016.
Song X, Shi Y, Li G, Yang R, Wang G, Zheng R, Lyu Z. Numerical simulation of heat extraction performance in enhanced geothermal system with multilateral wells. Appl Energy. 2018;218:325–37.
Tut Haklidir FS, Haklidir M. Prediction of reservoir temperatures using hydrogeochemical data, Western Anatolia geothermal systems (Turkey): a machine learning approach. Nat Resour Res. 2020;29(4):2333–46.
Vogt C, Mottaghy D, Wolf A, Rath V, Pechnig R, Clauser C. Reducing temperature uncertainties by stochastic geothermal reservoir modelling. Geophys J Int. 2010;181(1):321–33.
Wang B, Li BX, Ma XH. The prediction of the depth and temperature of the reservoir in the evaluation of hot dry rock (HDR) for Gonghe-Guide basin. Ground Water. 2015;37(3):28–31.
Wang G, Liu Y, Zhu X, Zhang W. The status and development trend of geothermal resources in China. Earth Sci Front. 2020;27(1):1.
Williams CF, Deangelo J. Evaluation of approaches and associated uncertainties in the estimation of temperatures in the upper crust of the western United States. Trans Geotherm Resour Council. 2011;35:1599–605.
Woo S, Park J, Lee JY, Kweon IS. CBAM: convolutional block attention module. In: Proceedings of the European conference on computer vision (ECCV). 2018. pp. 3–19.
Xia L, Zhang Y. An overview of world geothermal power generation and a case study on China: the resource and market perspective. Renew Sustain Energy Rev. 2019;112:411–23.
Xiong Y, Zhu M, Li Y, Huang K, Chen Y, Liao J. Recognition of geothermal surface manifestations: a comparison of machine learning and deep learning. Energies. 2022;15(8):2913.
Yan WD. Characteristics of Gonghe basin hot dry rock and its utilization prospects. Sci Technol Rev. 2015;33(19):54–7.
Yang W, Xiao C, Zhang Z, Liang X. Identification of the formation temperature field of the southern Songliao Basin, China based on a deep belief network. Renew Energy. 2022;182:32–42.
Zeng L, Zhang KJ, Tang XC, Zhang YX, Li ZW. Mid-Permian rifting in Central China: record of geochronology, geochemistry and Sr–Nd–Hf isotopes of bimodal magmatism on NE Qinghai-Tibetan Plateau. Gondwana Res. 2018;57:77–89.
Zhang C, Zhang S, Li S, Jia X, Jiang G, Gao P, Hu S. Geothermal characteristics of the Qiabuqia geothermal area in the Gonghe basin, northeastern Tibetan Plateau. Chin J Geophys. 2018a;61(11):4545–57.
Zhang C, Jiang G, Shi Y, Wang Z, Wang Y, Li S, Hu S. Terrestrial heat flow and crustal thermal structure of the Gonghe-Guide area, northeastern Qinghai-Tibetan Plateau. Geothermics. 2018b;72:182–92.
Zhang C, Huang R, Qin S, Hu S, Zhang S, Li S, Wang Z. The high-temperature geothermal resources in the Gonghe-Guide area, northeast Tibetan Plateau: a comprehensive review. Geothermics. 2021;97:102264.
Zhao XG, Wan G. Current situation and prospect of China's geothermal resources. Renew Sustain Energy Rev. 2014;32:651–61.
Zhu JL, Hu KY, Lu XL, Huang XX, Liu KT, Wu XJ. A review of geothermal energy resources, development, and applications in China: current status and prospects. Energy. 2015;93:466–83.
Acknowledgements
We thank the Peng Research Group at CUMTB for supporting this work. We thank Zheng Qiushi from China University of Mining and Technology (Beijing) for providing feedback on the modifications made to our network.
Funding
This work was supported by the Science Fund for Creative Research Groups of the National Natural Science Foundation of China (No. 42321002) and the Fundamental Research Funds for the Central Universities (Grant Nos. 2022JCCXMT01 and 2602020RC130).
Author information
Authors and Affiliations
Contributions
Conceptualization, J.Z.; methodology, W.G. and J.Z.; software, W.G.; formal analysis, W.G. and J.Z.; writing and editing, W.G. and J.Z. All authors have read and agreed to the published version of the manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors declare that there are no competing interests associated with this research.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Gao, W., Zhao, J. Prediction of geothermal temperature field by multiattribute neural network. Geotherm Energy 12, 22 (2024). https://doi.org/10.1186/s40517-024-00300-x
Received:
Accepted:
Published:
DOI: https://doi.org/10.1186/s40517-024-00300-x