Journal Articles - Mathematics and Statistics - 2022
Recent Submissions
-
An Iterative Method for Solving the Multiple-Sets Split Variational Inequality Problem (2022). In this work, we introduce a new algorithm for finding the minimum-norm solution of the multiple-sets split variational inequality problem in real Hilbert spaces. The strong convergence of the iterative sequence generated by the algorithm is established under the condition that the mappings are monotone and Lipschitz continuous. We apply our main result to study the minimum-norm solution of the multiple-sets split feasibility problem and the split variational inequality problem. Finally, a numerical example is given to illustrate the proposed algorithm.
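For context, one standard formulation of this problem class (following the split variational inequality framework of Censor, Gibali, and Reich) can be sketched as follows; the symbols \(C_i\), \(Q_j\), \(A\), \(f\), and \(g\) are generic placeholders and not necessarily the paper's notation:

```latex
\text{Find } x^* \in C := \bigcap_{i=1}^{N} C_i \quad \text{such that} \quad
\langle f(x^*),\, x - x^* \rangle \ge 0 \quad \forall x \in C,
\]
and, for a bounded linear operator $A$ between the two Hilbert spaces,
\[
u^* := A x^* \in Q := \bigcap_{j=1}^{M} Q_j \quad \text{with} \quad
\langle g(u^*),\, u - u^* \rangle \ge 0 \quad \forall u \in Q.
```

The minimum-norm solution sought by the paper is then the solution \(x^*\) of smallest Hilbert-space norm among all points satisfying both conditions.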
-
PublicationA Self-Adaptive Step Size Algorithm for Solving Variational Inequalities with the Split Feasibility Problem with Multiple Output Sets Constraints( 2022)In this paper, we investigate the problem of solving strongly monotone variational inequality problems over the solution set of the split feasibility problem with multiple output sets in real Hilbert spaces. The strong convergence of the proposed algorithm is proved without knowing any information of the Lipschitz and strongly monotone constants of the mapping. In addition, the implementation of the algorithm does not require the computation or estimation of the norms of the given bounded linear operators. Special cases are considered. Finally, a numerical experiment has been carried out to illustrate the proposed algorithm.
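The key selling point above is a step size that adapts itself without Lipschitz or strong-monotonicity constants. A minimal sketch of that idea, not the paper's algorithm, is a projected-gradient iteration whose step is shrunk from the locally observed behaviour of the mapping; the update rule and the toy problem below are illustrative assumptions:

```python
import math

def solve_vi_adaptive(F, project, x0, lam0=1.0, mu=0.5, tol=1e-10, max_iter=10000):
    """Projected-gradient iteration for a variational inequality VI(F, C).
    The step size is self-adaptive: it is reduced using the locally observed
    variation of F, so no Lipschitz constant is needed in advance."""
    x, lam = list(x0), lam0
    for _ in range(max_iter):
        y = project([xi - lam * fi for xi, fi in zip(x, F(x))])
        num = math.dist(x, y)
        den = math.dist(F(x), F(y))
        # Self-adaptive rule: lam <- min(lam, mu * ||x - y|| / ||F(x) - F(y)||)
        if den > 0:
            lam = min(lam, mu * num / den)
        if num < tol:          # fixed point of the projected step: VI solved
            return y
        x = y
    return x

# Toy problem: F(x) = x - b is 1-strongly monotone and 1-Lipschitz; the VI
# over the unit ball is solved by the projection of b onto the ball.
b = [3.0, 4.0]
def F(x): return [xi - bi for xi, bi in zip(x, b)]
def proj_ball(x):
    n = math.hypot(*x)
    return list(x) if n <= 1 else [xi / n for xi in x]

sol = solve_vi_adaptive(F, proj_ball, [0.0, 0.0])  # approx. [0.6, 0.8]
```

The decreasing step sequence trades some speed for robustness: convergence holds even though no global constants of `F` are ever supplied.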
-
Stability of fractional order of time nonlinear fractional diffusion equation with Riemann–Liouville derivative (2022). In this paper, we investigate a nonlinear fractional diffusion equation with the Riemann–Liouville derivative. First, we establish the global existence and uniqueness of the mild solution. Next, under some assumptions on the input data, we discuss the continuity of the solution with respect to the fractional order of the time derivative. Our key idea is to combine the theory of Mittag–Leffler functions with the Banach fixed-point theorem. Finally, we present some examples to illustrate the proposed theory.
-
Deep learned one‐iteration nonlinear solver for solid mechanics (2022). This paper proposes a novel one-iteration nonlinear solver (OINS) combining time-series prediction with the modified Riks method (M-R). OINS rests on the following core ideas: (1) we predict the load-factor increment, the displacement-vector increment, and the convergent solution of the current load step using predictive networks trained, via the group method of data handling (GMDH), on the load-factor and displacement-vector increments of previously converged steps; (2) because the predicted convergent solution of the load step is very close to, or identical with, the true one, the prediction phase used in existing nonlinear solvers is eliminated completely in OINS.
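As a crude stand-in for the prediction step (the paper's GMDH networks are far more elaborate), the next load-factor increment can be extrapolated from the increments of previously converged steps; a least-squares linear trend over the history is the simplest illustrative choice, and the function below is a hypothetical helper, not part of OINS:

```python
def predict_next_increment(history):
    """Predict the next increment by fitting a linear trend (ordinary least
    squares) to the increments of previously converged load steps.
    Illustrative only: OINS uses GMDH predictive networks instead."""
    n = len(history)
    if n < 2:
        return history[-1]            # not enough data: reuse the last increment
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope * n + intercept      # extrapolate the trend to step n

# A perfectly linear history extrapolates exactly.
print(round(predict_next_increment([0.10, 0.12, 0.14, 0.16]), 4))  # 0.18
```

A good predictor is what makes the "one iteration" claim plausible: the closer the predicted state is to the converged one, the less correction the M-R step must do.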
-
Adjusting Parameters in Optimize Function PSO (2022). Particle Swarm Optimization (PSO) is a population-based evolutionary algorithm introduced in 1995 by two American scientists, sociologist James Kennedy and electrical engineer Russell Eberhart. This thesis deals mainly with the PSO optimization algorithm and with methods for adaptively adjusting its parameters. It also presents the basics of PSO, from its history to the two basic PSO algorithms and several improved variants, including velocity limits, inertia weights, and constriction coefficients. These improvements aim to raise the quality of PSO solutions and to speed up its convergence. After presenting the basics of the PSO algorithm, the thesis focuses on studying how parameter adjustment influences the convergence of PSO algorithms. PSO algorithms with adaptively adjusted parameters are applied to real function optimization problems, and the results, compared with the basic PSO algorithm, show that adaptive parameter adjustment improves the efficiency of PSO in finding optimal solutions.
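Two of the classic improvements mentioned above, the inertia weight and the velocity limit, can be sketched in a minimal PSO loop. This is a generic textbook PSO, not the thesis's adaptive-parameter scheme, and the parameter values are common defaults rather than values from the thesis:

```python
import random

def pso_minimize(f, dim, bounds, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, v_max=None):
    """Minimal PSO with a fixed inertia weight (w) and a velocity limit
    (v_max). c1, c2 are the cognitive and social acceleration coefficients."""
    random.seed(0)                    # deterministic for reproducibility
    lo, hi = bounds
    if v_max is None:
        v_max = 0.2 * (hi - lo)
    # Initialize positions, velocities, and bests
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]             # personal best positions
    p_val = [f(x) for x in X]
    g = P[p_val.index(min(p_val))][:] # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (g[d] - X[i][d]))
                # Velocity limit keeps particles from overshooting
                V[i][d] = max(-v_max, min(v_max, V[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            val = f(X[i])
            if val < p_val[i]:
                P[i], p_val[i] = X[i][:], val
                if val < f(g):
                    g = X[i][:]
    return g, f(g)

# Minimize the 2-D sphere function; the optimum is at the origin.
best, best_val = pso_minimize(lambda x: sum(xi * xi for xi in x),
                              dim=2, bounds=(-5.0, 5.0))
```

Adaptive schemes like those the thesis studies typically replace the fixed `w` with one that decays over iterations, shifting the swarm from exploration to exploitation.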