Summary: With the rise of modern data science, data-driven turbulence modeling aided by machine learning algorithms is becoming a promising new field. Many approaches achieve better Reynolds stress predictions, with much lower modeling error (ϵM), than traditional Reynolds-averaged Navier-Stokes (RANS) models, yet they still suffer from numerical error and stability issues when the mean velocity field is estimated by solving the RANS equations with the predicted Reynolds stresses. This fact illustrates that the error of solving the RANS equations (ϵP) is also very important for a RANS simulation. In the present work, the error ϵP is studied separately by using Reynolds stresses obtained from direct numerical simulation (DNS)/highly resolved large-eddy simulation to minimize the modeling error ϵM, and the sources of ϵP are derived mathematically. For implementations where only the Reynolds stresses are known, we suggest running an auxiliary RANS simulation to make a first guess of νt* and Sij0. With around 10 iterations, the error of the streamwise velocity component can be reduced by about one order of magnitude in flow over periodic hills. The aim of the present work is not to develop a new RANS model, but to clarify that obtaining the mean field from known Reynolds stresses is nontrivial and that the nonlinear part of the Reynolds stresses is very important in flow problems with separation. The proposed approach to reducing ϵP may be very useful for a posteriori applications of data-driven turbulence models.
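As a hedged sketch of the idea behind the auxiliary simulation (the notation is inferred from the symbols νt* and Sij0 mentioned above and is not taken verbatim from the paper's equations), the known Reynolds stress can be split into a linear eddy-viscosity part, treated implicitly with the first-guess νt*, and a nonlinear remainder, treated explicitly:

```latex
% Assumed decomposition of the known (DNS/LES) Reynolds stress tensor
% \tau_{ij}: a linear eddy-viscosity part using the first-guess
% \nu_t^* and S_{ij}^0 from an auxiliary RANS run, plus an explicit
% nonlinear remainder \tau_{ij}^\perp.
\tau_{ij}
  = \underbrace{2\,\nu_t^{*}\,S_{ij}^{0}}_{\text{implicit, linear part}}
  + \underbrace{\bigl(\tau_{ij} - 2\,\nu_t^{*}\,S_{ij}^{0}\bigr)}_{\tau_{ij}^{\perp}\text{, explicit nonlinear part}}
```

Under this reading, the RANS equations are solved repeatedly with the implicit term stabilizing the iteration while the explicit remainder carries the nonlinear information; the roughly 10 iterations quoted above would then correspond to updating the mean strain rate until the velocity field converges.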