Abstract
In the drive for lower fuel consumption through increased bypass ratio and increased overall pressure ratio (OPR), engine designs for the next generation of single-aisle aircraft will require core sizes below 3 lb/s and OPRs above 50. Traditionally, these core sizes are the domain of centrifugal compressors, but material limits keep the achievable pressure ratio in these machines well below 50. An all-axial high-pressure compressor (HPC) at this core size, however, comes with limitations associated with the small blade spans at the back of the HPC, because clearances, fillets, and leading edges do not scale with the core size. The result is a substantial efficiency penalty, driven primarily by the tip leakage flow produced by the larger clearance-to-span ratio, which negates the cycle efficiency benefits of the high OPR. To enable small-core, high-OPR, all-axial compressors, mitigating technologies must be developed and implemented to reduce this large clearance-to-span efficiency penalty. For that development to succeed, it is imperative that predictive design tools accurately model the overall flow physics and the trends produced by these technologies. In this paper, we describe an effort to determine whether different modeling standards are required at large clearance-to-span ratios and, if so, to identify criteria for an appropriate solver and/or mesh. Multiple models are run, and the results are compared with data collected in the NASA Glenn Research Center’s low-speed axial compressor. These comparisons show that steady Reynolds-averaged Navier–Stokes (RANS) solvers can predict the pressure-rise characteristic to an acceptable level of accuracy if careful attention is paid to mesh topology in the tip region. However, unsteady tools are necessary to accurately capture radial profiles of blockage and total pressure. A Delayed Detached-Eddy Simulation (DDES) model was also used to run this geometry, but near stall it did not resolve any flow features beyond those captured by the unsteady RANS simulation.