Abstract
The prediction of unsteady aerodynamic loads is a central problem in the design of turbomachinery. Over the last 20 years, harmonic balance methods have proven highly efficient for this task. A CPU-cost-optimal setup of a harmonic balance simulation, however, requires knowledge of the relevant harmonics. For a single blade row subject to a periodic disturbance, this question amounts to the classical problem of harmonic convergence, which arises solely from the nonlinearity of the unsteady flow physics. For multi-stage configurations, in contrast, the choice of harmonics is further complicated by the fact that the interaction of disturbances with blade rows may give rise to a vast spectrum of harmonics with potentially important modal content, e.g., Tyler–Sofrin modes. The aim of this paper is to show that the mixing entropy attributed to the circumferential modes of a given harmonic can serve as a disturbance metric from which a criterion can be derived as to whether a certain harmonic should be included. The idea is based on the observation that the entropy due to the temporal and circumferential mixing of the flow at a blade row interface can be decomposed, up to third-order terms, into independent contributions from different frequencies and mode orders. For a given harmonic balance (and steady) flow result, the mixing entropy attributed to modes that are simply mixed out, rather than resolved in the neighboring row, is shown to be a natural indicator of potential inaccuracy. We present important features of the mixing entropy for unsteady disturbances, in particular its close relationship to sound power for acoustic modes. The problem of mode selection in a 1.5-stage compressor configuration serves as a practical example to illustrate our findings.