Model-based design of experiments (MBDoE) is widely used for the efficient development of mathematical models, which can then serve a variety of applications in real-world systems. The conventional optimality criteria for MBDoE can suffer from ill-conditioning of the design matrix, a situation easily encountered in practical systems. To alleviate this problem, this work proposes an alternative optimality criterion whose formulation is based on the mean squared error of biased estimators obtained by parameter subset selection. The formulation is applied to subset selection both by ranking and by transformation. Using an illustrative linear example, the performance of the proposed criterion is compared with three conventional criteria: A-, D-, and E-optimality. The case study shows that the proposed criterion can outperform the conventional ones in all cases considered, generating linear models with smaller prediction errors, and that it performs best when combined with subset selection by transformation. (C) 2022 Elsevier Ltd. All rights reserved.
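As background for the comparison in the abstract, the three conventional alphabetic criteria can be sketched for a linear model with design matrix X. This is a minimal illustrative sketch, not code from the paper: the function name and the example matrices are assumptions, and the noise variance is taken as unity. The near-collinear design also shows the ill-conditioning the abstract refers to.

```python
import numpy as np

def optimality_criteria(X):
    """Evaluate conventional optimality criteria for a linear-model design
    matrix X (rows = experiments, columns = parameters). Illustrative only;
    names and signature are not from the paper."""
    fim = X.T @ X                         # Fisher information matrix (unit noise variance)
    a_opt = np.trace(np.linalg.inv(fim))  # A-optimality: minimize trace of parameter covariance
    d_opt = np.linalg.det(fim)            # D-optimality: maximize determinant of the FIM
    e_opt = np.linalg.eigvalsh(fim).min() # E-optimality: maximize smallest FIM eigenvalue
    return a_opt, d_opt, e_opt

# A well-conditioned design versus a nearly collinear (ill-conditioned) one:
# for the latter the smallest eigenvalue collapses and the covariance trace
# blows up, degrading all three conventional criteria.
X_good = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
X_bad  = np.array([[1.0, 1.0], [1.0, 1.001], [1.0, 0.999]])
print(optimality_criteria(X_good))
print(optimality_criteria(X_bad))
```

Comparing the two printed tuples shows the ill-conditioned design yielding a far larger A-criterion value and a near-zero smallest eigenvalue, which is the failure mode motivating the proposed mean-squared-error-based criterion.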