Negative Correlation Learning (NCL) and Mixture of Experts (ME), two popular combining methods, each employ a special error function for the simultaneous training of neural network (NN) experts, encouraging the experts to be negatively correlated. In this paper, we review the properties of the NCL and ME methods, discussing their advantages and disadvantages. Our characterization shows that the two methods have different but complementary features, so a hybrid system that includes features of both NCL and ME may outperform either constituent approach. In this study, we propose such a hybrid, the Mixture of Negatively Correlated Experts (MNCE). In this approach, the control parameter of NCL is incorporated into the error function of ME, which enables the ME training algorithm to strike a better balance in the bias-variance-covariance trade-off. The proposed hybrid ensemble method, MNCE, is compared with its constituent methods, ME and NCL, on several benchmark problems. The experimental results show that the proposed method preserves the advantages and alleviates the disadvantages of its constituent approaches, offering significantly improved performance over the original methods.
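For reference, a minimal sketch of the two error functions involved, assuming the standard formulations from the literature (Liu and Yao's NCL penalty and the Gaussian ME objective of Jacobs et al.; the notation $F_i$, $\bar{F}$, $d$, $g_i$, $o_i$ is introduced here for illustration). For an ensemble of $M$ experts, NCL trains expert $i$ to minimize
\[
e_i \;=\; \frac{1}{2}\bigl(F_i - d\bigr)^2 \;+\; \lambda\, p_i,
\qquad
p_i \;=\; \bigl(F_i - \bar{F}\bigr)\sum_{j \neq i}\bigl(F_j - \bar{F}\bigr),
\]
where $d$ is the target, $\bar{F} = \frac{1}{M}\sum_{i=1}^{M} F_i$ is the ensemble output, and $\lambda \in [0,1]$ is the control parameter that sets the strength of the decorrelation penalty. ME, in turn, trains the experts and a gating network jointly on the negative log-likelihood
\[
E \;=\; -\ln \sum_{i=1}^{M} g_i \exp\!\Bigl(-\tfrac{1}{2}\lVert d - o_i \rVert^2\Bigr),
\]
with expert outputs $o_i$ and gate activations $g_i$. As stated above, MNCE augments the ME objective with the $\lambda$-controlled NCL penalty; the precise combined objective is developed in the body of the paper.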