A neural network ensemble is a learning paradigm in which several neural networks are jointly used to solve a problem. This paper analyzes the relationship between an ensemble and its component neural networks, from the viewpoint of both regression and classification, and reveals that it may be better to ensemble many of the available networks rather than all of them. This result is surprising because, at present, most approaches ensemble all of the networks at hand. To show that the appropriate neural networks for constituting an ensemble can be effectively selected from a set of available networks, an approach named GASEN is presented. GASEN first trains a number of neural networks. It then assigns random weights to those networks and employs a genetic algorithm to evolve the weights, so that they characterize, to some extent, the fitness of each network for inclusion in the ensemble. Finally, it selects some of the networks, based on the evolved weights, to make up the ensemble. A large empirical study shows that, compared with typical ensemble approaches such as Bagging, GASEN can generate neural network ensembles with far smaller sizes but stronger generalization ability. In addition, to understand the working mechanism of GASEN, this paper analyzes its behavior with the bias-variance decomposition of the error, which shows that the success of GASEN may lie in its ability to significantly reduce both bias and variance.
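The selection step described above (evolve per-network weights with a genetic algorithm, then keep only the networks whose weight is high enough) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the GA operators (averaging crossover, Gaussian mutation, elitist survival), the population and generation sizes, and the selection threshold of 1/N are all assumed for the sketch, and the component networks are represented only by their precomputed predictions on a validation set.

```python
import random

def ensemble_error(weights, preds, targets):
    """Mean squared error of the weighted-average ensemble on validation data.
    preds[i] is the prediction list of the i-th component network."""
    total = sum(weights)
    n = len(targets)
    err = 0.0
    for i in range(n):
        out = sum(w * p[i] for w, p in zip(weights, preds)) / total
        err += (out - targets[i]) ** 2
    return err / n

def gasen_select(preds, targets, pop_size=30, gens=50, threshold=None):
    """Evolve ensemble weights with a simple GA, then keep the networks
    whose normalized weight exceeds the threshold (default 1/N)."""
    n_nets = len(preds)
    if threshold is None:
        threshold = 1.0 / n_nets          # assumed default, as in the sketch
    rng = random.Random(0)                # fixed seed for reproducibility
    # initial population of random weight vectors
    pop = [[rng.random() for _ in range(n_nets)] for _ in range(pop_size)]
    for _ in range(gens):
        # fitness = validation error of the weighted ensemble (lower is better)
        pop.sort(key=lambda w: ensemble_error(w, preds, targets))
        survivors = pop[: pop_size // 2]  # elitist survival
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]          # crossover
            k = rng.randrange(n_nets)
            child[k] = min(1.0, max(0.0, child[k] + rng.gauss(0, 0.1)))  # mutation
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda w: ensemble_error(w, preds, targets))
    total = sum(best)
    norm = [w / total for w in best]
    # keep the networks whose evolved weight stands out
    return [i for i, w in enumerate(norm) if w > threshold]
```

For example, given two accurate component networks and one poor one, the evolved weights should concentrate on the accurate pair, so the poor network is excluded from the ensemble: `gasen_select([[1.0, 2.0, 3.0], [1.1, 2.1, 2.9], [5.0, 0.0, 9.0]], [1.0, 2.0, 3.0])` keeps a subset that omits network 2.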