In this paper, we propose a new learning paradigm for neural networks and apply it to the subspace decomposition problem of principal component analysis. In the proposed network, each neuron learns about its environment through a process of Hebbian-based self-regulation, in which the neuron actively controls its own learning by assessing its contribution to the overall learning effectiveness. Based on this concept of self-regulation, we derive the primary learning rules for synaptic adaptation in the network. The Hebbian-based self-regulative neural network is used to extract significant features of the environmental data in an unsupervised manner and to perform subspace decomposition of the data space. Numerical simulations demonstrate the efficiency of the learning model and verify the practicality of individual neuronal self-regulation as a means of learning control.
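For orientation, the sketch below illustrates how Hebbian learning can extract a principal subspace in an unsupervised way, using Oja's classical subspace rule as a generic baseline. It is not the self-regulative rule derived in this paper; the function name, step size, and iteration counts are illustrative assumptions.

```python
import numpy as np

def oja_subspace(X, k, eta=0.01, epochs=50, seed=0):
    """Hebbian subspace learning via Oja's subspace rule (generic baseline,
    not the paper's self-regulative rule). X: (n_samples, d), zero-mean."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = 0.1 * rng.standard_normal((d, k))        # synaptic weights: d inputs -> k neurons
    for _ in range(epochs):
        for x in X:
            y = W.T @ x                           # neuron outputs
            # Hebbian term x y^T with a decay that keeps W near-orthonormal
            W += eta * (np.outer(x, y) - W @ np.outer(y, y))
    return W

# Usage: recover the 2-D principal subspace of synthetic zero-mean data
rng = np.random.default_rng(1)
X = rng.standard_normal((1000, 5)) @ np.diag([3.0, 2.0, 0.5, 0.3, 0.1])
W = oja_subspace(X - X.mean(axis=0), k=2)         # columns span the leading subspace
```

The rule converges to an orthonormal basis of the leading principal subspace rather than the individual eigenvectors, which is the subspace decomposition setting the abstract refers to.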