This Master's thesis deals with the complete understanding and implementation of a three-layer backpropagation neural network whose synaptic weights are updated on a per-sample basis (so-called on-line updating). The aim is to create such a network for general-purpose applications, with a high degree of freedom in choosing the inner structure of the network. The algorithms used all belong to the class of supervised learning, i.e. they are guided by a desired signal. The theory is treated thoroughly for the steepest-descent algorithm and for additional features that can be employed to improve the network's generalization and learning rate. Empirical results are presented, and comparisons with purely linear algorithms are made for a signal processing application: speech enhancement.
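To make the core idea concrete, the following is a minimal sketch (not the thesis implementation) of on-line, per-sample steepest-descent training for a three-layer network. The sigmoid activation, layer sizes, learning rate, and the XOR example are illustrative assumptions; only the per-sample weight-update scheme itself is taken from the text above.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_online(X, D, n_hidden=4, lr=0.5, epochs=2000, seed=0):
    """On-line (per-sample) steepest-descent backprop for a 3-layer net.

    Illustrative sketch: architecture and hyperparameters are assumptions,
    not the configuration used in the thesis.
    """
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], D.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))   # input -> hidden
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))  # hidden -> output
    b2 = np.zeros(n_out)
    for _ in range(epochs):
        for x, d in zip(X, D):          # weights updated after EVERY sample
            h = sigmoid(x @ W1 + b1)    # hidden-layer activations
            y = sigmoid(h @ W2 + b2)    # network output
            delta2 = (y - d) * y * (1 - y)          # output-layer error term
            delta1 = (delta2 @ W2.T) * h * (1 - h)  # error back-propagated to hidden layer
            W2 -= lr * np.outer(h, delta2); b2 -= lr * delta2
            W1 -= lr * np.outer(x, delta1); b1 -= lr * delta1
    return W1, b1, W2, b2

# Illustrative usage: XOR, a classic nonlinear task for a 3-layer network
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
D = np.array([[0], [1], [1], [0]], dtype=float)
W1, b1, W2, b2 = train_online(X, D)
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
```

The point of the per-sample (on-line) scheme, as opposed to batch updating, is that the gradient step is taken immediately after each training example, which is the update mode the thesis adopts.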