Different kinds of Multilayer Perceptrons, trained with a back-propagation learning algorithm, have been used to perform data compression tasks. Depending on the architecture and on the type of problem the network is trained to solve (classification or auto-association), the networks provide different kinds of dimensionality reduction, each preserving different properties of the data space. Some experiments show that exploiting the non-linearities of the MLP units can improve on the performance of classical linear dimensionality reduction. All the experiments reported here were carried out on speech data.
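The auto-associative variant can be illustrated with a minimal sketch (not the paper's exact architecture or data): a one-hidden-layer MLP with a narrow tanh bottleneck is trained by back-propagation to reproduce its own input, so the bottleneck activations serve as a nonlinear low-dimensional code. The layer sizes, learning rate, and synthetic data below are illustrative assumptions.

```python
import numpy as np

# Toy auto-associative MLP: 8-D inputs compressed to a 2-D nonlinear code.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))          # synthetic data standing in for speech frames

n_in, n_hid = 8, 2                     # bottleneck size = compressed dimension
W1 = rng.normal(scale=0.1, size=(n_in, n_hid))
b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.1, size=(n_hid, n_in))
b2 = np.zeros(n_in)

lr = 0.05
losses = []
for epoch in range(500):
    H = np.tanh(X @ W1 + b1)           # forward pass: nonlinear code
    Y = H @ W2 + b2                    # linear reconstruction of the input
    err = Y - X
    losses.append(float(np.mean(err ** 2)))
    # back-propagation of the mean-squared reconstruction error
    dY = 2 * err / X.shape[0]
    dW2 = H.T @ dY
    db2 = dY.sum(axis=0)
    dH = dY @ W2.T * (1 - H ** 2)      # tanh derivative
    dW1 = X.T @ dH
    db1 = dH.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

codes = np.tanh(X @ W1 + b1)           # 2-D compressed representation of the data
```

With a purely linear hidden layer this network would recover at best the span of the leading principal components; the tanh units are what allow the code to depart from a linear projection.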
Bibliographic reference. Blanchet, Pascal (1989): "Multilayer perceptron architectures for data compression tasks", In EUROSPEECH-1989, 1329-1332.