Neural Network Papers

  1. B. Hassibi and T. Kailath, H-infinity optimal training algorithms and their relation to backpropagation, in Advances in Neural Information Processing Systems, Vol 7, G. Tesauro, D.S. Touretzky and T.K. Leen, Eds., pp. 191-199, MIT Press, Apr 1995.

  2. B. Hassibi, A.H. Sayed and T. Kailath, H-infinity optimality criteria for LMS and backpropagation, in Advances in Neural Information Processing Systems, Vol 6, J.D. Cowan, G. Tesauro and J. Alspector, Eds., pp. 351-359, Morgan Kaufmann, Apr 1994.

  3. B. Hassibi, D.G. Stork, G. Wolff and T. Watanabe, Optimal brain surgeon: Extensions, streamlining and performance comparisons, in Advances in Neural Information Processing Systems, Vol 6, J.D. Cowan, G. Tesauro and J. Alspector, Eds., pp. 263-271, Morgan Kaufmann, Apr 1994.

  4. B. Hassibi, A.H. Sayed and T. Kailath, LMS and backpropagation are minimax filters, in Theoretical Advances in Neural Computation and Learning, V. Roychowdhury, K.Y. Siu and A. Orlitsky, Eds., pp. 424-449, Kluwer, 1994.

  5. B. Hassibi and D.G. Stork, Second order derivatives for network pruning: Optimal brain surgeon, in Advances in Neural Information Processing Systems, Vol 5, S.J. Hanson, J.D. Cowan and C.L. Giles, Eds., pp. 164-172, Morgan Kaufmann, Apr 1993.

  6. B. Hassibi, D.G. Stork and G. Wolff, Optimal brain surgeon and general network pruning, in Proceedings of the 1993 IEEE International Conference on Neural Networks, pp. 293-300, San Francisco, CA, Apr 1993.