The authors examine the possibility of improving the performance of discrete-synapse neural networks, functioning as content-addressable memories, by including noise in the training procedure, and study the effects on the training itself. Pattern stability-field distributions for optimized networks are illustrated for various levels of training noise, including the noiseless, maximally stable regime. They show that the clipped Hebb rule is optimal in the limit of high training noise, but that simulated annealing cannot be relied upon to identify a well-defined optimal network for an arbitrary finite training-noise level, in contrast to the case of continuous-synapse systems. Training by means of a continuous-synapse network whose synapses are subsequently clipped is also addressed.
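
For illustration only, the sketch below shows one way the clipped Hebb rule and a training-noise level might be realized numerically; it is not the authors' procedure, and the network size N, pattern number P, bit-flip probability `noise`, and sample count `samples` are arbitrary illustrative choices. Synapses are accumulated from noisy copies of the stored patterns and then clipped to +/-1, after which the stability fields of the noise-free patterns are inspected.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 100, 10     # neurons and stored patterns (illustrative sizes)
noise = 0.1        # training-noise level: probability of flipping a bit
samples = 50       # noisy copies of each pattern used during training

# Random +/-1 patterns to be stored.
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings accumulated from noisy versions of each pattern.
J = np.zeros((N, N))
for xi in patterns:
    for _ in range(samples):
        flips = rng.random(N) < noise           # flip each bit with prob `noise`
        noisy = xi * np.where(flips, -1, 1)
        J += np.outer(noisy, noisy)
np.fill_diagonal(J, 0)

# Clipped Hebb rule: retain only the sign of each synapse.
J_clipped = np.sign(J)

# Stability fields of the stored (noise-free) patterns under the clipped couplings:
# h_i^mu = xi_i^mu * sum_j J_ij xi_j^mu / sqrt(N); a bit is stable if its field > 0.
fields = patterns * (patterns @ J_clipped.T) / np.sqrt(N)
print("fraction of stable bits:", np.mean(fields > 0))
```

The histogram of `fields` would correspond, loosely, to the kind of stability-field distribution discussed in the abstract; varying `noise` changes that distribution, with the clipped Hebb construction expected to dominate as the noise becomes large.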