c&p from my psychology paper:
In its most basic form, a neural net is a set of units connected by statistically weighted links (Russell & Norvig, 1995, p. 567). The units are mathematical stand-ins for neurons: each one does the actual processing, taking a set of inputs (including its current activation level) and returning an output (including a new activation level). The weight on a link determines how much that particular signal counts in the receiving unit's computation; the result is then passed on to the next unit(s) in the chain until the signal has worked its way through whatever hierarchy the network defines (Russell & Norvig, 1995, chap. 19). The weights are adjusted over a series of training trials, so each unit effectively learns which signals it should pay attention to and which it should filter out. This makes neural networks well suited to modeling attention tasks, in which every neuron has to decide how much computational weight to give each activation potential passing through it.
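To make that a bit more concrete (this part is mine, not from the paper or from Russell & Norvig): here's a minimal Python sketch of a single unit that takes weighted inputs, squashes the sum through a sigmoid, and adjusts its weights over repeated trials with a simple delta-rule-style update. The function names, learning rate, and the AND-with-a-distractor example are all just illustrative choices, not a standard implementation.

```python
import math

def unit_output(inputs, weights, bias):
    """One unit: weighted sum of inputs passed through a sigmoid activation."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-total))

def train_unit(samples, n_inputs, rate=0.5, trials=1000):
    """Adjust the link weights over many trials so the unit learns
    which input signals matter and which to filter out."""
    weights = [0.0] * n_inputs
    bias = 0.0
    for _ in range(trials):
        for inputs, target in samples:
            out = unit_output(inputs, weights, bias)
            error = target - out
            # delta rule: nudge each weight in the direction that reduces the error
            grad = error * out * (1.0 - out)
            weights = [w + rate * grad * x for w, x in zip(weights, inputs)]
            bias += rate * grad
    return weights, bias

# Example: learn logical AND of the first two inputs; the third input is an
# irrelevant "distractor", so its learned weight should end up near zero --
# the unit learns to pay attention to the first two signals and filter out the third.
samples = [
    ([0, 0, 0], 0), ([0, 0, 1], 0),
    ([0, 1, 0], 0), ([0, 1, 1], 0),
    ([1, 0, 0], 0), ([1, 0, 1], 0),
    ([1, 1, 0], 1), ([1, 1, 1], 1),
]
weights, bias = train_unit(samples, n_inputs=3)
print("learned weights:", weights, "bias:", bias)
```

Running it, the first two weights grow large and the distractor's weight stays small, which is the "learning what to attend to" idea from the excerpt in miniature.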
Basically, people like neural nets 'cause they're good for studying/modeling networks with feedback.