Very Large Scale Integration (VLSI):
Very-large-scale integration (VLSI) is the process of creating integrated circuits by combining thousands of transistors into a single chip. VLSI began in the 1970s when complex semiconductor and communication technologies were being developed. The microprocessor is a VLSI device. The term is no longer as common as it once was, as chips have increased in complexity into billions of transistors.
Taken From:
http://en.wikipedia.org/wiki/Very-large-scale_integration
The phrase "neural networks" (NN) has been bounced around for quite a while and is more of a "catch phrase" than a precise reference to any particular semiconductor topology. In general, the ideas behind NN revolve around "adaptive learning" and "recognition". While some studies have focused on discrete hardware approaches, typical modern approaches rely on software and firmware, making them relatively platform independent. In many ways NN are a subset of Artificial Intelligence (AI); that is, NN seek to solve problems intuitively rather than sequentially or algorithmically.
From a strictly hardware point-of-view, probably the best example of what NN might "look like" would be Field Programmable Gate Arrays (FPGA). FPGAs consist of hundreds to hundreds of thousands of "Programmable Logic Elements" (PLE), and in some cases dedicated RAM, microprocessors, multipliers, etc. While these devices are typically used to "prototype" VLSI hardware, they could easily be integrated into a PC either as a "co-processor" or even as the primary CPU. Numerous "Application Specific Integrated Circuits" (ASIC) could be replaced with a single FPGA. For instance, various Digital Signal Processors, cryptographic engines, or advanced math processors could all be synthesized and placed on an on-board FPGA to dramatically decrease the processing time a standard CPU would require for a particular task.
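The "one FPGA replacing several ASICs" idea can be sketched from the host's point of view. This is purely illustrative: the class and method names below are invented, real vendor runtimes work very differently, and the "bitstreams" here are just Python functions standing in for synthesized logic.

```python
# Hypothetical sketch of a host-side view of a reconfigurable co-processor.
# All names are invented for illustration; no real FPGA driver API is implied.

class FakeFpgaCoprocessor:
    """Stand-in for a reconfigurable device: load a 'bitstream'
    (here, just a Python function), then ask the device to run it."""

    def __init__(self):
        self.kernel = None

    def load_bitstream(self, kernel):
        # On real hardware this step would reconfigure the logic fabric.
        self.kernel = kernel

    def run(self, data):
        if self.kernel is None:
            raise RuntimeError("no bitstream loaded")
        return self.kernel(data)


fpga = FakeFpgaCoprocessor()

# "Synthesize" a toy DSP onto the fabric and run it...
fpga.load_bitstream(lambda samples: [2 * s for s in samples])
print(fpga.run([1, 2, 3]))  # [2, 4, 6]

# ...then reconfigure the same device as a toy "crypto engine".
fpga.load_bitstream(lambda text: text[::-1])
print(fpga.run("abc"))  # cba
```

The point is that one physical device serves several roles in turn, which is exactly why a single FPGA could stand in for several fixed-function ASICs.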
On the list of reasons PCs do NOT have FPGAs to accomplish these tasks is the huge programming overhead involved compared to the very small number of users who might benefit from having one available. Even migrating Windows from 16-bit to 32-bit to 64-bit created HUGE log jams in driver development, and compatibility issues keep rearing their ugly heads. Adding an FPGA to motherboards has the potential to create a serious SNAFU: for instance, what if the user wants to run two programs that utilize some of the same resources on the FPGA? Which one wins? Certainly not the user!
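The "which one wins?" problem is a resource-arbitration problem, and a minimal sketch makes it concrete. Everything below is hypothetical: the arbiter, region names, and program names are invented, and no real operating system manages FPGA fabric this way.

```python
# A minimal, hypothetical sketch of arbitrating a shared FPGA region
# between two programs: first come, first served, and the loser is
# simply refused. Nothing here reflects a real driver model.

import threading

class FpgaRegionArbiter:
    """First-come, first-served arbiter for named FPGA regions."""

    def __init__(self):
        self._lock = threading.Lock()
        self._owners = {}  # region name -> owning program

    def request(self, program, region):
        """Return True if 'program' gets exclusive use of 'region'."""
        with self._lock:
            if region in self._owners:
                return False  # someone else already owns it
            self._owners[region] = program
            return True

    def release(self, program, region):
        with self._lock:
            if self._owners.get(region) == program:
                del self._owners[region]


arbiter = FpgaRegionArbiter()
print(arbiter.request("video_app", "dsp_block"))   # True: first in wins
print(arbiter.request("crypto_app", "dsp_block"))  # False: region taken
arbiter.release("video_app", "dsp_block")
print(arbiter.request("crypto_app", "dsp_block"))  # True: now free
```

Even this toy version shows the policy question the text raises: somebody (the OS? a driver? the first program to launch?) has to decide who gets the block, and whoever loses simply doesn't get the acceleration they expected.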
Playing with an FPGA evaluation board is fun and very educational. In almost no time you can load a microcontroller and test your latest firmware changes, then a moment later load a synthesized video card and stream video to it. But as versatile as it is, this functionality comes with an enormous programming overhead that your average user just doesn't need or want. This appears to be the case with most pursuits into NN; that is, while the idea of an adaptive system that is a general solution for many problems seems appealing, in most cases more traditional "firm" approaches to individual problems are more practical.
Fish