The neural networks that have exploded in popularity lately are nothing like actual neurons. The "neurons" are usually just entries in a matrix, and the "layers" are just matrix multiplications (plus a simple nonlinearity). "Neuromorphic" systems are still simple compared to the real thing, but they're much more complex than those networks: they try to physically model the interactions between neurons, so each neuron is continuously sending, receiving, and self-tuning. It's nothing like a matrix multiplication.
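To make the contrast concrete, here's a rough sketch: a dense "layer" really is one matrix multiply, while a spiking neuron carries its own state through time. The leaky integrate-and-fire model below is just my pick for illustration (it's one of the simplest spiking models), not something from the linked article, and all the numbers are made up.

```python
import numpy as np

# A "layer" in a typical deep net: one matrix multiply plus a nonlinearity.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))   # weights: 3 inputs -> 4 outputs
x = rng.standard_normal(3)        # input vector
layer_out = np.maximum(W @ x, 0)  # ReLU(W x) -- that's the entire "neuron layer"

# A spiking neuron, by contrast, has state that evolves in time.
# Crude leaky integrate-and-fire sketch (illustrative constants):
v, threshold, leak, dt = 0.0, 1.0, 0.1, 1.0
spikes = []
for t, input_current in enumerate([0.3, 0.4, 0.5, 0.1, 0.6]):
    v += dt * (input_current - leak * v)  # membrane potential leaks and integrates input
    if v >= threshold:                    # fire and reset once threshold is crossed
        spikes.append(t)
        v = 0.0

print(layer_out.shape)  # one shot of linear algebra
print(spikes)           # timing of events, not a vector of activations
```

The point being: the first half is a single linear-algebra op you can batch on a GPU, while the second is a per-neuron dynamical system, which is why it maps onto such different hardware.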
Edit: It also runs efficiently on very different hardware. https://www.nextplatform.com/2017/02/15/memristor-research-h...