Machine learning is a field inspired by how humans, and by extension the brain, learn. The brain consists of a biological neural network whose neurons are either active or inactive. Modern artificial intelligence is loosely based on how biological neural networks function. This paper investigates whether a multilayer perceptron that distinguishes between active and inactive neurons can reduce the number of active neurons during the forward and backward passes while maintaining accuracy. This is done by implementing a multilayer perceptron in a Python environment and building a neuron activation algorithm on top of it. Results show that it is possible to reduce the number of active neurons by around 30% with a negligible impact on test accuracy. Future work includes algorithmic improvements and further testing of whether the total number of mathematical operations can also be reduced in other neural network architectures with a larger computational overhead.
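To make the idea of active/inactive neurons concrete, the following is a minimal sketch of one way such a scheme could look in a NumPy-based MLP forward pass. It is an illustrative assumption, not the paper's actual algorithm: here only the top fraction of hidden neurons (by activation magnitude) are kept "active" for each input, and the rest are zeroed out, so they contribute no operations to the next layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def masked_forward(x, W1, b1, W2, b2, keep_fraction=0.7):
    """Forward pass of a small MLP in which only the hidden neurons with
    the largest activations stay active; the rest are treated as inactive
    (zeroed out) for this pass. Illustrative sketch, not the paper's method."""
    z = x @ W1 + b1                        # hidden pre-activations
    h = np.maximum(z, 0.0)                 # ReLU
    k = max(1, int(keep_fraction * h.shape[-1]))
    # Per-sample threshold: the k-th largest hidden activation.
    threshold = np.sort(h, axis=-1)[..., -k][..., None]
    # A neuron is active only if it clears the threshold and fired at all.
    mask = (h >= threshold) & (h > 0.0)
    h_sparse = h * mask
    return h_sparse @ W2 + b2, mask

# Toy dimensions: 8 inputs, 20 hidden neurons, 3 outputs (hypothetical sizes).
x = rng.normal(size=(4, 8))
W1, b1 = rng.normal(size=(8, 20)), np.zeros(20)
W2, b2 = rng.normal(size=(20, 3)), np.zeros(3)

out, mask = masked_forward(x, W1, b1, W2, b2, keep_fraction=0.7)
print(out.shape, mask.mean())  # at most ~70% of hidden neurons active
```

In this sketch, roughly 30% of the hidden neurons are forced inactive on every pass, mirroring the reduction in active neurons reported above; a real implementation would also have to skip the corresponding operations in the backward pass to realize the savings.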