The recognition of daily actions, such as walking, sitting, or standing, in the home is informative for assisted living, smart homes, and general health care. A variety of actions in complex scenes can be recognised using visual information; however, cameras raise privacy concerns. In this paper, we present a home action recognition system using an 8×8 infrared sensor array. This low spatial resolution preserves the user's visual privacy while still providing a powerful representation of the actions in a scene. Actions are recognised using a 3D convolutional neural network, which extracts not only spatial but also temporal information from video sequences. Experimental results on the publicly available Infra-ADL2018 dataset demonstrate that the proposed approach outperforms the state of the art. We show that the sensor is well suited to detecting both the occurrence of falls and actions of daily living. Our method achieves an overall accuracy of 97.22% across 7 actions, with a fall detection sensitivity of 100% and specificity of 99.31%.
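To make the spatiotemporal idea concrete, the following is a minimal sketch of a 3D CNN operating on clips from an 8×8 infrared array, written with PyTorch. The layer widths, clip length (40 frames), and the 7-class output head are illustrative assumptions for a sketch, not the exact architecture or hyperparameters used in the paper.

```python
# Minimal 3D CNN sketch for 8x8 infrared action clips (assumed architecture).
import torch
import torch.nn as nn


class ThermalAction3DCNN(nn.Module):
    def __init__(self, num_classes: int = 7, clip_len: int = 40):
        super().__init__()
        # 3D convolutions jointly capture spatial (8x8) and temporal structure.
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),   # -> (16, T, 8, 8)
            nn.ReLU(inplace=True),
            nn.MaxPool3d(kernel_size=2),                   # -> (16, T/2, 4, 4)
            nn.Conv3d(16, 32, kernel_size=3, padding=1),   # -> (32, T/2, 4, 4)
            nn.ReLU(inplace=True),
            nn.MaxPool3d(kernel_size=2),                   # -> (32, T/4, 2, 2)
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (clip_len // 4) * 2 * 2, 128),
            nn.ReLU(inplace=True),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, clip_len, 8, 8) thermal frames
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = ThermalAction3DCNN()
    clip = torch.randn(2, 1, 40, 8, 8)  # two dummy clips of 40 frames each
    logits = model(clip)
    print(logits.shape)                  # torch.Size([2, 7])
```

The key design point the sketch illustrates is that, even at 8×8 resolution, stacking frames along a temporal axis lets 3D convolutions learn motion cues (e.g. the sudden vertical displacement of a fall) that a per-frame 2D classifier would miss.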