Exoskeletons have decreased physical effort and increased comfort in activities of daily living (ADL) such as walking, squatting, and running. However, this assistance is often activity-specific and does not accommodate a wide variety of different activities. To overcome this limitation and increase the scope of exoskeleton application, an automatic human activity recognition (HAR) system is necessary. We developed two deep-learning models for HAR: a one-dimensional convolutional neural network (CNN) and a hybrid model combining CNNs with long short-term memory (LSTM). We trained both models on data collected from a single three-axis accelerometer placed on the chest of ten subjects. We classified five activities, standing, walking on level ground, walking on an incline, running, and squatting, with accuracies of 98.1% for the CNN and 97.8% for the hybrid model. A two-subject real-time trial was also conducted to validate the real-time applicability of the system; real-time accuracy was 96.6% for the CNN and 97.2% for the hybrid model. The high classification accuracy in both the test and real-time evaluations suggests that a single sensor, paired with machine-learning-based models, can distinguish human activities.
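To illustrate the kind of pipeline the abstract describes, the following is a minimal NumPy sketch of a 1D-CNN forward pass over one window of three-axis accelerometer data. The window length, filter count, and kernel size are illustrative assumptions, not the paper's actual hyperparameters, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

WINDOW = 128   # accelerometer samples per window (assumed)
CHANNELS = 3   # x, y, z axes of the single chest accelerometer
FILTERS = 16   # number of 1D convolution filters (assumed)
KERNEL = 5     # convolution kernel width (assumed)
CLASSES = 5    # stand, level walk, incline walk, run, squat

def conv1d(x, w, b):
    """Valid 1D convolution with ReLU: x (T, C), w (K, C, F), b (F) -> (T-K+1, F)."""
    T, C = x.shape
    K, _, F = w.shape
    out = np.empty((T - K + 1, F))
    for t in range(T - K + 1):
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(window, params):
    """One window of accelerometer data -> probabilities over the 5 activities."""
    w1, b1, w2, b2 = params
    h = conv1d(window, w1, b1)        # temporal features: (T-K+1, FILTERS)
    pooled = h.mean(axis=0)           # global average pooling: (FILTERS,)
    return softmax(pooled @ w2 + b2)  # class probabilities: (CLASSES,)

# Randomly initialized (untrained) parameters, for shape illustration only.
params = (
    rng.normal(0, 0.1, (KERNEL, CHANNELS, FILTERS)),
    np.zeros(FILTERS),
    rng.normal(0, 0.1, (FILTERS, CLASSES)),
    np.zeros(CLASSES),
)

window = rng.normal(0, 1, (WINDOW, CHANNELS))  # one synthetic sensor window
probs = forward(window, params)
print(probs.shape)  # (5,)
```

In the hybrid variant described in the abstract, the pooled convolutional features would instead be fed sequentially into an LSTM layer before the final classifier; the convolutional front end shown here is common to both models.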