Cloud manufacturing is a service-oriented networked manufacturing model that aims to provide manufacturing resources as services in an on-demand manner. Scheduling is one of the key techniques by which cloud manufacturing achieves this aim, and multi-task scheduling with dynamic task arrivals is a critical problem in this setting. Many traditional algorithms, such as the genetic algorithm (GA) and ant colony optimization (ACO), have been applied to the issue but are either incapable of tackling it or perform poorly. Deep reinforcement learning (DRL), the combination of deep learning (DL) and reinforcement learning (RL), offers an effective technique in this regard. In view of this, we employ a typical DRL algorithm, the deep Q-network (DQN), and propose a DQN-based approach for multi-task scheduling in cloud manufacturing. Three task arrival modes are considered: all tasks arriving at the same time, tasks arriving in random batches, and tasks arriving one by one sequentially. Four baseline methods are investigated: random scheduling, round-robin scheduling, earliest scheduling, and minimum-execution-time (min-time) scheduling. A comparison of results indicates that the DQN-based approach is effective and performs best among all the approaches in addressing the multi-task scheduling problem in cloud manufacturing.
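To illustrate the core idea of RL-based scheduling described above, the following is a minimal sketch in which arriving tasks are assigned to candidate services and the agent learns, from execution-time rewards, to route each task type to its fastest service. It uses tabular Q-learning as a deliberate simplification standing in for the DQN (no neural network or replay buffer), and the task types, services, and execution times are hypothetical, not taken from the paper.

```python
import random

# Hypothetical setup (not from the paper): 2 task types, 3 candidate services.
# EXEC_TIME[task_type][service] = processing time of that task on that service.
EXEC_TIME = [
    [4.0, 2.0, 6.0],   # task type 0 runs fastest on service 1
    [3.0, 5.0, 1.0],   # task type 1 runs fastest on service 2
]
N_TYPES, N_SERVICES = len(EXEC_TIME), len(EXEC_TIME[0])

def train(episodes=2000, alpha=0.1, epsilon=0.2, seed=0):
    """Tabular Q-learning: state = task type, action = chosen service,
    reward = negative execution time (shorter is better)."""
    rng = random.Random(seed)
    q = [[0.0] * N_SERVICES for _ in range(N_TYPES)]
    for _ in range(episodes):
        t = rng.randrange(N_TYPES)               # a task arrives
        if rng.random() < epsilon:               # explore a random service
            a = rng.randrange(N_SERVICES)
        else:                                    # exploit the current estimate
            a = max(range(N_SERVICES), key=lambda s: q[t][s])
        reward = -EXEC_TIME[t][a]
        # One-step update; tasks are treated as independent in this sketch.
        q[t][a] += alpha * (reward - q[t][a])
    return q

def schedule(q, task_type):
    """Greedy policy learned from the Q-table."""
    return max(range(N_SERVICES), key=lambda s: q[task_type][s])

if __name__ == "__main__":
    q = train()
    for t in range(N_TYPES):
        print(f"task type {t} -> service {schedule(q, t)}")
```

In the full DQN approach, the Q-table would be replaced by a neural network over a richer state (e.g. service load and availability), which is what allows the method to cope with dynamic arrivals and large state spaces where tabular methods break down.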