This chapter reviews the computational capabilities of discrete-time recurrent neural networks (DTRNN) and features a number of relevant papers. Section 4.1 reviews some concepts of formal language theory. Section 4.2 studies the conditions under which DTRNN behave as finite-state machines (FSM) and introduces the first two groups of featured papers: those discussing the construction of FSM from DTRNN with threshold linear units, and those discussing the construction of FSM from DTRNN with sigmoid units. Section 4.3 compares the computational capacity of DTRNN with that of a cornerstone computational model, the Turing machine, and introduces a third group of featured papers dealing with this subject.
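As a concrete illustration of the kind of equivalence studied in Section 4.2, the following is a minimal sketch, not taken from any of the featured papers, of a DTRNN built from threshold units that simulates a two-state finite automaton (accepting binary strings with an even number of 1s) using one neuron per state-symbol pair; the automaton, the encoding, and all names are illustrative assumptions.

```python
# Toy sketch (hypothetical, for illustration only): a DTRNN with threshold
# units simulating a 2-state DFA that accepts binary strings containing an
# even number of 1s. One neuron per (state, symbol) pair; a neuron fires
# when its symbol is read while the automaton is in its state.

STATES = ["even", "odd"]
SYMBOLS = ["0", "1"]
DELTA = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd",  "0"): "odd",  ("odd",  "1"): "even"}
ACCEPTING = {"even"}

NEURONS = [(q, a) for q in STATES for a in SYMBOLS]

def step(h, symbol):
    """One DTRNN update: neuron (q, a) fires iff the current input is a
    and some previously firing neuron's transition leads into state q."""
    new_h = []
    for (q, a) in NEURONS:
        # recurrent excitation from any firing neuron (q', a') with delta(q', a') = q
        rec = sum(h[i] for i, (qp, ap) in enumerate(NEURONS) if DELTA[(qp, ap)] == q)
        inp = 1 if symbol == a else 0
        new_h.append(1 if rec + inp >= 2 else 0)  # threshold unit: both conditions needed
    return new_h

def accepts(word):
    # initialise so that the single firing neuron encodes the start state "even"
    h = [1 if (q, a) == ("even", "0") else 0 for (q, a) in NEURONS]
    for symbol in word:
        h = step(h, symbol)
    # the current state is delta(q, a) for the single firing neuron
    current = next(DELTA[(q, a)] for i, (q, a) in enumerate(NEURONS) if h[i])
    return current in ACCEPTING

print(accepts("1101"))  # False: three 1s
print(accepts("1001"))  # True: two 1s
```

The hard threshold keeps every activation at exactly 0 or 1, so the network state ranges over finitely many vectors and the dynamics can be read directly as a finite-state machine; the sigmoid-unit constructions discussed in Section 4.2 must additionally argue that their (no longer exactly binary) state encoding remains stable.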