Definition: The concept of entropy constitutes, together with energy, a cornerstone of contemporary physics and related areas. It was originally introduced by Clausius in 1865 along abstract lines focusing on the thermodynamical irreversibility of macroscopic physical processes. In the following decade, Boltzmann made the ingenious connection—further developed by Gibbs—between entropy and the microscopic world, which led to the formulation of a new and impressively successful physical theory, subsequently named statistical mechanics. The extension to quantum mechanical systems was formalized by von Neumann in 1927, and the connections with the theory of communication and, more widely, with the theory of information were introduced by Shannon in 1948 and by Jaynes in 1957, respectively. Since then, over fifty new entropic functionals have emerged in the scientific and technological literature. The most popular among them are the additive Rényi entropy, introduced in 1961, and the nonadditive entropy introduced in 1988 as a basis for generalizing the Boltzmann–Gibbs and related equilibrium and nonequilibrium theories, with a focus on natural, artificial and social complex systems. Along such lines, theoretical, experimental, observational and computational efforts, and their connections to nonlinear dynamical systems and the theory of probabilities, are currently in progress. Illustrative applications, in physics and elsewhere, of these recent developments are briefly described in the present synopsis.
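The functionals mentioned above can be made concrete with their standard discrete-probability forms (taking the Boltzmann constant k = 1): the Boltzmann–Gibbs–Shannon entropy S = -Σᵢ pᵢ ln pᵢ, the Rényi entropy S_q^R = ln(Σᵢ pᵢ^q)/(1-q), and the nonadditive entropy S_q = (1 - Σᵢ pᵢ^q)/(q-1). Both one-parameter families recover the Boltzmann–Gibbs–Shannon form in the limit q → 1, and the 1988 entropy satisfies, for independent subsystems A and B, the nonadditive composition rule S_q(A+B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B). The following minimal sketch (illustrative code, not from the source) computes all three for a given distribution:

```python
import math

def shannon_entropy(p):
    # Boltzmann-Gibbs-Shannon entropy: S = -sum_i p_i ln p_i  (k = 1)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, q):
    # Renyi entropy (additive): S_q^R = ln(sum_i p_i^q) / (1 - q), q != 1
    return math.log(sum(pi**q for pi in p)) / (1.0 - q)

def nonadditive_entropy(p, q):
    # Nonadditive entropy (1988): S_q = (1 - sum_i p_i^q) / (q - 1), q != 1
    return (1.0 - sum(pi**q for pi in p)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))            # about 1.0397
print(renyi_entropy(p, q=2.0))       # about 0.9808
print(nonadditive_entropy(p, q=2.0)) # 0.625

# Nonadditivity for independent fair coins A and B (joint = uniform over 4 states):
q = 2.0
sA = nonadditive_entropy([0.5, 0.5], q)
sAB = nonadditive_entropy([0.25] * 4, q)
print(abs(sAB - (sA + sA + (1 - q) * sA * sA)) < 1e-12)  # True
```

As q → 1 both generalized functionals tend to the Shannon form, which is why they are viewed as one-parameter extensions of Boltzmann–Gibbs statistical mechanics rather than replacements for it.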