A programming language is an artificial language designed to communicate instructions to a machine, particularly a computer. Programmers use it to write programs, that is, sets of instructions the computer can understand and carry out, in order to make the computer do what the programmer or the computer user wants. A programming language therefore consists of a vocabulary and a set of grammatical rules for instructing a computer to perform specific tasks.
The most basic computer language, called a low-level language, is machine language, which uses binary digits, the '0' and '1' code that a computer can run (execute) very fast without a translator or interpreter program, but which is tedious and complex to write. High-level languages (such as BASIC, C, and Java) are much simpler to use because they are more English-like. However, a program written in a high-level language must be converted into machine code by another program, such as a compiler or an interpreter, and high-level languages are therefore slower to run.
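The translation step described above can be observed directly in CPython, whose interpreter first compiles English-like source code into lower-level bytecode instructions before executing it. This is a minimal sketch, not part of the original text; the variable names are illustrative:

```python
import dis

# A short, English-like high-level statement.
source = "total = 2 + 3"

# The interpreter first translates it into lower-level bytecode.
code = compile(source, "<example>", "exec")

# dis.Bytecode lets us inspect the translated instructions,
# which are much closer to what the machine actually runs.
ops = [instr.opname for instr in dis.Bytecode(code)]
print(ops)  # includes an instruction such as STORE_NAME

# Executing the compiled code carries out the original intent.
namespace = {}
exec(code, namespace)
print(namespace["total"])  # 5
```

The exact bytecode instructions vary between Python versions, which illustrates the point: the high-level statement stays the same while the low-level form depends on the machine (here, the CPython virtual machine).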
Programming languages can be used to create programs that control the behavior of a machine and to express algorithms precisely.
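As an illustration of expressing an algorithm precisely, Euclid's algorithm for the greatest common divisor can be stated in a few unambiguous lines of Python (a hypothetical example, not from the original text):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    until the remainder is zero; the last nonzero value is the GCD."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # 6
```

Unlike a prose description, the code leaves no room for interpretation: every step the machine takes is fully determined.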
The earliest programming languages predate the invention of the computer and were used to direct the behavior of machines. Thousands of different programming languages have been created, mainly in the computer field, and many more are created every year. Most programming languages describe computation in an imperative style, that is, as a sequence of commands, although some languages, such as those that support functional programming or logic programming, use alternative forms of description.
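The contrast between the imperative style and a functional alternative can be sketched in Python, which supports both (an illustrative example with made-up variable names):

```python
from functools import reduce

numbers = [1, 2, 3, 4]

# Imperative style: a sequence of commands that mutate state.
total = 0
for n in numbers:
    total += n * n

# Functional style: describe the result, not the steps.
total_fn = sum(n * n for n in numbers)

# A more explicitly functional form, folding over the list.
total_reduce = reduce(lambda acc, n: acc + n * n, numbers, 0)

print(total, total_fn, total_reduce)  # 30 30 30
```

All three compute the same sum of squares; the difference lies in whether the program spells out the commands one by one or states what the result should be.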
The description of a programming language is usually split into two components: syntax and semantics. Some languages are defined by a specification document; the C programming language, for example, is specified by an ISO standard. Other languages have a dominant implementation that serves as a reference.
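The distinction between the two components can be made concrete in Python: the `ast` module checks only whether a fragment conforms to the grammar (syntax), while executing the program reveals what it means (semantics). This is a small sketch for illustration:

```python
import ast

# Syntax: is the fragment well-formed according to the grammar?
# "x = 1 +" is missing an operand, so parsing fails.
try:
    ast.parse("x = 1 +")
    well_formed = True
except SyntaxError:
    well_formed = False
print(well_formed)  # False

# A syntactically valid program parses without error...
tree = ast.parse("x = 1 + 2")

# ...and its semantics, what it means, only appear when it runs.
ns = {}
exec(compile(tree, "<example>", "exec"), ns)
print(ns["x"])  # 3
```

A specification document defines both layers: the grammar says which programs are well-formed, and the semantics say what each well-formed program does.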
Thus, a programming language is a notation for writing programs, which are specifications of a computation or algorithm.