Dueñas-Díez, Marta and Juan Pérez-Mercader

Every problem in computing can be cast as a decision problem: whether a given string does or does not belong to a language. This makes language recognition fundamental for natural and artificial computation. Computer science teaches that computations and language recognition are carried out by three classes of automata, the most complex of which is the Turing machine. Living systems on Earth compute using the complex machinery of biochemistry; artificial computation today is mostly electronic. Thinking of chemical reactions as molecular recognition machines, and without using biochemistry, we formulate the design, realization, construction and operation of one automaton in each class by means of one-pot, tabletop chemical reactors: from the simplest, finite automata, to the most complex, Turing machines. We test them experimentally by transcribing strings of symbols chemically, feeding them sequentially to the reactor and monitoring the physicochemical output of the reaction. The strings we use for recognition belong to standard computer science languages specific to each class of automata. We find that the acceptance/rejection criteria of an automaton can be formulated in terms of energy considerations. The free energy spent for every word accepted by our chemical Turing machine can be tuned to a constant value associated with the machine-language pair, where for words in the language the entropy rate offsets the energy dissipation. Our Turing machine uses the Belousov-Zhabotinsky chemical reaction and checks the same symbol in an unprecedented Avogadro's number of processors. Our findings have implications for chemical and general computing, bio-engineering and technology, for the study of the origin and presence of life on other planetary bodies, and for artificial biology.
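To make the notion of "language recognition by an automaton" concrete, here is a minimal software sketch of the simplest class, a finite automaton, deciding membership in a regular language. The language chosen, strings over {a, b} with an even number of a's, is a hypothetical illustration and not one of the languages used in the paper; the paper's automata are realized chemically, not in software.

```python
def make_dfa(alphabet, delta, start, accepting):
    """Build a recognizer for a deterministic finite automaton.

    delta maps (state, symbol) -> next state; the returned function
    answers the decision problem: is the word in the language?
    """
    def accepts(word):
        state = start
        for symbol in word:
            if symbol not in alphabet:
                return False  # reject symbols outside the alphabet
            state = delta[(state, symbol)]
        return state in accepting
    return accepts

# Two states track the parity of the count of 'a' seen so far.
even_as = make_dfa(
    alphabet={"a", "b"},
    delta={
        ("even", "a"): "odd",
        ("odd", "a"): "even",
        ("even", "b"): "even",
        ("odd", "b"): "odd",
    },
    start="even",
    accepting={"even"},
)

print(even_as("abba"))  # True: two a's
print(even_as("ab"))    # False: one a
```

Pushdown automata add a stack to this state-transition loop, and Turing machines add an unbounded read/write tape, giving the three classes of increasing power that the paper realizes chemically.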