On the Expressivity of Recurrent Neural Cascades
11:00, 21st May 2024 (Trinity Term 2024)
Recurrent Neural Cascades (RNC) are the class of recurrent neural networks with no cyclic dependencies among their recurrent neurons. This class of recurrent networks is used successfully in practice. We study the expressivity of RNC in terms of formal languages, which provide a unifying framework in which to describe the expressivity of all formalisms that capture temporal patterns. Existing results on the expressivity of RNC are scarce; what is known is that they do not capture all regular languages. We take steps towards a comprehensive characterisation of the expressivity of RNC. We show that the subclass RNC+, with positive recurrent weights, captures all star-free regular languages, a central class corresponding to the expressivity of many well-established temporal formalisms. Furthermore, we show that RNC+ captures no other regular language, and no other formal language in the presence of an identity element, i.e., an input that can occur an arbitrary number of times without affecting the output. Thus, the expressivity of RNC+ in the presence of identity elements coincides exactly with the star-free regular languages, which makes RNC+ a strong candidate as a model for learning temporal patterns.
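To make the architecture concrete, the sketch below shows a minimal RNC+ cell in Python/NumPy. The class name, weight initialisation ranges, and the tanh activation are illustrative assumptions of ours, not details taken from the paper; what the sketch captures is the defining structure: each recurrent neuron reads only the external input, the neurons before it in the cascade, and its own previous state (so there are no cyclic dependencies among recurrent neurons), and in RNC+ every self-recurrent weight is positive.

```python
# Minimal sketch of an RNC+ cell (hypothetical code, not from the paper).
import numpy as np

class RNCPlusCell:
    """Cascade of n recurrent neurons. Neuron i depends on the external
    input, the current outputs of neurons 0..i-1, and its own previous
    state. In RNC+, each self-recurrent weight w[i] is positive."""

    def __init__(self, n_neurons, n_inputs, seed=0):
        rng = np.random.default_rng(seed)
        self.n = n_neurons
        # Positive self-recurrent weights: the defining restriction of RNC+.
        self.w = rng.uniform(0.1, 1.0, size=n_neurons)
        # Input weights for each neuron.
        self.U = rng.normal(size=(n_neurons, n_inputs))
        # Strictly lower-triangular cross-neuron weights: neuron i sees only
        # neurons before it, which rules out cycles among recurrent neurons.
        self.V = np.tril(rng.normal(size=(n_neurons, n_neurons)), k=-1)
        self.b = np.zeros(n_neurons)

    def step(self, h, x):
        """One time step: update the neurons in cascade order."""
        h_new = np.zeros(self.n)
        for i in range(self.n):
            pre = (self.w[i] * h[i]     # own previous state (positive weight)
                   + self.U[i] @ x      # external input
                   + self.V[i] @ h_new  # earlier neurons, current time step
                   + self.b[i])
            h_new[i] = np.tanh(pre)
        return h_new

# Run the cascade over a sequence of one-hot input symbols; the final state
# can feed a classifier that accepts or rejects the string, so the cascade
# recognises a formal language.
cell = RNCPlusCell(n_neurons=3, n_inputs=2)
h = np.zeros(3)
for x in [np.array([1.0, 0.0]), np.array([0.0, 1.0])]:
    h = cell.step(h, x)
print(h)
```

The strictly lower-triangular matrix V is what enforces the cascade (acyclic) structure, and the positivity constraint on w is the only difference between this RNC+ sketch and a general RNC.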