Psychological systems are biological systems, which in turn are physical systems that are alive. Any theory that lays explanatory claim to phenomena of the mind must therefore ultimately be a theory about how a physical system is able to accumulate non-random order in its internal structure, order that appears to codetermine its behaviour. Stated less formally: a science that studies the behaviour of living physical systems, systems that appear to have a memory which makes their behaviour adaptive, future-oriented and intelligent, should be grounded in physical and biological principles and laws.
Although generating such a theory may for now be a bridge too far (however, see Turvey & Carello, 2012), the least we may demand is that our current theories of human behaviour do not contradict highly corroborated theories of physics that describe (constituent components of) simple or complex dynamical systems. This is arguably not the case in current psychological theorising: theories assume internal, highly organised structures (such as mental representations) as causes of behaviour, without explaining where the order came from, or how it is maintained or increased. Well-studied and formally defined constructs from other scientific disciplines are often imported at a merely metaphorical level, or are misinterpreted and essentially wrong. Examples include plasticity, holism, behavioural state/mode/change and, especially, any concept related to the term information (computation, coding/decoding, information processing/storage/retrieval, entropy, etc.). Information is a formally defined quantity that resolves uncertainty about the states or properties of a theoretical object of measurement (e.g. a system, a signal) relative to its degrees of freedom, by assigning that uncertainty a value. If a system represents 1 bit of information (e.g. a coin-toss system), this means it can be in one of 2 states, or take one of 2 distinct values.
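The coin-toss example can be made concrete with Shannon's entropy formula, H = −Σ p·log₂(p), which quantifies the uncertainty a measurement resolves in bits. The following sketch (an illustration of the standard formula, not code from the source) shows that a fair coin-toss system carries exactly 1 bit, while a biased coin resolves less uncertainty:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    Zero-probability outcomes contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: 2 equally likely states -> exactly 1 bit of information.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin: the outcome is more predictable,
# so observing it resolves less uncertainty (< 1 bit).
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

Note that the value depends only on the probability distribution over the system's degrees of freedom, not on what the states mean, which is precisely the point made next.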
This is clearly not the same as “meaning”, with which it is often conflated in theorising about cognition and behaviour. Shannon lucidly explained this in his seminal paper, which was to become the starting point of a new scientific discipline, information theory:
“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design.”
—Shannon (1948, p. 379)