"Men, it has been well said, think in herds; it will be seen that they go mad in herds, while they only recover their senses slowly, and one by one." (Charles Mackay)
Let's start with the most devastating regulatory failure ever:
The majority in government, in regulation, and in the financial services industry were supported in their rosy view of the 1990s financial world by the fact that all 'the great and the good' shared their optimistic analysis of corporate behaviour and of the virtues of 'light touch regulation'. This was despite the fact that a good number of perceptive commentators did their best to warn of the forthcoming catastrophe. The widespread failure to heed these warnings was a good example of two very common psychological behaviours, often called 'herd behaviour' and 'group-think'.
Herd behaviour is not necessarily irrational. Herds usually run together for a good reason, and there is nothing irrational about humans seeking to watch and learn from what others are doing. Rational investors in an efficient market will therefore produce frenzies and crashes from time to time. But the investor example shows that herds — like groups of colleagues — can also get it badly wrong, running too fast and too blindly into danger rather than away from it. And regulators are surely responsible for ensuring that herds do not crash over cliffs and take lots of innocent (customer/taxpayer) victims with them.
Policymakers, too, must hold tight to the ability to think for themselves. They should not let their colleagues override their doubts and fail to properly analyse alternative courses of action. And, whilst they might sensibly follow a herd for a little while, they should as soon as possible start evaluating situations for themselves and, if necessary, get out of the herd before it is too late.
Group-think is more pervasive but also highly undesirable, for it is essentially the unquestioning acceptance of obviously wrong answers simply because it is socially painful to disagree. Large organisations require a good deal of conformity, of course. You can't have every single person asking questions. Decisions have to be made and implemented. And if everyone you know, every newspaper you read, and every person you once admired are all saying the same thing, it takes a real effort of will, and real courage, to argue back. But someone has to do it - and yet it is uncomfortable to be a contrarian. Indeed, dissent can lead to the dissenter becoming the subject of personal and sometimes humiliating attack. Camilla Cavendish (who had run Prime Minister David Cameron's Policy Unit) told the Today Programme in December 2016 that some officials "became very very angry and took it viscerally personally" when it was suggested that there might have been a better way of negotiating with the rest of the EU in advance of the 2016 Brexit referendum. Questioning their strategy was perceived by them as questioning their commitment.
It can also sometimes be very difficult to identify those making (or sustaining) the very largest decisions. It would be difficult, for instance, to say that the (commercially and financially disastrous) decision to build Advanced Gas-cooled Reactors (AGRs) in the UK was made by anyone at all - certainly not Minister Fred Lee, who could in practice do little more than act as a mouthpiece for what was in effect a collective decision of the Atomic Energy Authority, the Central Electricity Generating Board and civil servants. Those who expressed doubts, or provided negative feedback on the programme, found their career progression blocked or terminated.
I suspect, though I cannot provide evidence, that much the same applies to those who doubt the wisdom of the UK's Trident nuclear deterrent programme.
For further information, you might like to read Mannie Sher's paper on The Psychology of Regulation, the first part of which deals with this subject.
The British reluctance to offer overt challenge or criticism doesn't help. Industrialist Sir Denys Henderson was famed for his unforgiving tongue but appears to have had little impact as a non-exec of the Board of Barclays Bank. One scholarly director told him "You are essentially an oratio recta [direct talking] man but Barclays is essentially an oratio obliqua [indirect talking] company." No doubt this contributed to Barclays' subsequent troubles. (The Senior Civil Service, too, suffers from too much indirect talking.)
Margaret Heffernan offers an interesting - almost hilarious - analysis of group-think in her excellent book Wilful Blindness.
"I've even heard boards discuss how, and why, they are invulnerable to groupthink, oblivious to the irony inherent in their confidence. ... Dennis Stevenson, then chairman of HBOS, eulogised the outstanding board he chaired [at a time when] everyone knew the bank teetered on the edge of collapse. ... [Lord Stevenson cited as evidence] the fact that, even in this crisis, 'we are as one'. He seemed oblivious to the notion that the unity of his board may have been a contributory factor to the bank's mess in the first place."
Shared Information Bias is a related problem. It arises (and it arises all too frequently) when groups spend more time discussing what everyone knows than they do exploring each individual's knowledge and perspective. The main reason for this is (again) politeness. People like to get along and have a low tolerance for conflicting views. In a crisis, experts may be particularly keen to be helpful by reaching consensus quickly, which increases the pressure to suppress doubts and questions. In his novel Amsterdam, Ian McEwan describes a meeting in which "everyone nodded, nobody agreed".
Margaret Heffernan's book (referenced above) introduced me to the related concept of cognitive dissonance. This phrase refers to the stress that we all encounter if we try seriously to consider two incompatible views at the same time. We all therefore fiercely hold onto our preconceived beliefs, even in response to intense external pressure. One result is that - as we almost all regard ourselves as good people, and essentially honest - we find it very hard to admit to ourselves that we or our organisation is behaving badly. Senior executives therefore typically push back very hard against any suggestion - from a regulator or anyone else - that their actions may be doing unnecessary harm, including risking their own business.
Cognitive dissonance almost certainly accounts for the failure of many, including Alan Greenspan - a deep believer in the power of the markets, and a fan of financial instruments such as derivatives - to react to all the warnings that something was going very badly wrong in the period before the 2008 financial crisis.
And without wishing to question the Brexit decision itself, I do wonder whether cognitive dissonance lay at the heart of many Brexiteers' unwillingness to engage with serious critics of the implementation paths chosen by both the Theresa May and Boris Johnson governments.
See also this note on behavioural economics for a further discussion of herd behaviour etc.