Default deny: A signal of academic maturity?
I listened to a very interesting episode of The Pod Delusion podcast today. In fact, all of the items on the programme were interesting: from a discussion of faith schools from an atheist perspective, to a description of Tesco’s planning policies from a local council planning officer. However, the item on the programme that I want to talk about was called Default Deny and was written by Sean Ellis. Below I’ll give a brief description of his argument before talking about how I think this applies to academic life.
Sean took his inspiration from Marcus Ranum’s article The Six Dumbest Ideas in Computer Security, which argues that the dumbest idea of all is Default Accept. Until recently, many computer security systems were designed to let everything through apart from things that were explicitly blocked (thus the name: Default Accept). This is easily seen in Windows anti-virus software. Windows (at least before Windows XP) used the Default Accept principle, so anti-virus software was built around huge lists of programs that must be denied. In his article, Ranum argues that it is far more sensible to base a system on the Default Deny principle, allowing only specifically approved software to run.
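The contrast between the two policies can be sketched in a few lines of Python (the program names and list contents here are purely illustrative, not from either article):

```python
# Default Accept: run anything not on a blocklist.
# Default Deny: run only things on an allowlist.

BLOCKLIST = {"known_virus.exe"}            # the anti-virus vendor's list of known-bad software
ALLOWLIST = {"editor.exe", "browser.exe"}  # software the user has explicitly approved

def default_accept(program: str) -> bool:
    """Allow the program unless it is explicitly blocked."""
    return program not in BLOCKLIST

def default_deny(program: str) -> bool:
    """Block the program unless it is explicitly allowed."""
    return program in ALLOWLIST

# A brand-new piece of malware that no blocklist has heard of yet:
novel_malware = "zero_day.exe"
print(default_accept(novel_malware))  # True  - slips straight through
print(default_deny(novel_malware))    # False - stopped by default
```

The weakness of the blocklist is exactly the point Ranum makes: it can only ever name threats that are already known, while the allowlist is safe against the unknown by construction.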
So far, so good – but what does this have to do with academia? Well, Sean applies this to ideas by asking whether people have Default Deny brains or Default Accept brains, that is: whether they accept any new idea by default, or whether they reject it until they have studied the evidence and been convinced. He comes to the conclusion that many people have Default Accept brains, but that Default Deny brains would probably be healthier.
That struck a chord with me, in terms of scientific studies and academia in general. It can easily be said that science must be done with a Default Deny brain (or at least with a Default Deny hat on), but I think there is more to it than this. As students in secondary school we were expected to work within a Default Accept paradigm. We weren’t expected to question what we were being taught – we were meant to just accept it at face value. That started to change at A-level, when we were first introduced to some of the controversy in our subjects. I distinctly remember one of my A-level geography lessons when we were told that scientists still weren’t sure how drumlins (a type of glacial landform) were formed. I suppose that was the day that I first became really interested in research: I’d found that we didn’t know everything, and I wanted to be one of the people who started to find things out!
However, even though we had been introduced to controversy at A-level we weren’t expected to really question the fundamentals we were taught. When writing an exam essay about the Burgess model I was expected to list a selection of its flaws, but not to question it at a fundamental level, nor to talk about the inherent problems with models.
At the beginning of my undergraduate degree I started to read journal papers: the original places where new scientific ideas are circulated. But, as a new student, I was far too in awe of the authors to criticise their work. After all, how could I, a lowly first-year student, find problems in the work of professors and lecturers – especially work which had been accepted by equally experienced peer-reviewers? Gradually, however, I began to realise that it didn’t actually take a huge amount of knowledge to intelligently critique someone’s work. Ignoring the name and affiliation of the author often helped with this. I distinctly remember finding a paper about the Empirical Line Method in a relatively unknown journal (something like the Chinese Journal of Soil Science – a strange place for a remote sensing paper), reading it, and realising that I could have written something better in my sleep! After that the realisations followed thick and fast: I saw that papers in Remote Sensing of Environment are not without their flaws, and that my unquestioned acceptance of the Ruddiman hypothesis had completely ignored the body of work presenting alternative theories.
The gradual change from a Default Accept paradigm to Default Deny is an important change which occurs during education. Sadly, it doesn’t seem to occur for many people (even students who go on to get good degrees at university), and it should really start far lower down in school (there is no reason why able students should not be questioning the accepted point of view at GCSE level). Science is entirely based around the questioning of facts, and the denying of knowledge that cannot be backed up with evidence. All of my favourite academic papers (and, indeed, most of the important papers ever written in science) came about from the author denying what they had read and heard, and considering alternative points of view. When the change from Default Accept to Default Deny takes place, it marks the start of that individual’s transition to a true scientist – and is a mark of increasing academic maturity.
Categorised as: Academic, TOK-related