
Daniel Muijs

Deputy Director, Research and Evaluation
Daniel Muijs is Head of Research at Ofsted. Prior to his current role, Daniel was Professor of Education at the University of Southampton, and Associate Dean (Research and Enterprise) in the Faculty of Social, Human and Mathematical Sciences. He also previously worked as Professor of Pedagogy and Teacher Development at the University of Manchester, Professor of School Leadership and Management at the University of Newcastle, and as a senior lecturer in quantitative research methods at the Warwick Institute of Education.

He is an acknowledged expert in the fields of educational and teacher effectiveness and quantitative research methods, and has published widely in these areas. He is co-editor of the journal 'School Effectiveness and School Improvement' and has held key advisory posts in a range of academic and professional organisations, including, currently, the executive council of the European Association for Research in Learning and Instruction and the Questionnaire Expert Group for the OECD TALIS survey.

Daniel holds a PhD in Social Sciences from the Catholic University of Leuven (Belgium), an MSc in Managerial Economics (Catholic University of Leuven) and a BA in Communication Sciences (Catholic University of Leuven).

In Daniel's words...

Since joining Ofsted and taking on a role in our Research and Evaluation team, I have from time to time been asked, ‘Why does Ofsted do research?’ It is a fair question. After all, as some have noted, Ofsted is not a research organisation.

The first answer is that we want to be as sure as we can be that we are looking at the right things when we inspect and regulate. To do so, we need to follow the best evidence on what good and effective provision is. That means looking at the existing evidence base, but also carrying out our own research when necessary.
When we developed the education inspection framework (EIF), for example, we started by reviewing existing research on educational effectiveness. However, we discovered that there was a lack of current research on curriculum and curriculum quality in England. So, we carried out our own research to complement that evidence base. This meant that we were able to develop a valid concept of quality of education, which has informed what we inspect.
We also use research to test new methodologies, such as the deep-dive approach we are using as part of the EIF. This ensures that how we inspect, as well as what we inspect, is evidence-based.

Acting on our evidence

For us, being research-informed is a necessary factor in doing our job well. We need to make sure that we are basing our judgements on the things that matter to the life chances of children and young people. When we have evidence of what works, it is imperative that we act on it. This is a social justice issue, because the quality of education and care matters most to the disadvantaged and vulnerable in society. They are the ones who benefit most from high-quality provision.

As a force for improvement, we also have a role to play in informing others about the quality of education and social care. Indeed, part of our statutory duty is to inform the Secretary of State on exactly this. At Ofsted, we are in a unique position to provide a bird's-eye view of the system, and we would not be doing our duty if we did not make productive use of this.

Doing research also allows us to inform providers about good (and not so good) practice. We have, for example, been able to provide evidence for social care providers on effective ways of working with older children suffering from neglect, and for schools and colleges on the factors that most affect teacher well-being.

Our research also allows us to make an evidence-based contribution to national debates. We did this in commenting on what schools and colleges can, and importantly cannot, do in helping to reduce knife crime.
The culmination of this work is our Annual Report, which summarises what we have found out through our inspections, regulatory activities, analysis, research and evaluation.

As a responsible organisation, we also need to know whether our frameworks and methodologies are being implemented as intended and are having the outcomes we expect. That is why we now evaluate all our new frameworks and methods. We are committed to doing so in a transparent way, so we publish our evaluation work as well as our research reports. Our evaluation work is there to make sure that our frameworks and methods are fit for purpose. However, it also allows us to tweak and adapt elements that are working less well and to check for any unintended consequences of what we are doing. We aim to highlight the positives as well as where we can do better, as our recent review of the inspection of local authority children’s services (ILACS) framework in social care demonstrates.

Our research aims

Research and evaluation lie at the heart of what we want to achieve as a regulator and an inspectorate. However, we cannot do just any research and evaluation work, of course. We need to make sure that we align what we do to our role and goals, and that we link everything we do to our strategy.
We aim for our work to have an impact either on our own future work or on the remits that we inspect. Our role is not to do fundamental research or to rack up academic publications. We intend for our work to be practical, policy-focused and accessible, and that is reflected in how we publish our findings. Although we may in future publish some of our work in academic journals, this is not our primary aim.

Research methodology

We also do not hold a particular view on research methodology. We are pragmatic and eclectic in the methods we use, valuing both quantitative and qualitative research for the different insights that they can bring. Methods follow questions in our approach.
We commission some of our research, such as our off-rolling survey, from other organisations, though we do most in-house. We typically, though not exclusively, draw on the insights from our inspectors and on inspection evidence, but we have also used questionnaires and analysis of existing data.
In our coming research and evaluation programme, we are looking to expand our methods toolbox further. Across research and evaluation projects, we are committed to transparency on methods, rigour in how we work and full adherence to ethical guidelines.

Future projects

So, what are our plans for the next 2 years?
In social care, we are continuing our existing ‘good decisions’ programme, where we look at how decisions are made. This includes, for example, looking at how children are matched with foster carers. We are also carrying out more joint targeted area inspections (JTAIs). The first one will be on children’s mental health. All these projects will help us to inform the sector on good practice. We will also be looking at the impact that the ILACS framework and social care common inspection framework (SCCIF) have had in the social care sector.
In schools, we will be continuing our programme of work on behaviour. This follows on from the first phase of work we reported on in our recent commentary. Our biggest new programme will be the re-introduction of thematic subject reviews. For this, we will be using data from inspection deep dives to look at the state of the nation in different subject areas across key stages. The first subjects we will be researching will be mathematics and languages. We will be looking at financial decision-making on inspection, to see whether this gives us more insight into the quality of leadership and management. We will also be carrying out research on alternative provision and doing further work on stuck schools.

In further education and skills (FES), we are doing a research project on work-based learning and carrying out an evaluation of apprenticeship subcontracting. We will also be looking at the reliability of inspection and will continue our work on reviewing the literature on effective practice in FES.

In early years, we are looking at how providers develop the 3 prime areas of learning in the early years foundation stage (EYFS). We will also look at the role of chains of providers and regulation. Both of these projects will also include cross-remit work with social care.

And, of course, we are evaluating the implementation and, later, the impact of the EIF in schools, early years and FES. This will be alongside doing a study on the impact of grading providers across our remits.

Next steps

As you can see, we are planning a varied programme of work. What all these projects have in common is that they will continue to contribute to the main aims of research and evaluation at Ofsted:
  • informing our inspection and regulatory work
  • evaluating what we do
  • providing a bird's-eye view of the system in the remits that we inspect and regulate

To be a force for good in the lives of children and young people means knowing what works and what does not work in order to provide them with the best possible education and care. It also means acting on that information in our inspection and regulatory work. That is what we do, and that is why we do research and evaluation at Ofsted.

Twitter: https://twitter.com/ProfDanielMuijs
Ofsted News Twitter: https://twitter.com/Ofstednews