A day early this week!
In her speech this week (referred to below), the PM mentioned the implications of Brexit for research:
…. since 2010 the number of overseas students coming to study at UK universities has increased by almost a quarter. The UK will always be open to the brightest and the best researchers to come and make their valued contribution. And today over half of the UK’s resident researcher population were born overseas.
When we leave the European Union, I will ensure that does not change.
- Indeed the Britain we build together in the decades ahead must be one in which scientific collaboration and the free exchange of ideas is increased and extended, both between the UK and the European Union and with partners around the world.
- I know how deeply British scientists value their collaboration with colleagues in other countries through EU-organised programmes. And the contribution which UK science makes to those programmes is immense.
- I have already said that I want the UK to have a deep science partnership with the European Union, because this is in the interests of scientists and industry right across Europe. And today I want to spell out that commitment even more clearly.
- The United Kingdom would like the option to fully associate ourselves with the excellence-based European science and innovation programmes – including the successor to Horizon 2020 and Euratom R&T. It is in the mutual interest of the UK and the EU that we should do so.
- Of course such an association would involve an appropriate UK financial contribution, which we would willingly make.
- In return, we would look to maintain a suitable level of influence in line with that contribution and the benefits we bring.
The UK is ready to discuss these details with the Commission as soon as possible.
Some more flesh was put on these bones by a policy paper from the Department for Exiting the EU: Framework for the UK-EU partnership – Science, research and innovation.
AI, data and other Industrial Strategy news
In related news, Innovate UK published a report on the immersive economy
And the government issued four calls for ideas and evidence on the PM’s four missions. They want new ideas here:
- AI and data: “we have one question: Where can the use of AI and data transform our lives?”
- Ageing society: “we would like to hear your thoughts on the following: How can we best support people to have extra years of being healthy and independent?”
- Clean Growth: “we would like to hear your thoughts on the following: How can our construction industry use its existing strengths to halve energy use in buildings?”
- Future of mobility: “we have one question: How can we ensure that future transport technologies and services are developed in an inclusive manner?”
If you’d like to contribute to any of these, please contact firstname.lastname@example.org
Subject level TEF
You can read BU’s response to the subject level TEF consultation here. We agree with the issues raised below and advocated a new model because of serious problems with both Model A and Model B. We also suggested a longer time frame (because of the volume of work involved, not complacency), and disagreed with both the grade inflation and teaching intensity metrics. And we challenged the awards at both institutional and subject level, proposing instead two awards (good and excellent, or excellent and outstanding) with stars for subjects.
In an interesting development for TEF (and more generally), the OfS has published its timetable for NSS and Unistats data for 2018:
- The Office for Students (OfS) is applying the Code of Practice for Statistics to its data publication in anticipation of its designation as a producer of official statistics by July 2018. This has implications for the pre-publication access that we can grant to NSS outcomes and Unistats data, as these will now be treated as official statistics. As a consequence, we will now publish the NSS public dataset at the same time as providers are able to access their own data, on Friday 27 July 2018.
- There will also be no provider preview as part of the annual Unistats data collection and publication process, and data available in system reports will be limited to that essential for quality processes associated with the Unistats return.
- In June 2018, we will add earnings data from the Longitudinal Education Outcomes dataset for English providers to Unistats.
- From September 2018, we will begin to use the Common Aggregation Hierarchy developed for the Higher Education Classification of Subjects to present data on Unistats in place of the current subject hierarchy.
- The Unistats website will be updated in June 2018 to include Year three outcomes from the Teaching Excellence and Student Outcomes Framework.
- Following consultation on the outcomes of the Review of Unistats in 2015, the funding bodies are working together on options for a replacement for the Unistats website. This new resource would draw on the findings from the review about decision-making behaviour and the information needs of different groups of prospective students. We will progress this work in stages – ensuring that it is developed in a way that meets the needs of prospective students across all countries of the UK – and will provide the sector with periodic updates, the first of which will be in summer 2018.
Research Professional have a neat summary of the sector response.
Panel chair Janice Kay of the University of Exeter reflects on progress made and the challenges – and opportunities – arising from the exercise.
- “when breaking down the metrics into 35 subjects, cohort sizes can be small”
- “It is clear that the current format of the seven subject groupings poses challenges. For example, while it may reduce the writing load by asking institutions to describe their subjects in a summated way, it has sometimes limited what subjects can say about themselves, making it difficult to identify what happens in individual subjects. And we have heard that the format can increase writing effort, even if volume is reduced… It’s critical during this exercise that the written judgments can continue to do this, and that holistic judgments are not captured by metrics. There is therefore a question whether metric and written submission data can be better balanced in Model B.”
- Plus some credibility issues with Model A
Melanie Rimmer, chief planner at Goldsmiths, University of London, ponders the likely outcomes of the subject-level TEF consultation.
- “Model B best meets the primary intention of Subject-Level TEF – that being to provide greater information to students – since it allows for greater variation between outcomes for subjects. However, highlighting variation in provision will only be attractive to institutions where that differentiation is a better rating than the current provider-level rating. If you want to hide weaker performance, then opt for Model A.”
- The main argument in favour of Model A is that it will reduce the burden of submission and assessment. That will be attractive to institutions which, having been through the exercise once and established their credentials, perceive the requirements of TEF as an unnecessary additional imposition that will deliver minimal return. Solid Golds and Silvers are likely to prefer Model A for this reason. Those at the borders of the ratings, with an eye on how close they are to moving between them, are more likely to see value in the greater effort required by Model B.”
- “Those which are unlikely to see their rating change, or indeed which might see their metrics moving in the wrong direction and worry about a lesser rating, will naturally support longer duration awards. Those hoping to gain a shinier medal as a result of improving performance will see value in more regular submissions.”
- “There are, however, bound to be areas of common ground on the consultation proposals. Every institution I have spoken to has identified a problem with the subject classifications, highlighting why combining disciplines X and Y makes no sense in their institution. However, in each case the disciplines cited are different because the issues stem primarily from institutional structures.”
Stephanie Harris of Universities UK (UUK) looks ahead to the future of TEF and the forthcoming statutory review of the exercise.
Claire Taylor of Wrexham Glyndŵr University looks at TEF from a quality enhancement perspective and considers the options for institutions in devolved nations.
- “perhaps the very act of putting together the written submission also provides an opportunity for us to engage with an enhancement agenda. By reflecting upon TEF metric performance within the written submission, providers have an opportunity to outline the qualitative evidence base in relation to enhancement, evaluation and impact, within the context of their own overall institutional strategic approach to improving the student experience”
- “an element of the horse having bolted”
- “low student cohort numbers”
- “the introduction of high and low absolute values into TEF metric workbooks indicates an element of benchmarking “creep””
- “the introduction of grade inflation metrics during TEF3 is of questionable value. Such a metric does not consider the contexts within which providers are operating. Providers have robust and detailed mechanisms for ensuring fair and equitable assessment of student work, including the use of external examiners to calibrate sector-wide, a system that contributes positively to the enhancement agenda and to which the grade inflation metric adds little value.”
- “The consultation asks for views around the introduction of a measure of teaching intensity. In my view, the proposed measure has no meaning and no connection to excellence, value or quality, let alone enhancement. There is the potential for the information to be misleading as it will need specialist and careful interpretation”
With an updated TEF diagram, “The Incredible Machine”, David Kernohan and Ant Bagshaw look at TEF3 and question its compatibility with the earlier versions of the exercise.
- “So what – honestly – is TEF now for? It doesn’t adequately capture the student experience or the quality of teaching. It does not confer any benefit – other than a questionable marketing boost – to providers, and there is no evidence that students are making serious use of it to choose courses, universities, or colleges.
- Internationally, concerns have already been raised that the three-level ratings are confusing – it’s been widely reported that “Bronze” institutions are often not considered to meet the UK’s laudably stringent teaching quality thresholds.
- And it is not even a reliable time series – a TEF3 Gold is now achievable by an institution that would not have passed the test under TEF2 rules. Later iterations may well be built “ground up” from subject TEF assessments, once again changing the rules fundamentally. Let’s not even mention TEF1 (it’s OK, no-one ever does) in this context.”
From Dods: The Science and Technology Committee has published its report from the Algorithms in decision-making inquiry, which acknowledges the huge opportunities presented by algorithms to the public sector and wider society, but also the potential for their decisions to disproportionately affect certain groups.
- Press Release: Committee sets the agenda for new algorithmic ethics agency
- Full Report PDF: Algorithms in decision making
- Report Summary: Link
- Report Conclusions and Recommendations: Link
The report calls on the Centre for Data Ethics & Innovation – being set up by the Government – to examine algorithm biases and transparency tools, determine the scope for individuals to be able to challenge the results of all significant algorithmic decisions which affect them (such as mortgages and loans) and where appropriate to seek redress for the impacts of such decisions. Where algorithms significantly adversely affect the public or their rights, the Committee highlights that a combination of algorithmic explanation and as much transparency as possible is needed.
It also calls for the Government to provide better oversight of private sector algorithms which use public sector datasets, and look at how best to monetise these datasets to improve outcomes across Government. The Committee also recommends that the Government should:
- Continue to make public sector datasets available for both ‘big data’ developers and algorithm developers through new ‘data trusts’, and make better use of its databases to improve public service delivery
- Produce, maintain and publish a list of where algorithms are being used within Central Government, or are planned to be used, to aid transparency, and identify a ministerial champion with oversight of public sector algorithm use.
- Commission a review from the Crown Commercial Service which sets out a model for private/public sector involvement in developing algorithms.
Social Mobility Commission
Under the ten-minute rule, the Chair of the Education Committee, Robert Halfon, introduced legislation to give greater powers and resources to the Social Mobility Commission (SMC), the body set up to promote social justice. (Link here at 13.52.09pm). It will have its second reading on 15th June. The Committee published a draft Bill in March alongside its report. In that report, the Committee called for the establishment of a new implementation body at the heart of Government to drive forward the social justice agenda.
And in the meantime, the Government have announced a recommendation for a new Chair. Dame Martina Milburn has spent 14 years as Chief Executive of the Prince’s Trust, supporting more than 450,000 disadvantaged young people across the country in that time, with three in four of these going on to work, education or training. She is also a non-executive director of the National Citizen Service and the Capital City College Group, and was previously Chief Executive of BBC Children in Need and of the Association of Spinal Injury Research, Rehabilitation and Reintegration.
From Dods: Last Friday the Science and Technology Committee announced that it intends to develop its own proposals for immigration and visa rules for scientists post-Brexit. This work follows the Government’s rejection of the Committee’s call for the conclusions of the Migration Advisory Committee (MAC) relating to science to be brought forward to form part of an ‘early deal’ for science and innovation.
- News Story: An immigration system that works for science and innovation inquiry launched
- Inquiry Page: An immigration system that works for science and innovation inquiry
The Committee published its report on “Brexit, Science and Innovation” in March, and has recently received the Government’s response. The report welcomed the Prime Minister’s call for a “far-reaching pact” with the EU on science and innovation and recommended that an early deal for science—including on the ‘people’ element—could set a positive tone for the rest of the trade negotiations, given the mutual benefits of cooperation on science and innovation for the UK and the EU.
The Committee will draw on the submissions to its previous Brexit inquiry and the sector’s submissions to the MAC to construct its proposals for the immigration system, but further input to this process is welcome on the following points:
- If an early deal for science and innovation could be negotiated, what specifically should it contain in relation to immigration rules and movement of people involved with science and innovation?
- What are the specific career needs of scientists in relation to movement of people, both in terms of attracting and retaining the people the UK needs and supporting the research that they do?
- What aspects of the ‘people’ element need to be negotiated with the EU-27, as opposed to being simply decided on by the Government?
- On what timescale is clarity needed in relation to future immigration rules in order to support science and innovation in the UK?
To subscribe to the weekly policy update simply email email@example.com
JANE FORSTER | SARAH CARTER
Policy Advisor Policy & Public Affairs Officer
Follow: @PolicyBU on Twitter | firstname.lastname@example.org