This 2022/23 annual report of the Solicitors Qualifying Examination (SQE) contains data about more than 8000 individual candidates who took part in the SQE between October 2022 and July 2023. It covers four SQE assessment windows: SQE2 October 2022, SQE1 January 2023, SQE2 April 2023 and SQE1 July 2023.
Since this report was prepared, the Statistical Report for the SQE2 July 2023 assessment has been finalised and published. The outcomes will be included in the 2023/24 annual report.
The SQE is a rigorous assessment designed to assure consistent, high standards for all qualifying solicitors, consisting of two parts: SQE1 and SQE2.
This SQE annual report provides a cumulative picture of the outcomes from the assessments that took place in the reporting period (October 2022 - July 2023).
Statistics and commentary are provided on the overall performance of candidates at the individual assessment level to enable comparisons over time and identify any emerging trends. Assessment data is provided, where applicable, at the cumulative level.
Four assessment windows are covered in this report: SQE2 October 2022, SQE1 January 2023, SQE2 April 2023 and SQE1 July 2023.
When preparing this report the results for the July 2023 and October 2023 SQE2 assessment window deliveries were not available. We have included some provisional data on the number of candidates who attended these assessments where relevant.
The SQE is a single rigorous assessment designed to assure consistent, high standards for all qualifying solicitors. The SRA’s Statement of Solicitor Competence sets out what solicitors need to be able to do to perform the role effectively, and provides everyone with a clear indication of what to expect from a solicitor. This is what the SQE tests.
The SRA has appointed Kaplan SQE (Kaplan) as the approved assessment provider for the delivery of the SQE assessments and other related services.
In this reporting period, SQE assessments were delivered to more than 8000 individual candidates in 63 countries.
SQE1 consists of two 180 question multiple choice single best answer assessments (FLK1 and FLK2). These are delivered electronically in controlled and invigilated exam conditions at Pearson VUE test centres across the UK and internationally.
Each FLK assessment takes place on a separate day. Each day is split into two sessions of 2 hours 33 minutes with 90 questions in each session. There is a 60-minute break between the sessions. FLK1 and FLK2 each have a separate pass mark.
In order to pass SQE1, a candidate must pass both FLK1 and FLK2 assessments. Candidates who fail at their first attempt of SQE1 have two further opportunities to take the assessment(s) they failed (FLK1 and/or FLK2). More information can be found in the SQE1 Assessment Specification.
SQE2 comprises 16 stations (12 written stations and four oral stations) that assess both skills and application of legal knowledge.
The stations in SQE2 cover six legal skills:
This is across five practice areas:
SQE2 written assessments take place in Pearson VUE test centres over three consecutive half days and all candidates take the same written stations on the same date.
SQE2 oral assessments take place over two consecutive half days. During the reporting period, oral assessments took place in centres in Cardiff, London and Manchester. The logistics involved in running the oral assessments mean that not all candidates in a cohort can take the same oral stations on the same day so multiple “sittings” are used for SQE2 oral stations. To protect the integrity of the assessments and to ensure equity, different tasks are set for the oral stations used at the different sittings, with the same skills and practice areas covered, so all candidates are assessed across the same skills and practice areas.
SQE2 has a single pass mark for the whole assessment, covering all 16 stations. There may be slightly different pass marks between the SQE2 sittings to account for differences in the difficulty of the different oral station tasks, as described above.
Candidates who fail SQE2 at the first attempt have two further opportunities to take that assessment, and must resit the whole 16-station assessment.
More information can be found in the SQE2 Assessment Specification.
Exemptions from the SQE assessments are only available to qualified lawyers.
There are some candidates who meet the SRA transitional arrangements and are using SQE2 to qualify as a solicitor. They are not required to take SQE1.
To summarise, there are three types of candidates: those taking both SQE1 and SQE2; qualified lawyers, who may be granted exemptions from the assessments; and transitional candidates, who qualify through SQE2 alone and are not required to take SQE1.
The data provided in this report relate to candidates who received a mark for any of the assessments. Candidates whose attempts were discounted due to mitigating circumstances are not included. Outcome data is provided separately for FLK1 and FLK2 assessments and overall for SQE1. Outcome data is also provided for SQE2.
In this reporting period, a total of 8046 individual candidates received a mark for one or more of the SQE assessments. Table 1 below provides the number of candidates for each assessment, along with the numbers and proportions of candidates by attempt number, where applicable.
Candidates are allowed up to three attempts for each assessment within a six-year period. At the time of writing this report there had been four opportunities to sit SQE1 and three opportunities to sit SQE2. A small number of candidates had made a third attempt at the assessments.
Whilst two further SQE2 assessments will have taken place at the time of publication, the results were not released until November 2023 (SQE2 July 2023) and February 2024 (SQE2 October 2023). These will be included in the 2023/24 annual report. Since this report was prepared, the statistical reports for both SQE2 July 2023 and SQE2 October 2023 have been finalised and published.
At their first SQE1 attempt, candidates are required to sit both FLK1 and FLK2 in the same assessment window. If they fail one, they only have to resit that one assessment. Any passes can be carried forward and used within a six-year period.
Because of this, and owing to mitigating circumstances that are applied separately for FLK1 and FLK2, the number of candidates may differ across FLK1, FLK2 and SQE1 overall.
*Data provided for candidates who sat both FLK1 and FLK2 in the assessment window
Though SQE2 in the July 2023 and October 2023 assessment windows had been delivered at the time of writing this report, the marks had not been released to candidates. There were 1,036 candidates who sat SQE2 in the July 2023 assessment window and 720 in October 2023. These break down approximately as:
These numbers will change if there are successful mitigating circumstance claims with attempts being discounted.
Statistical reports are published after results are released. Data from the July 2023 and October 2023 assessments will be included in the 2023/24 annual report.
Table 2 provides the pass marks for each assessment, the average score (Mean) and standard deviation (SD), and Table 3 provides measures of test reliability (Cronbach’s Alpha and Standard Error of Measurement (SEm)).
Cronbach’s Alpha (α) is a measure of test reliability that estimates the internal consistency, or how closely related the sets of items are in a test. It therefore tells us how well items (questions) work together as a set. A high α coefficient suggests that candidates tend to respond in similar ways from one item to the next. Values for α range from 0 (where there is no correlation between items) to 1 (where all items correlate perfectly with one another). The widely accepted gold-standard α for high-stakes assessments is 0.8.
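The calculation behind Cronbach's Alpha can be illustrated with a small sketch. The item-response matrix below is entirely made up for illustration; this is a generic implementation of the statistic described above, not the assessment provider's actual scoring code.

```python
# Illustrative Cronbach's alpha for a tiny, invented item-response
# matrix (1 = item answered correctly, 0 = incorrectly).

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)  # population variance

def cronbach_alpha(scores):
    """scores: one row per candidate, one column per item (question)."""
    k = len(scores[0])                     # number of items
    items = list(zip(*scores))             # transpose to per-item columns
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in scores])
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return (k / (k - 1)) * (1 - item_var / total_var)

# Four hypothetical candidates answering five items
responses = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
]
print(cronbach_alpha(responses))  # → 0.8125
```

On this toy data α is just above the 0.8 gold standard; with real assessments the matrix has thousands of rows and 180 columns, but the formula is the same.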
The SQE assessments provide an observed/obtained score for a candidate at a snapshot in time (the assessment). If the candidate were to sit the same assessment on another occasion, they may achieve a different score owing to various factors. Some of these can be controlled to an extent, such as the training provided, and some cannot, such as the amount of sleep the candidate got the night before the assessment.
A candidate's theoretical “true” score can only be estimated by their observed score but there is an inevitable degree of error around each candidate’s observed score, which is consistent with most assessments. The standard error of measurement (SEm) for an assessment is an estimate of how repeated measures of the same group of candidates on the same assessment would be distributed around their theoretical “true” scores. The SEm is a function of the reliability of the assessment (α) and the standard deviation in scores on the assessment. Generally, the higher the reliability, the lower the standard error of measurement and vice-versa.
For all SQE assessments to date the SEm has been below 4% which suggests that observed scores generally represent a very good approximation for true scores in these assessments.
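The relationship between reliability and measurement error described above is conventionally expressed as SEm = SD × √(1 − α). A minimal sketch, using invented figures rather than values from the report's tables:

```python
# Sketch of the standard error of measurement relationship described
# above: SEm = SD * sqrt(1 - alpha). Figures are hypothetical.
import math

def standard_error_of_measurement(sd, alpha):
    return sd * math.sqrt(1 - alpha)

sd, alpha = 12.0, 0.92   # hypothetical score SD (%) and reliability
sem = standard_error_of_measurement(sd, alpha)
print(round(sem, 2))     # → 3.39
```

The example shows the pattern the report describes: with a reliability above 0.9, the SEm comes out well below 4% of the score scale, and raising α lowers the SEm further.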
Note - for SQE2 there were multiple sittings for the oral assessments (see introduction)
Tables 4a to 4c below summarise the journey so far for the candidates who have received assessment marks in this reporting period. These are provided separately for SQE1 (candidate outcomes) and SQE2 (candidate routes and outcomes). The candidates shown in table 4b include transitional candidates not required to take SQE1.
Of the 113 candidates who passed SQE1 in January 2023 and went on to sit SQE2 in April 2023, 102 (90%) passed and 11 (10%) failed.
Of those who attempted either FLK1 or FLK2 in either January 2023 or July 2023, 510 (72%) passed their remaining assessments (included within the Passed rows of the table above).
Candidates who fail their first or second attempts may benefit from reviewing the information contained later in this report relating to candidate performance in different practice areas.
Of the 962 candidates who did not sit SQE1 and sat SQE2 in October 2022 or April 2023, 645 (67%) passed. This compares to the higher rate of 88% for those taking SQE2 after passing SQE1.
Of the 1224 candidates who passed SQE2 in October 2022 or April 2023:
Of the 393 candidates who did not pass SQE2 in October 2022 or April 2023:
Table 5 shows the candidate pass rates (and number passing) for each assessment for all candidates and by attempt number, and for SQE2 by whether SQE1 had been previously sat.
Of the two SQE1 assessments, the pass rates for FLK1 have been higher than for FLK2 (+3% and +8% for January 2023 and July 2023 respectively). We will continue to monitor this across future assessments.
The overall pass rate was similar for both SQE1 assessments at 51% and 53%.
Pass rates in both January and July 2023 were lower for resitting candidates, when compared to first attempt candidates. This was true for both FLK1 and FLK2.
The pass rates for resitting candidates were higher in July than in January, increasing from 32% to 58% for FLK1 and from 38% to 50% for FLK2.
The proportion of candidates resitting has increased as more candidates start the SQE assessment journey. This rose from 8% to 17% for FLK1 and from 12% to 19% for FLK2 between January and July 2023.
The lower pass rates for resitting candidates might indicate that they should consider taking more time (and/or putting in more work or training) between sittings. This may help them improve from a failing to passing standard.
The SQE2 pass rates are markedly higher (20%+) than for SQE1. This is expected given the SQE2 eligibility requirement to have qualification-level functioning legal knowledge (i.e. have passed SQE1 or are a transitional candidate).
There has been a decrease in the proportion of candidates taking SQE2 who were not required to sit SQE1 (42% of all candidates in April 2023 vs 87% in October 2022).
Candidates who sat SQE1 performed better than those who had not. This has contributed to the increase in the overall pass rate (71% in October 2022 to 77% in April 2023).
The proportions of resitting candidates remain small across the assessments (7%) and their pass rates are significantly lower than for first attempt candidates.
The pass rate for second attempt candidates dropped from 54% in October 2022 to 36% in April 2023. Again, careful consideration should be given to when to resit, to allow sufficient time to improve to a passing standard.
*Sat both assessments but with a different attempt number for each
**Not reportable as fewer than 10 candidates in the attempt group
The SRA collects diversity and socio-economic data to help understand how candidates with different characteristics and backgrounds perform in the assessments. The categories are consistent with data collected by the Office for National Statistics (ONS) and the Social Mobility Commission.
Data is collected from candidates via an online monitoring and maximising diversity survey (or demographic data survey), completed ahead of assessment registration. Appendix 1 lists the data reported on in this section.
The large number of characteristics and groups within characteristics recorded in the data collected means that candidate numbers in some of the groups are small.
The complication of small candidate numbers in certain groups is compounded by almost half of candidates selecting ‘Prefer not to say’ for one or more of the 14 characteristics provided in this report. As a result, we have not been able to include multivariate analysis in this report.
Though multivariate analysis is not included for reasons mentioned, initial exploratory analyses suggest that many of the differential outcomes evident between groups may be explained by educational and socio-economic factors. However, we cannot draw reliable conclusions from this exploratory analysis yet. We will look at this again when candidate numbers and declaration rates increase.
In the tables below, we present univariate analysis of the outcomes data, which looks at each of the characteristics individually and independently. Tables 6 to 12 provide the following for FLK1, FLK2, SQE1 and SQE2 for each of the 14 characteristics presented:
Data in the tables exclude candidates who selected ‘Prefer not to say’. The following are some proportions of candidates who selected this response:
These tables provide data for all candidates who received marks for any assessment (FLK1, FLK2, SQE1 and SQE2).
The data is pooled from the January 2023 and July 2023 assessments for FLK1, FLK2, and SQE1, and from the October 2022 and April 2023 assessments for SQE2. Where there are fewer than 10 candidates in any group the proportions and pass rates are not reported; this is indicated by greyed out cells in the table.
The full questions asked in the online demographic data survey in relation to each category are available in Appendix 1. We have started collecting data about candidates’ first language and this will be included in the next annual report.
Candidates who reported being in White or Mixed/multiple ethnic groups achieved higher pass rates than those in Asian/Asian British or Black/Black British groups. Differences in pass rates between groups were significant for all assessments.
Pass rates were similar between candidates who declared a disability and those who did not.
The majority of candidates taking SQE1 and SQE2 were in the younger age groups. More than half were in the 25-34 age group and this group achieved higher pass rates than older candidates in FLK1, FLK2 and SQE1.
SQE2 candidates in the 16-24 and 25-34 age groups achieved higher pass rates than older candidates.
There were fewer than 10 candidates in the 65+ age group for all assessments. The proportions and pass rates are therefore not provided for this group.
Candidates who reported their sex as male achieved a higher pass rate than female candidates in SQE1. The opposite was the case in SQE2.
There were no significant differences between candidates whose gender was the same or different to their sex registered at birth.
Candidates who reported that their sexual orientation was Bi, Gay / lesbian or Other achieved higher pass rates than Heterosexual / straight candidates in SQE1 assessments, but there were no significant differences in SQE2.
There were fewer than 10 candidates selecting ‘Other’ for sex for all assessments, and for SQE2 there were fewer than 10 candidates selecting ‘No’ for their gender being the same as the sex registered at birth and ‘Other’ for sexual orientation. The proportions and pass rates are therefore not provided for these groups.
There were differences in pass rates between religion/belief groups reported by candidates in all assessments. Candidates reporting Hindu, Muslim or Sikh as their religion generally had lower pass rates, and those reporting no religion or belief or Jewish had higher pass rates.
The most frequently indicated group was no religion or belief (40%-45% of candidates for each assessment).
Candidates who reported the occupation of the main household earner as professional achieved higher pass rates in SQE1 and SQE2.
Pass rates for candidates attending independent or fee-paying schools (non-bursary funded) were higher across all assessments. Candidates who attended school outside the UK achieved significantly lower pass rates in SQE2 than those in the other subgroups.
Candidates who reported that at least one parent attended university achieved higher pass rates in SQE1, with pass rates being similar in SQE2.
For SQE2, fewer than 10 candidates selected ‘Other’ or did not know the type of school they attended. The proportions and pass rates are therefore not provided for these groups.
*Group excluded from the Chi-square test of significance
Candidates with qualifications below degree level accounted for approximately 1.5% of all candidates and achieved higher pass rates in FLK2 and SQE2 than those with an undergraduate degree. Whilst an undergraduate degree (or equivalent qualification) is required for admission, it is not a requirement for taking the SQE1 or SQE2 assessments.
Candidates with first class undergraduate degree classifications achieved higher pass rates in all assessments and accounted for over a fifth of candidates.
Candidates who disclosed they had not undertaken qualifying work experience achieved a higher pass rate in FLK2 and SQE1, with similar pass rates to those who had undertaken qualifying work experience in FLK1 and SQE2.
All candidates disclosed whether they were already a qualified lawyer. Those who were qualified achieved higher pass rates in FLK1, FLK2 and SQE1. The opposite was the case in SQE2.
There were fewer than 10 candidates with no formal qualifications for all assessments, and with a Commendation for their undergraduate degree for SQE2. The proportions and pass rates are therefore not provided for these groups.
*Group excluded from the Chi-square test
The tables in this section show the mean scores by practice area for FLK1 and FLK2 by the following candidate groups:
In FLK1, mean scores across the seven practice areas assessed range from 51.6% for Legal Services, to 70.7% for Ethics. This suggests candidates find some practice areas more difficult than others.
Five of the practice area mean scores are below 60%, with Legal Services (51.6%) and Dispute Resolution (52.3%) having the lowest scores. Mean scores were above 60% for Ethics (70.7%) and Contract Law (63.2%).
In FLK2, the mean scores appear slightly lower than for FLK1 and range from 49.9% for Wills and Intestacy, to 63.5% for Ethics. Whilst the range in mean scores is smaller than for FLK1 (13.6% vs 19.1%) this range again suggests candidates find some areas more difficult than others, with Wills and Intestacy (49.9%) and Property Practice (50.7%) having the lowest scores. Mean scores were above 60% for Ethics (63.5%) and Criminal Liability (60.3%).
There are similar patterns of performance between the passing and failing candidates, and first, second and third attempt candidates, across the majority of practice areas within each assessment. This suggests both stronger and weaker candidates perform well/less well in the same practice areas.
The mean difference between passing and failing candidates across the practice areas ranges from 17.0% (FLK1 Legal Services) to 25.5% (FLK1 Business Law and Practice), with differences above 20% for four of the seven FLK1 practice areas and all seven FLK2 practice areas. The deficit in knowledge of the weaker candidates is therefore greater in these practice areas, suggesting these areas may require more focus/preparation for future attempts at SQE1.
When comparing performance between attempt numbers the differences between first attempt and second attempt mean scores range between 3.4% (FLK1 Ethics) and 9.0% (FLK1 Business Law and Practice), with the majority of differences being below 6%, and as expected, higher mean scores for first attempt candidates across all practice areas.
The differences between first and third attempt candidates are more variable with the differences being smaller (than between first and second attempt) for all of the FLK2 practice areas. Third attempt candidates scored higher in FLK2 Ethics than the first attempt candidates.
The following plots show the mean scores for the FLK1 and FLK2 practice areas for the passing and failing candidates, with data aggregated across the two assessments: pink = failing; green = passing; bars ordered by passing candidate mean scores descending.
In FLK1, the wider difference between passing and failing candidates is evident for Business Law and Practice, and for FLK2 this is evident for Land Law and Wills and Intestacy.
In the SQE2 assessments, all candidates sat the same 12 written stations within each assessment window. For the four oral stations, candidates were assessed on the same skills in the same practice areas within each assessment window, but with different stations across three oral sittings in October 2022 and four oral sittings in April 2023.
Whilst the number of stations assessing each skill and practice area remains the same for each SQE2 iteration, the combination of skills and practice areas will vary between iterations.
For more on this, please see the SQE2 Assessment Specification (Organisation and delivery section).
The table in this section shows the mean scores by station (with data aggregated across assessments where there are common stations and aligned to ensure a comparable scale across the assessments) for the following candidate groups:
Looking at the station scores for all candidates, mean scores range between 58.0% and 77.6%. Scores above 70% were achieved in nine of the 24 different stations, with mean scores of 69.5% or over for the four oral stations. The highest mean scores were in:
At the lower end, mean scores of less than 60% were in:
The skills assessed in the oral assessments have the overall highest mean scores with 76.0% for Advocacy and 71.5% for Interview and Attendance Note/Legal Analysis. Lower mean scores are seen for the written stations ranging from 63.3% for Legal Writing to 68.2% for Legal Research. Differences between the mean scores for first attempt and resit candidates range from 6.1% (Interview) to 11.3% (Case and Matter Analysis).
The range in the mean performance across the five practice areas is small, ranging from 66.4% (Business) to 70.8% (Wills and Intestacy), which suggests that, overall, candidates do not find particular practice areas easier or more difficult than others. Mean differences between first attempt and resit candidates range between 6.6% (Criminal Litigation) and 11.7% (Property Practice).
Figure 3 and Figure 4 show the mean scores for the SQE2 skills and practice area scores for the passing and failing candidates with data aggregated across the two assessments: pink = failing; green = passing; bars ordered by passing candidates’ mean scores descending.
The patterns of performance between passing and failing candidates appear similar across both skills and practice areas. Mean differences between the passing and failing candidates for skills range from 15.2% (Interview and Attendance Note/Legal Analysis) to 22.1% (Legal Research). For the practice areas the differences between the passing and failing candidates range from 18.2% (Criminal Litigation) to 22.0% (Dispute Resolution).
The mean score difference between passing and failing candidates was greater than 20% for the following three (out of six) skills and two (out of five) practice areas:
Skills:
Practice area:
The deficit in the legal skills (and associated legal knowledge) of the weaker candidates is therefore greater in these skills and practice areas, suggesting these areas may require more focus/preparation for future attempts at SQE2.
Table 16 shows the mean scores for all candidates along with passing and failing candidates for ethics and non-ethics focussed assessment content across FLK1 and FLK2.
In each case, there is a strong positive correlation between performance on ethics questions and the overall score. Naturally those candidates who have passed the assessments have significantly higher scores than those who have failed.
The mean scores are consistently higher for the ethics items compared to the non-ethics items across all assessments and groups. For all candidates the mean differences range between 5.2% (FLK2 January 2023) and 16.1% (FLK1 January 2023). With the current data, there appears to be no pattern in mean differences between ethics and non-ethics items across the FLK1 and FLK2 assessments.
The mean differences between the passing and failing candidates are similar for both ethics and non-ethics content, indicating associated performance across the two content areas.
Professional Conduct and Ethics are assessed pervasively in the SQE2 assessments. Some of the 16 stations contain Professional Conduct matters, and some do not. There is no formula determining which assessments this may fall in, and candidates are required to spot these issues and deal with them appropriately.
The matters related to these topics do not therefore have their own assessment criteria allocated to them. They are an integral part of the assessment, considered in the round and marked under the Legally Comprehensive assessment criteria. The weight attached to Professional Conduct and Ethics is determined at the markers’ meeting, applying the Day 1 standard.
Kaplan SQE is the End Point Assessment Organisation for solicitor apprentices in England. Solicitor apprentices are required to pass SQE1 during their apprenticeship as an on-programme assessment; it is a gateway requirement for SQE2, which is the end-point assessment (EPA).
Apprentices must pass SQE1 and have met all of the minimum requirements of their apprenticeship (including the gateway review) before they can attempt SQE2. When an apprentice has passed SQE2, they have completed the EPA for the Solicitor Apprenticeship Standard and passed the SQE.
Information about how solicitor apprentices and their training providers can engage with the SQE is available on our website.
Solicitor apprentices made up a small proportion of overall candidate numbers for each assessment as indicated in Table 17 and accounted for 5% of all candidates assessed in this reporting period.
The solicitor apprentice pass rates were significantly higher than the pass rates for non-apprentice candidates for FLK2 Jul 2023, SQE1 Jul 2023 and SQE2 Apr 2023. The difference in pass rates is most notable for SQE2 April 2023 with 97% of apprentices passing compared to 75% of all non-apprentice candidates. This is consistent with the SQE2 April 2022 assessment where the apprentice pass rate was 100%. It shows solicitor apprentices continue to demonstrate their preparedness for the end point assessment.
*Not reported as fewer than 10 candidates
There were 398 solicitor apprentices who took one or more of the assessments in this reporting period. Of these:
We are committed to making sure that a candidate is not disadvantaged by reason of a disability in demonstrating their competence and we will make reasonable adjustments to methods of assessment for candidates with a disability (within the meaning of the Equality Act 2010) to achieve this. We will also consider reasonable requests to accommodate candidates with other conditions which impact on their ability to demonstrate their competence.
Our approach to reasonable adjustments, including how we communicate with candidates and the arrangements we most frequently make, is set out in the Reasonable Adjustments Policy.
During the course of the year we implemented 738 reasonable adjustment plans.
The average time between receiving a completed application for reasonable adjustments (with full accompanying evidence) to proposing an adjustment plan to a candidate was 7 days on both SQE2 October 2022 and SQE2 July 2023, and 8 days on both SQE1 January and SQE1 July 2023.
The average number of days taken was 15 days for the SQE2 April 2023 assessment. The longer timeframe was due to a high volume of complex cases as well as operational challenges and bottlenecks during the booking process. Following evaluation, changes have been implemented to improve these for upcoming assessments and average times have returned to prior levels on subsequent windows.
Considerable time is spent with some candidates who have complex reasonable adjustment plans, to ensure they receive comprehensive support in finalising a plan for the assessments. Our standard guidance to candidates is to obtain supporting evidence and make their reasonable adjustment request as early as possible. We strongly encourage candidates to submit their request form at the earliest opportunity, even if supporting documentation is not fully available at that point. Applying early gives Kaplan and the candidate time to finalise an adjustment plan before seats are booked; the booking itself is completed after the booking window has opened.
* 1% not recorded
Table 20 shows pass rates for candidates with reasonable adjustment plans alongside pass rates for the full cohort for FLK1, FLK2 and SQE2 assessments during the reporting period.
A Chi-square significance test was used to see if there were differences (at a 95% confidence level) between the pass rates for candidates with and without a reasonable adjustment plan. This is indicated in the final column as ‘Yes’ (significant differences) or ‘No’ (no significant differences).
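A 2×2 chi-square test of this kind can be sketched as follows. The counts below are invented for illustration; the real analysis uses the cohort data behind Table 20, and in practice would be run with a statistics package rather than by hand.

```python
# Minimal sketch of a 2x2 chi-square test of independence, as used to
# compare pass rates with vs. without a reasonable adjustment plan.
# All counts are hypothetical.

def chi_square_2x2(table):
    """table: [[a, b], [c, d]] observed counts; returns the statistic."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

observed = [[65, 35],    # with adjustment plan:    passed, failed
            [450, 450]]  # without adjustment plan: passed, failed
stat = chi_square_2x2(observed)
# With 1 degree of freedom, stat > 3.841 is significant at the 95% level
print(stat > 3.841)  # → True
```

Here the invented 65% vs 50% pass rates clear the 3.841 critical value, so the difference would be reported as significant (‘Yes’ in the final column); smaller gaps or smaller groups typically would not.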
For FLK2 Jan 2023, candidates with a reasonable adjustment plan in place achieved a significantly higher pass rate, with no other differences being significant.
Pass rates for candidates with reasonable adjustment plans are broadly similar to those for the full cohort for all SQE assessments, and there is no consistent pattern to suggest that candidates with reasonable adjustments achieved higher or lower pass rates than those without. For SQE2, only the outcomes of the SQE2 April 2022 assessment were available at the time of writing the last SQE Annual Report 2021/22. The outcomes showed a higher pass rate for candidates with reasonable adjustments. This did not occur for SQE2 October 2022 or SQE2 April 2023. The results of SQE2 July 2023 were not yet known at the time of writing.
*Significant where any differences between the groups are unlikely to be due to chance
Plans were in place for candidates with a wide range of disabilities, long-term and fluctuating conditions. Adjustments were also agreed for some candidates who were pregnant/nursing mothers.
The most common conditions amongst candidates with reasonable adjustment plans were neurological conditions such as dyslexia, autism, dyspraxia and ADHD. The same was true in the period covered by the previous SQE Annual Report (2021/22).
The majority of candidates (56%) had more than one adjustment to the mode of assessment. Adjustments were specific to the individual assessments, and in some cases different arrangements were required for the SQE2 written and oral assessments.
The most common adjustments were as follows, with similar patterns seen across SQE1 and SQE2.
Other bespoke provisions were also arranged for candidates where evidence supported this. Examples of these included access to medical devices and use of a screen overlay. The types of other bespoke arrangements made were similar to the adjustments listed in the SQE Annual Report 2021/22.
After each assessment, candidates are invited to complete a survey to provide feedback about their experience. The questions in the survey (Appendix 2) relate to:
Candidates can provide general comments via a free-text box. They can also provide their contact details should they wish to be contacted further about their feedback.
These surveys continue to provide valuable information for Kaplan and the SRA to consider and, overall, it is clear that a significant proportion of candidates are still dissatisfied with the online journey.
Whilst the website is now better able to support large volumes of candidates at results release, the ongoing need for queuing, and the lack of confidence this creates during booking windows, continue to cause frustration and concern for candidates.
The feedback from candidates in the surveys shows a mix of experiences. Whilst some adverse events have regrettably affected delivery for isolated pockets of candidates, the assessment has been delivered without issue to the vast majority.
Although the survey invites candidates to give feedback about the SQE in Welsh, no candidates have so far opted to take the assessment in Welsh. We are continuing with the phased introduction of the SQE in Welsh. From SQE2 October 2023, the whole of the SQE2 can be taken in Welsh.
Preparations are also ongoing to offer the SQE1 in Welsh from January 2025, following a small-scale pilot exercise in 2023. A report on the pilot for the SQE1 in Welsh will be published in early 2024.
All responses to the candidate survey are collated and analysed, with action plans put in place where improvements can be made, or new opportunities and solutions can be explored.
Feedback from candidates and stakeholders has also been collected, reviewed and considered from various other sources during the course of delivering the assessments. This includes input from the SRA, the SRA Psychometrician and the Independent Reviewer of the SQE about the overall delivery of the SQE assessments, and their oversight of the management of any issues that arise.
During the reporting period, an isolated error was unfortunately found in the quality of marking for one of the sixteen assessment stations in the April 2023 sitting of SQE2. The issue was identified shortly after results were released, when a small number of candidates enquired about the detailed pattern of results they had received. An error was found in a few of the 346 Business Case and Matter Analysis scripts. To reassure candidates, we checked that no other errors had occurred within this set. The review was concluded and its findings were presented to the Assessment Board in October 2023. The Assessment Board approved the results of the review, which were as follows:
Although rigorous assurance processes are in place, we have reviewed them further in light of this error and now include a specific review of zero- and low-scoring scripts during the marking and analysis of the SQE assessments. We communicated the outcomes to all affected candidates and apologised for the uncertainty this caused.
Table 21 summarises some of the key areas brought to our attention for improvement, and what is being done in response.
Note: We have started collecting data about candidates’ first language and this will be included in the next annual report.
For each question, candidates are asked to say whether they are very satisfied, satisfied, neither satisfied nor dissatisfied, dissatisfied or very dissatisfied.