Solicitor Qualifying Examination Annual Report 2022/23

Executive Summary

This 2022/23 annual report of the Solicitors Qualifying Examination (SQE) contains data about more than 8000 individual candidates who took part in the SQE between October 2022 and July 2023. It covers four SQE assessment windows:

  • SQE2 in October 2022
  • SQE1 in January 2023
  • SQE2 in April 2023
  • SQE1 in July 2023.

Since this report was prepared, the Statistical Report for the SQE2 July 2023 assessment has been finalised and published. The outcomes will be included in the 2023/24 annual report.

The SQE is a rigorous assessment designed to assure consistent, high standards for all qualifying solicitors, consisting of two parts:

  • SQE1, which tests candidates’ functioning legal knowledge (FLK1 and FLK2 assessments)
  • SQE2, which tests candidates’ practical legal skills and knowledge.

i) SQE1 Summary

  • 6545 candidates took one or more of the FLK1 and FLK2 assessments. Of these, 5821 candidates took SQE1 for the first time (across the two assessment windows).
  • Overall pass rates for SQE1 were similar for the January 2023 and July 2023 assessment windows (51% and 53% respectively). These are in line with the previous year (53% in both November 2021 and July 2022).
  • First attempt pass rates in January 2023 and July 2023 were 54% and 56% respectively. The pass rates for resit candidates were lower at 14% and 29% respectively.
  • 32% of candidates who failed part or all of SQE1 in January 2023 resat in July 2023. Their pass rates were 50% and 46% for FLK1 and FLK2 respectively.
  • FLK1 scores were higher than FLK2 scores in both SQE1 assessment windows.
  • All the SQE1 assessments showed very good reliability with correspondingly low standard errors of measurement (see Descriptive and Quality Assurance Statistics section iv).

ii) SQE2 Summary

  • 1617 individual candidates took an SQE2 assessment, 1524 of whom were sitting for the first time.
  • Across the two assessments, 40% of candidates had passed SQE1 and 60% had not sat SQE1 (due to transitional arrangements).
  • The overall pass rate for SQE2 in the reporting period was 75%. This was higher for candidates who had taken SQE1 (88%) than for those who did not sit SQE1 (66%). The overall pass rate for resit candidates was 37%.
  • 14% of candidates who failed SQE2 in the October 2022 assessment window resat in April 2023; 30% resat in July 2023 and 23% in the October 2023 assessment window (provisional data).
  • 30% of candidates who failed SQE2 in the April 2023 assessment window resat in the October 2023 assessment window (provisional data).
  • Both SQE2 assessments showed very good reliability with low standard errors of measurement.

iii) Key messages

Assessment reliability

  • Data suggests both SQE1 and SQE2 assessments were reliable. This provides assurance that the assessments were well constructed and measured candidate performance consistently within each assessment.

Pass rates

  • The pass rate for SQE2 was markedly higher than for SQE1. This is expected given the SQE2 eligibility requirement to have qualification-level functioning legal knowledge (ie have passed SQE1 or are a transitional candidate).
  • Analyses of pass rates by candidate diversity and socio-economic data suggested some differences in performance between groups in the individual assessments. However, there was no evidence of bias in the SQE assessments overall. This will continue to be monitored.

Resits

  • Resit candidates had lower pass rates on their resit attempts across all assessments.

Candidate trends

  • There has been a shift from the early SQE2 cohorts, which included large proportions of transitional candidates not required to take SQE1, to a greater proportion of candidates having sat SQE1 - 58% of SQE2 April 2023 candidates had sat SQE1, compared with 13% in October 2022.
  • Candidates were more likely to pass SQE2 if they had taken SQE1 or were not a qualified lawyer.
  • Candidates with higher university degree classifications were more likely to perform better in the assessments.

Apprentices

  • The small number of apprentices who have taken the assessment so far have typically performed better than other candidates.

This SQE annual report provides a cumulative picture of the outcomes from the assessments that took place in the reporting period (October 2022 - July 2023).

Statistics and commentary are provided on the overall performance of candidates at the individual assessment level to enable comparisons over time and identify any emerging trends. Assessment data is provided, where applicable, at the cumulative level.

Four assessment windows are covered in this report:

  • October 2022 SQE2
  • January 2023 SQE1 (FLK1 and FLK2 assessments)
  • April 2023 SQE2
  • July 2023 SQE1 (FLK1 and FLK2 assessments).

When this report was prepared, the results for the July 2023 and October 2023 SQE2 assessment windows were not available. We have included some provisional data on the number of candidates who attended these assessments where relevant.

i) About the SQE

The SQE is a single rigorous assessment designed to assure consistent, high standards for all qualifying solicitors. The SRA’s Statement of Solicitor Competence sets out what solicitors need to be able to do to perform the role effectively, and provides everyone with a clear indication of what to expect from a solicitor. This is what the SQE tests.

The SRA has appointed Kaplan SQE (Kaplan) as the approved assessment provider for the delivery of the SQE assessments and other related services.

In this reporting period, SQE assessments were delivered to more than 8000 individual candidates in 63 countries.

ii) SQE1

SQE1 consists of two 180-question, multiple-choice, single-best-answer assessments (FLK1 and FLK2). These are delivered electronically under controlled and invigilated exam conditions at Pearson VUE test centres across the UK and internationally.

Each FLK assessment takes place on a separate day. Each day is split into two sessions of 2 hours 33 minutes with 90 questions in each session. There is a 60-minute break between the sessions. FLK1 and FLK2 each have a separate pass mark.

In order to pass SQE1, a candidate must pass both FLK1 and FLK2 assessments. Candidates who fail at their first attempt of SQE1 have two further opportunities to take the assessment(s) they failed (FLK1 and/or FLK2). More information can be found in the SQE1 Assessment Specification.

iii) SQE2

SQE2 comprises 16 stations (12 written stations and four oral stations) that assess both skills and application of legal knowledge.

The stations in SQE2 cover six legal skills:

  • Advocacy
  • Case and matter analysis
  • Interview and attendance note/legal analysis
  • Legal drafting
  • Legal research
  • Legal writing.

This is across five practice areas:

  • Business, organisations, rules and procedures
  • Criminal litigation
  • Dispute resolution
  • Property practice
  • Wills and intestacy, probate administration and practice.

SQE2 written assessments take place in Pearson VUE test centres over three consecutive half days and all candidates take the same written stations on the same date.

SQE2 oral assessments take place over two consecutive half days. During the reporting period, oral assessments took place in centres in Cardiff, London and Manchester. The logistics involved in running the oral assessments mean that not all candidates in a cohort can take the same oral stations on the same day so multiple “sittings” are used for SQE2 oral stations. To protect the integrity of the assessments and to ensure equity, different tasks are set for the oral stations used at the different sittings, with the same skills and practice areas covered, so all candidates are assessed across the same skills and practice areas.

SQE2 has a single pass mark for the whole assessment, covering all 16 stations. There may be slightly different pass marks between the SQE2 sittings to account for differences in the difficulty of the different oral station tasks, as described above.

Candidates who fail SQE2 at the first attempt have two further opportunities to take that assessment, and must resit the whole 16-station assessment.

More information can be found in the SQE2 Assessment Specification.

iv) Exemptions

Exemptions from the SQE assessments are only available to qualified lawyers.

v) Transitional arrangements

There are some candidates who meet the SRA transitional arrangements and are using SQE2 to qualify as a solicitor. They are not required to take SQE1.

vi) Types of candidates taking SQE assessments

To summarise, there are three types of candidates:

  1. Candidates going through the full SQE route and sitting both SQE1 and SQE2
  2. Transitional candidates who are not required to take SQE1 and only sit SQE2
  3. Qualified lawyers who either sit the whole of the SQE or have an exemption from FLK1, FLK2 and/or SQE2.

i) Number of Candidates

The data provided in this report relate to candidates who received a mark for any of the assessments. Candidates whose attempts were discounted due to mitigating circumstances are not included. Outcome data is provided separately for FLK1 and FLK2 assessments and overall for SQE1. Outcome data is also provided for SQE2.

In this reporting period, a total of 8046 individual candidates received a mark for one or more of the SQE assessments. Table 1 below provides the number of candidates for each assessment, along with the numbers and proportions of candidates by attempt number, where applicable.

ii) Attempts and Resits

Candidates are allowed up to three attempts for each assessment within a six-year period. At the time of writing this report there had been four opportunities to sit SQE1 and three opportunities to sit SQE2. A small number of candidates had made a third attempt at the assessments.

Whilst two further SQE2 assessments will have taken place at the time of publication, the results were not released until November 2023 (SQE2 July 2023) and February 2024 (SQE2 October 2023). These will be included in the 2023/24 annual report. Since this report was prepared, the statistical reports for both SQE2 July 2023 and SQE2 October 2023 have been finalised and published.

At their first SQE1 attempt, candidates are required to sit both FLK1 and FLK2 in the same assessment window. If they fail one, they only have to resit that assessment. Any passes can be carried forward and used within a six-year period.

Because of this, and owing to mitigating circumstances that are applied separately for FLK1 and FLK2, the number of candidates may differ across FLK1, FLK2 and SQE1 overall.

Table 1: Number (and proportion) of marked candidates by attempt
Assessment Assessment window Number of Candidates Number of Candidates by Attempt
1st Attempt 2nd Attempt 3rd Attempt Mixed Attempt Numbers
FLK1 Jan 2023 3103 2845 (92%) 246 (8%) 12 (<1%) -
FLK2 Jan 2023 3253 2861 (88%) 357 (11%) 35 (1%) -
SQE1* Jan 2023 3031 2816 (93%) 184 (6%) 8 (<1%) 23 (1%)
FLK1 Jul 2023 3647 3038 (83%) 563 (16%) 46 (1%) -
FLK2 Jul 2023 3755 3035 (81%) 644 (17%) 76 (2%) -
SQE1* Jul 2023 3475 3005 (87%) 425 (12%) 31 (1%) 14 (<1%)
SQE2 Oct 2022 646 600 (93%) 46 (7%) - -
SQE2 Apr 2023 997 924 (93%) 69 (7%) 4 (<1%) -

*Data provided for candidates who sat both FLK1 and FLK2 in the assessment window

iii) SQE2 July 2023 and October 2023

Though SQE2 in the July 2023 and October 2023 assessment windows had been delivered at the time of writing this report, the marks had not been released to candidates. There were 1,036 candidates who sat SQE2 in the July 2023 assessment window and 720 in October 2023. These break down approximately as:

  • 94% July and 84% October as first attempt
  • 6% July and 14% October as second attempt
  • <1% July and 2% October as third attempt.

These numbers will change if there are successful mitigating circumstance claims with attempts being discounted.

Statistical reports are published after results are released. Data from the July 2023 and October 2023 assessments will be included in the 2023/24 annual report.

iv) Descriptive and Quality Assurance Statistics

Table 2 provides the pass marks for each assessment, the average score (Mean) and standard deviation (SD), and Table 3 provides measures of test reliability (Cronbach’s Alpha and Standard Error of Measurement (SEm)).

Cronbach’s alpha

Cronbach’s Alpha (α) is a measure of test reliability that estimates the internal consistency, or how closely related the sets of items are in a test. It therefore tells us how well items (questions) work together as a set. A high α coefficient suggests that candidates tend to respond in similar ways from one item to the next. Values for α range from 0 (where there is no correlation between items) to 1 (where all items correlate perfectly with one another). The widely accepted gold-standard α for high-stakes assessments is 0.8.

In all SQE1 assessments to date, α has been greater than 0.9, and in all SQE2 assessments it has been above 0.8, suggesting very good internal consistency and high reliability for the SQE1 and SQE2 assessments.
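As an illustration of how α is calculated, the sketch below implements the standard formula, α = k/(k−1) × (1 − Σ item variances / total-score variance), for a candidates-by-items score matrix. This is a generic illustration of the textbook formula, not Kaplan's scoring code, and the example matrix is purely hypothetical.

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a candidates-x-items score matrix.

    Generic illustration of the standard formula, not the
    assessment provider's implementation.
    """
    k = len(scores[0])  # number of items (questions/stations)

    def sample_var(xs):
        # Sample variance with ddof = 1
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Sum of per-item variances across candidates
    item_var_sum = sum(sample_var([row[j] for row in scores]) for j in range(k))
    # Variance of candidates' total scores
    total_var = sample_var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Two perfectly correlated items: candidates respond identically
# from one item to the next, so alpha comes out at 1
print(round(cronbach_alpha([[1, 1], [0, 0], [1, 1]]), 6))  # -> 1.0
```

When items are unrelated to one another, the sum of item variances approaches the total-score variance and α falls towards 0.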


Standard error of measurement (SEm)

The SQE assessments provide an observed/obtained score for a candidate at a snapshot in time (the assessment). If the candidate were to sit the same assessment on another occasion, they may achieve a different score owing to various factors. Some of these can be controlled to an extent, such as the training provided, and some cannot, such as the amount of sleep the candidate got the night before the assessment.

A candidate's theoretical “true” score can only be estimated by their observed score but there is an inevitable degree of error around each candidate’s observed score, which is consistent with most assessments. The standard error of measurement (SEm) for an assessment is an estimate of how repeated measures of the same group of candidates on the same assessment would be distributed around their theoretical “true” scores. The SEm is a function of the reliability of the assessment (α) and the standard deviation in scores on the assessment. Generally, the higher the reliability, the lower the standard error of measurement and vice-versa.

For all SQE assessments to date the SEm has been below 4% which suggests that observed scores generally represent a very good approximation for true scores in these assessments.
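Concretely, the SEm is usually computed as SD × √(1 − α). The sketch below applies that formula to the rounded published figures for FLK1 January 2023 (SD = 12.4%, α = 0.930); because the inputs are rounded, it approximates rather than reproduces the published SEm of 3.267%.

```python
import math

def standard_error_of_measurement(sd, alpha):
    # SEm = SD * sqrt(1 - reliability); higher reliability -> lower SEm
    return sd * math.sqrt(1 - alpha)

# Rounded published figures for FLK1 Jan 2023 (SD = 12.4%, alpha = 0.930).
# Input rounding means this approximates, not reproduces, the published 3.267%.
print(round(standard_error_of_measurement(12.4, 0.930), 2))  # -> 3.28
```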

Table 2: Pass marks, descriptive and quality assurance statistics
Assessment window Assessment Sitting No. of Candidates Pass Mark Mean SD
Jan 2023 FLK1 - 3103 57% 58.0% 12.4%
Jan 2023 FLK2 - 3253 56% 56.3% 14.3%
Jul 2023 FLK1 - 3647 53% 56.9% 12.6%
Jul 2023 FLK2 - 3755 52% 52.8% 13.6%
Oct 2022 SQE2 1 272 62% 66.4% 11.5%
Oct 2022 SQE2 2 138 62% 66.7% 13.3%
Oct 2022 SQE2 3 236 62% 67.5% 11.1%
Apr 2023 SQE2 1 155 62% 66.3% 12.2%
Apr 2023 SQE2 2 289 62% 69.7% 10.0%
Apr 2023 SQE2 3 286 61% 69.2% 10.7%
Apr 2023 SQE2 4 267 61% 68.6% 11.5%

Note - for SQE2 there were multiple sittings for the oral assessments (see introduction)

Table 3: Quality assurance statistics
Assessment window Assessment Sitting No. of Items/Stations Cronbach's Alpha SEm
Jan 2023 FLK1 - 180 0.930 3.267%
Jan 2023 FLK2 - 180 0.941 3.302%
Jul 2023 FLK1 - 180 0.929 3.274%
Jul 2023 FLK2 - 180 0.930 3.352%
Oct 2022 SQE2 1 16 0.850 3.826%
Oct 2022 SQE2 2 16 0.871 3.764%
Oct 2022 SQE2 3 16 0.884 3.851%
Apr 2023 SQE2 1 16 0.904 3.953%
Apr 2023 SQE2 2 16 0.846 3.997%
Apr 2023 SQE2 3 16 0.878 3.824%
Apr 2023 SQE2 4 16 0.842 3.993%

Note - for SQE2 there were multiple sittings for the oral assessments (see introduction)

v) Candidate Journey

Tables 4a to 4c below summarise the journey so far for the candidates who have received assessment marks in this reporting period. These are provided separately for SQE1 (candidate outcomes) and SQE2 (candidate routes and outcomes). The candidates shown in table 4b include transitional candidates not required to take SQE1.

Table 4a: SQE1 Candidate outcomes (6545 individual candidates)

Jan 2023 Jul 2023
Passed SQE1 and sat SQE2 Apr 2023 113 n/a
Passed SQE1 and sat SQE2 Jul 2023 660 n/a
Passed SQE1 and sat SQE2 Oct 2023 180 114
Passed SQE1 and yet to sit SQE2 333 1601
Passed SQE1 and exempt from SQE2 415 462
Failed part or all of SQE1 in Jan 2023, resat and did not pass in Jul 2023 n/a 286
Failed part or all of SQE1 - yet to resit 917 1464
Total 2618 3927

Of the 113 candidates who passed SQE1 in January 2023 and went on to sit SQE2 in April 2023, 102 (90%) passed and 11 (10%) failed.

Of those who attempted only their remaining assessment (FLK1 or FLK2) in January 2023 or July 2023, 510 (72%) passed (these candidates are included within the Passed rows of the table above).

Candidates who fail their first or second attempts may benefit from reviewing the information contained later in this report relating to candidate performance in different practice areas.

Table 4b: Candidate routes to SQE2 (1617 individual candidates)

Oct 2022 Apr 2023
Passed SQE1 Nov 2021 79 30
Passed SQE1 Jul 2022 n/a 433
Passed SQE1 Jan 2023 n/a 113
Transitional candidates 541 421
Total 620 997

Of the 962 candidates who did not sit SQE1 and sat SQE2 in October 2022 or April 2023, 645 (67%) passed. This compares to the higher rate of 88% for those taking SQE2 after passing SQE1.

Table 4c: SQE2 Candidate outcomes (1617 candidates)

Oct 2022 Apr 2023
Passed 457 759
Failed Oct 2022 - resat Apr 2023 - Passed n/a 8
Failed and resat in Jul 23 or Oct 23 (awaiting a result) 92 69
Failed and yet to resit 71 161
Total 620 997

Of the 1224 candidates who passed SQE2 in October 2022 or April 2023:

  • 84 (7%) had passed SQE1 in November 2021
  • 393 (32%) had passed SQE1 in July 2022
  • 102 (8%) had passed SQE1 in January 2023
  • 645 (53%) were not required to take SQE1.

Of the 393 candidates who did not pass SQE2 in October 2022 or April 2023:

  • 25 (6%) had passed SQE1 in November 2021
  • 40 (10%) had passed SQE1 in July 2022
  • 11 (3%) had passed SQE1 in January 2023
  • 317 (81%) were not required to take SQE1.

Table 5 shows the candidate pass rates (and number passing) for each assessment for all candidates and by attempt number, and for SQE2 by whether SQE1 had been previously sat.

i) Pass rates on SQE1 (FLK1 and FLK2)

Across the two SQE1 assessment windows, the pass rates for FLK1 were higher than for FLK2 (+3% in January 2023 and +8% in July 2023). We will continue to monitor this across future assessments.

The overall pass rate was similar for both SQE1 assessments at 51% and 53%.

ii) Pass rates for SQE1 resitting candidates

Pass rates in both January and July 2023 were lower for resitting candidates, when compared to first attempt candidates. This was true for both FLK1 and FLK2.

The pass rates for resitting candidates were higher in July than in January, increasing from 32% to 58% for FLK1 and from 38% to 50% for FLK2.

The proportion of candidates resitting has increased as more candidates start the SQE assessment journey. This rose from 8% to 17% for FLK1 and from 12% to 19% for FLK2 between January and July 2023.

The lower pass rates for resitting candidates might indicate that they should consider taking more time (and/or putting in more work or training) between sittings. This may help them improve from a failing to passing standard.

iii) Pass rates on SQE2

The SQE2 pass rates are markedly higher (20%+) than for SQE1. This is expected given the SQE2 eligibility requirement to have qualification-level functioning legal knowledge (i.e. have passed SQE1 or are a transitional candidate).

There has been a decrease in the proportion of candidates taking SQE2 who were not required to sit SQE1 (42% of all candidates in April 2023 vs 87% in October 2022).

Candidates who sat SQE1 performed better than those who had not. This has contributed to the increase in the overall pass rate (71% in October 2022 to 77% in April 2023).

iv) Pass rates for SQE2 resitting candidates

The proportions of resitting candidates remain small across the assessments (7%) and their pass rates are significantly lower than for first attempt candidates.

The pass rate for second attempt candidates dropped from 54% in October 2022 to 36% in April 2023. Again, careful consideration should be given to when to resit, to allow sufficient time to improve to a passing standard.

Table 5: Assessment pass rates
Assessment Date Candidate % Pass Rates (and number passing)
All 1st Attempt Only 2nd Attempt Only 3rd Attempt Only Split Attempts* Sat SQE1 Did Not Sit SQE1
FLK1 Jan 2023 59% (1837) 62% (1754) 32% (79) 33% (4) - - -
FLK2 Jan 2023 56% (1810) 58% (1660) 38% (135) 43% (15) - - -
SQE1 Jan 2023 51% (1537) 54% (1508) 14% (25) ** ** - -
FLK1 Jul 2023 66% (2423) 68% (2072) 58% (324) 59% (27) - - -
FLK2 Jul 2023 58% (2162) 59% (1801) 50% (320) 54% (41) - - -
SQE1 Jul 2023 53% (1831) 56% (1693) 29% (125) ** ** - -
SQE2 Oct 2022 71% (457) 72% (432) 54% (25) - - 83% (67) 69% (390)
SQE2 Apr 2023 77% (767) 80% (742) 36% (25) ** - 89% (513) 60% (254)

*Sat both assessments but with a different attempt number for each

**not reportable as less than 10 candidates in the attempt group

The SRA collects diversity and socio-economic data to help understand how candidates with different characteristics and backgrounds perform in the assessments. The categories are consistent with data collected by the Office for National Statistics (ONS) and the Social Mobility Commission.

Data is collected from candidates via an online monitoring and maximising diversity survey (or demographic data survey), completed ahead of assessment registration. Appendix 1 lists the data reported on in this section.

The large number of characteristics and groups within characteristics recorded in the data collected means that candidate numbers in some of the groups are small.

The complication of small candidate numbers in certain groups is compounded by almost half of candidates selecting ‘Prefer not to say’ for one or more of the 14 characteristics provided in this report. As a result, we have not been able to include multivariate analysis in this report.

Though multivariate analysis is not included for reasons mentioned, initial exploratory analyses suggest that many of the differential outcomes evident between groups may be explained by educational and socio-economic factors. However, we cannot draw reliable conclusions from this exploratory analysis yet. We will look at this again when candidate numbers and declaration rates increase.

In the tables below, we present univariate analysis of the outcomes data, which looks at each of the characteristics individually and independently. Tables 6 to 12 provide the following for FLK1, FLK2, SQE1 and SQE2 for each of the 14 characteristics presented:

  • The outcome of a Chi-square significance test to indicate whether there are any significant differences (at a 95% confidence level) between the pass rates of the groups. This is indicated in the column header as ‘Yes’ (significant differences) or ‘No’ (no significant differences).
  • The proportion of candidates within each group. This is calculated with 100% being all candidates with disclosed data AND being in a group with 10 or more candidates (‘Proportion %’ column).
  • The percentage pass rate for each group with 10 or more candidates (‘Pass Rate %’ column).
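To illustrate the kind of test described above, the sketch below computes a Pearson chi-square statistic for a 2x2 pass/fail contingency table and compares it against the 95% critical value for one degree of freedom. The group names and counts are hypothetical, not SQE data, and the report's own tests may differ in detail (e.g. continuity corrections).

```python
def chi2_statistic(table):
    """Pearson chi-square statistic for a contingency table
    (illustrative sketch; no continuity correction)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of group and outcome
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical pass/fail counts for two candidate groups (not SQE data)
stat = chi2_statistic([[120, 80],    # group A: 120 passed, 80 failed
                       [90, 110]])   # group B: 90 passed, 110 failed
# df = 1 for a 2x2 table; the 95% critical value is 3.841
print(round(stat, 2), stat > 3.841)  # -> 9.02 True
```

A statistic above the critical value would be reported as a significant difference ('Yes' in the table headers); below it, 'No'.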

Data in the tables exclude candidates who selected ‘Prefer not to say’. The following provides some proportions of candidates who selected this response:

  • Fewer than 3% of candidates did not disclose age or sex.
  • Fewer than 10% did not disclose ethnicity, disability status, school type attended, highest qualification or parental education.
  • Fewer than 15% of candidates did not disclose sexuality, religion or undergraduate degree classification.
  • Fewer than 20% of candidates did not disclose household socio-economic status.
  • Around 20% of candidates did not disclose whether they had undertaken any qualifying work experience.

These tables provide data for all candidates who received marks for any assessment (FLK1, FLK2, SQE1 and SQE2).

The data is pooled from the January 2023 and July 2023 assessments for FLK1, FLK2, and SQE1, and from the October 2022 and April 2023 assessments for SQE2. Where there are fewer than 10 candidates in any group the proportions and pass rates are not reported; this is indicated by greyed out cells in the table.

The full questions asked in the online demographic data survey in relation to each category are available in Appendix 1. We have started collecting data about candidates’ first language and this will be included in the next annual report.

i) Ethnicity

Candidates who reported being in White or Mixed/multiple ethnic groups achieved higher pass rates than those in Asian/Asian British or Black/Black British groups. Differences in pass rates between groups were significant for all assessments.

Table 6: Pass rates by ethnicity characteristics
  FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Significant differences: Yes Yes Yes Yes
Asian / Asian British 34 57 34 50 34 45 28 66
Black / Black British 6 43 7 36 7 30 6 53
Mixed / multiple ethnic groups 4 66 4 56 4 52 5 80
Other 7 53 7 44 7 40 9 62
White 49 71 48 66 48 61 52 83

ii) Disability

Pass rates were similar between candidates who declared a disability and those who did not.

Table 7: Pass rates by disability characteristics
  FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Significant differences: No No No No
Do not consider themselves to have a disability 94 63 94 57 94 52 94 74
Do consider themselves to have a disability 6 64 6 56 6 51 6 78

iii) Age

The majority of candidates taking SQE1 and SQE2 were in the younger age groups. More than half were in the 25-34 age group and this group achieved higher pass rates than older candidates in FLK1, FLK2 and SQE1.

SQE2 candidates in the 16-24 and 25-34 age groups achieved higher pass rates than older candidates.

There were fewer than 10 candidates in the 65+ age group for all assessments. The proportions and pass rates are therefore not provided for this group.

Table 8: Pass rates by age characteristics

FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Significant differences: Yes Yes Yes Yes
16 - 24 32 60 31 52 32 52 11 90
25 - 34 51 66 51 59 51 54 61 79
35 - 44 13 63 13 51 13 47 21 64
45 - 54 3 56 4 45 3 39 6 47
55 - 64 1 35 1 25 1 23 1 41
65+                

iv) Sex, Gender and Sexual Orientation

Candidates who reported their sex as male achieved a higher pass rate than female candidates in SQE1. The opposite was the case in SQE2.

There were no significant differences between candidates whose gender was the same or different to their sex registered at birth.

Candidates who reported that their sexual orientation was Bi, Gay / lesbian or Other achieved higher pass rates than Heterosexual / straight candidates in SQE1 assessments, but there were no significant differences in SQE2.

There were fewer than 10 candidates selecting ‘Other’ for sex for all assessments, and for SQE2 there were fewer than 10 candidates selecting ‘No’ for their gender being the same as the sex registered at birth and ‘Other’ for sexual orientation. The proportions and pass rates are therefore not provided for these groups.

Table 9: Pass rates by sex, gender and sexual orientation characteristics

FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Sex - Significant differences: Yes Yes Yes Yes
Female 65 61 65 55 65 50 65 77
Male 35 67 35 59 35 55 35 70
Other                
Gender same as sex registered at birth - Significant differences: No No No n/a
No <1 75 <1 75 <1 75    
Yes 100 63 100 57 100 52 100 70
Sexual orientation - Significant differences: Yes Yes Yes No
Bi 5 67 5 61 5 58 3 87
Gay / lesbian 4 78 4 70 4 64 3 66
Heterosexual / straight 91 63 91 55 91 50 94 74
Other <1 74 <1 55 <1 58    

v) Religion or Belief

There were differences in pass rates between religion/belief groups reported by candidates in all assessments. Candidates reporting Hindu, Muslim or Sikh as their religion generally had lower pass rates, and those reporting no religion or belief or Jewish had higher pass rates.

The most frequently indicated group was no religion or belief (40%-45% of candidates for each assessment).

Table 10: Pass rates by religion/belief characteristics

FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Significant differences: Yes Yes Yes Yes
Buddhist 1 58 2 47 1 45 2 69
Christian 32 64 32 56 32 51 35 73
Hindu 6 44 6 41 6 36 6 64
Jewish 1 85 1 73 1 72 1 80
Muslim 13 38 12 33 13 27 13 55
No religion or belief 45 72 44 66 44 62 40 86
Sikh 1 44 1 41 1 30 2 50
Other 1 60 2 49 2 44 1 69

vi) Socio-economic background measured by Occupation of Main Household Earner at 14, Type of School Attended and Parental Education

Candidates who reported the occupation of the main household earner as professional achieved higher pass rates in SQE1 and SQE2.

Pass rates for candidates attending independent or fee-paying schools (non-bursary funded) were higher across all assessments. Candidates who attended school outside the UK achieved significantly lower pass rates in SQE2 than those in the other subgroups.

Candidates who reported that at least one parent attended university achieved higher pass rates in SQE1, with pass rates being similar in SQE2.

For SQE2, fewer than 10 candidates selected ‘Other’ or did not know the type of school they attended. The proportions and pass rates are therefore not provided for these groups.

Table 11: Pass rates by household earner occupation, school type attended and parental education characteristics

FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Occupation of Main Household Earner - Significant differences: Yes Yes Yes Yes
Professional background 65 66 65 59 66 55 67 78
Intermediate background 14 62 14 54 14 48 13 69
Working class background 17 58 17 55 16 48 16 76
Other background 4 53 4 48 4 40 4 64
Type of School Attended - Significant differences: Yes Yes Yes Yes
State-run or state- funded school - non-selective 30 61 30 56 30 50 29 80
State-run or state- funded school - selective on academic, faith or other grounds 15 65 14 60 15 55 13 76
Independent or fee-paying school 12 69 12 64 12 59 12 82
Independent or fee-paying school, where I received a bursary covering 90% or more of my tuition 1 70 1 58 1 51 1 86
Attended school outside the UK 41 63 42 54 41 50 45 69
Other <1 59 <1 53 <1 44    
I don't know* 1 40 1 40 1 37    
Parents Attended University - Significant differences: Yes Yes Yes No
Yes, one or both of my parents attended university 58 67 58 60 58 56 58 77
No, neither of my parents attended university 39 58 39 53 39 46 39 75
Do not know / not sure* 3 46 3 43 3 37 3 48

*Group excluded from the Chi-square test of significance

vii) Highest Level of Education, Undergraduate Degree Classification and Qualifying Work Experience Undertaken

Candidates with qualifications below degree level accounted for approximately 1.5% of all candidates and achieved higher pass rates in FLK2 and SQE2 than those with an undergraduate degree. Whilst an undergraduate degree (or equivalent qualification) is required for admission, it is not a requirement for taking the SQE1 or SQE2 assessments.

Candidates with first class undergraduate degree classifications achieved higher pass rates in all assessments and accounted for over a fifth of candidates.

Candidates who disclosed they had not undertaken qualifying work experience achieved a higher pass rate in FLK2 and SQE1, with similar pass rates to those who had undertaken qualifying work experience in FLK1 and SQE2.

All candidates disclosed whether they were already a qualified lawyer. Those who were qualified achieved higher pass rates in FLK1, FLK2 and SQE1. The opposite was the case in SQE2.

There were fewer than 10 candidates with no formal qualifications in each assessment, and fewer than 10 candidates with a Commendation undergraduate degree classification in SQE2. The proportions and pass rates are therefore not provided for these groups.

Table 12: Pass rates by level of education, undergraduate degree classification, QWE and qualified status characteristics

FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Highest Level of Education - Significant differences: No Yes No Yes
At least an undergraduate degree 95 63 96 57 96 53 95 76
Qualifications below degree level 2 70 1 68 1 54 1 100
No formal qualifications                
Not applicable* 3 58 3 40 3 37 4 47
Undergraduate Degree Classification - Significant differences: Yes Yes Yes Yes
1st 23 81 23 73 23 70 20 93
2:1 48 62 48 58 48 52 48 80
2:2 12 38 12 32 12 25 14 58
3rd 1 26 1 21 1 18 1 67
Distinction 3 68 3 58 3 55 3 81
Commendation <1 57 <1 50 <1 45    
Pass 3 46 3 36 3 31 4 44
Not applicable* 10 71 10 63 10 60 10 74
Qualifying Work Experience (QWE) Undertaken - Significant differences: No Yes Yes No
No QWE undertaken 42 58 39 60 42 53 15 77
QWE undertaken 58 57 61 55 58 46 85 76
Qualified Lawyer Status - Significant differences: Yes Yes Yes Yes
Not qualified 65 60 65 55 65 50 61 80
Qualified 35 70 35 60 35 56 39 66

*Group excluded from the Chi-square test

i) Practice Area Performance in SQE1

The tables in this section show the mean scores by practice area for FLK1 and FLK2 by the following candidate groups:

  • All candidates
  • Passing candidates
  • Failing candidates
  • First attempt candidates
  • Second attempt candidates
  • Third attempt candidates

In FLK1, mean scores across the seven practice areas assessed range from 51.6% for Legal Services to 70.7% for Ethics. This suggests candidates find some practice areas more difficult than others.

Five of the practice area mean scores are below 60%, with Legal Services (51.6%) and Dispute Resolution (52.3%) having the lowest scores. Mean scores were above 60% for Ethics (70.7%) and Contract Law (63.2%).

In FLK2, the mean scores are slightly lower than for FLK1 and range from 49.9% for Wills and Intestacy to 63.5% for Ethics. Whilst the range in mean scores is smaller than for FLK1 (13.6% vs 19.1%), it again suggests candidates find some areas more difficult than others, with Wills and Intestacy (49.9%) and Property Practice (50.7%) having the lowest scores. Mean scores were above 60% for Ethics (63.5%) and Criminal Liability (60.3%).

There are similar patterns of performance between the passing and failing candidates, and first, second and third attempt candidates, across the majority of practice areas within each assessment. This suggests both stronger and weaker candidates perform well/less well in the same practice areas.

The mean difference between passing and failing candidates across the practice areas ranges from 17.0% (FLK1 Legal Services) to 25.5% (FLK1 Business Law and Practice), with the differences above 20% for four of the seven FLK1 practice areas and for all seven FLK2 practice areas. The deficit in knowledge of the weaker candidates is therefore greater in these practice areas, suggesting these areas may require more focus and preparation for future attempts at SQE1.
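As a hedged illustration, the pass/fail gaps quoted above can be reproduced directly from the published Table 13 figures. The sketch below is a transcription of those published values, not part of the report's own analysis pipeline.

```python
# Illustrative sketch (not the report's methodology): reproducing the
# pass/fail mean-score gaps for FLK1 from the published Table 13 figures.
# Values are (pass mean %, fail mean %) transcribed from Table 13.
flk1 = {
    "Business Law and Practice": (65.2, 39.7),
    "Contract Law": (71.2, 49.5),
    "Dispute Resolution": (59.7, 39.5),
    "Ethics": (77.6, 58.8),
    "Legal Services": (57.8, 40.8),
    "Legal System": (64.3, 45.2),
    "Tort": (65.6, 43.6),
}

# Gap between passing and failing candidates per practice area.
gaps = {area: round(p - f, 1) for area, (p, f) in flk1.items()}

smallest, largest = min(gaps.values()), max(gaps.values())  # 17.0 and 25.5
over_20 = [area for area, gap in gaps.items() if gap > 20]  # four areas
```

Running the same calculation on the Table 14 figures gives the corresponding FLK2 gaps, all of which exceed 20%.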

When comparing performance between attempt numbers, the differences between first attempt and second attempt mean scores range from 3.4% (FLK1 Ethics) to 9.0% (FLK1 Business Law and Practice), with the majority of differences below 6%. As expected, first attempt candidates achieved higher mean scores across all practice areas.

The differences between first and third attempt candidates are more variable with the differences being smaller (than between first and second attempt) for all of the FLK2 practice areas. Third attempt candidates scored higher in FLK2 Ethics than the first attempt candidates.

The following plots show the mean scores for the FLK1 and FLK2 practice areas for the passing and failing candidates, with data aggregated across the two assessments: pink = failing; green = passing; bars ordered by passing candidate mean scores descending.

In FLK1, the wider difference between passing and failing candidates is evident for Business Law and Practice, and for FLK2 this is evident for Land Law and Wills and Intestacy.

FLK1 Aggregated Mean Practice Area Scores

[Bar chart: mean score (%) on the y axis against practice area on the x axis (Ethics, Contract Law, Tort, Business Law and Practice, Legal System, Dispute Resolution, Legal Services), with paired bars showing passing and failing candidates]
Figure 1: Mean FLK1 practice area scores for passing and failing candidates

FLK2 Aggregated Mean Practice Area Scores

[Bar chart: mean score (%) on the y axis against practice area on the x axis (Ethics, Criminal Liability, Land Law, Trust Law, Criminal Law and Practice, Wills and Intestacy, Property Practice), with paired bars showing passing and failing candidates]
Figure 2: Mean FLK2 practice area scores for passing and failing candidates

Table 13: FLK1 mean scores by practice area
Practice Area
All
By Result By Attempt
Pass Fail 1st 2nd 3rd
Business Law and Practice 55.8 65.2 39.7 56.9 47.9 47.5
Contract Law 63.2 71.2 49.5 64.0 58.0 53.4
Dispute Resolution 52.3 59.7 39.5 52.8 49.0 48.8
Ethics 70.7 77.6 58.8 71.1 67.7 66.8
Legal Services 51.6 57.8 40.8 52.1 47.7 47.7
Legal System 57.3 64.3 45.2 58.0 52.3 51.6
Tort 57.5 65.6 43.6 58.0 54.3 54.7
Table 14: FLK2 mean scores by practice area
Practice Area
All
By Result By Attempt
Pass Fail 1st 2nd 3rd
Criminal Law and Practice 53.0 63.0 40.1 53.8 48.6 50.7
Criminal Liability 60.3 69.4 48.4 60.9 56.8 58.9
Ethics 63.5 72.2 52.0 64.0 60.2 64.7
Land Law 55.0 65.5 41.2 55.7 51.0 52.6
Property Practice 50.7 60.0 38.6 51.5 46.4 48.0
Trust Law 55.1 64.3 43.0 55.9 50.9 52.7
Wills and Intestacy 49.9 60.2 36.6 50.8 45.1 48.0

ii) Practice Area Performance in SQE2

In the SQE2 assessments, all candidates sat the same 12 written stations within each assessment window. For the four oral stations, candidates were assessed on the same skills in the same practice areas within each assessment window, but with different stations across three oral sittings in October 2022 and four oral sittings in April 2023.

Whilst the number of stations assessing each skill and practice area remains the same for each SQE2 iteration, the combination of skills and practice areas will vary between iterations.

For more on this, please see the SQE2 Assessment Specification (Organisation and delivery section).

The table in this section shows the mean scores by station (with data aggregated across assessments where there are common stations and aligned to ensure a comparable scale across the assessments) for the following candidate groups:

  • All candidates
  • Passing candidates
  • Failing candidates
  • First attempt candidates
  • Resit candidates (second and third attempts grouped due to small numbers of candidates).

Looking at the station scores for all candidates, mean scores range between 58.0% and 77.6%. Scores above 70% were achieved in nine of the 24 different stations, with mean scores of 69.5% or over for the four oral stations. The highest mean scores were in:

  • Legal Drafting - Wills and Intestacy (77.6%)
  • Advocacy - Criminal Litigation (76.9%)
  • Advocacy - Dispute Resolution (75.0%).

At the lower end, mean scores of less than 60% were in:

  • Legal Drafting - Criminal Litigation (59.8%)
  • Legal Writing - Criminal Litigation (59.4%)
  • Case and Matter Analysis - Dispute Resolution (59.0%)
  • Legal Research - Criminal Litigation (58.0%).

The skills assessed in the oral assessments have the overall highest mean scores with 76.0% for Advocacy and 71.5% for Interview and Attendance Note/Legal Analysis. Lower mean scores are seen for the written stations ranging from 63.3% for Legal Writing to 68.2% for Legal Research. Differences between the mean scores for first attempt and resit candidates range from 6.1% (Interview) to 11.3% (Case and Matter Analysis).

The range in the mean performance across the five practice areas is small, ranging from 66.4% (Business) to 70.8% (Wills and Intestacy), which suggests that, overall, candidates do not find particular practice areas easier or more difficult than others. Mean differences between first attempt and resit candidates range between 6.6% (Criminal Litigation) and 11.7% (Property Practice).

Figure 3 and Figure 4 show the mean scores for the SQE2 skills and practice area scores for the passing and failing candidates with data aggregated across the two assessments: pink = failing; green = passing; bars ordered by passing candidates’ mean scores descending.

The patterns of performance between passing and failing candidates appear similar across both skills and practice areas. Mean differences between the passing and failing candidates for skills range from 15.2% (Interview and Attendance Note/Legal Analysis) to 22.1% (Legal Research). For the practice areas the differences between the passing and failing candidates range from 18.2% (Criminal Litigation) to 22.0% (Dispute Resolution).

The mean score difference between passing and failing candidates was greater than 20% for the following three (out of six) skills and two (out of five) practice areas:

Skills:

  • Legal Research (22.1%)
  • Case and Matter Analysis (21.4%)
  • Advocacy (20.4%).

Practice area:

  • Dispute Resolution (22.0%)
  • Business (20.2%).

The deficit in the legal skills (and associated legal knowledge) of the weaker candidates is therefore greater in these skills and practice areas, suggesting these areas may require more focus/preparation for future attempts at SQE2.

SQE2 Aggregated Mean Skill Scores

[Bar chart: mean score (%) on the y axis against skill on the x axis (Advocacy, Interview and Attendance Note, Legal Research, Case and Matter Analysis, Legal Drafting, Legal Writing), with paired bars showing passing and failing candidates]
Figure 3: Mean SQE2 scores for passing and failing candidates by skill

SQE2 Aggregated Mean Practice Area Scores

[Bar chart: mean score (%) on the y axis against practice area on the x axis (Wills and Intestacy, Dispute Resolution, Property Practice, Criminal Litigation, Business), with paired bars showing passing and failing candidates]
Figure 4: Mean SQE2 scores for passing and failing candidates by practice Area

Table 15: SQE2 mean scores by practice area and skill (station)
Practice Area Skill Number Type All By Result By Attempt
Pass Fail 1st 2nd/3rd
Business Organisations, Rules and Procedures CMA 2 Written 68.3 74.3 50.6 69.3 55.6
L Drafting 2 Written 65.5 70.1 52.0 66.2 55.9
L Research 2 Written 66.9 72.8 49.7 67.7 56.8
L Writing 2 Written 64.9 69.1 52.9 65.5 58.2
Criminal Litigation Advocacy 2 Oral 76.9 82.1 61.7 77.5 68.0
CMA 1 Written 67.7 72.2 52.7 68.5 57.1
L Drafting 1 Written 59.8 63.6 47.0 60.4 51.7
L Research 1 Written 58.0 62.7 46.6 57.7 61.7
L Writing 1 Written 61.1 65.3 50.8 61.0 62.1
Dispute Resolution Advocacy 2 Oral 75.0 80.2 59.9 75.5 69.4
CMA 1 Written 59.0 65.7 42.8 59.7 50.0
L Drafting 1 Written 70.8 77.0 55.7 71.0 67.2
L Research 1 Written 73.6 78.8 56.1 74.5 61.7
L Writing 1 Written 63.9 69.3 45.9 64.8 51.7
Property Practice Int/AttNote 2 Oral 69.5 73.0 59.4 69.9 63.9
CMA 1 Written 72.4 78.1 58.8 73.1 63.4
L Drafting 1 Written 62.1 68.2 41.6 63.6 43.2
L Research 1 Written 71.7 76.3 56.5 72.8 58.2
L Writing 1 Written 59.4 67.2 40.7 60.2 49.7
Wills and Intestacy, Probate Administration and Practice Int/AttNote 2 Oral 73.5 77.8 60.9 74.0 67.8
CMA 1 Written 70.4 74.8 55.6 71.1 61.6
L Drafting 1 Written 77.6 82.9 64.9 78.0 72.8
L Research 1 Written 68.1 75.4 50.6 68.9 58.4
L Writing 1 Written 64.1 68.4 49.6 65.1 50.9
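The station-level summary counts quoted earlier in this section (nine station means above 70% and four below 60%) can be recovered from the "All" column of Table 15. The sketch below is an illustrative transcription of those published values, not part of the report's own analysis.

```python
# Illustrative transcription of the "All candidates" station means from
# Table 15, used to recover the summary counts quoted in the text.
means = [
    ("Business", "CMA", 68.3), ("Business", "L Drafting", 65.5),
    ("Business", "L Research", 66.9), ("Business", "L Writing", 64.9),
    ("Criminal Litigation", "Advocacy", 76.9), ("Criminal Litigation", "CMA", 67.7),
    ("Criminal Litigation", "L Drafting", 59.8), ("Criminal Litigation", "L Research", 58.0),
    ("Criminal Litigation", "L Writing", 61.1),
    ("Dispute Resolution", "Advocacy", 75.0), ("Dispute Resolution", "CMA", 59.0),
    ("Dispute Resolution", "L Drafting", 70.8), ("Dispute Resolution", "L Research", 73.6),
    ("Dispute Resolution", "L Writing", 63.9),
    ("Property Practice", "Int/AttNote", 69.5), ("Property Practice", "CMA", 72.4),
    ("Property Practice", "L Drafting", 62.1), ("Property Practice", "L Research", 71.7),
    ("Property Practice", "L Writing", 59.4),
    ("Wills and Intestacy", "Int/AttNote", 73.5), ("Wills and Intestacy", "CMA", 70.4),
    ("Wills and Intestacy", "L Drafting", 77.6), ("Wills and Intestacy", "L Research", 68.1),
    ("Wills and Intestacy", "L Writing", 64.1),
]

above_70 = [(pa, skill) for pa, skill, m in means if m > 70]  # nine stations
below_60 = [(pa, skill) for pa, skill, m in means if m < 60]  # four stations
lowest = min(means, key=lambda row: row[2])   # Legal Research - Criminal Litigation
highest = max(means, key=lambda row: row[2])  # Legal Drafting - Wills and Intestacy
```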

i) Ethics Performance in SQE1

Table 16 shows the mean scores for all candidates along with passing and failing candidates for ethics and non-ethics focussed assessment content across FLK1 and FLK2.

In each case, there is a strong positive correlation between performance on ethics questions and the overall score. Naturally those candidates who have passed the assessments have significantly higher scores than those who have failed.

The mean scores are consistently higher for the ethics items compared to the non-ethics items across all assessments and groups. For all candidates, the mean differences range between 5.2% (FLK2 January 2023) and 16.1% (FLK1 January 2023). With the current data, there appears to be no pattern in mean differences between ethics and non-ethics items across the FLK1 and FLK2 assessments.

The mean differences between the passing and failing candidates are similar for both ethics and non-ethics content, indicating associated performance across the two content areas.

Table 16: FLK1 and FLK2 mean scores for ethics and non-ethics content

All Candidates Passing Candidates Failing Candidates
Assessment Ethics Items Non-Ethics Items Ethics Items Non-Ethics Items Ethics Items Non-Ethics Items
FLK1 Jan 2023 72.9 56.8 81.6 65.2 60.3 44.7
FLK2 Jan 2023 61.4 56.1 69.9 66.5 50.6 43.1
FLK1 July 2023 68.8 55.8 74.6 63.1 57.3 41.4
FLK2 July 2023 65.3 52.2 74.2 61.4 53.2 39.6

ii) Professional Conduct Performance in SQE2

Professional Conduct and Ethics are assessed pervasively in the SQE2 assessments. Some of the 16 stations contain Professional Conduct matters, and some do not. There is no set formula determining which stations these fall in, and candidates are required to spot these issues and deal with them appropriately.

The matters related to these topics do not therefore have their own assessment criteria allocated to them. They are an integral part of the assessment, considered in the round and marked under the Legally Comprehensive assessment criteria. The weight attached to Professional Conduct and Ethics is determined at the markers’ meeting, applying the Day 1 standard.

Kaplan SQE is the End Point Assessment Organisation for solicitor apprentices in England. Solicitor apprentices are required to pass SQE1 during their apprenticeship as an on-programme assessment – it is a gateway requirement for SQE2, which is the end-point assessment (EPA).

Apprentices must pass SQE1 and have met all of the minimum requirements of their apprenticeship (including the gateway review) before they can attempt SQE2. When an apprentice has passed SQE2, they have completed the EPA for the Solicitor Apprenticeship Standard and passed the SQE.

Information about how solicitor apprentices and their training providers can engage with the SQE is available on our website.

Solicitor apprentices made up a small proportion of overall candidate numbers for each assessment as indicated in Table 17 and accounted for 5% of all candidates assessed in this reporting period.

The solicitor apprentice pass rates were significantly higher than the pass rates for non-apprentice candidates for FLK2 Jul 2023, SQE1 Jul 2023 and SQE2 Apr 2023. The difference in pass rates is most notable for SQE2 April 2023 with 97% of apprentices passing compared to 75% of all non-apprentice candidates. This is consistent with the SQE2 April 2022 assessment where the apprentice pass rate was 100%. It shows solicitor apprentices continue to demonstrate their preparedness for the end point assessment.

Table 17: Pass rates for solicitor apprentices and non-solicitor apprentices
Attempt Group SQE1 Jan 2023 SQE1 Jul 2023 SQE2 Oct 2022 SQE2 Apr 2023
FLK1 FLK2 SQE1 FLK1 FLK2 SQE1
Apprentice Proportion of All Candidates 4% 7% 0% 7%
All Apprentice 61% 62% 52% 71% 71% 63% - 97%
Non-Apprentice 59% 55% 52% 66% 57% 52% 71% 75%
1st Apprentice 63% 64% 54% 71% 74% 65% - 97%
Non-Apprentice 62% 58% 53% 68% 58% 56% 72% 79%
2nd & 3rd Apprentice * 53% - 68% 57% - - -
Non-Apprentice 32% 38% - 57% 50% - 54% 34%

*Not reported as less than 10 candidates

There were 398 solicitor apprentices who took one or more of the assessments in this reporting period. Of these:

  • 28 passed both SQE1 and SQE2 this year, all at first attempt.
  • 38 passed SQE2 at first attempt this year after passing SQE1 at first attempt in the previous year.
  • 32 sat SQE2 in July 2023 and are awaiting results.
  • 33 sat SQE2 in October 2023 and are awaiting results.
  • 131 passed SQE1 this year at first attempt and are yet to sit SQE2.

We are committed to making sure that a candidate is not disadvantaged by reason of a disability in demonstrating their competence and we will make reasonable adjustments to methods of assessment for candidates with a disability (within the meaning of the Equality Act 2010) to achieve this. We will also consider reasonable requests to accommodate candidates with other conditions which impact on their ability to demonstrate their competence.

Our approach to reasonable adjustments, including how we communicate with candidates and the arrangements we most frequently make, is set out in the Reasonable Adjustments Policy.

During the course of the year we implemented 738 reasonable adjustment plans.

The average time between receiving a completed application for reasonable adjustments (with full accompanying evidence) to proposing an adjustment plan to a candidate was 7 days on both SQE2 October 2022 and SQE2 July 2023, and 8 days on both SQE1 January and SQE1 July 2023.

The average number of days taken was 15 days for the SQE2 April 2023 assessment. The longer timeframe was due to a high volume of complex cases as well as operational challenges and bottlenecks during the booking process. Following evaluation, changes have been implemented to improve these for upcoming assessments and average times have returned to prior levels on subsequent windows.

Considerable time is spent with some candidates who have complex reasonable adjustment plans to ensure they receive comprehensive support in finalising a plan for the assessments. We strongly encourage candidates to obtain their supporting evidence and submit their request form at the earliest opportunity, even if supporting documentation is not fully available at that point. Applying early provides time for Kaplan and the candidate to finalise an adjustment plan before seats are booked. Completion of the booking is carried out after the booking window has opened.

Table 18: Proportion of candidates with a reasonable adjustment plan
Assessment Date % of candidates with a reasonable adjustment plan
SQE1 Overall Jan 2023 7%
SQE1 Overall Jul 2023 7%
SQE2 Overall Oct 2022 8%
SQE2 Overall Apr 2023 8%
Table 19: Proportion of candidates with reasonable adjustment (RA) plans who have one or multiple conditions or disabilities
Assessment Date % of candidates with RA plans who have one condition/disability % of candidates with RA plans who have multiple conditions/disabilities
SQE1 Overall Jan 2023 81% 18%*
SQE1 Overall Jul 2023 78% 22%
SQE2 Overall Oct 2022 78% 22%
SQE2 Overall Apr 2023 81% 19%

* 1% not recorded

i) Pass rates for candidates with reasonable adjustment plans

Table 20 shows pass rates for candidates with reasonable adjustment plans alongside pass rates for the full cohort for FLK1, FLK2 and SQE2 assessments during the reporting period.

A Chi-square significance test was used to see if there were differences (at a 95% confidence level) between the pass rates for candidates with and without a reasonable adjustment plan. This is indicated in the final column as ‘Yes’ (significant differences) or ‘No’ (no significant differences).
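A minimal sketch of the comparison described above is shown below: a Chi-square test (one degree of freedom) on a 2x2 table of pass/fail counts for candidates with and without a reasonable adjustment plan. The counts used here are hypothetical, since the report publishes pass rates rather than the underlying counts.

```python
# Sketch of a Chi-square test of independence on a 2x2 contingency table.
# The candidate counts below are hypothetical (the report does not publish them).

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts:            pass  fail
ra_pass, ra_fail = 120, 80           # candidates with an RA plan
other_pass, other_fail = 1500, 1100  # candidates without an RA plan

chi2 = chi_square_2x2(ra_pass, ra_fail, other_pass, other_fail)

# Critical value for df = 1 at the 95% confidence level.
CRITICAL_95 = 3.841
significant = chi2 > CRITICAL_95
```

With these hypothetical counts the statistic falls well below the critical value, so the difference in pass rates would be reported as not significant.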

For FLK2 Jan 2023, candidates with a reasonable adjustment plan in place achieved a significantly higher pass rate, with no other differences being significant.

Pass rates for candidates with reasonable adjustment plans are broadly similar to those for the full cohort for all SQE assessments, and there is no consistent pattern to suggest that candidates with reasonable adjustments achieved higher or lower pass rates than those without. For SQE2, only the outcomes of the SQE2 April 2022 assessment were available at the time of writing the last SQE Annual Report 2021/22. The outcomes showed a higher pass rate for candidates with reasonable adjustments. This did not occur for SQE2 October 2022 or SQE2 April 2023. The results of SQE2 July 2023 were not yet known at the time of writing.

Table 20: Comparison of pass rates overall and for candidates with a reasonable adjustment (RA) plan
Assessment Date Overall Pass Rate RA Pass Rate Significant Differences*
FLK1 Jan 2023 59% 63% No
FLK2 Jan 2023 56% 62% Yes
FLK1 Jul 2023 66% 67% No
FLK2 Jul 2023 58% 53% No
SQE2 Overall Oct 2022 71% 68% No
SQE2 Overall Apr 2023 77% 75% No

*Significant where any differences between the groups are unlikely to be due to chance

ii) Nature of conditions and adjustments made

Plans were in place for candidates with a wide range of disabilities, long-term and fluctuating conditions. Adjustments were also agreed for some candidates who were pregnant/nursing mothers.

The most common conditions amongst candidates with reasonable adjustment plans were neurological conditions such as dyslexia, autism, dyspraxia and ADHD. The same was true in the period covered by the previous SQE Annual Report (2021/22).

The majority of candidates (56%) had more than one adjustment to the mode of assessment. The adjustments were specific to the actual assessments and in some cases different arrangements were required on the SQE2 written and oral assessments.

The most common adjustments were as follows, with similar patterns seen across SQE1 and SQE2.

  1. Up to 25% extra time.
  2. Stop the clock (STC): the candidate is given a prescribed amount of time that they can use for breaks. They can take as many breaks as they wish, and decide the length of each break, up to the agreed amount of STC time.
  3. More than 25% extra time.
  4. Own assessment room (this is more common at SQE1 assessments).
  5. Access to medicine, snacks or water during the assessment.

Other bespoke provisions were also arranged for candidates where evidence supported this. Examples of these included access to medical devices and use of a screen overlay. The types of other bespoke arrangements made were similar to the adjustments listed in the SQE Annual Report 2021/22.

After each assessment, candidates are invited to complete a survey to provide feedback about their experience. The questions in the survey (Appendix 2) relate to:

  • the SQE website
  • operations
  • assessment specification and questions
  • reasonable adjustments
  • SQE in Welsh
  • apprentices
  • the overall SQE service.

Candidates can provide general comments via a free-text box. They can also provide their contact details should they wish to be contacted further about their feedback.

These surveys continue to provide valuable information for Kaplan and the SRA to consider and, overall, it is clear that a significant proportion of candidates are still dissatisfied with the online journey.

Whilst the website is now better able to support large volumes of candidates at results release, the ongoing need for queuing and lack of confidence during booking windows causes frustration and concern for candidates.

The feedback from candidates in the surveys shows a mix of experiences. Whilst there have been some adverse events that have regrettably impacted delivery for isolated pockets of candidates, the assessment has been delivered without issue to the vast majority.

Although the survey invites candidates to give feedback about the SQE in Welsh, there have so far been no candidates opting to take the assessment in Welsh. We are continuing with the phased introduction of the SQE in Welsh. From SQE2 October 2023, the whole of the SQE2 can be taken in Welsh.

Preparations are also ongoing to offer the SQE1 in Welsh from January 2025, following a small-scale pilot exercise in 2023. A report on the pilot for the SQE1 in Welsh will be published in early 2024.

All responses to the candidate survey are collated and analysed, with action plans put in place where improvements can be made, or new opportunities and solutions can be explored.

Feedback from candidates and stakeholders has also been collected, reviewed and considered from various other sources during the course of delivering the assessments. This includes input received from the SRA, SRA Psychometrician and the Independent Reviewer of the SQE about the overall delivery of the SQE assessments, and their oversight of any issue management should it arise.

During the reporting period, an issue unfortunately arose where an isolated error was found regarding the quality of marking for one of the sixteen assessment stations for the April 2023 sitting of SQE2. This issue was identified shortly after the results were released when a small number of candidates made enquiries about the detailed pattern of results they received. An error was found in a few of the Business Case and Matter Analysis scripts in a set of 346. In order to reassure candidates, we checked that no other errors occurred within this set. The review was concluded and the results of the review were presented to the Assessment Board in October 2023. The Assessment Board approved the results of the review, which were as follows:

  • No errors were found for 324 candidates and therefore their marks did not change
  • 21 candidates’ marks increased very slightly for the BCMA assessment station, but this did not lead to a change in their overall SQE2 result
  • One candidate’s marks were increased for the BCMA assessment station, which led to them passing the SQE2 assessment overall.

Although there are rigorous assurance processes in place, we have reviewed our processes further in light of this error and include a specific review of zero and low scoring scripts during the marking and analysis of the SQE assessments. We communicated the outcomes to all affected candidates and apologised for the uncertainty this caused.

Table 21 summarises some of the key areas brought to our attention for improvement, and what is being done in response.

Table 21: Key areas of feedback and responses
Feedback said… Our response has been…
Website: Prolonged booking journey and queuing Investing in the performance of the website to reduce reliance on queuing. Reviewing ways to stagger the demand seen on day 1 of the booking window. Working with training providers and candidates to better anticipate the demand at each assessment.
Website: The SQE2 booking journey is challenging for candidates. A review of the booking journey is underway and initial improvements are expected in 2024.
SQE1 and SQE2: Candidates wanted more sample questions. More SQE1 sample questions were published in late 2023 and more SQE2 sample questions are being prepared for publication in 2024.
SQE1: Candidates wanted more date options for taking SQE1 assessments Ready to increase the number of testing days in each SQE1 assessment window from January 2024, allowing increased flexibility for candidates
SQE1: More detailed feedback to candidates about their performance on SQE1 A detailed breakdown will be available for candidates when they receive their January SQE1 results. Guidance has also been provided on the website to help candidates to understand the more detailed information available
SQE1 and SQE2 Written: The quality of test centres and reliability of their technology has not consistently met expectations. The candidate satisfaction survey has been modified to collect feedback about specific sites and this feedback is acted on at centre-level. Improvements have been made by Pearson VUE, and compared to 2021/22 the overall satisfaction with administration at the test centres and their suitability rose for three assessment windows but was less positive for two.
SQE2: The assessment interface for SQE2 written assessments was difficult to use. Troubleshooting guidance has been issued to all centres regarding overwriting due to the ‘insert’ key, which is a challenge for some candidates during the assessment. This has improved the candidate experience and reduced the number of issues reported about this.

In response to a recommendation made by the SQE Independent Reviewer, Kaplan continues to work with Pearson VUE regarding enhancements to the platform. Guidance has been issued to SQE markers about how to treat spelling and grammar during marking. The guidance was also published online to assist candidates preparing for the assessment.

Kaplan continues to share candidate feedback about the assessment interface with Pearson VUE, to inform their plans for development.
Reasonable Adjustments: Continue to operationalise a wider range of assistive technologies. The first JAWS SQE assessment and the first Read&Write SQE assessment were delivered. Work is ongoing to introduce further assistive technologies in 2024.
Reasonable Adjustments: Candidates want information about how to apply for an RA and what they can expect presented in an alternative format. In August we published a flowchart/simplified explanation of the RA process.
Providing data to training providers to help them with their course development Anonymised candidate data will be provided to training providers after each SQE1 assessment window, starting from the January 2024 sitting, and annually for SQE2.
Provide more information to candidates about training providers. A drop-down menu which collects information on candidates’ training providers and courses has been added to the survey. This will help provide more accurate information to aid the SRA’s approach to publishing performance data.
Improved visibility of key dates to support candidates preparing for the assessments. A consolidated table of exam dates showing exam, booking, results and law cut off dates for each assessment is now available on our website. Assessment window dates have been published further in advance this year.
Appendix 1: Candidate online monitoring and maximising diversity survey (demographic data survey) questions
Table Category Full Question in the Survey
6 Ethnicity What is your ethnic group?
7 Disability Do you consider yourself to have a disability according to the definition in the Equality Act 2010?
8 Age What age category are you in?
9 Sex What is your sex?
Gender same as sex registered at birth Is your gender the same as the sex you were registered at birth?
Sexual orientation What is your sexual orientation?
10 Religion/belief What is your religion or belief?
11 Parents attended university Did either of your parents attend university by the time you were 18?
Occupation of main household earner What was the occupation of your main household earner when you were aged about 14?
Type of school attended Which type of school did you attend for the most time between the ages of 11 and 16?
12 Highest level of education What is your highest level of education?
Undergraduate degree classification What was your undergraduate degree classification?
Qualifying work experience undertaken Have you undertaken any qualifying work experience?
Qualified lawyer status If you are a qualified lawyer, please state the country in which you achieved your law qualification(s).

Note: We have started collecting data about candidates’ first language and this will be included in the next annual report.

Appendix 2: Post-assessment candidate survey questions
Question
The information on the SQE website about the assessment was helpful
The assessment specification provided me with enough guidance about the content and format of the assessment
It was a simple process to register for my candidate account
It was a simple process to book my assessment
The assessment was at a suitable venue
The administration on the day was efficient
The assessment questions were clear
The process for requesting reasonable adjustments was easy to use
The supporting information provided was helpful in telling me what I needed to do to request a reasonable adjustment
The reasonable adjustment received on the day matched the reasonable adjustment agreed in advance of the assessments
How would you rate your overall satisfaction with the service provided by the Kaplan SQE Equality and Quality team?
How would you rate your overall satisfaction with the SQE assessment service provided by Kaplan SQE?

For each question, candidates are asked to say whether they are very satisfied, satisfied, neither, unsatisfied or very unsatisfied.
