Solicitors Qualifying Examination Annual Report 2023/24

Executive Summary

The SQE is a rigorous assessment designed to assure consistent, high standards for all qualifying solicitors, consisting of two parts:

  • SQE1, which tests candidates' functioning legal knowledge (FLK1 and FLK2 assessments)
  • SQE2, which tests candidates' practical legal skills and functioning legal knowledge

This 2023/24 annual report of the Solicitors Qualifying Examination (SQE) contains data about more than 14,600 individual candidates who took part in the SQE between July 2023 and July 2024. It covers seven SQE assessment windows as follows:

SQE1 (FLK1 and FLK2 assessments)

  • January 2024
  • July 2024

SQE2

  • July 2023
  • October 2023
  • January 2024
  • April 2024
  • July 2024

The outcomes of the SQE2 July 2023 assessment window are included in this annual report because they were not available for the 2022/23 annual report. Future reports will include assessments delivered in the annual reporting period ie SQE2 October through to SQE2 July.

Guided by the outcomes of the recent report published by the University of Exeter, 'Potential Causes of Differential Outcomes by Ethnicity in Legal Professions', Kaplan will work with the SRA to explore how to supplement and improve the data collected from candidates so that we can gain more detailed and reliable insights about the factors or variables contributing to differential outcomes on the SQE assessments.

i) SQE1 Summary

  • 11,242 candidates took one or more of the FLK1 and FLK2 assessments. Of these, 9,879 candidates took SQE1 for the first time (across the two assessment windows).
  • The overall pass rate for SQE1 in the January 2024 assessment window was 56%, similar to the pass rate in the previous year (51% in January 2023).
  • The SQE1 pass rate in the July 2024 window was 44%, lower than seen in the previous year (53% in July 2023).
  • First attempt pass rates in January 2024 and July 2024 were 59% and 48% respectively. The pass rates for resit candidates were lower at 16% and 20% respectively.
  • 29% of candidates who failed part or all of SQE1 in January 2024 resat in July 2024. Their pass rates were 36% and 30% for FLK1 and FLK2 respectively.
  • FLK1 scores were higher than FLK2 scores in both SQE1 assessment windows.
  • All the SQE1 assessments showed very good reliability with low standard errors of measurement (see Descriptive and Quality Assurance Statistics section).

ii) SQE2 Summary

  • 5,490 individual candidates took an SQE2 assessment, with 5,276 candidates sitting as a first attempt.
  • 81% of the SQE2 candidates had passed SQE1 and 19% were candidates who did not sit SQE1 (due to transitional arrangements or exemptions).
  • The pass rate across all the SQE2 assessments in the reporting period was 76%. This was higher for candidates who had taken SQE1 (83%) than for those who did not sit SQE1 (43%). The overall pass rate for resit candidates was 33%.
  • 14% of the candidates who failed SQE2 resat at the earliest opportunity.
  • All five SQE2 assessments showed very good reliability with low standard errors of measurement.

iii) Key messages

Assessment reliability

  • Data suggests both SQE1 and SQE2 assessments were reliable. This provides assurance that the assessments were well constructed and measured candidate performance consistently within each assessment.

Pass rates

  • The pass rate for SQE2 was markedly higher than for SQE1. This is consistent with the previous year and is expected given the SQE2 eligibility requirement to have qualification-level functioning legal knowledge (ie have passed SQE1 or are a transitional candidate).
  • Analyses of pass rates by candidate diversity and socio-economic data suggested some differences in performance between groups in the individual assessments. However, there was no evidence of bias in the SQE assessments overall. This will continue to be monitored.

Resits

  • Pass rates for resit candidates were lower than for first attempt candidates across all assessments.

Candidate trends

  • The early SQE2 cohorts included large proportions of transitional candidates who were not required to take SQE1; the shift away from this has continued. The SQE2 assessment in October 2023 was the last opportunity to sit for candidates qualifying under the Qualified Lawyers Transfer Scheme, although transitional arrangements continue for candidates with the Legal Practice Course. In four of the five SQE2 assessments in this period, the proportion of candidates who had sat SQE1 was 75% or more (range across all five assessments: 50% to 91%).
  • Candidates were more likely to pass SQE2 if they had taken SQE1 or were not a qualified lawyer.
  • Candidates with higher university degree classifications were more likely to perform better in the assessments.

Apprentices

  • Approaching 5% of assessments taken in this period were by solicitor apprentice candidates.
  • The small number of apprentices who have taken the assessment in this period have typically performed better than other candidates in SQE2. Their performance in SQE1 assessments is similar to, or better than, other candidates.

This SQE annual report provides a cumulative picture of the outcomes from the assessments that took place in the reporting period (July 2023 - July 2024).

Statistics and commentary are provided on the overall performance of candidates at the individual assessment level to enable comparisons over time and identify any emerging trends. Assessment data is provided, where applicable, at the cumulative level.

Seven assessment windows are covered in this report with two for SQE1 and five for SQE2 as follows:

SQE1 (FLK1 and FLK2 assessments)

  • January 2024
  • July 2024

SQE2

  • July 2023
  • October 2023
  • January 2024
  • April 2024
  • July 2024

The outcomes of the SQE2 July 2023 assessment window are included in this annual report because they were not available for the 2022/23 annual report. Future reports will include assessments delivered in the annual reporting period ie SQE2 October through to SQE2 July.

When preparing this report, the results for the October 2024 SQE2 assessment window deliveries were not available. We have included some provisional data on the number of candidates who attended this assessment where relevant.

i) About the SQE

The SQE is a single rigorous assessment designed to assure qualifying solicitors have been assessed to a consistent, high standard. The SRA's Statement of Solicitor Competence sets out what solicitors need to be able to do to perform the role effectively, and provides everyone with a clear indication of what to expect from a solicitor. This is what the SQE tests.

The SRA has appointed Kaplan SQE (Kaplan) as the approved assessment provider for the delivery of the SQE assessments and other related services. Since the SQE was launched in 2021, over 8,900 candidates have completed the SQE.

In this reporting period, SQE assessments were delivered to more than 14,600 individual candidates in 37 countries.

ii) SQE1

SQE1 consists of two 180-question, multiple-choice, single-best-answer assessments (FLK1 and FLK2) in the following subject areas:

  • Business Law and Practice; Dispute Resolution; Contract; Tort; Legal System of England and Wales; Constitutional and Administrative Law and EU Law; and Legal Services (FLK1).
  • Property Practice; Wills and the Administration of Estates; Solicitors Accounts; Land Law; Trusts; Criminal Law and Practice (FLK2).

These are delivered electronically in controlled and invigilated exam conditions at Pearson VUE test centres across the UK and internationally.

Each FLK assessment was run across five consecutive days, with each candidate taking the assessment on one of the five days. The FLK1 and FLK2 assessments took place in consecutive weeks within each assessment window.

Each FLK assessment is split into two sessions of 2 hours 33 minutes, with 90 questions in each session. There is a 60-minute break between the sessions. Different assessment forms (papers) were allocated at random to candidates throughout each five-day assessment, with each form having a separate pass mark.

In order to pass SQE1, a candidate must pass both the FLK1 and FLK2 assessments. Candidates who fail their first attempt have two further opportunities to take the assessment(s) they failed (FLK1 and/or FLK2). More information can be found in the SQE1 Assessment Specification.

iii) SQE2

SQE2 comprises 16 stations (12 written stations and four oral stations) that assess both skills and the application of legal knowledge.

The stations in SQE2 cover six legal skills:

  • Advocacy
  • Case and matter analysis
  • Interview and attendance note/legal analysis
  • Legal drafting
  • Legal research
  • Legal writing.

This is across five practice areas:

  • Business, organisations, rules and procedures
  • Criminal Litigation
  • Dispute Resolution
  • Property Practice
  • Wills and Intestacy, Probate Administration and Practice.

SQE2 written assessments take place in Pearson VUE test centres over three consecutive half days and all candidates take the same written stations on the same date.

SQE2 oral assessments take place over two consecutive half days. During the reporting period, oral assessments took place in centres in Birmingham, Cardiff, Manchester and London. The logistics involved in running the oral assessments mean that not all candidates in a cohort can take the same oral stations on the same day, so multiple “sittings” are used for SQE2 oral stations. To protect the integrity of the assessments and to ensure equity, different tasks are set for the oral stations used at the different sittings. However, the same skills and practice areas are covered in all sittings of an assessment window.

SQE2 has a single pass mark for the whole assessment, covering all 16 stations. There may be slightly different pass marks between the SQE2 sittings to account for differences in the difficulty of the different oral station tasks, as described above.

More information can be found in the SQE2 Assessment Specification.

iv) Exemptions

Exemptions from the SQE assessments are only available to qualified lawyers. Whilst exemptions are available for SQE1 (FLK1 and/or FLK2) and SQE2 assessments, SQE1 exemptions are rarely given.

v) Transitional arrangements

There are some candidates who meet the SRA transitional arrangements and are using SQE2 to qualify as a solicitor. They are not required to take SQE1.

vi) Types of candidates taking SQE assessments

To summarise, there are three types of candidates:

  1. Candidates going through the full SQE route and sitting both SQE1 and SQE2.
  2. Transitional candidates who are not required to take SQE1 and only sit SQE2.
  3. Qualified lawyers who either sit the whole of the SQE or have an exemption from FLK1 and/or FLK2 and/or SQE2.

i) Number of Candidates

The data provided in this report relate to candidates who received a mark for any of the assessments. Candidates whose attempts were discounted due to mitigating circumstances are not included. Outcome data is provided separately for FLK1 and FLK2 assessments and overall for SQE1. Outcome data is also provided for SQE2.

In this reporting period, a total of 14,625 individual candidates received a mark for one or more of the SQE assessments. Table 1 below provides the number of candidates for each assessment, along with the numbers and proportions of candidates by attempt number, where applicable.

ii) Attempts and Resits

Candidates are allowed up to three attempts for each assessment within a six-year period. At the time of writing this report, there had been six opportunities to sit SQE1 and nine opportunities to sit SQE2. A small number of candidates had made a third attempt at the assessments.

At their first SQE1 attempt, candidates are required to sit both FLK1 and FLK2 in the same assessment window. If they fail one, they only need to resit that one assessment. Any passes can be carried forward and used within a six-year period.

Because of this, and owing to mitigating circumstances that are applied separately for FLK1 and FLK2, the number of candidates may differ across FLK1, FLK2 and SQE1 overall.
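The attempt and carry-forward rules above can be summarised programmatically. The sketch below is purely illustrative (it is not Kaplan's candidate-management system, and the record structure is hypothetical); it simply encodes the three-attempt limit, the six-year validity of a pass, and the rule that SQE1 is passed once in-date passes are held in both FLK1 and FLK2.

```python
# Illustrative sketch only - encodes the SQE1 attempt rules described above,
# not Kaplan's actual candidate-management system.
from dataclasses import dataclass
from typing import Optional

MAX_ATTEMPTS = 3        # up to three attempts per assessment (FLK1 or FLK2)
VALIDITY_YEARS = 6      # passes must be used within a six-year period

@dataclass
class FlkRecord:
    attempts: int                      # attempts used so far
    pass_year: Optional[float] = None  # year a pass was achieved, if any

    @property
    def passed(self) -> bool:
        return self.pass_year is not None

def sqe1_passed(flk1: FlkRecord, flk2: FlkRecord, now: float) -> bool:
    """SQE1 is passed when in-date passes are held in both FLK1 and FLK2."""
    def valid(rec: FlkRecord) -> bool:
        return rec.passed and (now - rec.pass_year) <= VALIDITY_YEARS
    return valid(flk1) and valid(flk2)

def may_resit(rec: FlkRecord) -> bool:
    """Only the failed assessment needs to be retaken, while attempts remain."""
    return (not rec.passed) and rec.attempts < MAX_ATTEMPTS

# Example: a pass in FLK1 is carried forward while FLK2 is resat.
flk1 = FlkRecord(attempts=1, pass_year=2024.0)
flk2 = FlkRecord(attempts=2)
print(sqe1_passed(flk1, flk2, now=2024.5))  # False - FLK2 not yet passed
print(may_resit(flk2))                      # True - one attempt remaining
```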

Table 1: Number (and proportion) of marked candidates by attempt
Assessment Assessment window Number of Candidates Number of Candidates by Attempt
1st Attempt 2nd Attempt 3rd Attempt Mixed Attempt Numbers**
FLK1 Jan 2024 6204 5614 (90%) 534 (9%) 56 (1%) -
FLK2 Jan 2024 6476 5606 (86%) 757 (12%) 113 (2%) -
SQE1* Jan 2024 6061 5581 (92%) 441 (7%) 39 (1%) 0 (0%)
FLK1 Jul 2024 5332 4224 (79%) 994 (19%) 114 (2%) -
FLK2 Jul 2024 5519 4209 (76%) 1156 (21%) 154 (3%) -
SQE1* Jul 2024 5006 4168 (83%) 744 (15%) 71 (2%) 23 (<1%)
SQE2 July 2023 1004 944 (94%) 56 (6%) 4 (<1%) -
SQE2 October 2023 642 542 (84%) 90 (14%) 10 (2%) -
SQE2 January 2024 876 833 (95%) 41 (5%) 2 (<1%) -
SQE2 April 2024 2181 2097 (96%) 82 (4%) 2 (<1%) -
SQE2 July 2024 932 860 (92%) 65 (7%) 7 (1%) -

*Data provided for candidates who sat both FLK1 and FLK2 in the assessment window

**Where candidates sat both FLK1 and FLK2 in the same assessment window but with different attempt numbers (eg due to a previous discounted attempt)

iii) SQE2 October 2024

Although the SQE2 assessment in October 2024 had taken place when this report was written, the marks had not been released to candidates. There were 1,044 candidates who sat SQE2 in October 2024. These break down approximately as:

  • 81% as first attempt
  • 18% as second attempt
  • 1% as third attempt.

These numbers will change if there are successful mitigating circumstance claims with attempts being discounted.

Statistical reports are published after results are released. Data from the October 2024 assessment will be included in the 2024/25 annual report.

iv) Descriptive and Quality Assurance Statistics

Table 2 provides the pass marks for each assessment, the average score (Mean) and standard deviation (SD), and Table 3 provides measures of test reliability: Cronbach's alpha and the standard error of measurement (SEm).

With the move to using scaled scores for the SQE1 assessments, alongside the introduction of multiple testing days, we deployed more than one form of the assessment for each assessment window. To achieve accurate and fair comparisons between test takers, scores are reported on a scale of 0 to 500 on which each form's pass mark is scaled to 300 (0 equates to 0% and 500 equates to 100%). The quality statistics are provided as an average of the values for the multiple assessment forms for each assessment window.
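As an illustration of the reporting scale described above, the sketch below shows one simple way a raw percentage score could be mapped onto the 0 to 500 scale with the form's pass mark anchored at 300. This is an assumption made only for illustration: the piecewise-linear mapping and the 58% pass mark in the example are hypothetical, and the report does not state the exact scaling method used.

```python
# Illustrative sketch only: one possible piecewise-linear mapping of a raw
# percentage score onto the 0-500 reporting scale, pass mark anchored at 300.
def to_scaled_score(percent_correct: float, pass_mark_percent: float) -> float:
    """Map a raw percentage score to the 0-500 scale, with pass mark -> 300."""
    if percent_correct <= pass_mark_percent:
        # 0% maps to 0, the pass mark maps to 300
        return 300.0 * percent_correct / pass_mark_percent
    # the pass mark maps to 300, 100% maps to 500
    return 300.0 + 200.0 * (percent_correct - pass_mark_percent) / (100.0 - pass_mark_percent)

# Example with a hypothetical 58% pass mark: 58% -> 300, 65% -> ~333, 100% -> 500.
print(round(to_scaled_score(58, 58)))    # 300
print(round(to_scaled_score(65, 58)))    # 333
print(round(to_scaled_score(100, 58)))   # 500
```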

Cronbach’s alpha

Cronbach’s Alpha (α) is a measure of test reliability that estimates the internal consistency, or how closely related the sets of items are in a test. It therefore tells us how well items (questions) work together as a set. A high α coefficient suggests that candidates tend to respond in similar ways from one item to the next. Values for α range from 0 (where there is no correlation between items) to 1 (where all items correlate perfectly with one another). The widely accepted gold-standard α for high-stakes assessments is 0.8.

In all SQE1 assessments to date, α has been greater than 0.9, and in all SQE2 assessments it has been above 0.8, suggesting very good internal consistency and high reliability for the SQE1 and SQE2 assessments.
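For readers interested in how α is calculated, the sketch below applies the standard Cronbach's alpha formula to a simulated candidates-by-items score matrix. The data are made up purely to demonstrate the calculation and are not SQE results.

```python
# Minimal sketch of Cronbach's alpha computed from an item-score matrix
# (rows = candidates, columns = items). Simulated data, not SQE results.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = candidates, columns = items (e.g. 0/1 marks)."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of candidates' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
ability = rng.normal(size=500)                   # hypothetical candidate abilities
difficulty = rng.normal(size=20)                 # hypothetical item difficulties
responses = (ability[:, None] + rng.normal(size=(500, 20)) > difficulty).astype(float)
print(round(cronbach_alpha(responses), 2))       # internal consistency of this toy test
```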

Standard error of measurement (SEm)

The SQE assessments provide an observed/obtained score for a candidate at a snapshot in time (the assessment). If the candidate were to sit the same assessment on another occasion, they may achieve a different score owing to various factors. Some of these can be controlled to an extent, such as the training provided, and some cannot, such as the amount of sleep the candidate got the night before the assessment.

A candidate's theoretical “true” score can only be estimated by their observed score, but there is an inevitable degree of error around each candidate's observed score, which is consistent with most assessments.

The standard error of measurement (SEm) for an assessment is an estimate of how repeated measures of the same group of candidates on the same assessment would be distributed around their theoretical “true” scores. The SEm is a function of the reliability of the assessment (α) and the standard deviation in scores on the assessment. Generally, the higher the reliability, the lower the standard error of measurement and vice-versa.

For all SQE assessments to date, the SEm has been below 4%, which provides confidence that observed scores generally represent a very good approximation of true scores in these assessments.
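As a worked illustration of that relationship, the SEm can be approximately reproduced from the figures already reported. Using the FLK1 January 2024 values from Tables 2 and 3 (a standard deviation of 63 scaled points, ie 12.6 percentage points on the 0 to 500 scale, and α = 0.934), SEm = SD × √(1 − α) gives roughly 3.2%, close to the reported 3.224% (the published value is averaged over forms, so an exact match is not expected).

```python
# Worked illustration: SEm = SD * sqrt(1 - alpha), using FLK1 January 2024
# figures from Tables 2 and 3. The published SEm is averaged over forms, so
# this back-of-envelope value is approximate.
import math

sd_percent = 63 / 500 * 100       # SD of 63 scaled points expressed in % terms
alpha = 0.934                     # Cronbach's alpha from Table 3
sem = sd_percent * math.sqrt(1 - alpha)
print(round(sem, 1))              # ~3.2%, close to the reported 3.224%
```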

Table 2: Pass marks, descriptive and quality assurance statistics
Assessment window Assessment Sitting No. of Candidates Pass Mark Mean SD
Jan 2024 FLK1 - 6204 300 310* 63*
FLK2 - 6476 300 305* 70*
Jul 2024 FLK1 - 5332 300 296* 64*
FLK2 - 5519 300 287* 71*
Jul 2023 SQE2 1 422 62% 72.1% 10.6%
2 285 61% 68.9% 11.2%
3 184 61% 65.9% 10.5%
4 113 62% 64.9% 10.9%
 Oct 2023  SQE2 1 408 62% 65.2% 10.6%
2 234 61% 63.2% 12.1%
 Jan 2024  SQE2 1 490 62% 68% 10.3%
2 386 61% 65.2% 10.4%
Apr 2024  SQE2 1 564 61% 67.7% 9%
2 612 61% 67.3% 8.6%
3 513 62% 68.4% 9.1%
4 492 61% 66.7% 9.6%
Jul 2024  SQE2 1 343 62% 68% 11.1%
2 348 61% 68.7% 9.8%
3 241 61% 64.8% 10.3%

*Average values (across the multiple forms) are provided for the FLK1 and FLK2 assessments, with scaled values provided where 0 equates to 0%, 300 equates to the pass mark, and 500 equates to 100%.

Note - for SQE2 there were multiple sittings for the oral assessments (see introduction)

Table 3: Quality assurance statistics
Assessment window Assessment Sitting No. of Items/Stations Test Quality Indicators
Cronbach's Alpha SEm
Jan 2024 FLK1 - 180 0.934 3.224%
FLK2 - 180 0.941 3.272%
 Jul 2024 FLK1  - 180 0.931 3.279%
FLK2  - 180 0.937  3.300%
Jul 2023 SQE2 1 16 0.875 3.697%
2 16 0.876 3.687%
3 16 0.867 3.795%
4 16 0.865 3.832%
 Oct 2023  SQE2 1 16 0.880 3.793%
2 16 0.901 3.658%
 Jan 2024  SQE2 1 16 0.861 3.732%
2 16 0.851 3.795%
 Apr 2024  SQE2 1 16 0.825 3.602%
2 16 0.837 3.529%
3 16 0.846 3.549%
4 16 0.862 3.604%
 Jul 2024  SQE2 1 16 0.890 3.672%
2 16 0.869 3.571%
3 16 0.871 3.708%

*Average values (across the multiple forms) are provided for the FLK1 and FLK2 assessments.

Note - for SQE2 there were multiple sittings for the oral assessments (see introduction).

v) Candidate Journey

Tables 4a to 4c below summarise the journey so far for the candidates who have received assessment marks in this reporting period. These are provided separately for SQE1 (candidate outcomes) and SQE2 (candidate routes and outcomes). The candidates shown in table 4b include transitional candidates not required to take SQE1.


Table 4a: SQE1* Candidate outcomes (11,242 individual candidates)
Outcome Jan 2024 Jul 2024
Passed SQE1 and sat SQE2 Apr 2024 1477 n/a
Passed SQE1 and sat SQE2 Jul 2024 630 n/a
Passed SQE1 and sat SQE2 Oct 2024 (excludes any resitting from SQE2 Apr 2024) 246 411
Passed SQE1 and yet to sit SQE2 686 1788
Passed SQE1 and exempt from SQE2 684 557
Failed part or all of SQE1 in Jan 2024, resat and did not pass in Jul 2024 n/a 592
Failed part or all of SQE1 - yet to resit 1674 2497
Total 5397 5845

*Reported for the second SQE1 window where a candidate has sat in both assessment windows in this period.

Of the 1,477 candidates who passed SQE1 in January 2024 and went on to sit SQE2 in April 2024, 1,324 (90%) passed and 153 (10%) failed. For those who sat SQE2 in July 2024, 535 (85%) passed and 95 (15%) failed.

Of the 2,280 candidates who failed FLK1 and/or FLK2 in January 2024, 620 (27%) passed SQE1 in July 2024 (included within the Passed rows of the table above).

Of those who attempted just FLK1 or FLK2 in either January 2024 or July 2024, 913 (69%) passed their remaining assessments (included within the Passed rows of the table above).

Candidates who fail their first or second attempts may benefit from reviewing the information contained later in this report relating to candidate performance in different practice areas.


Table 4b: Candidate routes to SQE2* (5,490 individual candidates)
Route Jul 2023 Oct 2023 Jan 2024 Apr 2024 Jul 2024
Passed SQE1 Nov 2021 15 6 11 3 4
Passed SQE1 Jul 2022 91 28 47 22 15
Passed SQE1 Jan 2023 647 171 67 51 20
Passed SQE1 Jul 2023 n/a 112 592 389 86
Passed SQE1 Jan 2024 n/a n/a n/a 1477 630
Transitional/exempt candidates 251 325 143 170 117
Total 1004 642 860 2112 872

*Reported for the first SQE2 window where a candidate has sat in more than one assessment window in this period.

Of the 1,006 candidates who did not sit SQE1 and sat SQE2 between July 2023 and July 2024, 460 (46%) passed. This compares to the higher rate of 85% for those taking SQE2 after passing the SQE1.

Table 4c: SQE2 Candidate outcomes* (5,490 candidates)
Outcome Jul 2023 Oct 2023 Jan 2024 Apr 2024 Jul 2024
Passed as a 1st attempt 776 374 633 1693 666
Passed as a 2nd attempt 16 32 11 29 25
Passed as a 3rd attempt 1 2 1 0 3
Awaiting a resit result (Oct 2024) 8 10 73 96 n/a
Failed 1st attempt and yet to resit 130 184 120 322 202
Failed 2nd attempt and yet to resit 0 0 6 41 36
Total 931 602 844 2181 932

*Reported for the latest SQE2 window where a candidate has sat in more than one assessment window in this period.

Of the 4,262 candidates who passed SQE2 in this reporting period:

  • 3,802 (89%) had passed SQE1
    • 180 (4%) in November 2021 or July 2022
    • 853 (20%) in January 2023
    • 910 (21%) in July 2023
    • 1,859 (44%) in January 2024
  • 460 (11%) were not required to take SQE1

Of the 1,228 candidates who did not pass SQE2 in this reporting period:

  • 682 (56%) had passed SQE1
    • 62 (5%) in November 2021 or July 2022
    • 103 (9%) in January 2023
    • 269 (22%) in July 2023
    • 248 (20%) in January 2024
  • 546 (44%) were not required to take SQE1

Table 5 shows the candidate pass rates (and number passing) for each assessment for all candidates and by attempt number, and for SQE2 by whether SQE1 had been previously sat.

i) Pass rates on SQE1 (FLK1 and FLK2)

Of the two SQE1 assessments, the pass rates for FLK1 have been higher than for FLK2 (+2% and +5% for January 2024 and July 2024 respectively). We will continue to monitor this across future assessments.

The overall pass rate for SQE1 was higher in January 2024 (56%) than in July 2024 (44%).

ii) Pass rates for SQE1 resitting candidates

Pass rates in both January and July 2024 were lower for resitting candidates when compared to first attempt candidates. This was true for both FLK1 and FLK2.

The pass rates for resitting candidates were higher in July than in January for FLK1 (increasing from 35% to 45%) but were similar for FLK2 (45% and 44% respectively for January and July).

The proportion of candidates resitting increased between the January and July 2024 assessments for both FLK1 and FLK2. This rose from 10% to 21% for FLK1 and from 13% to 24% for FLK2.

The lower pass rates for resitting candidates might indicate that they should consider taking more time (and/or putting in more work or training) between sittings. This may help them improve from a failing to a passing standard.

iii) Pass rates on SQE2

The SQE2 pass rates are higher than for SQE1. This is expected given the SQE2 eligibility requirement to have qualification-level functioning legal knowledge (ie have passed SQE1 or are a transitional candidate).

The proportion of candidates taking SQE2 who were not required to sit the SQE1 has been lower in the more recent SQE2 assessments (eg 51% of all candidates in October 2023 vs 15% in July 2024). For those qualifying under the Qualified Lawyers Transfer Scheme, the SQE2 in October 2023 was the last opportunity for these candidates to sit the assessment. However, transitional arrangements continue for candidates with the Legal Practice Course.

Candidates who sat SQE1 performed better than those who had not, with pass rates ranging between 79% and 90%, compared to between 36% and 47% for those who had not sat SQE1.

iv) Pass rates for SQE2 resitting candidates

The proportions of resitting candidates remain small across the assessments (6%), and their pass rates are significantly lower than for first attempt candidates.

The pass rates for second attempt candidates range between 27% and 38%, compared to a range of 69% to 82% for first attempt candidates. Again, careful consideration should be given to when to resit to allow sufficient time to improve to a passing standard.

Table 5: Assessment pass rates
Assessment Date Candidate % Pass Rates (and number passing)
All 1st Attempt Only 2nd Attempt Only 3rd Attempt Only Split Attempts* Sat SQE1 Did Not Sit SQE1
FLK1 Jan 2024 63% (3930) 66% (3723) 34% (183) 43% (24) - - -
FLK2 Jan 2024 61% (3936) 63% (3543) 44% (330) 56% (63) - - -
SQE1 Jan 2024 56% (3368) 59% (3291) 16% (69) ** - - -
FLK1 Jul 2024 55% (2918) 57% (2416) 45% (446) 49% (56) - - -
FLK2 Jul 2024 50% (2769) 52% (2190) 44% (508) 46% (71) - - -
SQE1 Jul 2024 44% (2178) 48% (2008) 21% (156) 17% (12) ** - -
SQE2 Jul 2023 79% (793) 82% (776) 29% (16) ** - 90% (680) 45% (113)
SQE2 Oct 2023 64% (408) 69% (374) 36% (32) ** - 83% (263) 45% (145)
SQE2 Jan 2024 74% (645) 76% (633) 27% (11) ** - 79% (576) 47% (69)
SQE2 Apr 2024 79% (1722) 81% (1694) 34% (28) ** - 83% (1651) 36% (71)
SQE2 Jul 2024 74% (694) 77% (666) 38% (25) ** - 80% (632) 44% (62)

*Sat both assessments but with a different attempt number for each

**not reportable as fewer than 10 candidates in the attempt group

The SRA collects diversity and socio-economic data to help understand how candidates with different characteristics and backgrounds perform in the assessments. The categories are consistent with data collected by the Office for National Statistics (ONS) and the Social Mobility Commission.

Data is collected from candidates via an online monitoring and maximising diversity survey (or demographic data survey), completed or updated ahead of assessment registration. Appendix 1 lists the data reported on in this section.

The large number of characteristics and groups within characteristics recorded in the data collected means that candidate numbers in some of the groups are small.

In the tables below, we present univariate analysis of the outcomes data, which looks at each of the characteristics individually and independently. Tables 6 to 13 provide the following for FLK1, FLK2, SQE1 and SQE2 for each of the 15 characteristics presented:

  • The outcome of a Chi-square significance test indicating whether there are any significant differences (at a 95% confidence level) between the pass rates of the groups. This is shown in the 'Significant differences' row as 'Yes' (significant differences) or 'No' (no significant differences); a minimal sketch of this kind of test is given after this list.
  • The proportion of candidates within each group ('Proportion %' column). This is calculated with 100% being all candidates who disclosed data and who are in a group with 10 or more candidates.
  • The percentage pass rate for each group with 10 or more candidates ('Pass Rate %' column).
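The sketch below illustrates the kind of Chi-square test of independence referred to in the first bullet, applied to pass/fail counts for two groups. The counts shown are invented for illustration only; the actual tests are run on the grouped pass/fail counts that underlie Tables 6 to 13.

```python
# Minimal sketch of a Chi-square test of independence on pass/fail counts by
# group. The counts below are hypothetical and are not SQE data.
from scipy.stats import chi2_contingency

#          passed  failed
counts = [[350,    150],   # group A (hypothetical)
          [120,    130]]   # group B (hypothetical)

chi2, p_value, dof, expected = chi2_contingency(counts)
significant = p_value < 0.05   # 95% confidence level, as used in the tables
print(round(chi2, 1), round(p_value, 4), significant)
```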

Data in the tables exclude candidates who selected 'Prefer not to say'. The following summarises the proportions of candidates who selected this response:

  • Fewer than 3% of candidates did not disclose age, sex or whether they had undertaken any qualifying work experience.
  • Fewer than 5% of candidates did not disclose their highest level of education or degree classification.
  • Fewer than 10% did not disclose ethnicity, disability status, school type attended or parental education.
  • Fewer than 15% of candidates did not disclose sexuality or religion.
  • Fewer than 20% of candidates did not disclose household socio-economic status.

Compared to the previous year, the proportion of 'Prefer not to say' responses had reduced, which contributes positively to producing more meaningful statistics.

These tables provide data for all candidates who received marks for any assessment (FLK1, FLK2, SQE1 and SQE2). Where a candidate has more than one attempt within any one assessment type in this reporting period, the latest attempt data has been used.

The data is pooled from the January 2024 and July 2024 assessments for FLK1, FLK2, and SQE1, and from the July 2023, October 2023, January 2024, April 2024 and July 2024 assessments for SQE2. Where there are fewer than 10 candidates in any group the proportions and pass rates are not reported; this is indicated by greyed out cells in the table.
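To make the reporting rules described above concrete, the sketch below shows one way the latest-attempt selection, pooling and small-group suppression could be applied to a candidate-level table. The column names and data are hypothetical, and this is not Kaplan's actual reporting pipeline.

```python
# Sketch only (not Kaplan's pipeline): keep each candidate's latest attempt,
# pool across windows, then report a group's size and pass rate only where it
# contains 10 or more candidates.
import pandas as pd

# Hypothetical candidate-level data
df = pd.DataFrame({
    "candidate_id": [1, 1, 2, 3, 4, 5],
    "window":       ["Jan 2024", "Jul 2024", "Jul 2024", "Jan 2024", "Jul 2024", "Jan 2024"],
    "group":        ["A", "A", "A", "B", "B", "B"],
    "passed":       [False, True, True, False, True, True],
})

latest = (df.sort_values("window")              # these window labels happen to sort chronologically
            .groupby("candidate_id").tail(1))   # latest attempt per candidate

summary = latest.groupby("group").agg(n=("passed", "size"), pass_rate=("passed", "mean"))
summary.loc[summary["n"] < 10, "pass_rate"] = None   # suppress groups with fewer than 10 candidates
print(summary)
```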

The full questions asked in the online demographic data survey in relation to each category are available in Appendix 1. This report now includes data about candidates' first language (Table 13).

Overall, the pattern of pass rate differences between groups is largely similar to last year.

Guided by the outcomes of the recent report published by the University of Exeter, 'Potential Causes of Differential Outcomes by Ethnicity in Legal Professions', Kaplan will work with the SRA to explore how to supplement and improve the data collected from candidates so that we can gain more detailed and reliable insights about the factors or variables contributing to differential outcomes on the SQE assessments.

i) Ethnicity

Candidates who reported being in White or Mixed/Multiple ethnic groups achieved higher pass rates than those who reported being in Asian/Asian British or Black/Black British ethnic groups. Differences in pass rates between groups were significant for all assessments.

Table 6: Pass rates by ethnicity characteristics
  FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Significant differences: Yes Yes Yes Yes
Asian / Asian British 30 56 31 53 30 45 22 70
Black / Black British 9 45 8 41 8 32 5 51
Mixed / multiple ethnic groups 5 68 6 65 5 55 5 80
Other 6 52 7 50 6 41 5 66
White 50 73 48 71 51 63 63 84

ii) Disability

Pass rates were similar between candidates who declared a disability and those who did not for FLK1, FLK2 and SQE1 assessments. Candidates who declared a disability achieved higher pass rates for SQE2.

Table 7: Pass rates by disability characteristics
  FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Significant differences: No No No Yes
Do not consider themselves to have a disability 92 63 92 60 92 52 91 77
Do consider themselves to have a disability 8 66 8 62 8 53 9 82

iii) Age

The majority of candidates taking SQE1 and SQE2 were in the younger age groups (under 35 years). More than 40% were in the 16-24 age group, and this group achieved higher pass rates than candidates 35 years or above in FLK1, FLK2, SQE1 and SQE2.

There were fewer than 10 candidates in the 65+ age group for all assessments. The proportions and pass rates are therefore not provided for this group.

Table 8: Pass rates by age characteristics

FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Significant differences: Yes Yes Yes Yes
16 - 24 42 66 40 65 41 57 40 89
25 - 34 44 65 45 60 45 53 48 76
35 - 44 11 57 12 52 11 43 9 51
45 - 54 3 49 3 46 3 35 3 29
55 - 64 <1 41 <1 31 <1 23 <1 29
65+                

iv) Sex, Gender and Sexual Orientation

Candidates who reported their sex as male achieved a higher pass rate than female candidates in SQE1. The opposite was the case in SQE2.

There were no significant differences between candidates whose gender was the same or different to their sex registered at birth.

Candidates who reported that their sexual orientation was Bi, Gay / lesbian or Other achieved higher pass rates than candidates who reported their sexual orientation as Heterosexual / straight in SQE1 assessments. For SQE2, candidates who reported that their sexual orientation was Bi achieved higher pass rates.

There were fewer than 10 candidates selecting 'Other' for sex for all assessments, and for SQE2 there were fewer than 10 candidates selecting 'No' for their gender being the same as the sex registered at birth. The proportions and pass rates are therefore not provided for these groups.

Table 9: Pass rates by sex, gender and sexual orientation characteristics

FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Sex - Significant differences: Yes Yes Yes Yes
Female 65 61 65 59 65 50 64 78
Male 35 69 35 65 35 58 36 75
Other                
Gender same as sex registered at birth - Significant differences: No No No n/a
No <1 75 <1 69 <1 69    
Yes 100 64 100 61 100 53 100 77
Sexual orientation - Significant differences: Yes Yes Yes Yes
Bi 5 74 5 69 5 62 5 88
Gay / lesbian 3 76 3 72 3 66 5 81
Heterosexual / straight 91 62 92 59 91 51 90 76
Other 1 75 <1 70 1 56 <1 67

v) Religion or Belief

There were differences in pass rates between religion/belief groups reported by candidates in all assessments. For SQE1, candidates reporting Hindu, Muslim or Sikh as their religion had lower pass rates, and those reporting no religion or belief or Jewish had higher pass rates. For SQE2, candidates reporting Muslim as their religion had lower pass rates, and those reporting no religion or belief had higher pass rates.

The most frequently indicated group was no religion or belief (40%-45% of candidates for each assessment).

Table 10: Pass rates by religion/belief characteristics

FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Significant differences: Yes Yes Yes Yes
Buddhist 2 62 2 58 2 52 1 67
Christian 33 62 33 58 33 49 32 73
Hindu 6 48 6 45 6 39 4 73
Jewish 1 81 1 76 1 72 2 86
Muslim 13 39 13 38 13 29 9 59
No religion or belief 43 74 43 71 43 63 50 85
Sikh 1 50 1 53 1 43 1 71
Other 1 61 1 54 1 44 1 79

vi) Socio-economic background measured by Occupation of Main Household Earner at 14, Type of School Attended and Parental Education

Candidates who reported the occupation of the main household earner as professional achieved higher pass rates in SQE1 and SQE2.

Pass rates for candidates attending independent or fee-paying schools (non-bursary funded) were higher across all assessments. Candidates who attended school outside the UK achieved significantly lower pass rates in SQE2 than those in the other groups.

Candidates who reported that at least one parent attended university achieved higher pass rates in SQE1 and SQE2.

Table 11: Pass rates by household earner occupation, school type attended and parental education characteristics

FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Occupation of Main Household Earner - Significant differences: Yes Yes Yes Yes
Professional background 66 69 67 65 67 58 67 80
Intermediate background 14 57 14 56 14 48 14 77
Working class background 16 58 15 55 15 47 16 73
Other background 4 54 4 53 4 43 3 69
Type of School Attended - Significant differences: Yes Yes Yes Yes
State-run or state-funded school - non-selective 32 63 32 60 32 52 36 79
State-run or state-funded school - selective on academic, faith or other grounds 15 64 15 61 15 54 17 80
Independent or fee-paying school 13 73 13 71 13 63 18 86
Independent or fee-paying school, where I received a bursary covering 90% or more of my tuition 1 83 1 74 1 69 1 88
Attended school outside the UK 39 62 39 58 39 50 28 69
Other <1 50 <1 52 <1 40 <1 71
I don't know* 32 63 32 60 32 52 36 79
Parents Attended University - Significant differences: Yes Yes Yes Yes
Yes, one or both of my parents attended university 60 70 61 65 60 58 63 80
No, neither of my parents attended university 37 57 37 55 37 46 35 76
Do not know / not sure* 3 45 2 44 3 32 2 58

*Group excluded from the Chi-square test of significance

vii) Highest Level of Education, Undergraduate Degree Classification and Qualifying Work Experience Undertaken

Candidates with at least an undergraduate degree achieved higher pass rates than those with qualifications below degree level in FLK1 and SQE1, with pass rates similar for FLK2 and SQE2. Candidates with qualifications below degree level are a small group accounting for approximately 1% of all candidates. Whilst an undergraduate degree (or equivalent qualification) is required for admission, it is not a requirement for taking the SQE1 or SQE2 assessments.

Candidates with first class undergraduate degree classifications achieved higher pass rates in all assessments and accounted for approaching a quarter of candidates across the assessments.

Candidates who disclosed they had not undertaken qualifying work experience achieved a higher pass rate in all assessments.

All candidates disclosed whether they were already a qualified lawyer. Those who were not qualified achieved higher pass rates in FLK2, SQE1 and SQE2 than those who were qualified. The difference was much larger for SQE2. For FLK1, the pass rates were similar for those who were not qualified and those who were qualified.

There were fewer than 10 candidates with no formal qualifications for all assessments. The proportions and pass rates are therefore not provided for this group.

Table 12: Pass rates by level of education, undergraduate degree classification, QWE and qualified status characteristics

FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
Highest Level of Education - Significant differences: Yes No Yes No
At least an undergraduate degree 97 65 97 61 97 54 98 78
Qualifications below degree level 1 52 1 53 1 43 1 83
No formal qualifications                
Not applicable* 2 49 2 43 2 33 1 40
Undergraduate Degree Classification - Significant differences: Yes Yes Yes Yes
1st 24 82 24 80 25 74 34 91
2:1 47 63 46 61 46 52 49 78
2:2 10 35 10 31 10 23 6 45
3rd 1 21 1 19 1 12 <1 8
Distinction 2 64 2 59 2 51 1 68
Commendation <1 55 <1 61 <1 49 <1 54
Pass 2 43 2 38 2 28 1 36
Not applicable* 14 62 15 56 14 48 9 60
Qualifying Work Experience (QWE) Undertaken - Significant differences: Yes Yes Yes Yes
No QWE undertaken 55 65 55 63 55 55 53 84
QWE undertaken 45 62 45 58 45 50 47 70
Qualified Lawyer Status - Significant differences: No Yes Yes Yes
Not qualified 69 64 69 62 69 55 87 80
Qualified 31 63 31 58 31 49 13 55

*Group excluded from the Chi-square test

viii) First language

English was the first language for the majority of candidates, and this group achieved higher pass rates in all assessments.

Table 13: Pass rates by first language

FLK1 FLK2 SQE1 SQE2
Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate % Proportion % Pass Rate %
First Language - Significant differences: Yes Yes Yes Yes
English 67 66 67 64 67 56 77 81
Other 33 60 33 56 33 47 23 63

 

i) Practice Area Performance in SQE1

The tables in this section show the mean scores by practice area for FLK1 and FLK2 by the following candidate groups:

  • All candidates
  • Passing candidates
  • Failing candidates
  • First attempt candidates
  • Second attempt candidates
  • Third attempt candidates

In FLK1, mean scaled scores across the seven practice areas assessed range from 278 for Dispute Resolution to 362 for Ethics. This suggests candidates find some practice areas more difficult than others.

Four of the practice area mean scaled scores are below 300 (equivalent to the overall passing standard) with Dispute Resolution (278) and Business Law and Practice (279) having the lowest scores. Mean scaled scores were above 300 for Ethics (362), Contract Law (335) and Legal Services (302).

In FLK2, the mean scaled scores appear slightly lower than for FLK1 and range from 265 for Wills and Intestacy to 354 for Ethics. The range in mean scaled scores is similar for both FLK1 and FLK2 (84 vs 89), which again suggests candidates find some areas more difficult than others, with Wills and Intestacy (265) and Property Practice (278) having the lowest scores. Mean scaled scores were above 300 for Ethics (354), Criminal Liability (317) and Land Law (303).

There are similar patterns of performance between the passing and failing candidates - and first, second and third attempt candidates - across the majority of practice areas within each assessment. This suggests both stronger and weaker candidates perform well/less well in the same practice areas.

The differences in mean scores between passing and failing candidates for each of the practice areas range from 85 in FLK1 Ethics to 125 in FLK2 Land Law. The differences are more than 100 for the majority of the practice areas (5 out of 7 for FLK1, 6 out of 7 for FLK2), and less than 100 for Ethics (FLK1 and FLK2) and Legal Services (FLK1).

When comparing performance between attempt numbers, the differences between first attempt and second attempt mean scores range between 8 (FLK2 Ethics) and 36 (FLK1 Tort), with the majority of differences being below 30, and, as expected, higher mean scores for first attempt candidates across all practice areas.

The differences between first and third attempt candidates are smaller than the differences between first and second attempt for all of the FLK1 and FLK2 practice areas. Third attempt candidates scored higher in FLK2 Ethics than the first attempt candidates.

The following plots show the mean scaled scores for the FLK1 and FLK2 practice areas for the passing and failing candidates, with data aggregated across the two assessment windows: pink = failing; green = passing; bars ordered by passing candidate mean scaled scores descending.

In FLK1, the wider difference between passing and failing candidates is evident for Tort and Business Law and Practice, and for FLK2, this is evident for Land Law, Trust Law and Criminal Liability. The shortfall in knowledge of the weaker candidates is therefore greater in these practice areas, suggesting these areas may require more focus/preparation for future attempts at SQE1.

FLK1 Aggregated Mean Practice Area Scores

[Bar chart: mean scaled score (y axis, 0 to 500) by FLK1 practice area (x axis: Ethics, Contract Law, Tort, Legal Services, Legal System, Business Law and Practice, Dispute Resolution), with separate bars for failing and passing candidates.]

Figure 1: Mean FLK1 practice area scores for passing and failing candidates

FLK2 Aggregated Mean Practice Area Scores

[Bar chart: mean scaled score (y axis, 0 to 500) by FLK2 practice area (x axis: Ethics, Criminal Liability, Land Law, Trust Law, Criminal Law and Practice, Property Practice, Wills and Intestacy), with separate bars for failing and passing candidates.]

Figure 2: Mean FLK2 practice area scores for passing and failing candidates

Table 14: FLK1 mean scores by practice area
Practice Area All Pass Fail 1st Attempt 2nd Attempt 3rd Attempt
Business Law and Practice 279 325 212 283 254 272
Contract Law 335 378 271 339 308 315
Dispute Resolution 278 319 218 281 258 273
Ethics 362 397 312 364 351 364
Legal Services 302 341 246 307 274 282
Legal System 294 336 234 299 268 279
Tort 299 346 230 304 268 272

Table 15: FLK2 mean scores by practice area
Practice Area All Pass Fail 1st Attempt 2nd Attempt 3rd Attempt
Criminal Law and Practice 290 338 228 293 272 278
Criminal Liability 317 368 253 321 301 312
Ethics 354 393 303 355 347 360
Land Law 303 358 233 306 286 294
Property Practice 278 328 216 282 260 271
Trust Law 300 351 235 304 280 289
Wills and Intestacy 265 314 202 268 246 260

Comparing performance across practice areas to the previous year, candidates have consistently performed better in Ethics, Contract Law and Tort (FLK1) and Ethics, Criminal Liability and Land Law (FLK2). Business Law and Practice and Dispute Resolution (FLK1) and Property Practice and Wills and Intestacy (FLK2) continue to be more challenging practice areas for the candidates.

ii) Practice Area Performance in SQE2

In each SQE2 assessment window, the candidates are assessed in four written legal skills (Writing, Case & Matter Analysis, Research and Drafting) in five practice contexts (Dispute Resolution, Criminal, Property, Probate and Business). The combination of written legal skills and practice contexts varies between assessment windows, with the exception of Business, which follows the same pattern in every window. However, all candidates sat the same 12 written stations within each assessment window.

Candidates were assessed in two oral legal skills - Interviewing and Advocacy - in four practice contexts: Property and Probate (Interviewing) and Dispute Resolution and Criminal Litigation (Advocacy) within each assessment window. Therefore, in total, the candidates sat four oral legal skills assessments within each assessment window (Property Interviewing, Probate Interviewing, Dispute Resolution Advocacy and Criminal Litigation Advocacy).

Candidates were also required to complete a Written Attendance Note / Legal Analysis for each Interviewing assessment.

In each assessment window, the oral legal skills assessments take place in sittings. The number of oral sittings required depends on the number of candidates taking the assessment and differed across the assessment windows in this period:

  • Four sittings in July 2023
  • Two sittings in October 2023
  • Two sittings in January 2024
  • Four sittings in April 2024
  • Three sittings in July 2024

For more on this, please see the SQE2 Assessment Specification (Organisation and delivery section).

The table in this section shows the mean scores by station (with data aggregated across assessments where there are common stations and aligned to ensure a comparable scale across the assessments) for the following candidate groups:

  • All candidates
  • Passing candidates
  • Failing candidates
  • First attempt candidates
  • Resit candidates (second and third attempts grouped due to small numbers of candidates).

Looking at the station scores for all candidates, mean scores range between 60.1% and 76.9%. Mean scores above 70% were achieved in five of the 24 different stations, as follows:

  • Advocacy - Criminal Litigation (76.9%)
  • Advocacy - Dispute Resolution (74.2%)
  • Legal Research - Dispute Resolution (72.9%)
  • Legal Drafting - Dispute Resolution (70.8%)
  • Legal Writing - Wills and Intestacy, Probate Administration and Practice (70.7%)

At the lower end, mean scores of less than 62% were in:

  • Legal Drafting - Wills and Intestacy, Probate Administration and Practice (61.8%)
  • Legal Drafting - Business Organisations, Rules and Procedures (60.1%).

The Advocacy skills assessed in the oral stations have the overall highest mean score with 75.6%; the Interview and Attendance Note/Legal Analysis skills fall towards the lower end with 64.6%. The mean scores for the written skills assessed range from 64.2% for Legal Drafting to 67.4% for Case and Matter Analysis. Differences between the mean scores for first attempt and resit candidates range from 6.1% (Interview) to 12.4% (Legal Research).

The range in the mean performance across the five practice areas is small, ranging from 66.1% (Business) to 70.3% (Dispute Resolution), which suggests that, overall, candidates do not find particular practice areas easier or more difficult than others. Mean differences between first attempt and resit candidates range between 8.1% (Wills and Intestacy, Probate Administration and Practice) and 12.2% (Business Organisations, Rules and Procedures).

Figure 3 and Figure 4 show the mean scores for the SQE2 skills and practice area scores for the passing and failing candidates with data aggregated across the five assessment windows: pink = failing; green = passing; bars ordered by passing candidates' mean scores descending.

The patterns of performance between passing and failing candidates appear similar across both skills and practice areas. Mean differences between the passing and failing candidates for skills range from 14.1% (Interview and Attendance Note/Legal Analysis) to 19.7% (Advocacy). For the practice areas, the differences between the passing and failing candidates range from 15.3% (Wills and Intestacy, Probate Administration and Practice) to 20.4% (Business Organisations, Rules and Procedures).

The mean score difference between passing and failing candidates was approaching 20% for the following three skills:

  • Advocacy (19.7%)
  • Legal Writing (19.4%)
  • Case and Matter Analysis (19.4%)

For the practice areas, the score difference between passing and failing was close to or above 20% for the following two practice areas:

  • Business (20.4%)
  • Criminal Litigation (19.1%)

The shortfall in the legal skills (and associated legal knowledge) of the weaker candidates is therefore greater in these skills and practice areas, suggesting these areas may require more focus/preparation for future attempts at SQE2.

SQE2 Aggregated Mean Skill Scores

[Bar chart: mean score (y axis, %) by SQE2 skill (x axis: Advocacy, Case and Matter Analysis, Legal Research, Legal Writing, Legal Drafting, Interview and Attendance Note/Legal Analysis), with separate bars for failing and passing candidates.]

Figure 3: Mean SQE2 scores for passing and failing candidates by skill

SQE2 Aggregated Mean Practice Area Scores

[Bar chart: mean score (y axis, %) by SQE2 practice area (x axis: Dispute Resolution, Criminal Litigation, Property Practice, Wills and Intestacy, Probate Administration and Practice, Business Organisations, Rules and Procedures), with separate bars for failing and passing candidates.]

Figure 4: Mean SQE2 scores for passing and failing candidates by practice area

Table 16: SQE2 mean scores by practice area and skill (station)
Practice Area Skill No. of Windows Type All Pass Fail 1st Attempt 2nd/3rd Attempt
Business Organisations, Rules and Procedures CMA 5 Written 67.4 72.7 51.1 68.3 54.4
L Drafting 5 Written 60.1 64.9 45.0 60.8 49.6
L Research 5 Written 64.7 69.0 51.1 65.4 54.3
L Writing 5 Written 64.2 69.6 47.4 65.0 52.4
Criminal Litigation Advocacy 5 Oral 76.9 81.9 61.5 77.6 66.9
CMA 2 Written 67.7 72.0 54.2 68.3 59.0
L Drafting 2 Written 63.2 66.7 52.3 63.8 54.6
L Research 3 Written 66.2 71.5 49.4 67.0 53.3
L Writing 3 Written 64.0 68.6 49.3 64.9 50.1
Dispute Resolution Advocacy 5 Oral 74.2 78.9 59.8 74.9 64.7
CMA 3 Written 66.5 71.3 51.4 67.2 55.2
L Drafting 3 Written 70.8 74.6 58.8 71.1 65.1
L Research 2 Written 72.9 76.7 61.2 73.4 66.0
L Writing 2 Written 63.1 67.8 48.8 63.8 53.4
Property Practice Int/AttNote 5 Oral 65.8 69.3 54.9 66.2 59.5
CMA 3 Written 66.5 70.3 54.3 67.0 58.5
L Drafting 3 Written 69.2 74.7 52.0 70.2 55.2
L Research 2 Written 69.6 74.2 55.6 70.4 58.3
L Writing 2 Written 67.0 71.1 54.4 67.9 54.6
Wills and Intestacy, Probate Administration and Practice Int/AttNote 5 Oral 63.3 66.7 52.9 63.7 58.2
CMA 2 Written 69.1 73.9 54.4 60.9 57.6
L Drafting 2 Written 61.8 64.2 54.6 62.0 59.8
L Research 3 Written 65.5 69.8 51.7 66.2 53.6
L Writing 3 Written 70.7 74.8 57.9 71.4 60.8

Comparing performance across the skills to the previous year, candidates have consistently performed better in Advocacy, Case and Matter Analysis and Legal Research, with lower scores for Legal Drafting and Legal Writing. Candidates performed less well in the Interview and Attendance Note skill this year compared to last year.

Across the Practice Areas, candidates have consistently performed better in Dispute Resolution and less well in Business. Candidates have performed less well in Wills and Intestacy this year compared to last year.

i) Ethics Performance in SQE1

Table 17 shows the mean scores for all candidates along with passing and failing candidates for Ethics and non-Ethics focussed assessment content across FLK1 and FLK2.

In each case, there is a strong positive correlation between performance on Ethics questions and the overall score. Naturally, those candidates who have passed the assessments have significantly higher scores than those who have failed.
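As an illustration of the kind of relationship described here, the sketch below computes a Pearson correlation between Ethics sub-scores and overall scores for simulated candidates. The data and the simple ability model are made up; the actual analysis uses candidates' item-level marks.

```python
# Illustration only: Pearson correlation between Ethics sub-scores and overall
# scores, using simulated candidates (not SQE data).
import numpy as np

rng = np.random.default_rng(1)
ability = rng.normal(60, 12, size=1000)                    # hypothetical candidate ability (%)
ethics_score = ability + rng.normal(0, 8, size=1000)       # Ethics items track ability with noise
non_ethics_score = ability + rng.normal(0, 6, size=1000)   # non-Ethics items likewise
overall = 0.1 * ethics_score + 0.9 * non_ethics_score      # Ethics is a minority of the items

r = np.corrcoef(ethics_score, overall)[0, 1]
print(round(r, 2))   # a strong positive correlation, as described in the text
```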

The mean scores are consistently higher for the Ethics content compared to the non-Ethics content across all assessments and groups. For all candidates, the mean differences range between 10.6% (FLK1 January 2024) and 16.9% (FLK1 July 2024). With the current data, there appears to be no pattern in mean differences between Ethics and non-Ethics content across the FLK1 and FLK2 assessments.

The mean differences between the passing and failing candidates are slightly higher for the non-Ethics content (range 20.8% to 23.3%) than for the Ethics content (range 17.5% to 19.8%), indicating that the knowledge deficit of the weaker candidates is greater in the non-Ethics content than in the Ethics content.

Table 17: FLK1 and FLK2 mean scores for Ethics and non-Ethics content

All Candidates Passing Candidates Failing Candidates
Assessment Ethics Items Non-Ethics Items Ethics Items Non-Ethics Items Ethics Items Non-Ethics Items
FLK1 Jan 2024 68.6 58.0 75.6 65.8 56.7 44.5
FLK2 Jan 2024 67.5 55.1 75.3 64.3 55.5 41.0
FLK1 July 2024 71.7 54.8 79.6 64.2 62.1 43.4
FLK2 July 2024 66.7 52.8 75.8 64.0 57.5 41.5

ii) Professional Conduct Performance in SQE2

Professional conduct and ethics are examined pervasively in the SQE2 assessments. The SQE2 assessments test a range of professional conduct issues and behaviours. There is no prescribed number of assessments that include a professional conduct or ethical issue in an SQE2 assessment window. However, professional and ethical behaviours are generally tested in most subject areas for each assessment window, where possible. Candidates must be able to spot the ethical or professional conduct issue and exercise judgement to resolve any issues honestly and with integrity.

There is no separate mark for professional conduct or ethics questions. Professional conduct and ethics are marked within the Legally Comprehensive assessment criteria (ie one of the Law criteria).

Kaplan SQE is the End Point Assessment Organisation for solicitor apprentices in England. Solicitor apprentices are required to pass SQE1 during their apprenticeship – it is a gateway requirement for SQE2 which is the end-point assessment (EPA).

Apprentices must pass SQE1 and have met all of the minimum requirements of their apprenticeship (including the gateway review) before they can attempt SQE2. When an apprentice has passed SQE2, they have completed the EPA for the Solicitor Apprenticeship Standard and passed the SQE.

Information about how solicitor apprentices and their training providers can engage with the SQE is available on our website.

Solicitor apprentices made up a small proportion of overall candidate numbers for each assessment as indicated in Table 18 and accounted for 5% of all candidate assessments in this reporting period.

In the SQE1 assessments, solicitor apprentice pass rates were lower than for the non-apprentices in January 2024, although this difference was not significant; in July 2024 the apprentice pass rates were significantly higher. For SQE2, solicitor apprentice pass rates were significantly higher than those for non-apprentice candidates for four of the five SQE2 assessments (all except April 2024). There were notably higher pass rates in July 2023 and October 2023, when apprentice pass rates were 100% (compared to 78% and 62% respectively for non-apprentice candidates). The higher pass rates for SQE2 are consistent with previous years and continue to evidence solicitor apprentices' preparedness for the end point assessment.

Table 18: Pass rates for solicitor apprentices and non-solicitor apprentices
Assessment Proportion* Group Pass Rate Pass Rate 1st Attempt Pass Rate 2nd/3rd Attempt
FLK1 Jan 2024 4% Apprentice 59 64 39
Non-apprentice 64 66 35
FLK2 Jan 2024 4% Apprentice 60 66 38
Non-apprentice 61 63 46
SQE1 Jan 2024 4% Apprentice 49 56 10
Non-apprentice 56 59 17
FLK1 Jul 2024 5% Apprentice 62 65 54
Non-apprentice 54 57 45
FLK2 Jul 2024 4% Apprentice 61 66 49
Non-apprentice 50 51 44
SQE1 Jul 2024 4% Apprentice 53 59 28
Non-apprentice 43 48 20

*Proportion of apprentice candidates

Table 19: SQE2 Pass rates for solicitor apprentices and non-solicitor apprentices
Assessment Proportion* Group Pass Rate Pass Rate 1st Attempt Pass Rate 2nd/3rd Attempt
SQE2 Jul 2023 3% Apprentice 100 100 -
Non-apprentice 78 82 23
SQE2 Oct 2023 5% Apprentice 100 100 -
Non-apprentice 62 67 34
SQE2 Jan 2024 11% Apprentice 84 84 50
Non-apprentice 72 75 27
SQE2 Apr 2024 3% Apprentice 83 83 100
Non-apprentice 79 81 33
SQE2 Jul 2024 10% Apprentice 86 87 50
Non-apprentice 73 76 39

*Proportion of apprentice candidates

There were 681 solicitor apprentices who took one or more of the assessments in this reporting period. Of these:

  • 93 passed both SQE1 and SQE2 this year.
  • 195 passed SQE2 this year after passing SQE1 in a previous year.
  • 55 passed SQE1 this year and are awaiting SQE2 results.
  • 130 passed SQE1 this year and are yet to sit SQE2.
  • 41 have failed SQE2 this year and are yet to resit.
  • 47 have failed either FLK1 or FLK2 and are yet to resit.
  • 120 have failed both FLK1 and FLK2 and are yet to resit.

Our approach to developing assessments is to anticipate candidate requests for reasonable adjustments and, where possible, to make assessment arrangements that minimise the need for individual adjustments. How we consider reasonable adjustments, including how we communicate with candidates and the arrangements we most frequently make, is set out in the Reasonable Adjustments Policy.

We are committed to making sure that a candidate is not disadvantaged by reason of a disability when demonstrating their competence. We will make reasonable adjustments where a candidate who is disabled within the meaning of the Equality Act 2010 would be at a substantial disadvantage in comparison with someone who is not disabled, and we will take reasonable steps to remove that disadvantage.

We will also consider making accommodations where a candidate has a condition or conditions that impact their ability to undertake the SQE. All such requests for accommodations are considered at Kaplan's reasonable discretion and on a case-by-case basis.

During the course of the year, we implemented 1,554 reasonable adjustment plans.

The average time between receiving a completed application for reasonable adjustments (with full accompanying evidence) and proposing an adjustment plan to the candidate was between three and six days for each assessment window. This was shorter than the average number of days taken last year.

For some candidates who require complex reasonable adjustment plans, considerable time can be needed to finalise their plan for the assessments and to ensure comprehensive support is arranged. Our standard guidance to candidates continues to be to obtain their supporting evidence and make their reasonable adjustment request as early as possible.

We strongly encourage candidates to submit their request form at the earliest opportunity, even if supporting documentation is not fully available at that point. Applying early provides time for Kaplan and the candidate to finalise an adjustment plan.

Table 20: Proportion of candidates with a reasonable adjustment plan
Assessment Date % of candidates with a reasonable adjustment plan
SQE1 Overall Jan 2024 8%
SQE1 Overall Jul 2024 7%
SQE2 Overall Jul 2023 9%
SQE2 Overall Oct 2023 10%
SQE2 Overall Jan 2024 9%
SQE2 Overall Apr 2024 11%
SQE2 Overall Jul 2024 11%

Table 21: Proportion of candidates with reasonable adjustment (RA) plans who have one or multiple conditions or disabilities
Assessment Date % of candidates with RA plans who have one condition/disability % of candidates with RA plans who have multiple conditions/disabilities
SQE1 Overall Jan 2024 73% 27%
SQE1 Overall Jul 2024 71% 29%
SQE2 Overall Jul 2023 78% 22%
SQE2 Overall Oct 2023 78% 22%
SQE2 Overall Jan 2024 80% 20%
SQE2 Overall Apr 2024 77% 23%
SQE2 Overall Jul 2024 68% 32%

i) Pass rates for candidates with reasonable adjustment plans

Table 22 shows pass rates for candidates with reasonable adjustment plans alongside pass rates for the full cohort for FLK1, FLK2 and SQE2 assessments during the reporting period.

A chi-square significance test was used to determine whether there were differences (at a 95% confidence level) between the pass rates for candidates with and without a reasonable adjustment plan. This is indicated in the final column as 'Yes' (significant difference) or 'No' (no significant difference).
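For illustration only, the sketch below shows how a test of this kind can be run in Python using scipy.stats.chi2_contingency. The pass and fail counts are hypothetical and are not taken from the SQE data; the 0.05 threshold corresponds to the 95% confidence level described above.

# Illustrative sketch of the significance test described above.
# The counts are hypothetical and do not come from the SQE data.
from scipy.stats import chi2_contingency

ra_candidates = [120, 80]         # [passed, failed] for candidates with an RA plan (hypothetical)
non_ra_candidates = [1500, 1300]  # [passed, failed] for candidates without an RA plan (hypothetical)

# A 2x2 chi-square test of independence on the pass/fail counts
chi2, p_value, dof, expected = chi2_contingency([ra_candidates, non_ra_candidates])

# At a 95% confidence level, p < 0.05 would be reported as a significant
# difference ('Yes' in Table 22); otherwise 'No'.
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
print("Yes" if p_value < 0.05 else "No")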

For FLK1 and FLK2 in July 2024, candidates with a reasonable adjustment plan in place achieved significantly higher pass rates. For SQE2, the pass rates for candidates with a reasonable adjustment plan were significantly higher in April 2024 and July 2024.

Whilst there is no consistent pattern to suggest that candidates with reasonable adjustments achieved higher or lower pass rates than those without, this will continue to be monitored to see if the more recent findings indicate a change.

Table 22: Comparison of pass rates overall and for candidates with a reasonable adjustment (RA) plan
Assessment Date Overall Pass Rate RA Pass Rate Significant Differences*
FLK1 Jan 2024 63% 68% No
FLK2 Jan 2024 61% 63% No
FLK1 Jul 2024 55% 60% Yes
FLK2 Jul 2024 50% 55% Yes
SQE2 Overall Jul 2023 79% 84% No
SQE2 Overall Oct 2023 64% 70% No
SQE2 Overall Jan 2024 73% 73% No
SQE2 Overall Apr 2024 79% 85% Yes
SQE2 Overall Jul 2024 74% 84% Yes

*Significant where any differences between the groups are unlikely to be due to chance

ii) Nature of conditions and adjustments made

Plans were in place for candidates with a wide range of disabilities and long-term or fluctuating conditions. Accommodations were also agreed for some candidates who were pregnant or nursing mothers.

The most prevalent conditions amongst candidates with reasonable adjustment plans were associated with neurodiversity including dyslexia, autism, dyspraxia and ADHD. This was also the case during the period covered by the previous SQE Annual Report (2022/23).

The adjustments were specific to the assessments concerned and, in some cases, different arrangements were required for the SQE2 written and oral assessments.

The most common adjustments were as follows, with similar patterns seen across SQE1 and SQE2:

  1. Up to 25% extra time.
  2. Stop the clock (STC): a prescribed allowance of break time. Candidates can take as many breaks as they wish and decide the length of each break, up to the agreed STC allowance. This adjustment can be made alongside extra time.
  3. Use of a laptop to allow typing instead of handwriting attendance notes, to meet specified font size or style requirements, and to integrate certain assistive technologies.
  4. Own assessment room.

Other bespoke provisions were also arranged for candidates where evidence supported this. Examples of these included access to medical devices and use of a screen overlay.

After each assessment, candidates are invited to complete a survey to provide feedback about their experience. The questions in the survey (Appendix 2) relate to:

  • The SQE website
  • Operations
  • Assessment specification and questions
  • Reasonable adjustments
  • Apprentices
  • The overall SQE service.

Candidates can provide general comments via free-text boxes. They can also provide their contact details should they wish to be contacted further about their feedback.

These surveys continue to provide valuable information for Kaplan and the SRA to consider. All responses to the candidate survey are collated and analysed, with action plans put in place where improvements can be made, or new opportunities and solutions can be explored.

At the beginning of the year, the areas of most concern for dissatisfied candidates were:

  • Booking process
  • Guidance provided about the assessment
  • SQE2 reasonable adjustments processes (where there are more testing days and more complicated test arrangements than for SQE1).

By the end of the year, satisfaction with the booking process had increased. Satisfaction with the guidance provided about the assessment rose for SQE1 and remained around the same for SQE2.

Unfortunately, satisfaction with the SQE2 reasonable adjustments processes decreased. The application form has since been extensively reviewed and will be updated online in Q1 of 2025 to improve the efficiency of the application process and reduce duplication. Changes are also being made so that agreed adjustments can be carried forward to future assessments where they are unlikely to change over time.

The strongest performing areas in the survey were:

  • The clarity of instructions in relation to the assessment questions/tasks
  • Satisfaction with SQE1 reasonable adjustments
  • The efficiency of administration at assessment centres

Satisfaction with the oral centres remained strong, while the administration at Pearson VUE centres notably went from being the weakest performing area in 2022/23 to the strongest in the SQE1 July 2024 survey.

Although the survey invites candidates to give feedback about the SQE in Welsh, no candidates opted to take the SQE2 assessment in Welsh in 2023/24. In 2024/25 the option to take the SQE1 in Welsh will be available for the first time. Candidates can sit a Welsh version of the SQE1 January, SQE2 April, SQE1 July and SQE2 October assessments.

Feedback from candidates and stakeholders has also been collected, reviewed and considered from various other sources. This includes input from the SRA, its Psychometrician and the SQE Independent Reviewer about the overall delivery of the SQE assessments and their oversight of any issues that arise.

Table 23 summarises some of the key areas brought to our attention for improvement, and what is being done in response.

Table 23: Key areas of feedback and responses
Feedback said… Our response has been…
Booking: Candidates were dissatisfied with the availability and distribution of seats during SQE booking windows. Redesigned the booking process to include seat reservation and allocation of seats prior to payment. This eliminated long website queues and improved candidate experience and satisfaction.
Demand: London continued to be a high-demand location for candidates on all parts of the SQE. Introduced more London-based centres in the Pearson VUE test centre network to better meet demand. From SQE2 October 2024 onward, the oral assessment can also be delivered at a third London location.
SQE1 and SQE2: Candidates wanted more sample questions. Released a short video to help candidates understand the anatomy and structure of SQE1 questions. 40 more SQE1 sample questions were published in November 2024. A sample recording of an SQE2 candidate performance replaced the previous written version of a sample candidate submission, providing a more realistic illustration of a real-life advocacy assessment. Three further sample videos were published in December 2024.
SQE1 and SQE2: The quality of test centres and reliability of their technology differed from centre to centre. Improved consistency and readiness at centres. Candidate satisfaction scores relating to the adequacy of administration at Pearson VUE test centres have improved significantly (up from the mid-60% range to the mid-80% range at the SQE1 July 2024 survey).
SQE2 written: Candidates experienced issues or challenges with technology during SQE2 assessments, including overwriting. Technical issues were reviewed to inform continuous improvement. This included eliminating issues that impacted typing in the SQE2 essay box and providing candidates with specific on-screen instructions about the use of the 'insert' function to prevent overwriting.
Reasonable Adjustments: There is an insufficient range of adjustments available, including assistive technologies. Increased the range of time adjustments to match candidates' needs, including scheduling an FLK assessment over more than one day and SQE2 over non-consecutive days. Delivered remote-proctored assessments, and assessments with a palantypist, with FUSION and in Braille, for the first time this year.
Reasonable Adjustments: Candidates wanted clearer information about how to apply for an RA and what to expect Candidate emails and communications were updated to provide clearer information and reminders about what to expect at each stage of the process. Changes to the RA application form and supporting information are planned for Q1 of 2025.
Training Providers: Training providers wanted data to help them with their course development. Since SQE1 January 2024 we have shared anonymised candidate performance data for each practice area. Similar reporting for the SQE2 assessments will begin in early 2025.
Mitigating Circumstances: Candidates wanted information about how to apply for mitigating circumstances and what they can expect, presented in an alternative format. In July 2024, the Mitigating Circumstances Policy was updated to allow an 'Assessment Delivery Failure' to be declared by Kaplan SQE on the day if a failure prevents delivery of the assessment. Candidates have better visibility of the options available to them, and Kaplan can more smoothly manage candidates' progression to their next exam.
SQE candidate assessment experience: More information was needed on the candidate assessment experience, including how candidates prepare for the assessment. We published three candidate case studies in 2024, written by candidates and anonymised.

Unfortunately, during the reporting period, the results for the January 2024 SQE1 assessments had to be reissued by Kaplan to correct an error. As a result, 175 candidates who were originally told that they had failed FLK1 and/or FLK2 (the two parts of SQE1) had, in fact, passed those assessments. The error did not affect the overall pass/fail outcomes for the other 6,451 candidates taking the assessment.

The incorrect results did not arise from a marking error; rather, the results calculation process did not round scores at the point set out in the published Marking and Standard Setting policy. There were no errors in the marking or in the calculation of correct marks, and no candidate's number of correct answers changed. The error was unique to the January 2024 results.
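For illustration only, the sketch below uses hypothetical numbers to show how the point at which a score is rounded can change a borderline pass/fail outcome. The pass mark, the candidate score and both functions are invented for this example and do not represent the actual SQE results calculation.

# Purely illustrative: how rounding at a different point in a results
# calculation can flip a borderline outcome. All numbers are hypothetical.
PASS_MARK = 57  # hypothetical pass mark on the reported scale

def outcome_rounding_before_comparison(scaled_score: float) -> str:
    # Round to the nearest whole mark first, then compare with the pass mark.
    return "Pass" if round(scaled_score) >= PASS_MARK else "Fail"

def outcome_rounding_after_comparison(scaled_score: float) -> str:
    # Compare the unrounded score with the pass mark directly.
    return "Pass" if scaled_score >= PASS_MARK else "Fail"

borderline = 56.6  # hypothetical candidate score just below the pass mark
print(outcome_rounding_before_comparison(borderline))  # Pass (56.6 rounds to 57)
print(outcome_rounding_after_comparison(borderline))   # Fail (56.6 < 57)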

Kaplan's initial internal investigations determined the root cause and made recommendations to prevent future issues. One recommendation was to appoint an external, expert, independent reviewer to check the recalculation of the reissued results, determine the factors contributing to the error and identify actions to prevent recurrence.

Kaplan appointed an external, expert reviewer, Anne Pinot de Moira, a Chartered Statistician with more than 25 years of experience in assessment research. The review had two parts: the first checked, and confirmed as correct, the recalculation of the reissued results, while the second examined the contributing factors.

Kaplan published the outcome of the first part of the review on 15 May 2024 (one month after the results were reissued to candidates). The review concluded that the reissued results were accurate, including individual marks, quintiles and overall pass/fail outcomes.

The second part of the review, which examined contributing factors, concluded in July 2024. To address the contributing factors and prevent recurrence, the external reviewer made recommendations to improve:

  • Change management
  • The effectiveness and alignment of policies and processes
  • The clarity of roles and responsibilities
  • Communication between teams

In response to the recommendations, Kaplan created an action plan to systematically address the issues identified and improve the overall quality and confidence in the outcomes of the assessments.

The action plan extends the learnings from this review to the wider business practices to prevent future errors and maintain the integrity of the qualifications provided. Progress on delivering the action plan is being reported regularly to the SRA.

Candidate online monitoring and maximising diversity survey (demographic data survey) questions
Table Category Full Question in the Survey
6 Ethnicity What is your ethnic group?
7 Disability Do you consider yourself to have a disability according to the definition in the Equality Act 2010?
8 Age What age category are you in?
9 Sex What is your sex?
Gender same as sex registered at birth Is your gender the same as the sex you were registered at birth?
Sexual orientation What is your sexual orientation?
10 Religion/belief What is your religion or belief?
11 Parents attended university Did either of your parents attend university by the time you were 18?
Occupation of main household earner What was the occupation of your main household earner when you were aged about 14?
Type of school attended Which type of school did you attend for the most time between the ages of 11 and 16?
12 Highest level of education What is your highest level of education?
Undergraduate degree classification What was your undergraduate degree classification?
Qualifying work experience undertaken Have you undertaken any qualifying work experience?
Qualified lawyer status If you are a qualified lawyer, please state the country in which you achieved your law qualification(s).
13 First language Please state what your first language is.
Post-assessment candidate survey questions
Question
The information on the SQE website about the assessment was helpful
There was clear guidance provided on the SQE Assessment
It was a simple process to book my assessment
The administration on the day was efficient (Written)
The instructions in relation to the assessment tasks were sufficiently clear
The administration on the day was efficient (Oral)
The instructions in relation to the assessment tasks were sufficiently clear (Oral)
It was a simple process to request a reasonable adjustment
The information about how to request a reasonable adjustment was clear
The reasonable adjustment received on the day matched the reasonable adjustment plan
How would you rate your overall satisfaction with the SQE assessment service provided by Kaplan SQE?

For each question, candidates are asked to say whether they are very satisfied, satisfied, neither, unsatisfied or very unsatisfied.
