The SQE is a rigorous assessment designed to assure consistent, high standards for all qualifying solicitors, consisting of two parts:

SQE1 (FLK1 and FLK2 assessments)
SQE2

This 2023/24 annual report of the Solicitors Qualifying Examination (SQE) contains data about more than 14,600 individual candidates who took part in the SQE between July 2023 and July 2024. It covers seven SQE assessment windows.
The outcomes of the SQE2 July 2023 assessment window are included in this annual report because they were not available for the 2022/23 annual report. Future reports will include assessments delivered in the annual reporting period ie SQE2 October through to SQE2 July.
Guided by the outcomes of the recent report published by the University of Exeter, 'Potential Causes of Differential Outcomes by Ethnicity in Legal Professions', Kaplan will work with the SRA to explore how to supplement and improve the data collected from candidates so that we can gain more detailed and reliable insights about the factors or variables contributing to differential outcomes on the SQE assessments.
Assessment reliability
Pass rates
Resits
Candidate trends
Apprentices
This SQE annual report provides a cumulative picture of the outcomes from the assessments that took place in the reporting period (July 2023 - July 2024).
Statistics and commentary are provided on the overall performance of candidates at the individual assessment level to enable comparisons over time and identify any emerging trends. Assessment data is provided, where applicable, at the cumulative level.
Seven assessment windows are covered in this report with two for SQE1 and five for SQE2 as follows:
When preparing this report, the results for the October 2024 SQE2 assessment window deliveries were not available. We have included some provisional data on the number of candidates who attended this assessment where relevant.
The SQE is a single rigorous assessment designed to assure qualifying solicitors have been assessed to a consistent, high standard. The SRA's Statement of Solicitor Competence sets out what solicitors need to be able to do to perform the role effectively, and provides everyone with a clear indication of what to expect from a solicitor. This is what the SQE tests.
The SRA has appointed Kaplan SQE (Kaplan) as the approved assessment provider for the delivery of the SQE assessments and other related services. Since the SQE was launched in 2021, over 8,900 candidates have completed the SQE.
In this reporting period, SQE assessments were delivered to more than 14,600 individual candidates in 37 countries.
SQE1 consists of two 180-question multiple choice (single best answer) assessments (FLK1 and FLK2) in the following subject areas:
These are delivered electronically in controlled and invigilated exam conditions at Pearson VUE test centres across the UK and internationally.
Each FLK assessment was run across five consecutive days, with each candidate taking the assessment on one of the five days. The FLK1 and FLK2 assessments took place in consecutive weeks within each assessment window.
Each FLK assessment is split into two sessions of 2 hours 33 minutes, with 90 questions in each session. There is a 60-minute break between the sessions. Different assessment forms (papers) were allocated at random to candidates throughout each five-day assessment, with each form having a separate pass mark.
In order to pass SQE1, a candidate must pass both FLK1 and FLK2 assessments. Candidates who fail their first attempt have two further opportunities to take the assessment(s) they failed (FLK1 and/or FLK2). More information can be found in the SQE1 Assessment Specification.
SQE2 comprises 16 stations (12 written stations and four oral stations) that assess both skills and application of legal knowledge.
The stations in SQE2 cover six legal skills:
This is across five practice areas:
SQE2 written assessments take place in Pearson VUE test centres over three consecutive half days and all candidates take the same written stations on the same date.
SQE2 oral assessments take place over two consecutive half days. During the reporting period, oral assessments took place in centres in Birmingham, Cardiff, Manchester and London. The logistics involved in running the oral assessments mean that not all candidates in a cohort can take the same oral stations on the same day, so multiple “sittings” are used for SQE2 oral stations. To protect the integrity of the assessments and to ensure equity, different tasks are set for the oral stations used at the different sittings. However, the same skills and practice areas are covered in all sittings of an assessment window.
SQE2 has a single pass mark for the whole assessment, covering all 16 stations. There may be slightly different pass marks between the SQE2 sittings to account for differences in the difficulty of the different oral station tasks, as described above.
More information can be found in the SQE2 Assessment Specification.
Exemptions from the SQE assessments are only available to qualified lawyers. Whilst exemptions are available for SQE1 (FLK1 and/or FLK2) and SQE2 assessments, SQE1 exemptions are rarely given.
There are some candidates who meet the SRA transitional arrangements and are using SQE2 to qualify as a solicitor. They are not required to take SQE1.
To summarise, there are three types of candidates:
The data provided in this report relate to candidates who received a mark for any of the assessments. Candidates whose attempts were discounted due to mitigating circumstances are not included. Outcome data is provided separately for FLK1 and FLK2 assessments and overall for SQE1. Outcome data is also provided for SQE2.
In this reporting period, a total of 14,625 individual candidates received a mark for one or more of the SQE assessments. Table 1 below provides the number of candidates for each assessment, along with the numbers and proportions of candidates by attempt number, where applicable.
Candidates are allowed up to three attempts for each assessment within a six-year period. At the time of writing this report, there had been six opportunities to sit SQE1 and nine opportunities to sit SQE2. A small number of candidates had made a third attempt at the assessments.
At their first SQE1 attempt, candidates are required to sit both FLK1 and FLK2 in the same assessment window. If they fail one, they only need to resit that one assessment. Any passes can be carried forward and used within a six-year period.
Because of this, and owing to mitigating circumstances that are applied separately for FLK1 and FLK2, the number of candidates may differ across FLK1, FLK2 and SQE1 overall.
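The attempt rules above can be sketched as a simple eligibility check. This is illustrative only: the report does not spell out exactly when the six-year window starts, so the window logic and the function name here are assumptions.

```python
from datetime import date

def can_book_resit(marked_attempts, today):
    """Illustrative check of the 'up to three attempts within six years'
    rule for a single assessment (FLK1, FLK2 or SQE2).

    marked_attempts: dates of attempts that received a mark. Attempts
    discounted for mitigating circumstances are excluded, so they should
    not appear in this list. Treating the window as a rolling six years
    from today is an assumption for this sketch.
    """
    recent = [d for d in marked_attempts if (today - d).days <= 6 * 365]
    return len(recent) < 3  # a third failed attempt exhausts the allowance

# Two marked attempts so far: one further attempt remains available.
print(can_book_resit([date(2023, 1, 16), date(2024, 7, 22)], date(2024, 11, 1)))  # True
```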
*Data provided for candidates who sat both FLK1 and FLK2 in the assessment window
**Where candidates sat both FLK1 and FLK2 in the same assessment window but with different attempt numbers (eg due to a previous discounted attempt)
Although the SQE2 assessment in October 2024 had taken place when this report was written, the marks had not been released to candidates. There were 1,044 candidates who sat SQE2 in October 2024. These break down approximately as:
These numbers will change if there are successful mitigating circumstance claims with attempts being discounted.
Statistical reports are published after results are released. Data from the October 2024 assessment will be included in the 2024/25 annual report.
Table 2 provides the pass marks for each assessment, the average score (Mean) and standard deviation (SD), and Table 3 provides measures of test reliability (Cronbach's alpha and standard error of measurement (SEm)).
With the move to using scaled scores for the SQE1 assessments, alongside the introduction of multiple testing days, we deployed more than one form of the assessment for each assessment window. To achieve accurate and fair comparisons between test takers, the pass marks are scaled to 300 (on a scale of 0 to 500, where 0 equates to 0% and 500 equates to 100%). The quality statistics are provided as an average of the values for the multiple assessment forms for each assessment window.
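The anchor points described above (0 maps to 0%, each form's pass mark maps to 300, and 500 maps to 100%) imply a piecewise-linear mapping. The report does not state the exact equating method used, so the following is a minimal sketch consistent only with those anchor points; the pass mark value is illustrative.

```python
def scale_score(raw_pct: float, pass_mark_pct: float) -> float:
    """Map a raw percentage score to the 0-500 reporting scale, where the
    form's pass mark maps to 300. A sketch of one possible mapping, not
    the provider's actual equating procedure."""
    if raw_pct <= pass_mark_pct:
        # Below the pass mark: 0% -> 0, pass mark -> 300
        return 300 * raw_pct / pass_mark_pct
    # Above the pass mark: pass mark -> 300, 100% -> 500
    return 300 + 200 * (raw_pct - pass_mark_pct) / (100 - pass_mark_pct)

# For a hypothetical form with a 56% pass mark:
print(scale_score(56, 56))   # 300.0 (exactly at the passing standard)
print(scale_score(0, 56))    # 0.0
print(scale_score(100, 56))  # 500.0
```

Because each form has its own pass mark, the same raw percentage can map to different scaled scores on different forms; this is what makes scaled scores comparable across forms.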
Cronbach’s Alpha (α) is a measure of test reliability that estimates the internal consistency, or how closely related the sets of items are in a test. It therefore tells us how well items (questions) work together as a set. A high α coefficient suggests that candidates tend to respond in similar ways from one item to the next. Values for α range from 0 (where there is no correlation between items) to 1 (where all items correlate perfectly with one another). The widely accepted gold-standard α for high-stakes assessments is 0.8.
In all SQE1 assessments to date, α has been greater than 0.9 and above 0.8 for SQE2, suggesting very good internal consistency and high reliability for the SQE1 and SQE2 assessments.
The SQE assessments provide an observed/obtained score for a candidate at a snapshot in time (the assessment). If the candidate were to sit the same assessment on another occasion, they may achieve a different score owing to various factors. Some of these can be controlled to an extent, such as the training provided, and some cannot, such as the amount of sleep the candidate got the night before the assessment.
A candidate's theoretical “true” score can only be estimated by their observed score, but there is an inevitable degree of error around each candidate's observed score, which is consistent with most assessments.
The standard error of measurement (SEm) for an assessment is an estimate of how repeated measures of the same group of candidates on the same assessment would be distributed around their theoretical “true” scores. The SEm is a function of the reliability of the assessment (α) and the standard deviation in scores on the assessment. Generally, the higher the reliability, the lower the standard error of measurement and vice-versa.
For all SQE assessments to date, the SEm has been below 4%, which provides confidence that observed scores generally represent a very good approximation of true scores in these assessments.
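The two statistics described above can be computed directly from their standard definitions: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), and SEm = SD * sqrt(1 - alpha). The toy data below is invented for illustration.

```python
from statistics import pvariance, pstdev

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a score matrix: one row per candidate,
    one column per item (question)."""
    k = len(item_scores[0])
    items = list(zip(*item_scores))                     # per-item columns
    item_var = sum(pvariance(col) for col in items)     # sum of item variances
    total_var = pvariance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

def sem(alpha, sd):
    """Standard error of measurement: higher reliability -> lower SEm."""
    return sd * (1 - alpha) ** 0.5

# Toy data: 4 candidates x 3 items (1 = correct, 0 = incorrect)
data = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
a = cronbach_alpha(data)
totals = [sum(row) for row in data]
print(round(a, 3), round(sem(a, pstdev(totals)), 3))  # 0.75 0.559
```

Note the inverse relationship stated above: with alpha above 0.9, the SEm term sqrt(1 - alpha) falls below 0.32 of the score SD.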
*Average values (across the multiple forms) are provided for the FLK1 and FLK2 assessments, with scaled values provided where 0 equates to 0%, 300 equates to the pass mark, and 500 equates to 100%.
Note - for SQE2 there were multiple sittings for the oral assessments (see introduction)
Tables 4a to 4c below summarise the journey so far for the candidates who have received assessment marks in this reporting period. These are provided separately for SQE1 (candidate outcomes) and SQE2 (candidate routes and outcomes). The candidates shown in table 4b include transitional candidates not required to take SQE1.
*Reported for the second SQE1 window where a candidate has sat in both assessment windows in this period.
Of the 1,477 candidates who passed SQE1 in January 2024 and went on to sit SQE2 in April 2024, 1,324 (90%) passed and 153 (10%) failed. For those who sat SQE2 in July 2024, 535 (85%) passed and 95 (15%) failed.
Of the 2,280 candidates who failed FLK1 and/or FLK2 in January 2024, 620 (27%) passed SQE1 in July 2024 (included within the Passed rows of the table above).
Of those who attempted just FLK1 or FLK2 in either January 2024 or July 2024, 913 (69%) passed their remaining assessments (included within the Passed rows of the table above).
Candidates who fail their first or second attempts may benefit from reviewing the information contained later in this report relating to candidate performance in different practice areas.
*Reported for the first SQE2 window where a candidate has sat in more than one assessment window in this period.
Of the 1,006 candidates who did not sit SQE1 and sat SQE2 between July 2023 and July 2024, 460 (46%) passed. This compares to the higher rate of 85% for those taking SQE2 after passing SQE1.
*Reported for the latest SQE2 window where a candidate has sat in more than one assessment window in this period.
Of the 4,262 candidates who passed SQE2 in this reporting period:
Of the 1,228 candidates who did not pass SQE2 in this reporting period:
Table 5 shows the candidate pass rates (and number passing) for each assessment for all candidates and by attempt number, and for SQE2 by whether SQE1 had been previously sat.
Of the two SQE1 assessments, the pass rates for FLK1 have been higher than for FLK2 (+2% and +5% for January 2024 and July 2024 respectively). We will continue to monitor this across future assessments.
The overall pass rate for SQE1 was higher in January 2024 (56%) than in July 2024 (44%).
Pass rates in both January and July 2024 were lower for resitting candidates when compared to first attempt candidates. This was true for both FLK1 and FLK2.
The pass rates for resitting candidates were higher in July than in January for FLK1 (increasing from 35% to 45%) but were similar for FLK2 (45% and 44% respectively for January and July).
The proportion of candidates resitting increased between the January and July 2024 assessments for both FLK1 and FLK2. This rose from 10% to 21% for FLK1 and from 13% to 24% for FLK2.
The lower pass rates for resitting candidates might indicate that they should consider taking more time (and/or putting in more work or training) between sittings. This may help them improve from a failing to a passing standard.
The SQE2 pass rates are higher than for SQE1. This is expected given the SQE2 eligibility requirement to have qualification-level functioning legal knowledge (ie have passed SQE1 or are a transitional candidate).
The proportion of candidates taking SQE2 who were not required to sit the SQE1 has been lower in the more recent SQE2 assessments (eg 51% of all candidates in October 2023 vs 15% in July 2024). For those qualifying under the Qualified Lawyers Transfer Scheme, the SQE2 in October 2023 was the last opportunity for these candidates to sit the assessment. However, transitional arrangements continue for candidates with the Legal Practice Course.
Candidates who sat SQE1 performed better than those who had not, with pass rates ranging between 79% and 90%, compared to between 36% and 47% for those who had not sat SQE1.
The proportions of resitting candidates remain small across the assessments (6%), and their pass rates are significantly lower than for first attempt candidates.
The pass rates for second attempt candidates range between 27% and 38%, compared to a range of 69% to 82% for first attempt candidates. Again, careful consideration should be given to when to resit to allow sufficient time to improve to a passing standard.
*Sat both assessments but with a different attempt number for each
**not reportable as less than 10 candidates in the attempt group
The SRA collects diversity and socio-economic data to help understand how candidates with different characteristics and backgrounds perform in the assessments. The categories are consistent with data collected by the Office for National Statistics (ONS) and the Social Mobility Commission.
Data is collected from candidates via an online monitoring and maximising diversity survey (or demographic data survey), completed or updated ahead of assessment registration. Appendix 1 lists the data reported on in this section.
The large number of characteristics and groups within characteristics recorded in the data collected means that candidate numbers in some of the groups are small.
In the tables below, we present univariate analysis of the outcomes data, which looks at each of the characteristics individually and independently. Tables 6 to 13 provide the following for FLK1, FLK2, SQE1 and SQE2 for each of the 15 characteristics presented:
Data in the tables exclude candidates who select 'Prefer not to say' - the following provides some proportions of candidates who selected this response:
Compared to the previous year, the proportion of 'Prefer not to say' responses had reduced, which contributes positively to producing more meaningful statistics.
These tables provide data for all candidates who received marks for any assessment (FLK1, FLK2, SQE1 and SQE2). Where a candidate has more than one attempt within any one assessment type in this reporting period, the latest attempt data has been used.
The data is pooled from the January 2024 and July 2024 assessments for FLK1, FLK2, and SQE1, and from the July 2023, October 2023, January 2024, April 2024 and July 2024 assessments for SQE2. Where there are fewer than 10 candidates in any group the proportions and pass rates are not reported; this is indicated by greyed out cells in the table.
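The small-group suppression rule described above (no proportions or pass rates reported where a group has fewer than 10 candidates) can be sketched as follows; the function and group names are illustrative, not from the report.

```python
def suppress_small_groups(groups, min_n=10):
    """Apply the reporting threshold used in Tables 6-13: groups with
    fewer than min_n candidates get no reported pass rate (shown as a
    greyed-out cell in the report, None here).

    groups: dict of group name -> (n_candidates, n_passed)."""
    out = {}
    for name, (n, n_pass) in groups.items():
        if n < min_n:
            out[name] = (n, None)                 # suppressed
        else:
            out[name] = (n, round(100 * n_pass / n))  # pass rate, %
    return out

# Invented counts for illustration only
groups = {"Group A": (120, 66), "Group B": (8, 6)}
print(suppress_small_groups(groups))
# {'Group A': (120, 55), 'Group B': (8, None)}
```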
The full questions asked in the online demographic data survey in relation to each category are available in Appendix 1. This report now includes data about candidates' first language (Table 13)
Overall, the findings for where there are pass rate differences between groups are largely similar to last year.
Candidates who reported being in White or Mixed/Multiple ethnic groups achieved higher pass rates than those who reported being in Asian/Asian British or Black/Black British ethnic groups. Differences in pass rates between groups were significant for all assessments.
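Statements of significance like the one above (and the chi-square test referenced in the table footnotes) compare observed pass/fail counts per group against the counts expected if outcome were independent of group. A minimal sketch of that statistic, using invented counts rather than figures from the report:

```python
def chi_square_pass_rates(counts):
    """Chi-square statistic for independence of outcome (pass/fail)
    and group. counts: dict of group name -> (n_pass, n_fail)."""
    col_pass = sum(p for p, f in counts.values())
    col_fail = sum(f for p, f in counts.values())
    total = col_pass + col_fail
    stat = 0.0
    for p, f in counts.values():
        n = p + f
        for observed, col_total in ((p, col_pass), (f, col_fail)):
            expected = n * col_total / total  # under independence
            stat += (observed - expected) ** 2 / expected
    return stat

# Illustrative counts only (not figures from the report)
counts = {"Group A": (180, 120), "Group B": (120, 180)}
stat = chi_square_pass_rates(counts)
# df = (number of groups - 1); the 5% critical value for df = 1 is 3.841
print(round(stat, 2), stat > 3.841)
```

This is also why groups with fewer than 10 candidates are excluded from the test: small expected counts make the chi-square approximation unreliable.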
Pass rates were similar between candidates who declared a disability and those who did not for FLK1, FLK2 and SQE1 assessments. Candidates who declared a disability achieved higher pass rates for SQE2.
The majority of candidates taking SQE1 and SQE2 were in the younger age groups (under 35 years). More than 40% were in the 16-24 age group, and this group achieved higher pass rates than candidates aged 35 years or above in FLK1, FLK2, SQE1 and SQE2.
There were fewer than 10 candidates in the 65+ age group for all assessments. The proportions and pass rates are therefore not provided for this group.
Candidates who reported their sex as male achieved a higher pass rate than female candidates in SQE1. The opposite was the case in SQE2.
There were no significant differences between candidates whose gender was the same or different to their sex registered at birth.
Candidates who reported that their sexual orientation was Bi, Gay / lesbian or Other achieved higher pass rates than candidates who reported their sexual orientation as Heterosexual / straight in SQE1 assessments. For SQE2, candidates who reported that their sexual orientation was Bi achieved higher pass rates.
There were fewer than 10 candidates selecting 'Other' for sex for all assessments, and for SQE2 there were fewer than 10 candidates selecting 'No' for their gender being the same as the sex registered at birth. The proportions and pass rates are therefore not provided for these groups.
There were differences in pass rates between religion/belief groups reported by candidates in all assessments. For SQE1, candidates reporting Hindu, Muslim or Sikh as their religion had lower pass rates, and those reporting no religion or belief or Jewish had higher pass rates. For SQE2, candidates reporting Muslim as their religion had lower pass rates, and those reporting no religion or belief had higher pass rates.
The most frequently indicated group was no religion or belief (40%-45% of candidates for each assessment).
Candidates who reported the occupation of the main household earner as professional achieved higher pass rates in SQE1 and SQE2.
Pass rates for candidates attending independent or fee-paying schools (non-bursary funded) were higher across all assessments. Candidates who attended school outside the UK achieved significantly lower pass rates in SQE2 than those in the other groups.
Candidates who reported that at least one parent attended university achieved higher pass rates in SQE1 and SQE2.
*Group excluded from the Chi-square test of significance
Candidates with at least an undergraduate degree achieved higher pass rates than those with qualifications below degree level in FLK1 and SQE1, with pass rates similar for FLK2 and SQE2. Candidates with qualifications below degree level are a small group accounting for approximately 1% of all candidates. Whilst an undergraduate degree (or equivalent qualification) is required for admission, it is not a requirement for taking the SQE1 or SQE2 assessments.
Candidates with first class undergraduate degree classifications achieved higher pass rates in all assessments and accounted for approaching a quarter of candidates across the assessments.
Candidates who disclosed they had not undertaken qualifying work experience achieved a higher pass rate in all assessments.
All candidates disclosed whether they were already a qualified lawyer. Those who were not qualified achieved higher pass rates in FLK2, SQE1 and SQE2 than those who were qualified. The difference was much larger for SQE2. For FLK1, the pass rates were similar for those who were not qualified and those who were qualified.
There were fewer than 10 candidates with no formal qualifications for all assessments. The proportions and pass rates are therefore not provided for this group.
*Group excluded from the Chi-square test
English was the first language for the majority of candidates, and this group achieved higher pass rates in all assessments.
The tables in this section show the mean scores by practice area for FLK1 and FLK2 by the following candidate groups:
In FLK1, mean scaled scores across the seven practice areas assessed range from 278 for Dispute Resolution to 362 for Ethics. This suggests candidates find some practice areas more difficult than others.
Four of the practice area mean scaled scores are below 300 (equivalent to the overall passing standard) with Dispute Resolution (278) and Business Law and Practice (279) having the lowest scores. Mean scaled scores were above 300 for Ethics (362), Contract Law (335) and Legal Services (302).
In FLK2, the mean scaled scores appear slightly lower than for FLK1 and range from 265 for Wills and Intestacy to 354 for Ethics. The range in mean scaled scores is similar for both FLK1 and FLK2 (84 vs 89), which again suggests candidates find some areas more difficult than others, with Wills and Intestacy (265) and Property Practice (278) having the lowest scores. Mean scaled scores were above 300 for Ethics (354), Criminal Liability (317) and Land Law (303).
There are similar patterns of performance between the passing and failing candidates - and first, second and third attempt candidates - across the majority of practice areas within each assessment. This suggests both stronger and weaker candidates perform well/less well in the same practice areas.
The differences in mean scores between passing and failing candidates for each of the practice areas range from 85 in FLK1 Ethics to 125 in FLK2 Land Law. The differences are more than 100 for the majority of the practice areas (5 out of 7 for FLK1, 6 out of 7 for FLK2), and less than 100 for Ethics (FLK1 and FLK2) and Legal Services (FLK1).
When comparing performance between attempt numbers, the differences between first attempt and second attempt mean scores range between 8 (FLK2 Ethics) and 36 (FLK1 Tort), with the majority of differences being below 30, and, as expected, higher mean scores for first attempt candidates across all practice areas.
The differences between first and third attempt candidates are smaller than the differences between first and second attempt for all of the FLK1 and FLK2 practice areas. Third attempt candidates scored higher in FLK2 Ethics than the first attempt candidates.
The following plots show the mean scaled scores for the FLK1 and FLK2 practice areas for the passing and failing candidates, with data aggregated across the two assessment windows: pink = failing; green = passing; bars ordered by passing candidate mean scaled scores descending.
In FLK1, the wider difference between passing and failing candidates is evident for Tort and Business Law and Practice, and for FLK2, this is evident for Land Law, Trust Law and Criminal Liability. The shortfall in knowledge of the weaker candidates is therefore greater in these practice areas, suggesting these areas may require more focus/preparation for future attempts at SQE1.
Figure 1: Mean scaled scores by FLK1 practice area (y axis: mean scaled score; x axis: FLK1 practice area; bars split by result).

Figure 2: Mean scaled scores by FLK2 practice area (y axis: mean scaled score; x axis: FLK2 practice area; bars split by result).
Comparing performance across practice areas to the previous year, candidates have consistently performed better in Ethics, Contract Law and Tort (FLK1) and Ethics, Criminal Liability and Land Law (FLK2). Business Law and Practice and Dispute Resolution (FLK1) and Property Practice and Wills and Intestacy (FLK2) continue to be more challenging practice areas for the candidates.
In each SQE2 assessment window, the candidates are assessed in four written legal skills (Writing, Case & Matter Analysis, Research and Drafting) in five practice contexts (Dispute Resolution, Criminal, Property, Probate and Business). The combination of written legal skills and practice contexts will vary between assessment windows, with the exception of Business, which follows the same pattern. However, all candidates sat the same 12 written stations within each assessment window.
Candidates were assessed in two oral legal skills - Interviewing and Advocacy - in four practice contexts: Property and Probate (Interviewing) and Dispute Resolution and Criminal Litigation (Advocacy) within each assessment window. Therefore, in total, the candidates sat four oral legal skills assessments within each assessment window (Property Interviewing, Probate Interviewing, Dispute Resolution Advocacy and Criminal Litigation Advocacy).
Candidates were also required to complete a Written Attendance Note / Legal Analysis for each Interviewing assessment.
In each assessment window, the oral legal skills assessments take place in sittings. The number of oral sittings required depends on the number of candidates taking the assessment and differed across assessment windows:
For more on this, please see the SQE2 Assessment Specification (Organisation and delivery section).
The table in this section shows the mean scores by station (with data aggregated across assessments where there are common stations and aligned to ensure a comparable scale across the assessments) for the following candidate groups:
Looking at the station scores for all candidates, mean scores range between 60.1% and 76.9%. Mean scores above 70% were achieved in five of the 24 different stations, as follows:
At the lower end, mean scores of less than 62% were in:
The Advocacy skills assessed in the oral stations have the overall highest mean score with 75.6%; the Interview and Attendance Note/Legal Analysis skills fall towards the lower end with 64.6%. The mean scores for the written skills assessed range from 64.2% for Legal Drafting to 67.4% for Case and Matter Analysis. Differences between the mean scores for first attempt and resit candidates range from 6.1% (Interview) to 12.4% (Legal Research).
The range in the mean performance across the five practice areas is small, ranging from 66.1% (Business) to 70.3% (Dispute Resolution), which suggests that, overall, candidates do not find particular practice areas easier or more difficult than others. Mean differences between first attempt and resit candidates range between 8.1% (Wills and Intestacy, Probate Administration and Practice) and 12.2% (Business Organisations, Rules and Procedures).
Figure 3 and Figure 4 show the mean scores for the SQE2 skills and practice area scores for the passing and failing candidates with data aggregated across the five assessment windows: pink = failing; green = passing; bars ordered by passing candidates' mean scores descending.
The patterns of performance between passing and failing candidates appear similar across both skills and practice areas. Mean differences between the passing and failing candidates for skills range from 14.1% (Interview and Attendance Note/Legal Analysis) to 19.7% (Advocacy). For the practice areas, the differences between the passing and failing candidates range from 15.3% (Wills and Intestacy, Probate Administration and Practice) to 20.4% (Business Organisations, Rules and Procedures).
The mean score difference between passing and failing candidates was approaching 20% for three of the skills as follows:
Skills:
For the practice areas, the score difference between passing and failing was close to or above 20% for the following two practice areas:
The shortfall in the legal skills (and associated legal knowledge) of the weaker candidates is therefore greater in these skills and practice areas, suggesting these areas may require more focus/preparation for future attempts at SQE2.
Figure 3: Mean scores by SQE2 station (y axis: mean score (%); x axis: station; bars split by result).

Figure 4: Mean scores by SQE2 practice area (y axis: mean score (%); x axis: practice area; bars split by result).
Comparing performance across the skills to the previous year, candidates have consistently performed better in Advocacy, Case and Matter Analysis and Legal Research, with lower scores for Legal Drafting and Legal Writing. Candidates performed less well in the Interview and Attendance Note skill this year compared to last year.
Across the Practice Areas, candidates have consistently performed better in Dispute Resolution and less well in Business. Candidates have performed less well in Wills and Intestacy this year compared to last year.
Table 17 shows the mean scores for all candidates along with passing and failing candidates for Ethics and non-Ethics focussed assessment content across FLK1 and FLK2.
In each case, there is a strong positive correlation between performance on Ethics questions and the overall score. Naturally, those candidates who have passed the assessments have significantly higher scores than those who have failed.
The mean scores are consistently higher for the Ethics content compared to the non-Ethics content across all assessments and groups. For all candidates, the mean differences range between 10.6% (FLK1 January 2024) and 16.9% (FLK1 July 2024). With the current data, there appears to be no pattern in mean differences between Ethics and non-Ethics content across the FLK1 and FLK2 assessments.
The mean differences between the passing and failing candidates are slightly higher for the non-Ethics content (range 20.8% to 23.3%) than for the Ethics content (range 17.5% to 19.8%), indicating that the knowledge deficit of the weaker candidates is greater in the non-Ethics content than in the Ethics content.
Professional conduct and ethics are examined pervasively in the SQE2 assessments. The SQE2 assessments test a range of professional conduct issues and behaviours. There is no prescribed number of assessments that include a professional conduct or ethical issue in a SQE2 assessment window. However, professional and ethical behaviours are generally tested in most subject areas for each assessment window, where possible. Candidates must be able to spot the ethical or professional conduct issue and exercise judgement to resolve any issues honestly and with integrity.
There is no separate mark for professional conduct or ethics questions. Professional conduct and ethics are marked within the Legally Comprehensive assessment criteria (ie one of the Law criteria).
Kaplan SQE is the End Point Assessment Organisation for solicitor apprentices in England. Solicitor apprentices are required to pass SQE1 during their apprenticeship – it is a gateway requirement for SQE2 which is the end-point assessment (EPA).
Apprentices must pass SQE1 and have met all of the minimum requirements of their apprenticeship (including the gateway review) before they can attempt SQE2. When an apprentice has passed SQE2, they have completed the EPA for the Solicitor Apprenticeship Standard and passed the SQE.
Information about how solicitor apprentices and their training providers can engage with the SQE is available on our website.
Solicitor apprentices made up a small proportion of overall candidate numbers for each assessment as indicated in Table 18 and accounted for 5% of all candidate assessments in this reporting period.
In the SQE1 assessments, solicitor apprentice pass rates were lower than those for non-apprentices in January 2024, although this difference was not significant; in July 2024 the apprentice pass rates were significantly higher. For SQE2, solicitor apprentice pass rates were significantly higher than those for non-apprentice candidates in four of the five SQE2 assessments (all except April 2024). Pass rates were notably higher in July 2023 and October 2023, when apprentice pass rates were 100% (compared with 78% and 62% respectively for non-apprentice candidates). The higher pass rates for SQE2 are consistent with previous years and continue to evidence solicitor apprentices' preparedness for the end-point assessment.
*Proportion of apprentice candidates
There were 681 solicitor apprentices who took one or more of the assessments in this reporting period. Of these:
Our approach to developing assessments is to anticipate candidate requests for reasonable adjustments and, where possible, to make assessment arrangements that minimise the need for adjustments. How we consider reasonable adjustments, including how we communicate with candidates and the arrangements we most frequently make, is set out in the Reasonable Adjustments Policy.
We are committed to making sure that a candidate is not disadvantaged by reason of a disability in demonstrating their competence. We will make reasonable adjustments where a candidate, who is disabled within the meaning of the Equality Act 2010, would be at a substantial disadvantage in comparison to someone who is not disabled. We will take reasonable steps to remove that disadvantage.
We will also consider making accommodations where a candidate has a condition or conditions that impact their ability to undertake the SQE. All such requests for accommodations are considered in Kaplan's reasonable discretion and on a case-by-case basis.
During the course of the year, we implemented 1,554 reasonable adjustment plans.
The average time between receiving a completed application for reasonable adjustments (with full accompanying evidence) and proposing an adjustment plan to the candidate was between three and six days at each assessment window. This was shorter than last year's average.
For some candidates who require complex reasonable adjustment plans, considerable time can be needed to finalise their plan for the assessments and to ensure comprehensive support is arranged. Our standard guidance to candidates continues to be to obtain their supporting evidence and make their reasonable adjustment request as early as possible.
We strongly encourage candidates to submit their request form at the earliest opportunity, even if supporting documentation is not fully available at that point. Applying early provides time for Kaplan and the candidate to finalise an adjustment plan.
Table 22 shows pass rates for candidates with reasonable adjustment plans alongside pass rates for the full cohort for FLK1, FLK2 and SQE2 assessments during the reporting period.
A Chi-square significance test was used to see if there were differences (at a 95% confidence level) between the pass rates for candidates with and without a reasonable adjustment plan. This is indicated in the final column as 'Yes' (significant differences) or 'No' (no significant differences).
For FLK1 and FLK2 in July 2024, candidates with a reasonable adjustment plan in place achieved significantly higher pass rates. For SQE2, the pass rates for candidates with a reasonable adjustment plan were significantly higher in April 2024 and July 2024.
Whilst there is no consistent pattern to suggest that candidates with reasonable adjustments achieved higher or lower pass rates than those without, this will continue to be monitored to see if the more recent findings indicate a change.
*Significant where any differences between the groups are unlikely to be due to chance
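The significance test described above can be sketched as follows. The candidate counts below are invented for illustration and are not figures from this report; the report itself works from the actual cohort data.

```python
# Pearson chi-square test of independence on a 2x2 contingency table:
# pass/fail counts for candidates with and without a reasonable
# adjustment plan. All counts below are illustrative, not report data.

def chi_square_2x2(table):
    """Return the chi-square statistic for a 2x2 observed-counts table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count if pass rate were independent of plan status.
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

#                      pass   fail
table = [[120,   40],        # candidates with an adjustment plan
         [3000, 1500]]       # candidates without a plan

stat = chi_square_2x2(table)
# With 1 degree of freedom, the 95% critical value is 3.841: a statistic
# above it means the difference in pass rates is unlikely to be chance.
significant = "Yes" if stat > 3.841 else "No"
print(f"chi2 = {stat:.2f} -> significant: {significant}")
```

A library routine such as `scipy.stats.chi2_contingency` would normally be used in practice (and applies a continuity correction for 2x2 tables); the hand-rolled version above just makes the expected-counts arithmetic visible.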
Plans were in place for candidates with a wide range of disabilities, long-term and fluctuating conditions. Accommodations were also agreed for some candidates who were pregnant/nursing mothers.
The most prevalent conditions amongst candidates with reasonable adjustment plans were associated with neurodiversity including dyslexia, autism, dyspraxia and ADHD. This was also the case during the period covered by the previous SQE Annual Report (2022/23).
The adjustments were specific to the actual assessments and, in some cases, different arrangements were required on the SQE2 written and oral assessments.
The most common adjustments were as follows, with similar patterns seen across SQE1 and SQE2:
Other bespoke provisions were also arranged for candidates where evidence supported this. Examples of these included access to medical devices and use of a screen overlay.
After each assessment, candidates are invited to complete a survey to provide feedback about their experience. The questions in the survey (Appendix 2) relate to:
Candidates can provide general comments via free-text boxes. They can also provide their contact details should they wish to be contacted further about their feedback.
These surveys continue to provide valuable information for Kaplan and the SRA to consider. All responses to the candidate survey are collated and analysed, with action plans put in place where improvements can be made, or new opportunities and solutions can be explored.
At the beginning of the year, the areas of most concern for dissatisfied candidates were:
By the end of the year, satisfaction with the booking process had increased. Satisfaction with the guidance provided about the assessment rose for SQE1 and remained around the same for SQE2.
Unfortunately, satisfaction with the SQE2 reasonable adjustments process decreased. The application form has since been extensively reviewed and will be updated online in Q1 of 2025 to make the application process more efficient and reduce duplication. Changes are also being made so that agreed adjustments can be carried forward to future assessments where they are unlikely to change over time.
The strongest performing areas in the survey were:
Satisfaction with the oral centres remained strong, while the administration at Pearson VUE centres notably went from being the weakest performing area in 2022/23 to the strongest in the SQE1 July 2024 survey.
Although the survey invites candidates to give feedback about the SQE in Welsh, no candidates opted to take the SQE2 assessment in Welsh in 2023/24. In 2024/25 the option to take the SQE1 in Welsh will be available for the first time. Candidates can sit a Welsh version of the SQE1 January, SQE2 April, SQE1 July and SQE2 October assessments.
Feedback from candidates and stakeholders has also been collected, reviewed and considered from various other sources. This includes input from the SRA, its Psychometrician and the SQE Independent Reviewer about the overall delivery of the SQE assessments, and their oversight of the management of any issues that arise.
Table 23 summarises some of the key areas brought to our attention for improvement, and what is being done in response.
Unfortunately, during the reporting period, the results for the January 2024 SQE1 assessments had to be reissued by Kaplan to correct an error. Consequently, 175 candidates who were originally told that they had failed FLK1 and/or FLK2 (the two parts of SQE1) had, in fact, passed those assessments. The error did not affect the overall pass/fail outcomes for the other 6,451 candidates taking the assessment.
The incorrect results arose not from a calculation error but because the results calculation process did not round scores at the point set out in the published Marking and Standard Setting policy. There were no errors in the marking or in the calculation of correct marks, and no candidate's number of correct answers changed. The error was unique to the January 2024 results.
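The effect of rounding at the wrong point can be illustrated with hypothetical numbers. The score, pass mark and rounding rule below are invented for illustration and are not those used in the actual SQE calculations.

```python
# Illustration of how the point at which a score is rounded can flip a
# pass/fail outcome, even when the underlying marks are all correct.
# All numbers and the rounding rule here are hypothetical.

def outcome_round_then_compare(raw_score, pass_mark):
    # Round to the nearest whole percentage first, then compare.
    return "pass" if round(raw_score) >= pass_mark else "fail"

def outcome_compare_raw(raw_score, pass_mark):
    # Compare the unrounded score directly against the pass mark.
    return "pass" if raw_score >= pass_mark else "fail"

raw_score, pass_mark = 59.6, 60  # hypothetical values

print(outcome_round_then_compare(raw_score, pass_mark))  # pass
print(outcome_compare_raw(raw_score, pass_mark))         # fail
```

The same marks produce different outcomes depending on where rounding occurs, which is why a published policy must fix the rounding point and the calculation process must follow it exactly.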
Kaplan's initial internal investigations determined the root cause and made recommendations concerning prevention of future issues. One recommendation was to appoint an external, expert, independent reviewer to check the recalculation of the results issued, determine the factors contributing to the error and identify actions to prevent reoccurrence.
Kaplan appointed an external, expert reviewer, Anne Pinot de Moira, a Chartered Statistician with more than 25 years of experience in assessment research. The review had two parts: the first checked, and confirmed as correct, the recalculation of the reissued results; the second examined the contributing factors.
Kaplan published the outcome of the first part of the review on 15 May 2024 (one month after the results were reissued to candidates). The review concluded that the reissued results were accurate, including individual marks, quintiles and overall pass/fail outcomes.
The second part of the review, which examined contributing factors, concluded in July 2024. To address the contributing factors and prevent reoccurrence, the external reviewer identified recommendations to improve:
In response to the recommendations, Kaplan created an action plan to systematically address the issues identified and improve the overall quality and confidence in the outcomes of the assessments.
The action plan extends the learnings from this review to the wider business practices to prevent future errors and maintain the integrity of the qualifications provided. Progress on delivering the action plan is being reported regularly to the SRA.
For each question, candidates are asked to say whether they are very satisfied, satisfied, neither, unsatisfied or very unsatisfied.