This Solicitors Qualifying Examination (SQE) annual report provides a cumulative picture of the outcomes from the SQE assessments that have taken place in the reporting period (September 2021 - August 2022). Statistics and commentary are provided on the overall performance of candidates at the individual assessment level to enable comparisons over time and identify any emerging trends, and assessment data is provided where applicable at the cumulative level. Three assessment windows are covered in this report:
While the results have not yet been released for the October 2022 assessment window delivery of SQE2, some data is provided on the number of candidates who attended this assessment.
The Solicitors Qualifying Examination (SQE) is a single rigorous assessment designed to assure consistent, high standards for all qualifying solicitors. It consists of two parts: SQE1, which tests candidates’ functioning legal knowledge (FLK), and SQE2, which tests candidates’ practical legal skills and knowledge. The SRA’s competence statement sets out what solicitors need to be able to do to perform the role effectively, and provides everyone with a clear indication of what to expect from a solicitor. This is what the SQE tests.
The SQE is provided by the Solicitors Regulation Authority (SRA). The SRA has appointed Kaplan SQE (Kaplan) as the approved assessment provider for the delivery of the SQE assessments and other related services.
In this first year, SQE assessments were delivered to more than 3,000 candidates in 42 countries.
SQE1 consists of two 180-question multiple-choice, single-best-answer assessments (FLK1 and FLK2). These are delivered electronically under controlled and invigilated exam conditions at Pearson VUE test centres across the UK and internationally. Each FLK assessment takes place on a separate day. Each day is split into two sessions of 2 hours 33 minutes, with 90 questions in each session. There is a 60-minute break between the sessions. FLK1 and FLK2 each have a separate pass mark. In order to pass SQE1, a candidate must pass both the FLK1 and FLK2 assessments. Candidates who fail at their first attempt of SQE1 have two further opportunities to take the assessment(s) they failed (FLK1 and/or FLK2). Information about SQE1 can be found in the SQE1 Assessment Specification.
SQE2 comprises 16 stations (12 written and four oral) that assess both skills and application of legal knowledge. The stations in SQE2 cover six legal skills (advocacy; case and matter analysis; interview and attendance note/legal analysis; legal drafting; legal research; legal writing) across five practice areas (business organisations, rules and procedures; criminal litigation; dispute resolution; property practice; wills and intestacy, probate administration and practice).
SQE2 written assessments take place in Pearson VUE test centres over three consecutive half days and all candidates take the same written stations on the same date.
SQE2 oral assessments take place over two consecutive half days in oral assessment centres in Cardiff, London and Manchester. The logistics involved in running the oral assessments mean that not all candidates in a cohort can take the same oral stations on the same date, so multiple "sittings" are used for the SQE2 oral stations. To protect the integrity of the assessments and to ensure equity, different oral stations are used at the different sittings, though all candidates are assessed across the same skills and practice areas.
SQE2 has a single pass mark for the overall 16-station assessment. There may be slightly different pass marks between the SQE2 sittings to account for differences in the difficulty of the oral stations used, as described above.
Candidates who fail SQE2 at the first attempt have two further opportunities to take that assessment, and must resit the whole 16-station assessment. Information about SQE2 can be found in the SQE2 Assessment Specification.
Exemptions from the SQE assessments are only available to qualified lawyers¹. Some candidates meet the SRA transitional regulations and are using SQE2 to qualify as a solicitor; these transitional candidates are not required to take SQE1. To summarise, there are three types of candidate taking SQE assessments:
1See https://www.sra.org.uk/become-solicitor/qualified-lawyers/sqe-exemptions/
The data provided in this report relate to candidates who received a mark for any of the assessments. Candidates whose attempts were discounted due to mitigating circumstances are not included. Outcome data is provided separately for FLK1 and FLK2 assessments, and overall for SQE1 and SQE2.
In this reporting period, a total of 3,290 individual candidates received a mark for one or more of the SQE assessments. Table 1 below provides the number of candidates for each assessment, along with the numbers and proportions of candidates by attempt number, where applicable.
Candidates are allowed up to three attempts for each assessment within a six-year period. At the time of writing this report there had only been two opportunities to sit SQE1, so no candidates had made a third attempt. We are only reporting here on one SQE2 assessment as results for the second SQE2 (completed in the October 2022 assessment window) are due to be released in March 2023 and will be included in the 2022/23 annual report.
At their first SQE1 attempt, candidates are required to sit both FLK1 and FLK2 in the same assessment window. If a candidate fails FLK1 or FLK2 they only need to resit the assessment which they failed: passes can be carried forward within a six-year period. Because of this, and owing to mitigating circumstances, the number of candidates may differ across FLK1, FLK2 and SQE1 overall.
*The number of candidates in both FLK1 and FLK2 for that assessment window.
Though SQE2 in the October 2022 assessment window had been delivered at the time of writing this report, the marks had not been released to candidates. 646 candidates sat SQE2 in the October 2022 assessment window. These break down approximately as:
These numbers could change if there are successful mitigating circumstances claims, with attempts being discounted.
A statistical report will be published after results are released, and the October 2022 SQE2 data will be included in the 2022/23 Solicitors Qualifying Examination Annual Report.
Table 2 provides the pass marks for each assessment, the average score (Mean), standard deviation (SD) and measures of test reliability (Cronbach’s Alpha and Standard Error of Measurement (SEm)).
Cronbach’s Alpha (α) is a measure of test reliability that estimates internal consistency, or how closely related the sets of items in a test are. It therefore tells us how well items (questions) work together as a set. A high α coefficient suggests that candidates tend to respond in similar ways from one item to the next. Values for α range from 0 (where there is no correlation between items) to 1 (where all items correlate perfectly with one another). The widely accepted gold standard for high-stakes assessments is an α of 0.8. To date, α has been greater than 0.9 in all SQE1 assessments and above 0.8 for SQE2, suggesting very good internal consistency and high reliability for the SQE1 and SQE2 assessments.
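As an illustration of the statistic described above (our own sketch, not the calculation used by Kaplan), Cronbach's alpha can be computed from a candidates-by-items matrix of item scores:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_candidates, n_items) matrix of item scores.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item across candidates
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of candidates' total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
```

When all items correlate perfectly and have equal variance, alpha is 1; when items are uncorrelated, alpha approaches 0.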
The SQE assessments provide an observed/obtained score for a candidate at a snapshot in time (the assessment). If the candidate were to sit the same assessment on another occasion, they may achieve a different score owing to various factors. Some of these can be controlled to an extent, such as the training provided, and some cannot, such as the amount of sleep the candidate got the night before the assessment.
A candidate's theoretical “true” score can only be estimated by their observed score but there is an inevitable degree of error around each candidate’s observed score, which is consistent with most assessments. The standard error of measurement (SEm) for an assessment is an estimate of how repeated measures of the same group of candidates on the same assessment would be distributed around their theoretical “true” scores. The SEm is a function of the reliability of the assessment (α) and the standard deviation in scores on the assessment. Generally, the higher the reliability, the lower the standard error of measurement and vice-versa.
For all SQE assessments to date the SEm has been below 4% which suggests that observed scores generally represent a very good approximation for true scores in these assessments.
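The relationship between reliability and measurement error described above is commonly expressed as SEm = SD × √(1 − α). A minimal sketch (our own illustration, not Kaplan's code):

```python
import math

def standard_error_of_measurement(sd: float, alpha: float) -> float:
    """SEm = SD * sqrt(1 - alpha): the higher the reliability (alpha),
    the lower the standard error of measurement, and vice versa."""
    return sd * math.sqrt(1.0 - alpha)

# Example: an assessment with a standard deviation of 10 percentage points
# and alpha = 0.91 has an SEm of 3 percentage points.
```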
Note: for SQE2 there were multiple sittings for the oral assessments (see introduction).
Tables 3a to 3c below summarise the journey so far for the candidates who have received assessment marks. These are provided separately for SQE1 (candidate outcomes) and SQE2 (candidate routes and outcomes). The candidates include transitional candidates not required to take SQE1, as well as qualified lawyers who have been granted an exemption by the SRA. Because early cohorts include large numbers of transitional candidates not required to take SQE1, the cohorts and their outcomes may not be representative of a steady-state cohort.
Of the 341 candidates who passed SQE1 in November 2021 and went on to sit SQE2 in April 2022, 302 (89%) passed. Of the 39 candidates who failed, 12 (31%) attempted SQE2 again in October 2022 (the results of which are not covered by this report), and 27 (69%) are yet to resit.
Of those who attempted parts of SQE1 in both November 2021 and July 2022, 61 (26%) passed their remaining assessments in July. They are now eligible to move on to SQE2.
Candidates who fail their first attempt may benefit from reviewing the information contained later in this report relating to candidate performance in different practice areas.
*these numbers are provisional
Of the 385 candidates who did not sit SQE1 and sat SQE2 in April 2022, 259 (67%) passed. This compares to a significantly higher pass rate of 89% for those who had taken and passed the SQE1 in November 2021.
Of the 561 candidates who passed SQE2 in April 2022:
Of the 165 candidates who failed SQE2 in April 2022:
Table 4 shows the candidate pass rates (and number passing) for each assessment for all candidates and by attempt. It also shows, for SQE2, the pass rate for those transitional candidates who did not need to take the SQE1 (including QLTS and LPC transitional candidates).
In the first two SQE1 assessments, the pass rates for FLK1 have been higher than for FLK2 (by 13 and 9 percentage points respectively). This might suggest that candidates are better prepared for the assessment themes in the FLK1 assessment. We will continue to monitor this across future assessments.
The overall SQE1 pass rate was consistent for both SQE1 assessments at 53%.
Pass rates for SQE1 in July 2022 were significantly lower for resitting candidates when compared with first-attempt candidates. This was true for both FLK1 and FLK2.
The number resitting was relatively small (6% and 11% respectively for FLK1 and FLK2). However, the lower pass rate may indicate that resit candidates would benefit from taking more time (and/or putting in more work or training) between sittings, to help them improve from a failing to a passing standard.
The SQE2 pass rate was markedly higher than for SQE1 (by 24 percentage points). This was unsurprising given that sitting SQE2 was conditional on achieving a pass in SQE1 (except where candidates did not need to sit it).
Noting this, the SQE2 pass rate for candidates using the transitional arrangements to complete their qualification was lower than for candidates who passed SQE1 (67% and 89% respectively).
*not reportable as less than 10 candidates
Of the 385 transitional candidates who did not sit SQE1:
The SRA collects diversity and socio-economic data to help understand how candidates with different characteristics and backgrounds perform in the assessments. The data categories are consistent with data collected by the Office for National Statistics (ONS) and the Social Mobility Commission. Data is collected from candidates via an online monitoring and maximising diversity survey (or demographic data survey), completed ahead of assessment registration. Appendix 1 lists the data reported on in this section.
The large number of characteristics, and of groups within characteristics, recorded in the data means that candidate numbers in some groups are very small. This is compounded by almost half of candidates (47%) selecting ‘Prefer not to say’ for one or more of the 14 characteristics provided in this report. As a result, we have not been able to use multivariate analysis in this report.
Though multivariate analysis is not included for the reasons mentioned, initial exploratory analyses suggest that much of the attainment gap evident between groups may be explained by educational and socio-economic factors. However, we cannot yet draw reliable conclusions from this exploratory analysis; we will look at this again when candidate numbers and declaration rates increase.
In the tables below, we present univariate analysis of the outcomes data, which looks at each of the characteristics individually and independently. Tables 5 to 11 provide the following for FLK1, FLK2, SQE1 and SQE2 for each of the 14 characteristics presented:
Data in the tables exclude candidates who select ‘Prefer not to say’:
These tables provide data for all candidates who received marks for each assessment (FLK1, FLK2, SQE1 and SQE2). The data is pooled from the November 2021 and July 2022 assessments for FLK1, FLK2, and SQE1. Where there are fewer than 10 candidates in any group the proportions and pass rates are not reported; this is indicated by greyed out cells in the table.
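The small-group suppression rule described above can be sketched as follows (our illustration of the reporting rule, not the actual reporting code):

```python
def group_pass_rate(candidates: int, passes: int):
    """Return a group's pass rate, or None when the group has fewer than
    10 candidates and must be suppressed (a greyed-out cell in the tables)."""
    if candidates < 10:
        return None
    return passes / candidates
```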
The full questions asked in the online demographic data survey in relation to each category are available in Appendix 1.
Candidates who reported being in White or Mixed/multiple ethnic groups achieved higher pass rates than candidates in the Asian/Asian British or Black/Black British groups. Differences in pass rates between groups were significant for all assessments.
Candidates who declared a disability achieved a higher pass rate than those who reported no disability in SQE2. The opposite was the case in SQE1.
The majority of candidates taking SQE1 and SQE2 were in the younger age groups. More than half of all candidates were in the 25-34 age group and this group achieved higher pass rates than older candidates.
There were fewer than 10 candidates in the 65+ age group for each assessment and for the 55-64 age group for SQE2. The proportions and pass rates are therefore not provided for these groups.
Candidates who reported their sex as male achieved a higher pass rate than female candidates in SQE1, though there was no significant difference between sex groups in SQE2.
Candidates who reported that their sexual orientation was Bi or Gay / Lesbian achieved higher pass rates than Heterosexual / straight candidates in SQE1 assessments, but there were no significant differences in SQE2.
There were fewer than 10 candidates selecting ‘Other’ for sex and sexual orientation, and ‘No’ for their gender being the same as the sex registered at birth for all assessments. The proportions and pass rates are therefore not provided for these groups.
There were differences in pass rates between religion/belief groups reported by candidates in all assessments. Candidates reporting Hindu, Muslim or Sikh as their religion generally had lower pass rates.
The most frequently indicated group was no religion or belief (approximately 40% of candidates). Fewer than 10 candidates selected Buddhist as their religion for SQE2; the proportion and pass rate are therefore not provided for this group.
Candidates who reported that at least one parent attended university achieved higher pass rates in SQE1 assessments, but this factor was not significant in SQE2.
Pass rates were similar across all categories of main household earner occupation, and for the school type attended for the most time between the ages of 11 and 16.
Fewer than 10 candidates did not know the type of school they attended across all assessments. The proportions and pass rates are therefore not provided for this group.
*Group excluded from the Chi-square test of significance
Candidates with higher undergraduate degree classifications achieved higher pass rates in the SQE1 and SQE2 assessments. Whether or not a candidate held an undergraduate degree had no effect on pass rates.
All candidates disclosed whether or not they were already a qualified lawyer. Those who were achieved a higher pass rate at FLK1, but a similar pass rate to those who were not qualified at FLK2.
However, in SQE2, candidates who disclosed that they were not qualified achieved a higher pass rate. Candidates who said they had undertaken qualifying work experience achieved higher pass rates in the SQE1 assessments.
There were fewer than 10 candidates with a third class or Commendation for their undergraduate degree for SQE2. The proportions and pass rates are therefore not provided for these groups.
*Group excluded from the Chi-square test
The tables in this section show the mean scores by practice area for each assessment by the following candidate groups:
In FLK1, candidates performed best in the Ethics and Contract Law practice areas (mean scores ranging from 69% to 73.4% for Ethics and from 65.5% to 70% for Contract), while Business Law and Practice and Dispute Resolution had the lowest mean scores (49.6% to 58.5% for Business Law and Practice and 52.6% to 55.8% for Dispute Resolution).
In FLK2, Ethics and Criminal Liability had the highest mean scores (65.1% to 74.8% for Ethics and 58.8% to 71.8% for Criminal Liability), while Property Practice and Wills and Intestacy had the lowest (46.4% to 48.3% for Property Practice and 50.7% to 55.6% for Wills and Intestacy).
There are similar patterns of performance between the passing and failing candidates, and first and second attempt candidates, across the majority of practice areas within each assessment. This suggests both stronger and weaker candidates perform well/less well in the same practice areas.
The mean difference between passing and failing candidates across the practice areas ranges from 17.2% (FLK1 Legal Services in July 2022) to 28.0% (FLK2 Land Law in November 2021).
The differences are consistently above 20% for Business Law and Practice, Contract Law and Legal System in FLK1, and Wills and Intestacy, Land Law and Trust Law in FLK2.
The deficit in knowledge of the weaker candidates is therefore greater in these practice areas, suggesting these areas may require more focus/preparation for future attempts at SQE1.
When comparing first and second attempts in the July 2022 assessment window the differences range between 5.2% (FLK2 Ethics) and 15.0% (FLK1 Contract Law). First attempt candidates had higher mean scores for all areas.
The differences appear higher across the FLK1 practice areas suggesting second attempt candidates have more gaps in their knowledge in these areas.
Contract Law, Business Law and Practice, Legal System and Tort all have a mean difference greater than 10%. This is followed by Dispute Resolution, Ethics and Legal Services with mean differences between 8.7% and 9.8%. Within FLK2, only Land Law has a mean difference comparable with the FLK1 practice areas (8.9%) with the other areas ranging between 5.2% and 8.3% difference.
The following plots show the mean scores for the FLK1 and FLK2 practice areas by date for the passing and failing candidates (pink = failing; green = passing; bars ordered by passing candidate mean scores for November 2021 FLK1 descending).
The plots show that the rank order of mean performance of passing and failing candidates across the practice areas differs from assessment to assessment.
In FLK1, candidates scored lower on Business Law and Practice in July 2022 compared to November 2021 but scored higher on Tort.
In FLK2, candidates performed less well in Criminal Liability but better in Criminal Law and Practice.
In the April 2022 SQE2, all candidates sat the same 12 written stations with candidates sitting four different stations across the four oral sittings.
These findings relate to the April 2022 SQE2 assessment stations, with adjustments made to ensure all sittings are on a comparable scale. Please note that the combinations of skills and practice areas will vary between iterations of the SQE2 assessment. For more on this, please see the SQE2 Assessment Specification (Organisation and delivery section).
The tables in this section show the mean scores by station for the following candidate groups:
Looking at the mean scores for all candidates, the highest mean scores achieved were in:
Of the 16 mean station scores, nine were above 70%, which included all four oral stations. The lowest mean score was for the Legal Writing - Dispute Resolution station (57.0%) which was the only mean score below 60%.
Within each of the five practice areas there is a range of mean scores, suggesting candidates do not find particular practice areas easier or more difficult than others. For example, the Dispute Resolution practice area contains both the highest (85%) and the lowest (57%) mean scores.
Of the six skills assessed, the mean performance is similar across practice areas for the two assessed orally (Advocacy and Interview and Attendance Note/Legal Analysis) and for Legal Drafting (means 65.3%, 68.0% and 68.2%).
The greatest variation in mean scores is in the Legal Research stations (69.7%, 77.2% and 85.0%) and Legal Writing (57.0%, 73.0% and 74.7%).
Figure 3 and Figure 4 show the mean scores for the SQE2 written and oral stations for the passing and failing candidates (pink = failing; green = passing; bars ordered by passing candidates’ mean scores descending).
The mean scores for failing candidates on the written stations do not follow the same pattern as those for the passing candidates.
The smallest mean difference is the Legal Writing - Business station (12.4%) and the largest is in the Case and Matter Analysis - Wills station (27.4%).
The mean score difference was greater than 20% for the following eight (out of 16) stations.
The deficit in the legal skills (and associated legal knowledge) of the weaker candidates is therefore greater in these practice areas, suggesting these areas may require more focus/preparation for future attempts at SQE2.
There was less variability in the mean scores for the four oral stations. There was a range of 2.1% for the failing candidates (60.5% to 62.6%) and 8.8% for those passing (74.4% to 83.2%).
The difference between the passing and failing candidate means ranged between 11.9% and 20.7%, with only Criminal Litigation – Advocacy being greater than 20%.
Table 15 shows the mean scores for all candidates along with passing and failing candidates for ethics and non-ethics focussed assessment content across FLK1 and FLK2.
In each case, there is a strong positive correlation between performance on ethics questions and the overall score. Naturally those candidates who have passed the assessments have significantly higher scores than those who have failed.
The mean scores are consistently higher for the ethics items compared to the non-ethics items across all assessments and groups.
For all candidates the mean differences range between 8.5% (FLK2 November 2021) and 19.7% (FLK2 July 2022). With the current data, there appears to be no pattern in mean differences between ethics and non-ethics items across the FLK1 and FLK2 assessments.
The mean differences between the passing and failing candidates are similar for both ethics and non-ethics content, indicating associated performance across the two content areas.
Professional Conduct and Ethics are assessed pervasively in the SQE2 assessments. Some of the 16 stations contain Professional Conduct matters, and some do not. There is no formula determining which assessments this may fall in, and candidates are required to spot these issues and deal with them appropriately.
The matters related to these topics do not therefore have their own assessment criteria allocated to them. Examiners award discretionary credit under the criteria related to the application of the law.
Kaplan SQE is the End Point Assessment Organisation for solicitor apprentices. Solicitor apprentices are required to pass SQE1 during their apprenticeship as an on-programme assessment. SQE2 is the synoptic end-point assessment (EPA).
Apprentices must pass SQE1 and have met all of the minimum requirements of their apprenticeship (including the gateway review) before they can attempt SQE2. When an apprentice has passed SQE2, they have completed the EPA for the Solicitor Apprenticeship Standard and passed the SQE.
Information about how solicitor apprentices and their training providers can engage with the SQE is available on our website.
Solicitor apprentices made up a small proportion of overall candidate numbers for each assessment as indicated in Table 16. This group has a higher pass rate across each of the three assessments when compared to the first attempt pass rates for all candidates for SQE1 Nov 2021, SQE2 Apr 2022 and SQE1 Jul 2022 (see Table 16).
Of particular note is the 100% pass rate for the SQE2 assessment in April 2022. While this is the only SQE2 assessment with outcome data so far, it suggests the apprentice candidates are well prepared for the end-point assessment and for applying their legal knowledge in skills-based assessments.
* Proportion passing only FLK1 or FLK2 at their first attempt of SQE1
We are committed to making sure that a candidate is not disadvantaged by reason of a disability in demonstrating their competence and we will make reasonable adjustments to methods of assessment for candidates with a disability (within the meaning of the Equality Act 2010) to achieve this. We will also consider reasonable requests to accommodate candidates with other conditions which impact on their ability to undertake the SQE. All SQE candidates are assessed against the same standards set out in the Statement of Solicitor Competence and the Statement of Legal Knowledge, and must reach the Threshold Standard to qualify.
Our approach to reasonable adjustments, including how we communicate with candidates and the arrangements we most frequently make, is set out in the Reasonable Adjustments Policy.
Throughout this year, 264 reasonable adjustment plans were delivered. In the small number of cases where candidates declined the adjustments initially offered, a revised plan was proposed and agreed based on evidence provided by the candidate.
The average time taken between receiving a completed application for reasonable adjustments (with full accompanying evidence) to proposing an adjustment to a candidate has decreased through the year. It was around 12 working days in November 2021 and around 4.5 working days in July 2022. This improved as the process became more refined and the team became more established following the first assessment. Joint working and management of the reasonable adjustment requests with Pearson VUE was altered to improve completion rates.
Before the assessment, considerable time is spent with some candidates who have complex reasonable adjustment plans to ensure that the candidate is supported. Our standard guidance to candidates is to obtain their supporting evidence and make their reasonable adjustment request as early as possible. If supporting documentation is not available at the time of submitting the form, documentation should be submitted at the earliest opportunity.
Table 19 shows pass rates for candidates with reasonable adjustment plans alongside pass rates for the full cohort for SQE1 and SQE2 assessments this year. A Chi-square significance test was used to see if there were differences (at a 95% confidence level) between the reasonable adjustment candidate pass rates and the overall cohort. This is indicated in the final column as ‘Yes’ (significant differences) or ‘No’ (no significant differences).
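As an illustration of the test described above (our sketch, not the exact procedure used in the report), a Pearson chi-square statistic for a 2x2 pass/fail table can be computed and compared with the critical value of 3.841 for one degree of freedom at the 95% confidence level:

```python
def chi_square_2x2(a: int, b: int, c: int, d: int) -> float:
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]],
    e.g. rows = (reasonable-adjustment group, rest of cohort),
    columns = (pass, fail). Counts here are hypothetical."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

CRITICAL_95_DF1 = 3.841  # chi-square critical value, df = 1, p = 0.05

def significantly_different(a: int, b: int, c: int, d: int) -> bool:
    """'Yes' in the table's final column: a difference in pass rates
    unlikely to be due to chance at the 95% confidence level."""
    return chi_square_2x2(a, b, c, d) > CRITICAL_95_DF1
```

Note that with small groups (such as the reasonable-adjustment cohort), the chi-square approximation weakens, which is one reason small cells are suppressed elsewhere in the report.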
Pass rates for candidates with reasonable adjustment plans were similar to the full cohort for all SQE1 assessments. However, there was some evidence to suggest that the pass rate for candidates with reasonable adjustment plans was higher in SQE2.
Noting that we only have outcomes from one SQE2 assessment at the time of writing this report, we will monitor future assessments for emerging trends.
*Significant where any differences between the groups are unlikely to be due to chance
Plans were in place for candidates with a wide range of disabilities, long-term and fluctuating conditions. Adjustments were also agreed for some candidates who were pregnant/nursing mothers.
The most common conditions amongst candidates with reasonable adjustment plans were neurological conditions such as dyslexia, autism, dyspraxia and ADHD.
The majority of candidates had more than one adjustment in their plan. These were specific to the actual assessments and in some cases different arrangements were required on the SQE2 written and oral assessments.
The most common adjustments were as follows, with similar patterns seen across SQE1 and SQE2.
Other bespoke provisions were also arranged for candidates where evidence supported this. Examples of these included:
JAWS screen reader technology is available as a possible adjustment, although no candidates required or requested it in this period.
After each assessment, candidates are invited to complete a survey to provide feedback about their experience. The questions in the survey (Appendix 2) relate to:
Candidates can provide general comments via a free-text box. They can also provide their contact details should they wish to be contacted further about their feedback.
These surveys have provided valuable information for Kaplan and SRA to consider during these early stages of the SQE. All responses are collated and analysed, with action plans put in place where improvements can be made, or new opportunities and solutions can be explored.
Feedback from candidates and stakeholders has also been collected and reviewed and considered from various other sources during the course of delivering the assessments.
Table 20 summarises some of the key areas brought to our attention for improvement, and what we are doing in response.