3 years on: Examiners’ and candidates’ views on the CASC (Clinical Assessment of Skills and Competencies)

Rajkumar Kamatchi, Saugata Bandyopadhyay, Ashok Kumar Jainer, Bettahahalasoor Somashekar, Marek Marzanski and Steven Marwaha

Cite this article as: BJMP 2012;5(4):a537


Abstract

Aim and method: The Clinical Assessment of Skills and Competencies (CASC), introduced in June 2008, is the new and only clinical examination for obtaining membership of the Royal College of Psychiatrists. Although there is evidence of strong validity and reliability for OSCE (Objective Structured Clinical Examination)-type examinations, the acceptability, validity and reliability of the CASC are open to challenge. We conducted a national online survey of candidates and examiners to obtain their views and aimed to evaluate whether the CASC fulfils its purpose.

Results: The survey showed that 48% of the candidates (n=110) and 59% of the examiners (n=22) agreed that the CASC examines the required competencies to progress to higher training. However, only 15% of the candidates and 18% of the examiners accepted that the CASC examines all the advanced psychiatric skills compared to the previous Part 2 clinical examination. Nevertheless, only a third of the candidates and examiners considered replacing the CASC with the traditional long case as the best way forward.

Implications: Although CASC scenarios may reflect real-life situations and its content covers most skills in a piecemeal fashion, it lacks the holistic ethos underpinning the bio-psychosocial approach unique to psychiatry. The findings of the survey suggest that the current examination method requires further systematic evaluation.

Postgraduate medical education in the United Kingdom has seen numerous dramatic changes in the last decade, with the introduction of structured training programmes and changes in the assessment of skills driven by Modernising Medical Careers.1 Overall, these new developments emphasise a competency-based curriculum and assessments. Alongside, and contingent on, these wider changes in medical education, psychiatric trainees have faced major transformations in their membership (MRCPsych) examinations.

The MRCPsych examination was first introduced in 1972, a year after the Royal College of Psychiatrists was founded. There have been various modifications to its structure since its inception, but a radical change occurred in the last decade with the introduction of an OSCE in 2003 and the CASC, a modified OSCE, in June 2008. The CASC is considered a high-stakes examination as it is now the only clinical, and the final, examination towards obtaining membership of the College. The MRCPsych qualification is considered an indicator of professional competence in the clinical practice of psychiatry and has the main aim of setting a standard that determines whether trainees are suitable to progress to higher specialist training.2 In his commentary on Wallace et al,3 Professor Oyebode describes the aims, advantages and disadvantages of the various assessment methods used in the MRCPsych examination and concludes that the precise assessment of clinical competence is essential.4

Traditionally, assessment of clinical skills has involved a long case examination since its introduction into the clinical graduating examination by Professor Sir George Paget at Cambridge, UK, in 1842. This format has been followed by most medical institutions worldwide and remained the clinical component of the MRCPsych examination until 2003. There are shortcomings with this assessment method, and the outcome can be influenced by several factors such as the varying difficulty of the cases, the co-operation of the real patient and examiner-related factors. The reliability of assessing clinical competency with a single long case is low, and a candidate would need to be examined on at least ten long cases to attain the reliability required for a high-stakes examination like the MRCPsych.5 A fair, reliable and valid examination is necessary to overcome these difficulties. OSCEs proved to be one of the answers to these difficulties.

One important aspect of assessing the validity and acceptability of assessment methods is asking examiners and candidates about their experiences and views once the examination has been rolled out. As far as the authors are aware, there has been one previously published survey of CASC candidates’ views on this method of examination, and this was based at a revision course. Whelan et al6 showed that approximately 70% of the candidates did not agree with the statement “there is no longer a need to use real patients in post-graduate clinical psychiatry exams”. In addition, only 50% of the candidates preferred the CASC to the previous long case, with the other 50% remaining undecided. This raises doubts about the acceptability of the CASC format and merits further exploration.

Method

We conducted a national online survey asking both candidates and examiners about their views on the CASC examination.

Questionnaire development

Two questionnaires (one each for examiners and candidates) based on previously available evidence on this exam format6,7,8 were developed following discussions among the authors.

The final version of the questionnaire was the same for both groups and comprised seven questions rated on a five-point Likert scale. It included questions on whether the exam effectively assessed the competencies needed for real-life practice, whether communication skills were over-tested, whether feedback was adequate, respondents’ views on the validity and reliability of the method, and finally whether the clinical examination should revert to the previous style of long case and viva.

Sampling procedure

Examiners and candidates who had already sat the CASC examination were invited to complete the online survey. Links to the questionnaires were distributed via the Schools of Psychiatry in thirteen deaneries in the United Kingdom (including Wales, Northern Ireland and Scotland). We approached 400 candidates and 100 examiners from different deaneries, ensuring a wide geographical distribution. The sample size was chosen on the basis that around 500 candidates sit the CASC examination at each sitting and there are approximately 431 examiners on the CASC board (personal communication with the College). Participants were assured that their responses were confidential. The survey was open from mid-March to mid-April 2011, and reminders were sent halfway through the survey period.

Results

A total of 110 candidates and 22 examiners completed the survey. The response rate was higher for candidates (27.5%) than for examiners (22%). Despite the low response rate, the responses showed a good geographical spread, with responses received from most of the deaneries (87%). The London, East Midlands and West Midlands deaneries showed the highest response rates (14% each), while the Scotland, Severn and North Western deaneries showed the lowest (2% each).

Among the 110 candidates, 52% were male and 48% female; among the examiners, 73% were male and 27% female. 55% of the examiners had been involved in the previous Part 2 clinical exam, whereas only 7% of the candidates had experience of it. The results are summarised in Tables 1 and 2.

Table 1. Candidates’ views (n = 110)

| Survey question | Strongly agree | Agree | Neutral | Disagree | Strongly disagree |
|---|---|---|---|---|---|
| CASC examines the required competencies to progress to higher training | 10% | 38% | 7% | 26% | 19% |
| CASC examines all skills and competencies compared to the previous Part 2 clinical exam | 4% | 11% | 46% | 21% | 18% |
| CASC scenarios reflect the real-life situations faced in clinical practice | 12% | 36% | 13% | 22% | 17% |
| CASC gives more emphasis to testing communication and interviewing skills than overall competencies | 29% | 31% | 14% | 19% | 7% |
| CASC is more valid and reliable as a clinical exam | 9% | 19% | 29% | 20% | 23% |
| The ‘areas of concern’ feedback system is helpful to unsuccessful candidates | 1% | 11% | 28% | 26% | 34% |
| CASC needs to be replaced by the traditional style of exam, a long case and a viva | 14% | 22% | 25% | 24% | 15% |


Table 2. Examiners’ views (n = 22)

| Survey question | Strongly agree | Agree | Neutral | Disagree | Strongly disagree |
|---|---|---|---|---|---|
| CASC examines the required competencies to progress to higher training | 14% | 45% | 14% | 18% | 9% |
| CASC examines all skills and competencies compared to the previous Part 2 clinical exam | 4% | 14% | 23% | 45% | 14% |
| CASC scenarios reflect the real-life situations faced in clinical practice | 14% | 63% | 5% | 9% | 9% |
| CASC gives more emphasis to testing communication and interviewing skills than overall competencies | 22% | 26% | 17% | 22% | 13% |
| CASC is more valid and reliable as a clinical exam | 9% | 37% | 27% | 9% | 18% |
| The ‘areas of concern’ feedback system is helpful to unsuccessful candidates | 0% | 36% | 14% | 27% | 23% |
| CASC needs to be replaced by the traditional style of exam, a long case and a viva | 18% | 14% | 41% | 9% | 18% |

Clinical competencies and skills

59% of the examiners and 48% of the candidates accepted that the CASC examines the required competencies to progress to higher training. Strikingly, only 18% of the examiners and 15% of the candidates agreed that the CASC allows assessment of all the skills and competencies necessary for higher trainees, in comparison with the previous Part 2 clinical exam.

Content of the CASC

A majority of the examiners (77%) and nearly half of the candidates (48%) agreed that CASC scenarios reflect the real-life situations faced by clinicians in everyday practice. However, 60% of the candidates and 48% of the examiners felt that the CASC places excessive emphasis on communication and interviewing skills.

Feedback - “areas of concern”

More than half of the candidates (60%) and half of the examiners (50%) felt that the “areas of concern” feedback given to failed candidates was not helpful in improving their preparation before the next attempt.

Validity and reliability of the CASC as a clinical exam

Just over a quarter of the candidates (28%) and just under half of the examiners (46%) considered the CASC a valid and reliable method of clinical examination. However, only 36% of the candidates and 32% of the examiners supported replacing the CASC with a traditional clinical exam (a long case and a viva). Broadly comparable proportions (39% of the candidates and 27% of the examiners) disagreed with the statement that the CASC should be replaced by the previous examination style.

Discussion

To our knowledge this is the first study of candidate and examiner views since the introduction of the CASC. Its predecessor, the OSCE, has good reliability and validity in assessing medical students8 and has become a standard assessment method in undergraduate examinations. Whilst OSCEs have been held to be reliable and valid in a number of assessment scenarios,8 there have been doubts about their ability to assess advanced psychiatric skills,9 which was one of the main reasons for retaining the long case in the MRCPsych Part 2 clinical exam.2 Over the years, most of the Royal Colleges introduced OSCEs into their membership examinations and used simulated patients in some scenarios. However, the CASC is the first examination to use only simulated patients, in a combination of paired and unpaired stations. So far there has been no published literature evaluating this method systematically.

In a recent debate paper10 it has been argued that the CASC may have significant problems related to its authenticity, validity and acceptability. The findings of our survey reflect similar doubts about the reliability and validity of the CASC exam amongst both candidates and examiners. The content validity of the CASC has been demonstrated by the College Blueprint11 and its face validity appears to be good. However, as far as we are aware, concurrent and predictive validity data have not been published. Although the global marking system appears to have better concurrent validity than checklist marking, it gives examiners similar flexibility in making judgements to that of the long case, which may affect the transparency and fairness of the CASC. This may indicate that this new and promising examination method requires further systematic evaluation and modification before its users fully accept it.

According to the results of our study, the content of the CASC exam satisfies its purpose of assessing candidates’ competencies to progress to higher professional training. However, many of the respondents felt that it lacked the completeness of the previous traditional clinical examination, which drew these skills together. Although there were some differences in how candidates and examiners perceived the CASC exam, most respondents agreed that the CASC places more emphasis on communication and interviewing skills than on an overall assessment of the candidate’s competency.

Harden et al,12 in their paper on OSCEs, criticised the compartmentalisation of knowledge and the discouragement of broader thinking during clinical examinations. They also suggested using a long case and/or workplace-based assessments rather than relying on OSCEs alone when assessing trainees. Benning & Broadhurst13 expressed similar concerns about the loss of the long case from the MRCPsych examination. Our findings support the argument that the CASC assesses competencies in a piecemeal fashion rather than reflecting the demands on senior doctors in real practice, which often involve deciding what is and is not important depending on the context.

The OSLER14 (Objective Structured Long Examination Record) method might overcome these shortcomings and improve the objectivity and transparency of the long case. In this method, two examiners assess the candidate and individually grade their skills on a ten-item objective record. They then decide together the appropriate grade for each item and agree an overall grade. The ten items include four on history, three on examination and another three covering investigations, management and clinical acumen. The OSLER method is also practical, as no extra assessment time is required and it can be used for both norm-referenced and criterion-referenced exams. The case difficulty can be determined by the examiners, and all candidates are assessed on identical items. Thus this method assesses the candidate’s overall clinical competency and eliminates the subjectivity associated with the long case.
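To make the structure of such a record concrete, the short sketch below models a simplified OSLER-style record in code: ten items graded independently by two examiners, reconciled into agreed grades and a summary grade. This is an illustration only; the item labels, the three-point grade scale and the reconciliation and summary rules are our assumptions for the sketch and are not taken from the published OSLER description.14

# Illustrative sketch only: a simplified OSLER-style record. Item labels,
# the three-point grade scale and the reconciliation/summary rules are
# assumptions for illustration, not part of the published OSLER scheme.
from dataclasses import dataclass, field
from collections import Counter

GRADES = ("P-", "P", "P+")  # assumed three-point grade scale

ITEMS = (
    # four history items (labels assumed)
    "History: pace and clarity", "History: communication",
    "History: systematic approach", "History: establishing the facts",
    # three examination items (labels assumed)
    "Examination: systematic approach", "Examination: technique", "Examination: findings",
    # remaining three items
    "Investigations", "Management", "Clinical acumen",
)

@dataclass
class OslerRecord:
    examiner_a: dict                               # item -> grade given independently by examiner A
    examiner_b: dict                               # item -> grade given independently by examiner B
    agreed: dict = field(default_factory=dict)     # item -> grade agreed after discussion

    def agree_items(self):
        # Assumed reconciliation rule: keep matching grades, flag the rest for discussion.
        for item in ITEMS:
            a, b = self.examiner_a[item], self.examiner_b[item]
            self.agreed[item] = a if a == b else "discuss"

    def overall(self) -> str:
        # Assumed summary rule: the most frequent agreed grade stands in for the overall grade.
        return Counter(self.agreed.values()).most_common(1)[0][0]

# Example usage: both examiners give "P" on every item.
grades = {item: "P" for item in ITEMS}
record = OslerRecord(examiner_a=dict(grades), examiner_b=dict(grades))
record.agree_items()
print(record.overall())   # -> P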

Another alternative might be to use a combination of assessment methods, as suggested by Harden.12 An OSCE of 8-10 stations could be combined with a long case assessed using the OSLER method. The OSCE stations might include patient management scenarios along with interview and communication skills scenarios. The final score determining the result could also include marks from workplace-based assessments, as these provide a clear indication of the candidate’s skills and competence in real-life situations.

It is also evident from our findings that both candidates and examiners are largely dissatisfied with the extent and usefulness of the feedback provided to unsuccessful candidates. The feedback system has been criticised for its inability to clarify the specific areas or skills that unsuccessful candidates need to improve. The recent MRCPsych Cumulative Results Report15 states that the pass rate of candidates declines after the first attempt. Perhaps this could be improved if failed candidates received more detailed feedback about their performance.

There are a number of limitations to this study. The response rate was low, but it was broadly in the range of other online surveys16 and there was representation from most of the deaneries in the United Kingdom. There could be a number of reasons for the low response rate. As far as we are aware, a few deaneries were unwilling to distribute the questionnaire through their School of Psychiatry, and we had to contact individual trusts in those areas to distribute the survey. The poor response rate from the examiners could reflect low interest in participating and lack of time. Also, older examiners and those with more experience of the CASC may have held particular views which might have influenced the responses. However, when this was examined further, there were no major differences between respondents who had experience of the previous Part 2 examinations and those who had not. In addition, one of the survey questions consisted of two parts (views on validity and on reliability), which could have been difficult to answer accurately.

The findings of this preliminary study raise some doubts about the acceptability of the CASC to both candidates and examiners. There is a possibility of subjective bias in the responders’ views, perhaps influenced by other ongoing and controversial changes in the NHS, including the role of the GMC and the College in postgraduate medical education. On the other hand, it might be a signal that it is worthwhile to reconsider the implications of the CASC for education and training and to evaluate this assessment method further and systematically.

Competing Interests
None declared
Author Details
RAJKUMAR KAMATCHI, MRCPsych, ST6 General Adult Psychiatry Trainee & Honorary Associate Clinical Teacher, Coventry & Warwickshire Partnership NHS Trust and Warwick Medical School, Coventry, UK. SAUGATA BANDYOPADHYAY, MRCPsych, ST5 General Adult Psychiatry Trainee, Birmingham and Solihull Mental Health Foundation NHS Trust, Birmingham, UK. ASHOK KUMAR JAINER, MD, MRCPsych, Consultant General Adult Psychiatrist, Coventry & Warwickshire Partnership NHS Trust, Coventry, UK. BETTAHAHALASOOR SOMASHEKAR, MD, DNB, Consultant General Adult Psychiatrist, Coventry & Warwickshire Partnership NHS Trust, Coventry, UK. MAREK MARZANSKI, MD, MRCPsych, Consultant General Adult Psychiatrist & Associate Fellow, Coventry & Warwickshire Partnership NHS Trust and Warwick Medical School, Coventry, UK. STEVEN MARWAHA, PhD, MRCPsych, Associate Clinical Professor of Psychiatry & Honorary Consultant Psychiatrist, Division of Mental Health and Wellbeing, Warwick Medical School, University of Warwick, and Coventry & Warwickshire Partnership NHS Trust, Coventry, UK.
CORRESPONDENCE: RAJKUMAR KAMATCHI, Coventry & Warwickshire Partnership NHS Trust, The Caludon Centre, Clifford Bridge Road, Coventry. CV2 2TE.
Email: rajkumaranjali@yahoo.com

References

  1. Department of Health. Modernising Medical Careers: The Next Steps. http://www.dh.gov.uk/en/publicationsandstatistics/publications/publicationsandpolicyandguidance/DH_4079530 (2004)
  2. Tyrer S, Oyebode F. Why does the MRCPsych examination need to change? British Journal of Psychiatry 2004; 184: 197-199
  3. Wallace J, Rao R, Haslam R. Simulated patients and objective structured clinical examinations: review of their use in medical education. Advances in Psychiatric Treatment 2002; 8: 342-348
  4. Oyebode F. Commentary on: Simulated patients and objective structured clinical examinations: review of their use in medical education. Advances in Psychiatric Treatment 2002; 8: 348-350
  5. Wass V, Jones R, Van der Vleuten C. Standardised or real patients to test clinical competence? The long case revisited. Medical Education 2001; 35: 321-325
  6. Whelan P, Lawrence-Smith G, Church L, Woolcock MM, Rao R. Goodbye OSCE, hello CASC: a mock CASC course. Psychiatric Bulletin 2009; 33: 149-153
  7. Thompson CM. Will the CASC stand the test? A review and critical evaluation of the new MRCPsych clinical examination. Psychiatric Bulletin 2009; 33: 145-148
  8. Hodges B, Regehr G, Hanson M, et al. Validation of an objective structured clinical examination in psychiatry. Academic Medicine 1998; 73: 910-912
  9. Hodges B, Regehr G, McNaughton N, et al. OSCE checklists do not capture increasing levels of expertise. Academic Medicine 1999; 74: 1129-1134
  10. Marwaha S. Objective Structured Clinical Examinations (OSCEs), psychiatry and the Clinical Assessment of Skills and Competencies (CASC): same evidence, different judgement. BMC Psychiatry 2011; 11: 85. http://www.biomedcentral.com/1471-244X/11/85
  11. Royal College of Psychiatrists. MRCPsych CASC Blueprint. http://www.rcpsych.ac.uk/pdf/MRCPsych%20CASC%20Blueprint.pdf (2008)
  12. Harden RMcG, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. British Medical Journal 1975; 1: 447-451
  13. Benning T, Broadhurst M. The long case is dead - long live the long case; loss of the MRCPsych long case and holism in psychiatry. Psychiatric Bulletin 2007; 31: 441-442
  14. Gleeson F. Assessment of clinical competence using the objective structured long examination record (OSLER). Medical Teacher 1997; 19: 7-14
  15. Royal College of Psychiatrists. MRCPsych Cumulative Results Report. http://www.rcpsych.ac.uk/pdf/MRCPsych%20Cumulative%20Reults%20Report%20-%20august%202011.pdf
  16. Cook C, Heath F, Thompson RL. A meta-analysis of response rates in web- or internet-based surveys. Educational and Psychological Measurement 2000; 60: 821-826


Creative Commons Licence
The above article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

