Academic Development Centre

Oral presentations

Using oral presentations to assess learning

Introduction

Oral presentations are a form of assessment that calls on students to use the spoken word to express their knowledge and understanding of a topic. This format captures not only the research that students have done but also a range of cognitive and transferable skills.

Different types of oral presentations

A common format is the in-class presentation on a prepared topic, often supported by visual aids such as PowerPoint slides or a Prezi, and typically lasting between 10 and 20 minutes. In-class presentations can be delivered individually or in a small group and are generally followed by a brief question and answer session.

Oral presentations are often combined with other modes of assessment; for example oral presentation of a project report, oral presentation of a poster, commentary on a practical exercise, etc.

Also common is the use of PechaKucha, a fast-paced presentation format consisting of a fixed number of slides that advance automatically every twenty seconds (Hirst, 2016). The original version used 20 slides, resulting in a presentation of 6 minutes and 40 seconds; however, you can reduce the number to 10 or 15 to suit group size or topic complexity and coverage. One advantage of this format is that a large number of presentations can fit into a short period of time, with the same rules for everyone. It also enables students to express their creativity through the appropriate use of images on their slides to support their narrative.

When deciding which format of oral presentation best allows your students to demonstrate the learning outcomes, it is also useful to consider which format closely relates to real world practice in your subject area.

What can oral presentations assess?

The key questions to consider include:

  • what will be assessed?
  • who will be assessing?

This form of assessment places the emphasis on students’ capacity to arrange and present information in a clear, coherent and effective way rather than on their capacity to find relevant information and sources. However, as noted above, it could be used to assess both.

Oral presentations, depending on the task set, can be particularly useful in assessing:

  • knowledge, skills and critical analysis
  • applied problem-solving abilities
  • ability to research and prepare persuasive arguments
  • ability to generate and synthesise ideas
  • ability to communicate effectively
  • ability to present information clearly and concisely
  • ability to present information to an audience with appropriate use of visual and technical aids
  • time management
  • interpersonal and group skills.

When using this method you are likely to aim to assess a combination of the above to the extent specified by the learning outcomes. It is also important that all aspects being assessed are reflected in the marking criteria.

In the case of group presentations you might also assess:

  • level of contribution to the group
  • ability to contribute without dominating
  • ability to maintain a clear role within the group.

See also the ‘Assessing group work’ section for further guidance.

As with all of the methods described in this resource, it is important to ensure that students are clear about what they are expected to do and understand the criteria that will be used to assess them. (See Ginkel et al., 2017 for a useful case study.)

Although the use of oral presentations is increasingly common in higher education, some students might not be familiar with this form of assessment. It is therefore important to provide opportunities to discuss expectations and to practise in a safe environment, for example by building short presentation activities with discussion and feedback into class time.

Individual or group

It is not uncommon to assess group presentations. If you are opting for this format, consider:

  • will you assess outcome or process, or both?
  • how will you distribute tasks and allocate marks?
  • will group members contribute to the assessment by reporting group process?

Assessed oral presentations are often performed before a peer audience - either in-person or online. It is important to consider what role the peers will play and to ensure they are fully aware of expectations, ground rules and etiquette whether presentations take place online or on campus:

  • will the presentation be peer assessed? If so how will you ensure everyone has a deep understanding of the criteria?
  • will peers be required to interact during the presentation?
  • will peers be required to ask questions after the presentation?
  • what preparation will peers need to be able to perform their role?
  • how will the presence and behaviour of peers impact on the assessment?
  • how will you ensure equality of opportunities for students who are asked fewer/more/easier/harder questions by peers?

Hounsell and McCune (2001) note the importance of the physical setting and layout as one of the conditions which can impact on students’ performance; it is therefore advisable to offer students the opportunity to familiarise themselves with the space in which the presentations will take place and to agree the layout of the space in advance.

Good practice

As a summary of the ideas above, Pickford and Brown (2006, p.65) list good practice, drawn from a number of case studies in their text, which includes:

  • make explicit the purpose and assessment criteria
  • use the audience to contribute to the assessment process
  • record [audio / video] presentations for self-assessment and reflection (you may have to do this for QA purposes anyway)
  • keep presentations short
  • consider bringing in externals from commerce / industry (to add authenticity)
  • consider banning notes / audio visual aids (this may help if AI-generated/enhanced scripts run counter to intended learning outcomes)
  • encourage students to engage in formative practice with peers (including formative practice of giving feedback)
  • use a single presentation to assess synoptically; linking several parts / modules of the course
  • give immediate oral feedback
  • link back to the learning outcomes that the presentation is assessing; process or product.

Neumann in Havemann and Sherman (eds., 2017) provides a useful case study in chapter 19: Student Presentations at a Distance, and Grange & Enriquez in chapter 22: Moving from an Assessed Presentation during Class Time to a Video-based Assessment in a Spanish Culture Module.

Diversity & inclusion

Some students might feel more comfortable or be better able to express themselves orally than in writing, and vice versa. Others might have particular difficulties expressing themselves verbally, due, for example, to hearing or speech impediments, anxiety, personality, or language abilities. As with any other form of assessment, it is important to be aware of elements that potentially put some students at a disadvantage and to consider solutions that benefit all students.

Academic integrity

Oral presentations present a relatively low risk of academic misconduct if they are delivered synchronously and in class. Avoiding the use of a script can ensure that students are not simply reading out someone else’s text or an AI-generated script, whilst the questions posed at the end allow assessors to gauge the depth of understanding of the topic and structure presented.

Recorded (asynchronous) presentations may be produced with help, so additional mechanisms to ensure that the work presented is the student’s own may be beneficial - such as a reflective account or a live Q&A session. AI can create scripts, slides and presentations, copy real voices relatively convincingly, and create video avatars. These tools can enable students to create professional video content and may make this sort of assessment more accessible. The desirability of such tools will depend upon what you are aiming to assess and how you will evaluate student performance.

Student and staff experience

Oral presentations provide a useful opportunity for students to practise skills which are required in the world of work. Through the process of preparing for an oral presentation, students can develop their ability to synthesise information and present it to an audience. To improve authenticity, the assessment might involve an actual audience, realistic timeframes for preparation and collaboration between students, and might be situated in realistic contexts, which could include the use of AI tools.

As mentioned above, it is important to remember that the stress of presenting information to a public audience might put some students at a disadvantage. Similarly, non-native speakers might perceive language as an additional barrier. AI may reduce some of these challenges, but it will be important to ensure equal access to these tools to avoid disadvantaging students. Discussing criteria and expectations with your students, providing a clear structure, and ensuring opportunities to practise and receive feedback will benefit all students.

Some disadvantages of oral presentations include:

  • anxiety - students might feel anxious about this type of assessment and this might impact on their performance
  • time - oral assessment can be time consuming both in terms of student preparation and performance
  • time - to develop skill in designing slides if they are required; we cannot assume knowledge of PowerPoint etc.
  • lack of anonymity and potential bias on the part of markers.

From a student perspective preparing for an oral presentation can be time consuming, especially if the presentation is supported by slides or a poster which also require careful design.

From a teacher’s point of view, presentations are generally assessed on the spot and feedback is immediate, which reduces marking time. It is therefore essential to have clearly defined marking criteria which help assessors to focus on the intended learning outcomes rather than simply on presentation style.

Useful resources

Joughin, G. (2010). A short guide to oral assessment. Leeds Metropolitan University/University of Wollongong. http://eprints.leedsbeckett.ac.uk/2804/

Race, P. and Brown, S. (2007). The Lecturer’s Toolkit: a practical guide to teaching, learning and assessment. 2nd edition. London: Routledge.



Oral assessment

Oral assessment is a common practice across education and comes in many forms. Here is basic guidance on how to approach it.


1 August 2019

In oral assessment, students speak to provide evidence of their learning. Internationally, oral examinations are commonplace. 

We use a wide variety of oral assessment techniques at UCL.

Students can be asked to: 

  • present posters
  • use presentation software such as PowerPoint or Prezi
  • perform in a debate
  • present a case
  • answer questions from teachers or their peers.

Students’ knowledge and skills are explored through dialogue with examiners.

Teachers at UCL recommend oral examinations, because they provide students with the scope to demonstrate their detailed understanding of course knowledge.

Educational benefits for your students

Good assessment practice gives students the opportunity to demonstrate learning in different ways. 

Some students find it difficult to write, so they may do better in oral assessments. Others may find it challenging to present their ideas to a group of people.

Oral assessment takes account of diversity and enables students to develop verbal communication skills that will be valuable in their future careers.  

Marking criteria and guides can be carefully developed so that assessment processes can be quick, simple and transparent. 

How to organise oral assessment

Oral assessment can take many forms.

Audio and/or video recordings can be uploaded to Moodle if live assessment is not practicable.

Tasks can range from individual or group talks and presentations to dialogic oral examinations.

Oral assessment works well as a basis for feedback to students and/or to generate marks towards final results.

1. Consider the learning you're aiming to assess 

How can you best offer students the opportunity to demonstrate that learning?

The planning process needs to start early because students must know about and practise the assessment tasks you design.

2. Inform the students of the criteria

Discuss the assessment criteria with students, ensuring that you include (but don’t overemphasise) presentation or speaking skills.

Identify activities which encourage the application or analysis of knowledge. You could choose from the options below or devise a task with a practical element adapted to learning in your discipline.

3. Decide what kind of oral assessment to use

Options for oral assessment can include:

Assessment task

  • Presentation
  • Question and answer session.

Individual or group

If group, how will you distribute the tasks and the marks?

Combination with other modes of assessment

  • Oral presentation of a project report or dissertation.
  • Oral presentation of posters, diagrams, or museum objects.
  • Commentary on a practical exercise.
  • Questions to follow up written tests, examinations, or essays.

Decide on the weighting of the different assessment tasks and clarify how the assessment criteria will be applied to each.

Peer or staff assessment or a combination: groups of students can assess other groups or individuals.

4. Brief your students

When you’ve decided which options to use, provide students with detailed information.

Integrate opportunities to develop the skills needed for oral assessment progressively as students learn.

If you can involve students in formulating assessment criteria, they will be motivated and engaged and they will gain insight into what is required, especially if examples are used.

5. Planning, planning, planning!

Plan the oral assessment event meticulously.

Stick rigidly to planned timing. Ensure that students practise presentations with time limitations in mind. Allow time between presentations or interviews and keep presentations brief.  

6. Decide how you will evaluate

Decide how you will evaluate presentations or students’ responses.

It is useful to create an assessment sheet with a grid or table using the relevant assessment criteria.

Focus on core learning outcomes, avoiding detail.

Two assessors should be present: one to evaluate against a range of specific core criteria, the other to focus on forming a holistic judgment.

Leave time to make a final decision on marks, perhaps after every four presentations. Refer to audio recordings later for borderline cases.

7. Use peers to assess presentations

Students will learn from presentations, especially if you can use ‘audio/video recall’ for feedback.

Let speakers talk through aspects of the presentation, pointing out areas they might develop. Then discuss your evaluation with them. This can also be done in peer groups.

If you have large groups of students, they can support each other, each providing feedback to several peers. They can use the same assessment sheets as teachers. Marks can also be awarded for feedback.

8. Use peer review

A great advantage of oral assessment is that learning can be shared and peer reviewed, in line with academic practice.

There are many variants on the theme so why not let your students benefit from this underused form of assessment?

This guide has been produced by the UCL Arena Centre for Research-based Education . You are welcome to use this guide if you are from another educational facility, but you must credit the UCL Arena Centre. 

Further information


[email protected] : contact the UCL Arena Centre

UCL Education Strategy 2016–21

Assessment and feedback: resources and useful links 

Six tips on how to develop good feedback practices  toolkit


Case studies : browse related stories from UCL staff and students.



Guidelines for Oral Assessments and Exams

Oral assessments gauge students’ knowledge and skills based on the spoken word, typically guided by questions or small tasks. Oral assessments can take on different formats, including:

  • Presentation on a prepared topic (individual or group, live or recorded)
  • Interviews or discussions to assess a student’s knowledge or skills
  • Simulations or demonstrations of skills individually or with others (e.g., client or patient)

Oral assessment is ideal for assessing:

  • Higher-order thinking and synthesis
  • Applied problem solving
  • Application of theory to practice
  • Depth of knowledge (rather than breadth)
  • Students’ ability to think on their feet
  • Interpersonal competence and professionalism (e.g., in mock interactions with clients or patients)

Oral assessment should not be used:

  • Solely for preventing academic dishonesty or to proctor students
  • As a direct replacement for a written assessment or as a high-stakes assessment
  • If it does not suitably assess the learning outcomes of a course

Advantages of Oral Assessments

  • Can assess depth of knowledge and skills , allowing for a more comprehensive view of students’ abilities, cognitive processes, and conceptual misunderstandings.
  • Opportunity for interaction , leading to a greater sense of connection for instructors and students, particularly in the remote environment.
  • More authentic form of assessment if students are solving problems, demonstrating skills, and communicating using disciplinary language and scenarios.
  • May increase learning , as students often spend more time preparing for oral exams.
  • Opportunity for clarification of ambiguous questions in the moment.
  • Can prevent some academic integrity issues because follow-up questions can be asked to clarify students’ thinking and understanding.

Disadvantages of Oral Assessments

  • More time to administer than written exams and not typically suitable for larger classes.
  • Often more stressful for students, which can interfere with their performance. Students may be unfamiliar with the format, leading to fear and anxiety. Oral exams may be particularly stressful for students with mental health concerns.
  • Potential for issues with reliability and fairness if students are asked different questions.
  • Potential for bias and subjective grading , as grading cannot be anonymous. Students’ articulateness, shyness, speed of thought, gender, ethnicity, language skills, accent, etc. can influence judgments about their knowledge and skills.
  • Potential for academic integrity issues as students can pass on questions to others who are taking the exam later.

Considerations when Designing and Implementing Oral Assessments

Step 1. Decide on Appropriate Assessment Strategies for Your Learning Outcomes

  • Decide which learning outcomes should be assessed through this method.
  • Decide how you will use oral assessment to complement other assessments of those learning outcomes (e.g., take-home assignments, group or individual reports). Oral assessments are best suited to probing depth of knowledge or skills.
  • Decide what alternative assessment options will be available for students who may be disadvantaged by, or less comfortable with, oral assessment (e.g., students with hearing or speech impairments, anxiety, non-native speakers).

Step 2. Create Questions and Structure of the Assessment

  • Design appropriate questions for each learning outcome. Focus on depth rather than breadth. Include potential follow-up questions and prompts based on different types of answers (e.g., asking students to clarify an unclear point or provide more detail).
  • Standardize the number of questions, difficulty of questions, and the time allotted.
  • Decide on the order of questions and any tasks students must perform (e.g., whiteboard drawing, screen sharing). Start with an easier question to ease students into the exam.
  • Determine how and when you will vary the questions across students (e.g., use of different scenarios).

Step 3. Create a Grading Scheme

  • Create a rubric or scoring guide with explicit criteria/standards, weighting, and model answers for each question. Often with oral assessments, answers are not necessarily right or wrong, but demonstrate different levels of mastery. The scoring guide should be straightforward enough that you can fill it in during the oral assessment.
  • Decide if prompting means that points will be deducted.
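To make the role of weighting concrete, here is a minimal sketch of how a scoring guide might combine criterion marks into an overall percentage. The criteria, weights and scores below are hypothetical illustrations, not a recommended scheme:

```python
# Hypothetical scoring guide: each criterion carries a weight (weights sum
# to 1.0) and is scored out of 10 during the oral assessment.
rubric_weights = {
    "content and accuracy": 0.40,
    "structure and coherence": 0.25,
    "use of visual aids": 0.15,
    "response to questions": 0.20,
}

scores_out_of_10 = {  # one assessor's marks for one student
    "content and accuracy": 8,
    "structure and coherence": 7,
    "use of visual aids": 9,
    "response to questions": 6,
}

# Weighted total, expressed as a percentage.
total = sum(rubric_weights[c] * scores_out_of_10[c] * 10 for c in rubric_weights)
print(f"Overall mark: {total:.0f}%")  # Overall mark: 75%
```

Fixing the weights in advance, and sharing them with students, is what keeps in-the-moment marking quick and transparent.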

Step 4. Prepare Students and Create Practice Opportunities

  • Provide clear information to students about the content to be covered, the process and structure of the oral assessment, the material they can have available, and the grading criteria. Give students opportunities to ask questions about the assessment.
  • Provide opportunities for practice. Students are often not experienced in expressing themselves orally within the discipline. Build in informal opportunities for speaking in class and short presentation activities with time for discussion and feedback.
  • Share a recorded video demonstrating a typical oral assessment. Model relevant questions and answers, and how they would be graded.

Step 5. Conduct the Assessment

  • Decide whether to use multiple examiners, which can be helpful for managing time, taking notes, solving technical issues, and grading reliability.
  • Some students will need more encouragement as they may be shy or nervous. Shyness should not affect your perception of what the student knows.
  • Consider recording the assessment in case of grade appeals and to share with students if they want to debrief or request feedback on their performance.

Additional Resources

How-to-Guide: Remote Oral Exams , Saunders-Smits, G. (2020). This comprehensive guide includes steps for designing a remote oral exam, and sample rubrics, grading sheets, and exams.

Guidelines for (Online) Oral Exams , University of Twente CELT (2020). This quick guide offers recommendations for increasing validity, reliability, and transparency of online oral exams.

How to Design and Execute an Online Oral Exam , University of Twente CELT (2020). This quick guide provides advice for test construction, organization, and administration.

A Short Guide to Oral Assessment , Joughin, G. (2010). This is a comprehensive guide offering recommendations for planning, executing, and assessing oral assessments.

Revitalizing Classes through Oral Exams , Dumbaugh, D. (2020). This Inside Higher Ed opinion piece details a mathematics instructor’s use of oral exams when transitioning to remote instruction.

Research on the Use of Oral Exams in Various Disciplines

  • Geography
  • Marketing/Business
  • Mathematics
  • Biology, Chemistry, Physics, Engineering
  • Nursing, Health Sciences, Nutrition
  • Theology
  • Computer Science
  • Psychology




Oral presentations


Oral assessments offer teachers the opportunity to assess the structure and content of a presentation as well as students’ capacity to answer any subsequent probing questions. They can be formatted as individual presentations or small-group presentations; they can be done face-to-face or online, and they can be given behind closed doors or in front of peers. The most common format involves one or two students presenting during class time with a follow-up question and answer session. Because of logistics and the demands of the curriculum, oral presentations tend to be quite short – perhaps 10 minutes for an undergraduate and 15-20 minutes for a postgraduate. Oral presentations are often used in a formative capacity but they can also be used as summative assessments. The focus of this form of assessment is not on students’ capacity to find relevant information, sources and literature but on their capacity to package such materials into a logically coherent exposition.

Advantages of oral presentations

  • Allows for probing questions that test underlying assumptions.
  • Quick to mark – immediate feedback is possible.
  • Allows students to demonstrate a logical flow/development of an idea.
  • Presentation skills are valued by employers.
  • Students are familiar with this assessment method.

Challenges of oral presentations

  • Can be stressful for some students.
  • Non-native speakers may be at a disadvantage.
  • Can be time-consuming.
  • Limited scope for inter-rater checks.
  • A danger that ‘good speakers’ get good marks.

How students might experience oral presentations

Students are often familiar with giving oral presentations and many will have done so in other courses. However, they may focus too much on certain aspects to the detriment of others. For example, some students may be overly concerned with the idea of standing up in front of their peers and may forget that their focus should be on offering a clear narrative. Other students may focus on the style of their presentation and overlook the importance of substance. Still others may concentrate on what they have to say without considering that an oral presentation is primarily for the benefit of the audience.

The use of PowerPoint in particular should be addressed by teachers beforehand, so that students are aware that it should be a tool for supporting their presentation rather than the presentation itself. Most oral presentations are followed by a question and answer phase – sometimes the questions will come from peers, sometimes from teachers, and sometimes from both. It is good practice to let students know about the format of the questions – especially if their capacity to answer them is part of the marking criteria.

Reliability, validity, fairness and inclusivity of oral presentations

Oral assessments are often marked in situ, which means that the process for allocating marks needs to be reliable, valid and fair when used under great time pressure. With a clearly defined marking structure and a set of pre-established, shared criteria, students should be aware of what they need to do to access the highest possible marks. Precise marking criteria also help teachers to focus on the intended learning outcomes rather than presentational style. During oral presentations, content validity is addressed through marking criteria that focus on the quality of the points raised in the presentation itself, while construct validity is addressed during the question and answer phase, when the presenter is assessed on their capacity to comment on underpinning literature, theories and/or principles.

One of the issues with having peer questions at the end of an oral presentation is that the teacher has very little control over what will be asked. This does not mean that such questions are not legitimate – only that teachers need to consider carefully how they mark the answers to them. In order to ensure equality of opportunity, teachers should ask their own questions after any peer questions, using them to fill any gaps and offer the presenter a chance to address any areas of the marking criteria that have not yet been covered. Oral presentations may challenge students with less proficiency in spoken English, and criteria should be scrutinised to support their achievement.

How to maintain and ensure rigour in oral presentations

Assessment rigour for oral presentations includes the teacher’s capacity to assess a range of presentation topics, formats and styles with an equal level of scrutiny. Teachers should therefore develop marking criteria that focus on a student’s ability to take complex issues and present them in a clear and relatable manner, rather than on the content covered. Throughout this process teachers should engage in constant reflexive scrutiny – examining whether they are applying the marking criteria fairly across all students. As oral presentations are ephemeral, consider how the moderator and/or external examiner will evaluate the assessment process. Can a moderator ‘double mark’ a percentage of presentations? Is there a need (or would it be helpful) to record the presentations?

How to limit possible misconduct in oral presentations

The opportunities for academic misconduct are quite low in an oral presentation – especially during the question and answer phase. If written resources are expected to be produced as part of the assessment (handouts, bibliographies, PowerPoint slides, etc.), then guidance on citing and referencing should be given, and marking criteria may offer marks for appropriate use of such literature. In guiding students to avoid using written scripts (except where it is deemed necessary from an inclusivity perspective), teachers will steer them away from the possibility of reading out someone else’s thoughts as their own. Instead, students should be encouraged to use techniques such as limited cue cards to structure their presentation. The questions posed by the teacher at the end of the presentation are also a possible check on misconduct and will allow the teacher to see whether the student actually knows about the content they are presenting or has merely memorised someone else’s words.

LSE examples

MA498 Dissertation in Mathematics

PB202 Developmental Psychology

ST312 Applied Statistics Project

Further resources

https://twp.duke.edu/sites/twp.duke.edu/files/file-attachments/oral-presentation-handout.original.pdf

Langan, A.M., Shuker, D.M., Cullen, W.R., Penney, D., Preziosi, R.F. and Wheater, C.P. (2008) Relationships between student characteristics and self‐, peer and tutor evaluations of oral presentations.  Assessment & Evaluation in Higher Education , 33(2): 179-190.

Dunbar, N.E., Brooks, C.F. and Kubicka-Miller, T. (2006) Oral communication skills in higher education: Using a performance-based evaluation rubric to assess communication skills.  Innovative Higher Education , 31(2): 115.

https://www.youtube.com/watch?v=HRaPmO6TlaM

https://www.lse.ac.uk/resources/calendar/courseGuides/PB/2020_PB202.htm

Implementing this method at LSE

If you’re considering using oral presentations as an assessment, this resource offers more specific information, pedagogic and practical, about implementing the method at LSE. The resource is password-protected and available to LSE staff only.


Support for LSE Departments

Contact your Eden Centre departmental adviser


If you have any suggestions for future Toolkit development, get in touch using our email below!

Email: [email protected].


BMC Med Educ

Development and validation of the oral presentation evaluation scale (OPES) for nursing students

Yi-Chien Chiang

1 Department of Nursing, Chang Gung University of Science and Technology, Division of Pediatric Hematology and Oncology, Linkou Chang Gung Memorial Hospital, Taoyuan City, Taiwan, Republic of China

Hsiang-Chun Lee

2 Department of Nursing, Chang Gung University of Science and Technology, Taoyuan City, Taiwan, Republic of China

Tsung-Lan Chu

3 Administration Center of Quality Management Department, Chang Gung Medical Foundation, Taoyuan City, Taiwan, Republic of China

Chia-Ling Wu

Ya-Chu Hsiao

4 Department of Nursing, Chang Gung University of Science and Technology; Administration Center of Quality Management Department, Linkou Chang Gung Memorial Hospital, No.261, Wenhua 1st Rd., Guishan Dist, Taoyuan City, 333 03 Taiwan, Republic of China

Associated Data

The datasets and materials of this study are available from the corresponding author on request.

Background

Oral presentations are an important educational component for nursing students, and nursing educators need to provide students with an assessment of presentations as feedback for improving this skill. However, no reliable, validated tools are available for objective evaluation of presentations. We aimed to develop and validate an oral presentation evaluation scale (OPES) that nursing students could use to self-rate their own performance while learning effective oral presentation skills, and that educators could potentially use in the future to assess student presentations.

Methods

The self-report OPES was developed using 28 items generated from a review of the literature on oral presentations and from qualitative face-to-face interviews with university oral presentation tutors and nursing students. Evidence for the internal structure of the 28-item scale was collected with exploratory and confirmatory factor analysis (EFA and CFA, respectively) and internal consistency estimates. Relationships with the Personal Report of Communication Apprehension and the Self-Perceived Communication Competence scales were examined to provide evidence of relationships with other variables.

Results

Nursing students’ (n = 325) responses to the scale provided the data for the EFA, which resulted in three factors: accuracy of content, effective communication, and clarity of speech. Together these factors explained 64.75% of the total variance. Eight items were dropped from the original item pool. The Cronbach’s α value was .94 for the total scale and ranged from .84 to .93 for the three factors. Internal structure evidence was then examined with CFA using data from a second group of 325 students, and an additional five items were deleted. Fit indices of the model were acceptable, except for the adjusted goodness of fit index, which fell below the minimum criterion. The final 15-item OPES was significantly correlated with the students’ scores on the Personal Report of Communication Apprehension scale (r = −.51, p < .001) and the Self-Perceived Communication Competence Scale (r = .45, p < .001), providing strong evidence of relationships with other self-report assessments of communication.

Conclusions

The OPES could be adopted as a self-assessment instrument for nursing students when learning oral presentation skills. Further studies are needed to determine if the OPES is a valid instrument for nursing educators’ objective evaluations of student presentations across nursing programs.

Supplementary Information

The online version contains supplementary material available at 10.1186/s12909-022-03376-w.

Competence in oral presentations is important for medical professionals to communicate an idea to others, including those in the nursing professions. Delivering concise oral presentations is a useful and necessary skill for nurses [ 1 , 2 ]. Strong oral presentation skills not only impact the quality of nurse-client communications and the effectiveness of teamwork among groups of healthcare professionals, but also promotion, leadership, and professional development [ 2 ]. Nurses are also responsible for delivering health-related knowledge to patients and the community. Therefore, one important part of the curriculum for nursing students is the delivery of oral presentations related to healthcare issues. A self-assessment instrument for oral presentations could provide students with insight into what skills need improvement.

Three components have been identified as important for improving communication. First, a presenter’s self-esteem can influence the physio-psychological reaction towards the presentation; presenters with low self-esteem experience greater levels of anxiety during presentations [ 3 ]. Therefore, increasing a student’s self-efficacy can increase confidence in their ability to communicate effectively, which can reduce anxiety [ 3 , 4 ]. Second, Liao (2014) reported that improving speaking efficacy can improve oral communication, and that collaborative learning among students can improve speech efficacy and decrease speech anxiety [ 5 ]. A study by De Grez et al. provided students with a list of skills to practice, which allowed them to feel more comfortable when a formal presentation was required, increased presentation skills, and improved communication by improving self-regulation [ 6 ]. Third, Carlson and Smith-Howell (1995) determined that the quality and accuracy of the information presented is also an important aspect of public speaking performances [ 7 ]. All three of these components are therefore important skills for effective communication during an oral presentation.

Instruments that provide an assessment of a public speaking performance are critical for helping students improve oral presentation skills [ 7 ]. One study found peer evaluations were higher than those of university tutors for student presentations, using a student-developed assessment form [ 8 ]. The assessment criteria included content (40%), presentation (40%), and structure (20%); the maximum percentage in each domain was given for “excellence”, which was relative to a minimum “threshold”. Multiple “excellence” and “threshold” benchmarks were described for each domain. For example, benchmarks included the use of clear and appropriate language, enthusiasm, and keeping the audience interested. However, the percentage score did not provide any information about which specific benchmarks were met. Thus, these quantitative scores did not include feedback on specific criteria that could enhance future presentations.

At the other extreme is an assessment that is limited to one aspect of the presentation and is too detailed to evaluate the performance efficiently. An example of this is the 40-item tool developed by Tsang (2018) [ 6 ] to evaluate oral presentation skills, which measured several domains: voice (volume and speed), facial expressions, passion, and control of time. An assessment tool developed by De Grez et al. (2009) includes several domains: three subcategories for content (quality of introduction, structure, and conclusion), five subcategories of expression (eye-contact, vocal delivery, enthusiasm, interaction with audience, and body-language), and a general quality [ 9 ]. Many items overlap, making it hard to distinguish specific qualities. Other evaluation tools include criteria that are difficult to objectively measure, such as body language, eye-contact, and interactions with the audience [ 10 ]. Finally, most of the previous tools were developed without testing the reliability and validity of the instrument.

Nurses have the responsibility of providing not only medical care, but also medical information to other healthcare professionals, patients, and members of the community. Therefore, improving nursing students’ speaking skills is an important part of the curriculum. A self-report instrument for measuring nursing students’ subjective assessment of their presentation skills could help increase competence in oral communication. However, to date, there is no reliable and valid instrument for evaluating oral presentation performance in nursing education. Therefore, the aim of this study was to develop a self-assessment instrument for nursing students that could guide them in understanding their strengths and areas for development in oral presentations. A scale shown to be valid and reliable for nursing students could then be examined for use in objective evaluations of oral presentations by peers and nurse educators.

Study design

This study developed and validated an oral presentation evaluation scale (OPES) that could be employed as a self-assessment instrument for students learning skills for effective oral presentations. The instrument was developed in two phases: Phase I (item generation and revision) and Phase II (scale development) [ 11 ]. Phase I aimed to generate items through a qualitative method and to collect content evidence for the OPES. Phase II focused on scale development, establishing internal structure evidence for the OPES through EFA, CFA, and internal consistency estimates, and collecting evidence of the OPES’s relationships with other variables. Because we hope to also use the instrument as an aid for nurse educators in objective evaluations of nursing students’ oral presentations, both students and educators were involved in item generation and revision. Only nursing students participated in Phase II.

Approval was obtained from Chang Gung Medical Foundation institutional review board (ID: 201702148B0) prior to initiation of the study. Informed consent was obtained from all participants prior to data collection. All participants being interviewed for item generation in phase I provided signed informed consent indicating willingness to be audiotaped during the interview. All the study methods were carried out in accordance with relevant guidelines and regulations.

Phase I: item generation and item revision

Participants

A sample of nurse educators (n = 8) and nursing students (n = 11) participated in the interviews for item generation. Nursing students give oral presentations to meet curriculum requirements; the educators were therefore university tutors experienced in coaching nursing students preparing to give an oral presentation. Nurse educators specializing in various areas of nursing, such as acute care, psychology, and community care, were recruited if they had at least 10 years’ experience coaching university students. The mean age of the educators was 52.1 years (SD = 4.26), 75% were female, and the mean amount of teaching experience was 22.6 years (SD = 4.07). Students were included if they had given at least one oral presentation and were willing to share their experiences of oral presentations. The mean age of the students was 20.7 years (SD = 1.90) and 81.8% were female; four were second-year students, three were third-year students, and four were in their fourth year.

An additional eight educators participated in the evaluation of content evidence of the OPES. All had over 10 years’ experience in coaching students in giving an oral presentation that would be evaluated for a grade.

Item generation

Development of item domains involved deductive evaluation of the literature about oral presentations [ 2 , 3 , 6 – 8 , 12 – 14 ]. Three domains were determined to be important components of an oral presentation: accuracy of content, effective communication, and clarity of speech. Inductive qualitative data from face-to-face semi-structured interviews with the nurse educator and nursing student participants were used to identify domain items [ 11 ]. Details of the interview participants are described in the section above. The interviews followed an interview guide (Table 1) and lasted approximately 30–50 min for educators and 20–30 min for students. Deduction from the literature and induction from the interview data were used to determine the categories considered important for the objective evaluation of oral presentations.

Interview guide for semi-structured interviews with nurse educators and nursing students for item generation

Educators:
1. What has been your reaction to oral reports or presentations given by your students?
2. What problems commonly occur when students are giving oral reports or presentations?
3. In your opinion, what do you consider a good presentation, and could you describe the characteristics?
4. How do you evaluate the performance of the student’s oral reports or presentations? Are there any difficulties or problems evaluating the oral reports?

Students:
1. Would you please tell me about your experiences of giving an oral report or presentation?
2. In your opinion, what is a good presentation and what are some of the important characteristics?

Analysis of interview data

Audio recordings of the interviews were transcribed verbatim at the conclusion of each interview. Interview data were analyzed by the first, second, and corresponding authors, all experts in qualitative studies. The first and second authors coded the interview data to identify items educators and students described as being important to the experience of an oral presentation [ 11 ]. The corresponding author grouped the coded items into constructs important for oral presentations. Meetings of the three researchers were held to discuss the findings; if there were differences in interpretation, an outside expert in qualitative studies was included in the discussions until consensus was reached among the three researchers.

Analysis of the interview data indicated that items involved in preparation, presentation, and post-presentation were important to the three domains of accuracy of content, effective communication, and clarity of speech. Items for accuracy of content involved preparation (being well prepared before the presentation; preparing materials suitable for the target audience; practicing the presentation in advance) and post-presentation reflection, such as discussing the content of the presentation with classmates and teachers. Items for effective communication involved the presentation itself: obtaining the attention of the audience; providing materials that are reliable and valuable; expressing confidence and enthusiasm; interacting with the audience; and responding to questions from the audience. Items for the third domain, clarity of speech, involved the presenter’s delivery, such as pronunciation. Further post-presentation items involved a student’s ability to reflect on the content and performance of their presentation and a willingness to obtain feedback from peers and teachers.

Item revision: content evidence

Based on the themes that emerged during the interviews, 28 items were generated. Content evidence for the 28 items of the OPES was established with a panel of eight experts, educators who had not participated in the face-to-face interviews. The experts were provided with a description of the research purpose and a list of the proposed items, and were asked to rate each item on a 4-point Likert scale (1 = not representative, 2 = item needs major revision, 3 = representative but needs minor revision, 4 = representative). The item-level content validity index (I-CVI) for each item was calculated as the number of experts rating the item 3 or 4 divided by the total number of experts; the scale-level content validity index (S-CVI) was calculated as the number of items rated 3 or 4 by all experts divided by the total number of items.
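
The two indices just described can be computed directly from the panel’s ratings. The sketch below is a minimal Python illustration; the ratings matrix is invented for the example, not the study’s actual panel data.

```python
def i_cvi(item_ratings):
    """Item-level content validity index: the number of experts rating
    the item 3 or 4, divided by the total number of experts."""
    return sum(1 for r in item_ratings if r >= 3) / len(item_ratings)

def s_cvi_ua(all_ratings):
    """Scale-level CVI / universal agreement: the proportion of items
    that every expert on the panel rated 3 or 4."""
    return sum(1 for item in all_ratings if all(r >= 3 for r in item)) / len(all_ratings)

# Hypothetical panel of 8 experts rating two items on the 1-4 scale
ratings = [
    [4, 4, 3, 4, 4, 3, 4, 4],  # every expert rates 3 or 4 -> I-CVI = 1.0
    [3, 4, 4, 4, 2, 4, 4, 4],  # one expert rates 2        -> I-CVI = .875
]
```

On this reading, the reported I-CVI range of .88 to 1 means at least seven of the eight experts rated every retained item as representative.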

Based on the suggestions of the experts, six items of the OPES were reworded for clarity; for example, item 12 was revised from “The presentation is riveting” to “The presenter’s performance is brilliant; it resonates with the audience and arouses their interests”. Two pairs of items were combined because they duplicated each other: “demonstrates confidence” and “presents enthusiasm” were combined into item 22, “demonstrates confidence and enthusiasm properly”; and “the presentation allows for proper timing and sequencing” and “the length of time of the presentation is well controlled” were combined into item 9, “The content of presentation follows the rules, allowing for the proper timing and sequence”. Thus, a total of 26 items were included in the OPES at this phase. I-CVI values ranged from .88 to 1.00 and the scale-level CVI/universal agreement was .75, indicating that the OPES was an acceptable instrument for measuring an oral presentation [ 11 ].

Phase II: scale development

Phase II, scale development, aimed to establish the internal structure evidence for the OPES. Evidence of relationships with other variables was also evaluated in this phase. More specifically, internal structure evidence for the OPES was evaluated by exploratory factor analysis (EFA) and confirmatory factor analysis (CFA), and evidence of relationships with other variables was determined by examining the relationships between the OPES and the Personal Report of Communication Apprehension (PRCA) and Self-Perceived Communication Competence (SPCC) scales [ 15 ].

A sample of nursing students was recruited purposively from a university in Taiwan. Students were included if they: (a) were full-time students; (b) had declared nursing as their major; and (c) were in their sophomore, junior, or senior year. First-year university students (freshmen) were excluded. A bulletin about the survey study was posted outside of classrooms attended by 707 students. The bulletin included a description of the inclusion criteria and instructions to appear at the classroom on a given day and time if students were interested in participating in the study. Students who appeared at the classroom on the scheduled day (N = 650) were given a packet containing a demographic questionnaire (age, gender, year in school), a consent form, the OPES instrument, and two scales for measuring aspects of communication, the Personal Report of Communication Apprehension (PRCA) and the Self-Perceived Communication Competence (SPCC); the documents were labeled with an identification number to anonymize the data. The 650 students were divided into two groups based on the demographic data using the SPSS random case selection procedure (Version 23.0; SPSS Inc., Chicago, IL, USA). The selection procedure was performed repeatedly until homogeneity of the baseline characteristics was established between the two groups (p > .05). The mean age of the participants was 20.5 years (SD = 0.98) and 87.1% were female (n = 566). Participants comprised third-year students (40.6%, n = 274), fourth-year students (37.9%, n = 246), and second-year students (21.5%, n = 93). The survey data for half the group (the calibration sample, n = 325) were used for EFA; the survey data from the other half (the validation sample, n = 325) were used for CFA. Scores from the PRCA and SPCC instruments were used for evaluating the evidence of relationships to other variables.
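
The repeated random halving described above can be sketched as follows. This is a minimal illustration with simulated data, not the authors’ SPSS procedure; the use of a t-test for age and a chi-square test for gender as the homogeneity checks is an assumption for the example.

```python
import numpy as np
from scipy import stats

def split_until_homogeneous(ages, genders, alpha=0.05, max_tries=100, seed=0):
    """Randomly split the sample into two equal halves, re-drawing until
    the halves do not differ significantly on age or gender (p > alpha)."""
    rng = np.random.default_rng(seed)
    n = len(ages)
    for _ in range(max_tries):
        idx = rng.permutation(n)
        a, b = idx[: n // 2], idx[n // 2:]
        p_age = stats.ttest_ind(ages[a], ages[b]).pvalue
        table = [[(genders[a] == g).sum() for g in (0, 1)],
                 [(genders[b] == g).sum() for g in (0, 1)]]
        _, p_gender, _, _ = stats.chi2_contingency(table)
        if p_age > alpha and p_gender > alpha:
            return a, b
    raise RuntimeError("no homogeneous split found in max_tries draws")
```

With iid data a homogeneous split is almost always found on the first few draws; the retry loop simply mirrors the “performed repeatedly until homogeneity” step.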

This step aimed to collect internal structure evidence for the scale by identifying the items that nursing students perceived as important during an oral presentation and determining the domains that fit each set of items. The 325 nursing students designated for the EFA (described above) completed the data collection. We used EFA to evaluate the internal structure of the scale. The items were presented in random order and were not nested according to constructs. Internal consistency of the scale was determined by calculating Cronbach’s alpha.

The next step involved determining if the newly developed OPES was a reliable and valid self-report scale for subjective assessments of nursing students’ previous oral presentations. Participants (the second group of 325 students) were asked, “How often do you incorporate each item into your oral presentations?”. Responses were scored on a 5-point Likert scale with 1 = never to 5 = always; higher scores indicated a better performance. The latent structure of the scale was examined with CFA.

Finally, the evidence of relationships with other variables of the OPES was determined by examining the relationships between the OPES and the PRCA and SPCC, described below.

The 24-item PRCA scale

The PRCA scale is a self-report instrument for measuring communication apprehension, an individual’s level of fear or anxiety associated with either real or anticipated communication with a person or persons [ 12 ]. The 24 scale items are statements concerning feelings about communicating with others. Four subscales cover different situations: group discussions, interpersonal communications, meetings, and public speaking. Each item is scored on a 5-point Likert scale from 1 (strongly disagree) to 5 (strongly agree); scores range from 24 to 120, with higher scores indicating greater communication anxiety. The PRCA has been demonstrated to be a reliable and valid scale across a wide range of related studies [ 5 , 13 , 14 , 16 , 17 ]. The Cronbach’s alpha for the scale is .90 [ 18 ]. We received permission from the owner of the copyright to translate the scale into Chinese. Translation of the scale into Chinese by a member of the research team who was fluent in English was followed by back-translation by a different bilingual member of the team to ensure semantic validity of the translated PRCA scale. The Cronbach’s alpha value in the present study was .93.
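
Scoring under this description can be sketched as a simple sum of the 24 items. This is an illustration only: the text does not give the scale’s scoring key, so the sketch assumes any reverse-keyed items have already been recoded before summing.

```python
def prca_total(responses):
    """Total PRCA score: sum of the 24 Likert items (1 = strongly disagree
    ... 5 = strongly agree). Totals range 24-120; higher totals indicate
    greater communication apprehension. Assumes reverse-keyed items have
    already been recoded, since the recoding key is not given here."""
    if len(responses) != 24:
        raise ValueError("the PRCA has 24 items")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("each item is scored 1-5")
    return sum(responses)
```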

The 12-item SPCC scale

The SPCC scale evaluates a person’s self-perceived competence in a variety of communication contexts and with a variety of types of receivers. Each item is a situation that requires communication, such as “Present a talk to a group of strangers” or “Talk with a friend”. Participants respond to each situation by rating their level of competence from 0 (completely incompetent) to 100 (completely competent). The Cronbach’s alpha for reliability of the scale is .85. The SPCC has been used in similar studies [ 13 , 19 ]. We received permission from the owner of the copyright to translate the scale into Chinese. Translation of the SPCC scale into Chinese by a member of the research team who was fluent in English was followed by back-translation by a different bilingual member of the team to ensure semantic validity of the translated scale. The Cronbach’s alpha value in the present study was .941.

Statistical analysis

Data were analyzed using SPSS for Windows 23 (SPSS Inc., Chicago, IL, USA). Data from the 325 students designated for the EFA were used to determine the internal structure evidence of the OPES. The Kaiser-Meyer-Olkin measure of sampling adequacy and Bartlett’s test of sphericity demonstrated that factor analysis was appropriate [ 20 ]. Principal component analysis (PCA) was performed on the 26 items to extract the major contributing factors; varimax rotation determined the relationships between the items and contributing factors. Factors with an eigenvalue > 1 were further inspected. A factor loading greater than .50 was regarded as significantly relevant [ 21 ].
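
The eigenvalue > 1 retention rule (the Kaiser criterion) can be illustrated with a small simulation. The data below are invented, with six items driven by two latent factors; this is not the study data or the full PCA-with-varimax procedure, just the retention check.

```python
import numpy as np

def kaiser_retained(data):
    """Count the factors to retain: eigenvalues of the item correlation
    matrix that exceed 1 (Kaiser criterion)."""
    corr = np.corrcoef(data, rowvar=False)
    return int((np.linalg.eigvalsh(corr) > 1.0).sum())

# Simulated responses: six items driven by two independent latent factors
rng = np.random.default_rng(0)
f1, f2 = rng.normal(size=(2, 500))
items = np.column_stack(
    [f1 + 0.3 * rng.normal(size=500) for _ in range(3)]
    + [f2 + 0.3 * rng.normal(size=500) for _ in range(3)]
)
```

Here the correlation matrix has two large eigenvalues (one per latent factor) and four small ones, so two factors are retained.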

All item deletions were incorporated one by one, and the EFA model was respecified after each deletion, which reduced the number of items in accordance with a priori criteria. In the EFA phase, the internal consistency of each construct was examined using Cronbach’s alpha, with a value of .70 or higher considered acceptable [ 22 ].
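
Cronbach’s alpha, used here for the internal consistency check, follows a standard formula that can be computed directly from the item responses; a NumPy sketch:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) response matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)
```

Perfectly parallel items give alpha = 1, while unrelated items give alpha near 0, which is why .70 serves as a floor for an acceptably coherent construct.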

Data from the 325 students designated for the CFA were used to validate the factor structure of the OPES. In this phase, items with a factor loading less than .50 were deleted [ 21 ]. The goodness of the model fit was assessed using the following: absolute fit indices, including the goodness of fit index (GFI), adjusted goodness of fit index (AGFI), standardized root mean squared residual (SRMR), and root mean square error of approximation (RMSEA); relative fit indices, including the normed and non-normed fit index (NFI and NNFI, respectively) and comparative fit index (CFI); and the parsimony NFI, parsimony CFI, and likelihood ratio (χ²/df) [ 23 ].

In addition to the validity testing, a research team, which included a statistician, determined the appropriateness of either deleting or retaining each item. Convergent validity (the internal quality of the items and factor structures) was further verified using the standardized factor loadings, with values of .50 or higher considered acceptable, and the average variance extraction (AVE), with values of .5 or higher considered acceptable [ 21 ]. Composite reliability (CR) was assessed using the construct reliability from the CFA, with values of .7 or higher considered acceptable [ 24 ]. The AVE and the correlation matrices among the latent constructs were used to establish the discriminant validity of the instrument. The square root of the AVE of each construct was required to be larger than the correlation coefficients between that construct and the other constructs [ 24 ].
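
Both quantities follow directly from the standardized loadings: AVE is the mean squared loading, and CR is (Σλ)² / ((Σλ)² + Σ(1 − λ²)). The sketch below checks the formulas against the three Clarity of Speech loadings reported in Table 3 (.765, .881, .817).

```python
def ave(loadings):
    """Average variance extraction: mean of the squared standardized loadings."""
    return sum(l * l for l in loadings) / len(loadings)

def composite_reliability(loadings):
    """Construct (composite) reliability from standardized loadings,
    taking each item's error variance as 1 - lambda squared."""
    s = sum(loadings)
    return s * s / (s * s + sum(1 - l * l for l in loadings))

clarity = [0.765, 0.881, 0.817]  # Clarity of Speech loadings from Table 3
```

With these loadings the two functions reproduce the reported AVE of .676 and CR of .862 for that construct.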

Evidence of relationships with other variables was determined by examining the relationships of nursing students’ scores (N = 650) on the newly developed OPES with their scores on the constructs of communication measured by the translated PRCA and SPCC scales. We hypothesized that stronger self-reported presentation competence would be associated with lower communication anxiety and greater communication competence.

Development of the OPES: internal structure evidence

EFA was performed sequentially six times, until no items had a factor loading < .50 or cross-loaded, and six items were deleted (Table 2). The EFA resulted in 20 items with a three-factor solution, which represented 64.75% of the variance of the OPES. The Cronbach’s alpha estimate for the total scale was .94, indicating that the scale had sound internal reliability (Table 2). The three factors were labeled in accordance with the item content via a panel discussion and had Cronbach’s alpha values of .93, .89, and .84 for factors 1, 2, and 3, respectively.

Summary of exploratory factor analysis: descriptive statistics, factor loading, and reliability for nursing students ( N  = 325)

| Item | Description | Mean | SD | Factor 1 | Factor 2 | Factor 3 |
|---|---|---|---|---|---|---|
| 7 | The content of the presentation matches the theme | 4.25 | 0.62 | .76 | .20 | .17 |
| 14 | Presentation aids, such as PowerPoint and posters, highlight key points of the report | 4.21 | 0.74 | .75 | .21 | .30 |
| 15 | Proper use of presentation aids such as PowerPoint and posters | 4.32 | 0.69 | .74 | .12 | .28 |
| 8 | The content of the presentation is clear and focused | 4.02 | 0.69 | .72 | .36 | .11 |
| 10 | The content of the presentation is organized and logical | 3.93 | 0.75 | .72 | .38 | .13 |
| 4 | Preparation of presentation aids, such as PowerPoint and posters, in advance | 4.53 | 0.67 | .70 | −.10 | .20 |
| 16 | Presentation aids, such as PowerPoint and posters, help the audience understand the content of the presentation | 4.26 | 0.68 | .69 | .20 | .37 |
| 9 | The organization of the presentation is structured to provide the necessary information, while also adhering to time limitations | 4.10 | 0.69 | .68 | .30 | .18 |
| 11 | The content of the presentation provides correct information | 4.12 | 0.66 | .68 | .31 | .10 |
| 1 | Preparation of the content in accordance with the theme and rules in advance | 4.49 | 0.61 | .64 | −.02 | .39 |
| 13 | The entire content of the presentation is prepared in a way that is understandable to the audience | 3.99 | 0.77 | .61 | .40 | .09 |
| 22 | Presenter demonstrates confidence and an appropriate level of enthusiasm | 3.92 | 0.91 | .17 | .83 | .25 |
| 21 | Presenter uses body language in a manner that increases the audience’s interest in learning | 3.50 | 0.95 | .09 | .81 | .22 |
| 24 | Presenter interacts with the audience using eye contact during the question and answer session | 3.65 | 0.92 | .15 | .77 | .24 |
| 23 | Presenter responds to the audience’s questions properly | 3.63 | 0.87 | .23 | .77 | .17 |
| 12 | The presenter’s performance is brilliant; it resonates with the audience and arouses their interests | 3.43 | 0.78 | .43 | .65 | .04 |
| 17 | The pronunciation of the words in the presentation is correct | 3.98 | 0.82 | .31 | .29 | .74 |
| 18 | The tone and volume of the presenter’s voice is appropriate | 3.82 | 0.82 | .22 | .50 | .70 |
| 19 | The words and phrases of the presenter are smooth and fluent | 3.70 | 0.82 | .26 | .52 | .65 |
| 20 | The clothing worn by the presenter is appropriate | 4.16 | 0.77 | .33 | .12 | .57 |
| | Eigenvalue (sum of squared loadings) | | | 6.01 | 4.34 | 2.60 |
| | Explained variance | | | 30.03% | 21.72% | 13.00% |
| | Cumulative variance | | | 30.03% | 51.75% | 64.75% |
| | Cronbach’s α for each subscale | | | .93 | .89 | .84 |
| | Cronbach’s α for the total scale | | | .94 | | |

Items deleted following EFA:

| Item | Description | Mean | SD |
|---|---|---|---|
| 2 | Considers the background or needs of the audience to prepare the content of the presentation in advance | 3.94 | 0.84 |
| 3 | Discusses the content of the presentation with experts, teachers or peers (classmates) in advance | 3.94 | 0.89 |
| 5 | Practices several times in private before the presentation | 3.96 | 0.89 |
| 6 | Invites classmates or teachers to watch a rehearsal before the presentation | 3.39 | 1.04 |
| 25 | Reflects on the experience as well as the strengths and weaknesses of the presentation | 3.83 | 0.85 |
| 26 | Obtains feedback from peers (e.g. classmates), teachers, or an audience | 3.92 | 0.81 |

Abbreviations: SD, standard deviation; EFA, exploratory factor analysis

Factor 1, Accuracy of Content, was comprised of 11 items and explained 30.03% of the variance. Items in Accuracy of Content evaluated agreement between the topic (theme) and content of the presentation, use of presentation aids to highlight the key points of the presentation, and adherence to time limitations. These items included statements such as: “The content of the presentation matches the theme” (item 7), “Presentation aids, such as PowerPoint and posters, highlight key points of the report” (item 14), and “The organization of the presentation is structured to provide the necessary information, while also adhering to time limitations” (item 9). Factor 2, “Effective Communication”, was comprised of five items, which explained 21.72% of the total variance. Effective Communication evaluated the attitude and expression of the presenter. Statements included “Demonstrates confidence and an appropriate level of enthusiasm” (item 22), “Uses body language in a manner that increases the audience’s interest in learning” (item 21), and “Interacts with the audience using eye contact and a question and answer session” (item 24). Factor 3, “Clarity of Speech” was comprised of four items, which explained 13.00% of the total variance. Factor 3 evaluated the presenter’s pronunciation with statements such as “The words and phrases of the presenter are smooth and fluent” (item 19).

The factor structure of the 20 items from the EFA was examined with CFA. We sequentially removed items 1, 4, 20, 15, and 16, based on modification indices. The resultant 15-item scale had acceptable fit indices for the 3-factor model of the OPES: χ²/df = 2.851, RMSEA = .076, NNFI = .933, and CFI = .945. However, the AGFI, at .876, was below the acceptable criterion of .9. A panel discussion among the researchers determined that items 4, 15, and 16 were similar in meaning to item 14, and that item 1 was similar in meaning to item 7. Therefore, the panel accepted the results of the modified CFA model of the OPES with 15 items and 3 factors.

As illustrated in Table  3 and Fig.  1 , all standardized factor loadings exceeded the threshold of .50, and the AVE for each construct ranged from .517 to .676, indicating acceptable convergent validity. In addition, the CR was greater than .70 for the three constructs (range = .862 to .901), providing further evidence for the reliability of the instrument [ 25 ]. As shown in Table  4 , all square roots of the AVE for each construct (values in the diagonal elements) were greater than the corresponding inter-construct correlations (values below the diagonal) [ 24 , 25 ]. These findings provide further support for the validity of the OPES.
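
The discriminant validity rule just applied (the square root of each construct’s AVE must exceed its correlations with the other constructs, i.e. the Fornell-Larcker criterion) can be checked mechanically. The sketch below uses the AVE values and inter-construct correlations reported in Tables 3 and 4.

```python
import math

def discriminant_validity_ok(aves, corr):
    """Fornell-Larcker criterion: sqrt(AVE) of every construct must exceed
    each of that construct's correlations with the other constructs."""
    for i, a in enumerate(aves):
        root = math.sqrt(a)
        for j in range(len(aves)):
            if i != j and root <= abs(corr[i][j]):
                return False
    return True

# AVEs and correlations for accuracy of content, effective communication,
# and clarity of speech (values taken from Tables 3 and 4)
aves = [0.517, 0.647, 0.676]
corr = [[1.0, 0.696, 0.597],
        [0.696, 1.0, 0.703],
        [0.597, 0.703, 1.0]]
```

For example, sqrt(.517) = .719 exceeds both .696 and .597, so the first construct passes, and likewise for the other two, matching the diagonal values in Table 4.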

Table 3. Confirmatory factor analysis: convergent reliability and validity of the OPES for nursing students (n = 325)

Construct / Item           Mean   SD     λ      t           R²     CR     AVE
Accuracy of content                                                .881   .517
  Item 7                   4.25   0.60   .695   13.774***   .483
  Item 14                  4.23   0.68   .660   12.863***   .435
  Item 8                   3.98   0.66   .786   16.352***   .617
  Item 10                  3.88   0.69   .828   17.703***   .686
  Item 9                   4.03   0.72   .766   15.753***   .586
  Item 11                  4.08   0.65   .697   13.835***   .486
  Item 13                  3.92   0.78   .569   10.687***   .324
Effective Communication                                            .901   .647
  Item 22                  3.58   0.91   .894   20.230***   .799
  Item 21                  3.43   0.97   .817   17.548***   .668
  Item 24                  3.69   0.91   .794   16.816***   .631
  Item 23                  3.64   0.87   .854   18.802***   .730
  Item 12                  3.41   0.79   .639   12.490***   .408
Clarity of speech                                                  .862   .676
  Item 17                  3.94   0.76   .765   15.541***   .586
  Item 18                  3.81   0.79   .881   19.002***   .776
  Item 19                  3.70   0.76   .817   17.026***   .667

Note. λ = standardized factor loading; R² = item reliability (squared multiple correlation, SMC); CR = construct (composite) reliability; AVE = average variance extracted.

*** p  < .001

Fig. 1. The standardized estimates of the CFA model for the validation sample.

Table 4. Correlations among the latent variables from confirmatory factor analysis of the OPES for nursing students (n = 325)

Construct                    1         2         3
1. Accuracy of content       .719ᵃ
2. Effective communication   .696***   .804ᵃ
3. Clarity of speech         .597***   .703***   .822ᵃ

ᵃ The value in the diagonal element is the square root of the AVE of each construct. *** p < .001.
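The discriminant-validity claim above is the Fornell–Larcker criterion: each diagonal value (√AVE) must exceed the construct's correlations with the other constructs. A small sketch of that check, using the AVE and correlation values reported in Tables 3 and 4 (the short construct labels are illustrative):

```python
import math

# Fornell-Larcker criterion: sqrt(AVE) of each construct (the diagonal)
# should exceed its correlations with every other construct.
# AVE and correlation values are taken from Tables 3 and 4.

ave = {"accuracy": 0.517, "communication": 0.647, "clarity": 0.676}
corr = {("accuracy", "communication"): 0.696,
        ("accuracy", "clarity"): 0.597,
        ("communication", "clarity"): 0.703}

sqrt_ave = {k: math.sqrt(v) for k, v in ave.items()}
# sqrt(AVE): accuracy ~.719, communication ~.804, clarity ~.822

ok = all(r < sqrt_ave[a] and r < sqrt_ave[b] for (a, b), r in corr.items())
print(ok)  # True: every correlation is below both relevant diagonal values
```

The check passes for all three construct pairs, which is the evidence of discriminant validity the text describes.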

Development of the OPES: relationships with other variables

Evidence of relationships with other variables was examined with correlation coefficients between the total and subscale scores of the OPES and the total and subscale scores of the PRCA and SPCC (Table 5), using data from all nursing students who participated in the study and completed all three scales (N = 650). Correlation coefficients for the total score of the OPES with the total scores of the PRCA and SPCC were −.51 and .45, respectively (both p < .001). Correlation coefficients for the subscale scores of the OPES with the subscale scores of the PRCA and SPCC were all significant (p < .001), providing strong evidence of the scale's validity as a self-assessment of effective communication.
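The validity evidence here rests on Pearson correlations between scale totals. A minimal sketch of that computation follows; the score arrays are synthetic placeholders (not study data) that merely imitate the inverse OPES–PRCA relationship:

```python
import numpy as np

# Pearson correlation between two sets of scale totals.
# The arrays are synthetic placeholders, NOT the study data; they only
# illustrate the computation behind the reported r values.
rng = np.random.default_rng(0)
opes_total = rng.normal(55, 8, 100)                          # hypothetical OPES totals
prca_total = 120 - 0.9 * opes_total + rng.normal(0, 6, 100)  # inversely related, like the PRCA

r = np.corrcoef(opes_total, prca_total)[0, 1]
print(round(r, 2))  # negative, mirroring the direction of the reported -.51
```

In practice scipy.stats.pearsonr would also return the p-value used to judge significance.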

Correlation coefficients for total scores and subscale scores for the OPES, PRCA, and SPCC

Instruments & subscales
1. OPES
2. Accuracy of content
3. Effective Communication
4. Clarity of speech
5. PRCA
6. Group discussion
7. Meetings
8. Interpersonal
9. Public Speaking
10. SPCC
11. Public
12. Meeting
13. Group
14. Dyad
15. Stranger
16. Acquaintance
17. Friend

OPES Oral Presentation Evaluation Scale, PRCA Personal Report of Communication Apprehension, SPCC Self-Perceived Communication Competence

Bold figures: all p < .001.

The 15-item OPES was found to be a reliable and valid instrument for nursing students' self-assessments of their performance during previous oral presentations. A strength of this study is that the initial items were developed using both a literature review and interviews with nurse educators, who were university tutors in oral presentation skills, and with nursing students at different stages of the educational process. Another strength is the multiple methods used to establish the validity and reliability of the OPES, including internal structure evidence (both EFA and CFA) and relationships with other variables [15, 26].

Similar to other oral presentation instruments, content analysis of the OPES items generated from the interviews with educators and students indicated that accuracy of the content of a presentation and effective communication were important factors for a good performance [3–6, 8]. Other studies have also included self-esteem as a factor that can influence the impact of an oral presentation [3]; in the OPES, the Effective Communication subscale included the item "Demonstrates confidence and an appropriate level of enthusiasm", which is a quality of self-esteem. The third domain, clarity of speech, is unique to our study.

Constructs that focus on a person's ability to deliver accurate content are important components of evaluations of classroom speaking because they have been shown to be fundamental elements of public speaking [7]. Accuracy of content, as it applies to oral presentations by nurses, is important not only for communicating information involving healthcare education for patients, but also for communicating with team members providing medical care in a clinical setting.

The two other factors identified in the OPES, effective communication and clarity of speech, are similar to constructs for delivery of a presentation, which include interacting with the audience through body-language, eye-contact, and question and answer sessions. These behaviors indicate the presenter is confident and enthusiastic, which engages and captures the attention of an audience. It seems logical that the voice, pronunciation, and fluency of speech were not independent factors because the presenter’s voice qualities all are keys to effectively delivering a presentation. A clear and correct pronunciation, appropriate tone and volume of a presentation assists audiences in more easily receiving and understanding the content.

Our 15-item OPES evaluated performance based on outcomes. The original scale comprised 26 items derived from qualitative interviews with nursing students and university tutors in oral presentations. These items were the result of asking about important qualities at three timepoints of a presentation: before, during, and after. However, most of the deleted items concerned the period before the presentation (items 1 to 6); two items (25 and 26) concerned the period after the presentation. The final scale therefore did not reflect the qualitative interview data expressed by educators and students regarding the importance of preparation through practice and rehearsal, and the importance of peer and teacher evaluations. Other studies have suggested that preparation and self-reflection are important for a good presentation, including awareness of the audience receiving the presentation, meeting the needs of the audience, defining the purpose of the presentation, use of appropriate technology to augment information, and repeated practice to reduce anxiety [2, 5, 27]. However, these items were deleted at the scale validation stage, possibly because it is not possible to objectively evaluate how much time and effort the presenter has devoted to the oral presentation.

The deletion of item 20, "The clothing worn by the presenter is appropriate", was also not surprising. During the interviews, educators and students expressed different opinions about the importance of clothing for a presentation. Many of the educators believed the presenter should be dressed formally; students believed the presenter should be neatly dressed. These two perspectives might reflect generational differences. However, these results are reminders that assessments should be based on a structured and objective scale, rather than on personal attitudes and stereotypes about what should be important in an oral presentation.

The OPES may be useful not only for educators but also for students. It could be used as a checklist to help students determine how well their presentation matches the 15 items, drawing attention to deficiencies in their speech before the presentation is given. Once the presentation has been given, the OPES could be used as a self-evaluation form to help students make modifications that improve the next presentation. Educators could use the OPES to evaluate performance during tutoring sessions with students, which could help identify specific areas needing improvement prior to the oral presentation. Although the analysis of the scale was based on data from nursing students, additional assessments with other populations of healthcare students should be conducted to determine whether the OPES is applicable for evaluating oral presentations by students in general.

Limitations

This study had several limitations. Participants were selected by non-random sampling; therefore, additional studies with nursing students from other nursing schools would strengthen the validity and reliability of the scale. In addition, the OPES was developed using empirical data rather than a theoretical framework, such as anxiety and public speaking. Therefore, the validity of the OPES for use in other student populations, or in cultures that differ significantly from our sample population, should be established in future studies. Finally, the OPES was examined in this study as a self-assessment instrument for nursing students, who rated themselves based on their perceived abilities in previous oral presentations, rather than through peer or nurse educator evaluations. Therefore, the applicability of the scale as an assessment instrument for educators providing an objective score of nursing students' real-life oral presentations needs to be validated in future studies.

This newly developed 15-item OPES is the first reported valid self-assessment instrument for providing nursing students with feedback about whether the necessary targets for a successful oral presentation have been reached. It could therefore be adopted as a self-assessment instrument for nursing students when learning which oral presentation skills require strengthening. However, further studies are needed to determine whether the OPES is a valid instrument for use by student peers or nursing educators evaluating student presentations across nursing programs.

Acknowledgements

The authors thank all the participants for their kind cooperation and contribution to the study.

Authors’ contributions

All authors conceptualized and designed the study. Data were collected by Y-CH and H-CL. Data analysis was conducted by Y-CH and Y-CC. The first draft of the manuscript was written by Y-CH, Y-CC, and all authors contributed to subsequent revisions. All authors read and approved the final submission.

Funding

This study was supported by grants from the Ministry of Science and Technology Taiwan (MOST 107-2511-H-255-007), the Ministry of Education (PSR1090283), and the Chang Gung Medical Research Fund (CMRPF3K0021, BMRP704, BMRPA63).

Availability of data and materials

Declarations.

All study methods and materials were performed in accordance with the Declaration of Helsinki. The study protocol and procedures were approved by the Chang Gung Medical Foundation institutional review board (number: 201702148B0) for the protection of participants' confidentiality. All participants received oral and written explanations of the study and its procedures, and informed consent was obtained from all subjects.

Not applicable.

No conflict of interest has been declared by the authors.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Yi-Chien Chiang, Email: ycchiang@gw.cgust.edu.tw

Hsiang-Chun Lee, Email: cathylee@gw.cgust.edu.tw

Tsung-Lan Chu, Email: jec75@cgmh.org.tw

Chia-Ling Wu, Email: clwu@mail.cgust.edu.tw

Ya-Chu Hsiao, Email: yjshiao@gw.cgust.edu.tw

Assessing oral presentation performance: designing a rubric and testing its validity with an expert group

Journal of Applied Research in Higher Education

ISSN : 2050-7003

Article publication date: 3 July 2017

Purpose

The purpose of this paper is to design a rubric instrument for assessing oral presentation performance in higher education and to test its validity with an expert group.

Design/methodology/approach

This study, using mixed methods, focusses on: designing a rubric by identifying assessment instruments in previous presentation research and implementing essential design characteristics in a preliminary developed rubric; and testing the validity of the constructed instrument with an expert group of higher educational professionals ( n =38).

Findings

The result of this study is a validated rubric instrument consisting of 11 presentation criteria, their related performance levels, and a five-point scoring scale. The adopted criteria correspond to the widely accepted main criteria for presentations, in both the literature and educational practice, regarding aspects such as the content of the presentation, the structure of the presentation, interaction with the audience, and presentation delivery.

Practical implications

Implications for the use of the rubric instrument in educational practice refer to the extent to which the identified criteria should be adapted to the requirements of presenting in a certain domain and whether the amount and complexity of the information in the rubric, as criteria, levels and scales, can be used in an adequate manner within formative assessment processes.

Originality/value

This instrument offers the opportunity to formatively assess students’ oral presentation performance, since rubrics explicate criteria and expectations. Furthermore, such an instrument also facilitates feedback and self-assessment processes. Finally, the rubric, resulting from this study, could be used in future quasi-experimental studies to measure students’ development in presentation performance in a pre-and post-test situation.

  • Higher education
  • Oral presentation competence

Van Ginkel, S. , Laurentzen, R. , Mulder, M. , Mononen, A. , Kyttä, J. and Kortelainen, M.J. (2017), "Assessing oral presentation performance: Designing a rubric and testing its validity with an expert group", Journal of Applied Research in Higher Education , Vol. 9 No. 3, pp. 474-486. https://doi.org/10.1108/JARHE-02-2016-0012

Emerald Publishing Limited

Copyright © 2017, Emerald Publishing Limited


How to prepare and deliver an effective oral presentation

  • Lucia Hartigan , registrar 1 ,
  • Fionnuala Mone , fellow in maternal fetal medicine 1 ,
  • Mary Higgins , consultant obstetrician 2
  • 1 National Maternity Hospital, Dublin, Ireland
  • 2 National Maternity Hospital, Dublin; Obstetrics and Gynaecology, Medicine and Medical Sciences, University College Dublin
  • luciahartigan{at}hotmail.com

The success of an oral presentation lies in the speaker’s ability to transmit information to the audience. Lucia Hartigan and colleagues describe what they have learnt about delivering an effective scientific oral presentation from their own experiences, and their mistakes

The objective of an oral presentation is to portray large amounts of often complex information in a clear, bite sized fashion. Although some of the success lies in the content, the rest lies in the speaker’s skills in transmitting the information to the audience. 1

Preparation

It is important to be as well prepared as possible. Look at the venue in person, and find out the time allowed for your presentation and for questions, and the size of the audience and their backgrounds, which will allow the presentation to be pitched at the appropriate level.

See what the ambience and temperature are like and check that the format of your presentation is compatible with the available computer. This is particularly important when embedding videos. Before you begin, look at the video on stand-by and make sure the lights are dimmed and the speakers are functioning.

For visual aids, Microsoft PowerPoint or Apple Mac Keynote programmes are usual, although Prezi is increasing in popularity. Save the presentation on a USB stick, with email or cloud storage backup to avoid last minute disasters.

When preparing the presentation, start with an opening slide containing the title of the study, your name, and the date. Begin by addressing and thanking the audience and the organisation that has invited you to speak. Typically, the format includes background, study aims, methodology, results, strengths and weaknesses of the study, and conclusions.

If the study takes a lecturing format, consider including “any questions?” on a slide before you conclude, which will allow the audience to remember the take home messages. Ideally, the audience should remember three of the main points from the presentation. 2

Have a maximum of four short points per slide. If you can display something as a diagram, video, or a graph, use this instead of text and talk around it.

Animation is available in both Microsoft PowerPoint and the Apple Mac Keynote programme, and its use in presentations has been demonstrated to assist in the retention and recall of facts. 3 Do not overuse it, though, as it could make you appear unprofessional. If you show a video or diagram don’t just sit back—use a laser pointer to explain what is happening.

Rehearse your presentation in front of at least one person. Request feedback and amend accordingly. If possible, practise in the venue itself so things will not be unfamiliar on the day. If you appear comfortable, the audience will feel comfortable. Ask colleagues and seniors what questions they would ask and prepare responses to these questions.

It is important to dress appropriately, stand up straight, and project your voice towards the back of the room. Practise using a microphone, or any other presentation aids, in advance. If you don’t have your own presenting style, think of the style of inspirational scientific speakers you have seen and imitate it.

Try to present slides at the rate of around one slide a minute. If you talk too much, you will lose your audience’s attention. The slides or videos should be an adjunct to your presentation, so do not hide behind them, and be proud of the work you are presenting. You should avoid reading the wording on the slides, but instead talk around the content on them.

Maintain eye contact with the audience and remember to smile and pause after each comment, giving your nerves time to settle. Speak slowly and concisely, highlighting key points.

Do not assume that the audience is completely familiar with the topic you are passionate about, but don’t patronise them either. Use every presentation as an opportunity to teach, even your seniors. The information you are presenting may be new to them, but it is always important to know your audience’s background. You can then ensure you do not patronise world experts.

To maintain the audience’s attention, vary the tone and inflection of your voice. If appropriate, use humour, though you should run any comments or jokes past others beforehand and make sure they are culturally appropriate. Check every now and again that the audience is following and offer them the opportunity to ask questions.

Finishing up is the most important part, as this is when you send your take home message with the audience. Slow down, even though time is important at this stage. Conclude with the three key points from the study and leave the slide up for a further few seconds. Do not ramble on. Give the audience a chance to digest the presentation. Conclude by acknowledging those who assisted you in the study, and thank the audience and organisation. If you are presenting in North America, it is usual practice to conclude with an image of the team. If you wish to show references, insert a text box on the appropriate slide with the primary author, year, and paper, although this is not always required.

Answering questions can often feel like the most daunting part, but don’t look upon this as negative. Assume that the audience has listened and is interested in your research. Listen carefully, and if you are unsure about what someone is saying, ask for the question to be rephrased. Thank the audience member for asking the question and keep responses brief and concise. If you are unsure of the answer you can say that the questioner has raised an interesting point that you will have to investigate further. Have someone in the audience who will write down the questions for you, and remember that this is effectively free peer review.

Be proud of your achievements and try to do justice to the work that you and the rest of your group have done. You deserve to be up on that stage, so show off what you have achieved.

Competing interests: We have read and understood the BMJ Group policy on declaration of interests and declare the following interests: None.

  1. Rovira A, Auger C, Naidich TP. How to prepare an oral presentation and a conference. Radiologia 2013;55(suppl 1):2-7S.
  2. Bourne PE. Ten simple rules for making good oral presentations. PLoS Comput Biol 2007;3:e77.
  3. Naqvi SH, Mobasher F, Afzal MA, Umair M, Kohli AN, Bukhari MH. Effectiveness of teaching methods in a medical institute: perceptions of medical students to teaching aids. J Pak Med Assoc 2013;63:859-64.



Assessing oral presentations - criteria checklist

Jan Deurwaarder

Course documents used to assess oral communication - presenting a topic to an audience.

Related Papers

Andrew Leichsenring

Oral presentations are a common form of summative assessment in tertiary level English as a Foreign Language (EFL) syllabi. There will be an array of teaching and learning elements to be considered by a teacher in their set-up and execution of an ICT-based oral presentation activity that goes beyond having students stand in front of a class group and talk about a subject. Teaching effective oral presentation skills to university-level learners requires an understanding of how to maximize learning opportunities to persuasively convey a message orally and visually to an audience.



Aysha Sharif

The purpose of this paper is to highlight the importance of oral presentation skills/public speaking for fresh graduates of all disciplines and to emphasize developing online materials that cater to the needs of students on a large scale worldwide, using the platform of technology in the field of English language teaching. An ESP course was developed online in this study in the form of a blog called "English for Oral Presentation Skills", providing a course view and outline, course objectives, learning outcomes, and activities to be conducted by the teacher. The focus of this study is to provide students with the language used at different stages of a presentation, including language for the introduction, transitional language, language for the conclusion, and language for the Q & A session. It also covers general language implications to be considered while teaching English for oral presentation skills, as well as information on different types of presentations and the structure of presentations. The course also includes assessment criteria, in order to verify whether students are able to present with proper language skills. Hence, this study is a successful attempt at providing a large audience with the skills of public speaking using technology in language teaching.

Ariana Vacaretu

The guidebook complements another intellectual output of the project, “Syllabus for the elective course Mathematics research workshop/ Studying math through research”, in that it reveals how the project team operationalised two transversal competences and a specifically mathematical competence, preparing for the development of the methodology (methods and tools) for assessing them. The guidebook addresses teachers and experts in didactics who are interested in developing competence assessment tools. We are confident that the process of developing the competence assessment methodology / instruments described in this guidebook may prove useful for specialists interested in competence assessment. The guidebook is structured in four chapters. The first chapter presents aspects connected to assessment in the mathematics research workshops – what we know about how assessment is done in such research workshops, why we aim to assess competences students develop in the research workshops, and some aspects that should be kept in mind when assessing competences. In the second chapter, we share the diagram of competences students develop in the research workshops, and operationalise / define the three competences students develop in these workshops: collaborative problem solving, use of aids and tools, and written and oral communication skills for sharing the research results. Chapter three includes the methodology of assessing the above-mentioned competences, which was tested over the period of an academic year, and then revised. The last chapter shares the conclusions we drew upon testing the assessment methods and tools, as well as a few ideas related to how our approach can be continued.

Journal of Language Teaching and Research

Tariq Elyas

Simsim Samasim

The paper examines the value of oral communication as it is taught, practised and assessed across two environmental degrees and determines whether a competent level of oral communication training, assessment and practice in oral genres and oral skill development has been achieved. To undertake this investigation the authors applied defined levels of attainment for oral communication and mapped the levels of attainment in core units and location over a program of study; audited oral assessment strategies; examined lecturer and student reflections of oral communication learning activities, and examined graduates' reflections through the course experience questionnaire. We found that both degrees currently use several genres of oral communication as a learning activity and a learning outcome, but there is limited training in oral communication and few units are using higher level authentic learning activities. Hence students are not experiencing varied oral genres, purposes, and au...



Oral Presentations: Assessment Sheet


  • Oral Presentations, assessments (DOCX)
  • Oral Presentations, assessment (PDF)


COMMENTS

  1. PDF SAMPLE ORAL PRESENTATION MARKING CRITERIA

    3. PEER ASSESSMENT OF GROUP PRESENTATIONS BY MEMBERS OF TEAM Use the criteria below to assess your contribution to the group presentation as well as the contribution of each of your teammates. 0 = no contribution 1 = minor contribution 2 = some contribution, but not always effective/successful 3 = some contribution, usually effective/successful

  2. PDF ORAL PRESENTATION EVALUATION CRITERIA

    Fair: Only mild enthusiasm, problems with comprehensibility, cannot be heard very well, not very interesting to the audience. Weak: No interest in presentation, barely comprehensible or incomprehensible. Fluency (score bands: 20, 17, 14, 11). Superior: Gets the idea across fully with little hesitation; goes beyond the minimum. Communicates with ease overall.

  3. PDF Oral Presentation Evaluation Criteria and Checklist

    ORAL PRESENTATION EVALUATION CRITERIA AND CHECKLIST. talk was well-prepared. topic clearly stated. structure & scope of talk clearly stated in introduction. topic was developed in order stated in introduction. speaker summed up main points in conclusion. speaker formulated conclusions and discussed implications. was in control of subject matter.

  4. PDF Oral Presentation Evaluation Rubric

    Organization. Logical, interesting, clearly delineated themes and ideas. Generally clear, overall easy for audience to follow. Overall organized but sequence is difficult to follow. Difficult to follow, confusing sequence of information. No clear organization to material, themes and ideas are disjointed. Evaluation.

  5. PDF Presentation Evaluation Criteria

    The presentation is properly focused. A clear train of thought is followed and involves the audience. The speaker makes main points clear. The speaker sequences main points effectively. The speaker includes internal summaries. The outline is repeatedly referenced to provide signposts. The speaker provides effective signposts.

  6. PDF Oral Presentation Rubric

    Oral Presentation Rubric 4—Excellent 3—Good 2—Fair 1—Needs Improvement Delivery • Holds attention of entire audience with the use of direct eye contact, seldom looking at notes • Speaks with fluctuation in volume and inflection to maintain audience interest and emphasize key points • Consistent use of direct eye contact with ...

  7. PDF Oral Presentation: Scoring Guide
