HEA STEM Conference

The assessment of critical evaluation, leadership and reflection skills through participation in online discussions

Jacqui Taylor

Psychology Teaching Review, Vol. 18 No. 2, Autumn 2012, pp. 52–58. © The British Psychological Society 2012.

Increasingly, educators from all disciplines are using blogs, social networking sites, VLEs and wikis to encourage academic discourse between students. However, a common problem experienced by educators is how these important learning experiences can be assessed, and because of this difficulty many are not assessed. For some time now, I have been using online discussions via the University VLE as a way to encourage student debate around key lecture topics (e.g. Taylor, 2002). The key learning outcomes which this assessed activity addresses, in addition to learning more about the topic, are to develop skills in reflective practice, critical evaluation and leadership. This article will review the ways that face-to-face and online academic discourse between students have been assessed, highlighting some of the differences to consider when setting up online discussion activities compared to face-to-face discussion. I will then provide a case study of the way I set up online discussions and the method I currently use to assess contributions. The final part of the paper will consider the potential for using quantitative content analysis (QCA) and automated methods to assess online participation.

Keywords: Online discussion; online skill assessment; virtual groups; quantitative content analysis; automating assessment.

1. Introduction
Online group discussions develop many important traditional and 21st century skills important for graduates in this fast-changing world (McGaw, 2009). However, existing models of assessment typically fail to address these skills. This Introduction will consider the similarities and differences between the learning experiences in face-to-face and online group discussions and consider how the setting-up of online discussions can affect the development of skills.

1.1 Assessing ‘traditional’ and online discourse between students
Academic discourse between students helps them to develop an understanding of different views on a topic and helps to develop public speaking and debating skills; however, the learning that takes place during such interactions is rarely assessed. Although students may be assessed individually for their presentation skills or for their contribution towards a group project, their actual discourse is rarely assessed, due mainly to the difficulty of assessing communication in real-time face-to-face environments. Even if such interactions are recorded via video or audio, the complexity of such interaction makes assessment exceptionally time-consuming and difficult.

With discourse in online contexts, a record is made and interactions take place serially (rather than multiple parties communicating at the same time), and therefore it is easier to assess individual contributions. However, there is very little guidance for educators for assessing these interactions. Vonderwell et al. (2009) highlight the paucity of research and practical advice on how to assess online postings. They discuss assessment processes in asynchronous online discussions, but focus on the variety of aspects that ‘could’ be assessed, such as self-regulation, learner autonomy, learning community and student writing skills. They conclude that ‘asynchronous online discussions facilitate a multidimensional process of assessment… further research is needed to understand what assessment strategies or criteria enhance assessment and learning’ (p.309).

In an early paper, Newman et al. (1994) compared critical thinking in face-to-face and computer-based seminars (where students participated in both), and found that more new ideas emerged in the face-to-face seminars but that more ideas expressed in the online seminars were rated as important, justified or linked together, indicating a difference in quality and quantity. Heckman et al. (2002) compared four face-to-face and four online group discussions and found that the online discussions generated high levels of cognitive activity, which were equal or superior to those identified in the face-to-face discussions. However, they provide no guidance on how to conduct assessment.

1.2 Setting up online academic discussions
I have previously identified key factors for educators to consider when setting up online discussions and which affect participation and learning (Taylor, 2002). Three factors will be considered here which specifically relate to assessment (task type, assessment strategy, and individual differences); however, other factors (such as group composition and instructor involvement) should also be considered.

As with offline learning activities, task type can have a significant impact on the quality and quantity of student engagement. Kanuka et al. (2007) examined the influence of different types of communication activity on the quality of students’ messages in an online discussion and found significant differences between the five activities: the nominal group technique; debate; invited expert; WebQuest; and reflective deliberation. They found that the WebQuest and debate activities led to the highest quality of messages and highlight the similar qualities of these two activities: they were well structured, provided clearly defined roles for the students, and they provoked the students to explicitly confront others’ opinions.

A number of research articles (e.g. Gafni & Geri, 2010) have shown that assessing online contributions both increases participation and enhances the quality of academic discourse. Swan et al. (2008) found that providing students with the assessment criteria led to increases in their participation and fostered deeper learning. Regarding the impact of individual differences, although gender and personality have been shown to affect preferences for online discussion, they did not affect performance. However, an individual difference to consider in assessment that has received little investigation in online learning is the impact of having English as a second language. Similarly, students who have communication difficulties such as dyslexia will need to be assessed according to the relevant marking guidelines.

2. Case study: My assessment of online discussions
Hazari (2004) identified two ways to assess contributions to online discussions: analytic marking, which involves assigning marks to specified criteria, and holistic marking, where marks are assigned to the whole unit of analysis without scoring individual criteria. I use the analytic method, and the unit of analysis I use for assessment is the message; thus each message is evaluated using the criteria below. In addition to learning more about the topic and encouraging extended research, the key learning outcomes which the online discussion activity addresses are to develop skills in critical reflection, evaluation and leadership. The assignment consists of participation in three online discussions and co-ordination and leadership of one discussion. Each individual message is assessed by hand using similar criteria to those used for other forms of academic writing, specifically whether it is analytical and evaluative. In addition, marks are awarded for reflection and timeliness of research cited. Each message is graded for these criteria using a five-point scale, from ‘basic attempt’ to ‘excellent’.
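The per-message analytic scheme described above can be captured in a simple record and aggregated per student. A minimal sketch in Python, assuming illustrative criterion names (analytical, evaluative, reflection, timeliness) and equal weighting; the paper names the criteria and the five-point scale but does not prescribe a data format or weighting:

```python
from dataclasses import dataclass

@dataclass
class MessageMark:
    """Analytic marks for one discussion message.

    Each criterion is scored 1-5, from 'basic attempt' (1) to 'excellent' (5).
    Criterion names here are illustrative, not the author's exact rubric.
    """
    author: str
    analytical: int
    evaluative: int
    reflection: int   # links to personal experience and the wider context
    timeliness: int   # recency of the research cited

    def total(self) -> int:
        # Equal weighting is an assumption for this sketch.
        return self.analytical + self.evaluative + self.reflection + self.timeliness

def discussion_totals(marks: list) -> dict:
    """Sum each student's message totals across one discussion."""
    totals: dict = {}
    for m in marks:
        totals[m.author] = totals.get(m.author, 0) + m.total()
    return totals

marks = [
    MessageMark("student_a", 4, 3, 5, 2),
    MessageMark("student_b", 2, 2, 1, 3),
    MessageMark("student_a", 3, 4, 4, 4),
]
totals = discussion_totals(marks)  # e.g. {'student_a': 29, 'student_b': 8}
```

Keeping the marks in a structured form like this also makes it straightforward to return criterion-level feedback to students, rather than a single overall grade.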
When leading an online discussion, students: provide an introduction to the key points; respond to other group members’ questions; motivate discussion; and send a conclusion. The leader’s posts are graded for coordination, responding and motivation. Some brief discussion of each of these skills highlights relevant research and the defining features for assessment.

2.1 Reflection
Students are increasingly required to reflect critically on their learning as part of their coursework; however, as Coutinho (2007) highlights, teaching and encouraging reflective practice is problematic in many ways. For example, agreement on what constitutes reflective practice is vague, and the assessment-centred approach to learning in HE often focuses reflection on improving the reflective writing style rather than on reflecting upon learning (metacognition). In addition, when reflections are assessed, the incentive is to demonstrate knowledge and hide ignorance or doubt, which is counter to Dewey’s (1939) original purpose of reflection, in which learning is derived from analysing mistakes and solving problems. Seale and Cann (2000) explored the way learning technologies were used to facilitate reflective thinking in students. They illustrated how a small group of students engaged with the material through online discussion; students were able to make links with other learning experiences and to see things in different ways. For my online discussions, students are encouraged to research widely using both academic sources and sources from the media (as long as due consideration is given to their credibility). This allows important concepts from published papers to be illustrated using clips from YouTube and the BBC, and web links to articles in the mass media. When I assess reflection, marks are awarded for links made to personal experiences and examples from this wider context.

2.2 Critical evaluation and extended research
Mason (1991) proposed that the measurement of online transcripts should be based on the educational value that they exhibit. He broke this down into a number of useful questions: for example, whether a message built on previous messages, whether the participant drew on their own experience, whether they referred to course material or material outside the course, and whether they initiated new ideas for discussion. When I assess evaluation, marks are awarded for questioning and building on previous messages and research. The extent and timeliness of research and resources used are recognised in marking, for example, articles published within the last two years and research not already covered in lectures.

2.3 Leadership skills
In their study on the importance of trust in virtual teamwork, Jarvenpaa and Leidner (1998) found that teams with high trust levels were more capable of dealing with uncertainty and complexity than those with low levels. Prior to the online discussions, students are provided with a handout on the characteristics, benefits and problems of a virtual team, along with some useful tips for effectively managing and creating trust within their online discussions. Regarding the style of the online discussion, students are advised of the need to find a balance between social and task-based communication. The problems of coordinating or leading online teams can be significant, and leaders are encouraged to respond quickly and to include informal comments so that the discussion does not become a series of long monologues; assessment identifies attempts to encourage interactivity, for example, including participants’ names and responding directly to them. This has been shown to be one of the key indicators demonstrating an awareness of social presence and community. A leader’s role is critical at the start and end of a virtual discussion (when the definition of the topic and the time plan are identified); therefore, I assess the introduction and summary messages separately. Extra marks are also awarded for motivating comments, and students are encouraged to weave the findings of empirical research into responses and questions to others.
3. Potential for using Quantitative Content Analysis (QCA) and software to assist assessment
Due to a change to the assessment strategy, the online discussions will form the only assessment for this unit in the future (currently they form 30 per cent, with 70 per cent coming from an exam). Therefore, there is a need to provide more detailed feedback to students regarding the assessment of their contributions, and I am considering the potential for using content analysis or automated techniques as an additional method of assessment. A literature review of this area highlighted the methodologies and software that could be used, but the limited empirical research highlights important factors to consider if assessment is based on these methods.

3.1 Quantitative content analysis (QCA)
Newman et al. (1995) developed a content analysis method to measure critical thinking in online group discussions and provided textual indicators to identify critical and uncritical thinking using sets of paired indicators, for example: relevant/irrelevant; important/trivial; new ideas/repeating what has been said; putting down new ideas/welcoming new ideas. The system certainly has potential for use in the assessment of online discourse, as the use of obvious opposites should make it easy for an educator (not experienced in qualitative analysis) to identify messages that illustrate these extremes. Other papers have been published since this, but they are not aimed at educators. For example, drawing on 19 key studies published in the preceding decade, Rourke et al. (2001) cover the potential uses and the methodological challenges of analysing online transcripts using QCA. This classic paper provides a comprehensive discussion of issues relating to criteria, research designs, units of analysis and ethical issues; however, it is not easy for the educator to use and seems primarily aimed at educational technologists and researchers. Indeed, in a later paper, Rourke and Anderson (2004) propose that QCA is still not systematic and objective enough to describe academic discourse, and they provide procedures for developing a QCA coding protocol that is theoretically valid and for establishing its validity empirically. Many of the articles employing QCA in online environments are theoretically driven by the Community of Inquiry (CoI) framework. This framework was developed by Garrison et al. (2001) and consists of three elements: cognitive presence, social presence and teaching presence. The first two elements can be used to further understand the potential use of QCA in assessing reflection, evaluation and leadership.

3.1.1 Critical evaluation and extended research
Garrison et al. (2001) provided a detailed overview of ways to evaluate online transcripts for evidence of critical thinking, based on five stages: (i) problem identification; (ii) problem definition; (iii) problem exploration; (iv) problem evaluation/applicability; and (v) problem integration. While Garrison’s stages are useful, they need some simplification if they are to be used to assess online discussions. The main focus of later work by Garrison et al. (2006) is on producing a methodology to systematically and rigorously measure cognitive presence in online communications. Their work is very useful in guiding educators in the adoption, design and implementation of online environments for learning, but it is less useful as an assessment tool. Kanuka et al. (2007) used the construct of cognitive presence to investigate the role of critical discourse in distance education and examined the influence of different types of communication activity on the quality of students’ messages in an online discussion. Using QCA to analyse messages from 19 students in an undergraduate course, each message was assigned to one of the four categories of cognitive presence. While the number of contributions categorised in the highest phases of cognitive presence was low (20.21 per cent), interestingly they found that it was highest during activities which were well structured, provided clearly defined roles for the students, and provoked students to explicitly confront others’ opinions. Extended research is probably the easiest to assess using QCA and to some extent can even be partially automated. For example, dates can be highlighted to allow easy identification of recent research and, using a list of references already covered in lectures, the extent of new research cited by students can also be easily identified.
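The two automatable checks just mentioned (highlighting recent dates, and comparing cited work against a list of references already covered in lectures) can be sketched in a few lines of Python. The citation pattern, the lecture-reference list and the two-year recency window below are illustrative assumptions, not the author's actual tooling:

```python
import re

# Hypothetical set of (surname, year) references already covered in lectures.
LECTURE_REFS = {("Mason", 1991), ("Garrison", 2001)}

# Matches simple in-text citations such as "Mason (1991)" or "Swan et al. (2008)".
CITATION = re.compile(r"([A-Z][A-Za-z]+)(?:\s+et\s+al\.)?,?\s*\((\d{4})\)")

def cited_work(text: str) -> set:
    """Extract (surname, year) pairs from in-text citations in a posting."""
    return {(name, int(year)) for name, year in CITATION.findall(text)}

def assess_research(text: str, current_year: int = 2012, recent_within: int = 2) -> dict:
    """Flag recent citations and citations not already covered in lectures."""
    cites = cited_work(text)
    recent = {c for c in cites if current_year - c[1] <= recent_within}
    novel = cites - LECTURE_REFS
    return {"cited": cites, "recent": recent, "new": novel}

post = ("Building on Mason (1991), Swan et al. (2008) showed that sharing "
        "criteria increased participation; Gafni & Geri (2010) agree.")
result = assess_research(post)
```

This only highlights candidates for the marker; it says nothing about how well the student has used the cited work, which still requires human judgement.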
3.1.2 Reflection
A study by Mair and Taylor (2011) set out to identify whether students were reflecting online and, if so, how deeply. A content analysis was conducted on discussion transcripts using the four types of reflective writing identified by Hatton and Smith (1995). The study found that the level of reflection within the postings became deeper over time. Although more reflection was occurring in the early discussions, the majority of those postings were classed, according to Hatton and Smith (1995), as level 1 (merely reports events with no attempt to provide reasons) or level 2 (provides reasons, often based on personal judgement), while later discussions contained deeper, dialogic and critical reflections, that is, more at Hatton and Smith’s level 3 (discourse with one’s self, mulling over reasons, exploring alternatives). Although this study highlighted the method as a potential way to categorise reflections, it has not been used for assessment, due to time constraints.

3.1.3 Leadership skills
A key skill of an online leader is to encourage an atmosphere of trust and collaboration, and this has been linked to the concept of social presence. Rourke et al. (1999), drawing on the Community of Inquiry work above, have produced a template to assess social presence through content analysis. The usability of the template for educators is enhanced through the provision of selections of coded transcripts, and inter-rater reliability figures illustrate the validity of this template. However, this article is of most use for conference moderators and researchers, as the focus is on setting up and encouraging social presence rather than the assessment of this factor.

3.2 Software to assist analysis, assessment and feedback
Over the last five years, there has been an explosion in the use of computer software to analyse text, and there are literally hundreds of software products available which can assist analysis of online text (e.g. kdnuggets). Many packages are developed for use in specific fields and contain features appropriate to the type of discourse being analysed, for example: politics (e.g. Hopkins & King, 2010, look at political speeches and campaigns); health (e.g. Kim, 2009, evaluated cancer blogs); and those commercially available for marketing and advertising. There are relatively few packages developed for pedagogic use or for use in the social sciences. A review of papers published over the last two years found that the packages used most often in the social sciences include: Linguistic Inquiry and Word Count (LIWC); QSR NVivo; TAMS Analyzer (Text Analysis Mark-up System); ATLAS.ti; TextSTAT; and Ranks NL. These range in features, from those which produce word frequency lists and concordances to those with powerful search possibilities (e.g. to identify regular expressions or phrases).
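The two simplest capabilities in that range, word frequency lists and concordances, need no specialist package at all; a minimal sketch in Python (the sample transcript is invented for illustration):

```python
import re
from collections import Counter

def word_frequencies(text: str) -> Counter:
    """Frequency list of lower-cased word tokens."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def concordance(text: str, term: str, width: int = 25) -> list:
    """Return each occurrence of `term` with surrounding context (a KWIC-style view)."""
    lines = []
    for m in re.finditer(re.escape(term), text, re.IGNORECASE):
        start, end = m.start(), m.end()
        lines.append(text[max(0, start - width):end + width].replace("\n", " "))
    return lines

transcript = ("I agree with the point about trust. Trust takes time to build "
              "in a virtual team, and trust is easily lost.")
freq = word_frequencies(transcript)      # freq["trust"] == 3
hits = concordance(transcript, "trust")  # three context snippets
```

Counting and locating terms is, of course, exactly the level at which such tools stop: deciding whether a mention of "trust" reflects critical engagement remains a job for the marker.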
Despite the advances in methods for automated content analysis in the field of media analysis, most methods are only able to highlight and count instances of pre-specified words or phrases, and we are a long way from automated assessment of, for example, critical thinking. One potential package that could be used immediately is the free, open-source template NodeXL, which makes it easy to develop network graphs from data entered within a Microsoft® Excel® spreadsheet. This package has great potential to provide assessment feedback in a visual format.

4. Conclusion
It is clear that assessment methods need to be modernised to reflect the changes in learning activities taking place as a result of using the interactive and collaborative technologies made possible by Web 2.0. Additionally, updated methods need to consider the experiences and expectations of the current generation of students (Taylor, 2011).

McGaw (2009) highlights a number of new 21st century skills which are developed through interactions using social media. However, a theme throughout this article is that the assessment of these skills remains a challenging area for educators. The potential for partially automating assessment through the use of QCA has been proposed; however, we are a long way from this. In addition to the difficulties in operationalising concepts such as reflection and evaluation, new systems need to account for the different nature and style of online academic discourse, for example, the different level of formality and different cultural styles of interaction, and to treat fairly those students with additional learning needs (such as dyslexia).

Correspondence
Associate Professor Jacqui Taylor
Psychology Research Centre,
Bournemouth University,
Poole, BH12 5BB.
Email: [email protected]

References
Coutinho, S.A. (2007). The relationship between goals, metacognition and academic success. Educate, 7(1), 39–47.
Dewey, J. (1939). Experience, knowledge and value: A rejoinder. In P. Schilpp (Ed.), The philosophy of John Dewey (pp.517–608). Evanston, IL: Northwestern University.
Gafni, R. & Geri, N. (2010). The value of collaborative e-learning: Compulsory versus optional online forum assignments. Interdisciplinary Journal of E-Learning and Learning Objects, 6, 335–343.
Garrison, D.R., Anderson, T. & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7–23.
Garrison, D.R., Cleveland-Innes, M., Koole, M. & Kappelman, J. (2006). Revisiting methodological issues in the analysis of transcripts: Negotiated coding and reliability. The Internet and Higher Education, 9(1), 1–8.
Garrison, D.R. & Akyol, Z. (2011). Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology, 42(2), 233–250.
Hatton, N. & Smith, D. (1995). Reflection in teacher education: Towards definition and implementation. Teaching and Teacher Education, 11, 33–49.
Hazari, S. (2004). Strategy for the assessment of online course discussion. Journal of Information Systems Education, 349–355.
Heckman, R. & Annabi, H. (2005). A content analytic comparison of learning processes in online and face-to-face case study discussions. Journal of Computer-Mediated Communication, 10(2).
Hopkins, D.J. & King, G. (2010). A method of automated non-parametric content analysis for social science. American Journal of Political Science, 54(1), 229–247.
Jarvenpaa, S. & Leidner, D. (1998). Communication and trust in global virtual teams. Journal of Computer-Mediated Communication, 3(4).
Kanuka, H., Rourke, L. & Laflamme, E. (2007). The influence of instructional methods on the quality of online discussion. British Journal of Educational Technology, 38, 260–271.
Kim, S. (2009). Content analysis of cancer blog posts. Journal of the Medical Library Association, 97(4), 260–266.
Mair, C. & Taylor, J. (2011). Critical reflection and discussion facilitated by a virtual learning environment across two universities. Paper presented at the International Technology, Education and Development (INTED) Conference, 7–9 March, Valencia, Spain.
Mason, R. (1991). Evaluation methodologies for computer conferencing applications. In A.R. Kaye (Ed.), Collaborative learning through computer conferencing. Berlin: Springer-Verlag.
McGaw, B. (2009). Transforming education: Assessing and teaching 21st century skills. Accessed 9 February 2012, from: http://education2020.wikispaces.com/file/view/Transformative_Assessment-A_Call_to_Action_and_Action.pdf
Newman, G., Webb, B. & Cochrane, C. (1995). A content analysis method to measure critical thinking in face-to-face and computer-supported group learning. Interpersonal Computing and Technology, 3(2), 56–77.
Rourke, L., Anderson, T., Garrison, D.R. & Archer, W. (2001). Methodological issues in the content analysis of computer conference transcripts. International Journal of Artificial Intelligence in Education, 12, 8–22.
Rourke, L. & Anderson, T. (2004). Validity in quantitative content analysis. Educational Technology Research and Development, 52(1), 5–18.
Rourke, L., Anderson, T., Archer, W. & Garrison, R. (1999). Assessing social presence in asynchronous computer conferencing transcripts. Journal of Distance Education, 14(2), 50–71.
Seale, J.K. & Cann, A.J. (2000). Reflection online or offline: The role of learning technologies in encouraging students to reflect. Computers and Education, 34, 309–320.
Swan, K., Shea, P., Richardson, J., Ice, P., Garrison, D.R., Cleveland-Innes, M. & Arbaugh, J.B. (2008). Validating a measurement tool of presence in online communities of inquiry. E-Mentor, 2(24), 1–12.
Taylor, J. (2002). A review of the use of asynchronous e-seminars in undergraduate education. In R. Hazemi & S. Hailes (Eds.), The digital university: Building a learning community (pp.125–138). London: Springer.
Taylor, J. (2011). How social media is changing students and the way they learn… and what psychology academics can do about it! Paper presented at the International Conference on the Teaching of Psychology (ICoTP), 29–31 July, Vancouver, Canada.
Vonderwell, S., Liang, X. & Alderman, K. (2009). Asynchronous discussions and assessment in online learning. Journal of Research on Technology in Education, 3(3), 309–328.

Online resources
Kdnuggets: Text Analysis, Text Mining, and Information Retrieval Software. Accessed 6 January 2012, from: http://www.kdnuggets.com/software/text.html
Linguistic Inquiry and Word Count (LIWC). Accessed 6 January 2012, from: http://www.liwc.net/
TextSTAT. Accessed 6 January 2012, from: http://neon.niederlandistik.fu-berlin.de/en/textstat/
QSR NVivo. Accessed 6 January 2012, from: http://www.qsrinternational.com
ATLAS.ti. Accessed 6 January 2012, from: http://www.atlasti.com/
Ranks NL. Accessed 6 January 2012, from: http://www.ranks.nl/
TAMS (Text Analysis Mark-up System). Accessed 6 January 2012, from: http://tamsys.sourceforge.net/
