English Language Teaching; Vol. 10, No. 8; 2017
ISSN 1916-4742  E-ISSN 1916-4750
Published by Canadian Center of Science and Education

Students' Perceptions on Using Different Listening Assessment Methods: Audio-Only and Video Media

Norazean Sulaiman, Ahmad Mazli Muhammad, Nurul Nadiah Dewi Faizul Ganapathy, Zulaikha Khairuddin & Salwa Othman
Akademi Pengajian Bahasa, Universiti Teknologi MARA, Selangor, Malaysia
Correspondence: Norazean Sulaiman, Akademi Pengajian Bahasa, Universiti Teknologi MARA, Selangor, Malaysia. E-mail: [email protected]

Received: May 29, 2017   Accepted: July 9, 2017   Online Published: July 11, 2017
doi: 10.5539/elt.v10n8p93   URL: http://doi.org/10.5539/elt.v10n8p93

Abstract

The importance and usefulness of incorporating video media elements to teach listening have become part of the general understanding and are now commonplace in academia (Alonso, 2013; Macwan, 2015; Garcia, 2012). Hence, it is of vital importance that students are taught effectively and assessed accordingly on their listening skills. The purpose of this study is to examine students' perceptions of the audio-only method and the video media method in listening assessment. The participants were 150 students from four different faculties. Data were collected through a pre-test and a post-test that used the same set of questions with two different assessment methods. The results indicated that the majority of the participants had a positive response towards the use of video media as their listening assessment method, as it provides authentic, meaningful, real-life contexts. Video has been used as a tool to cater to the needs of 21st century learners, as these learners are exposed to a great deal of visual material in their daily lives. More video media related assessments should be implemented in second language (L2) classrooms so that students become more familiar with the different types of assessments present these days. In light of this notion, curriculum developers should be aware of advances in technology and be ready to invest in changes.

Keywords: perceptions, listening assessments, audio-only, video media

1. Introduction

Over the past decades, an intense interest in second language (L2) learning, testing, and teaching of listening skills has bloomed and continues to blossom in academia (Macwan, 2015; Dolati & Richards, 2011; Meskill, 1996; M. J. Kenning & M. M. Kenning, 1984). At present, listening assessment is conducted using the traditional, audio-only method, which does not portray real-life situations or authentic scenarios. Due to limited vocabulary and low proficiency in the English language, it was observed that students did not perform well with the traditional method. Based on the feedback obtained from the VAK questionnaire, it was discovered that students were more inclined towards visual and kinaesthetic learning styles. Hence, this study hypothesised that students would prefer video media as their listening assessment method. This is also due to the rapid growth of communication tools such as Proxy Site, Redkix, Quora, FaceTime, Smule, Snapchat, WhatsApp, etc., which are in trend among young learners nowadays. Consequently, this strengthens the case that instructors need to provide students with authentic, meaningful and functional listening materials for assessment, to cater to the needs of these millennials.
Thus, this study aims to examine students' perceptions of the use of audio-only and video media as their listening assessment methods.

2. Literature Review

2.1 Students' Perceptions towards Different Listening Assessment Methods

In the world of teaching and learning, students' perceptions are crucial because teachers, lecturers and instructors need to take students' preferences into consideration before they develop teaching materials and lessons (Bulut & Üğüten, 2003). Apart from that, instructors also need to consider students' prior knowledge, even though this can be a challenge for lecturers, as reported in the Manual for Language Test Development and Examining (Council of Europe, 2011). The same applies when instructors set question papers. Instructors need to know the objectives of the lessons because, in evaluating students' knowledge, they need to make sure that students have knowledge of the subject matter, which relates to the issue of test validity (Reynisdóttir, 2016; Alonso, 2013).

Students have different perceptions when their instructors use different approaches and methods in class (McKittrick, Mitchum, & Spangler, 2014). According to Hasan and Tan (2012), students enjoyed using podcasts as a listening tool, because podcasts enhanced their listening comprehension and were, at the same time, something current. In other words, students would prefer their instructors to use something different, authentic, and interesting in class. Besides, the integration of technology would help instructors gauge students' interest, and it can also be applied in listening assessment; this claim is supported by Ismaili (2013). Furthermore, Hassan, Mohd Mokhtar and Zainal Abiddin (2014) determined that students performed better with the traditional method than with the online method because the instructions in the traditional method were clearer.

Studies by Mirvan (2013), Woottipong (2014) and Sarani, Behtash and Arani (2014) highlighted that students' perceptions of using video media were positive: their listening comprehension improved and they became more interested in learning the skill and the language. This is due to the use of authentic language in the video. These three recent studies parallel one of the pioneering studies by Gruba (1997), which indicated that students respond positively to the use of video media. In fact, students also preferred video evaluation over written and audio evaluation; a study by McCarthy (2015) supported this claim. The students preferred video evaluation because it reflects face-to-face interaction. Moreover, a report from the UNESCO Bureau of Education (2013) stated that students perform better in assessment and evaluation when they have truly understood the purpose and process of that particular assessment or evaluation (Muskin, 2015). This is in line with the work of Bailey and Wolf (2012), which highlighted the alignment of language assessment.

3. Method

3.1 Research Design

A quantitative approach was utilised to gather data for the current study. Four major groups with an equal level of proficiency, drawn from four different faculties across the campuses of Universiti Teknologi MARA (University M), were selected as the focus of the study.
These four groups were given two sets of similar listening assessments. The participants consisted of 150 semester one students from four faculties: the Faculty of Film, Theatre and Animation (FiTA), the Faculty of Sports Science and Recreation (FSSR), the Faculty of Music (FMu), and the Faculty of Art and Design (FAD). The instruments used for this study were the Visual, Auditory, Kinaesthetic (VAK) questionnaire, a Multiple Choice Question (MCQ) sheet, and a questionnaire.

3.2 Method of Data Collection

In this study, a pre-test and a post-test (audio-only and video media methods) were used to collect the data. First, all four groups listened to a selected audio recording and completed a set of listening assessment questions based on the audio played. In the second session, students answered the same set of listening assessment questions, this time accompanied by a video.

3.3 Methods of Data Analysis

The data collected were analysed using the Statistical Package for the Social Sciences (SPSS). Two kinds of statistical analysis were used to determine the results of the study: descriptive statistics and inferential statistics. Means, standard deviations, and a t-test were used to analyse the data (a brief illustrative sketch of this type of analysis is provided after the discussion of the Table 1 results below).

4. Results

Table 1 presents the mean scores of four variables, namely 'Authenticity', 'Students' consultation', 'Students' capabilities', and 'Test difficulty', for FiTA, FSSR, FMu, and FAD. With regard to authenticity, the highest mean score obtained for audio was from FMu with 4.25 (SD = .53), followed by FSSR with 4.14 (SD = .51), FAD with 3.60 (SD = .44), and FiTA with 1.10 (SD = .16). In terms of students' consultation, the highest mean score obtained was from FSSR with 4.27 (SD = .49), followed by FMu with 4.14 (SD = .59), FiTA with 4.12 (SD = .57), and FAD with 4.07 (SD = .49). Meanwhile, the highest mean score obtained for audio with regard to students' capabilities was from FMu with 4.05 (SD = .77), followed by FSSR with 3.97 (SD = .62), FAD with 3.71 (SD = .06), and lastly FiTA with 3.64 (SD = .65). On the other hand, the highest mean score obtained for audio for test difficulty was from FSSR with 4.11 (SD = 1.3), followed by FiTA with 3.86 (SD = .94), FMu with 3.39 (SD = 1.26), and FAD with 3.28 (SD = 1.3).

Table 1. Students' perceptions on audio only and video media methods

Faculty  Variable                 Mean (Audio)  Mean (Video)  SD (Audio)  SD (Video)
FiTA     Authenticity             1.10          3.81          .16         .61
         Students' consultation   4.12          3.92          .57         .76
         Students' capabilities   3.64          3.56          .65         .68
         Test difficulty          3.86          3.81          .94         1.14
FSSR     Authenticity             4.14          4.02          .51         .72
         Students' consultation   4.27          4.08          .49         .72
         Students' capabilities   3.97          3.92          .62         .81
         Test difficulty          4.11          3.94          1.3         1.41
FMu      Authenticity             4.25          4.25          .53         .61
         Students' consultation   4.14          4.25          .59         .60
         Students' capabilities   4.05          4.05          .77         .77
         Test difficulty          3.39          3.25          1.26        1.42
FAD      Authenticity             3.60          3.83          .44         .59
         Students' consultation   4.07          4.05          .49         .38
         Students' capabilities   3.71          3.77          .06         .61
         Test difficulty          3.28          2.93          1.3         1.56

As for the video media assessment, the highest mean score for authenticity was from FMu with 4.25 (SD = .61), followed by FSSR with 4.02 (SD = .72), FAD with 3.83 (SD = .59), and FiTA with 3.81 (SD = .61). With regard to students' consultation, the highest mean score attained was from FMu with 4.25 (SD = .60), followed by FSSR with 4.08 (SD = .72), FAD with 4.05 (SD = .38), and FiTA with 3.92 (SD = .76).
The highest mean score attained for students' capabilities was from FMu with 4.05 (SD = .77), followed by FSSR with 3.92 (SD = .81), FAD with 3.77 (SD = .61), and FiTA with 3.56 (SD = .68). Meanwhile, for test difficulty, the scores obtained were 3.94 (SD = 1.41) for FSSR, 3.81 (SD = 1.14) for FiTA, 3.25 (SD = 1.42) for FMu, and 2.93 (SD = 1.56) for FAD.

With regard to authenticity, the findings indicated that the majority of the participants from two faculties (FiTA and FAD) felt that the use of video is more authentic than audio for their listening assessments. The use of conversational videos as their listening assessment material is considered authentic and functional, as it reflects real-life situations. This relates to their visual predisposition, as can be seen in their choice to major in the visual arts. This perception contrasts with that of the participants from the other two faculties: participants from FMu perceived both video media and audio-only as authentic, whereas participants from FSSR perceived audio-only as more authentic. The studies by Mirvan (2013) and Woottipong (2014) affirmed that the use of video in the classroom motivated students to learn and participate more during the learning session due to the real-life situations portrayed in the video, which are considered natural, authentic and meaningful.

As for students' consultation, this relates to the students' previous knowledge about the traditional and video media methods and to the instructor's explanation about the test. The mean scores obtained from the majority of the students show that participants understood the audio-only method better than the video media method. The majority of the participants understood the audio-only instructions better because they are familiar with the traditional listening assessment method as compared to the video assessment method. This finding parallels the research by Hassan, Mohd Mokhtar and Zainal Abiddin (2014), which also indicated that their participants understood the instructions better in the traditional method than in the online method used as their listening assessment method.

The results for students' capabilities indicated that students believe they can score well with either the audio-only or the video media method; there was only a slight difference in the mean scores obtained for the two methods. Participants from FiTA and FSSR believed they could perform better using the audio-only method, in contrast with FAD students, who thought they could score better using video media. As for FMu participants, they believed they could score equally well with both methods. Based on the results obtained, students' acceptance of the new listening assessment method, the video media method, is positive. According to Hasan and Tan (2012), students prefer their instructors to use something different in class because it enhances students' listening comprehension and is, at the same time, something current. Thus, it shows that students perceived new methods for listening assessment positively.

In terms of test difficulty, the majority of the participants from all four faculties felt that the video media assessment was easier for them due to the presence of the video in facilitating the test. Both tests used the same materials, and the students perceived the video media method as easier since they could also watch the video during the test and therefore see the non-verbal aspects presented in the conversation. Woottipong (2014), Sarani, Behtash and Arani (2014), and McCarthy (2015) also found in their studies that students believe they can score well using video media due to the functional language and authentic context in the video.
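Section 3.3 states only that SPSS was used, with descriptive statistics (the means and standard deviations summarised in Table 1) and a t-test. As a rough illustration of what such an analysis could look like outside SPSS, the Python sketch below computes per-method descriptive statistics and a paired-samples t-test; the scores, the group size, and the choice of a paired design are illustrative assumptions and do not reproduce the study's data or output.

```python
# Illustrative sketch of the Section 3.3 analysis: descriptive statistics
# (mean, SD) for each assessment method, then an inferential test.
# The scores below are hypothetical 5-point Likert ratings, NOT the study's data.
import numpy as np
from scipy import stats

# Hypothetical perception scores for one variable (e.g., "authenticity"),
# rated by the same ten students under the audio-only and video methods.
audio_scores = np.array([4, 3, 4, 5, 3, 4, 2, 4, 3, 4])
video_scores = np.array([5, 4, 4, 5, 4, 4, 3, 5, 4, 4])

# Descriptive statistics, analogous to the means and SDs reported in Table 1.
for label, scores in (("Audio", audio_scores), ("Video", video_scores)):
    print(f"{label}: mean = {scores.mean():.2f}, SD = {scores.std(ddof=1):.2f}")

# Inferential statistics: a paired-samples t-test is assumed here because
# the same students rated both assessment methods.
t_stat, p_value = stats.ttest_rel(audio_scores, video_scores)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```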
Table 2 shows students' perceptions of test validity for the audio-only assessment. The majority of the participants from the four faculties agreed that the instructions and test questions were clear and easy to understand, with 85.7% from FiTA, 98% from FSSR, 89.5% from FMu, and 85.7% from FAD. Next, the results showed that participants from FiTA, FSSR, FMu and FAD agreed that the level of test difficulty matches the level of students' language proficiency, with 97.1%, 96%, 94.4%, and 85.7% respectively. Participants from the respective faculties also agreed that test content should relate to daily life, world knowledge, general or workplace issues: 97.1% for FiTA, 100% for FSSR, 94.7% for FMu, and 100% for FAD. Next, the majority of the participants, with 100% from FiTA, 94.0% from FSSR, 94.7% from FMu and 100% from FAD, agreed that test content should be up-to-date and interesting so that students can learn in meaningful contexts. Agreement that test items should be sourced from different fields of study was more mixed, with 77.1% from FiTA, 40% from FSSR, 88.9% from FMu and 85.7% from FAD. Finally, the majority of the participants also agreed that the difficulty level of tests should suit students' level of proficiency, with 88.8% from FiTA, 96.0% from FSSR, 94.7% from FMu and 92.9% from FAD.

Table 2. Test validity for audio (frequency of Yes/No responses, with percentages)

1. Instructions and test questions are clear and easy to understand.
   FiTA: 30/5 (85.7%/14.3%); FSSR: 49/1 (98.0%/2.0%); FMu: 17/2 (89.5%/10.5%); FAD: 12/2 (85.7%/14.3%)
2. The level of test difficulty matches the level of students' language proficiency.
   FiTA: 34/1 (97.1%/2.9%); FSSR: 48/2 (96.0%/4.0%); FMu: 17/1 (94.4%/5.6%); FAD: 12/2 (85.7%/14.3%)
3. Test content should relate to daily life, world knowledge, general, or workplace issues.
   FiTA: 34/1 (97.1%/2.9%); FSSR: 50/0 (100%/0%); FMu: 18/1 (94.7%/5.3%); FAD: 14/0 (100%/0%)
4. Test content should be up-to-date and interesting so that students can learn in meaningful contexts.
   FiTA: 35/0 (100%/0%); FSSR: 47/3 (94.0%/6.0%); FMu: 18/1 (94.7%/5.3%); FAD: 14/0 (100%/0%)
5. Test items should be sourced from different fields of study.
   FiTA: 27/9 (77.1%/22.9%); FSSR: 20/30 (40.0%/60.0%); FMu: 16/2 (88.9%/11.1%); FAD: 12/2 (85.7%/14.3%)
6. The difficulty level of tests should suit students' level of proficiency.
   FiTA: 31/4 (88.8%/11.4%); FSSR: 48/2 (96.0%/4.0%); FMu: 18/1 (94.7%/5.3%); FAD: 13/1 (92.9%/7.1%)
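The percentages in Tables 2 and 3 are simple proportions of Yes/No responses within each faculty. The short sketch below shows that conversion for the first item of Table 2; the helper function is purely illustrative and not part of the study's reported procedure.

```python
# Convert raw Yes/No frequencies into the percentages reported in
# Tables 2 and 3. The counts below are the first item of Table 2
# (instructions and test questions are clear and easy to understand).
yes_no_counts = {"FiTA": (30, 5), "FSSR": (49, 1), "FMu": (17, 2), "FAD": (12, 2)}

def to_percentages(yes: int, no: int) -> tuple[float, float]:
    """Return (yes %, no %) rounded to one decimal place."""
    total = yes + no
    return round(100 * yes / total, 1), round(100 * no / total, 1)

for faculty, (yes, no) in yes_no_counts.items():
    yes_pct, no_pct = to_percentages(yes, no)
    print(f"{faculty}: Yes = {yes_pct}%, No = {no_pct}%")
```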
Table 3 shows students' perceptions of test validity for the video assessment. The majority of the participants from the four faculties agreed that the instructions and test questions were clear and easy to understand, with 91.4% from FiTA, 100% from FSSR, 94.7% from FMu, and 92.9% from FAD. The results also showed that participants agreed that the level of test difficulty matches the level of students' language proficiency, with 91.4% from FiTA, 100% from FSSR, 94.7% from FMu, and 92.3% from FAD. In terms of test content, participants from all four faculties agreed that test content should relate to daily life, world knowledge, general or workplace issues, with 97.1%, 100%, 94.7% and 92.9% respectively. With regard to up-to-date, interesting and meaningful contexts, participants from FiTA agreed with 91.4%, FSSR with 94%, FMu with 94.7% and FAD with 92.9%. The majority of the participants also agreed that test items should be sourced from different fields of study, with 82.9% from FiTA, 54% from FSSR, 94.7% from FMu and 76.9% from FAD. In terms of test difficulty, 97.1% from FiTA, 92% from FSSR, 94.7% from FMu and 92.9% from FAD agreed that the difficulty level of tests should suit students' level of proficiency.

Table 3. Test validity for video (frequency of Yes/No responses, with percentages)

1. Instructions and test questions are clear and easy to understand.
   FiTA: 32/2 (91.4%/8.6%); FSSR: 50/0 (100.0%/0.0%); FMu: 18/1 (94.7%/5.3%); FAD: 13/1 (92.9%/7.1%)
2. The level of test difficulty matches the level of students' language proficiency.
   FiTA: 32/3 (91.4%/8.6%); FSSR: 50/0 (100.0%/0.0%); FMu: 18/1 (94.7%/5.3%); FAD: 12/1 (92.3%/7.7%)
3. Test content should relate to daily life, world knowledge, general, or workplace issues.
   FiTA: 34/1 (97.1%/2.9%); FSSR: 50/0 (100.0%/0.0%); FMu: 18/1 (94.7%/5.3%); FAD: 13/1 (92.9%/7.1%)
4. Test content should be up-to-date and interesting so that students can learn in meaningful contexts.
   FiTA: 32/3 (91.4%/8.6%); FSSR: 47/3 (94.0%/6.0%); FMu: 18/1 (94.7%/5.3%); FAD: 13/1 (92.9%/7.1%)
5. Test items should be sourced from different fields of study.
   FiTA: 29/6 (82.9%/17.1%); FSSR: 27/23 (54.0%/46.0%); FMu: 18/1 (94.7%/5.3%); FAD: 10/3 (76.9%/23.1%)
6. The difficulty level of tests should suit students' level of proficiency.
   FiTA: 34/1 (97.1%/2.9%); FSSR: 46/2 (92.0%/8.0%); FMu: 18/1 (94.7%/5.3%); FAD: 13/1 (92.9%/7.1%)

5. Discussion

It can be seen here that, in terms of the instructions given, students were clear about and able to comprehend the instructions stated for all parts of the question sheet. This was supported by the presence of the lecturer and the instructor during the test. This is aligned with the UNESCO Bureau of Education report (2013), in which students perform better in their assessment and evaluation when they have truly understood the purpose and process of the particular assessment or evaluation (Muskin, 2015).

In terms of the level of test difficulty, participants at large agreed that the assessment should match the level of their language proficiency; hence, test questions should be prepared accordingly. This is in line with the work of Bailey and Wolf (2012), which highlighted the alignment of language assessment. The researchers claimed that a strong alignment between standards and assessments will ensure meaningful and accurate measures of students' language achievement.

Next, the majority of students in the current research believed that test content should relate to daily life, world knowledge, general or workplace issues. This clearly shows that the students favour authentic contexts, which are more interesting and relevant to them. This is in accordance with what Alonso (2013) mentioned in his work: the aim of teaching listening comprehension is to help learners of English cope with listening in real life. This stands in total contrast to the case of students at University M, where the content of the present listening test is not authentic and does not reflect real-life situations. On top of that, participants at large agreed that test content should be up-to-date and interesting so that students can learn in meaningful contexts. This is crucial, as listening assessments need to match students' interests; students understand better when their listening assessment relates to current trends and real-life situations.
In accordance with the latter point, Ismaili (2013) mentioned that students were less attracted to using course-books than to watching movies. Moreover, with regard to the test items, participants also agreed that test items should be sourced from different fields of study. Hence, this is a challenge for lecturers: as reported in the Manual for Language Test Development and Examining (Council of Europe) and by UNESCO (2015), the design phase of an assessment is the ultimate challenge in preparing a sturdy and well-considered listening assessment. Overall, from the feedback received from the participants, the researchers of the current study believe that various elements are involved in producing a proper assessment, including the test instructions, the level of test difficulty, the test content, the test items, and the students' level of proficiency and ability.

6. Conclusion and Recommendations

According to the students' perceptions, the majority of the participants perceived video media as a great tool for their listening assessment, as it provides authentic, meaningful, real-life contexts. However, more consultation or instruction on video media should be given to students, as they have had less exposure to the use of video for their listening assessment. Students were familiar only with the traditional (audio-only) method, which has been used to assess their listening skills from primary school to tertiary level. Gruba (1997) foresaw the potential of using video media in listening assessments. 1997 was only the beginning of the new video media era and the introduction of the Internet; after 20 years, why are we still using the traditional way of assessing listening skills? Students should be given more exposure to natural and authentic conversations, something that should have happened a long time ago.

Video has been widely used as a learning material and has proven to be a great tool for language learning. Video provides natural, meaningful, and authentic contexts for learners to learn better. It has been used as a tool to cater to the needs of 21st century learners, as these learners are exposed to a great deal of visual material in their daily lives. Learning also takes place outside the classroom; thus, instructors need to bring outside situations and scenarios into the classroom.

In the teaching of listening in the L2 classroom, the use of audio-visual aids has come a long way, and changes in patterns, approaches and materials are quite apparent these days. Hence, it is of vital importance that students are taught effectively and assessed accordingly on their listening skills. Moreover, it is also crucial for educators or instructors to ponder the repercussions of implementing authentic materials in their lessons. Specifically, it is essential that instructors realize the importance of in-depth planning and training when implementing authentic assessment in the classroom. It is challenging to construct a practical and authentic assessment in any classroom, but it is much tougher in the L2 classroom. Therefore, instructors could be particularly specific in providing clear instructions for each assessment, and they should be mindful of the contexts chosen for the audio-visuals used in their listening assessments.
On top of that, more video media related assessments should be implemented in L2 classrooms so that students become more familiar with the different types of assessments present these days. In light of this notion, curriculum developers should be aware of advances in technology and be ready to invest in changes.

Acknowledgments

We would like to thank our educational institution for supporting us financially through the ARAS Grant. We would also like to thank our team members and colleagues who have given us a lot of advice and feedback in completing this paper.

References

Alonso, R. S. (2013). The importance of teaching listening and speaking skills. Retrieved from https://www.ucm.es/data/cont/docs/119-2015-03-17-12.RocioSeguraAlonso2013.pdf

Bailey, A. L., & Wolf, M. K. (2012). The challenge of assessing language proficiency aligned to the Common Core State Standards and some possible solutions. Language, Literacy and Learning in the Content Area, 1-9.

Bulut, T., & Üğüten, S. D. (2003). The importance of student perceptions in language teaching. Retrieved from http://dergipark.gov.tr/download/article-file/50135

Dolati, R., & Richards, C. (2011). Harnessing the use of visual learning aids in the English language classroom. Arab World English Journal.

Garcia, M. R. (2012). Usage of multimedia visual aids in the English language classroom. Retrieved from https://www.ucm.es/data/cont/docs/119-2015-03-17-11.MariaRamirezGarcia2013.pdf

Gruba, P. (1997). The role of video media in listening assessment. System: An International Journal of Educational Technology and Applied Linguistics, 335. https://doi.org/10.1016/s0346-251x(97)00026-2

Hassan, A., Mohd Mokhtar, N., & Zainal Abiddin, N. (2014). Reflecting the process of teaching and listening in two different approaches in educational philosophy. Journal of Education and Learning, 70-78. https://doi.org/10.5539/jel.v3n1p70

Hasan, M., & Tan, H. B. (2012). ESL learners' perception and attitudes towards the use of podcast in developing listening skills. The English Teacher, 160-173.

Ismaili, M. (2013). The effectiveness of using movies in the EFL classroom – A study conducted at South East European University. Academic Journal of Interdisciplinary Studies, 121-132. https://doi.org/10.5901/ajis.2012.v2n4p121

Kenning, M. J., & Kenning, M. M. (1984). An Introduction to Computer Assisted Language Teaching. London: Oxford University Press.

Khoshsima, H., & Izadi, M. (2014). Dynamic vs. standard assessment to evaluate EFL learners' listening comprehension. Iranian Journal of Applied Language Studies, 1-26.

Macwan, H. (2015). Using visual aids as authentic material in ESL classrooms. Research Journal of English Language and Literature (RJELAL).

McCarthy, J. (2015). Evaluating written, audio and video feedback in higher education summative assessment tasks. Issues in Educational Research, 153-169.

McKittrick, M., Mitchum, C., & Spangler, S. R. (2014). The sound of feedback: Instructor uses and student perceptions of SoundCloud audio technology. Journal of Teaching and Learning with Technology, 40-53. https://doi.org/10.14434/jotlt.v3n2.12959

Meskill, C. (1996). Listening skills development through multimedia. Journal of Educational Multimedia and Hypermedia.

Mirvan, X. (2013). The advantages of using films to enhance student's reading skills in the EFL classroom. Journal of Education and Practice, 4(13), 62-66.
Muskin, J. (2015). Student learning assessment and the curriculum: Issues and implications for policy, design and implementation. Washington, DC: The Brookings Institution.

Reynisdóttir, B. B. (2016). The efficacy of authentic assessment. A Practical Approach to Second Language Testing, 1-31.

Sarani, A., Behtash, E. Z., & Arani, S. N. (2014). The effect of video-based tasks in listening comprehension of Iranian pre-intermediate EFL learners. Gist Education and Learning Research Journal, 29-47.

UNESCO. (2015). Education 2030: Towards inclusive and equitable quality education and lifelong learning for all. Retrieved from https://en.unesco.org/world-education-forum-2015/incheon-declaration

Woottipong, K. (2014). Effect of using video materials in the teaching of listening skills for university students. International Journal of Linguistics, 6(4), 200-212. https://doi.org/10.5296/ijl.v6i4.5870

Copyrights

Copyright for this article is retained by the author(s), with first publication rights granted to the journal. This is an open-access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).
