
www.equinoxpub.com

ISSN: 1756–5839 (print)
ISSN: 1756–5847 (online)
Writing & Pedagogy

WAP VOL 7.2/3 2015, 305–328
© 2015, EQUINOX PUBLISHING

doi: 10.1558/wap.v7i2-3.26457

Peer Assessment of Adolescent Learners’ Writing Performance

Dina Tsagari and Eleni Meletiadou

Abstract

Peer assessment (PA), a process by which students' oral or written work is assessed by peers, has received a lot of attention recently (Hansen Edwards, 2014; Harris et al., 2015). Using data collected from a secondary school in Cyprus, the current study investigated whether PA could improve the writing skills of 60 adolescent EFL students. The results showed that PA had a significantly positive impact on students' writing performance. The article discusses the important role of PA in the development of students' writing skills and offers research and practical recommendations for the implementation of PA in EFL contexts.

KEYWORDS: EFL writing, peer assessment, secondary education

Affiliation: University of Cyprus, Nicosia, Cyprus
Email: [email protected] (corresponding author)

Research Matters

306 WRITING & PEDAGOGY

Introduction

Second language (L2) writing has been studied across a variety of languages (Kaplan and Grabe, 2002) and has become a vehicle for access to knowledge, power and resources (Crowley, 1998; Leki, 2003). With the advent of process approaches to L2 writing, among others, peer assessment (PA) emerged as a way to motivate learners to become actively involved in the process of writing and improve their writing performance (Plutsky and Wilson, 2004). There are numerous L2-related benefits from applying PA, such as linguistic development, meta-cognition and cognition, social interaction, affect, and feedback (see Hansen Edwards, 2014). However, PA is not widely used in European countries despite the implementation of wide-ranging reforms that foster its use (i.e. the Common European Framework of Reference for Languages, Council of Europe, 2001; the European Language Portfolio; Morrow, 2004; Schneider and Lenz, 2001). Also, its use in writing has been limited in EFL contexts (Cheng and Warren, 2005) and secondary education (Taras, 2001; Tsivitanidou et al., 2011).

This study investigates the implementation of PA in state school secondary education in Cyprus and explores its impact on EFL students' writing performance. The current context is of particular interest as it is dominated by summative assessment practices (Tsagari, 2012; 2014), where the assessment of writing skills is product-oriented, serving mainly summative purposes, e.g. in the form of students' final grades (Meletiadou and Tsagari, 2013; Meletiadou, 2012). The use of PA in the local context is also limited due to the absence of PA from the EFL curriculum of Secondary Education (Ministry of Education and Culture, 2010). The study also aims to respond to ongoing calls for the use of PA in school systems to promote reflective thinking, self-improvement, and independent learning (Curriculum Development Council, 2004). The objective is to produce a number of pointers that can motivate good practice in the assessment of EFL writing for policymakers, syllabus designers and educators in school education in Cyprus and other countries. We hope that such recommendations will add to the existing literature and support the development of renewed cultures of language assessment in traditional examination systems (Curriculum Development Council, 2001).

Research on PA and L2 writing

Several theoretical frameworks support the use of PA activities in the writing classroom. These include process writing theory (Flower and Hayes, 1981), collaborative learning theory (Johnson and Johnson, 1994), social cognitive theory (Vygotsky, 1978), interactionist theories of second


language acquisition (SLA) (DiGiovanni and Nagaswami, 2001), cognitive constructivist theory (Piaget, 1978), and self-regulation theory (Villamil and de Guerrero, 1998) (for a detailed discussion see Hansen Edwards, 2014). Research has also highlighted the positive role of PA in EFL writing instruction, writing performance, and autonomy (Plutsky and Wilson, 2004). There are also studies in EFL university contexts which found that peer feedback, a vital PA component, improves learners' writing performance (Behjat and Yamini, 2012; Plutsky and Wilson, 2004; Richer, 1992), and that student assessors make more improvements than student assessees (Rouhi and Azizian, 2013). In addition, research also shows that a combination of teacher assessment with peer assessment improves students' L2 writing performance (Birjandi and Hadidi Tamjid, 2011; Tsui and Ng, 2000; Xiao and Lucking, 2008), and that students may gain even greater benefits compared to receiving only teacher assessment (Hyland, 2000; Lockhart and Ng, 1995; Paulus, 1999). Moreover, involving students in the assessment process through PA activities increases the number of feedback opportunities for student assessors (Gielen et al., 2010). The reliability of PA results – agreement between students' and teachers' ratings – has also been the focus of research (Orsmond et al., 1997). Findings indicate that teacher feedback can be less successful than peer feedback (Yang et al., 2006), and is associated with misinterpretation and miscommunication (Falchikov and Magin, 1997; Newstead and Dennis, 1994; Zhao, 2010).

However, other studies of PA have shown that not all students who received peer feedback on writing outperformed those who did not. One reason is that assessment criteria need to be explicit and demonstrably used in formulating grades (Gibbs and Simpson, 2004; Kluger and DeNisi, 1996). Other reasons are the limited provision of teacher training prior to using PA (Li and Steckelberg, 2004) and student assessees' hesitation to accept feedback from student assessors (Brindley and Scoffield, 1998; Orsmond et al., 1996).

Research on the impact of PA on EFL students' writing performance is ongoing (Gielen et al., 2010; Strijbos et al., 2010; Van Steendam et al., 2010; Yu and Wu, 2011; 2013) and currently concerned with the benefits student assessors and student assessees receive (Topping, 1998, 2010; Li et al., 2010; Lindblom-Ylänne et al., 2006), the effect of the combination of PA with teacher assessment in secondary education, and the impact of training students in PA procedures (Sluijsmans et al., 2004; Tsivitanidou et al., 2011; Zhao, 2010). If research can confirm the effectiveness of PA in the context of a 'multiple-draft classroom' (Ferris, 1995), this will surely influence the way teachers think about and incorporate PA practices in primary and secondary school education (Harris and Brown, 2013).


Aims and methodology

The present study investigated the potential and the challenges of implementing PA of EFL writing within the secondary school system in Cyprus. The research questions addressed are:

RQ1: Does PA have an impact on the writing skills of EFL learners in the current context?

RQ2: What is the effect of PA on the writing performance of the students receiving feedback (student assessees) as opposed to those assessing their peers (student assessors)?

RQ3: Are students' PA assessments (grades) comparable to those of the teacher?

The current study employed a quasi-experimental research design (Creswell and Plano Clark, 2007) using random assignment of participants to experimental and control groups. The study also followed a case study approach to data collection (Creswell, 2009; Yin, 2014) in order to explore the implications of PA in the EFL classroom setting of a specific school. Case study methodology was important in the current study as it provided the opportunity for an in-depth and intensive study of a single unit where the 'aim is to elucidate features of a larger class of similar phenomena' (Gerring, 2004: 341).

Research context and participants

The study took place in a State Institute in Nicosia, Cyprus. State Institutes are public educational units offering supplementary education in school subjects (Lamprianou and Afantiti Lamprianou, 2013). State Institutes are run by the Cypriot Ministry of Education and Culture and follow the same curricula as state secondary schools. In this context, EFL teaching aims to improve students' language skills with the ultimate aim of preparing them for international exams, i.e. the International General Certificate of Secondary Education (IGCSE, Cambridge International Exams), the First Certificate in English (FCE, Cambridge English Language Assessment), etc.

The participants of this study were 60 intermediate-level learners, 13–14 years old, who had the same teacher and attended an eight-month-long EFL course (mid-September 2009 to mid-May 2010). The students were all native Greek Cypriots who shared similar cultural and socio-economic backgrounds.

Their class teacher was an EFL practitioner with over 20 years of teaching experience. She held a BA in English Language and Literature and an MA in TEFL but had never employed PA in her classes before. According to the


teacher, the learners who took part in the study faced problems with their writing performance and had a negative attitude towards the assessment of writing. Finally, an external assessor, an experienced EFL teacher with a BA in English Language and Literature and an MA in Applied Linguistics, was invited to participate in the study to enhance the reliability of marking. The assessor provided marks for the second drafts of all essays after the PA implementation.

PA implementation procedures

The teacher, guided by the researchers, implemented PA practices in her EFL classes. For instance, she asked students to write three types of essays, i.e. a narrative, a descriptive essay and an informal letter, as stipulated in the official curriculum of public secondary schools. These were submitted in two drafts. The essays were written in class, without disrupting the regular teaching programme. This limited construct-irrelevant variables, such as the amount of time spent on the task at home and possible help from peers, parents or other sources. The length of the essays was approximately 150 words each. During essay writing, the teacher assumed the role of a facilitator, explaining difficult terms and giving advice when needed (O'Brien, 2004).

Before PA implementation, the teacher randomly placed the students in three mixed-ability groups (20 learners in each group). Group A (control group) received teacher feedback only. Group B (experimental group 1) received teacher feedback and peer feedback from Group C, and Group C (experimental group 2) received teacher feedback and provided feedback to Group B (see Table 1).

Table 1: Student grouping

Group A (control group):        teacher feedback only
Group B (experimental group 1): teacher feedback + peer feedback from Group C
Group C (experimental group 2): teacher feedback; provided peer feedback to Group B


The implementation of PA in class began in January 2010 and extended over a period of four months (see Table 2). The students were engaged in the experiment once a week for two teaching sessions (45 minutes each). During the PA implementation, the students wrote the three types of essays and received/provided feedback before the second draft. In the feedback sessions, students received evaluative comments from the teacher and/or their peers on a PA form (Appendix I). Group C students corrected randomly assigned Group B students' essays and spent about 20 minutes filling in the PA form during the feedback sessions, while Groups A and B worked on teacher-assigned tasks based on their coursebook. Care was taken to avoid having the same students rate their peers' essays twice. Also, the identities of the student assessor and the student assessee were kept confidential to avoid conflicts and bitterness among learners (Miller and Ng, 1994). For the same reason, oral interaction between peers was also avoided during PA activities. The time between drafts (usually one week) was considered to be sufficient for learners to redraft their essays. These measures were taken to ensure the reliability of the assessment process (Raimes, 1983).

The teacher provided corrections only on students’ second drafts so as not to interfere with the revision process. The external assessor provided marks only for the second drafts of all essays after the PA implementation and these were only available to the teacher.

Finally, before implementing PA in class, adequate training of the learners and the external assessor was provided (Jacobs, 1987; Newkirk, 1984). The rationale and content of this training will be presented in the next section.

Table 2: Overview of PA implementation

January 2010
  Week 4: First draft of narrative essay

February 2010
  Week 1: Feedback
  Week 2: Second draft of narrative essay
  Week 3: Feedback; first draft of descriptive essay
  Week 4: Feedback

March 2010
  Week 1: Second draft of descriptive essay
  Week 2: Feedback; first draft of informal letter
  Week 3: Feedback
  Week 4: Holiday

April 2010
  Week 1: Holiday
  Week 2: Second draft of informal letter
  Week 3: Feedback


Training students in PA

Implementing PA in the school system requires substantial student training and practice (Sluijsmans et al., 2002; Van Steendam et al., 2010), along with guidance and time to adapt (Boud, 2000). For the purposes of this study, the teacher organized a one-week training seminar for her students under the guidance of the current researchers and prior to PA implementation. The training was offered at the beginning of the research project (January 2010), extending over three 50-minute teaching lessons. All three groups participated with the aim of building a shared understanding of the nature, purposes and requirements of the research project. Students also used criteria and rubrics prior to PA implementation as a way to help them learn most effectively through PA (Andrade, 2010; Gielen et al., 2010). The three groups were trained together, but special attention was paid so that students of the experimental Groups B and C would develop assessment criteria for peer feedback (Patri, 2002) and be trained in revising drafts (Xiao and Lucking, 2008). When students did not participate in PA training activities, they were asked to work on coursebook tasks or homework. The training period comprised a number of different steps for the three groups (see Table 3).

Step 1 – Initial phase: The overall purpose of the project and the value of teacher feedback were explained to all three groups (Rollinson, 2005). The importance of and concerns about peer feedback in the context of PA of writing were discussed briefly with Groups B and C, with the teacher providing reasons as to why peers of equal status are able to provide helpful feedback.

Step 2 – Revision strategies: All groups were exposed to modelling of adequate and inadequate revision strategies to address students' difficulties in systematically revising their drafts based on peer comments. Students were later asked to use their newly acquired revision strategies on sample essays.

Table 3: Training steps

Training steps                                 Group A   Group B   Group C
1. Initial phase                                  ✓         ✓         ✓
2. Revision strategies                            ✓         ✓         ✓
3. Revision and use of PA criteria                          ✓
4. Use of model texts                             ✓         ✓         ✓
5. Design of the PA form                                    ✓         ✓
6. Mock PA: revision, rating and commenting                 ✓         ✓


Step 3 – Revision and use of PA criteria (Group B students only): Students were introduced to the PA criteria presented in a sample PA form. The ensuing discussion focused on the assessee's sense of obligation to accept or reject feedback.

Step 4 – Use of model texts: All groups worked with model texts to identify strong and weak points in writing. Typical features of narrative, descriptive and informal letter writing were discussed.

Step 5 – Design of the PA form: The teacher and the student assessors and assessees (Groups B and C) jointly worked towards the design of a PA form to minimize marking and friendship biases and psychological resistance (Fletcher and Baldry, 2000; Morahan-Martin, 1996). The teacher engaged students in setting clear criteria using sample checklists as paradigms (White and Arndt, 1991) and guidelines for peer response (Project, 1990), and in developing familiarity with the instruments used (Chapelle and Brindley, 2002). The criteria developed were grouped into categories and aligned with the official writing criteria of the Ministry of Education. The weighting of the marks for each category was also negotiated with the learners, thus creating a sense of ownership of the PA form (Falchikov and Goldfinch, 2000). The PA form designed (Appendix I) was used during the study to provide feedback to Group B on the three essays. (On a later occasion, the external assessor also received 40 minutes' training on using the assessment criteria before marking the essays.)

Step 6 – Mock PA: revising, rating, commenting (Groups B and C only): The aim was to raise students' awareness of what they should be looking for in a written text and thus transfer this knowledge to their own texts (Nicol and Milligan, 2006). Student assessees (Group B) were asked to revise three sample essays (with errors in all the areas included in the PA form). Student assessors (Group C) received the same three sample essays to do mock rating using the PA form. The students' ratings were discussed in class, and any differences with the teacher's ratings and comments were clarified to avoid any future misunderstandings.

Data collection instrument

The PA form (Appendix I) was the main instrument used for giving and receiving feedback in the current study. The statements used in the PA feedback form were kept as simple as possible to correspond to the students' age and cognitive abilities. Being simple and 'procedural' in nature, the PA form was expected to provide learners with basic guidelines for giving feedback on peers' drafts and, consequently, for revising their own drafts critically and effectively (Johns, 1986; Lamberg, 1980). The form contained statements in both English and Greek, the latter being the students' mother


tongue, to facilitate students' understanding of the criteria and save time during PA activities. The form was reviewed by two experienced EFL teachers who checked its appropriateness for the particular learning group and context (as recommended by Karegianes et al., 1980; MacArthur et al., 1991). This resulted in further simplification of terms and statements, and changes in the format of the instrument.

The final PA form was a structured-response type of rubric in the form of a checklist consisting of four broad categories and 16 closed-ended statements. It employed a four-point Likert-type scale ranging from 'excellent' to 'very poor'. The first two categories focused on global writing concerns (i.e. content and organization), while the latter two dealt with local writing concerns (i.e. vocabulary, language usage and mechanics). In the end, students allocated different marks to each category, e.g. content: 5 points, organization: 4 points, vocabulary and language usage: 6 points, and mechanics: 5 points. More marks were allocated to language usage because at this level (intermediate), accuracy seems slightly more important than fluency as learners are still mastering the grammar and the syntax of the language.
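The four-point ratings on the form can be combined into the 20-mark total in more than one way; the sketch below shows one plausible scheme, scaling each category's negotiated weight by the average of its statement ratings. The mapping of the two intermediate scale labels ('good', 'poor') and the per-statement structure are assumptions for illustration, not the paper's exact formula.

```python
# Category weights negotiated with the learners (total = 20 marks).
WEIGHTS = {
    "content": 5,
    "organization": 4,
    "vocabulary and language usage": 6,
    "mechanics": 5,
}

# Four-point scale mapped to a 0-1 multiplier. The two intermediate
# labels are assumptions; the paper names only the endpoints
# ('excellent' ... 'very poor').
SCALE = {"excellent": 1.0, "good": 2 / 3, "poor": 1 / 3, "very poor": 0.0}

def essay_mark(ratings):
    """ratings: category -> list of per-statement Likert labels.

    Returns a mark out of 20: each category's weight scaled by the
    average of its statement ratings.
    """
    total = 0.0
    for category, weight in WEIGHTS.items():
        labels = ratings[category]
        avg = sum(SCALE[label] for label in labels) / len(labels)
        total += avg * weight
    return round(total, 1)

sample = {
    "content": ["good", "excellent", "good", "good"],
    "organization": ["excellent", "excellent", "good", "good"],
    "vocabulary and language usage": ["good", "poor", "good", "good"],
    "mechanics": ["excellent", "good", "good", "good"],
}
print(essay_mark(sample))  # a mark out of 20
```

A form where every statement is rated 'excellent' yields the full 20 marks; all 'very poor' yields 0.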

Data analysis and results

Data collected from the teacher, the students, the external assessor and the PA form were analysed. Descriptive statistics of the teacher's grades (marks on all three essay drafts) were calculated (see Table 4) (Berg, 1999). This helped compare the writing performance of Groups B and C with that of Group A (RQ1).

The results showed that students of Groups B and C received higher mean improvement scores (MI) across the three essays compared to students in the control group (Group A). Also, despite expectations of improvement over time, mean improvement in the third essay for Group A (receiving teacher feedback only) was extremely low (MI = 0.05), unlike the mean scores for Groups B and C, which were considerably higher (MI = 3.0 and MI = 3.2 respectively). The results also showed that the difference between the control group (Group A: X̄ = −0.08, SD = 1.23) and

Table 4: Mean Improvement (MI) based on teacher's scores per essay and group

          Group A (TA only)    Group B (Student assessees)    Group C (Student assessors)
          MI      SD           MI      SD                     MI      SD
Essay 1   0.05    0.94         1.3     1.86                   1.90    1.77
Essay 2   -0.35   1.34         1.5     2.03                   2.45    1.57
Essay 3   0.05    1.35         3.0     2.71                   3.20    1.82


the experimental groups was statistically significant (Group B: X̄ = 1.93, SD = 2.33, p < 0.001; Group C: X̄ = 2.52, SD = 1.78, p < 0.001). This means that the learners involved in the PA implementation improved their writing performance considerably more than those in the control group.
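The Mean Improvement (MI) figures in Table 4 are first-to-second-draft gains averaged per group. A minimal sketch of that computation, using illustrative scores rather than the study's data:

```python
from statistics import mean, stdev

# Illustrative draft scores (NOT the study's data): each pair is
# (first draft, second draft) marks out of 20 for one student.
groups = {
    "A (teacher feedback only)": [(12, 12), (14, 13), (11, 12), (13, 13)],
    "B (student assessees)":     [(11, 13), (12, 15), (10, 13), (13, 16)],
    "C (student assessors)":     [(12, 15), (11, 14), (13, 17), (10, 13)],
}

def mean_improvement(pairs):
    """Mean Improvement (MI): average second-minus-first-draft gain,
    with the sample standard deviation of the gains."""
    gains = [second - first for first, second in pairs]
    return mean(gains), stdev(gains)

for name, pairs in groups.items():
    mi, sd = mean_improvement(pairs)
    print(f"Group {name}: MI = {mi:.2f}, SD = {sd:.2f}")
```

The paper does not name the significance test behind the p-values it reports for the group differences; any between-group test would be applied to these per-student gains.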

Regarding differences in writing performance between the two experimental groups (RQ2), the results showed that student assessors improved their writing performance slightly more than student assessees (Group C: X̄ = 2.52, SD = 1.78 vs Group B: X̄ = 1.93, SD = 2.33). However, the difference between the two experimental groups was small (see Table 4). This indicates that PA improved student assessees' writing performance nearly as much as student assessors' performance. To estimate the differences in performance within groups, the scores of students' drafts (narrative, descriptive, and informal letter) per group were subjected to a two-way analysis of variance (ANOVA). The results confirmed that the three groups performed differently in each of the three essays [F(2, 171) = 5.66, p = 0.004]. The results also showed that there were no significant interactions between students' writing performance in the three groups [F(2, 171) = 35.25, p < 0.001] and their essays [F(4, 171) = 1.40, p = 0.234], which means that the writing performance of students within groups was relatively stable.
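As a simplified illustration of the ANOVA logic (one-way only, across groups, not the study's full group-by-essay design), the F statistic compares between-group to within-group variability:

```python
def one_way_f(groups):
    """One-way ANOVA F statistic for a list of per-group score lists:
    between-group mean square divided by within-group mean square."""
    all_scores = [s for g in groups for s in g]
    grand = sum(all_scores) / len(all_scores)
    k, n = len(groups), len(all_scores)
    # Between-group sum of squares: group sizes times squared
    # deviations of group means from the grand mean.
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations of each score
    # from its own group mean.
    ssw = sum((s - sum(g) / len(g)) ** 2 for g in groups for s in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Illustrative gains (NOT the study's data) for three groups.
print(one_way_f([[0, 1, -1, 0], [1, 3, 2, 2], [2, 3, 3, 4]]))
```

A large F relative to the F distribution with (k − 1, n − k) degrees of freedom corresponds to the significant group effect the paper reports; computing the p-value itself requires the F distribution and is omitted here.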

To test the reliability of student-generated scores (RQ3), a comparison of the teacher's and the student assessors' marks for each essay draft was made by conducting a two-sided Pearson correlation test. The correlation between the teacher's and the student assessors' marks for the first draft of all essays was very high (r = 0.89, p < 0.001). The same test was conducted on the marks of the second draft, yielding an equally high correlation (r = 0.78, p < 0.001), showing that student-generated scores were comparable to those of the teacher.
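Teacher-student agreement here is quantified with Pearson's r. A self-contained sketch of the computation, on illustrative marks rather than the study's data:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length mark lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative marks (NOT the study's data): teacher vs. student
# assessor marks for the same set of first drafts, out of 20.
teacher  = [12, 15, 9, 17, 14, 11, 16, 13]
assessor = [11, 15, 10, 18, 13, 12, 16, 14]
print(f"r = {pearson_r(teacher, assessor):.2f}")
```

Values near 1 indicate that the two raters rank and space the essays similarly, which is the sense in which the paper calls student marks 'comparable' to the teacher's; the two-sided p-value the paper reports requires the sampling distribution of r and is omitted here.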

To further test the reliability of the findings, a Pearson correlation test was conducted to compare the teacher's marks for the second draft of all students' essays with the external assessor's marks. Results showed a very high correlation between the teacher's and the external assessor's marks (r = 0.93, p < 0.001). This confirmed that the teacher was not biased towards any of the groups and marked all students' essays consistently.

To sum up, the results indicated that students involved in PA improved their writing performance as a result of working with the assessment criteria and employing them actively when asked to assess their peers' work. Moreover, PA had an almost equal effect on both student assessors and assessees. Students were also found to be as reliable assessors as their teacher. As such, the current findings confirm findings of other studies which showed that PA can be a helpful tool in improving adolescent learners' writing skills (Topping, 2013; Tsivitanidou et al., 2011; 2012; van den Berg et al., 2006).


Discussion and recommendations

Several researchers have emphasized the fact that PA is a complex undertaking (Sluijsmans et al., 2002). The findings of our study confirmed that PA can improve EFL learners' writing skills, which is in line with previous research (Olson, 1990; Paulus, 1999; Plutsky and Wilson, 2004; van den Berg et al., 2006), and that the benefits for both experimental groups were almost similar. Our study also refutes the claim that PA is only suitable for adult learners (Harlen, 2007). In the current study, adolescent learners provided reliable marks, very close to their teacher's, consistent with research findings with older learners (Patri, 2002; Pope, 2005; Rudy, Fejfar, Griffith and Wilson, 2001; Topping, 2010).

The literature also depicts PA as a demanding task because it requires learners to have an understanding of the goal(s) of assessment, apply assessment criteria, and make judgements about learner products in relation to those criteria (Cho and MacArthur, 2010; Topping, 2003; Tsai and Liang, 2009). The findings of our study showed that PA enabled students to appreciate why and how marks were assigned (Brindley and Scoffield, 1998) and provided them with a clearer understanding of what was required to achieve a particular standard (Hanrahan and Isaacs, 2001). PA implementation in the current study was supported through a checklist of evaluative questions that students were asked to apply to their own writing (Cheng and Warren, 1996; Stoddard and MacArthur, 1993). Through the training and involvement of students in the PA form design, the form became a familiar instrument. This helped students gain a clearer understanding of the assessment criteria needed to critically evaluate not only their peers' but also their own writing, which is a necessary skill for quality writing and academic success in general (Gieve, 1998; Thompson, 2002). Developing critical evaluation skills in writing makes students better writers and self-reviewers by enabling them to identify, for instance, logical gaps, problems with organization and other defects (Beach, 1989; Ferris, 1995; Thompson, 2002). In the course of the current PA implementation, students produced and worked on multiple drafts of their written texts (Berg, 1999). This also encouraged extended practice in writing and developed learners' writing skills effectively. This means that if students stop focusing on the end product and their grade and are provided with appropriate guidance and support, writing is manageable (Panou, 2006). PA also provided students with an authentic audience for their writing, e.g. their peers rather than teachers (Caulk, 1994; Freedman and Sperling, 1985; Mittan, 1989).

The results of the study carry important pedagogical implications for the implementation of PA. For example, successful PA implementation relies on the teacher's ability to adequately prepare students and prevent


problems (e.g., over/under marking, cheating) (Noonan and Duncan, 2005; Ross, 2006; Topping, 2013). In our study, PA assessment criteria were clarified and exemplified for students in advance. Careful adaptation of the PA instrument also met the students' language level, i.e. the instrument used was presented in both L1 and L2. The students in the experimental groups developed their revision strategies, worked with model texts to identify their weaknesses and strengths as writers, negotiated assessment criteria jointly constructed by teachers and learners, used checklists and exemplification, and were trained in using the assessment criteria after taking part in the design of the PA form. These steps deepened students' understanding of the assessment procedure, gave them a greater sense of ownership, and increased the reliability of the PA method (Karegianes et al., 1980; MacArthur et al., 1991). Therefore, we recommend that teachers who wish to successfully employ PA should provide learners with sufficient and relevant training. PA in a public space may also trigger threats to psychological safety and interpersonal relationships (Harris and Brown, 2013; Raider-Roth, 2005; Topping, 2013). Students require psychologically safe relationships with teachers and peers (Cowie, 2005, 2009; van Gennip et al., 2010). Therefore, PA anonymity should also be ensured to avoid conflict and resentment among learners, as was the case in the current study. This reduces peer pressure and ensures that the PA procedure will be more accurate due to reduced fear of failure when giving low scores (Vanderhoven et al., 2012).

To conclude, PA has become a significant feature of today's classrooms (Kollar and Fischer, 2010) since learning is increasingly viewed as a participative and collaborative activity (Barab et al., 2001). However, PA should be gradually and consistently introduced into the teaching curriculum throughout the school year to become an integral part of teaching programmes. To support such endeavours, sufficient and relevant training in PA should be planned and offered to teachers in both pre- and in-service teacher development programmes. PA requires teachers to change their existing beliefs about teaching, learning and assessment (Black et al., 2003; Noonan and Duncan, 2005). Raising teachers' awareness of PA will no doubt help teachers improve their learners' writing skills. Training and materials should also be developed to support effective PA implementation. In particular, the state needs to consider providing in-service training seminars for teachers on PA implementation. However, for such endeavours to be successful, PA needs to find its place in the English State School curriculum in Cyprus and in similar systems.


Future research directions

There is still much to discover about the factors that underpin the successful implementation of PA in the L2 classroom. Further research should involve PA studies with larger samples of participants to increase the generalizability of the findings. Researchers also need to extend the duration of PA implementation to detect its long-term effects on learners (e.g. from the beginning to the end of the school year) and to focus on the use of PA in other skill areas such as reading, speaking and listening, at the same or other educational levels (Saito and Fujita, 2004). Investigations of teachers' and learners' attitudes towards PA (Meletiadou and Tsagari, 2012; Meletiadou, 2013), the effects of teacher differences (Cheng and Warren, 2005), and differences regarding students' gender and level of proficiency (Miller and Ng, 1994) require further exploration. Such research can provide the field with a clearer understanding of the effects of PA on skills development and the ways PA is experienced by teachers and students, and shed light on the conditions and circumstances that facilitate learning. Such research should also identify issues to be addressed in teacher preparation and development programmes (Harris and Brown, 2013).

Conclusion

This study has presented the implementation process of PA in EFL writing with three groups of Cypriot adolescent learners. Despite its limitations (such as the relatively small number of writing tasks and students, the length of the implementation period, and the particularities of the context), the current research has demonstrated the potential of PA as a powerful alternative, learner-centred tool. It also showed that the Cypriot and other ministries of education should consider employing PA in order to promote better performance, learner autonomy, and self- and metacognitive awareness. PA, along with other assessment methods, has the potential to achieve a substantial transformation of language learning within school systems. PA offers opportunities for the individualization of learning that complement teacher assessment and enhance the assessment culture of centralized, standards-based systems. It also provides learners with valuable feedback which, along with teachers' feedback, can facilitate learning. However, PA does not undermine the teacher's role. When using PA, teachers still play an important role, as they retain considerable control over the types of classroom assessment practices used and the ways these are implemented (Ploegh et al., 2009).

318 Writing & Pedagogy

About the authors

Dina Tsagari is Assistant Professor, Department of English Studies, University of Cyprus, Cyprus. Her research interests include aspects of language testing and assessment (teacher assessment literacy, test washback, rater consistency), EFL course and materials design, teacher development, and adult and distance education. She is a language testing consultant for various well-known language examination boards, an editorial/advisory board member of international refereed journals, conferences and publishing committees, the coordinator of Classroom-Based Language Assessment (CBLA), and an Expert Member of EALTA. She has participated in various research projects in Greece, Cyprus and Hong Kong. She has published widely and presented at numerous local and international conferences.

Eleni Meletiadou is currently a PhD candidate in Applied Linguistics at the Department of English Studies, University of Cyprus. She has been working as an EFL teacher for more than 20 years at a variety of educational levels (primary and secondary schools, college and university) in both Cyprus and Greece. Her research interests include peer assessment, collaborative language learning and teacher training. She has disseminated her work through national and international conferences and in various local and international publications.

Notes

1 Department of English Studies, University of Cyprus

References

Andrade, H. (2010). Students as the definitive source of formative assessment: Academic self-assessment and the self-regulation of learning. In H. J. Andrade and G. J. Cizek (Eds) Handbook of Formative Assessment, 90−105. New York: Routledge.

Barab, S. A., Hay, K. E., Barnett, M. and Squire, K. (2001). Constructing virtual worlds: Tracing the historical development of learner practices. Cognition and Instruction 19: 47−94. http://dx.doi.org/10.1207/S1532690XCI1901_2

Beach, R. (1989). Showing students how to assess: Demonstrating techniques for response in the writing conference. In C. M. Anson (Ed.) Writing and Response: Theory, Practice, and Research, 127−148. Urbana, IL: National Council of Teachers of English.

Behjat, F. and Yamini, M. (2012). Blended learning: A ubiquitous learning environment for reading comprehension. International Journal of English Linguistics 2 (1): 97−106. http://dx.doi.org/10.5539/ijel.v2n1p97

Berg, C. E. (1999). The effects of trained peer response on ESL students' revision types and writing quality. Journal of Second Language Writing 8 (3): 215−241. http://dx.doi.org/10.1016/S1060-3743(99)80115-5

Birjandi, P. and Hadidi Tamjid, N. (2011). The role of self-, peer and teacher assessment in promoting Iranian EFL learners' writing performance. Assessment and Evaluation in Higher Education 37 (5): 513−533. http://dx.doi.org/10.1080/02602938.2010.549204

Black, P., Harrison, C., Lee, C., Marshall, B. and Wiliam, D. (2003). Assessment for Learning: Putting it into Practice. Maidenhead: Open University Press.

Boud, D. (2000). Sustainable assessment: Rethinking assessment for the learning society. Studies in Continuing Education 22 (2): 151−167. http://dx.doi.org/10.1080/713695728

Brindley, C. and Scoffield, S. (1998). Peer assessment in undergraduate programmes. Teaching in Higher Education 3 (1): 79−90. http://dx.doi.org/10.1080/1356215980030106

Caulk, N. (1994). Comparing teacher and student responses to written work. TESOL Quarterly 28 (1): 181−187. http://dx.doi.org/10.2307/3587209

Chapelle, C. A. and Brindley, G. (2002). Assessment. In N. Schmitt (Ed.) An Introduction to Applied Linguistics, 268−288. London: Arnold.

Cheng, W. and Warren, M. (1996). Hong Kong students' attitudes toward peer assessment in English language courses. Asian Journal of English Language Teaching 6: 61−75.

Cheng, W. and Warren, M. (2005). Peer assessment of language proficiency. Language Testing 22 (1): 93−121. http://dx.doi.org/10.1191/0265532205lt298oa

Cho, K. and MacArthur, C. (2010). Student revision with peer and expert reviewing. Learning and Instruction 20 (4): 328−338. http://dx.doi.org/10.1016/j.learninstruc.2009.08.006

Council of Europe (2001). Common European Framework of Reference for Languages: Learning, Teaching, Assessment. Cambridge: Cambridge University Press.

Cowie, B. (2005). Pupil commentary on assessment for learning. Curriculum Journal 16 (2): 137−151. http://dx.doi.org/10.1080/09585170500135921

Cowie, B. (2009). My teacher and my friends help me learn: Student perspectives and experiences of classroom assessment. In D. M. McInerney, G. T. L. Brown and G. A. D. Liem (Eds) Student Perspectives on Assessment: What Students Can Tell Us about Assessment for Learning, 85−105. Charlotte, NC: Information Age Publishing.

Creswell, J. W. and Plano Clark, V. L. (2007). Designing and Conducting Mixed Methods Research. Thousand Oaks, CA: Sage Publications.

Creswell, J. (2009). Research Design: Qualitative, Quantitative and Mixed Methods Approaches. London: Sage.

Crowley, S. (1998). Composition in the University. Pittsburgh, PA: University of Pittsburgh Press.

Curriculum Development Council (2001). Learning to Learn: Life-long Learning and Whole-person Development. Hong Kong: Curriculum Development Council.

Curriculum Development Council (2004). English Language Education: English Language Curriculum Guide (Primary 1−6). Hong Kong: Curriculum Development Council.

DiGiovanni, E. and Nagaswami, G. (2001). Online peer review: An alternative to face-to-face. English Language Teaching Journal 55 (3): 263−271. http://dx.doi.org/10.1093/elt/55.3.263

Falchikov, N. and Goldfinch, J. (2000). Student peer assessment in higher education: A meta-analysis comparing peer and teacher marks. Review of Educational Research 70 (3): 287−322. http://dx.doi.org/10.3102/00346543070003287

Falchikov, N. and Magin, D. (1997). Detecting gender bias in peer marking of students' group process work. Assessment and Evaluation in Higher Education 22 (4): 393−404. http://dx.doi.org/10.1080/0260293970220403

Ferris, D. R. (1995). Student reactions to teacher response in multiple-draft composition classrooms. TESOL Quarterly 29 (1): 33−53. http://dx.doi.org/10.2307/3587804

Fletcher, C. and Baldry, C. (2000). A study of individual differences and self-awareness in the context of multi-source feedback. Journal of Occupational and Organizational Psychology 73 (3): 303−319. http://dx.doi.org/10.1348/096317900167047

Flower, L. and Hayes, J. R. (1981). A cognitive process theory of writing. College Composition and Communication 32 (4): 365−387. http://dx.doi.org/10.2307/356600

Freedman, S. and Sperling, M. (1985). Written language acquisition: The role of response and the writing conference. In S. W. Freedman (Ed.) The Acquisition of Written Language: Response and Revision, 106−130. Norwood, NJ: Ablex.

Gerring, J. (2004). What is a case study and what is it good for? American Political Science Review 98 (2): 341−354. http://dx.doi.org/10.1017/S0003055404001182

Gibbs, G. and Simpson, C. (2004). Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education 1: 3−31.

Gielen, S., Peeters, E., Dochy, F., Onghena, P. and Struyven, K. (2010). Improving the effectiveness of peer feedback for learning. Learning and Instruction 20 (4): 304−315. http://dx.doi.org/10.1016/j.learninstruc.2009.08.007

Gieve, S. (1998). Comments on Dwight Atkinson's 'A Critical Approach to Critical Thinking in TESOL': A case for critical thinking in the English language classroom. TESOL Quarterly 32 (1): 123−129. http://dx.doi.org/10.2307/3587907

Hanrahan, S. and Isaacs, G. (2001). Assessing self- and peer-assessment: The students' views. Higher Education Research and Development 20 (1): 53−70. http://dx.doi.org/10.1080/07294360123776

Hansen Edwards, G. J. (2014). Peer assessment in the classroom. In J. A. Kunnan (Ed.) The Companion to Language Assessment. John Wiley and Sons, Inc. http://dx.doi.org/10.1002/9781118411360.wbcla002

Harlen, W. (2007). Holding up a mirror to classroom practice. Primary Science Review 100: 29−31.

Harris, L. R., Brown, G. T. L. and Harnett, J. (2015). Analysis of New Zealand primary and secondary student peer- and self-assessment comments: Applying Hattie and Timperley's feedback model. Assessment in Education: Principles, Policy and Practice 22: 265−281. http://dx.doi.org/10.1080/0969594X.2014.976541

Harris, L. R. and Brown, G. T. L. (2013). Opportunities and obstacles to consider when using peer- and self-assessment to improve student learning: Case studies into teachers' implementation. Teaching and Teacher Education 36: 101−111. http://dx.doi.org/10.1016/j.tate.2013.07.008

Hyland, F. (2000). ESL writers and feedback: Giving more autonomy to students. Language Teaching Research 4 (1): 33−54. http://dx.doi.org/10.1177/136216880000400103

Jacobs, G. (1987). First experiences with peer feedback on compositions: Student and teacher reaction. System 15 (3): 325−333. http://dx.doi.org/10.1016/0346-251X(87)90006-6

Johns, A. M. (1986). Coherence and academic writing: Some definitions and suggestions for teaching. TESOL Quarterly 20 (2): 247−265. http://dx.doi.org/10.2307/3586543

Johnson, D. and Johnson, R. (1994). Learning Together and Alone: Cooperative, Competitive, and Individualistic Learning. Needham Heights, MA: Prentice-Hall.

Kaplan, R. B. and Grabe, W. (2002). A modern history of written discourse analysis. Journal of Second Language Writing 11 (3): 191−223. http://dx.doi.org/10.1016/S1060-3743(02)00085-1

Karegianes, M. L., Pascarella, E. T. and Pflaum, S. W. (1980). The effects of peer editing on the writing proficiency of low-achieving tenth grade students. Journal of Educational Research 73 (4): 203−207. http://dx.doi.org/10.1080/00220671.1980.10885236

Kluger, A. N. and DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin 119 (2): 254−284. http://dx.doi.org/10.1037/0033-2909.119.2.254

Kollar, I. and Fischer, F. (2010). Peer assessment as collaborative learning: A cognitive perspective. Learning and Instruction 20 (4): 344−348. http://dx.doi.org/10.1016/j.learninstruc.2009.08.005

Lamberg, W. (1980). Self-provided and peer-provided feedback. College Composition and Communication 31 (1): 63−69. http://dx.doi.org/10.2307/356635

Lamprianou, I. and Afantiti Lamprianou, T. (2013). Charting private tutoring in Cyprus: A socio-demographic perspective. In M. Bray, A. E. Mazawi and R. G. Sultana (Eds) Private Tutoring Across the Mediterranean: Power Dynamics and Implications for Learning and Equity, 29−56. Sense Publishers. http://dx.doi.org/10.1007/978-94-6209-237-2_3

Leki, I. (2003). A challenge to L2 writing professionals: Is writing overrated? In B. Kroll (Ed.) Exploring the Dynamics of Second Language Writing, 315−331. New York: Cambridge University Press.

Li, L., Liu, X. and Steckelberg, A. L. (2010). Assessor or assessee: How student learning improves by giving and receiving peer feedback. British Journal of Educational Technology 41 (3): 525−536. http://dx.doi.org/10.1111/j.1467-8535.2009.00968.x

Li, L. and Steckelberg, A. (2004). Using Peer Feedback to Enhance Student Meaningful Learning. Chicago, IL: Association for Educational Communications and Technology.

Lindblom-Ylänne, S., Pihlajamäki, H. and Kotkas, T. (2006). Self-, peer- and teacher-assessment of student essays. Active Learning in Higher Education 7 (1): 51−62. http://dx.doi.org/10.1177/1469787406061148

Lockhart, C. and Ng, P. (1995). Analyzing talk in ESL peer response groups: Stances, functions and content. Language Learning 45 (4): 605−655. http://dx.doi.org/10.1111/j.1467-1770.1995.tb00456.x

MacArthur, C. A., Graham, S. and Schwartz, S. (1991). Knowledge of revision and revising behavior among students with learning disabilities. Learning Disability Quarterly 14 (1): 61−73. http://dx.doi.org/10.2307/1510373

Meletiadou, E. (2012). The impact of training adolescent learners on their perceptions of peer assessment of writing. Research Papers in Language Teaching and Learning 3 (1): 240−251.

Meletiadou, E. (2013). EFL learners' attitudes towards peer assessment, teacher assessment and process writing. In D. Tsagari, S. Papadima-Sophocleous and S. Ioannou-Georgiou (Eds) International Experiences in Language Testing and Assessment: Selected Papers in Memory of Pavlos Pavlou, 95−114. Frankfurt: Peter Lang.

Meletiadou, E. and Tsagari, D. (2012). Investigating the attitudes of adolescent EFL learners towards the peer assessment of writing. In D. Tsagari (Ed.) Research on English as a Foreign Language in Cyprus, 225−245. Nicosia: University of Nicosia Press.

Meletiadou, E. and Tsagari, D. (2013). An exploration of the reliability and validity of peer assessment of writing in secondary education. In N. Lavidas, T. Alexiou and A. M. Sougari (Eds) Major Trends in Theoretical and Applied Linguistics: Selected Papers from the 20th International Symposium on Theoretical and Applied Linguistics (April 1−3, 2011), 235−249. London: Versita de Gruyter.

Miller, L. and Ng, R. (1994). Peer assessment of oral language proficiency. Perspectives: Working Papers of the Department of English, City Polytechnic of Hong Kong 6: 41−56.

Ministry of Education and Culture (2010). Foreign Language Programme of Study for Cypriot Public Secondary Schools. Nicosia: Ministry of Education.

Mittan, R. (1989). The peer review process: Harnessing students' communicative power. In D. Johnson and D. Roen (Eds) Richness in Writing: Empowering ESL Students, 207−219. New York: Longman.

Morahan-Martin, J. (1996). Should peers' evaluations be used in class projects? Questions regarding reliability, leniency, and acceptance. Psychological Reports 78 (3c): 1243−1250. http://dx.doi.org/10.2466/pr0.1996.78.3c.1243

Morrow, K. (2004). Insights from the Common European Framework. Cambridge: Cambridge University Press.

Newkirk, T. (1984). How students read student papers: An exploratory study. Written Communication 1 (3): 324−325. http://dx.doi.org/10.1177/0741088384001003001

Newstead, S. and Dennis, I. (1994). Examiners examined: The reliability of exam marking in psychology. The Psychologist 7: 216−219.

Nicol, D. and Milligan, C. (2006). Rethinking technology-supported assessment in terms of the seven principles of good feedback practice. In C. Bryan and K. Clegg (Eds) Innovative Assessment in Higher Education, 1−14. London: Routledge.

Noonan, B. and Duncan, C. R. (2005). Peer and self-assessment in high schools. Practical Assessment, Research and Evaluation 10 (17): 1−8.

O'Brien, T. (2004). Writing in a foreign language: Teaching and learning. Language Teaching 37: 1−28. http://dx.doi.org/10.1017/S0261444804002113

Olson, V. L. N. (1990). The revising process of sixth-grade writers with and without peer feedback. Journal of Educational Research 84 (1): 22−29. http://dx.doi.org/10.1080/00220671.1990.10885987

Orsmond, P., Merry, S. and Reiling, K. (1996). The importance of marking criteria in the use of peer assessment. Assessment and Evaluation in Higher Education 21 (3): 239−250. http://dx.doi.org/10.1080/0260293960210304

Orsmond, P., Merry, S. and Reiling, K. (1997). A study in self-assessment: Tutor and students' perceptions of performance criteria. Assessment and Evaluation in Higher Education 22 (4): 357−369. http://dx.doi.org/10.1080/0260293970220401

Panou, A. (2006). Using Writing Portfolios as an Alternative Assessment Method in the Greek Primary School. Patras: Open University of Patras.

Patri, M. (2002). The influence of peer feedback on self- and peer assessment of oral skills. Language Testing 19 (2): 109−131. http://dx.doi.org/10.1191/0265532202lt224oa

Paulus, T. M. (1999). The effect of peer and teacher feedback on student writing. Journal of Second Language Writing 8 (3): 265−289. http://dx.doi.org/10.1016/S1060-3743(99)80117-9

Piaget, J. (1978). Behavior and Evolution. New York: Random House.

Ploegh, K., Tillema, H. H. and Segers, M. S. R. (2009). In search of quality criteria in peer assessment practices. Studies in Educational Evaluation 35 (2−3): 102−109. http://dx.doi.org/10.1016/j.stueduc.2009.05.001

Plutsky, S. and Wilson, B. A. (2004). Comparison of the three methods for teaching and evaluating writing: A quasi-experimental study. The Delta Pi Epsilon Journal 46 (1): 50−61.

Pope, N. K. L. (2005). The impact of stress in self- and peer assessment. Assessment and Evaluation in Higher Education 30 (1): 51−63. http://dx.doi.org/10.1080/0260293042003243896

Project, N. (1990). Responding to and Assessing Writing. Walton-on-Thames: Nelson.

Race, P. (1998). Practical pointers on peer assessment. In Peer Assessment in Practice. Birmingham: SEDA.

Raider-Roth, M. B. (2005). Trusting what you know: Negotiating the relational context of classroom life. Teachers College Record 107: 587−628.

Raimes, A. (1983). Techniques in Teaching Writing. New York: Oxford University Press.

Richer, D. L. (1992). The effects of two feedback systems on first year college students' writing proficiency. Dissertation Abstracts International 53: 2722.

Rollinson, P. (2005). Using peer feedback in the ESL writing class. ELT Journal 59 (1): 23−30. http://dx.doi.org/10.1093/elt/cci003

Ross, J. A. (2006). The reliability, validity and utility of self-assessment. Practical Assessment, Research and Evaluation 11: 1−13.

Rouhi, A. and Azizian, E. (2013). Peer review: Is giving corrective feedback better than receiving it in L2 writing? Procedia – Social and Behavioral Sciences 93: 1349−1354. http://dx.doi.org/10.1016/j.sbspro.2013.10.042

Rudy, D. W., Fejfar, M. C., Griffith, C. H. I. and Wilson, J. F. (2001). Self- and peer assessment in a first-year communication and interviewing course. Evaluation and the Health Professions 24 (4): 436−445. http://dx.doi.org/10.1177/016327870102400405

Saito, H. and Fujita, T. (2004). Characteristics and user acceptance of peer rating in EFL writing classrooms. Language Teaching Research 8 (1): 31−54. http://dx.doi.org/10.1191/1362168804lr133oa

Schneider, G. and Lenz, P. (2001). European Language Portfolio: Guide for Developers. Strasbourg: Council of Europe.

Sluijsmans, D., Brand-Gruwel, S. and Van Merriënboer, J. (2002). Peer assessment training in teacher education. Assessment and Evaluation in Higher Education 27 (5): 443−454. http://dx.doi.org/10.1080/0260293022000009311

Sluijsmans, D. M. A., Brand-Gruwel, S., van Merriënboer, J. J. G. and Martens, R. L. (2004). Training teachers in peer assessment skills: Effects on performance and perceptions. Innovations in Education and Teaching International 41 (1): 59−78. http://dx.doi.org/10.1080/1470329032000172720

Stoddard, B. and MacArthur, C. (1993). A peer editor strategy: Guiding learning disabled students in response and revision. Research in the Teaching of English 27 (1): 76−103.

Strijbos, J. W., Narciss, S. and Dünnebier, K. (2010). Peer feedback content and sender's competence level in academic writing revision tasks: Are they critical for feedback perceptions and efficiency? Learning and Instruction 20 (4): 291−303. http://dx.doi.org/10.1016/j.learninstruc.2009.08.008

Taras, M. (2001). The use of tutor feedback and student self-assessment in summative assessment tasks: Towards transparency for students and for tutors. Assessment and Evaluation in Higher Education 26 (6): 606−614. http://dx.doi.org/10.1080/02602930120093922

Thompson, C. (2002). Teaching critical thinking in EAP courses in Australia. TESOL Journal 11 (4): 15−20.

Topping, K. J. (2013). Peers as a source of formative and summative assessment. In J. McMillan (Ed.) SAGE Handbook of Research on Classroom Assessment, 395−412. London: Sage Publications. http://dx.doi.org/10.4135/9781452218649.n22

Topping, K. J. (2010). Methodological quandaries in studying process and outcomes in peer assessment. Learning and Instruction 20 (4): 339−343. http://dx.doi.org/10.1016/j.learninstruc.2009.08.003

Topping, K. J. (2003). Self and peer assessment in school and university: Reliability, validity and utility. In M. S. R. Segers, F. J. R. C. Dochy and E. C. Cascallar (Eds) Optimizing New Modes of Assessment: In Search of Qualities and Standards, 55−87. Dordrecht: Kluwer Academic.

Topping, K. J. (1998). Peer assessment between students in college and university. Review of Educational Research 68 (3): 249−276. http://dx.doi.org/10.3102/00346543068003249

Tsagari, D. (2014). Investigating the face validity of Cambridge English First in the Cypriot context. Research Notes 57: 23−31.

Tsagari, D. (2012). FCE-exam preparation discourses: Insights from an ethnographic study. Research Notes 47: 36−47.

Tsai, C. C. and Liang, J. C. (2009). The development of science activities via on-line peer assessment: The role of scientific epistemological views. Instructional Science 37 (3): 293−310. http://dx.doi.org/10.1007/s11251-007-9047-0

Tsivitanidou, O. E., Zacharia, Z. C. and Hovardas, T. (2011). Investigating secondary school students' unmediated peer assessment skills. Learning and Instruction 21 (4): 506−519. http://dx.doi.org/10.1016/j.learninstruc.2010.08.002

Tsivitanidou, O. E., Zacharia, Z. C., Hovardas, T. and Nicolaou, A. (2012). Peer assessment among secondary school students: Introducing a peer feedback tool in the context of a computer supported inquiry learning environment in science. Journal of Computers in Mathematics and Science Teaching 31 (4): 433−465.

Tsui, A. and Ng, M. (2000). Do secondary L2 writers benefit from peer comments? Journal of Second Language Writing 9 (2): 147−170. http://dx.doi.org/10.1016/S1060-3743(00)00022-9

van den Berg, I., Admiraal, W. and Pilot, A. (2006). Design principles and outcomes of peer assessment in higher education. Studies in Higher Education 31 (3): 341−356. http://dx.doi.org/10.1080/03075070600680836

van Gennip, N. A. E., Segers, M. S. R. and Tillema, H. H. (2010). Peer assessment as a collaborative learning activity: The role of interpersonal variables and conceptions. Learning and Instruction 20 (4): 280−290. http://dx.doi.org/10.1016/j.learninstruc.2009.08.010

Van Steendam, E., Rijlaarsdam, G., Sercu, L. and van den Bergh, H. (2010). The effect of instruction type and dyadic or individual emulation on the quality of higher-order peer feedback in EFL. Learning and Instruction 20 (4): 316−327. http://dx.doi.org/10.1016/j.learninstruc.2009.08.009

Vanderhoven, E., Raes, A., Schellens, T. and Montrieux, H. (2012). Face-to-face peer assessment in secondary education: Does anonymity matter? Procedia – Social and Behavioral Sciences 69: 1340−1347. http://dx.doi.org/10.1016/j.sbspro.2012.12.071

Villamil, O. S. and de Guerrero, M. (1998). Assessing the impact of peer revision on L2 writing. Applied Linguistics 19 (4): 491−514. http://dx.doi.org/10.1093/applin/19.4.491

Vygotsky, L. S. (1978). Mind in Society: The Development of Higher Psychological Processes. Cambridge, MA: Harvard University Press.

White, R. V. and Arndt, V. (1991). Process Writing. London: Longman.

Xiao, Y. and Lucking, R. (2008). The impact of two types of peer assessment on students' performance and satisfaction within a Wiki environment. The Internet and Higher Education 11 (3−4): 186−193. http://dx.doi.org/10.1016/j.iheduc.2008.06.005

Yang, M., Badger, R. and Yu, Z. (2006). A comparative study of peer and teacher feedback in a Chinese EFL writing class. Journal of Second Language Writing 15 (3): 179−200. http://dx.doi.org/10.1016/j.jslw.2006.09.004

Yin, R. K. (2014). Case Study Research: Design and Methods (5th edn). Thousand Oaks, CA: Sage Publications.

Yu, F. Y. and Wu, C. P. (2013). Predictive effects of online peer feedback types on performance quality. Educational Technology and Society 16 (1): 332−341.

Yu, F. Y. and Wu, C. P. (2011). Different identity revelation modes in an online peer assessment learning environment: Effects on perceptions toward assessors, classroom climate and learning activities. Computers and Education 57 (3): 2167−2177. http://dx.doi.org/10.1016/j.compedu.2011.05.012

Zhao, H. (2010). Investigating learners' use and understanding of peer and teacher feedback on writing: A comparative study in a Chinese English writing classroom. Assessing Writing 15 (1): 3−17. http://dx.doi.org/10.1016/j.asw.2010.01.002

Appendix I

The PA form

Read your peer's essay carefully. Then have a close look at the questions. Indicate your response by ticking one box only for each question.

For each criterion, tick one rating: Excellent-Very Good / Good-Average / Fair-Poor / Very Poor.

A. Content
1. Are the main ideas clear and well-supported with helpful details? Είναι οι βασικές ιδέες ξεκάθαρες; Στηρίζονται με χρήσιμες λεπτομέρειες;
2. Are the ideas relevant to the topic? Είναι οι ιδέες σχετικές με το θέμα;
3. Is the text easy for the reader? Είναι το κείμενο ευανάγνωστο;
4. Does the composition fulfill the task fully? Επιτυγχάνει η έκθεση πλήρως ό,τι ζητείται από την άσκηση;

B. Organization
5. Is there thorough development through introduction, body and conclusion? Υπάρχει εκτενής ανάπτυξη με εισαγωγή, κυρίως θέμα και επίλογο;
6. Is there logical sequence of ideas and effective use of transition? Υπάρχει λογική ακολουθία ιδεών και αποτελεσματική χρήση μετάβασης;
7. Is there cohesion and are there unified paragraphs? Υπάρχει συνοχή και ενιαίες παράγραφοι;
8. Does the writer achieve coherence by using simple linking devices? Υπάρχει συνεκτικότητα με την χρήση απλών συνδετικών μέσων;

C. Vocabulary and Language Usage
9. Is the vocabulary sophisticated and varied? Είναι το λεξιλόγιο περίπλοκο και ποικίλο;
10. Is there effective word choice and usage? Is the meaning clear? Είναι αποτελεσματική η επιλογή και η χρήση λέξεων; Είναι το νόημα ξεκάθαρο;
11. Does the writer use simple/complex constructions effectively? Χρησιμοποιούνται απλές/περίπλοκες εκφράσεις αποτελεσματικά;
12. Are there errors of tense and/or subject/verb agreement? Υπάρχουν λάθη στους χρόνους και/ή στην συμφωνία υποκειμένου-ρήματος;
13. Are there errors of number (singular/plural) and word order? Υπάρχουν λάθη στην χρήση του αριθμού (ενικός/πληθυντικός) και/ή στην σειρά των λέξεων;
14. Are there errors of articles, pronouns and prepositions? Υπάρχουν λάθη στα άρθρα, αντωνυμίες και προθέσεις;

D. Mechanics
15. Are there problems with spelling and handwriting? Υπάρχουν προβλήματα με ορθογραφικά λάθη και/ή με τον γραφικό χαρακτήρα;
16. Are there errors of punctuation and capitalization? Υπάρχουν λάθη με τα σημεία στίξης και την χρήση κεφαλαίων γραμμάτων;

Fill in your marks for each section and calculate the total score below.
Analytic score: Content: ___ (out of 5); Organization: ___ (out of 4); Vocabulary and Language Use: ___ (out of 6); Mechanics: ___ (out of 5); Total score: ___ (out of 20).

