ePrints (Open Access)

Assessing Competency in Undergraduate Software Engineering Teams

NU author(s): Dr Marie Devlin, Professor Chris Phillips

Abstract

From 2005, Active Learning in Computing (ALiC), a partnership between Newcastle and Durham Universities and part of the UK CETL initiative [1], introduced a collaborative learning model of Software Engineering to level 2 Computing Science students that reflects global industry practice by focusing on cross-site software development. Assessment for our respective modules focuses on measuring the students' development of both the technical and transferable skills associated with the practice of being a software engineer. This cross-site model encourages greater student engagement with the discipline and reflects industrial practice more authentically by incorporating real industrial problems and practices. Since the new model of working was implemented, students have reported good learning outcomes in questionnaires and focus groups set up to evaluate the module design and their experiences.

During the project, teams are formed at Newcastle and each one is paired with a corresponding team at Durham. The major project task is the design and implementation of a large software system (e.g. in 2005 the task was a tour guide application that could be loaded onto a PDA or mobile phone, and in 2007 teams had to develop a virtual geocaching application). Students work together as a virtual enterprise across the sites, using communication technologies to facilitate their collaboration. Skills outcomes for the module were, and still are, listed at Newcastle as: initiative; adaptability; teamwork; numeracy; problem-solving; interpersonal communication; written communication; and oral presentation. The assessment scheme is formulated around measuring student development of these skills during the module [2]. Students' level of proficiency in these skills is judged from the quality of teamwork products, team presentations, peer assessments, and staff observations, each of which receives a grade at the end. Assigning marks to students during and at the end of the module [3] means that student achievement "is abstracted into just a few numbers", so it is difficult for students to perceive what skills they have learned, and how these skills have developed during the project, from marks and feedback on many separate elements of coursework. How do they know how good they are at being a software engineer? What skills do they need to improve? What are their strengths and weaknesses?

We propose that assessment should instead focus on the development of a range of competencies similar to those identified by Turley and Bieman in their study of exceptional and non-exceptional professional software engineers [4]. Competency levels could be measured via peer, self, formative and summative assessment in a style that relates directly to professional performance appraisal. Turley and Bieman identified 38 competencies, including: helps others; willingness to confront others; responds to schedule pressure; focus on user/customer needs; is team-oriented; and writes/automates tests with code. We would use these in conjunction with the assessment of technical and teamwork products to give students a more useful and realistic performance review of their competency levels and of how these have changed and developed throughout their project.

In this paper, we review the assessment methods currently used to evaluate student performance and measure learning outcomes. We outline a set of alternative competencies and appraisal methods that could be used instead. We define how these methods could be used to help students achieve greater understanding of the requirements of their chosen profession in a global context, and to help staff and students qualitatively evaluate levels of achievement and skill development against these competencies during undergraduate team projects in Software Engineering.

Keywords: Software Engineering, Competency, Skill development

REFERENCES
[1] HEFCE CETL Initiative: http://www.hefce.ac.uk/Tinits/cetl, accessed 15/03/09.
[2] CSC2005 Team Project Module Outline, Newcastle University, http://www.cs.ncl.ac.uk/modules/2008/CSC2005, 2008, accessed 15/09/09.
[3] R.A. McNamara, "Evaluating Assessment with Competency Mapping", Proceedings of the Sixth Conference on Australasian Computing Education, vol. 30, R. Lister and A. Young (eds), ACM International Conference Proceeding Series, vol. 57, Australian Computer Society, Darlinghurst, Australia, pp. 193-199, 2004.
[4] R.T. Turley and J.M. Bieman, "Competencies of Exceptional and Non-exceptional Software Engineers", Journal of Systems and Software, vol. 28, issue 1, pp. 19-38, 1995.
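Illustrative sketch (not part of the published abstract): one way an appraisal of the kind described above might be recorded and aggregated, assuming a 1-5 rating scale, ratings gathered from self, peer and staff sources, and a small subset of the Turley and Bieman competencies; the competency names, scale and function are hypothetical examples, not the scheme actually used in the module.

    # Hypothetical sketch: combining self, peer and staff ratings of a
    # student's competencies into one formative profile. The competency
    # subset and the 1-5 scale are illustrative assumptions only.
    from statistics import mean

    # A small illustrative subset of competencies (the paper cites 38).
    COMPETENCIES = [
        "helps others",
        "responds to schedule pressure",
        "focus on user/customer needs",
        "is team-oriented",
    ]

    def competency_profile(appraisals):
        """Average ratings from several sources into one profile.

        `appraisals` maps a source name ("self", "peer", "staff") to a
        dict of competency -> rating on a 1-5 scale. Competencies with
        no ratings are reported as None.
        """
        profile = {}
        for competency in COMPETENCIES:
            ratings = [
                scores[competency]
                for scores in appraisals.values()
                if competency in scores
            ]
            profile[competency] = round(mean(ratings), 1) if ratings else None
        return profile

    # Example: a formative (mid-project) appraisal for one student.
    formative = competency_profile({
        "self":  {"helps others": 4, "is team-oriented": 5},
        "peer":  {"helps others": 3, "responds to schedule pressure": 2,
                  "is team-oriented": 4},
        "staff": {"focus on user/customer needs": 3,
                  "responds to schedule pressure": 3},
    })
    print(formative)

Comparing such a profile at formative and summative points would, under these assumptions, let a student see how individual competencies have developed rather than receiving only aggregate marks.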


Publication metadata

Author(s): Devlin M, Phillips C

Editor(s): Universidad Politecnica de Madrid - Servicio de Publicaciones-EUI-UPM

Publication type: Conference Proceedings (inc. Abstract)

Publication status: Published

Conference Name: IEEE Education Engineering Conference (EDUCON)

Year of Conference: 2010

Pages: 271-277

Date deposited: 24/09/2010

ISBN: 9781424465682

Publisher: Universidad Politecnica de Madrid

URL: http://dx.doi.org/10.1109/EDUCON.2010.5492569

DOI: 10.1109/EDUCON.2010.5492569

Notes: Conference topic: The Future of Global Learning in Engineering Education. Full paper available only on CD-ROM. Proceedings on CD-ROM: IEEE Catalog Number CFP10EDU-CDR, ISBN 978-1-4244-6569-9. Conference Program Book: ISBN 978-84-96737-70-9. Depósito legal: M-11728-2010.


ISBN: 9781424465705

