Continuous & Comprehensive E-Assessment (CCEA)
An Academic & Administrative tool to enhance the complete learning environment in schools
Abstract
The ever-changing needs of education call for building skills rather than merely learning a subject. This has been aptly recognized by educational boards such as CBSE, as is evident from the adoption of CCE, PSA and OTBA, which lay stress on higher-order and 21st-century skills. Assigning extraordinary importance to assessments while ignoring the efforts before and after them renders the whole assessment process futile.
Educational institutions seldom have a standardized scheme of work, clearly defined instructional objectives, or scientific benchmarking and remediation tools, and this has undermined the relevance of conventional assessment tools. This creates the need for the new-age Continuous Comprehensive E-Assessment.
Every skill set underscored by NCERT has to be assimilated by the student, and this is possible only when continuous assessment solutions are used that are crisp, short, and capable of providing real-time, informed, customized information to all the stakeholders involved.
Continuous Comprehensive E-Assessment helps to inspect every skill set that goes into a topic; performance can be drilled down to the lowest possible level, and there is no time lag between assessment, evaluation and remedial recommendations. Everything happens in real time and automatically, a feat that is cumbersome and uneconomical in conventional testing methods, which are also prone to several human limitations.
How can delivery in classrooms be standardized? How can we standardize the performance of students? How can we make meaningful recommendations for remedial action? How can we minimize the subjectivity induced by human intervention? How can different stakeholders get relevant, customized and actionable information?
Continuous Comprehensive E-Assessment is the answer to all of these questions. A scientific approach to the next-generation teaching and learning process, CCEA is not to be construed as a mere supplement to conventional education and evaluation tools; rather, it has the potential to emerge as a path-breaking methodology around which the whole teaching and learning process revolves.
Introduction to CCEA
CCEA is not to be confused with mere online testing or the use of computers and technology in the testing process. It refers to a gamut of activities aimed at real-time monitoring of students' achievement against set standards and at enabling meaningful interventions by parents, teachers and school administrators to bolster learning.
A continuous and comprehensive assessment system would have the following features:
• A defined set of learning outcomes expected at the end of each lecture
• A comprehensive test bank with question items tagged to specific learning outcomes
• System-generated action plans to individualize remedial action
• Post-test follow-up on the part of parents, teachers, students and administrators
• A continuous reporting system that prompts action rather than analysis paralysis
• A real-time communication system that connects all stakeholders in a child's learning, including parents, teachers and school administrators
The fundamental premise of CCEA is that pre-assessment and post-assessment activities take precedence over the ritualistic assessment itself [4]. Every test thus becomes an investment and an opportunity to enhance the effectiveness of the teaching-learning process [9], and thereby the learning of the children and ultimately the whole academic environment of the school.
The Need for CCEA
The instruction delivered in classrooms in the current system is loosely defined and suffers from over-reliance on the knowledge, experience and skills of an individual teacher. In the absence of commonly defined objectives and standards, the instruction delivered even within a single school varies considerably. Two teachers teaching the same topic may emphasize two different skill levels. For example, a teacher teaching the mirror formula may emphasize knowledge and application and state the instructional objective as:
"The learner should be able to recall mirror formula and use it in situations where one out of u, v and f needs to be found out."
Another teacher may emphasize the higher-order thinking skills of the student and may have an entirely different objective while teaching the same topic, for instance:
"The learner should be able to analyze situations where mirror formula is applicable and should be able to calculate the values of u, v and f and also predict how a change in object distance affects the image distance and the nature of the image."
The two teachers would use different methods and end up developing different levels of skill in their students, and the achievement of their objectives could not be measured with a common test instrument. This is where the need to link assessment with instruction arises, and the precondition for this linking is that common standards and objectives of instruction be developed for each subject area [7]. Most of the testing that happens today is based on topic areas, and seldom on any instructional objectives.
The basic underlying structure of CCEA is a well-written set of instructional objectives that meet the ultimate economic, intellectual and personal purposes of education. The revised Bloom's Taxonomy [1] can be used as an effective tool in formulating these learning objectives. The effect would be more meaningful instruction, linked assessments and focused remedial action; more importantly, CCEA would ensure that a continuous rolling cycle of instruction, learning, assessment and remedial action becomes the norm [2]. Every teacher would have a clear idea of the expectations from each lecture: which outcomes are to be met, how they will be measured, and what kind of remedial action is required for whom.
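To make the tagging concrete, the sketch below is a minimal illustration rather than the schema of any existing CCEA platform; the class names, field names and objective codes (e.g. PHY-10-MIR-01) are assumptions. It encodes the two mirror-formula objectives above as structured records carrying their revised-Bloom's-Taxonomy levels, so that test items, attempts and reports can later refer to the same objective codes.

```python
from dataclasses import dataclass
from enum import Enum

class BloomLevel(Enum):
    # Cognitive process levels of the revised Bloom's Taxonomy [1]
    REMEMBER = 1
    UNDERSTAND = 2
    APPLY = 3
    ANALYZE = 4
    EVALUATE = 5
    CREATE = 6

@dataclass
class InstructionalObjective:
    objective_id: str        # code used to tag test items, attempts and reports
    topic: str               # curriculum topic the objective belongs to
    statement: str           # the objective as written by the subject team
    bloom_level: BloomLevel  # highest cognitive level the objective targets

# The two mirror-formula objectives from the text, expressed as tagged records
# (the objective codes are invented for illustration)
objectives = [
    InstructionalObjective(
        objective_id="PHY-10-MIR-01",
        topic="Mirror formula",
        statement="Recall the mirror formula and use it to find one of u, v and f.",
        bloom_level=BloomLevel.APPLY,
    ),
    InstructionalObjective(
        objective_id="PHY-10-MIR-02",
        topic="Mirror formula",
        statement=("Analyze situations where the mirror formula applies and predict "
                   "how a change in object distance affects the image."),
        bloom_level=BloomLevel.ANALYZE,
    ),
]
```

Writing objectives as structured records of this kind is what allows every later artefact of the system, from test items to reports, to refer to the same objective codes.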
Assessment Mode and Platform
The traditional paper-and-pencil testing mode is time-consuming and cumbersome, and poses practical problems in scoring and analysis even when machine marking is used. The delay in results and analysis limits the continuity of the testing and follow-up cycle. When the number of tests decreases, the length of each test has to increase to maintain exhaustiveness, and when the length of a test increases, the data generated become prone to errors induced by extrinsic factors such as test fatigue.
The need of the hour is therefore short, crisp tests that students can take on the go, together with real-time analysis and follow-up feedback that lets all stakeholders initiate immediate action. An e-assessment platform does exactly that [3]. E-assessment platforms offer a variety of testing options, from the traditional MCQ format to adaptive testing to highly advanced game-based testing that supports learning based on constructivist learning theory [8].
The most popular forms of testing used in e-assessment, and the easiest to create and administer, are the MCQ format and the cloze-type question format. Though they are easy to use and evaluate, the construction of MCQ test items demands experience, expertise and time [5]. After construction, pre-testing each item and tagging it to an instructional objective demands further expertise and skill [10].
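Continuing the sketch, an MCQ item bank tagged to objective codes might be represented as below; the fields and the per-objective scoring helper are illustrative assumptions rather than a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class MCQItem:
    item_id: str
    objective_id: str    # links the item to an instructional objective code
    stem: str
    options: list[str]
    correct_index: int   # index of the keyed answer within options

def score_attempt(items: list[MCQItem], responses: dict[str, int]) -> dict[str, float]:
    """Return the fraction of items answered correctly, per objective."""
    total: dict[str, int] = {}
    correct: dict[str, int] = {}
    for item in items:
        total[item.objective_id] = total.get(item.objective_id, 0) + 1
        if responses.get(item.item_id) == item.correct_index:
            correct[item.objective_id] = correct.get(item.objective_id, 0) + 1
    return {obj: correct.get(obj, 0) / n for obj, n in total.items()}
```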
The richness of the data collected increases as more schools use the same platform and the same question item bank. This not only enlarges the data set on which all analysis and interpretation rest, but also opens up opportunities such as standardization and scaling of scores for better analysis and interpretation.
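One common way to standardize and scale scores across such a pooled population is the z-score and a derived scaled score; the snippet below is a minimal sketch of that idea under a simple mean-and-standard-deviation scaling, not a prescription of how a CCEA platform must treat scores.

```python
from statistics import mean, pstdev

def z_scores(raw_scores: list[float]) -> list[float]:
    """Standardize raw scores against the pooled population (mean 0, SD 1)."""
    mu, sigma = mean(raw_scores), pstdev(raw_scores)
    if sigma == 0:
        return [0.0 for _ in raw_scores]
    return [(x - mu) / sigma for x in raw_scores]

def scaled_scores(raw_scores: list[float],
                  target_mean: float = 50, target_sd: float = 10) -> list[float]:
    """Map scores onto a conventional reporting scale (here, mean 50 and SD 10)."""
    return [target_mean + target_sd * z for z in z_scores(raw_scores)]

# With raw marks pooled from several schools, the same scale applies to everyone
pooled_marks = [12, 15, 18, 20, 22, 25, 27, 30]
print([round(s, 1) for s in scaled_scores(pooled_marks)])
```

With pooled data of this kind, the same raw mark earned in two different schools can be read on a common scale.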
Scheduling and Conducting Tests
The frequency and regularity of testing required to implement CCEA requires schools to have dedicated assessment labs with an internet connection. A lab equipped with as many computers as the class size can cater to 25 class sections while maintaining a frequency of two test exposures per week for every student. Hence each class's timetable should have two periods per week dedicated to assessment, and a coordinator-cum-lab-in-charge can handle the entire process.
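The lab-capacity figure is easy to sanity-check. In the short calculation below, the periods-per-day and days-per-week values are assumptions chosen for illustration; only the 25 sections and the two weekly test exposures come from the text.

```python
# Lab-capacity sanity check; periods_per_day and days_per_week are illustrative assumptions
sections = 25          # class sections sharing one assessment lab (from the text)
tests_per_week = 2     # test exposures per student per week (from the text)
periods_per_day = 9    # assumed number of periods in a school day
days_per_week = 6      # assumed working days per week

required = sections * tests_per_week           # lab periods needed: 50
available = periods_per_day * days_per_week    # lab periods available: 54

print(f"Required lab periods per week:  {required}")
print(f"Available lab periods per week: {available}")
print(f"Lab utilization: {required / available:.0%}")
```

Under these assumptions the lab is almost fully booked, which suggests that a school with more than 25 sections would need either a second lab or a lower test frequency.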
Reporting for Follow-up
The most important part of CCEA is how a student's test-attempt data are converted into useful information that enables all stakeholders to make a meaningful contribution to the student's academic success. There are three kinds of reporting to be done:
• Student's/parent's report
• Teachers' report
• Principal's report
The student's and parent's report should be generated immediately after the student takes a test. The report should cover the points below (a minimal sketch of how such a report could be assembled follows the list):
• A comparative analysis of the student's score vis-à-vis other test takers
• A list of instructional objectives where the student needs to develop proficiency
• Remedial suggestions, including which topics to revisit in the NCERT books and which public-domain web resources, such as reading material and videos, can help the student acquire the necessary skills
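A minimal sketch of how such a student/parent report could be assembled from per-objective scores is given below; the function name, the 0.6 proficiency threshold and the suggestion mapping are assumptions for illustration only.

```python
def student_report(per_objective: dict[str, float],
                   cohort_percentile: float,
                   suggestions: dict[str, str],
                   threshold: float = 0.6) -> dict:
    """Assemble a student/parent report from per-objective proficiency fractions.

    per_objective     -- fraction of items correct per objective (e.g. from score_attempt above)
    cohort_percentile -- the student's standing among all test takers
    suggestions       -- objective_id -> remedial resource (NCERT section, video link, ...)
    threshold         -- assumed proficiency cut-off for flagging an objective
    """
    weak = [obj for obj, frac in per_objective.items() if frac < threshold]
    return {
        "percentile_among_test_takers": cohort_percentile,
        "objectives_needing_work": weak,
        "remedial_suggestions": {obj: suggestions.get(obj, "Revisit the topic") for obj in weak},
    }
```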
The teacher's report is to be generated immediately after his or her class has taken a test. The report should address:
• What percentage of the class is showing proficiency in each instructional objective?
• Which students are showing exceptional proficiency in an instructional area, and which students need remedial action in that area?
• How do the class statistics compare with those of other sections and of all test takers?
The principal's report should be generated weekly, since a cycle of testing is completed each week. The principal should get an actionable report on the points below (an aggregation sketch covering both the teacher's and the principal's views follows the list):
• The class sections showing below-par achievement in various instructional areas
• The teachers who are doing exceptionally well, and the teachers, listed by subject, who need assistance and support
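The aggregation behind the teacher's and principal's views can be sketched similarly; the proficiency threshold and the 50% "below par" cut-off below are assumptions for illustration, not values defined in the text.

```python
def class_proficiency(class_results: dict[str, dict[str, float]],
                      threshold: float = 0.6) -> dict[str, float]:
    """Teacher view: percentage of the class proficient in each objective.

    class_results maps student_id -> {objective_id: fraction of items correct}.
    """
    total: dict[str, int] = {}
    proficient: dict[str, int] = {}
    for scores in class_results.values():
        for obj, frac in scores.items():
            total[obj] = total.get(obj, 0) + 1
            if frac >= threshold:
                proficient[obj] = proficient.get(obj, 0) + 1
    return {obj: 100 * proficient.get(obj, 0) / n for obj, n in total.items()}

def sections_below_par(section_stats: dict[str, dict[str, float]],
                       cutoff: float = 50.0) -> dict[str, list[str]]:
    """Principal view: for each section, the objectives where class proficiency
    falls below the assumed cut-off (here, 50% of students proficient)."""
    return {
        section: [obj for obj, pct in stats.items() if pct < cutoff]
        for section, stats in section_stats.items()
        if any(pct < cutoff for pct in stats.values())
    }
```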
It is essential that CCEA reports are not merely informative but are presented in a way that prompts and guides corrective action. The analysis and reporting should be generated without any manual intervention, completely automated and always on schedule. The value and utility of the reports are enhanced further when they keep pace with today's busy schedules and lifestyles by being available on handheld devices such as smartphones [6].
If the CCEA system is implemented, it has the potential to become the pillar of monitoring and controlling the complete academic environment of a school and to lead to a healthy and transparent culture of evaluation for continuous improvement.
References
1. Anderson, L.W. and Krathwohl, D.R. (eds) (2001), A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. New York: Longman.
2. Atkins, S. and O'Connor, G. (2005), 'Re-purposing learning objectives as assessment instruments', The 10th Annual Roundtable Conference AR+t: Assessment Reporting & Technology, http://www.vcaa.vic.edu.au/roundtable/papers/sagcoatkins.pdf
3. Bennett, R. E., Jenkins, F., Persky, H. and Weiss, A. (2003), 'Assessing complex problem solving performances', Assessment in Education, 10, 347-59.
4. Biggs, J. B. (2002), 'Aligning teaching and assessment to curriculum objectives', LTSN Imaginative Curriculum Guide, http://www.ucl.ac.uk/teaching-learning/global_uni/internationalisation/downloads/Aligning_teaching
5. Bonk, C. J. and Dennen, V. P. (2005), 'Massive multiplayer online gaming: a research framework for military education and training', Technical Report 2005-1, Washington, DC: U.S. Department of Defense (DUSD/R), Advanced Distributed Learning (ADL) Initiative.
6. Evans, D. (2005), 'Potential uses of wireless and mobile learning', http://www.jisc.ac.uk/uploaded_documents/Potential%20Uses%20FINAL%202005.doc.
7. Glaser, R. (1963), 'Instructional technology and the measurement of learning outcomes', American Psychologist, 18(8), 519-22.
8. Johnstone, A. (2004), 'LTSN Physical Sciences Practice Guide: Effective Practice in Objective Assessment. The Skills of Fixed Response Testing', http://www.physsci.ltsn.ac.uk/Publications/PracticeGuide/EffectivePracticeInObjectiveAssessment.pdf.
9. Kendle, A. and Northcote, M. (2000), 'The struggle for balance in the use of quantitative and qualitative e-assessment tasks', 17th Annual Conference of ASCILITE, http://ascilite.org.au/conferences/coffs00/papers/amanda_kendle.pdf.
10. McAlpine, M. (2002), 'Design requirements of a databank', CAA Centre, University of Luton, http://caacentre.lboro.ac.uk/dldocs/Bp3final.pdf.