This project (2018-1-SE01-KA201-039098) has been funded with support from the European Commission.
This web site reflects the views only of the author, and the Commission cannot be held responsible for any use which may be made of the information contained therein.



Competence Assessment Tools




NAME OF THE ASSESSMENT TOOL
Information Literacy Performance Assessment (ILPA)
NAME OF AUTHOR(S)
Nancy Law, Allan Yuen, Mark Shum, Y. Lee
NAME OF PRODUCER
Centre for Information Technology in Education, University of Hong Kong
DATE OF PRODUCTION
2006
COSTING
Not free of charge
DESCRIPTION OF THE COSTING POLICY
ILPA was a one-shot effort, but the questions are available from the authors.
DIGITAL COMPETENCES
Communication, Content creation, Problem solving
LEVEL OF KNOWLEDGE
Basic
LANGUAGE/S OF TEACHING RESOURCES
English
TUTORIAL SUPPORT
No
DESCRIPTION
ILPA was developed by the Centre for Information Technology in Education (CITE) at the University of Hong Kong as a diagnostic test to investigate whether the introduction of Information and Communication Technologies (ICT) in Hong Kong schools had affected the ability of primary and secondary school pupils to use ICT in practical situations [1].
The tests are predicated on the idea that mere technical ability is not sufficient for information literacy; rather, information literacy comprises seven dimensions in which a person should be able to use ICT: “Define”, “Access”, “Manage”, “Integrate”, “Create”, “Communicate” and “Evaluate”.
In order to assess the pupils’ abilities in these seven dimensions, CITE defined a set of tasks for each teaching subject tested (Chinese Language, Mathematics, and Science (Biology)), where each segment tested one of these dimensions.
In addition, questionnaires were distributed to:
the heads of the schools involved, to see what they prioritised in terms of pedagogical and technical goals at their schools,
the teachers at the schools, to get their own assessments of how well they understood ICT and how to use it in teaching,
the ICT staff at the schools, to get their assessment of what ICT software and hardware were available at the schools and what teaching activities they could support,
the tested pupils, to see if they had access to computers outside of school, how much they used them and for what.

The test tasks were set up on servers at CITE and accessed by software clients installed on the computers of the participating schools.

The tasks would require the pupils to:
Define: What information do they need to solve the task? I.e., how should they search for information, and what keywords should they use?
Access: Retrieve information from online sources.
Manage: Store and arrange data on their computers.
Integrate: Combine and analyse the retrieved data.
Create: Create a document presenting the integrated data.
Communicate: Present the integrated data to a specific audience.
Evaluate: Determine the extent to which the collected data fulfil the requirements.
The exact data to be collected, and how they were to be presented, were chosen separately for each school subject under test.


[1] Law, N., Yuen, A., Shum, M. and Lee, Y. 2007. Final Report on Phase (II) Study on Evaluating the Effectiveness of the ‘Empowering Learning and Teaching with Information Technology’ Strategy (2004/2007). Centre for Information Technology in Education (CITE), Faculty of Education, The University of Hong Kong.
COMMENTS
The strength of the ILPA approach is that it requires the tested pupils to perform an actual process of retrieving, analysing and reinterpreting data, thus corresponding to a semi-realistic situation.
The major weakness is that the process is hugely labour-intensive: it requires setting up servers and developing software for running the tests, installing client software at all the participating schools, running tests with invigilators in the room, and then evaluating the answers afterwards. (In practice, only a representative sample of the responses was evaluated, a full evaluation being infeasible.) Because of the creative element of the tasks, there is no simple, automated way of evaluating the responses.
The evaluators were trained to grade the responses in a uniform way, and double-checking their reported grades confirms that they managed to do this.
There are other limitations: the time allotted to the tests was seemingly too short. Knowing this, one could of course extend the time in a replication of the test, but that would in turn increase the resource requirements.
TEACHERS' COMMENT
The ILPA was a study conducted by the University of Hong Kong between 2004 and 2007 to evaluate the effectiveness of the ‘Empowering Learning and Teaching with Information Technology’ strategy. The study’s objectives were to evaluate the impact of IT on students’ learning outcomes in Chinese and Mathematics at primary level, and in Chinese and Science at secondary level and in special schools, and to draw conclusions on the effectiveness of the strategy and make recommendations for the future. The study consisted of questionnaires distributed to school heads, teachers and selected students, which took approximately thirty minutes to complete, and an online proficiency assessment, which took ninety minutes to complete.
Conclusions from the study show that, in general, students at all levels attained the basic level in all dimensions but were “rather weak” in attaining the higher levels of proficiency that require higher-order or critical thinking. The study found the strategy to be generally effective, but there were still gaps and discrepancies among schools in terms of support and infrastructure. Teachers were found to be more competent in general ICT use than in the pedagogical use of ICT.
The study recommended that the HKSAR Government establish a minimum standard for ICT access and schools’ ICT infrastructure. It also noted that, although guidelines for the employment of technical support staff exist, there is no enforcement mechanism to ensure that schools apply them appropriately, and recommended establishing an up-to-date benchmark test for the minimum expected knowledge and skills of school technicians.