Logos International School
Phnom Penh, Cambodia
Hope International School
Phnom Penh, Cambodia
The increased accessibility of data analytic tools provides opportunities to improve teaching, learning, and career counselling in secondary education. This preliminary study explored the design, development, and deployment of an initiative providing teachers and career counsellors with dashboards to improve teaching, learning, and guidance. The study applied an action research methodology that drew from the Unified Theory of Acceptance and Use of Technology model in two international secondary schools in Cambodia. Though limited in its scope, the study highlights the potential for dashboards to inform teaching, learning, and career counselling when integrated with the school teaching and learning philosophy and professional development.
Keywords: Career counselling; learning styles; dashboards; Unified Theory of Acceptance and Use of Technology; secondary education.
One common quote attributed to management guru Peter Drucker is “what gets measured gets managed” (Prusak, 2010). While this statement may reflect many aspects of modern education, it presupposes that data is analysed to inform teaching or management practice. However, our experience as educators suggests that analysis often does not occur or lacks depth. Furthermore, the analysis process may be impacted by confusing or contradictory data. To improve classroom management, we believe an important step is to enhance the capability of teachers to analyse data by presenting information visually.
Our study began as a collaborative project between three staff members from two Cambodian international schools to streamline and visualise the career guidance process. Both international K-12 schools were approximately 20 years old, of a similar size (approximately 300 students) and located in Phnom Penh but differed by following either a US or UK curriculum. For both schools, career guidance was important because most students sought international tertiary study in destinations such as the US, UK, Australia, Canada, Singapore, and New Zealand. With each country having different study requirements (Global Information Consultants, n.d.), it was important for career counsellors and teachers to access and integrate this information in their respective roles to meet learner expectations.
In scoping the project, the team identified a potential to improve teaching and learning by capturing and visualising information about students, their interests, and their learning styles in a dashboard. By ‘dashboard,’ we refer to graphical summaries of key information arranged to enable data analysis and decision-making (Brouns et al., 2015). Prominent software packages used to create dashboards include Microsoft Power BI, Google Analytics, Sisense, and Zoho Analytics. Where previously these schools collected information on student learning styles through Learning Support (i.e., school programmes working with students identified as having special learning needs or being gifted and talented), this information was limited to students receiving learning support, was not widely distributed, and could not be easily analysed at an individual or class level. Our team hypothesised that better collection, dissemination, and analysis of this information through dashboards could enable teachers to better cater for cognitive diversity and improve teaching and learning in our classrooms.
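To illustrate the kind of class-level analysis a dashboard makes possible, the sketch below tallies hypothetical learning-style survey responses into a per-class profile. All student names, class labels, and style categories are invented for illustration; they are not the schools' actual data or schema.

```python
# Illustrative sketch only: aggregating hypothetical learning-style survey
# responses into a class-level tally, the underlying table a dashboard chart
# (e.g., in Power BI) would visualise. All names and data are invented.
from collections import Counter, defaultdict

# (student, class, preferred learning style) -- hypothetical survey rows
responses = [
    ("Ana", "7A", "visual"),
    ("Bora", "7A", "paired"),
    ("Chan", "7A", "visual"),
    ("Dara", "7B", "individual"),
    ("Ema", "7B", "paired"),
]

# Tally preferences per class so a teacher can see the class profile at a glance.
by_class = defaultdict(Counter)
for student, cls, style in responses:
    by_class[cls][style] += 1

for cls, tally in sorted(by_class.items()):
    print(cls, dict(tally))
```

A summary like this is what allows analysis "at an individual or class level": the same rows can be filtered to one student or rolled up to a whole class.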
Research question and objectives
We identified that a key success factor in the project was the acceptance and use of the dashboards by teachers and career counsellors. To better understand technology adoption, we initiated this concurrent study using action research to analyse results and share findings with staff so lessons could be better integrated into future development stages and other technology projects. The research question we formulated was: What can we glean about user acceptance and adoption of dashboards in our schools that can be used to support future teaching and learning and project development? Our project aims were to:
- Increase efficiency for learning support and career guidance by process mapping and streamlining related processes,
- Consolidate data collection about learning styles, learning preferences, and career destinations in one responsive form, and
- Improve teaching and learning by communicating information to academic staff through dashboards.
We planned the project to occur in four stages. Stage one conceptualised and designed the dashboard, stage two developed dashboards, stage three analysed feedback, and stage four compiled recommendations to guide further development. This article explores the underpinning theories, summarises the study methodology and its findings, and presents recommendations for further development and future projects.
The Unified Theory of Acceptance and Use of Technology
Research on user acceptance and adoption of technology has seen a plethora of related models proposed over the past 40 years. Among the most recognised is the Unified Theory of Acceptance and Use of Technology (UTAUT) model – a synthesis of eight earlier psychology, innovation, and information technology models and theories (Venkatesh et al., 2003). Venkatesh et al. (2003) argued that four dimensions influence behavioural acceptance and use – performance expectancy (with constructs of perceived usefulness, job fit, extrinsic motivation, and relative advantage), effort expectancy (with constructs of perceived ease of use, complexity, and ease of use), social influence (with constructs of subjective norms, social factors, and image) and facilitating conditions (with constructs of perceived behavioural control, facilitating conditions and compatibility) (see Figure 1). We believed that the synthesis of the earlier models into these four clear dimensions provided a good framework for understanding factors shaping staff acceptance and use of dashboards.
Figure 1: The Unified Theory of Acceptance and Use of Technology (Venkatesh et al., 2003, p. 447)
Models and theories underpinning the UTAUT model provide insight into how technology is accepted and used. The Technology Acceptance Model (TAM) predicts acceptance or rejection of technology based on the perceived usefulness of technology and the perceived ease of use (Granić & Marangunić, 2019). Research by Alharbi and Drew (2014) found that the behaviours of teaching staff in a Saudi university were consistent with the TAM – suggesting the model could be applicable in education contexts. However, a study by Setyohadi et al. (2017) on the adoption of e-Learning in an Indonesian context found that peer influence was also a key factor in user acceptance, of more importance than perceived usefulness. Venkatesh et al. (2003) captured both factors in the UTAUT model and also drew from the Theory of Reasoned Action, where Sheppard et al. (1988) argued that user feelings (positive or negative) and the feelings of their social network encouraged or hindered technology adoption. While we believed staff individual and collective beliefs might shape dashboard acceptance and use, we also recognised that staff could lack the experience or skills to use dashboard information in their teaching.
Another important underpinning theory of the UTAUT, the Theory of PC Utilisation, argued that acceptance and use were influenced directly by experience and indirectly by job fit, complexity, long-term consequences, affect towards use, social factors, and facilitating conditions (Thompson et al., 1994). Social Cognitive Theory likewise argued that self-efficacy in using technology was a significant factor shaping acceptance and use (Compeau & Higgins, 1995). While aspects of these models may still hold true, both were proposed in the 1990s when computer use in secondary schools was much lower. In contrast, staff in the schools in our study had extensive computer experience, with all teaching at this time online due to COVID-19.
The Innovation Diffusion Theory argued technology diffusion occurs when there is a relative advantage in technology over its predecessor, ease of use, an enhancement to personal image or status, visibility within the organisation, compatibility with existing values, needs and past experiences, and demonstrated results (Moore & Benbasat, 1996). It was difficult to validate these constructs in the study as users had no direct precedent, and there was no clear link to how the use of the dashboards would enhance image or visibility.
We suspected other psychological models or theories underpinning the UTAUT model would have less influence on user acceptance and use at this stage in the project. The Motivational Model argued acceptance could be linked to extrinsic motivators when there is a belief that desired outcomes are linked to an activity or intrinsic motivators if the user is self-motivated for personal reasons (Vallerand, 1997). Nevertheless, our project had no formal, direct, or clear extrinsic motivators encouraging dashboard use, and dashboards were available to staff on a voluntary basis. The related Theory of Planned Behaviour (and also the Combined Technology Acceptance Model and Theory of Planned Behaviour) also argued that decision-making is influenced by user intention and perceived behavioural control of a technology (Taylor & Todd, 1995). Given the new and evolving development of the dashboards, it was unclear if these constructs would be a factor at this stage in the project.
Since the model was proposed, some academics have noted shortcomings in the UTAUT model or proposed further developments. These developments include factoring in online social support (Lin & Anol, 2008) and social networks (Sykes et al., 2009) or considering the influence of mobile technology (Wang & Wang, 2010) – factors not relevant in this study. Bagozzi (2007) noted how the 41 variables and at least eight independent variables underpinning the UTAUT model contributed further to confusion in the study of technology adoption, and Van Raaij and Schepers (2008) and Li (2020) suggested the model may be overly complex. To simplify the study, our team focused on exploring the dimensions identified by Venkatesh et al. (2003) through a qualitative study as a preliminary for further research in later project stages.
Use of dashboards within education
Although our project signals progress within our institutions, other studies have also used dashboards to share education analytics. In a UK study by Herodotou et al. (2021), students previously identified as being at risk of failing performed significantly better when staff actively used dashboards displaying predictive learning analytics. A similar but more limited Vietnamese university research project proposed how dashboards may achieve this but did not track or report on outcomes (Thanh et al., 2021). Limited research by Darling-Hammond et al. (2014) also suggests the use of dashboards may strengthen college preparation through targeted discussions and interventions – reflecting the potential benefits for student achievement through dashboards.
Research has also explored factors shaping successful dashboard implementation. Bingimlas (2009) found that despite teachers having a strong desire to integrate technology into their lessons, they were hindered by a lack of confidence and competence or had negative attitudes and inherent resistance. Raffaghelli et al. (2022), in a Spanish higher education institute, also found that unrealistic user expectations of Early Warning Systems (EWS) – what we refer to as ‘dashboards’ – negatively impacted acceptance and use. Similarly, research by Klein et al. (2019) in higher education found that successful adoption by faculty required reliable technological infrastructure and a ‘fit’ between the dashboard and user needs. Although research in secondary education is limited, it suggests dashboard adoption and use requires solutions that combine technology and support.
While earlier research found that dashboards helped visualise learning analytics and identified some strategies that shape dashboard adoption and use, there were some research gaps. We found no research exploring dashboard adoption and use by secondary staff in a Cambodian context. Furthermore, it was unclear how the increased use of technology by staff through blended and online learning during COVID-19 (König et al., 2020) impacted staff technology competency and attitudes and how this would impact dashboard adoption and use. Our project considered these factors by creating what we believed to be innovative, useful, and simple-to-use dashboards for teaching and counselling staff with technical support from our team and from a short training demonstration.
We applied an action research methodology to support practitioner-led professional learning by encouraging feedback and reflection. While we recognised potential tensions in balancing professional outcomes and research (Simonsen, 2009), we believed this approach could be a catalyst for further projects in these schools and help foster a research culture consistent with the research requirements of the Kingdom of Cambodia. Action research is an interactive, collaborative and data-driven enquiry process that seeks to create change by problem-solving and testing potential solutions (Young et al., 2010). As Carr and Kemmis put it, action research is “a form of self-reflective enquiry undertaken by participants in social situations in order to improve the rationality and justice of their own practices, their understanding of these practices and the situations in which the practices are carried out” (Carr & Kemmis, 2006, pp. 5-6) and is often divided into four phases – planning, acting, observing, and reflecting. To reflect the iterative process of action research, the project was divided into cyclical development stages where earlier development was used to inform later decisions and development. A summary of these project cycles is shown in Table 1.
Table 1: Project cycles
An important emphasis in our study was double-loop learning. Unlike single-loop learning which seeks to address the ‘problem,’ double-loop learning addresses the underlying mental model(s) and challenges these concepts (Argyris, 1980). Our project cycles applied double-loop learning as we captured evidence from observing and reflecting on cycle 1 to inform further development in cycle 2. Our findings also helped us support teachers in our schools by providing feedback on their teaching and learning through the dashboards.
The study relied primarily on evidence collected through interviews and focus groups. Initial interviews with key project stakeholders from both schools helped draft flowcharts, forms, and static dashboards. These designs were then explored and developed by an exploratory focus group of three selected teachers and career counsellors from both schools. In joining the study, participants in the latter focus group completed a research form that collected background details and their use of the dashboard. A further recorded focus group composed of five participating teachers and two career counsellors from both schools explored the deployed dashboards and implementation process. This recorded focus group was conducted online and in English due to COVID-19 restrictions. A summary of participants is shown in Table 2.
Table 2: Participants’ profiles
| Name | Gender | School | Role(s) | Experience |
|---|---|---|---|---|
| Cindy | Female | School A | Learning Support & Teacher | > 10 years |
| Chantelle | Female | School A | Teacher | 1 – 3 years |
| Barry | Male | School A | Teacher, Learning Support & Career Counsellor | > 10 years |
| Neil | Male | School A | Teacher | > 10 years |
| Jenny | Female | School B | Teacher | > 10 years |
| Makara | Male | School B | Learning Support & Career Counsellor | 4 – 6 years |
All teaching staff at both schools had significant experience using computers in their teaching, especially since COVID-19 lockdowns moved teaching online. Given the teaching context, teacher experience, and relative simplicity of the dashboards, the team felt minimal teacher training was needed and only conducted a short five-minute demonstration during a staff meeting, followed by a short email answering some anticipated questions.
Evidence was collected from the focus groups and analysed using QDA Miner. Categories and codes were created based on the UTAUT model and the proposed project stages. Codes were reviewed multiple times and by at least two team members. Quotes from participants in the latter focus group are included in the next section.
As with other action research studies, there was tension in balancing the role of researcher and team member during the project design, development, and implementation (Simonsen, 2009). Our small team included a diversity of roles, with team members being career counsellors, secondary teachers, and Learning Support staff in the represented schools, and their perspectives inevitably influenced their peers. Furthermore, the selection process encouraged participation by teachers who were positive about the project’s potential and did not reflect the opinions of all teachers. Nevertheless, we believe our project aims and approach are aligned with those proposed by Sagor (2000) for action research to create reflective teachers, build professional cultures and progress institutional priorities and that our outcomes will help our schools improve their teaching and learning tools.
The dashboards sought to support staff in lesson preparation, interventions, and career preparation at an individual and class level. The information displayed through the dashboards included student name, nationality, personality, student identified values, student interests, student preferences for learning new information, student preferences for reinforcing learning, student preferences for learning conditions, student preferred assessments, career fields, and university preferences (including university and intended university activities). We also introduced security to protect information integrity and restrict dashboard access to the appropriate staff.
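The fields above, together with the role-restricted access described, can be sketched as a simple record plus a role-based view filter. This is a hypothetical illustration only: the field names, roles, and access policy below are our assumptions for the sketch, not the schools' actual schema or security implementation.

```python
# Hypothetical sketch of a student record behind the dashboards and a
# role-based filter restricting which fields each staff role can see.
# Field names, roles, and the policy are illustrative assumptions.
from dataclasses import dataclass, asdict

@dataclass
class StudentRecord:
    name: str
    nationality: str
    personality: str
    values: list
    interests: list
    learning_preferences: dict   # e.g., new information, reinforcement, conditions
    preferred_assessments: list
    career_fields: list
    university_preferences: list

# Which dashboard fields each role may view (illustrative policy only).
ROLE_FIELDS = {
    "teacher": {"name", "learning_preferences", "preferred_assessments"},
    "career_counsellor": {"name", "interests", "career_fields",
                          "university_preferences"},
}

def visible_fields(record: StudentRecord, role: str) -> dict:
    """Return only the fields the given role is permitted to see."""
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in asdict(record).items() if k in allowed}
```

Centralising the policy in one mapping, rather than building separate extracts per role, is one way to keep "restrict dashboard access to the appropriate staff" auditable as roles and fields grow.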
Findings and discussion
This section is divided into subsections exploring the dimensions identified by Venkatesh et al. (2003) in the UTAUT model before exploring how dashboards can support teaching and learning and concluding with further development and research.
There was no clear evidence in the study that social influence had any significant impact on the adoption of dashboards by participants. These results are consistent with other UTAUT-based research on the acceptance and use of Learning Management System (LMS) platforms in higher education in Saudi Arabia (Al-Shehri, 2017) and Jordan (Abbad, 2021), which those authors attribute to familiarity with a digital environment. While this is also true of staff in these schools, we believe a better explanation is that staff had no previous experience using dashboards in a formal education context or as a tool to inform teaching and learning. Given the short project cycles (around two months), we believe there was insufficient time for social attitudes and norms to form on the use or design of dashboards. We expect this to change in subsequent project cycles as staff undertake further training on cognitive diversity, and we further promote the use of the dashboards within the school communities.
Our study identified that all three roles were impacted by the dashboards – teachers, career counsellors, and learning support. Teaching staff believed dashboards were useful and relevant in their teaching role. Neil shared how he believed:
This dashboard gives us the ability to really focus on teaching the class as a composition of individuals rather than just on reaching the one student.
He believed this could change lesson preparation in the future within both schools. Cindy shared how:
Seeing the different types of ways that students learn [helps] me make sure that I include a pre-recorded lecture or a game or whatever the students like, so I think [the dashboards] will be very useful.
Cindy has since used dashboards in her curriculum development to improve teaching and learning. Chantelle also used concepts she saw in the dashboards when teaching middle school:
I asked [middle school students] how they like to learn, or what seems to be helpful for them. I got those ideas from the dashboards and …it helped me to differentiate between sixth and seventh grade. The sixth-grade class loves paired learning, or pair-share… In seventh grade they prefer working on their own so they are more of individual workers. …I think the dashboard can be really helpful, especially when trying to figure out how to approach a class, [whether] a class in general, or a particular student.
Based on Chantelle’s comments, participants suggested extending data collection and dashboards to include middle school in the next project cycle. Cindy also believed that student destination information helped teachers in curriculum development, as:
Young students asked me, ‘why do we need to know this again,’ and then I tell them, ‘well, even if you’re going to become like a race car driver when you grow up,’ because some of them want to be race car drivers… Connecting what I’m trying to teach them to where they’re headed… is pretty helpful.
Many focus group participants were experienced teachers and believed the dashboards presented significant benefits for new or inexperienced teachers. Makara shared:
Part of being professional is that you [learn] to read your classes, over time… When you are a new teacher [you do not have this ability so] this [dashboard] would be really, really useful.
All the teachers believed that dashboards were useful and perceived them to be an asset in their teaching role, particularly for new or inexperienced teachers. Teachers believed dashboards could help them understand student characteristics like personality, career direction, and preferred learning styles and plan and manage the class learning experience by using preferable assessments, learning activities, and interventions.
Learning support staff also perceived the dashboards to be useful in understanding student strengths and weaknesses and catering to students requiring learning support. These teachers believed the dashboards could improve lesson planning and intervention by better integrating students with and without learning and special needs in the classroom. Makara shared how the Personalised Education Plan (PEP), a plan outlining needs and strategies for the learning needs of gifted and talented students, would be enhanced by dashboards as teachers can:
Look at not only what those learning difficulties [there] are but actually how best does that student learn. We can use that to put that in place in their educational profile. If you are working with the way the students are identified through the dashboard, and this is how they best learn, it will be more helpful for them [if you are able to engage] the student in a way they are wanting to work as well as helping to overcome the learning difficulties that they have.
As dashboards convey information from student surveys, they provide insight into how students perceive the learning experience. Several teachers who were not in learning support also believed dashboards could enhance the PEP reports. Cindy shared that when:
[Talking] about each individual student who has a PEP, [the] dashboard would be a great resource that they could then use to point teachers to… specifically for those students who… may not have the support that other students have.
Likewise, Barry shared:
I actually just used the dashboards in rewriting [this year’s] PEPs (Personal Education Plans) as I had such great data on the students… so I was able to incorporate that in the redrafting of their PEPs.
Though not many learning support staff were represented in the study, those represented believed that dashboards were useful for meeting learning needs by improving documentation and supporting lesson planning and intervention. These results reflect similar initiatives in higher education which also linked dashboards to improved lesson planning and interventions (Herodotou et al., 2021; Raffaghelli et al., 2022).
Career counsellors found dashboards useful. Barry shared that:
Dashboards for the specific role of guidance are very relevant because I normally want to gather all of that data through an interview process anyway… [Dashboards are] very relevant for me [as they collect] data ahead of time for a first or second meeting with a student.
Though only a few career counsellors were represented in this study and in the staff community, they recognised how dashboards helped them collect and disseminate student career intentions, which reduced meeting times and allowed them to better understand student university requirements. While these initial results are promising, future project developments may be strengthened by improving accountability through shared access between career counsellors, teachers, and students (Darling-Hammond et al., 2014) and by better facilitating meaningful learning in learners and the professional capacity of teachers and career counsellors.
The data showed that managerial support was important in the acceptance and use of dashboards. Jenny shared the importance education management plays in supporting the implementation of dashboards:
I’d probably prefer a check in maybe once a term or semester [from management, to find out] ‘Okay, have you checked in with the dashboard to see where students are at right now? Is your teaching in line with that? Do you need to adjust?’ A regular reminder, I think, might be helpful.
Furthermore, several participants noted how dashboards could be incorporated into teacher observations and linked to online sources for teacher professional development. Barry explained that:
The only reason to ever do an observation is to help a teacher grow, right? If you are doing an observation, and the observation shows evidence of a need for growth in a certain area, and then you [ideally could be] connected [through the dashboard] to professional development resources as a way to just encourage that development, growth [will happen] for sure.
Most participants expressed a belief that managerial support in the form of periodic encouragement and better integration of the dashboards into school culture would increase dashboard use. While this supports research by Nistor et al. (2012) that professional culture is an important factor in technology adoption, our research differed in having no clear evidence that national culture impacted the project. This is surprising given the staff diversity within both international schools but reflects that all participants who volunteered for the focus group were expatriate staff. Future cycles must explore how school and national cultures influence dashboard acceptance and use, with particular focus on potential impacts for Cambodian staff.
Several themes about effort expectancy consistently emerged. The first was that there was inadequate training and support. Chantelle shared about the importance of:
Doing [training on the dashboards] during training week before school starts. I had to fiddle around with [the dashboard] for quite a bit and if it’s something that I feel like I have no idea how to use, it’s harder to want to use it.
When teachers see the application and the ease of it and the usefulness in practice, they’re more likely to use it.
Despite the earlier demonstration, the teachers believed better training was needed to use the dashboards – reflecting research by Raffaghelli et al. (2022) that technology familiarisation is necessary to support the acceptance and use of dashboards in education contexts. Staff further expressed how dashboards were not user-friendly as they contained too much data and could be confusing. Chantelle shared that:
It’s got a lot of good information, but sometimes too much information can make it overwhelming… I think the way the information is organised can be improved. The information in the dashboard is useful, but too much information makes the information hard to understand.
During this development stage, dashboards were designed to visualise as much information for as many roles as possible. Feedback suggests this approach is counter-productive and analysis is likely to improve by structuring dashboards to target information needs. For some participants, this meant having information spread over multiple dashboards and more ‘white space.’ Other participants recommended structuring information to align with the learner journey and progression of students, as Thanh et al. (2021) noted in calling for a ‘simple framework’ with less information. In contrast, Raffaghelli et al. (2022) found their users instead had higher expectations of dashboards. The contrasting perspectives show that the project team must engage users to better identify and ‘fit’ user needs (Klein et al., 2019) as these likely differ with the context.
Using dashboards to support teaching and learning
Despite strong staff feedback about the perceived usefulness of dashboards, staff acceptance was limited by insufficient training, managerial support, and professional development. These findings are consistent with earlier findings that user acceptance could be hindered by insufficient training but enhanced by providing user training and support (Compeau & Higgins, 1995). As the perceived usefulness and ease of use improve through staff training on using the dashboards, the literature suggests higher acceptance and use (Davis et al., 1989).
We also believe dashboards must be better incorporated into the teaching and learning philosophy of each school. Staff believed this could include professional development on applying dashboard information to lesson planning and delivery, teaching observations, and performance reviews. This could occur by linking dashboard information to online professional development resources. As Barry shared:
I think [professional development could be useful in] connecting the dashboards to differentiation… and teacher planning [so there is professional development content on] … differentiation understanding, some of the different learning styles of your students and how that then plays into your lesson planning and some of your activity choices.
A model illustrating how dashboards must be developed is shown in Figure 2. We recommend ongoing development of the dashboards as an educational tool, but more importantly, integrating this tool with the school teaching and learning philosophy and professional development. Finally, we believe implementation must be supported by management through extrinsic motivators and reinforced in teaching practice in observations and performance reviews.
Figure 2. Framework for supporting teaching and learning
Further dashboard development and research
Teaching and learning support staff recommended changing the current functional dashboard design that segments information based on roles to better align with the learner journey. Cindy suggested:
I want to see… information as a class… [as well as individually so] when you have an issue with one student [and think] ‘why are they not understanding, what can I do to help them’… I can see the student details as well.
Cindy also shared how she would like information segmented. She said:
[The] first thing that I’m going to want to know is how my students learn. That’s the goal of [the] lesson plan… so that they can learn.
Jenny also shared that:
From a teacher’s point of view that kind of a progression [starting with who I am as a student] makes the most sense for me. …If I went into it and I could see that aspect of the student’s journey, it would be more relevant from a teacher’s perspective.
Building on feedback, we planned future project cycles to build on three questions: “who is my student?” (student personality, interests, and values), “how do my students learn?” (learning styles of individual students and classes), and “what are the outcomes?” (intended and actual academic, social and spiritual outcomes of students). Participants also requested that dashboards more clearly emphasise teaching and learning methods relevant during COVID-19 lockdowns and include student family and faith characteristics – aspects previously captured by Learning Support but not shown in current dashboards. We believe the additional information and new design will help make the dashboards more aligned to teacher expectations and intended use.
Although our study validated aspects of the UTAUT model, further project cycles and study are required. Some factors, such as perceived ease of use and perceived usefulness, were consistent themes and aligned with earlier TAM research (Granić & Marangunić, 2019). Other factors, such as the influence of social support, national culture, or user intentions, may influence future acceptance and use of dashboards by staff but require further validation in upcoming project cycles. While our intent was to use the UTAUT model as a holistic and ‘unified’ approach, the model was too complex to easily explain staff acceptance and use of the dashboards within the project. We recommend future project cycles use the earlier TAM, as its factors more clearly explain teacher acceptance and use of dashboards and can be more rigorously tested.
The use of ‘dashboards’ in education has become more prevalent in the past decade, moving from managerial to practitioner tools. Our study reflected how some staff recognised benefits to their teaching and learning by using dashboards to analyse information about their students. Though dashboard acceptance and use by teachers were high in the focus group, the small number of participants and the concentration of expatriate teaching staff mean the findings are unlikely to represent the wider beliefs of staff or of Cambodian teachers more generally.
New technologies in software and data analytics have the potential to enhance efficiency as well as both short-term and long-term student outcomes. We found dashboards were relevant to multiple school roles, including teaching, learning support, and career counselling, in secondary school and possibly middle school. This study reflects how, when implementing new education tools and technologies, management must ensure they holistically support intended school outcomes and are implemented with appropriate staff professional development in alignment with the school teaching and learning philosophy.
Our findings show merit in undertaking further project cycles to refine the dashboards and explore how dashboards are adopted and used within these schools, and how this impacts teaching and learning. Further research may also expand the current project to include younger grades and to examine how dashboards can more specifically support teachers as they identify and address cognitive diversity. Lastly, future studies should include a larger and more diverse participant group, with a higher representation of Cambodian staff and more varied experience with technology, so results can better reflect and inform further implementation of similar projects within the Kingdom of Cambodia.
We would like to acknowledge the input of all the participants who took part in the focus group discussions and testing of dashboard use. We would also like to thank the leadership of both contributing schools for their support of this project.
This study was conducted in accordance with the ethics requirements of the Kingdom of Cambodia and the participating schools. Information in this study remained confidential and was restricted to staff. To protect the participants’ identity, participant names in this article are pseudonyms. All researchers in this study were volunteers and did not receive payments or benefits beyond their salary for their contribution to this research.
Nathan Polley is a professional educator with experience in consulting, education leadership and management, research, corporate training, project management and administration in various senior education management roles in Australia, Aotearoa-New Zealand, Papua New Guinea, Egypt and Cambodia. Nathan currently lives in Cambodia, where he works with various business and education leaders to scale their organisations.
Russell Mills is a Guidance Counsellor with Hope International School, a school located in Phnom Penh, Cambodia. Previously he worked as a Guidance Counsellor and an English Teacher in various Cambodian private colleges and as an international and professional musician. His interests include introducing jazz music within cross-cultural settings and supporting third-culture kids (TCKs) to thrive in tertiary learning.
Abbad, M. M. M. (2021). Using the UTAUT model to understand students’ usage of e-learning systems in developing countries. Education and Information Technologies, 26(6), 7205–7224. https://doi.org/10.1007/s10639-021-10573-5
Alharbi, S., & Drew, S. (2014). Using the Technology Acceptance Model in understanding academics’ behavioural intention to use Learning Management Systems. International Journal of Advanced Computer Science and Applications, 5, 143–155. https://doi.org/10.14569/IJACSA.2014.050120
Al-Shehri, M. (2017). The effectiveness of D2L system: An evaluation of teaching-learning process in the Kingdom of Saudi Arabia. International Journal of Advanced Computer Science and Applications (IJACSA), 8(1), 442–448. https://doi.org/10.14569/IJACSA.2017.080156
Argyris, C. (1980). Inner contradictions of rigorous research. Academic Press.
Bagozzi, R. (2007). The legacy of the Technology Acceptance Model and a proposal for a paradigm shift. Journal for the Association for Information Systems, 8(4), 244–254. https://doi.org/10.17705/1jais.00122
Bingimlas, K. (2009). Barriers to the successful integration of ICT in teaching and learning environments: A review of the literature. Eurasia Journal of Mathematics Science & Technology Education, 5(3), 235–245. https://doi.org/10.12973/ejmste/75275
Brouns, F., Zorrilla Pantaleón, M. E., Álvarez Saiz, E. E., Solana-González, P., Cobo Ortega, Á., Rocha Blanco, E. R., Collantes Viaña, M., Rodríguez Hoyos, C., De Lima Silva, M., Marta-Lazo, C., Gabelas Barroso, J. A., Arranz, P., García, L., Silva, A., Sáez López, J. M., Ventura Expósito, P., Jordano de la Torre, M., Bohuschke, F., & Viñuales, J. (2015). Elearning, communication and open-data: Massive mobile, ubiquitous and open learning (D2.5 Learning analytics requirements and metrics report). Elearning Communication Open-Data. https://repositorio.unican.es/xmlui/handle/10902/15231
Carr, W., & Kemmis, S. (2006). Becoming critical: Education, knowledge and action research. Routledge.
Compeau, D. R., & Higgins, C. A. (1995). Computer self-efficacy: Development of a measure and initial test. Management Information Systems Quarterly, 19(2), 189–211. https://doi.org/10.2307/249688
Darling-Hammond, L., Wilhoit, G., & Pittenger, L. (2014). Accountability for college and career readiness: Developing a new paradigm. Education Policy Analysis Archives, 22(86), 1–38. http://dx.doi.org/10.14507/epaa.v22n86.2014
Global Information Consultants. (n.d.). Study abroad overview. https://www.globaledu.in/study-abroad-overview
Granić, A., & Marangunić, N. (2019). Technology acceptance model in educational context: A systematic literature review. British Journal of Educational Technology, 50(5), 2572–2593. https://doi.org/10.1111/bjet.12864
Herodotou, C., Maguire, C., McDowell, N., Hlosta, M., & Boroowa, A. (2021). The engagement of university teachers with predictive learning analytics. Computers & Education, 173, 104285. https://doi.org/10.1016/j.compedu.2021.104285
Klein, C., Lester, J., Rangwala, H., & Johri, A. (2019). Technological barriers and incentives to learning analytics adoption in higher education: Insights from users. Journal of Computing in Higher Education, 31(3), 604–625. https://doi.org/10.1007/s12528-019-09210-5
König, J., Jäger-Biela, D. J., & Glutsch, N. (2020). Adapting to online teaching during COVID-19 school closure: Teacher education and teacher competence effects among early career teachers in Germany. European Journal of Teacher Education, 43(4), 608–622. https://doi.org/10.1080/02619768.2020.1809650
Li, J. (2020). Blockchain technology adoption: Examining the fundamental drivers. Proceedings of the 2020 2nd International Conference on Management Science and Industrial Engineering, 253–260. https://doi.org/10.13140/RG.2.2.30288.25602/1
Lin, C.-P., & Anol, B. (2008). Learning online social support: An investigation of network Information Technology based on UTAUT. CyberPsychology & Behavior, 11(3), 268–272. https://doi.org/10.1089/cpb.2007.0057
Moore, G. C., & Benbasat, I. (1996). Integrating Diffusion of Innovations and Theory of Reasoned Action models to predict utilization of information technology by end-users. In K. Kautz & J. Pries-Heje (Eds.), Diffusion and Adoption of Information Technology: Proceedings of the first IFIP WG 8.6 working conference on the diffusion and adoption of information technology, Oslo, Norway, October 1995 (pp. 132–146). Springer. https://doi.org/10.1007/978-0-387-34982-4_10
Nistor, N., Lerche, T., Weinberger, A., Ceobanu, C., & Heymann, O. (2012). Towards the integration of culture into the Unified Theory of Acceptance and Use of Technology. British Journal of Educational Technology, 45(1), 36–55. https://doi.org/10.1111/j.1467-8535.2012.01383.x
Prusak, L. (2010, October 7). What can’t be measured. Harvard Business Review. https://hbr.org/2010/10/what-cant-be-measured
Raffaghelli, J. E., Rodríguez, M. E., Guerrero-Roldán, A.-E., & Bañeres, D. (2022). Applying the UTAUT model to explain the students’ acceptance of an early warning system in Higher Education. Computers & Education, 182, 1–14. https://doi.org/10.1016/j.compedu.2022.104468
Sagor, R. (2000). Guiding school improvement with action research. Association for Supervision and Curriculum Development. http://site.ebrary.com/id/10115189
Setyohadi, D. B., Artisan, M., Sinaga, B. L., & Hamid, N. A. A. (2017). Social critical factors affecting intentions and behaviours to use e-Learning: An empirical investigation using Technology Acceptance Model. Science Alert, 10(4), 271–280. https://doi.org/10.3923/ajsr.2017.271.280
Sheppard, B., Hartwick, J., & Warshaw, P. (1988). The Theory of Reasoned Action: A meta-analysis of past research with recommendations for modifications and future research. Journal of Consumer Research, 15(1), 325–343. https://doi.org/10.1086/209170
Simonsen, J. (2009). The challenges for action research projects. Scandinavian Journal of Information Systems, 21(1), 124–141.
Sykes, T. A., Venkatesh, V., & Gosain, S. (2009). Model of Acceptance with peer support: A social network perspective to understand employees’ system use. Management Information Systems Quarterly, 33(2), 371–393. https://doi.org/10.2307/20650296
Taylor, S., & Todd, P. (1995). Assessing IT usage: The role of prior experience. Management Information Systems Quarterly, 19(4), 561–570. https://doi.org/10.2307/249633
Thanh, T., Td, D., Le, D. H., & Jr, P. G. A. (2021). Simple student grades analytics dashboards in higher education in Vietnam. Journal of Contemporary Issues in Business and Government, 27(1), 445–453.
Thompson, R. L., Higgins, C. A., & Howell, J. M. (1994). Influence of experience on personal computer utilization: Testing a conceptual model. Journal of Management Information Systems, 11(1), 167–187. https://doi.org/10.1080/07421222.1994.11518035
Vallerand, R. J. (1997). Toward a hierarchical model of intrinsic and extrinsic motivation. Advances in Experimental Social Psychology, 29(2), 271–360. https://doi.org/10.1016/S0065-2601(08)60019-2
Van Raaij, E., & Schepers, J. (2008). The acceptance and use of virtual learning environment in China. Computers & Education, 50(1), 838–852. https://doi.org/10.1016/j.compedu.2006.09.001
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of Information Technology: Toward a unified view. MIS Quarterly, 27(3), 425–478. https://doi.org/10.2307/30036540
Wang, H., & Wang, S. (2010). User acceptance of mobile internet based on the Unified Theory of Acceptance and Use of Technology: Investigating the determinants and gender differences. Social Behavior and Personality: An International Journal, 38(3), 415–426. https://doi.org/10.2224/SBP.2010.38.3.415
Young, M. R., Rapp, E. M., & Murphy, J. W. (2010). Action research: Enhancing classroom practice and fulfilling educational responsibilities. Journal of Instructional Pedagogies, 1–10. http://files.eric.ed.gov/fulltext/EJ1096942.pdf