ONTASK TOOL A USER GUIDE FOR 2019 Created by Kelly Nichols & Nicole Moxey
ETEC 565A Leah Macfadyen April 3, 2019
INTRODUCTION & RATIONALE For our CYOA, we chose to evaluate OnTask, a learning analytics tool. Our driving question was
IS ONTASK AN ACCESSIBLE AND USEFUL LEARNING ANALYTICS TOOL FOR K-12 EDUCATORS? We analyzed this tool in relation to the driving question using the Moxey & Nichols (2019) framework, which was developed for K-12 educators and administrators to support their search for and evaluation of a learning analytics (LA) tool. The Moxey & Nichols Framework (2019) is an amalgamation of previous LA evaluation frameworks by Cooper (2012) and Scheffel et al. (2014).
Learning Analytics Evaluation Framework (Moxey & Nichols, 2019)
Having concluded our initial evaluation, we turned our focus to the literature to further analyze the benefits and challenges of the OnTask tool as identified through the Moxey & Nichols (2019) framework.
BENEFITS Most educators have very little experience using LA tools; therefore, a tool needs to be easy to use for basic and advanced users alike. “It must be usable for both: the beginner, who just looks at it for the first time, as well as for the expert, who already has a specific question and wants to perform deeper analysis” (Dyckhoff et al., 2012, p. 62). Unfortunately, according to Romero et al., most of the data tools currently available are too complex and do not address the needs of the majority of educators (as cited in Dyckhoff et al., 2012, p. 59). LA tools should support educators who want to collect, integrate, and analyze data from different sources, and they should provide step-by-step guidance for each available process (Dyckhoff et al., 2012, p. 61). There has been a recent shift in focus toward systems that draw from multiple data sources and operate on relatively simple rule-based logic. These generally require less technical knowledge and allow educators to send personalized messages to students using data they already have access to (Pardo et al., 2018, p. 236). “Learning analytics and predictive analytics are helpful in determining how students are likely to perform academically and whether they are at risk of failure or dropping out” (Sclater, 2017, p. 88). However, according to Sclater et al. (2016) and Shacklock (2016), LA is beginning to move away from using data only to predict which students are at risk of failing, toward using data to improve the teaching and learning of all students (as cited in Pardo et al., 2018, p. 236).
Using learning analytics has benefits for both educators and learners. It allows the educator to combine the right data sources with their own expertise to provide different suggestions to students depending on their level of engagement (OnTask, p. 1). Research shows that students need ongoing and timely feedback, but currently, teachers have very little access to tools that help provide this type of frequent feedback (Vigentini et al., 2018, p. 2). According to Jessop, El Hakim, & Gibbs (2014), this is a critical issue, since student success has been identified as directly correlated with the quantity and quality of feedback that students receive (as cited in Lim et al., 2018, p. 3). Studies by Black, Harrison, & Lee (2003) found that the lowest-ranked aspect of graduate satisfaction surveys is the amount, frequency and type of feedback students received in their higher education courses (as cited in Vigentini et al., 2018, p. 1). Student self-regulated learning improves when the learner is provided with process feedback, rather than just outcome feedback; examples of process feedback include reading recommendations or identifying specific learning strategies (Lim et al., 2018, pp. 2-3). When students receive ongoing process feedback, they are able to change their study and/or work habits to make positive changes throughout the course (OnTask, p. 1).
The growing interest in learning analytics has provided opportunities for data to be used to improve the relevance, timeliness and effectiveness of the feedback that educators give their students (Vigentini et al., 2018, p. 1). Hattie and Timperley (2007) propose four levels of feedback that could benefit students: the task level, the learning process level (at which tasks are understood), the self-regulation level, and the level of self (including affect and personal evaluation). Learning process and self-regulation feedback are the most effective, but due to time constraints and large numbers of students, they are extremely difficult to provide (Pardo et al., 2018, p. 237). With LA tools such as OnTask, however, frequent, specific feedback can be sent to students to support their learning journeys. OnTask is a platform that enables personalized feedback that is specific and actionable, prompts a dialogue, encourages the student, and is given in a timely manner (Lim et al., 2018, p. 34). The feedback is created from student data as well as from the educator’s own insights (Pardo et al., 2018, p. 240). Educators know that student success is directly correlated with the interactions, dialogues and feedback students receive from their peers and educators; however, giving specific and timely feedback becomes difficult in courses with large enrolments or in online courses. To help overcome this, LA platforms such as OnTask enable the educator to choose conditions that create feedback that is encouraging and directly related to the requirements of the specific course (Lim et al., 2018, p. 2). Once the data is imported, the educator can create multiple conditions that can be used to encourage or guide students. Example conditions include notifying students of marks, reminding specific students to post in discussion forums or complete required readings, enrolling groups of students in a required workshop or tutorial, and directing students to specific available supports. If a student is struggling with a concept, they can be directed to review the readings that cover it (OnTask).
CHALLENGES Before discussing the challenges and limitations of OnTask, we first must discuss the challenges and limitations of LA tools in education generally. Educational systems like Moodle and Blackboard, as well as communication tools like Elluminate, help teachers structure courses and collaborate with students, but personalization of the student learning process has been limited (Ali et al., 2013, p. 130). Pardo et al. (2019) share Hattie’s (2008) assertion that “[s]tudent directed feedback has been identified as one of the most important factors influencing a student’s academic achievement” (p. 129). With increasing course enrolments and class sizes, educators are under pressure and struggling to provide meaningful and personalized feedback due to diminishing time and resources.
Moreover, despite increased access to detailed student data and behaviours, these data exhausts and trails still need to be transformed into digestible information for educators. Meanwhile, Open Learning Analytics (OLA) platforms remain at the conceptual level, and both the technology tools to deliver LA and the pedagogical understanding behind them are still in their infancy (Dyckhoff et al., 2012; Pardo et al., 2018).
By identifying the questions, one can begin to collect and curate data to help answer them. Ultimately, it is action driven by data, rather than action merely informed by data, that separates LA from educational data mining (Cooper, 2012). Lai and Schildkamp (2013, p. 10) categorize educational data into four components.
Focusing on educators with the intent to provide personalized feedback, OnTask refines these data categories further.
Once educators understand what data is being sought, they must then look to extract that data from a variety of sources. Some examples include:
Performance Data Sources: Learning Management Systems (LMS), Gradebooks (e.g. FreshGrade), Online Tests (e.g. Socrative), ePortfolios (e.g. Seesaw), Behaviour Management (e.g. ClassDojo)
Interaction Data: Blogs (e.g. Weebly), Vlogs (e.g. Flipgrid), Podcasts (e.g. VoiceThread)
Engagement Data: Participation (e.g. Nearpod), Video-watching patterns (e.g. Vimeo), Tracking & Clickstreams (e.g. Weebly)
SIS Data: School Systems (e.g. MySchool)
Through the collection of different data sets from different sources and platforms, educators can begin to bridge the gap between information and feedback. Even so, many current LA tools have struggled to create an algorithmic solution that can replicate human intelligence and support the instructor-learner practice (Pardo et al., 2018, p. 235). Yet educational technology trends persist, and the need and appetite for tools that explore data-informed, student-centred pedagogies for providing feedback has led software like OnTask to emerge (Scherer, Siddiq, & Tondeur, 2019, p. 30; Vigentini et al., 2018, p. 2).
OnTask was selected for evaluation as it remains one of the foremost open-source LA tools available. As Pardo et al. (2018) identified, many current LA tools and algorithms lack a human-centred model: they focus largely on identifying at-risk students but fail to address improving learning for all students. During our initial exploration phase, including the promotional videos for the software, OnTask highlighted its emphasis on a hybrid approach that couples decision-support systems with adaptive systems, aiming to increase the adoption of LA software in educational settings with a clear and singular focus on personalized student feedback.
To guide our assessment of OnTask and its practical adoption and implementation for educators in a K-12 setting, we used the Technological Pedagogical Content Knowledge (TPACK) Framework, Technology Acceptance Model (TAM), and Learning Analytics Acceptance Model (LAAM).
In a previous study, Graham & Smith (2012) used the TPACK framework to understand how teacher candidates think about using Information and Communications Technology (ICT) in their teaching; we borrow their definitions and examples below.
Pedagogical Knowledge (PK) Includes general knowledge of learner characteristics (e.g. understanding what motivates a group of children, what is age appropriate or what their general learning preferences are).
Content Knowledge (CK) Knowledge of the subject matter, including the representations used by practitioners within the content domain.
Technological Knowledge (TK) Knowledge of technologies, ranging from older technologies such as the pencil to digital technologies like computers. Focuses on the technology itself as something to be learned, rather than as a tool to be used.
Pedagogical Content Knowledge (PCK) Educators modify or transform representations used in the content domain to make them more comprehensible to learners; this entails a content-specific understanding of the learners (e.g. students’ common misconceptions of a particular topic).
Technological Pedagogical Knowledge (TPK) Educators understand how to use digital tools to achieve the desired learning outcomes and experiences for students.
Technological Content Knowledge (TCK) Educators identify specific technology tools and/or programs that will help students gain a solid understanding of the desired content.
Technological Pedagogical Content Knowledge (TPACK) Educators understand how to use technology to teach concepts in a way that enhances student learning experiences.
The TAM model, first developed by Davis, Bagozzi, and Warshaw (1989), posited that perceived usage beliefs determine an individual’s behavioural intention to use a specific technology or service (Ali et al., 2013). However, the TAM alone cannot predict a technology’s success, as outlined in one study which noted that the “TAM falls short of conceptualizing what it means to accept and integrate technology in classrooms [and] does not specify which types of professional knowledge about teaching and learning with technology teachers must have in order to integrate technology meaningfully" (Scherer, Siddiq, & Tondeur, 2019, p. 31). Acknowledging this shortcoming, Ali et al. (2013) studied LOCO-Analyst, an LA tool similar to OnTask, and found that “pedagogical knowledge and information design skills of individuals can influence their perception of the usefulness of learning systems [… with] prior experience as one of the context factors that could potentially impact the perceived usefulness of a system” (p. 130). Consequently, to further our exploration of the OnTask tool, we also employed the Learning Analytics Acceptance Model (LAAM).
Both the TAM and LAAM models help us understand what predicts user behaviour with new technologies for both pre- and in-service teachers of varying educational levels and ethnicities, as well as how the analytics provided in an LA tool affect educators’ adoption beliefs (Scherer, Siddiq, & Tondeur, 2019; Ali et al., 2013).
TAM, LAAM, TPACK & ONTASK Usefulness The degree to which an educator believes that using a specific online learning system will increase his/her task performance (Ali et al., 2013, p. 137).
Positives We found that the feedback gained from the dashboard and the ability to write conditional statements allow for actionable insights in a transformative way compared to current and popular methods of feedback. More importantly, this method allows for personalized feedback at scale, with the ability to differentiate and meet the needs of potentially all students, not just those identified as “at risk.”
The requirement for Pedagogical and Content Knowledge, not just Technological Knowledge, was evident. For example, after extracting the data on a student’s engagement level or midterm assessment, crafting a personalized email requires the educator to identify the needs of that particular student and write a message that encourages rather than discourages the student’s learning, regardless of their current behaviours and academic performance.
Additionally, the educator requires Content Knowledge to provide the differentiated and supplementary resources needed to further the student’s learning in specific content areas.
Negatives OnTask provides neither interactive data visualizations nor predictive technologies, features that educators often seek in LA tools to guide their interpretation of the data. Additionally, the dashboard that does provide data visualizations lacks a sufficient tutorial or breakdown of the information; interpretation is left to the user, requiring strong Technological Pedagogical Knowledge of how to use quantitative information from data visualizations.
Presently, OnTask does not have plugins available for third-party LMS or SIS systems, so once a user’s data is updated, the changes are not automatic within OnTask; the user must re-upload the entire dataset. Moreover, although the creation of new columns is possible, doing so requires the user to enter and exit multiple windows, and it is often more time-efficient to use a third-party system for data consolidation or entry.
Ease of Use The degree to which an educator expects the use of the learning system to be free of effort (Ali et al., 2013, p. 137).
Positives OnTask can provide feedback about specific information, such as the difficulties students are facing or how to better support high-achieving students. The dashboard, which creates data visualizations through charts and graphs, supports this relay of information to educators and helps them better interpret the data sets.
Negatives Technological Knowledge was identified as a key barrier. Programming and technical jargon is used repeatedly throughout the tool and in the limited tutorials available. Some examples include: strings, Booleans, conditions, and workflows. Moreover, to begin using OnTask, educators are expected to be familiar with where and how to collect and extract data, as well as with the proper formats and file extensions (e.g. .csv, .xlsx) needed for uploading.
Once datasets are uploaded and imported into workflows, the user must navigate identifying key columns and removing rows, including scale and weight rows, so as not to disrupt the data integration. Moreover, once data sets are uploaded, it is difficult or impossible to manipulate certain rows and columns of data.
Finally, once the workflow is imported, it is not possible to delete a column directly. Instead, the educator must remove the desired column and then re-import the workflow.
CONCLUSION It should be noted that, regardless of the limitations and shortcomings of OnTask, some research has found that educators construct their own internalized assumptions about a tool in connection with their pedagogical roles. In other words, an online teacher may be more optimistic about the affordances of OnTask than its limitations if they place a higher value on its potential benefits in their course. Conversely, the perceptions formed when an individual first engages with the features of an LA tool are strong and consistent with their later evaluations. Therefore, it is imperative that when educators use an LA tool for the first time, they have a positive experience, or else they risk developing a negative bias. It is our hope that this guide will help educators have a positive experience when they begin using OnTask for the first time. We have attempted to problem-solve and overcome some of the obstacles and frustrations that we faced when we started using this program.
Facilitator’s Guide Step-by-Step Tutorials for Educators
INTRODUCTION TO ONTASK How to Log In Once you have created an OnTask account, sign in by entering your e-mail and password.
Extracting your data Decide what type of data you are looking for and visit the third-party provider (e.g. Nearpod, Socrative, MySchool).
Once logged into the third-party platform, look for an area labelled Reports; here you will be prompted to download reports, usually in either portable document format (.pdf) or comma-separated values (.csv). When possible, choose the .csv file format and download it to your desktop or a specified folder.
Working with Data Tables Each dataset must have a key column. A key column contains identifying data; for educators, this would likely be a student number, as the data must be unique to each individual.
Next, you have the header row and header columns, which hold the descriptions of the data. The remaining rows should contain no data that does not pertain to an individual student, because extra rows will skew the data. For example, scale and weight rows would need to be removed during the data upload process.
Once data is uploaded, columns can still be manipulated but rows cannot. Consequently, it is recommended that data be manipulated in third-party systems or in Excel before being uploaded into OnTask.
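As a sketch of that pre-upload clean-up, here is one way to do it with Python’s pandas library; the file name, column names and the number of rows to skip are hypothetical stand-ins for whatever your platform exports:

```python
import pandas as pd

# Load a hypothetical gradebook export; skip non-student rows
# (e.g. scale and weight rows) that sit above the real data.
df = pd.read_csv("gradebook_export.csv", skiprows=2)

# Keep only the columns you need, with the key column included.
df = df[["Student Number", "Quiz 1", "Quiz 2", "Physical Literacy"]]

# Drop any leftover rows that are not individual students.
df = df.dropna(subset=["Student Number"])

# Save the cleaned file, ready for upload into OnTask.
df.to_csv("cleaned_gradebook.csv", index=False)
```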
Creating Workflows A workflow is like a course. To create a workflow, select the new workflow button and provide a name and description for the course you are creating.
Importing your data Once you have created a workflow, you will then be prompted to select the type of data you would like to upload.
CSV, Excel and Google Sheets are the common formats in which most third-party data will arrive.
It is unlikely that an educator will have Structured Query Language (SQL) data, as that comes directly from a server itself; therefore, you are not likely to use this option.
Next, find the file you downloaded to your desktop and upload it. During this process, you will need to be aware of the structure of the data table you are uploading and whether there are rows at the top or bottom that you need to skip. Aside from the column descriptions, this should be all rows that do not contain individual student data.
Make sure to have at least one key column selected.
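Because the key column must uniquely identify each student, it is worth checking your file before upload; a minimal sketch, assuming the hypothetical "Student Number" column from the example above:

```python
import pandas as pd

# Verify the intended key column has no blanks and no duplicates
# before selecting it as the key during upload.
df = pd.read_csv("cleaned_gradebook.csv")
key = "Student Number"
assert df[key].notna().all(), "key column has blank entries"
assert df[key].is_unique, "key column has duplicate entries"
```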
Manipulating Columns You can sort columns in ascending or descending order, or click the top column box and move columns to different places in the table. You cannot manipulate rows in the same way.
Adding a Column Currently, OnTask does not connect with third-party systems directly. For example, you cannot upload a data table from a grade book and have new data appear in OnTask when the grade book is updated in its third-party system. Therefore, you have two options: upload another data table and merge it with the previous data table within the workflow, or add a column and manually enter the data in each row.
Data Types There are four different types of data that can be entered (illustrated in the sketch after this list):
Numbers - scores or percentiles (e.g. 5/10, 50%)
Strings - letters or characters (e.g. A, B, C, D)
Booleans - two options only (e.g. pass/fail, true/false, yes/no)
Date/Time - a timestamp (e.g. 03/30/2019)
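To make these concrete, here is a sketch of the four types as they might appear in a small class list; the column names and values are hypothetical, shown in pandas:

```python
import pandas as pd

# A hypothetical class list showing OnTask's four data types.
df = pd.DataFrame({
    "Student Number": [1001, 1002, 1003],   # key column
    "Quiz Score": [0.5, 0.8, 0.9],          # number
    "Letter Grade": ["C", "B", "A"],        # string
    "Passed": [True, True, True],           # Boolean
    "Submitted At": pd.to_datetime(
        ["2019-03-30", "2019-03-29", "2019-03-28"]),  # date/time
})
print(df.dtypes)  # confirms the type of each column
```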
Add Derived Columns This option allows you to create a new column by combining existing columns. Additionally, you can select from a variety of operations when combining the columns.
Add Random Columns With this option, you can assign random values within a column. An example of when you might use this option is when putting students into random groups.
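The following pandas sketch reproduces what derived and random columns compute, with hypothetical column names; OnTask builds these through its own menus rather than code:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "Student Number": [1001, 1002, 1003, 1004],
    "Quiz 1": [6, 8, 5, 9],
    "Quiz 2": [7, 9, 4, 10],
})

# Derived column: apply an operation (here, the mean) across columns.
df["Quiz Average"] = df[["Quiz 1", "Quiz 2"]].mean(axis=1)

# Random column: assign each student to one of two random groups.
rng = np.random.default_rng(seed=42)
df["Group"] = rng.choice(["A", "B"], size=len(df))
print(df)
```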
Merging Data Sets When merging data sets, you will follow the same first two steps as when uploading data into a workflow. This time, in step three, you will be prompted with some different options.
Here you will need to ensure that the same key column exists in both the existing and new data tables.
You will then have four options to select how you would like your data rows to merge; a sketch of the equivalent operations follows the list below.
Option 1 - Selecting all rows in both the existing and new tables This option is likely to be the most commonly used. For example, an educator may upload assignments after the first term and then upload another data set after the second term. Selecting the first option allows the new data to be added to the existing data, further extending the data collection. Additionally, this option also accommodates students who have left the course.
Option 2 - Selecting only rows with keys present in both the existing and new tables This option should be exercised carefully, as it may result in data loss. Should any row be missing from either the existing or the new table, that entire row will be deleted.
Option 3 - Selecting only rows with key values in the existing table This may be a good option for students who have handed work in late. When uploading the new data, previous data from students who have completed their assignments will remain, while the new data will be uploaded for latecomers. This could also be an option for new students in a course.
Option 4 - Selecting only rows with key values in the new table This method will drop rows from the existing table if they are not in the new table. Again, this may be an option for new students in a course or for assignments that are no longer needed.
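For readers who know spreadsheets or databases, the four options correspond to the standard join types. A sketch in pandas with hypothetical tables; OnTask performs the equivalent merge internally when you upload a second data set:

```python
import pandas as pd

existing = pd.DataFrame({"Student Number": [1001, 1002, 1003],
                         "Term 1": [72, 85, 64]})
new = pd.DataFrame({"Student Number": [1002, 1003, 1004],
                    "Term 2": [88, 70, 91]})

key = "Student Number"
option1 = existing.merge(new, on=key, how="outer")  # all rows, both tables
option2 = existing.merge(new, on=key, how="inner")  # keys present in both
option3 = existing.merge(new, on=key, how="left")   # keys in existing table
option4 = existing.merge(new, on=key, how="right")  # keys in new table
print(option1)
```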
Dashboard - The dashboard is a tool that provides data visualization. By turning large data sets into digestible images, it helps educators more easily identify answers to the questions they are asking, or gain new insights. This information can then be used to create personalized actions and emails to students, which will be explained in the next tutorial video.
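As an illustration of the kind of digestible image the dashboard produces (this sketch is not OnTask itself), here is a hypothetical mark distribution plotted with pandas and matplotlib:

```python
import matplotlib.pyplot as plt
import pandas as pd

# Count how many students fall into each mark band and plot it.
df = pd.DataFrame({"Physical Literacy":
                   ["Proficient", "Developing", "Developing",
                    "Limited", "Proficient", "Extending"]})
df["Physical Literacy"].value_counts().plot(kind="bar",
                                            title="Marks at a glance")
plt.tight_layout()
plt.show()
```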
Video Tutorial For more details and a step-by-step walkthrough, please visit: https://vimeo.com/327412506
Creating an Action Once you have set up a workflow with the data that you wish to use, you are ready to create an action.
Creating an Action Select “Actions” from your menu bar (top of page) and then select “+ Action.”
Enter a Name and Description, then select “Personalized text” from the Action Type drop-down menu. Click “Create action” when you are done.
Once this is complete, a new window will open up that has three tabs at the top: Personalized Text, Filter Learners and Text Conditions. The “Personalized Text” box is where you compose the message that gets sent out to the students.
To help guide you, there is an example of a message below.
Dear {{ Unnamed: 0 }},
We are nearing the end of our basketball unit and your mark is {{ Physical Literacy }}.
{% if Proficient/Extending %}Outstanding work! Your mark reflects comprehension and engagement with all course materials.{% endif %}
{% if Developing %}Good work! You have demonstrated a solid understanding and a target for you is to engage more with the online course materials.{% endif %}
{% if Limited/Emerging %}Uh-oh! This unit did not reflect your best effort. Please review the online course materials and come see me for extra support.{% endif %}
Please let me know if you have any questions before the final test.
Regards,
{{Coordinator_name}} {{Course Name}}
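The template above uses the same {{ variable }} and {% if %} syntax as common templating engines. As a minimal sketch of what happens per student, here is an equivalent rendering in Python with the Jinja2 library; the student data and names are hypothetical, and OnTask performs this rendering for you when it sends the emails:

```python
from jinja2 import Template

# Render a personalized message for one student.
template = Template(
    "Dear {{ name }},\n"
    "Your mark is {{ mark }}. "
    "{% if developing %}Good work! A target for you is to engage "
    "more with the online course materials.{% endif %}"
)

student = {"name": "Alex", "mark": "Developing", "developing": True}
print(template.render(**student))
```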
You are now ready to build a personalized email.
Step 1 - Address the email to individual students by selecting the “Insert Column Value” drop-down box and choosing the key column that identifies the name of the student. For this example, it is “Unnamed” because all student-identifying key columns have been removed for confidentiality.
*Note* The choices available in the “Insert Column Value” drop down menu are dependent on what is included in your workflow (data). If you want the name of the student included in the message, make sure to include a column with student names.
Step 2 - Create your Text Conditions (if-then statements) by selecting the “Text Conditions” tab and then clicking “+ Condition.”
Fill in the "Name," "Description" and "The text will be shown if" sections and click “Create condition.”
For our example, we created three conditions: proficient/extending, developing and limited/emerging. See below for examples of each condition.
1. Proficient/Extending
2. Developing
3. Limited/Emerging
Once you have completed the desired “Text Conditions,” you will see how many students will receive each message, as well as the formula used.
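These per-condition counts are simply the number of rows satisfying each condition. If you want to sanity-check them against your source data, a quick sketch, assuming the hypothetical cleaned file and marks column from earlier:

```python
import pandas as pd

# Tally how many rows satisfy each condition in the source data.
df = pd.read_csv("cleaned_gradebook.csv")
print((df["Physical Literacy"] == "Proficient/Extending").sum())
print((df["Physical Literacy"] == "Developing").sum())
print((df["Physical Literacy"] == "Limited/Emerging").sum())
```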
Step 3 - Add your conditions to your “Personalized Text” box by selecting the condition from the “Use condition in highlighted text” drop down menu.
Step 4 - For each of the “Conditions” included in the “Personalized Text” box, write the message you want the students to receive where it says, “YOUR TEXT HERE.”
Add any other information that you want to include in the message.
If you would like to include the students’ mark for a particular assignment, type the comment you want to add.
Then from the “Insert Column Value” drop down box, choose the assignment name. For this example, we are including “Physical Literacy.”
The student mark will be added to the message that is sent out.
Once you have completed your message, click the “Preview” button to see the email that each student will receive.
Video Tutorial For more details and a step-by-step walkthrough of how to add an Action, please view our YouTube video: https://www.youtube.com/watch?v=hWldRJy9xFI&feature=youtu.be
References Archambault, L. M., & Barnett, J. H. (2010). Revisiting technological pedagogical content knowledge: Exploring the TPACK framework. Computers & Education. doi:10.1016/j.compedu.2010.07.009
Ali, L., Asadi, M., Gašević, D., Jovanović, J., & Hatala, M. (2013). Factors influencing beliefs for adoption of a learning analytics tool: An empirical study. Computers & Education, 62, 130-148. doi:10.1016/j.compedu.2012.10.023
Cooper, A. (2012). A framework of characteristics for analytics. CETIS Analytics Series, 1(7). Bolton: JISC CETIS.
Dyckhoff, A., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and implementation of a learning analytics toolkit for teachers. Journal of Educational Technology & Society, 15(3), 58-76. Retrieved from http://www.jstor.org.ezproxy.library.ubc.ca/stable/jeductechsoci.15.3.58
Lai, M., & Schildkamp, K. (2013). Data-based decision making: An overview. In K. Schildkamp, M. Lai, & L. Earl (Eds.), Data-based decision making in education (Studies in Educational Leadership, Vol. 17). Dordrecht: Springer.
Lim, L., Gentili, S., Pardo, A., Dawson, S., & Gašević, D. (2018). Combining technology and human intelligence to provide feedback and learning support using OnTask. In Companion Proceedings of the 8th International Conference on Learning Analytics & Knowledge (LAK’18), March 5-9, 2018, Sydney, NSW, Australia.
OnTask. (n.d.). Retrieved March 7, 2019, from https://www.ontasklearning.org/
Pardo, A., Bartimote-Aufflick, K., Buckingham Shum, S., & Dawson, S. (2018). OnTask: Delivering data-informed, personalized learning support actions. Journal of Learning Analytics, 5(3), 235-249. doi:10.18608/jla.2018.53.15
Pardo, A., Jovanović, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Using learning analytics to scale the provision of personalised feedback. British Journal of Educational Technology, 50(1), 128-138.
Scheffel, M., Drachsler, H., Stoyanov, S. & Specht, M. (2014). Quality Indicators for Learning Analytics. Journal of Educational Technology & Society, 17(4), 117-132.
Scherer, R., Siddiq, F., & Tondeur, J. (2019). The technology acceptance model (TAM): A meta-analytic structural equation modeling approach to explaining teachers’ adoption of digital technology in education. Computers & Education, 128, 13-35. doi:10.1016/j.compedu.2018.09.009
Sclater, N. (2017). Metrics and predictive modelling. In Learning analytics explained (pp. 88-98). New York, NY: Taylor & Francis.
Vigentini, L., Liu, D. Y. T., Lim, L., & Maldonado, R. (2018). Personalising feedback at scale: approaches and practicalities. In Companion Proceedings of the 8th International Conference on Learning Analytics & Knowledge (LAK’18), March 5-9, 2018, Sydney, NSW, Australia.