A good face-to-face teacher may know when a student is struggling; the student is readily available, so the teacher can assess which specific steps or skills are lacking. That kind of assessment or tutoring becomes challenging in a distance learning environment. Aside from synchronous conversations, test results, writing samples, and other means, how do we know the level of learning success or struggle of distance students?
I’ve reviewed articles regarding data or learning analytics. Companies like Amazon, Google, and Netflix employ ‘big data’ techniques to search, analyze, and interpret massive amounts of readily available information, and they use that analysis to provide recommendations, search results, or information you didn’t have before. Their algorithms support trend analysis, keyword search, behavioral statistics, and recommendations.
Recently, I noticed an announcement for a new program at UMUC called the M.S. in Data Analytics. The degree sounded intriguing, so I reviewed some details. Then I remembered a recent report I had reviewed that discussed predicted educational technology trends categorized by timeframe. The report provided links to a few other blogs and articles, such as e-Literate. One question raised was: how does a teacher know if a student is struggling with a concept in the classroom? Researchers suggest integrating ‘big data’ approaches to analyze student information and content. I may be oversimplifying, but big data algorithms applied to educational or learning environments may be called learning analytics.
Current technology feedback applications support ‘click streams’, which help the instructor pace the lecture based on immediate student feedback. However, these streams provide superficial feedback, not the deeper cognitive information about learning and behavior that the instructor could use to modify class variables such as content and multimedia tools. Click streams provide information about retention, but what we need is explanatory information.
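To make the limitation concrete, here is a minimal sketch of what a click stream actually gives us. The event names and structure are hypothetical, not any real platform's API; the point is that aggregating clicks yields retention-style counts and nothing more.

```python
from collections import Counter

# Hypothetical raw click-stream events: (student_id, page_id) pairs.
clicks = [
    ("s01", "lesson-3"), ("s01", "lesson-3"), ("s02", "lesson-3"),
    ("s01", "quiz-3"), ("s02", "lesson-4"),
]

# Superficial aggregation: view counts per page.
views_per_page = Counter(page for _, page in clicks)
print(views_per_page)
```

We learn that lesson-3 was viewed three times, but nothing here explains *why* a student lingered on a page or where the underlying concept broke down.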
Instructors require explanatory information, which could be provided by algorithmic analysis of content and class data. Experts call this semantic data.
Semantic is defined as “of, pertaining to, or arising from the different meanings of words or other symbols” (http://dictionary.reference.com/browse/semantic?s=t). Some subject matter experts may refer to it as data about data, or metadata. “What can we do when we are empowered with semantic data and analysis?” (Feldstein, 2013). Semantic data requires “targeted hints that students can ask for, targeted feedback, and well-designed questions” (Feldstein, 2013); without these, there is no semantic data. Semantic data could arm instructors, and the entire distance learning design team, with a learning curve analysis and recommendations as to where or how content should be modified.
The distance learning team could become a team of ‘learning engineers’. The learning design needs to provide this information by allowing “the system to semantically record each and every selection of students interacting with the material as described” (Feldstein, 2013). Learning engineers should design the frameworks and models that capture the ‘intent’ of the learning environment and content. Learning analytics may not replace instructor skills and intuition, but could greatly improve the teaching and cognitive presence of the instructor and design team.
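As a sketch of what such semantic recording might look like, the snippet below tags each student interaction with the skill being exercised and the intent behind the action (a hint requested, an incorrect answer), in the spirit of Feldstein's targeted hints and well-designed questions. All field names and event data are hypothetical illustrations, not a real system's schema.

```python
from collections import defaultdict

# Hypothetical semantically tagged events: each interaction records
# the skill involved and the intent behind the action.
events = [
    {"student": "s01", "skill": "fractions", "action": "hint_requested"},
    {"student": "s01", "skill": "fractions", "action": "answer_incorrect"},
    {"student": "s02", "skill": "fractions", "action": "answer_correct"},
    {"student": "s01", "skill": "decimals",  "action": "answer_correct"},
]

# Actions that signal a student may be struggling.
STRUGGLE_ACTIONS = {"hint_requested", "answer_incorrect"}

def struggle_report(events):
    """Count struggle signals per skill -- explanatory, not just retention."""
    report = defaultdict(int)
    for e in events:
        if e["action"] in STRUGGLE_ACTIONS:
            report[e["skill"]] += 1
    return dict(report)

print(struggle_report(events))  # {'fractions': 2}
```

Unlike the raw click counts, this kind of report points the instructor or design team at the specific concept (here, fractions) where content may need modification.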
From a UMUC perspective, perhaps there could be a future relationship between MDE and Data Analytics programs. Just a thought for further discussion.
Feldstein, M. (2013). A taxonomy of adaptive analytics strategies. e-Literate. Retrieved from mfeldstein.com/a-taxonomy-of-adaptive-analytic-strategies/
Johnson, L., Adams, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H. (2013). NMC Horizon Report: 2013 K-12 edition. Austin, Texas: The New Media Consortium.