LAK12: Baker’s Educational Data Mining

Thoughts and information captured from Ryan Baker’s presentation, part of the LAK12 videocast:

  • Educational Data Mining: improving research and learning models (Journal of Educational Psychology). It predicts the future. Change the future [tough for me to digest; maybe it should say: change future failure into success rather than change the future].
  • Resources: Journal of EDM, Intl EDM Society
  • EDM & LAK:
  • Similarities: understanding learning through the study of large data sets; improving education and research; driving planning, decision making and manual/automated intervention.
  • Differences:
  • LAK = human judgement is central, with automated discovery as a tool; EDM = automated discovery is central, with human judgement as a tool [still confusing to me]
  • LAK=understand the system, EDM=focus on components and the relationship between them
  • LAK=inform instructors and learners, EDM=automated adaptation
  • LAK=focus on needs of multiple stakeholders; EDM=focus on model generalization
  • EDM Methods: Prediction (classification, regression, density estimation), Clustering [I oppose this], Relationship mining [I relate to this], Distillation of data for human judgement, Discovery with models [I need to know more about these to align them with my research and interests].
  • Knowledge Engineering [?]
  • Vision: predict student success by analysing data generated by the students. Data obtained from: course selection data, cognitive tutor log data, grade data, AST data, Khan Academy logs, state standardized exams, SAT Career interest, Strong Interest Inventory, MSLQ survey. [Missing soft data like intelligence indicators, interest indicators, strengths indicators, stimuli profiling, communication profiling, etc… will they be handled by LAK?]
  • https://pslcdatashop.web.cmu.edu/
  • [I did not sense a drive to help students discover their learning strengths, although I have read a lot about changing students’ attitudes!]
  • Learning indicators: correctness or incorrectness.

References

Baker, R. (2011). LAK12 videocast presentation [Elluminate recording]. https://sas.elluminate.com/p.jnlp?psid=2012-01-31.1003.M.0728C08DFE8BF0EB7323E19A1BC114.vcr&sid=2008104

Bibliography

  • http://www.educationaldatamining.org/
  • http://lak12.sites.olt.ubc.ca/
  • https://pslcdatashop.web.cmu.edu/

Interesting Slides

Slide 29: Sample of correlation between behaviour and EDM indicators

Slide 30: Sample of correlation between action and gaming

 

LAK12: Siemens’s Educause Presentation


Notes, Reflections and Thoughts

  • Academic Analytics: target organizational efficiency, strategy and decision making. (Campbell, Dianne ?)
  • Educational Data Mining: reducing learning to components and analyzing the relationships between them.
  • Learning Analytics: Systems and wholes that include social components and cognitive elements.
  • Bottom-up: data collected through traditional learning activities.
  • Top-down: system-wide data collection.
  • Because of the sheer volume of data, cognitive processes need tools to convert it into usable information.
  • Confidence is directly related to (academic) success.
  • Quantified Self: tracking one’s own abilities and analyzing the resulting data.
  • Precise and accurate information leads to better performance across all types of organizations.
  • In learning, much data and many methods already exist: EduCause, student success research, Duval, Haythornthwaite, De Liddo & Buckingham Shum (automated vs manual, 70% accuracy), social learning analytics, Clow & Makriyannis (icebox?)
  • Privacy and ethics issues, especially when data is related to non-learning layers, will cause unease; we need manifestos to guide the privacy and ethics layer and minimize negative reactions. There is no research yet on privacy, ethics and analytics.
  • Gold mine: organizations [e.g. Pearson, Stanford] offer open, free learning courses because these give them priceless amounts of free learning data.
  • Learning organizations collect massive data. We need to (1) figure out a way to find it and (2) relate the data together.
  • Data needs to include: data from outside the LMS, from the library, and from classroom interaction. This requires three layers to communicate: systems/enterprise level + researchers + educators.
  • 24% of learning organizations utilize deep analytics (Kron, 2011).
  • The process of handling data: Acquisition, Storage, Cleaning, Integration, Analysis, Representation [Myopic view]
  • Procedures for a system: strategy, planning & resource allocation, metrics & tools, capacity development, systemic change (Siemens, @ 45:00 min)
  • Capacity development will require restructuring and rebuilding capacity.
  • Resources found at: www.solaresearch.org
  • Cloud-based analytics research: algorithms should be open; students should see what the school sees.
  • Conference in Vancouver: lak12.sites.olt.ubc.ca
  • Starting point for newcomers to analytics: initiate the social practice. Other suggestions: use low-threshold tools (statistics, SPSS, SNAP, …), begin conversations in the institution to identify chunks of data, have senior administration track the procedural matrix, develop a team and its capacity (tab on Educause).
  • Initial questions: (1) teacher level: why do students do what they do, and what network structures contribute to student learning; (2) admin level: what is the impact of resources on learning success -> provides an edge.
  • Collective collaboration: relate and connect to others, to organizations, and at the state level; develop new tools.
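The data-handling stages Siemens lists (acquisition, storage, cleaning, integration, analysis, representation) can be sketched as a tiny pipeline. Everything here is hypothetical: the LMS and library records are invented, and each function is only a placeholder for what a real institutional system would do at that stage.

```python
# A minimal, hypothetical sketch of Siemens's data-handling stages,
# chained over invented LMS and library records.
from statistics import mean

def acquire():
    # Acquisition: pull raw records from two hypothetical sources
    # (in practice: LMS exports, library systems, classroom tools).
    lms = [{"student": "s1", "logins": 12}, {"student": "s2", "logins": None}]
    library = [{"student": "s1", "loans": 3}, {"student": "s2", "loans": 1}]
    return lms, library

def clean(records, field):
    # Cleaning: drop records with missing values for the given field.
    return [r for r in records if r[field] is not None]

def integrate(lms, library):
    # Integration: join the two sources on the student identifier.
    loans = {r["student"]: r["loans"] for r in library}
    return [{**r, "loans": loans.get(r["student"], 0)} for r in lms]

def analyse(records):
    # Analysis: reduce each metric to one summary number.
    return {"avg_logins": mean(r["logins"] for r in records),
            "avg_loans": mean(r["loans"] for r in records)}

lms, library = acquire()          # Acquisition (storage is implicit here)
lms = clean(lms, "logins")        # Cleaning
merged = integrate(lms, library)  # Integration
summary = analyse(merged)         # Analysis
print(summary)                    # Representation (here, just printing)
```

The point of the sketch is the ordering: integration only makes sense after cleaning, and representation is the last, most visible step, which is perhaps why the bracketed "[Myopic view]" note above warns against treating it as the whole process.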

References

Siemens, G. (2012). Educause presentation [Adobe Connect recording]. https://educause.adobeconnect.com/_a729300474/p4xmnq9p9rz/?launcher=false&fcsContent=true&pbMode=normal

 

LAK12: 3 ingredients that made up a new LAK side dish

Learning analytics has been slow-burning in my mind since I enrolled in the lak12.mooc.ca course. I usually dedicate Monday afternoon to LAK-focused reading and reflection in preparation for Tuesday’s activities. But the idea sits in a slow-cooking pot throughout the week, where I add an ingredient based on an incident here, an anecdote there, or some information I gather. Then on Monday, I taste the pot to see if I can make something tasteful out of the mix. Today, I feel I can uncover a good side dish: analyzing the controlling instincts. Here is where it came from.

The first ingredient: in an argument with my wife, I discovered that we sometimes say one thing while meaning something totally different, and we usually do not recognize that. For example, to me, preparing the table to eat means having plates and cutlery distributed on the table. To my wife, it means the feel and look of an elegant table, which should include a red cover, lit candles and romantic music. She never said it in those words, and she adamantly rejected this notion but admitted this is what was desired (yes… go figure). So the same thing means different things to ourselves as well as to others. How can we develop an analytic system that understands our behaviour and habits if we ourselves, many times, do not understand them?

The second ingredient: we had a couple of colleagues over for dinner and were chatting about the validity of profiling tests. The discussion turned to the MBTI. One conclusion that came out of it was that the wording of the MBTI questions measures “what you want to be” and not “who you are”. So if the most famous profiling instrument cannot measure who we are, how can learning analytics measure our learning by analyzing data that resides on the internet? Mind-boggling indeed.

The third ingredient: Buckingham’s newest book had been sitting on my desk since my son bought it for me for Christmas. I decided to read it. To my amazement, around page 23, he indicated that our natural reactions are not random responses that depend on outside factors, but are based on recurring patterns deep-rooted in our personality. Those recurring patterns are our strengths! [My first aha: can we define a term called “our learning strength”, determined by a set of recurring learning habits or reactions? But how? Add this ingredient to the pot.]

Buckingham’s answer to the same question did not convince me that we can apply it effectively as learning analytics, but his gesture has many potentially good applications. He said that he applied a stimulus/reaction approach. [My second aha: maybe learning analytics should include processes to identify responses to certain stimuli that the learner consistently exhibits while learning or surfing the net; consequently, one could determine the recurring patterns that formulate the learner’s learning strengths.] I am not sure yet that I want to add this ingredient to the pot. It needs more research.

So, recognizing that sometimes we do not know ourselves, recognizing that existing profiling tests cannot accurately measure who we are, and recognizing that we need to look for recurring learning patterns: these are 3 ingredients that make a light side dish that still needs more ingredients to be tasteful. Let’s see what week 2 brings.

References

Buckingham, M. (2011). Standout: The groundbreaking new strengths assessment from the leader of the strengths finder. Nashville, Tennessee: Thomas Nelson.