Monday, January 24, 2011

LAK11: Where do we find good critiques of learning analytics?




George Siemens started the discussion in LAK11 with both a post and a forum. This post is in response to his question.

I’ve come to the preliminary conclusion that because the field is so new, especially when identified by the exact phrase “learning analytics,” there probably aren’t a lot of good critiques. However, if you consider its (ugly step?) sister, standardized testing, and google “critiques of standardized testing,” there is a plethora of critiques and opinions. Let’s try to ensure that learning analytics does NOT follow the missteps of standardized (mandated) testing.

I found one recent federal government publication (abstract) or (pdf) that attempts to report on the use of educational data at the local level. It focuses a lot on “data-driven decisions” and claims to address local uses of educational data, from accountability to instructional improvement. The closest the article comes to critiquing the analytics is by addressing the “barriers to use” of the data. These barriers are mostly related to time and resources, rather than to concerns about effectiveness or validity. The only “critiques” mentioned were those related to standardized testing.

I’m guessing (hypothesizing?), based on the current critiques of big data, that critiques of learning analytics will fall into one of two buckets: 1) privacy issues or 2) analysis/application issues.
These issues could be represented by two questions:
1. How did you get that data, and what are you going to do with it?
2. Is that a valid (and/or reliable) conclusion (from the data), and what are you going to do with it?

Danah Boyd gave a keynote address at WWW2010 entitled “Privacy and Publicity in the Context of Big Data.” She has a crib sheet of her talk on her website, and it is a must-read for those working with big data, as well as for those who need to understand the concept of big data, which is just about everyone who uses the internet.

Five of her topics fall into the analysis/application bucket:
1) Bigger Data are Not Always Better Data
2) Not All Data are Created Equal
3) What and Why are Different Questions
4) Be Careful of Your Interpretations
5) Just Because It is Accessible Doesn’t Mean Using It is Ethical

and five of her topics fall into the privacy bucket:
1) Security Through Obscurity Is a Reasonable Strategy
2) Not All Publicly Accessible Data is Meant to be Publicized
3) People Who Share PII Aren’t Rejecting Privacy
4) Aggregating and Distributing Data Out of Context is a Privacy Violation
5) Privacy is Not Access Control

Each of these issues should also be considered when evaluating a study in learning analytics.

Both of the categories (analysis/application and privacy) point to a need for more emphasis and effort in educating the public about big data, both its potential and its possible pitfalls, in a way that is balanced, unbiased, and, most importantly, understandable.

For now, I tend to agree with Ian Ayres in his presentation to Google.  He was asked:
“Do you see this number-crunching, super-crunching,  getting a backlash and possibly meddlesome government getting involved with it?” (see snippet at approximately 49:56 into the video, back up for full question).   
Ayres replied (in summary):
"I think we should go cautiously because right now the world is getting better. We should...might wait until we see some bad actions, and right now there is a possibility of bad actions, but ... no smoking guns that have come up." (see snippet starting about 53:00 into video, back up for full answer)
There is one concern about learning analytics that I have only seen even partially addressed, and that was in a book by Jim Jansen of Penn State about web analytics. (You may or may not be able to access the full pdf, depending on your affiliation with an educational institution; however, parts of it can be found on the author’s website.) The concern is that web analytics (and this could be generalized to learning analytics) has its theoretical basis in behaviorism. We track, explore, theorize, and draw conclusions from observed behaviors. Despite the distaste that some experience when the word behaviorism is mentioned (usually because of some famous-and-outrageous-by-today’s-standards experiment: think Skinner box), we still routinely apply the basic theory, consciously or unconsciously. For a little more material on this, read this from Dr. Jansen’s website. Dr. Jansen does, however, state that web analytics encompasses a broader perspective than strict behaviorism. He writes:
"For the area of Web analytics, however, we take a more open view of behaviorism. In this more accepting view, behaviorism emphasizes observed behaviors without discounting the inner aspects (i.e., attitudinal characteristics and context) that may accompany these outward behaviors.This more open outlook of behaviorism supports the position that researchers can gain much from studying expressions (i.e., behaviors) of users interacting with information systems. These expressed behaviors may reflect aspects of the person’s inner self as well as contextual aspects of the environment within which the behavior occurs. These environmental aspects may influence behaviors while also reflecting inner cognitive factors."
I think this is an area of learning analytics that is ripe for further review to make sure the mistakes of the past are avoided instead of repeated, and the successes are built upon with further insight.
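
To make that behaviorist footing concrete, here is a minimal, hypothetical sketch (not from Dr. Jansen’s book or any real analytics tool; the event names and the threshold are invented for illustration) of the kind of reasoning a learning-analytics pipeline performs: everything it “knows” about a learner is an observed, logged behavior, and the conclusion it draws comes from that behavior alone.

```python
# A minimal, hypothetical sketch: conclusions drawn purely from logged behavior.
from collections import defaultdict

# Invented clickstream records: (learner_id, action, seconds_spent)
events = [
    ("s01", "view_page", 40),
    ("s01", "post_forum", 120),
    ("s02", "view_page", 5),
    ("s02", "view_page", 8),
    ("s03", "submit_quiz", 300),
]

# Aggregate the observed behavior per learner.
time_on_task = defaultdict(int)
action_count = defaultdict(int)
for learner, action, seconds in events:
    time_on_task[learner] += seconds
    action_count[learner] += 1

# Draw a "conclusion" from behavior alone: flag learners whose total time
# on task falls below an arbitrary threshold. Nothing here captures
# motivation, context, or understanding -- only what was logged.
AT_RISK_SECONDS = 60
for learner in sorted(time_on_task):
    flag = "at risk?" if time_on_task[learner] < AT_RISK_SECONDS else "ok"
    print(learner, action_count[learner], time_on_task[learner], flag)
```

Note that nothing in the log, or in the flag the sketch produces, captures the “inner aspects” Jansen mentions (attitudes, context), which is precisely where a critique of purely behavioral analytics would begin.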
