Sunday, 23 November 2014

Procedural Learning

You can learn anywhere,
but some places are better than others
We had one of our few professional development days last week (this one on metacognition) and I had a moment of insight in spite of the circumstances.

For the better part of three hours we sat on too-small benches designed for children in a large, drafty, echo-y cafeteria, listening to booming, static-y microphones and online videos.  It was a near-perfect storm of poor environmental factors for learning, at least for me.  I'm not a good auditory learner at the best of times; when barriers to listening are in place I quickly fall off the engagement wagon, though I try to hang on.

Why was our professional development done here?  Because we could fit two schools worth of teachers into that space.  When teachers don't consider basic pedagogical factors in teaching each other, it makes me wonder what happens in their classrooms (also designed to fit as many bodies as possible).

What would a learning space designed for learning (rather than body count) look like?  Tech could mitigate the need for massive spaces to warehouse lots of bodies.  We've built this complex and expensive communications infrastructure between schools, but we still expect teachers to burn fossil fuels and gather physically for material that could have been delivered more efficiently and effectively through interactive video and shared notes.  If the advanced life-long learners aren't going to test these possibilities, who will?

It was in this environment, rather ironically, that Jenny Donohoo, one of the presenters, clarified procedural learning for me.  She did it in the context of metacognition, but it allowed me to more accurately understand why I fell out of subjects in high school that I otherwise had a great deal of interest in.

I'd initially entered physics wanting to get into astronomy, but instead of science being a tool with which to explore the universe, I discovered that it (at least in high school in the 1980s) was a procedural course designed to chase out anyone who didn't enjoy repetition for its own sake.  I greatly enjoyed computers too, but the computer science teacher approached the subject with the same procedural bent, as did most of my math teachers.  I'd like to think that things have changed since I took those classes, but the volume of photocopies still pouring out of those departments suggests otherwise.

I'd often find myself in a math or science class doing procedural work with no idea why.  I'm not averse to procedural work; in fact, I have a great deal of respect for it.  You don't spend hundreds of hours power skating with a psychotic Russian figure skating instructor in full goalie equipment if you don't appreciate what drilling can do for you, but I never suffered through that for the sake of suffering.  I did it to become a better hockey goalie.


You don't have to look far for inspirational sports quotes.
Many encourage practice, but the goal is never practice itself.
When students are asked to do procedural work (i.e., getting drilled in skills so they become second nature), the reason they are being asked to do this difficult, repetitive thing had better be crystal clear, or you're going to run into engagement problems.  I'll suffer through power skating, or exhausting 6 a.m. practices in a frozen arena, if I know it'll give me a better chance at peak performance in my next game.  I'll get up early and ride a motorbike until my legs are jello if I know it will lead me to a moment of bliss on two wheels.  I won't do these difficult things without a reason.  No one has ever described dedication as doing something for no clear reason (that would be futility).

When I look back on my experiences in mathematics, science and computer science I see teachers who want to drill students without telling them why.  They want stringent discipline without a goal.  Unless you're some sort of masochist or really enjoy being told what to do, procedural learning for the sake of it is likely to cause a great deal of friction with your learners; it chased me right out of those subjects.

Another thing Donohoo said in that PD was, "the most useful thing you can do for your students is find ways to communicate what is going on in your mind when you are practising your discipline."  Maybe some teachers simply enjoy solving problems and couldn't care less that there isn't a greater goal in mind, but that alienates a lot of students.  If your expertise allows you to do something useful, articulating that to your students is a valuable way to engage them in your discipline.


I've tackled this from an individual teacher's perspective, but procedural learning leaks into the classroom in other ways.  The most obvious example is the data-gathering process of standardized testing.  You can take any complex skill, like literacy or numeracy, and by applying standardized testing to it reduce learning to procedure.  Doing this can often result in better standardized test scores!  No one loves procedure more than statistics gatherers.  I'm speculating, but I bet there is a high correlation between teachers with encyclopedic, complex mark books and procedural approaches to learning.

They are usually the ones wringing their hands over engagement and classroom management.

The idea that education is something we do to students fits well with this procedural approach.  Bells ring, ten year old photocopies are handed out, teachers repeat what they've said before word for word, and we continue the production line.  Sometimes I'm amazed that anyone learns anything in a school.

Wednesday, 12 November 2014

Money Clouds

You hear a lot about the magic of the cloud these days.  It's linked to online integration, website optimization and the evolution of computers.  Integration and optimization involve encouraging users to put information online and making that data easy for aggregators to access.  The modern, monetized internet is built around turning data into a commodity.  The 2014 web is designed to encourage you to put as much of your life online as possible, because that data has value.


Gotta watch out for those people who drink the Kool-Aid...
The idea that computers evolved from mainframes to desktops to laptops to smartphones appears self-evident, but I'm not so sure.  I'm starting to think the devices pushed us online and the evolution story was set up afterwards as a marketing angle.  Our devices might not be a response to market needs, but a push by the data bankers to get more people producing.

When you boot up a computer you've created a self-contained virtual environment that is designed for, and subservient to, your needs.  Within that machine you have security, privacy and administrative power over your data.  It's hard to argue that this is anything other than an empowering position for a user.

When you connect to the internet you surrender administrative control.  Your virtual environment is no longer yours; your data is no longer internal and local; it's no longer your data.  Privacy is an antiquated idea you have to let go of, and security is entirely at the discretion of hackers who are increasingly supported by big business and government.  When you go online you have lost that private computing experience and thrown it wide open to many interested parties.


When you send in three one-year-old broken Chromebooks you get one back;
the rest aren't cost effective to repair.  If driving people online to
collect data is the goal, then the Chromebook is a master stroke:
disposable hardware that funnels you into a single browser - a branded internet.
Why have we stampeded to the cloud?  Did our devices change to serve our needs, or have our devices been designed to drive us online?  Apple famously rolled out the iPad.  At the same time they built up iTunes, which not only dominates media sales but has now come to dominate app sales as well.  Selling an iPad is nice; constantly selling media is an exciting, never-ending source of income.

Data as an income stream is at the root of our online migration.  Microsoft made billions selling an operating system, but the data produced inside it was very much the domain of the user.  Software we purchased for that environment also had to be subservient to the user.  This is a lousy approach if you want to monetize data and enjoy the benefits of a continuous income stream.

Blizzard realized this with the move to online gaming.  World of Warcraft was one of the first games to successfully follow the data = continual income model, charging monthly fees instead of a one-time point-of-sale price for the game.  The end result is a gamer spending hundreds of dollars on a game instead of a single $50 outlay.  If you don't think it worked, check out how WoW compares to the other top grossing games of all time.
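The arithmetic behind that shift is worth making explicit.  Here's a minimal back-of-the-envelope sketch; the $15 monthly fee is an assumption (subscription prices varied by region and plan), and only the $50 one-time price comes from the comparison above:

```python
# Back-of-the-envelope comparison of one-time vs. subscription revenue.
# MONTHLY_FEE is an assumed figure; ONE_TIME_PRICE is the classic
# boxed-game price used for comparison.
ONE_TIME_PRICE = 50   # single point-of-sale purchase
MONTHLY_FEE = 15      # assumed recurring subscription fee

def subscription_revenue(months: int) -> int:
    """Total revenue from one subscriber over the given number of months."""
    return MONTHLY_FEE * months

# One player subscribed for two years pays several times the old one-time price.
two_years = subscription_revenue(24)
print(two_years)                    # 360
print(two_years // ONE_TIME_PRICE)  # 7 boxed-game prices, and still counting
```

Even with conservative numbers, a two-year subscriber is worth several boxed sales, which is why the recurring model displaced the one-time sale.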

Google famously claims that it wants to organize the world's information and make it available and useful.  That mission is always dressed up in altruistic language, but Google is a profit-driven business that goes to great lengths to avoid paying taxes.  Google is a data mining company; it always has been.  The happy result of this data mining is a remarkably accurate search engine that also happens to feed the data mining operation.

Once the search engine was established, Google went after traditional desktop applications.  Lite versions of word processing, spreadsheet software and other traditional desktop apps drew users in with the suggestion that your software and data could be wherever your internet connection was.  This drove the expansion of the internet as well as the need for more bandwidth.  Once the apps were rolling, other data collection techniques like mapping and geo-location were added to the mining process.  The more data that feeds the machine, the more ways it can be monetized.

Claiming to be free, these apps drive users out of their private desktops and into the fishbowl of the internet.  Online apps feed data mining operations just like search engines do.  This blog is written on Blogger, a Google owned web application that encourages information to be put online so it can be mined.  Why do I use it?  Because I want to publish my writing.  In certain circumstances it makes sense to put data out into the fishbowl, but you don't get to choose those circumstances on the web today.

The reason Google struggles to offer unmined online resources is that Google is a data mining company; it's what they do.  This isn't necessarily evil or nefarious, but it behooves us to understand how online companies work, especially if we're going to get all giddy about driving students online.

A lot of infrastructure had to be put into place for your personal computer to be built, but that infrastructure is minuscule compared to what is involved in creating an internet.  The cost of building and maintaining a worldwide networking infrastructure is staggering.  The only way to make it cost effective is to make the data itself pay.  There are cost benefits to scaling up this kind of infrastructure, so online companies drive as many people into producing data as possible.

Any company that lives online can't simply create something of value and then stand by it.  The sand is constantly falling through the hourglass; it costs bandwidth to offer even a simple online service on this expensive, complex, cut-throat infrastructure.  The only way to survive in an environment this carnivorously expensive is to make the data you're attracting pay.  You push into schools, into charities, anywhere you can to generate input.

There is no such thing as a free online app.  The whole point of any online service is to get you producing data that can be mined.  This data is valuable even if your name isn't attached.  Most privacy legalese attached to online services explicitly allows them to use your data as they see fit.  Cursory efforts are made to hide your name because no name = privacy, but your data is where the money is, and it isn't yours according to most online agreements.  You surrender control of your data when you agree to use their data mining, um, nifty online application.

Now that we've trained entire generations to ignore traditional media, this intrusive and invasive analysis is where market research has gone.  Multinationals don't spend marketing dollars on TV commercials for people under thirty any more, it's wasted money.  Instead, they drive the herd online, creating heat around exciting new smartphones / tablets / wearable computing - whatever gets people producing data to feed the network.

Again, this is neither good nor evil, but it is an evolution away from ideas of traditional advertising (which itself could be cast in a poor light).  The questions we need to ask ourselves as educators are: 

  • If we demand that students use online services that monetize the information they share, are we eroding ideas of privacy and personal security by demanding their online interaction?
  • Are we commoditizing our students' learning?
  • Should that make us uncomfortable?

There are ways to bypass all of this, but that means turning away from the carefully designed, market driven future laid out for us.  Education could adopt open source software that offers complete administrative control.  Educators could require students to actually learn how to manage digital tools from a mastery learning perspective (instead of whatever bizarre kids-know-this-stuff-intuitively / digital native thing we're doing now).

We could supply Tor browsers for students to use that would guarantee real anonymity and privacy.  We could expect students and teachers to learn how to manage their own online spaces and develop their own tools with education as the focus and no hidden data mining agenda.  We could leverage the sharing power of the internet to spread these tools around the world at little or no cost, but we don't, because the future we've been sold is so shiny that we can't think of anything else.


One thing is for sure: the future will be branded.  Branded
information, branded thinking, branded learning?
At the Google presentation at the recent ECOO conference the g-employee asked the room, "why aren't you all joining Google For Education?  I'm not going to go on until someone can tell me why!"  He was very enthusiastic in his hard sell.

In a less high-pressure sales situation I can formulate a response: I use Google tools, but I make a point of understanding what they are.  I get the impression that most Google Certified Teachers are more interested in being unpaid sales reps than in recognizing the complexities of cloud-based computing.  Any teacher who rushes to brand themselves with a private company's logo makes me question their commitment to pedagogy.  What's more important, using the best tool available or using the best tool from your brand?  It's a big reason why the idea of brand-specific computing devices will never get my vote.

We're being led to the cloud by implacable market forces that have monetized our information flow.  They offer ease of access, integration and a general malaise that many regular users of technology turn into ecstatic fandom.  You don't need to learn this stuff, we'll take care of all that for you; just hook yourself up to this milking machine and it'll all be OK.


Hook up students to the milking machine and tell them it's for their own good.  Edtech is preparing them for the future!

Saturday, 8 November 2014

ECOOs1: Nerd Machismo & Other Barriers That Prevent Technology Learning

Nerdismo works like any other kind of machismo,
insecure boys belittle others and make the most
of what little they know to establish a social
space they can control.
I attended an excellent talk by Anne Shillolo on how to engage girls in technology at the ECOO Conference this year.

I've been struggling for a number of years to convince girls to hang in there in senior computer classes.  In the grade nine introduction course I have a number of girls who are often front runners in terms of skills and ability to learn tech, but they all drift away in the senior grades.

Anne covered the systemic and social issues around this in great detail during her presentation.  Hopefully those issues will begin to resolve themselves now that many tech companies are conscious of the problem.  As much as I'd like to, I can't model being a woman in technology, but there are some other angles I can pursue.

In grade nine, especially in semester one, you tend not to get a lot of attitude because students are all fairly terrified to be in high school for the first time and are cautious.  As they acclimatize to their new school they look for where they are strongest and tend to establish dominance in those areas; the jocks own the gym, the drama kids rule the stage, etc.  I was once dismissively told by a university professor that tribalism is dead as a theory of human socialization, but that guy was an idiot.  In the world of high school (and pretty much everywhere else, including online) tribalism is alive and well.  Computer society is more tribalistic than most.

In the senior grades the (mostly male) computer geeks do to the computer lab what the jocks do to the gymnasium: they establish dominance.  I've seen a number of girls begin a senior computer studies course only to bail after the first week because of all the posturing.  The most frustrating was a coding prodigy, both of whose parents were programmers, who vanished to take an alternate course online where she didn't have to put up with the drama.  This nerdismo damages the field of computer studies in all sorts of ways, not the least of which is choking it of sections in high school because the vast majority of students feel ostracized by the culture of the students in the room.

Anne's presentation on girls missing out on technology led me to consider just how insular computer culture can be.  The idea of barriers to learning mathematics, science and technology came up in her presentation.  As someone who wanted to be an astronomer before he almost failed grade 10 physics (and did fail grade 11), I know it takes a fair amount of effort by the alpha-nerds of the world to shake otherwise interested right-brained kids out of 'their' fields of study.  From the science teachers who seemed to take great joy in pointing out that this wasn't my thing, to the computer science teacher who watched me drown in mathematical abstraction with an absent smile on his face when all I wanted to do was tinker with code, I've experienced those barriers first hand.

As a non-linear/tactile/intuitive/experimental thinker I was intentionally bludgeoned by numbers until I couldn't care less about computers.  Watching the tribe of like-minded students (many of whom were good friends) form around those teachers and pass beyond that semi-permeable membrane into the math/science/tech wonderland scarred me.

My tactile nature eventually paid off when I got back into computers (years later - scars heal) through information technology, but I've never forgotten how those left-brained mathletes made me doubt myself and turn away from the computer technology I loved.  I went from being the first kid in our school to publish code and own his own printer to going to college for art (and dropping out) because that was what I thought was left to me.  There was certainly nothing like code.org leading a charge for greater accessibility in learning coding (Anne showed this in her presentation).



I can't help but wonder how many kids we shake out of technology because they don't approach it in an orthodox manner, or don't fit the stereotype of what we think a person in tech is.  It might be slowly changing, but the gateway to learning technology is guarded by your stereotypical computer geek, and they are as fierce about guarding it as any athlete in a locker room. 

When I see teachers putting students in silos because of this kind of thinking, or worse, punishing students who don't follow their discipline in the same way that they do, I can't help but remember that I was once that kid who ended up dropping out and walking away.

Everyone can learn coding and computers.  Anyone who says, "I'm no good at that stuff" (including all the teachers I hear say it daily) is responding to the barriers that surround it.  Exclusivity driven by arrogance has defined how many people see the computer field.  Digital technology is so big now that any kind of thinker and doer can survive and thrive in the field, but we need the traditional computer experts to tone down the nerdismo.

The people who build the digital world we inhabit have as much swagger as professional athletes do nowadays, and it starts in high school with insecure boys chasing everyone who isn't like them out of the lab.  Until we take steps to open up technology to more diverse learners, it'll continue to chase girls and atypical thinkers out of this left-brained, male-dominated industry.


Perhaps I can convince more girls and alternative thinkers to keep learning technology into senior high school by not being an arrogant git, but I'm also fighting a well-established conception of what a computer geek is.  Until I can tone down the nerdismo in the computer lab, I fear that preconceptions will dictate who takes my courses.  The field of computer studies would greatly benefit from an influx of creative/alternative thinkers, but until the geeks loosen their grip, nothing will change.

Tuesday, 4 November 2014

Infecting The System

If the internet is the nervous system for a new global
culture, should it be artificially limited by human
self interest?
Cory Doctorow ended a harrowing editorial on artificially limited computing in WIRED this month with the observation that the internet isn't simply an information medium but has, in fact, become the nervous system of the twenty-first century.

Doctorow begins by questioning why we shackle computers with controls that users can't overpower, and in many cases don't even know exist.  He uses the example of the Sony rootkit, which installed viral software on a machine whenever a customer played one of their music CDs.  The idea was to curb piracy; the result was a blind spot in millions of customers' machines that was immediately exploited by hackers.

Whenever we build a computer that is subservient to anything other than the user, we create blind spots that hackers can exploit.  Whenever our software or hardware is artificially limited to satisfy human interests, whether government, business or even educationally motivated, we are creating a machine that is flawed.

There is a simple honesty to computing that I find very appealing.  When we're building a circuit, working with a computer or coding, students will often say that they didn't change anything but got a different output, or that they did everything exactly right and it doesn't work.  The subtext is always that the computer is up to something.  Whatever the computer is up to, you put it up to it.  Computers don't make mistakes; humans do.  This is why it's vital that computers are not controlled by remote interests.  When remote interests dictate computer outputs, you end up with confused users who start to blame the machine.


... because someone programmed HAL to kill.
Machines don't make mistakes, unless people tell them to.
I've long said that computers are merely a tool, but many people see them as intelligent entities with hidden agendas.  If we allow institutions to hard code their interests into our computers then we are intentionally allowing our flaws to infect one of the most honest expressions of human ingenuity.  We're also creating that confusion around computers as entities with evil intent (we provide the intent).

What goes for our personal devices also goes for our networks.  Unless we are going to continually battle for net neutrality and efficiency over self-interest, we're going to find ourselves with hobbled machines on near-sighted networks, seeing only what vested interests want us to see.  In that environment computers and the internet can very quickly move from democratizing force to Orwellian control.  Keeping computers free of human influence is vital to human well-being.

I've been uneasy about the nature of the modern internet as distraction engine as well as the branding of edtech.  Both examples reek of the infected human influence that Doctorow refers to in his editorial.  Wouldn't it be ironic if we, as a species, were on the verge of building a more perfect machine that allows us to move beyond our short-sighted selves, but instead of building that wonder we infect it with our own shortcomings and end up using it to create a kind of subservience never before imagined?

I see it every day in machines so locked down that they barely function as computers, with limitations on virtually everything they do.  This is done for ease of management, to satisfy legal paranoia and, ultimately, to ease the burden on digitally illiterate educators, but this approach has me watching whole generations grow up in an increasingly technology-driven world with no idea what it is or how it works.  As a computer technology teacher, this is difficult to swallow.

The only restrictions on a computer should be the laws of physics and the state of the art.  Efficiency and user empowerment should be the machine's, and our, only focus.  Everything should be up to the user; otherwise these magical machines aren't empowering us, they're being used to create dangerous fictions.  Is it difficult to teach students how to use computers like this?  Perhaps, but at least we'd be teaching them a genuine understanding of what digital technology is, and how to wield that power responsibly.  All we're doing now in education is feeding the infection.