Tag Archives: ICT

Year 10 Transition Activity at Gungahlin College

Welcome to our Year 10 transition day, and more specifically the IT Class we’ve got prepared for you.

One of the subjects we offer here at Gungahlin College is an introductory programming class that teaches you the fundamentals of how to program computers to do what you want them to do. We explore a number of different programming languages over the two years, and we’ll program a range of devices from desktop computers through to smartphones and embedded systems. The programming you do for each is a little bit different, and today we’re going to look at some of the intricacies of embedded systems programming using Arduino.

We’ve issued each pair an ed1 board, the details of which you can find here. It has been developed by National ICT Australia as a teaching tool for a couple of competitions and programs run by the National Computer Science School, and contains a whole heap of input sensors and output components that you can interact with directly. In today’s lesson, we’ll be exploring two of them: the LCD screen and the potentiometer (the small dial on the board).

To get started, you’ll need to open the Arduino IDE – you’ll find it in the Start menu under Arduino. It has been pre-installed for you and contains everything we need to program our boards. You’ll also need to collect a board and USB cable from the teacher.

When you get back to your computer, plug the board into the USB cable and the USB cable into the computer. It should light up and the board will start running the last program that was uploaded onto it. The way embedded systems work is that they store a single program that executes continuously once it has been loaded, so when the power is cycled to the board it simply restarts the program. There is a reset button on the board that also restarts the current program – find it now and press it to start the program again.

Back to the IDE – the first thing you need to do is use the Tools menu to select the:

  • Board -> Arduino Duemilanove w/ ATmega328
  • Serial Port -> COMX (where X is the highest number available in the list)

This ensures the computer knows how to communicate with the board, and does so over the right connection.

Now that’s done, we’re ready to start writing our first program. We’re going to start by loading a pre-written program that writes custom characters to the LCD screen – it allows us to draw whatever we want by lighting up the individual pixels on the screen. Delete everything in the Arduino IDE window, then copy and paste the code below into the window.

// include the library code:
#include <LiquidCrystal.h>

// initialize the library with the numbers of the interface pins
LiquidCrystal lcd(6, 7, 8, 2, 3, 4, 5);

// make some custom characters:
byte heart[8] = {
 0b00000,
 0b01010,
 0b11111,
 0b11111,
 0b11111,
 0b01110,
 0b00100,
 0b00000
};

byte smiley[8] = {
 0b00000,
 0b00000,
 0b01010,
 0b00000,
 0b00000,
 0b10001,
 0b01110,
 0b00000
};

byte frownie[8] = {
 0b00000,
 0b00000,
 0b01010,
 0b00000,
 0b00000,
 0b00000,
 0b01110,
 0b10001
};

byte armsDown[8] = {
 0b00100,
 0b01010,
 0b00100,
 0b00100,
 0b01110,
 0b10101,
 0b00100,
 0b01010
};

byte armsUp[8] = {
 0b00100,
 0b01010,
 0b00100,
 0b10101,
 0b01110,
 0b00100,
 0b00100,
 0b01010
};

void setup() {

 // create the heart character
 lcd.createChar(1, heart);
 // create the smiley character
 lcd.createChar(2, smiley);
 // create the frownie character
 lcd.createChar(3, frownie);
 // create the armsDown character
 lcd.createChar(4, armsDown); 
 // create the armsUp character
 lcd.createChar(5, armsUp); 

 // set up the lcd's number of columns and rows: 
 lcd.begin(16, 2);

 // Print a message to the lcd.
 lcd.print("I "); 
 lcd.write(1);
 lcd.print(" Arduino! ");
 lcd.write(2);

}

void loop() {

 // read the potentiometer on A3:
 int sensorReading = analogRead(A3);

 // map the result to 200 - 1000:
 int delayTime = map(sensorReading, 0, 1023, 200, 1000);

 // set the cursor to the bottom row, 5th position:
 lcd.setCursor(4, 1);

 // draw the little man, arms down:
 lcd.write(4);
 delay(delayTime);
 lcd.setCursor(4, 1);

 // draw him arms up:
 lcd.write(5);
 delay(delayTime); 

}

Assuming you have completed all of the above steps in order, you should now be able to click the Upload button (the little sideways arrow) to upload the program to the board. You’ll know the program is uploading because a progress bar appears in the IDE, and the little Tx/Rx lights (near where the USB cable plugs in) flicker to indicate data is being transferred and received.

When the upload is finished, you should see a message on the LCD screen, along with a few custom icons like a heart, a smiley face and a little man. That’s what your program does!

But you can also interact with it. The dial on the bottom left of the board (called the potentiometer) can be turned, and as you turn it you should see the speed at which the little man flaps his arms up and down change.

Let’s take a look at the program in more detail:

// include the library code:
#include <LiquidCrystal.h>

// initialize the library with the numbers of the interface pins
LiquidCrystal lcd(6, 7, 8, 2, 3, 4, 5);

The above code block does two things. First, it includes the library – pre-written code that defines what the LCD can do – so that we can manipulate the LCD screen much more easily; without this library, writing to the LCD would be very difficult. Second, it initialises the LCD panel by listing which pins connect the LCD to the rest of the ed1, so the computer knows which signals to send to which pins. This code is only ever run once, and it needs to come first so that we can make use of the LCD throughout the rest of our program.

// make some custom characters:
byte heart[8] = {
 0b00000,
 0b01010,
 0b11111,
 0b11111,
 0b11111,
 0b01110,
 0b00100,
 0b00000
};

byte smiley[8] = {
 0b00000,
 0b00000,
 0b01010,
 0b00000,
 0b00000,
 0b10001,
 0b01110,
 0b00000
};

byte frownie[8] = {
 0b00000,
 0b00000,
 0b01010,
 0b00000,
 0b00000,
 0b00000,
 0b01110,
 0b10001
};

byte armsDown[8] = {
 0b00100,
 0b01010,
 0b00100,
 0b00100,
 0b01110,
 0b10101,
 0b00100,
 0b01010
};

byte armsUp[8] = {
 0b00100,
 0b01010,
 0b00100,
 0b10101,
 0b01110,
 0b00100,
 0b00100,
 0b01010
};

The code above creates all of our custom characters using arrays of bytes. Each byte is a group of bits – single pieces of information that can each take the value 1 or 0. Here, a 1 means the display should show a black dot, and a 0 means the display should leave that pixel clear.

You’ll notice that each byte in the arrays is written with only 5 bits (the remaining bits of each byte are simply left as zero), and that each array has 8 bytes in it. This means that each character is made up of 40 individual pixels in a box that is 5 wide by 8 high. If you look closely, you’ll see how the patterns of 1s and 0s create the shapes we’re after – the heart is probably the easiest one to see, but the other shapes are also visible now that you know what you’re looking for.

We create the custom characters at the top of the program so that they can be used everywhere we want them to be used.

void setup() {
...
}

The setup() function is run as soon as the board is turned on, and is only run once. We can put any code we want to inside the curly braces (where the … is above) and that code will be run in that order as soon as the board has power. Let’s see what we do inside our setup() function:

 // create the heart character
 lcd.createChar(1, heart);
 // create the smiley character
 lcd.createChar(2, smiley);
 // create the frownie character
 lcd.createChar(3, frownie);
 // create the armsDown character
 lcd.createChar(4, armsDown); 
 // create the armsUp character
 lcd.createChar(5, armsUp); 

First, we create the characters by assigning each of our patterns to a single integer value (a number). You’ll see that the lcd.createChar() function requires 2 parameters – the first is the number you want to use to represent that character, and the second is the byte array that defines it (so in our case, 1 will represent the heart we created earlier). We assign each of our 5 characters to a number.

 // set up the lcd's number of columns and rows: 
 lcd.begin(16, 2);

This code tells the program that the LCD has 16 columns and 2 rows.

 // Print a message to the lcd.
 lcd.print("I "); 
 lcd.write(1);
 lcd.print(" Arduino! ");
 lcd.write(2);

Using the lcd.print() command, we can write text to the LCD. The print command understands how to display each of the standard symbols on the keyboard (letters, numbers and so on), so we can pass any words or phrases we want directly to print(). Notice how we use quotes (“) around the words? That’s so the program knows we want to print those exact characters to the screen.

You’ll see that we then use the lcd.write() function, and that corresponds to our heart on the display. Specifying a number inside the brackets of write() tells the LCD to draw the custom character we assigned to that number earlier – in this case, number 1, the heart. Notice also that we don’t use quotes this time – we want the number 1 itself, which refers to the custom character we set up, not the text character “1”.

You can mix write() and print() statements together to print a combination of custom and regular characters to the screen. It will print them in order from left to right.

void loop() {
...
}

In addition to our setup() function, we also need a loop() function. The loop() function runs as soon as the setup() function finishes, but unlike setup(), it repeats forever – when it finishes, it starts itself again.

Let’s see what happens in the loop function:

 // read the potentiometer on A3:
 int sensorReading = analogRead(A3);

 // map the result to 200 - 1000:
 int delayTime = map(sensorReading, 0, 1023, 200, 1000);

The first two lines of code read from the potentiometer (it is on pin A3) and map the result of the reading from a number between 0 and 1023 to a number between 200 and 1000. When you turn the potentiometer all the way to zero, the program sets delayTime to 200 (instead of 0), and when the potentiometer is at 1023 it sets delayTime to 1000; a mid-range reading of about 512 works out to roughly 600. This makes it easy for us to convert one range of numbers into a more useful one – the computer works out the in-between values by breaking the range up into equal parts.

 // set the cursor to the bottom row, 5th position:
 lcd.setCursor(4, 1);

 // draw the little man, arms down:
 lcd.write(4);
 delay(delayTime);
 lcd.setCursor(4, 1);

 // draw him arms up:
 lcd.write(5);
 delay(delayTime); 

}

By writing the above code into the loop() function, the program will repeatedly perform the following code:

  1. It will set the cursor to the fifth position on the second row (we count the first row/column as number 0, so the second is 1, third is 2 and so on…)
  2. It will then draw the armsDown character (character number 4)
  3. It will then wait for the delayTime worked out by the potentiometer (i.e. it will do nothing for between 200 and 1000 milliseconds)
  4. It will then set the cursor back to the fifth position on the second row, and…
  5. Overwrite the armsDown character with the armsUp character (5)
  6. Then wait for another 200-1000 milliseconds

Since the reading is taken each time that block of code starts, every time the loop gets run a different value can be read from the potentiometer, which is what allows you to alter the time that it takes for the man to wave his arms up and down.

So, that’s what our code does – cool, huh? Now it’s your turn to do something of your own using the bit of knowledge we’ve given you today. Here are some tasks you can try for yourself, and don’t forget to ask for help if you need it:

  1. Write your own message to the LCD screen, and try centring your top and bottom rows;
  2. Make up your own set of custom characters, and write them to different locations around the board;
  3. Use the potentiometer to change between different custom characters/numbers on the screen (a possible starting sketch follows this list);
  4. See if you can make the potentiometer move a character across/around the screen; and
  5. If you finish the above, ask what other activities you might be able to do.
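
If you get stuck on the third task, here’s one possible starting point – just a sketch, assuming the same wiring, pin numbers and custom characters as the example program above, so treat it as a guide rather than a finished answer. It reads the potentiometer and uses the reading to choose which of two custom characters to draw:

#include <LiquidCrystal.h>

LiquidCrystal lcd(6, 7, 8, 2, 3, 4, 5);

// two of the custom characters from the example program
byte heart[8] = {0b00000, 0b01010, 0b11111, 0b11111, 0b11111, 0b01110, 0b00100, 0b00000};
byte smiley[8] = {0b00000, 0b00000, 0b01010, 0b00000, 0b00000, 0b10001, 0b01110, 0b00000};

void setup() {
 // create the custom characters and set up the LCD as before
 lcd.createChar(1, heart);
 lcd.createChar(2, smiley);
 lcd.begin(16, 2);
 lcd.print("Turn the dial...");
}

void loop() {
 // read the potentiometer on A3
 int sensorReading = analogRead(A3);

 // lower half of the dial picks character 1, upper half picks character 2
 int characterNumber = (sensorReading < 512) ? 1 : 2;

 // draw the chosen character in the middle of the bottom row
 lcd.setCursor(7, 1);
 lcd.write(characterNumber);
 delay(100);
}

Extending this to all five characters, or moving a character around the screen for the fourth task, only means changing how the reading is converted into a character number or a cursor position.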

Online platforms for personalisation, analytics and immediate feedback #fliptm #TMACT

Last week I presented at the Gungahlin College Flipped TeachMeet on the topic of personalisation, feedback and analytics. The format of the TeachMeet was a bit different to the usual – presenters recorded their presentation via video and posted it online prior to the event, so the focus of the sessions was on discussion around the ideas. My video is embedded below and explains some of the tools I’m using to personalise learning, provide automated feedback to students and analyse data about student achievement to help identify where additional support is needed.

In the video I talk about a few platforms I’ve used extensively in class:

  • Grok Learning – Learn to program in Python, and get instant feedback every time you attempt a problem. For teachers, the dashboard really is an excellent tool that gives you an overview of student progress and helps you identify what topics students need extra support with.
  • Treehouse – This platform contains guided lessons and activities that help you learn a range of topics. For us, the focus is on Web Design/Development, and it allows students to pick and choose individual paths to learn the skills and knowledge most suitable for where they’re at. Like Grok, feedback is given in the browser, so you can quickly see if you understand the material being presented.
  • Schoology – Our LMS provides us with automated quiz/test tools that give us a way of quizzing students and checking that they are grasping the material. Since students are provided with feedback on their progress, they can use this to identify their areas of strength and weakness and seek targeted assistance from teachers.
  • Oppia – A new open-source platform that allows users to create “explorations” that provide guided, personalised paths through the learning of material. Explorations can direct students to different activities depending on their answers to previous questions, which better targets the individual needs of each student.

There was a lot of enthusiasm on the night from teachers of many disciplines and school levels for Oppia in particular due to its flexibility, and I’m looking forward to working with some teachers at my school to see just how powerful it can be as a platform. I’m even thinking I’d be interested in contributing to the codebase as the group of us identify features we see as integral to it becoming a useful tool.

The K-12 Horizon Report 2013 lists learning analytics as a trend we’re likely to see having an impact in schools in the next 2-3 years, and it’s clear, based on tools like Oppia and some of the third-party proprietary tools I’ve seen recently, that data is becoming increasingly important in education circles. Being able to harness that data to better meet the needs of our students won’t put us out of a job – what it will do, though, is allow us to use our time better and focus on the things that are important to our students, rather than what might be important to our curriculum authorities.

Thoughts from the 2014 FutureSchools conference #FutureSKL

I’m currently sitting at Gate Lounge 33 at Sydney Domestic Airport (well, that’s where I was when I started writing this article, but it’s now the weekend and I’m at home finishing it up) after spending the last two days at the 2014 FutureSchools conference. If I were to sum up my thoughts in a few words, they’d probably be: good things are happening in schools, but there’s so much more to do.

As a general rule, the presentations were pretty good. I felt there were some that focussed a bit too much on the technology and/or the learning spaces themselves rather than the pedagogy that goes along with it, but some of that I believe comes back to not all of the presenters being “professional” presenters in a sense. My only criticism of the event overall would be that it was very much a “sit and listen” type of event – interaction with presenters was pretty low (with the exception of Eric Mazur who did a great job involving us in his presentation – see the section on Peer Instruction below), and that made the later sessions difficult to stay completely focused on. What we really needed were some opportunities to work with smaller groups of delegates to explore interesting ideas and talk about the details of what was being presented – much of it was big picture, and didn’t address some of the more pressing issues like how to bring staff along and/or break down preconceptions or negativity about change.

Learning Spaces: an enabler, not an answer

Presenters from a few different schools gave us some insight into the way they are using some of their learning spaces. Presenters from Brisbane Boys College, Scotch Oakburn College in Launceston, Stonefields School in Auckland, Mordialloc College in Victoria and Anglican Church Grammar School among others all demoed their learning spaces and talked about the ways they’re re-thinking how they’re used to keep kids engaged with school.

We’re very lucky at my school that, as a newly built school only a few years old, the learning spaces throughout the building have a lot of variation and scope for being very flexible. What has tended to occur, though, is that each of the spaces has become set up and used in a relatively static and permanent way – although there is scope for flexibility and dynamism, in many cases very little is done to change how each space is used throughout the year. The result is that the methods used to teach in those spaces are very typical of what would be observed in regular classrooms – evidence that having flexible spaces alone is insufficient to change teaching.

That doesn’t really come as a surprise – technology works in much the same way. Replacing books with laptops doesn’t automatically create classrooms that aren’t teacher-driven (and in fact, I’ve seen many examples where the only difference is that students type notes from the board rather than writing them), nor does swapping out blackboards and chalk for IWBs. Like technology, learning spaces are an enabler – both provide us with new capabilities that wouldn’t have been available otherwise.

Think of it this way – if all students have at their disposal is books and pens, then every task they do will involve writing, drawing and/or conversation with content provided by the teacher. However, with technology, not only can they also do things like make movies, record their discussions and collaborate in real-time on the same documents, they can also access an unlimited amount of information to help consolidate their learning, and use a range of different resources that might be more appropriate for their learning styles (opting for a video or podcast series rather than text-heavy web sites or articles).

Many teachers are unfamiliar with the technology and unsure of how to use it to teach in new and interesting ways. The same can be said for learning spaces – if all you’ve ever known is rows of desks and a board at the front of the room, how then can you be expected to take advantage of the options provided by highly flexible learning spaces?

Interestingly, in the case of flexible learning spaces, many of the benefits they offer are only really available if they are coupled with technology. While we can configure learning spaces to provide students with areas for group discussion and collaboration, individual learning, large-group presentation of information and one-on-one support, if we’re relying on a single source of information or content delivery then the flexibility is of no value. To really take advantage of flexibility of space, we need to have lots of content options and activities that students can be engaged in that will allow them to learn and reflect in ways that make sense to them.

So how do we make better use of the spaces at our disposal? We need to invest a lot of time into teaching teachers how to teach in that environment. Notice I didn’t say “show teachers how to facilitate learning in that environment”? That was intentional. In many cases, students learn in spite of what their teachers do – the learning can often happen no matter what is going on. However, when teachers are effective teachers (i.e. they “teach” well, for a given interpretation of “teach”), then the learning that is possible for students is far greater than it would be otherwise.

My plan for this year is therefore twofold – develop a strategic plan for the growth, use and implementation of technology for teaching and learning at the school, and within it seriously consider the role of professional development in addressing not only the technical and pedagogical needs of staff with respect to technology, but their needs with respect to the learning spaces as well.

Changing Culture: Consultation, Community Involvement and Nurturing Innovation

What was really clear from early presentations where schools had successfully changed the culture was that in every case, without exception, students and the general community were involved in the process. Presenters made it very clear that a large part of what kept students interested in the school was the building of relationships with their teachers and school leaders, and that empowering them to drive aspects of the decision-making process was the easiest way to get buy in from the student body. We’re currently in the process of investigating a new timetable structure to cope with the increasing enrolment numbers at our school, and it gave me the idea – why don’t we have the students organise a community forum to collect ideas and present options about what this might look like? It is one of the things I’m going to suggest as part of our strategy over the next few months, and I’m hoping that other members of the leadership team will see the value in such a move. Does it mean that the student voice will be the only determinant of any change? No. But it will mean that their voices will be heard and can be considered as a part of the change process.

The other barrier to change that was discussed at length was the perception of what school should look like that came from parents – it was interesting to hear how principals who had only recently taken up their positions were contacted by families to find out if they were prepared to “stop the madness” that was going on in the school. As previous participants in the education process, many parents “know” what school is and are afraid of any departure from that picture. Successful schools that have managed to shift their pedagogical approaches away from teacher-centric, content-focused delivery practices to student-directed, teacher-guided, personalised learning unanimously had parents heavily involved in the transition. There was a significant investment in parent education; bringing teachers, students and parents together to openly share what it was each stakeholder group thought education should look like and what tools and environments would facilitate it. The greatest allies for schools in these conversations were the students themselves – it turns out that kids are much better at convincing their parents something is a good idea than the school is, and when all parties agree the transition is smoother and much quicker than it might otherwise be.

Stephen Harris, Principal of Northern Beaches Christian School, presented his “steps” for successful cultural change:

  1. Observe the situation, and involve everyone in the process;
  2. Have a clear vision about where you’re headed;
  3. Develop the vision with others – build it and allow it to grow;
  4. Encourage ideas that support the vision through space and collaboration;
  5. Act on those ideas; and
  6. Evaluate progress regularly and adapt the vision based on what is working.

It’s a relatively simple idea, but for me I think the key is definitely defining the vision and having others buy-in to it – making it a shared vision so that everyone is working towards the same goal. I think of it a bit like a soccer team – everyone plays a very specific role, with  each working towards getting the ball in the opponent’s goal while not giving up their own. Without the goals at either end, we’d have a lot less structure and nothing concrete to work towards, and there’d also be no way of determining the success of any unplanned moments of brilliance that might come along.

Structures that encourage innovation

Another thread throughout many of the presentations was that innovation and change comes about only when supported by appropriate structures. Some of these are organisational, others physical. I’ve extracted the ones that struck a chord with me below.

Leadership Structures

A couple of schools talked about the way they’ve structured their leadership teams to both take advantage of the skills and expertise of their staff and to encourage creative thinking and innovation. NBCS and the Australian Science and Mathematics School both threw out the traditional, faculty-based organisational structure and instead have adopted more fluid and dynamic approaches that encourage experimentation and collaboration rather than reporting up and down the chain of command. This primarily achieves two things:

  • it eliminates the expectation of management that a hierarchical, top-to-bottom structure creates, encouraging every person in the organisation to take on leadership roles and innovate, and shifting the emphasis of senior members of the organisation towards visionary thinking and innovation; and
  • it breaks down the barriers that are naturally created by the independent business units common in hierarchies – typically in high schools, this is the faculty unit.

I love the idea that teachers should spend more time working with colleagues from other disciplines and sharing their thoughts more widely, and that leaders are given greater opportunity to define what the important aspects of their roles are.

What was really evident, however, was that for this approach to work, everyone must be invested in the vision and strategic direction of the school. There’s a significant amount of groundwork necessary to put that in place before you can just flip the organisation on its head.

Personal Learning Time

At our school, students are not timetabled into classes every period, which provides them with their own Personal Learning Time. The idea is that this flexible time lets them focus on their study in a way that best suits them, and seek out extra assistance from teachers and peers outside of regular class times. It’s a good idea, but it isn’t always utilised by students as well as it could be.

Many of the schools that presented talked about the way they have adopted “20% time” similar to organisations such as Google and 3M. The idea being that students can choose something to work on – absolutely anything, with no restrictions or limitations – and use 20% of the timetable at school to explore their interest. There is an expectation that they will present what they learn back to their teachers and peers, then move on to another topic or interest.

Across the board, the schools that have adopted it have said it is one of the most popular initiatives amongst the student body. It got me thinking – we’ve got that space in the timetable (which in our case works out to be about “16% time”), what if we could recognise anything a student did that sat outside of the regular curriculum during that time? I think there’s merit in the idea, and I also believe that there’s a good chance that the learning that takes place would flow on to better results in other subjects too. I’m going to investigate how we might be able to get that happening – providing some kind of framework for students to better utilise their non-timetabled school time, but still crediting them with some formal recognition of the learning that takes place. I’m sure it’s possible.

The Staffroom

I’ve never been a fan of staff rooms. Personally, I find that while they’re great for developing collegiality amongst the people that share a space, what they also tend to do is create separation between different staff rooms as a result of people not being challenged or exposed to alternative ideas on a regular basis. When there is little need to relocate yourself, busy days often mean you just don’t bother to do so. I’ve always made it my mission to try and get around to other staff rooms regularly so that staff know who I am and I get a chance to hear a bit about what they’re doing. I haven’t been as successful this year as I have previously (moving to a new school no doubt being a factor), but it’s something I’m working on.

To counter the negative effects of the staff room, some schools have begun the process of eliminating them altogether, or at the very least blurring the line between what defines a staff room “space”. Instead, staff are encouraged to work in locations that make the most sense at the time for their work – if it is collaborative planning, moving to a space with a round table and plenty of whiteboard space is going to be much more conducive than a standard staff room space might be. Equally, if what you’re working on requires uninterrupted attention, finding a private area where you can shut yourself away for a short period of time to finish something up is equally important.

I don’t believe that you can just get rid of the staff room altogether – I think there’s a need in any school environment for teachers to be able to separate themselves from the students at times, especially when you consider the many situations where privacy is important (for the students and the staff). But I do believe that you can minimise the amount of staff room space in a school. A large space or two with options for lots of people to work in different ways strikes me as the ideal – just like we want to create dynamic, fluid spaces for learning in different ways, so too should we be looking at these options for staff. Besides, there will always be the occasional empty space at various times of the day where classes aren’t happening, and that could be useful too.

The biggest blocker here would no doubt be staff themselves – many staff have become comfortable working in the current paradigm, and to change would be a fairly significant shift. We’re also used to many procedures in schools that tend to work on the assumption that teachers reside in staff rooms and that those places aren’t fluid – there’d be a lot of work that needs to be done to alter administrative processes and implement solutions that would allow us to operate in a different environment.

Information and technology

The Library

Without a doubt, one of the most contentious spaces when any suggestion for change is made is the library. I love books – I’ve got a decent-sized collection of my own at home – but the reality is that when I go looking for information nowadays, books are often not my first point of reference. There are some situations where books are absolutely fantastic – one of the most challenging things I find at the moment when teaching accounting is that while there is plenty of information online for techniques and processes that apply to accounting generally, finding information about things that are specifically Australian and accessible to students can be really tough. There are books that do this well, and their value cannot be overstated.

So when I suggest the following, don’t interpret it as me being a book-burner or anything – libraries need to change in a BIG way. We don’t need anywhere near the number of books that is typical in a conventional, established library. We also don’t need the library to contain classrooms, labs of computers or tables set up only for individual study. The library has the potential to become an energising hub of information, research and thinking, but libraries with older designs don’t conjure up those images anymore.

I see libraries now as being much more multi-modal, and there are many librarians out there that completely understand it. Our TLs are regularly recording and sourcing video for students that they make available through our media servers, and this supplements our book collection. They do a great job and I value the TL role immensely.

However, the spaces in libraries need to reflect this. More small study areas, lots of variety in the spaces available, collections of resources such as podcasts, videos, lectures and media from educational institutions across the world – that’s what is relevant to our students today. And, best of all, a lot of this material is actually free. The problem is the quantity and quality of what is out there, but that’s where the real value of the Teacher Librarian is – they know how to curate and catalogue amazing content.

To be able to do this effectively, TLs need the time and technology to support this move, and some input to help design library spaces that are attractive and inviting to students of all ages.

Communication

Communication is never the best it can be – it just isn’t possible. It’s a multi-faceted problem that gets so complex with new forms of communication that keeping up is a job in itself. But one of the things that always frustrates me is the amount of time spent on communicating administrative information when instead, what inspires learning and excites people is hearing about interesting developments in a range of areas.

We’ve got large screen TVs hooked up across the college that are capable of streaming all types of media from a content server. What exactly are they used for? Right now, RSS feeds of news, the school Twitter feeds and similar, but most of what goes up there is administrative – this event is coming up, don’t forget exam week etc. None of the content is designed to challenge thinking – it’s used to disseminate information.

That information shouldn’t dominate those screens. Sure, it’s important and it needs to be shared, but surely there are better ways to make use of significant amounts of display time. I’ve been thinking – what if the administrative announcements were up during certain times of the day, while during others the screens were showing streams of what was happening in the performing arts, or video of some interesting science experiments, or a major cosmological event, or a public lecture from a local university on a human rights issue? One of the ways I think we can better engage students with that kind of information is to make it easily accessible, and to give them a reason to go back and look at the screens on a regular basis. If all we’re doing is feeding them information they are getting from other sources (such as their smartphones), it’s an opportunity that’s going to waste.

I’m not completely sure of the capabilities of our systems, but I understand that the server and software is quite powerful. I’m going to incorporate better use of our existing systems into the strategic plan for technology.

The power of Peer Instruction

For me, the best session of the conference was the second day keynote presented by Eric Mazur from Harvard University. I mentioned it earlier because of all of the sessions that were held, it struck me as being the most interactive. While others attempted to involve us, the enthusiasm Eric generated as a result of the use of peer instruction in a restrictive lecture space was enlightening. What was surprising wasn’t that it worked – it’s something I’m sure all teachers have used before – but that it made me feel like a student that wanted to learn again. By the end of his lecture, I’m sure that every single person present was excited about thermal expansion in solids, or at least was hooked enough to need to know the answer to his question.

Mazur has delivered other lectures on this same topic in the past – his Confessions of a Converted Lecturer video is available on YouTube (this version is 80 minutes long, but there is also an 18 minute summary) – but ultimately, what he showed us was that students, when involved in each other’s learning, are able to teach others and convince them of a correct answer if they’re given the time to do so. Using real-time feedback and response systems, he was able to demonstrate how once a critical mass of students in a large group had understood the concept, he could have the group collectively find the correct answer to a problem very quickly. Even in a lecture, where the teacher remains at the front and direct access to them by the students isn’t possible, it is enough to have students speak with the people around them.

He’s known as the “Pioneer” of Peer Instruction and Flipped Learning, and he spoke about both of these topics in his presentation, but the clear contrast between his presentation and many others was that his focus was on the change in pedagogy necessary for student improvement. Not once did he discuss the flipped classroom beyond the idea that the video becomes the tool for delivery of content and the classroom experience the chance to engage with the material through problems and practice – once that had been established, all of his time was spent emphasising that the classroom environment and teacher actions had to change to ensure that opportunity was provided to the students.

I’ve taken a lot of what he modelled on board and I’m going to endeavour to do a lot more to provide my students with as many opportunities as possible to practice and share their learning experiences. I’m convinced that it needs to be a fundamental part of what learning should look like in all classrooms. That too will be a factor in the development of our technology plan.

Ultimately, a LOT to think about and share with the leadership team at school on my return. Was it a worthwhile two days? For me, absolutely. For the school? That’s dependent on the willingness of everyone to experiment a little and enact elements of what has been successful in other places. I think there’s a lot there that has the power to improve what we do.

Digital Technologies: Now a Subject in the Australian Curriculum

I was thrilled to see that the Australian Curriculum: Technologies has finally been made available online for all teachers to see and begin using in their schools. Sure, it is currently marked as “awaiting endorsement”, but that’s largely due to the Curriculum Review that has been instigated by our current federal minister Christopher Pyne. We’re now at a point where educators can get moving on implementation of the F-10 curriculum.

What excited me about the Digital Technologies curriculum in particular is the way that it has embraced the Digital Technologies as a way of thinking and a tool for creativity. The problem I’ve always had with the teaching of ICT in schools is that it has largely been seen as a tool that should be integrated to assist the teaching of other subjects – that’s fine, but that’s captured in the ICT General Capability in the Australian Curriculum and is very different to the study of ICT as a discipline, sometimes branded as Computer Science, Informatics, Computing or similar. Given the ubiquitous nature of ICT in our world today, it has always struck me as odd that we’ve relegated the understanding of ICT to being all about its use, rather than how it manages to achieve the “magic” that many people mistake it to be.

So finally we have some guidance for teachers, especially in the primary years, about what to teach to impress upon students the fundamental knowledge and skills required to be a developer of ICT solutions. This doesn’t mean we have to make students in Kindergarten write code in Java or anything – in fact, the Digital Technologies curriculum for Foundation to Year 2 instead focuses on pattern recognition and the classification of data in contexts that kids can understand. A significant amount of time was spent during the writing of the curriculum looking at how what students need to know to develop a strong conceptual understanding of the Digital Technologies could be integrated with what they are learning in other subjects. That’s important – it validates what teachers are currently doing to teach these ideas, or provides explicit advice to teachers about what they need to address when they design new lessons.

I accept that not everyone agrees with me – here’s a post from the Conversation written by an academic from the University of Newcastle – but on reading the article (which was forwarded to me by someone I know who has an interest in the DT curriculum), I felt the need to respond to some of the statements being made.

You can see my post in the comment thread at the link above, but I feel so passionate about what I had to say I felt the need to post it here, since I believe it stands on its own.

As an IT teacher who has the skills and knowledge to deliver this curriculum, I get a little bit frustrated about some of the ongoing concerns people keep expressing with the curriculum, largely because I feel like many of the criticisms are being made with underlying assumptions in place that need to be challenged.

The Digital Technologies curriculum does not insist that students become programmers – at least no more so than the English curriculum insists they become authors, the Mathematics curriculum insists they become mathematicians or the Science curriculum insists they become scientists.

Many of the same arguments and/or questions about the relevance of some of the content included can be asked about other learning areas – such as the need for students to understand stem-and-leaf plots in Mathematics, or the structure of multi-cellular organisms. Look at all of the curriculum documents (and it is important we differentiate the curriculum from a syllabus – they are different things) and you’ll find that if it really came down to it, you could question the inclusion of many of the skills and/or understandings that the writers in each area have decided to focus on.

That aside, the other major consternation people have about it all is the time / crowded nature of the curriculum, however this all comes about because many commentators still insist on looking at the subjects as being independent of one another. We look at the Science curriculum and then, at school, we teach kids Science. We do the same with Maths, English… Why? How many times in the real world do we look at a problem and say “oh, that’s a problem that can only be solved by mathematics, I’m not going to consider any of my scientific or social understanding to come up with an answer”?

The curriculum has been written with the interdependence and relationships between the learning areas in mind – or at least that is my understanding. We talk about falling levels of literacy and numeracy, and then argue that this is a case for eliminating non-critical subjects from the learning of students? Surely the reason they are not engaging with school has to do with the fact that the way they are being taught isn’t working for them? It is possible to teach many numeracy and literacy concepts using much of what has been included in the Digital Technologies curriculum. Similarly, you can teach programming within the context of mathematics, algorithms as recipes in a kitchen, and data representation as an exploration of pattern recognition and language translation.

To simply look at the fact that programming has been included in the curriculum and then dismiss it due to the fact that not every kid needs to be a programmer completely fails to recognise the importance of logical reasoning and the methodical development of algorithmic solutions when faced with complex problems – a critical skill that can be developed through learning computational thinking. Not every student will end up being a mathematician, so why do they need to know about polynomials and parabolas?

And I also don’t think it is sufficient to argue that a lack of trained teachers is reason enough for the subject to be relegated to a position of less importance. The curriculum should be both aspirational and intended – it is up to schools, society and teacher-training programs to find reasons to encourage people with the skills and knowledge required to teach the curriculum to consider joining the profession. The same argument would not be applied to any other learning area – we would never say that not having enough English teachers would be reason enough to stop teaching English, would we?

The use of technology for the “thrill” of using it is fine – I’ve got no problem with people making use of the great technology available to better their lives etc. But accepting technology as “magic” is not acceptable in the longer-term if we want to continue to develop as a society. Would we be where we are today if we had simply accepted the idea that rain just happened and didn’t instead seek out a reason for it? We have the technology that we have today because people who found the passion and excitement to learn more about it did so through curiosity and interest.

We can make the Digital Technologies curriculum interesting for all students, just like we can for every other learning area. The first step in making that a reality is to stop artificially segregating the subjects and to emphasise the interdependence that exists across every discipline of knowledge. When designing a lesson or unit of work, what we need to do is look across multiple learning areas and find ways to engage students with lots of different interests – to connect what they are learning to their world.

Does this mean every child will like learning every aspect of the DT curriculum? No, just like not every child will enjoy Maths, Science or other subjects. But we can at least develop in them an appreciation of the value each discipline has, and the impact of each on their way of life now and in the future.

Oh – and on the last point re: not including Scratch (or anything else) in high school – the curriculum doesn’t do that. There is nothing that precludes the use of visual programming to teach concepts from any learning area. What has been expressly mentioned is that students learn about general purpose programming languages. These languages are different when compared to drag-and-drop type visual languages because they allow us to perform significantly more computation than is possible otherwise. They are important, but that doesn’t mean that other, more familiar platforms or languages can’t be used to address other aspects of the curriculum. I use a similar technique to explore recursion with my students, producing fantastic looking artwork using Context-Free grammars and exploring randomness as well (which is a nice way of visualising genetic mutation).

We need to stop looking at movement through the bands as discrete periods of learning – it is a continuum and the learning that takes place in earlier bands should be used as the foundation for learning in later ones.

I’d be very interested to hear the thoughts of other educators of all disciplines on this issue and those like it. Please join the conversation and post your comments below – this is one of those topics I’d love to see start a very interesting, ongoing dialogue.

#ACTvotes – but why does it take so long?

For those of you from outside the ACT, you may not be aware that we’ve just had our ACT Election for our next term of government. The election coverage can be followed in a few places – most notably Antony Green’s election blog and the ABC’s election coverage.

Of course, there are a myriad of other places for information too, but these tend to be the ones I use each time there is an election on I’m interested in following. This post hasn’t happened as a result of me wanting to talk about the results (although another minority government in the ACT isn’t anything new), but about the length of time it takes to get a result after each poll.

You see, the ACT has only one house of parliament (most States and the Federal government in Australia are bicameral and have a lower and upper house), and the Legislative Assembly is made up of 17 members spread across 3 electorates. As such, the election uses a quota-based system known as Hare-Clark, combined with a Robson Rotation for listing candidates on the ballot paper. As Antony points out on his blog and in the coverage on the ABC last night, the results can take a long time to finalise because:

  1. A single transferable vote method means that until all votes are distributed to lower preferences, candidates may not meet the required quota and therefore it cannot be determined who has won a seat;
  2. Robson rotation means there are a LOT of different ballot papers (since candidates are in a different order) so the scrutineering process takes longer since you cannot use the position of the preference on the ballot as an indication of who the vote is for; and
  3. Historically, the ACT always ends up with minority governments, so the cross bench needs time to negotiate with the major parties to determine who will be supported to form government.

It amazes me that we still rely on a manual count of votes to determine our winner given what computers are capable of today. Security is often raised as the reason why voting cannot occur over the Internet or via electronic means, and although the ACT has electronic voting facilities available, these are only installed in about 6 or so polling places and are used predominantly for pre-polling. So, although about 20% of the vote is entered electronically, the great majority of votes are still cast by filling out ballot papers.

We’ll put the costs of printing and the environmental impact of the campaigns aside for the time being – I want to focus on the actual counting process. I’ve thought about this a lot, and one of the most frustrating things is that any voting system, including Hare-Clark, can be represented algorithmically, which means it would be straightforward to write a computer program that uses the voting data to determine the winner of the election in a short amount of time. In fact, one of the tasks I’m setting for some students involves writing a program to do exactly this, and I’m pretty confident that these Year 10 students, with about a year of programming experience, will manage it. So the counting algorithm itself clearly isn’t the deal breaker in terms of improving the efficiency of the process.
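
To give a rough sense of what such a program might look like, here’s a heavily simplified sketch of a quota-based preferential count. This is not the actual Elections ACT counting rules – it ignores fractional transfer values for surpluses, Robson rotation and all of the formal edge cases, and the candidate names and ballots are made up for illustration:

#include <algorithm>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// A ballot is an ordered list of candidate names, highest preference first.
using Ballot = std::vector<std::string>;

// Droop quota used in quota-based counts such as Hare-Clark.
int droopQuota(int formalVotes, int seats) {
  return formalVotes / (seats + 1) + 1;
}

// Very simplified single transferable vote count: elect anyone who reaches the
// quota, otherwise exclude the lowest-polling candidate and let their ballots
// flow to the next live preference. Real Hare-Clark also transfers surpluses
// at fractional values, which is omitted here.
std::vector<std::string> countVotes(const std::vector<Ballot>& ballots, int seats) {
  const int quota = droopQuota((int)ballots.size(), seats);
  std::vector<std::string> elected;
  std::map<std::string, bool> excluded;

  while ((int)elected.size() < seats) {
    // Tally each ballot against its highest-ranked candidate still in the count.
    std::map<std::string, int> tally;
    for (const Ballot& ballot : ballots) {
      for (const std::string& candidate : ballot) {
        bool isElected =
            std::find(elected.begin(), elected.end(), candidate) != elected.end();
        if (!isElected && !excluded[candidate]) {
          tally[candidate]++;
          break;
        }
      }
    }
    if (tally.empty()) break;  // every remaining preference is exhausted

    auto best = std::max_element(tally.begin(), tally.end(),
        [](const auto& a, const auto& b) { return a.second < b.second; });
    if (best->second >= quota) {
      elected.push_back(best->first);  // reached the quota: elected
    } else {
      auto worst = std::min_element(tally.begin(), tally.end(),
          [](const auto& a, const auto& b) { return a.second < b.second; });
      excluded[worst->first] = true;   // exclude and redistribute next pass
    }
  }
  return elected;
}

int main() {
  // A toy two-seat electorate with six formal ballots.
  std::vector<Ballot> ballots = {
      {"A", "B"}, {"A", "C"}, {"B", "A"}, {"C", "B"}, {"C", "A"}, {"A", "B"}};
  for (const std::string& name : countVotes(ballots, 2))
    std::cout << name << " elected\n";
}

Even this toy version shows why a computer could re-run the full distribution of preferences in seconds every time a new batch of ballots is added to the count.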

The biggest hurdle, then, would appear to be taking the votes themselves and converting them into an electronic format that could be used by the computer program to generate the result. If electronic voting and/or Internet voting are still a way off, then with the advances in OCR I can’t see why the paper ballots couldn’t be processed using a workflow like:

  1. Ballot papers are scanned into a computer;
  2. The OCR reads the vote to determine:
    1. The position of each candidate on the ballot paper; and
    2. The preference, if any, awarded to each candidate on that particular vote;
  3. The OCR-generated data is then displayed on a screen next to the scanned copy of the ballot paper, and is checked by 2 or more scrutineers for accuracy (this step currently takes place with paper ballots, but this approach could allow the workload to be distributed much more easily):
    1. If accurate, the scrutineers approve the result;
    2. If OCR has generated an error, the scrutineers manually correct the error, then approve the result;
  4. The accurate data is then formatted and stored in a database so that it can be read by the counting program/algorithm (see the sketch after this list);
  5. At any stage, the votes that have been entered can be analysed using the algorithm, and this could generate not just first preference votes, but the final result of the Hare-Clark allocation based on the votes that have been collected.
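
As an illustration of step 4 (the file name and column layout here are entirely hypothetical), each verified ballot could be stored as one row of preference numbers – one column per candidate, in a fixed order – which the counting program then loads directly:

#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// One digitised ballot: the preference number (1, 2, 3, ...) recorded against
// each candidate column in a fixed order, with 0 meaning no preference given.
using BallotRow = std::vector<int>;

// Loads rows like "1,0,3,2" from a hypothetical ballots.csv produced by the
// scan / OCR / scrutineer workflow above.
std::vector<BallotRow> loadBallots(const std::string& path) {
  std::vector<BallotRow> ballots;
  std::ifstream in(path);
  std::string line;
  while (std::getline(in, line)) {
    BallotRow row;
    std::stringstream cells(line);
    std::string cell;
    while (std::getline(cells, cell, ','))
      row.push_back(cell.empty() ? 0 : std::stoi(cell));
    if (!row.empty()) ballots.push_back(row);
  }
  return ballots;
}

int main() {
  std::vector<BallotRow> ballots = loadBallots("ballots.csv");
  std::cout << "Loaded " << ballots.size() << " ballots\n";
}

From there, converting each row into an ordered list of preferences for a counting routine like the sketch above is straightforward.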

Given the manual process still required to transfer the votes from paper form into an appropriate digital format, it would probably still take an evening to have all of the votes entered into the computer and processed. However, at any time during the evening, a full analysis of the outcome of the vote could be determined (based on a partial vote count) and this would make it easier for analysts like Antony Green to predict not just the parties that would win the seats, but the candidates as well. By 11pm that night, I dare say that everyone would know which candidates had achieved a quota, and by Monday morning negotiations could begin between the cross-benches and major parties with the result of the election known and confirmed.

It seems absurd to me that this kind of process isn’t already in place – not only does it make sense economically, but it’d also mean that our pollies could get on with running the place rather than being forced to wait for two weeks to determine what the composition of the parliament would actually look like.

I’d be interested to hear from people who have either been involved in the scrutineering or counting process, or from one of the analysts like Antony Green himself, to try to determine why we haven’t got a system like the one I’ve described in place already. Given it could be used for any voting system currently in use in Australia (since they all need to have defined algorithms that can easily be programmed into a computer), there has to be an incentive for the AEC to implement such a solution.

AppleTV in Educational Settings

Recently I’ve been experimenting with configuration and use of Apple TV in the classroom as a means of providing teachers and students with wireless projection capabilities for their supported iOS and Apple devices over AirPlay. This came to a head for me when I heard the announcement from Apple in late September that the version of iOS for Apple TV (v 5.1) included support for connecting the Apple TV to enterprise networks that use the WPA2 / Wireless Certificate / Radius methods for authentication. In the ACT, the public school system uses such a configuration, and until this recent update Apple TVs could not be connected to the wireless network.

So I investigated the process and found that it is actually a relatively simple one. The requirements are:

  • A 2nd or 3rd Generation Apple TV;
  • A Mac capable of running the latest version of Apple Configurator (available through the Mac App Store)
  • The certificate file for the wireless network to which you are connecting
  • A Micro USB cable (available from all good retailers, or perhaps as an inclusion with a mobile phone you have had over the past 5 years or so)

With those things in hand, the process was fairly simple to set up. The steps are all essentially laid out in the following three Apple Hot Topics from their support website:

  1. Apple TV: How to configure 802.1X Using a Profile – this can be used for setting up a profile for any iOS device, including iPads, iPods and iPhones so that the user doesn’t have to manually enter configuration details.
  2. Apple TV: How to configure a proxy using profile – again, can be done for any iOS device. You can even set these profiles up using iPhone Configuration Utility, but Apple Configurator may be required for Apple TV support (at least at the moment)
  3. Apple TV: How to install a configuration profile – This is the final step once you’ve built your profile, and ultimately is the way you prepare it for deployment.

There are a couple of gotchas that you ultimately need to be aware of when you do this, and a few steps involved specifically for connecting to the ETD network:

  1. The EDU network uses different settings to the STU network – this is in place at the moment but, after the move to SchoolsNet, will no longer be, making things a bit easier. For this to work at our school, I needed to use STU (since students cannot connect their devices to EDU).
  2. The credentials for connecting to the STU network need to be present on the AD server for your student network – teacher credentials won’t work, so you need to have an account on your student server and use that one.
  3. Proxy settings for STU are required – make sure you use the same settings that are in place on your STU desktops and laptops (I won’t publish these settings here – if you’re a teacher in the ACT ETD, you’ll be able to look them up at school). You should be using the Automatic Proxy settings (not auto-detect).
  4. You will need to get a copy of your wireless certificate off the student server. You can export a copy of the certificate from your server so that you can put it on the Mac that will be running Apple Configurator.
  5. When transferring the profile to the Apple TV, you MUST have the power cable plugged in. The HDMI cable isn’t necessary and can stay plugged in, although I found I couldn’t fit both HDMI and micro USB in at the same time because the cables I had were a bit fat.
  6. Finally, your STU wireless network needs to be able to support AirPlay – this requires multicast/Bonjour traffic to be allowed. It is active on our network because we already use it for wireless printing for our Cafe App. (A quick way to check that AirPlay devices are discoverable is sketched after this list.)
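
If you want to check that point quickly, here’s a small Python sketch that browses the local network for AirPlay services using the third-party zeroconf package (pip install zeroconf). If nothing appears while an Apple TV is powered on nearby, multicast is probably being filtered somewhere between you and the device. The service type _airplay._tcp is the one Apple TVs advertise over Bonjour.

import time
from zeroconf import Zeroconf, ServiceBrowser

class AirPlayListener:
    """Minimal listener: just print any AirPlay services that appear."""
    def add_service(self, zc, type_, name):
        print("Found AirPlay device:", name)

    def remove_service(self, zc, type_, name):
        pass

    def update_service(self, zc, type_, name):
        pass

zc = Zeroconf()
browser = ServiceBrowser(zc, "_airplay._tcp.local.", AirPlayListener())
time.sleep(10)   # give devices on the network a few seconds to respond
zc.close()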

Other than that, it is a pretty painless procedure. Once the profile is installed as outlined in the third Hot Topic, all you need to do is double-tap the home button on your iOS device, tap the AirPlay icon and select the Apple TV from the list of AirPlay devices on the network. If you’re using an AirPlay-capable Mac, the AirPlay icon appears in the menu bar at the top of the screen whenever an AirPlay-capable device is present.

There are a couple of settings you’ll want to turn on for your Apple TV:

  1. Consider setting an AirPlay password if you want to restrict use of the Apple TV to a few people. This might be something you want to do, but it does limit the ways you can have students use the device.
  2. If you want to allow anyone to connect via AirPlay, it is a good idea to turn on the setting that requires a 4-digit passcode to be entered to connect. That way, students or teachers need to be in the room to connect their device, and you won’t get students from the other side of the school throwing their display up without you knowing.

For the cost of a big-screen TV that supports HDMI (under $1000) or an HDMI-capable projector (under $1200) plus an Apple TV (about $100), anyone in your classroom with a capable device can display their work to their peers. That gives the teacher the flexibility to demonstrate something from anywhere in the room, and students can do the same. Compared with the cost of an interactive whiteboard (in the order of $4000–$7000), the potential for deploying this on a large scale is significant when money is tight, and it doesn’t carry the restriction of having to plug in via cables at a specific spot in the room.

I’d be interested to know what you think of this setup, and am happy to help you get yours up and running if it is something you’re interested in pursuing.

Way too long between drinks…

I spent a bit of time tonight looking back at the things I’ve done this year and realised that it has been way too long since I’ve given a rundown of my experiences with education or technology here on my blog. I’ve made some minor updates to my website, but no real post to capture what I’ve been doing. So this post lays out some of what I’ve been up to this year, and it should start me on a more consistent and regular posting run from this point on (at least, that’s the intention…). Among other things, this year I’ve managed to:

  1. Re-design the system for reporting at school so that we generate all of our course documents from a single database – the same one we use for reporting and assessment;
  2. Win a CS4HS Grant from Google to deliver some PD to teachers – in the ACT and in Bendigo, Victoria – on integrating Computer Science into the curriculum through mathematics, english, art and other subjects;
  3. Become the founding President of InTERACT (Information Technology in Education and Research ACT) – an ACCE-affiliated professional organisation for educators in the ACT;
  4. Roll out a dual-boot image for MacBooks to all teachers at school, allowing them to use either Mac OS or Windows as required for individual lessons or classes;
  5. Apply for and be appointed to the ACARA Advisory Group for the Australian Curriculum: Technologies for the writing phase, working to advise the writers on the content that will ultimately become the Australian Curriculum;
  6. Begin developing a course for iTunes U that allows students to learn programming on an iPad – still in development, but excited by the possibilities of using this (and iBooks Author) as a means of deploying content to iPads;
  7. Accept a position with the Inspire Centre for ICT Education at the University of Canberra / ACT Education and Training Directorate to develop the capacity of schools and teachers to utilise Apple Technologies effectively in the classroom;
  8. Complete online courses in Cryptography and Gamification through Coursera – a free, online educational platform supported by world class universities;
  9. Enrol in a class on Designing a New Learning Environment through Stanford University’s Venture Lab platform;
  10. Work closely with the organisers of the ACCE Conference on their ACCE Unplugged hangout sessions to get people excited and ready for the ACCE National Conference which took place in Perth in early October; and
  11. Set up Apple TV as a wireless projection solution for iOS and (new) MacBook devices for use in the classroom on HDMI capable projectors and TVs, with the intent to roll this out to many more classrooms in the future (the setup costs under $1000 per room, compared to $7000 for an IWB).

They’re the highlights at least – I’m sure there have been other things, but that alone has taken up large chunks of time this year. Now that I think about it, I really have been busy, so it’s no real surprise to see why the blog has been quiet of late.

Still, I’m making the commitment now and everyone who ends up reading this post will be my witness – I’ll post regularly, and use this as a way of keeping track of what I’ve achieved and where I’m going. I hope you’ll join me on the journey!

Is Android really as free as Google like to make it sound?

I just saw an article on TechCrunch that pointed to a (seemingly well-rehearsed) keynote delivered by Vic Gundotra, a VP at Google, arguing why Android is going to be so important for the mobile world. He sold it well, I have to admit, but it got me thinking a little more about how most of us use the Internet and connected devices, and what sort of implications his ‘ideal future’ may have for us.

I find it interesting that he talks about the device being what would lead to a 1984-type situation. I think he misses something vital – the device is ultimately only a gateway to the world as we know it now. There’s something to keep in mind here: Apple may (with the iPhone ecosystem) dictate what we can and can’t do with our mobile devices, in terms of the apps we can install and the functionality we can tap into as developers, and yes, you could argue this is draconian, particularly given the App Store approval process and other things.

However, when you access the Internet, what do you and millions of others probably do when you’re looking for something? I’d say most people hit Google. And what determines the results that appear when you search the Internet? The Google search algorithm. So, ultimately, who has the power to dictate what information you are most likely to see when you use the Internet? Google. And with that information, and the information you give them through services such as Gmail and everything else Google build and encourage people to use, they can tweak that algorithm to present you with what they want you to see.

Android on every phone may make the device and applications you can use on it “free and open”, but it also gives them even more information about you and how you use the Internet. And, in this world, information is power. Just think – if we all had Android on our phones, and we all used Google to search the Internet, imagine the power the men at the top of Google would have over you. What if they decided that ‘not being evil’ wasn’t any fun anymore?

Thoughts?

ITSC: This is NOT Amazing

I attended the Sydney ITSC Conference (hosted by Apple) recently and Chris Betcher delivered the Keynote address on the topic “This is NOT Amazing”. It struck a real chord with me and I couldn’t agree more with the sentiment – it’s something that has bugged me ever since I began my teaching career.

I mentioned this idea at a recent guest lecture I gave at the University of Canberra, and promised the students that I’d direct them to more information when it came to hand. So, to keep my promise to those students, I’ve posted links to the relevant posts from Chris, as well as an audio grab from his lecture.

I missed his introduction, but the guts of the lecture are still there. He’s also indicated he’ll post a version of it himself after he delivers the final keynote at the last ITSC on May 23, so a better-quality version will be available at his site around then.

This is NOT Amazing

Chris’ blog post – http://chrisbetcher.com/2009/11/this-is-not-amazing/ (he has also made a more recent post on his blog – http://chrisbetcher.com/ – where he reflects on the ITSC conferences and the way they operate).

In Chris’ Keynote, he refers to his daughter and her Virtual Busking project. If you’re interested in checking out more info about that, you’ll find it here – http://chrisbetcher.com/2009/04/425/

cLc in the DET

As I mentioned in a previous post, the ACT DET has recently announced the adoption of the cLc by Uniservity as its new Virtual Learning Environment. Over the last couple of days I’ve had the opportunity to really begin exploring how it operates, and here are my initial thoughts.

1. It has a lot of useful features

It’s probably true of every modern learning environment that many Web 2.0 features have been included – things like wiki and blog services, podcasting, RSS and so on. The cLc has a quite extensive set of services built in, and the editors allow a reasonable amount of flexibility to insert other content that isn’t part of the system. You can embed videos from YouTube and do all the usual stuff, but it doesn’t have every feature I would have liked. One of the obvious omissions for me is an RSS aggregator/feed reader that can be attached to users and classes – given how much easier it is these days to have relevant content fed to you, it’s a big hole that I would like to see filled in future versions.

2. The Interface needs work

I’ve spoken with the vendor and he acknowledged that the interface has an “old school” feel about it – given it has evolved from a product built around eight years ago, that’s no real surprise. The good news is that in September Uniservity are releasing cLc Life, an update to the environment that will have a dramatic impact on how the user interface works. I’m going to reserve my criticisms of this aspect of the system until after Life is released and I’ve had a chance to use it, but until that happens, I feel the complexity of some elements will be a bit of a deterrent to teachers.

3. It will integrate nicely with our student management system

Setting up any online learning environment involves the tedious process of populating it with users and grouping them into classes (or whatever unit you want to use). Thankfully, this will be alleviated when the cLc launches in the production phase – the system will integrate with Maze (our admin system) so that class lists are populated automatically, and the ability to do things like send one-click emails to groups of parents based on the school’s email records will make communicating much easier than it is now (gone will be the days of managing our own mailing lists). There are a few minor challenges we still need to address here, but they relate more to the processes involved in keeping information up to date than to the cLc itself.
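
To give a feel for the kind of plumbing involved, here’s a hypothetical Python sketch that reshapes a class-list export from an admin system into a bulk-import file for a learning environment. Both file layouts are invented for illustration – the real Maze export and cLc import formats aren’t documented here, and the actual integration will be handled between the systems rather than by a script like this.

import csv

# Hypothetical example: read a class-list export and write a bulk-import
# file. Column names on both sides are made up for illustration.
with open("maze_export.csv", newline="") as src, \
     open("clc_import.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=["username", "full_name",
                                             "class_code", "parent_email"])
    writer.writeheader()
    for row in reader:
        writer.writerow({
            "username": row["StudentID"],
            "full_name": f'{row["FirstName"]} {row["Surname"]}',
            "class_code": row["ClassCode"],
            "parent_email": row["ParentEmail"],
        })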

4. It’s going to require a cultural shift

There are a number of ways the cLc could be leveraged to deliver online learning experiences for our students, but it’s going to be important that our school works out a strategy that will work for our community. The ability to share resources across multiple classes should help alleviate workload concerns if staff work smarter, and ultimately allow more time to be spent planning as a collective, which will be much more efficient than everyone planning on their own. But this will require staff to embrace the change, and that’s an issue we’d face regardless of the environment being adopted.

Am I as excited as I’d hoped I’d be when I first heard about it? No. Am I of the opinion it is going to benefit our students? It definitely has that potential, but that rests not with the cLc itself, but with the ability of our teachers to rise to the challenge and rethink the way they approach the use of an online learning environment to support their teaching.

Another example where it’s not about the technology, but it is about the pedagogy.