Friday, 18 December 2015

Oversold & Underused?

Or: It's not enough to change the tools, you have to change mindsets

(And if you don't change mindsets, nothing changes)

Oversold and Underused by Larry Cuban (2001)

I was inspired to write this post by nothing more than a sense of bewilderment at how many of the teachers engaged in the ongoing work of seeking real reform, and of realising the potential of digital tools to transform learning, are completely and utterly oblivious of this book, written ages ago (in computer terms anyway) and of its words of wisdom for anyone who wants to learn from the mistakes of the past...

“Those who fail to learn from history are doomed to repeat it” (George Santayana)

Alongside Fullan's Stratosphere, Cuban's seminal text on the challenges of tech integration is one of the most important and insightful books on the subject of tech integration in schools I have ever read. What is more astounding still is that Cuban's predictions, made back in 2001, remain as true today as they ever were.

Of course, if anything at all here resonates with you in the slightest, you really should read the book. But I realise that most teachers have better things to do than read books about tech integration, so with that in mind I have condensed the entire book into my own version of 'Cliff's Notes', some 'Sean's Notes', in a Google doc. Below, for the sake of brevity, I present the sections that I believe are absolutely essential. Everything that follows is Cuban's words; anything I have to interject is included in [brackets]. In addition, most instances of emphasis, and the headings used to structure the content, are mine.


Teachers' Attitudes toward Technology

To fervent advocates of using technology in schools, no revolution had occurred in how the teachers organized or taught in these classrooms. Nor had there been dramatic or substantial changes in how teachers taught, or children learned. If anything, the addition of a computer center to the array of centers already in common use in these classrooms meant that teachers had adapted an innovation to existing ways of teaching and learning that had dominated early childhood education for decades. Studies of computer use in other preschools and kindergartens across the country supported this observation.

Despite the claims of technology promoters that computers can transform teaching and learning, the teachers we studied adapted computers to sustain, rather than transform, their philosophy that the whole child develops best when both work and play are cultivated and “developmentally appropriate” tasks and activities are offered.

In interviews with the 21 teachers, 13 (just over 60 percent) said that their teaching had indeed changed because of their use of information technologies. [...] Of the 13 teachers who said that their teaching had changed, most referred to how they changed their preparation for teaching and how they used computers as another tool to teach. Only four said that they now organized their classes differently, lectured less, relied more on securing information from sources other than the textbook, gave students more independence, and acted more like a coach than a performer on stage. In short, they said that in using technology they had become more student-centered in their teaching; they had made fundamental changes in their pedagogy.

Neither the age, experience, nor gender of teachers was a significant factor in our data. We found little difference in computer use between veteran and novice teachers, between those with and those without previous technological experience, or between men and women. Furthermore, we did not find technophobia to be a roadblock. Teachers at both schools called for more and better technology, were avid home users, and believed in the future ubiquity of computers in society.

Teachers continually change their classroom practices. For example, some teachers quickly adopted computers for their classes, though most did not. Yet the teachers who decided to wait or choose to ignore the new technologies still engaged in changing other aspects of their teaching. Some may have decided to use a new textbook; others may have discovered a new way to do small-group work; and even others may have borrowed a technique from a colleague down the hall to press students to write more than a paragraph. These small changes are incremental and occur frequently among teachers. But these small adjustments are not what the promoters of computers had in mind. They wanted to transform teaching from the familiar teacher-centered approach to one that required the teacher to play a considerably different role. Using technology, the teacher would organize the classroom differently, giving students far more control over their learning (for example, working in teams on projects). Such changes would entail fundamental shifts in the teacher’s and students’ roles, the social organization of the classroom and power relationships between teacher and students.

The point, then, is that teachers change all the time. It is the kind of change that needs to be specified. Champions of technology wanted fundamental change in classroom practice. The teachers that we interviewed and observed, however, engaged mostly in incremental changes.

In a previous study, I investigated teachers’ responses to the introduction of the technological innovations of film (1910s-1940s), radio (1920s-1940s), and instructional television (1950s-1980s). Each of these highly touted electronic marvels went through a cycle of high expectations for reforming schools, rich promotional rhetoric, and new policies that encouraged broad availability of the machines, yet resulted in limited classroom use. [...] But logistics gave teachers a headache. Securing a film from the district’s audio-visual centre at just the right time for a particular lesson, or having the radio or television broadcast available at only one time and not other times, caused problems. Incompatibility between the existing curriculum and the offerings of films, radio, and television further reduced use.

Since the nineteenth century, chalk and blackboard, pens, pencils, and textbooks have proven themselves over and over again to be reliable and useful classroom technologies. Teachers added other innovations such as the overhead projector, the ditto machine (later the copying machine), and film projector (later the VCR) because they too proved reliable and useful. But most teachers continue to see the computer as an add-on rather than as a technology integral to their classroom content and instruction.

pp. 167-170
In the case of information technologies, teachers make choices by asking practical questions that computer programmers, corporate executives, or educational policymakers seldom ask. And the reason is straightforward enough: schools serve many and conflicting purposes in a democratic society. Teachers at all levels have to manage groups in a classroom while creating individual personal relationships; they have to cover academic content while cultivating depth of understanding in each student; they have to socialize students to abide by certain community values, while nurturing creative and independent thought. These complex classroom tasks, unlike anything software developers, policymakers and administrators have to face, require careful expenditure of a teacher’s time and energy. So in trying to reconcile conflicting goals within an age-graded high school or a bottom-heavy, research-driven university, teachers ask themselves down-to-earth questions in order to decide which electronic tools they will take to hand. Here are some of the questions teachers ask:

• Is the machine or software program simple enough for me to learn quickly?

• Is it versatile, that is, can it be used in more than one situation?

• Will the program motivate my students?

• Does the program contain skills that are connected to what I am expected to teach?

• Are the machine and software reliable?

• If the system breaks down, is there someone else who will fix it?

• Will the amount of time I have to invest in learning to use the system yield a comparable return in student learning?

• Will student use of computers weaken my classroom authority?

The maverick computer-using teachers I have identified sought to substantially change their instructional practices. They welcomed computers with open arms, took courses on their own, incessantly asked questions of experts, and acquired the earliest computers available at their school or for home use. They did so because they sensed that these machines fit their pedagogical beliefs about student learning and would add to the psychic rewards of teaching. Most of the innovators used computers to support existing ways of teaching. Others not only embraced the new technology, but also saw the machines as tools for advancing their student-centered agenda in transforming their classrooms into places where students could actively learn.

Thus, even within the constrained contexts in which teachers found themselves, teachers—as gatekeepers to their classrooms—acted on their beliefs in choosing what innovations to endorse, reject, and modify.

The introduction of computers into classrooms in Silicon Valley schools had a number of unexpected consequences. They are:

• Abundant availability of a “hard” infrastructure (wiring, machines, software) and a growing “soft” infrastructure (technical support, professional development) in schools in the late 1990s has not led, as expected, to frequent or extensive teacher use of technologies for tradition-altering classroom instruction.

• Students and teachers use computers and other technologies more at home than at school.

• When a small percentage of computer-using teachers do become serious or occasional users, they—contrary to expectations—largely maintain existing classroom practices.

Slow Revolution

Simply put, more and more teachers will become serious users of computers in their classrooms as the “hard” and “soft” infrastructures mature in schools. This explanation also suggests that uses of technology to preserve existing practices will continue among most teachers but give way slowly to larger numbers, especially as high schools and universities shift to more student-orientated teaching practices.

For the tiny band of teacher-users who have already transformed their classrooms into student-centered, active learning places, the slow-revolution explanation places them in the vanguard of a movement that will eventually convert all classrooms into technology-rich sites. Embedded in the explanation is a supreme confidence that with further work to secure better equipment, more training, and adequate technical support, as the years pass a critical mass of users will accrue, and the gravitational force of this group will draw most of the remaining teachers into technology’s orbit.

Depressing (but accurate) Predictions...

I believe that core teaching and learning practices—shaped by internal and external contexts—would remain very familiar to those who would visit mid-twenty-first-century schools.

Success in making new technologies available obscures, however, the divergent goals spurring the loosely tied coalition. Some promoters sought more productivity through better teaching and learning. Others wanted to transform teaching and learning from traditional textbook lessons to more learner-friendly, student-centered approaches. And some wanted students to become sufficiently computer literate to compete in a workplace that demanded high-level technological skills. Have these varied purposes been achieved in schools?

Beginning with computer or digital literacy, more and more students now take required keyboarding classes and courses in computers that concentrate on learning commonly used software. No consensus, however, exists on exactly what computer literacy is. Among computer advocates, definitions diverge considerably. Is it knowledge of and skill in programming? Is it being able to trouble-shoot computer lapses or software glitches? Is computer literacy knowing how to run popular software applications such as word processing programs and spreadsheets? Or is it simply completing a required course in computers? When we remember the many shifts in the meaning of computer literacy since the 1980's (recall how many experts once urged everyone to learn BASIC programming), any hope of securing agreement on a common definition appears slim. On such an elementary but crucial point, promoters offer little direction to computer-using teachers.

Some researchers have claimed that computer literacy, however defined, pays off in higher wages, further strengthening the educational rationale for using computers in schools. Yet schools can hardly claim full credit for students' growing technological literacy, when many also pick up computer knowledge and skills at home and in part-time jobs. The contribution that school courses and experiences have made to computer literacy and competitiveness in the workplace remains, at best, murky.

Nor has a technological revolution in teaching and learning occurred in the vast majority of American classrooms. Teachers have been infrequent and limited users of the new technologies for classroom instruction. If anything, in the midst of the swift spread of computers and the Internet to all facets of American life, "e-learning" in public schools has turned out to be word processing and Internet searches. As important supplements as these have become to many teachers' repertoires, they are far from the project-based teaching and learning that some techno-promoters have sought. Teachers at all levels of schooling have used the new technology basically to continue what they have always done: communicate with parents and administrators, prepare syllabi and lectures, record grades, assign research papers. These unintended effects must be disappointing to those who advocate more computers in schools.

Securing broad access and equipping students with minimal computer knowledge and skills may be counted as successes. Whether such intended effects lead to high-wage jobs is unclear because the outcomes may be due more to graduates' skills picked up outside of school or to their paper credentials. When it comes to higher teacher and student productivity and a transformation in teaching and learning, however, there is little ambiguity. Both must be tagged as failures. Computers have been oversold and underused, at least for now.

Yet technology will not go away, and educators have to come to terms with it as an educational tool. Understanding technology and the social practices that accompany it as a potent force in society is incumbent on both students and adults. From the telephone to the automobile to the computer, technologies carry with them the baggage of complex social practices and values that need to be explicitly examined.

How early childhood classrooms, high schools, and universities in Silicon Valley and across the nation responded to the last two decades of technological innovations is a case study in both stability and change. No one who attended schools in the 1950's and then visited schools in 2000 could fail to note many important differences in classroom practice. It is untrue that schools or teachers cannot change. Those visitors, however, would also note strong, abiding similarities between classrooms and teaching practices a half-century apart. Those similarities are due to the historical legacies and contexts. Ad hoc incremental changes have occurred often; fundamental changes have occurred seldom.

Although promoters of new technologies often spout the rhetoric of fundamental change, few have pursued deep and comprehensive changes in the existing system of schooling. The introduction of information technologies into schools over the past two decades has achieved neither the transformation of teaching and learning nor the productivity gains that a reform coalition of corporate executives, public officials, parents, academics, and educators have sought. For such fundamental changes in teaching and learning to occur there would have to have been widespread and deep reform in schools' organisational, political, social, and technological contexts.

I predict that the slow revolution in technology access, fuelled by popular support and continuing as long as there is economic prosperity, will eventually yield exactly what promoters have sought: every student, like every worker, will eventually have a personal computer. But no fundamental change in teaching practices will occur. I can imagine a time, for example, when all students use portable computers the way they use notebooks today. The teacher would post math assignments from the text and appropriate links on their website, which students would access from home. Such access, however, will only marginally reshape the deeply anchored structures of the self-contained classroom, parental expectations of what teachers should be doing, time schedules, and teachers' disciplinary training that help account for the dominant teaching practices. The teacher in my example would use the laptops to sustain existing practices, including homework. In short, historical legacies in school structures and parents' and taxpayers' social beliefs about what schools should be doing, I believe, will trump the slow revolution in teaching practices. Those fervent advocates who seek to transform teaching and learning into more efficient, productive work through active, student-centered classrooms will find wholesale access to computers ultimately disappointing.

Cuban L (2001). Oversold and underused: computers in the classroom. Cambridge, Mass.: Harvard University Press.

Sunday, 6 December 2015

Transforming Posters/Infographics

For good reason, posters are a common choice of outcome when teachers ask students to demonstrate their accrued learning. Of course this is nothing new; teachers were asking their students to make posters to show their understanding for many years before computers became commonplace in classrooms.

Given that posters are so popular, this post is not about exploring the reasons why, but about asking a question: if you're going to ask your students to make posters with their computers, why not ask them to use these tools to TRANSFORM their posters, not just replace/replicate them? Why not use them for formative assessment as well as summative assessment? This is something screens excel at compared to traditional paper posters.

Infographic or Poster?

First and foremost, if you're expecting your students to include more than a minimal amount of information, then you probably need them to make an infographic, not a poster.

Infographics are not posters. Posters do not contain much information, as they are designed to be simple and effective; infographics are designed to be complex and effective... A simple bit of Googling will illustrate this nicely:

Google Search for 'great posters'
Google Search for 'great infographics'

As can be seen from the above screenshots, posters and infographics are actually very different.

The purpose of a poster is to utilise (usually) one powerful image and a relatively small amount of text to communicate a message.

The purpose of an infographic is to condense large amounts of information into a form where it will be more easily absorbed by the reader.

Considering this distinction, for most teachers an infographic is likely to be the more appropriate choice if you are expecting to use this medium as a vehicle for assessment, and for your students to demonstrate subject content knowledge and understanding, as they will probably need to include a great deal more information than would be expected on a poster.

Six simple snippets for successful infographics:

  1. They don't need to be printed; in fact they function better when screen based. Why? Because when screen based, a portrait format works best (it supports scrolling), which means that they can... 
  2. Leverage the freedom of the vertical dimension on screen; there is no need to cram content into the space limits of a traditional paper/printed page.* 
  3. Customise the page setup: portrait, approx. width 25cm, length 50cm (length roughly twice the width). 
  4. Base the design on a template, ideally within the same context, eg Google 'infographic earthquake' to find an example that will serve as a 'mentor graphic' or graphical model. 
  5. Use a mentor graphic: imitate, then innovate. Copy the layout, font choices, structure, icon use and placement, then integrate your own content. 
  6. Choose your tool carefully. Pages is great for more control and polish, but does not lend itself to online collaboration, sharing and feedback. Online tools like Piktochart are great, even with interactive maps/charts, but you have to work within the limitations of the free version and choose your template carefully. Google Drawings makes collaboration, sharing and feedback easy, but is less polished and more limiting graphically; it can be done though, here's one I made earlier.

    *You can still use the BFP (Big Format Printer) for printing extended infographics if you insist...

Use 'Mentor Graphics'

Simple but effective potential 'mentor infographics' for earthquakes, just Google 'earthquake infographic'

Grade 5 Expo Example

As the students are working in groups it makes much more sense for them to work in Google Drawings, where it is much easier to collaborate, comment and organise. With a whole team working on one infographic it's even more important to use a mentor graphic that the whole team has agreed on, so that choices about font and layout are easier and consistency is much easier to achieve.

Mentor Graphic
Google Drawing

Final PDF

Grade 6 Example

Here's an example from a Grade 6 Humanities unit, where the student chose a mentor graphic and then structured her own design around it; the similarities are obvious, but so are the differences. Inevitably, as students progress through the project, their 'copy' evolves into a graphic that is more and more an imitation and less a duplication of the original mentor graphic (click to enlarge):


RAT Your Infographics

RAT (Replace, Amplify, Transform) your graphic. Don't just replace paper posters; amplify, or better still transform, them on screens using SAMMS.

Situate them online so you can easily facilitate students working on any screen, in any space, place or time that is convenient for them. This can also be a great environment for students to work collaboratively or cooperatively. (Not the same thing*)

Encourage students to use their access to the Internet effectively, constructively, and responsibly:  smart searches to identify powerful images and information for them to assemble, remix, represent... and of course cite.

Leverage the power of the mutability of the digital medium. Whatever you do, do not leave the construction of the infographic to the end of the unit; have the students start the infographic as soon as possible, and then allow them to adapt, develop and evolve it formatively over the course of the unit into the final product.

Make it multimodal. This means using images as illustrations, not just as decoration. If the infographic is presented online this also adds the possible affordance of animation (animated GIFs are ideal; students can easily make their own with LICEcap) and even selective use of short (looping?) video clips. A powerful way to leverage this is for students to create a short screencast narration of the rationale behind the content of their infographic.

Finally, socially network it, so infographics can easily be shared with the teacher and with peers for effective formative assessment, especially peer assessment ('students as learning resources for one another', Wiliam, 2011). This allows students to share an early draft of the infographic in a shared online space where all of their peers (including their teacher) can review and provide feedback on their ongoing work: constructive criticism, clarification, and celebration.

*Cooperation = parallel practice, Collaboration = integrative practice (Ingram and Hathorn, 2004)

Cooperation means many working on one thing, where individual contribution is unclear; collaborative practice means each member has 'ownership' of a specific element, and these elements are combined to form the final outcome.

Monday, 9 November 2015

Copyright Licenses

The College takes copyright seriously and adheres to Singapore copyright law at all times. Singapore law does allow some exemptions for non-profit schools, so it is important to read and understand the College copyright policy. This most recent version can be found on the Staff Portal under "College Policies (Staff)" in the Staff folder.

To further assist staff in using copyrighted material, the College has obtained an Umbrella License with the Motion Picture Licensing Company (Singapore). This allows staff to use and show copyrighted films and programmes from a large number of studios/companies from around the world for a variety of purposes, including educational and recreational purposes, in both the College and the boarding houses.

Please note:

The license does not provide the right to charge for watching these films or to profit from their use (even for fundraising purposes). The copy of the film or programme used must still be obtained from a legal source, eg a purchased DVD, a legal streaming service (eg iTunes or Netflix, but not Pirate Bay) or a locally recorded transmission.

The list does include many of the main studios including:
  • Channel 4
  • BBC Worldwide
  • Discovery
  • National Geographic
  • Fox
  • Sony
  • MGM

The license does not cover every studio (eg not Disney), nor does it cover every film or programme ever made; you do need to check the copyright owner against the list of included companies. For a complete list of studios/companies covered by the license please check here.

More details, including how to check which studio made a particular film or show, can be found here.

You can still use films that are not covered under the Umbrella license for educational purposes, as long as there is a clear educational link to the curriculum topics being covered at that time.

So, for example, if you want to show a film as part of the school PSE programme that will highlight specific educational aims and outcomes, then that could be shown; but any grade or department would need as many legitimately purchased copies as are needed to view it simultaneously. So it could be one copy if only one class is viewing it at a time, or a copy for each class if all classes needed to view it at the same time.


The College subscribes to a video streaming service called Kanopy, which you can access directly when on campus, no password needed. There are a few 'blockbuster' films in there, and even a few educational ones!

This is for teachers, and possibly Diploma students, to access, not younger students, as the films are a mix, with lots of 18-certificate films, ranging from The Godfather and No Country for Old Men to Lemony Snicket and Rango... and everything in between, so you should presume that students can access the whole library if you give them the link.

See this post from our colleagues at the East Campus for more details, and advice on ways to access Kanopy off campus.

Saturday, 7 November 2015

Problems with Plagiarism

DO NOT COPY!? The images (below) are typical of the negative assumptions about the impact of digital technologies on research, and about how many educators assume their students will use them. These negative assumptions litter the internet with their tones of moral crisis (just Google image search 'plagiarism'). The problem is that this advice is just impractical: a generation raised to assume the mutability of the screen as 'normal' is not one that is likely to take this advice to heart. Instead they will rely on copy and paste, but assume that this means they are now cheating. I don't believe this has to be the case; I believe we can use copy and paste as a powerful part of the process of researching. I believe we should adopt a more constructive approach to copying: a constructive plagiarism.

Obviously I'm not advocating plagiarism, but we need to keep our eyes on the actual purpose of the research, the learning, not on outdated (copy+paste=plagiarism) methodologies and principles which are counter-productive and counterintuitive. I say '21st Century' because if ever there was a mode of operation that needed dragging into this century, it is the models and modes of academic writing.

20th Century responses to 21st Century Possibilities
When we ask students to research a topic, the underlying assumption behind traditional guidance around plagiarism is that our students are going to construct a completely original piece of writing, or create a piece of work which does not contain any content that could be found in any other format that has ever been written or published since the dawn of time. Right from the outset this should strike any reasonable-minded person as an absolutely ludicrous proposition. As the oft-quoted Newton said himself, "If I have seen further than others, it is by standing upon the shoulders of giants", the point being, as is also reiterated in the good book, in Ecclesiastes, that "there is nothing new under the sun". Of course, the assumption in this context is that the primary mode of communication is text (because it generally is. Why? That’s another conversation), but you need look no further than the outstanding video series on Vimeo, "Everything is a Remix", to see a powerful rationale for the kind of practices that I call 'constructive plagiarism'.

An Alternative Approach... 

We need to embrace the ways digital technologies can and should transform how our students research, particularly by exploring the mutability of digital technologies. The negative assumption behind the castigation of students who copy+paste is that critics assume that this is where the process ends, rather than where it begins.

The fact is that even our most supposedly original ideas and thoughts are a remixing of ideas, experiences, and content that we have encountered throughout our lifetimes and that we have either consciously or unconsciously internalised, remixed, reinterpreted and finally re-presented as our own original work. As David Woo says, there is “no true knowledge outside people: people construct individual meaning and try to agree on meaning in talking with each other”. So to set students the task of producing a wholly original piece of work is really an exercise in futility: at worst you are asking them to do the impossible; at best you're asking them to take other people's ideas, internalise them and re-present them as if they were their own, which is really nothing short of a form of plagiarism, albeit one where they make some effort to at least vaguely cite the original content.

Of course this is something we are likely to penalise them for doing, as the clearer they make the sources of their ideas, the more likely it is that their teacher will be able to discover the extent of their plagiarism. Now there is a great deal more to this last point than I can really get into here; suffice it to say that we need to build an intrinsic sense of trust. The alternative is a nightmarish ‘damned if you do, damned if you don't’ scenario where everyone loses: students who feel incriminated by exploiting the affordances of digital tools, such as their use of copy/paste, then abandon any attempt at appropriate practice and start faking their citations, or skipping them completely, for fear that leaving a trail of evidence just increases the likelihood that they will be ‘caught’.

Another common issue is that students end up unsupported by an ‘all or nothing’ stance, ie you MUST write it all in isolation, starting from an empty page; if you copy and paste anything, that’s it, you’re effectively categorised as a cheat. So we end up with patchwork plagiarism of the worst kind, not because the students have corrupt intentions or motivations, but because no one is teaching them how to use copy/paste constructively.

Guess what goes in the middle... ? Image: Sean Savage.

How do we define plagiarism?
the practice of taking someone else's work or ideas and passing them off as one's own.
synonyms: copying, infringement of copyright, piracy, theft, stealing.

So, let me be clear: I am not advocating plagiarism, I am advocating a more progressive, practical stance towards it. There is a practical compromise somewhere between the extremes of ripping off the work of others and riffing off the work of others.

riff ‎
A variation on something.

Any variation or improvisation, as on an idea.

To improvise in the performance or practice of an art, especially by expanding on or making novel use of traditional themes. (my emphasis)

(constructive) pla·gia·rism
the practice of taking someone else's work or ideas and remix/reword/rewriting them until they become one's own.

The method stems from the assumption that when completing a piece of work, there is a limit to the extent to which you can directly quote the works of others (who of course have themselves remixed and re-presented other work that they have encountered in their lifetimes). It is generally advisable to avoid being overly reliant upon direct quotes within the body of your work. When I first began academic writing many years ago, I made the (naive) assumption that I could literally use as many direct quotes as I needed, assuming that those would not be counted in the word count, only my reflections upon those direct quotes. Oh, how wrong I was! My logic, both then and now, is that rather than attempt to pretend I can make a completely authentic and original contribution to this body of work, I'd be better off thoroughly apprising myself of current thinking around the idea and then reflecting and writing about my responses to the key ideas represented in the literature.

And so it was that I was introduced to the "game" of direct quotation versus paraphrasing. This enotes discussion below highlights the heart of the confusion surrounding this issue:
"In general, you will need to use direct quotes and paraphrases in a master's thesis. I would say that you will generally need to paraphrase more than you quote. What I would do is to paraphrase most of what I was trying to say. Then I would use direct quotes to emphasize the most important points that I was getting from a particular source. You want to be careful about quoting too much, lest it look like you do not understand what is being said and cannot put it in your own words."

"Paraphrasing shows how you synthesize, understand, and intend to use the material. It also shows your adviser and review committee that you can take an idea from someone or some source and reword it so that you are presenting this same idea in another way."
And that, in a nutshell, is the heart of constructive plagiarism. To paraphrase the above quote about paraphrasing (is this a meta-paraphrase?), you take an idea from someone or some source and reword it so that you are presenting this same idea in another way.

Contrast that, then, with this contradictory advice from the same discussion:
"it is probably better to overcite references than over paraphrase them. Under many circumstances, the review process of literature as well as the peer edit/ advisor editing process of writing a master's thesis also addresses the appropriate moments to paraphrase or to cite directly."

Or this, which effectively says do both...
"knowing that quotations are most important, knowing that structural and word requirements impose some limitations, knowing that other evidence that you have collected to prove your idea is also important, the answer to your question is that while paraphrasing is vital to demonstrating your understanding of a broad range of knowledge and to reviewing critical opinion and opposing arguments, you must have high quality and carefully selected quotations to substantiate the things you assert and claim. This may sound like a double-sided answer, but that is precisely what you need, a double-sided approach: You must include that which is most important (analysis and quotes) within the framework of that which is required."

So we find ourselves in a somewhat confusing situation whereby we teach our students that the body of their research cannot be just their own opinion, and has to show that every idea they wrestle with already has a precedent in some other body of work; and yet at the same time they're not allowed to be overly reliant upon quoting these works. Instead they have to play the game where, once they have used up the amount of text they can dedicate to direct quotes, they work around it by paraphrasing direct quotes to integrate them into their assignment. Or to put it another way, to make those authors' words sound like theirs. That is how we end up in the position I call 'constructive plagiarism': the skill of taking a wide range of direct quotations from a wide range of sources and authors, and then remixing and reworking those into a new, seamlessly integrated piece of prose that clearly summarises and represents the ideas articulated in the original texts, but has been processed through the student's own worldview and experiences to represent their own remix of the many sources by which they were influenced.
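
The word-count 'game' described above can even be checked mechanically. As a rough illustration (this is my own sketch, not any official plagiarism-checking tool; the regex and the example text are assumptions), here is a minimal Python snippet that estimates what fraction of a piece of writing sits inside direct quotes:

```python
import re

def quote_ratio(text):
    """Estimate the fraction of words that sit inside "double-quoted" spans.

    A crude heuristic: real quotation detection (curly quotes, nesting,
    block quotes, citations) is far messier than a single regex.
    """
    quoted = re.findall(r'"([^"]*)"', text)          # text between quote marks
    quoted_words = sum(len(q.split()) for q in quoted)
    total_words = len(text.split())
    return quoted_words / total_words if total_words else 0.0

essay = 'Cuban argues that "computers were oversold" and that schools "underused" them.'
print(round(quote_ratio(essay), 2))  # 4 quoted words out of 11 -> 0.36
```

A ratio beyond whatever limit the assignment allows would be the signal to switch into paraphrasing mode; the point is simply that the 'limit' students are asked to respect is a measurable quantity, not a mystery.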

This is where we get into the complexity of citation: direct quotes should ideally be referenced in a bibliography, whereas paraphrased content that was originally a direct quote becomes a reference instead. In my experience very few academics require you to distinguish between these two, in which case we move closer to the ambiguity that is represented by constructive plagiarism...

I think the folks over at scribbr sum up this quandary well with the following:


Quoting is when you literally copy a part of a text. It is wise to limit the use of quotes, as they do not improve the readability of your thesis.

Plus, if you use many quotes, you will seem lazy. Next to that, when you use a quote it can give the impression that you did not understand the source or that you did not read the entire text. It is therefore wise to use a quote only when necessary.

For example, you can use one when you want to provide a definition of a certain concept. You can also consider using one when the author has written a sentence so beautifully or powerfully that a paraphrase would diminish the quality of the text.


Always try to keep a quote as short as possible, preferably no longer than a few sentences. You can also shorten a quote; for example, you might replace a redundant or irrelevant part of a quote with ellipses (…).

However, make sure not to take a quote out of its context by, for instance, citing only one sentence that supports your research in a study that otherwise contradicts your research.


When you paraphrase something, you describe a (part of a) study in your own words. Doing so, you can fit an existing theory into your own research very easily.

However, even though the paraphrase is in your own words, the idea is still someone else’s. Therefore, you always have to cite your source when you paraphrase.

It is also important to always introduce the paraphrase. You can do this as follows: “Janssen (2008) states in his research that …”

The standout sentence to me in that quotation is "However, even though the paraphrase is in your own words, the idea is still someone else’s. Therefore, you always have to cite your source when you paraphrase." (ibid) This gets to the very heart of this game, whereby the author is attempting to convey that these ideas of theirs are not solely their own, but are an amalgamation of (hopefully many) other authors' ideas, gathered over time and across a range of contexts, that have been repurposed to better fit their particular situation. Allow me to quote another colleague here:

“Appealing to authority makes the message digestible with less heartburn if done well. Here I picture a street fight. A direct quote is pushing Einstein out in front of you to intimidate the foes. Paraphrasing is walking out to meet the foes head-on with Einstein, Hawking, and all the other giants that went into forming the idea trailing behind making your knuckle cracking all the more convincing.” Kurt Wittig

Back over to the good folks at scribbr, who have this further helpful advice to give:

Paraphrase or summarise?

The term “paraphrase” is generally used when someone describes someone else’s [words] research in their own words. However, this is not entirely correct. A paraphrase is a description of a certain quote from someone else, put in your own words. A paraphrase is therefore approximately of the same length as the source text’s quote.

When you completely or partially describe the outcome of a more substantial part of the research, it is called a summary.

There is a distinct difference between paraphrasing and summarizing. However, in general (as is also the case in many universities), both are called paraphrasing.

General tips

  • Only quote or paraphrase the authors of papers that are authoritative in their field of research. 
  • It is important that the quote or paraphrase has added value for your research. The quote or paraphrase should also fit in with the rest of the text. The text preceding or following the quote or paraphrase should clarify what you want to imply.
  • A quote or paraphrase is not complete without a proper in-text citation and entry in your reference list, formatted correctly in the appropriate referencing style.

As a case in point, if this was an academic assignment that I was submitting for formal credit, the above quotations from Scribbr would be too extensive to use as direct quotations, so instead I would have to play the game of choosing one or two sentences and either omitting or paraphrasing the rest, and that’s just one point! Fortunately for me, I don't have to play that game here, and can just let them be; I'm excused from playing the game, and freed to focus on my learning, and on wrestling with words that capture my thoughts. And so it is that the process of constructive plagiarism commences. If students have read widely (and they should) they will inevitably find themselves in a situation where they have far more content that they wish to use than they can possibly quote directly, so instead they begin the process of paraphrasing large amounts of content containing essential ideas and themes, as the only other method for doing justice to the ideas they have collected would be to quote them directly, which they cannot do for the reasons outlined above.

Constructive copying, then, is the process of remixing and reworking swathes of direct quotes from a wide range of source material, being careful to cite the original authors clearly, but relying on paraphrasing and summarising enough to keep direct quotes to a minimum. When students exceed the amount of text they can realistically use as direct quotation, they switch to paraphrasing/rewriting mode. In this mode they still credit the author, but they effectively rewrite the direct quotation into a different tense, or apply it to a familiar context, so that the voice sounds more like theirs than the original author's. Unfortunately, this is more difficult than it sounds: they have to be fairly adept at writing to be able to take the voices and insights of multiple authors and rewrite them so that they sound consistently like their own. In order to do this they need to understand the content thoroughly, they need to be very well read, and they need a thorough understanding of the context within which they are writing, which takes me to the point of this entire process: the learning.

What’s the point?

Whenever I hear teachers wringing their hands, expressing consternation over the likelihood of their students constructing written assignments through processes that seem to border on plagiarism, I rarely hear any reference to the actual point of the activity, which is surely to motivate students to research widely, to internalise what they learn, and to represent it simply and clearly in a form that allows the teacher to accurately appraise the extent to which the student has understood the exercise. If the goal really is to motivate students to learn, then we should be less concerned about the minutiae of rules around plagiarism, and be more concerned about motivating students to read widely, research thoroughly, and to present their findings and sources clearly. I believe that a student who does this, is a student who has achieved the goal that the teacher had in mind, namely to learn about the area in question, through a thorough engagement with the material, and by presenting this understanding in a format that is concise and clear.

The Process of Constructive Copying

Cut, copy, paste, swap, then repeat. [peterellisjones]

For the 'how', rather than the 'why' see my other post here.

Wednesday, 23 September 2015

A Response to Recent News Articles on the OECD Report "Students, Computers and Learning: Making The Connection"

Many people have seen news articles about a new report from the OECD in recent days. A typical article and headline is this one from the BBC: “Computers 'do not improve' pupil results, says OECD”. This headline is followed by the opening line: “Investing heavily in school computers and classroom technology does not improve pupils' performance, says a global study from the OECD.”

Firstly, I would always advise caution when looking at news headlines. The purpose of a headline is by its nature to grab attention, not necessarily to give a balanced viewpoint. Contrast the BBC’s headline and opening line with the OECD’s, from their own press release for the report:

New approach needed to deliver on technology’s potential in schools
Schools have yet to take advantage of the potential of technology in the classroom to tackle the digital divide and give every student the skills they need in today’s connected world, according to the first OECD PISA assessment of digital skills.

The full report - titled “Students, Computers and Learning: Making The Connection” can be found here.

Digging into the BBC article and the report itself shows very clearly that the intent of the OECD report is not to advocate that schools stop using computers with students, but rather that educators need to get better at using computers to yield improvements.

“School systems need to find more effective ways to integrate technology into teaching and learning to provide educators with learning environments that support 21st century pedagogies and provide children with the 21st century skills they need to succeed in tomorrow’s world,” said Andreas Schleicher, OECD Director for Education and Skills. “Technology is the only way to dramatically expand access to knowledge. To deliver on the promises technology holds, countries need to invest more effectively and ensure that teachers are at the forefront of designing and implementing this change.” Quote from OECD press release

“....Mr Schleicher says the findings of the report should not be used as an "excuse" not to use technology, but as a spur to finding a more effective approach.” Quote from BBC article

This is a sentiment that the College very much subscribes to and one that we believe we are robustly implementing. The College, and more importantly our teachers, are very reflective in their use of technology to support teaching and learning and are constantly looking for uses that show real value add over traditional approaches.

Something that the article does not do is to question the method that the report uses to measure educational success. The measure of educational success used is the OECDs PISA tests. There are reasons why we might question this as a baseline.

The first one is the assumption that the purpose of using computers in schools is to improve academic results in traditional science, English and maths tests. Certainly at UWCSEA, we have never stated that the purpose of using technology to enrich teaching and learning is to improve test results. Rather, the stated aims of the original iLearn initiative were “to improve learning and develop skills through:

  • Flexible Progression
  • Critical Thinking
  • Unhindered Innovation
  • Collaborative Learning”

For a complete overview of the original initiative please see here. Please note that these outcomes are now embedded into the UWCSEA Profile.

The PISA tests simply do not measure these skills and qualities, so they are not a valid measure for many of the desired outcomes that technology use can bring to teaching and learning. Nor do they test digital skills, which are in themselves a desirable outcome and for many a requirement for future preparedness in our students.

Further to this, there are many noted academics who question the value of the PISA tests even for their own stated aims. One of the most well known, and a frequent guest speaker for the Singapore Ministry of Education, is the American Professor Yong Zhao, Director of the Institute for Global and Online Education in the College of Education, University of Oregon.

"PISA, the OECD’s triennial international assessment of 15 year olds in math, reading, and science, has become one of the most destructive forces in education today. It creates illusory models of excellence, romanticizes misery, glorifies educational authoritarianism, and most serious, directs the world’s attention to the past instead of pointing to the future."

For more information you can read Zhao’s blogposts here.

So in summary, the College has found that much of the press coverage of the research has been misleading. We are fully supportive of the recommendations in the original report, although we feel they have limited relevance to our situation. For a full discussion of the report it is also necessary to question the use of the PISA tests as a measure of educational success, particularly as regards the success of “21st century pedagogies.”

Sunday, 20 September 2015

Typing Club - our touch typing tutor

Typing Club is an excellent touch-typing tutor. Since the introduction of student laptops, the ability to type accurately and quickly has become an important skill that sits alongside handwriting and other communication skills. The site is free, so anyone can open the website and give it a try.

We suggest that our students work through the hundred lessons at their own pace and focus on the feedback that the typing tutor offers. Before each lesson a small animation is shown which highlights finger positioning and effective technique. Throughout the lesson the website records speed, accuracy and key strokes, and provides a graphical summary at the end which offers useful feedback. Over time we hope that our students will feel comfortable typing and that it becomes a life-long skill.

As students complete lessons they receive points which reflect their typing accuracy and speed. Each tutor group is in a discrete online space, so if students want to compare their progress to other students they can check out the class scoreboard. Other than some broad observations from the points collected, there will not be a direct assessment of student progress by the individual tutors.

Handwriting and drawing remain essential skills for our students. Therefore assessments are frequently in written format, and students are encouraged to take hand-written notes in many subjects.

Note from SMc (another DLC)

If you have not been set up to use the college account (grades 4-7), fret not, you can use the public site (not the college link) here: (Public link - paid with advertising) (Private UWC link, needs student GMail log in)

Another great free option, especially for Primary kids is on the BBC site: 

or Ratatype

And these sites are great motivators for kids to increase their WPM through 'gamification':

How long does it take to learn to touch type?

"Practising 'little and often' (15-30 minutes a day) works much better than an hour or more once a week. If you practise regularly and don't give up, you should be able to learn to touch type fluently in 2-3 months, maybe even less. A total of 10-15 hours of practice should get you touch typing slowly."
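
Taking that quoted guideline at face value, the arithmetic hangs together. A quick sketch (the daily figure is an assumed midpoint of the 15-30 minute range, not a prescription):

```python
# Sanity-checking the practice arithmetic in the guideline above.
minutes_per_day = 20      # assumed midpoint of the "15-30 minutes a day" range
target_hours = 12.5       # midpoint of the "10-15 hours" estimate

days_needed = target_hours * 60 / minutes_per_day
print(days_needed)        # 37.5 days of daily practice, i.e. five to six weeks
```

So a committed child practising daily would hit the 'typing slowly' milestone in five to six weeks, comfortably inside the quoted 2-3 months for fluency, which also explains why a school holiday is a realistic window for it.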

How soon can kids start learning touch typing?

Developmental appropriateness is key here, obviously they need to be able to write sentences, and they need to have hands/fingers long enough to reach from the home keys to the other keys, eg from the 'f' to the r, t, g, c, v—and still be able to tap the spacebar with a thumb... Bear in mind they will almost certainly be learning this on a keyboard designed for adult hands, and no they can't learn how to touch type on a touch screen like an iPad.

The general consensus seems to be from Grade 3/age 8:
" it’s generally believed that they may not have the motor coordination or finger span to truly touch type until about seven or 8 years of age." 
"kids gain the finger span and motor coordination to touch type around 7 and 8 years old."
"They can start this as early as the first grade, but their hand span and the length of their fingers can cover the entire keyboard area comfortably only by the time they are 7 or 8 years old. By this age, they can start building their typing test wpm speed."

When are you going to teach touch typing to all students at school?

It's time to accept that typing is now effectively writing, so for a school that purports to be teaching its students elementary skills like writing, it is fair to ask why anyone would emphasise the teaching of handwriting over the skill of typing. Surely the aim with both is fluidity; essentially we want our students to be able to write 'at the speed of thought', or as close to that as possible, regardless of the medium.

WPM (Words per minute)

The fact is that the speed of (legible) handwriting (with or without cursive) is much slower than touch-typing. Over to Wikipedia for the breakdown of relative speeds:

The average human being hand-writes at 31 words per minute for memorised text and 22 words per minute while copying (Brown CM, 1988).

Whereas an average professional typist usually types at speeds of 50 to 80 wpm, some advanced typists work at speeds above 120 wpm. "Hunt and peck" typists commonly reach sustained speeds of about 37 wpm for memorised text and 27 wpm when copying text.

Go on, try it yourself. I used typeracer and scored 35 wpm on my first try, then timed myself writing the same text (so a slight advantage) as fast as I could by hand, focusing on speed over beauty but still maintaining legibility, and scored ... 17 wpm. Pathetic, I know.

So to summarise: that's handwriting at 22 wpm, hunt & peck at 27 wpm (about the same), and between 50-120 wpm for touch-typists. So if we don't teach our students how to touch-type, they are in theory at least no worse off than they would have been if we asked them to write it all by hand. But the gains in speed with touch-typing, hovering in the range of double to quintuple the speed, are clearly something we'd be crazy to ignore. Even in my own rudimentary experiment I was twice as fast with my tedious 'hunt and peck' technique as I was with handwriting... And of course digital text is capable of so much more than handwritten—it's situated, accessible, mutable...
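
For what it's worth, the ratios implied by those copying-text figures can be worked out in a couple of lines:

```python
# Relative writing speeds, using the wpm-while-copying figures quoted above.
handwriting = 22
hunt_and_peck = 27
touch_typing = (50, 120)   # typical professional range, per the Wikipedia figures

print(round(hunt_and_peck / handwriting, 2))    # 1.23 - barely faster than handwriting
print(round(touch_typing[0] / handwriting, 2))  # 2.27 - low end of touch-typing
print(round(touch_typing[1] / handwriting, 2))  # 5.45 - advanced touch-typist
```

In other words, hunt-and-peck buys you roughly a 20% gain over handwriting, while touch-typing is somewhere between double and five times handwriting speed, which is the real argument for teaching it.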

Ok, I'm convinced, so how? When?

Touch typing is really the kind of thing best done at home, due to the need for practice to be extended, diligent, and regular. We just don't have the time in school to dedicate to this, and to be honest, it's not a great use of a teacher's time, as they wouldn't actually be teaching, they'd just be invigilating silent drilling. This really is one of those things best done at home with a dedicated parent; I often suggest it as a good holiday target, with a reward for the kids who learn it.

Really the only way I can see touch typing becoming a school focus is if it replaced the teaching of handwriting/cursive, which is exactly what they're doing in Finland. Something tells me that would not go down well with most parents though!

Brown CM (1988). Human-computer interface design guidelines. Norwood, NJ: Ablex Publishing.

Friday, 11 September 2015

Is Cloud Confusion Driving you Crazy?

Ridiculous name, revolutionary technology.

With the exponential increase in 'cloud' capacity, it is becoming increasingly critical to rely on this powerful technology to ensure that all of our essential data is safe, and accessible, from, well, any screen with an internet connection. With the multiplicity of devices in our lives, this functionality is pretty much essential.

This is more of a blessing than a curse, BUT.

There's always a but.

Now I know Benjen Stark said "You know, my brother once told me that nothing someone says before the word 'but' really counts…" (GoT)


The fact is that despite its magnificence, the 'cloud' can cause a huge amount of confusion, so let's just break this down a bit.

What's the cloud?

Essentially, the 'cloud' is a rather dubious name for any of your data which is not stored just on your actual device; instead it's stored on a remote server (very much on the ground) that pulls and pushes your content to your devices over the internet.

How many clouds?

Well, as it turns out, there are quite a few forms of cloud technology, but the ones of most interest to us at UWCSEA are Apple's 'iCloud' and, well, pretty much everything even vaguely Google related, from 'Drive' to Google Photos et cetera.

Cloud confusion...

The confusion stems from the fact that we all need to separate our home and work life, not to mention that the school's user agreement clearly lays out expectations that there should be clear boundaries between personal and professional use of UWCSEA devices. Anyway, no one really wants to see those pictures of me in a bikini on a beach in Magaluf, allegedly. So whether you realise this or not, you have cloud accounts associated with every Google account, and they are completely separate, as it should be, the same is true for any Apple ID you use (more below on that).

In our hyperconnected world, digital objects have inherited the property of stickiness. Photos end up everywhere and it takes not only the knowledge of how all of the synchronisation works to understand where, but also a determined approach to 'e-Cleaning' to make sure that they are not in places you didn’t expect.

If you're confused, don't feel bad, this stuff is CONFUSING for everyone, why? Two reasons: 

The first is that people don’t know, or understand, what happens to their digital property when they tick the “backup everything to iCloud/Google” check boxes. 

The second is that Apple, and Google (and other cloud providers, eg DropBox, SkyDrive et al), in their eagerness to make the process as simple as possible, do a really bad job of explaining what is going on, and importantly, what can happen if things go wrong. It just 'works' apart from when it 'just works' in a way that you don't want it to work... 

Go wrong? What do you mean 'go wrong'!?

Well, if you mix up your accounts with your devices, you can accidentally end up having the 'cloud' hoover up all of the photos and videos that you're taking on your phone/tablet/laptop/desktop and add them to your online collection. If you share that collection with other people (collections are private by default), then that audience can see everything; you might be surprised at the kind of content your device has helpfully uploaded in the background for you...

Don't Panic!

This problem really only relates to your 'rich' media, specifically photos and videos; you can happily access work/home email without any conflict, although I'd still use separate apps to minimise confusion, see below.


You can only have ONE cloud account associated with a mobile device*; you can sign in and out, but this just gets more confusing, so as a general rule, pick one and stick with that. So, are the photos/videos on your phone more work or home related? If home, use the home cloud account, but don't use that device for taking videos/photos for work use, unless it is temporary, ie email/transfer them to a device dedicated to work use, and delete them afterwards. Is your iPad more work related? Then sign into a work cloud account—which is most likely the UWC Gapps account, but then don't capture media for home use on that device, unless it is temporary, ie email/transfer them to a device dedicated to home use, and delete them afterwards.

Isolate with Apps

By dedicating specific apps to work/home you can mitigate the confusion, eg (assuming an iOS device here) Use the Mail app for your home account, use the Gmail App for work, use the Safari app for home browsing, use the Chrome app for school browsing...  This separation breaks down with rich media though, as the apps generally link to your device camera roll, which is shared across the entire device, regardless of the account an app is associated with.

Kids/Hubbie's/Wife's content mixed up with yours? 

Welcome to Apple ID vs App Store

Many of you want to use the same Apple ID on multiple family devices, so that if you purchase something from the App store you can install it on any device in the family without paying twice, that's fine, but don't confuse this with sharing the same iCloud account. 

WHAT? They're different?
Well, yes... and.. no.
You can use the same Apple ID for iCloud AND for purchasing things from Apple, especially the App Store. But these don't have to be the same, and if you want family members to be able to download stuff you've purchased you will need to separate their Apple ID (iCloud) from your Apple ID (App Store).
See? I told you it was/is confusing.
So in my scenario, all of the devices in my house can use MY Apple ID in the App Store to download things I've purchased for myself, or for them. BUT

(and it's another big but)
Everyone in your family should use their OWN Apple ID on their own device; these can (and should) be separate. That way everyone in your family keeps their 'stuff' separate from yours, but you can all download content from the same App Store account.

You can use a shared account for the App Store, and individual accounts for iCloud.

Family Sharing

After years of using the above method to stay sane, Apple finally conceded that there is an issue and rolled out 'Family Sharing' last year. This should hopefully simplify things, unless, like me, you use more than one App Store account (I have one in Singapore and one in the UK), as Family Sharing is restricted to one App Store, ie if you set it up using the Singapore store, all accounts have to be in the Singapore store, even if you leave and move overseas... Which means you need to have an active credit card account for that store.

Which is why I don't use it. However, it could be JUST what you need, in which case click here to follow Apple's guidelines for setting this up for your family. Also note: if you use Apple's Family Sharing method to set up Apple IDs for your kids (under 13), they will have to use an iCloud email account, not their school account. This is fine, but you may need to explain it to them...

Smarter iCloud Settings

If you're one of those teachers fortunate enough to have had an iPad provided for you by the college, it will have been provided for lots of reasons, including:
  • trickledown learning—becoming familiar with the device by just using it—you use it for personal reasons, like making a home video, but this skills you up, so you can then use those apps with your students with greater confidence
  • enabling you to easily explore and learn how to use apps you want to use with your students, and new apps
  • to allow you to more easily capture evidence of learning (or the opposite) for your own planning and prepping purposes
  • to more easily facilitate recording and assessment, without the many limitations posed by paper based systems, using apps like Numbers, Notability, iDoceo et al.
The problem is that this means you will have a load of content (if not all of it) in your camera roll which is student/college related. If you use the same device on holiday, well, all of your snaps will get mixed up with your school content, which is not ideal. And if you connect this device to your iCloud account, all of your personal media on other connected devices will also stream into the same camera roll, 'polluting' the stream.

You could easily solve this by just not connecting your iPad to your iCloud account. The problem with that is that being connected to iCloud is really useful: syncing all your Apple content from other apps like Notes, Pages, and Safari bookmarks can be a big help, not to mention all the content you might have purchased on iTunes. So how can you have your school iPad connected to your iCloud account and still avoid this? Easy: just go into your iCloud settings and switch off Photos. Done. Everything in your camera roll is now only content you captured with this device, and you can still transfer it to other devices using email, AirDrop, or apps like Send Anywhere. If you install the Google Photos app, it will sync all your camera roll content to your college Google Apps account, so it's all easily accessible from your laptop. Win-win! In fact, using the suite of Google Apps (Google Drive, Gmail...) is another easy way to keep your work and home life separate on the same device.

Settings - iCloud

You can turn any/all of these off

Final Advice...

I hate to break it to you, but we have really reached a point where trying to manage the entire scope of your digital life on one laptop is increasingly untenable. My advice? Get a dedicated device for home, and keep that content completely separate from the device you use for work. Simple. If you are a family with loads of video, photos, and media being captured and shared by everyone (our home has 2 adults, 2 kids, one helper, and 5 laptops, 5 iPhones, 5 tablets, and one desktop), then I'd advise you to purchase a dedicated desktop computer (I like the iMac, surprise, surprise) with the BIGGEST hard drive available. That is where all family media is stored; all other devices are dedicated to that individual's content only, so any 'family' content is only temporarily on their device, and is transferred to the BIG MAC ASAP. That's it.

*On a laptop/desktop, you can have more than one iCloud account, but they would need to be associated with different user accounts on that machine; you probably need to see IT Support, or a DLC, to help with this...