Merging Data in Google Docs

Some rights reserved by mag3737

One of the power features that I’ve always loved in Microsoft Office is the ability to merge data from Excel into individual Word documents. I used to use this extensively with my Excel Gradebooks when I was a classroom teacher: writing comments for each student in a column, recording the various criteria scores, and then merging all of that data onto a template that included the original assessment criteria and task-specific clarifiers. It took a bit of time to set up, but it was powerful assessment data.

As I’ve made the move to Google Docs, I’ve often lamented the missing ability to merge data. I put it down to ‘one of those things’ that I would have to live without in order to use the collaborative power of GDocs.

Yesterday, in our weekly tech team meeting, we were discussing how we handle the reporting of loss or damage. This is an area that has caused much angst amongst all parties involved: students, parents, teachers, admin, technicians. What could we do to make the system more user-friendly and more efficient?

The idea of students filling out a Google Form came up. While it is a great way to collect information from the students, it’s not a great way to view all that data. Cells and cells of text in a spreadsheet are not my idea of a good time. What we needed was a way of getting all of that information onto a document – an individualized report for each incident.

Back to the idea of merging data…

I started searching for a solution. I asked Twitter. I asked the IT teachers if they thought they could write the JavaScript necessary. Finally, I decided to check the script gallery on Google Spreadsheets. Jackpot.

autoCrat by Andrew Stillman does exactly what we need. Amazingly, it was updated just a few days before I searched for it!

Here’s what I’m hoping to set up in the next week or so:

  1. Students will come into the Tech Office and use a dedicated computer to fill out a Google Form to report loss or damage. That information automatically gets sent to a Spreadsheet.
  2. autoCrat will automatically generate a Doc for each entry as it is created, pulling the information from the Spreadsheet. This document will be emailed to both the student and his/her parent.
  3. The Doc will be created in a folder that is shared by all technicians, using a naming convention based on the name of the person filling out the form and the date it was created.
  4. As the issues are resolved, technicians will write in how and when the problem was resolved and the Doc will be transferred to another shared folder for archiving. The ‘how’ and ‘when’ will also be copied back to the spreadsheet.
  5. Because all of that information is in the spreadsheet, I’m hoping to be able to easily collect stats about the types of problems that have been fixed, average turnaround time, etc.
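Under the hood, a merge like autoCrat’s boils down to substituting spreadsheet values into placeholders on a template. Here’s a minimal sketch in plain JavaScript – the `<<field>>` tag syntax and the field names are just illustrative, not autoCrat’s actual API:

```javascript
// Minimal mail-merge sketch: replace <<field>> placeholders in a template
// with values from one spreadsheet row. Tag syntax and field names are
// hypothetical; autoCrat's real behaviour may differ.
function mergeRow(template, row) {
  return template.replace(/<<(\w+)>>/g, function (match, field) {
    // Leave unrecognized tags alone so missing data is easy to spot.
    return field in row ? String(row[field]) : match;
  });
}

var template = "Incident report for <<student>> on <<date>>: <<description>>";
var row = { student: "Jane Doe", date: "2012-03-01", description: "cracked screen" };
var report = mergeRow(template, row);
// report: "Incident report for Jane Doe on 2012-03-01: cracked screen"
```

In the real workflow, autoCrat does this for every new row the Form adds to the Spreadsheet and writes the result into a Doc.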

While this is more of a systems implementation, I can see autoCrat being used in the classroom. I’m not sure how yet, but I think it can be used to collect and share formative assessment, for students, parents and teachers.

How do you think you might be able to use autoCrat?

Electronic MYP Gradebooks

As I explain over on my ‘other’ blog, I’ve never been very happy with how electronic gradebooks deal with criterion-based assessment, particularly as it relates to the MYP. So over the years I gradually developed my own for my math classroom using Excel and a few macros. Since then I’ve created versions for most if not all of the MYP subject groups. (Downloads below as .xlsm files (Excel 2007/2010 macro enabled))

Each gradebook should use the correct criteria and attainment levels for your subject so make sure you download the right one. If there is a mistake, or if there isn’t one for your subject, let me know and I’ll try and fix it ASAP.

I’ve also created a few short screencasts using Screenr on how I envision the gradebooks being used. (If you click on the full screen icon, it’s much easier to see!) I believe sharing makes things better, so feel free to hack away at these and adjust them for your own use. I see these as being covered by a Creative Commons license (Attribution, Non-Commercial, Share-Alike), so all I ask in return is that you attribute me, you don’t make any money off of them, and you share what you create as well!

Adding Assessments to the Gradebook

Determining Quarter Scores

Collecting Formative and ATL Data

Gradebook Downloads

MYP Arts

MYP English

MYP Language A

MYP Language B

MYP Mathematics

MYP Humanities

MYP Science

MYP Technology

A Criterion Based Gradebook

The Problem

I’ve searched everywhere for a digital gradebook solution that can handle the rigors of criterion-based assessment. The MYP isn’t predicated on percentages (how can you give an 84% for an English essay anyway? How does it differ from an 86%?) but rather descriptors of performance. A mark of 4 out of 8 doesn’t mean the student got half of the things correct; it corresponds to a description of the work. A good description of the nuances of MYP assessment can be found here (.pdf).

Since I couldn’t find a decent ready-made solution I decided to create one. I’ve tailored it to the needs of my school: we are a tablet PC school so I thought it would be nice to use the stylus to input the marks. I’ve also created several iterations for different MYP subjects to fit with their specific criteria and grade boundaries. The Math version is linked below. It’s nothing fancy; just an Excel document with a few macros (nothing malicious, I promise!). It gets the job done, though.

The Walkthrough

Summative Grades – This is for the major summative tasks. Each task may be assessed on more than one criterion, so it is important that you input the date and title for each criterion used.

Formative Grades – This is where homework can be recorded. You can also assess classwork on specific criteria or record results from quizzes. I was thinking of the old +, √, – method here and used a numerical equivalent.

ATL Skills – Approaches to Learning, for the un-MYP among us, are specific study skills that are explained in detail through the program. I found it useful to track these ATL skills to better provide reporting data.

The Macros

At the end of each reporting period the teacher is required to determine at what level each student is performing for each criterion. To aid this, I’ve set up a simple sort macro which groups all of the same criterion grades together in chronological order. You can then return it to its original order by using the date sort. It’s probably a good idea to put in the reporting period headers and date first before sorting by criteria so that you have a place to put your final assessment.
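The two sorts can be sketched in JavaScript, with plain objects standing in for spreadsheet rows (the actual gradebook does this with Excel macros):

```javascript
// Sketch of the two sort macros described above. Row fields are
// illustrative stand-ins for the gradebook's columns.
function sortByCriterion(rows) {
  return rows.slice().sort(function (a, b) {
    return a.criterion === b.criterion
      ? a.date.localeCompare(b.date)          // chronological within a criterion
      : a.criterion.localeCompare(b.criterion); // group same criteria together
  });
}

function sortByDate(rows) {
  return rows.slice().sort(function (a, b) {
    return a.date.localeCompare(b.date);      // restore chronological order
  });
}

var grades = [
  { date: "2012-02-10", criterion: "B", level: 6 },
  { date: "2012-01-15", criterion: "A", level: 4 },
  { date: "2012-03-01", criterion: "A", level: 6 }
];
var grouped = sortByCriterion(grades);
// grouped: A (2012-01-15), A (2012-03-01), B (2012-02-10)
```

Running `sortByDate` on the grouped rows puts everything back in its original chronological order.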

The Disclaimer

Like all work on this site, these gradebooks are shared under a Creative Commons 3.0 Non-Commercial Share-Alike license. If you find ways of improving upon this, I would love to know!

MYP Gradebook Math
MYP Gradebook Language A
MYP Gradebook Language B
MYP Gradebook Humanities
MYP Gradebook Science

Deconstructing Skill Based Assessment

Since Dan has linked to a bunch of readers who have implemented skills-based assessment derived from his model, I thought I’d better post my own experiences with this.

I used this in my grade 8 math class (integrated concepts, heterogeneous grouping) as a way to revise the algebra that they had already been exposed to, with an eye towards needs-based differentiation. I divided the algebra units from grades 7 and 8 into 25 skills, 5 of which I would consider ‘extension’. (Resources at the end.)

How it worked:

  • I handed out a sheet that listed the 25 skills and asked the students to self-evaluate their abilities, based on their performance on homework and tests if possible, using a 3 band rubric: sad face – not sure of what to do or how to do it; blah face – can do it sometimes but with minor mistakes; happy face – anytime, anyplace, bring it.
  • Once a week (usually on a Monday, based on our homework rotation) students would be given a 25 question quiz, with each question aligned to the corresponding skill. The students had 20 minutes to complete as many questions as they wanted. Each question usually had two or more problems: one simpler, one more complex.
  • I would then give the students 40 minutes to review any topic that they wanted using any method available to them: practice worksheets in small groups, one-to-one or small group help with me, using web resources that I had already searched out, or viewing screencast movies created by me or my teaching partner for that particular skill.
  • I would grade the quizzes on a 3 point scale, corresponding to the different faces and return the work within a day or two.
  • The goal was to receive two happy faces (show skill mastery twice) in order to receive a ‘gold cup’ for that skill. Students kept track of their own results on their skills sheets but I only asked them to record progress.
  • I kept track of all results in my gradebook.
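The ‘gold cup’ rule above can be sketched in a few lines of JavaScript. One assumption on my part: any two top-band scores count, not necessarily consecutive ones.

```javascript
// Sketch of the gold-cup rule: a skill is mastered once a student earns
// the top band (3, the "happy face") at least twice. Assumes any two top
// scores count, not necessarily back-to-back attempts.
function hasGoldCup(scores) {
  var happyFaces = scores.filter(function (s) { return s === 3; }).length;
  return happyFaces >= 2;
}

var mastered = hasGoldCup([1, 3, 2, 3]); // true: two happy faces
var notYet = hasGoldCup([3, 2, 2]);      // false: only one so far
```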

After a couple of weeks, some of the students pointed out the obvious: why not review at the beginning of the lesson and then complete the assessment at the end? Much better…

This went on for 10 weeks, excluding the brief hiatus in the middle so that we could complete some in-class projects. All in all, I’m quite happy with the outcome of this little experiment and will definitely include it next year.

The Good:

  • Reinforced discrete algebra skills on a weekly basis, something we couldn’t always do in our integrated setting.
  • Students seemed to enjoy it and liked the simple 3-band rubric.
  • Many students showed significant growth and improvement over the 10 weeks.
  • Teachers in grade 9 will have a way of checking the entry level of students by looking over their skill sheet.

The Bad:

  • Time in class needs to be more structured – at times it felt like a free-for-all where little was being accomplished.
  • Students kept working on skills that they had already shown mastery in. This is a fundamental change in assessment and learning for them and that needs to be stressed. They didn’t really like the idea of skipping questions at the beginning (what does that say about how they’ve been trained???).
  • It got a bit repetitive for the students. More variation in the materials would be nice. These can be added throughout the process in the future.

The Unknown:

  • Will this actually have an effect in the grade 9 classroom? I’ll find out since I’m one of the grade 9 teachers…
  • The idea was to have students who showed mastery in all skills act as student leaders as well as create their own screencast movies for skills of their choice. Only a couple of students actually got that far, however. As tablets become more pervasive in the school, I hope it will be easier to get the students creating content that can help their peers.



algebra-skills-test-1 – Apologies: some of the characters have magically changed, but you get the idea I hope.

A sample math screencast: Graphing in Standard Form

A sample student screencast: FOIL in Korean

Assessing Assessment

There has been a fantastic free-for-all going on over at Beyond School. I won’t get into the specifics – check it out for yourself; the real excitement is in the 75+ comments – but it has focused on, among other things, assessing students in an English Language Arts classroom. In this age, how much weight should be given to “traditional” writing assignments and what is the place for

At the same time, the Faculty Room has been giving assessment a closer look. Dan Meyer expounds on his system, which is well-suited for mathematics (I should know: I’ve adapted his strategy to implement an on-going revision of algebraic concepts in my Grade 8 class). Simon Cheatle gives his perspective from an international school in the Philippines.

The American Paradigm

The vast majority of commentators present a very American slant on assessment. After spending the last 6 years overseas in truly international schools (my first two years were in a school that could have been situated in the middle of Iowa or California or North Carolina), I wonder why this American paradigm persists. Only in the arguments put forward by Grant Wiggins do I see any reference to criterion-based assessment. Being a mathematics teacher, I wonder how English teachers or History teachers go about grading an essay. How do you tell a B+ from an A-? Do you apply some sort of percentage? What do you do with the student who has a clear grasp of the language but a poor working knowledge of spelling? What do you do with the student who knows all of the grammar and structure protocols, but can’t present a reasoned argument? (For those who didn’t check it out, this is the initial focus of Clay Burell’s post.)

Enter Criteria

The answer, in my mind, is criterion-based grading. Why not separate the necessary skills of your course and grade each one appropriately? As an IB Middle Years Programme (MYP) school, we do exactly that. For example, in mathematics we assess four separate criteria: Knowledge and Understanding, Investigation of Patterns, Communication, and Reflection in Mathematics. If a student obviously knows the material but cannot present her information clearly, I can grade her higher in Knowledge and Understanding and lower in Communication. I don’t need to find a middle ground and she can know exactly what her strengths and weaknesses are.

A Step Further

At the end of the term, I look into my gradebook and find the highest sustained level of achievement for each criterion. I do not find the mean. If a student starts the year poorly but shows improvement, I reward that. If a student does poorly on one assessment task, it does not come back to hurt him.
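One way to read “highest sustained level” is the highest level a student reaches at least twice, so a single outlier doesn’t decide the grade. A sketch, under that assumption (it’s my interpretation, not the only defensible one):

```javascript
// Sketch of "highest sustained level": the highest level achieved at least
// twice for a criterion, falling back to the single highest level if no
// level repeats. This reading is an assumption on my part.
function highestSustained(levels) {
  var counts = {};
  levels.forEach(function (l) { counts[l] = (counts[l] || 0) + 1; });
  var repeated = levels.filter(function (l) { return counts[l] >= 2; });
  var pool = repeated.length ? repeated : levels;
  return Math.max.apply(null, pool);
}

var term = highestSustained([3, 5, 6, 6, 7]); // 6: the lone 7 doesn't count
```

Contrast with the mean of that same record (5.4), which would punish the weak start even though the student finished consistently at 6.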

Not Perfect

I will be the first person to admit that this system is not perfect. There is no room for formative assessments to influence the final grade, except as practice for the summative assessments. In my subject, life would be simpler if I just assigned grades based on percentages. The assessment criteria, in my experience, lend themselves to major assessment tasks which are difficult to write, time consuming for students, and bloody hard to mark. Oh, and it’s a difficult system to get your head around, especially coming from The American Paradigm. Ask any other MYP teacher and they will probably have their own list of grievances.

The debate surrounding assessment is one that is necessary. There is no “right” answer as each teacher, school, and district is in a different situation. However, that doesn’t mean we should not strive to find that perfect way of assessing student performance. On the contrary, only by looking critically at our own practices and our motivations behind those practices can we, as professionals, ever hope to evolve.

MYP Criteria