Friday, February 14, 2014

Gamification Challenges

If you're interested in learning more about gamification, I set up some challenges for a session at Convention last week. I'm reposting them here. Unfortunately we won't have a leaderboard or group discussions though.

  1. Come up with a definition for gamification.
  2. Identify the difference between gamified and game-based learning.
  3. Consider how you think student motivation might be affected by gamified learning.
  4. Consider how you think learner self-efficacy might be affected by gamified learning.
  5. Decide whether or not to continue working on these challenges.
  6. Read the Wikipedia article on gamification.
  7. Do a survey of those around you to see what percentage of them play games (video games, board games, sports, etc.).
  8. Reflect and share how gamification might be similar to some things you've tried in the past.
  9. Discuss how gamification might affect student engagement.
  10. Investigate Csikszentmihalyi's concept of "flow" as it relates to games and/or learning.
  11. Find a peer-reviewed academic article on the benefits (or risks) of gamification.
  12. Discuss how gamification might align with Inspiring Education, High School Flexibility, and/or Curriculum Redesign.
  13. Read about Quest to Learn (Q2L), a public school in New York City.
  14. Find and share an example of gamified learning, such as the UofA's EDU210: http://goo.gl/cgoRrt
  15. Share online (social media, blog, etc.) an example of gamification, in education or some other field.
  16. Explore how "serious games" (also called persuasive games or applied games) are similar to and different from gamified environments.
  17. Check out "Games for Change", "Fold.it", and "Play to Cure: Genes in Space".
  18. Brainstorm a quick and easy way you (or someone you know) can quickly and easily gamify something.
  19. Come up with a list of things to consider when designing gamified activities or environments.
  20. Watch a TED talk by Jane McGonigal, Gabe Zichermann, Seth Priebatsch, Ali Carr-Chellman, Tom Chatfield, or Mihaly Csikszentmihalyi.
  21. Read about some of the criticisms of gamification, such as Ian Bogost's "exploitationware", Jane McGonigal's "gameful design", or others who discuss dangers of extrinsic motivation.
  22. Have an extended conversation about why gamification might or might not be a good idea.
  23. Write about your experiences with gamification (either here or elsewhere) in your blog, journal, social media, or on paper.
  24. Come up with three pairs of statements in the form "I used to think _____, now I think _____."

Thursday, February 13, 2014

Simon Breakspear - Innovative Learning Environments

I've heard Simon Breakspear speak a couple of times lately, including at Convention last week, and I've been very impressed. Here's a YouTube clip of him talking about "How do the innovative learning environments get created?"


Monday, January 20, 2014

Projecting from an iPad or iPhone

A while ago I wrote about showing your iPad/iPhone on the big screen using iOS 6, but now that most people have upgraded to iOS 7 perhaps it's time for another post.

If you have an Apple TV (or a computer running Reflector) connected to your projector or TV, you can wirelessly project your iPad, iPhone, or iPod screen using AirPlay Mirroring. AirPlay works from any Apple device (and some non-Apple devices), but the screen mirroring part doesn't work on some older devices.

To start, make sure you are connected to the same network as your AirPlay receiver (Apple TV or computer running Reflector).

Swipe your finger up from the bottom of the screen to access the Control Center. If there's an AirPlay receiver on the network, tap on AirPlay and select which device you'd like to AirPlay to.

Once you've selected an AirPlay receiver, you can turn on Mirroring if it's supported.

If you have AirPlay security enabled on your AirPlay receiver (either Onscreen Code or Password) you will be prompted to enter it.

When AirPlay is active, the top bar of your device will be blue, and you'll see the AirPlay icon at the top right.


When you want to stop AirPlaying, swipe up from the bottom again, tap AirPlay, and choose iPad.


You can also look for the AirPlay icon in specific apps such as YouTube.

Tuesday, January 14, 2014

A Simpler Leaderboard

The other day a teacher was looking for an easy way to display a leaderboard for her students. You may recall a previous blog post describing a somewhat complicated (but cool IMHO) way that we set up a leaderboard for Scratch Day 2013.

Of course a leaderboard isn't all you need for a gamified classroom, but it might be part of what you want.

So again we'll use a Google Spreadsheet and publish part of it so that students will be able to see the "levels" that they, and others, have achieved, but they won't be able to see the points that you've awarded.

Create a new spreadsheet and name it whatever you'd like.

On the first line, label the columns "Name", "Points", and "Level". Then leave a blank line and put in the names (or pseudonyms) of the participants.

To the right of that (starting in cell D1), title the rows "Maximum Value" and "Level". Decide on the names for the levels and the maximum values, but you can always change those later.

In cell C3 (the third cell down in the "Level" column) paste in the following formula:

=if(B3<=$E$1,$E$2,if(B3<=$F$1,$F$2,if(B3<=$G$1,$G$2,if(B3<=$H$1,$H$2,if(B3<=$I$1,$I$2,if(B3<=$J$1,$J$2,$K$2))))))


Then press enter and place your mouse cursor at the bottom right of cell C3 (where you just pasted the formula). Click and drag it down in order to fill that formula in for the rest of the column.

When you're done, it should look like this:

Now when you change the points value, it will automatically change the "Level". If you don't mind students seeing the "points" values then you can just share the spreadsheet with them as viewers and you're done.
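If it helps to see the logic of that nested IF spelled out, here's the same lookup sketched in Python. The thresholds and level names below are placeholders; use whatever you put in rows 1 and 2 of your own sheet.

```python
# The same logic as the nested IF formula: return the first level whose
# maximum value the points total doesn't exceed. Thresholds and names
# here are placeholders for whatever you set up in your spreadsheet.
LEVELS = [(10, "Novice"), (25, "Apprentice"), (50, "Journeyman"),
          (100, "Expert"), (200, "Master"), (400, "Grandmaster")]
DEFAULT_LEVEL = "Legend"  # anything above the last maximum value

def level_for(points):
    for maximum, name in LEVELS:
        if points <= maximum:
            return name
    return DEFAULT_LEVEL
```

Like the formula, this falls through to the final level once the points exceed every listed maximum.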

However if you want to allow the participants to see the "levels" but not the "points", then you need to create another sheet that you can publish. Click the + sign at the bottom left to add another sheet.

Open that new sheet by clicking on "Sheet2".

In cell A1 of the new sheet, paste or type  =Sheet1!A1  and press enter so that cell A1 in this sheet will display the contents of cell A1 in the other sheet.

Again, click and drag from the bottom right corner of cell A1 to fill in the formula for the rest of the column. Do the same for cell B1, but use the formula  =Sheet1!C1  so that it will display the contents of cell C1 from the other sheet. Fill down again, and it should look like this:

Now publish just Sheet2 and share the link with your participants by posting it on your website or LMS.




Sorry, that was a little more complicated than I initially thought, but you can do it. If you want to see the spreadsheet that I used for this post, click this link.

Let me know if it works for you.

Friday, January 10, 2014

Setting Up a MinecraftEdu Server

If you'd like your students to be able to play (or work in) Minecraft together, you can easily set up a MinecraftEdu server. There is more documentation on the MinecraftEdu wiki, but I'll quickly go over the basics.

From the MinecraftEdu launcher, click the "Start Minecraft Servertool" button.

If a teacher password hasn't been set yet, it will ask you to set one. You can always change this later.

You'll then be given some options about what kind of world you'd like to create (or open).

For example, choose "Generate a Completely Flat World" if you'd like students to build things without worrying about cutting down trees or hills.

Once you have started the server, you can see the information and change settings from this window. It also shows you the IP address that students should directly connect to in order to join your world.

Before you quit the server, remember to save the world.

I also highly recommend that you make a backup of your saved world. Copy the appropriate folder to a USB drive or a network location that is backed up. If MinecraftEdu was installed by EIPS Tech Services, saved worlds will be in C:\Program Files (x86)\minecraftedu\servertool\worlds\savedworlds
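If you'd like to script that backup, here's a rough Python sketch. Both paths in the example call are only illustrations; point them at your own savedworlds folder and backup location.

```python
# Copy the saved worlds folder into a timestamped backup folder.
import shutil
import time
from pathlib import Path

def backup_worlds(source, backup_root):
    """Copy `source` into a dated folder under `backup_root` and return its path."""
    stamp = time.strftime("%Y-%m-%d_%H%M%S")
    destination = Path(backup_root) / ("savedworlds_" + stamp)
    shutil.copytree(source, destination)
    return destination

# Example paths only -- adjust to your install and backup drive:
# backup_worlds(r"C:\Program Files (x86)\minecraftedu\servertool\worlds\savedworlds",
#               r"E:\MinecraftBackups")
```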

Of course if you want to try out a simpler process, you (or a student) can start a single player world and allow others to join by typing /publish in the Minecraft chat or by opening the game menu and clicking the "Open to LAN" button. Unfortunately this doesn't allow you to use many of the MinecraftEdu features.

Friday, December 20, 2013

Displaying Calendar Events on Digital Signage Using a Raspberry Pi and Google Apps Script

In our office many of us are often out of the building, so we decided to hang a TV that would display our calendar events. The hardware consists of a 40" LED TV mounted on the wall, with a Raspberry Pi (with a Wi-Fi adapter) connected to the HDMI input.

To display the calendars, a Google Apps Script edits a Google Sites page that the Raspberry Pi refreshes every 30 minutes. Google Apps Script is great at parsing Google Calendars, but unfortunately there is some incompatibility with our Exchange calendars, so I ended up using Yahoo!Pipes to parse the ICS files served by our Exchange server.

I'll write out some instructions for recreating what I've done, but your results may vary. Feel free to ask for help in the comments or on social media.

Start by creating yourself a Google Site that has no navigation elements or other clutter. Probably the easiest way is to create one from this template I've shared.

Next, create a Yahoo!Pipe to parse the ICS files. If you can subscribe to the calendars in Google Calendar then you don't need this, because you can use the Calendar Service in Google Apps Script. I'll leave it to you to figure out how to do that, though. If you want to use Yahoo!Pipes to parse ICS files, check out the pipe I created.

Then create a Google Apps Script to add events to the Google Site you created. If you're not familiar with Google Apps Script, there are many tutorials and code examples. I'll just paste my script code in here and hope you can make sense of it.

//Remember to set up a trigger to run this every 30 minutes, or at whatever frequency you prefer
function runThis() {
  var page = SitesApp.getPageByUrl("https://sites.google.com/whatever-your-site-is-should-be-here");
  var dateToday = new Date();
  //format the date to print at the top of the webpage
  var dayToday = Utilities.formatDate(dateToday, Session.getTimeZone(), "EEEE, MMMM dd");
  //put the date at the top of the web page
  var pageContent = "<div style='font-size:larger'>Today is " + dayToday + "</div><br>";
  //add the calendar stuff by calling the appropriate functions and appending to the variable
  pageContent = pageContent + parseCalendar("Your Name","#00FF00","http://your-link-to-an-online-ics-file.ics");
  pageContent = pageContent + parseCalendar("Another Name","#FF0000","http://a-link-to-another-online-ics-file.ics");
  page.setHtmlContent(pageContent);
}

function parseCalendar(name,color,iCalUrl) {
  //declare an empty pageContent variable that we will fill with calendar entries
  var pageContent = "";
  //format the iCal URL for submission to Yahoo!Pipes
  var replaceColon = iCalUrl.replace(":","%3A");
  var replaceSlash = replaceColon.replace(/\//g,"%2F");
  var translatediCalUrl = replaceSlash.replace(/@/g,"%40");
  //replace spaces with + signs
  var translatedName = name.replace(/ /g,"+");
  //concatenate the strings to make a URL
  var rssUrl = "http://pipes.yahoo.com/pipes/pipe.run?CalendarURL=" + translatediCalUrl + "&Name=" + translatedName + "&_id=9e11d02f251ade5c10a6f5501bfe181f&_render=rss";
  //fetch the RSS feed from that URL
  var rssContent = UrlFetchApp.fetch(rssUrl).getContentText();
  //parse the RSS feed that we just fetched
  var items = XmlService.parse(rssContent).getRootElement().getChild("channel").getChildren("item");
  //loop through the items we just parsed
  for (var i = 0; i < items.length; i++) {
    var item = items[i];
    var title = item.getChild("title").getText();
    // if there is a location, then get it from the "description" field
    var output;
    if (item.getChild("description") != null) {
      var location = item.getChild("description").getText();
      output = title + " at " + location;
    }
    // if there isn't a location, output just the title
    else {
      output = title;
    }
    pageContent = pageContent + "<div style='color:" + color + "'>" + output + "</div>\r";
  }
  return pageContent;
}
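As an aside, the chain of replace() calls at the top of parseCalendar is just percent-encoding done by hand. If you were doing the same thing in Python, urllib.parse.quote handles it in one line (the URL below is a made-up example):

```python
# Percent-encode a URL so it can safely be passed as a query parameter --
# the same job as the replace() calls in parseCalendar.
from urllib.parse import quote

ical_url = "http://example.com/user@domain/calendar.ics"  # made-up example URL
encoded = quote(ical_url, safe="")  # encodes ':' '/' '@' and anything else unsafe
```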
Finally, on the Raspberry Pi:
  1. Set up Debian Linux on the Pi: http://www.raspberrypi.org/downloads
  2. (optional): Force the Raspberry Pi to use HDMI output even if it doesn't detect a display there:
    1. in the terminal emulator type  sudo leafpad /boot/config.txt
    2. add the following lines to that file:
      1. hdmi_force_hotplug=1
      2. hdmi_drive=2
    3. remove (or comment out) any similar lines at the bottom of the file that may have been added by the NOOBS install process
    4. save and close the config.txt file
  3. Have the mouse cursor auto-hide using unclutter: in the terminal emulator type  sudo apt-get install unclutter
  4. Edit the autostart file: in the terminal emulator type  sudo leafpad /etc/xdg/lxsession/LXDE/autostart
  5. Disable screen sleeping and autostart the browser by adding the following lines to the file you just opened for editing (include the @ signs, but not the line numbers):
    1. @xset s off
    2. @xset -dpms
    3. @xset s noblank
    4. @midori -e Fullscreen -i 1800 -a https://sites.google.com/whatever-your-site-is-should-be-here
    5. @unclutter -display :0.0 -idle 5
  6. Reboot the Raspberry Pi, and you're done.

Friday, October 18, 2013

School Announcements: Auto-Generating Announcement Documents (printable and viewable online)

Rather than having to manually create a document every day with the daily announcements, I've created a script that will do it for you. There are, of course, other features that could be added, but this is good enough for today.

To start, announcements are submitted via a Google Form, so they end up in a spreadsheet. There are three pieces of data: the text of the announcement, the category, and the expiry date.


The script creates a new Google Document (in a public folder), then takes data from the spreadsheet and pastes it into that newly created document. The code for the script follows. (Creative Commons Attribution-ShareAlike).


function createAnnouncementDocument() {
  // Set up a trigger to run this every weekday, perhaps at 8:00 am
 // Define a custom paragraph style.
var styleHeading = {};
 styleHeading[DocumentApp.Attribute.HORIZONTAL_ALIGNMENT] = DocumentApp.HorizontalAlignment.CENTER;
 styleHeading[DocumentApp.Attribute.FONT_SIZE] = 18;
 styleHeading[DocumentApp.Attribute.BOLD] = true;

var styleCategory = {};
 styleCategory[DocumentApp.Attribute.HORIZONTAL_ALIGNMENT] = DocumentApp.HorizontalAlignment.LEFT;
 styleCategory[DocumentApp.Attribute.FONT_SIZE] = 12;
 styleCategory[DocumentApp.Attribute.BOLD] = true;

var styleText = {};
 styleText[DocumentApp.Attribute.HORIZONTAL_ALIGNMENT] = DocumentApp.HorizontalAlignment.LEFT;
 styleText[DocumentApp.Attribute.FONT_SIZE] = 12;
 styleText[DocumentApp.Attribute.BOLD] = false;

  var app = UiApp.createApplication();
  // Get the current date
  var dateToday = new Date();
  // date formatting here: http://docs.oracle.com/javase/6/docs/api/java/text/SimpleDateFormat.html
  var formattedDateToday = Utilities.formatDate(new Date(), "MST", "yyyy-MM-dd");
  
  // Create the document
  var documentName = formattedDateToday + " School Announcements";
  var doc = DocumentApp.create(documentName);
  // Move the document to the shared folder entitled "Announcements"
  var documentFile = DocsList.getFileById(doc.getId());
  var folderName = DocsList.getFolder("Announcements");
  documentFile.addToFolder(folderName);
  
  // the spreadsheet that contains the results of the announcement submission form
  var sheet = SpreadsheetApp.openById("***INSERT_SPREADSHEET_KEY_HERE***").getSheets()[0];

  // Start creating the body of the document
  var body = doc.getBody();
  
  body.appendParagraph("Name of School Goes Here\r"+formattedDateToday).setAttributes(styleHeading);

  // Read the announcement data into a list (text, category, expiry date),
  // skipping the header row
  var data = sheet.getRange(2, 2, sheet.getLastRow() - 1, 3).getValues();

  // Append every announcement in the given category whose expiry date
  // (column D) is today or later
  function appendCategory(categoryName) {
    for (var row = 0; row < data.length; row++) {
      var announcementText = data[row][0];
      var announcementCategory = data[row][1];
      var announcementExpiry = data[row][2];
      if (announcementExpiry >= dateToday && announcementCategory == categoryName) {
        body.appendParagraph(announcementText).setAttributes(styleText);
      }
    }
  }

  appendCategory("General");

  body.appendParagraph("Events and Meetings").setAttributes(styleCategory);
  appendCategory("Events and Meetings");

  body.appendParagraph("Athletics").setAttributes(styleCategory);
  appendCategory("Athletics");

  body.appendParagraph("Cafeteria").setAttributes(styleCategory);
  appendCategory("Cafeteria");
  return app;
}
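If you're more comfortable reading Python than Apps Script, the heart of the script above (keep a row only if its expiry date hasn't passed and its category matches) boils down to this. The sample rows are made up for illustration:

```python
# Filter form rows of (text, category, expiry) down to the unexpired
# announcements in one category -- the same test the script above applies.
from datetime import date

def announcements_for(rows, category, today):
    """Return the text of unexpired announcements in the given category."""
    return [text for text, cat, expiry in rows
            if cat == category and expiry >= today]

rows = [
    ("Picture retakes Friday", "General", date(2013, 10, 18)),
    ("Volleyball practice cancelled", "Athletics", date(2013, 10, 14)),
]
```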

Monday, October 14, 2013

Gamification of Health

Feel free to correct me in the comments, but I'm seeing two main trends in the gamification of health. The first is using game mechanics for fitness, motivating us to get off the couch. The second is gamified, or at least game-based, treatments.

Since I'm an educator and not a health professional, I'm not particularly qualified to comment on the latter. However I have been reading many interesting articles about video games for pain reduction or for treating ADHD, and I'm very interested in devices that help us measure ourselves. For example, check out this story about using an inexpensive EEG device together with video games as therapy for ADHD.

Of broader application, though, is the use of game mechanics to help reverse our sedentary patterns. Devices such as the FitBit products, and games such as Nike+ Kinect Training help us to measure ourselves and set goals. As I write this during a weekend of overeating, I realize that these things need to be as "frictionless" as possible. It's much easier to have another slice of pie than to go to the gym, or to turn on the Xbox and spend half an hour working out. There continues to be a lot of thought put into seamlessly incorporating fitness (and motivation) into our daily lives.

Often the best motivation, for fitness or anything else, involves collaboration or competition with other people. Upcoming competitions such as triathlons or games encourage us to train, and collaborations such as the November Project motivate us because our friends are doing it. Of course this means that we need to convince our friends to participate.

In education, however, we have a unique environment with a captive audience. Students participate in events such as the Terry Fox Run, as well as school sports and intramurals. Some organizations try to replicate this with Corporate Challenges, or with online gamification systems such as OfficeVibe, but for some reason those don't seem as successful. Maybe because we don't usually have PhysEd teachers working in our offices.

So there are two things that I'm thinking about related to this. First, of course, is how to continue to use our time with students to encourage lifelong fitness. The second, though, is how to replicate for adults the fitness motivation that we see in schools. I see the principles of gamification as one of the best ways to continue doing that.

But first I think I'll go have another slice of pie.


Thursday, September 19, 2013

Empowered by Technology

The reason that I like technology so much is that it's empowering. The Internet, programming, and even video games allow us to do things that would be otherwise impossible. These technologies also give us a very real sense of power and control over our lives.

The Internet is the greatest repository of information, and misinformation, ever created. You can learn how to do things, share your own expertise, and even look up obscure facts. Scientia potentia est.

Programming is another way that technology empowers us. Basic services such as "If This Then That" (ifttt.com) and more advanced programming languages such as Python allow us to make these machines do our bidding. We can turn our ideas into reality.

Video games are, for many, much more than entertainment. They are empowering in that they allow us control over our own stories and our virtual worlds. They make possible things that are otherwise impossible or impractical, such as driving really fast, flying, or building worlds for others to experience. Minecraft is a great example.

Of course we also need to consider the negative impacts that technology is having on our society and on ourselves. TLDR.

Saturday, June 1, 2013

Google Apps Bulk Delete Users Script

Google Apps has a bulk account creation tool, but nothing for bulk deleting accounts. I had previously written something in Python to bulk delete accounts from a text file, but it was time to create something web-based. This will only work if you have the Domain API enabled in your Google Apps for Education Admin console.

Just a note, though, before deleting users you may want to direct them to the Data Liberation tools for downloading their data.

The actual script is in Google Apps Script. You can see it by visiting Google Apps Delete Users, or create your own copy at script.google.com using the source code below.

Let me know if this works for you, or if you have suggestions for improvements.


function doGet() {
  // get the user's credentials for their Google Apps account
  var user = Session.getEffectiveUser().getUserLoginId();
  var domain = UserManager.getDomain();
  var welcome = "You are running this script as " + user + " on the domain " + domain;
  // set up the user interface
  var app = UiApp.createApplication().setTitle('Delete Google Apps Users by MisterHay');
  app.add(app.createLabel(welcome));
  app.add(app.createHTML("<br>Make sure you have enabled the Provisioning API (support.google.com/a/bin/answer.py?hl=en&answer=60757).<br>This script will delete Google Apps user accounts that you paste or type below. Each account name must be on its own line.<br>e.g.<br>misterhay<br>mpython<br>unladen.swallow<p>"));
  var textArea = app.createTextArea().setName("textArea").setSize("20%", "60%").setStyleAttribute("background", "white").setStyleAttribute("color", "black").setFocus(true);
  var serverHandler = app.createServerHandler("deleteAccounts").addCallbackElement(textArea);
  var clientHandler = app.createClientHandler().forEventSource().setEnabled(false).setText("deleting accounts...");
  app.add(textArea);
  var button = app.createButton("Delete Accounts");
  button.addClickHandler(serverHandler);
  button.addClickHandler(clientHandler);
  app.add(button);
  app.add(app.createLabel("no accounts deleted").setId("finishedLabel").setVisible(false));
  return app;
}

function deleteAccounts(eventInfo) {
  var app = UiApp.createApplication();
  var deleteThese = eventInfo.parameter.textArea;
  var stringArray = deleteThese.split(/\n/);
  for (var loopNumber = 0; loopNumber < stringArray.length; loopNumber++) {
    var deleteThisUser = stringArray[loopNumber];
    // rename the accounts before deleting them to avoid the five-day wait before account names can be reused
    var renamedUser = deleteThisUser + '_old';
    Logger.log(renamedUser);
    // delete the account
    UserManager.getUser(deleteThisUser).setUsername(renamedUser);
    UserManager.getUser(renamedUser).deleteUser();
  }
  // tell the user how many accounts we deleted
  app.getElementById("finishedLabel").setText(loopNumber + " accounts deleted").setVisible(true);
  return app;
}
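The text-parsing and renaming step is worth testing on its own before pointing anything at real accounts. Here's the same idea sketched in Python; it's purely illustrative and doesn't call any Google API. Unlike the Apps Script version, it also skips blank lines and stray whitespace:

```python
# Turn pasted textarea content into a cleaned list of account names, and
# compute the "_old" rename used to dodge the five-day name-reuse wait.
def accounts_from_text(text):
    """One account name per line; skip blank lines and surrounding whitespace."""
    return [line.strip() for line in text.splitlines() if line.strip()]

def renamed(account):
    return account + "_old"
```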

Monday, May 27, 2013

Playing a YouTube Playlist in your DigitalSignage

If you're using DigitalSignage.com you'd probably like to play some videos on the screens. You can, of course, upload them to the Resources section, but if you'd just like to loop a YouTube playlist that's fairly easy.

First you'll need to find the URL for a YouTube playlist. Click the title of the playlist, then click the Share button. The URL will be something like http://www.youtube.com/playlist?list=PL627F181E0CB37E19 and the important part is the characters after the = sign.

I'm assuming that you know how to set up screen divisions and campaigns, so grab an HTML5 component from the toolbox and drop it in place. The URL you will put in that component will look like this http://www.youtube.com/embed?listType=playlist&autoplay=1&loop=1&list=PL627F181E0CB37E19 where autoplay=1 means that it will automatically play through the playlist, and loop=1 means that it will go back to the beginning once it finishes. Of course you will replace the characters after list= with the playlist that you want to use.
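If you're setting this up for several screens, a small helper can build those embed URLs for you. Here's a Python sketch of the URL format described above:

```python
# Build a looping, autoplaying YouTube embed URL from a playlist ID,
# matching the URL format described in the post.
def playlist_embed_url(playlist_id, autoplay=True, loop=True):
    return ("http://www.youtube.com/embed?listType=playlist"
            + "&autoplay=" + ("1" if autoplay else "0")
            + "&loop=" + ("1" if loop else "0")
            + "&list=" + playlist_id)
```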

Saturday, May 18, 2013

Playing Songza on an EeePC 701

This old seven-inch laptop is still useful for a few things; I've recently connected it to my stereo system as a Songza player.

There were two things I needed to do to get this working:

  1. Install Lubuntu
  2. Install Adobe Flash Player from the tar.gz (which involved copying the .so file to /usr/lib/Chromium and copying the folder somewhere else)

I can write up more details if you ask in the comments.

I tried Google Chrome, but it was too resource-intensive so the audio stuttered. I wasn't able to install the Flash plugin in Midori for some reason, so I went back to Chromium and got it working there. The next step will be to connect a USB IR remote receiver with some scripts for selecting different Songza playlists, pausing, and adjusting the volume.

Wednesday, May 15, 2013

Form Data Averages with Google Apps Script

When a Google Form is submitted, it adds a row to your spreadsheet. That new row shifts your formulas, which is a problem if you are trying to do live calculations on submitted data, so you need a script to re-insert the correct formulas after each form submission.

Here's an example of how I did that. The script is set to trigger whenever a form is submitted.

function insertAverage() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheets()[0];
  var formulas = [
   ["=AVERAGE(C3:C100)", "=AVERAGE(D3:D100)", "=AVERAGE(E3:E100)"]
   ];
  var destination = sheet.getRange("C2:E2");
  destination.setFormulas(formulas);
};
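An alternative to re-inserting formulas is to compute the values in the script itself. Here's what that AVERAGE calculation looks like sketched in Python (the sample responses are made up):

```python
# Equivalent of =AVERAGE(C3:C100), =AVERAGE(D3:D100), etc. computed in code:
# average each column of the submitted responses.
def column_averages(rows):
    """rows: list of equal-length lists of numbers, one list per response."""
    columns = zip(*rows)
    return [sum(col) / len(col) for col in columns]

rows = [[3, 4, 5], [5, 4, 3], [4, 4, 4]]  # made-up form responses
```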

Monday, May 6, 2013

Programming with Pure Data and Open Sound Control for the Behringer X32

Pure Data is an easy graphical programming environment. It can speak the OSC (Open Sound Control) protocol, so you can write programs to communicate with Behringer X32 digital mixers using their published X32 OSC Remote Protocol. Don't worry, it's easier than it sounds.


1. Download Pure Data Extended (which includes things that we'll need).
2. Unzip and/or install it.
3. Run the program Pd-extended.
4. If you're using Windows, it will likely ask you if you want to allow it to communicate on the network. Click Allow access and input your Administrator password if necessary.


5. In the window that pops up, choose New under the File menu.

6. You now have a blank canvas to put commands, controls, and such on to.

7. Under the Put menu choose Object (or hold the Ctrl key and press 1).
8. In the newly-created object type  import mrpeach  to tell it that you need to use that package.

9.  Under the Put menu choose Message and type in it  connect x.x.x.x 10023  but replace the x values with the IP address of your X32 mixer. For testing purposes you can have it connect to localhost (the computer that you're sitting at) instead.
10. Create a  disconnect  message and a  udpsend  object as well.
11. Drag from the bottom left corner of each of those messages to the top left corner of the udpsend object in order to connect their outputs to the udpsend input.

12. Now everything is set up to communicate, but we need to actually compose some messages to send. To start let's set up a fader to control a channel level, and a button (toggle) to control the mute for that channel. Under the Put menu choose Vslider to place a vertical slider and Toggle to place your mute button.

13. Put messages for sending mute on/off and fader levels for a particular channel following the X32 OSC specifications. For example,  send /ch/01/mix/on $1  will take the input value ($1) to send the message  /ch/01/mix/on 0  to mute channel one or  /ch/01/mix/on 1  to unmute it.

14. You'll also need to put in a  packOSC  object to create the OSC messages, and connect its output to the  udpsend  object.

15. One last thing you'll need to do is change the Vslider's properties to output values from 0 to 1 as the X32 expects. Right-click the Vslider, choose Properties and change the top value to 1 instead of 127. You can also change the size and color here if you'd like.

16. Your program is now all set up. To run it, go to the Edit menu and choose Edit Mode (to turn off the edit mode) or hold the Ctrl key and press the e key. Once the program is running, click on one of your connect messages to connect to the X32, then drag the slider and click the toggle to see their effects.

Optional: If you'd like to test without connecting to the X32, create another Pure Data program by choosing New under the File menu and make it look something like the following. Remember that the Vslider range should be 0 to 1.

When your new OSC receiving program is running (not in edit mode), you should be able to click connect localhost 10023 in your original program and then see that the slider and toggle there will affect the corresponding ones in your new program.

Hopefully that's enough to get you started writing Pure Data programs for your X32 digital mixer.
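If you'd rather script the same messages outside of Pure Data, an OSC packet is simple enough to build by hand: a null-padded address string, a null-padded type-tag string, then big-endian arguments. Here's a minimal Python sketch of the idea (my own illustration, not part of the original post; the helper name and the placeholder mixer IP are mine):

```python
import socket
import struct

def osc_message(address, *args):
    """Encode a simple OSC message with int32/float32 arguments."""
    def pad(b):
        # null-terminate and pad to a multiple of 4 bytes, as OSC requires
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))
    typetags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            typetags += "f"
            payload += struct.pack(">f", a)
        else:
            typetags += "i"
            payload += struct.pack(">i", a)
    return msg + pad(typetags.encode("ascii")) + payload

# unmute channel 1, same as step 13 above
packet = osc_message("/ch/01/mix/on", 1)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("x.x.x.x", 10023))  # uncomment with your X32's real IP
```

The same helper works for fader levels, e.g. `osc_message("/ch/01/mix/fader", 0.5)`, matching the 0-to-1 slider range from step 15.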

Wednesday, May 1, 2013

Lighting LEDs on the Raspberry Pi with Python

There are many tutorials available for this already, but I just wanted to collect my observations into a series of steps that I can repeat later.

For reference, I used raspberry-gpio-python, Raspberry Leaf, and How to use your Raspberry Pi like an Arduino.

I connected an LED and an inline resistor to GPIO pin 7 and ground, one side of a momentary pushbutton to ground, and the other side of the pushbutton to a 1 kOhm resistor connected to 3.3 V, with GPIO pin 4 reading the junction between the button and the resistor.

Starting from a fresh install of Raspbian, at the command line (or in LXTerminal) input the following commands (edit: the crossed-out commands aren't really necessary):

sudo apt-get update
sudo apt-get install python
sudo apt-get install python-dev
sudo python distribute_setup.py
sudo easy_install pip
sudo apt-get install python-rpi.gpio

sudo python


#!/usr/bin/env python
from time import sleep
import os
import RPi.GPIO as GPIO
GPIO.setmode(GPIO.BCM)

# define the pins we're using
button1 = 4
LED = 7

# set up the pins
GPIO.setup(button1, GPIO.IN)
GPIO.setup(LED, GPIO.OUT)

# turn on the LED
GPIO.output(LED, True)
# wait for half a second
sleep(0.5)
# turn off the LED
GPIO.output(LED, False)
# toggle the LED
GPIO.output(LED, not GPIO.input(LED))

# set up some variable to read from the button
input = False
previousInput = True

# loop
while True:
 # read the button state to the variable input
 input = GPIO.input(button1)
 # make sure the button state isn't what it used to be
 if ((not previousInput) and input):
  print("button pushed")
 # copy the button state to the previousInput variable
 previousInput = input
 # wait to "debounce" the input
 sleep(0.05)

# clean things up (run this after stopping the loop above with Ctrl-C)
GPIO.cleanup()
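The polling loop above is really just rising-edge detection with a debounce delay. Factored out as a pure function (my own restructuring, not from the original post), the logic is easy to test without a Pi:

```python
def rising_edges(samples):
    """Return indices where a sampled input goes from low (False) to high (True)."""
    edges = []
    previous = True  # matches previousInput = True above: ignore an initial high
    for i, current in enumerate(samples):
        if current and not previous:
            edges.append(i)
        previous = current
    return edges

# a held button shows up as one edge, not one edge per sample
print(rising_edges([False, True, True, False, True]))  # [1, 4]
```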

Vectorizing a Scanned Object for CNC Cutting

If you have a physical object or paper drawing that you'd like to carve out using a CNC router or plasma cutter, here's one way to do it. We're going to use the open source drawing program Inkscape for turning scanned images into vectors that the machine can cut.

Start by scanning the object (or drawing) on a flatbed scanner. If you don't have access to a scanner, you can try taking a picture from directly above the object (or the drawing). For this example I've scanned a handheld wireless microphone.
This will be a somewhat complicated image to vectorize; it would be better if it were simply black and white (like a line drawing or most logos). The nice thing about how this one scanned, though, is that the background is totally white. Don't worry if yours isn't; that's something we can deal with later.

Start up Inkscape (you can download the portable version if it's not installed on your computer) and under the File menu choose Import... and find the image that you scanned earlier. It doesn't matter if you link or embed, since you'll be removing the image from Inkscape soon anyway.

Now it's time to turn that image into a vector outline. Make sure that the image is selected, and under the Path menu choose Trace Bitmap... to bring up the following window.

Experiment with changing the Brightness cutoff Threshold. I tried values of 0.4, 0.6, 0.8, and 0.9 to get the following vectors (from left to right, with the original bitmap image on the far left).

Each time you click OK it will create another vector object, which you can delete if you don't like it. In my example above there's a little white spot that shows up even at high threshold values, so that's something that I could go back to the original image and paint over, or I can break apart the vector group and delete the parts that I don't want. (Under the Path menu choose Break Apart, then delete everything you don't want).

After all of that, we end up with a fairly good outline of the object.

You can also experiment with the Simplify command under the Path menu; each time you click it, it will remove some points from your vector.

Depending on the resolution that you scanned at, you will probably have to resize the vector to make it the size of the actual object. Make sure the lock is engaged (to maintain the proper aspect ratio) and the units are in inches (or mm if you prefer). Typing a value in the width (W) will change the height (H) and vice versa.
So we have now vectorized the scanned image, and it's ready to be saved and then imported into your favorite CAM program to generate g-code for your machine. Or you can use the Gcodetools plugin for Inkscape.
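The resize step above is just unit conversion: the scanned size in pixels divided by the scan resolution gives the physical size. A quick sketch of the arithmetic (my own illustration; the 300 dpi figure is an assumed example, so substitute your scanner's setting):

```python
MM_PER_INCH = 25.4

def scanned_size(pixels, dpi, unit="in"):
    """Physical size of a scanned dimension from pixel count and scan resolution."""
    inches = pixels / dpi
    return inches * MM_PER_INCH if unit == "mm" else inches

# an object spanning 600 px on a 300 dpi scan is 2 inches (50.8 mm) wide
print(scanned_size(600, 300))        # 2.0
print(scanned_size(600, 300, "mm"))  # 50.8
```

Typing the result into Inkscape's width (W) field with the lock engaged scales the height to match, as described above.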

Thursday, April 25, 2013

iPad app: Entrepreneurship Essentials course


ADLC, Robots and Pencils, and GoForth Institute have developed an iPad app that teaches Entrepreneurship Essentials. Through the simulation of running a lemonade stand, it uses gamification principles, text slides, videos, and auto-graded quizzes to deliver the content for five one-credit CTS courses (ENT1010, 1020, 2010, 2020, and 2030).

It is fairly well designed, certainly better than most LMS-based online courses. Students will still need to be somewhat self-motivated to make it through, but the badges, notifications, and challenges should help.

From what I hear, schools that register students in this "course" can receive all of the CEU funding if they have a teacher assigned to do the "marking" and tracking, or they can receive 3/5 of the funding if they get ADLC to do all of that.

There has been some media coverage by CBC News and the Edmonton Journal. If you're interested, check out more information and an introduction video.

At any rate, I'm certainly going to recommend it to high schools that have students interested in entrepreneurship and/or interested in trying a different method for getting CTS credits. Information about registration is here.

Thursday, April 11, 2013

things I've learned about Source Filmmaker

I've been playing with Source Filmmaker a bit lately, and I wanted to document some of the things that I've learned.

The best source of information is the official SFM wiki.

There are also numerous video tutorials available on YouTube, including the official ones.

The up and down arrows take the playhead to the beginning and ending of a shot, respectively.

Audio should be in WAV format. And I recommend using a nice USB microphone for recording audio.

More models and props can be downloaded (subscribed) from the Steam Workshop. I was particularly interested in models with facial animations for lip-sync, and there are a number available there.

Attach a prop (e.g. a hat) to a character's head by locking it to the bip_head control in the character's animation set. Drag the character's bip_head control on to the hat's bip_head (or rootTransform) control, then select all of time in the Motion Editor (if it's not all green already), select the Body animation set of your prop and then under Procedural drag the Default control from left to right. You may need to adjust its position a little, especially if you used rootTransform because there was no bip_head.

Attach a prop to a character's hand by locking it to the weapon_bone under Unknown in the character's animation set. Similar to the above process.

In order to render 1080p videos, you need to start the program with the argument -sfm_resolution 1080 (either from your Steam library by right-clicking it and choosing Properties then SET LAUNCH OPTIONS... or by editing the desktop shortcut). This isn't recommended until you're ready to render, since working in 1080p makes things a little slow. Rendering 1080p videos takes a long time too; on my i7 desktop it took almost an hour and a half to render a one-minute clip. If you're really adventurous you can render at 4K with -sfm_resolution 2160, but you'll need a decent computer and a lot of time.

The basic process I go through to create videos (like the one below) is:
  1. launch Source Filmmaker
  2. name your session (or open an existing session)
  3. set the Framerate to 30 (although the default 24 is fine too)
  4. right-click the black part where it says "No Map Loaded!" and choose Load Map...
  5. decide where you'd like to film
    • you can move the camera around by holding down the mouse button where you'd just loaded a map (the viewport) while you use the keys WASD and ZX
  6. click the + sign right underneath Animation Set Editor to Create Animation Set for New Model
    • look in the categories player and survivors for good human models
  7. move the model to the appropriate place by selecting its name in the Animation Set Editor, the move tool  (near the bottom right of the map window) and either the Motion Editor (F3) or the Graph Editor (F4)
  8. add a camera by pressing c on the keyboard
    • or click the down arrow on the right of the active camera button (below the viewport), click Change Scene Camera, and click New Camera.
  9. click the + sign to Create Animation Set(s) for Existing Element(s) and choose camera1 so that you can animate that camera
  10. add any props and position them how you'd like
    • the same way that you added a new model in step 6
    • if you want them to move with a character, make sure you lock them to that character (e.g. their weapon_bone or bip_head)
  11. add your recorded voice clip by selecting the clip editor (F2), right clicking on Dialog (near the bottom), and choosing Add Clip to Track or Record Narration
    • when you are adding a clip to a track, any WAV files that are in the ...\steam\steamapps\common\sourcefilmmaker\game\usermod\sound folder will show up
  12. to have your character move their lips along with the voice track, select the Dialog track, the character's Lower Face (and Tongue) in the Animation Set Editor, right-click in the Animation Set Editor, and choose Extract Phonemes
  13. I usually animate the camera by moving it around rather than using the fieldOfView control
So that is a very brief overview of some of the things I've done in Source Filmmaker so far. Here's an example of one I did recently:

Monday, April 8, 2013

installing Red5 in Ubuntu from the command line

First install the requirements for Red5:

sudo apt-get install java-package openjdk-7-jre openjdk-7-jdk ant subversion

Next download and untar Red5:

tar xvfz red5-1.0.0.tar.gz

Rename the folder:

mv red5-1.0.0 red5

And then run Red5:

cd red5
sh red5.sh

After scrolling through a lot of verbose information about the startup of Red5, you'll know it's running when you see the line that ends with Installer service created.


If you're browsing from the computer that it's installed on, you can see if the Red5 server is running by going to http://localhost:5080 .
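If you'd like to check on Red5 from a script instead of a browser, something like the following works. This is just a generic HTTP polling sketch of my own, not anything from the Red5 distribution; it reports whether anything is answering on the given URL before the timeout:

```python
import time
import urllib.error
import urllib.request

def wait_for_server(url, timeout=30.0, interval=1.0):
    """Poll url until an HTTP response arrives, or return False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                return resp.getcode() < 500
        except urllib.error.HTTPError:
            return True  # the server answered, even if with an error page
        except (urllib.error.URLError, OSError):
            time.sleep(interval)  # not up yet; try again shortly
    return False

# e.g. wait_for_server("http://localhost:5080", timeout=60)
```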