Friday, December 20, 2013

Displaying Calendar Events on Digital Signage Using a Raspberry Pi and Google Apps Script

In our office many of us are often out of the building, so we decided to hang a TV that would display our calendar events. The hardware consists of a 40" LED TV mounted on the wall, with a Raspberry Pi (with a Wi-Fi adapter) connected to an HDMI input.

To display the calendars, a Google Apps Script edits a Google Sites page, which the Raspberry Pi refreshes every 30 minutes. Google Apps Script is great at parsing Google Calendars, but unfortunately there is some incompatibility with our Exchange calendars, so I ended up using Yahoo!Pipes to parse the ICS files served by our Exchange server.

I'll write out some instructions for recreating what I've done, but your results may vary. Feel free to ask for help in the comments or on social media.

Start by creating yourself a Google Site that has no navigation elements or other clutter. Probably the easiest way is to create one from this template I've shared.

Next, a Yahoo!Pipe to parse the ICS files. If you can subscribe to the calendars in Google Calendar then you don't need this because you can use the Calendar Service in Google Apps Script. I'll leave it to you to figure out how to do that, though. If you want to use Yahoo!Pipes to parse ICS files, check out the pipe I created.

Then a Google Apps Script to add events to the Google Site you created. If you're not familiar with Google Sites, there are many tutorials and code examples. I'll just paste my script code in here and hope you can make sense of it.

// Remember to set up a trigger to run this every 30 minutes, or at whatever frequency you prefer
function runThis() {
  var page = SitesApp.getPageByUrl("https://sites.google.com/whatever-your-site-is-should-be-here");
  var dateToday = new Date();
  // format the date to print at the top of the web page
  var dayToday = Utilities.formatDate(dateToday, Session.getTimeZone(), "EEEE, MMMM dd");
  // put the date at the top of the web page
  var pageContent = "<div style='font-size:larger'>Today is " + dayToday + "</div><br>";
  // add the calendar entries by calling parseCalendar for each person and appending the results
  pageContent += parseCalendar("Your Name", "#00FF00", "http://your-link-to-an-online-ics-file.ics");
  pageContent += parseCalendar("Another Name", "#FF0000", "http://a-link-to-another-online-ics-file.ics");
  page.setHtmlContent(pageContent);
}

function parseCalendar(name, color, iCalUrl) {
  // declare an empty pageContent variable that we will fill with calendar entries
  var pageContent = "";
  // percent-encode the iCal URL for submission to Yahoo!Pipes
  var translatediCalUrl = iCalUrl.replace(/:/g, "%3A").replace(/\//g, "%2F").replace(/@/g, "%40");
  // replace spaces in the name with + signs
  var translatedName = name.replace(/ /g, "+");
  // concatenate the strings to make a URL
  var rssUrl = "http://pipes.yahoo.com/pipes/pipe.run?CalendarURL=" + translatediCalUrl + "&Name=" + translatedName + "&_id=9e11d02f251ade5c10a6f5501bfe181f&_render=rss";
  // fetch the RSS feed from that URL
  var rssContent = UrlFetchApp.fetch(rssUrl).getContentText();
  // parse the RSS feed that we just fetched
  var items = XmlService.parse(rssContent).getRootElement().getChild("channel").getChildren("item");
  // loop through the items we just parsed
  for (var i = 0; i < items.length; i++) {
    var item = items[i];
    var title = item.getChild("title").getText();
    var output;
    // if there is a location, the pipe puts it in the "description" field
    if (item.getChild("description") != null) {
      var location = item.getChild("description").getText();
      output = title + " at " + location;
    } else {
      // if there isn't a location, output just the title
      output = title;
    }
    pageContent += "<div style='color:" + color + "'>" + output + "</div>\r";
  }
  return pageContent;
}
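Incidentally, the chained replace() calls in parseCalendar do the same job as JavaScript's built-in encodeURIComponent, which percent-encodes colons, slashes, @ signs, and anything else that needs escaping in a query-string value. A quick sketch (the example.com URL is made up):

```javascript
// encodeURIComponent covers all of the characters the manual replaces handle,
// plus anything else that needs escaping in a query-string value.
var iCalUrl = "http://example.com/calendars/someone@example.com/calendar.ics"; // hypothetical URL
var encoded = encodeURIComponent(iCalUrl);
console.log(encoded);
// → http%3A%2F%2Fexample.com%2Fcalendars%2Fsomeone%40example.com%2Fcalendar.ics
```

Note that encodeURIComponent turns spaces into %20 rather than the + signs used for the name parameter; %20 is the safer encoding in a URL anyway.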
Finally, on the Raspberry Pi:
  1. Set up Debian Linux on the Pi: http://www.raspberrypi.org/downloads
  2. (optional): Force the Raspberry Pi to use HDMI output even if it doesn't detect a display there:
    1. in the terminal emulator type  sudo leafpad /boot/config.txt
    2. add the following lines to that file:
      1. hdmi_force_hotplug=1
      2. hdmi_drive=2
    3. remove (or comment out) any similar lines at the bottom of the file that may have been added by the NOOBS install process
    4. save and close the config.txt file
  3. Have the mouse cursor auto-hide using unclutter: in the terminal emulator type  sudo apt-get install unclutter
  4. Edit the autostart file: in the terminal emulator type  sudo leafpad /etc/xdg/lxsession/LXDE/autostart
  5. Disable screen sleeping and autostart the browser by adding the following lines to the file you just opened for editing (include the @ signs, but not the line numbers):
    1. @xset s off
    2. @xset -dpms
    3. @xset s noblank
    4. @midori -e Fullscreen -i 1800 -a https://sites.google.com/whatever-your-site-is-should-be-here
    5. @unclutter -display :0.0 -idle 5
  6. Reboot the Raspberry Pi, and you're done.

Friday, October 18, 2013

School Announcements: Auto-Generating Announcement Documents (printable and viewable online)

Rather than having to manually create a document every day with the daily announcements, I've created a script that will do it for you. There are, of course, other features that could be added, but this is good enough for today.

To start, announcements are submitted via a Google Form, so they end up in a spreadsheet. There are three pieces of data: the text of the announcement, the category, and the expiry date.


The script creates a new Google Document (in a public folder), then takes data from the spreadsheet and pastes it into that newly created document. The code for the script follows. (Creative Commons Attribution-ShareAlike).


function createAnnouncementDocument() {
  // Set up a trigger to run this every weekday, perhaps at 8:00 am

  // Define custom paragraph styles.
  var styleHeading = {};
  styleHeading[DocumentApp.Attribute.HORIZONTAL_ALIGNMENT] = DocumentApp.HorizontalAlignment.CENTER;
  styleHeading[DocumentApp.Attribute.FONT_SIZE] = 18;
  styleHeading[DocumentApp.Attribute.BOLD] = true;

  var styleCategory = {};
  styleCategory[DocumentApp.Attribute.HORIZONTAL_ALIGNMENT] = DocumentApp.HorizontalAlignment.LEFT;
  styleCategory[DocumentApp.Attribute.FONT_SIZE] = 12;
  styleCategory[DocumentApp.Attribute.BOLD] = true;

  var styleText = {};
  styleText[DocumentApp.Attribute.HORIZONTAL_ALIGNMENT] = DocumentApp.HorizontalAlignment.LEFT;
  styleText[DocumentApp.Attribute.FONT_SIZE] = 12;
  styleText[DocumentApp.Attribute.BOLD] = false;

  // Get the current date
  var dateToday = new Date();
  // date formatting reference: http://docs.oracle.com/javase/6/docs/api/java/text/SimpleDateFormat.html
  var formattedDateToday = Utilities.formatDate(dateToday, "MST", "yyyy-MM-dd");

  // Create the document
  var documentName = formattedDateToday + " School Announcements";
  var doc = DocumentApp.create(documentName);
  // Move the document to the shared folder entitled "Announcements"
  var documentFile = DocsList.getFileById(doc.getId());
  var folder = DocsList.getFolder("Announcements");
  documentFile.addToFolder(folder);

  // the spreadsheet that contains the results of the announcement submission form
  var sheet = SpreadsheetApp.openById("***INSERT_SPREADSHEET_KEY_HERE***").getSheets()[0];

  // Start creating the body of the document
  var body = doc.getBody();
  body.appendParagraph("Name of School Goes Here\r" + formattedDateToday).setAttributes(styleHeading);

  // Read the whole spreadsheet into a list (columns B-D: announcement text, category, expiry date)
  var data = sheet.getRange(2, 2, sheet.getLastRow() - 1, 3).getValues();

  // Append every unexpired announcement in a given category to the document.
  function appendCategory(category) {
    for (var row = 0; row < data.length; row++) {
      var announcementText = data[row][0];
      var announcementCategory = data[row][1];
      var announcementExpiry = data[row][2];
      // if the expiry date (column D) is today or later, include it; otherwise ignore it
      if (announcementExpiry >= dateToday && announcementCategory == category) {
        body.appendParagraph(announcementText).setAttributes(styleText);
      }
    }
  }

  // "General" announcements go right under the heading, without a category header
  appendCategory("General");
  body.appendParagraph("Events and Meetings").setAttributes(styleCategory);
  appendCategory("Events and Meetings");
  body.appendParagraph("Athletics").setAttributes(styleCategory);
  appendCategory("Athletics");
  body.appendParagraph("Cafeteria").setAttributes(styleCategory);
  appendCategory("Cafeteria");
}
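One detail worth noting from the script above: the `announcementExpiry >= dateToday` comparison works because getValues() returns date-formatted cells as Date objects, and JavaScript's relational operators coerce Dates to their millisecond timestamps. A quick illustration in plain JavaScript (the dates are arbitrary examples):

```javascript
// Relational operators coerce Date objects to milliseconds since the epoch,
// so >= compares the two instants directly.
var today = new Date(2013, 9, 18);  // October 18, 2013 (months are 0-based)
var expiry = new Date(2013, 9, 20);

console.log(expiry >= today);                 // true: the announcement still runs
console.log(new Date(2013, 9, 17) >= today);  // false: it has expired
```

If the expiry column came back as a plain string instead of a date-formatted cell, the comparison would silently misbehave, so make sure the form stores real dates.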

Monday, October 14, 2013

Gamification of Health

Feel free to correct me in the comments, but I'm seeing two main trends in the gamification of health. The first is using game mechanics for fitness, motivating us to get off the couch. The second is gamified, or at least game-based, treatments.

Since I'm an educator and not a health professional, I'm not particularly qualified to comment on the latter. However I have been reading many interesting articles about video games for pain reduction or for treating ADHD, and I'm very interested in devices that help us measure ourselves. For example, check out this story about using an inexpensive EEG device together with video games as therapy for ADHD.

Of broader application, though, is the use of game mechanics to help reverse our sedentary patterns. Devices such as the FitBit products, and games such as Nike+ Kinect Training help us to measure ourselves and set goals. As I write this during a weekend of overeating, I realize that these things need to be as "frictionless" as possible. It's much easier to have another slice of pie than to go to the gym, or to turn on the Xbox and spend half an hour working out. There continues to be a lot of thought put into seamlessly incorporating fitness (and motivation) into our daily lives.

Often the best motivation, for fitness or anything else, involves collaboration or competition with other people. Upcoming competitions such as triathlons or games encourage us to train, and collaborations such as the November Project motivate us because our friends are doing it. Of course this means that we need to convince our friends to participate.

In education, however, we have a unique environment with a captive audience. Students participate in events such as the Terry Fox Run, as well as school sports and intramurals. Some organizations try to replicate this with Corporate Challenges, or with online gamification systems such as OfficeVibe, but for some reason those don't seem as successful. Maybe because we don't usually have PhysEd teachers working in our offices.

So there are two things I'm thinking about related to this. First, of course, is how to continue to use our time with students to encourage lifelong fitness. The second, though, is how to replicate for adults the fitness motivation that we see in schools. I see the principles of gamification as one of the best ways to continue doing that.

But first I think I'll go have another slice of pie.

Thursday, September 19, 2013

Empowered by Technology

The reason that I like technology so much is that it's empowering. The Internet, programming, and even video games allow us to do things that would be otherwise impossible. These technologies also give us a very real sense of power and control over our lives.

The Internet is the greatest repository of information, and misinformation, ever created. You can learn how to do things, share your own expertise, and even look up obscure facts. Scientia potentia est.

Programming is another way that technology empowers us. Basic services such as "If This Then That" (ifttt.com) and more advanced programming languages such as Python allow us to make these machines do our bidding. We can turn our ideas into reality.

Video games are, for many, much more than entertainment. They are empowering in that they allow us control over our own stories and our virtual worlds. They make possible things that are otherwise impossible or impractical, such as driving really fast, flying, or building worlds for others to experience. Minecraft is a great example.

Of course we also need to consider the negative impacts that technology is having on our society and on ourselves. TLDR.

Saturday, June 1, 2013

Google Apps Bulk Delete Users Script

Google Apps has a bulk account creation tool, but nothing for bulk deleting accounts. I had previously written something in Python to bulk delete accounts from a text file, but it was time to create something web-based. This will only work if the Domain API is enabled in your Google Apps for Education Admin console.

Just a note, though: before deleting users you may want to direct them to the Data Liberation tools for downloading their data.

The actual script is in Google Apps Script. You can see it by visiting Google Apps Delete Users, or create your own copy at script.google.com using the source code below.

Let me know if this works for you, or if you have suggestions for improvements.


function doGet() {
  // get the user's credentials for their Google Apps account
  var user = Session.getEffectiveUser().getUserLoginId();
  var domain = UserManager.getDomain();
  var welcome = "You are running this script as " + user + " on the domain " + domain;
  // set up the user interface
  var app = UiApp.createApplication().setTitle('Delete Google Apps Users by MisterHay');
  app.add(app.createLabel(welcome));
  app.add(app.createHTML("<br>Make sure you have enabled the Provisioning API (support.google.com/a/bin/answer.py?hl=en&answer=60757).<br>This script will delete Google Apps user accounts that you paste or type below. Each account name must be on its own line.<br>e.g.<br>misterhay<br>mpython<br>unladen.swallow<p>"));
  var textArea = app.createTextArea().setName("textArea").setSize("20%", "60%").setStyleAttribute("background", "white").setStyleAttribute("color", "black").setFocus(true);
  var serverHandler = app.createServerHandler("deleteAccounts").addCallbackElement(textArea);
  var clientHandler = app.createClientHandler().forEventSource().setEnabled(false).setText("deleting accounts...");
  app.add(textArea);
  var button = app.createButton("Delete Accounts");
  button.addClickHandler(serverHandler);
  button.addClickHandler(clientHandler);
  app.add(button);
  app.add(app.createLabel("no accounts deleted").setId("finishedLabel").setVisible(false));
  return app;
}

function deleteAccounts(eventInfo) {
  var app = UiApp.createApplication();
  var deleteThese = eventInfo.parameter.textArea;
  // split on newlines (tolerating Windows line endings) and skip any blank lines
  var stringArray = deleteThese.split(/\r?\n/);
  var deletedCount = 0;
  for (var loopNumber = 0; loopNumber < stringArray.length; loopNumber++) {
    var deleteThisUser = stringArray[loopNumber].trim();
    if (deleteThisUser == "") { continue; }
    // rename the account before deleting it to avoid the five day wait before account names can be reused
    var renamedUser = deleteThisUser + '_old';
    Logger.log(renamedUser);
    UserManager.getUser(deleteThisUser).setUsername(renamedUser);
    // delete the account
    UserManager.getUser(renamedUser).deleteUser();
    deletedCount++;
  }
  // tell the user how many accounts we deleted
  app.getElementById("finishedLabel").setText(deletedCount + " accounts deleted").setVisible(true);
  return app;
}

Monday, May 27, 2013

Playing a YouTube Playlist in your Digital Signage

If you're using DigitalSignage.com you'd probably like to play some videos on the screens. You can, of course, upload them to the Resources section, but if you'd just like to loop a YouTube playlist that's fairly easy.

First you'll need to find the URL for a YouTube playlist. Click the title of the playlist, then click the Share button. The URL will be something like http://www.youtube.com/playlist?list=PL627F181E0CB37E19 and the important part is the characters after the = sign.

I'm assuming that you know how to set up screen divisions and campaigns, so grab an HTML5 component from the toolbox and drop it in place. The URL you put in that component will look like this: http://www.youtube.com/embed?listType=playlist&autoplay=1&loop=1&list=PL627F181E0CB37E19 where autoplay=1 means that it will automatically play through the playlist, and loop=1 means that it will go back to the beginning once it finishes. Of course you will replace the characters after list= with the playlist that you want to use.
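That transformation from share URL to embed URL is simple enough to sketch in a few lines of JavaScript (the parameter names come straight from the URLs above):

```javascript
// Turn a YouTube playlist share URL into a looping, autoplaying embed URL
// by grabbing the characters after "list=" and rebuilding the query string.
function playlistEmbedUrl(shareUrl) {
  var id = shareUrl.split("list=")[1];
  return "http://www.youtube.com/embed?listType=playlist&autoplay=1&loop=1&list=" + id;
}

console.log(playlistEmbedUrl("http://www.youtube.com/playlist?list=PL627F181E0CB37E19"));
// → http://www.youtube.com/embed?listType=playlist&autoplay=1&loop=1&list=PL627F181E0CB37E19
```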

Saturday, May 18, 2013

Playing Songza on an EeePC 701

This old seven-inch laptop is still useful for a few things; I've recently connected it to my stereo system as a Songza player.

There were two things I needed to do to get this working:

Install Lubuntu
Install Adobe Flash Player from the tar.gz (which involved copying the .so file to /usr/lib/Chromium and copying the folder somewhere else)

I can write up more details if you ask in the comments.

I tried Google Chrome, but it was too resource-intensive so the audio stuttered. I wasn't able to install the Flash plugin in Midori for some reason, so I went back to Chromium and got it working there. The next step will be to connect a USB IR remote receiver with some scripts for selecting different Songza playlists, pausing, and adjusting the volume.

Wednesday, May 15, 2013

form data averages Google Apps Script

When a Google Form is submitted, it adds a row to your spreadsheet. This changes your formulas, which is a problem if you are trying to do live calculations on submitted data, so you need a script to copy in the correct formulas after each form submission.

Here's an example of how I did that. The script is set to trigger whenever a form is submitted.

// Trigger this function whenever a form is submitted
function insertAverage() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheets()[0];
  var formulas = [
    ["=AVERAGE(C3:C100)", "=AVERAGE(D3:D100)", "=AVERAGE(E3:E100)"]
  ];
  var destination = sheet.getRange("C2:E2");
  destination.setFormulas(formulas);
}
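If the form later grows more question columns, the formulas array has to grow with it. Here's one way to generate the row programmatically (a sketch in plain JavaScript; the C-through-E columns match the range above, and it only handles single-letter columns up to Z):

```javascript
// Build one row of AVERAGE formulas spanning columns startColumn..(startColumn + count - 1),
// e.g. C..E, matching the C2:E2 destination range used above.
function buildAverageFormulas(startColumn, count, firstRow, lastRow) {
  var row = [];
  for (var i = 0; i < count; i++) {
    var letter = String.fromCharCode(startColumn.charCodeAt(0) + i);
    row.push("=AVERAGE(" + letter + firstRow + ":" + letter + lastRow + ")");
  }
  return [row]; // setFormulas expects a 2-D array
}

console.log(buildAverageFormulas("C", 3, 3, 100));
// → [ [ '=AVERAGE(C3:C100)', '=AVERAGE(D3:D100)', '=AVERAGE(E3:E100)' ] ]
```

In the script you would then call sheet.getRange(2, 3, 1, 3).setFormulas(buildAverageFormulas("C", 3, 3, 100)) or similar.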

Monday, May 6, 2013

Programming with Pure Data and Open Sound Control for the Behringer X32

Pure Data is an easy graphical programming environment. It can speak the OSC (Open Sound Control) protocol, so you can write programs to communicate with Behringer X32 digital mixers using their published X32 OSC Remote Protocol. Don't worry, it's easier than it sounds.


1. Download Pure Data Extended (which includes things that we'll need).
2. Unzip and/or install it.
3. Run the program Pd-extended.
4. If you're using Windows, it will likely ask you if you want to allow it to communicate on the network. Click Allow access and input your Administrator password if necessary.


5. In the window that pops up, choose New under the File menu.

6. You now have a blank canvas to put commands, controls, and such on to.

7. Under the Put menu choose Object (or hold the Ctrl key and press 1).
8. In the newly-created object type  import mrpeach  to tell it that you need to use that package.

9.  Under the Put menu choose Message and type in it  connect x.x.x.x 10023  but replace the x values with the IP address of your X32 mixer. For testing purposes you can have it connect to localhost (the computer that you're sitting at) instead.
10. Create a  disconnect  message and a  udpsend  object as well.
11. Drag from the bottom left corner of each of those messages to the top left corner of the udpsend object in order to connect their outputs to the udpsend input.

12. Now everything is set up to communicate, but we need to actually compose some messages to send. To start let's set up a fader to control a channel level, and a button (toggle) to control the mute for that channel. Under the Put menu choose Vslider to place a vertical slider and Toggle to place your mute button.

13. Put messages for sending mute on/off and fader levels for a particular channel following the X32 OSC specifications. For example,  send /ch/01/mix/on $1  will take the input value ($1) to send the message  /ch/01/mix/on 0  to mute channel one or  /ch/01/mix/on 1  to unmute it.

14. You'll also need to put in a  packOSC  object to create the OSC messages, and connect its output to the  udpsend  object.

15. One last thing you'll need to do is change the Vslider's properties to output values from 0 to 1 as the X32 expects. Right-click the Vslider, choose Properties and change the top value to 1 instead of 127. You can also change the size and color here if you'd like.

16. Your program is now all set up. To run it, go to the Edit menu and choose Edit Mode (to turn off the edit mode) or hold the Ctrl key and press the e key. Once the program is running, click on one of your connect messages to connect to the X32, then drag the slider and click the toggle to see their effects.

Optional: If you'd like to test without connecting to the X32, create another Pure Data program by choosing New under the File menu and make it look something like the following. Remember that the Vslider range should be 0 to 1.

When your new OSC receiving program is running (not in edit mode), you should be able to click connect localhost 10023 in your original program and then see that the slider and toggle there will affect the corresponding ones in your new program.

Hopefully that's enough to get you started writing Pure Data programs for your X32 digital mixer.
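Pure Data's packOSC object does the binary packing for you, but if you're curious what actually goes over the wire (or want to script the X32 without Pure Data), here's a minimal sketch of an OSC message encoder in Node.js, following the OSC 1.0 spec: the address and type-tag strings are null-terminated and padded to 4-byte boundaries, followed by big-endian arguments. Actually sending it would be a dgram UDP send to port 10023 on the mixer.

```javascript
// Pad an ASCII string with nulls to a multiple of 4 bytes (at least one null terminator),
// as the OSC 1.0 spec requires for address and type-tag strings.
function padString(s) {
  const len = Math.ceil((s.length + 1) / 4) * 4;
  const buf = Buffer.alloc(len); // zero-filled, so the padding bytes are nulls
  buf.write(s, 0, 'ascii');
  return buf;
}

// Build an OSC message with int32 arguments: address, type tags, then big-endian ints.
function oscMessage(address, ...intArgs) {
  const typeTags = ',' + 'i'.repeat(intArgs.length);
  const argBuf = Buffer.alloc(4 * intArgs.length);
  intArgs.forEach((n, i) => argBuf.writeInt32BE(n, 4 * i));
  return Buffer.concat([padString(address), padString(typeTags), argBuf]);
}

// Mute channel 1 (0 = mute, 1 = unmute on the X32):
const msg = oscMessage('/ch/01/mix/on', 0);
console.log(msg.length); // 24 bytes
```

This only handles integer arguments, which is enough for the mute example; fader levels are sent as floats, which would need a ',f' type tag and writeFloatBE instead.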

Wednesday, May 1, 2013

Lighting LEDs on the Raspberry Pi with Python

There are many tutorials available for this already, but I just wanted to collect my observations into a series of steps that I can repeat later.

For reference, I used raspberry-gpio-python, Raspberry Leaf, and How to use your Raspberry Pi like an Arduino.

I connected an LED and an inline resistor to GPIO pin 7 and ground, one side of a momentary pushbutton to ground, and the other side of the pushbutton to a 1 kOhm resistor connected to 3.3 V.

Starting from a fresh install of Raspbian, at the command line (or in LXTerminal) input the following commands. (Edit: not all of these are strictly necessary; sudo apt-get install python-rpi.gpio is the essential one, since RPi.GPIO is what the code below imports.)

sudo apt-get update
sudo apt-get install python
sudo apt-get install python-dev
sudo python distribute_setup.py
sudo easy_install pip
sudo apt-get install python-rpi.gpio

sudo python


#!/usr/bin/env python
from time import sleep
import RPi.GPIO as GPIO
GPIO.setmode(GPIO.BCM)

# define the pins we're using
button1 = 4
LED = 7

# set up the pins
GPIO.setup(button1, GPIO.IN)
GPIO.setup(LED, GPIO.OUT)

# turn on the LED
GPIO.output(LED, True)
# wait for half a second
sleep(0.5)
# turn off the LED
GPIO.output(LED, False)
# toggle the LED
GPIO.output(LED, not GPIO.input(LED))

# set up some variables to read from the button
buttonState = False
previousState = True

# loop until interrupted with Ctrl-C
try:
    while True:
        # read the button state into the variable buttonState
        buttonState = GPIO.input(button1)
        # make sure the button state isn't what it used to be
        if (not previousState) and buttonState:
            print("button pushed")
        # copy the button state to the previousState variable
        previousState = buttonState
        # wait to "debounce" the input
        sleep(0.05)
except KeyboardInterrupt:
    # clean things up
    GPIO.cleanup()

Vectorizing a Scanned Object for CNC Cutting

If you have a physical object or paper drawing that you'd like to carve out using a CNC router or plasma cutter, here's one way to do it. We're going to use the open source drawing program Inkscape for turning scanned images into vectors that the machine can cut.

Start by scanning the object (or drawing) on a flatbed scanner. If you don't have access to a scanner, you can try taking a picture from directly above the object (or the drawing). For this example I've scanned a handheld wireless microphone.
This will be a somewhat complicated image to vectorize; it would be better if it were simply black and white (like a line drawing or most logos). The nice thing about this scan, though, is that the background is totally white. Don't worry if yours isn't; that's something we can deal with or fix later.

Start up Inkscape (you can download the portable version if it's not installed on your computer) and under the File menu choose Import... and find the image that you scanned earlier. It doesn't matter if you link or embed, since you'll be removing the image from Inkscape soon anyway.

Now it's time to turn that image into a vector outline. Make sure that the image is selected, and under the Path menu choose Trace Bitmap... to bring up the following window.

Experiment with changing the Brightness cutoff Threshold. I tried values of 0.4, 0.6, 0.8, and 0.9 to get the following vectors (from left to right, with the original bitmap image on the far left).

Each time you click OK it will create another vector object, which you can delete if you don't like it. In my example above there's a little white spot that shows up even at high threshold values, so that's something I could go back to the original image and paint over, or I can break apart the vector group and delete the parts I don't want. (Under the Path menu choose Break Apart, then delete everything you don't want.)

After all of that, we end up with a fairly good outline of the object.

You can also experiment with the Simplify command under the Path menu; each time you click it, it will remove some points from your vector.

Depending on the resolution that you scanned at, you will probably have to resize the vector to make it the size of the actual object. Make sure the lock is engaged (to maintain the proper aspect ratio) and the units are in inches (or mm if you prefer). Typing a value in the width (W) will change the height (H) and vice versa.
So we have now vectorized the scanned image, and it's ready to be saved and then imported into your favorite CAM program to generate g-code for your machine. Or you can use the Gcodetools plugin for Inkscape.

Thursday, April 25, 2013

iPad app: Entrepreneurship Essentials course


ADLC, Robots and Pencils, and GoForth Institute have developed an iPad app that teaches Entrepreneurship Essentials. Through the simulation of running a lemonade stand, it uses gamification principles, text slides, videos, and auto-graded quizzes to deliver the content for five one-credit CTS courses (ENT1010, 1020, 2010, 2020, and 2030).

It is fairly well designed, certainly better than most LMS-based online courses. Students will still need to be somewhat self-motivated to make it through, but the badges, notifications, and challenges should help.

From what I hear, schools that register students in this "course" can receive all of the CEU funding if they have a teacher assigned to do the "marking" and tracking, or they can receive 3/5 of the funding if they get ADLC to do all of that.

There has been some media coverage by CBC News and the Edmonton Journal. If you're interested, check out more information and an introduction video.

At any rate, I'm certainly going to recommend it to high schools that have students interested in entrepreneurship and/or interested in trying a different method for getting CTS credits. Information about registration is here.

Thursday, April 11, 2013

things I've learned about Source Filmmaker

I've been playing with Source Filmmaker a bit lately, and I wanted to document some of the things that I've learned.

The best source of information is the official SFM wiki.

There are also numerous video tutorials available on YouTube, including the official ones.

The up and down arrows take the playhead to the beginning and end of a shot, respectively.

Audio should be in WAV format. And I recommend using a nice USB microphone for recording audio.

More models and props can be downloaded (subscribed) from the Steam Workshop. I was particularly interested in models with facial animations for lip-sync, and there are a number available there.

Attach a prop (e.g. a hat) to a character's head by locking it to the bip_head control in the character's animation set: drag the character's bip_head control onto the hat's bip_head (or rootTransform) control, then select all of time in the Motion Editor (if it's not all green already), select the Body animation set of your prop, and under Procedural drag the Default control from left to right. You may need to adjust its position a little, especially if you used rootTransform because there was no bip_head.

Attach a prop to a character's hand by locking it to the weapon_bone control (under Unknown in the character's animation set). The process is otherwise the same as attaching to the head.

In order to render 1080p videos, you need to start the program with the argument -sfm_resolution 1080 (either from your Steam library by right-clicking it and choosing Properties then SET LAUNCH OPTIONS..., or by editing the desktop shortcut). I don't recommend this until you're ready to render, since working in 1080p makes things a little slow. Rendering 1080p video takes a long time too: on my i7 desktop it took almost an hour and a half to render a one-minute clip. If you're really adventurous you can render at 4K with -sfm_resolution 2160, but you'll need a decent computer and a lot of time.
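For reference, an edited desktop-shortcut target would end up looking something like the line below; the install path is an assumption on my part, so check where Steam actually put Source Filmmaker on your machine:

```shell
"C:\Program Files (x86)\Steam\steamapps\common\SourceFilmmaker\game\bin\sfm.exe" -sfm_resolution 1080
```

The Steam library route does the same thing: whatever you type in SET LAUNCH OPTIONS... is appended to the command line when the program starts.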

The basic process I go through to create videos (like the one below) is:
  1. launch Source Filmmaker
  2. name your session (or open an existing session)
  3. set the Framerate to 30 (although the default 24 is fine too)
  4. right-click the black part where it says "No Map Loaded!" and choose Load Map...
  5. decide where you'd like to film
    • you can move the camera around by holding down the mouse button in the viewport (the window where you just loaded the map) while using the WASD and ZX keys
  6. click the + sign right underneath Animation Set Editor to Create Animation Set for New Model
    • look in the categories player and survivors for good human models
  7. move the model to the appropriate place by selecting its name in the Animation Set Editor, choosing the move tool (near the bottom right of the viewport), and dragging it in either the Motion Editor (F3) or the Graph Editor (F4)
  8. add a camera by pressing c on the keyboard
    • or click the down arrow on the right of the active camera button (below the viewport), click Change Scene Camera, and click New Camera.
  9. click the + sign to Create Animation Set(s) for Existing Element(s) and choose camera1 so that you can animate that camera
  10. add any props and position them how you'd like
    • the same way that you added a new model in step 6
    • if you want them to move with a character, make sure you lock them to that character (e.g. their weapon_bone or bip_head)
  11. add your recorded voice clip by selecting the clip editor (F2), right clicking on Dialog (near the bottom), and choosing Add Clip to Track or Record Narration
    • when you are adding a clip to a track, any WAV files that are in the ...\steam\steamapps\common\sourcefilmmaker\game\usermod\sound folder will show up
  12. to have your character move their lips along with the voice track, select the Dialog track, the character's Lower Face (and Tongue) in the Animation Set Editor, right-click in the Animation Set Editor, and choose Extract Phonemes
  13. I usually animate the camera by moving it around rather than using the fieldOfView control
So that is a very brief overview of some of the things I've done in Source Filmmaker so far. Here's an example of one I did recently:

Monday, April 8, 2013

installing Red5 in Ubuntu from the command line

First install the requirements for Red5:

sudo apt-get install java-package openjdk-7-jre openjdk-7-jdk ant subversion

Next, download the Red5 tarball (red5-1.0.0.tar.gz) from the Red5 site and untar it:

tar xvfz red5-1.0.0.tar.gz

Rename the folder:

mv red5-1.0.0 red5
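If you want to see what those two steps do without the real download, here's the same extract-and-rename sequence run against a throwaway archive built on the spot (the real red5-1.0.0.tar.gz comes from the Red5 download page):

```shell
# Build a throwaway stand-in archive so these commands can run anywhere;
# the real tarball would come from the Red5 download page instead.
mkdir -p red5-1.0.0 && touch red5-1.0.0/red5.sh
tar cfz red5-1.0.0.tar.gz red5-1.0.0
rm -r red5-1.0.0

# The steps from the post: x = extract, v = verbose, f = archive file, z = gunzip
tar xvfz red5-1.0.0.tar.gz

# Rename the extracted folder for convenience
mv red5-1.0.0 red5
```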

And then run Red5:

cd red5
sh red5.sh

After scrolling through a lot of verbose information about the startup of Red5, you'll know it's running when you see the line that ends with Installer service created.


If you're browsing from the computer that it's installed on, you can see if the Red5 server is running by going to http://localhost:5080.

Monday, January 28, 2013

Google script to email form data

I helped someone set up a Google Form with email notifications the other day. This morning they asked if they could be emailed the submitted data, rather than just a link to the spreadsheet. Since there's only one question on the form, that's an easy script:


function onSubmit(e) {
  var timestamp = e.values[0];  // column A: the submission timestamp
  var question1 = e.values[1];  // column B: the answer to the form's one question
  var address = "firstaddress@example.com, anotheraddress@example.com";
  var subject = "Feedback form submitted";
  var body = "Someone submitted the form at " + timestamp + " and said " + question1;
  MailApp.sendEmail(address, subject, body);
}


This script sends an email to the address(es) in quotes after var address =. The body of the email contains the string assigned to var body, which includes the timestamp (from column A of the spreadsheet) and what the user submitted in the form (the contents of column B).

To get this to work for you, paste the above code into a new blank script that's associated with a form you've created, and then set up a trigger that runs the function onSubmit(e) when the form is submitted.

To set up the trigger, follow the instructions from this page:


  1. Open or create a new form, then go to the results spreadsheet of that form.
  2. Click the Unsaved Spreadsheet dialog box and change the name.
  3. Choose Tools > Script Editor and write the function you want to run.
  4. Choose Resources > Current script's triggers. You see a panel with the message No triggers set up. Click here to add one now.
  5. Click the link.
  6. Under Run, select the function you want executed by the trigger.
  7. Under Events, select From Spreadsheet.
  8. From the next drop-down list, select On form submit.
  9. Click Save.
Of course, if you have more than one question in your form, you'll need a variable for each of them (e.g. var question2 = e.values[2];) and you'll need to reference those variables in the var body = statement.
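If you'd rather not add a variable per question, a loop over e.values handles any number of questions. This is only a sketch: the helper name buildBody is my own invention, and the addresses are placeholders you'd replace with your own.

```javascript
// Builds the email body from the form's submitted values.
// values[0] is the timestamp (column A); the rest are the answers.
function buildBody(values) {
  var body = "Someone submitted the form at " + values[0];
  for (var i = 1; i < values.length; i++) {
    body += "\nQuestion " + i + ": " + values[i];
  }
  return body;
}

function onSubmit(e) {
  // MailApp only exists inside Google Apps Script
  var address = "firstaddress@example.com, anotheraddress@example.com";
  MailApp.sendEmail(address, "Feedback form submitted", buildBody(e.values));
}
```

The trigger setup is the same as for the single-question version: run onSubmit on form submit.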

Let me know if this works for you.

Thursday, January 24, 2013

automatically adding column data to google form submissions with a script


I've often come across the issue of wanting to manipulate data that has been submitted with a Google Spreadsheets Form. For example, automatically marking and totaling formative quizzes where students submit their answers in a Google Form.

Unfortunately, when a user submits a form, a new row with that data is inserted into the spreadsheet. This means any formulas you've manually added to the spreadsheet will be above or below that row.

To solve this issue, I wrote a script that copies the contents of the columns you've added (from the first data row) into the newly inserted row.


function addFormula() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var startRow = 2;
  var startColumn = 8;
  var numberRows = 1;
  var numberColumns = 15;
  var lastRow = sheet.getLastRow();
  var sourceRange = sheet.getRange(startRow, startColumn, numberRows, numberColumns);
  var destinationRange = sheet.getRange(lastRow, startColumn, numberRows, numberColumns);
  sourceRange.copyTo(destinationRange);
}


The meanings of the variables are:
sheet is a shortcut so we don't have to keep typing SpreadsheetApp.getActiveSheet()
startRow is the row number that's the source of your formula that you want to copy
startColumn is the column number where your source formula starts
numberRows should usually be 1, it's the number of rows that you would like to copy each time
numberColumns is the number of columns that contain your source formula
lastRow is a shortcut so we don't have to type sheet.getLastRow() when we want to use it
sourceRange collects together the information to tell copyTo where to get the data
destinationRange collects together the information to tell copyTo where to put it
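To see the mechanics of the copy, here it is modelled in plain JavaScript with a 2-D array standing in for the sheet. The function name copyRow is made up for this sketch, and unlike the real copyTo it only moves cell contents, not formulas or formatting.

```javascript
// Plain-JavaScript model of the copy-down. grid is a 2-D array standing in
// for the sheet; row and column arguments are 1-based to match the script above
// (numberRows is assumed to be 1, as in the post).
function copyRow(grid, startRow, startColumn, numberColumns, lastRow) {
  for (var c = 0; c < numberColumns; c++) {
    grid[lastRow - 1][startColumn - 1 + c] = grid[startRow - 1][startColumn - 1 + c];
  }
  return grid;
}
```

So with startRow 2 and lastRow pointing at the newly inserted row, whatever sits in your extra columns on the first data row gets stamped onto the latest submission.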


If you prefer, this could also be done in a single line without all of the variable declarations:


function addFormula() { SpreadsheetApp.getActiveSheet().getRange(2, 8, 1, 15).copyTo(SpreadsheetApp.getActiveSheet().getRange(SpreadsheetApp.getActiveSheet().getLastRow(), 8, 1, 15)); }


To add this script to your spreadsheet of data from a Google Form:
  1. open the spreadsheet and under the Tools menu choose Script editor...
  2. under "Create script for" click Spreadsheet
  3. delete everything in the Code.gs pane and replace it with the script from this blog post
  4. if necessary, change the numbers for the variables to what they should be for your spreadsheet
  5. under the File menu click Save
  6. under the Resources menu click Current script's triggers...
  7. if you haven't already named your project, do so now in the box that comes up
  8. click No triggers set up. Click here to add one now.
  9. in the third drop-down list, select On form submit
  10. click the Save button
And you're done. The script will run whenever a user submits the form, and it will copy the formulas that you've set up on the first line of submitted data. Leave a comment below if this works for you or if you have any questions.