UCLA CS113, Introduction to Distributed Embedded Systems




Winter Quarter, 2006

Deborah Estrin




Office: 3531H Boelter Hall


When: Lectures are Mondays and Wednesdays 4:00–5:50 PM

Discussion sections are Fridays 10:00–11:50 AM

Where: 5264 Boelter Hall (Lecture)

2432 Boelter Hall (Discussion Section and Lab)

Class URL: http://websrv.seas.ucla.edu:8901/classView.php?term=06W&srs=187378200

Or just go to http://courseweb.seas.ucla.edu/ and click on "Computer Science" on the left-hand side.

This course will introduce basic concepts needed to understand, design, and implement wireless distributed embedded systems. Topics include: a) design implications of energy- and otherwise resource-constrained nodes; b) network self-configuration and adaptation; c) data routing and transport; d) applications; and e) software design issues. The course will be heavily project based. Working knowledge of C programming in the UNIX environment (particularly GNU/Linux) is assumed.

Lecture / Tutorial Schedule

  • Jan 09 Mon: Introduction by Deborah Estrin (HW0 Out)
  • Jan 11 Wed: EmStar I by Andrew Parker, and Martin Lukac
  • Jan 13 Fri: Initial machine setup
  • Jan 16 Mon: NO CLASS (HW1 Out)
  • Jan 18 Wed: EmStar II by Andrew Parker, and Martin Lukac
  • Jan 21 Fri: Final group formation
  • Jan 23 Mon: EmStar interfaces and services by Andrew Parker, and Martin Lukac (HW2 Out)
  • Jan 25 Wed: EmStar running, debugging, and use by Andrew Parker, and Martin Lukac
  • Jan 27 Fri: HW1 Demonstrations
  • Jan 30 Mon: TinyOS/NesC by Ben Greenstein, Thomas Schoellhammer, Thanos Stathopoulos, and Karen Weeks (HW3 Out)
  • Feb 01 Wed: TinyOS/NesC by Ben Greenstein, Thomas Schoellhammer, Thanos Stathopoulos, and Karen Weeks
  • Feb 03 Fri: HW2 Demonstrations
  • Feb 06 Mon: EmTOS by Ben Greenstein, Thomas Schoellhammer, Thanos Stathopoulos, and Karen Weeks (HW4 Out)
  • Feb 08 Wed: SNACK by Ben Greenstein, Thomas Schoellhammer, Thanos Stathopoulos, and Karen Weeks
  • Feb 10 Fri: HW3 Demonstrations
  • Feb 13 Mon: SOS and other tools by Roy Shea (HW5 Out)
  • Feb 15 Wed: Midterm in class
  • Feb 17 Fri: Mid quarter class evaluation
  • Feb 20 Mon: NO CLASS
  • Feb 22 Wed: Energy harvesting by Jonathan Friedman
  • Feb 24 Fri: HW4 Demonstrations
  • Feb 27 Mon: Debugging by Roy Shea and Nithya Ramanathan (HW6 Out)
  • Mar 01 Wed: MAC Protocols by Saurabh Ganeriwal
  • Mar 03 Fri: HW5 Demonstrations
  • Mar 06 Mon: Collaborative Signal Processing by Hanbiao Wang (HW7 Out)
  • Mar 08 Wed: Cyclops and NIMS by Mohammad Rahimi
  • Mar 10 Fri: HW6 Demonstrations
  • Mar 13 Mon: Trip to garden deployment (date subject to change)
  • Mar 15 Wed: Trip to garden deployment (date subject to change)
  • Mar 17 Fri: HW7 Demonstrations
  • Mar 20 Mon: Final from 8-11 and project deadline


5% Class attendance

45% Weekly Assignments (HW1–HW6): 7.5% each.

15% Integrated project assignment and demonstration. (HW7)

15% Midterm: In class.  Systems questions on EmStar and TOS.  Individual.

15% Final exam: In class.  Systems questions on EmStar, TOS, and general sensor network topics.  Individual.

5% Final Project writeup.

All assignments are due by 8 PM on Thursdays. Assignment evaluations take place in lab on the following Friday. Points will be deducted for submissions past the due date and time. See the assignments for more details.

Project Description


Gain appreciation for distributed system/protocol challenges and approaches in distributed embedded systems; experience programming under various resource constraints; experience debugging distributed applications; learn sensor network system tools such as EmStar and TinyOS.

The CS113 Project is a variant on the Distributed Pursuit Evasion Game  (H. J. Kim, R. Vidal, D. H. Shim, O. Shakernia, and S. Sastry, “A Hierarchical Approach to Probabilistic Pursuit-Evasion Games with Unmanned Ground and Aerial Vehicles,” IEEE Conf. Decision and Control, Orlando, FL, December 2001.)

The general idea is that there are two kinds of mobile entities: pursuers and evaders. Your goal is to coordinate the pursuers to capture the evaders. The variant is that the field in which the game is played is instrumented with wireless sensors that send information to pursuers. This game has many practical applications such as security, search and rescue, habitat monitoring, etc.


In the lab, we have set up a playing field of 16 Mica2 Motes in a 4 by 4 grid (approximately 4 feet by 4 feet) equipped with light sensors. This is the sensor network. A projector will be positioned to emit animated images (white circles) representing the evaders on top of the 4 by 4 grid. For example, the projector may display two white circles bouncing around on a black background. The pursuers are represented by at most three web cameras with controllable pan-tilt-zoom attached to microservers. The goal is to have the cameras pursue the evaders as they wander over the playing field using only the data provided by the sensor network.

With this physical set up, there are many ways to play this game. Here is the approach we will take in class:

The Sensor Network

The motes are close enough that they are all within one hop of each other. However, software will be installed that only allows packets to be received from adjacent nodes to enable the creation of a multihop network when we choose. Each of the motes will be equipped with a light sensor and the ability to communicate wirelessly with its neighbors. Each mote will also know its relative physical location.

The Evaders

The system that you build will need to cope with a number of different scenarios. The scenarios include evaders represented by solid white circles of varying sizes with either sharp or fuzzy edges. Other variables include the total number of evaders, their brightness, continuous or discontinuous movement, etc. The evaders' movement may be controlled by a script (for reliable replay) or by a human player.

The Pursuers

There will be a variable number of pursuers. A pursuer is composed of a microserver, an attached mote (so that the server can talk to other motes), and a controllable web camera. You will be given the relative position of the camera with respect to the sensor field; this should be enough information to calculate the pan and tilt angles needed to point the camera squarely at any one of the motes, or anywhere else on the playing field. When there is more than one evader, the pursuers will need to cooperate so that they track the evaders efficiently.


There will be several metrics used to compare your implementation against the other teams'.

Average brightness of web cam pictures will be measured. This should indicate how well you are tracking the target.

The number of total packets transmitted will be recorded. Every packet transmitted is one step closer to death in a real sensor network; your application isn't useful if it doesn't last very long.

Individual Assignments

These are just descriptions. More detail will be provided at the time of assignment.  Specifics of the descriptions may change during the quarter.

Homeworks HW1–HW7 will be assigned on a Monday. By Wednesday 11:59 PM (two days later), your team is required to submit approximately a one-page description of how you plan to approach the problem. Typically, code for the assignment will be due by Thursday 8 PM of the following week (ten days later). On the following day, your team will demonstrate that the code meets the specified requirements. The write-up, timely submission of code, and demonstration will all count toward your grade.

Homework 0: Install EmStar and NesC and other necessary tools. Work through the tutorials.

This will take several hours, but it is extremely valuable to learn how to install all the necessary tools that you will be using for this class. Make sure to allocate enough time for this! This homework does not require a write-up or code submission, and it does not count for any points. But it must be done if you hope to get started on your next assignment.

Homework 1: Controlling Cameras (EmStar application)

Write an EmStar application that controls the camera to point to specific coordinates on the test-bed.

Homework 2: Cooperating Cameras (EmStar application)

Consider two microservers, each controlling a camera, both independently receiving event notifications (a list of coordinates). Have the microservers coordinate their actions so that they avoid covering the same event.
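One way to avoid covering the same event without exchanging extra messages: since both microservers receive the same coordinate list, each can sort it with the same deterministic rule and have camera k take the k-th event. This is a sketch only; the event struct and camera IDs are illustrative assumptions:

```c
#include <stdlib.h>

typedef struct { int x, y; } event_xy;

/* Deterministic ordering: sort by x, then by y. Both microservers apply
 * the same rule, so they agree on event indices without communicating. */
static int cmp_event(const void *a, const void *b)
{
    const event_xy *p = a, *q = b;
    if (p->x != q->x) return p->x - q->x;
    return p->y - q->y;
}

/* Returns the index (after sorting) of the event this camera should cover,
 * or -1 if there are fewer events than this camera's ID. */
static int pick_event(event_xy *events, int n, int camera_id)
{
    qsort(events, n, sizeof *events, cmp_event);
    return (camera_id < n) ? camera_id : -1;
}
```

The design choice here trades flexibility for silence on the radio: no coordination packets are needed, at the cost of a fixed, possibly suboptimal camera-to-event assignment.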

Homework 3: Tracking Location of a Single Evader, Single Hop (TinyOS application)

The goal is to track the location of a single evader. Working from the experience you gained in the TinyOS tutorial, have your motes continually broadcast their light readings. Next, write a NesC application (call it the Sink) running on the microserver to log the mote ID nearest to the evader. Assume that all of your motes can hear one another.
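The Sink's core per-round decision can be sketched as follows, under the assumption that a brighter light reading means a mote is closer to the white-circle evader; the struct fields are illustrative:

```c
/* Sink sketch: given one round of reported readings, pick the mote whose
 * light reading is highest, i.e. the mote nearest the projected evader.
 * Returns -1 if no readings were heard this round. */
typedef struct { int mote_id; int light; } mote_reading;

static int nearest_mote(const mote_reading *readings, int n)
{
    int best = -1, best_light = -1;
    for (int i = 0; i < n; i++) {
        if (readings[i].light > best_light) {
            best_light = readings[i].light;
            best = readings[i].mote_id;
        }
    }
    return best;
}
```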

The sensors on the motes may not be calibrated. Write an application that calibrates the light sensors against a minimum stimulus and a maximum stimulus. This requires a program in which the user can issue three different commands: calibrate min, calibrate max, and play game. One way to do this is to write an EmTOS program running on a server that accepts user input specifying whether nodes should calibrate for the min light value, calibrate for the max light value, or track the evader; the EmTOS program then broadcasts the command.
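One way to think about the calibration step: each mote records its raw reading under the min and max stimuli, then maps subsequent raw readings onto a 0–100 scale. A minimal sketch; the clamping behavior and integer scale are assumptions:

```c
/* Calibration sketch: map a raw sensor reading onto 0-100 using the raw
 * values recorded under the minimum and maximum stimuli. Readings outside
 * the calibrated range are clamped. */
static int calibrated(int raw, int min_raw, int max_raw)
{
    if (max_raw <= min_raw) return 0;   /* degenerate calibration: report dark */
    if (raw <= min_raw) return 0;
    if (raw >= max_raw) return 100;
    return (100 * (raw - min_raw)) / (max_raw - min_raw);
}
```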

Homework 4: Tracking Location of a Single Evader, Multihop (TinyOS application)

Now assume that not all of your motes can hear one another. Form a tree with the Sink node as the root, and forward ALL sensor readings to the Sink. Have the Sink log the mote ID nearest to the evader.  Testing on real nodes can be both time consuming and frustrating.  This application will be developed using the EmTOS simulator, which provides great visibility into the simulated network and better information for debugging.

The motes are not in a single broadcast domain. Refine the calibration code to flood the command message so that every mote hears it. The result of your calibration process should be that all motes report zero in a dark room and 100 when exposed to a bright light.
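A common way to flood a command is to tag each command with a sequence number and have every mote rebroadcast a message the first time it sees it, dropping duplicates. A minimal per-mote sketch, with the rebroadcast modeled as a counter since the real radio call is mote-specific:

```c
#include <stdbool.h>

/* Flooding sketch: per-mote state for duplicate suppression. In a real
 * TinyOS application the rebroadcast would be a radio send; here a counter
 * stands in for it so the logic can be checked. */
static int last_seq_seen = -1;
static int rebroadcast_count = 0;

/* Handle a flooded command with sequence number seq. Returns true if the
 * command was new (and therefore rebroadcast), false if it was a duplicate. */
static bool handle_flood(int seq)
{
    if (seq <= last_seq_seen)
        return false;          /* duplicate or stale: drop silently */
    last_seq_seen = seq;
    rebroadcast_count++;       /* forward so motes beyond one hop hear it */
    return true;
}
```

Without the sequence-number check, every rebroadcast would trigger further rebroadcasts and the flood would never terminate.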

Homework 5: Multiple Targets and Preserving Tracking Continuity (EmStar application)

There may be more than one target on the field.  Can you develop a technique to differentiate between one large target and multiple small targets?  After identifying more than one target, you will need to delegate the cameras so that they do not all track the same target.  Finally, when a camera switches from one evader to another, you lose continuity. Can you come up with a way to minimize the amount of switching a camera does and smoothly hand off an evader from one camera to another? A hand-off is when Camera A moves its attention away from Evader 1 only after Camera B acquires Evader 1. This of course assumes that Camera A and Camera B can talk to each other.
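The hand-off rule described above is naturally a small state machine on Camera A's side: keep tracking until a hand-off is requested, then keep tracking until the peer confirms acquisition, and only then release. The state and event names below are illustrative assumptions:

```c
/* Hand-off sketch for the releasing camera: it stays on its evader until
 * the peer camera reports acquisition, so the target is never uncovered. */
typedef enum { TRACKING, HANDOFF_PENDING, RELEASED } cam_state;

/* Advance the state machine given two boolean inputs: whether a hand-off
 * has been requested, and whether the peer has acquired the evader. */
static cam_state on_event(cam_state s, int want_handoff, int peer_acquired)
{
    switch (s) {
    case TRACKING:
        return want_handoff ? HANDOFF_PENDING : TRACKING;
    case HANDOFF_PENDING:
        return peer_acquired ? RELEASED : HANDOFF_PENDING;
    default:
        return RELEASED;
    }
}
```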

Homework 6: In-Network Aggregation and Self-Filtering, Multihop (TinyOS application)

The amount of data transmitted has a huge impact on the lifetime of the network. The goal is to use in-network aggregation to reduce the total amount of traffic. The basic idea is that as nodes forward traffic up the tree, it is smarter to forward only the maximum reading instead of all of them. A requirement is that all nodes must still broadcast their own sensor readings; it is only at forwarding time that a node makes a decision. Graph the amount of traffic and compare it to the forward-everything approach of HW4.
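The forwarding decision can be sketched as: combine the node's own reading with the reports heard from its children, and pass only the single maximum up toward the Sink. The struct fields are illustrative:

```c
/* In-network aggregation sketch: a tree node forwards only the maximum
 * reading among its own and its children's reports, cutting the per-hop
 * traffic from n+1 packets to 1. */
typedef struct { int mote_id; int light; } agg_report;

static agg_report aggregate(agg_report own, const agg_report *children, int n)
{
    agg_report best = own;
    for (int i = 0; i < n; i++)
        if (children[i].light > best.light)
            best = children[i];
    return best;   /* the one report forwarded to this node's parent */
}
```

Because max is associative, applying this rule at every level of the tree still delivers the global maximum to the Sink.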

Another idea to reduce the amount of traffic is to have each mote make a local decision on whether or not to broadcast its own sensor reading, based on the readings it hears from its neighbors. Once a mote broadcasts its data, the data is forwarded in the usual manner. How does this compare to in-network aggregation and to broadcasting everything, in terms of traffic? What about combining self-filtering and in-network aggregation?  How does this behave with more than one evader?
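One plausible local rule, sketched below under the assumption that each mote buffers the neighbor readings it overheard this round: stay silent whenever some neighbor has already reported a reading at least as bright as your own.

```c
#include <stdbool.h>

/* Self-filtering sketch: broadcast only if no overheard neighbor reading
 * is at least as bright as this mote's own reading. The suppression
 * threshold (>=) is an assumption; a margin could be added to reduce ties. */
static bool should_broadcast(int own, const int *neighbor, int n)
{
    for (int i = 0; i < n; i++)
        if (neighbor[i] >= own)
            return false;   /* a neighbor is at least as close to the evader */
    return true;
}
```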

Homework 7: Optimization and Final Preparation and Integration

The objective is to make sure that all the parts of your system work together. You may also take this opportunity to implement some small optimization or cool feature that you'd like to see in your system and clean up the interactions between components of the system.


TinyOS and EmStar documentation is available online and is mandatory reading.



IEEE Wireless Communications, Dec. 2004, Volume 11, Issue 6. Recommended reading for the second half of the course. You should have free access to IEEE material from UCLA.
