Monday, May 31, 2010
The Guardian: Inexpensive AVR-based Cat Food Protection
Over 6 months ago we added two kittens (Duke and Ella) to our household, and we found out that our dog, Riley, was eating their food. To prevent this, I built the Guardian, an AVR-based protection system that humanely keeps Riley away from the cat food but is not triggered by the cats themselves.
I wanted this setup to be cheap, so I decided to use an infrared "trip-wire" system. On one side of the hallway is an always-on infrared emitter, and on the other side is an infrared receiver, an AVR ATmega8 microcontroller, and a buzzer. The microcontroller continuously checks to see if the beam between the emitter and receiver has been broken - if it has, it sounds the alarm, which Riley hates, driving him away. The microcontroller is programmed so that small, fast things (such as cat tails) do not trigger the alarm. Also, the whole rig is placed at Riley's shoulder height, well above the cats' bodies.
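The "ignore small, fast things" trick boils down to a duration filter. Here is a pure-Python sketch of the idea (my reconstruction, not the actual AVR firmware - the threshold and polling values are made up; the real code is linked below):

```python
# Sketch of the Guardian's duration filter: only sound the alarm if the
# beam stays broken longer than a threshold, so a fast-moving cat tail
# passing through the beam is ignored. Thresholds are assumptions.

TRIGGER_MS = 250   # assumed: a cat tail clears the beam faster than this
POLL_MS = 10       # assumed polling interval

def should_alarm(beam_samples):
    """beam_samples: iterable of booleans, True = beam broken, one
    sample every POLL_MS. Returns True once the beam has been broken
    continuously for at least TRIGGER_MS."""
    broken_for = 0
    for broken in beam_samples:
        if broken:
            broken_for += POLL_MS
            if broken_for >= TRIGGER_MS:
                return True
        else:
            broken_for = 0   # beam restored: reset the timer
    return False
```

A 50 ms interruption (a tail flick) never reaches the threshold, while a dog standing in the beam trips it almost immediately.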
This may not work for most dogs, as Riley is a pansy and afraid of loud noises, but here are the schematics, source code, and a rough bill of materials:
Schematics (Eagle, PNG)
Source Code
Bill of Materials
Labels:
Arduino,
dog,
electrical engineering,
machines,
Maker,
rapid prototyping,
UAT
Saturday, May 29, 2010
3D Printed Tensegrity Structure
This weekend I built a Tensegrity structure as a demonstration of principles of balancing tension and compression in structures for my course RBT379 - Mechatronics. I've been fascinated by the aesthetic of these structures - all triangles and edges, but somehow magical as your brain tries to wrap itself around the reasons why the solid supports seem to be floating in air, and the structure is still standing.
I found a couple of excellent pages documenting how the structures work and are assembled - one from the artist Kenneth Snelson explaining the aesthetic inspiration for Tensegrity structures as a kind of 3-dimensional weaving, and another more practical site detailing the construction of a Tensegrity coffee table.
I used the UAT 3D printer to make 12 struts. You can make these out of much simpler materials (wood, copper, chopsticks, etc.), but the Inventor Part design for the struts is provided here. A gallery of the build process and completed 3-level structure (and one rubber-band version) is below.
I used 20lb test mono-filament fishing line and 2 sets of good smooth-jaw pliers to tie the knots (grip-jaw pliers scar the fishing line, causing breaks after you've carefully tied many little knots), following the guidelines at the Tensegrity coffee table website. If you do this yourself, construct a jig that will allow you to accurately create the tension links at a certain distance (the length of your struts divided by 1.4).
I built 3 modules, 2 "left-handed" and 1 "right-handed." To assemble them into the tower, I hooked the struts of one module into the base triangle of tension elements of another, forcing the base triangle into a hexagon.
The resulting structure is surprisingly sturdy, and the fishing line is somewhat transparent, adding to the mystical effect I was trying to emphasize.
Labels:
3D modeling,
machines,
rapid prototyping,
RBT379,
UAT
Wednesday, May 26, 2010
ArcAttack: Ride the Lightning
I've always been fascinated by Tesla Coils. The operating principle is so elegant - you set up one circuit (your power supply) that oscillates at some high frequency, and you set up another circuit that uses the earth to resonate with your power circuit. Energy is transferred and converted from current to voltage, and the end result is man-made lightning - an incredible display of light and sound.
Newer versions of the Tesla coil allow the control of the resonant frequency. Have you ever noticed the sound that a fluorescent light makes? That kind of "buzz"? You're hearing the frequency of the power that is coming into your house (around 60 Hz). When you control the resonant frequency of the Tesla coil, you control the tone that it generates, and thus the Singing Tesla Coil is born. This is not lightning set to music - the sound is generated by the spark itself.
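To make that concrete: a singing Tesla coil plays a pitch by firing spark bursts at that note's frequency. As an illustration (this is not ArcAttack's actual control code), mapping a MIDI note number to a burst rate is one line of math:

```python
# Illustration only: map a MIDI note number to the spark burst
# repetition rate (in Hz) that produces that pitch.
# Standard equal temperament: A4 (MIDI note 69) = 440 Hz.

def midi_to_hz(note):
    return 440.0 * 2 ** ((note - 69) / 12)

# Middle C (MIDI 60) works out to roughly 262 spark bursts per second.
```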
The band ArcAttack has integrated a pair of singing Tesla coils into their repertoire, with epic results:
Sunday, May 23, 2010
Inventor Models
For this semester's Mechatronics course (RBT379) the semester project is to build a self-balancing two-wheeled robot from scratch. Over the course of the semester we will be designing the frame of the robot, constructing the control schematic, laying out a PCB, and finally programming the micro-controller to self-balance using a PID loop.
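The heart of the balancing problem is the PID loop itself. As a rough sketch of what each control cycle will do (in Python rather than microcontroller code, and with placeholder gains - this is not the course solution):

```python
# Minimal PID controller sketch for the self-balancing robot.
# Gains below are placeholders, not tuned values.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """error: desired tilt minus measured tilt (radians).
        dt: seconds since the last update.
        Returns a motor command; positive drives the wheels forward."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Each control cycle: read the tilt sensor, compute the correction,
# and send the result to the motor driver.
pid = PID(kp=20.0, ki=0.5, kd=1.0)
command = pid.update(error=0.05, dt=0.01)  # proportional term dominates at first
```

The proportional term pushes the wheels under the falling robot, the derivative term damps the oscillation, and the integral term removes steady-state lean.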
The first part of the semester we're using Autodesk Inventor to design and lay out the frame of the robot. To help with this process I've modeled the major components. I had a really difficult time finding models of these online, so I'm sharing them with the world:
Standard Servo: Modeled after the Parallax Continuous Rotation Servo
Inventor Part (*.ipt), Drawing (*.pdf)
Parallax Boe Bot Wheel - Inventor Part (*.ipt), Drawing (*.pdf)
Sharp GP2D12 - Inventor Part (*.ipt), Drawing (*.pdf)
Switched AAx4 Battery Box - Modeled after Jameco PN#216187
Inventor Part (*.ipt), Drawing (*.pdf)
Circuit Board, 80mm x 100mm (Maximum Eagle Free dimension), 4mm holes.
Inventor Part (*.ipt), Drawing (*.pdf)
SparkFun SEN-09652 Triple Axis Accelerometer Breakout Board
Inventor Part (*.ipt), Drawing (*.pdf)
Most of these I measured by hand, so take the dimensions with a grain of salt, but they should be accurate within a millimeter or so.
Labels:
3D modeling,
machines,
RBT379,
student projects,
UAT
Sunday, May 16, 2010
Human Tetris
Cornell Students Adam Papamarcos and Kerran Flanagan have built an awesome set of small games using micro-controller based video processing. The details of the build (excellently documented - my students should take note) are provided at the Cornell Project Website, and more videos detailing how the system works are available at Engadget.
Labels:
algorithms,
Art,
games,
machine vision,
student projects,
video
Thursday, May 13, 2010
RBT337 Final Project: Augmented Reality Pong
Dan Willinger is back with his final project for RBT337 - Digital Vision and Sensor Processing. Using OpenCV, Dan implemented an augmented reality Pong clone that tracks the size and location of two white objects (pens in the demo video) that act as the paddles in the game. The length of each white object also sets the size of its paddle.
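The core tracking step can be sketched in a few lines. This is my simplification of the idea rather than Dan's actual OpenCV code: threshold the frame to isolate a bright white object, then read off its position and extent:

```python
import numpy as np

# Simplified illustration (not the student's code): find the bright
# white object in a grayscale frame; its vertical center becomes the
# paddle position and its height becomes the paddle size.

def track_paddle(gray, threshold=200):
    """gray: 2-D uint8 array. Returns (center_row, height) of the
    bright region, or None if nothing exceeds the threshold."""
    rows = np.any(gray > threshold, axis=1)   # which rows contain white pixels
    idx = np.flatnonzero(rows)
    if idx.size == 0:
        return None
    top, bottom = idx[0], idx[-1]
    return float((top + bottom) / 2), int(bottom - top + 1)

# Synthetic 100x100 frame with a white "pen" spanning rows 30..59:
frame = np.zeros((100, 100), dtype=np.uint8)
frame[30:60, 45:50] = 255
print(track_paddle(frame))   # (44.5, 30)
```

A real version would filter noise and track two objects (one per player), but the threshold-then-measure structure is the same.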
Labels:
augmented reality,
games,
machine vision,
RBT337,
student projects,
UAT,
video
Tuesday, May 11, 2010
The Facade Printer
Automated Paintball Turret = Large Scale Printer.
This is a project I've wanted to build for years - my own concept was to construct a system powerful enough to paint a graphic on the side of a dorm at my alma mater from across campus.
Monday, May 10, 2010
Laser Command
Laser Command from Eiji Hayashi on Vimeo.
This awesome project was built by Eiji Hayashi at Carnegie Mellon University. What stands out is the use of LEDs as both display and input. The full design is available here.
Wednesday, May 5, 2010
UAT Educational Robot Platform
This is a motor test of the UAT Educational Robot Platform, developed as a class project by Dan Willinger and Stephen Harper. The robot will be used as a learning platform for advanced sensor interfacing and mobile autonomous system development at the University of Advancing Technology. This is one of the two robots that Dan and Stephen built, each costing about $1200.
Here are specs for the robot:
- Mini-ITX Motherboard with Atom 230 CPU (1.6 GHz)
- 2GB RAM
- 8GB Solid-State Storage (Compact Flash)
- Firewire
- WiFi
- PicoPSU DC-DC 150W Power Converter
- Track Chassis Kit
- Lithium-Ion Battery (25.9 V, 6.4 Ah)
- Phidget Motor Control Board
- Phidget USB Interface Board
- Running Ubuntu Server Edition (9.10)
- Custom Clear Lexan Chassis
RBT337 Final Project: Face Recognition
A demonstration of the final project for my course RBT337 - Digital Vision and Sensor Processing by Brittany Wilkerson and Casey Johnson. Their final project used OpenCV's face detection and SURF algorithm to identify faces in a live video feed.
RBT173 Final Project: Accelerometer Controlled Robot
A final project demonstration by Ryan Carmain, Kayla Bayens, and Leonard Hockett of their accelerometer-controlled robot. This project was for my course RBT173 - Introduction to Microcontrollers. The controller uses an accelerometer to sense the direction of gravity, which is sent over a serial connection to the robot and translated into motor commands.
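The tilt-to-motor mapping is the interesting part. Here is a sketch of the idea (my reconstruction, not the students' code - the axis names and scaling are assumptions):

```python
# Reconstruction of the controller-to-robot mapping: tilting the
# controller changes which way gravity projects onto the accelerometer
# axes, and that tilt becomes differential-drive motor commands.
# Axis conventions and limits below are assumptions.

def tilt_to_motors(ax, ay, max_tilt=0.5, max_speed=100):
    """ax, ay: accelerometer readings in g along the controller's
    forward and sideways axes. Returns (left, right) motor speeds
    in the range -max_speed..max_speed."""
    clamp = lambda v: max(-1.0, min(1.0, v / max_tilt))
    forward = clamp(ax)   # pitch forward/back -> drive
    turn = clamp(ay)      # roll left/right -> steer
    left = max_speed * max(-1.0, min(1.0, forward + turn))
    right = max_speed * max(-1.0, min(1.0, forward - turn))
    return left, right

# Tilted straight forward: both wheels full ahead.
print(tilt_to_motors(0.5, 0.0))   # (100.0, 100.0)
```

On the real hardware, the (left, right) pair would be sent as bytes over the serial link and turned into PWM duty cycles on the robot side.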
RBT173 Final Project: Text-to-Speech Twitter Robot
This is Andre Walker's final project for my course RBT173 - Introduction to Microcontrollers. The project is a text-to-speech robot that mimics emotions and reads specially tagged twitter messages out loud. It is a modified version of the GanzBot project.
Labels:
Arduino,
blinkenlights,
RBT173,
robot,
student projects,
UAT
Tuesday, May 4, 2010
RBT337 Final Project: Glyph Tracking
Mike's back, demonstrating his final project for UAT's Digital Vision and Sensor Processing course. In this video, Mike is demonstrating his SURF-based Glyph tracking system. Take it away, Mike!
RBT337 Final Project: Connect 4
For their semester project in UAT's Digital Vision and Sensor Processing course, Josh Butler and Mark Stoddard implemented an excellent Connect 4 augmented reality program that warns a user if 3 pieces of the same color are placed in a row by highlighting the warning area in green. If a set of 3 is blocked, it is eliminated as a possible "win."
This first video shows the program in operation, live, raw video in the top left, augmented video in the bottom left, and color filters on the right for red and black pieces.
This second video demonstrates some of the inner workings of the program.
The bottom left pane now shows how the program scans over all of the possible positions in the live feed, determining whether each location contains a red or black piece, or is empty. This information is used to populate an array internally, which is then checked for "3 in a row."
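That scan-then-check structure is easy to sketch. Here is my reconstruction of the "3 in a row" half (not Josh and Mark's actual code - the board encoding is an assumption):

```python
# Reconstruction of the threat check: the vision pass fills a 6x7 grid
# with 'R', 'B', or None, then every 4-cell window is scanned for three
# same-color pieces plus one empty cell -- the empty cell is the square
# that would be highlighted in green.

ROWS, COLS = 6, 7
DIRECTIONS = [(0, 1), (1, 0), (1, 1), (1, -1)]  # horizontal, vertical, two diagonals

def threats(board):
    """Yield (row, col) of every empty cell that would turn an
    existing 3-in-a-row into a win."""
    for r in range(ROWS):
        for c in range(COLS):
            for dr, dc in DIRECTIONS:
                if not (0 <= r + 3 * dr < ROWS and 0 <= c + 3 * dc < COLS):
                    continue  # window runs off the board
                cells = [(r + i * dr, c + i * dc) for i in range(4)]
                line = [board[x][y] for x, y in cells]
                for color in ('R', 'B'):
                    if line.count(color) == 3 and line.count(None) == 1:
                        yield cells[line.index(None)]

board = [[None] * COLS for _ in range(ROWS)]
board[5][1] = board[5][2] = board[5][3] = 'R'   # three red in the bottom row
print(sorted(set(threats(board))))              # [(5, 0), (5, 4)]
```

Note that three in a row with open ends produces two threat squares, which matches the program's behavior of highlighting every warning area.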
RBT337 - Optical Flow
This is another assignment in UAT's Digital Vision and Sensor Processing course. In this laboratory, students are tasked with implementing and comparing optical flow algorithms, one using Lucas Kanade, and another using SURF.
Here is Mike Peters demonstrating optical flow using the Lucas Kanade algorithm:
And the SURF Algorithm:
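The Lucas Kanade step itself fits in a handful of lines. Here is a minimal single-window version in Python with NumPy - an illustration of the underlying math, not the OpenCV-based lab solution:

```python
import numpy as np

# Single-window Lucas-Kanade (illustrative): estimate the (dx, dy)
# translation between two frames by least-squares over image gradients.

def lucas_kanade(frame1, frame2):
    """frame1, frame2: 2-D grayscale arrays of equal shape.
    Returns the estimated (dx, dy) motion over the whole window."""
    f1 = np.asarray(frame1, dtype=float)
    f2 = np.asarray(frame2, dtype=float)
    Iy, Ix = np.gradient(f1)        # spatial gradients (rows, then cols)
    It = f2 - f1                    # temporal gradient
    # Normal equations of  Ix*dx + Iy*dy + It = 0  over all pixels:
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    dx, dy = np.linalg.solve(A, b)
    return dx, dy
```

Feeding it two smooth synthetic frames where the second is the first shifted one pixel to the right recovers a motion estimate close to (1, 0). Practical implementations apply this per-window around tracked corners and iterate over image pyramids to handle larger motions.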
RBT337 - Object Tracking
As one of the laboratory assignments in the UAT Digital Vision and Sensor Processing course, students implement the OpenCV SURF algorithm on a live video feed. Here are some example videos of what my students produced. In the videos, the white lines indicate the tracking of matched features in one image (usually a target) to another (the live video).
(By Josh Butler)
(By Leonard Hockett)
(By Ryan Carmain)
(By Mike Peters)