This is the blog for the Robotics and Embedded Systems program at the University of Advancing Technology in Tempe, AZ. Find out more about our program at majors.uat.edu/robotics/
This article covers an educational video game (where the game is programming) called "Alice," from Carnegie Mellon, that can teach the introductory concepts behind programming.
I think it's very cool that it's available online for free, and what's more, I can recommend it to kids and adults alike to help them learn programming! I installed the latest "Alice 3" and while I haven't had much chance to play with it in depth, I can see some kids liking this a lot.
About 10 percent of the nation’s colleges now use Alice, an open-source, graphical software program available free online that allows users to learn the very basics of programming -- concepts like iteration, if statements and methods -- while making 3-D animations. It also has a textbook to accompany it if needed.
Can you say biocomputer? It's very cool in a geeky programmer/mad scientist kind of way to hear about being able to program living cells to become very tiny computers! Check out the article here!
"The purpose of programming cells is not to have them overtake electronic computers, rather, it is to be able to access all of the things that biology can do in a reliable, programmable way."
This is slightly old news though, as you can see here. Last year another group created bacterial computers capable of solving a classic mathematical problem known as the Hamiltonian Path Problem. That was an extension of previous work, a year earlier, that produced bacterial computers able to solve the Burnt Pancake Problem.
Phone apps don't get much cooler than this! The application looks for characters within an image, assembles the characters into words, then literally looks up each word in the dictionary, and then (the coolest part) not only draws the translated words over the original, but replaces them entirely! Surely, with the characters recognized, it was simple enough to take out that color information and fill it in from the surrounding image, but wow, they did a good job. I can't wait to have someone with an iPhone try this out for me in person!
OK OK OK... so I read the paper, and as expected, at the end there was a whole ton of "this is what I think people should do" from the author. But nestled in there was an interesting account of evolutionary change at work. While the paper is slightly laughable, nobody can take away the fact that the author bred flies that lived several times longer than average. Below is the only comment of note on the subject, but after a little more research I found this link, which gives a far more detailed view of how it all came about.
He began his work on fruit flies by tricking natural selection to produce what eventually became “Methuselah flies,” for which he is well known. The trick? Take the eggs from fruit flies that have maintained enough of their physiological function to reproduce in old age, and repeat.
Selection for late-life reproduction eventually made longer-lived fruit flies. This delayed-reproduction lineage, Rose showed, lives up to five times longer than average.
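The selection protocol itself is simple enough to caricature in code. This toy simulation is entirely my own sketch (the single "lifespan" gene, the numbers, and the rising age cutoff are all invented for illustration, not taken from Rose's work): keep breeding only from individuals that can still reproduce past an ever-later cutoff, and the population's lifespan drifts upward.

```python
import random

random.seed(42)

def evolve_late_breeders(pop_size=200, generations=50, breeding_age=60):
    # Each fly's genome is reduced to a single heritable "lifespan" number;
    # all parameter values here are invented for illustration.
    population = [random.gauss(80, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # Rose's trick: only flies still able to reproduce past the
        # age cutoff contribute eggs to the next generation.
        parents = [f for f in population if f > breeding_age]
        population = [random.choice(parents) + random.gauss(0, 5)
                      for _ in range(pop_size)]
        # Ratchet the cutoff upward so selection keeps pushing later
        # (capped just below the current maximum so parents always exist).
        breeding_age = min(breeding_age + 2, max(population) - 1)
    return sum(population) / pop_size
```

Run it and the mean lifespan climbs well past the starting average of 80 — no explicit "live longer" objective anywhere, just selection for late reproduction.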
This page features an in-depth explanation, code included, of how the author went about creating a GA Sudoku solver! It's also a fun read, as the writer has an entertaining personality.
"I remember that I once read something from someone who tried a GA C-library on Sudoku and concluded that it was not a suitable problem. If I could solve it with my slick library, that random person on the internet, whose web page I might never find again but who may exist as far as you know, would certainly be proven wrong. A worthy cause."
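The GA ingredients for Sudoku are easy to sketch. This is not the author's library — just a minimal, hypothetical fitness function (count constraint violations; zero means solved) and the row-swap mutation such solvers typically use when each row is kept as a permutation of 1–9:

```python
import random

random.seed(1)

def fitness(grid):
    """Count duplicates across rows, columns, and 3x3 boxes;
    0 means the 9x9 grid is a valid Sudoku solution."""
    conflicts = 0
    for i in range(9):
        row = grid[i]
        col = [grid[r][i] for r in range(9)]
        conflicts += (9 - len(set(row))) + (9 - len(set(col)))
    for br in range(0, 9, 3):
        for bc in range(0, 9, 3):
            box = [grid[br + r][bc + c] for r in range(3) for c in range(3)]
            conflicts += 9 - len(set(box))
    return conflicts

def mutate(grid):
    """Swap two cells within one row, so each row stays a
    permutation of 1..9 and only columns/boxes can conflict."""
    g = [row[:] for row in grid]
    r = random.randrange(9)
    a, b = random.sample(range(9), 2)
    g[r][a], g[r][b] = g[r][b], g[r][a]
    return g
```

From there a GA is just: keep a population of candidate grids, prefer low-fitness ones as parents, and mutate — which is exactly where the "is Sudoku GA-suitable?" argument starts.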
For anyone who enjoys the topic, or just needs a good place to do research on machine learning, this is a great resource I've found that is extremely well referenced and full of content. "Find a bug in a program, and fix it, and the program will work today. Show the program how to find and fix a bug, and the program will work forever." - Oliver G. Selfridge
Tinkercell: this cool program is intended to make it much easier to create and simulate life on a genetic level! More detail.
"The package has a library of the components of life, from which users can pick different cells, membrane proteins, fluorescent proteins, enzymes and genes to create their organism. Tinkercell can then simulate the life form to see if it functions as expected."
Watch Craig Venter explain how his team of researchers created a new life form – and what happens next
When asked whether this latest venture wasn't, at least a bit, like "playing God," his answer was swift and to the point. "No, we're not playing anything. We're learning the rules of life."
The online submission deadline has passed, the submitted AIs have duked it out, and a winner has been declared, but the fun isn't over for those of you who missed the chance! Here is the page with starter kits in many different languages that will allow you to program an AI to play this game!
While you may need an old operating system to play it, it's still cool to see an evolutionary algorithm put into game AI. This site hosts the download (source code included) for an ALife game called bSerene, in which monsters learn to play against the player via artificial life. The behavior of the monsters is governed by about 50 parameters, which are used as genes in a genetic algorithm. At the end of every arena, the monsters that have inflicted the most damage on the players are saved to a file and used as the parents of monsters in subsequent games. Monsters that are close to another monster that manages to hit you get a percentage of the credit.
Even without looking at the source code, you can change the appearance of the program by putting in your own pictures for the explosions, power-ups, and projectiles, or you can design your own arenas by simply modifying some text files in the game folder. The source code and executable game are provided for download here.
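The breeding scheme described above is easy to sketch. This is not bSerene's actual code — the crossover style, mutation rate, and parent count here are my own guesses — but it shows the core idea of using damage dealt as the fitness signal:

```python
import random

random.seed(7)

N_GENES = 50  # the game reportedly uses about 50 behavior parameters

def select_parents(monsters, damage, k=4):
    """Keep the k monsters that dealt the most damage to the player;
    these are the genomes saved to file at the end of an arena."""
    ranked = sorted(range(len(monsters)), key=lambda i: damage[i],
                    reverse=True)
    return [monsters[i] for i in ranked[:k]]

def breed(parents, pop_size, mutation_rate=0.05):
    """Build the next arena's monsters: uniform crossover of two
    random parents, plus occasional Gaussian mutation per gene."""
    children = []
    for _ in range(pop_size):
        mom, dad = random.sample(parents, 2)
        child = [random.choice(pair) for pair in zip(mom, dad)]
        child = [g + random.gauss(0, 0.1) if random.random() < mutation_rate
                 else g for g in child]
        children.append(child)
    return children
```

The neat part of bSerene's design is that the fitness evaluation is just playing the game — the player unknowingly acts as the selection pressure.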
The following project description is an entry in the Visual Representation of Learn, Experience, Innovate category of the Academic Palloza, Fall 2010 competition held at the University of Advancing Technology.
The overall design process is actually fairly simple. The circuit design shown in Figure 1 below shows the simple circuit used for each LED to be lit.
Figure 1. Schematic.
Figure 1 is easy to understand. Signal goes from a digital I/O pin, through an LED and a 150 ohm resistor, and to ground. This can be repeated for as many pins as the user wants. The software is set up to run a 5 x 9 grid of LEDs. It uses a 5 x 15 array that represents all of the visible LEDs plus extra columns off the right edge, and the text scrolls to the left.
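The scrolling itself is just a column shift over that wider array. Here is the idea in Python for brevity (the helper names are invented; the real implementation is the Arduino source further down):

```python
ROWS, COLS = 5, 15          # 5 x 9 visible grid plus off-screen columns

def scroll_left(frame):
    """Shift every row one column to the left; the leftmost column
    falls off and a blank column enters on the right."""
    return [row[1:] + [0] for row in frame]

def visible(frame, width=9):
    """The 5 x 9 window of LEDs that is actually wired up."""
    return [row[:width] for row in frame]
```

New characters get drawn into the off-screen columns, and repeated calls to the shift march them across the visible window.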
Below are several photos from the testing phases of the design and implementation of software. Below that is the source code, written for an Arduino Mega.
Early testing phase of the design and wiring.
Testing of everything wired.
The video above shows an early version, before the resistors were added to the LEDs (a very bad choice) and before the serial communication was working.
Things are moving fast for the RoboWargames team, in and out of UAT. The team (headquartered in the 244 hardware lab) took their stock Parallax rover out for a spin yesterday after refueling it with both hydraulic fluid and gasoline. After a short warm-up and a few minor hiccups, the rover was soon getting carried away with itself (under the control of a remote, of course). The max speed of the rover is ~15 mph, but due to the hydraulic settings it was cruising at a good ~10 mph.
The rover will eventually be pimped out with a laser range finder, onboard cameras, various sensors, a sleek casing, and autonomous programming. It will be our entry into the IGVC (Intelligent Ground Vehicle Competition) in the summer of 2011.
Greg James, Barry Silverman and Brian Silverman from Visual6502.org have been working for the last year on a transistor-level visual simulation of the 6502 microprocessor. Their work is incredibly detailed and very interesting. The 6502 design is a classic processor, and very important in computer history as it was used in the Apple I & II, Commodore, Atari and Nintendo computing systems.
In the summer of 2009, working from a single 6502, we exposed the silicon die, photographed its surface at high resolution and also photographed its substrate. Using these two highly detailed aligned photographs, we created vector polygon models of each of the chip's physical components - about 20,000 of them in total for the 6502. These components form circuits in a few simple ways according to how they contact each other, so by intersecting our polygons, we were able to create a complete digital model and transistor-level simulation of the chip.
The CREATE Lab at Carnegie Mellon University is sponsoring a contest to develop efficient battery-management algorithms for electric vehicles. At the end of every month, high-performing entries will be awarded small prizes ($100 in Amazon gift cards), with a grand prize (possibly an actual electric vehicle) awarded at the end of the competition. At the end of each monthly judging phase, entries are open-sourced for all to see and share.
Researchers at the Ecole Polytechnique in Lausanne have developed the marXbot and a snazzy battery-exchange system. The batteries are modular and charge in a kind of Ferris wheel. When the robot runs low on juice, it docks with the station, and the dead battery is removed from the robot chassis by a manipulator. During the swap, robot power is provided by an alternate ultracapacitor-based system that can provide 15 seconds of power for the critical systems (i.e., only maintaining CPU memory state). The wheel rotates to a fresh battery, and the manipulator pushes it into the chassis, completing the hot-swap.
Graffiti Analysis: Sculptures is a series of new physical sculptures that I am making from motion-tracked graffiti data. New software (GA 3D) imports .gml files (Graffiti Markup Language) captured using Graffiti Analysis, creates 3D geometry based on the data, and then exports a 3D representation of the tag as a .stl file (a common file format compatible with most 3D software packages including Blender, Maya and 3DS Max). Time is extruded in the Z dimension and pen speed is represented by the thickness of the model at any given point. I then have this data 3D printed to create a physical sculpture that serves as a data visualization of the tag. For the Street and Studio exhibition at the Kunsthalle Wien, I collaborated with an anonymous local Viennese graffiti writer and had the GA sculpture printed in ABS plastic. Graffiti motion data of his tag was captured in the streets (for the first time) at various points around Vienna.
More information (including software, source code, and many more pictures) can be found at Evan's website.
Robert Wood of Harvard University, Daniela Rus and Erik Demaine at the Massachusetts Institute of Technology take us to a new level of robotic creepiness with self-folding origami. The device is able to morph itself into new shapes using shape-memory alloys that line the edges of triangular modules. Each module also sports a strong magnet that holds the final form after a module has executed a desired fold.
Last week I built this shadow sculpture, inspired by the cover of the Pulitzer Prize-winning book "Gödel, Escher, Bach: An Eternal Golden Braid." It is a 3D-printed form that casts three orthogonal shadows in the shapes of the letters UAT. After I printed it, I used some blue LEDs and foam-core to build a little display platform for it.
Over 6 months ago we added two kittens (Duke and Ella) to our household, and we found out that our dog, Riley, was eating their food. To prevent this, I built the Guardian, an AVR-based protection system that humanely keeps Riley away from the cat food, but is not triggered by the cats themselves.
I wanted this setup to be cheap, so I decided to use an infrared "trip-wire" system. On one side of the hallway is an always-on infrared emitter, and on the other side is an infrared receiver, an AVR ATmega8 microcontroller, and a buzzer. The microcontroller continuously checks to see if the beam between the emitter and receiver has been broken; if it has, it sounds the alarm, which Riley hates, driving him away. The microcontroller is programmed so that small, fast things (such as cat tails) do not trigger the alarm. Also, the whole rig is placed at Riley's shoulder height, which is well above the cats' bodies.
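The "small, fast things" filter is essentially a debounce: only sound the alarm if the beam stays broken for long enough. Here's the idea sketched in Python (the real firmware runs on the ATmega8, and the threshold value below is illustrative, not the one I actually used):

```python
BREAK_THRESHOLD = 20   # consecutive broken-beam polls needed (illustrative)

def guardian(samples, threshold=BREAK_THRESHOLD):
    """Given a stream of beam readings (True = beam broken), return the
    sample indices at which the alarm would fire. A quick cat tail breaks
    the beam for only a few polls and never reaches the threshold; a
    dog's body holds it broken long enough to trigger."""
    fired = []
    run = 0
    for i, broken in enumerate(samples):
        run = run + 1 if broken else 0
        if run == threshold:
            fired.append(i)
    return fired
```

Tune the threshold against your poll rate: it just needs to be longer than a tail-flick and shorter than a dog's stroll past the beam.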
This may not work for most dogs, as Riley is a pansy and afraid of loud noises, but here are the schematics, source code, and a rough bill of materials:
This weekend I built a Tensegrity structure as a demonstration of principles of balancing tension and compression in structures for my course RBT379 - Mechatronics. I've been fascinated by the aesthetic of these structures - all triangles and edges, but somehow magical as your brain tries to wrap itself around the reasons why the solid supports seem to be floating in air, and the structure is still standing.
I found a couple of excellent pages documenting how the structures work and are assembled - one from the artist Kenneth Snelson explaining the aesthetic inspiration for Tensegrity structures as a kind of 3-dimensional weaving, and another more practical site detailing the construction of a Tensegrity coffee table.
I used the UAT 3D printer to make 12 struts. You can make these out of much simpler materials (wood, copper, chopsticks, etc.), but the Inventor part design for the struts is provided here. A gallery of the build process and the completed 3-level structure (and one rubber-band version) is below.
I used 20lb test mono-filament fishing line and 2 sets of good smooth-jaw pliers to tie the knots (grip-jaw pliers scar the fishing line, causing breaks after you've carefully tied many little knots), following the guidelines at the Tensegrity coffee table website. If you do this yourself, construct a jig that will allow you to accurately create the tension links at a certain distance (the length of your struts divided by 1.4).
I built 3 modules, 2 "left-handed" and 1 "right-handed." To assemble them into the tower, I hooked the struts of one module into the base triangle of tension elements of another, forcing the base triangle into a hexagon.
The resulting structure is surprisingly sturdy, and the fishing line is somewhat transparent, adding to the mystical effect I was trying to emphasize.
I've always been fascinated by Tesla Coils. The operating principle is so elegant - you set up one circuit (your power supply) that oscillates at some high frequency, and you set up another circuit that uses the earth to resonate with your power circuit. Energy is transferred and converted from current to voltage, and the end result is man-made lightning - an incredible display of light and sound.
Newer versions of the Tesla coil allow control of the resonant frequency. Have you ever noticed the sound that a fluorescent light makes? That kind of "buzz"? You're hearing the frequency of the power that is coming into your house (around 60 Hz). When you control the resonant frequency of the Tesla coil, you control the tone that it generates, and thus the Singing Tesla Coil is born. This is not lightning set to music - the sound is generated by the spark itself.
The band ArcAttack has integrated a pair of singing Tesla coils into their repertoire, with epic results:
For this semester's Mechatronics course (RBT379) the semester project is to build a self-balancing two-wheeled robot from scratch. Over the course of the semester we will be designing the frame of the robot, constructing the control schematic, laying out a PCB, and finally programming the micro-controller to self-balance using a PID loop.
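The balancing part boils down to a classic PID controller: the error is the measured tilt from vertical, and the output is the motor command. A minimal sketch of the loop (in Python for clarity — the real thing will run on the micro-controller, and the gains here are placeholders, not tuned values):

```python
def make_pid(kp, ki, kd, dt):
    """Textbook PID loop: output = kp*e + ki*integral(e) + kd*de/dt.
    For a balancer, e is the tilt angle and the output drives the wheels."""
    state = {"integral": 0.0, "prev": 0.0}
    def step(error):
        state["integral"] += error * dt
        derivative = (error - state["prev"]) / dt
        state["prev"] = error
        return kp * error + ki * state["integral"] + kd * derivative
    return step
```

Leaning forward produces a positive error and drives the wheels forward, under the robot's fall; tuning kp, ki, and kd against the real hardware is most of the semester's fun.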
The first part of the semester we're using Autodesk Inventor to design and layout the frame of the robot. To help with this process I've modeled the major components. I had a really difficult time finding models of these online, so I'm sharing them with the world:
Cornell Students Adam Papamarcos and Kerran Flanagan have built an awesome set of small games using micro-controller based video processing. The details of the build (excellently documented - my students should take note) are provided at the Cornell Project Website, and more videos detailing how the system works are available at Engadget.
Dan Willinger is back with his final project for RBT337 - Digital Vision and Sensor Processing. Using OpenCV, Dan implemented an augmented reality Pong clone that tracks the size and location of two white objects (pens in the demo video) that act as the paddles in the game. Also, the length of the white object can change the size of the paddle.
This is a project I've wanted to build for years - my own concept was to construct a system powerful enough to paint a graphic on the side of a dorm at my Alma Mater from across campus.
This awesome project was built by Eiji Hayashi at Carnegie Mellon University. What stands out is the use of LEDs as both display and input. The full design is available here.
This is a motor test of the UAT Educational Robot Platform, developed as a class project by Dan Willinger and Stephen Harper. The robot will be used as a learning platform for advanced sensor interfacing and mobile autonomous system development at the University of Advancing Technology. This is one of the two robots that Dan and Stephen built, each costing about $1200.
A demonstration of the final project for my course RBT337 - Digital Vision and Sensor Processing by Brittany Wilkerson and Casey Johnson. Their final project used OpenCV's face detection and SURF algorithm to identify faces in a live video feed.
A final project demonstration by Ryan Carmain, Kayla Bayens, and Leonard Hockett of their Accelerometer controlled robot. This project was for my course RBT173 - Introduction to Microcontrollers. The controller uses an accelerometer to sense the direction of gravity, which is sent over a serial connection to the robot, and translated into motor commands.
This is Andre Walker's final project for my course RBT173 - Introduction to Microcontrollers. The project is a text-to-speech robot that mimics emotions and reads specially tagged twitter messages out loud. It is a modified version of the GanzBot project.
Mike's back, demonstrating his final project for UAT's Digital Vision and Sensor Processing course. In this video, Mike is demonstrating his SURF-based Glyph tracking system. Take it away Mike!
For their semester project in UAT's Digital Vision and Sensor Processing course, Josh Butler and Mark Stoddard implemented an excellent Connect 4 augmented reality program that warns a user if 3 pieces of the same color are placed in a row by highlighting the warning area in green. If a set of 3 is blocked, it is eliminated as a possible "win."
This first video shows the program in operation, live, raw video in the top left, augmented video in the bottom left, and color filters on the right for red and black pieces.
This second video demonstrates some of the inner workings of the program.
The bottom left pane now shows how the program scans over all of the possible positions in the live feed, determining whether each location contains a red or black piece, or is empty. This information is used to populate an internal array, which is then checked for "3 in a row."
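The "3 in a row" check itself is a simple scan of that array in four directions. This is not Josh and Mark's actual code — the board size and cell encoding here are assumed — but it shows the shape of the check:

```python
def three_in_a_row(board):
    """Scan a 6-row x 7-column board (0 = empty, 'R'/'B' = pieces) and
    return every coordinate triple holding three of a kind in a line."""
    hits = []
    directions = [(0, 1), (1, 0), (1, 1), (1, -1)]  # -, |, \, /
    for r in range(6):
        for c in range(7):
            if board[r][c] == 0:
                continue
            for dr, dc in directions:
                cells = [(r + i * dr, c + i * dc) for i in range(3)]
                if all(0 <= rr < 6 and 0 <= cc < 7 and
                       board[rr][cc] == board[r][c] for rr, cc in cells):
                    hits.append(cells)
    return hits
```

Each hit is a candidate warning; checking whether the extension cells on either end are open (or blocked) decides whether to highlight it, as in their "eliminated as a possible win" rule.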
This is another assignment in UAT's Digital Vision and Sensor Processing course. In this laboratory, students are tasked with implementing and comparing optical flow algorithms, one using Lucas Kanade, and another using SURF.
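The heart of Lucas-Kanade is a tiny least-squares problem: the brightness-constancy assumption gives Ix·d ≈ -It, solved over a window. Here is an illustrative 1-D version of that core (my own sketch, not the students' OpenCV lab code):

```python
def lucas_kanade_1d(frame1, frame2):
    """Estimate a single sub-pixel displacement d between two 1-D
    'frames' by least squares over the whole signal:
    d = -sum(Ix * It) / sum(Ix^2)."""
    num = den = 0.0
    for x in range(1, len(frame1) - 1):
        ix = (frame1[x + 1] - frame1[x - 1]) / 2.0   # spatial gradient
        it = frame2[x] - frame1[x]                   # temporal gradient
        num += ix * it
        den += ix * ix
    return -num / den
```

The 2-D version solved in the lab is the same idea with a 2x2 system per tracking window, which is what OpenCV's pyramidal Lucas-Kanade implementation does at multiple scales.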
Here is Mike Peters demonstrating optical flow using the Lucas Kanade algorithm:
As one of the laboratory assignments in the UAT Digital Vision and Sensor Processing course, students implement the OpenCV SURF algorithm on a live video feed. Here are some example videos of what my students produced. In the videos, the white lines indicate the tracking of matched features in one image (usually a target) to another (the live video).
I found this gem in a gallery of vintage robots featured in Popular Science.
This robot, called "The Beetle" weighed 170,000 pounds and cost $1.5 Million in 1962. It was originally designed as a "mechanic's suit" for some kind of half-baked nuclear-powered airplane. There are more awesome pictures in the original article, particularly of the "pilot" sitting behind 2 feet of leaded glass.
The whole thing looks like something out of the movie Aliens.
CHARLI is actually a series of robots that initially consists of the 5-foot tall CHARLI-L (or lightweight, pictured above), and the forthcoming CHARLI-H (or heavy), both of which are completely autonomous, with a full range of movements and gestures thanks to a series of pulleys, springs, carbon fiber rods, and actuators (not to mention some slightly more mysterious AI). What's more, while CHARLI-L is currently restricted to walking on flat surfaces, CHARLI-H promises to be able to walk on the uneven ground around the Virginia Tech campus, and eventually even be able to "run, jump, kick, open doors, pick up objects, and do just about anything a real person can do."
This robot reminds me a lot of the machines from I, Robot. I can't wait until machines like these are ninja-ing around the world.
This year they've added a level-generation aspect to the competition.
The level generation track of the competition is about creating procedural level generators for Infinite Mario Bros. Competitors will submit Java code that takes as input desired level characteristics, and output a fun level implementing these particular characteristics. The winner will be decided through live play tests.
Unlike its predecessor, this one has three axes. It was very challenging to build, with a total of 9 slip contacts, not including the motors. I made it from scrap I had lying around, and it took about a week to build. I use standard DC motors controlled with pulse-width modulation; the LEDs are controlled with a modified bike light with adjustable frequency.
New algorithms from NaturalMotion allow digital characters to dynamically and realistically respond to changes in their environment. What is most interesting about this work is that the methods that they use are not hard-coded - rather than completely and painstakingly modeling the motion of walking characters by hand, they use a mixture of physics modeling and evolutionary algorithms to allow the system to 'learn' how to react to the environment. This enables the characters first to learn about walking, then dynamically adapt to perturbations like pushes and hits from objects and other characters. This results in very robust and realistic motion.
Have you ever wanted to meet a Mythbuster in real life? Have you ever wanted to see behind the scenes of a ComBots event, and see how some of the world’s most badass robots are built? Now’s your chance to do both of those things at once! Enter the RoboGames 2010 Bots Behaving Badly Contest! Take a video, take a picture, or make a photoshop of a robot behaving badly, and submit it to YouTube, or Flickr. Get everyone you can to check it out, because the submission with the most views wins!!
How To Enter:
- Take a video or a picture, or make a photoshop of a robot behaving badly
- Upload it to YouTube or Flickr, with the tag "BadBots2010" (IF YOU DON'T, YOU WON'T BE ENTERED)
- Every submission MUST link prominently to RoboGames.net, and somehow be related to Bots Behaving Badly (this is at the judges’ discretion)
- Share your entry: send it to friends, post it on Digg, etc.
- The entry with the most views wins!
Prizes:
1st Place: 2 tickets to RoboGames 2010, 2 passes to the RoboGames Builder’s Party on Friday, April 23, a RoboGames Goodie Bag, and access to the pit at RoboGames (where contestants build their robots) with a meet-and-greet with Mythbusters’ Grant Imahara
2nd Place: 2 tickets to RoboGames 2010, 2 passes to the RoboGames Builder’s Party on Friday, April 23, 2 RoboGames T-Shirts, and 2 packs of RoboGames Trading Cards
3rd Place: 2 tickets to RoboGames 2010, a RoboGames T-Shirt and a pack of RoboGames Trading Cards
Judges Award: 1 ticket to RoboGames and a RoboGames T-Shirt
Deadline: 4/22/2010 at 11:59pm
Can’t make it to RoboGames? No problem! Alternatives to tickets will be also available to winners!
All rules subject to change without notice
The Fine Print:
· Participants agree to abide by all rules and decisions set by RoboGames.
· RoboGames reserves the right to reject an entry for any reason.
· Should a winner be unable to attend RoboGames 2010, RoboGames will determine an appropriate substitute of approximately equal value.
· RoboGames reserves the right to modify or cancel the contest at any time, at its sole discretion.
· By submitting your entry and entering this contest, you grant RoboGames royalty-free rights to publish, reproduce, or otherwise distribute your work commercially or by any other means.
· Governing Law: All issues and questions concerning the construction, validity, interpretation and enforceability of the official rules, or the rights of entrants, shall be governed by, and construed in accordance with, the substantive laws of the State of California and any applicable laws and regulations of the United States.
· You must be over the age of 18.
Recently a federal appeals court ruled against the Federal Communications Commission on net neutrality. In the case, Comcast was challenging the power of the FCC to tell Comcast how to manage its network, specifically pertaining to Comcast's ability to throttle BitTorrent network traffic.
Mike Davey presents his hand-built Turing Machine. The machine uses a Parallax Propeller chip for mechanical control, but actual computation is executed using the tape reel.
My goal in building this project was to create a machine that embodied the classic look and feel of the machine presented in Turing’s paper. I wanted to build a machine that would be immediately recognizable as a Turing machine to someone familiar with Turing's work.
Although this Turing machine is controlled by a Parallax Propeller microcontroller, its operation while running is based only on a set of state transformations loaded from an SD card and what is written to and read from the tape. While it may seem as if the tape is merely the input and output of the machine, it is not! Nor is the tape just the memory of the machine. In a way the tape is the computer. As the symbols on the tape are manipulated by simple rules, the computing happens. The output is really more of an artifact of the machine using the tape as the computer.
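To make that point concrete, here is a minimal Turing-machine interpreter (my own sketch, not Mike's Propeller code): the controller blindly applies (state, symbol) rules, and the answer appears on the tape.

```python
def run_turing(tape, rules, state="start", pos=0, max_steps=10000):
    """Run a Turing machine. rules maps (state, symbol) ->
    (write, move, next_state). The controller only applies these
    rules; all of the 'computing' lives on the tape."""
    tape = dict(enumerate(tape))      # sparse tape, blank = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(pos, 0)
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += move
    return [tape[i] for i in sorted(tape)]

# A tiny machine: binary increment, with the head starting on the
# low-order (rightmost) bit and carrying to the left.
inc_rules = {
    ("start", 1): (0, -1, "start"),   # carry: 1 -> 0, keep moving left
    ("start", 0): (1, 0, "halt"),     # absorb the carry: 0 -> 1, done
}
```

Run it on the tape 1011 (eleven) and the tape ends up reading 1100 (twelve) — two rules, no arithmetic anywhere in the controller.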
You can get more details about the construction of the machine at Mike's Site.
Two copies of the same circuit. The one on the right was done first, and is a pretty good rat's nest, if I do say so myself. The one on the left was built based on the other one, but with an eye toward clarity. Much easier to see what's going on! I guess that figuring out the layout based on a schematic is a skill that needs practice.
Researchers at the University of Liege in Belgium have made a breakthrough in machine vision. Background extraction is the separation of a "normal" background image from more interesting "new" pixels such as moving objects.
This new algorithm is very high performance and computationally efficient. Unfortunately, it's completely patented, but a demo video and a paper describing the method are linked below.
O. Barnich and M. Van Droogenbroeck. ViBe: a powerful random technique to estimate the background in video sequences. In International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2009), pages 945-948, April 2009. Available as an IEEE publication or on the University site.
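The core trick is refreshingly simple and can be sketched in a few lines. This is a simplified per-pixel caricature of the ViBe idea, not the patented method itself — the parameter values are typical defaults shown for illustration, and the real algorithm also propagates updates into neighboring pixels:

```python
import random

random.seed(0)

N_SAMPLES = 20   # background samples kept per pixel (illustrative)
RADIUS = 20      # intensity distance that counts as a "match"
MIN_MATCHES = 2  # matches needed to call a pixel background

def is_background(pixel, samples):
    """A pixel is background if it lies close to at least MIN_MATCHES
    of the random past samples stored for that location."""
    return sum(abs(pixel - s) <= RADIUS for s in samples) >= MIN_MATCHES

def update(pixel, samples, subsampling=16):
    """Conservative, random-in-time update for a pixel already
    classified as background: with probability 1/subsampling,
    replace one randomly chosen stored sample with the new value."""
    if random.randrange(subsampling) == 0:
        samples[random.randrange(len(samples))] = pixel
```

Because the stored samples are random past values rather than a running mean, the model copes well with flicker and slowly changing backgrounds, which is a big part of the reported performance.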
Here's a time-lapse video I made demonstrating the Autodesk Inventor interface and 3D printing. The model I'm making is a simple robot hand finger tip. The total time, from conception to physical model, is about an hour.
Here is another lab demonstration from some of my students in RBT173 - "Introduction to Micro-controllers."
The goal of the laboratory was to construct and program a simple breadboard "Simon" style game. Some of the challenges of this lab were construction of input and output circuits, and generating a random sequence of light blinks. The most challenging part of this lab for most students was recognizing user input and determining whether it matched the generated quiz sequence.
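Those two tricky pieces — generating the quiz and checking the player's presses — look roughly like this in Python (the actual labs run on hand-built micro-controller boards; function names here are my own):

```python
import random

def new_sequence(length, n_lights=4, rng=random):
    """Generate the quiz: a random sequence of light indices to blink."""
    return [rng.randrange(n_lights) for _ in range(length)]

def check_input(sequence, presses):
    """Compare button presses against the quiz one at a time, the way
    the breadboard game does: fail on the first wrong press, and
    require the full sequence to be entered."""
    for expected, got in zip(sequence, presses):
        if got != expected:
            return False
    return len(presses) == len(sequence)
```

On the micro-controller, the hard part is everything this sketch glosses over: debouncing the buttons, seeding the random generator from something unpredictable, and timing out slow players.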
This is the kind of thing that I'd like to see students produce in the Physical Computing Studio (RBT307) course next semester. Quick, dirty robots held together by hot glue and electrical tape. The idea is that once we strip away the requirement of long-term durability, we can move quickly through design iterations, and learn a great deal about functional, physical design and manufacturing methods.
When I was growing up, I was destined to be an engineer. I would go to the library and check out every book I could on robots - big picture books of robots, children's books on robots, even some academic books that I couldn't really understand as a child, but the diagrams and illustrations captivated me. I was enchanted by the idea of building my own machine.
Make has a short article on one of my favorites, and I'm now destined to spend the rest of my evening perusing the Old Robots website.
Build your own tools!
This ATtiny45-based dual-channel oscilloscope is home-made, cheap, and simple. It doesn't capture at high sample rates, but it's neat nonetheless.
A 9-cubic millimeter solar-powered sensor system developed at the University of Michigan is the smallest that can harvest energy from its surroundings to operate nearly perpetually. The U-M system’s processor, solar cells, and battery are all contained in its tiny frame, which measures 2.5 by 3.5 by 1 millimeters. It is 1,000 times smaller than comparable commercial counterparts.
Here is a short video of some of my freshman students' completed lab assignments.
This is an early laboratory assignment in a freshman-level course. The assignment takes the place of the typical "Hello World" blinking light program for micro-controllers. The course is RBT173: Introduction to Microcontrollers. We're using hand-built Arduino-compatible micro-controller boards and exploring all of their ins and outs with a series of weekly hardware / software labs culminating in the construction of a small mobile robot based on the board.
Feeling nostalgic for those frantic days in the digital design lab? Wish there were perhaps a game that put you in the place of an engineer, trying to design new ICs to meet the goals of some unspecified organization? Want to get your SI on? No idea what I am talking about?
Well then, let me introduce you to Kohctpyktop: engineer of the people, by Zachtronics Industries. It's the first game I've ever seen where you have to design integrated circuits as a challenge, sort of like pipe dream for electrical engineers.