A voice-activated, assistive robotic arm that retrieves objects from within its workspace and hands them to a user upon request. This project is built to help the elderly, disabled, and forgetful. Controlled by the Robot Operating System (ROS), the arm has five degrees of freedom and a two-finger gripper. It is composed of five stepper motors and one servo, driven by a RAMPS 1.4 board, an Arduino Mega, and two 3-axis stepper drivers.
The two input sensors are a multi-directional microphone and a camera. The Intel Speech Enabling Developer Kit is connected to a Raspberry Pi 3+ and Alexa Voice Service, and the processed voice input is sent to ROS over WebSockets. The RealSense camera detects AprilTags, visual fiducial markers similar to QR codes. To request an object from Knuckles, the user says: “Alexa trigger Knuckles to give me the bottle”.
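The voice pipeline ends with a transcribed request that has to be turned into a target for the arm. A minimal sketch of that step is below, assuming a simple lookup from spoken object names to AprilTag IDs; the object names and tag IDs are illustrative placeholders, not the project's actual configuration.

```python
# Sketch: map a transcribed voice request to an AprilTag target.
# The object names and tag IDs below are hypothetical examples.
OBJECT_TAGS = {
    "bottle": 3,
    "keys": 7,
    "remote": 12,
}

def parse_request(transcript: str):
    """Extract the requested object from an utterance like
    'Alexa trigger Knuckles to give me the bottle'."""
    words = transcript.lower().rstrip(".!?").split()
    for obj, tag_id in OBJECT_TAGS.items():
        if obj in words:
            return obj, tag_id
    return None  # no known object mentioned

print(parse_request("Alexa trigger Knuckles to give me the bottle"))  # ('bottle', 3)
```

In the real system the resulting tag ID would be published to a ROS topic, where the camera node searches for the matching AprilTag in the workspace.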
Tails is an autonomous drone that will optimize flight paths based on its surroundings while simultaneously avoiding collisions. Other planned features include auto-stabilization, a return-to-home function, and a friendly user interface. Its current application is delivering mail around the University of Houston’s Electrical and Computer Engineering Department.
Standard drones are controlled by a handheld transmitter that communicates with a flight controller, which handles low-level flight abstraction to achieve the user’s desired result; Tails’ flight controller instead receives input via pulse-width modulation generated by a Raspberry Pi running the Robot Operating System (ROS). ROS is the meta-operating system for Tails, providing tools for distributed computation and rapid testing, as well as access to well-tested algorithms for navigation, motion planning, and mapping. Using ROS, separate nodes written in Python were created for outputting signals (fcs_node), a control system based on a finite state machine (tails_node), and manual controls (teleop_node). The team is currently focusing on mapping and navigation through rooms.
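The finite-state-machine idea behind tails_node can be sketched as a transition table keyed by (state, event) pairs. This is a minimal illustration only; the state names and events below are assumptions, not the actual tails_node code.

```python
# Sketch of a finite-state machine like the one in tails_node.
# States and events are illustrative, not the project's actual code.
TRANSITIONS = {
    ("IDLE", "arm"): "ARMED",
    ("ARMED", "takeoff"): "HOVER",
    ("HOVER", "navigate"): "NAVIGATE",
    ("NAVIGATE", "arrived"): "HOVER",
    ("HOVER", "land"): "IDLE",
}

class TailsFSM:
    def __init__(self):
        self.state = "IDLE"

    def handle(self, event: str) -> str:
        """Advance to the next state, ignoring events that are
        invalid in the current state."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

fsm = TailsFSM()
for event in ["arm", "takeoff", "navigate", "arrived", "land"]:
    print(event, "->", fsm.handle(event))
```

Keeping the transitions in a table makes invalid commands (e.g. "land" while still on the ground) harmless no-ops, which is useful when the events arrive from a separate teleop node.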
Node Leap is a flexible, multi-purpose network architecture that allows real-time, bi-directional communication across devices. In addition to supporting Wi-Fi-enabled smart devices, Node Leap’s capabilities can be expanded with interfacing peripherals called “Lili pads,” which communicate with older or non-IoT devices over IR, RF, or Bluetooth. This allows seamless and inexpensive integration in environments where not every device is IoT-capable, while creating future upgrade paths.
The current lightbar demo platform showcases the capabilities of this architecture using 87 individually addressable RGB LEDs and a stereo speaker system. Though only audio over Wi-Fi currently works, the vision for the final product includes the ability to sync the lightbar with other nodes and to configure lighting modes and audio output from an app.
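The core routing idea — a hub that forwards each message over whichever transport reaches the target device, Wi-Fi for native nodes or a Lili pad bridge for legacy devices — can be sketched in a few lines. The device names and transport labels here are illustrative assumptions, not the actual Node Leap protocol.

```python
# Sketch of Node Leap-style routing: the hub looks up which transport
# reaches a device and forwards the message accordingly.
# Device names and transport labels are hypothetical.
class Hub:
    def __init__(self):
        self.devices = {}  # device name -> transport label

    def register(self, name: str, transport: str):
        self.devices[name] = transport

    def send(self, name: str, payload: str) -> str:
        """Return a description of how the message would be routed."""
        transport = self.devices.get(name)
        if transport is None:
            return f"unknown device: {name}"
        return f"{payload} -> {name} via {transport}"

hub = Hub()
hub.register("lightbar", "wifi")
hub.register("old_stereo", "lilipad-ir")   # legacy device behind a Lili pad
print(hub.send("lightbar", "set_mode:pulse"))
print(hub.send("old_stereo", "power_on"))
```

The point of the indirection is that senders never need to know whether the target is a native Wi-Fi node or an IR/RF/Bluetooth device behind a Lili pad.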
Capture the Hole is a multiplayer derivative of Tic-Tac-Toe that uses an array of lasers and photoresistors to identify which hole has been triggered and which player triggered it. The lights around the ring then change color to match the color of the projectile that passed through. The objective of the game is to light three rings in a row of the same color; however, players can capture holes held by opposing players, making the game much more challenging than traditional Tic-Tac-Toe.
The projectile colors are detected by a Python computer vision and motion-tracking program that we built for a Raspberry Pi. The script detects the color of moving objects within certain parameters to more accurately determine the projectile’s color as it travels toward the game board. While the object is approaching the board, its color is detected and transmitted to an Arduino over serial.
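The classification step — averaging the moving object's pixels and bucketing the result into one of the game colors — can be sketched with a hue test. The thresholds below are illustrative, not the project's actual tuning.

```python
import colorsys

# Sketch of the color-classification step: convert an averaged RGB
# reading to HSV and bucket the hue into a game color.
# Hue ranges are illustrative thresholds, not the project's tuning.
def classify_color(r: int, g: int, b: int) -> str:
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    hue = h * 360
    if s < 0.2 or v < 0.2:
        return "unknown"          # too washed out or too dark to call
    if hue < 20 or hue >= 340:
        return "red"
    if 40 <= hue < 80:
        return "yellow"
    if 80 <= hue < 170:
        return "green"
    if 200 <= hue < 260:
        return "blue"
    return "unknown"

print(classify_color(220, 30, 40))   # a red projectile
print(classify_color(30, 60, 230))   # a blue projectile
```

Working in HSV rather than raw RGB makes the decision far less sensitive to lighting, since brightness changes mostly move the value channel and leave the hue alone.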
An Arduino Mega was programmed in C to process the sensor readings, set the ring colors, and detect in real time whether a player has won. This is accomplished with the laser and photoresistor array: when a projectile breaks two intersecting laser paths, the system identifies the associated hole and, using the color information received from the Raspberry Pi, assigns the projectile’s color to that hole.
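Because each hole sits at the intersection of one horizontal and one vertical beam, the pair of broken beams uniquely names the hole, and the win check is the usual Tic-Tac-Toe line scan. A minimal sketch of that logic (in Python for readability; the Arduino version is in C, and the 3x3 wiring here is an assumption):

```python
# Sketch of hole identification and the win check. The pair of broken
# beams (row, column) names the hole; winning means three holes of one
# color in a row, column, or diagonal. Board size is an assumption.
SIZE = 3  # 3x3 grid of holes, as in Tic-Tac-Toe

def hole_from_beams(broken_row: int, broken_col: int) -> int:
    """Map a (row beam, column beam) pair to a hole index 0..8."""
    return broken_row * SIZE + broken_col

def has_won(board: list, color: str) -> bool:
    """Check all rows, columns, and diagonals for three of one color."""
    lines = [[r * SIZE + c for c in range(SIZE)] for r in range(SIZE)]
    lines += [[r * SIZE + c for r in range(SIZE)] for c in range(SIZE)]
    lines += [[0, 4, 8], [2, 4, 6]]  # the two diagonals
    return any(all(board[i] == color for i in line) for line in lines)

board = [None] * 9
for row, col in [(0, 0), (1, 1), (2, 2)]:       # red hits a diagonal
    board[hole_from_beams(row, col)] = "red"
print(has_won(board, "red"))   # True
```

Since captures simply overwrite a hole's color, no extra bookkeeping is needed: the win check runs over the current board state after every hit.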
A wand-based gesture control system that maps movement patterns as a user swings the device in the air. The concept would allow users to cast ‘spells’ and wirelessly control household devices such as speakers and televisions. The project is Arduino-based and connects to a Raspberry Pi for Bluetooth forwarding. The body of the wand is a custom 3D print based on the hilt of the Master Sword from the game The Legend of Zelda. The current build recognizes basic swings and can send control signals to the mini mobile robot, Pompeux.
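Basic swing recognition can be done by comparing each accelerometer reading's magnitude against a threshold above resting gravity. The sketch below shows the idea; the threshold and sample values are illustrative assumptions, not the wand's actual tuning.

```python
import math

# Sketch of basic swing detection from accelerometer samples: count
# threshold crossings, requiring the signal to fall back below the
# threshold between swings. All constants are illustrative.
SWING_THRESHOLD = 18.0  # m/s^2; well above ~9.8 for a hand at rest

def detect_swings(samples):
    """Count swings in a stream of (ax, ay, az) readings."""
    swings = 0
    in_swing = False
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > SWING_THRESHOLD and not in_swing:
            swings += 1       # rising edge: a new swing begins
            in_swing = True
        elif magnitude < SWING_THRESHOLD:
            in_swing = False  # back to rest; ready for the next swing
    return swings

# Two sharp swings separated by rests at roughly 1 g.
readings = [(0, 0, 9.8), (15, 12, 9.8), (0, 0, 9.8), (20, 5, 9.8), (0, 0, 9.8)]
print(detect_swings(readings))  # 2
```

The edge-triggered flag is what keeps one long swing from being counted many times while the magnitude stays above the threshold.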
Mobile droids controlled by wacky joystick calibrations, in which players have to complete a wall maze in the shortest time. We also have different configurations, such as the lock mechanism, the swerve control, and wiggle. Calibrations such as Wiggle are intentionally programmed so that the player has to wiggle the joystick in the direction the robot should go rather than simply pointing it. The robot is Arduino-based and built with DC motors, motor controllers, and custom-programmed joysticks. This project is used to train and educate students in the fundamentals of building mobile robots and manipulating joystick controls.
A game designed around student-built mobile robots that are controlled using intentionally unintuitive joystick calibrations. At IEEE UH Makers events, players must complete a (customizable) maze in the shortest time. The project served as a teaching tool during student-driven, interactive Arduino workshops on the fundamentals of DC motors, motor controllers, mobile robot building, and manipulating joystick controls.
Student learners and workshop coordinators created multiple versions of ‘wacky’ calibrations based on the projects assigned in the workshops. We lovingly named these calibrations Swerve, TikTok, and Wiggle (shown in the video).
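A calibration like Wiggle can be sketched as a filter on the joystick stream: the robot only moves along an axis after the stick has changed sign enough times on that axis, instead of simply following it. The deadzone and wiggle count below are illustrative assumptions, not the workshop code.

```python
# Sketch of a "Wiggle"-style calibration: require several sign changes
# on the joystick axis before driving in that direction.
# Thresholds are illustrative, not the actual workshop tuning.
WIGGLES_NEEDED = 3
DEADZONE = 0.3

def wiggle_drive(samples):
    """Return 'forward', 'reverse', or 'stop' from a stream of
    y-axis joystick readings in [-1, 1]."""
    sign_changes = 0
    last_sign = 0
    for y in samples:
        sign = (y > DEADZONE) - (y < -DEADZONE)  # -1, 0, or +1
        if sign != 0 and last_sign != 0 and sign != last_sign:
            sign_changes += 1
        if sign != 0:
            last_sign = sign
    if sign_changes < WIGGLES_NEEDED:
        return "stop"
    return "forward" if last_sign > 0 else "reverse"

print(wiggle_drive([0.8, -0.8, 0.8, -0.8, 0.8]))  # wiggling -> forward
print(wiggle_drive([0.8, 0.8, 0.8]))              # just pointing -> stop
```

Holding the stick in one direction never accumulates sign changes, which is exactly why pointing gets you nowhere in the Wiggle configuration.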
Pompeux is a redesign of the Arduino Nano-based mobile robots built for Wacky Races. Major changes include a smaller (more mobile) body, a new power switch, an onboard rechargeable battery, and a more accessible prototyping-board layout. The new, cleaner layout allows users to swap between an attachable joystick, the Wanduino (another IEEE UH Makers project), or wireless controls. The cabling is color-coded and visible for demonstrations. Our students want to add sensing and other machine learning capabilities to this mini mobile robot using Jetson Nanos.
An Arduino-based sequence game in which the player is shown a sequence of blinking colors and must mimic it by pushing the corresponding colored buttons. The objective is to keep mimicking the sequence until the player makes a mistake. Each time a sequence is completed successfully, a point is added and another random color is appended to the end of the sequence, making it harder to recall. Upon a loss, the final score is shown on the surface-mounted LED display.
The physical structure of the game includes four differently colored buttons. The game displays an initial sequence and then waits for the user to repeat it by pushing the buttons in the correct order. Buzzer tones correspond to the status of the game: each successful completion of a sequence is met with a winning tone, while a different tone marks the loss of the game.
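The round-handling described above — check the player's input, award a point, then append one random color — can be sketched in a few lines. This is shown in Python for readability; the actual game runs on an Arduino, and the round loop here is an illustrative simulation.

```python
import random

# Sketch of the sequence-game loop: after each successful round,
# score a point and append one random color to the sequence.
COLORS = ["red", "green", "blue", "yellow"]  # the four buttons

def extend(sequence, rng=random):
    """Add one random color to the end of the sequence."""
    return sequence + [rng.choice(COLORS)]

def check_guess(sequence, guess):
    """True if the player's button presses match the sequence exactly."""
    return guess == sequence

seq = extend([])
score = 0
# Simulate a player who repeats the sequence correctly for 5 rounds.
for _ in range(5):
    if check_guess(seq, list(seq)):
        score += 1
        seq = extend(seq)
print(score, len(seq))  # 5 rounds won; sequence grew to 6 colors
```

On a loss, `check_guess` returns False, the loop would break, and the final `score` is what the real game writes to the LED display.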
IEEE Lounge Security & Pest Detection
We will use computer vision to detect and record any potential snack thieves in our student organization lounge. Based on the recorded footage, we will create a plan of action to prevent snack theft.
Smart Security Camera
Package theft has become a massive problem in the Amazon age. We plan to create a smart surveillance system similar to commercially available security camera systems that can detect package deliveries and theft by identifying people and packages and tracking their movement. To make the camera achieve this level of intelligence, we plan to utilize the latest machine learning innovations.