
ELEC/COEN 490 CAPSTONE: March 24, 2014

PHASE THREE REPORT

TEAM 9

Human-Robot Communication
Through Voice-Command
Submitted by:
Venkata Lakshmi Mantha 9060203
Omer Khan 6168876
Yazen Alkouri 9231196
Mital Prajapati 9776796

Supervised by:
Dr. Wei-Ping Zhu
Mr. Sujan Kumar Roy


Abstract

This project involves the design of an autonomous robot controlled through human voice
commands. The robot moves on bipedal legs, and the user interacts with it through eight
commands: ready, go forward, back, turn right, left, slow, speed up, and end. MATLAB is used
for the speech recognition algorithm, and an STM32F3Discovery microcontroller drives the
servomotors of the bipedal robot. The bipedal chassis was designed in SolidWorks, taking
inspiration from several Lynxmotion designs. In the training phase of the system, the Mel-
Frequency Cepstral Coefficients (MFCCs) of a voice signal are extracted and saved in the speech
recognition database. In the testing phase, incoming voice commands are matched against the
saved coefficients using an Artificial Neural Network algorithm. When a command is recognized,
a signal is sent wirelessly from the PC to the robot, whose microcontroller interprets it and
converts it into mechanical movement using four servomotors on each leg. Now that we have
reached the final phase of the project, we present our deliverable, a robot named Gump, with
all its specifications and features.

I|Page
Table of Contents
1 Introduction .......................................................................................................................................... 1
1.1 Objectives……………………………………………………………………………………………………………………... 1
1.2 Project Outline………………………………………………………………………………………………………………. 1
2 Project Overview ................................................................................................................................... 3
2.1 Requirements ...................................................................................................................... 3
2.1.1 Functional Requirements…………………………………………………………………………………….. 3
2.1.2 Nonfunctional Requirements……………………………………………………………………………..… 4
2.2 Specifications……………………………………………………………………………………………………………………..… 4
2.2.1 Physical Specifications…………………………………………………………………………………………….. 4
2.2.2 Electrical Specifications…………………………………………………………………………………………… 4
2.2.3 Environmental Specifications…………………………………………………………………….……………. 5
2.2.4 Operational Specifications………………………………………………………………………….…………… 5
3 Design Process and Project Implementation ......................................................................................... 6
3.1 Proposed Design…………………………………………………………………………………………………………………. 6
3.2 Electrical Implementation…………………………………………………………………………………………………. 6
3.2.1 Overview of the System…………………………………………………………………………………………. 6
3.2.2 User-Side Circuit…………………………………………………………………………………………………….. 6
3.2.2.1 Overview………………………………………………………………………………………………….. 6
3.2.2.2 User-Side Schematic and PCB…………………………………………………………………… 7
3.2.2.3 Problems Encountered…………………………………………………………………………….. 8
3.2.3 Robot-Side Circuit…………………………………………………………………………………………………… 8
3.2.3.1 Overview…………………………………………………………………………………………………….8
3.2.3.2 Robot-Side Schematic and PCB…………………………………………………………………..9
3.2.3.3 Problems Encountered…………………………………………………………………………..… 10
3.3 Mechanical Implementation……………………………………………………………………………………………….. 11
3.3.1 Design Process……..……………………………………………………………………………………………….. 11
3.3.1.1 First Prototype……………………………………………………………………………………….. 11
3.3.1.2 Problems Encountered in the First Prototype………………………………………… 12
3.3.1.3 Second Prototype…………………………………………………………………………………… 13
3.3.1.4 Problems Encountered in the Second Prototype…………………………………….. 13
3.3.1.5 Third Prototype………………………………………………………………………………………. 14
3.3.1.6 Problems Encountered in the Third Prototype……………………………………….. 14

3.3.2 Overview of the Mechanical Movements and Design………………………………………….. 17
3.3.3 Material Selection……………………………………………………………………………………………….. 20
3.3.3.1 Stress Analysis………………………………………………………………………………………… 20
3.3.3.2 Stress Analysis Results………………………………………………………………..………….. 21
3.3.3.3 Mass Properties……………………………………………………………………………………… 22
3.4 Speech Recognition Implementation………………………………………………………………………………... 23
3.4.1 Overview……………………………………………………………………………………………………………... 23
3.4.2 Design Process……………………………………………………………………………………………………... 23
3.4.2.1 Choice of Recognition Method……………………………………………………………….. 23
3.4.2.2 Changes from Phase Two………………………………………………………………………… 23
3.4.2.3 Overview of Artificial Neural Networks……………………………..…………………….. 24
3.4.2.4 Development of the System…………………………………………………………………….. 25
3.4.2.5 System Database Formation…………………………………………………………………….. 25
3.4.2.6 Testing the System…………………………………………………………………………………... 26
3.4.3 Problems with Implementing the ANN Algorithm…………………………………………..…….. 26
3.4.4 Testing the Accuracy of the Speech Recognition System…………………..………………….. 27
3.5 Control Code Implementation…………………………………………………………………………………………….. 30
3.5.1 Overview of the Control Code……………………………………………………………………………….. 30
3.5.1.1 Handshake Protocol…………………………………………………………..…………………….. 31
3.5.2 Control Code In-Depth…………………………………………………………………………………………….31
3.5.3 Interrupts Execution………………………………………………………………………………………………..32
3.5.3.1 Timer Interrupt Execution………………………………………………………………………… 32
3.5.3.2 Object Detection and Balancing Interrupt……………………………………….……….. 32
3.5.4 Control Code Testing……………………………………………………………………………………………….34
3.5.4.1 Test 1: Servo Calibrations…………………………………………………………………………. 34
3.5.4.2 Test 2: Smooth Movements……………………………………………………………………… 34
3.5.4.3 Test 3: Appropriate Angles for Proper Movement and Shifting of Weight… 34
3.5.4.4 Test 4: Selecting Appropriate Interrupt Angles for Balancing……………………. 35
4 Deliverable……………………………………………………………………………………………………………………………………….. 36
4.1 Final Product Overview……………………………………………………………………………………………………….. 36
4.1.1 Product Name……………………………………………………………………………………………………….. 36
4.1.2 Gump Features……………………………………………………………………………………………………… 36
4.2 System Specifications…………………………………………………………………………………………………………. 38
4.3 Project Budget……………………………………………………………………………………………………………………. 39

4.3.1 Prototyping Costs………………………………………………………………………………………………….. 39
4.3.2 Final Design Costs…………………………………………………………………………………………………. 41
4.3.2.1 User-Side System…………………………………………………………………………………….. 41
4.3.2.2 Robot-Side System…………………………………………………………………………………… 42
4.3.3 Human Resources Cost………………………………………………………………………………………….. 44
4.3.4 Overall Project Cost……………………………………………………………………………………………….. 44
4.4 Impact on Environment………………………………………………………………………………………………………. 45
4.4.1 Mechanical Impact………………………………………………………………………………………………… 45
4.4.1.1 Sustainability Analysis Results…………………………………………………………………… 47
4.4.2 Electrical Impact on Environment………………………………………………………………………….. 48
5 Conclusion………………………………………………………………………………………………………………………………………… 50
5.1 Future Improvements…………………………………………………………………………………………………………. 50
5.2 Conclusion………………………………………………………………………………………………………………………….. 50
6 Bibliography……………………………………………………………………………………………………………………………………... 51
APPENDIX A Mechanical Drawings…………………………………………………………………………………………………………. 53
A.1 List of Parts…………………………………………………………………………………………………………………………. 53
A.2 Mechanical Drawings………………………………………………………………………………………………………….. 54
APPENDIX B Stress Analysis…………………………………………………………………………………………………………………….. 61
B.1 ABSplus Plastic Stress Analysis Results………………………………………………………………………………… 61
B.2 Aluminum 6061 T-6 Alloy Stress Analysis Results………………………………………………………………… 65
B.3 Acrylic Stress Analysis………………………………………………………………………………………………………….. 67
APPENDIX C Electrical Schematics…………………………………………………………………………………………………………… 69
C.1 User Schematic……………………………………………………………………………………………………………………. 69
C.2 Robot Schematic…………………………………………………………………………………………………………………. 71
APPENDIX D Bill of Materials……………………………………………………………………………………………………………………. 73
APPENDIX E Sustainability Definitions……………………………………………………………………………………………………… 79
APPENDIX F Speech Recognition Test Data..……………………………………………………………………………………………. 80
APPENDIX G Object Detection Logic………………………………………………………………………………………………………… 82
G.1 Moving Forward………………………………………………………………………………………………………………….. 82
G.2 Turning Right……………………………………………………………………………………………………………………….. 83
APPENDIX H Handshake Protocol…………………………………………………………………………………………………………….. 84
APPENDIX I Control Code Flow Chart……………………………………………………………………………………………………… 85
APPENDIX J Gantt Chart………………………………………………………………………………………………………………………….. 86
APPENDIX K Speech-Recognition Flow Chart…………………………………………………………………………………………… 87

List of Tables

Table 1: Functional Requirements ................................................................................................................. 3


Table 2: Non-Functional Requirements ......................................................................................................... 3
Table 3: Physical Specifications .................................................................................................................... 4
Table 4: Electrical Specifications……………….……………….….……………………………………………………………………..……………4
Table 5: Environmental Specifications……………….……………….….……………………………………………………………………..…. 5
Table 6: Operational Specifications……………….……………….….……………………………………………………………………..……… 5
Table 7: Servomotor Positions……………….……………….….……………………………………………………………………..…………….17
Table 8: Material Factor of Safety……………….……………….….……………………………………………………………………..……… 21
Table 9: Material Mass Analysis……………….……………….….……………………………………………………………………..…………. 22
Table 10: ANN Problems and Solutions……………….……………….….…………………………………………………………………….. 27
Table 11: Commands and Their Binary Numbers……………….……………….….………………………………………………………. 30
Table 12: System Features……………….……………….….……………………………………………………………………..…………………. 37
Table 13: System Specifications……………….……………….….……………………………………………………………………..…………. 38
Table 14: 3D Printing Costs……………….……………….….……………………………………………………………………..………………… 39
Table 15: Laser Cutting Costs……………….……………….….……………………………………………………………………..…………….. 40
Table 16: Thermal Heating Costs……………….……………….….……………………………………………………………………..……..… 40
Table 17: Two-Wheeled Prototype……………….……………….….……………………………………………………………………..….… 41
Table 18: User-Side System Costs……………….……………….….……………………………………………………………………..……… 42
Table 19: Robot-Side System Costs……………….……………….….……………………………………………………………………..….… 43
Table 20: Labor Costs……………….……………….….…………………………………………………………………………………..…………… 44
Table 21: Overall Project Costs……………….……………….….……………………………………………………………………..……………44
Table 22: Regions Explained……………….……………….….……………………………………………………………………..………………. 45
Table 23: Sustainability Assumptions……………….……………….….……………………………………………………….…..…………… 46

List of Figures

Figure 1: User-Side Block Diagram ................................................................................................................ 7


Figure 2: Robot-Side Block Diagram .............................................................................................................. 9
Figure 3: Lynxmotion BRAT Biped Robot .................................................................................................... 11
Figure 4: First Bipedal Prototype ……………….…………….….……..……………………………………………………………..……………12
Figure 5: Mechanical Assembly for Rotational Movement ……………….…………….….…………………………………………. 12
Figure 6: Second Bipedal Prototype ……………….……………….………………………………………………………………..…………… 13
Figure 7: Third Bipedal Prototype ……………….……………….….…………………………………………..…………………..…………….14
Figure 8: First Shift in Right Turn………………………………...….……………………………………………………………………..……… 15
Figure 9: Second Shift in Right Turn ……….……………….….……………………………………………………………………..…………. 16
Figure 10: Servomotor Positioning…………….……………….….………………………………………………………………………………. 17
Figure 11: Initial Position………………………………………………….……………….….………………………………………………………. 18
Figure 12: Ankle Tilt for Forward Movement ……………………………………………………………………………..…………………. 19
Figure 13: Forward Movement…………….….……………………………………………………………………..……………………………… 19
Figure 14: Servo Bracket with Forces specified……………………..………………… …………………………………………………… 20
Figure 15: Artificial Neuron……………….……………….….……………………………………………………………………..……………….. 24
Figure 16: Artificial Neural Network ……………….…….….……………………………………………………………………..……..……. 25
Figure 17: Female Voice Recognition Accuracy………………………….……………….….……………………………………….….… 28
Figure 18: Male Voice Recognition Accuracy ……….……………….….…………………………………………………………..……… 29
Figure 19: Overall Voice Recognition Accuracy…………….….……………………………………………………………………..….… 29
Figure 20: IR Sensors Structure………………………………………………………………………………………………………..…………… 33
Figure 21: US Sensor Structure…………………….……….….……………………………………………………………………..…………… 33
Figure 22: Gump Logo……..……………….……………….….……………………………………………………………………..………………. 36
Figure 23: Gump Features…………………………….……………….….……………………………………………………….…..……………. 37
Figure 24: Gump Dimensions………………….…….……………….….……………………………………………………….…..……………. 38
Figure 25: Regions Assumed…….……………….….……………………………………………………….…..…………………………………. 45
Figure 26: Effect on the Environment………………………………………….…..…………………………………………..………………. 47
Figure 27: Component Effect on Environment……………….…..…………………………………………..……………………………. 48

1 Introduction
1.1 Objectives
The role robotics plays in our society is growing with technology. Robots can be found in the
medical and manufacturing industries, the military, and even entertainment. Their applications
have huge potential, as they help automate tasks and enable humans to focus on more
important work. Voice recognition is also becoming more popular by the day: applications such
as Apple's Siri, Samsung's S-Voice, and Google Search can understand a string of commands and
return the results the user asked for.

In this project, we would like to bridge the gap between voice recognition and robotics in order
to create a user-independent robot that can be controlled via voice commands. The main goal
of the project is to recognize words from a given list of commands, which are then transmitted
to the robot for movement execution. Commands are one or two words long and are processed
in MATLAB to find the correct match. Our specified level of success is a recognition rate of 90%
for each command given in an isolated environment. On the robotics side, we designed a
bipedal robot to mimic human movement in walking and turning in different directions. The link
between the voice recognition system and the robot is a wireless signal; we selected wireless
communication to make the overall system more independent and to leave room for future
upgrades.

1.2 Project Outline


In the first phase of the project, the team researched the various options for voice recognition
and a suitable design for the robot. For voice recognition, we first needed to understand the
theory behind the system and review the alternatives. We initially selected the Dynamic Time
Warping algorithm for the recognition part of the system but, as explained later, switched to
the more sophisticated Artificial Neural Network algorithm. After this selection, we began
implementing the speech recognition system and testing its results.

Proudly sponsored by:
As for the robot design, we initially proposed a four-wheeled robot. Because that design was
too simple, we proposed two alternatives: a hexapod robot and a bipedal robot. We opted for
the bipedal robot because we wanted to try something different that had not been done before
at Concordia University.

In the second phase of the project, the team focused mostly on executing the work that had to
be done. For the circuit, we proposed a user-side and a robot-side schematic. Once the
schematics were completed, we focused on the components we needed to modify to comply
with our requirements. A first version of the voice-recognition software had been delivered, but
it required improvements in accuracy and execution time. In addition, the safety of our robot
was a concern: it must be able to make decisions on its own when it encounters an obstacle. As
a result, we needed to add object detection sensors so that the robot stops executing
commands and protects itself. Our technical coordinator also suggested the STM32F3Discovery
microcontroller, since its built-in gyroscope and accelerometer are important for the stability of
the robot while moving.

In the third phase, the team assembled the various parts of the project and implemented the
necessary changes, whether to the voice recognition, the electrical system, the control code, or
the mechanical design. There have been several design revisions to improve the stability of the
robot while moving. The schematic and PCB design were finalized and ready for fabrication. As
for the voice-recognition software, we changed to the Artificial Neural Network (ANN)
algorithm since it provided better accuracy and a shorter execution time. In parallel, all the
necessary documentation was produced to save time and document the work more efficiently.

2 Project Overview
2.1 Requirements
2.1.1 Functional Requirements

TABLE 1: FUNCTIONAL REQUIREMENTS

No.  Requirement                            Details
1    Recognize 8 commands                   Ready, Go Forward, Back, Turn Right, Left,
                                            Slow, Speed Up, End
2    Control the movement of the chassis    Movements execute according to the command

2.1.2 Nonfunctional Requirements

TABLE 2: NON-FUNCTIONAL REQUIREMENTS

No.  Requirement                            Details on how to achieve it
1    Robot must be controlled wirelessly    - Establish a handshake protocol between the
                                              computer and the robot's wireless module
                                            - Transfer commands to the robot so it can
                                              execute the appropriate movements
2    Unexpected situations and the steps the robot will take:
     The user sends incorrect commands,     - Prompt the user to give another command
     or commands cannot be recognized
     The user is unable to communicate      - Check the stability of the wireless connection
     with the robot                           by running the handshake protocol every five
                                              seconds
                                            - Wait for the next command
                                            - Reset the system
     Obstacle detection                     - The robot will move away from obstacles
                                            - When an obstacle is detected, the sensors
                                              have priority over user commands
     The robot's control system fails       - First option: use a watchdog timer (WDT) to
                                              reset the control system
                                            - Second option: manual reset
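The wireless-stability requirement above (running the handshake protocol every five seconds) could be sketched as a simple ping/acknowledge exchange. The message bytes, names, and the loopback stub below are illustrative assumptions for this report, not the project's actual protocol, which is documented in Appendix H.

```python
# Hypothetical sketch of the periodic link check: every HANDSHAKE_PERIOD_S
# seconds the PC pings the robot's wireless module and expects an
# acknowledgement. Message contents and names are assumptions, not the
# project's actual handshake (see Appendix H).

HANDSHAKE_PERIOD_S = 5.0  # interval stated in the non-functional requirements


def link_alive(transport, timeout_s=1.0):
    """One handshake round: send a ping and report whether an ack came back."""
    transport.send(b"PING")
    return transport.receive(timeout_s) == b"ACK"


class LoopbackLink:
    """Stand-in for the real nRF24L01+ link, used here only for illustration."""

    def __init__(self, reply):
        self.reply = reply   # canned response; the real link would be radio I/O
        self.sent = []

    def send(self, message):
        self.sent.append(message)

    def receive(self, timeout_s):
        return self.reply    # the real link would block for up to timeout_s
```

A supervising loop on the PC would call `link_alive` every `HANDSHAKE_PERIOD_S` seconds and, on failure, wait for the next command or reset the system, as the table describes.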

2.2 Specifications
2.2.1 Physical Specifications

TABLE 3: PHYSICAL SPECIFICATIONS

No.  Specification  Maximum
1    Width          400 mm
2    Length         300 mm
3    Height         550 mm
4    Weight         6 kg

2.2.2 Electrical Specifications

TABLE 4: ELECTRICAL SPECIFICATIONS

No.  Specification                      Maximum
1    Operating time between charges     2 hours
2    PC power                           110 V, 60 Hz

2.2.3 Environmental Specifications

TABLE 5: ENVIRONMENTAL SPECIFICATIONS

No.  Specification                                        Maximum
1    Type of environment                                  Indoor
2    Obstacles                                            Common objects found in the
                                                          above-mentioned environment
3    Line-of-sight distance between the computer          100 m
     and the robot's control system

2.2.4 Operational Specifications

TABLE 6: OPERATIONAL SPECIFICATIONS

No.  Specification                             Maximum
1    Degrees of freedom of bipedal movement    8
2    Operational temperature                   -10 to 40 °C
3    Surface requirements for operation        Inclination: < 5 degrees
                                               Surface deviation: ±3 mm from reference
4    Reliability of speech recognition         90%

3 Design Process and Project Implementation
3.1 Proposed Design
By the end of Phase Two, we had selected a bipedal leg chassis for our robot design. The
bipedal chassis available on the market at a reasonable price mostly allowed six degrees of
freedom. The issue was that with six degrees of freedom, turning the robot is done by subtly
shuffling the feet. We wanted to explore the challenge of designing a robot with humanoid
movement, so we decided on eight degrees of freedom instead. The extra two degrees allow
easier turning left and right by rotating the feet. Of course, since we didn't have any mechanical
engineering students on the team, we had to design the mechanical structure of an eight-
degree-of-freedom bipedal chassis on our own. The added work also meant understanding how
best to design a robot with a stable center of gravity and full balance while it moves.

We selected the STM32F3Discovery as our microcontroller because it includes important
components such as a three-axis gyroscope, an accelerometer, and an e-compass. The price of
the module was also a huge factor, as we wouldn't have to buy separate parts to obtain the
same functionality. The disadvantage of the STM32F3Discovery is that it is more complex to
program than a simpler microcontroller such as the ATmega32.

3.2 Electrical Implementation


3.2.1 Overview of the System
On the electrical side of the project, we have two circuits: the user-side and the robot-side. The
user-side circuit sends a signal from the speech-recognition software on the PC wirelessly to the
robot-side circuit. Once the robot-side circuit receives the wireless signal, the microcontroller
drives the servomotors according to the voice command issued by the user.

3.2.2 User-Side Circuit


3.2.2.1 Overview

The User-Side block diagram is shown in Figure 1. The user issues a voice command via the
microphone to the speech recognition software running in MATLAB on the PC. Once the
software recognizes the command, MATLAB outputs a binary number corresponding to it. The
User-Side circuit receives this signal from the PC via USB and passes it to the ATtiny44
microcontroller; the USB voltage is also regulated down to 3.3 V to power the ATtiny44. The
ATtiny44 then sends the signal to the nRF24L01+ wireless module via SPI, which transmits it to
the Robot-Side circuit. The LEDs on the User-Side circuit indicate whether the data transfer
succeeded.

[Block diagram omitted: microphone input from the user to the computer (110 V AC) running
the MATLAB speech recognition software; +5 V from USB regulated to 3.3 V by an LD1117V33;
USB D+/D- lines to the ATtiny44; SPI link to the nRF24L01+ wireless module and antenna; red
and green LEDs indicating the connection status.]

Figure 1: User-Side Block Diagram
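As a hedged illustration of this signal path, the step that maps a recognized command to the byte sent over USB might look like the sketch below. The binary codes and names here are placeholders, not the project's actual values (those are listed in Table 11), and the serial write itself is omitted.

```python
# Illustrative sketch of the PC-side encoding step described above.
# The binary codes below are placeholders, NOT the project's actual
# command codes (those appear in Table 11 of the report).

COMMAND_CODES = {
    "ready":      0b000,
    "go forward": 0b001,
    "back":       0b010,
    "turn right": 0b011,
    "left":       0b100,
    "slow":       0b101,
    "speed up":   0b110,
    "end":        0b111,
}


def encode_command(command):
    """Map a recognized voice command to the single byte sent to the ATtiny44."""
    return bytes([COMMAND_CODES[command.lower()]])
```

The resulting byte would then be written to the USB serial port that the ATtiny44 listens on; the robot-side control code uses the same table in reverse to decode the command.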

3.2.2.2 User-Side Schematic and PCB

Originally, for the User-Side schematic, the team wanted to purchase an embedded solution
that connects directly to the PC by USB. However, after a careful review of the budget and of
the components available to us from the ECE Hardware Supply Store, we decided to design the
schematic and the PCB on our own. The final PCB for the User-Side circuit was approved in late
January. It was designed to fit into the 1591LSBK case[2].

For more information about the User-Side schematic and PCB, please refer to Appendix C.1. For
the official Bill of Materials of the User-Side PCB, please refer to Appendix D.

3.2.2.3 Problems Encountered

Before the schematic was finalized, there was a problem synchronizing the data transfer over
the USB cable. To fix it, a crystal oscillator is required to synchronize the controller's clock with
the USB data transfer; in this case, a 12 MHz crystal was used. It was also highly recommended
to place a capacitor near the power pin of the nRF24L01+ to reduce the voltage ripple and noise
reaching the wireless module.

3.2.3 Robot-Side Circuit


3.2.3.1 Overview

The Robot-Side circuit, whose block diagram is shown in Figure 2, is powered by a 7.4 V lithium
polymer battery[5]. The STM32F3Discovery, the Sharp IR sensors, the Ping ultrasonic sensor,
and the LD1117V33 regulator are all powered by the 5 V regulator, an LM2575. The wireless
module is powered by the LD1117V33, a 3.3 V voltage regulator. The six servomotors
(HS-5485HB) for hip pan, knee pan, and ankle tilt movement are powered at a maximum of 6 V
by the LT1374 switching regulator, which is configured to output 6 V. The two servomotors in
charge of hip rotation (HS-7954SH) are powered directly by the 7.4 V battery.

For the robot to act on the given commands, the commands are received by the nRF24L01+
and transmitted to the STM32F3Discovery microcontroller, which controls all the servomotors
simultaneously. We have also placed a total of four object detection sensors to allow the robot
to stop when facing an obstacle. When the sensors detect an obstacle, a flag is raised in the
control code, which suspends the current execution until a new command is issued by the user.
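The flag mechanism described above can be sketched as follows. This is a simplified, hypothetical model in Python for illustration only; the actual logic is the control code running in C on the STM32F3Discovery, and the class and method names are our own.

```python
# Simplified model of the obstacle flag described above: a sensor event
# sets a flag, and the movement loop checks it between gait steps,
# suspending execution until the user issues a new command.
# Names are illustrative; the real implementation is the robot's C
# control code on the STM32F3Discovery.

class MovementController:
    def __init__(self):
        self.obstacle_flag = False
        self.steps_completed = 0

    def on_obstacle(self):
        """Called from the sensor interrupt when an obstacle is detected."""
        self.obstacle_flag = True

    def new_command(self):
        """A fresh user command clears the flag and resumes operation."""
        self.obstacle_flag = False

    def execute(self, steps):
        """Run a movement command; return False if suspended by an obstacle."""
        for _ in range(steps):
            if self.obstacle_flag:
                return False        # suspend the current execution
            self.steps_completed += 1  # one servo gait cycle
        return True
```

Checking the flag between gait steps (rather than mid-step) mirrors the report's design, in which the sensors take priority over user commands without leaving a leg in an unstable position.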

[Block diagram omitted: a rechargeable LiPo battery feeding the LD1117V33, LM2575, and
LT1374 voltage regulators; the nRF24L01+ wireless module and antenna connected to the
STM32F3Discovery over SPI; two Sharp IR sensors and the Ping ultrasonic sensor; and the eight
servomotors (horizontal hip left/right, vertical hip left/right, vertical knee left/right, and
horizontal ankle left/right).]

Figure 2: Robot-Side Block Diagram

3.2.3.2 Robot-Side Schematic and PCB

The Robot-Side schematic was updated from Phase Two to show the connectors we needed,
with their specific part numbers, before we designed the PCB. The schematic shows where the
various sensors, the two batteries, the STM32F3Discovery, and the eight servomotors connect.
The board also houses the voltage regulators and the 8-bit serial bus. To power the robot, we
selected two lithium polymer batteries rated at 7.4 V and 2400 mAh each; connecting them in
parallel keeps the 7.4 V while doubling the capacity to 4800 mAh. The switch that powers the
system is mounted on the robot chassis, as shown in the Mechanical Design section of this
report.

The PCB for the Robot-Side was approved in late February. The Robot-Side PCB and schematic
drawings can be seen in Appendix C.2. The Robot-Side PCB was designed to fit in the BT-2727
case[1]. To view the mechanical drawings of the Robot-Side PCB box, refer to Appendix A.2.

3.2.3.3 Problems Encountered

At first, we wanted to use the LM7805 fixed linear voltage regulator for all the components
that needed 5 V. This was not a great solution, since it would have required designing a heat
sink and added considerably more work. The solution was a buck converter, which led us to the
LM2575 regulator with a fixed output of 5 V at 1 A. Using a buck converter also improves the
regulator's efficiency and heat dissipation. A safety fuse was added to the circuit for protection:
if the circuit draws more than 5 A, the fuse protects it from the excess current. One problem we
hadn't foreseen was that the Pixy CMUcam5, which we ordered from a Kickstarter campaign,
wasn't ready for shipment, which would have delayed implementing object detection in our
system[15]. To solve this, we switched to IR and ultrasonic sensors. Another problem was
distinguishing the orientation of the various three-pin connectors used for the servomotors and
sensors. To solve it, the servomotors, IR sensors, and Ping ultrasonic sensor each have their
own connector, eliminating the possibility of connecting the wrong device in the wrong place.

When choosing the box to mount the PCB in, the design had to be changed to maximize the
available space within all the constraints. It was also suggested that all the connectors, whether
for sensors or servomotors, be placed as close as possible to the edge of the box. Another key
point when designing the PCB, as pointed out by our Technical Coordinator, was that all the
power circuitry should be concentrated in one corner and that the LT1374 voltage regulator
should be as close as possible to the servomotor connectors to minimize power loss.

3.3 Mechanical Implementation
3.3.1 Design Process
When we thought about how best to design a bipedal robot with eight degrees of freedom, we
looked for inspiration in existing designs on the market. One that came to our attention was
the BRAT biped robot from Lynxmotion[6], which offers six degrees of freedom.

Figure 3: Lynxmotion BRAT Biped Robot

Since our system specifications require eight degrees of freedom, we couldn't use the BRAT
design as-is. The problem with six degrees of freedom in a bipedal design is that turning left
and right is done by shuffling the feet in that direction, which is not the humanoid movement
we were looking for.

Another issue with the BRAT was that the dimensions of its bipedal chassis were much smaller
than our system specifications call for. So we instead decided to improve on the BRAT design to
allow eight degrees of freedom and to redesign the components with expanded dimensions,
making the overall bipedal chassis taller. This was done in SolidWorks[16].
using Solidworks[16].

3.3.1.1 First Prototype

At the beginning of the design process, we decided to rapid-prototype our design with help
from Concordia's Engineering Design and Manufacturing Lab (EDML). The material used for the
3D-printed parts was ABSplus plastic[10] (for more information about the positioning of the
servos and their functions, please refer to Section 3.3.2 of the report).

After carefully revising and verifying our 3D-printed parts individually, we ordered a full set to
begin assembling a leg. The figure below shows the first prototype of our design.

Figure 4: First Bipedal Prototype

3.3.1.2 Problems Encountered in the First Prototype


After constructing the 3D-printed parts, we noticed several problems. First was the thickness of
the parts, 1.5 mm, which caused several of them to break easily, proving that ABSplus plastic
wasn't strong enough. Second were the dimensions of the servo brackets, which didn't allow
our preferred servomotors, the HS-5485HB, to be placed easily within the brackets and run
smoothly with the rest of the assembly. The final problem concerned the upper two degrees of
freedom, which provide hip rotation. The servomotors can rotate the brackets as part of the
assembly because bearings and lock washers allow freedom of movement while the nuts keep
the assembly locked in position; please refer to Figure 5[7].

Figure 5: Mechanical Assembly for Rotational Movement

However, that hip-rotation degree of freedom couldn't be controlled exclusively by the
servomotors: if the robot lifted its legs while the servomotors weren't commanding a hip
rotation, the bearing and lock washer mechanism would still let the leg rotate outside our
control. That obviously affected our design, so we had to review the mechanical assembly.

3.3.1.3 Second Prototype
The second prototype we designed solved those three problems. First, after running stress
analysis tests on three different materials (see Section 3.3.3), we decided to use acrylic for
the entire assembly. To produce the acrylic parts, we used a laser cutter, then thermally
heated and bent the acrylic to achieve the required shapes (see Section 3.2 of the Technical
Manual for more information about the acrylic design). We increased the thickness of the parts
to 3 mm, which also increased the strength of the assembly. Second, we designed a two-part
platform separated by hex standoffs, housing the battery on the first platform and the PCB box
on the second. Third, we mounted the servomotors on the platform with their wheels facing
downwards and directly attached to each leg. This ensured that the servomotor for hip rotation
had direct control over the rotation of the leg.

Figure 6: Second Bipedal Prototype

3.3.1.4 Problems Encountered in the Second Prototype


The second prototype proved much better than the first for several reasons, but it also had
several problems. It allowed the hip rotation movement we wanted, easing turns to the left and
right. However, moving forward was an issue: whenever a leg was lifted, the weight of the
battery and the PCB box housed on top made the robot unstable and prone to falling. Another
issue was that the acrylic servo brackets were still liable to break, which meant we needed a
stronger material for them. The feet also needed to be reinforced with a stronger material in
case landing on the floor caused them to break. Therefore, we had to redesign the assembly
again to fix those problems.

3.3.1.5 Third Prototype
To fix the problems of the second prototype, we began by widening the platform. The widened
platform allowed us to place the PCB box in the middle, as pointed out by our Technical
Coordinator. However, since we were concerned about the weight of the PCB box, we decided to
use half the box instead and mount it under the platform. The increased width of the platform
gave us a more stable design while halving the weight from the box. Furthermore, placing the
box under the platform decreased the height, thereby lowering the center of gravity. Another
issue was the battery weight: since we didn’t want to add weight on top, we decided to place a
battery on each foot.

This solved another problem we had encountered, that of anchoring one foot to the ground while
the other foot was moving. The added weight of the battery on each foot allowed for a smoother
movement. As for a new material for the servo brackets and the feet, we decided on aluminum
6061-T6 alloy for both parts. When we asked the EDML to machine those parts for us, they
informed us that they were busy helping the Mechanical Department Capstone teams and that we
would have to fabricate the aluminum parts ourselves. Due to a shortage of time and the lack
of a mechanical engineering student on our team, we decided to order the parts from
Lynxmotion[6]. As for the aluminum feet used to reinforce our original acrylic feet, we widened
the base of the acrylic feet so that each foot could house a battery.

Figure 7: Third Bipedal Prototype

3.3.1.6 Problems Encountered in the Third Prototype


The third prototype worked better than we expected. The even distribution of the weight
allowed the robot to move forward smoothly. Turning left and right was performed
successfully as well. The aluminum parts were strong and housed the servomotors perfectly.
However, there was a mechanical issue we didn’t foresee at the beginning: the servomotors we
used, the HS-5485HB, have a torque rating of 6 kg·cm [3].

Suppose, for example, that we wanted to turn right. We would first rotate the right foot 45°
to the right with the right hip rotation servomotor, as shown in Figure 8. Then, we would
shift the ankle servomotors to the left, lifting the ankle of the left foot slightly off the
floor, as seen in Figure 12. Finally, we would realign the right hip rotation servomotor back
to its initial position. This realignment, coupled with the weight redistribution onto the
right side and the left foot being slightly lifted off the floor, turns the robot to the
right. However, as pointed out earlier, the servomotor we used was the HS-5485HB, with a
torque rating of 6 kg·cm [3]. In the first shift, the servomotor rotated the right foot at a
distance of 73.46 mm from the center of the right hip rotation servomotor wheel, moving a
weight of 427.35 grams. The torque needed for this first shift was therefore about
3.14 kg·cm, so the HS-5485HB servomotors were more than sufficient for the task.
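As a sanity check, the torque figures above can be reproduced with a few lines (Python here
for illustration; the report’s control code runs in C on the STM32F3Discovery):

```python
def required_torque_kg_cm(mass_grams, lever_arm_cm):
    """Static torque needed to swing a mass about the servo axis, in kg*cm."""
    return (mass_grams / 1000.0) * lever_arm_cm

# First shift: the foot (427.35 g) swings 73.46 mm (7.346 cm) from the axis.
first_shift = required_torque_kg_cm(427.35, 7.346)    # ~3.14 kg*cm

# Second shift: ~1047.65 g swinging at ~26.3 cm from the axis.
second_shift = required_torque_kg_cm(1047.65, 26.3)   # ~27.55 kg*cm

HS_5485HB = 6.0    # rated torque, kg*cm [3]
HS_7954HS = 29.0   # rated torque, kg*cm [4]
assert first_shift <= HS_5485HB      # first shift is within the rating
assert second_shift > HS_5485HB      # second shift exceeds it
assert second_shift <= HS_7954HS     # the upgraded servo can handle it
```

This simple check is what motivated the servomotor upgrade discussed below.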

Figure 8: First Shift in Right Turn

Figure 9: Second Shift in Right Turn

However, when performing the second shift, realigning the right hip servomotor to complete the
right turn, we measured the distance over which the servomotor was rotating to be
approximately 26.3 cm. The rotating weight was about 1047.65 grams (the total weight minus the
weight of the entire right leg moved in the first shift). The torque needed to perform the
second shift of the right turn was therefore about 27.55 kg·cm. It became apparent that the
HS-5485HB servomotors would eventually break from operating beyond their rated torque. Our
Technical Coordinator recommended shortening the robot, eliminating a few parts to reduce the
weight of the assembly. Furthermore, by placing the PCB box sideways on its shortest edge, we
could reduce the width of the robot, which would in turn reduce the torque demands. Those
recommendations were the best in terms of practicality, cost-effectiveness, and improvisation.
However, since the team really liked the design of the robot, we chose not to alter it
according to those recommendations.

We therefore needed two new servomotors for the hip rotation to ensure the safety of the
design. We chose the HS-7954HS, which provides a maximum torque of 29 kg·cm [4]. This was the
more expensive solution, but it allowed us to maintain the symmetry of the design. So, one of
the few changes from the third to the fourth prototype was the hip servomotors, which
satisfied the torque requirement for turning left and right. We also added rubber circles to
the bottom of the feet in the fourth prototype, increasing the traction of the feet for more
precise movement. In conclusion, the fourth prototype, which was very similar to the third,
became our final design.

3.3.2 Overview of the Mechanical Movements and Design
Figure 10 gives an overview of the final design, with balloons highlighting the different
servomotors in the assembly; Table 7 lists their positions. For a list of the mechanical parts
in the final assembly, please refer to Appendix A.1.

TABLE 7: SERVOMOTOR POSITIONS

NO.  Servomotor
1    Left Hip Rotation
2    Right Hip Rotation
3    Left Hip Pan
4    Right Hip Pan
5    Left Knee Pan
6    Right Knee Pan
7    Left Ankle Tilt
8    Right Ankle Tilt

Figure 10: Servomotor Positioning

We now show the steps the robot takes when executing the forward movement. All servos are
positioned with the wheel’s midpoint aligned with the mechanical parts, so that if all servos
are set to their midpoints, the robot is in its initial position. This section covers the
forward movement; the right turn has already been covered in Section 3.3.1.6. The left turn is
the mirror image of the right turn, and backward movement mirrors forward movement. The ready
command initializes the system, and the end command stops it.

The robot’s position is controlled by the individual positions of the 8 servomotors. This is
done by providing the servomotors with PWM signals whose pulse widths specify particular servo
wheel positions. Changing the servomotor positions thus produces the robot’s overall
movement. The transition of a servo from one

position to another is not done abruptly. Instead, it is done in steps (increments or
decrements) of the PWM widths. This lets us control the speed of the robot, since the
increments are spaced by delays in milliseconds: the higher the delay, the slower the
movement; the lower the delay, the faster the movement. We found that the best range for the
delays was between 10 ms and 50 ms, as anything below 10 ms didn’t give the servomotors enough
time to react and anything above 50 ms made the transitions choppy instead of smooth.
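The stepping scheme described above can be sketched as follows (Python for illustration; the
real code runs on the microcontroller, and `move_servo` is a hypothetical stand-in for its
PWM write):

```python
import time

def move_servo(pwm_us):
    """Hypothetical stand-in for writing a PWM pulse width to a servo."""
    pass

def transition(current_us, target_us, step_us=10, delay_ms=30):
    """Step the PWM width toward the target; a larger delay_ms between
    steps gives a slower robot movement, a smaller one a faster movement."""
    step = step_us if target_us > current_us else -step_us
    while current_us != target_us:
        current_us += step
        # clamp so we land exactly on the target position
        if (step > 0 and current_us > target_us) or \
           (step < 0 and current_us < target_us):
            current_us = target_us
        move_servo(current_us)
        time.sleep(delay_ms / 1000.0)
    return current_us
```

With the delays chosen in the report, `delay_ms=20` would correspond to the fastest speed, 30
to the default, and 40 to the slowest.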

In the end, we chose a 20 ms delay as the robot’s fastest speed, a 40 ms delay as its slowest
speed, and a 30 ms delay as the default (middle) speed. Figure 11 shows a side view of the
robot in its initial position.

Figure 11: Initial Position

All the angles discussed below must keep the same value throughout the execution of a command.
The leg angles cannot be less than 125°, so that the robot maintains its balance and the parts
(e.g. the acrylic feet) are not damaged. After the initial position of Figure 11, the robot
must shift its weight onto the right leg, as shown in Figure 12. The ankle shift angle cannot
exceed 40°, as shown by the arrows.

Figure 12: Ankle Tilt for Forward Movement

After the ankle shift to the right, the robot moves its left leg forward, as can be seen in Figure 13.

Figure 13: Forward Movement

The arrows show the maximum angle by which the legs separate from the initial position before
realigning and moving forward again with the other leg; in the case of Figure 13, it is 125°.
The robot then realigns to the initial position of Figure 11, shifts the ankles 40° to the
left (similar to Figure 12), and moves the right leg forward in the same manner as Figure 13.

3.3.3 Material Selection
For the bipedal leg design, we had to choose the best of three materials for our final design.
The three candidate materials were the following:

1) ABSplus Plastic
2) Acrylic
3) Aluminum 6061 T-6 Alloy

3.3.3.1 Stress Analysis


Stress analysis computes stress, deformation, and factor of safety results for a component by
discretizing component geometry into a mesh of points at which to compute values[8].

For stress analysis, we chose to evaluate one part of our design, the servo bracket.

NOTE: This has been done using Solidworks[16].

Our initial assumptions for evaluating this part are the following:

1) Force exerted on the design: 10 N


2) Fixture applied to one plane to hold the design in place

As can be seen in the figure below, the purple arrows indicate the 10 N force exerted on the
design while the green arrows indicate the fixture applied on one plane to hold the design.

Figure 14: Servo Bracket with Forces Specified

3.3.3.2 Stress Analysis Results

We begin our analysis by taking into account the von Mises stress of each material. The von
Mises stress results can be found in Appendix B.

The von Mises stress is used to determine whether an isotropic and ductile metal will yield
when subjected to a load condition. This is done by calculating the von Mises stress and
comparing it to the material's yield stress, which constitutes the von Mises yield criterion
[9].

The next quantity to compute is the factor of safety. This will help us conclude whether a
material is strong enough for our use. The Factor of Safety (FoS) essentially indicates the
structural capacity of a system beyond the applied load [18]: it tells us whether the system
can handle more than the load we intend to apply. For the servo bracket, our applied load was
10 N, since the servomotor housed in the bracket exerts a small force on it.

With these assumptions, we use the Factor of Safety formula:

Factor of Safety = Material Strength / Design Load

We need a minimum Factor of Safety of 1 to indicate that the material strength can handle the
design load. The material characteristics and stress analysis results can be found in
Appendix B. Table 8 features the final values for the Factor of Safety.

TABLE 8: MATERIAL FACTOR OF SAFETY

NO.  Feature           ABSplus   Acrylic   Aluminum 6061 T-6
1    Factor of Safety  0.62295   0.96461   5.78206

The results show that acrylic and aluminum 6061 T-6 have the best Factor of Safety values
(though acrylic falls just short of 1). While it would be tempting to simply select aluminum,
we still need to consider other factors, such as sustainability and mass.
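The comparison can be illustrated with a short sketch. Note that the yield strengths and the
peak von Mises stress below are typical, assumed values for illustration only; the report’s
actual numbers are in Appendix B.

```python
def factor_of_safety(material_strength_mpa, design_stress_mpa):
    """FoS = material strength / design load; >= 1 means the material holds."""
    return material_strength_mpa / design_stress_mpa

# Illustrative, assumed values (not the Appendix B figures): typical yield
# strengths and a hypothetical peak stress under the 10 N load.
yield_strength_mpa = {"ABSplus": 31.0, "Acrylic": 48.0, "Aluminum 6061-T6": 275.0}
peak_von_mises_mpa = 50.0

for name, strength in yield_strength_mpa.items():
    fos = factor_of_safety(strength, peak_von_mises_mpa)
    verdict = "handles the load" if fos >= 1.0 else "fails"
    print(f"{name}: FoS = {fos:.2f} ({verdict})")
```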

3.3.3.3 Mass Properties

We calculated the mass of the regular C bracket in each material to compare the mass values.
As can be seen, the aluminum part is more than double the mass of its ABSplus and acrylic
counterparts.

TABLE 9: MATERIAL MASS ANALYSIS

NO.  Property                        ABSplus       Acrylic      Aluminum 6061 T-6
1    Part                            C Bracket     C Bracket    C Bracket
2    Mass of Part                    5.92 grams    6.96 grams   15.67 grams
3    Total Mass of Bipedal Assembly  182.92 grams  208.8 grams  417.77 grams

As originally assumed, the aluminum parts make the bipedal legs heavier than acrylic or
ABSplus would. However, since this weight still falls within our specifications, it is not a
problem.

3.4 Speech Recognition Implementation
3.4.1 Overview
To build the artificial intelligence software for the robot, we used an Integrated Development
Environment (IDE) called MATLAB[22]. MATLAB is a high-level language and interactive
environment, used here for signal processing and prediction of the input voice[12]. The
objective is to build speech-recognition software that recognizes spoken commands given by the
user regardless of accent.

A speech recognition system can be classified as isolated or continuous. Isolated word
recognition requires a brief pause between spoken words, unlike continuous word recognition.
The system can be further classified as user-dependent or user-independent: a user-dependent
system only understands commands from a particular speaker, whereas a user-independent system
understands the spoken words of anybody[20]. For our application, the speech recognition
software understands isolated words and is user-independent.

3.4.2 Design Process


3.4.2.1 Choice of Recognition Method

To develop the algorithm, we used VOICEBOX and the Artificial Neural Network (ANN) toolbox.
VOICEBOX is an open-source speech processing toolbox of MATLAB routines written by Mike
Brookes[21]. We use it to accept voice commands via a microphone, analyze and process the
signal, and store the essential features of the signal. The Artificial Neural Network toolbox,
on the other hand, is used to match the spoken words to the features that have been extracted
and stored [13].

3.4.2.2 Changes from Phase Two

Initially, the Dynamic Time Warping (DTW) algorithm was used for predicting the commands.
After testing, the accuracy with DTW turned out lower than expected, and we concluded that the
method is user-dependent and relies on a single matching technique. Artificial Neural
Networks, by contrast, use more sophisticated techniques to match the spoken word correctly
and, most importantly, are user-independent. Therefore, commands given by anyone are
understood, even if the user has not trained his or her voice in the system.

Another change from Phase Two concerns the voice commands used to execute specific functions.
For instance, we now have several two-word commands in our system, whereas our initial
specifications required one-word commands.

3.4.2.3 Overview of Artificial Neural Networks

Artificial neural networks (ANN) are electronic models of the brain’s neural system that can
learn through experience. They allow for problem-solving that is very cost-efficient, reducing
the need for vast computing power. They try to replicate only the basic functionalities of
biological neurons[19].

In current technology, neurons are called “processing elements”. As can be seen in the
artificial neuron schematic of Figure 15, the first step is the input of variables, each
multiplied by its weighting factor. Next, the weighted inputs enter the summing function,
which can perform a variety of operations and is not necessarily restricted to summing. The
result then enters a transfer function, which converts it into an output through an algorithm;
the algorithm is what manipulates the inputs to produce the desired output[19].

Figure 15: Artificial Neuron[19]
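The processing element described above (weighted inputs, summing function, transfer function)
can be sketched in a few lines of Python; the sigmoid transfer function here is an assumption
for illustration, as other transfer functions are possible:

```python
import math

def neuron(inputs, weights, bias=0.0):
    """One processing element: weighted sum of the inputs followed by a
    sigmoid transfer function that squashes the sum into (0, 1)."""
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))

out = neuron([0.5, -1.0, 2.0], [0.8, 0.2, 0.1])  # weighted sum = 0.4
```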

Taking this into account at the level of an overall artificial neural network, what we end up
with is several artificial neurons clustered together and working as a whole. They’re not as

complex as a biological neural network, since they’re two-dimensional with a limit on the
number of layers, as can be seen in Figure 16[19].

Figure 16: Artificial Neural Network[19]

3.4.2.4 Development of the System

Thanks to one of our colleagues[23], we saw an earlier example of a Neural Network Toolbox
implementation in MATLAB, which guided us in building our own speech-recognition system with
an ANN. In general, there are two major stages in building this type of system: a training
stage and a testing stage. In the training stage, the system’s library is developed; this was
created using the VOICEBOX speech processing toolbox. To build the library, an acoustic model
of each word the system needs to recognize is saved. Our library holds the commands along with
their reference numbers, as shown in Table 11. For the testing stage, the Artificial Neural
Network toolbox uses the acoustic models of the commands to recognize isolated words.

3.4.2.5 System Database Formation

To form the database for the automatic speech recognition system, we first extract the
features of the voice input, which identify the essential components of the spoken word. The
features are extracted with a common and robust method, the Mel Frequency Cepstral
Coefficients (MFCC) function, which is available in the VOICEBOX

toolbox[21]. The MFCC function computes the energies of the spoken commands and stores them in
the database. When developing the database, it is crucial that every command has repeated
utterances, yielding repeated features of that specific word.

As an analogy, the database is like a library full of books. Every book corresponds to a
command and is filled with that command’s repeated utterances. The books come in different
versions, which in our case are the different users, with multiple copies of each version.

Approximate number of samples:

No. of commands × No. of repetitions × No. of people = 8 × 20 × 31 = 4960

For an overview of the speech-recognition algorithm control code, refer to Appendix K.
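The library analogy above can be sketched as a data structure (Python for illustration; the
actual implementation stores MFCC features via MATLAB’s VOICEBOX, and `extract_mfcc` is a
hypothetical placeholder for that routine):

```python
# Sketch of the database layout: one "book" per command, holding every
# speaker's repeated utterances.
COMMANDS = ["ready", "end", "go forward", "back", "left",
            "turn right", "slow", "speed up"]
N_REPETITIONS = 20   # utterances per command per speaker
N_SPEAKERS = 31

def extract_mfcc(utterance):
    """Hypothetical placeholder: would return the MFCC features of one
    recorded utterance."""
    return utterance

database = {
    cmd: [extract_mfcc((cmd, speaker, rep))
          for speaker in range(N_SPEAKERS)
          for rep in range(N_REPETITIONS)]
    for cmd in COMMANDS
}

total_samples = sum(len(v) for v in database.values())  # 8 * 20 * 31 = 4960
```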

3.4.2.6 Testing the System

For the testing stage, we used the Neural Network Toolbox, which, like a biological neural
network, can dynamically learn, train, recognize patterns, and classify data [13]. The
individually connected computing elements determine the behavior of the neural network. The
strengths, or weights, of these connections change automatically as the system is trained,
adjusting according to how the system is trained and whether the task is performed correctly.

Once the Neural Network Toolbox is activated, it starts training on the samples in the
database, which takes approximately 2 to 3 minutes depending on the number of samples. The
neural network app automatically trains the neural network to fit the input and target data
[13].
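To illustrate the idea of connection weights adjusting during training, here is a minimal,
self-contained sketch of a one-hidden-layer network trained by gradient descent on toy
two-dimensional data. It is written in Python with NumPy for illustration only and is far
simpler than MATLAB’s Neural Network Toolbox, which the project actually used:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for MFCC feature vectors: two well-separated 2-D classes.
X = np.vstack([rng.normal(0.0, 0.5, (50, 2)), rng.normal(2.0, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 "processing elements"; the connection weights are
# the quantities adjusted automatically during training.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

for _ in range(500):
    h = sigmoid(X @ W1 + b1)            # hidden layer activations
    p = sigmoid(h @ W2 + b2).ravel()    # predicted probability of class 1
    g_out = (p - y) / len(y)            # cross-entropy gradient at the output
    g_h = (g_out[:, None] @ W2.T) * h * (1.0 - h)
    W2 -= 0.5 * (h.T @ g_out[:, None]); b2 -= 0.5 * g_out.sum()
    W1 -= 0.5 * (X.T @ g_h);            b1 -= 0.5 * g_h.sum(axis=0)

h = sigmoid(X @ W1 + b1)
p = sigmoid(h @ W2 + b2).ravel()
accuracy = float(np.mean((p > 0.5) == y))
```

As with the real system, the classifier improves as more samples are trained; on this easy
toy data the accuracy approaches 100%.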

3.4.3 Problems with Implementing the ANN Algorithm


The following table lists the problems we faced while implementing the speech-recognition
system and how we solved them.

TABLE 10: ANN PROBLEMS AND SOLUTIONS

NO.  Problem and Solution
1    Problem: Training voice samples were recorded continuously, which made the recognition
     less accurate.
     Solution: When taking future samples, the user will record the commands with a brief
     pause between two words.
2    Problem: Recordings made in a noisy environment affect the accuracy.
     Solution: Future samples must be recorded in a quiet environment in order to extract the
     correct features of the spoken word.
3    Problem: Longer training time due to large data sets.
     Solution: Distribute the computation across multiple processors [13].
4    Problem: The microphone’s uncomfortable placement requires the user to regularly
     reposition it correctly, producing noise and affecting the accuracy.
     Solution: Purchase a microphone that has a comfortable and stable grip on the user.
5    Problem: The training method was not effective.
     Solution: Use other methods to train the system so that it learns to predict better.

3.4.4 Testing the Accuracy of the Speech Recognition System


Testing was necessary to determine the accuracy of the recognition feature. As mentioned in
the specifications, our goal was to achieve 90% accuracy for the system. The more voice
samples were trained, the more accurate the recognition became. We had a total of 4960 voice
samples from a mixture of male and female voices with varying characteristics.

For the testing of the speech-recognition software, we set the following conditions:

 30 unique voice samples already trained by the system


 6 people testing the system (3 male and 3 female)
 Multi-cultural demographics for the 6 people testing the system (different accents,
nationalities, tone of voice)
 Each user tests each command ten times, for a total of 480 test samples (6 users × 8
commands × 10 repetitions).
 Environment for testing: lab environment with an average noise level.

The raw data obtained from the tests can be found in Appendix F. We computed an average for
each command separately for female and male users, shown in Figures 17 and 18, respectively.
Since the results for both looked promising, we also computed the combined average per
command, shown in Figure 19. The overall average across all commands was 88.125%, which we
deemed satisfactory even though it falls just short of our initial 90% accuracy requirement.

Of course, with more time and human resources, the accuracy could have been improved. We could
have trained the system with more than 30 unique users’ voice samples. Each user records each
command twenty times and therefore contributes 160 voice samples to the Artificial Neural
Network; at the moment, we have 4,800 samples in the system. The Artificial Neural Network’s
accuracy increases with the number of trained samples. Ideally, at around 100,000 samples, we
could train more commands and more combinations of words per command, though this would
require an external hard drive. Furthermore, testing in several environments with different
noise levels would help improve the recognition feature of the ANN, and a higher-quality
microphone would also help when commanding the system in the future.

[Bar chart: recognition accuracy (0–100%) per command — Ready, End, Go Forward, Back, Left,
Turn Right, Slow, Speed Up — for female users]

Figure 17: Female Voice Recognition Accuracy

[Bar chart: recognition accuracy (0–100%) per command — Ready, End, Go Forward, Back, Left,
Turn Right, Slow, Speed Up — for male users]

Figure 18: Male Voice Recognition Accuracy

[Bar chart: recognition accuracy (0–100%) per command — Ready, End, Go Forward, Back, Left,
Turn Right, Slow, Speed Up — for all users combined]

Figure 19: Overall Voice Recognition Accuracy

3.5 Control Code Implementation
3.5.1 Overview of the Control Code
Once the speech recognition program in MATLAB recognizes the input command, it must send this
information to the control code on the STM32F3Discovery microcontroller for the robot to move.
To do this, a wireless interface is implemented between the MATLAB program and the
microcontroller. The interface has two main parts:

1. Convert the Input Commands to Digits

   MATLAB performs the conversion and sends the digit wirelessly to the microcontroller.
   Depending on the digit received, the microcontroller moves the robot accordingly. The
   conversion of every command to its respective digit is shown in Table 11 below.

2. Handshake Protocol

   A handshake protocol establishes a wireless communication channel between the MATLAB
   program running on the terminal and the microcontroller, using the nRF24L01+ wireless
   module. The protocol is initiated by the terminal that runs MATLAB.

TABLE 11: COMMANDS AND THEIR DIGITS

NO.  Command     Digit
1    Ready       1
2    Go Forward  2
3    Back        3
4    Turn Right  4
5    Left        5
6    Speed Up    6
7    Slow        7
8    End         8
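The mapping of Table 11 can be sketched as a simple lookup (Python for illustration; the
report’s actual conversion is done in MATLAB before the digit is sent over the wireless link):

```python
# Mirrors Table 11: each recognized command maps to a single digit that is
# sent wirelessly to the microcontroller.
COMMAND_TO_DIGIT = {
    "ready": 1, "go forward": 2, "back": 3, "turn right": 4,
    "left": 5, "speed up": 6, "slow": 7, "end": 8,
}

def encode(command):
    """Return the digit for a recognized command, or None if unknown."""
    return COMMAND_TO_DIGIT.get(command.strip().lower())
```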

3.5.1.1 Handshake Protocol

The following steps highlight the handshake protocol execution. The flowchart can be seen in
Appendix H.

 In order to start the control program, MATLAB sends an initialization signal, for example
the character ‘a’.
 The control program receives the signal and checks whether the character is indeed ‘a’. If
it is, the control program sends an acknowledgement to the speech recognition program.
 If the character is not ‘a’, the control program ignores the received initialization signal.
 If the recognition program does not receive an acknowledgement from the control program
within 3 seconds, it restarts the handshake protocol by sending the character ‘a’ again.
 If the recognition program receives an acknowledgement from the control program within 3
seconds, it sends an acknowledgement-received signal to the control program.
 The system continues in this loop until a connection is established.
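The handshake steps above can be sketched as follows (Python for illustration; `send` and
`receive` are hypothetical wrappers around the nRF24L01+ link, not the project’s actual API):

```python
import time

def handshake(send, receive, timeout_s=3.0, max_attempts=10):
    """Establish the wireless channel: send 'a', wait up to timeout_s for an
    acknowledgement, confirm it, and retry until a connection is made."""
    for _ in range(max_attempts):
        send(b"a")                        # initialization signal
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            if receive() == b"ACK":       # control program acknowledged
                send(b"ACK_RECEIVED")     # confirm; channel is established
                return True
        # no acknowledgement within timeout_s: restart by sending 'a' again
    return False
```

The `max_attempts` cap is an assumption added so the sketch terminates; the report’s protocol
simply loops until a connection is established.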

3.5.2 Control Code In-Depth


The control code has 8 movement inputs, corresponding to the movement commands issued by the
user, and it executes a routine for each input; for example, it executes the forward routine
when the user commands ‘Forward’. The control code can also be interrupted by three
interrupts: timer, object detection, and balance. The program flowchart is shown in Appendix I.

 Step 1: Initialize the System Before the Bipedal Robot is Ready to Move
As soon as the power is turned on, the control program waits for MATLAB to initiate the
handshake protocol, which establishes a wireless connection between the MATLAB terminal and
the microcontroller. Once the wireless connection is established, the control program waits
for the user to give the voice command ‘START’.
 Step 2: Ready State
Once the user gives the command ‘START’, the robot signals the user that it is ready to move
by lighting up an LED. The command ‘START’ starts a while loop that switches between movement
routines (forward, backward, etc.) as commanded by the user. After a command is received, the
control code executes the specific movement routine, which in turn drives the servos that
perform the actual movement. The specific routine is executed multiple

times until it is interrupted, either by the user issuing a different movement command or by
the routine having run for more than 1 minute.

The program leaves the ready state when the user commands the robot to ‘STOP’. The user has to
say ‘START’ to re-enter this state and get the robot moving again.

3.5.3 Interrupts Execution


3.5.3.1 Timer Interrupt Execution

The timer interrupt simply increments a timer variable. The main program checks whether the
routine execution has hit the maximum timer value (indicating the passage of 1 minute); if so,
it stops executing the routine.

3.5.3.2 Object Detection and Balancing Interrupt

In Phase Two we had a balancing interrupt and an object-detected interrupt. However, we
decided it was better to combine these two interrupts into one, in which a timer interrupts
the control code every 500 ms and performs the balancing and object detection checks. Balance
is checked before object detection, since maintaining balance is the robot’s priority. The
routines for balancing and object detection are described below.

The object detection (OD) interrupt executes only while the program is running the forward or
back routines. When activated, it halts the execution of the current routine and sets the
‘detect’ global variable, which forces the main program to execute the detection function. The
detection function checks three conditions in a loop:

 If the user issues a different command. If yes, the function ends and the main program
starts executing the new command routine
 If the object moves out of the way. If yes, the function ends and the main program resumes
the command routine that was halted by the interrupt.
 If the timer runs out for the execution of the detection function, the function ends and
the main program awaits the next user command.
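The three checks above can be sketched as a loop (Python for illustration; the three callables
are hypothetical probes of the real system state, not the project’s actual functions):

```python
def detection_function(new_command, object_cleared, timer_expired):
    """Loop over the three checks until one of them resolves the situation."""
    while True:
        if new_command():        # user issued a different command
            return "execute new command"
        if object_cleared():     # obstacle moved out of the way
            return "resume interrupted routine"
        if timer_expired():      # detection timer ran out
            return "await next command"
```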

The balance interrupt is used to maintain the balance of the robot. It executes when the
gyroscope readings are outside the bounds required to maintain balance. In that case, the
interrupt immediately resets all the robot’s servos to their stationary standing values in an
attempt to regain balance. If the readings are then
within bounds, balance is restored. But if the readings are still out of bounds after
resetting the servos to their initial values, the interrupt sends a message to the MATLAB
program saying the robot has tipped over.
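The balance logic above can be sketched as follows (Python for illustration; the bounds and
helper names are assumptions, not the firmware’s actual values):

```python
PITCH_BOUNDS = (-15.0, 15.0)  # assumed gyroscope limits in degrees

def balance_check(reading, reset_servos, read_again, notify_tipped):
    """Reset the servos to the stationary stance when the gyroscope reading
    is out of bounds; report a tip-over if that does not restore balance."""
    lo, hi = PITCH_BOUNDS
    if lo <= reading <= hi:
        return "balanced"
    reset_servos()                    # back to the stationary standing pose
    if lo <= read_again() <= hi:
        return "balance restored"
    notify_tipped()                   # message sent to the MATLAB program
    return "tipped over"
```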

In Figure 20, the IR detection range criterion shows that we are interested in detecting
objects in the 15–30 cm range of the sensors. In Figure 21, the ultrasonic sensor setup shows
that we are interested in detecting objects in the area beyond the red dotted line. Since the
ultrasonic sensor has a 15° angular deviation from its original position, we tilted it
downwards, ensuring it can detect any object approaching beyond the 30 cm maximum detection
range of the IR sensors.

Figure 20: IR Sensors Structure

Figure 21: US Sensor Structure

3.5.4 Control Code Testing
3.5.4.1 Test 1: Servo Calibrations

In the first control code test, we had to work in parallel with the mechanical design to
choose appropriate servo positions that would give the robot the degrees of freedom it needs.

To solve this problem, we first attached the servos to the legs of the robot and moved the legs
by hand to confirm that they had the degrees of freedom needed to accomplish movement,
then compared that with the PWM calculations the microcontroller would have to make to
keep the robot balanced and moving.

Later on, the microcontroller was programmed with the PWMs that set the robot in the various
positions it assumes while moving (refer to Section 3.3.2), to check whether the earlier
calculations were correct. This part was done in two steps because we needed to make sure
that, when the microcontroller applies the PWMs, the servos do not work against each other
and damage their internal gears or the mechanical parts that make up the legs.

3.5.4.2 Test 2: Smooth Movements

In this test, the problem we faced was that the servomotors moved the legs abruptly when
commanded. This was a major concern, as abrupt servomotor movement had the potential to
damage the bipedal leg chassis. To solve this problem, the servomotors were programmed to
transition from one position to another in small increments rather than in one fast movement
that could lead to a loss of stability and balance. The increments are separated by delays of
microseconds, so the servo movement appears as one smooth transition from one position to
another.
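The incremental-stepping idea can be sketched as follows. STEP_US and the delay length are assumed values, and delay_us() and the commented-out PWM write stand in for the real microcontroller routines.

```c
#include <assert.h>
#include <stdlib.h>

#define STEP_US 5   /* pulse-width increment per step (assumed value) */

static void delay_us(unsigned us) { (void)us; /* busy-wait on the MCU */ }

/* Walk the pulse width toward the target in small steps so the motion
 * looks like one smooth transition; returns the final pulse width. */
int move_servo_smoothly(int current_pwm, int target_pwm)
{
    int dir = (target_pwm > current_pwm) ? STEP_US : -STEP_US;
    while (abs(target_pwm - current_pwm) >= STEP_US) {
        current_pwm += dir;   /* write_pwm(current_pwm) on real hardware */
        delay_us(200);        /* short pause between steps */
    }
    return target_pwm;        /* land exactly on the target */
}
```

Each call blocks until the servo has stepped all the way to the target.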

3.5.4.3 Test 3: Appropriate Angles for Proper Movement and Shifting of Weight

As the mechanical design of the robot kept changing from one prototype to another, we had to
test different combinations of robot positions by manipulating the servo positions through the
PWMs, to choose a stride and sway that keep the robot's feet from slipping too much while
walking, since excessive slipping results in an unintended change of direction.

NOTE:
Stride is the maximum distance between the robot's legs while walking forward or backward.
Sway is the shift of the ankle angles that transfers the weight from one foot to the other while
walking forward or backward and while turning.

3.5.4.4 Test 4: Selecting Appropriate Interrupt Angles for Balancing the Robot

In this test, we needed to maintain the angles the robot requires to walk while ensuring balance
and stability. We decided to place the STM32F3Discovery near the top of the robot, as this gives
the best indication of the robot's inclination. The STM32F3Discovery carries an accelerometer
and a gyroscope that track the evaluation board's inclination, so we calculated and selected
the appropriate pitch and roll angles (inclination) for the board, i.e. the top of the robot, at
which it stays balanced. If the robot's inclination exceeds the selected angles, it corrects itself.

To maintain balance, the robot does not try to balance itself via individual joint correction of
the legs. Instead, it immediately stands straight, as can be seen in Figure 11. We considered this
the better approach because our mechanical design is very stable when the robot is standing
straight. If we tried to correct each joint individually, we risked overcorrecting and losing
balance again by losing control over the position of our center of gravity. Individual correction
of each joint would only be required if the robot were walking on uneven terrain, which is
outside our project specifications, as our robot's operational parameters are limited to a flat
surface.

4 Deliverable

4.1 Final Product Overview


4.1.1 Product Name
The name we selected for our final product, as contributed by one of our colleagues[24], was
Gump, inspired by the 1994 motion picture Forrest Gump[14], to highlight the fact that our
bipedal robot struggled to walk at first before its movements improved. Another factor was
that the film's main character always did as he was told yet still managed to appeal to many
people. We felt that our robot had that same innocence in its design and voice-command
recognition features.

Figure 22: Gump Logo

4.1.2 Gump Features


Gump is a bipedal robot with eight degrees of freedom. It can recognize up to eight human
voice commands, received wirelessly, and execute the corresponding movements. It also
features object detection to protect itself in its operating environment.

Figure 23: Gump Features

TABLE 12: SYSTEM FEATURES

NO. Description

1 Servomotor

2 Torso Platform

3 C Bracket

4 IR Sensor

5 Battery

6 Foot Base

7 Foot

8 Servo Bracket

9 Ultrasonic Sensor

10 PCB Box

11 PCB

12 Switch

13 L Bracket

4.2 System Specifications
Figure 24 shows Gump’s different dimensions. The following table outlines its final
specifications.

TABLE 13: SYSTEM SPECIFICATIONS

NO. Specifications Value

1 Length 179.62 mm

2 Height 299.77 mm

3 Width 336.46 mm

4 Weight 1,475 grams

5 Operation Time 1.5 hours

6 Degrees of Freedom 8

7 Operation Temperature -10° to 40° C

8 Wireless Communication Range 100 m

Figure 24: Gump Dimensions

4.3 Project Budget
4.3.1 Prototyping Costs
The prototyping costs highlight how much was spent on selecting and iterating different parts
and components to get the final robot design we needed for Gump. This includes the 3D
Printing and laser cutting costs. The assumption, based on the input of the EDML Staff, was that
the market price for 3D Printing a part was $25 per cubic inch. The following table illustrates
the 3D Printing costs.

TABLE 14: 3D PRINTING COSTS

NO. Part Name Printing Amount Cubic Inch Cost per Cubic Inch Total

1 Foot Base 1 2 3.227 $25 $161.35

2 Bridge 1 6.554 $25 $163.85

3 New L Bracket 4 1.74 $25 $174

4 Long L Bracket 2 2.267 $25 $113.35

5 Servo Bracket 1 1 3.1078 $25 $77.7

6 Servo Bracket 2 1 3.1078 $25 $77.7

7 Servo Bracket 3 10 3.1078 $25 $776.95

8 Long C Bracket 6 4.67759 $25 $701.64

9 Small C Bracket 2 2.765 $25 $138.25

10 Regular C Bracket 2 3.18622 $25 $159.31
Total Cost $2,544.09

The final cost of 3D Printing came to $2,544.09, a significant sum for testing the best leg design.
Luckily for us, ECE students had access to the Mechanical Department's 3D printer, so these
costs were already covered.
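Table 14's totals can be checked with a short sketch (part volumes copied from the table; this is a verification aid, not project code). Each row's cost is printing amount x cubic inches x $25 per cubic inch.

```c
#include <assert.h>
#include <math.h>

struct print_job { int qty; double cubic_in; };

/* Quantities and volumes from Table 14, in row order. */
double printing_total(void)
{
    const struct print_job jobs[] = {
        {2, 3.227},  {1, 6.554},   {4, 1.74},    {2, 2.267},  {1, 3.1078},
        {1, 3.1078}, {10, 3.1078}, {6, 4.67759}, {2, 2.765},  {2, 3.18622},
    };
    double total = 0.0;
    for (unsigned i = 0; i < sizeof jobs / sizeof jobs[0]; i++)
        total += jobs[i].qty * jobs[i].cubic_in * 25.0;  /* $25 per cubic inch */
    return total;   /* matches the table's $2,544.09 total */
}
```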

For the acrylic laser cutting and thermal heating, the following table illustrates the costs of both
aspects of the manufacturing process. The costs, according to the EDML staff, are $20 every 30
minutes of use of either the laser cutter or the thermal heater.

TABLE 15: LASER CUTTING COSTS

NO. Batch Number Time (Minutes) Cost (per hour) Total

1 1 30 $40.00 $20.00

2 2 90 $40.00 $60.00

3 3 90 $40.00 $60.00

4 4 30 $40.00 $20.00

5 5 30 $40.00 $20.00

Total Cost $180.00

TABLE 16: THERMAL HEATING COSTS

NO. Batch Number Time (Minutes) Cost (per hour) Total

1 1 30 $40.00 $20.00

2 2 60 $40.00 $40.00

3 3 60 $40.00 $40.00

4 4 0 $40.00 $0.00

5 5 0 $40.00 $0.00

Total Cost $100.00

The total for manufacturing our acrylic parts came to $280 for the combined use of laser
cutting and thermal heating. Furthermore, when we first began experimenting in Phase I of the
project, we decided to buy a small two-wheeled robot for prototyping commands

and movements. The chassis we chose was the 2WD Beginner Robot Chassis V2[11] and the
costs are shown in the following table.

TABLE 17: TWO-WHEELED PROTOTYPE

NO. Part Name Quantity Cost Total

1 Arduino Uno 1 $28.25 $28.25

2 2WD Beginner Robot Chassis V2 1 $20.00 $20.00

3 SN754410 1 $2.00 $2.00

4 USB A to B cable 1 $2.59 $2.59

5 9 V Battery 1 $12.00 $12.00

6 Breadboard 1 $10.00 $10.00

7 Wires, Capacitors, etc. 15 $3.00 $3.00

Total Cost $77.84

In addition, since we replaced the two HS-5485HB servomotors in the hip rotation area with
two HS-7954SH servomotors, the two HS-5485HB units are now part of the prototyping costs.
Each HS-5485HB costs $25, which adds $50 to the prototyping costs. We also bought three
acrylic sheets at $30 each, adding another $90. The final prototyping cost of the entire project
is $3,041.93.

4.3.2 Final Design Costs


4.3.2.1 User-Side System

The final costs of the user-side system are shown in Table 18. For a more comprehensive Bill of
Materials, please refer to Appendix D. Note, however, that the part numbers of the
components coincide with the PCB schematic found in Appendix C.1.

TABLE 18: USER-SIDE SYSTEM COSTS

NO. Components Quantity Total

1 Resistors, Capacitors, LEDs, etc. 19 $3.80

2 ATtiny44 1 $1.50

3 Voltage Regulator 1 $0.75

4 Connectors 2 $3.00

5 Wireless Module 1 $18.75

6 Box 1 $3.99

7 MATLAB 1 $2,150.00

8 Neural Network Toolbox 1 $1000.00

9 PC 1 $1000.00

10 Microphone 1 $50.00

11 USB A to B Cable 1 $2.59

Total Cost: $4,234.38

For a more realistic estimate, assuming someone wanted to recreate this project outside
academia, the combined cost of MATLAB[12] and the Neural Network Toolbox[13] is about
$3,150 plus tax for a standard license. If we include a simple laptop PC to house the speech
recognition software, at an average cost of $1,000, and an average cost of $50 for a
microphone to improve voice-command quality, the more realistic cost of the User-Side System
comes to $4,234.38 plus tax.

4.3.2.2 Robot-Side System

Gump’s high costs have more to do with the expensive servo motors that allow it to walk in a
humanoid fashion. A look at the component list below gives an idea about Gump’s total cost.

TABLE 19: ROBOT-SIDE SYSTEM COSTS

NO. Components Quantity Total

1 Resistors, Capacitors, LEDs, etc. 27 $5.40

2 STM32F3Discovery 1 $12.54

3 Voltage Regulator 3 $15.90

4 Connectors 18 $12.80

5 Wireless Module 1 $18.75

6 Box 1 $29.99

7 Sensors 4 $88.76

8 Battery 2 $69.98

9 Charger 1 $24.99

10 Screws, Nuts, etc. 105 $23.20

11 Servomotors 8 $318.00

12 Fuse 1 $4.95

13 Switch 1 $2.00

14 Aluminum Parts 8 $59.40

Total Cost: $686.66

In this overview of Gump's costs, the aluminum parts for the bipedal chassis were bought as
separate components from Lynxmotion[6]. We have not listed any IDE costs for the
STM32F3Discovery, since we used the free version and had no need for a more advanced IDE.
The project's cost overrun was covered by additional funding we applied for from the
Concordia University Alumni Association.

4.3.3 Human Resources Cost
Estimating the labor costs of this project took two things into account. The first was the
average hourly rate per team member, taken from the CO-OP program average rate of
$18.75[17]. The second, since the work varied greatly as the project grew in complexity, was a
conservative estimate of about 25 hours worked per week. For the project timeline Gantt chart,
please refer to Appendix J.

TABLE 20: LABOR COSTS

NO. Team Member Hours Worked per Week Weeks Worked Rate Total

1 Yaz 25 24 $18.75 $11,250.00

2 Rasna 25 24 $18.75 $11,250.00

3 Omer 25 24 $18.75 $11,250.00

4 Mital 25 24 $18.75 $11,250.00

Total Cost $45,000

4.3.4 Overall Project Cost

TABLE 21: OVERALL PROJECT COSTS

NO. Subsystem Cost Total

1 Prototyping Cost $3,041.93

2 User-Side System Cost $4,234.38

3 Robot-Side System Cost $686.66

4 Labor Cost $45,000.00


5 Miscellaneous Cost (Printing, Delivery fees, etc.) $280.00
Total Cost $53,242.97

4.4 Impact on Environment
This year's Capstone emphasizes the need to discuss our engineering's impact on the
environment as we complete the project. We ran a sustainability analysis on the robot's
mechanical materials, as shown in the next section. Section 4.4.2 then goes over how the
electronics affect the environment as they approach the end of their life cycle.

4.4.1 Mechanical Impact


The following sustainability analysis was made possible by the Solidworks software[16]. We
begin with assumptions about the manufacturing of the parts, their intended period of use,
their power requirement over that period, and what we determined can be recycled. Our first
assumption was that our assembly is manufactured in Asia and used in North America. Table 22
explains the importance of the region selection.

Manufacturing Region: The choice of manufacturing region determines the energy sources and
technologies used in the modeled material creation and manufacturing steps of the product's
life cycle.

Use Region: The use region is used to determine the energy sources consumed during the
product's use phase (if applicable) and the destination for the product at its end-of-life.
Together with the manufacturing region, the use region is also used to estimate the
environmental impacts associated with transporting the product from its manufacturing
location to its use location.

Figure 25: Regions Assumed[16]

Table 22: Regions Explained[16]

Our next assumption was that the product is designed for a one-year lifespan, provided the
user recharges it every day. To calculate the power Gump needs over a span of one year, we
made the following assumptions. Gump needs 7.4 V x 4800 mAh = 35.52 Wh for a full charge.
Assuming the user charges and uses it once a day for an entire year, Gump's energy
requirement over one year is 35.52 Wh x 365 = 12,964.8 Wh = 12.9648 kWh.
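The yearly figure can be verified with a few lines (a sketch of the arithmetic above, not project code):

```c
#include <assert.h>
#include <math.h>

/* One full 35.52 Wh charge per day for a year, expressed in kWh. */
double yearly_energy_kwh(void)
{
    const double full_charge_wh = 7.4 * 4.8;  /* 7.4 V x 4800 mAh = 35.52 Wh */
    return full_charge_wh * 365.0 / 1000.0;   /* 365 charges, Wh -> kWh */
}
```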

The other assumption we made was that we can recycle the aluminum, acrylic, and
polycarbonate materials that make up Gump’s chassis. They form about 45% of Gump’s total
components, which we assume can be recycled. The remaining components will be either
incinerated or taken to a landfill. The final assumption we are going to make is the costs of
transportation of Gump from the manufacturing assembly in Asia to North America. We
assumed that transportation by sea was the most effective method, so we include it in our
sustainability analysis report. To summarize our assumptions, we tabulated the data in the
following table.

TABLE 23: SUSTAINABILITY ASSUMPTIONS

Assembly Process:
1 Region: Asia
2 Energy Type: Electricity
3 Energy Amount: 0.02 kWh
4 Built to Last: 1 year
5 Transportation: Truck 0 km, Train 0 km, Ship 1.2E+4 km, Airplane 0 km

Use:
1 Region: North America
2 Energy Type: Electricity
3 Energy Amount: 12.9 kWh
4 Duration of Use: 1 year
5 End of Life: Recycled 45%, Incinerated 13%, Landfill 42%

4.4.1.1 Sustainability Analysis Results

The following figure shows how much Gump's manufacturing and use over its lifespan affect
the environment. For an expansion on the definitions of each chart, please refer to Appendix E.

Carbon Footprint: Material 3.5 kg CO2e; Manufacturing 1.4 kg CO2e; Use 10 kg CO2e;
Transportation 0.159 kg CO2e; End of Life 0.389 kg CO2e; Total 16 kg CO2e.

Total Energy Consumed: Material 55 MJ; Manufacturing 14 MJ; Use 150 MJ; Transportation
2.1 MJ; End of Life 0.287 MJ; Total 220 MJ.

Air Acidification: Material 0.016 kg SO2e; Manufacturing 0.020 kg SO2e; Use 0.070 kg SO2e;
Transportation 1.5E-3 kg SO2e; End of Life 2.1E-4 kg SO2e; Total 0.109 kg SO2e.

Water Eutrophication: Material 1.9E-3 kg PO4e; Manufacturing 7.8E-4 kg PO4e; Use 2.6E-3 kg
PO4e; Transportation 2.2E-4 kg PO4e; End of Life 4.5E-4 kg PO4e; Total 5.9E-3 kg PO4e.

Figure 26: Effect on the Environment

Notably, the largest environmental impact in Gump's sustainability analysis report comes from
the use phase of the product rather than its manufacturing process. Any user who has to
charge Gump daily for a whole year of use consumes a significant amount of energy, which in
turn affects the environment. Our solution to the problem would be

to make a future version of Gump more energy efficient, so that daily recharging is no longer
necessary. The following figure shows the Gump components that affect the environment the
most.

Component Carbon Water Air Energy

PCB Box 1.1 5.3E-4 7.9E-3 15

Torso Platform 0.523 2.5E-4 3.8E-3 7.2

Foot 0.481 1.1E-4 3.4E-3 5.9

Foot Base 0.245 1.2E-4 1.8E-3 3.3

Multi-Purpose Servo Bracket 0.160 3.8E-5 1.1E-3 1.9

Spline 0.038 1.4E-4 2.1E-4 0.420

Figure 27: Component Effect on Environment

Notably, the PCB Box has the largest environmental impact of any component. The PCB Box is
the only component in Gump made of polycarbonate, so our conclusion for a future version of
Gump would be to use a more environmentally friendly material instead of polycarbonate.

4.4.2 Electrical Impact on Environment


While we can't run a full sustainability test on the electronic components that make up Gump,
we can still find ways to reduce their impact on the environment.

At the end of a product's life cycle, battery disposal is harmful to the environment. So, for a
future version of Gump, we could offer our customers a trade-in: they come into our stores and
exchange an old version of Gump for a newer one. In return, we receive all the old models
back. That way, if the batteries of an old unit have exceeded their lifespan, we can dispose of
them in a controlled way that has the least impact on the environment and avoids leaking
hazardous chemicals into our surroundings.

Furthermore, because a customer who trades in an old Gump model gets a reduced price on
the new version, we can recycle the old models' components and electronics, or refurbish and
donate them, among other uses that decrease the environmental impact of the electronic
materials.

5 Conclusion
5.1 Future Improvements
Gump can improve in several ways. As the egg logo suggests, Gump is still an infant; it has the
potential to do much more. Imagine if Gump were average human height, with arms and an
interface the user could interact with directly. Gump could connect to a user's phone and
provide updates via WiFi. It could take out the trash, help with kitchen chores, and learn other
tasks. It could respond in audio through speakers and pre-recorded answers. The Artificial
Neural Network could essentially be hosted inside Gump and built into a grander
voice-command recognition framework. The object detection system could be improved to
detect color combinations, faces, and other pre-programmed indicators that would help Gump
perform more advanced tasks, and object detection could itself become part of the Artificial
Neural Network. Gump's components could be upgraded in-store as technology advances,
offering flexibility to potential customers. Gump, along with the entire future of robotics, can
have a place in households, becoming an extension of humanity's daily life. Personal robots,
for entertainment and for household management, will exist someday.

5.2 Conclusion
In conclusion, this project has influenced us all in many ways. When we first undertook the
problem of designing a voice-commanded robot, we were recommended several designs with
easy implementations. However, we decided against them because we were passionate about
building a robot that is as humanoid as possible. Through our understanding of the Artificial
Neural Network for voice-command recognition, we knew we had found a window into A.I.,
and our amazement at what MATLAB can help us do increased our admiration for the
sophistication of the framework. Our introduction to PCB design was fulfilling, as it allowed us
to lay out electronics in a neat, aligned structure, understand the deeper concepts behind
schematic design, and appreciate the harmony of an electrical system. Our experimentation
with the STM32F3Discovery showed us how far advances in computing have come: a fully
assembled microcontroller board with a built-in gyroscope, e-compass, and accelerometer, all
for the price of lunch, is truly remarkable. Our leap into mechanical design without the help of
a mechanical engineering student gave us an appreciation for the role mechanical engineering
plays in electronics and helped us arrive at a more beautiful design for Gump. Our appreciation
for robotics has grown dramatically, and we find ourselves ready to make future contributions
to that field and to the entire Artificial Intelligence field. Gump is already a part of it.

6 Bibliography

[1] Bud Industries, "Blue NEMA Box," [Online]. Available: http://www.budind.com/view/NEMA+Boxes/Blue+Transparent+NEMA+4X.

[2] Hammond Manufacturing, "1591LSBK," [Online]. Available: http://ca.mouser.com/ProductDetail/Hammond-Manufacturing/1591LSBK/?qs=KTIoeDiBmBLYrjsBruP65g==.

[3] Hitec, "HS-5485HB," [Online]. Available: http://hitecrcd.com/products/servos/sport-servos/digital-sport-servos/hs-5485hb-standard-karbonite-digital-servo/product.

[4] Hitec, "HS-7954SH," [Online]. Available: http://hitecrcd.com/products/servos/premium-digital-servos/hs-7954sh-high-torque-hv-coreless-steel-gear-servo/product.

[5] Venom Group, "5C 2S 2400mAh 7.4 V LiPo," [Online]. Available: http://www.venom-group.com/5C-2S-2400mah-7-4v-LiPO-RX-TX-Flat-pack.html?sc=9&category=19506.

[6] Lynxmotion, "BRAT Biped," [Online]. Available: http://www.lynxmotion.com/c-139-auton-combo-kit.aspx.

[7] Lynxmotion, "BRAT Biped Assembly," [Online]. Available: http://www.lynxmotion.com/images/html/build104.htm.

[8] Solidworks, "Motion Studies," [Online]. Available: http://help.solidworks.com/2013/English/SolidWorks/motionstudies/t_modifying_mesh_density.htm.

[9] Continuum Mechanics, "Von Mises Stress," [Online]. Available: http://www.continuummechanics.org/cm/vonmisesstress.html.

[10] Stratasys, "ABSplus Plastic for Fortus 3D Production Systems," [Online]. Available: http://www.stratasys.com/~/media/Main/Secure/Material%20Specs%20MS/Fortus-Material-Specs/Fortus-MS-ABSplus-P430-01-13-web.ashx.

[11] RobotShop, "2WD Beginner Robot Chassis V2," [Online]. Available: http://www.robotshop.com/ca/en/2wd-beginner-robot-chassis.html.

[12] MathWorks, "MATLAB Pricing," [Online]. Available: http://www.mathworks.com/pricing-licensing/index.html?prodcode=ML&s_iid=main_pl_ML_tb.

[13] MathWorks, "Neural Network Toolbox Pricing," [Online]. Available: http://www.mathworks.com/pricing-licensing/index.html?prodcode=NN.

[14] Internet Movie Database, "Forrest Gump," [Online]. Available: http://www.imdb.com/title/tt0109830/.

[15] Kickstarter, "PIXY CMUCam 5," [Online]. Available: https://www.kickstarter.com/projects/254449872/pixy-cmucam5-a-fast-easy-to-use-vision-sensor.

[16] Solidworks, "Solidworks Software Packages," [Online]. Available: http://www.solidworks.com/sw/products/3d-cad/packages.htm.

[17] Concordia University, "CO-OP Program," [Online]. Available: http://www.concordia.ca/academics/co-op/programs.html.

[18] E. R. Evans, Jr., "METBD 111: Factor of Safety," Penn State Erie, The Behrend College, [Online]. Available: http://engr.bd.psu.edu/rxm61/213/problems/factor_of_safety.pdf.

[19] E. Reingold, "PSY371: Artificial Neural Networks Technology," University of Toronto, [Online]. Available: http://www.psych.utoronto.ca/users/reingold/courses/ai/cache/neural2.html.

[20] D. Ning, "Developing an Isolated Word Recognition System in MATLAB," MathWorks, [Online]. Available: http://www.mathworks.com/company/newsletters/articles/developing-an-isolated-word-recognition-system-in-matlab.html.

[21] M. Brookes, "VOICEBOX: Speech Processing Toolbox for MATLAB," Department of Electrical and Electronic Engineering, Imperial College. [Online].

[22] J. Kyrnin, "What is an IDE and do you need an IDE to build Web Applications?," About.com, [Online]. Available: http://webdesign.about.com/od/webprogramming/a/what-is-an-ide.htm.

[23] A. Romanul, Montreal, QC, Canada.

[24] G. Leenders, Montreal, QC, Canada.

APPENDIX A: Mechanical Drawings and List of Parts
A.1 List of Parts
The following table shows the list of parts used in Gump’s final assembly. Their mechanical
drawings are shown in Section A.2.
Item Quantity Material

L Bracket 4 Acrylic Plastic


C Bracket 6 Acrylic Plastic
Multi-Purpose Servo Bracket 8 Aluminum 6061-T6
Feet 2 Aluminum 6061-T6
Feet Base 2 Acrylic Plastic
Torso Platform 1 Acrylic Plastic
PCB Box 1 Polycarbonate

The following table shows the mechanical parts used to hold all the brackets together.
Item Quantity

Bearing, Flanged 3 mm ID 8 mm OD 4 mm THK 8

M3 X 8 mm LG PHILLIPS Pan Head Steel Screw 8

3 mm Lock Washer 8

3 x .50 Hex Nut 28

2-56 X .25 LG PHILLIPS Pan Head Screw 18

2-56 Nuts 18

2 x 0.25 Tap Screw 16

M4 x 8 mm LG PHILLIPS Pan Head Steel Screw 4

M3 X 0.5'' LG PHILLIPS Pan Head Steel Screw 20

A.2 Mechanical Drawings

APPENDIX B: Stress Analysis
B.1: ABSplus Plastic Stress Analysis Results
The assumptions for the stress analysis have already been made in Section 3.3.3.1

MESH INFORMATION

NO. Specification Type
1 Mesh Type Solid Mesh
2 Mesher Used Standard Mesh
3 Jacobian Points 4 Points
4 Element Size 1.60149 mm
5 Tolerance 0.0800745 mm
6 Mesh Quality High
7 Total Nodes 17242
8 Total Elements 8795
9 Maximum Aspect Ratio 8.1215

We begin our material analysis with ABSplus Plastic. The first stress result shows the von Mises
stress.
The von Mises stress is used in determining whether an isotropic and ductile metal will yield
when subjected to a load condition. This is accomplished by calculating the von Mises stress
and comparing it to the material's yield stress, which constitutes the von Mises Yield Criterion
[9]. The following table indicates the ABSplus material properties[10].

ABSPLUS PLASTIC MATERIAL PROPERTIES

NO. Property Value
1 Name ABSplus
2 Yield Strength 28 N/mm^2
3 Tensile Strength 36 N/mm^2
4 Elastic Modulus 2000 N/mm^2

The following figure indicates the minimum and maximum value of the von Mises stress
analysis. Regions that are closer to red indicate the maximum stress analysis value. In the case
of ABSplus, the maximum and minimum values are indicated in the following table.

ABSPLUS VON MISES STRESS ANALYSIS

NO. Name Type Min Max
1 Stress Von Mises Stress 5.14911e-010 N/mm^2 (MPa) 44.9473 N/mm^2 (MPa)

The next thing to compute is the Factor of Safety criterion. This will help us conclude whether a
material is strong enough for our use.
The Factor of Safety (FoS) essentially informs us of the structural capacity of a system beyond
the load applied [18]. It lets us see whether the system can handle more than the load we wish
to apply to it. In the servo bracket example, the applied load was 10 N, since the servomotor
housed in that part exerts a small force on the bracket.
With these assumptions, the Factor of Safety formula is:

Factor of Safety = Material Strength / Design Load

In the von Mises analysis above, the maximum stress we obtained was 44.9473 N/mm^2, and
the ABSplus yield strength is 28 N/mm^2. Therefore, the Factor of Safety for ABSplus is:

Factor of Safety (ABSplus) = 28 N/mm^2 / 44.9473 N/mm^2 = 0.62295

This final value of 0.62295 for the ABSplus Factor of Safety is not satisfactory. We need a
minimum Factor of Safety of 1 to indicate that the material strength can handle the design
load.

B.2 Aluminum 6061 T-6 Alloy Stress Analysis

The first stress result shows the von Mises Stress.


The von Mises stress is used in determining whether an isotropic and ductile metal will yield
when subjected to a load condition. This is accomplished by calculating the von Mises stress
and comparing it to the material's yield stress, which constitutes the von Mises Yield Criterion.

Property Value
Name Aluminum 6061 T-6 Alloy
Yield Strength 2.75e+008 N/m^2
Tensile Strength 3.1e+008 N/m^2

The following figure indicates the minimum and maximum value of the von Mises stress
analysis. Regions that are closer to red indicate the maximum stress analysis value. In the case
of Aluminum 6061 T-6 Alloy, the maximum and minimum values are indicated in the following
table.

Name Type Min Max
Stress VON: von Mises Stress 8.75638e-005 N/m^2 (Node 11027) 4.85014e+007 N/m^2 (Node 16382)

The following table shows the Factor of Safety for Aluminum 6061 T-6 Alloy.

Name Type Min Max
Factor of Safety Max von Mises Stress 5.66993 (Node 16382) 3.14057e+012 (Node 11027)

B.3: Acrylic Stress Analysis
The assumptions for the stress analysis have already been made in Section 3.3.3.1.

The first stress result shows the von Mises Stress.


The von Mises stress is used in determining whether an isotropic and ductile metal will yield
when subjected to a load condition. This is accomplished by calculating the von Mises stress
and comparing it to the material's yield stress, which constitutes the von Mises Yield Criterion.

Property Value
Name Acrylic Plastic
Yield Strength 45 N/mm^2
Tensile Strength 73 N/mm^2

The following figure indicates the minimum and maximum value of the von Mises stress
analysis. Regions that are closer to red indicate the maximum stress analysis value. In the case
of Acrylic Plastic, the maximum and minimum values are indicated in the following table.

Name Type Min Max
Stress VON: von Mises Stress 2.20689e-009 N/mm^2 (MPa) (Node 9196) 46.651 N/mm^2 (MPa) (Node 14824)

The following table shows the Factor of Safety for Acrylic Plastic.

Name Type Min Max
Factor of Safety Max von Mises Stress 0.96461 (Node 14824) 2.03907e+010 (Node 9196)
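The three Factor of Safety values reported in this appendix follow directly from the formula in B.1 (a sanity-check sketch, not project code; note the units must match within each ratio, since aluminum is quoted in N/m^2 and the plastics in N/mm^2):

```c
#include <assert.h>
#include <math.h>

/* Factor of Safety = material yield strength / maximum von Mises stress. */
double factor_of_safety(double yield_strength, double max_stress)
{
    return yield_strength / max_stress;
}
```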

APPENDIX C: Electrical Schematics
C.1 User Schematic

C.2: Robot Schematic

APPENDIX D: Bill of Materials

Part Number / Part Description / Supplier / Supplier Part Number / Quantity Required / Unit Price (US or CAN?) / Total Price (US or CAN?)
STM32F3DISCO KIT EVAL Digikey.ca 497-13192-ND 1 12.54 12.54
VERY DISCOVERY
STM32F3
LM2575T- IC REG Digikey.ca LM2575T- 1 2.95 2.95
5.0/NOPB BUCK 5V 5.0/NOPB-ND
1A TO220-
5
IC MCU 8BIT IC MCU Digikey.ca ATTINY44A-PU- 1 1.50 1.50
4KB FLASH 8BIT 4KB ND
14DIP FLASH
14DIP
LT1374CT7#PB IC REG Digikey.ca LT1374CT7#PBF- 1 12.11 12.11
F BUCK ADJ ND
4.5A
TO220-7
LD1117V33 IC REG LDO Digikey.ca 497-1491-5-ND 2 0.74 1.48
3.3V 0.8A
TO220AB
LTL-4222N LED 3MM Digikey.ca 160-1140-ND 1 0.39 0.39
HI-EFF RED
TRANSPAR
ENT
LTL-1CHG LED 3MM Digikey.ca 160-1710-ND 3 0.39 1.17
GREEN
DIFFUSED
PA0431LNL INDUCT Digikey.ca 553-1503-ND 1 2.23 2.23
PWR
7.8UH TH
PE-52627NL INDUCTOR Digikey.ca 553-1567-5-ND 1 4.54 4.54
LOW
POWER
330UH T/H
1N5821-E3/54 DIODE Digikey.ca 1N5821- 1 0.61 0.61
SCHOTTKY E3/54GITR-ND
30V 3A
DO201AD
MMBD914-7-F DIODE Digikey.ca MMBD914- 1 0.14 0.14
SWITCHIN FDICT-ND
G 75V 0.2A
SOT23-3
1N5819-TP DIODE Digikey.ca 1N5819-TPCT- 1 0.47 0.47
SCHOTTKY ND

40V 1A
DO41
TXB0108PWR IC 8-BIT Digikey.ca 296-21527-1-ND 1 2.65 2.65
TRNSTR
15KV ESD
20TSSOP
292304-1 CONN USB Digikey.ca A31725-ND 1 2.21 2.21
RECEPT
R/A TYPE B
4POS
FK18X7R1E104 CAP CER Digikey.ca 445-8421-ND 9 0.23 2.07
K 0.1UF 25V
10%
RADIAL
FK26X7R1E106 CAP CER Digikey.ca 445-8552-ND 4 0.72 2.88
K 10UF 25V
10%
RADIAL
ECA-2EHG4R7 CAP ALUM Digikey.ca P5866-ND 1 0.49 0.49
4.7UF
250V 20%
RADIAL
FK22X5R1E156 CAP CER Digikey.ca 445-8474-ND 1 1.22 1.22
M 15UF 25V
20%
RADIAL
FK28X7R1H152 CAP CER Digikey.ca 445-5247-ND 1 0.33 0.33
K 1500PF
50V 10%
RADIAL
C322C274K5R5 CAP CER Digikey.ca 399-9806-ND 1 0.79 0.79
TA 0.27UF
50V 10%
RADIAL
TPSD477M006 CAP TANT Digikey.ca 478-1791-1-ND 1 4.68 4.68
R0200 470UF
6.3V 20%
2917
ECA-1VHG101 CAP ALUM Digikey.ca P5551-ND 1 0.36 0.36
100UF 35V
20%
RADIAL
EEU-FC1E331 CAP ALUM Digikey.ca P10273-ND 1 0.65 0.65
330UF 25V
20%
RADIAL
CF14JT68R0 RES 68 Digikey.ca CF14JT68R0CT- 2 0.10 0.20
OHM ND
1/4W 5%
CARBON
FILM

73 | P a g e
CF18JT300R RES 300 Digikey.ca CF18JT300RCT- 3 0.11 0.33
OHM ND
1/8W 5%
CF AXIAL
CF18JT150R RES 150 Digikey.ca CF18JT150RCT- 1 0.11 0.11
OHM ND
1/8W 5%
CF AXIAL
CF14JT1K50 RES 1.5K Digikey.ca CF14JT1K50TR- 1 0.10 0.10
OHM ND
1/4W 5%
CARBON
FILM
CF14JT10K0 RES 10K Digikey.ca CF14JT10K0TR- 1 0.10 0.10
OHM ND
1/4W 5%
CARBON
FILM
CF12JT3K00 RES 3K Digikey.ca CF12JT3K00TR- 1 0.17 0.17
OHM ND
1/2W 5%
CARBON
FILM
CF18JT5K10 RES 5.1K Digikey.ca CF18JT5K10TR- 1 0.12 0.12
OHM ND
1/8W 5%
CF AXIAL
CF18JT7K50 RES 7.5K Digikey.ca CF18JT7K50TR- 1 0.11 0.11
OHM ND
1/8W 5%
CF AXIAL
CF18JT51K0 RES 51K Digikey.ca CF18JT51K0TR- 1 0.12 0.12
OHM ND
1/8W 5%
CF AXIAL
CF18JT24K0 RES 24K Digikey.ca CF18JT24K0TR- 1 0.12 0.12
OHM ND
1/8W 5%
CF AXIAL
CF18JT1M00 RES 1M Digikey.ca CF18JT1M00TR- 1 0.12 0.12
OHM ND
1/8W 5%
CF AXIAL
ATS120B CRYSTAL Digikey.ca CTX904-ND 1 0.43 0.43
12MHZ
18PF THRU
FK18C0G1H18 CAP CER Digikey.ca 445-4762-ND 2 0.35 0.70
0J 18PF 50V
5% RADIAL
PPTC042LFBN- CONN Digikey.ca S7072-ND 2 1.00 2.00
RC HEADER
FMALE

74 | P a g e
8PS .1" DL
TIN
PPTC252LFBN- CONN Digikey.ca S7093-ND 2 3.53 7.06
RC HEADER
FMAL
50PS .1" DL
TIN
SN74LVC2T45D IC BUS Digikey.ca 296-16845-1-ND 1 0.89 0.89
CTR TRANSCVR
2BIT N-INV
SM8
BT-2727 BOX POLY Digikey.ca 377-1545-ND 1 29.99 29.99
6.73X4.76X
3.16"
TR/BLU
4527 FUSE HLDR Digikey.ca 4527K-ND 1 0.93 0.93
CARTRIDG
E 250V 5A
PCB
7010.3480 FUSE 5A Digikey.ca 486-1864-ND 1 2.73 2.73
125V FAST
5X20
GLASS
S3B-EH(LF)(SN) CONN Digikey.ca 455-1626-ND 2 0.23 0.46
HEADER
EH SIDE
3POS
2.5MM
EHR-3 CONN Digikey.ca 455-1001-ND 2 0.10 0.20
HOUSING
EH 3POS
2.5MM
CRIMP
SEH-001T-P0.6 CONN Digikey.ca 455-1042-1-ND 1 0.59 0.59
(10pk) TERM
CRIMP EH
22-30AWG
S3B-PH-K- CONN Digikey.ca 455-1720-ND 2 0.20 0.40
S(LF)(SN) HEADER
PH SIDE
3POS 2MM
PHR-3 CONN Digikey.ca 455-1126-ND 2 0.11 0.22
HOUSING
PH 3POS
2.0MM
SPH-002T- CONN Digikey.ca 455-1127-1-ND 1 0.69 0.69
P0.5S TERM
(10pk) CRIMP PH
24-30AWG
S3B-XH- CONN Digikey.ca 455-2250-ND 8 0.25 2.00
A(LF)(SN) HEADER
XH SIDE

75 | P a g e
3POS
2.5MM
XHP-3 CONN Digikey.ca 455-2219-ND 8 0.12 0.96
HOUSING
2.5MM XH
3POS
SXH-002T-P0.6 CONN Digikey.ca 455-2261-1-ND 2 0.75 1.50
(10k) TERM
CRIMP XH
26-30AWG
SXH-002T-P0.6 CONN Digikey.ca 455-2261-1-ND 4 0.08 0.32
TERM
CRIMP XH
26-30AWG
1987724 CONN Digikey.ca 277-9146-ND 2 0.48 0.96
TERM
BLOCK
2POS
5.0MM
1984772 CONN Digikey.ca 277-1735-ND 1 0.79 0.79
TERM
BLOCK
RT/A 3POS
3.5MM
AR14-HZL-TT-R IC SOCKET Digikey.ca AE10012-ND 1 0.95 0.95
MACH PIN
ST 14POS
TIN
IM120606003 2.4G Robotshop.c RB-Ite-47 2 18.65 37.30
nRF24L01 a
Wireless
Module w
/ PA and
LNA
ASB-04 Lynxmotio Robotshop.c RB-Lyn-81 3 13.28 39.84
n a
Aluminum
Multi-
Purpose
Servo
Bracket
Two Pack
ARF-02 Lynxmotio Robotshop.c RB-Lyn-183 1 19.94 19.94
n Black a
Aluminum
Robot Feet
28015 Parallax RB-Plx-73 2 33.32 66.64
PING
Ultrasonic
Sensor
GP2Y0A21YK0F Sharp Robotshop.c RB-Dem-01 2 11.06 22.12
GP2Y0A21 a

76 | P a g e
YK0F IR
Range
Sensor
SEA-02 SEA-02 Robotshop.c RB-Onl-03 6 2.17 13.02
Servo a
Extender
Cable
SIRC-01 SIRC-01 Robotshop.c RB-Onl-11 2 2.17 4.34
Sharp GP2 a
IR Sensor
Cable
1591LSBK Box Access 60008 1 3.99 3.99
Electronique
D.D.O
Switch On-Off Access 120716 1 1.99 1.99
Switch Electronique
D.D.O
HS-5485HB Servo Amazon.ca B001KYSE2G 1 25.99 25.99
Motor
HS-5485HB Servo Amazon.com B001KYSE2G 3 23.18 69.55
Motor (21.23 (63.69 US $)
US $ )
HS-5485HB Servo Amazon.com B001KYSE2G 4 22.00 87.43
Motor (20.15 (80.06 US$)
US $)
Acrylic 24”x48” Canadian 093-6403-4 1 29.98 29.98
tire
Venom 5C 2S Lithium Bigboyswithc VNR15000 2 39.99 79.98
2400mAh 7.4 polymer ooltoys.ca
battery
VNR0653 Venom Bigboyswithc VNR0653 1 25.99 25.99
Power 2-3 ooltoys.ca
Cell LiPo
Balance
Charger

TOTAL before tax 642.93

Tax : 5% 32.15
Tax: 7.5% 48.22
TOTAL 723.30
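The totals above can be reproduced with a short script (the two tax rates come directly from the table; half-up rounding to the cent is assumed):

```python
from decimal import Decimal, ROUND_HALF_UP

def to_cents(x):
    """Round a Decimal amount to two places, half-up."""
    return x.quantize(Decimal("0.01"), ROUND_HALF_UP)

subtotal = Decimal("642.93")
tax_5 = to_cents(subtotal * Decimal("0.05"))     # 32.15
tax_7_5 = to_cents(subtotal * Decimal("0.075"))  # 48.22
total = subtotal + tax_5 + tax_7_5               # 723.30
```

Decimal arithmetic is used so the cent rounding is exact rather than subject to binary floating-point error.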

77 | P a g e
APPENDIX E: Sustainability Definitions

Air Acidification - Sulfur dioxide, nitrogen oxides, and other acidic emissions to air increase
the acidity of rainwater, which in turn acidifies lakes and soil. These acids can make land and
water toxic for plants and aquatic life. Acid rain can also slowly dissolve manmade building
materials such as concrete. This impact is typically measured in units of either kg sulfur dioxide
equivalent (SO2) or moles H+ equivalent.
Carbon Footprint - Carbon dioxide and other gases that result from the burning of fossil fuels
accumulate in the atmosphere, which in turn increases the earth’s average temperature. Carbon
footprint acts as a proxy for the larger impact factor referred to as Global Warming Potential
(GWP). Global warming is blamed for problems such as the loss of glaciers, extinction of species,
and more extreme weather, among others.
Total Energy Consumed - A measure of the non-renewable energy sources associated with the
part’s lifecycle in units of megajoules (MJ). This impact includes not only the electricity or fuels
used during the product’s lifecycle, but also the upstream energy required to obtain and process
these fuels, and the embodied energy of materials, which would be released if burned. Primary
energy demand (PED) is expressed as the net calorific value of energy demand from non-renewable
resources (e.g. petroleum, natural gas). Efficiencies in energy conversion (e.g. power, heat,
steam) are taken into account.
Water Eutrophication - Eutrophication occurs when an overabundance of nutrients is added to a
water ecosystem. Nitrogen and phosphorus from wastewater and agricultural fertilizers cause an
overabundance of algae to bloom, which then depletes the water of oxygen and results in the
death of both plant and animal life. This impact is typically measured in either kg phosphate
equivalent (PO4) or kg nitrogen (N) equivalent.
Life Cycle Assessment (LCA) - This is a method to quantitatively assess the environmental impact
of a product throughout its entire lifecycle, from the procurement of the raw materials, through
the production, distribution, use, disposal and recycling of that product.
Material Financial Impact - This is the financial impact associated with the material only. The
mass of the model is multiplied by the financial impact unit (units of currency/units of mass) to
calculate the financial impact (in units of currency).
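The definition above is a single multiplication. A minimal sketch with made-up numbers (neither value comes from the report):

```python
mass_kg = 0.80      # hypothetical part mass, kg
cost_per_kg = 3.50  # hypothetical material cost, currency units per kg

# Material financial impact = mass x financial impact per unit mass.
financial_impact = round(mass_kg * cost_per_kg, 2)
print(financial_impact)  # 2.8 currency units for this hypothetical part
```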
NOTE: All Definitions have been taken from Solidworks Sustainability Analysis Test.

APPENDIX F: Speech Recognition Test Data
F.1 Male 1
Trial  Ready  End  Go Forward  Back  Left  Turn Right  Slow  Speed Up
(1 = command recognized, 0 = not recognized; applies to all tables in this appendix)
1 1 1 1 1 1 1 1 1
2 0 0 1 0 0 1 0 1
3 1 0 1 1 0 1 0 1
4 1 1 1 1 1 0 0 1
5 1 0 1 1 1 0 1 1
6 1 0 1 1 0 1 1 1
7 1 0 1 1 1 1 1 1
8 1 1 1 1 1 1 0 1
9 1 0 1 1 1 1 1 1
10 1 1 1 1 1 1 1 1
F.2 Male 2
Trial  Ready  End  Go Forward  Back  Left  Turn Right  Slow  Speed Up
1 1 1 1 1 0 1 1 1
2 1 1 1 1 1 1 1 1
3 1 1 1 1 0 1 1 1
4 1 1 1 1 0 1 1 1
5 1 1 1 1 1 1 1 1
6 1 1 1 1 1 1 1 1
7 1 1 1 1 1 1 0 1
8 1 1 1 0 1 1 1 1
9 1 1 1 1 0 1 1 1
10 1 1 1 1 1 1 1 1
F.3 Male 3
Trial  Ready  End  Go Forward  Back  Left  Turn Right  Slow  Speed Up
1 1 0 1 1 0 0 1 1
2 1 1 1 1 0 0 1 1
3 1 1 1 0 0 0 1 1
4 1 1 1 1 0 0 1 1
5 1 1 1 1 0 0 1 1
6 1 1 1 1 0 0 1 1
7 1 1 1 1 1 0 0 1
8 1 1 1 0 0 0 1 1
9 1 1 1 1 0 0 1 1
10 1 0 1 1 0 0 1 1

F.4 Female 1
Trial  Ready  End  Go Forward  Back  Left  Turn Right  Slow  Speed Up
1 0 1 1 1 1 1 1 1
2 1 1 1 1 1 1 1 1
3 1 1 1 1 1 1 1 1
4 1 1 1 1 1 1 1 1
5 1 1 1 1 1 1 1 1
6 1 1 1 1 1 0 1 1
7 1 1 1 1 1 1 1 1
8 1 1 1 1 1 1 1 1
9 0 1 1 1 1 1 1 1
10 1 1 1 1 1 0 1 1
F.5 Female 2
Trial  Ready  End  Go Forward  Back  Left  Turn Right  Slow  Speed Up
1 1 0 1 1 1 1 1 1
2 1 0 0 1 1 1 0 1
3 1 0 1 1 1 1 0 1
4 1 1 0 1 0 1 0 1
5 1 1 1 1 1 1 0 1
6 1 1 1 1 1 0 0 1
7 1 1 1 0 1 1 0 1
8 1 1 1 1 1 0 0 1
9 1 1 0 1 1 1 0 1
10 1 1 1 1 1 1 0 1
F.6 Female 3
Trial  Ready  End  Go Forward  Back  Left  Turn Right  Slow  Speed Up
1 1 1 1 1 1 1 1 1
2 1 1 1 1 1 1 1 1
3 1 1 1 1 1 1 1 1
4 1 1 1 1 1 1 1 1
5 1 1 1 1 1 0 1 1
6 1 1 1 1 1 1 1 1
7 1 1 0 1 1 1 1 0
8 1 1 1 1 1 1 1 1
9 1 0 0 1 1 1 1 1
10 1 1 1 1 1 1 0 1
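The recognition rate for each command is simply the column mean of the 0/1 trial results above. A small helper illustrates the calculation (the sample matrix below is illustrative, not the actual measurements from these tables):

```python
def recognition_rates(trials, commands):
    """Fraction of successful (1) recognitions per command column."""
    n = len(trials)
    return {cmd: sum(row[i] for row in trials) / n
            for i, cmd in enumerate(commands)}

# Illustrative data: 4 trials of 3 commands (not the report's measurements).
commands = ["Ready", "End", "Go Forward"]
trials = [
    [1, 1, 1],
    [1, 0, 1],
    [1, 1, 0],
    [1, 1, 1],
]
rates = recognition_rates(trials, commands)
# rates: {"Ready": 1.0, "End": 0.75, "Go Forward": 0.75}
```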

APPENDIX G: Object Detection Logic
G.1 Moving Forward, IR Sensor 1, IR Sensor 2, US Sensor, Stop Command
Forward  IR1  IR2  US   Stop
0        X    X    X    0
1        0    0    0    0
1        0    0    1    0
1        0    1    0    0
1        0    1    1    1
1        1    0    0    0
1        1    0    1    1
1        1    1    0    1
1        1    1    1    1
(X = don't care)

G.2 Turning Right/Left, IR1, IR2, US, Stop
Turning RL  IR1  IR2  US   Stop
0           0    0    0    0
0           0    0    1    0
0           0    1    0    0
0           0    1    1    1
0           1    0    0    0
0           1    0    1    1
0           1    1    0    1
0           1    1    1    1
1           X    X    X    0
(X = don't care)
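Both truth tables reduce to a majority vote of the three obstacle sensors, gated by the robot's current motion: sensor readings only trigger a stop while walking straight ahead, never while turning. A sketch of that combined logic (the function and signal names are ours, not the project's actual firmware, which runs in C on the STM32F3):

```python
def stop_signal(forward, turning, ir1, ir2, us):
    """Return 1 (stop) when moving straight forward and at least two of
    the three obstacle sensors fire; sensor readings are ignored otherwise."""
    if not forward or turning:
        return 0
    # Majority vote: stop on sensor patterns 011, 101, 110, 111.
    return 1 if (ir1 + ir2 + us) >= 2 else 0
```

This reproduces every row of tables G.1 and G.2: a single sensor firing is treated as a possible false reading, while agreement between any two sensors asserts the stop command.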

APPENDIX H: Handshake Protocol

APPENDIX I: Control Code Flow Chart

The flow chart can be summarized in text form as follows:

1. Start: wait for a handshake signal from the PC.
2. If the handshake signal is not valid, ignore it and keep waiting.
3. On a valid handshake, wait for the initial input command. If that command is Ready, run the
   start routine; otherwise keep waiting.
4. Wait for the next input command and dispatch it:
   - Go Forward: run the go-forward movement routine.
   - Back: run the back movement routine.
   - Turn Right: run the turn-right movement routine.
   - Left: run the left movement routine.
   - Speed Up: if the current speed is already at maximum, signal an error; otherwise increase
     the system's movement speed setting.
   - Slow: if the current speed is already at minimum, signal an error; otherwise reduce the
     system's movement speed setting.
   - End: stop and return to waiting for a handshake signal.
5. After each command is handled, return to step 4 and wait for the next input command.
APPENDIX J: Gantt Chart

APPENDIX K: Speech-Recognition Flow Chart

In text form, the speech-recognition flow chart proceeds as follows:

1. Start: the user selects a mode - Training, Command Recognition, or Exit.
2. Training mode:
   - The user chooses which command to train.
   - The user indicates when they are ready to record; the program loops until
     ready-to-record = 1.
   - The signal is recorded for 2 seconds, pre-processed, and its MFCCs are extracted.
   - The user indicates whether to save; if continue-to-save = 1, the data is saved to the
     database.
3. Command Recognition mode:
   - The neural network toolbox is trained on the saved data.
   - The user indicates when they are ready to give a command; the program loops until
     ready-to-record = 1.
   - The signal is recorded for 2 seconds and the recognized command is displayed.
4. Exit ends the program.

