
Integrated Assistive Module for

Cartbot (Final Report)


11.19.2019

Team Members
Alphonsus Adu-Bredu
Andrew Sack
James Liao
Jonathan Chang

Client
Prof. William Messner, Assistive Robotics Group, Tufts University
Executive Summary

Problem Background

Design Recommendations
Problem Definition
Concept Generation and Evaluation
Final Weighted Matrix of Requirements and Needs
Final Design Solution
Object Acquisition Mechanism
Elevation mechanism
Control Interfaces
Joystick Controller
Voice Control
Web Interface
Experimental validation of design solution
Data Analysis and Summary of Experimental Validation
Visualization of Data
Interpretation of data

Conclusions and Future Work

Strengths of design solution
Limitations of design solution
Suggestions for revisions to the design solution

APPENDIX
A. Needs and Requirements Matrix
B. Bill of Materials
C. Assembly Procedure
D. Technical Drawings
E. User Guide

REFERENCES

Executive Summary
Patients living with spinal cord injury face exorbitant costs and struggle to
be independent. Most efforts in the assistive robotics industry focus on building a
human-like personal ‘companion’, making the robots overly expensive and complex. No
product on the market is a simple, low-priced robot that accomplishes the basic tasks
that help foster independence.

This unmet need led to the creation of the Cartbot, an idea originated by Professor
Messner at Tufts University in the Assistive Robotics Lab. Cartbot is a modular assistive
robotic platform designed to help people with spinal cord injuries. The goal of the product is
to be affordable, easy-to-produce, utilitarian, and provide autonomy and control to the user.
The robot is crafted from a shopping cart with multiple modules built off the cart frame to
accomplish basic tasks. This is what sets Cartbot apart from other domestic robotics
efforts: the approach is utilitarian and practical. The focus is on building a
functional product, a robot that performs the basic functions SCI patients need help with
using minimal hardware complexity. It does not need to be flashy, and costs must be
kept down.

One of these basic functions SCI patients need is assistance with picking up objects
off the floor. Some of the patients taking part in the Assistive Robotics Lab studies said they
would like a module that could pick up objects that had fallen to the floor and
bring them back[1]. Our project was to build a module, attached to Cartbot, that can
pick up objects off the floor and return them to the user. We accomplished this by dividing
the task into two distinct parts: a modular pick-up mechanism able to lift a
wide variety of objects off the floor, and an elevation mechanism to deliver the grounded
object to the user or onto a table.

Our goal for this semester was to build a sound, robust, and capable
mechanism for picking up objects. The pick-up mechanism needed a margin for error
large enough that a single module could pick up a variety of objects, and enough
positional tolerance that voice commands would be easy to use and timely. Once in the
pick-up module, the object should be raised to a level where it can be delivered to a user
sitting in a chair. Most existing object-grabber technology requires a high degree of
precision, so our compartment-scooper approach was better suited to allowing a margin of
error. We came up with our own modular design.
An emphasis of our module, in keeping with its deliberately simple design, was that people
with spinal cord injuries care about a sense of autonomy and having some control over the
product they use. In studies conducted by the Assistive Robotics Lab, disabled persons
were brought into the lab and questioned and tested on the types of modules that would be
most helpful to them[1]. They determined that freedom of control was desirable because of
the agency it provided in dictating and executing object manipulation tasks[1]. From those
results, our goal became a robot that is not some form of autonomous personal
assistant, but an extension of the user’s physical ability. The target users (people with spinal
cord injury) are physically limited but generally cognitively capable. As such, the
robots are not required to be autonomous or ‘intelligent’. The users want to see, think, and
control the robot for themselves to perform tasks. The robots only need the
mechanical robustness and software capabilities to be easily and effectively teleoperated by
the user. We accomplished this by building controls around a video game controller, then
adding a voice-command mode backed by a self-contained language-processing
program, and a web interface for controlling Cartbot from a phone.

Problem Background
A recent estimate from 2019 showed that the annual incidence of spinal cord
injury (SCI) is approximately 54 cases per one million people in the United States, or about
17,730 new SCI cases each year[3]. An estimated 291,000 persons with SCI live
in the United States[3]. Most people with serious physical disabilities face exorbitant
costs, as documented in that study[3].

These costs cause serious financial strain even for an upper-class family and are not close
to affordable for a majority of these patients. Part of these costs comes from associated care
outside the hospital: average outpatient care costs range from $12,183 for cervical
complete injuries to $7,168 for lumbar complete injuries[4]. Many tasks become difficult
when living with these injuries, and even easy tasks can require assistance, so having
someone tend to basic tasks at all times is costly. There is a clear market, in the U.S. and
worldwide, for assisting people with serious physical disabilities at a much
lower cost.

There are countless engineering projects and companies that aim to create robots
and tools that enable autonomy for disabled persons. Most efforts in the industry focus on
building a personal ‘companion’ for their users, putting a great deal of effort and
money into making their robots as human-like as possible. This decision causes
manufacturing and engineering costs to skyrocket, making the robots too expensive for a
middle-class person to purchase. It is also not what the actual users desire: SCI patients
are not looking for a companion, as they already have human companions
who help with complex tasks no current technology can assist with. They want a tool to help
them with the simple tasks that they are still unable to manage on their own[1].

Design Recommendations

Problem Definition
Based on the major points we discussed with our client, Prof. Messner, we specified
the needs for the module we would develop. We organized the needs in a table, shown in
Appendix A. The major needs we anticipate fall into four groups: the
user interface or mode of interaction; the functional expectations of the module; the
structural integrity and reliability of the module; and the safety needs it must satisfy to be
deemed suitable for use.

We discussed with Prof. Messner the various ways a user can interface with and control
the module. Prominent among the options were voice control, joystick control,
and autonomous object location and pick-up. The interface mode we all decided on was
voice control: the users would control the module by giving it directional
commands. We would also let the user tune the sensitivity or speed of the
module’s motion themselves, again by voice command. This way, the user can have as
much control as they need to use the pick-up module.

As for the main functional expectation of our module, what we were expected to
accomplish first this semester was to pick up a water-bottle-sized object from an open, hard
floor and raise it to roughly chest height. Once we accomplished that, we would then
improve and modify the module to pick up objects in difficult locations such as in a corner,
on a carpet, or under a table. We also aimed to pick up objects with a less rigid morphology,
like towels, shirts, or socks.

Our client, Prof. Messner, expects the attachment interfaces between the module and
Cartbot to be firm and robust to mild impacts. He also expects the wiring to be neatly tucked
aside to prevent it from accidentally winding around the module as the module’s actuators
move. The module’s software should likewise function reliably, with
minimal interference such as the program crashing or malfunctioning while a user is using
Cartbot.

The module also needs to operate safely in a manner that doesn’t put the user at risk
of physical injury. The speed of the actuators would need to be tuned to be slow enough to
give a person standing close to Cartbot ample time to get out of the way as it picks up an
object from the floor. The module’s parts have to be securely fastened together to prevent
them from flying off and hurting someone as Cartbot moves around a room.
To ensure the proper functioning of the product and to meet the needs of the
customers, we created a table of requirements, shown in Appendix A. The requirements
mostly focus on the functional and technical aspects of the module, as well as
its social and safety requirements.

A major functional requirement of the module is its precision. The module is required
to perform pick-up tasks within a ±10 cm margin of error. We believe this margin is
forgiving enough that if the module misses its mark by that amount,
it will still pick up the target object successfully. The module should also be able
to alter its height within a range of 1.5 meters from the ground, which according to [5]
covers the average height of domestic desks. The module should be able to reach objects
within 0.5 meters of Cartbot. Since our initial designs of the picking mechanism
extend about 0.5 meters from the base of Cartbot, we believe this should
be the module’s range of reach. Going beyond 0.5 meters would let the picking
mechanism compromise the maneuverability of Cartbot, especially in closed spaces or
cluttered environments.
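The numeric requirements above can be collected into a simple reachability check for the control software. This is only a sketch; the constant and function names are ours, not from the actual codebase:

```python
# Functional requirements from this section; names are illustrative only.
POSITION_TOLERANCE_M = 0.10   # ±10 cm pick-up margin of error
MAX_LIFT_HEIGHT_M = 1.5       # vertical travel of the elevation mechanism
MAX_REACH_M = 0.5             # horizontal reach from the base of Cartbot

def target_is_reachable(forward_m, height_m):
    """Check whether a target point (distance ahead of the cart, height
    above the floor) lies inside the module's operating envelope."""
    return 0.0 <= forward_m <= MAX_REACH_M and 0.0 <= height_m <= MAX_LIFT_HEIGHT_M

def within_pickup_tolerance(error_m):
    """An approach that misses its mark by less than the tolerance
    should still pick up the object successfully."""
    return abs(error_m) <= POSITION_TOLERANCE_M
```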

The module should be easy for the user to control through verbal
commands, as in the work by Seung-Joon Yi et al. [6]. The reasoning is that
since users of Cartbot already have their hands full controlling the joystick of their
electric wheelchair as well as that of Cartbot, it would be inconvenient to give them yet
another physical controller for the module. Voice control therefore appears to be the best
option. Since Cartbot already uses voice control to trigger certain behaviors [1],
setting up voice control for the module is a straightforward task. An average of three
verbal commands is all the user will need to get the module to reach for and pick up an
object.

A major social requirement is the aesthetics of the module. Professor Messner
emphasized consistently that even though a good-looking module would be appreciated, we
should focus on functionality over aesthetics; only once we have a functional prototype
should we go ahead and make the module look nice. We aim to use the high-torque muffled
servos in Prof. Messner’s lab, so motor noise should not be a problem with the
module.

A significant safety concern would be the speed of the moving parts. We aim to limit
the speed of the servo motors to 10 rev/s to reduce the chances of the module hurting a
user. We would spend a significant amount of time designing and implementing rounded
edges on the modules and potentially padding the sides of the module with soft material.

Concept Generation and Evaluation


We each independently came up with concepts for the object-picker module
and made quick prototypes of them. Diagrams of our concept generation approach are in
Figure 3 below. When we met, we weighed the advantages and disadvantages of each of
our modules and selected the prototype that offered the most advantages. Prominent
among the candidate prototypes were the sweeping mechanism and the scooping
mechanism (Figures 1 & 2). We chose the scooping mechanism for the
following reasons:

1. It was able to pick up a variety of objects, from voluminous ones to flat, credit-card-sized
ones
2. It did not require much positioning accuracy to pick up an object successfully
3. It required very little actuation force to pick up an object successfully

Figure 1: First pan concept



Figure 2: The scooping mechanism (top) and the sweeping mechanism (bottom)

Figure 3.a: Concepts for the scooping mechanism



Figure 3.b: More concepts of the scooping mechanism

For the elevation mechanism, we considered two main approaches:

The first involved a pulley mechanism made up of cascading linear bars all
connected together with a strong string. This mechanism was inspired by the Cascading
X-rail Slide Kit [7]. One end of the string is connected to the end of a bar while the other end
is connected to the pulley wheel of the servo. When the servo rotates in one direction, the
cascading bars extend until they are stretched end-to-end (Figure 4). When the servo is
rotated in the opposite direction, the bars collapse together and are stacked on top of each
other (Figure 5).

Figure 4: The cascading slides in the extension mode [7]



Figure 5: The cascading slides in collapsed position [7]

Bringing these two ideas together, we came up with the extension concept illustrated in
Figure 6 below.

The elevation concept is made up of two extruded aluminum bars: a longer one,
36 inches long, which is fixed to the cart, and a shorter one, 12 inches long, which slides
along it. A string is attached to the top of the short bar to raise it. The other end of the string
is connected to a pulley mechanism fixed to a servo motor. Just as in the X-rail cascading
slide above, when the servo rotates in one direction, the short bar is pulled up along the long
bar; when it rotates in the other direction, the short bar slides back down the long bar under
its own weight.

As indicated in Figure 6, this concept would be mounted on the front end of Cartbot.

Figure 6: First elevation mechanism concept



Final Weighted Matrix of Requirements and Needs


The final weighted decision matrix can be found in Appendix A. It remained
unchanged at the end of the project. From the matrix, the three major requirements were
usage, precision of pick-up, and cost. Usage had the highest target values because all the
needs we came up with related strongly to it. Since the users of the pickup module have
physical disabilities, it was imperative that the product focus on enabling people with limited
physical abilities to use it. With this in mind, we developed three independent control
interfaces for the module: voice control, joystick control, and a web-application control.
More details about each of these interfaces are given in subsequent sections. With this
diverse set of interfaces, a user has the freedom to control the robot in whichever way
they find easiest.

The attribute with the second-highest target value was the precision of a pickup task.
Since the module was going to be teleoperated, it was important that our pickup mechanism
have a high success rate, so the user would not have to repeat the task multiple times to
pick up an object. A high pickup success rate also makes the module less stressful to use in
the long run. We addressed this need by designing an object-collecting mechanism we call
the “pan”. The pan’s collection mechanism was designed to require very little placement
precision to pick up an object successfully. More details about the pan’s mechanism are
given in subsequent sections.

The final important attribute was the cost of the pickup module. This attribute was
important because it significantly determined whether or not a user would purchase and use
our product. Even if the other desired attributes were met, a user would still not use our
product if it ended up being far more expensive than they could afford. As such, we
focused on using inexpensive parts to build the module. We also put significant thought
and effort into reducing the number of physical actuators we used, relying instead on natural
forces like gravity in the operation of the module. Actuators were among the most expensive
components of our module, so reducing our dependence on them significantly lowered
the cost of production. Details about how we did this are given in subsequent sections.
Final Design Solution

The final design solution for the pickup mechanism consists of three components: the
object acquisition mechanism (also referred to as the pan), the object elevation mechanism
(also referred to as the elevator), and the controls for the mechanism. Each of these
components is discussed separately in the subsections below.

Object Acquisition Mechanism

The object acquisition mechanism is the core element of our design. It is responsible
for aligning an object to be picked up, picking it up, keeping it secure during motion and,
finally, releasing the object in the desired location. As such, this mechanism has received a
majority of our focus and has gone through numerous revisions. However, the core concept
has remained relatively constant throughout the process as is visible if you compare the first
prototype in Figure 1 to the final design in Figure 7.

Figure 7: Final Pan design

A servo drives two meshed gears, which rotate in opposite directions. Attached to these
gears are the pan halves, which wedge under an object when they are driven towards each
other. When the halves are shut, they form a full pan whose walls prevent the object from
falling out. The leading edge of the pan is chamfered to let it slide smoothly under flat
objects. Angled pieces inside the pan center objects and prevent them from becoming stuck
in the corners, ensuring that when the pan is opened, the object is dropped.

The pan is made of ⅛” acrylic, a thin and easy-to-work-with material. It is
fastened together with superglue because no fasteners can protrude beyond the bottom of
the pan; doing so would prevent the pan from sitting flat on the floor and sliding
under flat objects. The gears are 3D printed in a material called Onyx, a chopped-carbon-
fiber-reinforced nylon. This material was selected for its strength and low friction,
as well as the print quality of the Markforged printer needed to print it. All of these properties
are important for gear teeth.

The whole assembly attaches to the elevation mechanism with T-nuts that slot into
the extrusion of the elevation mechanism. Many of the later revisions to the gearbox design
relate to how it mounts to the elevator. Earlier versions of the gearbox were larger, especially
in how far they protruded from the front of the cart. One of these earlier prototypes is shown
below in Figure 8. This larger moment arm increased the torque on the elevator causing it to
jam and derail, as well as causing the pan to flex downwards considerably. The final gearbox
was designed to be as compact as possible to reduce these issues.

Figure 8: Earlier prototype of pan
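The jamming problem above can be illustrated with the usual static moment-arm relation, τ = m·g·d. The pan mass and arm lengths below are hypothetical, chosen only to show why compacting the gearbox reduces the torque on the elevator:

```python
G = 9.81  # gravitational acceleration, m/s^2

def elevator_torque_nm(pan_mass_kg, moment_arm_m):
    """Static torque the pan's weight exerts on the elevator rail:
    tau = m * g * d."""
    return pan_mass_kg * G * moment_arm_m

# Hypothetical numbers: halving how far the pan protrudes halves the torque.
early = elevator_torque_nm(0.5, 0.30)   # larger early-prototype protrusion
final = elevator_torque_nm(0.5, 0.15)   # compact final gearbox
```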


Elevation mechanism

Figure 9: Close-up view of the elevation mechanism’s actuation. A - pulley wheel, B -
roller, C - servomotor, D - metallic brackets, E - string, F - aluminum extrusion

The elevation mechanism is a simple pulley mechanism (A) actuated by a
servomotor (C). One end of the string (E) is attached to the top of the vertical pan
attachment; the other end is wound over a roller (B) and around a wheel that is directly
attached to the head of the servomotor (A). When the servomotor rotates clockwise, it winds
the string along the rim of the wheel, pulling the other end of the string upwards,
which in turn raises the pan. When the servomotor rotates counter-clockwise, it unwinds the
string from the wheel; this reduces the tension in the string and lets gravity
gradually lower the pan to ground level. The level to which the pan is raised or
lowered depends on how much of the string is wound or unwound. To be precise, the height
of the pan above the ground is exactly equal to the length of string that has been wound
onto the wheel by the servomotor. This relationship lets us raise the pan
to precise heights above the ground.
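Since the pan’s height equals the length of string wound onto the wheel, the servo rotation needed for a given height follows from the circumference of the wheel. A minimal sketch; the wheel radius here is a placeholder, not the measured value:

```python
import math

WHEEL_RADIUS_M = 0.02  # hypothetical pulley-wheel radius

def servo_turns_for_height(height_m, radius_m=WHEEL_RADIUS_M):
    """Number of full wheel revolutions needed to wind height_m of string:
    one revolution winds one circumference, 2 * pi * r, of string."""
    return height_m / (2 * math.pi * radius_m)
```

For example, with a 2 cm wheel, raising the pan 0.5 m requires about 0.5 / (2π · 0.02) ≈ 4 revolutions.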
The entire elevation mechanism is framed with a 55-inch-long aluminum extrusion (F)
firmly attached to the cart. The servomotor is attached to the right side of the top
end of the extrusion by metallic brackets (D), as shown in the image above. We
initially used laser-cut plastic for the attachments, but those eventually broke
because they were not tough enough to withstand the torsional stress exerted on them as
the servomotor raised a heavy pan. The servomotor is controlled by PWM signals sent to
it from an Arduino.

Control Interfaces

Joystick Controller

Figure 10: Joystick controller. (1) - Open pan button, (2) - Close pan button, (3) - move cart
toggle, (4) - elevate cart toggle

Joystick control is an alternative mode of control for the module. The controller has two
dedicated buttons for opening (1) or closing (2) the pan. The left toggle (4) controls the
elevation mechanism: the pan moves up continuously while the left toggle is held up, stops
when the toggle is released, and is lowered while the toggle is held down. Pushing the left
toggle left or right has no effect. The right toggle (3) controls the movement of the cart itself:
the cart moves forward when the toggle is pushed up, backward when it is pushed down,
and turns left or right when the toggle is pushed left or right respectively.

The joystick communicates with the Raspberry Pi over a 2.4 GHz wireless channel.
When a button is pressed or a toggle is pushed, a signal is sent over this channel to the
Raspberry Pi, which interprets the signal upon receiving it and publishes the interpretation.
The control software we developed for the module receives this interpretation and sends
the corresponding command to the Arduino, which in turn sends the appropriate PWM
signals to the servos that make up the module.
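The interpretation step on the Raspberry Pi amounts to a lookup from controller events to module commands. A simplified stand-in for that mapping (the event and command names are ours, and the real system publishes over ROS rather than returning strings):

```python
# Map dedicated buttons to the commands forwarded to the Arduino.
BUTTON_COMMANDS = {
    "button_1": "OPEN_PAN",
    "button_2": "CLOSE_PAN",
}

def interpret_left_toggle(y_axis):
    """Left toggle drives the elevator: pushed up raises the pan, pushed
    down lowers it, and releasing it (y == 0) stops the motion.
    Left/right deflection of this toggle is ignored."""
    if y_axis > 0:
        return "PAN_UP"
    if y_axis < 0:
        return "PAN_DOWN"
    return "PAN_STOP"
```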

Voice Control

Voice control is the second mode of control for the pickup module. We used the
Google Cloud Speech API to handle the transcription of audio into text. When the user
gives a voice command, the audio is recorded and uploaded to Google Cloud’s servers,
which transcribe it with their speech recognition engine and send back the text result. The
command-parsing software we developed reads the text result and searches for ‘keyword
commands’ that we specify a priori in the software. If a keyword is found in the text result,
the software sends the corresponding signal to the Arduino, which in turn sends a PWM
signal to the corresponding servomotor to actuate it. The keyword commands are:

“Up”, “Down”, “Open”, “Close”

“Up” is the command to get the elevation mechanism to raise up the pan

“Down” is the command to get the elevation mechanism to lower the pan

“Open” is the command to get the pan to open up and release the object (if it is
holding one)

“Close” is the command to get the pan to close up

The command-parsing software is written to be robust to inaccuracies in transcription.
We do this by keeping the keyword commands as short as possible. For example, instead of
making the command “Raise the pan up”, we simply make it “up”. This way, if the
recognition engine transcribes the phrase as “Grace the can up” because of indoor
noise or accent differences, the “up” keyword is still found in the result and the
module acts as expected. The speech recognition engine also takes the context of
words into account when transcribing them. For example, “Raise the pan up” could just as
easily be transcribed as “Raise the pan app”, but since “up” is much more
contextually meaningful than “app” in that phrase, the recognition engine makes the
right transcription. The voice control interface works best in quiet environments.
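The keyword-spotting strategy described above can be sketched in a few lines. This is a simplified stand-in for our parsing software, not the code itself:

```python
KEYWORDS = ("up", "down", "open", "close")

def parse_command(transcript):
    """Return the first keyword command found in a transcription, or None.
    Scanning whole words means a garbled phrase like 'grace the can up'
    still maps to the intended 'up' command."""
    for word in transcript.lower().split():
        word = word.strip(".,!?")
        if word in KEYWORDS:
            return word
    return None
```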

Web Interface

One of the most important qualities of Cartbot is its usability. This matters for
the pick-up module, but also for Cartbot as a whole. Cartbot itself has multiple Raspberry
Pis and Arduinos that control all of its modules. We decided to run an Apache server on the
Raspberry Pi to create a remote way to control Cartbot. This covered the scope of
our pick-up project but also integrated the rest of the Cartbot features. It involved
hosting a web server on the Pi and designing a website with a user-interface controller that
simulates the physical joysticks and buttons of the joystick controller. The web UI uses the
nipplejs library for the joystick and HTML/Bootstrap for the front-end code. Standard ROS
packages were used in the server back end and act as a wrapper
controller for Cartbot. This allows any person with access to the website/app to
control Cartbot.

We decided this was a useful addition for two reasons. First, although users may be
unable to perform many complex tasks, a subset of users have devised ways to use a
phone, as it is a very important capability to have. By creating a web interface, we take
advantage of those already-developed capabilities so that users can control Cartbot from
their phone. We also planned to integrate the voice controls into the server so that the
microphone of a user’s phone, which is right next to them, can be utilized. The second
reason was that a caretaker could potentially control Cartbot remotely and complete tasks
for people living with spinal cord injuries.
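On the back end, the nipplejs joystick position ultimately reduces to a linear/angular velocity pair for the drive base. A framework-agnostic sketch of that conversion (the real server wraps standard ROS packages; the speed limits here are assumptions, not measured values):

```python
def web_joystick_to_drive(x, y, max_linear=0.5, max_angular=1.0):
    """Convert a normalized web-joystick position (x, y in [-1, 1],
    y = forward) into (linear m/s, angular rad/s) drive velocities,
    clamped so out-of-range inputs stay within safe limits."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    return (clamp(y) * max_linear, clamp(x) * max_angular)
```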
Experimental validation of design solution

We ran extensive pick-up tests on five different objects, on two different ground
surface textures, for each of the three control interfaces we developed (i.e. joystick control,
voice control, and web app control). For each combination of ground surface texture, control
interface, and object type, we ran five independent trials. We measured both the success of
the first pick-up attempt (as a binary score: 1 for success, 0 for failure) and the time (in
seconds) taken to pick up the object successfully. If an initial pick-up attempt was
unsuccessful, we kept retrying until the pick-up succeeded, and the time we recorded
covers the entire retry process.

In all, we measured a total of 150 data points.
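The 150 figure follows directly from the full factorial design of the experiment; a quick sketch:

```python
objects = ["soda bottle", "phone", "stylus", "scissors", "cup"]
surfaces = ["hard, smooth", "rough, carpeted"]
interfaces = ["joystick", "voice", "web app"]
trials_per_condition = 5

# Every (object, surface, interface) condition gets five independent
# trials, each yielding one data point.
total_trials = len(objects) * len(surfaces) * len(interfaces) * trials_per_condition
```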

The object description, details and images can be found in the table and images below:

Object Size Weight

Soda bottle 23.0 x 6.1 x 6.1 cm 25 g
Phone 1.0 x 8.5 x 15.5 cm 220 g
Stylus 0.5 x 0.8 x 13 cm 10 g
Scissors 1.0 x 10.1 x 20.0 cm 50 g
Cup 8.1 x 6.0 x 6.0 cm 30 g
Table A: Test object specifications

Figure 11: Objects used in experimental validation (stylus, cup, soda bottle, scissors,
phone)



We considered two different ground surface textures. The first was the bare, smooth ground
and the second was a carpeted ground as shown in the images below:

Figure 12: Surface textures on which we performed pickup tasks: rough, carpeted ground
and smooth, hard ground

Figure 13: Pick-up tasks on hard, smooth ground and on rough, carpeted ground



Pick-up success/failures on Hard, smooth ground for different Control interfaces.

1 represents success and 0 represents failure in first pickup attempt

Joystick Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 1 1 1 1 1
Phone 1 1 1 1 1
Stylus pen 1 1 1 1 1
Pair of scissors 1 1 1 1 1
Cup 1 1 1 1 1

Voice command Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 1 1 1 1 1
Phone 1 1 1 0 1
Stylus pen 1 1 1 1 1
Pair of scissors 1 1 1 1 1
Cup 1 1 1 1 1

Web App Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 1 1 1 1 1
Phone 0 1 1 1 1
Stylus pen 1 1 1 1 1
Pair of scissors 1 1 1 1 1
Cup 1 1 1 1 1

Pick-up durations on Hard, smooth ground for different Control interfaces.

Durations are recorded in seconds

Joystick Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 10 11 13 26 12
Phone 11 18 13 13 24
Stylus pen 11 11 11 12 12
Pair of scissors 11 15 15 22 14
Cup 14 13 12 12 12

Voice command Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 24 34 21 46 18
Phone 32 38 21 31 29
Stylus pen 28 26 16 27 42
Pair of scissors 61 34 22 22 33
Cup 20 32 28 21 25

Web App Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 28 25 37 35 28
Phone 27 50 31 25 45
Stylus pen 22 25 29 22 27
Pair of scissors 22 29 22 23 28
Cup 23 23 25 26 24

Pick-up success/failures on Rough, carpeted ground for different Control interfaces.

1 represents success and 0 represents failure in first pickup attempt

Joystick Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 1 1 1 1 0
Phone 1 1 1 1 1
Stylus pen 1 1 1 1 1
Pair of scissors 1 1 1 1 1
Cup 1 1 1 1 1

Voice Command Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 1 1 1 1 1
Phone 1 1 1 0 1
Stylus pen 1 1 1 1 1
Pair of scissors 1 1 0 1 0
Cup 1 1 1 1 1

Web App Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 1 1 1 1 1
Phone 1 1 0 1 1
Stylus pen 1 1 1 1 1
Pair of scissors 1 0 0 0 0
Cup 1 1 1 1 1

Pick-up durations on Rough, carpeted ground for different Control interfaces.

Durations are recorded in seconds

Joystick Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 14 12 14 13 15
Phone 11 12 10 14 13
Stylus pen 12 12 9 9 10
Pair of scissors 14 13 15 13 14
Cup 13 13 14 14 15

Voice Command Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 36 31 36 21 30
Phone 24 20 22 19 21
Stylus pen 27 19 19 19 17
Pair of scissors 25 16 62 24 29
Cup 20 31 20 19 22

Web App Interface

Object Trial 1 Trial 2 Trial 3 Trial 4 Trial 5


Soda bottle 21 22 24 36 25
Phone 18 17 21 35 21
Stylus pen 23 16 16 25 20
Pair of scissors 17 45 51 34 48
Cup 19 29 18 33 24

Data Analysis and Summary of Experimental Validation

1. Average success rates (percentages) for objects for the different control interfaces on
Hard, smooth ground

Object Joystick Voice Command Web App


Soda bottle 100% 100% 100%
Phone 100% 80% 80%
Stylus pen 100% 100% 100%
Pair of scissors 100% 100% 100%
Cup 100% 100% 100%

2. Average duration in seconds for pick-up tasks for objects on Hard, smooth ground

Object Joystick Voice Command Web App


Soda bottle 14.1 28.6 30.6
Phone 15.8 30.3 35.9
Stylus pen 11.5 27.9 25.0
Pair of scissors 15.4 34.4 24.9
Cup 12.5 25.3 24.2

3. Average success rates (percentages) by object for each control interface on rough, carpeted ground

Object Joystick Voice Command Web App
Soda bottle 80% 100% 100%
Phone 100% 80% 80%
Stylus pen 100% 100% 100%
Pair of scissors 100% 60% 20%
Cup 100% 100% 100%

4. Average duration in seconds for pick-up tasks by object on rough, carpeted ground

Object Joystick Voice Command Web App
Soda bottle 13.6 30.8 25.6
Phone 12 21.2 22.4
Stylus pen 10.4 20.2 20
Pair of scissors 13.8 31.2 39
Cup 13.8 22.4 24.6
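The averages in tables 3 and 4 can be reproduced directly from the raw rough, carpeted ground trials. Below is a minimal sketch using the joystick data transcribed from the tables above; the variable and function names are ours, not part of the original analysis.

```python
# Joystick trials on rough, carpeted ground, transcribed from the raw tables.
success = {                       # 1 = success, 0 = failure
    "Soda bottle":      [1, 1, 1, 1, 0],
    "Phone":            [1, 1, 1, 1, 1],
    "Stylus pen":       [1, 1, 1, 1, 1],
    "Pair of scissors": [1, 1, 1, 1, 1],
    "Cup":              [1, 1, 1, 1, 1],
}
durations = {                     # seconds per trial
    "Soda bottle":      [14, 12, 14, 13, 15],
    "Phone":            [11, 12, 10, 14, 13],
    "Stylus pen":       [12, 12, 9, 9, 10],
    "Pair of scissors": [14, 13, 15, 13, 14],
    "Cup":              [13, 13, 14, 14, 15],
}

def success_rate(trials):
    """Per-object success rate as a percentage."""
    return 100 * sum(trials) / len(trials)

def mean_duration(trials):
    """Per-object mean pick-up duration in seconds."""
    return sum(trials) / len(trials)

for obj in success:
    print(f"{obj}: {success_rate(success[obj]):.0f}%, "
          f"{mean_duration(durations[obj]):.1f} s")
# e.g. "Soda bottle: 80%, 13.6 s", matching tables 3 and 4

# Overall joystick success rate on this floor texture:
overall = sum(success_rate(t) for t in success.values()) / len(success)
print(f"{overall:.0f}%")  # 96%
```

The same two helper functions applied to the hard, smooth ground trials reproduce tables 1 and 2.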

Visualization of Data

Figure 14: Bar chart of pickup durations on hard, smooth ground

Figure 15: Bar chart of pickup durations on rough, carpeted ground

Interpretation of data

● Joystick control was the most accurate control interface on both floor textures, with an average success rate of 100% on hard, smooth floor and 96% on rough, carpeted floor. Voice control and Web App control both averaged 96% on hard, smooth floor. Voice control, however, performed better on rough, carpeted floor, with an average success rate of 88% compared to Web App control's 80%.

● Across all objects and floor textures, Joystick control was the fastest control interface, with a mean duration of 13.86 seconds on hard, smooth floor and 12.72 seconds on rough, carpeted floor. Web App control was slightly faster than Voice control on hard, smooth floor, averaging 28.12 seconds against Voice control's 29.3 seconds. Voice control, however, was slightly faster on rough, carpeted floor, with a mean duration of 25.16 seconds against Web App control's 26.32 seconds.

● Using a two-sample t-test, we calculated the p-value for the durations of voice control and Web App control on both floor textures. The p-value for hard, smooth floor was 0.6772, while that for rough, carpeted floor was 0.7849. Since both p-values are well above the 0.05 significance threshold, we conclude that the difference in durations between voice control and Web App control is not statistically significant.

● We also used a two-sample t-test to calculate the p-values for the duration pairs of joystick vs. voice control and joystick vs. Web App control on both floor textures. For hard, smooth floor, the p-value between joystick and voice control was 0.000088369, while that between joystick and Web App was 0.001887. For rough, carpeted floor, the p-value between joystick and voice control was 0.0052, while that between joystick and Web App was 0.0135. Since all of these p-values are below 0.05, we conclude that the differences in durations between joystick control and each of the other two interfaces are statistically significant.
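The duration comparisons can be sketched with a standard-library Welch's t-statistic on the pooled rough, carpeted ground durations (25 values per interface, transcribed from the raw tables). This is our reconstruction: the report's p-values came from a spreadsheet TTEST function, whose exact variant (paired vs. two-sample, equal vs. unequal variance) is not recorded, so only the qualitative conclusion is expected to match.

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t-statistic (unequal variances)."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

# Pick-up durations (s) on rough, carpeted ground, pooled across all objects.
joystick = [14, 12, 14, 13, 15, 11, 12, 10, 14, 13, 12, 12, 9, 9, 10,
            14, 13, 15, 13, 14, 13, 13, 14, 14, 15]
voice    = [36, 31, 36, 21, 30, 24, 20, 22, 19, 21, 27, 19, 19, 19, 17,
            25, 16, 62, 24, 29, 20, 31, 20, 19, 22]
web      = [21, 22, 24, 36, 25, 18, 17, 21, 35, 21, 23, 16, 16, 25, 20,
            17, 45, 51, 34, 48, 19, 29, 18, 33, 24]

# Voice vs. Web App: |t| well under ~2, consistent with a non-significant p-value.
print(round(welch_t(voice, web), 2))
# Joystick vs. voice: |t| far above ~2, consistent with a significant p-value.
print(round(welch_t(joystick, voice), 2))
```

With roughly 2 x 25 samples, |t| values below about 2 correspond to p-values above 0.05 and values well above 2 to p-values below 0.05, which matches the conclusions drawn above.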

Conclusions and Future Work

Strengths of design solution

The key strengths of our design solution are its robustness and its versatility across a wide range of operating conditions.

First of all, it is able to pick up a variety of objects, including voluminous objects like a box, flat objects like a credit card, and small objects like a screw. Even if an object is larger than the pan and cannot be fully enclosed, it can still be picked up by the module.

Secondly, our module does not require much positioning accuracy to pick up an object successfully. Unlike a robotic arm or grabber, which must be positioned near the object within a small margin of error, our module only needs to get close to the object within a relatively large margin of error.

Additionally, the pan can be powered by just one servo motor, so it requires very little
actuation force to operate. It also operates with a simple mechanical motion with just two
moving parts, reducing the likelihood of encountering mechanical failures.

Furthermore, with the added ramps on both sides of the pan, our module can avoid
the problem of a small object being stuck on one side of the pan when it is supposed to be
dropped off. Our original solution to this potential problem was a newly designed hinged pan,
whose bottom surface could open and drop down to a certain degree to ensure the proper
release of the object. However, due to the relatively complicated design and additional
moving parts, that design was prone to failures, and improving it would take more time and
iterations. Therefore, we adopted a much simpler design, which is the aforementioned
ramps, and it is just as effective and much more robust. Our module can now properly
release any object without the risk of it getting stuck.

Limitations of design solution

While we are very happy with what we accomplished for our design, there are still
limitations in our design solution that could potentially be resolved with further refinement or
focused redesign.

One limitation was the precision of dropoff. Our solution can pick up objects with a large margin for error and prevents objects from getting stuck in the pan during dropoff, but it cannot place an object at a precise dropoff location. This can be important in applications where the object is not flat and has a tendency to roll.

Another limitation of the object pickup module was its load capacity. The module can carry 1.5 kg, but ideally it would pick up items up to twice as heavy (full water bottles, books, etc.). This limitation can be addressed with stronger building materials and would naturally improve in future iterations as production is approached.

A further limitation was the precision of control of the Cartbot. The control process was not perfectly smooth, and even after adjusting to the Cartbot controller, precise maneuvers remain difficult. In general this is not an issue, since the imprecision is within tolerance, but it does lengthen operating time slightly.

Suggestions for revisions to the design solution

The main revision to our design solution would be to use more permanent, stronger materials so that the module is more stable and durable. This is an expected revision, given that this is a proof-of-concept product that would need refinement before going into production.

A nice addition to our design solution would be full remote control through a web interface hosted online and associated with a user account for each Cartbot. This could lead to a full user experience in which an app is paired with each robot, allowing customization and easy control.

A stretch goal that we did not scope into was incorporating some level of autonomy into the object picker module to assist the person controlling Cartbot. By utilizing the sensors on Cartbot, we could implement computer vision, motion planning, and other high-level robotic manipulation strategies to increase the assistance, accuracy, and speed of the module without adding to the cost. For example, as an object is approached, assistive controls could help the operator lock onto it.

APPENDIX

A. Needs and Requirements Matrix

Below is our final weighted decision matrix of requirements and needs; it remained largely unchanged through the end of the project. In each row, the first number is the need's weight and the remaining numbers are the scores for each candidate concept; the TOTAL row gives each concept's weighted sum.

Needs
Easy to use 5 10 3 7 8 8 7 1 8 10 4 5 7 7 10 2 10

Robustness 4 5 7 9 7 8 6 10 8 6 5 6 9 5 8 2 7

Budget 5 5 10 6 9 9 8 10 7 7 6 8 10 5 9 9 7

Dimension 3 2 7 3 10 6 10 1 0 5 3 8 9 10 7 2 6
Boundaries

Aesthetics 1 0 10 0 0 0 0 0 0 2 1 1 0 0 0 10 6

Voice 4 9 10 0 0 0 0 0 10 10 1 1 0 3 10 0 10
Controlled

Customizable 2 5 9 0 7 7 6 0 0 7 5 2 10 9 9 0 9

Pick up 4 7 5 7 7 8 8 0 0 0 2 8 5 9 9 0 8
variety of
objects

Raise object 5 7 6 9 9 10 10 0 0 0 2 9 8 8 9 0 7
to chest level

Not damage 5 9 7 5 0 7 3 3 0 0 9 8 4 4 9 0 3
the object

Pick up 3 9 7 6 10 5 7 2 0 0 6 4 7 9 9 0 3
objects in a
variety of
envs

Self 4 9 4 0 2 6 3 3 0 10 2 4 3 4 9 0 2
autonomy of
person

Interface with 5 6 5 0 0 0 0 0 7 10 3 2 1 0 9 0 3
Cartbot

TOTAL 348 329 226 268 305 271 131 182 270 328 277 286 279 440 79 309
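The TOTAL row is the weighted sum of each concept's column of scores, treating the first number in each need row as that need's weight. As a quick check, here is a short script reproducing the first two concept totals (values transcribed from the matrix above; the interpretation of the columns is ours, since the concept labels do not appear in this table):

```python
# Need weights, one per row of the matrix (Easy to use ... Interface with Cartbot).
weights = [5, 4, 5, 3, 1, 4, 2, 4, 5, 5, 3, 4, 5]

# Scores for the first two concept columns, one entry per need row.
concept_1 = [10, 5, 5, 2, 0, 9, 5, 7, 7, 9, 9, 9, 6]
concept_2 = [3, 7, 10, 7, 10, 10, 9, 5, 6, 7, 7, 4, 5]

def weighted_total(scores):
    """Weighted sum of one concept's scores against the need weights."""
    return sum(w * s for w, s in zip(weights, scores))

print(weighted_total(concept_1))  # 348, first entry of the TOTAL row
print(weighted_total(concept_2))  # 329, second entry of the TOTAL row
```

The remaining TOTAL entries follow the same pattern, one column of scores per concept.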
B. Bill of Materials

Item # | Description | Source | Part # | Quantity | Cost
#1 | ⅛” Acrylic Sheet | McMaster-Carr | 8560K257 | 1 ft x 2 ft (approx.) | $16.70
#2 | ¼” Acrylic Sheet | McMaster-Carr | 8560K359 | 0.5 ft x 1 ft (approx.) | $11.65
#3 | PLA Filament | Amazon | B00J0ECR5I | 1 kg | $19.99
#4 | Onyx Filament | Markforged | F-MF-0001 | 1 spool (need much less) | $189.00
#5 | Super Glue | Home Depot | 1647358 | 1 bottle | $3.97
#6 | TowerPro MG996R Metal Gear Digital High Torque Servo 55g | Amazon | B071F91PJV | 1 | $9.98
#7 | #6-32 x ½” Screw | McMaster-Carr | 91251A148 | 4 | $8.96/100
#8 | #6-32 Nut | McMaster-Carr | 90480A007 | 4 | $1.28/100
#9 | #10-24 x ½” Screw | McMaster-Carr | 91251A242 | 6 | $11.19/100
#10 | #4-40 x 1” Screw | McMaster-Carr | 90044A111 | 2 | $8.27/25
#11 | Rail Slide Kit | Servo City | 565050 | 1 | $122.99
#12 | 3/8" X-rail Gusset | Servo City | 585071 | 5 | $31.92
#13 | X-rail Nut | Servo City | 585756 | 5 | $24.95
#14 | 24” X-rail | Servo City | 565074 | 2 | $17.14
#15 | Raised Perpendicular X-rail Mount | Servo City | 585074 | 8 | $31.92
#16 | X-rail L-Bracket | Servo City | 585076 | 4 | $7.96
#17 | 36" X-Rail | Servo City | 585078 | 1 | $10.99

Note: Many basic items can be found from other sources for cheaper
Laser Cut Parts:

Item # Description Material Quantity

L1 Gearbox Top Plate ¼” Acrylic 1

L2 Gearbox Bottom Plate ¼” Acrylic 1

L3 Pan Bottom Plate ⅛” Acrylic 2

L4 Pan Side Plate ⅛” Acrylic 2

L5 Pan Front Plate ⅛” Acrylic 2

L6 Pan Slanted Plate ⅛” Acrylic 2

3D Printed Parts:

Item # Description Material Quantity

P1 Large Gear Onyx 2

P2 Small Gear Onyx 1

P3 Gear Top PLA 2

P4 Small Standoff PLA 2

P5 Extrusion Standoff PLA 1

P6 Pan Hub PLA 2


C. Assembly Procedure

1. Laser cut and 3D print all required parts. Parts are described in Appendix B.

2. Assemble Pan Halves

a. Using super glue, glue L3-L6 together. There should be 2 mirrored halves.

b. Glue the pan hubs (P6) into the holes in Pan Bottom Plates (L3).

3. Assemble Gearbox

a. Using super glue, glue P4 (x2) and P5 into the Gearbox Bottom Plate (L2) as
shown.

b. Bolt the servo (#6) onto the gearbox top plate (L1) with #6 screws and nuts.

c. Press the small gear (P2) onto the servo spline. The hole in the gear may
have to be expanded slightly to achieve a press fit.

d. Press the gear tops (P3) into the hole in the large gears (P1).

e. Place the large gear assemblies into the ½” holes in the gearbox bottom
plate.

f. Place gearbox top plate assembly onto gearbox bottom plate assembly,
ensuring gears mesh. Make sure the servo is calibrated such that it is at
position 0 when gears are in closed position. This may require powering the
servo to test.

g. Connect the 2 plates by screwing #10 screws through the top plate into the
standoffs in the bottom plate.

4. Attach pan halves to the gearbox. Slide the shafts of the large gears (P1) through the
pan hubs (P6). Secure by screwing a #4 screw into the hole in the pan hub through
the shaft. It is easiest to do this while the pan is in the open position.

D. Technical Drawings
E. User Guide

1. Connect a LiPo battery to Cartbot.

2. Connect both the Arduino and the Raspberry Pi to the power bank in the top bin of Cartbot.

3. Once the Raspberry Pi's screen displays the desktop, launch the Cartbot program on the desktop to start up both the picker module and Cartbot.

4. You can now control Cartbot and the picker module with the joystick, voice control, or web app.

REFERENCES
[1] Kentaro Barhydt, Alphonsus Adu-Bredu, Sarah Everhart-Skeels, Gary Bedell, Karen Panetta, and William Messner. 2019. Cartbot: An Assistive Mobile Manipulator Robot with a Minimal Degree of Freedom Design Approach. 2020 Human Robot Interaction Conference, Cambridge, UK. [In review]

[2] Julie Dumora, Franck Geffard, Catherine Bidard, Nikos A. Aspragathos, and Philippe Fraisse. 2013. Robot Assistance Selection for Large Object Manipulation with a Human. 2013 IEEE International Conference on Systems, Man, and Cybernetics (2013), 1828–1833.

[3] National Spinal Cord Injury Statistical Center. Facts and Figures at a Glance. Birmingham, AL: University of Alabama at Birmingham, 2019.

[4] French, Dustin D., et al. "Health care costs for patients with chronic spinal cord injury in the Veterans Health Administration." The Journal of Spinal Cord Medicine, vol. 30, no. 5 (2007): 477–481. doi:10.1080/10790268.2007.11754581

[5] Dining Room Table Heights. Online. May 2017. URL: https://www.furniture.com/tips-and-trends/dining-room-table-heights

[6] Yi, S. (2018). Software Framework for an Intelligent Mobile Manipulation Robot. 2018 International Conference on Information and Communication Technology Robotics (ICT-ROBOT).

[7] Cascading X-rail Slide Kit. URL: https://www.servocity.com/cascading-x-rail-slide-kit

[8] Google Cloud Speech-to-Text. URL: https://cloud.google.com/speech-to-text/

[9] Rotary Encoder. URL: https://howtomechatronics.com/tutorials/arduino/rotary-encoder-works-use-arduino/

[10] URL: https://www.sparkfun.com/products/242

[11] URL: https://www.alliedelec.com/search/productview.aspx?SKU=70175373
