
Some interesting questions.

Robotic Systems (14)

How do ants forage? How do birds flock? How do we drive a car round a track without a driver? Can we use a bee's vision system to land a helicopter?

Current Perception Paradigm


[Plot: information extracted from images vs. time]


Advanced robotics needs to extract as much information as possible from images: perception, additional position information, and motion.

[Diagram: vehicle, moving object, static object, smooth surfaces]

Height from a 2D image
Distance gives height

Motion from images
Image differencing: simple, but limited
Optical flow: more complex, biologically inspired

Image differencing

Typical results
Take background
Take image
Find difference
Isolate largest object
Locate centroid

[Figure: background image, current image, difference, and plot of the isolated object]
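The differencing steps can be sketched in a few lines. This is a minimal single-object version: for simplicity it takes the centroid of all changed pixels rather than isolating the largest connected object, and the function name and threshold are illustrative, not from the lecture.

```python
import numpy as np

# Minimal sketch of the image-differencing pipeline: subtract the
# background, threshold the difference, locate the centroid.
def detect_object(background, image, threshold=30):
    diff = np.abs(image.astype(int) - background.astype(int))
    ys, xs = np.nonzero(diff > threshold)     # pixels that changed
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()               # centroid (x, y)

# A 5x5 bright object appears in an otherwise static scene
background = np.zeros((100, 100), dtype=np.uint8)
image = background.copy()
image[40:45, 60:65] = 255
print(detect_object(background, image))       # (62.0, 42.0)
```

Isolating the largest object would normally need a connected-components pass before the centroid step; the simple version above is enough when only one object moves.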

Optical Flow

Optical flow is the distribution of apparent velocities of movement of brightness patterns in an image. It can arise from the relative motion of objects and the viewer, and is calculated from the motion between two image frames.

Near objects appear to move faster; distant objects appear to move slowly.

Optical Flow
I(x, y, t) = I(x + Δx, y + Δy, t + Δt)

Expanding to first order:
I(x + Δx, y + Δy, t + Δt) ≈ I(x, y, t) + (∂I/∂x)Δx + (∂I/∂y)Δy + (∂I/∂t)Δt

Dividing by Δt and taking the limit:
0 = (∂I/∂x)Vx + (∂I/∂y)Vy + ∂I/∂t


Note: this equation cannot be solved unless other constraints are imposed on the problem; this is the aperture problem.
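One standard way to impose the missing constraint is the Lucas-Kanade assumption that the flow is constant over a small patch, which turns the single under-determined equation into an over-determined least-squares system. A minimal sketch on a synthetic patch; the function name and test pattern are my own, not from the lecture.

```python
import numpy as np

def lucas_kanade(I1, I2):
    """Estimate one (Vx, Vy) for a whole patch by assuming constant flow
    and solving Ix*Vx + Iy*Vy = -It in the least-squares sense."""
    Iy, Ix = np.gradient(I1)          # spatial derivatives (rows = y, cols = x)
    It = I2 - I1                      # temporal derivative between frames
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v                          # array([Vx, Vy])

# Synthetic patch shifted one pixel to the right between frames
x = np.arange(32)
I1 = np.sin(0.3 * x)[None, :] * np.cos(0.3 * x)[:, None]
I2 = np.roll(I1, 1, axis=1)
print(lucas_kanade(I1, I2))           # roughly [1, 0]
```

Stacking many pixels gives two unknowns and hundreds of equations, so the aperture ambiguity disappears wherever the patch has gradients in more than one direction.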


Direct Perception
Any motion is a combination of six basic movements: three translations and three rotations. The flow field for a movement is the sum of each movement's component. It is relatively easy to decompose a complex motion into the six basic movements. Egomotion: estimating a camera's motion relative to rigidly placed objects in a scene.

Take one step forward. Arrows indicate the direction and magnitude of the motion.

http://people.csail.mit.edu/lpk/mars/temizer_2001/Optical_Flow/

Bees and optical flow


How Bees Exploit Optic Flow: Behavioural Experiments and Neural Models [and Discussion], Mandyam V. Srinivasan and R. L. Gregory, Philosophical Transactions: Biological Sciences.


The landing problem


Speed = V, distance = D
Time to contact = D / V

However, D and V are difficult to calculate in real time: there is no time for slow binocular vision, and no time for stopping and peering.

During approach the image grows and the speed of its edges increases, so the same time can be read directly from the image:
Time to contact = image size (degrees) / velocity of edges (degrees per second)
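The point of the ratio is that time to contact needs neither D nor V individually. A toy numerical check, with illustrative values and a small-angle approximation:

```python
# Time to contact from the image alone: T = (image size) / (rate of growth).
# Neither the distance D nor the speed V is needed on its own.
def time_to_contact(size_deg, growth_deg_per_s):
    return size_deg / growth_deg_per_s

# Check against T = D / V for a 1 m object, 50 m away, closing at 10 m/s
W, D, V = 1.0, 50.0, 10.0
theta = W / D                  # angular size (radians, small-angle)
theta_rate = W * V / D ** 2    # rate of angular growth
print(round(time_to_contact(theta, theta_rate), 6), D / V)   # 5.0 5.0
```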

Diving Gannets

Optical flow from an aerial platform

Autonomous Landing for Indoor Flying Robots Using Optic Flow, Green et al, 2003

Landing Control

Landing rules:
Keep the descent angle constant: Vd(t) / Vf(t) = tan θ = B
Hold the image angular velocity constant: ω = Vf(t) / h(t)

Flight path:
The descent speed is the rate of height loss, so dh(t)/dt = -Vd(t) = -B Vf(t) = -B ω h(t)
Hence h(t) = h(t0) e^(-B ω (t - t0))
and Vf(t) = ω h(t0) e^(-B ω (t - t0))
Height and forward speed decay exponentially together, both approaching zero at touchdown.
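A quick simulation of the two landing rules, where the held flow ω = Vf/h and the descent-angle gain B = Vd/Vf are arbitrary illustrative values, not taken from the cited paper:

```python
import math

omega = 0.5    # image angular velocity held constant, Vf/h (1/s); assumed value
B = 0.3        # tan of the constant descent angle, Vd/Vf; assumed value

def height(t, h0=10.0):
    """h(t) = h0 * exp(-B*omega*t), from dh/dt = -Vd = -B*omega*h."""
    return h0 * math.exp(-B * omega * t)

def forward_speed(t, h0=10.0):
    """Vf(t) = omega * h(t): forward speed decays along with height."""
    return omega * height(t, h0)

for t in (0, 5, 10, 20):
    print(f"t={t:2d}s  h={height(t):6.3f} m  Vf={forward_speed(t):6.3f} m/s")
```

Both quantities shrink by the same factor each second, which is why a controller holding only two image-derived ratios produces a smooth touchdown.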

Typical results

Experimental System

Optic flow-based vision system for autonomous 3D localization and control of small aerial vehicles, Kendoul, Fantoni, and Nonami, 2009


Typical application

Concept for a plane to fly within the Martian atmosphere

Translation vs Rotation
In translation, close objects move faster. In rotation, objects at all distances move at the same angular speed.
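A small numeric illustration of the difference, using a pinhole-camera small-angle approximation; the depths and rates are arbitrary:

```python
import numpy as np

depths = np.array([1.0, 5.0, 50.0])          # object distances (m)

Tx = 1.0                                     # sideways translation (m/s)
translational_flow = Tx / depths             # angular speed ~ V/Z: depends on depth
print(translational_flow)                    # near objects move fastest

wy = 0.1                                     # pure yaw rotation (rad/s)
rotational_flow = np.full_like(depths, wy)   # identical at every depth
print(rotational_flow)                       # no depth information at all
```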

Biological viewpoints
Saccades: sudden movements of the eyes or head. Crabs can move their eyes; insects move their heads.

Translatory motion can reveal 3D structure, but rotary motion cannot: motion parallax.

Fiddler Crab

Eye movements
Increase field of view
Stabilise the image
Estimate distances
Track objects
Scan objects
Found in both biological systems and robotic systems.

Obstacle Avoidance

Objective
Fly autonomously between two poles 10 m apart. The rotor span of the helicopter is 3.4 m.

Test vehicle

Lidar scans (170 m)

Evidence Grid

Obstacle avoidance
Uses the principle of attraction and repulsion:
The helicopter is attracted to the objective
The helicopter is repulsed from an obstacle
It flies along the resultant valley
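The attraction/repulsion rule can be sketched as a simple potential-field sum; the gains and the inverse-square repulsion law here are assumptions for illustration, not the controller from the paper.

```python
import numpy as np

def steering_force(pos, goal, obstacles, k_att=1.0, k_rep=4.0):
    """Sum a unit attractive pull toward the goal and a push away from
    each obstacle; the vehicle flies along the resultant."""
    pos, goal = np.asarray(pos, float), np.asarray(goal, float)
    force = k_att * (goal - pos) / np.linalg.norm(goal - pos)  # unit pull to goal
    for obs in obstacles:
        away = pos - np.asarray(obs, float)
        d = np.linalg.norm(away)
        force += k_rep * away / d ** 3      # push falls off as 1/d^2
    return force

f = steering_force(pos=(0, 0), goal=(10, 0), obstacles=[(2, 1)])
print(f)   # positive x (toward the goal), negative y (away from the obstacle)
```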

Build a 3D image of the environment; because occupancy is treated probabilistically, transient images are filtered out.

Flying Fast and Low Among Obstacles, Sebastian Scherer, Sanjiv Singh, Lyle Chamberlain and Srikanth Saripalli, IEEE ICRA, 2007
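The probabilistic filtering of transients can be sketched with a one-cell evidence grid holding a log-odds occupancy value; the update increments here are illustrative, not taken from the paper.

```python
import math

# Each grid cell stores log-odds of occupancy; repeated lidar hits push it
# up, misses push it down, so one-off transient returns never accumulate.
HIT, MISS = 0.9, -0.4   # assumed update increments

def update(logodds, observed_hit):
    return logodds + (HIT if observed_hit else MISS)

def probability(logodds):
    return 1.0 / (1.0 + math.exp(-logodds))

cell = 0.0                      # prior: p = 0.5
for obs in [True, True, True]:  # a persistent obstacle
    cell = update(cell, obs)
print(round(probability(cell), 3))        # ~0.937: confidently occupied

cell = update(update(0.0, True), False)   # a transient return, then a miss
print(round(probability(cell), 3))        # ~0.622: weak evidence only
```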

Example

Reported results

Final thoughts
Commercial low-cost systems:
Vacuum cleaners
Lawn mowers
Perimeter boundary control: electric fence, physical obstacles

Agriculture
Reduce labour
Smaller tractors
More hours of usage
Cabs not required
Increased safety
Fixed paths, GPS based

True autonomous Driving

Sensors and computers


GPS (four separate systems)
Five laser range finders looking at the terrain out to 25 m
Colour camera for long-range perception of the road
Two radar units out to 200 m

Reference paper
Stanley: The Robot that Won the DARPA Grand Challenge, Journal of Field Robotics 23(9), 661-692 (2006)

Six Pentium processors + gigabit Ethernet input/output to the vehicle


Treat autonomous navigation as a software problem.

Architecture

Deliberative and Reactive Control


Deliberative (Sense, Model, Plan, Act): modify the world, create maps, discover new areas
Reactive: avoid collisions, move around

Processed camera image

Module summary
Conventional robotics
Sensors
Biologically inspired systems
