Smart Home Technologies: Automation and Robotics

Motivation
- Intelligent environments are aimed at improving the inhabitants' experience and task performance
  - Automate functions in the home
  - Provide services to the inhabitants
- Decisions coming from the decision maker(s) in the environment have to be executed
- Decisions require actions to be performed on devices
  - Decisions are frequently not elementary device interactions but rather relatively complex commands
  - Decisions define set points or results that have to be achieved
  - Decisions can require entire tasks to be performed

Automation and Robotics in Intelligent Environments
- Control of the physical environment
  - Automated blinds
  - Thermostats and heating ducts
  - Automatic doors
  - Automatic room partitioning
- Personal service robots
  - House cleaning
  - Lawn mowing
  - Assistance to the elderly and handicapped
  - Office assistants
  - Security services

Robots
- Robota (Czech) = a worker of forced labor
  - From Czech playwright Karel Capek's 1921 play "R.U.R." ("Rossum's Universal Robots")
- Japanese Industrial Robot Association (JIRA): "A device with degrees of freedom that can be controlled."
  - Class 1: Manual handling device
  - Class 2: Fixed sequence robot
  - Class 3: Variable sequence robot
  - Class 4: Playback robot
  - Class 5: Numerical control robot
  - Class 6: Intelligent robot

A Brief History of Robotics
- Mechanical automata
  - Ancient Greece and Egypt: water powered, for ceremonies
  - 14th to 19th century Europe: clockwork driven, for entertainment
- Motor-driven robots
  - 1928: first motor-driven automata
  - 1961: Unimate, the first industrial robot
  - 1967: Shakey, an autonomous mobile research robot
  - 1969: Stanford Arm, a dexterous, electric-motor-driven robot arm

(Figures: Maillardet's Automaton; Unimate)

Robots
- Robot manipulators
- Mobile robots

Robots
- Walking robots
- Humanoid robots

Autonomous Robots
- The control of autonomous robots involves a number of subtasks:
  - Understanding and modeling of the mechanism: kinematics, dynamics, and odometry
  - Reliable control of the actuators: closed-loop control
  - Generation of task-specific motions: path planning
  - Integration of sensors: selection and
interfacing of various types of sensors
  - Coping with noise and uncertainty: filtering of sensor noise and actuator uncertainty
  - Creation of flexible control policies: control has to deal with new situations

Traditional Industrial Robots
- Traditional industrial robot control uses robot arms and largely pre-computed motions
  - Programming using a "teach box"
  - Repetitive tasks
  - High speed
  - Few sensing operations
  - High-precision movements
  - Pre-planned trajectories and task policies
  - No interaction with humans

Problems
- Traditional programming techniques for industrial robots lack key capabilities necessary in intelligent environments
  - Only limited on-line sensing
  - No incorporation of uncertainty
  - No interaction with humans
  - Reliance on perfect task information
  - Complete re-programming for new tasks

Requirements for Robots in Intelligent Environments
- Autonomy
  - Robots have to be capable of achieving task objectives without human input
  - Robots have to be able to make and execute their own decisions based on sensor information
- Intuitive human-robot interfaces
  - Use of robots in smart homes cannot require extensive user training
  - Commands to robots should be natural for inhabitants
- Adaptation
  - Robots have to be able to adjust to changes in the environment

Robots for Intelligent Environments
- Service robots
  - Security guard
  - Delivery
  - Cleaning
  - Mowing
- Assistance robots
  - Mobility
  - Services for the elderly and people with disabilities

Autonomous Robot Control
- To control robots to perform tasks autonomously, a number of tasks have to be addressed:
  - Modeling of robot mechanisms: kinematics, dynamics
  - Robot sensor selection: active and passive proximity sensors
  - Low-level control of actuators: closed-loop control
  - Control architectures: traditional planning architectures, behavior-based control architectures, hybrid architectures

Modeling the Robot Mechanism
- Forward kinematics describes how the robot's joint angle configurations translate to locations in the world
- Inverse kinematics computes the joint angle configuration necessary to reach a particular point in space
- Jacobians calculate how the speed and configuration of the actuators translate into the velocity of the robot

Mobile Robot Odometry
- In mobile robots the same configuration in terms of joint angles does not identify a unique location
- To keep track of the robot it is necessary to incrementally update the location (this process is called odometry or dead reckoning)
- Example: a differential drive robot (Figure: robot with left (L) and right (R) wheels)

Actuator Control
- To get a particular robot actuator to a particular location it is important to apply the correct amount of force or torque to it
- This requires knowledge of the dynamics of the robot: mass, inertia, friction
- For a simplistic mobile robot: F = m a + B v
- Frequently actuators are treated as if they were independent (i.e. as if moving one joint would not affect any of the other joints)
- The most common control approach is PD control (proportional, differential control)
- For the simplistic mobile robot moving in the x direction: F = K_p (x_desired - x) + K_d (v_desired - v)

Robot Navigation
- Path planning addresses the task of computing a trajectory for the robot such that it reaches the desired goal without colliding with obstacles
- Optimal paths are hard to compute, in particular for robots that cannot move in arbitrary directions (i.e. nonholonomic robots)
- Shortest-distance paths can be dangerous since they always graze obstacles
- Paths for robot arms have to take into account the entire robot (not only the end effector)

Sensor-Driven Robot Control
- To accurately achieve a task in an intelligent environment, a robot has to be able to react dynamically to changes in its surroundings
- Robots need sensors to perceive the environment
- Most robots use a set of different sensors: different sensors serve different purposes
- Information from sensors has to be integrated into the control of the robot

Robot Sensors
- Internal sensors measure the robot configuration
  - Encoders measure the rotation angle of a joint
  - Limit switches detect when the joint has reached its limit

Robot Sensors
- Proximity sensors are used to measure the distance or location of objects in
the environment. This can then be used to determine the location of the robot.
  - Infrared sensors determine the distance to an object by measuring the amount of infrared light the object reflects back to the robot
  - Ultrasonic sensors (sonars) measure the time that an ultrasonic signal takes until it returns to the robot
  - Laser range finders determine distance by measuring either the time it takes for a laser beam to be reflected back to the robot or by measuring where the laser hits the object

Robot Sensors
- Computer vision provides robots with the capability to passively observe the environment
  - Stereo vision systems provide complete location information using triangulation
  - However, computer vision is very complex
  - The correspondence problem makes stereo vision even more difficult

Uncertainty in Robot Systems
- Robot systems in intelligent environments have to deal with sensor noise and uncertainty
- Sensor uncertainty: sensor readings are imprecise and unreliable
- Non-observability: various aspects of the environment cannot be observed; the environment is initially unknown
- Action uncertainty: actions can fail; actions have nondeterministic outcomes

Probabilistic Robot Localization
- Explicit reasoning about uncertainty using Bayes filters
- Used for: localization, mapping, model building

Deliberative Robot Control Architectures
- In a deliberative control architecture the robot first plans a solution for the task by reasoning about the outcome of its actions, and then executes it
- The control process goes through a sequence of sensing, model update, and planning steps

Deliberative Control Architectures
- Advantages
  - Reasons about contingencies
  - Computes solutions to the given task
  - Goal-directed strategies
- Problems
  - Solutions tend to be fragile in the presence of uncertainty
  - Requires frequent replanning
  - Reacts relatively slowly to changes and unexpected occurrences

Behavior-Based Robot Control Architectures
- In a behavior-based control architecture the robot's actions are determined by a set of parallel, reactive behaviors which
map sensory input and state to actions.

Behavior-Based Robot Control Architectures
- Reactive, behavior-based control combines relatively simple behaviors, each of which achieves a particular subtask, to achieve the overall task
  - The robot can react fast to changes
  - The system does not depend on complete knowledge of the environment
  - Emergent behavior (resulting from combining the initial behaviors) can make it difficult to predict the exact behavior
  - It is difficult to assure that the overall task is achieved

Complex Behavior from Simple Elements: Braitenberg Vehicles
- Complex behavior can be achieved using very simple control mechanisms
- Braitenberg vehicles: differential drive mobile robots with two light sensors
  - Example vehicles: "Coward", "Aggressive", "Love", "Explore"
- Complex external behavior does not necessarily require a complex reasoning mechanism

Behavior-Based Architectures: Subsumption Example
- The subsumption architecture is one of the earliest behavior-based architectures
- Behaviors are arranged in a strict priority order where higher priority behaviors subsume lower priority ones as long as they are not inhibited

Subsumption Example
- A variety of tasks can be robustly performed from a small number of behavioral elements
- MIT AI Lab, http://www-robotics.usc.edu/maja/robot-video.mpg

Reactive, Behavior-Based Control Architectures
- Advantages
  - Reacts fast to changes
  - Does not rely on accurate models: "The world is its own best model"
  - No need for replanning
- Problems
  - Difficult to anticipate what effect combinations of behaviors will have
  - Difficult to construct strategies that will achieve complex, novel tasks
  - Requires redesign of the control system for new tasks

Hybrid Control Architectures
- Hybrid architectures combine reactive control with abstract task planning
- Abstract task planning layer
  - Deliberative decisions
  - Plans goal-directed policies
- Reactive behavior layer
  - Provides reactive actions
  - Handles sensors and actuators

Hybrid Control Policies
- Example task: changing a light bulb
- (Figure: task plan with its corresponding behavioral strategy)

Hybrid Control
Architectures
- Advantages
  - Permits goal-based strategies
  - Ensures fast reactions to unexpected changes
  - Reduces the complexity of planning
- Problems
  - The choice of behaviors limits the range of possible tasks
  - Behavior interactions have to be well modeled to be able to form plans

Traditional Human-Robot Interface: Teleoperation
- Remote teleoperation: direct operation of the robot by the user
  - The user uses a 3-D joystick or an exoskeleton to drive the robot
  - Simple to install
  - Removes the user from dangerous areas
- Problems:
  - Requires insight into the mechanism
  - Can be exhausting
  - Easily leads to operation errors

Human-Robot Interaction in Intelligent Environments
- A personal service robot is controlled and used by untrained users
  - Intuitive, easy-to-use interface
  - The interface has to "filter" user input: eliminate dangerous instructions, find the closest possible action
- The robot receives only intermittent commands
  - The robot requires autonomous capabilities
  - User commands can be at various levels of complexity
  - The control system merges instructions and autonomous operation
- The robot interacts with a variety of humans
  - Humans have to feel "comfortable" around robots
  - Robots have to communicate intentions in a natural way

Example: Minerva the Tour Guide Robot (CMU/Bonn)
- CMU Robotics Institute, http://www.cs.cmu.edu/thrun/movies/minerva.mpg

Intuitive Robot Interfaces: Command Input
- Graphical programming interfaces
  - Users construct policies from elemental blocks
  - Problem: requires substantial understanding of the robot
- Deictic (pointing) interfaces
  - Humans point at desired targets in the world, or targets are specified on a computer screen
  - Problem: how to interpret human gestures?
- Voice recognition
  - Humans instruct the robot verbally
  - Problems: speech recognition is very difficult; the robot actions corresponding to words have to be defined

Intuitive Robot Interfaces: Robot-Human Interaction
- The robot has to be able to communicate its intentions to the human
  - Output has to be easy for humans to understand
  - The robot has to be able to encode its intention
  - The interface has to keep the human's attention without annoying them
- Robot
communication devices:
  - Easy-to-understand computer screens
  - Speech synthesis
  - Robot "gestures"

Example: The Nursebot Project
- CMU Robotics Institute, http://www.cs.cmu.edu/thrun/movies/pearl_assist.mpg

Human-Robot Interfaces
- Existing technologies
  - Simple voice recognition and speech synthesis
  - Gesture recognition systems
  - On-screen, text-based interaction
- Research challenges
  - How to convey robot intentions?
  - How to infer user intent from visual observation (how can a robot imitate a human)?
  - How to keep the attention of a human on the robot?
  - How to integrate human input with autonomous operation?

Integration of Commands and Autonomous Operation
- Adjustable autonomy: the robot can operate at varying levels of autonomy
- Operational modes:
  - Autonomous operation
  - User operation / teleoperation
  - Behavioral programming
  - Following user instructions
  - Imitation
- Types of user commands:
  - Continuous, low-level instructions (teleoperation)
  - Goal specifications
  - Task demonstrations

Example System

Social Robot Interactions
- To make robots acceptable to average users they should appear and behave "naturally"
- Attentional robots
  - The robot focuses on the user or the task
  - Attention forms the first step to imitation
- Emotional robots
  - The robot exhibits "emotional" responses
  - The robot follows human social norms for behavior
  - Better acceptance by the user (users are more forgiving)
  - Human-machine interaction appears more "natural"
  - The robot can influence how the human reacts

Social Robot Example: Kismet
- MIT AI Lab, http://www.ai.mit.edu/projects/cog/Video/kismet/kismet_face_30fps.mpg

Social Robot Interactions
- Advantages:
  - Robots that look human and show "emotions" can make interactions more "natural"
  - Humans tend to focus more attention on people than on objects
  - Humans tend to be more forgiving of a mistake if the robot looks "human"
  - Robots showing "emotions" can modify the way in which humans interact with them
- Problems:
  - How can robots determine the right emotion?
  - How can "emotions" be expressed by a robot?

Human-Robot Interfaces for Intelligent Environments
- Robot interfaces have
to be easy to use
  - Robots have to be controllable by untrained users
  - Robots have to be able to interact not only with their owner but also with other people
- Robot interfaces have to be usable at the human's discretion
  - Human-robot interaction occurs on an irregular basis
  - Frequently the robot has to operate autonomously
  - Whenever user input is provided the robot has to react to it
- Interfaces have to be designed human-centric
  - The role of the robot is to make the human's life easier and more comfortable (it is not just a tech toy)

Adaptation and Learning for Robots in Smart Homes
- Intelligent environments are non-stationary and change frequently, requiring robots to adapt
  - Adaptation to changes in the environment
  - Learning to address changes in inhabitant preferences
- Robots in intelligent environments can frequently not be pre-programmed
  - The environment is unknown
  - The list of tasks that the robot should perform might not be known beforehand
  - No proliferation of robots in the home
  - Different users have different preferences
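The dead-reckoning update for the differential drive robot on the Mobile Robot Odometry slide can be sketched in a few lines. This is a minimal illustrative sketch, not from the slides: the function name, parameter names, and the simple Euler integration step are assumptions.

```python
import math

def odometry_update(x, y, theta, v_left, v_right, wheelbase, dt):
    """One dead-reckoning step for a differential drive robot.

    v_left / v_right are the ground speeds of the two wheels and
    wheelbase is the distance between them.
    """
    v = (v_right + v_left) / 2.0            # forward speed of the robot center
    omega = (v_right - v_left) / wheelbase  # rotation rate around the center
    x += v * math.cos(theta) * dt           # integrate pose incrementally
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```

Equal wheel speeds drive the robot straight ahead; opposite speeds turn it in place, which is why joint (wheel) angles alone cannot identify the robot's location and the pose must be tracked incrementally.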
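The Actuator Control slide's simplistic model F = m a + B v together with the PD law can be simulated directly. The gains, time step, and Euler integration below are illustrative choices, not values from the slides; the sketch only shows that the PD-controlled robot settles at the set point.

```python
def pd_step(x, v, x_desired, kp, kd, mass, damping, dt):
    """One Euler step of PD position control for the model F = m*a + B*v."""
    force = kp * (x_desired - x) + kd * (0.0 - v)  # PD law, desired velocity 0
    a = (force - damping * v) / mass               # solve F = m*a + B*v for a
    v += a * dt
    x += v * dt
    return x, v

# drive the robot from x = 0 toward the set point x = 1
x, v = 0.0, 0.0
for _ in range(2000):                              # 20 s of simulated time
    x, v = pd_step(x, v, 1.0, kp=4.0, kd=2.0, mass=1.0, damping=0.5, dt=0.01)
```

The proportional term pulls the robot toward the set point while the differential term damps the motion; too little damping gives overshoot, too much gives a sluggish response.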
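The Bayes filter mentioned on the Probabilistic Robot Localization slide alternates a motion (prediction) update with a sensor (measurement) update over a belief distribution. The toy world below, a circular corridor of cells that are either a door or a wall, and all the probabilities are illustrative assumptions used only to show the two update rules.

```python
def normalize(p):
    s = sum(p)
    return [v / s for v in p]

def predict(belief, p_move=0.8):
    """Motion update: the robot tries to move one cell right; with
    probability 1 - p_move it stays put (action uncertainty).
    The corridor wraps around for simplicity."""
    n = len(belief)
    new = [0.0] * n
    for i in range(n):
        new[(i + 1) % n] += belief[i] * p_move
        new[i] += belief[i] * (1 - p_move)
    return new

def update(belief, world, measurement, p_hit=0.9, p_miss=0.2):
    """Sensor update: weight each cell by how well the map explains the
    (noisy) reading, then renormalize."""
    new = [b * (p_hit if world[i] == measurement else p_miss)
           for i, b in enumerate(belief)]
    return normalize(new)
```

Starting from a uniform belief (the environment is initially unknown), repeated predict/update cycles concentrate probability mass on the cells consistent with the sensor history, which is how the filter copes with both sensor and action uncertainty.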
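The Braitenberg vehicles slide can be made concrete with the classic sensor-to-wheel wiring: "Coward" connects each light sensor to the wheel on the same side (so the robot speeds up and turns away from light), while "Aggressive" crosses the connections (so it turns toward and charges the light). The function below is a sketch of that wiring; the direct proportional mapping and the names are illustrative.

```python
def braitenberg(left_light, right_light, vehicle="coward"):
    """Map two light-sensor readings to (v_left, v_right) wheel speeds."""
    if vehicle == "coward":
        # same-side excitatory wiring: brighter side drives its own wheel
        # faster, steering the robot away from the light
        return left_light, right_light
    if vehicle == "aggressive":
        # crossed excitatory wiring: brighter side drives the opposite
        # wheel faster, steering the robot toward the light
        return right_light, left_light
    raise ValueError("unknown vehicle: %s" % vehicle)
```

There is no planner and no world model here, which is exactly the slide's point: complex-looking external behavior (fleeing, attacking) emerges from a trivially simple control mechanism.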
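The priority ordering at the heart of the subsumption architecture can be sketched as a simple arbitration loop: the highest-priority behavior whose trigger condition holds gets to act, subsuming everything below it. This is a minimal sketch; a full subsumption implementation also lets layers inhibit and override each other's signal lines, and the behavior names below are invented for illustration.

```python
def subsumption_step(behaviors, sensors):
    """Return the action of the highest-priority applicable behavior.

    `behaviors` is ordered highest priority first; each entry is a pair
    (applicable_predicate, action_fn) operating on the sensor dict.
    """
    for applicable, action in behaviors:
        if applicable(sensors):
            return action(sensors)
    return "idle"

# illustrative layers for a wandering robot, highest priority first
behaviors = [
    (lambda s: s["obstacle"],     lambda s: "avoid"),    # safety layer
    (lambda s: s["goal_visible"], lambda s: "approach"), # task layer
    (lambda s: True,              lambda s: "wander"),   # default layer
]
```

Because each layer is simple and reacts directly to sensors, the robot responds fast; the difficulty the slides note, predicting what the combination of layers will do on a novel task, is visible even here: the task layer only ever runs when the safety layer is quiet.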