== Teleop Control ==
[[Image(Driver_Station.jpg,25%,nolink,right,margin=20)]] In many ways, a robot in teleop mode is like a giant, sophisticated remote-control car. When an FRC robot is in Teleop mode, a human driver stands at a Driver Station (see picture at right) and uses joysticks or gamepads connected to a laptop running special software to control the robot.
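To make this concrete, here is a minimal sketch of what the robot-side teleop code can look like. FRC robots are commonly programmed in Java with the WPILib library; the sketch below assumes WPILib-style classes (`TimedRobot`, `Joystick`, `DifferentialDrive`, `PWMSparkMax`), and the motor channels, joystick port, and class names are illustrative assumptions rather than this team's actual code. The Driver Station software streams the joystick values to the robot over the network; the robot code simply maps them to motor power many times per second.

{{{#!java
// Minimal teleop "arcade drive" sketch (assumed WPILib-style Java).
// PWM channels 0/1 and joystick port 0 are made-up example values.
import edu.wpi.first.wpilibj.Joystick;
import edu.wpi.first.wpilibj.TimedRobot;
import edu.wpi.first.wpilibj.drive.DifferentialDrive;
import edu.wpi.first.wpilibj.motorcontrol.PWMSparkMax;

public class TeleopSketch extends TimedRobot {
  private final PWMSparkMax leftMotor = new PWMSparkMax(0);
  private final PWMSparkMax rightMotor = new PWMSparkMax(1);
  private final DifferentialDrive drive = new DifferentialDrive(leftMotor, rightMotor);
  private final Joystick stick = new Joystick(0); // joystick plugged into the Driver Station laptop

  @Override
  public void robotInit() {
    // The two sides of the drivetrain face opposite directions, so invert one.
    rightMotor.setInverted(true);
  }

  @Override
  public void teleopPeriodic() {
    // Called roughly 50 times per second while the robot is in Teleop mode:
    // stick forward/back sets speed (Y is negative when pushed forward),
    // stick left/right sets the turn rate.
    drive.arcadeDrive(-stick.getY(), stick.getX());
  }
}
}}}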
At the start of each competition, the robot is placed in "autonomous mode" where it operates independently to perform a sequence of required tasks. The autonomous operation is controlled by the robot's [http://www.ni.com/en-us/support/model.roborio.html on-board computer] running software that monitors the robot's environment using sensors and interacts with it using ''actuators'' (e.g. motors) driving mechanisms (arms, levers, etc.). To accomplish tasks, the software must determine the robot's position and orientation, locate the objects it must interact with, move to them, and manipulate them with its actuators, all without human intervention.
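To contrast this with teleop, here is a minimal sketch of a very simple autonomous routine in the same assumed WPILib-style Java: it drives forward for two seconds and then stops, with no human input at all. The two-second duration, 40% power, and channel numbers are made-up example values.

{{{#!java
// Minimal autonomous sketch (assumed WPILib-style Java):
// drive forward for 2 seconds, then stop. No human input is involved.
import edu.wpi.first.wpilibj.TimedRobot;
import edu.wpi.first.wpilibj.Timer;
import edu.wpi.first.wpilibj.drive.DifferentialDrive;
import edu.wpi.first.wpilibj.motorcontrol.PWMSparkMax;

public class AutonomousSketch extends TimedRobot {
  private final PWMSparkMax leftMotor = new PWMSparkMax(0);
  private final PWMSparkMax rightMotor = new PWMSparkMax(1);
  private final DifferentialDrive drive = new DifferentialDrive(leftMotor, rightMotor);
  private final Timer timer = new Timer();

  @Override
  public void robotInit() {
    rightMotor.setInverted(true); // right side faces the other way
  }

  @Override
  public void autonomousInit() {
    timer.reset();  // start timing when autonomous mode begins
    timer.start();
  }

  @Override
  public void autonomousPeriodic() {
    if (timer.get() < 2.0) {
      drive.arcadeDrive(0.4, 0.0); // forward at 40% power, no turning
    } else {
      drive.arcadeDrive(0.0, 0.0); // 2 seconds elapsed: stop
    }
  }
}
}}}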
=== Software is Complex ===
It's important to note that the robot turtle exercise involves an ideal robot operating in an ideal world. Much of the complexity of software comes from its need to interact with the real world. For example, the robot turtle:
* has no obstacles to avoid
* doesn't need to find targets
* walks on perfect terrain
* moves precisely as you instruct it
By contrast, for a real robot, even simple things like moving forward in a straight line are quite complex (see the sketch after this list):
* The motors that power each wheel vary slightly in speed and strength, so applying the same power to each motor will cause the robot to turn rather than go straight
* Robot wheels don't have equal traction with the ground (especially if the robot has no [https://www.youtube.com/watch?v=R3G7AGtMSoU mechanical suspension]). If the robot isn't perfectly level and balanced (which it never is), one wheel on a 4-wheel robot may have little or no traction at all!
* The terrain the robot is traversing may have obstacles, bumps, ramps, etc.
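Here is a sketch of that naive, open-loop approach in the same assumed WPILib-style Java: command the same power to both sides and hope for the best. Nothing in this code measures what the robot actually does, so the differences listed above go uncorrected and the robot drifts off course.

{{{#!java
// Naive open-loop "drive straight" (assumed WPILib-style Java): the same
// power is commanded to both sides, but nothing checks what actually happens.
import edu.wpi.first.wpilibj.TimedRobot;
import edu.wpi.first.wpilibj.motorcontrol.PWMSparkMax;

public class OpenLoopStraightSketch extends TimedRobot {
  private final PWMSparkMax leftMotor = new PWMSparkMax(0);
  private final PWMSparkMax rightMotor = new PWMSparkMax(1);

  @Override
  public void autonomousPeriodic() {
    leftMotor.set(0.5);   // 50% power to the left side
    rightMotor.set(-0.5); // 50% power to the right side (negated because it faces the other way)
    // With no sensor feedback, the small motor and traction differences
    // described above accumulate and the robot slowly curves off course.
  }
}
}}}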
So even the act of driving forward in a straight line is challenging. To achieve it, Control Systems students must write software that continuously monitors the robot's environment using sensors such as:
* a compass to determine orientation
* a gyroscope to determine when the robot is turning
* wheel encoders to determine how fast each wheel is turning
* laser rangefinders to determine how far the robot is from objects around it
* intelligent vision systems to identify targets
The software monitors these sensors continuously while adjusting power to each motor to achieve the desired result; one common pattern is sketched below.
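As one concrete (and assumed, not this team's actual) example in the same WPILib-style Java: use a gyro to measure how far the heading has drifted and apply a proportional correction to the turn rate, and use a wheel encoder to decide when the target distance has been reached. The gyro model, encoder channels, gain, and distances below are all illustrative values; real code typically adds tuning, full PID control, and more sensors, but the shape of the loop (read sensors, compute a correction, update the motors, repeat) is the core idea.

{{{#!java
// Closed-loop "drive straight for 3 meters" sketch (assumed WPILib-style Java):
// a gyro keeps the heading near zero, a wheel encoder measures distance.
import edu.wpi.first.wpilibj.ADXRS450_Gyro;
import edu.wpi.first.wpilibj.Encoder;
import edu.wpi.first.wpilibj.TimedRobot;
import edu.wpi.first.wpilibj.drive.DifferentialDrive;
import edu.wpi.first.wpilibj.motorcontrol.PWMSparkMax;

public class DriveStraightSketch extends TimedRobot {
  private static final double KP = 0.03;           // proportional gain (found by tuning, value assumed)
  private static final double TARGET_METERS = 3.0; // example target distance

  private final PWMSparkMax leftMotor = new PWMSparkMax(0);
  private final PWMSparkMax rightMotor = new PWMSparkMax(1);
  private final DifferentialDrive drive = new DifferentialDrive(leftMotor, rightMotor);
  private final ADXRS450_Gyro gyro = new ADXRS450_Gyro(); // gyro model assumed
  private final Encoder leftEncoder = new Encoder(0, 1);  // DIO channels assumed

  @Override
  public void robotInit() {
    rightMotor.setInverted(true);
    leftEncoder.setDistancePerPulse(0.0005); // meters per pulse; depends on wheels and gearing
  }

  @Override
  public void autonomousInit() {
    gyro.reset();        // current heading becomes "straight ahead"
    leftEncoder.reset(); // start measuring distance from here
  }

  @Override
  public void autonomousPeriodic() {
    if (leftEncoder.getDistance() < TARGET_METERS) {
      // The farther the heading has drifted from zero, the harder we steer back.
      double correction = -KP * gyro.getAngle();
      drive.arcadeDrive(0.5, correction);
    } else {
      drive.arcadeDrive(0.0, 0.0); // target distance reached: stop
    }
  }
}
}}}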
This should give some appreciation for the challenges facing [https://waymo.com/ self-driving cars], which are essentially robots that carry people.