Lessons Learnt from the agBOT Challenge

Introduction

Having worked on a non-ROS robot, I was sold on the idea of using ROS for the IUPUI agBOT 2. As an advocate for ROS, I was able to convince our sponsors and my team to implement it on the next-generation agBOT, which was going to be a 4×4 ATV by Yamaha – the 2016 Wolverine R-Spec EPS. The task at hand was to automate the Wolverine so it could traverse rows of corn, detect weeds, and spray herbicides and fertilizers as needed. We were able to achieve some of the functionality, thanks to ROS and the hard work of our team, who spent several sleepless nights to make it happen in less than 3 months and on a budget < $5000 USD*. This article is an account of my presentation “Lessons learnt from the agBOT challenge” for the ROS-Agriculture Community meeting dated August 14, 2018.

Steering

The Yamaha Wolverine had a good EPS motor that we were able to hack into. Steering control was achieved by splicing the motor terminals so that they connected to both the stock EPS control system and an H-bridge motor controller simultaneously, with a physical switch to select between the two. The H-bridge interfaces with ROS through an Arduino over rosserial. A rotary encoder is connected to the shaft that comes out of the other end of the motor using a custom 3D-printed pulley and a rubber belt, and it reads values back to the Arduino to provide feedback.
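The feedback loop described above can be sketched as a simple proportional controller. This is a minimal, hypothetical sketch in Python (the actual control ran on the Arduino; the function name, gain, and PWM range here are illustrative, not the agBOT's values):

```python
def steering_command(target_deg, encoder_counts, counts_per_rev=1024,
                     kp=4.0, max_pwm=255):
    """Proportional steering: convert encoder counts to degrees and
    return a signed PWM duty cycle for the H-bridge (sign = direction)."""
    current_deg = encoder_counts * 360.0 / counts_per_rev
    error = target_deg - current_deg
    pwm = kp * error
    # Clamp to the H-bridge's PWM range.
    return max(-max_pwm, min(max_pwm, pwm))
```

In the real chain this computation sat between the rosserial messages coming from ROS and the signals the Arduino sent to the motor controller, which is where the relay delays mentioned below crept in.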

What could be better and why:

Not having used an Arduino:

An Arduino is a great tool for interfacing with hardware such as an H-bridge, which is why we opted to use it over a serial connection. However, a Raspberry Pi, or a PWM board paired with a Jetson, would have given us much more flexibility to use different control techniques. The current system just reads values from the encoder and sends signals to the motor controller. It experiences some delay as the turn signals are relayed from ROS to the Arduino to the controller – too many things that could go wrong.
Not having used an encoder for feedback

The rotary encoder provides a solid resolution of 1024 pulses per revolution, which might be overkill. The steering wheel only needs to turn 540 degrees lock to lock, well within what a geared potentiometer could cover. However, using the rotary encoder with an Arduino brings up another issue, since it can rotate freely: each time the Arduino restarts, it reads the current value as the center value, so the steering has to be centered manually. This could be fixed with a different approach in the code, but we did not address it due to time constraints. A simpler solution is to use a potentiometer with appropriate gearing. I advocate the use of potentiometers because they are much cheaper than encoders, provide enough resolution to steer a vehicle, and don't lose their value when the microcontroller restarts.
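As a rough sketch of the potentiometer approach (hypothetical names and scaling, assuming a 10-bit ADC and gearing that maps the pot's travel onto the wheel's 540 degrees):

```python
def pot_to_steering_deg(adc_value, adc_max=1023, travel_deg=540.0):
    """Map a 10-bit ADC reading from a geared potentiometer to a
    steering angle centered at 0 (negative = left, positive = right).
    Unlike an incremental encoder, this reading is absolute, so it
    survives a microcontroller restart without re-centering."""
    fraction = adc_value / adc_max          # 0.0 .. 1.0 across full travel
    return (fraction - 0.5) * travel_deg    # -270 .. +270 degrees
```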

Throttle

The throttle control system for the agBOT uses not one but two stepper motors running in synchronization to pull a bicycle brake cable attached to the throttle pedal. The throttle pedal is connected to the cable by a custom-made metal part, and the other end of the cable attaches to the stepper motors via a custom 3D-printed part in a tough resin material.

What could be better and why:

Not having used an Arduino:

The throttle commands are sent by a script that reads inputs from the path planning algorithm and relays them to the Arduino. The planner reads the velocity data from the GPS device and generates velocity outputs for the controller to follow. Before the controller receives the signal to move the throttle to the desired position, a PID control script reads the output of the planner and relays it to the Arduino. More control techniques could have been employed here had we not been limited to an Arduino board.
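The planner-to-throttle chain can be sketched as a minimal PID loop. This is an illustrative Python sketch, not the agBOT's actual script; the gains and the 0–1 throttle range are assumptions:

```python
class ThrottlePID:
    """Minimal PID loop: planner velocity setpoint in, bounded throttle
    position (0..1) out. Gains are placeholders, not tuned values."""
    def __init__(self, kp=0.5, ki=0.1, kd=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Throttle position is bounded: fully released (0) to floored (1).
        return max(0.0, min(1.0, out))
```

On the agBOT, the output of a loop like this was what got relayed to the Arduino driving the steppers.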

Not having used the two steppers

Using two steppers was a design choice made very early in the ideation phase of the project, and it cost us a lot of time to ensure the system performed as it should. A single stepper could easily have been used, and a good servo motor would also react faster than a stepper motor. One of the issues we ran into was synchronizing the movement of both steppers; another was 3D printing the part in a material that could withstand the load placed on it.
“Super-servo” motors are readily available at a cheap price with a wide range of torque capacities. These servos come with built-in controllers and can run on several voltage options. It is my opinion that a servo motor paired with a similar bicycle cable would be a better way to control the throttle of the vehicle.

Having reverse functionality

The agBOT lacked automated reverse functionality. This was mainly because the gear shaft was located beside the driver's seat, and trying to hack the gearbox would risk losing the ability to control the system manually. The time that would have to be invested to make this functional didn't make sense.

Emergency

The emergency system on the agBOT is based on a simple ROS node, which listens for input from an Arduino over rosserial. The Arduino keeps listening for an interrupt signal from either the on-board emergency button or the XBee. As soon as ROS receives the interrupt, it sets the emergency variable to 1, which invokes a series of actions – such as applying the brakes, cutting off the engine, and stopping all spray and drive functions.
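Stripped of the ROS and rosserial plumbing, the node's logic amounts to a latch. A plain-Python sketch (class and action names are illustrative, not from the agBOT's code):

```python
class EmergencyMonitor:
    """Sketch of the emergency node's logic: any interrupt source
    (on-board button or XBee) latches the emergency state, which
    triggers the shutdown actions once."""
    def __init__(self):
        self.emergency = 0
        self.actions = []

    def on_interrupt(self):
        if self.emergency == 0:
            self.emergency = 1
            # Mirror the agBOT's response: brakes, engine cut-off,
            # and stopping all spray and drive functions.
            self.actions = ["apply_brakes", "cut_engine",
                            "stop_spray", "stop_drive"]
```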

What could be better and why:

This system was functional and did what it was supposed to do. One functionality that could be added, however, is the ability to return to the zero state after the emergency has subsided. The current system only stops the functions; it will not bring the accelerator and brake pedals back to the zero position unless they are brought back manually. One can argue this is safer, but to me it is not as smart. A safety system for a commercial product would need to be far more advanced than this, but our application is limited to research – for now.

Sensors

My definition of a robot is – “a machine that is capable of observing its surroundings and of reacting to the changes in the surroundings accordingly while accomplishing a specified task.”
I've learnt that having more sensors on a robot can make development less intimidating. It is always good practice to have multiple streams of data coming into the system from different sensors. A good way to better understand this is to know what kinds of sensors need to go on a robot. Splitting the sensors that go on a robot into two categories – intrinsic and extrinsic – can help in determining and achieving the purpose of the robot.

The objective is to fill out the pose estimate by sourcing data from different sensors on board. In ROS, this is typically expressed as a PoseWithCovarianceStamped message.
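For reference, the geometry_msgs/PoseWithCovarianceStamped message has the following structure:

```
std_msgs/Header header
geometry_msgs/PoseWithCovariance pose
    geometry_msgs/Pose pose
        geometry_msgs/Point position          # float64 x, y, z
        geometry_msgs/Quaternion orientation  # float64 x, y, z, w
    float64[36] covariance                    # 6x6 row-major: x, y, z, roll, pitch, yaw
```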

On the IUPUI agBOT, an Emlid RTK GPS was used with a Tinkerforge IMU.

What could be better and why:

More sensors for localization:

It is possible to localize a robot using just IMU and GPS data; the vehicle velocity information is sourced from the GPS unit. Adding wheel encoders could have improved the accuracy of the pose generated by the robot_localization package, which supports input from multiple sensor streams of the same kind to generate a state estimate of the robot's pose in free space. So, adding more than one way to collect information about the robot's pose – redundancy – is recommended.
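A sketch of what feeding two sensor streams into robot_localization's EKF node could look like (the topic names are assumptions; each boolean vector selects which of x, y, z, roll, pitch, yaw, their velocities, and linear accelerations that sensor contributes):

```yaml
frequency: 30
two_d_mode: true

# Wheel-encoder odometry (hypothetical topic name)
odom0: /wheel/odometry
odom0_config: [false, false, false,   # x, y, z
               false, false, false,   # roll, pitch, yaw
               true,  true,  false,   # vx, vy, vz
               false, false, true,    # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az

# IMU (hypothetical topic name)
imu0: /imu/data
imu0_config: [false, false, false,
              true,  true,  true,
              false, false, false,
              true,  true,  true,
              true,  false, false]
```

With a setup like this, the EKF fuses the encoder velocities against the IMU orientation rather than relying on a single source for each state variable.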

Computing

Computing can be costly – both in terms of computing power and electric power. Our team decided to put a full-size desktop computer on board: an old 3rd-generation Intel Core i7-powered Dell OptiPlex sourced from the surplus store. After adding an Nvidia 1060Ti and 8 GB of RAM, the PC was good enough to be used on a robot, but very costly in terms of power consumption.

What could be better and why:

Using one of Nvidia's embedded systems could have saved the team plenty of time, as it would have eliminated the need for the complex power system that was developed for this application. The Jetson TX2 could have provided similar graphics processing power with its 256 CUDA cores on board, at a far lower power consumption of just 30 W. The current system draws a frightful 1.7 kW when operational.

Tips on Developing a robot

Life-cycle

Robots such as the agBOT are geared towards research, knowledge, and value creation. Such systems are built for comparatively longer product life cycles than, say, smartphones. Research budgets are hard to come by, and working on the agBOT taught us to be frugal and innovative (the hard way). Reuse and proper consideration of interfaces can drastically increase the life cycle of such systems. These systems should be upgraded each time a minor requirement changes, but rebuilding a system (essentially a prototype) from scratch can be counterproductive. Considering a 3–5 year product life cycle when developing a robot can help reduce redundant work and accelerate development as different team members contribute to the project over time. I plan to do another in-depth piece about robot life cycles and a systems engineering approach to robotics development soon.

Community Engagement

Working on a robot can get overwhelming, especially when you're racing against time. Getting involved with the ROS-Agriculture community helped me a lot because it exposed me to new perspectives and ideas from real, like-minded people with just as much interest in seeing the technology succeed. This is truly the age of collaborative development, and being part of a community has proven extremely important. I have learnt a lot just from being present at the weekly ROS-A community meetings, and it always motivates me to work on solving one of the many problems the robotics field faces today. I hope this information helps someone who is thinking of working on their own robot.

 

-Sohin
