
College of Engineering and Computing

AI-driven robot for farming

Utilizing robotics and AI for planting and harvesting

According to the most recent data from the U.S. Department of Agriculture, approximately 900 million acres, or more than half of the continental U.S., are used for farming. Many farms already practice digital farming, using tools that collect, store, analyze and share electronic data.

But can digital farming go beyond data purposes and incorporate artificial intelligence? Mechanical Engineering Professor Sourav Banerjee is continuing his research into using AI-driven robots for planting, placing and removing stakes on farms. 

Due in part to a labor shortage caused by the COVID-19 pandemic, South Carolina farmers are struggling to meet demand. Staking, the process of driving stakes into the ground and fastening plants to them for strength and support, is performed by laborers. But because a sizable number never returned after the pandemic, plants went unstaked and lacked support, causing fruit and vegetables to rot.

Banerjee’s research interests include robotics, automation and AI-based learning. After meeting with the South Carolina Department of Agriculture (SCDA) and proposing the use of robots in farming, he received funding to improve crop production. Initially a two-year project running from June 2021 to October 2023, it was recently renewed for an additional two years, bringing total funding to $325,000.

Banerjee’s graduate student Corey Leydig began by designing and building an autonomous robot capable of driving stakes into soil beds for produce with weak stems. Known as StakeBot, the robot includes stake-driving arms and a carrier, a vision system for detecting bed plot boundary walls, a waterproof electronics and control system, and robust all-wheel drive.

“I was responsible for the complete development. It involved creating the design, including engineering analysis, physics simulations and proof of concepting for all the physical parts,” Leydig says. “I also created wiring diagrams and developed the electrical systems that run the StakeBot and some custom printed circuit boards that I designed and used for the arm motor controls.”  

Banerjee and Leydig's first challenge was determining where the robot could drive stakes into the ground. One of the first tasks was teaching the robot to sense its environment through sound and light. This was achieved with vision-based AI and self-driving control, using depth cameras that derive a distance measurement from each pixel of an image. The cameras perceive the environment to help the StakeBot distinguish drivable areas from obstacles and boundaries. An algorithm was also developed for autonomous navigation, turning the robot in the correct direction to stay centered between bed plots.
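The centering idea can be sketched from the description above. This is not the project's actual algorithm, only a minimal illustration: given one horizontal row of depth-camera readings, treat far readings as open corridor and near readings as bed-plot walls, then steer toward the corridor's midpoint. The function name and the `wall_depth` threshold are assumptions for illustration.

```python
import numpy as np

def centering_steer(depth_row, wall_depth=1.0):
    """Return a steering value in [-1, 1] from one row of depth
    readings (meters): negative = steer left, positive = steer right,
    0.0 = centered. Returns None if no open corridor is visible."""
    # Columns reading farther than wall_depth are open corridor;
    # nearer readings are bed-plot walls or obstacles.
    open_cols = np.flatnonzero(depth_row > wall_depth)
    if open_cols.size == 0:
        return None  # nothing drivable ahead: stop
    center = (len(depth_row) - 1) / 2.0
    # Offset of the corridor's midpoint from the image center,
    # normalized so the sign says which way to turn to recenter.
    return float((open_cols.mean() - center) / center)
```

In a real system this would run per frame, feeding a motor controller; here it only shows how per-pixel distance can become a steering decision.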

“We went to a farm and took a series of pictures, which were processed and used to train the robot to determine if a particular ground is flat, which means it can drive stakes, or to avoid an area if there is an obstacle,” Banerjee says. “All of these cases have been implemented in a small computer based on previous learning.” 
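The training step Banerjee describes, labeling farm photos so the onboard computer can decide "flat, so drive a stake" versus "obstacle, so avoid," can be illustrated with a deliberately simple stand-in. The nearest-neighbor matcher below is not the project's actual model; it only shows the shape of the task: compare a new ground patch against labeled examples and return the closest label.

```python
import numpy as np

def classify_ground(patch, train_patches, train_labels):
    """Label a ground patch by its nearest labeled training patch.
    A toy stand-in for the learned classifier on the robot's
    small onboard computer."""
    dists = [np.linalg.norm(patch - p) for p in train_patches]
    return train_labels[int(np.argmin(dists))]
```

A production system would use a trained vision model on many labeled images, but the decision it feeds the robot is the same binary: stakeable ground or area to avoid.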

Another challenge was determining how the StakeBot would drive the stake into the ground. First, the vision-based AI perceives the environment, capturing an image so the robot can determine where to place and drive the stake, or whether it needs to readjust because of an obstacle. The robot carries stakes on its back and uses one of its arms to place a stake on the bed plot. Its motors then supply the force required to drive a typical stake into the ground.
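The sequence above, perceive, readjust if blocked, place, then drive, can be written as a short control loop. The four callables are hypothetical stand-ins for the robot's subsystems, not real StakeBot APIs:

```python
def stake_cycle(perceive, readjust, place_stake, drive_stake):
    """One staking cycle: keep readjusting until the perceived spot
    is clear, then place the stake and drive it into the ground."""
    spot = perceive()
    while spot == "obstacle":
        readjust()          # shift position and look again
        spot = perceive()
    place_stake()           # arm lowers a stake onto the bed plot
    drive_stake()           # motors supply the driving force
```

Separating perception from actuation like this is what lets the same loop run unchanged whether the spot is clear on the first look or after several readjustments.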


To calculate the required force, Banerjee and Leydig completed several experiments at farms across the state. They determined that the robot can supply up to 25 pounds of force, enough to drive a stake into a bed plot.

“We found that the maximum force required is based on the type of soil. The range was anywhere from eight to 25 pounds. Based on the environment, the robot will perceive and use the required force to drive the stake,” Banerjee says.
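As a rough sketch of that eight-to-25-pound finding, one could encode soil-dependent forces as a lookup capped at the robot's maximum. The soil categories and numbers in between are assumptions for illustration; the real robot infers the requirement from perception rather than from a fixed table.

```python
# Hypothetical soil-type -> stake-driving force table (pounds),
# spanning the 8-25 lb range measured in the field experiments.
SOIL_FORCE_LB = {"sandy": 8, "loam": 16, "clay": 25}

ROBOT_MAX_LB = 25  # most force the StakeBot's arm motors can supply

def stake_force(soil_type):
    """Force (lb) to command for a given soil type, checked against
    the robot's mechanical limit."""
    force = SOIL_FORCE_LB[soil_type]
    if force > ROBOT_MAX_LB:
        raise ValueError("required force exceeds robot capability")
    return force
```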

Leydig also developed an iOS app that lets a user communicate with the StakeBot and program it to move through bed plots and drive stakes. Features include manual driving, status updates, depth camera views, and programming and storing job settings. He also created the software control for both user and AI modes and a vision processing algorithm that helps the robot navigate autonomously between bed plots.
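The article names the "program and store job settings" feature but not its format, so the message below is purely illustrative: a small, serializable job description of the kind an app could send to a robot and save for reuse. Every field name here is an assumption, not the StakeBot's real protocol.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class StakeJob:
    """Hypothetical job-settings message from the app to the robot."""
    rows: int              # number of bed plots to traverse
    stake_spacing_m: float # distance between stakes along a plot
    drive_force_lb: float  # force to command when driving each stake

def encode_job(job: StakeJob) -> str:
    """Serialize a job so it can be sent to the robot and stored."""
    return json.dumps(asdict(job))

def decode_job(msg: str) -> StakeJob:
    return StakeJob(**json.loads(msg))
```

A plain-text encoding like JSON makes stored jobs easy to inspect and replay, which suits the app's store-and-rerun workflow.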

“These farms are getting to a point where their only option is to turn to automation if they want to get the most out of their crops. The StakeBot robot can help these farmers do some of the more tedious tasks that traditionally require a large work force, which is becoming harder to maintain,” Leydig says. “This research helps bring to light the problems in the industry and provides solutions for autonomous robots being able to operate in a farm setting.” 

The SCDA believes robotics in farming and harvesting is the future. While farms are currently not designed for robotics, they could be modified to allow for advancements in digital farming. 

“For example, fruits and vegetables can hang from specially prepared perforated walls built inside a greenhouse. Using AI, a robot can automatically image and detect which tomatoes are not ripe and which ones can be harvested,” Banerjee says. “New facilities should be designed for robots because AI could help harvest food better than current methods.” 

Banerjee’s current goal for the project’s extension period is to create a virtual robot, since there is not enough room in his lab to construct a large robot. Commercialization is the eventual long-term goal since design changes and field testing are still needed. 

“We can prove it works because we trained the robot to do everything virtually with all the pictures of farmland that we have,” Banerjee says. “If we get a contract, we’ll design the robot for someone to manufacture.” 

