
The Exhibitor Blog

12 Mar 2021

Ag robotics take the spotlight at virtual event

by Arable Farming

The annual FIRA robotics conference in Toulouse, France, was presented as a virtual event at the end of 2020, with more than 1,500 participants from 70 countries. Jane Carley takes a look at some of the latest developments.

The second International Forum of Agricultural Robotics (FIRA) conference focused on the importance of robotics in a changing agricultural market, how robots can contribute to the social and economic landscape and what the obstacles to developers, purchasers and operators are likely to be.

Speakers ranged from the French Minister of Agriculture to representatives of the French national research organisation, as well as developers and manufacturers.

Presentations looked at developments in robots for the arable, fruit growing, vineyard and livestock sectors, while exhibitors were able to offer virtual pitches on new products and technologies and video demonstrations allowed delegates to see robots in action.

Many of the products ready for market targeted the vineyard and fruit growing sectors, which not only represent high-value produce but also face labour supply issues that could be tackled by automation.

For arable operations, benefits include improved targeting and lower use of inputs, helping to cut costs and meet legislative requirements.

Wider consultation needed

Ensuring that industry and the public understand and accept robots and that codes of practice are in place for their use is key to their successful introduction, said Dr David Rose of Reading University.

Dr Rose, along with colleagues from the University of Lincoln and industry partners, is taking part in the UK Government-funded Robot Highways demonstration.

The £2.5 million project will establish what is said to be the world’s first robotic farm.

Dr Rose said: “Agriculture 4.0 is perceived as ‘exciting but scary’, and alongside its benefits there are social, environmental, legal and ethical issues which could derail its implementation.” He has co-authored a study, currently in review, entitled ‘Towards the responsible development of autonomous robotics in agriculture: A call to action’, which looks at the pros and cons of automation.

Examples include the potential for automation to attract younger workers, help with labour shortages and provide skilled job opportunities while tackling the dull and dangerous tasks.

This is countered by the fact that some farmworkers could lose jobs and that dangerous jobs may simply be displaced to the sectors which manufacture robots (mining, for example). The study proposes a suite of measures needed for responsible innovation, and Dr Rose said: “We have seen this in action with Innovate UK’s responsible innovation guide, which introduces the PAS 440:2020 standard for such developments, while in Australia, a Code of Practice for Agricultural Mobile Field Machinery with Autonomous Functions has been drafted.”

Inclusive

However, Dr Rose said that surveys into farmer and consumer acceptance of autonomous agriculture have not yet been truly inclusive, tending to focus on self-selecting farmers.

“There is a real need to test different methods of inclusion.”

“There is also a need to respond to feedback by questioning underlying assumptions and considering alternatives,” he said.

“Developments must also be responsive,” he added.

“Regulations must keep up with technological advances, and codes of practice or standards for the design and operation of autonomous robots must be established, constantly reviewed and updated.”

Robot Highways plans to include both growers and operators in its demonstration project and aims to improve user-centred design and peer-to-peer knowledge exchange.

Spotting obstacles in 3D

Koen van Boheemen and Pieter Block (Wageningen University) and Dr Kim Gookwan (Rural Development Administration, Korea) have brought together their expertise in precision agriculture, vision and robotics and smart farming to develop a smart obstacle detection system for an autonomous orchard robot.

Mr Van Boheemen explained that fruit and salad production are at the forefront of research and development, due to the high value of the crops, but much of the knowledge will also apply to row crops in the future.

He pointed out that there are an increasing number of robots used in research, but the focus is often on their development and application, with safety being overlooked.

Existing autonomous orchard robots use two SICK LMS111 LiDAR sensors for navigation and safety: if an object is detected in the robot’s path, it stops.

“The disadvantage is that this system operates in 2D so cannot detect an object above or below it,” said Mr Van Boheemen.

Research

To improve upon this, the team’s research uses an RGB-D camera for environment sensing, which produces both colour and depth data.

“It is small, reliable, waterproof to IP65 and affordable at around €200,” Mr Van Boheemen said.

If there is no object in view, the processor publishes a ‘safe’ message and the robot proceeds.

If there is an object, and thus no ‘safe’ message, the robot stops; the images can also be viewed by the operator.
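
As a rough illustration of how such a ‘safe’ message loop can work, the Python sketch below checks the minimum depth within an assumed region of interest of an RGB-D frame and publishes ‘safe’ only when nothing lies inside a notional stopping distance. The threshold, region of interest and publish() callback are invented for illustration and are not the Wageningen/RDA implementation.

    # Illustrative sketch only: a depth-threshold check in the spirit of the
    # system described above. All names and values here are assumptions.
    import numpy as np

    SAFE_DISTANCE_M = 1.5                       # assumed stopping distance, metres
    ROI = (slice(120, 360), slice(160, 480))    # assumed image region covering the robot's path

    def check_frame(depth_m, publish):
        """Publish a 'safe' message only when no object lies within the stopping
        distance inside the region of interest; with no message, the robot stops."""
        roi = depth_m[ROI]
        valid = roi[roi > 0]                    # ignore pixels with no depth return
        obstacle = valid.size > 0 and float(np.min(valid)) < SAFE_DISTANCE_M
        if not obstacle:
            publish("safe")                     # robot proceeds
        return not obstacle

    # Example: a clear 640x480 frame at roughly 3 m everywhere yields a 'safe' message.
    clear_frame = np.full((480, 640), 3.0)
    check_frame(clear_frame, publish=lambda msg: print("published:", msg))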

The next challenge will be to enable the robot to respond differently to different obstacles.

“For example, to sound a horn to alert a human and to stop if it is a crate that is in the way, or to programme in other actions such as backing up away from an obstacle,” added Mr Van Boheemen.
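
A differentiated response of that kind could be as simple as a lookup from obstacle class to action, as in the hypothetical sketch below; the classes and actions are taken from the quote above as examples, not a specification.

    # Hypothetical extension: pick actions per obstacle class instead of always stopping.
    ACTIONS = {
        "human": ["sound_horn", "stop"],
        "crate": ["stop"],
        "branch": ["back_up", "replan_path"],
    }

    def respond(obstacle_class):
        # Unknown obstacles fall back to the safest behaviour: stop.
        return ACTIONS.get(obstacle_class, ["stop"])

    print(respond("human"))   # ['sound_horn', 'stop']
    print(respond("pallet"))  # ['stop'] - unrecognised class, default to stopping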

Machine learning

Formerly Bosch Robotics, Farming Revolution is developing autonomous weeding robots that use mechanical tools for weed control, eliminating the use of chemicals.

This requires the use of multi-spectral cameras which collect images to allow the identification of crop and weed growth.

Sales manager Markus Hofferlin said: “The challenges include varying soil types, overlapping and diffuse growth, varying appearance of both crop and weeds and changing environmental conditions.” Farming Revolution has compiled data from more than 50 fields, from 2am to nightfall, in wet and dry conditions.

Over five years, more than 65 species were studied, in dew, dust and mud: a total of 12 million images.

The data, said to offer 99% accuracy, will be used to guide just three responses from the robot: do not treat (crop identified), treat (weed identified) or do not care (soil).
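
In outline, that decision could look like the sketch below, which maps per-plant classifier probabilities onto the three responses and falls back to ‘do not treat’ when the model is unsure; the confidence threshold and fallback policy are assumptions for illustration, not Farming Revolution’s published logic.

    # Illustrative only: turning per-plant classifier probabilities into the
    # three responses described above. Threshold and fallback are assumed.
    def decide(p_crop, p_weed, p_soil, min_conf=0.9):
        label, conf = max([("crop", p_crop), ("weed", p_weed), ("soil", p_soil)],
                          key=lambda pair: pair[1])
        if conf < min_conf:
            return "do not treat"   # uncertain: err on the side of protecting the crop
        return {"crop": "do not treat", "weed": "treat", "soil": "do not care"}[label]

    print(decide(0.02, 0.95, 0.03))   # treat
    print(decide(0.40, 0.35, 0.25))   # do not treat (low confidence)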

“However, there are 60 terabytes of data, so it would take 18 months to ‘train’ these responses,” said Mr Hofferlin.

Evaluate

“Thus, we ‘steer’ the training by tagging the data at image level, to indicate, for example, growth stage, weather and light conditions.”

“We then evaluate how well the robot performs in each situation and place more emphasis on training for poorly performing areas. Continued evaluation will also be offered when the robots go to market, on both labelled and unlabelled data,” he added.

“If a customer is getting poor results, we can upload the data to see how the robot is performing and use it to train the machine learning.”
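
One simple way to ‘steer’ training along the lines described is to weight the sampling of tagged images towards the conditions where evaluation shows the weakest performance. The sketch below does this with invented tags and accuracy figures; it is an assumption about the approach, not the company’s pipeline.

    # Rough sketch: sample training images more often from poorly performing,
    # tagged conditions. Tags and accuracies are invented for illustration.
    import random

    per_tag_accuracy = {
        "dry_daylight": 0.99,
        "wet_soil": 0.94,
        "low_light": 0.88,
        "overlapping_growth": 0.85,
    }

    # Weight each condition by its error rate, then normalise to probabilities.
    weights = {tag: 1.0 - acc for tag, acc in per_tag_accuracy.items()}
    total = sum(weights.values())
    sampling_probs = {tag: w / total for tag, w in weights.items()}

    def next_training_tag():
        tags = list(sampling_probs)
        return random.choices(tags, weights=[sampling_probs[t] for t in tags], k=1)[0]

    print(sampling_probs)        # low_light and overlapping_growth get the most weight
    print(next_training_tag())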

Autonomous or tractor mount

Swiss precision weeding robot specialist Ecorobotix is developing the AVO weeding robot which uses a camera and on-board processor to control spot spraying, a system which has undergone independent trials in several countries.

Using a two-metre spray bar with 52 nozzles and a 120-litre chemical tank, it offers an output of up to eight hectares per day.

Marketing manager Claude Juriens said: “AVO can reduce herbicide use by up to 95%, while boosting yield and cutting pesticide residues.” Trials were carried out on sugar beet in Switzerland, where two applications at 0.4 litres/ha reduced application by 41% compared to a single standard spray at one litre/ha, and in Germany breeder KWS found that weed populations fell by 71% after a single spray, with a 78.4% reduction in chemical use.

However, concerns about autonomous vehicle legislation and uncertainty over the acceptance of robots by farmers, plus a challenging business case on broadacre crops, have led the company to a further development.

Spot spraying

“Farmers asked if our technology could be used behind a tractor,” said Mr Juriens.

“Therefore we have launched the ARA, which uses the spot spraying technology in a six-metre wide, tractor-mounted unit with six ultra-high-precision cameras, capable of spraying 20-40ha per day.”

“We will continue with the additional work needed on the robot and its certification process, but in the meantime we have a precision sprayer that we can bring to market.”

Best in show

Tevel Aerobotics Technologies, which recently secured a $20 million (£14.5m) investment from Kubota, won FIRA’s Best Field Robot Concept Award for its Flying Autonomous Robot (FAR).

Designed to pick fruit from apples to avocados, the robot uses a range of artificial intelligence including vision algorithms to detect fruit, foliage and other objects and can classify fruit for size and ripeness.

It can also offer harvest management to help determine the size of the picking fleet.

Based in Israel, the company plans commercial roll-out of FAR this year.
