THIS ARTICLE/PRESS RELEASE IS PAID FOR AND PRESENTED BY the Norwegian University of Life Sciences (NMBU)
The digitization revolution with drones in agriculture
This summer, several drones hovered over an experimental field at the Centre for Plant Research in Controlled Climate at NMBU to gather information about the growth and health of wheat plants.
Twenty meters above the experimental field, a drone hovers while continuously photographing every part of the field. What was previously recorded through time-consuming manual measurements on the ground, the drone accomplishes in a few minutes. Below, scientist Sahameh Shafiee and PhD student Tomasz Mroz at the Norwegian University of Life Sciences (NMBU) watch the drone's data tick in on an iPad.
“We want to use drone technology to study growth and development in the field in more detail. With the drone we can collect much more information than we can manually,” Shafiee says.
The goal is to develop new plant breeding tools and enable plant breeders to make a better selection of plants for breeding new varieties. The benefit lies in automating manual data collection and increasing the amount of information gathered.
“To compare different varieties, we need to photograph them under the same lighting conditions, either a clear sunny sky or a completely overcast sky. Keeping the lighting homogeneous is challenging when we are screening a field with hundreds of trials, since the sun's angle changes during the day. Besides, under Norwegian weather conditions, sudden clouds on a sunny day are always to be expected. So the flight duration and flight time have a big impact on the quality of the data and the amount of work needed to process it later,” Shafiee says.
Flight altitude and speed are important factors in drone imaging. Depending on the plant traits to be measured, they are calculated from the camera's properties, the required coverage of the field, and the image resolution needed.
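The trade-off between altitude and resolution can be made concrete with the ground sampling distance (GSD), which tells you how much ground one pixel covers. A minimal sketch, assuming illustrative camera parameters (the focal length, sensor width, and image width below are hypothetical, not the actual equipment used at NMBU):

```python
def ground_sampling_distance(altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Ground sampling distance in cm/pixel: how much ground one pixel covers.

    Derived from similar triangles: ground footprint / sensor width
    equals altitude / focal length.
    """
    return (altitude_m * 100 * sensor_width_mm) / (focal_length_mm * image_width_px)

# Illustrative example: a small-drone camera (5.4 mm focal length,
# 6.17 mm sensor width, 5472-pixel image width) flown at 20 m altitude:
gsd = ground_sampling_distance(20, 5.4, 6.17, 5472)
print(f"{gsd:.2f} cm/pixel")  # roughly 0.4 cm of ground per pixel
```

Doubling the altitude doubles the GSD: the flight covers the field faster, but each pixel covers twice as much ground, so fine traits become harder to measure.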
See what the eye cannot see
Sahameh Shafiee works with technology development, especially machine learning, identifying different plant characteristics, and developing prediction models for estimating yields.
The researchers use multispectral cameras that capture information about the plants that is invisible to the naked eye. This relies on the interaction between light and plants, which can be recorded at wavelengths beyond what human eyes perceive. Two important spectral regions between red light and near-infrared light carry information about the plants' state of health and production capacity.
“To calculate indices for the vegetation, we compile information from different wavebands of the multispectral camera such as RGB, RedEdge, and NIR to measure or estimate chlorophyll activity and concentration, plant coverage, biomass, maturity and heading dates, yield, and protein content,” Shafiee says.
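One widely used index of the kind Shafiee describes is the Normalized Difference Vegetation Index (NDVI), which combines the red and near-infrared bands. A minimal sketch with made-up reflectance values (the source does not say which specific indices the NMBU team computes):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, computed pixel-wise.

    Healthy vegetation reflects strongly in NIR and absorbs red light,
    so dense, healthy canopy pushes NDVI toward 1.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero

# Hypothetical 2x2 reflectance patches from a multispectral camera:
nir = np.array([[0.50, 0.60], [0.55, 0.45]])
red = np.array([[0.08, 0.10], [0.12, 0.30]])
print(ndvi(nir, red).round(2))
```

Averaging such an index over each plot gives one number per variety per flight, which is what makes plot-level comparisons of chlorophyll activity, coverage, and biomass tractable.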
Measuring crop progress over the last 50 years
At the Centre for Plant Research in Controlled Climate (SKP), open field department at Vollebekk, there are trials with 24 historical spring wheat varieties, from the Runar variety released in 1972 up to the most recent variety approved in 2020.
In his PhD project, Tomasz Mroz investigates the genetic and physiological basis of grain yield progress in spring wheat, as well as crop developmental patterns. He uses the drone images to investigate the basis for the historical increase in crop yield.
“In my research on historical crop growth in Norwegian spring wheat, drone images together with traditional records on the ground provide interesting knowledge. For example, we can use the information to calculate the plants' biomass, and we can make quite good estimates of the plants' yield potential already at an early stage in the season,” Mroz says.
Inspecting a field from the computer
The time at which heading and maturity occur are important characteristics that describe a variety. These measurements are usually made by visual assessment of the plants. It is common to physically go through the experimental field every other day for about two weeks to make manual observations.
One of the objectives is to accurately estimate both the heading time and the maturity time from image analysis. Only one average value per plot is needed to register these traits, and the overlapping images have sufficient resolution for that.
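One plausible way to turn per-plot averages into a heading date is to track a plot-level index across repeated flights and interpolate the day it crosses a calibrated level. A minimal sketch with invented numbers (the time series, the index, and the threshold below are all hypothetical, not the team's actual method):

```python
import numpy as np

# Hypothetical per-plot averages from repeated drone flights:
days  = np.array([150, 154, 158, 162, 166, 170])        # day of year of each flight
index = np.array([0.30, 0.42, 0.55, 0.68, 0.74, 0.76])  # plot-average vegetation index

# Assumed index level corresponding to heading (would need calibration
# against manual observations in practice):
threshold = 0.60

# Linear interpolation of the crossing day; works because the
# series is monotonically increasing over this window.
heading_day = float(np.interp(threshold, index, days))
print(f"Estimated heading: day {heading_day:.1f}")
```

The appeal is that a handful of flights replaces walking the field every other day for two weeks, while still yielding one date per plot.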
Plant height measured from drone images
Usually, plant height is also recorded manually with a rod and a yardstick. However, using the drone images it is possible to reconstruct a three-dimensional model of each plot, and obtain a precise estimate of the actual plant height.
When the drone takes one picture per second, each point in the field is covered by several pictures taken from different angles. The method is far from new but still effective; it is the same approach that has been used in aerial photography since the 1950s.
The researchers use photogrammetry to obtain three-dimensionality. By combining information from several images taken from different angles, they produce three-dimensional data from which plant height can be derived.
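Once photogrammetry yields a 3D surface, a common way to extract plant height is to subtract a bare-ground terrain model from the crop surface model and summarize each plot. A minimal sketch with made-up elevation rasters (the arrays and the 90th-percentile summary are illustrative assumptions, not the team's published pipeline):

```python
import numpy as np

# Hypothetical elevation rasters (metres above sea level) for one plot,
# as reconstructed by photogrammetry software:
dsm = np.array([[92.10, 92.15],   # digital surface model: top of the crop canopy
                [92.20, 92.05]])
dtm = np.array([[91.30, 91.32],   # digital terrain model: bare ground, e.g. from
                [91.31, 91.29]])  # a flight before the crop emerged

# Canopy height: surface minus terrain, per pixel.
canopy_height = dsm - dtm

# A high percentile is a common robust "top of canopy" summary: it
# ignores gaps between plants without being thrown by single outliers.
plot_height = float(np.percentile(canopy_height, 90))
print(f"Plot height: {plot_height:.2f} m")
```

This is what makes the drone estimate "precise" in practice: the rod-and-yardstick measurement samples a few plants, while the model averages over every pixel in the plot.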
New robot for close-ups
To be able to count the number of leaves, spikes and other parts on each individual plant, there is a need for close-up photographs with a much larger resolution than the drone images. Therefore, the researchers use a robot with a camera that rolls over the experimental field.
“We also aim to be able to identify disease attacks, so we must be able to identify an area on a leaf and quantify it. The robot drives into the field, stops at a plot, and takes a close-up with a high-resolution camera. But it is also a very time-consuming process,” Mroz says.
Testing drones for close-ups
The researchers are testing a drone equipped with a high-resolution camera, flown at low altitude above the experimental field, to investigate whether high-quality images can be captured faster than with the robot. But flying a drone at low altitude over the test field can be a challenge: the propeller downwash moves the plants, which makes it difficult to capture sharp images.
There is a trade-off between flight altitude and camera quality: finding a height at which the downwash does not disturb the plants, while the camera is still good enough to zoom in and deliver sufficient image resolution.
Need for upgraded equipment portfolio
The Centre for Plant Research in Controlled Climate at NMBU owns and operates the fields where the experiments take place. The centre focuses on development and continuous improvement, and aims to increase precision when upgrading machines, tools and equipment. Through image data and advanced technology, researchers can obtain accurate information in large quantities. Meeting the scientists' needs with precision equipment in a modern machine park provides great synergy effects for plant research.