SAES-422 Multistate Research Activity Accomplishments Report

Status: Approved

Basic Information

Participants

PARTICIPANTS: The meeting was a hybrid session with in-person and online participants. Day 2 Attendance (In person): Jefferson (Jeff) Vitale, Thanos Gentimis, Yaqing Xu, Robert Strong, Akinbode Adedeji, Lauren Godsmith, Mahendra Bhandari, Karun Kaniyamattam, Gordan Rojan. Day 2 Attendance (Online): Nikolay Bliznyuk, Ziteng Xu, Dongyi Wang, Yuzhen Lu, Cindy Morley, Won Suk Daniel Lee, Gary Thompson, Maria Bampasidou, Brent Arnoldussen, Young Chang, Hussein Gharakhani, Katsutoshi Mizuta, Carlos Rodriguez Lopez. Day 3 Attendance (In person): Jefferson Vitale, Thanos Gentimis, Yaqing Xu, Robert Strong, Akinbode Adedeji, Mahendra Bhandari, Karun Kaniyamattam, Gordan Rojan, Ali Fares. Day 3 Attendance (Online): Carlos Rodriguez Lopez, Yuzhen Lu, Ziteng Xu, Dongyi Wang, Hussein Gharakhani, Daniel Lee, Brent Arnoldussen, Katsutoshi Mizuta.

MINUTES OF THE ANNUAL MEETING

Host: Texas A&M, AgriLife Center, Corpus Christi, TX

Date: August 7 – 9, 2024

The three-day annual meeting commenced on Wednesday, Aug 7, 2024, with the arrival of some of the members. Our first gathering was held on Aug 8, 2024. The main thrusts of our meeting this year were:

  1. Day Two was attended by 9 members in person, and 13 members online. (See minutes for the list).
  2. The representative of the Texas A&M AgriLife Center, Dr. Gary (a different Gary from our advisor, Gary Thompson), gave a welcome speech to the meeting.
  3. All three objectives in our proposal for 2021 – 2026 were reviewed to gauge the extent of activities and impact across stations and to assess progress of work in order to prepare a better progress report this year. This was led by the current chair of the group, Jeff Vitale of Oklahoma State University. Dr. Gary Thompson, the S1090 advisor, gave a presentation on our performance vis-à-vis our objectives. Details are in the minutes of the meeting submitted with this report. We ranked GOOD, GOOD, GOOD, and EXCELLENT on the four criteria (project reporting, linkages, funding, and information & technology transfer, respectively) used for evaluating multistate groups.
    1. A station-report template was developed by the secretary, Bode Adedeji, and improved by all. The template was put to use immediately: three of our members used it to present their station reports, and additional feedback was provided to improve it. It has since been shared with all members for single-station reports.
    2. A suggestion was made that we create a hashtag for our multistate group; Robert Strong took on the task of developing several options, which he presented to the group.
    3. The group was encouraged to use several outlets (social media, website, the Southern communicator consortium, a repository as database (Thanos mentioned Roboflow), etc.) to publicize its activities going forward.
    4. Some critical questions were raised to help focus our efforts, e.g.: What is our role within the Land Grant System?

The first day of activities ended with a visit to the Digital Twin Lab at Texas A&M AgriLife Center, led by Mahendra, our host.

Day three was attended by 9 members in person and 10 members online (see minutes for the list). The focus was on the April 2024 AI Conference Survey Summary, strategy for the next five-year proposal rewrite, theme suggestions for the 2025 AI conference, election of a new secretary, and the venue and date for the 2025 meeting. Details are provided in the minutes. Key highlights include:

  1. A favorable review of the AI conference by the participants. The next AI conference will be hosted March 31 – April 2, 2025, by Mississippi State University.
  2. Six themes were suggested for 2025 AI conference for the host to consider. See minutes.
  3. Ad hoc committees were formed, and co-leads selected for each objective. Members were encouraged to volunteer to serve in at least one Ad hoc committee.
  4. The decision about the venue for the 2025 annual meeting was left to Yuzhen, the chair-elect. Two options are on the table: Oklahoma State University and Michigan State University. Consideration was given to co-hosting our annual meeting with Yuzhen’s other multistate group, W4009.
  5. Dongyi Wang of University of Arkansas was elected as the new secretary of the group.

The meeting was adjourned at 11:50 am.

Accomplishments

OVERVIEW OF THE GROUP’S OUTPUTS AND ACCOMPLISHMENTS FOR THE YEAR UNDER REVIEW

S1090 is a USDA multistate group founded in 2021 to foster collaboration among US institutions (land grant and others) and industry, with a focus on artificial intelligence (AI) and digital agriculture. The group started with several institutions within the US South-Eastern conference and has since expanded to other land grant (and non-land grant) institutions across the nation, with strong industry collaboration. We are in the third year of our current proposal, which laid out three main objectives intended to foster cross-disciplinary and multistate collaborations among our members. This report documents our activities over the last year (2023/2024), primarily collaborative research projects across our members' stations (including the agricultural machinery and production industry). It highlights key ideas and issues tackled, the number of personnel trained, the funding secured as a result of our collaboration, and the meetings we attended where findings of our research were shared nationally and internationally. The group has also organized three AI conferences since inception, the last of which was hosted in April this year in Texas. The report for that conference has been submitted separately.

 

OUTPUTS OF PROJECTS BY OBJECTIVE AND STATION

OBJECTIVE 1A: AI tools for crop (Agrifood) and animal production

Project 1: Monitoring Nitrogen Stress in Maize (Zea mays L.) under Excessive Moisture with UAV Remote Sensing (under review at Sensors (MDPI)) is a paper submitted this year that utilizes AI and remote sensing to improve nitrogen use in maize. Station: LSU; Students Trained: 5 MSc.

Project 2: The AI-Based Methodologies for Major Crops in Louisiana grant, awarded to Dr. Gentimis this year by the Louisiana Board of Regents ($154,000), will explore the use of AI on historical datasets in Louisiana (established 2023–24) and will fund 2 graduate students. Station: LSU; Students Trained: 1 Ph.D., 1 MSc.

Project 3: The Remote Sensing and Artificial Intelligence for Supporting Sugarcane Breeding and Forecasting Sugar Yield grant was established this year in collaboration with John Deere Thibodaux, LA ($285,000); it will utilize AI and remote sensing to improve outcomes in sugarcane. We note here that the collaboration between John Deere and LSU was strengthened during the multistate conference's 2023 visit to John Deere's facility at Thibodaux, which created the initial introductions between Dr. Setiyono and Dr. Gentimis and the team at John Deere. Students Trained: 1 Ph.D. Station(s): LSU, John Deere.

Project 4: The paper Application of TensorFlow model for identification of herbaceous mimosa (Mimosa strigillosa) from digital images was published this year by Dr. Setiyono and Dr. Gentimis; it utilizes convolutional neural networks to detect herbaceous mimosa. Station: LSU; Students Trained: 5 MSc.

Project 5: Developed and validated remote sensing tools tailored to assess yield parameters and detect sugarcane yellow leaf disease, sugarcane stalk rots, and Cercospora net blotch in rice on Louisiana farms. Station(s): LSU, USDA-ARS.

Project 6: Harnessing UAV and machine learning technologies to promote resilient soybean and corn production is a grant awarded to Dr. Setiyono this year from the Louisiana Soybean and Grain Research Promotion Board ($30,000). Station: LSU.

Project 7: AI-based approaches were developed for optical technologies, especially machine vision, applied to automated sweet potato grading and sorting. This is part of an ongoing multistate collaborative research project funded by the USDA Agricultural Marketing Service, involving five universities: MSU, Mississippi State University, NCSU, UIUC, and LSU. One postdoc and one PhD student are being trained on the project.

Project 8: AI models were developed for machine vision-based weed detection and control. To facilitate weed control research efforts, we developed open-source software called OpenWeedGUI, which integrates YOLO models for weed imaging and detection. Moreover, we designed and evaluated an AI-based smart sprayer for precision vegetable weeding. We have received funds from the Michigan Department of Agriculture and Rural Development and are planning to submit a USDA-NIFA proposal to expand and deepen our research this year. One PhD student is being trained on the project. Station: MSU.

Project 9: Harvest decision making is important for blueberry growers to maximize quality and yield. We are developing computer vision and AI tools for automated blueberry counting and harvest maturity estimation, currently under the support of Michigan State University AgBioResearch. We are planning to submit a USDA-NIFA proposal this year. Station: MSU.

Project 10: We are in partnership with MotionGrazer AI (a startup company) to develop 3D computer vision and AI-based methods for automated sow body condition estimation and lameness detection. The effort was funded by an NSF STTR Phase I grant. We are planning to apply for the Phase II grant to scale up the efforts and make the technology available to swine producers. Station: LSU.

Project 11: Poultry meat myopathies, such as white striping and woody breast, downgrade quality and cause significant losses to the U.S. poultry industry. AI models were developed for white striping detection in broiler meat using structured-illumination imaging. This work was funded by a USDA-NIFA-AFRI seed grant, initiated at Mississippi State University, and then transferred to MSU. One MS student has finished, and a second MS student is being trained on the project. We are planning to submit a standard NIFA proposal to push the research forward. Institutions involved: MSU and Mississippi State University.

Project 12: Climate variability has complicated irrigation and disease management in crop production. AI models are being developed to predict crop water stress and plant disease risks. This work was funded by Michigan State University Project GREEEN. The project team is planning to submit a proposal to the USDA NIFA program. Station: MSU.

Project 13: Yiannis Ampatzidis: (1) Development of early pest and disease detection systems for a variety of vegetable crops utilizing AI and remote sensing; (2) development of an innovative optoelectronic nose for detecting adulteration in quince seed oil. Station: UF.

Project 14: Nikolay Bliznyuk: In development, as part of FDACS “BMP Phosphorus (P) recommendations” project: ML-based prediction of yield of select crops (tomato, potato, and green bean) using P and other nutrients. Station: (UF)

Project 15: Won Suk Lee: (1) Strawberry flower and fruit detection toward the development of yield forecasting models (Station: UF); (2) Two spotted spider mite detection using smartphones and a single-camera device (Station: UF, UC Davis, UC ANR); (3) Ground- and UAV-based strawberry plant canopy volume detection (UF); (4) Strawberry plant wetness detection using color imaging and AI.

Project 16: Henry Medeiros: We continue to explore novel computer vision techniques for behavioral studies in animal production facilities. In collaboration with colleagues at the ABE department, we received a seed grant from the University of Florida Institute of Food and Agricultural Sciences Launching Innovative Faculty Teams in AI (LIFT/AI) initiative (PI: D. Hofstetter) to develop computer vision models to analyze the behavior of turkeys. Preliminary results obtained using this grant were presented at the American Society of Agricultural and Biological Engineers Annual International Meeting. Station: UF

Project 17: The work conducted by Tennessee focused on the development of AI technology for the improvement of livestock and poultry health and welfare, including the development of an automated vision system for broiler welfare behavior assessment. The outcome of this project is an automated tool that gives broiler farmers better insight into animal performance and provides timely information to improve their farm management practices for better animal welfare and higher production. Station: UTK.

Project 18: Developed AI algorithms to analyze data for precision nutrient monitoring and management in hydroponic production of leafy greens, including machine learning-based algorithms for predicting calcium deficiency in the hydroponic production of lettuce. We presented research outcomes at the 2024 American Society of Agricultural and Biological Engineers Annual International Meeting in Anaheim, California. Station: UCDavis.

Project 19: Developed the model for forecasting the biomass yield of lettuce production in a hydroponic system under artificial lighting in an indoor vertical farming setup. Station: UCDavis.

Project 20: Developed a fault detection and diagnosis tool for EC and pH sensors for precision nutrient and water management in hydroponic production. This tool will be critical for identifying faults and anomalies in sensor (EC and pH) readings and ensuring precise operation of hydroponic production. We presented our research findings at the 2023 American Society of Agricultural and Biological Engineers Annual International Meeting in Omaha, Nebraska. Station: UCDavis.
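The fault-detection idea above can be illustrated with a minimal sketch (synthetic readings and a made-up threshold, not the UCDavis tool itself): flag EC or pH samples that deviate strongly from a rolling median of recent readings.

```python
# Illustrative sketch only (synthetic readings, made-up threshold), not
# the project's tool: flag EC/pH samples that deviate strongly from the
# rolling median of the preceding window, scaled by the median absolute
# deviation (MAD) -- a common baseline for sensor fault detection.
from statistics import median

def flag_faults(readings, window=5, k=4.0):
    """Return indices of samples far from the rolling median."""
    faults = []
    for i in range(window, len(readings)):
        ref = readings[i - window:i]
        m = median(ref)
        mad = median(abs(x - m) for x in ref) or 1e-6  # guard zero MAD
        if abs(readings[i] - m) / mad > k:
            faults.append(i)
    return faults

# A mostly steady EC trace (~1.8 mS/cm) with one spurious spike.
ec = [1.80, 1.81, 1.79, 1.80, 1.82, 1.81, 3.50, 1.80]
print(flag_faults(ec))  # the spike at index 6 is flagged
```

A deployed diagnosis tool would of course add sensor-specific thresholds and distinguish spikes from drift and stuck-at faults; this only shows the anomaly-flagging core.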

Project 21: Color is a critical parameter in meat quality evaluation. Objective measurement of meat color saves money, can be easily automated, and increases returns for stakeholders. UArk and UK PIs are working on developing deep learning models for meat color prediction. We submitted a USDA proposal that was not funded and are planning to resubmit during the current cycle. Stations: UK and UArk.

Project 22: Cross-contamination of grain cultivars is a major issue in postharvest grain processing. Current detection methods are ineffective, lead to waste, and cannot quantify contamination. Nondestructive methods for detecting and quantifying cross-contamination of proso millet cultivars were developed using hyperspectral imaging and machine learning. Station: UK.
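As a rough illustration of the quantification idea (entirely synthetic "spectra"; a simple nearest-centroid rule stands in for the study's hyperspectral ML models), contamination of a lot can be estimated as the fraction of samples whose spectra classify as the contaminant cultivar:

```python
# Hypothetical sketch with fabricated data, not the study's models:
# classify each sample's spectrum against per-cultivar mean spectra,
# then report the contaminant fraction in a mixed lot.
import numpy as np

rng = np.random.default_rng(0)
bands = 50
centroid_a = np.linspace(0.2, 0.8, bands)   # cultivar A mean spectrum
centroid_b = centroid_a + 0.15              # cultivar B mean spectrum

def simulate(centroid, n):
    """Draw n noisy spectra around a cultivar's mean spectrum."""
    return centroid + rng.normal(0, 0.02, size=(n, bands))

def classify(spectra):
    """Label each spectrum by its nearest cultivar centroid (1 = B)."""
    da = np.linalg.norm(spectra - centroid_a, axis=1)
    db = np.linalg.norm(spectra - centroid_b, axis=1)
    return (db < da).astype(int)

# A lot that is 90% cultivar A with 10% cultivar B contamination:
lot = np.vstack([simulate(centroid_a, 90), simulate(centroid_b, 10)])
contamination = classify(lot).mean()
print(f"estimated contamination: {contamination:.0%}")
```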

Project 23: Allergen detection is a critical step in food manufacturing, especially where the risk of contamination is high, and noninvasive methods are more effective. We are working on developing a multispectral model based on RGB data that can be deployed in a mobile app for everyday use by consumers and the food industry for allergen detection in foods. Station: UK.

Project 24: Evaluation of Traditional and AI-Driven On-farm Trial Data; In-season Diagnosis of Nitrogen and Water Status for Corn Using UAV Multispectral and Thermal Remote Sensing. Station: UK.

Project 25: Using agent-based modeling for precision swine nutrition. Station: TAMU.

Project 26: Digital Twin System for In-season Crop Growth and Yield Forecasting: A digital twin (DT) framework was developed using the phenotypic features of plant growth and development as a virtual replica of plants; data generated from this framework are used to predict crop growth, forecast management, and estimate yield during the season. Station: TAMU.

Project 27: Crop Yield Modeling and Drought Monitoring: As a joint study between Prairie View A&M University and the Sam Houston State University station, we conducted a series of UAV data collections over a sorghum field located at Prairie View A&M University, including high-resolution RGB, thermal, and multispectral data. This collaborative work aims to develop tools for crop yield modeling and drought monitoring and is an outcome of the AI for Ag conference 2024 hosted at Texas A&M. Stations: TAMU, Sam Houston State Uni., Prairie View A&M University.

Project 28: K-State is working on an idea to verify and validate robotics and automation in food production. This concept has caught on in the last 10 years but has considerable room for study; these systems must be validated to prove they can accomplish the necessary work for growing food, fiber, and fuel.

OBJECTIVE 1B: AI tools for autonomous system perception, localization, manipulation, and planning for agroecosystems.

Project 1: Developed a dual-laser active scanning camera to generate high-resolution RGB-D imagery and an AI-based robotic solution for advanced food manufacturing. Station: UArk.

Project 2: Developed novel neural network models (wide-deep learning, SSNet, SCNet) for hyperspectral imaging-based non-invasive bioproduct analysis. Station: UArk.

Project 3: Developed a novel illumination-robust machine learning model to predict human sensory grading based on food appearance under different illumination conditions. The project fostered collaboration with UK on one USDA grant submission. Stations: UArk and UK.

Project 4: Yiannis Ampatzidis: (1) Development of an AI-enhanced smart sprayer for precision weed management in specialty crops (UF, Carnegie Mellon University); (2) development of an AI-enabled automated needle-based trunk injection system for HLB-affected citrus trees (UF, UC Davis); (3) development of an AI-enabled smart tree crop sprayer using sensor fusion.

Project 5: Dana Choi: Our project established a comprehensive strategy to address the challenges of agricultural robotics and data collection in strawberry farming by utilizing advanced procedural modeling to generate synthetic plant models, thereby enhancing the quality of datasets beyond the capabilities of traditional methods. We leveraged NVIDIA Omniverse for realistic simulations and integrated these with the Robot Operating System (ROS) for sophisticated data management, employing deep learning models for precise strawberry detection and classification. Additionally, ISAAC SIM’s automated labeling system significantly improved the efficiency and accuracy of our training processes. The culmination of these innovations is a robust fruit detection system that not only elevates yield prediction accuracy but also reduces the need for intensive data labeling and curation, thereby cutting costs and streamlining the development of robotics applications. Station: UF.

Project 6: Dana Choi: We initiated the "Advancing AI Competency for Agricultural Extension" project with the objective of enhancing AI literacy to promote innovation and productivity in agriculture. This multifaceted program encompasses a variety of key initiatives. Through educational outreach, we captivated students at the Ag AI Youth Expo, encouraging them to pursue future careers in agricultural technology. In support of growers, we introduced AI solutions to Florida specialty crop producers, enhancing their understanding of AI systems. As part of our commitment to professional development, we demonstrated future AI applications to IFAS Certified Crop Advisers. Additionally, we conducted the comprehensive AI workshop titled “AI Essentials for Extension Professionals” for extension faculty members, cultivating an environment conducive to continuous learning and innovation within the community. Station: UF.

Project 7: Henry Medeiros: (1) We further developed a multiple object tracking algorithm that detects and tracks plants observed by mobile robotic platforms equipped with video cameras. We also extended the evaluation of the algorithm to additional publicly available datasets and demonstrated its state-of-the-art performance. A manuscript submitted to Computers and Electronics in Agriculture received positive initial comments from the reviewers, and the revised submission is currently under review. Station: UF.

Project 8: We developed a novel computer vision model that simultaneously performs object detection and association. Rather than using conventional detection association mechanisms based on linear cost assignment methods, our model directly learns the correspondences between detections in a pair of images with partially overlapping fields of view. Given only the bounding boxes in two images, our model learns the associations among detections between two subsequent frames, thus providing all the information needed to track object identities over the video sequence. We evaluated our model on publicly available datasets and obtained promising preliminary results, which were presented at the American Society of Agricultural and Biological Engineers Annual International Meeting. Station: UF.
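For contrast with the learned association described above, here is a minimal sketch of the conventional baseline it replaces: IoU similarity between boxes plus a greedy linear assignment (the boxes and threshold are made up, and the simple greedy matcher stands in for a full Hungarian solver).

```python
# Sketch of the conventional detection-association baseline that the
# learned model replaces: IoU scores plus a greedy linear assignment.
# Boxes here are made up; [x1, y1, x2, y2] format is assumed.
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def greedy_associate(prev, curr, thresh=0.3):
    """Match previous-frame boxes to current-frame boxes by best IoU."""
    pairs, used_p, used_c = [], set(), set()
    scores = sorted(((iou(p, c), i, j)
                     for i, p in enumerate(prev)
                     for j, c in enumerate(curr)), reverse=True)
    for s, i, j in scores:
        if s >= thresh and i not in used_p and j not in used_c:
            pairs.append((i, j))
            used_p.add(i)
            used_c.add(j)
    return pairs

prev = [[0, 0, 10, 10], [20, 20, 30, 30]]
curr = [[21, 19, 31, 29], [1, 1, 11, 11]]  # same objects, slightly moved
print(greedy_associate(prev, curr))        # [(1, 0), (0, 1)]
```

The paper's model learns such correspondences directly from image pairs rather than computing them from geometric overlap alone.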

Project 9: AI algorithms are being incorporated into multiple robotic systems.  First, AI-based vision is being developed to control a high-speed robotic arm that is capable of manipulating multiple objects like various fruits simultaneously while consuming less energy than conventional robotic arms.  AI-based vision is also being developed to enable a versatile robotic end-effector to manipulate fruit in situ (through rotating and bending), pick it using various methods (pulling, bending, twisting, or combinations), and continuously transfer it to the back of the end-effector.  As a specific example, AI-based vision is being developed to measure cotton boll orientation, which affects the performance of robotic cotton harvesting.  AI-based vision is also being developed to enable a ground-based robot to detect and collect plastic bags, a significant source of cotton fiber contamination, in cotton fields.  In collaboration with USDA-ARS, we developed an AI-based model that estimates moisture content and bulk density in grains, and a similar ongoing project with USDA-ARS focuses on developing an AI-based model to detect diseases from images of crop leaves. The model is being designed to identify at least five different diseases. Station: MS.

Project 10: We have developed nutrient dosing algorithms for the precision supplying of individual nutrient components (Macro Nutrients) for hydroponic production instead of EC based conventional dosing. This research aims to reduce the human involvement in monitoring and dosing the precise nutrient components and ultimately improve the nutrient and water use efficiency of hydroponic production.  Also, the autonomous nutrient dosing tool would help to improve the yield as the potential for nutrient deficiency and toxicity would be minimized. Stations: UCDavis and Delft University of Tech.
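A minimal arithmetic sketch of the per-nutrient dosing idea (all concentrations are made up, and dilution by the added stock volume is ignored; this is not the UCDavis algorithm): dose each macronutrient from its own measured deficit rather than from bulk EC.

```python
# Hypothetical sketch only: compute the stock-solution volume needed to
# raise each macronutrient from its measured concentration to a target,
# instead of dosing on bulk EC. Numbers are fabricated; dilution by the
# added stock volume is ignored for simplicity.
def dose_volumes(measured, target, stock, tank_l):
    """Litres of each stock solution to add:
    deficit (mg/L) * tank volume (L) / stock concentration (mg/L)."""
    doses = {}
    for nutrient in target:
        deficit = max(0.0, target[nutrient] - measured[nutrient])
        doses[nutrient] = deficit * tank_l / stock[nutrient]
    return doses

measured = {"N": 120.0, "P": 28.0, "K": 210.0}          # mg/L in tank
target   = {"N": 150.0, "P": 50.0, "K": 210.0}          # mg/L setpoints
stock    = {"N": 60000.0, "P": 30000.0, "K": 50000.0}   # mg/L in stocks
print(dose_volumes(measured, target, stock, tank_l=500))
```

A real controller would additionally close the loop on sensor feedback and account for nutrient interactions; this shows only the core deficit calculation.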

Project 11: Design and Deployment of a User-Centric Customizable Digital Twin System for Apple Production, USDA NIFA. Stations: MSU (Uyeh & Morris) and UK (Arnoldussen). Status: not funded.

Project 12: Collaborative Research: CPS: Medium: Growing Fruit Trees in Dynamic Digital Environments, NSF CPS. Stations: MSU (Uyeh and Morris), Oregon State University (Davidson and Grimm), UF (Medeiros and Lee), Texas A&M University (Zahid), UK (Arnoldussen, Adedeji, Rodriguez). Status: pending.

Project 13: Digital Orchards for Guiding Micro-Climate Interventions. Funding agency: USDA FFAR. Stations: MSU (Uyeh, Morris, and Perkins), UK (Arnoldussen), Tennessee State University (Mohaei). Status: pending.

Project 14: Using computer vision to predict Bovine Respiratory Disease. Station: TAMU.

Project 15: Using computer vision for precision feeding in beef production systems. Station: TAMU.

Project 16: The K-State station plans to develop computer vision and other digital systems (in collaboration with other stations) that will allow an ag machine to collect and analyze the quality of work performed, or machine performance, based on an assessment of the area behind the machine. Work continued on tools for monitoring planter performance in field conditions; this will provide data for building a truly AI-driven autonomous planting system, and the information generated will serve as foundational input for AI decision making. Work also continued on a sensor-based system to sense soil compaction in a field prior to tillage, which will better integrate soil condition information into precision agriculture systems. Finally, work began on a machine vision test stand to study tillage tool field performance; the data collected will provide foundational knowledge for AI-driven autonomous machines.

 

OBJECTIVE 1C: Natural resources scouting and monitoring.

Project 1: AI-driven data fusion of in situ handheld spectral sensing devices and geospatial environmental covariates generated via an Earth observation data cube for soil property estimation at a continental scale (USA), utilizing the USDA soil spectral library. Stations: UF and U-Sao Paulo.

Project 2: Self-supervised and contrastive learning methods for analyzing multimodal data to estimate soil organic carbon (SOC) stock, addressing data availability challenges in the USA and Europe. Stations: UF and U-Sao Paulo.

Project 3: Development of a dual-branch neural network architecture to analyze and interpret multimodal spaceborne data, also considering the temporal evolution of bare soil reflectance properties. Stations: UF and U-Sao Paulo.

Project 4: In collaboration with USDA-NRCS, 422 soil samples were collected from 33 soil series in Texas at three depths (0-5 cm, 5-15 cm, and 15-30 cm) during summer 2023. Soil data including moisture content, bulk density (0-5 cm), field saturated hydraulic conductivity (0-5 cm), GPS location, and soil spectra from Mississippi State University’s Advanced Plant and Soil Sensing lab were obtained. Total carbon, total nitrogen, available phosphorus, potassium, magnesium, calcium, hydrogen, zinc, manganese, organic matter, soil pH (water and buffer), cation exchange capacity, and percent base saturation of cation elements were analyzed by the Waters Agricultural Lab in Mississippi. Soil texture, lab saturated hydraulic conductivity (0-5 cm), and soil water retention curve (0-5 cm) were analyzed at USDA-ARS, Starkville, Mississippi. Samples obtained from Mississippi and Texas so far exceed 800. Once laboratory analyses are complete, AI models will be used to develop soil property estimations from soil spectra. Station: MS and USDA-NRCS.

Project 5: Sensitivity Evaluation of Visible Near-Infrared Spectroscopy Data to Variable-Rate Soil Moisture for AI-Driven Prediction of Soil Properties. Station: UK.

Project 6: Improvement of Soil Spectral Prediction for Plant-Available Nutrients Using Machine and Deep Learning algorithms. Station: UK.

Project 7: Using satellite Imagery and machine learning to predict forage quality and quantity. Station: TAMU.

Project 8: Completed a project with a constituent company on a machine vision-based device that monitors large square bales after they are dropped in the field behind the baler; the device warns the operator if bales are misshapen or have broken twines outside a given set of parameters. Currently working to develop methods of predicting field finish behind tillage tools based on tillage tool performance; this will be a machine vision-based system supported by other information such as draft and machine vibrations. Station: K-State.

 

OBJECTIVE 1D: Socioeconomic sustainability

Project 1: This project addresses the issues related to crop monitoring for yield estimation, quality assessment, disease detection, irrigation scheduling, and prediction of nutrient concentrations. Station: Clemson.

Project 2: The major activities included the use of aerial and ground-based sensing systems for assessing aboveground biomass of forage crops, and irrigation decision support system development for cotton. We continued the research activities in the above projects, collected preliminary data related to moisture and nutrient content assessment using an NIR spectrometer, and tested artificial intelligence algorithms for predicting aboveground forage yields. Deep neural networks were applied for cotton yield estimation and soil moisture prediction. Station: Clemson.

Project 3: Evaluation of Agronomic, Economic, and Environmental Benefits of Remote Sensing and AI-Based Calibration Strip Technology for In-season Nitrogen Application for Corn. Stations: UMN, UK.

Project 4: Assessment of Soil Carbon Sequestration Capability by Depths and Crops Using Econometric Techniques. Station: UK.

Project 5: Developing a system dynamics model for climate smart beef production systems. Station: TAMU.

OBJECTIVE 1E: Phenotyping and genotyping

Project 1: The paper Grapevine Rootstock and Scion Genotypes' Symbiosis with Soil Microbiome: A Machine Learning Revelation for Climate-Resilient Viticulture, under review at Microbiome, utilizes ML techniques on grapevines. Students Trained: 1 Ph.D. Stations: LSU and UK.

Project 2: Yiannis Ampatzidis: Development of AI-enabled high throughput phenotyping technologies to enhance citrus, sugarcane, and wheat breeding programs. Station: UF

Project 3: Henry Medeiros: (1) We engaged with several members of the S1090 multi-state project and identified multiple collaboration opportunities. Based on these discussions, we developed a joint proposal for the National Science Foundation Cyber Physical Systems program. The proposal was submitted in May of 2024 and is currently under review by the agency.  Discussions regarding additional opportunities are currently underway. Station: UF

Project 4: As part of our broader research efforts on phenotyping techniques, in partnership with colleagues from several departments, we secured a seed grant from the University of Florida Institute of Food and Agricultural Sciences Launching Innovative Faculty Teams in AI (LIFT/AI) initiative focused on AI-driven Phenomics to Advance Plant Breeding in Florida. Station: UF.

Project 5: Paper under review (Microbiome, IF: 13.8) with Dr. Thanos Gentimis (LSU). In this paper we described the cultivated grapevine PanMicrobiome and the development of ML algorithms to predict provenance and planted grapevine genotype using microbiome data. Our research offers a novel perspective on the predictive power of genotype selection on the microbial assemblage in vineyard soils. By employing a suite of machine learning models, we have dissected the complex interactions between grapevine cultivars and their root microbiomes across continents, countries, and cultivars. The robustness of our methodology, which integrates multiple machine learning algorithms and evaluates them through the lens of F1-scores to account for class imbalance, sets a new standard in the field. The crux of our findings demonstrates that the successful prediction of rootstock and scion combinations from soil microbiomes, irrespective of their provenance, reaffirms that the genotypes of both plant parts are determinants in shaping the microbiome. This underlines the potential for targeted breeding programs to consider not only the direct traits of the grapevines but also their indirect influence on the surrounding microbial environment, which is pivotal for plant health and productivity. Stations: LSU and UK.
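A toy example (fabricated labels, not the study's data) of why evaluation uses F1-scores under class imbalance: plain accuracy can look high while a rare class is mostly missed.

```python
# Toy illustration with made-up labels: accuracy looks high even when
# the rare genotype class is mostly missed, while macro F1 exposes it.
def f1_per_class(y_true, y_pred, cls):
    """Harmonic mean of precision and recall for one class."""
    tp = sum(t == p == cls for t, p in zip(y_true, y_pred))
    fp = sum(p == cls and t != cls for t, p in zip(y_true, y_pred))
    fn = sum(t == cls and p != cls for t, p in zip(y_true, y_pred))
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

# 9 samples of a common genotype "A", 3 of a rare genotype "B"; the
# classifier predicts "A" almost always.
y_true = ["A"] * 9 + ["B"] * 3
y_pred = ["A"] * 9 + ["A", "A", "B"]

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
macro_f1 = (f1_per_class(y_true, y_pred, "A")
            + f1_per_class(y_true, y_pred, "B")) / 2
print(accuracy, macro_f1)  # accuracy ~0.83, but macro F1 only 0.70
```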

Project 6: Funded project: EpicMare, using ML for the development of gestational age epigenetic clocks to predict delivery date and pregnancy complications in pregnant mares. The project has received funding from the University of Kentucky Igniting Research Collaborations program ($50,000, 2024) and the Martin Gatton Foundation ($150,000, 2024–2027). Additionally, a NIFA-AFRI application was submitted by UK in collaboration with Dr. Shavahn Loux (LSU) (requested $750,000, 2025–2029). Stations: UK and LSU.

Project 7: Phenotyping for Biomass Quantity and Quality: a collaboration initiated at the University of Tennessee Knoxville and Oak Ridge National Laboratory and maintained after the Sam Houston State University station was established. This research assesses biomass quantity (crop yield) and phenotypic traits associated with biomass quality, i.e., leaf nitrogen concentration, chlorophyll content, disease resistance, etc. Using AI approaches, we developed high-throughput models for detecting these phenotypic traits in switchgrass, a dominant biofuel crop. Stations: TAMU, Sam Houston State Uni., UTK.

 

OBJECTIVE 2A: Data curation, management, accessibility, security, and ethics.

Project 1: An annotated dataset of soybean images was generated for the Master’s thesis of Ms. Bhawana Acharya, titled Digital Agriculture Applications in Maize Nutrient Management and Soybean Crop Stands Assessment. This dataset is publicly available on the online platform “Roboflow”. Dr. Gentimis was introduced to “Roboflow” during a talk at the 2022 conference in Florida, and he has proposed the creation of an online repository for LSU that could also be extended to the whole Multistate System. The corresponding paper is under preparation. Station: LSU

Project 2: The lack of public, large-scale weed datasets is considered a bottleneck to developing robust machine vision-based weeding systems. To alleviate this, we have been developing public weed datasets. In 2023-2024, we acquired and published two open-source, multi-class weed detection datasets and associated benchmarks of AI models for weed detection. We also experimented with generative modeling through stable diffusion to augment multi-class weed image data for enhanced weed detection. We are now seeking federal funding to sustain the effort to develop big data-powered robust weed recognition systems that enable precision vegetable weeding. Station: MSU.

Project 3: The K-State group is expanding the capabilities of the previously developed CatAPP and GreaterAPP devices, which allow quantification of the force and energy requirements of modern tillage tool components. These devices are intended to provide improved data for designers, planners, producers, and policy makers. Much of the data and guidance currently available to these individuals was generated more than 40 years ago and is not relevant to modern field equipment. Station: K-State.

Project 4: Working with agronomists and entomologists to develop methods for detecting and locating aphids and other pests in production fields. These systems use robotic and autonomous platforms to perform historically labor-intensive tasks more thoroughly and accurately. Station: K-State.

OBJECTIVE 2B: Standardization and testbed development – data standardization and software development.

Project 1: The goal of the project is to use MIR spectral data and soil properties from Kellogg Soil Survey Laboratory to build and provide a user-friendly, web-based portal that would automate the modeling process and estimate soil properties and soil health indicators from MIR spectra. The web-based portal will enable MIR-equipped NRCS-SPSD field offices, researchers, and land managers to upload MIR spectra and obtain estimates of various soil properties across the US. It will reduce the inconsistency and errors in soil property estimation by field scientists and differences in lab instruments and analytic methods. The estimated soil properties serve a variety of end-user-determined applications, including soil monitoring/surveillance, soil health assessment, soil classification, and dynamic soil survey. Stations: Oregon State University/Ag Experiment Station AND University of Wisconsin-Madison.

Project 2: The paper “Soybean yield prediction using machine learning algorithms under a cover crop management system” was published this year; it extends data for use in ML through statistical means. Station: LSU.

Project 3: Develop an AI-driven online spectral analysis tool for global use, based on novel local regression techniques and soil spectral data from >50 countries. Stations: University of São Paulo and UF.

Project 4: Establishment of a testbed for developing precision agriculture tools: producers’ fields growing cotton and sorghum were identified, and high spatio-temporal resolution data were collected using remote sensing tools such as UAS and satellite imagery. The obtained data have been processed to extract phenotypic features and standardized to develop AI/ML tools for management decisions and yield prediction. Station: TAMU.

Project 5: Web-based data management system for UAS-based phenotyping: A web-based platform integrated with database and processing pipeline was developed for Unmanned Aerial Systems (UAS) data management, processing, analysis, and communication. Stations: TAMU, OK.

OBJECTIVE 3: AI adoption (technology transfer) and workforce development

Project 1: Dr. Gentimis and Dr. Bampasidou secured a SERME grant ($46,176) titled “Digital Agriculture: Training Louisiana Growers to develop a Data Management Plan.” This grant will support three workshops for 20-50 people each, giving a gentle introduction to data management and precision and digital agriculture. Station: LSU.

Project 2: The paper “Overcoming ‘Digital Divides’: Leveraging higher education to develop next generation digital agriculture professionals” was published this year by Dr. Bampasidou, Dr. Gentimis, and Dr. Mandalika, discussing training opportunities in digital agriculture for various stakeholders as well as the ethical implications of the field. Station: LSU.

Project 3: Dr. Gentimis is teaching a course on Digital Agriculture with 16 master’s and PhD students every spring semester. Station: LSU.

Project 4: Dr. Gentimis gave a workshop on the use of XGBoost (an advanced ML technique) at the annual conference organized by our project, attracting 60+ students and various stakeholders. Station: LSU.

Project 5: Dr. Gentimis gave a workshop in Vidalia, LA, attended by 20+ local stakeholders, on the uses of data and digital agriculture in soybean production. Station: LSU.

Project 6: Technical workshop with the University of São Paulo on AI spectral analysis within the framework of the GFSI project. Station: UF.

Project 7: Development of Online Platform for Automated Fertilizer Prescription Calculation Driven by AI and Remote Sensing-Based Calibration Strip Technology: only the first prototype has been developed. The AI algorithm needs to be added in the future for advanced calculation of fertilizer recommendation rates. Station: UK.

Project 8: Development of Cheap and Rapid Soil Testing Service Using Spectroscopy. Station: UK.

Project 9: Development of Campus-wide AI Agrifood Institute. Station: UK.

Project 10: Kentucky Fruit and Vegetable Growers Meeting: 30-minute informational session on using digital remote sensing platforms for precision apple crop load management. 48 attendees. Station: UK.

Project 11: PickTN (Tennessee fruit growers): 1-hour informational session on using digital remote sensing platforms for precision apple crop load management. 55 attendees. Station: UK.

Project 12: Kentucky State Hort Society Spring Field Day. Station: UK.

Project 13: 30-minute training on the use of digital platforms for weather and crop load management. 37 attendees. Station: UK.

Project 14: Conceptualized and organized the first joint symposium between AgriLife-COALS and the Texas A&M Institute of Data Science (Conference, College Station, Texas). Station: TAMU.

Project 15: Interdisciplinary training program in digital agriculture tools development: A Research and Extension Education for Undergraduate (REEU) students was conducted to train students in digital agriculture technologies. Station: TAMU, LSU, UK, OK-State.

Project 16: Community-based course for GEOG 4468 Remote Sensing at Sam Houston State University: the course features remote sensing and drone technology and trains the students (10 undergraduates) in data collection over an agricultural field. Station: TAMU.

Project 17: Invitation to fly drones: inspired by the presentation at the AI for Ag conference, Dr. Yaping Xu was invited to fly drones over the North Carolina Arboretum at Asheville to collect video footage over the arboretum. Station: TAMU.

Project 18: Dr. Gentimis helped secure funding for the workshop ($50,000) through a USDA competitive grant. Stations: All Multistate.

 

ACTIVITIES

  • S1090 organized its 3rd AI in Agriculture and Natural Resources Conference in College Station, Texas, April 15-17, 2024.
  • 1st joint symposium between AgriLife-COALS and the Texas A&M Institute of Data Science (Conference, College Station, Texas).

 

MILESTONES AND IMPACT SUMMARY

The metrics of impact in the reporting year include a total of 90 peer-reviewed publications, 80 conference presentations (podium/posters), and three extension publications. A total of 38 undergraduate research assistants, 73 M.Sc. students, 74 Ph.D. students, 24 post-docs, 15 other researchers (visiting scientists, research assistants), and 170 farmers/growers/aggregators were trained. A total of 525 conference attendees were hosted at two AI-related conferences co-organized by our members. This makes a total of 1,092 people impacted by S1090 members in the reporting year. A sum of $819,000 in funded grant money was reported from collaborative projects by our members. A total of 91 projects at various stages of execution were reported across 13 stations; 26 of these are collaborative projects.

Impacts

  1. S1090 started in 2021. The group has organized three successful AI conferences at three locations across the country, with participants drawn from both private and public institutions and involvement of seasoned scientists and student scholars. During each conference, a hands-on session on big data analytics was also held. In the reporting year the group published 90 peer-reviewed articles, presented 80 conference papers, and trained a total of 147 graduate students, 24 post-docs, and 15 researchers on various subjects related to AI applications in agro-ecosystems.

Publications

Refereed Journals/Book Chapters

Oregon State University

  1. Zhang, Y., Hartemink, A.E., Weerasekara, M., 2023. An automated, web-based soil property and soil health estimation tool using mid-infrared (MIR) spectroscopy and machine learning. National Cooperative Soil Survey Meeting, July 9–13, Bismarck, ND, USA.

 

South Dakota State University, SDSU

  1. Antora, S.S., Chang, Y.K., Nguyen-Quang, T., & Heung, B. (2023). Development and Assessment of a Field-Programmable Gate Array (FPGA)-Based Image Processing (FIP) System for Agricultural Field Monitoring Applications. AgriEngineering, 5(2), 886-904.
  2. Shin, J., Mahmud, M., Rehman, T. U., Ravichandran, P., Heung, B., & Chang, Y.K. † (2023). Trends and Prospect of Machine Vision Technology for Stresses and Diseases Detection in Precision Agriculture. AgriEngineering, 5(1), 20-39.
  Conference papers:
  4. Alahe, M.A., Kemeshi, J., & Chang, Y. (2024) Comparison Between Jetson Nano and Jetson Xavier NX for Ag Data Security. In 2024 ASABE Annual International Meeting. Oral presentation with conference paper (doi: 10.13031/aim.202400811).
  5. Kemeshi, J., Alahe, M.A., Chang, Y., & Yadav, P.K. (2024) Effect of Camera Shutter Mechanism on the Accuracy of a Custom YOLOv8 Model for Pattern Recognition in Motion on a UGV. In 2024 ASABE Annual International Meeting. Oral presentation with conference paper (doi: 10.13031/aim.202400812).
  6. Alahe, M.A., Kemeshi, J., Chang, Y., & Menendez, H. (2024) Sustainable Livestock Management and Pasture Utilization using Automotive Electric Fencing System. In 2024 ASABE Annual International Meeting. Oral presentation with conference paper (doi: 10.13031/aim.202400820).
  7. Kemeshi, J., Gummi, S.R., & Chang, Y. (2024) R2B2 Project: Design and Construction of a Low-cost and Efficient Autonomous UGV For Row Crop Monitoring. 16th ICPA. Oral presentation with conference paper (#10111).
  8. Gummi, S.R., Kemeshi, J., & Chang, Y. (2024) Botanix Explorer (BX1): Precision plant phenotyping robot detecting Stomatal openings for Precision Irrigation and Drought Tolerance experiments. 16th ICPA. Oral presentation with conference paper (#10202).
  9. Kemeshi, J., Chang, Y., Yadav, P.K., & Alahe, M.A. (2024) Comparing Global Shutter and Rolling Shutter Cameras for Image Data Collection in Motion on a UGV. 16th ICPA. Oral presentation with conference paper (#10223).
  10. Alahe, M.A., Kemeshi, J., Chang, Y., Won, K., Yang, X., & Sher, M. (2024) Securing Agricultural Data with Encryption Algorithms on Embedded GPU based Edge Computing Devices. 16th ICPA. Oral presentation with conference paper (#10244).
  11. Alahe, M.A., Kemeshi, J., Gummi, S.R., Chang, Y., & Menendez, H. (2024) Design of an Automatic Travelling Electric Fence System for Sustainable Grazing Management. 16th ICPA. Oral presentation with conference paper (#10246).
  12. Gummi, S.R., Alahe, M.A., Kemeshi, J., & Chang, Y. (2024) Securing Agricultural Imaging Data in Smart Agriculture: A Blockchain-Based Approach to Mitigate Cybersecurity Threats and Future Innovations. 16th ICPA. Oral presentation with conference paper (#10247).
  13. Gummi, S.R., Alahe, M.A., Pack, C., & Chang, Y. (2024) A Swarm Robotics Navigation Simulator for Phenotyping Soybean Plants using Voronoi-Ant Colony Optimization. 16th ICPA. Oral presentation with conference paper (#10282).
  15. Brennan, J., Parsons, I., Harrison, M. & Menendez, H. Development of an Application Programming Interface (API) to automate downloading and processing of precision livestock data. ASAS, Calgary Alberta (2024).
  16. Parsons, Ira Lloyd, Brandi B Karisch, Amanda E Stone, Stephen L Webb, Durham A Norman, and Garrett M Street. Machine Learning Methods and Visual Observations to Categorize Behavior of Grazing Cattle Using Accelerometer Signals, 2024.
  17. Wang T., H. Jin, H. Sieverding, S. Kumar, Y. Miao, O. Obembe, X. Rao, A. Nafchi, D. Redfearn, S. Cheye. 2023. “Understanding farmer views of precision agriculture profitability in the US Midwest.” Ecological Economics, 213, 107950.
  18. Wang T., H. Jin, and S. Heidi. 2023. Factors affecting farmer perceived challenges towards precision agriculture. Precision Agriculture. https://doi.org/10.1007/s11119-023-10048-2.
  19. Adereti, D. T., Gardezi, M., Wang, T., McMaine, J. 2023. Understanding farmers’ engagement and barrier to machine learning-based intelligent agricultural decision support systems. Agronomy Journal. https://doi.org/10.1002/agj2.21358.

 LSU

  1. Setiyono, T., Gentimis, T., Rontani, F., Duron, D., Bortolon, G., Adhikari, R., ... & Pitman, W. D. (2024). Application of TensorFlow model for identification of herbaceous mimosa (Mimosa strigillosa) from digital images. Smart Agricultural Technology, 7, 100400.
  2. Santos, L. B., Gentry, D., Tryforos, A., Fultz, L., Beasley, J., & Gentimis, T. (2024). Soybean yield prediction using machine learning algorithms under a cover crop management system. Smart Agricultural Technology, 100442.
  3. Bampasidou, M., Goldgaber, D., Gentimis, T., & Mandalika, A. (2024). Overcoming ‘Digital Divides’: Leveraging higher education to develop next generation digital agriculture professionals. Computers and Electronics in Agriculture, 224, 109181.

Clemson

  1. Koc, A.B., Erwin, C., Aguerre, M., Chastain, J. 2024. Estimating Tall Fescue and Alfalfa Forage Biomass Using an Unmanned Ground Vehicle. 15th International Congress on Agricultural Mechanization and Energy in Agriculture Cham 2024. Lecture Notes in Civil Engineering, vol 458. Springer, Cham. https://doi.org/10.1007/978-3-031-51579-8_32. Publisher: Springer Nature Switzerland Pages: 357-372.
  2. Singh, J., Koc, A.B., Aguerre, M.J., Chastain, J.P., and Shaik, S. 2024. Estimating Bermudagrass Aboveground Biomass Using Stereovision and Vegetation Coverage. Remote Sensing, 16, 2646. https://doi.org/10.3390/rs16142646 .
  3. Koc, A.B., Erwin, C., Aguerre, M., Chastain, J. 2023. Estimating Tall Fescue and Alfalfa Forage Biomass Using an Unmanned Ground Vehicle. 15th International Congress of Agricultural Mechanization and Energy in Agriculture (AnkAgEng'23 - Antalya, Turkiye, Oct. 29 - Nov. 2, 2023).
  4. Koc, A. B., Singh, J., Aguerre, M. J. (2023). Estimating forage biomass using unmanned ground and aerial vehicles. In Proceedings of International Grassland Congress 2023. Pp. 1449-1452. https://doi.org/10.52202/071171-0352 .

 

MSU

  1. Ahmed, T., Wijewardane, N., Lu, Y., Jones, D., Kudenov, M., Williams, C., Villordon, A., Kamruzzaman, M., 2024. Advancing sweetpotato quality assessment with hyperspectral imaging and explainable artificial intelligence. Computers and Electronics in Agriculture 220, 108855.
  2. Xu, J., Lu, Y., 2024. Prototyping and evaluation of a novel machine vision system for real-time, automated quality grading of sweetpotatoes. Computers and Electronics in Agriculture 219, 108826.
  3. Xu, J., Lu, Y., Deng, B., 2024. Design, prototyping, and evaluation of a machine vision-based automated sweetpotato grading and sorting system. Journal of the ASABE (under review).
  4. Xu, J., Lu, Y., Deng, B., 2024. OpenWeedGUI: an open-source graphical tool for weed Imaging and YOLO-based weed detection. Electronics 13 (9), 1699. (Project#2, Lu)
  5. Deng, B., Lu, Y., 2024. Canopy Image-based Blueberry Detection by YOLOv8 and YOLOv9. Artificial Intelligence in Agriculture (under review).
  6. Wang, Y., Lu, Y., Morris, D., Benjamin, M., Lavagnino, M., McIntyre, J., 2024. Automated sow body condition estimation by 3D computer vision towards precision livestock farming. Artificial Intelligence in Agriculture (submitted to journal).
  7. Olaniyi, E., Lu, Y., Sukumaran, A., Jarvis, T., Clinton, R., 2023. Non-destructive Assessment of White Striping in Broiler Breast Meat Using Structured Illumination Reflectance Imaging with Deep Learning. Journal of the ASABE 66(6), 1437-1447.
  8. Dang, F., Chen, D., Lu, Y., Li, Z., 2023. YOLOWeeds: a novel benchmark of YOLO object detectors for multi-class weed detection in cotton production systems. Computers and Electronics in Agriculture 205, 107655.
  9. Chen, D., Qi, X., Zheng, Y., Lu, Y., Huang, Y., Li, Z., 2024. Synthetic data augmentation by diffusion probabilistic models to enhance weed recognition. Computers and Electronics in Agriculture 216, 108517.
  10. Deng, B., Lu, Y., Xu, J., 2024. Weed database development: An updated survey of public weed datasets and cross-season weed detection adaptation. Ecological Informatics, 102546.

 

UArk

  1. Li, Z., Wang, D., Zhu, T., Tao, Y., & Ni, C. (2024). Review of deep learning-based methods for non-destructive evaluation of agricultural products. Biosystems Engineering, 245, 56-83.
  2. Wang, D., Sethu, S., Nathan, S., Li, Z., Hogan, V. J., Ni, C., ... & Seo, H. S. (2024). Is human perception reliable? Toward illumination robust food freshness prediction from food appearance—Taking lettuce freshness evaluation as an example. Journal of Food Engineering, 112179.
  3. Zhou, C., Li, Z., Wang, D., Xue, S., Zhu, T., & Ni, C. (2024). SSNet: Exploiting Spatial Information for Tobacco Stem Impurity Detection with Hyperspectral Imaging. IEEE Access.
  4. Ali, M. A., Wang, D., & Tao, Y. (2024). Active Dual Line-Laser Scanning for Depth Imaging of Piled Agricultural Commodities for Itemized Processing Lines. Sensors, 24(8), 2385.
  5. Xu, Z., Uppuluri, R., Shou, W., Wang, D., & She, Y. (2024). Whole Chicken Pushing Manipulation via Imitation Learning. In 2024 ASABE Annual International Meeting. American Society of Agricultural and Biological Engineers.
  6. Li, Z., Wang, D., Zhu, T., Ni, C., & Zhou, C. (2023). SCNet: A deep learning network framework for analyzing near-infrared spectroscopy using short-cut. Infrared Physics & Technology, 132, 104731.

 

UF

  1. da Cunha V.G., A. Hariharan J., Ampatzidis Y., Roberts P., 2023. Early detection of tomato bacterial spot disease in transplant tomato seedlings utilizing remote sensing and artificial intelligence. Biosystems Engineering, 234, 172-186, https://doi.org/10.1016/j.biosystemseng.2023.09.002.
  2. da Cunha V.A.G., Pullock D., Ali M., Neto A.D.C., Ampatzidis Y., Weldon C., Kruger K., Manrakhan A., Qureshi J., 2024. Psyllid Detector: a web-based application to automate insect detection utilizing image processing and artificial intelligence. Applied Engineering in Agriculture, 40(4), 427-439. https://doi.org/10.13031/aea.15826. 
  3. Javidan S.M., Banakar A., Rahnama K., Vakilian K.A., Ampatzidis Y., Feature engineering to identify plant diseases using image processing and artificial intelligence: a comprehensive review. Smart Agricultural Technology, 8, 100480, https://doi.org/10.1016/j.atech.2024.100480.
  4. Javidan S.M., Banakar A., Vakilian K.A., Ampatzidis Y., Rahnama K., 2024. Diagnosing the spores of tomato fungal diseases using microscopic image processing and machine learning. Multimedia Tools and Applications, 1-19, https://doi.org/10.1007/s11042-024-18214-y. 
  5. Kim, D.W., S.J. Jeong, S. Lee, H. Yun, Y.S., Chung, Y.-S. Kwon, and H.-J. Kim. 2023. Growth monitoring of field-grown onion and garlic by CIE L*a*b* color space and region-based crop segmentation of UAV RGB images. Precision Agric 24, 1982–2001. https://doi.org/10.1007/s11119-023-10026-8.
  6. Kondaparthi AK, Lee WS, Peres NA. Utilizing High-Resolution Imaging and Artificial Intelligence for Accurate Leaf Wetness Detection for the Strawberry Advisory System (SAS). Sensors. 2024; 24(15):4836. https://doi.org/10.3390/s24154836.
  7. Liu X., Zhang Z., Igathinathane C., Flores P., Zhang M., Li H., Han X., Ha T., Ampatzidis Y., Kim H-J., 2024. Infield corn kernel detection using image processing, machine learning, and deep learning methodologies. Expert Systems with Applications, 238 (part E), 122278, https://doi.org/10.1016/j.eswa.2023.122278.
  8. Mehdizadeh S.A., Noshad M., Chaharlangi M., Ampatzidis Y., Development of an innovative optoelectronic nose for detecting adulteration in quince seed oil. Foods, 12(23), 4350, https://doi.org/10.3390/foods12234350.
  9. Mirbod, O., Choi, D., Heinemann, P. H., Marini, R. P., & He, L. (2023). On-tree apple fruit size estimation using stereo vision with deep learning-based occlusion handling. Biosystems Engineering, 226, 27-42.
  10. Ojo I., Ampatzidis Y., Neto A.D.C., Batuman O., 2024. Development of an automated needle-based trunk injection system for HLB-affected citrus trees. Biosystems Engineering, 240, 90-99, https://doi.org/10.1016/j.biosystemseng.2024.03.003.
  11. Ojo I., Ampatzidis Y., Neto A.D.C., Bayabil K.H., Schueller K.J., Batuman O., 2024. Determination of needle penetration force and pump pressure for the development of an automated trunk injection system for HLB-affected citrus trees. Journal of ASABE, 67, 4, https://doi.org/10.13031/ja.15975.
  12. Teshome F.T., Bayabil H.K., Hoogenboom G., Schaffer B., Singh A., Ampatzidis Y., Unmanned aerial vehicle (UAV) imaging and machine learning applications for plant phenotyping. Computers and Electronics in Agriculture, 212, 108064, https://doi.org/10.1016/j.compag.2023.108064.
  13. Teshome F.T., Bayabil H.K., Schaffer B., Ampatzidis Y., Hoogenboom G., Singh A., 2024. Simulating soil hydrologic dynamics using crop growth and machine learning models. Computers and Electronics in Agriculture, 224, 109186, https://doi.org/10.1016/j.compag.2024.109186.
  14. Zhang L., Ferguson L., Ying L., Lyons A., Laca E., and Ampatzidis Y., Developing a web-based pistachio nut growth prediction system for orchard management. HortTechnology, 34,1, 1-7, https://doi.org/10.21273/HORTTECH05270-23.
  15. Zhou, C., S. Lee, O. E. Liburd, I. Aygun, X. Zhou, A. Pourreza, J. K. Schueller, Y. Ampatzidis. 2023. Detecting two-spotted spider mites and predatory mites in strawberry using deep learning. Smart Agricultural Technology, 4, 100229. https://doi.org/10.1016/j.atech.2023.100229.
  16. Zhou C., S. Lee, S. Zhang, O. E. Liburd, A. Pourreza, J. K. Schueller, Y. Ampatzidis. 2024. A smartphone application for site-specific pest management based on deep learning and spatial interpolation. Computers and Electronics in Agriculture, 218, 2024, 108726, ISSN 0168-1699, https://doi.org/10.1016/j.compag.2024.108726.
  17. De Vries, A., Bliznyuk, N., & Pinedo, P. (2023). Invited Review: Examples and opportunities for artificial intelligence (AI) in dairy farms. Applied Animal Science, 39(1), 14-22.
  18. Kalopesa, E., Tziolas, N., Tsakiridis, N., Multimodal Fusion for soil organic carbon estimation at continental scale. Remote Sensing. (submitted)
  19. Rosin, N. A., Demattê, J. A. M., Carvalho, H. W. P., Rodriguez-Albarracín, H. S., Rosas, J. T. F., Novais, J. J., Dalmolin, R. S. D., Alves, M. R., Falcioni, R., Tziolas, N., Mallah, S., de Mello, D. C., & Francelino, M. R. (2024). Spatializing soil elemental concentration as measured by X-ray fluorescence analysis using remote sensing data. Catena, 240, 107988. https://doi.org/10.1016/j.catena.2024.107988  
  20. Tziolas, N., Tsakiridis, N., Heiden, U., & van Wesemael, B. (2024). Soil organic carbon mapping utilizing convolutional neural networks and Earth observation data: A case study in Bavaria state, Germany. Geoderma, 444, 116867. https://doi.org/10.1016/j.geoderma.2024.116867
  21. Patnam Reddy, K., Tziolas, N., Dematte, J., AI-driven online spectral analysis tool for global use. Geoderma. (being prepared).
  22. Qian, H., McLamore, E., & Bliznyuk, N. (2023). Machine learning for improved detection of pathogenic E. coli in hydroponic irrigation water using impedimetric aptasensors: A comparative study. ACS Omega, 8(37), 34171-34179.

 

Mississippi State University

  1. Gharakhani, H., Thomasson, J. A., Lu, Y., & Reddy, K. R. (2023). Field Test and Evaluation of an Innovative Vision-Guided Robotic Cotton Harvester. Computers and Electronics in Agriculture. 225: 109314.

 

UTK

  1. Amirivojdan, A., Nasiri, A., Zhou, S., Zhao, Y., & Gan, H. (2024). ChickenSense: A Low-Cost Deep Learning-Based Solution for Poultry Feed Consumption Monitoring Using Sound Technology. AgriEngineering, 6(3), 2115-2129.
  2. Nasiri, A., Zhao, Y., & Gan, H. (2024). Automated detection and counting of broiler behaviors using a video recognition system. Computers and Electronics in Agriculture, 221, 108930. DOI: 10.1016/j.compag.2024.108930
  3. Nasiri, A., Amirivojdan, A., Zhao, Y., & Gan, H. (2024). An automated video action recognition-based system for drinking time estimation of individual broilers. Smart Agricultural Technology, 100409. https://doi.org/10.1016/j.atech.2024.100409

UK

  1. Ekramirad, N., Doyle, L.E., Loeb, J.R., Santra, D., Adedeji, A.A. (2024). Hyperspectral imaging and machine learning as a nondestructive method for proso millet seed detection and classification. Foods 13(9), 1330.
  2. Adedeji, A.A., Ekramirad, N., Khaled, Y.A., and Villanueva, R. (2024). Impact of storage on nondestructive detectability of codling moth infestation in apples. Journal of ASABE 67(2):401-408. https://doi.org/10.13031/ja.15583.
  3. Tizhe Liberty, J., Sun, S., Kucha, C., Adedeji, A. A., Agidi, G., & Ngadi, M. O. (2024). Augmented reality for food quality assessment: Bridging the physical and digital worlds. Journal of Food Engineering 367, 111893. https://doi.org/10.1016/j.jfoodeng.2023.111893
  4. Adedeji, A.A., Okeke, A., and Rady, A. (2023). Utilization of FTIR and machine learning for evaluating gluten-free bread contaminated with wheat flour. Sustainability - Food Processing Safety and Public Health 15(11), 8742.
  5. Khaled, Y.A., Ekramirad, N., Donohue, K., Villanueva, R., and Adedeji, A.A. (2023). Non-destructive hyperspectral imaging and machine learning-based predictive models for physicochemical quality attributes of apples during storage as affected by codling moth infestation. Agriculture – Digital Agriculture 13(5),1086. https://doi.org/10.3390/agriculture13051086.
  6. Ekramirad, N., Khaled, Y.A., Donohue, K., Villanueva, R., and Adedeji, A.A. (2023). Classification of codling moth infested apples using sensor data fusion of acoustic and hyperspectral features coupled with machine learning. Agriculture - Agricultural Technology 13(4), 839. https://doi.org/10.3390/agriculture13040839.

 

TAMU

  1. Fernandes, M.M., Fernandes Junior, J.d., Adams, J.M., Tedeschi, L.O. et al.(2024). Using sentinel-2 satellite images and machine learning algorithms to predict tropical pasture forage mass, crude protein, and fiber content. Scientific Report. 14, 8704. https://doi.org/10.1038/s41598-024-59160-x
  2. Kaniyamattam, Bhandari, M., Hardin, R., Tao, J., Landivar, J., and Tedeschi, L. (2023). Scalable Data-driven Intelligent Agri-Systems: Opportunities, Challenges, and Research Investment Analysis for the State of Texas. A white paper submitted to Texas A&M AgriLife Research.
  3. Risal, A., Niu, H., Landivar-Scott, J. L., Maeda, M. M., Bednarz, C. W., Landivar-Bowles, J., ... & Bhandari, M. (2024). Improving Irrigation Management of Cotton with Small Unmanned Aerial Vehicle (UAV) in Texas High Plains. Water, 16(9), 1300.
  4. Niu, H., Peddagudreddygari, J. R., Bhandari, M., Landivar, J. A., Bednarz, C. W., & Duffield, N. (2024). In-Season Cotton Yield Prediction with Scale-Aware Convolutional Neural Network Models and Unmanned Aerial Vehicle RGB Imagery. Sensors, 24(8), 2432.
  5. Khuimphukhieo, I., Bhandari, M., Enciso, J., & da Silva, J. A. (2024). Assessing Drought Stress of Sugarcane Cultivars Using Unmanned Vehicle System (UAS)-Based Vegetation Indices and Physiological Parameters. Remote Sensing, 16(8), 1433.
  6. Zhao, L., Bhandari, M., Um, D., Nowka, K., Landivar, J., & Landivar, J. Cotton Yield Prediction Utilizing Unmanned Aerial Vehicles (Uav) and Bayesian Neural Networks. Available at SSRN 4693599.
  7. Dhal, S. B., Kalafatis, S., Braga-Neto, U., Gadepally, K. C., Landivar-Scott, J. L., Zhao, L., ... & Bhandari, M. (2024). Testing the Performance of LSTM and ARIMA Models for In-Season Forecasting of Canopy Cover (CC) in Cotton Crops. Remote Sensing, 16(11), 1906.
  8. Happs, R. M., Hanes, R. J., Bartling, A. W., Field, J. L., Harman-Ware, A. E., Clark, R. J., Yaping, X., ... & Davison, B. H. (2024). Economic and Sustainability Impacts of Yield and Composition Variation in Bioenergy Crops: Switchgrass (Panicum virgatum L.). ACS Sustainable Chemistry & Engineering, 12(5), 1897-1910.
  9. Bhandari, M., Chang, A., Jung, J., Ibrahim, A. M., Rudd, J. C., Baker, S., ... & Landivar, J. (2023). Unmanned aerial system‐based high‐throughput phenotyping for plant breeding. The Plant Phenome Journal, 6(1), e20058.

K-State

  1. McGinty H, Shimizu C, Hitzler P, & Sharda A. (2024). Towards a Global Food Systems Datahub. Semantic Web (2024) 1–4. https://doi.org/10.3233/SW-243688
  2. Badua S, Sharda A, Aryal B. 2024. Quantifying real-time opening disk load during planting operations to assess compaction and potential for planter control. Precision Agriculture, 25(4):1-13. https://doi.org/10.1007/s11119-024-10151-y
  3. Das S, Flippo D, Welch S. 2024. Autonomous robot system for steep terrain farming operations. U.S. Patent and Trademark Office. 
  4. Grijalva I, Kang Q, Flippo D, Sharda A, McCornack B. 2024. Unconventional strategies for aphid management in sorghum. Insects, 15(475).
  5. Rahman R, Indris C, Bramesfeld G, Zhang T, Li K, Chen X, Grijalva I, McCornack B, Flippo D, Sharda A, Wang G. A new dataset and comparative study for aphid cluster detection and segmentation in sorghum fields. Journal of Imaging, 10(5), 2024-5-08.
  6. Pokharel P, Sharda A, Flippo D, Ladino K. Design and systematic evaluation of an under-canopy robotic spray system for row crops.  Smart Agricultural Technology, 8:100510.

ALL STATION CONFERENCE PRESENTATIONS: PODIUM/POSTER

SDSU

  1. Wang, T. and H. Jin. Factors Affecting Farmer Adoption of Unmanned Aerial Vehicles: Current and Future. 2024 AI in Agriculture and Natural Resources Conference. April 15-17, 2024, College Station, Texas.
  2. Wang, T. and H. Jin. Factors Affecting Farmer Adoption of Unmanned Aerial Vehicles: Current and Future. Southern Agricultural Economics Association (SAEA) 56th Annual Meeting. February 3-6, 2024, Atlanta, Georgia.
  3. Adereti, D. T., Gardezi, M., Wang, T., McMaine, J. 2023. Understanding farmers’ engagement and barrier to machine learning-based intelligent agricultural decision support systems. 85th Annual Meeting of the Rural Sociological Society. August 2-6, Burlington, VT.

 

LSU

  1. Adhikari, R., Setiyono, T., Dodla, S. K., Pabuayon, I. L., Duron, D., Acharya, B., ... & Shiratsuchi, L. S. (2023, October). Evaluation of Varying Canopy Distance on Crop Circle Phenom Sensor Measurements: Implications for Remote Sensing of Crop Parameters. In ASA, CSSA, SSSA International Annual Meeting. ASA-CSSA-SSSA.
  2. Setiyono, T., Dodla, S. K., Rontani, F. A., Acharya, B., Duron, D., Adhikari, R., ... & Gentimis, T. (2023, October). Precision Positioning in UAV Remote Sensing: Case Study in Corn N Rates and Soybean Seeding Rates Experiments. In ASA, CSSA, SSSA International Annual Meeting. ASA-CSSA-SSSA.
  3. Acharya, B., Setiyono, T., Rontani, F. A., Dodla, S. K., Adhikari, R., Duron, D., ... & Parvej, R. (2023, October). Application of UAV Remote Sensing for Monitoring Nitrogen Status in Corn Under Excessive Rainfall Conditions. In ASA, CSSA, SSSA International Annual Meeting. ASA-CSSA-SSSA.
  4. Duron, D., Rontani, F. A., Acharya, B., Adhikari, R., Taylor, Z., Blanchard, B., ... & Setiyono, T. (2023, October). Integrating Crop Modeling and Remote Sensing Data for Prediction of Sugarcane Growth, Yield, and Sugar Content and Their Field Spatial Variability. In ASA, CSSA, SSSA International Annual Meeting. ASA-CSSA-SSSA.
  5. Adhikari, R., Setiyono, T., Dodla, S. K., Pabuayon, I. L., Duron, D., Acharya, B., ... & Shiratsuchi, L. S. (2023, October). Multi-Sensor Crop Sensing Platforms for Monitoring Agronomic Practices Under Different Tillage and Fertilization Systems. In ASA, CSSA, SSSA International Annual Meeting. ASA-CSSA-SSSA.
  6. Lanza, P., Santos, L., Gentimis, A., Yang, Y., Conger, S., & Beasley, J. (2023). Parameters to increase LiDAR mounted UAV efficiency on agricultural field elevation measurements. In Precision agriculture '23 (pp. 715–721). Wageningen Academic.
  7. Júnior, M. R. B., de Almeida Moreira, B. R., Duron, D., Setiyono, T., Shiratsuchi, L. S., & da Silva, R. P. (2024). Integrated sensing and machine learning: Predicting saccharine and bioenergy feedstocks in sugarcane. Industrial Crops and Products, 215, 118627.

Clemson

  1. Singh, J., Koc, A.B., Aguerre, M.J., Chastain, J.P., and Shaik, S. 2024. Stereoscopic Morphometry in Forages: Predicting Pasture Quantity with Field Robotics. Presented at the 2024 ASABE Annual International Meeting, July 28-31, 2024. Anaheim CA.
  2. Lisa Umutoni, Vidya Samadi, George Vellidis, Jose Payero, Bulent Koc, Charles Privette III. 2024. Application of Deep Neural Networks for Seasonal Cotton Yield Estimation. Presented at the 2024 ASABE Annual International Meeting, July 28-31, 2024. Anaheim CA.
  3. Shaik, S., B. Koc, J. Singh, M. Aguerre, J. P. Chastain. 2024. Aboveground Biomass Prediction of Bermudagrass: A Comparative Analysis of Machine Learning Models. 2024 AI in Agriculture and Natural Resources Conference. April 15, 2024 - April 17, 2024.

 

MSU

  1. Xu, J., Lu, Y., Deng, B., 2024. Design, prototyping, and evaluation of a machine vision-based automated sweet potato grading and sorting system. ASABE Annual International Meeting 2400102.
  2. Xu, J., Lu, Y, 2024. Design and preliminary evaluation of a machine vision-based automated sweet potato sorting system. Sensing for Agriculture and Food Quality and Safety XVI Proceedings Volume PC13060.
  3. Xu, J., Lu, Y., 2024. Prototyping and preliminary evaluation of a real-time multispectral vision system for automated sweet potato quality grading. Presented at the 2024 International Conference on Precision Agriculture. (Project #1, Lu)
  4. Xu, J., Lu, Y., 2023. OpenWeedGUI: an open-source graphical user interface for weed imaging and detection. Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VIII 12539, 97-106.
  5. Deng, B., Lu, Y., Vander Weide, J., 2024. Development and Preliminary Evaluation of a Deep Learning-based Fruit Counting Mobile APP for High-bush Blueberries. ASABE Annual International Meeting 2401022
  6. Wang, Y., Lu, Y., Morris, D., Benjamin, M., Lavagnino, M., McIntyre, J., 2024. 3D Computer Vision-Based Sow Body Condition Estimation Towards Precision Livestock Farming. Presented at the 2024 AI Conference in Agriculture.
  7. Wang, Y., Lu, Y., Morris, D., Benjamin, M., Lavagnino, M., McIntyre, J., 2024. 3D Computer Vision with A Spatial-Temporal Neural Network for Lameness Detection of Sows. Presented at the 2024 International Conference on Precision Agriculture.
  8. Deng, B., Lu, Y., 2023. Factors influencing the detection of Lambsquarters by YOLOv8 towards precision weed control. Poster presented at the Great Lakes EXPO (Grand Rapids, Michigan).
  9. Deng, B., Lu, Y., 2024. Weed Image Augmentation by ControlNet-Added Stable Diffusion. Proceedings Volume 13035, Synthetic Data for Artificial Intelligence and Machine Learning: Tools, Techniques, and Applications II 130350M. https://doi.org/10.1117/12.3014145

 

UArk

  1. Pallerla C., Owens, C., Wang D., (2024) Hyperspectral imaging and machine learning algorithms for foreign material detection on the chicken surface. In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting. Anaheim, CA [Poster presentation]
  2. Pallerla C., Owens, C., Wang D., (2024) Hyperspectral imaging and machine learning algorithms for foreign material detection on the chicken surface. In 2024 Poultry Science Association Annual International Meeting. Louisville, KY [Poster presentation]
  3. Feng Y., Wang D., (2024) Synthetic Data Augmentation for Chicken Carcass Instance Segmentation with Mask Transformer. In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting. Anaheim, CA [Poster presentation]
  4. Mahmoudi S., Wang D., (2024) Automated Solutions for Poultry Processing: Integrating Robotic Swab Sampling and Pathogen Detection Technologies. In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting. Anaheim, CA [Poster presentation]
  5. Sohrabipour P., Wan S., Yu S., Wang D., (2024) Depth image guided Mask-RCNN model for chicken detection in poultry processing line. In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting. Anaheim, CA [Oral presentation]
  6. Mahmoudi S., Sohrabipour P., Obe T., Gibson K., Crandall P., Jeyam S., Wang D. (2024), Automated Environmental Swabbing: A Robotic Solution for Enhancing Food Safety in Poultry Processing. In the 2024 Third Annual Artificial Intelligence in Agriculture Conference. College Station, TX [Poster presentation]
  7. Sohrabipour P., Mahmoudi S., She Y., Shou W., Pallerla C., Schrader L., Wang D. (2024), Advanced Poultry Automation: Integrating 3D Vision Reconstruction and Mask R-CNN for Efficient Chicken Handling. In the 2024 Third Annual Artificial Intelligence in Agriculture Conference. College Station, TX [Poster presentation, First place winner]

 

UF

  1. Ampatzidis Y., 2024. Can AI and automation transform specialty crop production? 16th International Conference on Precision Agriculture (ICPA), International Symposium on robotics and Automation, Manhattan, Kansas, USA, July 21-24.
  2. Ampatzidis Y., 2024. Agroview and Agrosense for AI-enhanced precision orchard management. SE Regional Fruit and Vegetable Conference, Savannah, GA, January 11-14, 2024
  3. Ampatzidis Y., 2023. Emerging and advanced technologies in agriculture. Link (Linking Industry Networks through Certifications; High School Teachers Training) Conference, Daytona Beach, FL, October 10-12, 2023.
  4. Ampatzidis Y., 2023. AI and Extension. Possibilities and Challenges. 2023 SR-PLN Middle Managers Conference, Next Generation: Evolving the Extension Enterprise, Orlando, FL, August 22-24.
  5. Ampatzidis Y., 2023. AI-Enhanced Technologies for Precision Management of Specialty Crops. Sustainable Precision Agriculture in the Era of IoT and Artificial Intelligence, Bard Ag-AI Workshop, Be’er Sheva, Israel, July 18-20, 2023.
  6. Ampatzidis Y., Ojo I., Neto A.D.C., Batuman O., 2024. Automated needle-based trunk injection system for HLB-affected citrus trees. AgEng International Conference of EurAgEng, Agricultural Engineering Challenges in Existing and New Agrosystems, Athens, Greece, July 1-4, 2024.
  7. Ampatzidis Y., Vijayakumar V., Pardalos P., 2024. AI-enabled robotic spraying technology for precision weed management in specialty crops. Optimization, Analytics, and Decision in Big Data Era Conference (in honor of the 70th birthday of Dr. Panos Pardalos), Halkidiki, Greece, June 16-21.
  8. Banakar A., Javidan S.M., Vakilian K.A., Ampatzidis Y., 2024. Detection of spectral signature and classification of Alternaria alternata and Alternaria solani diseases in tomato plant by analysis of hyperspectral images and support vector machine. AgEng International Conference of EurAgEng, Agricultural Engineering Challenges in Existing and New Agrosystems, Athens, Greece, July 1-4, 2024.
  9. Cho Y., Yu, Z., Ampatzidis Y., Nam J., 2024. Blockchain-enhanced security and data management in smart agriculture. 6th CIGR International Conference, Jeju, Korea, May 19–23, 2024.
  10. Dutt, N., & Choi, D. (2024). A Computer Vision System for Mushroom Detection and Maturity Estimation using Depth Images. 2024 ASABE Annual International Meeting.
  11. Etefaghi, A., Medeiros, H. “ViLAD: Video-based Lettuce Association and Detection,” American Society of Agricultural and Biological Engineers Annual International Meeting, Anaheim, CA, July 2024.
  12. Gallios, I., & Tziolas, N. (2024). Synergistic use of low-cost NIR scanner and geospatial covariates to enhance soil organic carbon predictions using dual input deep learning techniques. IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 8-12 July, Athens, Greece.
  13. Hernandez, B., Medeiros, H. “Multiple Plant Tracking for Robotized Spraying of Ground Plants,” 2023 IROS Workshop on Agricultural Robotics for a Sustainable Future Innovation in Precision Agriculture (3rd paper prize), Detroit, MI, Oct 2023.
  14. Huang, Z., W. S. Lee, N.C. Takkellapati. 2024. Strawberry canopy size estimation with SAM guided by YOLOv8 detection. ASABE Paper No. 2400181. St. Joseph, MI.: ASABE.  
  15. Huang, Z., W. S. Lee, N.C. Takkellapati. 2024. HOPSY: Harvesting Optimization for Production of StrawberrY using real-time detection with modified YOLOv8-nano. In Proceedings of the 16th International Conference on Precision Agriculture (unpaginated, online). Monticello, IL: International Society of Precision Agriculture.
  16. Ilodibe, U., & Choi, D. (2024). Evaluating The Performance of a Mite Dispensing System for Biological Control of Chilli Thrips in Strawberry Production in Florida. 2024 ASABE Annual International Meeting.
  17. Lacerda C., and Neto A.D.C., Ampatzidis Y., 2024. Agroview: enhance satellite imagery using super-resolution and generative AI for precision management in specialty crops. AgEng International Conference of EurAgEng, Agricultural Engineering Challenges in Existing and New Agrosystems, Athens, Greece, July 1-4, 2024.
  18. Lee, W. S. 2023. Strawberry plant wetness detection using color imaging and artificial intelligence for the Strawberry Advisory System (SAS). 2023 Annual Strawberry AgriTech Conference, Plant City, FL, May 17, 2023.
  19. Lee, W. S., T. Burks, and Y. Ampatzidis. 2023. Precision agriculture in Florida, USA – The Beginning, Progress, and Future. Chungnam National University, Daejeon-si, Korea. May 24, 2023.
  20. Lee, W. S., T. Burks, and Y. Ampatzidis. 2023. Precision agriculture in Florida, USA – The Beginning, Progress, and Future. Department of Agricultural Engineering, Division of Smart Farm Development, National Institute of Agricultural Sciences, Jeonju-si, Korea. May 25, 2023.
  21. Lee, W. S., T. Burks, and Y. Ampatzidis. 2023. Precision agriculture in Florida, USA – The Beginning, Progress, and Future. Seoul National University, Seoul, Korea. May 31, 2023.
  22. Lee, W. S., Y. Ampatzidis, and D. Choi. 2023. University of Florida 2023 W-3009 Report (presented via Zoom). Cornell AgriTech, Cornell University, Geneva, NY. June 20-21, 2023.
  23. Mirbod, O., & Choi, D. (2023). Synthetic Data-Driven AI Using Mixture of Rendered and Real Imaging Data for Strawberry Yield Estimation. 2023 ASABE Annual International Meeting.
  24. Medeiros, H. “Self-supervised Learning for Panoptic Segmentation of Multiple Fruit Flower Species,” IEEE/RSJ International Conference on Intelligent Robots and Systems, Detroit, MI, Oct 2023.
  25. Ojo I., Neto A.D.C., Ampatzidis Y., Batuman O., Albrecht U., 2024. Needle-based, automated trunk injection system for HLB-affected citrus trees. International Research Conference on Huanglongbing VII, Riverside, CA, March 26-29, 2024.
  26. Ottoy, S., Karyotis, K., Kalopesa, E., Van Meerbeek, K., Nedelkou, J., Gkrimpizis, T., De Vocht, A., Zalidis, G., & Tziolas, N. (2024). Digital mapping of soil organic carbon using drone remote sensing. IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 8-12 July, Athens, Greece.
  27. Vijayakumar V., Ampatzidis Y., 2024. Development of a machine vision and spraying system of a robotic precision smart sprayer for specialty crops. 3rd Annual AI in Agriculture and Natural Resources Conference, College Station, TX, April 15-17, 2024.
  28. Wang, R., Hofstetter, D., Medeiros, H., Boney, J., Kassub, H. “Evaluation of turkey behavior under different night lighting treatments using machine learning.” American Society of Agricultural and Biological Engineers Annual International Meeting, Anaheim, CA, July 2024.
  29. Zhou C., Ampatzidis Y., Pullock D., 2024. Detecting citrus pests from sticky traps using deep learning. 3rd Annual AI in Agriculture and Natural Resources Conference, College Station, TX, April 15-17, 2024.
  30. Zhou, X., Y. Ampatzidis, W. S. Lee, S. Agehara, and J. K. Schueller. 2023. AI-based inspection system for mechanical strawberry harvesters. AI in Agriculture: Innovation and discovery to equitably meet producer needs and perceptions Conference, Orlando, FL, April 17-19, 2023.
  31. Zhou, C., W. S. Lee, W. Kratochvil, J. K. Schueller, and A. Pourreza. 2023. A portable imaging device for twospotted spider mite detection in strawberry. ASABE Annual Meeting, Omaha, NE, July 9-12, 2023.
  32. Zhou, C., W. S. Lee, N. Peres, B. S. Kim, J. H. Kim, and H. C. Moon. 2023. Strawberry flower and fruit detection based on an autonomous imaging robot and deep learning. 14th European Conference on Precision Agriculture, Bologna, Italy, July 2-6, 2023.

UTK

  1. Nasiri, A., Zhao, Y., Gan, H. (2024). Automated broiler behaviors measurement through deep learning models. ASABE Annual International Meeting, Anaheim, CA.
  2. Amirivojdan, A., Nasiri, A., Zhao, Y., Gan, H. (2024). A machine vision system for broiler body weight estimation. ASABE Annual International Meeting, Anaheim, CA.

UCDavis

  1. Li, Z.; Karimzadeh, S.; Ahamed, M. S. (2024). Detection of Calcium Deficiency in the Growing Stage of Lettuce Using Computer Vision. ASABE Annual Meeting 2024, July 28-31, Anaheim, California.
  2. Karimzadeh, S.; Chowdhury, M.; Ahamed, M. S. (2023). Fault Detection and Diagnosis of Hydroponic System using Intelligent Computational Model. ASABE Annual Meeting, July 9-12, Omaha, Nebraska.
  3. Li, Z.; Karimzadeh, S.; Ahamed, M. S. (2024). Nutrient Dosing Algorithms to Mitigate Ion Imbalance in Closed-Loop Hydroponic Systems. ASABE Annual Meeting 2024, July 28-31, Anaheim, California.

UK

  1. Mizuta K., Miao Y, Lu J, and Negrini R. (2024) Evaluating Different Strategies to Analyze On-farm Precision Nitrogen Trial Data. 16th International Conference on Precision Agriculture, Manhattan, KS.
  2. Miao Y, Kechchour A, Sharma V, Flores A, Lacerda L, Mizuta K, Lu J, and Huang Y. (2024) In-season Diagnosis of Corn Nitrogen and Water Status Using UAV Multispectral and Thermal Remote Sensing. 16th International Conference on Precision Agriculture, Manhattan, KS.
  3. Oloyede, A. and Adedeji, A.A. (2024). Near-infrared hyperspectral imaging sensing for gluten detection and quantification. Accepted for presentation at 2024 ASABE Annual International Meeting, Anaheim, CA. July 28 – 31, 2024. Paper #: 2400053.
  4. Adedeji, A.A., Loeb, J.R., Doyle, L.E., Ekramirad, N., and Al Khaled, Y. (2023). Photon-induced reduction in barley malt processing time and quality improvement. Oral presentation at the 14th International Congress on Engineering and Food (ICEF14), held in Nantes, France, June 20–23, 2023.
  5. Adedeji, A.A., Ekramirad, N., Al Khaled, Y.A., Donohue, K., and Villanueva, R. (2023). Sensor data fusion and machine learning approach for pest infestation detection in apples. Poster presented at the SEC Conference with the theme “USDA-NIFA AI in Agriculture: Innovation and Discovery to Equitably Meet Producers’ Needs and Perceptions,” held in Orlando, Florida, April 17–19, 2023.

K-State

  1. Alamdari S, Brokesh E. 2024. “Enhancing Soil Health Monitoring in Precision Agriculture: A Comparative Analysis of avDAQ Vibration Data Collection System and Traditional Soil Sensors” ASABE-AIM, Presentation # 2400896
  2. Peiretti J, Sharda A. “Experimental study on the impact of planter tool bar position on row unit behavior” ASABE-AIM, Presentation # 2400215
  3. Vail B, Brokesh E. “Design and field-testing of a pull-force measuring frame for the testing of agricultural tire rolling resistance” ASABE-AIM, Presentation # 2401007
  4. Shende K, Sharda A. “Integration & testing of wireless data communication system for autonomous liquid application platform” ASABE-AIM, Presentation # 2400833
  5. Kaushal S, Sharda A. “Enhancing Agricultural Feedback Analysis through VUI and Deep Learning Integration” ASABE-AIM, Presentation # 2400287
  6. Abon J, Sharda A. “Optimizing Corn Irrigation Strategies: Insights from NDVI Trends, Soil Moisture Dynamics, and Remote Sensing” ASABE-AIM, Presentation # 2400814
  7. Peiretti J, Sharda A. “Effective Strategies for Closing Furrows Based on Corn Planter Settings” ASABE-AIM, Presentation # 2400215

 

Extension Articles

UF

  1. Choi, D., Mirbod, O., Ilodibe, U., & Kinsey, S. (2023). Understanding Artificial Intelligence: What It Is and How It Is Used in Agriculture: AE589, 10/2023. EDIS, 2023(6).
  2. Her Y.G., Bliznyuk N., Ampatzidis Y., Yu Z., and Bayabil H., 2024. Introduction to Artificial Intelligence in Agriculture. EDIS, University of Florida, IFAS Extension (accepted).
  3. Sharma L., and Ampatzidis Y., Approaches to consider for site-specific field mapping. SS713, EDIS, University of Florida, IFAS Extension, doi.org/10.32473/edis-SS713-2023.

 
