S1090: AI in Agroecosystems: Big Data and Smart Technology-Driven Sustainable Production

(Multistate Research Project)

Status: Active

SAES-422 Reports

Annual/Termination Reports:

[08/31/2022] [07/24/2023] [09/05/2023] [06/11/2024] [10/02/2024]

Date of Annual Report: 08/31/2022

Report Information

Annual Meeting Dates: 08/04/2022 - 08/05/2022
Period the Report Covers: 10/01/2021 - 08/05/2022

Participants

Confirmed in-person attendees:

1. Yiannis Ampatzidis (Univ. of Florida)
2. Tom Burks (Univ. of Florida)
3. Dana Choi (Univ. of Florida)
4. Thanos Gentimis (Louisiana State Univ.)
5. Zhen Jia (Univ. of Florida)
6. Edward Kick (North Carolina State Univ.)
7. Juan Landivar (Texas A&M Univ.)
8. Daniel Lee (Univ. of Florida)
9. Amando Lopes de Brito Filho (Louisiana State U.)
10. Yuzhen Lu (Mississippi State Univ.)
11. Henry Medeiros (Univ. of Florida)
12. Brenda Ortiz (Auburn Univ.)
13. Luciano Shiratsuchi (Louisiana State Univ.)
14. Alex Thomasson (Mississippi State Univ.)
15. Gary Thompson (University of Arkansas)
16. Jeffrey Vitale (Oklahoma State Univ.)

Confirmed online attendees:
1. Tom Burks (Univ. of Florida, on Friday)
2. Matt Donovan (AgIntel)
3. Hao Gan (Univ. of Tennessee)
4. Steve Thomson (USDA, NIFA)
5. Paul Weckler (Oklahoma State Univ.)

Brief Summary of Minutes

Aug. 4: Field trip        


7:45 – 8:00 am            Met at the Hilton hotel parking lot and departed at 8 am for the field trip. 


8:00 – 8:30 am            Traveled to PSREU (Citra, FL) (https://plantscienceunit.ifas.ufl.edu/)


8:30 – 9:00 am            Dr. Jim Boyer gave a tour of the PSREU facilities including a detailed explanation of the extensive field trials conducted on-site. Our group engaged in an active discussion with Dr. Boyer regarding various agronomic issues and constraints encountered during field trials.  


9:00 – 9:50 am            Dr. Congliang Zhou provided a precision ag demonstration of a robot programmed to analyze plant wetness using on-board sensors as well as to detect soil mites.  


9:50 – 10:00 am          Break was provided by PSREU at their main headquarters. 


10:00 – 10:30 am        Mr. Whitehurst gave a PowerPoint presentation on his 4,000-acre farm/plantation. This included a video of how Mr. Whitehurst uses aerial drones in his ranching operations to herd cattle remotely. He was assisted by Yilin Zhuang and Stacy Strickland. Mr. Whitehurst also explained how data collected from drones are used to manage his extensive plantation.


10:30 – 11:00 am        Traveled to the Whitehurst Cattle Farm in Williston, FL.


11:00 – 11:45 am        Mr. Whitehurst gave a brief farm tour and a presentation of the various drones he owns and operates. This was followed by an in-field demonstration of Mr. Whitehurst herding his cattle using a drone.


11:45 – 12:15 pm        Participants returned to Dept. of ABE in Gainesville, FL.


12:15 – 1:00 pm          Lunch.                                               


1:00 – 2:00 pm            Meeting with Dr. David Reed of the AI2 Center. Dr. Reed discussed the new AI initiatives, including the hiring of over 100 new faculty dedicated to AI-focused positions. Other issues discussed included the SEC Consortium.


2:00 – 3:00 pm            Meeting with Dr. Amber Ross, AI ethics expert. Dr. Ross generated a spirited discussion on the ethics of AI use in agriculture and throughout society in general.


3:00 – 3:30 pm            Break and travel to HiPerGator supercomputer facilities on the UF campus.


3:30 – 5:00 pm            The HiPerGator tour was provided by Dr. Erik Deumens. Participants were allowed access into HiPerGator's complex of servers. Dr. Deumens explained how HiPerGator utilizes the GPU computing power of video cards to process computing tasks. Participants also viewed the immense cooling facilities required by HiPerGator.  


5:00 - 7:00 pm             Dinner and networking at Mildred’s Restaurant. The meal was sponsored by Auburn University.


 


Aug. 5: Meeting         


7:30 – 7:45 am            Met at the Hilton hotel parking lot.


7:45 – 8:00 am            Drove to the ABE Department on the UF campus.


8:00 – 8:10 am            Introduction of the participants including Zoom participants. 


8:10 – 8:20 am            Dr. Gary Thompson, Executive Director, SAAESD, University of Arkansas. Dr. Thompson provided an overview of multistate Hatch projects, including how to develop and submit annual reports. Dr. Thompson has provided his PowerPoint.  


8:20 – 8:35 am            Dr. Damian Adams, S1090 Administrative Advisor, Associate Dean for Research, UF. Dr. Adams stressed the importance of strengthening AI in the southern region, which lags behind the Corn Belt and Western regions. 


8:35 – 9:10 am            Dr. Steve Thomson, USDA-NIFA (video for funding programs, Q&A via Zoom). Dr. Thomson provided a thorough review of the various funding opportunities available for AI research, including fundamental science-based research as well as development and implementation. 


9:10 – 9:30 am            Dr. Shai Sela, Chief Scientist, Agmatix, Ramat Gan, Israel, gave a presentation on his company's AI technology applications.


9:30 – 9:40 am            Coffee break.


9:40 – 11:30 am          In the first part of this session, participants were grouped into three teams based on area of expertise to encourage team building and future collaboration. In a follow-up session, the groups reconvened to discuss plans for the second project year. Consensus was reached to develop a research proposal for submission to an agency such as NSF or USDA. A committee was selected to develop a white paper to begin the proposal writing.


 


11:30 – 1:00 pm          This was a "working lunch" session to take care of several business items such as elections, locations of future meetings, and annual reporting. The following outcomes were achieved:


       Business meeting outcomes:



  • Elected Yuzhen Lu as our new secretary.

  • LSU was selected as the 2023 meeting location, sometime in May 2023.

  • Jeff Vitale was selected to submit the annual report.


 


1:00 pm                       Meeting was adjourned by President Daniel Lee.

Accomplishments

<h1><strong>Activities (2021-2022)</strong></h1><br /> <p>&nbsp;</p><br /> <h2 class="x_MsoNormal"><em>Project Level Activities</em></h2><br /> <p>Members of the S1090 project at Auburn University, led by Dr. Brenda Ortiz, organized a conference targeting undergraduates and young professionals. A total of 250 participants attended in person and another 150 online. The conference was well received, and plans are going forward to hold a similar conference next year. Conference details are available online:</p><br /> <p class="x_MsoNormal">Website: <a href="https://aaes.auburn.edu/ai-driven-innovations-in-agriculture/" target="_blank" rel="noopener noreferrer">https://aaes.auburn.edu/ai-driven-innovations-in-agriculture/</a></p><br /> <p class="x_MsoNormal">Website with conference posters: <a href="https://auburncatalog.instructure.com/courses/1860/pages/conference-posters" target="_blank" rel="noopener noreferrer">https://auburncatalog.instructure.com/courses/1860/pages/conference-posters</a></p><br /> <p class="x_MsoNormal">&nbsp;</p><br /> <h2 class="x_MsoNormal"><em>State Level Activities</em></h2><br /> <p class="x_MsoNormal"><span style="text-decoration: underline;">Alabama: Auburn University<br /></span></p><br /> <ol><br /> <li>AI-driven high-throughput phenotyping of agronomic and physiological traits in peanut (Yin Bao)</li><br /> </ol><br /> <p>Weekly UAV-based VNIR hyperspectral imagery data were collected for an F<sub>1</sub> peanut population to identify drought-tolerant lines under rainout shelters at the USDA-ARS National Peanut Research Laboratory (NPRL) in Dawson, GA, during the pod-filling stage in 2021. The project is in collaboration with a peanut breeder (Dr. Charles Chen) and a plant physiologist (Dr. Alvaro Sanz-Saez) from Auburn University and a research chemist (Dr. Phat Dang) from NPRL. Machine and deep learning models were developed to predict three agronomic traits (i.e., pod yield, biomass, and pod count) and two physiological traits (i.e., photosynthesis and stomatal conductance) with reasonable accuracies (R<sup>2</sup> values around 0.55).
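As a rough illustration of the kind of trait-prediction modeling described above, the sketch below fits a generic regressor to per-plot spectral features and reports held-out R². The data and feature layout are synthetic stand-ins, not the project's pipeline or dataset.

```python
# Hypothetical sketch of predicting an agronomic trait (e.g., pod yield) from
# per-plot hyperspectral band features. All data here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_plots, n_bands = 300, 50
X = rng.normal(size=(n_plots, n_bands))           # per-plot mean band reflectances
w = rng.normal(size=n_bands)
y = X @ w + rng.normal(scale=3.0, size=n_plots)   # stand-in for the measured trait

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
r2 = r2_score(y_te, model.predict(X_te))
print(f"held-out R^2: {r2:.2f}")
```

In practice the feature matrix would come from segmented plot imagery rather than random numbers, and model choice (random forest here, versus the deep models the report mentions) would be driven by the size of the phenotyped population.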
A manuscript has been submitted to <em>Remote Sensing</em>.</p><br /> <ol start="2"><br /> <li>AI-based remote sensing of water quality and HABs for inland water bodies in the Southeast (Yin Bao)</li><br /> </ol><br /> <p>A dataset including in-situ chlorophyll a and/or microcystin concentration measurements and Sentinel 2 multispectral satellite imagery has been curated for Lake Okeechobee (FL), Lake Thonotosassa (FL), and Lake Seminole (GA) from 2016 to 2021. LSTM models have been trained and tested to forecast chlorophyll a and/or microcystin concentrations one month ahead using time-series satellite spectral response. Preliminary results are promising but need further improvement. Continued investigation is needed to see if including other features such as weather parameters can improve prediction accuracy.</p><br /> <p>The developed machine and deep learning models for peanut agronomic and physiological trait prediction will enable screening of a large population for drought tolerance by reducing the labor requirement of traditional phenotyping methods, thus accelerating the release of climate-smart peanut lines for the Southeast.</p><br /> <p>&nbsp;</p><br /> <p class="x_MsoNormal">Peanut Maturity Assessment: Remote Sensing and Artificial Intelligence (Brenda Ortiz)</p><br /> <p class="x_MsoNormal">Our research team has begun work on a project that uses remote sensing and AI technology to assist peanut producers in the Southern region in developing improved methods to determine peanut maturity. We are seeking non-destructive, cost-effective methods to assess maturity in order to improve peanut harvest and subsequent farm income.</p><br /> <h4>&nbsp;</h4><br /> <p><span style="text-decoration: underline;">Florida: University of Florida</span></p><br /> <p>Uncertainty-aware Robotic Perception Models for Agricultural Production Systems (Dr. 
Henry Medeiros)</p><br /> <p>Our team developed a self-supervised machine learning model to detect flowers in images of trees acquired in an orchard. Our algorithm makes it possible to detect individual flowers in real-world conditions without the need for specialized data acquisition systems or training data. An evaluation on publicly available benchmark datasets containing images of multiple flower species collected under various environmental conditions demonstrates that our method substantially outperforms existing techniques, despite the fact that it does not need to be trained using images of the target flower species. A manuscript describing our research findings has been submitted to IEEE Robotics and Automation Letters and is currently undergoing its second round of revisions.</p><br /> <p>Expected Impact(s): The self-supervised machine learning model described above enables the development of systems to detect flowers, fruit, buds, and other relevant plant parts in the field without the need to collect and annotate hundreds to thousands of images reflecting all the potential data acquisition scenarios that may impact algorithmic performance, such as illumination variation and image resolution. Data collection and annotation are currently one of the main factors hindering the application of modern artificial intelligence techniques to agricultural problems. Hence, we expect our model to serve as a foundational architecture for the development of future agricultural robotic perception systems.</p><br /> <p>&nbsp;</p><br /> <p>Deep Learning Algorithms (Dr. Dana Choi)</p><br /> <p>A deep-learning-based algorithm was developed to segment green fruits and fruit stems, and then the orientation of the fruits was identified to provide guidance for the robotic green fruit system to remove fruits. A path planning algorithm was also developed with a six-degree-of-freedom robotic arm to engage targeted green fruits. 
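The fruit-orientation step could, for example, be derived from the segmentation masks themselves. The sketch below estimates a fruit's major-axis orientation via PCA on mask pixel coordinates; this is a generic illustration on a synthetic mask, not necessarily the team's actual method.

```python
# Hypothetical sketch: estimate a fruit's principal-axis orientation from a
# binary segmentation mask via PCA on its pixel coordinates.
import numpy as np

def mask_orientation_deg(mask: np.ndarray) -> float:
    """Angle (degrees, 0-180) of the mask's major axis relative to the x-axis."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    pts -= pts.mean(axis=0)                  # center the pixel cloud
    cov = np.cov(pts.T)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    major = eigvecs[:, -1]                   # eigenvector of the largest eigenvalue
    angle = np.degrees(np.arctan2(major[1], major[0]))
    return angle % 180.0

# An elongated synthetic "fruit" lying along the x-axis:
mask = np.zeros((40, 40), dtype=bool)
mask[18:22, 5:35] = True                     # 4 px tall, 30 px wide
print(round(mask_orientation_deg(mask)))     # ~0 degrees (horizontal)
```

The resulting angle could then feed the path planner as a grasp-approach constraint for the robotic arm.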
A series of early apple bud images were acquired with two image acquisition systems, and a YOLOv4 model was developed to detect the buds in the tree canopies.</p><br /> <p>Expected impact(s): Machine vision systems are being utilized extensively in agricultural applications. Daytime imaging in outdoor field conditions presents challenges such as variable lighting and color inconsistencies due to sunlight. Motion blur can occur due to vehicle movement and vibrations from ground terrain. A camera system with active lighting can be a solution to overcome these challenges. The developed on-tree apple fruit sizing system with high-resolution stereo cameras and artificial lighting improved fruit-sizing performance compared to manual inspection. Apple fruit size plays an integral role in orchard management decision-making, particularly during chemical thinning, fruit quality assessment, and yield prediction. UAV-based systems for thermal and RGB imaging with machine vision algorithms demonstrated the feasibility of the orchard heating requirement determination methodology, which has the potential to be a critical component of an autonomous, precise frost management system in future studies.</p><br /> <p>&nbsp;</p><br /> <p>Fruit-Based AI Technology (Dr. Daniel Lee)</p><br /> <p>Our team accomplished the following over the past reporting year:</p><br /> <ul style="list-style-type: circle;"><br /> <li>A strawberry plant wetness detection system was developed using color imaging and deep learning for strawberry production. Based on the 2021-22 results, a portable wetness sensor will be designed for use in commercial strawberry fields.</li><br /> <li>A smartphone-based tool was developed to detect and count two-spotted spider mites (TSSM) on strawberry plants. Various deep learning methods were used to detect TSSM, eggs, and predatory mites. 
A portable six-camera sensor device was developed and is currently being tested for detecting TSSM on strawberry and almond leaves.</li><br /> <li>Strawberry bruise and size detection systems for postharvest fruit quality evaluation were developed utilizing machine vision and deep learning. These systems can be used in strawberry packinghouses.</li><br /> </ul><br /> <p>Expected impact(s): The plant wetness detection system could enhance the performance of disease prediction models for strawberry growers in Florida and other parts of the US. The TSSM detection device and tool will increase the efficiency of pest management and thereby increase strawberry yield and profit. The device could be used for other row crops affected by TSSM. The strawberry bruise and size detection system could improve the quality of strawberries.</p><br /> <p>&nbsp;</p><br /> <p>Computer Algorithms for Machine Vision Applications in Agriculture (Yiannis Ampatzidis)<strong><br /></strong></p><br /> <p>Over the reporting year our team accomplished the following:</p><br /> <ul style="list-style-type: circle;"><br /> <li>A strawberry bruise detection system for postharvest fruit quality evaluation was developed utilizing machine vision and deep learning.</li><br /> <li>A disease detection and monitoring system was developed for downy mildew in watermelons utilizing UAV-based hyperspectral imaging and machine learning. This technique was able to classify several severity stages of the disease.</li><br /> <li>A yield and related-traits prediction system was developed for wheat under heat-related stress environments. This high-throughput system utilizes UAV-based hyperspectral imaging and machine learning. 
A yield prediction system was also developed for citrus, utilizing UAV-based multispectral imaging and AI.</li><br /> <li>A system was developed to determine leaf stomatal properties in citrus trees utilizing machine vision and artificial intelligence.</li><br /> <li>A machine-vision-based system was developed to measure pecan nut growth utilizing deep learning for a better understanding of the fruit growth curve.</li><br /> </ul><br /> <p>&nbsp;</p><br /> <p><span style="text-decoration: underline;">Kentucky (University of Kentucky)</span></p><br /> <p>Non-Destructive Testing of Fruit in Food Processing and Manufacturing (<strong>Akinbode A. Adedeji</strong>)</p><br /> <p>Over the reporting year the following was accomplished:</p><br /> <ul style="list-style-type: circle;"><br /> <li>Advanced fundamental understanding of the state of the art in applying nondestructive testing (NDT) approaches (intelligent sensors) to food quality evaluation (meat, surfaces, and apples) by writing review papers on the subject and publishing them in high-impact journals.</li><br /> <li>Advanced the understanding of the application of two sensing methods for qualitative and quantitative assessment of apples and millet cultivars.</li><br /> <li>Developed hyperspectral imaging (HSI) and vibro-acoustic methods for nondestructive testing of apples for codling moth pests. The classification results were well above 90% in both cases on test sets.</li><br /> </ul><br /> <p>&nbsp;</p><br /> <p>Expected impact(s): The timely publication of review papers provides a succinct summary of the current state of knowledge in these areas and a resource for many of our colleagues. One of the papers has received double-digit citations in less than a year. 
Also, some of the results from our work on nondestructive method development will form the foundation for applying these sensing methods in artificial systems (robotics) for implementation in the apple and meat processing industries.</p><br /> <p>&nbsp;</p><br /> <p>Machine Learning Applications in Grape Production (<strong>Carlos M. Rodriguez Lopez</strong>)</p><br /> <p>Over the reporting year the following was accomplished:</p><br /> <ul style="list-style-type: circle;"><br /> <li><strong>Predicting continent and country of origin of vineyard soil samples:</strong> We tested the efficiency of 9 different machine learning models (i.e., Random Forest, AdaBoost, Bernoulli Na&iuml;ve Bayes, Gradient Boosting Machine, Gaussian Na&iuml;ve Bayes, k-NN (k=5), k-NN (k=10), SVM, and Neural Network) to predict the origin of soil samples using freely available next-generation sequencing 16S data from 233 vineyards planted in 5 countries (Australia (n=32), Spain (n=86), Denmark (n=15), Germany (n=10), and USA (n=63)), distributed across 3 different continents (Australia (n=32), Europe (n=138), and North America (n=63)). The accuracy of the tested models in predicting the country of origin varied between 63% and 92%, obtained by the Bernoulli Na&iuml;ve Bayes and the Neural Network models, respectively. 
As expected, continent prediction accuracy was slightly higher and varied between 69% and 94%, obtained by the k-NN (k=10) and the Neural Network models, respectively.</li><br /> </ul><br /> <ul style="list-style-type: circle;"><br /> <li><strong>Planted genotype prediction using microbiome data from vineyard soil samples:</strong> The same ML models enumerated above were used to predict the planted grapevine genotype (cultivar) using freely available next-generation sequencing 16S data from 177 soil samples from vineyards planted with 7 different cultivars in 9 different countries (Cabernet Sauvignon (n=65; planted in Australia, Spain, South Africa, and USA), Tempranillo (n=60; planted in Spain), Syrah/Shiraz (n=12; planted in Australia, Italy, Spain, and South Africa), Chardonnay (n=12; planted in Argentina and USA), Pinot Noir (n=10; planted in Croatia and USA), Riesling (n=10; planted in Germany), and Solaris (n=10; planted in Denmark)). The accuracy of the tested models in predicting the planted cultivar varied between 63% and 81%, obtained by the Bernoulli Na&iuml;ve Bayes and the Neural Network models, respectively. All models, however, showed high levels of variability in their prediction accuracy. We hypothesize that this is due to data imbalance caused by the disparity in the number of datasets between cultivars. To test this hypothesis, we will use synthetic and real datasets generated in-house.</li><br /> </ul><br /> <p>&nbsp;</p><br /> <p>Expected impact(s): The quality of grapes used for wine production has traditionally been associated with the concept of terroir. This concept captures the interaction between the cultivated grapevine variety and the complete natural environment in which a particular wine is produced, including the soil, topography, climate, and the viticultural and oenological practices used to manage the vineyard and during wine production, respectively. Recent studies (e.g. Zhou et al. 
2021) show that the composition, diversity, and function of soil bacterial communities play important roles in determining wine quality, which can indirectly affect its economic value. Two of the main drivers of soil bacterial community composition in vineyards are the environmental conditions (Zhou et al. 2021) and the planted grapevine cultivar, suggesting that terroir is not a unidirectional vector but a feedback loop between the original soil microbial communities, the vineyard environment, and the planted cultivar. Understanding how the environment and the plant genotype interact to alter the soil microbial communities is therefore of paramount importance for the elucidation of the elusive concept of terroir.</p><br /> <p>&nbsp;</p><br /> <p><span style="text-decoration: underline;">Mississippi: Mississippi State U.</span></p><br /> <p>AI Applications in Cotton and Fruit Crops (Alex Thomasson)</p><br /> <p>Over the reporting year our team accomplished the following:</p><br /> <ul style="list-style-type: circle;"><br /> <li>Generation of big data sets:<br /> <ul><br /> <li>Collected and processed hundreds of soil samples from benchmark soil series in Mississippi in summer 2022. These samples are being scanned to collect spectra in order to create a dataset that will be used to develop AI-based soil carbon estimates. (Dr. Nuwan Wijewardane)</li><br /> <li class="x_xxmsolistparagraph">Collected and processed hundreds of images of weeds that are common competitors in cotton crops. These images are being used to develop AI models that can enable real-time detection of weeds for spot spraying in cotton crops. (Dr. Yuzhen Lu)</li><br /> </ul><br /> </li><br /> <li>Development of AI-based models for natural resources applications:<br /> <ul><br /> <li class="x_xxmsolistparagraph">Used AI on multisource data to forecast groundwater levels in the Mississippi River Valley Alluvial Aquifer. (Drs. 
Joel Paz and Mary Love Tagert)</li><br /> </ul><br /> </li><br /> </ul><br /> <ul style="list-style-type: circle;"><br /> <li>Development of AI for image-based detection and classification in the following applications:<br /> <ul><br /> <li class="x_xxmsolistparagraph">Cranberry fruit at different maturity levels for soft robotic harvesting. (Dr. Xin Zhang)</li><br /> <li class="x_xxmsolistparagraph">Treatment of herbicide-resistant weeds in real time, directing an automated tillage implement to reduce herbicide usage and prevent the unnecessary soil moisture loss that occurs with whole-field tillage. (Dr. Wes Lowe)</li><br /> <li class="x_xxmsolistparagraph">Separation of plastic contaminants from cotton fiber. (Filip To)</li><br /> <li class="x_xxmsolistparagraph">Locating cotton bolls on plants for robotic harvesting. (Dr. Alex Thomasson)</li><br /> <li class="x_xxmsolistparagraph">Predicting the yield of cotton plants from early-season multisource data including drones. (Dr. Alex Thomasson)</li><br /> </ul><br /> </li><br /> <ul type="circle"><br /> <li class="x_xxmsolistparagraph">Plastic rubbish in cotton fields before harvest, with data from drones. 
(Dr. Alex Thomasson)</li><br /> <li class="x_xxmsolistparagraph">Volunteer cotton plants in corn and sorghum fields. (Dr. Alex Thomasson)</li><br /> </ul><br /> </ul><br /> <p>Expected impact(s): Our research is expected to generate impacts in three major areas in the development and application of AI in agriculture:</p><br /> <ol><br /> <li>Generation of big data sets and the enhanced decision-making capacity that will unfold from them.</li><br /> <li>Modeling for natural resource applications to provide stakeholders with more refined, accurate, and wider-scoped information from which policy and business decisions can be made.</li><br /> <li>The marked advancements in image detection and classification, expanded to new applications, will continue to be adopted by producers as a practical tool for real-time automated decision-making and applications. For example, we have shown that AI can be used for real-time detection of (a) contaminants in cotton fiber, (b) cotton bolls for robotic harvesting, (c) plastic rubbish in cotton fields that can contaminate cotton fiber, and (d) cotton plants that have germinated from seed left in fields at harvest during the prior season, which can serve as a host for pernicious insects. 
Such improved detection will generate greater productivity and enhanced profits for producers in the Southern region.</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p><span style="text-decoration: underline;">North Carolina: North Carolina State (Edward Kick)</span></p><br /> <p>Our research accomplished the following over the past reporting year:</p><br /> <ul style="list-style-type: circle;"><br /> <li>Collected and merged large data sets from published sources such as the World Bank, the United Nations, and the FAO. Approximately 50 variables were coded using R and Python into an Excel file. The data set was cleaned using R and SPSS; missing data were located, coded, and cleaned using R.</li><br /> <li>Descriptive data analysis was undertaken to identify miscodes, means, standard deviations, etc. Data transformations, such as log transformations for skewed data, were applied as necessary. Replacement data were located as needed, coded, and cleaned in a new Access data file.</li><br /> <li>A total of 120 hours of regression analyses were undertaken using structural equation modeling (SEM) in AMOS. SEM permits tests of hypothesized linkages among all variables, thereby showing the strength of a series of direct and indirect effects. This helps the researchers avoid multicollinearity, which otherwise would compromise the estimations and essentially provide inaccurate inferences. Preliminary results indicate that industrial agriculture results in unsatisfactory consequences for food production. The blind faith that accompanies its usage is seriously questioned for the 134 nations examined. Results and conclusions are published in an agricultural journal. The next set of even more sophisticated results is under examination and very likely to be published in the Swiss journal Sustainability. 
These findings are corroborated by Carolan (2016: 112-115).</li><br /> <li>A preliminary literature review of the artificial intelligence and agriculture literature was undertaken for one section of our recently supported Multistate request.</li><br /> </ul><br /> <p>Expected impact(s): There are said to be four or five nations in the world that use the model of agricultural production that guided much of the Green Revolution (GR). The GR is lauded for substantially increasing production of essential agricultural products such as wheat, which helped millions of starving persons in the mid-1900s. However, meta-analyses clearly show the many negative impacts of agricultural production on communities. Our intensive investigation of 134 countries further shows that industrial agriculture has not improved undernourishment in the modern world, and in fact, it has contributed substantially to the degradation of our global environment through production and release. Eco-agricultural farming promises to be a superior alternative, particularly when it is carefully coupled with artificial intelligence. We plan to investigate the attitudes of farming communities, those with a substantial component of farming as we once knew it, to ascertain their views of both Eco-agriculture as explained by us, and artificial intelligence. We have already gathered the base data on communities in every corner of the United States. We have begun to analyze the BIG DATA, which will guide us in the selection of communities for examination.</p><br /> <p>&nbsp;</p><br /> <p><span style="text-decoration: underline;">Oklahoma: Oklahoma State (Brian Arnall, Tyson Ochsner, Jason Warren, and Jeff Vitale)</span></p><br /> <p>Research over the past reporting year achieved the following:</p><br /> <ul style="list-style-type: circle;"><br /> <li>Developed and implemented an AI algorithm on a data set of winter wheat from nitrogen studies that have continued for 15 years. 
Data includes yield, protein, NDVI, soil characteristics, and Mesonet weather and soil data. The algorithm generates predictions of yield and various plant growth characteristics.</li><br /> <li>Developed a machine learning model to predict the movement of the sugarcane aphid on Oklahoma sorghum fields.</li><br /> <li>Developed and applied neural network models for nondestructive estimation of aboveground biomass in rangelands and for high-resolution mapping of soil moisture across heterogeneous land covers.</li><br /> <li>Collected aerial sensor data on wheat field trials at the Perkins Experiment Station.</li><br /> </ul><br /> <p>&nbsp;</p><br /> <p><span style="text-decoration: underline;">South Carolina: Clemson University</span></p><br /> <p>Our research over the reporting year completed two successive years of hyperspectral data collection of peach leaves at one of Clemson's research centers, the Piedmont Research and Education Center. The data collection started in 2020 and continued until late 2021. Before the pandemic, our team and collaborators from CSIC, Spain, collected hyperspectral images at the same peach orchards to determine the effects of silicon applications on peach trees and water stress. Young, full-sized leaves with petioles attached were picked from the trees with high and medium K concentrations (45 leaves), while 50 were selected from the peach trees with low K concentrations due to their size. The center's research staff working on the same plot designated the plots as high, medium, and low K concentration. The leaves were collected from the midpoint near the base of each tree. The mature leaf samples were then grouped into low, mid, and high K. A snapshot hyperspectral camera was used to scan each leaf of each group before sending it to the Agricultural Services Laboratory for plant tissue analysis. The spectral data were preprocessed using a calibration panel. 
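One of the pretreatments applied to spectra like these, Standard Normal Variate (SNV), normalizes each spectrum by its own mean and standard deviation to reduce scatter effects. A minimal sketch with synthetic spectra (not the Clemson peach-leaf data):

```python
# Standard Normal Variate (SNV) pretreatment: center and scale each spectrum
# by its own mean and standard deviation. Spectra here are synthetic examples.
import numpy as np

def snv(spectra: np.ndarray) -> np.ndarray:
    """Apply SNV row-wise to a (n_samples, n_bands) array of spectra."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

spectra = np.array([[0.10, 0.20, 0.30],
                    [0.50, 0.70, 0.90]])
corrected = snv(spectra)
# After SNV, every spectrum has mean 0 and standard deviation 1.
```

The corrected spectra would then be passed to a PLS regression, as in the model-per-pretreatment comparison described here.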
Four pretreatment methods (Multiplicative Scatter Correction, Savitzky-Golay first derivative, Savitzky-Golay second derivative, and Standard Normal Variate) were applied to the calibrated raw data, and partial least squares (PLS) regression was used to develop a model for each pretreatment.</p><br /> <p>Expected impact(s): The impact of the K prediction project on peach trees is twofold: (1) it helps determine the spectral signature where K can be predicted, and (2) it shows that pretreatment methods significantly improve the development of PLS models. The results of this work open the possibility of developing a miniaturized K detector that uses only the essential bands. It will also facilitate the development of sensors specific to K detection, which will be cheaper for farmers to use.</p><br /> <p>&nbsp;</p><br /> <p><span style="text-decoration: underline;">Tennessee: University of Tennessee</span></p><br /> <p>The work conducted by Tennessee focused on the development of AI technology for the improvement of livestock and poultry health and welfare. Our research team developed and evaluated a camera system, in the lab and on broiler farms, for automated gait score assessment. The outcome of this project is an automated tool that helps broiler farmers identify lameness in broilers early. It provides timely information for broiler farmers to improve their farm management practices for better animal welfare and higher production. The system has been evaluated at multiple research and commercial broiler farms in the U.S.</p><br /> <p>Expected impact(s): In this project, a computer vision system was developed to provide an automated assessment of broiler gait scores in commercial farms. The system was low cost, required minimal maintenance, and was designed to be used on most commercial broiler farms. The potential impact of the research is to provide farmers with an automated tool for accurate and timely broiler welfare evaluation.
It will lead to improvements in farm management and, in turn, in animal welfare and health. It will eventually help improve animal production and bring economic benefits to U.S. agriculture and food systems.</p><br /> <p>&nbsp;</p><br /> <p>&nbsp;</p><br /> <p><span style="text-decoration: underline;">Texas: Texas A&amp;M (Juan Landivar)</span></p><br /> <p>1. Crop Phenotyping: Cotton and Wheat</p><br /> <p>Cotton: We developed an Unmanned Aircraft System (UAS) based High Throughput Phenotyping (HTP) pipeline that can measure temporal growth and spatial development parameters for cotton. The system includes a big data management system (CottonHub) that facilitates UAS data management (search, upload, and download), generates geospatial data products for visualization purposes, extracts plant growth features, and communicates with users and cooperators. The system includes tools to perform growth analysis of the experimental units or genotypes and extract approximately twelve growth parameters depicting the characteristics and performance of the genotypes in field conditions.</p><br /> <p>Wheat: We are participating in a WheatCAP grant as part of a team of 19 wheat programs in the USA. Our work involves developing a standardized UAS data collection protocol for high-quality data collection, training the WheatCAP team members in UAS data collection, managing and processing the UAS data collected by the national wheat breeding programs, providing visualization tools to the WheatCAP team members, and extracting plant growth features to select elite germplasm.</p><br /> <p>Expected impact(s):</p><br /> <p>The phenotyping systems described above are used by cotton and wheat breeders to manage, display, and analyze phenotypic features of experimental genotypes. The system is included in the NSF grant led by Drs. Hombing Zhang, Wayne Smith, and Steve Hague (Texas A&amp;M University cotton breeders), and other breeders from across the Cotton Belt.
The cotton UASHub has the potential to save cotton breeders as much as 60% of the cost of collecting phenotype and yield data from field plots. Similarly, the WheatCAPHub (https://wheatcap.uashubs.com) is part of a USDA-NIFA grant (funded, $15M) led by Dr. Amir Ibrahim (Texas A&amp;M wheat breeder). Our contribution has the potential to deliver similar cost reductions to wheat breeders, in addition to facilitating communication among scientists from across the country.</p><br /> <p>&nbsp;</p><br /> <p>2. Testbed Data Management and Uses</p><br /> <p>We completed four years (2019 to 2022) of UAS data collection in a large (approximately 100 acres) commercial cotton field located in Driscoll, Texas, as a testbed. RGB and multispectral data were collected on a weekly basis from the UAS. The collected data were processed to generate fine spatial and high temporal resolution orthomosaics and Digital Surface Models (DSM). The testbed fields were divided into grids of 10 x 10 m (100 m2), resulting in a total of approximately 4,000 grids within the testbed. Plant features extracted from each grid include Canopy Cover (%), Plant Height (m), Canopy Volume (m3), and vegetation indices (Excess Greenness Index and NDVI). Seedcotton yield was obtained from the yield monitoring system of the cotton harvester. The data were used to develop Digital Twin models for in-season crop management and to obtain an early-season estimate of cotton yield for marketing purposes.</p><br /> <p>Expected impact(s): The digital twin system developed from the testbed data described above was used to estimate crop termination time and defoliant rates during the 2021 and 2022 seasons. Canopy Cover data were used to create management zones for the precision application of defoliants. The system accurately estimated the time of defoliation as early as one month before the event. The digital twin model estimated seedcotton yield within 5% of the actual yield approximately a month before harvest.
It is expected that the prescription crop management package, along with the in-season yield forecasting capabilities developed from the testbed data, will optimize production cost, yield, and fiber quality. Having earlier and more accurate forecasts of pre-harvest yield would be extremely useful in facilitating forward selling from growers to merchant buyers. Thus, a reduction in production cost along with enhanced marketing strategies could improve the profitability of cotton production by approximately 10%.</p><br /> <p>3. Extension &amp; Outreach</p><br /> <p>Cotton cultivar tests are crucial educational activities of research and extension agronomists, but their establishment, maintenance, and data collection are time-consuming and costly. Although the information generated is valuable, the field plots are seldom visited by producers. This project proposes to bring the field plots to producers via a web-based platform designed to analyze and visualize the growth characteristics of cultivars and to make comparisons between entries. A cotton cultivar data management web-based hub (CultivarHub) was developed to facilitate communication between extension personnel (crop specialists and county agents) and producers.</p><br /> <p>Expected impact(s):</p><br /> <p>The impact of the CultivarHub is twofold: (1) it helps cotton specialists manage, analyze, and summarize data from cultivar, agrochemical, or irrigation evaluation trials, and (2) it facilitates the communication and educational outreach of extension specialists, county agents, and IPM agents with producers. It is expected that this web-based platform can increase the outreach and technical communication efficiency of extension personnel by 60%.</p><br /> <p>&nbsp;Extension presentation(s):</p><br /> <p>Landivar J, Mahendra Bhandari, Josh McGinty, Murilo Maeda, Jose Landivar, Hend Alkittawi, Anjin Chang, Daniel Gonzales. 2022.
UAS-Based Platform for Evaluating the Performance of Cotton Cultivars for Research and Outreach. 2022 Beltwide Cotton Conferences, San Antonio, Texas. January 7, 2022.</p><br /> <p>&nbsp;</p><br /> <p>&nbsp;</p>
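As an illustration of the grid-level feature extraction described in the testbed section above, the sketch below computes Excess Greenness (ExG) and NDVI maps from image arrays and averages them over square grid cells. This is a minimal sketch, not the CottonHub pipeline; the function names, band layout, and pixel-based grid size are assumptions for illustration.

```python
import numpy as np

def excess_greenness(rgb):
    """Excess Greenness Index (ExG = 2g - r - b) on chromatic coordinates."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2 * g - r - b

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from multispectral bands."""
    nir, red = nir.astype(float), red.astype(float)
    denom = nir + red
    denom[denom == 0] = 1.0
    return (nir - red) / denom

def grid_means(index_map, grid_px):
    """Average an index over non-overlapping grid_px x grid_px cells,
    analogous to the 10 x 10 m grids described above (grid_px depends
    on the ground sampling distance of the orthomosaic)."""
    h, w = index_map.shape
    h, w = h - h % grid_px, w - w % grid_px  # crop to a whole number of cells
    blocks = index_map[:h, :w].reshape(h // grid_px, grid_px, w // grid_px, grid_px)
    return blocks.mean(axis=(1, 3))
```

Per-grid means of such indices, together with plant height and canopy volume from the DSM, form the feature table that downstream models consume.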

Publications

<p>Alabama: Auburn University&nbsp;</p><br /> <p>Citation of the conference proceedings paper:&nbsp;Oliveira, M.F., F.M. Carneiro, M. Thurmond, M.D. Del val, L.P. Oliveira, B. Ortiz, A. Sanz-saez, D. Tedesco. 2022.&nbsp;Predicting Below and Above Ground Peanut Biomass and Maturity Using Multi-target Regression. In Proceedings of the&nbsp;2022 International Conference on Precision Agriculture. June 26-29, 2022, Minneapolis.</p><br /> <p>&nbsp;</p><br /> <p>Florida: University of Florida</p><br /> <p>Yuan, W., Choi, D., Bolkas, D., Heinemann, P.H. and He, L., 2022. Sensitivity examination of YOLOv4 regarding test image distortion and training dataset attribute for apple flower bud classification. International Journal of Remote Sensing, 43(8), pp.3106-3130.</p><br /> <p>Yuan, W., Choi, D., &amp; Bolkas, D. (2022). GNSS-IMU-assisted colored ICP for UAV-LiDAR point cloud registration of peach trees. Computers and Electronics in Agriculture, 197, 106966.</p><br /> <p>Zahid, A., Mahmud, M.S., He, L., Schupp, J., Choi, D. and Heinemann, P., 2022. An Apple Tree Branch Pruning Analysis. HortTechnology, 32(2), pp.90-98.</p><br /> <p>Zhang, H., He, L., Di Gioia, F., Choi, D., Elia, A. and Heinemann, P., 2022. LoRaWAN based internet of things (IoT) system for precision irrigation in plasticulture fresh-market tomato. Smart Agricultural Technology, 2, p.100053.</p><br /> <p>Mahmud, M.S., Zahid, A., He, L., Choi, D., Krawczyk, G. and Zhu, H., 2021. LiDAR-sensed tree canopy correction in uneven terrain conditions using a sensor fusion approach for precision sprayers. Computers and Electronics in Agriculture, 191, p.106565.</p><br /> <p>Patel, A., W. S. Lee, N. A. Peres, and C. W. Fraisse. 2021. Strawberry plant wetness detection using computer vision and deep learning. Smart Agricultural Technology 1, 2021, 100013, ISSN 2772-3755, https://doi.org/10.1016/j.atech.2021.100013</p><br /> <p>Yun, C., H.-J. Kim, C.-W. Jeon, M. Gang, W. S. Lee, and J. G. Han. 2021.
Stereovision-based ridge-furrow detection and tracking for auto-guided cultivator. Computers and Electronics in Agriculture 191, 2021, 106490, ISSN 0168-1699, https://doi.org/10.1016/j.compag.2021.106490.</p><br /> <p>Puranik, P., W. S. Lee, N. Peres, F. Wu, A. Abd-Elrahman, and S. Agehara. 2021. Strawberry flower and fruit detection using deep learning for developing yield prediction models. In the Proceedings of the 13th European Conference on Precision Agriculture (ECPA), July 19-22, 2021, Budapest, Hungary.</p><br /> <p>Zhou, X., W. S. Lee, Y. Ampatzidis, Y. Chen, N. Peres, and C. Fraisse. 2021. Strawberry maturity classification from UAV and near-ground imaging using deep learning. Smart Agricultural Technology 1, 2021, 100001, ISSN 2772-3755, https://doi.org/10.1016/j.atech.2021.10000</p><br /> <p>Costa L., McBreen J., Ampatzidis Y., Guo J., Reisi Gahrooei M., Babar A., 2021. Using UAV-based hyperspectral imaging and functional regression to assist in predicting grain yield and related traits in wheat under heat-related stress environments for the purpose of stable yielding genotypes. Precision Agriculture, 23 (2), 622-642.</p><br /> <p>Costa L., Ampatzidis Y., Rohla C., Maness N., Cheary B., Zhang L., 2021. Measuring pecan nut growth utilizing machine vision and deep learning for the better understanding of the fruit growth curve. Computers and Electronics in Agriculture, 181, 105964, <a href="https://doi.org/10.1016/j.compag.2020.105964">doi.org/10.1016/j.compag.2020.105964</a>.</p><br /> <p>Costa L., Archer L., Ampatzidis Y., Casteluci L., Caurin G.A.P., Albrecht U., 2021. Determining leaf stomatal properties in citrus trees utilizing machine vision and artificial intelligence. Precision Agriculture 22, 1107-1119, <a href="https://doi.org/10.1007/s11119-020-09771-x">https://doi.org/10.1007/s11119-020-09771-x</a>.</p><br /> <p>Nunes L., Ampatzidis Y., Costa L., Wallau M., 2021. 
Horse foraging behavior detection using sound recognition techniques and artificial intelligence. Computers and Electronics in Agriculture, 183, 106080, <a href="https://doi.org/10.1016/j.compag.2021.106080">doi.org/10.1016/j.compag.2021.106080</a>.</p><br /> <p>Vijayakumar V., Costa L., Ampatzidis Y., 2021. Prediction of citrus yield with AI using ground-based fruit detection and UAV imagery. 2021 Virtual ASABE Annual International Meeting, July 11-14, 2021, 2100493, doi:10.13031/aim.202100493.</p><br /> <p>Zhou, C., W. S. Lee, O. E. Liburd, I. Aygun, J. K. Schueller, and I. Ampatzidis. 2021. Smartphone-based tool for two-spotted spider mite detection in strawberry. ASABE Paper No. 2100558. St. Joseph, MI.: ASABE.</p><br /> <p>Zhou, X., Y. Ampatzidis, W. S. Lee, and S. Agehara. 2021. Postharvest strawberry bruise detection using deep learning. ASABE Paper No. 2100458. St. Joseph, MI.: ASABE.</p><br /> <p>Influence of Planting Date, Maturity Group, and Harvest Timing on Soybean (Glycine max (L.)) Yield and Seed Quality, PRISCILA CAMPOS, DONNIE MILLER, JOSH COPES, MELANIE NETTERVILLE, SEBE BROWN, TREY PRICE, DAVID MOSELEY, THANOS GENTIMIS, PETERS EGBEDI, RASEL PARVEJ (Accepted by Crop, Forage, &amp; Turfgrass Management, Summer 2022). In this paper, modern methodologies were implemented in the analysis of the results, as well as more traditional statistical techniques.</p><br /> <p>The Time of Day Is Key to Discriminate Cultivars of Sugarcane upon Imagery Data from Unmanned Aerial Vehicle, BARBOSA J&Uacute;NIOR, M.R.; TEDESCO, D.; CARREIRA, V.S.; PINTO, A.A.; MOREIRA, B.R.A.; SHIRATSUCHI, L.S.; ZERBATO, C.; SILVA, R.P., Drones 2022, 6, 112. https://doi.org/10.3390/drones6050112</p><br /> <p>UAVs to Monitor and Manage Sugarcane: Integrative Review, BARBOSA J&Uacute;NIOR, M.R.; MOREIRA, B.R.A.; BRITO FILHO, A.L.; TEDESCO, D.; SHIRATSUCHI, L.S.; SILVA, R.P., Agronomy 2022, 12, 661.
https://doi.org/10.3390/agronomy12030661</p><br /> <p>Predicting Days to Maturity, Plant Height, and Grain Yield in Soybean: A Machine and Deep Learning Approach Using Multispectral Data, TEODORO, P. E.; TEODORO, L. P. R.; BAIO, F. H. R.; SILVA JUNIOR, C. A.; SANTOS, R. G.; RAMOS, A. P. M.; PINHEIRO, M. M. F.; OSCO, L. P.; GONCALVES, W. N.; CARNEIRO, A. M.; MARCATO JUNIOR, J.; PISTORI, H.; SHIRATSUCHI, L. S., Remote Sensing, v. 13, p. 4632, 2021</p><br /> <p>Comparison of Machine Learning Techniques in Cotton Yield Prediction Using Satellite Remote Sensing, MORELLI-FERREIRA, F.; MAIA, N.J.C.; TEDESCO, D.; KAZAMA, E.H.; MORLIN CARNEIRO, F.; SANTOS, L.B.; SEBEN JUNIOR, G.F.; ROLIM, G.S.; SHIRATSUCHI, L.S.; SILVA, R.P. Preprints 2021, 2021120138 (doi: 10.20944/preprints202112.0138.v2). Published and in preparation for Remote Sensing.</p><br /> <p>&nbsp;</p><br /> <p>&nbsp;</p><br /> <p>Kentucky: University of Kentucky</p><br /> <p>Ekramirad, N., Khaled, Y.A., Doyle, L., Loeb, J., Donohue, K.D., Villanueva, R., and <strong>Adedeji, A.A.</strong> (2022). Nondestructive detection of codling moth infestation in apples using pixel-based NIR hyperspectral imaging with machine learning and feature selection. <em>Foods</em> 11(8), 1 - 16. (Citation: 2)</p><br /> <p>Rady, A., Watson, N., and <strong>Adedeji, A.A. </strong>(2021). Color imaging and machine learning for adulteration detection in minced meat. <em>Journal of Agriculture and Food Research </em>6(100251), 1-11. (Citation: 1)</p><br /> <p>Watson, N.J., Bowler, A.L., Rady, A., Fisher, O.J., Simeone, A., Escrig, J., Woolley, E., and <strong>Adedeji, A.A</strong>. (2021). Intelligent sensors for sustainable food and drink manufacturing. <em>Frontiers in Sustainable Food Systems.</em> 5, 642786. (Citation: 5)</p><br /> <p>Ekramirad, N., Al Khaled, Y.A., Donohue, K., Villanueva, R., Parrish, C.A., and <strong>Adedeji, A</strong>. (2021). 
Development of pattern recognition and classification models for the detection of vibro-acoustic emissions from codling moth infested apples. <em>Postharvest Biology and Technology </em>181, 111633. (Citation: 1)</p><br /> <p>Khaled, Y.A., Parrish, C. and <strong>Adedeji, A.A</strong>. (2021). Emerging nondestructive approaches for meat quality and safety evaluation. <em>Comprehensive Reviews in Food Science and Food Safety. </em>20(4): 3438-3463. (Citation: 15)</p><br /> <p>&nbsp;</p><br /> <p>Mississippi: Mississippi State University</p><br /> <p>Chen, D., Lu, Y., Li, Z., and Young, S.&nbsp; 2022. &nbsp;Performance evaluation of deep transfer learning on multi-class identification of common weed species in cotton production systems.&nbsp;&nbsp;<em>Computers and Electronics in Agriculture</em>.</p><br /> <p>Yadav, P.K., Thomasson, J.A., Hardin, R.G., Searcy, S.W., Braga-Neto, U., Popescu, S.C., Martin, D.E., Rodriguez III, R., Meza, K., Enciso, J. and Solorzano, J.&nbsp; 2022. Volunteer cotton plant detection in corn field with deep learning. &nbsp;In Proc.&nbsp;<em>Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VII</em>&nbsp;(Vol. 12114, pp. 15-22). SPIE.</p><br /> <p>&nbsp;</p><br /> <p>North Carolina: NC State&nbsp;&nbsp;</p><br /> <p>Edward L. Kick, Laura McKinney, Steve McDonald, Andrew Jorgenson. &ldquo;A Multiple-Network Analysis of the World System of Nations&rdquo; is printed in Chinese and is used in what we hope will be a forthcoming publication in Sustainability. That paper introduces the possibility of using artificial intelligence in farming around the world.</p><br /> <p>Edward L. Kick, Gregory Fulkerson, and Ahad Pezeshkpoor. &ldquo;Agriculture, Grains, and Beef Production: Remedies for Food Insecurity and the Ecological Footprint When the Cataclysm Comes?&rdquo; Agricultural Research and Technology 23: 53-57.</p><br /> <p>Edward L.
Kick &ldquo;Cross-National Empirical Studies of Sustainability, Agriculture and the Environment: Cumulating Forward or Erring in an About Face?&rdquo; Agricultural Research and Technology 25: 601-603.</p><br /> <p>Edward L. Kick. &ldquo;Taking a World View&rdquo;. College of Agriculture and Life Sciences Newsletter.</p><br /> <p>Edward L. Kick and Ahad Pezeshkpoor. &ldquo;Biomes, World-System Positions, and National Characteristics as Linked Precursors to Global Undernourishment and the Ecological Footprint&rdquo; Under revision for publication in Sustainability.</p><br /> <p>&nbsp;</p><br /> <p>&nbsp;</p><br /> <p>South Carolina: Clemson University</p><br /> <p>Abenina MIA, Maja JM, Cutulle M, Melgar JC, Liu H. Prediction of Potassium in Peach Leaves Using Hyperspectral Imaging and Multivariate Analysis. AgriEngineering. 2022;4(2):400-413. https://doi.org/10.3390/agriengineering4020027</p><br /> <p class="x_MsoNormal">&nbsp;</p><br /> <p class="x_MsoNormal">Tennessee: University of Tennessee</p><br /> <p>Nasiri, A., Yoder, J., Zhao, Y., Hawkins, S., Prado, M., &amp; Gan, H. (2022). Pose estimation-based lameness recognition in broiler using CNN-LSTM network. Computers and Electronics in Agriculture, 197, 106931.</p><br /> <p>&nbsp;</p><br /> <p>Texas: Texas A&amp;M</p><br /> <p>Bhandari, M.; Baker, S.; Rudd, J. C.; Ibrahim, A. M. H.; Chang, A.; Xue, Q.; Jung, J.; Landivar, J.; Auvermann, B. Assessing the Effect of Drought on Winter Wheat Growth Using Unmanned Aerial System (UAS)-Based Phenotyping. Remote Sens. 2021, 13 (6). https://doi.org/10.3390/rs13061144.</p><br /> <p>W. Wu, S. Hague, J. Jung, A. Ashapure, M. Maeda, A. Maeda, A. Chang, D. Jones, J.A. Thomasson, J. Landivar, "Cotton row spacing and Unmanned Aerial Vehicle sensors," Agronomy Journal, https://doi.org/10.1002/agj2.20902, 2021</p><br /> <p>A. Chang, J. Jung, J. Landivar, J. Landivar, B. Barker, R.
Ghosh, "Performance evaluation of parallel structure from motion (SfM) processing with public cloud computing and an on-premise cluster system for UAS images in agriculture," International Journal of Geo-Information, 10, 677, https://doi.org/10.3390/ijgi10100677, 2021</p><br /> <p>S. Oh, A. Chang, A. Ashapure, J. Jung, N. Dube, M. Maeda, D. Gonzalez, J. Landivar, "Plant Counting of Cotton from UAS Imagery Using Deep Learning-Based Object Detection Framework", Remote Sensing, 12(18):2981, DOI: 10.3390/rs12182981, 2020</p><br /> <p>M. Bhandari, A. Ibrahim, Q. Xue, J. Jung, A. Chang, J. Rudd, M. Maeda, N. Rajan, H. Neely, J. Landivar, "Assessing winter wheat foliage disease severity using aerial imagery acquired from small Unmanned Aerial Vehicle (UAV)", Computers and Electronics in Agriculture, 176:105665, DOI: 10.1016/j.compag.2020.105665, 2020</p><br /> <p>Landivar J., J. Jung, A. Ashapure, M. Bhandari, M. Maeda, J. Landivar, A. Chang and D. Gonzalez. 2021. In-Season Management of Cotton Crops Using &ldquo;Digital Twins&rdquo; Models. ASA-CSSA-SSSA International Annual Meeting, Salt Lake City, UT, November 9-11, 2021.</p><br /> <p>Landivar J, M. Maeda, A. Chang, J. Jung, J. McGinty, C. Bednarz, 2021. "Estimating the time and rate of harvest aid chemicals using an Unmanned Aircraft System," 2021 Beltwide Cotton Conferences, Online Conference, January 5 - 7, 2021</p><br /> <p>Ashapure A., J. Jung, A. Chang, S. Oh, J. Yeom, M. Maeda, A. Maeda, N. Dube, J. Landivar, S. Hague, W. Smith, 2020. "Developing a machine learning based cotton yield estimation framework using multi-temporal UAS data", ISPRS Journal of Photogrammetry and Remote Sensing, vol. 169, pp. 180-194.</p><br /> <p>J. Jung, M. Maeda, A. Chang, M. Bhandari, A. Ashapure, J. Landivar, "The potential of remote sensing and artificial intelligence as tools to improve the resilience of agriculture production systems", Current Opinion in Biotechnology, vol. 70, pp. 
15-22, 2021</p><br /> <p>&nbsp;</p><br /> <p>&nbsp;</p>

Impact Statements

  1. Our first year of project activities has had positive impacts on Southern regional agriculture through a diverse range of AI activities as detailed in the Accomplishments section of this report. Our accomplishments include advances in row crop production, fruit production and harvesting, and in food processing and manufacturing.

Date of Annual Report: 07/24/2023

Report Information

Annual Meeting Dates: 04/17/2023 - 04/19/2023
Period the Report Covers: 09/01/2022 - 04/17/2023

Participants

Please see attached below

Brief Summary of Minutes

Accomplishments

Publications

Impact Statements


Date of Annual Report: 09/05/2023

Report Information

Annual Meeting Dates: 08/09/2023 - 08/11/2023
Period the Report Covers: 09/01/2022 - 01/01/2023

Participants

Brief Summary of Minutes

During the meeting, we had the opportunity to discuss various collaborative efforts, which culminated in a shared Google spreadsheet with multiple participants buying in to various projects.


The first day of the conference was dedicated to showcasing the use of AI in agriculture for a specialty crop that is important to Louisiana: sugarcane. The participants visited the John Deere factory in Thibodaux, where they discussed cutting-edge technology initiatives with the technical lead team and toured the facility. The second stop was the LSU Ag Center Sugarcane Station, where the participants saw firsthand the use of AI to improve the sugarcane breeding program in Louisiana. The team there walked us through the whole process and showed how an AI-powered app is fundamental to their new breeding schedule.


The second day of the conference was dedicated to forming possible collaborations, both for joint papers and joint grant writing.


 

Accomplishments

Publications

Impact Statements


Date of Annual Report: 06/11/2024

Report Information

Annual Meeting Dates: 04/15/2024 - 04/17/2024
Period the Report Covers: 04/15/2024 - 04/17/2024

Participants

Juan Landivar
Gary Thompson
Alex Thomasson
Ioannis Ampatzidis
Damian Adams
Jeffrey Vitale
Thanos Gentimis
Mahendra Bhandari

Brief Summary of Minutes

The S1090 midyear meeting took place at the 2024 "AI in Agriculture" conference in College Station, TX. It was called to order at 10 am on Monday, April 15, 2024, by Jeffrey Vitale.


The main speaker for the two-hour meeting was Gary Thompson. Two main topics were discussed. The first was the weak annual reporting that the S1090 project has been submitting. Dr. Thompson presented comments from external reviewers that criticized the lack of detail and insufficient attention to how the project is meeting its objectives. In general, the problem is that participants have been reporting state-level Hatch project accomplishments rather than addressing the multistate objectives. It was decided to devote significant time during the upcoming annual meeting (Aug 8-9, 2024) to drafting the 2024 annual report. This will include having participants present their accomplishments during the meeting in order to filter state-level Hatch activities from the multistate ones. A two-hour time slot has already been scheduled for the upcoming meeting.


The second order of business was to choose the venue for the upcoming 2025 "AI in Agriculture" conference. Prior to the meeting, there was a movement to have Washington State host the meeting. After discussion, it was decided that their participation in S1090 has not been active enough to warrant this choice. After discussing for 45 minutes, it was decided to choose between Mississippi State and North Carolina State as potential hosts for the 2025 conference. It was also decided to form a standing committee to plan the annual conference. The hosts of the upcoming conference will serve as the primary planners, but the committee will also include the hosts for the following year, giving them experience in how to organize and plan the conference.


The meeting was unanimously adjourned at noon. Afterward, the S1090 participants continued to discuss project issues with one another.


 


Introduction


The Annual AI in Agriculture Conference, sponsored by the United States Department of Agriculture (USDA), has rapidly become a cornerstone for advancing the application of artificial intelligence (AI) in the agricultural sector. This annual conference, which began in 2022 in Auburn, has shown significant growth and impact in its brief history. It brings together experts, researchers, and practitioners from multiple land-grant universities across the Southern United States to explore and exchange cutting-edge ideas and technologies that aim to revolutionize agricultural practices through AI. The conference is guided by the S1090 multistate project, AI in Agroecosystems: Big Data and Smart Technology-Driven Sustainable Production, which has grown in parallel with the conference. In the following paragraphs we give an overview of both, explaining their importance and contributions to the field.


Conference History:


2022 Inaugural Conference: Laying the Foundation


The inaugural session of the conference, held in Auburn, laid a solid foundation with 120 participants, marking the beginning of a southern consortium focused on integrating AI into agriculture. The conference highlighted initial research focuses such as predictive analytics for crop and soil management, AI-driven pest control solutions, and automation technologies for improved agricultural productivity. This first meeting was crucial for setting the stage, defining the roadmap, and fostering collaborations among leading land-grant universities. The conference also introduced practical workshops where participants could gain hands-on experience with AI tools and technologies, which were highly appreciated for their immediate applicability.


2023 Expansion: Widening the Scope and Reach


In 2023, the conference was held in Florida and saw a significant increase in participation, to 200 attendees. This session expanded the scope of discussions to include AI applications in climate resilience, water resource management, and sustainable farming practices. It underscored the importance of AI in managing large datasets for real-time decision making and precision agriculture.


2024 Milestone: Record Participation and Diverse Innovations


The latest conference in Texas marked a new milestone with 320 participants, showcasing the growing interest and investment in AI-driven agricultural innovations. The 2024 conference not only addressed advanced AI applications in optimizing farm machinery, enhancing livestock management, and automating food systems but also explored the socio-economic impacts of AI, such as job creation in tech-driven farming and ethical considerations in data usage. This year’s meeting demonstrated significant progress in turning research into actionable solutions that attendees were eager to implement.


Review:


Over the years, the conference has been pivotal in introducing and discussing various AI technologies. Key advancements include the development of AI models that predict crop yields more accurately and earlier in the season, drones equipped with imaging sensors to monitor crop health, and robotic systems for harvesting. The integration of these technologies aims to reduce labor costs, increase precision in pesticide application, and enhance overall farm efficiency.


A notable outcome of these conferences has been the initiation of several collaborative projects and research initiatives. These projects leverage the collective expertise and resources of the participating universities to tackle large-scale challenges that no single institution could handle alone. Topics such as genetic crop improvement, pest migration patterns, and water use efficiency have benefited from such collaborative efforts, driving forward the research frontier in agricultural sciences.


The involvement of industry partners has also been a critical component of the conference’s success. Companies specializing in AI, robotics, and agricultural technology have found the conference to be an excellent platform for showcasing their latest products and for scouting new ideas and talents. This industry-academia partnership is vital for translating research insights into market-ready products and services that can significantly impact the agricultural sector.


Each conference iteration places a strong emphasis on education and workforce development. Through workshops, seminars, and panel discussions, participants, including students and early-career researchers, are educated on the latest AI tools and techniques, ethical issues in AI, and the future of agricultural jobs. These educational activities are crucial for preparing a tech-savvy workforce ready to implement and innovate within the AI-agriculture nexus.


As the conference looks to the future, several challenges and opportunities remain. Key among these is the need to enhance AI interpretability and trust among farmers, integrate more comprehensive data sets for AI models, and address the digital divide that could hinder technology adoption. Future conferences will likely focus more on these aspects, along with continued exploration of AI’s role in climate change mitigation and adaptation in agriculture.


Panels:


Water Resource Management Panel


One of the critical focuses of this year's conference was the Water Resource Management Panel, which convened experts in hydrology, climatology, and agricultural planning. This panel addressed the pressing issues of water scarcity and efficiency in irrigation practices, crucial under the growing strain of climate variability. Innovations discussed included AI-based predictive models for water demand forecasting and optimization algorithms for irrigation systems that significantly reduce water waste while maintaining crop health. The panel also explored the implications of regulatory policies on water resources, emphasizing the need for synergy between technological advancements and sustainable water governance. This session was instrumental in highlighting AI's potential to enhance water use efficiency in agricultural practices, making it a cornerstone for future conferences.


Industry Expert Panel


The Industry Expert Panel brought together leaders from tech giants and startups within the agricultural technology sector to discuss the future trajectory of AI in farming. This panel provided insights into the latest technological advancements, such as machine learning models that improve pest detection and drones that optimize seed planting patterns. Industry representatives shared case studies where AI integration had led to tangible benefits, including increased yields and reduced operational costs. Furthermore, this dialogue fostered a critical discussion on the barriers to adopting these technologies, primarily focusing on the economic and infrastructural challenges faced by the agricultural community. The panel concluded with a commitment to closer collaboration between tech companies and farming professionals to tailor AI solutions that are accessible and beneficial to all farmers.


Farmers' Panel


Perhaps the most impactful session was the Farmers' Panel, which directly involved the end-users of agricultural AI technologies—the farmers. This panel provided a platform for farmers to voice their experiences, concerns, and the practical impact of AI on their farming operations. Topics of discussion included the usability of AI tools in everyday farming activities, the economic impact of AI investments, and the cultural shifts required within farming communities to embrace such technologies. Farmers shared success stories of using AI to enhance crop diagnostics and yield predictions, which have led to better crop management and reduced waste. This session was crucial for technology developers and researchers to receive grounded feedback, ensuring that future AI innovations are user-centric and address the real-world challenges of farmers. The line “AI will not replace a human, but a human with AI will replace a human without one,” coined by one of the panelists, aptly encapsulates the trajectory of our field.


These panels not only highlighted the diverse applications of AI in agriculture but also fostered a multi-stakeholder dialogue that is essential for the holistic adoption of technology in this traditionally conservative field. Each panel, by focusing on different aspects of the agricultural industry, helped to paint a comprehensive picture of the challenges and opportunities presented by AI, guiding the pathway for future research and implementation strategies.


Poster Sessions


The poster sessions at the conference have consistently been a highlight, offering a dynamic forum for both budding and established researchers to display their findings. In its inaugural year in Auburn, the focus of these sessions was largely introductory, designed to educate and inform participants about the fundamental aspects of artificial intelligence in the context of agriculture. Researchers presented on a range of topics from basic AI principles and data handling to preliminary applications in monitoring soil health and crop conditions. These early presentations played a crucial role in setting the educational tone for the conference, helping to align the varied expertise of participants towards a common understanding of AI’s potential in agriculture. They served as a springboard for further exploration and set the groundwork for more advanced applications, facilitating a shared baseline from which all attendees could progress.


By the 2024 conference in Texas, the evolution of the poster sessions mirrored the overall growth and deepening focus of the conference itself. The number of presentations had expanded dramatically, with more than 50 posters illustrating sophisticated uses of AI across a broad spectrum of agricultural needs. These included advanced automation systems that integrate drones and robotic technologies for precision farming, and machine learning models that enhance predictive analytics for crop yield and detect plant diseases early. The range of topics showcased a significant shift towards the implementation of complex AI solutions tailored to specific agricultural challenges, reflecting a move from theoretical to practical, impact-driven research. This maturation in the content offered not only provided new insights and knowledge but also highlighted the practical benefits and improvements AI technologies are beginning to bring to the agricultural sector. The enthusiastic participation and the quality of research presented underscored the vibrant, innovative spirit that defines the S1090 conference, marking it as a seminal event in the field of AI in agriculture.


Invited Speakers


The conference has consistently attracted top-tier talent and experts in the field of AI and agriculture, with invited speakers from globally recognized corporations such as IBM, Microsoft, NVIDIA, John Deere, and more. These industry leaders brought with them insights into the cutting-edge applications of AI technologies that are currently being developed or implemented. For instance, speakers from NVIDIA discussed advancements in GPU-accelerated computing that facilitate deep learning models capable of analyzing vast amounts of agricultural data in real time. Meanwhile, representatives from John Deere showcased the latest in farm machinery automation, which incorporates AI to optimize planting and harvesting operations. The presence of these high-caliber speakers not only elevated the conference’s prestige but also enriched the learning experience for all attendees, providing them with a glimpse into the future of technologically driven agriculture.


The contributions from such esteemed corporate speakers were instrumental in highlighting the innovative, practical applications of AI that their companies are pioneering. A speaker from IBM explored the integration of AI with weather prediction models to improve crop yield predictions and manage risks associated with climate variability. Microsoft experts provided insights into cloud computing infrastructures that support AI algorithms in processing agricultural data more efficiently and securely. These sessions not only offered theoretical knowledge but also practical strategies that participants could consider implementing in their own agricultural practices.


A notable highlight was the return of a speaker from the first conference, who remarked on the substantial growth and evolution of the event over the years. He pointed out that what started as a regional meeting has matured significantly to be considered a national event, indicative of its expanding influence and the critical role it plays in shaping the future of agriculture in the United States. This sentiment was echoed by many participants who appreciated the increasingly diverse topics and the inclusion of more complex discussions surrounding the ethical, economic, and social implications of AI in agriculture. The conference's evolution reflects its success in fostering a comprehensive dialogue that not only addresses the technical aspects of AI but also its real-world impacts, positioning the conference as a pivotal, nationally recognized platform for future innovations in agricultural technologies.


Conclusion


The S1090 Multistate Project and its corresponding AI in Agriculture annual conference have made commendable progress in fostering innovation, collaboration, and education among the southern land-grant universities. As it grows, the conference serves not only as a beacon of knowledge and innovation but also as a catalyst for tangible improvements in agricultural practices through AI. The continued success of this conference promises to usher in a new era of agriculture that is smarter, more efficient, and sustainable, benefiting stakeholders across the spectrum.


Accomplishments

<p>The major accomplishment was putting in place a more organized effort to write annual reports in response to recent feedback.</p>

Publications

Impact Statements

  1. The meeting developed a new protocol for planning the "AI in Agriculture" conference that will strengthen it going forward.
Back to top

Date of Annual Report: 10/02/2024

Report Information

Annual Meeting Dates: 08/07/2024 - 08/09/2024
Period the Report Covers: 10/01/2023 - 09/30/2024

Participants

PARTICIPANTS: The meeting was a hybrid session with both in-person and online participants.

Day 2 Attendance (In person):
Jefferson (Jeff) Vitale
Thanos Gentimis
Yaqing Xu
Robert Strong
Akinbode Adedeji
Lauren Godsmith
Mahendra Bhandari
Karun Kaniyamattam
Gordan Rojan.

Day 2 Attendance (Online):
Nikolay Bliznyuk
Ziteng Xu
Dongyi Wang
Yuzhen Lu
Cindy Morley
Won Suk Daniel Lee
Gary Thompson
Maria Bampasidou
Brent Arnoldussen
Young Chang
Hussein Gharakhani
Katsutoshi Mizuta
Carlos Rodriguez Lopez.

Day 3 Attendance (In person):
Jefferson Vitale
Thanos Gentimis
Yaqing Xu
Robert Strong
Akinbode Adedeji
Mahendra Bhandari
Karun Kaniyamattam
Gordan Rojan
Ali Fares.

Day 3 Attendance (Online):
Carlos Rodriguez Lopez
Yuzhen Lu
Ziteng Xu
Dongyi Wang
Hussein Gharakhani
Daniel Lee
Brent Arnoldussen
Katsutoshi Mizuta.

Brief Summary of Minutes

MINUTES OF THE ANNUAL MEETING


Host: Texas A&M, AgriLife Center, Corpus Christi, TX


Date: August 7 – 9, 2024


The three-day annual meeting commenced on Wednesday, Aug 7, 2024, with the arrival of some of the members. Our first working session was held on Aug 8, 2024. The main points of our meeting this year were:



  1. Day Two was attended by 9 members in person and 13 members online (see minutes for the list).

  2. The representative of the Texas A&M AgriLife Center, Dr. Gary (another Gary), gave a welcome speech to the meeting.

  3. All three objectives in our proposal for 2021 – 2026 were reviewed to gauge the extent of activities and impact across stations and to review progress of work in order to prepare a better progress report this year. This was led by the current chair of the group, Jeff Vitale of Oklahoma State University. Dr. Gary Thompson, S1090 advisor, gave a presentation on our performance vis-à-vis our objectives. Details are in the minutes of the meeting submitted with this report. We ranked GOOD, GOOD, GOOD, and EXCELLENT on the four criteria (Project reporting, Linkages, Funding, and Information & technology transfer, respectively) used for evaluating multistate groups.

    1. A station-report template was developed by the secretary, Bode Adedeji, and improved by all. The template was test-used immediately: three of our members used it to present their station reports, and additional feedback was provided to improve it. It has since been shared with all our members for single-station reports.

    2. It was suggested that we create a hashtag for our multistate group; Robert Strong took on the task of coming up with several options, which he presented to the group.

    3. The group was encouraged to use several outlets (social media, the website, the Southern communicator consortium, a repository serving as a database (Thanos mentioned RoboFollow), etc.) to publicize its activities going forward.

    4. Some critical questions were raised to help focus our efforts, such as: What is our role within the Land Grant System?




The first day of activities ended with a visit to the Digital Twin Lab at Texas A&M AgriLife Center, led by Mahendra, our host.


Day three was attended by 9 members in person and 10 members online (see minutes for the list). The focus was on the April 2024 AI Conference survey summary, the strategy for the next five-year proposal re-write, theme suggestions for the 2025 AI conference, the election of a new secretary, and the venue and date for the 2025 meeting. Details are provided in the minutes. Key highlights include:



  1. A favorable review of the AI conference by the participants. The next AI conference will be hosted March 31 – April 2, 2025, by Mississippi State University.

  2. Six themes were suggested for the 2025 AI conference for the host to consider. See minutes.

  3. Ad hoc committees were formed, and co-leads were selected for each objective. Members were encouraged to volunteer to serve on at least one ad hoc committee.

  4. The decision about the venue for the 2025 annual meeting was left to Yuzhen, the chair-elect. Two options are on the table: Oklahoma State University and Michigan State University. Consideration was given to co-hosting our annual meeting with Yuzhen’s other multistate group, W4009.

  5. Dongyi Wang of University of Arkansas was elected as the new secretary of the group.


The meeting was adjourned at 11:50 am.

Accomplishments

<p><strong>OVERVIEW OF THE GROUP&rsquo;S <span style="text-decoration: underline;">OUTPUTS AND ACCOMPLISHMENTS</span> FOR THE YEAR IN VIEW</strong></p><br /> <p style="text-align: justify;">S1090 is a USDA multistate group founded in 2021 to foster collaboration on artificial intelligence (AI) and digital agriculture across US institutions (land-grant and others) and industry. The group started with several institutions within the US Southeastern conference and has since expanded to other land-grant (and non-land-grant) institutions across the nation, with strong industry collaboration. We are in the third year of our current proposal, which laid out three main objectives intended to foster cross-disciplinary and multistate collaborations among our members. This report documents our activities over the past year (2023/2024), focusing primarily on collaborative research projects across our members&rsquo; stations (including the agricultural machinery and production industry). It highlights key ideas and issues tackled, the number of personnel trained, the funding secured as a result of our collaboration, and the meetings we attended where findings of our research were shared nationally and internationally. The group has also organized three AI conferences since its inception; the most recent was hosted in April of this year in Texas. The report for the conference has been submitted separately.</p><br /> <p>&nbsp;</p><br /> <p><strong>OUTPUTS&nbsp;OF</strong>&nbsp;<strong>PROJECTS BY OBJECTIVE AND STATION</strong></p><br /> <p><strong>OBJECTIVE 1a: </strong>AI tools for crop (agrifood) and animal production</p><br /> <p><strong>Project 1:</strong> Monitoring Nitrogen Stress in Maize (<em>Zea mays</em> L.) under Excessive Moisture with UAV Remote Sensing (under review by <em>Sensors</em>, MDPI) is a paper submitted this year that utilizes AI and remote sensing to improve nitrogen usage in maize. 
<strong>Station: LSU;</strong> Students Trained: 5 MSc.</p><br /> <p><strong>Project 2:</strong> The AI-Based Methodologies for Major Crops in Louisiana grant (established 2023&ndash;24) was awarded to Dr. Gentimis this year by the Louisiana Board of Regents ($154,000); it will explore the use of AI on historical datasets in Louisiana and will fund 2 graduate students. Station: <strong>LSU</strong>; Students Trained: 1 Ph.D., 1 MSc.</p><br /> <p><strong>Project 3:</strong> The Remote Sensing and Artificial Intelligence for Supporting Sugarcane Breeding and Forecasting Sugar Yield grant was established this year in collaboration with John Deere (Thibodeaux, LA; $285,000) and will utilize AI and remote sensing to improve outcomes in sugarcane. We note that the collaboration between John Deere and LSU was strengthened during the multistate conference&rsquo;s 2023 visit to John Deere&rsquo;s facility in Thibodeaux, which provided the initial introductions between Dr. Setiyono, Dr. Gentimis, and the John Deere team. (1 Ph.D.) <strong>Station(s): LSU, John Deere.</strong></p><br /> <p><strong>Project 4:</strong> The paper &ldquo;Application of TensorFlow model for identification of herbaceous mimosa (<em>Mimosa strigillosa</em>) from digital images&rdquo; was published this year by Dr. Setiyono and Dr. Gentimis; it utilizes convolutional neural networks to detect herbaceous mimosa. (5 MSc.)</p><br /> <p><strong>Project 5:</strong> Developed and validated remote sensing tools tailored to assess yield parameters and detect sugarcane yellow leaf disease, sugarcane stalk rots, and Cercospora net blotch in rice on Louisiana farms. <strong>Station(s): LSU, USDA ARS</strong></p><br /> <p><strong>Project 6</strong>: Harnessing UAV and machine learning technologies to promote resilient soybean and corn production is a grant awarded to Dr. Setiyono this year by the Louisiana Soybean and Grain Research Promotion Board ($30,000). 
<strong>Station: LSU.</strong></p><br /> <p><strong>Project 7</strong>: AI-based approaches were developed for optical technologies, especially machine vision, applied to automated sweet potato grading and sorting. This is part of an ongoing <strong>multi-state collaborative research </strong>project funded by USDA Agricultural Marketing Services, involving five universities: <strong>MSU, Mississippi State University, NCSU, UIUC, and LSU</strong>. <strong>One postdoc </strong>and one <strong>PhD</strong> student are being trained on the project.</p><br /> <p><strong>Project 8</strong>: AI models were developed for machine vision-based weed detection and control. To facilitate weed control research efforts, we developed open-source software called OpenWeedGUI, which integrates YOLO models for weed imaging and detection. Moreover, we designed and evaluated an AI-based smart sprayer for precision vegetable weeding. We have received funds from the Michigan Department of Agriculture and Rural Development and are planning to submit a USDA-NIFA proposal to expand and deepen our research this year. <strong>One PhD</strong> student is being trained on the project. <strong>Station: MSU.</strong></p><br /> <p><strong>Project 9</strong>: Harvest decision making is important for blueberry growers to maximize quality and yield. We are developing computer vision and AI tools for automated blueberry counting and harvest maturity estimation, currently under the support of Michigan State University AgBioResearch. We are planning to submit a USDA-NIFA proposal this year. <strong>Station: MSU.</strong></p><br /> <p><strong>Project 10</strong>: We are in partnership with <a href="https://motiongrazer.com/">MotionGrazer AI</a> (a startup company) to develop 3D computer vision and AI-based methods for automated sow body condition estimation and lameness detection. The effort was funded by an NSF STTR Phase I grant. 
We are planning to apply for the Phase II grant to scale up the efforts and make the technology available to swine producers. <strong>Station: LSU.</strong></p><br /> <p><strong>Project 11</strong>: Poultry meat myopathies, such as white striping and woody breast, downgrade quality and cause significant loss to the U.S. poultry industry. AI models were developed for white striping detection in broiler meat using structured-illumination imaging. This work was funded by a USDA-NIFA-AFRI seed grant, initiated at Mississippi State University, and then transferred to MSU. One MS student has finished, and a second MS student is being trained on the project. We are planning to submit a standard NIFA proposal to push the research forward. Institutions involved include <strong>MSU and Mississippi State University.</strong></p><br /> <p><strong>Project 12:</strong> Climate variability has complicated irrigation and disease management in crop production. AI models are being developed to predict crop water stress and plant disease risks. This work was funded by Michigan State University&nbsp;Project GREEEN. The project team is planning to submit a proposal to the USDA NIFA program. <strong>Station: MSU.</strong></p><br /> <p><strong>Project 13:</strong> <span style="text-decoration: underline;">Yiannis Ampatzidis</span>: (1) Development of early pest and disease detection systems for a variety of vegetable crops utilizing AI and remote sensing; (2) Development of an innovative optoelectronic nose for detecting adulteration in quince seed oil. <strong>Station: UF</strong></p><br /> <p><strong>Project 14: </strong><span style="text-decoration: underline;">Nikolay Bliznyuk</span>: In development, as part of the FDACS &ldquo;BMP Phosphorus (P) recommendations&rdquo; project: ML-based prediction of yield of select crops (tomato, potato, and green bean) using P and other nutrients. 
<strong>Station: UF</strong></p><br /> <p><strong>Project 15: </strong><span style="text-decoration: underline;">Won Suk Lee</span>: (1) Strawberry flower and fruit detection toward the development of yield forecasting models (<strong>Station: UF</strong>); (2) Two-spotted spider mite detection using smartphones and a single-camera device (<strong>Station: UF, UC Davis, UC ANR</strong>); (3) Ground- and UAV-based strawberry plant canopy volume detection (UF); (4) Strawberry plant wetness detection using color imaging and AI.</p><br /> <p><strong>Project 16: </strong><span style="text-decoration: underline;">Henry Medeiros</span>: We continue to explore novel computer vision techniques for behavioral studies in animal production facilities. In collaboration with colleagues at the ABE department, we received a seed grant from the University of Florida Institute of Food and Agricultural Sciences Launching Innovative Faculty Teams in AI (LIFT/AI) initiative (PI: D. Hofstetter) to develop computer vision models to analyze the behavior of turkeys. Preliminary results obtained using this grant were presented at the American Society of Agricultural and Biological Engineers Annual International Meeting. <strong>Station: UF</strong></p><br /> <p><strong>Project 17:</strong> The work conducted by Tennessee focused on the development of AI technology for the improvement of livestock and poultry health and welfare, including the development of an automated vision system for broiler welfare behavior assessment. The outcome of this project is an automated tool that gives broiler farmers better insight into animal performance. It provides timely information for broiler farmers to improve their farm management practices for better animal welfare and higher production. <strong>Station: UTK.</strong></p><br /> <p><strong>Project 18:</strong> Developed AI algorithms to analyze data for precision nutrient monitoring and management in the hydroponic production of leafy greens. 
Developed machine learning-based algorithms for predicting calcium deficiency in the hydroponic production of lettuce. <strong>Station: UCDavis.</strong> We presented research outcomes at the 2024 American Society of Agricultural and Biological Engineers Conference in Anaheim, California.</p><br /> <p><strong>Project 19:</strong> Developed a model for forecasting the biomass yield of lettuce production in a hydroponic system under artificial lighting in an indoor vertical farming setup. <strong>Station: UCDavis.</strong></p><br /> <p><strong>Project 20:</strong> Developed a fault detection and diagnosis tool for EC and pH sensors for precision nutrient and water management in hydroponic production. This tool will be critical for identifying faults and anomalies in sensor (EC and pH) readings and ensuring the precise operation of hydroponic production. <strong>Station: UCDavis. </strong>We presented our research findings at the 2023 American Society of Agricultural and Biological Engineers Conference in Omaha, Nebraska.</p><br /> <p><strong>Project 21:</strong>&nbsp; Color is a critical parameter in meat quality evaluation. Objective measurement of meat color saves money, can be easily automated, and increases returns for stakeholders. UArk and UK PIs are working on developing deep learning models for meat color prediction. We submitted a USDA proposal that was not funded and are planning to resubmit during the current cycle (<strong>UK and UArk</strong>).</p><br /> <p><strong>Project 22</strong>: Cross-contamination of grain cultivars is a major issue in postharvest grain processing. Current detection methods are ineffective, lead to waste, and are unable to quantify contamination. 
Nondestructive methods for detecting and quantifying cross-contamination of proso millet cultivars were developed using sensing (hyperspectral imaging) and machine learning. <strong>Station: UK</strong>.</p><br /> <p><strong>Project 23:</strong> Allergen detection is a critical step in food manufacturing, especially where the risk of contamination is high, and noninvasive methods are more effective. We are working on developing a multispectral model based on RGB data that can be deployed in a mobile app for everyday use by consumers and the food industry for allergen detection in foods. <strong>Station: UK</strong>.</p><br /> <p><strong>Project 24:</strong> Evaluation of Traditional and AI-Driven On-farm Trial Data; In-season Diagnosis of Nitrogen and Water Status for Corn Using UAV Multispectral and Thermal Remote Sensing. <strong>Station: UK</strong>.</p><br /> <p><strong>Project 25:</strong> Using agent-based modeling for precision swine nutrition. <strong>Station: TAMU</strong>.</p><br /> <p><strong>Project 26: </strong>Digital Twin System for In-season Crop Growth and Yield Forecasting: A digital twin (DT) framework was developed that uses the phenotypic features of plant growth and development as a virtual replica of plants and uses the data generated from this framework to predict crop growth, management, and yield during the season. <strong>Station: TAMU</strong>.</p><br /> <p><strong>Project 27:</strong> Crop Yield Modeling and Drought Monitoring: As a joint study between Prairie View A&amp;M University and Sam Houston State University, we conducted a series of UAV data collections over a sorghum field located at Prairie View A&amp;M University. These data include high-resolution RGB, thermal, and multispectral data. This collaborative work aims to develop tools for crop yield modeling and drought monitoring and is an outcome of the 2024 AI for Ag conference hosted at Texas A&amp;M. 
<strong>Station: TAMU, Sam Houston State Uni., Prairie View A&amp;M University.</strong></p><br /> <p><strong>Project 28: K-State </strong>is working on verifying and validating robotics and automation in food production. This concept has caught on in the last 10 years but has considerable room for study; such systems must be validated to prove they can accomplish the work needed for growing food, fiber, and fuel.</p><br /> <p><strong>OBJECTIVE 1 B: </strong>AI tools for autonomous system perception, localization, manipulation, and planning for agroecosystems.</p><br /> <p><strong>Project 1:</strong> Developed a dual-laser active scanning camera to generate high-resolution RGB-D imagery and an AI-based robotic solution for advanced food manufacturing. <strong>Station: UArk</strong></p><br /> <p><strong>Project 2:</strong> Developed a novel neural network model (wide-deep learning, SSNet, SCNet) for hyperspectral imaging-based non-invasive bioproduct analysis.&nbsp; <strong>Station: UArk</strong></p><br /> <p><strong>Project 3:</strong> Developed a novel illumination-robust machine learning model to predict human sensory grading based on food appearance under different illumination conditions. The project promotes collaboration with UK on a USDA grant submission. 
<strong>Station: UArk</strong> <strong>and</strong> <strong>UK</strong></p><br /> <p><strong>Project 4: </strong><span style="text-decoration: underline;">Yiannis Ampatzidis</span>: (1) Development of an AI-enhanced smart sprayer for precision weed management in specialty crops (UF, Carnegie Mellon University); (2) Development of an AI-enabled automated needle-based trunk injection system for HLB-affected citrus trees (<strong>Station: UF, UC Davis</strong>); (3) Development of an AI-enabled smart tree crop sprayer using sensor fusion.</p><br /> <p><strong>Project 5: </strong><span style="text-decoration: underline;">Dana Choi</span>: Our project established a comprehensive strategy to address the challenges of agricultural robotics and data collection in strawberry farming by utilizing advanced procedural modeling to generate synthetic plant models, thereby enhancing the quality of datasets beyond the capabilities of traditional methods. We leveraged NVIDIA Omniverse for realistic simulations and integrated these with the Robot Operating System (ROS) for sophisticated data management, employing deep learning models for precise strawberry detection and classification. Additionally, ISAAC SIM&rsquo;s automated labeling system significantly improved the efficiency and accuracy of our training processes. The culmination of these innovations is a robust fruit detection system that not only elevates yield prediction accuracy but also reduces the need for intensive data labeling and curation, thereby cutting costs and streamlining the development of robotics applications. <strong>Station: UF.</strong></p><br /> <p><strong>Project 6: </strong><span style="text-decoration: underline;">Dana Choi</span>: We initiated the "Advancing AI Competency for Agricultural Extension" project with the objective of enhancing AI literacy to promote innovation and productivity in agriculture. This multifaceted program encompasses a variety of key initiatives. 
Through educational outreach, we captivated students at the Ag AI Youth Expo, encouraging them to pursue future careers in agricultural technology. In support of growers, we introduced AI solutions to Florida specialty crop producers, enhancing their understanding of AI systems. As part of our commitment to professional development, we demonstrated future AI applications to IFAS Certified Crop Advisers. Additionally, we conducted a comprehensive AI workshop titled &ldquo;AI Essentials for Extension Professionals&rdquo; for extension faculty members, cultivating an environment conducive to continuous learning and innovation within the community. <strong>Station: UF.</strong></p><br /> <p><strong>Project 7: </strong><span style="text-decoration: underline;">Henry Medeiros</span>: (1) We further developed a multiple object tracking algorithm that detects and tracks plants observed by mobile robotic platforms equipped with video cameras. We also extended the evaluation of the algorithm to additional publicly available datasets and demonstrated its state-of-the-art performance. A manuscript submitted to <em>Computers and Electronics in Agriculture</em> received positive initial comments from the reviewers, and the revised submission is currently under review. <strong>Station: UF.</strong></p><br /> <p><strong>Project 8:</strong> We developed a novel computer vision model that simultaneously performs object detection and association. Rather than using conventional detection association mechanisms based on linear cost assignment methods, our model directly learns the correspondences between detections in a pair of images with partially overlapping fields of view. Given only the bounding boxes in two images, our model learns the associations among detections between two subsequent frames, thus providing all the information needed to track object identities over the video sequence. 
We evaluated our model on publicly available datasets and obtained promising preliminary results, which were presented at the American Society of Agricultural and Biological Engineers Annual International Meeting. <strong>Station: UF.</strong></p><br /> <p><strong>Project 9: </strong>AI algorithms are being incorporated into multiple robotic systems.&nbsp; First, AI-based vision is being developed to control a high-speed robotic arm that is capable of manipulating multiple objects like various fruits simultaneously while consuming less energy than conventional robotic arms.&nbsp; AI-based vision is also being developed to enable a versatile robotic end-effector to manipulate fruit in situ (through rotating and bending), pick it using various methods (pulling, bending, twisting, or combinations), and continuously transfer it to the back of the end-effector.&nbsp; As a specific example, AI-based vision is being developed to measure cotton boll orientation, which affects the performance of robotic cotton harvesting.&nbsp; AI-based vision is also being developed to enable a ground-based robot to detect and collect plastic bags, a significant source of cotton fiber contamination, in cotton fields.&nbsp; In collaboration with USDA-ARS, we developed an AI-based model that estimates moisture content and bulk density in grains, and a similar ongoing project with USDA-ARS focuses on developing an AI-based model to detect diseases from images of crop leaves. The model is being designed to identify at least five different diseases. <strong>Station: MS.</strong></p><br /> <p><strong>Project 10: </strong>We have developed nutrient dosing algorithms for the precision supplying of individual nutrient components (Macro Nutrients) for hydroponic production instead of EC based conventional dosing. 
This research aims to reduce human involvement in monitoring and precisely dosing nutrient components, and ultimately to improve the nutrient and water use efficiency of hydroponic production.&nbsp; The autonomous nutrient dosing tool would also help improve yield, as the potential for nutrient deficiency and toxicity would be minimized. <strong>Stations: UC Davis and Delft University of Technology.</strong></p><br /> <p><strong>Project 11: </strong>Design and Deployment of a User-Centric Customizable Digital Twin System for Apple Production, USDA NIFA.<strong> Stations:</strong> <strong>MSU</strong> (Uyeh &amp; Morris) and <strong>UK</strong> (Arnoldussen). Status: not funded</p><br /> <p><strong>Project 12: </strong>Collaborative Research: CPS: Medium: Growing Fruit Trees in Dynamic Digital Environments, NSF CPS. <strong>Stations:</strong> <strong>MSU</strong> (Uyeh and Morris), <strong>Oregon State University</strong> (Davidson and Grimm), <strong>UF</strong> (Medeiros and Lee), <strong>Texas A&amp;M University </strong>(Zahid), <strong>UK</strong> (Arnoldussen, Adedeji, Rodriguez). Status: pending</p><br /> <p><strong>Project 13: </strong>Digital Orchards for Guiding Micro-Climate Interventions. Funding Agency: <strong>USDA FFAR.</strong> <strong>Stations: MSU</strong> (Uyeh, Morris, and Perkins), <strong>UK</strong> (Arnoldussen), <strong>Tennessee State University</strong> (Mohaei). Status: pending</p><br /> <p><strong>Project 14:</strong> Using computer vision to predict Bovine Respiratory Disease. <strong>Station: TAMU</strong>.</p><br /> <p><strong>Project 15:</strong> Using computer vision for precision feeding in beef production systems.
<strong>Station: TAMU</strong>.</p><br /> <p><strong>Project 16:</strong> The <strong>K-State</strong> <strong>station</strong> plans to develop computer vision and other digital systems (in collaboration with <strong>other stations</strong>) that will allow an ag machine to collect and analyze the quality of work performed or machine performance based on an assessment of the area behind the machine. Continued development of tools for monitoring planter performance in field conditions.&nbsp; This work will provide data for building a truly AI-driven autonomous planting system; the information generated will serve as foundational information for AI to make decisions. Continued development of a sensor-based system to sense soil compaction in a field prior to tillage.&nbsp; This work will better integrate soil condition information into precision agriculture systems. Began work on developing a machine vision test stand to study tillage tool field performance. Data collected will provide foundational knowledge necessary for AI-driven autonomous machines.</p><br /> <p>&nbsp;</p><br /> <p><strong>OBJECTIVE 1 C: </strong>Natural resources scouting and monitoring.</p><br /> <p><strong>Project 1: </strong>AI-driven data fusion of in situ handheld spectral sensing devices and geospatial environmental covariates generated via an Earth observation data cube for soil properties estimation at a continental scale (USA), utilizing the USDA soil spectral library. <strong>Stations: UF and U-Sao Paolo.</strong></p><br /> <p><strong>Project 2: </strong>Self-supervised and contrastive learning methods for analyzing multimodal data to estimate soil organic carbon (SOC) stock, addressing data availability challenges in the USA and Europe.
<strong>Stations: UF and U-Sao Paolo.</strong></p><br /> <p><strong>Project 3: </strong>Develop a dual-branch neural network architecture to analyze and interpret multimodal spaceborne data, also considering the temporal evolution of bare soil reflectance properties. <strong>Stations: UF and U-Sao Paolo.</strong></p><br /> <p><strong>Project 4: </strong>In collaboration with USDA-NRCS, 422 soil samples were collected from 33 soil series in Texas at three depths (0-5 cm, 5-15 cm, and 15-30 cm) during summer 2023. Soil data including moisture content, bulk density (0-5 cm), field saturated hydraulic conductivity (0-5 cm), GPS location, and soil spectra from Mississippi State University&rsquo;s Advanced Plant and Soil Sensing lab were obtained. Total carbon, total nitrogen, available phosphorus, potassium, magnesium, calcium, hydrogen, zinc, manganese, organic matter, soil pH (water and buffer), cation exchange capacity, and percent base saturation of cation elements were analyzed by the Waters Agricultural Lab in Mississippi. Soil texture, lab saturated hydraulic conductivity (0-5 cm), and soil water retention curve (0-5 cm) were analyzed at USDA-ARS, Starkville, Mississippi. Samples obtained from Mississippi and Texas so far exceed 800. Once laboratory analyses are complete, AI models will be used to develop soil property estimations from soil spectra. <strong>Station: MS and USDA-NRCS.</strong></p><br /> <p><strong>Project 5: </strong>Sensitivity Evaluation of Visible Near-Infrared Spectroscopy Data to Variable-Rate Soil Moisture for AI-Driven Prediction of Soil Properties. <strong>Station: UK.</strong></p><br /> <p><strong>Project 6: </strong>Improvement of Soil Spectral Prediction for Plant-Available Nutrients Using Machine and Deep Learning algorithms. <strong>Station: UK.</strong></p><br /> <p><strong>Project 7: </strong>Using satellite Imagery and machine learning to predict forage quality and quantity. 
<strong>Station: TAMU</strong>.</p><br /> <p><strong>Project 8:</strong> Completed a project with a constituent company that monitors large square bales after they are dropped in the field behind the baler.&nbsp; This machine vision-based device provides operator warnings if the bales are misshapen or have broken twines that fall outside a given set of parameters. Currently working to develop methods of predicting field finish behind tillage tools based on tillage tool performance.&nbsp; This will be a machine vision-based system supported by other information such as draft and machine vibrations. <strong>Station: K-State.</strong></p><br /> <p>&nbsp;</p><br /> <p><strong>OBJECTIVE 1 D: </strong>Socioeconomic sustainability</p><br /> <p><strong>Project 1: </strong>This project addresses issues related to crop monitoring for yield estimation, quality assessment, disease detection, irrigation scheduling, and prediction of nutrient concentrations.<strong> Station: Clemson.</strong></p><br /> <p><strong>Project 2: </strong>The major activities included the use of aerial and ground-based sensing systems for assessing aboveground biomass on forage crops, and irrigation decision support system development for cotton. We continued the research activities in the above projects, collected preliminary data related to moisture and nutrient content assessment using an NIR spectrometer, and tested artificial intelligence algorithms for predicting aboveground forage yields. Deep neural networks were applied for cotton yield estimation and soil moisture predictions. <strong>Station: Clemson.</strong></p><br /> <p><strong>Project 3: </strong>Evaluation of Agronomic, Economic, and Environmental Benefits of Remote Sensing and AI-Based Calibration Strip Technology for In-season Nitrogen Application for Corn.
<strong>Stations:</strong> <strong>UMN, UK.</strong></p><br /> <p><strong>Project 4: </strong>Assessment of Soil Carbon Sequestration Capability by Depths and Crops Using Econometric Techniques. <strong>Station:</strong> <strong>UK.</strong></p><br /> <p><strong>Project 5</strong>: Developing a system dynamics model for climate-smart beef production systems. <strong>Station: TAMU</strong>.</p><br /> <p><strong>OBJECTIVE 1 E: Phenotyping and genotyping</strong></p><br /> <p><strong>Project 1: </strong>The paper &ldquo;Grapevine Rootstock and Scion Genotypes' Symbiosis with Soil Microbiome: A Machine Learning Revelation for Climate-Resilient Viticulture,&rdquo; which applies ML techniques to grapevines, is under review at Microbiome. (Students trained: 1 Ph.D.)<strong> Stations: LSU-UK</strong></p><br /> <p><strong>Project 2: </strong><span style="text-decoration: underline;">Yiannis Ampatzidis</span>: Development of AI-enabled high-throughput phenotyping technologies to enhance citrus, sugarcane, and wheat breeding programs. <strong>Station: UF</strong></p><br /> <p><strong>Project 3: </strong><span style="text-decoration: underline;">Henry Medeiros</span>: (1) We engaged with several members of the S1090 multi-state project and identified multiple collaboration opportunities. Based on these discussions, we developed a joint proposal for the National Science Foundation Cyber Physical Systems program. The proposal was submitted in May 2024 and is currently under review by the agency.&nbsp; Discussions regarding additional opportunities are currently underway.
<strong>Station: UF</strong></p><br /> <p><strong>Project 4: </strong>As part of our broader research efforts on phenotyping techniques, in partnership with colleagues from several departments, we secured a seed grant from the University of Florida Institute of Food and Agricultural Sciences Launching Innovative Faculty Teams in AI (LIFT/AI) initiative focused on AI-driven Phenomics to Advance Plant Breeding in Florida.<strong> Station: UF.</strong></p><br /> <p><strong>Project 5: </strong>Paper under review (Microbiome, IF: 13.8) with Dr. Thanos Gentimis (LSU). In this paper we described the cultivated grapevine PanMicrobiome and the development of ML algorithms to predict provenance and planted grapevine genotype using microbiome data. Our research offers a novel perspective on the predictive power of genotype selection on the microbial assemblage in vineyard soils. By employing a suite of machine learning models, we have dissected the complex interactions between grapevine cultivars and their root microbiomes across continents, countries, and cultivars. The robustness of our methodology, which integrates multiple machine learning algorithms and evaluates them through the lens of F1-scores to account for class imbalance, sets a new standard in the field. The crux of our findings is that the successful prediction of rootstock and scion combinations from soil microbiomes, irrespective of their provenance, reaffirms that the genotypes of both plant parts are determinants in shaping the microbiome. This underlines the potential for targeted breeding programs to consider not only the direct traits of the grapevines but also their indirect influence on the surrounding microbial environment, which is pivotal for plant health and productivity. <strong>Stations: LSU and UK</strong></p><br /> <p><strong>Project 6: </strong>Funded project.
EpicMare: Using ML for the development of gestational age epigenetic clocks to predict delivery date and pregnancy complications in pregnant mares. The project has received funding from the University of Kentucky Igniting Research Collaborations ($50,000, 2024) and the Martin Gatton Foundation ($150,000, 2024-2027). Additionally, a <strong>NIFA-AFRI</strong> application was submitted by <strong>UK</strong> in collaboration with Dr. Shavahn Loux (<strong>LSU</strong>) (requested $750,000, 2025-2029).</p><br /> <p><strong>Project 7:</strong> Phenotyping for Biomass Quantity and Quality: Work initiated at the University of Tennessee Knoxville and Oak Ridge National Lab, with the collaboration maintained at Sam Houston State University. This research assesses biomass quantity (crop yield) and phenotypic traits associated with biomass quality, e.g., leaf nitrogen concentration, chlorophyll content, and disease resistance. Using AI approaches, we were able to develop high-throughput models for detecting these phenotypic traits in switchgrass, a dominant biofuel crop. <strong>Stations: TAMU, Sam Houston State Uni., UTK</strong>.</p><br /> <p>&nbsp;</p><br /> <p><strong>OBJECTIVE 2 A: Data curation, management, accessibility, security, and ethics.</strong></p><br /> <p><strong>Project 1: </strong>An annotated dataset of soybean images was generated for the Master&rsquo;s thesis of Ms. Bhawana Acharya, titled Digital Agriculture Applications In Maize Nutrient Management And Soybean Crop Stands Assessment. This dataset is publicly available on the online platform &ldquo;Roboflow&rdquo;. Dr. Gentimis was introduced to &ldquo;Roboflow&rdquo; during a talk at the 2022 conference in Florida, and he has proposed the creation of an online repository for LSU, which could also be extended to the whole Multistate System. The corresponding paper is under preparation.
<strong>Station: LSU</strong></p><br /> <p><strong>Project 2: </strong>The lack of public, large-scale weed datasets is considered a bottleneck to developing robust machine vision-based weeding systems. To alleviate this, we have been developing public weed datasets. In 2023-2024, we acquired and published two open-source, multi-class weed detection datasets and associated benchmarks of AI models for weed detection. We also experimented with generative modeling through Stable Diffusion to augment multi-class weed image data for enhanced weed detection. We are now seeking federal funding to sustain the efforts to develop big data-powered robust weed recognition systems to enable precision vegetable weeding. <strong>Station: MSU. </strong></p><br /> <p><strong>Project 3</strong>: The K-State group is expanding the capabilities of previously developed devices, CatAPP and GreaterAPP, which allow the quantification of force and energy requirements for modern tillage tool-related components.&nbsp; These devices are intended to provide improved data for designers, planners, producers, and policy makers.&nbsp; Much of the data and guidance currently available to these individuals was generated more than 40 years ago and isn&rsquo;t relevant to modern field equipment. <strong>Station: K-State.</strong></p><br /> <p><strong>Project 4</strong>: Working with agronomists and entomologists to develop methods for detecting and determining the location of aphids and other pests in production fields.&nbsp; These systems use robotic and autonomous platforms to perform historically labor-intensive tasks more thoroughly and accurately.
<strong>Station: K-State.</strong></p><br /> <p><strong>OBJECTIVE 2B: </strong>Standardization and testbed development &ndash; data standardization and software development.</p><br /> <p><strong>Project 1: </strong>The goal of the project is to use MIR spectral data and soil properties from the Kellogg Soil Survey Laboratory to build and provide a user-friendly, web-based portal that would automate the modeling process and estimate soil properties and soil health indicators from MIR spectra. The web-based portal will enable MIR-equipped NRCS-SPSD field offices, researchers, and land managers to upload MIR spectra and obtain estimates of various soil properties across the US. It will reduce the inconsistency and errors in soil property estimation by field scientists and differences in lab instruments and analytic methods. The estimated soil properties serve a variety of end-user-determined applications, including soil monitoring/surveillance, soil health assessment, soil classification, and dynamic soil survey. <strong>Stations: Oregon State University/Ag Experiment Station and University of Wisconsin-Madison</strong>.</p><br /> <p><strong>Project 2: </strong>The paper &ldquo;Soybean yield prediction using machine learning algorithms under a cover crop management system&rdquo; was published this year; it extends data for use in ML through statistical means.<strong> Station: LSU.</strong></p><br /> <p><strong>Project 3: </strong>Develop an AI-driven online spectral analysis tool for global use, based on novel local regression techniques and soil spectral data from &gt;50 countries. <strong>Stations: U-Sao Paolo and UF.</strong></p><br /> <p><strong>Project 4: </strong>Establishment of a testbed for developing precision agriculture tools: Producers&rsquo; fields growing cotton and sorghum have been identified, and high spatio-temporal resolution data have been collected using remote sensing tools such as UAS and satellite imagery.
The obtained data have been processed to extract phenotypic features and standardized to develop AI/ML tools for management decisions and yield prediction. <strong>Station: TAMU.</strong></p><br /> <p><strong>Project 5: </strong>Web-based data management system for UAS-based phenotyping: A web-based platform integrated with a database and processing pipeline was developed for Unmanned Aerial Systems (UAS) data management, processing, analysis, and communication. <strong>Stations: TAMU, OK.</strong></p><br /> <p><strong>OBJECTIVE 3:&nbsp;AI adoption (technology transfer) and workforce development</strong></p><br /> <p><strong>Project 1: </strong>Dr. Gentimis and Dr. Bampasidou secured a SERME grant ($46,176) titled &ldquo;Digital Agriculture: Training Louisiana Growers to develop a Data Management Plan.&rdquo; This grant will support three workshops for 20-50 people each, giving them a gentle introduction to data management, precision ag, and digital ag. <strong>Station: LSU.</strong></p><br /> <p><strong>Project 2: </strong>The paper &ldquo;Overcoming &lsquo;Digital Divides&rsquo;: Leveraging higher education to develop next generation digital agriculture professionals&rdquo; was published this year by Dr. Bampasidou, Dr. Gentimis, and Dr. Mandalika, discussing training opportunities in digital ag for various stakeholders as well as ethical implications of the field. <strong>Station: LSU.</strong></p><br /> <p><strong>Project 3: </strong>Dr. Gentimis teaches a course on Digital Agriculture with 16 master&rsquo;s and PhD students every spring semester.<strong> Station: LSU.</strong></p><br /> <p><strong>Project 4:</strong> Dr. Gentimis gave a workshop on the use of XGBoost (an advanced ML technique) at the annual conference organized by our project; it attracted 60+ students and various stakeholders. <strong>Station: LSU.</strong></p><br /> <p><strong>Project 5: </strong>Dr.
Gentimis gave a workshop in Vidalia, LA, that attracted 20+ stakeholders in the area, on the uses of data and digital ag in soybean production.<strong> Station: LSU.</strong></p><br /> <p><strong>Project 6</strong>: Technical workshop with the University of S&atilde;o Paulo on AI spectral analysis within the framework of the GFSI project. <strong>Stations: UF</strong></p><br /> <p><strong>Project 7: </strong>Development of Online Platform for Automated Fertilizer Prescription Calculation Driven by (AI and) Remote Sensing-Based Calibration Strip Technology: only a first prototype was developed. The AI algorithm needs to be added in the future for advanced calculation of fertilizer recommendation rates. <strong>Station: UK.</strong></p><br /> <p><strong>Project 8: </strong>Development of Cheap and Rapid Soil Testing Service Using Spectroscopy. <strong>Station: UK.</strong></p><br /> <p><strong>Project 9: </strong>Development of Campus-wide AI Agrifood Institute. <strong>Station: UK.</strong></p><br /> <p><strong>Project 10: </strong>Kentucky Fruit and Vegetable Growers Meeting: 30-minute informational session on using digital remote sensing platforms for precision apple crop load management. 48 attendees.</p><br /> <p><strong>Project 11: </strong>PickTN (Tennessee fruit growers): 1-hour informational session on using digital remote sensing platforms for precision apple crop load management. 55 attendees. <strong>Station: UK.</strong></p><br /> <p><strong>Project 12: </strong>Kentucky State Hort Society Spring Field Day. <strong>Station: UK.</strong></p><br /> <p><strong>Project 13: </strong>30-minute training on the use of digital platforms for weather and crop load management. 37 attendees. <strong>Station: UK.</strong></p><br /> <p><strong>Project 14: </strong>Conceptualized and organized the first joint symposium between AgriLife-COALS and the Texas A&amp;M Institute of Data Science (Conference, College Station, Texas).
<strong>Station: TAMU.</strong></p><br /> <p><strong>Project 15: </strong>Interdisciplinary training program in digital agriculture tools development: A Research and Extension Education for Undergraduates (REEU) program was conducted to train students in digital agriculture technologies. <strong>Stations: TAMU, LSU, UK, OK-State.</strong></p><br /> <p><strong>Project 16: </strong>Community-based offering of the GEOG 4468 Remote Sensing course at Sam Houston State University: The course features remote sensing and drone technology and trains the students (10 undergraduate students) in data collection over an agricultural field. <strong>Station: TAMU.</strong></p><br /> <p><strong>Project 17: </strong>Invitation to fly drones: Inspired by the presentation at the AI for Ag conference, Dr. Yaping Xu was invited to fly drones over the North Carolina Arboretum in Asheville to collect video footage of the arboretum. <strong>Station: TAMU.</strong></p><br /> <p><strong>Project 18: </strong>Dr. Gentimis helped secure funding for the workshop ($50,000) through a USDA competitive grant.<strong> Stations: All Multistate.</strong></p><br /> <p>&nbsp;</p><br /> <p><strong>ACTIVITIES</strong></p><br /> <ul><br /> <li>S1090 organized its 3rd AI in Agriculture and Natural Resources Conference in College Station, Texas, April 15&ndash;17, 2024.</li><br /> <li>1<sup>st</sup> joint symposium between AgriLife-COALS and the Texas A&amp;M Institute of Data Science (Conference, College Station, Texas).</li><br /> </ul><br /> <p>&nbsp;</p><br /> <p><strong>MILESTONES AND IMPACT SUMMARY</strong></p><br /> <p>The metrics of impact in the reporting year include a total of <strong>90</strong> peer-reviewed publications, <strong>80</strong> conference presentations/podium/posters, and <strong>three</strong> extension publications. A total of <strong>38</strong> undergraduate research assistants, <strong>73</strong> MSc students, <strong>74</strong> Ph.D.
students, <strong>24</strong> post-docs, <strong>15</strong> other researchers (visiting scientists, research assistants), and a total of <strong>170</strong> farmers/growers/aggregators were trained. A total of <strong>525</strong> conference attendees were facilitated through two conferences related to AI co-organized by our members. This makes a total of <strong>1,092</strong> people impacted by S1090 members in the reporting year. A sum of <strong>$819,000</strong> in funded grant money was reported from collaborative projects by our members. A total of 91 projects at various stages of execution were reported across 13 stations. Of these, <strong>26 </strong>are collaborative projects.</p><br /> <p>&nbsp;</p>

Publications

<p><strong>Refereed Journals/Book Chapters</strong></p><br /> <p>Oregon State University</p><br /> <ol><br /> <li><strong>Zhang, Y</strong>., Hartemink, A.E., Weerasekara, M., 2023. An automated, web-based soil property and soil health estimation tool using mid-infrared (MIR) spectroscopy and machine learning. National Cooperative Soil Survey Meeting, July 9&ndash;13, Bismarck, ND, USA.</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p>South Dakota State University, SDSU</p><br /> <ol><br /> <li>Antora, S.S., <strong>Chang, Y.K.,</strong> Nguyen-Quang, T., &amp; Heung, B. (2023). Development and Assessment of a Field-Programmable Gate Array (FPGA)-Based Image Processing (FIP) System for Agricultural Field Monitoring Applications. AgriEngineering, 5(2), 886-904.</li><br /> <li>Shin, J., Mahmud, M., Rehman, T. U., Ravichandran, P., Heung, B., &amp; <strong>Chang, Y.K.</strong> &dagger; (2023). Trends and Prospect of Machine Vision Technology for Stresses and Diseases Detection in Precision Agriculture. AgriEngineering, 5(1), 20-39.</li><br /> </ol><br /> <p>Conference papers</p><br /> <ol><br /> <li>Alahe, M.A., Kemeshi, J., &amp; <strong>Chang, Y.</strong> (2024) Comparison Between Jetson Nano and Jetson Xavier NX for Ag Data Security. In 2024 ASABE Annual International Meeting. Oral presentation with conference paper (doi: 10.13031/aim.202400811).</li><br /> <li>Kemeshi, J., Alahe, M.A., <strong>Chang, Y</strong>., &amp; Yadav, P.K. (2024) Effect of Camera Shutter Mechanism on the Accuracy of a Custom YOLOv8 Model for Pattern Recognition in Motion on a UGV. In 2024 ASABE Annual International Meeting. Oral presentation with conference paper (doi: 10.13031/aim.202400812).</li><br /> <li>Alahe, M.A., Kemeshi, J., <strong>Chang, Y.,</strong> &amp; Menendez, H. (2024) Sustainable Livestock Management and Pasture Utilization using Automotive Electric Fencing System. In 2024 ASABE Annual International Meeting.
Oral presentation with conference paper (doi: 10.13031/aim.202400820).</li><br /> <li>Kemeshi, J., Gummi, S.R., &amp; <strong>Chang, Y.</strong> (2024) R2B2 Project: Design and Construction of a Low-cost and Efficient Autonomous UGV For Row Crop Monitoring. 16th ICPA. Oral presentation with conference paper (#10111).</li><br /> <li>Gummi, S.R., Kemeshi, J., &amp; <strong>Chang, Y</strong>. (2024) Botanix Explorer (BX1): Precision plant phenotyping robot detecting Stomatal openings for Precision Irrigation and Drought Tolerance experiments. 16th ICPA. Oral presentation with conference paper (#10202).</li><br /> <li>Kemeshi, J., <strong>Chang, Y.,</strong> Yadav, P.K., &amp; Alahe, M.A. (2024) Comparing Global Shutter and Rolling Shutter Cameras for Image Data Collection in Motion on a UGV. 16th ICPA. Oral presentation with conference paper (#10223).</li><br /> <li>Alahe, M.A., Kemeshi, J., <strong>Chang, Y.,</strong> Won, K., Yang, X., &amp; Sher, M. (2024) Securing Agricultural Data with Encryption Algorithms on Embedded GPU based Edge Computing Devices. 16th ICPA. Oral presentation with conference paper (#10244).</li><br /> <li>Alahe, M.A., Kemeshi, J., Gummi, S.R., <strong>Chang, Y.,</strong> &amp; Menendez, H. (2024) Design of an Automatic Travelling Electric Fence System for Sustainable Grazing Management. 16th ICPA. Oral presentation with conference paper (#10246).</li><br /> <li>Gummi, S.R., Alahe, M.A., Kemeshi, J., &amp; <strong>Chang, Y.</strong> (2024) Securing Agricultural Imaging Data in Smart Agriculture: A Blockchain-Based Approach to Mitigate Cybersecurity Threats and Future Innovations. 16th ICPA. Oral presentation with conference paper (#10247).</li><br /> <li>Gummi, S.R., Alahe, M.A., Pack, C., &amp; <strong>Chang, Y.</strong> (2024) A Swarm Robotics Navigation Simulator for Phenotyping Soybean Plants using Voronoi-Ant Colony Optimization. 16th ICPA. 
Oral presentation with conference paper (#10282).</li><br /> <li>Brennan, J., <strong>Parsons, I</strong>., Harrison, M. &amp; Menendez, H. Development of an Application Programming Interface (API) to automate downloading and processing of precision livestock data. ASAS, Calgary, Alberta (2024).</li><br /> <li><strong>Parsons, Ira Lloyd</strong>, Brandi B Karisch, Amanda E Stone, Stephen L Webb, Durham A Norman, and Garrett M Street. Machine Learning Methods and Visual Observations to Categorize Behavior of Grazing Cattle Using Accelerometer Signals, 2024.</li><br /> <li><strong>Wang T.</strong>, H. Jin, H. Sieverding, S. Kumar, Y. Miao, O. Obembe, X. Rao, A. Nafchi, D. Redfearn, S. Cheye. 2023. &ldquo;Understanding farmer views of precision agriculture profitability in the US Midwest.&rdquo; <em>Ecological Economics</em>, 213, 107950.</li><br /> <li><strong>Wang T.</strong>, H. Jin, and S. Heidi. 2023. Factors affecting farmer perceived challenges towards precision agriculture. <em>Precision Agriculture</em>. <a href="https://doi.org/10.1007/s11119-023-10048-2">https://doi.org/10.1007/s11119-023-10048-2</a>.</li><br /> <li>Adereti, D. T., Gardezi, M., <strong>Wang, T.</strong>, McMaine, J. 2023. Understanding farmers&rsquo; engagement and barrier to machine learning-based intelligent agricultural decision support systems. <em>Agronomy Journal</em>. <a href="https://doi.org/10.1002/agj2.21358">https://doi.org/10.1002/agj2.21358</a>.</li><br /> </ol><br /> <p>LSU</p><br /> <ol><br /> <li><strong>Setiyono, T., Gentimis, T., </strong>Rontani, F., Duron, D., Bortolon, G., Adhikari, R., ... &amp; Pitman, W. D. (2024).
Application of TensorFlow model for identification of herbaceous mimosa (Mimosa strigillosa) from digital images.&nbsp;<em>Smart Agricultural Technology</em>,&nbsp;<em>7</em>, 100400.</li><br /> <li>Santos, L. B., Gentry, D., Tryforos, A., Fultz, L., Beasley, J., &amp; <strong>Gentimis, T.</strong> (2024). Soybean yield prediction using machine learning algorithms under a cover crop management system.&nbsp;<em>Smart Agricultural Technology</em>, 100442.</li><br /> <li>Bampasidou, M., Goldgaber, D., <strong>Gentimis, T.,</strong> &amp; Mandalika, A. (2024). Overcoming &lsquo;Digital Divides&rsquo;: Leveraging higher education to develop next generation digital agriculture professionals.&nbsp;<em>Computers and Electronics in Agriculture</em>,&nbsp;<em>224</em>, 109181.</li><br /> </ol><br /> <p>Clemson</p><br /> <ol><br /> <li><strong>Koc, A.B</strong>., Erwin, C., Aguerre, M., Chastain, J. 2024. Estimating Tall Fescue and Alfalfa Forage Biomass Using an Unmanned Ground Vehicle. 15th International Congress on Agricultural Mechanization and Energy in Agriculture Cham 2024. Lecture Notes in Civil Engineering, vol 458. Springer, Cham. <a href="https://doi.org/10.1007/978-3-031-51579-8_32">https://doi.org/10.1007/978-3-031-51579-8_32</a>. Publisher: Springer Nature Switzerland Pages: 357-372.</li><br /> <li>Singh, J., <strong>Koc, A.B</strong>., Aguerre, M.J., Chastain, J.P., and Shaik, S. 2024. Estimating Bermudagrass Aboveground Biomass Using Stereovision and Vegetation Coverage.&nbsp;<em>Remote Sensing</em>,&nbsp;16, 2646. <a href="https://doi.org/10.3390/rs16142646">https://doi.org/10.3390/rs16142646</a> .</li><br /> <li><strong>Koc, A.B.,</strong> Erwin, C., Aguerre, M., Chastain, J. 2023. Estimating Tall Fescue and Alfalfa Forage Biomass Using an Unmanned Ground Vehicle. 15ᵗʰ International Congress of Agricultural Mechanization and Energy in Agriculture (AnkAgEng'23 - Antalya-Turkiye, Oct. 29 - Nov. 2,2023).</li><br /> <li><strong>Koc, A. 
B.,</strong> Singh, J., Aguerre, M. J. (2023). Estimating forage biomass using unmanned ground and aerial vehicles. In Proceedings of International Grassland Congress 2023. Pp. 1449-1452. <a href="https://doi.org/10.52202/071171-0352">https://doi.org/10.52202/071171-0352</a> .</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p>MSU</p><br /> <ol><br /> <li>Ahmed, T., Wijewardane, N., <strong>Lu, Y</strong>., Jones, D., Kudenov, M., Williams, C., Villordon, A., Kamruzzaman, M., 2024. Advancing sweetpotato quality assessment with hyperspectral imaging and explainable artificial intelligence. Computers and Electronics in Agriculture 220, 108855.</li><br /> <li>Xu, J., <strong>Lu, Y</strong>., 2024. Prototyping and evaluation of a novel machine vision system for real-time, automated quality grading of sweetpotatoes. Computers and Electronics in Agriculture 219, 108826.</li><br /> <li>Xu. J., Lu, Y., Deng, B., 2024. Design, prototyping, and evaluation of a machine vision-based automated sweetpotato grading and sorting system. Journal of the ASABE (under review).</li><br /> <li>Xu, J., <strong>Lu, Y</strong>., Deng, B., 2024. OpenWeedGUI: an open-source graphical tool for weed Imaging and YOLO-based weed detection. Electronics 13 (9), 1699. (Project#2, Lu)</li><br /> <li>Deng, B., <strong>Lu, Y</strong>., 2024. Canopy Image-based Blueberry Detection by YOLOv8 and YOLOv9. Artificial Intelligence in Agriculture (under review).</li><br /> <li>Wang, Y., <strong>Lu, Y</strong>., Morris, D., Benjamin, M., Lavagnino, M., McIntyre, J., 2024. Automated sow body condition estimation by 3D computer vision towards precision livestock farming. Artificial Intelligence in Agriculture (submitted to journal).</li><br /> <li>Olaniyi, E., <strong>Lu, Y.,</strong> Sukumaran, A., Jarvis, T., Clinton, R., 2023. Non-destructive Assessment of White Striping in Broiler Breast Meat Using Structured Illumination Reflectance Imaging with Deep Learning. 
Journal of the ASABE 66(6), 1437-1447.</li><br /> <li>Dang, F., Chen, D., <strong>Lu, Y.,</strong> Li, Z., 2023. YOLOWeeds: a novel benchmark of YOLO object detectors for multi-class weed detection in cotton production systems. Computers and Electronics in Agriculture 205, 107655.</li><br /> <li>Chen, D., Qi, X., Zheng, Y., <strong>Lu, Y.,</strong> Huang, Y., Li, Z., 2024. Synthetic data augmentation by diffusion probabilistic models to enhance weed recognition. Computers and Electronics in Agriculture 216, 108517.</li><br /> <li>Deng, B., <strong>Lu, Y.,</strong> Xu, J., 2024. Weed database development: An updated survey of public weed datasets and cross-season weed detection adaptation. Ecological Informatics, 102546.</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p>UArk</p><br /> <ol><br /> <li>Li, Z., <strong>Wang, D.,</strong> Zhu, T., Tao, Y., &amp; Ni, C. (2024). Review of deep learning-based methods for non-destructive evaluation of agricultural products.&nbsp;<em>Biosystems Engineering</em>,&nbsp;<em>245</em>, 56-83.</li><br /> <li><strong>Wang, D.,</strong> Sethu, S., Nathan, S., Li, Z., Hogan, V. J., Ni, C., ... &amp; Seo, H. S. (2024). Is human perception reliable? Toward illumination robust food freshness prediction from food appearance&mdash;Taking lettuce freshness evaluation as an example.&nbsp;<em>Journal of Food Engineering</em>, 112179.</li><br /> <li>Zhou, C., Li, Z., <strong>Wang, D.,</strong> Xue, S., Zhu, T., &amp; Ni, C. (2024). SSNet: Exploiting Spatial Information for Tobacco Stem Impurity Detection with Hyperspectral Imaging.&nbsp;<em>IEEE Access</em>.</li><br /> <li>Ali, M. A., <strong>Wang, D.,</strong> &amp; Tao, Y. (2024). Active Dual Line-Laser Scanning for Depth Imaging of Piled Agricultural Commodities for Itemized Processing Lines.&nbsp;<em>Sensors</em>,&nbsp;<em>24</em>(8), 2385.</li><br /> <li>Xu, Z., Uppuluri, R., Shou, W., <strong>Wang, D.,</strong> &amp; She, Y. (2024). Whole Chicken Pushing Manipulation via Imitation Learning. 
In&nbsp;<em>2024 ASABE Annual International Meeting</em>. American Society of Agricultural and Biological Engineers.</li><br /> <li>Li, Z., <strong>Wang, D.,</strong> Zhu, T., Ni, C., &amp; Zhou, C. (2023). SCNet: A deep learning network framework for analyzing near-infrared spectroscopy using short-cut.&nbsp;<em>Infrared Physics &amp; Technology</em>,&nbsp;<em>132</em>, 104731.</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p>UF</p><br /> <ol><br /> <li>da Cunha V.A.G., Hariharan J., <strong>Ampatzidis Y.,</strong> Roberts P., 2023. Early detection of tomato bacterial spot disease in transplant tomato seedlings utilizing remote sensing and artificial intelligence. <em>Biosystems Engineering</em>, 234, 172-186, <a href="https://doi.org/10.1016/j.biosystemseng.2023.09.002">https://doi.org/10.1016/j.biosystemseng.2023.09.002</a>.</li><br /> <li>da Cunha V.A.G., Pullock D., Ali M., Neto A.D.C., <strong>Ampatzidis Y.,</strong> Weldon C., Kruger K., Manrakhan A., Qureshi J., 2024. Psyllid Detector: a web-based application to automate insect detection utilizing image processing and artificial intelligence. Applied Engineering in Agriculture, 40(4), 427-439. https://doi.org/10.13031/aea.15826.&nbsp;</li><br /> <li>Javidan S.M., Banakar A., Rahnama K., Vakilian K.A., <strong>Ampatzidis Y.,</strong> 2024. Feature engineering to identify plant diseases using image processing and artificial intelligence: a comprehensive review. <em>Smart Agricultural Technology</em>, 8, 100480, <a href="https://doi.org/10.1016/j.atech.2024.100480">https://doi.org/10.1016/j.atech.2024.100480</a>.</li><br /> <li>Javidan S.M., Banakar A., Vakilian K.A., <strong>Ampatzidis Y.,</strong> Rahnama K., 2024. Diagnosing the spores of tomato fungal diseases using microscopic image processing and machine learning. <em>Multimedia Tools and Applications</em>, 1-19, https://doi.org/10.1007/s11042-024-18214-y.&nbsp;</li><br /> <li>Kim, D.W., S.J. Jeong, <strong>S. Lee</strong>, H. Yun, Y.S. Chung, Y.-S.
Kwon, and H.-J. Kim.&nbsp;2023. Growth monitoring of field-grown onion and garlic by CIE L*a*b* color space and region-based crop segmentation of UAV RGB images.&nbsp;Precision Agric&nbsp;24, 1982&ndash;2001. <a href="https://doi.org/10.1007/s11119-023-10026-8">https://doi.org/10.1007/s11119-023-10026-8</a>.</li><br /> <li>Kondaparthi AK, <strong>Lee WS</strong>, Peres NA. Utilizing High-Resolution Imaging and Artificial Intelligence for Accurate Leaf Wetness Detection for the Strawberry Advisory System (SAS).&nbsp;Sensors. 2024; 24(15):4836. <a href="https://doi.org/10.3390/s24154836">https://doi.org/10.3390/s24154836</a>.</li><br /> <li>Liu X., Zhang Z., Igathinathane C., Flores P., Zhang M., Li H., Han X., Ha T., <strong>Ampatzidis Y.,</strong> Kim H-J., 2024. Infield corn kernel detection using image processing, machine learning, and deep learning methodologies. <em>Expert Systems with Applications</em>, 238 (part E), 122278, <a href="https://doi.org/10.1016/j.eswa.2023.122278">https://doi.org/10.1016/j.eswa.2023.122278</a>.</li><br /> <li>Mehdizadeh S.A., Noshad M., Chaharlangi M., <strong>Ampatzidis Y.,</strong> Development of an innovative optoelectronic nose for detecting adulteration in quince seed oil. <em>Foods</em>, 12(23), 4350, <a href="https://doi.org/10.3390/foods12234350">https://doi.org/10.3390/foods12234350</a>.</li><br /> <li>Mirbod, O., <strong>Choi, D.,</strong> Heinemann, P. H., Marini, R. P., &amp; He, L. (2023). On-tree apple fruit size estimation using stereo vision with deep learning-based occlusion handling.&nbsp;Biosystems Engineering,&nbsp;226, 27-42.</li><br /> <li>Ojo I., <strong>Ampatzidis Y.,</strong> Neto A.D.C., Batuman O., 2024. Development of an automated needle-based trunk injection system for HLB-affected citrus trees. 
<em>Biosystems Engineering</em>, 240, 90-99, <a href="https://doi.org/10.1016/j.biosystemseng.2024.03.003">https://doi.org/10.1016/j.biosystemseng.2024.03.003</a>.</li><br /> <li>Ojo I., <strong>Ampatzidis Y.,</strong> Neto A.D.C., Bayabil K.H., Schueller K.J., Batuman O., 2024. Determination of needle penetration force and pump pressure for the development of an automated trunk injection system for HLB-affected citrus trees. <em>Journal of ASABE</em>, 67, 4, https://doi.org/10.13031/ja.15975.</li><br /> <li>Teshome F.T., Bayabil H.K., Hoogenboom G., Schaffer B., Singh A., <strong>Ampatzidis Y.,</strong> Unmanned aerial vehicle (UAV) imaging and machine learning applications for plant phenotyping. <em>Computers and Electronics in Agriculture</em>, 212, 108064, <a href="https://doi.org/10.1016/j.compag.2023.108064">https://doi.org/10.1016/j.compag.2023.108064</a>.</li><br /> <li>Teshome F.T., Bayabil H.K., Schaffer B., <strong>Ampatzidis Y.,</strong> Hoogenboom G., Singh A., 2024. Simulating soil hydrologic dynamics using crop growth and machine learning models. Computers and Electronics in Agriculture, 224, 109186, <a href="https://doi.org/10.1016/j.compag.2024.109186">https://doi.org/10.1016/j.compag.2024.109186</a>.</li><br /> <li>Zhang L., Ferguson L., Ying L., Lyons A., Laca E., and <strong>Ampatzidis Y.,</strong> Developing a web-based pistachio nut growth prediction system for orchard management. <em>HortTechnology</em>, 34,1, 1-7, <a href="https://doi.org/10.21273/HORTTECH05270-23">https://doi.org/10.21273/HORTTECH05270-23</a>.</li><br /> <li>Zhou, C., <strong>S. Lee</strong>, O. E. Liburd, I. Aygun, X. Zhou, A. Pourreza, J. K. Schueller, Y. Ampatzidis. 2023. Detecting two-spotted spider mites and predatory mites in strawberry using deep learning. Smart Agricultural Technology, 4, 100229. <a href="https://doi.org/10.1016/j.atech.2023.100229">https://doi.org/10.1016/j.atech.2023.100229</a>.</li><br /> <li>Zhou C., <strong>S. Lee</strong>, S. Zhang, O. E. 
Liburd, A. Pourreza, J. K. Schueller, <strong>Y. Ampatzidis</strong>. 2024. A smartphone application for site-specific pest management based on deep learning and spatial interpolation. <em>Computers and Electronics in Agriculture</em>, 218, 2024, 108726, ISSN 0168-1699, <a href="https://doi.org/10.1016/j.compag.2024.108726">https://doi.org/10.1016/j.compag.2024.108726</a>.</li><br /> <li>De Vries, A., <strong>Bliznyuk, N.,</strong> &amp; Pinedo, P. (2023). Invited Review: Examples and opportunities for artificial intelligence (AI) in dairy farms. Applied Animal Science, 39(1), 14-22.</li><br /> <li>Kalopesa, E., <strong>Tziolas, N.,</strong> Tsakiridis, N., Multimodal Fusion for soil organic carbon estimation at continental scale. Remote Sensing. (submitted)</li><br /> <li>Rosin, N. A., Dematt&ecirc;, J. A. M., Carvalho, H. W. P., Rodriguez-Albarrac&iacute;n, H. S., Rosas, J. T. F., Novais, J. J., Dalmolin, R. S. D., Alves, M. R., Falcioni, R., <strong>Tziolas, N.,</strong> Mallah, S., de Mello, D. C., &amp; Francelino, M. R. (2024). Spatializing soil elemental concentration as measured by X-ray fluorescence analysis using remote sensing data. Catena, 240, 107988. <a href="https://doi.org/10.1016/j.catena.2024.107988">https://doi.org/10.1016/j.catena.2024.107988</a> &nbsp;</li><br /> <li><strong>Tziolas, N.,</strong> Tsakiridis, N., Heiden, U., &amp; van Wesemael, B. (2024). Soil organic carbon mapping utilizing convolutional neural networks and Earth observation data: A case study in Bavaria state, Germany. Geoderma, 444, 116867. <a href="https://doi.org/10.1016/j.geoderma.2024.116867">https://doi.org/10.1016/j.geoderma.2024.116867</a></li><br /> <li>Patnam Reddy, K., <strong>Tziolas, N.</strong>, Dematte, J., AI-driven online spectral analysis tool for global use. Geoderma. (being prepared).</li><br /> <li>Qian, H., McLamore, E., &amp; <strong>Bliznyuk, N.</strong> (2023). Machine learning for improved detection of pathogenic E. 
coli in hydroponic irrigation water using impedimetric aptasensors: A comparative study.&nbsp;<em>ACS omega</em>,&nbsp;<em>8</em>(37), 34171-34179.</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p>Mississippi State University</p><br /> <ol><br /> <li><strong>Gharakhani, H., Thomasson, J. A</strong>., Lu, Y., &amp; Reddy, K. R. (2023). Field Test and Evaluation of an Innovative Vision-Guided Robotic Cotton Harvester. <em>Computers and Electronics in Agriculture</em>, 225, 109314.</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p>UTK</p><br /> <ol><br /> <li>Amirivojdan, A., Nasiri, A., Zhou, S., Zhao, Y., &amp; <strong>Gan, H</strong>. (2024). ChickenSense: A Low-Cost Deep Learning-Based Solution for Poultry Feed Consumption Monitoring Using Sound Technology. AgriEngineering, 6(3), 2115-2129.</li><br /> <li>Nasiri, A., Zhao, Y., &amp; <strong>Gan, H</strong>. (2024). Automated detection and counting of broiler behaviors using a video recognition system. Computers and Electronics in Agriculture, 221, 108930. DOI: 10.1016/j.compag.2024.108930</li><br /> <li>Nasiri, A., Amirivojdan, A., Zhao, Y., &amp; <strong>Gan, H.</strong> (2024). An automated video action recognition-based system for drinking time estimation of individual broilers. Smart Agricultural Technology, 100409. <a href="https://doi.org/10.1016/j.atech.2024.100409">https://doi.org/10.1016/j.atech.2024.100409</a></li><br /> </ol><br /> <p>UK</p><br /> <ol><br /> <li>Ekramirad, N., Doyle, L.E., Loeb, J.R., Santra, D., <strong>Adedeji, A.A.</strong> (2024). Hyperspectral imaging and machine learning as a nondestructive method for proso millet seed detection and classification. <em>Foods 13</em>(9), 1330.</li><br /> <li><strong>Adedeji, A.A.,</strong> Ekramirad, N., Khaled, Y.A., and Villanueva, R. (2024). Impact of storage on nondestructive detectability of codling moth infestation in apples. <em>Journal of ASABE 67</em>(2):401-408.
<a href="https://doi.org/10.13031/ja.15583">https://doi.org/10.13031/ja.15583</a>. <strong>JIF</strong></li><br /> <li>Tizhe Liberty, J., Sun, S., Kucha, C., <strong>Adedeji, A. A.,</strong> Agidi, G., &amp; Ngadi, M. O. (2024). Augmented reality for food quality assessment: Bridging the physical and digital worlds. <em>Journal of Food Engineering</em> 367, 111893. <a href="https://doi.org/10.1016/j.jfoodeng.2023.111893">https://doi.org/10.1016/j.jfoodeng.2023.111893</a></li><br /> <li><strong>Adedeji, A.A.</strong>, Okeke, A., and Rady, A. (2023). Utilization of FTIR and machine learning for evaluating gluten-free bread contaminated with wheat flour. <em>Sustainability</em> &ndash; <em>Food Processing Safety and Public Health 15</em>(11), 8742.</li><br /> <li>Khaled, Y.A., Ekramirad, N., Donohue, K., Villanueva, R., and <strong>Adedeji, A</strong>.<strong>A. </strong>(2023). Non-destructive hyperspectral imaging and machine learning-based predictive models for physicochemical quality attributes of apples during storage as affected by codling moth infestation. <em>Agriculture &ndash; Digital Agriculture 13</em>(5),1086. <a href="https://doi.org/10.3390/agriculture13051086">https://doi.org/10.3390/agriculture13051086</a>.</li><br /> <li>Ekramirad, N., Khaled, Y.A., Donohue, K., Villanueva, R., and <strong>Adedeji, A</strong>.<strong>A. </strong>(2023). Classification of codling moth infested apples using sensor data fusion of acoustic and hyperspectral features coupled with machine learning. <em>Agriculture - Agricultural Technology 13</em>(4), 839. <a href="https://doi.org/10.3390/agriculture13040839">https://doi.org/10.3390/agriculture13040839</a>.</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p>TAMU</p><br /> <ol><br /> <li>Fernandes, M.M., Fernandes Junior, J.d., Adams, J.M., <strong>Tedeschi, L.O.</strong> <em>et al.</em>(2024). 
Using sentinel-2 satellite images and machine learning algorithms to predict tropical pasture forage mass, crude protein, and fiber content.&nbsp;<em>Scientific Report.</em>&nbsp;14, 8704. <a href="https://doi.org/10.1038/s41598-024-59160-x">https://doi.org/10.1038/s41598-024-59160-x</a></li><br /> <li><strong> Kaniyamattam,</strong> <strong>Bhandari, M.,</strong> Hardin, R., Tao, J., <strong>Landivar, J</strong>., and <strong>Tedeschi, L.</strong> (2023). Scalable Data-driven Intelligent Agri-Systems: Opportunities, Challenges, and Research Investment Analysis for the State of Texas. A white paper submitted to Texas A&amp;M AgriLife Research.</li><br /> <li>Risal, A., Niu, H., Landivar-Scott, J. L., Maeda, M. M., Bednarz, C. W., <strong>Landivar-Bowles, J.,</strong> ... &amp; <strong>Bhandari, M.</strong> (2024). Improving Irrigation Management of Cotton with Small Unmanned Aerial Vehicle (UAV) in Texas High Plains. <em>Water,</em> 16(9), 1300.</li><br /> <li>Niu, H., Peddagudreddygari, J. R., <strong>Bhandari, M., Landivar, J. A.,</strong> Bednarz, C. W., &amp; Duffield, N. (2024). In-Season Cotton Yield Prediction with Scale-Aware Convolutional Neural Network Models and Unmanned Aerial Vehicle RGB Imagery. <em>Sensors,</em> 24(8), 2432.</li><br /> <li>Khuimphukhieo, I., <strong>Bhandari, M.,</strong> Enciso, J., &amp; da Silva, J. A. (2024). Assessing Drought Stress of Sugarcane Cultivars Using Unmanned Vehicle System (UAS)-Based Vegetation Indices and Physiological Parameters. <em>Remote Sensing,</em> 16(8), 1433.</li><br /> <li>Zhao, L., <strong>Bhandari, M</strong>., Um, D., Nowka, K., <strong>Landivar, J.,</strong> &amp; Landivar, J. Cotton Yield Prediction Utilizing Unmanned Aerial Vehicles (Uav) and Bayesian Neural Networks.&nbsp;<em>Available at SSRN 4693599</em>.</li><br /> <li>Dhal, S. B., Kalafatis, S., Braga-Neto, U., Gadepally, K. C., <strong>Landivar-Scott, J. L.,</strong> Zhao, L., ... &amp; <strong>Bhandari, M.</strong> (2024). 
Testing the Performance of LSTM and ARIMA Models for In-Season Forecasting of Canopy Cover (CC) in Cotton Crops. <em>Remote Sensing,</em> 16(11), 1906.</li><br /> <li>Happs, R. M., Hanes, R. J., Bartling, A. W., Field, J. L., Harman-Ware, A. E., Clark, R. J., <strong>Yaping, X.,</strong> ... &amp; Davison, B. H. (2024). Economic and Sustainability Impacts of Yield and Composition Variation in Bioenergy Crops: Switchgrass (Panicum virgatum L.). <em>ACS Sustainable Chemistry &amp; Engineering</em>, 12(5), 1897-1910.</li><br /> <li><strong>Bhandari, M.,</strong> Chang, A., Jung, J., Ibrahim, A. M., Rudd, J. C., Baker, S., ... &amp; <strong>Landivar, J.</strong> (2023). Unmanned aerial system‐based high‐throughput phenotyping for plant breeding. <em>The Plant Phenome Journal</em>, 6(1), e20058.</li><br /> </ol><br /> <p>K-State</p><br /> <ol><br /> <li>McGinty H, Shimizu C, Hitzler P, &amp; <strong>Sharda A.</strong> (2024). Towards a Global Food Systems Datahub. Semantic Web -1 (2024) 1&ndash;4. <a href="https://doi.org/10.3233/SW-243688">https://doi.org/10.3233/SW-243688</a></li><br /> <li>Badua S, <strong>Sharda A,</strong> Aryal B. 2024. Quantifying real-time opening disk load during planting operations to assess compaction and potential for planter control. <em>Precision Agriculture</em> 25(4):1-13. <a href="https://doi.org/10.1007/s11119-024-10151-y">https://doi.org/10.1007/s11119-024-10151-y</a></li><br /> <li>Das S, <strong>Flippo D,</strong> Welch S. 2024. Autonomous robot system for steep terrain farming operations. U.S.
Patent and Trademark Office.&nbsp;</li><br /> <li>Grijalva I, Kang Q, Flippo D, <strong>Sharda A,</strong> McCornack B. 2024. Unconventional strategies for aphid management in sorghum.&nbsp;Insects, 15(475).</li><br /> <li>Rahman R, Indris C, Bramesfeld G, Zhang T, Li K, Chen X, Grijalva I, McCornack B, <strong>Flippo D,</strong> <strong>Sharda A,</strong> Wang G. A new dataset and comparative study for aphid cluster detection and segmentation in sorghum fields.&nbsp;Journal of Imaging, 10(5), 2024-5-08.</li><br /> <li>Pokharel P, <strong>Sharda A, Flippo D,</strong> Ladino K. Design and systematic evaluation of an under-canopy robotic spray system for row crops.&nbsp; Smart Agricultural Technology, 8:100510.</li><br /> </ol><br /> <p><strong>ALL STATION CONFERENCE PRESENTATIONS: PODIUM/POSTER</strong></p><br /> <p>SDSU</p><br /> <ol><br /> <li><strong>Wang, T.</strong> and H. Jin. Factors Affecting Farmer Adoption of Unmanned Aerial Vehicles: Current and Future. 2024 AI in Agriculture and Natural Resources Conference. April 15-17, 2024, College Station, Texas.</li><br /> <li><strong>Wang, T.</strong> and H. Jin. Factors Affecting Farmer Adoption of Unmanned Aerial Vehicles: Current and Future. Southern Agricultural Economics Association (SAEA) 56th Annual Meeting. February 3-6, 2024, Atlanta, Georgia.</li><br /> <li>Adereti, D. T., Gardezi, M., <strong>Wang, T.</strong>, McMaine, J. 2023. Understanding farmers&rsquo; engagement and barrier to machine learning-based intelligent agricultural decision support systems. 85th Annual Meeting of the Rural Sociological Society. August 2-6, Burlington, VT.</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p>LSU</p><br /> <ol><br /> <li>Adhikari, R., <strong>Setiyono, T.,</strong> Dodla, S. K., Pabuayon, I. L., Duron, D., Acharya, B., ... &amp; Shiratsuchi, L. S. (2023, October). Evaluation of Varying Canopy Distance on Crop Circle Phenom Sensor Measurements: Implications for Remote Sensing of Crop Parameters. 
In&nbsp;<em>ASA, CSSA, SSSA International Annual Meeting</em>. ASA-CSSA-SSSA</li><br /> <li><strong>Setiyono, T.,</strong> Dodla, S. K., Rontani, F. A., Acharya, B., Duron, D., Adhikari, R., ... &amp; Gentimis, T. (2023, October). Precision Positioning in UAV Remote Sensing: Case Study in Corn N Rates and Soybean Seeding Rates Experiments. In&nbsp;<em>ASA, CSSA, SSSA International Annual Meeting</em>. ASA-CSSA-SSSA.</li><br /> <li>Acharya, B., <strong>Setiyono, T.,</strong> Rontani, F. A., Dodla, S. K., Adhikari, R., Duron, D., ... &amp; Parvej, R. (2023, October). Application of UAV Remote Sensing for Monitoring Nitrogen Status in Corn Under Excessive Rainfall Conditions. In&nbsp;<em>ASA, CSSA, SSSA International Annual Meeting</em>. ASA-CSSA-SSSA.</li><br /> <li>Duron, D., Rontani, F. A., Acharya, B., Adhikari, R., Taylor, Z., Blanchard, B., ... &amp; <strong>Setiyono, T.</strong> (2023, October). Integrating Crop Modeling and Remote Sensing Data for Prediction of Sugarcane Growth, Yield, and Sugar Content and Their Field Spatial Variability. In&nbsp;<em>ASA, CSSA, SSSA International Annual Meeting</em>. ASA-CSSA-SSSA.</li><br /> <li>Adhikari, R., <strong>Setiyono, T.,</strong> Dodla, S. K., Pabuayon, I. L., Duron, D., Acharya, B., ... &amp; Shiratsuchi, L. S. (2023, October). Multi-Sensor Crop Sensing Platforms for Monitoring Agronomic Practices Under Different Tillage and Fertilization Systems. In&nbsp;<em>ASA, CSSA, SSSA International Annual Meeting</em>. ASA-CSSA-SSSA.</li><br /> <li>Lanza, P., Santos, L., <strong>Gentimis, A.,</strong> Yang, Y., Conger, S., &amp; Beasley, J. (2023). Parameters to increase LiDAR mounted UAV efficiency on agricultural field elevation measurements. In&nbsp;<em>Precision agriculture'23</em>(pp. 715-721). Wageningen Academic.</li><br /> <li>J&uacute;nior, M. R. B., de Almeida Moreira, B. R., Duron, D., <strong>Setiyono, T.,</strong> Shiratsuchi, L. S., &amp; da Silva, R. P. (2024). 
Integrated sensing and machine learning: Predicting saccharine and bioenergy feedstocks in sugarcane.&nbsp;<em>Industrial Crops and Products</em>,&nbsp;<em>215</em>, 118627.</li><br /> </ol><br /> <p>Clemson</p><br /> <ol><br /> <li>Singh, J., <strong>Koc, A.B.,</strong> Aguerre, M.J., Chastain, J.P., and Shaik, S. 2024. Stereoscopic Morphometry in Forages: Predicting Pasture Quantity with Field Robotics. Presented at the 2024 ASABE Annual International Meeting, July 28-31, 2024. Anaheim CA.</li><br /> <li>Lisa Umutoni, Vidya Samadi, George Vellidis, Jose Payero, <strong>Bulent Koc,</strong> Charles Privette III. 2024. Application of Deep Neural Networks for Seasonal Cotton Yield Estimation. Presented at the 2024 ASABE Annual International Meeting, July 28-31, 2024. Anaheim CA.</li><br /> <li>Shaik, S., <strong> B. Koc</strong>, J. Singh, M. Aguerre, J. P. Chastain. 2024. Aboveground Biomass Prediction of Bermudagrass: A Comparative Analysis of Machine Learning Models. 2024 AI in Agriculture and Natural Resources Conference. April 15, 2024 - April 17, 2024.</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p>MSU</p><br /> <ol><br /> <li>Xu, J., <strong>Lu, Y.,</strong> Deng, B., 2024. Design, prototyping, and evaluation of a machine vision-based automated sweet potato grading and sorting System. ASABE Annual International Meeting 2400102.</li><br /> <li>Xu, J., <strong>Lu, Y,</strong> 2024. Design and preliminary evaluation of a machine vision-based automated sweet potato sorting system. Sensing for Agriculture and Food Quality and Safety XVI Proceedings Volume PC13060.</li><br /> <li>Xu., J., <strong>Lu, Y.,</strong> 2024. Prototyping and preliminary evaluation of a real-time multispectral vision system for automated sweet potato quality grading. Presented at the 2024 International Conference on Precision Agriculture. (Project #1, Lu)</li><br /> <li>Xu, J., <strong>Lu, Y.,</strong> 2023. 
OpenWeedGUI: an open-source graphical user interface for weed imaging and detection. Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VIII 12539, 97-106.</li><br /> <li>Deng, B., <strong>Lu, Y.,</strong> Vander Weide, J., 2024. Development and Preliminary Evaluation of a Deep Learning-based Fruit Counting Mobile APP for High-bush Blueberries. ASABE Annual International Meeting 2401022</li><br /> <li>Wang, Y., <strong>Lu, Y.,</strong> Morris, D., Benjamin, M., Lavagnino, M., McIntyre, J., 2024. 3D Computer Vision-Based Sow Body Condition Estimation Towards Precision Livestock Farming. Presented at the 2024 AI Conference in Agriculture.</li><br /> <li>Wang, Y., <strong>Lu, Y.,</strong> Morris, D., Benjamin, M., Lavagnino, M., McIntyre, J., 2024. 3D Computer Vision with A Spatial-Temporal Neural Network for Lameness Detection of Sows. Presented at the 2024 International Conference on Precision Agriculture.</li><br /> <li>Deng, B., <strong>Lu, Y</strong>., 2023. Factors influencing the detection of Lambsquarters by YOLOv8 towards precision weed control. Poster presented at the Great Lakes EXPO (Grand Rapids, Michigan).</li><br /> <li>Deng, B., <strong>Lu, Y</strong>., 2024. Weed Image Augmentation by ControlNet-Added Stable Diffusion. Proceedings Volume 13035, Synthetic Data for Artificial Intelligence and Machine Learning: Tools, Techniques, and Applications II 130350M. 
<a href="https://doi.org/10.1117/12.3014145">https://doi.org/10.1117/12.3014145</a></li><br /> </ol><br /> <p>&nbsp;</p><br /> <p>UArk</p><br /> <ol><br /> <li>Pallerla C.,&nbsp;Owens, C., <strong>Wang D.,</strong> (2024) Hyperspectral imaging and Machine learning algorithms for foreign material detection on the chicken surface.&nbsp;<em>In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting.</em>Anaheim, CA [Poster presentation]</li><br /> <li>Pallerla C.,&nbsp;Owens, C., <strong>Wang D.,</strong> (2024) Hyperspectral imaging and Machine learning algorithms for foreign material detection on the chicken surface.&nbsp;<em>In 2024 Poultry Science Asscoiation Annual International Meeting.</em> Louisville, KY [Poster presentation]</li><br /> <li>Feng Y.,&nbsp;<strong>Wang D.,</strong> (2024) Synthetic Data Augmentation for Chicken Carcass Instance Segmentation with Mask Transformer.&nbsp;<em>In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting.</em>&nbsp;Anaheim, CA [Poster presentation]</li><br /> <li>Mahmoudi S.,&nbsp;<strong>Wang D.,</strong> (2024) Automated Solutions for Poultry Processing: Integrating Robotic Swab Sampling and Pathogen Detection Technologies.&nbsp;<em>In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting.</em>&nbsp;Anaheim, CA [Poster presentation]</li><br /> <li>Sohrabipour P., Wan S., Yu S.,&nbsp;<strong>Wang D.,</strong> (2024) Depth image guided Mask-RCNN model for chicken detection in poultry processing line.&nbsp;<em>In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting.</em>Anaheim, CA [Oral presentation]</li><br /> <li>Mahmoudi S., Sohrabipour P., Obe T., Gibson K., Crandall P., Jeyam S.,&nbsp;<strong>Wang D.</strong> (2024), Automated Environmental Swabbing: A Robotic Solution for Enhancing Food Safety in Poultry Processing.&nbsp;<em>In 2024 the 
Third Annual Artificial Intelligence in Agriculture Conference. College Station, TX</em>&nbsp;[Poster presentation]</li><br /> <li>Sohrabipour P., Mahmoudi S., She Y., Shou W., Pallerla C., Schrader L.,&nbsp;<strong>Wang D.</strong> (2024), Advanced Poultry Automation: Integrating 3D Vision Reconstruction and Mask R-CNN for Efficient Chicken Handling.&nbsp;<em>In 2024 the Third Annual Artificial Intelligence in Agriculture Conference.</em>&nbsp;College Station, TX [Poster presentation, First place winner]</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p>UF</p><br /> <ol><br /> <li><strong>Ampatzidis Y.,</strong> 2024. Can AI and automation transform specialty crop production? 16th International Conference on Precision Agriculture (ICPA), International Symposium on Robotics and Automation, Manhattan, Kansas, USA, July 21-24.</li><br /> <li><strong>Ampatzidis Y.</strong>, 2024. Agroview and Agrosense for AI-enhanced precision orchard management. SE Regional Fruit and Vegetable Conference, Savannah, GA, January 11-14, 2024.</li><br /> <li><strong>Ampatzidis Y.,</strong> 2023. Emerging and advanced technologies in agriculture. Link (Linking Industry Networks through Certifications; High School Teachers Training) Conference, Daytona Beach, FL, October 10-12, 2023.</li><br /> <li><strong>Ampatzidis Y.,</strong> 2023. AI and Extension: Possibilities and Challenges. 2023 SR-PLN Middle Managers Conference, Next Generation: Evolving the Extension Enterprise, Orlando, FL, August 22-24.</li><br /> <li><strong>Ampatzidis Y.,</strong> 2023. AI-Enhanced Technologies for Precision Management of Specialty Crops. Sustainable Precision Agriculture in the Era of IoT and Artificial Intelligence, Bard Ag-AI Workshop, Be&rsquo;er Sheva, Israel, July 18-20, 2023.</li><br /> <li><strong>Ampatzidis Y.,</strong> Ojo I., Neto A.D.C., Batuman O., 2024. Automated needle-based trunk injection system for HLB-affected citrus trees.
AgEng International Conference of EurAgEng, Agricultural Engineering Challenges in Existing and New Agrosystems, Athens, Greece, July 1-4, 2024.</li><br /> <li><strong>Ampatzidis Y.,</strong> Vijayakumar V., Pardalos P., 2024. AI-enabled robotic spraying technology for precision weed management in specialty crops. Optimization, Analytics, and Decision in Big Data Era Conference (in honor of the 70th birthday of Dr. Panos Pardalos), Halkidiki, Greece, June 16-21.</li><br /> <li>Banakar A., Javidan S.M., Vakilian K.A., <strong>Ampatzidis Y.,</strong> 2024. Detection of spectral signature and classification of Alternaria alternata and Alternaria solani diseases in tomato plant by analysis of hyperspectral images and support vector machine. AgEng International Conference of EurAgEng, Agricultural Engineering Challenges in Existing and New Agrosystems, Athens, Greece, July 1-4, 2024.</li><br /> <li>Cho Y., Yu, Z., <strong>Ampatzidis Y.,</strong> Nam J., 2024. Blockchain-enhanced security and data management in smart agriculture. 6th CIGR International Conference, Jeju, Korea, May 19&ndash;23, 2024.</li><br /> <li>Dutt, N., &amp; <strong>Choi, D.</strong> (2024). A Computer Vision System for Mushroom Detection and Maturity Estimation using Depth Images. 2024 ASABE Annual International Meeting.</li><br /> <li>Etefaghi, A., <strong>Medeiros, H</strong>. &ldquo;ViLAD: Video-based Lettuce Association and Detection,&rdquo; American Society of Agricultural and Biological Engineers Annual International Meeting, Anaheim, CA, July 2024.</li><br /> <li>Gallios, I., &amp; <strong>Tziolas, N.</strong> (2024). Synergistic use of low-cost NIR scanner and geospatial covariates to enhance soil organic carbon predictions using dual input deep learning techniques.
IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 8-12 July, Athens, Greece.</li><br /> <li>Hernandez, B., <strong>Medeiros, H.</strong> &ldquo;Multiple Plant Tracking for Robotized Spraying of Ground Plants,&rdquo; 2023 IROS Workshop on Agricultural Robotics for a Sustainable Future Innovation in Precision Agriculture (3rd paper prize), Detroit, MI, Oct 2023.</li><br /> <li>Huang, Z., <strong>W. S. Lee, N.C.</strong> Takkellapati. 2024. Strawberry canopy size estimation with SAM guided by YOLOv8 detection. ASABE Paper No. 2400181. St. Joseph, MI.: ASABE.&nbsp;&nbsp;</li><br /> <li>Huang, Z<strong>., W. S. Lee,</strong> N.C. Takkellapati. 2024. HOPSY: Harvesting Optimization for Production of StrawberrY using real-time detection with modified YOLOv8-nano. In Proceedings of the 16th International Conference on Precision Agriculture (unpaginated, online). Monticello, IL: International Society of Precision Agriculture.</li><br /> <li>Ilodibe, U., &amp; <strong>Choi, D.</strong> (2024). Evaluating The Performance of a Mite Dispensing System for Biological Control of Chilli Thrips in Strawberry Production in Florida. 2024 ASABE Annual International Meeting.</li><br /> <li>Lacerda C., and Neto A.D.C<strong>., Ampatzidis Y.,</strong> 2024. Agroview: enhance satellite imagery using super-resolution and generative AI for precision management in specialty crops. AgEng International Conference of EurAgEng, Agricultural Engineering Challenges in Existing and New Agrosystems, Athens, Greece, July 1-4, 2024.</li><br /> <li><strong>Lee, W. S.</strong> 2023. Strawberry plant wetness detection using color imaging and artificial intelligence for the Strawberry Advisory System (SAS). 2023 Annual Strawberry AgriTech Conference, Plant City, FL, May 17, 2023.</li><br /> <li><strong>Lee, W. S.,</strong> T. Burks, and <strong>Y. Ampatzidis.</strong> 2023. Precision agriculture in Florida, USA &ndash; The Beginning, Progress, and Future. 
Chungnam National University, Daejeon-si, Korea. May 24, 2023.</li><br /> <li><strong>Lee, W. S.</strong>, T. Burks, and <strong>Y. Ampatzidis</strong>. 2023. Precision agriculture in Florida, USA &ndash; The Beginning, Progress, and Future. Department of Agricultural Engineering, Division of Smart Farm Development, National Institute of Agricultural Sciences, Jeonju-si, Korea. May 25, 2023.</li><br /> <li><strong>Lee, W. S.</strong>, T. Burks, and <strong>Y. Ampatzidis</strong>. 2023. Precision agriculture in Florida, USA &ndash; The Beginning, Progress, and Future. Seoul National University, Seoul, Korea. May 31, 2023.</li><br /> <li>Lee, W. S., Y. Ampatzidis, and D. Choi. 2023. University of Florida 2023 W-3009 Report (presented via Zoom). Cornell AgriTech, Cornell University, Geneva, NY. June 20-21, 2023.</li><br /> <li>Mirbod, O., &amp; <strong>Choi, D.</strong> (2023). Synthetic Data-Driven AI Using Mixture of Rendered and Real Imaging Data for Strawberry Yield Estimation. 2023 ASABE Annual International Meeting.</li><br /> <li><strong>Medeiros, H.</strong> &ldquo;Self-supervised Learning for Panoptic Segmentation of Multiple Fruit Flower Species,&rdquo; IEEE/RSJ International Conference on Intelligent Robots and Systems, Detroit, MI, Oct 2023.</li><br /> <li>Ojo I., Neto A.D.C., <strong>Ampatzidis Y.</strong>, Batuman O., Albrecht U., 2024. Needle-based, automated trunk injection system for HLB-affected citrus trees. International Research Conference on Huanglongbing VII, Riverside, CA, March 26-29, 2024.</li><br /> <li>Ottoy, S., Karyotis, K., Kalopesa, E., Van Meerbeek, K., Nedelkou, J., Gkrimpizis, T., De Vocht, A., Zalidis, G., &amp; <strong>Tziolas, N.</strong> (2024). Digital mapping of soil organic carbon using drone remote sensing. IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 8-12 July, Athens, Greece.</li><br /> <li>Vijayakumar V., <strong>Ampatzidis Y.,</strong> 2024.
Development of a machine vision and spraying system of a robotic precision smart sprayer for specialty crops. 3rd Annual AI in Agriculture and Natural Resources Conference, College Station, TX, April 15-17, 2024.</li><br /> <li>Wang, R., Hofstetter, D. <strong>Medeiros, H.</strong> Boney, J. Kassub, H. &ldquo;Evaluation of turkey behavior under different night lighting treatments using machine learning.&rdquo; American Society of Agricultural and Biological Engineers Annual International Meeting, Anaheim, CA, July 2024.</li><br /> <li>Zhou C., <strong>Ampatzidis Y.,</strong> Pullock D., 2024. Detecting citrus pests from sticky traps using deep learning. 3rd Annual AI in Agriculture and Natural Resources Conference, College Station, TX, April 15-17, 2024.</li><br /> <li>Zhou, X., <strong>Y. Ampatzidis, W. S. Lee</strong>, S. Agehara, and J. K. Schueller. 2023. AI-based inspection system for mechanical strawberry harvesters. AI in Agriculture: Innovation and discovery to equitably meet producer needs and perceptions Conference, Orlando, FL, April 17-19, 2023.</li><br /> <li>Zhou, C., <strong>W. S. Lee,</strong> W. Kratochvil, J. K. Schueller, and A. Pourreza. 2023. A portable imaging device for twospotted spider mite detection in strawberry. ASABE Annual Meeting, Omaha, NE, July 9-12, 2023.</li><br /> <li>Zhou, C., <strong>W. S. Lee,</strong> N. Peres, B. S. Kim, J. H. Kim, and H. C. Moon. 2023. Strawberry flower and fruit detection based on an autonomous imaging robot and deep learning. 14th European Conference on Precision Agriculture, Bologna, Italy, July 2-6, 2023.</li><br /> </ol><br /> <p>UTK</p><br /> <ol><br /> <li>Nasiri, A., Zhao, Y., <strong>Gan, H.</strong> (2024). Automated broiler behaviors measurement through deep learning models. ASABE Annual International Meeting, Anaheim, CA.</li><br /> <li>Amirivojdan, A., Nasiri, A., Zhao, Y., <strong>Gan, H.</strong> (2024). A machine vision system for broiler body weight estimation. 
ASABE Annual International Meeting, Anaheim, CA.</li><br /> </ol><br /> <p>UCDavis</p><br /> <ol><br /> <li>Li, Z; Karimzadeh, S.; <strong>Ahamed, M. S.</strong> (2024). Detection of Calcium Deficiency in the Growing Stage of Lettuce Using Computer Vision. ASABE Annual Meeting 2024, July 28-31, Anaheim, California.</li><br /> <li>Karimzdeh, S.; Chowdhury, M.; <strong>Ahamed, M. S</strong>. (2023). Fault Detection and Diagnosis of Hydroponic System using Intelligent Computational Model. ASABE Annual Meeting, July 9-12, Omaha, Nebraska.</li><br /> <li>Li, Z; Karimzadeh, S.; <strong>Ahamed, M. S.</strong> (2024). Nutrient Dosing Algorithms to Mitigate Ion Imbalance in Closed-Loop Hydroponic Systems. ASABE Annual Meeting 2024, July 28-31, Anaheim, California.</li><br /> </ol><br /> <p>UK</p><br /> <ol><br /> <li><strong>Mizuta K.</strong>, Miao Y, Lu J, and Negrini R. (2024) Evaluating Different Strategies to Analyze On-farm Precision Nitrogen Trial Data. 16th International Conference on Precision Agriculture, Manhattan, KS.</li><br /> <li>Miao Y, Kechchour A, Sharma V, Flores A, Lacerda L, <strong>Mizuta K</strong>, Lu J, and Huang Y. (2024) In-season Diagnosis of Corn Nitrogen and Water Status Using UAV Multispectral and Thermal Remote Sensing. 16th International Conference on Precision Agriculture, Manhattan, KS.</li><br /> <li>Oloyede, A. and <strong>Adedeji, A.A.</strong> (2024). Near-infrared hyperspectral imaging sensing for gluten detection and quantification. Accepted for presentation at 2024 ASABE Annual International Meeting, Anaheim, CA. July 28 &ndash; 31, 2024. Paper #: 2400053.</li><br /> <li><strong>Adedeji, A.A, </strong>Loeb, J.R., Doyle, L.E., Ekramirad, N., and Khaled, Y. Al Fadhl. (2023). Photon-induced reduction in barley malt processing time and quality improvement. 
A paper presented (oral) at the 14<sup>th</sup> International Congress on Engineering and Food (ICEF14) held in Nantes, France, June 20 &ndash; 23, 2023.</li><br /> <li><strong>Adedeji, A.A.,</strong> Ekramirad, N., Al Khaled, Y.A., Donohue, K., and Villanueva, R. (2023). Sensor data fusion and machine learning approach for pest infestation detection in apples. A poster presented at the SEC Conference with the theme: &ldquo;USDA-NIFA AI in Agriculture: Innovation and Discovery to Equitably Meet Producers&rsquo; Needs and Perceptions&rdquo; held in Orlando, Florida, April 17 &ndash; 19, 2023.</li><br /> </ol><br /> <p>K-State</p><br /> <ol><br /> <li>Alamdari S, <strong>Brokesh E.</strong> 2024. &ldquo;Enhancing Soil Health Monitoring in Precision Agriculture: A Comparative Analysis of avDAQ Vibration Data Collection System and Traditional Soil Sensors&rdquo; ASABE-AIM, Presentation # 2400896</li><br /> <li>Peiretti J, <strong>Sharda A.</strong> &ldquo;Experimental study on the impact of planter tool bar position on row unit behavior&rdquo; ASABE-AIM, Presentation # 2400215</li><br /> <li>Vail B, <strong>Brokesh E.</strong> &ldquo;Design and field-testing of a pull-force measuring frame for the testing of agricultural tire rolling resistance&rdquo; ASABE-AIM, Presentation # 2401007</li><br /> <li>Shende K, <strong>Sharda A.</strong> &ldquo;Integration &amp; testing of wireless data communication system for autonomous liquid application platform&rdquo; ASABE-AIM, Presentation # 2400833</li><br /> <li>Kaushal S, <strong>Sharda A.</strong> &ldquo;Enhancing Agricultural Feedback Analysis through VUI and Deep Learning Integration&rdquo; ASABE-AIM, Presentation # 2400287</li><br /> <li>Abon J, <strong>Sharda A.</strong> &ldquo;Optimizing Corn Irrigation Strategies: Insights from NDVI Trends, Soil Moisture Dynamics, and Remote Sensing&rdquo; ASABE-AIM, Presentation # 2400814</li><br /> <li>Peiretti J, <strong>Sharda A.</strong> &ldquo;Effective Strategies for
Closing Furrows Based on Corn Planter Settings&rdquo; ASABE-AIM, Presentation # 2400215</li><br /> </ol><br /> <p>&nbsp;</p><br /> <p><strong>Extension Articles </strong></p><br /> <p>UF</p><br /> <ol><br /> <li><strong>Choi, D.</strong>, Mirbod, O., Ilodibe, U., &amp; Kinsey, S. (2023). Understanding Artificial Intelligence: What It Is and How It Is Used in Agriculture: AE589, 10/2023.&nbsp;<em>EDIS</em>,<em>2023</em>(6).</li><br /> <li>Her Y.G., <strong>Bliznyuk N., Ampatzidis</strong>, Yu Z., and Bayabil H., 2024. Introduction to Artificial Intelligence in Agriculture. EDIS, University of Florida, IFAS Extension (accepted).</li><br /> <li>Sharma L., and <strong>Ampatzidis Y.,</strong> Approaches to consider for site-specific field mapping. SS713, EDIS, University of Florida, IFAS Extension, <a href="https://doi.org/10.32473/edis-SS713-2023">doi.org/10.32473/edis-SS713-2023</a>.</li><br /> </ol><br /> <p><strong>&nbsp;</strong></p><br /> <p>&nbsp;</p>

Impact Statements

  1. S1090 started in 2021. The group has organized three successful AI conferences held at three locations across the country, with participants drawn from both private and public institutions and the involvement of seasoned scientists and student scholars. A hands-on session on big data analytics was also held at each conference. In the reporting year, the group published 90 peer-reviewed articles, presented 80 conference papers, and trained a total of 147 graduate students, 24 post-docs, and 15 researchers on various subjects related to AI applications in agroecosystems.