

2,707 results retrieved; 50 displayed within access permissions.

[News] AI is opening doors for large-scale studies (full text)

Wageningen University & Research;

The numbers of scientific applications of artificial intelligence are growing rapidly. Smart computers and robots are opening new doors to research that isn’t feasible manually. ‘We can now scale projects up to previously inconceivable levels.’

Willem-Jan Knibbe, the head of the Wageningen Data Competence Centre and its Artificial Intelligence programme leader, sees AI as a great way of extracting knowledge from data. ‘AI was already excellent at pattern recognition; it can for instance tell you what kind of animal or tree is in a picture, and it can recognize faces and look for relationships. That’s making research faster, more efficient and more productive. But over the last couple of years AI has also become creative and generative. You can make it write, draw, talk and construct things, for example with ChatGPT. That’s a radical change that affects all of us.’

Knibbe and his colleagues at the data centre want to create value from Wageningen’s data. ‘The data centre was set up in 2017, when the Big Data explosion came along with the exponential growth in the quantities of data being produced and stored. Larger and more complex datasets led to the rise of data science as a way of gleaning understandings from the data. And now we have to deal with the developments in AI. Responding sufficiently quickly to all those changes is a continuous challenge in research and education, where the possibilities for rapid adaptation are more limited because the programme is generally fixed for a year. Bringing it all together is pretty complex.’

This article appeared earlier in Wageningen World 1 | 2024, the magazine of Wageningen University & Research.

Inspiration

Knibbe believes that AI tools offer boundless opportunities. ‘You can ask ChatGPT for a good study design, for instance. That doesn’t work perfectly yet, but it helps you find inspiration.
AI can also help you construct, correct and combine ideas more quickly. AI has capabilities that you can’t even conceive of, and it reads through more material than you can. That’s where you have to be careful, though, because AI can read the wrong things too. But if you do it properly, AI is a powerful and helpful sparring partner.’

AI is already being used avidly in innovative projects: for recognizing food quality using camera images, for the fully automated cultivation of land, for monitoring livestock health, or in the search for the hereditary characteristics of resistant strains.

‘You always need human intelligence at the front and back ends’

Computer vision

Erik Pekkeriet is the manager of Vision + Robotics, the programme that brings together experts in computer vision – image interpretation by software – robotics and AI from all corners of WUR, so that the technology can be utilized in agriculture, horticulture, fishing, livestock farming and the food supply chain. He thinks Wageningen’s researchers are still a bit traditional in their attitudes to AI. ‘We don’t really trust it yet and want to do a lot of measurements and counts manually, just to be sure that it’s all correct. AI-based image processing systems are so good and efficient nowadays, though, that they do the job better and more completely than we do, as well as saving a lot of effort. The technology has genuinely turned a corner over the past ten years. Manual measuring and counting is going to be largely redundant in future and will be replaced by AI-based, robotized systems. On top of that, researchers will often have a lot more data points available, for instance because they can now use drones to fly over an area to gather data.’

Predicting illegal deforestation

Researchers at Wageningen Environmental Research are working on making the Forest Foresight system of the World Wide Fund for Nature (WWF) even smarter.
This system uses radar images from the Sentinel-1 satellite to make a detailed map in which AI can show where felling is likely to happen, up to several months in advance. It can, for instance, determine every few days where new roads have been created, which indicates where heavy vehicles are going to be used, for example for tree felling. The system is undergoing trials in the tropical rainforests of Suriname, Gabon and Kalimantan. The initial results are promising, according to Johannes Reiche, an assistant professor of Radar Remote Sensing. ‘The nice thing is that we can now also teach the system about the causes of deforestation. It can recognize activities such as mining, agriculture and tree felling, and it also knows about various types of forests. That means the system can estimate accurately where the risk of illegal deforestation is highest. Felling is less likely in wet woodlands, for example. Local rangers can use this system to see where they need to send their patrols instead of just reacting to what has already happened.’

Jeroen Hoekendijk, a marine biologist and computer scientist at Wageningen Marine Research, knows all about the scale AI makes possible. For his doctoral thesis, he used AI to automate the counting of seals in the Wadden Sea using aerial images. In current research, birds above the North Sea are also being counted using photographs, he tells us. ‘In the past, birds were counted by an expert from a plane; nowadays, modern aerial cameras can photograph large areas at high resolution and the images are analysed automatically. The initial results are very promising, but a lot of example data from the experts is currently needed if the process is to be improved. The algorithm has difficulty with similar-looking bird species in particular.’

Although he started in biology, over time Hoekendijk has shifted towards computer science. ‘At the moment, I’m helping ecologists and biologists to use AI tools in their research.
Because I know both sides of the coin, I’ve got a kind of bridging role.’ As an example, he gives research into determining the age of a fish. That is done by looking at growth rings in otoliths, the small ossicles in the fish’s ears, which get annual growth rings just like trees do. Researchers have been taking photos of these for years, and these old datasets can be used for machine learning: teaching the algorithm to count the annual rings. ‘You do that by showing the computer one photo at a time along with the corresponding number of rings until it can count them correctly on new photos. It’s got to be accurate and the research is very labour intensive, so it’ll be great if AI is able to take it over.’

Recognizing sounds

AI is also being used increasingly often in biodiversity research, not only for image recognition but also for recognizing the sounds made by birds, marine mammals, bats and fish. This can create better understandings of where animals are located and how they behave. According to Hoekendijk, the added value of AI is in the scale. ‘Automation opens new doors for research that was impossible before because it wouldn’t be feasible manually. We can now scale projects up to previously inconceivable levels: we’re doing plankton research, for instance, at a scale that would have been unimaginable in the past. The latest technology lets us photograph 10,000 plankton particles a minute and we then use smart algorithms to analyse the photos. The quantities, species and locations of plankton all vary with the seasons, so using this tool lets us do monitoring at a much larger scale and pick up the changes more quickly.’

‘We’re doing plankton research on a scale that was unimaginable in the past’

AI uses a lot of electricity

The United Nations and the World Economic Forum predict a major role for AI in the battle against climate change. AI could help make things greener, for instance by automatically switching off energy sources when they aren’t needed.
That doesn’t alter the fact that AI itself has a substantial ecological footprint, though: producing and transporting all the hardware, the water needed to cool the servers in data centres, and the large amounts of electricity needed to train the AI models and keep them running. According to the International Energy Agency, data centres use about 3 per cent of all the electrical power on the planet and are responsible for 1 per cent of global CO2 emissions. That may not sound like much, but even the aviation sector ‘only’ emits twice as much. The penny is slowly starting to drop in the academic world that digitalization comes at an ecological cost.

Hoekendijk is positive nevertheless. ‘The climate impact varies from one project to the next. And AI can have a positive impact too: we are now able to use existing satellite images, for example, which is more environmentally friendly than flying. You can also use an underwater drone to film the ocean floor and see what’s living there, without fishing or ruining the seabed by scraping it away. And images from Google Earth enable us to detect new forests of dark-green seaweed, which we can then protect. These kelp forests are crucial for biodiversity. AI tools can’t save the climate; human beings must do that. AI can help us, though.’

Classifying fish catches

In the Fully Documented Fisheries project, work is being done on a system that creates a fully automatic picture of fishing catches, without the trawler crew or observers having to be involved. The catch is automatically detected and classified by number and by species as it passes along the conveyor belts. The system uses GPS, sensors and cameras plus an onboard computer. Researchers at Wageningen Marine Research are developing computer vision methods for analysing and interpreting the images.
This technology makes complete documentation of the fish catch possible, which will help the sector become more sustainable and allow fish populations to be managed more responsibly.

Losing control

Vincent Blok, professor of the Philosophy of Technology and Responsible Innovation, notes that society also has concerns about AI. ‘In a well-made marketing video about Lely milking robots, you see fully automated cowshed systems – without any people. This has an alienating effect on the general public, though: they no longer see any relationship between the humans and the animals. That relationship had already weakened in livestock farming, but the robots draw attention to the fact.’

Blok thinks people sometimes wonder whether we’re losing control, with AI taking over. ‘Scientists need to address that concern, so that ordinary people can assess the potential, the opportunities and the risks. If public opinion turns against AI, that could work against the science and the technology. So this is something for interdisciplinary cooperation between the philosophers and the technologists.’

Blok is leading a project about the ethical, legal and social aspects (ELSA) of AI in sustainable food systems. The ELSA lab aims to develop responsible, human-centric AI. ‘We’re working with various chair groups to provide critical reflections on the negative and unforeseen effects of AI on humans, animals and society. What are the ethical issues, and who is in control? In the Netherlands, we’re thinking carefully about the ethics and philosophy of AI.’

Hoekendijk has seen AI building up momentum massively over the past five years. ‘It’s difficult to say where we’ll be five years from now. I’d expect AI to need less and less example data from experts and the tools to become ever more accessible to people who aren’t computer scientists.’ Pekkeriet believes WUR still has a way to go, and that researchers will have to learn what AI can do for them.
‘We understand which data items can be linked together, but AI doesn’t. With generative AI such as ChatGPT or Google, you often don’t know where the information has come from, and so you regularly get lousy answers: GIGO (garbage in, garbage out). When we’re doing research, we know the origins of our data – and we have colossal amounts of data available.’

Human intelligence

Sometimes it is not clear whether AI would actually solve a particular problem. Self-learning machines don’t always outperform humans. ‘AI does not possess human intelligence,’ says Blok emphatically. ‘I think that we’re heading for a sort of hybrid intelligence. You always need human intelligence at the front and back ends. We need to utilize the user’s expertise in a positive way. Human-centric AI can help increase human capacities.’ According to Blok, that also raises the question of whether we aren’t defining the concept of intelligence too narrowly. ‘Why do we assume it’s either artificial intelligence or human intelligence? Maybe we ought to move from human-centric to biocentric AI. There are forms of intelligence in non-human systems too – take a flock of birds, for instance.’

Triage of diseased greenhouse seedlings

Selecting and sorting young seedlings before they are transferred to the greenhouse to develop is a labour-intensive task. Scientists from the Vision + Robotics programme are working on a technique for automating that selection process, in which diseased and non-viable plants are recognized and picked out. The new technology uses a camera that records the shape and colour of the seedling’s roots and shoot. Image processing and machine learning are then used to determine whether it is a viable plant. The AI technology can moreover help determine which characteristics are predictors of plant health. ‘We’re currently in the middle of our feasibility study,’ says Lydia Meesters, the project manager.
‘Can we genuinely produce the images we need of the plant properties so that its health can be determined? And if so, how can we create the best possible picture of these characteristics using simple, scalable technology?’

Recognizing the sounds of the sea

New sensors can provide valuable information about the state of biodiversity in the oceans. Researchers from the Marine Animal Ecology chair group and others in the Next-Level Animal Sciences innovation programme are developing a smart biodiversity sensor box. This box makes underwater video and audio recordings and takes water samples for analysis of what is known as eDNA (environmental DNA). The ultimate aim is for the biodiversity box to take an eDNA sample whenever the video or audio has detected an organism. An acoustic machine learning model will help detect and identify the sounds made by marine animals. The box can be deployed in places such as offshore wind farms in the North Sea, where diving is forbidden or hazardous, or where visibility is limited. The project is intended to generate an online database of the sounds of the North Sea. A dashboard will also be developed that combines sounds, video images and eDNA, allowing anyone to observe marine animals in real time.
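The trigger logic described for the biodiversity sensor box (take an eDNA sample only when the audio detector reports an organism) can be sketched in a few lines. This is a hypothetical illustration: the class, threshold and species names below are invented for the example, not the project's actual software.

```python
from dataclasses import dataclass, field

@dataclass
class BiodiversityBox:
    """Toy stand-in for the sensor box's sampling decision loop."""
    trigger_threshold: float = 0.8   # detector confidence needed to sample
    samples: list = field(default_factory=list)

    def on_audio_detection(self, species: str, confidence: float, timestamp: float) -> bool:
        # eDNA sample bottles are limited, so only spend one on a confident detection.
        if confidence >= self.trigger_threshold:
            self.samples.append((timestamp, species, confidence))
            return True
        return False

box = BiodiversityBox()
box.on_audio_detection("harbour porpoise", 0.93, timestamp=120.0)  # triggers a sample
box.on_audio_detection("unknown clicks", 0.40, timestamp=150.0)    # ignored
print(len(box.samples))  # 1
```

In practice the confidence score would come from the acoustic machine learning model mentioned above; the point of the sketch is only the event-driven coupling between detection and sampling.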

[News] Wageningen launches research into sustainable solar fuel (full text)

Wageningen University & Research;

Wageningen researchers have launched a major project (SUN-PERFORM) to develop sustainable fuels made from oil produced by algae. The researchers will use synthetic biology and nanotechnology to enhance the algae’s ability to capture sunlight more efficiently for photosynthesis. The project has received €4 million in funding from Horizon Europe, the EU’s research and innovation funding programme, of which €1.5 million is allocated to Wageningen University & Research (WUR).

Algae naturally produce oil within their cells, which can be converted into fuel. Researchers initially explored such biofuels in the early 2000s; however, as algae require large amounts of light and additional nutrients to grow, the process was too energy-intensive and costly. Wageningen scientists are now working on a new form of fuel using algae that obtain their energy mainly from sunlight. In addition, they want to harvest the sunlight more efficiently. The resulting sustainable fuel is called solar fuel and is very suitable for shipping and aviation, since these sectors, unlike land-based vehicles, cannot easily switch to electricity.

“The scientists use a special film with quantum dots that convert unusable blue and UV light into light that algae can utilize.”

Quantum dots

To make algae work more efficiently to produce solar fuel, researchers are first attempting to increase the amount of light available to them. Sunlight spans a broad spectrum of colours, from red to green and ultraviolet (UV). “Algae, however, primarily use red light for photosynthesis and their cellular processes,” explains Sarah D’Adamo, project leader and Associate Professor of Bioprocess Engineering. Other colours, including blue and UV light, are therefore hardly used, if at all. A shame, according to the Wageningen researchers. In this project, they plan to utilise nanotechnology to convert part of this unused light into red light, which algae can use.
To achieve this, the scientists use a special transparent film containing so-called quantum dots, which convert part of the blue and UV light into red light that algae can use. They stick this film on the glass reaction vessels in which the algae grow. “You can compare it to a sheet of red plastic,” D’Adamo illustrates. “If you stick that to a window, it acts as a filter, allowing only red light to pass through.” In reality, the quantum dots in the specialised film do not function as filters but actively convert light particles (photons). With this technology, the film makes more light available to the algae, enhancing their energy uptake.

“Enhancing photosynthesis is a long-standing dream among biologists.”

Biological battery

If the algae have more light at their disposal, they should be able to use it rather than let it go to waste. To make sure they can, biologists are attempting to incorporate a kind of biological battery in the algae that temporarily stores solar energy. “This is a biological protein where the algae store phosphate groups to generate energy molecules.” The researchers are also adding a new molecular system that allows the algae to absorb more carbon dioxide (CO₂) and grow more efficiently. For this part of the research, the bioprocess technologists collaborate with Wageningen microbiologist Nico Claassens. Finally, the researchers are reprogramming the algae’s DNA to stimulate higher oil production than usual. Together, these innovations should significantly improve the growth and oil production of the algae.

D’Adamo is excited about the project’s launch. “Improving photosynthesis has long been a dream of biologists,” she says. “What makes this research unique is that we are combining synthetic biology with nanotechnology, pushing the boundaries of nature.
It is a huge challenge, and perhaps we will not succeed in four years, but the consortium has strong expertise, and we will certainly give it our best shot.” Over the next four years, two PhD candidates and a postdoc will work on the project at WUR, and several more will work with the other partners.

Horizon Europe

Horizon Europe is the EU’s largest research and innovation funding programme. Until 2027, €93.5 billion will be available for small- and large-scale projects, particularly those addressing climate change and the UN’s Sustainable Development Goals. The European Commission aims to stimulate science and innovation in Europe by encouraging collaboration between academia and industry to find solutions for societal challenges. The SUN-PERFORM project received a total of €4 million in funding, of which €1.5 million is allocated to WUR.

Consortium

In the SUN-PERFORM project, WUR collaborates with the following partners:

Universitaet Bielefeld
Politecnico Di Torino
In Srl Impresa Sociale
University of Amsterdam
Solarfoil B.V.
Max-Planck-Gesellschaft Zur Forderung Der Wissenschaften EV
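The gain from down-converting blue and UV light can be made concrete with a back-of-the-envelope photon-energy calculation, E = hc/λ: a blue photon carries more energy than a red one, so some energy is sacrificed in conversion, but the resulting red photon lands in the band algae actually absorb. The wavelengths below are representative values for blue and red light, not figures from the SUN-PERFORM project.

```python
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in electronvolts for a given wavelength in nanometres."""
    return H * C / (wavelength_nm * 1e-9) / EV

blue = photon_energy_ev(450)   # ~2.76 eV
red = photon_energy_ev(660)    # ~1.88 eV
# The ~0.9 eV difference is lost in down-conversion, but the red photon
# is one the algae's photosynthetic machinery can actually use.
print(round(blue, 2), round(red, 2))
```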

[Academic Literature] From Food Industry 4.0 to Food Industry 5.0: Identifying technological enablers and potential future applications in the food sector (full text)

COMPREHENSIVE REVIEWS IN FOOD SCIENCE AND FOOD SAFETY

Although several food-related fields have yet to fully grasp the speed and breadth of the fourth industrial revolution (also known as Industry 4.0), growing literature from other sectors shows that Industry 5.0 (referring to the fifth industrial revolution) is already underway. Food Industry 4.0 has been characterized by the fusion of physical, digital, and biological advances in food science and technology, whereas future Food Industry 5.0 could be seen as a more holistic, multidisciplinary, and multidimensional approach. This review will focus on identifying potential enabling technologies of Industry 5.0 that could be harnessed to shape the future of food in the coming years. We will review the state-of-the-art studies on the use of innovative technologies in various food and agriculture applications over the last 5 years. In addition, opportunities and challenges will be highlighted, and future directions and conclusions will be drawn. Preliminary evidence suggests that Industry 5.0 is the outcome of an evolutionary process and not of a revolution, as is often claimed. Our results show that regenerative and/or conversational artificial intelligence, the Internet of Everything, miniaturized and nanosensors, 4D printing and beyond, cobots and advanced drones, edge computing, redactable blockchain, metaverse and immersive techniques, cyber-physical systems, digital twins, and sixth-generation wireless and beyond are likely to be among the main driving technologies of Food Industry 5.0. Although the framework, vision, and value of Industry 5.0 are becoming popular research topics in various academic and industrial fields, the agri-food sector has just started to embrace some aspects and dimensions of Industry 5.0.

[Academic Literature] Exploring the Integration of Industry 4.0 Technologies in Agriculture: A Comprehensive Bibliometric Review (full text)

SUSTAINABILITY

While it is essential to increase agricultural production to meet the needs of a growing global population, this task is becoming increasingly difficult due to the environmental challenges faced in recent decades. A promising solution to enhance the efficiency and sustainability of agricultural production is the integration of Industry 4.0 technologies, such as IoT, UAVs, AI, and Blockchain. However, despite their potential, there is a lack of comprehensive bibliometric analyses that cover the full range of these technologies in agriculture. This gap limits understanding of their integration and impact. This study aims to provide a holistic bibliometric analysis of the integration of Industry 4.0 technologies in agriculture, identifying key research trends and gaps. We analyzed relevant literature using the Scopus database and VOSviewer software (version 1.6.20, Centre for Science and Technology Studies, Leiden University, The Netherlands) and identified five major thematic clusters within Agriculture 4.0. These clusters were examined to understand the included technologies and their roles in promoting sustainable agricultural practices. The study also identified unexplored technologies that present opportunities for future research. This paper offers a comprehensive overview of the current research landscape in Agriculture 4.0, highlighting areas for innovation and development, and serves as a valuable resource for enhancing sustainable agricultural practices through technological integration.

[News] Microsoft Unveils Adapted AI Models for Agriculture (full text)

Microsoft News;Global Ag Tech Initiative;

Across every industry, AI is creating a fundamental shift in what’s possible, enabling new use cases and driving business outcomes. While organizations around the world recognize the value and potential of AI, for AI to be truly effective it must be tailored to specific industry needs. Today, we’re announcing adapted AI models, expanding our industry capabilities and enabling organizations to address their unique needs more accurately and effectively. In collaboration with industry partner experts like Bayer, Cerence, Rockwell Automation, Saifr, Siemens Digital Industries Software, Sight Machine and more, we’re making these fine-tuned models, pre-trained using industry-specific data, available to address customers’ top use cases.

Underpinning these adapted AI models is the Microsoft Cloud, our platform for industry innovation. By integrating the Microsoft Cloud with our industry-specific capabilities and a robust ecosystem of partners, we provide a secure approach to advancing innovation across industries. This collaboration allows us to create extensive scenarios for customers globally, with embedded AI capabilities — from industry data solutions in Microsoft Fabric to AI agents in Microsoft Copilot Studio to AI models in Azure AI Studio — that enable industries to realize their full potential.

Introducing adapted AI models for industry

We’re pleased to introduce these new partner-enabled models from leading organizations that are leveraging the power of Microsoft’s Phi family of small language models (SLMs). These models will be available through the Azure AI model catalog, where customers can access a wide range of AI models to build custom AI solutions in Azure AI Studio, or directly from our partners.
The models available in the Azure AI model catalog can also be used to configure agents in Microsoft Copilot Studio, a platform that allows customers to create, customize and deploy AI-powered agents, which can be applied to an industry’s top use cases to address its most pressing needs.

Bayer, a global enterprise with core competencies in the life science fields of healthcare and agriculture, will make E.L.Y. Crop Protection available in the Azure AI model catalog. A specialized SLM, it is designed to enhance the sustainable use and application of crop protection, as well as compliance and knowledge, within the agriculture sector. Built on Bayer’s agricultural intelligence and trained on thousands of real-world questions about Bayer crop protection labels, the model gives agricultural entities, their partners and developers a valuable tool for tailoring solutions to specific food and agricultural needs. The model stands out for its commitment to responsible AI standards, its scalability to farm operations of all types and sizes, and its customization capabilities, which allow organizations to adapt the model to regional and crop-specific requirements.

Cerence, which creates intuitive, seamless and AI-powered user experiences for the world’s leading automakers, is enhancing its in-vehicle digital assistant technology with fine-tuned SLMs that run within the vehicle’s hardware. CaLLM™ Edge, an automotive-specific, embedded SLM, will be available in the Azure AI model catalog. It can be used for in-car controls, such as adjusting air conditioning systems, and in scenarios with limited or no cloud connectivity, enabling drivers to access the rich, responsive experiences they’ve come to expect from cloud-based large language models (LLMs), no matter where they are.

Rockwell Automation, a global leader in industrial automation and digital transformation, will provide industrial AI expertise via the Azure AI model catalog.
The FT Optix Food & Beverage model brings the benefits of industry-specific capabilities to frontline workers in manufacturing, supporting asset troubleshooting in the food and beverage domain. The model provides timely recommendations, explanations and knowledge about specific manufacturing processes, machines and inputs to factory floor workers and engineers.

Saifr, a RegTech within Fidelity Investments’ innovation incubator, Fidelity Labs, will introduce four new models in the Azure AI model catalog, empowering financial institutions to better manage the regulatory compliance of broker-dealer communications and investment adviser advertising. The models can highlight potential regulatory compliance risks in text (Retail Marketing Compliance model) and images (Image Detection model); explain why something was flagged (Risk Interpretation model); and suggest alternative language that might be more compliant (Language Suggestion model). Together, these models can enhance regulatory compliance by acting as an extra set of review eyes, and boost efficiency by speeding up review turnarounds and time to market.

Siemens Digital Industries Software, which helps organizations of all sizes digitally transform using software, hardware and services from the Siemens Xcelerator business platform, is introducing a new copilot for NX X software. It leverages an adapted AI model that enables users to ask natural language questions, access detailed technical insights and streamline complex design tasks for faster and smarter product development. The copilot will provide CAD designers with AI-driven recommendations and best practices to optimize the design process within the NX X experience, helping engineers implement best practices faster to ensure the expected quality from design to production. The NX X copilot will be available in the Azure Marketplace and other channels.
Sight Machine, a leader in data-driven manufacturing and industrial AI, will release Factory Namespace Manager to the Azure AI model catalog. The model analyzes existing factory data, learns the patterns and rules behind the naming conventions, and then automatically translates these data field names into standardized corporate formats. This translation makes the universe of plant data in the manufacturing enterprise AI-ready, enabling manufacturers to optimize production and energy use in plants, balance production with supply chain logistics and demand, and integrate factory data with enterprise data systems for end-to-end optimization. The bottling company Swire Coca-Cola USA plans to use Factory Namespace Manager to efficiently map its extensive PLC and plant-floor data into its corporate data namespace.

We also encourage innovation in the open-source ecosystem and are offering five open-source Hugging Face models that are fine-tuned for summarization and sentiment analysis of financial data. Additionally, last month we announced new healthcare AI models in Azure AI Studio. These state-of-the-art multimodal medical imaging foundation models, created in partnership with organizations like Providence and Paige.ai, empower healthcare organizations to integrate and analyze a variety of data types, leveraging intelligence in modalities other than text in specialties like ophthalmology, pathology, radiology and cardiology.

Accelerating transformation with industry agents

Microsoft also offers AI agents that are purpose-built for industry scenarios. Available in Copilot Studio, these agents can be configured to support organizations’ industry-specific needs. For example, retailers can use the Store Operations Agent to support retail store associates and the Personalized Shopping Agent to enhance customers’ shopping experiences.
Manufacturers can use the Factory Operations Agent to enhance production efficiency and reduce downtime by enabling engineers and frontline workers to quickly identify and troubleshoot issues.

All this AI innovation wouldn’t be possible without a solid data estate, because AI is only as good as the data it’s built upon. By ensuring data is accurate, accessible and well integrated, organizations can unlock deeper insights and drive more effective decision-making with AI. Microsoft Fabric, a data platform built for the era of AI, helps unify disparate data sources and prepares data for advanced analytics and AI modeling. It offers industry data solutions that address each organization’s unique needs and allows them to discover, deploy and do more with AI.

At the forefront of addressing industry needs securely

At the core of our AI strategy is a commitment to trustworthy AI. This commitment encompasses safety, security and privacy, ensuring that AI solutions are built with the highest standards of integrity and responsibility. Trustworthy AI is foundational to everything we do, from how we work with customers to the capabilities we build into our products. At Microsoft, we combine industry AI experience, insights and capabilities with a deep understanding of customer challenges and objectives. Along with a trusted ecosystem of experienced partners, we unlock the full potential of AI for each industry and business. Our goal is not just to offer or implement AI tools but to help customers succeed by embedding AI into the very core of what each industry does. AI transformation is here, and Microsoft is at the forefront of this revolution. As we continue to navigate this new era of innovation, it’s clear that AI will play a pivotal role in shaping the future of business across all industries and that Microsoft will continue to lead the way.
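Sight Machine's Factory Namespace Manager, described above, translates ad hoc factory tag names into a standardized corporate namespace. As an illustration of what such a translation involves, here is a purely rule-based sketch; the abbreviation table, target naming convention and function name are invented for this example, and the real product is described as a learned model rather than hand-written rules like these.

```python
import re

# Hypothetical abbreviation table; a real deployment would learn these
# expansions from existing plant data rather than hard-code them.
ABBREVIATIONS = {"tmp": "temperature", "prs": "pressure", "spd": "speed"}

def to_corporate_name(tag: str) -> str:
    """Normalize a raw PLC tag like 'Line3_Tmp_Sensor01'
    to an invented corporate format: 'line_03.temperature.sensor_01'."""
    parts = re.split(r"[_\-\s]+", tag.lower())
    out = []
    for part in parts:
        # Split a trailing number off a word: 'sensor01' -> ('sensor', '01').
        m = re.fullmatch(r"([a-z]+)(\d+)", part)
        if m:
            word, num = m.group(1), int(m.group(2))
            out.append(f"{ABBREVIATIONS.get(word, word)}_{num:02d}")
        else:
            out.append(ABBREVIATIONS.get(part, part))
    return ".".join(out)

print(to_corporate_name("Line3_Tmp_Sensor01"))  # line_03.temperature.sensor_01
```

The hard part the product addresses is exactly what this sketch hard-codes: discovering each plant's naming patterns automatically so that thousands of such tags can be mapped without manual rule-writing.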
To learn more about how customers in a variety of industries are transforming with AI, visit How real-world businesses are transforming with AI.  

[前沿资讯 ] Greeneye Technology Partners with Croplands for Field Trials in Australia 进入全文

Greeneye Technology;Global Ag Tech Initiative;

Greeneye Technology, the pioneer of AI-enabled precision spraying technology that is proven to reduce non-residual herbicide use in farming by an average of 87%, today announces its first field trials outside of the U.S. The company is partnering with Croplands, Nufarm's equipment and emerging spray solutions platform in Australia, to evaluate the effectiveness of precision spraying in post-emergence applications on Australian soil. Greeneye has already begun an extensive data collection program in Australia to customize its breakthrough precision spraying technology to local field conditions and crops including canola and cereals. The field trials will begin in 2025. Headquartered in Adelaide, South Australia, Croplands has regional locations and a firmly established footprint throughout Australia. It is a pioneer and leading distributor of infrared-based precision spraying systems for pre-emergence applications, providing sales, service and support across the country. Commenting on the field trials, Steve Norton, Portfolio Manager at Croplands, says: "Croplands' mission is to provide our customers with access to cutting-edge technologies that drive ROI potential and productivity. Through existing partnerships we have already helped close to a thousand farmers to significantly reduce chemical use in pre-emergence applications. We are now looking to the next generation of precision spraying technology to unlock its full potential by offering solutions that can be utilized during both pre- and post-emergence treatment." Croplands selected the Greeneye system for trial following an extensive evaluation of promising precision spraying technologies. "When assessing which technologies to include in these trials, there were several features that stood out about the Greeneye system," explains Norton. "First, it is a proven technology, having already firmly established itself in the U.S. market.
Second, it is entirely machine agnostic, meaning it can be retrofitted onto farmers' existing sprayers, overcoming a major cost-of-entry barrier. And third, it features a dual tank/line configuration that allows farmers to simultaneously broadcast residual herbicides while precisely spraying non-residual herbicides only on the weeds. We believe this will be a game changer in terms of improving sprayer operator efficiency." "However, perhaps most compelling of all," Norton continues, "is that Greeneye is already working with farmers in the U.S. to extend the usage of its system to other inputs such as anti-fungals and micronutrients. This approach isn't even on the radar in Australia, yet we believe it will significantly increase the value of precision spraying for our customers." The trials, which are supported by funding from the Grains Research and Development Corporation, an initiative designed to increase the profitability of the grains industry in Australia, mark another major milestone in Greeneye's mission to unlock the cost and environmental benefits of precision spraying for farmers worldwide. In 2022, it became the first company to launch precision spraying commercially in the U.S. Today, it is working with dozens of corn, soybean and cotton farmers in the Midwest to transform their weed management programs. The technology harnesses cutting-edge hardware in combination with proprietary AI technology to identify and spray weeds during both pre- and post-emergence treatment with unrivalled accuracy. Cameras mounted on the sprayer boom capture high-resolution images of the field at a rate of 40 frames per second, enabling the rapid detection and precise classification of weeds down to the species level. This information is fed back to the system's graphics processing units, which calculate the exact amount of herbicide required and signal to the appropriate nozzles to open, spraying only the weeds.
This entire process takes just milliseconds to execute and can be carried out at commercial travel speeds of 15 mph, meaning no loss in productivity compared to broadcast application. Nadav Bocher, CEO, Greeneye Technology, comments: “We are delighted to announce our collaboration with Croplands, a true visionary in the ag space that shares our mission to leverage innovative technology to enhance farming outcomes and protect the environment. Croplands has established itself as a leader in driving mainstream adoption of precision spraying technology, and it has an expansive network in place to facilitate a rapid roll-out. We could not have asked for a better partner to bring our technology to Australian producers.”  
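The figures in the article (40 frames per second, 15 mph travel speed, millisecond-scale processing) allow a back-of-envelope timing budget for the detect-and-spray loop. The sketch below works through that arithmetic; the 10 ms end-to-end latency is an assumed value for illustration, not a figure from Greeneye.

```python
# Timing budget for a camera-based spot-spraying loop, using the article's
# figures: 40 fps camera, 15 mph commercial travel speed. The 10 ms
# detection-plus-actuation latency is an assumption for illustration.

MPH_TO_MPS = 0.44704                      # miles per hour -> metres per second

travel_speed_mps = 15 * MPH_TO_MPS        # ~6.7 m/s at commercial speed
frame_interval_s = 1 / 40                 # 40 frames per second

# Ground covered between consecutive camera frames
ground_per_frame_m = travel_speed_mps * frame_interval_s

# Distance the boom travels during the assumed processing latency,
# i.e. how far ahead of the detected weed the nozzle must be timed
assumed_latency_s = 0.010
drift_m = travel_speed_mps * assumed_latency_s

print(f"travel speed:        {travel_speed_mps:.2f} m/s")
print(f"ground per frame:    {ground_per_frame_m * 100:.1f} cm")
print(f"drift in 10 ms:      {drift_m * 100:.1f} cm")
```

At these speeds the sprayer covers roughly 17 cm between frames and several centimetres during even a 10 ms latency, which is why per-weed nozzle timing has to be computed in milliseconds, as the article notes.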
