Current journal: Journal of Field Robotics
  • WiMUST: A cooperative marine robotic system for autonomous geotechnical surveys
    J. Field Robot. (IF 3.581) Pub Date : 2020-09-16
    Enrico Simetti; Giovanni Indiveri; António M. Pascoal

    This paper presents the main results of the European H2020 WiMUST project, whose aim was the development of a system of cooperative autonomous underwater vehicles and autonomous surface vehicles for geotechnical surveying. In particular, insights on the overall robotic technologies and methodologies employed, ranging from the communications and navigation framework to the cooperative and coordinated

    Updated: 2020-09-18
  • Efficient autonomous navigation for planetary rovers with limited resources
    J. Field Robot. (IF 3.581) Pub Date : 2020-08-25
    Levin Gerdes; Martin Azkarate; José Ricardo Sánchez‐Ibáñez; Luc Joudrier; Carlos Jesús Perez‐del‐Pulgar

    Rovers operating on Mars require more and more autonomous features to fulfill their challenging mission requirements. However, the inherent constraints of space systems render the implementation of complex algorithms an expensive and difficult task. In this paper, we propose an architecture for autonomous navigation. Efficient implementations of autonomous features are built on top of the ExoMars path

    Updated: 2020-09-14
  • Self‐reliant rovers for increased mission productivity
    J. Field Robot. (IF 3.581) Pub Date : 2020-08-11
    Daniel Gaines; Gary Doran; Michael Paton; Brandon Rothrock; Joseph Russino; Ryan Mackey; Robert Anderson; Raymond Francis; Chet Joswig; Heather Justice; Ksenia Kolcio; Gregg Rabideau; Steve Schaffer; Jacek Sawoniewicz; Ashwin Vasavada; Vincent Wong; Kathryn Yu; Ali‐akbar Agha‐mohammadi

    Achieving consistently high levels of productivity has been a challenge for Mars surface missions. While the rovers have made major discoveries and dramatically increased our understanding of Mars, they require a great deal of interaction from the operations teams, and achieving mission objectives can take longer than anticipated when productivity is paced by the ground teams' ability to react. We

    Updated: 2020-09-14
  • AMZ Driverless: The full autonomous racing system
    J. Field Robot. (IF 3.581) Pub Date : 2020-08-04
    Juraj Kabzan; Miguel I. Valls; Victor J. F. Reijgwart; Hubertus F. C. Hendrikx; Claas Ehmke; Manish Prajapat; Andreas Bühler; Nikhil Gosala; Mehak Gupta; Ramya Sivanesan; Ankit Dhall; Eugenio Chisari; Napat Karnchanachari; Sonja Brits; Manuel Dangel; Inkyu Sa; Renaud Dubé; Abel Gawel; Mark Pfeiffer; Alexander Liniger; John Lygeros; Roland Siegwart

    This paper presents the algorithms and system architecture of an autonomous racecar. The introduced vehicle is powered by a software stack designed for robustness, reliability, and extensibility. To autonomously race around a previously unknown track, the proposed solution combines state of the art techniques from different fields of robotics. Specifically, perception, estimation, and control are incorporated

    Updated: 2020-09-14
  • Autonomous platooning of multiple ground vehicles in rough terrain
    J. Field Robot. (IF 3.581) Pub Date : 2020-09-01
    Jongho Shin; Dong Jun Kwak; Jun Kim

    This study proposes an autonomous platooning algorithm, composed of velocity and path planning systems, for multiple ground vehicles in rough terrain. The velocity planning system aims to maintain a desired distance between the preceding and rear vehicles, and the path planning system generates a reliable path so that the vehicles move safely in the given rough environment. To supply a reliable velocity

    Updated: 2020-09-02
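
    The entry above mentions a velocity planning system that maintains a desired distance between the preceding and rear vehicles. As a loose illustration of that distance-keeping idea only (not the paper's algorithm), the sketch below implements a simple proportional-derivative spacing controller; the function name, gains, and speed limit are all assumptions.

```python
# Illustrative sketch, not the paper's method: a proportional-derivative
# spacing controller that adjusts the follower's velocity command so the gap
# to the preceding vehicle converges to a desired distance.

def follower_velocity_cmd(gap_m, gap_rate_mps, leader_speed_mps,
                          desired_gap_m=10.0, kp=0.5, kd=0.3, v_max=5.0):
    """Return a velocity command (m/s) for the rear vehicle (hypothetical helper)."""
    gap_error = gap_m - desired_gap_m               # positive: too far behind
    cmd = leader_speed_mps + kp * gap_error + kd * gap_rate_mps
    return max(0.0, min(v_max, cmd))                # keep the command feasible

# Example: 12 m gap, closing at 0.2 m/s, leader driving at 2.0 m/s
print(follower_velocity_cmd(12.0, -0.2, 2.0))       # ~2.94 m/s
```
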
  • Efficient obstacle detection based on prior estimation network and spatially constrained mixture model for unmanned surface vehicles
    J. Field Robot. (IF 3.581) Pub Date : 2020-08-25
    Jingyi Liu; Hengyu Li; Jun Luo; Shaorong Xie; Yu Sun

    Recently, the spatially constrained mixture model has become the mainstream method for vision‐based obstacle detection in unmanned surface vehicles (USVs), and it has shown its potential for modeling the semantic structure of the marine environment. However, the expectation maximization (EM) optimization of this model is quite sensitive to initial values and easily falls into a local optimal solution

    Updated: 2020-08-26
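
    As a rough illustration of the expectation maximization (EM) optimization mentioned above, the sketch below runs plain EM for a three-component Gaussian mixture over per-pixel features. The spatially constrained model and the prior estimation network from the paper are not reproduced; component count, initialization, and iteration budget are arbitrary assumptions.

```python
# Plain EM for an isotropic Gaussian mixture (illustrative only; the paper's
# spatial constraints and learned priors are omitted).
import numpy as np

def em_gmm(X, K=3, iters=50, seed=0):
    """X: (N, D) per-pixel features. Returns hard labels and component means."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    mu = X[rng.choice(N, K, replace=False)]          # initial means
    var = np.full(K, X.var())                        # isotropic variances
    pi = np.full(K, 1.0 / K)                         # mixing weights
    for _ in range(iters):
        # E-step: responsibilities, computed in the log domain for stability
        d2 = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        log_r = np.log(pi) - 0.5 * D * np.log(var) - 0.5 * d2 / var
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and variances
        Nk = r.sum(axis=0)
        pi = Nk / N
        mu = (r.T @ X) / Nk[:, None]
        d2 = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        var = (r * d2).sum(axis=0) / (Nk * D) + 1e-6
    return r.argmax(axis=1), mu
```
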
  • Experimental validation of the modeling and control of a multibody underwater vehicle manipulator system for sea mining exploration
    J. Field Robot. (IF 3.581) Pub Date : 2020-08-15
    Daniele Di Vito; Daniela De Palma; Enrico Simetti; Giovanni Indiveri; Gianluca Antonelli

    This paper presents the modeling approach and the control framework developed for the ROBUST EU Horizon 2020 project. The goal of this project is to showcase technologies and methodologies for future autonomous mineral exploration missions in deep‐sea sites with an Underwater Vehicle‐Manipulator System. Within the aim to make the system reliable in performing autonomously the entire mission, specific

    Updated: 2020-08-16
  • Automatic three‐dimensional mapping for tree diameter measurements in inventory operations
    J. Field Robot. (IF 3.581) Pub Date : 2020-08-10
    Jean‐François Tremblay; Martin Béland; Richard Gagnon; François Pomerleau; Philippe Giguère

    Forestry is a major industry in many parts of the world, yet this potential application domain has been overlooked by the robotics community. For instance, forest inventory, a cornerstone of efficient and sustainable forestry, is still traditionally performed manually by qualified professionals. The lack of automation in this particular task, consisting chiefly of measuring tree attributes

    Updated: 2020-08-11
  • What localizes beneath: A metric multisensor localization and mapping system for autonomous underground mining vehicles
    J. Field Robot. (IF 3.581) Pub Date : 2020-08-01
    Adam Jacobson; Fan Zeng; David Smith; Nigel Boswell; Thierry Peynot; Michael Milford

    Robustly and accurately localizing vehicles in underground mines is particularly challenging due to the unavailability of GPS, variable and often poor lighting conditions, visual aliasing in long tunnels, and airborne dust and water. In this paper, we present a novel, infrastructure‐less, multisensor localization method for robust autonomous operation within underground mines. The proposed method integrates

    Updated: 2020-08-02
  • The effect of data augmentation and network simplification on the image‐based detection of broccoli heads with Mask R‐CNN
    J. Field Robot. (IF 3.581) Pub Date : 2020-07-14
    Pieter M. Blok; Frits K. van Evert; Antonius P. M. Tielen; Eldert J. van Henten; Gert Kootstra

    In current practice, broccoli heads are selectively harvested by hand. The goal of our work is to develop a robot that can selectively harvest broccoli heads, thereby reducing labor costs. An essential element of such a robot is an image‐processing algorithm that can detect broccoli heads. In this study, we developed a deep learning algorithm for this purpose, using the Mask Region‐based Convolutional

    Updated: 2020-07-15
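
    For context on the Mask R-CNN family used above, the sketch below runs inference with a generic COCO-pretrained model from torchvision (recent versions); it is not the broccoli-specific network, data augmentation, or simplification studied in the paper, and the image path and score threshold are made up.

```python
# Generic Mask R-CNN inference with torchvision (illustrative only).
import torch
from torchvision.io import read_image
from torchvision.models.detection import maskrcnn_resnet50_fpn
from torchvision.transforms.functional import convert_image_dtype

model = maskrcnn_resnet50_fpn(weights="DEFAULT").eval()   # COCO weights, torchvision >= 0.13

img = convert_image_dtype(read_image("broccoli_field.jpg"), torch.float)  # hypothetical file
with torch.no_grad():
    pred = model([img])[0]                # dict with boxes, labels, scores, masks

keep = pred["scores"] > 0.8               # assumed confidence threshold
print(pred["boxes"][keep], pred["masks"][keep].shape)
```
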
  • Into the dirt: Datasets of sewer networks with aerial and ground platforms
    J. Field Robot. (IF 3.581) Pub Date : 2020-07-08
    David Alejo; François Chataigner; Daniel Serrano; Luis Merino; Fernando Caballero

    This paper presents an unprecedented set of data in a challenging underground environment: the visitable sewers of Barcelona. To the best of our knowledge, this is the first data set involving ground and aerial robots in such a scenario: the sewer inspection autonomous robot (SIAR) ground robot and the autonomous robot for sewer inspection aerial platform. These platforms captured data from a great variety

    Updated: 2020-07-09
  • Performance improvements of a sweet pepper harvesting robot in protected cropping environments
    J. Field Robot. (IF 3.581) Pub Date : 2020-06-21
    Chris Lehnert; Chris McCool; Inkyu Sa; Tristan Perez

    Using robots to harvest sweet peppers in protected cropping environments has remained unsolved despite considerable effort by the research community over several decades. In this paper, we present the robotic harvester, Harvey, designed for sweet peppers in protected cropping environments that achieved a 76.5% success rate on 68 fruit (within a modified scenario) which improves upon our prior work

    Updated: 2020-06-21
  • Perceptive whole‐body planning for multilegged robots in confined spaces
    J. Field Robot. (IF 3.581) Pub Date : 2020-06-11
    Russell Buchanan; Lorenz Wellhausen; Marko Bjelonic; Tirthankar Bandyopadhyay; Navinda Kottege; Marco Hutter

    Legged robots are exceedingly versatile and have the potential to navigate complex, confined spaces due to their many degrees of freedom. As a result of the computational complexity, there exist no online planners for perceptive whole‐body locomotion of robots in tight spaces. In this paper, we present a new method for perceptive planning for multilegged robots, which generates body poses, footholds

    Updated: 2020-06-11
  • Design, modeling, and control of an aerial manipulator for placement and retrieval of sensors in the environment
    J. Field Robot. (IF 3.581) Pub Date : 2020-06-01
    Salua Hamaza; Ioannis Georgilas; Guillermo Heredia; Aníbal Ollero; Thomas Richardson

    On‐site inspection of large‐scale infrastructure often involves high risks for the operators and high insurance costs. Despite several safety measures already in place to avoid accidents, an increasing concern has brought the need to remotely monitor hard‐to‐reach locations, for which the use of aerial robots able to interact with the environment has arisen. In this paper a novel approach to aerial

    Updated: 2020-06-01
  • Controlling Ocean One: Human–robot collaboration for deep‐sea manipulation
    J. Field Robot. (IF 3.581) Pub Date : 2020-06-01
    Gerald Brantner; Oussama Khatib

    Deploying robots to explore venues that are inaccessible to humans, or simply inhospitable, has been a longstanding ambition of scientists, engineers, and explorers across numerous fields. The deep sea exemplifies an environment that is largely uncharted and denies human presence. Central to exploration is the capacity to deliver dexterous robotic manipulation to this unstructured environment. Unmanned

    Updated: 2020-06-01
  • Learning features from georeferenced seafloor imagery with location guided autoencoders
    J. Field Robot. (IF 3.581) Pub Date : 2020-05-28
    Takaki Yamada; Adam Prügel‐Bennett; Blair Thornton

    Although modern machine learning has the potential to greatly speed up the interpretation of imagery, the varied nature of the seabed and limited availability of expert annotations form barriers to its widespread use in seafloor mapping applications. This motivates research into unsupervised methods that function without large databases of human annotations. This paper develops an unsupervised feature

    Updated: 2020-05-28
  • Towards autonomous inspection of concrete deterioration in sewers with legged robots
    J. Field Robot. (IF 3.581) Pub Date : 2020-05-27
    Hendrik Kolvenbach; David Wisth; Russell Buchanan; Giorgio Valsecchi; Ruben Grandia; Maurice Fallon; Marco Hutter

    The regular inspection of sewer systems is essential to assess the level of degradation and to plan maintenance work. Currently, human inspectors must walk through sewers and use their sense of touch to inspect the roughness of the floor and check for cracks. The sense of touch is used since the floor is often covered by (waste) water and biofilm, which renders visual inspection very challenging. In

    Updated: 2020-05-27
  • Autonomous navigation of MAVs in unknown cluttered environments
    J. Field Robot. (IF 3.581) Pub Date : 2020-05-23
    Leobardo Campos‐Macías; Rodrigo Aldana‐López; Rafael de la Guardia; José I. Parra‐Vilchis; David Gómez‐Gutiérrez

    Recently, there have been many advances in the algorithms required for autonomous navigation in unknown environments, such as mapping, collision avoidance, trajectory planning, and motion control. These components have been integrated into drones with high‐end computers and graphics processors. However, further development is required to enable compute‐constrained platforms with such autonomous navigation

    Updated: 2020-05-23
  • Zeus: A system description of the two‐time winner of the collegiate SAE autodrive competition
    J. Field Robot. (IF 3.581) Pub Date : 2020-05-12
    Keenan Burnett; Jingxing Qian; Xintong Du; Linqiao Liu; David J. Yoon; Tianchang Shen; Susan Sun; Sepehr Samavi; Michael J. Sorocky; Mollie Bianchi; Kaicheng Zhang; Arkady Arkhangorodsky; Quinlan Sykora; Shichen Lu; Yizhou Huang; Angela P. Schoellig; Timothy D. Barfoot

    The SAE AutoDrive Challenge is a 3‐year collegiate competition to develop a self‐driving car by 2020. The second year of the competition was held in June 2019 at MCity, a mock town built for self‐driving car testing at the University of Michigan. Teams were required to autonomously navigate a series of intersections while handling pedestrians, traffic lights, and traffic signs. Zeus is aUToronto's

    Updated: 2020-05-12
  • A marsupial robotic system for surveying and inspection of freshwater ecosystems
    J. Field Robot. (IF 3.581) Pub Date : 2020-05-10
    Michail Kalaitzakis; Brennan Cain; Nikolaos Vitzilaios; Ioannis Rekleitis; Jason Moulton

    Freshwater ecosystems are vast areas that are constantly changing and evolving. To maintain the ecosystem as well as the structures located close to bodies of water, frequent monitoring is required. Although dangerous and time consuming, manual operations are the conventional way of monitoring such areas. Recently, Autonomous Surface Vehicles (ASVs) have been proposed to undertake the monitoring task

    Updated: 2020-05-10
  • Visual model‐predictive localization for computationally efficient autonomous racing of a 72‐g drone
    J. Field Robot. (IF 3.581) Pub Date : 2020-05-08
    Shuo Li; Erik van der Horst; Philipp Duernay; Christophe De Wagter; Guido C. H. E. de Croon

    Drone racing is becoming a popular e‐sport all over the world, and beating the best human drone race pilots has quickly become a new major challenge for artificial intelligence and robotics. In this paper, we propose a novel sensor fusion method called visual model‐predictive localization (VML). Within a small time window, VML approximates the error between the model prediction position and the visual

    Updated: 2020-05-08
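
    As a very loose sketch of the windowed error-approximation idea described above, the snippet below fits a simple linear-in-time model of the difference between predicted and visually measured positions over a short window and uses it to correct the current prediction. This is not the VML formulation from the paper; the function and its arguments are assumptions.

```python
# Sliding-window correction of model-predicted positions (illustrative only).
import numpy as np

def corrected_position(t_window, pred_window, vis_window, t_now, pred_now):
    """t_window: (N,) times; pred/vis_window: (N, 2) positions; returns (2,)."""
    err = vis_window - pred_window                    # observed prediction error
    A = np.column_stack([np.ones_like(t_window), t_window])
    coeff, *_ = np.linalg.lstsq(A, err, rcond=None)   # per-axis [offset, drift]
    return pred_now + np.array([1.0, t_now]) @ coeff
```
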
  • Magnetic survey and autonomous target reacquisition with a scalar magnetometer on a small AUV
    J. Field Robot. (IF 3.581) Pub Date : 2020-05-02
    Eric Gallimore; Eric Terrill; Andrew Pietruszka; Jeffrey Gee; Andrew Nager; Robert Hess

    A scalar magnetometer payload has been developed and integrated into a two‐man portable autonomous underwater vehicle (AUV) for geophysical and archeological surveys. The compact system collects data from a Geometrics microfabricated atomic magnetometer, a total‐field atomic magnetometer. Data from the sensor is both stored for post‐processing and made available to an onboard autonomy engine for real‐time

    Updated: 2020-05-02
  • Multisensor online 3D view planning for autonomous underwater exploration
    J. Field Robot. (IF 3.581) Pub Date : 2020-05-02
    Eduard Vidal; Narcís Palomeras; Klemen Istenič; Nuno Gracias; Marc Carreras

    This study presents a novel octree‐based three‐dimensional (3D) exploration and coverage method for autonomous underwater vehicles (AUVs). Robotic exploration can be defined as the task of obtaining a full map of an unknown environment with a robotic system, achieving full coverage of the area of interest with data from a particular sensor or set of sensors. While most robotic exploration algorithms

    Updated: 2020-05-02
  • Erratum
    J. Field Robot. (IF 3.581) Pub Date : 2020-04-24

    https://doi.org/10.1002/rob.21917 In Kaljaca, Vroegindeweij, and van Henten (2020), there is a change to the author group: the name of one of the authors, Angelo Mencarelli, was accidentally removed from the author list and needs to be added back. This article was published in Journal of Field Robotics in February 2020 (https://doi.org/10.1002/rob.21917). We apologize for the error. Updated author group

    Updated: 2020-04-24
  • Falco: Fast likelihood‐based collision avoidance with extension to human‐guided navigation
    J. Field Robot. (IF 3.581) Pub Date : 2020-04-16
    Ji Zhang; Chen Hu; Rushat Gupta Chadha; Sanjiv Singh

    We propose a planning method to enable fast autonomous flight in cluttered environments. Typically, autonomous navigation through a complex environment requires a continuous search on a graph generated by a k‐connected grid or a probabilistic scheme. As the vehicle travels, updating the graph with data from onboard sensors is expensive, as is the search on the graph, especially if the paths must be kinodynamically

    Updated: 2020-04-22
  • An open‐source system for vision‐based micro‐aerial vehicle mapping, planning, and flight in cluttered environments
    J. Field Robot. (IF 3.581) Pub Date : 2020-04-06
    Helen Oleynikova; Christian Lanegger; Zachary Taylor; Michael Pantic; Alexander Millane; Roland Siegwart; Juan Nieto

    We present an open‐source system for Micro‐Aerial Vehicle (MAV) autonomous navigation from vision‐based sensing. Our system focuses on dense mapping, safe local planning, and global trajectory generation, especially when using narrow field‐of‐view sensors in very cluttered environments. In addition, details about other necessary parts of the system and special considerations for applications in real‐world

    Updated: 2020-04-06
  • ARDEA—An MAV with skills for future planetary missions
    J. Field Robot. (IF 3.581) Pub Date : 2020-03-12
    Philipp Lutz; Marcus G. Müller; Moritz Maier; Samantha Stoneman; Teodor Tomić; Ingo von Bargen; Martin J. Schuster; Florian Steidle; Armin Wedler; Wolfgang Stürzl; Rudolph Triebel

    We introduce a prototype flying platform for planetary exploration: autonomous robot design for extraterrestrial applications (ARDEA). Communication with unmanned missions beyond Earth orbit suffers from time delay, thus a key criterion for robotic exploration is a robot's ability to perform tasks without human intervention. For autonomous operation, all computations should be done on‐board and Global

    Updated: 2020-03-12
  • SLAM for autonomous planetary rovers with global localization
    J. Field Robot. (IF 3.581) Pub Date : 2020-02-28
    Dimitrios Geromichalos; Martin Azkarate; Emmanouil Tsardoulias; Levin Gerdes; Loukas Petrou; Carlos Perez Del Pulgar

    This paper describes a novel approach to simultaneous localization and mapping (SLAM) techniques applied to the autonomous planetary rover exploration scenario to reduce both the relative and absolute localization errors, using two well‐proven techniques: particle filters and scan matching. Continuous relative localization is improved by matching high‐resolution sensor scans to the online created local

    Updated: 2020-02-28
  • A 3D reactive collision avoidance algorithm for underactuated underwater vehicles
    J. Field Robot. (IF 3.581) Pub Date : 2020-02-28
    Martin S. Wiig; Kristin Y. Pettersen; Thomas R. Krogstad

    Avoiding collisions is an essential goal of the control system of autonomous vehicles. This paper presents a reactive algorithm for avoiding obstacles in a three‐dimensional space, and shows how the algorithm can be applied to an underactuated underwater vehicle. The algorithm is based on maintaining a constant avoidance angle to the obstacle, which ensures that a guaranteed minimum separation distance

    Updated: 2020-02-28
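
    The entry above describes steering so as to keep a constant avoidance angle relative to the obstacle. As a purely geometric 2D illustration of that notion (the paper develops a full 3D algorithm with formal guarantees for underactuated vehicles), a hypothetical helper might look like the following; all names and the example angle are assumptions.

```python
# 2D "constant avoidance angle" heading (illustrative geometric core only).
import math

def avoidance_heading(p_vehicle, p_obstacle, avoidance_angle_rad, turn_left=True):
    """Desired heading (rad) keeping a fixed angle off the line of sight to the obstacle."""
    dx = p_obstacle[0] - p_vehicle[0]
    dy = p_obstacle[1] - p_vehicle[1]
    bearing = math.atan2(dy, dx)                      # line-of-sight angle
    offset = avoidance_angle_rad if turn_left else -avoidance_angle_rad
    heading = bearing + offset
    return math.atan2(math.sin(heading), math.cos(heading))  # wrap to (-pi, pi]

# Example: obstacle dead ahead, keep 30 degrees off the line of sight
print(math.degrees(avoidance_heading((0.0, 0.0), (10.0, 0.0), math.radians(30))))  # 30.0
```
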
  • Field observation of tornadic supercells by multiple autonomous fixed‐wing unmanned aircraft
    J. Field Robot. (IF 3.581) Pub Date : 2020-02-26
    Eric W. Frew; Brian Argrow; Steve Borenstein; Sara Swenson; C. Alexander Hirst; Henno Havenga; Adam Houston

    This paper presents the results of the design and field deployment of multiple autonomous fixed‐wing unmanned aircraft into supercell thunderstorms. As part of a field campaign in Spring 2019, up to three fixed‐wing unmanned aircraft were deployed simultaneously into different regions of supercell thunderstorms to learn more about the atmospheric conditions that lead to the formation of tornadoes

    Updated: 2020-02-26
  • Autonomous aerial robot using dual‐fisheye cameras
    J. Field Robot. (IF 3.581) Pub Date : 2020-02-25
    Wenliang Gao; Kaixuan Wang; Wenchao Ding; Fei Gao; Tong Qin; Shaojie Shen

    Safety is undoubtedly the most fundamental requirement for any aerial robotic application. It is essential to equip aerial robots with omnidirectional perception coverage to ensure safe navigation in complex environments. In this paper, we present a light‐weight and low‐cost omnidirectional perception system, which consists of two ultrawide field‐of‐view (FOV) fisheye cameras and a low‐cost inertial

    Updated: 2020-02-25
  • High‐accuracy absolute positioning for the stationary planetary rover by integrating the star sensor and inclinometer
    J. Field Robot. (IF 3.581) Pub Date : 2020-02-18
    Yinhu Zhan; Yong Zheng; Chonghui Li; Ruopu Wang; Yongxing Zhu; Zhanglei Chen

    In this paper, we introduce a novel method for the high‐accuracy absolute position determination for planetary rovers using the star sensor and inclinometer. We describe the star sensor and inclinometer model and the alignment method for the two sensors. We deduce the compensation algorithm for the atmosphere refraction correction error in detail and provide the rover's position solution, which effectively

    Updated: 2020-02-18
  • Field trials of an energy‐aware mission planner implemented on an autonomous surface vehicle
    J. Field Robot. (IF 3.581) Pub Date : 2020-02-11
    Fletcher Thompson; Roberto Galeazzi; Damien Guihen

    Mission planning for autonomous marine vehicles is nontrivial due to the dynamic and uncertain nature of the marine environment. Communication can be low‐bandwidth and is not always guaranteed, so the operator must rely on the vehicles to adjust their plans according to the realized state of the environment. This paper presents the improvements made to an energy‐aware mission planner that allows it

    Updated: 2020-02-11
  • Multirepresentation, Multiheuristic A* search‐based motion planning for a free‐floating underwater vehicle‐manipulator system in unknown environment
    J. Field Robot. (IF 3.581) Pub Date : 2020-02-09
    Dina Youakim; Patryk Cieslak; Andrew Dornbush; Albert Palomer; Pere Ridao; Maxim Likhachev

    A key challenge in autonomous mobile manipulation is the ability to determine, in real time, how to safely execute complex tasks when placed in an unknown or changing world. Addressing this issue for Intervention Autonomous Underwater Vehicles (I‐AUVs) operating in potentially unstructured environments is becoming essential. Our research focuses on using motion planning to increase the I‐AUVs' autonomy

    Updated: 2020-02-09
  • Design and testing of the JPL‐Nautilus Gripper for deep‐ocean geological sampling
    J. Field Robot. (IF 3.581) Pub Date : 2020-02-09
    Spencer B. Backus; Rina Onishi; Andrew Bocklund; Andrew Berg; Eric D. Contreras; Aaron Parness

    We present the design and experimental results for the JPL‐Nautilus Gripper, a 16‐finger highly underactuated microspine gripper for use in the deep ocean. The gripper can grasp objects from 10 to 30 cm in size and anchor to flat and curved rocky surfaces (i.e., cliff faces and seamounts). Laboratory results demonstrated an anchoring capability of greater than 450 N on rough rocks in both shear and

    Updated: 2020-02-09
  • Robotic weed control using automated weed and crop classification
    J. Field Robot. (IF 3.581) Pub Date : 2020-02-06
    Xiaolong Wu; Stéphanie Aravecchia; Philipp Lottes; Cyrill Stachniss; Cédric Pradalier

    Autonomous robotic weeding systems in precision farming have demonstrated their full potential to alleviate the current dependency on agrochemicals such as herbicides and pesticides, thus reducing environmental pollution and improving sustainability. However, most previous works require fast and constant‐time weed detection systems to achieve real‐time treatment, which forecloses the implementation

    Updated: 2020-02-07
  • Development of a sweet pepper harvesting robot
    J. Field Robot. (IF 3.581) Pub Date : 2020-01-27
    Boaz Arad; Jos Balendonck; Ruud Barth; Ohad Ben‐Shahar; Yael Edan; Thomas Hellström; Jochen Hemming; Polina Kurtser; Ola Ringdahl; Toon Tielen; Bart van Tuijl

    This paper presents the development, testing and validation of SWEEPER, a robot for harvesting sweet pepper fruit in greenhouses. The robotic system includes a six degrees of freedom industrial arm equipped with a specially designed end effector, RGB‐D camera, high‐end computer with graphics processing unit, programmable logic controllers, other electronic equipment, and a small container to store

    Updated: 2020-01-27
  • Two‐stage 3D model‐based UAV pose estimation: A comparison of methods for optimization
    J. Field Robot. (IF 3.581) Pub Date : 2020-01-23
    Nuno Pessanha Santos; Victor Lobo; Alexandre Bernardino

    Particle Filters (PFs) have been successfully used in three‐dimensional (3D) model‐based pose estimation. Typically, these filters depend on the computation of importance weights that use similarity metrics as a proxy to approximate the likelihood function. In this paper, we explore the use of a two‐stage 3D model‐based approach based on a PF for single‐frame pose estimation. First, we use a classifier

    Updated: 2020-01-23
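
    To make the importance-weighting step described above concrete, here is a generic particle filter sketch in which an arbitrary similarity score stands in for the likelihood, as the abstract suggests; the similarity function, temperature, and resampling scheme are placeholders rather than the paper's method.

```python
# Generic particle-filter weighting and resampling (illustrative only).
import numpy as np

def update_weights(particles, observation, similarity, temperature=1.0):
    """particles: (N, d) pose hypotheses; similarity: callable returning a scalar score."""
    scores = np.array([similarity(p, observation) for p in particles])
    w = np.exp(scores / temperature)          # similarity used as a likelihood proxy
    return w / w.sum()

def resample(particles, weights, rng=None):
    """Multinomial resampling of an (N, d) particle array."""
    rng = rng or np.random.default_rng()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]
```
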
  • Autonomous collision detection and avoidance for ARAGON USV: Development and field tests
    J. Field Robot. (IF 3.581) Pub Date : 2020-01-21
    Jungwook Han; Yonghoon Cho; Jonghwi Kim; Jinwhan Kim; Nam‐sun Son; Sun Young Kim

    This study addresses the development of algorithms for multiple target detection and tracking in the framework of sensor fusion and its application to autonomous navigation and collision avoidance systems for the unmanned surface vehicle (USV) Aragon. To provide autonomous navigation capabilities, various perception sensors such as radar, lidar, and cameras have been mounted on the USV platform and

    Updated: 2020-01-21
  • Multimodal localization: Stereo over LiDAR map
    J. Field Robot. (IF 3.581) Pub Date : 2020-01-21
    Xingxing Zuo; Wenlong Ye; Yulin Yang; Renjie Zheng; Teresa Vidal‐Calleja; Guoquan Huang; Yong Liu

    In this paper, we present a real‐time high‐precision visual localization system for an autonomous vehicle which employs only low‐cost stereo cameras to localize the vehicle with a priori map built using a more expensive 3D LiDAR sensor. To this end, we construct two different visual maps: a sparse feature visual map for visual odometry (VO) based motion tracking, and a semidense visual map for registration

    Updated: 2020-01-21
  • Autonomous aerial cinematography in unstructured environments with learned artistic decision‐making
    J. Field Robot. (IF 3.581) Pub Date : 2020-01-06
    Rogerio Bonatti; Wenshan Wang; Cherie Ho; Aayush Ahuja; Mirko Gschwindt; Efe Camci; Erdal Kayacan; Sanjiban Choudhury; Sebastian Scherer

    Aerial cinematography is revolutionizing industries that require live and dynamic camera viewpoints such as entertainment, sports, and security. However, safely piloting a drone while filming a moving target in the presence of obstacles is immensely taxing, often requiring multiple expert human operators. Hence, there is a demand for an autonomous cinematographer that can reason about both geometry

    Updated: 2020-01-06
  • Keyframe‐based thermal–inertial odometry
    J. Field Robot. (IF 3.581) Pub Date : 2019-12-30
    Shehryar Khattak; Christos Papachristos; Kostas Alexis

    Autonomous navigation of microaerial vehicles in environments that are simultaneously GPS‐denied and visually degraded, and especially in the dark, texture‐less and dust‐ or smoke‐filled settings, is rendered particularly hard. However, a potential solution arises if such aerial robots are equipped with long wave infrared thermal vision systems that are unaffected by darkness and can penetrate many

    Updated: 2019-12-30
  • GPS‐level accurate camera localization with HorizonNet
    J. Field Robot. (IF 3.581) Pub Date : 2019-12-23
    Bertil Grelsson; Andreas Robinson; Michael Felsberg; Fahad Shahbaz Khan

    This paper investigates the problem of position estimation of unmanned surface vessels (USVs) operating in coastal areas or in the archipelago. We propose a position estimation method where the horizon line is extracted in a 360° panoramic image around the USV. We design a convolutional neural network (CNN) architecture to determine an approximate horizon line in the image and implicitly determine

    Updated: 2019-12-23
  • Decomposition‐based mission planning for fixed‐wing UAVs surveying in wind
    J. Field Robot. (IF 3.581) Pub Date : 2019-12-16
    Matthew Coombes; Tom Fletcher; Wen‐Hua Chen; Cunjia Liu

    This paper presents a new method for planning fixed‐wing aerial survey paths that ensures efficient image coverage of a large complex agricultural field in the presence of wind. By decomposing any complex polygonal field into multiple convex polygons, the traditional back‐and‐forth boustrophedon paths can be used to ensure coverage of these decomposed regions. To decompose a complex field in an efficient

    Updated: 2019-12-16
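
    As a minimal illustration of the back-and-forth boustrophedon pattern mentioned above (applied per convex cell after decomposition), the sketch below emits sweep waypoints for an axis-aligned rectangle; the polygon decomposition and the wind-aware planning from the paper are not shown, and the field dimensions are invented.

```python
# Boustrophedon sweep waypoints over a rectangular cell (illustrative only).
def boustrophedon(x_min, x_max, y_min, y_max, lane_spacing):
    """Return a list of (x, y) waypoints covering the rectangle lane by lane."""
    waypoints, x, left_to_right = [], x_min, True
    while x <= x_max:
        ys = (y_min, y_max) if left_to_right else (y_max, y_min)
        waypoints += [(x, ys[0]), (x, ys[1])]         # fly one lane end to end
        x += lane_spacing                             # step over to the next lane
        left_to_right = not left_to_right             # reverse sweep direction
    return waypoints

# Example: 100 m x 60 m cell with 20 m between adjacent lanes
print(boustrophedon(0, 100, 0, 60, 20))
```
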
  • Off‐road ground robot path energy cost prediction through probabilistic spatial mapping
    J. Field Robot. (IF 3.581) Pub Date : 2019-12-15
    Michael Quann; Lauro Ojeda; William Smith; Denise Rizzo; Matthew Castanier; Kira Barton

    Energy is an important factor in planning for ground robots, particularly in off‐road environments where the terrain significantly impacts energy usage. Unfortunately, energy costs in these environments are variable and uncertain, making planning difficult. In this paper, we present a method, based on Gaussian process regression (GPR) and known vehicle modeling information, for predicting future path

    Updated: 2019-12-15
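
    To illustrate the general Gaussian process regression (GPR) approach mentioned above, rather than the paper's specific terrain features, kernel, or vehicle model, a sketch with scikit-learn and synthetic stand-in data might look like this; every feature, constant, and kernel choice is an assumption.

```python
# GPR over per-segment energy costs, summed along a candidate path (illustrative).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))                           # e.g., slope, roughness per segment
y = 50 + 120 * X[:, 0] + 30 * X[:, 1] + rng.normal(0, 5, 200)  # synthetic energy (J)

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X, y)

path = rng.uniform(0, 1, size=(15, 2))                 # features of a candidate path
mean, std = gpr.predict(path, return_std=True)
print("expected energy:", mean.sum(), "+/-", np.sqrt((std ** 2).sum()))  # assumes independent segments
```
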
  • Perception‐aware autonomous mast motion planning for planetary exploration rovers
    J. Field Robot. (IF 3.581) Pub Date : 2019-12-02
    Jared Strader; Kyohei Otsu; Ali‐akbar Agha‐mohammadi

    Highly accurate real‐time localization is of fundamental importance for the safety and efficiency of planetary rovers exploring the surface of Mars. Mars rover operations rely on vision‐based systems to avoid hazards as well as plan safe routes. However, vision‐based systems operate on the assumption that sufficient visual texture is visible in the scene. This poses a challenge for vision‐based navigation

    Updated: 2019-12-02
  • Effect of gravity in wheel/terrain interaction models
    J. Field Robot. (IF 3.581) Pub Date : 2019-12-01
    László L. Kovács; Bahareh Ghotbi; Francisco González; Parna Niksirat; Krzysztof Skonieczny; József Kövecses

    Predicting the motion of wheeled robots in unstructured environments is an important and challenging problem. The study of planetary exploration rovers on soft terrain introduces the additional need to consider the effect of nonterrestrial gravitational fields on the forces and torques developed at the wheel/terrain interface. Simply reducing the wheel load under Earth's gravity overestimates the traveled

    Updated: 2019-12-01
  • Automated crop plant detection based on the fusion of color and depth images for robotic weed control
    J. Field Robot. (IF 3.581) Pub Date : 2019-07-23
    Jingyao Gai, Lie Tang, Brian L. Steward

    Robotic weeding enables weed control near or within crop rows automatically, precisely and effectively. A computer‐vision system was developed for detecting crop plants at different growth stages for robotic weed control. Fusion of color images and depth images was investigated as a means of enhancing the detection accuracy of crop plants under conditions of high weed population. In‐field images of

    Updated: 2019-11-18
  • Improvements to and large‐scale evaluation of a robotic kiwifruit harvester
    J. Field Robot. (IF 3.581) Pub Date : 2019-07-16
    Henry Williams, Canaan Ting, Mahla Nejati, Mark Hedley Jones, Nicky Penhall, JongYoon Lim, Matthew Seabright, Jamie Bell, Ho Seok Ahn, Alistair Scarfe, Mike Duke, Bruce MacDonald

    The growing popularity of kiwifruit orchards in New Zealand is increasing the already high demand for seasonal labourers. A novel robotic kiwifruit harvester has been designed and built to help meet this demand [H. A. Williams et al. Biosystems Eng. 181 (2019), pp. 140–156]. Early evaluations of the platform have shown good results with the system capable of harvesting 51.0% of 1,456 kiwifruit in four

    Updated: 2019-11-18
  • A high‐resolution, multimodal data set for agricultural robotics: A Ladybird's‐eye view of Brassica
    J. Field Robot. (IF 3.581) Pub Date : 2019-07-11
    Asher Bender, Brett Whelan, Salah Sukkarieh

    This article presents an agricultural data set collected by Ladybird, an autonomous field robot designed at the Australian Centre for Field Robotics. The data set contains weekly scans of cauliflower and broccoli (Brassica oleracea) covering a 10 week growth cycle from transplant to harvest. The data set includes ground truth; physical characteristics of the crop; environmental data collected by a

    Updated: 2019-11-18
  • A field‐tested robotic harvesting system for iceberg lettuce
    J. Field Robot. (IF 3.581) Pub Date : 2019-07-07
    Simon Birrell, Josie Hughes, Julia Y. Cai, Fumiya Iida

    Agriculture provides a unique opportunity for the development of robotic systems; robots must be developed which can operate in harsh conditions and in highly uncertain and unknown environments. One particular challenge is performing manipulation for autonomous robotic harvesting. This paper describes recent and current work to automate the harvesting of iceberg lettuce. Unlike many other produce

    Updated: 2019-11-18
  • A low‐cost and efficient autonomous row‐following robot for food production in polytunnels
    J. Field Robot. (IF 3.581) Pub Date : 2019-06-18
    Tuan D. Le, Vignesh R. Ponnambalam, Jon G. O. Gjevestad, Pål J. From

    In this paper, we present an automatic motion planner for agricultural robots that allows us to set up a robot to follow rows without having to explicitly enter waypoints. In most cases, when robots are used to cover large agricultural areas, they will need waypoints as inputs, either as premeasured coordinates in an outdoor environment, or as positions in a map in an indoor environment. This can be

    Updated: 2019-11-18
  • Semantic mapping for orchard environments by merging two‐sides reconstructions of tree rows
    J. Field Robot. (IF 3.581) Pub Date : 2019-06-17
    Wenbo Dong, Pravakar Roy, Volkan Isler

    Measuring semantic traits for phenotyping is an essential but labor‐intensive activity in horticulture. Researchers often rely on manual measurements which may not be accurate for tasks, such as measuring tree volume. To improve the accuracy of such measurements and to automate the process, we consider the problem of building coherent three‐dimensional (3D) reconstructions of orchard rows. Even though

    Updated: 2019-11-18
  • Transfer learning between crop types for semantic segmentation of crops versus weeds in precision agriculture
    J. Field Robot. (IF 3.581) Pub Date : 2019-03-28
    Petra Bosilj, Erchan Aptoula, Tom Duckett, Grzegorz Cielniak

    Agricultural robots rely on semantic segmentation for distinguishing between crops and weeds to perform selective treatments and increase yield and crop health while reducing the amount of chemicals used. Deep‐learning approaches have recently achieved both excellent classification performance and real‐time execution. However, these techniques also rely on a large amount of training data, requiring

    Updated: 2019-11-18
  • Multimodal obstacle detection in unstructured environments with conditional random fields
    J. Field Robot. (IF 3.581) Pub Date : 2019-03-07
    Mikkel Kragh, James Underwood

    Reliable obstacle detection and classification in rough and unstructured terrain such as agricultural fields or orchards remains a challenging problem. These environments involve large variations in both geometry and appearance, challenging perception systems that rely on only a single sensor modality. Geometrically, tall grass, fallen leaves, or terrain roughness can mistakenly be perceived as nontraversable

    Updated: 2019-11-18
  • Autonomous pollination of individual kiwifruit flowers: Toward a robotic kiwifruit pollinator
    J. Field Robot. (IF 3.581) Pub Date : 2019-01-29
    Henry Williams, Mahla Nejati, Salome Hussein, Nicky Penhall, Jong Yoon Lim, Mark Hedley Jones, Jamie Bell, Ho Seok Ahn, Stuart Bradley, Peter Schaare, Paul Martinsen, Mohammad Alomar, Purak Patel, Matthew Seabright, Mike Duke, Alistair Scarfe, Bruce MacDonald

    There is an increasing concern that the traditional approach of natural kiwifruit pollination by bees may not be sustainable. The alternatives are currently too costly for most growers due to high labor requirements or inefficient usage of expensive pollen. This paper presents a performance evaluation of a novel kiwifruit pollinating robot designed to provide a more efficient, reliable, and cost‐effective

    Updated: 2019-11-18
  • A survey of deep learning techniques for autonomous driving
    J. Field Robot. (IF 3.581) Pub Date : 2019-11-14
    Sorin Grigorescu; Bogdan Trasnea; Tiberiu Cocias; Gigel Macesanu

    The last decade witnessed increasingly rapid progress in self‐driving vehicle technology, mainly backed up by advances in the area of deep learning and artificial intelligence (AI). The objective of this paper is to survey the current state‐of‐the‐art on deep learning technologies used in autonomous driving. We start by presenting AI‐based self‐driving architectures, convolutional and recurrent neural

    Updated: 2019-11-14
  • Autonomous maritime collision avoidance: Field verification of autonomous surface vehicle behavior in challenging scenarios
    J. Field Robot. (IF 3.581) Pub Date : 2019-11-11
    D. K. M. Kufoalor; T. A. Johansen; E. F. Brekke; A. Hepsø; K. Trnka

    We present results from sea trials for an autonomous surface vehicle (ASV) equipped with a collision avoidance system based on model predictive control (MPC). The sea trials were performed in the North Sea as part of an ASV Challenge posed by Deltares through a Dutch initiative involving different authorities, including the Ministry of Infrastructure and Water Management, the Netherlands Coastguard

    Updated: 2019-11-11
  • Coverage trajectory planning for a bush trimming robot arm
    J. Field Robot. (IF 3.581) Pub Date : 2019-11-05
    Dejan Kaljaca, Bastiaan Vroegindeweij, Eldert van Henten

    A novel motion planning algorithm for robotic bush trimming is presented. The algorithm is based on an optimal route search over a graph. Unlike other works in robotic surface coverage, it entails both accuracy in the surface sweeping task and smoothness in the motion of the robot arm. The proposed method requires the selection of a custom objective function in the joint space for optimal

    Updated: 2019-11-06
  • Development and applications of rescue robots for explosion accidents in coal mines
    J. Field Robot. (IF 3.581) Pub Date : 2019-11-06
    Yutan Li; Menggang Li; Hua Zhu; Eryi Hu; Chaoquan Tang; Peng Li; Shaoze You

    It is essential to provide disaster relief assistance after coal mine explosions. Often, it is life‐threatening for rescuers to enter an accident scene blindly; therefore, a coal mine rescue robot (CMRR) has been developed. However, the application of the CMRR has not proven satisfactory after decades of development. To solve this problem, we summarize the reasons for this disappointing state and address

    Updated: 2019-11-06
Contents have been reproduced by permission of the publishers.