Publication (/lab/correll/)

DeliGrasp: Inferring Object Mass, Friction, and Compliance with LLMs for Adaptive and Minimally Deforming Grasp Policies
March 12, 2024 | Publication
/lab/correll/2024/03/12/deligrasp-inferring-object-mass-friction-and-compliance-llms-adaptive-and-minimally

Figure: Large language models (LLMs) have rich physical knowledge about worldly objects, but cannot directly reason about robot grasps for them. Paired with open-world localization and pose estimation (left), our method (middle) queries LLMs for the salient physical characteristics of mass, friction, and compliance as the basis for an adaptive grasp controller. DeliGrasp policies successfully grasp delicate and deformable objects.

Large language models (LLMs) can provide rich physical descriptions of most worldly objects, allowing robots to achieve more informed and capable grasping. We leverage LLMs' common-sense physical reasoning and code-writing abilities to infer an object's physical characteristics—mass, friction coefficient, and spring constant—from a semantic description, and then translate those characteristics into an executable adaptive grasp policy. Using a current-controllable, two-finger gripper with a built-in depth camera, we demonstrate that LLM-generated, physically grounded grasp policies outperform traditional grasp policies on a custom benchmark of 12 delicate and deformable items including food, produce, toys, and other everyday items, spanning two orders of magnitude in mass and required pick-up force. We also demonstrate how compliance feedback from DeliGrasp policies can aid in downstream tasks such as measuring produce ripeness. Our code and videos are available at: https://deligrasp.github.io.

References

Xie, W., Valentini, M., Lavering, J. and Correll, N., 2024. DeliGrasp: Inferring Object Properties with LLMs for Adaptive Grasp Policies. In 8th Annual Conference on Robot Learning.

Xie, W., Lavering, J. and Correll, N., 2024. DeliGrasp: Inferring Object Mass, Friction, and Compliance with LLMs for Adaptive and Minimally Deforming Grasp Policies. arXiv preprint arXiv:2403.07832. [PDF: https://arxiv.org/pdf/2403.07832v1.pdf]
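
The mapping the abstract describes, from LLM-estimated mass, friction coefficient, and spring constant to an adaptive, minimally deforming grasp, can be pictured with a short sketch. This is a hedged illustration rather than the DeliGrasp implementation: the gripper interface, the antipodal force target m*g/(2*mu), and the closing step size are assumptions.

```python
# Minimal sketch of an adaptive, minimally deforming grasp policy driven by
# LLM-estimated object properties. Hypothetical gripper interface; not the
# DeliGrasp codebase.

G = 9.81  # gravitational acceleration, m/s^2

def required_grip_force(mass_kg: float, friction_mu: float, safety: float = 1.5) -> float:
    """Antipodal grasp: two frictional contacts must together support the object's weight."""
    return safety * mass_kg * G / (2.0 * friction_mu)

def adaptive_grasp(gripper, mass_kg, friction_mu, spring_k, step_m=0.5e-3):
    """Close in small steps until the measured contact force reaches the target,
    bounding squeeze depth via the estimated spring constant (compliance)."""
    f_target = required_grip_force(mass_kg, friction_mu)
    max_deflection = f_target / spring_k          # expected deformation at the target force
    closed = 0.0
    while gripper.contact_force() < f_target:     # hypothetical force readback
        gripper.close_by(step_m)                  # hypothetical closing command
        closed += step_m
        if closed > 2.0 * max_deflection:         # avoid crushing a softer-than-expected object
            break
    return gripper.contact_force(), closed
```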
A multifunctional soft robotic shape display with high-speed actuation, sensing, and control
July 31, 2023 | Publication
/lab/correll/2023/07/31/multifunctional-soft-robotic-shape-display-high-speed-actuation-sensing-and-control

Shape displays that actively manipulate surface geometry are an expanding robotics domain with applications to haptics, manufacturing, aerodynamics, and more. However, existing displays often lack high-fidelity shape morphing, high-speed deformation, and embedded state sensing, limiting their potential uses. Here, we demonstrate a multifunctional soft-shape display driven by a 10 × 10 array of scalable cellular units which combine high-speed electrohydraulic soft actuation, magnetic-based sensing, and control circuitry. We report high-performance reversible shape morphing up to 50 Hz, sensing of surface deformations with 0.1 mm sensitivity, and external forces with 50 mN sensitivity in each cell, which we demonstrate across a multitude of applications including user interaction, image display, sensing of object mass, and dynamic manipulation of solids and liquids. This work showcases the rich multifunctionality and high-performance capabilities that arise from tightly integrating large numbers of electrohydraulic actuators, soft sensors, and controllers at a previously undemonstrated scale in soft robotics.

References

B. K. Johnson, M. Naris, V. Sundaram, A. Volchko, K. Ly, S. K. Mitchell, E. Acome, N. Kellaris, C. Keplinger, N. Correll, J. S. Humbert and M. E. Rentschler. A multifunctional soft robotic shape display with high-speed actuation, sensing, and control. Nature Communications, volume 14, Article number 4516 (2023). [PDF: https://www.nature.com/articles/s41467-023-39842-2.pdf]
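
One of the listed applications, image display on the 10 × 10 array, amounts to mapping pixel intensities to per-cell displacement setpoints. The sketch below illustrates that mapping under assumed parameters (cell stroke, grid interface); it is not the authors' control code.

```python
# Illustrative mapping from a grayscale image to per-cell displacement setpoints
# for a 10x10 shape display. Cell stroke and interface are assumptions.
import numpy as np

GRID = 10            # 10 x 10 cellular units
MAX_STROKE_MM = 5.0  # assumed maximum cell displacement

def image_to_setpoints(image: np.ndarray) -> np.ndarray:
    """Average-pool a grayscale image onto the display grid and scale to cell strokes."""
    h, w = image.shape
    blocks = image[: h - h % GRID, : w - w % GRID]
    blocks = blocks.reshape(GRID, blocks.shape[0] // GRID, GRID, blocks.shape[1] // GRID)
    intensity = blocks.mean(axis=(1, 3)) / 255.0   # normalize block means to [0, 1]
    return intensity * MAX_STROKE_MM               # displacement setpoint per cell, in mm

# Example: a horizontal gradient rendered as a sloped surface
img = np.tile(np.linspace(0, 255, 100), (100, 1)).astype(np.uint8)
setpoints = image_to_setpoints(img)   # 10x10 array of displacement setpoints
```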
A versatile robotic hand with 3D perception, force sensing for autonomous manipulation
July 10, 2023 | Publication
/lab/correll/2023/07/10/versatile-robotic-hand-3d-perception-force-sensing-autonomous-manipulation

We describe a force-controlled robotic gripper with built-in tactile and 3D perception. We also describe a complete autonomous manipulation pipeline consisting of object detection, segmentation, point cloud processing, force-controlled manipulation, and symbolic (re)-planning. The design emphasizes versatility in terms of applications, manufacturability, use of commercial off-the-shelf parts, and open-source software. We validate the design by characterizing force control (achieving up to 32 N, controllable in steps of 0.08 N), force measurement, and two manipulation demonstrations: assembly of the Siemens gear assembly problem, and a sensor-based stacking task requiring replanning. These demonstrate robust execution of long sequences of sensor-based manipulation tasks, which makes the resulting platform a solid foundation for researchers in task-and-motion planning, for educators, and for quick prototyping of household and warehouse automation tasks.

References

N. Correll, D. Kriegman, S. Otte and J. Watson. A versatile robotic hand with 3D perception, force sensing for autonomous manipulation. In Proceedings of the Workshop on Perception and Manipulation Challenges for Warehouse Automation (http://rss23.armbench.com/), Robotics: Science and Systems, Daegu, Korea.
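
The pipeline structure described above (perceive, plan symbolically, execute force-controlled skills, replan on failure) can be summarized as a simple receding loop. The sketch below is an illustrative skeleton with injected callables, not the paper's API.

```python
# Skeleton of a perceive -> plan -> execute -> replan manipulation loop.
# Perception, planning, and skill execution are injected as callables because
# this is an illustrative sketch, not the paper's implementation.
from typing import Callable, Iterable

def run_task(perceive: Callable[[], object],
             plan: Callable[[object], Iterable],
             execute_skill: Callable[[object], bool],
             max_replans: int = 5) -> bool:
    """Execute force-controlled skills from a symbolic plan, replanning on failure."""
    for _ in range(max_replans):
        scene = perceive()                 # detection, segmentation, point clouds
        actions = plan(scene)              # symbolic plan for the current scene
        if all(execute_skill(a) for a in actions):
            return True                    # every skill reported success
        # otherwise a skill failed under force control: re-perceive and replan
    return False
```

The short-circuiting of all() models the key behavior: the first skill that fails aborts the current plan and triggers symbolic (re)-planning from a fresh perception of the scene.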
Early failure prediction during robotic assembly using Transformers
July 10, 2023 | Publication
/lab/correll/2023/07/10/early-failure-prediction-during-robotic-assembly-using-transformers

Figure: Transformer architecture.

Peg-in-hole assembly of tightly fitting parts often requires multiple attempts. Parts need to be put together by performing a wiggling motion of undetermined length and can get stuck, requiring a restart. Recognizing unsuccessful insertion attempts early can help reduce the makespan of the assembly. This can be achieved by analyzing time-series data from force and torque measurements. We describe a transformer neural network model that predicts failure three times faster, i.e., from much shorter time series, than a dilated fully convolutional neural network. Although the transformer provides predictions with higher confidence, it does so at reduced accuracy. Yet, by calling unsuccessful attempts early, the makespan can be reduced by almost 40%, which we show using a dataset with force-torque data from 241 peg-in-hole assembly runs with known outcomes.

References

R. Montané-Güell, J. Watson and N. Correll, 2023. Early failure prediction during robotic assembly using Transformers. In Proceedings of the Workshop on Robotics and AI: The Future of Industrial Assembly Tasks (https://sites.google.com/nvidia.com/industrial-assembly) at Robotics: Science and Systems, Daegu, Korea.
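
A minimal version of such an early-failure classifier, a transformer encoder over a prefix of the six-axis force-torque stream with a success/failure head, might look like the following PyTorch sketch. The hyperparameters are assumptions, and this is not the authors' model.

```python
# Minimal transformer classifier over a prefix of 6-axis force-torque data
# (fx, fy, fz, tx, ty, tz). Hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class EarlyFailureClassifier(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Linear(6, d_model)                     # per-timestep embedding
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 2)                      # success / failure logits

    def forward(self, ft_prefix):                              # (batch, time, 6)
        h = self.encoder(self.embed(ft_prefix))
        return self.head(h.mean(dim=1))                        # pool over time

# A short prefix of an ongoing insertion attempt, e.g. 50 samples:
model = EarlyFailureClassifier()
logits = model(torch.randn(1, 50, 6))
p_fail = torch.softmax(logits, dim=-1)[0, 1]   # probability the attempt will fail
```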
Distributed Tactile Sensors for Palmar Surfaces of Prosthetic Hands
May 19, 2023 | Publication
/lab/correll/2023/05/19/distributed-tactile-sensors-palmar-surfaces-prosthetic-hands

Figure: Different grasps that require palmar sensing.

Sensory feedback provided by prosthetic hands shows promise in increasing functional abilities and promoting embodiment of the prosthetic device. However, sensory feedback is limited by where sensors are placed on the prosthetic device and has mainly focused on sensorizing the fingertips. Here we describe distributed tactile sensors for the palmar surfaces of prosthetic hands. We believe a sensing system that can detect interactions across the palmar surfaces in addition to the fingertips will further improve the experience for the prosthetic user and may increase embodiment of the device as well. This work details the design of a compliant distributed sensor that consists of piezoresistive and piezoelectric layers to produce a robust force measurement of both static and dynamic loads. The assembled sensor system is easy to customize to cover different areas of the prosthetic hand, simple to scale up, and flexible to different fabrication form factors. The experimental results detail a load estimation accuracy of 95.4% and a sensor response time of less than 200 ms. Cycle tests of each sensor show drift within 10% of sensing capability under load and 6.37% in a no-load longitudinal test. These validation experiments reinforce the ability of the DualPiezo structure to provide a valuable sensor design for the palmar surfaces of prosthetic hands.

References

Truong, H., Correll, N. and Segil, J., 2023, April. Distributed Tactile Sensors for Palmar Surfaces of Prosthetic Hands. In 2023 11th International IEEE/EMBS Conference on Neural Engineering (NER), pp. 1-4. IEEE. [PDF: https://ieeexplore.ieee.org/abstract/document/10123819]
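
The DualPiezo idea, a piezoresistive layer for sustained (static) load and a piezoelectric layer for load changes (dynamic events), can be illustrated by fusing the two channels with a complementary filter. The gains and filter constant below are assumptions for illustration, not the paper's calibration.

```python
# Illustrative fusion of a piezoresistive channel (absolute but creep-prone) with a
# piezoelectric channel (responds to changes in load). All constants are assumed.

def fuse_dualpiezo(v_pr, v_pe, dt, k_pr=2.0, k_pe=0.5, alpha=0.95):
    """Complementary filter: trust the piezoelectric (rate-like) channel at short
    timescales and the piezoresistive (absolute) channel at long timescales."""
    force = 0.0
    fused = []
    for pr, pe in zip(v_pr, v_pe):
        rate_update = force + k_pe * pe * dt   # integrate the rate-like piezoelectric signal
        abs_update = k_pr * pr                 # absolute piezoresistive reading
        force = alpha * rate_update + (1 - alpha) * abs_update
        fused.append(force)
    return fused
```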
Optimal decision making in robotic assembly and other trial-and-error tasks
January 25, 2023 | Publication
/lab/correll/2023/01/25/optimal-decision-making-robotic-assembly-and-other-trial-and-error-tasks

Figure: A complete peg-in-hole assembly sequence. (A) The bearing is presented in a 3D-printed jig. (B) The bearing is picked up by the robot and transported to the assembly plate (C). Force and torque measurements are used to (D) locate the hole and (E) complete insertion. (F) Insertion failure due to misalignment: friction with the edge of the hole has caused the twisting action to pull the bearing further from the hole center.
Uncertainty in perception, actuation, and the environment often requires multiple attempts for a robotic task to be successful. We study a class of problems providing (1) low-entropy indicators of terminal success/failure, and (2) unreliable (high-entropy) data to predict the final outcome of an ongoing task. Examples include a robot trying to connect with a charging station, parallel parking, or assembling a tightly fitting part. The ability to restart after predicting failure early, versus simply running to failure, can significantly decrease the makespan, that is, the total time to completion, with the drawback of potentially short-cutting an otherwise successful operation. Assuming task running times to be Poisson distributed, and using a Markov jump process to capture the dynamics of the underlying Markov decision process, we derive a closed-form solution that predicts makespan based on the confusion matrix of the failure predictor. This allows the robot to learn failure prediction in a production environment, and to adopt a preemptive policy only when it actually saves time. We demonstrate this approach on a robotic peg-in-hole assembly problem using a real robotic system. Failures are predicted by a dilated convolutional network based on force-torque data, showing an average makespan reduction from 101 s to 81 s (N=120, p < 0.05). We posit that the proposed algorithm generalizes to any robotic behavior with an unambiguous terminal reward, with wide-ranging applications for how robots can learn and improve their behaviors in the wild.

References

Watson, J. and Correll, N., 2023. Optimal decision making in robotic assembly and other trial-and-error tasks (https://arxiv.org/pdf/2301.10846). International Conference on Intelligent Robots and Systems (IROS), Detroit, MI.
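
The trade-off that the closed-form analysis captures can also be illustrated numerically: given a failure predictor's confusion matrix and rough timing parameters, compare the expected makespan of running every attempt to completion against preempting whenever failure is predicted. The Monte Carlo estimate below is only an illustration of that comparison, with invented numbers; the paper derives the makespan in closed form instead.

```python
# Monte Carlo comparison of "run to failure" vs. "preempt when failure is
# predicted", given a predictor's confusion matrix. All numbers are invented.
import random

def makespan(preempt, n=100_000, p_success=0.6,
             t_success=20.0, t_fail=35.0, t_pred=8.0, t_reset=5.0,
             tpr=0.85, fpr=0.10):
    total = 0.0
    for _ in range(n):
        t = 0.0
        while True:
            will_succeed = random.random() < p_success
            predicted_fail = random.random() < (fpr if will_succeed else tpr)
            if preempt and predicted_fail:
                t += t_pred + t_reset          # abort early and retry
                continue
            if will_succeed:
                t += t_success
                break                          # task completed
            t += t_fail + t_reset              # ran all the way to failure, retry
        total += t
    return total / n

print(f"run-to-failure: {makespan(False):.1f} s, preemptive: {makespan(True):.1f} s")
```

With these invented numbers the preemptive policy wins; with a worse predictor (lower true-positive or higher false-positive rate) it would not, which is exactly the decision the closed-form criterion is meant to automate.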
Embedded Magnetic Sensing for Feedback Control of Soft HASEL Actuators
September 10, 2022 | Publication
/lab/correll/2022/09/10/embedded-magnetic-sensing-feedback-control-soft-hasel-actuators

The need to create more viable soft sensors is increasing in tandem with the growing interest in soft robots. Several sensing methods, like capacitive stretch sensing and intrinsic capacitive self-sensing, have proven to be useful when controlling soft electro-hydraulic actuators, but are still problematic. This is due to challenges around high-voltage electronic interference or the inability to accurately sense the actuator at higher actuation frequencies. These issues are compounded when trying to sense and control the movement of a multi-actuator system. To address these shortcomings, we describe a two-part magnetic sensing mechanism to measure the changes in displacement of an electro-hydraulic (HASEL) actuator. Our magnetic sensing mechanism achieves high accuracy and precision over the HASEL actuator's displacement range, and accurately tracks motion at actuation frequencies up to 30 Hz, while being robust to changes in ambient temperature and relative humidity. The high accuracy of the magnetic sensing mechanism is further emphasized in a gripper demonstration: using this sensing mechanism, we can detect submillimeter differences in the diameters of three tomatoes. Finally, we successfully perform closed-loop control of one folded HASEL actuator using the sensor, which is then scaled into a deformable tilting platform of six units (one HASEL actuator and one sensor each) that controls a desired end-effector position in 3D space. This work demonstrates the first instance of sensing electro-hydraulic deformation using a magnetic sensing mechanism. The ability to more accurately and precisely sense and control HASEL actuators and similar soft actuators is necessary to improve the abilities of soft robotic platforms.

Reference

Sundaram, V., Ly, K., Johnson, B.K., Naris, M., Anderson, M.P., Humbert, J.S., Correll, N. and Rentschler, M., 2022. Embedded Magnetic Sensing for Feedback Control of Soft HASEL Actuators. IEEE Transactions on Robotics. [Link: https://ieeexplore.ieee.org/abstract/document/9882180]
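
Closed-loop displacement control of a single actuator from the magnetic sensor reading can be pictured as an ordinary PI loop around a displacement setpoint. The gains, loop rate, and actuator/sensor interfaces in this sketch are assumptions, not the controller used in the paper.

```python
# Illustrative PI loop: regulate HASEL actuator displacement using a magnetic
# displacement sensor. Gains, loop rate, and interfaces are assumptions.
import time

def pi_displacement_control(sensor_read, actuator_set_voltage, target_mm,
                            kp=800.0, ki=150.0, dt=0.002, v_max=8000.0,
                            duration_s=2.0):
    """sensor_read() returns displacement in mm; actuator_set_voltage(v) commands
    the high-voltage driver. Runs a fixed-duration regulation loop."""
    integral = 0.0
    for _ in range(int(duration_s / dt)):
        error = target_mm - sensor_read()
        integral += error * dt
        v = kp * error + ki * integral
        actuator_set_voltage(max(0.0, min(v, v_max)))   # clamp to driver limits
        time.sleep(dt)
```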
Introduction to Autonomous Robots: Mechanisms, Sensors, Actuators, and Algorithms
June 2, 2022 | Publication
/lab/correll/2022/06/02/introduction-autonomous-robots-mechanisms-sensors-actuators-and-algorithms
Tags: Grasping, Inverse Kinematics, Kinematics, Manipulation, Path Planning, Perception, SLAM

Textbooks that provide a broad algorithmic perspective on the mechanics and dynamics of robots almost unfailingly serve students at the graduate level.
Introduction to Autonomous Robots offers a much-needed resource for teaching third- and fourth-year undergraduates the computational fundamentals behind the design and control of autonomous robots. The authors use a class-tested and accessible approach to present progressive, step-by-step development of concepts, alongside a wide range of real-world examples and fundamental concepts in mechanisms, sensing and actuation, computation, and uncertainty. Throughout, the authors balance the impact of hardware (mechanism, sensor, actuator) and software (algorithms) in teaching robot autonomy.

Rigorous and tested in the classroom, Introduction to Autonomous Robots is written for engineering and computer science undergraduates with a sophomore-level understanding of linear algebra, probability theory, trigonometry, and statistics. The text covers basic concepts in robotic mechanisms, such as locomotion and grasping and the resulting forces; operation principles of sensors and actuators; basic algorithms for vision and feature detection; and an introduction to artificial neural networks, including convolutional and recurrent variants. The authors have included QR codes in the text that guide readers to online lecture videos and animations. The book also features extensive appendices focusing on project-based curricula, pertinent areas of mathematics, backpropagation, writing a research paper, and other topics, and is accompanied by a growing library of exercises in an open-source, platform-independent simulation (Webots).

Website

https://github.com/Introduction-to-Autonomous-Robots/Introduction-to-Autonomous-Robots

Reference

Correll, Nikolaus, Bradley Hayes, Christoffer Heckman, and Alessandro Roncone. Introduction to Autonomous Robots: Mechanisms, Sensors, Actuators, and Algorithms. MIT Press, 2022. [Link: https://mitpress.mit.edu/books/introduction-autonomous-robots]
Augmented reality for human–swarm interaction in a swarm-robotic chemistry simulation
May 3, 2022 | Publication
/lab/correll/2022/05/03/augmented-reality-human%E2%80%93swarm-interaction-swarm-robotic-chemistry-simulation
Tags: artificial chemistry, droplets, swarm robotics

Figure: Atoms assembled into a chemical molecule using augmented reality to show bonds.

We present a novel augmented reality (AR) framework to show relevant information about swarm dynamics to a user in the absence of markers by using blinking frequency to distinguish between groups in the swarm. Clusters of the same group are identified by blinking at a specific time interval that is distinct from the time interval at which their neighbors blink. The problem is thus to find blinking sequences that are distinct for each group with respect to the group's neighbors. Selecting an appropriate sequence is an instance of the distributed graph coloring problem, which can be solved in O(log(n)) time, with n being the number of robots involved. We demonstrate our approach using a swarm chemistry simulation in which robots simulate individual atoms that form molecules following the rules of chemistry. An AR display then shows the internal state of individual swarm members as well as their topological relationships, corresponding to molecular bonds, in a context that uses robot swarms to teach basic chemistry concepts.

Reference

Batra, Sumeet, John Klingner, and Nikolaus Correll. "Augmented reality for human–swarm interaction in a swarm-robotic chemistry simulation." Artificial Life and Robotics (2022): 1-9. [PDF: https://link.springer.com/article/10.1007/s10015-022-00763-w]
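
Assigning distinct blink slots to neighboring groups is, as noted above, a distributed graph coloring problem. The sketch below shows a standard randomized, synchronous coloring scheme of the kind that finishes in O(log n) rounds with high probability; it illustrates the problem and is not the paper's algorithm.

```python
# Randomized synchronous graph coloring: each uncolored node proposes a color not
# taken by already-colored neighbors, and keeps it if no neighbor proposed the same
# color this round. Requires num_colors > max neighborhood size.
import random

def distributed_coloring(adjacency, num_colors):
    """adjacency: dict mapping node -> set of neighboring nodes (groups that must differ)."""
    color = {v: None for v in adjacency}
    while any(c is None for c in color.values()):
        proposals = {}
        for v in adjacency:
            if color[v] is None:
                taken = {color[u] for u in adjacency[v] if color[u] is not None}
                proposals[v] = random.choice([c for c in range(num_colors) if c not in taken])
        for v, c in proposals.items():
            if all(proposals.get(u) != c for u in adjacency[v]):
                color[v] = c            # no conflicting proposal: keep this color
    return color

# Example: three groups where 0-1 and 1-2 are neighbors; colors map to blink slots.
print(distributed_coloring({0: {1}, 1: {0, 2}, 2: {1}}, num_colors=3))
```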
Electro-Hydraulic Rolling Soft Wheel: Design, Hybrid Dynamic Modeling, and Model Predictive Control
May 2, 2022 | Publication
/lab/correll/2022/05/02/electro-hydraulic-rolling-soft-wheel-design-hybrid-dynamic-modeling-and-model-predictive
Tags: HASEL, model-predictive control, soft robotics

Figure: Motion sequence of the electrohydraulic rolling soft wheel around a pivot on a square platform.

Locomotion through rolling is attractive compared to other forms of locomotion thanks to uniform designs, high degree of mobility, dynamic stability, and self-recovery from collision. Despite previous efforts to design rolling soft systems, pneumatic and other soft actuators are often limited in terms of high-speed dynamics, system integration, and/or functionalities.
Furthermore, a mathematical description of the rolling dynamics for this type of robot, and of how such models can be used for speed control, is often missing. This article introduces a cylindrical, shell-bulging rolling soft wheel that employs an array of 16 folded HASEL actuators as a means of improving rolling performance. The actuators represent the soft components that propel the wheel with discrete forces, whereas the wheel's frame is rigid but allows for smooth, continuous changes in position and speed. We discuss the interplay between the electrical and mechanical design choices, the modeling of the wheel's hybrid (continuous and discrete) dynamic behavior, and the implementation of a model predictive controller (MPC) for the robot's speed. By balancing several design factors, we show the wheel's ability to carry integrated hardware at a maximum rolling speed of 0.7 m/s (or 2.2 body lengths per second), despite its total weight of 979 g, allowing the wheel to outperform existing rolling soft wheels of comparable weight and size. We also show that the MPC enables the wheel to accelerate and to leverage its inherent braking capability to reach desired speeds—a critical function that did not exist in previous rolling soft systems.

Reference

Ly, Khoi, Jatin V. Mayekar, Sarah Aguasvivas, Christoph Keplinger, Mark E. Rentschler, and Nikolaus Correll. "Electro-Hydraulic Rolling Soft Wheel: Design, Hybrid Dynamic Modeling, and Model Predictive Control." IEEE Transactions on Robotics (2022).
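
Because the wheel is driven by discrete actuator firings while its speed evolves continuously, speed control can be posed as a short-horizon search over firing sequences, which is the essence of MPC on a hybrid model. The toy sketch below illustrates this with a one-dimensional speed model; the dynamics constants and horizon are invented, and this is not the controller from the paper.

```python
# Toy MPC for speed tracking of a wheel driven by discrete actuator firings
# (fire / don't fire each step). Model constants are invented for illustration.
from itertools import product

def simulate(v, fire, dt=0.05, impulse=0.12, drag=0.4):
    """One step of a toy hybrid model: a firing adds a speed impulse, drag slows the wheel."""
    return v + (impulse if fire else 0.0) - drag * v * dt

def mpc_action(v, v_desired, horizon=6):
    """Enumerate all firing sequences over the horizon; return the first action of the best."""
    best_cost, best_first = float("inf"), 0
    for seq in product([0, 1], repeat=horizon):
        vt, cost = v, 0.0
        for fire in seq:
            vt = simulate(vt, fire)
            cost += (vt - v_desired) ** 2
        if cost < best_cost:
            best_cost, best_first = cost, seq[0]
    return best_first

# Closed loop: re-plan every step and apply only the first action (receding horizon).
v = 0.0
for _ in range(40):
    v = simulate(v, mpc_action(v, v_desired=0.7))
print(f"speed after 2 s: {v:.2f} m/s")
```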