<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" version="2.0">
  <channel>
    <title>Workfloor: Robotics News for the Factory</title>
    <link>https://blog.robotiq.com</link>
    <description>A blog about robotics news for the factory.</description>
    <language>en-us</language>
    <pubDate>Fri, 29 Mar 2019 14:15:30 GMT</pubDate>
    <dc:date>2019-03-29T14:15:30Z</dc:date>
    <dc:language>en-us</dc:language>
    <item>
      <title>What's New In Robotics?  29.03.2019</title>
      <link>https://blog.robotiq.com/whats-new-in-robotics-29.03.2019</link>
      <description>&lt;div class="hs-featured-image-wrapper"&gt; 
 &lt;a href="https://blog.robotiq.com/whats-new-in-robotics-29.03.2019" title="" class="hs-featured-image-link"&gt; &lt;img src="https://blog.robotiq.com/hubfs/40022743.jpg" alt="40022743" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"&gt; &lt;/a&gt; 
&lt;/div&gt;    
&lt;p&gt;&lt;strong&gt;Hi!&amp;nbsp; In this week's news mix:&amp;nbsp; we meet smoothie-making cobot 'Chef B', Proco Machinery launches a cobot packing solution, and cobots.ie goes on tour.&amp;nbsp; We also encounter a microrobot swarm, wonder at disposable drones, meet a robot that's so soft it's scarcely there at all, and much more!&lt;/strong&gt;&lt;/p&gt;</description>
      <content:encoded>&lt;p&gt;&lt;strong&gt;Hi!&amp;nbsp; In this week's news mix:&amp;nbsp; we meet smoothie-making cobot 'Chef B', Proco Machinery launches a cobot packing solution, and cobots.ie goes on tour.&amp;nbsp; We also encounter a microrobot swarm, wonder at disposable drones, meet a robot that's so soft it's scarcely there at all, and much more!&lt;/strong&gt;&lt;/p&gt; 
&lt;p&gt;&lt;/p&gt; 
&lt;h2&gt;Cobots &amp;amp; manufacturing&lt;/h2&gt; 
&lt;p&gt;Google researchers unveiled TossingBot this week.&amp;nbsp; Developed in collaboration with experts at Princeton, Columbia, and MIT, TossingBot is "a picking robot for our real, random world that learns to grasp and throw objects into selected boxes outside its natural range."&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/image2.gif?width=600&amp;amp;name=image2.gif" alt="ur-cobot-google-force-torque-sensors" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;TossingBot in action.&amp;nbsp; Credit: Google&lt;/p&gt; 
&lt;p&gt;Via &lt;a href="https://ai.googleblog.com/2019/03/unifying-physics-and-deep-learning-with.html"&gt;Google&lt;/a&gt;:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;TossingBot jointly learns grasping and throwing policies using an end-to-end neural network that maps from visual observations to control parameters for motion primitives. Using overhead cameras to track where objects land, TossingBot improves itself over time through self-supervision. &lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;(Paper: &lt;a href="https://tossingbot.cs.princeton.edu/paper.pdf"&gt;TossingBot: Learning to Throw Arbitrary Objects with Residual Physics&lt;/a&gt;)&lt;/p&gt; 
&lt;p&gt;Canadian robotics firm &lt;a href="http://www.procomachinery.com/"&gt;Proco Machinery&lt;/a&gt; has launched a cobot-based, lay-flat packing technology that enables blow molders to pack plastic bottles on their sides, as opposed to placing them with necks up or down.&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/40022743.jpg?width=300&amp;amp;name=40022743.jpg" alt="proco-machinery" style="width: 300px; margin: 0px auto;" width="300"&gt;&lt;br&gt;Credit: Proco Machinery&lt;/p&gt; 
&lt;p&gt;&lt;a href="https://news.thomasnet.com/fullstory/new-lay-flat-packing-technology-allows-a-blow-molder-of-bottles-to-pack-270-bottles-in-a-box-instead-of-250-40022743"&gt;Thomas Net&lt;/a&gt; reports:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;Proco’s new Lay Flat Tooling innovation helps blow molders and brand owners maximize product capacity for a given container. The company has developed two sets of tooling for two different applications - one for straight neck-to-neck packaging in the box and another for offset neck orientation from row to row. This potentially allows a blow molder of bottles to pack 270 bottles in a box instead of 250, resulting in maximized capacity and tremendous savings on freight cost.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;Netherlands-based oil burner component manufacturer &lt;a href="https://fluidics.nl/"&gt;Fluidics Instruments&lt;/a&gt; is using 12 cobots from &lt;a href="https://www.universal-robots.com/"&gt;Universal Robots&lt;/a&gt; to help it continue making "the best nozzles in the world"...&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/CKncBjI3-Gs" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;&lt;a href="https://cobots.ie/"&gt;Cobots.ie&lt;/a&gt;, a recently-launched cobot specialist that provides &lt;span&gt;hardware, after sales support &amp;amp; training for Universal Robots' cobots has been showcasing the technology at a series of events throughout Ireland.&amp;nbsp; &lt;/span&gt;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;span&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Phil_PickIt.jpg?width=600&amp;amp;name=Phil_PickIt.jpg" alt="Phil_PickIt-cobot-ur" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Ireland's Northwest region is a hub for pharma, bio-pharma and machining companies.&amp;nbsp; Credit: Cobots.ie &lt;/span&gt;&lt;/p&gt; 
&lt;p&gt;A new smoothie-making cobot has begun testing on the University of San Francisco's campus.&amp;nbsp; Dubbed "Chef B," the cobot (which features &lt;strong&gt;&lt;a href="https://robotiq.com/products"&gt;a Robotiq gripper&lt;/a&gt;&lt;/strong&gt;) can blend and dispense 35 to 45 smoothies per hour.&amp;nbsp; &lt;a href="https://abc7news.com/food/smoothie-making-robot-arrives-on-the-usf-campus/5219634/"&gt;ABC 7 News&lt;/a&gt; visited campus to take a look...&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-undefined"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="https://abc7news.com/video/embed/?pid=5219673" width="600" height="337" frameborder="0" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;Cobots are increasingly popular in the food production sector, with the Association for Packaging and Processing Technologies (PMMI) identifying collaborative robots as an important trend.&amp;nbsp; A newly released infographic from PMMI highlights the importance of automation in food production.&amp;nbsp; (H/T &lt;a href="https://www.engineering.com/AdvancedManufacturing/ArticleID/18758/Infographic-Rapid-Growth-Ahead-for-Industrial-Robots-in-Food-and-Beverage-Processing.aspx"&gt;Engineering.com&lt;/a&gt;)&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hubfs/Infoographic_-_food_bev_robots_PMMI_frnkbw.webp" alt="Infoographic_-_food_bev_robots_PMMI_frnkbw" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;/p&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;Germany's Factory of the Future explores the various ways cobots can assist humans across a wide range of tasks, including automotive assembly...&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/RN9iskWeNfE" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;Meanwhile, in other cobot-related reading:&lt;/p&gt; 
&lt;ul&gt; 
 &lt;li&gt;How cutting-edge robotics bring manufacturing into a new age&amp;nbsp; (&lt;a href="http://exclusive.multibriefs.com/content/how-cutting-edge-robotics-bring-manufacturing-into-a-new-age/science-technology"&gt;MultiBriefs&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;With Robotics, Manufacturers Have An Opportunity To Boost Workplace Morale&amp;nbsp; (&lt;a href="https://www.forbes.com/sites/richblake1/2019/03/25/with-robotics-manufacturers-have-an-opportunity-to-boost-workplace-morale/#7c43c5ae6600"&gt;Forbes&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;6 Robotics Trends Taking Over Manufacturing&amp;nbsp; (&lt;a href="https://www.americanmachinist.com/automation-and-robotics/6-robotics-trends-taking-over-manufacturing"&gt;American Machinist&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Could these robots replace EU workers after Brexit?&amp;nbsp; (&lt;a href="https://www.manchestereveningnews.co.uk/news/greater-manchester-news/robots-eu-workers-after-brexit-15998588"&gt;Manchester Evening News&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;ABB is writing the future of digital industries at Hannover Messe&amp;nbsp; (&lt;a href="https://new.abb.com/news/detail/18274/abb-is-writing-the-future-of-digital-industries-at-hannover-messe"&gt;ABB&lt;/a&gt;)&lt;/li&gt; 
&lt;/ul&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;h2&gt;&lt;strong&gt;Elsewhere... &lt;/strong&gt;&lt;/h2&gt; 
&lt;p&gt;Researchers from the Harbin Institute of Technology in China and Michigan State University in the U.S. have revealed a way to control swarming microrobots that enables them to form into multiple types of shapes.&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/5c9b52ae94c8e.jpg?width=600&amp;amp;name=5c9b52ae94c8e.jpg" alt="control-swarming-microbots-msu" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Credit: MSU&lt;/p&gt; 
&lt;p&gt;Via &lt;a href="https://techxplore.com/news/2019-03-microrobots-multiple-swarming.html#nRlv"&gt;TechXplore&lt;/a&gt;:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;The robots in the experiments were actually just single grains of hematite—they were suspended in a liquid and manipulated using specialized magnets. By controlling the frequency and direction of the magnetic field, the researchers were able to get each robot to spin, roll, oscillate and tumble independently. When the magnetic field was controlled for a whole group of the robots, the result was a controllable swarm.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;(Paper: &lt;a href="http://robotics.sciencemag.org/content/4/28/eaav8006"&gt;Reconfigurable magnetic microrobot swarm&lt;/a&gt;)&lt;/p&gt; 
&lt;p&gt;U.S. Marines have successfully tested a new type of disposable delivery drone developed by Logistic Gliders Inc.&amp;nbsp;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/MzI2MDU0OA.jpeg?width=600&amp;amp;name=MzI2MDU0OA.jpeg" alt="logistic-gliders-delivery-drone" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Credit: Logistic Gliders Inc.&amp;nbsp;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;&lt;a href="https://spectrum.ieee.org/automaton/robotics/drones/disposable-delivery-drones-undergo-successful-tests-with-us-marines"&gt;IEEE Spectrum&lt;/a&gt; has the lowdown:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;The glider can be so cheap because it’s not designed to be reused—it performs one single delivery mission that involves a (hopefully gentle) crash at the end, the supplies are removed from the inside, and then the glider is abandoned. Plus, you save money on more than the glider itself, because you don’t have to risk a manned aircraft that likely costs thousands of dollars per hour to fly.&lt;/em&gt;&amp;nbsp;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;Mitsubishi Heavy Industries unveiled a pair of firefighting robots this week that can collaborate in environments that are too hazardous for humans.&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/mitsubishi-heavy-industries-firefighting-robot-system-5.jpg?width=600&amp;amp;name=mitsubishi-heavy-industries-firefighting-robot-system-5.jpg" alt="mitsubishi-heavy-industries-firefighting-robot-system-5" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Credit: Mitsubishi Heavy Industries&lt;/p&gt; 
&lt;p&gt;Via New Atlas:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;As the Hose bot moves along, it lays out a heavy duty firehose extension with 150 mm inner diameter. When everything is connected up, the 2,170 x 1,460 x 2,070 mm (85.4 x 57.4 x 81.4 in), 1,600 kg (3,527 lb) Cannon bot soaks or suffocates the flames with water or foam, and can drench a scene with up to 4,000 liters per minute at 1 megapascal (MPa) of pressure.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;And in other news:&lt;/p&gt; 
&lt;ul&gt; 
 &lt;li&gt;Alexa needs a robot body to escape the confines of today’s AI&amp;nbsp; (&lt;a href="https://www.technologyreview.com/s/613199/alexa-needs-a-robot-body-to-escape-the-confines-of-todays-ai/"&gt;MIT Technology Review&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;GITAI Partners With JAXA to Send Telepresence Robots to Space&amp;nbsp; (&lt;a href="https://spectrum.ieee.org/automaton/robotics/space-robots/gitai-partners-with-jaxa-to-send-telepresence-robots-to-space.amp.html"&gt;IEEE Spectrum&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;How Pope Francis could shape the future of robotics&amp;nbsp; (&lt;a href="https://www.bbc.com/news/technology-47668476"&gt;BBC&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Artificial intelligence better than humans at predicting premature death: Study (&lt;a href="https://abcnews.go.com/Health/artificial-intelligence-humans-predicting-premature-death-study/story?id=61995695"&gt;ABC News&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;UPS, Matternet launch drone healthcare delivery service&amp;nbsp; (&lt;a href="https://www.zdnet.com/article/ups-matternet-launch-drone-healthcare-delivery-service/"&gt;ZDNet&lt;/a&gt;) &lt;br&gt;&lt;br&gt;&lt;/li&gt; 
&lt;/ul&gt; 
&lt;p&gt;&lt;span class="hs_cos_wrapper hs_cos_wrapper_meta_field hs_cos_wrapper_type_rich_text"&gt;&lt;span&gt;Come by next week for more of the latest robotics news!&amp;nbsp; Until then, please enjoy... &lt;/span&gt;&lt;/span&gt;&lt;/p&gt; 
&lt;h2&gt;&lt;strong&gt;Five vids for Friday&lt;/strong&gt;&lt;/h2&gt; 
&lt;p&gt;1.&amp;nbsp; NVIDIA and Ghost Robotics are working on agile and dexterous mobile robot platforms, as shown in this new video that coincides with the recent launch of NVIDIA's &lt;a href="https://blogs.nvidia.com/blog/2019/03/18/isaac-sdk-general-availability/?ncid=so-twi-gj-79310"&gt;Isaac SDK, Robot Engine and SIM packages&lt;/a&gt; for robot developers.&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/FYPKBi5Wb2Y" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;2.&amp;nbsp; &lt;a href="https://news.byu.edu/news/byu-engineers-making-more-people-friendly-robot"&gt;Engineers from Brigham Young University&lt;/a&gt; have (with funding assistance from NASA) developed inflatable robots that can operate safely in circumstances where rigid robots may cause injury to people or property.&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/pjx9m5mONrE" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;3.&amp;nbsp; Boston Dynamics unveiled a stunning new iteration of its Handle robot.&amp;nbsp; (H/T &lt;a href="https://www.cnet.com/news/see-boston-dynamics-robot-stack-warehouse-boxes-like-a-tetris-pro/"&gt;c|net&lt;/a&gt;)&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/5iV_hB08Uns" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;4.&amp;nbsp; Researchers at Harvard have brought the term "soft robot" to a new level with the unveiling of a bot that's made out of rubber and air.&amp;nbsp; (H/T &lt;a href="https://cosmosmagazine.com/technology/xxx-rubbery-figures-scientists-create-an-entirely-soft-robot"&gt;Cosmos Magazine&lt;/a&gt;)&amp;nbsp;&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-undefined"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="https://players.brightcove.net/5483960636001/HJH3i8Guf_default/index.html?videoId=6017577303001" frameborder="0" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;5.&amp;nbsp; The Technical University of Munich's &lt;a href="https://www.amm.mw.tum.de/en/research/current-projects/humanoid-robot-lola/"&gt;LOLA humanoid&lt;/a&gt; is now able to walk over uneven terrain without relying on vision-based information, thanks to a newly developed ground force control scheme that handles ground-height variations and unplanned partial footholds.&amp;nbsp; (Paper: &lt;a href="https://mediatum.ub.tum.de/doc/1482152/1482152.pdf"&gt;A Force-Control Scheme for Biped Robots to Walk over Uneven Terrain&lt;/a&gt;)&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/pmtKv8VEItY" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;img src="https://track.hubspot.com/__ptq.gif?a=13401&amp;amp;k=14&amp;amp;r=https%3A%2F%2Fblog.robotiq.com%2Fwhats-new-in-robotics-29.03.2019&amp;amp;bu=https%253A%252F%252Fblog.robotiq.com&amp;amp;bvt=rss" alt="" width="1" height="1" style="min-height:1px!important;width:1px!important;border-width:0!important;margin-top:0!important;margin-bottom:0!important;margin-right:0!important;margin-left:0!important;padding-top:0!important;padding-bottom:0!important;padding-right:0!important;padding-left:0!important; "&gt;</content:encoded>
      <category>human-robot collaboration</category>
      <category>robotic research</category>
      <category>advanced manufacturing</category>
      <category>robotics news</category>
      <category>humanoid robot</category>
      <pubDate>Fri, 29 Mar 2019 13:19:50 GMT</pubDate>
      <author>emmetcole@gmail.com (Emmet Cole)</author>
      <guid>https://blog.robotiq.com/whats-new-in-robotics-29.03.2019</guid>
      <dc:date>2019-03-29T13:19:50Z</dc:date>
    </item>
    <item>
      <title>Do Robots Need 3D Vision? One Insect Says No</title>
      <link>https://blog.robotiq.com/do-robots-need-3d-vision-one-insect-says-no</link>
      <description>&lt;div class="hs-featured-image-wrapper"&gt; 
 &lt;a href="https://blog.robotiq.com/do-robots-need-3d-vision-one-insect-says-no" title="" class="hs-featured-image-link"&gt; &lt;img src="https://blog.robotiq.com/hubfs/praying-mantis-1170776_1920.jpg" alt="praying-mantis-3D-vision" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"&gt; &lt;/a&gt; 
&lt;/div&gt;    
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt; &lt;strong&gt;To use 3D or not to use 3D? That's the question. Which properties are the most important for robot vision? The natural world sheds light on the answer.&lt;/strong&gt;&lt;/p&gt;</description>
      <content:encoded>&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt;&lt;strong&gt;To use 3D or not to use 3D? That's the question. Which properties are the most important for robot vision? The natural world sheds light on the answer.&lt;/strong&gt;&lt;/p&gt; 
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt;&lt;/p&gt; 
&lt;p&gt;&lt;strong&gt;Does my robot really need 3D vision? What type of vision system do I need? These are questions which come up a lot in robotics. Some people say that 3D is necessary because humans have stereo vision.&amp;nbsp;But is 3D vision really better than 2D vision?&lt;/strong&gt;&lt;/p&gt; 
&lt;p&gt;Some recent research has provided a new perspective on this issue from an unlikely source. Research into how the praying mantis sees has shed some light on which properties are most important for robot vision.&lt;/p&gt; 
&lt;h2&gt;How praying mantises see in 3D&lt;/h2&gt; 
&lt;p style="text-align: center;"&gt;&lt;a href="/robotiq-wrist-camera-update"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/praying-mantis-1170776_1920.jpg?width=600&amp;amp;name=praying-mantis-1170776_1920.jpg" alt="praying-mantis-3D-vision" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;&lt;/a&gt;It turns out that mantises have a very efficient and accurate 3D vision system.&lt;/p&gt; 
&lt;p&gt;An intriguing article caught my eye last year about 3D vision. &lt;a href="https://www.cell.com/current-biology/fulltext/S0960-9822(18)30014-9"&gt;It was some new research&lt;/a&gt; into the amazing visual system of the praying mantis insect.&lt;/p&gt; 
&lt;p&gt;Researchers from Newcastle University found that the insect does see in three dimensions, but not in the normal way (i.e. the way that we see in 3D). Normally, stereo vision is obtained by taking two images of the same scene from two slightly different locations. These two images are combined by the brain — or by the computer in robot vision — and depth information is extracted from the disparity between corresponding points in the two images.&lt;/p&gt; 
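The disparity-to-depth relationship behind conventional stereo vision can be sketched in a few lines. This is a minimal illustration assuming a simple pinhole stereo model; the focal length, baseline, and disparity values below are made up for the example and are not from the research:

```python
# Pinhole stereo model: depth is inversely proportional to disparity
# (the horizontal shift of a matched point between the two images).
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return depth in metres for one matched point (depth = f * B / d)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 700 px focal length, 6 cm baseline between cameras.
near = depth_from_disparity(700, 0.06, 42.0)  # large disparity -> close object
far = depth_from_disparity(700, 0.06, 7.0)    # small disparity -> distant object
assert near < far
```

The inverse relationship is why stereo systems locate nearby objects far more precisely than distant ones, where a one-pixel matching error translates into a large depth error.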
&lt;p&gt;Like humans, praying mantises have two eyes which face forward. They can also precisely detect depths for catching their prey, as humans and other predators can. As a result, you might think that they would use the same visual system as other predators use. But, you'd be wrong.&lt;/p&gt; 
&lt;p&gt;It turns out that mantises have a very efficient and accurate 3D vision system. However, the system only works properly in one specific situation: when the objects it sees are moving (e.g. its prey) and when both of the insect's eyes see the same object.&amp;nbsp;When objects are not moving, it's possible that the praying mantis doesn't see in 3D at all (the researchers didn't test this, so we can't be sure). Perhaps they just see in 2D.&lt;/p&gt; 
&lt;p&gt;When each of the insect's eyes is shown a different, unrelated moving image — which wouldn't happen in nature — the mantis will inaccurately detect 3D moving objects. This could be similar to an optical illusion in humans, where our visual system shows us things which aren't there.&lt;/p&gt; 
&lt;p&gt;The researchers also found that the mantis was easily able to detect camouflaged objects and wasn't distracted by changes of light. In these situations, the mantis performed much better than humans.&lt;/p&gt; 
&lt;p&gt;&lt;a href="https://www.zdnet.com/article/a-praying-mantis-wearing-tiny-glasses-holds-the-key-to-robot-vision/"&gt;In an article for ZDNet,&lt;/a&gt; robotics journalist Greg Nichols discussed how these new insights can be applied to robotics. Because the mantis's vision is very efficient (correlating to its very small brain) this type of stereo vision could be perfect for low computation robots.&lt;/p&gt; 
&lt;h2&gt;What we can learn by looking into a mantis's eyes&lt;/h2&gt; 
&lt;p&gt;There is a useful lesson that we can learn from the praying mantis:&lt;/p&gt; 
&lt;p&gt;&lt;strong&gt;Match the type of vision system to the specific needs of the task.&lt;/strong&gt;&lt;/p&gt; 
&lt;p&gt;The praying mantis insect has a specialized vision system which works very well for the specific application for which it is needed, i.e. catching moving prey which comes close to it. However, the mantis's 3D vision would probably perform badly in other situations. For example, it wouldn't be able to detect the position of a static pen to be able to pick it up. Humans are very good at this, but we find it hard to detect camouflaged moving objects.&lt;/p&gt; 
&lt;p&gt;In robotics, it's not a good idea to invest in the most complex, expensive vision system in an attempt to have a system which will work in every situation. The reality is that there is no vision system which works in every situation!&lt;/p&gt; 
&lt;p&gt;The vision system of a praying mantis works better than human vision in some situations but worse in other situations. This is also true for robot vision.&lt;/p&gt; 
&lt;p&gt;&lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=dc910fdd-73af-4256-9c1e-ac13cf4185b5&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; margin: 0 auto; display: block; margin-top: 20px; margin-bottom: 20px" alt="wrist-camera-urcap-update-cta" src="https://no-cache.hubspot.com/cta/default/13401/dc910fdd-73af-4256-9c1e-ac13cf4185b5.png" align="middle"&gt;&lt;/a&gt;&lt;/p&gt; 
&lt;h3&gt;3D robot vision vs 2D robot vision&lt;/h3&gt; 
&lt;p&gt;The research was specifically looking at the 3D vision capabilities of the insect.&lt;/p&gt; 
&lt;p&gt;In robot vision, it's sometimes thought that 3D vision is better than 2D vision. Some people think that it's only used less because it's more expensive. However, this is also not true. Just as the mantis's vision performs better in some situations than others, 3D vision can perform worse than 2D in some situations. For example, 3D requires more complex algorithms, so it can end up slower, and it can be less robust when detecting some materials.&lt;/p&gt; 
&lt;p&gt;Each of the &lt;a href="/top-10-challenges-for-robot-vision"&gt;Top 10 Challenges for Robot Vision&lt;/a&gt; will affect different vision technologies to varying degrees. Some challenges, like "background" will affect 2D more than 3D. Other challenges, like "movement" might affect 3D more.&lt;/p&gt; 
&lt;ul style="list-style-type: disc;"&gt; 
 &lt;li&gt;&lt;strong&gt;Read more:&amp;nbsp;&lt;a href="/top-10-challenges-for-robot-vision"&gt;Top 10 Challenges for Robot Vision&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt; 
&lt;/ul&gt; 
&lt;h2&gt;5 insect-inspired properties that will affect robot vision&lt;/h2&gt; 
&lt;p&gt;What properties are going to be important for your robot application?&lt;/p&gt; 
&lt;p&gt;Let's have a look at 5 of the top properties, inspired by the visual capabilities of our praying mantis.&lt;/p&gt; 
&lt;h3&gt;1. Speed of detection&lt;/h3&gt; 
&lt;p&gt;The praying mantis needs to detect prey at ultra-quick speeds. Although most robots don't need to be so rapid, detection speed is still very important.&lt;/p&gt; 
&lt;p&gt;Faster robot vision means that you can reduce the cycle time. In turn, this improves the productivity of your robot cell.&lt;/p&gt; 
&lt;h3&gt;2. Background robustness&lt;/h3&gt; 
&lt;p&gt;One of the amazing things about the mantis's vision is its ability to detect very camouflaged moving objects.&lt;/p&gt; 
&lt;p&gt;Robot vision relies on having a good contrast between the object in the foreground and the background. The best vision solutions are robust to different backgrounds and can detect objects on almost any material.&lt;/p&gt; 
&lt;h3&gt;3. Image definition control&lt;/h3&gt; 
&lt;p&gt;The visual resolution of the praying mantis is quite poor, as it is for many insects. Mantises are able to function despite seeing in very low definition.&lt;/p&gt; 
&lt;p&gt;The required resolution of robot vision depends on the needs of your task. Sometimes, detection algorithms are more robust with lower resolutions. Other times, high resolution is necessary. The best systems allow you to choose the resolution based on your needs.&lt;/p&gt; 
&lt;h3&gt;4. Lighting robustness&lt;/h3&gt; 
&lt;p&gt;The praying mantis is able to see in very bad and changeable lighting conditions. In fact, its vision seems to be based entirely on movement rather than on the differences in light that our own vision depends on.&lt;/p&gt; 
&lt;p&gt;&lt;a href="https://blog.robotiq.com/robot-vision-lighting-why-theres-no-perfect-setup"&gt;Robot vision is often very sensitive to lighting changes&lt;/a&gt;. Therefore, the best systems introduce some robustness to lighting, such as active and strobed lighting.&lt;/p&gt; 
&lt;h3&gt;5. Useful object detection&lt;/h3&gt; 
&lt;p&gt;The praying mantis has a perfect example of a vision system which detects exactly what it needs to detect (moving prey and predators) and little else.&lt;/p&gt; 
&lt;p&gt;Robot vision is only useful if it can detect what you need it to detect. This is why features like &lt;a href="https://blog.robotiq.com/how-template-matching-works-in-robot-vision"&gt;easy-to-program template matching are so important&lt;/a&gt;. You want to be able to quickly program your desired objects, press play, and let the robot get on with it.&lt;/p&gt; 
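At its core, template matching slides a stored pattern over the input and scores each position. The toy one-dimensional sketch below uses sum-of-squared-differences scoring to make the idea concrete; real vision systems work on 2-D images and typically use more robust scores such as normalized cross-correlation, and the signal values here are purely illustrative:

```python
# Toy 1-D template matching via sum of squared differences (SSD):
# slide the template over the signal and keep the offset with the
# lowest (best) score.
def match_template(signal, template):
    """Return the offset where the template best matches the signal."""
    best_offset, best_score = 0, float("inf")
    for offset in range(len(signal) - len(template) + 1):
        score = sum((signal[offset + i] - t) ** 2 for i, t in enumerate(template))
        if score < best_score:
            best_offset, best_score = offset, score
    return best_offset

signal = [0, 0, 1, 5, 9, 5, 1, 0, 0]
template = [5, 9, 5]
assert match_template(signal, template) == 3  # exact match at offset 3
```

An "easy-to-program" system hides this machinery: you show it one example of the object (the template), and the matching and scoring happen behind the scenes.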
&lt;p&gt;&amp;nbsp;&lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=9b957a82-9875-4c1c-9335-f931f87ee495&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; margin: 0 auto; display: block; margin-top: 20px; margin-bottom: 20px" alt="New Call-to-action" src="https://no-cache.hubspot.com/cta/default/13401/9b957a82-9875-4c1c-9335-f931f87ee495.png" align="middle"&gt;&lt;/a&gt;&lt;/p&gt; 
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 0px 0px 10px 0px;"&gt;&lt;em&gt;Which properties are most useful for your robot vision application? &lt;strong&gt;Tell us in the comments below or join the discussion on &lt;a href="https://www.linkedin.com/company/1695451"&gt;LinkedIn&lt;/a&gt;, &lt;a href="https://twitter.com/Robotiq_Inc"&gt;Twitter,&lt;/a&gt; &lt;a href="https://www.facebook.com/robotiq"&gt;Facebook&lt;/a&gt; or &lt;a href="http://dof.robotiq.com/"&gt;the DoF professional robotics community&lt;/a&gt;.&lt;/strong&gt;&lt;/em&gt; &lt;/p&gt;  
&lt;img src="https://track.hubspot.com/__ptq.gif?a=13401&amp;amp;k=14&amp;amp;r=https%3A%2F%2Fblog.robotiq.com%2Fdo-robots-need-3d-vision-one-insect-says-no&amp;amp;bu=https%253A%252F%252Fblog.robotiq.com&amp;amp;bvt=rss" alt="" width="1" height="1" style="min-height:1px!important;width:1px!important;border-width:0!important;margin-top:0!important;margin-bottom:0!important;margin-right:0!important;margin-left:0!important;padding-top:0!important;padding-bottom:0!important;padding-right:0!important;padding-left:0!important; "&gt;</content:encoded>
      <category>vision</category>
      <category>robot vision</category>
      <pubDate>Thu, 28 Mar 2019 15:02:00 GMT</pubDate>
      <author>alex@alexowenhill.co.uk (Alex Owen-Hill)</author>
      <guid>https://blog.robotiq.com/do-robots-need-3d-vision-one-insect-says-no</guid>
      <dc:date>2019-03-28T15:02:00Z</dc:date>
    </item>
    <item>
      <title>What's New In Robotics?  22.03.2019</title>
      <link>https://blog.robotiq.com/whats-new-in-robotics-22.03.2019</link>
      <description>&lt;div class="hs-featured-image-wrapper"&gt; 
 &lt;a href="https://blog.robotiq.com/whats-new-in-robotics-22.03.2019" title="" class="hs-featured-image-link"&gt; &lt;img src="https://blog.robotiq.com/hubfs/xtang1.2019-03-14%2015_38_17.gif" alt="xtang1.2019-03-14 15_38_17" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"&gt; &lt;/a&gt; 
&lt;/div&gt;    
&lt;p&gt;&lt;strong&gt;Hi!&amp;nbsp; In this week's news mix: INM unveils gecko-inspired cobot tech, new mobile, 3D-printing 'Ambots' set for launch and Automata raises Series A funding.&amp;nbsp; We encounter new artificial muscles, robots inspired by plasma-producing shrimp, liquid metal and much more!&amp;nbsp;&lt;/strong&gt;&lt;/p&gt;</description>
      <content:encoded>&lt;p&gt;&lt;strong&gt;Hi!&amp;nbsp; In this week's news mix: INM unveils gecko-inspired cobot tech, new mobile, 3D-printing 'Ambots' set for launch and Automata raises Series A funding.&amp;nbsp; We encounter new artificial muscles, robots inspired by plasma-producing shrimp, liquid metal and much more!&amp;nbsp;&lt;/strong&gt;&lt;/p&gt; 
&lt;p&gt;&lt;/p&gt; 
&lt;h2&gt;Cobots &amp;amp; manufacturing&lt;/h2&gt; 
&lt;p&gt;Scientists at the &lt;a href="https://www.leibniz-inm.de/en/"&gt;INM - Leibniz Institute for New Materials&lt;/a&gt; have unveiled a gecko-inspired cobot technology with microstructured, adhesive surfaces for object handling.&amp;nbsp; Very soft and without sharp corners and edges, the Gecomer technology is designed to further enhance cobot safety and will be displayed at Hannover Messe 2019 (April 1-5).&amp;nbsp; (Paper: &lt;a href="https://doi.org/10.3390/ma12010097"&gt;Roll-to-Roll Manufacturing of Micropatterned Adhesives by Template Compression&lt;/a&gt;)&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Innobot.jpg?width=600&amp;amp;name=Innobot.jpg" alt="Inm-robot-microstructured-leibniz-institute-new-materials" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Credit: INM - Leibniz Institute for New Materials&lt;/p&gt; 
&lt;p&gt;Via &lt;a href="https://www.alphagalileo.org/en-gb/Item-Display/ItemId/176900?returnurl=https://www.alphagalileo.org/en-gb/Item-Display/ItemId/176900"&gt;AlphaGalileo&lt;/a&gt;:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;"The gripping and detaching of objects is affected by smart surface structures. This enables us to dispense with pointed grippers or tweezers," says Eduard Arzt, Scientific Director and Head of the Functional Microstructures Program Division. As a result, objects can be transported and deposited in the production process without any risk of injury to humans or damage to the objects. The adhesive structures are particularly suitable for sensitive parts, such as devices for the automotive, semiconductor and display industries."&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;Meet the&lt;a href="http://www.ambots.net/"&gt; Ambots&lt;/a&gt;: mobile, collaborative 3-D printers designed to move around factories and support human workers by providing 3-D printed pieces on demand...&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-undefined"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="https://cdnapisec.kaltura.com/p/1315742/sp/131574200/embedIframeJs/uiconf_id/40133072/partner_id/1315742?iframeembed=true&amp;amp;playerId=kaltura_player&amp;amp;entry_id=1_u0ie4coj&amp;amp;flashvars[streamerType]=auto" width="600" height="423" frameborder="0" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;Vancouver, Canada-based collaborative welding robot maker Novarc Technologies has been named to Rocket Builders’ 17th annual “Ready to Rocket” list.&amp;nbsp; &lt;a href="http://virtual-strategy.com/2019/03/17/novarc-technologies-named-to-2019-ready-to-rocket-list-highlighting/"&gt;Virtual Strategy&lt;/a&gt; reports:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;Novarc’s Spool Welding Robot (SWR) works along with the human operator to allow less-skilled welders to do the work of highly-skilled welders, by combining the cognition of a lower-skilled welder with the repeatable motion of our robot. As a result, Novarc’s SWR can produce high quality welds consistently, and significantly increases welder productivity and safety. &lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;One of the first firms in the Netherlands to adopt cobot technology, Heemskerk has been relying on Universal Robots' cobots for handling CNC machines and now has one &lt;a href="https://www.universal-robots.com/products/ur5-robot/"&gt;UR5&lt;/a&gt; and seven &lt;a href="https://www.universal-robots.com/products/ur10-robot/"&gt;UR10&lt;/a&gt;s in its facility...&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/zADQuv_Efsc" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;&lt;br&gt;ABB-backed robotics startup &lt;a href="https://automata.tech/"&gt;Automata Technologies&lt;/a&gt; has raised US$7.4 million in Series A funding for 'Eva,' its diminutive industrial robot.&amp;nbsp; Selling for US$6,600, the 'desktop' cobot is designed to "replace tasks, not jobs" and comes with a reach of 600 mm (23.62 in.) and a maximum payload of 1.25 kg (2.75 lb)&amp;nbsp; (H/T &lt;a href="https://techcrunch.com/2019/03/19/uks-automata-raises-7-4m-for-its-lightweight-industrial-desktop-robot/"&gt;TechCrunch&lt;/a&gt;).&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/105793343-1552564219630eva-pick-and-place.jpg?width=600&amp;amp;name=105793343-1552564219630eva-pick-and-place.jpg" alt="industrial-robot-cobot-eva-pick-and-place" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Credit: Automata Technologies&lt;/p&gt; 
&lt;p&gt;Writing in &lt;a href="https://www.forbes.com/sites/forbestechcouncil/2019/03/18/advanced-robotics-versus-practical-robotics/#3334241071b8"&gt;Forbes&lt;/a&gt;, Afshin Doust, CEO at Advanced Intelligent Systems, made an interesting distinction between 'advanced' and 'practical' robotics:&amp;nbsp;&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;Once a practical application is identified with a big enough market, practical roboticists focus on using the technology at hand and adding more customization to create a solution for automating these tasks. Commercializing robotics helps in advancing the science of robotics. The advanced robotics engineers, without realizing it, are highly dependent on practical roboticists for their future survival.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;In other cobot news:&lt;/p&gt; 
&lt;ul&gt; 
 &lt;li&gt;Functional safety and its application to automated industrial cobots&amp;nbsp; (&lt;a href="https://www.eenewseurope.com/news/functional-safety-and-its-application-automated-industrial-cobots/page/0/1"&gt;eeNewsEurope&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Is Your Company Ready for the Robotic Revolution?&amp;nbsp; (&lt;a href="https://www.logisticsmgmt.com/article/is_your_company_ready_for_the_robotic_revolution"&gt;Logistics Management&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Robots in the modern factory&amp;nbsp; (&lt;a href="https://www.tgdaily.com/amp/story/technology%2Frobots-in-the-modern-factory"&gt;TG Daily&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Automate 2019 Preview&amp;nbsp; (&lt;a href="https://www.robotics.org/content-detail.cfm/Industrial-Robotics-Industry-Insights/Automate-2019-Preview/content_id/7788"&gt;Robotic Industries Association&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Shaping the future&amp;nbsp; (&lt;a href="http://www.controlengeurope.com/article/168943/Shaping-the-future.aspx"&gt;Control Engineering Europe&lt;/a&gt;)&lt;br&gt;&lt;br&gt;&lt;/li&gt; 
&lt;/ul&gt; 
&lt;h2&gt;&lt;strong&gt;Elsewhere...&lt;/strong&gt;&lt;/h2&gt; 
&lt;p&gt;In a busy week for bio-inspired bots, researchers unveiled a prototype, shrimp-inspired, plasma-producing underwater robot. &amp;nbsp;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/xtang1.2019-03-14%2015_38_17.gif?width=600&amp;amp;name=xtang1.2019-03-14%2015_38_17.gif" alt="prototype-shrimp-inspired-plasma-producing-underwater-robot" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Credit: David Staack&lt;/p&gt; 
&lt;p&gt;Via &lt;a href="https://www.wired.com/story/shrimp-plasma/"&gt;Wired&lt;/a&gt;:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;The pistol shrimp doesn’t have a monopoly on underwater plasma generation. People weld underwater using plasma, known as plasma arc welding, which produces intense heat. And researchers can also make plasma in water with lasers. The problem is, those means are inefficient. Using the claw to generate plasma is 10 times more efficient than those previously explored methods [...]&amp;nbsp; It will, though, require more development to scale.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;Meanwhile, a research group at Saarland University has revealed artificial muscles made from shape-memory wires that "have the ability to bend in almost any direction and to wind themselves around corners."&amp;nbsp;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/195789_web.jpg?width=600&amp;amp;name=195789_web.jpg" alt="artificial-muscles-made-from-shape-memory-wires" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Credit:&amp;nbsp; Oliver Dietze&lt;/p&gt; 
&lt;p&gt;&lt;a href="https://phys.org/news/2019-03-robot-arms-flexibility-elephant-trunk.html"&gt;PhysOrg&lt;/a&gt; reports:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;The flexible arms are powered electrically and so can do without the usual pneumatic equipment or other bulky accessories. As the shape-memory alloy itself has sensor properties, the arms can be controlled without the need for additional sensors. The new technology can be used to build large robotic arms with the flexibility of an elephant's trunk or ultrafine tentacles for use in endoscopic operations.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;And Jeff Bezos was spotted cavorting with one of Festo's amazing bio-inspired robot dragonflies at the MARS conference.&amp;nbsp; (H/T &lt;a href="https://news.yahoo.com/jeff-bezos-puts-robotic-dragonfly-142155207.html"&gt;Yahoo!&lt;/a&gt;)&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/proxy.duckduckgo.com.jpg?width=300&amp;amp;name=proxy.duckduckgo.com.jpg" alt="proxy.duckduckgo.com" style="width: 300px; margin: 0px auto;" width="300"&gt;&lt;br&gt;Credit:&amp;nbsp; &lt;span class="image-source-caption "&gt;&lt;span class="image-source"&gt;&lt;a href="https://twitter.com/JeffBezos/status/1107991131147534336"&gt;@JeffBezos&lt;/a&gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt; 
&lt;p&gt;Meanwhile, in outer space, NASA's Osiris-Rex space bot discovered that the surface of its destination (the Bennu asteroid) is littered with more boulders and rubble than expected.&amp;nbsp;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/f-bennu-a-20190321-870x508.jpg?width=600&amp;amp;name=f-bennu-a-20190321-870x508.jpg" alt="Artist's-impression-of-OSIRIS-REx-spacecraft-mapping-Bennu" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Artist's impression of OSIRIS-REx spacecraft mapping Bennu. &lt;br&gt;Credit: NASA / Goddard / University of Arizona / VIA AP&lt;/p&gt; 
&lt;p&gt;Via &lt;a href="https://www.japantimes.co.jp/news/2019/03/20/world/science-health-world/bolder-strewn-surface-poses-snag-nasa-plan-scoop-dirt-asteroid/"&gt;The Japan Times&lt;/a&gt;:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;NASA’s plan to scoop up dirt and gravel from an asteroid has hit a snag, but scientists say they can overcome it. Osiris-Rex is scheduled to descend close to the surface in the summer of 2020. It will extend a robot arm to pick up the sample, which will be returned to Earth in 2023. The spacecraft began orbiting Bennu at the end of last year, after spending two years chasing down the space rock.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;And in other news:&lt;/p&gt; 
&lt;ul&gt; 
 &lt;li&gt;Seeing through a robot's eyes helps those with profound motor impairments&amp;nbsp; (&lt;a href="https://www.sciencedaily.com/releases/2019/03/190315155451.htm"&gt;ScienceDaily&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Can you murder a robot?&amp;nbsp; (&lt;a href="https://www.bbc.com/news/amp/technology-47090174"&gt;BBC&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Chinese Doctors Successfully Perform Remote Brain Surgery Using Huawei’s 5G Technology&amp;nbsp; (&lt;a href="https://eurasiafuture.com/2019/03/18/chinese-doctors-successfully-perform-remote-brain-surgery-using-huaweis-5g-technology/"&gt;Eurasia Future&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Can Miners Take Robot Truck Technology From Pit to Main Street?&amp;nbsp; (&lt;a href="https://www.bloomberg.com/amp/news/articles/2019-03-21/can-miners-take-robot-truck-technology-from-pit-to-main-street"&gt;Bloomberg&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;What AI Is Still Far From Figuring Out&amp;nbsp; (&lt;a href="https://www.wsj.com/articles/what-ai-is-still-far-from-figuring-out-11553112473"&gt;The Wall Street Journal&lt;/a&gt;)&lt;/li&gt; 
&lt;/ul&gt; 
&lt;p&gt;&lt;span class="hs_cos_wrapper hs_cos_wrapper_meta_field hs_cos_wrapper_type_rich_text"&gt;&lt;span&gt;Come by next week for more of the latest robotics news!&amp;nbsp; Until then, please enjoy..&lt;/span&gt;&lt;/span&gt;&lt;/p&gt; 
&lt;h2&gt;&lt;strong&gt;Five vids for Friday&lt;/strong&gt;&lt;/h2&gt; 
&lt;p&gt;1.&amp;nbsp; Researchers at Columbia Engineering and MIT Computer Science &amp;amp; Artificial Intelligence Lab have demonstrated a system that combines loosely coupled simple components (or "particles") to create functional robots.&amp;nbsp; (Paper: &lt;a href="https://www.nature.com/articles/s41586-019-1022-9"&gt;Particle robotics based on statistical mechanics of loosely coupled components&lt;/a&gt;)&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/wrDdqjQvaoA" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;2.&amp;nbsp; Separately, researchers have unveiled a magnetic liquid metal that could one day be used to construct robots.&amp;nbsp; The potential of this technology goes way beyond the creation of "scary" Terminator-esque bots, of course.&amp;nbsp; The industrial applications of teleoperated --or even collaborative-- liquid metal bots are virtually limitless.&amp;nbsp; (Paper: &lt;a href="https://pubs.acs.org/stoken/presspac/presspac/full/10.1021/acsami.8b22699"&gt;Magnetic Liquid Metals Manipulated in the Three-Dimensional Free Space&lt;/a&gt;)&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/jFNpfD1sg6g" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;3.&amp;nbsp; And if you think Terminator is scary, wait till you see a &lt;a href="https://www.kickstarter.com/projects/tombot/tombot-affordable-robotic-companion-animals-for-se?ref=discovery_category_newest"&gt;Tombot&lt;/a&gt; in action!&amp;nbsp; (Also, I want one.)&amp;nbsp;&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/-VPbmEiOWsA" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;4.&amp;nbsp; &lt;a href="https://www.csail.mit.edu/"&gt;MIT CSAIL&lt;/a&gt; researchers have proposed a new technique that enables robots to generalize their learning with relatively little data.&amp;nbsp; The technique enables pick and place operations based on limited knowledge of the objects being handled.&amp;nbsp; (Paper: &lt;a href="https://arxiv.org/abs/1903.06684"&gt;kPAM: KeyPoint Affordances for Category-Level Robotic Manipulation&lt;/a&gt;)&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/fm5RZ-ht1y0" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;5.&amp;nbsp; In a development that could help usher in an era of interspecies mediation robots, researchers from the École polytechnique fédérale de Lausanne's &lt;a href="https://biorob.epfl.ch/"&gt;Biorobotics Laboratory&lt;/a&gt; have created a system that enabled fish and bees 700 km (435 mi) apart to collaborate on decision making processes.&amp;nbsp; (Paper: &lt;a href="http://robotics.sciencemag.org/content/4/28/eaau7897"&gt;Robots mediating interactions between animals for interspecies collective behaviors&lt;/a&gt;)&amp;nbsp;&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/g4xrN9jKEgE" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt;  
&lt;img src="https://track.hubspot.com/__ptq.gif?a=13401&amp;amp;k=14&amp;amp;r=https%3A%2F%2Fblog.robotiq.com%2Fwhats-new-in-robotics-22.03.2019&amp;amp;bu=https%253A%252F%252Fblog.robotiq.com&amp;amp;bvt=rss" alt="" width="1" height="1" style="min-height:1px!important;width:1px!important;border-width:0!important;margin-top:0!important;margin-bottom:0!important;margin-right:0!important;margin-left:0!important;padding-top:0!important;padding-bottom:0!important;padding-right:0!important;padding-left:0!important; "&gt;</content:encoded>
      <category>collaborative robots</category>
      <category>biomimetic</category>
      <category>advanced manufacturing</category>
      <category>cobot</category>
      <category>robotics news</category>
      <category>bio-inspired</category>
      <pubDate>Fri, 22 Mar 2019 11:02:00 GMT</pubDate>
      <author>emmetcole@gmail.com (Emmet Cole)</author>
      <guid>https://blog.robotiq.com/whats-new-in-robotics-22.03.2019</guid>
      <dc:date>2019-03-22T11:02:00Z</dc:date>
    </item>
    <item>
      <title>Bin Picking the Easy Way vs the Hard Way With Robot Vision</title>
      <link>https://blog.robotiq.com/bin-picking-the-easy-way-vs-the-hard-way-with-robot-vision</link>
      <description>&lt;div class="hs-featured-image-wrapper"&gt; 
 &lt;a href="https://blog.robotiq.com/bin-picking-the-easy-way-vs-the-hard-way-with-robot-vision" title="" class="hs-featured-image-link"&gt; &lt;img src="https://blog.robotiq.com/hubfs/Hand-E%20Camera%20FT%20300%20Machine%20Tending-14.jpg" alt="Hand-E Camera FT 300 Machine Tending-14" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"&gt; &lt;/a&gt; 
&lt;/div&gt;    
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt; &lt;strong&gt;It's one of the trickiest robotic tasks in the world. But, you don't need a complex solution to solve it. There's a hard way to do bin picking with robot vision and an easy way.&lt;/strong&gt;&lt;/p&gt;</description>
      <content:encoded>&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt;&lt;strong&gt;It's one of the trickiest robotic tasks in the world. But, you don't need a complex solution to solve it. There's a hard way to do bin picking with robot vision and an easy way.&lt;/strong&gt;&lt;/p&gt; 
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;iframe class="wistia_embed" style="margin-left: auto; margin-right: auto; display: block;" src="https://fast.wistia.net/embed/iframe/w4pmovp7c9" name="wistia_embed" width="600" height="338" frameborder="0" allowfullscreen&gt;&lt;/iframe&gt;&lt;span&gt;The&lt;/span&gt;&lt;a href="/robotiq-wrist-camera-update"&gt;&lt;span&gt;&amp;nbsp;&lt;/span&gt;Robotiq Wrist Camera URCap version 1.7&lt;/a&gt;&lt;span&gt;&lt;span&gt;&amp;nbsp;&lt;/span&gt;is now the fastest and most intuitive vision system for Universal Robots.&lt;/span&gt;&lt;/p&gt; 
&lt;p&gt;Bin picking is "robotics quest for the holy grail", said Guillame Roberts &lt;a&gt;&lt;/a&gt;&lt;a href="/bid/63564/Is-2013-the-Year-for-Bin-Picking-in-Robotics"&gt;here on the blog back in 2013&lt;/a&gt;. Although 6 years have passed since he wrote that, robotic bin picking is still almost as elusive as it was back then.&lt;/p&gt; 
&lt;p&gt;Over the past two decades, a lot of research and development has gone into &lt;strong&gt;giving robots the ability to recognize jumbled objects in a box and pick them up individually&lt;/strong&gt;. This task — which is so simple for us humans — has revealed the limitations of almost every aspect of robotic manipulation, including: robot vision, multi-fingered grasping, artificial intelligence, and trajectory planning.&lt;/p&gt; 
&lt;p&gt;It's true that bin picking is much more achievable than it was five years ago. Technological advances in all the related fields have provided solutions to many of the challenges that previously made it impossible. However, there's still a long way to go. Last year, &lt;a href="/3-innovative-robotic-assembly-challenges-from-2018"&gt;Amazon discontinued their yearly picking challenge&lt;/a&gt; because it wasn't producing results.&lt;/p&gt; 
&lt;p&gt;Despite the advances, most current solutions are complex and require a lot of extra technology, such as advanced 3D vision setups and machine learning.&lt;/p&gt; 
&lt;p&gt;But, it doesn't have to be like this. Last year, teams here at Robotiq revealed that you don't have to go for the hard solutions to bin picking. Even with simple robot vision, you can achieve robust bin picking with a robot.&lt;/p&gt; 
&lt;p&gt;&lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=dc910fdd-73af-4256-9c1e-ac13cf4185b5&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; margin: 0 auto; display: block; margin-top: 20px; margin-bottom: 20px" alt="wrist-camera-urcap-update-cta" src="https://no-cache.hubspot.com/cta/default/13401/dc910fdd-73af-4256-9c1e-ac13cf4185b5.png" align="middle"&gt;&lt;/a&gt;&lt;/p&gt; 
&lt;h2&gt;What is bin picking?&lt;/h2&gt; 
&lt;p&gt;Bin picking is a robotic manipulation task that involves detecting objects which are arranged in a highly unstructured manner and picking them up individually.&lt;/p&gt; 
&lt;p&gt;The classic example is to have objects piled on top of each other inside a box. The robot uses a vision system&amp;nbsp;to detect individual objects. It then uses trajectory planning to grasp each object one by one and remove it from the box.&lt;/p&gt; 
&lt;p&gt;This is a very challenging task for robot vision.&lt;/p&gt; 
&lt;p&gt;Challenges include:&lt;/p&gt; 
&lt;ul&gt; 
 &lt;li&gt;&lt;strong&gt;Occlusion&lt;/strong&gt; — Some objects are partly or completely hidden by the objects on top of them.&lt;/li&gt; 
 &lt;li&gt;&lt;strong&gt;Lighting&lt;/strong&gt; — The objects cast shadows on each other which further hides them from the camera.&lt;/li&gt; 
 &lt;li&gt;&lt;strong&gt;Edge detection&lt;/strong&gt; — It is unclear where one object starts and the other finishes, which makes it hard to detect the outline of each individual object.&lt;/li&gt; 
&lt;/ul&gt; 
&lt;p&gt;These issues occur using both 2D and 3D vision. However, they are especially problematic in 2D robot vision as they can make it almost impossible to detect individual objects.&lt;/p&gt; 
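The 2D failure mode described above can be made concrete with a toy example (not from the article, just a minimal sketch): when two parts touch, a naive 2D blob detector merges them into one connected component, so the robot "sees" one object instead of two.

```python
def count_blobs(image):
    """Count 4-connected components of 1-pixels in a binary grid."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] == 1 and not seen[r][c]:
                blobs += 1
                seen[r][c] = True
                stack = [(r, c)]
                while stack:  # flood fill the whole blob
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (ny in range(rows) and nx in range(cols)
                                and image[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return blobs

# Two parts with a visible gap vs. the same two parts touching:
separated = [[1, 1, 0, 1, 1],
             [1, 1, 0, 1, 1]]
touching  = [[1, 1, 1, 1],
             [1, 1, 1, 1]]
print(count_blobs(separated), count_blobs(touching))  # 2 1
```

The touching pair is counted as a single object, which is exactly why edge detection and occlusion are so problematic for 2D bin picking.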
&lt;ul style="list-style-type: disc;"&gt; 
 &lt;li&gt;&lt;strong&gt;Read more:&amp;nbsp;&lt;a href="/top-10-challenges-for-robot-vision"&gt;Top 10 Challenges for Robot Vision&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt; 
&lt;/ul&gt; 
&lt;h2&gt;The hard way to do bin picking: Complex 3D vision&lt;/h2&gt; 
&lt;p style="text-align: center;"&gt;&lt;a href="/robotiq-wrist-camera-update"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Hand-E%20Camera%20FT%20300%20Machine%20Tending-14.jpg?width=600&amp;amp;name=Hand-E%20Camera%20FT%20300%20Machine%20Tending-14.jpg" alt="Hand-E Camera FT 300 Machine Tending-14" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;&lt;/a&gt;With complete control over the camera settings and model definition, &lt;a href="/robotiq-wrist-camera-update"&gt;the software can adapt to any situation&lt;/a&gt;.&lt;/p&gt; 
&lt;p&gt;There are various commercial solutions for bin picking, but they tend to be quite complex and costly. They can also be quite inflexible — you need to add a lot of extra "stuff" around the robot for the system to work (e.g. fixed cameras, lighting, etc).&lt;/p&gt; 
&lt;p&gt;A lot of the available solutions use 3D vision. These require advanced processing and purchasing extra technology to make them work.&lt;/p&gt; 
&lt;p&gt;For example, a typical bin picking setup might include:&lt;/p&gt; 
&lt;ul&gt; 
 &lt;li&gt;&lt;strong&gt;3D laser scanner(s)&lt;/strong&gt; — These use laser light to capture a 3D depth image. The sensor produces a "point cloud" of the objects and surrounding area.&lt;/li&gt; 
 &lt;li&gt;&lt;strong&gt;Stereoscopic vision sensors&lt;/strong&gt; — These involve using dual cameras to create a 3D image of the environment. They can be used alone or together with a laser scanner to improve the detection accuracy.&lt;/li&gt; 
 &lt;li&gt;&lt;strong&gt;3D object detection&lt;/strong&gt; — These algorithms attempt to find objects within the 3D point cloud. Some use CAD models of the objects that are being detected.&lt;/li&gt; 
 &lt;li&gt;&lt;strong&gt;Fixed lighting&lt;/strong&gt; — Some systems require extra lighting to provide consistent illumination of the scene.&lt;/li&gt; 
&lt;/ul&gt; 
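As a rough, hypothetical stand-in for the "3D object detection" step listed above: given a background-subtracted point cloud, one simple idea is to group points into object candidates by Euclidean distance. Real systems use dedicated libraries (and often CAD-model matching); this toy single-linkage clustering only illustrates the concept.

```python
def cluster_points(points, eps):
    """Toy single-linkage clustering: points closer than eps merge."""
    parent = list(range(len(points)))  # union-find over point indices

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if eps ** 2 >= dist2(points[i], points[j]):
                parent[find(i)] = find(j)  # merge the two clusters

    clusters = {}
    for i in range(len(points)):
        clusters.setdefault(find(i), []).append(points[i])
    return list(clusters.values())

cloud = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.0, 0.1, 0.0),  # part one
         (5.0, 5.0, 0.0), (5.1, 5.0, 0.0)]                   # part two
print(len(cluster_points(cloud, eps=0.5)))  # 2
```

Each cluster becomes a candidate object whose centroid can be handed to the trajectory planner.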
&lt;p&gt;The complexity of these systems varies quite a lot. Some are small, self-contained units which are mounted above the detection area. Others require you to set up various sensors and lights at precise locations around the area. Whichever approach they take, all of them require more technology than a simple Robot Camera, which increases their cost.&lt;/p&gt; 
&lt;p&gt;Once the technology is set up and the system has been trained, the detection can be quite robust. You program the robot to receive object locations from the sensor and then use trajectory planning to pick the objects up.&lt;/p&gt; 
&lt;h2&gt;The easy way to do bin picking: Simplify&lt;/h2&gt; 
&lt;p&gt;There is an easier way to achieve bin picking.&lt;/p&gt; 
&lt;p&gt;It only requires&lt;strong&gt;&amp;nbsp;&lt;a href="https://robotiq.com/products/wrist-camera"&gt;a simple Robot Camera&lt;/a&gt;.&lt;/strong&gt;&lt;/p&gt; 
&lt;p&gt;This is a technique that was used by the participants &lt;a href="/ruc-challenge-update-its-been-a-challenging-night"&gt;at last year's Robotiq User Conference Technical Challenge&lt;/a&gt;. It allows you to achieve complex bin picking without the need for complex sensing technology.&lt;/p&gt; 
&lt;p&gt;&lt;strong&gt;The trick to this technique is to simplify the detection step&lt;/strong&gt;. Instead of trying to detect objects when they are piled up on top of each other, move them to a place where they can be more easily detected by a normal, 2D vision sensor.&lt;/p&gt; 
&lt;p&gt;Here's how it works.&lt;/p&gt; 
&lt;ol&gt; 
 &lt;li&gt;Use the robot gripper to grab a "handful" of the objects that you want to pick. There's no need to detect the objects for this. Simply move the gripper into the box and grasp.&lt;/li&gt; 
 &lt;li&gt;Drop the objects onto a flat surface.&lt;/li&gt; 
 &lt;li&gt;Use the robot vision sensor to detect individual objects on the surface.&lt;/li&gt; 
 &lt;li&gt;Pick up each object one by one.&lt;/li&gt; 
&lt;/ol&gt; 
&lt;p&gt;That's it!&lt;/p&gt; 
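The four steps above boil down to a short control loop. Here is a runnable toy version; the robot "API" (grab_handful, the pick loop) is entirely hypothetical and stands in for the real gripper and vision calls. The point is the control flow: there is no in-bin detection at all, only detection on the flat surface.

```python
import random

def grab_handful(bin_contents, max_items=3):
    """Step 1: a blind grasp pulls out an unknown number of parts."""
    n = random.randint(1, min(max_items, len(bin_contents)))
    return [bin_contents.pop() for _ in range(n)]

def bin_pick_all(bin_contents):
    picked = []
    while bin_contents:
        surface = grab_handful(bin_contents)  # steps 1-2: grasp and drop
        for obj in surface:                   # step 3: 2D detection is easy
            picked.append(obj)                # on a flat, uncluttered surface
    return picked                             # step 4: pick one by one

parts = ["bolt-%d" % i for i in range(7)]
print(sorted(bin_pick_all(list(parts))) == sorted(parts))  # True
```

However many parts each blind grasp happens to catch, the loop terminates once the bin is empty, and every part ends up individually picked.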
&lt;p&gt;The real value of this technique is that it is so easy to implement. It's basically just a pick and place operation with an extra step.&lt;/p&gt; 
&lt;p&gt;Of course, this technique doesn't work for every single object. You need objects that don't have to be grasped in exactly the same way every time. However, if your objects are so fragile that they require delicate handling, you probably won't have them piled into a box in the first place!&lt;/p&gt; 
&lt;p&gt;The secret of this technique, as with many good robot programming tricks, is… simplify.&lt;/p&gt; 
&lt;p&gt;&lt;strong&gt;Simpler robot solutions always have a better chance of success.&lt;/strong&gt;&lt;/p&gt; 
&lt;p&gt;&amp;nbsp;&lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=9b957a82-9875-4c1c-9335-f931f87ee495&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; margin: 0 auto; display: block; margin-top: 20px; margin-bottom: 20px" alt="New Call-to-action" src="https://no-cache.hubspot.com/cta/default/13401/9b957a82-9875-4c1c-9335-f931f87ee495.png" align="middle"&gt;&lt;/a&gt;&lt;/p&gt; 
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 0px 0px 10px 0px;"&gt;&lt;em&gt;What tricks have you used to simplify a robot picking operation? &lt;strong&gt;Tell us in the comments below or join the discussion on &lt;a href="https://www.linkedin.com/company/1695451"&gt;LinkedIn&lt;/a&gt;, &lt;a href="https://twitter.com/Robotiq_Inc"&gt;Twitter,&lt;/a&gt; &lt;a href="https://www.facebook.com/robotiq"&gt;Facebook&lt;/a&gt; or &lt;a href="http://dof.robotiq.com/"&gt;the DoF professional robotics community&lt;/a&gt;.&lt;/strong&gt;&lt;/em&gt; &lt;/p&gt;  
&lt;img src="https://track.hubspot.com/__ptq.gif?a=13401&amp;amp;k=14&amp;amp;r=https%3A%2F%2Fblog.robotiq.com%2Fbin-picking-the-easy-way-vs-the-hard-way-with-robot-vision&amp;amp;bu=https%253A%252F%252Fblog.robotiq.com&amp;amp;bvt=rss" alt="" width="1" height="1" style="min-height:1px!important;width:1px!important;border-width:0!important;margin-top:0!important;margin-bottom:0!important;margin-right:0!important;margin-left:0!important;padding-top:0!important;padding-bottom:0!important;padding-right:0!important;padding-left:0!important; "&gt;</content:encoded>
      <category>bin picking</category>
      <category>vision</category>
      <category>URCaps</category>
      <category>Wrist Camera</category>
      <category>URCap</category>
      <category>robot vision</category>
      <pubDate>Thu, 21 Mar 2019 15:02:00 GMT</pubDate>
      <author>alex@alexowenhill.co.uk (Alex Owen-Hill)</author>
      <guid>https://blog.robotiq.com/bin-picking-the-easy-way-vs-the-hard-way-with-robot-vision</guid>
      <dc:date>2019-03-21T15:02:00Z</dc:date>
    </item>
    <item>
      <title>What's New In Robotics?  15.03.2019</title>
      <link>https://blog.robotiq.com/whats-new-in-robotics-15.03.2019</link>
      <description>&lt;div class="hs-featured-image-wrapper"&gt; 
 &lt;a href="https://blog.robotiq.com/whats-new-in-robotics-15.03.2019" title="" class="hs-featured-image-link"&gt; &lt;img src="https://blog.robotiq.com/hubfs/MzI1MDIzMw.jpeg" alt="roboticists-ihmc-humanoid-robot-nadia" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"&gt; &lt;/a&gt; 
&lt;/div&gt;    
&lt;p&gt;&lt;strong&gt;Hi!&amp;nbsp; In this week's news mix: Barcelona launches a 5G robotics pilot, Circuit Bread meets UR, and could cobots be a way around proposed "robot taxes"?&amp;nbsp; Elsewhere, we discover USC's potentially ground-breaking algorithm, zap lionfish with Colin Angle, admire a soft gripper that can lift 100 times its weight and much more!&lt;/strong&gt;&lt;/p&gt;</description>
      <content:encoded>&lt;p&gt;&lt;strong&gt;Hi!&amp;nbsp; In this week's news mix: Barcelona launches a 5G robotics pilot, Circuit Bread meets UR, and could cobots be a way around proposed "robot taxes"?&amp;nbsp; Elsewhere, we discover USC's potentially ground-breaking algorithm, zap lionfish with Colin Angle, admire a soft gripper that can lift 100 times its weight and much more!&lt;/strong&gt;&lt;/p&gt; 
&lt;p&gt;&lt;/p&gt; 
&lt;h2&gt;&lt;strong&gt;Cobots &amp;amp; manufacturing &lt;/strong&gt;&lt;/h2&gt; 
&lt;p&gt;ACCIONA, Orange and 5G Barcelona have launched a “Collaborative and autonomous robots” &lt;a href="https://5gbarcelona.org/en-case-studies/"&gt;pilot&lt;/a&gt; to explore the impact of low-latency 5G technology on robot-to-robot communication in manufacturing environments.&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/f-F1w-yvh9c" allowfullscreen width="600" height="336"&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;&lt;a href="https://www.reeco.co.uk/"&gt;Reeco Automation&lt;/a&gt; has received the Henry Ford Award for successfully integrating a collaborative robot into Ford's manufacturing process.&amp;nbsp; "&lt;em&gt;Both larger and smaller manufacturers are increasingly seeing the benefits of integrating cobots into their production processes, not to replace the human workforce but to complement them&lt;/em&gt;," Reeco's Managing Director, Llewelyn Rees told &lt;a href="https://www.roboticstomorrow.com/news/2019/03/12/reeco-wins-prestigious-henry-ford-award/13269/"&gt;Robotics Tomorrow&lt;/a&gt;.&lt;/p&gt; 
&lt;p&gt;&lt;a href="https://www.circuitbread.com/"&gt;Circuit Bread&lt;/a&gt; met up with Tim DeGrasse from &lt;a href="https://www.universal-robots.com/"&gt;Universal Robots&lt;/a&gt; to find out more about the company's collaborative robots.&amp;nbsp; The result?&amp;nbsp; Circuit Bread staff are now "actively trying to figure out how we can justify getting one of these in the office"...&amp;nbsp;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt; &lt;/p&gt;
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" src="http://www.youtube.com/embed/waVK-7AFsS0" allowfullscreen width="560" height="314"&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&lt;/p&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;An astonishing 79 per cent of automation distributors do not believe their customers understand the safety requirements of installing a collaborative robot, according to the results of the &lt;a href="http://news.tmrobotics.com/wp-content/uploads/2019/03/TMR006-Global-Robotics-Report-WP.pdf"&gt;Global Robotics Report&lt;/a&gt;.&amp;nbsp;&amp;nbsp; [If you're new to cobot safety topics, make sure to check out Robotiq's eBook "&lt;a href="/safety-collaborative-robots-risk-assessment"&gt;&lt;strong&gt;Collaborative Robots Risk Assessment, An Introduction&lt;/strong&gt;&lt;/a&gt;."]&lt;/p&gt; 
&lt;p&gt;&lt;a href="http://www.doosan.com/en/business/robotics/cooperation-robot/"&gt;Doosan&lt;/a&gt; released video showing its cobot helping out with a tire change....&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/IauKiybkFLM" allowfullscreen width="600" height="336"&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;Collaborative robots may provide a way for companies&amp;nbsp; "&lt;em&gt;to avoid the wrathful eye of the regulators, as they look to impose punitive measures on businesses which replace humans with automation&lt;/em&gt;," Bernard Marr suggested in &lt;a href="https://www.forbes.com/sites/bernardmarr/2019/03/08/5-major-robotics-trends-to-watch-for-in-2019/"&gt;Forbes&lt;/a&gt; this week. "&lt;em&gt;Politicians have already proposed “robot taxes” to cover these eventualities – fostering harmonious working relationships between humans and machines could be a trend which will set people’s minds at ease in 2019&lt;/em&gt;."&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;In part 4 of its “Doing with duAro” miniseries, Kawasaki introduces the &lt;a href="https://robotics.kawasaki.com/en1/products/robots/dual-arm-scara/duAro2/"&gt;duAro2&lt;/a&gt;, the latest addition to its line of dual-arm cobots...&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/Mb2n1HVhaf8" allowfullscreen width="600" height="336"&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;Festo has revealed the &lt;a href="https://www.festo.com/group/en/cms/13508.htm"&gt;BionicSoftHand&lt;/a&gt; --a pneumatically operated, bio-inspired gripper that builds on existing Festo technology and could one day find its way into collaborative work spaces.&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/kavHcgITpcI" allowfullscreen width="600" height="336"&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;Via &lt;a href="https://www.hydraulicspneumatics.com/pneumatic-valves/pneumatics-robotics-and-artificial-intelligence-come-together"&gt;Hydraulics &amp;amp; Pneumatics&lt;/a&gt;:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;The BionicSoftArm is a new development of Festo's BionicMotionRobot, whose range of applications has been significantly expanded. Its modular design can be combined with up to seven pneumatic bellows segments and rotary drives. This guarantees maximum flexibility in terms of reach and mobility, enabling it to work around obstacles even in the tightest of spaces. At the same time, it is completely flexible and can work safely with people.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;In related news:&lt;/p&gt; 
&lt;ul&gt; 
 &lt;li&gt;Working and Growing With Collaborative Robots (&lt;a href="https://www.qualitydigest.com/inside/innovation-article/working-and-growing-collaborative-robots-031319.html"&gt;Quality Digest&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;60 Seconds with… Jeff Burnstein, President of A3 (&lt;a href="https://www.mmh.com/article/60_seconds_with..._jeff_burnstein_president_of_the_association_for_advancin"&gt;MMH&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;SAE targets 700th aircraft scheduled maintenance this year&amp;nbsp; (&lt;a href="https://www.thesundaily.my/business/sae-targets-700th-aircraft-scheduled-maintenance-this-year-GD674888"&gt;The Sun Daily&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;The story behind Kawasaki’s duAro 2 robot&amp;nbsp; (&lt;a href="http://roboticsandautomationnews.com/2019/03/08/the-story-behind-kawasakis-duaro-robot/21261/"&gt;Robotics &amp;amp; Automation News&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;TrakRap: Embracing industry 4.0 in manufacturing&amp;nbsp; (&lt;a href="https://www.manufacturingglobal.com/lean-manufacturing/trakrap-embracing-industry-40-manufacturing-industry"&gt;Manufacturing Global&lt;/a&gt;)&lt;/li&gt; 
&lt;/ul&gt; 
&lt;h2&gt;&lt;strong&gt;&lt;br&gt;Elsewhere...&lt;/strong&gt;&lt;/h2&gt; 
&lt;p&gt;In what could prove to be a major breakthrough for robotics development, researchers at the University of Southern California (USC) have announced the development of an AI-controlled robotic limb that can learn over time in much the same way animals and humans do.&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/robot-leg-1280x720.jpg?width=600&amp;amp;name=robot-leg-1280x720.jpg" alt="robot-leg-university-southern-california-mathew" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Credit: University of Southern California/Mathew Lin&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;All part of a wider project targeting the development of true lifelong machine learning, &lt;a href="http://valerolab.org/g2p/"&gt;the new, bio-inspired algorithm&lt;/a&gt; at the heart of the system is able to learn a new walking task by itself after only 5 minutes of unstructured play (also known as "motor babbling").&amp;nbsp; The algorithm enables robot limbs to adapt to other tasks without any additional programming.&amp;nbsp;&amp;nbsp; (More: PC Mag has an interview with&amp;nbsp;&lt;a href="http://valerolab.org"&gt;Dr. Francisco Valero-Cuevas&lt;/a&gt;, co-author of a paper on the research.&amp;nbsp; Paper: &lt;a href="https://www.nature.com/articles/s42256-019-0029-0"&gt;Autonomous functional movements in a tendon-driven limb via limited experience&lt;/a&gt;)&lt;/p&gt; 
&lt;p&gt;&lt;a href="https://www.irobot.com/"&gt;iRobot&lt;/a&gt; CEO Colin Angle, co-founder of &lt;a href="https://www.robotsise.org/"&gt;Robots in Service of the Environment&lt;/a&gt; (RSE) has announced the launch of the Guardian LF1 Mark 3, the latest prototype version of his swordfish zapping and gathering robot.&amp;nbsp;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Copy-of-IMG_8669.jpg?width=600&amp;amp;name=Copy-of-IMG_8669.jpg" alt="robots-in-service-of-environment" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Credit: Robots in Service of the Environment&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;Via &lt;a href="https://www.robotsise.org/rse-in-the-news/"&gt;RSE&lt;/a&gt;:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;"The Lionfish are destroying the coral reef and decimating fish populations in the Atlantic. The latest innovations incorporated into the RSE Guardian LF1, enable the undersea robotic solution to go deeper, fish longer and pull in a larger haul. With each technical milestone we cross we get one step closer to saving our greatest natural resource by empowering fisherman with new tools,” said Colin Angle.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;Roboticists at the Institute for Human &amp;amp; Machine Cognition (&lt;a href="http://robots.ihmc.us/"&gt;IHMC&lt;/a&gt;) in Florida, U.S.A., are working on a new humanoid robot dubbed 'Nadia' (after &lt;a href="https://en.wikipedia.org/wiki/Nadia_Com%C4%83neci"&gt;Nadia Comăneci&lt;/a&gt;, the first gymnast to be awarded a perfect score of 10.0 at the Olympic Games).&amp;nbsp;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/MzI1MDIzMw.jpeg?width=600&amp;amp;name=MzI1MDIzMw.jpeg" alt="roboticists-ihmc-humanoid-robot-nadia" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;br&gt;Credit: IHMC&lt;/p&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;It's part of a three-year project that launched in January 2019, but this week IEEE Spectrum's Evan Ackerman published a &lt;a href="https://spectrum.ieee.org/automaton/robotics/humanoids/ihmc-developing-new-gymnastinspired-humanoid-robot.amp.html"&gt;fascinating interview&lt;/a&gt; with Robert Griffin, a research scientist at IHMC, that gets into the nuts and bolts of the project:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;"We’re targeting the height and weight of a human, as well as being in the ballpark of human volume. So design targets are between 5'7" and 6'0", and sub-90 kg. [...]&amp;nbsp; We’re really hesitant to say when you’ll be seeing a Nadia walking around, because we want to design the robot properly and not rush things.&amp;nbsp; But I can say that one of the final ONR project goals is to show Nadia performing tasks autonomously."&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;&lt;br&gt;Finally, a new study has found that when robots beat humans in contests for cash prizes, people tend to regard themselves as being "less competent and expend slightly less effort—and tend to dislike the robots," &lt;a href="https://www.futurity.org/robots-competition-2005522/"&gt;Futurity&lt;/a&gt; reported.&amp;nbsp;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/playing-against-robot1_1600.jpg?width=600&amp;amp;name=playing-against-robot1_1600.jpg" alt="playing-against-robot" style="width: 600px; margin: 0px auto;" width="600"&gt;&lt;/p&gt; 
&lt;p&gt;What surprised me most about this research is not the human reactions it describes, but the fact that an easy fix for this issue appears to have been missed:&amp;nbsp; simply pay robots in a currency they can understand --reliable power supplies, quality care and maintenance, well-written code and excellent after sales service-- and they won't have any further interest in hustling humans for cash prizes.&amp;nbsp; Easy peasy!&amp;nbsp; ;)&lt;/p&gt; 
&lt;p&gt;And in other news:&lt;/p&gt; 
&lt;ul&gt; 
 &lt;li&gt;Our Robotics Innovation Centre has been officially launched&amp;nbsp; (&lt;a href="https://research.csiro.au/robotics/our-robotics-innovation-centre-launched/"&gt;CSIRO&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;3 New Chips to Help Robots Find Their Way Around&amp;nbsp; (&lt;a href="https://spectrum.ieee.org/automaton/semiconductors/processors/3-new-chips-to-help-robots-find-their-way-around.amp.html"&gt;IEEE Spectrum&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Meet Tengai, the job interview robot who won't judge you&amp;nbsp; (&lt;a href="https://www.bbc.com/news/business-47442953"&gt;BBC&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;You Might Be a Robot. This Is Not a Joke.&amp;nbsp; (&lt;a href="https://www.bloomberg.com/amp/opinion/articles/2019-03-14/robots-can-be-people-too-and-vice-versa"&gt;Bloomberg&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Researchers explore interactions between preschoolers and robotic partners&amp;nbsp; (&lt;a href="https://techxplore.com/news/2019-03-explore-interactions-preschoolers-robotic-partners.html"&gt;TechXplore&lt;/a&gt;) &lt;br&gt;&lt;br&gt;&lt;/li&gt; 
&lt;/ul&gt; 
&lt;p&gt;&lt;span class="hs_cos_wrapper hs_cos_wrapper_meta_field hs_cos_wrapper_type_rich_text"&gt;&lt;span&gt;Come by next week for more of the latest robotics news!&amp;nbsp; Until then, please enjoy..&lt;/span&gt;&lt;/span&gt;&lt;/p&gt; 
&lt;h2&gt;&lt;strong&gt;Five vids for Friday &lt;/strong&gt;&lt;/h2&gt; 
&lt;p&gt;1.&amp;nbsp; Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (&lt;a href="https://www.csail.mit.edu/"&gt;CSAIL&lt;/a&gt;) have created an origami-inspired, vacuum-driven, 3D-printed soft gripper that can lift 100 times its own weight.&amp;nbsp; (Paper: &lt;a href="http://dspace.mit.edu/handle/1721.1/120930"&gt;A Vacuum-driven Origami “Magic-ball” Soft Gripper&lt;/a&gt;)&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/byqGFH6AZuk" allowfullscreen width="600" height="336"&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;2.&amp;nbsp; University of Washington researchers have unveiled a robot that can feed people who find it a challenge to feed themselves.&amp;nbsp; (Paper: &lt;a href="https://ieeexplore.ieee.org/document/8624330"&gt;Towards Robotic Feeding: Role of Haptics in Fork-Based Food Manipulation&lt;/a&gt;)&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/t2eO4CD-0WY" allowfullscreen width="600" height="336"&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;3.&amp;nbsp; &lt;a href="https://www.kickstarter.com/projects/274008848/metafly-a-new-flying-experience"&gt;MetaFly&lt;/a&gt; is the latest bio-inspired bot from Edwin Van Ruymbeke.&amp;nbsp; Instead of relying on motors, like traditional designs, MetaFly relies solely on its wings, which eliminates the need for bulky batteries.&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/r_1er08Tt-0" allowfullscreen width="600" height="336"&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;4.&amp;nbsp; &lt;a href="https://www.cambridgeconsultants.com/press-releases/mamut-autonomous-robot-field-agritech"&gt;Cambridge Consultants&lt;/a&gt; unveiled Mamut this week.&amp;nbsp; Mamut is an autonomous, mobile robot packed with sensors that "explores crop fields, capturing data on health and yield at the level of individual plants and on a massive scale."&amp;nbsp;&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-vimeo"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://player.vimeo.com/video/315407870?title=0&amp;amp;byline=0&amp;amp;portrait=0&amp;amp;color=8dc7dc" allowfullscreen width="600" height="338"&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;&lt;br&gt;5.&amp;nbsp; Science Robotics released a cool video that asks "Where's my robotic construction crew?" (More: &lt;a href="http://robotics.sciencemag.org/content/4/28/eaau8479"&gt;A review of collective robotic construction&lt;/a&gt;)&amp;nbsp;&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/VaGSX5N0ns0" allowfullscreen width="600" height="336"&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt;  
&lt;img src="https://track.hubspot.com/__ptq.gif?a=13401&amp;amp;k=14&amp;amp;r=https%3A%2F%2Fblog.robotiq.com%2Fwhats-new-in-robotics-15.03.2019&amp;amp;bu=https%253A%252F%252Fblog.robotiq.com&amp;amp;bvt=rss" alt="" width="1" height="1" style="min-height:1px!important;width:1px!important;border-width:0!important;margin-top:0!important;margin-bottom:0!important;margin-right:0!important;margin-left:0!important;padding-top:0!important;padding-bottom:0!important;padding-right:0!important;padding-left:0!important; "&gt;</content:encoded>
      <category>advanced manufacturing</category>
      <category>collaborative applications</category>
      <category>robotics news</category>
      <category>robot safety</category>
      <category>human-robot interaction</category>
      <category>humanoid robot</category>
      <pubDate>Fri, 15 Mar 2019 11:02:00 GMT</pubDate>
      <author>emmetcole@gmail.com (Emmet Cole)</author>
      <guid>https://blog.robotiq.com/whats-new-in-robotics-15.03.2019</guid>
      <dc:date>2019-03-15T11:02:00Z</dc:date>
    </item>
    <item>
      <title>An Unusual Trick That Improves Robot Vision Teaching</title>
      <link>https://blog.robotiq.com/an-unusual-trick-that-improves-robot-vision-teaching</link>
      <description>&lt;div class="hs-featured-image-wrapper"&gt; 
 &lt;a href="https://blog.robotiq.com/an-unusual-trick-that-improves-robot-vision-teaching" title="" class="hs-featured-image-link"&gt; &lt;img src="https://blog.robotiq.com/hubfs/Hand-E%20Camera%20FT%20300%20Machine%20Tending-32-2.jpg" alt="Hand-E Camera FT 300 Machine Tending-32-2" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"&gt; &lt;/a&gt; 
&lt;/div&gt;    
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt; &lt;strong&gt;Your robot vision just won't detect your object! What's wrong with it!? With this unusual trick, often used by the experts, you can improve your vision teaching in a flash.&lt;/strong&gt;&lt;/p&gt;</description>
      <content:encoded>&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt;&lt;strong&gt;Your robot vision just won't detect your object! What's wrong with it!? With this unusual trick, often used by the experts, you can improve your vision teaching in a flash.&lt;/strong&gt;&lt;/p&gt;  
&lt;p style="text-indent: 0px; margin: 10px 0px 0px; text-align: center;"&gt;&lt;iframe class="wistia_embed" style="margin-left: auto; margin-right: auto; display: block;" src="https://fast.wistia.net/embed/iframe/w4pmovp7c9" name="wistia_embed" width="600" height="338" frameborder="0" allowfullscreen&gt;&lt;/iframe&gt;&lt;span&gt;The&lt;/span&gt;&lt;a href="/robotiq-wrist-camera-update"&gt;&lt;span&gt;&amp;nbsp;&lt;/span&gt;Robotiq Wrist Camera URCap version 1.7&lt;/a&gt;&lt;span&gt;&lt;span&gt;&amp;nbsp;&lt;/span&gt;is now the fastest and most intuitive vision system for Universal Robots.&lt;/span&gt;&lt;/p&gt; 
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt;Imagine that you are using robot vision to recognize an object. But, there's a problem! When you used the teaching mode, there was something unusual about the object outline in the camera. Perhaps the object's shiny surface had a bright spot, or the camera picked up too many detailed features. Now, whenever you try to detect objects with the vision system, it does not recognize the object because it looks different in the camera.&lt;/p&gt; 
&lt;p&gt;This is a common problem in robot vision.&lt;/p&gt; 
&lt;p&gt;Ideally, you want to teach your vision system with a model which is accurate and only contains the necessary features. If the model you teach it has flaws, the outline of the object will be harder to detect and this will reduce the effectiveness of the whole system.&lt;/p&gt; 
&lt;p&gt;But, &lt;strong&gt;there is an unusual trick you can use to improve the teaching mode&lt;/strong&gt; and make it more robust.&lt;/p&gt; 
&lt;p&gt;I'll reveal the trick in a moment, but first let's look at what causes robot vision teaching to fail&lt;span&gt;…&lt;/span&gt;&lt;/p&gt; 
&lt;h2&gt;Why teaching objects to robot vision gets tough&lt;/h2&gt; 
&lt;p&gt;There are various tricky &lt;a href="https://blog.robotiq.com/top-10-challenges-for-robot-vision"&gt;challenges for robot vision&lt;/a&gt;, including occlusion, deformation, scale, etc. These affect the vision system both in the teaching and detection phases of operation. However, they are more problematic during the teaching phase because they distort the model that the robot uses to detect objects.&lt;/p&gt; 
&lt;p&gt;To understand why the teaching phase is so important, it's helpful to understand a little bit about &lt;strong&gt;how template matching works&lt;/strong&gt;, as this is the cornerstone of object detection. We wrote an entire article about this called &lt;a href="https://blog.robotiq.com/how-template-matching-works-in-robot-vision"&gt;How Template Matching Works in Robot Vision&lt;/a&gt;, but here are the basic details.&lt;/p&gt; 
&lt;ul style="list-style-type: disc;"&gt; 
 &lt;li&gt;&lt;strong&gt;Read more:&amp;nbsp;&lt;a href="/how-template-matching-works-in-robot-vision"&gt;How Template Matching Works in Robot Vision&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt; 
&lt;/ul&gt; 
&lt;p&gt;Template matching involves&lt;strong&gt; training your vision system with a "template image" of the object&lt;/strong&gt; you want to detect. You usually train this template image by pointing the vision sensor at your object and running the system's teaching mode. Later, during the detection phase, the vision algorithm will look for areas within the image which are similar to that template image.&lt;/p&gt; 
&lt;p&gt;If the template image is flawed, the system will not be able to detect the object.&lt;/p&gt; 
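The detection phase described above can be sketched in a few lines of Python. This is a minimal, illustrative sum-of-squared-differences matcher over a toy greyscale grid, not the algorithm any particular vision product uses:

```python
def match_template(image, template):
    """Slide the template over the image and return the (row, col) offset
    with the lowest sum-of-squared-differences score (lower = better).
    This is the core idea behind template matching in robot vision."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = float("inf"), (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = sum(
                (image[r + i][c + j] - template[i][j]) ** 2
                for i in range(th)
                for j in range(tw)
            )
            if score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# A toy 8x8 greyscale "scene": background is 0, with a bright 3x3 object
# (the taught part) placed at row 2, column 4.
scene = [[0] * 8 for _ in range(8)]
for i in range(2, 5):
    for j in range(4, 7):
        scene[i][j] = 1
template = [[1] * 3 for _ in range(3)]  # the "template image"
print(match_template(scene, template))  # (2, 4)
```

You can see from the score function why a flawed template hurts: every spurious feature baked into the template adds error at the true location, pushing the best score toward false positions.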
&lt;p style="text-align: center;"&gt;&lt;a href="/robotiq-wrist-camera-update"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Hand-E%20Camera%20FT%20300%20Machine%20Tending-32-2.jpg?width=600&amp;amp;name=Hand-E%20Camera%20FT%20300%20Machine%20Tending-32-2.jpg" alt="Hand-E Camera FT 300 Machine Tending-32-2" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;&lt;/a&gt;A&amp;nbsp;&lt;a href="/robotiq-wrist-camera-update"&gt;faster and more accurate image definition&lt;/a&gt; will improve its object detection ability regardless of the work plane material.&lt;/p&gt; 
&lt;h3&gt;Three&amp;nbsp;factors which cause bad template images&lt;/h3&gt; 
&lt;p&gt;Here are three common factors which result in bad template images, and which can be solved by the unusual trick I'm about to reveal to you:&lt;/p&gt; 
&lt;ol&gt; 
 &lt;li&gt;&lt;strong&gt;Features are too detailed&lt;/strong&gt; — We tend to think that more detailed images are better. However, this is not always the case in robot vision. Often, you will get a more accurate detection when your template image has fewer features.&lt;/li&gt; 
 &lt;li&gt;&lt;strong&gt;Lighting distorts image&lt;/strong&gt; — Shiny objects and backgrounds can cause lens flare in the camera and obscure parts of the image. This can distort the object shape within the template image.&lt;/li&gt; 
 &lt;li&gt;&lt;strong&gt;Bad contrast with background&lt;/strong&gt; — The ideal condition for teaching a template image is when the object you are detecting has high contrast with the background (e.g. a black object on a white background). When the color or shade of the background is too similar to the object's color, the system will have trouble defining a clear, accurate edge.&lt;/li&gt; 
&lt;/ol&gt; 
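The contrast point in factor 3 can be sanity-checked numerically before you run the teaching mode. The sketch below uses the standard ITU-R BT.601 luma weights for greyscale conversion; the min_gap threshold is an arbitrary illustration, not a vendor-specified value:

```python
def grey_level(r, g, b):
    """Approximate greyscale value (0-255) using the ITU-R BT.601 luma
    weights, the weighting commonly used for RGB-to-grey conversion."""
    return round(0.299 * r + 0.587 * g + 0.114 * b)

def contrast_ok(color_a, color_b, min_gap=100):
    """Heuristic check: do two colors stay distinguishable once the image
    is reduced to greyscale? min_gap is an arbitrary example threshold."""
    return abs(grey_level(*color_a) - grey_level(*color_b)) >= min_gap

# Pure red on pure green looks high-contrast in color...
print(contrast_ok((255, 0, 0), (0, 255, 0)))    # False: 76 vs 150 in grey
# ...while black on white survives the greyscale conversion.
print(contrast_ok((0, 0, 0), (255, 255, 255)))  # True: 0 vs 255
```

This is why a color pairing that looks vivid to the eye can still produce weak edges once the edge detector runs on the greyscale image.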
&lt;h2&gt;The unusual trick that improves robot vision teaching&lt;/h2&gt; 
&lt;p&gt;Thankfully, there is a handy trick that you can use to improve the quality of your template image.&lt;/p&gt; 
&lt;p&gt;It is a slightly unusual trick…&lt;/p&gt; 
&lt;p&gt;Don't train the vision system using the object you want to detect!&lt;/p&gt; 
&lt;p&gt;Huh?&lt;/p&gt; 
&lt;p&gt;Surely that doesn't make sense?&lt;/p&gt; 
&lt;p&gt;Surely you need to train the vision system with the object you want it to detect?&lt;/p&gt; 
&lt;p&gt;Not necessarily.&lt;/p&gt; 
&lt;p&gt;Let me explain. Most robot vision systems detect objects using 2D images captured by a camera. This means that the template image is also two-dimensional. As a result, you don't actually need to use the 3D object to train the system.&lt;/p&gt; 
&lt;p&gt;What can you use instead?&lt;strong&gt; A printed drawing of the object&lt;/strong&gt;, for example.&lt;/p&gt; 
&lt;p&gt;This is a technique that our integration coaches often use to combat the challenges they encounter when teaching robot vision. In the past, for example, they've used it &lt;a href="https://dof.robotiq.com/discussion/441/teaching-a-shiny-part-to-the-vision-system"&gt;to overcome lighting issues with shiny objects&lt;/a&gt; and &lt;a href="https://dof.robotiq.com/discussion/822/robotic-vision-camera-teaching-object"&gt;to teach objects which are too big to move&lt;/a&gt;.&lt;/p&gt; 
&lt;p&gt;&lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=9b957a82-9875-4c1c-9335-f931f87ee495&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; margin: 0 auto; display: block; margin-top: 20px; margin-bottom: 20px" alt="New Call-to-action" src="https://no-cache.hubspot.com/cta/default/13401/9b957a82-9875-4c1c-9335-f931f87ee495.png" align="middle"&gt;&lt;/a&gt;&lt;/p&gt; 
&lt;h3&gt;Five&amp;nbsp;steps to teach a robot vision system without an object&lt;/h3&gt; 
&lt;p&gt;The process for using a 2D image to teach your robot vision system is quite simple. It starts by turning your CAD model into a 2D engineering drawing.&lt;/p&gt; 
&lt;p&gt;Then, follow these 5 steps:&lt;/p&gt; 
&lt;ol&gt; 
 &lt;li&gt;&lt;strong&gt;Choose your features&lt;/strong&gt; — In your CAD program, remove any of the edges and other features that you do not want to include in your detection, e.g. internal edges which aren't visible all the time, part customisations, etc.&lt;/li&gt; 
 &lt;li&gt;&lt;strong&gt;Pick the right paper color&lt;/strong&gt; — Choose a color and shade of paper which will provide high contrast with the background. Remember, it must also contrast when the color is removed from the image as edge detectors are usually run on greyscale images.&lt;/li&gt; 
 &lt;li&gt;&lt;strong&gt;Print at a scale of 1:1&lt;/strong&gt; — This is important and sometimes tricky. Ensure that your printout matches the real size of the object. Some printers have a tendency to scale images slightly, so be wary of this.&lt;/li&gt; 
 &lt;li&gt;&lt;strong&gt;Set the right height&lt;/strong&gt; — You'll need to place the drawing at the same height that the features you have chosen will be during the detection phase. For example, if you have a 20 cm high rectangular object and you're detecting its top face, place the drawing 20 cm above the surface.&lt;/li&gt; 
 &lt;li&gt;&lt;strong&gt;Teach the part to the system&lt;/strong&gt; — Use the vision system's teaching mode to teach the template image.&lt;/li&gt; 
&lt;/ol&gt; 
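For step 3, a quick way to confirm the printout really is 1:1 is to measure one long edge with calipers and compare it to the CAD dimension. A trivial sketch of that check (the example numbers are hypothetical):

```python
def print_scale_error(cad_length_mm, measured_length_mm):
    """Return the percentage scaling error between a CAD dimension and the
    same edge measured on the printout. 0.0 means a true 1:1 print."""
    return 100.0 * (measured_length_mm - cad_length_mm) / cad_length_mm

# Hypothetical part: a 100 mm edge that measures 99 mm on paper.
print(print_scale_error(100.0, 99.0))  # -1.0 (printout is 1% too small)
```

Even a 1% error means every taught edge is offset from the real part, so it is worth measuring before teaching.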
&lt;p&gt;By the way, you don't have to limit yourself to 2D drawings. &lt;strong&gt;Sometimes a 3D printed version of the object can overcome teaching issues&lt;/strong&gt; like shiny surfaces and low background contrast.&lt;/p&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;With this unusual trick, often used by the experts, you can easily overcome many of the common teaching problems with robot vision systems!&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=dc910fdd-73af-4256-9c1e-ac13cf4185b5&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; margin: 0 auto; display: block; margin-top: 20px; margin-bottom: 20px" alt="wrist-camera-urcap-update-cta" src="https://no-cache.hubspot.com/cta/default/13401/dc910fdd-73af-4256-9c1e-ac13cf4185b5.png" align="middle"&gt;&lt;/a&gt;&lt;/p&gt; 
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 0px 0px 10px 0px;"&gt;&lt;em&gt;What issues have you had in the past with robot vision? &lt;strong&gt;Tell us in the comments below or join the discussion on &lt;a href="https://www.linkedin.com/company/1695451"&gt;LinkedIn&lt;/a&gt;, &lt;a href="https://twitter.com/Robotiq_Inc"&gt;Twitter,&lt;/a&gt; &lt;a href="https://www.facebook.com/robotiq"&gt;Facebook&lt;/a&gt; or &lt;a href="http://dof.robotiq.com/"&gt;the DoF professional robotics community&lt;/a&gt;.&lt;/strong&gt;&lt;/em&gt; &lt;/p&gt;  
&lt;img src="https://track.hubspot.com/__ptq.gif?a=13401&amp;amp;k=14&amp;amp;r=https%3A%2F%2Fblog.robotiq.com%2Fan-unusual-trick-that-improves-robot-vision-teaching&amp;amp;bu=https%253A%252F%252Fblog.robotiq.com&amp;amp;bvt=rss" alt="" width="1" height="1" style="min-height:1px!important;width:1px!important;border-width:0!important;margin-top:0!important;margin-bottom:0!important;margin-right:0!important;margin-left:0!important;padding-top:0!important;padding-bottom:0!important;padding-right:0!important;padding-left:0!important; "&gt;</content:encoded>
      <category>collaborative manufacturing</category>
      <category>vision</category>
      <category>URCaps</category>
      <category>Wrist Camera</category>
      <category>URCap</category>
      <category>robot vision</category>
      <category>robotics software</category>
      <pubDate>Thu, 14 Mar 2019 15:02:00 GMT</pubDate>
      <author>alex@alexowenhill.co.uk (Alex Owen-Hill)</author>
      <guid>https://blog.robotiq.com/an-unusual-trick-that-improves-robot-vision-teaching</guid>
      <dc:date>2019-03-14T15:02:00Z</dc:date>
    </item>
    <item>
      <title>What's New In Robotics?  08.03.2019</title>
      <link>https://blog.robotiq.com/whats-new-in-robotics-08.03.2019</link>
      <description>&lt;div class="hs-featured-image-wrapper"&gt; 
 &lt;a href="https://blog.robotiq.com/whats-new-in-robotics-08.03.2019" title="" class="hs-featured-image-link"&gt; &lt;img src="https://blog.robotiq.com/hubfs/PIC-2.jpg" alt="ranmarine-wasteshark" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"&gt; &lt;/a&gt; 
&lt;/div&gt;    
&lt;p&gt;&amp;nbsp;Hi!&amp;nbsp; In this week's news mix: UR announces Automate showcase, Korea unveils 'Mode Man' cobot and Rollon creates '7th axis' for UR cobots.&amp;nbsp; We also watch Jibo's farewell to users, meet the trash-fighting robot 'Wasteshark', marvel at a selfie taken 37,600 km from the earth's surface and much more!&lt;/p&gt;</description>
      <content:encoded>&lt;p&gt;&amp;nbsp;Hi!&amp;nbsp; In this week's news mix: UR announces Automate showcase, Korea unveils 'Mode Man' cobot and Rollon creates '7th axis' for UR cobots.&amp;nbsp; We also watch Jibo's farewell to users, meet the trash-fighting robot 'Wasteshark', marvel at a selfie taken 37,600 km from the earth's surface and much more!&lt;/p&gt; 
&lt;p&gt;&lt;/p&gt; 
&lt;h2&gt;Cobots &amp;amp; manufacturing&lt;/h2&gt; 
&lt;p&gt;Cobot manufacturer Universal Robots has announced plans to showcase four new application clusters at the &lt;a href="https://www.automateshow.com/press-releases/automate-2019-show-and-conference-comes-to-chicago-april-8-11"&gt;Automate&lt;/a&gt; event in Chicago, April 8-11, 2019.&amp;nbsp; Aimed at addressing tasks where manufacturing labor shortages are being felt most acutely, the application clusters are machine tending, packaging, assembly and processing.&amp;nbsp; (Expect to see screwdriving and sanding cobot solutions on display too.)&amp;nbsp;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Sanding-768x473.jpg?width=300&amp;amp;name=Sanding-768x473.jpg" alt="Sanding-768x473" width="300" style="width: 300px; display: block; margin: 0px auto;"&gt;&lt;br&gt;Credit:&amp;nbsp; Universal Robots&lt;/p&gt; 
&lt;p&gt;One of the apps that's ready for its Automate showcase is a dual machine tending solution that features a &lt;a href="https://www.universal-robots.com/products/ur5-robot/"&gt;UR5e cobot&lt;/a&gt; fitted with &lt;a href="https://www.universal-robots.com/plus/end-effectors/robotiq-2f-140/"&gt;Robotiq’s double HAND-E gripper&lt;/a&gt; tending two CNC machines in the same cycle.&amp;nbsp; The HAND-E gripper is able to remove a part and insert a new part in the same handling move. &amp;nbsp;&lt;/p&gt; 
&lt;p&gt;Kawasaki has been showing off its new, dual-arm '&lt;a href="https://www.robotics.org/product-catalog-detail.cfm/Kawasaki-Robotics-USA-Inc/Kawasaki-Collaborative-Dual-Arm-SCARA-Robot-duAro2/productid/5206"&gt;duAro2&lt;/a&gt;' cobot, which offers increased vertical stroke (from 150 to 550mm (5.9-21 in)) and payload (from 4-6 kg (8.81-13.22 lb)) compared to its predecessor.&amp;nbsp; In this video, a duAro2 helps pack finished products into a cardboard box...&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/xpHFtVyvaDk" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;Advanced Robotics for Manufacturing (ARM) has issued a &lt;a href="http://arminstitute.org/arm-tech-19-01/"&gt;call for proposals&lt;/a&gt; for new solutions that advance "robotics technology for manufacturing."&amp;nbsp; ARM is a U.S.-based federal organization dedicated to robotics and workforce innovation.&amp;nbsp; This funding cycle could see ARM award up to USD5 million in eight categories, including: 'Bi-directional Communication on the Shop Floor,' 'Human-Robot Trust and Safety' and 'Methods &amp;amp; Tools for Successful Robotics Adoption and Expansion.' &amp;nbsp;&lt;/p&gt; 
&lt;p&gt;&lt;a href="https://www.rollon.com"&gt;Rollon&lt;/a&gt; has unveiled a seventh axis designed specifically for Universal Robots' cobots.&amp;nbsp; The axis extends the cobots' range of action, and Rollon foresees various applications for the technology, from industrial machine feeding to assembly, pick and place and palletizing...&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/46ihuUVrz6k" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;The widening skills gap in U.S. manufacturing could leave an estimated 2.4 million positions unfilled between 2018 and 2028, with a potential economic impact of USD2.5 trillion, according to a joint study by Deloitte and The Manufacturing Institute.&amp;nbsp; Positions in digital talent, skilled production, and operational management may be three times as difficult to fill in the next three years, &lt;a href="https://www.assemblymag.com/articles/94739-deloitte-tmi-study-notes-widening-manufacturing-skills-gap"&gt;Assembly Mag&lt;/a&gt; reported.&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;&lt;a href="https://www.binzel-abicor.com/US/eng/home/"&gt;Abicor Binzel&lt;/a&gt; announced that its spool welding cobot, developed in collaboration with &lt;a href="https://novarctech.com/"&gt;Novarc Technologies&lt;/a&gt;, will be participating in a live demo week from April 1st-5th in Houston, Texas.&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/6_0_8_1030608_spool_welding_robot_noir_632122.jpg?width=600&amp;amp;name=6_0_8_1030608_spool_welding_robot_noir_632122.jpg" alt="6_0_8_1030608_spool_welding_robot_noir_632122" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;&lt;br&gt;Credit:&amp;nbsp; Abicor Binzel&lt;/p&gt; 
&lt;p&gt;Korean researchers have created a "Korean Transformer" cobot called 'Mode Man' that enables "users to attach industrial robot arms and hands freely for their specific application."&lt;/p&gt; 
&lt;p&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/29630_41037_910.png?width=600&amp;amp;name=29630_41037_910.png" alt="29630_41037_910" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;&lt;/p&gt; 
&lt;p&gt;Via &lt;a href="http://www.businesskorea.co.kr/news/articleView.html?idxno=29630"&gt;Business Korea&lt;/a&gt;:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;Mode Man can have six to seven sections connectively attached on both arms. It is also possible to combine the arm and hand modules in a wide variety of ways with no distinction of front and backside. Depending on how the user combines the modules, Mode Man's movement, speed, and power change.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;In an exclusive piece for The Robot Report, Steve Crowe shared the HAHN Group’s plans to revive the fortunes of cobot maker Rethink Robotics.&amp;nbsp; There are three main planks to the emerging strategy: refurbish Rethink Robotics’ inventory of Sawyer cobots; develop a new Sawyer cobot with “less noise, higher speed and better accuracy”; and develop a family of Rethink robots with varying payloads and reaches.&amp;nbsp; (&lt;a href="https://www.therobotreport.com/hahn-group-rethink-robotics-sawyer-cobot/"&gt;The Robot Report&lt;/a&gt; has the details.)&lt;/p&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;In other cobot news:&lt;/p&gt; 
&lt;ul&gt; 
 &lt;li&gt;The future of manufacturing: Cobots in the factory (&lt;a href="https://www.tctmagazine.com/tctblogs/guest-blogs/the-future-of-manufacturing-cobots-in-the-factory/"&gt;TCT Mag&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;The shift to collaborative robots means the rise of robotics as a service&amp;nbsp; (&lt;a href="https://techcrunch.com/2019/03/03/the-shift-to-collaborative-robots-means-the-rise-of-robotics-as-a-service/"&gt;TechCrunch&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Robots: Not just for the big guys anymore&amp;nbsp; (&lt;a href="https://www.chicagobusiness.com/manufacturing/robots-not-just-big-guys-anymore"&gt;Chicago Business&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Robots For MRO Improving&amp;nbsp; (&lt;a href="https://www.mro-network.com/mro-links/robots-mro-improving"&gt;MRO Network&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Non-Automotive Manufacturing Sectors Flocking to Robotics&amp;nbsp; (&lt;a href="https://advancedmanufacturing.org/non-automotive-mfg-robotics/"&gt;Advanced Manufacturing&lt;/a&gt;)&lt;/li&gt; 
&lt;/ul&gt; 
&lt;br&gt;
&lt;br&gt; 
&lt;h2&gt;Elsewhere...&lt;/h2&gt; 
&lt;p&gt;Just when you thought it was safe to drop trash into our waterways, along comes 'Wasteshark': a new robot that hunts down floating litter and can scoop up more than 15 tons of waste debris in a year.&amp;nbsp; Designed to gather trash before it enters the world's oceans, the bot can operate for 8 hours in a single session.&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/PIC-2.jpg?width=600&amp;amp;name=PIC-2.jpg" alt="ranmarine-wasteshark" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;&lt;br&gt;Credit:&amp;nbsp; RanMarine&lt;/p&gt; 
&lt;p&gt;The &lt;a href="https://www.dailymail.co.uk/sciencetech/article-6768791/Robot-shark-snaps-plastic-waste-tide-takes-sea.html"&gt;Daily Mail&lt;/a&gt; reports:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;It is programmed with GPS points to ensure that it covers hotspots where waste gathers, and its path can be programmed and monitored remotely. As well as plastic it will extract oils, other pollutants and pest plants such as types of algae.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;U.S. companies installed more robots than ever before in 2018, with non-automotive segments showing particularly strong growth, according to new figures from the &lt;a href="https://www.robotics.org/content-detail.cfm/Industrial-Robotics-News/Record-Number-of-Robots-Shipped-in-North-America-in-2018-With-More-Installed-at-Non-Automotive-Companies-Than-Ever-Before/content_id/7756"&gt;Robotic Industries Association&lt;/a&gt;.&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Capture-29.png?width=600&amp;amp;name=Capture-29.png" alt="Capture-29" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;&lt;/p&gt; 
&lt;p&gt;The Canadian government is planning to invest USD1.5 billion in the development of 'Canadarm3', a robotic arm designed for use on NASA's planned Lunar Gateway space station in the 2020s. It's a move that carries a degree of risk, some analysts say, as the U.S. Congress has not yet allocated significant funds to the project.&amp;nbsp; (H/T &lt;a href="https://www.forbes.com/sites/elizabethhowell1/2019/03/05/canada-makes-a-risky-bet-on-a-giant-robot-arm/#68c4fc944ca9"&gt;Forbes&lt;/a&gt;)&lt;/p&gt; 
&lt;p&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/https%20_blogs-images.forbes.com_elizabethhowell1_files_2019_03_role-du-canada-id-12642-og.jpg?width=600&amp;amp;name=https%20_blogs-images.forbes.com_elizabethhowell1_files_2019_03_role-du-canada-id-12642-og.jpg" alt="https _blogs-images.forbes.com_elizabethhowell1_files_2019_03_role-du-canada-id-12642-og" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;&lt;/p&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;The Beresheet lunar lander was photobombed by Planet Earth this week as it captured a spectacular selfie some 37,600 kilometers (23,363.5 miles) from the surface.&amp;nbsp;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/lunar.png?width=600&amp;amp;name=lunar.png" alt="lunar" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;&lt;br&gt;Credit: Beresheet&lt;/p&gt; 
&lt;p&gt;Meanwhile, Toyota is set to team up with the Japan Aerospace Exploration Agency on a planned mission to the moon, with the auto giant expected to develop a lunar rover, according to officials and media reports.&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;Via &lt;a href="https://www.japantimes.co.jp/news/2019/03/06/business/japans-moon-shot-toyota-jaxa-space-agency-plan-send-rover-lunar-mission/#.XH_dhlNKjBI"&gt;The Japan Times&lt;/a&gt;:&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;Details will be announced by JAXA and Toyota on Tuesday next week when the space agency hosts a symposium in Tokyo, the spokesman said.&amp;nbsp; Toyota also confirmed plans to announce a joint project with JAXA “on mobility and a space probe” but declined to comment further.&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;Waymo announced that it is to start selling its short-range laser sensors, according to a report in Wired:&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Waymo-honeycomb_tabletop.jpg?width=600&amp;amp;name=Waymo-honeycomb_tabletop.jpg" alt="Waymo-honeycomb_tabletop" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;&lt;br&gt;The Laser Bear Honeycomb lidar.&amp;nbsp; Credit: Waymo&lt;/p&gt; 
&lt;blockquote&gt; 
 &lt;p&gt;&lt;em&gt;&lt;a href="https://waymo.com/lidar/"&gt;Laser Bear Honeycomb&lt;/a&gt;, is the one [Waymo] is now putting in its shop window. It’s a “perimeter sensor,” focused on things in its immediate vicinity. It sees 360 degrees around it, and it has a 90 degree vertical field of view. Its minimum range is zero meters, meaning it can see things right up against it.&amp;nbsp; Waymo won’t be selling the technology to rival self-driving outfits, instead indicating that robotics, security, and agriculture tech companies would be potential customers.&lt;/em&gt;&lt;/p&gt; 
&lt;/blockquote&gt; 
&lt;p&gt;And in other news:&amp;nbsp;&lt;/p&gt; 
&lt;ul&gt; 
 &lt;li&gt;RIA Announces the Winner of 2019 Engelberger Robotics Award&amp;nbsp; (&lt;a href="https://news.thomasnet.com/companystory/ria-annouches-the-winnner-of-2019-engelberger-robotics-award-to-catherine-morris-and-dr-howie-choset-at-the-award-dinner-40021987"&gt;Thomas Industry Update&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;How I Became a Robot in London—From 5,000 Miles Away&amp;nbsp; (&lt;a href="https://www.wired.com/story/how-i-became-a-robot-in-london/"&gt;Wired&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;FDA: Robotic Surgery For Breast, Cervical Cancer? What You Need To Know&amp;nbsp; (&lt;a href="https://www.forbes.com/sites/brucelee/2019/03/02/fda-robotic-surgery-for-breast-cervical-cancer-what-you-need-to-know/"&gt;Forbes&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Festo's latest bionic creations wriggle toward Hannover Messe unveiling&amp;nbsp; (&lt;a href="https://www.design-engineering.com/festo-latest-bionic-creations-wriggle-toward-hannover-messe-unveiling-1004032872/"&gt;Design Engineering&lt;/a&gt;)&lt;/li&gt; 
 &lt;li&gt;Robot biomimics animals leaping from water&amp;nbsp; (&lt;a href="http://news.cornell.edu/stories/2019/03/robot-biomimics-animals-leaping-water"&gt;Cornell Chronicle&lt;/a&gt;)&lt;/li&gt; 
&lt;/ul&gt; 
&lt;p&gt;&lt;span&gt;Come by next week for more of the latest robotics news!&amp;nbsp; Until then, please enjoy...&lt;br&gt;&lt;br&gt;&lt;/span&gt;&lt;/p&gt; 
&lt;h2&gt;Five vids for Friday&lt;/h2&gt; 
&lt;p&gt;1.&amp;nbsp; &lt;a href="http://biomimetics.mit.edu/"&gt;MIT researchers&lt;/a&gt; have created a versatile, 20 lb (9.07 kg) “Mini Cheetah” bot that's able to flip, hop, trot, &lt;a href="https://en.wiktionary.org/wiki/pronk"&gt;pronk&lt;/a&gt; and run around on even and uneven terrain.&amp;nbsp; As the video below shows, it's also highly adept at getting back on its feet after a fall, which is just as well with all the acrobatics going on!&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/xNeZWP5Mx9s" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;2.&amp;nbsp; Engineers at Georgia Institute of Technology have built an ultra-low-power chip for bots that is half the size and consumes one-third the power of traditional digital chips.&amp;nbsp; Enhancements in logic and memory design further reduced energy consumption to the milliwatt range while retaining performance targets.&amp;nbsp; The technology could help palm-sized robots to learn from experience and collaborate.&amp;nbsp; (H/T &lt;a href="https://www.news.gatech.edu/2019/03/05/ultra-low-power-chips-help-make-small-robots-more-capable"&gt;Georgia Tech News Center&lt;/a&gt;)&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/_NqdJabFJKo" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;3.&amp;nbsp; &lt;a href="https://www.jibo.com/"&gt;Jibo&lt;/a&gt;, the domestic social bot that launched in a blaze of publicity (but ultimately failed to meet expectations), delivered a touching goodbye to its users this week.&amp;nbsp; Jibo may be the first bot to announce its departure in people's homes in this way, but similar scenes are likely to play out again in future decades as domestic robots are retired, replaced or companies providing support close down.&amp;nbsp; (The BBC has &lt;a href="https://www.bbc.com/news/amp/technology-47454599"&gt;more&lt;/a&gt;.)&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/gQMDJtgzo10" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;4.&amp;nbsp; New video shows how robots are helping to clean up after the Fukushima Daiichi nuclear disaster.&amp;nbsp; (&lt;a href="https://www.cnet.com/google-amp/news/at-fukushimas-nuclear-disaster-robots-are-just-now-attacking-the-radiation-problem/"&gt;c|net&lt;/a&gt; has more.)&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/mhQixNlLF_k" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;5.&amp;nbsp; &lt;a href="https://tradiebot.com/"&gt;Tradiebot&lt;/a&gt;'s Repairbot project reached a major milestone this week by successfully 3D printing a replacement lug on a headlight.&amp;nbsp; Using a robotic arm to precisely manipulate the headlight under a stationary 3D print head, the team were able to "print complex geometries without the need for support material."&lt;/p&gt; 
&lt;div class="hs-responsive-embed hs-responsive-embed-youtube"&gt;
 &lt;iframe class="hs-responsive-embed-iframe" style="float: none; margin-left: auto; margin-right: auto; display: block;" src="http://www.youtube.com/embed/yRJ2PbZEk_A" width="600" height="336" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt; 
&lt;p&gt;&amp;nbsp;&lt;/p&gt;  
&lt;img src="https://track.hubspot.com/__ptq.gif?a=13401&amp;amp;k=14&amp;amp;r=https%3A%2F%2Fblog.robotiq.com%2Fwhats-new-in-robotics-08.03.2019&amp;amp;bu=https%253A%252F%252Fblog.robotiq.com&amp;amp;bvt=rss" alt="" width="1" height="1" style="min-height:1px!important;width:1px!important;border-width:0!important;margin-top:0!important;margin-bottom:0!important;margin-right:0!important;margin-left:0!important;padding-top:0!important;padding-bottom:0!important;padding-right:0!important;padding-left:0!important; "&gt;</content:encoded>
      <category>collaborative robots</category>
      <category>space robotics</category>
      <category>advanced manufacturing</category>
      <category>cobot</category>
      <category>robotics news</category>
      <pubDate>Fri, 08 Mar 2019 12:02:00 GMT</pubDate>
      <author>emmetcole@gmail.com (Emmet Cole)</author>
      <guid>https://blog.robotiq.com/whats-new-in-robotics-08.03.2019</guid>
      <dc:date>2019-03-08T12:02:00Z</dc:date>
    </item>
    <item>
      <title>10 Solutions to Improve Robot Vision With Shiny Objects</title>
      <link>https://blog.robotiq.com/10-solutions-to-improve-robot-vision-with-shiny-objects</link>
      <description>&lt;div class="hs-featured-image-wrapper"&gt; 
 &lt;a href="https://blog.robotiq.com/10-solutions-to-improve-robot-vision-with-shiny-objects" title="" class="hs-featured-image-link"&gt; &lt;img src="https://blog.robotiq.com/hubfs/Robotiq-wrist-camera.jpg" alt="Robotiq-wrist-camera" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"&gt; &lt;/a&gt; 
&lt;/div&gt;    
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt; &lt;strong&gt;Lights, camera, … and then reflections on the surface of your shiny objects! Suddenly your flawless robot vision setup can't detect objects. How do you deal with these annoying lighting problems? Here are&amp;nbsp;ten great solutions.&lt;/strong&gt;&lt;/p&gt;</description>
      <content:encoded>&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt;&lt;strong&gt;Lights, camera, … and then reflections on the surface of your shiny objects! Suddenly your flawless robot vision setup can't detect objects. How do you deal with these annoying lighting problems? Here are&amp;nbsp;ten great solutions.&lt;/strong&gt;&lt;/p&gt; 
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 10px 0px 0px 0px;"&gt;&lt;/p&gt; 
&lt;iframe class="wistia_embed" style="margin-left: auto; margin-right: auto; display: block;" src="https://fast.wistia.net/embed/iframe/w4pmovp7c9" name="wistia_embed" width="600" height="338" frameborder="0" allowfullscreen&gt;&lt;/iframe&gt; 
&lt;p style="text-align: center;"&gt;The &lt;a href="/robotiq-wrist-camera-update"&gt;Robotiq Wrist Camera URCap version 1.7&lt;/a&gt; is now the fastest and most intuitive vision system for Universal Robots.&lt;/p&gt; 
&lt;p&gt;Shiny objects are possibly the most challenging objects for a robot vision system. All it takes is one annoying reflection on the surface of the material and the reliability of the robot's entire task can be compromised.&lt;/p&gt; 
&lt;p&gt;Robot vision (&lt;a href="/robot-vision-vs-computer-vision-whats-the-difference"&gt;and the more general "computer vision"&lt;/a&gt;) is all about lighting. Whatever type of sensor you are using, the lighting will have a huge effect on whether or not it will be possible to detect the objects. Pick the wrong type of light and you could end up not being able to detect anything at all.&lt;/p&gt; 
&lt;p&gt;This is why shiny materials can be such trouble. The lighting you choose can completely change the appearance of the objects. &lt;a href="https://www.automation.com/pdf_articles/microscan/lighting_tips_white_paper.pdf"&gt;This whitepaper from Microscan&lt;/a&gt;&amp;nbsp;shows how a shiny ball bearing looks entirely different under 7 different lighting conditions. &lt;a href="https://dof.robotiq.com/discussion/210/shiny-object-vision-challenge"&gt;As Robotiq's Catherine Bernier explains&lt;/a&gt;, light reflections are seen by the robot's vision system as "features" (i.e. things that should be detected) but they aren't real physical features so they confuse the whole detection process.&lt;/p&gt; 
&lt;p&gt;How can we overcome these challenges and improve the detection of shiny objects?&lt;/p&gt; 
&lt;p&gt;Thankfully, there are some great solutions out there. Here are 10 solutions to improve robot vision with shiny objects.&lt;/p&gt; 
&lt;p&gt;&lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=dc910fdd-73af-4256-9c1e-ac13cf4185b5&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; margin: 0 auto; display: block; margin-top: 20px; margin-bottom: 20px" alt="wrist-camera-urcap-update-cta" src="https://no-cache.hubspot.com/cta/default/13401/dc910fdd-73af-4256-9c1e-ac13cf4185b5.png" align="middle"&gt;&lt;/a&gt;&lt;/p&gt; 
&lt;h2&gt;1. Use a backlight&lt;/h2&gt; 
&lt;p&gt;A backlight is one which sits underneath the object. We've used them before &lt;a href="https://blog.robotiq.com/watch-our-automatica-demo-mashup"&gt;in our trade fair demos&lt;/a&gt; as they are a good way to control the light when the ambient lighting is unpredictable. They provide the vision sensor with a very clear outline of the object, which will show up as dark in the image.&lt;/p&gt; 
&lt;p&gt;Backlights are particularly useful with shiny objects because the light is coming from behind the object. This reduces the chance of reflections on the object's surface.&lt;/p&gt; 
&lt;h2&gt;2. Use infra-red lighting&lt;/h2&gt; 
&lt;p&gt;Not all frequencies of light will interact with a surface in the same way. For example, ultraviolet (UV) light has a short wavelength which means that it will reflect strongly off the material's surface. Infra-red (IR) light, on the other hand, has a long wavelength and so will reflect less strongly. With some materials, it's even possible to "look through" the material with IR light. It is for this reason that &lt;a href="https://www.vision-doctor.com/en/ir-illumination.html"&gt;IR light is often used for inspection applications&lt;/a&gt;.&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Robotiq-wrist-camera.jpg?width=600&amp;amp;name=Robotiq-wrist-camera.jpg" alt="Robotiq-wrist-camera" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;Adding a &lt;a href="/robotiq-wrist-camera-update"&gt;camera&lt;/a&gt; to your cobot is key to picking objects that lie in unstructured environments.&lt;/p&gt; 
&lt;h2&gt;3. Change the light color&lt;/h2&gt; 
&lt;p&gt;Of course, you don't need to change the type of light completely. Sometimes you can reduce the effects of light reflections by simply changing the color of the lighting. For example, visible-spectrum red light will interact less strongly with shiny surfaces than white light. An easy way to change lighting color is to cover the light with a colored "gel" (a thin, transparent film).&lt;/p&gt; 
&lt;p&gt;Changing the light color can also improve detection in other ways. For example, if your robot vision system uses monochrome images for detection, as many systems do, adding a blue light will make blue features lighter in the monochrome image and orange features darker. This effect happens because orange is the opposite of blue on the color wheel.&lt;/p&gt; 
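&lt;p&gt;&lt;em&gt;To make the color effect concrete, here is a toy Python sketch (our own illustration, not any vision product's API). It models a monochrome sensor under white light as the standard luma-weighted sum of a pixel's RGB reflectances, and under narrowband blue light as the blue-channel reflectance alone; the pixel values are made-up examples.&lt;/em&gt;&lt;/p&gt;

```python
# Toy model: how narrowband blue lighting shifts monochrome contrast.
# Assumption: under broadband white light a monochrome sensor sees roughly
# the luma-weighted sum of a surface's RGB reflectances; under narrowband
# blue light it mostly sees the blue-channel reflectance alone.

def gray_under_white(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b  # standard luma weights

def gray_under_blue(rgb):
    return float(rgb[2])  # only the blue reflectance contributes

blue_part = (40, 60, 220)     # a blue feature on the workpiece
orange_part = (230, 120, 20)  # an orange feature

for name, px in (("blue", blue_part), ("orange", orange_part)):
    print(name, round(gray_under_white(px)), "->", round(gray_under_blue(px)))
```

&lt;p&gt;&lt;em&gt;Running this shows the blue feature jumping to a much lighter gray value under blue light while the orange feature drops to a much darker one, which is exactly the contrast boost described above.&lt;/em&gt;&lt;/p&gt;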
&lt;ul style="list-style-type: disc;"&gt; 
 &lt;li&gt;&lt;strong&gt;Read more:&amp;nbsp;&lt;a href="/robotiq-wrist-camera-update"&gt;Get Full Control with the New Robotiq Wrist Camera URCap&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt; 
&lt;/ul&gt; 
&lt;h2&gt;4. Use polarized light&lt;/h2&gt; 
&lt;p&gt;Normal white light contains light waves which are oriented in many different planes. &lt;a href="https://www.physicsclassroom.com/class/light/Lesson-1/Polarization"&gt;Polarization&lt;/a&gt; is a process by which only light waves in one plane are allowed to pass through a polarizing filter.&lt;/p&gt; 
&lt;p&gt;Polarized light is often used in computer vision applications because shining polarized light onto a shiny surface can reduce reflections. However, polarized light is not always the best solution: you have to position the polarized light precisely or the reflections can get much worse.&lt;/p&gt; 
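&lt;p&gt;&lt;em&gt;Why is positioning so sensitive? Malus's law gives a back-of-the-envelope picture. Specular glare stays largely polarized, so a filter on the camera rotated 90 degrees to the glare's plane removes most of it, while at 0 degrees the glare passes straight through. This short Python sketch (our own illustration, with assumed intensity numbers) shows the swing:&lt;/em&gt;&lt;/p&gt;

```python
import math

# Malus's law: light polarized in one plane passes an analyzer set at
# angle theta to that plane with intensity I0 * cos^2(theta).
def transmitted(i0, theta_deg):
    return i0 * math.cos(math.radians(theta_deg)) ** 2

# Assumed numbers: polarized specular glare and unpolarized diffuse
# light of equal strength; an ideal analyzer passes half of any
# unpolarized light regardless of its angle.
glare, diffuse = 100.0, 100.0
for angle in (0, 45, 90):
    seen = transmitted(glare, angle) + 0.5 * diffuse
    print(f"analyzer at {angle} deg: sensor sees intensity {seen:.1f}")
```

&lt;p&gt;&lt;em&gt;At 90 degrees the glare term vanishes and the sensor keeps only the useful diffuse light; at 0 degrees the glare passes in full, which is why a badly positioned polarizer can make reflections worse rather than better.&lt;/em&gt;&lt;/p&gt;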
&lt;h2&gt;5. Change the flash&lt;/h2&gt; 
&lt;p&gt;If your vision system uses strobed lighting, this can have an effect on the reflections from shiny objects. Sometimes flashed light can reduce the reflections by "overwhelming" the glare caused by ambient lighting. Other times, the flash can make the reflections worse.&lt;/p&gt; 
&lt;p&gt;The solution is to test out your own setup and see whether changing the properties of the flash solves your problem.&lt;/p&gt; 
&lt;h2&gt;6. Add ambient lighting&lt;/h2&gt; 
&lt;p&gt;Often, a reflection on the surface of a shiny object is caused by a single light source — e.g. a strip-light in the ceiling, a window, etc. It can seem a strange idea to add more light sources in an attempt to reduce the problem, but this is actually a good solution.&lt;/p&gt; 
&lt;p&gt;As with adding a flash, additional ambient lighting can "overwhelm" the reflections from individual light sources. Of course, you need to make sure that the light that you add doesn't create more reflections.&lt;/p&gt; 
&lt;h2&gt;7. Diffuse the existing light&lt;/h2&gt; 
&lt;p&gt;Imagine someone shines a torch in your eye. It's hard to look at the bright light, right? Now imagine a bright, grey sky on a very cloudy day. Even though the sky is bright, it's easier to look at than the torch, isn't it?&lt;/p&gt; 
&lt;p&gt;This is an example of the difference between a hard lighting source (the torch) and diffuse/soft lighting source (the overcast sky). The worst reflections on shiny objects are caused by harsh, single point light sources. By diffusing this light, you can drastically reduce the reflections. One common way to diffuse light is to add a diffuse dome.&lt;/p&gt; 
&lt;h2&gt;8. Use advanced sensors&lt;/h2&gt; 
&lt;p&gt;Robot users sometimes think that more advanced (and costly) sensors will solve their problems. For dealing with shiny objects, the assumption is often that 3D laser scanners and 3D vision will not have the same problems as 2D vision.&lt;/p&gt; 
&lt;p&gt;However, although advanced sensors are necessary in some cases, they aren't immune from the problems caused by shiny objects.&amp;nbsp;&lt;a href="https://ieeexplore.ieee.org/document/679807"&gt;Researchers from Huazhong University of Science and Technology&lt;/a&gt; explain that even 3D laser scanners are affected by spurious reflections — they had to develop an entirely new algorithmic solution to overcome these issues. Sometimes, more advanced technology just means more complex problems!&lt;/p&gt; 
&lt;h2&gt;9. Reduce reflections when teaching&lt;/h2&gt; 
&lt;p&gt;One of the real secrets to improving vision detection of all objects — not just shiny ones — is to make sure that your teaching phase is good. The detection phase can be quite robust to lighting changes as long as the vision system has a good model (aka template) of the object it needs to detect.&lt;/p&gt; 
&lt;p&gt;To ensure your system has a good template of your object, you should reduce reflections to a minimum when you are teaching the robot vision system. You may find that the system is able to detect the objects better even in the presence of reflections.&lt;/p&gt; 
&lt;ul style="list-style-type: disc;"&gt; 
 &lt;li&gt;&lt;strong&gt;Read more:&amp;nbsp;&lt;a href="/robotiq-wrist-camera-update"&gt;Your Cobot Can Now Locate, Learn &amp;amp; Pick Faster&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt; 
&lt;/ul&gt; 
&lt;h2&gt;10. Use the secret unusual trick&lt;/h2&gt; 
&lt;p&gt;How do you improve the teaching phase? One way is to use our favorite "secret trick" for creating a flawless object template. You can learn about this trick in our article &lt;a href="https://blog.robotiq.com/an-unusual-trick-that-improves-robot-vision-teaching"&gt;An Unusual Trick That Improves Robot Vision Teaching&lt;/a&gt;.&lt;/p&gt; 
&lt;p&gt;&amp;nbsp;&lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=9b957a82-9875-4c1c-9335-f931f87ee495&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; margin: 0 auto; display: block; margin-top: 20px; margin-bottom: 20px" alt="New Call-to-action" src="https://no-cache.hubspot.com/cta/default/13401/9b957a82-9875-4c1c-9335-f931f87ee495.png" align="middle"&gt;&lt;/a&gt;&lt;/p&gt; 
&lt;p style="-qt-block-indent: 0; text-indent: 0px; margin: 0px 0px 10px 0px;"&gt;&lt;em&gt;What lighting problems have you encountered with robot vision? &lt;strong&gt;Tell us in the comments below or join the discussion on &lt;a href="https://www.linkedin.com/company/1695451"&gt;LinkedIn&lt;/a&gt;, &lt;a href="https://twitter.com/Robotiq_Inc"&gt;Twitter,&lt;/a&gt; &lt;a href="https://www.facebook.com/robotiq"&gt;Facebook&lt;/a&gt; or &lt;a href="http://dof.robotiq.com/"&gt;the DoF professional robotics community&lt;/a&gt;.&lt;/strong&gt;&lt;/em&gt; &lt;/p&gt;  
&lt;img src="https://track.hubspot.com/__ptq.gif?a=13401&amp;amp;k=14&amp;amp;r=https%3A%2F%2Fblog.robotiq.com%2F10-solutions-to-improve-robot-vision-with-shiny-objects&amp;amp;bu=https%253A%252F%252Fblog.robotiq.com&amp;amp;bvt=rss" alt="" width="1" height="1" style="min-height:1px!important;width:1px!important;border-width:0!important;margin-top:0!important;margin-bottom:0!important;margin-right:0!important;margin-left:0!important;padding-top:0!important;padding-bottom:0!important;padding-right:0!important;padding-left:0!important; "&gt;</content:encoded>
      <category>vision</category>
      <category>URCaps</category>
      <category>Wrist Camera</category>
      <category>URCap</category>
      <category>robot vision</category>
      <category>start production faster</category>
      <pubDate>Thu, 07 Mar 2019 16:02:00 GMT</pubDate>
      <author>alex@alexowenhill.co.uk (Alex Owen-Hill)</author>
      <guid>https://blog.robotiq.com/10-solutions-to-improve-robot-vision-with-shiny-objects</guid>
      <dc:date>2019-03-07T16:02:00Z</dc:date>
    </item>
    <item>
      <title>How David Gouffé Went From Industrial Robots to Cobot End-Effectors</title>
      <link>https://blog.robotiq.com/topic/meet-the-team/europe/gouffe-david</link>
      <description>&lt;div class="hs-featured-image-wrapper"&gt; 
 &lt;a href="https://blog.robotiq.com/topic/meet-the-team/europe/gouffe-david" title="" class="hs-featured-image-link"&gt; &lt;img src="https://blog.robotiq.com/hubfs/Gouffe-David.gif" alt="Gouffe-David" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"&gt; &lt;/a&gt; 
&lt;/div&gt;    
&lt;p&gt;&lt;strong&gt;As our Integration Coach, David Gouffé draws on his creativity to help others. David’s responsible for deploying solutions to help companies start working with robots. With his devotion to his work, David can be counted on to do everything in his power to make clients happy—and the same goes for his family! &lt;br&gt;&lt;/strong&gt;&lt;/p&gt;</description>
      <content:encoded>&lt;p&gt;&lt;strong&gt;As our Integration Coach, David Gouffé draws on his creativity to help others. David’s responsible for deploying solutions to help companies start working with robots. With his devotion to his work, David can be counted on to do everything in his power to make clients happy—and the same goes for his family! &lt;br&gt;&lt;/strong&gt;&lt;/p&gt; 
&lt;table style="width: 100%;"&gt; 
 &lt;tbody&gt; 
  &lt;tr&gt; 
   &lt;td&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Gouffe-David.gif?width=600&amp;amp;name=Gouffe-David.gif" alt="Gouffe-David" width="600" style="width: 600px; margin: 0px 30px 0px 0px;"&gt;&lt;/td&gt; 
   &lt;td&gt; &lt;h2&gt;David Gouffé&lt;/h2&gt; &lt;p&gt;Integration Coach&lt;/p&gt; 
    &lt;blockquote&gt; 
     &lt;p&gt;&lt;strong&gt;&lt;em&gt; "You gotta strike while the iron's hot." &lt;/em&gt;&lt;/strong&gt;&lt;em&gt;—Josh Schwartz&lt;/em&gt;&lt;/p&gt; 
    &lt;/blockquote&gt; 
    &lt;ul&gt; 
     &lt;li&gt;&lt;span style="color: #00a2e1;"&gt;&lt;strong&gt;Joined in:&amp;nbsp;&lt;/strong&gt;&lt;/span&gt; 2018&lt;/li&gt; 
     &lt;li&gt;&lt;strong&gt;&lt;span style="color: #00a2e1;"&gt;Describes self as:&lt;/span&gt;&lt;/strong&gt; Loving Parent, Coffee Addict, DIY Master, Awesome Gamer, Movie Geek, Real Foodie, and Life-long Learner&lt;/li&gt; 
     &lt;li&gt;&lt;span style="color: #00a2e1;"&gt;&lt;strong&gt;Greatest strengths:&lt;/strong&gt;&lt;/span&gt; Creativity, Curiosity, and Appreciation of Beauty&lt;/li&gt; 
    &lt;/ul&gt; &lt;/td&gt; 
  &lt;/tr&gt; 
 &lt;/tbody&gt; 
&lt;/table&gt; 
&lt;br&gt; 
&lt;h2&gt;&lt;span&gt;Meet our EuRobotiq Team: The Interview&lt;/span&gt;&lt;/h2&gt; 
&lt;h3&gt;How did you start working with Robotiq?&lt;/h3&gt; 
&lt;p&gt;&lt;span&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Gouffe-David_Robotiq-1.jpg?width=320&amp;amp;name=Gouffe-David_Robotiq-1.jpg" alt="Gouffe-David_Robotiq-1" width="320" style="width: 320px; float: left; margin: 0px 30px 10px 0px;"&gt;When I first heard that a company could make a living on grippers, I was surprised. I thought it must be such a niche market. Those were the early days of cobots. &lt;/span&gt;&lt;/p&gt; 
&lt;p&gt;&lt;span&gt;Back then, I was working with big industrial robots at huge multinationals. That’s actually when I used one of Robotiq’s products, &lt;a href="https://robotiq.com/products/force-copilot"&gt;Force Copilot&lt;/a&gt;, for the first time. I was amazed by how much it simplified my programming tasks. &lt;/span&gt;&lt;/p&gt; 
&lt;p&gt;&lt;span&gt;As cobots continued to gain steam, I grew more and more interested in working for that&amp;nbsp;fast-growing company creating &lt;strong&gt;Plug + Play Components&lt;/strong&gt; for cobots. Once one of my colleagues left for Robotiq, I knew I had to work there too. That’s when I came across the opportunity to be an integration coach. After trying their products, I was completely sold on the concept, and I’m so glad I got the job. &lt;/span&gt;&lt;/p&gt; 
&lt;h3&gt;What do you work on? What does that mean for the world?&lt;/h3&gt; 
&lt;p&gt;My work is so meaningful to me: I can help companies start working with robots, &lt;strong&gt;start production faster&lt;/strong&gt; and &lt;strong&gt;free human hands from repetitive tasks&lt;/strong&gt;. I support our partners so they can help their clients use our tools to make the most of their robots. I intervene before, during, and sometimes after deployment.&lt;/p&gt; 
&lt;p&gt;Some projects are a tough challenge—that’s what I love about my work. We have to be creative, apply our logical thinking abilities, and sometimes push ourselves to another level to find a solution. To stay "lean," we think slowly—and then act quickly. &lt;/p&gt; 
&lt;h3&gt;&lt;span&gt;What&amp;nbsp;are your biggest values?&lt;/span&gt;&lt;/h3&gt; 
&lt;p&gt;&lt;span&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Gouffe-David_bebe.jpg?width=320&amp;amp;name=Gouffe-David_bebe.jpg" alt="Gouffe-David_bebe" width="320" style="width: 320px; float: right; margin: 0px 0px 10px;"&gt;Efficiency and reliability are essential to everything I do. Sometimes upholding these qualities means adding a dash of creativity. &lt;/span&gt;&lt;/p&gt; 
&lt;p&gt;&lt;span&gt;I also try to live by this quote: "Laugh at your mistakes, but learn from them. Joke about your troubles, but gather strength from them. Have fun with your difficulties, but overcome them."&lt;/span&gt;&lt;/p&gt; 
&lt;p&gt;&lt;span&gt;Finally,&lt;strong&gt; I connect a lot with the &lt;a href="https://leanrobotics.org/"&gt;lean robotics&lt;/a&gt; philosophy&lt;/strong&gt;. To be as lean as possible, you need to produce as much as you can with the smallest possible input, as fast as possible. That’s how you ensure you deliver the best results—and have time and resources to spare for great new projects. &lt;br&gt;&lt;/span&gt;&lt;/p&gt; 
&lt;h3&gt;What do you do when you’re not working?&lt;/h3&gt; 
&lt;p&gt;Spending time with my son is my favorite thing in the world. Although he’s knee-high to a grasshopper, it seems like he can do anything. (Ok, maybe not as much as a cobot can, but what can I say? I’m so proud of him.)&lt;/p&gt; 
&lt;p&gt;Besides my kid, my life is very simple and that’s how I like it. All I need are great friends, good food, and some rock ’n’ roll in the background. Finally, I enjoy unicycling, swimming, and gaming (I'm playing Left 4 Dead 2 or Counter-Strike Global Offensive these days).&lt;/p&gt; 
&lt;p&gt;&lt;span&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Gouffe-David_famille-1.jpg?width=600&amp;amp;name=Gouffe-David_famille-1.jpg" alt="Gouffe-David_famille-1" width="600" style="width: 600px; display: block; margin: 0px auto;"&gt;&lt;/span&gt;&lt;/p&gt; 
&lt;p style="text-align: center;"&gt;David and his son at Christmas.&lt;/p&gt; 
&lt;h1 style="text-align: left;"&gt;Let's meet !&lt;/h1&gt; 
&lt;p style="text-align: left;"&gt;David Gouffé&amp;nbsp;is one awesome Robotiq teammates among many others. Want to meet them all ? Read more about&amp;nbsp;&lt;strong&gt;&lt;a href="/topic/meet-the-team"&gt;Meet the Team&lt;/a&gt;&amp;nbsp;&lt;/strong&gt;!&lt;/p&gt; 
&lt;p style="text-align: left;"&gt;&lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=cdf345d4-1544-4e76-9d40-128aa46d9efd&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; margin: 0 auto; display: block; margin-top: 20px; margin-bottom: 20px" alt="MEET THE TEAM" src="https://no-cache.hubspot.com/cta/default/13401/cdf345d4-1544-4e76-9d40-128aa46d9efd.png" align="middle"&gt;&lt;/a&gt;&lt;/p&gt; 
&lt;h1 style="text-align: left;"&gt;&amp;nbsp;&lt;/h1&gt;  
&lt;img src="https://track.hubspot.com/__ptq.gif?a=13401&amp;amp;k=14&amp;amp;r=https%3A%2F%2Fblog.robotiq.com%2Ftopic%2Fmeet-the-team%2Feurope%2Fgouffe-david&amp;amp;bu=https%253A%252F%252Fblog.robotiq.com&amp;amp;bvt=rss" alt="" width="1" height="1" style="min-height:1px!important;width:1px!important;border-width:0!important;margin-top:0!important;margin-bottom:0!important;margin-right:0!important;margin-left:0!important;padding-top:0!important;padding-bottom:0!important;padding-right:0!important;padding-left:0!important; "&gt;</content:encoded>
      <category>Meet the Team</category>
      <category>Europe</category>
      <pubDate>Wed, 06 Mar 2019 20:34:20 GMT</pubDate>
      <author>r-a.letourneau@robotiq.com (Romy A Letourneau)</author>
      <guid>https://blog.robotiq.com/topic/meet-the-team/europe/gouffe-david</guid>
      <dc:date>2019-03-06T20:34:20Z</dc:date>
    </item>
    <item>
      <title>Be Fully in Control with the New Robotiq Wrist Camera - URCap 1.7</title>
      <link>https://blog.robotiq.com/be-fully-in-control-with-the-new-robotiq-wrist-camera-urcap-1.7.0</link>
      <description>&lt;div class="hs-featured-image-wrapper"&gt; 
 &lt;a href="https://blog.robotiq.com/be-fully-in-control-with-the-new-robotiq-wrist-camera-urcap-1.7.0" title="" class="hs-featured-image-link"&gt; &lt;img src="https://blog.robotiq.com/hubfs/Plan%20de%20travail%201.png" alt="True-originality-consists-not-in-a-new-manner-but-in-a-new-vision" class="hs-featured-image" style="width:auto !important; max-width:50%; float:left; margin:0 15px 15px 0;"&gt; &lt;/a&gt; 
&lt;/div&gt;    
&lt;p style="text-align: left;"&gt;&lt;strong&gt;Robotiq launched version 1.7 of its Wrist Camera software for Universal Robots. It features major upgrades in advanced vision control which makes it now the fastest and easiest vision system to use on Universal Robots. That means improved exposure, focus and white balance adjustments. Your cobot will have a faster and more accurate image definition which will improve its object detection ability regardless of the work plane material.&lt;/strong&gt;&lt;/p&gt;</description>
      <content:encoded>&lt;p style="text-align: left;"&gt;&lt;strong&gt;Robotiq has launched version 1.7 of its Wrist Camera software for Universal Robots. It features major upgrades to advanced vision control, making it the fastest and easiest vision system to use on Universal Robots. That means improved exposure, focus, and white-balance adjustments. Your cobot gets faster, more accurate image definition, improving its object detection regardless of the work-plane material.&lt;/strong&gt;&lt;/p&gt; 
&lt;p&gt;&lt;/p&gt; 
&lt;iframe class="wistia_embed" style="margin-left: auto; margin-right: auto; display: block;" src="https://fast.wistia.net/embed/iframe/w4pmovp7c9" name="wistia_embed" width="600" height="338" frameborder="0" allowfullscreen&gt;&lt;/iframe&gt; 
&lt;p style="text-align: center;"&gt;The Robotiq Wrist Camera URCap version 1.7.0 is now the fastest and most intuitive vision system for Universal Robots.&lt;/p&gt; 
&lt;p&gt;The new Robotiq Wrist Camera URCap is fast and easy to use for expert and novice integrators alike. This Plug+Play vision system now works in any environment. With complete control over the camera settings and model definition, the software can adapt to any situation. It is intuitive, flexible and powerful.&lt;/p&gt; 
&lt;p&gt;The Robotiq Wrist Camera is the only vision system specifically designed to perform industrial applications with Universal Robots. It allows you to teach new objects and detect features quickly to ensure repeatable picking. Reach your KPIs with a faster time to production, and improve your cycle time by adding robustness to your industrial application.&lt;/p&gt; 
&lt;h2&gt;Your Cobot Can Now Locate, Learn &amp;amp; Pick Faster&lt;/h2&gt; 
&lt;p&gt;&lt;strong&gt;Already own a Wrist Camera?&lt;/strong&gt; Go directly to Robotiq’s support page to download the latest update. Version 1.7 retains all the benefits you already enjoy while giving you access to all the camera settings to bring your production to the next level.&lt;/p&gt; 
&lt;p&gt;&lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=0d475acf-79f3-4473-9ea6-1a9e3b71d474&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; margin: 0 auto; display: block; margin-top: 20px; margin-bottom: 20px" alt="DOWNLOAD THE LATEST URCAP" src="https://no-cache.hubspot.com/cta/default/13401/0d475acf-79f3-4473-9ea6-1a9e3b71d474.png" align="middle"&gt;&lt;/a&gt;&lt;/p&gt; 
&lt;div&gt;
 &lt;a href="/robotiq-wrist-camera-update"&gt;&lt;strong&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/Plan%20de%20travail%201.png?width=600&amp;amp;name=Plan%20de%20travail%201.png" width="600" style="width: 600px; display: block; margin: 0px auto;" alt="True-originality-consists-not-in-a-new-manner-but-in-a-new-vision"&gt;&lt;/strong&gt;&lt;/a&gt;
&lt;/div&gt; 
&lt;div style="text-align: center;"&gt;
 “True originality consists not in a new manner but in a new vision.” —Edith Wharton
&lt;/div&gt; 
&lt;div style="text-align: center;"&gt;
 &amp;nbsp;
&lt;/div&gt; 
&lt;h2 style="text-align: left;"&gt;Improve Your Factory’s Efficiency and Autonomy&lt;/h2&gt; 
&lt;p style="text-align: left;"&gt;Adding a camera to your cobot is key to pick objects that lie in unstructured environments. Say goodbye to jigs and fixtures and hello to the Robotiq Wrist Camera. The latest version of the Camera’s software will change the way you use it and help you start production faster. With this Plug + Play component, you can automate picking and placing tasks in any environment. It is as easy to use as it is powerful. Locating, teaching and picking objects faster reduces your time to production, leading to a quick return on investment (ROI).&lt;/p&gt; 
&lt;p style="text-align: left;"&gt;&lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=e5ffc726-b82f-4f4d-a32f-bb47f755b61d&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; margin: 0 auto; display: block; margin-top: 20px; margin-bottom: 20px" alt="TALK TO AN EXPERT" src="https://no-cache.hubspot.com/cta/default/13401/e5ffc726-b82f-4f4d-a32f-bb47f755b61d.png" align="middle"&gt;&lt;/a&gt;&lt;/p&gt; 
&lt;div style="text-align: center;"&gt;
 &lt;strong&gt;&lt;a href="/robotiq-wrist-camera-update"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/2019-02-04-Applications-Wrist-Camera-07.png?width=200&amp;amp;name=2019-02-04-Applications-Wrist-Camera-07.png" alt="Machine-tending-Applications-Wrist-Camera" width="200" style="width: 200px;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/2019-02-04-Applications-Wrist-Camera-05-1.png?width=200&amp;amp;name=2019-02-04-Applications-Wrist-Camera-05-1.png" alt="Assembly-Applications-Wrist-Camera" width="200" style="width: 200px;"&gt;&lt;img src="https://blog.robotiq.com/hs-fs/hubfs/2019-02-04-Applications-Wrist-Camera-06.png?width=200&amp;amp;name=2019-02-04-Applications-Wrist-Camera-06.png" alt="Pick-and-place-Applications-Wrist-Camera" width="200" style="width: 200px;"&gt;&lt;/a&gt;&lt;/strong&gt;
&lt;/div&gt; 
&lt;div style="text-align: center;"&gt;
 Teach, locate, pick, and repeat: deploy your application at any expertise level and in any industrial environment.
 &lt;br&gt;
 &lt;br&gt;
 &lt;a class="cta_button" href="https://blog.robotiq.com/cs/ci/?pg=43c623ac-1b14-4b7d-bb85-d1a0e2ada276&amp;amp;pid=13401&amp;amp;ecid=&amp;amp;hseid=&amp;amp;hsic="&gt;&lt;img class="hs-cta-img " style="border-width: 0px; /*hs-extra-styles*/; " alt="LEARN MORE" src="https://no-cache.hubspot.com/cta/default/13401/43c623ac-1b14-4b7d-bb85-d1a0e2ada276.png"&gt;&lt;/a&gt;
&lt;/div&gt;  
&lt;img src="https://track.hubspot.com/__ptq.gif?a=13401&amp;amp;k=14&amp;amp;r=https%3A%2F%2Fblog.robotiq.com%2Fbe-fully-in-control-with-the-new-robotiq-wrist-camera-urcap-1.7.0&amp;amp;bu=https%253A%252F%252Fblog.robotiq.com&amp;amp;bvt=rss" alt="" width="1" height="1" style="min-height:1px!important;width:1px!important;border-width:0!important;margin-top:0!important;margin-bottom:0!important;margin-right:0!important;margin-left:0!important;padding-top:0!important;padding-bottom:0!important;padding-right:0!important;padding-left:0!important; "&gt;</content:encoded>
      <category>vision</category>
      <category>cobots</category>
      <category>robot wrist</category>
      <category>Wrist Camera</category>
      <category>robot vision</category>
      <pubDate>Mon, 04 Mar 2019 15:19:00 GMT</pubDate>
      <author>r-a.letourneau@robotiq.com (Romy A Letourneau)</author>
      <guid>https://blog.robotiq.com/be-fully-in-control-with-the-new-robotiq-wrist-camera-urcap-1.7.0</guid>
      <dc:date>2019-03-04T15:19:00Z</dc:date>
    </item>
  </channel>
</rss>
