Controllers Archives - The Robot Report
https://www.therobotreport.com/category/technologies/controllers/
Robotics news, research and analysis

Siemens, Comau collaborate on Sinumerik Run MyRobot
https://www.therobotreport.com/siemens-comau-collaborate-on-sinumerik-run-myrobot/
Wed, 04 Jan 2023

Siemens and Comau worked together to create Sinumerik Run MyRobot / DirectControl. | Source: Siemens

Siemens has entered into a cooperative agreement with Comau, an Italian robot manufacturer, to offer their jointly engineered product the Sinumerik Run MyRobot / DirectControl. With this product, robot kinematics can be fully integrated into a CNC system, optimizing control of all robotic machining and handling tasks. 

Sinumerik Run MyRobot / DirectControl allows a CNC system to control robotic arms and perform the safety functions that the robot controller typically performs. The Sinumerik CNC controls the articulated robot arm directly on the basis of complex algorithms, meaning users don’t need a separate robot controller. Robots can also be programmed entirely on the Sinumerik CNC operator panel.

Integrating control of a robot arm into the CNC helps improve path and positioning accuracy and reliability, according to Siemens and Comau. The CNC can also give a robot enhanced dynamic response during robot-aided machining tasks, enabling it to undertake more challenging machining assignments.

Sinumerik Run MyRobot / DirectControl can perform all of the same functions as Siemens’ previous Run MyRobot variants and allows for more dynamic applications of a robotic arm, with more accurate control and an improved capacity for robot operations to run simultaneously with machining.

Comau is based in Turin, Italy and was founded in 1973. It recently launched a new robotics learning center with Ferrari. The e.DO Learning Center will use Comau’s robots to help students explore STEM subjects, coding and robotics. The facility is equipped with five of Comau’s e.DO 6-axis robots, complete with all necessary materials and accessories.

In September, Siemens announced that it was collaborating with Realtime Robotics to integrate the company’s RapidPlan software with Siemens Process Simulate. The partnership will allow Siemens customers to use Realtime’s robot motion planning and control software, RapidPlan, without leaving Siemens Process Simulate. The integration enables users to visualize, prioritize and simulate robot task plans. Users can then validate those task plans through virtual commissioning.

How AI chipset bans could impact Chinese robotics companies
https://www.therobotreport.com/how-ai-chipset-bans-could-impact-chinese-robotics-companies/
Thu, 01 Sep 2022

The US is restricting the sale of the most powerful artificial intelligence processors to China.

NVIDIA and AMD said on Wednesday that the United States government has ordered them to halt exports of certain AI chipsets to China, which is the world’s second-largest economy. Both companies now require licenses for the sale of AI chipsets to China.

The restrictions cover NVIDIA’s A100 and upcoming H100 integrated circuits, and any systems that include them. AMD said the new license requirements will stop its MI250 chips from being exported to China, but that it doesn’t foresee this having a material impact on its business. NVIDIA, however, said the move could result in a loss of $400 million in sales this year.

NVIDIA said U.S. officials told it the new rule “will address the risk that products may be used in, or diverted to, a ‘military end use’ or ‘military end user’ in China.” NVIDIA has development and manufacturing within China. Today it announced the export restrictions do not cover the movement of materials related to the development and manufacturing of the H100 chip. NVIDIA confirmed it will be allowed to fulfill orders of the A100 and complete development of its H100 chip through the company’s Hong Kong facility until Sept. 1, 2023.

“We are working with our customers in China to satisfy their planned or future purchases with alternative products and may seek licenses where replacements aren’t sufficient,” NVIDIA said in a statement. “The only current products that the new licensing requirement applies to are A100, H100 and systems such as DGX that include them.”

The NVIDIA A100 and H100 chipsets are Tensor Core GPUs, designed to process enterprise workloads. NVIDIA’s ninth-generation H100 data center GPU features 80 billion transistors. Built on the Hopper architecture, NVIDIA’s new accelerator is advertised as “the world’s largest and most powerful accelerator,” making it ideal for intensive HPC and AI simulations.

The NVIDIA A100 Tensor Core GPU, which was launched in 2020, powered the highest-performing elastic data centers for artificial intelligence (AI), data analytics and high-performance computing (HPC) at the time. The Ampere architecture provides up to 20X higher performance than its predecessors.

For AMD, the restrictions cover the Instinct MI250 Accelerator. Accelerators from AMD’s latest Instinct MI200 family are designed to fuel discoveries in mainstream servers and supercomputers, including some of the largest exascale systems, so that researchers may take on problems as diverse as climate change and vaccine development.

Impact on Chinese robotics companies

There are a number of robotic applications that leverage AI to operate effectively. This includes tasks like vision guidance for industrial robots for bin picking, sorting and palletizing. For autonomous mobile robots, AI is used for perception and obstacle avoidance. Within the warehouse, many suppliers are employing AI to optimize the daily workflow, inventory placement and both goods-to-person and person-to-goods operations.

The class of GPUs coming under export controls is arguably the most powerful on the market. The chips in question are designed to be deployed in data center applications and embedded in enterprise-class servers. The likely reason for the export ban is that the US government wants to keep these chips from being diverted to other (i.e. military) applications.

These servers require lots of power, cooling and high-speed network connections to do their work. As a result, according to multiple sources The Robot Report spoke to, these specific chips are not likely to be engineered into embedded systems like a robot controller or an autonomous mobile robot controller. However, these GPUs are important for training deep learning or reinforcement learning models for a robotic application. Model training is computationally intensive and these chips help to accelerate that task.

Typically, model training is done using a set of servers in a data center, accessed over the network. Amazon, Google and Microsoft deploy thousands of these types of servers to support their web services. Many AMR, drone and robot manufacturers around the world are deploying the NVIDIA Jetson as an embedded controller, and this class of technology is not included in any restrictions.
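
The split described above, training on powerful data center GPUs versus running inference on embedded modules, is worth making concrete. Below is a minimal, hypothetical PyTorch sketch of a single training step pushed onto a CUDA device; the model, data and hyperparameters are placeholders rather than anything from a real robotics workload, and the script falls back to the CPU when no GPU is present.

```python
import torch
import torch.nn as nn

# Placeholder perception model and synthetic batch: stand-ins, not a real robotics workload.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(1024, 512, device=device)      # one batch of feature vectors
targets = torch.randint(0, 10, (1024,), device=device)

# One training step: the forward pass, backward pass and parameter update all run on the GPU.
optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.4f}")
```

The heavy matrix math in the forward and backward passes, repeated over millions of batches, is exactly the kind of work that A100- and H100-class parts are built to accelerate.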

A source said if the NVIDIA block “is only on Ampere and Hopper, then that isn’t as severe as many (companies) still use the Jetson Xavier, which uses the predecessor to Ampere and works quite well.” However, the source added if future bans impact these lower-performance GPUs, then Chinese robotics companies could be in bad shape and could have to find alternative solutions.

Related impacts of US policy decisions on robotics

US policy has impacted Chinese robotics manufacturers in other ways recently.

In November 2021, the SPAC merger for autonomous trucking company Plus was blocked because of “developments in the regulatory environment outside of the United States.” At the time the story was reported, Plus had a partnership with China’s FAW, which is the world’s largest heavy truck manufacturer. Plus was working with some of the largest fleets in the US and China to pilot commercial freight operations.

A source told The Robot Report the US military is rejecting exoskeletons where the only part that was made in China is the cloth around them. 

Chinese company HIK Robotics has been banned from selling in the US due in part to the use of its cameras and vision technology by the Chinese government for surveillance operations. As a result, the company is focusing on sales in China and Korea.

FORT’s NSC sends commands to robots wirelessly
https://www.therobotreport.com/forts-nsc-sends-commands-to-robots-wirelessly/
Tue, 05 Apr 2022

FORT Robotics’ Nano Safety Controller. | Source: FORT Robotics

FORT Robotics announced its Nano Safety Controller (NSC) at MODEX 2022 last week. The NSC is an embeddable board that lets users build FORT wireless communication and safety technology directly into their machines. It’s the latest product in the company’s line of safety and security technology. 

With the NSC embedded, users are able to send commands through the company’s patented technology for safety and data integrity without any bolt-on hardware. Users can wirelessly send emergency stop, sprint or crawl, change-of-state and other critical commands.

Users have the ability to control large deployments of robots by embedding NSC into each robot and sending commands to them simultaneously. 

NSC, along with a FORT software agent, is able to communicate over almost any network, including Wi-Fi and other IP networks. The company utilizes black channel communication principles to isolate and transmit critical safety data, regardless of the network. The controller also comes with error detection and latency control built in.
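
Black channel communication, in general terms, treats the underlying network as untrusted and attaches the integrity checks to the safety message itself, so corruption, loss, reordering or excessive delay can be detected at the endpoints. The sketch below illustrates only that general principle; it is not FORT’s protocol, and the field layout, CRC choice and timeout value are assumptions made for illustration.

```python
import struct
import time
import zlib


def pack_safety_msg(seq: int, command: int) -> bytes:
    """Wrap a safety command with a sequence number, timestamp and CRC.

    Illustrative only: the black channel idea in miniature, not FORT's format.
    """
    body = struct.pack("<IId", seq, command, time.time())
    return body + struct.pack("<I", zlib.crc32(body))


def check_safety_msg(msg: bytes, expected_seq: int, max_age_s: float = 0.1) -> bool:
    """Accept a message only if its CRC, sequence number and age all check out."""
    body, (crc,) = msg[:-4], struct.unpack("<I", msg[-4:])
    if zlib.crc32(body) != crc:
        return False                                 # corrupted in transit
    seq, _command, sent_at = struct.unpack("<IId", body)
    if seq != expected_seq:
        return False                                 # lost, duplicated or reordered
    return (time.time() - sent_at) <= max_age_s      # too stale: treat as a failure and stop


# Example: a hypothetical "stop" command (code 0) survives the round trip intact.
msg = pack_safety_msg(seq=1, command=0)
print(check_safety_msg(msg, expected_seq=1))         # True on an intact, timely message
```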

“Until now, customers who wanted our trusted command technology have had to use our bolt-on hardware. That’s not always ideal for every project,” Nathan Bivans, FORT CTO, said. “With the NSC, we can provide a solution that’s smaller, more flexible, more integrated and cost effective for large numbers of machines.”
 
NSC is designed to meet ISO 13849 and IEC 61508 SIL 2 standards. 
 
“Safety is at the core of everything we do, and it’s hard to get this level of safety integrity over IP networks,” Dave Sullivan, principal product manager at FORT, said. “We believe this is a game changer.”
 
FORT was founded in 2018 by now-CEO Samuel Reeves. It was Reeves’ second company at the time. Over a decade earlier, in 2006, he created Humanistic Robotics to produce an autonomous solution to safely clear landmines. Through the development of that technology, Reeves saw an opportunity to create a safety-and-security overlay that can be easily integrated into next-generation autonomous machines across multiple industries.
 
FORT’s other safety products include: 
  • Wireless E-Stop: A wireless button that can stop any machine from a safe distance. 
  • Safe Remote Control: A remote controller that allows for safer autonomous systems by taking manual control of machines, or stopping a machine remotely with a built-in emergency stop button.
  • Vehicle Safety Controller: A transceiver that acts as an input, output or bridge. 
  • Endpoint Controller: A controller that can send, receive and execute trusted commands over wireless networks. 
  • Fort Manager Software: A cloud application that allows users to monitor their FORT Pro devices. 
 

NVIDIA accelerates mobile robot development with Isaac Nova Orin
https://www.therobotreport.com/nvidia-accelerates-mobile-robot-with-development-of-isaac-nova-orin/
Tue, 22 Mar 2022

State-of-the-art computing and sensor architecture powered by the new Jetson platform enables next-gen 3D-sensing for AMRs.

Next time socks, cereal or sandpaper show up at your doorstep within hours, consider the behind-the-scenes logistics acrobatics that help get them there so fast.

Order fulfillment is a massive industry of moving parts. Heavily supported by autonomous mobile robots (AMRs), warehouses can span 1 million square feet, expanding and reconfiguring to meet demands. It’s an obstacle course of workers and bottlenecks for hospitals, retailers, airports, manufacturers and others.

To accelerate development of these AMRs, we’ve introduced Isaac Nova Orin, a state-of-the-art compute and sensor reference platform. It’s built on the powerful new NVIDIA Jetson AGX Orin edge AI system, available today. The platform includes the latest sensor technologies and high-performance AI compute capability.

New Isaac Software Arrives for AMR Ecosystem

In addition to Nova Orin, which will be available later this year, we’re delivering new software and simulation capabilities to accelerate AMR deployments — including hardware-accelerated modules, or Isaac ROS GEMs, that are essential for enabling robots to visually navigate. That’s key for mobile robots to better perceive their environment to safely avoid obstacles and efficiently plan paths.

New simulation capabilities, available in the NVIDIA Isaac Sim April release, will help save time when building virtual environments to test and train AMRs. Using 3D building blocks, developers can rapidly create realistic complex warehouse scenes and configurations to validate the robot’s performance on a breadth of logistics tasks.

An accompanying video shows how AMRs can leverage visual navigation using Isaac Sim and Isaac ROS.

Isaac Nova Orin Key Features

Nova Orin comes with all of the compute and sensor hardware needed to design, build and test autonomy in AMRs.

Its two Jetson AGX Orin units provide up to 550 TOPS of AI compute for perception, navigation and human-machine interaction. These modules process data in real time from the AMR’s central nervous system — essentially the sensor suite comprising up to six cameras, three lidars and eight ultrasonic sensors.

Nova Orin includes tools necessary to simulate the robot in Isaac Sim on Omniverse, as well as support for numerous ROS software modules designed to accelerate perception and navigation tasks. Tools are also provided for accurately mapping the robots’ environment using NVIDIA DeepMap.

The entire platform is calibrated and tested to work out of the box and give developers valuable time to innovate on new features and capabilities.

Diagram of the 3D Sensor Field of Isaac Nova Orin. | Credit: Nvidia

Enabling the Future

Much is at stake in intralogistics for AMRs, a market expected to top $46 billion by 2030, up from under $8 billion in 2021, according to estimates from ABI Research.

The old method of designing the AMR compute and sensor stack from the ground up is too costly in time and effort. Tapping into an existing platform allows manufacturers to focus on building the right software stack for the right robot application.

Improving productivity for factories and warehouses will depend on AMRs working safely and efficiently side by side at scale. High levels of autonomy driven by 3D perception from Nova Orin will help drive that revolution.

As AMRs evolve, the need for secure deployment and management of the critical AI software on board is paramount. Over-the-air software management support is already pre-integrated in Nova Orin.

Learn more about Nova Orin and the complete Isaac for AMR platform.

 

 

NVIDIA announces availability of Jetson AGX Orin Developer Kit
https://www.therobotreport.com/nvidia-announces-availability-of-jetson-agx-orin-developer-kit/
Tue, 22 Mar 2022

NVIDIA has over one million developers now deploying on Jetson, with John Deere, Medtronic, Hyundai Robotics, Komatsu and Meituan among early adopters.

NVIDIA Jetson AGX Orin supersedes the Jetson AGX Xavier. | Credit: Nvidia

During GTC 2022, NVIDIA announced the availability of the NVIDIA Jetson AGX Orin developer kit, a powerful, compact and energy-efficient AI supercomputer for advanced robotics, autonomous machines, and next-generation embedded and edge computing.

The Jetson AGX Orin delivers unprecedented edge compute performance up to 275 trillion operations per second, giving customers over 8x the processing power of its predecessor, the Jetson AGX Xavier. The new unit retains the same palm-sized form factor and pin compatibility of the Xavier at a similar price point.

For autonomous mobile robot (AMR) developers, this new compute platform will enable more complex algorithms, and sensor fusion use cases while pushing forward the capabilities of AMRs.

Diagram of the 3D Sensor Field of Isaac Nova Orin. | Credit: Nvidia

It features an NVIDIA Ampere architecture GPU, Arm Cortex-A78AE CPUs, next-generation deep learning and vision accelerators, high-speed interfaces, faster memory bandwidth and multimodal sensor support to feed multiple, concurrent AI application pipelines.

“As AI transforms manufacturing, healthcare, retail, transportation, smart cities and other essential sectors of the economy, demand for processing continues to surge,” said Deepu Talla, vice president of Embedded and Edge Computing at NVIDIA. “A million developers and more than 6,000 companies have already turned to Jetson. The availability of Jetson AGX Orin will supercharge the efforts of the entire industry as it builds the next generation of robotics and edge AI products.”

NVIDIA announced that customers using Jetson AGX Orin can leverage the full NVIDIA CUDA-X accelerated computing stack, NVIDIA JetPack SDK, pretrained models from the NVIDIA NGC catalog and the latest frameworks and tools for application development and optimization such as NVIDIA Isaac on Omniverse, NVIDIA Metropolis, and NVIDIA TAO Toolkit.

This reduces time and cost for production-quality AI deployments, allowing developers to access the largest, most complex models needed to solve robotics and edge AI challenges in 3D perception, natural language understanding, multi-sensor fusion and more.

Broad Customer and Ecosystem Support

Jetson AGX Orin has received strong feedback from the robotics and embedded computing ecosystem, through early testing from customers including Microsoft Azure, John Deere, Medtronic Digital Surgery, AWS, Hyundai Robotics, JD.com, Komatsu, Meituan and many more.

“We are extending the powerful Microsoft Azure platform to the intelligent edge. Combining Azure’s advanced capabilities with performance and software development tools such as NVIDIA Jetson AGX Orin helps give developers a seamless experience to easily build, deploy and operate production-ready AI applications.” — Roanne Sones, corporate vice president, Microsoft Azure Edge + Platforms

“With the global population expected to reach nearly 10 billion people by 2050, farmers have a steep challenge of feeding the world and they can’t do it alone. With less available land and labor, and many variables to work through, deploying and scaling advanced technology like autonomy is key to building a continually smart, evolving and more efficient farm. Our fully autonomous tractor, featuring two NVIDIA Jetson GPUs for quick and accurate image classification at the edge, will be on farms this year, supporting farmers in overcoming challenges and providing for our growing world.” — Jahmy Hindman, chief technology officer at John Deere

“As a recognized medical technology leader, Medtronic continues to innovate and advance solutions to improve surgical patient care. We recognize the key role for AI in digitization of surgery through quantitative analytics and real-time clinical decision support systems. The latest NVIDIA Jetson platform brings us a new level of computational performance in the operating room and enables us to advance intraoperative systems to better support surgeons, through data-enabled solutions.” — Dan Stoyanov, chief scientific officer at Medtronic Digital Surgery

“Advances in edge AI and robotics are reshaping entire industries by overcoming rising costs and limitations in labor and materials. Every industry will benefit from AI and robotics in the future, and 2022 is proving to be a key tipping point. Combined with NVIDIA pretrained AI models, frameworks like TAO toolkit and Isaac on Omniverse, and supported by the Jetson developer community and its partner ecosystem, Jetson AGX Orin offers a scalable AI platform with unmatched resources that make it easy to adapt to almost any application.” — Jim McGregor, principal analyst at TIRIAS Research

The Jetson embedded computing partner ecosystem encompasses a broad range of services and products, including cameras and other multi-modal sensors, carrier boards, hardware design services, AI and system software, developer tools and custom software development.

Jetson AGX Orin Pricing and Availability

The NVIDIA Jetson AGX Orin developer kit is available now at $1,999. Production modules will be available in the fourth quarter starting at $399.

To learn more about Jetson AGX Orin, watch the GTC 2022 keynote from Jensen Huang. Register for GTC 2022 for free to attend sessions with NVIDIA and industry leaders.

ABB adds 2 OmniCore robot controllers
https://www.therobotreport.com/abb-adds-2-omnicore-robot-controllers/
Tue, 14 Dec 2021

ABB OmniCore E10 Controller. | Photo Credit: ABB

ABB is adding two new members to its OmniCore controller family. Available for a range of robots across ABB’s portfolio, the new E10 and V250XT controllers extend the possibilities for enhanced robot control in a variety of industries, from electronics assembly to automotive, logistics, and general manufacturing.

ABB said both controllers offer increased scalability through the inclusion of over 1,000 hardware and software functions encompassing areas such as programming, offline commissioning and simulation, maintenance, vision and safety. The controllers also feature built-in connectivity to ABB Ability Connected Services cloud-based service suite for robots.

Featuring a slimline 19-inch rack-mount design, the E10 is designed for confined space and high-density production lines where space saving is a key requirement, such as small parts assembly and material handling in the electronics industry. The E10 controller is designed to power ABB’s SCARA robots and articulated robots with payloads of up to 11kg such as the IRB 920T and IRB 1300.

Designed to power articulated robots with a payload of up to 300kg such as IRB 6700, the V250XT controller is designed for use in electric vehicle production, automotive manufacturing, logistics and general industrial applications.

ABB said the OmniCore E10 and V250XT controllers consume up to 20% less energy compared to its previous IRC5 controllers. Both controllers are used with ABB’s FlexPendant hand-held controller. The large 8” multi-touch display supports standard gestures, such as pinch, swipe, and tap to simplify robot programming. FlexPendant is “hot-swappable”, which means it can be unplugged or re-connected without interrupting ongoing production. This allows the FlexPendant to be shared between multiple robots, accelerating robot deployment and minimizing costs.

All of ABB’s OmniCore controllers are based on the company’s RobotWare operating system. The OmniCore controller family will be extended across ABB’s robot portfolio and more applications will follow into 2022.

“The growing demands of the industry for quicker, more diverse production and greater responsiveness to changing market conditions call for solutions that bring new levels of speed, accuracy and flexibility,” said Antti Matinlauri, head of product management for ABB Robotics. “As part of ABB’s OmniCore controller family, the E10 and V250XT provide manufacturers with an expanded range of possibilities, enabling them to maximize their productivity and meet changing demands with minimum downtime.”

Miniature controllers from FAULHABER control a range of dc motors
https://www.therobotreport.com/miniature-controllers-faulhaber-control-range-of-dc-motors/
Mon, 19 Apr 2021

FAULHABER’s MC 3001 B (foreground) and MC 3001 P (middle), shown with a matching motherboard from the starter kit in the background. | Photo Credit: FAULHABER

The new MC 3001 controllers are unhoused versions of the FAULHABER motion controllers. By means of an integrated output stage with optimized current measurement, they can control dc micromotors, linear dc servomotors or brushless dc motors from the company’s product line, from 6 to 30 mm.

With an overall height starting at 2.6 mm and a footprint starting at 16 x 27 mm, the new motion controllers are extremely miniaturized. They feature high control dynamics and can be operated at 1.4 A in continuous operation and with up to 5 A peak current. With these new variants, FAULHABER has rounded off its motion controller portfolio at the lower end.

These thumb-sized controllers feature the same functionality as well as the same interfaces (RS232 and CANopen) and encoders as the other more powerful products of the MC V3.0 generation. As an intelligent driver module, they are especially well suited for installation in customer-specific applications. The full thermal protection of the motors is ensured with the integrated thermal models and by means of the high PWM frequency.

The new MC 3001 motion controllers are ideally suited to applications in robotics, automation technology, machine construction, and medical and laboratory technology. Applications in these areas often have limited space yet call for high control dynamics and high performance.

The controllers are available in two variants: the MC 3001 B can be plugged into a motherboard with three micro board-to-board connectors, whereas the MC 3001 P can be plugged in via a 28-pin plug connector. To help customers quickly and easily get to work on the development of their drive system, FAULHABER offers a starter kit that includes, among other things, a motherboard. In addition, up to six different motherboard variants (depending on the variant of the motion controller and the motor used) are available. To meet specific customer needs, other boards can be created that may also include an EtherCAT interface, for example.

The motion controllers are designed for follower operation and can be easily and quickly combined with a number of higher-level leader systems via standard interfaces. After basic commissioning via Motion Manager, the controllers can alternatively also be operated at any time in stand-alone mode by means of integrated sequence programs.
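
Because the controllers expose standard RS232 and CANopen interfaces, a higher-level leader system typically talks to them over a fieldbus. As a rough illustration of what that can look like from the leader side, here is a python-can sketch that writes a velocity setpoint using a generic CANopen expedited SDO download; the assumption that the drive follows CiA 402-style object addressing, along with the node ID, channel name and velocity value, is illustrative rather than taken from FAULHABER documentation.

```python
import struct

import can  # python-can

NODE_ID = 1                      # hypothetical CANopen node ID of the drive
TARGET_VELOCITY_INDEX = 0x60FF   # CiA 402 "target velocity" object (assumed profile)


def sdo_write_i32(bus: can.BusABC, node_id: int, index: int, subindex: int, value: int) -> None:
    """Expedited SDO download of a 32-bit value (generic CANopen, illustrative only)."""
    data = bytes([0x23, index & 0xFF, (index >> 8) & 0xFF, subindex]) + struct.pack("<i", value)
    bus.send(can.Message(arbitration_id=0x600 + node_id, data=data, is_extended_id=False))


if __name__ == "__main__":
    # The SocketCAN channel name is an assumption; adjust for the actual CAN hardware.
    with can.Bus(channel="can0", interface="socketcan") as bus:
        sdo_write_i32(bus, NODE_ID, TARGET_VELOCITY_INDEX, 0x00, 1000)  # velocity in drive units
```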

Editor’s Note: This article first appeared on our sister publication Motion Control Tips.

Yaskawa taps Realtime Robotics’ motion planning expertise
https://www.therobotreport.com/yaskawa-taps-realtime-robotics-motion-planning/
Tue, 26 Jan 2021

Lee Moulder, sales and application director, Yaskawa Nordic. Yaskawa partnered with Realtime Robotics.

Yaskawa is tapping Realtime Robotics’ motion planning technology to improve a variety of materials handling and fulfillment applications, including piece picking and mixed case palletizing. Yaskawa said using Realtime’s technology enables its robot cells to be deployed and used more efficiently.

The first of these mixed pallet robot cells will soon start being installed at customer sites. Each cell consists of Realtime’s motion planning solution with two Motoman-GP180 robots with YRC1000 control system, servo grippers, roller conveyors, and safety fencing.

The companies said their joint multi-robot architecture empowers a streamlined approach to deployments by eliminating the need for programming interference zones. This results in robotic work cells that have smaller footprints and higher outputs.

“Planning motion in real time is central to safe autonomy, but the algorithms were too slow,” George Konidaris, founder and chief roboticist at Realtime, recently told The Robot Report. “The core breakthroughs in motion began in 1979 with an MIT paper, but industrial robotics hadn’t changed much in 40 years.”

“At Duke University, we figured out how to make time-consuming processes go faster. The motion-planning algorithms were good but sequential; we needed massive parallelism,” he added. “We’ve blown open what you can do with stupid robots, now that they can adapt to changing workspaces.”

“We are pleased to be further intensifying and expanding our industrial control technology,” said Lee Moulder, sales and application director at Yaskawa Nordic. “The partnership with Realtime combines control for logic, motion and robotics with solutions for Industry 4.0 applications.”

Realtime Robotics is a startup that spun out of Duke University in 2016. It also was once a resident of MassRobotics, the Boston-based non-profit group that serves as an innovation hub for robotics and smart connected devices.

Realtime Robotics has raised about $16 million to date. It raised an $11.7 million Series A in October 2019 and then a $2 million Venture Round in late 2020.

To learn more about how Realtime Robotics is helping robots avoid collisions, read this profile of the company. It covers how the company was founded, its motion planning technology and more.

ABB introduces a new ROS driver for its robots
https://www.therobotreport.com/abb-introduced-new-ros-driver-robots/
Mon, 28 Dec 2020

A new ROS driver for ABB robots was introduced at the recent ROS-Industrial Conference. The driver, which is now available on GitHub, is designed to ease interaction between ABB robot controllers and ROS-based systems by providing ready-to-run ROS nodes.

Here is a look at the principal included packages, with brief descriptions of each (a usage sketch follows the list):

abb_rws_state_publisher: Provides a ROS node that continuously polls an ABB robot controller for system states, which then are parsed into ROS messages and published to the ROS system.

abb_rws_service_provider: Provides a ROS node that exposes ROS services, for discrete interaction with an ABB robot controller, like starting/stopping the RAPID program and reading/writing of IO-signals.

abb_egm_hardware_interface: This package, which is only recommended for advanced users, provides ROS nodes for:

  • Running a ros_control-based hardware interface, for direct motion control of ABB robots (via the Externally Guided Motion (EGM) interface).
  • Automatically stopping ros_control controllers when EGM communication sessions end (a user-provided list can specify controllers that are OK to keep running).
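
For a flavor of how a ROS application might consume these nodes, the sketch below subscribes to a state topic and calls a service to start the RAPID program. The topic, service and message types here are placeholders chosen for illustration only; the actual interfaces are defined by the abb_rws_state_publisher and abb_rws_service_provider packages and should be checked against the GitHub repository.

```python
#!/usr/bin/env python
# Illustrative only: the topic, service and message types below are placeholders,
# not the actual interfaces of the abb_rws_* packages; check the GitHub repository.
import rospy
from sensor_msgs.msg import JointState   # placeholder state message type
from std_srvs.srv import Trigger         # placeholder service type


def on_state(msg):
    rospy.loginfo("joint positions: %s", list(msg.position))


if __name__ == "__main__":
    rospy.init_node("abb_driver_client_example")
    rospy.Subscriber("/robot_states", JointState, on_state)   # hypothetical topic name

    rospy.wait_for_service("/start_rapid")                    # hypothetical service name
    start_rapid = rospy.ServiceProxy("/start_rapid", Trigger)
    response = start_rapid()
    rospy.loginfo("start RAPID succeeded: %s", response.success)
    rospy.spin()
```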

The GitHub post mentioned the included packages have not been productized, so academia is the intended audience. The packages are provided “as-is,” and no more than limited support can be expected. The packages have mainly been tested with ROS Melodic (on both Ubuntu and Windows).

Jon Tjerngren, a corporate researcher at ABB AB (Sweden), led the development of the ROS driver. He recommended using the RobotWare StateMachine Add-In to ease the setup of the ABB robot controller system. The StateMachine Add-In is optional, but without it the driver nodes will only be able to provide basic interaction with ABB robots.

Tjerngren said the packages in this repository need to be ported over to ROS 2. The packages have been developed with this in mind, he said, and most of them should be straightforward to adapt to the ROS 2 APIs. Tjerngren also gave a talk at the 2018 ROS-Industrial Conference called “Ease-of-Use Packages between ROS and ABB Robots.”

The driver was developed during the European project ROSIN (ROS-Industrial Quality-Assured Robot Software Components), which received funding as part of the European Union’s Horizon 2020 research and innovation program.

Michael Ferguson, director of R&D at Cobalt Robotics, a San Mateo, Calif.-based developer of robotic security services, recently shared with us his five must-have features that would make ROS 2 ready for primetime.

MGS Manufacturing uses Stäubli robots, controls for inspection of medical devices
https://www.therobotreport.com/mgs-uses-staubli-robots-controls-inspect-medical-devices/
Thu, 19 Nov 2020

MGS, a contract manufacturer for a major European medical device maker, turned to Stäubli robots to support automated inspection of single-use catheters.

MGS developed two workcells including cameras, robots, and common controls. Source: Stäubli

In the healthcare industry, many plastic single-use devices such as catheters must be manufactured in high volumes in accordance with strict hygienic standards. MGS Manufacturing Group recently automated the inspection process for a major European manufacturer of two-material, two-shot molded products. It turned to two Stäubli TX2-60L six-axis robots to handle the catheters after inspection with high precision, and the robot controller is integrated into one programming platform.

Stäubli is a mechatronics solutions provider focusing on connectors, robotics, and textile equipment. Originally founded in 1892, the Pfäffikon, Switzerland-based company today operates 14 industrial production sites and has more than 5,500 employees across 60 locations in 29 countries. It said its network of agents in 50 countries provides innovative solutions to all industrial sectors.

Catheter maker faces inspection challenge

To maintain quality control, all catheters must be visually inspected before they are shipped. Previously, the manufacturer inspected and sorted the devices manually. This task alone required more than 30 dedicated workers.

This posed several problems, explained Shawn Krenke, vice president of MGS’s Equipment Division. “Irrespective of the location of the production site, it is very difficult to find such a number of qualified and committed employees,” he said. “Furthermore, the workers can become fatigued during the shift, and the inspection decision-making process can vary depending on the operator performing the task. This may have an impact on the results of quality control.”

The medical device maker understood that this task could benefit from automation and partnered with MGS. Germantown, Wis.-based MGS is a custom manufacturer that delivers tooling as well as molding and comprehensive equipment services, taking on the role of a single-source supplier on a worldwide basis. The company has several cleanroom production sites worldwide for healthcare customers, including a cleanroom molding factory in Ireland, where it molds the medical catheters.

Case study at a glance

Company: MGS Manufacturing Group
Location: Site in Ireland
Industry: Contract manufacturing of medical devices
Challenge: Visual inspection and sorting of catheters
Solution: TX2-60L six-axis robots, plus eight-position grippers and control software
Supplier: Stäubli
Task: Hygienic inspection and handling of products
Value driver: Maintaining quality control
Results: Two workcells can inspect, sort, and package 50 catheters per minute.

MGS develops cells with cameras, robot arms

MGS used its in-house expertise to develop two automated inspection and sorting cells for its Irish plant. It used four cameras for inspection and sorting, as well as robot arms and grippers for handling the catheters.

In the workcells, molded catheters are deposited into a hopper at the input side of the cell and then dispensed into a vibratory feeder system, where they are oriented correctly and delivered into a servo escapement. The escapement separates eight catheters at a time for pickup by a servo transfer robotic arm.

The transfer arm passes the catheters through each of the four inspection cameras, which scan for defects such as embedded particulates, unsightly material spotting, and pinholes in the tips. The catheters are also inspected for “material shorts” or other non-compliance issues that may have occurred during the two-material/two-shot molding process.

After the inspection is complete, a Stäubli TX2 six-axis robot picks the eight catheters off the transfer arm and separates those that failed inspection. As the system delivers inspection reports on each part, the robot knows which ones have passed or failed. The non-conforming products are discarded into a secured access non-compliance bin. All accepted catheters are layer-packed into the final reusable packaging tote.

The end-of-arm tooling on the robot features an eight-position gripper assembly that can individually select and release the catheter as required. Control of the eight grippers is accomplished using a pneumatic manifold mounted directly on the arm of the robot. Each cell can inspect, sort, and package over 50 catheters a minute.
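
Stripped of the hardware, the pass/fail handling described above reduces to a per-position decision once the cameras have reported their verdicts. The following is a schematic sketch of that decision with made-up names, not MGS’s actual control code.

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class InspectionResult:
    """Aggregated verdict from the four cameras for one catheter position (made-up type)."""
    position: int   # 1..8 on the transfer arm / eight-position gripper
    passed: bool


def sort_batch(results: list[InspectionResult]) -> tuple[list[int], list[int]]:
    """Split an eight-up batch into tote positions and reject-bin positions (schematic only)."""
    to_tote = [r.position for r in results if r.passed]
    to_reject_bin = [r.position for r in results if not r.passed]
    return to_tote, to_reject_bin


# Example: positions 3 and 7 failed inspection and go to the non-compliance bin.
batch = [InspectionResult(p, p not in (3, 7)) for p in range(1, 9)]
print(sort_batch(batch))   # ([1, 2, 4, 5, 6, 8], [3, 7])
```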

The MGS engineers made flexibility a high priority when designing the cells. “No tooling change is required when switching between variants,” said Craig Nisleit, an electrical engineer at MGS. “Furthermore, we designed the system to minimize changeover time. All process changes are handled electronically when the user selects a new recipe from the machine HMI [human-machine interface].”

After the catheters have been inspected, they are handled by two TX2-60L six-axis robots in compliance with stringent hygienic standards. Source: Stäubli

Common robotic control helps yield results

The robot of each of the two cells is integrated with a Rockwell Automation control platform and EtherNet/IP network and features an Allen-Bradley CompactLogix 5380 controller. This integrated system provides one programming platform for both the robot and the cell, which saves time and money and streamlines the design process, said Stäubli.

Stäubli said its uniVAL plc (programmable logic controller) is the tool that integrates the robot into the control platform of the complete cell. The uniVAL plc allows the CompactLogix controller to drive the robot through a fieldbus using simplified function blocks. MGS said that uniVAL was one important reason, though not the only one, why it chose to use Stäubli robots.

Another advantage of the TX2 series in this application is the closed housing, which facilitates a cell design complying with hygienic standards. Even in the series version, the robots fulfill the specifications of ISO cleanroom Class 5. For higher requirements, the Stäubli TX2 robots are available in special cleanroom versions.

Furthermore, the TX2 series is the appropriate robot for this application because of its compact size in relation to its maximum payload of 3.7 kg and maximum reach of 920 mm, said Stäubli. Another strong point is its repeatability of +/- 0.03 mm, said the company.

In addition, the MGS engineers cited “soft factors.” They said they recently had success with a different automation project involving Stäubli robots, and they appreciated the follow-up and front-line support of the company’s staff.

Freedom Robotics provides remote control support for European Rover Challenge
https://www.therobotreport.com/freedom-robotics-provides-remote-control-european-rover-challenge/
Thu, 12 Nov 2020

Freedom Robotics explains how it helped support the ERC Space and Robotics Event with teleoperation and remote monitoring software, as well as lessons learned for industrial automation.

Space exploration imposes the highest demands on remote-controlled and autonomous systems. In September, university teams from around the world participated in the ERC Space and Robotics Event, Europe’s largest competition of its kind. The ongoing coronavirus pandemic required the event’s organizers to quickly find a way for the teams to participate and be monitored remotely. Freedom Robotics ultimately provided the platform for competitors to teleoperate their robots in simulated Martian terrain.

“The European Rover Challenge is held every year in Poland and is typically more hardware-focused. They build a landing site and mimic tasks,” said Achille Verheye, lead robotics engineer at Freedom Robotics. “This year, COVID-19 made travel impossible, and our solution was introduced to the competition through partners.”

“It wouldn’t have been possible to hold the competition to the same standards as previous years without a tool to remotely control the robots and to develop on them,” said Lukasz Wilczynski, president of the European Space Foundation.

Freedom Robotics enabled plug-and-play competition

“With our system, stock hardware can run their software remotely from all over the world,” Verheye told The Robot Report. “We didn’t have to do any customization. The teams were like our startup customers focused on debugging, and the judges were like our enterprise customers, which are managing fleets.”

“One of the two main tasks in the ERC challenge was using a Mars rover to drive around a field,” he explained. “It was the same for all teams, which had to improve on the software to drive manually or load algorithms and use Freedom to work on those algorithms and check inputs and outputs.”

“Another part of the competition was remote controlling a UR robot arm,” said Verheye. “They could only see what was on the camera on the table close to the robot. Seeing only part of the data was harder. The teams used an Xbox controller to control the end effector to conduct three tasks, including plugging in a power cable or using a panel like on the International Space Station or in building a base on Mars.”

The pilot view of the collaborative robot operation. Source: Nick Cortes, Freedom Robotics
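
Mapping a game controller onto end-effector motion, as the teams did with the UR arm, is a common pattern in ROS-based teleoperation. The sketch below shows that generic pattern, turning joystick axes into a Cartesian velocity command; it is not Freedom Robotics’ or any team’s actual code, and the topic names, axis indices and scaling are placeholders.

```python
#!/usr/bin/env python
# Generic gamepad-to-Cartesian-velocity teleop pattern (illustrative only).
# Topic names, axis indices and scaling are placeholders, not Freedom Robotics' interfaces.
import rospy
from geometry_msgs.msg import Twist
from sensor_msgs.msg import Joy

SCALE = 0.1  # m/s per full stick deflection (assumed)


class JoyTeleop:
    def __init__(self):
        self.pub = rospy.Publisher("/ee_velocity_cmd", Twist, queue_size=1)
        rospy.Subscriber("/joy", Joy, self.on_joy)

    def on_joy(self, msg):
        cmd = Twist()
        cmd.linear.x = SCALE * msg.axes[1]   # left stick up/down: forward/back
        cmd.linear.y = SCALE * msg.axes[0]   # left stick left/right: sideways
        cmd.linear.z = SCALE * msg.axes[4]   # right stick up/down: raise/lower
        self.pub.publish(cmd)


if __name__ == "__main__":
    rospy.init_node("joy_teleop_example")
    JoyTeleop()
    rospy.spin()
```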

“Freedom’s support team was indispensable for making the competition run smoothly. Being forced to have the teams remote was a blessing in disguise in some ways,” said Krzysztof Walas, assistant professor at the Poznan University of Technology. “It actually forced us to replicate similar scenarios as driving robots on Mars, where teams can only see the robots from the robot’s camera perspective.”

With only a 20-day turnaround, how much training on the company’s platform did the teams need for teleoperation or loading algorithms?

“It was pretty light — the ERC competition had a simulation environment, and we gave a one-hour webinar,” said Nicholas Cortes, head of business intelligence at Freedom Robotics. Cortes studied mechanical engineering at Columbia University and dealt with the competition for the company.

Simulation, streaming, and performance

“We helped with the simulation environment so it would work the same day,” Verheye noted. “The ROS [Robot Operating System] topics were the same as for the Freedom Robotics platform. In the case of real world versus simulation, they had already scanned the field. Even that wasn’t a new use case for us, since we have customers that create simulated environments to test before hardware comes.”

With up to 40,000 live streams of the competition, the ERC organizers needed software that could scale. “Our infrastructure is built to scale with such a load,” said Verheye. “Teams were jumping in and out, so they were not pushing our limits.”

The stream view of cobot operation. Source: Nick Cortes, Freedom Robotics

“The time difference did pose a challenge for communications and spectators, with teams in different rooms and time zones, just as it would for space,” he acknowledged. “It was amazing to see our footage in first-person view, as well as judging cameras around the field. The organizers would hide artifacts such as a nozzle in the ground and award teams a bonus if they found them.”

Verheye compared the rover challenge to gamification for robotic delivery or pick-and-place operations. “It’s something we’re actively working toward, like providing statistics on a screen to show how well the operator and robot are performing in one location versus another location,” he said. “We can see stats for operators trying to beat certain delivery times, as well as for fleets.”

“Industrial robots fail quite a lot,” Verheye said. “Lowest-level operators might not notice or just fix it. With Freedom Robotics, both operators and managers can see which robots are causing problems, and they can see huge improvements.”

Working toward a future of full autonomy

Freedom Robotics’ platform can help robots become more autonomous, claimed Verheye. “There’s an analogy with driverless cars,” he said. “While they’re not widely approved until they get it 100% right, Tesla got out on the roads right away to collect data. We’re giving tools to help operators intervene.”

The use of a video game controller for teleoperation was another novelty that has industrial implications, said Verheye. “The teams were the first to test it, and it really opens up a lot of customers,” he said. “For instance, with welding robots, you could previously only monitor an industrial arm and send fixed commands, but now you could just take over the robot, troubleshoot, and keep the production line running.”

The ERC Mars landscape competition grounds. Source: Nick Cortes, Freedom Robotics

Freedom Robotics’ technology worked so well that the European Rover Challenge expects to use it again next year.  “Quarantine or no quarantine, the next edition will have the remote-control aspect to it as well,” said Wilczynski.

“We’re super-excited about next year,” Verheye said. “Even if everything reopens, we will still try to put the teams in different rooms. We enabled competitors and organizers to have visibility, just with smaller companies and our enterprise customers.”

University of Leeds scientists develop robot-assisted, semi-autonomous colonoscopy
https://www.therobotreport.com/university-of-leeds-scientists-develop-robot-assisted-semi-autonomous-colonoscopy/
Tue, 13 Oct 2020

Scientists at the University of Leeds said their development of magnet-guided, semi-autonomous robotic colonoscopy could make the procedure easier for examiners and less painful for patients.

Scientists at the University of Leeds said yesterday that they have made a breakthrough in the development of systems for semi-autonomous colonoscopy, in which a robot guides a medical device into the human body.

They said their findings mark progress toward an intelligent robotic system being able to guide instruments to precise locations to examine internal tissues or take biopsies. A doctor or nurse would still be on hand to make clinical decisions, but the demanding task of manipulating the device could be offloaded to a robotic system.

Robots intended to facilitate vital screening

Colorectal cancer is the third most commonly diagnosed malignancy in the world, according to medical journals published by BMJ Publishing Group Ltd.

“Colonoscopy gives doctors a window into the world hidden deep inside the human body, and it provides a vital role in the screening of diseases such as colorectal cancer,” stated Pietro Valdastri, the professor of robotics and autonomous systems at the University of Leeds’ School of Electronic and Electrical Engineering who is leading the research. “But the technology has remained relatively unchanged for decades.”

Conventional colonoscopy is carried out using a semi-flexible tube, which is inserted into the anus in a process some patients find so painful they require an anesthetic.

“What we have developed is a system that is easier for doctors or nurses to operate and is less painful for patients,” Valdastri said. “It marks an important step in the move to make colonoscopy much more widely available — essential if colorectal cancer is to be identified early.”

Because the system is easier to use, the scientists said they hope this can increase the number of providers who can perform the procedure and allow for greater patient access to colonoscopies.

The robotic arm houses a magnet that interacts with magnets on a small capsule inside the patient and is able to navigate the capsule to the correct spot inside the colon. Source: University of Leeds

University of Leeds builds magnetic, flexible scope

The research team at the University of Leeds has developed a smaller, capsule-shaped device that is tethered to a narrow cable and is inserted into the anus and then guided into place — not by the doctor or nurse pushing the colonoscope but by a magnet on a robotic arm positioned over the patient.

The robotic arm moves around the patient as it magnetically maneuvers the capsule. The magnet on the outside of the patient interacts with tiny magnets in the capsule inside the body, navigating it through the colon. The researchers say it will be less painful than having a conventional colonoscopy.

Guiding the robotic arm can be done manually, but it is a technique that is difficult to master. In response, the researchers have developed different levels of robotic assistance. This latest research evaluated how effective the different levels of robotic assistance were in aiding non-specialist staffers to carry out the procedure.

University of Leeds describes levels of robotic assistance

  • Direct robot control: This is where the operator has direct control of the robot via a joystick. In this case, there is no assistance.
  • Intelligent endoscope teleoperation: The operator focuses on where they want the capsule to be located in the colon, leaving the robotic system to calculate the movements of the robotic arm necessary to get the capsule into place.
  • Semi-autonomous navigation: The robotic system autonomously navigates the capsule through the colon, using computer vision — although this can be overridden by the operator.

Levels of autonomy of colonoscopy. Source: University of Leeds

During a laboratory simulation, 10 non-expert staffers were asked to get the capsule to a point within the colon within 20 minutes. They did that five times, using the three different levels of assistance.

Using direct robot control, the participants had a 58% success rate. That increased to 96% using intelligent endoscope teleoperation — and 100% using semi-autonomous navigation.

In the next stage of the experiment, two participants were asked to navigate a conventional colonoscope into the colon of two anesthetized pigs — and then to repeat the task with the magnet-controlled robotic system using the different levels of assistance. A veterinarian was in attendance to ensure the animals were not harmed.

The participants were scored on the NASA Task Load Index, a measure of how physically and mentally taxing a task is. The scores showed that participants found it easier to operate the colonoscope with robotic assistance. Frustration was a major factor both when operating the conventional colonoscope and when participants had direct control of the robot.

NASA ratings

NASA Task Load Index. Source: University of Leeds
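
For readers unfamiliar with the metric: NASA-TLX asks operators to rate a task on six subscales — mental demand, physical demand, temporal demand, performance, effort, and frustration — each from 0 to 100, and the widely used "raw TLX" score is simply their mean, with higher meaning heavier workload. The study's exact scoring procedure is not detailed here, so the snippet below is only an illustration of the metric, with made-up numbers.

```python
def raw_tlx(ratings):
    """Raw (unweighted) NASA-TLX: the mean of six subscale ratings, each 0-100."""
    subscales = ("mental_demand", "physical_demand", "temporal_demand",
                 "performance", "effort", "frustration")
    missing = [s for s in subscales if s not in ratings]
    if missing:
        raise ValueError(f"missing subscales: {missing}")
    return sum(ratings[s] for s in subscales) / len(subscales)

# Made-up numbers for a frustrating, mentally demanding run (higher = heavier workload).
print(raw_tlx({"mental_demand": 80, "physical_demand": 55, "temporal_demand": 60,
               "performance": 40, "effort": 75, "frustration": 85}))  # about 65.8
```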

“Operating the robotic arm is challenging,” said James Martin, a Ph.D. researcher at the University of Leeds who co-led the study. “It is not very intuitive, and that has put a brake on the development of magnetic flexible colonoscopes. But we have demonstrated for the first time that it is possible to offload that function to the robotic system, leaving the operator to think about the clinical task they are undertaking — and it is making a measurable difference in human performance.”

Robots could make care more accessible

“Robot-assisted colonoscopy has the potential to revolutionize the way the procedure is carried out,” said Dr. Bruno Scaglioni, a postdoctoral research fellow at the University of Leeds and co-leader of the study. “It means people conducting the examination do not need to be experts in manipulating the device. That will hopefully make the technique more widely available, where it could be offered in clinics and health centers rather than hospitals.”

The other institutions involved in the research are Vanderbilt University in the U.S., Leeds Teaching Hospitals NHS Trust in the U.K., and the University of Torino in Italy. Team RoboFORCE research was named a finalist in the KUKA Innovation Award late last year.


The techniques developed to conduct colonoscopy examinations could be applied to other endoscopic devices in healthcare, such as those used to inspect the upper digestive tract or lungs, said the University of Leeds.

The latest findings — “Enabling the future of colonoscopy with intelligent and autonomous magnetic manipulation” — are the culmination of 12 years of research by an international team of scientists led by the University of Leeds. The research was published today in the scientific journal Nature Machine Intelligence.

Patient trials using the system could begin next year or in early 2022, said the University of Leeds researchers.

The post University of Leeds scientists develop robot-assisted, semi-autonomous colonoscopy appeared first on The Robot Report.

]]>
https://www.therobotreport.com/university-of-leeds-scientists-develop-robot-assisted-semi-autonomous-colonoscopy/feed/ 0
Titan Medical obtains robotic surgery patents for camera and gesture control https://www.therobotreport.com/titan-medical-sport-obtains-patents-robotic-surgery-camera-control/ https://www.therobotreport.com/titan-medical-sport-obtains-patents-robotic-surgery-camera-control/#respond Sat, 19 Sep 2020 14:30:42 +0000 https://www.therobotreport.com/?p=106536 Titan Medical, which has been developing the Sport single-port robotic surgical system, has received U.S. patents for an in-body camera positioning system and hand gesture controls.

The post Titan Medical obtains robotic surgery patents for camera and gesture control appeared first on The Robot Report.

]]>

Titan Medical Inc., which has been developing the Sport surgical robot, this week announced that it has received two U.S. patents. One is for methods for positioning a camera during a surgical procedure, and the other covers a gesture-control system. The Toronto-based company said its global intellectual property portfolio now includes 58 issued patents and 84 patent applications. It said it expects to be issued 17 other patents in the coming months.

Sport is a single-port robotic system for minimally invasive surgery. It consists of a surgeon-controlled patient cart with a dual-view camera system, 3D and 2D high-definition vision systems, and multi-articulating instruments. Sport also includes a surgeon workstation that provides an ergonomic interface to the patient cart and a 3D high-definition endoscopic view of the procedure. The company said it plans to initially pursue gynecologic surgeries.

After pausing development last November because of fundraising challenges, Titan Medical partnered with Medtronic PLC and resumed work this past summer.

Sport maker patents vision, hand controls

U.S. Patent No. 10,772,703, titled “Methods and apparatuses for positioning a camera of a surgical robotic system to capture images inside a body cavity of a patient during a medical procedure,” is directed at the autonomous positioning of a camera in response to the location of a surgical instrument.
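
As a purely illustrative sketch — not Titan Medical's patented method — the general idea of repositioning a camera in response to an instrument's location can be captured with a simple follow controller that holds the view until the instrument tip drifts out of a deadband, then nudges the camera toward it at a limited rate. All names and parameter values below are hypothetical.

```python
import numpy as np

def camera_follow_step(cam_pos, tip_pos, deadband=5.0, gain=0.2, max_step=2.0):
    """Nudge the camera toward the instrument tip once it drifts out of a deadband.

    cam_pos, tip_pos: 3-element positions in a shared (hypothetical) frame, in mm.
    Returns the camera position to command for this control cycle.
    """
    cam_pos = np.asarray(cam_pos, dtype=float)
    error = np.asarray(tip_pos, dtype=float) - cam_pos
    if np.linalg.norm(error) <= deadband:    # tip is still well framed: hold the view steady
        return cam_pos
    step = gain * error                      # proportional move toward the tip
    step_norm = np.linalg.norm(step)
    if step_norm > max_step:                 # rate-limit so the view does not jump
        step *= max_step / step_norm
    return cam_pos + step
```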

The second patent, No. 10,758,311, titled “Hand controller apparatus for gesture control and shared input control in a robotic surgery system,” is for a novel robotic hand controller with an integrated gesture-control pad used to control one or more robotic functions, including a camera.

Titan Medical said the notices of allowance cover various aspects of robotic surgery instrumentation, autonomous error correction, imaging sensors, visual illumination, graphical user interfaces, and sterile barriers. It also said they span a range of countries and regions, including the U.S., Europe, China, Japan, Australia, and Canada.


“I commend Jasminder Brar, our vice president of legal, IP, and strategic initiatives, for his leadership and foresight in formulating an intellectual property strategy that continues to create significant value for Titan’s stakeholders,” stated David McNally, president and CEO of Titan Medical. “As evidenced by our recent strategic announcements, intellectual property is fundamental to Titan’s success, providing a foundation to product commercialization, mitigating against certain risks, protecting the company’s innovations and facilitating value creation.”

“While we are always excited to obtain new patents and allowances, the issuance of these recent patents is especially exciting, as they relate to human-machine interaction, an area where we have paid special attention in advancing the ergonomic friendliness of medical robotics based on feedback from surgeons,” McNally said.

Editor’s note: For more information on medical devices, visit MassDevice.com, a sibling site to The Robot Report.

The post Titan Medical obtains robotic surgery patents for camera and gesture control appeared first on The Robot Report.

]]>
https://www.therobotreport.com/titan-medical-sport-obtains-patents-robotic-surgery-camera-control/feed/ 0
Hierarchical Reinforcement Learning helping Army advance drone swarms https://www.therobotreport.com/hierarchical-reinforcement-learning-army-drone-swarms/ https://www.therobotreport.com/hierarchical-reinforcement-learning-army-drone-swarms/#respond Mon, 10 Aug 2020 16:08:13 +0000 https://www.therobotreport.com/?p=106066 Army researchers developed a reinforcement learning approach that allows swarms of unmanned aerial and ground vehicles to optimally accomplish various missions while minimizing performance uncertainty. Swarming is a method of operations where multiple autonomous systems act as a cohesive unit by actively coordinating their actions. Army researchers said future multi-domain battles will require swarms of…

The post Hierarchical Reinforcement Learning helping Army advance drone swarms appeared first on The Robot Report.

]]>
Hierarchical Reinforcement Learning

Army researchers developed Hierarchical Reinforcement Learning that allows swarms of unmanned aerial and ground vehicles to optimally accomplish various missions while minimizing performance uncertainty on the battlefield.

Army researchers developed a reinforcement learning approach that allows swarms of unmanned aerial and ground vehicles to optimally accomplish various missions while minimizing performance uncertainty.

Swarming is a method of operations where multiple autonomous systems act as a cohesive unit by actively coordinating their actions.

Army researchers said future multi-domain battles will require swarms of dynamically coupled, coordinated heterogeneous mobile platforms to overmatch enemy capabilities and threats targeting U.S. forces.

The Army is looking to swarming technology to execute time-consuming or dangerous tasks, said Dr. Jemin George of the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory.

“Finding optimal guidance policies for these swarming vehicles in real-time is a key requirement for enhancing warfighters’ tactical situational awareness, allowing the U.S. Army to dominate in a contested environment,” George said.

Reinforcement learning provides a way to optimally control uncertain agents to achieve multi-objective goals when the precise model for the agent is unavailable; however, the existing reinforcement learning schemes can only be applied in a centralized manner, which requires pooling the state information of the entire swarm at a central learner. This drastically increases the computational complexity and communication requirements, resulting in unreasonable learning time, George said.

In order to solve this issue, in collaboration with Prof. Aranya Chakrabortty from North Carolina State University and Prof. He Bai from Oklahoma State University, George created a research effort to tackle the large-scale, multi-agent reinforcement learning problem. The Army funded this effort through the Director’s Research Award for External Collaborative Initiative, a laboratory program to stimulate and support new and innovative research in collaboration with external partners.

A small unmanned Clearpath Husky robot, which was used by ARL researchers to develop a new technique to quickly teach robots novel traversal behaviors with minimal human oversight. Photo Credit: U.S. Army

The main goal of this effort is to develop a theoretical foundation for data-driven optimal control for large-scale swarm networks, where control actions will be taken based on low-dimensional measurement data instead of dynamic models.

The current approach is called Hierarchical Reinforcement Learning, or HRL, and it decomposes the global control objective into multiple hierarchies – namely, microscopic control at the level of multiple small groups and macroscopic control at the level of the broader swarm.

Related Story: How the U.S. Army is improving soldier-robot interaction

“Each hierarchy has its own learning loop with respective local and global reward functions,” George said. “We were able to significantly reduce the learning time by running these learning loops in parallel.”

According to George, online reinforcement learning control of a swarm boils down to solving a large-scale algebraic matrix Riccati equation using system, or swarm, input-output data.
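
For context, in the model-based setting this is the standard linear-quadratic regulator (LQR) problem: the solution of the Riccati equation yields the optimal feedback gain. The sketch below solves a toy single-agent version with SciPy when the model matrices are known; the point of the ARL work is to reach the same kind of controller from swarm input-output data when those matrices are unavailable, which this snippet does not attempt.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy single-agent model: a 1-D double integrator (state = position, velocity).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # penalize position error more than velocity
R = np.array([[1.0]])      # penalize control effort

# Solve the continuous algebraic Riccati equation A'P + PA - P B R^-1 B' P + Q = 0 ...
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # ... and form the optimal LQR gain, u = -K x

print("LQR gain K:", K)
```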

The researchers’ initial approach for solving this large-scale matrix Riccati equation was to divide the swarm into multiple smaller groups and implement group-level local reinforcement learning in parallel while executing a global reinforcement learning on a smaller dimensional compressed state from each group.

Hierarchical Reinforcement Learning

Army researchers envision a hierarchical control for ground vehicle and air vehicle coordination. | U.S. Army graphic

Their current Hierarchical Reinforcement Learning scheme uses a decoupling mechanism that allows the team to hierarchically approximate a solution to the large-scale matrix equation by first solving the local reinforcement learning problem and then synthesizing the global control from local controllers (by solving a least squares problem) instead of running a global reinforcement learning on the aggregated state. This further reduces the learning time.
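
Very loosely, the two layers can be pictured as a per-group LQR solve plus a least-squares fit of a global correction on top. The sketch below is a drastic simplification for illustration only; every function and variable name in it is hypothetical rather than the ARL team's actual scheme.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def local_gain(A_g, B_g, Q_g, R_g):
    """Group level: an ordinary LQR gain for one small group of agents.
    In the hierarchical scheme these group-level problems run in parallel."""
    P = solve_continuous_are(A_g, B_g, Q_g, R_g)
    return np.linalg.solve(R_g, B_g.T @ P)

def fit_global_layer(compressed_states, residual_inputs):
    """Swarm level: least-squares fit of a linear map from a low-dimensional
    compressed swarm state to the control correction the local controllers
    leave uncovered. compressed_states: (samples, k); residual_inputs: (samples, m).
    Returns G such that correction ~= compressed_state @ G."""
    G, *_ = np.linalg.lstsq(compressed_states, residual_inputs, rcond=None)
    return G
```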

Experiments have shown that compared to a centralized approach, HRL was able to reduce the learning time by 80% while limiting the optimality loss to 5%.

“Our current HRL efforts will allow us to develop control policies for swarms of unmanned aerial and ground vehicles so that they can optimally accomplish different mission sets even though the individual dynamics for the swarming agents are unknown,” George said.

George stated that he is confident that this research will be impactful on the future battlefield, and has been made possible by the innovative collaboration that has taken place.

“The core purpose of the ARL science and technology community is to create and exploit scientific knowledge for transformational overmatch,” George said. “By engaging external research through ECI and other cooperative mechanisms, we hope to conduct disruptive foundational research that will lead to Army modernization while serving as Army’s primary collaborative link to the world-wide scientific community.”

The team is currently working to further improve their HRL control scheme by considering optimal grouping of agents in the swarm to minimize computation and communication complexity while limiting the optimality gap.

They are also investigating the use of deep recurrent neural networks to learn and predict the best grouping patterns and the application of developed techniques for optimal coordination of autonomous air and ground vehicles in Multi-Domain Operations in dense urban terrain.

Editor’s Note: This article was republished from the U.S. Army CCDC Army Research Laboratory.

The post Hierarchical Reinforcement Learning helping Army advance drone swarms appeared first on The Robot Report.

]]>
https://www.therobotreport.com/hierarchical-reinforcement-learning-army-drone-swarms/feed/ 0
CNC Guide from FANUC is free and includes five-axis simulation, training tools https://www.therobotreport.com/cnc-guide-fanuc-free-includes-five-axis-simulation-training-tools/ https://www.therobotreport.com/cnc-guide-fanuc-free-includes-five-axis-simulation-training-tools/#comments Fri, 26 Jun 2020 14:38:47 +0000 https://www.therobotreport.com/?p=105605 FANUC America has added five-axis workforce training tools to its simulation and CNC Guide offerings, the latter of which is free through September 2020.

The post CNC Guide from FANUC is free and includes five-axis simulation, training tools appeared first on The Robot Report.

]]>
FANUC CNC GUIDE

CNC GUIDE is currently available for free. Source: FANUC

FANUC America Corp. this week said it is offering a free trial version of CNC Guide, its PC-based virtualization platform for control design, training, and part planning. The Rochester Hills, Mich.-based subsidiary of FANUC Corp. said it is offering the simulation tool at no cost to help machine-tool operators and builders through the economic difficulties around the COVID-19 pandemic.

CNC Guide offers a safe and immersive way to learn how to operate computer numerical control machines, even for novice operators, said FANUC. Because the software creates digital twins of machine controls, programmers can test G-code programs with no risk of damaging actual machines, it said.
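
The value of proving out a part program against a digital twin rather than a live machine can be illustrated with a toy dry-run checker like the one below. It understands only basic linear moves in absolute coordinates, and the travel limits are invented; it has nothing to do with how CNC Guide itself works.

```python
def dry_run_gcode(program, limits=None):
    """Trace G0/G1 moves (absolute coordinates assumed) and flag travel-limit violations.
    A toy checker for illustration only -- not CNC Guide's internal behavior."""
    if limits is None:
        limits = {"X": (0.0, 500.0), "Y": (0.0, 400.0), "Z": (-150.0, 0.0)}  # invented limits
    pos = {"X": 0.0, "Y": 0.0, "Z": 0.0}
    errors = []
    for lineno, line in enumerate(program.splitlines(), start=1):
        words = line.split(";")[0].split()               # strip comments, tokenize
        if not words or words[0] not in ("G0", "G00", "G1", "G01"):
            continue                                     # ignore everything but linear moves
        for word in words[1:]:
            axis, value = word[0].upper(), float(word[1:])
            if axis not in pos:
                continue                                 # skip feed rates, etc.
            pos[axis] = value
            lo, hi = limits[axis]
            if not lo <= value <= hi:
                errors.append(f"line {lineno}: {axis}={value} outside {lo}..{hi}")
    return errors

sample = "G0 X10 Y10 Z-5\nG1 X480 Y390 F1200\nG1 X520 Y200 ; overtravel on X"
print(dry_run_gcode(sample))   # -> ['line 3: X=520.0 outside 0.0..500.0']
```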

CNC Guide designed to aid machine tool operators, builders

CNC Guide can also help optimize machining operations because users can experiment in the virtual environment with performance-enhancing features in the controls, said FANUC. In addition, when used with the vendor’s conversational programming tool, Manual Guide i, the software can act as a simplified CAD/CAM package.

The platform is intended to enable programming on a PC instead of the machine tool, so equipment stays in production to minimize downtime and maximize throughput, FANUC said. The company’s global headquarters are in Japan.

Not only can CNC Guide help machine tool operators, but builders can also benefit, said FANUC. Machine tool builders can get a competitive edge by using CNC Guide to prove out their design concepts faster and get their equipment to market more quickly, the company claimed.

CNC Guide is available for free only through September 2020, and it is available to FANUC America customers residing in the U.S. Interested parties should contact FANUC through the CNC Guide Trial Offering page to get started.


FANUC adds five-axis simulation to training

FANUC America also announced this month that it has added five-axis simulation to its CNC training offerings. It said that interest in five-axis machining has grown as more manufacturers look to produce complex parts for high-tech industries such as aerospace and medical devices.

As this sector of the machine tool business increases, the demand for five-axis operators will grow exponentially, said the company. Finding qualified operators is a challenge for many employers, which are already facing shortages of skilled labor. Training new or existing workers in an effective and innovative way will be key to bridging this gap, said the company.

FANUC’s Machining Simulation for Workforce Development provides training for controls operation and part programming in a virtual environment. The Complex Milling Extension option combines FANUC’s CNC Guide and simulation software so that the simulated machine can now operate as any one of the three main five-axis mill kinematics.

FANUC CNC mixed machine

Mixed machining capability in simulation. Source: FANUC

The offering also includes training on a three-axis mill and a two-axis lathe for maximum configuration flexibility, said FANUC. Via a digital twin, the five-axis machining simulation allows users to learn how to set up and operate three common types of advanced five-axis milling machines: mixed type, tool type, and table type.
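
The difference between the three kinematic types comes down to where the two rotary axes sit: both under the workpiece (table type), both in the spindle head (tool type), or one in each (mixed type). As a rough sketch — ignoring axis offsets and assuming one common A-about-X, C-about-Z arrangement, which may not match any particular FANUC configuration — here is how the tool direction seen by the workpiece changes in each case:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def tool_axis_in_workpiece(kind, a, c):
    """Direction of the tool axis expressed in workpiece coordinates.

    kind: 'table' (both rotaries carry the part), 'tool' (both tilt the spindle head),
    or 'mixed' (here: A in the head, C under the part) -- a common but not universal
    layout; axis offsets are ignored. a, c: rotary angles in radians about X and Z.
    """
    spindle = np.array([0.0, 0.0, 1.0])            # tool points along +Z with all axes at zero
    if kind == "table":
        return (rot_x(a) @ rot_z(c)).T @ spindle   # the part turns under a fixed tool
    if kind == "tool":
        return rot_z(c) @ rot_x(a) @ spindle       # the head tilts and swivels the tool itself
    if kind == "mixed":
        return rot_z(c).T @ rot_x(a) @ spindle     # part carries C, head carries A
    raise ValueError(f"unknown kinematic type: {kind}")
```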

FANUC said the addition of five-axis simulation to its CNC Guide and simulation software offers an immersive environment to practice and understand advanced machining techniques. Since five-axis machining involves more complex machine setups, the simulation software effectively teaches users how to take advantage of the unique options and features.

In addition, the five-axis machining simulation software allows operators to experiment with and prove out the machine setup and/or part program before modifying the actual machine, the company said. For more information on FANUC’s five-axis machining simulation software, as well as other CNC workforce-development offerings, visit its machining simulator page.

The post CNC Guide from FANUC is free and includes five-axis simulation, training tools appeared first on The Robot Report.

]]>
https://www.therobotreport.com/cnc-guide-fanuc-free-includes-five-axis-simulation-training-tools/feed/ 1