As we develop the technologies that will allow robots to adapt to a dynamic environment in close proximity to humans, the number of tasks they will be capable of performing will increase by an order of magnitude. Robots capable of safely working in close collaboration with humans and adapting to dynamic changes in the workplace can enable what has been termed “Industry 4.0” — the paradigm whereby robots will ultimately work at a batch size of one and engage in preemptive error reporting and prevention.
A substantial increase in productivity will result as industries transition to a system where robots can engage in skilled evaluation of the state of their work alongside human manufacturing employees who are following the principles outlined in The Toyota Way. These advances will further make robots an integral component of smart-city technologies that clean, deliver, and repair. In the healthcare field, dynamic sensing robots will be a critical component of transporting, caring for, and perhaps even operating on patients. Similar capabilities may also permit autonomous probing for and harvesting of mineral deposits in hostile environments such as the sea floor without the need for direct or constant input from an operator.
A number of robots have already been developed in these fields that are based on existing sensing systems and advances in machine learning. The next stage of development in this space will turn these innovations into the leading edge of a dramatically broadened field of applications for robots.
Industrial robotics:
For at least the next 5 years, industrial robotics is projected to see significant growth, at a CAGR of 16.4% between 2021 and 2028, reaching a total market size of $337.1 billion in 2028. Firms in many countries, including the United States and Germany, are aiming to implement, or are already implementing, Industry 4.0 (a compilation of industrial practices that integrate robotics, AI, augmented reality, and the Internet of Things) to drive the next disruptive leap in worker productivity.
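As a rough back-of-envelope check (illustrative only, not a figure from the report), the quoted projection implies a 2021 baseline of roughly $116 billion if the 16.4% CAGR is assumed to compound annually over the seven years to 2028:

```python
# Back-of-envelope check of the quoted market projection (illustrative only).
# Assumes the 16.4% CAGR compounds annually from 2021 through 2028.
cagr = 0.164
years = 2028 - 2021          # seven annual compounding periods
market_2028 = 337.1          # projected market size, USD billions

implied_2021 = market_2028 / (1 + cagr) ** years
print(f"Implied 2021 baseline: ~${implied_2021:.0f}B")  # roughly $116B
```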
Ideally, Industry 4.0 practices will allow humans to work with machines in smart factories. Within these factories, big data analysis, wireless cloud computing, edge sensor networks, and augmented reality will allow humans to interface with machines more effectively and manufacture products in smaller batch sizes more efficiently.
As machine sensing technologies advance, become cheaper, and shrink, robots will be able to integrate smaller and more efficient sensors into their systems. This will remove a major barrier to their capacity to move to where they are needed and perform the most critical tasks as part of a more dynamic factory floor. Current advances in industrial machine sensing and modularity are a sign of things to come and will be integral to the development and execution of an Industry 4.0 paradigm.
Modular robots:
The company Modbot has developed a modular industrial robot that sports a single arm and a series of adaptors allowing it to perform a huge array of tasks. Even with this flexibility, the Modbot maintains 6 degrees of freedom, a reach of 750 mm, and a lifting capacity of 7.5 kg.
While this technology represents an example of modularity that still requires some user intervention, reconfiguring it is considerably less labor intensive than retooling many current industrial robots.
Gripping technologies:
If robots could adapt to a range of objects with variable dimensions and weight without needing to be refitted for each application, they could dramatically improve the throughput and efficiency of the process they are involved in.
The field of soft robotics may offer a means to this end through biologically inspired grippers, modeled on hands or octopus tentacles, that use force feedback and adaptable gripping surfaces to manipulate a wide range of objects. Soft Robotics Inc. is a pioneer in the commercialization of this technology and has developed a range of soft robotic manipulators that can safely pick up an effectively limitless variety of SKUs, from textiles to produce.
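As a loose illustration of the force-feedback idea (this is not Soft Robotics Inc.'s control code, and the gripper interface is hypothetical), a single controller can handle objects of widely varying size and stiffness by closing in small steps until a target contact force is reached:

```python
# Minimal force-feedback gripping sketch (hypothetical gripper interface).
# The gripper closes incrementally until the measured contact force reaches a
# target, so one routine can grasp objects of varying size and stiffness.

def grip_with_force_feedback(gripper, target_force_n=2.0, step_mm=1.0, max_travel_mm=80.0):
    travelled = 0.0
    while travelled < max_travel_mm:
        if gripper.read_contact_force() >= target_force_n:  # hypothetical sensor call
            return True                                      # object held at target force
        gripper.close_by(step_mm)                            # hypothetical actuator call
        travelled += step_mm
    return False                                             # fully closed without contact
```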
Human-robot collaboration:
Another key component enabling Industry 4.0 applications will be the ability of machines to sense and interpret a human’s actions. This will be essential as human instructors teach machines how to perform a task without having to program in the precise motions. To this end, the Mechanical Systems Control Lab at UC Berkeley has developed a number of tools that facilitate intuitive two-way communication between robots and human operators.
In brief, these tools include stabilizers on the robot’s structure that suppress vibration and allow for smooth mimicking of human actions. The robots in this laboratory use a combination of machine vision and force-sensing inputs to rapidly mimic the basic actions performed by their human trainer. This paradigm may eventually allow experts to train their robotic collaborators without the need for programming expertise.
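In spirit, this kind of teaching by demonstration can be reduced to recording a demonstrated trajectory and replaying a smoothed version of it. The sketch below is a simplified illustration under that assumption, not the Berkeley lab's actual software, and the robot interface it calls is hypothetical:

```python
# Minimal learning-from-demonstration sketch (hypothetical robot interface).
# A human guides the arm through a task; the robot records joint positions,
# smooths them to damp jitter, and can later replay the motion without
# any explicit programming of the trajectory.
import time

def record_demonstration(robot, duration_s=10.0, rate_hz=20.0):
    waypoints = []
    for _ in range(int(duration_s * rate_hz)):
        waypoints.append(robot.read_joint_positions())  # hypothetical sensor call
        time.sleep(1.0 / rate_hz)
    return waypoints

def smooth(waypoints, window=5):
    # moving average over each joint to smooth out sensor noise and hand tremor
    smoothed = []
    for i in range(len(waypoints)):
        chunk = waypoints[max(0, i - window + 1): i + 1]
        smoothed.append([sum(joint) / len(chunk) for joint in zip(*chunk)])
    return smoothed

def replay(robot, waypoints, rate_hz=20.0):
    for wp in waypoints:
        robot.move_to_joint_positions(wp)  # hypothetical actuator call
        time.sleep(1.0 / rate_hz)
```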
The future of industrial robots:
Today, many industrial robots are kept in cages to protect human workers, and most are used for tasks requiring repetitive and precise motions in a near-sterile environment, such as car and phone manufacturing.
In the future, advances will permit robots to move freely about the factory floor while applying learned behavior. This will effectively allow skilled workers to become programmers, capable of translating their deep domain knowledge into a smart robotics platform. A follow-on benefit is that those most familiar with the industrial process stay in touch with the factory floor without the need for constant intervention by robotics experts. The same advances will eventually allow robots to leave the factory altogether and begin working among people, making cities, farms, hospitals, and resource collection more efficient.
Urban robotics:
The concept of a smart city that can dynamically respond to the needs of its population captivates the imagination. The United Nations projects that the share of the world’s population living in urban areas will grow from roughly 55% today to 68% by 2050, so the demands on urban infrastructure will only become more intense. Repair, maintenance, cleaning, and new construction will be critical needs that may be met with next-generation robots.
Self Repairing Cities Project:
The Self Repairing Cities Project in the United Kingdom aims to achieve zero disruption from street works in the country by 2050. The project is a collaboration among academic groups across the United Kingdom, including the University of Leeds, the University of Birmingham, the University of Southampton, and University College London. In brief, its research goal is to develop robots that can automatically identify, diagnose, and repair infrastructure faults in a smart city. Three classes of robots are being developed for this task. The first is a set of aerial drones that can fly along sections of road to the site of a pothole, or fly to a street lamp to repair it.
The second class of robots encompasses ground vehicles that use machine vision to safely reach the work location, perceive holes in the road using a suite of onboard sensors, and fix them automatically while on site.
Lastly, they are developing robots that travel along pipes looking for leaks or other damage that can then be corrected before the damage worsens. Each of these robot systems will greatly benefit from advances in sensing technology that make them safer, smaller, and more affordable for a city. Once implemented, they may become a valuable part of a smart city’s infrastructure.
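To make the perception step for the second class of robots concrete, the sketch below flags road-surface cells that sit markedly below the surrounding road plane as candidate potholes. It is a simplified illustration only, not the project's actual software, and the grid-of-depth-readings input is an assumption:

```python
# Simplified pothole-flagging sketch (illustrative, not the project's software).
# Input: a grid of depth readings over the road surface (metres below the sensor).
# Cells noticeably deeper than the median road level are flagged as candidates.

def flag_potholes(depth_grid, threshold_m=0.03):
    readings = [d for row in depth_grid for d in row]
    road_level = sorted(readings)[len(readings) // 2]   # median reading = nominal road plane
    candidates = []
    for i, row in enumerate(depth_grid):
        for j, depth in enumerate(row):
            if depth - road_level > threshold_m:         # more than ~3 cm below the road plane
                candidates.append((i, j))
    return candidates

# Example: a 3x3 patch of road with one deep cell
print(flag_potholes([[1.50, 1.50, 1.51],
                     [1.50, 1.56, 1.50],
                     [1.51, 1.50, 1.50]]))               # -> [(1, 1)]
```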
Other urban robotics:
A number of other organizations have also entered this space, including Echord, which has developed both an aerial robot (ARSI) and a ground robot (SIAR) that inspect sewers with minimal human intervention.
TeleRetail is a firm developing urban logistics robots that can make deliveries to the citizens of smart cities without the need for a human operator. These robots are solar powered, giving them a nearly unlimited operating range. Further improvements may lower the cost of these robots while increasing their safety.
The Urban Robotics Lab has developed additional applications for drones working in urban centers. For example, they have developed a prototype aerial drone capable of repairing the windows of skyscrapers without human intervention, drones that can fight fires, and drones that clean green algae from bodies of water.
The application of these robots may free up enormous quantities of labor from maintenance tasks while allowing a city to perform preventative maintenance and ultimately run more efficiently.
Resource harvesting:
The collection of material from hostile environments is, theoretically, a wonderful use case for robots. Mining robots could allow for access to hostile environments without putting human lives at risk and may also be able to detect resources such as veins of ore that a human eye might otherwise miss.
Mining robots:
Robots are already being deployed by companies such as SMS Equipment, which announced in 2018 that it would implement autonomous haulage systems at its company-operated mines and deploy over 150 autonomous haul trucks over the following six years.
Deep-sea exploration robots:
One potentially valuable use of robots in this space is deep-sea exploration and the harvesting of biomass for scientific study or the creation of new compound libraries for drug development. The Metals Company has developed technology that allows it to detect and collect polymetallic nodules on the ocean floor using advanced imaging and undersea drones.
The Wood and Gruber laboratories have collaborated to make a soft robotic gripper that can operate in the deep sea for gently harvesting biomass without destroying it.
Figure: (A) A bellows-type gripper collecting soft coral (Dendronephthya sp.), with an inset image showing the sample on the deck of the ship. (B) A boa-type gripper collecting an Alcyonacean whip coral at a depth of 100 m. The arm and gripper were visually controlled using the Deep Reef’s onboard cameras. (Source: Soft Robotic Grippers for Biological Sampling on Deep Reefs)
As they mature, these technologies may open up vast amounts of undersea mineral wealth that may be necessary to fuel our continued production of electronics.
Medical robots:
There is a major labor shortage in healthcare, and the demands on the industry will continue to increase as the population ages. One potential solution to this issue is to increase the productivity of healthcare workers with robotic assistance. This would necessarily require that the robots be developed with strict attention to safety.
Robot nurses:
Several firms have developed robots that help healthcare workers achieve dramatically improved efficiency. For instance, Diligent Robotics has developed a robotic nurse that combines a manipulator arm with the dynamic obstacle avoidance system developed by Fetch Robotics, allowing it to safely navigate a hospital while moving drugs and supplies in response to doctor commands.
Robotic surgery:
Advances in telemetry-mediated remote surgery allow high-precision procedures to be performed by surgeons even when they cannot be at the hospital where the surgery is taking place. The Smart Tissue Autonomous Robot (STAR), developed at Children’s National, represents one of the first robots capable of performing surgery on soft tissue, allowing surgeons to work more efficiently and with less fatigue.
These and other advances will help address the looming labor shortages in healthcare, decreasing the heavy strain and burnout the medical workforce currently faces.
Conclusions:
The next generation of applications for robots is being made possible, in part, by advances in safety and sensing technologies. Improvement in the cost/benefit ratio of using these robots will depend on the cost-effectiveness of their sensors, the ease of their control, and their degree of autonomy. Developing more robust control mechanisms and sensor suites will allow robots to perform more complex tasks in concert with humans in high-variation environments. Accomplishing this will allow a single expert worker to control one robot intuitively, and may eventually allow that worker to coordinate suites of robots working in concert, applying the operator’s domain knowledge to rapidly accomplish a series of tasks. This will allow firms to deploy each individual’s expertise as broadly and efficiently as possible.
This excerpt was taken from our Disruptors report titled “Disruption in Human Robot Collaboration.” The full report can be viewed here.