Fundamentals of robotics

Controllable machines have fascinated humankind since ancient times. But where do robotics and robots have their roots? What were the milestones in the past and what will the future bring? What are the most important components and how do these machines work? What are cobots and nanobots? How intelligent are robots really? Can they develop feelings and learn social behavior? We will explain the most important facets of a technology that will fundamentally change the future of humankind.

Definition of Robots

Robotics is a sub-domain of engineering and science that includes mechanical engineering, electrical engineering, computer science, and others. Robotics deals with the design, construction, operation, and use of robots, as well as the computer systems for their control, sensory feedback, and information processing. A robot is a unit that implements this interaction with the physical world based on sensors, actuators, and information processing. A key field of application is industry, or more precisely Industry 4.0, where industrial robots are used.

Areas of Application and Benefits of Robotics

Cobots

Collaborative robots or cobots are becoming increasingly important. In a growing number of areas in industry, conventional industrial robots are being replaced by or backed up with collaborative robots. Cobots work together with humans in production processes and are no longer kept separate from their human co-workers with protective devices, like typical industrial robots. Compared to traditional industrial robots, collaborative robots are smaller, can be used more flexibly, and are easier to program.

Cobots don’t replace human workers; they supplement them. Paradigm Electronics in Canada is an example: the use of cobots increased productivity by 50 percent, and not a single job was lost. Staff carry out new tasks in newly created areas of activity, such as programming the machines and quality control at the end of the automated production process. Experts from the Boston Consulting Group expect that in the future the use of robots will increase productivity per employee by up to 30 percent.

Industrial robots

Industrial robots are programmable machines that are used to handle, assemble, or process workpieces in the industrial environment. Most of these robots consist of a robot arm, a gripper, various sensors, and a control unit. Depending on how they are programmed, they can also carry out actions autonomously. Global robot density has increased significantly in recent years: In 2015, there were on average 66 units per 10,000 employees, but this has now risen to 74 units. In Europe the average robot density is 99, in the US it is 84, and in Asia it is 63.

According to statistics from the IFR (International Federation of Robotics, the international umbrella organization of all national robotics associations) in 2016, with about 31,500 installed units, the US recorded an all-time high for industrial robots, a 15 percent increase over 2015. Worldwide in 2016 roughly 290,000 industrial robots were in use, 14 percent more than in 2015. This trend will continue in the future: Average growth of 12 percent p.a. is expected in the coming years.

Industrial robots in the automotive industry

In this key industry for robotics, machines have played an important role in automated production processes for more than 50 years, making workflows more efficient, safer, faster, and more flexible. The first industrial robot, Unimate, was integrated into the production process at General Motors in 1961, where it was used to remove injection molding parts. In 1973, the first industrial robot began working at VW in Wolfsburg, Germany. The in-house development, nicknamed “Robby” by its human co-workers, was used in the production process for the Passat. According to a statistical survey by the IFR, more than 17,600 industrial robots were in use in the US in 2016, 43 percent more than in 2015.

Automated Guided Vehicles / AGV

An AGV is a driverless transport vehicle with its own drive that is controlled automatically and guided contactlessly. AGVs are typically used to transport materials in production facilities. In the industrial environment, they represent the evolution from the traditional, bulky conveyor belt to a space-saving, highly flexible solution. Warehouses are another popular place of operation for AGVs, where individual goods or large ranges of goods are taken to defined packing stations for processing. This type of robot generally moves at approx. 1-2 meters per second and can transport payloads of up to roughly 2,000 kilograms.

AGVs differ in terms of their power supply, the tasks they carry out, and the navigation and routing process. Power is supplied either through a cable (for rail-mounted AGVs), through the rail itself, or by a battery. The battery is charged via an induction charging plate or at charging stations, where the batteries can also be replaced. Depending on their tasks and area of deployment, AGVs can move pallets as forklifts, tow trailers as tractor units, or transport boxes and packages as mobile cargo areas.

AGVs can, for example, be navigated by lasers, in which case the robots scan labels attached at certain locations to find their next destination. Optical navigation via recognition of colors is another option. Antennas or rails are also used to steer AGVs. The most flexible units are autonomous AGVs that scan their complete surroundings and create virtual maps from the results. They can notify other AGVs about obstacles and generate the optimum transport route. AGVs are moved by one to four actively driven wheels, depending on their area of deployment and the required degree of movement.

Service robots

A service robot is a machine that provides services for humans. A distinction is made between use for private individuals and professional applications. Vacuum cleaner and lawn mower robots have become established in the private sphere. Here are the different types of service robots:

Lawn mowing, vacuum cleaning, or window cleaning: Robots can take over some of the everyday household chores. People are impressed by the time and work that they save: According to a study by the German Federal Association for Information Technology, Bitkom, 42 percent of the more than 1,000 people who took part in the survey could imagine having a robot in their home. More than 80 percent would like assistance for vacuum cleaning or washing floors, 41 percent would like a robot to help out in the garden. 15 percent of the interviewees already had a robot in their home.

Although data protection and data security issues would have to be considered, 49 percent of the participants in the above-mentioned representative Bitkom survey could imagine entrusting the security of their own home to a robot. A security robot looks after the household while the residents are on vacation, on a business trip, or at work. These robots can be controlled from an app via an Internet connection. If the robot detects movement via its motion sensors, it sends an alert to a smartphone. The integrated camera makes HD recordings and has a two-way audio intercom feature.

Korean electronics group LG presented its new robot product series CLOi at the 2018 Consumer Electronics Show in Las Vegas. The Serving Robot model provides customers with food and beverages. It can be used round the clock (e.g. at airports, railroad stations, and in hotels) and serves meals on a tray that customers can take with them. When it has served the customer, the robot finds its way back to the service station to fetch new snacks and complete its next assignment.

Agriculture is another area that offers a lot of potential for robots. At present, pilot projects are being carried out in which robot arms and multi-spectral cameras installed on a harvester optimize the processes of cucumber harvesting. For seeding, small planting robots controlled from a tablet not only sow the seeds, they also document all the important information. Drones are suitable for monitoring the ripeness of vegetable products and weed growth and, if necessary, they can also spray critical areas.

As treatment aids, robots are especially used where patients have to learn how to reactivate their locomotor system after a stroke or neurological disease. People who suffer from paralysis learn to walk again and even climb stairs with the help of training machines. One robot can do the work of two therapists. Patients also receive direct feedback during the exercises. A wearable robot (exoskeleton) enables paralyzed patients to walk on their own. The step movements of the robot are triggered by the patients shifting their weight.

Robots are also found in operating theaters, where they do not replace the surgeon but are used as accurate assistants for minimally invasive procedures. Instead of using operating instruments such as scissors or forceps themselves, surgeons control a robot via a console with the help of a joystick and foot pedals. Procedures using an operation robot save time and are also less invasive for patients. Risks from human errors are minimized.

Aibo, the robotic dog from Sony, is an entertainment robot, which the company stopped selling in 2006 but reintroduced to the market in 2017 in a new version. Aibo perceives its surroundings with two cameras and microphones. The acquired data is evaluated by a learning program so that the robot dog can develop an individual personality. As well as Aibo, Roberta also belongs to the toy robot category. This initiative of the Fraunhofer Institute for Intelligent Analysis and Information Systems has been using special robots since 2002 to encourage children to take a playful approach to technology and to awaken their interest in the development and programming of robots.

Humanoid robots

Humanoid robots are machines that are designed to look like humans. Joint positions and movements are inspired by the human locomotor system. This is also evident in the fact that humanoid robots usually move on two legs in an upright position. The main motive for research and development in the field of humanoid robots is artificial intelligence (AI).

Artificial intelligence

In most scientific fields, the development of a humanoid robot is deemed to be an important basis for the creation of human-like AI. This is based on the idea that AI cannot be programmed but consists of learning processes. Accordingly, a robot can develop artificial intelligence only through active participation in social life. However, active participation in social life, including communication, is possible only if the robot is perceived and accepted as an equal creature due to its shape, mobility, and sensors.

Humanoid robots as multi-functional helpers

With rollers rather than legs, but with her small size, pleasant voice, and sparkling, round eyes, Josie Pepper the robot is currently assisting passengers at Munich Airport in Germany. Munich Airport, together with Lufthansa, is one of the first airports to trial a humanoid robot live. Josie provides information about the current flight status and check-in, and describes the way to the departure gate or the nearest restaurant. The development from French company SoftBank Robotics is connected to the Internet via WiFi and can thus access a cloud to process and analyze dialog and link it with the airport data. In this way, Josie learns from every dialog and answers questions individually.

Human-machine interaction

Interaction between humans and machines via voice, gestures, and facial expressions is very important so that people with no programming knowledge can also communicate with robots and give them instructions and information in a natural manner.

Even in the age of smart speakers, recognizing and interpreting natural language in real time is a highly complex process for a machine. This is due to variable factors such as surrounding acoustics, background noise, volume, dialects, accents, or the general pitch. At present, machines can recognize natural language with about 95 percent accuracy.

3D data capture in real time is needed for precise recognition and interpretation of human gestures with no latency. Scientists at the Fraunhofer Institute for Applied Optics and Precision Engineering are working on systems that record and process 3D data quickly. Two high-speed cameras and a color camera record images, while the special software converts them into 36 3D data records per second. The scientists have also developed learning software based on neural networks for the system.

It is generally possible to draw conclusions about how a conversation between two people is progressing on the basis of their facial expressions. This is also to be implemented in dialog between humans and robots. Thanks to a flexible polymer and integrated servo motors, the faces of robots produced by Hanson Robotics are able to show a wide range of expressions. The aim is for the robots to adapt their interaction based on the facial expression of humans. For example, faced with an anxious expression, a robot should keep its distance from the person; faced with a questioning expression, it should provide information.

In ongoing research projects, robots are being taught to recognize and understand human emotions and to respond accordingly. With the corresponding expressions and gestures, the robot can show or simulate emotions as a response to humans. One example of this is the Emotisk training system that scientists at Humboldt University of Berlin are currently developing in collaboration with the university hospitals of Aachen and Cologne in Germany: The software evaluates information such as line of vision and facial expressions and gives humans the corresponding emotional feedback. The system is designed to help persons with autism recognize the emotions of others and to respond by sending non-verbal signals.

Because of their visual similarity to humans and their human-like behavior and actions, we tend to ascribe a personality to humanoid robots. In actual fact, simulating a personality can influence human-machine interaction. For an experiment, Japanese scientists from Toyohashi University of Technology developed a robot that follows the eyes of its human dialog partner and registers when he or she is distracted by other events. In this situation, the robot leans forward, raises its voice, and nods. The result: by demonstrating personality traits, the robot regains the attention of its human counterpart.

It is still difficult or even impossible to draw the line between “just” a smart robot and a social one. A current example is Jibo, the first social robot from the eponymous US company, which has been available since the end of 2017. According to the manufacturer, the approximately 30-centimeter tall domestic robot loves being amongst humans and building a relationship with them. It learns which people its owner especially likes and blends seamlessly into the owner’s social life. Jibo is also charming and is able to produce surprises with spontaneous actions like a little dance. At least, this is what the manufacturer says. The field trial showed that the social robot does not differ significantly from other smart systems. But it is much more expensive.

Safety and security

The increasing popularity and spread of robots in many different areas of life and the associated interaction between humans and machines offer both opportunities and challenges for the safety and security of humans and data. The safety requirements are especially obvious for the use of industrial and collaborative robots at the workplace.

When robots are used in industrial production, workplace safety measures ensure that humans are protected. These measures include adequate safety distances between humans and machines, safety barriers, photoelectric barriers, and scanners in monitored zones. The safety precautions also include emergency switches on the robot and its ability to recognize collisions with objects and humans and to respond appropriately. This applies especially to cobots.

With these newer industrial robots, there are no separating safety devices in certain working areas. Other technical safety measures are used instead. For example, if a person is several meters away, the robot operates in normal mode. If the person comes closer than a defined threshold, the robot slows down. If the person is very close and there is just a one-meter gap, it stops.
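The staged slowdown described above can be sketched as a simple mapping from measured distance to operating mode. The threshold values and function name below are illustrative assumptions, not figures from any real safety standard; in practice the thresholds come from a risk assessment of the specific installation:

```python
def cobot_speed_mode(distance_m: float) -> str:
    """Map the measured human-robot distance to an operating mode.

    Thresholds are illustrative examples only; real systems derive
    them from a risk assessment of the specific workcell.
    """
    STOP_THRESHOLD_M = 1.0  # assumed: one-meter gap -> safety stop
    SLOW_THRESHOLD_M = 3.0  # assumed: person within a few meters -> reduced speed
    if distance_m <= STOP_THRESHOLD_M:
        return "stop"
    if distance_m <= SLOW_THRESHOLD_M:
        return "reduced"
    return "normal"
```

A real controller would evaluate this continuously from sensor data and combine it with redundant monitoring rather than relying on a single measurement.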

With newer systems, ToF (Time of Flight) technology is used. This technology uses 3D camera systems that measure distance based on time of flight. The surroundings are illuminated with a modulated light source. For each pixel, the camera measures the time that the light needs to reach an object and be reflected, which is then used to calculate the distance to the object for each pixel. Radar sensors are also used in this area. In this case, movements are detected on the basis of electromagnetic waves in the radio frequency range. Safety for humans can also be increased by combining several redundant technologies.
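The per-pixel distance calculation reduces to a simple formula: the measured time covers the path to the object and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance from time of flight: the light travels to the object
    and back, so the one-way distance is c * t / 2."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0
```

For example, a round trip of 20 nanoseconds corresponds to an object roughly 3 meters away, which is why ToF cameras need picosecond-scale timing precision to resolve millimeter differences.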

In a workplace where an increasing number of complex systems are connected and communicate with each other, it is important that these systems are protected against data theft and manipulation. In addition to manipulating configuration files (changing the motion areas or the position data) and code manipulation (reprogramming sequences), manipulating the robot feedback (deactivating alarms) is the greatest threat. These interventions can lead to the destruction of products, damage to robots and, in the worst-case scenario, injuries to people working in these areas. To guarantee the security of data, interfaces, and communication channels, a growing number of companies choose external software solutions. These solutions offer protection against manipulation of configuration files by encrypting them and storing them in a Secure Element (SE). Authentication also prevents unauthorized access to the central processing unit. To prevent code manipulation, software solutions offer authorization of sent commands by means of a hash process and verification of the code.
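One common way to authorize commands by means of a hash process, as mentioned above, is a keyed hash (HMAC): the sender attaches a tag computed from the command and a shared secret key, and the receiver recomputes and compares it before executing anything. The sketch below is an illustrative assumption about how such a scheme could look, not the API of any specific vendor solution; in practice the key would be provisioned into the Secure Element rather than held in code:

```python
import hashlib
import hmac

# Assumption for illustration: a shared secret key, which in a real
# system would be stored in the Secure Element, never in source code.
SHARED_KEY = b"example-key-material"

def sign_command(command: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Attach an HMAC-SHA256 tag so the controller can verify
    both the origin and the integrity of the command."""
    return hmac.new(key, command, hashlib.sha256).digest()

def verify_command(command: bytes, tag: bytes, key: bytes = SHARED_KEY) -> bool:
    """Recompute the tag and compare in constant time, so a manipulated
    or reprogrammed command sequence is rejected before execution."""
    return hmac.compare_digest(sign_command(command, key), tag)
```

A tampered command fails verification because any change to the bytes changes the tag, and without the key an attacker cannot compute a valid replacement.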

History

If you think of robots or robotics, your thoughts will likely wander to the last 50 years. Most people will picture a more or less human-looking machine, with arms, legs, and a friendly smile. But the fascination with humanoid machines and mechanical helpers has been around for centuries. Here are a few highlights from the past to illustrate the evolution of robotics:

Even in the 1st century AD, there were inventions, machines, and works that could be seen as the predecessors of robots and robotics as we know them today. They come from Hero of Alexandria, also known as Heron of Alexandria, a Greek mathematician and engineer. That also explains his nickname, Mechanicus. In his work “Automata” (the book of machines), Hero describes various “robots”. Some of them could automatically open temple doors or play music. You could almost call it the first ancient smart home. Apart from his drafts for catapult-like weapons, Mechanicus is especially remembered for his aeolipile, also known as Hero’s engine: the first heat engine and a predecessor of the steam engine. This forefather of robotics produced more than 100 designs for robots and machines.

In 1495, the polymath Leonardo da Vinci designed what was probably the first human-like machine. His Mechanical Knight was able to sit and stand. Other functions included lifting the visor and full movement of the arms. The movements were carried out with a complex system of pulleys and cables. Whether da Vinci actually built the robot or only designed it cannot be proven historically. In any case, robotic knights later built according to the plans were fully functional.

The word robot was first used in the English language in the 1920 science fiction play R.U.R. by Czech author Karel Čapek. It was first performed on January 25, 1921. In the internationally successful play, autonomously thinking machines are the spitting image of humans. When they develop a consciousness, the robots rebel against their role as slaves, which leads to the extinction of the human race. A touch of Terminator on the stage.

Elektro made his appearance at the 1939 New York World’s Fair. The more than two-meter-tall, 120-kilogram humanoid robot was able to speak about 700 words thanks to an integrated record player. As well as having movable arms and legs, the mechanical man was also able to distinguish between different colored lights and to smoke cigars. The year after the fair, Elektro appeared together with the robot dog Sparko.

George Devol was granted the first patent for an industrial robot in 1961. That same year, the Unimate was deployed on a General Motors assembly line. It consisted of a computer-like box connected to another box and an arm. The robot removed heavy die-cast components from an assembly line and welded them onto automobile bodies. At that time, this part of production carried high health risks for human employees: in addition to the danger from chemical substances, there were several accidents in which employees lost limbs. In Germany, the age of industrial robots began in the 1970s.

In November 2017, the Munich-based start-up Franka Emika received the 2017 German Future Prize from the German President for its development of inexpensive, flexible, intuitively operated robots. The lightweight robots can be used in industrial applications and in healthcare. This is possible due to the torque sensors installed in the joints, which respond to human touch. Another special feature of this robot is its price, which is well below the market average and, consequently, makes highly efficient, ultramodern robots appealing and affordable even for small and medium-sized companies.

Robot drive and control

The most important types of robot drives

Basically, there are two different types of robot drive: electric motors and hydraulic drives. What are the main features of these two types of drive? And what are the fundamental differences?

Many modern robots currently use electric motors. While humanoid robots and smaller robots are powered mainly by DC motors, most industrial robots and CNC machines operate with 3-phase motors. These motors are preferred in automatic systems where robots frequently make the same movement, such as a rotating arm.

Modern hydraulic drives in robots work like artificial muscles. Since 2014, Japanese developers have been working on an artificial muscle consisting of a rubber hose, tension-proof fibers, and a protective collar. This system, which imitates a human muscle, does not use compressed air but is moved hydraulically. The advantages of this concept: The hydraulic muscle is more efficient and can also carry out fine movements. The system is also more sturdy than an electric motor. Robots equipped with a hydraulic drive system can withstand unfavorable conditions in disaster zones.

The three phases of robot control

Robots are always controlled in three phases – perception, processing, and action. At present, most robot types are controlled by pre-programmed or learning algorithms. With humanoid robots and cobots, the robots perceive their surroundings and other important information, such as recognizing workpieces, via sensors. The robots process this information and pass it on to their motors as signals, which put the mechanical elements into action. Artificial intelligence (AI) is another way for a robot to determine how to act optimally in its environment. Within the scope of human-machine interaction, control systems can be split into different levels of autonomy:

With this type of control, humans are in complete control. They control the robot either directly by touch, by remote control, or via an algorithm that is programmed for the control unit.

Humans specify basic positions and movement sequences. The robot then determines how to use its motors optimally within the scope of the specifications.

With these systems, humans specify a general task. The robot autonomously determines the optimum positions and movement sequences to fulfill the task.

The robot recognizes its tasks autonomously and carries them out completely on its own.
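The perception-processing-action cycle described above can be expressed as a minimal control loop. The three callables below are hypothetical placeholders standing in for real sensor, decision, and motor interfaces:

```python
from typing import Any, Callable, Dict

def control_loop(sense: Callable[[], Dict[str, Any]],
                 process: Callable[[Dict[str, Any]], Dict[str, Any]],
                 act: Callable[[Dict[str, Any]], None],
                 steps: int) -> None:
    """Run the three-phase robot control cycle a fixed number of times."""
    for _ in range(steps):
        percept = sense()           # perception: read the sensors
        command = process(percept)  # processing: decide how to act
        act(command)                # action: drive the actuators
```

A real robot would run this cycle continuously at a fixed frequency rather than for a fixed number of steps; the levels of autonomy above differ mainly in how much of the `process` step is decided by the human and how much by the machine.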

Gripper

To carry out production processes and move objects, robots need mechanical extremities. These are available in various designs:

This common type of gripper is found especially on industrial robots and, in most cases, has a pneumatic or hydraulic drive system. Smaller robots with correspondingly smaller grippers have a pneumatic drive that allows precise movements at a reasonable cost. Hydraulic drive systems are used for heavy loads.

With magnet grippers, a distinction is made between permanent and electromagnetic grippers. With the simpler permanent magnet grippers, the gripping power is provided by a permanent magnet. The gripped material is released with the help of a piston that is installed inside the permanent magnet gripper. Electromagnetic grippers are supplied with direct current that provides the necessary magnetic field. The material is picked up and released by switching the electric energy on and off.

Adhesion grippers are used to pick up small objects, such as cans and boxes. Adhesive forces are the attractive forces that arise at the contact surfaces of two substances, whether identical or different, through molecular interaction. The substances can be in a solid or liquid state. The robot gripper moves the objects using the adhesive force of liquids or special adhesives.

Vacuum grippers can lift heavy loads. The object is pressed against the sealing lips of the suction cup on the gripper by the excess pressure of the ambient air. Heavy objects, such as workpieces or vehicle windows, are held in place by the vacuum in the suction cup. These objects must have a smooth surface so that the suction cup can form a seal.
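The lifting capacity of such a gripper follows directly from the pressure difference between the ambient air and the evacuated suction cup, multiplied by the effective suction area. The figures in the example are purely illustrative:

```python
def holding_force_n(pressure_difference_pa: float, suction_area_m2: float) -> float:
    """Holding force in newtons = pressure difference (Pa) x suction area (m^2)."""
    return pressure_difference_pa * suction_area_m2
```

With an underpressure of 50 kPa and an effective suction area of 0.01 square meters, the cup holds about 500 N, roughly a 50-kilogram load; real designs add a safety factor well above the static minimum.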

Humanoid hands allow much finer actions to be carried out than conventional grippers. An example of this is the Kanguera project at São Paulo University. This robot hand is the shape and size of a human hand. The signals are sent via cables and a transformer, which allows more precision compared to previous robot hands.

Sensors

With the aid of integrated sensors, robots perceive physical or chemical influences and convert them into electrical signals. This allows them to identify and localize objects, for example. With sensors, robots can also detect other important environmental factors, such as temperature, motion, pressure, light, or moisture. Internal sensors provide information about speed or load status; external sensors help especially with interaction and navigation. Below is an overview of the most important sensor types:

Force/torque sensors are among the most commonly used sensor types. They are implemented in grippers and can record both force and torque. Strain gauges detect deformations in the micrometer range. These deformations are converted into three force and three torque components via a calibration matrix. Force/torque sensors have a digital signal processor that captures and filters the sensor data in the event of a deformation, computes the measured values, and transmits them via the communication interface.
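The conversion from strain-gauge readings to force and torque components via a calibration matrix is a matrix-vector product: six raw channels in, three forces and three torques out. The identity matrix below is a stand-in for illustration; real matrices come from factory calibration of the individual sensor:

```python
# A 6x6 calibration matrix maps six raw strain-gauge channels to the
# wrench vector (Fx, Fy, Fz, Tx, Ty, Tz). The identity matrix here is
# a placeholder; real values are determined during sensor calibration.
CALIBRATION = [[1.0 if i == j else 0.0 for j in range(6)] for i in range(6)]

def strains_to_wrench(strains, matrix=CALIBRATION):
    """Multiply the calibration matrix by the strain vector."""
    return [sum(row[j] * strains[j] for j in range(6)) for row in matrix]
```

The DSP in the sensor performs exactly this multiplication, after filtering, on every sample before sending the result over the communication interface.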

Inductive sensors are also described as proximity sensors. Without touching them, they recognize metal parts that are within their measurement range. Because of this, they are very suitable for wear-free recording of end positions of moving machine components, for example. The surface of the sensor radiates an oscillating electromagnetic field. If metal objects are in the measurement range, they absorb a small amount of energy from the oscillator. If the energy transfer reaches a threshold, the target object recognition is confirmed and the sensor output changes its state.

Capacitive sensors consist of two metal parts isolated from each other and can recognize both metallic and non-metallic materials. Measurement is touchless through the change in capacity of an electric capacitor. Since the capacity of a capacitor changes with the distance of its electrodes, this measurable variable is used to measure distance. Capacitive sensors are used, for example, to reliably recognize humans in the vicinity of the robot.
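For the simple case of a parallel-plate capacitor, the relation between capacitance and electrode distance can be inverted to recover the distance from a measured capacitance. A sketch under that idealized assumption, with illustrative values:

```python
EPSILON_0 = 8.854e-12  # permittivity of free space, in farads per meter

def plate_distance_m(capacitance_f: float, area_m2: float,
                     relative_permittivity: float = 1.0) -> float:
    """Invert the parallel-plate formula C = eps0 * eps_r * A / d
    to recover the electrode distance d from a measured capacitance."""
    return EPSILON_0 * relative_permittivity * area_m2 / capacitance_f
```

A person approaching the sensor effectively changes the capacitance of this arrangement, which is why the same measurement principle can be used for proximity detection around a robot.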

Magnetic sensors are used for touchless, exact position detection and recognize magnets even through stainless steel, plastic, and wooden structures. The sensors are based on the GMR effect (giant magnetoresistance). This effect occurs in structures that consist of alternating magnetic and non-magnetic thin layers that are just a few nanometers thick. With this effect, the electric resistance of the structure depends on the mutual orientation of the magnetization of the magnetic layers: the resistance is much higher when the magnetizations point in opposite directions than when they are aligned.

Tactile sensors perceive the mechanical touching of objects and derive signals that are then sent. For example, a gripper arm can determine the shape and position of an object with the aid of tactile sensors. Even if a sensor cannot yet match human senses, innovative tactile sensors can imitate the mechanical properties and tactile receptors of human fingertips. This allows the robot to autonomously adjust the gripping intensity according to the condition of the object, an important property, especially in human-machine interaction.

In robotics, optical or visual sensors have the tasks of acquiring information from an image or an image sequence, analyzing this, and acting or reacting on the basis of this analysis. For example, the data is recorded by one or more cameras (2D or 3D) or by a scanner. Optical sensors play an important part in the navigation of robots and their orientation in the surroundings.

Movement

There are many ways in which robots get from A to B. The most common are wheeled robots, because they are easy to control and move in an energy-efficient manner. In many cases, however, alternative means of movement are more suitable, such as in rough terrain or when robots have to move in confined spaces. Autonomous movement is a great challenge in this area: the robot must decide on its own which type of movement is most suitable for the respective situation and environment.

The most common method of movement is on four wheels. But there are also robots with one or two wheels to increase mobility and save components. All-terrain robots have six or more wheels.

Feeding robots in the agricultural industry are one example of this type of movement. The feed container, mixing device and weighing device are suspended on a rail or are guided laterally. Power is supplied by batteries, a trailing cable, or a supply rail. The robots are controlled by a process computer installed on the container. The feeding robot gets new feed from stationary storage or mixing containers.

Robots that walk upright on two legs are still a great challenge for developers, especially as regards stability. The ZMP (Zero Moment Point) algorithm is a solution from Honda that is used by the ASIMO robot to move on two legs. However, this model needs a flat surface to move. Consequently, this robot is not suitable for trips in rough terrain. A more advanced method is the use of a dynamic compensation algorithm. It is more robust than ZMP technology, as the robot’s movement is constantly monitored and the feet are placed to ensure stability. Robots that use this technology can even jump. Another approach is passive dynamics in which the momentum of the swinging limb is used for more efficiency. With this technology, robots can even walk uphill and are said to move ten times more efficiently than robots with ZMP technology. Currently the most impressive example as regards movement and balance comes from Boston Dynamics: The latest version of the walking robot Atlas is able to perform incredible jumps and backflips.

If we think about flying robots, the first thing that comes to mind is the drone, which is now used extensively in civil and military areas. But there are other interesting concepts, such as the EU project ARCAS (Aerial Robotics Cooperative Assembly System), in which scientists at the German Aerospace Center integrated a robotic gripper arm into an autonomous helicopter. This robot is used to inspect and repair pipelines; other conceivable areas of use are the maintenance of satellites or industrial plants and the construction of infrastructure on other planets. In 2013, researchers at Harvard University developed robotic bees that can fly and dive into water. In the long term, these tiny robots could take over the tasks of real bees, which are threatened with extinction, and pollinate plants.

Mobile robots are equipped with a combination of navigation hardware and software to perceive their surroundings, navigate optimally, and respond to dynamic events such as people or moving objects. In most cases, a combination of GPS, radar sensors, lidar, and cameras ensures that the robots can navigate and act safely in their environment.
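How such sensor readings might be fused into a driving decision can be sketched in a few lines. This is a deliberately simplified illustration (all names and thresholds are ours; real navigation stacks use probabilistic localization, mapping, and path planners rather than fixed rules):

```python
def choose_action(lidar_ranges_m, camera_sees_person, gps_fix_ok,
                  stop_dist=0.5, slow_dist=1.5):
    """Combine range sensors and camera/GPS status into a driving command.

    lidar_ranges_m: distance (meters) to the nearest obstacle per beam.
    Returns one of "stop", "slow", "go".
    """
    if not gps_fix_ok:
        return "stop"            # without localization, don't move at all
    closest = min(lidar_ranges_m)
    if camera_sees_person or closest < stop_dist:
        return "stop"            # dynamic event or imminent collision
    if closest < slow_dist:
        return "slow"            # obstacle nearby: reduce speed
    return "go"

print(choose_action([2.3, 3.1, 4.0], camera_sees_person=False, gps_fix_ok=True))  # → go
```

The point of the sketch is the redundancy: no single sensor is trusted alone, and the most conservative reading wins, which is why a camera-detected person stops the robot even when the lidar path looks clear.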

Overview and Outlook

Smart digital assistants on the ISS that reduce stress for humans on future Mars missions. Avatar robots, complete with VR suits, that let anyone visit any place in the world without being physically present. A four-meter-high, three-ton racer that redefines racing. There is no doubt: considering the rapid developments of recent years, robotics will have a lasting influence and will shape the future of humans and their cooperation and coexistence with increasingly intelligent, autonomous machines. The associated questions have more to do with ethics than with technology.

But should humans do everything that is technically feasible? How intelligent should machines become? Should robots be made so intelligent that some time in the future they see humans as an obstacle to their continued progress? How should driverless vehicles react when a collision is unavoidable: should they steer into a group of older people rather than a group of schoolchildren? Should it be possible for us to transfer our personality onto a machine hard drive so that we can continue to exist even when the shell of flesh and blood no longer allows it? At present, these considerations are still a long way off. For now, we return to the here and now and give an overview of projects from various areas of life that show how robots are used today.

Sample applications: Service

No seats, no tables, not even bar tables. Cafe X, which opened in San Francisco in 2018, is designed completely for efficiency and ultrafast service. Two coffee machines and a robotic arm provide the caffeine hit of the future. Customers choose their hot beverage plus any extras on a touch screen, enter their cell phone number, and pay by credit card. About 30 seconds later, a code is sent to their smartphone. They enter this code at the dispensing station and their coffee is poured. The complete process takes no longer than a minute. One employee in the café is sufficient to monitor the robot and replenish coffee beans, milk, etc.

Zume Pizza is located in the San Francisco Bay Area. The centerpiece of the company is the aptly named Doughbot that presses the dough into shape up to five times faster than a human being. Robots also dispense the perfect amount of sauce, spread the sauce, and remove the pizzas from the oven. Each morning, the number and types of pizzas required for that day are calculated. This means that each pizza doesn't have to be prepared from scratch, but is still fresh compared to classic fast food. All orders are placed by smartphone or on the company’s website. The company has no shops where pizzas can be picked up. All pizzas are delivered.

Sample applications: Geriatric care and nursing

The high level of physical stress for nurses and carers when lifting or moving patients is often the cause of health problems. With its Elevon project, the Fraunhofer Institute for Manufacturing Engineering and Automation wants to provide some relief for people working in the healthcare industry. Elevon is a semi-autonomous lifter for lifting and transporting persons. The multi-functional patient lifter is designed for use in inpatient care and will autonomously move the patient to the intended location. Elevon can carry out various transfer processes and transport patients both lying down and sitting. Staff can request the lifter electronically, which saves time and unnecessary legwork.

Sample applications: Transport

The world's first multicopter that is approved for manned flights comes from Germany. The Volocopter 2X vertical takeoff and landing aircraft has 18 rotors, flies completely autonomously and has room for two persons. Passengers also have the option of steering the multicopter with a joystick. It can be used as a flying taxi or a trade fair shuttle. Using its own hubs, the Volocopter, which is powered by electric motors, connects vital intersections like airports or business parks with city centers.

Logistics service provider Hermes is currently trialling package deliveries in German cities with the Starship robot. The company plans to use logistics robots in five to ten cities in 2018. Starship is developed, built, and maintained by the eponymous startup, which is based in Estonia and London, UK. The robot will deliver packages from Hermes branches to customers within a radius of up to four kilometers. The machines are controlled via satellite positioning. Once the systems have learned the street maps, they will be able to navigate completely autonomously.

The Future Bus from Mercedes first made its appearance in Amsterdam, Netherlands, in 2016. In trial mode, the bus traveled a 20 kilometer section from Schiphol airport to the city of Haarlem with no problems. The bus drives autonomously (initially with a driver for monitoring purposes) on a separate bus lane (Bus Rapid Transit, BRT) and brakes automatically when it recognizes obstacles. The system communicates with traffic lights and stops accurately at bus stops. The city buses can share information with each other, such as details about the weather, the number of passengers on board, and road conditions. The Future Bus will go into serial production from the start of 2020.

Sample applications: Work area

In January 2017, Netherlands-based Robot Security Systems launched its SAM3 security robot on the market. The robot is designed for building security and can recognize people and objects while on the move, avoid obstacles, and even operate elevators and doors. If a person cannot identify themselves via the integrated ID reader within a time limit, or tries to sabotage the robot, SAM triggers an alarm. The robot scans the area it is programmed to patrol and is able to detect fire or faults in technical systems. The machine has a thermal imaging camera and four further cameras so that it has 360-degree vision. It has a top speed of 10 kilometers per hour.

In 2017, Danish company Nilfisk introduced a scrubber-vacuum cleaner for autonomous wet cleaning of floors. The Advance Liberty A 50 floor-care robot has sensors, cameras, and software with which it can map a room after a single manually guided run. The robot recognizes obstacles as small as a tennis ball and avoids them on its own. It can even be used in supermarkets during opening hours.

Sample applications: Household

According to the producer, Asus, the domestic robot Zenbo assists people at home as a helper, entertainer, and companion. When it is integrated into a smart home system, Zenbo can check who is ringing the doorbell. When the camera image from the front door has been sent to the robot, the house owner can unlock the door by remote control. The emergency call function is intended for use in homes where older people live. If the robot registers that someone has fallen down, it alerts other family members. Zenbo can also remind people in the house to take their medication or that they have an appointment with their physician.

Sample applications: Medicine

Researchers at Arizona State University in the US have developed nanorobots that can search for tumors in the body on their own and then cut off the blood flow to them. Tumors and metastases have been successfully fought with this technology in experiments on mice. If a nanobot encounters a tumor blood vessel in the bloodstream, it triggers a blood clot in it. As a result, the tumor shrivels and dies. The scientists report that the nanobots work quickly and lay siege to the cancer cells just a few hours after being injected. To date, there have been no side effects.

Last update: March 2018