Actroid is a type of android (humanoid robot) with strong visual human-likeness

Actroid is a type of android (humanoid robot) with strong visual human-likeness developed by Osaka University and manufactured by Kokoro Company Ltd. (the animatronics division of Sanrio). It was first unveiled at the 2003 International Robot Exhibition in Tokyo, Japan. Several different versions of the product have been produced since then. In most cases, the robot’s appearance has been modeled after an average young woman of Japanese descent.

The Actroid is a pioneering example of a real machine resembling the imagined machines described by the science fiction terms android or gynoid, which until now have been used only for fictional robots. It can mimic such lifelike functions as blinking, speaking, and breathing. The “Repliee” models are interactive robots with the ability to recognize and process speech and respond in kind.

The original Repliee Q1 had a “sister” model, Repliee R1, which is modeled after a 5-year-old Japanese girl.

More advanced models were presented at Expo 2005 in Aichi to help direct people to specific locations and events. These robots were given four unique faces. The Repliee Q1-expo was modeled after a presenter for NHK news. To make the face of the Repliee Q2 model, the faces of several young Japanese women were scanned and the images combined into an average composite face.

The newer model Actroid-DER2 made a recent tour of U.S. cities. At NextFest 2006, the robot spoke English and was displayed in a standing position and dressed in a black vinyl bodysuit. A different Actroid-DER2 was also shown in Japan around the same time. This new robot has more realistic features and movements than its predecessor.

In July 2006, the robot was given yet another appearance. This model was built to look like its male co-creator, roboticist Hiroshi Ishiguro, and named Geminoid HI-1. Controlled by a motion-capture interface, Geminoid HI-1 can imitate Ishiguro’s body and facial movements, and it can reproduce his voice in sync with his motion and posture. Ishiguro hopes to develop the robot’s human-like presence to such a degree that he could use it to teach classes remotely, lecturing from home while the Geminoid interacts with his classes at Osaka University.
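The article describes the Geminoid’s teleoperation only at a high level. As a rough, hypothetical sketch of what a motion-capture control loop involves (the joint names, message format, and update rate below are illustrative assumptions, not the Geminoid’s actual software), captured operator poses are streamed to the android and replayed on its actuators at a fixed rate:

```python
# Hypothetical sketch of a motion-capture teleoperation loop; the pose fields,
# transport, and rate are illustrative assumptions, not the Geminoid's software.
import json
import time

def capture_operator_pose() -> dict:
    """Stand-in for a motion-capture reading of the operator's head and face pose."""
    return {"head_yaw": 0.1, "jaw_open": 0.4, "eyelid_left": 0.9, "eyelid_right": 0.9}

def send_to_robot(pose: dict) -> None:
    """Stand-in for the network link to the android's actuator controller."""
    print("actuator command:", json.dumps(pose))

def teleoperation_loop(rate_hz: float = 30.0, duration_s: float = 1.0) -> None:
    """Stream operator poses to the robot at a fixed rate so its motion tracks the operator."""
    period = 1.0 / rate_hz
    t_end = time.time() + duration_s
    while time.time() < t_end:
        send_to_robot(capture_operator_pose())
        time.sleep(period)

teleoperation_loop(duration_s=0.1)
```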

In May 2011, a Danish lecturer, Henrik Schärfe, revealed a robotic version of himself. Manufactured in Japan and called Geminoid-DK, its actions are controlled remotely by a person operating a computer, but it is programmed with Schärfe’s own unique body movements, such as shrugs and glances.

Tipron is a transforming robot projector that looks like a rolling eyeball

The Tipron, developed by Japanese smart device maker Cerevo, looks like a white eye on a sleek robotic stalk and wheeled base — a child’s friendly sidekick droid in some ’70s science fiction movie. It has one purpose: to move around your house and project things on walls. I assume there is a segment of the population that has always wanted a mobile robot projector; in fact, another one debuted two years ago. The rest of us can enjoy it for what it is: the prototypical weird CES gadget.

It’s hard to say how well the Tipron fulfills its intended purpose on a crowded show floor, especially given the bright lights and lack of a screen to show off the projection quality. It’s managed via a smartphone app that acts like a remote control for both the projector head and the robot itself. While moving, the Tipron folds up and slowly rolls wherever it’s directed. With a button tap, it extends into projection mode, where users can change the angle and keystone of an image.

Tipron can automatically project content you’d like to see, such as movies or pictures, as an 80-inch image, not only on a wall but also on a floor or ceiling, at any angle, from a distance of 3 meters.
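For a rough sense of the projection geometry those figures imply (an illustrative calculation, not a Cerevo specification): an 80-inch diagonal at a 16:9 aspect ratio is about 1.77 m wide, so throwing that image from 3 meters corresponds to a throw ratio of roughly 1.7.

```python
import math

def image_width_from_diagonal(diagonal_m: float, aspect_w: int = 16, aspect_h: int = 9) -> float:
    """Width of a projected image given its diagonal and aspect ratio."""
    aspect_diag = math.hypot(aspect_w, aspect_h)
    return diagonal_m * aspect_w / aspect_diag

diagonal_m = 80 * 0.0254                          # 80-inch diagonal is about 2.03 m
width_m = image_width_from_diagonal(diagonal_m)    # about 1.77 m at 16:9
throw_distance_m = 3.0

# Throw ratio = throw distance / image width, a standard projector figure of merit
throw_ratio = throw_distance_m / width_m
print(f"image width ~ {width_m:.2f} m, implied throw ratio ~ {throw_ratio:.2f}")
```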

Using the built-in RSS reader function, it is easy to display information such as news, weather forecasts and Twitter feeds. The news can also be set to scroll automatically.

Automatic charging: Tipron returns to its charging station automatically and starts charging itself after finishing all scheduled actions. It can also return to recharge when its battery is low (only supported when operating in scheduled mode).
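A minimal sketch of how such scheduled-mode behavior might be structured, assuming a simple action list and a hypothetical low-battery threshold (illustrative only, not Cerevo’s firmware or app API):

```python
from dataclasses import dataclass
from typing import Callable, List

LOW_BATTERY_PCT = 20  # hypothetical threshold; the real firmware's value is not published

@dataclass
class ScheduledAction:
    name: str
    run: Callable[[], None]

def run_schedule(actions: List[ScheduledAction],
                 battery_pct: Callable[[], int],
                 return_to_dock: Callable[[], None]) -> None:
    """Run scheduled actions, then return to the dock; bail out early if the battery runs low."""
    for action in actions:
        if battery_pct() <= LOW_BATTERY_PCT:
            break                      # skip the rest of the schedule and go recharge
        action.run()
    return_to_dock()                   # dock and start charging

# Illustrative usage with stand-in functions
if __name__ == "__main__":
    schedule = [
        ScheduledAction("morning news ticker", lambda: print("projecting RSS headlines")),
        ScheduledAction("photo slideshow", lambda: print("projecting photos")),
    ]
    run_schedule(schedule, battery_pct=lambda: 75, return_to_dock=lambda: print("returning to dock"))
```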

Shimon, a four-armed, marimba-playing robot

Article credit: http://www.news.gatech.edu/2017/06/13/robot-uses-deep-learning-and-big-data-write-and-play-its-own-music

A marimba-playing robot with four arms and eight sticks is writing and playing its own compositions in a lab at the Georgia Institute of Technology. The pieces are generated using artificial intelligence and deep learning.

Researchers fed the robot nearly 5,000 complete songs — from Beethoven to the Beatles to Lady Gaga to Miles Davis — and more than 2 million motifs, riffs and licks of music. Aside from giving the machine a seed, or the first four measures to use as a starting point, no humans are involved in either the composition or the performance of the music.

The first two compositions are roughly 30 seconds in length. The robot, named Shimon, can be seen and heard playing them in videos released by Georgia Tech.

Ph.D. student Mason Bretan is the man behind the machine. He’s worked with Shimon for seven years, enabling it to “listen” to music played by humans and improvise over pre-composed chord progressions. Now Shimon is a solo composer for the first time, generating the melody and harmonic structure on its own.

“Once Shimon learns the four measures we provide, it creates its own sequence of concepts and composes its own piece,” said Bretan, who will receive his doctorate in music technology this summer at Georgia Tech. “Shimon’s compositions represent how music sounds and looks when a robot uses deep neural networks to learn everything it knows about music from millions of human-made segments.”

Bretan says this is the first time a robot has used deep learning to create music. And unlike its days of improvising, when it played monophonically, Shimon is able to play harmonies and chords. It’s also thinking much more like a human musician, focusing less on the next note, as it did before, and more on the overall structure of the composition.

“When we play or listen to music, we don’t think about the next note and only that next note,” said Bretan. “An artist has a bigger idea of what he or she is trying to achieve within the next few measures or later in the piece. Shimon is now coming up with higher-level musical semantics. Rather than thinking note by note, it has a larger idea of what it wants to play as a whole.”
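The article does not publish Shimon’s network architecture, but the workflow it describes (seed the model with a few measures, then let it generate a continuation on its own) can be sketched with a small recurrent sequence model. Everything below, including the token scheme, layer sizes, and sampling, is an illustrative assumption rather than Bretan’s actual system, and the model would first have to be trained on the motif dataset:

```python
# Minimal sketch of seed-conditioned music generation with a recurrent network.
# The vocabulary, model size and sampling scheme are illustrative assumptions,
# not the architecture actually used for Shimon.
import torch
import torch.nn as nn

VOCAB_SIZE = 130   # e.g. 128 MIDI pitches plus rest and end-of-measure tokens (assumed encoding)

class MelodyLSTM(nn.Module):
    def __init__(self, vocab=VOCAB_SIZE, embed=64, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab, embed)
        self.lstm = nn.LSTM(embed, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, tokens, state=None):
        x = self.embed(tokens)
        out, state = self.lstm(x, state)
        return self.head(out), state

@torch.no_grad()
def continue_from_seed(model, seed_tokens, n_new=64, temperature=1.0):
    """Feed the seed measures through the model, then sample a continuation token by token."""
    model.eval()
    tokens = torch.tensor(seed_tokens, dtype=torch.long).unsqueeze(0)  # shape (1, seed_len)
    logits, state = model(tokens)
    generated = list(seed_tokens)
    next_logits = logits[:, -1, :]
    for _ in range(n_new):
        probs = torch.softmax(next_logits / temperature, dim=-1)
        nxt = torch.multinomial(probs, 1)          # sample the next note token
        generated.append(int(nxt))
        next_logits, state = model(nxt, state)
        next_logits = next_logits[:, -1, :]
    return generated

# Untrained example run; a trained model would be fit on the motif dataset first
model = MelodyLSTM()
print(continue_from_seed(model, seed_tokens=[60, 62, 64, 65, 67, 65, 64, 62], n_new=16)[:12])
```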

Shimon was created by Bretan’s advisor, Gil Weinberg, director of Georgia Tech’s Center for Music Technology.

“This is a leap in Shimon’s musical quality because it’s using deep learning to create a more structured and coherent composition,” said Weinberg, a professor in the School of Music. “We want to explore whether robots could become musically creative and generate new music that we humans could find beautiful, inspiring and strange.”

Shimon will create more pieces in the future. As long as the researchers feed it a different seed, the robot will produce something different each time — music that the researchers can’t predict. In the first piece, Bretan fed Shimon a melody composed of eighth notes. The second time, it received a sixteenth-note melody, which influenced it to generate faster note sequences.

The da Vinci Surgical System is a robotic surgical system

The da Vinci Surgical System is a robotic surgical system made by the American company Intuitive Surgical. Approved by the Food and Drug Administration (FDA) in 2000, it is designed to facilitate complex surgery using a minimally invasive approach, and is controlled by a surgeon from a console. The system is commonly used for prostatectomies, and increasingly for cardiac valve repair and gynecologic surgical procedures.

According to the manufacturer, the system is called “da Vinci” in part because Leonardo da Vinci’s “study of human anatomy eventually led to the design of the first known robot in history.”

Da Vinci Surgical Systems operate in hospitals worldwide, with an estimated 200,000 surgeries conducted in 2012, most commonly for hysterectomies and prostate removals. As of September 30, 2016, there was an installed base of 3,803 units worldwide: 2,501 in the United States, 644 in Europe, 476 in Asia, and 182 in the rest of the world. The “Si” version of the system costs on average slightly under US$2 million, in addition to several hundred thousand dollars of annual maintenance fees. The da Vinci system has been criticized for its cost and for a number of issues with its surgical performance.

Deep Trekker DT340 – an innovation in pipe inspection robots

Photo and article credit: Deep Trekker

Introducing the world’s only truly portable, battery-operated pipe crawler system. That’s right: everything you need comes in only two carrying cases. No more dedicated trucks or complicated systems; you can deploy from anywhere in under 5 minutes.

Not only does the DT340 pipe crawler include internal batteries, it also comes with a lightweight handheld control console, a strong but thin tether, a pivoting tether connection, wheel and track options, and plug-and-play integrations – all designed to make your pipe inspections easier.

We’ve taken what we’ve learned in the underwater submersibles world and applied it to the pipe inspection industry. The new DT340 Pipe Crawler is depth rated to 50 m (164 ft), requires no topside power, and is affordable for small municipalities and service companies. The DT340 Pipe Crawler is perfect for water pipe and sewer pipe inspections.

Method 2 – a manned bipedal robot designed by Vitaly Bulgarov

A South Korean robot is getting attention for both its form and function. The Method 2 robot bears a striking resemblance to robots from the silver screen, but could soon see action in the DMZ between South and North Korea.

“Everything we have been learning so far on this robot can be applied to solve real-world problems,” said designer Vitaly Bulgarov on his Facebook page.

He has previously worked on film series such as Transformers, Robocop and Terminator.

The Method 2’s creators, at Hankook Mirae Technology, claim that the “robot is the world’s first manned bipedal robot and is built to work in extreme hazardous areas where humans cannot go.” That’s according to Mirae company chairman Yang Jin-Ho, who also noted that the robot was taking “baby steps” and needed a couple of years before it would be allowed to “move freely.”

Building the giant robot was a challenge for the engineers – most of them in their mid and late 30s – as its unprecedented scale meant they had nothing to refer to, said one who declined to be named.

iCub is a 1-metre-high, open-source humanoid robot testbed

iCub is a 1-metre-high, open-source humanoid robot testbed for research into human cognition and artificial intelligence.

It was designed by the RobotCub Consortium of several European universities and built by the Italian Institute of Technology, and is now supported by other projects such as ITALK.[1] The robot is open source, with the hardware design, software and documentation all released under the GPL license. The name is a partial acronym, cub standing for Cognitive Universal Body. Initial funding for the project was €8.5 million from Unit E5 – Cognitive Systems and Robotics – of the European Commission’s Seventh Framework Programme, and this ran for 65 months from 1 September 2004 until 31 January 2010.

The motivation behind the strongly humanoid design is the embodied cognition hypothesis, that human-like manipulation plays a vital role in the development of human cognition. A baby learns many cognitive skills by interacting with its environment and other humans using its limbs and senses, and consequently its internal model of the world is largely determined by the form of the human body. The robot was designed to test this hypothesis by allowing cognitive learning scenarios to be acted out by an accurate reproduction of the perceptual system and articulation of a small child so that it could interact with the world in the same way that such a child does.

Advanced Step in Innovative Mobility – ASIMO, created by Honda

ASIMO (an acronym for Advanced Step in Innovative Mobility) is a humanoid robot created by Honda in 2000. It is currently displayed in the Miraikan museum in Tokyo, Japan.

Honda began developing humanoid robots in the 1980s, including several prototypes that preceded ASIMO. It was the company’s goal to create a walking robot. E0 was the first bipedal (two-legged) model produced as part of the Honda E series, an early experimental line of self-regulating humanoid walking robots with wireless movements, created between 1986 and 1993. This was followed by the Honda P series of robots produced from 1993 through 1997. The research done on the E and P series led to the creation of ASIMO. Development began at Honda’s Wako Fundamental Technical Research Center in Japan in 1999 and ASIMO was unveiled in October 2000.

ASIMO stands 130 cm (4 ft 3 in) tall and weighs 54 kg (119 lb). Research conducted by Honda found that the ideal height for a mobility assistant robot was between 120 cm and the height of an average adult, which is conducive to operating door knobs and light switches. ASIMO is powered by a rechargeable 51.8 V lithium-ion battery with an operating time of one hour. Switching from a nickel-metal hydride battery in 2004 increased the amount of time ASIMO can operate before recharging. ASIMO has a three-dimensional computer processor that was created by Honda and consists of three stacked dies: a processor, a signal converter and memory. The computer that controls ASIMO’s movement is housed in the robot’s waist area and can be controlled by a PC, wireless controller, or voice commands.

ASIMO has the ability to recognize moving objects, postures, gestures, its surrounding environment, sounds and faces, which enables it to interact with humans. The robot can detect the movements of multiple objects by using visual information captured by two camera “eyes” in its head and also determine distance and direction. This feature allows ASIMO to follow or face a person when approached.
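As a hedged illustration of the basic geometry such a two-camera distance estimate relies on (not Honda’s actual vision pipeline), depth can be recovered from the disparity between the left and right views of a rectified stereo pair:

```python
# Illustrative stereo-depth calculation, not Honda's actual vision pipeline:
# with two horizontally separated cameras, an object's distance can be estimated
# from how far its image shifts between the left and right views (the disparity).
def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible object")
    return focal_length_px * baseline_m / disparity_px

# Assumed numbers purely for illustration: 700 px focal length, 7 cm baseline, 35 px disparity
print(f"estimated distance: {stereo_depth(700.0, 0.07, 35.0):.2f} m")   # -> 1.40 m
```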

There are sensors that assist in autonomous navigation. The two cameras inside the head are used as a visual sensor to detect obstacles. The lower portion of the torso has a ground sensor, which comprises one laser sensor and one infrared sensor. The laser sensor is used to detect the ground surface. The infrared sensor, with automatic shutter adjustment based on brightness, is used to detect pairs of floor markings and so confirm the navigable paths of the planned map. The pre-loaded map and the detection of floor markings help the robot precisely identify its present location and continuously adjust its position. There are front and rear ultrasonic sensors to sense obstacles. The front sensor is located at the lower portion of the torso together with the ground sensor; the rear sensor is located at the bottom of the backpack.
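A minimal sketch of the general idea of confirming position against a pre-loaded map using floor markings follows; the map format, marker IDs, and correction rule are assumptions made for illustration, not ASIMO’s navigation software.

```python
# Hedged sketch of marker-based localization against a pre-loaded map.
# The map format, marker IDs and the simple correction rule are illustrative
# assumptions, not ASIMO's actual navigation software.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float

# Pre-loaded map: marker ID -> known floor position in meters (purely hypothetical values)
FLOOR_MARKER_MAP = {
    "A1": Pose(0.0, 0.0),
    "A2": Pose(2.0, 0.0),
    "B1": Pose(2.0, 3.0),
}

def correct_pose(estimated: Pose, marker_id: str, offset_x: float, offset_y: float) -> Pose:
    """Snap the dead-reckoned pose to the marker's known map position plus the
    measured offset from the marker, discarding accumulated drift."""
    marker = FLOOR_MARKER_MAP[marker_id]
    return Pose(marker.x + offset_x, marker.y + offset_y)

# Example: odometry has drifted to (2.31, 2.74), but marker B1 is observed 0.2 m ahead
drifted = Pose(2.31, 2.74)
corrected = correct_pose(drifted, "B1", offset_x=0.0, offset_y=-0.2)
print(corrected)   # Pose(x=2.0, y=2.8)
```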