
This bug-like robot is learning to improvise on the go

In the world of robotics, insects are among the most commonly studied creatures, and there are plenty of robots attempting to replicate the movement patterns of various creepy crawlies. One thing that even the most advanced A.I. currently struggles with is improvisation. Insects may not be the most intelligent creatures on the planet, but they can still adapt to new situations in a way most A.I. cannot. Now, a new robot from Tokyo Tech provides a fresh look at how robotics is taking cues from nature.


“Perhaps the most exciting moment in the research was when we observed the robot exhibit phenomena and gaits which we neither designed nor expected, and later found out also exist in biological insects,” lead researcher Ludovico Minati said in a press release.

Theoretically, you could give an A.I. responses to thousands of pre-programmed situations. The Tokyo Tech team believes there must be a simpler way, since insects manage to respond to new situations despite their limited intelligence.

The insectoid machine does make use of a pattern generator, but its approach is still simpler than the one used in much of robotics. The pattern generator sends a master signal to the oscillators, which control the legs. From there, the robot simply needs to tweak one of its five pre-programmed responses to create something new.

“An important aspect of the controller is that it condenses so much complexity into only a small number of parameters. These can be considered high-level parameters, in that they explicitly set the gait, speed, posture, etc.,” said Yasuharu Koike. “Because they can be changed dynamically, in the future it should be easy to vary them in real-time using a brain-computer interface, allowing the control of complex kinematics otherwise impossible to dominate with current approaches.”
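To make that idea a little more concrete, here is a minimal Python sketch of an oscillator-driven gait controller of the general kind described above. It is not the Tokyo Tech team's actual code; the gait table, parameter names, and six-legged layout are assumptions made purely for illustration.

```python
import math

# Hypothetical gait table: phase offset of each of six legs relative to the
# master oscillator. Swapping tables changes the whole walking pattern.
GAITS = {
    "tripod": [0.0, 0.5, 0.0, 0.5, 0.0, 0.5],
    "wave":   [0.0, 1/6, 2/6, 3/6, 4/6, 5/6],
}

def leg_angles(t, gait="tripod", speed=1.0, amplitude=0.4):
    """Return a joint angle (radians) for each of six legs at time t.

    A single master signal drives every leg; a handful of high-level
    parameters (gait, speed, amplitude) shape the result instead of
    each leg being programmed individually.
    """
    master_phase = speed * t  # master signal shared by every leg
    offsets = GAITS[gait]
    return [amplitude * math.sin(2 * math.pi * (master_phase + off))
            for off in offsets]

# Example: sample the tripod gait over one second of simulated time.
if __name__ == "__main__":
    for step in range(5):
        t = step * 0.25
        angles = [round(a, 2) for a in leg_angles(t)]
        print(f"t={t:.2f}s angles={angles}")
```

Switching the gait from "tripod" to "wave", or nudging the speed at runtime, reshapes the entire walking pattern through just those few numbers, which is the sort of low-dimensional, dynamically adjustable control the researchers describe.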

Aside from simply being an interesting development in the field of robotics, this technology could have practical applications as well. The team is hopeful that this will make it easier to use robots for tasks that involve traversing unfamiliar terrain, since they can more easily adapt to their surroundings.

Eric Brackett
Former Digital Trends Contributor