
UK council's latest staff member is an AI called Amelia

Movies like to frame the rise of artificial intelligence as a kind of Terminator-style hostile takeover. The reality is somewhat different.

This week, robots ticked off another box in the “win” category when Enfield council in London announced that it has acquired a new AI program called Amelia, which will be used to answer local residents’ questions. To do this, Amelia relies on cutting-edge speech recognition, alongside an ability to understand the context of conversations, apply logic, learn, resolve problems — and even sense the emotions of the person it’s speaking with.

“Amelia has an emotional ontology in which the words exchanged in the customer dialogue are mapped onto a well-established psychological model,” Frank Lansink, European CEO at Amelia’s home company IPsoft, tells Digital Trends. “This allows her to interpret the emotional mood and personality. This means she can react appropriately to the dialogue in terms of what she says and in her facial expression and gestures. Of course, if a customer is going across a specific threshold that indicates the customer is getting angry or frustrated she can escalate to a human colleague. Conversely, a human colleague may be brought in to discuss a new offer or service with a customer who is very happy with the exchange.”
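To make the escalation idea concrete, here is a minimal sketch of threshold-based hand-off like the one Lansink describes: score the emotional tone of a dialogue and route to a human agent once frustration crosses a threshold. Every name, word weight, and threshold below is an illustrative assumption, not IPsoft’s actual model or API.

```python
# Hypothetical sketch: cumulative frustration scoring with an
# escalation threshold. Weights and threshold are invented for
# illustration; a real system would use a trained sentiment model.

FRUSTRATION_WORDS = {"angry": 0.9, "ridiculous": 0.7, "waiting": 0.4, "again": 0.3}
ESCALATION_THRESHOLD = 1.5  # assumed cut-off for handing off to a human

def frustration_score(messages):
    """Sum naive per-word frustration weights across the dialogue."""
    score = 0.0
    for message in messages:
        for word in message.lower().split():
            score += FRUSTRATION_WORDS.get(word.strip(".,!?"), 0.0)
    return score

def should_escalate(messages):
    """Escalate once the cumulative score crosses the threshold."""
    return frustration_score(messages) >= ESCALATION_THRESHOLD

dialogue = ["I have been waiting for weeks.", "This is ridiculous, I am angry!"]
print(should_escalate(dialogue))  # True: 0.4 + 0.7 + 0.9 = 2.0 >= 1.5
```

The key design point the quote gestures at is that escalation is directional: the same scoring that flags frustration can also flag a very positive exchange as an opportunity to bring in a human for an upsell.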

Amelia will start out answering questions on the council’s website, although the council hopes her responsibilities will soon expand. Amelia is reportedly 60 percent cheaper than a comparable human worker — which comes in handy given that Enfield council, like many organizations around the world, is currently in the middle of cost-cutting measures.

But despite a growing number of reports suggesting that tools like Amelia are helping put the jobs of fleshy mortals at risk, Lansink says we don’t have too much to fear.

“At Enfield council and elsewhere, Amelia plays a complementary role; she supports human staff,” he says. “There have been no job losses because of the introduction of Amelia and there are no plans to reduce human staff. As an assistive technology, Amelia focuses on taking care of routine tasks, so that human staff have more time to tackle the more complex cases – and as a result helps to improve service delivery for all residents. Amelia opens up the possibility for re-designing roles so that she absorbs the mundane, repetitive questions at speed and at very high volume. This in turn allows her human colleagues to focus on exceptions, complex queries, analysis and future design of services for customers.”

Of course, that’s what they said about Skynet!

Luke Dormehl