https://www.nplusonemag.com/issue-44/essays/human_fallback/
"The recruiter was a chipper woman with a master’s degree in English.
Previously she had worked as an independent bookseller. “Your experience as an
English grad student is ideal for this role,” she told me. The position was at
a company that made artificial intelligence for real estate. They had developed
a product called Brenda, a conversational AI that could answer questions about
apartment listings. Brenda had been acquired by a larger company that made
software for property managers, and now thousands of properties across the
country had put her to work.
Brenda, the recruiter told me, was a sophisticated conversationalist, so fluent
that most people who encountered her took her to be human. But like all
conversational AIs, she had some shortcomings. She struggled with idioms and
didn’t fare well with questions beyond the scope of real estate. To compensate
for these flaws, the company was recruiting a team of employees they called the
operators. The operators kept vigil over Brenda twenty-four hours a day. When
Brenda went off script, an operator took over and emulated Brenda’s voice.
Ideally, the customer on the other end would not realize the conversation had
changed hands, or that they had even been chatting with a bot in the first
place. Because Brenda used machine learning to improve her responses, she would
pick up on the operators’ language patterns and gradually adopt them as her
own."
Via The RISKS Digest Volume 33 Issue 58:
http://catless.ncl.ac.uk/Risks/33/58#subj5
Cheers,
*** Xanni ***
--
mailto:xanni@xanadu.net Andrew Pam
http://xanadu.com.au/ Chief Scientist, Xanadu
https://glasswings.com.au/ Partner, Glass Wings
https://sericyb.com.au/ Manager, Serious Cybernetics