The vision of a know-it-all AI that dishes out relevant and accurate information in easy-to-understand, bite-size chunks is shaping the way tech companies are approaching the future of search. And with the rise of voice assistants like Siri and Alexa, language models are becoming a go-to technology for finding stuff out in general.

But critics are starting to push back, arguing that the approach is wrongheaded. Asking computers a question and getting an answer in natural language can hide complexity behind a veneer of authority that is not deserved. "We got too bogged down by what we could do that we haven't looked at what we should do," says Shah.

"The Star Trek fantasy, where you have this all-knowing computer that you can ask questions and it just gives you the answer, is not what we can provide and not what we need," says Bender, a coauthor on the paper about the dangers of large language models that led to Timnit Gebru being forced out of Google.

It isn't just that today's technology is not up to the job, she believes. "I think there is something wrong with the vision," she says. "It is infantilizing to say that the way we get information is to ask an expert and have them just give it to us."

Google already uses language models to improve its existing search technology, helping it interpret user queries more accurately. But some believe that language models could be used to overhaul how search is done. Last year Google researcher Don Metzler and his colleagues proposed that search could be reimagined as a two-way conversation between user and language model, with computers answering questions much as a human expert might.

Google is also developing a technology called a multitask unified model, or MUM. Built on top of a language model, it is designed to respond to users' queries by pulling together information from different sources.

"We're deeply invested in advancing language understanding because it makes products like Google Search more useful for people," says Jane Park, a communications manager on Google's Search team. But she says that Google has no plans to turn this new research into products yet: "We agree there are a number of open challenges in language understanding, which is why we're taking a very cautious approach overall."

Mindless mimics

Large AI models can mimic natural language with remarkable realism. Trained on hundreds of books and much of the internet, they absorb vast amounts of information, or so the thinking goes. Why not use them as a kind of search engine, one that can synthesize responses from multiple sources and package up the information into easily understood sentences?

Shah and Bender suggest a number of solutions to the problems they anticipate. In general, search technologies should support the various ways that people use search engines today, many of which are not served by direct answers.