Amazon is opening up the voice-control technology behind its Alexa assistant to everyone, allowing software and app developers to build the same capabilities into their own creations. The technology, called Lex, bundles speech recognition, conversational interactions and text understanding.
Opening Alexa voice technology to advance AI efforts
By doing this, the e-commerce company is clearly not looking for short-term revenue; rather, it wants to become the dominant voice computing platform. Reuters described the decision as a bid to lead the field in voice-controlled computing.
Lex was unveiled at a major cloud-computing conference in December, but it wasn't widely available at the time. According to Amazon CTO Werner Vogels, the platform could lead to chatbots and assistants that sound more human and friendly than their predecessors. The Lex platform will let tech companies quickly build text- or voice-based chat interfaces for their applications, making them interactive in the same way as Google Assistant, Siri and Alexa, notes The Verge.
Further, allowing developers to access the Lex platform will help the e-commerce company extend its reach in the AI space. Several U.S.-based tech firms have seen an opportunity to sell similar easy-to-use artificial intelligence (AI) tools; Microsoft and Google both offer suites of comparable services.
How Amazon plans to make Lex better
For the e-commerce company, the Echo was a surprise hit, and it is now moving to capitalize on its lead by putting its voice assistant into a wide range of devices, from speakers and lamps to cars and clocks.
Vogels said that sales of the Echo will likely never match those of the iPhone, but people use Alexa for multiple tasks around the house, while they tend to interact with a smartphone's voice assistant only when they are in their cars.
“There’s massive acceleration happening here. The cool thing about having this running as a service in the cloud instead of in your own data center or on your own desktop is that we can make Lex better continuously by the millions of customers that are using it,” Vogels told Reuters.
Instead of living within individual apps, Lex runs in the cloud. This means Amazon can keep improving the platform by continually feeding it training data drawn from people's interactions with Alexa.
We cannot ignore that Google and Apple have access to more valuable data thanks to the billions of Android devices and iPhones they sell to users who test their AI tools and provide feedback, notes The Verge. Amazon therefore needs more sources of data, and its latest decision to open up the technology will let it channel the interactions of apps that third-party developers build with Lex.
Earlier this year, the e-commerce company began prepping a software package that included its developer services, dubbed Polly and Lex, notes Engadget. Real-world feedback of this kind is extremely valuable for building AI tools.