Building on the work already completed at the NZ event, Jim worked with the team to further develop their mobile/wearable application, which combines chatbot and advanced voice recognition technologies.
Jim is Theta’s resident cognitive services and chatbot expert, and his skills in this area were put to good use at the hackfest. The team worked with Microsoft’s Cognitive Services toolkit, including cris.ai custom speech recognition, to build a prototype application, using Xamarin, that can record, recognise and process speech.
The proposed application was such a new and innovative use of cognitive services that Microsoft assigned developers from their CRIS (speech recognition) and LUIS (language understanding) teams to support our team’s work. This was a great opportunity for the team. Says Jim Taylor:
“It was so useful to have members of the Microsoft developer team on board who were able to use their expertise to refine and enhance our solution. They had direct access to the teams responsible for the products we were using, and were able to get help and provide direct feedback on any issues we were having.”
“We already had some good training data to help this process along, including six different voices, all with different accents and pronunciation. One thing we didn’t anticipate and had to overcome was the model recognising “NZ” as “N Zed”, because of course in the US “Z” is pronounced “zee”.”
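One common way to handle locale-specific pronunciations like this is a small post-processing step over the recognised transcript. The sketch below illustrates the idea with a hypothetical mapping table; it is not the team’s actual fix, and the patterns shown are assumptions for illustration only.

```python
import re

# Hypothetical mapping of spelled-out NZ-English renderings back to the
# intended token, applied after speech recognition returns a transcript.
PRONUNCIATION_FIXES = {
    r"\bn\s*zed\b": "NZ",  # a US-trained model may transcribe "NZ" as "N Zed"
    r"\bzed\b": "Z",
}

def normalise_transcript(text: str) -> str:
    """Apply case-insensitive pattern replacements to a recognised transcript."""
    for pattern, replacement in PRONUNCIATION_FIXES.items():
        text = re.sub(pattern, replacement, text, flags=re.IGNORECASE)
    return text

print(normalise_transcript("Welcome to N Zed"))  # -> "Welcome to NZ"
```

A step like this sits downstream of the recogniser, so it can be tuned without retraining the custom speech model.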
By the end of the week-long hackfest, the team had proved their concept, and built a working prototype covering most elements of the proposed solution. They are keen to continue the work back in NZ, so watch this space…