February 13, 2024

Jim’s Inside Scoop From the Microsoft AI Tour 2024

By

Jim Taylor, Emerging Technologies Architect

Theta

Emerging Technologies Architect Jim Taylor has been soaking up the latest AI advancements during his time in Sydney at the Microsoft AI Tour. Here’s a summary.

Last week, I had the pleasure of attending some insightful sessions and workshops at Microsoft's AI Tour in Sydney. From a business and technical standpoint, all the bases were covered, with a broad range of topics from the future of education with AI to developing production-level LLM-powered applications and exploring the latest generative AI technologies.

Here's a summary of my favourite sessions and my reflections on them:

The Future of Education in the Era of AI

This session gave a full overview of AI's role in transforming today's education. It highlighted its potential to automate administrative tasks, personalise learning, and save time for educators. A notable mention was Khanmigo, Khan Academy's AI-powered chatbot, which exemplifies AI's ability to engage students actively and assist teachers.

Key Points:

  • Cautious yet innovative AI adoption in schools: Trials in New South Wales demonstrate the integration of AI in educational settings, ensuring safety through content protection and alignment with the Australian curriculum.
  • Australian framework for generative AI in schools: This framework serves as a guide for ethical AI use in educational contexts, emphasising the need for responsible implementation.
  • Challenges and safeguards: It's vital to maintain an educational focus, use semantic filters, and involve real-time human oversight to ensure AI's effective and safe use in schools.

Industry Panel Insights:

  • Emphasis on critical thinking: It’s important for students to learn truth-detecting skills in the AI era, along with managing expectations around AI's capabilities.
  • User engagement strategies: Suggestions were made to improve AI interaction by encouraging more conversational exchanges and leveraging LLMs for better engagement.

[Australian Framework for Generative AI in Schools: Australian Framework for Generative AI]

[Azure AI Content Safety: Semantic Filters]

Developing a Production-level RAG Workflow

This hands-on technical workshop walked us through setting up a retrieval-augmented generation (RAG) workflow, which pulls data from an external source to enhance your prompts and queries.

We learnt about creating a Copilot in VS Code, using Azure AI Search for data indexing, and incorporating the RAG pattern for enhancing a large language model (LLM).
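To make the RAG pattern concrete, here's a minimal sketch of the idea: retrieve the most relevant documents for a query, then prepend them to the prompt so the LLM answers from your data. The in-memory keyword "index" below is a stand-in assumption for Azure AI Search, and no real LLM is called – it's purely illustrative.

```python
# Toy RAG sketch: retrieve relevant documents, then augment the prompt.
# The dict-based index is a placeholder for a real search service.

def retrieve(query: str, index: dict[str, str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    words = set(query.lower().split())
    scored = sorted(
        index.items(),
        key=lambda item: len(words & set(item[1].lower().split())),
        reverse=True,
    )
    return [text for _, text in scored[:top_k]]

def build_augmented_prompt(query: str, index: dict[str, str]) -> str:
    """Prepend retrieved context so the LLM answers from your data."""
    context = "\n".join(retrieve(query, index))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

index = {
    "doc1": "Azure AI Search indexes enterprise data for retrieval",
    "doc2": "Prompt flow streamlines LLM app development",
}
prompt = build_augmented_prompt("How is enterprise data indexed?", index)
```

In a production workflow the keyword overlap would be replaced by vector or hybrid search over a proper index, but the shape – retrieve, then augment – stays the same.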

Workshop learnings:

  • LLM lifecycle stages: We learnt what it takes to build and operate an LLM-based application, with a focus on continuous improvement and feedback loops – very important.
  • RAG architecture application: We gained insights into building, evaluating, and deploying RAG-based LLM applications, with examples provided on GitHub.

And for our more technical readers, here’s a bit more detail about the "LLM based applications Lifecycle in the real world" …

This lifecycle represents a systematic approach to integrating LLMs into business solutions, emphasising continuous iteration and improvement.

Structured phases of Large Language Model (LLM) development and deployment in business applications:

Ideating/exploring: This initial stage involves identifying a business need, generating ITD/Prompts, forming hypotheses, and locating appropriate LLMs for the task. This is the conceptual and preparatory phase where the potential of LLMs to address business challenges is explored.

Building/augmenting: In this technical phase, the LLM is developed and enhanced through:

Retrieval-augmented generation: Grounding the LLM's responses in additional retrieved data to improve its answers.

Exception handling: Managing errors or unexpected inputs.

Prompt engineering: Refining prompts to optimise LLM outputs.

Error analysis: Identifying and correcting mistakes made by the LLM.

A feedback loop to the ideating/exploring stage indicates an iterative process, allowing for refining ideas and approaches based on insights gained during development.
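As an illustration of the building/augmenting phase, here's a small sketch combining prompt engineering (a template), exception handling (a guarded call), and error analysis (a failure log you can review to refine prompts). The `fake_llm` function is a hypothetical stand-in for a real model endpoint.

```python
# Sketch of the build/augment loop: template, guarded call, error log.

error_log: list[str] = []

def fake_llm(prompt: str) -> str:
    """Stand-in for a model endpoint; rejects overly long prompts."""
    if len(prompt) > 200:
        raise ValueError("prompt too long")
    return f"Response to: {prompt}"

def guarded_call(user_input: str) -> str:
    # Prompt engineering: a fixed template wraps the raw user input.
    prompt = f"You are a helpful assistant.\nUser: {user_input}"
    try:
        return fake_llm(prompt)
    except ValueError as exc:
        # Error analysis: record the failure so prompts can be refined later.
        error_log.append(str(exc))
        return "Sorry, something went wrong."
```

The feedback loop in the lifecycle diagram corresponds to reviewing `error_log` and feeding what you learn back into the ideating/exploring stage.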

Operationalising: The final stage focuses on implementing the LLM in a live environment:

Content filtering: Setting up mechanisms to ensure the generation of appropriate content.

Prepare for app deployment: Readying the application for its release.

Ongoing QA and cost management: Ongoing quality assurance and cost control of the operational system.

Monitoring: Continuous oversight of system performance and output.

Safe rollout/scaling: Careful deployment and potential expansion of the LLM application.

Feedback: A "Send Feedback" prompt provides a mechanism to allow for post-deployment improvements to the LLM or its applications.
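To tie the operationalising steps together, here's a simplified sketch of a content filter gate plus a monitoring counter. A real deployment would use a service like Azure AI Content Safety; the blocklist categories below are hypothetical placeholders, not the service's actual taxonomy.

```python
# Sketch of operationalising: filter outputs, count what happens.
from collections import Counter

BLOCKLIST = {"violence", "self-harm"}  # hypothetical categories
metrics: Counter = Counter()

def moderate(output: str) -> str:
    """Gate model output through a simple filter, recording metrics."""
    metrics["total"] += 1
    if any(term in output.lower() for term in BLOCKLIST):
        metrics["blocked"] += 1
        return "[content removed by filter]"
    return output
```

The `metrics` counter is the seed of the monitoring step: tracking the blocked/total ratio over time is exactly the kind of signal that informs a safe rollout.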

What's New in Generative AI?

Michelle Sandford's session was a deep dive into the latest developments in Azure's AI services, showcasing the rapid advancements in AI technology and the endless possibilities for innovation in various fields. We saw the creation of custom avatars using Azure AI Speech Studio and the power of Azure AI Studio for GenAI applications.

A standout moment was definitely the Azure AI Speech Studio demo, where Michelle trained a model to speak in her voice (check it out below). It just goes to show the raw potential of generative AI to create personalised digital experiences.

Build Your Own Copilots With Microsoft Copilot Studio

This workshop took us through the steps required to build, run, evaluate, and deploy a RAG-based Large Language Model App to Azure.

We learnt how to use prompt flows, a tool that streamlines the entire development process and makes creation a lot easier. We also learnt how to edit and customise prompt flows using YAML or a visual editor in Azure AI Studio.

Interested in giving it a go? You can try the workshop yourself by following the steps here: github.com/Azure-Sample

[Workshop Samples: Contoso Chat, Contoso Web]

Build Your RAG Application With Prompt Flow in Azure AI Studio

In this technical workshop, we used AI Studio, a generative AI development hub, to build a Retrieval Augmented Generation (RAG) Large Language Model (LLM) application with Azure AI, Prompt Flow, and VS Code. The workshop notes are very comprehensive and contain all the information required to try this workshop for yourself.

[Workshop and Prompt Flow End-to-End Guide: Prompt Flow Guide]

AI in Healthcare

A session dedicated to generative AI's impact on healthcare, exploring its application in clinical simulation, training, and operational improvements. Demonstrations included a doctor-patient assessment trainer to show how AI can enhance medical training and decision-making.

Key Takeaways:

  • Generative AI in Clinical Training: Practitioners can use multiple GPT agents to simulate patient interactions and assessments.
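The multi-agent idea can be sketched as a simple turn-taking loop: one agent plays the patient, another plays the trainee, and the exchange is captured as a transcript for assessment. In the demo these would be GPT agents; the scripted replies below are assumptions purely for illustration.

```python
# Toy two-agent simulation: scripted "patient" vs. echoing "trainee".

PATIENT_SCRIPT = [
    "I've had a persistent cough for two weeks.",
    "No fever, but I feel tired.",
]

def patient_agent(turn: int) -> str:
    """Scripted patient; a real simulator would be an LLM agent."""
    return PATIENT_SCRIPT[turn % len(PATIENT_SCRIPT)]

def trainee_agent(patient_msg: str) -> str:
    """Placeholder trainee; a real agent would call an LLM."""
    return f"Noted: '{patient_msg}'. Can you tell me more?"

def run_assessment(turns: int = 2) -> list[tuple[str, str]]:
    """Alternate patient and trainee turns, returning the transcript."""
    transcript = []
    for t in range(turns):
        p = patient_agent(t)
        transcript.append(("patient", p))
        transcript.append(("trainee", trainee_agent(p)))
    return transcript
```

Swapping the scripted functions for two differently-prompted GPT agents gives you the doctor-patient assessment trainer pattern shown in the session.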

I found all of these sessions valuable and left with a bunch of cool insights I'm eager to weave into my work and share with my team and businesses alike. There are so many considerations in the world of AI, from practical applications to ethical implications across various sectors.

Summed up:

  • The discussions and workshops emphasised the importance of responsible AI use.
  • They highlighted the potential for personalisation and efficiency.
  • And they underlined the continuous need for innovation and learning in the AI space.

Want more clarity on a topic we covered?