
Spatial AI & A Future Beyond the Screen

Date / Time: Saturday 23 November / 15:35 - 16:45 EET @ Workshop Room (-1st floor, Room 2)

Speaker: Violet Whitney
Co-Founder @ Spatial Pixel

Workshop Description:

Large language models are typically confined to a single interface, such as a chat app. In this workshop you will learn to create rich, coordinated experiences that span multiple applications and the physical world (smart home cameras, door locks, APIs, etc.).
You will use Funkify, a new application by Spatial Pixel that makes it easy to author and test LLM function calls. Using OpenAI’s API, you will build your own function calls, including external API calls to a physical smart plug, a weather API, the MapBox API, and others, so you can develop tangible, interactive experiences at different scales. You will then export your function declarations from Funkify into your own code. For example, we will trigger lights based on computer vision results or prompt for driving and cycling distances.
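As a concrete preview of the kind of function declaration you will author in Funkify and then export, here is a minimal sketch using OpenAI’s Python SDK and its Chat Completions tools interface. The function name, parameters, and model choice below are illustrative assumptions, not the workshop’s exact materials.

# Minimal sketch (illustrative only): declaring a function/tool for the
# OpenAI Chat Completions API and letting the model decide when to call it.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

tools = [{
    "type": "function",
    "function": {
        "name": "get_cycling_distance",  # hypothetical function name
        "description": "Return the cycling distance in km between two places.",
        "parameters": {
            "type": "object",
            "properties": {
                "origin": {"type": "string", "description": "Starting point, e.g. 'Sofia'"},
                "destination": {"type": "string", "description": "End point, e.g. 'Plovdiv'"},
            },
            "required": ["origin", "destination"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any tool-capable model works; this choice is an assumption
    messages=[{"role": "user", "content": "How far is it to cycle from Sofia to Plovdiv?"}],
    tools=tools,
)

# If the model decides the function is needed, it returns a structured tool call
# (the function name plus JSON arguments) instead of a plain text answer.
print(response.choices[0].message.tool_calls)

Funkify lets you author and test declarations like this interactively before exporting them into your own code.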

What will you learn:
Learn to connect LLMs to physical things (sensors, actuators, and other objects in the physical world), and become familiar with using natural language and function calling to create AI-enabled spatial interactions.
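As one hedged illustration of connecting an LLM to a physical object, the sketch below pairs a Python function that toggles a smart plug with a matching function declaration. It assumes a Shelly-style plug exposing a simple local HTTP relay endpoint; the IP address, endpoint, and function name are placeholders rather than the workshop’s actual hardware setup.

# Hedged sketch: a Python function an LLM tool call could dispatch to.
# Assumes a Shelly-style smart plug reachable on the local network; other plugs
# (TP-Link Kasa, Tasmota, etc.) expose different APIs.
import requests

PLUG_IP = "192.168.1.50"  # placeholder address for the smart plug

def set_plug(state: str) -> dict:
    """Turn the smart plug on or off. `state` is 'on' or 'off'."""
    resp = requests.get(f"http://{PLUG_IP}/relay/0", params={"turn": state}, timeout=5)
    resp.raise_for_status()
    return resp.json()

# A matching declaration so the model knows the function exists and how to call it:
PLUG_TOOL = {
    "type": "function",
    "function": {
        "name": "set_plug",
        "description": "Turn a physical smart plug on or off.",
        "parameters": {
            "type": "object",
            "properties": {"state": {"type": "string", "enum": ["on", "off"]}},
            "required": ["state"],
        },
    },
}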

Level: Beginner & Intermediate

Tags: #AI, #AmbientComputing, #SpatialComputing

Target audience: Developers and product designers

Prerequisites on Audience (HW/SW, Know-how):

  • Knowledge of JavaScript, Python, or another programming language is useful.

  • Attendees will need to bring a laptop. The workshop is entirely browser-based; Google Chrome is recommended.

  • Attendees intending to participate should sign up for an OpenAI platform account with billing enabled.

Deliverables: 

  • You will create AI-enabled spatial interactions.

Schedule: 

  • Introduction 10 mins
    Brief overview of the workshop and its objectives.
    What is “function calling” in the context of large language models and why does it matter?

  • Building a Function Call
    • Demo 10 mins
      • Demonstration: Building a function call.
      • Demonstration of the Funkify interface.
      • Demonstration of how to use semantic models to structure actions and rules.

    • Hands-on Exercise 15 mins
      • Write a basic LLM function call.
      • Use a semantic model to trigger your function call.

  • API Calling with Function Calls
    • Demo 10 mins
      • Demonstration: Building an API call to MapBox
      • Demonstration: Building an API call to a smart plug
    • Hands-on Exercise 20 mins (a sketch of the full round trip appears after the schedule)
      • Create your own function call with the Google Maps API.
      • Perform distance and other calculations that natural language alone can’t do reliably.
      • Create your own function call to a physical sensor via a smart plug.
      • Structure natural language queries to trigger sensor actions.

  • Wrap Up / Share 5 mins
    • Share our creations!
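
For reference, here is a hedged sketch of the round trip the final hands-on exercise builds toward: the model emits a tool call, your code runs it against an external API (the Mapbox Directions endpoint stands in for whichever mapping API you end up using), and the result goes back to the model so it can answer in natural language. Place names, hard-coded coordinates, and the model choice are illustrative assumptions.

# Hedged sketch: dispatching an LLM tool call to an external mapping API and
# returning the result to the model. All names and values are placeholders.
import json
import os

import requests
from openai import OpenAI

client = OpenAI()

tools = [{"type": "function", "function": {
    "name": "get_cycling_distance",
    "description": "Return the cycling distance in km between two places.",
    "parameters": {"type": "object", "properties": {
        "origin": {"type": "string"}, "destination": {"type": "string"}},
        "required": ["origin", "destination"]}}}]

def get_cycling_distance(origin: str, destination: str) -> dict:
    """Look up cycling distance via the Mapbox Directions API (geocoding skipped for brevity)."""
    coords = {"Sofia": (23.3219, 42.6977), "Plovdiv": (24.7453, 42.1354)}  # placeholder lookup
    (lon1, lat1), (lon2, lat2) = coords[origin], coords[destination]
    url = f"https://api.mapbox.com/directions/v5/mapbox/cycling/{lon1},{lat1};{lon2},{lat2}"
    r = requests.get(url, params={"access_token": os.environ["MAPBOX_TOKEN"]}, timeout=10)
    r.raise_for_status()
    return {"distance_km": round(r.json()["routes"][0]["distance"] / 1000, 1)}

messages = [{"role": "user", "content": "How far is it to cycle from Sofia to Plovdiv?"}]
first = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
call = first.choices[0].message.tool_calls[0]  # assumes the model chose to call the tool

# Run our own Python function with the arguments the model produced...
result = get_cycling_distance(**json.loads(call.function.arguments))

# ...then hand the result back so the model can phrase the final answer.
messages += [first.choices[0].message,
             {"role": "tool", "tool_call_id": call.id, "content": json.dumps(result)}]
final = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
print(final.choices[0].message.content)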