Vocabulary infrastructure for Alexa+ and the AI Action SDK evolution
The Scenario
“Alexa, teach me a new word.” Alexa+, powered by generative and agentic AI via Amazon Bedrock, chains multiple API calls in a single conversation. The Alexa AI Action SDK calls Word Orb for a verified definition, streams pronunciation audio to the Echo speaker, delivers a structured lesson, and runs a comprehension quiz — a complete voice learning experience in one multi-step agentic flow.
1. Word Orb looks up the word
One API call returns a verified definition, 47 translations, pronunciation audio, and etymology.
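The lookup result can be pictured as a single typed record. This is a minimal sketch: the field names, the `parse_lookup` helper, and the sample payload are assumptions for illustration, not the documented Word Orb schema.

```python
from dataclasses import dataclass, field

# Hypothetical shape of one Word Orb lookup result; field names are assumed.
@dataclass
class WordLookup:
    word: str
    definition: str
    translations: dict = field(default_factory=dict)  # language code -> word
    audio_url: str = ""                               # pronunciation audio for the Echo
    etymology: str = ""

def parse_lookup(payload: dict) -> WordLookup:
    """Turn the JSON response of a single lookup call into a typed record."""
    return WordLookup(
        word=payload["word"],
        definition=payload["definition"],
        translations=payload.get("translations", {}),
        audio_url=payload.get("audio_url", ""),
        etymology=payload.get("etymology", ""),
    )

# Example payload, illustrating the "one call returns everything" claim.
sample = {
    "word": "serendipity",
    "definition": "the occurrence of happy events by chance",
    "translations": {"es": "serendipia", "fr": "sérendipité"},
    "audio_url": "https://example.com/audio/serendipity.mp3",
    "etymology": "coined by Horace Walpole in 1754",
}
lookup = parse_lookup(sample)
```

Downstream steps (the lesson and the quiz) would consume this one record rather than issuing further dictionary calls.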
2. Lesson Orb delivers a structured lesson
A 5-phase lesson with the explorer teaching archetype.
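A 5-phase lesson is easy to sketch as data. The phase names and the shape of the "explorer" archetype below are assumptions based on the description above, not the Lesson Orb's actual format.

```python
# Assumed phase names for the explorer teaching archetype.
EXPLORER_PHASES = ["hook", "explore", "explain", "practice", "reflect"]

def build_lesson(word: str, definition: str, archetype: str = "explorer") -> dict:
    """Assemble a structured 5-phase lesson for one word."""
    return {
        "word": word,
        "definition": definition,
        "archetype": archetype,
        "phases": [
            {"phase": name, "prompt": f"[{archetype}] {name} phase for '{word}'"}
            for name in EXPLORER_PHASES
        ],
    }

lesson = build_lesson("serendipity", "the occurrence of happy events by chance")
```

Because the lesson is plain structured data, the same content can be rendered by voice on an Echo or visually on an Echo Show.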
3. Quiz Orb assesses comprehension
An interactive assessment aligned to the lesson.
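The comprehension check can be reduced to questions drawn from the lesson plus a scoring pass. The question format and scoring below are assumptions, not the Quiz Orb's actual schema.

```python
def grade_quiz(questions: list, answers: list) -> float:
    """Return the fraction of answers that match each question's answer key."""
    correct = sum(1 for q, a in zip(questions, answers) if a == q["answer"])
    return correct / len(questions)

# Hypothetical lesson-aligned questions for the word taught above.
questions = [
    {"prompt": "What does 'serendipity' mean?", "answer": "a happy accident"},
    {"prompt": "Who coined the word?", "answer": "Horace Walpole"},
]

score = grade_quiz(questions, ["a happy accident", "Horace Walpole"])
```

Keeping questions keyed to the lesson content is what makes the assessment "aligned": a wrong answer points back to a specific lesson phase to revisit.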
4. The Knowledge Graph connects everything
30,288 connections link words to lessons to assessments.
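At its simplest, those connections are directed edges from words to lessons to assessments. This toy traversal only illustrates the linkage idea; the real 30,288-edge graph presumably lives server-side, and the node naming scheme here is an assumption.

```python
from collections import defaultdict

# Adjacency list: each node points at the nodes it links to.
edges = defaultdict(list)

def link(src: str, dst: str) -> None:
    edges[src].append(dst)

# Hypothetical word -> lesson -> quiz chain for one entry.
link("word:serendipity", "lesson:serendipity-explorer")
link("lesson:serendipity-explorer", "quiz:serendipity-check")

def reachable(start: str) -> list:
    """Depth-first walk from a node to everything it connects to."""
    seen, stack, out = set(), [start], []
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        out.append(node)
        stack.extend(edges[node])
    return out

chain = reachable("word:serendipity")
```

A traversal like this is what lets one "teach me a word" request pull the lesson and the matching quiz without separate lookups.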
Why this matters for Amazon Alexa
- Alexa AI Action SDK integration: one Action chains Word Orb → Lesson Orb → Quiz Orb for a complete voice learning experience.
- Alexa+'s Multi-Agent SDK enables a vocabulary agent that coordinates with other Alexa services; OpenTable, Uber, and Ticketmaster are already integrated.
- Amazon Bedrock's model-agnostic architecture means the underlying LLM may change, while Orb's deterministic content stays constant across Nova and Anthropic models.
- 240,000 pronunciation audio files stream directly to Echo devices via URL, with sub-5ms edge response for real-time voice interaction.
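The first point above, one Action chaining all three Orbs, can be sketched as a single handler. The function names and the handler's input/output shape are assumptions for illustration, not the Alexa AI Action SDK's actual interface; each stub stands in for a real Orb call.

```python
# Stubs standing in for the three Orb services (hypothetical).
def word_orb(word: str) -> dict:
    return {"word": word,
            "definition": f"definition of {word}",
            "audio_url": f"https://example.com/audio/{word}.mp3"}

def lesson_orb(lookup: dict) -> dict:
    return {"word": lookup["word"], "phases": 5, "archetype": "explorer"}

def quiz_orb(lesson: dict) -> dict:
    return {"word": lesson["word"], "questions": 3}

def handle_teach_me_a_word(word: str) -> dict:
    """One multi-step agentic flow: look up, teach, then assess."""
    lookup = word_orb(word)          # step 1: verified definition + audio
    lesson = lesson_orb(lookup)      # step 2: structured 5-phase lesson
    quiz = quiz_orb(lesson)          # step 3: lesson-aligned quiz
    return {"speak": lookup["definition"],
            "play": lookup["audio_url"],
            "lesson": lesson,
            "quiz": quiz}

response = handle_teach_me_a_word("ephemeral")
```

The key property is that the chain is deterministic: whichever Bedrock model drives the conversation, the same word produces the same lesson and quiz.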