We taught AI assistants how to build video apps. Properly.
Following our MCP Server launch, we're taking further steps to make Digital Samba more accessible for AI-assisted development.
We've added two new features to enhance your development toolkit.
- llms.txt - A standardised file containing our complete API and SDK documentation in a format optimised for AI prompt consumption.
- Context7 integration - We added Digital Samba to this fantastic real-time documentation discovery service, ensuring AI assistants always work with current documentation.
These changes help address some of the fundamental problems of AI-assisted development:
- Outdated code suggestions based on obsolete training data.
- Web searches that surface messy, inconsistent resources bloated with markup, which is costly for your AI assistant's limited context window.
The problem with AI code generation
When developers ask AI assistants for help with their Digital Samba integration, they may end up with code using deprecated methods. For example, an AI might suggest `initializeRoom()` - a method we deprecated two years ago in favour of `createControl()`.
This happens because AI models often work with fragmented, historical data and lack structured context about current APIs, best practices, and proper usage patterns. The result is wasted developer time debugging outdated or incorrect implementations produced by their coding assistant.
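To make the failure mode concrete, here's a minimal sketch contrasting the two. The package name and option set below reflect our Embedded SDK but should be treated as illustrative; check the SDK reference for the full details:

```javascript
import DigitalSambaEmbedded from '@digitalsamba/embedded-sdk';

// What an assistant trained on stale data might suggest (deprecated):
// const room = initializeRoom({ url: roomUrl });

// Current entry point:
const sambaFrame = DigitalSambaEmbedded.createControl({
  url: 'https://yourteam.digitalsamba.com/your-room', // placeholder room URL
});
sambaFrame.load();
```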
How llms.txt works
Large language models increasingly rely on website information, but face a crucial limitation: context windows are too small to handle most websites in their entirety.
Converting complex HTML pages with navigation, ads, and JavaScript into LLM-friendly plain text is both difficult and imprecise.
The llms.txt standard, proposed by Jeremy Howard at Answer.AI, provides a simple solution to this challenge. The idea is to use a structured markdown file that gives AI systems precise information in a common location on websites.
While websites serve both human readers and LLMs, the latter benefit from more concise, structured, expert-level information gathered in a single location.
This is particularly important for use cases like development environments, where LLMs need quick access to programming and API documentation.
Our implementation at digitalsamba.com/llms.txt includes:
- Current SDK methods, events and initialisation examples
- API endpoint documentation
- Authentication requirements
- Working code examples
It is deliberately minimal on formatting - no navigation elements, no marketing content, just technical documentation that AI can consume in a context-conscious fashion.
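For a sense of what that looks like, here is an illustrative excerpt in the llms.txt format (the links are placeholders, not the literal contents of our file):

```markdown
# Digital Samba

> API and SDK documentation for embedding video conferencing, in a format optimised for LLM consumption.

## API

- [Rooms](https://example.com/docs/rooms): create, configure and delete rooms

## SDK

- [Initialisation](https://example.com/docs/sdk): createControl() usage, events and methods
```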
Context7: Dynamic documentation discovery
Context7, developed by Upstash, is a service that provides AI coding assistants with up-to-date, version-specific documentation from official sources. It addresses the problem of AI models relying on potentially outdated information by fetching fresh documentation and injecting it into your prompts as context. You can think of it as a directory of llms.txt-style documentation.
The real power of the service is harnessed via its MCP server. Adding this MCP server to your AI assistant's toolkit allows it to discover documentation in real time, as it needs it.
Example prompt:
Create a Digital Samba room with recording enabled. use context7
The AI returns the current, working code:
```javascript
async function createRoomWithRecording() {
  const API_KEY = 'your-api-key';
  const API_URL = 'https://api.digitalsamba.com/api/v1';

  const roomConfig = {
    name: "Recording Enabled Room",
    description: "Room with cloud recording capabilities",
    privacy: "public",
    max_participants: 50,
    // Recording configuration
    recordings_enabled: true,
    recording_autostart_enabled: false, // Set to true for automatic recording
    recording_logo_enabled: true,
    recordings_layout_mode: "tiled", // Options: "tiled", "active_speaker"
    participant_names_in_recordings_enabled: true,
    hide_tiles_in_recordings_enabled: false,
    // Media settings
    audio_on_join_enabled: true,
    video_on_join_enabled: true,
    hd_video_quality: "720_1.5",
    audio_quality: "32",
    // Feature settings
    screenshare_enabled: true,
    chat_enabled: true,
    qa_enabled: true,
    polls_enabled: true,
    transcription_enabled: true,
    transcription_auto_start_enabled: true,
    captions_enabled: true,
    whiteboard_enabled: true,
    // UI settings
    toolbar_enabled: true,
    toolbar_position: "bottom",
    primary_color: "#3771E0",
    background_color: "#FFFFFF",
    palette_mode: "light",
    // Roles
    default_role: "attendee",
    roles: ["moderator", "attendee"]
  };

  try {
    const response = await fetch(`${API_URL}/rooms`, {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${API_KEY}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify(roomConfig)
    });

    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }

    const room = await response.json();
    console.log('Room created successfully:', room);
    return room;
  } catch (error) {
    console.error('Error creating room:', error);
    throw error;
  }
}
```
Implementation
For Cursor or Claude Desktop users, add this configuration:
```json
{
  "mcpServers": {
    "Context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```
Alternatively, you can access our llms.txt file directly and provide it to your AI assistant as part of your prompts.
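A rough sketch of that direct approach, assuming nothing beyond standard fetch (how you pass the resulting prompt to your assistant depends on your tooling):

```javascript
// Sketch: pull the llms.txt documentation and prepend it to a prompt.
async function buildPrompt(question) {
  const response = await fetch('https://digitalsamba.com/llms.txt');
  const docs = await response.text();
  return `Using only the documentation below, answer the question.\n\n${docs}\n\nQuestion: ${question}`;
}

// Hypothetical usage:
// const prompt = await buildPrompt('How do I create a room with recording enabled?');
// ...send the prompt to your AI assistant of choice...
```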
Practical applications
With these tools, AI assistants can now accurately:
- Generate room creation code with current configuration options
- Implement authentication using valid endpoints
- Build cool video experiences using our latest SDK methods
- Create integrations that follow current best practices
For example, asking about transcript exports will yield code using the correct endpoint (`GET /rooms/{id}/transcripts/export`) with proper authentication headers, rather than outdated or fictional alternatives.
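A correct call looks roughly like this (a minimal sketch; the export format and any query parameters are simplified):

```javascript
// Minimal sketch: export a room's transcripts via the endpoint above.
async function exportTranscripts(roomId, apiKey) {
  const response = await fetch(
    `https://api.digitalsamba.com/api/v1/rooms/${roomId}/transcripts/export`,
    { headers: { 'Authorization': `Bearer ${apiKey}` } }
  );
  if (!response.ok) {
    throw new Error(`HTTP error! status: ${response.status}`);
  }
  return response.text(); // transcript export payload
}
```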
Review the output!
Even with access to up-to-date documentation, current state-of-the-art LLMs may not always generate correct code. Treat generated code as a starting point, and always review and refine it before shipping to production.
Technical architecture
We now provide three complementary tools for AI-assisted development:
- MCP Server - Enables conversational control of Digital Samba infrastructure
- llms.txt - Provides structured documentation for AI consumption
- Context7 support - Ensures real-time accuracy of generated code
Together, these tools can form part of your AI-assisted development toolkit for building sophisticated, white-label video conferencing experiences, from initial proof of concept all the way to production deployment of complex video integrations in your application.
Future developments
We're planning several enhancements:
- Framework-specific implementation patterns
- Expanded llms-full.txt with more comprehensive API coverage
- Specialised context for common integration scenarios
Getting started
To begin using these tools:
- With Context7: Configure your AI assistant with the Context7 MCP server and add "use context7" to your prompt
- Direct access: Visit digitalsamba.com/llms.txt and provide the content to your AI assistant when prompting
Both approaches will significantly improve the accuracy of AI-generated Digital Samba code.
Feedback
We welcome input on these new features. If you encounter gaps in documentation or need specific examples, please contact us at support@digitalsamba.com.
By providing AI systems with accurate, current information about our platform, we're reducing development friction and enabling faster, more reliable integrations. This represents a practical step forward in making video conferencing APIs more accessible to developers working with AI assistance.