Introducing NLWebGo: The Natural Language Web and the Future of Sites

By Helen Thomas

The web we grew up with—Web 1.0—was full of simple, static pages that anyone could read but no one could converse with. Over time, we added interactivity (Web 2.0) and data layers (Web 3.0), yet most websites still behave like digital brochures. They deliver one-size-fits-all information, and visitors passively click or scroll. But today’s users demand more: they want instant, personalized answers in their own words. They expect a website to listen and talk back. Experts predict that by 2026, “websites that feel more like conversations than information dumps” will be the norm.

From social media to shopping apps, our experiences have already become richly interactive. People are accustomed to Netflix knowing what they like, and voice assistants handling tasks. It’s only a matter of time before every website steps up.

As the founder of NLWebGo, I’ve watched web technology evolve for decades, and I believe we are on the brink of a new paradigm. The tools already exist to transform any site from a static catalog into an intelligent, conversational agent. In this launch blog, I’ll explain why the old model of passive pages is failing businesses, and how Natural Language Web (NLWeb) – powered by the Model Context Protocol – solves these problems. I’ll lay out a straightforward roadmap for exposing your content to AI agents, and make the business case that this dramatically lowers acquisition costs and boosts engagement. Finally, I’ll share a vision of the near future: a fully conversational, AI-native web built on accessibility and user agency. Let’s begin.

The Web’s Transformation: From Static Pages to Smart Conversations

In the early days, the internet’s value lay in making information globally accessible. Web 1.0 sites were “static placeholders”: one-way streams of content. Users absorbed what was there but could not interact. Then came Web 2.0, the social web. Sites became dynamic, users generated content, and two-way forums and comments thrived. Yet even Web 2.0 platforms often expect users to navigate menus, search bars, and forms. This model is still largely “one user at a time with fixed navigation,” even if that user can now comment or share. In practice, most websites still rely on keyword searches or menus to get answers.

Meanwhile, the idea of a truly intelligent Web 4.0 (the “Intelligent Web”) is emerging. We now have powerful AI assistants (Siri, Alexa, ChatGPT) that “understand” language and context. It stands to reason that the next evolution is for the web itself to become conversational. Instead of isolating a visitor on a page, the web should talk with them. Research even suggests that consumers now expect this level of personalization: 31% of U.S. customers say that personalization makes their shopping experience more enjoyable. Another survey finds that 86% of marketers using AI-driven recommendations report significantly lower customer acquisition costs. In short, the market is already shifting to dialogue-driven digital experiences.

Consider how search engines have changed. A decade ago, we typed keywords; today, we often converse with AI bots that tap into live data. At NLWebGo, we believe websites must follow the same path. HTML once democratized page creation; NLWeb (the Natural Language Web) will democratize conversational experiences on the web. As visionary leaders have noted, projects like NLWeb “completely change what search or feeds can be”. Microsoft now calls NLWeb the “fastest and easiest way to effectively turn your website into an AI app”.

Watch the Microsoft Build 2025 keynote on YouTube.

Microsoft Build 2025 | Satya Nadella Opening Keynote

Throughout the keynote, MCP was referenced as the de facto protocol standard for agent-to-agent and agent-to-service interaction. Microsoft is now building all of its agent infrastructure — from GitHub Copilot to Azure AI Foundry to Windows — around MCP servers.

The solution is a new paradigm: the Natural Language Web (NLWeb). This approach transforms any website into an AI-accessible, conversational experience. In practice, that means re-engineering the web around natural language and computation rather than static HTML alone.

What is NLWeb? NLWeb – short for Natural Language Web – is a concept and open framework (launched publicly in 2025) designed to turn your site into an “AI app”. In simple terms, NLWeb lets people query your site in plain language, just like talking to an assistant. It does this by exposing your site’s content and services through a standard interface that AI models and agents can use. Instead of a fixed page, your website becomes a dynamic knowledge base. For example, a retail site can automatically reveal its product data, inventory, and FAQ to an AI assistant asking questions, just as if it had an always-on clerk.

NLWeb is built on cutting-edge standards, chief among them the Model Context Protocol (MCP). MCP is an open standard (introduced by Anthropic in 2024) that defines how AI models communicate with external systems. In everyday terms, MCP is the “USB-C” of AI integration: a universal port through which an AI assistant can plug into any data source. By using MCP, we make it possible for an intelligent agent to request fresh information or execute actions on your website in real time. For example, instead of reading a static FAQ page, the agent might query an API to check today’s exchange rate or send a text message on behalf of the user. In effect, every NLWeb-enabled site becomes an MCP server, meaning it “makes its content discoverable and accessible to agents”. This interoperability is crucial: it means any AI assistant built on MCP (whether ours or a third party’s) can talk to your site just as easily as it would to any cloud service.

Under the hood, implementing NLWeb involves leveraging structured data and LLM-driven tools. Modern websites often already publish data in semi-structured formats – think Schema.org markup in HTML, RSS/Atom feeds, OpenAPI/JSON schemas, or even simple JSON/CSV exports. NLWeb platforms ingest those formats and layer on AI. In Microsoft’s description, NLWeb “leverages semi-structured formats like Schema.org, RSS and other data that websites already publish, combining them with LLM-powered tools to create natural language interfaces”. In practice, this could mean adding JSON-LD tags to your product pages, providing an OpenAPI spec for your services, or feeding your blog posts into a vector database. The NLWeb system then uses a chosen language model (OpenAI, Anthropic, etc.) to interpret user questions and fetch answers from these sources.
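
To make this concrete, here is a minimal sketch in Python (the language used for all examples in this post) of the Schema.org JSON-LD a product page might embed. The product and its field values are hypothetical; the vocabulary (@context, @type, Offer) is standard Schema.org.

```python
import json

# Hypothetical product record, e.g. pulled from your catalog database.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Orion Runner",
    "color": "Blue",
    "offers": {
        "@type": "Offer",
        "price": "89.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Emit the <script> tag a product page would embed, so crawlers and
# NLWeb-style tooling can read the structured data.
print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")
```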

To give a concrete picture: imagine adding a small NLWeb layer to your site. A visitor arrives and says (via chat or voice) “Do you have blue running shoes in stock under $100?” The NLWeb agent automatically queries your product database and inventory API, then answers, “Yes – the Orion Runner (blue) is in stock for $89. It’s a best-seller for marathons.” This all happens behind the scenes via structured data and APIs; the user just sees a conversational answer. Importantly, NLWeb also adds built-in AI safety and control. Because it’s open-source, you remain in charge of the data exposed and the models used. You decide which content is available to the assistant, and you can curate which AI tools to integrate.
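
Behind that answer, the NLWeb layer would route the parsed question to something like the following lookup. This is a toy sketch: find_products, the Product shape, and the two-item catalog are hypothetical stand-ins for your real database or inventory API.

```python
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    color: str
    price: float
    in_stock: bool

# Hypothetical catalog; in production this is your live database.
CATALOG = [
    Product("Orion Runner", "blue", 89.0, True),
    Product("Trail Blazer", "blue", 129.0, True),
]

def find_products(color: str, max_price: float) -> list[Product]:
    """Return in-stock products matching the visitor's constraints."""
    return [
        p for p in CATALOG
        if p.color == color and p.price <= max_price and p.in_stock
    ]

# The agent turns "blue running shoes in stock under $100" into:
print(find_products(color="blue", max_price=100.0))
# -> [Product(name='Orion Runner', color='blue', price=89.0, in_stock=True)]
```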

How NLWeb Works: Model Context Protocol and Conversational Agents

To clarify, NLWeb is not a chatbot widget. It is an architectural shift. Traditional AI plugins or chatbots require custom coding for each site (and often vendor lock-in). By contrast, NLWeb utilizes open protocols, allowing any AI agent to interact with any NLWeb-enabled site.

The keystone, again, is the Model Context Protocol (MCP). MCP defines a JSON-RPC interface for communication between an AI model (the “client”) and external data endpoints (the “MCP servers”). Each NLWeb server implements MCP out of the box, meaning it responds to standardized JSON calls. For example, there are MCP methods for listing available tools (tools/list), invoking a tool (tools/call), and reading data resources (resources/read). Practically, this means an AI assistant doesn’t have to scrape HTML. Instead, it says in code: “Hey, give me all functions this site offers,” and the NLWeb server replies with, say, getProductList, getProductDetails, findNearestStore, etc. The assistant picks the tool it needs and calls it with parameters. Under the hood, that function might look up a SQL database or call a REST API. Because MCP is model-agnostic and transport-agnostic, the same interaction works with any LLM that speaks JSON-RPC.
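
To make the wire format concrete, here is a sketch of that exchange as JSON-RPC 2.0 messages, written as Python dicts. The tools/list and tools/call method names come from the MCP specification; the getProductList tool and its input schema are hypothetical.

```python
import json

# The agent asks what the site offers (JSON-RPC 2.0 over MCP).
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# The server advertises its tools; one hypothetical tool shown here.
list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [{
            "name": "getProductList",
            "description": "Search the product catalog",
            "inputSchema": {
                "type": "object",
                "properties": {
                    "color": {"type": "string"},
                    "maxPrice": {"type": "number"},
                },
            },
        }]
    },
}

# The agent then invokes the tool it picked, with arguments it
# extracted from the user's natural-language question.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "getProductList",
        "arguments": {"color": "blue", "maxPrice": 100},
    },
}
print(json.dumps(call_request, indent=2))
```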

By standardizing on MCP, NLWeb achieves two big advantages: freshness and interactivity. Information on static pages goes stale fast. With NLWeb/MCP, the model fetches data on demand. If a visitor asks about the current weather or inventory, the assistant hits your live API, not a cached text file. Moreover, the interaction is dynamic: an AI assistant can carry on multi-step conversations, remembering context. For instance, it could ask follow-up questions (“Do you prefer running shoes for trail or road?”) and filter results accordingly. This is possible because NLWeb servers can include “memory” functions or connect to vector stores; the assistant can even use LLM function calling to refine queries.
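
As a toy illustration of that context carry-over, the conversational “memory” can be as simple as a filter object that each user turn refines. Everything below is hypothetical and deliberately minimal; a production system would rely on the server’s memory functions or a vector store.

```python
# The "memory" here is just an accumulated filter dict; each user turn
# merges new constraints into it before the next catalog query.
filters: dict = {}

def refine(new_constraints: dict) -> dict:
    """Merge constraints extracted from the latest user turn."""
    filters.update(new_constraints)
    return filters

# Turn 1: "blue running shoes under $100"
refine({"color": "blue", "max_price": 100})
# Turn 2, after the assistant asks "trail or road?": "trail, please"
refine({"terrain": "trail"})

print(filters)
# -> {'color': 'blue', 'max_price': 100, 'terrain': 'trail'}
```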

In essence, NLWeb makes websites behave like databases and services. Each web page or API becomes part of an AI-accessible knowledge graph. As one expert put it, “NLWeb enables websites and APIs to expose their content and services in natural language, making them accessible to intelligent agents”. Because NLWeb endpoints are MCP servers, they are inherently part of the same ecosystem as the agents. It’s like turning the web inside-out: instead of agents only crawling and scraping, the sites now push their capabilities outward through a shared protocol.

At NLWebGo, we leverage this to help brands create true AI-native interfaces.

Adopting NLWeb doesn’t require an army of engineers. We’ve distilled it into a practical, step-by-step framework that any brand can follow:

  1. Inventory Your Content and Services. Start by cataloging what’s on your site, including product catalogs, service descriptions, help articles, databases, booking forms, and more. Think like an AI assistant: what data might someone want to query? This could include FAQs, pricing tables, knowledge articles, user forums, and even operational tools (such as checking order status via API). For each item, ask, “How would I answer this if someone asked in chat?” For example, a restaurant would list its menu, hours, and reservation API.
  2. Add Semantic Structure. For each content item, expose it in a machine-readable way. Use Schema.org markup in your HTML for static info (products, events, FAQs), and publish RSS/JSON feeds or OpenAPI specs for dynamic data. For instance, mark up each blog post with a Schema.org Article snippet and provide a /blog.json feed. If you have a product database, expose it through an API or generate a JSON export of your inventory. In practice, adding JSON-LD and OpenAPI definitions is often a one-time effort: after that, any AI agent can understand your content. (Microsoft’s NLWeb documentation specifically calls out using RSS, JSON, and other data “websites already publish” as a source for the natural language interface.)
  3. Deploy an MCP/NLWeb Server. Choose an NLWeb implementation or library and configure it to use your site’s data. Microsoft provides an open-source NLWeb server that you can run (locally or in the cloud), which listens for MCP calls and responds by retrieving the structured data you have set up. Essentially, you are turning part of your web backend into an MCP endpoint. Even if you have a modern CMS or platform, this often means adding an integration layer (for example, a small Node.js or Python service; see the sketch after this list). Most production setups connect directly to live databases rather than duplicating content, which keeps data fresh. You can host this on any cloud or on-premises, and it will handle the AI requests. If you prefer a managed option, watch for NLWeb-as-a-service offerings.
  4. Connect Your AI Agent. With the MCP server in place, select or build an AI “brain” to drive it. This could be an LLM like GPT-4, Claude, or any model that can make JSON-RPC calls. Train or prompt the model to use your MCP tools (a prompt sketch also follows this list). For example, you might tell it: “Use getProductList, getFaq, and chat tools when answering shopping questions on our site.” The beauty of NLWeb is that the same model can serve both human visitors and other AI agents. So, whether a user types in a chat box on your site or your site content is queried by an intelligent assistant, the interaction flows through the same NLWeb interface. Testing is key: try common questions and ensure the system picks the right tools and returns accurate, up-to-date answers.
  5. Launch and Iterate. Release the NLWeb interface to a subset of users or internally first. Monitor how visitors interact – what queries they ask, what actions follow, and where misunderstandings occur. Over time, refine your prompts, tweak which data is exposed, and expand capabilities (e.g., by adding more APIs or memory). Measure engagement metrics: are conversations staying on your site longer? Are users converting or consuming more content? This iterative process, guided by real user feedback, will gradually optimize your conversational experience.
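
For step 3, here is a deliberately minimal sketch of such an integration layer: a tiny standard-library Python service that answers tools/list and tools/call requests over HTTP. It is illustrative only; the open-source NLWeb server handles transport details, schemas, and model integration for you, and the getProductList tool with its inline catalog is hypothetical.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def get_product_list(color: str, max_price: float) -> list:
    # In production this would query your live database or inventory API.
    catalog = [{"name": "Orion Runner", "color": "blue", "price": 89.0}]
    return [p for p in catalog if p["color"] == color and p["price"] <= max_price]

class MCPHandler(BaseHTTPRequestHandler):
    """Answers JSON-RPC calls such as tools/list and tools/call."""

    def do_POST(self):
        length = int(self.headers["Content-Length"])
        req = json.loads(self.rfile.read(length))
        if req["method"] == "tools/list":
            result = {"tools": [{"name": "getProductList"}]}
        elif req["method"] == "tools/call":
            args = req["params"]["arguments"]
            result = {"content": get_product_list(args["color"], args["maxPrice"])}
        else:
            result = {}
        body = json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), MCPHandler).serve_forever()
```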

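For step 4, the prompting can be as light as a system message plus a tool schema handed to whichever LLM client you use. A vendor-neutral sketch, reusing the hypothetical tool names from this post:

```python
# Vendor-neutral sketch: the system prompt and tool list you would hand
# to an LLM client that supports tool calling. Tool names are the
# hypothetical ones used throughout this post.
SYSTEM_PROMPT = (
    "You are the assistant for our storefront. Use the getProductList, "
    "getFaq, and chat tools when answering shopping questions. Always "
    "check live data before quoting prices or stock levels."
)

TOOLS = [
    {
        "name": "getProductList",
        "description": "Search the product catalog by color and price.",
        "inputSchema": {
            "type": "object",
            "properties": {
                "color": {"type": "string"},
                "maxPrice": {"type": "number"},
            },
            "required": ["color"],
        },
    },
    {"name": "getFaq", "description": "Look up help-center answers."},
]

# Hand SYSTEM_PROMPT and TOOLS to your LLM client of choice; the model
# then emits tools/call requests like the JSON-RPC example earlier.
```
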
At each step, remember that you maintain control over the data and logic. NLWeb does not force any particular AI; it simply makes your data accessible. You can require user authentication to access specific tools, filter results by region, or insert custom branding into every response. This is not some black-box chat; it’s your web, on your terms, with AI as the interface layer.

The Horizon: A Conversational, Accessible, Agentic Web

Looking ahead, the digital landscape will be completely transformed by this shift. Imagine a web where every interaction can be via voice or chat, where accessibility is baked in, and users have true agency:

  • Full Conversational Interaction: Users will no longer endure static forms or menus. Instead of hunting for information, they’ll ask. For example, a visually impaired visitor could simply speak to a site and have it narrate product details using AI text-to-speech. Non-native speakers could ask for explanations in their language. Everyone will be able to request exactly what they need, such as “Show me my account summary” or “Schedule an appointment next Tuesday,” and the site will comply. This is a world where "browsing" and "transaction" blur: discovering products, buying, booking, learning – it all happens in natural dialogue.
  • Ubiquitous AI Agents: AI assistants won’t just live on your site’s page; they’ll roam. Soon, your content could be accessed not only by on-site chat but by any agent on any platform – phone, home assistant, wearable, or even other websites. The NLWeb architecture envisions that your site “can be discovered and accessed by other agents”. For instance, a shopping app or a smart speaker might use the same NLWeb interface you provide. This turns your website into a service in the AI ecosystem. Your brand can monetize this too – for example, you could charge partners to access premium data via NLWeb tools.
  • Greater Accessibility: By design, NLWeb enhances inclusivity. People with disabilities, different learning preferences, or limited technical skills all gain access. As one AI education expert noted, turning text into speech and interactive quizzes makes learning “feel more human, interactive, and much easier”. Likewise, complex forms become as simple as asking a question. This isn’t just a feel-good feature; it opens new markets. If your site can truly answer the needs of, say, non-tech-savvy seniors or multilingual users, you can dramatically expand your audience.
  • User Agency and Privacy: Paradoxically, giving AI power to your site also gives power to users. With NLWeb, users don’t have to rely on generic search engines or massive platforms to find answers. They have agency to query your domain directly. And because NLWeb is open and can be hosted by you, data sovereignty is possible. You can choose to keep interactions confidential on your servers or allow user-chosen agents to access only what you expose. In other words, users get the best experience and you retain control over your data.
  • Seamless Integration Everywhere: The web will no longer be a “web” of pages, but a web of APIs and agents. Traditional navigation (menu bars, links) will primarily exist for fallback purposes. Instead, users will expect, for example, to say “Find my nearest store with this item” and get an immediate response. This is akin to living in a world where websites are as responsive as voice assistants. It’s a profound shift in how we consume content.

All of this is not science fiction. The building blocks are here: rapid advances in generative AI, universal protocols like MCP, and frameworks like NLWeb are converging. The vision is clear: a fully conversational, AI-native internet where information and services are universally accessible. Websites become like virtual assistants; users get what they need simply by asking.

NLWebGo is founded on this vision. We are committed to making this future real for businesses, educators, nonprofits, and every innovator. By adopting NLWeb today, you position your organization on the leading edge of the agentic web. You’ll create experiences that delight both customers and employees, and you’ll build skills and infrastructure that pay dividends for years to come.

The web is entering an era of accessibility and agency. It’s the internet, not just of pages, but of conversations. At NLWebGo, we’re excited to be at the forefront of this change. Here is the introduction video for our sister company, MascotGO, which is built on NLWeb, from the 2025 AI/Tech Investing Forum:

Introducing MascotGO at the AI Tech Investing Forum 2025 in San Francisco


Categories: ai, nlweb