{"__v":0,"_id":"5845a4a99f6fbb1b004307ed","category":{"version":"5845a4a89f6fbb1b004307b7","project":"54d3007669578e0d002730c9","_id":"5845a4a89f6fbb1b004307b9","__v":0,"sync":{"url":"","isSync":false},"reference":false,"createdAt":"2015-07-30T06:25:25.645Z","from_sync":false,"order":1,"slug":"key-concepts","title":"Key Concepts"},"parentDoc":null,"project":"54d3007669578e0d002730c9","user":"54d3006a5616470d0013cc4d","version":{"__v":1,"_id":"5845a4a89f6fbb1b004307b7","project":"54d3007669578e0d002730c9","createdAt":"2016-12-05T17:32:24.708Z","releaseDate":"2016-12-05T17:32:24.708Z","categories":["5845a4a89f6fbb1b004307b8","5845a4a89f6fbb1b004307b9","5845a4a89f6fbb1b004307ba","5845a4a89f6fbb1b004307bb","5845a4a89f6fbb1b004307bc","5845a4a89f6fbb1b004307bd","5845a4a89f6fbb1b004307be","5845a4a89f6fbb1b004307bf","5845a4a89f6fbb1b004307c0"],"is_deprecated":false,"is_hidden":false,"is_beta":false,"is_stable":true,"codename":"","version_clean":"25.0.0","version":"25"},"updates":["575a3257d5797e0e00751834","576c119cfb62dd20001cdbba"],"next":{"pages":[],"description":""},"createdAt":"2015-09-02T19:48:59.851Z","link_external":false,"link_url":"","githubsync":"","sync_unique":"","hidden":false,"api":{"results":{"codes":[]},"settings":"","auth":"required","params":[],"url":""},"isReference":false,"order":0,"body":"* [API.AI: NLU and Dialog Management](#apiai-nlu-and-dialog-management)\n* [Platform Integrations](#platform-integrations)\n* [Input Methods](#input-methods)\n* [Output Methods](#output-methods)\n* [Fulfillment](#fulfillment)\n* [Get Started Today](#get-started-today)\n[block:api-header]\n{\n  \"type\": \"basic\"\n}\n[/block]\n\nAPI.AI is a platform for building conversational interfaces for bots, applications, and devices. 
\n\nThe diagram below shows how API.AI is related to other components and how it processes data: \n[block:image]\n{\n  \"images\": [\n    {\n      \"image\": [\n        \"https://files.readme.io/a769bab-API-AI_key-concepts.png\",\n        \"API-AI_key-concepts.png\",\n        1000,\n        341,\n        \"#e2e3e4\"\n      ],\n      \"sizing\": \"full\"\n    }\n  ]\n}\n[/block]\nIn the diagram, the green is provided by the API.AI platform. Your app / bot / device code provides the input and output methods and responds to actionable data. You can also provide an optional webhook implementation which API.AI uses to connect to your web service. Your web service can then perform business logic, call external APIs, or access data stores.\n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"API.AI: NLU and Dialog Management\"\n}\n[/block]\nAPI.AI receives a **query** as input data. A <a href=\"https://docs.api.ai/docs/query\" target=\"_blank\">query</a> is either text in natural language or an <a href=\"https://docs.api.ai/docs/concept-events\" target=\"_blank\">event</a> name sent to API.AI. \n\nAPI.AI matches the query to the most suitable <a href=\"https://docs.api.ai/docs/concept-intents\" target=\"_blank\">**intent**</a> based on information contained in the intent (examples, <a href=\"https://docs.api.ai/docs/concept-entities\" target=\"_blank\">entities</a> used for <a href=\"https://docs.api.ai/docs/concept-intents#section-example-annotation\" target=\"_blank\">annotations</a>, <a href=\"https://docs.api.ai/docs/concept-contexts\" target=\"_blank\">contexts</a>, <a href=\"https://docs.api.ai/docs/concept-actions\" target=\"_blank\">parameters</a>, events) and the <a href=\"https://docs.api.ai/docs/concept-agents\" target=\"_blank\">agent</a>'s <a href=\"https://docs.api.ai/docs/machine-learning\" target=\"_blank\">machine learning</a> model. 
API.AI transforms the query text into **actionable data** and returns output data as a <a href=\"https://docs.api.ai/docs/query#response\" target=\"_blank\">JSON response object</a>.\n\nThe process of transforming natural language into actionable data is called Natural Language Understanding (NLU). Dialog management tools such as <a href=\"https://docs.api.ai/docs/concept-contexts\" target=\"_blank\">contexts</a> and <a href=\"https://docs.api.ai/docs/concept-intents#intents-priority\" target=\"_blank\">intent priorities</a> allow developers to control the conversation flow. \n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Platform Integrations\"\n}\n[/block]\nThere are several ways of connecting an API.AI agent to your app, bot, or device:\n\n- You can send GET and POST HTTP requests to your agent using the <a href=\"https://docs.api.ai/docs/query\" target=\"_blank\">/query endpoint</a>.\n- You can use our <a href=\"https://docs.api.ai/docs/sdks\" target=\"_blank\">SDKs</a> to send query requests to different platforms.\n- For popular messaging platforms, you can use one-click <a href=\"https://docs.api.ai/docs/integrations\" target=\"_blank\">integrations</a> that allow you to create bots without any coding. You can also use our source code from <a href=\"https://github.com/api-ai\" target=\"_blank\">Github</a> to extend these integrations and create custom bot implementations.  \n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Input Methods\"\n}\n[/block]\nAPI.AI is designed to receive text as input data. If you want to send voice requests to your app, bot, or device, consider using <a href=\"https://cloud.google.com/speech/\" target=\"_blank\">Google Speech API</a> or other 3rd party solutions which can convert speech to text.\n\nDepending on the app, bot, or device platform, speech recognition might be available natively. 
For example, if you are building an <a href=\"https://docs.api.ai/docs/actions-on-google-integration\" target=\"_blank\">Action on Google</a>, an Android or iOS app, speech recognition is supported natively.\n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Output Methods\"\n}\n[/block]\nAPI.AI allows you to define <a href=\"https://docs.api.ai/docs/concept-intents#section-text-response\" target=\"_blank\">text responses</a> for your app, bot, or device directly in intents. For some messaging platforms, you can also define <a href=\"https://docs.api.ai/docs/rich-messages\" target=\"_blank\">rich messages</a> as your bot responses.\n\nIf you want your app, bot, or device to provide voice responses, you can integrate your agent with a 3rd-party text-to-speech engine.\n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Fulfillment\"\n}\n[/block]\nOptionally, you may want your app, bot, or device to perform operations (e.g., open applications, extract data, do math, etc.) and provide responses to those operations. We use the term **fulfillment** to refer to such operations and responses.\n\nFulfillment can be implemented in your app, bot, or device code. \n\nYou can consider creating a web service and integrating it with your API.AI agent via an API.AI <a href=\"https://docs.api.ai/docs/webhook\" target=\"_blank\">**webhook**</a>. In your web service, you can make calls to other APIs and send results back to your API.AI agent as responses.\n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Get Started Today\"\n}\n[/block]\nIntegrate your product to process natural language using the API.AI tools in <a href=\"https://docs.api.ai/docs/get-started\" target=\"_blank\">five easy steps</a>.","excerpt":"Understand the key concepts","slug":"key-concepts","type":"basic","title":"Introduction"}

# Introduction

Understand the key concepts

* [API.AI: NLU and Dialog Management](#apiai-nlu-and-dialog-management)
* [Platform Integrations](#platform-integrations)
* [Input Methods](#input-methods)
* [Output Methods](#output-methods)
* [Fulfillment](#fulfillment)
* [Get Started Today](#get-started-today)

API.AI is a platform for building conversational interfaces for bots, applications, and devices.

The diagram below shows how API.AI relates to other components and how it processes data:

![API.AI key concepts](https://files.readme.io/a769bab-API-AI_key-concepts.png)

In the diagram, the components shown in green are provided by the API.AI platform. Your app, bot, or device code provides the input and output methods and responds to actionable data. You can also provide an optional webhook implementation that API.AI uses to connect to your web service. Your web service can then perform business logic, call external APIs, or access data stores.

## API.AI: NLU and Dialog Management

API.AI receives a **query** as input data. A [query](https://docs.api.ai/docs/query) is either text in natural language or an [event](https://docs.api.ai/docs/concept-events) name sent to API.AI.
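As a concrete illustration, a text query can be sent to an agent over HTTP and the JSON response parsed for actionable data. The sketch below builds such a request offline and parses a trimmed sample response; the access token, session ID, action name, and response values are illustrative placeholders, and the parameter and field names follow the v1 `/query` format.

```python
import json
from urllib.parse import urlencode

# Placeholder -- substitute your agent's client access token.
CLIENT_ACCESS_TOKEN = "YOUR_CLIENT_ACCESS_TOKEN"

def build_query_url(text, session_id, lang="en", version="20150910"):
    """Build a GET request URL for the /query endpoint."""
    params = urlencode({"v": version, "query": text,
                        "lang": lang, "sessionId": session_id})
    return "https://api.api.ai/v1/query?" + params

def auth_headers(token):
    """Queries are authorized with a bearer token header."""
    return {"Authorization": "Bearer " + token}

url = build_query_url("set an alarm for 7 am", session_id="abc123")

# Sending this request (e.g., with urllib.request) returns a JSON
# response object; a trimmed, illustrative example of its shape:
sample_response = json.loads("""
{
  "result": {
    "resolvedQuery": "set an alarm for 7 am",
    "action": "alarm.set",
    "parameters": {"time": "07:00:00"},
    "fulfillment": {"speech": "Alarm set for 7 am."}
  },
  "status": {"code": 200}
}
""")

action = sample_response["result"]["action"]          # actionable data
parameters = sample_response["result"]["parameters"]  # extracted values
```

The `action` and `parameters` fields are what your code typically branches on; the `fulfillment.speech` text can be shown or spoken to the user as-is.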
API.AI matches the query to the most suitable [**intent**](https://docs.api.ai/docs/concept-intents) based on information contained in the intent (examples, [entities](https://docs.api.ai/docs/concept-entities) used for [annotations](https://docs.api.ai/docs/concept-intents#section-example-annotation), [contexts](https://docs.api.ai/docs/concept-contexts), [parameters](https://docs.api.ai/docs/concept-actions), events) and the [agent](https://docs.api.ai/docs/concept-agents)'s [machine learning](https://docs.api.ai/docs/machine-learning) model. API.AI transforms the query text into **actionable data** and returns output data as a [JSON response object](https://docs.api.ai/docs/query#response).

The process of transforming natural language into actionable data is called Natural Language Understanding (NLU). Dialog management tools such as [contexts](https://docs.api.ai/docs/concept-contexts) and [intent priorities](https://docs.api.ai/docs/concept-intents#intents-priority) allow developers to control the conversation flow.

## Platform Integrations

There are several ways to connect an API.AI agent to your app, bot, or device:

- You can send GET and POST HTTP requests to your agent using the [/query endpoint](https://docs.api.ai/docs/query).
- You can use our [SDKs](https://docs.api.ai/docs/sdks) to send query requests from different platforms.
- For popular messaging platforms, you can use one-click [integrations](https://docs.api.ai/docs/integrations) that allow you to create bots without any coding.
You can also use our source code on [GitHub](https://github.com/api-ai) to extend these integrations and create custom bot implementations.

## Input Methods

API.AI is designed to receive text as input data. If you want to send voice requests to your app, bot, or device, consider using the [Google Speech API](https://cloud.google.com/speech/) or another third-party solution that converts speech to text.

Depending on the app, bot, or device platform, speech recognition might be available natively. For example, if you are building an [Action on Google](https://docs.api.ai/docs/actions-on-google-integration) or an Android or iOS app, speech recognition is supported natively.

## Output Methods

API.AI allows you to define [text responses](https://docs.api.ai/docs/concept-intents#section-text-response) for your app, bot, or device directly in intents. For some messaging platforms, you can also define [rich messages](https://docs.api.ai/docs/rich-messages) as your bot responses.

If you want your app, bot, or device to provide voice responses, you can integrate your agent with a third-party text-to-speech engine.

## Fulfillment

Optionally, you may want your app, bot, or device to perform operations (for example, open applications, extract data, or do math) and respond with the results. We use the term **fulfillment** to refer to such operations and responses.

Fulfillment can be implemented in your app, bot, or device code.

You can also create a web service and integrate it with your API.AI agent via an API.AI [**webhook**](https://docs.api.ai/docs/webhook).
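At its core, a webhook is a function that receives the JSON request API.AI posts to your service, inspects the action and parameters, runs business logic, and returns a reply. A minimal offline sketch, assuming a hypothetical `math.add` action; the `speech`, `displayText`, and `source` response fields follow the webhook response format, and a real deployment would serve this over HTTPS:

```python
import json

def handle_webhook(request_body: str) -> str:
    """Read the actionable data from a webhook request,
    perform an operation, and return the response payload."""
    req = json.loads(request_body)
    action = req.get("result", {}).get("action", "")
    params = req.get("result", {}).get("parameters", {})

    if action == "math.add":  # hypothetical action name for illustration
        total = float(params.get("a", 0)) + float(params.get("b", 0))
        speech = "The answer is {:g}".format(total)
    else:
        speech = "Sorry, I can't handle that yet."

    # Fields API.AI reads back into the agent's reply.
    return json.dumps({"speech": speech,
                       "displayText": speech,
                       "source": "my-webhook"})

# Example call with a trimmed, illustrative webhook request:
reply = handle_webhook(json.dumps(
    {"result": {"action": "math.add", "parameters": {"a": "2", "b": "3"}}}))
```

Keeping the handler a pure function of the request body, as above, makes it easy to unit-test before wiring it into an HTTP server.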
In your web service, you can make calls to other APIs and send results back to your API.AI agent as responses.

## Get Started Today

Integrate your product to process natural language using the API.AI tools in [five easy steps](https://docs.api.ai/docs/get-started).