Discover how ChatGPT now offers API access through Azure, enabling integration of OpenAI's models into your website and enterprise applications!
Integrate your enterprise data with OpenAI models
Access to these APIs lets you expose an OpenAI model through a conversational interface (a chatbot, for example), but how do you enrich it with your own enterprise documents within your Azure instance?
You can, of course, create your own model by enriching it with your enterprise data. Pre-existing models make this easier, but it remains an ambitious task.
The method used by Microsoft is simpler and is based solely on natural language:
- The user asks a question within your enterprise chatbot. Let's take the example of a member of a mutual insurance company: "How much will I be reimbursed for a physiotherapy session?"
- To allow the model to generate answers based on relevant data, we can inject this information directly in the form of "prompts". The model can then read this information, along with any instructions, context, or questions, and respond accordingly. This method requires no retraining or fine-tuning of the model, and the responses can immediately reflect changes in the underlying data.
- Injecting this data as "prompts" can be done through a data aggregation service, such as Cognitive Search in the Azure ecosystem. This service draws information from your databases to feed the semantic model.
- Note that multilingualism can be handled directly in the prompts themselves, with an instruction to translate in real time. A user can thus ask a question in English about your data, which may be in French, for example.
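In Python, the prompt-injection step described above can be sketched like this. Note that the retrieval function and document contents are stand-ins for illustration, not a real Cognitive Search call:

```python
# A minimal sketch of injecting enterprise data into a prompt.
# retrieve_documents() and its contents are illustrative assumptions.

def retrieve_documents(question: str) -> list[str]:
    """Stand-in for a search service such as Azure Cognitive Search."""
    # In practice this would query an index built from your documents.
    return [
        "Physiotherapy sessions are reimbursed at 60% of the standard rate.",
        "Reimbursement requires a medical prescription.",
    ]

def build_prompt(question: str) -> str:
    """Inject the retrieved passages into the prompt as grounding context."""
    sources = "\n".join(f"- {doc}" for doc in retrieve_documents(question))
    return (
        "Answer the question using ONLY the sources below, "
        "and cite the source you used.\n"
        f"Sources:\n{sources}\n\n"
        f"Question: {question}"
    )

prompt = build_prompt("How much will I be reimbursed for a physiotherapy session?")
print(prompt)
```

Because the data lives in the prompt rather than in the model's weights, updating the search index is enough to change the answers.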
How do you avoid model hallucinations and ensure the reliability of answers?
Azure has implemented several safeguards within its models:
- Direct citation: each response can include a citation with a link to the source content.
- Supporting content: each response or chat bubble generated by ChatGPT offers an option to display all of the original content that was used to build the response.
- Orchestration process: each response can display the entire process of interaction with the user.
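Taken together, these safeguards could surface in a response payload shaped roughly like the following. The field names and values here are illustrative assumptions, not Azure's actual schema:

```python
# A hypothetical response structure combining the three safeguards:
# a citation, the supporting content, and the orchestration trace.
answer = {
    "text": "Physiotherapy sessions are reimbursed at 60% of the standard rate [1].",
    "citations": [
        {"id": 1, "title": "Reimbursement policy", "url": "https://example.com/policy"},
    ],
    "supporting_content": [
        "Physiotherapy sessions are reimbursed at 60% of the standard rate.",
    ],
    "orchestration": [
        "received user question",
        "ran search query against the document index",
        "generated answer from 1 retrieved passage",
    ],
}
print(answer["text"])
```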
What about security and data governance?
How do you ensure that the data ingested by the ChatGPT solution within Azure remains confidential and anonymous? The news surrounding ChatGPT has been punctuated by several personal-data leaks.
In our case, the data remains within the Microsoft ecosystem, with the same confidentiality and security commitments as all other Azure services. This is the great strength of the model: infusing AI throughout the Microsoft enterprise ecosystem that is already well established in your organization!
The arrival of the plugin system
This is the other big news of the week: ChatGPT now includes a plugin system that can extend its capabilities. This seemingly technical announcement conceals a significant advance for the service, especially in terms of use.
Indeed, these plugins allow the model to access updated information, perform calculations, and use third-party services. This is clearly the "conversational assistant" model we know from Siri, Alexa, or Google Home: users can book a table at a restaurant, shop online, book a flight, have a product delivered...
The goal is to give the model access to up-to-date information. Traditional language models are usually trained on a static text corpus and are therefore unable to provide current information. That is why, when you use the consumer version of ChatGPT, you can only get answers about events that happened up to 2021. ChatGPT plugins exist to give the model access to real-time data.
How do you create a plugin for ChatGPT?
It is quite simple! You just need to provide access to an API endpoint. This can be an existing API or an API wrapped specifically for consumption by the ChatGPT service. Write an OpenAPI specification documenting your API, along with a manifest file that references that specification, and you're done.
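As a sketch, the plugin manifest (`ai-plugin.json`) for a hypothetical plugin might be generated like this. The plugin name, descriptions, domain, and URLs are placeholders for your own service:

```python
import json

# A sketch of a ChatGPT plugin manifest (ai-plugin.json).
# All names and URLs below are placeholder assumptions.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Reimbursement Helper",
    "name_for_model": "reimbursement_helper",
    "description_for_human": "Look up health-insurance reimbursement rates.",
    "description_for_model": "Use this to answer questions about reimbursement rates.",
    "auth": {"type": "none"},
    "api": {
        "type": "openapi",
        # Points to the OpenAPI specification documenting your endpoints.
        "url": "https://example.com/openapi.yaml",
    },
    "logo_url": "https://example.com/logo.png",
    "contact_email": "support@example.com",
    "legal_info_url": "https://example.com/legal",
}

print(json.dumps(manifest, indent=2))
```

The `description_for_model` field is what ChatGPT reads to decide when to call your API, so it is worth writing carefully.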
Limitations of the plugin system
- On the user side, only users with a paid ChatGPT Plus subscription currently have access to the plugin service.
- Model control: OpenAI has included a section dedicated to "security considerations", which indicates that the company has stepped up security: content filtering to prevent access to inappropriate information, as well as monitoring the reliability of sources and the veracity of information. Of course, only time will prove the reliability of the model!
- Data governance: the most delicate point! Here we are no longer in the Azure ecosystem. Conversation data is still stored on OpenAI's servers (which, despite the company's name, are not open), and you cannot access it. Moreover, conversations with your users can also be used to retrain the model, which raises a second problem.
What next?
We don't have a crystal ball, but it is certain that these services will have a major impact on how the general public consumes the web. In the meantime, don't hesitate to ask us about:
- Combining Azure OpenAI services with your enterprise data to create conversational assistants that can analyze and synthesize complex information.
- Working with our data consultants on your use cases and on the mapping and slicing of your source documents.
- Advice on which OpenAI model to use.
- Anticipating your OPEX and working on the business model (OpenAI APIs are, of course, not free!).
- Or creating your own ChatGPT plugin!