# Integrations
Docgility provides a flexible and easy interface to configure integrations for the application. Once configured, these integrations can be used throughout the application.
If there are integration needs that are not available in the preset configuration list, Docgility can help create these custom integrations.
# Return Format
All integrations are formatted to return results as a set of rows, each row containing the following values:
- title: the header used to display the result. (e.g., for simple prompts, you may want to display the prompt itself)
- value: the text value displayed in the result. (e.g., the result of the prompt call)
- link: an optional link the user can click for further information.
- descr: an optional long description of the result.
As an example, an LLM prompt integration may return rows similar to the one below:
- title: Rewrite the paragraph with best terms for originating party.
- value: TCL, the originating party, would like to limit liability for ...
- link: https://example.docgility.com/1561353par
- descr: Click on the link for further details.
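As a sketch, a returned row can be represented as a Python dict with the four fields above (the `make_row` helper is hypothetical, shown only to illustrate the format):

```python
def make_row(title, value, link=None, descr=None):
    # Hypothetical helper: builds one result row in the standard
    # return format (title/value required, link/descr optional).
    row = {"title": title, "value": value}
    if link is not None:
        row["link"] = link
    if descr is not None:
        row["descr"] = descr
    return row

# The example row from above, expressed in this form:
rows = [
    make_row(
        "Rewrite the paragraph with best terms for originating party.",
        "TCL, the originating party, would like to limit liability for ...",
        link="https://example.docgility.com/1561353par",
        descr="Click on the link for further details.",
    )
]
```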
# Available Context
Since Docgility showcases multiple documents at the same time, prompts and/or integrations can reference a large amount of information when they are called, including:
- paragraph context - this is the text of the exact paragraph that is highlighted in the prompt window.
- text of the entire document - even though a specific paragraph may be highlighted, you may want to create a summarization prompt for the entire document (i.e. summarize the other liability clauses in the document)
- associated playbooks - for each document and related document type, you can access the other playbooks and clause sections that may be applicable for this document (i.e. Rewrite this paragraph using playbooks indicated in ...)
- other synced documents - if other documents are synced, you can use the context in the input as well. (i.e. Compare the other documents for differences in liability clauses)
This design allows maximum flexibility in crafting context-specific responses for the best recommendations, and is simple enough to support practically any type of integration your organization needs.
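As an illustrative sketch (the function and parameter names here are hypothetical, not Docgility's actual API), the context sources listed above could be assembled into a single prompt like this:

```python
def build_prompt(instruction, paragraph, document=None,
                 playbooks=None, synced_docs=None):
    # Hypothetical sketch: combine the available context sources
    # (highlighted paragraph, full document text, applicable playbooks,
    # other synced documents) into one prompt string for the LLM call.
    parts = [instruction, "Paragraph:\n" + paragraph]
    if document:
        parts.append("Full document:\n" + document)
    if playbooks:
        parts.append("Applicable playbooks:\n" + "\n".join(playbooks))
    if synced_docs:
        parts.append("Synced documents:\n" + "\n\n".join(synced_docs))
    return "\n\n".join(parts)

prompt = build_prompt(
    "Compare the documents for differences in liability clauses.",
    "TCL shall not be liable for ...",
    synced_docs=["Counterparty draft: liability is capped at ..."],
)
```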
# Large Prompt Configuration & Storage
LLM/GenAI prompts have grown very large in some instances, and the same boilerplate language is often repeated across different prompts.
To address this, Docgility provides a method to recursively construct prompts at runtime, so that you can structure your prompts with as much flexibility as possible.
The code for prompt expansion is provided below:

```python
def promptwithlookup(promptstring):
    """Expand every DG(key) placeholder in promptstring by looking up
    `key` in the prompts table, repeating until none remain."""
    display = ''
    firstdisplay = True
    while True:
        firstpos = promptstring.find('DG(')
        if firstpos == -1:
            break  # no more placeholders to expand
        secondpos = promptstring.find(')', firstpos)
        if secondpos == -1:
            break  # malformed placeholder; stop expanding
        # Include the closing ')' so the whole placeholder is replaced.
        LKexpression = promptstring[firstpos:secondpos + 1]
        key = LKexpression.replace('DG(', '').replace(')', '')
        lookuprecord = mongogetbyobjidsync('prompts', key)
        promptstring = promptstring.replace(LKexpression, lookuprecord['prompt'])
        if firstdisplay:
            # The display/title comes from the first key encountered.
            display = lookuprecord['display']
            firstdisplay = False
    return display, promptstring
```
The code above finds any substring of the form DG(xxxx) and replaces it with the result of looking up the key xxxx in the prompts table. Because a replacement may itself contain DG(...) references, expansion continues recursively until no references remain. The display (title) of the prompt comes from the first key encountered.
The user can then store all prompts or prompt fragments in the prompts table for retrieval during execution.
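As a minimal, self-contained sketch of the same expansion logic (with an in-memory dict standing in for the `mongogetbyobjidsync` call, so it runs standalone):

```python
# In-memory stand-in for the prompts table.
PROMPTS = {
    "greeting": {"display": "Greeting", "prompt": "Hello, DG(name)!"},
    "name": {"display": "Name", "prompt": "originating party"},
}

def expand_prompt(promptstring, table):
    # Same expansion loop as promptwithlookup, using a dict lookup.
    display, first = '', True
    while True:
        start = promptstring.find('DG(')
        if start == -1:
            break
        end = promptstring.find(')', start)
        key = promptstring[start + 3:end]  # text between 'DG(' and ')'
        record = table[key]
        promptstring = (promptstring[:start] + record['prompt']
                        + promptstring[end + 1:])
        if first:
            display, first = record['display'], False
    return display, promptstring

display, prompt = expand_prompt("DG(greeting)", PROMPTS)
# display == "Greeting", prompt == "Hello, originating party!"
```

Note how the nested DG(name) reference inside the "greeting" record is expanded in a later pass of the same loop.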
# Azure OpenAI/LLM Integration
Since Azure OpenAI/LLM is a popular integration, we describe how this is represented as a configuration in the application.
To specify the integration, configure and overwrite the following values in the Helm chart. (The example below is for the first integration; up to 4 can be included in the application.)
Example of an Azure OpenAI integration:

```
app.integ1name: AzureOpenAIInteg
app.integ1type: integAzureOpenAI
app.integ1creds: 2024-05-01-preview,4EF8yZQR7fsqZbB2UQ9NiTxcBRlgPXNAO8XaSCbKkLk181lK5OdGJQQJ99BDACYeBjFXJ3w3AAABACOGprJK
app.integ1specs: https://docg.openai.azure.com/,gpt-4o-mini
app.integ1paras: Formalize in legal language and make the terms,better for originating party,better for counter party,less overall risk,DG(testdemo),DG(testdemo1)
```
Here is the format for each field:
- app.integ1name = Display name for the integration; this is presented as a dropdown option in the user application.
- app.integ1type = Name of the integration type; this is used as the key to retrieve the integration. For this integration, use integAzureOpenAI to access the generic Azure OpenAI integration.
- app.integ1creds = Credentials for the integration. For integAzureOpenAI, the format is the API version + ',' + the credential key used to access the model (in the example above, API version 2024-05-01-preview).
- app.integ1specs = Specification for the integration. For integAzureOpenAI, it is the endpoint URL + ',' + the model name.
- app.integ1paras = A comma-separated list of prompts. The first entry is a prefix that is prepended to each of the simple prompts that follow. In the example above, this expands to "Formalize in legal language and make the terms better for originating party", "Formalize in legal language and make the terms better for counter party", and so on. In addition, you can reference Large Prompts by name (e.g., DG(testdemo)), which looks up the prompt definition from the prompts table.
After applying this configuration, restart the Docgility instance; all users will then be able to access the AzureOpenAIInteg option under the File menu.
Docgility supports up to 4 integrations at the same time.