This tutorial introduces many concepts and requires a few hours of learning before you will be equipped to write your own app. Please complete the following tutorials in the order they are written below.
Many tools and technologies used here, such as Go Templating and GraphQL (see the Before you start section) are widely used by many organizations. Take the time to search online for various tools and tutorials that exist if what we have provided leaves you with remaining questions. We intentionally chose widely available tools so that you could leverage publicly available educational materials.
By the end of the tutorial you will have a fully functional Gladly App which interacts with a fictitious order management system. Even if you are not familiar with GraphQL or Go Templating you will be able to build the app by following this tutorial.
Remember that if you get stuck you can always compare your app with the finished version. It’s available in our app-platform-examples Github repo – sample_oms.
Before you start #
Install the CLI tool, then read the OMS API documentation. Instructions for both are in the sections below.
Useful links:
- appcfg CLI tool: https://github.com/gladly/app-platform-appcfg-cli
- CLI documentation: https://github.com/gladly/app-platform-appcfg-cli/tree/main/docs
Sample Apps #
For your convenience, we have created a repository of sample apps: https://github.com/gladly/app-platform-examples. In particular, sample_oms connects to the Sample OMS API referenced above. In this tutorial we will be replicating its functionality. Clone the example repo:
git clone git@github.com:gladly/app-platform-examples.git
Summary #
A pre-requisite for this tutorial is that you have read through the documentation of our sample OMS noted above. Please familiarize yourself with the Sample OMS APIs before continuing. We will be using this as our third-party system.
appcfg aims to help you create the files/folders needed for a Gladly App Platform App. Files will be created as placeholders that you can fill in manually. This process will require running some commands, manually updating files, performing validation and tests, and then running your app. Once the app is written, it must be installed in Gladly, configured, and activated. Writing an app without this tool is possible, but we built it to make the process easier for you.
When you build an app on Gladly’s App Platform, know that you are responsible for the data returned, actions taken by the app, maintenance, and testing. We are simply providing you with the ability to integrate with external systems. Given that, we recommend you pull as little data as necessary and enable as few actions as necessary. Because apps must be backward compatible, adding rather than taking away is much easier. Start small and build up from there.
This tutorial is meant to introduce you to basic App Platform concepts you can use when building your own app — it is not a comprehensive tutorial.
The top-level folders and files in your app folder will resemble the structure of the sample_oms app we are referencing, which already contains all of these files. Please refer there if you get lost. We will go through the motions of adding all of the relevant configuration yourself in the tutorial below.
We will build a sample app against our example OMS server, which will walk you through creating your own app. Please reference our documentation as needed.
Getting Started #
- Before we begin configuring our own app, let's quickly run a few appcfg commands and validate the existing example app provided to you in sample_oms.
- Run the following commands from the sample_oms folder to validate the existing actions and data. This app should pass validation. If everything looks good, proceed to the next step; otherwise, fix any existing errors or reach out to your CSM. This step tells us that the app we are replicating passes validation, so if the app we build later does not pass validation, we can compare it against the example.
  - Export the variable that points to the app root:
    export GLADLY_APP_CFG_ROOT=app
  - Validate actions:
    appcfg validate actions
  - Validate data:
    appcfg validate data

Tip: At any point in the development cycle, feel free to run appcfg validate to check your work and expose any issues.

- Initialize your app.
  - If you haven't already initialized your app in the Install External App CLI Setup step, do so now. From the root directory of your app run:
    appcfg init -n <app_name> -a <org_name> -d <description>
    e.g. appcfg init --app-name tutorial --author ela.com --description "My first Gladly app"
    Running the command will return the GLADLY_APP_CFG_ROOT value, which you'll use in the next step.
  - Export the root variable:
    export GLADLY_APP_CFG_ROOT=<value returned by previous command>
    e.g. export GLADLY_APP_CFG_ROOT=/Users/ela/gladly/projects/tutorial
    Note: since we set the root app folder name, this will be the root of any work you do using the configuration tool unless you change the root app folder name.
- Before continuing to configure our sample app, we want to identify data pull and action names and set them aside. In our example, we will start with our three data pulls (customer, orders, and products), then move on to our three actions (cancel_order, refund_order, and return_line_items). We will start by pulling the customer ID by email address.
- As you build your app, please refer to the example JSON in our test folders to understand the data's shape. This is meant to aid your discovery as you familiarize yourself with the app we will build on the App Platform.
AUTHENTICATION #
Our sample OMS app does not need any action-specific or data pull-specific headers. They will all inherit our app-level auth headers. Fill out the following auth to run our app against the external system. In this example, we are configuring auth before data pulls/actions, but feel free to start the data pulls or actions first if that’s how you prefer to think about things.
Adding Auth Header #
Determine the auth header name. In our case, we need to provide an apiToken key-value pair. The command we will use is:
appcfg add auth-header {header name} [flags]
- Run the command:
  appcfg add auth-header apiToken
- apiToken.gtpl contains:
  {{.integration.secrets.apiToken}}
- The authentication/headers/_test_/data folder contains two files:
  - expected_apiToken.txt contains secret_token
  - integration.json contains:
{
"configuration": {
},
"secrets": {
"apiToken": "secret_token"
}
}
Test the auth header by running appcfg test auth-header. If no error is returned, the tests have passed.
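For orientation only: with the integration.json above, the apiToken.gtpl template should render to secret_token, so an outgoing request would carry a header roughly like the following (an illustration, not output copied from the tool):

apiToken: secret_token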
DATA PULLS #
- Review the Sample OMS Server API docs again (in the Before you start section) to confirm you understand the relationships between our three data pulls. Take your time to digest the structure of the external APIs. You'll notice that customer does not have any dependent data, orders are dependent on the customer, and products are dependent on the order. This tutorial walks you through each data pull one by one; you will run through it three times, once for each data pull.
- Run the add data-pull command for our three desired data pulls (customer, orders, products). Customer is highlighted in pink, Orders in orange, and Products in yellow. Start with the Customer data pull, going through all the steps highlighted in pink. When you finalize that data pull at the end of this tutorial, return and run through the tutorial again for Orders, and then a third time for Products. We recommend this iterative process, building data pulls on top of one another rather than simultaneously, to help you identify and fix errors. Ultimately it is up to you to decide how you like to build apps going forward; find an approach that works for you. Some steps are the same for all three data pulls and contain no highlight.
- You can create multiple requests by outputting multiple bodies via the request body template. Use this approach when external data is requested via a POST. The start of each body is specified in the template by adding a "#body" marker on the line that immediately precedes the start of the body content. The marker can appear anywhere on the line, making it possible to place it just above the actual body content for clarity. For simplicity, it is not necessary to add a "#body" marker to templates that only generate one body. For requests where only the request body changes, the request URL template can simply generate one URL to be used with each request body; this will be the most common case for a POST. It is, however, possible to have a unique URL for each request body by generating as many URLs as there are request bodies. For example:
{{ range .customer.emailAddresses }}
#body
{
  "emailAddress": "{{.}}"
}
{{ end }}
Add Data Pulls #
The general form of the command is:
appcfg add data-pull <data-pull-name> -m <http-method> -t <data-type> -v <semantic-version> -d <dependent_data>
- CUSTOMER:
appcfg add data-pull customer -m GET -t sample_oms_customer -v 1.0
- ORDERS:
appcfg add data-pull orders -m GET -t sample_oms_order -v 1.0 -d sample_oms_customer
- PRODUCTS:
appcfg add data-pull products -m GET -t sample_oms_product -v 1.0 -d sample_oms_order
- Navigate to your root folder and confirm the appropriate folders and files have been created for your data pull above. See the list below. If all looks good, continue to the next step.
  - request_url.gtpl
  - config.json
  - external_id.gtpl
  - external_parent_id.gtpl
  - external_updated_at.gtpl
  - response_transformation.gtpl
  - data_schema.graphql (one file shared by all three data pulls; it will be filled in later)
- Again, we do not run the add data-pull-header command for each data pull because we have no headers specific to a data pull. Our auth headers are generic and can be applied to all data pulls; we created an auth header that applies to all of our app's REST calls in an earlier step.
Configuring the Data Pulls #
- Let's fill out the various configuration files in our new data pulls. As you know from earlier steps, each data pull has its own folder within data/pull. We will be referencing template data here.
- Each data pull also has a _test_/data folder made up of dataset folders. The customer and correlationId files come pre-loaded with sample data. These dataset folders also contain the expected value files that we will fill out. The configuration tool lets us use static data so that the outcome of our tests is deterministic; normally, Gladly provides customer data dynamically to the App Platform API requests.
- Note that the urlquery Go template function is available to escape URL query parameters so they are URL-safe (for more information on Go template functions, run appcfg platform template-functions). We start at the root of the template data with a ".", and then we select the primaryEmailAddress from the customer object (see the sketch after this list).
- We recommend validating and testing incrementally as each component of a data pull is created. For example, start by adding the request URL below, then run the validate command to ensure it passes validation. After that, set up the test file for request_url and run the test. Once you've tested your request URL, move on to adding the external ID, and so on.
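As a concrete sketch of that first component, here is roughly what data/pull/customer/request_url.gtpl could contain. It is reconstructed from the expected URL shown later in this tutorial and the primaryEmailAddress field mentioned above, so treat it as an approximation rather than a copy of the sample_oms file:

https://gladly-sample-oms.vercel.app/api/customers?emailAddress={{urlquery .customer.primaryEmailAddress}}

With the pre-loaded customer test data, the matching expected_request_url.txt in the success dataset would then contain the rendered URL, for example https://gladly-sample-oms.vercel.app/api/customers?emailAddress=primary%40email.com.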
Next, let's clean up our data pull and dataset by deleting unnecessary files that are not relevant. For example, we do not need request payloads for our GET data pulls, nor any transformations, so we will delete files such as response_transformation.gtpl and external_updated_at.gtpl.
Validate Data Pulls #
Before moving on to GraphQL schemas, let's validate and test our data pulls. In the prior step you tested individual data pull components; now validate everything together:
- Run appcfg validate data
- You'll notice that empty files will appear in the validator. Please delete any empty files that are still noted in the validator.
- If the validator has any other issues, please fix them before testing our app.
Test Data Pulls #
If the configuration you’ve written passes validation, it’s time to run our tests against the sample payloads again and confirm that everything works.
- We have already set up our test expectations and tested data sets individually. Confirm all expectations are correctly noted in the expected_... files.
- Run the following commands to test each data pull. You can add more tests in dataset folders within {data pull name}/_test_/data if you want additional test cases. Use the -d flag to name the specific test folder you want to use. In our case, because we only have one test dataset called success, we will set the -d flag to success.
  - appcfg test data-pull customer -d success
  - appcfg test data-pull orders -d success
  - appcfg test data-pull products -d success
After running the tests above, compare the actual output with the expected output. If there is a failure, you will see an error in the console. If there are any errors while running the test, read and resolve them before re-running the test.
As an example, the expected test output for appcfg test data-pull customer -t url should look something like the following in your terminal.
data/pull/customer/request_url.gtpl
-----------------------------------
https://gladly-sample-oms.vercel.app/api/customers?emailAddress=primary%40email.com
Once tests are passing, we will run our data pulls against the external system.
Run Data Pulls #
appcfg run data-pull -s '{"apiToken": "token"}'
GraphQL Schema #
It is time to think about data transformations and our data schema. We will not be doing any data transformations, because the raw data returned by our sample OMS is already in exactly the format we want. We will, however, model the data in GraphQL in the way that makes the most sense for our business. We want two GraphQL query entry points: one for customer and one for orders. Using an online transformation tool to convert your raw JSON response to GraphQL is a nice place to start; from there, you can tweak and modify. You can also write your GraphQL schema by hand.
Add the following GraphQL schema to data_schema.graphql:
- Note that we use GraphQL directives to reference the data pull the GraphQL datatype relates to. Each transformation needs to correspond to a data definition in GraphQL.
type Product @dataType(name: "sample_oms_product", version: "1.0") {
id: ID
name: String
productType: String
sku: String
}
type LineItem {
id: String
quantity: Int
  product: Product @childIds(template: "{{.productId}}") # we return the product object rather than the product ID here
status: String
}
type Order @dataType(name: "sample_oms_order", version: "1.0") {
id: ID
orderNumber: String
status: String
shippingAddress: String
billingAddress: String
orderDate: DateTime
customerId: String
shippingSpeed: String
totalPrice: Currency
shippingAndHandling: Currency
lineItems: [LineItem]
}
type Customer @dataType(name: "sample_oms_customer", version: "1.0") {
id: ID
emails: [String]
fullName: String
ltv: Currency
numberOfOrders: Int
phones: [String]
state: String
  orders: [Order] @parentId(template: "{{.id}}")
}
type Query {
customer: Customer
orders: [Order]
}
Note on GraphQL Types that we support: #
- All the GraphQL types from the spec (Int, String, etc)
- DateTime
- Currency
- Object Type Definitions
- Enum
- Input Type
Note on GraphQL Query Entry Points: #
We have chosen to expose only two GraphQL query entry points here (customer and orders) even though many data objects are relevant to users in Gladly. GraphQL is flexible enough that clients can request only the data they need in a single query, which helps avoid both underfetching and overfetching. In our case, because all of our data is hierarchical under Customers, users can request whatever nested fields they need using the customer or orders queries, and there is no need for additional explicit queries such as lineItems or product.
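For instance, a single query against the schema above can pull a customer together with nested orders, line items, and products. The query below is only an illustration of the shape such a request could take:

query {
  customer {
    fullName
    emails
    orders {
      orderNumber
      status
      lineItems {
        quantity
        product {
          name
          sku
        }
      }
    }
  }
}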
Note on GraphQL Directives: #
We utilize only the built-in GraphQL directives (@deprecated, @include, and @skip) plus the three additional directives below (@dataType, @childIds, and @parentId).
STANDARD #
GraphQL provides several built-in directives that are commonly used in GraphQL schemas. These directives include:
- @deprecated: Indicates that a field or enum value is deprecated and should not be used. It takes an optional reason argument to explain why the field is deprecated.
- @include(if: Boolean): Conditionally includes a field in the response based on the value of the if argument. If if is true, the field is included; otherwise, it is omitted.
- @skip(if: Boolean): Conditionally skips a field in the response based on the value of the if argument. If if is true, the field is skipped; otherwise, it is included.
These directives allow us to control the shape and content of GraphQL responses based on runtime conditions and deprecation status.
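As a generic illustration (not something the sample app requires), a client could combine these directives with the schema above to request orders only when a variable is set:

query Customer($withOrders: Boolean!) {
  customer {
    fullName
    orders @include(if: $withOrders) {
      orderNumber
      status
    }
  }
}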
ADD-ON #
We have added three additional directives for you to use:
- @dataType: We use the @dataType directive to associate a GraphQL type with a given data type from a data pull in Gladly. For example, below, the GraphQL type GladlyCustomer is associated with version "1.0" of the gladly_customer data pull:
type GladlyCustomer @dataType(name: "gladly_customer", version: "1.0") {
...
}
- @childIds: This directive specifies how child entities are identified within a parent entity. It typically takes a template as an argument, which is used to construct the unique identifiers of child entities based on the parent entity's data. For example, @childIds(template: "{{.productId}}") might indicate that each child entity's ID is derived from the productId field of the parent entity.
- @parentId: This directive identifies the parent entity associated with a child entity. It often takes a template argument as well, which is used to construct the unique identifier of the parent entity based on the child entity's data. For instance, @parentId(template: "{{.id}}") could indicate that the parent entity's ID is derived from the id field of the child entity.
These directives help establish hierarchical relationships between different data types in GraphQL schemas, allowing for structured and efficient data retrieval.
Validate and Run GraphQL #
- Once your GraphQL data schema has been written, it is time to validate and test it.
- Start by running the validate command.
- If your data schema passes validation, run each GraphQL entry point using the following commands:
  - appcfg run data-graphql -d success -q customer -s '{"apiToken": "token"}'
  - appcfg run data-graphql -d success -q orders -s '{"apiToken": "token"}'
ACTIONS #
In our example, we will define three actions in our Sample OMS Server: cancel_order, refund_order, and return_line_items. This will be similar to our data pull tutorial, but we will work on actions in the external system rather than data pulls. You will be reusing many concepts from the data pull section above.
- Run the add action command for each of our actions. Each of them is a POST with application/json content.
  - appcfg add action cancel_order -m POST -c application/json
  - appcfg add action refund_order -m POST -c application/json
  - appcfg add action return_line_items -m POST -c application/json
Now that the action folders and files have been created, let’s go in and add data and test data. Be sure to validate and test your actions as you incrementally add each component below:
app/actions/cancel_order
- request_url is https://gladly-sample-oms.vercel.app/api/orders/{{urlquery .inputs.orderId}}/cancel
- expected_request_url.txt contains https://gladly-sample-oms.vercel.app/api/orders/order1/cancel
- request_body:
{
  "cancellationReason": "{{.inputs.reason}}"
}
- expected_request_body.txt contains {"cancellationReason": "broken"}
- rawData.json contains {"status": "success"}
- inputs.json:
{
  "orderId": "order1",
  "reason": "broken"
}
app/actions/refund_order
- request_url is https://gladly-sample-oms.vercel.app/api/orders/{{urlquery .inputs.orderId}}/refund
- expected_request_url.txt contains https://gladly-sample-oms.vercel.app/api/orders/order1/refund
- request_body:
{
  "refundReason": "{{.inputs.reason}}"
}
- expected_request_body.txt contains {"refundReason": "broken"}
- rawData.json contains {"status": "success"}
- inputs.json:
{
  "orderId": "order1",
  "reason": "broken"
}
app/actions/return_line_items
- request_url is https://gladly-sample-oms.vercel.app/api/return
- expected_request_url.txt contains https://gladly-sample-oms.vercel.app/api/return
- request_body:
{
  "orderId": "{{.inputs.orderId}}",
  "lineItemIds": {{toJson .inputs.lineItemIds}}
}
- expected_request_body.txt contains:
{
  "orderId": "order1",
  "lineItemIds": ["lineItemA", "lineItemB"]
}
- rawData.json contains {"status": "success"}
- inputs.json:
{
  "orderId": "order1",
  "lineItemIds": ["lineItemA", "lineItemB"]
}
GraphQL Schema #
It is time to fill out the GraphQL schema for our actions. Add the following to app/actions/actions_schema.graphql:
type Result {
status: String!
message: String
}
type Mutation {
cancelOrder(orderId: String!, reason: String!): Result @action(name: "cancel_order")
refundOrder(orderId: String!, reason: String!): Result @action(name: "refund_order")
returnLineItems(orderId: String!, lineItemIds: [String!]!): Result @action(name: "return_line_items")
}
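For reference, a mutation against this schema could be invoked roughly as follows, reusing the sample values from the test data above; this is an illustration, not output from the tool:

mutation {
  cancelOrder(orderId: "order1", reason: "broken") {
    status
    message
  }
}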
- Delete any remaining files that are not relevant.
Before we move on to running against the external system, let's validate and test our actions again.
- Run appcfg validate actions
- You'll notice that empty files will appear in the validator. Delete any empty files that are still noted in the validator.
- If the validator has any other issues, please fix them before testing our app.
If the configuration you’ve written passes validation, it’s time to run our tests against the sample payloads and see if our actions work.
Use the -d flag to name the specific test folder you want to use. Because we only have one test dataset, we will use -d success to select it.
- Run appcfg test action cancel_order -d success
- Run appcfg test action refund_order -d success
- Run appcfg test action return_line_items -d success
After running the tests above, compare the actual output with the expected output. If there are any errors while running the test, read and resolve them before re-running the test. The expected test output for appcfg test action cancel_order -d success
should look something like the following in your terminal:
==========================
Test action "cancel_order"
==========================
actions/cancel_order/request_url.gtpl
-------------------------------------
https://gladly-sample-oms.vercel.app/api/orders/order1/cancel
actions/cancel_order/request_body.gtpl
--------------------------------------
{
"cancellationReason": "broken"
}
actions/cancel_order/response_transformation.gtpl
-------------------------------------------------
Once the tests pass, we will run our actions against the external system.
appcfg run action -s '{"apiToken": "token"}'
Then run our GraphQL mutations against the external system.
appcfg run action-graphql -d success -m cancelOrder -s '{"apiToken": "token"}' -i '{"orderId": "stuff", "reason": "stuff"}'
appcfg run action-graphql -d success -m refundOrder -s '{"apiToken": "token"}' -i '{"orderId": "stuff", "reason": "stuff"}'
appcfg run action-graphql -d success -m returnLineItems -s '{"apiToken": "token"}' -i '{"orderId": "stuff", "lineItemIds": ["stuff"]}'
Final Steps #
- Validate all or a subset of your configuration again (actions, data, or both).
  - Run appcfg validate actions
  - Run appcfg validate data
- Generate markdown documentation for your new app.
  - Run appcfg docs -d <root directory for the documentation tree>
- Create the app zip file containing all of the configurations you created above.
  - Decide the path of the app file you'd like to create.
  - Run appcfg build -f <path of app file>
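For example, assuming you want the archive in the current working directory (the file name here is just an illustration, matching the e.g. style used earlier):
appcfg build -f ./tutorial.zip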
Install and Configure Your App #
Install, configure, and activate your new app in Gladly. You can install this sample app in Gladly or stop here and begin configuring your own app, then install it in Gladly. For now, PS will walk you through the installation process while we work on building out an admin experience.
- At this time, PS has to install, configure, and activate the app on your behalf. You'll need to provide them with:
  - A zip file containing your app.
  - Any secrets/configuration.
  - The name of the integration.
  - Any username, public, or private keys.
  - Any custom configurations. If your app needs any environment-specific settings in the integration.json, provide those to PS now.
- Now an instance of your app has been associated with the required configuration (such as the API token stored on the integration) and is officially ready for use in Gladly Hero and Sidekick.