The ultimate guide to APIs
By allowing web services to communicate with each other and existing business systems, APIs drill through data silos and open up huge possibilities for data integration and application integration.
It is now a case of ‘survival of the fittest’ for web services. APIs give you the ability to migrate data and build integrated systems, so you can take advantage of services that offer new functionality at lower cost. This flexibility keeps cloud companies on their toes and means you can use the best services for your needs.
In this post, we’ll walk through what APIs are, why they should be at the center of your data integration strategy, and how you can use them to share and access data. More specifically we’ll take a look at:
- What an API is and why it is so important to the modern enterprise.
- How to move data between web services, and why this is going to be the new norm.
- Examples of how to use APIs to migrate data or entire systems, along with some challenges and solutions.
- How (and why) to build your own API without hosting any of your own infrastructure, using the AWS API Gateway and FME Cloud.
What is an API?
To be a successful business, you need to provide a way for clients to interact with your products and services. For example, a restaurant might offer its customers the convenience of ordering food by phone or through a website for delivery, as well as dining in.
In software and cloud technology, an API (application programming interface) is another way companies can serve their tools and services. APIs are used as a communication medium between clients and servers that make it easy for software to share data and services with one another for a variety of purposes. Because of this, APIs are becoming a primary point of interaction. They allow partners and customers to access core business systems, whenever they want, in a stable and secure way.
APIs are quietly running behind the scenes in most applications you use today. If you have ever geotagged a photo on Instagram, received a push notification from Uber, or booked a flight on Expedia, you have been exposed to an API. These applications rely on APIs to enhance their user experience by providing additional functions. For example, Instagram uses the Facebook Places Graph API to access its location database for geotagging photos. By leveraging Facebook’s extensive database created from user check-ins and addresses, Instagram is able to provide location-based services to its users.
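To make the idea concrete, here is a minimal sketch of what "calling an API" looks like from the client's side. The endpoint, parameters, and response fields are hypothetical placeholders (not the actual Facebook Places Graph API), but the pattern of request in, structured data out is the same everywhere.

```python
import requests

# Hypothetical places-style endpoint; real services document their own URLs and fields
URL = "https://api.example.com/v1/places/search"

# The client describes what it wants in the request...
response = requests.get(
    URL,
    params={"query": "coffee", "lat": 49.28, "lon": -123.12},
    headers={"Authorization": "Bearer YOUR_ACCESS_TOKEN"},
    timeout=10,
)
response.raise_for_status()

# ...and the server replies with structured data, typically JSON
for place in response.json().get("places", []):
    print(place["name"], place["address"])
```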
If a company does not offer an API for its software or service, it becomes nearly impossible for clients to integrate that service with their business systems. In fact, the market is now so competitive that success is not just about whether or not a company has an API, but about how usable and intuitive it is.
API Technical Components
Typically, an API is defined as a set of routines, protocols, and tools for building applications. But from a business perspective, an API can be treated as a product with three core functional components:
- API management and security
- The interface itself (resources, methods, etc.)
- The business logic tied to each resource
There are other important elements too, such as monitoring, analytics and threat protection, but these are not required to deliver an API, especially on a small scale.
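To illustrate the second and third components, the interface and the business logic behind it, here is a minimal sketch of a single resource. Flask and the "orders" resource are chosen purely for illustration and are not part of the original article; management, security, and analytics would sit in front of this in a real deployment.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# The interface: a resource ("orders") and the HTTP methods it supports
@app.route("/orders", methods=["GET"])
def list_orders():
    # Business logic tied to the resource; a stub standing in for real queries
    return jsonify([{"id": 1, "status": "open"}])

@app.route("/orders", methods=["POST"])
def create_order():
    payload = request.get_json() or {}
    # Validation, persistence, and notifications would live here
    return jsonify({"id": 2, **payload}), 201

if __name__ == "__main__":
    app.run(port=5000)
```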
Why Are APIs So Popular?
APIs used to be niche technology, created by tech companies such as Salesforce, AWS and Google—the pioneers of APIs. This is no longer the case. As a result of software permeating nearly every industry and product, APIs are now mainstream. There are several reasons for this:
- Effortless Integration – APIs allow partners and customers to access your systems in a stable and secure way.
- Mobile Phones – Devices embedded with sensors are everywhere and fit the service-based structure of APIs perfectly.
- Competitive Market – The market is now so competitive that a company’s success may depend on how usable and intuitive their API is.
- Flexibility – APIs allow you to quickly leverage and use your desired services. This lowers risk and allows for greater innovation and more rapid development.
- Cloud Computing – Organizations rely more and more on cloud infrastructure, and with new models like serverless on the rise, it’s never been easier to offload computing to the cloud. APIs are needed for both the initial migration and continuous integration with cloud systems.
- Proven Success – Companies that adopted an API-first strategy disrupted entire sectors (think Salesforce, eBay, Amazon, Twitter) and left larger incumbents scrambling to catch up.
APIs can also be hugely beneficial on a smaller scale. Producing internal APIs can transform and streamline internal business processes. With tools now available to create a fully functioning scalable API in less than a day, organizations are realizing the potential for APIs to modernize and unify distributed legacy systems under a common interface, sometimes only for the life-cycle of a project.
Using APIs for Cloud Data Migration
Due to the success of cloud technology, many organizations now want to migrate data to be used in their cloud services or simply into cloud storage systems. Two data challenges tend to occur when shifting operations into the cloud. Using APIs is the key to solving both.
The Initial Bulk Upload
(aka. Getting data from an on-premises source into the cloud, or moving data from an existing cloud service)
Migrating data in bulk, whether from on-premises infrastructure or from another service, can take significant effort, and it’s crucial to ensure as much data as possible is mapped from the original data source to the new service. Considerations include the following (a rough code sketch of these clean-up steps appears after the list):
- Renaming attributes
- Cleaning and validating data (removing duplicates, truncating data, removing special characters, etc.)
- Merging data from multiple sources into one new schema
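As promised above, here is a hedged sketch of those clean-up steps in pandas. The file names, column names, and truncation limit are hypothetical, and in practice a tool like FME would handle these same steps visually rather than in code.

```python
import pandas as pd

# Hypothetical source extracts; these might equally come from a database or an earlier API call
accounts = pd.read_csv("legacy_accounts.csv")
contacts = pd.read_csv("crm_contacts.csv")

# Rename attributes to match the target schema
accounts = accounts.rename(columns={"acct_nm": "account_name", "tel": "phone"})

# Clean and validate: drop duplicates, strip special characters, truncate over-long values
accounts = accounts.drop_duplicates(subset="account_name")
accounts["account_name"] = (
    accounts["account_name"]
    .str.strip()
    .str.replace(r"[^\w\s-]", "", regex=True)
    .str.slice(0, 80)
)

# Merge data from multiple sources into one new schema
merged = accounts.merge(contacts, on="account_name", how="left")
merged.to_csv("ready_for_upload.csv", index=False)
```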
System Integration
(aka. Connecting services with existing business processes)
Enterprises are leveraging web services to save time and money, but the penalty is a highly fragmented environment. It is therefore imperative that you are able to connect these web services. The following are important when integrating services:
- A scheduling tool that enables you to automate your connection workflows
- A cloud-based deployment so you don’t have to worry about managing further infrastructure
- Fault tolerance and monitoring
General Data Migration Steps
Migrating data between services or systems can be a complex process. One of the most challenging parts is understanding the data models in order to construct an accurate mapping. The good news is that once that’s finished, flexible data transformation tools like FME make the actual migration very straightforward, and you will come away with repeatable, reusable processes.
1) Connecting To and Authenticating APIs
To access services, you need to determine the authentication mechanism: token, OAuth 2.0 or maybe HTTP Basic. Each service usually interprets the standard slightly differently. For example, with token-based authentication, does the token go in the query string or in the header?
The complexities around authentication, especially if the service is using OAuth 2.0, make it one of the biggest barriers to working with web services. However, if you choose to use FME, OAuth 2.0, token, and basic authentication are all supported. So, once you determine the best way to authenticate to the APIs you want to use, you can set it and forget it in FME, and focus your efforts on the migration work itself.
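To show how the same request can need different authentication plumbing, here is a hedged Python sketch of the three common schemes. The base URL and the api_key parameter name are hypothetical; every real service documents its own convention.

```python
import requests

BASE = "https://api.example.com/v1"  # hypothetical service

# Token sent in a request header (a common convention)
r1 = requests.get(BASE + "/records", headers={"Authorization": "Bearer MY_TOKEN"})

# Token sent in the query string (some services expect this instead)
r2 = requests.get(BASE + "/records", params={"api_key": "MY_TOKEN"})

# HTTP Basic authentication
r3 = requests.get(BASE + "/records", auth=("username", "password"))

# OAuth 2.0 usually starts with exchanging credentials for a short-lived access
# token, which is then sent as a Bearer token; libraries such as requests-oauthlib
# can handle that token exchange for you.
```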
2) Moving the Core Information
This is the crux of any migration project. Every migration is unique and has different types of challenges for each set of data being worked with, so it’s important to know exactly what pieces of core information you have. Core information can also be thought of as the primary or largest data ‘objects’ in your system such as:
- Articles from a knowledge base
- Data about user accounts
- Custom files like images, categories, or video attachments
- Cases or contacts from a CRM
- Work tickets in an issue-tracking system
- Work orders from an asset management system
- Employee records from an HR system
Although this step revolves around large-scale data migration, there are many small-scale considerations to make. Relationships, hierarchies, metadata, and many other fine details will need to be accounted for. Imagine mapping lower-level data like nested comments, replies, or sub-tasks; migrating between different statuses, labelling systems, or tagging schemas; or redefining permissions or ownership in a new system.
3) Creating a Repeatable Migration Process
A major difference between loading data via API calls and a direct read-and-write method is that the loading process can easily become a multi-phase process. One piece of data can be loaded and the resulting object, now immediately available through the API, can be used in the next phase of the migration. This does require a bit of a shift in approach as creating a repeatable migration process is more about defining a set of steps than about mapping out an exact target dataset.
For example, consider a common top-level item in a migration from one asset management system to another like a work order. Once a work order is loaded into the new system via an API, its new URL (or at least ID) will be returned in the response header or body. From here, it would be easy to extract the URL (or ID) and use it to post related items like comments, tasks, or equipment costs to the work order.
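As a rough sketch of that two-phase pattern (the endpoints, field names, and token are hypothetical, not any particular asset management product):

```python
import requests

BASE = "https://new-system.example.com/api"   # hypothetical target API
HEADERS = {"Authorization": "Bearer MY_TOKEN"}

# Phase 1: load the top-level object (the work order)
resp = requests.post(f"{BASE}/workorders", json={"title": "Replace valve 12"}, headers=HEADERS)
resp.raise_for_status()
workorder_id = resp.json()["id"]   # the new system assigns the ID (or URL)

# Phase 2: use the returned ID to attach related items
for comment in ["Parts ordered", "Crew scheduled for Tuesday"]:
    requests.post(
        f"{BASE}/workorders/{workorder_id}/comments",
        json={"body": comment},
        headers=HEADERS,
    ).raise_for_status()
```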
4) Leaving Room for Special Cases or Enhancements
When undertaking the migration of data from A to B, you might also uncover a need to improve business processes by adding C. For example, you might want to level up to site-wide authentication for your enterprise in system B, meaning you might have an extra step of migrating user information to another system like Auth0 to make logging in that much easier for your customers or daily users.
Ideally, these requirements would be incorporated into the initial plan and scope of the migration project, but sometimes the need is not visible until the data itself is examined, and it can be valuable to maintain a degree of flexibility.
5) Handling API Errors
API errors are a fact of life. They can be caused by data anomalies, network timeouts, improperly formatted requests, or various server errors.
At a minimum, it’s important to log these errors so that information doesn’t get lost. Ideally, you can also identify what caused each error and resubmit requests for only the failed content. Again, thankfully, these are extremely visible and trackable in FME, and the flexibility of partially running workspaces with subsets of data means these errors can easily be worked out.
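Outside of FME, a minimal sketch of "log everything, retry what failed" might look like the following. The endpoint, token, and sample records are placeholders, and the retry policy is deliberately simple.

```python
import logging
import time
import requests

logging.basicConfig(filename="migration_errors.log", level=logging.INFO)

URL = "https://api.example.com/v1/records"     # hypothetical endpoint
HEADERS = {"Authorization": "Bearer MY_TOKEN"}
records_to_load = [{"id": 101, "name": "Pump A"}, {"id": 102, "name": "Pump B"}]  # placeholder data

failed = []
for record in records_to_load:
    for attempt in range(3):                   # simple retry for transient faults
        try:
            resp = requests.post(URL, json=record, headers=HEADERS, timeout=30)
            resp.raise_for_status()
            break
        except requests.RequestException as exc:
            logging.warning("Attempt %s failed for record %s: %s", attempt + 1, record["id"], exc)
            time.sleep(2 ** attempt)           # back off before retrying
    else:
        failed.append(record)                  # keep the payload so it can be resubmitted later

logging.info("Migration finished with %s failed records", len(failed))
```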
Tools & Solutions For Data Migration
You don’t need to be a hardcore developer to work with APIs. In fact, coding isn’t necessary with data integration tools, which can save you a lot of time and effort. Two types of tools exist for data integration with APIs: point-to-point and flexible solutions.
Point-to-Point Solutions
With the explosion of web services came the explosion of tools to help you move data between services. The majority of these tools provide point-to-point integration, a solution that solves one specific challenge. There are, however, significant limitations with point-to-point solutions.
- Limited data transformation capabilities. The data transformation component is often baked into the connector with little or no control over transforming the data as it moves between applications. This puts you at risk because if you wish to transform the data in a non-standard way, you will need to do custom development.
- Short-term savings, long-term pain. Once you have spent time training staff in how to use the software, there is no way to reuse that knowledge and apply the logic to other integration workflows.
- Complex to maintain and monitor. In the short term, point-to-point offers an enticing promise of lower IT maintenance costs, but as more one-off integrations pile up, complexity and costs rise dramatically. As you add more and more point-to-point integrations, it becomes increasingly hard to monitor them and ensure reliability.
Flexible Solutions
Flexible data integration tools are single investments that can accommodate multiple new applications without users having to learn new concepts or build new components. Flexible tools also allow you to transform data as it moves, which means you can use your data exactly how it’s needed. Choosing a tool that provides flexibility, like FME, is a crucial part of delivering long-term data integration architectures!
Building an API
APIs have evolved over the years, and companies now have many choices for building and deploying them. The decision depends on the requirements of the project.
- Self-managed API – This is the most flexible option, but also requires a strong development capacity and the ability to deploy, monitor and maintain a web stack.
- Managed API – This option takes away a large amount of the pain around running a production API. You still need to create your API using your technology of choice, but the management, security, analytics and usability of the API are handled by the service.
- Serverless and Codeless API – This takes away the pain of managing and running your own infrastructure. All you have to worry about is the business logic. Authorization and authentication can be handled by a service.
Why Create APIs?
As discussed previously, to be successful in the modern enterprise, you need to provide developers with a clear interface to your business so they can access core business systems whenever they want, in a stable and secure way. Another benefit of APIs is that they abstract the internal implementation, so you can change internal behaviour without impacting customer implementations. This matters: if you decide to migrate a data store, for example, you can move the data, reconnect it to the original API, and users can keep working against the same interface. This decoupling lowers the risk considerably for consumers of the API.
Creating an API has become a commodity, with vendors such as AWS and Azure providing managed services. The complexity, therefore, lies not in creating the API, but in connecting the API to the data. This part is not trivial and was traditionally done with code, but with FME you can connect an API to hundreds of data sources without writing any code.
While FME workflows are extremely powerful, on their own they do not give developers an easy way to interact with your data; putting an API in front of them does.
When is a Codeless API a Good Fit?
There are many ways you can go about building and hosting an API. We are focusing on the codeless and serverless model using AWS API Gateway in conjunction with FME.
To assess if this is a good fit for your scenario, here is a checklist:
- You wish to allow developers to access your data and processes.
- You don’t have access to developers and want to do everything within a GUI.
- Agility is key and you wish to create disposable APIs that might only last a short duration—say the lifetime of a project.
- You are prototyping a new service. Don’t just put a website up for beta users—get an API in front of them. APIs are much stickier than web apps. If you can get beta users to integrate your solution into their workflows, you will have a higher chance of retaining them.
A serverless and codeless API is probably not a good fit if you want to create a large, complex API that will serve a significant user base with millions of requests, as you will need more control over the stack in order to optimize it.
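To give a feel for what the finished product looks like from a developer's point of view, here is a hedged sketch of calling an API Gateway endpoint that forwards requests to an FME workspace behind the scenes. The URL, path, parameters, and API key are placeholders, not values from a real deployment.

```python
import requests

# Placeholder API Gateway endpoint; behind it, the request is routed to an
# FME workspace that does the actual data work
ENDPOINT = "https://abc123.execute-api.us-west-2.amazonaws.com/prod/reports"

resp = requests.post(
    ENDPOINT,
    json={"region": "north", "format": "geojson"},    # hypothetical request parameters
    headers={"x-api-key": "MY_API_GATEWAY_KEY"},      # API Gateway can enforce API keys
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```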
APIs & FME
We’ve covered API basics, best practices for migrating data with APIs, and steps and considerations for building your own API. Now let’s take a look at what it takes to work with APIs using FME as the central tool.
A migration project using FME will rely on many of the steps referred to in the sections above (like General Data Migration Steps), with the added benefit of FME’s complete suite of tools and features that make migrations and API integrations easy. With FME, you can build workflows with a visual programming language of transformers, view your data as you manipulate it and make API requests, connect to anything, and build flexible, repeatable processes.
Here’s a rundown of some of the most useful and commonly used FME tools and features for using APIs to migrate data.
Transformers are used to read data into the workflow, validate the data and correct it. Several key transformers take center stage when performing a bulk migration using APIs:
- HTTPCaller – all API communication happens over HTTP requests. This transformer allows you to make a request to a specific URL, use Web Connections with stored credentials, send payloads, and even make dynamic requests based on the data in your workspace.
- JSONTemplater – data sent to modern APIs is commonly in JSON, and this transformer is used to generate properly formatted JSON for request bodies based on FME attribute values.
- FeatureMerger/FeatureJoiner – these transformers are indispensable when linking up related features, datasets, and metadata from different sources. They can be extremely helpful in joining data when multiple API calls or a blend of API calls and local data are needed to get the full picture.
- Sampler – this transformer allows you to narrow down your data to a subset of your choice for testing and validation before running it against an API in bulk. Indispensable when developing an API-based workflow in FME!
- AttributeManager – a bit like the Swiss Army knife of transformers, this can be used for everything from schema mapping, to attribute cleanup, to field calculations. When working with APIs, it’s a common practice to build URLs, request bodies, and even query parameters with an AttributeManager before sending the request with an HTTPCaller (a plain-code sketch of this pattern follows the list).
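For readers who think in code, here is a rough non-FME equivalent of that AttributeManager, JSONTemplater, and HTTPCaller chain: derive values per record, template them into a JSON body, and send the request. The endpoint and field names are hypothetical.

```python
import json
import requests

records = [{"asset_id": 17, "status": "open"}, {"asset_id": 42, "status": "closed"}]  # placeholder data

for rec in records:
    # AttributeManager-style step: build the URL dynamically from the record
    url = f"https://api.example.com/v1/assets/{rec['asset_id']}/status"
    # JSONTemplater-style step: template attribute values into a JSON body
    body = json.dumps({"status": rec["status"], "source": "migration"})
    # HTTPCaller-style step: send the request with credentials
    resp = requests.put(
        url,
        data=body,
        headers={"Authorization": "Bearer MY_TOKEN", "Content-Type": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
```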
Authentication is a crucial part of working with APIs. Web Connections allow you to authenticate via token, OAuth 2.0, or HTTP Basic, which covers the most popular schemes.
While designing a migration workflow, it’s a common best practice to test out your workflow ideas in a staging environment. When you are satisfied everything is running smoothly, switch the target over to the production environment. Rather than carrying out the tedious task of changing URLs, usernames and passwords in every HTTPCaller in your workspace, all of these can be set as published parameters – including the Web Connection. This has several advantages (a short non-FME sketch of the same principle follows the list):
- You can easily switch between environments
- The migration can easily be run with a different user account
- Credentials are kept separate from the migration workspace
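The same principle applies outside of FME: keep environment details and credentials out of the workflow itself. A minimal sketch, assuming the environment variable names and URLs below are placeholders:

```python
import os
import requests

# Environment-specific settings live outside the workflow, so switching targets
# or user accounts never means editing every request by hand
BASE_URL = os.environ.get("TARGET_BASE_URL", "https://staging.example.com/api")
TOKEN = os.environ.get("TARGET_API_TOKEN", "MY_STAGING_TOKEN")  # credentials kept separate

resp = requests.get(f"{BASE_URL}/workorders", headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print(len(resp.json()), "work orders found")
```

Pointing the same script at production is then just a matter of changing TARGET_BASE_URL and TARGET_API_TOKEN, which mirrors how published parameters and Web Connections behave in FME.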
Feature Caching, Partial Runs, and Visual Preview – these built-in FME features have technical names, but chances are you’re already using them. They’re intuitive to use and make working with APIs a breeze. Together, they form a suite of tools that allow you to visually inspect your live data at any point in your FME workflow as well as selectively run only the pieces of your workflow that you choose. This allows you to build your workflow piece by piece, test and debug specific transformers or transformer sequences, and be aware of the changes to your data at every step. It also means that FME remembers all that data you got from your API when you ran your workflow a minute ago, and you can keep working with it without having to repeat the requests!
Assessing the Result
APIs give you the flexibility to choose the best, fit-for-purpose applications and services for your needs. The key takeaway? You’re never stuck in one system. With APIs and data transformation technology like FME to get your data exactly how it’s needed, you are free to pick whichever services work best for you.
To learn more, check out these resources or download a trial of FME for free: