In the old days, hosting stacks consisted of applications and the servers they ran on.
Today, however, the hosting stack has become much more complex. As more and more applications move to distributed, microservices-based architectures, a new type of tool – the API gateway – has become a foundational layer of many hosting stacks.
Here’s a 101 on what API gateways do and when you might – or might not – want to use one.
What is an API gateway?
An API gateway is a tool or service that accepts API requests from clients, forwards them to applications, and sends the results back to clients.
In other words, an API gateway is a layer that sits between clients and applications and manages incoming API calls.
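To make that concrete, here is a minimal sketch (in Python, with hypothetical service names and routes) of the core routing step a gateway performs: matching an incoming request path to the backend service that should handle it. Real gateways do far more, but this is the layer-in-the-middle idea in miniature.

```python
# Illustrative sketch only: the route table and service URLs are
# made-up examples, not part of any real gateway product.
ROUTES = {
    "/users": "http://user-service:8080",
    "/orders": "http://order-service:8080",
}

def resolve_backend(path: str):
    """Return the backend base URL for a request path, or None if no route matches."""
    for prefix, backend in ROUTES.items():
        if path.startswith(prefix):
            return backend
    return None
```

Clients only ever see the gateway's address; which internal service actually answers is decided by this lookup.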
Key benefits of API Gateways
You don’t strictly need an API Gateway to accept API requests for an app. You can simply send requests directly to the apps and let them respond.
However, adding an API Gateway to your hosting stack provides several important benefits:
- Managing API requests: By intercepting API requests, API gateways can merge, reformat, or otherwise process requests and the resulting responses. This is useful if clients “say” one thing when calling an API but your microservices need to “hear” something different in order to respond. In this case, the API gateway essentially acts as a translation layer for API calls.
- Rate limiting: API gateways can “throttle” incoming requests, limiting the number of requests clients can make in a given time frame. Rate limiting helps mitigate abuse. It also protects against the risk that buggy or misbehaving clients overwhelm applications by making a large number of redundant and unnecessary requests.
- Load balancing: Although API gateways do more than just load balancing, the ability to balance load by distributing traffic across multiple application instances or microservices is one of their advantages.
- Monitoring and logging: API gateways can monitor and log API requests, providing the data needed to support observability.
- Security: API gateways can also enforce security rules. For example, they can block malicious requests to help prevent DDoS attacks.
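Rate limiting is the most self-contained of these functions, so it makes a good example. Below is a minimal token-bucket sketch in Python – an assumed illustration of the general technique, not the implementation any particular gateway uses. Each client gets a bucket; a request is allowed only if a token is available, and tokens refill at a fixed rate.

```python
import time

class TokenBucket:
    """Illustrative token-bucket rate limiter: allow bursts up to
    `capacity` requests, refilling at `rate` tokens per second."""

    def __init__(self, capacity: int, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)  # start full
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill tokens based on elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A gateway would keep one bucket per client (for example, keyed by API key or IP address) and reject requests with an HTTP 429 when `allow()` returns False.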
Many of these functions can be achieved, at least in part, with other types of tools. For example, you could monitor and log API requests directly within your app. But by offloading these tasks to the API gateway, you get a centralized way of handling all the major aspects of API management, without having to implement these functions directly within the application logic.
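This centralization is essentially the middleware pattern: cross-cutting concerns are wrapped around backend handlers once, at the gateway, instead of being re-implemented in every service. A small sketch, with a hypothetical handler type, of how logging might be layered on this way:

```python
import time
from typing import Callable, List, Tuple

# Hypothetical handler type for illustration: takes a request path,
# returns a response body.
Handler = Callable[[str], str]

def with_logging(handler: Handler, log: List[Tuple[str, float]]) -> Handler:
    """Wrap a backend handler so every request is recorded centrally,
    without the backend itself containing any logging code."""
    def wrapped(path: str) -> str:
        start = time.monotonic()
        response = handler(path)
        log.append((path, time.monotonic() - start))  # path + latency
        return response
    return wrapped
```

The same wrapping approach extends to authentication, rate limiting, or metrics: each concern becomes one wrapper applied at the gateway rather than code duplicated across microservices.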
API gateways vs. load balancers, orchestrators, and service meshes
API gateways aren’t the only new layer you’re likely to find in modern hosting stacks. They often work in conjunction with other types of solutions, including load balancers, orchestrators, and service meshes, which companies are also beginning to deploy to help manage distributed, microservices-based applications.
The functionality of an API gateway complements, and in some cases overlaps with, the functionality of these other types of services. But API gateways differ in several key ways:
- API gateways vs. load balancers: The sole task of a load balancer is to distribute application traffic. API gateways can do this too, but they can do many other things as well, as explained above.
- API gateways vs. orchestrators: Orchestrators manage the deployment of microservices. They also usually provide some monitoring and security features, which may overlap with API gateway features. Still, API gateways and orchestrators are fundamentally different types of tools.
- API gateways vs. service meshes: Service meshes manage internal API calls that microservices make to each other. In contrast, API gateways manage external API calls that originate from outside the application.
Where to find API gateways
There is no shortage of API Gateway offerings in the market.
All major public clouds, like Amazon, provide API gateway services that customers can use in conjunction with their cloud-hosted workloads. In addition, IT teams can take advantage of a variety of standalone API gateways – such as Kong and Tyk – that operate in public cloud, private cloud, hybrid cloud, or on-premises environments.
The features of different API gateway solutions vary somewhat, particularly in the monitoring, observability, and security functionality they provide. But they all perform the same basic function of managing API requests.
API Gateway Pros and Cons
API gateways are powerful tools, but they are not necessary for every application deployment.
The main use case for an API gateway is an application that accepts external API requests of any degree of complexity, and/or on which you need to enforce traffic or security controls.
However, disadvantages of API gateways include:
- Potentially slower performance: API gateways create another layer that traffic needs to go through, which can slow application performance – especially when the gateway is poorly configured.
- Reliability challenges: If you deploy a single instance of the API gateway, the gateway becomes a single point of failure for your application, creating a reliability risk.
- Redundancy: As mentioned above, much of the functionality of API gateways can instead be implemented using other tools. If you already handle traffic management, monitoring, observability, and security in other ways, an API gateway is redundant and adds unnecessary complexity to your hosting stack.
In short, API gateways have become an important part of many hosting stacks. If you need a centralized way to manage all aspects of your external API calls, an API gateway is the most direct way to do it.
However, API gateways are not necessary – or useful – for every use case. Before adding an API Gateway to your stack, think critically about the potential downsides to using an API Gateway, as well as alternative ways to achieve the functionality you’re looking for.
About the author: Christopher Tozzi is a technical analyst with expertise in cloud computing, application development, open source software, virtualization, containers, and more. He also lectures at a major university in the Albany, New York area. His book, For Fun and Profit: A History of the Free and Open Source Software Revolution, is published by MIT Press.