API Gateways, Load Balancers, and Reverse Proxies

Are you familiar with the essential components of managing inbound traffic in modern application architectures?

Let's explore the key concepts behind API Gateways, Load Balancers, and Reverse Proxies, and how they play a vital role in optimizing traffic flow.

API Gateway

API Gateways serve as a single entry point, efficiently routing incoming requests to various backends. With the capability to perform validations, API Gateways ensure the integrity of requests before passing them along. By consolidating multiple services behind one cohesive API, they streamline the development process and enhance overall system performance.

A single entrance point distributes requests to different backends and can perform validations.

The idea is to group multiple different services within one public API. So in theory, there is one entry point and N backends.
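As a rough sketch, an API Gateway can be approximated with an Nginx configuration that routes different URL paths to different backends. The hostnames and ports below are placeholders, not real services:

```nginx
# Hypothetical sketch: one public entry point, N different backends.
server {
    listen 80;

    # Requests under /users/ go to the (hypothetical) user service
    location /users/ {
        proxy_pass http://user-service:8001;
    }

    # Requests under /orders/ go to the (hypothetical) order service
    location /orders/ {
        proxy_pass http://order-service:8002;
    }
}
```

Clients only ever see the one public address; which backend actually serves a request is an internal routing decision.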

Load Balancer

Imagine a scenario where you have numerous backend instances handling incoming traffic. Load Balancers act as the traffic's central hub, distributing it evenly across multiple backend copies. With the ability to scale and balance the load, Load Balancers optimize resource utilization, prevent bottlenecks, and ensure smooth operations even under high traffic volumes.

The single entry point distributes incoming traffic to M instances of the backend.

The idea is to have multiple instances of a service but a single entry point to handle them. The load of requests is spread according to a decided strategy (e.g., round-robin). In theory, there is one entry point and M copies of the same backend.
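As a minimal sketch in Nginx (the IPs match the example scenario later in this post; the port is a placeholder), an `upstream` block distributes requests round-robin by default:

```nginx
# Hypothetical sketch: one entry point, M copies of the same backend.
# Nginx uses round-robin distribution unless told otherwise.
upstream backend_pool {
    server 192.168.0.12:8000;
    server 192.168.0.13:8000;
}

server {
    listen 80;
    location / {
        proxy_pass http://backend_pool;
    }
}
```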

Reverse Proxy

When it comes to handling credentials and transforming requests, Reverse Proxies shine. Acting as intermediaries between clients and backends, they validate credentials and perform request transformations before forwarding them to the appropriate backend. By providing an extra layer of security and flexibility, Reverse Proxies enhance overall system resilience and performance.

A single entrance for a service intermediates before passing the request to the backend.

The idea is to have a service that checks credentials in advance or transforms the requests before passing them to the backend service. So in theory, there is one entry point and one backend.
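A minimal Nginx sketch of this shape (one entry point, one backend; the IP, port, and the Authorization check are illustrative assumptions, not a production auth scheme):

```nginx
# Hypothetical sketch: validate and transform before forwarding.
server {
    listen 80;

    location / {
        # Crude credential check: reject requests with no Authorization header
        if ($http_authorization = "") {
            return 401;
        }
        # Transform the request: record the original client address
        proxy_set_header X-Forwarded-For $remote_addr;
        proxy_pass http://192.168.0.12:8080;
    }
}
```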

What do software providers do?

A Reverse Proxy such as Nginx can distribute incoming requests among multiple copies of the same backend; consequently, Nginx can function as a Load Balancer. Its capabilities don't stop there: Nginx can also route to N different backends, making it suitable for fulfilling the role of an API Gateway.

The key factor to consider is the intended purpose of the incoming traffic: distributing requests across multiple different services, scaling multiple instances of the same service, or intermediating to validate or transform requests. A Reverse Proxy can play each of these roles.

Additionally, a Reverse Proxy can reduce the number of requests sent to the backend by efficiently caching objects.
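In Nginx this could be sketched with a cache zone in front of the backend (the paths, sizes, and validity window below are placeholder values):

```nginx
# Hypothetical sketch: caching at the reverse proxy to reduce backend load.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=app_cache:10m max_size=1g;

server {
    listen 80;
    location / {
        proxy_cache app_cache;
        proxy_cache_valid 200 10m;   # serve cached 200 responses for 10 minutes
        proxy_pass http://192.168.0.12:8080;
    }
}
```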

It's worth noting that Load Balancers and API Gateways typically do not cache responses. In contrast, a service like CloudFront can act as a Reverse Proxy with caching capabilities.

When comparing Load Balancers and API Gateways, the latter is much more extensible. While Load Balancers primarily forward requests to services, API Gateways offer advanced features like inspecting authenticated endpoints for authentication tokens, performing request manipulation before forwarding (e.g., via an AWS Lambda function), and managing quotas. Furthermore, API Gateways integrate with identity providers, making them a versatile and powerful tool.

Example Scenario

Let's begin with a simple scenario and gradually introduce more complexity:

  1. Scenario 1 - Reverse Proxy:

    • In this scenario, we have two machines: Machine A with IP 192.168.0.12, which is not directly accessible from the internet due to firewall restrictions, and Machine B with a public IP 1.1.1.1, accessible from the internet.

    • Internet users connect to Machine B (1.1.1.1), unaware that their requests are transparently forwarded to Machine A (192.168.0.12). This process is commonly known as 'port forwarding' or 'reverse proxying'.

  2. Scenario 2 - Load Balancing:

    • In this new setup, we introduce two machines, both performing the same work, with IPs 192.168.0.12 and 192.168.0.13. Both machines are firewalled off from direct internet access.

    • Instead of a simple reverse proxy, we implement load balancing to evenly distribute user requests between the two machines. This ensures improved performance and optimal utilization of resources.

    • The load balancer can be configured to intelligently direct more traffic to 192.168.0.13 if it can handle a higher load, and we can also set rules to maintain session consistency for users (once a user is routed to .13, they consistently go to .13).

  3. Scenario 3 - API Gateway:

    • Now, we enhance the system by introducing additional functionalities like API call limits and authentication.

    • Rather than having individual machines handle these functionalities, we consolidate them into the load balancer, effectively transforming it into an API Gateway.

    • The API Gateway takes charge of managing access control, call limits, and throttling for both machines (.12 and .13) and any potential future servers. This centralized approach streamlines API management and ensures consistent application of security measures across the system.

In summary, these scenarios show the progression from a simple port forwarding setup to a more complex load balancing system and eventually evolving into an API Gateway with advanced functionalities. Each step adds complexity and provides additional capabilities based on the specific needs of the system.
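Scenario 2's weighting and session stickiness could be sketched in Nginx as follows (the port and the weight value are assumptions; `ip_hash` pins clients to a backend by client IP, which is one simple way to keep a user on the same machine):

```nginx
# Hypothetical sketch: weighted load balancing with session stickiness.
upstream backend_pool {
    ip_hash;                            # same client IP -> same backend
    server 192.168.0.12:8000;
    server 192.168.0.13:8000 weight=2;  # .13 receives roughly twice the traffic
}

server {
    listen 80;
    location / {
        proxy_pass http://backend_pool;
    }
}
```

Adding access control, call limits, and throttling on top of this single entry point is what would turn it into the API Gateway of Scenario 3.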

Conclusion

As the digital landscape continues to evolve, managing inbound traffic becomes increasingly crucial. Whether you're seeking to consolidate services, balance loads across multiple instances, or ensure secure request handling, understanding the roles of API Gateways, Load Balancers, and Reverse Proxies is paramount. By leveraging these powerful tools, you can optimize your system's performance, scalability, and security.

Thank you 😊 for taking the time ⏰ to read this blog post 📖. I hope you found the information 📚 helpful and informative 🧠. If you have any questions ❓ or comments 💬, please feel free to leave them below ⬇️. Your feedback 📝 is always appreciated.

