Setting Up an API Gateway Using NGINX
An API Gateway acts as a single point of entry for all your API clients. It's like a facade that forwards API requests to one or more internal microservices. One of the benefits of using an API Gateway is the abstraction of your backend services. Clients need not know about your backend's microservices; they only need to communicate with the gateway.
There are various tools and platforms available for setting up an API Gateway, and NGINX is one of them. NGINX is a high-performance web server, reverse proxy, load balancer, and even an IMAP/POP3 mail proxy. It is widely used as an API gateway because it can accept a large volume of incoming API requests and distribute them across different backend services.
In this post, we'll cover how to set up an API Gateway using NGINX.
Step-by-Step Setup:
1. Create Basic APIs:
To begin, ensure you have Flask installed:
pip install Flask
Now, let's create two separate APIs:
server_3001.py:
from flask import Flask

app = Flask(__name__)

@app.route('/service1/')
def service1():
    return "Hello from Service 1"

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the service is also reachable from other containers
    # in the Docker setup in step 8 (localhost access still works).
    app.run(host="0.0.0.0", port=3001)
server_3002.py:
from flask import Flask

app = Flask(__name__)

@app.route('/service2/')
def service2():
    return "Hello from Service 2"

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the service is also reachable from other containers
    # in the Docker setup in step 8 (localhost access still works).
    app.run(host="0.0.0.0", port=3002)
To run each service, open two terminal windows (or tabs). In the first window, run:
python server_3001.py
And in the second window, run:
python server_3002.py
Your two services are now running at http://localhost:3001/service1/ and http://localhost:3002/service2/ respectively.
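Before moving on, it is worth confirming that each service responds on its own:

curl http://localhost:3001/service1/
curl http://localhost:3002/service2/

Each command should print the corresponding greeting.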
With these services running, once you have set up the NGINX API gateway as described in the steps below, accessing http://api.mydomain.com/service1/ will return "Hello from Service 1" and http://api.mydomain.com/service2/ will return "Hello from Service 2".
2. Install NGINX:
If you haven't already, install NGINX:
sudo apt update
sudo apt install nginx
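You can confirm the installation and check that the service is running:

nginx -v
sudo systemctl status nginx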
3. Create Configuration:
Create a new NGINX configuration file or modify the existing default one. For this example, let's create a new file called api_gateway.conf inside /etc/nginx/sites-available.
sudo nano /etc/nginx/sites-available/api_gateway.conf
4. Configure the Reverse Proxy:
For the purpose of demonstration, assume you have two backend services running on localhost at ports 3001 and 3002.
Paste the following configuration. Files under sites-available are included inside the http block of the main nginx.conf, so they should contain only upstream and server blocks (no http or events wrapper):

upstream service_3001 {
    server localhost:3001;
}

upstream service_3002 {
    server localhost:3002;
}

server {
    listen 80;
    server_name api.mydomain.com;

    location /service1/ {
        proxy_pass http://service_3001;
    }

    location /service2/ {
        proxy_pass http://service_3002;
    }
}
This configuration defines one server block with two proxied locations: any request that arrives at api.mydomain.com/service1/ is forwarded to the microservice running on port 3001, and likewise for service2 on port 3002.
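In practice you will usually also want the backend services to see details about the original client request. A minimal sketch of directives you could add inside each location block (these are standard NGINX directives; the exact header set is just one common choice):

    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;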
5. Enable the Configuration:
Create a symbolic link of the config file from sites-available to sites-enabled.
sudo ln -s /etc/nginx/sites-available/api_gateway.conf /etc/nginx/sites-enabled/
6. Test and Restart NGINX:
Before restarting NGINX, always check if the configuration syntax is correct:
sudo nginx -t
If everything is correct, restart NGINX:
sudo systemctl restart nginx
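If NGINX is already running, you can also pick up the new configuration without dropping existing connections:

sudo systemctl reload nginx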
7. Test Your API Gateway:
Now, if you hit http://api.mydomain.com/service1/ or http://api.mydomain.com/service2/, NGINX should forward the request to the corresponding backend service. This assumes api.mydomain.com resolves to the machine running NGINX.
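If the domain doesn't resolve yet (api.mydomain.com is only an example domain), you can still exercise the gateway from the NGINX host by overriding the Host header:

curl -H "Host: api.mydomain.com" http://localhost/service1/
curl -H "Host: api.mydomain.com" http://localhost/service2/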
8. Use docker-compose:
Create a Dockerfile for the Python web servers. Both services can share the same image; docker-compose will override the startup command for each one.
# Use an official Python runtime as the parent image
FROM python:3.11.5-slim
# Set the working directory in the container
WORKDIR /app
# Copy the current directory contents into the container at /app
COPY . /app
# Install Flask
RUN pip install --no-cache-dir Flask
# Specify the command to run on container start
CMD ["python", "./server_3001.py"]
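If you want to sanity-check the image on its own before wiring up Compose, you can build and run it directly (the tag flask-services is just an example name; the default CMD starts server_3001.py):

docker build -t flask-services .
docker run --rm -p 3001:3001 flask-services

Visiting http://localhost:3001/service1/ should then return the greeting from Service 1.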
Create a docker-compose.yml to combine the three services: NGINX and the two Flask apps.
version: '3'

services:
  nginx:
    image: nginx:alpine
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
    ports:
      - "8080:80"
    depends_on:
      - service_3001
      - service_3002

  service_3001:
    build:
      context: .
      dockerfile: Dockerfile
    command: python ./server_3001.py
    expose:
      - "3001"

  service_3002:
    build:
      context: .
      dockerfile: Dockerfile
    command: python ./server_3002.py
    expose:
      - "3002"
Additional Tips:
SSL: Secure your API gateway using SSL certificates. Let's Encrypt provides free SSL certificates which you can set up easily with NGINX.
Rate Limiting: Use NGINX's rate limiting feature to prevent abuse of your API (see the sketch after this list).
Logging: Configure access logs and error logs for monitoring and debugging purposes.
Load Balancing: If you have multiple instances of a microservice, NGINX can distribute the load among them. Add each instance as an extra server entry in the corresponding upstream block; the proxy_pass directive can stay as it is (see the sketch after this list).
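A minimal sketch of the rate limiting and load balancing tips, where the zone name api_limit and the second backend address localhost:3003 are placeholders to adapt. The limit_req_zone directive belongs at the http level (for example in /etc/nginx/nginx.conf), while limit_req and the extra upstream server go in api_gateway.conf:

# At the http level: track clients by IP, allowing 10 requests per second.
limit_req_zone $binary_remote_addr zone=api_limit:10m rate=10r/s;

# In api_gateway.conf: apply the limit and spread service1 across two instances.
upstream service_3001 {
    server localhost:3001;
    server localhost:3003;  # hypothetical second instance of service 1
}

server {
    listen 80;
    server_name api.mydomain.com;

    location /service1/ {
        limit_req zone=api_limit burst=20;
        proxy_pass http://service_3001;
    }
}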
Conclusion:
Using NGINX as an API gateway is an efficient and effective way to manage and scale your microservices infrastructure. By consolidating all external-facing APIs under a single gateway, you can achieve centralized management, monitoring, and security. Happy coding!