Learn some infra
Published:
What is RabbitMQ (message queue)
Think of the usual setup: webpage -> server -> DB.
When the web server is overloaded, or simply goes down, all incoming requests are lost.
Instead, the user's request is now sent to a message broker. The broker "caches" the request in a queue and lets the user get on with their work. The broker then sends requests from the queue to the server, in sequence, to be processed.
Reconsider the situation when the server is down. Now requests are kept in the broker, waiting until the server is back online. This increases resilience.
What's the additional benefit? Instead of binding a request/user to one server, we can have a bunch of servers waiting for requests distributed by the broker. This scales the frontend and backend by removing the direct connection between them.
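A minimal sketch of this decoupling in Python with the pika client (the broker host, queue name, and message payload are made up for illustration; the producer and consumer would normally run in separate processes):

```python
import pika

# Connect to a RabbitMQ broker (assumed to be running on localhost:5672).
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# Declare a durable queue so queued requests survive a broker restart.
channel.queue_declare(queue="user_requests", durable=True)

# Producer side: the web tier publishes the request and returns to the user immediately.
channel.basic_publish(
    exchange="",
    routing_key="user_requests",
    body=b'{"user_id": 42, "action": "checkout"}',
    properties=pika.BasicProperties(delivery_mode=2),  # persist the message to disk
)

# Consumer side: any number of backend workers can pull from the same queue.
def handle_request(ch, method, properties, body):
    print("processing", body)
    ch.basic_ack(delivery_tag=method.delivery_tag)  # ack only after the work is done

channel.basic_consume(queue="user_requests", on_message_callback=handle_request)
channel.start_consuming()
```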
What is the difference between a load balancer, a reverse proxy and a message broker
In short: a load balancer spreads incoming requests across a pool of servers; a reverse proxy sits in front of servers and handles cross-cutting concerns such as routing, TLS termination and caching (many reverse proxies also load-balance); a message broker holds messages in queues and delivers them asynchronously, so clients and servers never talk to each other directly.
CDN (Content Delivery Network)
What is a CDN
The origin server of a web service may be too far from end users. To reduce latency and increase availability, we deploy copies/caches of the service at locations closer to users.
How to cache
People use a hierarchical structure: updates propagate down from the origin server, and content-availability checks can be split across the layers, so the origin server is not overwhelmed.
The caching strategy can be a mix of cache pull (an edge server fetches content from the origin on a cache miss) and cache push (the origin proactively pushes content out to edge servers).
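A toy sketch of the two strategies in Python, assuming a hypothetical origin URL and an in-memory edge cache (a real CDN would add eviction, size limits, and HTTP cache-header handling):

```python
import time
import urllib.request

ORIGIN = "https://origin.example.com"   # hypothetical origin server
cache = {}                              # path -> (body, fetched_at)
TTL = 300                               # seconds to serve from cache before re-pulling

def get(path):
    """Cache pull: serve from the edge cache, fetching from the origin only on a miss or expiry."""
    entry = cache.get(path)
    if entry and time.time() - entry[1] < TTL:
        return entry[0]                                    # cache hit: no trip to the origin
    with urllib.request.urlopen(ORIGIN + path) as resp:    # cache miss: pull from the origin
        body = resp.read()
    cache[path] = (body, time.time())
    return body

def push(path, body):
    """Cache push: the origin proactively places (or refreshes) content at the edge."""
    cache[path] = (body, time.time())
```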
Edge server selection
There are two major approaches to routing a request to an edge server: anycast (all edge servers share the same IP, and the anycast cluster steers each request to a server so as to balance load and latency) and DNS-based routing (each edge server has its own IP, and DNS returns the IP of a suitable server for each client).
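A toy illustration of the DNS-based approach in Python; the edge regions, IP addresses, and the client_region lookup are invented for the example (real geo-DNS infers location from the resolver's or client's IP):

```python
# Hypothetical edge locations. The authoritative DNS server returns a different
# A record for the same hostname depending on where the query comes from.
EDGE_SERVERS = {
    "us-east": "203.0.113.10",
    "eu-west": "203.0.113.20",
    "ap-south": "203.0.113.30",
}

def resolve(hostname, client_region):
    """DNS-based routing: same hostname, different IP per client region."""
    return EDGE_SERVERS.get(client_region, EDGE_SERVERS["us-east"])

print(resolve("cdn.example.com", "eu-west"))   # -> 203.0.113.20
```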
Kafka
Use case
Real-time event streaming, log aggregation from thousands of servers.
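A minimal sketch of the log-aggregation case with the kafka-python client, assuming a broker on localhost:9092 and a "logs" topic (both invented for the example): each application server publishes its log lines, and an aggregator consumer group reads them back.

```python
from kafka import KafkaProducer, KafkaConsumer  # kafka-python client

# Each application server produces its log lines to a shared "logs" topic.
producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("logs", key=b"web-42", value=b"GET /checkout 200 35ms")
producer.flush()

# A downstream aggregator consumes the same topic; multiple consumers in one
# group split the topic's partitions between them, so ingestion scales horizontally.
consumer = KafkaConsumer(
    "logs",
    bootstrap_servers="localhost:9092",
    group_id="log-aggregator",
    auto_offset_reset="earliest",
)
for record in consumer:
    print(record.key, record.value)
```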
