For a scenario with multiple Lambda Labs servers, what is the recommended way to put them behind a load balancer? My use case requires at least two Lambda Labs on-demand instances handling requests, and I would like a low-latency load balancer that distributes requests evenly across them.
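Not an official Lambda recommendation, but one common pattern is to run a small reverse proxy such as nginx on a separate node and list the instances in an `upstream` block. A minimal sketch, assuming two instances serving HTTP on port 8000 (the IPs and port here are placeholders, not real addresses):

```nginx
# Hypothetical upstream of two Lambda on-demand instances.
# Replace the placeholder (documentation-range) IPs with your instances' addresses.
upstream lambda_backends {
    least_conn;                 # route each request to the least-busy server
    server 203.0.113.10:8000;
    server 203.0.113.11:8000;
    keepalive 32;               # reuse upstream connections to reduce latency
}

server {
    listen 80;

    location / {
        proxy_pass http://lambda_backends;
        proxy_http_version 1.1;          # required for keepalive upstreams
        proxy_set_header Connection "";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

By default nginx round-robins across the `server` entries; `least_conn` is often a better fit for GPU inference, where request durations vary widely. You can also add `max_fails`/`fail_timeout` to a `server` line so an unhealthy instance is temporarily taken out of rotation.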