Stingray: Setting up your Application Delivery Controller

In the first two posts of this series we had guest blogs from Riverbed covering the what (is it) and the why (use it), so logically we'll now look at the how (to set it up). If you missed the earlier posts in this series, you can click here to review them before diving into this one.

How to set up load balancing using Stingray Traffic Manager

Guest post by Paul Wallace at Riverbed

In the two previous articles I covered some of the basic concepts, but now it's time to look at the first stage most people start with: simple load balancing. Stingray Traffic Manager makes setting up basic services easier with wizards, and for simple load balancing we use the Manage a new service wizard on the Home screen:

In the wizard, all that's needed is a couple of simple parameters: a name for the site or service, the protocol, and the port to listen on. In this image, it's listening for HTTP traffic on port 80:

Next, enter either the hostnames or the IP addresses of the web servers you want to send traffic to, review the settings, and once you're happy hit Finish. This creates a virtual server to accept the traffic and a pool to load-balance it across your servers.
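
If you'd rather script the setup than click through the wizard, the same pool-and-virtual-server configuration can also be driven through Stingray's REST-based control API. The sketch below is a minimal illustration only: the control port (9070), the API version in the URL path, and the property names are assumptions based on the published Stingray/SteelApp REST API documentation and may differ in your version, while the hostname, credentials, pool name and node addresses are all placeholders.

# Minimal sketch: create a pool and an HTTP virtual server via the
# Stingray REST control API. Endpoint paths, control port and property
# names are assumptions based on REST API v3.x documentation; hostnames,
# credentials and node addresses are placeholders for your own values.
import requests

STM = "https://stingray.example.com:9070/api/tm/3.0/config/active"
AUTH = ("admin", "password")  # placeholder admin credentials

# 1. Create a pool containing the back-end web servers.
pool = {
    "properties": {
        "basic": {
            "nodes_table": [
                {"node": "10.0.0.11:80", "state": "active"},
                {"node": "10.0.0.12:80", "state": "active"},
            ]
        }
    }
}
r = requests.put(STM + "/pools/Web_Pool", json=pool,
                 auth=AUTH, verify=False)  # admin UI often uses a self-signed cert
r.raise_for_status()

# 2. Create a virtual server listening for HTTP on port 80 and
#    sending traffic to the pool created above.
vserver = {
    "properties": {
        "basic": {
            "enabled": True,
            "port": 80,
            "protocol": "http",
            "pool": "Web_Pool",
        }
    }
}
r = requests.put(STM + "/virtual_servers/Web_Site", json=vserver,
                 auth=AUTH, verify=False)
r.raise_for_status()
print("Pool and virtual server created")

Either way, the end result is the same: a pool grouping your back-end nodes and a virtual server listening on port 80 that feeds traffic to them.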

Now traffic flows through the Traffic Manager and is load-balanced across the servers that host the website. Simple!
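
Under the hood, the virtual server accepts each incoming connection and the pool decides which back-end node should handle it. Conceptually, with a simple round-robin policy, the selection works something like the toy sketch below; this is only an illustration of the idea, not Stingray code, and the node addresses are placeholders.

# Toy illustration of round-robin node selection (not Stingray code).
# Each incoming request is handed to the next node in the pool in turn.
from itertools import cycle

pool_nodes = ["10.0.0.11:80", "10.0.0.12:80", "10.0.0.13:80"]  # placeholder back ends
next_node = cycle(pool_nodes)

def choose_node():
    """Return the back-end node that should handle the next request."""
    return next(next_node)

for request_id in range(6):
    print("request", request_id, "->", choose_node())
# request 0 -> 10.0.0.11:80, request 1 -> 10.0.0.12:80, and so on.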

For more information on Riverbed Stingray – Click Here

For more information on using Stingray as a Content Delivery Cloud (alternative to CDN) – Click Here



Post written by Paul Wallace, Riverbed