The company that runs the popular RPS-Game-as-a-Service is having trouble handling traffic and maintaining performance. The service is a single solution with six endpoints. They now seek your help to scale the solution to handle the ever-growing traffic.
In this lab you're going to diagnose their problem and investigate options around Azure web site scaling that could help solve the issues.
Note: For the purpose of the lab, the client doesn't use any caching.
- Add the NuGet feed: https://myget.org/f/treefort
### API
POST `api/games`
`{ playerName = "player", gameName = "testGame", move = "paper" }`
returns Accepted (202) with a Location header.
PUT `api/games/available/{gameId}`
`{ playerName = "player2", move = "rock" }`
returns Accepted (202) with a Location header.
GET `api/games/available/`
returns all available games (200).
GET `api/games/available/{id}`
returns a single available game (200/404).
GET `api/games/ended`
returns all ended games (200).
GET `api/games/ended/{id}`
returns a single ended game (200/404).
- Publish the solution to an Azure web site.
- Try out the site manager (Kudu) at yoursite.scm.azurewebsites.net
- Enable Trace logging in the app's Startup
- Find one or more ways to use tracing. (see resources)
- Add some metric to observe (see documentation).
- Follow the guide to configure an alert on your metric. Use the Dummy client to get the alert to trigger.
- Add NewRelic to your site, using site extensions or the documentation.
- Use the Dummy client to produce some metrics to explore in NewRelic.
- Add loader.io to your web site and follow the instructions to verify your site.
- Create a simple loader.io test. Check results and NewRelic metrics.
In this part you should determine a scaling strategy. Configure your site for scaling and explore the strategy through load tests. At the end of the lab session each team should present their findings and lessons learned.
We'll discuss this part in the lab introduction.
Remove all your sites and add-ons after the presentations.
- Streaming Diagnostics Trace Logging from the Azure Command Line (plus Glimpse!)
- Azure Website Logging - Tips and Tools
- Scaling a standard Azure website to 380k queries per minute of 163M records with loader.io
In the code, there are two utils for simulating time and CPU usage. Tweak these as you please to improve your test/simulation.
Time and CPU on all GETs, via an attribute:
`config.Filters.Add(new LoadAttribute(100, 20));`
Util for prime number calculation (CPU):
`CpuUtils.Slow(1500);`
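As a reference point, the two utils can be sketched in Python: a sleep for the simulated latency and a trial-division prime count for the CPU burn. This is an illustrative equivalent, not the lab code, and the exact meaning of the parameters to `LoadAttribute(100, 20)` and `CpuUtils.Slow(1500)` is an assumption to verify against the source.

```python
import time


def simulate_delay(ms):
    """Simulate I/O latency by sleeping for `ms` milliseconds."""
    time.sleep(ms / 1000.0)


def burn_cpu(limit):
    """Simulate CPU work by counting primes below `limit` via trial division."""
    count = 0
    for n in range(2, limit):
        # n is prime if no d in [2, sqrt(n)] divides it
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count
```

Keeping the delay and the CPU burn separate is useful during load testing: a sleep ties up a request without consuming CPU, while the prime loop saturates a core, so the two stress different scaling limits.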