Space City Weather's grand 2022 pre-season server upgrade

Howdy, folks—I'm Lee, and I do all of the server admin stuff for Space City Weather. I don't post very often—the last time was back in 2020—but the site has just gone through a pretty big architecture change, and I thought it was time for an update. If you're at all interested in the hardware and software that makes Space City Weather work, then this post is for you!

If that sounds lame and nerdy and you'd rather hear more about this June's debilitating heat wave, then fear not—Eric and Matt will be back tomorrow morning to tell you all about how much it sucks outside right now. (Spoiler alert: it sucks a whole lot.)

The old setup: physical hosting and complicated software

For the past several years, Space City Weather has been running on a physical dedicated server at Liquid Web's Michigan datacenter. We've used a web stack made up of three major components: HAProxy for SSL/TLS termination, Varnish for local caching, and Nginx (with php-fpm) for serving up WordPress, which is the actual application that generates the site's pages for you to read. (If you'd like a more detailed explanation of what these applications do and how they all fit together, this post from a couple of years ago has you covered.) Then, between you guys and the server sits a service called Cloudflare, which soaks up most of the load from visitors by serving up cached pages to folks.

It was a resilient and bulletproof setup, and it got us through two massive weather events (Hurricane Harvey in 2017 and Hurricane Laura in 2020) without a single hiccup. But here's the thing—Cloudflare is particularly excellent at its main job, which is absorbing network load. In fact, it's so good at it that during our major weather events, Cloudflare did practically all of the heavy lifting.

Screenshot from Space City Weather's Cloudflare dashboard during Hurricane Laura in 2020. Cached bandwidth, in dark blue, represents the traffic handled by Cloudflare. Uncached bandwidth, in light blue, is traffic handled directly by the SCW web server. Notice how there's almost no light blue.

With Cloudflare eating almost the entire load, our fancy server spent most of its time idling. On one hand, this was good, because it meant we had a tremendous amount of reserve capacity, and reserve capacity makes the careful sysadmin inside me very happy. On the other hand, extra reserve capacity with no plan to utilize it is just a fancy way of spending hosting dollars without realizing any return, and that's not great.

Plus, the hard truth is that the SCW web stack, bulletproof though it may be, was probably more complex than it needed to be for our specific use case. Having both an on-box cache (Varnish) and a CDN-type cache (Cloudflare) sometimes made troubleshooting problems a huge pain in the butt, since multiple cache layers means multiple things you need to make sure are properly bypassed before you start digging in on your issue.
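To give you a flavor of what that troubleshooting looks like, here's a rough sketch in Python (not our actual tooling, and the header names are simply the ones these layers conventionally set: Cloudflare's cf-cache-status, Varnish's x-varnish and age). The idea is to check which layer answered a request, then bust the caches with a throwaway query string so you can see what the origin itself is doing:

```python
import requests

URL = "https://spacecityweather.com/"

def probe(url: str) -> None:
    resp = requests.get(url, timeout=10)
    headers = {k.lower(): v for k, v in resp.headers.items()}
    # Which layer answered? Cloudflare reports hit/miss in cf-cache-status;
    # Varnish conventionally adds x-varnish, and Age counts seconds spent in a cache.
    print(url)
    print("  cf-cache-status:", headers.get("cf-cache-status", "(absent)"))
    print("  x-varnish:      ", headers.get("x-varnish", "(absent)"))
    print("  age:            ", headers.get("age", "(absent)"))

# Normal request: most likely answered from Cloudflare's edge cache.
probe(URL)

# Cache-busting query string: forces a miss at every layer so you can see
# how the origin stack itself behaves.
probe(URL + "?cachebust=20220606")
```

Multiply that little ritual by every layer and every setting you want to tweak, and you can see where the "pain in the butt" comes from.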

Between the cost and the complexity, it was time for a change. So we changed!

Leaping into the clouds, finally

As of Monday, June 6, SCW is hosted not on a physical box in Michigan, but on AWS. More specifically, we've migrated to an EC2 instance, which gives us our own cloud-based virtual server. (Don't worry if "cloud-based virtual server" sounds like geek buzzword mumbo-jumbo—you don't have to know or care about any of this in order to get the daily weather forecasts!)

Screenshot of the AWS EC2 console
The AWS EC2 console, showing the Space City Weather virtual server. It's listed as "SCW Web I (20.04)" because the virtual server runs Ubuntu 20.04.

Making the switch from physical to cloud-based virtual buys us a tremendous amount of flexibility, since if we ever need to, I can add more resources to the server by changing its settings rather than by having to call up Liquid Web and arrange for an outage window in which to do a hardware upgrade. More importantly, the virtual setup is considerably cheaper, cutting our yearly hosting bill by something like 80 percent. (For the curious and/or the technically minded, we're taking advantage of EC2 reserved instance pricing to pre-buy EC2 time at a substantial discount.)
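If you're wondering what "changing its settings" actually amounts to, here's a hedged sketch using boto3 (AWS's Python SDK); the instance ID and target instance type below are made up for illustration, not ours:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
INSTANCE_ID = "i-0123456789abcdef0"   # hypothetical instance ID

def resize_instance(instance_id: str, new_type: str) -> None:
    # The instance has to be stopped before its type can be changed.
    ec2.stop_instances(InstanceIds=[instance_id])
    ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])

    # Change the instance type, then bring the box back up.
    ec2.modify_instance_attribute(
        InstanceId=instance_id,
        InstanceType={"Value": new_type},
    )
    ec2.start_instances(InstanceIds=[instance_id])
    ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])

resize_instance(INSTANCE_ID, "t3.xlarge")
```

A few minutes of planned downtime instead of a datacenter visit and a hardware swap, in other words.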

On top of controlling costs, going virtual and cloud-based gives us a much better set of options for how we do server backups (out with rsnapshot, in with actual-for-real block-based EBS snapshots!). This should make it vastly easier for SCW to get back online from backups if anything ever does go wrong.
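For the curious, here's roughly what kicking off one of those snapshot backups looks like with boto3; the volume ID is a placeholder rather than our real one:

```python
import boto3
from datetime import datetime, timezone

ec2 = boto3.client("ec2", region_name="us-east-1")
VOLUME_ID = "vol-0123456789abcdef0"   # hypothetical EBS volume ID

# EBS snapshots are block-based and incremental: only blocks changed since
# the previous snapshot get stored, and any snapshot can be restored to a
# fresh volume if the worst happens.
snapshot = ec2.create_snapshot(
    VolumeId=VOLUME_ID,
    Description=f"SCW backup {datetime.now(timezone.utc):%Y-%m-%d}",
    TagSpecifications=[{
        "ResourceType": "snapshot",
        "Tags": [{"Key": "site", "Value": "spacecityweather"}],
    }],
)
print("Started snapshot:", snapshot["SnapshotId"])
```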

Screenshot of an SSH window
It's just not an SCW server unless it's named after a famous Cardassian. We've had Garak and we've had Dukat, so our new (virtual) box is named after David Warner's memorable "How many lights do you see?" interrogator, Gul Madred.

The one potential "gotcha" with this minimalist virtual approach is that I'm not taking advantage of the tools AWS provides to do true high-availability hosting—mainly because those tools are expensive and would obviate most or all of the savings we're currently realizing over physical hosting. The only plausible outage scenario we'd need to recover from would be an AWS availability zone outage—which is rare, but certainly happens from time to time. To guard against that possibility, I've got a second AWS instance in a second availability zone on cold standby. If there's a problem with the SCW server, I can spin up the cold standby box within minutes and we'll be good to go. (This is an oversimplified explanation, but if I sit here and describe our disaster recovery plan in detail, it'll put everyone to sleep!)
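For the technically minded, the core of that cold-standby dance looks something like the sketch below. It's deliberately oversimplified, it assumes traffic follows an Elastic IP (the real runbook has more steps than this), and both IDs are hypothetical:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
STANDBY_INSTANCE_ID = "i-0fedcba9876543210"   # hypothetical standby box in the second AZ
ALLOCATION_ID = "eipalloc-0123456789abcdef0"  # hypothetical Elastic IP allocation

def fail_over_to_standby() -> None:
    # Cold standby: the instance exists but sits stopped, so it costs
    # almost nothing until the day we actually need it.
    ec2.start_instances(InstanceIds=[STANDBY_INSTANCE_ID])
    ec2.get_waiter("instance_running").wait(InstanceIds=[STANDBY_INSTANCE_ID])

    # Re-point the public address at the standby so visitors follow it over.
    ec2.associate_address(
        InstanceId=STANDBY_INSTANCE_ID,
        AllocationId=ALLOCATION_ID,
        AllowReassociation=True,
    )

fail_over_to_standby()
```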

Simplifying the software stack

Along with the hosting switch, we've re-architected our web server's software stack with an eye toward simplifying things while keeping the site responsive and quick. To that end, we've jettisoned our old trio of HAProxy, Varnish, and Nginx and settled instead on an all-in-one web server application with built-in caching, called OpenLiteSpeed.

OpenLiteSpeed ("OLS" to its friends) is the libre version of LiteSpeed Web Server, an application that has been getting more and more attention as a super-quick and super-friendly alternative to traditional web servers like Apache and Nginx. It's purported to be faster than Nginx or Varnish in many performance regimes, and it seemed like an excellent single-app candidate to replace our complex multi-app stack. After testing it on my personal site, SCW took the plunge.

Screenshot of the OLS console
This is the OpenLiteSpeed web console.

There have been a few configuration growing pains (eagle-eyed visitors might have noticed a couple of small server hiccups over the past week or two as I've been tweaking settings), but so far the change is proving to be a hugely positive one. OLS has excellent integration with WordPress via a powerful plugin that exposes a ton of advanced configuration options, which in turn lets us tune the site so that it works exactly the way we want it to work.
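One nice side effect is that it's easy to check whether the cache is actually doing its job. Assuming the plugin's usual x-litespeed-cache response header, a quick sanity check looks something like this:

```python
import requests

URL = "https://spacecityweather.com/"

# Fetch the same page twice and confirm the second response comes out of
# the cache; expect something like "miss" followed by "hit".
first = requests.get(URL, timeout=10)
second = requests.get(URL, timeout=10)

print("first:  x-litespeed-cache =", first.headers.get("x-litespeed-cache", "(absent)"))
print("second: x-litespeed-cache =", second.headers.get("x-litespeed-cache", "(absent)"))
```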

Screenshot of the LiteSpeed Cache settings page
This is just one tab from the cache configuration menu in the OLS WordPress plugin's settings. There are a lot of knobs and buttons in here!

Looking toward the future

Eric and Matt and Maria put in a lot of time and effort to make sure the forecasting they bring you is as reliable and hype-free as they can make it. In that same spirit, the SCW backend team (which so far is me and app designer Hussain Abbasi, with Dwight Silverman acting as project manager) tries to make smart, responsible tech decisions so that Eric's and Matt's and Maria's words reach you as quickly and reliably as possible, come rain or shine or heatwave or hurricane.

I've been living here in Houston for every one of my 43 years on this Earth, and I've got the same visceral first-hand knowledge many of you have about what it's like to stare down a tropical cyclone in the Gulf. When a weather event happens, much of Houston turns to Space City Weather for answers, and that level of responsibility is both terrifying and humbling. It's something we all take very seriously, and so I'm hopeful that the changes we've made to the hosting setup will serve visitors well as the summer rolls on into the danger months of August and September.

So cheers, everyone! I wish us all a 2022 filled with nothing but calm winds, pleasant seas, and a total lack of hurricanes. And if Mother Nature does decide to fling one at us, well, Eric and Matt and Maria will talk us all through what to do. If I've done my job right, no one will have to think about the servers and applications humming along behind the scenes keeping the site operational—and that's exactly how I like things to be 🙂

https://spacecityweather.com/space-city-weathers-grand-2022-pre-season-server-upgrade/
