I updated a little more code on most of my websites today, hopefully making them another 3-4% faster. The number one optimization for 90% of websites doesn't involve complex fine-tuning of the database, but instead fixing bad code.
While I was optimizing MySQL, I noticed that the database writes were almost as high as the database reads. Wow! That's quite unusual for a web application that mostly serves pages to be displayed, so I investigated what was causing it - and it was some of that bad code I was talking about.
Perhaps it wasn't really bad per se; it was just sloppy and unnecessary. Speeding up the website involved taking a closer look at what was actually happening and fixing it.
Every action checked for spam.
We have a complex anti-spam process that checks some external databases every time you perform an action on the website. This needed a couple of optimizations, one of which I didn't notice immediately. The first was easy: after checking a user once, we save the result in our own database, and if they've been checked in the past 3 days, we give them the all-clear without hitting the external databases again. However, this still queried our database on every action request.
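Roughly, the caching works like the sketch below. Since I'm not showing the actual code or schema here, everything in it is a hypothetical stand-in: the spam_checks table, the identifier key, and the external_spam_lookup() helper are made-up names, and it's written in Python with SQLite purely for illustration.

    import sqlite3
    from datetime import datetime, timedelta

    RECHECK_WINDOW = timedelta(days=3)

    def external_spam_lookup(identifier):
        # Stand-in for the external anti-spam databases the site queries.
        return False  # False = not a known spammer

    def is_cleared(db, identifier):
        # Assumes a table: spam_checks(identifier TEXT PRIMARY KEY, checked_at TEXT)
        row = db.execute(
            "SELECT checked_at FROM spam_checks WHERE identifier = ?",
            (identifier,),
        ).fetchone()
        if row is not None:
            checked_at = datetime.fromisoformat(row[0])
            if datetime.utcnow() - checked_at < RECHECK_WINDOW:
                return True  # checked within 3 days: skip the external lookup
        if external_spam_lookup(identifier):
            return False  # flagged by the external databases
        # Record the clean result so actions in the next 3 days skip the lookup.
        db.execute(
            "INSERT OR REPLACE INTO spam_checks (identifier, checked_at) "
            "VALUES (?, ?)",
            (identifier, datetime.utcnow().isoformat()),
        )
        db.commit()
        return True

Notice the remaining cost: even on a cache hit, is_cleared() runs one local query per action - which is exactly what the cookie trick below removes.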
Now, after being checked once, the user is given an encrypted cookie clearing them as not-a-heavy-spammer for the rest of the day. This bypasses that particular spam check entirely until the cookie stops being valid at midnight.
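Here's one way that could look. One caveat: our cookie is encrypted, but the sketch below uses an HMAC signature instead, which gives the same can't-be-forged property with less machinery. The SECRET key, the cookie format, and both function names are illustrative assumptions, not our actual code.

    import hashlib
    import hmac
    from datetime import datetime, timedelta

    SECRET = b"server-side secret key"  # hypothetical; never sent to the client

    def _tomorrow_utc():
        # The clearance is only good for the current day.
        return (datetime.utcnow().date() + timedelta(days=1)).isoformat()

    def make_clearance_cookie(user_id):
        # Issued once the user passes the spam check.
        payload = f"{user_id}|{_tomorrow_utc()}"
        sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
        return f"{payload}|{sig}"

    def cookie_clears_user(cookie, user_id):
        # True only if the cookie is untampered and still valid today.
        try:
            uid, expires, sig = cookie.rsplit("|", 2)
        except ValueError:
            return False
        expected = hmac.new(SECRET, f"{uid}|{expires}".encode(),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(sig, expected):
            return False  # forged or corrupted
        # ISO dates compare correctly as strings; past midnight this fails.
        return uid == user_id and datetime.utcnow().date().isoformat() < expires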
Updating last-login time.
Every time a logged-in user did anything, their last-login time was updated. That seemed a bit excessive; we don't need to hit the database every time the user clicks something. Now the last-login time is updated at most once every 3 minutes.
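In sketch form it's just a freshness check before the write. The names here are hypothetical, and it assumes the last-login time is already sitting in the session, so the check itself costs no query at all:

    from datetime import datetime, timedelta

    UPDATE_INTERVAL = timedelta(minutes=3)

    def maybe_update_last_login(db, session):
        # session is a dict-like store loaded once per request.
        now = datetime.utcnow()
        last = session.get("last_login")
        if last is not None and now - last < UPDATE_INTERVAL:
            return  # updated within the last 3 minutes: skip the write
        session["last_login"] = now
        db.execute(
            "UPDATE users SET last_login = ? WHERE id = ?",
            (now.isoformat(), session["user_id"]),
        )
        db.commit()

For a user clicking around constantly, that turns dozens of writes a minute into one every three.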
Small tweaks like these, made while carefully examining every bit of code, can save a lot of overhead.