Money-saving optimization

60%+ because of software?

Remarkably, swapping out old software alone can make a one-half to three-quarters difference. Server-side software such as Apache, Nginx, Node.js, or OpenLiteSpeed each has different performance characteristics. A large part of the internet is still powered by Apache web servers (the classic, old-school choice). Apache, in its common default configuration, lacks non-blocking, event-driven I/O and static-file (HTML) caching of PHP pages, and the programming language a server is written in also makes a huge difference. Plus, Apache, Nginx, and OpenLiteSpeed don't provide the same level of security. Let me do you a favor and walk through the options.

Uninvited bots

Is 61% of internet traffic (and the resources it consumes) eaten by bad bots harvesting e-mail addresses and the like? According to the BBC, bots "made 61% in 2013" (Kelion, 2013, p. 1). At the moment, researchers quote anything between 20% and 65%, and whether we speak about internet usage in general or about particular pages changes the picture. In practice, it's easy to see for yourself in the log files (the path in Apache: /var/log//access_log). They store records of IP addresses and user-agent names, since each computer reports the name under which it comes. For example, Google usually comes as Googlebot/2.1 (+http://www.googlebot.com/bot.html), and even the bad bots usually give correct names.

[image of a log file]

It's practical to ban bad bots by user-agent name or to serve them customized fake pages. Of course, there can be millions of IP addresses behind different bots and botnets; however, if I block only a few IP ranges of the biggest VPS hostings such as DigitalOcean, OVH, Aruba, etc., perhaps 99% of all bots will be blocked.
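As a sketch of that log-based filtering, here is a small Python example that parses an Apache combined-format log line and flags requests coming from blocked IP ranges or known bot user agents. The IP ranges and agent names below are illustrative placeholders, not the real networks of any hosting provider:

```python
import re
import ipaddress

# Hypothetical blocklist: documentation-reserved ranges standing in for
# VPS-host networks (the real ranges must be looked up per provider).
BLOCKED_RANGES = [ipaddress.ip_network("203.0.113.0/24"),
                  ipaddress.ip_network("198.51.100.0/24")]

# Substrings of user-agent names used by unwanted crawlers
# (illustrative, not exhaustive).
BAD_AGENT_HINTS = ["AhrefsBot", "SemrushBot", "MJ12bot"]

# Regex for Apache's "combined" log format: IP first, user agent last.
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[.*?\] ".*?" \d+ \S+ ".*?" "(.*?)"$')

def should_block(log_line):
    """Return True if the request comes from a blocked range or a bad bot."""
    m = LOG_RE.match(log_line)
    if not m:
        return False
    ip, agent = m.group(1), m.group(2)
    if any(ipaddress.ip_address(ip) in net for net in BLOCKED_RANGES):
        return True
    return any(hint in agent for hint in BAD_AGENT_HINTS)

line = ('203.0.113.7 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" '
        '200 2326 "-" "Mozilla/5.0 (compatible; MJ12bot/v1.4.8)"')
print(should_block(line))  # the IP falls inside a blocked range
```

In production the same effect is usually achieved in the web server or firewall itself rather than in a script, but the logic is the same: match on the source address range or the reported user-agent name.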

Web hosting provider

Is the price reasonable for the performance of your machine?

Asynchronous load (Ajax/AMP)

For example, imagine that 60% of people don't scroll down, yet the images below the visible area are loaded at the same time as the content at the top. Images and videos usually account for the biggest file sizes across the vast majority of the internet and generate a lot of requests.
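One low-effort way to defer those loads in modern browsers is the native `loading="lazy"` attribute on images: the browser only fetches the file when the visitor scrolls near it. A minimal sketch (the file name is a placeholder):

```html
<!-- Fetched only when scrolled near the viewport; URL is a placeholder -->
<img src="/images/below-the-fold.jpg" loading="lazy" alt="Example photo">
```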

Image size and compression

File size optimization: image file sizes can be decreased, and mobile devices can be served dimensionally smaller images than desktops.
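Serving dimensionally smaller files to small screens can be done with the standard `srcset`/`sizes` image attributes; a minimal sketch with placeholder file names:

```html
<!-- The browser picks the smallest adequate file; names are placeholders -->
<img src="photo-1200.jpg"
     srcset="photo-480.jpg 480w, photo-1200.jpg 1200w"
     sizes="(max-width: 600px) 480px, 1200px"
     alt="Example photo">
```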

PHP

PHP version upgrade: PHP 7 is roughly twice as fast as PHP 5; in other words, PHP 5 eats twice the performance for the same work.

AMP cache

New Google AMP cache: Google created a free AMP CDN (a reverse-proxy cache) in an effort to speed up web page loads on mobile devices. How often the cache refreshes depends on the expiry time. In effect, the cache can serve a huge part of the content for free. Bing, Twitter, and many others have also adopted AMP. Facebook and Apple likewise cache files (in a different way) and serve them from their own servers.

CDN (content delivery network)

Do you serve from one, two, three... subdomains, or VPSes/servers? Round-robin DNS enables us to serve the same content from different (our own) servers according to the nearest geolocation. So if you really want to use a CDN, you don't have to use it in all parts of the world: you can use your own server without a proxy nearby, a CDN in distant regions, or serve only particular files via subdomains. We can also combine more than one CDN provider. This can be made a reality with a subdomain (that's free and easy) or with your own reverse-proxy VPS/server (again, it can be a cheap thing).
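Round-robin DNS itself is just several A records for one name, as in this BIND-style zone sketch (addresses are placeholders); note that plain round-robin only rotates the answers, so geolocation-aware routing needs a GeoDNS service on top:

```
; Clients receive the records in rotating order, spreading the load
www   IN  A  203.0.113.10   ; server in Europe (placeholder address)
www   IN  A  198.51.100.20  ; server in the US (placeholder address)
```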

Unused periods

Which time of day is busiest for your visitors? For example, if your audience triples between 11 am and 3 pm, then the web server is effectively underused for 20 hours out of 24; 80% of the time you are paying for a machine 3x stronger than needed. In this case, it could be economical to combine a normal server with a load balancer, app engine, or whatever else charges per hour of work rather than per whole day, deployed only in the particular busy periods of the day to help the cheaper main machine.
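A back-of-the-envelope sketch of that calculation, with made-up hourly prices:

```python
# Hypothetical prices: a small always-on server vs. a 3x machine,
# vs. the small server plus hourly burst capacity for 4 busy hours a day.
SMALL_PER_HOUR = 0.02   # made-up price of the small machine
BIG_PER_HOUR = 0.06     # 3x machine at 3x the price (assumption)
BURST_PER_HOUR = 0.04   # extra hourly capacity rented only when busy

always_big = BIG_PER_HOUR * 24                       # 3x machine all day
small_plus_burst = SMALL_PER_HOUR * 24 + BURST_PER_HOUR * 4

print(f"3x machine all day:  ${always_big:.2f}")
print(f"small + 4h of burst: ${small_plus_burst:.2f}")
```

With these (invented) numbers the combined setup costs less than half of the always-on big machine; the real savings depend on your provider's per-hour pricing.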

A one-year trial

Are you a newcomer to the mainstream server providers? How about one year of free services, or hundreds of dollars of credit to spend on Google App Engine (GAE), Amazon Web Services (AWS), etc.?

Cache

We can cache PHP output and save it as static HTML files. Afterwards, serving it is faster (and also cheaper).
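The principle can be sketched in a few lines of Python (a stand-in for the PHP case; `render_page` and the cache directory are made up for illustration): render once, then serve the saved HTML file until it expires.

```python
import os
import tempfile
import time

CACHE_DIR = tempfile.mkdtemp()  # stand-in for the web root's cache folder

def render_page(slug):
    """Stand-in for an expensive dynamic (e.g. PHP) render."""
    return f"<html><body>Page: {slug}</body></html>"

def serve(slug, max_age=3600):
    """Serve the cached HTML file if fresh, otherwise render and save it."""
    path = os.path.join(CACHE_DIR, slug + ".html")
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < max_age:
        with open(path) as f:          # cheap: read a static file
            return f.read()
    html = render_page(slug)           # expensive: run the dynamic code
    with open(path, "w") as f:
        f.write(html)
    return html
```

In a real deployment this is usually handled by a caching plugin or by web-server rewrite rules that serve the cached file directly, so PHP never even starts for a cache hit.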

External JS file CDNs

Many hugely visited websites like Digg, Reddit, wordpress.org, etc. use JavaScript libraries (e.g. jQuery) and special fonts (e.g. Font Awesome). If they call them from an external source (often Google's free CDN), those files are already cached in visitors' local storage before they ever visit your website. Because the most visited websites use them, almost everybody (including your visitors) has these files cached already. If you reference exactly the same files, visitors' machines don't have to download anything; they only call the copies stored on their own computers. Thus there's no need to serve them from your own server. Unless your server or CDN is faster than Google's, there's no reason to spend money on bandwidth and server performance because of these files.
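For instance, loading jQuery from Google's Hosted Libraries looks like this (check the current version number before copying; 3.6.0 here is only an example):

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>
```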

Static files served as dynamic

Many web designers and some CMSes (e.g. WordPress) force site owners to serve images and other static files in a URL format called a "dynamic URL". This serves as a signal to CDN providers and, mainly, to web browsers that they shouldn't cache the files. Pragmatically, that prevention of caching is there so that web designers and WordPress users can see the changes they make (if a file is cached in their browsers, updates and changes don't load and render). In fact, it also wastes a lot of money and slows down the page load.

Decrease PHP template requests/calls

Client-side requests (too many files?)
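Conversely, caching of static files can be explicitly encouraged; for example, Apache's mod_expires module can set cache lifetimes per file type (a minimal sketch; tune the lifetimes to how often you actually update):

```
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
</IfModule>
```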