Server optimisation is essential for your websites: the web server is the backend control point for your online files and your domain email accounts.
If you already have a website on your own hosting package, we can act as your webmaster, performing server and database optimisation and keeping your Drupal or WordPress instance up to date with security patches.
For businesses that need managed hosting, we provide a complete service: domain name registration, hosting, CMS design and maintenance, content creation and, of course, SEO. Hosting with us saves us time and saves you money.
Our hosting and support packages originally came about in response to a client request, and we've never looked back.
We focus on the following areas of server optimisation:

- .htaccess configuration
- caching
- Gzip compression
- robots.txt crawler control
- database optimisation
The .htaccess file is a configuration file for your Apache web server; adding a few lines of simple code unlocks extra functionality.
Using simple code snippets you can speed up your site, deter hackers, improve your security, keep spammers away, and easily add redirects. Taking advantage of your .htaccess file can supercharge your ability to customise your site.
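As a flavour of what a few lines can do, here is a minimal sketch of common .htaccess directives. It assumes an Apache server with mod_rewrite enabled; the file names and paths are placeholders, so adapt them to your own site.

```apache
# Force HTTPS for all requests (assumes mod_rewrite is enabled)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]

# Easily add a permanent redirect for a moved page
Redirect 301 /old-page.html /new-page.html

# Deter prying eyes: block direct access to sensitive files
<FilesMatch "^\.(htaccess|htpasswd)$">
    Require all denied
</FilesMatch>
```

Each directive takes effect as soon as the file is saved, with no server restart needed, which is what makes .htaccess such a convenient control point.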
Caching is the process of storing a rendered copy of your web pages on the server and presenting that copy until your website is updated. This means a page doesn't have to be rebuilt for every visitor, and a cached page doesn't need to send database requests each time.
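A related, complementary technique is telling visitors' browsers to cache static assets locally. A minimal sketch using Apache's mod_expires (assuming the module is enabled; the lifetimes are illustrative) might look like:

```apache
# Browser caching via mod_expires (assumes the module is enabled)
<IfModule mod_expires.c>
    ExpiresActive On
    # Images change rarely, so cache them for longer
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png  "access plus 1 month"
    # Stylesheets and scripts get a shorter lifetime
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Returning visitors then load those assets from their own disk instead of your server at all.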
For Drupal sites, we can load the entire site and its related queries into memory. With no reads from or writes to the hard drive, this can increase site speed by up to 300%.
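One common way to achieve this is an in-memory cache backend. The sketch below assumes Drupal 8+ with the contrib Memcache module installed and a memcached daemon running locally; your host and port may differ.

```php
<?php
// settings.php — route Drupal's caches to an in-memory backend.
// Assumes the contrib Memcache module is installed and a memcached
// daemon is listening on localhost; adjust to your own setup.
$settings['memcache']['servers'] = ['127.0.0.1:11211' => 'default'];
$settings['cache']['default'] = 'cache.backend.memcache';
```

With this in place, cache reads and writes hit RAM rather than the database on disk.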
Gzip compression is an effective way of reducing the amount of data sent from the server to the web browser, shrinking response sizes and improving server response time. Gzip compresses the files before sending them to the browser; on the client side, the browser unzips the files and presents the contents. This method works best with text-based files such as HTML, CSS and JavaScript — images and other already-compressed formats gain little.
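On Apache, Gzip is typically enabled via mod_deflate. A minimal sketch, assuming the module is available:

```apache
# Gzip compression via mod_deflate (assumes the module is enabled)
<IfModule mod_deflate.c>
    # Compress text-based responses before they leave the server
    AddOutputFilterByType DEFLATE text/html text/css text/plain
    AddOutputFilterByType DEFLATE application/javascript application/json
    # Images are already compressed, so they are left out
</IfModule>
```

The browser advertises Gzip support in its request headers, so uncompressed delivery to older clients happens automatically.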
For controlling web crawler activity, robots.txt is a text file used to instruct web robots on how to crawl your website. For instance, you would not want the admin parts of your site crawled. The robots.txt file sets how web bots crawl and access content; these crawl instructions are specified by "disallowing" or "allowing" the behaviour of certain (or all) user agents.
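A short sketch of a robots.txt file, placed in the site root; the paths are illustrative and should match your own admin URLs:

```
# robots.txt — the paths below are placeholders
# Keep all crawlers out of the admin areas
User-agent: *
Disallow: /admin/
Disallow: /wp-admin/

# Grant one named crawler full access
User-agent: Googlebot
Disallow:
```

Note that robots.txt is advisory: well-behaved crawlers honour it, but it is not a security mechanism, so sensitive areas still need proper access control.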
Database optimisation in CMS
Database optimisation is an effective way to increase performance. With most CMSs, the website slows down as the database grows: plugins, accumulated content and, in particular, spam-filled comment tables all take a toll on server and website performance.
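As a concrete illustration, the SQL sketch below clears spam comments from a WordPress database, assuming the default `wp_` table prefix; always take a backup before running clean-up queries.

```sql
-- WordPress example, assuming the default wp_ table prefix.
-- Back up the database before running clean-up queries.

-- Remove comments already flagged as spam
DELETE FROM wp_comments WHERE comment_approved = 'spam';

-- Reclaim the freed space and defragment the table
OPTIMIZE TABLE wp_comments;
```

Pruning bloated tables like this keeps queries fast and the overall database footprint small.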