
Sunday, March 21, 2010

How much traffic can I have?

We do not place a limit on the number of visitors. However, there is a physical limit to what our network can handle.
 
There is no way to determine in advance the maximum number of visitors your site can sustain. Every site is different, and the amount of traffic you can handle depends on the resources each visitor uses.
 
Once you are online, you can view your site statistics, such as the number of visitors and which browsers they use. Your cPanel includes a statistics program called AWStats.
 

What about huge and unexpected spikes in traffic?

Basically, if you spike the server too hard or for too long, you can be suspended for taking resources away from your fellow shared-server users. If you don't want this to happen, you need to move to a dedicated server.
 

If I need to handle more traffic, what should I do?

Shared, reseller, and master reseller accounts all handle the same amount of traffic. If you need to handle more, consider our Alpha Master Reseller (SemiVPS) plan.

Monday, March 8, 2010

DNS Propagation Explained


There's always plenty of tech-speak floating around among the developers in the office. I try to keep that kind of language to a minimum when working with clients, but sometimes it works its way in anyway. DNS and other domain-related processes can be puzzling, period, let alone to someone dying for their website to go live. So how do you explain DNS propagation?


I recently heard a co-worker give the best explanation of DNS propagation I've ever come across. It went something like this:

When a new phonebook comes out, we may already have the new one while yours has yet to be delivered. So when we dial the same number, we might have two different expectations for who picks up on the other end.
That's exactly it. Every time a domain name is pointed at a new host, it can easily take a day or more for every corner of the web to be updated with that new information. The internet is certainly a strange beast, but with a solid analogy like that, no one has to be left in the dark.
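If you want to watch the phonebook effect in action, one quick trick (a sketch, assuming a Unix-like machine with the dig tool installed, and using example.com as a stand-in for your domain) is to ask two different resolvers for the same record and compare the answers:

    # Ask your system's default resolver
    dig example.com A +short

    # Ask Google's public resolver directly
    dig example.com A +short @8.8.8.8

While propagation is underway, the two answers can differ: two households holding different editions of the phonebook.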


Sunday, March 7, 2010

Misconception about Server Specification!

“Oh! Damn! That server seems pretty cheap, giving me 12GB RAM and dual Intel Xeon 5430s, so let's go and buy it!” Judging a server specification by its CPU and RAM size alone is a big misconception in the current web hosting industry. I have been spending a lot of time on WebHostingTalk and some other hosting forums these days, and people ask the same question every day: which server is going to be right for me? Here I will go through why buyers' intuition keeps going wrong and how to judge a server specification properly.

Most server companies and datacenters are offering big specials with giant CPUs and piles of RAM these days. Most of these specials now cross 12GB of RAM by default with Xeon 5000 series processors; an older but not uncommon offer is the 8GB special with Intel quad-core Q series processors. People get obsessed with the prices datacenters put on these specials. Most are under $300 and some are far cheaper. People think they can crowd any of these servers with thousands of clients and keep all of them happy. But within a month of use, they realize the performance isn't right: the server gets overloaded from time to time. So are those systems faulty? Are the datacenters fooling you and selling low-end CPU/RAM? No; you are fooling yourself.

Let me give you an example. Suppose you have three pipes: one can transfer 10,000 cubic meters of oil per second, another 9,000, and the last 2,000. Now you connect these three pipes in series into one combined line. What is the transfer rate? Very simple, right? The maximum is the lowest rate in the series, which means 2,000 cubic meters per second.

The same theory applies to servers. There is something more you should think about beyond CPU and RAM: the speed of your secondary I/O, the hard drive. Most people want large hard drives at a cheap rate, but you have to understand that skimping on one of the pipes reduces the overall rate when all of them work in series. A simple 7200 RPM SATA drive cannot keep pace with 667 MHz ECC FB-DIMM DDR2 RAM or dual Intel Xeon 5000 series processors. You should consider upgrading your hard drives not just in space but also in speed, which we often forget to do because faster drives don't come with most of these specials by default and look like an extra charge for no benefit. A simple RAID 10 or RAID 5 array can improve read performance several times over the single drive you were planning on, and can tremendously improve the performance of the same CPU/RAM configuration. For a CPU and RAM configuration like the one above, I would go with 15,000 RPM SAS drives in RAID 10 to make sure I am getting the most out of the CPU and the RAM. It is definitely a costly solution, but if you are on a tight budget, at least go with a RAID level that includes striping, such as RAID 10 or RAID 5. Never forget that striping by itself gives you no redundancy, so a single drive failure can take out the whole array; always pair striping with some form of redundancy, which is exactly what RAID 10 and RAID 5 provide. You can read more about RAID here:

http://en.wikipedia.org/wiki/RAID

About SAS: http://en.wikipedia.org/wiki/Serial_attached_SCSI
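As a rough sanity check of the pipe analogy, you can measure a drive's sequential throughput yourself. This is only a sketch, assuming a Linux box with hdparm and GNU dd available; /dev/sda and the test file path are placeholders for your own disk and a directory on the filesystem under test:

    # Buffered sequential read speed of the first disk (run as root)
    hdparm -t /dev/sda

    # Rough sequential write test: 1GB of zeros, bypassing the page cache
    dd if=/dev/zero of=./ddtest bs=1M count=1024 oflag=direct
    rm ./ddtest

If the number you get is an order of magnitude below what the CPU and RAM can feed, you have found your 2,000 cubic meter pipe.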

We at CentrioHost use SAS drives with RAID 10 on most of our newest servers. Some, with quad-core processors and 4GB RAM, use SATA II with RAID 10.

It is also a good idea to check that your network card is capable of sustaining the transfer rate you are going to get from the configuration above. A 10Mbit card is a bad idea that will throttle your powerhouse; a server like this can easily push beyond 10Mbit when utilized at full rate. A few extra dollars will increase the port speed as well, so at least a 100Mbit port is the better choice for a very busy server with a high-end CPU, RAM, and hard drive configuration.
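Checking the negotiated port speed only takes a few seconds. A minimal sketch, assuming a Linux server with ethtool installed and eth0 as the interface name:

    # Show the negotiated link speed (run as root; look for the "Speed:" line)
    ethtool eth0 | grep Speed

If this reports 10Mb/s on a busy server, the network card is the slowest pipe in your series.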

So, while choosing a host, do not just look at the number of CPUs or the amount of RAM; look at how well they are managing their I/O too. Together, all of them make the perfect series of pipes. People are always asking me about CPU counts and RAM sizes, but a well-managed host will not choose an imbalanced server. At least CentrioHost doesn't. So feel secure and fast with CentrioHost ;)

Good luck in choosing the right server configuration next time. :)

Having Pages Crawled Sooner Using Twitter.


Since the introduction of real-time search, we've been successful at getting pages crawled by Google more quickly, nearly immediately, by posting them on Twitter. No cloaking. With sitemap and webpage submissions, it's normal for us to expect about 1-2 weeks from the time a page goes "live" before the new content appears in results, and longer before it actually ranks, even when it's appealing. Spammers obviously have the right idea for their niche: blast links as fast as possible. For them it doesn't matter; it isn't their own website that's being overexposed in a negative way. For those of us who take the time and energy to create a unique experience for users with fresh content and proper SEO, it's easy to get excited and overbearing like the marketers, but moderation and patience have been the key to getting exposure.

There are still performance differences between the pages, and we're still studying whether it's the content or the "submission" method. We're leaning toward the idea that Twitter is an excellent tool when used the proper way, because the content is useful and the gap between a post and the initial visitor click-through has been as short as 15 minutes.

What We Did: Using TweetDeck and a small group of niche usernames, we passed around a variety of original and retweeted comments, each containing ONE backlink, with a mind toward SEO, studying both the social and the business styles of submission:

A.) This one is fast and furious: as soon as a set of related pages was close enough to live, one original comment was followed by 15 RTs, including Facebook and LinkedIn, without any posts in between to break them up. There were six sets of three pages each, exposed at a rate of two URLs per day, one in the morning and one at night, then skipping one page so the bots could find it somewhat naturally on their own. Those results took less than 4 hours to be picked up and looked good at first, but traffic slacked off quickly, within a day; some of it returned slowly after about 2 weeks. The best was under 15 minutes from click to SERP. This approach covers more of the ground googlebot might be treading at that moment.

(A 3-day break, then back in the saddle.)

B.) The other extreme is posting one address once and then re-tweeting it randomly, between unrelated tweets, from each subsequent account over a day's time. The exposure from this was enough to cause a trickle, but it was steady, and on an individual basis the pages seemed to gain appeal daily. This variation was too random; robots can easily miss it if you've sown your seeds too far apart. We're trying to get picked up: attention is what's needed, just not too much. This wasn't enough.

Performing On It: Looking at this from a scheduling and SEO point of view, here's how it has been playing out. Keeping an 8-12 page spread between what's indexed and what's in the sitemap, we keep the initial traffic flowing using a more relaxed version of "A" above. When we tried to push the issue, two pages disappeared from Google for over a week, and we thought it was time for a reinclusion request on a site less than 2 months old. And if you've read this far: the site is already #3 at Bing for its EXACT key phrase, and yes, they check Twitter too.

KEEP IT SHORT:

The truth about the cloaking: using several URL shorteners, we noticed that the domain used got the credit for the page and was indexed, NOT the actual site but the redirection service's address. That's fine for spammers, but it isn't cool to give away your material or run the risk of a duplicate content issue, especially against yourself.


Saturday, March 6, 2010

Reducing CPU usage for WordPress users!

WordPress is one of the most in-demand content management systems around these days. Most users run WordPress for their blogs or websites; around 85% of the sites on our servers use WordPress, and most of those clients run multiple WordPress blogs for their business. WordPress has been found to use a pretty good amount of CPU and memory, and today's shared hosting environments are limited more by CPU and memory than by space and bandwidth. It is always a wise choice to spend a little time reducing the overall CPU usage. It makes the blog run faster, and hosting companies are happy to host sites that are nicer to their CPUs :) Here are some tips to reduce the CPU usage of a WordPress blog and improve site performance.

The first plugin I suggest every WordPress user install is “wp-super-cache”. You can download the plugin here:

http://wordpress.org/extend/plugins/wp-super-cache/

It is pretty easy to install, and documentation can always be found on the WordPress site:

http://wordpress.org/extend/plugins/wp-super-cache/installation/

wp-super-cache is the fastest caching plugin for WordPress blogs. It is always better to serve a page from the cache instead of running SELECT queries for every visitor to your blog. Enabling super cache can reduce CPU usage by around 60-75%. One thing you should make sure of is that you are not running multiple caching plugins at once. I have seen a couple of users who think that using multiple caching plugins will give better results, but mixing two caching algorithms is a bad idea for your blog and can end in a real mess.
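A quick way to verify the cache is really serving pages is to time a couple of requests from the command line. A rough sketch, assuming curl is installed and using yourblog.example as a stand-in for your real domain; wp-super-cache also normally appends an HTML comment near the bottom of the pages it serves, which the last command looks for:

    # Time two requests; the second should be noticeably faster once cached
    curl -s -o /dev/null -w "%{time_total}\n" http://yourblog.example/
    curl -s -o /dev/null -w "%{time_total}\n" http://yourblog.example/

    # Peek at the page footer for the plugin's cache comment
    curl -s http://yourblog.example/ | tail -n 3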

If you are running scheduled posts on your blog, it is probably a better idea to run wp-cron.php from a cron job. By default, WordPress calls wp-cron.php every time a visitor hits your blog, which is fairly wasteful. I am not sure why WordPress does this, but calling it once every 2 hours seems to be enough. You can set up cron jobs from cPanel. To run the job every two hours, set the timing to something like the following:

    0 */2 * * *

This runs at the very first minute of each even hour of the day. In the command section, use something like this:

    php -q /home/cpanelusername/public_html/wp-cron.php

Replace cpanelusername with your actual cPanel username. If you added the blog as an addon domain, wp-cron.php is probably not in public_html but in a subfolder, so you will need to change the path accordingly, to something like the following:

/home/cpanelusername/public_html/addondomain.com/wp-cron.php
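Putting the timing and the command together, the complete crontab entry looks like this (using the default public_html path from above; adjust it for addon domains as just described):

    # Run wp-cron.php every two hours instead of on every page view
    0 */2 * * * php -q /home/cpanelusername/public_html/wp-cron.php

Once the real cron job is in place, WordPress also supports switching off the per-visit trigger by adding define('DISABLE_WP_CRON', true); to wp-config.php.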

A well-written article about the high CPU usage of wp-cron.php can be found here for your reference:

http://trinity777.wordpress.com/2008/10/28/wordpress-26-the-issue-of-wp-cronphp/

Two more plugins that clients frequently use can cause excessive CPU usage: “All in One SEO Pack” and featured gallery plugins like NextGEN. If you have no option other than running a gallery, then you will probably have to stick with it, but I strongly suggest not using All in One SEO Pack; enabling the individual modules one by one is better than this all-in-one plugin. A very well-written article on WordPress SEO can be found here, and I suggest you read it before blindly installing All in One SEO Pack:

http://yoast.com/articles/wordpress-seo/


A good percentage of users run autoblogs, which are pretty popular these days with WordPress. Autoblogs tend to use a lot of CPU during their cron executions. There isn't much you can do about those occasional bursts of high CPU, but it is a good idea to set the cron jobs to odd timings. For example, setting the cron to run at 17 minutes past each hour can improve performance compared with running it at the very first minute of the hour, which is when most users schedule theirs; when lots of crons try to run at the same moment, it can cause load issues, so an odd offset is a decent idea for both parties. You should also find the best update interval for your autoblog: a reasonable gap of 2-4 hours is the better choice because it reduces the frequency of your cron job. But if you have no option other than running it every hour, then don't overthink it; run it every hour :)
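As a concrete example of the odd-timing advice, here is what such a crontab entry might look like, firing at 17 minutes past every third hour (the wp-cron.php path is the same placeholder as earlier; swap in your own update script and username):

    # Fire at :17, every 3 hours, avoiding the top-of-the-hour rush
    17 */3 * * * php -q /home/cpanelusername/public_html/wp-cron.php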

For any sort of additional help, please post a comment. We at CentrioHost provide absolutely free consultation on reducing high CPU usage. Moreover, we would be happy to install all the modules and reconfigure your blog to make sure it uses less CPU and loads faster. So never hesitate to contact us for help :)

Friday, March 5, 2010

Getting the web site started.


I'm going to start keeping an eye on what gets done to our websites and make notes about the SEO trick experiments we see results with and those we don't, as well as how they were done. Right now the site of focus is Bionic Domain's free people search, http://www.freepeoplesearch.us (not quite 6 million dollars, but enough). The very first trick mentioned on the site is backlinks: getting links back to your site is extremely important. I'll ramble about it periodically, maybe make a category for it eventually; there are enough ways to generate backlinks that they can't all come to mind at once.
Enough of that. I may have to make a few more news items just to fill the main page. Thanks, and good searching.

Wednesday, March 3, 2010

What is Cloud Computing?

Put in simple terms, cloud computing is a set of pooled computing resources delivered over the web. The cloud provides a hosting environment that doesn't limit an application to a specific set of resources. Depending on the platform, an application can scale dynamically and increase its share of resources on the fly.

Monday, March 1, 2010

Shared Hosting – CentrioHost

If you are looking for a shared hosting provider for your multiple websites, CentrioHost may be a suitable option for you. Established in 2009, the company has maintained a consistent standard of quality hosting service from its foundation until now; as a result, it is regarded as one of the best shared hosting providers. The main thing that sets the company apart from the others is the quality of its service, which it does not compromise on at any rate.

Moreover, they always give priority to consumer choice and try to serve each consumer's preferences. If any consumer is unsatisfied with their support, they can complain directly and personally to the company president at any time. That's why CentrioHost customers can feel more tension-free than with other providers: the company promises to ensure quality support.

Shared Hosting features:

    * Unlimited Disk Space
    * Unlimited Bandwidth
    * Unlimited Domains Allowed
    * Site Builder
    * 24×7 Support
    * Instant Backups
    * 99.9% Uptime Guarantee
    * 45 Day Money Back Guarantee
    * Free Instant Setup
    * Unlimited Addon/Parked Domains
    * Unlimited Sub Domains
    * Unlimited FTP Accounts
    * Unlimited MySQL Databases
    * Unlimited POP3 Accounts
    * Web Mail
    * AWStats (Real Time Updates)

You can get the whole package for $7.95 USD/month. If you want to know what their consumers think, look for CentrioHost reviews, where general users share their opinions; they will help you see the positive and negative sides of this company.
 
