How to protect your website from unscrupulous competitors trying to hurt your Google ranking
When it comes to ranking on Google Maps, competitors play to win, and for some, playing to win involves negative SEO: underhanded, deceptive, and sometimes fraudulent efforts to make your website look bad to Google and other search engines. A successful attack lowers your ranking, which can hurt your local business by reducing your chances of placing on the front page of Google's search results. Although the consequences can be disastrous, protecting yourself is simple: it takes vigilance.
Things to Watch:
1. Your website's code
If hackers infiltrate your site, they can modify it in ways that only a search engine sees. What makes such hacking hard to detect is that visitors cannot see the changes without inspecting the HTML source, which, depending on your experience, can be either tedious or impossible. Consequently, you should have a site auditor such as Screaming Frog periodically check your site for hidden spam and code changes.
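As a rough illustration of what an auditor looks for, the sketch below scans a page's HTML for links tucked inside containers hidden with inline CSS, a common signature of injected spam. It is a crude heuristic, not a substitute for a real crawler, and the function name and regex are my own:

```python
import re

def find_hidden_links(html: str) -> list:
    """Return href targets of links inside display:none containers.

    Injected spam links are often hidden from human visitors with
    inline CSS while remaining visible to search-engine crawlers.
    This is a simple heuristic; a real audit tool parses the DOM.
    """
    # Capture the content of any tag styled with display:none,
    # up to the first closing tag (crude but illustrative).
    hidden_blocks = re.findall(
        r'<[^>]*style="[^"]*display:\s*none[^"]*"[^>]*>(.*?)</',
        html,
        flags=re.IGNORECASE | re.DOTALL,
    )
    links = []
    for block in hidden_blocks:
        links.extend(re.findall(r'href="([^"]+)"', block, flags=re.IGNORECASE))
    return links
```

Run against a saved copy of your homepage, a non-empty result is worth a manual look at the source.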
2. Customer reviews
Positive customer reviews are the hallmark of a successful business, and bad reviews are a clear sign that a business does not offer quality products or services. This common wisdom is not always accurate, however, because competitors can leave fake reviews on sites such as Yelp or Amazon to drag your rankings down.
Perhaps more than any other tactic, fake reviews can devastate your local business: they lower your organic rankings and can even get you delisted from the Google Local 3-Pack. Monitor the major review sites as well as your Google reviews, and contest any review that your records cannot verify as coming from an actual customer. If you are not working with an SEO agency that does this for you, a tool such as Local Viking or BrightLocal can monitor your reviews.
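Fake-review attacks often show up as a burst of one-star reviews in a short window. A minimal sketch of that check, assuming you can export reviews as (date, star-rating) pairs from whatever monitoring tool you use (the input format and threshold are illustrative):

```python
from collections import Counter

def suspicious_review_days(reviews, burst_threshold=3):
    """Flag dates with an unusual burst of 1-star reviews.

    `reviews` is a list of (date_string, star_rating) pairs, e.g.
    exported from a review monitoring tool (format assumed here).
    A day with `burst_threshold` or more 1-star reviews is flagged
    as a possible coordinated attack worth investigating.
    """
    one_star_per_day = Counter(date for date, stars in reviews if stars == 1)
    return sorted(day for day, n in one_star_per_day.items() if n >= burst_threshold)
```

Flagged days are a prompt to check those reviews against your customer records before filing disputes.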
3. Stats in your cPanel
In your web host's control panel, you will find a statistics page that shows your server load, which indicates how much traffic your server handles as people visit your site. If the load is disproportionately high, you may be the victim of bots crawling your site. This kind of attack slows your site down and can prevent search engines from crawling it.
To combat this, keep an eye on the CPU load and memory usage reported in your cPanel. You can also engage a web application firewall (WAF) to help keep fake traffic from reaching your site.
4. Search results
The common wisdom is to monitor your search rankings via auditing tools, the logic being that you will catch problems long before they hurt your business, and to check your robots.txt file if you see a drop. Simply put, this advice is reactive.
You need to be proactive. You should, of course, keep monitoring your search rankings for drops. But instead of waiting for one, keep a backup of your robots.txt file and regularly verify that the live file still allows search engines to crawl your site, restoring it from the backup if it has been tampered with. Once you have updated your robots.txt file, all you need to do is submit a request for Google to recrawl your site.
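The "still crawlable" check can be automated with the standard library's robots.txt parser. A minimal sketch, assuming you fetch the live file's text yourself (the function name and path list are illustrative):

```python
import urllib.robotparser

def blocked_paths(robots_txt, paths, agent="Googlebot"):
    """Given the text of a robots.txt file, return the paths the named
    crawler is NOT allowed to fetch.

    An empty result means the file still lets the search engine crawl
    those pages; anything else suggests the file has been altered.
    """
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [p for p in paths if not parser.can_fetch(agent, p)]
```

Running this daily against your homepage and key landing pages catches a sabotaged `Disallow: /` long before the rankings reflect it.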
5. Backlinks
You should also monitor the quality of your backlinks, which are among the most important ranking factors in the updated Google Local 3-Pack. If competitors submit your business to link farms and spam sites, your organic rankings and your ability to stay listed in the Local 3-Pack will suffer. Although search engines usually combat link farms on their own, you can report a site as spam if you find your local business associated with one.
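Backlink monitoring boils down to comparing your link profile against a blocklist of known spam domains. A minimal sketch, assuming you export backlink URLs from an SEO tool and maintain your own blocklist (both inputs are hypothetical):

```python
from urllib.parse import urlparse

def flag_spam_backlinks(backlink_urls, spam_domains):
    """Split a backlink report into (clean, suspicious) lists by
    checking each link's domain against a known spam-domain set.

    `backlink_urls` would come from a backlink-tool export and
    `spam_domains` from a blocklist you maintain; both inputs are
    assumptions for illustration.
    """
    clean, suspicious = [], []
    for url in backlink_urls:
        domain = urlparse(url).netloc.lower()
        (suspicious if domain in spam_domains else clean).append(url)
    return clean, suspicious
```

Links landing in the suspicious list are candidates for a spam report or for disavowal.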