Guidelines for backlink removal

When it comes to ranking, the links a website earns play a major role. Backlinks can improve or hurt a website's rank. There are lots of online resources for backlink building, but unfortunately there are no proper guidelines for the backlink removal process, so I decided to prepare my own. I've split the guidelines into two sections: the first covers how to identify the backlinks that are really hurting you, and the second covers the best way to get rid of those backlinks.

How to find dark links (unnatural or toxic links)

We all want to improve our rank. However, the most important thing is that our effort towards improving it shouldn't hurt the current rank at all. So deciding which links must be removed is the major challenge. Factors such as PageRank, domain authority, relevancy, IP, link title, and traffic from the link can determine whether a link is a dark link.

    1. Page Rank

Basically, a site that has been penalized by Google has a PR of 0 or "N/A". However, this factor alone shouldn't be the basis for removal; please refer to the exceptions section.
PR checker –
A newly launched site will start with a PR of "0", so please refer to the relevancy section before deciding.

    2. Blacklisted IP or Domain

       Without any doubt, if the backlink site is on a blacklist, or the IP of its domain is blacklisted, we can straight away remove that backlink.
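The blacklist check itself can be automated with a DNSBL (DNS-based blocklist) query: the IP's octets are reversed and looked up inside the blocklist's zone, and any answer means "listed". The post doesn't name a specific blocklist, so the widely used `zen.spamhaus.org` below is my example, not the author's recommendation. A minimal Python sketch:

```python
import socket

def dnsbl_query_name(ip: str, blocklist: str = "zen.spamhaus.org") -> str:
    """Build the reversed-octet hostname used to query a DNS blocklist.
    e.g. 203.0.113.7 -> 7.113.0.203.zen.spamhaus.org"""
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + blocklist

def is_blacklisted(ip: str, blocklist: str = "zen.spamhaus.org") -> bool:
    """An IP is listed if the blocklist zone resolves the query name."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, blocklist))
        return True   # any A record returned means "listed"
    except socket.gaierror:
        return False  # NXDOMAIN means "not listed"
```

`is_blacklisted` needs network access, of course; many blocklists also rate-limit queries from shared resolvers, so check the provider's usage policy before running this in bulk.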


    3. Relevancy

       Relevancy is one of the most important factors in identifying dark links.
        • Industry: If you serve the IT industry and one of your backlinks comes from an article like "How to Make a Hot Dog", it should be removed immediately. However, attribution links with words like "Developed By", "Designed By", or "Powered By" are exceptions.
        • Link title or "ALT" text: Any website can link to your website with a "title" or "ALT" text attribute. The text inside the attribute should be relevant to the linked page. For example, this page shouldn't have a backlink with a non-relevant title or ALT text like "wordpress development". However, if you think the backlink is genuine, you can ask the webmaster to correct the ALT or title text.
        • Traffic: How much traffic you get from a particular backlink also matters, even if the site's PR is below yours. On the other hand, if the backlink has existed for quite a long time and isn't providing any traffic to your site, it can be removed to keep your backlink profile clean.
        • Domain Age and Number of Google-Indexed Pages: The older the domain, the better. The backlink can be removed if the site isn't indexed by Google at all.

      Domain Age Checker

      Google Index Checker – simply open Google and run a site: search for the domain. If you get zero results, the backlink can be removed immediately.

      Example: The Askan Technologies website has a backlink from this website. However, when I tried to find its indexed pages on Google with a site: search, I found zero results, so we can remove it without doubt.

        • Domain Authority: According to the official SEOmoz glossary, “Domain Authority represents SEOmoz’s best prediction about how a website will perform in search engine rankings” and it is calculated by “combining all of our other link metrics (linking root domains, number of total links, mozRank, mozTrust, etc.) into one single score.” In other words, domain authority is a measurement of a website’s backlinks.

      DA Checker

      You can check the DA of the backlink; if it is under 20, it can be removed.

Examples of low DA links that could be natural:

    • A link from a relevant, high quality website that is new and hasn’t yet built much authority.
    • A link from a small niche website that’s tightly targeted to a specific geographic area or topic that is relevant to your website.
    • A link from the personal website of an expert in your industry.
    • A link from a small local website in your industry.

How to remove the backlinks without affecting your current rank

After finding the bad backlinks, submitting the links straight away to the disavow tool is a bad idea. Google strongly suggests that one must take manual action before submitting to the disavow tool.

    • Create a Google Sheet with the headings below
      • Link From URL: URL where the link resides.
      • Link to URL: The page (URL) on your website the link points to.
      • Email contact: For the “Link From” website.
      • First Link Removal Request: Insert date of removal request.
      • Second Link Removal Request: Insert date of removal request (One week after first request).
      • Third Link Removal Request: Insert date of removal request (One week after second request).
      • Link Status: Live or removed.
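The tracking sheet can also be bootstrapped programmatically. Here is a minimal Python sketch using the stdlib `csv` module as a stand-in for a Google Sheet (which accepts CSV imports); the column names simply mirror the headings above, and the example row is made up:

```python
import csv

COLUMNS = [
    "Link From URL", "Link To URL", "Email Contact",
    "First Link Removal Request", "Second Link Removal Request",
    "Third Link Removal Request", "Link Status",
]

def write_tracking_sheet(path, rows):
    """Write the removal-tracking sheet.
    rows: list of dicts keyed by COLUMNS; missing keys are left blank."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS, restval="")
        writer.writeheader()
        writer.writerows(rows)
```

Each time a request is sent, you would fill in the corresponding date column and update "Link Status" as links go from live to removed.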


    • Start the link removal requests

Gather the email IDs of all the domain owners. You can use a WHOIS lookup to get the contact details of a domain owner.

Please refer to the email template below, or come up with your own.


  • Update all your request details in the Google Sheet, following the template described in point 1.

Don’t use your own domain to send these emails. Though it would look more authentic, a webmaster can mark your emails as spam, which in turn hurts your domain name even more.

Record the date of every link removal request sent. In some cases you will need to submit a web form instead of an email; remember to also record these form submission dates in the spreadsheet.


Record every link removal, and stop emailing a webmaster once their link is removed. After five days have passed, send a “second notice” to those who failed to respond the first time.

Once again, record every link removed and stop emailing those webmasters. After five more days, send a “final” notice to any holdouts.

The last step is to wait five more days to allow responses to the third round of emails. Any links still remaining after three removal requests will be submitted to the Disavow Links tool.
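The disavow file Google expects has a simple text format: one URL per line, `domain:example.com` to disavow an entire domain, and `#` for comment lines. A small Python helper to build it from the leftover links (the example domain and URL in the test are hypothetical):

```python
def build_disavow_file(urls, whole_domains=()):
    """Build the text file Google's Disavow Links tool expects:
    plain URLs one per line, 'domain:example.com' for whole domains,
    and '#' lines as comments."""
    lines = ["# Links still live after three removal requests"]
    lines += [f"domain:{d}" for d in sorted(set(whole_domains))]
    lines += sorted(set(urls))
    return "\n".join(lines) + "\n"
```

Disavowing at the domain level is usually safer for sites that link to you from many spammy pages, since it covers links you haven't discovered yet.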


Yes, this is a lot of work, but every minute is worth spending: if the process is not done properly, it can ruin your rankings. I’ve tried my best to lay out the essentials here. That said, this guide alone cannot drive the whole process; I believe a lot of common sense is needed when removing backlinks.

Zero Down the security issues in PHP by using PHP Frameworks

As per the Open Web Application Security Project (OWASP), there are ten top security issues. Unfortunately, PHP's openness allows hackers to exploit it.

Use PHP frameworks to defend against hackers

1) Injection – injecting code as part of an actual command or query in order to break into or take control of the system. This can be SQL injection, OS command injection, or LDAP injection. It tops the list of security flaws.
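Frameworks defend against injection mainly with prepared statements, which bind user input as data instead of splicing it into the query string. The idea is language-independent; here is a small illustration in Python with sqlite3 (PHP's PDO offers the same bound-parameter mechanism):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"   # a classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query,
# so the WHERE clause becomes: name = 'alice' OR '1'='1'  (always true).
vulnerable = conn.execute(
    "SELECT role FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: the bound parameter is treated as one literal string value.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # [('admin',)] -- the injected OR matched every row
print(safe)        # []           -- no user is literally named that
```

A framework's query builder or ORM generates the parameterized form for you, which is why using one closes this hole almost for free.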

Continue reading “Zero Down the security issues in PHP by using PHP Frameworks”

How to move your Linux WordPress site to a Windows server

[Image: WordPress website migration from Linux to Windows (IIS)]


When I was first asked to migrate a Linux-based WordPress site to a Windows server, I was really worried, because there were plenty of challenges in front of me:

  1. The site is up and live; I shouldn't take it down for long.
  2. WordPress, MySQL, and the whole PHP stack go well with Linux, but not with Windows.
  3. I need to move all the WordPress posts and pages without any blunder.
  4. I should maintain the existing URLs for all the pages.

Then I relaxed and took a long breath. OK. Before starting the process, my mind was telling me to hatch a plan. So here is the plan:


  1. Back up the entire WordPress folder
  2. Export the MySQL WordPress database into an SQL file

Continue reading “How to move your Linux wordpress site to Windows Server”

Optimize the website to load fast

Website load time is a very important factor that can even decide the revenue of a business. Statistics tell us that fast-loading websites have a quantifiable effect on revenue and user experience. Everyone hates slow-loading websites and applications. Though we have several server-side languages like PHP, JSP, etc., the browser understands only HTML, JavaScript, and CSS, nothing else. The browser is not as smart as we are 😉. It never cares about the programming language or the database we use. No matter what technologies we use, the web server just takes the server-side code and generates plain markup (HTML). So optimization can be done on both the server side and the client side to make a website load fast.


  1. Optimizing the server side (to reduce web server execution time)
  2. Optimizing the client side (HTML, JavaScript, CSS, images)

The recommended size of a web page is less than 1 MB.
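One concrete lever for staying under that budget is response compression: HTML is repetitive text, so gzip typically shrinks it dramatically before it ever crosses the wire. A quick Python illustration of the savings (the repetitive sample markup below is made up for the demo; in production the web server, e.g. Apache's mod_deflate, does this transparently):

```python
import gzip

# A repetitive HTML payload, standing in for a real page.
html = ("<div class='row'><p>Lorem ipsum dolor sit amet.</p></div>\n" * 500).encode()

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.0%} of original)")
```

The browser advertises support via the `Accept-Encoding: gzip` request header and decompresses transparently, so the user sees the same page with far fewer bytes transferred.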

How to optimize the website to load fast?

Continue reading “Optimize the website to load fast”

Let me load – a jQuery plugin to stop users interacting with a still-loading web page

According to Google, the average loading time of a website is 6–8 seconds. Just think what a user might do during those 6 seconds. Do you think the user will wait calmly for the website to load fully? Not at all. Though I'm a webmaster, I never keep my patience 😉. I tend to click links, drag things, or hover the mouse; I will definitely do something with a poor, partially loaded website. In turn, before loading completes, the page redirects me somewhere or reacts to my actions very weirdly.

This happens to us many times. When you use a web application, interacting with a partially loaded page can sometimes even crash the website. The funniest case is trying to fill in a partially loaded online form: most of the time, the values you enter vanish once the page finishes loading. This is really frustrating, and on a slow connection it can happen over and over.

I’m trying to give some universal solution.
Continue reading “Let me load – Jquery plugin to stop interacting user with the loading web page”

Automatic Website Backup From Linux Server PHP Script

Backup issue:

Manually backing up website content from the server to a local machine is tedious, so it should be fully automated. The system must be able to zip the website content and download it to the local machine twice every week, using a simple PHP and shell script.

Solution: Automatic Website Backup From Linux Server – PHP script

Automatic Website Backup From Linux Server – a PHP script using FTP functions, plus an sh script that zips the backup files on a schedule.

Two components were created to make the weekly backup:

1)      Server (CentOS) – a shell script which compresses the entire website's files and stores the archive at a specific path. The script runs twice every week; cron takes care of executing it at the scheduled times.

2)      Client or local machine – a PHP script which downloads the zip file via FTP commands. Using the Windows Task Scheduler, the PHP script runs periodically to download the files.
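For illustration, the server-side compression step could equally be sketched in Python instead of sh (the function name and paths here are my own, not from the original scripts; scheduling would still be handled by cron, e.g. `0 2 * * 1,4` for 2 a.m. on Mondays and Thursdays):

```python
import datetime
import pathlib
import shutil

def backup_site(site_dir: str, backup_dir: str) -> str:
    """Compress the whole web root into a dated zip archive.
    Returns the path of the archive created."""
    stamp = datetime.date.today().isoformat()
    target = pathlib.Path(backup_dir) / f"site-backup-{stamp}"
    # shutil.make_archive appends the .zip extension itself.
    return shutil.make_archive(str(target), "zip", root_dir=site_dir)
```

The client side would then fetch the newest `site-backup-*.zip` over FTP, exactly as the PHP script in the post does.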
Continue reading “Automatic Website Backup From Linux Server PHP Script”

SEO – Meta details from a list of URLs

This is a simple tool built for a friend of mine who is a great SEO tech. His work involves visiting various links and preparing an Excel sheet with each link's meta description (keywords) and title. So we planned to build a tool which can automatically grab the meta and title information for all the links provided in an Excel or text document.
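The core of such a tool fits in a few lines: Python's standard-library `html.parser` can pull the title and meta description out of a fetched page (fetching itself could be done with `urllib`; the class name below and the sample HTML in the test are my own illustration, not the actual tool's code):

```python
from html.parser import HTMLParser

class MetaGrabber(HTMLParser):
    """Collects the <title> text and the meta description of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def grab_meta(html: str):
    """Return (title, meta description) for one page's HTML."""
    parser = MetaGrabber()
    parser.feed(html)
    return parser.title.strip(), parser.description
```

Looping `grab_meta` over every URL in the input file and writing the pairs out as CSV gives exactly the spreadsheet the tool produces.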

Just 3 Steps
Continue reading “SEO – Meta details from list of URLS”