
SEO Tips That You Must Start Employing Today

Every individual owning a website wants to see their web page at the top of Google’s search results. To achieve this, everyone strives to keep Panda and Penguin happy by setting every on-page SEO element right, hoping that Google acknowledges the effort and rewards the site with a promotion in the rankings. Yet it often happens that, even after getting every on-page SEO element right, your site does not rank as desired.

This is where off-page SEO comes into play. Off-page SEO strategies play a major role in promoting your site and, in certain situations, prove to be even more vital than on-page strategies. You may be wondering what is so important about off-page SEO; a closer look will help you understand the strategy better.


Points to ponder on:-

Off-page SEO refers to activities you undertake outside the boundaries of your website that help your web page rank higher in Google’s search results. Below are 11 easy steps which, if followed, will not only make the bird and the bear happy but also raise your SERP rankings considerably.

1. Blogs:-      

Blogs are one of the greatest ways to promote your website today. Posting blogs on your website at regular intervals will engage Google more, as regular updates indicate that your site is under constant maintenance and activity. Since Google prefers active sites to dormant ones, this will give you a surge in SERP rankings. Moreover, regular blog posts give your visitors a reason to return to your site at regular intervals.

Blogs should preferably consist of unique content such as tutorials, question-and-answer forums, and trending video links to keep your visitors engaged. In addition, you should comment on other blogs in your genre and participate in question-and-answer forums, which give you a chance to post a link to your blog in the comment or answer section. If visitors find it relevant, your site traffic is sure to increase.

2. Social Bookmarking:-      

Penguin and Panda love popular bookmarking sites such as Reddit, StumbleUpon, etc. Posting your blog links on these websites can give you a ranking surge, as the content of these sites is updated regularly. If your blog has valid content related to the discussion on the site, people will find it useful to click on your link, giving you that rise in ranking.

Bookmarking also helps promote an author’s name to the world. If you have posted a link to your blog or website on Reddit and people there find it helpful and relevant to their needs, they are likely to share it further. This helps Google identify it as a genuine and relevant site, which in turn helps the site rank higher.

3. Acquire backlinks:-   


I am sure that, as a website owner, you would love to receive valid links to your site from trusted sources, and so does Google. Receiving backlinks from higher-ranked, authentic sites will put your website in Google’s good books. How does it help? Web crawlers see that site as consisting of useful and relevant information, and useful content is always appreciated and rewarded by Google.

But be careful, as Penguin does not like spam links. Suppose your website sells clothes but you receive a backlink from a blog post about cars. Penguin identifies such links as spam, which can result in your site being de-ranked. Therefore, be sure to vet your backlinks carefully.

4. Social Media Promotion:-      


You are surely aware of all the major social networking sites, such as Google+, Facebook, LinkedIn, Twitter, and Instagram. You should also be aware of how to use them to give your site’s SERP rankings a push. If not, read on.

Sharing your website or blog on these social networking sites offers a chance of free promotion. Since Facebook and Twitter are considered the biggest online platforms today, sharing your work there is sure to attract more viewers than anywhere else.

5. Forum Marketing:-

Forum marketing involves getting involved in communities related to your genre. You can participate in online forums discussing topics relevant to your website or blog. In return, you can post dofollow links to your website, with a chance of increasing online traffic. This also helps search engines find your site more easily.

With the help of forum marketing, you can make yourself known to everybody. Moreover, if your site has unique and valuable content, visitors are likely to share it on other platforms, giving it the exposure it needs.

6. Local Listing techniques:-      

Instead of targeting a global audience, local listing is an important technique you can apply if it suits your website’s niche. It also enables Google to find your site easily. A local listing is an online profile that contains your company name, phone number, location, and the service it provides. You can create local listings by submitting your site to Google Maps, Yahoo Local, Yellow Pages, and Google+ Local.

7. Guest Blogging:-      


If you can put in a little more labor for the good of your site, guest posting is a very effective method. All you need to do is write and post content on other websites or blogs related to your genre. When visitors see your website mentioned in several places on a trusted site, they will judge yours to be a reliable source of information, which in turn helps your website’s traffic.

So how is it done? As mentioned, writing content and publishing it on another website is only the first step. What follows is adding a link to your website, sharing the post on social media, and returning regularly to answer queries and comments. Guest posting helps you build relationships with your readers and is an effective way to get yourself known.

8. Submit to search engines:-      

This is considered an effective internet marketing method for increasing the rankings of a website or web page. You can submit your website directly to search engines such as Google, Yahoo, or Bing in one of two ways: either submit one page at a time using webmaster tools, or submit your entire website by submitting your site’s home page to as many search engines as possible.

9. Directory Submission:-      

Listing your site in several directories or databases under the relevant categories or subcategories is known as directory submission. Proper directory submission gives you exposure, provides reliable backlinks, and helps increase your blog’s overall earnings. You might also get paid-post opportunities.

10. Ask:-      

Simply asking for a link is often quite beneficial, yet many bloggers forget to do it. Suppose your blog has been mentioned in an article but without a link; you can simply ask the author to include one. You can also ask for a mention of your blog in return for a similar favor if both blogs are in the same niche. Both bloggers gain equally, and it helps in building contacts.

11. Link Baiting:-      

The process of getting your visitors to share your website’s link is known as link baiting. The primary criterion for successful link baiting is creating quality, unique content. You should be able to convince your readers that your site has a piece of information worth sharing.

In addition, do not forget to come up with engaging and attractive link text that will compel a reader to click through to your site. Keep in mind that one tactic is related to the other: without quality content, an attractive link has no value, and without a catchy link, visitors are less likely to click through to your site even if the content inside is good.


What are the benefits of using off-page SEO?

You can get the following advantages by using off-page SEO strategies the correct way:-

  • More traffic:
    As your page ranks higher, your website gets more visitors, followers, and social media shares. This is a never-ending process where the only requirement is to create good content and update your website regularly.
  • Online Branding:
    If your website manages to please Google through its off-page SEO strategies, you will be rewarded for your hard work with online branding opportunities from larger companies. In other words, large e-commerce sites will want to hire your page to advertise their products. This, in turn, not only increases your page value but also generates extra income.

Final words:

It should be clear by now that off-page SEO strategies are just as important as on-page ones. Surveys suggest that people spend 70% of their time on on-page SEO, with only the remaining 30% going to off-page SEO; experts recommend a more balanced approach to make your site more SEO friendly. Remember, Google loves well-optimized pages, and that is what everyone is striving for today.

 


How to Score 100/100 in Google PageSpeed Insights with WordPress

How Important is Google PageSpeed Insights?

Google PageSpeed Insights is a web performance tool created by Google to help you easily identify ways to make your site faster and more mobile-friendly by following its recommendations on web best practices. A very important thing to remember, though, is that you shouldn’t obsess over scoring 100/100. It might not even be possible in all scenarios, depending on how your WordPress site is set up. With many multipurpose themes and sites loading dozens of external scripts, you will have an almost impossible time achieving a perfect score, and that is perfectly OK.

We recommend looking at the speed of your site, more than the scores. Scores with tools like Pingdom, GTMetrix, and Google PageSpeed Insights can sometimes lead you astray.

Scoring 100/100 on Shared Hosting

We thought it would be fun to explore the new Twenty Seventeen theme in WordPress 4.7.4. This is the first default WordPress theme that is aimed at businesses instead of a typical blog, which is exciting! So today we are going to show you how to score that perfect 100/100 on both Desktop and Mobile. We have installed common tools and services that many WordPress sites use, such as Google Analytics, Akismet, Yoast SEO, etc.

While this is a small site, it is a good foundation to at least understand a little bit about how Google PageSpeed Insights works.


100/100 in Google PageSpeed Insights with Shared Host

For our first test site, we have WordPress 4.7 with the Twenty Seventeen theme running on a popular low-budget shared host (Apache). SSL is configured and the following plugins are installed.

  • Yoast SEO
  • Akismet

We also have Google Analytics running within the <body> of our header.php file. The only modification we have made is adding a featured image to the default dummy “Hello world!” blog post. We ran our test site through Google PageSpeed Insights and, out of the box, got a 69/100 desktop score and a 58/100 mobile score. So we definitely have some improvements to make. Let’s dig through each warning to see how we can fix it.

shared-hosting-google-pagespeed-insights

Enable Compression

We will start with desktop first as many of the fixes will also apply for mobile. The very first Google PageSpeed Insights recommendation that we need to fix is the Enable Compression warning.

google-pagespeed-insights-enable-compression

According to Google, to fix this we need to enable Gzip compression. Unfortunately, the shared host doesn’t have this enabled automatically on their servers, so we have to do it manually.

All modern browsers support and automatically negotiate Gzip compression for all HTTP requests. Enabling Gzip compression can reduce the size of the transferred response by up to 90%, which can significantly reduce the amount of time to download the resource, reduce data usage for the client, and improve the time to first render of your pages.

There are a couple ways you can go about doing this. The first and one of the easiest is by using a caching plugin that supports enabling Gzip. WP Rocket, for example, adds Gzip compression rules in your .htaccess file automatically using the mod_deflate module. W3 Total Cache also has a way to enable this for you under its performance section.

The second way to enable Gzip compression is by editing your .htaccess file. Most shared hosts use Apache, in which case you can simply add the code below to your .htaccess file. You can find your .htaccess file at the root of your WordPress site via FTP.

<IfModule mod_deflate.c>
# Compress HTML, CSS, JavaScript, Text, XML and fonts
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/vnd.ms-fontobject
AddOutputFilterByType DEFLATE application/x-font
AddOutputFilterByType DEFLATE application/x-font-opentype
AddOutputFilterByType DEFLATE application/x-font-otf
AddOutputFilterByType DEFLATE application/x-font-truetype
AddOutputFilterByType DEFLATE application/x-font-ttf
AddOutputFilterByType DEFLATE application/x-javascript
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE font/opentype
AddOutputFilterByType DEFLATE font/otf
AddOutputFilterByType DEFLATE font/ttf
AddOutputFilterByType DEFLATE image/svg+xml
AddOutputFilterByType DEFLATE image/x-icon
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/javascript
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/xml
# Remove browser bugs (only needed for really old browsers)
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
Header append Vary User-Agent
</IfModule>
Ensure that you add it below the current contents of your .htaccess file. Example below:
gzip

If you happen to be running on NGINX, simply add this to your nginx.conf file.

gzip on;
gzip_disable "MSIE [1-6]\.(?!.*SV1)";
gzip_vary on;
gzip_types text/plain text/css text/javascript application/javascript application/x-javascript;

A tool like Check Gzip Compression can show you how many bytes were saved by enabling Gzip compression. Here is an example of what we saved on our test site.
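If you want a rough local feel for how much Gzip saves, you can reproduce the effect on any repetitive text asset (the style.css below is a throwaway file created for the demonstration, not your theme’s real stylesheet):

```shell
# Build a repetitive CSS-like file, then gzip it and compare byte counts.
printf 'body { margin: 0; padding: 0; }\n%.0s' $(seq 1 200) > style.css
gzip -kf style.css            # -k keeps the original, -f overwrites any old .gz
wc -c style.css style.css.gz  # the .gz file should be a small fraction of the original
```

Text formats such as HTML, CSS, and JavaScript compress dramatically because they are highly repetitive; already-compressed binaries like JPEGs will not shrink much further.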

gzip

If we run our site through Google PageSpeed Insights again we can see that the Gzip compression warning is now gone and it has raised our desktop score from 69/100 to 80/100 and our mobile score from 58/100 to 67/100.

 google-pagespeed-after-gzip-compression

Optimize Images

The next Google PageSpeed Insights recommendation that we need to fix is the Optimize images warning. Our default “Hello world!” blog post has a featured image which is throwing up this error.

This is a very important and useful warning. According to HTTP Archive, as of November 2016, images made up on average 65% of a web page’s total weight. Optimizing your images can be one of the easiest ways to see performance improvements with your WordPress website.

There are a couple ways you can fix this. The first is to use an image optimization plugin. A plugin can actually go through and bulk optimize your entire WordPress media library and also automatically optimize them when you upload them. Below are a few popular image optimization plugins:

Those plugins will fix the issue, or you can compress images before uploading them with a tool like Adobe Photoshop, GIMP, or Affinity Photo. Below is the featured image that is throwing up the warning. We can compress it beforehand by both scaling it down and lowering the quality; it is best to keep your images as small as possible. This image was originally 2.32 MB; after downscaling and compression, it is now 99.38 kB. Remember, it is best to upload images to scale and not rely on CSS to resize them, as that slows down your site.

compress-image-with-affinity-photo

If we run our site through Google PageSpeed Insights again we can see that the Optimize images warning is now gone and it has raised our desktop score from 80/100 to 88/100 and our mobile score from 67/100 to 73/100. We are making progress!

google-pagespeed-insights-after-image-compression

Eliminate Render-blocking JavaScript and CSS in Above-the-fold Content

The next Google PageSpeed Insights recommendation that we need to fix is the Eliminate render-blocking JavaScript and CSS in above-the-fold content warning.

eliminate-render-blocking-javascript-and-css

When a browser loads a web page, JavaScript and CSS resources usually prevent the page from being displayed until they are downloaded and processed. Some resources do need to be downloaded and processed before anything is displayed. However, many CSS and JavaScript resources are conditional (that is, only applied in specific cases) or are simply not needed to render above-the-fold content. To produce the fastest possible experience for your users, you should try to eliminate any render-blocking resources that aren’t required to display above-the-fold content.

As far as Render-blocking Javascript, Google has three recommendations:

  • If you don’t have a lot of JavaScript, you can inline it to get rid of this warning. You can inline JavaScript with a plugin like Autoptimize. However, this is really only valid for very small sites; most WordPress sites have enough JavaScript that inlining could actually slow you down.
  • The second is to load your JavaScript asynchronously. The async attribute downloads the file during HTML parsing and pauses the parser to execute the script as soon as it has finished downloading.
  • The third is to defer your JavaScript. The defer attribute also downloads the file during HTML parsing, but only executes it after parsing has completed. Scripts with this attribute also execute in order of appearance on the page.

In our example, we are going to make our JavaScript load asynchronously using a free plugin called Async JavaScript. You can download it from the WordPress repository or search for it within your WordPress dashboard under “Plugins > Add New.” As of this writing, it has 9,000+ active installs with a 4.2 out of 5-star rating. Essentially, the plugin adds the async or defer attribute to all JavaScript loaded by the WordPress wp_enqueue_script function. The developer also has a premium version available which lets you choose which scripts to async or defer.

Async Example

<script src="file1.js" async></script>

Defer Example

<script src="file1.js" defer></script>

After installing simply go into the settings and enable Async JavaScript.

 async-javascript
For larger sites, the script exclusion feature, or the premium version of the plugin, can come in handy. We won’t need it in this example, but if you have a site with a lot of JavaScript, things will most likely break if you simply set everything to async or defer. In that case, you will need to troubleshoot which scripts you can safely modify.
async-exclusions

If you don’t want to use a plugin for this there are a few other alternatives. Such as adding the following code to your functions.php file.

/* Add the async attribute to all scripts enqueued by WordPress. */
function js_async_attr( $tag ) {
	// Insert async="async" before the src attribute of each script tag.
	return str_replace( ' src', ' async="async" src', $tag );
}
add_filter( 'script_loader_tag', 'js_async_attr', 10 );
We run our site through Google PageSpeed Insights again and, as you can see, the render-blocking JavaScript warning is now fixed; we are left with the Optimize CSS Delivery warning.
optimize-css-delivery

You can see that the first CSS we need to optimize is our Google Fonts stylesheet (fonts.googleapis.com). CSS is render-blocking by default, and that includes CSS for web fonts. To fix this, we are going to install the free Disable Google Fonts plugin. The plugin author, Milan Dinić, recently updated it to cover the new Twenty Seventeen Libre Franklin font. After installing the plugin, your Google Fonts will obviously break, so you will want to head over to Google Fonts and grab the embed code manually. We select the same font weights that are included by default in the Twenty Seventeen theme.

<link href="https://fonts.googleapis.com/css?family=Libre+Franklin:300,300i,400,400i,600,600i,800,800i" rel="stylesheet">
google-fonts-embed
Then you will need to add that to your footer.php file, right before the </body> tag. Note: doing it this way will result in FOUT, a “flash of unstyled text,” but it will also get rid of the render-blocking issue. You should decide for your own site whether FOUT is an acceptable user experience for your visitors. You can also use Google’s Web Font Loader.
wordpress-footer-google-font
We run our test site through Google PageSpeed Insights again and now under the Optimize CSS Delivery warning we are only left with one thing, and that is the style.css file.
optimize-css-delivery-query-strings

One of the easiest ways to fix this is to use a free WordPress plugin called Autoptimize.
autoptimize-plugin

This plugin is pretty lightweight, only 176 KB to be exact. As of this writing, it has over 200,000 active installs with a 4.7 out of 5-star rating. The plugin helps you with concatenation of your scripts, minification, expires headers, and the ability to move styles to your header and scripts to your footer. It is fully compatible with the Async JavaScript plugin we used earlier.

After installing the plugin, click into the settings and select “Optimize CSS Code.” Then click the advanced tab and also enable “Aggregate inline CSS” and “Inline All CSS.” Note that, depending on the theme you are working with, this method might not be recommended. For large sites, inlining all CSS can bloat every page, in which case it is actually better to simply ignore that particular Google PageSpeed Insights warning. And remember that with HTTP/2, concatenation can sometimes slow your site down.

optimize-css-code

We also recommend enabling the optimize HTML code option.

If we run our site through Google PageSpeed Insights again, we can see that the Eliminate Render-blocking JavaScript and CSS in Above-the-fold Content warning is now completely gone! It also fixed the Minify CSS warning, which was further down the list and which we hadn’t even gotten to yet. We have raised our desktop score from 88/100 to 92/100 and our mobile score from 73/100 to 89/100. We are almost there.

google-pagespeed-insights-after-JS-CSS-optimization

Leverage Browser Caching

The next Google PageSpeed Insights recommendation that we need to fix is the Leverage browser caching warning. We actually have an entire in-depth post on the leverage browser caching issue, as it pertains to WordPress.

pagespeed-insights-leverage-browser-caching

The most common reason the leverage browser caching warning is triggered is that your web server doesn’t have the appropriate headers in place. In the screenshot above, you can see that all of our internal scripts show an “expiration not specified” warning. When it comes to caching, there are two primary methods: Cache-Control headers and Expires headers. While the Cache-Control header turns on client-side caching and sets the max-age of a resource, the Expires header specifies a point in time after which the resource is no longer valid.

You don’t necessarily need to add both headers, as that is a little redundant. Cache-Control is newer and usually the recommended method; however, some web performance tools like GTmetrix still check for Expires headers. The examples below can be adapted: change the file types, expire times, etc. based on your needs. For this tutorial, we are simply going to add Expires headers in Apache on our shared host.

Adding Cache-Control Header in Nginx

You can add Cache-Control headers in Nginx by adding the following to your server block or location block.

location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
expires 2d;
add_header Cache-Control "public, no-transform";
}

Adding Expires Headers in Nginx

You can add Expires headers in Nginx by adding the following to your server block. In this example, you can see how to specify different expire times based on file types.

location ~* \.(jpg|jpeg|gif|png)$ {
expires 365d;
}
location ~* \.(pdf|css|html|js|swf)$ {
expires 2d;
}

Adding Cache-Control Headers in Apache

You can add Cache-Control headers in Apache by adding the following to your .htaccess file.

<filesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
Header set Cache-Control "max-age=604800, public"
</filesMatch>
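As a quick sanity check on the snippet above, max-age is expressed in seconds, and 604800 is exactly one week:

```shell
# 7 days expressed in seconds, matching max-age=604800 above
echo $(( 7 * 24 * 60 * 60 ))   # prints 604800
```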

Adding Expires Headers in Apache

You can add Expires headers in Apache by adding the following to your .htaccess file.

## EXPIRES HEADER CACHING ##
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpg "access 1 year"
ExpiresByType image/jpeg "access 1 year"
ExpiresByType image/gif "access 1 year"
ExpiresByType image/png "access 1 year"
ExpiresByType text/css "access 1 month"
ExpiresByType application/pdf "access 1 month"
ExpiresByType application/javascript "access 1 month"
ExpiresByType application/x-javascript "access 1 month"
ExpiresByType application/x-shockwave-flash "access 1 month"
ExpiresByType image/x-icon "access 1 year"
ExpiresDefault "access 2 days"
</IfModule>
## EXPIRES HEADER CACHING ##

Remember that we enabled Gzip compression earlier? Below is what our .htaccess file looks like after also adding the Expires headers. We simply place them below the compression block.

expire-headers-example-1

We run our test site through Google PageSpeed Insights again, and under the Leverage browser caching warning we are left with only one thing: the Google Analytics script. This is somewhat ironic, seeing as it is Google’s own script. The issue is that Google sets a low 2-hour cache time on the asset, as seen in the screenshot below, most likely so that if they modify something on their end, all users get the changes as fast as possible. There is a way around this, however: hosting the Google Analytics script locally. Please be aware, though, that this is not supported by Google.

leverage-browser-caching-google-analytics

There is a great free little plugin called Complete Analytics Optimization Suite, which allows you to host Google Analytics locally on your WordPress website.

complete-analytics-optimization-suite

You can download Complete Analytics Optimization Suite from the WordPress repository or by searching for it under “Add New” plugins in your WordPress dashboard. As of this writing, the plugin has 1,000+ active installs with a 5 out of 5-star rating. It allows you to host your Google Analytics JavaScript file (analytics.js) locally and keep it updated using wp_cron(). Other features include being able to easily anonymize the IP addresses of your visitors, set an adjusted bounce rate, and choose the placement of the script (header or footer).

Just install the plugin and enter your Google Analytics tracking ID. The plugin adds the necessary Google Analytics tracking code to your WordPress website, downloads and saves the analytics.js file to your server, and keeps it updated via a scheduled script in wp_cron(). We recommend also setting it to load in the footer. Note: this plugin won’t work with other Google Analytics WordPress plugins.

local-google-analytics

If we run our site through Google PageSpeed Insights again we can see that the Leverage browser caching warning is now completely gone! And we have raised our desktop score from 92/100 to 97/100 and our mobile score from 89/100 to 96/100. So close we can almost taste it.

pagespeed-insights-after-leverage-browser-caching

Reduce server response time

The next Google PageSpeed Insights recommendation that we need to fix is the Reduce server response time warning. The one and only reason this is happening is that we are on a slow, budget shared hosting plan. The server is not fast and Google knows it. To fix this, we need to implement some type of caching to speed things up. There are a lot of great caching plugins out there; in our example, we are going to use the free Cache Enabler plugin from the team over at KeyCDN.

As of this writing, Cache Enabler has 10,000+ active installs with a 4.6 out of 5-star rating. It is a lightweight caching plugin for WordPress that makes your website faster by generating static HTML files, with WebP support. There are no settings you have to enable; simply install it and you’re good to go. The plugin is fully compatible with the Async JavaScript and Autoptimize plugins used earlier. If you want even more speed, we do recommend also adding the advanced snippet to bypass PHP.

If we run our site through Google PageSpeed Insights again we can see that the Reduce server response time is now completely gone! And we have raised our desktop score from 97/100 to 99/100 and our mobile score from 96/100 to 99/100. We are about to cross the finish line.

pagespeed-insights-after-response-time-fix

Minify JavaScript

The last Google PageSpeed Insights recommendation that we need to fix is the Minify JavaScript warning.

pagespeed-insights-minify-javascript

To fix this we are actually going to go back into the Autoptimize plugin settings and simply enable the Optimize JavaScript Code option. Since you now have a caching plugin running, you might also need to clear your cache after doing this to see results.

autoptimize-optimize-javascript-code

And that’s it! We have now successfully taken the WordPress Twenty Seventeen theme from 69/100 to 100/100 on both mobile and desktop on a low-budget shared host.

100-100-pagespeed-insights-twenty-seventeen-theme

Here are the mobile scores. We didn’t have to do anything additional for mobile. Getting the desktop version to 100/100 automatically raised our mobile version and user experience scores to 100/100 as well.

100-100-pagespeed-insights-mobile


How to Edit & Optimize WordPress Robots.txt File for SEO

Have you optimized your WordPress robots.txt file for SEO?

If you haven’t, you are overlooking an essential part of SEO. The robots.txt file plays a critical role in your site’s SEO.

Fortunately, WordPress automatically creates a robots.txt file for you. Having this file is half the battle; you need to make sure the robots.txt file is optimized to get the full benefit.

The robots.txt file tells search engine bots which pages to crawl and which to avoid. In this post, I will show you how to edit and optimize the robots.txt file in WordPress.

What is Robots.txt File?

Robots.txt is a text file that instructs search engine bots how to crawl and index a website. Whenever a search engine bot visits your site, it reads the robots.txt file and follows the instructions. Using this file, you can tell bots which parts of your site to crawl and which to avoid. However, the absence of a robots.txt file will not stop search engine bots from crawling and indexing your site.

Editing & Understanding Robots.txt in WordPress

I’ve as of now said that each WordPress site has a default robots.txt document in root registry.

If you don’t have a robots.txt file, you’ll need to create one. It’s easy to do: just create a text file on your computer, save it as robots.txt, and upload it to your root directory. You can upload it via an FTP client or the cPanel File Manager.

Now let’s see how to edit your robots.txt file.

You can edit your robots.txt file using an FTP client or the cPanel File Manager, but this is time-consuming and somewhat difficult.

The easiest way to edit the robots.txt file is with a plugin. There are several WordPress robots.txt plugins out there; I prefer Yoast SEO, the best SEO plugin for WordPress. I’ve already shared how to set up Yoast SEO.

Yoast SEO lets you modify the robots.txt file from your WordPress admin area. If you’d rather not use Yoast, you can use another plugin such as WP Robots Txt.

Once you’ve installed and activated Yoast SEO plugin, go to WordPress Admin Panel > SEO > Tools.
Then click on “File editor”.

Then you need to click on “Create robots.txt file”.

Then you will get the Robots.txt file editor. You can configure your robots.txt file from here.

Before editing the file, you need to understand its directives. There are three main ones.

User-agent – Defines the name of the search engine bot, such as Googlebot or Bingbot. You can use an asterisk (*) to refer to all search engine bots.
Disallow – Instructs search engine bots not to crawl and index certain parts of your site.
Allow – Instructs search engine bots which parts of your site they may crawl and index.
Here’s a sample of Robots.txt file.

User-agent: *
Disallow: /wp-admin/
Allow: /

This robots.txt file instructs all search engine bots to crawl the site. The second line tells bots not to crawl the /wp-admin/ directory, and the third line tells them they may crawl and index the rest of the site.
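If you want to verify how a crawler interprets these rules, Python’s standard-library `urllib.robotparser` module implements the same matching logic. Here is a minimal sketch; example.com and the sample paths are placeholders:

```python
import urllib.robotparser

# The sample robots.txt from above, parsed in memory
rules = """User-agent: *
Disallow: /wp-admin/
Allow: /"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# /wp-admin/ is blocked for every bot; everything else is allowed
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/options.php"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/my-post/"))         # True
```

This is handy for checking a rule change before you upload it to your live site.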

Designing and Optimizing Robots.txt File for SEO

A simple misconfiguration in the robots.txt file can completely deindex your site from search engines. For example, if you use the directive “Disallow: /”, your site will be deindexed. So you need to be careful when configuring it.

Another essential thing is optimizing the robots.txt file for SEO. Before getting to the best practices, I’d like to warn you about some bad ones.

  • Don’t use the robots.txt file to hide low-quality content. The best practice is to use the noindex and nofollow meta tags, which you can add with the Yoast SEO plugin.
  • Don’t use the robots.txt file to stop search engines from indexing your categories, tags, archives, author pages, and so on. You can add nofollow and noindex meta tags to those pages with the Yoast SEO plugin.
  • Don’t use the robots.txt file to handle duplicate content. There are better ways, such as the canonical tag.

Now let’s see how you can make the robots.txt file SEO-friendly.

  1. First, determine which parts of your site you don’t want search engine bots to crawl. I prefer disallowing /wp-admin/, /wp-content/plugins/, /readme.html, and /trackback/.
  2. Adding “Allow: /” directives isn’t that important, since bots will crawl your site anyway, but you can use them for specific bots.
  3. Adding your sitemaps to the robots.txt file is also a good practice.

Here’s an example of an ideal Robots.txt file for WordPress.

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /trackback/
Disallow: /go/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Sitemap: https://test.com/post-sitemap.xml
Sitemap: https://test.com/page-sitemap.xml

You can check the RTB robots.txt file here: https://test.com/robots.txt
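You can sanity-check this fuller example the same way with Python’s standard-library `urllib.robotparser` (reading the sitemap list requires Python 3.8+). One caveat: Python applies rules in file order, while Googlebot uses longest-match precedence, so a more specific Allow line may behave differently in each. A minimal sketch, with test.com standing in for your own domain:

```python
import urllib.robotparser

rules = """User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /trackback/
Disallow: /go/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Sitemap: https://test.com/post-sitemap.xml
Sitemap: https://test.com/page-sitemap.xml"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://test.com/wp-content/plugins/slider.js"))  # False
print(rp.can_fetch("*", "https://test.com/wp-content/uploads/photo.png"))  # True
print(rp.site_maps())  # lists both sitemap URLs
```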

Testing Robots.txt File in Google Webmaster Tools

After updating your robots.txt file, you should test it to check whether any content is affected by the change.

You can use Google Search Console to check whether there are any errors or warnings for your robots.txt file. Just log in to Google Search Console and select the site. Then go to Crawl > robots.txt Tester and click the “Submit” button.

A box will pop up. Just click the “Submit” button.

Then reload the page and check whether the file has updated. It might take some time for the robots.txt file to update.

If it hasn’t updated yet, you can enter your Robots.txt file code into the box to check if there are any errors or warnings. It will show the errors and warnings there.

If you notice any errors or warnings in the robots.txt file, you have to fix it by editing the robots.txt file.
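Before pasting a new file into the tester, you can also catch obvious mistakes locally. Below is a small, hypothetical lint helper (the function name and directive list are my own, not part of any WordPress or Google tool) that flags unknown directives and the dangerous “Disallow: /” rule:

```python
# Directives commonly recognized in robots.txt files
KNOWN_DIRECTIVES = {"user-agent", "disallow", "allow", "sitemap", "crawl-delay", "host"}

def lint_robots(text):
    """Return (errors, warnings) found in a robots.txt body."""
    errors, warnings = [], []
    for i, raw in enumerate(text.splitlines(), 1):
        line = raw.split("#", 1)[0].strip()  # drop comments and surrounding whitespace
        if not line:
            continue
        if ":" not in line:
            errors.append(f"line {i}: missing ':' separator")
            continue
        field, value = (part.strip() for part in line.split(":", 1))
        if field.lower() not in KNOWN_DIRECTIVES:
            warnings.append(f"line {i}: unknown directive '{field}'")
        elif field.lower() == "disallow" and value == "/":
            warnings.append(f"line {i}: 'Disallow: /' blocks your entire site")
    return errors, warnings

errors, warnings = lint_robots("User-agent: *\nDisalow: /tmp/\nDisallow: /")
print(errors)    # []
print(warnings)  # flags the 'Disalow' typo and the site-wide Disallow
```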


What is Negative SEO?

As a blogger, everyone wants to succeed in their field and become popular. For that, your site needs to be popular too, so you have to protect it from hackers and any other threats.

A hacked site can seriously hurt your SEO efforts, and you can’t expect decent performance from one. Nowadays, negative SEO has become a fairly common problem. Hackers try different techniques to inject malicious code into sites, which drags down the infected site’s rankings. Many webmasters have been hit by these techniques recently, so don’t be the next victim: learn how to avoid negative SEO campaigns.

What is Negative SEO?

Negative SEO refers to the use of black-hat and unethical techniques to undermine a competitor’s rankings in search engines. Hackers can use various methods to infect your site and lower its SERP position:

  • Hacking your site
  • Building lots of spammy links to your site
  • Copying your content and distributing it across the Internet
  • Redirecting your site’s links to porn or other malicious sites
  • Associating your site with keywords like porn or Viagra
  • Running fake social profiles to ruin your site’s reputation online
  • Removing the best and most valuable backlinks of your site

Is Negative SEO a Real Threat?

Yes, without a doubt. Negative SEO is real, and many sites have had to deal with this problem. Preventing it is much easier than fixing it.

If you run a search on Fiverr for “negative SEO,” you will find more than 15,000 people willing to do the job for just $5.

Black-hat forums, too, are full of stories from people who have succeeded with this method.

How to Prevent Negative SEO Attacks

1. Set up Google Webmaster Tools Email Alerts

Google can send you email alerts when:

  • Your site is being attacked by malware
  • Your pages are not indexed
  • You have server connectivity issues
  • You get a manual penalty from Google

If you haven’t already, connect your website to Google Webmaster Tools.

Log in to your account and click “Webmaster Tools Preferences.”

webmastert-tools-preferences
Enable email notifications for all types of issues and click “Save.”

webmaster-tools-preferences
That is just the first step. Now let’s move on to the important one: monitoring your backlink profile.

2. Monitor Your Backlinks Profile

This is the most important step you can take to keep spammers from succeeding. Often, they will perform negative SEO on your site by building low-quality links or redirects, so it is vitally important to know when somebody is creating links or redirects to your site.

You can use tools like Ahrefs or Open Site Explorer to manually check, from time to time, whether someone is building links to your site, but the one I recommend is MonitorBacklinks.com. It’s one of the best and easiest tools, and it can send you email alerts when your site gains or loses important backlinks.

Instead of having to manually check your backlinks every morning, Monitor Backlinks sends everything you need to know to your inbox. Here’s how to use it:

Once you have created your account, it will ask you to add your domain and connect it to your Google Analytics account.

3. Secure Your Best Backlinks

Spammers will often try to get your best backlinks removed. They usually contact the owner of the linking site, pretending to be you, and ask the webmaster to remove your backlink.

To keep this from happening, you can do two things:

  1. When you communicate with webmasters, always use an email address from your own domain instead of Gmail or Yahoo. This way, you prove that you work for the site and that it isn’t someone else pretending to be you. Your email should look like this: yourname@yourdomain.com.
  2. Monitor your best backlinks. For this, you can use Monitor Backlinks again. Go to your list of backlinks and sort them by page rank or social activity.

report
Add tags to the backlinks you value the most so you can verify whether any of them get removed.
Select a backlink and click “edit.”
Add your tags so you can later filter and find these backlinks easily.

4. Secure Your Website from Malware and Hackers

Security is critical. The last thing you want is spam on your site without you even knowing about it. There are several things you can do to secure your site:

  • If you are using WordPress, install the Google Authenticator plugin and set up two-step verification. Each time you log into your WordPress site, you will be required to enter a code generated by Google Authenticator on your smartphone (available on iOS and Android).

login

  • Create a strong password with numbers and special characters.
  • Make backups of your files and database regularly.
  • If your site lets users upload files, talk to your hosting company and ask how you can install an antivirus to prevent malware.
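The two-step verification mentioned above relies on the TOTP algorithm (RFC 6238), the same scheme Google Authenticator implements. For the curious, here is a minimal sketch of how those one-time codes are derived, using only the Python standard library (the base32 secret below is the RFC test key, not a real credential):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, period=30):
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32.upper())
    counter = int((time.time() if for_time is None else for_time) // period)
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII key "12345678901234567890" at time 59s
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59, digits=8))  # 94287082
```

Because the code changes every 30 seconds, a stolen password alone is no longer enough to log in.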

5. Check for Duplicate Content

One of the most common techniques spammers use is content duplication. They copy your site’s content and post it wherever they can. If most of your content is duplicated, there’s a strong chance your site will be penalized and lose rankings.

You can check whether your site has duplicate pages on the web using Copyscape.com. Simply enter your site URL or the body of the article you want to verify, and it will show you whether your content is being published elsewhere without your permission.

copyscape
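Copyscape works on whole pages, but the basic idea behind duplicate detection can be illustrated with word shingles and Jaccard similarity. A toy sketch (the function names are mine, and real plagiarism checkers are far more sophisticated):

```python
import re

def shingles(text, n=5):
    """Break text into a set of lowercase word n-grams ('shingles')."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=5):
    """Jaccard similarity of two texts' shingle sets: 0.0 (distinct) to 1.0 (identical)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "the quick brown fox jumps over the lazy dog near the river bank"
copied = original  # a verbatim copy scores 1.0
print(similarity(original, copied))  # 1.0
```

A score near 1.0 between your article and another page is a strong sign your content has been scraped.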

6. Monitor Your Social Media Mentions

Sometimes spammers create fake accounts on social media using your company or site name. Try to remove these profiles by reporting them as spam before they start gaining followers.

To find out who is using your brand name, you can use tools like Mention.net.

Whenever somebody mentions your name on any social network or website, you will be notified and can decide whether you should take action.

Create an account and click “Create alert.” Name your alert and add the keywords you want to be alerted about. You can use multiple languages, too. Click “Next Step.”

What is Negative SEO

Select the sources you want Mention.net to look for, and add the domains you want to be ignored. Click “Create my alert,” and you will receive alerts every time your keyword (company name) is mentioned on social media, blogs, forums, and news.

7. Don’t Be a Victim of Your Own SEO Strategies

Make sure you are not hurting your own site’s rankings by using techniques that are unacceptable to Google. Here are some of the things you should not do:

  • Don’t link to penalized sites.
  • Don’t buy links from blog networks, and don’t buy links at all for SEO.
  • Don’t publish large numbers of low-quality guest posts.
  • Don’t build too many backlinks to your site using “money keywords.” At least 60% of your anchor texts should use your site name.
  • Don’t sell links on your site without using the “nofollow” attribute.
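The 60% anchor-text guideline above is easy to monitor once you export your backlink anchors from a tool like Monitor Backlinks. A small illustrative helper (the function name and sample data are hypothetical):

```python
def branded_share(anchor_texts, brand):
    """Fraction of anchor texts that contain the brand/site name."""
    if not anchor_texts:
        return 0.0
    branded = sum(1 for anchor in anchor_texts if brand.lower() in anchor.lower())
    return branded / len(anchor_texts)

# Toy export: 3 of these 5 anchors mention the (made-up) brand "example"
anchors = ["Example Blog", "best seo tips", "example blog review",
           "exampleblog.com", "click here"]
print(branded_share(anchors, "example"))  # 0.6 -> meets the 60% guideline
```

If the share drops well below 0.6, it may be worth checking whether someone is building keyword-stuffed links to your site.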

8. Don’t Make Enemies Online

There is no reason to create enemies. Don’t ever argue with clients because you never know who you are dealing with. There are three types of spammers and reasons why they spam:

  • For fun
  • For revenge
  • To outrank competition in search engines

I hope this blog post is useful to you and helps you succeed and protect your site from negative SEO attacks.