Learn How To Start Basic Page Optimization: A Few More Good Things
We've been looking at on-page optimization: the things you can do to your pages to please the search engines. In this section of the course I want to explain a few other things you need to know about on-page optimization: a few simple ideas, such as "what we do" paragraphs, the robots.txt file, and HTML sitemaps, that you can add to your website very quickly.
What We Do Paragraphs:
As another quick tip for getting keywords spread around your website, create a "what we do" paragraph. The "what we do" paragraph is my name for a block of text placed at the bottom of a web page, generally all the web pages in a website, that contains many of the site's most important keywords. It used to be popular, a few years ago, to simply add lists of keywords to web pages, often quite large lists.
The idea was to get the important keywords appearing on multiple pages throughout the site. These lists are of little help these days; a search engine seeing a list like that is likely to treat it as just that, a list of keywords rather than anything helpful to the site's users, so it isn't going to give the words in the list much weight when ranking search results.
What I believe is a viable alternative is to create a paragraph that explains what the site does and place it at the bottom of every page in the site. You can mix things up a bit, of course; for example, e-commerce sites with multiple categories of products might use a different paragraph within each category area. The samples shown here are actually on the small side; you could go bigger than these, perhaps 50 to 100 words.
The process is this: first you produce a list of primary keywords. Not a giant list of hundreds, but a list of the dozen or two dozen keywords that are most commonly used for your products and services.
"The Rabbit Run is the country's premier manufacturer of rabbit hutches and rabbit cages. We've been helping bunny lovers with rabbit hutch plans, indoor rabbit hutches, and other rabbit supplies for 15 years..."
You then start weaving these keywords into a paragraph of text, an actual explanation of what you do or sell. You can take this paragraph and put it into the site footer. This has the effect of making every page match more combinations of your important keywords and keyword phrases.
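Here's a minimal sketch of how that paragraph might sit in a page template footer; the footer markup and class name are just illustrative, not tied to any particular platform.

<footer class="what-we-do">
  <p>The Rabbit Run is the country's premier manufacturer of rabbit hutches
  and rabbit cages. We've been helping bunny lovers with rabbit hutch plans,
  indoor rabbit hutches, and other rabbit supplies for 15 years.</p>
</footer>

Because the footer is part of the page template, the paragraph appears on every page automatically, which is exactly the effect we're after.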
What about the supposed duplicate-content issue? That's a subject I discuss elsewhere in the course, but I'll just say here that the duplicate-content problem is grossly exaggerated. A little duplicate content here and there doesn't do any harm, especially something like this, a footer paragraph. I'm not suggesting that you create a giant 500-word block and put it in every footer; a few sentences aren't going to do any harm.
Robots.txt and the Robots Meta Tag:
In this lesson we'll take a quick look at the robots.txt file and the robots meta tag. The former is a file that sits in the root directory of a website; the latter is a meta tag that can be placed into individual web pages. Both provide instructions to the computer programs that crawl through your website reading web pages, such as the search bots that download pages for the search indexes.
The major search engines, and most reputable smaller search engines, read these things and follow the instructions. Pretty much every site needs a robots.txt file; the meta tag may be useful in some cases, but probably not all that often.
Let's start with the robots.txt file. There are really two things we use it for: we can use it to block areas of the website that we don't want indexed, and we can also provide a pointer to our XML sitemaps. It's a text file, of course, and it's placed at the top of your site, in the root directory.
Here's an example, the CNN.com robots.txt file, which provides links to various sitemaps, blocks certain areas, and specifically allows access to a subdirectory of one of the blocked directories.
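A file with that general structure might look something like this sketch; the sitemap URLs and directory names are placeholders for illustration, not lines copied from CNN's actual file.

Sitemap: http://www.example.com/sitemap-news.xml
Sitemap: http://www.example.com/sitemap-video.xml

User-agent: *
Disallow: /ads/
Allow: /ads/public/

The Allow line is how you open up one subdirectory inside a directory that the Disallow line has otherwise blocked.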
User-agent: *
Disallow: /private
First, blocking areas of the site. If you have areas of your site you don't want indexed by the search engines, you can include an instruction like these two lines. First is the User-agent line, which in this case effectively says "this instruction applies to all robots," that is, all programs downloading pages from my site. Then the Disallow line names which parts of the site should not be downloaded, in this case an area of the site called /private.
User-agent: Googlebot
Disallow: /private
You can also provide instructions to particular bots, like this. In this case, of course, the search bot named Googlebot is being told not to download from the /private directory.
User-agent: *
Disallow: /*
You can find lists of robots online, should you need to specify instructions for particular bots. You'll also notice that if you screw something up in your robots.txt file you can really hurt your site: this instruction blocks all robots from all areas of your website, so your site will be dropped from the search indexes if you use it. So you want to be careful.
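If you do need different rules for different bots, the file simply stacks several blocks, one per User-agent. The bot names below are real crawlers, but the directory names are placeholders for illustration.

User-agent: Googlebot
Disallow: /drafts

User-agent: Bingbot
Disallow: /drafts
Disallow: /beta

User-agent: *
Disallow: /private

A bot follows the block that names it specifically; bots that aren't named fall back to the * block.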
Luckily, there are robots.txt testers, utilities that will test your robots.txt file and tell you what the file will do. In particular there's Google's test tool, which you can find in your Google Webmaster account (Google recently started calling it Search Console), a subject you'll learn about later in the course.
Remember:
- This is a Non-Secure System – A request only
- Don’t Block JavaScript and CSS
Another thing to consider: robots.txt merely provides requests to the bots; nothing forces them to comply. So if you have areas of your site that are really important to protect, robots.txt is not the way to do it. And one more blocking issue: these days the search engines, in particular Google, want to read your JavaScript and CSS files. So although many developers used to use robots.txt to block the directories holding those files, it's not a good idea to block the search engines from that information these days.
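As an illustration, a pattern like the following used to be common (the directory names are just examples) but is best avoided now, because it stops Google from seeing the files it needs to render your pages properly:

User-agent: *
Disallow: /js/
Disallow: /css/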
Sitemap: http://www.yourdomain.com/sitemap.xml
Now, the XML sitemap reference. I'll tell you more about XML sitemaps in later lessons, so I won't go into a detailed explanation here. I'll just tell you that your robots.txt file should contain this line, telling the search engines where they can find your XML sitemap. Most websites need an XML sitemap, so most websites need a robots.txt file, if only for this reference.
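Putting the pieces together, a minimal robots.txt for a typical small site might look something like this; the /private directory and the domain are just the placeholders used earlier in the lesson.

User-agent: *
Disallow: /private

Sitemap: http://www.yourdomain.com/sitemap.xml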
<HEAD>
<meta name="robots" content="index, nofollow">
</HEAD>
Right, time to look at the robots meta tag. The robots meta tag allows you to provide instructions related to a particular web page; it's a tag placed in the head area of the page. It may be useful in some cases, though probably not very often, so we'll just take a quick look. Above is an example; the instructions appear in the content attribute, of course.
Index = "Please index this page"
Follow = "Please follow links on this page"
Noindex = "Please don’t index this page"
Nofollow = "Please don’t follow links on this page"
There are a few different instructions you can use, but really only the last two ever get used: telling the search engines not to index a page, and not to follow the links on the page. The first two are the default settings; if a search engine finds a page with no robots meta tag, it will assume it's allowed to index the page and follow the links, so you don't need to provide those instructions.
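For example, if you had a page you didn't want appearing in the search results, say a thank-you page (a hypothetical case, not one from this lesson), the tag might look like this, keeping the links followable but the page out of the index:

<HEAD>
<meta name="robots" content="noindex, follow">
</HEAD>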
Most people really don't need to worry much about the robots.txt file and the meta tags beyond using the file to specify the sitemap location, but we'll come back to that issue in later lessons.
HTML Sitemap:
Later in this course we'll be discussing XML sitemaps, which provide an index of your site to the search engines so they can quickly find all the pages and content, such as videos and news articles, in the website.
In the past it was popular to provide an HTML sitemap to help search engines find their way through sites.
This probably isn't very important these days assuming you've done a good job of ensuring that your site has plenty of easily read links throughout the pages.
Still, it won't hurt and may provide a little assistance to the search engines, especially for a new site trying to get indexed quickly.
So if you have a web designer who wants to create an HTML sitemap, let him; go ahead and do it, but make sure you get good keywords into the links, of course.
To a great degree HTML sitemaps have disappeared, replaced by XML sitemaps, but they are still in use here and there; this one, for instance, is in the money and finance area of CNN's website. Sitemaps can also be helpful to actual site users, so many website owners have kept them as another navigational tool for visitors.
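If you do build one, a minimal sketch might look like the following; the page URLs and anchor text are invented examples based on the rabbit-hutch keywords used earlier, chosen to show keyword-rich links.

<h1>Site Map</h1>
<ul>
  <li><a href="/rabbit-hutches/">Rabbit Hutches</a></li>
  <li><a href="/indoor-rabbit-hutches/">Indoor Rabbit Hutches</a></li>
  <li><a href="/rabbit-hutch-plans/">Rabbit Hutch Plans</a></li>
  <li><a href="/rabbit-supplies/">Rabbit Supplies</a></li>
</ul>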