Even though search engines grow more intelligent every year, they have yet to reach human-level intelligence, where they can reliably decipher what a page is about, whether it's spam or useful content, and so on.
In order to understand the context of a web page, search engines have to focus on the technical aspects of that page. Search engine optimization is all about getting search engines to understand your page better and, consequently, to rank it higher.
If your website, or certain pages on it, has not been optimized, a search engine will rank better-optimized pages above yours and you'll lose a great deal of organic traffic.
This article will focus on various aspects of web design that are geared towards pleasing the search engine gods.
Indexable Content
Indexable content refers to content that can be interpreted by search engines.
If you want to perform better in the search results, your important content should be published in HTML format. Search engines index your content by sending crawlers through the HTML on your page. If they can't parse the content on your page because it isn't in HTML, they'll either devalue it or ignore it outright.
Search engine crawlers can't parse Flash files, images, Java applets, etc.
If your site demands richer visual styles and formatting, you can do the following:
- Add descriptive "alt" attributes to your images
- Supplement search boxes with crawlable, navigable links
- Supplement Java plug-ins and Flash content with text on the page
- Provide transcripts for audio and video content
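To illustrate the first and last points, here is a minimal sketch of a crawl-friendly image and video embed (the filenames, alt text, and transcript copy are hypothetical):

```html
<!-- Image with descriptive alt text so crawlers can index what it shows -->
<img src="/images/red-running-shoes.jpg" alt="Red lightweight running shoes on a track">

<!-- Video paired with a plain-text transcript on the same page -->
<video src="/media/product-demo.mp4" controls></video>
<p class="transcript">
  Transcript: In this demo we walk through the product's main features...
</p>
```

The transcript gives crawlers indexable text for content they cannot otherwise parse, while the alt attribute does the same for the image.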
If you’d like to find out how search engines see your website, go to SEO-browser.com.
By using the above tool you’ll be able to see what’s wrong with your webpage and act accordingly.
Crawlable Link Structures
In the same way that search engines need to detect HTML content in order to parse web pages, they also need to detect links. This is how they're able to discover web pages in the first place. In order for search engines to discover all the web pages on your site, they need a crawlable link structure.
A crawlable link structure is one that makes it easy for search engine crawlers to detect all the pathways of a website. If search engines don’t detect all the pages on your website, then there’s no way your pages will get indexed.
Unfortunately, many websites are designed in a way that prevents search engines from reaching some of their pages.
These undetectable pages are known as orphans. No matter how well you market your website, your efforts will be wasted as long as the orphan pages can't be detected.
Here are some reasons why pages might not be reachable.
Submission-required forms
Some websites require users to complete an online form, such as a survey or a password-protected login, before they can access certain pages. Since search crawlers don't submit forms, any content that can only be reached that way will be invisible to the engines.
Search Forms
Search engine crawlers are unable to use search forms. Some webmasters wrongly believe that by including a search form on their website, search engines will be able to locate content in the same way that human users can.
This couldn’t be further from the truth as crawlers don’t perform website searches to find content.
Links In Unreadable JavaScript
Search engines either give very little weight to links that are embedded in JavaScript code or ignore them completely. If you intend for links to be crawlable, you should ensure that your JavaScript links are replaced with HTML links.
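As an illustration (the URL is hypothetical), compare a link that exists only in script with a plain, crawlable HTML anchor:

```html
<!-- Hard for crawlers: the destination only exists inside JavaScript -->
<span onclick="window.location='/pricing'">Pricing</span>

<!-- Crawlable: a standard HTML anchor with a real href -->
<a href="/pricing">Pricing</a>
```

The second form gives crawlers an explicit URL to follow, which is what makes the link structure discoverable.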
Links In Java, Flash and other plug-ins
Search engine crawlers can't detect links embedded in Java or Flash, so the pages those links point to won't be crawled or indexed.
Blocking by Meta Robots Tag
If there are pages on your site that you would like to restrict search engine access to, you can use the robots.txt file or the meta robots tag to block crawlers from indexing them. Sometimes, though, the restriction happens unintentionally. If you notice that certain pages on your website aren't being indexed, that should be the first place you look.
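For example, a page can opt out of indexing with a meta robots tag in its head (a minimal sketch):

```html
<head>
  <!-- Ask crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

A robots.txt rule such as `Disallow: /private/` under `User-agent: *` blocks crawling of that path entirely, while the meta tag above lets the page be crawled but keeps it out of the index. If either is applied to the wrong pages, they will silently disappear from search results.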
Pages With An Excessive Number of Links
Search engine crawlers will not crawl each and every link on a page, especially when there are a great many of them. By limiting the links they follow, search engines conserve crawl resources and reduce spam.
Try as much as you can to only include relevant links in your content.
Follow and Nofollow Links
In general, there are many HTML attributes associated with links. However, search engines ignore almost all of them and instead focus on just one: rel="nofollow".
By including this attribute on a link, you're telling search engines that even though you're linking to a website, Google or any other search engine shouldn't treat the link as an endorsement.
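In markup, the difference is a single attribute (example.com is a placeholder domain):

```html
<!-- A standard link, which search engines may treat as an endorsement -->
<a href="https://example.com/">Example site</a>

<!-- A nofollow link: no endorsement is passed to the target -->
<a href="https://example.com/" rel="nofollow">Example site</a>
```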
According to Google, when its crawlers come across the rel="nofollow" attribute, they generally treat the link much like plain text.
Nofollow links are still considered good for SEO, as they indicate a natural, diverse link profile. Studies suggest that sites that do well in search engine results pages tend to have a high number of nofollow links pointing in from other sites.
Keyword Targeting and Use
When it comes to conducting searches online, keywords play a critical role. In fact, their role is so central that instead of simply storing a database of web pages, search engines build keyword-based indexes of the pages they crawl.
On-Page Optimization
On-page optimization refers to working on your site's internal pages so that search engines rank them higher than others. That, at least, is the layman's definition.
Here are some ways you can use keywords to optimize your site:
- Title Tag – Place the keyword as close to the front of the title tag as you can. The title tag provides the text that appears at the top of the browser tab and as the headline of your search listing.
- Place the keyword near the very top of the page, ideally within the first 100 words of your opening paragraph.
- Sprinkle your keyword throughout your site. The actual number varies depending on the length of the content; just don't make it look spammy. If you've written a short blog post (around 500 words), eyeball the content and decide how many times your keyword should appear.
- Use the keyword in the alt attribute of your image. This helps with both web search as well as image search. Every now and then image search brings in valuable traffic so you shouldn’t ignore this.
- Mention your keyword at least once in the URL slug.
- Use your keyword at least once in the meta description tag. You need to note that the meta description doesn’t affect how your page is ranked. However, users will use the description to decide whether to click on your page.
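Pulling these points together, here is a minimal sketch of an on-page-optimized document (the keyword "running shoes", the slug, and all copy are hypothetical):

```html
<head>
  <!-- Keyword near the front of the title tag -->
  <title>Running Shoes: How to Pick the Right Pair</title>
  <!-- Meta description: doesn't affect ranking, but drives clicks from the results page -->
  <meta name="description" content="Learn how to choose running shoes that fit your stride, budget, and training goals.">
</head>
<body>
  <!-- Served at a keyword-bearing slug, e.g. /blog/running-shoes-guide -->
  <h1>Running Shoes: A Buyer's Guide</h1>
  <!-- Keyword appears within the opening paragraph -->
  <p>Choosing running shoes starts with understanding your gait...</p>
  <!-- Keyword in the image's alt attribute helps web and image search -->
  <img src="/images/running-shoes.jpg" alt="Pair of blue running shoes">
</body>
```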
Avoid keyword cannibalization. This occurs when multiple pages on your site target the same keyword and end up competing with one another in the search results.
Here at NoRiskSEO.com we have an entire team dedicated to creating SEO friendly web designs. If you’d like a free website analysis get in touch.