Sunday, January 18, 2009

SEO for Blogs

While there are many reasons to blog, two uses are becoming especially widespread: gaming Google AdSense, and gaming the search engines for better rankings.

Since blogs appear to the search engines as snippets of information that are updated on a frequent basis, they get a lot of preference in the search results.

So how do you optimize your blog?

Whichever service you use, there are some basics you can apply. The two biggest blogging platforms are WordPress and Blogger. If you are looking for an SEO advantage, consider Blogger: it is owned by Google, and new posts tend to be indexed more quickly.

Blogs aren't all that different from a regular site. The power of a blog comes from the frequency of posting. If the search engines see a pattern of three postings per day, they will spider that site very actively. Depending on the topic of the blog, you may be able to boost it in the rankings if it relates to a hot topic on the Internet.

Now, to the actual optimization.

Since blogs are typically a few pages long, you only need to establish a few rules that will be applied sitewide:
  • Use an interesting title - Also known as linkbaiting - your title should captivate the user to read more
  • Try to use your blog keywords in the title - while it may be hard to always do this, by keeping the title related to the topic, your blog will always be relevant
  • Use the H1 tags - All blog editors allow for custom templates, so be sure to stylize your blog title in the H1 tags
  • Link to other blogs - Find other related blogs that your readers might find interesting, and ask for a link in return
  • Don't forget about your archives - You may post a topic that isn't hot at the moment, but comes up later - make sure your readers can find that posting
As stated above, the key to a successful blog is frequent posting. You don't need to write up a full article every time; two paragraphs and maybe some resource links is all a typical posting needs.

Automatic SEO

You heard it here first.

Within 3 years, almost anyone will be able to perform search engine optimization (SEO) on their site.

It will all be done through a back-end content management system (CMS) - and it will be easy and fast.

But that doesn't mean it's the end of the world for optimization services.

While a properly optimized site is nice to have, it won't matter if no one finds it. Expect to see more companies offer Search Engine Marketing as their premiere service, as that is more time-intensive anyway.

How can this be? Let's take a trip back to 1995.

You remember: blinking text, huge JPEG graphics, dial-up, and AOL was all the rage. Back then, it was believed that all you needed was a website and you could become an Internet millionaire. While that was true for a few lucky people, for the rest of us it was still the same old grind. But then template editors arrived: Microsoft FrontPage, Macromedia Dreamweaver, and others appeared, making the creation of a website fairly easy.

Once the online world realized they could create their own sites, they next wanted the ability to edit their sites. Thus, the birth of CMS became a reality and many web design shops found themselves scrambling to offer these back-end modules or services to their clients.

Today, people expect to have a website that they can edit, which brings us to SEO. There are many properties within a site that can be generated with a CMS. For example:
  • Meta Tags
  • Page Titles
  • Page Names
  • Site Maps
  • CSS Stylization
If the web administrator knows the simple basics of SEO, they can easily apply those values to all the pages the CMS creates.
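To make the idea concrete, here is a minimal sketch (in Python, with hypothetical names) of how a CMS back-end might derive SEO values such as the page title, meta description, and page file name from content the administrator already enters:

```python
# A minimal sketch of a CMS back-end auto-generating SEO properties
# from content the administrator already enters. All function and
# field names here are hypothetical, not from any real CMS.
import re

def seo_properties(page_title, summary):
    """Derive basic SEO values from a page's title and summary text."""
    # Page name: lowercase, words joined by hyphens, other characters dropped
    slug = re.sub(r"[^a-z0-9]+", "-", page_title.lower()).strip("-")
    return {
        "title": page_title,
        "meta_description": summary[:155],   # stay within a typical SERP snippet
        "page_name": slug + ".html",
    }

props = seo_properties("High-Performance Running Shoes",
                       "Lightweight racing flats and trainers for serious runners.")
print(props["page_name"])   # high-performance-running-shoes.html
```

Once rules like these exist in the back end, every page the client creates inherits sensible defaults without them ever thinking about SEO.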

I know this is coming because I've built these exact systems for clients. Once a client learns the importance of proper naming structures, and the best way to naturally place their keyphrases into all parts of their website, they are already two steps ahead of their competition.

Search Engine Optimization is still a valuable way to earn money, but as more people integrate the Internet with their lives, they will become knowledgeable enough to perform those tasks on their own.

The next step will be the automation of SEM, which I still think is only about 5 years away...

Grab the money while you can!

-To your online success!

Balaji J H

Local Search Engine Optimization

Many people are now starting to recognize the value that local search engine optimization, the fastest growing vertical in search, can have on their site traffic. Also known as regional search, it's basically geo-targeting your audience when they search.

Local search works best for the service provider, or a retailer that has numerous locations. While the search volume won't be as great as a non-regional phrase, the person who reaches your site will be a more targeted visitor and most likely ready to convert.

Another happy accident of local search is that well-optimized sites may also pick up rankings in mobile search.

So, here's what you need to do in order to rank for local seo:
  • Be sure to have the full address of your location(s)
  • If you have a regional number, list that as well since some people start with an area code
  • Be sure to include driving directions to your location
  • Use a mapping service to display your location
  • Have pictures of your locations and name them with your street address
  • Make sure your site appears in any regional directory that might be online
  • If you can afford it, get listed in your local yellow pages
  • Place the regions you want to rank for in your page titles
  • Get text links that contain the regional phrase
Most of these techniques are not only common sense, but also good web design. If you're in business, you want people to be able to find you, right?

-To your online success!

Google Tricks

Enter just the word http for your search to find the top 1000 PageRanked sites.

Enter only www in your search to see how Google ranks the top 1,000 sites.

Manually type the following prefixes and note their utility:

  • link:url - Shows other pages with links to that URL.
  • related:url - Same as "what's related" on SERPs.
  • site:domain - Restricts search results to the given domain.
  • allinurl: - Shows only pages with all terms in the URL.
  • inurl: - Like allinurl, but only for the next query word.
  • allintitle: - Shows only results with terms in the title.
  • intitle: - Similar to allintitle, but only for the next word. "intitle:seoforgoogle google" finds only pages with seoforgoogle in the title, and google anywhere on the page.
  • cache:url - Will show Google's cached version of the passed URL.
  • info:url - Will show a page containing links to related searches, backlinks, and pages containing the URL. This is the same as typing the URL into the search box.
  • spell: - Will spell-check your query and search for it.
  • stocks: - Will look up the search query in a stock index.
  • filetype: - Will restrict searches to that filetype. "-filetype:pdf" removes Adobe PDF files.
  • daterange: - Is supported in Julian date format only. 2452384 is an example of a Julian date.
  • maps: - If you enter a street address, links to Yahoo Maps and MapBlast will be presented.
  • phone: - Enter anything that looks like a phone number to have a name and address displayed. The same is true for something that looks like an address (include a name and zip code).
  • site:www.somesite.net "+www.somesite.+net" - Tells you how many pages of your site are indexed by Google.
  • allintext: - Searches only within the text of pages, not in the links or page title.
  • allinlinks: - Searches only within links, not text or title.
Saturday, January 17, 2009


    We all know the importance of having a Web site rank well in search engine results for searches on specific keywords/phrases. If your Web site doesn’t have a page appearing in the top 10 search engine result positions (SERPs) the chances of someone clicking on your listing, and actually visiting your site, drop dramatically. If you’re not in the top 20 you have almost no chance that someone will scan through the SERPs that far to find your page.


    Optimizing your site and content for a search engine, for a better ranking in SERPs, is known as Search Engine Optimization (SEO), yet many Web developers/designers either don’t take time to code a site properly or don’t know how to do proper SEO. The basics of code optimization are just sound HTML coding practices; when followed, they go a long way toward SEO.

    There is a lot you can do to optimize your Web site for search engines from the code level. Where you can also affect things, and this is beyond the work of the developer/designer, is in the actual content. Understanding how to tag the content, and where to place it in the HTML, is critical. Here is a basic outline of SEO best practices.

    Understand the Search Engines and Search Engine Spiders
    So how does your site get into a search engine? A search engine obtains your URL either by you submitting your site directly to the search engine or by others linking to your site. Then, at a time of its choosing, a search engine sends out its spider (or “bot”) to visit your site.

    Once there, the spider starts reading all the text in the body of the page, including markup elements, all links to other pages and to external sites, plus elements from the page head including some meta tags (depending on the search engine) and the title tag.

    It then copies this information back to its central database for indexing at a later date, which can be up to two or three months later.

    The spider then follows the links on the page, repeating the same process. Spiders are, for lack of a better term, dumb. They can only follow the most basic HTML code. If you’ve encased a link in a fancy JavaScript that the spider won’t understand, the spider will simply ignore both the JavaScript and the link. The same thing applies to forms; spiders can’t fill out forms and click “submit.”

    To get an understanding of what a spider sees, try accessing your site with a Lynx browser from a Unix server. Lynx is non-graphical, does not support JavaScript, and will display only text and regular a href links. This is what the spider can see and therefore index. Does your page work without graphics or JavaScript? If not, then the spidering won't work either and you'd better head back to the drawing board.
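If you don't have Lynx handy, you can approximate the same spider's-eye view with a few lines of code. This is only a rough sketch using Python's standard html.parser: it collects the visible text and href links while ignoring script and style blocks, roughly what a text-only browser (and a basic spider) sees:

```python
# A rough "spider's-eye view": strip a page down to plain text and
# href links, roughly what a text browser like Lynx would display.
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    def __init__(self):
        super().__init__()
        self.text, self.links = [], []
        self._skip = False          # ignore script/style contents

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True
        elif tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

page = """<html><head><script>openMenu();</script></head>
<body><h1>Yellow Widgets</h1>
<a href="/shop.html">Shop</a></body></html>"""
viewer = SpiderView()
viewer.feed(page)
print(viewer.text)    # ['Yellow Widgets', 'Shop']
print(viewer.links)   # ['/shop.html']
```

If the lists come back empty for your page, a spider is probably seeing nothing either.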


    Once the SE has all your content in its database, it runs an algorithm (a mathematical formula) against the content. These algorithms are unique to each SE and are constantly changing, but, in essence, all the search engines are looking for the important words on your page (based on word density—how often a word or phrase is used in relation to the total amount of text) and they assign a value to these words based on the code surrounding the words.
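As a toy illustration of word density, here is a short Python sketch that measures how often a phrase appears relative to the total word count. Real engine algorithms are, of course, far more sophisticated than this:

```python
# A toy word-density measure: occurrences of a phrase relative to
# the total number of words in the page text.
def word_density(text, phrase):
    words = text.lower().split()
    count = " ".join(words).count(phrase.lower())
    return count / len(words)

copy = "Yellow widgets are durable. Our yellow widgets ship worldwide."
print(round(word_density(copy, "yellow widgets"), 2))  # 0.22
```

The point is simply that density is a ratio: repeating a phrase matters only in relation to how much other text surrounds it.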


    In addition to content, the search engine looks for what other sites, or pages on the same site, are linking to that page. The more links to a given page, the more important that page is. Getting other sites to link to your site is very important, but not part of optimizing your site and will be covered in a future column. From a site optimization standpoint, make sure you link to your important pages from more than just the index page (e.g., create a primary navigation that appears on all pages.)

    Tip 1
    The first rule of SEO is not to design your site in such a way that the code prevents a spider from being able to index it. This means avoiding pages that are 100% graphics and no text, such as all-image or Flash-only pages. Furthermore, if the first thing a user encounters is a log-in page before being able to see the site's content, then that's what a spider will see, and it won't go any further, either.

    If you’re planning to build a Web site entirely in Flash, DON’T. If you have no choice, then read my previous column, Search Engine Optimization and Non-HTML Sites.

    Tip 2
    To find out what a spider sees on your site, run a spider simulator on a given page. The simulator will show you what text the spider sees and what links it finds. There are many good ones on the market at various prices. If you’re looking for something that’s free, I’d suggest Search Engine Spider Simulator.

    Tip 3
    Each Web site should have a file called robots.txt. This file tells the spiders what directories they should not spider. Make sure this file is present and that it gives the appropriate permissions to the spiders. This includes access to content and to CSS.

    For more information, see robotstxt.org.
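As an aside, Python's standard library ships with a robots.txt parser, which makes it easy to check how a given file will be interpreted. This sketch feeds it a sample file directly rather than fetching one from a live site:

```python
# Check how a robots.txt file is interpreted, using Python's
# standard-library parser on a sample file.
from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /private/
Allow: /
"""
parser = RobotFileParser()
parser.parse(rules.splitlines())
print(parser.can_fetch("*", "/index.html"))      # True
print(parser.can_fetch("*", "/private/data"))    # False
```

Running your own rules through a parser like this is a quick way to confirm you haven't accidentally locked spiders out of content or CSS you want indexed.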

    Page Structure
    Once you’ve built an SE-friendly Web site, you then need to be sure each page is also SE-friendly. As I said earlier, good HTML structure is the foundation for building an SEO Web page. There are two primary areas of a Web page: the area contained between the head tags and that contained between the body tags. What information you place in these areas has a huge impact on how a page is indexed and, to a certain degree, what will appear in the SE results page.

    When designing your page, or placing content on your page, remember that spiders read like people. They go from left to right and from top to bottom (though this may be different for other languages.) They also feel that the most important information is located at the top of the page. If it’s important, why would you place it at the bottom? When reading specific tags (title, h1, h2, etc.) search engines value words to the left more highly than words to the right.

    The Title Tag
    Let’s start at one of the first elements in a Web page, the title tag (<title>). This is one of the, if not the, most important elements for SEO on the entire page. All too often, the information contained in this tag is either left blank, has a default value (e.g. “insert title here”), or is simply the company name.

    Why is this tag so important? First of all, it is used by every major search engine as a key indicator of the page’s content, and, second, it is used by the search engine as the first line in the SERPs.

    Give this tag the consideration it deserves.

    Tip 4
    Determine the main topic of the page and use it as the title. A page about high-performance running shoes from manufacturer XYZ shouldn’t have the title “XYZ”—it should have a title something like “High-performance Running Shoes.” If the brand is important, then add it to the end of the line like this: “High Performance Running Shoes - XYZ.”
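Following Tip 4, a title can even be assembled mechanically: topic first, brand (if any) appended at the end. A trivial Python sketch, with hypothetical names:

```python
# Build a page title per Tip 4: topic first, optional brand at the end.
def page_title(topic, brand=None):
    return f"{topic} - {brand}" if brand else topic

print(page_title("High-Performance Running Shoes", "XYZ"))
# High-Performance Running Shoes - XYZ
print(page_title("About Us"))
# About Us
```

A rule this simple is easy to bake into a CMS template so no page ever ships with "insert title here".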

    The Meta Tags
    Over the years, various meta tags have come in and gone out of favor with search engines. One of those which has lost its value is the “keywords” meta tag. Most search engines say they don’t look at it anymore but if you have time to create one, go ahead and do so. It doesn’t hurt.

    The only meta tag that all search engines presently acknowledge is the "description" meta tag. Once again, this tag should be unique to each page and match the content on the page itself.

    The proper format for the description meta tag is, for example:

    <meta name="description" content="A short, page-specific summary of this page's content.">

    Tip 5
    Write a unique description for each page. If you use the same meta tag across all pages, the search engine will pick up on this and potentially ignore the content of the meta tag or possibly the entire page.
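A quick way to spot the duplicate descriptions Tip 5 warns about is to count how often each one appears across your pages. The page data below is just a stand-in for values pulled from your own CMS or a crawl:

```python
# Flag pages that share a meta description with another page.
# The page data here is illustrative, not from a real site.
from collections import Counter

descriptions = {
    "/index.html": "Quality widgets for every budget.",
    "/shoes.html": "High-performance running shoes from XYZ.",
    "/socks.html": "Quality widgets for every budget.",   # duplicate!
}
counts = Counter(descriptions.values())
duplicates = [page for page, d in descriptions.items() if counts[d] > 1]
print(duplicates)   # ['/index.html', '/socks.html']
```

Any page that shows up in the duplicates list needs its own description written.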

    JavaScript
    We’re all familiar with loading the top of the HTML page with all sorts of JavaScript functions that are necessary for various page features. This includes, but is not limited to: mouse-overs, form validators, cookie checkers, etc. To search engine spiders, this is clutter, and, while they ignore it, they still need to wade through all that code to find the real content of the page. Many spiders have timeouts or maximum character counts associated with them; if they have to wade through too much junk, they’ll abandon their spidering and move on to another site. So avoid making your pages too top-heavy by placing too much code between the head tags.

    Tip 6
    Put all your JavaScripts in external files and link to them. You’ll be creating an SE-friendly page while also making your markup cleaner and your Web site management easier.

    The Page Body
    This is the part of the Web page that your visitors will be seeing and yes, you can make pages both eye-pleasing and, at the same time, well-optimized for search engines.

    Page Headings and Other Word Graphics
    For stylistic reasons, many of us have chosen to display page headings as graphics. By turning to our favorite graphical editor to create unique and creative headings, we’ve removed important words from our Web pages.

    Your Web site users may not really care that it took you four hours to create a groovy page heading that says “Yellow Widgets.” They just want to know that they’re on a page about Yellow Widgets.

    From the perspective of a search engine spider, the graphic about yellow widgets is just a graphic and spiders won’t read them. One option is to fill in the "alt" attribute in the "img" tag with the actual words. However, search engines give very little value, if any, to “alt” content these days. This attribute is still a requirement for accessibility, but it won’t do much toward getting your page ranked well in a search engine.

    The same thing applies to all those great key words on your site that form your site navigation menu. Perhaps you’ve created graphics of the words for a mouseover effect, but, once again, they’re graphics and a spider couldn’t care less about them.

    Instead of spending all that time creating graphics of words, use real text. They are words, after all. If you must use graphics, consider a form of CSS image replacement; the spider should still be able to access the text of your heading.

    Tip 7—Page Titles
    Search engines love content that appears in header tags (h1, h2, etc.) yet very few Web sites actually use them. Their original intention was to be the visible title of the page (long before Web browsers actually supported graphics), with the primary title using h1 and subsections of the page encased in h2 tags, and so forth. In the early days of Web design, we had little to no control of these elements and they simply appeared as big black text on your page. This all changed with the introduction of Cascading Style Sheets (CSS.)

    Take time to define your header tags in your CSS and use the header tag for the titles and secondary titles of your content.

    To avoid spamming search engines, a Web page should have only one h1 tag. They can have as many h2 tags as necessary.
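The one-h1 rule is easy to check mechanically. Here is a small sketch using Python's standard html.parser that counts heading tags in a page:

```python
# Count heading tags in a page to verify the one-h1 rule.
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self.counts[tag] = self.counts.get(tag, 0) + 1

page = "<h1>Widgets</h1><h2>Yellow</h2><h2>Blue</h2><h1>Oops</h1>"
counter = HeadingCounter()
counter.feed(page)
print(counter.counts)                     # {'h1': 2, 'h2': 2}
print(counter.counts.get("h1", 0) <= 1)   # False: this page needs fixing
```

Two h1 tags on one page, as in this sample, is exactly the pattern to avoid.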


    Tip 8—Mouseovers
    Instead of spending all that time creating mouseovers, try using the hover feature of CSS.

    If, for specific reasons, you can’t use CSS (perhaps you must support really old browsers), then repeat the menu options at the bottom of the page with plain text.

    Tables
    Graphic designers love using tables to slice and dice a graphical design to use on the Web. Unfortunately, these designers never really understood that the Web is not a printed page, and that designs should be easy to code into Web pages.

    The problem with tables is that all the slicing and dicing can create Web pages containing tables embedded four or more deep to accommodate the design—and all the good content ends up inside the inner-most embedded tables.

    From a technical perspective, search engine spiders can read tables, and even embedded tables, but once a design gets to be more than about three tables deep, most spiders run into problems. Either it’s simply too much code for them to keep track of, or the search engine thinks you placed that content deep in the page because it’s not important, and so the engine gives it little or no value.

    Tip 9
    Avoid unnecessary tables where possible. Limit your table embedding to a depth of three.

    Where possible, avoid tables altogether and start using XHTML with div tags and CSS to define position. This makes for a much cleaner design and has the bonus of being easier to manage.
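Nesting depth is also easy to measure. A short sketch with Python's standard html.parser that tracks how deeply tables are embedded:

```python
# Measure the maximum table nesting depth of a page.
from html.parser import HTMLParser

class TableDepth(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0
        self.max_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag == "table":
            self.depth += 1
            self.max_depth = max(self.max_depth, self.depth)

    def handle_endtag(self, tag):
        if tag == "table":
            self.depth -= 1

page = "<table><tr><td><table><tr><td>content</td></tr></table></td></tr></table>"
checker = TableDepth()
checker.feed(page)
print(checker.max_depth)   # 2
```

If the number that comes back is greater than three, per Tip 9, it is time to flatten the layout.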

    Using Bold and Strong
    If there is an important phrase in your content, be sure to tag it appropriately. This is good for the user experience—and since you’re telling your users that the words are important, the search engines are likely to think the same way.

    Tip 10
    Use either <b> or <strong> to mark up important words on your page. While most people use the bold tag (<b>), according to the W3C the correct markup is <strong> for important words.

    Summary
    By following this basic outline, you’ve created search engine-friendly pages. Your pages will be easily indexed by the search engine spiders, and, with important words and phrases appropriately tagged, those words will receive proper valuation by the search engines. All that’s left is to identify the appropriate words in the Web site copy and to find out if they are the words people actually search for—then develop an appropriate linking strategy. Those are lessons for another day.

    Thursday, January 8, 2009

    History

    Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all a webmaster needed to do was submit a page, or URL, to the various engines, which would send a spider to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts various information about the page: the words it contains and where they are located, any weight for specific words, and all links the page contains, which are then placed into a scheduler for crawling at a later date.

    Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the earliest known use of the phrase search engine optimization was a spam message posted on Usenet on July 26, 1997.

    Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provided a guide to each page's content. But using meta data to index pages was found to be less than reliable, because the webmaster's choice of keywords in the meta tag was not always truly relevant to the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags caused pages to rank for irrelevant searches. Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.

    By relying so much on factors exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, allowing those results to be false would drive users to other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.

    While graduate students at Stanford University, Larry Page and Sergey Brin developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random surfer.

    Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaining PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming. In recent years major search engines have begun to rely more heavily on off-web factors such as the age, sex, location, and search history of people conducting searches in order to further refine results.

    By 2007, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals. The three leading search engines, Google, Yahoo and Microsoft's Live Search, do not disclose the algorithms they use to rank pages. Notable SEOs, such as Rand Fishkin, Barry Schwartz, Aaron Wall and Jill Whalen, have studied different approaches to search engine optimization, and have published their opinions in online forums and blogs. SEO practitioners may also study patents held by various search engines to gain insight into the algorithms.


    Source: Wikipedia

    Monday, January 5, 2009

    SEO - Introduction

    SEO... also known as Search Engine Optimization... is useful indeed.

    Search engine optimization (SEO) is the process of improving the volume and quality of traffic to a web site from search engines via "natural" ("organic" or "algorithmic") search results. Usually, the earlier a site is presented in the search results, or the higher it "ranks," the more searchers will visit that site. SEO can also target different kinds of search, including image search, local search, and industry-specific vertical search engines.

    As an Internet marketing strategy, SEO considers how search engines work and what people search for. Optimizing a website primarily involves editing its content and HTML coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines.

    The acronym "SEO" can also refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site, SEO tactics may be incorporated into web site development and design. The term "search engine friendly" may be used to describe web site designs, menus, content management systems and shopping carts that are easy to optimize.

    Another class of techniques, known as black hat SEO or Spamdexing, use methods such as link farms and keyword stuffing that degrade both the relevance of search results and the user-experience of search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.

    Source: Wikipedia