Tuesday, September 30, 2008

Professional SEO: Hand Off to Bob or Outsource the Job

We are often asked if professional SEO (search engine optimization) can be done effectively utilizing in-house talent. Despite our obvious self-interest on the subject, our answer is always a qualified "yes": you can achieve professional SEO results using existing talent. However, for every company we have known that has met with great in-house SEO success, we know of many more that have seen their in-house efforts fail. We have also discovered that the companies that have succeeded share some common traits.

If your company is considering doing SEO in-house, there are some critical questions that you should address before you proceed.

1. Do I have the proper resources at my disposal to achieve professional SEO results?

Search engine optimization takes time, and your internal SEO expert will need to have a great deal of it at his or her disposal, especially at the project's outset, when target audiences, keyphrases, and optimization schemes are first being established. Even after the initial optimization effort, the nature of SEO will require this person to spend ample time keeping up with industry trends, monitoring campaign progress, performing A/B testing, and expanding the campaign as new product and service areas are added.

Perhaps even more important than time, achieving professional SEO results requires a unique set of aptitudes. The person responsible for your internal SEO initiative must possess the ability to learn quickly and to look at your website from a macro-perspective, marrying together the needs of sales, marketing, and IT. He or she cannot be an aggressive risk taker, as this is often a surefire way to get your website penalized and potentially removed from the major search engines.
These gifted people exist in many companies, but given the unique attributes that these individuals possess, their time is often already spent in other crucial areas of the business.

Without enough time to invest in the project or the right type of person to execute it, an internal SEO initiative is likely doomed to fail.

2. Do I know which departments of my company should be involved, and will they work with an insider?

As mentioned above, professional SEO, by necessity, involves marketing, sales, and IT. The SEO expert must work with marketing to find out what types of offers and initiatives are working offline and help translate them effectively online. He or she must work with sales to identify the types of leads that are most valuable, so that the right people are targeted in the keyphrase selection process. And, finally, your SEO expert will need to work with IT to determine any technical limitations to the SEO recommendations, learn of any past initiatives based on a technical approach, and get the final optimization schemes implemented on the website.

Sadly, in many businesses, these departments have a somewhat adversarial relationship. However, it is the duty of the SEO expert to act as a project manager and coordinate the efforts of all three departments if you are going to get the most out of your campaign. No professional SEO project can be completed in a vacuum. For whatever reason, it is often easier for an outsider to get adversarial departments on the same page, in the same way that a marriage counselor might convince a woman of her undying love for her husband while the husband is still grimacing from a well-placed knee in the parking lot.

3. Will someone be held accountable for the results?

This may seem like a small consideration, but it can have a tremendous impact on the success of the campaign.
If you have added this responsibility to some poor soul's job description with the direction that he or she should "do the best you can," you'll be lucky to make any headway at all (especially if the person is not enthusiastic about SEO). Whether SEO is done in-house or outsourced, someone will have to take responsibility for showing progress, explaining setbacks, and continually improving results. Without this accountability, it is very common to see an initiative fade as the buck is passed.

4. Can I afford delayed results based on a learning curve?

It's a reality: professional SEO expertise has a steep learning curve. While information on how to perform the basics of optimization is freely available on the web, much of it is contradictory, and some of it is actually dangerous. It takes time for someone unfamiliar with the discipline to sort the SEO wheat from the SEO chaff (on a side note, a "quoted" search of Google reveals that this may actually mark the first occasion in human history that the phrase "SEO chaff" has been used; we're betting it's also the last). Simply put, if the person you are putting on the job has no experience, it will take longer to get results. This may not be a consideration if you aren't counting on new business from SEO any time soon. However, if you are losing business to your competition due to their professional SEO initiatives, time might be a larger factor.

5. Will it cost me less to do it in-house than it would to choose a professional SEO firm?

Often, companies will attempt this specialized discipline in-house in order to save money, and sometimes this works out as intended. However, to make an accurate comparison, you should calculate the cost of the in-house labor involved versus the price of the firm you would otherwise hire. When making this calculation, also factor in the opportunity cost of the resource:
the tasks that your in-house people are not able to perform because they are involved in SEO.

In addition, if worse comes to worst and your in-house SEO expert is led astray by some of the more dangerous "how-to" guides available, it can cost even more to repair the damage than it would have to hire a professional SEO firm to perform the optimization from the outset. And an internal SEO campaign gone wrong can cost even more than the stated fee: websites that violate the terms of service of the major search engines (whether intentionally or not) can be severely penalized or even removed, costing you a lot of lost revenue while potential customers cannot find your website.

6. Do I believe that the end result I'll get in-house will be equal to or greater than the results I would have gotten from a professional SEO firm?

Search engine optimization can create huge sales opportunities, and slight increases in overall exposure can have not-so-slight effects on your bottom-line revenue. If you believe that your talented in-house resource will, given enough time, achieve results equal to or greater than those that could have been achieved by the professional SEO firm you might have chosen, it may make sense to do it internally.

However, in addition to a better knowledge of industry trends, one clear advantage that search engine optimization firms have is the benefit of the experience and macro-perspective that comes from managing many different websites over time. Professional SEO firms can watch a wide range of sites on a continual basis to see what trends are working, what trends aren't, and what formerly recommended tactics are now actually hurting results.

This macro-perspective allows professional SEO firms to test new tactics as they appear on a case-by-case basis and apply the results across a wide range of clients to determine the benefit.
It is harder for an individual with access to only one site to perform enough testing and research to achieve optimum results all the time, something that should also factor into the equation.

7. Do I have at least a slight tolerance for risk?

Neophytes to SEO can make mistakes that lead to search engine penalization or removal. This happens most commonly when they have an IT background and treat SEO as a strictly technical exercise. We are often called in to assist companies that have had an internal initiative backfire, leaving them in a worse position than the one they were in before they started. The simple truth is that you cannot perform effective SEO without marrying your efforts to the visitor experience, but this is not something that is intuitively understood when people approach SEO for the first time.

However, professional SEO firms are not perfect either. Some firms use those same optimization methods that violate the search engines' terms of service and can get your site penalized. So, if you do decide to outsource, educate yourself on SEO and do some research on the firm. Know the basics of the business, find out who the firm's clients are and how long they've been in business, and ask for professional references, just like you would with any major business purchase.

If you have considered all of the above questions, and your answers to all seven are "yes," your company may be uniquely equipped to achieve professional SEO results in-house. If you answered "no" to any of the first three questions but "yes" to the rest, it does not necessarily mean that you can't perform SEO in-house, just that you may not be in a position to do so at this time. Taking the actions required to get you in the right position to answer in the affirmative might be worth your while.
However, if you answered "no" to any of the last four questions, you may want to consider outsourcing the project to a professional SEO firm.

A professional SEO firm has the resources, the time, the expertise, and, most importantly, the experience to launch an SEO initiative for your website that will have a positive effect on your bottom line. Whichever option you choose, it is important that you fully embrace the channel. A half-hearted initiative, whether done internally or outsourced, can be as ineffective as taking no action at all.

About the Author

Scott Buresh is the CEO of Medium Blue Search Engine Marketing (http://www.mediumblue.com). He has contributed content to many publications, including Building Your Business with Google For Dummies (Wiley, 2004), MarketingProfs, ZDNet, SEO Today, WebProNews, DarwinMag, SiteProNews, ISEDB.com, and Search Engine Guide. Medium Blue, an Atlanta search engine optimization company (http://www.mediumblue.com/profile.html), serves local and national clients, including Boston Scientific, DuPont, and Georgia-Pacific. To receive internet marketing articles and search engine news in your email box each month, register for Medium Blue's newsletter, Out of the Blue (http://www.mediumblue.com/newsletter.html).

Ten Steps To A Well Optimized Website - Step 5: Internal Linking

Welcome to part five in this search engine positioning series. Last week we discussed the importance of content optimization. In part five we will cover your website's internal linking structure and the role that it plays in ranking highly, and in ranking for multiple phrases.

While this aspect is not necessarily the single most important of the ten steps, it can be the difference between first-page and second-page rankings, and it can make all the difference in the world when you are trying to rank your website for multiple phrases.

Over this series we will cover the ten key aspects of a solid search engine positioning campaign. The ten steps we will go through are:

- Keyword Selection (http://www.beanstalk-inc.com/articles/search-engine-positioning/keywords.htm)
- Content Creation (http://www.beanstalk-inc.com/articles/search-engine-positioning/content.htm)
- Site Structure (http://www.beanstalk-inc.com/articles/search-engine-positioning/structure.htm)
- Optimization (http://www.beanstalk-inc.com/articles/search-engine-positioning/optimization.htm)
- Internal Linking
- Human Testing
- Submissions
- Link Building
- Monitoring
- The Extras

Step Five: Internal Linking

With all the talk out there about linking, one might be under the impression that the only links that count are those from other websites. While those links certainly play an important role (as will be discussed in part eight of this series), they are certainly not the only important links.

When you're about to launch into your link work, why not stop and consider the links that are easiest to attain and maximize first: the ones right there on your own site, over which you have total and complete control.
Properly used internal links can be a useful weapon in your SEO arsenal. The internal linking structure can:

- Ensure that your website gets properly spidered and that all pages are found by the search engines
- Build the relevancy of a page to a keyword phrase
- Increase the PageRank of an internal page

Here is how the internal linking structure can affect these areas and how to maximize the effectiveness of the internal linking on your own website.

Getting Your Website Spidered

Ensuring that every page of your website gets found by the search engine spiders is probably the simplest thing you can do for your rankings. Not only will this increase the number of pages that a search engine credits your site with, but it also increases the number of phrases that your website has the potential to rank for.

I have seen websites that, once the search engines find all of their pages, rank on the first page and see traffic from phrases they never thought to even research or target.

This may not necessarily be the case for you; however, having a larger site with more pages related to your content will boost the value of your site overall. You are offering this content to your visitors, so why hide it from the search engines?

Pages can be hidden from search engines if the linking is done in a way that they cannot read. This is the case with many navigation scripts.
If your site uses a script-based navigation system, you will want to consider implementing one of the internal linking structures noted further on in the article.

Additionally, image-based navigation is spiderable; however, the search engines can't see what an image is and thus cannot assign any relevancy from an image to the page it links to, other than assigning it a place in your website hierarchy.

Building The Relevancy Of A Page To A Keyword Phrase

Anyone who wants to get their website into the top positions on the search engines for multiple phrases must start out with a clearly defined objective, including which pages should rank for which phrases. Generally speaking, it will be your homepage that you use to target your most competitive phrase, moving on to target less competitive phrases on your internal pages.

To help build the relevancy of a page to a keyword phrase, you will want to use the keyword phrase in the anchor text of the links to that page. Let's assume that you have a website hosting company. Rather than linking to your homepage with the anchor text "home", link to it with the text "web hosting main". This will attach the words "web", "hosting", and "main" to your homepage. You can obviously leave the word "main" out if desired; however, in many cases it works for the visitor (you know, those people you're actually building the site for).

This doesn't stop at the homepage. If you are linking to internal pages, whether through your navigation, footers, or inline text links, try to use the phrases that you want to target on those pages as the linking text.
For example, if that hosting company offered and wanted to target "dedicated hosting", rather than leaving the link as solely the beautiful graphic in the middle of the homepage, they would want to include a text link with the anchor text "dedicated hosting" pointing to that internal page. This will tie the keywords "dedicated hosting" to the page.

In a field as competitive as hosting, this alone won't launch the site to the top ten, but it will give it a boost, and in SEO, especially for competitive phrases, every advantage you can give your site counts.

Increasing The PageRank Of Internal Pages

While we will be discussing PageRank (a Google-based term) here, the same rules generally apply for the other engines. The closer a page is in clicks from your homepage, the higher the value (or PageRank) the page is assigned. Basically, if I have a page linked to from my homepage, it will be given more weight than a page that is four or five levels deep in my site.

This does not mean that you should link to all of your pages from your homepage. Not only does this dilute the weight of each individual link, but it will look incredibly unattractive if your site is significantly large.

Figure out what your main phrases are and which pages will be used to rank for them, and be sure to include text links to these internal pages on your homepage. It's important to pick solid pages to target keyword phrases on, as you don't want human visitors going to your "terms and conditions" page before they've even seen the products.

If the hosting company noted above has a PageRank 6 homepage, the pages linked from its homepage will generally be a PageRank 5 (sometimes 4, sometimes 6, depending on the weight behind the homepage's 6).
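The idea that pages closer to the homepage accumulate more weight can be illustrated with a toy version of the classic PageRank calculation. The link graph, page names, and damping factor below are invented for illustration only; real engines use many more signals than this:

```python
# Toy PageRank on a small internal-link graph (all values illustrative).
links = {
    "home":      ["hosting", "dedicated", "contact"],  # homepage links out
    "hosting":   ["home", "dedicated"],
    "dedicated": ["home"],
    "contact":   ["home"],
    "deep":      ["home"],  # a deep page that nothing links to
}
pages = list(links)
damping = 0.85
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # power iteration until ranks settle
    new = {p: (1 - damping) / len(pages) for p in pages}
    for page, outs in links.items():
        share = damping * rank[page] / len(outs)
        for target in outs:
            new[target] += share
    rank = new

# Pages linked from the homepage end up with far more weight than
# the orphaned deep page, which only receives the baseline share.
print(sorted(rank, key=rank.get, reverse=True))
```

Running this, the homepage and the pages it links to directly accumulate the bulk of the weight, while the unlinked "deep" page is left with only the baseline, which is the point the article is making about keeping key pages close to the homepage.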
Regardless, it will be significantly higher than if that page were linked to from a PageRank 3 internal page.

How To Improve Your Internal Linking Structure

There are many methods you can use to improve your internal linking structure. The three main ones are:

- Text link navigation
- Footers
- Inline text links

Text Link Navigation

Most websites include some form of navigation on the left-hand side. This makes it one of the first things read by a search engine spider (read "Table Structures For Top Search Engine Positioning" by Mary Davies (http://www.beanstalk-inc.com/articles/se-friendly-design/table-structure.htm) for methods of getting your content read before your left-hand navigation). If it is one of the first things the search engine spider sees when it goes through your site, it will be given strong weight, so it must be optimized with care.

If you are using text link navigation, be sure to include the targeted keywords in the links. This should not be taken to mean "cram your keywords into each and every link": this is your navigation, and that would look ridiculous. I've seen sites that try to get the main phrase into virtually every link. Not only does this look horrible, but it may get your site penalized for spam (especially if the links are one after another).

You don't have to get your keywords into every link, but, where workable, every second or third link works well. Also consider what you are targeting on internal pages. If your homepage target is "web hosting" and you've linked to your homepage in the navigation with "web hosting main", followed by your contact page with "contact us", it would be a good idea to use the anchor text "dedicated hosting" for the third link.
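The every-second-or-third-link pattern described above can be sketched in a few lines. The page URLs and anchor phrases below are made up for illustration; the point is simply that keyword-rich anchors are mixed with plain, visitor-friendly ones:

```python
# Sketch: a text-link navigation block that works keyword phrases
# into every second or third anchor (all names invented).
nav_items = [
    ("index.html",     "web hosting main"),   # keyword-rich
    ("contact.html",   "contact us"),         # plain, for visitors
    ("dedicated.html", "dedicated hosting"),  # keyword-rich
    ("about.html",     "about us"),
]

def nav_html(items):
    links = ['<a href="%s">%s</a>' % (url, text) for url, text in items]
    return "\n".join(links)

print(nav_html(nav_items))
```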
This reinforces the "hosting" relevancy and also attaches relevancy to the dedicated hosting page of the site through the phrase "dedicated hosting" in the anchor text.

Footers

Footers are an often overused and abused area of websites. While they are useful for getting spiders through your site, and for the other points noted above, they should not be used as spam tools. In my travels I've seen footers that are longer than the content areas of the pages they sit on, from websites linking to every single page of the site from the footer. Not only does this look bad, but it reduces the value of each individual link (which becomes 1 out of 200 links rather than 1 out of 10 or 20).

Keep your footers clean, use the anchor text well, and link to the key internal pages of your website, and you will have a well-optimized footer. You will also want to include in your footer a link to a sitemap. On this sitemap, link to every page in your site. Here is where you can simply ensure that every page gets found. Well-worded anchor text is a good rule on your sitemap as well. You may also want to consider a short description of each page on your sitemap. This will give you added verbiage to solidify the relevancy of the sitemap page to the pages you are linking to.

Internal Text Links

Internal text links are links placed within the content of your work. They were covered in last week's article on content optimization, which gives me a great opportunity to use one as an example.
(In the HTML version, the text "content optimization" was used as an example inline link to http://www.beanstalk-inc.com/articles/search-engine-positioning/optimization.htm.)

While debatable, inline text links do appear to be given extra weight, as their very nature implies that the link is entirely relevant to the content of the site.

You can read more on this in last week's article.

Final Notes

As noted above, simply changing your internal navigation will not launch your site to the top of the rankings; however, it's important to use each and every advantage available to create a solid top-ten ranking for your site that will hold its position.

Internal links will help your pages perform better, they will help get your entire site spidered, they will help increase the value of internal pages, and they will build the relevancy of internal pages to specific keyword phrases.

Even if that's all they do, aren't they worth taking the time to do right?

Next Week

Next week, in part six of our "Ten Steps To an Optimized Website" series, we will be covering the importance of human testing. Having a well-ranked website will mean nothing if people can't find their way through it or if it is visually unappealing.

About The Author

Dave Davies is the owner of Beanstalk Search Engine Positioning (http://www.beanstalk-inc.com/). He has been optimizing and ranking websites for over three years and has a solid history of success. Dave is available to answer any questions that you may have about your website and how to get it into the top positions on the major search engines. info@beanstalk-inc.com

Monday, September 29, 2008

Keywords Finalization Methodology

To arrive at the set of keywords that:

- describe the business correctly (are relevant)
- attract traffic (are popular and are searched for)
- have less competition (are relatively un-optimized for)

Steps

Step I: For the keyword finalization of a web site, the first step is to devise the theme of the web site. The keywords should then be generated in sync with the theming structure of the site. The home page and the other higher-level pages should target more general (main theme) keywords. The deeper pages (embedded in subdirectories or subdomains) should target more specific and qualified keywords.

Once the site's themes and sub-themes are done, we can start looking for the keywords.

Step II: The finalization of the keywords for any given site can be done in the following way:

1. Generation of the seed keywords for the site (theme keywords).
2. Expansion of the seed keywords into key-phrases by adding qualifiers (sub-theme keywords).
3. Generation of a larger set of keywords by word play on the key-phrases generated in step 2 (sub-theme targeting).

Let's take them one by one.

Seed Keywords / Primary Keywords

The seed keywords can be generated in any of the ways mentioned below:

- The client provides the terms he feels are relevant to his business.
- The SEO firm generates the seed words by understanding the business domain and the business model of the client.
- Some outside domain consultant provides them.
- Look at the meta tags of competing web sites. (Warning: do not place any unnecessary emphasis on these tags; use them just to generate your seed keyword list.)

If one has a certain set of keywords, then tools like WT and Overture can also be used to arrive at other relevant seed keywords.

Typically, seed keywords are single words.
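The first two parts of Step II above (generating seed keywords, then expanding them with qualifiers) can be sketched in a few lines. The seeds and qualifiers below are invented examples, not a recommended list:

```python
# Sketch of the seed -> qualifier expansion step (illustrative data).
seeds = ["hosting", "servers"]
qualifiers = ["dedicated", "cheap", "linux", "atlanta"]

def expand(seeds, qualifiers):
    """Combine every qualifier with every seed into a key-phrase."""
    phrases = []
    for seed in seeds:
        for q in qualifiers:
            phrases.append("%s %s" % (q, seed))  # e.g. "dedicated hosting"
    return phrases

phrases = expand(seeds, qualifiers)
print(len(phrases))  # 2 seeds x 4 qualifiers = 8 sub-theme phrases
```

In practice you would prune the combinations that make no sense and keep the list in the 20-30 range the article suggests.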
A good number of seed keywords is between 10 and 12.

Sub-Theme Keywords (Add Qualifiers)

Now add qualifiers to these seed keywords. These qualifiers can be anything: location, sub-product, color, part number, activity, singular form, and so on. By utilizing these qualifiers one can expand the list of the seed keywords; a good number would be anywhere between 20 and 30. Typically, a sub-theme key phrase is two to four words long.

One recent study suggests that the typical searcher often uses longer queries, many containing more than three words. Within three different search engines, keyword distribution data tells a compelling story:

Words in Query   LookSmart (%)   Ask.com (%)   Teoma (%)
1                27.00           12.76         38.04
2                33.00           22.46         29.59
3                23.00           19.34         18.13
4                10.00           11.89          8.00
5                 7.00*           7.86          3.51
6                 -               6.19          1.39
7                 -               5.47          0.63

* LookSmart does not report beyond 5 search terms, instead grouping five or more terms into one category.

Approximately 40 percent of queries in LookSmart have three or more words. About 32 percent in Teoma have three or more. Ask Jeeves has an even higher skew, nearly 62 percent, because of its natural language focus. Within FAST, the database that powers Lycos and others, the average is 2.5 terms. That suggests a frequency distribution similar to LookSmart and Teoma.

Hence we can keep the average length of sub-theme keywords at around 3.

Rakesh Ojha is a successful Internet marketer utilizing both pay-per-click marketing and search engine optimization to increase website traffic. To learn more, visit http://sem.mosaic-service.com

Directories and Their Importance for Search Engine Rankings

About directories:

A directory is simply a web site that contains a categorized listing of links from around the web. Directories help surfers locate the 'best' and most informative links for a particular category. For example, a category may be called 'Home and Garden', and in this category there is a list of links about home improvement and gardening. Directories consist of a collection of categories into which links are separated. Categories can have sub-categories to make the division of links more specific.

Directories are important tools in building link popularity and, as a result, help improve search engine ranking. They are an excellent source of inbound, one-way links, which are the most powerful types of links for building link popularity.

There is a multitude of varied directories on the web at present. They range from general directories that include categories for almost everything, to specific directories that contain categories matching particular areas of interest, e.g. web sites about fishing. It is helpful to get your web site listed in as many directories as possible, as this will help you beat the competition in rankings.

Getting your link listed varies between directories, as each has its own process. Some directories require a fee, but most do not. A lot of directories offer a mixture of free and paid listings. When you pay to get your link listed in a directory, it is normally added within a week and is guaranteed to be added. On the flip side, a directory offering a free listing can take anywhere from a week to several months to add your link, and there is no guarantee that the link will be added.

Some directories use other approaches to get your link listed. For example, the JoeAnt directory (http://www.joeant.com) requires you to register as an editor in order to submit a link.
The Zeal directory (http://www.zeal.com) also requires you to register as a member, but first you must complete and pass a 20-question quiz to prove your worthiness.

Directories to submit to:

Getting listed in quality directories such as DMOZ or Yahoo can be more beneficial for your link popularity than a lot of links from smaller, newer directories. It is therefore useful to know which to submit to. The following is a listing of some of the higher-quality directories:

DMOZ
http://www.dmoz.org
The most important directory that exists on the web today. Getting listed in this directory is critical if you aim to achieve top rankings.
* Listing - Free
* Time to get listed - 1 to 6 months, maybe even longer

Yahoo Directory
http://dir.yahoo.com
After DMOZ, probably the next most powerful directory to get listed in.
* Listing - Free or paid
* Time to get listed - Free inclusion can take several months, paid inclusion a few days

Zeal & Looksmart
http://www.zeal.com
http://www.looksmart.com
These two directories are very closely related, as Looksmart owns Zeal.
* Listing - Zeal: Free for non-commercial sites (sites can only be submitted after passing a 20-question quiz) but paid for commercial sites. Looksmart: Paid
* Time to get listed - Zeal: Free inclusion can take 1-2 months, paid inclusion a few days. Looksmart: A few days

JoeAnt
http://www.joeant.com
JoeAnt is a directory that is growing in popularity.

Gimpsy
http://www.gimpsy.com
Operates differently to other directories.
Web sites are organized according to the question asked.

Information needed to submit to directories:

When submitting a link to a directory, you will more than likely be asked to provide the following information about your web site:

* URL
* Title
* Description
* Keywords that describe it

One thing I recommend you do before submitting is to prepare this information in advance and use the same information when submitting to all directories. As much as possible, place keyphrases that describe your web site into the Title text and Description text you submit, especially into the Title text (this is known as anchor text optimization).

When a directory asks you to provide keywords that describe your web site, it uses these keywords to locate your site when someone performs a search within the directory. So, for example, if I submit the keyword 'lawn', I have a better chance of being listed in the directory's search results when someone searches within it using the keyword 'lawn'. However, focus more on writing keyphrase-rich Title and Description entries than on Keywords when submitting.

Tips when submitting to directories:

* Submit your link to the most appropriate category. Every directory will tell you this, as it makes their job easier when it comes to listing your link. This point should be stressed, though, as it not only makes the directory editor's job a little easier, thereby speeding up your listing, but the search engines will also look at the other links that surround you on the page you are listed on, and if they are of a similar theme to your own, this can boost your rankings.

* Be patient. One point to remember about directories that provide a free listing: be patient. If you are not paying to be in the directory, then your link listing has a lower priority than paid listings. It can take several months to get into some directories, so it can be frustrating.
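Preparing the submission details once, so the same keyphrase-rich text goes to every directory, can be as simple as the sketch below. The site details and directory names are invented for illustration, and the CSV log follows the record-keeping idea suggested later in the article:

```python
import csv
import io

# One consistent set of submission details, prepared in advance
# (all site details here are invented for illustration).
submission = {
    "url": "http://www.example-lawncare.com/",
    "title": "Lawn Care and Garden Maintenance Services",  # keyphrase-rich
    "description": "Professional lawn care, lawn mowing and "
                   "garden maintenance for residential homes.",
    "keywords": "lawn, lawn care, garden, mowing",
}

# A simple log of where and when you submitted, so you never
# submit to the same directory twice.
log = io.StringIO()
writer = csv.writer(log)
writer.writerow(["directory", "date_submitted", "listed"])
writer.writerow(["DMOZ", "2008-09-29", "no"])
writer.writerow(["JoeAnt", "2008-09-29", "no"])

print(log.getvalue())
```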
The best thing to do when you submit is to put it out of your mind.</p><p>* Keep a log One of the most useful things to do upon submitting to a directory is to write the name of the directory and the date of submission into a log of some sort (try an Excel sheet). This prevents you from submitting to a directory more than once (keeping the editors happy) and gives you the satisfaction that you are getting something done. When your link gets listed in the directory mark it off in your log. Job well done!</p><p>Summary:</p><p>Submitting your link to directories is an excellent way to build your web site's link popularity and improve search engine rankings. There are many directories out there, ranging from the general to the specific, each with their own link submission processes. Prepare keyphrase-rich text that accurately describes your web site to submit to the directories. Choose the most appropriate category for your web site before submitting.</p><p>Submitting to directories requires a little effort and a lot of patience. Your search engine rankings will thank you for the extra effort.</p><p>Frank Kilkelly is a Search Engine Optimization (SEO) Expert and Webmaster at <a target="_new" href="http://www.seo-ireland.com/">http://www.seo-ireland.com/</a>, a complete search engine optimization resource. The highlight of the site is an SEO forum <a target="_new" href="http://www.seo-ireland.com/">http://www.seo-ireland.com/forum/</a> for discussion of the latest techniques and tips to improve the ranking of your web site.
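The submission log recommended in the directory article above (directory name, date submitted, whether the link is listed yet) can just as easily be kept with a small script as with an Excel sheet. A minimal Python sketch; the file name and column names are my own illustration:

```python
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("directory_log.csv")  # hypothetical file name
FIELDS = ["directory", "submitted", "listed"]

def log_submission(directory: str) -> bool:
    """Record a submission; refuse duplicates so the editors stay happy."""
    rows = []
    if LOG_FILE.exists():
        with LOG_FILE.open(newline="") as f:
            rows = list(csv.DictReader(f))
    if any(r["directory"] == directory for r in rows):
        return False  # already submitted - don't annoy the editors
    rows.append({"directory": directory,
                 "submitted": date.today().isoformat(),
                 "listed": ""})  # mark this off by hand once listed
    with LOG_FILE.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)
    return True
```

Calling `log_submission("DMOZ")` a second time returns `False`, which is exactly the duplicate-submission guard the article asks for.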

Put the Full Power of Google to Work with 11 Google Power Search Tips

Google has many ways to help you find what you want easily. Here are 11 power search tips to make your Google experience easier and more fun. These search tips were found on the following Google web page: http://www.google.com/help/operators.html</p><p><b>1 - cache:</b></p><p>This will return the web page that Google has in its cache <ul> <li>cache:www.google.com ---- Returns the web page <li>cache:www.google.com Advanced Search --- Returns the web page with "Advanced Search" highlighted </ul></p><p><b>2 - link:</b></p><p>Lists web pages that have links to the target web page</p><p><ul> <li>link:www.google.com Returns links to www.google.com </ul></p><p><b>3 - related:</b></p><p>Finds web pages that are similar to the target page <ul> <li>related:www.google.com </ul></p><p><b>4 - info:</b></p><p>This will show information Google has about the web page</p><p><ul> <li>info:www.google.com </ul></p><p><b>5 - site:</b></p><p>Google will restrict its search to only find results from the target domain <ul> <li>help site:www.google.com Will find pages about help within www.google.com </ul></p><p><b>6 - allintitle:</b></p><p>Will only return search results where all the search terms appear in the title <ul> <li>allintitle:google search </ul></p><p><b>7 - intitle:</b></p><p>Will only return pages that contain that word in the title</p><p><ul> <li>intitle:search </ul></p><p><b>8 - allinurl:</b> Will only return search results where all the search terms appear in the URL <ul> <li>allinurl: google search </ul></p><p><b>9 - inurl:</b> Will only return pages that contain that word in the URL <ul> <li>inurl:search </ul></p><p><b>10 - stocks:</b></p><p>Google will treat everything that follows the search term stocks: as a stock ticker symbol and return a stock information page</p><p><ul> <li>stocks:ibm </ul></p><p><b>11 - define:</b></p><p>This will return the definition of a word or phrase <ul> <li>define:google </ul></p><p>These were just some of the things you can do with Google Search. 
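The operators above are just strings typed into the search box, so they can also be assembled programmatically. A small illustrative helper (this is not an official Google API; it only builds the query text you would paste into a search):

```python
def google_query(terms: str = "", **operators: str) -> str:
    """Build a Google query string from advanced operators.

    Keyword arguments map operator names to values, e.g.
    site="www.google.com" or intitle="search"; any plain search
    terms are appended after the operators.
    """
    parts = [f"{op}:{value}" for op, value in operators.items()]
    if terms:
        parts.append(terms)
    return " ".join(parts)
```

For example, `google_query("help", site="www.google.com")` produces `"site:www.google.com help"`, matching tip 5 above.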
If you would like to see some other fun things you can do with Google, like currency conversions, street maps, travel information, FedEx package tracking and too many others to list, visit http://www.google.com/help/features.html</p><p><b>About The Author:</b><br> Mike Makler has been marketing online since 2001, when he built an organization of over 100,000 members<br> <br> Get Mike's Newsletter:<br> <a target="_blank" href="http://ewguru.com/newsletter">http://ewguru.com/newsletter</a><br> <br> More Articles by Mike:<br> <a target="_blank" href="http://ewguru.com/tips">http://ewguru.com/tips</a><br> <br> Permission-Based E-Mail Marketing Methods<br> <a target="_blank" href="http://ewguru.com/hbiz/amazingoffer.html">http://ewguru.com/hbiz/amazingoffer.html</a><br> <br> Copyright © 2005-2006 Mike Makler, the Coolest Guy in the Universe

Sunday, 28 September 2008

Buying Textlinks: The Latest SEO Craze

With search engine algorithms changing seemingly daily, the quest to rank high in the search engines and stay there is proving to be quite the challenge for most webmasters. One of the more recently popular ways of achieving this is by buying text links on websites that have high PRs (PageRanks) on Google and that also rank well in the other major search engines. Is buying text link placement worth it?</p><p><b>Purpose</b><br> The primary purpose of buying a text link on a website that ranks better in search engines than yours is to receive a backlink to your site without having to reciprocate a link back (as reciprocation dilutes the quality of a link). This backlink counts as a &quot;vote&quot; for your website and, especially if it comes from a site that is credible to the search engines, helps your site establish credibility as well. For example, suppose a website has been online for three years and currently has a PR of 7, while your site is three months old and has a PR of 2. The three-year-old website places a link to your site on their homepage. As this site has history and is therefore established, this &quot;vote&quot; from a PR 7 website holds a lot of value. Compare this to a site linking to yours that is only a year old and holds a PR of 1 - it makes sense that you would want links coming from older sites that have high PRs. The higher the number and the better the quality of backlinks your website receives, generally the higher your rank on the SERPs (search engine results pages).</p><p>Not only are text links great for search engine purposes, but if placed well, they can actually drive traffic to your site. And whose website nowadays couldn't use more traffic? Enough said.</p><p>Those are the benefits of purchasing text links. But what are the disadvantages?</p><p>1) Cost - it's not uncommon to pay $100 monthly for a 3-word textlink on a PR 6 website. If you do choose to go this route, choose your text carefully and budget wisely.</p><p>2) Relevancy to search engines - 
if you're running a homemade toys website and have a popular online pharmacy website linking to yours, this won't be as relevant as, say, having Mattel linking to you. Remember that search engines are becoming more and more sophisticated, and they can tell whether two sites have complementary, competitive, or completely unrelated website content.</p><p>3) Limited link length - rarely are you given the option of selecting more than three words when purchasing links. You may have difficulty coming up with only three keywords relevant to your website, so this can often prove to be a difficult task.</p><p>4) Page is already populated with other text links - online auction sites that have sellers auctioning off website text links are notorious for selling textlinks on websites that already have 50 or more on the same page. Look for sites that limit the number of textlinks sold.</p><p>5) Search engine spam - your site linked on every single page of a 3000-page website using the same keywords and URL can be considered spam by a search engine. If you choose to purchase multiple text links on multiple websites, make sure your linking text varies.</p><p>The lesson is simply to be cautious. Using a textlink broker to find relevant and complementary websites to buy links from can prove to be highly beneficial - just do your research first. Would a $100 monthly textlink investment be better spent on a pay-per-click (PPC) campaign, or will the long-term benefits of buying a text link outweigh a temporary influx of visitors? Return on investment is key - whichever route will yield the highest ROI should be your ultimate determining factor.</p><p>Veronica Dubak is an SEO expert, internet entrepreneur, and the owner of the successful <a target="_new" href="http://www.surveybounty.com">free online paid surveys</a> directory, SurveyBounty.com. 
With a comprehensive listing of market research companies classified by region, and background information on the online survey industry, SurveyBounty.com is the legitimate source for online survey information.
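The ROI question posed at the end of the textlink article above is simple arithmetic once you estimate the revenue each channel brings in. A sketch with purely hypothetical figures (none of these numbers come from the article):

```python
def roi(monthly_revenue: float, monthly_cost: float) -> float:
    """Return on investment as a fraction: (gain - cost) / cost."""
    return (monthly_revenue - monthly_cost) / monthly_cost

# Hypothetical scenario: $100/month on a textlink vs. the same $100 on PPC.
textlink_roi = roi(monthly_revenue=250.0, monthly_cost=100.0)  # 1.5 = 150% ROI
ppc_roi = roi(monthly_revenue=180.0, monthly_cost=100.0)       # 0.8 = 80% ROI
better = "textlink" if textlink_roi > ppc_roi else "ppc"
```

With these made-up numbers the textlink wins; the point is only that the "ultimate determining factor" in the article is a one-line comparison once you have real revenue estimates.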

The Google Sandbox - A Frustrating Inevitability or a Golden Opportunity?

<h2>Introduction</h2></p><p>The Google Sandbox is a term applied to the phenomenon experienced by many new websites that delays a site's inclusion within the main Search Engine Results Pages (SERPs) of Google. Often new websites can find themselves confined to the 'Sandbox' for 6-9 months, during which time traffic to the site is severely compromised. The Google Sandbox is therefore usually seen as a frustrating inevitability by webmasters, and one for which there is no quick, easy solution.</p><p>My recent observations, however, have led me to believe that the time your website spends in the Google Sandbox should be seen as a golden opportunity rather than a frustrating inevitability.</p><p><h2>Into the Sandbox</h2></p><p>Many webmasters respond to their website's confinement to the Sandbox by spending endless hours checking the listings in Google's results pages for any sign of their website. Not only is this a waste of precious time, it serves only to increase the frustration caused by the Sandbox. Other webmasters more sensibly focus their time and effort on improving the Search Engine Optimisation (SEO) of their website in order to improve its rankings within other search engines such as MSN, Yahoo, Ask Jeeves and others. Although these may not be as widely used as the Google search engine, they don't have the same aging delay that Google's Sandbox imposes. Therefore, achieving good listings in these search engines early in a website's life can at least generate some traffic, and hopefully some sales, whilst the site is confined to the Sandbox.</p><p>Earlier this year the Tsunami disaster in Asia forced me to change the name of my web development business from Tsunami-Site-Design to Pixelwave Design. I had to register a new domain name, build a new website and start from scratch with my web promotion. 
The new site was an ideal candidate for confinement to the Google Sandbox, and sure enough, after an initial day or two of good rankings, the new site couldn't be found in a Google search for any of my keywords. A search for the business name did bring the new site up in first place, though, so I was safe in the knowledge that my site was contained within the Google database, but the lack of inclusion for my keywords suggested that my site had been confined to the Sandbox. The inclusion of my own personal site within the Google Sandbox gave me a great opportunity to monitor and track its progress.</p><p>The first thing I noticed was the high frequency with which a Googlebot spidered the pages of my site. The Googlebots were visiting a few times every day and visiting all the pages of the site. It struck me that if Google had gone to the trouble of writing and implementing the 'Sandbox' filter as part of their algorithm, and of regularly spidering the sites within the Sandbox, then the Sandbox wasn't simply an area into which new sites get put in order to delay their inclusion in the SERPs. Instead, it can be considered a probationary period for new websites during which Google pays close attention to a site's development.</p><p>This means that as far as Google is concerned, your website's time in the Sandbox may well be one of the most important times in its development. Rather than seeing this time as a frustrating inevitability, you should see it as an opportunity to really show Google how your website can shine. 
The Google Sandbox should be a time to make the most of all the attention your site is getting from Google and show the search engine what it wants to see.</p><p><h2>The Steps to Success</h2></p><p>Now that we have decided that Google is actually paying your site a good deal of attention whilst it is in the Sandbox, it is time to see how we can make the most of this golden opportunity.</p><p>The first thing to do is plan ahead and get your site into the Sandbox as soon as possible, thus attracting this attention from Google. Don't wait until your website is built in all its glory before registering a domain name and uploading your site. Instead, make sure the first thing you do is register your domain name, set up a hosting account and go live with one or two pages explaining what your site is about and what can be expected from it as it develops.</p><p>Next, get the Googlebot to visit these embryonic pages so that Google is aware of the new site as soon as possible. There is no need to submit your site manually or repeatedly to search engines; a couple of inbound links (IBLs) from other sites that are regularly spidered will be sufficient to get the Googlebot calling. Once the robots have paid an initial visit they will return.</p><p>You may be lucky and find that your site gets straight into the main SERPs, but if it is a new domain name and a new site then it is likely that before long you will find your new creation in the Sandbox, where it will receive a great deal of attention from the Googlebot. Now is the time to be proactive with your website development. Don't sit back patiently waiting to be released from the Sandbox; instead, make the most of your time in there and show Google the potential of your website.</p><p><h2>Content is King</h2></p><p>The first thing you need to do is continue the development of your website. Regularly add new pages packed full of relevant content to your site. As far as search engines are concerned, 'content is king'. 
Search engines exist to provide their users with links to content relevant to their search criteria; therefore they are always on the lookout for websites that contain plenty of good-quality, regularly updated, relevant content. Of course, good-quality, relevant content will also be beneficial to your website's human visitors, which at the end of the day is your number one priority. Add plenty of new, relevant content of interest to real visitors and the Googlebot will thank you for it.</p><p><h2>Linking Strategy</h2></p><p>Next you need to develop and implement a strategy for obtaining a network of inbound links. Google's algorithm relies heavily on link popularity, so it is likely that it pays attention to the number of IBLs your site gains whilst in the Sandbox. Don't sit back waiting for people to link to your site; get out there and be proactive. As always, relevance is the key, and a sensible linking strategy whilst in the Sandbox will be noticed by the Googlebot.</p><p>There are numerous ways to generate inbound links, and I have covered these in previous articles. However, things rarely stand still for long in the world of Search Engine Optimisation, and the latest research seems to suggest that simply getting huge numbers of IBLs whilst in the Sandbox may no longer be sufficient. It now seems that the rate of accumulation of IBLs may be important. Google is now thought to pay attention to the rate of accumulation of IBLs and expects to see them develop in what it considers a natural, organic manner. This means that suddenly gaining a huge number of inbound links may be frowned upon by Google. 
Instead, your linking strategy should be a sustained effort aimed at gaining new IBLs from relevant websites over a long period of time.</p><p><h2>Summary</h2></p><p>Although the Google Sandbox is still a frustrating inevitability and there is no quick-fix way to limit the amount of time a website spends confined to it, patiently sitting by waiting for this confinement to end is a waste of what could be a golden opportunity. Google pays a lot of attention to sites in the Sandbox, making confinement to the Sandbox an ideal time to really let your website shine. During this time give Google what it wants to see: regularly updated relevant content, lots of new pages and a sustained increase in the number of inbound links. Not only could this improve your site's ranking within the SERPs once its confinement to the Sandbox is over, but it will pay dividends for your site in general by providing its visitors with the information they require.</p><p>Alan Cole runs <a target="_new" href="http://www.pixelwave.co.uk">http://www.pixelwave.co.uk</a>, a one-person web design studio. His aim is to provide <a target="_new" href="http://www.pixelwave.co.uk">cost effective website design</a> production and maintenance by offering professional web solutions that stand out from the crowd. Increasingly his work involves website promotion and <a target="_new" href="http://www.pixelwave.co.uk/seo.shtml">Search Engine Optimisation</a> as well as training courses on all aspects of web design and promotion.

SEO Expert Guide - Black Hat SEO - Activities to avoid (part 8/10)

In parts 1 - 7, you learnt how to develop your proposition, identify your key words and optimize and promote (for free) your site and pages on the world's search engines. You were also introduced to our mythical Doug (who sells antique doors, door handles, knockers, door bells or pulls and fitting services) in Windsor in the UK.</p><p>There are some search engine optimization and promotion techniques I did not cover, as they are unethical. In this part of the guide, I outline these techniques, so you can recognize and avoid them!</p><p><h3>(a) Search Engine Ethics</h3>Borrowing from the wild west, white hat SEO generally refers to ethical techniques, whilst black hat SEO refers to unethical ones. Search engines are designed to help people find genuinely relevant results for the key words they enter, in a ranked order. Relevancy is a mixture of the "authority" of the site generally and the specific relevance of the page content to the search made. Anything which undermines this (i.e. by creating false impressions of authority or relevance) is unethical because it undermines the key purpose of search engines.</p><p>Black hat practitioners tend to see search engine optimization as a war, and search engines and SEOs as the enemy, to be beaten by means fair or foul. White hatters tend to view search engines as friends who can help them get business.</p><p><h3>(b) Hidden Page Text</h3>Black hatters create hidden text in page code (not intended for humans). At a simple level, this could be white text on a white background. The text is generally hidden because it does not fit with the rest of the page content but does help with search engine results. This by definition means that - as a human searcher - you are likely to be disappointed by the result when you land on this page.</p><p>In their <a target="_new" href="http://www.google.com/webmasters/guidelines.html">Guide for Webmasters</a>, Google implore you to "make pages for users, not for search engines. 
Don't deceive your users, or present different content to search engines than you display to users." The guide goes on to specifically recommend you "avoid hidden text or hidden links". If you want to avoid being blacklisted by Google, then I would recommend you pay attention to this advice.</p><p><h3>(c) Buying Inbound Links</h3>In their <a target="_new" href="http://www.google.com/webmasters/guidelines.html">Guide for Webmasters</a>, Google ask you to "avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, 'Does this help my users? Would I do this if search engines didn't exist?' ".</p><p>You can find on the web links from PR8 sites on sale for $200. From our earlier exploration of PageRank, you'll understand why such a high price can be supported. As you can imagine, Google and others frown on this activity, as it undermines the whole principle of democracy that underpins PageRank. Buying votes? Unethical!</p><p>The consensus in forums is that Google look out for unnatural linking patterns, including substantial cross-linking, sharp growth in backlink numbers and the same anchor text in most links. I would advise you to avoid this sort of activity altogether!</p><p><h3>(d) Use of Link Farms and IBLNs</h3>In their <a target="_new" href="http://www.google.com/webmasters/guidelines.html">Guide for Webmasters</a>, Google say "don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or 'bad neighborhoods' on the web as your own ranking may be affected adversely by those links."</p><p>In practice, Google identify 'bad neighbourhoods' by devaluing back-links from the same IP subnet. 
Where a site is simply a link farm site (one that lists loads of links to other sites, in exchange for links back or money), Google will eventually identify it as a 'bad neighbourhood' and remove the links from its index.</p><p>Independent Back Linking Networks (IBLNs) are networks of sites that all directly or indirectly link back to your site in such a way as to promote it through the search engine rankings. The way IBLNs get around Google's IP monitoring is by using a completely different web-hosting plan for every site you want to link back directly to you.</p><p>This is very time-consuming and will cost you a lot of money. It is also not fool-proof and (if detected) can lead to Google simply wiping out all the direct referrers from their index (the sites they find flagrantly built simply to link to your main site) or, worse, dropping your entire IBLN - including the main site you were trying to optimise for. Don't be daft - keep it clean! <h3>(e) Use of Cloaking Pages or Sneaky Redirects</h3>In their <a target="_new" href="http://www.google.com/webmasters/guidelines.html">Guide for Webmasters</a>, Google recommend you "avoid 'doorway' pages created just for search engines, or other 'cookie cutter' approaches such as affiliate programs with little or no original content."</p><p>When Doug reads this, he begins to understand why doorknockers.com fails to rank higher in the search engines. That domain simply redirects to a different site (with a regular business name) which also fails to rank well in Google. 
This poor business-owner has clearly become an unwitting and almost certainly innocent victim of Google's policy to catch out black hatters.</p><p>He also understands why having his content on antique-door-knockers.com will be preferable to redirecting people to a domain based on his company name (Doug Chalmers Limited).</p><p>Next we turn to tools you can use to monitor your ongoing optimization effectiveness...</p><p><h4>Navigate the guide</h4>Previous: <a target="_new" href="http://viney.com/search-engine-optimization-expert/2005/03/seo-expert-guide-paid-site-promotion.html">SEO Expert Guide - Paid Site Promotion (Marketing) (part 7/10)</a></p><p>Next: <a target="_new" href="http://viney.com/search-engine-optimization-expert/2005/03/seo-expert-guide-ongoing-monitoring-of.html">SEO Expert Guide - Ongoing Monitoring of Results (part 9/10)</a></p><p><b>About the author:</b></p><p>David Viney (<a href="mailto:david@viney.com">david@viney.com</a>) is the author of the Intranet Portal Guide; 31 pages of advice, tools and downloads covering the period before, during and after an Intranet Portal implementation.</p><p>Read the guide at <a target="_new" href="http://www.viney.com/DFV/intranet_portal_guide">http://www.viney.com/DFV/intranet_portal_guide</a> or the Intranet Watch Blog at <a target="_new" href="http://www.viney.com/intranet_watch">http://www.viney.com/intranet_watch</a>.
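The "same IP subnet" detection mentioned in part (d) is usually discussed in forums as a /24 (class C) comparison. Whether Google really works this way is community speculation, so treat the following purely as a sketch of the idea:

```python
import ipaddress

def same_class_c(ip_a: str, ip_b: str) -> bool:
    """True if two IPv4 addresses fall in the same /24 (class C) subnet."""
    net_a = ipaddress.ip_network(ip_a + "/24", strict=False)
    return ipaddress.ip_address(ip_b) in net_a

# Two sites hosted on the same /24 - cross-links between them may be
# devalued; sites on different subnets would pass this particular check.
same_class_c("192.0.2.10", "192.0.2.200")    # same subnet
same_class_c("192.0.2.10", "198.51.100.7")   # different subnets
```

This is why IBLN operators pay for separate hosting plans per site: it moves each site onto a different subnet. As the article says, it is expensive and still not fool-proof.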

Saturday, 27 September 2008

Search Engine Saturation Tool - A Must-Have SEO Tool

<b>Search engines</b> have become the soul of the Internet. They provide a means of aggregating, correlating, indexing and categorizing the vast amounts of content in the wild world of the Internet. They have gotten more complex over the years, with better algorithms to serve the folks who want to find something, really find something. They have become extremely adept at detecting duplicates and hidden text, and at catching and punishing search engine spammers. Every webmaster should take the utmost care over what gets listed in the search engines. Things that were employed earlier to spam the search engines into a high ranking will come back to haunt you if you don't do the garbage disposal. In this article we will employ one tool that makes the SEO's or webmaster's task much simpler. This article provides you with some SEO tips on how to use the saturation tool.</p><p><b>Search engine saturation tools</b> provide a snapshot of what is currently indexed by, or known to, the popular search engines. They give you a way to understand which areas of your website are indexed and which are not. Alternatively, they tell you whether something you did not want indexed has in fact been indexed, or is safe from the eyes of the dragon. This tool shows exactly the weakest portions of your website. The next concept you need to understand is saturation density. This is calculated as the percentage of your website's pages that show up in the saturation tool results. The percentage should exclude the pages that you wanted to be excluded. Additionally, you should exclude the image files and object files. Once you take inventory of the file list you want to target and the number of files you got indexed, you can get your personal saturation indicator. This personal target should obviously be close to 100%.</p><p>The next factor you need to consider is the saturation density of your competitors. Just look at the saturation index of your competition and compare it against yours. 
This will give you a pretty good idea of the probability of someone finding your web pages over theirs. If your competition has 1000 pages indexed, each for a unique keyword on top of the common keywords, they are going to get the lion's share of the traffic. Ultimately this means PageRank. There is a lot more you can learn about your competitor this way than you would by visiting their website. For example, your site might be very rich in content and the competitor may seem to be low in content, but they still rank higher. If you look closer, they may have a flat-file-based vibrant forum that gets indexed in the search engines, giving them higher relevancy than yours. This is just one example I can reveal. There is tons of other such goldmine data that can be collected by simply using the saturation tool.</p><p>There are many search engine optimization websites and companies who offer this tool for free over the web. The resource box contains one such website. The saturation tool is typically taken for granted. The average webmaster just discounts what is indexed and what the competition is indexed on. Looking at just the top 10 results and analysis of the same won't suffice. Dig down deeper and you will be amazed at what you can find out about the competition using these tools. Also remember what I said in the first paragraph: there is stuff that you don't want your competition to know about, like a simple customer list that gets stored somewhere because you used an unprotected flat file system. Everybody is learning; your competitor is also reading this article, and they may have already started using the saturation tool to spy on you. This tool is great as it enables you to be a good, responsible business. Happy optimization!</p><p>To get more <a target="_new" href="http://www.web-inspect.com">SEO Tips</a> like this please visit web-inspect.com. 
The author does freelance work for many search engine websites and can be reached through the no-fee <a target="_new" href="http://www.freelancefree.com">Freelance</a> website - freelancefree.com. The author recommends the following <a target="_new" href="http://www.tutorialized.com/tutorials/Photoshop/1">Photoshop Tutorials</a> website for your further development - tutorialized.com.
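The saturation density described in the article above reduces to a percentage: indexed pages divided by the pages you actually want indexed (after excluding images, object files and deliberately excluded pages). A minimal sketch with made-up page counts:

```python
def saturation_density(indexed_pages: int, target_pages: int) -> float:
    """Percentage of your target pages that a search engine has indexed.

    target_pages should already exclude image files, object files and
    pages you deliberately kept out of the index.
    """
    if target_pages <= 0:
        raise ValueError("target_pages must be positive")
    return 100.0 * indexed_pages / target_pages

# Hypothetical comparison against a competitor:
mine = saturation_density(indexed_pages=180, target_pages=200)        # 90.0%
competitor = saturation_density(indexed_pages=950, target_pages=1000)  # 95.0%
```

In this made-up example the competitor is both better saturated and has far more unique-keyword pages indexed, which is exactly the "lion's share of the traffic" situation the article warns about.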

The Secret Benefit Of Search Engine Optimisation: Increased Usability

A higher search ranking is what many website owners dream of. What they don't realise is that by optimising their site for the search engines, if done correctly, they can also optimise it for their site visitors.</p><p>Ultimately this means more people finding your website, and increased sales and lead generation. But are search engine optimisation and usability compatible? Aren't there trade-offs that need to be made between giving search engines what they want and giving people what they want? Read on and find out (although I'm sure you can guess the answer!)...</p><p>1. Keyword research carried out</p><p>Before you even begin building your website, you should carry out keyword research to identify which keyword phrases your site should target. Using publicly available tools such as Wordtracker (http://www.wordtracker.com), you can discover which keywords are searched for the most frequently and then specifically target those phrases.</p><p>Doing keyword research is also crucial for your site's usability. By using the same keywords in your website that web users are searching for in search engines, you'll literally be speaking the same language as your site visitors.</p><p>For example, you might decide to target the phrase &quot;sell toys&quot;, as your website does in fact sell toys. Keyword research would undoubtedly show you that web users are actually searching for &quot;buy toys&quot; (think about it - have you ever searched using the word &quot;sell&quot; when you want to buy something?). By placing the phrase &quot;buy toys&quot; on to the pages of your website, you'll be using the same words as your site visitors and they'll be able to find what they're looking for more easily.</p><p>2. 200 word minimum per page</p><p>Quite simply, search engines love content - the more content there is on a page, the easier it is for search engines to work out what the page is actually about. 
Search engines may struggle to work out the point of a web page with fewer than 200 words, ultimately penalising that page in the search rankings.</p><p>In terms of usability, it's also good to avoid pages with very little content. A page with fewer than 200 words is unlikely to contain a large amount of information, so site visitors will undoubtedly need to click elsewhere to find more detailed information. Don't be afraid to put a reasonably large amount of information on to a page. Web users generally don't mind scrolling down anymore, and provided the page provides mechanisms to aid scanning (such as employing sub-headings - see point 6 below) it shouldn't be too difficult for site visitors to locate the information that they're after.</p><p>3. 100kb maximum HTML size</p><p>If 200 words is the minimum page content size, then 100kb is the maximum, at least in terms of HTML file size. Anything more than this and search engines may give up on the page as it's simply too big for them.</p><p>A 100kb HTML file will take around 20 seconds to download on a 56k dial-up modem, used by three in four UK web users as of March 2004 (source: http://www.statistics.gov.uk/pdfdir/intc0504.pdf). Add on the time it takes for all the other parts of the page to download, such as images and JavaScript files, and you're looking at a highly un-user-friendly download time!</p><p>4. CSS used for layout</p><p>The website of Juicy Studios (http://www.juicystudio.com) saw a six-fold increase in site visitors after switching from a table-based layout to a CSS layout. Search engines prefer CSS-based sites and are likely to score them higher in the search rankings because:</p><p>- The code is cleaner and therefore more accessible to search engines</p><p>- Important content can be placed at the top of the HTML document</p><p>- There is a greater density of content compared to coding</p><p>Using CSS for layout is also highly advantageous for usability, as it leads to significantly faster download times.</p><p>5. 
Meaningful page title</p><p>If you know anything about search engine optimisation you'll know that search engines place more importance on the page title than any other attribute on the page. If the title adequately describes the content of that page then search engines will be able to more accurately guess what that page is about.</p><p>A meaningful page title also helps site visitors work out where they are, both within the site and the web as a whole. The page title is the first thing that loads up, often quite a few seconds before the content, so a descriptive, keyword-rich page title can be a real aid to help users orientate themselves.</p><p>6. Headings and sub-headings used</p><p>Search engines assume that the text contained in heading tags is more important than the rest of the document text, as headings (in theory at least) summarise the content immediately below them.</p><p>Headings are also incredibly useful for your human site visitors, as they greatly aid scanning. Generally speaking, we don't read on the web, we scan, looking for the information that we're after. By breaking up page sections with sub-headings that effectively describe the content beneath them, scanning becomes significantly easier.</p><p>Do be sure not to abuse heading tags though. The more text you have contained in heading tags within the page, the less importance search engines assign to them.</p><p>7. Opening paragraph describes page content</p><p>We've already established that search engines love content, but they especially love the first 25 words or so on each page. By providing an opening paragraph that adequately describes the content of the rest of the page (or the site if it's the homepage), you should be able to include your important keyword phrases in this crucial area.</p><p>As web users, whenever we arrive at a web page the first thing we need to know is whether this page has the information that we're after. 
A great way to find this out is to scan through the first paragraph, which, if it sufficiently describes the page content, should help us out.</p><p>8. Descriptive link text</p><p>Search engines place a lot of importance on link text. They assume that link text will be descriptive of its destination and as such examine link text for all links pointing to any page. If all the links pointing to a page about widgets say 'click here', search engines can't gain any information about that page without visiting it. If, on the other hand, all the links say 'widgets', then search engines can easily guess what that page is about.</p><p>One of the best examples of this in action is the search term 'miserable failure'. So many people have linked to George Bush's bio using this phrase as the link text that now, when 'miserable failure' is searched for in Google, George Bush's bio appears at the top of the search rankings!</p><p>As web users, we don't generally read web pages word-for-word - we scan them looking for the information that we're after. When you scan through text you can't take any meaning from the words 'click here'. Link text that effectively describes its destination is far easier to scan, and you can understand the destination of the link without having to read its surrounding words.</p><p>9. Frames avoided</p><p>Frames are quite an old-school technique, and although they aren't as commonplace as they once were, they do still rear their ugly heads from time to time. Using frames is one of the worst possible things you can do for your search engine ranking, as most search engines can't follow links between frames.</p><p>Even if a search engine does index your pages and web users find you through a search engine, they'll be taken to one of the pages within the frame. 
This page will probably be a content page with no navigation (navigation is normally contained in a separate frame) and therefore no way to navigate to any other page on the site!</p><p>Frames are also disadvantageous for usability as they can cause problems with the back button, printing, history and bookmarking. Put simply, say no to frames!</p><p>10. Quality content provided</p><p>This may seem like a strange characteristic of a search engine optimised website, but it's actually crucial. Search engines, in addition to looking at page content, look at the number of links pointing in to web pages. The more inbound links a website has, all other things being equal, the higher in the search rankings it will appear.</p><p>By providing creative, unique and regularly updated content on your website, webmasters will want to link to you as doing so will add value to their site visitors. You will also be adding value to your site visitors.</p><p>Conclusion</p><p>Optimising your website for both search engines and people needn't be a trade-off. With this much overlap between the two areas, you should easily be able to have a website that web users can find in the search engines, and when they do find it, they can find what they're looking for quickly and efficiently.</p><p>This article was written by Trenton Moss. He's crazy about web usability and accessibility - so crazy that he went and started his own web usability and accessibility consultancy ( Webcredible - <a target="_new" href="http://www.webcredible.co.uk">http://www.webcredible.co.uk</a> ) to help make the Internet a better place for everyone.
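The checklist in the points above lends itself to a quick automated audit. The sketch below is illustrative only: the thresholds (200 words, 100kb) come from the article, but the function name and the crude tag-stripping are my own assumptions, not anything the author ships.

```python
import re

def audit_page(html: str, min_words: int = 200, max_bytes: int = 100_000) -> list:
    """Rough audit of one page against the article's checklist."""
    issues = []
    if len(html.encode("utf-8")) > max_bytes:
        issues.append("HTML larger than 100kb")
    text = re.sub(r"<[^>]+>", " ", html)  # crude tag-stripping
    words = len(text.split())
    if words < min_words:
        issues.append(f"only {words} words (minimum ~{min_words})")
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    if not title or not title.group(1).strip():
        issues.append("missing or empty <title>")
    if not re.search(r"<h[1-6][^>]*>", html, re.I):
        issues.append("no heading tags")
    return issues

page = "<html><head><title></title></head><body><p>short page</p></body></html>"
print(audit_page(page))
```

A real audit would also check CSS-versus-table layout and frame usage; those are harder to detect reliably, so they are left out of this sketch.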

Do the Robot!

Everyone should realize that the search engines (sponsored ads aside) are not tools for advertisement; they are meant to be tools for everyday web users. Users who search the web are looking for information, that's it. They may want information on how to buy something, or they may simply want to know what the weather will be later that day, but the fact is they want information, and search engines are out there to help them find it!</p><p>Search engines (SEs) work as follows:</p><p>1. They index sites. They do this by following the "web" (after all, the Internet is one big web). While indexing one page, they find a link to your new website, and follow that link to your site. Because your site is new, they index you. It is not quite that simple, but that is the basic idea.</p><p>2. They return to index more of your site. The rate of indexing depends on how often your site updates. If the search engines come to a site that hasn't been updated since the last time they visited, the next visit won't happen until a while later. Similarly, if you update far too often (this is mostly a problem with RSS feeds), as in more than a few times a minute, you could be considered a spammer and given lower rankings.</p><p>3. Once you have been indexed, your pages will be included in searches. When a search term is entered, the search engines scan their indexed pages first for keywords (the search term entered). From there they determine the number of pages with those keywords, and their density. Too few keywords, or far too many, will result in lower rankings.</p><p>4. Next they will check your link popularity: whether sites are linking to your site with your keywords as their "anchor text" (if you click on "make money" and it brings you to some website, "make money" is the anchor text). The number of link-ins, and the relevance of those link-ins, all factor into your "link popularity".</p><p>5. 
Finally, they will rank you and display your site based on your keywords, link popularity, frequency of updates, and general size of your site.</p><p><a target="_new" href="http://www.emoneyreport.com">http://www.emoneyreport.com</a> - How I make my money<br> <a target="_new" href="http://www.therealincome.com">http://www.therealincome.com</a> - How I market to make my money.
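The five steps above can be sketched as a toy ranking function. Everything here is invented for illustration (real engines use far more signals, and the weights are arbitrary), but it shows how keyword density and anchor-text link popularity might combine.

```python
def keyword_score(page_text: str, query: str) -> float:
    """Toy keyword score (step 3): density helps, but stuffing zeroes it out."""
    words = page_text.lower().split()
    if not words:
        return 0.0
    density = words.count(query.lower()) / len(words)
    if density == 0 or density > 0.2:  # too few, or far too many, keywords
        return 0.0
    return density

def rank(pages: dict, backlinks: dict, query: str) -> list:
    """Steps 3-5: score each page, add anchor-text link popularity, sort."""
    scored = []
    for url, text in pages.items():
        score = keyword_score(text, query)
        # step 4: inbound links whose anchor text contains the query count extra
        score += 0.01 * sum(1 for anchor in backlinks.get(url, [])
                            if query.lower() in anchor.lower())
        scored.append((score, url))
    return [url for score, url in sorted(scored, reverse=True) if score > 0]

pages = {
    "a.com": "widgets and more widgets for sale among many other words here today",
    "b.com": "unrelated page about gardening tips",
}
backlinks = {"a.com": ["widgets", "buy widgets"], "b.com": []}
print(rank(pages, backlinks, "widgets"))  # ['a.com']
```

Note how the density cut-off models the article's warning that far too many keywords hurt as much as too few.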

Tread Towards A Successful "Internet Research"

The Internet is a terrific resource containing billions of web pages dedicated to thousands of topics. Since the amount of information available on the Internet is so vast and mind-boggling, you may feel lost.</p><p>Your expectations of &quot;The Information Superhighway&quot; will crash if you proceed with the popular view that exaggerates the ease of Internet usage. What is required is a moderate, balanced approach that keeps one's head on one's shoulders. Approaching the net should be like any other research endeavor, i.e. adopting a formal strategy to maximize results. &quot;Motivation&quot; is the key word here. If you do not have a serious research goal, you cannot dig far.</p><p>To make the search a more meaningful and profitable exercise, you should know where to search and how to search. Be clear about what you are looking for. Be specific in the identification and use of keywords. As an advanced Internet researcher, you should always use the advanced services when available on a search engine, a directory, or a look-up. Have a list of Boolean search strings ready before proceeding to an advanced search in a search engine. For preparing Boolean expressions the following may be used as required:</p><p></p><p>&quot;AND&quot; tells the search engine to find both terms on the same page.</p><p>&quot;OR&quot; is used to find one term or the other. This is very useful when the same term may appear in two different ways.</p><p>&quot;NOT&quot; tells the search engine to look for web pages with the first term but not the second.</p><p>&quot;NEAR&quot; instructs the search engine to return only web pages in which the terms are near each other.</p><p></p><p>Categories of search tools available are:</p><p></p><p>Search Engines - They use keywords or phrases to search the Internet. Many of them allow you to enter questions rather than simply a few search terms. 
Most search engines have catalogues that sort a limited number of sites by topic. Some frequently used search engines are:</p><p><ul></p><p><li>Alta Vista</p><p><li>AOL</p><p><li>Excite</p><p><li>Google</p><p><li>Hotbot</p><p><li>Lycos</p><p><li>MSN Search</p><p><li>Northern Light</p><p><li>Overture</p><p><li>Web Crawler</p><p></ul></p><p>Meta Search Engines - They quickly and superficially search several individual search engines at once and return results compiled into a convenient format. They only catch about 10% of the search results in any of the search engines they visit. Some examples of meta search engines are:</p><p><ul></p><p><li>Ask Jeeves</p><p><li>Copernic</p><p><li>Meta Crawler</p><p></ul></p><p>Subject Directories - They are collections of web sites picked by editors (sometimes experts in a subject) and organized into hierarchical subject categories. They are often carefully evaluated and kept up to date. Some widely used directories are:</p><p><ul></p><p><li>About.com</p><p><li>BigHub</p><p><li>DMOZ</p><p><li>Invisible Web</p><p><li>Yahoo</p><p></ul></p><p></p><p>Search engines are wonderful, but the problem is that none of them has indexed even half of the Internet. Each search engine indexes the web differently and searches the web differently, and thus returns very different results. This means that if you enter a search into AltaVista and get zero results, this may not be the case if you go to Northern Light or Google. A good web researcher must try a few search engines before exhausting the search.</p><p>A glance at the FAQs is a must before setting out to search. As each search engine is different, it is essential to set your doubts at rest before you proceed.</p><p>X-raying the websites - At times you will not be able to access a particular page on a website because it is not linked from any of the pages on that URL. A good researcher knows that just because you are not given access to a page does not mean you can't still get in. 
If you can't get in through the front door of the website, then try the back door of the server by using the advanced function on a search engine. Even if a page is not linked, the search engines might have indexed it. All you need to do is go to the advanced search function on AltaVista and type in host:XYZ.com AND the words you expect to find on your page.</p><p>Flip Searching - Flipping is a technique in which you look for pages containing links to specified URLs. Use linkdomain:ABC.com to tell the search engine to locate all the pages that link to the &quot;ABC URL&quot;.</p><p>Every great researcher should have an organized library of resources. Whether this comes in the form of organized bookmarks and favorites or a notebook, it is imperative to track your research.</p><p>The Internet is an ever-changing medium. What worked yesterday may not work today. Therefore, a good researcher should always have &quot;Ever Onwards&quot; as the motto and should not give up the search easily. With billions of pages on the Internet, you can turn all odds in your favor and succeed in finding what you are looking for.</p><p>About The Author</p><p>Jagmohan Saluja is a Virtual Professional providing support services to small businesses. To know more about him visit <a href="http://www.internet-researcher.com." target="_new">http://www.internet-researcher.com.</a></p><p><a href="mailto:indianva@internet-researcher.com">indianva@internet-researcher.com</a>
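The four Boolean operators described above can be sketched as a tiny evaluator. The 10-word window for NEAR is an assumption (engines of the era varied on this), and the function name is illustrative.

```python
def matches(page_text: str, query: tuple) -> bool:
    """Evaluate a tiny Boolean query against a page.

    query is (op, term1, term2) with op in AND/OR/NOT/NEAR, mirroring
    the operators above. NEAR uses a 10-word window (an assumption).
    """
    op, t1, t2 = query
    words = page_text.lower().split()

    def positions(term):
        return [i for i, w in enumerate(words) if w == term.lower()]

    p1, p2 = positions(t1), positions(t2)
    if op == "AND":
        return bool(p1) and bool(p2)
    if op == "OR":
        return bool(p1) or bool(p2)
    if op == "NOT":
        return bool(p1) and not p2
    if op == "NEAR":
        return any(abs(i - j) <= 10 for i in p1 for j in p2)
    raise ValueError(f"unknown operator: {op}")

doc = "internet research requires a formal strategy and clear keywords"
print(matches(doc, ("AND", "internet", "research")))  # True
print(matches(doc, ("NOT", "internet", "google")))    # True
```

A real engine would parse nested expressions and stem the terms; this sketch only shows the semantics of the four operators.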

วันศุกร์ที่ 26 กันยายน พ.ศ. 2551

Search Engine Optimization Tip: Don't Buy Anything Before You Learn More!

"content-type" content="text/html;charset=utf-8">
<TITLE>302 Moved

Arrogant Overture Placing Greed Ahead Of Their Customers' Needs

"content-type" content="text/html;charset=utf-8">
<TITLE>302 Moved

วันพฤหัสบดีที่ 25 กันยายน พ.ศ. 2551

Link Popularity --- Its Role and Importance In Getting Top Search Engine Rankings

"content-type" content="text/html;charset=utf-8">
<TITLE>302 Moved

วันพุธที่ 24 กันยายน พ.ศ. 2551

Are You Getting Nuked By Google Lately?

Since the last Google update, there have been many cases and examples of what I call the Google "Nuke Bot". Have you visited a favorite website lately, only to realize it has been nuked by Google? More and more we see Internet marketing / SEO firms get nuked by Google - eliminated from its database. I will not mention any names, because I am sure the owners of the sites in question already know and are ashamed of this development. Word has gone out that WebPosition Gold users risk being banned from Google for the automated queries the software sends to Google, and we are noticing other sites being taken to account. For more information on WebPosition Gold see: http://www.socialpatterns.com/search-engine-marketing/webposition-banned/ So, since we already know about WebPosition Gold, what other sites have been seriously affected? Which sites have been nuked? It seems that some domains that had thousands of links in Google have been hit hardest. It appears Google is fighting spam tactics, automated submission tactics, and every related unethical SEO practice. Is Google making an effort to enforce its webmaster guidelines? Will it reach a point where, if we do not follow Google's policies, we cannot succeed on the Web? The idea sounds ridiculous, but I am almost afraid to believe it could happen! What about websites that still hide text by matching it to the background color, use hidden div layers, and run mirror pages? Why has Google not attacked those issues first? You could almost make the assumption that Google is nuking websites that send automated ranking and link-popularity queries to its database; this may be a major effort to ease the strain on its search servers and free up some memory. How does having your site nuked by Google affect its credibility? It is an event that could ultimately destroy your online reputation. People who have come to rely on your expertise, seeing your rankings fail on Google, may never look at your company the same way again. People may think to themselves: I do not want to get nuked like they did. 
How can you tell you have been nuked? * Your Google PageRank is 0-2/10 when it used to be at least 5/10. * You now have zero backlinks showing in Google. * You now have zero pages of your site listed in Google (site:www.yoursite.com). * Google's cache of your site is out of date. For sites that are newer, do not confuse this nuking process with your site's own online evolution - achieving and maintaining a high PR takes a lot of work. Once nuked, will Google ever let you back in? The question I have for sites that have been nuked: can you still see Google in your site's log statistics? If so, I wonder whether Google is still keeping an eye on you and watching your every move. In conclusion: stay away from software that automatically queries Google. You do not need to check your link popularity three times a week, and most importantly, do not check your search engine rankings twice a day. Simply promote your site and measure its success through your internal web statistics and monthly profits. Google cannot be the only measure of your website's success; stop letting it dictate your daily actions. Now go get your success on the Web! About the author: Martin Lemieux is the president of the Smartads advertising network, which helps companies like yours increase business online and offline. Smartads Internet Marketing: http://www.smartads.info
Smartads Canada: http://www.smartads.ca (c) 2005 Smartads advertising network - reprints accepted. The author resource box must be included!
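The warning signs listed above can be rolled into a simple checklist function. The thresholds mirror the article's rules of thumb (PR dropped from 5+ to 0-2, zero backlinks, zero indexed pages); they are the author's heuristics, not anything Google publishes, and the function name is my own.

```python
def looks_nuked(pagerank: int, previous_pagerank: int,
                backlinks_in_google: int, pages_indexed: int) -> bool:
    """Heuristic check for the article's warning signs of a Google penalty."""
    # PR collapsed from a healthy 5+ down to the 0-2 range
    pr_collapsed = previous_pagerank >= 5 and pagerank <= 2
    # zero backlinks and zero pages under a site: query complete the picture
    return pr_collapsed and backlinks_in_google == 0 and pages_indexed == 0

print(looks_nuked(pagerank=1, previous_pagerank=6,
                  backlinks_in_google=0, pages_indexed=0))  # True
```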

How to Avoid the Google Duplicate Content Filter?

More and more webmasters are building websites with publicly available content (data feeds, news, articles). This leads to many sites with similar content on the Internet. In the case of sites built on news or data feeds, you can even find sites that match each other 100% (design excluded). Multiple copies of the same content do a search engine no good, so Google apparently decided to weed out some of that content to deliver cleaner and better search results. Plain copies of entire websites were hit hardest: if a webmaster was publishing the same content on several domains, all the domains in question were removed from Google's index. Many websites based on affiliate programs suddenly took a big hit in lost traffic from Google.com. Shortly afterwards, webmasters on various forums saw the same complaints and stories over and over, and 1 + 1 produced a clear picture of the situation: a duplicate content filter was being applied. Duplicate content is not always bad and will always exist in one form or another. News sites are the best example of similar content; nobody expects them to be removed from Google's index. So how can webmasters avoid the duplicate content filter? There are several things webmasters can do when using duplicate content of any sort and still create unique pages with it. Some of these options are explained here. 1) Add unique content to pages with similar content. On pages where duplicate content is used, add exclusive content - and not just a few words or a link / navigation menu. If you (the webmaster) can add 15% - 30% unique content to a page, the proportion of duplicate content relative to the whole page drops. This reduces the risk of having the page flagged as duplicate content. 2) Randomize content. Ever seen those "Quote of the day" snippets on some sites? They add a random quote of the day to a page on each load. 
Whenever you return, the page will be different. Such scripts can be used for much more than showing a quote of the day; with a few changes to the code and a little creativity, a webmaster can use a script to make pages appear constantly updated and always different. This can be an excellent tool to keep Google's duplicate content filter at bay. 3) Unique content. Yes, unique content is still king. Sometimes you cannot avoid duplicate content at all - that is fine - but add unique content to your site, too. If the overall ratio of unique content to duplicate content is well balanced, the chances of the duplicate content filter hitting your site are much lower. Personally, I recommend that a site offer at least 30% unique content (I admit it - I sometimes find it hard to reach my own benchmark, but I try). Does this guarantee that your site stays in the Google index? I do not know. To be successful, a website should be unique. Unique content is what draws visitors to a site. Everything else can be found elsewhere, and visitors have no reason to visit a particular website if they can get the same thing elsewhere.
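Techniques 1-3 above boil down to two easily checked quantities: whether a page changes between loads, and what share of it is unique. A minimal sketch follows, with the 30% threshold taken from the article and all names invented for illustration.

```python
import random

QUOTES = [
    "Content is king.",
    "Unique content draws visitors.",
    "What worked yesterday may not work today.",
]

def quote_of_the_day() -> str:
    """Random quote, so the page differs on every load (technique 2)."""
    return random.choice(QUOTES)

def unique_ratio(unique_words: int, total_words: int) -> float:
    """Share of a page that is unique content (techniques 1 and 3)."""
    return unique_words / total_words if total_words else 0.0

# The article suggests keeping at least ~30% of a page unique.
page_ok = unique_ratio(unique_words=350, total_words=1000) >= 0.30
print(page_ok)  # True
```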
About the Author
Christoph Puetz is a successful entrepreneur and international book author. Websites run by Christoph include http://www.realcreditrepair.info (credit problems and help) and http://www.highlandsranch.us (Highlands Ranch, Colorado). SEO and PPC services by the sponsor can be found at http://www.netservicesusa.com - U.S. Net Services LLC. Note: This article may be republished by anyone as long as the resource box (About the Author) is included on the web page with the links intact and clickable. The last paragraph and the source information in the author resource box may not be omitted.

วันอังคารที่ 23 กันยายน พ.ศ. 2551

How to Succeed with the Search Engines

Cold, hard facts: one of the most important factors in the success of your business is learning to rank high on the search engines. It has been said that search engines can account for up to 95% of all the traffic to your site! If you do not take advantage of the major engines, you could be giving up almost all of your targeted traffic - and I do not know any business that can afford to throw away that many visitors. Is your site ready? There are many factors to consider before you submit your site. Is each page optimized for maximum position? Each page should be reviewed separately against the following checklist: keywords in the page title; keywords in the meta tags; keywords in the heading tags; ALT tags on all images; keyword-rich content; keywords as close to the top of each page as possible; keywords in links to other pages; keywords in outgoing links. [Title of the page] It is important to place keywords as close to the start of your title as possible. Search engines usually display your page title as part of the description that appears when your site shows up in search results, so placing keywords at the front of the title can be very beneficial. For example, my page title is "Affiliate Programs Directory @ Smoke Soft Inc.". This means my site will do well in searches for "affiliate programs" and "affiliate programs directory". Try to include as many keywords in your page title as you sensibly can. [Meta tags] Although the importance of meta tags diminishes every day, they are easy to create and some search engines still factor them in, so it is best to have them on your pages. Search engines that do not take meta tags into account will simply ignore them. [Keywords in the heading tags] This means creating a separate heading for each page containing the keywords you are targeting on that page. 
For example, the heading of my home page is a centered heading containing "Affiliate Programs Directory", which displays on my web page as: Affiliate Programs Directory. This will greatly improve your position in the rankings, so take care to prepare every page in this regard. [ALT tags on all images] Another trick is to add an ALT attribute to every image on every page of your site. In effect, you can place a keyword in each image on your site! Some search engines use the text in image ALT attributes when examining keyword relevance, adding more content in your favor. [Keyword-rich content] There is no excuse not to fill your site with keyword-rich, quality content - it is an important factor in how you rank in the search engines. The more of your keywords the search engines find, the better your website's ranking. Period. I have found that one of the best ways to do this is with articles like this one. Imagine for a moment that I used "search engines" or "keywords" as two of my key phrases on a page; now skim through this article and see how often those two phrases appear. As you can see, the article would be a great asset in determining how my site ranks - and it is just an article! If you cannot write your own articles, there are many places where you can get articles from other authors to place on your site. One such place is http://www.smokesoft.net/Articles.html [Keywords as close to the beginning of each page as possible] This is one of the easiest and most effective ways to strengthen your position. On each page, make sure the first visible text is a keyword - search engines love that! If you have not put keywords as the first visible text on your page, consider changing or rearranging it to allow that. It is worth the time and effort, and you will almost certainly see an immediate improvement in your search engine rankings. [Keywords in links to other pages] You should try to include keywords in your links whenever possible. 
For example, if you target the keyword "marketing" on a page, link to another page on your site and call the link "marketing tactics". The page the link leads to should consist of content written about the marketing tactics in question. Search engines not only use the text of a link to find out what the destination page is about, they also use it to determine the theme of the current page. If you really want to take this to another level, try using a keyword-rich link as the first visible text on your page; that will also rocket your ranking in the search engines. [Keywords in outgoing links] This concept is the same as keywords in links to other pages, except that the link points to another site. For example, if I were linking to another site promoting an affiliate program, I would want to put the keyword in the link: "This Affiliate Program is a must!". My advice to you is to save this article and refer to it at each step when you create a web page. It is not all there is to know about search engines, but if you follow these guidelines you will do very well. Good luck! About the author: Adam Buhler is the owner of the Affiliate Programs Directory: http://www.smokesoft.net Adam is the author of a weekly newsletter on affiliate secrets. For a limited time he provides a free copy of the eBook "Internet ATM" to all subscribers: http://www.smokesoft.net/newsletter.html smoke@dwave.net
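The keyword-density idea running through this checklist can be made concrete with a few lines of code. This is a rough illustration only: the tag-stripping is crude, the function name is invented, and no particular "ideal" density is implied.

```python
import re

def keyword_density(html: str, keyword: str) -> float:
    """Percentage of on-page words that exactly match the keyword."""
    text = re.sub(r"<[^>]+>", " ", html).lower()  # crude tag-stripping
    words = text.split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

page = ("<h1>Affiliate Programs Directory</h1>"
        "<p>Our affiliate directory lists affiliate programs.</p>")
print(round(keyword_density(page, "affiliate"), 1))
```

Skimming an article for your key phrases, as the author suggests, is exactly this calculation done by eye.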