The #1 Biggest Mistake That People Make With Adsense
By Joel Comm
It's very easy to make a lot of money with AdSense. I know it's easy because in a short space of time, I've managed to turn the sort of AdSense revenues that wouldn't keep me in candy into the kind of income that pays the mortgage on a large suburban house, makes the payments on a family car and does a whole lot more besides.

But that doesn't mean there aren't plenty of mistakes you can make when trying to increase your AdSense income - and any one of those mistakes can keep you earning candy money instead of the sort of cash that can pay for your home.

There is one mistake, though, that will totally destroy your chances of earning a decent AdSense income before you've even started.

That mistake is making your ad look like an ad.

No one wants to click on an ad. Your users don't come to your site looking for advertisements. They come looking for content, and their first instinct is to ignore everything else. And they've grown better and better at doing just that. Today's Internet users know exactly what a banner ad looks like. They know what it means, where to expect it - and they know exactly how to ignore it. In fact, most Internet users don't even see the banners at the top of the Web pages they're reading or the skyscrapers running up the side.

But when you first open an AdSense account, the format and layout of the ads you receive will have been designed to look just like ads. That's the default setting for AdSense - and that's the setting that you have to work hard to change.

That's where AdSense gets interesting. There are dozens of different strategies that smart AdSense account holders can use to stop their ads from looking like ads - and make them look attractive to users. They include choosing the right formats for your ads, placing them in the most effective spots on the page, putting together the best combination of ad units, enhancing your site with the best keywords, selecting the ideal colors for the font and the background, and a whole lot more besides.

The biggest AdSense mistake you can make is leaving your AdSense units looking like ads.

The second biggest mistake you can make is to not know the best strategies to change them.

For more Google AdSense tips, visit http://adsense-secrets.com
Copyright © 2005 Joel Comm. All rights reserved

Tuesday, October 28, 2008

Webmasters and Search Engines

By 1997 search engines recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.

Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEOs. In 2005, an annual conference, AIRWeb, Adversarial Information Retrieval on the Web, was created to discuss and minimize the damaging effects of aggressive web content providers.

SEO companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger Aaron Wall for writing about the ban. Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. In fact, with the advent of paid inclusion, some search engines now have a vested interest in the health of the optimization community. Major search engines provide information and guidelines to help with site optimization. Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website. Google guidelines are a list of suggested practices Google has provided as guidance to webmasters. Yahoo! Site Explorer provides a way for webmasters to submit URLs, determine how many pages are in the Yahoo! index and view link information.

Getting indexed

The leading search engines, Google, Yahoo! and Microsoft, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click. Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results. Yahoo!'s paid inclusion program has drawn criticism from advertisers and competitors. Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review. Google offers Google Webmaster Tools, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that aren't discoverable by automatically following links.
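To make the Sitemap feed concrete, here is a minimal sketch of the XML format defined by the Sitemaps protocol; the URL and date are placeholders, not real entries:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the crawler to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <!-- Optional hint about when the page last changed -->
    <lastmod>2008-10-01</lastmod>
  </url>
</urlset>
```

A file like this is typically saved as sitemap.xml in the site root and submitted through the search engine's webmaster interface; listing a URL helps discovery but, like paid submission, does not guarantee ranking.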

Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. Distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.

Preventing indexing

Robots Exclusion Standard


To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
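As a sketch, a robots.txt file covering the two typical cases above - a shopping cart and internal search results - might look like this (the directory paths are hypothetical examples, not a standard layout):

```
# Applies to all crawlers
User-agent: *
# Keep user-specific and internal-search pages out of the crawl
Disallow: /cart/
Disallow: /search/
```

For page-level exclusion, the robots meta tag goes in the page's head section, e.g. `<meta name="robots" content="noindex, nofollow">`, which asks compliant engines not to index the page or follow its links even if they crawl it.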
