Google is picky about duplicate pages and may penalize your site if it finds too much duplicate content. To squeeze a little more SEO friendliness out of Drupal, you can add a few entries to your robots.txt to prevent duplicate pages from being indexed and to keep search bots away from pages they don't need to crawl.
If you are using Drupal 5 or older, try adding these entries to your robots.txt:
User-agent: *
Disallow: /admin
Disallow: /aggregator
Disallow: /tracker
Disallow: /node/add
Disallow: /user
Disallow: /files
Disallow: /search
Disallow: /book/print
Disallow: /filter
Disallow: /filter/tips
Disallow: /comment
Disallow: /comment/reply
Disallow: /spam
Disallow: /spam/report
Disallow: /spam/report/comment
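If you want to confirm that a given path is actually blocked before uploading the file, Python's standard urllib.robotparser module can evaluate the rules locally. This is just a sanity-check sketch: the example.com domain is a placeholder, and only a few of the entries above are reproduced here.

```python
from urllib import robotparser

# A few of the robots.txt entries from above.
rules = """\
User-agent: *
Disallow: /admin
Disallow: /node/add
Disallow: /user
Disallow: /search
Disallow: /comment/reply
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Blocked paths return False; normal content pages return True.
print(rp.can_fetch("*", "http://example.com/admin/settings"))  # False
print(rp.can_fetch("*", "http://example.com/node/42"))         # True
```

Note that a Disallow rule is a prefix match, so Disallow: /admin also blocks /admin/settings and every other path under it.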