# Define access restrictions for robots/spiders
# http://www.robotstxt.org/wc/norobots.html

# By default we allow robots to access all areas of our site that are
# accessible to anonymous users, except for search, which burns our CPU
# for no reason.

# Block all URLs that include query strings (the "?" pattern): contentish
# objects expose query strings only for actions or status reports, which
# might confuse search results. This also blocks ?set_language.

# The Disallow patterns below use the Googlebot-specific syntax extension
# to exclude forms that are repeated for each piece of content in the
# site; the wildcard is only supported by Googlebot.
# http://www.google.com/support/webmasters/bin/answer.py?answer=40367&ctx=sibling

User-agent: *
Disallow: /*?
Disallow: /*atct_album_view$
Disallow: /*folder_factories$
Disallow: /*folder_summary_view$
Disallow: /*login_form$
Disallow: /*mail_password_form$
Disallow: /*search
Disallow: /*search_rss
Disallow: /*searchRSS
Disallow: /*updated_search
Disallow: /*sendto_form$

Sitemap: /sitemap.xml.gz
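
# Illustrative matching examples (hypothetical URLs, shown only as
# comments and ignored by crawlers):
#   /news/article?set_language=en  -> blocked by "Disallow: /*?"
#                                     (the URL contains a query string)
#   /news/login_form               -> blocked by "Disallow: /*login_form$"
#                                     ("$" anchors the match to the end
#                                     of the URL)
#   /news/login_form/edit          -> NOT blocked by "Disallow: /*login_form$"
#                                     (the URL continues past "login_form")
# Note: the Sitemap directive above uses a relative path; the sitemaps
# protocol expects a complete URL (this site's full address followed by
# /sitemap.xml.gz).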