Hiding Your Tracking Domain From the SEs

How do I hide my tracking domain from the search engines so no one's poking around my 202 login screen?
 


I'm a complete noob and I may be off about this, but I think you can do it with robots.txt.

How, I have no idea.
 
Use .htaccess to require a username and password for the prosper installation.

Even though I use robots.txt files, I don't trust Google to obey them, since I have seen them completely ignore exclusions before.

Using .htaccess will not only stop search engines from poking around your site, it will stop anyone else as well, since it's not hard to find a tracking link on a lander and just append /202-login.php to the domain.

Stick the code below in a .htaccess file in the root dir of the prosper install.

Code:
# Require a valid username/password for everything in this directory
AuthName "Restricted Area"
AuthType Basic
AuthUserFile /{location of file}/.htpasswd
AuthGroupFile /dev/null
Require valid-user

# Stop Google Chrome from asking for a username and password when it
# requests the favicon (this is a bug in Google Chrome!)
<Files "favicon.ico">
    Satisfy any
</Files>
Then stick the text 'Satisfy any' in a .htaccess file and put that file in the dirs '/tracking202/static/' and '/tracking202/redirect/'.
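To be clear, each of those two files is just that one directive, which overrides the password requirement inherited from the parent directory so clicks and redirects keep working for visitors. Something like this (the comment lines are mine):

Code:
# Goes in /tracking202/static/.htaccess and /tracking202/redirect/.htaccess
# Lets requests through without the username/password prompt
Satisfy any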

Of course, you'll have to set up a .htpasswd file and modify the code above to reflect its location, but you only have to do this once, and it's not hard. There is plenty of documentation on the net about how to do it.
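If you have shell access, one way to do it is with the htpasswd utility that ships with Apache (the path and username below are just placeholders, use your own):

Code:
# -c creates the file; it will prompt you for the password twice.
# Then point AuthUserFile at the same path.
htpasswd -c /home/youruser/.htpasswd yourname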

What will happen is your tracking scripts in prosper will work as normal, but if someone (or something) tries to access any other part of your prosper install, it will require a username and password.
 
Quick update: after skimming the link Compound posted, most of it doesn't need to be done if you use the .htaccess method.

Also, nothing the author suggests will stop someone exploiting a bug in prosper that may be found in the future.

The .htaccess method will stop any future exploits, as long as they aren't in the files in the publicly accessible directories.

Also, in step 6 the author suggests using SSL. While he is mostly right about the scrubbing of the referrer (not all browsers will scrub the referrer with SSL; IE6 using frames won't), he doesn't mention that it will be substantially slower, since the SSL connection has to be negotiated before any actual data is sent.

This can increase your load time by seconds. I have seen prosper redirects go from less than 0.2 seconds to well over 2 seconds with SSL.
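If you want to measure the overhead on your own redirect, curl can break the timing down for you (the URL below is a placeholder):

Code:
# time_appconnect is when the SSL/TLS handshake finished;
# compare it to time_connect (plain TCP) and time_total
curl -o /dev/null -s -w 'TCP: %{time_connect}s  SSL: %{time_appconnect}s  total: %{time_total}s\n' 'https://your-tracking-domain.com/some-redirect'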

If you don't care about speed, then I would say go for it, but I wouldn't bother.
 
robots.txt is probably the best way to do it


Code:
User-agent: *
Disallow: /

Found this on a site. Can you tell me the difference?

Code:
# Disallow Web Bots
User-agent: *
Disallow: /

# Disallow Archive Bots
User-agent: ia_archiver
Disallow: /