Lawrence
Joined: 16 Jan 2003  Posts: 207  Location: Brisbane, AU
Posted: Tue Apr 22, 2003 11:40 am    Post subject: Browser blocking based on Client ID?
Is it possible, either within Abyss or by using some external software or script, to block views based on client ID? For example, suppose I wanted to block Inktomi's Slurp client from reading any of my pages.

Any ideas?
aprelium
Joined: 22 Mar 2002  Posts: 6800
Posted: Tue Apr 22, 2003 1:33 pm    Post subject: Re: Browser blocking based on Client ID?
Lawrence wrote:
> Is it possible, either within Abyss or by using some external software or script, to block views based on client ID? For example, suppose I wanted to block Inktomi's Slurp client from reading any of my pages.
> Any ideas?

You can write a script that intercepts all page requests and allows or blocks them based on the browser ID.
_________________
Support Team
Aprelium - http://www.aprelium.com
Lawrence
Joined: 16 Jan 2003  Posts: 207  Location: Brisbane, AU
Posted: Tue Apr 22, 2003 10:50 pm    Post subject:
Woohoo! An answer to the question and a pointer in the right direction. I'm off to solve the mystery!

Thanks again. (And hustle along that reasonably priced Pro edition! ;))
wamba
Joined: 20 Jun 2006  Posts: 1
Posted: Tue Jun 20, 2006 7:34 pm    Post subject: blocking inktomi
An idea...

The number of visits by Inktomi to my domain keeps growing. I don't like my domain being visited more by robots than by people!

The only idea I have right now: analyze the set of Inktomi's IPs (collected manually) and use the results to build a filtering script. The real set of IPs will always be larger than the one we have, so the filter may have to work with partial IPs, trying to recognize the "shadow" of Inktomi: correctly blocking one group of IPs while sacrificing some others.

That's all I can do for now. The number of different IPs this robot uses surprises me, as does its level of activity on my site.
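The partial-IP idea above could be sketched in PHP roughly like this. Note that the prefixes in the list are invented for illustration; the real Inktomi ranges would have to be harvested from your own server logs, and they change over time.

```php
<?php
// Sketch of prefix-based IP blocking (illustrative prefixes only).
// A request is refused when its remote IP begins with any listed prefix.
$blocked_prefixes = array("66.196.", "68.142.");

function ip_is_blocked($ip, $prefixes)
{
    foreach ($prefixes as $prefix) {
        // strpos() === 0 means $ip starts with $prefix
        if (strpos($ip, $prefix) === 0) {
            return true;
        }
    }
    return false;
}

if (isset($_SERVER["REMOTE_ADDR"])
        && ip_is_blocked($_SERVER["REMOTE_ADDR"], $blocked_prefixes)) {
    header("HTTP/1.0 403 Forbidden");
    exit;
}
?>
```

As wamba says, matching on a prefix like "66.196." will also catch any non-Inktomi host that happens to sit in that range, so this trades some legitimate visitors for broader bot coverage.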
TRUSTAbyss
Joined: 29 Oct 2003  Posts: 3752  Location: USA, GA
Posted: Tue Jun 20, 2006 8:01 pm    Post subject:
You can use this PHP code to protect your pages from certain bots.
Code:

<?php
/**
 * Banned User Agents Script
 * Created by: Josh (TRUSTAbyss)
 */
$bad_user_agents = array("inktomi",
                         "User Agent Here",
                         "User Agent Here");

foreach ($bad_user_agents as $bad_user_agent) {
    // Exact match against the full User-Agent string
    if ($_SERVER["HTTP_USER_AGENT"] == $bad_user_agent) {
        // header() needs the full status line to actually send a 403
        header("HTTP/1.0 403 Forbidden");
        exit;
    }
}
?>
Put that code at the very top of your PHP pages, before the HTML starts, and it will stop bad user agents before they can reach any of the page's information.

Note: this script needs to match the full User-Agent name in order to stop that agent from reading any information on your page.
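If an exact match is too strict, a case-insensitive substring check is one possible variant (a sketch, not tested against Abyss; the fragments "inktomi" and "slurp" are examples). PHP's stripos() would let "inktomi" match a full User-Agent string such as "Mozilla/5.0 (compatible; Yahoo! Slurp; ...)":

```php
<?php
// Variant: block when any fragment appears anywhere in the
// User-Agent string, ignoring case.
$bad_fragments = array("inktomi", "slurp");

function agent_is_bad($agent, $fragments)
{
    foreach ($fragments as $fragment) {
        // stripos() returns false only when the fragment is absent
        if (stripos($agent, $fragment) !== false) {
            return true;
        }
    }
    return false;
}

if (isset($_SERVER["HTTP_USER_AGENT"])
        && agent_is_bad($_SERVER["HTTP_USER_AGENT"], $bad_fragments)) {
    header("HTTP/1.0 403 Forbidden");
    exit;
}
?>
```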
Sincerely, TRUSTAbyss
Last edited by TRUSTAbyss on Tue Jun 20, 2006 8:32 pm; edited 3 times in total
Anonymoose
Joined: 09 Sep 2003  Posts: 2192
AbyssUnderground
Joined: 31 Dec 2004  Posts: 3855
Posted: Tue Jun 20, 2006 8:18 pm    Post subject:
The Google bot can be a good thing. It helps people find your website when they type in search terms. Stopping search bots from browsing will simply reduce your visitor count.
_________________
Andy (AbyssUnderground) (previously The Inquisitor)
www.abyssunderground.co.uk
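It is worth noting that well-behaved crawlers, Inktomi/Yahoo!'s Slurp among them, honor the robots exclusion standard, so a robots.txt file in the site root is a gentler way to turn away a specific bot than IP or User-Agent filtering. A minimal example that blocks only Slurp while leaving other crawlers alone:

```
User-agent: Slurp
Disallow: /
```

Malicious bots simply ignore robots.txt, so the scripts earlier in the thread remain useful as a backstop.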