[Solved] Open proxy scanners - successful?

 
Aprelium Forum Index -> General Questions
Lawrence


Joined: 16 Jan 2003
Posts: 207
Location: Brisbane, AU

PostPosted: Thu Jan 15, 2015 4:39 am    Post subject: [Solved] Open proxy scanners - successful?

EDIT: Solved, see post #2

On my new test system running Abyss X1 I've found some interesting access.log entries, like this one:

Code:
1.2.3.4 - - [date +1000] "GET http://www.baidu.com/ HTTP/1.1" 200 107 "" ""


The important part is the malformed GET request line. The request target should start with a slash:

Code:
"GET /index.html


but instead skips the slash and includes someone else's URL:

Code:
"GET http://www.baidu.com/


Now this seems to indicate I'm being scanned by robots looking for open proxies, according to this StackExchange page.
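Entries like this can be spotted in bulk with a quick grep: a request target beginning with a scheme (rather than a leading slash) right after GET marks the proxy probes. A minimal sketch, using made-up sample log lines in place of a real access.log:

```shell
# Two sample access.log lines: one proxy probe (absolute URI after GET)
# and one normal request (path starting with a slash).
printf '%s\n' \
  '1.2.3.4 - - [date +1000] "GET http://www.baidu.com/ HTTP/1.1" 200 107 "" ""' \
  '5.6.7.8 - - [date +1000] "GET /index.html HTTP/1.1" 200 512 "" ""' \
  > sample.log

# Proxy probes have a scheme, not a slash, after "GET ".
grep '"GET [a-z]*://' sample.log
```

Only the first (baidu) line is printed; the ordinary /index.html request is not matched.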

What's interesting, and a little bit alarming, is that Abyss seems to be returning a 200 response, which indicates Abyss received, understood and is delivering the desired data.

This is obviously very bad! Abyss should not be serving up pages from anywhere but my own server!

Now, there's no way to tell from access.log whether content is actually being delivered, so I tried the cURL example from that StackExchange page:

Code:
curl -H -x http://domain.name.here www.google.de


The response I received was my server's content followed immediately by the second URL's content, i.e. my page followed by Google's page.

This seems exceedingly bad. I can't tell whether Abyss is retrieving and relaying the content or whether the cURL client is being instructed to fetch it itself, but in either case the request should result in a 404, 403, or some other error.

Following the advice on that StackExchange page, I threw in a quick URL rewrite to intercept the bad calls and return a 404 error instead. Unfortunately, this didn't work for my test. It seems my cURL request isn't quite the same as the robots' requests, because what shows up in access.log is different - it shows only a slash, with no additional data:

Code:
"GET /


and so the rewrite isn't triggered by my request. I assume it'll work for the bots, but I haven't seen proof of it in the logs yet.
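For reference, the pattern such a rewrite rule needs to match is any request target that begins with a scheme rather than a slash. I don't have the exact rule from the StackExchange page handy, but the core regex can be sanity-checked locally (a sketch; the sample targets are made up):

```shell
# Request targets beginning with "scheme://" are proxy probes; targets
# beginning with "/" are normal. The match pattern for the rewrite rule:
pattern='^[a-zA-Z][a-zA-Z0-9+.-]*://'

# Only the absolute-URI target should match:
printf '%s\n' 'http://www.baidu.com/' '/index.html' \
  | grep -E "$pattern"
```

Only `http://www.baidu.com/` is printed, so a rule keyed on this pattern would intercept the bots' requests while leaving normal ones alone.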

And so:

1. The 200 response seems to be correct according to Apache's page on proxy abuse, but Apache says no content should be served.

2. The cURL command I'm using is getting content, and it's leaving a different trail in access.log than the bots do.

3. Therefore I cannot be sure the bots are receiving content, and I cannot be sure whether the content I get with my cURL request is being relayed by my server (i.e. acting as a proxy) or whether cURL is simply retrieving it on its own.

And finally, my question:

Is Abyss acting as a proxy?
Lawrence


Joined: 16 Jan 2003
Posts: 207
Location: Brisbane, AU

PostPosted: Fri Jan 16, 2015 12:07 am

Solved!

Thanks to Aprelium support for getting back to me.

Abyss lacks the capability to retrieve and relay external sites. The cURL command I was using simply requests two sites in succession - it appears the -H option consumed -x as its header argument, so no proxy was ever configured and cURL just fetched both URLs directly.

They suggested I use telnet to manually craft GET requests similar to the bots', and doing so returned nothing but an error page.
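For anyone wanting to repeat the test: a request like the bots' can be hand-crafted with printf and piped to the server with nc (or pasted into a telnet session). This is a sketch; the server name is a placeholder:

```shell
# Build the same absolute-URI request line the proxy scanners send.
probe='GET http://www.baidu.com/ HTTP/1.1'
printf '%s\r\nConnection: close\r\n\r\n' "$probe"

# Pipe it to your own server and inspect the response, e.g.:
#   printf '%s\r\nConnection: close\r\n\r\n' "$probe" | nc your.server.name 80
# An open proxy would return Baidu's page; Abyss returns an error page.
```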

So Abyss is safe in this regard. I did expect this, but I lacked the tools to test it properly.