AutobahnJS on S3: excessive traffic

#1

Hi guys,

we have a (small) issue and I am asking for opinions and help.

Tavendo is providing free hosting of AutobahnJS on Amazon S3 for _development_ purposes, like this

https://autobahn.s3.amazonaws.com/js/autobahn.min.js

and related files.

Now, in September, our S3 traffic skyrocketed to 130 GB / 6 million requests, which translates into around US$15. Well, that money isn't a problem
(Tavendo sponsors the development of the Autobahn OSS projects, which is of course orders of magnitude more cash). However, I am paranoid: what the heck is going on? Will it be 13 TB next month?

After turning on S3 bucket logging today, I noticed that apparently nearly all traffic is referred from the site

http://www.funweek.it

which includes AutobahnJS (I haven't analyzed what they do, but they include the file).

We hesitate to take down the whole S3 bucket (since then everyone would suffer because of one misbehaving site).

We have emailed the admins and are waiting for a response.

But the general question is: what would be best for the community _and_ limit our (Tavendo's) risk regarding excessive traffic costs?

Any hints, help, opinions welcome!

/Tobias

#2

Personally, I don’t have much need for AutobahnJS hosting.

However, if you would like to keep it, here are a couple of possible solutions, assuming you move away from S3 and serve it from an EC2 micro instance (or a DigitalOcean droplet, for cheaper bandwidth):

  • Only allow “localhost” and “127.0.0.1” as referers. (And don’t forget about the ports!) You can do this with Nginx (see the sketch after this list), and I’m sure Apache has similar functionality: http://nginx.org/en/docs/http/ngx_http_referer_module.html
  • Implement per-referer rate limiting. It doesn’t seem like this is possible with Apache or Nginx, but it shouldn’t be difficult to write a simple TwistedWeb server to do this.
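
For illustration, a minimal (untested) nginx sketch of the first suggestion might look like this; the port, location, and file path are placeholders:

    server {
        listen 80;

        location /js/ {
            root /var/www/autobahn;

            # "none" matches an absent Referer header (curl, wget, ...);
            # the regexes match localhost and 127.0.0.1 with or without
            # a port, since regex matching starts after the scheme.
            valid_referers none
                           ~^localhost[:/]
                           ~^127\.0\.0\.1[:/];

            # $invalid_referer is set by valid_referers above
            if ($invalid_referer) {
                return 403;
            }
        }
    }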

#3

@Theron: thanks! Running an EC2 instance just to filter out deep-linkers would work, but it seems a bit much work/maintenance for just hosting a file ;)

@all

We have set up an AWS S3 "bucket policy" which disallows access from anywhere but the following HTTP referrers

             "http://127.0.0.1*",
             "https://127.0.0.1*",
             "http://localhost*",
             "https://localhost*",
             "http://autobahn.ws/",
             "https://autobahn.ws/
"

and the "empty" referrer (for direct download via curl/wget/etc).
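
In case someone wants to replicate this, the policy is roughly along the following lines (a sketch, not necessarily our exact policy; the second statement uses the IAM "Null" condition to admit requests without any Referer header):

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowListedReferers",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::autobahn/*",
                "Condition": {
                    "StringLike": {
                        "aws:Referer": [
                            "http://127.0.0.1*",
                            "https://127.0.0.1*",
                            "http://localhost*",
                            "https://localhost*",
                            "http://autobahn.ws/*",
                            "https://autobahn.ws/*"
                        ]
                    }
                }
            },
            {
                "Sid": "AllowEmptyReferer",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::autobahn/*",
                "Condition": {
                    "Null": { "aws:Referer": "true" }
                }
            }
        ]
    }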

That's a start, though it's still not perfect:

http://stackoverflow.com/questions/19197843/aws-s3-bucket-policies-and-rejected-traffic

/Tobias

#4

I didn’t know you could do that with S3. Neat!


— Theron

#5

I haven't received any reaction to our email to the funweek.it admins.

However, turning off access via the S3 bucket policy did work ;)

They took the deep link off last night and are now self-hosting AutobahnJS.

/Tobias

#6


For the record: they (the web agency behind funweek.it) have now, after we turned off access, reacted and apologized.

/Tobias
