Archive for January 13th, 2010

Setting up FileZilla on EC2/Windows

When I spend a couple of hours pulling my hair out on a stupid technical issue I like to document it so I can find the solution more quickly next time I have the problem. And there will be a next time, so says Professor Murphy. :-)

When you’re setting up the FileZilla FTP server on EC2, don’t forget to add the port range configured in FileZilla’s Passive Mode settings to your EC2 security group. Otherwise your files will never make it up to the server. The firewall will do its job and send the data off to the bit bucket.
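For the record, opening those ports looks something like this with Amazon's EC2 command-line tools. The group name and port range below are examples only; the range has to match whatever you entered in FileZilla Server's Passive Mode settings, and you'll also want the FTP control port open.

```shell
# Example only: assumes the EC2 API tools are installed and you're using
# a security group named "default". The passive range 50000-51000 is a
# placeholder -- it must match the range configured in FileZilla Server.

# FTP control connection (port 21)
ec2-authorize default -P tcp -p 21 -s 0.0.0.0/0

# Passive-mode data connections
ec2-authorize default -P tcp -p 50000-51000 -s 0.0.0.0/0
```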

Static hosting should be cheap and easy

15 years ago I wrote a piece called Billions of Websites.

Yesterday Google threatened to shut down its presence in China.

The reason — China was attacking Google’s computers.

An individual hosting a personal website is putting their little boat on the same ocean Google is on. But Google has huge resources to protect their data. The Internet is getting rougher all the time. An individual, no matter how talented, ultimately won’t be able to keep up.

So we’re being pushed to store our creative work in their cloud.

Unfortunately — there’s no easy and cheap way — at least that I’ve found — to store a large static content website safely with any hope that it will be around for a long time.

I’m talking about a service for people who don’t begin to know how to use a Unix command line and don’t want to learn.

Amazon S3 comes close to being what I’m looking for, but it doesn’t support index files — a request for a folder can’t automatically serve the index.html inside it. I assume they have this limit because they don’t want people hosting static websites in S3. I have no idea why, but the limit has been there a long time and could be easily fixed. When you ask Amazon people why it’s there, they are silent.

The service has to allow storage of (almost) any kind of file. I need to be able to store the data behind the rendered text. For example, RSS and OPML files. I understand if they have to put limits on MP3s and AVIs due to pressure from the entertainment industry.

Longevity is super-important. I need to believe that the organization I’m hiring to store the site will be around for a long time.

Keeping out denial-of-service attacks is important too. It should automatically reject requests from an IP address that is repeatedly downloading a 3MB image. A simple DOS attack like that is easily detected by machines, and should not require human intervention. Under no circumstances should the user be responsible for paying for the bandwidth for such attacks.
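The kind of machine detection I mean could be sketched in a few lines — count recent requests per IP address and refuse anyone who exceeds a threshold inside a time window. This is a toy illustration, not anything a real hosting service runs; the class name, threshold, and window are all made up for the example.

```python
import time
from collections import defaultdict, deque

class SimpleRateLimiter:
    """Toy per-IP rate limiter: reject an address that makes too many
    requests inside a sliding time window (e.g. hammering a 3MB image)."""

    def __init__(self, max_requests=20, window_seconds=60.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        now = time.time() if now is None else now
        q = self.hits[ip]
        # Drop timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # looks like a crude DoS: refuse the request
        q.append(now)
        return True
```

The point is just that the logic is trivial — a few bookkeeping structures, no human in the loop — so there's no excuse for billing the customer for attack traffic.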

It’s possible that no commercial entity can profit by providing this service. If so, we must start a non-profit to do it. We can’t trust the freemium model with our writing for the long term. It’s an unacceptably shaky foundation to build on.

I’m now maintaining the web presence for two relatives who have passed away. I’m not satisfied with the job I’m doing. I’m not particularly suited for this work. I want to be able to purchase a reliable hassle-free service that can do it for me, and of course for others.

This is a huge gap in the web we’re building today. Eventually it’s going to catch up with us when we lose a huge amount of stuff we thought we couldn’t lose.