If you examine the preferences dialog of any modern Web browser (such as Internet Explorer, Safari, or Mozilla), you'll probably notice a 'cache' setting. This lets you set aside a section of your computer's hard disk to store representations that you've seen, just for you. The browser cache works according to fairly simple rules: it checks that stored representations are fresh, usually once a session (that is, once in the current invocation of the browser).
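The freshness check described above can be sketched in a few lines. This is a simplified illustration (function and parameter names are mine; real browsers follow the full HTTP caching rules, which consider many more headers):

```python
# Sketch of the freshness check a browser cache performs (simplified;
# real browsers implement the full HTTP caching rules, RFC 9111).
import time

def is_fresh(cached_at, cache_control):
    """Return True if a stored response is still fresh.

    cached_at     -- time.time() value when the response was stored
    cache_control -- the response's Cache-Control header value
    """
    max_age = 0
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            max_age = int(directive.split("=", 1)[1])
    age = time.time() - cached_at
    return age < max_age

# A response cached 10 seconds ago with max-age=3600 is still fresh:
print(is_fresh(time.time() - 10, "public, max-age=3600"))  # True
```

If the representation is stale rather than fresh, the browser revalidates it with the server instead of re-downloading it unconditionally.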


Here are the steps I take to get an SPF record going on DreamHost:

v=spf1 mx ip4: ip4: ip4: ip4:208.97.132.0/24 ip4: ip4: ip4: ip4: ip4: ~all
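To see how a receiving mail server reads a record like the one above, here is a small sketch that splits an SPF string into its mechanisms and its default qualifier. The function name and the sample record are illustrative, not DreamHost's actual values:

```python
# Illustrative SPF tokenizer: split a TXT record into its mechanisms
# and pull out the "all" term that sets the default result.

def parse_spf(record):
    terms = record.split()
    assert terms[0] == "v=spf1", "not an SPF record"
    mechanisms = []
    default = None
    for term in terms[1:]:
        if term.lstrip("+-~?") == "all":
            default = term  # e.g. "~all" means SoftFail for non-matches
        else:
            mechanisms.append(term)
    return mechanisms, default

mechs, default = parse_spf("v=spf1 mx ip4:208.97.132.0/24 ~all")
print(mechs)    # ['mx', 'ip4:208.97.132.0/24']
print(default)  # '~all'
```

The `~all` at the end matters: it tells receivers to treat mail from any host not listed as a SoftFail rather than rejecting it outright.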


One of the most cost-effective ways to drive traffic to your Web site is to optimize it for search engines. Many of them use automated programs called "crawlers" or "spiders" to build an index of the Web, which they use to determine which sites are most relevant to users' queries. These programs visit Web sites, read each page's content, and follow any links to other pages, repeating the process.
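The visit-read-follow loop just described can be sketched as a few lines of Python. To keep it self-contained, the "Web" here is an in-memory dictionary of hypothetical pages; a real spider would fetch over HTTP, respect robots.txt, and rate-limit itself:

```python
# Minimal sketch of a crawler's visit-read-follow loop, run against an
# in-memory "Web" so it needs no network access.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

SITE = {  # hypothetical pages standing in for real URLs
    "/": '<a href="/about">About</a> <a href="/contact">Contact</a>',
    "/about": '<a href="/">Home</a>',
    "/contact": '<a href="/">Home</a>',
}

def crawl(start):
    seen, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in seen or url not in SITE:
            continue
        seen.add(url)
        parser = LinkExtractor()
        parser.feed(SITE[url])      # "read the page's content"
        queue.extend(parser.links)  # "follow any links to other pages"
    return seen

print(sorted(crawl("/")))  # ['/', '/about', '/contact']
```

Because crawlers discover pages only by following links, pages that nothing links to are effectively invisible to them.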



  1. Everything you know is wrong (sort of)
  2. It's not going to look exactly the same everywhere unless you're willing to face some grief and possibly not even then
  3. You will be forced to choose between the ideal and the practicable
  4. (with thanks to Antoine de Saint-Exupéry): Perfection is not when there's nothing to add, but when there's nothing to take away
  5. Some sites are steaming heaps of edge cases
  6. Longer lead times


mod_rewrite is very useful in many situations, yet some of its behaviors were not obvious when I first started experimenting with it. After much testing I understand it far better now. That said, I don't claim to know it perfectly, and I still make mistakes.
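As a concrete illustration of the kind of rules I mean, here is a small `.htaccess` fragment. The domain, paths, and script name are made up; treat this as a sketch of common patterns, not a drop-in configuration:

```apache
# .htaccess — illustrative example only (domain and paths are made up).
RewriteEngine On

# Redirect the bare domain to www, preserving the request path:
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Map pretty URLs onto a script, but only when no real file matches:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^article/([0-9]+)$ show.php?id=$1 [L,QSA]
```

The `RewriteCond` lines are exactly where the non-obvious behavior lives: each condition applies only to the single `RewriteRule` that immediately follows it.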