Archive for October, 2006

W3C Listens, Incremental Update to HTML On The Way

Saturday, October 28th, 2006

Surprisingly, Slashdot scooped all the web design websites I normally read on Tim Berners-Lee’s announcement that HTML will be incrementally updated (as will things such as the W3C’s HTML validator).

In the post, Berners-Lee addresses folks like (in order of impact) Bjoern Hoehrmann, Jeffrey Zeldman, Eric Meyer, and Molly Holzschlag, who have been angry with the W3C for resting on its laurels instead of listening to the community (and apparently blatantly ignoring it). As everyone points out, the W3C should be advancing the web. Instead, small working groups are bringing us things like microformats and recommendations to update HTML.

So, the really important part of Berners-Lee’s post is not that HTML will be updated. It is that the W3C claims it will now listen to the community by actually following the feedback mechanisms it has in place instead of blatantly ignoring them. The W3C is going to get back into the community instead of towering above it. Assuming it follows through, this could be the start of a new age of progress.

I might have more to say after the weekend when the web design celebrities weigh in on the subject.

Ajax vs Specific Accessibility vs General Accessibility

Monday, October 23rd, 2006

I was reading Rob Cherny’s article Accessible Ajax, A Basic Hijax Example and started thinking a little more about accessibility. Cherny claims that this hijax method, using unobtrusive JavaScript to make a form submit with Ajax instead of a traditional POST when Ajax is available, is more accessible. While I think it is more accessible than only using Ajax, it is only more accessible for generic alternative browsers; it isn’t any more accessible for disabled people.

Apparently the term hijax used in Cherny’s article was coined by Jeremy Keith. I don’t think the term was really needed, as the concept of unobtrusive JavaScript pretty much sums up the idea of hijax. I have, however, been inadvertently using hijax since May, so I have some opinions on it.

Web accessibility is a big beast with two opposing heads. The term accessible originally meant that a site was designed such that people with disabilities could use it. Recently, the term has expanded to include people with or without disabilities who browse from alternative devices. By alternative I mean mobile phones, ultra-mobile PCs, screen readers, and the like. These alternatives often lack the typical mouse-based interface of desktops, JavaScript support, cascading stylesheet support, or even HTML support. So, one head of the beast is accessible-for-disabled-persons and the other is accessible-for-alternative-devices.

I personally prefer the inclusiveness of the second head when I talk about accessibility. However, the caveat is that saying something is accessible to some facet of alternative browsers may not make it accessible to the disabled. If you aren’t specific about what you are trying to be accessible to, you end up confusing people (see my comment).

Cherny’s article elegantly addresses a facet of the second head: the form degrades gracefully. Unobtrusive JavaScript is designed this way: create a normal web page, then spice it up with JavaScript to improve the existing functionality. So, for Ajax forms, it is a three-step process that might go as follows:

  1. Create a form that works by POST.
  2. If JavaScript is available, unobtrusively add support for validation.
  3. If Ajax is available, unobtrusively add support for Ajax.

If there is no JavaScript, the form will post normally. If there is no Ajax, the user gets error correction and the page posts normally. If there is Ajax, the user gets error correction and doesn’t have to wait for a page load. Whatever the case, the form is submitted and no alternative browser is left out.
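To make the three steps concrete, here is a minimal sketch of that kind of enhancement. It is not Cherny’s code; the form id, the email field, the result container, and the bare-bones validation are all assumptions for illustration, and it talks to XMLHttpRequest directly rather than through any library.

// Step 1 lives in the markup: a plain form that POSTs without JavaScript.
// <form id="contact" action="/contact" method="post"> ... </form>
// <div id="result"></div>  <!-- hypothetical container for the response -->

window.onload = function () {
    var form = document.getElementById('contact'); // hypothetical form id
    if (!form) {
        return;
    }

    // Step 2: JavaScript is available, so add validation unobtrusively.
    form.onsubmit = function () {
        var email = form.elements['email']; // hypothetical field name
        if (email && email.value === '') {
            alert('Please enter an e-mail address.');
            return false; // block the POST until the field is filled in
        }

        // Step 3: Ajax is available, so submit asynchronously instead.
        if (window.XMLHttpRequest) {
            var xhr = new XMLHttpRequest();
            xhr.open('POST', form.action, true);
            xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
            xhr.onreadystatechange = function () {
                if (xhr.readyState === 4 && xhr.status === 200) {
                    // Show the server's response without a page load.
                    document.getElementById('result').innerHTML = xhr.responseText;
                }
            };
            xhr.send('email=' + encodeURIComponent(email.value));
            return false; // cancel the normal POST; Ajax handled it
        }

        return true; // no Ajax support: fall back to a normal POST
    };
};

The point of the layering is that each step only adds behavior; strip away XMLHttpRequest or JavaScript entirely and you are left with a form that still submits.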

However, there are still problems with Ajax and the first head of accessibility. I’m going to focus on screenreaders, as people with motor disabilities or deafness still have random access to the page (they aren’t limited to linearly reading the page). I will say this, though, in reference to the deaf and those with motor disabilities: the form itself must be accessible, using labels and accesskeys, or it’s still not fully accessible. Screenreaders, at this point, are still no good with dynamic page updates, which are a mainstay of Ajax. Some aren’t even good with alerts. Of all the available solutions to the problem of updating content dynamically, none of them work across the board. The only way to accessibly do dynamic content updates is to give the user another option.

Since I was aware of this problem when I redesigned my site, I built in a link to turn off Ajax above every form that uses Ajax. The Ajax is unobtrusive, or hijax if you prefer. This was the only method I could think of to allow full access to my forms. Until screenreaders catch up to the technology, the best Ajax accessibility may be no Ajax at all. So, let the user opt out if he wants.
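In case that sounds vague, here is a rough sketch of how such an opt-out can work while staying unobtrusive. This is not my actual code; the element ids and the cookie name are made up for illustration.

// Sketch of an "Ajax off" switch. If the user has opted out, the form is
// left alone and posts the traditional way.

function ajaxDisabled() {
    // Hypothetical cookie name; any persistence mechanism would do.
    return document.cookie.indexOf('noajax=1') !== -1;
}

window.onload = function () {
    var toggle = document.getElementById('ajax-toggle');  // hypothetical link id
    var form = document.getElementById('comment-form');   // hypothetical form id
    if (!toggle || !form) {
        return;
    }

    toggle.onclick = function () {
        // Remember the choice, then reload so no Ajax handlers remain attached.
        document.cookie = 'noajax=1; max-age=31536000; path=/';
        window.location.reload();
        return false;
    };

    if (ajaxDisabled()) {
        return; // leave the form untouched; it will POST normally
    }

    // Otherwise, hijack the submit as in the earlier sketch.
    form.onsubmit = function () {
        // ... Ajax submission goes here ...
        return false;
    };
};

Because the opt-out is just another layer on top of a working form, a browser with no JavaScript at all never sees any of it.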

Internet Explorer 7

Thursday, October 19th, 2006

Internet Explorer 7 has been released. Download it now! Help usher in a new era: an era where Microsoft is actually interested in web standards.

Update: There is some discussion going on over at 456 Berea Street. Of particular interest is how to install multiple standalone versions of Internet Explorer. While this info can be gathered elsewhere, the feedback in the comments is useful for deciding which method to use. Plus, these are opinions from web designers rather than tech pundits. So, that is nice.

Update: Apparently someone already found a vulnerability. Some things never change.

A Better Spam System

Wednesday, October 18th, 2006

Everyone hates spam. That is a lie. In order for spam to be worthwhile, some people must be responding to it (and supposedly they are). But I don’t, and I don’t want it. I have a proposition (that, of course, will fail).

A new rash of spam has been flooding my inbox. My spam filter doesn’t pick it up because the text in it doesn’t say anything about viagra, porn, cialis, stocks, or anything else to get a guy going. It just has bits of novels or technical manuals and an image attached that has text about what they really want to sell. Supposedly, these spammers want to un-train spam filters and it might just work. I have to manually delete these e-mails so I don’t mess up my filters.

Everyone I’ve ever talked to hates spam. But some people are reading it and acting on it; if no one were, spammers wouldn’t bother. The problem is that most people don’t act on the spam. Some people, like me, recognize spam by its subject and sender and never even open it.

What about us?

Spammers buy large lists of e-mail addresses and send e-mail to everyone on the list, perhaps cross-checking against lists of known dead addresses. There is no way to get off these lists without deleting the e-mail account, which really does no good.

There was supposed to be a do-not-spam e-mail list or some such nonsense. That might work if spammers could be held to some law, but not every spammer is in the US, and not every spammer in the US spams through US servers. It’d be worthless and impossible to keep up with.

I propose the opposite. We need a do-spam list. Here’s how it would work.

First, spammers would log in to some website and enter every e-mail address they have. Any duplicates would be ignored. Then, every spammer would have the same large e-mail list. Spamming would continue as usual based on this list.

The database that holds the list would have three fields: email_id, address, and last_click.

Second, spammers would introduce unique URLs per address. So, a link to the spam site would be something like http://getsomeviagra.com/?uid=3476. When the recipient clicks this link, the recipient’s e-mail address would be found in the do-spam list and its last_click would be updated with the current date and time.

Third, a daily cron job would go over the list and delete every e-mail address whose last_click was older than, say, 30 days. Use of the list would continue as usual while the total number of uninterested parties shrank.

Fourth, the spammer could optionally keep his own database keyed to the existing users in the do-spam list. The spammer’s database might track what sorts of products a particular recipient is interested in. For example, some guy might click on porn and stocks, but not viagra, because he is 30 and has no problem getting an erection. That data could be shared via affiliate programs, if desired. Then spam could be more targeted and might convert better.
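As a toy model of steps one through three, here is a short sketch of the list mechanics. The addresses and dates are made up, and a real list would obviously live in a database rather than in memory; this just shows the click-recording and the 30-day prune.

// Toy in-memory version of the do-spam list: the three fields from above.
var DAY = 24 * 60 * 60 * 1000;

var doSpamList = [
    { email_id: 1, address: 'clicker@example.com', last_click: new Date() },
    { email_id: 2, address: 'ignores-spam@example.com',
      last_click: new Date(Date.now() - 45 * DAY) } // hasn't clicked in 45 days
];

// Step two: a unique URL per address (e.g. http://getsomeviagra.com/?uid=3476)
// maps the uid back to an email_id, and the click is recorded.
function recordClick(emailId) {
    for (var i = 0; i < doSpamList.length; i++) {
        if (doSpamList[i].email_id === emailId) {
            doSpamList[i].last_click = new Date();
        }
    }
}

// Step three: the daily job drops anyone whose last click is older than 30 days.
function pruneList() {
    var cutoff = Date.now() - 30 * DAY;
    doSpamList = doSpamList.filter(function (entry) {
        return entry.last_click.getTime() >= cutoff;
    });
}

recordClick(1); // someone followed their unique link today
pruneList();    // entry 2 is removed; entry 1 survives

The whole scheme boils down to that prune: anyone who never clicks quietly ages off the list.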

By simply removing people like me from the list due to inactivity, spammers would have a higher return on investment. It would create more targeted spam and drastically reduce the amount of money spammers spend on finding victims.

There are only two issues. The first is getting new addresses; these could be collected by traditional methods. The second is that spammers can’t be trusted to manage the list. So, I propose that a disinterested organization run open spam e-mail relays that use the do-spam database. All spammers would have access to are the unique IDs for the e-mail addresses, not the addresses themselves. Spammers would have to register with the website, and probably pay a fee, to make sure they are serious about spamming.

I imagine there are still bugs to work out, but I think this is a good start. Feel free to post your own thoughts, problems, and ideas on the subject.