I’ve been making web pages long enough to notice the massive changes of the last five years in how web designers approach web design, and in how we use browser functionality to aid our efforts to deliver sites that both work and look good.
We’ve been progressing from horribly structured layout tables with inline formatting to a much better separation of design and content through CSS and “standards” markup. We’ve also dumped the idea of dual designs, where one browser got one version of the site and another got a different one (read: IE vs Netscape).
But in a world that loves to move its technology along fast, are we not stagnating by focusing so much attention on standards? I know that sounds like I’m saying standards are bad, but I know they’re not.
Standards have given us a glimpse of how the web could be: multiple browser platforms all reading and displaying the same pages identically, with no need for hacks. We’re still a long way off that reality, mainly because there are too many browsers knocking around.
Mainstream development cannot move onto a new technology until the number of people on browsers that support it is large enough, and if developers do decide to move on regardless, they have to take extensive measures to ensure the older browsers can still view the site. That usually means the people on the newer browsers are the ones to suffer in some form, whether it’s how much they end up downloading due to redundant hacks, or the features that *could* have been there but were dropped for compatibility.
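One familiar example of the kind of redundant hack everyone ends up downloading is the “star html” selector. Older versions of IE incorrectly treat the root `html` element as having a parent, so they match this rule; standards-compliant browsers still download it but never apply it (the selector and padding values here are purely illustrative):

```css
/* Every browser sees and applies this rule. */
#content { padding: 10px; }

/* Only IE 6 and below match "* html", because they wrongly give
   the root html element a parent. Compliant browsers download
   this rule but never apply it: dead weight for their users. */
* html #content { padding: 8px; }
```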
Like it or not, Internet Explorer is going to be the majority web browser for a long time. Microsoft might change their approach to standards, but the evidence to date tells us that when the “next big thing” comes along, they’re going to be the last to join the party.
The “next big thing”, whatever that is, is going to have a real struggle getting from approved standard to market. The W3C has been preaching xHTML for a good long time, but what percentage of the Alexa 5 (the five most-visited sites on the internet) actually use it?
Uses HTML 4.01 and 20k of in-page CSS… and after all of that it doesn’t validate: 37 errors. They obviously can’t be that bothered.
No document declaration! In-page CSS. Fails validation against HTML 4.01 with 50 errors. Surprising and terrible at the same time.
See Google. Exactly the same.
No declaration! In-page CSS. Junk JavaScript everywhere. Fails HTML 4.01 with 150 errors. Absolutely terrible, but nobody is surprised.
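For the record, the “document declaration” these sites omit is a single line. A minimal skeleton of a page that validates as HTML 4.01 Strict, with the stylesheet moved out of the page, might look like this (the page title and stylesheet filename are placeholders):

```html
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
        "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
  <title>A validating page</title>
  <!-- External stylesheet instead of in-page CSS;
       "style.css" is a hypothetical filename. -->
  <link rel="stylesheet" type="text/css" href="style.css">
</head>
<body>
  <p>Content here.</p>
</body>
</html>
```

With the doctype present, the validator knows which rules to check against; without it, it has to guess, and browsers fall back into quirks mode.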
How can the internet hope to progress in a world that operates like this? A world where we put more faith in people having a Flash player than in delivering a website as efficiently as possible.
Moreover, when these sites do finally decide enough people are on xHTML 1.0 compatible browsers, while the smaller guys are already designing for xHTML 2.0 (or whatever exists at that time), how stretched out is the internet going to be?
So there’s some proof that when the uptake of standards-supporting browsers isn’t good enough, the big guys stick their middle fingers up at the W3C. It’s a vicious circle: browsers are then slower to push forward with the new standards. The problem is that this hurts us, the little people who could actually utilise language improvements.
In a way I want to say that HTML is a flawed concept, but I also know that it works, and works well at that. Its organic nature and willingness to be bastardised to cope with multiple platforms have made it a popular display language, but with a bit of fascism it could be so much more. If we could nuke IE, Opera, kHTML (of the Konqueror browser) and all the other little engines floating around, and push a self-updating, network-aware version of Gecko onto every single person’s computer, we might be able to handle our clients with a little more respect for standards and make designing the internet a lot more fun and a LOT easier.
For true progress, we need to get away from standards and allow ourselves to define what things mean and how they should be interpreted, and browsers should listen.