
HTML XHTML? Any difference?


cs.punk

Recommended Posts

  • Replies 151

Daniel, what's wrong with wanting a stricter, cleaner version of HTML that forces you to write clean code anyway?

 

I went through many of the sites in the gallery at http://www.cssdrive.com/, most of them done by web design firms or professional web developers. Seeing that all of the examples I looked at use some form of XHTML served as HTML (some with a Transitional DTD, but most with Strict), I would think it's not such a bad idea to do the same.

 

Just because you can write clean HTML 4.01 code if you put your heart into it doesn't mean that XHTML is useless when served as HTML. There's no way of validating HTML code against the same strict rules as XHTML unless you use some custom DTD, which won't mean anything to customers. However, if you show the "Validated W3C XHTML 1.0" icon in the footer of a web site, it shows you're following the market and the new standards and are up to date.

 

HTML 4 is old and dirty, XHTML is new, clean, and shiny. I like clean and shiny!



You're committing two logical fallacies:

 

1) Appeal to novelty (argumentum ad novitatem): Newer doesn't imply better. By that logic I could create any new standard, claim it as a replacement for XHTML, and it would automatically be better. It's a logical fallacy.

 

2) Appeal to common practice/tradition (argumentum ad antiquitatem): Common practice doesn't imply better. Just because a lot of people do X instead of Y doesn't make X better than Y. This would mean it's better to use horse-drawn wagons instead of cars because that's what people always used to do.

 

As a matter of fact, your argumentation is rather peculiar. You claim that XHTML is better because it's newer, but at the same time you claim that XHTML is better because it's what most people use. That amounts to a logical contradiction: HTML dates further back than XHTML, which means HTML must have had the larger user base at some point, so your two claims contradict each other.

 

Further, by showing the "Validated W3C XHTML 1.0" boilerplate you aren't showing anything other than that you are ignorant and do not know what you are doing. I stopped counting the times I've said this, and it's amazing that it's not clear yet: XHTML served as text/html will NOT be parsed as XHTML but rather as HTML, which essentially means it's not XHTML after all.

 

Finally, I really do not get you on the dirty/shiny part. It's completely nonsensical. HTML is not XHTML. Saying that HTML is bad because it doesn't follow XHTML's syntax doesn't make sense. That's like saying Lisp sucks because it doesn't use Haskell's syntax. Complete nonsense.

 

I didn't know that, but still, w3.org is written in xhtml...

 

I don't know. I sent them an email and I'll reply here with their answer. Even then, that argument doesn't hold up. How often have you heard about corrupt legislators, for instance? Do you honestly believe that everybody in your parliament is 100% law-abiding all the time?

 

It could also be that their web devs aren't the same people who are in the spec working groups. In that case it could be that their web devs are oblivious to the fact that what they're doing is wrong according to the specs.

 

I do agree that it's somewhat paradoxical that they aren't compliant with their own recommendations though.


I do agree that it's somewhat paradoxical that they aren't compliant with their own recommendations though.
Yeah that is pretty funny tbh.

 

Right...how about this: if I set my site up to serve html to IE and xhtml to everything else..? I believe that would mean I'm using xhtml in the "right" way...

 

The only problem is my javascript...I actually don't know why it doesn't work. It works in opera...just not firefox. Do I need to use <script type="application/x-javascript"... or something...?


Right...how about this: if I set my site up to serve html to IE and xhtml to everything else..? I believe that would mean I'm using xhtml in the "right" way...

 

You could do that, but history shows that UA sniffing breaks forward compatibility: http://webaim.org/blog/user-agent-string-history/

 

You cannot rely on content negotiation either. As explained earlier, Internet Explorer claims to support */* (i.e. literally anything) in its Accept header. That's obviously untrue.
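To make that concrete, here is a minimal sketch (purely illustrative; the function name is my own) of what naive content negotiation would look like: only treat a client as XHTML-capable when its Accept header explicitly lists application/xhtml+xml. IE's bare */* claim deliberately fails the test, which is exactly why negotiation alone can't be trusted.

```javascript
// Sketch: does an Accept header *explicitly* list application/xhtml+xml?
// A bare "*/*" (what IE sends) claims everything but names nothing,
// so it does not count as explicit support here.
function acceptsXhtml(acceptHeader) {
  return acceptHeader
    .split(',')
    .map(part => part.split(';')[0].trim()) // drop q-values like ";q=0.9"
    .includes('application/xhtml+xml');
}

// Firefox-style header: explicit XHTML support.
console.log(acceptsXhtml('text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8')); // true

// IE-style header: claims everything, names nothing specific.
console.log(acceptsXhtml('*/*')); // false
```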


Lol, that's quite funny.

 

Well, who would want to impersonate IE? Wouldn't this work...?

 

if (!preg_match('/msie/i', $_SERVER['HTTP_USER_AGENT']))
{
    // Note: the correct MIME type is application/xhtml+xml, not "xml+xhtml"
    header('Content-Type: application/xhtml+xml');
}
else
{
    header('Content-Type: text/html');
}


Yes, but as mentioned, by doing that you're breaking forward compatibility.

 

Now imagine that, after you do this, IE gains support for XHTML. You are then actively hindering the adoption of the very technology you yourself are promoting. Of course your site alone won't make a difference, but the aforementioned is why UA sniffing is generally regarded as bad practice.

 

Even then, why jump through hoops like that when you could just make it universally correct by coding in HTML 4.01 Strict?
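For reference, a minimal HTML 4.01 Strict skeleton looks like this (title and body content are placeholders; note that void elements like br are simply left unclosed):

```html
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
    "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
  <title>Example page</title>
</head>
<body>
  <p>A line break is just <br> here, with no trailing slash.</p>
</body>
</html>
```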


Well, tbh, IE shouldn't be so stubborn to begin with. As that blog post states, what IE basically does is see what the other browsers are doing, do the complete opposite, and then, when they realise no one likes their new standards, eventually do what all the other browsers were doing in the first place. So I reckon IE will support XHTML soon anyway...

 

But yeah, my entire site runs through one index file, so it'll take me all of about ten seconds to change it back from the HTML to the XHTML MIME type again...

 

And yeah, but that's like saying "hey, why not use this language because it's widely supported, instead of your more robust language that you already know and have coded your entire site in?"


One thing I have to admit: it makes it rather confusing when the standards maker fails to properly follow its own standard (www.w3.org, a.k.a. the W3C). On a positive note, though, the use of XHTML and XML has forced Microsoft to comply more than ever with web standards, and it has caused Microsoft to change many things, including their browser, so much so that their broken ways have caught up with them.

http://www.theregister.co.uk/2008/12/04/interent_explorer_8_list/


But yeah, my entire site runs through one index file, so it'll take me all but about ten seconds to change it back from html to xhtml mime type again...

 

You're totally missing the point. There are a LOT of enterprise applications that are developed against e.g. IE6's behaviour. This hinders adoption of newer versions of IE because those enterprises would have to spend time adjusting to newer standards. You're also forgetting that just because you may remember to change your code, it doesn't mean everybody else will.

 

Now imagine that you are developing a popular product and you decide to use the sort of UA sniffing you're talking about. Eventually a lot of sites will be deployed with that version. Later on IE gets support for XHTML, and you upgrade your software accordingly. Alas, not all of your users are going to upgrade to the latest version, and you're now stuck with a lot of websites doing incorrect UA sniffing.

 

Generally speaking, you shouldn't test which user agent it is, but rather which features it supports. The problem is when the user agent lies or gives inaccurate information. Then there is the fact that you cannot trust the user agent at all, seeing as anyone can send whatever user agent string they want. I could easily pretend to be Googlebot if I wanted to.
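A rough sketch of that difference, using made-up stand-in objects rather than real browser globals: a self-reported name is trivially spoofed, while a capability either exists or it doesn't.

```javascript
// Sketch: prefer feature testing over UA sniffing. This helper asks an
// object whether a capability exists instead of trusting a self-reported
// name. The "fake browser" objects below are purely illustrative.
function hasFeature(globalObj, featureName) {
  return typeof globalObj[featureName] !== 'undefined';
}

// A user agent string is self-reported, so sniffing it proves nothing...
const spoofed = { userAgent: 'Googlebot/2.1' }; // anyone can claim this

// ...but asking for the feature itself gives an honest answer.
const fakeBrowser = { XMLHttpRequest: function () {} };
console.log(hasFeature(fakeBrowser, 'XMLHttpRequest')); // true
console.log(hasFeature(fakeBrowser, 'postMessage'));    // false
```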


The other problem I see is this: now that IE has made it standard to see the web its own way, they want to change that. They want to be compliant. But how many users will keep using a browser that, to them, no longer appears to work? And we all know webmasters will refuse to adjust to this new attempt unless it becomes the majority among IE browsers, which, given the current standings, looks to be quite a challenge. How long will that take? And once they do adjust, the older browsers become completely obsolete.

http://www.w3schools.com/browsers/browsers_stats.asp


Well, it's IE's own fault tbh; they should have just been standards compliant in the first place. And as the stats show, they're becoming less popular...

 

It's only a matter of time before IE is annihilated and then everyone will be happy...:D

 

 


It's not quite as easy as you make it sound. Back in the day, virtually all browsers created their own proprietary features that differed slightly from the standards. IE is the only one of the old browsers that's left. Firefox first appeared in 2002 as Phoenix (later Firebird, and finally Firefox). In 2003 Apple forked the KHTML rendering engine into WebKit and created Safari. IE started out in 1994. The only other old browser that can still be said to be in use is Opera, which also started out in 1994, the same year as IE. Then there's Chrome, which started less than a year ago. I don't think you can realistically claim that there are other serious players on the market today.

 

So eventually the first browser war started, between Netscape Navigator and Internet Explorer. This had certain consequences. In a way you can compare it to the Cold War: the USSR and the USA competed on having the most weapons, and neither could afford to fall behind, because that would mean losing. It was pretty much the same thing here. Netscape and IE competed on having the most features. First of all, this meant that bug fixing was neglected; bug fixes aren't noticeable, but new features are. It also meant that a lot of proprietary features were implemented. Going through a standards committee like the W3C takes time, and in a feature race you do not have time, so they just did it themselves (all of them, not just Microsoft). Some of these proprietary features were in fact accepted as standards later on. Without the browser wars we wouldn't have XHR, without XHR no AJAX, and without AJAX no Web 2.0 (of course somebody might have invented it later, but still).

So eventually Microsoft won the browser war and Internet Explorer became the dominating browser (like the USA won the Cold War and became the only remaining superpower), vastly outnumbering all other browsers in market share. Hell, even today a lot of people think that Internet Explorer is the internet ("I opened the internet", i.e. clicked on "the blue E"; "the internet isn't working, but I can send emails"; etc.).

 

By then a lot of enterprises had invested a lot of money in developing for Internet Explorer. The problem was just that IE was far from standards compliant, and they were locked in. As a business you cannot just break backwards compatibility; that would be an absolutely moronic move. It was no problem for the browsers other than IE, since their market share was far too insignificant. This ultimately leads us to today, where Internet Explorer is caught between enterprise backwards compatibility and standards compliance, and the two choices are incompatible with each other. Choosing the former means falling behind the other browsers. Choosing the latter means dropping the enterprise, which would be devastating.

 

We're having the second browser war now. During the first browser war the WWW was new and fairly undeveloped, so it was about adding new features and being the most advanced (and IE was at some point the most advanced browser). Today we have a lot of advanced things standardized, and experience has shown that standards are the way to go. So this browser war is about being as standards compliant as possible.

 

The world isn't as black and white as you make it out to be. Microsoft cannot "just adopt the standards". Does this mean IE will eventually fail and die? Probably, but until then we will have to deal with it in light of its market share.


I agree with what you are saying; it is not as simple as just converting, especially since IE is based on the first browser that had images, NCSA Mosaic (http://webaim.org/blog/user-agent-string-history/). But I do believe the necessity to convert is being felt by Microsoft, hence the change in stance; however, as stated, this could be devastating and could ultimately lead to IE losing the next browser war. Either way it should be interesting, and through it all the beneficiaries are the webmasters, but only those webmasters who ensure their sites are compatible with the majority of viewers. This time around the majority of viewers will be following standards, making our lives simpler. That said, there is still a feature war underway, and each major player is actively fighting it right now. The ability of webmasters to utilize those features will create standouts among websites, but it will also keep webmasters UA sniffing, and thus the war marches on.


  • 4 weeks later...

I'm currently building a site. It's only one page at the moment, as it's the template I'm building upon, and it's XHTML served as application/xhtml+xml. Reading through this (and, being a Mac user, having stupidly forgotten about IE completely), I'd like to change it to the latest strict HTML. It currently validates as strict XHTML.

 

Is there a guide or resource to help with the transition, or is it just a case of changing the MIME type and getting rid of the closing slashes in <img> and <br> (etc.) tags?


Unless you've been doing funky stuff, the transition should be rather trivial. If you have a valid XHTML 1.0 Strict document, then it should mostly just be the self-closing tag thing. The W3C specifications and the W3C validator should help you with that.
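If you want to automate the mechanical part, a throwaway script along these lines could do it. This is a regex sketch, not a robust parser; the element list and function name are my own, and a real conversion should also swap the DOCTYPE and review any xml-specific attributes.

```javascript
// Sketch: drop the trailing slash from XHTML-style void elements,
// e.g. <br /> -> <br> and <img src="x" /> -> <img src="x">.
// Illustrative only; a regex will not handle every edge case.
const VOID_ELEMENTS = 'img|br|hr|input|meta|link|area|base|col|param';

function stripSelfClosing(html) {
  return html.replace(
    new RegExp('<((?:' + VOID_ELEMENTS + ')\\b[^>]*?)\\s*/>', 'gi'),
    '<$1>'
  );
}

console.log(stripSelfClosing('Line one<br />'));          // Line one<br>
console.log(stripSelfClosing('<img src="logo.png" />'));  // <img src="logo.png">
```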


Reading some of the posts in this thread, I have come to the conclusion, along with my own thoughts, that it is really up to the webmaster to decide the web standard; you can code all you want, but it won't count for anything if it doesn't work.

 

I personally believe that HTML 4.0 Strict with lowercase tag names and trailing slashes in singular tags is the best method of programming in HTML, and I see no point in the new XHTML, as it doesn't add anything in my view. Keep the good old HTML and just make new standard versions of that; features of the language should be replaced with new and better methods as we go, not completely redesigned just because someone thought it might be a good idea.

 

Anyway, that's just my view, but it's probably the easiest way to code in any version of HTML, as it should comply with all standards.


http://blogs.zdnet.com/microsoft/?p=2072

Even live.com is there :S

 

Generally speaking, you shouldn't test which user agent it is, but rather test which features it supports. The problem is just when the user agent lies or is giving inaccurate information. Then there is the fact that you cannot trust the user agent seeing as you can send whatever user agent you want. I could easily pretend to be Googlebot if I wanted to.

 

I think you're right, but many webmasters don't think like that. The only way for Internet Explorer to win any browser war is to die, start over under another browser name with a new user agent, slowly invite users to install the new one, and drop support for Internet Explorer. I don't think it will be possible to make Internet Explorer standards compliant, since they have a bad history and a lot of bad code that relies on the user agent instead of testing features.

 

As for the XHTML/HTML question, I don't think it's really that important. Strict over Transitional matters far more than XHTML over HTML. I personally use HTML 4.01 Strict over anything else whenever possible.


I personally beleive that HTML 4.0 strict with lower case tag names and trailing slashes in singular tags is the best method of programming in html [...]

 

You think deliberately putting syntax errors in your document is the "best method of programming"? That seems a bit odd to me.


Archived

This topic is now archived and is closed to further replies.

