
W3C Releases Drafts For DOM L2 And More

timothy posted more than 11 years ago | from the standards-to-choose-from dept.

The Internet

TobiasSodergren writes "People at W3C seem to have had a busy Friday, according to their website. They have released no fewer than 4 working drafts (the Web Ontology Language (OWL) Guide and the QA Working Group's Introduction, Process and Operational Guidelines, and Specification Guidelines) and 2 proposed recommendations: XML-Signature XPath Filter 2.0 and HTML DOM 2. Does this mean that one can expect browsers to behave in a predictable manner when playing around with HTML documents? Hope is the last thing to leave optimistic people, right?"
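For a concrete sense of what the DOM Level 2 recommendation standardizes, here is a minimal sketch using Python's `xml.dom.minidom`, which implements the core DOM interfaces; the document and element names below are invented for illustration:

```python
from xml.dom.minidom import parseString

# Parse a tiny document into a DOM tree (markup invented for illustration).
doc = parseString("<html><body><p id='greet'>Hello</p></body></html>")

# Core DOM calls the spec standardizes: query, create, and attach nodes.
para = doc.getElementsByTagName("p")[0]
print(para.getAttribute("id"))          # attribute access

new_para = doc.createElement("p")
new_para.appendChild(doc.createTextNode("World"))
para.parentNode.appendChild(new_para)   # tree manipulation

print(len(doc.getElementsByTagName("p")))
```

The promise of the recommendation is that the same sequence of calls behaves identically in any conforming implementation, browser or otherwise.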


150 comments


and there was one... (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#4635816)

one post that came first. huzzah.

firstuspostus (-1)

medicthree (125112) | more than 11 years ago | (#4635819)

indahostus.

Re:firstuspostus (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#4635830)

HAHAHA YOU LOSE!@ I AM TEH WINNAR!@$@ I HAVE DESTROEYD U ON TEH INTARWEB!@!#

Now I am going to go play some tomorrowind becuz my cycllops is teh coolests!@

Lameness filter encountered. Post aborted!
Reason: Don't use so many caps. It's like YELLING.

W3C standards getting out of hand (3, Funny)

Anonymous Coward | more than 11 years ago | (#4635823)

Who needs more than h1, b, and i tags for documents?

Re:W3C standards getting out of hand (1)

Trusty Penfold (615679) | more than 11 years ago | (#4635846)


Me; I need <ecode> (Whatever _that_ is)

Re:W3C standards getting out of hand (4, Funny)

GimmeFuel (589906) | more than 11 years ago | (#4635874)

You're forgetting the tags that are the basis of strong web design, like marquee, blink, bgsound, etc. Script tags are also very useful, for scrolling status bars, alerts that tell you how cool the page is, text that follows your cursor, and anything else you can copy/paste from annoyinghtmlforaolers.com

doesn't matter... (3, Insightful)

adamb0mb (68142) | more than 11 years ago | (#4635824)

doesn't matter how many standards the W3C sets, MS is never going to follow them. They'll just set their own standards, and those will become the de facto standards... it's rough, but that's the way it is...

Re:doesn't matter... (3, Interesting)

Goalie_Ca (584234) | more than 11 years ago | (#4635853)

I agree exactly with that. How many standards does IE 6 adhere to? One of my professors actually uses Microsoft Office (or some other MS product) to make his website and its components, and it is a pain in the ass to access unless I'm using IE 6. In fact, I was using Mozilla and ended up missing 6 pages from a document. I don't see why or how MS needs to break standards other than for their own agenda. If they do set their own standards, it should be something the whole world can agree upon. Communication technologies should all follow standard protocols!

Hello good sir (-1, Troll)

Anonymous Coward | more than 11 years ago | (#4635899)

I have been enlisted by teh slalshdort crew to inform you that you are teh sux.

Thank you for your time, and have a loverly day.

The rest of this post goes out to the lyrics guy

Four Fingers
Negativland

I am a man, a man with four fingers
A man with four fingers on my hand
I am a man, a man with four fingers
But that doesn't count my thumb.

I am a man, a man with three fingers
A man with three fingers on my hand
I am a man, a man with three fingers
But that doesn't count my index finger or my thumb.

I am a man, a man with two fingers
A man with two fingers on my hand
I am a man, a man with two fingers
But that doesn't count my middle finger, my index finger, or my thumb.

I am a man, a man with one finger
A man with one finger on my hand
I am a man, a man with one finger
But that doesn't count my ring finger, my middle finger, my index finger, or my thumb.

(whistling solo)

I am a man, a man with no fingers
A man with no fingers on my hand
I am a man, a man with no fingers
But that doesn't count my pinky or my ring finger, my middle finger, my index finger, or my thumb.

Re:Hello good sir (-1, Troll)

Anonymous Coward | more than 11 years ago | (#4635915)

Lily is dancing on the table
We've all been pushed Too far
I guess on days like this
you know who your friends are
Just another Dead Fag
to you that's all
Just another Light missing
on a long taxi ride
taxi ride

Re:doesn't matter... (5, Insightful)

ender81b (520454) | more than 11 years ago | (#4635952)

Ok... you tripped mode.

I work in a student computer lab for a fairly large university, about 28,000 students. You wouldn't *believe* the problems I have to deal with because of stupid, and I stress stupid, professors using stuff like MSword/powerpoint for their class notes and webpages.

I'll give you a few examples. PowerPoint is the most common format for posting class notes. All good and fine, because thanks to OpenOffice even a Linux box can read PowerPoint slides just fine. The problem is printing them. Since we have only dot matrix printers (long story...), if the professor uses too weird a color scheme the slides don't print worth a damn, even with the 'print only black/white' option checked. Problem #1.

The bigger problem is when they use MS Word to post syllabi, notes, etc. Students have a problem viewing them at home for whatever reason (most likely they are using an old version of Word) and have to come back to campus to look at this stuff. It is insane. I always direct them to install OpenOffice, but sometimes they might only have a modem, so it isn't really an option. And if you talk to these professors about only posting stuff in MS Word, they get defensive and say things like 'everyone uses it'. Pointing out that just clicking 'save as rich text format' would cover 99% of the stuff they publish just doesn't work. Sigh. It is becoming a real problem. Same with webpages -- what standards? Microsoft is a standard; I'm sure this would work fine if you would use a *Microsoft* browser, etc., etc.

Not that all professors are dumb; a lot use things like rich text format and try to stay away from Word, but a lot don't. It is a major headache for some students, and for me. And don't even get me started on how IE handles Word documents -- it has the nasty tendency to embed them within the current frame, which causes havoc with printing, saving, etc., at least for your average student.

Seriously, more teachers need to be educated on things like open formats. For instance, it wouldn't be that hard to develop a campus-wide XML format and a nice little front-end for making syllabi, class notes, outlines, etc. available to all faculty. That way you could ensure that everyone had equal access to the documents instead of forcing students to use MS products.
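As a sketch of what such a campus-wide format could look like -- the element names and course data below are entirely hypothetical, not any real standard -- a small XML document plus Python's `xml.etree.ElementTree` is already enough to extract everything a renderer would need:

```python
import xml.etree.ElementTree as ET

# A hypothetical campus-wide syllabus format (names invented for illustration).
SYLLABUS = """\
<syllabus course="CS 101" term="Fall 2002">
  <instructor>Prof. Example</instructor>
  <item week="1">Introduction</item>
  <item week="2">Data structures</item>
</syllabus>"""

root = ET.fromstring(SYLLABUS)
print(root.get("course"))                 # attribute on the root element
for item in root.findall("item"):         # iterate over schedule entries
    print(item.get("week"), item.text)
```

From a neutral format like this, front-ends could emit HTML, RTF, or plain text, so no student is locked into one vendor's viewer.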

Standards (-1, Troll)

Anonymous Coward | more than 11 years ago | (#4636024)

Hey modders!

He is saying microsoft is gay! Mod this little fucking sloppy shiteater up!!!

Maybe your professor's balls do not care about open formats because of people like you who do not like showers & soap bother him too much when he makes his stuff too accessable to you.

Linux is our way of keeping the herbs away.

Re:Standards (-1, Troll)

Anonymous Coward | more than 11 years ago | (#4636092)

You fucking faggy sloppy cunts are always good for a laugh. Now fuck off.

Re:doesn't matter... (0)

Anonymous Coward | more than 11 years ago | (#4636029)

You might try to get a campus-wide license for Adobe Distiller and make people post pdf's instead. Then people can use Word or TeX or whatever and still be able to post things.

Re:doesn't matter... (2)

ender81b (520454) | more than 11 years ago | (#4636042)

Indeed. I forgot to mention a lot of people do post things as PDFs -- it just depends on whether the department spent enough to buy the license for whatever product, be it Distiller or Acrobat. Or whether they know about it or have the inclination to spend the money, which kind of gets back to the original point of believing Microsoft *is* the standard.

Re:doesn't matter... (1, Informative)

Anonymous Coward | more than 11 years ago | (#4636119)

You can create PDF files for free by setting up a generic Postscript printer, printing to a file, and using Ghostscript to convert to PDF (it comes with a ps2pdf script). Or you can use ps2pdf.com to convert it online.

Re:doesn't matter... (1)

Sycraft-fu (314770) | more than 11 years ago | (#4636212)

Well, if your university wants something like you suggest, they need to do it. They need to implement a campus-wide system that all professors have access to. They then need to make training on how to use the system either available or possibly mandatory. If they really want it to take off, they need to mandate its use.

However, I really don't feel much sympathy for students, as I don't see professors using MS Office, or whatever else they like, as a problem. There is always the simple option of attending class and picking up the hardcopy when it is passed out. Indeed, many classes I have taken have no website at all, and it is your responsibility to attend class and get your information that way.

Also, all the universities I have seen do at least a passable job (and usually much better) of providing computer facilities in places like the main library. It is not hard to go to the library and print what you need.

If you want to mandate that professors all must use a given system for their websites, fine, but you'd better be prepared to make sure it works WELL and provide any and all necessary support for them to use it. Otherwise, they need to be allowed to use what they like.

Re:doesn't matter... (1)

Ed Avis (5917) | more than 11 years ago | (#4636286)

At university I found you could usually tell how good a lecturer would be by the material used for slides. Those using LaTeX and its slides package usually had the most interesting courses (if more difficult); those with wordprocessors in the middle; PowerPoint usually meant fairly fluffy. There were exceptions and it wasn't a perfect correlation, but it was certainly a factor in choosing what course to take.

Re:doesn't matter... (1)

smitty_one_each (243267) | more than 11 years ago | (#4636535)

How about a VBA macro to translate MSWord into HTML. Sure, some fidelity loss, and getting the professors to use it would be like pushing Jell-O up a hill with a toothpick, but it would be something...

Re:doesn't matter... (0)

Anonymous Coward | more than 11 years ago | (#4636059)

Because it takes years of bureaucratic dicking around before committees actually produce anything. Look at Sun's "Liberty Alliance" framework committee. They haven't even come up with a working prototype yet, and it's been nearly 2 years. On the other hand, Passport has 120 million users (yes, 80% of them are Hotmail users). It may be proprietary, but it exists and works here today. The Liberty Alliance has produced nothing but vapor.

IE6 W3 support (5, Interesting)

Cardinal (311) | more than 11 years ago | (#4636073)

Actually, IE6 does a decent job. Their DOM1 support is good, their CSS1 is more or less complete, but their CSS2 is pretty crappy. Fixed positioning doesn't work, selectors [w3.org] like E[attr] are missing, etc.

Lately I've been working on an app for a company's internal use, which means the delightful situation of being able to dictate minimum browser requirements. As a result, the app is designed for IE6/Mozilla. All development has been in Mozilla, and a lot of DOM use goes on. And it all works in IE6, with no browser checking anywhere. My only regret is that I can't make use of the more advanced selectors provided by CSS2, so the HTML has a few more class attributes than it would need otherwise. But, overall, not bad.

Another positive note, IE6 SP1 finally supports XHTML sent as text/xml. So at last, XHTML documents can be sent with the proper mime type [hixie.ch] .

So despite being a Mozilla (Galeon) user, as a web developer who makes heavy use of modern standards, I look forward to seeing IE continue to catch up to Mozilla so that I can worry even less about browser-specific issues.

Re:doesn't matter... (1, Insightful)

mijok (603178) | more than 11 years ago | (#4636181)

In case you haven't noticed, MS benefits enormously by breaking standards and creating their own. To the average user, MS standards are the only standards, and since OpenOffice, Mozilla, etc. can't implement .doc and their HTML 100% correctly, it makes them look bad, i.e. "that must be crap; my homepage looked good in IE".

Re:doesn't matter... (2)

taion (304184) | more than 11 years ago | (#4635969)

I don't think that's necessarily true. It's a given that Microsoft's track record in terms of standards compliance has been exceptionally poor relative to Mozilla and other similar efforts, but "MS HTML" is somewhat closer to w3c's standards at the present than they were previously.

Also, while IE is the most popular browser, it's not the only one, and a not insignificant proportion of the population uses Mozilla, Opera, and other browsers. Somewhat hypocritical of me, since I'm currently using IE on my Windows partition, as opposed to Mozilla on my FreeBSD partition, but on purely technical merits, IE isn't really the best browser, and the optimist in me is convinced that the greater portion of the online population will eventually go for the better solution. On the other hand, if they don't, why should we worry about it? The proletariat can do as they please. So long as "MS HTML" doesn't somehow become entirely proprietary, we retain the ability to access it, plus we get to view properly-rendered pages. Whee.

Don't forget, either, that Microsoft actually is a member [w3.org] of the w3c. Microsoft can be accused of many things, but blatantly violating one's own standards is a rather stupid thing to do.

No. (5, Insightful)

Trusty Penfold (615679) | more than 11 years ago | (#4635836)

Does this mean that one can expect browsers to behave in a predictable manner

When there was 1 standard (HTML), browsers didn't behave predictably.

Now there are more, there is more scope for implementations to have their quirks, not less.

Standards are large and complicated descriptions of expected behaviour. Each implementor may have a slightly different interpretation. Different implementations will have their strengths and weaknesses which make different parts of the standard easier or harder to implement fully and/or correctly. There may even be reasons why an implementor may choose to ignore part of a standard (perhaps it is difficult and he believes that users don't want or need that functionality yet).

Unfortunately, standards are an ideal to aim for, not a description of reality.

C++ XML API (4, Interesting)

be-fan (61476) | more than 11 years ago | (#4635856)

I've been looking around for a nice simple API to XML parsers, and I've yet to find one. Java and Perl both have clean, native-feeling XML APIs (JDOM and XML::Simple) but so far, the only C++ ones I've found map closely to DOM's overly complicated object model, and don't "feel" like C++ libraries (they don't use the STL and whatnot). Anybody know of a library along the lines of JDOM except for C++?

Re:C++ XML API (3, Informative)

sporty (27564) | more than 11 years ago | (#4635868)

Have you tried the Xalan type stuff? http://xml.apache.org [apache.org]

Re:C++ XML API (2)

be-fan (61476) | more than 11 years ago | (#4635914)

Xerces C++ is a very good XML parser, but it's for really heavy duty stuff, not at all like JDOM or XML::Simple. Plus, the API is almost 1:1 to the DOM API, and isn't very C++ at all. From the Xerces C++ page:

"For portability, care has been taken to make minimal use of templates, no RTTI, no C++ namespaces and minimal use of #ifdefs."

The API is basically C with classes, uses XMLChar * instead of std::string, etc. I'm looking for something more along the lines of the Boost or Loki libraries in that they integrate cleanly with the STL.

Let me use JDOM and XML::Simple as examples. They both simplify the (IMHO too complex) DOM model, as well as fitting closely to the language. JDOM, for example, uses standard Java strings and containers, while XML::Simple uses Perl associative arrays.
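The contrast being drawn here can be illustrated in Python (chosen only for brevity; the parent is after a C++ equivalent): a few lines on top of a parser collapse the DOM's node machinery into native containers, XML::Simple style. The helper below is a hypothetical sketch, not an existing library:

```python
import xml.etree.ElementTree as ET

def to_simple(elem):
    """Collapse an element into native dicts/lists, XML::Simple style."""
    children = list(elem)
    if not children:
        return elem.text          # leaf element -> plain string
    out = {}
    for child in children:        # group repeated tags into lists
        out.setdefault(child.tag, []).append(to_simple(child))
    return out

doc = ET.fromstring(
    "<config><server>alpha</server><server>beta</server>"
    "<port>8080</port></config>")
simple = to_simple(doc)
print(simple)   # plain dicts and lists, no Node objects in sight
```

The point of a JDOM- or XML::Simple-style API is exactly this: the result is ordinary language-native data, not a parallel object model.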

Re:C++ XML API (0)

Anonymous Coward | more than 11 years ago | (#4635882)

and too bad the PHP ones are all difficult to use and require a custom compilation of several trees :(

I'm going back to ASP.net :(

Re:C++ XML API (2)

lpontiac (173839) | more than 11 years ago | (#4636017)

I haven't used it yet, but looking at Arabica [jezuk.co.uk] is on my todo list. No STL integration, but it does deliver std::string or std::wstring.

Re:C++ XML API (5, Informative)

KidSock (150684) | more than 11 years ago | (#4636052)

I've been looking around for a nice simple API to XML parsers, and I've yet to find one. Java and Perl both have clean, native-feeling XML APIs (JDOM and XML::Simple) but so far, the only C++ ones I've found map closely to DOM's overly complicated object model, and don't "feel" like C++ libraries (they don't use the STL and whatnot). Anybody know of a library along the lines of JDOM except for C++?

Someone posted a neat little class to the expat mailing list ~2 years ago. Basically it was just a Node class with an STL list for children and a hashmap for attributes. It was very small and clean, and was in essence a DOM. It used expat, but trust me, the code was so tiny you could use any parser with it. It was like 200 lines of code.

I liked it so much I created the same thing in C called domnode [eskimo.com] .

Search the expat archives [sourceforge.net] . Wish I could give you more to go on.
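That mailing-list class is easy to approximate. Below is a Python reconstruction of the same idea -- a bare Node with an attribute map and a child list, filled in by expat callbacks. The original was C++ with STL containers; this sketch is only an analogue, not the posted code:

```python
import xml.parsers.expat

class Node:
    """A bare DOM-ish node: tag name, attribute map, child list."""
    def __init__(self, name, attrs):
        self.name, self.attrs = name, attrs
        self.children, self.text = [], ""

def parse(xml_text):
    root = Node("#document", {})
    stack = [root]                       # stack of currently open elements

    def start(name, attrs):              # push a new node on open tag
        node = Node(name, attrs)
        stack[-1].children.append(node)
        stack.append(node)

    def end(name):                       # pop on close tag
        stack.pop()

    def chars(data):                     # accumulate character data
        stack[-1].text += data

    p = xml.parsers.expat.ParserCreate()
    p.StartElementHandler = start
    p.EndElementHandler = end
    p.CharacterDataHandler = chars
    p.Parse(xml_text, True)
    return root.children[0]

tree = parse("<a id='1'><b>hi</b></a>")
```

As with the original class, the parser is incidental: any event-driven parser that reports start/end/character events could drive the same three callbacks.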

Re:C++ XML API (2)

Ed Avis (5917) | more than 11 years ago | (#4636281)

It's probably not what you want but FleXML [sourceforge.net] is a very fast way of parsing XML that conforms to a particular DTD. It's like lex and yacc - is that C++-like enough?

I completely agree about all the weird reinvent-the-wheel stuff that DOM and similar libraries contain: it would be so much better if they could use the STL in C++ and native data structures in other languages (nested lists in Lisp, etc etc). It's just that a basic function call interface is the lowest common denominator, so if you want the same library on every language you have to invent a whole new list and tree API. Perhaps this is an indication that the same library on every different language isn't such a good idea. (Think of the Mozilla debate: 'the same on every platform' versus 'native on every platform'. I have a feeling that in programming languages as well as GUIs the second choice is better.)

Standards (2, Interesting)

Stanley Feinbaum (622232) | more than 11 years ago | (#4635857)

Web standards set by the W3C have little meaning right now. Standards are controlled by marketshare, and Internet Explorer has been the leading browser for at least a couple of years. Surely Mozilla and Opera will follow these standards, as they always have, but will IE do the same?

Perhaps it's time we stopped sitting on our thumbs and complaining about Microsoft ignoring standards. An outright ban of IE is needed, from workplaces, schools, etc. Sites should block access to people using IE. This is the only way we can get our rights to web standards back!

Seriously though, does anyone have any ideas on how we can take control of web standards away from MS ?

Something about reading Eolas thingie.. (1)

euxneks (516538) | more than 11 years ago | (#4635890)

Seriously though, does anyone have any ideas on how we can take control of web standards away from MS ?

I remember a slashdot link [slashdot.org] somewhere mentioning something about IE getting eliminated due to some sort of plugin junk?

Re:Standards (2)

Soko (17987) | more than 11 years ago | (#4635902)

Perhaps it's time we stopped sitting on our thumbs and complaining about Microsoft ignoring standards. An outright ban of IE is needed, from workplaces, schools, etc. Sites should block access to people using IE. This is the only way we can get our rights to web standards back!

Y'know, in a perfect world, I'd wholeheartedly agree with you. Is it a perfect world? Hence, the diatribe.

Seriously though, does anyone have any ideas on how we can take control of web standards away from MS ?

Ooops, sorry. Cancel diatribe... ;) Seriously, I don't think we as a community can really do anything substantial to Microsoft, since they don't want to listen to us anyway. Advocacy is about the only weapon we have, unless you come up with the "next killer app" that everyone needs and exclude any browser that doesn't follow the W3C standards you espouse. When you do that, you can set terms. Until then, we're just a bunch of Don Quixotes tilting at windmills.

Sorry for the dose of reality.

Soko

Re:Standards (1)

Oluseyi (570657) | more than 11 years ago | (#4635924)

Seriously though, does anyone have any ideas on how we can take control of web standards away from MS ?

Why bother? Have you taken a look at these standards recently? They're huge and unwieldy. Perhaps a more attainable goal is to develop the next generation of browsers -- a blank context for multimedia rendering as directed by the server-side script. Sort of a Shockwave Flash as a native platform.

Re:Standards (4, Informative)

eddy the lip (20794) | more than 11 years ago | (#4635940)

Somedays I'm more optimistic. Today's one of those days (tomorrow may not be 'cause I'm digging deeper into IE's weird-ass DOM than I usually care to). But...

Most web developers that have been around for a while would rather code to standards than to marketshare. Standards give you the promise of backward, and more importantly, forward, compatibility. It's also a helluva lot easier to sort out your code when a client asks for a redesign in a year or two if you've been conscious of more than just "making it look right" in the popular browser of the day.

Markup designed for IE only often does truly evil things on other platforms - there's going to be more cellphones and PDAs accessing web pages, not fewer. There are also serious organizational advantages to coding to standards - more tools for handling your pages, it's easier to whip up a quick perl script to process standards compliant HTML...the list of advantages is long.

Just like any other field, there's a trickle-down effect. Not everyone will write good, W3C compliant code, but more will, more often. And despite their megalithic, feudal mentality, Microsoft will have to pay attention. IE6 is still a long ways away from adhering to standards, but it's much, much closer than IE4 was. This seems to have been in large part a reaction to developers bitching about their lack of compliance. I'm hopeful the trend will continue.

Re:Standards (3, Interesting)

frawaradaR (591488) | more than 11 years ago | (#4636034)

Yeah, some really popular sites (like Slashdot) need to use standards compliant code and not cover for browser bugs. Wired recently went XHTML and CSS2. This is the way to go. If a browser can't render it, file a bug. If it doesn't work in IE, too bad!

My own homepage doesn't render in anything but Mozilla, currently, but small, personal sites aren't gonna break or make anything (unless they come in the millions, which is unlikely).

The people at Mozilla have provided us with a tool of 99% perfect rendering. Now it is up to the web site maintainers to actually enforce the use of Mozilla (or any other browser that fully adheres to standards; there is no other currently).

But Slashdot won't take this upon its shoulders, because it doesn't believe in standards, just like M$.

So M$ wins.

Re:Standards (4, Informative)

whereiswaldo (459052) | more than 11 years ago | (#4636079)

If a browser can't render it, file a bug. If it doesn't work in IE, too bad!

Many sites can get away with this, but many cannot. If I'm selling a product on the web, I'll make darn sure that 99% of my customers' browsers work with my site. It's a good ideal to say "fix your IE bugs", but often not realistic.

Re:Standards (2, Interesting)

MisterFancypants (615129) | more than 11 years ago | (#4636060)

Surely Mozilla and Opera will follow these standards, as they always have, but will IE do the same?

That depends quite a lot on your definition of ALWAYS as it applies to Mozilla... considering Mozilla was originally based on the Netscape source code (though I realize it has now been virtually completely rewritten). People seem to forget that Netscape were the kings of non-standard HTML as an attempt to "lock in" customers. Hell, IE still to this day includes Mozilla in its user agent header to work around all the sites that would deny access to anything other than Netscape, back in the 2.0 era.

Re:Standards (2)

whereiswaldo (459052) | more than 11 years ago | (#4636084)

Hell, IE still to this day includes Mozilla in its user agent header to work around all the sites that would deny access to anything other than Netscape, back in the 2.0 era.

I'm very surprised at this. It's Microsoft's style to turn around and bite people in the ass when they have the upper hand. I wonder why MS hasn't "forced" Netscape-only sites to change by updating their agent header?

No need - they have Passport (2)

DrSkwid (118965) | more than 11 years ago | (#4636260)

Here in the UK the Govt. has snuggled up nicely and they're rolling out IE-only Govt. services.

Changing headers is no use in that scenario

DOM Lvl 2 (0)

Anonymous Coward | more than 11 years ago | (#4635860)

Imagine a beowulf cluster of these!

Slashdot's Recent Exodus (-1)

Anonymous Coward | more than 11 years ago | (#4635864)

The Backstory

We realized soon that our setup at Digital Nation was very flawed. We were having great difficulty administering the machines and making changes. But the real problem was that all the SQL traffic was flowing over the same switch. The decision was made to move to Exodus to solve these problems, as well as to go to a provider that would allow us to scatter multiple data centers around the world when we were ready to do so.

Meanwhile, Slashcode kicked and screamed its way to v1.0 at the iron fists of CaptTofu (Patrick Galbraith) and Pudge (Chris Nandor). The list of bug fixes stretches many miles, and the world rejoiced, although Slashdot itself continued to run the old code until we made the move.

The Colocation Site

Slashdot's new co-location site is now at Andover.Net's own (pinky finger to the mouth) $1 million dedicated data center at the Exodus network facility in Waltham, Mass., which has the added advantage of being less than a 30-minute drive for most of our network admins -- so they don't have to fly cross-country to install machines. We have some racks sitting at Exodus. All boxes are networked together through a Cisco 6509 with 2 MSFCs and a Cisco 3500, so we can rearrange our internal network topology just by reconfiguring the switch. Internet connectivity to/from the outside world all flows through an Arrowpoint CS-800 switch, which acts as both a firewall and a load balancer for the front-end Web servers. It also happens that Arrowpoint shares the same office building with Andover.Net in Acton, so whenever we need Arrowpoint tech support we just walk upstairs and talk to the engineers.

The Hardware

5 load balanced Web servers dedicated to pages
3 load balanced Web servers dedicated to images
1 SQL server
1 NFS Server
All the boxes are VA Linux Systems FullOns running Debian (except for the SQL box). Each box (except for the SQL box) has LVD SCSI with 10,000 RPM drives. And they all have 2 Intel EtherExpress 100 LAN adapters.
The Software

Slashdot itself is finally running the latest release of Slashcode (it was pretty amusing being out of date with our own code: for nearly a year the code release lagged behind Slashdot, but my, how the tables have turned).

Slashcode itself is based on Apache, mod_perl and MySQL. The MySQL and Apache configs are still being tweaked -- part of the trick is to keep the MaxClients setting in httpd.conf on each web server low enough not to overwhelm the connection limits of the database, which in turn depend on the process limits of the kernel, which can all be tweaked until a state of perfect zen balance has been achieved... this is one of the trickier parts. Run 'ab' (the Apache bench tool) with a few different settings, then tweak SQL a bit. Repeat. Tweak httpd a bit. Repeat. Drink coffee. Repeat until dead. And every time you add or change hardware, you start over!

The AdFu ad system has been replaced with a small Apache module written in C for better performance, and that too will be open sourced When It's Ready (tm). This was done to make things consistent across all of Andover.Net (I personally prefer AdFu, but since I'm not the one who has to read the reports and maintain the list of ads, I don't really care what Slashdot runs).

Fault tolerance was a big issue. We've started by load balancing anything that could easily be balanced, but balancing MySQL is harder. We're funding development efforts with the MySQL team to add database replication and rollback capabilities to MySQL (these improvements will of course be rolled into the normal MySQL release as well).

We're also developing some in-house software (code named "Odyssey") that will keep each Slashdot box synchronized with a hot-spare box, so in case a box suddenly dies it will automatically be replaced with a hot spare -- kind of a RAID-for-servers solution (imagine... a Beowulf cluster of these? rimshot). Yes, it'll also be released as open source when it's functional.

Security Measures

The Matrix sits behind a firewalling BSD box and an Arrowpoint Load balancer. Each filters certain kinds of attacks and frees up the httpd boxes to concentrate on just serving httpd, and allows the dedicated hardware to do what it does best. All administrative access is made through a VPN (which is just another box).

Hardware Details

Type I (web server)
VA Full On 2x2
Debian Linux frozen
PIII/600 MHz 512K cache
1 GB RAM
9.1GB LVD SCSI with hot swap backplane
Intel EtherExpress Pro (built-in on motherboard)
Intel EtherExpress 100 adapter
Type II (kernel NFS with kernel locking)
VA Full On 2x2
Debian Linux frozen
Dual PIII/600 MHz
2 GB RAM
(2) 9.1GB LVD SCSI with hot swap backplane
Intel EtherExpress Pro (built-in on motherboard)
Intel EtherExpress 100 adapter
Type III (SQL)
VA Research 3500
Red Hat Linux 6.2 (final release + tweaks)
Quad Xeon 550 MHz, 1MB cache
2 GB RAM
6 LVD disks, 10000 RPM (1 system disk, 5 disks for RAID5)
Mylex Extreme RAID controller 16 MB cache
Intel EtherExpress Pro (built-in on motherboard)
Intel EtherExpress 100 adapter
Answered by: CmdrTaco
Last Modified: 6/13/00

First "FUCK KDE" post (-1)

Anonymous Coward | more than 11 years ago | (#4635865)

Ok, maybe not....

I just wish one little thing (1)

euxneks (516538) | more than 11 years ago | (#4635873)

My hope is that they eliminate JavaScript from all web browsers, not just the ability to turn JavaScript off... ELIMINATE IT. I have no need for it and I don't go to any webpages that rely on it. The last thing I need is some sleazy company (like maybe Microsoft or something) popping up an annoying advertisement on my desktop.

Re:I just wish one little thing (1)

tq_at_sju (218880) | more than 11 years ago | (#4635883)

JavaScript is good for helping forms behave in certain ways, like allowing you to change how a drop-down list works, etc. JavaScript is definitely needed to add to the functionality of HTML forms; other than that I agree with you totally.

Re:I just wish one little thing (1)

King of the World (212739) | more than 11 years ago | (#4635903)

I use Javascript to give live feedback on form input using regular expressions. DHTML tree-view menus can be useful for expressing a lot of information.
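
A minimal sketch of that kind of live check (the pattern and the element wiring here are illustrative, not from any particular site):

```javascript
// Give live feedback on a form field using a regular expression.
// The pattern below is a deliberately simple e-mail check, for illustration.
var emailPattern = /^[^@\s]+@[^@\s]+\.[^@\s]+$/;

function checkEmail(value) {
  // Returns a message suitable for writing into a status element
  return emailPattern.test(value) ? "OK" : "Please enter a valid address";
}

// In a browser you would wire it up roughly like this (hypothetical names):
//   field.onkeyup = function () {
//     document.getElementById("status").innerHTML = checkEmail(field.value);
//   };
```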

Re:I just wish one little thing (0)

Anonymous Coward | more than 11 years ago | (#4636139)

DHTML tree-view menus can be useful for expressing a lot of information.

Only if they're implemented properly. Each element in the list must be initially visible, then hidden using Javascript while the page is loading. This way people with Javascript enabled will see a collapsed tree (and be able to expand nodes), and people with Javascript disabled will see a fully-expanded tree.

If the nodes were initially hidden, there would be no way for users to see the tree if they had Javascript disabled (I've noticed this on several web sites).
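
In markup, that loading-time trick looks roughly like this (the id and structure are made up for illustration):

```html
<!-- The branch starts out visible; the inline script hides it as the page
     loads. Without JavaScript the script never runs, so the tree stays
     fully expanded and every node remains reachable. -->
<ul>
  <li>Parent node
    <ul id="branch1">
      <li>Child node</li>
    </ul>
    <script type="text/javascript">
      document.getElementById('branch1').style.display = 'none';
    </script>
  </li>
</ul>
```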

Re:I just wish one little thing (2)

skunkeh (410004) | more than 11 years ago | (#4636418)

There are some excellent accessible, standards compliant scripts now for creating trees / drop down menus from HTML nested lists - browsers without javascript see the list, while browsers with javascript get a nice expanding tree. Two examples:

Re:I just wish one little thing (0)

Anonymous Coward | more than 11 years ago | (#4636127)

javascript is good for helping forms behave certain ways, like allowing you to change how a drop down list works etc....

That's fine, as long as the form works without Javascript enabled. I frequently see dropdown navigation lists with no submit button beside them. If Javascript is off, you can select a page from the list, but there's no way to actually navigate to the selected page - this can be very confusing for users.
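
A sketch of the fix (the form action and page paths are hypothetical): keep a real submit button so the form still works with scripting off, and let JavaScript merely shortcut it.

```html
<!-- Without JavaScript: pick a page, press Go, and a server-side /go
     handler issues the redirect. With JavaScript: navigate on change. -->
<form action="/go" method="get">
  <select name="page" onchange="location.href = this.options[this.selectedIndex].value;">
    <option value="/news.html">News</option>
    <option value="/about.html">About</option>
  </select>
  <input type="submit" value="Go" />
</form>
```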

Re:I just wish one little thing (4, Informative)

Cheese Cracker (615402) | more than 11 years ago | (#4635932)

JavaScript is good for many things, like eliminating round trips to the server for basic input checks, making HTML documents smaller (and thereby faster to transmit), dynamically creating HTML in a frame, etc. Other people can probably give you more examples.

If you've got a problem with popup ads, then please download the Opera browser [opera.com] ... you'll find F12 to be your best friend. ;)

If you really want to crusade against something, then VB script is a better candidate or why not Outlook... the worst virus spreading software ever created.

Re:I just wish one little thing (2)

bm_luethke (253362) | more than 11 years ago | (#4636102)

Outlook... the worst virus spreading software ever created.

That reminds me: since I do not use Outlook/Express for e-mail (I use Mozilla at work and Opera's stuff at home), I just set my address list to use public addresses @ microsoft.com. That way, if for some reason (someone else in the family ignores one of the computer commandments and opens some virus in an attachment) it simply sends the crap to Microsoft and no one else.

Junk snail mail is also handled by removing the postage-paid self-addressed envelope and filling it with metal scraps and placing it in the mail (receivers are charged the postage) -- make the spammers/virus enablers pay whenever you can.

client side scripting: good, JavaScript: bad (2)

g4dget (579145) | more than 11 years ago | (#4636437)

Those are all useful things to do. The problem is with how JavaScript does them. For example, for making HTML documents smaller, a client-side macro facility would be more reliable, more efficient, and simpler. For doing input checks, a pattern language would be better. And on and on.

If JavaScript (by which I mean JavaScript, DOM, DHTML, etc.) were a simple, if limited, solution to those problems, it would be OK. But it isn't. It is much more complicated than technically better solutions, yet it still is extremely limited.

Simple and limited, and complex and powerful are both acceptable engineering tradeoffs. But complex and limited and buggy is a bad engineering tradeoff. And that's JavaScript.

Just because you don't feel the need .... (1)

DrSkwid (118965) | more than 11 years ago | (#4636271)

I use it to help cache my site.
The banner rotation is via js so that the main page can be cached.
(but not annoying pop-up/unders - some of us realise they are a detraction).
Our banners don't link to any external sites.
The banner is part of the web frame of reference.

We have over 500 pages of content so I'm sure you'll excuse us our right to present deep links on our main page.

This is a troll, right? (1)

Sycle (569193) | more than 11 years ago | (#4636288)

Or are you really demanding we all take a nice big step backwards and remove the capacity for client side scripting because you're a caveman and can't understand what it's used for?

Do you think javascript == popup windows? The open window call is abused, and I'd like to see the spec suggest some kind of behaviour along the lines of disregarding popups that aren't user activated (Mozilla already does a great job of this, but making it part of the spec would be superior), but losing client-side scripting would be a blow to the usability of the Internet and to the palette of web designers trying to make intelligent sites.

Client side form validation, adapting pages, and heck, even silly stuff like graphical rollovers which you can't do in CSS yet, are all things the Internet benefits from. Only an idiot would fail to anticipate how their page would work to users who don't have Javascript turned on, but it can make the experience run that much nicer and efficiently.

Not to mention that nuking JavaScript, an open, standards-based, accessible language, will simply promote the use of obnoxious proprietary technology like Flash.

The W3C is a joke (2, Insightful)

Anonymous Coward | more than 11 years ago | (#4635891)

What good is a standard if you never hold anyone's feet to the fire if they don't support it? If developers never have any incentive to actually get it right? If the standards are so vague that it allows for interpretations that can be so drastically different that the standard becomes useless?

Has any company yet written a complete CSS1 implementation? A complete working version of DOM0? Yet here we are toiling away on XHTML and CSS3(!) and DOM Level 2. And they don't even seem to give a rat's ass if anyone actually follows the rules.

From what I hear about CSS3, it's going to be such a massive specification that no company (save Microsoft, if they actually gave a damn) would possibly be able to implement it.

What are we doing? The W3C puts out specifications that become less and less relevant by the year because their possible implementation date grows further and further away. We'll see CSS3 arrive, but will we ever see it in action? Or will it be supplanted by CSS4 and 5, which we will also never see? In the meantime we see developers actually building websites entirely out of Flash because there's one reference implementation (one version, period) and it just works. Is that the future we want?

It's time to hold these clowns accountable. Make them do some real work: make them create a working version of their spec. Make them brand one developer's work as a reference. Make them do something to prove that these standards are more than just empty clouds of words!

Re:The W3C is a joke (0)

Anonymous Coward | more than 11 years ago | (#4635922)

What good is a standard if you never hold anyone's feet to the fire if they don't support it? If developers never have any incentive to actually get it right? If the standards are so vague that it allows for interpretations that can be so drastically different that the standard becomes useless?
True. Similar to the Open Source certification mark, there should be a free W3C certification of compliance with standards.

CSS is becoming a complex syntax of its own; I would have preferred it if they had moved to XML by now.

Flash isn't one version. Everyone knows that. Ahhh. yes, you're a liar / troll. Good one.

Re:The W3C is a joke (3, Informative)

Anonymous Coward | more than 11 years ago | (#4636028)

Has any company yet written a complete CSS1 implementation?
Yes. Mozilla. Got most of CSS2 as well.
A complete working version of DOM0?
Once again, Mozilla. Also supports DOM1. Oh, and most of DOM2. See the Mozilla DOM Support [mozilla.org] doc for the details.
Yet here we are toiling away on XHTML and CSS3(!) and DOM Level 2. And they don't even seem to give a rat's ass if anyone actually follows the rules.
Good job the Mozilla developers care then. Mozilla supports XHTML and some CSS3 (see below) and DOM2 (see above).
From what I hear about CSS3, it's going to be such a massive specification that no company (save Microsoft, if they actually gave a damn) would possibly be able to implement it.
Mozilla implements bits of it, mainly as vendor-specific extensions. No, that's not the same as proprietary. Vendor specific extensions are allowed by the spec if implemented correctly e.g. properties should be prefixed with -vendorname- (Mozilla uses -moz-).
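
For instance, Mozilla exposed the draft CSS3 rounded-corner property this way (a small sketch; the pixel value is arbitrary):

```css
/* Vendor-specific extension: only Mozilla honours the -moz- prefix,
   and other browsers must silently ignore properties they don't recognise. */
.rounded {
  -moz-border-radius: 8px;  /* Mozilla's prefixed CSS3 draft property */
  border-radius: 8px;       /* the property as drafted in CSS3 */
}
```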

Re:The W3C is a joke (2, Interesting)

frawaradaR (591488) | more than 11 years ago | (#4636048)

Mozilla supports XHTML and some CSS3 (see below) and DOM2 (see above).

Unfortunately, Mozilla does not support DOM 2 HTML in XHTML... and probably never will, because the bug assignee doesn't seem to care about this rather crucial bug.

Btw, DOM 0 is not a standard, but a collection of common garbage from the old days. It is supported in Mozilla only for backward compatibility, and people shouldn't use it in new designs. Mozilla explicitly does not support IE- and NN4-only stuff such as document.all and document.layers.

Re:The W3C is a joke (1)

bartok (111886) | more than 11 years ago | (#4636063)

It's better for the W3C to release specifications early on than to have each vendor wait for them and, in the meantime, develop their own proprietary solutions. Browser developers are much less likely to roll their own proprietary specifications if they can just read a W3C one and worry only about the implementation details.

Re:The W3C is a joke (4, Informative)

IamTheRealMike (537420) | more than 11 years ago | (#4636632)

What good is a standard if you never hold anyone's feet to the fire if they don't support it? If developers never have any incentive to actually get it right? If the standards are so vague that it allows for interpretations that can be so drastically different that the standard becomes useless?

You have to have standards. The W3C are the people who are widely recognized as being the technical lead for the net. Now they don't make law, quite right, but if there was no W3C then Microsoft really WOULD own the web: as it is, we can and do take them to task when they break the rules. They can ignore us of course, yet whaddaya know but IE6 supports DOM/CSS Level 1. Not a particularly impressive achievement, but it's a start.

The standards are actually very precise, which is one reason they are seen as being very large. There is hardly any room for interpretation in stuff like the DOM, CSS, XML etc. Of course, sometimes when the internal architecture of IE mandates it, Microsoft simply ignores things, the mime-type issue being a good example, but also the fact that you have to set node.className = "class" to set the style on a new element, as opposed to setting the class attribute (which works fine in Mozilla). Why? Because (according to an MS developer) internally the MS DOM is based on object model attributes, so that's what you have to set.

Has any company yet written a complete CSS1 implementation? A complete working version of DOM0? Yet here we are toiling away on XHTML and CSS3(!) and DOM Level 2. And they don't even seem to give a rat's ass if anyone actually follows the rules.

[sigh] Yes. Mozilla supports DOM and CSS Level 2 and they have partial support for Level 3 now. Level 0 is the term used to refer to the pre-standardized technologies, it doesn't actually exist as a standard so EVERY browser that can script web pages has a level zero DOM. It should be noted that TBL himself has stepped in on occasion to tick off Microsoft about stuff like browser blocks, bad HTML etc.

From what I hear about CSS3, it's going to be such a massive specification that no company (save Microsoft, if they actually gave a damn) would possibly be able to implement it.

Then you hear wrong.

In the meantime we see developers actually building websites entirely out of Flash because there's one reference implementation (one version, period) and it just works. Is that the future we want?

Developers do not build web pages out of flash. Marketing departments do. Luckily most web pages are not built by marketing.

It's time to hold these clowns accountable. Make them do some real work: make them create a working version of their spec.

Poor troll. The W3C already implement all their standards, go to w3.org and download Amaya. Nobody uses it for actually browsing the web, but there it is, proof that an actually very small organization with very few coders can implement their standards.

DOM not HTML (3, Informative)

krokodil (110356) | more than 11 years ago | (#4635907)

Does the this mean that one can expect browsers to behave in a predictable manner when playing around with HTML documents?


You seem to confuse the DOM with the HTML standard. The DOM does not enforce HTML document structure; it is just an OO representation of HTML and XHTML documents.

Re:DOM not HTML (2)

Admiral Burrito (11807) | more than 11 years ago | (#4635938)

DOM can be used to "play around" with HTML documents, after they have been loaded by the browser.

I seem to recall some web site using Javascript to expand and collapse discussion threads. Think it was kuro5hin [kuro5hin.org] . I'm not sure if it's using DOM to do that, but that is the sort of thing you can do with DOM.

huh? (1, Funny)

Anonymous Coward | more than 11 years ago | (#4635919)

WWE Releases Drafts For Doom II And More

what does that mean?

*squints*

I gotta get some sleep..........

Ohhhh... _DOM_. (3, Funny)

TheSHAD0W (258774) | more than 11 years ago | (#4635947)

I thought they released a draft for DOOM 2.

Yeah, considering how long ago it was released, the draft for it would be just about due...

Re:Ohhhh... _DOM_. (1, Funny)

Anonymous Coward | more than 11 years ago | (#4636002)

And I keep seeing "w3c" and thinking "wc3! cool a warcraft 3 article...oh wait."

Re:Ohhhh... _DOM_. (0)

Anonymous Coward | more than 11 years ago | (#4636067)

I keep seeing 'Wing Commander 3'.

Yea, bash MS some more... (3, Flamebait)

Proc6 (518858) | more than 11 years ago | (#4635976)

... when Netscape did it to themselves. If you want to talk about standards, go look at charts showing what CSS properties Netscape versions properly support and which ones IE supports. IE kicks its ass all over the place. Netscape is downright broken on some very easy things. Now the new Netscape based on Mozilla, I can't comment. But that's when someone else did all the work for them, maybe Mozilla is fine. But IE is a pretty fast, stable browser that has supported more standards, more correctly than any version of Netscape prior to Mozilla. And if you want to talk about "MS's proprietary HTML tags", yea, Netscape did the same shit, so would anyone trying to own marketshare.

How about an example from around the time of the Great Browser Holy Wars...

NETSCAPE ONLY TAGS - blink - layer - keygen - multicol - nolayer - server - spacer

INTERNET EXPLORER ONLY TAGS - bgsound - iframe - marquee

Hmm... looks like Netscape had more.

Look around you: proprietary "anything" is how you keep money coming in and marketshare up. If you're talking about some kind of open source, community-developed code, like Mozilla, then yes, please avoid proprietary stuff. But quit bashing Microsoft just because they have a good browser that supports standards at least as well as their only major competitor and are using the same technique as just about every other capitalist on the planet to make more money and keep investors happy. Netscape sucked and deserved to die.

Now go ahead, mod me down because I stood up for MS.

Horrible (-1, Troll)

Anonymous Coward | more than 11 years ago | (#4636015)

Seriously, you should rot in hell for your blasphemous statements.

Open Source is for communists. In a communist community like slashdot, saying what you just said could get you arrested!

Fuckers

Re:Yea, bash MS some more... (1)

PhreakOfTime (588141) | more than 11 years ago | (#4636016)

It was a choice of either a mod, or a comment. I like discussion better than point systems.

I tend to agree with you on the CSS. For example, in IE there is a CSS rule that allows me to do a hover color change WITHOUT using the seemingly more popular JavaScript code. I like it, it's a better design for sites in my opinion; Netscape (older versions) craps on it though.
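
The CSS in question is a one-liner, no script required (the colour value is arbitrary):

```css
/* Rollover colour change with no JavaScript at all */
a:hover { color: #c00; }
```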

However, I don't really agree that Netscape sucked and deserved to die. Without it there would have been even less innovation. Even now, I use Opera over IE because of the ability to go to different and separate connections using a simple tab layout at the top of the screen, all contained in one program; whereas to do something similar in IE, I have to open up half a dozen instances of Explorer.

Re:Yea, bash MS some more... (1)

pavera (320634) | more than 11 years ago | (#4636126)

Um... your comment regarding CSS is not true of later versions of Netscape (6.0 and on). I use that mouseover color change all the time with CSS, and it renders perfectly in Mozilla, Netscape 6, 6.1, 6.2 and 7... sure, Netscape 4 doesn't support it, but IE 4 didn't either, so that's a silly argument. I could tell you that Mozilla is better than IE because IE 3 won't even open MS's own home page anymore... but that's irrelevant.

Re:Yea, bash MS some more... (2)

zyklone (8959) | more than 11 years ago | (#4636668)

IE6's :hover is pretty much broken. You can change the color, yes, but you can't change the display: of a box within the :hover element.

No nice popup menus, in other words.
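
The technique being described, which works in Mozilla and Opera but not IE6 (IE6 applies :hover only to links), sketched on a hypothetical menu list:

```css
/* Pure-CSS popup menu: hide each submenu until its parent item is hovered.
   IE6 never fires :hover on <li>, so the submenu stays hidden there. */
ul.menu li ul { display: none; }
ul.menu li:hover ul { display: block; }
```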

Re:Yea, bash MS some more... (2, Insightful)

Anonymous Coward | more than 11 years ago | (#4636112)

IE didn't start supporting CSS until after MS totally *destroyed* Netscape in the browser wars, so that's not why Navigator lost. Mozilla has excellent CSS and DOM support.

There are some sites that are absolutely committed to IE and use evil tech like VBScript. Mostly, sites are optimized to IE's idiosyncrasies. Since there's no W3C standard on rendering broken, non-compliant code, IE renders it one way while Netscape renders it another. With proper, compliant code, the pages look close enough, or at least don't entirely die when you load them. And of all those non-compliant tags, I typically only see iframe, spacer, and bgsound being used.

But as IE market share grew, lazy/ignorant web designers (which includes Frontpage users) started to test only for IE. When MS destroyed Netscape, most web designers stopped testing for alternative browsers. So Microsoft indirectly caused mass W3C noncompliance.

I think the problem with your post is that you confuse standards with features. CSS support is a feature. An analogy: the DMV license application form in my state comes with a voter registration form attached. DMVs aren't required to attach forms; it's just an added bonus to promote voting. But, the voter registration form has to be standard. If my DMV office created a "SDMV voter registration form" that had extra questions like political ideology and sexual preference, any other DMV would wonder what the hell the DMV branch was thinking when they made the form.

It does seem that Mozilla is a lot more willing than the old Netscape and Opera to render broken, non-standard HTML pages, although IE will still render the mind-bogglingly broken things.

With Mozilla 1.1, I have seen _no_ pages that only work in IE ( excluding those using Evil MS Tech (tm) ), and a minority (usually made by non-professionals) that totally screw up the rendering.

Re:Yea, bash MS some more... (4, Insightful)

skunkeh (410004) | more than 11 years ago | (#4636163)

Shock horror! Browser released in 1996 fails to support latest web standards!

If you want to bash Netscape, aim at Netscape 6 or 7 (both of which have superb standards compliance thanks to the Mozilla project). Netscape 4 simply isn't relevant any more, and hasn't been for several years. It's only big companies and institutions who don't want the hassle of upgrading their site-wide PCs that are keeping it alive, and with any luck even they will give it up soon.

Let's not forget JS, VBS & JSCRIPT (2)

DrSkwid (118965) | more than 11 years ago | (#4636291)

Javascript was a Netscape invention.

Hows about that for non-standard!

My first introduction to the DOM and scripting was building an IE4-based VBScript application for the Boots The Chemist intranet. That's about as non-standard as you can get. The VBS/JS step debugger in Visual Studio was useful, if you could get it going.

These days there are few differences between the different JavaScript/DOM implementations (getting XML documents without screen refreshes is unfortunately one of them *sigh*). My favoured route is to develop in Mozilla, then test in IE. I've done a drag-and-drop HTML email editor that works in Moz, IE & Opera. The scope of JavaScript doesn't really get exercised, as far as I've seen round the web.

Does anyone ever... (2, Insightful)

AcquaCow (56720) | more than 11 years ago | (#4635997)

bother writing compliant HTML? People will always dream up crazy site designs; they'll go with whatever technology they can use to make that design a reality. Look at Flash, look what happened with DHTML. Netscape's DHTML manual went into documenting aspects of DHTML that weren't even supported in their own browser.
Standards can be made; don't expect that people will ever follow them.

-- AcquaCow

Re:Does anyone ever... (1)

frawaradaR (591488) | more than 11 years ago | (#4636061)

Those designers would probably be shocked to find out that it is much easier to write cool designs using proper standards. Not to mention how much easier it is to remake the design or just change it a bit...

The maintenance factor should be of major importance to businesses... as it is, they have sloppy code that takes years to debug (font tags, inline proprietary JavaScript, both CSS and styled HTML, sniffer code and so on), and they have to maintain several versions for various browsers. Maintaining one standards-compliant version with style separated from content makes much more economic sense.

Re:Does anyone ever... (1)

AcquaCow (56720) | more than 11 years ago | (#4636095)

Now, I write some fairly basic HTML, no more than font, table, p, and br tags really. I wrote a decent layout for my site (dcw.govsci.com [govsci.com] ). True, it may never get updated, but I wrote it knowing how Netscape likes its HTML and how IE liked HTML. The page in the end looked perfect in IE, but rendered mainly single-column in Netscape. There were several little `quirks` I had to work around and kludge before it really worked properly in Netscape as well. Things I shouldn't have had to work around. For instance, I have a Java applet on my site. I think the width of it is set to 429; the containing cell is set to 430px wide. If I take that Java applet up to 430px wide, it completely breaks my site. But only in Netscape (older versions, not Moz). I had all cell padding off, everything I could think of. It's just how Netscape handles that particular applet differently from IE/Moz/etc. I only tested for Moz/IE/Netscape at the time though; Opera wasn't even remotely popular (or even out) when I came up with that design.

This has become a slightly longer rant than I wanted to write (esp. at near 4am), but I suppose my point was that sure, Netscape and IE both render the HTML to standard, but they handle certain objects differently, forcing the coder (me) to adjust the site accordingly to kludge around those slight differences. Standards or not, there are still differences.

If we can come up with one solid independent rendering engine that is both fast and highly portable, use that in all browsers, I think we'd be set.

5 mins to 4 am...its time for bed.

-- AcquaCow

Re:Does anyone ever... (0)

Anonymous Coward | more than 11 years ago | (#4636488)

People should be castrated for not using a DTD...

Re:Does anyone ever... (2)

Ed Avis (5917) | more than 11 years ago | (#4636264)

Does anyone ever bother checking that their HTML is compliant? By which I mean validating it against the DTD. This ought to be an elementary step in HTML writing - just like compiling a C program is a first step towards checking it works - but it seems so difficult to set up that hardly anyone does it.

Most Linux systems nowadays include nsgmls, but that command has so many obscure options and SGML prologues are hard to understand. There needs to be a single command 'html_validate' which runs nsgmls with all the necessary command-line options (and obscure environment variables, and DTD files stored in the right place) to validate an HTML document. If that existed then I'd run it every time before saving my document and I'm sure many others would too. But at the moment setting up nsgmls to do HTML validation (at least on Linux-Mandrake) is horribly complex. (Especially if you want to validate XML as well; you need to set environment variables differently between the two uses.)
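
A minimal sketch of such an `html_validate` wrapper around the nsgmls command mentioned above (the catalog path is an assumption -- distributions put it in different places, so adjust SGML_CATALOG_FILES for your system):

```shell
# html_validate: run nsgmls with the options needed to validate an HTML file.
# Assumes SP/OpenSP's nsgmls is installed; /etc/sgml/catalog is a guess --
# point SGML_CATALOG_FILES at wherever your distribution keeps its SGML
# catalog and the HTML DTDs.
html_validate() {
    if ! command -v nsgmls >/dev/null 2>&1; then
        echo "html_validate: nsgmls not found" >&2
        return 2
    fi
    # -s: suppress normal output so only errors are printed;
    # the exit status is nonzero when the document fails to validate.
    SGML_CATALOG_FILES=${SGML_CATALOG_FILES:-/etc/sgml/catalog} nsgmls -s "$@"
}
```

With that in place, `html_validate index.html` before every save becomes a one-command habit.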

Re:Does anyone ever... (0)

Anonymous Coward | more than 11 years ago | (#4636282)

Yep. I hit W3C validator all the time. They probably hate me.

er, yes. (3, Informative)

DrSkwid (118965) | more than 11 years ago | (#4636277)

http://validator.w3.org

Is a great tool.

If your code is valid HTML, then when anyone complains that browser X doesn't render it properly, that's your first line of defense.

Re:Does anyone ever... (1, Interesting)

Anonymous Coward | more than 11 years ago | (#4636665)

Yes, I do. And I do it without ever once querying browser make or version. The catch? I've stopped supporting NS4 and IE4. That makes all the difference. It's hard, though. And once you start using the DOM extensively you need to test every single line of code you write and have backup plans for every possible contingency. So far though, I'm doing better, not worse, than in the old days of if((is_nav3 || has_frames) && ((!ie || has_jscript11) || iesubversion != 4)) pathology = (stupid_table_bug ? offset-10 : offset).
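
The "backup plan for every contingency" style described above amounts to feature detection instead of version sniffing; a minimal sketch (the function names are made up):

```javascript
// Ask whether a capability exists instead of guessing it from the
// browser's make or version string.
function hasStandardDom() {
  return typeof document !== "undefined" &&
         typeof document.getElementById === "function";
}

function getNode(id) {
  if (hasStandardDom()) return document.getElementById(id);
  // Backup plan: no standard DOM, degrade gracefully instead of erroring out
  return null;
}
```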

Standards (0)

Anonymous Coward | more than 11 years ago | (#4635999)

Adherable standards are nothing that can be quantifyable by any rational standards. When said company uses the resources of said network infrastructure to come up with a solution to point A, point B is usually ignored, and the modest user base is left out from the masses that scower the land.

Take for example Exterior Coding levels of generation. It is a simple way for marketing executives to focus on the real problem that encompasses the world while avoiding the rich data gathering that is available on the market.

Just my 2 cents!

Eh? (5, Funny)

Wrexen (151642) | more than 11 years ago | (#4636007)

Web Ontology Language (OWL) Guide

Soon to be followed by the Acronym Formation Policy (FAP)?

Not "Proposed" Recommendation anymore, it's final (3, Informative)

mdubinko (459807) | more than 11 years ago | (#4636032)

>2 proposed recommendations: XML-Signature XPath Filter 2.0 and HTML DOM 2.

XML-Signature XPath Filter 2.0 is a final W3C Recommendation, not proposed.

-m

Standards (2)

Cheese Cracker (615402) | more than 11 years ago | (#4636066)

Nice with standards... now we just have to sit back and wait for people to follow them. That could be a while since there are quite a few developers who don't give a darn to adhere to them.

the last hope of the doomed (1)

myowntrueself (607117) | more than 11 years ago | (#4636098)

is not to hope for safety... in the form of standards that are adhered to!

Sorry... (4, Informative)

WhaDaYaKnow (563683) | more than 11 years ago | (#4636104)

Does the this mean that one can expect browsers to behave in a predictable manner when playing around with HTML documents?

One simple example: innerHTML. This 'property' is not part of ANY W3C draft, yet many, many websites use it because both IE and Mozilla (Netscape) support it.

Even though M$ is on the committee, their own browser still has plenty of features that are not defined in XHTML 1.0, DOM (level 2 or 3), CSS or whatever. And of course 99% of all web 'developers' are more than happy to use these features.
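
The contrast in a sketch: the nonstandard property next to its DOM Level 1 equivalent (the element id is hypothetical):

```html
<div id="out"></div>
<script type="text/javascript">
  var div = document.getElementById("out");

  // Ubiquitous but in no W3C draft: parse a string of markup
  div.innerHTML = "<b>done</b>";

  // The standardized DOM Level 1 way: build the nodes explicitly
  div.innerHTML = "";
  var b = document.createElement("b");
  b.appendChild(document.createTextNode("done"));
  div.appendChild(b);
</script>
```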

Goat Sex -- the movie! (-1, Troll)

Anonymous Coward | more than 11 years ago | (#4636106)

Don't miss it!

http://klerck.org/dance.swf [klerck.org]

You liked Goatse.cx the web site? Don't miss the movie!

DOM-2 irrelevant to cross-browser issues (2, Informative)

Brother52 (181351) | more than 11 years ago | (#4636187)

Does the this mean that one can expect browsers to behave in a predictable manner when playing around with HTML documents?

As long as you do things strictly DOM-1 way, current browsers have been working pretty much predictably for quite some time. I develop sophisticated DHTML and test it in IE, Mozilla and Opera, and I never have a problem as long as I use only DOM methods (which can sometimes be quite limiting, but bearable overall).

A lot of people still do pre-DOM legacy DHTML because they have to make 4.x-compatible sites, but that's another story. DOM-2 may be more featureful, but it doesn't promise to make cross-browser development any easier. Indeed, it can make it harder if it isn't implemented accurately and in a timely manner across different browsers. Given the lesser incentive to implement it (DOM-1 is OK for most things), I find that quite possible.

W3C: stop now (3, Interesting)

g4dget (579145) | more than 11 years ago | (#4636255)

The W3C should have stopped with a full specification of HTML. Anything they have been doing beyond that has been doing more harm than good. The web succeeded because HTML was simple.

Of course, some client-side code is useful, but unfortunately, the major contenders have dropped the ball on that one. The W3C has given us JavaScript+DOM+CSS+..., but it's way too complicated for the vanishingly small amount of functionality, and nobody has managed to implement it correctly; in fact, I doubt anybody knows what a correct implementation would even mean. Flash has become ubiquitous, but it just isn't suitable for real GUI programming and is effectively proprietary. And Java could have been a contender, but Sun has chosen to keep it proprietary, and the once small and simple language has become unusably bloated.

But, hey, that means that there is an opportunity for better approaches to client-side programming. Curl might have been a candidate if it weren't for the ridiculous license. But someone outside the W3C will do something decent that catches on sooner or later.

Re:W3C: stop now (1)

Sycle (569193) | more than 11 years ago | (#4636320)

If we don't have someone like the W3C putting this stuff in writing somewhere, how else are we going to have a hope in hell of browsers talking to each other?

Should everyone just copy whatever Microsoft comes up with, because, let's face it, they have the largest userbase? Somehow I don't see people here appreciating that.

I mean sure, you can say "wah wah, Microsoft didn't follow the standards, wah wah, Opera doesn't do this yet, this standards system is flawed!" but if there is no reference point for any of these things, how could you possibly expect things to improve?

One thing that's obvious is that these technologies are needed, not just silly ideas implemented by bored programmers. So if they're going to exist, then better that an appropriate committee come up with workable drafts than that a lone company goes ahead and does whatever it feels like. (Heck, that's one of the main reasons MS came up with so much funky spec-breaking stuff -- call it embrace and extend if you want, but they wanted to do things before the standards were there, which is why we have this mess.)

Re:W3C: stop now (4, Interesting)

g4dget (579145) | more than 11 years ago | (#4636419)

Should everyone just copy whatever Microsoft comes up with

Everybody is, for practical purposes. Who do you think is dreaming up a lot of the stuff that comes out of the W3C? Look at the authorships of the standards. And if you sit in those meetings, you'll quickly see that Microsoft doesn't often take "no" for an answer.

Microsoft has even told us why they like their standards to be complicated: they believe that if they just make it complicated enough, nobody else but them can implement them. Of course, Microsoft's reasoning is at the level of Wile E. Coyote, with Open Source being the Road Runner, but what can you do.

One thing that's obvious is that these technologies are needed,

We have a problem with creating dynamic web content, but the current crop of W3C standards for addressing that problem isn't working; it has turned into a Rube Goldberg contraption. Someone needs to start from scratch, and the W3C appears to be incapable of doing it.

If we don't have someone like the W3C putting this stuff in writing somewhere, how else are we going to have a hope in hell of browsers talking to each other?

Of course, things need to get written down and standardized. But the way standards are supposed to work is that people try things out in practice, whatever works well survives in the marketplace or among users, people create multiple implementations, then people get together and work out the differences among the implementations, then it all gets written up as a standard, and finally everybody goes back and makes their implementations standards compliant. It's a long, tedious process, but it does result in reasonable standards that real people can actually implement.

What the W3C is often doing is using its position to create completely unproven systems on paper and let the rest of the world figure out how to deal with it. Or, worse, the W3C is used by powerful companies to push through "standards" that haven't stood the test of time and for which only they themselves have a working implementation. If you give that kind of junk the stamp of approval of a standards body, you make things worse, not better.

Re:W3C: stop now (2)

IamTheRealMike (537420) | more than 11 years ago | (#4636648)

The W3C has given us JavaScript+DOM+CSS+..., but it's way too complicated for the vanishingly small amount of functionality, and nobody has managed to implement it correctly; in fact, I doubt anybody knows what a correct implementation would even mean.

Huh? JavaScript is the Mozilla implementation of ECMAScript, a standard (not W3C) invented by Netscape. The DOM was also a Netscape idea, now standardized. CSS was originally proposed and largely designed by a guy from Opera. There are actually quite a few implementations out there; the idea that W3C technologies are too large to implement is crazy. Look at Mozilla, Amaya, even Konqueror is getting there now...

The W3C should have stopped with a full specification of HTML. Anything they have been doing beyond that has been doing more harm than good. The web succeeded because HTML was simple.

Yes, and now that it's ubiquitous, do you really think we need to keep it simple? Being simple was great when the web was small, it let it grow very quickly. Why should we keep it simple now? Just for the sake of it? I'd rather have power. If that means there are only 3 or 4 quality implementations as opposed to 20, then so be it.

The world is not a simple place, and the things we want to do with the web nowadays aren't simple either. If you want simplicity then feel free to write a web browser that only understands a subset of the standards, they are layered so people can do this. Just bear in mind that it won't be useful for browsing the web, because a lot of people like powerful technologies and use them.

wc3 and doom l2 (0)

Anonymous Coward | more than 11 years ago | (#4636367)

my glasses....

Who needs W3C standards (2)

forged (206127) | more than 11 years ago | (#4636442)

Microsoft are rolling their own anyway (.NET), and with their monopoly on over 90% of desktops and IE6 sitting in that prominent spot, I fail to see how this will make a difference to end-users...

Re:Who needs W3C standards (2, Informative)

ThePeeWeeMan (77957) | more than 11 years ago | (#4636636)

Just thought I'd point out that W3C standards and .NET are orthogonal; .NET doesn't specify anything about how to render web pages or do client-side scripting.

Now, if you were talking about SOAP...

It is not only Web development (1, Insightful)

Anonymous Coward | more than 11 years ago | (#4636510)

Maybe you are missing the point: the W3C is centering its efforts on applications other than web development, such as document representation (XML), machine-understandable information, web information retrieval, and so on.
OWL is about information retrieval, and 'XML-Signature XPath Filter' is about document signing.
The DOM stuff is no longer just a Dynamic HTML thing. DOM is important because it is being actively used to manage XML documents, and the earlier specifications are very clumsy because they are a compromise between the browser-specific de facto standards that came before.
Maybe there is a need to develop some simple DOM stuff from scratch instead of adding levels on top of a compromise approach. And again, as said above, provide a reference implementation to start with.
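To see that the W3C DOM really is a general XML API and not just a browser toy, here's a minimal sketch using Python's standard-library `xml.dom.minidom`, one of many DOM implementations (the `catalog`/`book` document is made up for illustration):

```python
from xml.dom.minidom import parseString

# Parse an XML document into a W3C DOM tree -- the same Node/Element
# interfaces the spec defines for HTML apply here too.
doc = parseString('<catalog><book id="1"><title>DOM L2</title></book></catalog>')

# Navigate and query with standard DOM Core methods.
books = doc.getElementsByTagName("book")
title = books[0].getElementsByTagName("title")[0]
print(title.firstChild.data)  # prints "DOM L2"

# Mutate the tree: create a new element and append it.
new_book = doc.createElement("book")
new_book.setAttribute("id", "2")
books[0].parentNode.appendChild(new_book)
print(len(doc.getElementsByTagName("book")))  # prints 2
```

The point is that nothing here is HTML-specific: the same `createElement`/`appendChild`/`getElementsByTagName` calls work against any XML document, which is exactly why the DOM specs matter beyond DHTML.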

Vokimon