
New HTML Picture Element To Make Future Web Faster

Soulskill posted about 3 months ago | from the until-it's-used-for-ads dept.


nerdyalien writes: At some point, haven't all web developers spent an unjustifiable number of hours trying to optimize a desktop site for mobile devices? Responsive web design provides a solution: "develop once, works in every device." However, it still downloads multi-MB images and merely resizes them to match the device's screen resolution. Retrieving optimized images from the server, based on device (desktop, tablet, phone) and the device's internet connection (fiber, broadband, mobile), has always been an open problem. Recently, a number of freelance developers have been tackling this with a new HTML element, <picture>, which tells the web browser to download optimized images from the server. The tag is slated to appear in Chrome and Firefox later this year. Will this finally deliver faster web browsing on mobile devices and an easier web development experience?
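For readers who haven't seen the proposed markup, here is a minimal sketch of resolution switching with <picture> (the breakpoints and file names are placeholders, not part of the spec):

<picture>
  <source media="(min-width: 1024px)" srcset="photo-large.jpg, photo-large-2x.jpg 2x">
  <source media="(min-width: 480px)" srcset="photo-medium.jpg">
  <img src="photo-small.jpg" alt="Photo">
</picture>

Browsers that don't recognize <picture> fall back to the inner <img>; supporting browsers use that same <img> to render whichever source they select.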


Ummm. (0)

Anonymous Coward | about 3 months ago | (#47810021)

Nah.

does this mean.... (1)

Anonymous Coward | about 3 months ago | (#47810029)

does this mean that resizing the browser window will cause it to re-download images? because that sounds annoying.

Re:does this mean.... (1)

i kan reed (749298) | about 3 months ago | (#47810077)

Presumably, they'll scale the one they have and replace it when the new one is available.

Like mipmaps in games.

Re:does this mean.... (1)

NotInHere (3654617) | about 3 months ago | (#47810143)

You should design an image format in which larger versions can use data from smaller ones. So when you browse the site with a small window, the small version gets downloaded, and when you resize, the larger version gets downloaded, and when you visit with a large browser, both versions get downloaded.

Progressive JPEG (2)

tepples (727027) | about 3 months ago | (#47810267)

Progressive JPEG already does most of that. It breaks up the DCT coefficients into multiple "scans", each of which adds more detail to the image. A slowly loading progressive JPEG starts blurry (suitable for small displays) but fills in with the higher spatial frequencies as it is received. Perhaps what progressive JPEG needs in order to become a best practice for responsive images is a header to tell how big each scan is (in spatial frequency and in bytes) so that the browser can use range requests rather than having to close and reopen a connection once it's seen enough data for a particular pixel density.
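To make that concrete, here is a rough sketch of a client grabbing only the leading scans of a progressive JPEG with an HTTP range request. The 50,000-byte cutoff is a blind guess, which is exactly the problem the comment describes: nothing in the file tells the browser where a scan ends.

<script>
// Request roughly the first 50 KB of a progressive JPEG. If the server
// honors range requests it answers 206 Partial Content; those bytes
// carry the low-frequency scans -- enough for a small rendering.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'photo-progressive.jpg');
xhr.responseType = 'arraybuffer';
xhr.setRequestHeader('Range', 'bytes=0-49999');
xhr.onload = function () {
  var partialScans = xhr.response; // low-frequency data only
};
xhr.send();
</script>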

Re:Progressive JPEG (5, Interesting)

AuMatar (183847) | about 3 months ago | (#47810471)

What you'd need there is not file support, but server side support. If you're downloading a single file with all those formats, you're going to have to download everything at once. That's inefficient (important on mobile devices) and slow. Stuffing all the sizes into one container isn't the answer.

Instead, what you'd like is one URL that automatically sends you the correct file(s) for your size(s). For example, you could put headers in the HTTP request giving the desired resolutions, and the response could carry each of those sizes in preference order. Basically, have each image request turn into a CGI request for multiple files; more or less what this new tag is trying to do client side. Of course, doing it client side, while less convenient all around, does have privacy advantages: you can't guess the device type from the sizes requested.
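A sketch of that header idea, hedged heavily: "X-Desired-Widths" below is invented purely for illustration; no such request header is standardized anywhere.

<script>
// Advertise the image widths this client would accept, in preference
// order, via a hypothetical request header.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/photos/sunset.jpg');
xhr.setRequestHeader('X-Desired-Widths', '320, 640, 1280');
xhr.send();
</script>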

Re:Progressive JPEG (2)

Em Adespoton (792954) | about 3 months ago | (#47811049)

you can't guess the device type from the sizes requested.

Sure you can; the security point is that you can't KNOW the device type from the sizes requested. You have to guess.

However, it seems to me that, used for fingerprinting, this element would be a pretty strong indicator for narrowing down who you are.

Re:Progressive JPEG (2)

Frnknstn (663642) | about 3 months ago | (#47811109)

I don't think you were paying attention to how a progressive .jpg works. The file contains only the full-detail image, but as you load more and more bits from it, the quality improves from unrecognisably blurry to sharp and detailed.

Simply request the file from byte 1, load until you get the level of detail you need, then close the connection. If you need more detail later, just resume the download.

Re:does this mean.... (1)

tepples (727027) | about 3 months ago | (#47810207)

does this mean that resizing the browser window will cause it to re-download images?

That depends on how often you resize the browser window. Smartphone and tablet operating systems typically implement a window management policy of all maximized all the time, which means the viewport size won't change unless the user is rotating the screen. How often do most desktop and laptop PC users resize a browser window?

Re:does this mean.... (1)

wisnoskij (1206448) | about 3 months ago | (#47810461)

Once is more than enough to be annoying.

Re:does this mean.... (0)

Anonymous Coward | about 3 months ago | (#47810467)

I can't speak for everyone, but I do it all the time. Plus I have a dozen tabs open at a time. I hope the browsers will be smart enough not to load images for background tabs.

Offline browsing (2)

tepples (727027) | about 3 months ago | (#47810685)

I agree that images for the foreground page should be prioritized. But not loading images for background tabs at all would break offline use. Consider someone who loads up on web pages to read on a laptop before boarding a bus: load tabs in the background, close the lid to put it in suspend, board the bus, open the lid, and then view the pages.

Re:does this mean.... (0)

Anonymous Coward | about 3 months ago | (#47810561)

viewport size won't change unless the user is rotating the screen

You forget zooming. I zoom in and out all the time on mobile devices, especially on sites that aren't made for mobile such as Slashdot. Sometimes I have to zoom so I can scroll.

Re:does this mean.... (1)

Falos (2905315) | about 3 months ago | (#47810501)

TFA goes into more detail (hell, history). It's code that decides what to download in advance; AFAIK it affects the pre-load phase, not live resizing. Though a derp webmaster might still try that.

Images are normally loaded even before the HTML, so it was hard to preempt. Had to work low-level code, petition the standards people, or whatever.

Re:does this mean.... (2)

n1ywb (555767) | about 3 months ago | (#47810729)

Images are normally loaded even before the HTML

I think I may have misunderstood but how the hell does a browser load images before loading the html that tells it what images to load?

Re:does this mean.... (2)

SuricouRaven (1897204) | about 3 months ago | (#47810895)

HTML isn't just one file any more. It's often supplemented by CSS, rewritten on the fly by JavaScript that pulls additional content from new URLs, and quite dynamic in general. It's quite possible for a browser to start downloading images before it has the HTML fully rendered.

Dupe (4, Interesting)

Hsien-Ko (1090623) | about 3 months ago | (#47810041)

of the HTML 5.1 story [slashdot.org] ?

Re:Dupe (0)

Anonymous Coward | about 3 months ago | (#47810073)

Considering it's new links to new stories on the same subject, no.
A corollary to the HTML 5.1 story, perhaps, but not a dupe.

Re:Dupe (1)

wisnoskij (1206448) | about 3 months ago | (#47810453)

Um, so as long as you pick new news sources to summarise, it is not a dupe?

Re:Dupe (1)

tepples (727027) | about 3 months ago | (#47810747)

That's correct, so long as each news source reports a separate aspect of the story: support reaching Chrome Dev, support reaching Firefox Aurora, support reaching a browser's release channel, etc.

Re:Dupe (0)

Anonymous Coward | about 3 months ago | (#47810877)

As long as I don't get dinged for posting a dupe... SWEET!

New HTML5.1 code:
nerdyalien from Slashdot reported that there is a new <picture> tag in the new HTML 5.1 schema, in response to responsive web design:
http://tech.slashdot.org/story/14/09/02/1748237/new-html-picture-element-to-make-future-web-faster

Won't help (0)

Anonymous Coward | about 3 months ago | (#47810043)

Most big, professional websites already look fine. They're responsive, and updated, and everything.
But there are a lot of crappy sites I'd still like to be able to visit. They won't be using the new tag.

Re:Won't help (1)

Pieroxy (222434) | about 3 months ago | (#47810791)

Most big, professional websites already look fine. They're responsive, and updated, and everything.
But there are a lot of crappy sites I'd still like to be able to visit. They won't be using the new tag.

So? Do you really think they'll drop the IMG tag in our lifetime?

Not seeing the big deal here... (1)

Anonymous Coward | about 3 months ago | (#47810103)

This really isn't anything special if you think about it for a second. It's just a lazy way of creating a request handler with a limited scope.

Keep in mind that an <img src="filename"> doesn't require the filename to end in .jpg, .gif, or any other particular format-associated extension. Hell, it doesn't even require a particular MIME type (or one of a group of valid ones) when it gets a response from that source URL as long as the client is happy with it.

Over in ASP.Net land, we've been using src="/imageHandler.axd?name=blah&w=foo&h=bar" in img tags for years. You could certainly do the same thing from any server-side language. All you have to do is have the client substitute its window x and y size values into that request. The server can either serve up the closest appropriate size, or dynamically generate and cache one of the correct size on the fly.
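A sketch of the client-side half of that approach (the handler URL and parameter names come from the comment's own example; a real page would also want to factor in device pixel ratio):

<script>
// Substitute the window's dimensions into the image-handler request.
var img = document.createElement('img');
img.src = '/imageHandler.axd?name=blah' +
          '&w=' + window.innerWidth +
          '&h=' + window.innerHeight;
document.body.appendChild(img);
</script>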

All this new tag is going to do is allow those without the ability to write such a server-side script (read: crayon jockeys) to set up a list of pre-baked images for a set of screen size ranges. It will add to page text bloat and provide little actual utility, since they'll only test on Macs and iPhones anyway. This will reduce this entire feature to nothing more than a waste of time.

NoScript (0)

tepples (727027) | about 3 months ago | (#47810327)

All you have to do is have the client substitute its window x and y size values into that request.

Good luck doing that if the browser has JavaScript turned off. A lot of Slashdot users, for instance, leave JavaScript turned off for the majority of web sites out of privacy paranoia, out of a fear of non-free software [slashdot.org] , or to reduce the amount of CPU time and memory the browser spends processing social recommendation and advertisement widgets.

Re:NoScript (1)

neoritter (3021561) | about 3 months ago | (#47810395)

I don't think you need JavaScript to accomplish what he said. I vaguely remember something similar being done on a recent project using Wicket and Java.

Re:NoScript (0)

Anonymous Coward | about 3 months ago | (#47810483)

Javascript support? Nonsense.

Page is whatever.shtml

Build it any way you want. Doesn't require a darned thing from the user's browser. Use whatever server-side language you want. Everything in the cookies, headers and your filesystem is available to you to make your decisions; you can completely customize the experience from there, using whatever you like, or not using whatever you don't.

Heck, I've even got a .shtml that uses javascript, but the page CGI customizes the javascript before it gets to run.

--fyngyrz

(not at my desk, no idea what my password is, lol -- now that's safe)

Window size and pixel density in what header? (1)

tepples (727027) | about 3 months ago | (#47810517)

Everything in the cookies, headers and your filesystem is available to you to make your decisions

Cookies don't exist for a first time visitor. And since when have HTTP request headers included window size and pixel density?

Re:Window size and pixel density in what header? (0)

Anonymous Coward | about 3 months ago | (#47810825)

> Cookies don't exist for a first time visitor.

1) page is requested, headers are captured
2) cookie(s) are looked for
2a) if cookies don't exist, generate new cookie and signal the browser to re-invoke the page request (basically goto 1)
2b) else use existing cookie(s) and continue with generating page

Trivial.

> And since when have HTTP request headers included window size and pixel density?

Pixel density: Since user-agent [wikipedia.org] has been available. Jeeze, how hard is that?

Window size: This is not something you should be using. That's something the browser uses to reflow properly designed content. If you're designing content to a fixed window size, you're doing it all wrong and you're one of those people the web hates.

Re:Window size and pixel density in what header? (1)

tepples (727027) | about 3 months ago | (#47810891)

Pixel density: Since user-agent has been available

A single combination of web browser and operating system can be used on both low DPI displays and high DPI displays. Consider, for example, Safari on a MacBook with a Retina brand high DPI display and Safari on a MacBook with a normal DPI display.

Window size: This is not something you should be using. That's something the browser uses to reflow properly designed content

So if the web page's style sheet specifies that a particular image shall be 50 percent of the width of the viewport, however wide the user might have chosen to make that, how should the server go about determining how many pixels that is in order to serve the correct amount of image detail?
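The case in question, as a style-sheet fragment (the class name is invented for illustration):

<style>
  /* The rendered width is half of whatever the user made the viewport,
     a number the server cannot learn from standard request headers. */
  img.fluid { width: 50vw; height: auto; }
</style>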

GUI technology has regressed since the 90s (4, Insightful)

thebeastofbaystreet (3805781) | about 3 months ago | (#47810109)

Think about just how much work is now needed to display a simple GUI on a number of different devices. Something that a single developer could once have cooked up themselves now takes teams of designers, UX people, UI coders, back end coders and the rest to do. Really, we should chuck it all out and just start again.

Bah (2)

Anonymous Coward | about 3 months ago | (#47810647)

Or, it can just be ignored from day one, like I did. All my pages still work fine. I never lost sight of the idea that the content was key, and that the interior-designer fascination with "GUI design" was so much ridiculous overkill.

o Does the page work, and quickly? Yes? Fine.
o Is the writing high quality and cover the subject at hand thoroughly? Yes? Perfect.
o Is it image heavy? Probably want to rethink it, because you're probably just producing pablum for the tl;dr masses.
o Does it need input? Server-side CGI then.

HTML, CSS and CGI can easily and consistently handle 99.99% of everything. And you probably don't really need that last .01%

And guess what? Your pages can work on everything that is even somewhat modern.

I'll grant you that the bloody browser makers could do a better job with consistent support of CSS, but other than that one wart, it's all a doddle. IF you know what you're doing.

Re:GUI technology has regressed since the 90s (1)

znrt (2424692) | about 3 months ago | (#47810661)

gui technology? it's the whole software industry. you need approx the same number of "experts" to print a fucking report from persisted data nowadays. not to mention a herd of managers, several whiteboards and a shitload of post-its.

Re:GUI technology has regressed since the 90s (1)

Tablizer (95088) | about 3 months ago | (#47810719)

Job security, shhhhhhhh

In all seriousness, while it may create job security, many developers would rather spend their time making 20 useful products using write-once-run-everywhere rather than 5 useful products with multiple versions handcrafted for different devices.

It's kind of boring re-inventing the same app for different devices even if it does pay.

Re:GUI technology has regressed since the 90s (1)

X0563511 (793323) | about 3 months ago | (#47811053)

Things are a bit more complicated than they used to be...

Client or Server side? (1)

bobbied (2522392) | about 3 months ago | (#47810121)

I'm not sure how this fixes the problem.

You either need a client-side solution (the browser decides what image to fetch and how to display it) or a server-side one (the server decides what image to send). I suppose you could do a combination of both, but the question is who's going to do the resizing work.

Personally, I think this would be better done on the server side for the most part. That implies that the GET would somehow give the server enough information about the available display. You are going to need the supported colors, pointer details (type, minimum target size), and the screen resolution of the user's view, both in pixels and in physical size. Then the server can decide what page format and image would best fit. All this has to be *independent* of the browser vendor or host platform.

But hey, good luck getting everybody to sign on to a working interface for this... We cannot get a consistent interface between browser vendors now...

Re:Client or Server side? (0)

Anonymous Coward | about 3 months ago | (#47810183)

I believe that the idea is to provide the different sizes on the server side, but that the browser (and user preferences) determines which version to download.

Re:Client or Server side? (1)

Marc_Hawke (130338) | about 3 months ago | (#47810355)

Yes, (replying to GrandParent, but agreeing with Parent.)

Can you explain why it would be better on the server side? I naturally assumed client side. "Get SmallScreen version of Picture." It would then be scaled by the Browser to fit the size determined by the layout.

I don't think that you'd change the layout based on which images were selected. Everything would look exactly the same, just the byte-size/quality of the image file would be different.

Re:Client or Server side? (1)

pspahn (1175617) | about 3 months ago | (#47810529)

I don't think that you'd change the layout based on which images were selected. Everything would look exactly the same, just the byte-size/quality of the image file would be different.

You are mistaken. In fact, one of the reasons they decided on this method was so that the image is now decoupled from layout constraints. You might want the image to be 400px square in one layout, while it will be 800x600 in another layout.

On top of that, if you are providing images of varying proportions, you don't really want some computer to auto-crop everything depending on the requesting device, since the results will be artistically inferior to what a trained human would produce.
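A sketch of that art-direction case (the media condition and file names are invented for illustration):

<picture>
  <source media="(max-width: 480px)" srcset="portrait-tight-crop.jpg">
  <img src="landscape-full-scene.jpg" alt="Scene">
</picture>

On a narrow screen the browser takes the hand-cropped portrait; everywhere else it shows the full scene.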

Re:Client or Server side? (1)

bobbied (2522392) | about 3 months ago | (#47810459)

I believe that the idea is to provide the different sizes on the server side, but that the browser (and user preferences) determines which version to download.

I get that, but it means the server now has multiple URLs for the picture, and these URLs are defined by how the client will format the GET. I'm just pushing the whole page-formatting issue to the server, so you pull one URL and the server decides which image to give you. Then the server side is free to have the pictures already processed on disk or to process them on the fly, and it stays in total control of the formatting on the page. It also frees up network bandwidth, processing, and memory requirements for less powerful platforms.

Re:Client or Server side? (1)

SQLGuru (980662) | about 3 months ago | (#47810655)

Something akin to using a background image and media queries to pick the right version of the image based on the window size?

Re:Client or Server side? (2)

X0563511 (793323) | about 3 months ago | (#47811075)

Sounds like a perfect use case for HTTP headers. Why would a new HTTP method be required?

Get the headers? Great! Use them. Don't recognize them? Oh well, ignore them. Don't get them? Use sane defaults.

Responsive Web Design (4, Insightful)

vux984 (928602) | about 3 months ago | (#47810153)

" Responsive web design provides a solution: "develop once, works in every device."

Name one good responsive web design that isn't shit on at least one of desktop or mobile. (And an awful lot of them are shit on both.)

Anything actually good just builds them separately and lets you switch between them, selecting the right default based on screen size (screen, not window). Nothing sucks worse than making a desktop window smaller because you just want to keep one part visible while you work with something else and having the site spontaneously implode into a mobile version -- just one of the countless forms of SUCK thanks to "responsive web design".

Re:Responsive Web Design (1)

Peter Kingsbury (3046159) | about 3 months ago | (#47810201)

Name all desktop and mobile combinations which (a) include all contemporary desktop and mobile options, and (b) which don't include shitty desktop and mobile options. Impossible, you say? Then your point is moot.

Which site "collapses"? (1)

tepples (727027) | about 3 months ago | (#47810365)

Nothing sucks worse than making a desktop window smaller because you just want to keep one part visible while you work with something else and having the site spontaneously implode into a mobile version

On which public web site have you seen this happen, so that I can go see for myself what you're talking about?

Re:Which site "collapses"? (2)

trevc (1471197) | about 3 months ago | (#47810441)

www.fiattech.com

Re:Which site "collapses"? (1)

tepples (727027) | about 3 months ago | (#47811061)

On the front page of FiatTech.com, I don't especially see what's wrong with what appears to be a fairly predictable switch from a 2-column layout to a 1-column layout around 750px. Is horizontal scrolling really better? Or should it have been using the 1-column layout in the first place for all screen sizes?

Re:Which site "collapses"? (0)

Anonymous Coward | about 3 months ago | (#47810563)

NCSA's website [illinois.edu]

NCSA is doing it wrong (1)

tepples (727027) | about 3 months ago | (#47811117)

Ouch. That transition from 2 columns to 1 is so close to 1024px it sickens me. Even a browser window snapped to half of a 1920x1080 screen, or a user of a 1024x600px netbook whose scroll bars are set just wider than normal, would get the 1-column layout with a hamburger menu on the front page of NCSA's web site.

Re:Which site "collapses"? (4, Insightful)

vux984 (928602) | about 3 months ago | (#47810677)

There are several, but site I was referring to in particular was mashable.com. It came up at work as an example of "good responsive design" to which I argued that it was in fact abysmal.

These were some of my notes taken at the time (I don't know if they all still apply, but a quick glance confirms at least most of them still do)

Chunks of the site can't be reached from mobile at all - how do I get to "Jobs" or "Advertising" from a smartphone?

And on the desktop, parts of the site can't be reached depending on the size of the browser window, and we're not talking perversely small either: that "more" popup menu on the desktop starts losing sections outright at around 1100px. 1100px is too narrow! Want a job at Mashable? They don't have a section for that unless you're on a widescreen.

Worse, if you shrink the page below 1000px wide, you start losing content columns off the home page too -- they're just gone. You can't scroll horizontally to get to them, and unlike the mobile version which displays one column at a time with a column selector to switch, that selector doesn't appear on the desktop. If you shrink your window, you just lose columns. No selector, no scrolling, the content is just gone.

Additionally the column selector names are different from the desktop column headers... "What's new" is renamed "New" for space and that's fine as the translation is preserved. But "The Next Big Thing" is renamed "Rising" for space -- that's a navigation cue that got lost in translation. If I were to say, 'Look for the article under "The Next Big Thing", ' nobody is going to make that connection.

Re:Which site "collapses"? (0)

Anonymous Coward | about 3 months ago | (#47810731)

anandtech.com and theverge.com are good examples

Re:Which site "collapses"? (0)

Anonymous Coward | about 3 months ago | (#47810919)

We have actually been in a webinar that demonstrated this new "responsive web design" that they will be pushing on our corporate website (read: forced upon): www.porsche.com [porsche.com] and the respective dealerships' [blah].porschedealer.com (where [blah] is the name of the dealership).

Re:Which site "collapses"? (1)

tepples (727027) | about 3 months ago | (#47811043)

The front page of Porsche USA [porsche.com] isn't that bad in my testing (from about 500px to 1024px wide). Things get put in more or fewer columns, but that's similar to what happens to text in any liquid layout. The most drastic changes are the layout of the "Build & Find" menus near the bottom below about 500px and the addition of a "hamburger" menu at the top below about 700px.

Re:Responsive Web Design (1)

znrt (2424692) | about 3 months ago | (#47810609)

Nothing sucks worse than making a desktop window smaller because you just want to keep one part visible while you work with something else and having the site spontaneously implode into a mobile version

current users aren't supposed to be able to do that without extra help or risking injury. didn't you notice the window managers / desktops most of them are using? split the screen? are you nuts???

by the way, *their* fix for your problem would be eradicating desktop design versions completely. "develop once, works in every device". why should we go through the hassle of developing a desktop version for elitists like you if we're anyway served well enough with a single-tap mobile interface, which is what our average collective brain can afford to handle! go away!

Re:Responsive Web Design (1)

vux984 (928602) | about 3 months ago | (#47810953)

by they way, *their* fix for your problem would be eradicating desktop design versions completely.

As bad as that is, that would actually be an improvement over the mess that "responsive web design" has made.

At least then the design would be relatively simple, and it would be easier to maintain. The problem with responsive web designs is that they are inherently complicated and things break or go missing or become inaccessible between the mobile and desktop transitions. Maintenance cycles tend to make them worse as every modification and feature has to be considered in a "responsive context" ... you get tasked with adding this to this column here and that menu item there, and it works on the desktop, but doesn't make any sense on mobile without a complete costly redesign.

Caching anyone? (0)

Anonymous Coward | about 3 months ago | (#47810167)

There are already technologies to detect user agents, and pull back the correctly sized images from servers. There are also image servers which resize on the fly. And there's CACHING.

It's the 1990s all over again. (5, Insightful)

oneiros27 (46144) | about 3 months ago | (#47810171)

Back in the days of HTML, they decided that it was awful that the people using dial-up had to wait so long for images to load ... so they came up with the 'lowsrc' attribute to the IMG element:

<img lowsrc='...' src='...' ...>

Or, you could go with the 2000s route, and use CSS's media queries so that you don't try to push large images down to small-screen devices.
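A sketch of that route (the breakpoint and file names are placeholders):

<style>
  .banner { background-image: url('banner-large.jpg'); }
  @media (max-width: 600px) {
    /* Small screens match this rule instead. */
    .banner { background-image: url('banner-small.jpg'); }
  }
</style>

Browsers generally fetch only the background image of the rule that wins the cascade, so a small screen never requests the large file.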

Wouldn't it make more sense to just use a known attribute or method rather than trying to come up with yet another solution every few years?

Re:It's the 1990s all over again. (0)

Anonymous Coward | about 3 months ago | (#47810211)

Wouldn't it make more sense to just use a known attribute or method rather than trying to come up with yet another solution every few years?

If we come up with a new solution every few years, we can build frameworks around it, and keep our jobs. Also, everyone has to "upgrade" their browsers to get the new UX "innovations" if they want "the web" (meaning "all that crap we did to keep our jobs!") not "to break."

Sometimes I think we'd be better off if we'd never left the HTML 3.0 trees.

Re:It's the 1990s all over again. (1)

NotSanguine (1917456) | about 3 months ago | (#47810351)

Wouldn't it make more sense to just use a known attribute or method rather than trying to come up with yet another solution every few years?

That's the wonderful thing about standards: there are so many to choose from!

Re:It's the 1990s all over again. (2)

tepples (727027) | about 3 months ago | (#47810399)

<img lowsrc='...' src='...' ...>

That was never standardized, and its implementation was removed for reasons described in bug 92453 [mozilla.org] .

Or, you could go with the 2000s route, and use CSS's media queries so that you don't try to push large images down to small-screen devices.

Do media queries allow changing the effective src of an img element, or do they work only with background images?

Re:It's the 1990s all over again. (1)

Tablizer (95088) | about 3 months ago | (#47810871)

LOWSRC has been renamed to the COMCAST attribute :-)

Re:It's the 1990s all over again. (1)

PRMan (959735) | about 3 months ago | (#47810403)

Lowsrc didn't take high-DPI screens into account, which is what they are trying to solve with Picture.

Unnecessary (0)

Anonymous Coward | about 3 months ago | (#47810199)

This is unnecessary as wireless broadband speeds will continue to double and make the size of a graphical image insignificant. Audio now is nearly trivial to send even over mediocre wireless connections. It will only get more trivial.

Re:Unnecessary (1)

Dracos (107777) | about 3 months ago | (#47810349)

Actually, it's unnecessary because everything picture does could have been added to img instead. There's no semantic difference between the two, so why add a new one? Extending img would have been more backwards compatible as well (one of WHATWG's stated goals, despite doing lots of stupid crap like this).

A list of semantically equivalent image URIs (1)

tepples (727027) | about 3 months ago | (#47810435)

Actually, it's unnecessary because everything picture does could have been added to img instead. There's no semantic difference between the two, so why add a new one?

The semantic difference, as I understand it, is that the picture element means "images behind all these URIs are semantically equivalent, and here is a formula to determine which is preferred in a particular environment."

Re:A list of semantically equivalent image URIs (1)

Dracos (107777) | about 3 months ago | (#47810817)

Images themselves have no semantic value, only the elements that point to them... furthermore, the dimensions of the image are semantically irrelevant. This is a lame, flawed attempt to solve a visual layout problem with misplaced semantics. You wouldn't invent a redundant element for audio files based on varying bitrate, because audio similarly has no semantic value and the media type is inherently non-visual.

Layne's Law break: "semantic value" (1)

tepples (727027) | about 3 months ago | (#47810967)

Images themselves have no semantic value

I detect definition disagreement [c2.com] , and this thread can't usefully continue until we clear this up. I'm unfamiliar with the definition of "semantic value" that you're using. Is it from a standard? If so, which?

You wouldn't invent a redundant element for audio files

Except they did just that. The audio element [mozilla.org] allows specifying multiple otherwise equivalent sources, each in a source element [mozilla.org] , so that the browser can choose the most technically appropriate one. This allows letting the browser choose, say, an MPEG-family format if it's an Apple or Microsoft browser or a royalty-free format if it's a third-party browser.
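The mechanism in question looks like this (file names are placeholders):

<audio controls>
  <source src="clip.ogg" type="audio/ogg">
  <source src="clip.mp3" type="audio/mpeg">
  Your browser does not support the audio element.
</audio>

The browser walks the source list in order and plays the first type it can decode, so different browsers fetch different files from the same markup.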

Burn through your cap (1)

tepples (727027) | about 3 months ago | (#47810423)

In practice, wireless speeds increasing just means you'll burn through your 5 GB per month cap twice as fast.

Kodak had the right idea decades ago (4, Interesting)

Solandri (704621) | about 3 months ago | (#47810213)

For this solution to work, not only do you need to implement a new HTML element and get the servers and browsers to support it, but people uploading photos (or their servers) also need to generate and store multiple size versions of the same pic.

Kodak pretty much solved this problem in the 1990s with their ill-fated Photo CD format. JPEG encodes pictures in sequential 8x8 pixel blocks. So once you set the image size and encoding quality (which determine file size), everything from that point on is committed to those settings. By contrast, Photo CD [wikipedia.org] encoded a low-res Base version of the picture (512x768). A double-resolution version (1024x1536) called 4 Base is then created by doubling the size of the Base and storing the (compressed) delta between that and the resized original photo. The process is repeated for 16 Base (2048x3072).

Essentially, whereas JPEG stores the picture in sequential translated blocks, Photo CD stores the picture in zoomed blocks - kinda like a fractal. If you want the low-res Base version of the picture, you only have to read the first part of the image file. If you want the med-res 4 Base version, you read further along the image file. If you want the high-res 16 Base version, you read the entire image file. (Speaking of which, there was a fractal-based compression algorithm. But the licensing fees were so onerous it never went anywhere.)

Despite Kodak's eventual demise, they were at the forefront of digital imaging (why they held on as long as they did - they owned most of the digital photography patents). And their engineers put a lot of time and thought into the Photo CD format and future-proofing it.

Re:Kodak had the right idea decades ago (1)

QuasiSteve (2042606) | about 3 months ago | (#47810257)

There's also progressive JPEG - pretty much the same effect, you'd end up displaying a low-res/blurry version of the image first that gradually refines to a higher resolution version, building off of the earlier lower resolutions.

But how does a browser know how many bytes? (1)

tepples (727027) | about 3 months ago | (#47810489)

Progressive JPEG is almost a solution but still has a practical problem. Some HTTP servers don't support range requests for some reason, such as if the server's administrator has in the past seen abuse of range requests by "download managers" to try to download six ranges simultaneously. Besides, even on a server that supports range requests, the browser doesn't know in advance how many bytes to download for a particular spatial frequency to be filled in. This means a web browser can't just request enough data for a particular size. Instead, it has to wait until it has seen enough data and then close and reopen the HTTP connection and endure another TLS handshake and another TCP slow start before it can download more things.

Re:But how does a browser know how many bytes? (0)

Anonymous Coward | about 3 months ago | (#47810695)

Instead, it has to wait until it has seen enough data and then close and reopen the HTTP connection and endure another TLS handshake and another TCP slow start before it can download more things.

You are aware that HTTP allows more than one request to be sent on a single connection, right?

Overhead of dozens of requests (1)

tepples (727027) | about 3 months ago | (#47810805)

Sending a separate request for each 10,000 byte chunk of the picture will increase latency, as the browser has to look at each downloaded chunk and decode it to see whether it should send another request. It'll also burn through more of the user's data plan as the request headers and response headers are resent for each chunk. How should the browser estimate how big the chunk should be for a given pixel count?

Re:Kodak had the right idea decades ago (0)

Anonymous Coward | about 3 months ago | (#47810325)

Doesn't progressive JPEG do something similar?

Re:Kodak had the right idea decades ago (0)

Anonymous Coward | about 3 months ago | (#47810375)

Generally, "directory-in-a-file" designs have their own drawbacks, so this was by no means "solved". Kodak wanted to keep physical mediums in a lot of the process of photography. They were blindsided by cheap digital cameras and users wanting a simple file-system-based storage. PhotoCD has other problems, not the least of which is the patent.

Re:Kodak had the right idea decades ago (1)

alexhs (877055) | about 3 months ago | (#47810407)

AFAIK, PhotoCD uses JPEG. It's just that it's JPEG's hierarchical mode, that nobody else uses. (See my other post [slashdot.org] )

Re:Kodak had the right idea decades ago (0)

Anonymous Coward | about 3 months ago | (#47810443)

That's not really a solution to the same problem. This problem is about more than just not wasting bandwidth; it's about art direction. You might want to serve a different file for lower resolutions, not just a "lower resolution" one.

For instance, perhaps you'd like to serve an SVG for higher resolutions, but a custom-tailored PNG for very low resolutions. Or you might want to show the user a specific part of an image on smaller screens, or even an entirely different image altogether.

Besides that, browsers do not have to support Kodak's format now, nor Google's similar (but perhaps less thought-out) kitchen-sink format WebP, to "solve" the issues. In fact they can simply avoid format wars entirely, and ask the server for what they can support. Like how they were supposed to work to begin with.

Re:Kodak had the right idea decades ago (2)

StormReaver (59959) | about 3 months ago | (#47810657)

And [Kodak's] engineers put a lot of time and thought into the Photo CD format and future-proofing it.

And after the patent restrictions expire, this format may possibly become useful. As it is, this format is completely useless because of the patent threat.

Software Patents: killing innovation since 1998.

Re:Kodak had the right idea decades ago (1)

Anonymous Coward | about 3 months ago | (#47810739)

It's called JPEG 2000; it uses wavelet transforms instead of the discrete cosine transform that JPEG uses, and it has been around for over a decade. No one uses it.

Re:Kodak had the right idea decades ago (1)

Tablizer (95088) | about 3 months ago | (#47810837)

Agreed. It would be easier on many levels to promote and support the browser use of a progressive-resolution image-file format rather than overhaul markup standards and load & store multiple image versions on servers.

Let's hope sanity and logic prevail, and this tag idea is dumped.

And I hope patent issues don't derail the progressive-image approach either.

Further, we don't have to have an entirely new tag. Just add something like a LOWRES or LOWRESSRC attribute to the existing IMG tag. Old browsers would still use the regular image and ignore the new attribute. This is better backward compatibility than an entirely new tag. An entirely new tag would outright not function in older browsers. (The HTML standard says to ignore any attributes a browser does not recognize rather than skip the entire tag.)
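A sketch of that suggestion (LOWRESSRC is the commenter's hypothetical attribute; no browser implements it):

<img src="photo-full.jpg" lowressrc="photo-small.jpg" alt="Photo">

An old browser would ignore the unknown attribute and fetch photo-full.jpg as usual; a browser that understood the extension could start with photo-small.jpg when bandwidth or screen size warrants.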

Variable crop (1)

tepples (727027) | about 3 months ago | (#47810989)

A progressive resolution image format wouldn't help art directors show the image with a tighter crop on devices with lower pixel density.

Let's see (5, Funny)

Aaron H (2820425) | about 3 months ago | (#47810215)

Let's see how Internet Explorer can manage to munge THIS tag up.

Re:Let's see (1)

bruce_the_loon (856617) | about 3 months ago | (#47810527)

It will start by downloading the wrong size image first, then repeatedly redownload the image each time the screen refreshes or resizes until you run out of mobile data.

Re:Let's see (0)

Anonymous Coward | about 3 months ago | (#47810607)

Internet Explorer on a mobile device? My head asplode.

Let's see (0)

Anonymous Coward | about 3 months ago | (#47810669)

Using any web server other than IIS will make it download the super-ultra-hires version every time, proving IIS is the fastest web server.

That's it? (0)

Anonymous Coward | about 3 months ago | (#47810239)

How about making autoplay of videos an opt-in type of thing?

How about stopping all the damn links/like/whatever for social network sites?

And even Amazon, WTF? I do a search and their site just locks up. And I watch the status bar and it's hitting all these other sites for shit....

This image thing is being penny wise and pound foolish. Images are not the problem. All the other shit that every website has to bring in is causing the loading latency.

I mean really, adblock is great if only because it speeds up page loading, since advertising servers are shit.

In other words, images aren't the problem. It's the sites feeding into your site.

Re:That's it? (1)

Dracos (107777) | about 3 months ago | (#47810405)

I run AdBlock like most /.ers, but it doesn't get the chance to do much because I have ~132,000 hostnames mapped to 0.0.0.0 in my hosts file specifically for this reason (in Linux and Win2k anyway... Vista/7/8 can only handle about 3000). Anyone who watches me surf is amazed at how the Internet looks without ads.

Re:That's it? (1)

tepples (727027) | about 3 months ago | (#47810593)

If you want to regain support for big honkin' APK hosts files on Windows, you could try writing your own DNS server that runs on localhost and enforces your hosts file before passing other queries off to 8.8.4.4 and 8.8.8.8. I've written a few notes on efficient hosts file processing [pineight.com] .

Your Privacy are belong to U.S. ! (0)

Anonymous Coward | about 3 months ago | (#47810261)

We have an S.C.A. [1] ongoing. Suspect is swift, slimy, sticky, brown or chrome and spreading like cheap money.

[1] Super Cookie Alert

Forget it. (0)

Anonymous Coward | about 3 months ago | (#47810305)

JQuery Mobile.

Works fine.

Not an open problem. (1)

alexhs (877055) | about 3 months ago | (#47810333)

Retrieving optimized images from the server, based on device (desktop, tablet, phone) and the device's internet connection (fiber, broadband, mobile), has always been an open problem.

Nope. It was already solved by JPEG's hierarchical mode, more than twenty years ago. You're limited to scaled sizes that are the full size divided by a power of 2, but on the other hand the client wouldn't even need to inform the server: it could just do a partial download, up to the point where it has enough data for the desired resolution.

Re:Not an open problem. (1)

strstr (539330) | about 3 months ago | (#47810847)

this certainly sounds like the best solution. it's sort of like "bitrate" peeling: one file, all resolutions/quality levels built in.

the tag is not going to be a killer app because it's already possible to do this with JavaScript. JavaScript reads the screen resolution & downloads the appropriate version.

It does not need to be a dedicated tag to do it.
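The JavaScript approach the comment describes, sketched (file names invented; a real page would pick its own threshold):

<script>
// Pick an image variant based on the screen's physical resolution.
var wide = screen.width * (window.devicePixelRatio || 1) > 1024;
var img = document.createElement('img');
img.src = wide ? 'photo-large.jpg' : 'photo-small.jpg';
document.body.appendChild(img);
</script>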

What fucking idiots. :B

Designer (3, Informative)

CanadianMacFan (1900244) | about 3 months ago | (#47810343)

Or the graphics designer could just optimize the graphics so that they are the smallest possible size in the first place. Then they are the fastest for everyone. Designers, like programmers, are usually used to having the fastest computers and connections so they forget about people with slower computers and connections.

Smallest possible byte size depends on viewport (1)

tepples (727027) | about 3 months ago | (#47810629)

The smallest possible size in bytes for a photo in a 100x100 pixel viewport isn't the same as the smallest possible size in bytes for the same photo in an 800x800 pixel viewport, unless you want unacceptable blurring on the larger viewport.

Wrong Way to Solve This Problem (1)

wisnoskij (1206448) | about 3 months ago | (#47810385)

It seems to me that there is a far easier and more effective way to solve this problem.

This Solution: Have a list of images at different resolutions, and have the browser decide which one to pick.

Better Solution: Have an image format based on streaming. As the data is streamed in, the resolution increases. Based on my experience with slow internet, we already have a format very like this (starts out fuzzy but complete, and gains detail as the picture finishes loading). The browser just stops the stream when it has an image of the appropriate size. You do not need 18 different copies of the same picture lying around, and you do not need to expressly code for every circumstance. The HTML we already have and have been using for a long time is already capable, and it is forward and backward compatible. Visit a page with your computer hooked up to some one-of-a-kind 40K monitor: no problem, as long as the original image is high quality. Visit with a 640x* smartphone screen: no problem. The site designer does not have to know every use case; let the browser worry about that.

Can anybody tell me, please (1)

Mister Liberty (769145) | about 3 months ago | (#47810531)

What with desktop screens becoming wider and wider over the past ten years, what with mobile screens being and staying portrait throughout, what with the rationalizations given for each of these trends, these rationalizations being apparently highly selective of their own conclusions -- anybody, please, tell me where the rationale is in keeping, in wanting to keep, in wanting to program consciously, to remain within the appropriate standards, to keep ergonomics in mind, to... well, believe anything that has ever been brought up to justify 'the industry's' decisions in manufacture and marketing?

Re:Can anybody tell me, please (0)

Anonymous Coward | about 3 months ago | (#47810709)

Desktop screens have had two sizes in the past 10 years to my knowledge: 4:3 and 16:9 (or close to it), so they have not been getting wider and wider. As for the why: computers have always had landscape screens, and smartphones started out with portrait screens. Landscape is better for video (like a TV); portrait is easier for reading text. I think BlackBerrys had a square screen, and there were a few slide-out keyboard phones that had a wide screen. But I'd say the #1 reason screens are the way they are is "people expect it".

Re:Can anybody tell me, please (1)

tepples (727027) | about 3 months ago | (#47810841)

Desktop screens have had two sizes in the past 10 years to my knowledge: 4:3 and 16:9

Display aspect ratios aren't "sizes". A 24" desktop display can show a lot more than a 10" netbook or tablet display, even if they're both close to 16:9.

Re:Can anybody tell me, please (1)

pspahn (1175617) | about 3 months ago | (#47811119)

I don't know, what is the rationale in asking such a long-winded question when you could have made it clearer by saying it in about seven words?

Why trust industry standards when they are so fleeting?

Maybe you'd rather have standards that last forever? Or possibly no standards at all? What exactly are you getting at?

waste of time... (0)

Anonymous Coward | about 3 months ago | (#47811055)

They're already putting 2K displays on phones... it's just a matter of time until they put 4K displays on phones, and then there's no way a browser is going to download a lower-resolution image for the device. The screen is 4K, so just download the big picture....

Never mind that you can't see the difference unless you're extremely nearsighted and don't have your glasses on.
