ptaff writes "I recently upgraded a friend's computer from an old Windows to the latest Ubuntu LTS (so that I can avoid reinstalling for a while). As he gets online through a dial-up connection, he now complains that downloading updates takes forever compared to his old Win box. I understand that GNU/Linux distributions update all installed software, but I wonder: does the update process need to use so much bandwidth just to fix buffer overflows and input validation? At the very least, why aren't the executables and libraries (which may be insecure) packaged separately from the data, docs and multimedia? Why isn't there a system to simply fetch the fixes and apply them to the original package? Are these limitations of the package managers' designs, or is there a more subtle explanation? Since reducing update bandwidth seems so simple at first sight, what do you think are the reasons we still have to download so much to keep a GNU/Linux system secure?"
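The "fetch only the fixes" idea the submitter asks about is delta updating: compare the old and new versions of a file, ship only the differing bytes, and reconstruct the new file on the client. Real tools (e.g. Debian's debdelta, Fedora's deltarpm) use binary diff formats, but the core idea can be sketched in plain Python with the standard library's `difflib`. Everything below (the `make_patch`/`apply_patch` names and the patch format) is a hypothetical illustration, not any distribution's actual mechanism:

```python
import difflib

def make_patch(old: bytes, new: bytes):
    """Build instructions that turn `old` into `new`.

    'copy' entries refer back into the old file already on disk,
    so only the 'insert' payloads would need to cross the wire.
    """
    sm = difflib.SequenceMatcher(None, old, new)
    patch = []
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            patch.append(("copy", i1, i2))       # reuse local bytes, costs ~nothing
        elif j2 > j1:                            # 'replace' or 'insert'
            patch.append(("insert", new[j1:j2])) # only this data is downloaded
        # 'delete' needs no payload at all
    return patch

def apply_patch(old: bytes, patch) -> bytes:
    out = bytearray()
    for op in patch:
        if op[0] == "copy":
            _, i1, i2 = op
            out += old[i1:i2]
        else:
            out += op[1]
    return bytes(out)

# Toy example: a "package" where a one-line security fix changes a few bytes.
old = b"libfoo 1.0: lots of unchanged code and data ... " * 100
new = old.replace(b"1.0", b"1.1", 1)

patch = make_patch(old, new)
assert apply_patch(old, patch) == new
transferred = sum(len(op[1]) for op in patch if op[0] == "insert")
print(f"full download: {len(new)} bytes, delta: {transferred} bytes")
```

The catch, and a large part of the answer to the submitter's question, is that packages are shipped compressed: a tiny source change can ripple through the entire compressed archive, so naive byte-level diffs of `.deb`/`.rpm` files save little, and delta schemes must diff the uncompressed contents and recompress deterministically on the client.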
ptaff writes "From the release notes: 'METRo is a program used on an operational basis since 1999 that, together with the input of an atmospheric forecast, road composition and observations from a road weather station, produces a local road forecast for a 48-hour period.' As most countries develop their own environmental forecast models behind closed doors, this move by the Canadian government might spur fruitful collaboration between meteorological bodies around the world."