
Comments


Born To RUN: Dartmouth Throwing BASIC a 50th B-Day Party

time961 Re:Err, not the "birth of time-sharing" (146 comments)

JOHNNIAC Open Shop System (JOSS) was another early time-sharing system, demonstrated in 1963. By 1964, the time-sharing idea was becoming widespread.

But, yes, indisputably, Dartmouth gave us BASIC, and like George Washington's proverbial axe (which had both its head and handle replaced multiple times), BASIC remains with us today. At least it's not as harmful as C; BASIC arrays and strings always had bounds-checking.

about 5 months ago

Born To RUN: Dartmouth Throwing BASIC a 50th B-Day Party

time961 Err, not the "birth of time-sharing" (146 comments)

In typically breathless university press-release prose, TFA says "on May 1, 1964, ... time-sharing and BASIC were born"

BASIC, sure, but time-sharing might better be dated to 1961, when CTSS was first demonstrated, and soon after widely used at MIT.

about 5 months ago

Ask Slashdot: Will Older Programmers Always Have a Harder Time Getting a Job?

time961 Economic bias, not just cultural (379 comments)

As others have observed, older workers tend to want to be compensated for their experience... so they're more expensive.

In a rational hiring world, that might not matter much--they usually deliver greater value, too--but it's often not rational people (or, let's be polite and say, people who could be better-informed) who are making that decision--it's people who want to minimize costs no matter what.

Hire an expensive engineer who really understands the work? Or two young cheap ones who might not? The latter, of course--for the same reason that outsourcing to the third world is so popular despite the incredible hurdles of management and quality. And if the bet fails, and neither of the young'ns can get it done (despite the 80-hour weeks that they can deliver and have come to expect), well, you'll be off to another job by then anyway and no one will know.

It's a vicious cycle: VCs like start-ups that live on ramen noodles because they're cheap to fund, unlike ones that have a real staff and a real track record. And sure, some of those cheap ones will succeed, and they'll get the press (in no small part because they are young), and that will perpetuate the myth that only young folks can innovate, leading the VCs to believe in their own decisions.

I don't see the bias going away. As a general rule, young people are less expensive, more dedicated, more attractive, and just more fun than us old farts. The market wants crap in a hurry, and this is one of the primary reasons they get it.

about 6 months ago

Jon Oxer Talks About the ArduSats That are On the Way to ISS (Video)

time961 Re:Missed opportunity? (17 comments)

You might think that larger gates are an inherent advantage, but it's not that simple. To a modest extent, the advantage is there, but the counter-effect is also strong: smaller gates have that much less cross-section in which a particle hit can deposit charge or cause damage. In practice, radiation tolerance is much more dependent on a bunch of other process characteristics, and it is very difficult to predict.

Failover is rarely "simple". There's a lot of code and mechanism, somewhere, to decide when a failure has occurred, determine the kind of failure, apply applicable recovery procedures, and restore what context can be restored and resume. This is a lot easier to do when you're not also trying to fit in 32KB of flash.

Space computing is very conservative. It is astonishing how much has been accomplished with such simple processors. But advances in the semiconductor art beg to be used, and projects like this could help light the way if not hamstrung by limited architectural choices.

about a year ago

Jon Oxer Talks About the ArduSats That are On the Way to ISS (Video)

time961 Missed opportunity? (17 comments)

Arduinos make somewhat more sense than phonesats (Really? We're sending a touch-screen and graphics controller into low earth orbit? Because the boss couldn’t think of any sillier project and had a spare $100K for launch costs?).

But it's hard to understand why a 17-wide parallel configuration of 8-bit microprocessors each having just 2.5KB of RAM makes for a sensible satellite payload processor. Why not something with an architecture more like a Raspberry Pi or BeagleBoard? Not those specific boards necessarily, but a similar, simple one-chip SoC approach and a decent amount of memory. A processor like that could drive a bunch of experiments (more than you can fit in a Cubesat), and have enough room for the software to be comfortable and maybe even maintainable on-orbit.

A SoC-based system would fit in the same low cost profile but could run much more interesting applications. Ardusat feels like a missed opportunity, because it has lots of other things going for it: open source, submission process, international coalition, hobbyist/student focus, etc.

about a year ago

Massachusetts Enacts 6.25% Sales Tax On "Prewritten" Software Consulting

time961 Just be sure your customers acknowledge it (364 comments)

Consultants can largely solve this problem by having customers declare explicitly that the work doesn't fall in the realm of taxable services as defined by the ruling.

There's so much ambiguity in the wording that as long as you're not in the crosshairs of being a reseller who supplies expensive software (think Oracle, not so much Windows) in the guise of a (heretofore) non-taxed service, you'll be fine. It's not worth their time to enforce it otherwise.

The key is being creative. Supplying customized Drupal installations? No, you're writing unique software to customer specifications for the customer to use with their existing Drupal platform. And maybe you're supplying training about operation and installation of Drupal systems. And helping them evaluate their business needs that might be met by aforesaid custom software. The ruling (section II) even explicitly exempts "training" and "evaluation". Maybe a small fraction of your business might fall under the ruling, but if that's the case, you just need to make sure it's covered by separate contracts. If there isn't significant money flowing out of your business for (reseller tax-exempt) software that your customers eventually get, it will be pretty challenging for the DOR to argue that your business is taxable... as long as you're smart about how you define the business.

I'm as worried as the next fellow about jackbooted thugs from the government running my business into the ground. However, the reality here is that these are overworked civil servants who are motivated by meeting their goals--and they'll do that by pursuing the cases that the statute is intended to target, because those will be most likely to generate revenue. No bureaucrat wants a lawsuit, they want passive compliance. Maybe ten years from now, it will be different, but if it is, I'd bet it's because the law is expanded (to cover all services, in the name of "fairness"), not because this statute is egregiously misinterpreted.

about a year ago

How Do You Get Better Bug Reports From Users?

time961 Embed logging technology in your software (205 comments)

By this I mean that you should instrument the code with real, meaningful activity logging, not just some afterthought that grabs a stack trace and some state variables (although you'll want to have that data, too). If you instrument your code with logging that produces readily human-interpretable information about what's going on, the payback is huge, because it makes internal developers' lives easier, and it allows even first-level support folks to do a better job of triage and analysis. It's really important to make it meaningful to the human reader, not just "readable"--an XML representation full of hexadecimal doesn't cut it; it needs to include symbolic names.

Let the users see the logged data easily, if they ask for it, and maybe give them a single knob to turn that controls the level of logging. This will help technically sophisticated users give more useful reports, and it's really helpful in any sort of interactive problem resolution (OK, do X. Now read the last few log messages. Do any of them say BONK?).

It's really useful to include high-resolution time--both clock time and accumulated CPU time--in log messages. This is great for picking up weird performance problems, or tracking down timeouts that cause mysterious hangs. Depending on your architecture and implementation technology, other sorts of "ambient" data (memory usage, network statistics) can be useful here, too.
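A minimal sketch of what that could look like, assuming Python's standard logging module (the subsystem name and message are invented for illustration): it stamps each record with microsecond wall-clock time and the process's accumulated CPU time.

    import logging
    import time

    class TimingFormatter(logging.Formatter):
        """Prefix each record with wall-clock time (microsecond resolution)
        and the accumulated CPU time of the process."""
        def format(self, record):
            wall = time.strftime("%H:%M:%S", time.localtime(record.created))
            micros = int((record.created % 1) * 1_000_000)
            cpu = time.process_time()   # CPU seconds consumed so far
            return (f"{wall}.{micros:06d} cpu={cpu:.3f}s "
                    f"{record.levelname} {record.getMessage()}")

    log = logging.getLogger("orders")   # illustrative subsystem name
    handler = logging.StreamHandler()
    handler.setFormatter(TimingFormatter())
    log.addHandler(handler)
    log.setLevel(logging.DEBUG)

    log.info("order %s submitted by user %s", "A-1021", "alice")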

There's a trade-off between logging by frameworks, mixins, macros, etc., and logging of specific events. The former approach gets comprehensive data, but it often can't provide enough contextual semantic information to be meaningful. The latter approach scatters logging ad-hoc throughout the code, so it's very hard to make any argument for comprehensiveness, but if done properly, it's spot-on for meaningful messages. Usually best to do some of each, and have good control knobs to select.
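A hedged sketch of that split, again in Python with invented names: the decorator gives blanket entry/exit coverage with little semantic context, while the explicit call inside the function records the event in terms a human actually cares about.

    import functools
    import logging

    logging.basicConfig(level=logging.DEBUG,
                        format="%(asctime)s %(levelname)s %(message)s")
    log = logging.getLogger("orders")   # illustrative subsystem name

    def traced(func):
        """Framework-style logging: comprehensive, but context-poor."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            log.debug("enter %s args=%r kwargs=%r", func.__name__, args, kwargs)
            result = func(*args, **kwargs)
            log.debug("exit %s -> %r", func.__name__, result)
            return result
        return wrapper

    @traced
    def apply_discount(order_id, percent):
        # Event-style logging: placed by hand, but meaningful to a human reader.
        log.info("applying %d%% discount to order %s", percent, order_id)
        return percent / 100.0

    apply_discount("A-1021", 15)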

Logging can generate a lot of data, so it's important to be able to minimize that burden during routine operation (especially in deployed applications, where there should be a strict limit on the amount of space/time it takes up). But it's also useful (especially when it's configured to generate a lot of data) to have tools that allow efficient ad-hoc review and analysis--an XML tree view, maybe filtered with XSLT, can be easier than a giant text file.

In any complex system, logging is one of the very first things I recommend implementing. After the architecture is settled enough to know what will be some of the meaningful activities and objects to record, bolting in a high-efficiency, non-intrusive logging infrastructure is the very next step. Then comes business logic, user interface, and all the other stuff. Pays for itself many times over.

about a year ago

NSA Backdoors In Open Source and Open Standards: What Are the Odds?

time961 Re:Historically, NSA have done the opposite. (407 comments)

Considering the rest of Coppersmith's work, I have no trouble believing in his genius or that he independently invented differential cryptanalysis. Are you suggesting that he didn't, and instead lied about it 20 years later?

Your post rather mischaracterizes the content of that section of Wikipedia. It is hardly "everyone else's version" that NSA made changes. That section cites both the Senate inquiry and Walter Tuchman (then of IBM) as saying that NSA did not dictate any aspect of the DES algorithm. The Konheim quote ("We sent the S-boxes to Washington...") is an un-referenced comment from Applied Cryptography (which says "Konheim has been quoted as saying..." without saying where or by whom). Schneier goes on to express admiration for IBM's work and how it scooped the rest of the open crypto world for 17 years.

In any case, the important point is that changes were made, whether by IBM alone or in collaboration with NSA, and they unequivocally made the algorithm much better, as opposed to the conspiracy theory that NSA made it worse. The 56-bit key is reasonably commensurate with the security DES actually supplies (against the attacks of the day, secret and otherwise). Now if it had turned out to be weak against linear cryptanalysis, or indeed any other attack of the last 40 years, that would be news--but it's not weak, it's just average, strongly suggesting that no better attacks were known back then.

about a year ago

NSA Backdoors In Open Source and Open Standards: What Are the Odds?

time961 Re:Historically, NSA have done the opposite. (407 comments)

Biham and Shamir, Differential Cryptanalysis of the Data Encryption Standard, at CRYPTO '92. They showed that the S-boxes were about as strong as possible given other design constraints.
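For readers who haven't seen the technique, here is a rough sketch of what "resistance to differential cryptanalysis" measures: for every input XOR difference, count how the output differences distribute; the flatter that table, the harder the attack. The 4-bit S-box below is made up for illustration (real DES S-boxes map 6 bits to 4).

    from collections import Counter
    from itertools import product

    # A made-up 4-bit S-box, purely for illustration.
    SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
            0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]

    def difference_distribution(sbox):
        """Count, for each input difference dx, how often each output difference dy occurs."""
        ddt = Counter()
        for x, dx in product(range(len(sbox)), repeat=2):
            dy = sbox[x] ^ sbox[x ^ dx]
            ddt[(dx, dy)] += 1
        return ddt

    ddt = difference_distribution(SBOX)
    worst = max(n for (dx, _), n in ddt.items() if dx != 0)
    print(f"largest entry for a nonzero input difference: {worst} of {len(SBOX)}")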

Subsequently, Don Coppersmith, who had discovered differential cryptanalysis while working (as a summer intern) at IBM during the development of DES in the early 1970's, published a brief paper (1994, IBM J. of R&D) saying "Yep, we figured out this technique for breaking our DES candidates, and strengthened them against it. We told the NSA, and they said 'we already know, and we're glad you've made these improvements, but we'd prefer you not say anything about this'." And he didn't, for twenty years.

Interestingly, when Matsui published his (even more effective) DES Linear Cryptanalysis in 1994, he observed that DES was just average in resistance, and opined that linear cryptanalysis had not been considered in the design of DES.

I think it's fair to say that NSA encouraged DES to be better. But how much they knew at the time, and whether they could have done better still, will likely remain a mystery for many years. They certainly didn't make it worse by any metric available today.

about a year ago

NSA Backdoors In Open Source and Open Standards: What Are the Odds?

time961 The GSM ciphers are an interesting story (407 comments)

I can't find a good reference right now, but I recall reading a few years back the observation that one of the GSM stream ciphers (A5/1?) has a choice of implementation parameters (register sizes and clocking bits) that could "hardly be worse" with respect to making it easily breakable.

This property wasn't discovered until it had been fielded for years, of course, because the ciphers were developed in the context of a closed standards process and not subjected to meaningful public scrutiny, even though they were nominally "open". The implication was that a mole in the standardizing organization(s) could have pushed for those parameters based on some specious analysis without anyone understanding just what was being proposed, because the (open) state of the art at the time the standard was being developed didn't include the necessary techniques to cryptanalyze the cipher effectively. Certainly the A5 family has proven to have more than its fair share of weaknesses, and it may be that the bad parameter choices were genuinely random, but it gives one to think.
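To make "register sizes and clocking bits" concrete, here is a toy majority-clocked LFSR generator in the general shape of A5/1. The register lengths (19, 22, 23 bits) and the majority-clocking idea follow the commonly published description; the feedback taps and starting states below are placeholders, not the real cipher, so treat this strictly as a sketch of where such parameter choices live.

    # Toy keystream generator in the general style of A5/1 (placeholder taps/state).
    SIZES      = [19, 22, 23]          # register lengths as commonly described
    CLOCK_BITS = [8, 10, 10]           # one clocking bit per register (commonly cited)
    TAPS       = [(13, 16, 17, 18), (20, 21), (7, 20, 21, 22)]   # placeholders

    def majority(bits):
        return 1 if sum(bits) >= 2 else 0

    def step(regs):
        """Clock only the registers whose clocking bit matches the majority vote."""
        m = majority([(regs[i] >> CLOCK_BITS[i]) & 1 for i in range(3)])
        out = 0
        for i in range(3):
            if ((regs[i] >> CLOCK_BITS[i]) & 1) == m:
                fb = 0
                for t in TAPS[i]:
                    fb ^= (regs[i] >> t) & 1
                regs[i] = ((regs[i] << 1) | fb) & ((1 << SIZES[i]) - 1)
            out ^= (regs[i] >> (SIZES[i] - 1)) & 1   # keystream bit = XOR of high bits
        return out

    regs = [0b1011001110001010101,
            0b0110101011110010101001,
            0b10011100101011110001101]               # arbitrary nonzero fill
    print([step(regs) for _ in range(16)])

The point of the parameter-choice worry is that everything above (sizes, taps, clocking positions) looks innocuous on paper; only careful cryptanalysis tells you whether a particular combination is strong or "could hardly be worse".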

Perhaps some reader can supply the reference?

The 802.11 ciphers are another great example of the risks of a quasi-open standardization process, but I've seen no suggestion that the process was manipulated to make WEP weak, just that the lack of thorough review by the creators led to significant flaws that then led to great new research for breaking RC4-like ciphers.

about a year ago

Speeding Ticket Robots — Laws As Algorithms

time961 A job for legislators, not programmers (400 comments)

The truly frightening thing about this article is that the authors apparently felt it was the job of the programmers to determine what the reasonable algorithmic interpretation of the law's intent was, thus again demonstrating how completely out of touch with reality many academics seem to be.

The legislative process is appallingly imperfect, to be sure, but at least it has the pretense of openness and consideration of constituent interests. That's where these decisions need to be made.

Fortunately, since legislators break these laws as much as the rest of us, we probably don't have too much to worry about. Think about all those electronic toll systems--they certainly know how fast you were going, on average, and an intuitive application of the mean value theorem will quickly show that you were speeding, but we rarely if ever get tickets from those systems.
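Illustrative arithmetic only (the gantry spacing, times, and limit are invented): if the transponder reads show 42 miles covered in 30 minutes, the average speed is 84 mph, and the mean value theorem says the instantaneous speed equaled that average at some point in between, so the driver was provably over a 65 mph limit somewhere on that stretch.

    # Invented numbers, purely to illustrate the mean-value-theorem argument.
    gantry_distance_miles = 42.0     # spacing between two toll gantries (hypothetical)
    elapsed_hours = 30 / 60          # time between the two transponder reads
    speed_limit_mph = 65.0

    average_mph = gantry_distance_miles / elapsed_hours
    if average_mph > speed_limit_mph:
        # At some instant the speedometer matched this average, hence a violation.
        print(f"average {average_mph:.0f} mph exceeds the {speed_limit_mph:.0f} mph limit")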

about a year and a half ago

Are Lenovo's ThinkPads Getting Worse?

time961 Lower-quality, Market-trailing (271 comments)

I've used Thinkpads exclusively since I bought a 560 in late 1996. I'm currently using a 2009-vintage W500 and hoping it doesn't break, because it has more pixels (1920x1200) than any Windows laptop made today. They've always been rugged, functional, and effective tools for getting work done.

What did I want from yesterday's Lenovo announcement? A retina-class (i.e., 2560x1600) display, modern CPU/memory/SSD hardware, and no significant changes elsewhere, because Thinkpads are in fact pretty darn well-engineered (and designed), and remarkably reliable.

What did I get? A paean to how important it is to design for millennials (who apparently need dedicated multimedia buttons), a bunch of important features gone (physical buttons? function keys? replacement battery? indicator LEDs? Thinklight?) and an explanation that the single hardest decision they had to make for the T431 was how to re-orient the logo on the lid. I can't even get a big SSD--their largest is 256GB, unlike the 600GB Intel unit I installed in the W500 18 months ago.

Bah. I'd vote with my feet, except there aren't any alternatives. Why is there no Windows laptop with a high-resolution display? I suppose I can get a Macbook or a Chromebook and run everything in a VM. But then there's no Trackpoint.

about a year and a half ago

Will Renewable Energy Ever Meet All Our Energy Needs?

time961 Planetary damage and energy vs. fantasy (626 comments)

It must be silly season over at the good ol' BAS. First we get "bio-terror is impossible", and now this. I miss Hans Bethe.

Other posters have pointed out how silly it is to base any argument on hundreds of years of exponential growth. Yep, if that happens, and all other things stay the same, we're screwed. But clearly, all other things aren't going to stay the same. Even Malthus knew that argument is bogus.

Will population and concomitant energy use increase inexorably? Err, maybe not. There's a lot of demographic evidence that population growth slows, even reverses, as living standards improve and, especially, as women become better educated and control their own destinies.

Can solar (or nuclear) solve all our energy problems? Probably not, at least not without a lot of improvement in battery technology, because the energy density of hydrocarbons is so appealing. And there are indeed real resource issues that may put a crimp in massive production of electronics, solar panels, transmission lines, reactor vessels, you name it. For production on a significantly more massive scale, those issues need to be addressed. But scarcity relative to current practices is a strawman--as material costs increase, economic pressures generally yield optimizations. A lot of these look like issues because nobody has even tried to solve them, because material supplies haven't been an issue.
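For a rough sense of why that density matters (ballpark figures from general references, not from TFA): gasoline stores on the order of 46 MJ/kg, while a good lithium-ion cell is around 200 Wh/kg, a gap of more than fifty-fold.

    # Ballpark energy-density comparison; both figures are approximations.
    gasoline_mj_per_kg = 46.0                        # specific energy of gasoline, approx.
    liion_wh_per_kg = 200.0                          # a good lithium-ion cell, approx.
    liion_mj_per_kg = liion_wh_per_kg * 3600 / 1e6   # convert Wh/kg to MJ/kg

    print(f"Li-ion: {liion_mj_per_kg:.2f} MJ/kg")
    print(f"gasoline-to-Li-ion ratio: {gasoline_mj_per_kg / liion_mj_per_kg:.0f}x")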

Is conservation important? Yah, you betcha. The cheapest energy of all is that which doesn't get used.

Is energy supply the compelling motivation for solar? No, it's climate change and pollution. The longer we dither about renewables, the sooner we will face the massive costs for mitigating all the damage caused to date. We'll pay a lot of those costs eventually--the harm is too far along to cure itself. But at this rate, it's not our grandchildren, or our children, who will be paying for huge sea walls around Manhattan, it's us! The longer we can push off those mitigations, the easier they will be. That, to my mind, is the overwhelming argument for solar (and other low-emission) energy.

about a year and a half ago

Putting Biotech Threats In Context

time961 revolution vs. evolution (117 comments)

In the subject article, Kathleen Vogel argues, drawing on research in the field of Science and Technology Studies (of which, no doubt coincidentally, she is a professor), that technology development proceeds more along an "evolutionary" path than a "revolutionary" path.

This is actually not a bad argument if one is describing the integration of technologies into the overall fabric of societies. Even though technologies do tend to develop in a revolutionary fashion (a well-established fact that she'd apparently like to ignore), actual widespread integration and beneficial use of technologies is a much slower and more uncertain process.

With respect to bio-terrorism, however, it is a supremely irrelevant argument. The putative terrorist's goal is not beneficial integration of technology into society, it is the use of technology to induce terror. While, for example, beneficial technology needs to be safe and effective, and particularly in the health and biotech sector is held to extremely high standards for those attributes, a terrorist's exploitation of technology need be neither. If it even works a little, and doesn't kill the terrorist before deployment, it's good enough. The evolution model is simply inapplicable here, and Vogel should be embarrassed to make such a fallacious argument.

Not that the revolution argument is all that great, either, as a predictor of terrorism. Biotech is hard, and even marginal success is challenging to obtain. But the evidence is incontrovertible that high-quality tools for biotech experimentation have become far more effective and far less expensive over the last 20 years, and the knowledge base has certainly increased dramatically. However, assimilating that knowledge base and using it effectively for developing weapons is not trivial--the literature tends to focus on beneficial applications, and a fair amount of skill and insight is needed to move in other directions.

In the end, these factors are unknowable (except perhaps the equipment cost advantage). That's what makes the threat worth studying, and that's one of the reasons we have an intelligence community: to think, as Herman Kahn put it, about the unthinkable. I certainly wouldn't argue that the IC always does a great job at those analyses, but at least they do them. Arguing, from an irrelevant academic viewpoint, that we can ignore the bio-terror threat because industry hasn't been able to bring effective gene therapies to market, well, that's just silly.

about a year and a half ago

Ask Slashdot: What Equipment and Furniture For an Electronics Hardware Lab?

time961 furniture, lighting, storage (208 comments)

I recently put together such a lab in a room in my office space.

Electronic equipment depends completely on what kind of work you're doing: digital, analog, high-speed, low-signal, RF, etc. So it's hard to answer that question.

Pretty much everything, however, needs some basics: ESD protection, furniture, lighting, storage.

ESD protection: Install a conductive tile floor. Most vendors for this stuff prefer to work on whole buildings; finding someone to do a single room took a bit of looking. I ended up buying the tiles myself (from StaticWorx, from their odd lots selection at about $3/sqft) and hiring one of the big company's installers to moonlight over a weekend.

Other folks have talked about grounding. It's just as important as they say. Most electricians who do commercial work will understand how to get this right.

Furniture: Get one or two heavy duty lab benches with anti-static surfaces and shelves above the bench. It's a little detail, but I recommend bullnose fronts instead of square, to make chipping and other damage less likely. Benches should be 36" deep if you have the room, so you can have relatively deep equipment on the bench and still have room to work in front of it.

You should be able to get behind the bench to fuss with cabling and such (and to vacuum... dust accumulates like nobody's business if you have your test equipment pushed all the way back to the wall behind it).

I have a couple of anti-static lab chairs: conductive fabric, little chain to connect to the anti-static floor.

I also have a big folding table that I unfold when I need to lay out a bunch of stuff and reorganize it.

Lighting: This is really important. You can't have too much. I have a bunch of 4-bulb T-8 fluorescent fixtures on the ceiling (in several groups with different switches, so it doesn't have to be that bright all the time).

I also have a big magnifying lamp, and a big stereo microscope, although I'm still looking for a good solution for lighting on the microscope.

Storage: I have a bunch of little drawer cabinets. Most aren't anti-static, so I have a lot of stuff in conductive foam. It's a trade-off: anti-static is safe but opaque, while clear drawers are a lot easier to work with (and cheaper). A lot of stuff (machine screws, switches, resistors) doesn't need anti-static.

I also have a bunch of open shelves filled with Akro-Mils plastic bins. These are great for storing miscellaneous stuff like multimeters, tools, small project pieces, larger components, etc. They come in many colors, which I've never figured out how to use effectively as an organizing scheme. I try to keep everything loose in one of these bins so it's easy to put a bunch of bins on the shelf to make room for a project.

I do a fair amount of work with surface-mount devices, and I struggled with how to store them. It's a nuisance to handle the devices in cut tape form: the tape is bulky and springy and clumsy, and it's a pain to get devices out of it one at a time. Once extracted, the devices are way too small to make effective use of drawer cabinets: it's like storing grains of sand. But then I found these nice little (conductive) aluminum canisters at American Science & Surplus, and they're great: about an inch in diameter, glass window in the lid, and stored 20 to a small aluminum box the size of a small book. I now have a bunch of those "books".

Multimeters: Someone suggested getting a bunch of cheap multimeters; this is a great idea. $5 each from Harbor Freight (or free sometimes with a coupon). An extra cheap oscilloscope isn't a bad idea, either--an old Tek 465 is cheap on eBay and very quick to use.

Lots of power outlets: I ended up bolting a ton of cheap power strips all over the benches, because there are so many things that need power. And because so many of them use power bricks, it's important to have plenty of access around the power strips--they can't be mounted under a shelf or in a corner.

Lots of USB: I also mounted a lot of easily accessible USB hubs (and some Ethernet) to accommodate the many devices that connect by one or the other. I really don't like crawling around to mess with cables.

Articulated monitor mounts: Most of the time, I don't need to interact with the computer. It's nice to be able to push the monitor(s) out of the way (and hang the keyboard/mouse out of the way, too).

about 2 years ago

NSA Mimics Google, Angers Senate

time961 Who benefits? (193 comments)

Clearly, someone must have paid for this charming little legislative tidbit. But who?

I mean, I could understand if Lockheed-Martin had a proprietary solution that they were offering (with just a few change orders needed to satisfy NSA's requirements, of course), but the beneficiaries here seem to be the Cassandra and HBase projects, neither of which seem likely to have much of a lobbying budget. Was it their forebears at Facebook? Could they possibly care enough?

And blaming it on "conservatives-want-smaller-government" seems pretty silly, too. Sure, turfing Accumulo might conceivably further that goal in some tiny, tiny way, but it's not like some senator was likely to have figured this out by himself. No, clearly someone put them on to it, but who and why?

It's an intriguing mystery. Any ideas?

more than 2 years ago

Cisco's Cloud Vision: Mandatory, and Killed At Their Discretion

time961 If Cisco can do it, who else can? (307 comments)

I mean, what a great opportunity for malware distribution, sabotage, spying, etc... Just connect to every "Linksys" router you can find and "upgrade" its firmware yourself! (change them all to DD-WRT, maybe?).

Since experience tells us that mechanisms like this are rarely, if ever, properly secured, this seems like a major security catastrophe in addition to a privacy debacle. Even if sound cryptography and digital signatures are employed to make sure the updates are valid, there may be implementation flaws in the routers, vulnerabilities in Cisco's upgrade servers, key leakage, bad protocol design, etc.

Wow.

more than 2 years ago

Cisco Pushing 'Cloud Connect' Router Firmware, Allows Web History Tracking

time961 Re:Verizon has been doing it for ages. (351 comments)

It'd be interesting to know how this feature is "secured", too. How hard would it be for J. Random BadGuy to connect to my FIOS router and replace its firmware?

more than 2 years ago

Cisco Pushing 'Cloud Connect' Router Firmware, Allows Web History Tracking

time961 Looks like an opportunity to me! (351 comments)

What enables Cisco to DO remote management? And what ensures that no other entity in the world can remotely "manage" my router in the same fashion?

What a great avenue this could be for malware distribution, intelligence collection, massive denial of service, etc. Be pretty cool for bad guys (or LE or TLAs) to be able to replace the firmware in millions of routers unbeknownst to their owners.

Does anyone here know how Cisco's remote management interface is "secured"? Even if there's sound cryptography involved, there's always router software flaws, bad key generation, vulnerabilities in Cisco's upgrade servers, poor operational security at Cisco, and other avenues to attack the overall system. And of course the cryptography itself might be unsound, too... usually takes folks a couple tries to get that right.
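At a minimum, "secured" would mean the router verifies a detached signature on any pushed image against a vendor key pinned at manufacture before flashing it. Here's a hedged sketch of just that one step, assuming the pyca/cryptography package; the file names, key handling, and choice of Ed25519 are illustrative, not Cisco's actual scheme, and note how many other links (key storage, update servers, the image parser itself) remain attackable even if this part is right.

    # Minimal sketch of firmware-image signature verification (illustrative only).
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    def verify_firmware(image_path, sig_path, pinned_pubkey_bytes):
        """Return True only if the image matches the detached signature."""
        pubkey = Ed25519PublicKey.from_public_bytes(pinned_pubkey_bytes)
        with open(image_path, "rb") as f:
            image = f.read()
        with open(sig_path, "rb") as f:
            signature = f.read()
        try:
            pubkey.verify(signature, image)   # raises InvalidSignature on mismatch
            return True
        except InvalidSignature:
            return False

    # Hypothetical usage: refuse to flash unless verification succeeds.
    # if not verify_firmware("fw-2.1.bin", "fw-2.1.sig", VENDOR_PUBKEY): abort_update()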

Wow.

more than 2 years ago

Submissions


Security Technology Doesn't Work

time961 writes  |  more than 3 years ago

time961 (618278) writes "At least that's the message in Information Week's October 11 cover story, Outgunned: How Security Tech Is Failing Us. Probably not news to many Slashdot readers, but among the many entertaining statistics are:

79% of respondents said that they had been victims of malware that evaded their AV measures. How? Automated, point-and-click obfuscation technology for virus creators is a big part of it. And the 10 million new malware examples McAfee sees every year are only the tip of the iceberg these days.

A best-case evaluation of "vulnerability assessment tools" says that they find only 20-30% of flaws, and that real-world tests would probably be worse.

92% of information losses involve databases, yet very little budget is spent on DBMS security. Even better, 70% of respondents said they were assessing database security, but 64% of those same respondents said they didn't know how they were doing so.

For the mainstream press, it's pretty remarkable to see an article suggesting that most of the security industry is a sham, peddling outdated products that cost a lot, add lots of overhead, and accomplish nothing of value."


Will telephone calls ever get better?

time961 writes  |  more than 4 years ago

time961 (618278) writes "Older readers may remember a time when telephone calls sounded good. The engineering genius of Bell Labs allowed telephone companies to wrest exceptional results from the meager 3 KHz of allotted bandwidth: calls were crisp and quiet with no time lag, the parties could talk simultaneously, and quality was, roughly, related to distance: when I called my next door neighbor, my electrons zipped downtown at nearly the speed of light and zipped right back out again. Eventually, even transcontinental long-distance worked very well--remember when Sprint advertised their Fiber Optic Network by saying "You can hear a pin drop"?

Of course, that was when telephones connected one individual person in one place to another individual in an equally fixed location, using hardwired equipment that could do nothing more (or less) than place and receive calls. Today, we have a myriad of telephone-related services, some unimagined 25 years ago: ubiquitous mobile phones, speakerphones everywhere, calls placed by computer, trivial conference calling, etc. And we get many of these services at amazingly low prices, or even for free. However, it seems like communication quality has plummeted as variety has expanded. Theodore Vail must be spinning in his grave.

For example, people are embracing the "mobile only" lifestyle, yet voice quality from one cellphone to another is often abysmal. This is understandable, say, if both parties are outside in noisy environments, but it's only a little better when both are in quiet, empty rooms--it still sounds like we're gargling marbles. Worse, if we try to talk over each other, all is lost, and we have to wait a second or two before trying to speak again (unless the call gets randomly dropped, in which case the wait is longer). It's not as bad as CB radio, but it's sure not like hearing a pin dropping. Even local calls, apparently between landlines, are worse. I may have a hard copper connection back to my central office, but by the time my voice has been digitized, turned into packets, and sent through routers in East Overshoe, Nebraska before finally getting back to my neighbor's "Triple Play" cable modem phone, it, too, is a low-quality, noisy, choppy imitation of what the Bell System once provided. And speakerphones--what is it with high-ranking executives who, alone in their empty offices, say "I'm gonna put you on speaker" and then end up sounding like they're in a submarine? Are they seeking plausible deniability on the grounds that the other party couldn't actually understand what they said?

But that's all whining for background. My question is, is this situation inevitable? Can it get better? Will it? Are we just in a period where new technologies haven't quite been tamed, much as the early steamship era was punctuated by boiler explosions? Or is the tradeoff of service variety for quality something that can't be avoided (or undone)? Obviously, there's a huge collection of technologies underlying modern telecommunications, and they operate and interact in complex and mysterious ways, so no one factor is to blame. But is that technology even capable of providing good voice quality? What are the technical roadblocks? Is it primarily an economic issue? What are the economic obstacles? Conversation is such a basic human activity, it seems important to have the technology work better."

Office 2003SP3: Old file formats, now unavailable!

time961 writes  |  more than 6 years ago

time961 writes "In Service Pack 3 for Office 2003, Microsoft has disabled support for many older file formats, so if you have old Word, Excel, 1-2-3, Quattro, or Corel Draw documents, watch out! They did this because the old formats are "less secure", which actually makes some sense, but only if you got the files from some untrustworthy source.

Naturally, they did this by default, and then documented a mind-bogglingly complex workaround (KB 938810) rather than providing a user interface for adjusting it, or even a set of awkward "Do you really want to do this?" dialog boxes to click through. And, of course, because these are, after all, old file formats, many users will encounter the problem only months or years after the software change, while groping around in dusty and now-inaccessible archives.

One of the better aspects of Office is its extensive compatibility mechanisms for old file formats. At least the support isn't completely gone—it's just really hard to use. Security is important, but there are better ways to fulfill this goal.

This was also covered by the Windows Secrets newsletter, although I can't find a story URL for it."

Journals

time961 has no journal entries.

