
Comments


The Challenges of A DVR Service

ChelleChelle Re:Huh? (134 comments)

It's not that Tim Burton is no longer cofounding TiVo...it's that I somehow managed to get "Tim Burton" out of "Jim Barton" when submitting the article...how did this happen? No idea; I apologize. I was, however, entertained by your comment ("Hi, Son. I am no longer your daddy. I unprocreated you"). But to make things clear: Jim Barton did cofound TiVo. Tim Burton is apparently involved in movies.

more than 8 years ago

Submissions


The Software Industry IS the problem

ChelleChelle ChelleChelle writes  |  more than 3 years ago

ChelleChelle (969883) writes "The time has come for software liability laws—or so Poul-Henning Kamp believes. Drawing on Ken Thompson’s famous Turing Award lecture—in which he stated “You can’t trust code that you did not totally create yourself”—Kamp proposes a way to introduce product liability into the software world (http://queue.acm.org/detail.cfm?id=2030258). A thought-provoking read from an always contentious writer."

Too Much Data? Then "Good Enough" is Good Enough

ChelleChelle ChelleChelle writes  |  more than 3 years ago

ChelleChelle (969883) writes "Today’s data systems differ greatly from those of the past. While classic systems could offer crisp answers because they held relatively small amounts of data, today’s systems hold humongous amounts of data—thus, the data’s quality and meaning are often fuzzy. In this article, Microsoft’s Pat Helland examines the ways in which today’s answers differ from what we used to expect, before moving on to state the criteria for a new theory and taxonomy of data."

National Internet Defense

ChelleChelle ChelleChelle writes  |  more than 3 years ago

ChelleChelle (969883) writes "On May 8, 2007, the Web sites of Estonian banks, media outlets, and government agencies were the targets of a cyberterrorism attack. A little more than a year after the Estonian incident, Georgia was subjected to cyber attacks in conjunction with the Russian incursion into South Ossetia in August 2008. This article from ACM Queue examines these two attacks in order to highlight key vulnerabilities in national Internet infrastructure and to suggest ways to establish a more robust and defensible Internet presence."

Virtualization--Blessing or Curse?

ChelleChelle ChelleChelle writes  |  more than 3 years ago

ChelleChelle (969883) writes "Over the last ten years virtualization has been the source of much hype. Many have come to perceive virtualization as a panacea for a host of IT problems, ranging from resource underutilization to data-center optimization. Yet the question remains—can virtualization deliver on its promises? According to Morgan Stanley vice president Evangelos Kotsovinos, yes it can—just not right out of the box. For virtualization to deliver on its promise, both vendors and enterprises need to adapt in a number of ways. This article cuts through the hype surrounding virtualization and focuses on the hidden costs and the complex, difficult, and often overlooked system administration challenges."

The Theft of Business Innovation

ChelleChelle ChelleChelle writes  |  about 4 years ago

ChelleChelle (969883) writes "When most people think of cybercrime and the online theft of valuable, business-related information, they tend to consider only the obvious information at risk—think banking codes or secret inventions. Today's criminals, however, have broadened their definition of high-value commercial information to include more mundane but valuable information such as manufacturing processes, suppliers, customers, factory layouts, contract terms, employment data, and general know-how. This means that any business that shows leadership in any aspect of its industry is a potential target for attack. In this new age of cybercrime, past security wisdom is no longer valid. To address how the current threat environment has evolved and how businesses can seek to protect themselves, ACM initiated a roundtable discussion with some of the top minds in the industry."

Keeping Bits Safe: How Hard Can It Be?

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "As even a quick glance at this article will reveal, the author’s title was clearly intended to be tongue-in-cheek. Keeping bits safe is actually quite difficult to do. In fact, as storage systems grow ever larger, protecting their data for long-term storage becomes ever more challenging. In this article David Rosenthal examines the claims of various storage system manufacturers regarding the reliability of their products and explores the different techniques used to prevent data loss, before addressing some of the steps that should be taken in the future to handle the inevitable failures of long-term storage."
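A toy illustration of why "how hard can it be?" is tongue-in-cheek (the figures and the independence assumption are mine, not Rosenthal's): with n replicas, data is lost only when every copy fails, so the naive loss probability shrinks geometrically—but real-world failures are correlated, which is exactly what makes the naive number optimistic.

```python
def annual_loss_probability(per_copy_failure, copies):
    """Probability that all replicas are lost in a year, assuming each
    copy fails independently with the given probability. Independence
    is optimistic: correlated failures (same batch of disks, same data
    center, same software bug) are part of why keeping bits is hard."""
    return per_copy_failure ** copies

# Three replicas, each with an (invented) 1% annual failure rate:
p_loss = annual_loss_probability(0.01, 3)
```

Small on its own, but over petabytes and decades even "tiny" loss probabilities translate into real lost bits.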

Photoshop Scalability: Keeping It Simple

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "In this latest case study from acmqueue, Russell Williams (principal scientist, Adobe Photoshop) and Clem Cole (architect of Intel’s Cluster Ready program) discuss Photoshop’s long history with parallelism and what they now see as the main challenge. “Photoshop’s parallelism, born in the era of specialized expansion cards, has managed to scale well for the two- and four-core machines that have emerged over the past decade. As Photoshop’s engineers prepare for the eight- and 16-core machines that are coming, however, they have started to encounter more and more scaling problems, primarily a result of the effects of Amdahl’s law and memory-bandwidth limitations.” An interesting read, especially since any software engineer who has ever attempted to achieve parallelism in an application will recognize many of the problems and challenges the Photoshop team is now facing."
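The scaling wall the summary mentions follows directly from Amdahl's law: the serial fraction of a program bounds the speedup no matter how many cores you add. A quick sketch (the 95% parallel fraction is an illustrative number, not Photoshop's actual figure):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Maximum speedup for a workload where only parallel_fraction of
    the work parallelizes; the rest runs serially (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even a 95%-parallel workload flattens out as cores are added:
for n in (2, 4, 8, 16):
    print(n, round(amdahl_speedup(0.95, n), 2))
```

Going from 8 to 16 cores here doesn't come close to doubling throughput, which is why the jump to many-core machines exposes problems the four-core era could hide.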

Computers in Patient Care

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "Information technology has the potential to radically transform health care by providing a variety of advantages, ranging from a decrease in medical errors and paperwork to improved patient safety and care. Yet progress over the last several decades has been slow. In this article Dr. Stephen Cantrill discusses the history of HIT (health information technology), examining why so many efforts in this field have failed. In doing so he pinpoints some of the major challenges that still exist today in the application of medical informatics to the daily practice of health care. Foremost amongst these challenges are the issues of developing an effective human-machine interface as well as the reliability and availability of systems."

Injecting Errors for Fun and Profit

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "Errors, whether transient or permanent, are unfortunately a fact of life. To make sure that a system can properly handle them, it is essential to test the error-detection and correction circuitry by injecting errors. This is the main topic of a recent article from acmqueue in which Steve Chessin of Oracle discusses injecting various types of errors (E-cache errors, memory errors) on the UltraSPARC II."
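A toy software analogue of the error-injection idea (not the UltraSPARC-specific mechanisms in Chessin's article): deliberately flip one bit in a parity-protected word and confirm the checker actually catches it. If the check still passes after the injected fault, the detection circuitry, not the data, is what's broken.

```python
def parity_bit(word):
    """Even parity over a word's bits: 0 if the count of 1-bits is even."""
    return bin(word).count("1") % 2

def store(word):
    """Store a word alongside its parity bit, as parity-protected memory would."""
    return (word, parity_bit(word))

def inject_bit_flip(cell, bit):
    """Deliberately corrupt one data bit, as a hardware fault would."""
    word, p = cell
    return (word ^ (1 << bit), p)

def check(cell):
    """True if the stored parity still matches the data (no detected error)."""
    word, p = cell
    return parity_bit(word) == p

cell = store(0b1011_0010)
assert check(cell)                          # clean store passes
assert not check(inject_bit_flip(cell, 3))  # injected single-bit error is detected
```

Note that flipping two bits would fool simple parity, which is why real hardware uses ECC codes; the injection technique is the same either way.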

Moving to the Edge: Network Virtualization

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "The advent of virtual machines and cloud computing has greatly changed the IT world, offering new opportunities (making applications more portable) as well as new challenges (breaking long-standing linkages between applications and their supporting physical devices). Before data-center managers can take advantage of these new opportunities, they must have a better understanding of service infrastructure requirements and their linkages to applications. With this in mind acmqueue initiated a roundtable discussion, bringing together providers and users of network virtualization technologies from leading companies (including Yahoo!, Hewlett-Packard and Citrix Systems) to discuss how virtualization and clouds impact network service architectures."

Software Development with Code Maps

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "Software developers regularly draw diagrams of their systems. Such diagrams, whether hastily sketched on a whiteboard or rendered in high-quality poster format, are of great assistance in a developer’s daily work (helping developers examine and understand source code, explain existing code to a coworker, etc.). A group of researchers from Microsoft Research—Robert DeLine, Gina Venolia and Kael Rowan—feel, however, that software could improve this process. They are currently designing an interactive code map for development environments. As they see it, “making a code map central to the user interface of the development environment promises to reduce disorientation, answer common information needs, and anchor team conversations.”"

Visualizing System Latency

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "Latency has a direct impact on performance—thus, to identify performance issues it is essential to understand latency. With the introduction of DTrace it is now possible to measure latency at arbitrary points—the problem, however, is how to present this data visually in an effective manner. Toward this end, heat maps can prove to be a powerful tool. When I/O latency is presented as a visual heat map, some intriguing and beautiful patterns can emerge. These patterns provide insight into how a system is actually performing and what kinds of latency end-user applications experience."
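A minimal sketch of the heat-map idea, independent of DTrace (the sample data below is invented): bin (time, latency) measurements into a 2D grid, where each cell's count is the "heat" a renderer would map to color intensity. Patterns such as bands or rainbows then show up as structure in the grid rather than being averaged away.

```python
def latency_heatmap(samples, time_bins, latency_bins, t_max, l_max):
    """Bin (time, latency) samples into a grid of counts.
    grid[y][x] counts samples in time slot x and latency band y;
    a renderer would map each count to a pixel's color intensity."""
    grid = [[0] * time_bins for _ in range(latency_bins)]
    for t, lat in samples:
        x = min(int(t / t_max * time_bins), time_bins - 1)
        y = min(int(lat / l_max * latency_bins), latency_bins - 1)
        grid[y][x] += 1
    return grid

# Invented I/O samples: (seconds since start, latency in ms)
samples = [(0.1, 2.0), (0.2, 2.1), (0.9, 8.5)]
grid = latency_heatmap(samples, time_bins=4, latency_bins=4,
                       t_max=1.0, l_max=10.0)
```

The key design point is that, unlike a line chart of average latency, the grid preserves outliers and multimodal behavior, which is where the interesting patterns live.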

A Tour through the Visualization Zoo

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "The production of digital information is increasing at an astonishing rate. In order to put this information to good use, we need to find ways to explore, relate and communicate the data in a meaningful way. Hence visualization, which involves the principled mapping of data variables to visual features such as position, size, shape and color, is becoming an area of great interest. According to three scholars from Stanford University, Jeffrey Heer, Michael Bostock and Vadim Ogievetsky, “The goal of visualization is to aid our understanding of data by leveraging the human visual system’s highly tuned ability to see patterns, spot trends and identify outliers. Well-designed visual representations can replace cognitive calculations with simple, perceptual inferences and improve comprehension, memory and decision making.” In this article Heer, Bostock and Ogievetsky provide a survey of several powerful visualization techniques. As an added bonus, many of their visualizations are accompanied by interactive examples (created using Protovis, an open source language for Web-based data visualization)."

Securing Elasticity in the Cloud

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "Cloud computing has been generating a lot of buzz lately—yet is it really a revolutionary new concept or simply the industrial topic du jour? According to the author of this article (Dustin Owens of BT Americas), cloud computing is an evolutionary and potentially revolutionary concept due to its elasticity. For Owens, once elasticity is combined with on-demand self-service capabilities, it could truly become a game-changing force for IT. As he states, “elasticity could bring to the IT infrastructure what Henry Ford brought to the automotive industry with assembly lines and mass production: affordability and substantial improvements on time to market.” While this sounds fantastic, several monumental security challenges come into play with elastic cloud computing. The bulk of this article is dedicated to examining these challenges."

Principles of Robust Timing over the Internet

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "The NTP (Network Time Protocol) system for synchronizing computer clocks has been around for decades and has worked well for most general-purpose timing uses. However, new developments, such as the increasingly precise timing demands of the finance industry, are driving the need for a more precise and reliable network timing system. Julien Ridoux and Darryl Veitch from the University of Melbourne are working on such a system as part of the Radclock Project. In this article they share some of their expertise on synchronizing network clocks. The authors tackle the key challenge, taming delay variability, and provide useful guidelines for designing robust network timing algorithms."
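For context, the basic clock-offset estimate that NTP-style systems start from uses four timestamps from one request/response exchange. It assumes the network delay is symmetric in each direction, and the delay variability Ridoux and Veitch tackle is precisely what makes that assumption, and hence this estimate, noisy. A sketch with invented timestamps:

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Classic four-timestamp exchange used by NTP-style protocols:
    t1 = client send, t2 = server receive,
    t3 = server send,  t4 = client receive.
    Assumes symmetric path delay; asymmetry biases the offset."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0  # server clock minus client clock
    delay = (t4 - t1) - (t3 - t2)           # round trip, minus server hold time
    return offset, delay

# Invented timestamps: client runs 5 s behind the server, ~10 ms each way
offset, delay = ntp_offset_delay(100.00, 105.01, 105.02, 100.03)
```

Here the estimate recovers an offset of about 5 seconds and a round-trip delay of about 20 ms; under variable queuing delay the two one-way times differ, and half of that asymmetry lands directly in the offset estimate.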

Why Cloud Computing Will Never Be Free

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "Today, many cloud end customers use price as their primary criterion when selecting a cloud provider. Due to a variety of factors (the reduced deployment costs of open source software, the perfect-competition characteristics of remote computing, etc.), cloud providers are expected to continuously lower their prices. While low prices may seem like a good thing, it is important to keep in mind the performance costs. To provide cheap service, cloud providers are frequently required to overcommit their computing resources and cut corners on infrastructure, resulting in variable and unpredictable performance of the virtual infrastructure. As this article discusses, this is a situation that needs to change."

Enhanced Debugging with Traces

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "Using traces—an essential technique in emulator development—can be a useful addition to any programmer’s toolbox. This article examines how adding snapshots, tracing and playback to existing debugging environments can significantly reduce the time required to find and correct stubborn bugs. From the article, “Detailed CPU state traces are extremely helpful in optimizing and debugging emulators, but the technique can be applied to ordinary programs as well. The method may be applied almost directly if a reference implementation is available for comparison. If this is not the case, traces are still useful for debugging nonlocal problems. The extra work of adding tracing facilities to your program will be rewarded in reduced debugging time.”"
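The reference-implementation comparison described above reduces to a simple idea: record both runs as sequences of state records, then locate the first step where they disagree; everything before that point is known-good. A generic sketch (not the article's emulator tooling; the trace records are invented):

```python
def first_divergence(reference_trace, test_trace):
    """Index of the first record where the traces disagree, or None
    if they match. A length mismatch counts as a divergence at the
    point where the shorter trace ends."""
    for i, (ref, got) in enumerate(zip(reference_trace, test_trace)):
        if ref != got:
            return i
    if len(reference_trace) != len(test_trace):
        return min(len(reference_trace), len(test_trace))
    return None

# Hypothetical CPU-state traces: one (program_counter, accumulator)
# record per executed instruction.
ref = [(0, 0), (1, 5), (2, 7), (3, 7)]
got = [(0, 0), (1, 5), (2, 9), (3, 9)]
step = first_divergence(ref, got)  # the bug first bites at step 2
```

In practice the payoff is that a nonlocal symptom (a wrong answer millions of instructions later) collapses to the exact instruction where state first went wrong.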

Sharing Visualization with the World

ChelleChelle ChelleChelle writes  |  more than 4 years ago

ChelleChelle (969883) writes "“Visualization can be a pretty mundane activity: collect some data, fire up a tool, and then present it in a graph, ideally with some pretty colors.” As this interview reveals, however, all of this is changing. Thanks in large part to a new generation of collaborative visualization tools and the efforts of research scientists such as Martin Wattenberg and Fernanda Viegas (both from IBM’s Visual Communication Lab), the communicative potential of visualization is becoming known. In this interview Wattenberg and Viegas discuss their interest in the artistic component and social uses of visualization, interests that are apparent in their many collaborative projects (for example, their most recent project, Many Eyes)."

Journals

ChelleChelle has no journal entries.
