Slashdot: News for Nerds

Comments


How Citigroup Hackers Easily Gained Access

Bones3D_mac Um, just a sec, gotta check on something... (371 comments)

This doesn't even qualify as a hack. It's more like a tactic a curious script kiddie would try just to see how something worked, only to be pleasantly surprised when some other user's data was handed to them on a silver platter as a reward for bothering.

Sadly, I'm willing to bet this kind of "exploit" is far more common than anyone is willing to admit. Like those of us who have "left the water running" and only realized it 50 miles down the road.

It's something so stupid that most developers wouldn't bother checking their own work for such a "rookie mistake", simply because they're just that good.

more than 3 years ago

The Next Phase of Intelligent TVs Will Observe You

Bones3D_mac It seems very unlikely... (294 comments)

But... then again, that Kinect stuff took off like wildfire, with thousands of families instantly installing a networked 3D camera in their living room, completely unaware of the potential implications. You'd think someone would have read "1984", or at least watched the movie after hearing about it on the news following the adoption of the Patriot Act. Soon it'll be the same for our daily-use media devices and smart phones...

It'll probably have to become a video game before they figure it out, but by then the creepiness will be outweighed by the false sense of security, knowing that "there's an app for that"... featuring the next-generation Apple iDevice with multiple 3D cameras that can view the device's entire surroundings as a single 3D-sensitive sphere around 15-20 feet in diameter, using AI-assisted augmented reality to pick out and identify every object in view, then recreate the scene entirely in 3D from a database of similar 3D objects as hastily collected as Google's image search. That scene, in turn, will be uploaded to YouTube3D, where random users will watch you in realtime 3D, able to rummage through your belongings without actually having to be there. Finally, someone will think to turn this YouTube3D thing into a paid service where random people watch you 24/7 like Brinks home security, except the security "staff" also pay for the service, watching you like an episode of "Survivor", except it's "interactive"... and the viewers can choose to either watch you die in your own home from a fire or break-in, or call the police and be the great busy-body hero they imagine themselves to be... or just collect a cash reward, like some sick game show.

more than 3 years ago

What's Killing Your Wi-Fi?

Bones3D_mac Number 1 Cause... (248 comments)

... mismatched devices!

You would not believe how many people "upgrade" their broadband to 20+ Mbit/sec service and then complain that their computer is still only getting 1-3 Mbit/sec speeds. A lot of them don't realize that older 802.11 devices can significantly reduce the performance of a modern wireless network.

Most 802.11b devices (which are still in use today) usually top out at around 10-11Mbit/sec, and that's under perfect conditions. If you start adding multiple users, competing networks and outside interference, things get out of hand pretty quick.

Here's a list of things to look for in examining your wireless network for performance issues:

- Replace the router.

If your router is over 3 years old, it might be time to replace it, especially if it's an older 802.11a/b model. The really old 802.11 devices, like Apple's original AirPort base station, have a lot of trouble working correctly when they encounter other networks within their own service range. This can result in dropped or spotty connections and overall losses in bandwidth. Many of these first-generation wireless devices barely worked, but they worked well enough for the few people who could afford them. Most have since been trashed for more recent models, either because they started failing under the weight of other networks or simply died from various flaws or age.

- Update the firmware.

Many wireless devices have firmware chips on them that can be upgraded through software. This can help weed out networking issues that might be caused by buggy firmware, or may add enhanced features that can help your device work better under heavier loads from competing networks, interference, multiple users and various security issues.

- Upgrade all client-end networking hardware at the same time.

When putting a wireless network together, or upgrading an existing one, make sure your client devices use similar configurations (or identical, if possible). A single, poorly configured client device can significantly impact your wireless network's performance, and the more functionally similar the network devices are to each other, the simpler it is to put together an efficient setup.

For example, if you have a network consisting of only 802.11g devices and set up a router to accept only 802.11g connections, it'll run at around 54 Mbit/sec. But if you have a network of random 802.11 devices and a router that supports several protocols going back to 802.11b, the network will fall back to the slowest common protocol available (802.11b) and force all connected clients to run at that speed (11 Mbit/sec), regardless of each client's individual configuration. That bandwidth is then divided among every connection, making the network seem much slower than it is. By keeping the client and router hardware similarly configured, network speeds are less likely to suffer. Your maximum network performance is limited only by the hardware you use to build it.
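
As a rough back-of-the-envelope sketch of that fallback math (a hypothetical Python snippet; the numbers are nominal link rates, not real-world throughput, and the even split among clients is a simplification):

    # Hypothetical nominal link rates in Mbit/sec; real throughput is
    # lower and varies with hardware, distance and interference.
    NOMINAL_RATE = {"802.11b": 11.0, "802.11g": 54.0}

    def per_client_rate(protocol, num_clients):
        """Nominal link rate split evenly among active clients."""
        return NOMINAL_RATE[protocol] / num_clients

    # Four clients on an all-g network vs. the same four clients after
    # one legacy 802.11b device drags the whole network down to b:
    print(per_client_rate("802.11g", 4))  # 13.5 Mbit/sec each
    print(per_client_rate("802.11b", 4))  # 2.75 Mbit/sec each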

- Secure your network.

Make sure your network hardware is secure on both the router and client end. Set up your router to use the strongest encryption protocol it supports and use MAC address filtering to identify each piece of hardware on the network, so no one outside of your client list can access it. Also, don't use DHCP to assign IP addresses; manually configure each client with a static IP. Finally, disable SSID broadcasting. This will reduce the likelihood of a war-driver finding your network and tagging it for others to find.
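
Here's a minimal sketch of the allow-list idea, assuming a Linux box (or a router you can shell into) where the ARP table can be read from /proc/net/arp; the MAC addresses below are placeholders, and since MACs can be spoofed, treat this as a tripwire rather than real security:

    # Flag any device on the LAN whose MAC isn't in our allow-list.
    ALLOWED_MACS = {"aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:02"}  # placeholders

    def unknown_clients(arp_path="/proc/net/arp"):
        strangers = []
        with open(arp_path) as arp:
            next(arp)  # skip the header line
            for line in arp:
                fields = line.split()
                ip, mac = fields[0], fields[3]
                if mac != "00:00:00:00:00:00" and mac.lower() not in ALLOWED_MACS:
                    strangers.append((ip, mac))
        return strangers

    print(unknown_clients())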

- Use the latest available network protocols.

Using protocols like 802.11g or 802.11n can significantly improve your network speeds over older ones, and may also offer some added flexibility. Unlike the older 802.11b/g protocols, 802.11n isn't limited to a single broadcast frequency (2.4GHz); it can also use 5GHz (802.11a used 5GHz only, but never saw wide adoption). While the broadcast frequency of your wireless hardware (5GHz vs 2.4GHz) has relatively little to do with your raw bandwidth, it can indirectly improve your network's performance by moving it to a less crowded broadcasting range. As long as you don't need to share your connection with random users, you can simply isolate your network to the 5GHz band, effectively removing the interference from all the networks within range... at least until your neighbors figure this out and start using 5GHz on their setups.

It's a short term fix, but effective.
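
If you want to gauge how crowded each band is before making the move, here's a quick sketch assuming a Linux machine with the iw tool (whose scan output includes "freq:" lines); "wlan0" is a placeholder interface name and the scan usually needs root:

    import subprocess

    # Count nearby networks per band by parsing "freq: NNNN" lines
    # from an iw scan.
    def networks_per_band(iface="wlan0"):
        out = subprocess.run(["iw", "dev", iface, "scan"],
                             capture_output=True, text=True).stdout
        counts = {"2.4GHz": 0, "5GHz": 0}
        for line in out.splitlines():
            line = line.strip()
            if line.startswith("freq:"):
                mhz = float(line.split(":", 1)[1])
                counts["2.4GHz" if mhz < 3000 else "5GHz"] += 1
        return counts

    print(networks_per_band())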

- Move the router to a new location.

Generally speaking, placing your router near the middle of the area your clients occupy will give you the best connectivity. But you must also be mindful of other devices that could interfere with the network, either because they work on a similar frequency or simply draw a lot of power: cell / mobile phones, microwave ovens, vacuum cleaners, CRT-based monitors/TVs, radio-controlled devices, etc. Construction materials can also affect a wireless network's performance, acting like a Faraday cage: sheet metal, aluminum siding, metal girders, fencing / screening materials, heat ducts, metal pipes, etc.

Anyway, hopefully these tips will be useful to someone. You never know when your network might fail you next, and the cause might be something as simple as a poorly configured game console belonging to one of your kids.

more than 3 years ago

The Machines That Sparked the Beginning of the Computer Age

Bones3D_mac It goes waaaaaay further back than this... (139 comments)

There is evidence that some of the first computers ever produced existed as far back as 150 BC. A device found in 1901, called the Antikythera mechanism, is a mechanical computing device believed to have been used to chart astronomical positions. Its overall design rivals the complexity of an early mechanical watch.

Another fun item: the Japanese karakuri ningyō, or clockwork doll, one of the earliest known examples of robotics, going back to the 17th century. The karakuri ningyō was primarily used by wealthy dignitaries for ceremonial purposes, like serving tea. One of these clockwork dolls would be placed upon a table, holding up a small tray. When a weighted object, such as a tea cup, was placed on the tray, the weight of the object would set the mechanics in motion, causing the doll to turn 180 degrees away from the server and begin walking toward the guest at the other side of the table to deliver the tea. Once the weight was removed from the tray, the action stopped and the mechanism would reset itself for the next use... allowing server and guest to repeatedly serve each other as a form of entertainment.

Although much of this has been replaced by electronic devices, such as the Sony Aibo and the Honda Asimo, the old-style karakuri design is still in use today, mostly as large-scale carts in factory settings for moving heavy objects, like car engines, between parts of an assembly line, using the object's own weight as a cheap way to conserve power.

more than 3 years ago

Scientists Afflict Computers With Schizophrenia

Bones3D_mac Re:I've heard something like that before (143 comments)

Actually, I have an issue like this that makes me very uncomfortable around crowds. For example, when I go to a restaurant, I can hear every conversation going on around me with the same relevance as someone sitting next to me. It drives me nuts because it prevents me from enjoying a conversation with the people I actually care about. In that situation, I either have to process everything at once and parse out the stuff relevant to whoever I'm trying to talk to, or block out everything as white noise and not participate at all. I also get a bit nauseated from dealing with so much info at one time.

Needless to say, I don't go out much and prefer to be in locations where I control the environment. Usually keeping things dark and quiet.

more than 3 years ago

AppleCrate II: Apple II-Based Parallel Computer

Bones3D_mac It makes one wonder... (126 comments)

What kind of impact would this have had if people were doing this back in the '70s?

Granted, this guy is just using it mostly for audio processing. (Impressively done, though... especially if you ever experimented with audio sampling on an Apple II using self-designed software and custom-built I/O interfaces)

What I'm curious about is whether the video output from each of these boards could be combined into either a single high-resolution display matrix approaching VGA at a low depth, or layered atop each other to increase output depth at the Apple II's default resolution. (Basically, something like the output of 12 machines combined into a 4x3 matrix on a single display, controlled by a 13th machine, for high-res output; or layering the output from all 12 machines with the 13th machine controlling the alpha value of each layer to create the illusion of a higher bit depth than the Apple II was capable of.)

Maybe then all that shareware porno imagery every library in America once hosted might actually have been identifiable...

more than 3 years ago

Do Violent Games Hinder Development of Empathy?

Bones3D_mac Pets (343 comments)

More likely, the simple act of owning a pet probably contributes more toward instilling empathy in a child than any video game could do to decay it. Sure, there's violence in games and movies/TV, but much of it is short-term and probably forgotten, versus something like living with an abusive parent or similar issues.

Having a child learn about responsibility through caring for an animal is long-term, meaning the child eventually develops an emotional bond with it. Children quickly learn that failing to act responsibly with an animal can have serious consequences, including the death of the animal itself. And while learning about harsh topics like death that way could be somewhat traumatizing, it will do far more to make them empathetic toward others than any short-term habit change like taking away their video games.

more than 3 years ago

Limewire Being Sued For 75 Trillion

Bones3D_mac Easy! (545 comments)

Countersue for $74,999,999,999,999.00 for emotional distress from such an absurdly large lawsuit amount. That way, they're only out $1.

more than 3 years ago

Michio Kaku's Dark Prediction For the End of Moore's Law

Bones3D_mac A Pointless Prediction... (347 comments)

Although I like Kaku as a scientist in general, he's not exactly immune to MythBusters-style "foot in the mouth" science.

One major thing he overlooks is the high likelihood of cloud computing taking over the role of the "processor" in most PCs well before then. This isn't just a fad technology that'll go away in a few months; it's probably going to be the next evolution in computing since the introduction of the World Wide Web. Not only will it take over processing tasks, it'll also change the very way software is distributed, by letting companies post a "master" copy of a program onto a cloud server and then rent out usage time for an instance of the software in a user's cloud space. This way, the developer doesn't have to waste months of development time trying to track down bugs specific to different system configurations. It would allow the developer to focus solely on the software's performance within the cloud, only requiring updates to the "master" copy as they are needed. The user would never need to worry about all the downsides of installing software, such as invasive DRM, software incompatibilities or malware, as the software would never actually be running locally on their system.

Likewise, processing power would also be rented out: a larger portion of CPU time could be purchased for an extra fee, on a sliding scale. Once cloud computing becomes that flexible, one only needs the right client software to access their cloud VM interface, and you could theoretically do that from any machine with enough local horsepower to display a streamed viewport of the user's workspace. (Probably any system from 1999 to the present.)

After that, processing power becomes largely irrelevant unless you are working on something seriously data intensive beyond anything we can probably comprehend now.

more than 3 years ago

New Hardware Needed For Future Computational Brain

Bones3D_mac Google The Brain! (143 comments)

Ok, I admit this sounds completely absurd at first, but there's an awful lot of similarities between the neural pathways of the brain and the countless number of ways websites link to each other, both directly and indirectly through their contacts, and their contacts' contacts, and all the contacts that eventually show up in an endless cycle of recursion, etc...

Now, Google has to wade through all this, and constantly correct and update itself, to ensure it can get a user to the web page that best matches the search criteria.

You can't tell me that, as data on the web becomes increasingly dynamic, with all these forums, blogs, news sites and endless social networking and chat sites constantly popping up and then dying, there isn't at least some algorithm they employ that could be applied to neuron connectivity and communication.

You'd think it'd just be a matter of passively connecting to a neuron to sniff its traffic, observing which nearby neurons carry the signals to and from it, then listening to those neurons and so forth, then using machine learning to break down the patterns enough that Google's setup could follow them... i.e., determine which neuron is responsible for which patterns at what frequency, etc...
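
Google doesn't publish the details, but a PageRank-style power iteration over a connectivity graph is the obvious candidate; here's a toy Python sketch (the graph is made up, and treating neurons as graph nodes is just the analogy above, not established practice):

    # Toy PageRank-style power iteration over a made-up connectivity graph.
    # Nodes could be web pages or, per the analogy above, neurons; edges
    # are links / observed signal paths.
    links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}

    def pagerank(links, damping=0.85, iterations=50):
        nodes = list(links)
        rank = {n: 1.0 / len(nodes) for n in nodes}
        for _ in range(iterations):
            new = {n: (1 - damping) / len(nodes) for n in nodes}
            for src, outs in links.items():
                for dst in outs:
                    new[dst] += damping * rank[src] / len(outs)
            rank = new
        return rank

    print(pagerank(links))  # "c", with the most inbound links, ranks highest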

more than 3 years ago

DHS Eyes Covert Body Scans

Bones3D_mac Shouldn't we be more concerned about... (386 comments)

... how to make the questionable crap we post today permanently go away on demand so it doesn't come back to bite us in the ass in the future?

If this person wasn't even aware that data could be erased from the internet, you can bet it hasn't occurred to them that there are far more dangers in having data that never goes away on its own. That edgy statement that made you look cool to your friends in 6th grade might land you on some company or government blacklist, making it near impossible to get a job because some Watson-derived human resources bot assigned you a risk assessment percentage that can't be overturned by human hands, any easier than getting off a sex offender list even after the case that landed you on it was overturned, etc...

Oh, and have fun when those Watson bots aren't just assessing you by your own actions, but by your associations, both direct (communication) and indirect (shared philosophies derived from each person's actions, linking you to people you've never heard of). I'm sure there are at least one or two serial killers out there that even you might be linked with based on interests alone.

more than 3 years ago

New Red Dwarf Series Threatened By the Twitter Era

Bones3D_mac Red Dwarf: The Next Generation? (228 comments)

Honestly, I can't see how this is going to work. Are they bringing back the old cast or starting over with a new one? Is it going to be the same story following "Back To Earth", or a complete reboot like the recent "Star Trek" feature film versus the TOS version?

If it is a set of new characters, who are they... Lister's kids, a Rimmer Jr. created by Rimmer and instructed to activate in 20 years, cloned versions of the Cat, only younger, with Kryten left to babysit them all? And what about Holly... crashed, hologrammed like Rimmer was, or replaced by Kryten as Red Dwarf's main computer?

Personally, I think Red Dwarf needs to stay dead. The entire series is on Netflix now if I get the urge to watch it again, and I doubt the series could survive in the same context it did in the '90s. Hell, just look at how tame The Simpsons has gotten in the last 20 years compared to the early years. They can't even show a character's butt in the current episodes without a disclaimer or a time shift, something that was perfectly acceptable early on. (Yet the syndicated episodes are still shown with butt cracks intact.)

If you really want new Brit space humor, I recommend checking out "HyperDrive" (which is also on Netflix). Granted, the show is still rough around the edges, but there is potential.

more than 3 years ago

J.J. Abrams Promises 'Fringe' Will Die Fighting

Bones3D_mac Fringe caters to the geeks! (392 comments)

How can you not like a show that will go as far as to create a period-accurate 1980s version of its own opening, complete with synth-heavy music, old-school computer animation and even VCR tracking bleed at the start and end?! Even the pseudo-science jargon of the normal sequence was replaced with current-day science jargon that probably would have seemed like science fiction to most of your 1980s counterparts!

I kept that episode on my DVR for months, just because of that opening!

more than 3 years ago

Crowdfund a Moon Monolith Mission?

Bones3D_mac A half a billion? (199 comments)

Couldn't we just pull another "Capricorn One" and pocket the other $450,000,000?

more than 3 years ago

Magnetic Pole Shift Affects Tampa Airport

Bones3D_mac I wonder (317 comments)

if we'll see a similar phenomenon with the bee population as we start moving into the warmer months ahead. Perhaps it's not just a cell phone boom that was to blame last year...

more than 3 years ago

Navy Tests Mach 8 Electromagnetic Railgun

Bones3D_mac Re:Yay! (440 comments)

This flaw of being "human" seems amazingly similar to how animals fight and posture over one another to ensure the ongoing survival of their groups and themselves. The only difference is that humans try to justify their motives, where animals simply act. So, are the animals better or worse than us because they don't carry a concept of right and wrong?

And more to the point, why do we care about whether our actions and motives are justified, when we already know our actions are probably going to harm others, anyway? If it really mattered one way or the other, we simply wouldn't act to begin with... right?

The only reason we even bother with such trivial matters as self-justification is due to our fear of the unknown. The big question of what happens after we die... RELIGION!

And you know what? Practically every major war fought here on earth has been over religion!

So, it's no longer simply us trying to console ourselves that our own wrongdoing toward others is justified... but now we harm one another because one side believes its justifications are greater than the justifications of those on the other side... and as a result, either side's attempt to back down before blood has been spilled would be interpreted as a weakness of that side's justifications for fighting in the first place.

In other words, both sides must fight because not fighting is far more unjustified than the act of fighting in itself. Most likely, the side that eventually did refuse to fight would probably end up being seen as such a disgrace that they aren't even deserving of life... either to themselves or to their enemy. Those who don't die will probably end up wishing they had... as failing to fight means they abandoned their values to survive.

But while we're entertaining ourselves on this... what's the ultimate solution where no one fights, no one loses, etc... aka, the path with the utopia at the end? Do we all network our brains together into one massive hive mind, allowing our individuality to give way to a shared mind that's been artificially "normalized" by a massive machine that literally polls every human mind on earth over every conflict of interest, with the majority-favored interest becoming the single interest of every person on earth in one fell swoop, then repeated again and again until we all end up being the exact same person as the guy standing next to us, recursively?

more than 3 years ago

NASA Finds New Life (This Afternoon)

Bones3D_mac Re:Evolution (405 comments)

That's pretty amazing to think about... not just one massive tree of ever-evolving lifeforms, but perhaps several competing in parallel with one another throughout history. Completely incompatible strains of life endlessly competing with each other for dominance over resources, potentially altering the environment itself to the point that one strain wins out as the loser is slowly killed off by the same world it once thrived in. Perhaps such competition might better explain some of the more major evolutionary changes in lifeforms, versus the far more subtle changes between similar species that can simply be defined as minor mutations that made certain lifeforms slightly more successful at survival than their non-mutated counterparts.

more than 3 years ago

Thought-Provoking Gifts For Young Kids?

Bones3D_mac Think Geek (458 comments)

Think Geek

Seriously!

If you can't find something of interest there that's both fun and thought provoking, you probably aren't going to find anything anywhere.

more than 3 years ago

Submissions


Nuclear Power Plants vs Coronal Mass Ejections

Bones3D_mac Bones3D_mac writes  |  more than 3 years ago

Bones3D_mac (324952) writes "Could a major coronal mass ejection from our Sun result in planet-wide nuclear disasters similar to what we witnessed in Japan? Not only could it cause the "station blackout" effect we initially heard about, but what about the potential for loss of communication lines and overall mass confusion as our usual electrical and electronic devices suddenly go dark on us for seemingly no reason whatsoever?"

Spherical Processors?

Bones3D_mac Bones3D_mac writes  |  more than 3 years ago

Bones3D_mac (324952) writes "After playing around with a batch of silly putty filled with iron filings and a powerful magnet, I noticed the stuff would always form back up into a perfect sphere no matter how much I tried deforming it, which got me thinking... has anyone ever thought of using something like this to develop a processor of some kind?

If you think about it, a sphere is a good shape to work with if you're cramped for space (a problem processor manufacturers are already faced with). Spheres offer the most usable surface area within a confined space. Also, using just the surface area alone would allow for a radical new approach to processor design, simply due to the fact that circuit pathways could physically go on forever, as there is no end to the surface of a sphere. A processor map could repeat into itself recursively for as many times as one could ever need. And to access the processor, one would only need to encase it in a shell of electrodes touching its surface, sort of like wiring up a golf ball at every dimple, but at a much larger scale.

And that much is just using the surface area...

But what if we take it a step further and find a way to use the entire volume of the sphere to create circuit pathways, accessible from the surface, but where each electrode can access every other electrode attached to the sphere using the shortest possible route through the sphere's volume rather than longer pathways along the surface?

This is where the silly putty comes in... what if instead of mere iron filings, the sphere were made of a more processor friendly set of materials that could reshape their processing pathways on the fly as needed?

My guess is that if you had three of these spheres (one in a static configuration for basic processing and two dynamically reconfigurable spheres), the dynamic spheres could perform specialized tasks by offloading processing jobs onto each other as the other reconfigures for the next task, as needed. The more spheres you have, the more data you can process at any time. (Perhaps using some sort of neural networking algorithms to define the configuration needed from each sphere...)

Any thoughts?"

Is Phoning Home Killing Our Computers?

Bones3D_mac Bones3D_mac writes  |  more than 5 years ago

Bones3D_mac (324952) writes "As a Mac user, I've been spared much of the headache of viruses and other nasty surprises most PC users have been dealing with on a daily basis. Lately though, I've been looking into the windows side of things to expand my available toolsets, such as tablet laptops for things like photoshop and lightweight 3D modeling work.

The problem, however, is how do I keep a mission-critical system like this safe when the applications being used on it require an internet connection to phone home? Obviously, having no external connections would do a lot to prevent anything from causing damage to the data stored on the system. But it seems that it's become increasingly difficult to keep the internet out of the equation when it comes to the more expensive software.

Should commercial developers be considering other methods of preventing piracy besides just phoning home? Or should we start holding them responsible for making our mission-critical systems needlessly vulnerable due to their software's requirement that an internet connection always be present?"

How do you reason with computer illiterates?

Bones3D_mac Bones3D_mac writes  |  about 6 years ago

Bones3D_mac (324952) writes "Have you ever had to explain a serious computer-related problem to someone who is both complete ignorant of the ramifications involved and unable to comprehend the nature of the problem itself... even when you try to dumb it down with metaphors and pictures?

For example, a relative of mine recently approached me, asking me to download a bunch of copyrighted material off Limewire and burn it onto CDs. Needless to say, I attempted to explain that it wasn't legal to acquire copyrighted content this way, citing some of the numerous cases where the RIAA filed potentially life-destroying lawsuits, and suggested using something like Apple's iTunes Music Store instead.

They weren't satisfied with an option that involved actually "paying" for the content, and I was met with the usual "if it's not legal, then why are the files on there?" and "everyone else I know does it all the time and nothing happens to them!" arguments before they finally got pissed off and stormed off empty-handed.

So... how do you reason with such unbelievably flawless logic to get someone to finally understand the potential dangers involved in something they see as being completely harmless?"

Lightning Bug Lighting - Voluntary or Involuntary?

Bones3D_mac Bones3D_mac writes  |  more than 6 years ago

Bones3D_mac (324952) writes "Earlier today, a lightning bug managed to sneak into my house and took up a stationary position on the wall. While not entirely exciting by itself, it did allow me to make some observations about what exactly triggers the glow. Although it's widely believed that the glow is somehow used for communication purposes between these insects, I'm starting to question whether this is actually the case, or if it's even voluntary at all.

First off, I've noticed the environment seems to play heavily into when the glow reflex is triggered. However, it doesn't seem to be the state of the environment that matters, so much as changes to the environment itself. In toying with the lightning bug, I had found that blowing small puffs of air toward the bug and rapid changes in lighting both triggered predictable results to the point of getting the glow to occur a specific number of times relative to the number of times each action was performed.

Next, I noticed that at points where the state of the environment was kept static, the glow was not triggered at all while the lightning bug remained stationary. However, the glow would consistently be triggered immediately before and during the bug's movements, and then discontinue right before movement ceased.

Finally, it seems that not only does movement trigger the glow, but the patterns of the glow generated by the movements varied with the complexity of the movements. For example, walking would trigger a very slow blinking, but flying triggered a more rapid pattern.

Based on these observations, I'm starting to think this idea of the glow being used for communication purposes may be an inaccurate assessment, at least as far as any sort of voluntary communication is concerned. Instead, I'm inclined to believe the glow may actually be more of an involuntary and passive response to the lightning bug's overall brain activity, as opposed to a voluntary decision to light up or not light up.

I'm actually kind of curious what our entomology-minded slashdotters think. If my thinking on this is correct, it could offer some interesting insight into how the brains of these insects actually work. (Sort of like a primitive, always-on EEG wired directly into the brain.)

More importantly, perhaps this could be used in some manner on more complex creatures to allow for instantaneous visual feedback about the state of that creature's brain, rather than having to remove these creatures from their natural environments to observe them via other methods."

Bones3D_mac Bones3D_mac writes  |  more than 7 years ago

Bones3D_mac (324952) writes "I've noticed in the last couple days that my DirecTV Plus DVR reciever has recently started to covertly record programming I did not specifically request. While this isn't necessarily a problem by itself, it has become a problem in that I cannot choose to delete, block or opt-out from receiving this content. Instead, I'm apparently "required" to receive this content and forced to keep it stored on my DVR drive until a time and date of DirecTV's choosing, all under the guise of a "feature" called Showcase.

Even more interesting, is that the channel this unwanted content originates from cannot be accessed manually. Also, it doesn't show up in any menu or recording schedule, rendering it invisible and completely unpredictable to the user.

Aside from the obvious inconveniences involved (hijacked disk space, seemingly random interruptions of multi-channel recordings, etc... ), it also raises an important question.

Are we headed for a "Max Headroom"-style future where mandatory television viewing will somehow become required by law? I'd hate to think the only option I have is to throw a towel over my TV screen so I can't see it."

Journals

Bones3D_mac has no journal entries.
