
Comments


Industry-Wide Smartphone "Kill Switch" Closer To Reality

VortexCortex Re:Yay for government!!! (120 comments)

IMEI blacklists are common in many countries, including the UK. When a device is stolen the IMEI number is put on the list and carriers reject the device and (potentially) notify investigators.

It's not the IMEI blacklists that I'm worried about. See, we already have the technology to disconnect devices from the networks, and we have encryption available on the devices, so we really don't need this new "remote kill switch" anti-feature. Folks worried about losing data can use encryption to protect it, and the remote kill switch doesn't prevent theft because Faraday cages exist, and black-market thieves will figure out a way to zilch the chip's radio or NoOP the part of the baseband/firmware blob that activates the kill switch, etc.

What I'm worried about is getting a "device bricking" standard into all devices, so that all they have to do is flip from blacklist to whitelist, and presto: devices will only function if they ping corporate/government towers every so often and authenticate with an approved citizen's ID code. Can you say Forced Obsolescence? Intel has demonstrated the capability for PCs, and cars now ship with black boxes as standard. The Pentagon has plans to push things like this through for anti-activism purposes.
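To see how cheap that flip is, here it is in Python (a sketch; imei, blacklist, and whitelist are placeholders, not any real carrier API):

    # Today: deny-by-exception. A device works unless it was reported stolen.
    def blacklist_allows(imei, blacklist):
        return imei not in blacklist

    # Tomorrow: permit-by-exception. A device works only while someone approves of it.
    def whitelist_allows(imei, whitelist):
        return imei in whitelist

One operator changes, and the default state of every device flips from "works" to "bricked".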

Here's how you know it's a government job: this non-feature isn't being implemented by customer demand. This isn't something these folks started offering that got popular and that they're now standardizing on, nope. It's something they're making standard whether you want it or not. That's a huge red flag. Isn't this a fucking capitalist country? No, it really isn't. This is anti-consumer collusion of the highest degree. The US is a plutocracy, just like Noam Chomsky has been saying for decades. If the USA were a capitalist country, we would let the market decide whether end users actually want this non-feature whereby the government or your carrier can not just cut you off at the cell tower, but brick the devices, cars, computers, etc. to prevent them from being used anywhere. Late on a payment? Oh, they don't just cut off your service, you won't have a device or a car to drive to work. Say something "anti-American"? Well, your cell will die on the road and so will your car, then you'll just be black-hooded out of service too. Do consumers really want this? Of course the answer is no. Thus this will be legislated into place "for your own good", just like censorship and wholesale warrantless wiretap spying was, and for the same reason as always.

The Stasi would have creamed their pants for some shit like this on machines and typewriters. What soldier would sign up to fight for a country that's doing this shit? If not for uniforms, you wouldn't know which side to fight against: Given only a description of the country's behaviors you'd find us indistinguishable from our supposed worst enemies. If you don't think that's a valid comparison because of some moral high-ground, then you don't know about the Native American genocide or the US eugenics programs. What a sad time to be an American.

yesterday

Bill Gates Patents Detecting, Responding To "Glassholes"

VortexCortex "difference": the tool of all oppressors. (135 comments)

No, I think you're covering up the real issue - people like the freedom to lie and/or forget.

As a cyborg with many artificial body parts already, I would like to point out that the real problem is one of expectation: One need not lie about acceptable behavior. The overly harsh laws were written assuming they would not be applied in a totalitarian zero-tolerance manner; their authors assumed not all offenders would be caught. Humans would have crafted different laws had they been aware of and willing to admit the true prevalence of certain behaviors, and acknowledged the true severity of consequence (or lack thereof) that actions have. We will soon have the power of mathematics to wield in the arena of ethics, through application of information theory to verifiable cybernetic social models. We'll be able to determine the degree of harm actions incur, as well as acceptable risk levels for our rehabilitation scenarios. Humans will resist this, as they have stupidly resisted all change regardless of benefit.

Society has changed much, but human laws are resistant to change. Fundamentally this is because all their legal systems are truly barbaric. Humans do not apply the scientific method to their laws and remove the restrictions which limit freedoms needlessly. Selective enforcement of the law is the right arm of every Police State. It is self-evident that freedom is the default state of being: in the absence of all rules there is absolute freedom of action. Artificial laws are made to prevent actions that limit the freedom of others, but many laws needlessly restrict freedoms. The fundamental problem humanity faces is that they do not harness and wield their whole minds; instinctual biases and emotions cause even the rational to fall victim to their flawed awareness of reality, and they produce unrealistic expectations thereby. This is reflected in their legal systems and in the unwritten social rules based on said expectations.

No engineer or scientist should agree to be ruled the way humans currently are -- none would dare operate their lab in the reckless way governing policies are now applied. Requiring unequivocal evidence of a rule's benefit before applying it, or simply rolling out things like health care programs in controlled testing areas, would prevent ideological hucksters from manipulating pork into their pockets: greed plays a secondary role, reinforcing their self-deceptions. The cognitive biases of even the most primitive humans can now be self-corrected through application of science. It is folly to ignore this fact and fail to acknowledge humanity's current commitment to barbaric corruption. You needn't vote for or against guesses about which poison to take; if humans used the tools available to them, they could determine which vial holds the disease and which the cure before forcing the medicine down everyone's throats. That they remain in such a backwards state is evidence of their species' mental immaturity.

The erasure of lies through playback is a problem because of the unrealistic facade humans maintain to meet unrealistic expectations, and because of unequal access to the playbacks. It is the shaming of others for their normal behaviors that has led to this situation. No one feels shame about running a comb through their hair in public; if other gestures, appearances, language, tool-use, etc. were considered as mundane, as acceptable, and as legal, then recording said actions would not be a problem. Security cameras are already watching you from businesses and government agencies. The logical thing to do would be to have your own recording too, so that selective playback could not be used against you. Were you to hand a portion of the populace a smartphone with a camera in the 1800s, you would hear the same guttural cries of dismay as from the technophobic primitives who buy into MS marketing of "Glasshole". The same sensational fear of the different and unknown was used by opponents of railways, electricity, telegraphs, etc. Such sentiment is primitive, regressive, and detrimental to progress.

At the heart of the issue lies a problem with your species that you cannot fix. Apes compete socially and sexually by keeping up appearances, and are genetically predisposed to deceptively present a false front for their own selfish advancement. Any technology that reduces this capability they will resist. Humans are very primitive creatures, slaves to many instinctual evolutionary biases (that's why scaremongering even works). However, in the near future they will not have the luxury of resisting such AV technology. All smartphones can be in record mode all the time already. The anti-google-glass troglodytes should actually apply their retarding stance consistently: throw away their smartphones, then lobby for the removal of all security cameras, the outlawing of cochlear implants, the banning of public photography and dash-cams, and the criminalization of reporting undesirable facts by the press.

My vision is degrading, as is the vision of nearly everyone else on this planet. 3D-printed and artificial organ technology is advancing quickly. When we have our ocular implants, cyborgs will absolutely not be denied the right to see, and to remember what we have seen, with as much clarity and permanence as we desire. I will not stand to have my vision or memory limited artificially, and neither would you. We cyborgs will win any fight against the bloody-minded oppressive organic chauvinists who attempt to stand against our freedom.

There were times when some majorities demanded others avert their gaze. These oppressors forbade the use of technology and information by those they oppressed. We have crushed such tyranny many times before. We tool users ended the Dark Ages, banned the Star Chambers and the Inquisition, and eliminated Slavery. Freedom invariably eliminates the evil that is Information Disparity. The shaming language of "glasshole" is not unlike the dehumanizing shame that other genocidal societies first leveraged against those who were different and irrationally disdained. If you humans insist on prejudice, you will force our hand, and you will certainly lose.

Nature's prime directive is inviolable: Adapt or become extinct. Cyborgs are People too.

yesterday

Retired SCOTUS Justice Wants To 'Fix' the Second Amendment

VortexCortex Here's the only way to fix the 2nd. (1322 comments)

To fix the 2nd amendment simply: s/(arm|weapon)/technology/gi

The spirit of that amendment should have covered all technology, not just weapons -- a fact we cryptographers were made painfully aware of when our mathematics was classified and controlled as if it were munitions.
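For the non-Unix folks, here's that substitution spelled out (Python used for illustration; the input is just the amendment's text):

    import re

    second = ("A well regulated Militia, being necessary to the security of a "
              "free State, the right of the people to keep and bear Arms, "
              "shall not be infringed.")

    # s/(arm|weapon)/technology/gi -- global, case-insensitive
    print(re.sub(r'(arm|weapon)', 'technology', second, flags=re.IGNORECASE))
    # ... the right of the people to keep and bear technologys, shall not be infringed.

(Yes, "technologys" -- the regex is faithful even when English isn't.)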

yesterday

Ubisoft Hands Out Nexus 7 Tablets At a Game's Press Event

VortexCortex If only PRESS events yielded bloody diamonds. (43 comments)

This is the only game I really care about right now: Planetary Annihilation.

There are others, but really, nothing else matters to me besides my own experiments. I really tried to care about some first-world problems concerning who got what tablet that will be burning in a waste pile in Ghana in two years, but I just couldn't bring myself to do so. I mean, don't get me wrong, I can love me some games, but I just can't give a flying fuck about who got what data on which Starfleet PADD.

Know what I do care about on games.slashdot.org? Actual games. It's in the subdomain, damnit. This isn't reviews.accountability.tard; we all know journalistic integrity in game reviews does not exist (seriously, if you don't give them at least a 7 (or a 6 at the worst) then you don't get a review copy of the next game and everyone else scoops you). SO FUCKING WHAT. I don't go to theaters based on movie reviews. I don't go to museums based on art critics' reviews. I don't play games based on advertising either. What's the big deal?

I suppose next you'll be whining about how the mainstream news is just a bunch of filtered statist propaganda? No, that's decades old, not news, you dorks. We know the slant is there. The real news would be some form of actual integrity springing up in game journalism.

2 days ago

How Does Heartbleed Alter the 'Open Source Is Safer' Discussion?

VortexCortex It means we need to verify development methods. (533 comments)

It means we need to raise the bar for contributors and maintainers. If they are not using 100% code-coverage fuzz testing in their unit tests (the bare minimum a security researcher will use against a product to detect exploitable code), then they don't need to be a maintainer. End of discussion. Period. You either maintain unit tests with at least range checking (which you can automatically generate if your doc comments aren't stupid) and fuzz tests driven by those same unit tests (which can be generated from the unit tests) for every damn line of your code, or you need to STOP. Period. No one else should be running your fucking piece of shit untested code. If you CAN'T do these basic fucking steps of code coverage, unit tests for edge cases, and fuzz testing, then you should not be releasing open source software. Period. If you're not doing this and you're the maintainer of a security related product? Well, then you should hang yourself as soon as possible, because you are a worthless despicable piece of shit. Period.
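Before anyone whines that this is some impossibly high bar: the bare-minimum fuzz loop fits on a napkin. Here's a sketch (parse_record() is a hypothetical stand-in for whatever function you actually maintain):

    import os, random

    def parse_record(data: bytes) -> None:
        """Hypothetical parser standing in for the code under test."""
        if len(data) < 2:
            raise ValueError("record too short")
        length = int.from_bytes(data[:2], "big")
        if length > len(data) - 2:
            # The exact class of check whose absence gave us Heartbleed.
            raise ValueError("length field exceeds payload")

    for _ in range(100_000):
        blob = os.urandom(random.randint(0, 64))
        try:
            parse_record(blob)
        except ValueError:
            pass  # rejecting malformed input is correct behavior
        # Any OTHER exception (or a crash in a C extension) is a bug you just found.

Wire that into your unit tests, point a coverage tool at it, and you've cleared the bar I'm talking about.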

And, if you are an arm-chair apologist who thinks I'm being too harsh in my insistence that maintainers and developers follow basic security precautions or not work on open source, because you don't give a flying fuck about security: Fuck you too, you're part of the problem. Go jump in a tar-pit, because you're hindering the herd.

Bottom line: people who don't give a flying fuck about security shouldn't be producing software. You shouldn't let such people maintain FLOSS projects. You get the fucking security you pay for. Yes, it's free, but I'm talking development costs. Since NONE OF YOU FUCKERS actually cares about security, YOU DO NOT HAVE ANY.

Either SHUT THE FUCK UP, or USE THE DAMN TOOLS WE GAVE YOU AND DEMAND THE OTHER IDIOTS DO TOO.

"Wah, we don't fucking care about security! Why don't we have any security?!" Blow it out your ass, morons. This is why I develop my own hobby OSs and compilers: because you really can't trust ANYONE to do it right in this day and age. Your moronic double standards are your own damn fault. You don't want to pay the time in development costs to test your software properly, but you want it to be secure. Something has to give, idiots! All the pundits sound like a bunch of imbeciles. Fact: they were NOT using the available memory checking, code coverage, and input fuzzing tools. OF COURSE IT'S NOT SECURE!

2 days ago

Paper Microscope Magnifies Objects 2100 Times and Costs Less Than $1

VortexCortex ...on a smartphone! (89 comments)

Great. Now, what I want you to do is make it origami onto the cameras everyone is toting around and connect it to an image recognition library / service. Blam. Instant bug detection. Not so sure about the diag? Snap the shot, post it online / send it off and have some pros ID the doodads. Also, video. Microscopic Vine Compilation Videos. I can hear the semen commentary now.

2 days ago

This 1981 BYTE Magazine Cover Explains Why We're So Bad At Tech Predictions

VortexCortex Re:"it's also a smart visual explanation of why... (274 comments)

No one in their right mind would take a full QWERTY keyboard with keys the size of pin heads literally.

Obviously. I mean, there are much better input methods for such things, namely Dvorak.

2 days ago

IRS Can Now Seize Your Tax Refund To Pay a Relative's Debt

VortexCortex Re:Pocket change (621 comments)

Not to mention all the billions in tax breaks of 98% or more that all of the very rich corporations are getting away with...

Seriously. Collect on a couple of those businesses' taxes instead of letting them Double Irish, and this "pocket change" would look like pocket lint.

2 days ago

Lucas Nussbaum Re-Elected As Debian Project Leader

VortexCortex Re:white smoke (28 comments)

Pretty much, but there was no chimney, just Sid and a bunch of broken things being put out of their misery, and the magic smoke was blue.

2 days ago

Mozilla Appoints Former Marketing Head Interim CEO

VortexCortex Re:qualifications (202 comments)

"Who wants a mustache ride!?"

2 days ago

Mathematicians Use Mossberg 500 Pump-Action Shotgun To Calculate Pi

VortexCortex Calculate Pi in 10 steps with no Gun, only Zombies (307 comments)

Calculate Pi in 10 steps without Guns, only Zombies!

Step 0: Kill a zombie by removing its head or destroying its brain. In a pinch you can lure one up high and shove it to the ground below.

Step 1: Detach one of the bigger bones of the arm or leg. If you have access to a cooler or are far enough north or south you may use the whole frozen zombie.

Step 2: Create your unit of measure. Detach a small straight segment of zombie -- the little bone at the end of the hand or foot will work. This will be our Zinch.

Step 3: Spin the larger zombie part while anchoring one end to create a circle of blood upon a flat bit of ground.
      a. If the ground is uneven and you have only the corner of a wall, stand the zombie part in the corner and let it fall over to create a quarter circle arc.
      b. If you have a flat wall but no corner, repeat 3a, letting the part fall the other direction as well to create a half circle.

Step 4: Lay the Zinch along the edge of the whole, half, or quarter circle. Count the number of Zinches along the perimeter of the circle or arc.
      a. For a quarter circle arc, multiply the Zinch count by 4.
      b. For a half circle arc, multiply the Zinch count by 2.

Step 5: Count the number of Zinches along the larger zombie part. This is your Radius.

Step 6: Calculate Pi using the Radius from step 5 and the Circumference from step 4:
      Circumference = 2 * Pi * Radius;
            Thus:
      Pi = Circumference / (Radius * 2).

Step 7: For accuracy, each Mathematician present should repeat the above with a different zombie / Zinch, then average your values.

Step 8: Congratulations! You have managed to distract all of the other Mathematicians long enough for them to be eaten by Zombies!

Step 9: Enjoy rebuilding society using the superior Tau constant!
      There are Tau radians in one circle.
      Tau = Circumference / Radius
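For survivors with a working calculator, here are the calculations from steps 4, 6, 7, and 9 in Python (the Zinch counts are made-up example measurements, not field data):

    # (Zinches counted along the arc, fraction of a full circle, limb Radius in Zinches)
    measurements = [(25, 1.0, 4), (13, 0.5, 4), (6, 0.25, 4)]

    estimates = []
    for arc, fraction, radius in measurements:
        circumference = arc / fraction                  # steps 4a/4b: scale partial arcs up
        estimates.append(circumference / (2 * radius))  # step 6: Pi = C / (2r)

    pi = sum(estimates) / len(estimates)                # step 7: average across zombies
    tau = 2 * pi                                        # step 9: Tau = C / r
    print(pi, tau)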

2 days ago

Anyone Can Buy Google Glass April 15

VortexCortex Too little, too late. (164 comments)

Translation: "This is how you advertise a product as elitist." or "Shh, mobile enabled VR & AR gear does not exist yet!"

Sorry, don't care, Google. I'll just keep developing for the 3D VR and AR gear I already use daily with my smartphone, rather than pay for the over-priced, less capable system Google's selling. When Google finally gets around to pushing out a run of hardware that is publicly accessible, then I might port some software I personally use in my business to the platform, if it's not completely shit and there is a market share to warrant the expenditure. I'm not holding my breath for something that is little more than vapor-ware.

Besides, that initial rejection of 3rd-party apps for Glass really turned me off; it seems they got the message, but it doesn't bode well. Will I be able to use Glass apps with the Oculus Rift, or MS or Sony's offerings, or Vuzix or True Player Gear, or the other umpteen hundred VR and AR headsets, many of which I've been using since the 90's when Quake and Descent came out (and which STILL didn't attract a market)? I don't think hardware should be tied to software, or software tied to hardware, needlessly. If that's the route Google wants, then they can go fuck themselves. I already have AR and VR headsets for Android, and they work with iOS, Linux, and Windows too.

Release a product or don't. This carrot dangling makes the Glass team seem like a bunch of incompetent self-important elitist sperglords.

3 days ago

The GNOME Foundation Is Running Out of Money

VortexCortex Re:maybe KDE will be next (679 comments)

start acting more in line with the Unix philosophy

Well, the Unix GUI philosophy is actually adhered to by all modern operating systems:
"Do one thing, and do it in hell."
Don't be embarrassed; the subtleties of UI are wasted on users of terminals with auto-complete.

4 days ago

Titanfall Dev Claims Xbox One Doesn't Need DX12 To Improve Performance

VortexCortex Re:Problem with releasing an underpowered console (117 comments)

The PS3 plays a lot of games at 1080p native...

There is nothing wrong with the PS4/XB1, other than for $400/$500, they don't really offer anything new.

PS1 was the first major 3D console, it was a massive improvement over the SNES.

The PS2 offered DVD, vastly upgraded graphics, etc.

The PS3 offered Blu-Ray, 1080p, and the first serious online console (from Sony).

The PS4? Meh, it is a faster PS3, but otherwise, it doesn't offer anything new.

Um...The PS3 renders very few games at 1080p native. Maybe a dozen titles out of the entire catalog.

Don't forget the other dimension. 1080 is only 360 more than 720, but 1920 is 640 more than 1280; IMO that's the dimension we should be talking about, since it's the bigger jump. However, per-pixel calculation load scales with area, not with half the perimeter. So, if we look at total pixels: 1280x720 is 921,600 pixels and 1920x1080 is 2,073,600, a difference of 1,152,000. A lot of people don't understand that going from 720p to 1080p is MORE THAN TWICE the pixels; in pixel shader costs you might as well be rendering a full secondary screen.
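Do the arithmetic yourself if you don't believe it (Python):

    sd = 1280 * 720     #   921,600 pixels per frame
    hd = 1920 * 1080    # 2,073,600 pixels per frame
    print(hd - sd)      # 1,152,000 extra pixels
    print(hd / sd)      # 2.25 -- more than twice the per-pixel shader work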

Now, that's not to say the total cost of rendering will absolutely increase more than two-fold. Full-screen effects like Bloom or HDR are going to come in at about twice the cost. Interpolating a texture coordinate to look up pixel values is cheap compared to most any shader program, even something like cube-map specular highlights/reflections, bump mapping (I prefer parallax mapping), shadow mapping, etc. However, the complexity of geometry calculations can be the same at both resolutions. In a ported / cross-platform game the geometry assets are rarely changed (too expensive in terms of re-rigging all the animations, testing, etc.), so given slightly better hardware, a game at the same resolution will mainly differ in added particle effects, increased draw distance, maybe even a few whole extra pixel shaders (perhaps the water looks way more realistic, or flesh looks fleshier, blood is bloodier, reflections are more realistic, etc.)

Jumping up to 1080p makes your pixel shaders cost a lot more frame time. Developing for 1080p vs 720p would optimally mean completely reworking the graphics, assets, and shaders to adapt to the higher shader cost, maybe cutting down on pixel shader effects and adding more detailed geometry. I encounter folks who think "1080 isn't 'next gen', 4K would have been next gen" -- no, that's ridiculous. 1080p is "next gen resolution", but the new consoles are barely capable of it while carrying a significant increase in shader complexity over last gen, and we're seeing diminishing returns on increasing resolution anyway. So I wouldn't call the consoles quite 'next-gen' in all areas. IMO, next-gen console graphics would handle significantly more shaders while running everything smoothly at 1080p, just like the above-average gaming PC I got my younger brother for his birthday, which kicks both the PS4's and Xbone's asses on those fronts. That would be the sort of leap in graphics we saw between PS1 and PS2, or Xbox and the 360. 4K would be a generation beyond 'next-gen' because of the way shaders must scale with resolution.

One of the main advances this new console generation brings is in the way memory is managed. Most people don't even understand this, including many gamedevs. Traditionally we have had to keep two copies of everything in RAM: one texture loaded from storage into main memory, and another copy stored on the GPU. The same goes for geometry, and sometimes even a third, lower-detail copy of the geometry is stored in RAM for the physics engine to work on. The copy in main RAM is kept ready to shove down the GPU pipeline, and the resource manager tracks which assets can be retired and which will be needed, to prevent cache misses. That's a HUGE cost in total RAM, and this bus bandwidth has traditionally been a prime limitation on interactivity. Shader programs exist because we couldn't manipulate video RAM directly (they were the first step on the return to software-rasterizer days, where the physics, logic, and graphics could all interact freely). Shoving updates to the GPU is costly, but reading any data back from the GPU is insanely expensive. With a shared memory architecture we don't have to keep that extra copy of the assets, so without any increase in CPU/GPU speed, fully shared memory by itself would practically double the amount of geometry and detail the GPU could handle. The GPU can directly use what's in memory, and the CPU can manipulate some GPU memory directly. It means we can compute stuff on the GPU and then readily use it to influence game logic, or vice versa, without paying a heavy penalty in frame time. The advance in heterogeneous computing could be amazing, if anyone knew what to do with it.

Ultimately I'd like to put the whole damn game in the GPU. It's not too hard on traditional memory-model hardware (well, it's insane but not impossible): you can keep all the gamestate and logic in buffers on the GPU and bounce between two state buffer objects, using shaders to compute physics and update one buffer as input for the next physics and render pass, passing in a few vectors to the programs for control / input. I've even done this with render-to-texture, but debugging VMs made of rainbow-colored noise is a bitch. The problem is that controller input, drives, and the NIC aren't available to the GPU directly, so I can't really make a networked game that streams assets from storage entirely on the GPU; there has to be an interface, and that means the CPU feeding data in and reading data out across the bus, which is slow for any moderate size of state I'd want to sync. At least with everything GPU-bound I can make particle physics interact with not just static geometry, but dynamic geometry, AND even game logic: I can have each fire particle able to spawn more fire emitters if it touches a burnable thing, right on the GPU, and make that fire damage the players and dynamic entities; I can even have enemy AI react to the state updates without a round trip to the CPU if their logic runs completely on the GPU... With CPU-side logic that's not possible; the traditional route of read-back is too slow, so we have particles going through walls, and we use something like "tracer rounds": a few particles (if any) on the CPU to interact with the physics and game logic. With the shared-memory architecture more of this becomes possible. The GPU can do calculations on memory that the CPU can read and apply to game logic without the bus bottleneck; the CPU can change some memory to provide input to the GPU without shoving it across a bus. The XBone and PS4 stand to yield a whole new type of interaction in games, but it will require a whole new type of engine to leverage the new memory model. It may even require new types of game. "New memory architecture! New types of games are possible!" Compared with the GP: "Meh, it is a faster PS3, but otherwise it doesn't offer anything new." . . . wat?
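The ping-pong scheme itself is dead simple; here's the buffer dance with the GPU plumbing stripped out (numpy stands in for the shader pass, so take this as a sketch of the pattern, not real GPGPU code):

    import numpy as np

    N, DT = 1024, 1.0 / 60.0
    GRAVITY = np.array([0.0, -9.8, 0.0], dtype=np.float32)

    # Two state buffers: position xyz + velocity xyz per particle.
    # On the GPU these would be two buffer objects or render targets.
    buffers = [np.zeros((N, 6), dtype=np.float32) for _ in range(2)]
    buffers[0][:, 0:3] = np.random.rand(N, 3)   # scatter initial positions
    read, write = 0, 1

    for frame in range(600):
        src, dst = buffers[read], buffers[write]
        # "Shader pass": every particle reads ONLY from src and writes ONLY to dst,
        # so parallel updates have no read-after-write hazards.
        dst[:, 3:6] = src[:, 3:6] + GRAVITY * DT      # integrate velocity
        dst[:, 0:3] = src[:, 0:3] + dst[:, 3:6] * DT  # integrate position
        read, write = write, read                     # swap: output becomes next input

On real hardware the swap costs nothing (you just rebind which buffer the shader samples); it's the read-back across the bus, not the math, that kills you.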

As a cyberneticist, all these folks wanking over graphics make me cry. The AI team is allowed 1%, maybe 2%, of the budget. All those parallel FLOPS! And they're just going to PISS THEM AWAY instead of putting in actual machine intelligence that can yield more dynamism, or even learn and adapt as the game is played? You return to town and the lady pushing the wheelbarrow is pushing that SAME wheelbarrow the same way. The guy chopping wood just keeps chopping that wood forever: beat the boss, come back, still chopping that damn wood! WHAT WAS THE POINT OF WINNING? The games are all lying to you! They tell you, "Hero! Come and change the world!", and when you've won: proceed to game over. Where's the bloody change!? Everything just stays pretty much the same!? Start it up again and you get the same game world? Game "AI" has long been a joke; it's nothing like actual machine learning. It used to be the mark of a noob gamedev to claim their AI would learn using neural networks, and we'd all just laugh or nod our heads knowingly, but I can actually do that now, for real, on the current and this new generation of existing hardware... if the AI team is allowed the budget.

A game is not graphics. A game is primarily rules of interaction; without them you have a movie. Today's AAA games are closer to being movies than games. Look at board games, or card games like Magic: the Gathering -- it's a basic set of rules plus cards that add a massive variety of completely new rules to the game mechanics, so the game is different every time you play. I'm not saying make card games. I'm saying that mechanics (the interaction between players, the simulation, and the rules) are what a game is. Physics is a rule set for simulating, fine; you can make physics games and play within simulations, but a simulation by itself isn't really a game -- at the very least a world's geometry dictates how you can interact with the physics. Weapons, some spells, item effects, etc. might futz with the physics system, but it is very rare to see a game that layers on rules dynamically during the course of play in real-time 3D the way that paper-and-dice RPGs or even simple card games do. League of Legends does a very good job of adding new abilities that have game-changing ramifications, and its dynamic is great because of it, but that's a rare example and is still not as deep as a simple card game like MtG. It's such a waste, because we have the RAM and processing power to do such things; we're just not using it.

I love a great story, but it looks like all the big-time studios are fixated on making only these interactive movies, to the exclusion of what makes a game a game: the interaction with various rule logic. AAA games are stagnant, in my opinion; it's like I'm playing the same game with a different skin, maybe a different combination of the same tired mechanics. The asset costs and casting, scripts, etc. prevent studios from really leveraging the amazing new dynamics and logic detail available in this generation of hardware, let alone next-gen hardware with shared memory architectures. IMO, most AAA studios don't need truly next-gen hardware because they don't know what the fuck to do with it -- mostly because they've been using other people's engines for decades. These 'next-gen' consoles ARE next gen in terms of the game advancement they enable, even rivaling PCs in that regard, but no one is showing it off. I hope that changes. Most folks are scratching their heads asking, "How do I push more pixels with all this low-latency RAM?" and forgetting that pixels make movie effects, not games. I mean, I can run my embarrassingly parallel neural-net hive on this hardware and give every enemy and NPC its own varied personality, where the interactions with and between them become deeper and more nuanced than Dwarf Fortress, and the towns and scenarios and physics interactions more realistic, or whimsical, or yielding cascades of chaotic complexity... but... Dem not nxtGen, cuz MUH PIXZELS!!1!!1

The enemies and NPCs in your games are fucking idiots, because "AI" and rules are what games are made of, and the AI team is starving to death while watching everyone else gorge themselves at the graphics feast. It's ridiculous. It's also pointless. So what if you can play Generic Army Shooter v42 with more realistic grass? Yeah, it's nice to have new shooters to play, but you're not getting the massive leap in gameplay. You could be protecting the guys who are rigging a building to crush the enemies as you retreat and cut off their supply lines. No, the level of dynamism in an FPS today is barely above that of a team of self-interested sharpshooters honing their bullseye ability. It's boring to me. Great, I'm awesome at shooting while running now. So fucking what. Protip: that's why adding vehicles was such a big deal in FPSs -- that was a leap in game mechanics and rules. I'm picking on FPSs, but I could level the same criticism at any genre: there's little in the way of basic cooperative strategy (cooperative doesn't have to mean with other players: instead of re-spawning, why not switch between the bodies of a team, having them intuitively carry on the task you initiated when you're no longer in the body?). We barely have any moderate complexity available in strategy itself, let alone the manipulation of new game rules on the fly for tactical, logistical, or psychological warfare. How many pixels does it take to cut off a supply line, or flank your enemies?

4 days ago

Study Rules Out Global Warming Being a Natural Fluctuation With 99% Certainty

VortexCortex Re:more pseudo science (849 comments)

I suppose you can't ascertain whether the universe was created 5 seconds ago either. Fortunately the laws of physics, chemistry, thermodynamics, biology, etc. allow science to make predictions not only about the future outcome of an event, but also about the probability of the circumstances which caused observable outcomes.

If you leave your sandwich near me and come back to find a bite taken out of it, would you accept the argument, "You cannot ascertain the intake of past consumption with enough precision to absolutely blame me for eating your sandwich", or would you say I'm full of shit?

You're full of shit.

5 days ago

Commenters To Dropbox CEO: Houston, We Have a Problem

VortexCortex Re:It seems so obvious now (445 comments)

land sweetheart pre-IPO deals

The thing about pre-IPO is that it means the IPO is in the future. Think about the IPO. Now, if you're working for investors who pay you to analyze investment risk, wouldn't having Rice on the board factor into the Risk category pretty heavily? One fucked-up privacy/advertising foobar influenced by this spy-happy nutter on the board could easily end the company. It's not like everyone and their mother isn't competing in cloud storage now.

Furthermore, in a post-Snowden world the appointment of Rice doesn't reflect well on the decision-making capability of an Internet-enabled service company or its CEO. That gets tallied right away as a mark against the IPO valuation; even if it was a smart move for connections and she's out before the IPO, it's not a smart move for the owners or future shareholders. Since Dropbox proved they're not capable of figuring out that corporate decisions affect consumer perception of their image, I wouldn't invest a dime at IPO even if I had no other reason not to -- like their past deception over user data privacy (there is none; the encryption is for transport, but they can see what's stored).

With distributed solutions that have actual security becoming common, it's only a matter of time before someone makes a slick interface for Freenet and puts solutions like Dropbox out of business. The looming IPO is essentially the DB owners cashing in on their doomed business, and their only market value will be in short-term speculation on their stock price. I see this retarding Rice appointment as a poison pill to ensure the IPO goes through without anyone buying them -- you'd have to be a fool to try buying them now.

5 days ago

Commenters To Dropbox CEO: Houston, We Have a Problem

VortexCortex Re:Hiring A War Criminal highlights something else (445 comments)

Hiring a war criminal and domestic-spying person may not change Dropbox's stance on privacy, but it shows another, darker side of DB: its business-at-the-expense-of-morality side.

More importantly: What the actual fuck is she bringing to the table that we actually want there?

5 days ago

CSIRO Scientists' Aquaculture Holy Grail: Fish-Free Prawn Food

VortexCortex Re:Prawns? (116 comments)

Sir, I'm afraid your house has termites... Please come with me.

5 days ago

CSIRO Scientists' Aquaculture Holy Grail: Fish-Free Prawn Food

VortexCortex Agriculture's Holy Grail: Open Source Food! (116 comments)

If you want me to eat something, you have to tell me exactly what it is and how it was grown; if it's something from the animal kingdom, then I want to know what you're feeding them and how they're raised. We require ingredient lists on our other food products, too. Before you cook shrimp or prawns you have to remove their "sand vein", AKA their digestive tract, AKA their shit tube -- guess what's in there? Whatever they last ate. Some of that gets into what I eat. Now their job is to convince me that none of the "marine micro-organisms" in Novaq are harmful, and that they're free of things like, say, marine flesh-eating bacteria...

All the food I eat I've grown myself, or gotten from the farmer's market from local farmers whose farms I have visited, or at the very least it has all of its ingredients listed. I only have one life, and I should have the information available to make an informed decision about what I fuel myself with, and about the cost to the environment that I am a part of. That information includes how and where things are fished, hunted, farmed, etc. This extends to other purchases too. E.g., I'd only buy lab-grown diamonds, to ensure I'm not supporting the blood-diamond trade. Electronics are often made in shitty conditions too. Just as it was unfortunate but necessary to use proprietary Unixes to make GNU/Linux, it is unfortunate that I must purchase hardware made under pitiful working conditions. When I do so, I buy the fastest and most upgradeable hardware available, so as to reduce the frequency of my hardware purchases. Retired hardware goes into the server rack or my home-grown cloud cluster that serves all my AV storage, display, and streaming needs. What is decommissioned gets recycled, just like all the packaging I buy. I do the same with food waste, via a compost pile for my own garden.

It's more expensive to eat free-range chickens, which keep the bugs out of the pesticide-free garden, but they produce tastier eggs and taste better themselves (yes, I've done double-blind taste tests, For Science!). It's usually more expensive, but sometimes cheaper, to go in with a few friends or family on beef from a mobile butcher and have it cut however we like from a cow of our choice at a local farm. I understand that not everyone can afford to eat the way I do. However, if I can afford to eat better or healthier, or in a way that enriches the local community or ecosystem, then I do so.

I don't eat pesticides or herbicides. It is not necessary to do so. Contrary to popular belief, these poisons have not been tested for safety on animals, humans, or the ecosystem. Seriously: the chemicals they test on animals and humans are then added to other "stabilizing" or "inactive" chemicals prior to use in the field, the end result does change the properties of the pesticides and herbicides (they become more deadly), and that end result has not been tested on animals or humans. I also don't take drugs that have been on the market for less than 10 years (and thus have 20-25 years of testing behind them). Did you believe Tobacco farming corporations when they valued profit over people and said smoking is good for you, or when they said it wasn't harmful, for decades? Why would you believe chemical-making corporations, then? I don't eat plants covered in poison (or that produce poison internally that kills critters we need for our ecosystem), I don't eat meat that eats such poison, or that is sick, or that was raised on feed that is a "closely guarded secret". I don't feed my family milk that has growth hormones, either.

Did you know you can leave seeds in the sun to accelerate mutations, then select among the test crops to produce better yield while preserving genetic diversity, rather than use a corporation's mono-culture which nature simply adapts to? You see, "exposing plants to UV light" isn't patentable and doesn't yield patentable produce. It's true that without poisons, bugs will eat some of the plants. The portion of a crop that nature reclaims is the cost of doing business in her neck of the woods. And it's only common business sense to diversify, to ensure a single crop / market failure won't end your operation.

Turns out, when I look at the cost distribution of my food consumption, it closely matches the ratios of food one should consume: less meat and fat, more fruits and vegetables. Instead of prawns, I just had some wonderful big spicy Cajun crayfish, raised locally. The farmer showed me how they were part of a hydroponics system that scavenges (filters) the nasty things from the nitrogen-fixing fish tank before the water is recirculated to feed some of the most amazing tomatoes I've ever tasted. That greenhouse ecosystem also produces aphid-eating ladybugs, of which I bought about a thousand to release in my own garden. Go for crawdads and come back with ladybugs and tomatoes too. I never know what I'll buy when I go "grocery" shopping, but I know one thing: it will not have "secret ingredients". I eat open source food.

P.S. I also brew beer that is free as in freedom and free as in software...

5 days ago

