Perl Is Undead
This brings to mind the purposes of creating and using code in the first place:
If you are the only person who needs to see and understand the code, this is fine; it serves your purposes well.
On the other hand, when the purpose of the code extends beyond one person, this structure does not serve that need effectively. While it may function, it is too brittle to be maintained by a team of developers over many years and many design iterations without considerable lost time and errors generated in the process.
From seeing things like this, I would argue that there are way too many clever programmers - and not enough smart ones.
Age Discrimination In the Tech Industry
Your assumption there is that the youngsters turn out new ideas exclusively. The best ideas I have seen came from people who were over 40; youth does not confer a monopoly on ideas.
Interviews: Ask "The King of Kong" Billy Mitchell About Classic Video Games
How much money did you spend in the 1980s on video games? I recall thinking $10 to $20 a day was a lot of money at the arcade back then - but nowhere near enough game time to build up 'pro gamer' skills... were you independently wealthy, or had some other secret?
No, HealthCare.gov Doesn't Require 500 Million Lines of Code
Holy Christopher Columbus! Was it bring your favourite programming language to work month?
Seriously, this is a clear indication to me that this thing probably had all the hallmarks of a bad project:
- Design by committee - definitely little thought given to KISS. No unifying approach/structure.
- No central control of the implementation standards.
- No thought of manageability/longevity/lifecycle of the code base - expediency at the expense of resiliency.
Managing that brittle monstrosity is going to be painful over the long haul. I feel for whoever gets that job.
The questions for those of us in the business: Why do we continue the cycle of poor craftsmanship/performance? What can we do about it?
No, HealthCare.gov Doesn't Require 500 Million Lines of Code
500M for a website isn't possible. Period.
Anything is possible. The real question: is it probable?
No, HealthCare.gov Doesn't Require 500 Million Lines of Code
Actually, I would argue that 99% of the functionality is contained in the 263 lines of Python code. All the rest is presentation...
Programmers: It's OK To Grow Up
Bravo! We need more people in this field with your attitude.
Hell, we need more people in general with your attitude - that would solve most of the stupidity I see on a daily basis.
Programmers: It's OK To Grow Up
"The lack of a standard library however makes things just repetitious and error prone" doesn't follow. Why would you copy the same code over and over in your application? Are you 'copy-paste' programming? That isn't very smart. Even as minimalistic as C is, it still has subroutines; you can even put them in a separate file from your main application and reuse them in more than one program. Code reuse isn't a magical property exclusive to OO languages. Your statement actually illustrates the point: people don't have a fundamental understanding that carries over into other languages as the abstraction level increases. You don't run before you walk, but in teaching computer science and programming today we seem to be trying to do just that with our students. The failure of this approach is becoming obvious in the real world where the rubber meets the road, at least for those of us who have to waste time and resources fixing the results.
Programmers: It's OK To Grow Up
When I start using Python.
Programmers: It's OK To Grow Up
C has been described as a wrapper for assembly language, and as such it requires that you really understand how the computer processor works to do anything non-trivial. C++ allows you to do that as well, but C really enforces it - and makes you think about building your own libraries of routines to do the higher order abstractions yourself.
This is valuable because most higher abstraction entry level languages today don't give you that experience (e.g. Java) - which really is what is important when designing good software, or conversely trying to troubleshoot someone else's bad software.
Case in point: we had a Java application written by a vendor. It ran slowly, but worse than that, it would crash after being up for some time. To make a long story short, the vendor had short-circuited Java's garbage collection mechanism. All the objects it was creating in memory were not being released because they were not going out of scope. Java would reach its configured memory high-water mark and shut down.
When we showed this to the Java programmer - he didn't have a clue as to why this happened to his application. So I would have to agree with Joel that Java is not a hard enough language because it abstracts away too much of the underlying machine. If that is all a programmer knows, then he is not a complete programmer IMHO. So I would have to say here is some support for his unsupportable premise.
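The failure mode here is not Java-specific: any garbage-collected language behaves the same way when a long-lived container keeps references alive. As a rough illustration (a sketch in Python, since the vendor's actual code isn't shown here), an object pinned by a "registry" survives collection until that reference is dropped:

```python
import gc
import weakref

class Widget:
    """Stand-in for the vendor's heavyweight objects."""

registry = []  # long-lived container, like a cache that is never pruned

def make_widget():
    w = Widget()
    registry.append(w)   # the reference escapes the function's scope
    return weakref.ref(w)

ref = make_widget()
gc.collect()
print(ref() is None)  # False: the registry pins the object in memory

registry.clear()      # let the object become unreachable
gc.collect()
print(ref() is None)  # True: the collector can finally reclaim it
```

Repeat `make_widget()` in a loop without ever clearing the registry and memory climbs until the runtime's limit is hit, which is exactly the crash pattern described above.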
Programmers: It's OK To Grow Up
Ned Ludd, is that you? The 19th century is calling and they want their meme back.
Seriously though, this 'techno-shit' is what is powering the next revolution, allowing us to be far more efficient than we have been in the past. In a world of dwindling oil supplies, climate change, and overpopulation, technology is what will allow us to survive this world and migrate to the next. I don't know about you, but I'm not interested in contemplating future generations groveling in the dust of this parched world as it spins down to nothing. Doing my best now to move the technology bar, even a little bit in the right direction, is worth the effort in that light.
I'm curious as to your occupation - given the subject matter of /.
Ask Slashdot: Professional Journaling/Notes Software?
I tend to agree with sticking with Moleskin(sic) - preferably 8 1/2 x 11 size, nonruled (blank). I've tried electronic logs, and I've tried electronic drawing apps (e.g. Papyrus on a Nexus 7 Android system). The main problem for me is that I want not only to write in it, but also to do freeform drawing in conjunction with the writing. The Nexus pixel sizes for drawing were too coarse, and while you can zoom in and out, the drawings always ended up looking odd and took longer than just writing on paper. The only other acceptable solution I found was a $1500 Wacom electronic drawing tablet, so I continue to buy Moleskin at a fraction of the cost.
So that does bring up the problem that is mentioned regarding indexing - and here is how I deal with that:
Each entry is dated in this manner: yyyymmdd, e.g. 20140421; in this way each volume contains a series of entries that are uniquely numbered. If you need to add more than one entry per day, just add hours and minutes as needed: 20140421:1405 (I prefer using the colon to visually separate the date from the time).
I also encode each entry as to 'type', where types are based on single-letter codes: C = computer science, A = art, etc. I put the letter code inside a square in the upper outer corner of each page where an entry begins.
The next step is to create an electronic index to key entries in your logs --- assuming you number your volumes sequentially - you can identify an entry like this:
Vol 2, 20140421:1400 History of FOO
With this system you can have both the flexibility of combining freehand drawing with your log entries, and also keep an index of your key entries organized however you like (perhaps by type, or project codes etc...you can expand this as you need beyond my simple method).
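For anyone who keeps the index electronically, the dating and type-code scheme above translates directly into a few lines of code. This is a minimal sketch; the function names and record fields are my own invention, not part of any standard:

```python
from datetime import datetime

def entry_key(volume, dt):
    # yyyymmdd:HHMM stamps sort chronologically as plain strings,
    # so (volume, stamp) tuples sort in reading order automatically
    return (volume, dt.strftime("%Y%m%d:%H%M"))

def add_entry(index, volume, dt, type_code, title):
    key = entry_key(volume, dt)
    index[key] = {"type": type_code, "title": title}
    return key

index = {}
key = add_entry(index, 2, datetime(2014, 4, 21, 14, 0), "C", "History of FOO")
print(key)         # (2, '20140421:1400')
print(index[key])  # {'type': 'C', 'title': 'History of FOO'}
```

From there, filtering by type code or project is a one-line dictionary comprehension, and the zero-padded stamps mean a plain `sorted(index)` reproduces the physical order of the volumes.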
Ask Slashdot: Are You Apocalypse-Useful?
Apocalypse in current usage means the end of the world. By that definition - no skill will be 'Apocalypse Useful' - because no one will be around in the aftermath of the end of time (with the possible exception of an intrepid band of our great grandchildren who might figure out how to jump between multiple universe branes at the precise moment of the 'big rip' - a very remote possibility imho).
Really what we are talking about are events that, while devastating, fall short of the level of destruction needed to end the world. The very nature of that definition means that there will be locations not directly impacted by whatever happened. The key struggle would come from dependency on things moved by long-distance transport: various foods, fuels, technology, and other manufactured goods. As a result, local replacements would have to be found and developed.
In those areas harder hit, it would be very bad, if not impossible, to survive after the initial event. I can see migrations of people from these 'hot zones' to more habitable areas. Refugees might put too much pressure on less impacted areas, causing a crisis there. The first few years after the event might be very chaotic due to these population pressures and migrations. The best way to avoid a humanitarian disaster would be to make sure all of the surviving zones have good communications, and plans in place for relocating and organizing the influx of survivors into their communities.

I think you would also want to move as quickly as possible to restore technology to society - maybe not in the exact forms that we are used to, but restoration just the same. Having running water, food, medicines, heating and cooling, and energy in general is critical to sustaining life. As a result, I think all disciplines will be useful to society in that situation in one way or another. One example: artists and storytellers would be useful in bringing entertainment and beauty into the lives of the survivor communities, and might be very important in keeping human knowledge alive until information systems can once again be restored. Ultimately, people would be so hard pressed to survive that it would quickly become apparent that the survivors will do better by banding together rather than fighting among themselves.
Overall, if the event was large enough to depopulate the world significantly, I think the survivors would be very busy indeed, with little time or energy to waste on the staples of post-apocalyptic fiction: warlords, societal breakdown, and the descent of our humanity to that of the animals, leading us to prey upon our fellow man. While there may be a few sociopaths who try to benefit from the situation, I expect the rest of us would quickly control that. Essentially, humanity has lived through these sorts of things in the past, and I am sure we would make do and get on with living in the aftermath of whatever mother nature sends our way again.
Born To RUN: Dartmouth Throwing BASIC a 50th B-Day Party
My very first program was 'hello world' in Basic on the high school computer lab's Apple ][ in 1981 (I learned Fortran in that same course). I got a TI-99/4A for my birthday that year, and I wrote more noddy programs in Basic over the next few years, saving them meticulously on cassette tape.
I can't imagine using Basic for anything useful these days, but it was fun while it lasted.
Ask Slashdot: User-Friendly Firewall For a Brand-New Linux User?
Do not feed the trolls.
The Myth of the Science and Engineering Shortage
Another aspect of the problem: corporate policy in most large companies is to treat all of your IT programmers as identical widgets. This policy stems from HR, Finance, and IT efforts to 'normalize' positions so they can be circumscribed enough to allow 'efficient' allocation of resources or, more damaging, the allocation of resources that can be outsourced wholesale. Ultimately it all comes down to cost reduction. Poor results from IT, coupled with IT being strictly a cost center, lead to this outcome (the cost vs. value proposition as seen through the eyes of the heads of the business).
This, of course, drags everyone down with it, causing many good people to leave or get caught in the outsourcing net. If they are lucky, they manage to move up into management (architects, etc.) where they can hopefully influence the designs - but what is left behind is tragically incapable of efficiently implementing even the best designs, so the problem feeds itself as your best people get pulled away from programming.
Indications are that CTOs are starting to see how this is not working... here's hoping they can get the HR and Finance people to turn this around, but I doubt it.
The Myth of the Science and Engineering Shortage
No, there are a lot of STEM graduates who really aren't that good and don't have much experience, yet they believe they're entitled to a senior-level salary. I am more than happy to pay a high salary to a candidate who is actually good at their job and has a demonstrated track record of performance. The glut of average performers with little more than student projects as experience, however, are not worth more just because I have open positions to fill. The key is whether this person will actually perform in the position I put them in, rather than just fill a desk and surf the net half the day.
Too bad you posted anonymously. I would have modded you up. This is absolutely the truth.
My recommendation for anyone: if you love the career you want to get into, then do it. However, if you only see a bunch of dollar signs, then it is better for you and the rest of us in STEM if you put your energy into something else.
We are plagued with a glut of people who cause more inefficiency than the 'solutions' they create, forcing me and others who know what they are doing to spend extra cycles fixing their broken systems. The problem is that most of the time they don't even know why their choices are bad, or worse, if they do, they don't care, allowing expediency to take precedence over quality.
On a positive note: this will keep me employed well beyond my retirement as a contractor fixing other people's code.
Google Android Studio Vs. Eclipse: Which Fits Your Needs?
Technically, isn't EMACS the granddaddy of all IDEs? In comparison, Eclipse is 'Johnny Come Lately'.
Computer Science Enrollments Rocketed Last Year, Up 22%
It's funny you say that - but it is true. We definitely need more STEM people in jobs not only in the technology fields (like IT) - but those jobs that interface with it as well (such as Project Management, Marketing et al).
I can't tell you how many non-technical people I have to deal with in the course of a day. Some of them are actually holding technology positions...most are just collecting a paycheck for all the positive effect they have on the business. I have to get work done in spite of them.
On the flip side - if a person got a STEM degree to make money (and whose heart is not into their field) - by all means stay away from technology...please...
Interview: Ask Richard Stallman What You Will
If, as you say, "Open source is a development methodology; free software is a social movement." in your article: Why Open Source Misses the Point of Free Software, why do you advocate not using the term 'open source', particularly if it is being used in a technical/development methodology context only?
The title says it all. The buzz is reaching a crescendo and doesn't feel like it's dropping off anytime soon.
1. No Phone
2. Limited Multitasking
3. No Camera
4. Unanswered Questions (Will it be able to sync .txt, .pdf etc files or .mp3, .wav etc files from my laptop?)
2. App Store - large number of existing apps out the gate
3. Multitouch on large screen
4. Sync - Write, Numbers etc.
5. iPod/iTunes functionality
7. Email + Photos
9. WiFi n/g/a/b
11. Long Battery Life
12. 64 Gig solid state drive
2. Closed/Filtered Development (but $99 doesn't seem like a lot of scratch to have access to the App Store).
I have recently seen some poor metaphors regarding computers and software.
This is my attempt to clarify the issue for the community in more simple terms (see notes below for more technical explanation if so inclined):
1. A computer is a simulation device which can simulate anything at all, given unlimited resources.
2. In practice we (programmers) build a subset of simulations that are most useful or entertaining for the users (because that pays the bills).
3. An operating system is a simulation that allows us to more easily manipulate our computer to run other simulations and communicate to and through ever more complex and sophisticated devices (sound cards, video cards, network interface cards, joysticks, mice, etc) that we hang off the side.
4. A very small subset of programmers have made an ungodly amount of money selling said simulations. The article kind of loses focus at this point and goes off on a tangent - I won't burden the reader here with that.
5. The CLI will not die, simply because its utility and expressiveness outweigh the lack of both in pure graphical interfaces. The future is beginning now, and it is a hybrid: the CLI and GUI coexisting for mutual benefit, leveraging the strengths of both in ways far more sophisticated than we can envision today.
My own editorial: until people stop reading altogether, or natural speech recognition becomes a reality, keyboards will be around for the foreseeable future.
Notes (numbered to reference the numbered sections above):
1. The term simulation is defined in the dictionary as the "representation of the operation or features of one process or system through the use of another". This term is quite common in general use; everyone knows what a 'flight simulator' is, for example. A computer program is really just a simulation. A bit of history will illustrate this point:
Alan Turing came up with the concept of a Turing Machine, which could be used as a general-purpose device to simulate any other machine or process, using very simple instructions in building-block fashion to produce more complex simulations. The brilliant scientist John von Neumann further extended the idea* into the first stored-program computer architecture for practical use (which exists in modified form in all present PC CPUs).
*(Though this is debated; it is true he worked at the Institute for Advanced Study in Princeton, New Jersey, while Turing was a graduate student at Princeton University between 1936 and 1938 - von Neumann even asked him to stay on as his assistant, which Turing declined. What would the world have been like from such a partnership, had WWII not interceded?)
It is interesting to note that modern computer chips do not have what we think of as the basic instruction set - assembly language - hardcoded into the chip. Instead, the visible instruction set is itself a simulation running on a far simpler 'microcode' instruction set that is hardcoded into the chip.
I think a better metaphor for computer software (which encompasses everything running on a computer, from the OS to what we think of as applications) is a series of small boxes within larger boxes, which themselves are inside a larger box. Some of the boxes may have more than one box inside of them (like the OS running multiple applications, for example). The largest 'lower level' boxes have the ability to serve as a simulation 'stage' for the boxes that they contain. At the highest levels (the small boxes at the 'top' of the stack) they may or may not have facilities for doing further simulation (nowadays it is more prevalent to see applications that have macros, up to and including full-blown programming languages and interpreters, for creating your own simulations within the instruction sets provided). The OS is simply one of the larger boxes near the bottom of the stack.
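The boxes-within-boxes idea can be sketched as a tiny tree structure. This is purely illustrative; the layer names are examples, not a real machine's stack:

```python
# A toy model of the 'boxes within boxes' metaphor: each layer is a box
# that can host (simulate) the boxes nested inside it.
def make_box(name, *children):
    return {"name": name, "children": list(children)}

stack = make_box(
    "microcode",                      # hardcoded into the chip
    make_box(
        "instruction set",            # itself a simulation on the microcode
        make_box(
            "operating system",       # a large box near the bottom
            make_box("browser"),
            make_box("spreadsheet",
                     make_box("macro interpreter")),  # simulations within simulations
        ),
    ),
)

def depth(box):
    # How many layers of simulation a box contains, counting itself
    if not box["children"]:
        return 1
    return 1 + max(depth(c) for c in box["children"])

print(depth(stack))  # the macro interpreter sits five boxes deep
```

The point of the model is that nothing in it distinguishes the OS box from any other: it is just one container among many, closer to the bottom of the nesting.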
2. Sometimes the users are ourselves; this is why we see a plethora of noddy programs/simulations that don't do much useful for larger audiences.
3. See the 'boxes-within-boxes' metaphor in number 1 above.
4. Not much more can be said. I will state my own philosophical view: I think it is more useful to programmers and to society as a whole to invent more flexible and open simulations that allow computers (and other less-general purposes devices) to communicate more seamlessly and make them a true and natural tool to augment our senses and intellect. It is not impossible -- we just have to dream it up and make it happen.
I come bearing an olive branch. I stand astride the precipice separating CLIs and GUIs, Linux and Windows, plain text and proprietary file formats.
The questions I hear always revolve around the core principle that there is something universally 'right' or 'wrong' about operating systems - and that we can discern those qualities.
The reality is nowhere near this black and white paradigm. The truth is each one of us individually holds the key to that answer, and we are all correct!
Given the truth of that statement, isn't it silly for us to argue about any OS aspect without prefacing it with 'For me...'? For me, Linux is the best development workstation OS. For me, Windows is a toy that lets me play some of the more interesting games available.
For you, this might be different, but then again, who am I to tell you how to enjoy your CPU cycles?
Un pobre guey (593801) pointed out the basic futility behind my sig in this article.
While not perfect, by any means, I did come up with some pretty good stuff - which I intend to massage into something more useful over time.
Life is about that [struggling for what is right against greed and stupidity]. However, our performance is rarely as consistent as our best intentions.
Conversely, the same thing can be expressed as:
Life will always be about furthering our greedy desires, despite our stupidity, at the expense of truth.
It all depends on where you mostly fall in the desire/righteousness continuum.
[commenting upon Un Pobre's suggestion that these ideals get 'crushed on a vast scale on a daily basis']
The key to not becoming disheartened is to pick your battles carefully. That way you aren't always getting squashed. Know when it is most useful to expose your hand, and when it is better to work quietly behind the scenes.
Overall, it is much better to gather 10,000 allies quietly over time, than to run out into sunlight alone and get squashed right away - unless you are into being a martyr. Patience and sacrifice = success. Sacrifice alone = death.
I do not advocate blind sacrifice. I do advocate struggling for what is right - but smartly, with your eyes open.
Here is my advice to those of you looking for a steady job in the computer field (or trying to keep the one you've got):
Stop thinking of yourself as a 'Programmer' or a 'Coder', or a 'DBA' or whatever narrowly defined field you think you are in. From now on you are a 'System Integrator', 'System Developer', 'Computer Guru'. Stand up straight - stick out that chin - be proud of who you are.
Learn as much as you can about iterative/agile development - characterized by rapid prototyping, frequent incremental releases, and a really meaningful feedback loop with your customers and team members that addresses the three key questions: What did we not do so well this iteration? What did we do very well this iteration? And finally, what can we do to improve ourselves for the next iteration?
Avoid the strict waterfall method, where specifications are defined in detail (often taking many weeks or months) before implementation and frozen until final release; at which point the application is 'maintained', usually by a different group than the one that developed it, and changes have to go through a months-long review process while vying for IT resources with other projects. I have lived through these, and it is not pretty. Instead, build quickly to get something into the hands of the users so they can give you feedback and shape the design where it needs to go. With vendors and internal IT development teams it is sometimes an uphill battle to break them of old bad habits, but it's a good fight that pays dividends in the end.
Learn portable tools first, then everything else second. By this I mean don't specialize in .Net or Visual Basic before you learn Python, Perl, Java, and C/C++ (GCC). Be able to prototype quickly on whatever hardware and OS your customer may have available (don't tie yourself to a particular architecture/OS) with minimal setup on your part. Ideally, build a tool archive that has the things you need, ready to go with templates for standard functionality - this will impress your clients.
One suggestion for a client-server web enabled application toolkit (90% of my projects fall into this category) is to have a tar/zip file containing Zope with your favorite products (extension modules) installed - as well as example applications, and Python - for scripting, and rapid GUI development - again with modifiable examples of stock applications you have developed available for rapid prototyping. If you have internal customers - have a machine set up this way so you can do the prototyping quickly and get feedback as soon as possible.
Don't be afraid of change. Be flexible, and make your customers so happy they come back to you for more quality tools. Figure out who your customers are and what they want, then give it to them without having to be asked; it is easier to beg forgiveness for something useful than to get permission to build it beforehand.
Keep your technical skills up; practice your craft by building small applications that exercise some aspect of a language you are not familiar with. Keep up on the development, system integration and various standards swirling around - be like Bruce Lee: take what works, and discard the rest. Don't waste your energy trying to master every fad. Know where all of the best 'wheels' are - and use them; don't reinvent the wheel if you don't have to. If all else fails - reinvent the wheel. Remember - the only true measurement is working tools; build many and often.
Finally, examine yourself. Are you the best Computer Guru/System Integrator you can be? Are you doing the right things to satisfy your customers? More importantly, are you satisfied with your job, or would you be happier in some other line of work (there are plenty of other unemployed IT workers ready and willing to step into your empty slot).
He traveled far and wide from coast to coast, and gathered warm clothing against the frigid blasts of the 15 ton airhandlers, and avoided the slings and arrows of 48 volt DC power supplies. And lo, he found inscribed upon a lowly cabinet in an empty corner of the data centre floor...
The 1010 Commandments:
0001: Thou shalt not covet thy neighbor's information.
0010: Thou shalt not hold any OS above the one true POSIX Compliant OS
0011: Thou shalt not worship graven languages, like Visual Basic, C#, or
0100: Thou shalt not spread pestilence, disease, or internet worms in
0101: Thou shalt keep the backup day and make it holy.
0110: Thou shalt regression test aggressively, for this is the way to
0111: Thou shalt patch frequently, and tithe heavily for every bug that
1000: Thou shalt not let marketroids and suits guide your projects; this
is the path to perdition.
1001: Thou shalt share thy computer science knowledge freely with all
those who grok; for all others: RTFM and FAQ.
1010: Thou shalt not use proprietary standards, in all its cloven forms;
just as Eve was seduced by the fork tongued devil, so too will
you be expelled from the raised floor of the data center for
Sco begat Caldera; Caldera renamed Sco; Sco shat on Linux.
Somewhere in there USL begat BSD; BSD broke free.
Things continue to look dark. The totalitarian internet dictatorship enterprise (TIDE) is rising swiftly now. How long will it last? Will we survive it? What of our children? I remember, as a child, having fantasies of surviving a nuclear holocaust. I would pack all of my 'survival gear' into my backpack, food, extra clothes, gear (my trusty compass and swiss army knife), as well as books that I thought important to preserve ('Alas Babylon', 'Lucifer's Hammer' et al). Things never got quite as bad as my nuclear winter nightmares, and I have learned that we are never fully prepared for the increasingly unexpected happenings of adult life.
I keep telling myself that reality is always somewhere in between my worst nightmares and my best wishes. I hope that holds true. Hope is a rare commodity now. Perhaps we have to make the world the way that is right by being active - being that lone voice crying out against the screaming hurricane and the enveloping tapestry of night; maybe waiting will only plunge us into darkness faster... what will happen as we stand by and do nothing?
The first rule of Animal Farm: Don't talk about Animal Farm....
Back on dry land with a new gold tooth, and a healthy appreciation for the vagaries of wood...
Considering my options...one troll, several funnys, and a handful of insightfuls later, and still no moderation points...I kick around the idea of becoming invisible electronically:
No more credit cards - chop them all up. No more ATM transactions; and cancel my direct deposit - cash and carry from this day forward. Select various facial artifices, prosthetic noses, cheekbones, change the color of my eyes and dye my hair in a rotating basis. Clothing to mask my gait. A cabin in the back woods away from prying video cameras. Inside of the cabin, a metal lined room with isolated power - a digital oasis with no contact with the outside world - just me and my perl scripts, happily beeping away...until that day when they finally come to see me - having missed my monthly diatribe mailed from the local trading post - to find my bloated corpse perched in my last resting place, green fuzz encrusted Funyons and empty bottles of Bicycle Ale encircling me like a wreath, a faded 'EFF' poster created with ASCII art leaning forlornly to one side upon the dingy wall above the monitor, holding steady at the soft white glow of a boot prompt.
Are we staring down the maw of the digital dark age? Are we entering an electronic 'Spanish Inquisition'? Does the all-seeing eye reign supreme? More importantly, where is Hagbard, and what is he doing about it?