The wonderful PBS Off Book series on the art of visual coding. Some trippy stuff.
Clearly, the first thing Jeff Atwood wanted to do with the post is get your attention. He knew that by being a grumpy old contrarian, he was going to get some reactions. The problem with the article, though, is that the message is a bit muddy. His argument is that the “meme that everyone should learn to code has gone out of control.” He doesn’t see it as an essential life skill like reading, writing, or arithmetic. And that basically “learning to code for the sake of learning to code” is wrong. I understand his sentiment, but because of the harsh tone, most people are gonna read it as, “you shouldn’t learn x skill because you don’t need it.” That’s like saying you shouldn’t buy another pair of shoes because you really don’t need more than one pair for survival in the world.
The article is conflating “coding” with “professional software development”. Learning to code teaches you, among lots of other things:
Divide and conquer
You don’t have to learn plumbing to understand that something is wrong with the toilet, but saying that wanting to understand exactly what’s wrong with the toilet is wrong, is just plain arrogant elitism. Understanding how toilets work isn’t going to stop you from being a Mayor.
It’s funny, but it’s true. It’s been happening for a while now. The whole startup world has attracted a different kind of programmer. A more extroverted, atypical person that you wouldn’t think would be interested in coding.
Tech’s latest boom has generated a new, more testosterone-fueled breed of coder. Sure, the job still requires enormous brainpower, but today’s engineers are drawn from diverse backgrounds, and many eschew the laboratory intellectualism that prevailed when semiconductors ruled Silicon Valley. “I don’t need to wear a pocket protector to be a programmer,” says John Manoogian III, a software engineer and entrepreneur.
Daniel Markham argues that “programming” is going to be a required skill in life just like reading, writing, and math. He’s using programming as a broad term meaning to have the skill of knowing how to tell a computer to do something.
It used to be there were four tiers of work in the United States. The first tier was for the truly uneducated: the illiterate. The second tier was for people who could be counted on to read and write and perform basic math: the high school graduates. Then there were folks who could be counted on to learn a lot more and take up positions of greater complexity: the college graduates. Finally there was a spot in the job market every so often for an expert.
Newsflash: the second and third tier are going away. In its place is a single tier: people who are literate and are able to control computers. And we’re nowhere near ready for the changes coming.
Douglas Rushkoff on why everyone should learn to code.
…[W]e now live in a world with apps, networks, and stock market trading algorithms that we use, even though desperately few of us understand how they work. And while learning to code may have once been an arduous or expensive process, the college dropouts who developed Codecademy have democratized coding as surely as Gutenberg democratized text. Anyone can go to Codecademy and start learning and creating code through their simple, fun, interactive window, for free.
I’m in my second week of the Code Year courses and I’m loving every minute of it.
A couple of days ago I sat down to watch Hackers on Netflix. I missed that movie when it came out in ’95 and was curious why it’s considered a cult movie. I can see the appeal. It’s campy, rebellious, and Angelina Jolie wears leather. But it’s cult for all the bad reasons a movie can be considered cult. It’s not an underappreciated movie that was too ahead of its time. Let’s just say this one is cult because only a cult would be crazy enough to champion it.
When I finished watching it I was terribly disappointed. But more than disappointed, I had complex feelings about it. I felt sad and angry. I got all worked up and drove my wife crazy talking about computers in movies. The question that kept coming up is, “Has there been any movie about computers, or computer culture, that’s been fair?”. Note that I’m not asking if there’s been any that have been good or bad. But just fair. This question drove me to do some research. Meaning doing some Google searches and reading Wikipedia entries.
When Roger Ebert reviewed Hackers back in ’95, he cited Andy Ihnatko’s impression of the film:
“Hackers wasn’t even in theaters before attacks on it started online. It represents a new genre, “hacksploitation,” Mac expert Andy Ihnatko grumbled on CompuServe, adding that like a lot of other computer movies it achieves the neat trick of projecting images from computer screens onto the faces of their users, so that you can see graphics and data crawling up their chins and breaking over their noses.”
This Hacksploitation term encapsulates it perfectly and it starts to answer part of my question. Think for a moment of movies that deal with computers. With the exception of You’ve Got Mail, which was a giant AOL advertisement, computer users are either hackers, or are either hackers. That’s not a typo. They’re dangerous people and can destroy civilization as we know it.
Movies about the computer world are either thrillers or science fiction. That’s actually a good definition of a Hacksploitation movie: one that blurs the line between a thriller and a sci-fi film. Another, even better definition could be: a movie that pays homage to computer culture while sadly getting everything totally wrong about how computers work.
In the 80’s we had WarGames and Tron. If you were into computers, these movies were the best thing ever. A hacker almost starting World War III? That’s totally boss. Tron was like a PBS special on computer programming that used special effects to give visual analogies.
But it’s in the 90’s when things really started getting weird and exploity. We had Lawnmower Man, Johnny Mnemonic, Hackers, The Net, and The Matrix. Three of those came out in ’95. The Net, where Sandra Bullock plays the most unbelievable hacker in the world, looking more like she should be hosting The View, almost fell outside the Hacksploitation category until the floppy made the screen flash different images and made rapid swoosh sounds.
(The made-for-TV film, Pirates of Silicon Valley, could be included, but it’s more of a documentary. I wouldn’t classify it as Hacksploitation. That’s just a great and underappreciated film about the industry.)
The problem with these movies is not that they were bad or good. It’s how wrong they got the computer stuff. The flashing code blown up in 3D so you could understand how they hacked. The totally bananas user interfaces. The stereotypes, which were really off the mark. (Raver look?!) They had an agenda, and the agenda was that computers and computer people are trouble.
It’s perfectly understandable. In those two decades (the 80’s and 90’s), if you told someone that you were a computer programmer or just worked with computers, people couldn’t help picturing someone from Revenge of the Nerds. Even in the mid to late 90’s, computers and the internet were still a fringe activity. It wasn’t completely understood that computers were simply tools to make things.
So we get to the new millennium. The internet finally explodes. Blogging, social networking, and all that stuff starts happening. Computers are understood more as devices to create and consume media than to code and hack. You would think Hollywood would know better. But they come out with Swordfish and Antitrust. Don’t get me wrong, Swordfish was badass, but it’s still a Hacksploitation film. Later in the oughts the films got more sophisticated: Firewall and Live Free or Die Hard are two that come to mind. It remains to be seen how they’ll hold up, but compared to films like The Net, they’re not as embarrassing.
There is a glint of hope though. In Fincher’s The Social Network we are finally given a straight up, raw computer nerd: fake Mark Zuckerberg. This film had to address computers and web culture. That was a big part of the story. But it does so fairly, in a non-dramatic way. Non-dramatic to a fault, even. With The Girl with the Dragon Tattoo we’re getting another hacker, a goth-punk-new-raver something who happens to know a lot about computers. I haven’t seen the Fincher film, but I saw the Swedish version, and the most computer-y thing I remember the lead character doing is transferring jpgs on a Mac.
Circling back to the conflicting feelings, the “good thing” is that a movie like Hackers would never be made again. People no longer think that computers are creepy or hard to understand. But that’s also the “bad thing.” I’m not so sure kids would be as inspired to get into computers by watching fake Zuckerberg create a social network as the thousands who were probably inspired by Broderick’s character hacking into military computers in WarGames.
The irony of it all is that the only people who could truly love these films, or equally hate them, are the same people they’re exploiting. The geeks, the nerds, and the jackals. Hacksploitation is dead. Long live hacksploitation.
NYTimes piece by Matthew B. Crawford, author of Shop Class as Soulcraft. As in his book, the essay deals with how modern America has devalued manual trades like plumbing, carpentry, and mechanics. He argues that since the switch to “knowledge work,” there’s a mistaken assumption that working with things and working with your hands is for “stupid” people, when it’s far more intellectually satisfying than people realize.
The trades suffer from low prestige, and I believe this is based on a simple mistake. Because the work is dirty, many people assume it is also stupid. This is not my experience. I have a small business as a motorcycle mechanic in Richmond, Va., which I started in 2002. I work on Japanese and European motorcycles, mostly older bikes with some “vintage” cachet that makes people willing to spend money on them. I have found the satisfactions of the work to be very much bound up with the intellectual challenges it presents. And yet my decision to go into this line of work is a choice that seems to perplex many people.
Be sure to also check out the excellent book review.
Really funny and interesting article about how the minds of programmers work. The clearer and more rational you are, the better you can communicate. That’s true, up to a point. Some programmers take this to heart. To do their work effectively they have to communicate clearly, sequentially, and logically. But with humans you have to do the complete opposite.
The golden rule of programming is D.R.Y. — don’t repeat yourself. This is the heart of effective programming. But this is the opposite of effective communication.
Let me say that again:
The golden rule of programming, DRY, is the opposite of effective communication.
Say everything once and only once — go ahead — then be amazed as everyone misses your point!
Humans are not machines. Memories made of this gooey, spongy stuff called a brain are nothing like memories made of silicon.
With humans, nothing sinks in the first time. And furthermore, you may be surprised to hear that NOTHING sinks in the first time.
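For anyone who hasn’t run into DRY before, here’s a minimal, hypothetical sketch of what it looks like in practice. The function name and numbers are my own, not from the article quoted above:

```python
# Without DRY: the same tax calculation is copy-pasted everywhere.
# Change the tax rate and you have to hunt down every copy.
book_total = 12.0 + 12.0 * 0.08
pen_total = 3.5 + 3.5 * 0.08

# With DRY: the calculation is said once, and only once.
def with_tax(price, rate=0.08):
    """Return the price with sales tax applied."""
    return price + price * rate

book_total = with_tax(12.0)
pen_total = with_tax(3.5)
```

Which is exactly the article’s point: in code, saying something once is a virtue; in conversation, it’s how you guarantee nobody remembers it.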
Newsweek International editor Fareed Zakaria on the recession. His take:
The global financial system has been crashing more frequently over the past 30 years than in any comparable period in history. On the face of it, this suggests that we’re screwing up, when in fact what is happening is more complex. The problems that have developed over the past decades are not simply the products of failures. They could as easily be described as the products of success.