How Pitfall builds its world (2021) (evoniuk.github.io)
312 points by kibwen on Dec 22, 2022 | 88 comments



I have been deep in the 6502 hole writing an emulator and watching Ben Eater. Throw off the shackles of your massive computers and compute for the sheer fun of it! Almost everything about the 6502 and writing assembly for it is relevant in some way to modern systems programming. You get to trim all the fat from programming and computing and see the ingenuity of computing unfolding before you in a tiny little 2^16-byte address space. Where you are going there isn't even a div instruction. It is a cool way to compute, and as retro as it is, if you were ever interested in systems programming this is a fun, nostalgia-themed way to start.
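To make the "no div instruction" part concrete: on the 6502 you would typically hand-roll division as shift-and-subtract long division. Here is a minimal sketch of that routine, written in Python for readability rather than 6502 assembly (the real thing would usually be built from ASL/ROL, CMP/SBC and a loop counter):

    def divide_8bit(dividend: int, divisor: int) -> tuple[int, int]:
        """Return (quotient, remainder) for unsigned 8-bit operands."""
        if divisor == 0:
            raise ZeroDivisionError
        quotient = 0
        remainder = 0
        for bit in range(7, -1, -1):          # walk the dividend MSB-first
            remainder = (remainder << 1) | ((dividend >> bit) & 1)
            quotient <<= 1
            if remainder >= divisor:          # compare-and-subtract, one bit per pass
                remainder -= divisor
                quotient |= 1
        return quotient, remainder

    assert divide_8bit(200, 7) == (28, 4)     # 200 = 7 * 28 + 4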


The irony: when I grew up writing programs for my Atari in 6502, I dreamed of having the amazing resources of today's computers. Now people are going back to simpler times. Sort of the programmer equivalent of LARPing medieval times, I guess.


I definitely dreamed of more powerful computers, but working with libraries that wrap libraries that wrap libraries (and on and on) and never needing to think about the hardware was not something I really anticipated. There's something so nice about working directly with the hardware and software that calling it LARPing feels dismissive. Arduinos are fun in a similar way and aren't medieval.


Dismissive? I think maybe that's your attitude toward LARPing? I think LARPing is great fun! Nothing wrong with it: going back to a period where things were different, with less technology but still interesting skills involved to do things we take for granted these days.


It is just nice to switch modalities and make things simple. 6502 is a sweet spot of capability and simplicity for me.


It's amazing how far we've come in raw computing resources available on a desktop in 40 years.

Check the 2018 video of an Atari 8-bit emulator which can run 182 apps simultaneously (no idea what CPU is powering the box):

http://www.emulators.com/xformer.htm#XFORMER10


"Grass is greener on the other side."


Plus you can make new NES games.


That is my eventual goal. There are many NES emulators out there, but this one will be mine :)

I also wanted to release a decent pure 6502 emulator with a good command-line monitor that lets you explore programs easily. Step. Poke memory. Set registers. Good CLI. We'll see. It really helps to have a decent playground when getting started :)


Heh. That would've been nice. Back in the day, after we walked in the snow uphill to work both ways, we wrote our code, ran it through the cross-assembler, burned EPROMs (after waiting an extra 15 minutes to erase the last ones we used because someone lost the dang tube), plugged 'em into a socketed NES cartridge and really hoped it worked.


And a whole bunch of other systems: https://en.wikipedia.org/wiki/MOS_Technology_6502#Computers_...

And, more recently, the Commander X16.


Pitfall really stood out ... and made many of us think, "why can't other 2600 games hit the bar set by Pitfall?"

I mean, there were fun games with very basic graphics (Combat springs to mind, https://www.free80sarcade.com/atari2600_Combat.php) but there were also way too many uninspiring original games and arcade copies that were just lame.

Now we know why. It's a great article, thanks for sharing @kibwen.


It still stands out as an actually-fun game on the 2600.

The 2600 is before my time. I grew up with the NES and its library. I do see 2600 nostalgia in older generations, but for me, Pitfall is pretty much the only title worth playing.


You may find many of the Activision titles still hold up. River Raid, Megamania, Seaquest, Frostbite. There's a lot of fun to still be had in those games, especially if you're competing head-to-head.

But for my money the most fun you can have on an Atari 2600 today is 4-player Warlords with well-maintained paddles.


My vote for River Raid, H.E.R.O., and Enduro. Cool fact that I only recently discovered: River Raid was designed and programmed by a woman, https://en.wikipedia.org/wiki/Carol_Shaw .


Carol Shaw is married to Ralph Merkle, of Merkle Trees fame, used in many applications, including git, Cassandra and many cryptocurrencies (including the original Bitcoin).


> for my money the most fun you can have on an Atari 2600 today is 4-player Warlords with well-maintained paddles

This was definitely the most fun for larger groups of players back in the day.

https://www.giantbomb.com/warlords/3030-25096/


Do you know about Medieval Mayhem? https://atariage.com/store/index.php?l=product_detail&p=842

"adds arcade features such as the launch dragon, multiple fireballs, and a level of polish missing from the original 2600 release."

There are a ton of homebrew games for the 2600 that are super fun.


Yeah! 2-player (or 4-player!) simultaneous Warlords or Medieval Mayhem are really fun, even in 2022.

After many decades I finally managed to try it out, on a real CRT and everything. It's not something you generally want to play for hours or anything like Mario Kart... but it is really really good.


I'll definitely have to check them all out. Thanks!


“River Raid” stands out to me as the most actually fun Atari 2600 game.

Despite having awkward controls, “Combat” is also fun due to couch multiplayer. “Jungle Hunt” is similar to Pitfall but faster. I also managed to spend a lot of time playing the 2600 version of Asteroids, but I don’t know if it actually holds up.


Yep. Anybody hating on Combat never played Combat the way it was meant to be played. I treated it as the game you just put to the side as a kid because we were too busy playing Pac-Man, Donkey Kong, Breakout, Chopper Command, Galaga, Galaxian, Defender, Adventure, Vanguard, etc. But once you've played Empire Strikes Back ten dozen times and the fun of shooting glowing pixels on the backs of AT-ATs wears off... Combat was still there and took almost zero explanation for how to play.


I really loved playing Vanguard when I was younger. The multiple level types and fast-paced action really drew me in.


Others have mentioned some other good games (Ms Pac, Keystone Kapers, River Raid, etc).

There was something about the first-generation Activision games that seemed to stand out compared to the rest. Whether simple or complex, they had some hard-to-pinpoint quality that was just... 'good'. David Crane mentions this in one of the linked videos: "2 line kernel didn't look good. 1 line kernel was harder to do, but looked better, so we just did that". He's referring to doing more work to be able to make visual updates every scan line, instead of every other scan line. This made their games so much more visually sharp and appealing compared to what came before on the same hardware.


So the TIA (the Atari video chip) has a "delay" slot for its "player" graphics (a player being an 8-dot-wide object). The CPU is supposed to program new data into the player graphics register during the HBLANK portion of the display for each scanline on which it's visible.

The intention was that the CPU would spend 1 scan line updating P0 graphics, then the next scan line updating P1 graphics, and by setting the delay for P0, both player graphics would appear on the same 2-line portion.

So "2 line kernels" were doing it in a way the TIA supported and providing time to do other things like check paddle status (because you're supposed to be programming a Pong game). Of course the TIA has been wonderfully pushed and twisted in ways never imagined by the designer. I think someone even experimented with interlaced 160x400 graphics.
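For a sense of the structure being described, here is an illustrative sketch (Python pseudocode, not real 2600 code; write_tia() is a made-up stand-in for a store to a TIA register) of the even/odd alternation in a 2-line kernel:

    def write_tia(register, value):
        # stand-in for a hardware register store; just record what would be written
        print(f"{register} <- {value:08b}")

    def two_line_kernel(p0_rows, p1_rows, visible_lines=192):
        for line in range(visible_lines):
            band = line // 2                  # one sprite row spans two scanlines
            if line % 2 == 0:
                write_tia("GRP0", p0_rows[band % len(p0_rows)])   # even line: player 0
            else:
                write_tia("GRP1", p1_rows[band % len(p1_rows)])   # odd line: player 1
            # with the P0 delay (VDELP0) set, both updates land on the same
            # two-line band, and the leftover HBLANK cycles go to game logic

A 1-line kernel gives up that spare time and rewrites both players on every scanline, which is why it looks sharper and is harder to pull off.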


Thanks for the extra details :)


The two-player games were all fun, and the arcadey stuff like Centipede and Missile Command was addictive. My memories of Pitfall were mostly that it was repetitive and hard, but I was pretty young so I presumably sucked. I do remember that things like Submarine Commander and Star Raiders/Solaris felt very, very sophisticated though, and I can imagine they'd be worth a revisit.


A YouTube video (https://www.youtube.com/watch?v=CkDllyETiBA) shows a complete walkthrough in which all 32 treasures are acquired, finishing with less than a minute left on the 20-minute clock (8 seconds, actually). Nintendo hard doesn't have much on certain types of Atari hard.


The best tool-assisted speedrun for this game comes in at 18:11. According to the runner, a lot of the obstacles, like swinging vines, all run on the same timer. This means that saving a bit of time in one room may not put you ahead (because it just means you have to wait longer for the next obstacle), but missing a "cycle" on an obstacle is an unrecoverable setback.

https://tasvideos.org/4000M


First time I’ve seen a map of the Pitfall world:

https://pitfallharry.tripod.com/MapRoom/PitfallMap.html


In my defence, I was probably 4 or 5, although I’ve had a lifelong dislike of jumping puzzles that no doubt stemmed from this game.


https://www.youtube.com/watch?v=B21K8X1eja0

Check this out - a list of 20 Atari VCS games the author thinks still hold up in 2022.

I'm not sure I agree with all the choices, but it's an interesting list nonetheless.


If you've got the paddle controllers in good condition, and a low-latency screen, Super Breakout in progressive mode is a treat. And paddle controllers have disappeared since then, so it's hard to replicate. Arkanoid with a spinner on newer systems is similar but different.


If we're listing fun games I'd like to add Berzerk :)


If you like Pitfall you might like Keystone Kapers


Did you try Ms Pacman? It's pretty solid.


Probably haven't, but I'm not usually enamored with home console ports of arcade games when I can just as easily fire up MAME. I can recognize them for what they are on the hardware they target, but nearly always the arcade original is preferable.


No doubt, but I bought a VCS2600 and actually used it a couple of years ago. Pretty fun!


I'd counter with:

- Adventure

- Riddle of the Sphinx

- Raiders of the Lost Ark

- Dragonstomper

- Star Raiders


I didn't have a 2600, but my friend did. Pitfall was by far the game I remember going to his house to play. It really was worlds better than the rest.

This was also a time (1982) when arcade games (Star Wars, Tron, Zaxxon, etc.) and their graphics were an order of magnitude more impressive than home console games, but Pitfall still had enough playability and complexity to keep it interesting.


> Combat springs to mind, https://www.free80sarcade.com/atari2600_Combat.php

Fun Copyright Fact: This website is violating Atari's copyright and will be until the year 2072!

"For a work made for hire, the copyright endures for a term of 95 years from the year of its first publication" https://www.copyright.gov/help/faq/faq-duration.html


And this is all thanks to the goddamned mouse and the politicians who decided Disney's desire for infinite money was more important than the societal good from more media turnover and the ability to riff on it.


Battlezone on the 2600 feels pretty mindblowing to me too, although only on a technical level - it's a bit of a boring game.


Awesome write up, thank you for the deep dive and nostalgia!

I also like that this is on the front page at the same time as "24 cores and I can't move my mouse": https://news.ycombinator.com/item?id=34095032

It tells us that software is like a gas - it naturally expands and fills the container it occupies.


Also known as Parkinson's Law[1]. "Data expands to fill the space available for storage." And the reason building more highways doesn't reduce traffic congestion.

[1] https://en.wikipedia.org/wiki/Parkinson%27s_law


This also applies to stuff and houses (and garages)


Awesome!

Any time I think of Pitfall!, I always remember this pamphlet you could request and receive via mail:

Programming Pitfall Harry

https://archive.org/details/program-pitfall/mode/1up


Standard plug for this book:

https://www.amazon.com/Racing-Beam-Computer-Platform-Studies...

and

https://en.wikipedia.org/wiki/Racing_the_Beam

Great read. Very in-depth on some of the great games of that era.


Came here to mention this.

It is a fascinating book - and not just for the Pitfall chapter.


Without a doubt. There was a David Crane video from 2011 linked here by russellbeattie(?) in which David plugs this book himself in the middle of his talk. It's that good. :)


LFSR algorithms are super interesting. Fabien Sanglard documents how one was used in Wolfenstein 3D's "fizzle fade" effect [1]. This is also covered in his book on the development of that game [2].

A detailed write-up that goes into a bit of the mathematics, with code examples, is 'Demystifying the LFSR' [3].

The 'Computerphile' YouTube channel did a whole episode on LFSRs last year which is very accessible; highly recommended [4].

[1] https://fabiensanglard.net/fizzlefade/index.php

[2] https://www.amazon.com/Game-Engine-Black-Book-Wolfenstein/dp...

[3] https://www.moria.us/articles/demystifying-the-lfsr/

[4] https://www.youtube.com/watch?v=Ks1pw1X22y4
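As a concrete toy, here is a minimal 8-bit Fibonacci LFSR in Python. The taps below are a standard maximal-length choice (a 255-state cycle); whether they match Pitfall's exact polynomial is covered in the article, so treat them as illustrative:

    def lfsr_step(state: int) -> int:
        """Shift left; the new low bit is the XOR of the tapped bits."""
        new_bit = ((state >> 7) ^ (state >> 5) ^ (state >> 4) ^ (state >> 3)) & 1
        return ((state << 1) | new_bit) & 0xFF

    room = 0xC4                # any nonzero seed
    for _ in range(8):         # each step yields the next "room" byte
        print(f"{room:08b}")
        room = lfsr_step(room)

Because each state fully determines the next (and the step can be run in reverse), a long sequence of screens boils down to one byte of live state plus the generator, which is essentially the trick the Pitfall article unpacks.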


Now I want to write a JS game using LFSR.


128 bytes. I'm trying to think of something in today's world with that little memory - a 'door opening' smart card? a car key? There are light bulbs with literally a thousand times as much.


In 2007 or so, my mother bought me a "robot" kit with a Parallax-branded "BASIC Stamp" microcontroller: https://www.parallax.com/product/basic-stamp-2-microcontroll...

It had 32 bytes of RAM. The IDE and educational materials were very heavy-handed about encouraging you to split them up into very small variables, only as big as needed. Think of all the 8-bit ints that fit into 32 bytes!

PBASIC was actually pretty cool.


128 bytes and no framebuffer.

They had to "race the beam" ... but the VCS was not as fast as the beam, which is why Atari 2600 games always have a horizontally stretched look to them:

https://en.wikipedia.org/wiki/Racing_the_Beam

"The book's title comes from the fact that the Atari 2600, initially branded the VCS (Video Computer System), did not have a video frame buffer, and required the programmers to write each line of video to the television, one line at a time. As there were only a limited number of machine cycles in which to do this, the programmers were literally racing a high speed electron beam across the screen."


DVDs. There were 16 GPRMs of 16 bits each. Doing anything remotely complicated required the use of every one of those bits. This was the first time I had ever had to use bits in this manner, and I did things not dissimilar to how Pitfall is described as using the bits it had available. Of course, I was doing things 20+ years later. I found it quite a bit of perverse fun thinking in 0s and 1s like that. Not all DVD authoring programs were made the same, and abstraction-layer software like DVD Studio Pro required the use of half of those GPRMs, limiting what the programmer could do.


There are some AVR microcontrollers still manufactured today like that. For example, the ATtiny261A has 128 bytes of RAM, the ATtiny5 only has 32 bytes of RAM, and the ATtiny28 has no RAM at all (just registers).


I have a "light bulb" (actually an LED light fixture that attaches to a standard ceiling socket) that runs some kind of OS and has WiFi connectivity.


I think much of this is explained at a high level in this talk from the original game designer: Pitfall Classic Postmortem With David Crane Panel at GDC 2011 (Atari 2600). https://youtu.be/MBT1OK6VAIU

I haven't seen it in a while (will rewatch now), but I remember him going over how he was able to create so many levels in such a tiny space. I love these sorts of examples of breakthrough tech products that were the result of a developer figuring out how to do something that seemed impossible. Much of Apple's early success, for example, could be attributed to this.


> We got really lucky that Microsoft released the source code for MS DOS, and maybe if we're lucky Activision and Atari and Nintendo have all their original code somewhere in a vault, which they'll release freely into the public for the good of mankind, but I'm not holding my breath. Everyone who is able should be working to preserve whatever piece of history they can, 'cause it's not gonna preserve itself.

Amen.


Even though I grew up well into the Nintendo-dominated era of video games (NES, SNES), my first video game experience was playing on my parents' Atari 2600, and both Adventure and Pitfall were common go-to games for me. Reading the beginning of the article gave me real food for thought about how the manual for Adventure needed to specifically explain the concept of multiple rooms to the user. Even with my limited exposure to more modern video games at the time, I instinctively knew that going to the very edge of the screen was how you got to the next screen, so the idea that this was something so new in Adventure that it needed to be explained just blows my mind.


My first exposure to the concept of this kind of procedural generation was with the game Elite and how it created different galaxies. With such limited capabilities available, it was eye-opening to see the creativity that was unleashed.

See https://www.theguardian.com/books/2003/oct/18/features.weeke...


I think Archipelagos https://en.wikipedia.org/wiki/Archipelagos_(video_game) and Sentry/Sentinel https://www.c64-wiki.com/wiki/The_Sentinel also employed this technique.


This sure takes me back to hand-coding embedded systems firmware for a family of products on Microchip 8-bit microcontrollers... and that was as recent as this millennium.

In the days before ARM and when we needed to use a $3 processor to hit our BOM cost, my code used every byte of the 8K and every register... bit-banged I2C, bit-banged RS232 up to 56Khz, LED and 8-key touchpad debouncing, and best of all, an IR receiver and IR emitter which both learned arbitrary consumer IR codes in the fashion of a then-fashionable universal remote, and reproduced them including press-and-hold behavior, while emulating their carrier frequency... and could do bi-directional device cloning via IR by holding the products up to one another face-to-face...

And had a rock-solid bootloader for encrypted firmware updates.

Ah the good ol' days... a tear for the old skool.

I called the binder with the hard copy of the commented assembler source "my first novel"...


Just for fun, here's the Pitfall TV commercial (and Jack Black's first acting gig): https://www.youtube.com/watch?v=wfLgSdAAHMA


Did something not too dissimilar for storing test answer randomization and answers in an older e-learning standard. I had limited space, so for each screen I used a single byte to store whether it had been seen, the answer selected, and the order of the answers. The intent was to randomize answer order on display to make it harder to pass an answer key around. This is more impressive though... Cannot even imagine the effort it took.
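For flavor, a sketch of that kind of single-byte packing in Python. The field widths here are guesses, not the original scheme: 1 bit for "seen", 2 bits for the selected answer (0-3), and 5 bits for the display order, stored as an index into the 24 permutations of four answers:

    from itertools import permutations

    ORDERS = list(permutations(range(4)))     # the 24 possible display orders

    def pack(seen: bool, selected: int, order_index: int) -> int:
        assert 0 <= selected < 4 and 0 <= order_index < len(ORDERS)
        return (seen & 1) | (selected << 1) | (order_index << 3)

    def unpack(byte: int):
        return bool(byte & 1), (byte >> 1) & 0b11, (byte >> 3) & 0b11111

    state = pack(seen=True, selected=2, order_index=17)
    assert unpack(state) == (True, 2, 17)
    assert 0 <= state <= 0xFF                 # the whole screen state fits in one byte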


I was saddened by the lack of "real" (translation: actual itty bitty robots) nanotechnology taking off. I imagined everyone trying to drag old Atari programmers out of retirement with a "We're used to shipping these fuck-huge executables, how did you do all of this under such tremendous constraints?" plea. So much with such minimal resources!


> *Reddit user knome and Hacker News user nwellnhof pointed out the size of this sentence in ASCII and UTF 8 were actually the same, so I changed an ASCII apostrophe for U+2019.

I wholeheartedly approve of this correction. :)


Discussed at the time (of the article):

How Atari 2600 Game Pitfall Builds Its World - https://news.ycombinator.com/item?id=27111377 - May 2021 (29 comments)


This reminds me of something I read; I'm not sure if it's true:

If you try to save the entire mesh for a video game world’s surface, it would be absolutely massive. So you just save the gist and generate all the in-betweens.

I think it was about Morrowind?


You could say that about a lot of games. It is a common approach. Every compression scheme you can imagine has been used and more, from simply shipping the whole thing after all to procedural generation and everything in between. I seem to recall Oblivion was mostly procedural on the overworld but then people would come in and edit the maps afterwards for various things.


Minecraft doesn't have to save generated data. As long as the seed is preserved, all the data can be regenerated. It only saves that data to save time and to persist any changes made.
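An illustrative sketch of the principle (not Minecraft's actual generator): as long as generation is a pure function of the seed and the coordinates, any chunk can be rebuilt on demand, and only the player's edits need to be written to disk.

    import hashlib

    def chunk_height(seed: int, cx: int, cz: int) -> int:
        """A deterministic stand-in for terrain generation, derived only from the seed."""
        digest = hashlib.sha256(f"{seed}:{cx}:{cz}".encode()).digest()
        return digest[0] % 128        # same inputs always give the same output

    world_a = [chunk_height(1234, x, z) for x in range(4) for z in range(4)]
    world_b = [chunk_height(1234, x, z) for x in range(4) for z in range(4)]
    assert world_a == world_b         # regenerated identically from the seed alone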


Did no one here play "Kaboom!"? That was a fun game...


We did, but there really isn't anything interesting about Kaboom to make for discussion here. There's no technical wizardry or open world or groundbreaking mechanics or sense of fascination or urban legendarium about Kaboom; it's just a straightforward reflex game doing simple things with the hardware.


And yet, this completely uninteresting game by legends Larry Kaplan and David Crane had to fit into 2,048 bytes, ran at 60fps, won several awards, set sales records, and quickly became an instant classic.


That is true of many of the other games mentioned here as well, none of which have links to detailed technical discussions like Pitfall.


> We have the original source code for basically zero games for the Atari, NES, SNES, ColecoVision, you name it

Aren't those dumpster-dived scans of source code to Atari 7800 games?


Tangent…

From 2000 to 2002 I worked as a QA engineer at a semiconductor company in Silicon Valley. We tested the chips using all sorts of software. My boss loved dumpster diving at software companies, and probably half of our lab’s software came from stuff she found in the trash. My favorite find was Risk (the board game) for Windows 3.0.


Steve Woz said something in his book that always stuck with me: the younger generation is missing out on what it was like to grow up during a time when it was possible to grasp the entire machine in your mind.

People always chime in to say that you can still do that today. But back then the constraint was real; it was the best you had to work with, and it forced you to think of creative ways to use it and maximize it.

Anyway, super cool piece — thanks for sharing!


I grew up with early 80s microcomputers. I played around with BASIC and a little asm and definitely found it fascinating, but as a kid just typing in little games from magazines, I didn't really ever understand the whole computer.

Then in my 30s I found a group of retro enthusiasts who were still hacking on the little oddball computer I'd had as a kid. It became a hobby for a few years, and coming back to the platform with a couple of decades of programming experience was truly a revelation. Fitting the whole machine in your head, knowing what every byte in the memory map and every cycle of the CPU is doing... it's an experience I don't think is really possible on modern hardware. Maybe with a microcontroller or Arduino-type device, but even those are much more complex than the old 80s microcomputers.


It's still possible, but only on extremely low-resource microcontrollers, such as the PICs and AVRs from Microchip (and Atmel, now part of Microchip). Many of these are tiny 8-bit micros with only a few dozen bytes of memory, meant for simple applications.


And the generation before the one Woz was nostalgic for didn't have integrated circuits. They knew every transistor, resistor, and capacitor.


It's even sadder for kids being raised on smartphones and iPads as their only computing paradigm. If desktop OSes aren't complicated enough already, there's certainly no hope of learning how one of those works with how locked down they are. There are a lot of people who got into programming by learning to make Minecraft mods or basic scripting for some game. It feels like the mobile application paradigm tightly cements the idea that all software that runs on a computer comes from some big company and shan't be tampered with.

And I don't think Xcode Playgrounds or various Programming Environment For Babies applications draw as much attention and spark as much imagination as learning to mod infinite lives into a game you already play, or even something as simple as modifying an ini file for a mod you found online.


I do vividly remember the feeling of pride when, as a kid, I edited some game files to do something funky (like swapping Red Alert Tanya's weapon to the tesla coil...) or managed to cheat the game via a hex editor or Cheat Engine.

Mixing "something you like" with "something you can learn" is always a winning combo.


Any kid who wants to build an app for the iPhone/iPad can do it pretty easily, they just need a Mac and Xcode. With Swift Playgrounds for iPad, you don't even need a Mac.

I know you mentioned it "doesn't spark as much imagination", but I'm going to strongly disagree with you there. I don't think most software engineers got started by tweaking existing software. _You_ may have started that way, but I didn't. I got into programming by seeing the kinds of apps people made for Mac and iPhone, and wanting to build those kinds of experiences.


I just want to be clear I'm not framing today's tools as "bad" or unable to spark imagination — my own kids are getting started with Scratch and it's great!

But I also feel like there's some magic in the past too. Some of it's nostalgia, but there are also lessons there that are valuable.


Well, I had a BBC as a kid before progressing to PCs. The idea of programming never really occurred to me. I had a book, but its idea of programming consisted of typing out an ASCII art house.

I only got into programming in my 20s when some company called Canonical was giving away CDs of this thing called Linux.

Point is, we have the internet now for kids to discover these things, schools are more up on it, and we have inexpensive computers and stuff for kids to play about with.


I don’t disagree, but it’s all perspective and it’s all fractal. Every generation is building on top. He had his own examples of what he laments, I’m sure.



