Nice! I spent a few months in a bandwidth-constrained environment (Bluetooth tunnel to free cafe wifi across the town square...). I generally got a few KB/s. The recent article on Antarctica's internet reminded me of those days.
HN was one of the only sites that was reasonably pleasant to use during that time. Still, I found it more fun and faster to surf via a VPS. So the VPS itself would load and render the page, and only send me the bytes over Mosh (a remote-shell protocol that's much more efficient than SSH on slow or flaky links).
I tried a few terminal-based browsers with this setup. The one that worked best for HN (especially comment indentation!) was w3m.
I did have fast mobile data but it was expensive and limited. I watched a lot of videos on YouTube and one day realized that almost everything I watched was actually just lectures or discussions. So I could just stream the audio and save 80% of the bandwidth.
I set up a little PHP script* that took a YouTube URL and ran `youtube-dl -g -f bestaudio` to print the URL of an audio-only stream. (Directly from YouTube's servers!)
I could then just load it directly in my browser, or download it for later. (I think I should have used `worstaudio` instead.)
* That little utility page is not up anymore, but others found it very useful so I should probably make another one...
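For anyone curious, the same trick is easy to reconstruct. Here's a minimal Python sketch of the idea; the original was a PHP page, the function names here are made up, and it assumes `youtube-dl` (or a compatible `yt-dlp`) is on your PATH:

```python
import subprocess

def audio_stream_cmd(video_url: str, fmt: str = "bestaudio") -> list[str]:
    # -g prints the direct media URL instead of downloading;
    # -f selects the format ("bestaudio"/"worstaudio" are audio-only).
    return ["youtube-dl", "-g", "-f", fmt, video_url]

def audio_stream_url(video_url: str, fmt: str = "bestaudio") -> str:
    # Shells out to youtube-dl and returns the direct stream URL.
    result = subprocess.run(audio_stream_cmd(video_url, fmt),
                            capture_output=True, text=True, check=True)
    return result.stdout.strip()
```

The returned URL can then be opened in a browser or handed to a downloader, which is all the original page did.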
Another benefit of this is battery life (decoding video is expensive), plus being able to turn off the screen (a feature that Google banned from the app store, years before making their own (paid) version available...). Later I learned about F-Droid and NewPipe... good times :)
The https://www.hn.zip/ website links to a GitHub repo, https://github.com/lukeschaefer/hn.zip, but that repo link currently gives a 404. Maybe the owner of the site forgot to make the repo public or something. I can't find any other repo on their profile that appears to be for that project either.
Ah that's on me, it's a private repo because my code is embarrassing. But behind the scenes here's how it works:
- I wrote my own scraper for HN for reasons I no longer remember
- Scraper is called every 30 seconds by an Apps Script timer on a Google Sheet (to keep it dirt cheap; Vercel doesn't have free cron jobs).
- Caches the front page plus ten comments per post into a single Firestore document (one read per page load instead of ~20, which keeps costs down)
- Home page loads via SSR with Nuxt.
- Post pages load the rest of the comments from a separate doc per post
- The other comments load instantly on navigation if you're on a halfway decent connection; if not, you still get the first 10 comments to get an idea of the discussion while you wait.
- Added some Workbox stuff to get it offline-first.
- Added an IndexedDB wrapper around Firestore for more control over cache behavior, I think.
- It's worked for over a year now with no oversight needed, and it hasn't cost a penny!
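The read-batching step above is the key cost trick: bundling everything a page load needs into one document. Here's a rough Python sketch of the shape of it, with made-up field names (the real site does this with Firestore behind a Nuxt frontend):

```python
def bundle_front_page(posts, comments_by_post, per_post=10):
    # Pack the front page plus the first N comments of each post into one
    # document, so serving the home page costs a single database read
    # instead of one read per post.
    return {
        "posts": [
            {**post, "top_comments": comments_by_post.get(post["id"], [])[:per_post]}
            for post in posts
        ]
    }
```

A cron job rebuilds this document periodically; readers only ever fetch the one bundle, and the per-post documents are touched only when someone opens a thread.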
Then I got distracted because I wanted to rebuild it from scratch without Nuxt, and I never completed that rewrite. It's been a while since I worked on it; I had no idea other people even knew it existed! There are definitely some bugs (I think the CSS breaks on deeply nested comments), but it worked well enough for my uses, so I moved on.
Author of hn.zip here - thanks so much for the shout out! I never really publicize it and never 100% finished it, but I'm so glad people are using it! It's basically how I browse HN on my phone and once it reached 90% good enough I stopped working on it haha.
Can you do this with every website in the world?? I’m kidding of course but man, it pains me to browse the web as it is today knowing that things could be “this good”.
Great link! I might have found a bug in Chrome with it, though :)
I was curious to see more about the internals, so I opened Chrome DevTools, and as soon as I clicked the "Application" tab, Chrome crashed. It did that every time.
The best signal was had by hanging my phone from the ceiling. Since the phone itself was connected to the cafe Wifi, I couldn't use wifi to connect to the phone. So I shared the phone's (cafe wifi) internet over Bluetooth. (I didn't know that was a thing until I tried it!)
(I think I was able to connect to that with another phone and set up a wifi hotspot for all my devices... but it's been a while!)
Maybe it was something like a Raspberry Pi that he placed at that cafe? Curious that his Bluetooth had more reach than the Wifi itself though. Maybe with a directional antenna. Agree, would be interesting to know his setup.
That sounds awesome. I have actually been working on something similar myself. I have been building a distributed NATS[0] cluster to funnel data down from the internet and index it locally. I also have a local Ollama box wired up to try to do some NLP on it.
I started with Benthos to pull in, but since the recent license changes[1] I am rethinking that :(
If you would like to collaborate feel free to shoot me an email!
Haxor-news was the first terminal client I used for HN. Since then I have tried a few of the others, but haxor is the one I used the most. When I worked in the office, people who saw you fiddling around in the terminal would usually leave you alone, assuming you were on the HPC or doing other work. Now that I work from home, I haven't used a terminal client even once. Maybe I'll check this one out just for fun.
PS – thanks for the reminder of this one, was searching to post the exact same comment….
I’ve tried this one, and I found typing numbers to select the story I want to read a bit unpleasant as UX. (Hence I wrote hn-text, which lets me navigate and open articles visually, using only one hand.)
Would you consider adding armhf as a build target? This would make it useful for a wide range of older devices including Raspberry Pi 1 or 0, which are more likely to be used via console interface.
This reminds me of an HN post from way back: an article about "min" versions of websites, with images and large media filtered out, so it's the bare minimum, usually just text.
Great companion tool to tailhn (https://github.com/chfritz/tailhn), which lets you `tail` all new HN posts in the terminal, where you can use `grep` and other tools to filter, e.g., `tailhn | grep -i robot` to see all new posts about robotics.
Thanks, I've been looking for something like this for a while!
Hope you don't mind a few additional suggestions/feature requests!
- Single key scroll up/down a page of text (can work around it on my laptop keyboard with fn-down but would prefer not to have to use two hands).
- Maybe customizable key bindings? (e.g., I'm used to hitting space to scroll down a page of text, and don't need to open the story in a browser so often)
- Key binding to toggle "best", or even better, filters to view only the top N stories (I use this[1] site a lot for that).
Thanks, those are good ideas. CTRL+F (forward) and CTRL+B (backward) already work; they come from the underlying library (tview) and are the standard vim keymap for that action.
But yeah, keymaps should become configurable eventually.
Unfortunately the hierarchical comments are not indented properly with Lynx.
The topic listings work just fine, though. Although if I were using Lynx just for the headlines, I might start at https://brutalist.report/ instead of Hacker News.
> Maybe you could add a custom stylesheet to Lynx to make the indent work?
Lynx Style Sheets (LSS) don't really cover that kind of scenario from what I remember[1] - they're more "apply this ncurses/slang-style to this element".[2]
[1] I wrote the original code but I am old and increasingly senile.
[2] I've not really been involved for ~20 years but unless the internal "renderer" has been radically overhauled, supporting anything like "real" CSS isn't really doable.
I see it's written in Go. Some antiviruses love to mark Go executables as viruses: Go bundles its whole runtime inside the executable, so the antivirus's heuristics flag it just because it shares that bundled runtime with some random, unrelated virus also written in Go. (The antivirus has no idea it's just runtime code; it only sees that the executables' machine code matches by something like ~95%.)
If you have an account with showdead turned on, you will see text from flagged-but-not-dead stories through the web interface. You don't get those through Firebase.
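For reference, the Firebase API in question is plain JSON over HTTPS. A minimal fetch sketch (function names here are mine; dead items that the API does expose carry a `dead: true` field):

```python
import json
from urllib.request import urlopen

API = "https://hacker-news.firebaseio.com/v0"

def item_url(item_id: int) -> str:
    # Official HN Firebase endpoint for a single story/comment.
    return f"{API}/item/{item_id}.json"

def fetch_item(item_id: int):
    # Returns the item as a dict, or None for items the API doesn't expose.
    with urlopen(item_url(item_id)) as resp:
        return json.load(resp)
```

No auth token is involved, which is exactly why the API can't honor a per-account setting like showdead.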
I just came here to say, sincerely, that I love that you kept this so simple and just put everything in one single main package, with well-named files, rather than googling "the proper go project structure" and ending up with something unnecessary like:
⌞ app
⌞ cli.go <- actual package main
⌞ internal // (or pkg, or pkg/internal)
⌞ parser
⌞ parser.go
⌞ parser_test.go
⌞ ui
...etc
Here's the current thread in w3m: https://files.catbox.moe/zuabb4.png