HN-text: an easy-to-use, text-first Hacker News terminal client (github.com/piqoni)
216 points by todsacerdoti 8 months ago | 67 comments



Nice! I spent a few months in a bandwidth-constrained environment (Bluetooth tunnel to free cafe wifi across the town square...). I generally got a few KB/s. The recent article on Antarctica's internet reminded me of those days.

HN was one of the only sites that was reasonably pleasant to use during that time. Still, I found it more fun and faster to surf via a VPS: the VPS itself would load and render the page, and only send me the rendered bytes over Mosh (a remote-shell protocol that handles slow, flaky links better than plain SSH).

I tried a few terminal based browsers using this setup. The one that worked best for HN (especially comment indentation!) was w3m.

Here's the current thread in w3m: https://files.catbox.moe/zuabb4.png

I did have fast mobile data but it was expensive and limited. I watched a lot of videos on YouTube and one day realized that almost everything I watched was actually just lectures or discussions. So I could just stream the audio and save 80% of the bandwidth.

I set up a little PHP script* that took a YouTube URL and ran `youtube-dl -g -f bestaudio` to get the URL of an audio-only stream. (Served directly from YouTube's servers!)

I could then just load it directly in my browser, or download it for later. (I think I should have used `worstaudio` instead.)

* That little utility page is not up anymore, but others found it very useful so I should probably make another one...
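Roughly, all the page did was shell out to youtube-dl and hand back the resolved URL. A minimal sketch of the same idea in Go rather than the original PHP (the /audio path and port are just examples, not what I actually used):

    // Sketch only: given ?v=<YouTube URL>, print the direct audio-only stream
    // URL that youtube-dl resolves. Assumes youtube-dl (or yt-dlp) is on PATH.
    package main

    import (
        "fmt"
        "net/http"
        "os/exec"
    )

    func main() {
        http.HandleFunc("/audio", func(w http.ResponseWriter, r *http.Request) {
            video := r.URL.Query().Get("v")
            if video == "" {
                http.Error(w, "missing ?v=<youtube url>", http.StatusBadRequest)
                return
            }
            // -g prints the resolved media URL, -f bestaudio picks an audio-only format
            out, err := exec.Command("youtube-dl", "-g", "-f", "bestaudio", video).Output()
            if err != nil {
                http.Error(w, "youtube-dl failed", http.StatusBadGateway)
                return
            }
            fmt.Fprint(w, string(out)) // load this URL in the browser, or download it for later
        })
        http.ListenAndServe(":8080", nil)
    }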

Another benefit is battery life (decoding video is expensive), plus being able to turn off the screen (a feature Google banned from the app store years before making their own paid version available...). Later I learned about F-Droid and NewPipe... good times :)


The Antarctica Internet one reminded me of http://hn.zip, which caches as much as it can for spotty connections.


The https://www.hn.zip/ website links to a GitHub repo, https://github.com/lukeschaefer/hn.zip, but that repo link currently gives a 404. Maybe the owner of the site forgot to make the repo public or something. I can't find any other repo on their profile that appears to be for that project either.


Ah that's on me, it's a private repo because my code is embarrassing. But behind the scenes here's how it works:

- I wrote my own scraper for HN for reasons I no longer remember

- The scraper is called every 30 seconds by an Apps Script timer on a Google Sheet (to keep it dirt cheap; Vercel doesn't have free cron jobs).

- It caches the front page, plus ten comments per post, into a single Firestore document (to save money on reads vs. 20 reads per load).

- Home page loads via SSR with Nuxt.

- Post pages load the rest of the comments from a separate doc for each post.

- On a halfway decent connection the other comments load instantly on navigation; if not, you still get the first 10 comments to get an idea of the discussion while you wait.

- Added some Workbox stuff to get it offline-first.

- Added an IndexedDB wrapper around Firestore for more control over cache behavior, I think.

- It's worked for over a year now with no oversight needed, and it hasn't cost a penny!

Then I got distracted because I wanted to rebuild it from scratch without Nuxt, and I never completed that rewrite. It's been a while since I've worked on it; I had no idea other people even knew it existed! There are definitely some bugs (I think the CSS breaks on deeply nested comments), but it worked well enough for my uses, so I moved on.
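If anyone wants to replicate the single-document idea without the Firestore/Apps Script parts, here's a rough sketch of just the bundling step against the public HN Firebase API (illustrative only, not the actual hn.zip code; the shape of the bundle is made up):

    // Sketch: fetch the front page and the first ten comments per story from
    // the public HN API and pack everything into one JSON blob, so a client
    // needs a single cached read instead of one read per story.
    package main

    import (
        "encoding/json"
        "fmt"
        "net/http"
    )

    const api = "https://hacker-news.firebaseio.com/v0"

    type Item struct {
        ID    int    `json:"id"`
        Title string `json:"title"`
        Text  string `json:"text"`
        Kids  []int  `json:"kids"`
    }

    func fetchItem(id int) (Item, error) {
        var it Item
        resp, err := http.Get(fmt.Sprintf("%s/item/%d.json", api, id))
        if err != nil {
            return it, err
        }
        defer resp.Body.Close()
        return it, json.NewDecoder(resp.Body).Decode(&it)
    }

    func main() {
        var ids []int
        resp, err := http.Get(api + "/topstories.json")
        if err != nil {
            panic(err)
        }
        json.NewDecoder(resp.Body).Decode(&ids)
        resp.Body.Close()

        bundle := map[string]any{}
        for _, id := range ids[:30] { // roughly the front page
            story, err := fetchItem(id)
            if err != nil {
                continue
            }
            var comments []Item
            for _, kid := range story.Kids {
                if len(comments) == 10 { // only the first ten comments per post
                    break
                }
                if c, err := fetchItem(kid); err == nil {
                    comments = append(comments, c)
                }
            }
            bundle[fmt.Sprint(id)] = map[string]any{"story": story, "comments": comments}
        }
        blob, _ := json.Marshal(bundle)
        fmt.Println(len(blob), "bytes to store as a single cached document")
    }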


>[from profile:] Recently quit my job, now I work on anything that holds my interest (until I run out of money)

I did the same thing... lots of fun, but the "run out of money" part is rough! I resorted to freelancing, and now I'm going back to university.



Neat! Thanks for the details :)


Author of hn.zip here - thanks so much for the shout out! I never really publicize it and never 100% finished it, but I'm so glad people are using it! It's basically how I browse HN on my phone and once it reached 90% good enough I stopped working on it haha.


Can you do this with every website in the world?? I’m kidding of course but man, it pains me to browse the web as it is today knowing that things could be “this good”.

I just tried it and it was an amazing experience.


This is pretty amazing on my fast connection. Makes you realize how slow things actually are when you use something like this.


Great link! Might have found a bug in Chrome with it though :)

I was curious to see more about the internals, so I opened Chrome DevTools; as soon as I clicked the "Application" tab, it crashed Chrome, and it did so every time.


What do you mean by Bluetooth tunnel? I'm curious about that setup.


The best signal was had by hanging my phone from the ceiling. Since the phone itself was connected to the cafe wifi, I couldn't also use wifi to connect to the phone, so I shared the phone's (cafe wifi) internet over Bluetooth. (I didn't know that was a thing until I tried it!)

(I think I was able to connect to that with another phone and set up a wifi hotspot for all my devices... but it's been a while!)


Maybe it was something like a Raspberry Pi that he placed at that cafe? Curious that his Bluetooth had more reach than the Wifi itself though. Maybe with a directional antenna. Agree, would be interesting to know his setup.


Same here! Please provide some details, very curious.


> distraction-free Hacker News terminal client.

This seems contradictory :)


I have been working out plans for making an interesting subset of the internet available offline (along with a search engine).

However, I realized that my entire motivation for working offline was that it eliminates distraction. So it would defeat the purpose!

However I'd still much rather be distracted on Wikipedia (or even HN) than other sites (esp. those with primarily images, gifs or video).

I might need to segment it into "tiers", and have more fun parts come "online" as the day progresses...


That sounds awesome. I have actually been working on something similar myself. I have been building a distributed NATS[0] cluster to funnel data down from the internet and index it locally. I also have a local Ollama box wired up to try to do some NLP on it.

I started with Benthos to pull data in, but since the recent license changes[1] I am rethinking that :(

If you would like to collaborate feel free to shoot me an email!

[0] https://nats.io/

[1] https://news.ycombinator.com/item?id=40537911
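The funnel part is roughly just publishing fetched documents onto a subject that downstream indexers subscribe to. A minimal sketch with the official Go client (the subject name "crawl.pages" is only an example):

    // Sketch: push a fetched page into NATS so indexing/NLP workers elsewhere
    // in the cluster can subscribe to it. Assumes a local NATS server.
    package main

    import (
        "io"
        "log"
        "net/http"

        "github.com/nats-io/nats.go"
    )

    func main() {
        nc, err := nats.Connect(nats.DefaultURL) // nats://127.0.0.1:4222
        if err != nil {
            log.Fatal(err)
        }
        defer nc.Drain()

        resp, err := http.Get("https://example.com/")
        if err != nil {
            log.Fatal(err)
        }
        body, _ := io.ReadAll(resp.Body)
        resp.Body.Close()

        // Indexers subscribe to "crawl.pages" and process each payload.
        if err := nc.Publish("crawl.pages", body); err != nil {
            log.Fatal(err)
        }
    }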


No more e-worms! https://www.youtube.com/watch?v=Y5BZkaWZAAA

Hope you call it the Innernette :)


If you're looking at that along with a text-based interface, then be sure to check out offpunk [1] if you haven't already.

[1] https://sr.ht/~lioploum/offpunk/


Have you seen https://devdocs.io/ ? Even has an offline mode.


I made a similar project, https://github.com/dominickp/hn, but I really like how your version shows a text rendering of the article in the UI.


Reminds me of HN Gopher[1], which is also a great text-only way of reading, although not necessarily browsing, HN.

1. https://github.com/michael-lazar/hn-gopher


I used to enjoy this alternative CLI

https://github.com/donnemartin/haxor-news

But the Docker image seemed to disappear from Docker Hub, so I stopped using it.


Haxor-news was the first terminal client I used for HN. Since then I have tried a few others, but haxor is the one I used the most. When I worked in the office, if people saw you fiddling around in the terminal they would usually leave you alone, assuming you were on the HPC or doing other work. Now that I work from home, I haven't used a terminal client even once. Maybe I'll check this one out just for fun.

PS: thanks for the reminder of this one; I was searching for it to post the exact same comment.


I've tried this one, and I found the UX of typing numbers to select the story I want to read a bit unpleasant. (Hence I wrote hn-text, where I can navigate and open articles visually, using only one hand.)


Haxor news is the best way to browse HN.


Would you consider adding armhf as a build target? This would make it useful for a wide range of older devices including Raspberry Pi 1 or 0, which are more likely to be used via console interface.


Good point, will look into it, thanks!


This reminds me of an HN post from way back: an article about "min" versions of websites, with images and large media filtered out, so you get the bare minimum, usually just text.


Great companion tool to tailhn (https://github.com/chfritz/tailhn), which allows you to `tail` all new HN posts in the terminal, where you can use `grep` and other tools to filter, e.g., `tailhn | grep -i robot` to see all new posts about robotics.


Is there a way to refresh the front page? Or do you just quit and start it again?


Very good feature request, will definitely implement next :)

(So for now, yes, press q to quit and run it again)


When launching it, the user can just wrap it in 'while true ; do .... ; done' and hey presto, 'q' is the refresh key.


That's pretty clever and works quite nicely (I just tried)!


I love simple solutions, because my default setting is usually to build out the hardest thing possible, lol.


Thanks, I've been looking for something like this for a while!

Hope you don't mind a few additional suggestions/feature requests!

- Single-key scroll up/down a page of text (I can work around it on my laptop keyboard with fn-down, but would prefer not to have to use two hands).

- Maybe customizable key bindings? (e.g., I'm used to hitting space to scroll down a page of text, and don't need to open the story in a browser so often)

- Key binding to toggle "best", or even better, filters to view only the top N stories (I use this[1] site a lot for that).

[1] https://hckrnews.com/


Thanks for the good ideas. CTRL+F (forward) and CTRL+B (backward) already work; they come from the underlying library (tview) and are the standard vim keymaps for that action.

But yeah, keymaps should become configurable eventually.
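In the meantime, if anyone wants to experiment, remapping keys on top of tview looks roughly like this (a generic sketch, not the actual hn-text code):

    // Sketch: remap keys on a tview TextView so space pages down and 'b'
    // pages up, by translating them to the built-in Ctrl-F/Ctrl-B actions.
    package main

    import (
        "github.com/gdamore/tcell/v2"
        "github.com/rivo/tview"
    )

    func main() {
        app := tview.NewApplication()
        text := tview.NewTextView().SetText("...story text...")

        text.SetInputCapture(func(ev *tcell.EventKey) *tcell.EventKey {
            switch ev.Rune() {
            case ' ': // space: scroll down one page
                return tcell.NewEventKey(tcell.KeyCtrlF, 0, tcell.ModNone)
            case 'b': // b: scroll up one page
                return tcell.NewEventKey(tcell.KeyCtrlB, 0, tcell.ModNone)
            }
            return ev // everything else passes through unchanged
        })

        if err := app.SetRoot(text, true).Run(); err != nil {
            panic(err)
        }
    }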


A refresh feature for the front page has now been added in 0.1.3 (activated by pressing r).


It would be nice to publish a Docker image or have a Nix flake to run this, as an alternative to the binaries.


Lynx

Or you could just use the terminal browser Lynx, which will display all sites as text only.

https://en.wikipedia.org/wiki/Lynx_(web_browser)


Unfortunately the hierarchical comments are not indented properly with Lynx.

The topic listings work just fine, though. Although if I were using Lynx just for the headlines, I might start at https://brutalist.report/ instead of Hacker News.


That's because comments are indented with an invisible gif (https://news.ycombinator.com/s.gif), not with CSS. Maybe you could add a custom stylesheet to Lynx to make the indent work? Some prior art: https://news.ycombinator.com/item?id=12098017
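For what it's worth, a text client can recover nesting depth from that spacer's width attribute, which HN scales with depth (roughly 40px per level, from what I can see). A rough sketch:

    // Sketch: derive a comment's indent level from the width of the s.gif
    // spacer in HN's comment markup. The 40px-per-level factor is an
    // observation, not documented behavior.
    package main

    import (
        "fmt"
        "regexp"
        "strconv"
    )

    var spacer = regexp.MustCompile(`s\.gif"[^>]*width="?(\d+)`)

    func depth(commentHTML string) int {
        m := spacer.FindStringSubmatch(commentHTML)
        if m == nil {
            return 0
        }
        w, _ := strconv.Atoi(m[1])
        return w / 40 // one indent level per 40px of spacer width
    }

    func main() {
        fmt.Println(depth(`<td class="ind"><img src="s.gif" height="1" width="80"></td>`)) // prints 2
    }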


> Maybe you could add a custom stylesheet to Lynx to make the indent work?

Lynx Style Sheets (LSS) don't really cover that kind of scenario from what I remember[1] - they're more "apply this ncurses/slang-style to this element".[2]

[1] I wrote the original code but I am old and increasingly senile.

[2] I've not really been involved for ~20 years but unless the internal "renderer" has been radically overhauled, supporting anything like "real" CSS isn't really doable.


What about using browsh[0] over SSH/Mosh/HTTP?

[0]: https://www.brow.sh/


Got a (hopefully false-positive) virus alert on Windows 11 from Defender:

> Trojan:Win32/Sabsik.FL.A!ml

> Details: This program is dangerous and executes commands from an attacker.

Maybe due to UPX?


I see it's written in Go. Some antivirus engines love to flag Go executables because Go bundles the whole runtime inside the binary, so the heuristics mark it as a virus just because it shares the same Go runtime code as some other random, unrelated virus written in Go. (The antivirus has no idea it's just runtime code; it just sees that the executables' machine code matches by something like ~95%.)


Very good, something I would have written as a side project if I had the motivation.

Same language and libraries

I just don't understand why web scraping; wasn't there an API?


If you have an account with showdead turned on, you will see text from flagged (but not dead) stories through the web interface. You don't get those through Firebase.


Thanks for sharing! I browsed the source code, and Go seems like a pleasant language! Either that or the author wrote very pretty code.


Go code has a tendency to look the same no matter who writes it.


Looks pretty standard to me. I do tend to find Go quite pleasant.


For sure! I don't work with Go, so I'm looking at it from a noob perspective.


Interesting, congratulations! I'm very fond of these hobby projects.

By the way, my one and only wish for HN maintainers is to have a dark mode - please...


Until an official dark mode:

https://news.ycombinator.com/item?id=40351227 with 8 comments links to last year's discussion with 90 comments:

https://news.ycombinator.com/item?id=36342085


Looks cool. Love terminal apps. I'm currently still fascinated and happy with Spotify Terminal.

Great work. I'd love to have the option of following someone and getting a feed of their comments or stories.


Thanks for doing this. I'll give it a try.

I took a quick look at your website, and your photography looks really beautiful. Are you based in Copenhagen?


Thanks! Yes, Copenhagen :)


A few more steps and we'll have HN over NNTP, which would give me a reason to dust off my slrn config from god knows where it is now.


I just came here to say, sincerely, that I love that you kept this so simple and just put everything in one single main package, with well-named files, rather than googling "the proper go project structure" and ending up with something unnecessary like:

     ⌞ app
         ⌞ cli.go <- actual package main
     ⌞ internal // (or pkg, or pkg/internal)
         ⌞ parser
             ⌞ parser.go
             ⌞ parser_test.go
 
         ⌞ ui
             ...etc


Is Unicode text hidden in the web client, scrubbed out client-side on submission, or erased when received by the server?

Does your client let us read and write full Unicode?

test string: ;


cool project! nice


I keep seeing HN clients again and again, to the point that... it's tiresome. Why do you think there's a need to create a client for a website?


The Emacs browser eww does the same thing.


Could this be done for DOS as well?


I mean, really cool, but I don't need 20+ Hacker News tabs in my terminal too; I have enough in Chrome, lol.


Very nice! Now if it could just support offline use like the .qwk readers from the BBS days.



