
Like what you see? Subscribe here and get it every week in your inbox!

Issue #247 - December 3, 2023

If you are looking for work, check out this month's Who is hiring?, Who wants to be hired? and Freelancer? Seeking Freelancer? threads.

Here are the top threads of the week, happy reading!

Top comment by thaumaturgy

The end result was a stalemate. Reddit did not change any of its policies. Enough of the people responsible for posting and managing content left the platform to cause a noticeable impact on it.

Here's a fun thing to look at: https://subredditstats.com/ for any major subreddit, e.g.:

https://subredditstats.com/r/worldnews

https://subredditstats.com/r/explainlikeimfive

https://subredditstats.com/r/videos

All of the most popular subreddits show a steady decline from 2019 to present, with a sharp drop in July 2023. Once this happens to a platform, it's rare for the platform to ever get those users back at scale. It's safe money that Reddit will now be a zombie platform, a la Slashdot -- still up and running with some users, but with flat or declining activity forever.

Top comment by wyclif

I hesitate to post here because I really don't like talking about my problems and have never written anything like this before.

I'm one of the original members of HN and I've been here for a long time; since week one I think.

I'm an American technologist who has done web dev work in the past, but right now I'm stuck on an island in the Philippines without income or employment, and things are getting pretty desperate since it's this close to Christmas and my wife and I have two young children (my youngest is five). The pantry and coffers are empty at this point. The reason I'm here is because my family are citizens of this country and although I am trying to get the immigration paperwork together to bring them to the States, there's quite a lot of red tape and it's taking a lot longer than I expected.

Until recently I had been working remotely, but the company I had been with was sold and folded into a bigger corporation, and I was laid off along with our dev team.

My goal for now is to find a remote entry-level back-end or front-end role. I am working on my Python-fu, and I also have experience with HTML/CSS/JavaScript. I am interested in cloud-based technologies.

Since then I've been sending resumes out and doing interviews, but I'm not getting any response or feedback so far whatsoever... not a single bite. The recent trend of BTO (Back to the office) has been killing my chances, I think. I'm willing to take just about any job, help desk, on call, roles that are APAC time zone-only, anything as long as I can do it from here in the PH.

I know this is a long shot but things are getting pretty desperate for me. My family is basically rationing and we're living very close to the edge of bankruptcy. I would be overwhelmed with gratitude if there was help, but most of all I want to just help myself. I am a hard worker and enthusiastic about work in general, and I have a technical background and have worked in several technical roles in the past. Thank you and happy holidays to whoever reads this!

Top comment by svat

A little bit of history about the book series may help explain what is in it.

In 1956, Knuth graduated from high school and entered college, where he encountered a computer for the first time (the IBM 650, to which the series of books is dedicated). He took to programming like a fish to water, and by the time he finished college in 1960 he was a legendary programmer, single-handedly writing several compilers on par with or better than professional ones (and making good money too). In 1962, when he was a graduate student (and also, on the side, a consultant to Burroughs Corporation), the publisher Addison-Wesley approached him with a proposal to write a book about writing compilers (given his reputation), as these techniques were not well known. He thought about it and decided that the scope ought to be broader: programming techniques themselves were not well known, so he would write about everything: “the art of computer programming”.

This was a time when programming a computer meant writing in that computer's machine code (or in an assembly language for that machine), and some of those computers were little more than simple calculators with branches and load/store instructions. The techniques he would have to explain were things like functions/subroutines (a reusable block of assembly code, with some calling conventions), data structures like lists and tries, how to do arithmetic (multiplying integers and floating-point numbers and polynomials), and so on. He wrote up a 12-chapter outline (culminating in "compiler techniques" in the final chapter) and wrote a draft against it. When the draft turned out to be far too long, the plan became to publish it in seven volumes.

He had started the work with the idea that he would just be a “journalist” documenting the tricks and techniques of other programmers without any special angle of his own, but unavoidably he developed his own angle (the analysis of algorithms). He suggested to the publishers that the book be renamed “The Analysis of Algorithms”, but they said it wouldn't sell, so ACP (now abbreviated TAOCP) it remained.

He polished up and published the first three volumes in 1968, 1969, and 1973, and his work was so exhaustive and thorough that he basically created the (sub)field. For example, he won a Turing Award in 1974 (for writing a textbook, in his free time, separate from his research job!). He has been continually polishing these books (Vols. 1 and 2 are in their third editions, which came out in 1997, and each is already at nearly its 50th distinct printing), offering rewards for errors and suggestions; Volume 4A came out in 2011 and Volume 4B in 2023 (late 2022, actually).

Now: what is in these books? You can look at the chapter outlines here: https://en.wikipedia.org/w/index.php?title=The_Art_of_Comput... — the topics are low-level (he is interested in practical algorithms that one could conceivably want to write in machine code and actually run, to get answers) but covered in amazing detail. For example, you may think that there's nothing more to say about the idea of “sequential search” than “look through an array till you find the element”, but he has 10 pages of careful study of it, followed by 6 pages of exercises and solutions in small print. Then follow even more pages devoted to binary search. And so on.

(The new volumes on combinatorial algorithms are also like that: I thought I'd written lots of backtracking programs for programming contests and whatnot, and “knew” backtracking, but Knuth exhausted everything I knew in under a page, and followed it with dozens and dozens of pages.)

If you are a certain sort of person, you will enjoy this a lot. Every page is full of clever and deep ideas: Knuth has basically taken the entire published computer-science literature on each topic he covers, digested it thoroughly, passed it through his personal interestingness filter, added some of his own ideas, and published it in carefully written pages of charming, playful prose. It does require some mathematical maturity (say, at the level of a decent college student or a strong high school student) to read the mathematical sections, or you can skim through them and just get the ideas.

But you won't learn about, say, writing a React frontend, or a CRUD app, or how to work with Git, or API design for software engineering in large teams, or any number of things relevant to computer programmers today.

Some ways you could answer for yourself whether it's worth the time and effort:

• Would you read it even if it wasn't called “The Art of Computer Programming”, but was called “The Analysis of Algorithms” or “Don Knuth's big book of super-deep study of some ideas in computer programming”?

• Take a look at some of the recent “pre-fascicles” online, and see if you enjoy them. (E.g. https://cs.stanford.edu/~knuth/fasc5b.ps.gz is the one about backtracking, and an early draft of part of Volume 4B. https://cs.stanford.edu/~knuth/fasc1a.ps.gz is “Bitwise tricks and techniques” — think “Hacker's Delight” — published as part of Volume 4A. Etc.)

• See what other people got out of the books, e.g. these posts: https://commandlinefanatic.com/cgi-bin/showarticle.cgi?artic... https://commandlinefanatic.com/cgi-bin/showarticle.cgi?artic... https://commandlinefanatic.com/cgi-bin/showarticle.cgi?artic... are by someone who read the first three volumes in 3 years. For a while I attended a reading group (some recordings at https://www.youtube.com/channel/UCHOHy9Rjl3MlEfZ2HI0AD3g but I doubt they'll be useful to anyone who didn't attend), and we read about 0.5–2 pages an hour on average IIRC. And so on.

I find reading these books (even if dipping into only a few pages here and there) a more rewarding use of time than social media or HN, for instance, and wish I could make more time for them. But everyone's tastes will differ.

Top comment by RheingoldRiver

Most people know MediaWiki even if they don't realize it, because it powers Wikipedia, but I wish more people used it for documentation.

You can create highly specialized templates in Lua, and there's an RDBMS extension called Cargo that gives you some limited SQL ability too. With these tools you can basically build an entirely custom CMS on top of the base MW software, while retaining everything that's great about MW (easy page history, anyone can start editing, including with a WYSIWYG editor, really fine-grained permissions control across user groups, a fantastic API for automated edits).
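As a concrete illustration of that "fantastic API for automated edits", here is a minimal sketch of a scripted edit through MediaWiki's action API, assuming a bot account with a bot password. The wiki URL, credentials, and page title are placeholders, and a real bot would add error handling and rate limiting on top of this.

```python
# Minimal sketch of an automated edit via the MediaWiki action API.
# The endpoint, bot credentials, and page title below are placeholders.
import requests

API = "https://wiki.example.org/w/api.php"
session = requests.Session()

# 1. Fetch a login token, then log in with a bot password.
login_token = session.get(API, params={
    "action": "query", "meta": "tokens", "type": "login", "format": "json",
}).json()["query"]["tokens"]["logintoken"]

session.post(API, data={
    "action": "login", "lgname": "DocBot@edits", "lgpassword": "bot-password",
    "lgtoken": login_token, "format": "json",
})

# 2. Fetch a CSRF token and append a line to a documentation page.
csrf_token = session.get(API, params={
    "action": "query", "meta": "tokens", "format": "json",
}).json()["query"]["tokens"]["csrftoken"]

result = session.post(API, data={
    "action": "edit",
    "title": "Internal docs/Deployment checklist",
    "appendtext": "\n* Reviewed by DocBot",
    "summary": "Automated checklist update",
    "token": csrf_token,
    "format": "json",
})
print(result.json())
```

The same session-plus-token pattern covers most automated workflows against a wiki (bulk edits, page creation from templates, and so on).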

It doesn't have the range of plugins for external services that something like Confluence has, but you can host it yourself and have a great platform for documentation.

Top comment by runnerup

Almost all universities use a lot of Oracle SQL, are generally very accepting of older workers, and have a working environment that is friendly to aging workers. The salaries aren't excellent, but the comfort is high.

Top comment by semireg

$500/month (3 figures) is a great milestone. I remember talking to a mentor who said, “you have something people will buy; stop working on features and focus just a bit on marketing.” It was good advice, because now I’m regularly generating revenue of 4 figures per day.

Rewind 5 years…

I was so frustrated that I couldn’t easily design and print barcode labels on my Mac using a spreadsheet. I opened up VS Code, wrote a crude MVP in React, and got it working in Electron.

I put it on the Mac App Store and the next morning had my first sale, in the Caribbean of all places. It’s now making 5 figures per week.

Last year I decided to start another side hustle, importing and selling high-quality label printers bundled with my app; it’s now making 4 figures per week with a growing reseller network.

Design App at https://label.live

Label Printers at https://mydpi.com

Edit: I’m still consulting so I still consider these two projects my side hustles.

Top comment by roosgit

Back in April I bought some parts to build a PC for testing LLMs with llama.cpp. I paid around $192 for a B550MH motherboard, an AMD Ryzen 3 4100, 1x16GB of Kingston ValueRAM DDR4, and a 256GB M.2 SSD. I already had an old PC case with a 350W PSU.

I was getting 2.2 tokens/s with llama-2-13b-chat.Q4_K_M.gguf and 3.3 tokens/s with llama-2-13b-chat.Q3_K_S.gguf. With the Q4_K_M versions of Mistral and Zephyr, I was getting 4.4 tokens/s.

A few days ago I bought another 16GB stick of RAM ($30) and, for some reason that escapes me, the inference speed doubled. So now I'm getting 6.5 tokens/s with llama-2-13b-chat.Q3_K_S.gguf, which for my needs gives the same results as Q4_K_M, and 9.1 tokens/s with Mistral and Zephyr. Personally, I can barely keep up with reading at 9 tokens/s (if I also have to process the text and check for errors).

If I wasn't considering getting an Nvidia 4060 Ti for Stable Diffusion, I would seriously consider a used RX 580 8GB ($75) and running Llama Q4_K_M entirely on the GPU, or offloading some layers when using a 30B model.
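(A likely explanation for the mysterious doubling above: llama.cpp's CPU inference is largely memory-bandwidth bound, and adding a second DDR4 stick moves the system from single-channel to dual-channel memory, roughly doubling bandwidth.) For anyone who wants to reproduce numbers like these, here is a small sketch using the llama-cpp-python bindings; the model path, thread count, and prompt are placeholders, and n_gpu_layers is the knob the RX 580 offloading idea would use (it requires a GPU-enabled build).

```python
# Rough tokens/s measurement with llama-cpp-python (pip install llama-cpp-python).
# Model path, thread count, and prompt are placeholders; speeds depend heavily
# on quantization (Q3_K_S vs Q4_K_M) and on memory bandwidth.
import time
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-13b-chat.Q4_K_M.gguf",  # any GGUF file works here
    n_ctx=2048,
    n_threads=4,      # match the number of physical cores
    n_gpu_layers=0,   # set > 0 to offload layers to a GPU (needs a GPU build)
)

prompt = "Explain in two sentences what quantization does to a language model."
start = time.perf_counter()
out = llm(prompt, max_tokens=128)
elapsed = time.perf_counter() - start

generated = out["usage"]["completion_tokens"]
print(f"{generated} tokens in {elapsed:.1f}s -> {generated / elapsed:.1f} tokens/s")
```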

Top comment by cplantijn

I think React Native is fantastic for getting basic applications out the door that need camera, map, browser, storage, and similar capabilities. However, once you need more intricate functionality, I think it's best to write modules in Swift (or Kotlin/Java) and have your React Native UI communicate with those modules via the bridge.

I think React Native solves an organizational-bandwidth issue. True native applications built per platform will always outshine anything you get from React Native, in both performance and capability. However, if you have just a few developers, I think React Native is a practicable compromise.

Companies like Airbnb have the engineering bandwidth to develop bespoke, platform-specific applications; it's up to you and/or the company you work for to know what compromises can be made.

Top comment by Uehreka

My honest advice: Get out of the Hacker News hivemind.

The folks on this site act like every programming job is either at a FAANG/MAGMA company, a startup, or not in tech. They are wrong.

If you are willing to look outside the Valley (but still in almost any big city or suburb), there are oceans of tech companies that work in, like, financial compliance dashboards, or healthcare documentation, or (if you’re OK with it) defense contracting. You can make $150-250k easy. Your life will involve waking up, attending a standup on Google Meet, adding an endpoint to a REST API, and sending it off to be tested; around 2 you fix whatever the testers found, send it off again, then clock out once there’s nothing else in your swim lane (possibly before 5).

You won’t get equity, and you won’t have the outlandish salaries people are always talking about on HN, but you’ll also have the freedom to not think about your job outside of work hours. If what you want is to leverage your skills into a comfortable life where you can spend time with your kids, this is the move.

I’ve seen people burn out and try to leave tech for greener pastures; it often doesn’t turn out as well as they expect. In other fields, you’ll be a lot more fungible, and you’ll come to realize how much cushion and leverage developers get from being a somewhat scarce resource.

Top comment by addaon

If it is possible to make a simulation which matches our experience, then it is likely possible to make an unbounded number of such simulations. Thus, if such simulations are possible, it is vanishingly unlikely that we are executing directly on the underlying substrate.

If it is not possible, then, well, it's not.

So to a good approximation, the question "do you believe it is more likely than not that we are living in a simulation?" is equivalent to the question "do you believe that a simulation of the phenomena you have observed is possible?"

And... well, sure, there's not a strong reason to think it's /impossible/, based on the evidence available to us. So, yeah, more likely than not.

Another way of phrasing this: do you think it's more likely than not that there's some physical law, as yet undiscovered, that makes high-fidelity simulation impossible? Such a law is certainly imaginable (limits on information density, magical-ness of souls, whatever); but if you don't have a reason to believe such a law is likely, then you probably believe we are more likely than not in a simulation.
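To make the "vanishingly unlikely" step explicit, under the (strong) assumption that simulated and substrate observers are indistinguishable and counted equally: if the substrate civilization runs N such simulations, a random observer finds themselves on the substrate with probability P = 1 / (N + 1), which goes to zero as N grows without bound.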