Like what you see? Subscribe here and get it every week in your inbox!

Issue #285 - August 25, 2024

Here are the top threads of the week, happy reading!

Top comment by allknowingfrog

You'll find yourself choosing between getting "your" work done and doing everything else on the list. It's important to accept that enabling everyone else is part of your job. In my experience, code review is the single most important thing you can do to mentor your team, maintain your code quality, and monitor your progress.

A daily standup ensures that no one spins their wheels for more than a day without giving you a chance to do something about it, which is really helpful for a junior team. However, remember that this meeting is for surfacing issues, not for solving them. Schedule follow-ups with the people who need it and let everyone else get back to work.

Document things when you get tired of answering the same question repeatedly. Focus on ROI, not some higher principle of best practice. You can get a lot of mileage out of conventions and tribal knowledge.

Top comment by 8organicbits

The recent relicensing of Redis to a non-open-source license bothered many in the community, but the groundwork for the relicensing was laid much earlier. I've been working on a relicensing monitor to track the project attributes that can affect how easily a project can be relicensed.

https://alexsci.com/relicensing-monitor/

Top comment by lylejantzi3rd

The traditional career progressions for software developers are management, entrepreneurship, and carpentry.

Top comment by Hogg

I'd bet just about anything that Google uses machine learning to decide whether or not to trust a site for ads. It seems like the only solution that would work at a large enough scale to handle that kind of demand (versus more narrowly defined but more labor- and resource-intensive malware/fraud detection). I think that also explains why the review process seems so arbitrary and ineffective - in essence, not even Google knows why Google decided your site was bad. I used to help people with hacked websites, but eventually I had to refuse to work on projects where the only symptom was a Google Ads denial, because it was such nonsense. In one case a guy completely removed his site and replaced it with a 0-byte page, and even after we saw Google-owned IP addresses crawling the site in the access logs, they still told him there was malware (including a list of infected URLs that no longer existed).

If I'm correct, changing your domain might help, in that machine learning algorithms consume tons of signals, and altering that particular one might push your site under the "bad" threshold. But it might not do anything. It's a super frustrating problem. I hope you can stumble onto a solution or find someone at Google willing to help.

Top comment by oneplane

Scoping this to users, it wouldn't change much at all. For data transfers you would also need storage, memory and CPUs that can handle it. For streaming it also doesn't change much since even a 4K HDR stream would work on a legacy VDSL2 system. Same goes for remote compute.

For non-users (datacenters, companies, cluster systems, etc.) it might mostly help with distributed storage, but again, you'd also need all the other things, because distributing anything like that means every piece of the puzzle needs to be able to handle it. Within a datacenter or a public cloud, even a top speed of 400Gb/s (which is less than half of the thought experiment) is at such a high tier that it isn't useful for individual applications or nodes.

What would actually make an impact once you get past around 2Gbps is lower latency, lower jitter, net neutrality (including getting a full connection with no administrative limits), and more peered routes. More bandwidth doesn't really do much, especially since you can't use that bandwidth anyway if the number of connections, the latency, and the computing equipment don't scale with it.

When you have low enough latency combined with high enough bandwidth you can start to actually get new apps and also develop novel use cases (imagine a CPU-to-CPU interconnect that can span continents). But the speed of light (or some more exact latency-per-distance quantity) prevents that.

Beyond the likes of BitTorrent and Gnutella, we're not likely to see network-based ideas that are currently impossible purely because of limits on average speed. Perhaps the real problem right now is the lack of universal availability of reasonable connectivity.

Top comment by bheadmaster

> I sometimes come across mp3 files with a high bitrate, but they sound bad, which suggests that they were re-encoded from a bad/low-bitrate source.

You could try re-compressing the mp3 file at lower and lower bitrates and checking the amplitude of the differences. Since mp3 is a lossy codec, there will always be a slight difference, but you should see a sudden jump in the difference once the re-encode bitrate drops below the "true" encoding bitrate.

You could probably write a script for it using ffmpeg and some other tools to generate a bitrate-difference chart.
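
As a rough sketch of that idea (not from the original comment), the following Python script shells out to ffmpeg, re-encodes the file at decreasing bitrates, and prints the RMS amplitude of the difference signal. It assumes ffmpeg is on the PATH, that numpy and soundfile are installed, and "suspect.mp3" is a placeholder filename:

```python
# Sketch only: estimate the "true" bitrate of an mp3 by re-encoding it at
# decreasing bitrates and measuring how much the audio changes each time.
import os
import subprocess
import tempfile

import numpy as np
import soundfile as sf


def decode_to_wav(src, dst):
    # Decode any input to 16-bit PCM WAV so samples can be compared directly.
    subprocess.run(["ffmpeg", "-y", "-i", src, "-acodec", "pcm_s16le", dst],
                   check=True, capture_output=True)


def rms_difference(original_mp3, bitrate_kbps):
    # Re-encode at the given bitrate, decode both versions, and return the
    # RMS amplitude of the difference between the two sample streams.
    with tempfile.TemporaryDirectory() as tmp:
        ref_wav = os.path.join(tmp, "ref.wav")
        enc_mp3 = os.path.join(tmp, "enc.mp3")
        enc_wav = os.path.join(tmp, "enc.wav")
        decode_to_wav(original_mp3, ref_wav)
        subprocess.run(["ffmpeg", "-y", "-i", original_mp3,
                        "-b:a", f"{bitrate_kbps}k", enc_mp3],
                       check=True, capture_output=True)
        decode_to_wav(enc_mp3, enc_wav)
        ref, _ = sf.read(ref_wav)
        enc, _ = sf.read(enc_wav)
        # Crude alignment: trim to the shorter stream. Encoder delay is not
        # compensated, so absolute values are rough; look for the jump.
        n = min(len(ref), len(enc))
        diff = ref[:n] - enc[:n]
        return float(np.sqrt(np.mean(diff ** 2)))


if __name__ == "__main__":
    for kbps in (320, 256, 192, 160, 128, 96, 64):
        print(f"{kbps:3d} kbps -> RMS diff {rms_difference('suspect.mp3', kbps):.5f}")
```

Plotting the printed values (re-encode bitrate against RMS difference) should show the sudden jump the comment describes once the re-encode bitrate falls below the source's real bitrate.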

Top comment by SavageBeast

We tried to compete on price in a market that really wasn't as price sensitive as it claimed to be. Yes, some people wanted to "pay less", but in reality the big players were happy to pay the "industry standard" rate. Many of those big players, we would later find, were happy to pay far more. For every player who wanted to "pay less" there were easily 10 that would pay more to get the opportunities their competition wasn't willing to pay for.

Looking back I feel totally stupid about it.

Top comment by happytoexplain

Of the ~700 movies I've watched (not bragging, just context - actual movie buffs watch thousands by the time they're my age), those I happily rewatch are:

The Thing (1982)

Silence of the Lambs

Alien

Little Shop of Horrors (1986)

Young Frankenstein

Rocky Horror

Raising Arizona

O Brother, Where Art Thou?

Men in Black

Primer

But even those I've only seen a handful of times each. It takes a lot for me to want to watch/read/play something multiple times, as I'm the kind of person who always thinks, "but I could be trying something new, or getting something done".

Top comment by riedel

Even with a smartphone you will have a terrible experience if you do not use stock vendor software. On a custom ROM one needs to install Zygisk modules [0] to get around the Play Integrity madness (before that, SafetyNet). As this still does not rely on hardware attestation, it could work in emulators too. What we need are court rulings against this! Banks effectively force us to hand over our data and sell our souls to Google and Apple.

[0] https://github.com/chiteroman/PlayIntegrityFix/releases

Edit: seemingly, PoCs exist: https://xdaforums.com/t/poc-safetynet-bypass-for-emulators.4...

Top comment by thiht

If you’re a developer:

- SQL, this goes without saying, but if you’re only using an ORM without knowing SQL, you’re setting yourself up for failure.

- Regular expressions, they’re not that hard. Knowing how to read and write a regex without needing an external tool (I often use regex101, but I don’t need it) is a huge lifesaver. It also helps develop a good intuition for when to use a regex (or not); see the short sketch after this list.

- bash, a lot of people use it but never took the time to actually learn it (i.e. they can’t write a condition or a loop off the top of their head). You’ll use it your whole life; learn bash.
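
Purely as an illustration of the regex point above (not part of the comment), here is a small Python sketch of the kind of expression one might read and write unaided; the log line and field names are invented:

```python
import re

# Hypothetical access-log line, invented for illustration.
line = '203.0.113.7 - - [25/Aug/2024:10:15:32 +0000] "GET /index.html HTTP/1.1" 200 1024'

# Named groups keep the pattern readable without an external tool.
pattern = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>[A-Z]+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<size>\d+)$'
)

match = pattern.match(line)
if match:
    print(match.group("ip"), match.group("method"),
          match.group("path"), match.group("status"))
```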