We’ve been using an Apple TV. From memory, there’s a Jellyfin client.
Anything and everything Amateur Radio and beyond. Heavily into Open Source and SDR, working on a multi band monitor and transmitter.
#geek #nerd #hamradio VK6FLAB #podcaster #australia #ITProfessional #voiceover #opentowork
For the purpose of?
Venting? Warning? Praise?
Something else?
Well, clearly this is a credible source, it even has its own Substack domain, what could possibly be suss?
It’s really simple to use, and markdown is essentially plain text.
My go-to for this is pandoc: it takes markdown and can generate HTML, PDF, Word, OpenOffice and other formats.
Because it uses markdown, you can use version control and grep on your documentation and include it with your source code.
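As a sketch of what that looks like in practice (file names are made up, and the block is guarded so it does nothing on a machine without pandoc installed):

```shell
# Create a throwaway markdown file for the demo.
printf '# Notes\n\nMarkdown is *plain text*.\n' > /tmp/notes.md

if command -v pandoc >/dev/null 2>&1; then
    pandoc -s /tmp/notes.md -o /tmp/notes.html   # standalone HTML
    pandoc /tmp/notes.md -o /tmp/notes.docx      # Word
    pandoc /tmp/notes.md -o /tmp/notes.odt       # OpenOffice/LibreOffice
    # PDF output additionally needs a LaTeX engine installed, e.g.:
    # pandoc /tmp/notes.md -o /tmp/notes.pdf
fi
```

Because the source stays plain text, the same `notes.md` diffs cleanly in git and is searchable with grep.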
Bruce Perens is currently working on a new licensing model called Post Open, which requires businesses with sufficient revenue to pay up.
In my opinion it’s criminal just how often this happens: big businesses making obscene profits off the back of volunteer work like yours and that of many others across the OSS community.
Multiple camera angles are used for two reasons:
Paywalled, or so horribly broken that you cannot scroll past the first page using Firefox Focus on Android.
A female adult mosquito can live for a month or two, laying eggs every two to three days, between 100 and 300 eggs each time.
Also, they travel several kilometres for food (blood), so, going out on a limb here, you might need to scale up your efforts if you’re attempting to reduce the local population.
My first recommendation is to become familiar with one flavour of Linux. Debian is a solid choice and it will give you a good understanding of how a great many derivatives operate.
The command line is a tool to get things done, not an end in itself. Some things are easier to do with a GUI; many things are easier with the command line interface, or CLI.
Many Linux tools are tiny things that take an input, process it and produce an output. You can string these commands together to achieve things that are complex with a GUI.
Manipulation of text is a big part of this: converting things, extracting or filtering data, counting words.
For example, how many times do you use the words “just” and “simply” in the articles you write?
grep -oiwE "just|simply" *.txt | sort | uniq -c
That checks all the text files in a directory for the occurrence of either word and shows you how many occurred and what capitalisation they used.
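To try it on throwaway files (contents invented for the demo), a slight variation with an extra -h flag merges the counts across files instead of counting per file:

```shell
# Build two sample files in a scratch directory.
mkdir -p /tmp/word_count_demo && cd /tmp/word_count_demo
printf 'Just write. I just did.\n' > draft1.txt
printf 'Simply put, just write.\n' > draft2.txt

# -o prints each match on its own line, -i ignores case, -w matches
# whole words, -h drops the file name so counts merge across files.
grep -ohiwE "just|simply" *.txt | sort | uniq -c | sort -rn
```

The final `sort -rn` puts your most overused word at the top.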
In other words, learning to use the CLI is about solving problems, one by one, until you don’t have to look things up before you understand why or how it works.
Credit bureaus are not for your protection, they’re for the protection of their clients, the banks.
Excellent.
I think I might be able to create a fail2ban rule for that.
Is the page linked in the site anywhere, or just mentioned in the robots.txt file?
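One way to sketch such a rule. Every name here is hypothetical: the trap path, the filter name and the log path all depend on your setup (this assumes an nginx-style access log):

```ini
# /etc/fail2ban/filter.d/robots-trap.conf (hypothetical name)
[Definition]
# Match any client that requests the trap URL listed only in robots.txt.
failregex = ^<HOST> .* "(GET|POST) /secret-trap/

# /etc/fail2ban/jail.local (append)
[robots-trap]
enabled  = true
filter   = robots-trap
port     = http,https
logpath  = /var/log/nginx/access.log
maxretry = 1
bantime  = 86400
```

With maxretry set to 1, a single request for the trap URL is enough to get banned.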
This does not block anything at all.
It’s a 1994 “standard” that requires voluntary compliance and the user-agent is a string set by the operator of the tool used to access your site.
https://en.m.wikipedia.org/wiki/Robots.txt
https://en.m.wikipedia.org/wiki/User-Agent_header
In other words, a bot operator can simply ignore your robots.txt file, and since the user-agent is a string they can set to whatever they like, your webserver logs cannot tell you whether they are ignoring you.
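To make that concrete, a robots.txt is nothing more than a plain-text request (the path is a made-up example):

```
# robots.txt is a polite request, not an access control.
User-agent: *
Disallow: /secret-trap/
```

Nothing on the server enforces it; a compliant crawler chooses to honour it, and anything else fetches whatever it likes.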
Like the cookie that stores the “Reject All the cookies” response for your next visit 😇
DHCP at the router gives out these two filtered DNS servers from AdGuard:
https://adguard-dns.io/en/blog/adguard-dns-new-addresses.html
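If the router happens to run dnsmasq, the setup might look like this sketch. The addresses are AdGuard DNS’s default filtering resolvers at the time of writing; check the linked post for the current ones:

```
# /etc/dnsmasq.conf (sketch, assuming the router runs dnsmasq)
# Hand the AdGuard filtered resolvers to every DHCP client.
dhcp-option=option:dns-server,94.140.14.14,94.140.15.15
```

Clients then pick up the filtered resolvers automatically on their next DHCP lease renewal.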
AI, also known as Assumed Intelligence