This guide will walk you through installing and configuring nginx, PHP, and MySQL, optimized for macOS Big Sur on Apple Silicon (M1/ARM) processors.
Note: Most of this content originated from this post. Here, I’ve pared it down and fixed the relevant content. Credit goes to the original author: Kevin Dees.
This is a MagicMirror module that displays the next rocket launch from Earth. The data comes from the excellent Launch Library 2 API provided by The Space Devs. This is the Launch Library 2 documentation.
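To get a feel for what the module displays, here is a minimal sketch (in Python, purely for illustration; it is not the module’s own code) that asks the Launch Library 2 upcoming-launches endpoint for the next launch. The endpoint URL and the name/net/pad fields reflect the public 2.2.0 API, but check the documentation linked above if anything has changed.

```python
import requests

# Launch Library 2 "upcoming launches" endpoint from The Space Devs.
# limit=1 asks for only the next scheduled launch.
URL = "https://ll.thespacedevs.com/2.2.0/launch/upcoming/"

response = requests.get(URL, params={"limit": 1}, timeout=10)
response.raise_for_status()

launch = response.json()["results"][0]
print(launch["name"])         # mission name, e.g. "Falcon 9 Block 5 | ..."
print(launch["net"])          # NET (no-earlier-than) launch time, ISO 8601 UTC
print(launch["pad"]["name"])  # launch pad
```

The net field is the launch’s no-earlier-than time in UTC, which is the value worth counting down to.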
I wrote a simple plugin for NZBGet that inserts download history into a MySQL database. Having my download list in a database makes operations on the data easier than grokking text logs.
I wanted to display my most recent downloads on a Grafana dashboard.
To use it, simply copy the Mysql-Log.py script into your NZBGet/scripts directory. In NZBGet, open Settings and set the hostname of your MySQL instance.
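For anyone curious what such a post-processing script boils down to, here is a minimal sketch. It is not the actual Mysql-Log.py: the table layout, credentials, and the NZBPO_HOST option name are made-up placeholders, and a real NZBGet script also needs the special header comments NZBGet uses to surface its options in Settings. NZBGet does pass job details to post-processing scripts as NZBPP_* environment variables and expects exit code 93 on success.

```python
#!/usr/bin/env python3
# Minimal sketch of an NZBGet post-processing script that logs each
# finished download into MySQL. Placeholder table and credentials throughout.

import os
import sys

import mysql.connector  # pip install mysql-connector-python

POSTPROCESS_SUCCESS = 93  # exit code NZBGet treats as "script succeeded"
POSTPROCESS_ERROR = 94    # exit code NZBGet treats as "script failed"

# NZBGet hands job details to post-processing scripts as NZBPP_* variables;
# the script's own settings (e.g. a Host option) arrive as NZBPO_* variables.
name = os.environ.get("NZBPP_NZBNAME", "")
category = os.environ.get("NZBPP_CATEGORY", "")
status = os.environ.get("NZBPP_STATUS", "")
host = os.environ.get("NZBPO_HOST", "localhost")  # hypothetical option name

try:
    conn = mysql.connector.connect(
        host=host, user="nzbget", password="secret", database="nzbget"
    )
    cur = conn.cursor()
    cur.execute(
        "INSERT INTO downloads (name, category, status) VALUES (%s, %s, %s)",
        (name, category, status),
    )
    conn.commit()
    conn.close()
except Exception as exc:
    print(f"[ERROR] could not log download: {exc}")
    sys.exit(POSTPROCESS_ERROR)

sys.exit(POSTPROCESS_SUCCESS)
```

With the rows in MySQL, a Grafana panel is just a simple SELECT against that table.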
I couldn’t find any quick references about accessing the Pi-hole API, so I created this page.
Pi-hole is a great app for blocking internet advertising that was originally designed to run on a Raspberry Pi. It blocks known advertisers’ domains at the DNS level by effectively null-routing requests destined to serve ads. It can run on VMs, Raspberry Pis, and bare-metal servers.
Here are the steps to access the Pi-hole’s REST API. I’m using curl in this example, but you can integrate it with openHAB or any other system that can talk REST.
Step 1: Obtaining the web password
Most of the useful API endpoints the Pi-hole provides wisely require authentication. After searching around the net, I found that I could pass &token=A_VALID_SESSION_TOKEN to authenticate to the Pi-hole for a session. Unfortunately, this is a temporary auth token and wasn’t suitable for my needs. After more digging, I found the gem I needed: &auth=WEBPASSWORD. Sounds great, but where do I obtain this password? You simply log onto your Pi-hole server and read the WEBPASSWORD value out of /etc/pihole/setupVars.conf; that hashed value is exactly what the auth parameter expects (the web UI also shows it under Settings > API > Show API token).
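With that hash in hand, any HTTP client can talk to the API at /admin/api.php. The write-up above uses curl; as a rough illustration, the equivalent calls in Python look like the sketch below, where the hostname and token are placeholders and the summaryRaw/disable/enable parameters are standard Pi-hole v5 query options.

```python
import requests

PIHOLE = "http://pi.hole/admin/api.php"  # placeholder hostname
TOKEN = "YOUR_WEBPASSWORD_HASH"          # the WEBPASSWORD value from setupVars.conf

# Read-only stats (older Pi-hole versions allowed this without auth).
summary = requests.get(PIHOLE, params={"summaryRaw": "", "auth": TOKEN}, timeout=5)
print(summary.json())

# Disable ad blocking for 300 seconds.
resp = requests.get(PIHOLE, params={"disable": "300", "auth": TOKEN}, timeout=5)
print(resp.json())  # e.g. {"status": "disabled"}

# Re-enable ad blocking.
resp = requests.get(PIHOLE, params={"enable": "", "auth": TOKEN}, timeout=5)
print(resp.json())  # e.g. {"status": "enabled"}
```

Recent Pi-hole releases require the auth parameter even on the read-only endpoints, so it is passed on every call here.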
I wrote a simple plugin for NZBGet that inserts download history into an Elasticsearch cluster (or node). It uses API calls rather than parsing filesystem logs. I wanted a quick way to just insert the data, so I created this script.
Simply copy the ESLog.py script into your NZBGet/scripts directory. In NZBGet, open Settings and set the hostname of your Elasticsearch instance.
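As with the MySQL variant, the heart of such a script is a single API call per completed download. Below is a minimal sketch (again, not the real ESLog.py) that posts one document to Elasticsearch’s _doc endpoint with the requests library; the index name, field names, and the NZBPO_HOST option are placeholders, and it assumes a node reachable without authentication.

```python
import os
from datetime import datetime, timezone

import requests

# Placeholder settings; the real script reads the host from its NZBGet options.
ES_HOST = os.environ.get("NZBPO_HOST", "http://localhost:9200")
INDEX = "nzbget-history"

# Build one document per completed download from NZBGet's NZBPP_* variables.
doc = {
    "name": os.environ.get("NZBPP_NZBNAME", ""),
    "category": os.environ.get("NZBPP_CATEGORY", ""),
    "status": os.environ.get("NZBPP_STATUS", ""),
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

# POSTing to /<index>/_doc lets Elasticsearch assign the document ID.
resp = requests.post(f"{ES_HOST}/{INDEX}/_doc", json=doc, timeout=10)
resp.raise_for_status()
print("indexed as", resp.json()["_id"])
```

Once the documents are in an index, they can be graphed in Grafana or Kibana just like the MySQL version.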