# Software Misc

Cover image by [AltumCode](https://unsplash.com/photos/XMFZqrGyV-Q?utm_source=unsplash&utm_medium=referral&utm_content=creditShareLink)

# Security

- The [OWASP API Top 10](https://owasp.org/www-project-api-security/) may be a good place to start when deciding which security measures to implement on a web project

# Gitea and DroneCI

I use [Gitea](https://github.com/go-gitea/gitea) to self-host my code and projects [here](https://git.jamesravey.me), and I use [DroneCI](https://www.drone.io/), a lightweight CI system that integrates with Gitea, for build and deployment automation.

### Configuration of Drone + Gitea

TODO: write about setup here - docker etc

### Drone CI Config

Drone works like many other CI systems - you control builds with YAML files stored in the repository. You specify the type of CI run at the top level of the YAML doc and can also give the pipeline a name:

```yaml
kind: pipeline
type: docker
name: test and build

```

You can define multiple build steps, each with its own Docker image. This is useful if, for example, you have a React frontend and a Golang backend and need to build both:

```yaml
steps:
   - name: test_backend
     image: python:3.7
     commands:
      - pip install poetry
      - poetry install
      - poetry run pytest

   - name: test_frontend
     image: node
     commands:
      - npm install
      - npm test


```

#### Shared State Between Steps

Steps can share files via [temporary volumes](https://docs.drone.io/pipeline/docker/syntax/volumes/temporary/) if needed but are generally stateless and independent.
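As a sketch (based on the temporary volume syntax in the Drone docs - the step and volume names here are illustrative), a volume declared at pipeline level can be mounted into multiple steps:

```yaml
steps:
  - name: build
    image: node
    volumes:
      - name: shared
        path: /shared
    commands:
      - npm run build
      - cp -r dist /shared

  - name: deploy
    image: alpine
    volumes:
      - name: shared
        path: /shared
    commands:
      - ls /shared/dist

volumes:
  - name: shared
    temp: {}
```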

#### Conditional Execution of Steps

You can opt to run steps only under certain conditions. For example, you might only want to publish a package when code is pushed to master. Use conditions to do this:

```yaml
   - name: publish
     when:
       branch:
        - master
       event:
        exclude:
        - pull_request
     image: python:3.7
     commands:
      - twine upload...
```

[The Drone documentation](https://docs.drone.io/pipeline/conditions/) has the full set of conditions that you can use - you can also include and exclude specific events and branches.

#### Secrets

[Secrets](https://docs.drone.io/secret/repository/) can be used to pass things like auth tokens into CI pipelines. This is useful if you want to do things like publish packages or upload files. You essentially declare an environment variable and then define which "secret" it came from in your CI yaml file:

```yaml
   - name: publish
     when:
       branch:
        - master
       event:
        exclude:
        - pull_request
     image: python:3.7
     environment:
      GITEA_PACKAGE_REPO:
        from_secret: gitea_package_repo
      GITEA_OWNER:
        from_secret: gitea_owner
      GITEA_TOKEN:
        from_secret: gitea_token
```

Then in the Drone frontend you can add the value, and it will be stored securely and passed to the CI job at run time:

[![image.png](https://wiki.jamesravey.me/uploads/images/gallery/2022-10/scaled-1680-/X36image.png)](https://wiki.jamesravey.me/uploads/images/gallery/2022-10/X36image.png)
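With those secrets exposed as environment variables, the publish command in the pipeline might look something like this (a sketch using standard `twine` flags; the variable names match the secrets declared above):

```bash
twine upload \
  --repository-url "$GITEA_PACKAGE_REPO" \
  -u "$GITEA_OWNER" \
  -p "$GITEA_TOKEN" \
  dist/*
```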

### Backup Mechanism

# Caddy Handler

You can set up Caddy to serve different content depending on the URL prefix using the [handle](https://caddy.community/t/how-to-serve-file-server-from-different-path/10034) and `handle_path` directives:

```
testsite.com {
  handle_path /media* {
    root * /path/to/media
    file_server
  }

  handle {
    root * /path/to/normal/content
    file_server
  }
}
```

# FOSS Funding

Underfunding of FOSS projects can be disastrous, as [this list](https://github.com/PayDevs/awful-oss-incidents) shows.

# Golang Web Services and Gin

I've been using Golang to build [IndieScrobble](https://github.com/ravenscroftj/indiescrobble).

### Live Reload

I use [this package](https://github.com/codegangsta/gin) to live-reload my application as I make changes to it.
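For reference, a typical invocation might look like this (a sketch - check `gin --help` for the current flags; here the proxy listens on port 3000 and forwards to the app on 8080):

```bash
go install github.com/codegangsta/gin@latest

# proxy on :3000, app on :8080, rebuild on file change
gin --port 3000 --appPort 8080 run main.go
```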

# CRON "No MTA installed, discarding output"

Answer from [here](https://askubuntu.com/questions/222512/cron-info-no-mta-installed-discarding-output-error-in-the-syslog)

> Linux uses mail for sending notifications to the user. Most Linux distributions have a mail service including an MTA (Mail Transfer Agent) installed. Ubuntu doesn't though.
> 
> You can install a mail service, postfix for example, to solve this problem.
> 
> ```
> sudo apt-get install postfix
> 
> 
> ```
> 
> Or you can ignore it. I don't think the inability of cron to send messages has anything to do with the CPU spike (that's linked to the underlying job that cron is running). It might be safest to install an MTA and then read through the messages (`mutt` is a good system mail reader).

The [best option](https://askubuntu.com/a/804289) seems to be to redirect all output to a log file. Edit the crontab with `crontab -e` (use `sudo` if the issue is with root's crontab) and add `>> /some/log/file 2>&1` after every command, like this:

```
0 3 * * * cmd  >> /some/log/file 2>&1
```

# Low and No Code Frontends

Quite often it is useful to have ugly-but-functional frontends for accessing things like databases and carrying out user management. Recently commercial tools like Retool have made it really easy to build this kind of thing by providing drag-and-drop UI builders that are reminiscent of the Visual Studio tooling that we had in the late 90s and early 00s.

### Appsmith

Appsmith is a FOSS low-code app builder. It can connect to a variety of data sources, and apps can be exported to Git.

# RSync

RSync is a [FOSS](https://wiki.jamesravey.me/books/seed-propagator/chapter/free-open-source-software-and-open-culture "Free Open Source Software and Open Culture") file copying and syncing tool with a number of uses, including syncing files to remote machines over SSH.

### Preserving User Permissions in Rsync

[https://brainsteam.co.uk/2024/01/03/migrating-users-across-servers-with-rsync/](https://brainsteam.co.uk/2024/01/03/migrating-users-across-servers-with-rsync/)
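The linked post goes into detail; the core idea is that `rsync -a` (archive mode) preserves permissions, ownership, and timestamps, and `--numeric-ids` stops rsync from remapping UIDs/GIDs by name on the destination. A minimal sketch (paths and hostname are illustrative):

```bash
rsync -az --numeric-ids \
  -e ssh \
  /home/ user@new-server:/home/
```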

### Syncing with Non-Standard SSH Ports

It's generally good practice to run SSH services on non-standard ports so that they can't be (as) easily port-scanned and attacked. If you need to use RSync with a non-standard port, you can pass any extra SSH arguments via the `-e` argument, as explained in [this article](https://www.tecmint.com/sync-files-using-rsync-with-non-standard-ssh-port/) ([mirror](https://archive.jamesravey.me/archive/1690032132.955091/index.html)):

```bash
rsync -arvz \
  -e 'ssh -p <port-number>' \
  --progress --delete \
  user@remote-server:/path/to/remote/folder /path/to/local/folder

```

# Story Mapping

## Resources

[https://www.easyagile.com/blog/the-ultimate-guide-to-user-story-maps/#what-is-user-story-mapping](https://www.easyagile.com/blog/the-ultimate-guide-to-user-story-maps/#what-is-user-story-mapping)

# Logseq HTTP API

Logseq provides an HTTP API for developing plugins. The documentation is not particularly intuitive to get used to.

### Enabling the API

<details id="bkmrk-1.-turn-on-dev-mode-"><summary>1. Turn on Dev Mode + API</summary>

You need to turn on developer mode within Logseq via the settings menu:

<div>[![image.png](https://wiki.jamesravey.me/uploads/images/gallery/2023-10/scaled-1680-/image.png)](https://wiki.jamesravey.me/uploads/images/gallery/2023-10/image.png)</div>Go into the advanced settings and enable developer mode:

[![image.png](https://wiki.jamesravey.me/uploads/images/gallery/2023-10/scaled-1680-/MgEimage.png)](https://wiki.jamesravey.me/uploads/images/gallery/2023-10/MgEimage.png)

Then when the app restarts you should be able to enable the API:

[![image.png](https://wiki.jamesravey.me/uploads/images/gallery/2023-10/scaled-1680-/Emximage.png)](https://wiki.jamesravey.me/uploads/images/gallery/2023-10/Emximage.png)

</details><details id="bkmrk-2.-add-a-token-by-de"><summary>2. Add a Token</summary>

By default no token is provided, so you won't be able to call the API. Open the manage tokens dialog and create a new token:

[![image.png](https://wiki.jamesravey.me/uploads/images/gallery/2023-10/scaled-1680-/AXOimage.png)](https://wiki.jamesravey.me/uploads/images/gallery/2023-10/AXOimage.png)

[![image.png](https://wiki.jamesravey.me/uploads/images/gallery/2023-10/scaled-1680-/A1Uimage.png)](https://wiki.jamesravey.me/uploads/images/gallery/2023-10/A1Uimage.png)

You will now be able to make HTTP requests to the given URL and port using the header `Authorization: Bearer lulz`, or whatever value you chose.

</details>

### Using the API


If you open your browser and head to [http://127.0.0.1:12315/](http://127.0.0.1:12315/) you will be advised that you can POST to [http://127.0.0.1:12315/api](http://127.0.0.1:12315/api) with a JSON payload specifying which method to call and which arguments to pass. You can use the [Logseq Plugin Docs](https://plugins-doc.logseq.com/) to find a list of available methods.

For example if I have a page called `Logseq` I could use the following payload along with an `Authorization: Bearer <token>` header to get the page's markdown block content:

```json
{
  "method":"logseq.Editor.getPageBlocksTree",
  "args":[
    "Logseq"
  ]
}
```

You can use pretty much any of the methods listed on the plugin docs page this way - check the documentation for each method's required arguments and make sure you pass them correctly.
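The same call can be scripted. The sketch below builds the request using only the Python standard library (the method name and page title are the ones from the example above; the token is a placeholder):

```python
import json
import urllib.request

API_URL = "http://127.0.0.1:12315/api"  # default Logseq API host/port


def build_request(method: str, args: list, token: str) -> urllib.request.Request:
    """Build a POST request for the Logseq HTTP API."""
    payload = json.dumps({"method": method, "args": args}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )


req = build_request("logseq.Editor.getPageBlocksTree", ["Logseq"], "<token>")

# To actually send it, Logseq must be running with the API enabled:
# with urllib.request.urlopen(req) as resp:
#     blocks = json.load(resp)  # block tree for the page
```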

# Data Lakehouse

A data lakehouse combines the best bits of data warehouses and data lakes.

Data Lakehouses could be seen as the natural convergence of the two architectures (see [https://cloud.google.com/blog/products/data-analytics/data-lake-and-data-warehouse-convergence](https://cloud.google.com/blog/products/data-analytics/data-lake-and-data-warehouse-convergence))

### Data Lake

Data Lake is the name given to a collection of tools that are often used together to process large amounts of data. Typically it includes a storage system like S3 or HDFS and a processing system like Apache Spark or Hadoop. A data lake is typically used to:

- Store lots of data - often in its raw "unprocessed" form in pseudo-real-time
- Process a subset of data in real-time or in batch modes
- Provide language-agnostic runtimes for data analysis.

### Data Warehouse

A data warehouse is usually where processed, structured data is stored. It is often used directly by business analysts and by downstream applications. Data warehouses don't scale as easily and typically have a lot more validation and processing associated with them.

### Data Lakehouse

A data lakehouse attempts to combine elements of both the data lake and the data warehouse - again, it is typically the name given to a group of systems architected together to provide this functionality. It normally supports the Extract, Load, Transform (ELT) paradigm.

### References

- [https://cloud.google.com/learn/what-is-a-data-lake](https://cloud.google.com/learn/what-is-a-data-lake)
- [https://www.snowflake.com/guides/what-data-lakehouse](https://www.snowflake.com/guides/what-data-lakehouse)
- [https://azure.microsoft.com/en-us/resources/cloud-computing-dictionary/what-is-a-data-lake](https://azure.microsoft.com/en-us/resources/cloud-computing-dictionary/what-is-a-data-lake)

# Design Frameworks

Design frameworks provide out of the box styling and components for use in websites. Many frameworks sit on top of Javascript and Typescript libraries and some lightweight frameworks simply provide CSS styles on top of static HTML.

### React Frameworks

### Lightweight CSS Frameworks

- [SimpleCSS](https://simplecss.org/) - written by Kev Quirk and provides a very simple and lightweight framework on top of standard HTML5 elements and components.
- [Foundation](https://get.foundation/sites/docs/kitchen-sink.html#0) - another lightweight CSS framework that works without loads of javascript libraries.

# ZSH and Unraid

This set of instructions is 95% based on [this Reddit thread](https://www.reddit.com/r/unRAID/comments/mwqjs8/zsh_with_persistent_config_history_and_ohmyzsh/#:~:text=Here's%20some%20information%20about%20setting%20up%20zsh,into%20root%20folder%20when%20array%20start%20up) (see below for an archived version if the link doesn't work):

1. Install [un-get](https://github.com/ich777/un-get) plugin (using plugin manager, copy and paste [the raw link to the .plg](https://raw.githubusercontent.com/ich777/un-get/refs/heads/master/un-get.plg) from github)
2. Install zsh:  
    ```bash
    un-get update && un-get install zsh
    ```
3. Install User Scripts plugin (in Community Applications)
4. Edit `/boot/config/go` and add the following to the file:  
      
    ```bash
    # Install Oh-My-Zsh 
    HOME="/root" sh -c "$(wget https://raw.githubusercontent.com/robbyrussell/oh-my-zsh/master/tools/install.sh -O -)" 
    ```
5. Create your `.zshrc` file at `/boot/config/extra/.zshrc`:  
      
    ```bash
    export ZSH="/root/.oh-my-zsh"
    
    ZSH_THEME="robbyrussell"
    
    DISABLE_UPDATE_PROMPT="true"
    
    HISTSIZE=10000
    SAVEHIST=10000
    HISTFILE=/root/.cache/zsh/history
    
    plugins=(
      zsh-autosuggestions
      zsh-syntax-highlighting
    )
    
    source $ZSH/oh-my-zsh.sh
    
    # User configurations
    
    alias l='ls -lFh'     #size,show type,human readable
    alias la='ls -lAFh'   #long list,show almost all,show type,human readable
    ```
6. Create a new script named "zsh" in user scripts and set it to "At Startup of Array"
7. Edit the script you just created `/boot/config/plugins/user.scripts/scripts/zsh/script`  
    ```bash
    #!/bin/bash
    
    HOME=/root
    OH_MY_ZSH_ROOT="$HOME/.oh-my-zsh"
    ZSH_CUSTOM="$HOME/.oh-my-zsh/custom"
    OH_MY_ZSH_PLUGINS="$ZSH_CUSTOM/plugins"
    OH_MY_ZSH_THEMES="$ZSH_CUSTOM/themes"
    
    mkdir -p $OH_MY_ZSH_PLUGINS
    mkdir -p $OH_MY_ZSH_THEMES
    
    # Install zsh-autosuggestions
    if [ ! -d "$OH_MY_ZSH_PLUGINS/zsh-autosuggestions" ]; then
            echo "  -> Installing zsh-autosuggestions..."
            git clone https://github.com/zsh-users/zsh-autosuggestions $OH_MY_ZSH_PLUGINS/zsh-autosuggestions
    else
            echo "  -> zsh-autosuggestions already installed"
    fi
    
    # Install zsh-syntax-highlighting
    if [ ! -d "$OH_MY_ZSH_PLUGINS/zsh-syntax-highlighting" ]; then
            echo "  -> Installing zsh-syntax-highlighting..."
            git clone https://github.com/zsh-users/zsh-syntax-highlighting.git $OH_MY_ZSH_PLUGINS/zsh-syntax-highlighting
    else
            echo "  -> zsh-syntax-highlighting already installed"
    fi
    
    chmod 755 $OH_MY_ZSH_PLUGINS/zsh-autosuggestions
    chmod 755 $OH_MY_ZSH_PLUGINS/zsh-syntax-highlighting
    
    chsh -s /bin/zsh
    
    # Remove oh-my-zsh default .zshrc
    rm /root/.zshrc
    
    # Make sure the necessary directories are existing
    mkdir -p /root/.cache/zsh/
    mkdir -p /boot/config/extra/
    
    # Make sure history file exists
    touch /boot/config/extra/history
    
    # Symlink .zshrc and history files
    cp -sf /boot/config/extra/.zshrc /root/.zshrc
    cp -sf /boot/config/extra/history /root/.cache/zsh/history
    ```
8. Reboot your server and you should now have zsh setup (or run the scripts manually and create .zshrc file in /root/)

#### References

- [ZSH with persistent config, history and oh-my-zsh](https://www.reddit.com/r/unRAID/comments/mwqjs8/zsh_with_persistent_config_history_and_ohmyzsh/#:~:text=Here's%20some%20information%20about%20setting%20up%20zsh,into%20root%20folder%20when%20array%20start%20up), /r/unraid [(Archived Version)](https://wiki.jamesravey.me/attachments/48)

# Node Python and Corporate Firewalls (ZScaler)

### Node Custom CA Certificate

Assuming the corporate CA certificate is already installed in your system trust store, you can use `export NODE_OPTIONS="--use-system-ca"` to make Node applications trust it.

### Python 
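Tools like `requests` and `pip` ship their own CA bundle (certifi) rather than using the system trust store, so the usual approach is to point them at the corporate CA bundle via environment variables (the path below is an example - use wherever your bundle is installed):

```bash
# Path is illustrative - point these at your corporate CA bundle
export SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt       # Python's ssl module
export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt  # the requests library
export PIP_CERT=/etc/ssl/certs/ca-certificates.crt            # pip
```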

### Reference

See [ZScaler trust store docs](https://help.zscaler.com/zia/adding-custom-certificate-application-specific-trust-store)